Vinny

Reputation: 865

How to speed up inserting values into PostgreSQL

with open(ratingsfilepath, 'r') as data_input:
    for row in data_input:
        # split each line once instead of three times, and strip the trailing newline
        fields = row.rstrip('\n').split('::')
        cur_load.execute("INSERT INTO " + ratingstablename + " VALUES (%s, %s, %s)",
                         (fields[0], fields[1], fields[2]))

I have 10 million records in a .dat file that I am loading into a table with a Python script, but it takes nearly an hour. Is there any way to reduce the load time?

Upvotes: 2

Views: 64

Answers (1)

e4c5

Reputation: 53734

Inserting 10 million records row by row will take a long time no matter what, but you can speed it up considerably by using your Python script to convert the data file into a CSV file that matches your table structure, and then loading it into the table in one go with the COPY FROM SQL command.

Using COPY is considerably faster than 10 million individual INSERTs.
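As a rough illustration, here is a minimal sketch of that approach. It assumes the asker's script uses psycopg2 (so the cursor exposes copy_expert) and reuses the cur_load, ratingsfilepath and ratingstablename names from the question; the in-memory buffer is just an assumption for brevity.

import csv
import io

# Convert the '::'-delimited .dat file to CSV.
# (Stream to a temporary file instead if 10 million rows do not fit in RAM.)
buf = io.StringIO()
writer = csv.writer(buf)
with open(ratingsfilepath, 'r') as data_input:
    for row in data_input:
        writer.writerow(row.rstrip('\n').split('::')[:3])
buf.seek(0)

# One COPY statement loads the whole file instead of 10 million INSERTs.
cur_load.copy_expert("COPY " + ratingstablename + " FROM STDIN WITH CSV", buf)
cur_load.connection.commit()

If the converted CSV is written to disk on the database server instead, a plain COPY ratings FROM '/path/to/file' WITH CSV run as a superuser achieves the same thing without going through the client connection.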

Upvotes: 1
