Reputation: 359
I am importing data from a 10 GB file into PostgreSQL tables using Java (JDBC). The import process is taking more than 12 hours to complete, so I need to speed it up. I tried the COPY command for inserting. Some SELECT queries are also running against the tables being inserted into. Can anyone suggest a way to improve the speed?
Upvotes: 1
Views: 6136
Reputation: 340903
A standard SQL INSERT statement typically has too much overhead when millions of rows are needed. 10 GiB of data isn't really that much, but it is certainly too much for INSERT (you either have one huge transaction or a commit/rollback for every INSERT).
There is a nice chapter, 14.4. Populating a Database, in the official documentation. Section 14.4.2. Use COPY is especially interesting for you:
"Use COPY to load all the rows in one command, instead of using a series of INSERT commands. The COPY command is optimized for loading large numbers of rows; it is less flexible than INSERT, but incurs significantly less overhead for large data loads. Since COPY is a single command, there is no need to disable autocommit if you use this method to populate a table."
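Since you are on JDBC, the PostgreSQL driver exposes COPY directly through its CopyManager API, so you can stream the file to the server without shelling out to psql. Here is a minimal sketch; the connection URL, credentials, table name, and file path are placeholders, and it assumes the input file is CSV matching the table's columns:

```java
import java.io.FileReader;
import java.io.Reader;
import java.sql.Connection;
import java.sql.DriverManager;

import org.postgresql.copy.CopyManager;
import org.postgresql.core.BaseConnection;

public class CopyImport {

    // Builds the COPY command for a given table; kept separate for clarity.
    static String copySql(String table) {
        return "COPY " + table + " FROM STDIN WITH (FORMAT csv)";
    }

    public static void main(String[] args) throws Exception {
        // Placeholder connection details for this sketch.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/mydb", "user", "password");
             Reader reader = new FileReader("/path/to/data.csv")) {

            CopyManager copyManager = new CopyManager((BaseConnection) conn);

            // Streams the whole file to the server as a single COPY command,
            // avoiding the per-row overhead of INSERT statements.
            long rows = copyManager.copyIn(copySql("my_table"), reader);
            System.out.println("Imported " + rows + " rows");
        }
    }
}
```

This keeps the import inside your Java process while still getting COPY's bulk-load path on the server. Note that concurrent SELECTs on the target tables can still slow the load down; if you can, load into an empty or unindexed table and add indexes afterwards, as the same documentation chapter suggests.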
Upvotes: 4