kavipriya M

Reputation: 61

Loading a zipped CSV (filename.csv.gz) file into a PostgreSQL table

How can I load a large csv.gz file into PostgreSQL without first unzipping it to a plain CSV file? I tried a named pipe (mkfifo pipelinename), but it didn't work for me. Is there another way to solve this?

I have tried to load it from my local machine into PostgreSQL using the following command:

zcat file.csv.gz | psql -U username -d database;

Result: out of memory

Need: I want to load a large csv.gz file (around 15+ GB) from CentOS into a PostgreSQL database.

Upvotes: 6

Views: 9268

Answers (3)

hmadinei

Reputation: 51

This works for me:

7z e -so file.csv.gz | psql -U username -d databasename -c "COPY tablename FROM STDIN WITH (FORMAT csv, HEADER true);"

I use 7-Zip here, but you could equally use gzip to decompress the stream.
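
A roughly equivalent gzip pipe might look like this (assuming the same table, database, and file names as above):

gzip -dc file.csv.gz | psql -U username -d databasename -c "COPY tablename FROM STDIN WITH (FORMAT csv, HEADER true);"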

Upvotes: 0

Kemin Zhou

Reputation: 6891

Just to share my simple example using zcat instead of gzip, simply because it is less typing. zcat expands the gzipped file and feeds it to \copy:

\copy tmptable from program 'zcat O1variant.tab.gz' with (format csv, delimiter E'\t', header TRUE)
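
Note that \copy loads into an existing table, so tmptable must be created beforehand with columns matching the file. A minimal sketch with hypothetical column names (adjust to your data) might be:

create table tmptable (chrom text, pos integer, variant text);  -- hypothetical columns; match them to the tab-separated file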

Upvotes: 4

Vincent

Reputation: 8846

Note that this should also work from inside psql:

\copy TABLE_NAME FROM PROGRAM 'gzip -dc FILENAME.csv.gz' DELIMITER ',' CSV HEADER NULL ''
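
If you are a superuser (or, on newer versions, a member of pg_execute_server_program) and the file sits on the database server itself, the server-side COPY ... FROM PROGRAM variant should also work; a sketch with the same placeholder names and an assumed server-side path:

COPY TABLE_NAME FROM PROGRAM 'gzip -dc /path/to/FILENAME.csv.gz' WITH (FORMAT csv, DELIMITER ',', HEADER true, NULL '');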

Upvotes: 7
