Dan Ciborowski - MSFT

Reputation: 7237

How to import Zipped file into Postgres Table

I would like to import a file into my PostgreSQL system (specifically Redshift). I have found an argument for COPY that allows importing a gzip file, but the provider of the data I am trying to include in my system only produces it as a .zip. Are there any built-in Postgres commands for opening a .zip?

Upvotes: 5

Views: 14572

Answers (4)

Isaac Turner

Reputation: 2911

From within Postgres:

COPY table_name FROM PROGRAM 'unzip -p input.csv.zip' DELIMITER ',';

From the man page for unzip -p:

-p     extract files to pipe (stdout). Nothing but the file data is sent to stdout, and the files are always extracted in binary format, just as they are stored (no conversions).
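If you want to run this without opening a psql session, here is a minimal sketch from the shell (the database name, table name, and file path are placeholders; the zip file must live on the database server, and COPY ... FROM PROGRAM needs superuser rights or, on PostgreSQL 11+, the pg_execute_server_program role):

# Run the server-side COPY from the shell; 'mydb', 'table_name' and the path are assumptions.
psql -U postgres -d mydb -c "COPY table_name FROM PROGRAM 'unzip -p /srv/data/input.csv.zip' WITH (FORMAT csv, DELIMITER ',')"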

Upvotes: 3

Sir Mbuki

Reputation: 1210

unzip -p /path/to/.zip | psql -U user

The 'user' must have superuser rights, or else you will get:

ERROR:  must be superuser to COPY to or from a file

To learn more about this see here.

This approach is mainly useful when loading large databases.
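If granting superuser is not an option, a minimal sketch of a client-side alternative (the table, database, and path are placeholders): stream the unzipped data into psql's \copy, which runs COPY FROM STDIN over the connection and does not require superuser rights.

# 'my_table', 'mydb' and the path are assumptions; \copy reads the data on the client side.
unzip -p /path/to/data.zip | psql -U user -d mydb -c "\copy my_table FROM pstdin WITH (FORMAT csv)"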

Upvotes: 1

Neil Neyman

Reputation: 2158

Can you just do something like

unzip -p myfile.zip | gzip > myfile.gz

Easy enough to automate if you have a lot of files.
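A minimal sketch of that automation (the directory is a placeholder, and it assumes each archive contains a single data file):

# Convert every .zip in a directory into a .gz with the same base name.
for f in /path/to/files/*.zip; do
  unzip -p "$f" | gzip > "${f%.zip}.gz"
done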

Upvotes: 2

TheProletariat

Reputation: 1056

This might only work when loading Redshift from S3, but you can actually just include a "gzip" flag when copying data to Redshift tables, as described here:

This is the format that works for me if my S3 bucket contains a gzipped .csv.

copy <table> from 's3://mybucket/<foldername>' '<aws-auth-args>' delimiter ',' gzip;
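As a fuller, hypothetical sketch of that workflow (the bucket, table, cluster endpoint, and IAM role ARN are all placeholders): upload the gzipped file to S3, then run the Redshift COPY with the gzip option over a psql connection.

# Upload the compressed file, then load it; every name below is an assumption.
aws s3 cp myfile.csv.gz s3://mybucket/myfolder/
psql -h mycluster.abc123.us-east-1.redshift.amazonaws.com -p 5439 -U awsuser -d mydb -c "copy my_table from 's3://mybucket/myfolder/myfile.csv.gz' iam_role 'arn:aws:iam::123456789012:role/MyRedshiftRole' delimiter ',' gzip;"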

Upvotes: 1
