Andrew

Reputation: 1526

gzip a table exported to CSV on an AWS instance, prior to downloading / saving it locally with psql

I'm new to PostgreSQL and the psql CLI. My bandwidth is extremely limited, so downloading each table from an AWS instance takes hours; the tables are 1 - 5 GB each. The current command I use, after logging into the DB with psql, is:

\copy (SELECT * FROM table) TO table.csv CSV DELIMITER ','

Is it possible to query a table, similar to the above, but zip the CSV file ON the Amazon PostgreSQL instance before downloading and saving it locally, thus reducing the 1 - 5 GB downloads to under 1 GB and significantly reducing the download times?

Something like:

\copy (SELECT * FROM table) TO table.csv.zip CSV DELIMITER ','

I came across this gist, but the commands listed appear to dump all tables / the entire database. I would like the ability to do the same for individual tables and for subset queries.

EDIT: The solution, after logging into psql:

\copy (SELECT * FROM table) TO PROGRAM 'gzip > Users/username/folder/folder/my_table.gz' DELIMITER ','
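For the reverse trip, a file created this way can be decompressed and streamed back in with \copy ... FROM STDIN; the database name, table name and file name below are just the placeholders used above:

gunzip -c my_table.gz | psql yourdb -c "\copy table FROM STDIN DELIMITER ','"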

Upvotes: 2

Views: 1280

Answers (1)

Jim Jones

Reputation: 19653

Using psql and STDOUT. This command streams the output to the client and compresses it there (note that the compression happens on the client side, after the data has crossed the network):

psql yourdb -c "\COPY (SELECT * FROM table) TO STDOUT;" | gzip > output.gz
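If you want to keep the CSV format and comma delimiter from the question, the same options can be added inside the \COPY (yourdb and table are the placeholder names used above):

psql yourdb -c "\COPY (SELECT * FROM table) TO STDOUT CSV DELIMITER ',';" | gzip > output.gz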

Or compress directly on the database server, writing the compressed file there, using a client of your choice:

COPY (SELECT * FROM table) TO PROGRAM 'gzip > /var/lib/postgresql/my_table.gz' DELIMITER ',';
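Note that COPY ... TO PROGRAM executes the command as the operating-system user running the database server and requires superuser rights (or, from PostgreSQL 11 on, membership in the pg_execute_server_program role). Once the file has been written on the server, it can be pulled down with any transfer tool; for example, a hypothetical scp call (the user, host and path here are assumptions):

scp postgres@your-db-host:/var/lib/postgresql/my_table.gz .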

Upvotes: 3
