Reputation: 659
I have a large table (300 million rows) that I would like to dump to a CSV file - I need to do some processing on it that cannot be done in SQL. Right now I am using SQuirreL as a client, and it apparently does not deal very well with large result sets - at least as far as I can tell from my own (limited) experience. If I run the query on the actual database host, will it use less memory? Thanks for any help.
Upvotes: 1
Views: 1191
Reputation: 58657
Try this:
-- COPY runs inside the PostgreSQL server and writes the file on the database
-- host, streaming rows straight to disk instead of buffering them in a client.
COPY tablename
TO 'filename.csv'
WITH
DELIMITER AS ','
NULL AS ''
CSV HEADER;
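If you would rather have the CSV land on your own machine (or you do not have permission to write files on the database server), psql's \copy meta-command takes the same options but streams the rows to a client-side file. A minimal sketch, run from an interactive psql session connected to your database - tablename and filename.csv are the same placeholders as above:
-- \copy writes filename.csv on the client machine, streaming row by row.
\copy tablename TO 'filename.csv' WITH DELIMITER AS ',' NULL AS '' CSV HEADER
Either way the export is streamed rather than loaded into a result grid, which is why it uses far less memory than exporting from a GUI client like SQuirreL.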
Upvotes: 1