
Reputation: 5367

Downloading a MySQL table too large to fit in memory into CSV files (using Workbench)

I have some huge SQL tables (>50M rows) stored in a MySQL database. I just need to download them into .csv files.

I understand I can limit the size of the returned values by:

SELECT * FROM myTable
LIMIT 1000000; 

However, for more than ~1M rows I consistently run out of memory.

Moreover, for some reason, if I use any client other than MySQL Workbench 5.2.47, I get a failed handshake/connection denied response from the server, so I have to stick with this tool and version. I also cannot change machines, so the memory limit is a hard constraint.

How can I implement a loop to download and store a huge table in CSV file(s), while ensuring order and uniqueness in the results?

Upvotes: 0

Views: 721

Answers (1)

Jigar Patel

Reputation: 4615

You can use SELECT ... INTO OUTFILE to generate the CSV. It should have no problem generating a file with 1M rows.

Refer to the documentation for the proper syntax, field separator character, etc.:

https://dev.mysql.com/doc/refman/8.0/en/select-into.html

A personal sample:

SELECT * INTO OUTFILE '/<OUTPUT_DIR>/export.csv'
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  LINES TERMINATED BY '\n'
FROM TABLE_NAME <WHERE condition> <ordering> LIMIT <BATCH_SIZE>;
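To cover the full >50M-row table in memory-safe batches while keeping order and avoiding duplicated or skipped rows, one possible sketch (assuming the table has an indexed, unique key column, here called `id` — adjust to your schema) is keyset pagination: order each batch by the key and resume each subsequent batch after the last key value written:

```sql
-- Batch 1: first 1,000,000 rows in key order.
-- Assumes `id` is a unique, indexed column; '/<OUTPUT_DIR>/' is a placeholder path.
SELECT * INTO OUTFILE '/<OUTPUT_DIR>/export_batch1.csv'
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  LINES TERMINATED BY '\n'
FROM myTable
WHERE id > 0
ORDER BY id
LIMIT 1000000;

-- Batch 2: note the largest id written by batch 1 (e.g. 1000000)
-- and resume strictly after it, so no row is repeated or skipped.
SELECT * INTO OUTFILE '/<OUTPUT_DIR>/export_batch2.csv'
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  LINES TERMINATED BY '\n'
FROM myTable
WHERE id > 1000000
ORDER BY id
LIMIT 1000000;
```

Seeking on the indexed key (`WHERE id > last_id`) stays fast on every batch, unlike `LIMIT ... OFFSET`, which rescans all skipped rows. Two caveats: INTO OUTFILE writes the file on the *server* host (you need the FILE privilege, and `secure_file_priv` must allow the target directory), and the file must not already exist.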

Upvotes: 1

Related Questions