Reputation: 678
I have 10 million rows and want to back up the data into SQL files of 100k rows each (breaking the data into chunks). Is it possible to run a query like this:
SELECT * FROM `myTable` WHERE `counter` > 200000 and `counter` <= 300000 ---> Send to .sql file
I want to replace the pseudocode at the end of that statement with real code.
Using the "export" feature of phpMyAdmin more than 100 times would take too long.
Upvotes: 1
Views: 752
Reputation: 46249
You can use the mysqldump command in a shell script to back up your table in chunks:
#!/bin/bash
# Dump the table in 100 chunks of 100,000 rows each
for i in {1..100}
do
  beginRow=$(( ($i - 1) * 100000 ))
  endRow=$(( $i * 100000 ))
  mysqldump -h <hostname> -u <username> -p <databasename> myTable --where="counter > $beginRow and counter <= $endRow" --no-create-info > "./data-$i.sql"
done
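If you save the loop as a script, running it is straightforward; a minimal sketch, assuming the file is named dump-chunks.sh (a hypothetical name) and that <hostname>, <username> and <databasename> have been filled in:

bash dump-chunks.sh    # writes data-1.sql through data-100.sql in the current directory

Note that -p without an inline password will prompt on every iteration; putting the credentials in a MySQL option file (e.g. ~/.my.cnf) avoids typing the password 100 times.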
Upvotes: 1
Reputation: 386
You should be able to use the mysqldump command:
mysqldump -u root -p [database_name] [tablename] \
  --where="counter > 200000 and counter <= 300000" > [dumpfile.sql]
Additional info on the command here: https://dev.mysql.com/doc/refman/8.0/en/mysqldump.html
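As a concrete sketch for the table in the question (the database name mydb and the output file name are placeholders), the 200k-300k chunk would be:

mysqldump -u root -p mydb myTable --where="counter > 200000 and counter <= 300000" --no-create-info > chunk-3.sql

--no-create-info skips the CREATE TABLE statement, which you only need in the first chunk file.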
Upvotes: 1