Reputation: 6016
For a bit of context, I have a server with 4 GB of RAM and a cron job that loops through a MySQL table with around 15 million rows and does some processing.
To save memory I'm looping and selecting 200,000 records at a time, repeating until I've been through every record in the table.
Is it quicker to do it this way, or should I run fewer queries but select a larger subset each time?
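Here's a minimal sketch of what I mean, assuming an auto-increment primary key `id`; the table and column names and the `process_rows()` helper are placeholders:

```python
import MySQLdb

BATCH_SIZE = 200_000

def process_rows(rows):
    # placeholder for the actual per-batch processing
    pass

conn = MySQLdb.connect(host="localhost", user="app", passwd="secret", db="mydb")
cur = conn.cursor()

last_id = 0
while True:
    # keyset pagination: seek past the last id seen instead of using OFFSET,
    # which gets progressively slower as you page through a 15M-row table
    cur.execute(
        "SELECT id, payload FROM big_table WHERE id > %s ORDER BY id LIMIT %s",
        (last_id, BATCH_SIZE),
    )
    rows = cur.fetchall()
    if not rows:
        break
    process_rows(rows)
    last_id = rows[-1][0]

conn.close()
```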
Upvotes: 0
Views: 118
Reputation: 98398
It depends on which language/client library you are using. MySQL's C API has two different methods, mysql_store_result and mysql_use_result. The former reads every row from the server into memory before your loop starts, which can be more efficient but costs more memory. The latter defers fetching rows until you iterate over them. Some client libraries let you choose which one is used.
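For example, in Python the MySQLdb (mysqlclient) default cursor uses mysql_store_result, while SSCursor uses mysql_use_result and streams rows as you fetch them. A rough sketch, with the table and column names as placeholders:

```python
import MySQLdb
from MySQLdb.cursors import SSCursor

def process_row(row):
    # placeholder for the actual per-row processing
    pass

conn = MySQLdb.connect(host="localhost", user="app", passwd="secret", db="mydb")

# SSCursor wraps mysql_use_result: rows stay on the server until fetched,
# so client memory use stays flat even for a very large result set
cur = conn.cursor(SSCursor)
cur.execute("SELECT id, payload FROM big_table")

for row in cur:
    process_row(row)

cur.close()   # must fully read/close the result before reusing this connection
conn.close()
```

The trade-off is that with mysql_use_result the connection is tied up until the entire result set has been read, so you can't issue other queries on it mid-loop.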
In the end, if you want to know whether and how much to batch your requests, you will need to benchmark your actual workload.
Upvotes: 1