Reputation: 289
When I execute the following code for a user table of about 60,000 records:
mysql_connect("localhost", "root", "");
mysql_select_db("test");

$result = mysql_query("select * from users");
while ($row = mysql_fetch_object($result)) {
    echo(convert(memory_get_usage(true))."\n");
}

function convert($size) {
    $unit = array('b','kb','mb','gb','tb','pb');
    return @round($size/pow(1024, ($i = floor(log($size, 1024)))), 2).' '.$unit[$i];
}
I get the following error:
PHP Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 32 bytes)
Any thoughts on how to avoid having the script take up additional memory with each pass through the loop? In my actual code I'm trying to provide a CSV download for a large dataset, with a little PHP pre-processing.
Please don't recommend increasing PHP's memory limit; it's a bad idea and, more importantly, it still puts an upper bound on how large a dataset can be processed with this technique.
Upvotes: 5
Views: 6750
Reputation: 31854
mysql_query() buffers the entire result set into PHP memory. This is convenient and generally very fast, but you're running into its drawback.
mysql_unbuffered_query() exists. It doesn't grab the entire result set at once; it pulls rows from the server as you fetch them, so the full result set never has to sit in PHP memory.
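As a minimal sketch of how that could look for the CSV-download case in the question (the table name comes from the question; streaming the output with fputcsv() to php://output is my assumption about how you want to produce the CSV):

// Stream the CSV row by row instead of buffering the whole result set in PHP.
mysql_connect("localhost", "root", "");
mysql_select_db("test");

header("Content-Type: text/csv");
header("Content-Disposition: attachment; filename=users.csv");

$out = fopen("php://output", "w");
$result = mysql_unbuffered_query("select * from users");
while ($row = mysql_fetch_assoc($result)) {
    // Any per-row pre-processing would go here before writing the line.
    fputcsv($out, $row);
}
mysql_free_result($result);
fclose($out);

One caveat: with an unbuffered query you cannot send another query on the same connection until you have fetched or freed all rows of the current result set.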
Upvotes: 2
Reputation: 1870
I have had a similar problem. What I did to get it to work was to create a temporary file (you can use a hash or something similar to keep track of its name), fetch a chunk of rows on each request and append it to that file, and go around like that until you have it all. I had to resort to this workaround for a couple of reasons.
The drawback of this method is that it takes many HTTP calls to get all the data, and in the meantime rows could have changed, etc. It is a pretty "dirty" way of doing it; I have yet to find something that works better. Hope that helps.
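A rough sketch of that chunked approach, under my own assumptions (a chunk size of 1000, an offset parameter passed back by the client on each call, and a fixed temp-file name; none of these come from the original answer):

// Each HTTP request appends one chunk of rows to a shared temp file.
mysql_connect("localhost", "root", "");
mysql_select_db("test");

$chunkSize = 1000;
$offset    = isset($_GET['offset']) ? (int)$_GET['offset'] : 0;
$tmpFile   = sys_get_temp_dir() . "/users_export_" . md5("users export") . ".csv";

$result = mysql_query("select * from users limit $offset, $chunkSize");
$out    = fopen($tmpFile, "a");
$rows   = 0;
while ($row = mysql_fetch_assoc($result)) {
    fputcsv($out, $row);
    $rows++;
}
fclose($out);

if ($rows === $chunkSize) {
    // More rows may remain: the client should call again with the next offset.
    echo json_encode(array("done" => false, "next" => $offset + $chunkSize));
} else {
    // Finished: the client can now download the accumulated temp file.
    echo json_encode(array("done" => true, "file" => $tmpFile));
}

Because each request only buffers one chunk, memory usage stays bounded, at the cost of the extra round trips mentioned above.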
Upvotes: 0