Lou Kosak

Reputation: 289

Memory leak in PHP when fetching large dataset from MySQL

When I execute the following code for a user table of about 60,000 records:

mysql_connect("localhost", "root", "");
mysql_select_db("test");

$result = mysql_query("select * from users");

while ($row = mysql_fetch_object($result)) {
  echo(convert(memory_get_usage(true))."\n");
}


function convert($size) {
  $unit=array('b','kb','mb','gb','tb','pb');
  return @round($size/pow(1024,($i=floor(log($size,1024)))),2).' '.$unit[$i];
}

I get the following error:

PHP Fatal error:  Allowed memory size of 134217728 bytes exhausted (tried to allocate 32 bytes)

Any thoughts on how to avoid having the script take up additional memory with each pass through the loop? In my actual code I'm trying to provide a CSV download for a large dataset, with a little PHP pre-processing.

Please don't recommend increasing PHP's memory limit; it's a bad idea and, more importantly, it would still put an upper bound on how large a dataset can be processed with this technique.

Upvotes: 5

Views: 6750

Answers (3)

goat

Reputation: 31854

mysql_query() buffers the entire result set into PHP memory. This is convenient and generally very fast, but with 60,000 rows you're running into its main drawback.

mysql_unbuffered_query() avoids that: instead of grabbing the entire result set at once, it pulls rows from the server a little at a time as you fetch them from the result.
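For the CSV-download case in the question, a minimal sketch might look like the following (untested, reusing the connection details from the question and the same old mysql_* API as the code above):

mysql_connect("localhost", "root", "");
mysql_select_db("test");

// Unbuffered: rows are pulled from the server as you fetch them,
// instead of the whole result set being copied into PHP memory first.
$result = mysql_unbuffered_query("SELECT * FROM users");

$out = fopen('php://output', 'w');
while ($row = mysql_fetch_assoc($result)) {
    fputcsv($out, $row);   // memory use stays roughly flat per row
}
fclose($out);

One caveat worth noting: with an unbuffered result you generally can't send another query on the same connection until all rows have been fetched or the result has been freed.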

Upvotes: 2

Luke

Reputation: 1870

I have had a similar problem. What I did to get it to work was to create a temporary file (you can use a hash or something similar to keep a record of the name):

  • Pull 10,000 rows and write them to a temporary CSV file.
  • Reload the page (using header() with specific parameters / the session).
  • Pull the next 10,000 rows and append them to the file.
  • When you reach the end of the table, send the buffered file to the user.

Keep going in circles like that until you have it all. I had to do this workaround for two reasons:

  1. Timeout
  2. Out of memory error.

The drawback of this method is that it requires many HTTP requests to get the data, and in the meantime rows could have changed, etc. It's a pretty "dirty" way of doing it, and I'm yet to find something that works better. Hope that helps.
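A rough sketch of that workaround, assuming a LIMIT/OFFSET chunking scheme and an export.php entry point (both illustrative names, not from the answer):

session_start();
$chunkSize = 10000;
$offset    = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;
$tmpFile   = sys_get_temp_dir() . '/export_' . session_id() . '.csv';

mysql_connect("localhost", "root", "");
mysql_select_db("test");

// Grab one chunk of rows and append it to the temp file.
$result = mysql_query("SELECT * FROM users LIMIT $chunkSize OFFSET $offset");
$fh = fopen($tmpFile, 'a');
$rows = 0;
while ($row = mysql_fetch_assoc($result)) {
    fputcsv($fh, $row);
    $rows++;
}
fclose($fh);

if ($rows === $chunkSize) {
    // There may be more rows: reload with the next offset.
    header('Location: export.php?offset=' . ($offset + $chunkSize));
} else {
    // End of the table: send the assembled file to the user.
    header('Content-Type: text/csv');
    header('Content-Disposition: attachment; filename="users.csv"');
    readfile($tmpFile);
    unlink($tmpFile);
}

As noted above, each reload re-queries the table, so rows added or changed between chunks can make the export inconsistent.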

Upvotes: 0

SuperTron

Reputation: 4243

I'm not 100% sure if this will solve your problem, but have you considered using PDO? It has several advantages; you can read more about them here. If you do go in that direction, there is a similar question about memory usage here.
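If it helps, here is a minimal sketch of what the PDO route could look like for the CSV case, with buffered queries turned off so rows are streamed rather than held in memory (DSN and credentials copied from the question, otherwise untested):

$pdo = new PDO('mysql:host=localhost;dbname=test', 'root', '');

// Disable the MySQL driver's result buffering so rows are fetched one at a time.
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$stmt = $pdo->query('SELECT * FROM users');

$out = fopen('php://output', 'w');
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    fputcsv($out, $row);
}
fclose($out);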

Upvotes: 1
