Matthieu Napoli

Reputation: 49533

Why "Allowed memory size exhausted"?

I am writing a batch script and get an "Allowed memory size of 134217728 bytes exhausted" error.

I don't understand why the memory is filling up. I tried unsetting the $row variable, but that didn't change a thing. Here is my code:

// ... (sql connection)
$result = mysql_query("SELECT * FROM large_table");

while ($row = mysql_fetch_array($result)) {
    echo $row['id'] . PHP_EOL;
    unset($row);
}

(simplified code)

Why does the memory fill up, and how can I avoid it?

Note: this is a batch script. It is normal that I have to handle data like that (go through 1 million rows).

Update: the out-of-memory error happens around the 400,000th row, so it has to be something in the loop? I'd like to avoid implementing paging if possible.

Upvotes: 8

Views: 11819

Answers (4)

Pete855217

Reputation: 1602

I had the same problem with a large database. I ran out of memory at about 400,000 records despite unsetting the $row variable, but an unbuffered query fixed it.

Just for reference for others (and me when I do it again!), some unbuffered query example code is:

$sql = "SELECT SEQ, RECTYPE, ROSTERGRP, EMPNM, EMPNUM, DT, RDUTYCAT, ADUTYCAT FROM " .
       $tblRosters . " ORDER BY EMPNUM, DT";
$result = mysql_unbuffered_query($sql, $dbConn);

while ($row = mysql_fetch_array($result)) {
    // your processing code here
}
// Unset, close db etc. when you are finished goes here.

Upvotes: 1

KillerX

Reputation: 1466

Try using mysql_unbuffered_query() (http://www.php.net/manual/en/function.mysql-unbuffered-query.php) to prevent the whole result set from being loaded into memory, while still avoiding pagination.
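Applied to the code from the question, a minimal sketch might look like this (assuming the same large_table and id column as in the question; note that with an unbuffered query the connection stays busy until the result is fully fetched or freed, and that the mysql_* functions were removed in PHP 7 in favour of mysqli/PDO):

```php
// ... (sql connection, as in the question)

// mysql_unbuffered_query() streams rows from the server one at a time
// instead of buffering the entire result set in PHP's memory.
$result = mysql_unbuffered_query("SELECT * FROM large_table");

while ($row = mysql_fetch_array($result)) {
    echo $row['id'] . PHP_EOL;
}

// Free the result so the connection can run other queries again.
mysql_free_result($result);
```

The trade-off is that you cannot use mysql_num_rows() or seek within the result, and you must finish (or free) this result before issuing another query on the same connection.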

Upvotes: 13

Vilius Sutkus '89

Reputation: 804

Limit your query to, for example, 1,000 results and execute it again (with the offset, of course) until you've gone through the whole table. Your current unset makes no difference, since $row gets overwritten on each while iteration, so you can skip it.

$chunk_size = 1000;
$done = 0;

$keep_asking_for_data = true;
do{
    $result = mysql_query("SELECT * FROM `large_table` LIMIT {$done}, {$chunk_size}");
    $num_rows = mysql_num_rows($result);
    if($num_rows){
        $done += $num_rows;
        while($row = mysql_fetch_assoc($result)){
            echo "{$row['id']}\n";
        }
    } else {
        $keep_asking_for_data = false;
    }
    mysql_free_result($result);
}while($keep_asking_for_data);

Just compiled in my head, hope it works =D

Upvotes: 5

halfer

Reputation: 20420

If you're using MySQL, page your results so you don't exhaust your available memory. By default the PHP MySQL client buffers the entire result set in memory, which is what is using it up. Have a look at the following link, in particular the LIMIT offset, row_count syntax of SELECT:

http://dev.mysql.com/doc/refman/5.0/en/select.html
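For reference, the LIMIT clause takes an offset and a row count, so fetching successive pages of 1,000 rows (with the hypothetical large_table from the question) looks like:

```sql
SELECT * FROM large_table LIMIT 0, 1000;     -- rows 1 to 1000
SELECT * FROM large_table LIMIT 1000, 1000;  -- rows 1001 to 2000
SELECT * FROM large_table LIMIT 2000, 1000;  -- rows 2001 to 3000
```

One caveat: with a plain LIMIT, MySQL still scans past the skipped rows, so later pages of a million-row table get progressively slower; an ORDER BY on an indexed column combined with a WHERE on the last seen key avoids that.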

Upvotes: 1
