km6zla

Reputation: 4877

Iterate mysqli unbuffered query result more than once

Problem:

I have a query that returns a large result set. It is too large to bring into PHP all at once: I get a fatal out-of-memory error and cannot raise the memory limit. See Unbuffered Queries.

I need to iterate over the result multiple times, but mysqli_data_seek() doesn't work on unbuffered queries. See mysqli_result::data_seek.

//I have a buffered result set
$bresult = $mysqli->query("SELECT * FROM Small_Table");

//And a very large unbuffered result set
$uresult = $mysqli->query("SELECT * FROM Big_Table", MYSQLI_USE_RESULT);

//The join to combine them takes too long and is too large
//The result set returned by the unbuffered query is too large itself to store in PHP
//There are too many rows in $bresult to re-execute the query or even a subset of it for each one

foreach($bresult as &$row) {
    //My solution was to search $uresult for each row in $bresult to get the values I need
    $row['X'] = searchResult($uresult, $row['Key']);
    //PROBLEM: After the first search, $uresult is at its end and cannot be reset with mysqli_result::data_seek
}

function searchResult($uresult, $val) {
    //Scan forward through the unbuffered result until a matching row is found
    while($row = $uresult->fetch_assoc()){
        if($row['X'] == $val) {
            return $row['X'];
        }
    }
}
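
To illustrate the failure on its own, here is a stripped-down sketch (connection details and table names are placeholders):

    <?php
    //Hypothetical connection details for illustration only
    $mysqli = new mysqli('localhost', 'user', 'pass', 'db');

    //Open Big_Table unbuffered so the rows are streamed instead of held in PHP memory
    $uresult = $mysqli->query("SELECT * FROM Big_Table", MYSQLI_USE_RESULT);
    while ($row = $uresult->fetch_assoc()) {
        //...consume the unbuffered result...
    }

    //data_seek() is documented to work only on buffered result sets,
    //so this call fails on the unbuffered result instead of rewinding it
    $ok = $uresult->data_seek(0); //failure expected here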

If you have another solution that meets these requirements I will accept it:

- Does not try to join the results in a single query (takes too long)
- Does not run a query for each result in another query (too many queries, takes too long, slows down the system)

Please leave a comment if you need more info.

Thank you.

Upvotes: 1

Views: 1031

Answers (1)

Zak

Reputation: 25205

If you're trying to process a big data set, have you considered using an intermediary like Hadoop? You can set up a small Hadoop cluster, do your processing there, then have your PHP code request the processed data from the Hadoop output.
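
As a rough sketch of that last step only (host, port, path, and user below are placeholders, and it assumes the heavy join has already been run as a Hadoop job with its output written to HDFS), PHP can pull the processed file back over the WebHDFS REST API:

    <?php
    //Hypothetical namenode address and output path of the finished Hadoop job
    $namenode = 'http://hadoop-namenode.example.com:50070';
    $path     = '/user/etl/big_table_processed/part-r-00000';

    //WebHDFS OPEN operation; the namenode redirects to the datanode that serves the file
    $url  = $namenode . '/webhdfs/v1' . $path . '?op=OPEN&user.name=etl';
    $data = file_get_contents($url);

    if ($data === false) {
        die('Could not read processed output from HDFS');
    }

    //Work with the already-reduced output instead of the raw Big_Table rows
    foreach (explode("\n", $data) as $line) {
        //...use each processed record...
    }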

Upvotes: 0
