Reputation: 7
I have a list with 999,000 records.
I have a SELECT query and a while loop to fetch the data, and I use array_push to add each retrieved row to one array inside the loop.
Then I want each loop iteration to process 1,000 values from this array.
My problem is that when I use array_push
with this much data I get the error:
Fatal error: Allowed memory size of 134217728 bytes exhausted
How can I optimize my code to resolve this problem?
My code is below:
$sql = "select customer_id";
$sql .= " from";
$sql .= " t_customer t1";
$sql .= " inner join t_mail_address t2 using(mid, customer_id)";
$result = $conn->query($sql);
$customerArray = array();
while ($row = $result->fetch(PDO::FETCH_ASSOC)) {
    array_push($customerArray, $row);
}
// Execute every 1000 records
foreach (array_chunk($customerArray, 1000) as $execCustomerArray) {
    // My code to execute for every record.
    // ....
}
Upvotes: 0
Views: 1359
Reputation: 2220
I'm not sure it will fix anything on its own, but one thing I will say is that pushing all the records into an array by hand is silly.
You're using fetch to get them one by one and then adding them all to an array yourself; why on earth aren't you just using PDOStatement::fetchAll()?
Example:
$sql = "select customer_id";
$sql .= " from";
$sql .= " t_customer t1";
$sql .= " inner join t_mail_address t2 using(mid, customer_id)";
$result = $conn->query($sql);
$customerArray = $result->fetchAll(PDO::FETCH_ASSOC);
// Execute every 1000 records
foreach (array_chunk($customerArray, 1000) as $execCustomerArray) {
    // My code to execute for every record.
    // ....
}
This may not fix your memory issue, because we can't see what the heavy lifting is for each customer record, but I will say that while loop you had was silly, even though it was most likely not the cause of your memory problem.
Depending on whether this is a CLI script or a web page, you could also use an incremental loop together with MySQL's LIMIT clause to implement basic paging for your data, preventing it from all coming into memory at once.
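A minimal sketch of that paging approach might look something like the code below. It assumes a PDO connection in $conn, a batch size of 1000, and an order by customer_id so the pages stay stable between queries; none of those choices come from the original question, so adjust them to fit:
$batchSize = 1000; // assumed batch size
$offset = 0;

$sql = "select customer_id";
$sql .= " from";
$sql .= " t_customer t1";
$sql .= " inner join t_mail_address t2 using(mid, customer_id)";
$sql .= " order by customer_id"; // assumed: a stable ordering so LIMIT/OFFSET paging is consistent
$sql .= " limit :limit offset :offset";

$stmt = $conn->prepare($sql);

do {
    $stmt->bindValue(':limit', $batchSize, PDO::PARAM_INT);
    $stmt->bindValue(':offset', $offset, PDO::PARAM_INT);
    $stmt->execute();

    // Only this page of rows is held in memory at a time.
    $execCustomerArray = $stmt->fetchAll(PDO::FETCH_ASSOC);

    // My code to execute for every record in this batch.
    // ....

    $offset += $batchSize;
} while (count($execCustomerArray) === $batchSize);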
Upvotes: 1