Reputation: 1102
In order to avoid overloading the server, I run the query in a loop, fetching the roughly 150k members in batches and storing them in an array. This works fine, but when the loop has finished its job the whole array has to be printed out, and this takes so long that the page ends up crashing.
$development = array(
    'testing'       => false,
    'testing_loops' => 1
);
$settings = array(
    'times_looped'      => 0,
    'members_at_a_time' => 2000,
    'print_settings'    => true,
    'members_looped'    => 0,
    'test'              => 0,
);
// Write an array of rows to the output stream as CSV
function outputCSV($data)
{
    $outstream = fopen("php://output", 'w');
    array_walk($data, '__outputCSV', $outstream);
    fclose($outstream);
}

// array_walk() callback: write a single row as one comma-separated line
function __outputCSV(&$vals, $key, $filehandler)
{
    fwrite($filehandler, implode(',', $vals) . "\n");
}
function getMembers(&$settings, $ee, &$nearest_rest_result_array)
{
    // SQL OFFSET: skip the batches already fetched
    $sql_from = $settings['times_looped'] * $settings['members_at_a_time'];
    // SQL LIMIT: "offset, row_count"
    $sql_limit = $sql_from . ', ' . $settings['members_at_a_time'];
    $settings['test'] = $sql_limit;
    // GET MEMBERS
    $query = $ee->EE->db->query("SELECT m.email,
        cr.near_rest_1_id, cr.near_rest_1_distance,
        cr.near_rest_2_id, cr.near_rest_2_distance,
        cr.near_rest_3_id, cr.near_rest_3_distance
        FROM exp_members m
        LEFT JOIN exp_menucard_closest_restaurants cr
            ON m.member_id = cr.member_id
        WHERE m.group_id IN (8, 14)
        LIMIT " . $sql_limit);
    // Check if members were found
    if ($query->num_rows() == 0)
    {
        return 0;
    }
    // Update the number of members processed so far
    $settings['members_looped'] = $settings['members_looped'] + $query->num_rows();
    // Loop members
    foreach ($query->result_array() as $row)
    {
        // Format the three distances: over 1 km as "x,xx km",
        // under 1 km as whole meters
        $distance = array();
        for ($i = 1; $i <= 3; $i++)
        {
            $km = $row['near_rest_' . $i . '_distance'];
            if ($km > 1.0)
            {
                $distance[$i] = number_format($km, 2, ',', '.') . ' km';
            }
            else
            {
                $distance[$i] = round($km * 1000) . ' meter';
            }
        }
        $nearest_rest_result_array[] = array(
            'email'                => $row['email'],
            'near_rest_1_id'       => $row['near_rest_1_id'],
            'near_rest_1_distance' => $distance[1],
            'near_rest_2_id'       => $row['near_rest_2_id'],
            'near_rest_2_distance' => $distance[2],
            'near_rest_3_id'       => $row['near_rest_3_id'],
            'near_rest_3_distance' => $distance[3]
        );
    }
    // Return the batch size so the caller knows whether to keep looping
    return $query->num_rows();
}
// Loop
$nearest_rest_result_array = array();
$more_rows = true;
while ($more_rows)
{
    // Stop early when testing
    if ($development['testing'] == true && $settings['times_looped'] >= $development['testing_loops'])
    {
        break;
    }
    // Get the next batch of members
    $more_rows = getMembers($settings, $this, $nearest_rest_result_array);
    $settings['times_looped']++;
    // Got the last batch of members
    if ($more_rows < $settings['members_at_a_time'])
    {
        break;
    }
}
When the loop has finished its job, the whole array is printed out:
// Write to CSV
outputCSV($nearest_rest_result_array);
Upvotes: 0
Views: 269
Reputation: 9775
Consider implementing pagination in your webpage.
Let's take an example. Suppose your database has 10,000 rows. There may be no need to display all 10,000 rows at once. Instead, we can display 100 records per page and provide links to 100 such pages.
Best Example can be https://www.google.co.in/?gws_rd=cr&ei=-HggUuXWBMj4rQeNr4CADw#q=pagination+in+php
Out of 6,190,000 results, they show only 11 per page.
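For illustration, here is a minimal sketch of LIMIT/OFFSET pagination, assuming a plain PDO connection in $pdo and reusing the question's exp_members table (the column choice is illustrative):

// One page of members at a time ($pdo is an assumed PDO connection)
$per_page = 100;
$page     = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
$offset   = ($page - 1) * $per_page;

$stmt = $pdo->prepare('SELECT email FROM exp_members ORDER BY member_id LIMIT :limit OFFSET :offset');
$stmt->bindValue(':limit',  $per_page, PDO::PARAM_INT);
$stmt->bindValue(':offset', $offset,   PDO::PARAM_INT);
$stmt->execute();

foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
    echo htmlspecialchars($row['email']), "<br>\n";
}

Each request then touches only 100 rows instead of the whole table.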
Upvotes: 0
Reputation: 198237
The mistake starts early: use an iterator instead of materializing the whole result as an array, which is what you currently do here:
foreach($query->result_array() as $row)
PDO and MySQLi both let you iterate over the result set. Create the output on the fly and stream it to the client; your webserver will normally chunk it for you, and if it doesn't, set your PHP output buffer to 4096 or similar.
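As an illustration, a minimal sketch of that streaming approach, assuming a plain PDO connection in $pdo rather than the question's ExpressionEngine db class:

// Turn off MySQL's client-side result buffering so rows are fetched
// one at a time instead of being loaded into memory all at once.
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

header('Content-Type: text/csv');
$out = fopen('php://output', 'w');

$stmt = $pdo->query('SELECT m.email, cr.near_rest_1_id, cr.near_rest_1_distance
                     FROM exp_members m
                     LEFT JOIN exp_menucard_closest_restaurants cr
                         ON m.member_id = cr.member_id
                     WHERE m.group_id IN (8, 14)', PDO::FETCH_ASSOC);

foreach ($stmt as $row) {   // iterate the statement itself, not an array copy
    fputcsv($out, $row);    // each row is written out and then discarded
}
fclose($out);

Note that with an unbuffered query the connection can't run another statement until the result has been read to the end, which is fine for a one-off export like this.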
Upvotes: 2
Reputation: 57721
Don't use a foreach loop. Use a while loop that reads one row at a time from the database and writes it to the CSV file. That way you're operating line by line, which doesn't use as much memory.
If you're working with large data sets it's usually better to think in terms of iterators or streams, rather than trying to process the whole thing in one big operation.
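A short sketch of that row-by-row pattern with MySQLi, streaming straight to the output (the connection credentials and the trimmed-down query are placeholders, not taken from the question):

// Placeholder credentials for illustration
$db  = new mysqli('localhost', 'user', 'password', 'database');
$out = fopen('php://output', 'w');

// MYSQLI_USE_RESULT streams the result instead of buffering it client-side
$result = $db->query('SELECT email FROM exp_members', MYSQLI_USE_RESULT);

while ($row = $result->fetch_assoc()) { // one row in memory at a time
    fputcsv($out, $row);                // written out immediately
}

$result->free();
fclose($out);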
Upvotes: 3