Reputation: 667
I have a script that fetches data from another server via curl_multi_exec() using the code below. The script works fine, but I'm getting an out-of-memory exception.
$curly = array();        // array of curl handles
$result = array();       // data to be returned
$mh = curl_multi_init(); // multi handle

foreach ($xmlarray as $id => $d) {
    $curly[$id] = curl_init();
    curl_setopt($curly[$id], CURLOPT_URL, $url);
    curl_setopt($curly[$id], CURLOPT_POST, true);
    curl_setopt($curly[$id], CURLOPT_POSTFIELDS, $d);
    curl_setopt($curly[$id], CURLOPT_RETURNTRANSFER, true);
    curl_setopt($curly[$id], CURLOPT_TIMEOUT, 60);
    curl_setopt($curly[$id], CURLOPT_SSLVERSION, 3);
    curl_multi_add_handle($mh, $curly[$id]);
} // queue a request for each of the sub-queries in $xmlarray

// execute the handles
$running = null;
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);
} while ($running > 0);

// get content and remove handles
foreach ($curly as $id => $c) {
    $result[$id] = curl_multi_getcontent($c);
    curl_multi_remove_handle($mh, $c);
}

$active = null;
curl_multi_close($mh);
file_put_contents('test.xml', $result);
$xmlarray here contains an array of requests, each of which covers around 500 users. When running the script for 5,000 users everything works fine, but when running it for 10,000 users I get an out-of-memory exception, and debugging shows that most of the memory is used by curl_multi_exec()!
What would be the best way for me to overcome this? Any assistance is highly appreciated. Thanks in advance.
EDIT
I tried splitting my $xmlarray into a number of smaller arrays and running each batch separately (code below). This got me from 5k users to 13k users being processed.
$xmlarrayB = array_chunk($xmlarray, 5, true);
if (is_array($xmlarrayB)) {
    foreach ($xmlarrayB as $xmlarrayBA) {
        $curly = array();        // array of curl handles
        $result = array();       // data to be returned
        $mh = curl_multi_init(); // multi handle

        foreach ($xmlarrayBA as $id => $d) {
            $curly[$id] = curl_init();
            curl_setopt($curly[$id], CURLOPT_URL, $url);
            curl_setopt($curly[$id], CURLOPT_POST, true);
            curl_setopt($curly[$id], CURLOPT_POSTFIELDS, $d);
            curl_setopt($curly[$id], CURLOPT_RETURNTRANSFER, true);
            curl_setopt($curly[$id], CURLOPT_TIMEOUT, 60);
            curl_setopt($curly[$id], CURLOPT_SSLVERSION, 3);
            curl_multi_add_handle($mh, $curly[$id]);
        } // queue a request for each of the sub-queries in this batch

        // execute the handles
        $running = null;
        do {
            curl_multi_exec($mh, $running);
            curl_multi_select($mh);
        } while ($running > 0);

        // get content and remove handles
        foreach ($curly as $id => $c) {
            $result[$id] = curl_multi_getcontent($c);
            curl_multi_remove_handle($mh, $c);
        }

        $active = null;
        curl_multi_close($mh);
    }
}
Any idea how to increase that number to 50k users?
EDIT2 - a sample $xmlarray entry for just 2 users
Accept-Encoding: gzip&token=305c7c5be78b5c8dd583312fe20578ac&subid=test_sub_id&idomain=adk.mediaff.com&cdomain=adk.mediaff.com&request=%3Crequest%3E%3Cemail%3E%3Crecipient%3Ed3e51df8f588139fb210d898c5964c3f%3C%2Frecipient%3E%3Clist%3E23413%3C%2Flist%3E%3Cdomain%3Eicloud.com%3C%2Fdomain%3E%3Ccountrycode%3E%3C%2Fcountrycode%3E%3Cmetrocode%3E%3C%2Fmetrocode%3E%3Cpostalcode%3E%3C%2Fpostalcode%3E%3Cgender%3E2%3C%2Fgender%3E%3Ctest%3E0%3C%2Ftest%3E%3C%2Femail%3E%3Cemail%3E%3Crecipient%3E728929dfbc0d785e41316d4fa97518e9%3C%2Frecipient%3E%3Clist%3E23413%3C%2Flist%3E%3Cdomain%3Ehotmail.com%3C%2Fdomain%3E%3Ccountrycode%3E%3C%2Fcountrycode%3E%3Cmetrocode%3E%3C%2Fmetrocode%3E%3Cpostalcode%3E%3C%2Fpostalcode%3E%3Cgender%3E1%3C%2Fgender%3E%3Ctest%3E0%3C%2Ftest%3E%3C%2Femail%3E%3C%2Frequest%3E&test=0
Upvotes: 1
Views: 840
Reputation: 1107
I would suggest splitting your array $xmlarray
into chunks, with a chunk size of maybe 500 or 5000, and then executing your curl requests for each of those chunks. Use FILE_APPEND
with file_put_contents()
when writing each result to the file; otherwise the file will be overwritten for every chunk.
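A minimal sketch of that approach, assuming the same $url and $xmlarray from your question (the batch size of 500 and the test.xml file name are just illustrative):
// Process $xmlarray in batches so only one batch of responses
// is ever held in memory at a time.
$batches = array_chunk($xmlarray, 500, true); // true preserves the $id keys

foreach ($batches as $batch) {
    $curly = array();
    $mh = curl_multi_init();

    foreach ($batch as $id => $d) {
        $curly[$id] = curl_init();
        curl_setopt($curly[$id], CURLOPT_URL, $url);
        curl_setopt($curly[$id], CURLOPT_POST, true);
        curl_setopt($curly[$id], CURLOPT_POSTFIELDS, $d);
        curl_setopt($curly[$id], CURLOPT_RETURNTRANSFER, true);
        curl_setopt($curly[$id], CURLOPT_TIMEOUT, 60);
        curl_multi_add_handle($mh, $curly[$id]);
    }

    $running = null;
    do {
        curl_multi_exec($mh, $running);
        curl_multi_select($mh); // wait for activity instead of busy-looping
    } while ($running > 0);

    foreach ($curly as $id => $c) {
        // FILE_APPEND adds each response to the file instead of overwriting it.
        file_put_contents('test.xml', curl_multi_getcontent($c), FILE_APPEND);
        curl_multi_remove_handle($mh, $c);
        curl_close($c); // free this handle's response buffer before the next batch
    }

    curl_multi_close($mh);
    unset($curly); // drop references so PHP can reclaim the batch's memory
}
Closing each easy handle with curl_close() after removing it from the multi handle matters here: curl_multi_remove_handle() alone does not release the response buffer that CURLOPT_RETURNTRANSFER keeps around, which is likely where the memory went in your original script.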
Upvotes: 2