Reputation: 6601
I am having continuous problems with my script running out of memory.
I need the script to loop through each customer in a database, get all of their product data, and generate a text file. Each customer can have anywhere between 1 and 100,000 products.
I pull the product data in batches of 1,000 and write to a file to try to stop the script from timing out. This has helped a great deal; however, I am still having issues with customers that have large numbers of products. It seems to have problems once a customer has over 5,000 products.
It seems to stop writing to the file after the 5th batch (5,000 products), but the browser just hangs as though it is still generating the file, and the product count in the file never increases.
Can anyone help?
set_time_limit(0);

$db = new msSqlConnect('db');

$select = "SELECT customer FROM feeds ";
$run = mssql_query($select);

while ($row = mssql_fetch_array($run)){
    $arg = $row['customer'];
    $txt_file = 'shopkeeper/' . $arg . '.txt';
    $generated = generateFeed($db, $arg, $txt_file);
    if ($generated){
        $update = "UPDATE feeds SET lastGenerated = '$generated' WHERE customer = '$arg' ";
        mssql_query($update);
    }
}
function generateFeed($db, $customer, $file){
    // if the file already exists, delete it so a new one can be written
    if (file_exists($file)){
        unlink($file);
    }

    $datafeed_separator = "|";

    // get product details
    $productsObj = new Products($db, $customer);

    // find out how many products the customer has
    $countProds = $productsObj->countProducts();
    $productBatchLimit = 1000;

    // create a new file
    $fh = fopen($file, 'a');
    $counter = 1;

    for ($i = 0; $i < $countProds; $i += $productBatchLimit) {
        $txt = '';
        $limit = $productBatchLimit * $counter;
        $products = $productsObj->getProducts($i, $limit);
        foreach ($products as $product){
            $txt .=
                $prod_name . $datafeed_separator .
                $prod_brand . $datafeed_separator .
                $prod_desc . $datafeed_separator .
                $prod_price . $datafeed_separator . "\n";
        }
        fwrite($fh, $txt);
        flush();
        $counter++;
    }

    fclose($fh);
    $endTime = date('Y-m-d H:i:s');
    return $endTime;
}
Upvotes: 0
Views: 1428
Reputation: 976
I can see one thing that might help your memory usage. If you move the fwrite() inside the foreach loop, $txt can be reset on every iteration instead of accumulating a whole batch. So it would be something like:
foreach ($products as $product){
    $txt =
        $prod_name . $datafeed_separator .
        $prod_brand . $datafeed_separator .
        $prod_desc . $datafeed_separator .
        $prod_price . $datafeed_separator . "\n";
    fwrite($fh, $txt);
}
This will prevent $txt from growing large when a customer has many products.
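For reference, the whole loop with that change applied might look like the sketch below. It reuses the question's own `Products` class, so it is not self-contained; it also assumes `getProducts($offset, $limit)` takes an offset and a row count, in which case passing `$productBatchLimit * $counter` as the second argument makes each batch larger than the last, and a constant batch size would keep per-batch memory flat. If `getProducts()` actually works differently, adjust accordingly.

```php
// Sketch only: Products/getProducts() are the question's own classes.
// Assumes getProducts($offset, $limit) returns at most $limit rows.
$productBatchLimit = 1000;
$fh = fopen($file, 'a');

for ($i = 0; $i < $countProds; $i += $productBatchLimit) {
    // fetch a fixed-size batch rather than a growing one
    $products = $productsObj->getProducts($i, $productBatchLimit);
    foreach ($products as $product) {
        $txt = $prod_name . $datafeed_separator .
               $prod_brand . $datafeed_separator .
               $prod_desc . $datafeed_separator .
               $prod_price . $datafeed_separator . "\n";
        fwrite($fh, $txt);   // write one row at a time, so $txt stays small
    }
    unset($products);        // drop the batch before fetching the next one
}
fclose($fh);
```

With per-row writes, peak memory is bounded by one batch of rows plus one line of output, regardless of how many products a customer has.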
Upvotes: 1