Reputation: 3534
I have a very large .csv file that I'm writing rows to from a MySQL query. I can't do a direct export because I have to process each row, but that's not a big deal. The problem is that I get an out of memory exception when I call fwrite() on the file:
Allowed memory size of 1572864000 bytes exhausted (tried to allocate 2262361 bytes)
This doesn't happen until the file in question reaches around 700MB. The code that fails is in a loop and looks like this:
$file=fopen(export_path."/".$filename, "a");
fwrite($file, implode("\n", $output)."\n");
fclose($file);
where $output[] holds up to 5,000 rows. I used to leave the file open until all the writing was done, but figured that keeping the file handle open the whole time might be using a lot of RAM, so I switched to opening and closing it per batch. No dice. What are some strategies for appending to very large files via PHP?
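For context, the surrounding loop looks roughly like this (a simplified sketch; processRow(), $result, and the batch size of 5,000 stand in for the real code):
$output = array();
while ($row = mysqli_fetch_assoc($result)) {
    // per-row processing happens here; processRow() is a placeholder
    $output[] = processRow($row);
    if (count($output) >= 5000) {
        // flush the batch to disk, then reset the array
        $file = fopen(export_path."/".$filename, "a");
        fwrite($file, implode("\n", $output)."\n");
        fclose($file);
        $output = array();
    }
}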
I looked at What is the best way to write a large file to disk in PHP? but I still got the out of memory exception. Is there a trick to writing to a large file in PHP without having to load the file in memory?
Upvotes: 2
Views: 2470
Reputation: 1601
The problem here likely relates to the quantity of data you're attempting to store in the array. The solution is to write each line to the file as you process the database records, instead of accumulating a large amount of data in an array and then using implode() at the end to write it all to the file in a single dump.
<?php
/* $aRecords contains the database records as returned from your SQL query ($output) */
if (count($aRecords) > 0) {
    $hFile = fopen('output.txt', 'a') or die('Unable to open file with writable permission');
    foreach ($aRecords as $aRecord) {
        /* Do your processing here and assign the result to $sLine, instead of adding an
           element to an array such as $aRecords[] = 'data'; or $output[] = 'data'; */
        $sLine = $aRecord['whatever'] . $aRecord['from'] . $aRecord['your'] . $aRecord['data'];
        fwrite($hFile, $sLine . "\n");
    }
    fclose($hFile);
}
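Since the output is a .csv, the same per-row idea can go through fputcsv(), which writes one row per call and handles quoting and escaping for you. This is only a sketch (the mysqli result set and the column names are assumptions about your data), but it shows the pattern of never holding more than one row in memory:
<?php
/* Sketch: stream each processed row straight into the CSV with fputcsv().
   $result is assumed to be a mysqli result set; the column names below are placeholders. */
$hFile = fopen('output.csv', 'a') or die('Unable to open file for appending');
while ($row = mysqli_fetch_assoc($result)) {
    /* Do the per-row processing here instead of pushing onto an array. */
    $fields = array($row['col1'], $row['col2'], $row['col3']);
    fputcsv($hFile, $fields);
}
fclose($hFile);
Because each row is written and then discarded, memory use stays flat no matter how large the file on disk grows.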
Upvotes: 1