Reputation: 71
I am trying to write a file of 50+ MB in PHP.
This mostly works as expected, but it is rather slow.
The code is quite simple:
$fileAccess = fopen($filename, 'w');
fwrite($fileAccess, $line);
*a lot of lines and loops...*
fclose($fileAccess);
My question is: can I do anything to optimize it?
I send around 350,000 fwrite
calls of around 100-10,000 characters each to the file, and I was wondering if there is something I can do to make the file generation more efficient.
Is it better to do all these small writes, should I internally "cache" a bit of the content before writing it, or is there a third option I just don't know about?
I have to keep my memory consumption down, or I will hit the server's limit.
Thanks
Upvotes: 1
Views: 3434
Reputation: 674
This may help you:
// Copy a big file from somewhere else
$src_filepath = 'http://example.com/all_the_things.txt';
$src = fopen($src_filepath, 'r');
$tmp_filepath = '...';
$tmp = fopen($tmp_filepath, 'w');
$buffer_size = 1024;
while (!feof($src)) {
    $buffer = fread($src, $buffer_size); // Read big file/data source/etc. in small chunks
    fwrite($tmp, $buffer);               // Write in small chunks
}
fclose($tmp); // Clean up: close the handles, not the path strings
fclose($src);
rename($tmp_filepath, '/final/path/to/file.txt');
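For a straight copy like this, PHP's built-in stream_copy_to_stream() does the chunked read/write loop internally and saves you the manual buffer handling. A minimal sketch under the same assumptions (the source URL and temporary path are placeholders):
$src = fopen('http://example.com/all_the_things.txt', 'r');
$tmp = fopen('/tmp/all_the_things.partial', 'w');
// Let PHP do the chunking internally
stream_copy_to_stream($src, $tmp);
fclose($tmp);
fclose($src);
rename('/tmp/all_the_things.partial', '/final/path/to/file.txt');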
Upvotes: 2
Reputation:
Rather than paying the overhead of many file writes, you can do just one. For example, I collect the lines in an array, implode it with newlines at the end, and write the result in a single call:
<?php
$filecontent = array();
$handle = fopen($filename, "w");
while ($x == $y)
{
    if ($condition_met)
    {
        $filecontent[] = "Some message to say this worked";
    }
    if (!$condition_met)
    {
        $filecontent[] = "Some message to say this failed";
    }
}
$filecontent = implode("\r\n", $filecontent);
fwrite($handle, $filecontent);
fclose($handle);
This way you hold only the open handle plus an array of values to be written; a single implode-and-write at the end usually works for me.
EDIT
If you are running into memory limits while still using an array, you can loop over it at the end instead of writing constantly. You will still see a performance hit, but I have tried to reduce it by adding a counter so writes are not done as often:
$filecontent = array();
$handle = fopen($filename, "w");
while ($x == $y)
{
    if ($condition_met)
    {
        $filecontent[] = "Some message to say this worked";
    }
}
$counts = 0;
$addtofile = "";
foreach ($filecontent as $addline)
{
    if ($counts < 2500)
    {
        $addtofile .= $addline . "\r\n";
        $counts++;
    }
    else
    {
        fwrite($handle, $addtofile);
        $addtofile = $addline . "\r\n"; // start the next batch with the current line so it isn't lost
        $counts = 1;
    }
}
if ($addtofile !== "")
{
    fwrite($handle, $addtofile); // flush whatever is left over
}
fclose($handle);
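Since memory is the asker's constraint, it is worth noting that the array above still holds the entire output in memory before the batched writes begin. A sketch of the same counter idea applied while the content is produced (the placeholder loop and messages are the same as above), which keeps memory bounded by the size of one batch:
$handle = fopen($filename, "w");
$addtofile = "";
$counts = 0;
while ($x == $y) // placeholder producer loop, as above
{
    if ($condition_met)
    {
        $addtofile .= "Some message to say this worked\r\n";
        $counts++;
    }
    if ($counts >= 2500) // flush every 2500 lines instead of storing them all
    {
        fwrite($handle, $addtofile);
        $addtofile = "";
        $counts = 0;
    }
}
if ($addtofile !== "") // flush the remainder
{
    fwrite($handle, $addtofile);
}
fclose($handle);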
Upvotes: 0
Reputation: 43451
Writing to the file is the slowest part here, so do it as little as possible: buffer the content and flush it in batches, either through fwrite on a single handle or with file_put_contents:
$content = '';
$threshold = 1000;
$count = 0;
$handler = fopen('my_file.txt', 'a');
foreach ($contents as $data) {
    $content .= $data;
    // Write in batches of $threshold items
    if (++$count >= $threshold) {
        fwrite($handler, $content);
        $content = ''; // Reset content
        $count = 0;    // Reset counter so the next batch fills up again
    }
}
// Write what's left in $content
if ($content !== '') {
    fwrite($handler, $content);
}
fclose($handler);
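If you prefer file_put_contents, the same batching works with the FILE_APPEND flag; note that each call opens and closes the file again, so keeping a single fopen handle as above is usually a little faster. A minimal sketch assuming the same $contents iterable:
$content = '';
$threshold = 1000;
$count = 0;
foreach ($contents as $data) {
    $content .= $data;
    if (++$count >= $threshold) {
        // FILE_APPEND opens, appends, and closes the file on every call
        file_put_contents('my_file.txt', $content, FILE_APPEND);
        $content = '';
        $count = 0;
    }
}
if ($content !== '') {
    file_put_contents('my_file.txt', $content, FILE_APPEND);
}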
Upvotes: 0