Reputation: 21437
I have the code below to output a big file, but it's falling over because PHP's memory use seems to grow and grow as the file is read:
<?php
// various header() calls etc.
$stream = fopen($tarfile, 'r');
ob_end_flush();
while (!feof($stream)) {
    $buf = fread($stream, 4096);
    print $buf;
    flush();
    unset($buf);
    $aa_usage = memory_get_usage(TRUE); // ← this keeps going up!
}
fclose($stream);
I had thought that the combination of flush() and unset() would limit the additional memory use to the 4 KB buffer, but I'm clearly wrong.
Upvotes: 8
Views: 1364
Reputation: 1
You can try loading only as much data as you need at first, and use fseek() to jump to the position of any additional data you load later.
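A minimal sketch of that idea: instead of reading the whole file, seek to the slice you actually need and read just that. The helper name read_range() and its parameters are illustrative, not from the question.

```php
<?php
// Hypothetical helper: read only a byte range of a file,
// so memory use is bounded by $length rather than the file size.
function read_range(string $path, int $offset, int $length): string {
    $fh = fopen($path, 'rb');
    fseek($fh, $offset);          // jump to where the needed data starts
    $data = fread($fh, $length);  // read only that slice
    fclose($fh);
    return $data;
}

// Usage: fetch the second 4 KB block only when it is actually needed.
// $chunk = read_range($tarfile, 4096, 4096);
```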
Upvotes: 0
Reputation: 72376
If all you need is to output the content of a file, then the right tool for the job is the PHP function readfile(). Replace all the code you posted with:
readfile($tarfile);
As the documentation says:
Note:
readfile() will not present any memory issues, even when sending large files, on its own. If you encounter an out of memory error ensure that output buffering is off with ob_get_level().
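Putting the two points together, a sketch might look like this. The wrapper name stream_file() is hypothetical; the buffer-clearing loop reflects the documentation's note that output buffering must be off (ob_end_flush() in the question only closes one buffering level, which is one way the whole file can end up held in memory).

```php
<?php
// Sketch: clear every output-buffering level, then let readfile()
// stream the file to the client without loading it into memory.
function stream_file(string $tarfile) {
    // Loop because buffers can be nested; ob_get_level() reports
    // how many are still active.
    while (ob_get_level() > 0) {
        ob_end_clean();
    }
    return readfile($tarfile); // bytes sent, or false on failure
}
```

Called in place of the question's read/print loop, this keeps memory flat regardless of file size, since readfile() writes directly to the output stream.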
Upvotes: 4