Amit

Reputation: 712

Allowed Memory Exhausted when reading files

I am using a PHP script (similar to the one below) to stream a file.
(The server stack is nginx + php-fpm + APC.)

function send_headers($content_type, $filename)
{
    @ob_end_clean();
    header("Content-Type: ".$content_type);
    header("Content-Disposition: attachment; filename=\"".$filename."\"");
    @ob_end_flush();
}

function stream_file($file)
{
    $fp = fopen($file, "r");
    while(true)
    {
        $buffer = fgets($fp, 1024);
        if ($buffer === FALSE)
        {
            break;
        }

        echo $buffer;
        $buffer = NULL;  // unset($buffer) gives the same memory exhaustion error
    }
    fclose($fp);
}

send_headers('text/plain', 'sample.txt');
stream_file('/home/linux/report.txt');


The following error is logged frequently in the nginx error log:

[error] 18391#0: *13673875 FastCGI sent in stderr: "PHP message: PHP Fatal error:
Allowed memory size of 134217728 bytes exhausted (tried to allocate 64 bytes) in
/home/linux/src/library/file.php on line XX" while reading response header from
upstream, client: XXX.XXX.XXX.XXX, server: example.com, request: 
"GET /file/download HTTP/1.1", upstream: "fastcgi://127.0.0.1:9000",
host: "example.com"

Why would PHP run out of 128 MB of memory with such a simple script?
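A minimal, self-contained way to check whether the fgets() loop or an active output buffer is what holds the memory (the temp-file name and sizes here are illustrative, not from the script above):

```php
<?php
// Diagnostic sketch: stream a ~1 MB temp file into an output buffer and
// measure how much the buffer holds afterwards. If the whole file shows up
// in the buffer, buffering -- not the fgets() loop -- is what exhausts memory.

function stream_plain($file)
{
    $fp = fopen($file, "r");
    while (($buffer = fgets($fp, 1024)) !== false) {
        echo $buffer;
    }
    fclose($fp);
}

$tmp = tempnam(sys_get_temp_dir(), "streamtest");
file_put_contents($tmp, str_repeat("0123456789\n", 100000)); // 1,100,000 bytes

ob_start();                  // simulate an active output buffer (ob_start in
stream_plain($tmp);          // a framework bootstrap, zlib compression, etc.)
$buffered = ob_get_length(); // bytes the buffer is holding in memory
ob_end_clean();

unlink($tmp);
echo "buffered: $buffered bytes\n"; // → buffered: 1100000 bytes
```

If $buffered equals the file size, every echo is being captured in memory rather than sent to the client, and a 128 MB limit is exhausted as soon as the file is large enough.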

Upvotes: 0

Views: 1335

Answers (2)

kieron

Reputation: 31

Replacing readfile() with the stream_file function worked for me. I also added the file size to my headers:

header('Content-Length: ' . $file->getSize());
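$file->getSize() here presumably comes from an SplFileInfo or framework file object; with a plain path, filesize() does the same job. A sketch under that assumption:

```php
<?php
// Sketch assuming a plain filesystem path instead of a file object
// (SplFileInfo::getSize() is the object-oriented equivalent of filesize()).
function send_length_header($path)
{
    $size = filesize($path);
    if ($size !== false) {
        header('Content-Length: ' . $size); // ignored under CLI, sent under a web SAPI
    }
    return $size;
}
```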

Upvotes: 0

Choerun Asnawi

Reputation: 181

Try this one:

function stream_file($file)
{
    $fp = fopen($file, "r");
    while ($buffer = fgets($fp, 1024))
    {
        echo $buffer;
    }
    fclose($fp);
}
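One caveat with the truthy loop condition above: in PHP the string "0" is falsy, so a final line consisting of just 0 (with no trailing newline) would stop the loop early. Comparing explicitly against false avoids that, and an explicit flush() per chunk (an addition of mine, not part of the answer) hands each chunk to the SAPI instead of letting it accumulate in a buffer:

```php
<?php
// Variant of the answer above: explicit false check plus flush() per chunk.
function stream_file($file)
{
    $fp = fopen($file, "r");
    if ($fp === false) {
        return;
    }
    while (($buffer = fgets($fp, 1024)) !== false) {
        echo $buffer;
        flush(); // push the chunk out rather than letting it pile up
    }
    fclose($fp);
}
```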

Upvotes: 2
