Reputation: 1110
I'm not sure if this is a PHP problem, server config problem or hardware problem, but thought I'd start with PHP and see if I get any suggestions. This code worked fine until recently, and I'm not aware of any configuration changes that might have caused this. We did upgrade from Debian Lenny to Squeeze recently (and from PHP 5.2 to 5.3), but the code works fine on another Squeeze server.
I have a bit of PHP code that takes a path to a file passed as a GET variable (rewritten via mod_rewrite from a request to http://site.com/request/for/file.pdf to http://site.com/downloader.php?path=/path/to/file.pdf). The reason for doing this relates to stats tracking.
The file gets passed along to this bit of code (simplified for readability):
if (is_readable($theFile)) {
    //$fh = fopen($theFile, "r");
    header("Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0");
    header("Pragma: no-cache");
    header("Content-Type: application/pdf");
    header("Content-Disposition: attachment; filename=\"" . basename($theFile) . "\"");
    header("Content-Length: " . filesize($theFile));
    sleep(1);
    //fpassthru($fh);
    readfile($theFile);
}
As you can see, the code only gets executed if the file is readable (i.e. the path is correct). For files under about 63MB, everything works fine. For anything over 63MB, the server returns a 500 error. (Firefox/Chrome report this as 'file not found', when I guess it should be 'internal server error', but that's another story.) There is nothing in the Apache error logs.
Can anybody think of any PHP or Apache server configuration that would cause this? PHP memory limits should not affect readfile or fpassthru, as far as I know. I do note that my PHP memory limit is 64MB; however, turning off the mod_rewrite redirect to the PHP script does not fix the problem. The files still won't download.
Many thanks for any suggestions.
UPDATE:
OK, so I increased the PHP memory limit from 64MB to 200MB. That allows files up to 200MB to download. However, the question remains: given that readfile and fpassthru should not be affected by the memory limit, and I have checked that output buffering is off, why are large files causing this (new) problem?
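For anyone hitting the same wall, a workaround that sidesteps the memory question entirely is to stream the file in fixed-size chunks instead of calling readfile(). This is a minimal sketch (the function name stream_file_chunked is my own, not from the code above); peak memory stays at the chunk size even if something is silently buffering output:

```php
<?php
// Sketch of a chunked alternative to readfile(), assuming $theFile has
// already been validated with is_readable() as in the question. Reading
// in small chunks and flushing keeps memory use flat for large files.
function stream_file_chunked($path, $chunkSize = 8192) {
    $fh = fopen($path, 'rb');
    if ($fh === false) {
        return false;
    }
    $sent = 0;
    while (!feof($fh)) {
        $chunk = fread($fh, $chunkSize);
        echo $chunk;
        $sent += strlen($chunk);
        flush(); // push each chunk to the client rather than accumulating it
    }
    fclose($fh);
    return $sent; // total bytes emitted, handy for logging/stats
}
```

You would call stream_file_chunked($theFile) in place of readfile($theFile), after sending the same headers.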
Upvotes: 1
Views: 2977
Reputation: 1110
The resolution was simple (after hours of work):
php_value output_buffering 0
added to the Apache virtual host config.
It seems that, regardless of what ob_get_level() reported, output buffering was taking place. In other words, merely having output buffering enabled in the configuration is enough for it to affect PHP memory use.
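To illustrate the gotcha: ob_get_level() only counts userland buffers started with ob_start(), while the output_buffering ini directive can be active underneath it. A quick diagnostic sketch, plus a belt-and-braces cleanup of any buffers that do exist before streaming:

```php
<?php
// ob_get_level() reports userland ob_start() nesting only; the ini-level
// output_buffering directive is visible via ini_get() instead.
echo "ob_get_level(): " . ob_get_level() . "\n";
echo "output_buffering ini: " . var_export(ini_get('output_buffering'), true) . "\n";

// Discard any userland buffers that are active before sending the file.
while (ob_get_level() > 0) {
    ob_end_clean();
}
```

Note that output_buffering itself cannot be changed at runtime with ini_set(), which is why the php_value line in the vhost (or php.ini) was the actual fix.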
Upvotes: 1