Reputation: 565
I have written a PHP page for downloading files from the server. The name of the file is passed as a GET variable in the URL, and then the following code serves up the file for download:
$filepath = "/path/to/files";
$filename = $_GET['id'];
if( ! file_exists($filepath . "/" . $filename) )
{
header("HTTP/1.1 404 Not Found");
@session_destroy();
exit(0);
}
$cmd = '/usr/bin/stat -c "%s" ' . $filepath . "/" . $filename;
$out = array();
$ret = 0;
exec( $cmd, $out, $ret );
header('Content-type: application/octet-stream');
header('Content-Length: ' . $out[0]);
header('Content-Disposition: attachment; filename="' . $filename . '"');
readfile($filepath . "/" . $filename);
NOTE: I am using the exec() call because most of the files are large (>2GB) and the larger files caused the filesize() and stat() functions to fail.
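For reference, the size lookup could be made a little more defensive along these lines (just a sketch; the escapeshellarg() call and the numeric check are not in my current code):
$target = $filepath . "/" . $filename;

// Ask stat(1) for the size in bytes, quoting the path for the shell.
$cmd = '/usr/bin/stat -c "%s" ' . escapeshellarg($target);
$out = array();
$ret = 0;
exec( $cmd, $out, $ret );

// Keep the size as a string so a 32-bit PHP doesn't truncate it,
// and only trust it if stat exited cleanly and returned a number.
$size = isset($out[0]) ? trim($out[0]) : '';
if( $ret !== 0 || ! is_numeric($size) )
{
    header("HTTP/1.1 500 Internal Server Error");
    exit(0);
}
// ...then use $size for the Content-Length header.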
Anyway, this code works perfectly for almost all of the files. However, when a file is exactly 2 GB in size (2147483648 bytes), no headers are sent and the browser attempts to download the PHP page itself, which results in an empty file called download.php being saved.
Here's what happened when I tested this with curl:
Test #1: Get a 1 GB file called bigfile1:
$ curl -v http://<SERVER>/download.php?id=bigfile1
* About to connect() to <SERVER> port 80 (#0)
* Trying <IP_ADDRESS>... connected
* Connected to <SERVER> (<IP_ADDRESS>) port 80 (#0)
> GET /download.php?id=bigfile1 HTTP/1.1
> User-Agent: curl/7.18.2 (i486-pc-linux-gnu) libcurl/7.18.2 OpenSSL/0.9.8g zlib/1.2.3.3 libidn/1.8 libssh2/0.18
> Host: <SERVER>
> Accept: */*
>
< HTTP/1.1 200 OK
< Date: Tue, 21 Jun 2011 19:10:06 GMT
< Server: Apache/2.2.4 (Unix) mod_ssl/2.2.4 OpenSSL/0.9.8c PHP/5.3.0
< X-Powered-By: PHP/5.3.0
< Set-Cookie: PHPSESSID=382769731f5e3782e3c1e3e14fc8ae71; path=/
< Expires: Thu, 19 Nov 1981 08:52:00 GMT
< Cache-Control: no-store, no-cache, must-revalidate, post-check=0, pre-check=0
< Pragma: no-cache
< Content-Length: 1073741824
< Content-Disposition: attachment; filename="bigfile1"
< Content-Type: application/octet-stream
<
* Connection #0 to host <SERVER> left intact
* Closing connection #0
Test #2: Get a 2 GB file called bigfile2
$ curl -v http://<SERVER>/download.php?id=bigfile2
* About to connect() to <SERVER> port 80 (#0)
* Trying <IP_ADDRESS>... connected
* Connected to <SERVER> (<IP_ADDRESS>) port 80 (#0)
> GET /download.php?id=bigfile2 HTTP/1.1
> User-Agent: curl/7.18.2 (i486-pc-linux-gnu) libcurl/7.18.2 OpenSSL/0.9.8g zlib/1.2.3.3 libidn/1.8 libssh2/0.18
> Host: <SERVER>
> Accept: */*
>
* Empty reply from server
* Connection #0 to host <SERVER> left intact
curl: (52) Empty reply from server
* Closing connection #0
I have created test files which are 1 GB, 2 GB, 3 GB, 4 GB, and 5 GB in size. The 2 GB file is the only one that causes this behavior, but it happens consistently, and it seems to happen regardless of the client browser or OS. The server is running Debian GNU/Linux 4.0, Apache 2.2.4, and PHP 5.3.0.
Any ideas would be greatly appreciated. Thanks!
Upvotes: 3
Views: 1788
Reputation: 22972
I put this in a comment, but comments don't allow code formatting, so I'll post it as an answer too, although it does not answer your question. What would happen if an attacker visits this URL?
http://yourserver.com/download.php?id=../../../../etc/passwd
Watch out for directory traversal in your id GET variable.
$realfile = realpath($filepath . '/' . $filename);

// realpath() resolves any "../" tricks; make sure the result still
// starts with $filepath before serving it.
if (substr($realfile, 0, strlen($filepath)) === $filepath && file_exists($realfile))
{
    do_the_download();
} else {
    die('file not found (or you are a bad person)');
}
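If the files all live directly in $filepath (which I'm assuming here), an even simpler guard is to strip any directory components from the id before building the path:
// basename() drops everything up to the last path separator,
// so "../../../../etc/passwd" collapses to just "passwd".
$filename = basename($_GET['id']);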
Upvotes: 1
Reputation: 48284
On a 64-bit installation running PHP 5.3.6, I had no problems with the file size you mentioned. A 32-bit installation gave me:
failed to open stream: Value too large for defined data type
But that said, I would not even use readfile(). Instead:
header("X-Sendfile: $filepath/$filename");
This requires that mod_xsendfile be installed and configured.
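With that in place, the download part of the script shrinks to roughly this (a sketch, assuming XSendFile is enabled for that directory in the Apache config):
// Apache reads and streams the file itself, so PHP never opens it
// and the Content-Length is filled in by the server.
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $filename . '"');
header("X-Sendfile: $filepath/$filename");
exit(0);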
Upvotes: 3