Reputation: 274
I'm trying to write a script in PHP that downloads a large zip file (2,663,439,370 bytes), and I ran into an interesting but frustrating problem: the script downloads the first 2,147,483,647 bytes, then keeps downloading, but instead of appending byte number 2,147,483,648, 2,147,483,649 and so on, it starts appending the file's bytes again from byte number 1.
So the downloaded file is made up of: byte 1, byte 2, ..., byte 2,147,483,647, byte 1, byte 2, ... and so on.
I noticed that 2,147,483,647 is the maximum value a signed 32-bit integer can store. However, my server is a 64-bit system and can store values greater than that. To prove it, var_dump((int) 2147483648) returns the correct integer.
My download script is as correct as it can be (it's copy-pasted from the example on php.net):
header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="Certificat.zip"');
header('Expires: 0');
header('Cache-Control: must-revalidate');
header('Pragma: public');
header('Content-Length: ' . filesize($zipname));
readfile($zipname);
Has anyone run into this problem?
Upvotes: 4
Views: 994
Reputation: 274
I fixed the problem in the end by eliminating the readfile() function and outputting the content of the file myself using fopen() and fread().
However, the reason why readfile() fails when dealing with such large files remains an open question (I've read the documentation on php.net and couldn't find any notes about such an issue). So I'm still waiting for an experienced programmer to enlighten me :).
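For reference, here is a minimal sketch of the fopen()/fread() approach described above; the function name, 8 KB chunk size, and return value are illustrative choices, not the exact code I used:

```php
<?php
// Sketch: stream a file to the client in fixed-size chunks with
// fopen()/fread() instead of handing the whole thing to readfile().
// stream_file() and the 8192-byte chunk size are placeholder choices.
function stream_file(string $path, int $chunkSize = 8192): int
{
    $handle = fopen($path, 'rb');
    if ($handle === false) {
        return 0; // could not open the file
    }
    $sent = 0;
    while (!feof($handle)) {
        $chunk = fread($handle, $chunkSize);
        if ($chunk === false || $chunk === '') {
            break;          // read error or end of file
        }
        echo $chunk;        // emit one chunk at a time
        flush();            // push it toward the client immediately
        $sent += strlen($chunk);
    }
    fclose($handle);
    return $sent;
}
```

It would be called in place of readfile($zipname), after the same header() calls as in the question, e.g. stream_file($zipname);.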
Upvotes: 0
Reputation: 14091
Reading the 5.6 source, readfile returns RETURN_LONG(size). Here's the source reference.
Digging some more, it appears that this macro deals with the long int type, which is usually 4 bytes, giving us exactly the signed 32-bit maximum where your function stops. I didn't dig deeper than this; it's my assumption, but given the behavior you're experiencing, it would be enough for me (if not for someone more pedantic).
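As a quick sanity check on that 4-byte figure (an illustrative snippet, not taken from the readfile source):

```php
<?php
// The wrap-around point reported in the question is exactly the
// maximum value of a signed 32-bit (4-byte) long: 2^31 - 1.
$max32 = 2 ** 31 - 1;
var_dump($max32 === 2147483647); // bool(true)
```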
On the other hand, I've personally never used readfile; the fopen/fread combo has always proved better in terms of performance and memory use. Since you can read in chunks instead of gulping a 2 GB mammoth in one go, it's simply easier on server resources.
Upvotes: 1
Reputation: 388
I'm not sure if it'd be helpful in your case, but you could try removing
header('Content-Length: ' . filesize($zipname));
I did this a few times when loading from an ongoing stream; the browser keeps loading until the script stops responding with content. I'm not sure it works with a file this large, though; it could possibly be a PHP limit.
Upvotes: 0