AarioAi

Reputation: 635

PHP: Why can I only download 16 KB (16624 bytes) of an ~8 MB file served with PHP echo?

I use file_get_contents() to read a file larger than 7 MB (7 - 30 MB), and then I echo it so that clients can download it.

Note: there is no direct link for downloading these files, so I have to read them in PHP and serve them as a download. Reading a large file into memory is absolutely the wrong idea --- the best way would be a direct download link. But not for now... the files are created by others, and they have no plan to move them to our cloud file servers. --- So I have to handle it in PHP.
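For context, the approach described above boils down to something like this (a minimal sketch; $file and $fileName are hypothetical placeholders, the real code is further down):

// Minimal sketch of the approach described above: read the whole file
// into memory and echo it as a forced download. $file and $fileName
// are hypothetical placeholders; the actual code is shown further down.
$file     = '/path/to/generated/file.bin';
$fileName = 'file.bin';
$content  = file_get_contents($file);        // whole file ends up in memory

header('Content-Type: application/force-download');
header('Content-Length: ' . strlen($content));
header("Content-Disposition: attachment; filename=\"$fileName\"");
echo $content;
exit(0);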

Here's something weird: it works well on my personal PHP 7 Docker container (the config files are here: https://github.com/AarioAi/Conf ).

There I can download the full file (about 7.8 MB).

But when I push this code to another server (PHP 5.5) with almost the same PHP and Nginx configs, I can only download 16 KB (16624 bytes) of the file.

Here's the debug information for the 7.8 MB binary file:

"HTTP_RANGE": null,
"Content-Length:": 7490833,
"md5": "aaefc8249e2a96574fa7b83e7bc168f0",
"fileSize": 7490833

To check whether the 16 KB segment is part of the original file, I filled a 16 MB file with the repeated string "Hello, Aario." using a shell script. Then I opened the downloaded 16 KB segment from the PHP 5.5 server: it contains only "Hello, Aario. Hello, Aario. ....."

So the 16 KB is definitely a segment of the original file.

Here's the debug information for the 16 MB "Hello, Aario." file:

"HTTP_RANGE": null,
"Content-type": "application/force-download",
"Content-Length:": 16155228,
"md5": "5c919dd9d1049a59e7c2a8cb55a695a9",
"fileSize": 16155228

I send nothing through the ob_* functions, so I don't think the output_buffering config is the problem. The PHP memory_limit settings are the same on both servers, and I also see no significant difference between the php-fpm.conf files.
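For reference, the settings in question can be compared on both servers with a small debugging snippet like this (hypothetical, not part of the download code):

// Hypothetical debugging snippet: dump the buffering-related settings
// so the two servers can be compared side by side.
var_dump([
    'output_buffering'        => ini_get('output_buffering'),
    'zlib.output_compression' => ini_get('zlib.output_compression'),
    'implicit_flush'          => ini_get('implicit_flush'),
    'memory_limit'            => ini_get('memory_limit'),
    'active output buffers'   => ob_get_level(),
]);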

My guess is that it may be a timeout. I checked the Nginx keepalive_timeout on the other server (PHP 5.5); it's 65, which is longer than on this server (PHP 7).

So could you give me some tips?

 //---------------------------------------------------------------------
 // Byte length of the content, used as a sample Content-Length below.
 // In practice this also needs to be handled inside the
 // if (isset($_SERVER['HTTP_RANGE'])) {} block.
 $length = (function_exists('mb_strlen') ? mb_strlen($content, '8bit') : strlen($content));
 //----------------------------------------------------------------------

 if (isset($_SERVER['HTTP_RANGE'])) {
        header('Accept-Ranges: bytes');
        // the client sent multiple ranges in one request; we can't handle that for now
        if (strpos($_SERVER['HTTP_RANGE'], ',') !== false) {
            header("Content-Range: bytes $contentStart-$contentEnd/$fileSize");
   ....
   ....
 }
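As a side note on the Content-Length comment above, a byte-accurate length for a single (hypothetical) range could be derived like this; $content, $contentStart, $contentEnd and $fileSize are assumed to be set and validated elsewhere:

// Sketch of a byte-accurate Content-Length for a single range.
// mb_strlen(..., '8bit') counts raw bytes even if mbstring.func_overload
// overrides strlen(); $contentStart, $contentEnd and $fileSize are assumed valid.
$byteLen    = function_exists('mb_strlen') ? mb_strlen($content, '8bit') : strlen($content);
$contentEnd = isset($contentEnd) ? $contentEnd : $byteLen - 1;
$length     = $contentEnd - $contentStart + 1;   // number of bytes actually sent
header("Content-Range: bytes $contentStart-$contentEnd/$fileSize");
header('Content-Length: ' . $length);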



 if ($debug > 0) {
     // debug mode: return the values that would be sent as headers
     return [
         'Content-type' => $contentType,
         'Content-Length:' => $length,
         'md5' => md5_file($file),
         'fileSize' => $fileSize,
     ];
 } else {
     header('Pragma: public');
     header('Expires: 0');
     header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
     header("Content-Type: $contentType");
     header('Content-Length: ' . $length);
     header("Content-Disposition: attachment; filename=\"$fileName\"");
     header('Content-Transfer-Encoding: binary');
     $content = function_exists('mb_substr') ? mb_substr($content, $contentStart, $length) : substr($content, $contentStart, $length);
     ob_start();
     ob_end_clean();
     echo $content;
     exit(0);
 }
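A variation sometimes used before echoing binary data is to discard every buffer that is already open, rather than starting a new one and discarding it. This is only a hedged alternative sketch, not what the code above does:

// Alternative sketch: close and discard every active output buffer so
// nothing wraps or compresses the binary payload that follows.
while (ob_get_level() > 0) {
    ob_end_clean();
}
echo $content;
exit(0);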

Upvotes: 1

Views: 243

Answers (1)

KIKO Software

Reputation: 16686

I don't see how the piece of code you show could work; it doesn't. For one thing, there's no file_get_contents() in it. And what does this do?

    ob_start();
    ob_end_clean();

Nothing? There are also a lot of variables that are not defined. In short, this is a bad example.

Reading large files with file_get_contents() and then echoing them is, in my opinion, not a good idea. The whole file is read into memory and stored there before it can be sent to the output. Traditionally, files have always been read in chunks and then sent to the output in chunks (see the chunked sketch after the example below).

A better way to stream a file to the output is this:

// open the file in a binary mode
$filename = '/mybigfile.huge';
$handle   = fopen($filename,'rb');

if ($handle)
{
  // send the right headers
  header("Content-Type: .....");
  header("Content-Length: " . filesize($filename ));
  ........

  // dump the file to the output
  fpassthru($handle);
  // close file
  fclose($handle);
}
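If you want the explicit "read in chunks, send in chunks" approach mentioned earlier instead of fpassthru(), a rough sketch would be (the 8192-byte chunk size is arbitrary):

// Sketch of reading and sending the file in chunks instead of fpassthru().
$handle = fopen($filename, 'rb');

if ($handle)
{
  header("Content-Length: " . filesize($filename));
  // ... other headers as above ...

  while (!feof($handle))
  {
    echo fread($handle, 8192);   // send one chunk
    flush();                     // push it to the client
  }
  fclose($handle);
}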

Upvotes: 1
