YD8877

Reputation: 10800

Downloading / Resuming large files in PHP using native PHP functions

I have to implement a simple file download client in PHP capable of downloading large files as well as resuming them.

Is there a way I can download large files (>700 MB) in PHP while keeping my PHP memory limit at 128M? I'm guessing this has to do with writing to a file pointer. Any clue which file-handling functions to use? There are so many. I'm guessing fopen, flock, (fwrite, fgets, fread), fclose. Or should I use cURL?

How do I resume downloads that were broken by a script execution timeout, the user stopping the script, a remote server timeout, etc.?

Upvotes: 0

Views: 1631

Answers (2)

deceze

Reputation: 522499

This should be possible using cURL by setting the CURLOPT_FILE and CURLOPT_RESUME_FROM options. I'm not sure whether cURL will overwrite the file or append to it, nor whether it'll buffer the file in memory or write it straight to disk. You may have to do some tests there.
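For illustration, here's a minimal sketch of the cURL approach; the URL and destination path are placeholders, and error handling is kept to the essentials. Opening the local file in append mode sidesteps the overwrite-vs-append question, since cURL just writes whatever it receives to the pointer you hand it:

```php
<?php
$url  = 'http://example.com/large-file.zip'; // hypothetical source
$dest = '/tmp/large-file.zip';               // hypothetical destination

// Resume from wherever the previous attempt left off.
$offset = file_exists($dest) ? filesize($dest) : 0;

// Append mode, so new bytes go after the ones already on disk.
$fp = fopen($dest, 'ab');

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp);            // stream the body to the file pointer
curl_setopt($ch, CURLOPT_RESUME_FROM, $offset); // libcurl sends the Range header for you
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);

if (curl_exec($ch) === false) {
    // The transfer died part-way; the partial file stays on disk,
    // so running the script again resumes from the new offset.
    echo 'cURL error: ' . curl_error($ch);
}

curl_close($ch);
fclose($fp);
```

Because CURLOPT_FILE streams the response body directly to the file handle, the script never holds more than a small buffer in memory, which should keep it well under a 128M limit.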

If you want more control over the whole process, you can use fsockopen to create a raw connection to the server you're downloading from and write to and read from this connection using the normal fread and fwrite functions. You'd have to send (fwrite) the correct HTTP headers to the connection to initiate the transfer—most importantly the Range header for resuming transfers—and then read a few bytes using fread, then write those to a file and repeat until the transfer is complete.
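To make that concrete, here's a rough sketch of the raw-socket approach, assuming a placeholder host, path, and destination file, plain HTTP on port 80, and no handling of redirects or chunked transfer encoding:

```php
<?php
$host = 'example.com';         // hypothetical host
$path = '/large-file.zip';     // hypothetical path
$dest = '/tmp/large-file.zip'; // hypothetical destination

$offset = file_exists($dest) ? filesize($dest) : 0;

$fp = fsockopen($host, 80, $errno, $errstr, 30);
if (!$fp) {
    die("Connection failed: $errstr ($errno)");
}

// Send the request by hand; the Range header asks the server
// to start sending at $offset instead of byte 0.
$request  = "GET $path HTTP/1.1\r\n";
$request .= "Host: $host\r\n";
$request .= "Range: bytes=$offset-\r\n";
$request .= "Connection: close\r\n\r\n";
fwrite($fp, $request);

// Skip the response headers: read lines until the blank line.
while (($line = fgets($fp)) !== false && trim($line) !== '') {
    // A "206 Partial Content" status line means the server honoured the Range.
}

// Stream the body to disk a few KB at a time so memory use stays flat.
$out = fopen($dest, 'ab');
while (!feof($fp)) {
    fwrite($out, fread($fp, 8192));
}
fclose($out);
fclose($fp);
```

If the server honours the Range header it replies with 206 Partial Content and sends only the missing bytes; if it replies 200 instead, it's sending the whole file from the start, and you'd need to truncate and begin over rather than append.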

Upvotes: 2

Yanick Rochon

Reputation: 53606

The problem is not PHP, it's how the file is sent. While you can fopen() and fread(), etc., on the server, how would you fopen() and fwrite() on the client side? Standard JavaScript cannot open files from within the browser; you would have to rely on other methods, such as (gulp) Java applets or a Flash component (??)...

Well, since HTML5 it might be possible; however, it is not yet fully adopted by major browsers.

Upvotes: 0
