Reputation: 3349
I'm currently using the following PHP function to allow a user to select a file and then download it. This happens over FTP. However, if the user chooses a large file, the server is locked up for any other requests while the download is in progress. Is there any way I can serve the file while having PHP continue to respond to other requests?
I need PHP to verify that the user is permitted to download the file with their credentials so I can't just host it as an asset. The file is located on an FTP server.
function download($file) {
    // $this->get() fetches the file from the FTP server into $this->downloadDir
    if ($this->get($file)) {
        // Tell the browser to treat the response as a file download
        header("Content-Disposition: attachment; filename=\"$file\"");
        header("Content-Type: application/octet-stream");
        header("Pragma: ");
        header("Cache-Control: no-cache");
        header("Expires: 0");

        // readfile() streams the file straight to the output buffer
        // and returns the number of bytes sent
        readfile($this->downloadDir . $file);

        // Remove the temporary local copy once it has been sent
        unlink($this->downloadDir . $file);
        exit;
    } else {
        return false;
    }
}
Upvotes: 2
Views: 164
Reputation: 10536
PHP is not the best tool for this kind of work, but it can delegate the job to the web server you are using. Since the file ends up on the same machine as your application, this can work.
All major web servers that usually run PHP applications (Apache, lighttpd and nginx) have support for X-Sendfile.
To use it, first enable the functionality in your web server (check each server's documentation), then add a new header in your script:
Apache:
header("X-Sendfile: $location_of_file_to_download");
Lighttpd:
header("X-LIGHTTPD-send-file: $location_of_file_to_download");
nginx:
header("X-Accel-Redirect: $location_of_file_to_download");
The web server will catch this header from your application and replace the body of your PHP response with the file. While it serves the file to the user, PHP is freed up and ready to serve the next request.
(The other headers will be kept, so you can retain the Content-Type and Content-Disposition headers.)
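Applied to the download() function from the question, the whole thing reduces to something like the sketch below. It assumes Apache with mod_xsendfile enabled (swap in the lighttpd/nginx header if needed); the get() method and downloadDir property are the ones from the question's code.

function download($file) {
    // The credential check and the FTP fetch stay in PHP,
    // exactly as in the original function
    if (!$this->get($file)) {
        return false;
    }

    header("Content-Disposition: attachment; filename=\"$file\"");
    header("Content-Type: application/octet-stream");

    // Hand the actual transfer off to the web server
    header("X-Sendfile: " . $this->downloadDir . $file);
    exit;
}

Note that the script exits before the web server starts sending the file, so the unlink() call from the original code can no longer go here; the temporary copy has to be cleaned up separately.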
Upvotes: 1
Reputation: 321
Since PHP is single-threaded, you would have to keep a structure for each request. Then, instead of processing one request at a time, you would loop over those structures and advance all of them a little at a time (send a few hundred KB to one client, move on to the next, and so on), roughly as sketched below.
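Purely as a sketch of that idea (nothing below comes from the original code: the $downloads array, its fields and the chunk size are all made up for illustration), the serving loop would look roughly like this:

$downloads = [];          // one entry per active download, e.g.
                          // ['socket' => $clientSocket, 'file' => $fileHandle];
                          // whatever accepts incoming requests appends to this
$chunkSize = 256 * 1024;  // push roughly 256 KB per download per pass

while (true) {
    foreach ($downloads as $key => $dl) {
        $chunk = fread($dl['file'], $chunkSize);
        if ($chunk === false || $chunk === '') {
            // Finished (or failed): close everything and drop the entry
            fclose($dl['file']);
            fclose($dl['socket']);
            unset($downloads[$key]);
            continue;
        }
        // Send one chunk to this client, then move on to the next one
        fwrite($dl['socket'], $chunk);
    }

    if ($downloads === []) {
        usleep(10000);    // nothing to serve; avoid a busy loop
    }
}

This essentially reimplements what a dedicated server already does for you, which is why the suggestion below may be the better route.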
Honestly, PHP doesn't sound like the right language for this job. Why not use a purpose-built FTP server such as vsftpd, or something of that nature?
Upvotes: 1