wiseguydigital

Reputation: 315

Downloading large remote images using PHP

I know there are lots of posts similar to this, but after searching SO I still haven't found an answer.

I am looking to write a script that acts as a proxy for downloading large remote images (around 10 MB each). So far I am using curl to read in the remote image URL and then sending headers to force a download. Something like this (not the full script):

function getRemoteFile($url) {
  $ch = curl_init();
  curl_setopt($ch, CURLOPT_URL, $url);
  // Buffer the entire response in memory and return it as a string.
  curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
  curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 50);
  $data = curl_exec($ch);
  curl_close($ch);
  return $data;
}

$file = getRemoteFile($url);

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="random.jpg"');
header('Content-Length: ' . strlen($file));
echo $file;

This works, but is there a better way? The script may see quite a lot of traffic: perhaps 300 concurrent users making 10 requests each.
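One lower-memory variant of the approach above (a sketch, not a drop-in replacement) is to stream each chunk to the client as it arrives via `CURLOPT_WRITEFUNCTION`, instead of buffering the full 10 MB with `CURLOPT_RETURNTRANSFER`. The `streamRemoteFile` name and the omission of `Content-Length` (unknown until the transfer finishes) are my assumptions:

```php
<?php
// Sketch: stream a remote file straight to the client instead of
// buffering the whole response in memory. Assumes $url is trusted.
function streamRemoteFile($url, $filename = 'random.jpg') {
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . $filename . '"');

    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 50);
    // Echo each chunk as it arrives, so peak memory stays at the
    // chunk size rather than the full image size.
    curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($ch, $chunk) {
        echo $chunk;
        return strlen($chunk); // tell curl the chunk was consumed
    });
    $ok = curl_exec($ch);
    curl_close($ch);
    return $ok;
}
```

This keeps per-request memory roughly constant, which matters more than raw speed at 300 concurrent users.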

The images will be served from a server on the same network.

Upvotes: 1

Views: 430

Answers (1)

Baba

Reputation: 95111

10 MB per image is pretty large with 300 concurrent users making 10 requests each.

You are saying 10 MB * 300 * 10 = 30,000 MB = about 30 GB of transfer.

I suggest you use a job queue.

You can use Gearman:

// Worker: registers getRemoteFile() under the job name "download"
// and blocks waiting for jobs from the Gearman server.
$worker = new GearmanWorker();
$worker->addServer(); // defaults to 127.0.0.1:4730
$worker->addFunction("download", "getRemoteFile");
while ($worker->work());
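For completeness, the submitting side would look roughly like this (a sketch; it assumes the PECL gearman extension, a gearmand server on the default 127.0.0.1:4730, and the "download" job name registered by the worker above — the `submitDownload` helper name is mine):

```php
<?php
// Sketch of the client side: queue a download as a background job so
// the web request returns immediately instead of waiting on a 10 MB
// transfer. Requires the PECL gearman extension (an assumption here).
function submitDownload($url) {
    $client = new GearmanClient();
    $client->addServer(); // default host/port, 127.0.0.1:4730
    // doBackground() queues the job and returns a job handle without
    // waiting for the worker to finish.
    return $client->doBackground('download', $url);
}
```

The returned job handle can be stored (e.g. in the session) so the browser can later ask whether its download has finished.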

You can then use AJAX to check whether the image has been downloaded and display it.
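A minimal status endpoint for such an AJAX check could look like this (a sketch; the download directory and the job-id-to-filename mapping are assumptions, not part of the answer):

```php
<?php
// Sketch of a status endpoint the browser polls: reports whether a
// queued download has finished. Directory layout is an assumption.
function downloadStatus($jobId, $downloadDir = '/tmp/downloads') {
    // basename() prevents path traversal via a crafted job id.
    $path = $downloadDir . '/' . basename($jobId) . '.jpg';
    return [
        'ready' => file_exists($path),
        'file'  => $path,
    ];
}

// Usage in the endpoint script:
//   header('Content-Type: application/json');
//   echo json_encode(downloadStatus($_GET['job'] ?? ''));
```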

I would also recommend you look at the following

Upvotes: 1
