Reputation: 343
I want the following cURL code to stay stable for up to 50 connections from different IPs, so it can handle up to 50 simultaneous requests without hanging or putting much load on the server.
I am on shared hosting, and I want this script to keep server load low even if it ever receives more than 50 or 100 requests at once; otherwise the admin may limit my hosting resources for putting high load on the shared server.
One more thing: each request fetches only about a 30 KB file from the remote server, so I expect each job to finish in under 3 seconds, since the file is very small.
Please tell me: does this script need any modification (such as curl_multi) to handle 50 to 100 small requests at once? Or is it fine as-is with no modification needed? Or do I just need to change the PHP ini settings for my shared hosting via cPanel?
// Sanitize the id so it cannot be used for path traversal
$userid = preg_replace('/[^a-zA-Z0-9_-]/', '', $_GET['id']);
$ttime  = 1; // cache lifetime in hours ($ttime was undefined in the original)

if (file_exists($userid.".txt") && (filemtime($userid.".txt") > (time() - 3600 * $ttime))) {
    $ffile = file_get_contents($userid.".txt");
} else {
    $dcurl = curl_init();
    $fp = fopen($userid.".txt", "w+"); // keep the handle separate from the content variable
    curl_setopt($dcurl, CURLOPT_URL, "http://remoteserver.com/data/$userid");
    curl_setopt($dcurl, CURLOPT_HTTP_VERSION, CURL_HTTP_VERSION_1_0);
    curl_setopt($dcurl, CURLOPT_TIMEOUT, 50);
    // CURLOPT_RETURNTRANSFER conflicts with CURLOPT_FILE; since the output
    // goes to the file, use CURLOPT_FILE alone
    curl_setopt($dcurl, CURLOPT_FILE, $fp);
    curl_exec($dcurl);
    if (curl_errno($dcurl)) // check for execution errors
    {
        echo 'Script error: ' . curl_error($dcurl);
        fclose($fp);
        curl_close($dcurl);
        exit;
    }
    curl_close($dcurl);
    fclose($fp); // flush the download to disk before reading it back
    $ffile = file_get_contents($userid.".txt");
}
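One thing to watch under 50 concurrent requests: several requests for the same uncached id can all miss the cache and write the same file at once. A minimal sketch of serializing the cache refresh with `flock()` (the URL and `$ttime` follow the question's example; the `.lock` file name is my own convention):

```php
$path = $userid . ".txt";
$lock = fopen($path . ".lock", "c");   // "c": create if missing, never truncate
if (flock($lock, LOCK_EX)) {           // block until this process owns the lock
    // Re-check freshness after acquiring the lock: another request may
    // have refreshed the cache while we were waiting.
    clearstatcache(true, $path);
    if (!file_exists($path) || filemtime($path) <= time() - 3600 * $ttime) {
        $fp = fopen($path, "w");
        $ch = curl_init("http://remoteserver.com/data/$userid");
        curl_setopt($ch, CURLOPT_FILE, $fp);
        curl_setopt($ch, CURLOPT_TIMEOUT, 50);
        curl_exec($ch);
        curl_close($ch);
        fclose($fp);
    }
    flock($lock, LOCK_UN);
}
fclose($lock);
$ffile = file_get_contents($path);
```

With this pattern, only one process per id hits the remote server when the cache expires; the other 49 wait briefly on the lock and then read the freshly written file.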
Upvotes: 1
Views: 3229
Reputation: 5679
You can use curl_multi to run several transfers in parallel:
http://php.net/manual/en/function.curl-multi-init.php - description and example
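A minimal sketch of the curl_multi pattern from that manual page, adapted to the question's URL scheme (the ids and file names are placeholders):

```php
// Fetch several ids in parallel in one PHP process instead of
// running one blocking curl_exec() per id.
$ids = array(101, 102, 103);           // placeholder ids
$mh  = curl_multi_init();
$handles = array();

foreach ($ids as $id) {
    $ch = curl_init("http://remoteserver.com/data/$id");
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 50);
    curl_multi_add_handle($mh, $ch);
    $handles[$id] = $ch;
}

// Drive all transfers until every handle has finished
do {
    curl_multi_exec($mh, $running);
    if ($running && curl_multi_select($mh) == -1) {
        usleep(1000);                  // avoid busy-looping where select() fails
    }
} while ($running > 0);

// Collect the responses and clean up
foreach ($handles as $id => $ch) {
    file_put_contents($id . ".txt", curl_multi_getcontent($ch));
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
```

Note that curl_multi helps when one script must fetch many remote files; it does not change how many separate visitor requests your server can handle at once — that is governed by the web server and PHP process limits.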
Upvotes: 3