Reputation: 2962
I'm trying to build a web-based download manager using PHP and the cURL extension. I'm stuck on one problem, though: how can I download and save a file with cURL without making the user wait? In other words, the user makes the request and it gets processed in the background.
Now, I can't use system calls (exec, system, etc.), since most of the hosts I work with disable these functions. The other problem is the max execution time for PHP scripts, but I guess that can be changed in .htaccess or with ini_set(), or can it not?
I've read somewhere that setting connect_timeout to 1 will work, but doesn't that terminate the connection?
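For reference, a minimal sketch of the execution-time overrides mentioned above (assuming the host actually allows them; on locked-down shared hosting both calls may be ignored):

```php
<?php
// Lift the per-script time limit entirely (0 = no limit).
set_time_limit(0);

// Or set a specific limit at runtime; max_execution_time is changeable
// via ini_set() because it is a PHP_INI_ALL directive.
ini_set('max_execution_time', '300');

echo ini_get('max_execution_time');
```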
One solution is to use cron jobs: after the user submits the file he wants to download, a cron job checks the database, and if there is a file in the queue it starts downloading it. But I'd like to avoid cron jobs if possible.
So, back to the main question: is there a way to tell a PHP script to run a certain function in the background and deliver the response to the user regardless of the result of that function?
Thanks
Upvotes: 8
Views: 24874
Reputation: 61
function background_curl_request($url, $method, $post_parameters) {
    if (is_array($post_parameters)) {
        // Build a URL-encoded key=value&key=value string from the array
        $params = http_build_query($post_parameters);
    } else {
        $params = $post_parameters;
    }
    // escapeshellarg() prevents quotes or shell metacharacters in the
    // values from breaking (or injecting into) the command
    $command = "/usr/bin/curl -X " . escapeshellarg($method)
             . " -d " . escapeshellarg($params)
             . " --url " . escapeshellarg($url)
             . " >> /dev/shm/request.log 2> /dev/null &";
    exec($command);
}
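One caveat worth spelling out with this kind of exec-based approach (a hedged sketch, not part of the original answer): any caller-supplied value interpolated into a shell command should go through escapeshellarg(), otherwise a single quote or & in a parameter breaks out of the command string:

```php
<?php
// Hypothetical values for illustration
$url    = "http://example.com/job.php";
$method = "POST";
$params = "comment=" . urlencode("it's done & ready");

// escapeshellarg() wraps each value in single quotes and escapes any
// embedded quotes, so the shell sees exactly one argument per value.
$command = "/usr/bin/curl -X " . escapeshellarg($method)
         . " -d " . escapeshellarg($params)
         . " --url " . escapeshellarg($url)
         . " > /dev/null 2>&1 &";

echo $command;
```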
Upvotes: 5
Reputation: 101
In my experience, the key is to wrap your command correctly.
$time = microtime(true);
$command = '/usr/bin/curl -H \'Content-Type: application/json\' -d \'' . $curlPost . '\' --url \'' . $wholeUrl . '\' >> /dev/shm/request.log 2> /dev/null &';
exec($command);
echo (microtime(true) - $time) * 1000 . ' ms';
The above works well for me and takes only about 3 ms, but the following won't work and takes about 1500 ms:
$time = microtime(true);
$command = '/usr/bin/curl -H \'Content-Type: application/json\' -d \'' . $curlPost . '\' --url ' . $wholeUrl;
exec($command . ' >> /dev/shm/request.log 2> /dev/null &');
echo (microtime(true) - $time) * 1000 . ' ms';
In short, appending `&> /dev/null &` to the end of your command makes it run in the background without hanging the PHP and Apache processes; just remember to WRAP your command properly (note that in the failing version the URL is not quoted, so the shell can mis-parse the command).
Upvotes: 3
Reputation: 101
Make a simple cURL request from PHP as below, or an AJAX call using JavaScript:
$curl = curl_init('http://mutant-tech.com/execjob.php');
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
$result = curl_exec($curl);
echo $result; // should get - Closing Curl Request
Close the HTTP connection using headers at the start, then put the PHP code to execute after that:
// File: execjob.php
ob_end_clean();
header("Connection: close");
ignore_user_abort(true); // keep running even after the client disconnects
ob_start();
echo 'Closing Curl Request';
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush();
flush();
// BELOW: YOUR BACKGROUND JOB/CODE
If you need more execution time or memory, adjust them accordingly with set_time_limit() and ini_set('memory_limit', ...).
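If the script happens to run under PHP-FPM, the same "respond now, keep working" effect is available through fastcgi_finish_request(), which flushes the response and closes the connection while the script continues. A sketch (the fallback branch is an assumption for non-FPM SAPIs, where only the header trick above applies):

```php
<?php
echo 'Closing Curl Request';

if (function_exists('fastcgi_finish_request')) {
    // PHP-FPM: flush the response to the client and close the
    // connection; execution continues below.
    fastcgi_finish_request();
} else {
    // Other SAPIs: flush what we have; rely on the Connection: close
    // and Content-Length headers sent earlier.
    flush();
}

// BELOW: your background job, e.g. the actual file download
```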
Upvotes: 4
Reputation: 2088
I think there have been other posts exploring the options already. Given your constraints, I see two other approaches:
Upvotes: 5