Reputation: 743
I'm having a little trouble running a PHP script in the background.
I have an admin area that requires HTTP authentication to access, and a mail-out script that will take a long time to execute, so I want to run it in the background.
My idea is that when I access the "send" page, it executes the send script in the background and redirects the user elsewhere.
However, when I currently attempt to use cURL to call the send script, it fails with "Authorization required".
Any advice appreciated, thanks.
Upvotes: 3
Views: 1961
Reputation: 2547
Version 1:
system('php your_script &');
- The & sends the execution to the background (see the caveat below).
Version 2:
Do not require authentication for your script and you will be able to use cURL.
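One caveat worth noting for Version 1: system() will wait for the command to finish unless its output is redirected, so in practice the call usually needs to look something like this (a sketch only; the script path is a placeholder):
// Redirect output so PHP does not block while the mail script runs in the background.
system('php /path/to/your_script.php > /dev/null 2>&1 &');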
Upvotes: 0
Reputation: 43794
cURL has options for HTTP authentication...
But to save an HTTP request... here's the function I use for executing local PHP asynchronously...
/**
 * Asynchronously execute/include a PHP file. Does not record the output of the file anywhere.
 * Relies on the PHP_PATH config constant.
 *
 * @param string $filename file to execute
 * @param string $options  (optional) arguments to pass to file via the command line
 */
function asyncInclude($filename, $options = '') {
    exec(PHP_PATH . " -f {$filename} {$options} >> /dev/null &");
}
(where PHP_PATH is a const defined like define('PHP_PATH', '/opt/bin/php5') or similar)
It passes in arguments via the command line. To read them in PHP, see argv.
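For illustration, a usage sketch; the script path and the campaign argument here are made up:
// Caller: start the mail-out without waiting for it to finish.
asyncInclude('/var/www/scripts/mailout.php', escapeshellarg('campaign-42'));
// Inside mailout.php: read the argument back from the command line.
$campaignId = isset($argv[1]) ? $argv[1] : null;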
Upvotes: 1
Reputation: 11628
You might try pcntl_fork() if you're running on Linux. Note that this approach requires some magic on the PHP installation side (pcntl is disabled by default for CGI) and even more magic to make your script survive Apache process cleanup. See this comment on PHP to get you started.
So forking off a process would look like this:
if ($pid = pcntl_fork())
    die(); // Parent

function shutdown() {
    posix_kill(posix_getpid(), SIGHUP);
}

ob_end_clean();  // Discard the output buffer and close
fclose(STDIN);   // Close all of the standard
fclose(STDOUT);  // file descriptors as we
fclose(STDERR);  // are running as a daemon.

register_shutdown_function('shutdown');

if (posix_setsid() < 0)
    die(); // <- This is an error

if ($pid = pcntl_fork())
    die(); // Parent

// Do your stuff here
Upvotes: 1
Reputation: 12870
Some --other-- things to consider... we do a lot of mailing as you've described, at least 10k/week, with bursts of 20k/day. Ours runs on cron (regularly fired via the system, which is slick for our application). If you don't put some sort of throttle on your script, it'll eventually time out and your mail job will be stuck somewhere midway with no way to restart. Ours times out and restarts after a certain number of sends, keeping track by writing progress to the database. Because of the frequency of the mailer and the unknown length of each run, we also use a lock file that prevents concurrent mailers from double sending (see the sketch below).
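A rough sketch of the lock-file-plus-progress idea described above; the lock path, batch size, and the get_progress()/load_recipients()/send_mail()/save_progress() helpers are all hypothetical:
<?php
// Prevent overlapping cron-fired runs with an exclusive, non-blocking lock.
$lock = fopen('/tmp/mailer.lock', 'c');
if (!$lock || !flock($lock, LOCK_EX | LOCK_NB)) {
    exit; // another mailer is still running
}

// Resume from the last recorded position and send a capped batch per run.
$offset = get_progress();
foreach (load_recipients($offset, 500) as $i => $recipient) {
    send_mail($recipient);
    save_progress($offset + $i + 1); // a timed-out run can resume from here
}

flock($lock, LOCK_UN);
fclose($lock);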
If your emails are going to approach 15K/batch, I would recommend checking out a hosted SMTP solution: http://smtp.com or http://www.socketlabs.com/. Queue them via your server and send via SMTP relay to ensure reliable delivery.
No matter how many emails you send out, don't forget CAN-SPAM compliance!
Upvotes: 0
Reputation: 18588
Have you tried sending the auth to cURL?
Can be as simple as http://user:password@example.com
or else something like:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_USERPWD, "$username:$password");
curl_setopt($ch, CURLOPT_HTTPAUTH, CURLAUTH_BASIC);
$output = curl_exec($ch);
$info = curl_getinfo($ch);
curl_close($ch);
EDIT
FWIW, it sounds like cURL is probably the wrong tool here, but it shouldn't be discounted just because of the auth issue.
Upvotes: 0
Reputation: 70460
Upvotes: 4
Reputation: 1480
What I understand from your question is that you have a PHP script that runs another PHP script. You do not need to use cURL for that; instead, just run the second PHP script asynchronously, for example:
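A minimal sketch of that flow, assuming a hypothetical mailout.php and confirmation page:
// Launch the mail-out detached from this request; the path is an assumption.
exec('php /var/www/scripts/mailout.php > /dev/null 2>&1 &');
// Redirect the admin elsewhere while the mail-out keeps running.
header('Location: /admin/mail-queued.php');
exit;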
Upvotes: 0