Reputation: 481
I have two files, client.php and server.php. The client file sends an HTTP request to the server file. The server file can be very slow to process the request, so I just want it to answer the client with "OK, the request is correct; the result will be sent by email".
But I don't know how to make the server close the HTTP request with suitable headers and then continue its job. If I specify a timeout of 1 second, I won't be able to know whether the request was accepted by the server.
So is it possible in PHP? Do you know how?
client.php:
<?php
$resource = curl_init();
curl_setopt($resource, CURLOPT_URL, 'http://localhost/server.php');
curl_setopt($resource, CURLOPT_RETURNTRANSFER, true);
curl_setopt($resource, CURLOPT_TIMEOUT, 30);
curl_exec($resource);
server.php
<?php
header('HTTP/1.1 200 OK');
echo 'OK the request is correct, the result will be sent by email';
// How to write the method below?
send_result_to_client();
// Simulates a slow process
sleep(60);
OK, I found the solution. The send_result_to_client function should look like this:
function send_result_to_client()
{
    // $myString must be exactly the body that was echoed,
    // so that Content-Length matches what the client receives.
    $myString = '...';
    $size = strlen($myString);
    header("Content-Length: $size");
    header('Connection: close');
    flush();
}
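For reference, a fuller, self-contained sketch of the same idea (the body text and the long job are placeholders, and whether it actually works depends on the SAPI and web server, as the next answer explains):

```php
<?php
// Sketch of the "answer early, keep working" pattern.
ignore_user_abort(true);   // keep running if the client disconnects
set_time_limit(0);         // allow the long job to finish

ob_start();
echo 'OK the request is correct, the result will be sent by email';
header('Content-Length: ' . ob_get_length());
header('Connection: close');
ob_end_flush();            // send the buffered body
flush();                   // push it out to the client

// ...the long job would continue here, e.g. sleep(60); mail(...);
```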
Upvotes: 1
Views: 781
Reputation: 57408
I'm sorry, but sending Connection: close and flush() is not the solution - not portably across servers or browsers. It may seem to be a solution because it appears to work when you run a simple test. If you try it with
send_result_to_client();
sleep(60); // simulate a job...
mail(...); // ...ENDING with a mail message
and then do nothing, or worse, do what any user will do - close the window as soon as he sees the "You'll be notified by email" - you'll discover that on most platforms, the email never gets sent, and the job never gets done.
curl may terminate once Connection: close and Content-Length bytes have been received, but from RFC 2616:
HTTP/1.1 defines the "close" connection option for **the sender** to signal that the connection will be closed after completion of the response.
But the server is not closing the connection, not really. That would require a die() or exit() before the job has been processed.
So your script continues executing only until the browser actually closes the connection - at that point (usually) the processing aborts. Maybe on some platforms it doesn't, but don't count on it.
You may try to ameliorate the situation with another hack:
set_time_limit(VERY_LONG_TIME); // or set_time_limit(0) for no limit
ignore_user_abort(true);        // without "true" the setting is only read, not changed
but it is, indeed, a hack.
There are no truly portable solutions. Usually, as suggested by Manatok, the job is queued somewhere else, and the queue processed asynchronously by another thread. This is possibly the best solution.
Other possibilities involve a different and maybe simpler way to do the same thing, using atd and the at command (or an equivalent under Windows) to queue the job. You can create the "job control" with shell_exec by piping a new PHP script to be executed by atd:
shell_exec("echo 'php -q /path/to/script.php \"param1\" \"param2\"' | at now");
(or, more efficiently, running at now with popen and writing the command to execute to its stdin).
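The popen variant can be sketched like this (queue_job is an invented helper; atd must be installed and running for the default 'at now' queue to work, and the script path is a placeholder):

```php
<?php
// Sketch: queue a command with at(1) by writing to its stdin via popen,
// which avoids the fragile nested shell quoting of the echo|at one-liner.
function queue_job(string $command, string $queue = 'at now'): bool
{
    $pipe = popen($queue, 'w');
    if ($pipe === false) {
        return false;
    }
    fwrite($pipe, $command . "\n");
    return pclose($pipe) === 0;  // true if the queueing command succeeded
}

// Usage (placeholder path and parameters):
// queue_job('php -q /path/to/script.php "param1" "param2"');
```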
See for example:
https://github.com/treffynnon/PHP-at-Job-Queue-Wrapper
You can also try to spawn a concurrent job through popen or shell_exec, but you need to detach it from the server process, or you'll find the system hogged by copies of the job executable (or CMD.EXE if you do this under Windows).
See:
Asynchronous shell exec in PHP
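A minimal sketch of such a detached spawn on a Unix-like system (spawn_detached is an invented helper; the redirects and trailing & are what keep shell_exec from blocking on the child):

```php
<?php
// Sketch: launch a background job detached from the web server process.
// Without the output redirects, shell_exec would wait for the job to exit.
function spawn_detached(string $command): void
{
    shell_exec($command . ' > /dev/null 2>&1 &');
}

// Usage (placeholder path):
// spawn_detached('php -q /path/to/job.php');
```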
Upvotes: 0
Reputation: 5706
I know this isn't exactly what you're asking (I'm not sure that is possible), but I would have a queue that is processed by another process. The request comes into the server, which adds the details to the queue (it can be in the DB) and responds to the client. You can then have a cron job which runs and processes the queue later.
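This queue-plus-cron idea can be sketched with a small database-backed queue (SQLite here purely for illustration; the table layout and helper names are invented):

```php
<?php
// Sketch of a DB-backed job queue: server.php only enqueues and responds,
// while a cron-driven worker script calls work_one() to do the slow part.
function open_queue(string $dsn = 'sqlite::memory:'): PDO
{
    $db = new PDO($dsn, null, null, [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]);
    $db->exec('CREATE TABLE IF NOT EXISTS jobs (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        payload TEXT NOT NULL,
        done INTEGER NOT NULL DEFAULT 0
    )');
    return $db;
}

// server.php side: record the request, then answer the client immediately.
function enqueue(PDO $db, string $payload): void
{
    $db->prepare('INSERT INTO jobs (payload) VALUES (?)')->execute([$payload]);
}

// cron side: take the oldest pending job; the slow work and mail() go here.
function work_one(PDO $db): ?string
{
    $row = $db->query('SELECT id, payload FROM jobs WHERE done = 0 ORDER BY id LIMIT 1')
              ->fetch(PDO::FETCH_ASSOC);
    if ($row === false) {
        return null;  // queue is empty
    }
    $db->prepare('UPDATE jobs SET done = 1 WHERE id = ?')->execute([$row['id']]);
    return $row['payload'];
}
```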
Upvotes: 1