Reputation: 75
I have a PHP script that fetches data from external sites using curl, then after three minutes reloads itself, fetches new data, and displays the updates. It works fine, but if there is a network failure (I presume it's curl not getting responses), PHP just hangs without returning errors or anything. These hanging processes then need to be killed manually.
How can I deal with this situation? Tweak curl options? Modify the PHP script so it watches for an unresponsive curl? Or handle everything from the browser through Ajax, including firing off a script that kills the hanging PHP processes?
Solution: I've added
curl_setopt($ch, CURLOPT_FAILONERROR, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 10);
to my curl handle and added a catch for these errors to my response-checking code. Conceptually, that's all that was needed; CURLOPT_CONNECTTIMEOUT doesn't seem necessary because I already have reloading set up in case of errors.
It works with a manual disconnect, but I haven't yet seen how the script handles real-life network failures. Should be okay.
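A minimal sketch of the approach described above, with the fetch wrapped in a helper function (`fetch_with_timeout` is an illustrative name, not from the original post) so the error can be fed back into the script's existing reload logic:

```php
<?php
// Fetch $url, failing fast instead of hanging on network trouble.
// Returns [$data, $error]: $data is the body or false, $error is null on success.
function fetch_with_timeout(string $url, int $timeout = 10): array
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return body instead of printing it
    curl_setopt($ch, CURLOPT_FAILONERROR, true);    // HTTP status >= 400 becomes a curl error
    curl_setopt($ch, CURLOPT_TIMEOUT, $timeout);    // hard cap on the whole transfer

    $data  = curl_exec($ch);
    $error = ($data === false) ? curl_error($ch) : null;
    curl_close($ch);

    return [$data, $error];
}

// Usage: on error, fall through to the existing "reload on error" handling.
[$data, $error] = fetch_with_timeout('https://example.com/feed'); // placeholder URL
if ($error !== null) {
    echo "fetch failed: $error\n"; // trigger the reload/retry here
} else {
    // ... parse and display $data as before ...
}
```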
Upvotes: 2
Views: 204
Reputation: 39365
To handle network issues, use the CURLOPT_CONNECTTIMEOUT
option with a value in seconds. curl will wait at most that many seconds to establish a connection to the target host.
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10);
And use the CURLOPT_TIMEOUT
option to set the maximum number of seconds the whole curl operation may take. This helps when the target server accepts the connection but never releases it.
curl_setopt($ch, CURLOPT_TIMEOUT, 30);
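Both options combined in one sketch (the URL is a placeholder): CURLOPT_CONNECTTIMEOUT bounds the connection handshake, CURLOPT_TIMEOUT bounds the entire transfer, and curl_errno() lets you detect the timeout afterwards.

```php
<?php
$ch = curl_init('https://example.com/data'); // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 10); // give up if no connection within 10 s
curl_setopt($ch, CURLOPT_TIMEOUT, 30);        // give up if the whole request exceeds 30 s

$result = curl_exec($ch);
if ($result === false && curl_errno($ch) === CURLE_OPERATION_TIMEDOUT) {
    // Network failure or unresponsive server: log and retry instead of hanging.
    error_log('curl timed out: ' . curl_error($ch));
}
curl_close($ch);
```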
Upvotes: 1