Reputation: 4909
I'm using some external services to detect my site visitors' countries, like this:
$ip = $_SERVER['REMOTE_ADDR'];
$url1 = 'http://api.hostip.info/get_json.php?ip='.$ip;
$url2 = 'http://ip2country.sourceforge.net/ip2c.php?format=JSON&ip='.$ip;
Sometimes sites like SourceForge take too long to respond.
So can anyone tell me how to limit the HTTP response time?
If url1
is down or does not respond within x seconds,
then move on to url2, url3, etc.
Upvotes: 4
Views: 1200
Reputation: 1638
There is another solution: just download the DB and run the lookup service yourself on a faster machine of your own.
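As a rough sketch of what a local lookup could look like, assuming a CSV dump with hypothetical "range_start,range_end,country_code" rows (the actual format of whichever DB you download may differ):

```php
<?php
// Hypothetical sketch: assumes a local CSV of "range_start,range_end,country_code"
// rows, where the range bounds are unsigned 32-bit integers.
function country_for_ip($ip, $csvPath) {
    $needle = sprintf('%u', ip2long($ip)); // IP as an unsigned integer
    $fh = fopen($csvPath, 'r');
    while (($row = fgetcsv($fh)) !== false) {
        list($start, $end, $country) = $row;
        if ($needle >= $start && $needle <= $end) {
            fclose($fh);
            return $country;
        }
    }
    fclose($fh);
    return null; // not found
}
```

For production you would load the ranges into a database or sorted array and binary-search them instead of scanning the file per request.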
Upvotes: 1
Reputation: 31823
$context = stream_context_create(array(
    'http' => array(
        'method'  => 'GET',
        'timeout' => 3
    )
));
Then supply the stream context to fopen(), file_get_contents(), etc.
http://php.net/manual/en/stream.contexts.php
http://php.net/manual/en/context.http.php
The manual calls that a "read timeout". I worry it may not include time for things like DNS resolution and socket connection; the timeout before PHP starts reading from the stream may be governed by the default_socket_timeout setting.
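Putting it together with the asker's fallback idea, a sketch could look like this (the URLs are the ones from the question; the 3-second timeout is illustrative):

```php
<?php
// Try each geolocation service in turn; give up on one after ~3 seconds of reading.
$ip   = $_SERVER['REMOTE_ADDR'];
$urls = array(
    'http://api.hostip.info/get_json.php?ip=' . $ip,
    'http://ip2country.sourceforge.net/ip2c.php?format=JSON&ip=' . $ip,
);

$context = stream_context_create(array(
    'http' => array(
        'method'  => 'GET',
        'timeout' => 3,   // read timeout in seconds
    )
));

$result = false;
foreach ($urls as $url) {
    // @ suppresses the warning on failure; false means "move to the next URL".
    $result = @file_get_contents($url, false, $context);
    if ($result !== false) {
        break;
    }
}
```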
You may want to consider curl; it seems a bit more specific, but I'm not sure if CURLOPT_TIMEOUT is inclusive of CURLOPT_CONNECTTIMEOUT.
$ch = curl_init();
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 2);
curl_setopt($ch, CURLOPT_TIMEOUT, 2);
http://php.net/manual/en/function.curl-setopt.php
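A fuller sketch of the curl route, with the result captured and the failure case handled (the URL is the asker's first service; the 2-second timeouts are illustrative):

```php
<?php
// Minimal curl sketch with both timeouts set.
$ip = $_SERVER['REMOTE_ADDR'];
$ch = curl_init('http://api.hostip.info/get_json.php?ip=' . $ip);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 2);    // time allowed to establish the connection
curl_setopt($ch, CURLOPT_TIMEOUT, 2);           // time allowed for the whole transfer
$body = curl_exec($ch);
if ($body === false) {
    // Timed out or otherwise failed; this is where you would fall back to url2.
    echo 'Request failed: ' . curl_error($ch);
}
curl_close($ch);
```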
Upvotes: 5
Reputation: 16304
If this is done using streams, you could use stream_set_timeout
for this. Here is a decent example from the PHP manual, which also describes more advanced ways of achieving this:
$fp = fsockopen("www.example.com", 80);
if (!$fp) {
    echo "Unable to open\n";
} else {
    fwrite($fp, "GET / HTTP/1.0\r\n\r\n");
    stream_set_timeout($fp, 2);
    $res = fread($fp, 2000);
    $info = stream_get_meta_data($fp);
    fclose($fp);
    if ($info['timed_out']) {
        echo 'Connection timed out!';
    } else {
        echo $res;
    }
}
Upvotes: 2