tarnfeld

Reputation: 26556

PHP Concurrent HTTP requests?

I was wondering what the best way is to do concurrent HTTP requests in PHP. I have a lot of data to fetch, and I'd rather do multiple requests at once to retrieve it all.

Does anybody know how I can do this? Preferably in an anonymous/callback function manner...

Thanks,

Tom.

Upvotes: 5

Views: 12207

Answers (4)

Ishan Srivastava

Reputation: 1189

Or, if you want, you can send your data as JSON. In PHP you can then break it back down into the individual values. For example:

xhttp.open("GET", "gotoChatRoomorNot.php?q=[{"+str+"},{"+user1+"},{"+user2+"}]", true);

and in PHP you can follow this to get your data back: How do I extract data from JSON with PHP?

So build a string in JSON format and send the entire thing over HTTP. You can probably do the same kind of thing with XML, but I am not familiar with XML.
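A minimal sketch of the PHP side (assuming the client actually sends valid JSON in the q parameter; gotoChatRoomorNot.php is just the hypothetical endpoint from the example above) could look like this:

// Hypothetical gotoChatRoomorNot.php — assumes the client sent valid JSON
// in the "q" query parameter, e.g. ?q=["foo","bar","baz"]
$data = json_decode($_GET['q'], true);

if ($data === null) {
  // The string was not valid JSON.
  http_response_code(400);
  exit('Invalid JSON');
}

// $data is now a plain PHP array containing the client's values.
var_dump($data);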

Upvotes: 1

Tobias Gassmann

Reputation: 11819

You can use HttpRequestPool from the pecl_http extension: http://www.php.net/manual/de/httprequestpool.construct.php

$multiRequests = array(
  new HttpRequest('http://www.google.com', HttpRequest::METH_GET),
  new HttpRequest('http://www.yahoo.com', HttpRequest::METH_GET),
  new HttpRequest('http://www.bing.com', HttpRequest::METH_GET)
);

// Attach all requests to the pool.
$pool = new HttpRequestPool();
foreach ($multiRequests as $request)
{
  $pool->attach($request);
}

// Send all requests concurrently.
$pool->send();

// Read each response body once everything has returned.
foreach ($pool as $request)
{
  echo $request->getResponseBody();
}

Upvotes: 1

Alexei Tenitski

Reputation: 9360

You can try either curl_multi() or use the lower-level function socket_select().

Upvotes: 2

Marc B

Reputation: 360672

You can use curl_multi, which internally fires off multiple separate requests under a single curl handle.

But otherwise PHP itself is not in any way, shape, or form "multithreaded" and will not allow things to run in parallel, except via gross hacks (multiple parallel scripts, one script firing up multiple background tasks via exec(), etc...).
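A minimal curl_multi sketch along those lines (the URLs are just placeholders) might look something like this:

// A few example URLs to fetch concurrently.
$urls = array(
  'http://www.google.com',
  'http://www.yahoo.com',
  'http://www.bing.com',
);

$mh      = curl_multi_init();
$handles = array();

// One easy handle per URL, all added to the same multi handle.
foreach ($urls as $url) {
  $ch = curl_init($url);
  curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
  curl_multi_add_handle($mh, $ch);
  $handles[] = $ch;
}

// Drive all transfers until every request has completed.
$running = null;
do {
  curl_multi_exec($mh, $running);
  curl_multi_select($mh);
} while ($running > 0);

// Grab each response body and clean up.
foreach ($handles as $ch) {
  echo curl_multi_getcontent($ch);
  curl_multi_remove_handle($mh, $ch);
  curl_close($ch);
}
curl_multi_close($mh);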

Upvotes: 12
