Reputation: 3067
I'm querying an API by using cURL (GET), and the API gives me only 25 results per call (it's a hard limit and there's nothing I can do about it since it's not mine).
The results look like:
<response v="2">
<query>my nifty query</query>
<location>new york, ny</location>
<totalresults>920</totalresults>
<start>1</start>
<end>25</end>
<pageNumber>0</pageNumber>
<results>
<result></result>
<result></result>
...
</results>
</response>
The total results count (shown in the returned XML above) can be up to 1,000, and in most cases I do get 1,000 results per query, so I have to call the API 40 times (25 results per call × 40 calls = 1,000 results). On every call, I increase the API's start parameter by 25 (i.e. 0, 25, 50, ..., 975).
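A minimal sketch of that synchronous loop; the endpoint URL and parameter names here are hypothetical placeholders:

```php
<?php
// Hypothetical endpoint; the real API's URL and parameter names will differ.
$base = 'https://api.example.com/search?q=my+nifty+query&limit=25';
$responses = [];

for ($start = 0; $start < 1000; $start += 25) {
    $ch = curl_init($base . '&start=' . $start);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $responses[$start] = curl_exec($ch); // each call blocks until it finishes
    curl_close($ch);
}
```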
This whole process takes about 8 seconds to complete, as I currently call the API synchronously (I'm using curl_exec()
to execute the GET requests one by one in a for loop). Is there an efficient, faster way to call the API in parallel and get these results sooner? Thanks.
Upvotes: 1
Views: 2495
Reputation: 88647
You can execute multiple asynchronous cURL calls with curl_multi_exec()
. This will allow you to execute multiple requests simultaneously.
Just be aware that when querying the same server multiple times, there is an upper limit past which adding concurrent requests stops improving efficiency. I'm sure I remember reading that, after a lot of research, Facebook concluded this limit was between 3 and 4 concurrent requests, but I cannot find the reference, so I may have imagined it. It will depend on the server and the client you are using, so really I would say you will just have to suck it and see.
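A rough sketch of fetching all 40 pages with the curl_multi interface; the endpoint URL and parameter names are hypothetical placeholders:

```php
<?php
// Hypothetical endpoint; substitute the real API URL and parameters.
$base = 'https://api.example.com/search?q=my+nifty+query&limit=25';

$mh = curl_multi_init();
$handles = [];

// Register one easy handle per page.
for ($start = 0; $start < 1000; $start += 25) {
    $ch = curl_init($base . '&start=' . $start);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[$start] = $ch;
}

// Drive all transfers until every one has completed.
do {
    $status = curl_multi_exec($mh, $running);
    if ($running) {
        curl_multi_select($mh); // wait for socket activity instead of busy-looping
    }
} while ($running && $status == CURLM_OK);

// Collect the responses, keyed by their start offset.
$responses = [];
foreach ($handles as $start => $ch) {
    $responses[$start] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);
```

This adds all 40 handles at once; to cap concurrency at a handful of requests, you could instead add handles in small batches and only register a new one as a previous transfer completes.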
Upvotes: 5