beginner_

Reputation: 7622

PHP cURL: improve performance when doing lots of REST calls

I have a function that calls an external REST service a lot; how often depends on the input data, i.e. what the user requests. The idea is to batch-convert a special kind of data and identifiers. This is very slow, even for as few as 50 conversions.

The REST web service is called with php_curl. I suspected that creating a new connection for each single conversion was the cause. I tried reusing the same handle and just adjusting the URL for each call, and I also tried this:

http://technosophos.com/content/connection-sharing-curl-php-how-re-use-http-connections-knock-70-rest-network-time

I got zero performance increase in both cases. I'm on Windows; maybe that's the issue? With the TCPView tool I can easily see that new connections are created en masse. Maybe I'm interpreting it wrong, but to me it looks like it is still creating one connection per conversion.

So I'm kind of lost. Does anyone know whether connections are actually reused in the Windows implementation? Could it be caused by the remote server?

EDIT:

Current setup is trivial:

if (empty($this->curlHandle)) {
    // Create the handle once and reuse it for every conversion.
    $this->curlHandle = curl_init();
    curl_setopt($this->curlHandle, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($this->curlHandle, CURLOPT_PROXY, $this->proxy);
    curl_setopt($this->curlHandle, CURLOPT_PROXYPORT, $this->proxyPort);
    // Note: this option takes a CURL_HTTP_VERSION_* constant, not a float;
    // passing 1.1 is cast to 1 (CURL_HTTP_VERSION_1_0) and forces HTTP/1.0.
    curl_setopt($this->curlHandle, CURLOPT_HTTP_VERSION, CURL_HTTP_VERSION_1_1);
}
// Only the URL changes between calls.
curl_setopt($this->curlHandle, CURLOPT_URL, $baseUrl . $identifier . '/' . $format);

$result = curl_exec($this->curlHandle);

And here the header returned by an example call:

HTTP/1.1 200 OK
Date: Fri, 28 Sep 2012 05:23:37 GMT
Server: Apache/2.2.15 (Scientific Linux)
Last-Modified: Fri, 28 Sep 2012 05:16:37 GMT
ETag: "0924166dd08dd5845929794dbd07d288"
Expires: Mon, 08 Oct 2012 05:16:37 GMT
Cache-Control: max-age=864000
Access-Control-Allow-Origin: *
Connection: close
Content-Type: text/plain; charset=UTF-8

EDIT 2:

There is a different remote web service that offers similar functionality (but has less data, e.g. look-up data for an ID). When run in Firefox, that service sends a keep-alive header back, so I implemented it. But the issue remains: just as slow. I then used the curl command line to play around and noticed that when using curl, that server returns Connection: close.
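
For reference, I send the keep-alive request header roughly like this (a sketch; the timeout value of 300 is an arbitrary choice on my side):

// Ask the server to keep the connection open. HTTP/1.1 connections are
// persistent by default, so this is only a hint; the server may still
// reply with "Connection: close".
curl_setopt($this->curlHandle, CURLOPT_HTTPHEADER, array(
    'Connection: Keep-Alive',
    'Keep-Alive: 300'
));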

Upvotes: 0

Views: 5197

Answers (2)

Technosophos

Reputation: 561

Re-using the same cURL handle really doesn't work all that well, for a variety of reasons: curl_exec wasn't really intended for that usage.

curl_multi_exec does handle that situation very well, but it typically requires that the remote host also keep the connection alive.

HOWEVER, before any of these strategies can work, the client and the server have to agree to keep a connection open, and that is what is not happening here.

According to the headers you posted above, "Connection: close" comes back from the server. This means that keep-alives are not enabled. That may be because the REST server (or your proxy server, if there is one) does not allow them, which might make some sense. It may also be because your local cURL call is for some reason not sending "Connection: Keep-Alive" to the remote server.

Try the following:

  • Check cURL's outbound headers to see if it asks for a keep-alive.
  • Try enabling CURLOPT_VERBOSE, which will spew out low-level connection details (see the sketch below).
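
A minimal sketch of both checks ($ch stands for your existing handle):

// Record the request headers cURL actually sends out.
curl_setopt($ch, CURLINFO_HEADER_OUT, true);
// Print low-level connection details (connect/reuse messages) to STDERR.
curl_setopt($ch, CURLOPT_VERBOSE, true);

$result = curl_exec($ch);

// If this does not contain "Connection: Keep-Alive", the request side
// is the problem; otherwise look at the server's reply.
echo curl_getinfo($ch, CURLINFO_HEADER_OUT);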

Those should at least get you on the right track. At the end of the day, though, if the remote server does not support multiple requests over the same connection, you won't be able to do anything on your side to change that.

Upvotes: 1

maalls

Reputation: 759

Since you are batching, you could execute several calls in parallel with curl_multi_exec:

http://php.net/manual/en/function.curl-multi-exec.php
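
A minimal sketch of that pattern, assuming the $baseUrl . $identifier . '/' . $format URL scheme from the question and an $identifiers array:

$mh = curl_multi_init();
$handles = array();

// One easy handle per identifier, all registered with the multi handle.
foreach ($identifiers as $id) {
    $ch = curl_init($baseUrl . $id . '/' . $format);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[$id] = $ch;
}

// Run all transfers in parallel until none are still active.
$running = null;
do {
    curl_multi_exec($mh, $running);
    if (curl_multi_select($mh) === -1) {
        usleep(100); // select failed; back off briefly to avoid busy-waiting
    }
} while ($running > 0);

// Collect the results and clean up.
$results = array();
foreach ($handles as $id => $ch) {
    $results[$id] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);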

Upvotes: 0
