Radical_Activity

Reputation: 2738

Guzzle Pool: how to wait for all requests to finish without timing out?

I have been using Guzzle Pool for the past 6 months to request data (around 1-2 MB per request) from an external server, making 5 simultaneous requests at a time. However, things have changed: the external server seems to be overloaded and has become very slow. Sometimes it responds in 1-2 seconds, but often a single request takes 2+ minutes.

That shouldn't be an issue by itself. However, since the requests became slow, some of the requests in the pool now fail with this error:

cURL error 18: transfer closed with outstanding read data remaining

It usually does that after waiting for exactly 2 minutes.

The interesting thing, though, is that if I make the same request through Postman (for example), I still have to wait 2-3+ minutes, but I do get the response in the end.

This led me to believe that Guzzle is cutting the requests off after 2 minutes. However, I couldn't find any setting to change this. I have even tried sending a Keep-Alive and a Content-Length header, but they did not help (maybe I didn't use them correctly, though).

Here is the part of my current code that does the Guzzle Pool requests (I'm using PHP 7.1, Guzzle 6.3 and Laravel 5.7):

use GuzzleHttp\Client;
use GuzzleHttp\Pool;
use GuzzleHttp\Psr7\Request;

$headers = ['Authorization' => 'Bearer ' . $token];

$client  = new Client();

$requests = function ($urls, $headers)
{
    foreach ($urls as $key => $url)
    {
        yield new Request('GET', $url, $headers);
    }
};

$pool = new Pool($client, $requests($urls, $headers),
[
    'concurrency' => 5,
    'fulfilled'   => function ($response, $index)
    {
        echo 'fulfilled -> ' . $index;
    },
    'rejected' => function ($reason, $index)
    {
        echo 'rejected -> ' . $index . ' -> error:' . $reason->getMessage();
    },
]);

$promise = $pool->promise();
$promise->wait();
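
For completeness, per-request options can also be passed to every pooled request through the Pool's options key. The values below are just an illustration (as far as I know Guzzle's default timeout is already unlimited), so I'm not sure any of them controls the 2-minute cut-off:

$pool = new Pool($client, $requests($urls, $headers),
[
    'concurrency' => 5,
    // These options are merged into every request sent by the pool.
    'options'     => [
        'timeout'         => 0,  // no overall time limit per request
        'connect_timeout' => 10, // only limits the connection phase
    ],
    'fulfilled' => function ($response, $index)
    {
        echo 'fulfilled -> ' . $index;
    },
    'rejected' => function ($reason, $index)
    {
        echo 'rejected -> ' . $index . ' -> error: ' . $reason->getMessage();
    },
]);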

Unfortunately, I cannot share the external server URL, because it's private.

What am I doing wrong here that prevents the requests from waiting until the server finishes sending the data?


Update: I have tried @Alexey Shokov's recommendation, and it got rid of the rejected states; the requests no longer fail at the 2-minute mark. However, the responses from the sources that previously timed out now come back as null.
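
For debugging, I assume the $response passed to the fulfilled callback is a standard PSR-7 response, so checking its status code and body length should show whether any data actually arrives:

    'fulfilled' => function ($response, $index)
    {
        // Inspect what actually came back from the slow sources.
        echo 'fulfilled -> ' . $index
            . ' -> status: ' . $response->getStatusCode()
            . ' -> bytes: '  . strlen((string) $response->getBody());
    },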

Upvotes: 0

Views: 1709

Answers (1)

Alexey Shokov

Reputation: 5010

It looks like there is an issue on the server side. The error indicates that the cURL client sees a mismatch between the expected response size and the actual response size sent by the server.

Take a look at this SO topic with the same issue. I think it's worth trying to set the HTTP version to 1.0, as suggested in that discussion, either with Guzzle's version request option or directly in the Request object (yield new Request('GET', $url, $headers, null, '1.0')).
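
A rough sketch of both approaches, assuming the pool is built as in the question (I haven't been able to test this against your server):

use GuzzleHttp\Pool;
use GuzzleHttp\Psr7\Request;

// Option 1: force HTTP/1.0 for every pooled request via the 'version' option.
$pool = new Pool($client, $requests($urls, $headers),
[
    'concurrency' => 5,
    'options'     => ['version' => 1.0],
    // fulfilled / rejected callbacks as before...
]);

// Option 2: set the protocol version on each PSR-7 request instead.
$requests = function ($urls, $headers)
{
    foreach ($urls as $url)
    {
        yield new Request('GET', $url, $headers, null, '1.0');
    }
};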

Upvotes: 1
