Daniel Dewhurst

Reputation: 2603

Queuing Guzzle Requests With Limits

I'm working on a Laravel application, using Guzzle 6. A lot of the functionality relies on an API, for which I've created a wrapper.

My wrapper is a single class that creates the Guzzle client in its __construct(), and has a variety of public methods which return responses from Guzzle requests.

The API I'm using has a limit of 40 requests every 10 seconds. I am caching things, so it would be very rare to hit this limit, but I'd like to know that my application wouldn't just die if it did!

Some notes about my app:

So, my question is, how should I make sure I do not hit this limit? A few ideas of mine are the following:

I'm trying not to provoke overly opinionated responses, but I'm sure there's probably a better and/or easier way than the above. If these are good ideas, any pointers or recommendations would be great.

Thanks in advance.

Upvotes: 17

Views: 4947

Answers (4)

mixel

Reputation: 25876

  1. Wrap your API calls in jobs and push them to a separate queue:

    ApiJob::dispatch()->onQueue('api');
    
  2. Use the mxl/laravel-queue-rate-limit package (I'm the author) to rate-limit the api queue. Add this to config/queue.php:

    'rateLimit' => [
        'api' => [
            'allows' => 40,
            'every' => 10
        ]
    ]
    
  3. Run queue worker:

    $ php artisan queue:work --queue api
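A minimal sketch of such a job could look like the following; the class name ApiJob and the wrapper call inside handle() are assumptions, not code from the question:

```php
<?php
// app/Jobs/ApiJob.php -- minimal sketch; the wrapper call is hypothetical.

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;

class ApiJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable;

    public function handle()
    {
        // Call your Guzzle wrapper here; the rate-limited worker ensures
        // at most 40 of these jobs run per 10-second window.
    }
}
```

The job itself stays oblivious to throttling; the limit lives entirely in the queue configuration and worker.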
    

See also this answer.

Upvotes: 2

anwerj

Reputation: 2488

I am also working on the same issue. I preferred a callback-based architecture where my Client class controls the flow of requests. Currently I am using a sleep-and-check algorithm. It works for me, as I have a 3-second cool-down time.

I use the Cache to hold the count of fired requests.

while (($count = Cache::get($this->getCacheKey(), 0)) >= 40) { // max 40 requests per window
    sleep(1);
}
Cache::put($this->getCacheKey(), ++$count, 1); // Laravel's Cache has put(), not set()
// fire request

function getCacheKey() {
    return floor(time() / 10); // bucket key changes every 10 seconds (cool-down window)
}

Queueing seems to be a better option, and I will eventually move to that. There are a few things you need to keep in mind before putting a queue in between.

  1. Callback-based architecture, because you may need to save a serialised state of the code in the queue. A callback-based design gives all control to the Client class, so you will not have to worry about throttling in your code.
  2. Serialization could be tricky; try __sleep and __wakeup.
  3. You may also want to prioritise a few calls; you can allocate a quota from the client for such calls.

Upvotes: 1

mark.sagikazar

Reputation: 1042

Personally I think Guzzle should not handle this case, but if you want it to, I would write a middleware which checks the response. If it returns a rate limit error (e.g. status code 429), you can either emit a custom error or wait until the rate limit window is over and try again. However, this could end up in long response times (since you wait for the rate limit).
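The middleware idea can be sketched with Guzzle's built-in Middleware::retry helper; the retry count of 3 and the fixed 10-second wait (one full rate-limit window) are assumptions, not values from the question:

```php
<?php
// Sketch: retry middleware for Guzzle 6 that waits out a 429 response.

use GuzzleHttp\Client;
use GuzzleHttp\HandlerStack;
use GuzzleHttp\Middleware;
use Psr\Http\Message\ResponseInterface;

$stack = HandlerStack::create();
$stack->push(Middleware::retry(
    // Decider: retry up to 3 times whenever the API answers 429.
    function ($retries, $request, ResponseInterface $response = null) {
        return $retries < 3 && $response !== null && $response->getStatusCode() === 429;
    },
    // Delay (in milliseconds): wait one full 10-second window before retrying.
    function ($retries) {
        return 10 * 1000;
    }
));

$client = new Client(['handler' => $stack]);
```

Note that each blocked request ties up a PHP worker for up to 10 seconds per retry, which is why the answer warns about long response times.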

I don't think the Laravel queue would be any better, since that would make the response available asynchronously and you would have to poll your database or cache, wherever you store the results. (Of course, it can work if you don't need the results to be immediately available.)

If this third-party service is directly connected to a user-facing interface, I would probably apply the same rate limit (in your application code) and return an error message to the user instead of waiting and auto-resolving the issue.

Upvotes: 0

Kyle

Reputation: 405

There's not enough information to really dig deep into this, but to get you started: good APIs typically return a 429 response code when you're exceeding their rate limit.

You could use $res->getStatusCode() from Guzzle to check for this, and flash a message back to the user if they're making too many requests too quickly.
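A minimal sketch of that check, assuming a hypothetical endpoint and flash message (the http_errors option stops Guzzle from throwing on 4xx so the status code can be inspected directly):

```php
<?php
// Sketch: inspect the status code instead of letting Guzzle throw on 429.

$res = $client->request('GET', '/endpoint', ['http_errors' => false]);

if ($res->getStatusCode() === 429) {
    // Flash a message and bail out instead of letting the app die.
    return redirect()->back()->with('error', 'Too many requests, please try again shortly.');
}
```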

Can you give some more information about what your app is doing? Are you making requests in a foreach loop? Is the view dependent on data from this API?

Upvotes: 2
