Reputation: 39
In my Rails app, I need to call an HTTP API on another server (call it "Server A"), but Server A cannot handle too many requests at the same time. So I need to limit the queries coming from my Rails app with a queue, a connection pool, or something like that. I first tried Sidekiq, but there is no good way to wait for a Sidekiq job to finish from inside a Rails request (I need to respond within the request; it can take a little longer, and that is OK). Can anyone suggest a solution for this situation?
Upvotes: 2
Views: 1124
Reputation: 2014
Under the assumption you stated, that it's OK to block the request until you're within your rate limit (a very questionable assumption in my opinion), you will essentially need a way to coordinate that blocking among the Rails worker processes attempting the operation.
This suggests to me that you need to implement a solution using a distributed semaphore. There are a number of libraries and services that provide such functionality.
Here's an example for Redis.
But please consider that you can end up blocking all of your workers and making your app unresponsive if you're not careful in designing your solution.
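A minimal sketch of that idea, assuming the plain redis gem and a Redis list used as a pool of tokens (the key name, pool size, and timeout below are made up for illustration; the semaphore libraries mentioned above wrap the same pattern):

require "redis"

REDIS     = Redis.new
POOL_KEY  = "server_a:tokens" # hypothetical key name
POOL_SIZE = 5                 # max concurrent calls to Server A

# One-time setup (e.g. in an initializer): seed the token pool.
REDIS.del(POOL_KEY)
POOL_SIZE.times { REDIS.rpush(POOL_KEY, "token") }

# Wrap every call to Server A: block until a token is free, then give it back.
def with_server_a_slot(timeout: 10)
  token = REDIS.blpop(POOL_KEY, timeout: timeout) # blocks across all processes
  raise "timed out waiting for a Server A slot" unless token
  yield
ensure
  REDIS.rpush(POOL_KEY, "token") if token
end

with_server_a_slot do
  # call Server A here
end

Note that if a process dies while holding a token, that slot is lost until the pool is reseeded; a production setup needs a recovery strategy, which is exactly what the dedicated semaphore libraries handle for you.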
Upvotes: 0
Reputation: 1958
If you can switch the HTTP client, Typhoeus has a feature to limit the number of concurrent HTTP requests: when more requests than that are queued up, hydra will save them for later and start them as the others finish.
https://github.com/typhoeus/typhoeus#specifying-max-concurrency
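For example (the URL, request count, and concurrency value below are placeholders):

require "typhoeus"

# Hydra never runs more than max_concurrency requests at once;
# the rest are held back until a slot frees up.
hydra = Typhoeus::Hydra.new(max_concurrency: 5)

requests = 20.times.map do
  request = Typhoeus::Request.new("https://server-a.example.com/api") # placeholder URL
  hydra.queue(request)
  request
end

hydra.run # blocks until every queued request has finished

responses = requests.map(&:response)

Keep in mind this caps concurrency only within a single process; if you run several Rails workers, each one gets its own limit.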
Upvotes: 0
Reputation: 22228
Sidekiq does not have a way to rate limit external operations. Sidekiq Enterprise offers a rate limiting API which will do what you need.
https://github.com/mperham/sidekiq/wiki/Ent-Rate-Limiting#concurrent
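Roughly, the concurrent limiter described on that wiki page looks like this (the limiter name, size, and timeouts are illustrative, and it requires the commercial sidekiq-ent gem):

# Allow at most 5 concurrent calls to Server A across all your processes.
SERVER_A_LIMITER = Sidekiq::Limiter.concurrent(
  "server_a",       # limiter name (illustrative)
  5,                # concurrent slots
  wait_timeout: 5,  # seconds to wait for a free slot
  lock_timeout: 30  # seconds before a held slot is considered stale
)

def call_server_a(payload)
  SERVER_A_LIMITER.within_limit do
    # call Server A here
  end
end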
Upvotes: 1
Reputation: 159
You can set the concurrency for Sidekiq in config/sidekiq.yml:
---
:concurrency: 5
so there won't be more than 5 requests at the same time, and the others will wait in the queue until a job finishes.
Upvotes: 0