Reputation: 652
I want to hit a URL which will eventually return the required data. Sometimes the requests time out, and it can take up to 1 minute to receive the data. I want to make this as fast as possible. I am thinking of starting multiple threads and using the data from the earliest completed thread. Can anyone please help with the best approach?
I think I could do it via an infinite loop that waits for the result in an array shared with the threads, but that seems like a very inefficient way of doing it.
Upvotes: 1
Views: 97
Reputation: 26778
Something like this could be a strategy. It builds a list of threads, each of which attempts to set result
to some value. The main thread then sleeps until result
is set and kills all the threads (with a total 60-second timeout limit).
require 'timeout'

proxies = ["proxy.com", "proxy.org"] # replace with your proxies
result = nil

Timeout.timeout(60) do
  threads = proxies.map do |proxy|
    Thread.new do
      # replace `get` with your actual HTTP call through the proxy
      result = get(proxy, target_url)
    end
  end

  # poll until one of the threads has set result, then stop the rest
  sleep 0.25 until result
  threads.each(&:kill)
end
You would probably want to modify this so that it checks that the response is successful, and not just that it's non-nil (for example, if it returns a 500 error).
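That success check can be folded in by having each thread push only successful results onto a thread-safe Queue, so the main thread blocks on the first good response instead of polling a shared variable. A minimal sketch (the `first_success` helper and its callable "attempts" are my own illustration, not part of the original answer; plug in your real HTTP calls):

```ruby
require 'timeout'

# Race several attempts and return the first *successful* result.
# Each attempt is a callable; returning nil or raising counts as failure.
def first_success(attempts, timeout: 60)
  queue = Queue.new
  threads = attempts.map do |attempt|
    Thread.new do
      value = attempt.call
      queue << value unless value.nil? # only hand back real results
    rescue StandardError
      nil # a failed attempt simply never pushes to the queue
    end
  end
  Timeout.timeout(timeout) { queue.pop } # blocks until the first success
ensure
  threads&.each(&:kill)
end
```

With HTTP requests, each attempt would return the response only when it is a `Net::HTTPSuccess` (and `nil` otherwise), so a 500 never wins the race.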
Also, I would advise trying to adhere to the rate limits of the API, and check with their terms of service to ensure this is allowed.
Keep in mind that if you set a timeout of 60 seconds, anyone sending a request to this endpoint may have to wait up to 60 seconds for a response. This is usually undesirable, and people use async approaches instead.
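The async idea is: return a token to the caller immediately, do the slow fetch in the background, and let the caller poll for the result. A bare-bones sketch with plain threads (the `AsyncFetcher` class is my own illustration; in production you would use a job library rather than hand-rolled threads):

```ruby
require 'securerandom'

# Minimal async-fetch sketch: enqueue returns a token right away,
# and the client polls result(token) until the value is ready.
class AsyncFetcher
  def initialize
    @results = {}
    @mutex = Mutex.new
  end

  # Kick off the slow work in a background thread; return a token.
  def enqueue(&work)
    token = SecureRandom.hex(8)
    Thread.new do
      value = work.call
      @mutex.synchronize { @results[token] = value }
    end
    token
  end

  # Poll with the token; nil means "not ready yet".
  def result(token)
    @mutex.synchronize { @results[token] }
  end
end
```

The client gets an instant response (the token) instead of blocking for up to 60 seconds, at the cost of a second request to collect the data.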
Upvotes: 2
Reputation: 5426
Not sure what you're doing to get that data or what the constraints of your client are, but it sounds like you might need background jobs (see: https://github.com/mperham/sidekiq or https://github.com/collectiveidea/delayed_job). Depending on your exact case, you can use various techniques to push the obtained data to the client.
Upvotes: 0