Reputation: 13335
I have the following async code that gets called from so many places in my project:
public async Task<HttpResponseMessage> MakeRequestAsync(HttpRequestMessage request)
{
    var client = new HttpClient();
    return await client.SendAsync(request).ConfigureAwait(false);
}
An example of how the above method gets called:
var tasks = items.Select(async i =>
{
    var response = await MakeRequestAsync(i.Url);
    //do something with response
});
The ZenDesk API that I'm hitting allows about 200 requests per minute, after which I get a 429 error. I need to do some sort of Thread.Sleep if I encounter the 429 error, but with async/await there may be many requests in flight waiting to process, and I'm not sure how to make all of them sleep for 5 seconds or so and then resume.
What's the correct way to approach this problem? I'd like to hear quick solutions as well as good-design solutions.
Upvotes: 3
Views: 4921
Reputation: 135
If there are multiple requests waiting in the while loop for RequestAllowed, some of them might start at the same time. How about a simple StartRequestIfAllowed that checks the limit and registers the request in a single locked step?
public class ThrottlingHelper : IDisposable
{
    //Holds time stamps for all started requests
    private readonly List<long> _requestsTx;
    private readonly Mutex _mutex = new Mutex();
    private readonly int _maxLimit;
    private readonly TimeSpan _interval;

    public ThrottlingHelper(int maxLimit, TimeSpan interval)
    {
        _requestsTx = new List<long>();
        _maxLimit = maxLimit;
        _interval = interval;
    }

    //Checks the limit and, if a slot is free, registers the request inside the same
    //locked section, so two callers cannot both grab the last free slot.
    public bool StartRequestIfAllowed
    {
        get
        {
            _mutex.WaitOne();
            try
            {
                var nowTx = DateTime.Now.Ticks;
                if (_requestsTx.Count(tx => nowTx - tx < _interval.Ticks) < _maxLimit)
                {
                    _requestsTx.Add(DateTime.Now.Ticks);
                    return true;
                }
                else
                {
                    return false;
                }
            }
            finally
            {
                _mutex.ReleaseMutex();
            }
        }
    }

    //Prunes time stamps that have fallen outside the sliding window.
    public void EndRequest()
    {
        _mutex.WaitOne();
        try
        {
            var nowTx = DateTime.Now.Ticks;
            _requestsTx.RemoveAll(tx => nowTx - tx >= _interval.Ticks);
        }
        finally
        {
            _mutex.ReleaseMutex();
        }
    }

    public void Dispose()
    {
        _mutex.Dispose();
    }
}
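A minimal sketch of how MakeRequestAsync from the question might use this variant (untested; it assumes the helper is a field of the class that makes the requests, and it reuses a single HttpClient instance rather than creating one per call):

//Sketch only: member declarations for the class that makes the requests.
private static readonly HttpClient _client = new HttpClient();
private readonly ThrottlingHelper _throttlingHelper =
    new ThrottlingHelper(200, TimeSpan.FromMinutes(1));

public async Task<HttpResponseMessage> MakeRequestAsync(HttpRequestMessage request)
{
    //Poll until a slot is free; StartRequestIfAllowed checks and registers in one locked step.
    while (!_throttlingHelper.StartRequestIfAllowed)
    {
        await Task.Delay(1000);
    }
    try
    {
        return await _client.SendAsync(request).ConfigureAwait(false);
    }
    finally
    {
        //Prune time stamps that have left the sliding window.
        _throttlingHelper.EndRequest();
    }
}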
Upvotes: 0
Reputation: 13976
I do not think this is a duplicate, as it was recently marked. The other SO poster does not need a time-based sliding window (or time-based throttling), and the answer there does not cover this situation; it works only when you want to set a hard limit on outgoing requests.
Anyway, a quasi-quick solution is to do the throttling in the MakeRequestAsync method. Something like this:
public async Task<HttpResponseMessage> MakeRequestAsync(HttpRequestMessage request)
{
    //Wait while the limit has been reached.
    while (!_throttlingHelper.RequestAllowed)
    {
        await Task.Delay(1000);
    }

    var client = new HttpClient();
    _throttlingHelper.StartRequest();
    var result = await client.SendAsync(request).ConfigureAwait(false);
    _throttlingHelper.EndRequest();
    return result;
}
The class ThrottlingHelper is just something I made now, so you may need to debug it a bit (read: it may not work out of the box). It tries to be a time-stamp sliding window.
public class ThrottlingHelper : IDisposable
{
    //Holds time stamps for all started requests
    private readonly List<long> _requestsTx;
    private readonly ReaderWriterLockSlim _lock;
    private readonly int _maxLimit;
    private TimeSpan _interval;

    public ThrottlingHelper(int maxLimit, TimeSpan interval)
    {
        _requestsTx = new List<long>();
        _maxLimit = maxLimit;
        _interval = interval;
        _lock = new ReaderWriterLockSlim(LockRecursionPolicy.NoRecursion);
    }

    //True if fewer than _maxLimit requests have started within the last _interval.
    public bool RequestAllowed
    {
        get
        {
            _lock.EnterReadLock();
            try
            {
                var nowTx = DateTime.Now.Ticks;
                return _requestsTx.Count(tx => nowTx - tx < _interval.Ticks) < _maxLimit;
            }
            finally
            {
                _lock.ExitReadLock();
            }
        }
    }

    //Registers a request with the current time stamp.
    public void StartRequest()
    {
        _lock.EnterWriteLock();
        try
        {
            _requestsTx.Add(DateTime.Now.Ticks);
        }
        finally
        {
            _lock.ExitWriteLock();
        }
    }

    //Prunes time stamps that have fallen outside the sliding window.
    public void EndRequest()
    {
        _lock.EnterWriteLock();
        try
        {
            var nowTx = DateTime.Now.Ticks;
            _requestsTx.RemoveAll(tx => nowTx - tx >= _interval.Ticks);
        }
        finally
        {
            _lock.ExitWriteLock();
        }
    }

    public void Dispose()
    {
        _lock.Dispose();
    }
}
You would use it as a member in the class that makes the requests, and instantiate it like this:
_throttlingHelper = new ThrottlingHelper(200, TimeSpan.FromMinutes(1));
Don't forget to dispose it when you're done with it.
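For example, a minimal sketch of that owning class (the RequestService name is just illustrative):

//Sketch: the class that makes the requests owns the helper and disposes it with itself.
public sealed class RequestService : IDisposable
{
    private readonly ThrottlingHelper _throttlingHelper =
        new ThrottlingHelper(200, TimeSpan.FromMinutes(1));

    //MakeRequestAsync from above goes here and uses _throttlingHelper.

    public void Dispose()
    {
        _throttlingHelper.Dispose();
    }
}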
A bit of documentation about ThrottlingHelper:
RequestAllowed lets you know if you are able to do a request with the current throttling settings.
StartRequest & EndRequest register/unregister a request by using the current date/time.
EDIT/Pitfalls
As indicated by @PhilipABarnes, EndRequest can potentially remove requests that are still in progress. As far as I can see, this can happen in two situations:
The proposed solution involves actually matching EndRequest calls to StartRequest calls by means of a GUID or something similar.
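A rough sketch of what that matching could look like (my illustration only, not code from the thread: the dictionary of entries, the token-returning signatures, and the class name are assumptions, so treat it as untested):

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;

//Sketch only: a variant where StartRequest hands back a token and EndRequest completes
//exactly the entry it started, so entries for requests that are still running are never
//pruned by someone else's EndRequest.
public class TokenThrottlingHelper : IDisposable
{
    private sealed class Entry
    {
        public long StartedTicks;
        public bool Completed;
    }

    private readonly Dictionary<Guid, Entry> _requests = new Dictionary<Guid, Entry>();
    private readonly ReaderWriterLockSlim _lock = new ReaderWriterLockSlim(LockRecursionPolicy.NoRecursion);
    private readonly int _maxLimit;
    private readonly TimeSpan _interval;

    public TokenThrottlingHelper(int maxLimit, TimeSpan interval)
    {
        _maxLimit = maxLimit;
        _interval = interval;
    }

    public bool RequestAllowed
    {
        get
        {
            _lock.EnterReadLock();
            try
            {
                var nowTx = DateTime.Now.Ticks;
                //Same sliding-window count as before: requests started within the interval.
                return _requests.Values.Count(e => nowTx - e.StartedTicks < _interval.Ticks) < _maxLimit;
            }
            finally
            {
                _lock.ExitReadLock();
            }
        }
    }

    public Guid StartRequest()
    {
        _lock.EnterWriteLock();
        try
        {
            var token = Guid.NewGuid();
            _requests[token] = new Entry { StartedTicks = DateTime.Now.Ticks };
            return token;
        }
        finally
        {
            _lock.ExitWriteLock();
        }
    }

    public void EndRequest(Guid token)
    {
        _lock.EnterWriteLock();
        try
        {
            //Mark only our own entry as completed.
            if (_requests.TryGetValue(token, out var entry))
            {
                entry.Completed = true;
            }

            //Prune entries that are both completed and outside the sliding window.
            var nowTx = DateTime.Now.Ticks;
            foreach (var key in _requests
                .Where(kv => kv.Value.Completed && nowTx - kv.Value.StartedTicks >= _interval.Ticks)
                .Select(kv => kv.Key)
                .ToList())
            {
                _requests.Remove(key);
            }
        }
        finally
        {
            _lock.ExitWriteLock();
        }
    }

    public void Dispose()
    {
        _lock.Dispose();
    }
}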
Upvotes: 6