Reputation: 980
I'm using ASP.NET Web API 2.2 with OWIN to build a web service, and I've observed that each call to a controller is served by a separate thread on the server side. That's nothing surprising and is the behavior I expected.
One issue I'm having is that the server-side actions are very memory-intensive, so if more than X users call in at the same time, there is a good chance the server code will throw an out-of-memory exception.
Is it possible to set a global "maximum action count" so that Web API queues (not rejects) the incoming calls and only proceeds when there's an empty slot?
I can't run the web service as a 64-bit process because some of the referenced libraries don't support it.
I also looked at libraries like https://github.com/stefanprodan/WebApiThrottle, but it can only throttle based on the frequency of calls.
Thanks
Upvotes: 5
Views: 1987
Reputation: 16393
You could add a piece of OwinMiddleware along these lines (influenced by the WebApiThrottle you linked to):
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Owin;

public class MaxConcurrentMiddleware : OwinMiddleware
{
    private readonly int maxConcurrentRequests;
    private int currentRequestCount;

    public MaxConcurrentMiddleware(OwinMiddleware next, int maxConcurrentRequests)
        : base(next)
    {
        this.maxConcurrentRequests = maxConcurrentRequests;
    }

    public override async Task Invoke(IOwinContext context)
    {
        try
        {
            // Atomically count this request in; if the limit is exceeded, reject it.
            if (Interlocked.Increment(ref currentRequestCount) > maxConcurrentRequests)
            {
                var response = context.Response;
                response.OnSendingHeaders(state =>
                {
                    var resp = (IOwinResponse)state;
                    resp.StatusCode = 429; // 429 Too Many Requests
                }, response);
                return;
            }

            await Next.Invoke(context);
        }
        finally
        {
            // Always release the slot, whether the request was served or rejected.
            Interlocked.Decrement(ref currentRequestCount);
        }
    }
}
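To wire it up, register the middleware in your OWIN Startup class before the Web API pipeline. Something like this should do it (the limit of 10 is just an illustrative value, and UseWebApi comes from the Microsoft.AspNet.WebApi.Owin package):
using Owin;
using System.Web.Http;

public class Startup
{
    public void Configuration(IAppBuilder app)
    {
        // Illustrative limit of 10 concurrent requests; tune it to your memory budget.
        app.Use<MaxConcurrentMiddleware>(10);

        var config = new HttpConfiguration();
        config.MapHttpAttributeRoutes();
        app.UseWebApi(config);
    }
}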
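Note that the middleware above rejects excess calls with 429 rather than queuing them as you asked. If you want callers to wait for a free slot instead, a minimal sketch (not part of the code above; the class name is just illustrative) would swap the Interlocked counter for a SemaphoreSlim and await it:
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Owin;

public class MaxConcurrentQueueingMiddleware : OwinMiddleware
{
    private readonly SemaphoreSlim semaphore;

    public MaxConcurrentQueueingMiddleware(OwinMiddleware next, int maxConcurrentRequests)
        : base(next)
    {
        // Start with all slots free; the semaphore caps how many requests run at once.
        semaphore = new SemaphoreSlim(maxConcurrentRequests, maxConcurrentRequests);
    }

    public override async Task Invoke(IOwinContext context)
    {
        // Wait asynchronously for a free slot; no thread is blocked while waiting.
        await semaphore.WaitAsync();
        try
        {
            await Next.Invoke(context);
        }
        finally
        {
            semaphore.Release();
        }
    }
}
Keep in mind that queued requests still hold their connections while they wait, so client timeouts and the host's own request queue limits still apply; using WaitAsync with a timeout and falling back to 429 is a reasonable safeguard.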
Upvotes: 5