Reputation: 57
I have a Web API that receives requests and processes the data with XPO. According to my business logic, to avoid conflicts on the same DB table, every request must be executed one by one, in order.
Honestly, I couldn't try anything yet. I'm a little bit confused by the asynchronous side of ASP.NET Web API.
The code below is the part where the response is built in a custom DelegatingHandler. When the base.SendAsync(request, cancellationToken) method executes, my business logic starts in the related System.Web.Http.ApiController class.
protected override async Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, System.Threading.CancellationToken cancellationToken)
{
    // some logic on request like isUserGranted etc.
    var response = await base.SendAsync(request, cancellationToken);
    return buildApiResponse(request, response);
}
Probably I have a misconception here. I need to wait until the current response has completed before handling the next request, if there is one in the queue. I've read about ConcurrentQueue but couldn't figure out how to implement it for this case.
Thanks in advance.
Upvotes: 0
Views: 1286
Reputation: 456437
According to my business logic, to avoid conflicts on same DB table, every request must be executed one by one in a order.
This is "pessimistic locking", and is generally considered not good for scalability reasons. If possible, change the business requirements to use "optimistic locking".
But if you really want to use pessimistic locking, you can enforce one-at-a-time behavior using an asynchronous lock (SemaphoreSlim):
private static readonly SemaphoreSlim Mutex = new SemaphoreSlim(1);

protected override async Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, System.Threading.CancellationToken cancellationToken)
{
    // some logic on request like isUserGranted etc.
    await Mutex.WaitAsync();
    try
    {
        var response = await base.SendAsync(request, cancellationToken);
        return buildApiResponse(request, response);
    }
    finally
    {
        Mutex.Release();
    }
}
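Note that a static SemaphoreSlim only serializes requests within a single process; a web farm would need a distributed lock instead. For completeness, here is a sketch of how a handler like this is typically wired up (the handler class name SerializingHandler is assumed, not from the original post):

// Registering the custom DelegatingHandler in WebApiConfig.
public static class WebApiConfig
{
    public static void Register(System.Web.Http.HttpConfiguration config)
    {
        // Message handlers run for every request, before the controller is invoked.
        config.MessageHandlers.Add(new SerializingHandler());

        config.MapHttpAttributeRoutes();
    }
}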
Upvotes: 1