Steven Li

Reputation: 764

C#: allowing only one thread at a time to run a method when multiple requests hit the application server

I am building an ASP.NET Web API service. One API call needs more than 2 minutes to retrieve the desired data, so I implemented a caching mechanism: for every request sent to the API server, the server returns the cached data and meanwhile starts a new thread to load fresh data into the cache. The issue is that if I submit a lot of requests, a lot of threads end up running and eventually crash the server. I want a mechanism that allows only one of these threads at any given time, but I know ASP.NET Web API is inherently multi-threaded. How do I tell the other requests to wait, because one thread is already retrieving a new set of data?

    [Dependency]
    public ICacheManager<OrderArray> orderArrayCache { get; set; }
    private ReadOrderService Service = new ReadOrderService();
    private const string _ckey = "all";
    public dynamic Get()
    {
        try
        {
            OrderArray cache = orderArrayCache.Get(_ckey);
            if(cache == null || cache.orders.Length == 0)
            {
                OrderArray data = Service.GetAllOrders();
                orderArrayCache.Add(_ckey, data);
                return data;
            }
            else
            {
                Caching();
                return cache;
            }
        }
        catch (Exception error)
        {
            ErrorLog.WriteLog(Config._SystemName, this.GetType().Name, System.Reflection.MethodBase.GetCurrentMethod().Name, error.ToString());
            return 0;
        }
    }

    public void Caching()
    {
        Thread worker = new Thread(() => CacheWorker());
        worker.Start();
    }
    public void CacheWorker()
    {
        try
        {
            //ActivityLog.WriteLog(Config._SystemName, this.GetType().Name, System.Reflection.MethodBase.GetCurrentMethod().Name, "Cache Worker Is Starting to Work");
            OrderArray data = Service.GetAllOrders();
            orderArrayCache.Put(_ckey, data);
            //ActivityLog.WriteLog(Config._SystemName, this.GetType().Name, System.Reflection.MethodBase.GetCurrentMethod().Name, "Cache Worker Is Working Hard");
        }
        catch(Exception error)
        {
            //ActivityLog.WriteLog(Config._SystemName, this.GetType().Name, System.Reflection.MethodBase.GetCurrentMethod().Name, error.ToString());
        }
    }

Upvotes: 0

Views: 2280

Answers (1)

CodeCaster

Reputation: 151594

Without commenting on the overall architecture, it's as trivial as setting a flag that you're working, and not starting the thread if that flag is set.

Of course in the ASP.NET MVC/WebAPI context, a controller instance is created for every request, so a simple field won't work. You could make it static, but that'll only work per AppDomain: one application can run in multiple AppDomains, by using multiple worker processes.

You could solve that by using a mutex, but then your application could be in a server farm, introducing a whole shebang of new problems.
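On a single machine, that mutex idea could look roughly like this (a sketch only, reusing the question's Service, orderArrayCache and _ckey members; the mutex name is made up, and abandoned-mutex handling is glossed over):

private static readonly Mutex _refreshMutex = new Mutex(false, @"Global\OrderCacheRefresh"); // hypothetical name

public void Caching()
{
    // A Mutex has thread affinity, so acquire and release it on the worker thread itself.
    new Thread(() =>
    {
        // WaitOne(0) returns immediately: false means another thread or worker process holds the mutex.
        if (!_refreshMutex.WaitOne(0))
        {
            return; // a refresh is already running somewhere on this machine
        }

        try
        {
            OrderArray data = Service.GetAllOrders();
            orderArrayCache.Put(_ckey, data);
        }
        catch (Exception)
        {
            // log/swallow so the background thread cannot take the process down
        }
        finally
        {
            _refreshMutex.ReleaseMutex();
        }
    }).Start();
}

Note that a short-lived thread is still created per request here; it just exits immediately when the mutex is already held.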

That being said, the naive, static approach:

private static bool _currentlyRetrievingCacheableData = false;

public void Caching()
{
    if (_currentlyRetrievingCacheableData)
    {
        return;
    }

    Thread worker = new Thread(() => CacheWorker());
    worker.Start();
}

public void CacheWorker()
{
    try
    {
        _currentlyRetrievingCacheableData = true;

        // ...
    }
    catch(Exception error)
    {
        // ...
    }
    finally
    {
        _currentlyRetrievingCacheableData = false;
    }
}

There's still a race here: the flag is only set inside CacheWorker(), so several requests can pass the check in Caching() before the first worker flips it. You can close that window with a lock statement, setting the flag before the thread is started.
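For example, claiming the flag inside a lock before the thread starts (a sketch against the same names; logging omitted):

private static readonly object _cacheRefreshLock = new object();
private static bool _currentlyRetrievingCacheableData = false;

public void Caching()
{
    lock (_cacheRefreshLock)
    {
        if (_currentlyRetrievingCacheableData)
        {
            return; // a refresh is already in flight
        }

        // Claim the work before leaving the lock, so no second request can slip through.
        _currentlyRetrievingCacheableData = true;
    }

    new Thread(CacheWorker).Start();
}

public void CacheWorker()
{
    try
    {
        OrderArray data = Service.GetAllOrders();
        orderArrayCache.Put(_ckey, data);
    }
    catch (Exception)
    {
        // log/swallow so the background thread cannot crash the process
    }
    finally
    {
        lock (_cacheRefreshLock)
        {
            _currentlyRetrievingCacheableData = false;
        }
    }
}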

Do note that all of this is a workaround for doing the obvious: let the cache-refreshing mechanism live outside your web application code, for example in a Windows Service or a Scheduled Task.

Upvotes: 1
