Reputation: 1725
I'm storing data in the cache so as not to hit the database constantly (it doesn't matter if the data is a little stale). The dataset isn't particularly large, but the operation can take some time due to the complexity of the query (lots of joins and subqueries). I have a static helper class, and the data is used for binding on individual pages. The page calls it like so:
public static List<MyList> MyDataListCache
{
    get
    {
        var myList = HttpContext.Current.Cache["myList"];
        if (myList == null)
        {
            var result = MyLongRunningOperation();
            HttpContext.Current.Cache.Add("myList", result, null,
                DateTime.Now.AddMinutes(3), Cache.NoSlidingExpiration,
                CacheItemPriority.Normal, null);
            return result;
        }
        else
        {
            return (List<MyList>)myList;
        }
    }
}
This works fine unless lots of people hit the page at the same time when the item has fallen out of the cache: hundreds of the long-running operations are spun up and cause the application to crash. How do I avoid this problem? I've tried using async tasks to detect whether the operation is already running, but had no luck getting it to work.
Upvotes: 0
Views: 350
Reputation: 3222
First, upon starting this service, you should call MyLongRunningOperation() immediately to warm up your cache.
Second, you always want something to be returned, so I would consider a background task that refreshes this cache prior to its expiry.
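A minimal sketch of such a background refresh, assuming the same `MyLongRunningOperation()` and `"myList"` key from the question, and a hypothetical `StartCacheRefresh()` you would call once from `Application_Start`. The 2.5-minute interval is an assumption, chosen to beat the 3-minute absolute expiry:

```csharp
using System;
using System.Threading;
using System.Web;
using System.Web.Caching;

public static class MyCacheHelper
{
    private static Timer _refreshTimer;

    // Call once at application startup. The first tick fires
    // immediately, which also warms the cache.
    public static void StartCacheRefresh()
    {
        _refreshTimer = new Timer(_ =>
        {
            var result = MyLongRunningOperation();

            // HttpContext.Current is null on a timer thread, so use
            // HttpRuntime.Cache, which is the same underlying cache.
            HttpRuntime.Cache.Insert("myList", result, null,
                DateTime.Now.AddMinutes(3), Cache.NoSlidingExpiration);
        },
        null,
        TimeSpan.Zero,                // warm up immediately
        TimeSpan.FromMinutes(2.5));   // refresh before the 3-minute expiry
    }
}
```

With this in place the getter almost always finds a fresh item and never has to run the query on a request thread.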
Doing these two things will avoid the situation you described: the cache is refreshed by a background worker, and so everyone is happy :)
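As an extra safety net, even with a background refresh you can guard the getter itself so that at most one request ever rebuilds the cache. This is a different technique from the answer above, double-checked locking, sketched here with the same names as in the question:

```csharp
using System;
using System.Collections.Generic;
using System.Web;
using System.Web.Caching;

public static class MyCacheHelper
{
    private static readonly object _cacheLock = new object();

    public static List<MyList> MyDataListCache
    {
        get
        {
            var myList = HttpContext.Current.Cache["myList"];
            if (myList == null)
            {
                lock (_cacheLock)
                {
                    // Re-check: another request may have repopulated
                    // the cache while this thread waited for the lock.
                    myList = HttpContext.Current.Cache["myList"];
                    if (myList == null)
                    {
                        var result = MyLongRunningOperation();
                        HttpContext.Current.Cache.Add("myList", result, null,
                            DateTime.Now.AddMinutes(3), Cache.NoSlidingExpiration,
                            CacheItemPriority.Normal, null);
                        return result;
                    }
                }
            }
            return (List<MyList>)myList;
        }
    }
}
```

The trade-off is that concurrent requests block on the lock while one of them runs the query, instead of all running it at once and crashing the application.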
Upvotes: 1