Kyle

Reputation: 33701

How to cache database data into memory for use by MVC application?

I have a somewhat complex permission system that uses six database tables in total, and to speed it up I would like to cache these tables in memory instead of having to hit the database on every page load.

However, I'll need to update this cache when a new user is added or a permission is changed. I'm not sure how to go about keeping this cache in memory, or how to update it safely without causing problems if it's accessed at the same time it's being updated.

Does anyone have an example of how to do something like this or can point me in the right direction for research?

Upvotes: 5

Views: 8604

Answers (6)

nozari

Reputation: 504

I would cache items as you use them: in your data layer, when fetching data, first check whether it is available in the cache; if not, go to the database and cache the result afterwards.

public AccessModel GetAccess(string accessCode)
{
     var cached = cache.Get<AccessModel>(accessCode);
     if (cached != null)
          return cached;

     // Cache the result so the next call skips the database.
     var access = GetFromDatabase(accessCode);
     cache.Set(accessCode, access);
     return access;
}

Then I would think about the cache invalidation strategy. You can go one of two ways:

One is to set the cached data to expire after, say, one hour, so you only hit the database once an hour.

The other is to invalidate the cache whenever you update the data. That is certainly the better option, but it is a bit more complex.

Hope it helps.

Note: you can use either the ASP.NET Cache or another solution such as memcached, depending on your infrastructure.
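The expiration approach can be sketched with System.Runtime.Caching.MemoryCache (the PermissionCache class, the key, and the one-hour window are illustrative assumptions, not part of the question):

```csharp
using System;
using System.Runtime.Caching;

public static class PermissionCache
{
    private static readonly MemoryCache Cache = MemoryCache.Default;

    // Returns the cached value for key, or loads it and caches it for one hour.
    public static T GetOrLoad<T>(string key, Func<T> load) where T : class
    {
        var cached = Cache.Get(key) as T;
        if (cached != null)
            return cached;

        var value = load();
        // Absolute expiration: the entry is evicted an hour after insertion,
        // so the database is hit at most once per hour per key.
        Cache.Set(key, value, DateTimeOffset.Now.AddHours(1));
        return value;
    }
}
```

With this shape, the `load` delegate only runs on a cache miss, so repeated calls within the hour never touch the database.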

Upvotes: 1

Artyom Neustroev

Reputation: 8715

I can suggest caching such data in the Application state object. For thread-safe usage, consider using the lock statement. Your code would look something like this:

private static readonly object _sync = new object();

public void ClearTableCache(string tableName) 
{
    // Lock on a shared object: HttpContext.Current is per-request,
    // so locking on it would not synchronize across requests.
    lock (_sync) 
    {
       System.Web.HttpContext.Current.Application[tableName] = null;
    }        
}

public SomeDataType GetTableData(string tableName) 
{
    lock (_sync) 
    {
       if (System.Web.HttpContext.Current.Application[tableName] == null) 
       {
          // Get data from the DB, then put it into application state.
          var dataFromDb = LoadTableFromDatabase(tableName); // your data-access call
          System.Web.HttpContext.Current.Application[tableName] = dataFromDb;
          return dataFromDb;
       }
       return (SomeDataType)System.Web.HttpContext.Current.Application[tableName];
    }            
}

Upvotes: 0

David

Reputation: 218827

Without knowing more about the structure of the application, there are lots of possible options. One such option might be to abstract the data access behind a repository interface and handle in-memory caching within that repository. Something as simple as a private IEnumerable<T> on the repository object.

So, for example, say you have a User object which contains information about the user (name, permissions, etc.). You'd have a UserRepository with some basic fetch/save methods on it. Inside that repository, you could maintain a private static HashSet<User> which holds User objects which have already been retrieved from the database.

When you fetch a User from the repository, it first checks the HashSet for an object to return; if it doesn't find one, it gets it from the database, adds it to the HashSet, then returns it. When you save a User it updates both the HashSet and the database.

Again, without knowing the specifics of the codebase and overall design, it's hard to give a more specific answer. This should be a generic enough solution to work in any application, though.
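A minimal sketch of that repository (the User shape, the method names, and the `LoadFromDatabase`/`SaveToDatabase` calls are illustrative assumptions):

```csharp
using System.Collections.Generic;
using System.Linq;

public class User
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class UserRepository
{
    // Static, so the cache is shared across requests; guard it with a lock.
    private static readonly HashSet<User> _cache = new HashSet<User>();
    private static readonly object _sync = new object();

    public User Fetch(int id)
    {
        lock (_sync)
        {
            var user = _cache.FirstOrDefault(u => u.Id == id);
            if (user != null)
                return user;

            user = LoadFromDatabase(id); // placeholder for your data access
            if (user != null)
                _cache.Add(user);
            return user;
        }
    }

    public void Save(User user)
    {
        lock (_sync)
        {
            SaveToDatabase(user);        // placeholder for your data access
            // Keep the cache and the database in sync.
            _cache.RemoveWhere(u => u.Id == user.Id);
            _cache.Add(user);
        }
    }

    private User LoadFromDatabase(int id) { /* query here */ return null; }
    private void SaveToDatabase(User user) { /* insert/update here */ }
}
```

Callers only ever see the repository interface, so the caching stays an internal detail you can change later.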

Upvotes: 2

Igor Zelaya

Reputation: 4277

You can use the Cache object built into ASP.NET. Here is an article that explains how.

Upvotes: 0

sgnsajgon

Reputation: 704

You can take advantage of ASP.NET caching and the SqlCacheDependency class. There is an article on MSDN.
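A minimal sketch of that approach, assuming SQL cache notifications are already enabled for the database and table (the "PermissionsDb" config entry, the table name, and the loader call are illustrative assumptions; this requires a configured SQL Server, so it is shown without a runnable test):

```csharp
using System.Web;
using System.Web.Caching;

public static class PermissionStore
{
    public static object GetPermissions()
    {
        var cache = HttpContext.Current.Cache;
        var permissions = cache["Permissions"];
        if (permissions == null)
        {
            permissions = LoadPermissionsFromDatabase(); // placeholder for your data access

            // The entry is evicted automatically whenever the Permissions
            // table changes, so the next read reloads fresh data.
            var dependency = new SqlCacheDependency("PermissionsDb", "Permissions");
            cache.Insert("Permissions", permissions, dependency);
        }
        return permissions;
    }

    private static object LoadPermissionsFromDatabase() { /* query here */ return null; }
}
```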

Upvotes: 0

devuxer

Reputation: 42364

Is the problem that you're hitting the database on every page load, or that joining six tables is slow?

If it's just that the join is slow, why not create a database table that summarizes the data in a way that is much easier and faster to query?

This way, you just have to update your summary table each time you add a user or update a permission. If you group all of this into a single transaction, you shouldn't have issues with out-of-sync data.
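The single-transaction update can be sketched with TransactionScope (the table names, columns, and connection string are illustrative assumptions; this needs a real SQL Server to run, so no test is attached):

```csharp
using System.Data.SqlClient;
using System.Transactions;

public static class UserWriter
{
    public static void AddUser(string connStr, string userName, int roleId)
    {
        // Both writes commit or roll back together, so readers never
        // see the normalized tables and the summary table out of sync.
        using (var scope = new TransactionScope())
        using (var conn = new SqlConnection(connStr))
        {
            conn.Open();

            using (var cmd = new SqlCommand(
                "INSERT INTO Users (Name, RoleId) VALUES (@name, @roleId)", conn))
            {
                cmd.Parameters.AddWithValue("@name", userName);
                cmd.Parameters.AddWithValue("@roleId", roleId);
                cmd.ExecuteNonQuery();
            }

            // Refresh the denormalized summary row in the same transaction.
            using (var cmd = new SqlCommand(
                "INSERT INTO PermissionSummary (UserName, RoleId) VALUES (@name, @roleId)", conn))
            {
                cmd.Parameters.AddWithValue("@name", userName);
                cmd.Parameters.AddWithValue("@roleId", roleId);
                cmd.ExecuteNonQuery();
            }

            scope.Complete();
        }
    }
}
```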

Upvotes: 0
