Agendum

Reputation: 2011

Use IMemoryCache singleton-per-repository?

I am integrating IMemoryCache into a project, and I am going out of my way to make sure the keys in each repository or provider do not conflict (i.e. coming up with a key-space). But then it occurred to me: why not just give each repository/provider its own singleton instance of IMemoryCache, i.e. a named singleton? This would guarantee that keys could never clash, and that one repository could never access the internal state of another. The primitive side of me also says this may improve performance, because each repository would no longer be competing for the internal locks of IMemoryCache.
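For reference, the key-space approach I'm currently using looks roughly like this (a minimal sketch; the `users:`/`orders:` prefixes are illustrative, not my real names):

```csharp
using System;
using Microsoft.Extensions.Caching.Memory;

// One shared cache; each repository namespaces its keys with a prefix.
var shared = new MemoryCache(new MemoryCacheOptions());

shared.Set("users:42", "Alice");
shared.Set("orders:42", "Order #42");

// The prefixes keep the two "42" entries from colliding:
Console.WriteLine(shared.Get<string>("users:42"));   // Alice
Console.WriteLine(shared.Get<string>("orders:42"));  // Order #42
```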

But I am not going to pretend to understand how IMemoryCache manages all of its logic. Maybe it is important to have an application-wide singleton instance so that it can manage cache entry lifetimes in a more performant way.

Basically, I have never seen anybody use the singleton-per-repository pattern with IMemoryCache, so I am soliciting feedback on this approach.
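To make the question concrete, here is a sketch of what I mean, assuming ASP.NET Core DI (the repository names are hypothetical):

```csharp
using System;
using Microsoft.Extensions.Caching.Memory;
using Microsoft.Extensions.DependencyInjection;

var services = new ServiceCollection();

// Factory lambdas hand each repository singleton its own private MemoryCache,
// so keys cannot clash and neither repository can see the other's entries.
services.AddSingleton(_ => new UserRepository(new MemoryCache(new MemoryCacheOptions())));
services.AddSingleton(_ => new OrderRepository(new MemoryCache(new MemoryCacheOptions())));

var provider = services.BuildServiceProvider();
var users = provider.GetRequiredService<UserRepository>();
var orders = provider.GetRequiredService<OrderRepository>();

users.Cache.Set("42", "a user");
// The same key in the other repository's cache stays empty:
Console.WriteLine(orders.Cache.Get<string>("42") == null);  // True

public class UserRepository
{
    public IMemoryCache Cache { get; }
    public UserRepository(IMemoryCache cache) => Cache = cache;
}

public class OrderRepository
{
    public IMemoryCache Cache { get; }
    public OrderRepository(IMemoryCache cache) => Cache = cache;
}
```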

Thanks

Upvotes: 2

Views: 2210

Answers (2)

Power Mouse

Reputation: 1441

This is a LINQPad example I created. I think it's what you're looking for: it uses double-checked locking around a single shared MemoryCache so that only one caller rebuilds an expired entry.

// Requires a reference to System.Runtime.Caching plus the System.Data,
// System.IO and System.Linq namespaces.
void Main()
{
    DateTime start = DateTime.Now;

    List<int> result = Enumerable.Range(1, 6000).ToList();
    result.AsParallel().ForAll(c =>
    {
        Util.GetFromCache("datatable", c);
        System.Threading.Thread.Sleep(1);
    });

    DateTime.Now.Subtract(start).Seconds.Dump();
    "....DONE.....".Dump();
}

public static class Util
{
    private static readonly object _Lock = new object();

    public static object GetFromCache(string cachename, int i)
    {
        object obj = MemoryCacher.GetValue(cachename);
//      if (i == 5) // when the token's time is up: reset it, update the DB and re-cache (as an example, trigger at count == 5)
//      {
//          obj = null;
//          MemoryCacher.Delete(cachename);
//      }
        if (obj == null)
        {
            lock (_Lock)
            {
                // Re-check inside the lock: another thread may have repopulated the entry.
                obj = MemoryCacher.GetValue(cachename);
                if (obj == null)
                {
                    $"{i} from DATA".Dump();
                    obj = GetData();
                    // Set overwrites an existing entry, so no Delete is needed first.
                    MemoryCacher.Add(cachename, obj, DateTimeOffset.Now.AddSeconds(5));
                    return obj;
                }
                $"{i} from CACHE with lock".Dump();
                return obj;
            }
        }
        $"{i} from CACHE".Dump();
        return obj;
    }

    public static DataTable GetData()
    {
        DataTable dt = new DataTable();

        FileInfo fi = new FileInfo("c:\\1\\text.txt");
        dt = CSVtoDS(fi.FullName, true).AsEnumerable().Take(10).CopyToDataTable();
        return dt.Dump("call");
    }

    public static DataTable CSVtoDS(string filePath, bool isFirstLineHeader)
    {
        DataTable dt = new DataTable();
        using (TextReader tr = File.OpenText(filePath))
        {
            string strRow = string.Empty;
            string[] arrColumns = null;
            int indx = 0;
            while ((strRow = tr.ReadLine()) != null)
            {
                //set up columns
                if (indx == 0)
                {
                    arrColumns = strRow.Split('\t')[0].Split(',').Select(x => x.Replace(" ", "_")).ToArray();
                    for (int i = 0; i < arrColumns.Length; i++)
                    {
                        if (isFirstLineHeader)
                            dt.Columns.Add(new DataColumn(arrColumns[i]));
                        else
                            dt.Columns.Add(new DataColumn());
                    }
                    indx = 1;
                }
                else
                {
                    DataRow dr = dt.NewRow();
                    dr.ItemArray = strRow.Split(',');
                    dt.Rows.Add(dr);
                }
            }
            tr.Close();
        }
        return dt;
    }

    public static class MemoryCacher
    {
        public static object GetValue(string key)
        {
            MemoryCache memoryCache = MemoryCache.Default;
            return memoryCache.Get(key);
        }

        public static void Add(string key, object value, DateTimeOffset absExpiration)
        {
            MemoryCache memoryCache = MemoryCache.Default;
            memoryCache.Set(key, value, absExpiration);
        }

        public static void Delete(string key)
        {
            MemoryCache memoryCache = MemoryCache.Default;
            if (memoryCache.Contains(key))
            {
                memoryCache.Remove(key);
            }
        }
    }
}

Upvotes: 0

spender

Reputation: 120496

I believe that the original .NET Full Framework MemoryCache was intended to be used as a singleton instance, via the MemoryCache.Default property. This was because it operated on something called "memory pressure". There was exactly zero documentation about how this magical memory pressure was actually calculated, and it probably didn't work too cleverly across multiple instances.

It seems that the preference now is to set a size limit on the cache. The MemoryCache.Default static property no longer exists in the .NET Core version, and no warnings seem to be present indicating that multiple instances are an antipattern. Furthermore, MemoryCacheOptions.CompactOnMemoryPressure is now deprecated, and the preference is to supply a fixed size. I can't see any problems with using multiple instances.
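A minimal sketch of what that looks like with the Microsoft.Extensions.Caching.Memory package (the cache names and sizes are illustrative):

```csharp
using System;
using Microsoft.Extensions.Caching.Memory;

// Two independent caches, each bounded by its own SizeLimit.
var userCache = new MemoryCache(new MemoryCacheOptions { SizeLimit = 1024 });
var orderCache = new MemoryCache(new MemoryCacheOptions { SizeLimit = 1024 });

// When SizeLimit is set, every entry must declare a Size,
// otherwise Set throws an InvalidOperationException.
userCache.Set("42", "Alice", new MemoryCacheEntryOptions { Size = 1 });
orderCache.Set("42", "Order #42", new MemoryCacheEntryOptions { Size = 1 });

// Identical keys in different instances never collide:
Console.WriteLine(userCache.Get<string>("42"));   // Alice
Console.WriteLine(orderCache.Get<string>("42"));  // Order #42
```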

Upvotes: 2
