Krishnaraj Barvathaya

Reputation: 595

IMemoryCache, refresh cache before eviction

I am trying to migrate my .NET Framework application to .NET Core, and as part of this I want to move my in-memory caching from System.Runtime.Caching/MemoryCache to Microsoft.Extensions.Caching.Memory/IMemoryCache. But I have one problem with IMemoryCache: I could not find a way to refresh the cache before it is removed/evicted.

In the case of System.Runtime.Caching/MemoryCache, there is an UpdateCallback property on CacheItemPolicy to which I can assign a callback delegate, and this delegate is invoked on a separate thread just before the cached object is evicted. Even if the callback takes a long time to fetch fresh data, MemoryCache continues to serve the old data beyond its expiry deadline, which ensures my code never has to wait for data while the cache is being refreshed.
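For reference, the pattern described above looks roughly like this (a sketch; LoadFromDatabase is a placeholder for the real data-source call):

```csharp
using System;
using System.Runtime.Caching;

// Sketch of the System.Runtime.Caching pattern described above.
CacheItemPolicy MakePolicy() => new CacheItemPolicy
{
    AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(5),
    UpdateCallback = args =>
    {
        // Runs on a separate thread just before eviction. Setting
        // UpdatedCacheItem replaces the entry instead of removing it,
        // so readers keep getting the old value in the meantime.
        args.UpdatedCacheItem = new CacheItem(args.Key, LoadFromDatabase(args.Key));
        args.UpdatedCacheItemPolicy = MakePolicy();
    }
};

MemoryCache.Default.Set("my-key", LoadFromDatabase("my-key"), MakePolicy());
```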

But I don't see such functionality in Microsoft.Extensions.Caching.Memory/IMemoryCache. There is a RegisterPostEvictionCallback extension method and a PostEvictionCallbacks property on MemoryCacheEntryOptions, but both of these fire after the cache entry has been evicted from the cache. So if the callback takes a long time, all requests for this data have to wait.
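For comparison, the post-eviction hook that is available looks like this (a sketch; `_cache` is assumed to be an injected IMemoryCache):

```csharp
using System;
using Microsoft.Extensions.Caching.Memory;

var options = new MemoryCacheEntryOptions()
    .SetAbsoluteExpiration(TimeSpan.FromMinutes(5))
    .RegisterPostEvictionCallback((key, value, reason, state) =>
    {
        // Fires only AFTER the entry has been removed, so until a
        // reload completes, readers of this key find nothing cached.
    });

_cache.Set("my-key", freshValue, options);
```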

Is there any solution?

Upvotes: 15

Views: 13268

Answers (6)

Jody Donetti

Reputation: 558

You may want to take a look at FusionCache ⚡🦥, a library I recently released.

Features to use

The first interesting thing is that it provides an optimization for concurrent factory calls, so that only one call per key is executed, relieving the load on your data source: all concurrent callers for the same cache key at the same time are blocked, and only one factory is executed.

Then you can specify timeouts for the factory, so that it does not take too much time: background factory completion is enabled by default, so even if the factory actually times out, it can keep running in the background and update the cache with the new value as soon as it finishes.

Then simply enable fail-safe to reuse the expired value in case of timeouts, or really any problem (the database is down, there are temporary network errors, etc.).

A practical example

You can cache something for, say, 2 minutes, after which a factory is called to refresh the data. In case of problems (exceptions, timeouts, etc.), the expired value is used again until the factory is able to complete in the background, at which point it updates the cache right away.
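A sketch of what that looks like in code (GetProductFromDb and Product are placeholders for your own data access; check the FusionCache docs for the exact option names and defaults):

```csharp
using System;
using ZiggyCreatures.Caching.Fusion;

var cache = new FusionCache(new FusionCacheOptions());

var product = cache.GetOrSet<Product>(
    "product:123",
    _ => GetProductFromDb(123), // called to refresh after expiration
    options => options
        .SetDuration(TimeSpan.FromMinutes(2))        // normal expiration
        .SetFailSafe(true)                           // reuse the expired value on errors
        .SetFactoryTimeouts(TimeSpan.FromSeconds(1)) // stop waiting, complete in background
);
```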

One more thing

Another interesting feature is support for an optional distributed 2nd-level cache, automatically managed and kept in sync with the local one without you having to do anything.

If you give it a chance, please let me know what you think.

/shameless-plug

Upvotes: 1

Chris Moschini

Reputation: 37947

That's because there is no eviction, and, I would argue, that makes IMemoryCache not a cache:

"The ASP.NET Core runtime doesn't trim the cache when system memory is low."

https://learn.microsoft.com/en-us/aspnet/core/performance/caching/memory?view=aspnetcore-5.0#use-setsize-size-and-sizelimit-to-limit-cache-size

"If SizeLimit isn't set, the cache grows without bound." "The cache size limit does not have a defined unit of measure because the cache has no mechanism to measure the size of entries." "An entry will not be cached if the sum of the cached entry sizes exceeds the value specified by SizeLimit."

So not only does IMemoryCache fail to do the most basic thing you'd expect from a cache - responding to memory pressure by evicting the oldest entries - you also don't get the insert logic you'd expect. Adding a fresh item to a full "cache" doesn't evict an older entry; it refuses to insert the new item.
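A small sketch of that insert behavior, assuming the SizeLimit semantics the documentation quoted above describes:

```csharp
using Microsoft.Extensions.Caching.Memory;

// SizeLimit has no unit of measure; every entry must declare its own Size.
var cache = new MemoryCache(new MemoryCacheOptions { SizeLimit = 2 });

cache.Set("a", 1, new MemoryCacheEntryOptions { Size = 1 });
cache.Set("b", 2, new MemoryCacheEntryOptions { Size = 1 });

// The "cache" is now full: this entry is NOT inserted. A background
// compaction may eventually make room, but nothing is evicted to
// admit the new item right away.
cache.Set("c", 3, new MemoryCacheEntryOptions { Size = 1 });
```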

I argue this is just an unfortunate Dictionary, and not a cache at all. The cake/class is a lie.

To get this to actually work like a cache, you'd need to write a wrapper class that does measure memory size, plus system code that interacts with the wrapper and periodically evicts entries (via .Remove()) in response to memory pressure and expiration. You know - most of the work of implementing a cache.

So the reason you couldn't find a way to update before eviction is that, by default, there isn't any eviction - and if you've implemented your own eviction scheme, you've already written so much of an actual cache, what's a bit more?

Upvotes: 10

vernou

Reputation: 7590

I had the same need, so I wrote this class:

// Requires: using System; using System.Collections.Concurrent;
public abstract class AutoRefreshCache<TKey, TValue>
{
    private readonly ConcurrentDictionary<TKey, TValue> _entries = new ConcurrentDictionary<TKey, TValue>();
    private readonly System.Timers.Timer _timer; // kept in a field so the timer is not garbage collected

    protected AutoRefreshCache(TimeSpan interval)
    {
        _timer = new System.Timers.Timer();
        _timer.Interval = interval.TotalMilliseconds;
        _timer.AutoReset = true;
        _timer.Elapsed += (o, e) =>
        {
            _timer.Stop(); // pause the timer so refreshes never overlap
            RefreshAll();
            _timer.Start();
        };
        _timer.Start();
    }

    public TValue Get(TKey key)
    {
        return _entries.GetOrAdd(key, k => Load(k));
    }

    public void RefreshAll()
    {
        var keys = _entries.Keys;
        foreach(var key in keys)
        {
            _entries.AddOrUpdate(key, k => Load(k), (k, v) => Load(k));
        }
    }

    protected abstract TValue Load(TKey key);
}

Values aren't evicted, just refreshed. Only the first Get waits for the value to load. During a refresh, Get returns the previous value without waiting.

Example of use:

class Program
{
    static void Main(string[] args)
    {
        var cache = new MyCache();
        while (true)
        {
            System.Threading.Thread.Sleep(TimeSpan.FromSeconds(1));
            Console.WriteLine(cache.Get("Key1") ?? "<null>");
        }
    }
}

public class MyCache : AutoRefreshCache<string, string>
{
    public MyCache() 
        : base(TimeSpan.FromSeconds(5))
    { }

    readonly Random random = new Random();
    protected override string Load(string key)
    {
        Console.WriteLine($"Load {key} begin");
        System.Threading.Thread.Sleep(TimeSpan.FromSeconds(3));
        Console.WriteLine($"Load {key} end");
        return "Value " + random.Next();
    }
}

Result:

Load Key1 begin
Load Key1 end
Value 1648258406
Load Key1 begin
Value 1648258406
Value 1648258406
Value 1648258406
Load Key1 end
Value 1970225921
Value 1970225921
Value 1970225921
Value 1970225921
Value 1970225921
Load Key1 begin
Value 1970225921
Value 1970225921
Value 1970225921
Load Key1 end
Value 363174357
Value 363174357

Upvotes: 1

Farhad Rahmanifard

Reputation: 688

I suggest using the NeverRemove priority for cache items and handling cache size and updates yourself with methods like MemoryCache.Compact, if that does not change your current design significantly. You may find the page "Cache in-memory in ASP.NET Core" useful; see in particular the "MemoryCache.Compact" section and the second item under "Additional notes".
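A sketch of that approach (LoadConfig is a placeholder), using the concrete MemoryCache class since Compact is not on the IMemoryCache interface:

```csharp
using Microsoft.Extensions.Caching.Memory;

var cache = new MemoryCache(new MemoryCacheOptions());

cache.Set("settings", LoadConfig(), new MemoryCacheEntryOptions
{
    Priority = CacheItemPriority.NeverRemove // skipped by Compact
});
cache.Set("transient", "...", new MemoryCacheEntryOptions
{
    Priority = CacheItemPriority.Low
});

// Remove ~50% of entries, lowest priority first; NeverRemove entries
// survive, so you control their lifetime and refresh yourself.
cache.Compact(0.5);
```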

Upvotes: 0

Kahbazi

Reputation: 14995

You can do a trick here: in RegisterPostEvictionCallback, add the old value back into the cache before looking up the new one. This way, if the lookup takes a long time, the old value is still available in the cache.
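A sketch of that trick (RefreshAsync is a placeholder for the slow lookup; the lifetime of the re-added stale entry is up to you):

```csharp
using System;
using System.Threading.Tasks;
using Microsoft.Extensions.Caching.Memory;

void SetWithStaleFallback(IMemoryCache cache, string key, object value)
{
    var options = new MemoryCacheEntryOptions()
        .SetAbsoluteExpiration(TimeSpan.FromMinutes(5))
        .RegisterPostEvictionCallback((k, v, reason, state) =>
        {
            if (reason != EvictionReason.Expired) return;
            // Put the stale value back briefly so readers don't block,
            // then refresh in the background and overwrite it.
            cache.Set(k, v, TimeSpan.FromSeconds(30));
            _ = Task.Run(async () =>
                SetWithStaleFallback(cache, (string)k, await RefreshAsync((string)k)));
        });
    cache.Set(key, value, options);
}
```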

Upvotes: 1

Claudiu Guiman

Reputation: 867

It looks like you need to set your own change token for each cache entry by calling AddExpirationToken. Then, in your implementation of IChangeToken.HasChanged, you can have a simple timeout-based expiration, and right before it triggers you can asynchronously fetch new data to add to the cache.
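A minimal sketch of such a token; the "fetch new data shortly before expiry" logic is left out and would hook in where HasChanged flips to true:

```csharp
using System;
using Microsoft.Extensions.Caching.Memory;
using Microsoft.Extensions.Primitives;

// Minimal time-based change token; a real implementation would kick
// off the background refresh shortly before HasChanged becomes true.
class TimeoutChangeToken : IChangeToken
{
    private readonly DateTimeOffset _expiresAt;
    public TimeoutChangeToken(TimeSpan ttl) => _expiresAt = DateTimeOffset.UtcNow + ttl;

    public bool HasChanged => DateTimeOffset.UtcNow >= _expiresAt;
    public bool ActiveChangeCallbacks => false; // polled by the cache, not pushed
    public IDisposable RegisterChangeCallback(Action<object> callback, object state)
        => EmptyDisposable.Instance;

    private sealed class EmptyDisposable : IDisposable
    {
        public static readonly EmptyDisposable Instance = new EmptyDisposable();
        public void Dispose() { }
    }
}

// Usage:
// var options = new MemoryCacheEntryOptions()
//     .AddExpirationToken(new TimeoutChangeToken(TimeSpan.FromMinutes(2)));
```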

Upvotes: 1
