Reputation: 33006
I have my Blazor Server app set up so that I cache the AppUser object for the logged-in user. It's a pretty heavy read from the DB with 11 queries (using SplitQueries), but once I have it, it gives me 90% of what I need for every page. I figure that's a decent trade-off.
The problem is that, for most pages, I have several components that need this object. So all of them, in their OnInitializedAsync(), call the service that caches this object. Four calls come in for the object, all four find it is not in the cache and go read it from the DB, and then all four place it in the cache. A half-second delay to show the first page becomes a 2-second delay.
How can I get around this? The best I've come up with is a dictionary of pending queries plus a mutex or event. If a request for an AppUser finds an entry already in the pending dictionary, the request blocks on the mutex/event and, when it is released, starts over: look in the cache, then read from the DB if required (it shouldn't be).
Is there another approach to this? And if the pending-dictionary/mutex is the best way to do this, is there anything I have to watch out for?
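For illustration, here is a minimal sketch of the pending-dictionary idea described above. It uses a ConcurrentDictionary of Lazy&lt;Task&gt; so that concurrent callers share one in-flight load rather than blocking on an explicit mutex; AppUser, LoadAppUserFromDbAsync, and the cache hooks are placeholders, not code from the actual app:

    using System;
    using System.Collections.Concurrent;
    using System.Threading.Tasks;

    // Placeholder for the real user object.
    public class AppUser { }

    public class AppUserCache
    {
        // One entry per user that currently has a DB load in flight.
        private readonly ConcurrentDictionary<string, Lazy<Task<AppUser>>> _pending = new();

        public Task<AppUser> GetAppUserAsync(string userId)
        {
            // TODO: check the real cache first and return the cached object if present.

            // GetOrAdd plus Lazy guarantees the factory runs at most once per key,
            // so four concurrent components share a single DB read.
            var pending = _pending.GetOrAdd(
                userId,
                id => new Lazy<Task<AppUser>>(() => LoadAndCacheAsync(id)));

            return pending.Value;
        }

        private async Task<AppUser> LoadAndCacheAsync(string userId)
        {
            try
            {
                var user = await LoadAppUserFromDbAsync(userId);
                // TODO: put `user` into the real cache here.
                return user;
            }
            finally
            {
                // Drop the pending entry so a later cache miss triggers a fresh load
                // (and so a faulted load is not kept around forever).
                _pending.TryRemove(userId, out _);
            }
        }

        private async Task<AppUser> LoadAppUserFromDbAsync(string userId)
        {
            // Placeholder for the heavy 11-query EF Core read.
            await Task.Delay(500);
            return new AppUser();
        }
    }

The main things to watch out for with this pattern are faulted loads (don't leave a failed Task cached) and making sure completed results end up in the real cache so the pending dictionary stays small.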
Upvotes: 1
Views: 39
Reputation: 33006
I discovered a very easy solution to this problem: LazyCache. It's designed to solve this specific issue. (Not an endorsement! Just pointing out that this solves the issue.)
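For context, a rough sketch of what that looks like, assuming the IAppCache instance registered by LazyCache and a hypothetical AppUser type and loader. GetOrAddAsync locks per key, so concurrent callers await one shared factory invocation instead of each hitting the DB:

    using System.Threading.Tasks;
    using LazyCache;

    // Placeholder for the real user object.
    public class AppUser { }

    public class AppUserService
    {
        private readonly IAppCache _cache;

        // IAppCache is typically registered via services.AddLazyCache().
        public AppUserService(IAppCache cache) => _cache = cache;

        public Task<AppUser> GetAppUserAsync(string userId)
        {
            // On a miss, the first caller runs the factory; concurrent callers for the
            // same key await that same in-flight Task instead of re-querying the DB.
            return _cache.GetOrAddAsync($"AppUser:{userId}", () => LoadAppUserFromDbAsync(userId));
        }

        private async Task<AppUser> LoadAppUserFromDbAsync(string userId)
        {
            // Placeholder for the heavy 11-query read.
            await Task.Delay(500);
            return new AppUser();
        }
    }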
Upvotes: 0
Reputation: 30330
You should be able to do something like this with Scoped services:
public class MyService
{
    private Task? _loadingTask;

    // Can kick off the load in the ctor if you wish
    public MyService()
    {
        _loadingTask = this.Initialize();
    }

    public Task LoadData()
    {
        // If _loadingTask is null then we need to start the loading process
        if (_loadingTask == null)
        {
            _loadingTask = this.Initialize();
            return _loadingTask;
        }

        if (_loadingTask.IsFaulted)
        {
            // Add code to deal with exceptions
        }

        return _loadingTask;
    }

    public async Task GetData()
    {
        await LoadData();
        // return whatever data you want;
    }

    private async Task Initialize()
    {
        // Fake an async operation
        await Task.Delay(1000);
    }
}
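For completeness, one way this might be wired up, assuming standard Blazor Server DI; MyPage is a hypothetical component. Registering the service as Scoped gives one instance per circuit, so every component on the page shares the same _loadingTask:

    // Program.cs: Scoped = one instance per Blazor Server circuit (per connected user),
    // so all components rendered for that user share the same MyService instance.
    builder.Services.AddScoped<MyService>();

Then each component awaits the shared load in OnInitializedAsync (shown here as a code-behind class):

    using System.Threading.Tasks;
    using Microsoft.AspNetCore.Components;

    public partial class MyPage : ComponentBase
    {
        [Inject] private MyService Service { get; set; } = default!;

        protected override async Task OnInitializedAsync()
        {
            // Every component can call this; only the first call starts Initialize(),
            // the rest just await the already-running (or completed) Task.
            await Service.LoadData();
        }
    }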
Upvotes: 0