Navin

Reputation: 744

Handle List in Redis

A large collection of master data (50K records in a list) needs to be cached in Redis, but the application will only ever retrieve single records from it. Does it make sense to serialize the 50K records as one list, store the string in Redis, and on every read get the serialized string back, deserialize it into a List object, and query it with LINQ to return one record? e.g. store:

IDistributedCache _cache

// data is a List&lt;object&gt; of size 50K
var jsonString = JsonConvert.SerializeObject(data);
await _cache.SetStringAsync(key, jsonString, options);

Retrieve:

var cachedItem = await _cache.GetStringAsync(key);
var result = JsonConvert.DeserializeObject<List<T>>(cachedItem);
...
return result.Where(x => x.Id == id);

My concern is that deserializing such a large string is not efficient. If I instead store the records one by one in Redis, but there is also a requirement to retrieve the whole set, then I am worried about data duplication from storing the same data twice. That's why I prefer to store the 50K records as a whole.

Any advice?

Upvotes: 1

Views: 1509

Answers (1)

Marc Gravell

Reputation: 1063944

The usage:

var result = JsonConvert.DeserializeObject<List<T>>(cachedItem);
...
return result.Where(x => x.Id == id);

strongly suggests that something very inefficient is happening here. If you want to be able to fetch by a key, the more obvious model is a Redis hash, using the Id as the field name and the serialized payload as each field's value (HashSet[Async]). Then you can fetch, add, update or remove individual elements; you can also still get everything if needed (HashGetAll[Async]/HashScan[Async]).
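A minimal sketch of the hash approach, assuming the StackExchange.Redis client; the `Item` type, the `"master-data"` key name, and the use of System.Text.Json are all illustrative, not from the question:

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Text.Json;
using System.Threading.Tasks;
using StackExchange.Redis;

public record Item(int Id, string Name);

public class MasterDataCache
{
    private readonly IDatabase _db;
    private const string HashKey = "master-data"; // hypothetical key name

    public MasterDataCache(IConnectionMultiplexer mux) => _db = mux.GetDatabase();

    // Store each record as one hash field: field = Id, value = serialized record.
    public Task StoreAllAsync(IEnumerable<Item> items) =>
        _db.HashSetAsync(HashKey, items
            .Select(i => new HashEntry(i.Id, JsonSerializer.Serialize(i)))
            .ToArray());

    // Fetch one record by Id without deserializing the whole collection.
    public async Task<Item?> GetAsync(int id)
    {
        var json = await _db.HashGetAsync(HashKey, id);
        return json.IsNullOrEmpty ? null : JsonSerializer.Deserialize<Item>((string)json!);
    }

    // Still possible to get everything back in one round trip when needed.
    public async Task<List<Item>> GetAllAsync()
    {
        var entries = await _db.HashGetAllAsync(HashKey);
        return entries.Select(e => JsonSerializer.Deserialize<Item>((string)e.Value!)!).ToList();
    }
}
```

This keeps a single copy of the data, so the duplication the question worries about never arises: single-record reads and whole-set reads both come from the same hash.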

You could also store each item individually (StringSet[Async]) using the id as part of the key, but if you do that, there is no good mechanism to get the entire list back in one operation.
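For completeness, the per-key alternative might look like the following sketch (again assuming StackExchange.Redis; the `master:{id}` key pattern is illustrative):

```csharp
// One key per record; fetching "all" would need SCAN or a separate index key.
await db.StringSetAsync($"master:{item.Id}",
    JsonSerializer.Serialize(item),
    expiry: TimeSpan.FromHours(1)); // per-item TTL is possible here, unlike hash fields

var json = await db.StringGetAsync($"master:{id}");
```

One trade-off worth noting: plain keys support per-item expiry, whereas a hash expires only as a whole, so the choice can hinge on TTL requirements as much as on access patterns.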

Upvotes: 1
