Reputation: 8716
I have a classified advertisements website à la craigslist.org - just on a much smaller scale.
I'm running MongoDB and am caching all API requests in Redis, where the Mongo query is the key and the value is the MongoDB result document.
Pseudo code:
// The Mongo query
var query = {section: 'home', category: 'garden', region: 'APAC', country: 'au', city: 'sydney', limit: 100};
// Getting the Mongo result...
// Storing in Redis; the value must be stringified too, since Redis stores strings, not objects
redisClient.set(JSON.stringify(query), JSON.stringify(result));
Now a user creates a new post in the same category, but Redis serves up a stale result because it has no idea the dataset has changed.
How can we overcome this?
I could set an expiry in general, or on that particular key, but essentially the cached keys need to expire the moment a user creates a new post, at least for those keys whose result set would include the newly created record.
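For reference, setting a per-key TTL would look something like this with node_redis (the 60-second window is an arbitrary number, not something I've settled on):

redisClient.set(JSON.stringify(query), JSON.stringify(result), 'EX', 60);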
One way is to iterate through all the Redis keys and come up with a pattern to detect which keys should be deleted, based on the characteristics of the newly created record. Only this approach seems "too clever" and not quite right.
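To make the idea concrete, here is a minimal sketch of that pattern-delete approach, assuming node_redis and the JSON.stringify'd keys from above (the function name and the MATCH pattern are made up for illustration):

// On every new post, scan for cached query keys that mention the
// post's category and delete them. SCAN avoids blocking Redis the
// way KEYS would on a large keyspace.
function invalidateCategory(category, callback) {
  var pattern = '*"category":"' + category + '"*';
  function scan(cursor) {
    redisClient.scan(cursor, 'MATCH', pattern, 'COUNT', 100, function (err, reply) {
      if (err) return callback(err);
      var nextCursor = reply[0], keys = reply[1];
      if (keys.length > 0) redisClient.del(keys); // fire and forget
      if (nextCursor === '0') return callback(null);
      scan(nextCursor);
    });
  }
  scan('0');
}

It works, but it couples the invalidation logic to the exact key format, which is part of why it feels fragile.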
So we want to cache in memory while still serving up fresh content instantly.
Upvotes: 3
Views: 261
Reputation: 19794
I would avoid caching bulk query results as a single key. Redis is for use cases where you need to access and update data at very high frequency and where you benefit from data structures such as hashes, sets, lists, strings, or sorted sets [1]. Also, keep in mind that MongoDB will already have part of the database cached in memory, so you might not see much in the way of performance gains.
A better approach would be to cache each post individually. You can add the keys to sets to group them into categories, or even just into pages (like the 20 or so posts a user expects to see on each page). That way, every time a user creates a new post or updates an existing one, you can update the corresponding keys in your Redis cache as well; see the sketch below.
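Here is a rough sketch of that layout with node_redis, to match your pseudo code (key names like post:<id> and category:<name> are just illustrative choices):

// Each post lives in its own hash; a sorted set per category holds the
// post IDs, scored by creation time so the newest posts come first.
function cachePost(post) {
  redisClient.hmset('post:' + post.id, {
    title: post.title,
    body: post.body,
    city: post.city
  });
  redisClient.zadd('category:' + post.category, post.createdAt, post.id);
}

// Reading the newest 20 posts in a category is one sorted-set range
// followed by a hash read per ID, batched into a single MULTI.
function getCategoryPage(category, callback) {
  redisClient.zrevrange('category:' + category, 0, 19, function (err, ids) {
    if (err) return callback(err);
    var multi = redisClient.multi();
    ids.forEach(function (id) { multi.hgetall('post:' + id); });
    multi.exec(callback);
  });
}

With this shape, creating a new post is a single hmset plus zadd rather than a cache flush, so readers see fresh content immediately.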
Upvotes: 3