Reputation: 1088
I am working on a Node.js API application that stores and retrieves data using a MongoDB database. For faster execution, I am using Redis to cache data, specifically a Redis hash to store and retrieve it.
However, I observe that under concurrency it does not work correctly and creates duplicate data in MongoDB. As concurrency increases, multiple requests arrive at the same time, and because of that the Redis caching does not behave properly.
So how do I deal with such a case?
Upvotes: 1
Views: 3322
Reputation: 84
Redis is a single-threaded database server. If you send multiple concurrent requests, Redis processes them in the order they are received on Redis' end. Therefore, you need to ensure the ordering of requests on the application side.
If you also want to maintain atomicity for a batch of commands, read up on Redis transactions and use a MULTI/EXEC block. After MULTI, subsequent commands are queued in order and executed as one atomic batch when EXEC is received.
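One application-side approach (not the only one, and a different technique from strictly ordering requests) is to use an atomic `SET ... NX` key as a guard so only one of several concurrent requests performs the MongoDB insert. A minimal sketch, assuming the node-redis v4 client and the official `mongodb` driver; the key names, TTL, and `createUserOnce` function are placeholders, not from the question:

```js
const { createClient } = require('redis');
const { MongoClient } = require('mongodb');

const redis = createClient();
const mongo = new MongoClient('mongodb://localhost:27017');

async function createUserOnce(userId, userData) {
  // SET ... NX is atomic: only the first concurrent request gets 'OK',
  // every other request gets null and skips the MongoDB insert.
  const acquired = await redis.set(`user:create:${userId}`, '1', {
    NX: true,
    EX: 30, // guard key expires after 30 seconds
  });
  if (!acquired) return; // another request is already handling this user

  await mongo.db('app').collection('users')
    .insertOne({ _id: userId, ...userData });

  // refresh the hash cache the question mentions
  await redis.hSet(`user:${userId}`, userData);
}

async function main() {
  await redis.connect();
  await mongo.connect();
  // fire concurrent requests; only one insert reaches MongoDB
  await Promise.all([
    createUserOnce('42', { name: 'Alice' }),
    createUserOnce('42', { name: 'Alice' }),
  ]);
  await redis.quit();
  await mongo.close();
}

main().catch(console.error);
```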
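A minimal sketch of a MULTI/EXEC block, again assuming the node-redis v4 client; the key and field names are placeholders:

```js
const { createClient } = require('redis');

async function updateCacheAtomically(userId, userData) {
  const client = createClient();
  await client.connect();

  // Commands after multi() are queued in order and executed as one
  // atomic block when exec() runs, so no other client's commands
  // are interleaved between them.
  const results = await client
    .multi()
    .hSet(`user:${userId}`, userData)
    .expire(`user:${userId}`, 3600)
    .exec();

  console.log(results); // one reply per queued command
  await client.quit();
}

updateCacheAtomically('42', { name: 'Alice', email: 'alice@example.com' })
  .catch(console.error);
```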
Upvotes: 1