Reputation: 3910
I have an application that, when unoptimized, will require many writes to the PostgreSQL database in response to real-time information: as many as one per second!
Therefore, I'd like to cache this stream of data, either in Redis (perhaps via redisco) or in memcached, and then do a single bulk_create against my PostgreSQL database every ~5 min.
As I understand it, Django's memcached backend stores data in memory, and the cache can be invalidated when a write is needed.
Alternatively, I was considering putting the information in Redis, perhaps using redisco models, and doing a bulk_create to the database every ~5 min.
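Roughly, the Redis-buffered version I have in mind would look something like this. This is a minimal sketch in plain redis-py rather than redisco, with a hypothetical `Reading` model and buffer key standing in for my real ones:

```python
import json

import redis
from django.db import transaction

from myapp.models import Reading  # hypothetical model with sensor_id and value fields

r = redis.Redis()
BUFFER_KEY = "reading_buffer"  # assumed Redis list key holding the pending writes


def enqueue_reading(sensor_id, value):
    """Push one incoming data point onto a Redis list instead of writing to PostgreSQL."""
    r.rpush(BUFFER_KEY, json.dumps({"sensor_id": sensor_id, "value": value}))


def flush_buffer():
    """Drain the buffer and persist everything with a single bulk_create.

    Intended to run every ~5 min, e.g. from a cron job or a Celery beat task.
    """
    # Read and clear the list in one pipeline (MULTI/EXEC by default in redis-py),
    # so no items slip in between the read and the delete.
    pipe = r.pipeline()
    pipe.lrange(BUFFER_KEY, 0, -1)
    pipe.delete(BUFFER_KEY)
    items, _ = pipe.execute()

    rows = [Reading(**json.loads(raw)) for raw in items]
    if rows:
        with transaction.atomic():
            Reading.objects.bulk_create(rows)
```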
3-part question:
Thanks!
Upvotes: 1
Views: 1361
Reputation: 26464
Premature optimization is the root of all evil. PostgreSQL is quite capable of handling heavy mixed read/write workloads, so start there and explore other options only as the need arises. On a high-end server you will be able to get up to about 14,000 writes per second (depending on query specifics) on PostgreSQL 9.2 when it comes out; with 9.1 you max out at about 3,000 writes per second, and the difference comes down to locking behavior.
Don't optimize yet. If you start to get into the hundreds of writes per second, then maybe it would be worth it. Especially if these are simple writes, though, you are better off keeping your architecture simple.
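For scale: the simple architecture here is just a direct ORM write per event, which at ~1 write per second PostgreSQL will not even notice. A minimal sketch, reusing the same hypothetical `Reading` model as above:

```python
from myapp.models import Reading  # hypothetical model


def handle_data_point(sensor_id, value):
    # One plain INSERT per event; at ~1/sec this is orders of magnitude
    # below what a single PostgreSQL instance can sustain.
    Reading.objects.create(sensor_id=sensor_id, value=value)
```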
Upvotes: 3