Reputation: 183
I'm kind of a newbie with Redis, so I apologize if this is a stupid question. I'm using Java to save data to Redis. When I save small data, everything works well. But when I save and get a Map with more than twenty thousand key-value pairs (I use spring-data-redis and save the map with `redisTemplate.opsForHash().putAll()`), it takes almost 2 seconds to save or get it.
The map uses about 20 MB on Redis, and I use a cursor to get it. Is this data too large, or is it a network problem?
I need to get the whole map. What should I do?
Upvotes: 0
Views: 1599
Reputation: 6792
If you are using opsForHash, the proper pattern is to use .scan()
to pull down large amounts of data in chunks. As far as I know, no such API exists for put, so you may need to add objects to your Redis keyspace iteratively to avoid creating congestion there.
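A minimal sketch of the batching idea: the pure-Java helper below splits a large map into fixed-size chunks so each round trip to Redis stays small. The batch size of 1000, the key name "myHash", and the helper name `partition` are assumptions for illustration; the Spring Data Redis calls (`putAll` per batch, `scan` with `ScanOptions` for reads) are shown only as comments.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class HashBatcher {

    // Split a large map into fixed-size batches so that each Redis round
    // trip stays small. With Spring Data Redis you would then write
    // (sketch, not runnable here without a RedisTemplate):
    //   for (Map<String, String> batch : HashBatcher.partition(bigMap, 1000)) {
    //       redisTemplate.opsForHash().putAll("myHash", batch);
    //   }
    // and read back incrementally with:
    //   Cursor<Map.Entry<Object, Object>> cursor =
    //       redisTemplate.opsForHash().scan("myHash",
    //           ScanOptions.scanOptions().count(1000).build());
    public static List<Map<String, String>> partition(Map<String, String> source,
                                                      int batchSize) {
        List<Map<String, String>> batches = new ArrayList<>();
        Map<String, String> current = new LinkedHashMap<>();
        for (Map.Entry<String, String> e : source.entrySet()) {
            current.put(e.getKey(), e.getValue());
            if (current.size() == batchSize) {
                batches.add(current);
                current = new LinkedHashMap<>();
            }
        }
        if (!current.isEmpty()) {
            batches.add(current);
        }
        return batches;
    }
}
```

Spreading the work over many small commands keeps each one fast, so other clients are not blocked behind one huge HGETALL or HMSET.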
Upvotes: 0
Reputation: 323
You should consider avoiding such usage. Redis is not designed to hold this much data in a single key. You can try to gzip the value, and I think it will help.
Remember that Redis command execution is single-threaded, so it performs only one operation at a time. While it is busy, other operations wait their turn. That can have a huge impact on scalability and performance.
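A sketch of the gzip idea, using only the JDK's `java.util.zip`: serialize the map to a string (e.g. JSON) yourself, compress it before storing it as a single binary Redis value, and decompress it after reading. The Redis calls are omitted; any client that accepts `byte[]` values will work. The class and method names here are illustrative.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class GzipCodec {

    // Compress a serialized value (e.g. the whole map as JSON) before
    // storing it as one binary Redis value.
    public static byte[] compress(String value) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (GZIPOutputStream gzip = new GZIPOutputStream(out)) {
            gzip.write(value.getBytes(StandardCharsets.UTF_8));
        }
        return out.toByteArray();
    }

    // Decompress the value read back from Redis.
    public static String decompress(byte[] compressed) throws IOException {
        try (GZIPInputStream gzip =
                 new GZIPInputStream(new ByteArrayInputStream(compressed))) {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            byte[] buffer = new byte[8192];
            int n;
            while ((n = gzip.read(buffer)) != -1) {
                out.write(buffer, 0, n);
            }
            return new String(out.toByteArray(), StandardCharsets.UTF_8);
        }
    }
}
```

The trade-off: you lose per-field access (HGET on one field no longer works), but the payload shrinks and the transfer becomes a single fast GET/SET.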
Upvotes: 1
Reputation: 229
Maybe you can think about sharding the big map across several smaller keys, or saving gzipped binary data.
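One way to sketch the sharding idea: route each field to one of N smaller hashes, so no single key holds the whole 20k-entry map. The `baseKey + ":" + shard` naming convention and the method name `shardKey` are assumptions for this sketch; you would pass the returned key to your client's hash commands.

```java
public class HashSharder {

    // Deterministically map a field to one of numShards smaller hashes.
    // Math.floorMod keeps the shard index non-negative even when
    // field.hashCode() is negative.
    public static String shardKey(String baseKey, String field, int numShards) {
        int shard = Math.floorMod(field.hashCode(), numShards);
        return baseKey + ":" + shard;
    }
}
```

Because the routing is deterministic, reads and writes for the same field always hit the same shard, and a full scan just iterates over the N shard keys.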
Upvotes: 1