Reputation: 6587
I have a large dictionary and I would like to store it in Redis. I can do this by iterating through it, but that will take a long time:
for k, v in my_dict.iteritems():
    r.hset('my_dict', k, v)
Is there a way to bulk store it (similar to how I can do it in Mongodb, simply by uploading the dictionary)?
Upvotes: 1
Views: 215
Reputation: 49942
AFAIK, the iteration is mandatory. That said, you could improve on the above with either or both of the following "tricks":
- Use r.pipeline(), batching some commands on it and then calling the object's execute() method.
- Use r.hmset() with (iterated) chunks of your dictionary.

Upvotes: 2
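Combining both tricks might look like the sketch below: a helper splits the dictionary into fixed-size chunks, and each chunk is sent as a single command on a pipeline so the whole upload costs only a handful of round trips. Note that newer versions of redis-py deprecate hmset() in favor of hset(name, mapping=...); the function name bulk_hset and the chunk size are my own choices, not part of the answer.

```python
from itertools import islice


def chunked(mapping, size):
    """Yield successive dicts of at most `size` items from `mapping`."""
    it = iter(mapping.items())
    while True:
        chunk = dict(islice(it, size))
        if not chunk:
            break
        yield chunk


def bulk_hset(r, name, mapping, chunk_size=10_000):
    """Store `mapping` into the Redis hash `name`.

    `r` is assumed to be a redis.Redis client (e.g. r = redis.Redis()).
    Each chunk becomes one HSET command with a field/value mapping
    (the modern equivalent of the deprecated hmset()), and all chunks
    are queued on a pipeline so they go out in one round trip.
    """
    pipe = r.pipeline()
    for chunk in chunked(mapping, chunk_size):
        pipe.hset(name, mapping=chunk)
    pipe.execute()
```

With a chunk size of 10,000, a million-entry dictionary turns into about a hundred commands instead of a million individual HSET calls.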