Reputation: 264
I am using Redis to store JSON Web Tokens, and I am a little confused about how much memory each record consumes.
Say I have an instance on Google Cloud with 4 GB of memory allocated to it; I want to know how many records it can handle.
Assume each record holds, on average, one string value (excluding the identifier), and each string averages 200 characters.
Upvotes: 0
Views: 1384
Reputation: 355
Redis wraps each string in an sds struct, which adds at least 3 extra bytes of header per string.
Each sds is referenced from a redisObject struct (via a pointer to the sds). That adds about 16 extra bytes on a 64-bit machine.
You should also count the entry in the main hash table, which takes another 24 bytes.
So you can assume each of your strings occupies roughly 243 bytes (200 + 3 + 16 + 24). One million such strings will use more than 250 MB, since Redis itself also needs memory.
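The arithmetic above can be sketched as a small Java program. The 3/16/24-byte overheads are the approximations from this answer, not exact figures: they vary by Redis version, sds header size, and allocator.

```java
// Rough per-key cost of a plain Redis string, using the overheads
// described above. These constants are approximations, not exact.
public class RedisMemoryEstimate {
    static final long SDS_HEADER = 3;    // at least 3 bytes of sds header
    static final long REDIS_OBJECT = 16; // redisObject on a 64-bit build
    static final long DICT_ENTRY = 24;   // main hash-table entry

    // Estimated bytes for one key whose value is `valueLength` chars long.
    static long perKeyBytes(long valueLength) {
        return valueLength + SDS_HEADER + REDIS_OBJECT + DICT_ENTRY;
    }

    public static void main(String[] args) {
        long perKey = perKeyBytes(200);     // 243 bytes per key
        long million = perKey * 1_000_000L; // 243,000,000 bytes for 1M keys
        System.out.println(perKey + " bytes/key, "
                + million + " bytes per million keys");
    }
}
```

In practice you can cross-check the estimate against a live server with `MEMORY USAGE <key>` (Redis 4.0+), which reports the actual bytes attributed to a key.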
Upvotes: 2
Reputation: 5689
It's all about how you store them: hashes (sized properly so they stay in the compact encoding) versus plain key-value pairs. Do read this doc for more info: http://redis.io/topics/memory-optimization
For 1 million keys (simple key-value pairs) of 200 characters each, it takes about 300 MB. So with 4 GB you can store more or less 14 million keys, I guess. To verify this, install Redis on your machine, run a simple Java snippet (using Jedis), and check the memory consumption before and after the insertion:
Jedis jedis = new Jedis("localhost");
for (int i = 0; i < N; i++) {
    jedis.set("Key_" + i, value);
}
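The "more or less 14 million" figure above follows from simple division; a minimal sketch, assuming the ~300 bytes/key cost implied by the 300 MB per million keys measurement:

```java
// Back-of-the-envelope capacity check: at roughly 300 bytes per
// 200-character key (an assumption taken from the 300 MB / 1M keys
// figure above), how many keys fit in a given memory budget?
public class RedisCapacityEstimate {
    static long maxKeys(long memoryBytes, long bytesPerKey) {
        return memoryBytes / bytesPerKey;
    }

    public static void main(String[] args) {
        long fourGb = 4L * 1024 * 1024 * 1024;
        // ~14.3 million keys, in line with the answer's rough estimate
        System.out.println(maxKeys(fourGb, 300) + " keys fit in 4 GB");
    }
}
```

Leave some headroom below the theoretical maximum: Redis needs memory for its own structures, and operations like rehashing and replication buffers spike usage above the steady state.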
Upvotes: 1