Furzel

Reputation: 616

Redis high memory usage for almost no keys

I have a Redis instance hosted by Heroku (https://elements.heroku.com/addons/heroku-redis) on the "Premium 1" plan.

This Redis instance is used only to host a small queue system called Bull (https://www.npmjs.com/package/bull).

The memory usage is now almost at 100% (of the 100 MB allowed) even though there are barely any jobs stored in Redis.

I ran an INFO command on this instance and here are the important parts (I can post more if needed):

# Server
redis_version:3.2.4

# Memory
used_memory:98123632
used_memory_human:93.58M
used_memory_rss:470360064
used_memory_rss_human:448.57M
used_memory_peak:105616528
used_memory_peak_human:100.72M
total_system_memory:16040415232
total_system_memory_human:14.94G
used_memory_lua:280863744
used_memory_lua_human:267.85M
maxmemory:104857600
maxmemory_human:100.00M
maxmemory_policy:noeviction
mem_fragmentation_ratio:4.79
mem_allocator:jemalloc-4.0.3

# Keyspace
db0:keys=45,expires=0,avg_ttl=0  

# Replication
role:master
connected_slaves:1
master_repl_offset:25687582196
repl_backlog_active:1
repl_backlog_size:1048576
repl_backlog_first_byte_offset:25686533621
repl_backlog_histlen:1048576

I have a really hard time figuring out how I can be using 95 MB with barely 50 objects stored. These objects are really small, usually a JSON document with 2-3 fields containing short strings and IDs.

I've tried https://github.com/gamenet/redis-memory-analyzer but it crashes when I try to run it.

I can't get a dump because Heroku does not allow it.

I'm a bit lost here; there might be something obvious I've missed, but I'm reaching the limit of my understanding of Redis.

Thanks in advance for any tips / pointers.

EDIT

We had to upgrade our Redis instance to keep everything running, but it seems the issue is still there. Currently sitting at 34 keys / 34 MB.

I've tried redis-cli --bigkeys :

Sampled 34 keys in the keyspace!
Total key length in bytes is 743 (avg len 21.85)

9 strings with 43 bytes (26.47% of keys, avg size 4.78)
0 lists with 0 items (00.00% of keys, avg size 0.00)
0 sets with 0 members (00.00% of keys, avg size 0.00) 
24 hashs with 227 fields (70.59% of keys, avg size 9.46)
1 zsets with 23 members (02.94% of keys, avg size 23.00)

I'm pretty sure there is some overhead building up somewhere, but I can't find where.

EDIT 2

I'm actually blind: used_memory_lua_human:267.85M in the INFO output I ran when first creating this post, and now used_memory_lua_human:89.25M on the new instance.

This seems super high, and might explain the memory usage.
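
For anyone who wants to keep an eye on this counter, here is a rough sketch of how it can be polled from Node, assuming ioredis (which Bull uses under the hood); REDIS_URL is a placeholder for your connection string:

import Redis from "ioredis";

// Poll INFO memory and log the size of the server-side Lua script cache.
// REDIS_URL is a placeholder for the Heroku Redis connection string.
const redis = new Redis(process.env.REDIS_URL ?? "redis://localhost:6379");

async function logLuaMemory(): Promise<void> {
  const info = await redis.info("memory");
  // INFO returns a flat "field:value" text block; extract the Lua field.
  const lua = info.match(/used_memory_lua_human:(\S+)/)?.[1] ?? "unknown";
  console.log(`used_memory_lua: ${lua}`);
}

// Log once a minute to see whether the Lua cache keeps growing.
setInterval(logLuaMemory, 60_000);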

Upvotes: 1

Views: 6880

Answers (2)

Furzel

Reputation: 616

After a lot of digging, the issue is not coming from Redis or Heroku in any way.

The queue system we use has a somewhat recent bug where Redis ends up repeatedly caching a Lua script, eating up memory as time goes on.

More info here: https://github.com/OptimalBits/bull/issues/426
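
As a stopgap, the server-side script cache can be cleared with SCRIPT FLUSH. A minimal sketch, again assuming ioredis; note that clients relying on EVALSHA must be able to re-load their scripts afterwards (ioredis falls back to EVAL when it gets a NOSCRIPT error):

import Redis from "ioredis";

// Sketch: flush the Lua script cache to reclaim the memory it holds.
// REDIS_URL is a placeholder for the connection string.
const redis = new Redis(process.env.REDIS_URL ?? "redis://localhost:6379");

async function flushScripts(): Promise<void> {
  await redis.call("SCRIPT", "FLUSH");
  // Confirm the effect by re-reading the Lua memory counter.
  const info = await redis.info("memory");
  console.log(info.match(/used_memory_lua_human:\S+/)?.[0]);
  redis.disconnect();
}

flushScripts().catch(console.error);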

Thanks to those who took the time to reply.

Upvotes: 2

Alexander Bocharov

Reputation: 201

You have just 45 keys in the database, so what you can do is:

  1. List all keys with KEYS * command
  2. Run the DEBUG OBJECT <key> command for each key (or a few of them); it returns the serialized length, so you will get a better understanding of which keys consume a lot of space (a small script doing this is sketched below).

An alternative option is to run redis-cli --bigkeys, which will show the biggest keys. You can then see the content of a key with the command specific to its data type: for strings it's GET, for hashes it's HGETALL, and so on.
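
If you'd rather script steps 1 and 2 than run them by hand, here is a rough sketch, assuming Node with ioredis; keep in mind that DEBUG OBJECT's serializedlength is the serialized size, which only approximates the in-memory footprint:

import Redis from "ioredis";

// Sketch of steps 1 and 2: list every key, then print its serialized length.
// Fine for ~45 keys; prefer SCAN over KEYS on large keyspaces.
const redis = new Redis(process.env.REDIS_URL ?? "redis://localhost:6379");

async function sizeAllKeys(): Promise<void> {
  const keys = await redis.keys("*");
  for (const key of keys) {
    // DEBUG OBJECT replies with a line such as:
    // "Value at:... refcount:1 encoding:ziplist serializedlength:123 ..."
    const raw = String(await redis.call("DEBUG", "OBJECT", key));
    const len = raw.match(/serializedlength:(\d+)/)?.[1] ?? "?";
    console.log(`${key}: ${len} bytes (serialized)`);
  }
  redis.disconnect();
}

sizeAllKeys().catch(console.error);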

Upvotes: 2
