Gary

Reputation: 483

Go Redis Loading

I am trying to load 200 million keys into Redis, and I usually start getting an error at around 31 million keys and have to stop. I am using Go and the Redis library "github.com/garyburd/redigo/redis".

I set up a connection pool as so:

func newPool(server string) *redis.Pool {
    return &redis.Pool{
        MaxIdle:     3,
        MaxActive:   10,
        IdleTimeout: 240 * time.Second,
        Dial: func() (redis.Conn, error) {
            c, err := redis.Dial("tcp", server)
            if err != nil {
                return nil, err
            }
            return c, nil
        },
        TestOnBorrow: func(c redis.Conn, t time.Time) error {
            _, err := c.Do("PING")
            return err
        },
    }
}

I then try to fill up redis with values with this function:

func RedisServerBatchLoadKeys(rtbExchange string, keys []string) {
  redisLock.Lock()
  defer redisLock.Unlock()
  retry := 0
  for {
    conn := GetConnOrPanic(rtbExchange)
    conn.Send("MULTI")
    for _, key := range keys {
      conn.Send("SET", key, maxCount)
      conn.Send("EXPIRE", key, numSecondsExpire)
    }
    _, err := conn.Do("EXEC")
    // Close the connection each iteration; a deferred Close inside the loop
    // would not run until the function returns and would leak connections
    // across retries.
    conn.Close()
    if err == nil {
      break
    } else if err != io.EOF {
      CheckRedisError(err, rtbExchange, "Could not load batch")
    } else {
      retry++
    }
    if retry >= 10 {
      CheckRedisError(err, rtbExchange, "Could not load batch - 10 retries")
    }
  }
}

I have been getting numerous errors such as:

Am I doing something fundamentally wrong, or do I need to add more error checks (aside from the EOF check I added)?

Thanks,

Upvotes: 1

Views: 3432

Answers (1)

Caleb

Reputation: 9458

Just a guess: 200 million keys is a lot. Do you have enough memory for a database that size?

The Redis docs say:

Redis can handle up to 2^32 keys, and was tested in practice to handle at least 250 million of keys per instance.

In other words, your limit is likely the available memory in your system.
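A rough back-of-the-envelope check makes the scale concrete. The per-key overhead and key/value sizes below are assumptions for illustration (real numbers depend on Redis version and encoding; measure with `INFO memory` on your own data), but they suggest a dataset this size needs tens of gigabytes:

```go
package main

import "fmt"

// estimateGB is a rough sizing helper: total keys times an assumed
// per-key cost (Redis bookkeeping overhead + key bytes + value bytes).
func estimateGB(keys, bytesPerKey int) float64 {
	return float64(keys) * float64(bytesPerKey) / 1e9
}

func main() {
	const keys = 200000000          // 200 million keys
	const bytesPerKey = 90 + 20 + 8 // assumed: ~90B overhead, ~20B key, small value
	fmt.Printf("roughly %.0f GB of RAM\n", estimateGB(keys, bytesPerKey))
}
```

If the machine has less memory than that, running out of RAM around the 31-million-key mark every time is exactly what you'd expect.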

They also say:

What happens if Redis runs out of memory?

Redis will either be killed by the Linux kernel OOM killer, crash with an error, or will start to slow down.

It seems plausible to me that you can't connect because the server is actually down. Perhaps it gets restarted, and each run of your script dies at the same place because that's where you run out of memory.

If this is your problem there are a couple things you could try:

  1. Use a Redis hash, which can store data more efficiently. See http://redis.io/topics/memory-optimization
  2. Partition (shard) your data set across multiple servers (for example, with 4 servers you could take key % 4 to determine which Redis server to store under). If O(1) lookup is what you're after, you'll still get it, though your system becomes more brittle because there are now multiple points of failure.
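The sharding suggestion can be sketched like this. `shardFor` is a hypothetical helper, not part of any library; hashing the key first makes the `% 4` trick work for string keys as well as numeric ones, and in a real setup you would keep one `*redis.Pool` per shard:

```go
package main

import (
	"fmt"
	"hash/crc32"
)

// shardFor picks which of n Redis servers a key belongs to.
// CRC32 gives a stable, evenly distributed hash for string keys,
// so the same key always maps to the same shard.
func shardFor(key string, n int) int {
	return int(crc32.ChecksumIEEE([]byte(key))) % n
}

func main() {
	const shards = 4 // one Redis server (and one connection pool) per shard
	for _, key := range []string{"user:1001", "user:1002", "user:1003"} {
		fmt.Printf("%s -> shard %d\n", key, shardFor(key, shards))
	}
}
```

Note that simple modulo sharding means adding or removing a server remaps most keys; consistent hashing avoids that, at the cost of more code.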

Upvotes: 2
