Reputation: 1193
I am trying to write a simple HTTP server in Go that accepts a query string and pushes it to Redis. I use the redigo module for the Redis connection and fasthttp for the HTTP server.
In the code below I am trying to use a Redis pool so that connections are reused. When I benchmark it with ab
ab -n 10000 -c 100 -k -r http://127.0.0.1:9080/?a=b
I see that almost 6000 Redis connections are opened, which I can confirm with netstat.
Is the connection pooling not working? How do I reduce the number of connections to Redis?
package main

import (
    "fmt"
    "time"

    "github.com/garyburd/redigo/redis"
    "github.com/valyala/fasthttp"
)

var Pool *redis.Pool
var err error

func init() {
    Pool = newPool("127.0.0.1:6379")
    _, err = Pool.Dial()
    if err != nil {
        fmt.Println(err)
    }
}

func newPool(server string) *redis.Pool {
    return &redis.Pool{
        MaxIdle:     3,
        IdleTimeout: 240 * time.Second,
        Dial: func() (redis.Conn, error) {
            c, err := redis.Dial("tcp", server)
            if err != nil {
                return nil, err
            }
            return c, err
        },
    }
}

func fastHTTPHandler(ctx *fasthttp.RequestCtx) {
    conn := Pool.Get()
    defer conn.Close()
    conn.Do("RPUSH", "LTBENCH_PARAMS", ctx.QueryArgs().QueryString())
}

func main() {
    fasthttp.ListenAndServe(":9080", fastHTTPHandler)
}
Upvotes: 2
Views: 2153
Reputation: 109339
You're only allowing 3 idle connections in the pool at a time, so after the first concurrent batch of 100 clients completes, up to 97 of those connections could be closed instead of reused.
You need to set MaxIdle to a value that can handle your expected concurrency, which is 100 in this case. If you want to put a limit on open connections, you should also set MaxActive and Wait so that spikes in activity don't exhaust server resources.
return &redis.Pool{
    MaxIdle:     100,
    IdleTimeout: 240 * time.Second,
    MaxActive:   200,
    Wait:        true,
    Dial:        func() (redis.Conn, error) { return redis.Dial("tcp", addr) },
}
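If you want to confirm that the pool is actually being reused while the ab run is in flight, here is a minimal sketch (not part of the original answer) that logs the pool's ActiveCount() on a ticker; the 5-second interval, the log wording, and the hard-coded addresses are just illustrative assumptions.

// Sketch only: periodically log how many connections the redigo pool holds
// open while serving the same RPUSH workload. Assumes Redis on 127.0.0.1:6379.
package main

import (
    "log"
    "time"

    "github.com/garyburd/redigo/redis"
    "github.com/valyala/fasthttp"
)

var pool = &redis.Pool{
    MaxIdle:     100,
    MaxActive:   200,
    Wait:        true,
    IdleTimeout: 240 * time.Second,
    Dial:        func() (redis.Conn, error) { return redis.Dial("tcp", "127.0.0.1:6379") },
}

func main() {
    // Report the number of open connections managed by the pool every 5 seconds.
    go func() {
        for range time.Tick(5 * time.Second) {
            log.Printf("pool active connections: %d", pool.ActiveCount())
        }
    }()

    fasthttp.ListenAndServe(":9080", func(ctx *fasthttp.RequestCtx) {
        conn := pool.Get()
        defer conn.Close()
        if _, err := conn.Do("RPUSH", "LTBENCH_PARAMS", ctx.QueryArgs().QueryString()); err != nil {
            log.Println("RPUSH failed:", err)
        }
    })
}

With Wait set to true, a request that arrives when all 200 connections are checked out blocks in Pool.Get() until one is returned, instead of dialing a new connection, so the netstat count should stay bounded.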
Upvotes: 7