DurandA

Reputation: 1550

Behavior of redis.ConnectionPool with asyncio (redis-py)

I am using Redis in a FastAPI application via the redis-py library. I am unclear about the behavior of redis.ConnectionPool in this context, especially how it handles situations where the number of connections exceeds the max_connections limit. I'm also seeking advice on whether I should create multiple clients from a single ConnectionPool, or reuse a single client across the application.

Here's a simple example application:

from redis.asyncio import ConnectionPool, Redis
from fastapi import Depends, FastAPI

app = FastAPI()

pool = ConnectionPool(max_connections=10)
redis_shared = Redis(connection_pool=pool)

async def get_redis():
    return Redis(connection_pool=pool)
    # or alternatively
    #return redis_shared

@app.get("/")
async def my_route(redis: Redis = Depends(get_redis)):
    return await redis.get('mykey')

In this code, I'm not sure what happens when the number of concurrent tasks exceeds max_connections (10) in the ConnectionPool. Does redis-py queue the excess requests until a connection is free, or does it raise an exception?
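To make the question concrete, here is a stdlib-only toy model of the two behaviors I imagine are possible. The `ToyPool` class below is hypothetical and is not redis-py's implementation; it just contrasts a pool that fails fast when exhausted with one that queues waiters:

```python
import asyncio

class ToyPool:
    """Stdlib-only model of a capped connection pool (hypothetical, not redis-py)."""
    def __init__(self, max_connections: int, blocking: bool):
        self.blocking = blocking
        self._sem = asyncio.Semaphore(max_connections)

    async def acquire(self):
        if not self.blocking and self._sem.locked():
            raise ConnectionError("Too many connections")  # fail-fast behavior
        await self._sem.acquire()                          # take a slot, or queue for one

    def release(self):
        self._sem.release()

async def worker(pool: ToyPool):
    await pool.acquire()
    try:
        await asyncio.sleep(0.01)  # stand-in for awaiting a Redis command
    finally:
        pool.release()

async def run_demo(blocking: bool) -> int:
    """Run 5 concurrent workers against a pool of 2; return how many failed."""
    pool = ToyPool(max_connections=2, blocking=blocking)
    results = await asyncio.gather(
        *(worker(pool) for _ in range(5)), return_exceptions=True)
    return sum(isinstance(r, ConnectionError) for r in results)

if __name__ == "__main__":
    print("fail-fast pool:", asyncio.run(run_demo(blocking=False)), "of 5 failed")
    print("queueing pool: ", asyncio.run(run_demo(blocking=True)), "of 5 failed")
```

In the fail-fast variant, the three excess workers error out immediately; in the queueing variant, they wait for a slot and all five succeed. Which of these two behaviors does redis-py's ConnectionPool actually follow under asyncio?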

Additionally, is it more efficient or advisable to create multiple clients from a single ConnectionPool (as in the dependency above), or should I create one client and reuse it across the application?

Any insights or best practices regarding the use of ConnectionPool with asyncio in redis-py would be greatly appreciated.

Upvotes: 4

Views: 1277

Answers (0)
