Reputation: 3138
I have a celery server that handles some counters for my application:
import redis
from celery.task import Task
from django.conf import settings


class IncrementStatsCounterTask(Task):
    def run(self, count, shortcode, stat_type, operator_id, date, **kwargs):
        # Opens a new Redis connection on every task run
        r_server = redis.Redis(settings.REDIS_HOST)
        key = key_mask % {  # key_mask is a format string defined elsewhere in the module
            'shortcode': shortcode,
            'stat_type': stat_type,
            'operator_id': operator_id,
            'date': date.strftime('%Y%m%d'),
        }
        return key, r_server.incr(key, count)
It all works great, however this opens and closes the Redis connection every time my task runs. Is there a better way to handle the connections? Maybe some sort of persistent connection?
I'm running the latest django-celery.
Upvotes: 6
Views: 1866
Reputation: 5841
The Python redis library (redis-py) supports connection pooling. Just create a pool globally in one of your modules and use it for every new connection, as in the sketch below.
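A minimal sketch of that idea, reusing the settings.REDIS_HOST and key_mask names from the question (the port and db values are assumptions). The pool is created once per worker process at import time, and each task run borrows a connection from it instead of opening a new socket:

import redis
from celery.task import Task
from django.conf import settings

# Created once when the module is imported; every Redis() client below
# shares the connections held by this pool.
REDIS_POOL = redis.ConnectionPool(host=settings.REDIS_HOST, port=6379, db=0)


class IncrementStatsCounterTask(Task):
    def run(self, count, shortcode, stat_type, operator_id, date, **kwargs):
        # Passing connection_pool= means no new connection is opened here;
        # a pooled one is reused and returned automatically.
        r_server = redis.Redis(connection_pool=REDIS_POOL)
        key = key_mask % {  # key_mask as in the question
            'shortcode': shortcode,
            'stat_type': stat_type,
            'operator_id': operator_id,
            'date': date.strftime('%Y%m%d'),
        }
        return key, r_server.incr(key, count)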
Upvotes: 1