user2349115

Reputation: 1286

Worker dying after receiving 1000s of tasks at a time in Django Celery

I am running a script that generates nearly 10,000 objects. On each object's save, a Celery task is called, so within 1-3 minutes Celery receives thousands of tasks and its worker dies (while its state still shows RUNNING).

So I have to restart it again and again. Because I am restarting so many times, the many Python processes (running Celery) are consuming a lot of memory.
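Not from the original post, but a common way to reduce this kind of task flood is to enqueue one task per *batch* of objects instead of one task per save. The sketch below is a stdlib-only simulation: `process_batch` stands in for what would be a `@app.task` in a real Celery project, and the batch size of 100 is an arbitrary choice.

```python
# Hypothetical sketch: collect the new objects' IDs and enqueue one
# task per batch, instead of firing one Celery task per saved object.
# Stdlib-only; `process_batch` is a stand-in for a real Celery task.

def chunked(ids, size):
    """Yield successive `size`-sized chunks of `ids`."""
    for i in range(0, len(ids), size):
        yield ids[i:i + size]

def process_batch(batch):
    # In a real project this would be a @app.task doing the per-object
    # work, invoked with process_batch.delay(batch).
    return len(batch)

object_ids = list(range(10000))   # stand-in for the ~10,000 generated objects
batches = list(chunked(object_ids, 100))
print(len(batches))               # 100 messages on the broker instead of 10,000
processed = sum(process_batch(b) for b in batches)
print(processed)                  # all 10,000 objects still get handled
```

With batching, the broker sees two orders of magnitude fewer messages, which takes a lot of pressure off a single worker even before the freezing bug itself is fixed.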

Upvotes: 2

Views: 1780

Answers (1)

Hodson

Reputation: 3556

If I am understanding correctly, you are having the same problem I had a few weeks back. Every so often, it seemed as if our Celery worker was just freezing (we found it was actually receiving tasks but not executing any), and after restarting the worker it would rush through the tasks until it decided to freeze again.

The problem was solved by doing the following pip installs:

pip install https://github.com/celery/billiard/zipball/2.7
pip install https://github.com/celery/celery/zipball/asynwrite

I found the solution on the GitHub issue tracker for the Celery project, but I can't find the exact ticket. Here's a link to a similar issue, though (that uses this as a solution).

Upvotes: 2
