Omroth

Reputation: 1119

Celery worker receiving too many tasks

I have started my Celery worker like this:

celery -A workerTasks worker --concurrency 1 --loglevel=info

But I see that it is receiving multiple tasks:

[2019-05-08 10:33:07,371: INFO/MainProcess] celery@2aaf46abfaed ready.
[2019-05-08 10:33:07,372: INFO/MainProcess] Received task: workerTasks.do_processing_task[a3c19e7a-6b04-4236-8afc-0884547d3f39]  
[2019-05-08 10:33:07,373: INFO/MainProcess] Received task: workerTasks.do_processing_task[f22443c1-6dee-4ecf-8c54-7a7c4da5418e]  
[2019-05-08 10:33:07,373: INFO/MainProcess] Received task: workerTasks.do_processing_task[6eb501b7-c192-46db-be78-6061300b6bdf]  
[2019-05-08 10:33:07,373: INFO/MainProcess] Received task: workerTasks.do_processing_task[ec08b59f-541e-42fc-806d-cfbd40daf7b7]  
[2019-05-08 10:33:07,479: INFO/MainProcess] Received task: workerTasks.do_processing_task[deaaec17-b07c-4476-9b44-9f8e884b0b6e]  

Why is it not just receiving one?

Thank you.

Upvotes: 4

Views: 2306

Answers (1)

Cesar Canassa

Reputation: 20173

I believe that this is due to the worker_prefetch_multiplier setting. From the documentation:

How many messages to prefetch at a time multiplied by the number of concurrent processes. The default is 4 (four messages for each process). The default setting is usually a good choice, however – if you have very long running tasks waiting in the queue and you have to start the workers, note that the first worker to start will receive four times the number of messages initially. Thus the tasks may not be fairly distributed to the workers.

Basically, Celery will try to fetch tasks in batches from your broker. This is done in order to avoid too many round trips.
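If you want each process to reserve only one task at a time, you can lower the multiplier in your app configuration. Here is a minimal sketch, assuming your app is defined in workerTasks.py; the broker URL and task body are placeholders, not taken from your setup:

    from celery import Celery

    # Assumed broker URL; replace with whatever your project actually uses.
    app = Celery('workerTasks', broker='redis://localhost:6379/0')

    # Prefetch only one message per worker process instead of the default four.
    app.conf.worker_prefetch_multiplier = 1

    @app.task
    def do_processing_task():
        # Placeholder for your real task body.
        pass

The same setting should also be available on the command line via the worker's --prefetch-multiplier option, e.g.:

    celery -A workerTasks worker --concurrency 1 --prefetch-multiplier 1 --loglevel=info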

Upvotes: 4
