bmiljevic

Reputation: 782

Starting multiple processes in Celery

I have a problem with configuring Celery for Django.

This is how I start django-celery:

python manage.py celery worker --autoscale=10,2

Example of a task:

from celery import task
import time

@task
def test(i):
    print "ITERATION {} START".format(i)
    time.sleep(10)  # simulate a long-running task
    print "ITERATION {} END".format(i)
    return True

And I call this task with:

for i in range(10):
    test.delay(i)

What I expect to happen is that if I send 10 tasks to the queue, 10 processes should start, one for each task.

What actually happens is that a random number of processes start, usually 4; after those 4 tasks finish, another 3 start, and after they finish, another 3 start. This happens even for tasks that take longer to complete, e.g. 2 minutes.

Can someone explain this behavior? How can I make all tasks start immediately, as long as the autoscale upper limit allows it?

Also, although the lower autoscale limit is 2, three processes run when the server is started. Why is that?

Platform: OpenWRT, dual-core processor, 2 GB RAM.

Upvotes: 2

Views: 4302

Answers (1)

Uri Shalit

Reputation: 2308

By default, Celery creates one worker process per core, so I am assuming you are running on a machine with 4 cores. You can configure this with the --concurrency flag; see the documentation for further details.
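
For example, with the same django-celery invocation you could drop --autoscale and pin the pool size explicitly (the value of 10 here is just an illustration):

python manage.py celery worker --concurrency=10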

As for the pattern you describe, where one batch of processes runs and the next batch only starts after it finishes: a new task will start only once another has finished, and even that can sometimes be delayed because of the prefetch policy. You can see this thread for more details.
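
As a minimal sketch of tightening the prefetch behaviour, assuming a django-celery setup where the Celery configuration lives in Django's settings.py (this is the Celery 3.x-era setting name):

# settings.py
# Each worker process reserves only one task at a time
# instead of prefetching a batch from the broker.
CELERYD_PREFETCH_MULTIPLIER = 1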

Celery starts a main process plus X worker processes; the main process manages the workers, restarts them when needed, and dispatches tasks to them. So if you have 2 workers, you will see 3 processes.
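
You can verify this yourself (assuming a Unix-like host) by listing the Celery processes while the worker is running; you should see the parent process plus one process per pool worker:

ps ax | grep celery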

Upvotes: 1
