Prometheus

Reputation: 33615

Executing two tasks at the same time with Celery

I'm testing Celery in a local environment. My Python file has the following two lines of code:

celery_app.send_task('tasks.test1', args=[self.id], kwargs={})
celery_app.send_task('tasks.test2', args=[self.id], kwargs={})

Looking at the console output, they seem to run in sequence: test2 only starts after test1 has finished. At least that is how it looks from the console output.

These tasks have no dependencies on each other, so I don't want one task waiting for the other to complete before moving on to the next line.

How can I execute both tasks at the same time?

Here is the worker's startup banner:

---- **** -----
--- * ***  * -- Darwin-14.0.0-x86_64-i386-64bit
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app:         tasks:0x104cd8c10
- ** ---------- .> transport:   sqs://123
- ** ---------- .> results:     disabled
- *** --- * --- .> concurrency: 4 (prefork)
-- ******* ----
--- ***** ----- [queues]
 -------------- .> celery           exchange=celery(direct) key=celery

Upvotes: 14

Views: 17764

Answers (2)

Chillar Anand

Reputation: 29514

There are multiple ways to achieve this.

1. Single Worker - Single Queue.

$ celery -A my_app worker -l info -c 2 -n my_worker

This starts one worker that can execute up to 2 tasks concurrently (the -c 2 flag sets the prefork pool to two processes).
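
For completeness, a minimal tasks.py that fits this setup might look like the sketch below; the broker URL and the sleep calls are placeholder assumptions, only the task names tasks.test1 and tasks.test2 come from the question:

# tasks.py - a minimal sketch; broker URL and sleeps are placeholders
import time

from celery import Celery

celery_app = Celery('tasks', broker='redis://localhost:6379/0')

@celery_app.task(name='tasks.test1')
def test1(item_id):
    time.sleep(5)  # simulate work
    return item_id

@celery_app.task(name='tasks.test2')
def test2(item_id):
    time.sleep(5)  # simulate work
    return item_id

With -c 2, the prefork pool has two processes, so test1 and test2 can run in parallel instead of back to back.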

2. Multiple Workers - Single Queue.

$ celery -A my_app worker -l info -c 1 -n my_worker1
$ celery -A my_app worker -l info -c 1 -n my_worker2

This starts two workers, each executing one task at a time. Both workers consume from the same default celery queue, so the two tasks are distributed between them and can run simultaneously.

3. Multiple Workers - Multiple Queues.

$ celery -A my_app worker -l info -c 1 -n my_worker1 -Q queue1
$ celery -A my_app worker -l info -c 1 -n my_worker2 -Q queue2

This also starts two workers, each executing one task at a time, but here you have to route the tasks to the appropriate queues:

celery_app.send_task('tasks.test1', args=[self.id], kwargs={}, queue='queue1')
celery_app.send_task('tasks.test2', args=[self.id], kwargs={}, queue='queue2')
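
If you'd rather not pass queue= at every call site, you can declare the routing once in the app configuration instead; a minimal sketch, assuming Celery 4+ setting names (older versions use CELERY_ROUTES):

# Route each task to its queue once, in configuration.
celery_app.conf.task_routes = {
    'tasks.test1': {'queue': 'queue1'},
    'tasks.test2': {'queue': 'queue2'},
}

# The original calls then work unchanged:
celery_app.send_task('tasks.test1', args=[self.id], kwargs={})
celery_app.send_task('tasks.test2', args=[self.id], kwargs={})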

4. Single Worker - All Queues.

$ celery -A my_app worker -l info -n my_worker1

If you don't specify a queue with -Q, the worker consumes from every queue declared in the app's configuration (only the default celery queue if none are declared), so a single worker can serve all of your queues. A sketch of declaring them follows.
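
A minimal sketch of declaring the queues from option 3, assuming the Celery 4+ setting name task_queues (older versions use CELERY_QUEUES):

from kombu import Queue

# Declare both queues; a worker started without -Q consumes from each.
celery_app.conf.task_queues = (
    Queue('queue1'),
    Queue('queue2'),
)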

Upvotes: 28

All Іѕ Vаиітy

Reputation: 26412

Start the worker with the --autoscale option, which scales the number of pool processes up and down as required:

--autoscale AUTOSCALE
                       Enable autoscaling by providing max_concurrency,
                       min_concurrency. Example:: --autoscale=10,3 (always
                       keep 3 processes, but grow to 10 if necessary)

Example:

celery -A sandbox worker --autoscale=10,0 --loglevel=info
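
With min_concurrency set to 0, as above, the pool starts with no processes and forks them on demand, so both tasks from the question can run in parallel once the worker scales up.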

Upvotes: 7
