aceminer

Reputation: 4295

celery group runs tasks sequentially instead of in parallel

I am learning Celery's group function:

import time
from datetime import datetime

from celery import group

@celery_app.task
def celery_task():
    time.sleep(30)
    print('task 1')

@celery_app.task
def celery_task2():
    time.sleep(10)
    print('task 2')

@celery_app.task
def test():
    print(datetime.now())
    job = group(
        celery_task.s(),
        celery_task2.s()
    )
    result = job()
    result.get()
    print(datetime.now())

However, when I ran test() from the Python console and watched the Celery logs, it appeared that task 1 ran first and then task 2 ran.

Shouldn't they run in parallel? The whole test() function took 30s to complete.

To start up my Celery workers I use the command celery -A tasks worker -l=INFO

Upvotes: 1

Views: 2089

Answers (1)

ItayB

Reputation: 11337

Are you sure that the whole test() took 30s? If so, I don't understand what the problem is: if it weren't parallel, it would have to take 30s + 10s = 40s.

Two things here:

  1. Use the --concurrency flag when you run your worker so it can handle more than one task at a time, or alternatively run more than one worker (two processes): celery -A tasks worker -l=INFO --concurrency=4 (the default is the number of CPU cores on the machine).
  2. Run your canvas asynchronously with job.delay() or job.apply_async().
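To see why the arithmetic in the answer matters, here is a small standalone sketch (an illustration, not using Celery: it simulates the two tasks with threads and scaled-down sleep times, 30s -> 0.3s and 10s -> 0.1s). Run sequentially, the total is the sum of the two durations (~0.4s); run concurrently, it is roughly the longer of the two (~0.3s):

```python
import threading
import time

# Scaled-down stand-ins for celery_task (30s -> 0.3s) and celery_task2 (10s -> 0.1s)
def celery_task():
    time.sleep(0.3)

def celery_task2():
    time.sleep(0.1)

# Sequential: total is the sum of both durations (~0.4s)
start = time.perf_counter()
celery_task()
celery_task2()
sequential = time.perf_counter() - start

# Parallel: total is roughly the longer duration (~0.3s)
start = time.perf_counter()
threads = [threading.Thread(target=celery_task),
           threading.Thread(target=celery_task2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
parallel = time.perf_counter() - start

print(f'sequential: {sequential:.2f}s, parallel: {parallel:.2f}s')
```

The same principle applies to a Celery group: with enough worker concurrency, the group finishes in roughly the time of its slowest task rather than the sum of all tasks.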

Upvotes: 4
