pulkit

Reputation: 81

Using Single Celery Server For Multi Django Projects

I have 3 separate Django projects sharing the same DB, running on the same machine. I need to configure Celery for them. My question is:

1.) Should I run a separate Celery daemon for each project, with different vhosts and users in RabbitMQ? I'd rather not, since that seems like a waste of resources. Or:

2.) Is there a way to route tasks from all the projects to a single Celery server?

Also, how handy would supervisord be in this setup?

Upvotes: 7

Views: 2643

Answers (1)

Chillar Anand

Reputation: 29514

Yes, you can use the same Celery server to receive tasks from separate projects.

Have a separate Celery app (or just a single module), say foo, which holds all the tasks used in the different projects.

# foo.py
from celery import Celery

# All projects point at the same broker, so one worker can serve them all.
app = Celery(broker='amqp://guest@localhost//')

@app.task
def add(x, y):
    return x + y

@app.task
def sub(x, y):
    return x - y

Start a worker to run the tasks:

celery worker -l info -A foo

Now, from Project A, you can call add:

import celery

celery.current_app.send_task('foo.add', args=(1, 2))

And, from Project B, you can call sub:

import celery

celery.current_app.send_task('foo.sub', args=(1, 2))

You can use supervisord to manage the Celery worker.
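A minimal sketch of a supervisord program section for this worker; the file path, working directory, virtualenv, and log locations are placeholders you would adapt to your setup:

```ini
; /etc/supervisor/conf.d/celery_foo.conf  (hypothetical path)
[program:celery_foo]
command=/path/to/venv/bin/celery worker -l info -A foo
directory=/path/to/project        ; directory containing foo.py
autostart=true
autorestart=true
stopasgroup=true                  ; stop the whole worker process group on stop
stdout_logfile=/var/log/celery/foo.log
redirect_stderr=true
```

After adding the file, reload supervisord (`supervisorctl reread && supervisorctl update`) and it will start and babysit the worker for you.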

This approach makes testing slightly harder, because send_task won't respect CELERY_ALWAYS_EAGER. However, you can use this snippet so that CELERY_ALWAYS_EAGER is honored by send_task as well.

Upvotes: 2
