krisdigitx

Reputation: 7126

Python Django Celery project structure

I am testing out Celery for use as a multi-processing cluster application; I am looking for some hints on whether this is the correct way to do it or not...

I am using:

python 2.6.6, celery-3.1.7, Django14-1.4.8-1.el6.noarch on CentOS 6.4

I have set up two Celery projects, i.e. non-Django and Django:

Non-Django project directory

/usr/local/proj
├── celery.py
├── celery.pyc
├── __init__.py
├── __init__.pyc
├── tasks.py
└── tasks.pyc

Django project directory

/usr/local/proj.django/
├── demoapp
│   ├── __init__.py
│   ├── models.py
│   ├── tasks.py
│   ├── tests.py
│   └── views.py
├── django.wsgi
├── manage.py
└── proj
    ├── celery.py
    ├── __init__.py
    ├── __init__.pyc
    ├── settings.py
    ├── settings.pyc
    ├── urls.py
    ├── urls.pyc
    ├── wsgi.py
    └── wsgi.pyc
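
For the Django side, the Celery 3.1 pattern is to define the app in `proj/celery.py` and load its configuration from the Django settings. A sketch following that documented layout (module names match the tree above; everything else is boilerplate from the Celery docs, not the question):

```python
# proj/celery.py -- Celery 3.1 Django integration sketch
from __future__ import absolute_import

import os

from celery import Celery

# make sure the Django settings module is set before configuring the app
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

from django.conf import settings  # noqa

app = Celery('proj')

# read CELERY_* settings from Django's settings.py
app.config_from_object('django.conf:settings')

# find tasks.py in each app listed in INSTALLED_APPS (e.g. demoapp)
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
```

`proj/__init__.py` would then import this app (`from .celery import app as celery_app`) so it is created whenever Django starts.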

The non-Django project is mounted over NFS on the Celery servers; I am able to submit tasks from tasks.py and can check their status in Celery Flower.

However, I am confused about how to do the same via the Celery Django project.

questions:

1. Do I need to share /usr/local/proj.django/proj via NFS to all Celery nodes?
2. Is it possible for the Django project to use the non-Django Celery project for all Celery tasks?

I might sound stupid here, so apologies...

Any advice would be appreciated...

Upvotes: 1

Views: 874

Answers (1)

Guy Gavriely

Reputation: 11396

There is absolutely no need to mount anything; you should be able to run Celery workers individually, each listening to the queue.

If your tasks have to use Django code, as I understand they do, consider replicating your code on all Celery nodes; for example, all nodes should pull from the same git repository, preferably with a tool like Fabric.

One thing to be aware of: enable remote access to your database (e.g. MySQL) from all nodes.

EDIT: by "no need to mount" I mean: have all needed tools installed on all nodes, i.e. pip install django celery ..., and have your code pulled from a central repo (e.g. git) as well. Depending on your hosting, maybe even replicate the machine itself and have identical machines all listening to the central queue.

Edit2:

Do I need to share /usr/local/proj.django/proj via NFS to all Celery nodes?

I guess it can work, though a better practice would be to pull from a central repo; see above.

Is it possible for the Django project to use the non-Django Celery project for all Celery tasks?

The question is where you run celery worker and with which parameters; see the Celery workers guide.

As a side note, consider installing/enabling the RabbitMQ management plugin to see exactly what is going on.

Upvotes: 1
