Reputation: 11
On localhost, I used these commands to run the scheduler and the workers:

Run tasks (beat scheduler): python manage.py celery beat
Run workers: python manage.py celery worker --loglevel=info

I used Erlang OTP, the RabbitMQ server, and django-celery, and everything works fine locally. I then uploaded the project to an Ubuntu server and would like to daemonize these processes. For that I created the file /etc/default/celeryd with the config settings below.
# Name of nodes to start, here we have a single node
CELERYD_NODES="w1"
# or we could have three nodes:
#CELERYD_NODES="w1 w2 w3"
# Where to chdir at start.
CELERYD_CHDIR="/home/sandbox/myprojrepo/myproj"
# How to call "manage.py celeryd_multi"
CELERYD_MULTI="$CELERYD_CHDIR/manage.py celeryd_multi"
# How to call "manage.py celeryctl"
CELERYCTL="$CELERYD_CHDIR/manage.py celeryctl"
# Extra arguments to celeryd
CELERYD_OPTS="--time-limit=300 --concurrency=8"
CELERY_CONFIG_MODULE="celeryconfig"
# %n will be replaced with the nodename.
CELERYD_LOG_FILE="/var/log/celery/%n.log"
CELERYD_PID_FILE="/var/run/celery/%n.pid"
# Workers should run as an unprivileged user.
CELERYD_USER="celery"
CELERYD_GROUP="celery"
# Name of the project's settings module.
export DJANGO_SETTINGS_MODULE="settings"
I also created /etc/init.d/celeryd with an init script I downloaded.
Now when I try to execute /etc/init.d/celeryd start, it fails with "Unrecognized command line argument". I ran "celeryd-multi start nodeN" manually and it reported that nodeN started, but task execution still hasn't begun.
I am new to daemonizing and server hosting.
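For reference, I think the manual equivalent of what the init script should end up running is roughly this (a sketch pieced together from the config above; the actual script may build the command slightly differently):

cd /home/sandbox/myprojrepo/myproj
# same node name and options as in /etc/default/celeryd
python manage.py celeryd_multi start w1 --time-limit=300 --concurrency=8 \
    --logfile="/var/log/celery/%n.log" --pidfile="/var/run/celery/%n.pid"
# trace the init script to see the exact command it constructs
sh -x /etc/init.d/celeryd start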
Upvotes: 1
Views: 1190
Reputation: 1559
You can run Celery under supervisor: https://pypi.python.org/pypi/supervisor and http://thomassileo.com/blog/2012/08/20/how-to-keep-celery-running-with-supervisor/
hth.
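As a rough sketch (the paths, user and log file names below are assumptions carried over from the question, adjust them to your project), a supervisor config for the worker and the beat scheduler could look like this, e.g. in /etc/supervisor/conf.d/celery.conf:

; worker process (sketch, not a drop-in config)
[program:celery-worker]
command=python manage.py celery worker --loglevel=info
directory=/home/sandbox/myprojrepo/myproj
user=celery
numprocs=1
stdout_logfile=/var/log/celery/worker.log
stderr_logfile=/var/log/celery/worker.err
autostart=true
autorestart=true
; give running tasks some time to finish on shutdown
stopwaitsecs=600

; beat scheduler (sketch)
[program:celery-beat]
command=python manage.py celery beat --loglevel=info
directory=/home/sandbox/myprojrepo/myproj
user=celery
numprocs=1
stdout_logfile=/var/log/celery/beat.log
stderr_logfile=/var/log/celery/beat.err
autostart=true
autorestart=true

After adding the file, supervisorctl reread, supervisorctl update and supervisorctl status should load the programs and show whether they are running.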
Upvotes: 2