Reputation: 1317
I have a running supervisor job for my celery server. Now I need to add a new task to it, but unfortunately my celery worker command is not configured to pick up those code changes automatically.
Here is my celery command:
python manage.py celery worker --broker=amqp://username:password@localhost/our_app_vhost
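For reference, the worker is declared under /etc/supervisor/conf.d/ in a program section roughly like the one below (the program name and directory here are placeholders, not the exact values from my config):

; placeholder program name -- whatever is here is the <process_name> passed to supervisorctl
[program:celeryd]
command=python manage.py celery worker --broker=amqp://username:password@localhost/our_app_vhost
directory=/path/to/my/django/project
autostart=true
autorestart=true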
To restart my celery process, I have tried,
sudo supervisorctl -c /etc/supervisor/supervisord.conf restart <process_name>
supervisorctl stop all
supervisorctl start all
service supervisor restart
But none of these worked. How do I restart it?
Upvotes: 4
Views: 15973
Reputation: 61
If a task is running when you restart, celery waits for it to complete. So you may need to kill all running celery processes first. Run the following command to kill all celery processes:
kill -9 $(ps aux | grep celery | grep -v grep | awk '{print $2}' | tr '\n' ' ') > /dev/null 2>&1
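Before restarting, you can confirm that nothing survived (a quick check reusing the same ps/grep filter as above; the [c]elery pattern keeps grep from matching itself):

ps aux | grep [c]elery    # prints nothing once no celery worker is left running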
Restart celery:
sudo supervisorctl stop all
sudo supervisorctl start all
Upvotes: 2
Reputation: 4051
You can put the supervisor configuration for celery in /etc/supervisor/conf.d/. Create a new config file for celery, e.g. celery.conf.
Assuming your virtualenv is venv, your Django project is sample, and your celery script is in _celery.py, the file should look like:
[program:celery]
command=/home/ubuntu/.virtualenvs/venv/bin/celery --app=sample._celery:app worker --loglevel=INFO
directory=/home/ubuntu/sample/
user=ubuntu
numprocs=1
stdout_logfile=/home/ubuntu/logs/celery-worker.log
stderr_logfile=/home/ubuntu/logs/celery-error.log
autostart=true
autorestart=true
startsecs=10
; Need to wait for currently executing tasks to finish at shutdown.
; Increase this if you have very long running tasks.
stopwaitsecs = 600
; When resorting to send SIGKILL to the program to terminate it
; send SIGKILL to its whole process group instead,
; taking care of its children as well.
killasgroup=true
; if rabbitmq is supervised, set its priority higher
; so it starts first
priority=998
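For completeness, the _celery.py module that --app=sample._celery:app points at typically follows the standard Django/Celery app pattern; a minimal sketch (the settings module name sample.settings is an assumption) would be:

# sample/_celery.py -- minimal Celery app so that --app=sample._celery:app resolves
from __future__ import absolute_import
import os
from celery import Celery

# assumption: the Django settings module is sample.settings
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'sample.settings')

from django.conf import settings

app = Celery('sample')
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)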
After writing this supervisor program you need to make supervisor pick it up. If you added a new program, run this:
$ sudo supervisorctl reread
celery: available
If you added or updated a program, run this:
$ sudo supervisorctl update
celery: added process group
To check the status of your celery task
$ sudo supervisorctl status celery
celery RUNNING pid 18020, uptime 0:00:50
To stop the celery task
$ sudo supervisorctl stop celery
celery: stopped
To start the celery task
$ sudo supervisorctl start celery
celery: started
To restart the celery process (this stops and then starts the specified program)
$ sudo supervisorctl restart celery
celery: stopped
celery: started
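As an aside (not required for the config above): with numprocs=1 the program and its single process share the name celery. If you ever raise numprocs, you would also set process_name=%(program_name)s_%(process_num)02d and could then restart the whole group at once:

$ sudo supervisorctl restart celery:*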
Upvotes: 4
Reputation: 29534
If you want to manage processes with supervisorctl, you should configure the [supervisorctl], [unix_http_server] and [rpcinterface:supervisor] sections in your configuration file.
Here is a sample configuration file.
sample.conf
[supervisord]
logfile=/tmp/supervisord.log ; (main log file;default $CWD/supervisord.log)
logfile_maxbytes=50MB ; (max main logfile bytes b4 rotation;default 50MB)
logfile_backups=10 ; (num of main logfile rotation backups;default 10)
loglevel=info ; (log level;default info; others: debug,warn,trace)
pidfile=/tmp/supervisord.pid ; (supervisord pidfile;default supervisord.pid)
nodaemon=false ; (start in foreground if true;default false)
minfds=1024 ; (min. avail startup file descriptors;default 1024)
minprocs=200 ; (min. avail process descriptors;default 200)
[program:my_worker]
command = python manage.py celery worker --broker=amqp://username:password@localhost/our_app_vhost
[unix_http_server]
file=/tmp/supervisor.sock ; (the path to the socket file)
[supervisorctl]
serverurl=unix:///tmp/supervisor.sock ; use a unix:// URL for a unix socket
[rpcinterface:supervisor]
supervisor.rpcinterface_factory = supervisor.rpcinterface:make_main_rpcinterface
Now start supervisor with
supervisord -c sample.conf
Now if you want to restart your worker you can do it with
supervisorctl -c sample.conf restart my_worker
This restarts your worker. Alternatively, you can drop into the supervisor shell and restart it from there:
sudo supervisorctl -c sample.conf
supervisor> restart my_worker
my_worker: stopped
my_worker: started
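If the restart is needed because the supervisor config itself changed (for example you added another [program:...] section for a new worker), re-read the config first; a short sketch of that flow would be:

supervisorctl -c sample.conf reread     # pick up new or changed program sections
supervisorctl -c sample.conf update     # apply them (newly added programs are started)
supervisorctl -c sample.conf restart my_worker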
Note:
There is an option to autoreload workers in Celery
python manage.py celery worker --autoreload --broker=amqp://username:password@localhost/our_app_vhost
This should be used in development mode only. Using this in production is not recommended.
More about this in the Celery docs.
Upvotes: 9