Binoy Mathew

Reputation: 143

ERROR: Pidfile (celerybeat.pid) already exists

I am getting this issue while rebuilding and restarting the cookiecutter-django docker-compose stack in production. I am able to solve it either by removing all stopped Docker containers or by adding rm -f './celerybeat.pid' to /compose/production/django/celery/beat/start.sh, similar to /compose/local/django/celery/beat/start.sh. Is there any reason for not including this specific code in the production version of the compose file?
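
For context, the workaround in the production start script looks roughly like this (a sketch only; the celery invocation below is illustrative, and the actual app path in the script may differ):

#!/bin/sh

set -o errexit
set -o nounset

# Remove a stale pidfile left behind by the previous container run,
# then start beat as usual.
rm -f './celerybeat.pid'
celery -A config.celery_app beat -l INFO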

Upvotes: 6

Views: 7437

Answers (5)

ViaTech

Reputation: 2843

There was an earlier post on how to fix this issue by setting the PID file to an empty value in the run command, but the solution was not complete and took a bit of trial and error to get working on my production system. So I figured I'd post a docker-compose file with a beats service whose command creates a new celerybeat.pid file when it starts.

As a note, I am using django-celery-beat: https://pypi.org/project/django-celery-beat/

version: '3'

services:

  redis:
    image: redis
    restart: unless-stopped
    ports:
      - "6379"

  beats:
    build: .
    user: user1
    # note the --pidfile= in this command
    command: celery --pidfile= -A YOURPROJECT beat -l info --scheduler django_celery_beat.schedulers:DatabaseScheduler
    env_file: ./.env.prod
    restart: unless-stopped
    volumes:
      - .:/code
      - tmp:/tmp
    links:
      - redis
    depends_on:
      - redis

volumes:
  tmp:

Doing this, I no longer get the "ERROR: Pidfile (celerybeat.pid) already exists" error, and I do not have to run an rm command.

Upvotes: 1

Artur Drożdżyk

Reputation: 615

Please, take a look here:

Disable pidfile for celerybeat

You can specify the pidfile without a location, so that it will be recreated each time Celery starts:

--pidfile=
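
For example, the full beat command with an empty pidfile might look like this (the project name here is just a placeholder):

celery -A yourproject beat -l info --pidfile=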

Upvotes: 2

mustang

Reputation: 161

If you can live without a separate beat process, there's a way for Celery to handle periodic tasks itself: pass the -B flag to the worker. When you do this, no .pid file is generated; a celerybeat-schedule file is generated instead. When you rerun Celery, it won't complain about reusing this file. As far as source control goes, just add it to your .gitignore. (The -E flag in the command below additionally enables task events.)

Here's the command in full form:

celery -A <appname> worker -l info -BE

Upvotes: 1

Victor Villacorta

Reputation: 617

Another way: create a Django management command, celery_kill.py:

import shlex
import subprocess

from django.core.management.base import BaseCommand


class Command(BaseCommand):
    """Force-kill any running Celery processes so a fresh start never hits a stale pidfile."""

    def handle(self, *args, **options):
        kill_worker_cmd = 'pkill -9 celery'
        subprocess.call(shlex.split(kill_worker_cmd))

docker-compose.yml :

celery:
    build: ./src
    restart: always
    command: celery -A project worker -l info
    volumes:
      - ./src:/var/lib/celery/data/
    depends_on:
      - db
      - redis
      - app

celery-beat:
    build: ./src
    restart: always
    command: celery -A project beat -l info --pidfile=/tmp/celeryd.pid
    volumes:
      - ./src:/var/lib/beat/data/
    depends_on:
      - db
      - redis
      - app

and the Makefile:

run:
    docker-compose up -d --force-recreate
    docker-compose exec app python manage.py celery_kill
    docker-compose restart
    docker-compose exec app python manage.py migrate

Upvotes: 0

Siyu

Reputation: 12139

You can use celery worker --pidfile=/path/to/celeryd.pid to specify a non-mounted path so that the pidfile is not mirrored on the host.
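
For example, the worker command might look like this (assuming /tmp is not a mounted volume; the project name is a placeholder):

celery -A yourproject worker -l info --pidfile=/tmp/celeryd.pid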

Upvotes: 0
