Kabiljan Tanaguzov

Reputation: 323

Celery worker does not start in a Docker container

I have a k8s cluster where I deploy my Django application. I don't use celery beat, so I run a worker in every container alongside Django. For some reason this doesn't work: if I go into the pod and execute the command manually, everything is fine, but it doesn't work through the Dockerfile.

FROM python:3.10.2-slim-buster

# set environment variables
ENV PYTHONDONTWRITEBYTECODE 1
ENV PYTHONUNBUFFERED 1

RUN apt-get update && \
  apt-get install -y -q \
  openssl curl libpq-dev python3-dev build-essential && \
  apt-get clean

ADD sizze /app
WORKDIR /app

RUN pip install --upgrade pip
COPY requirements.txt .
RUN pip install -r requirements.txt

EXPOSE 8000
CMD ["mkdir", "logs", "&&", "touch", "logs/error.log", "&&", "touch", "logs/access.log"]
CMD ["celery", "-A", "app.celery", "worker", "-l", "info", "--detach"]
CMD ["python", "manage.py", "migrate"]
CMD ["gunicorn", "--bind", ":8000", "--workers", "2", "--timeout", "300", "--error-logfile", "logs/error.log", "--access-logfile", "logs/access.log", "--capture-output", "--log-level", "debug", "app.wsgi"]

Then if I go into the pod and check for the worker, it is not there.

If I run the celery worker manually, everything is fine, so why doesn't it start from the Dockerfile?

celery -A sizze.celery worker -l info --detach
celery@api-production-6c784b68d6-gn74d': {'ok': 'pong'}

Upvotes: 0

Views: 879

Answers (1)

aramcpp

Reputation: 364

There can only be one CMD instruction in a Dockerfile. If you list more than one CMD then only the last CMD will take effect.

source https://docs.docker.com/engine/reference/builder/#cmd

In your Dockerfile that means only the last CMD (the gunicorn one) actually runs; the mkdir, celery worker, and migrate commands are never executed. I suggest creating a shell script that contains all of your commands, e.g. run.sh, and adding CMD ["./run.sh"] to the Dockerfile.
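A minimal sketch of what that script could look like, assuming it is copied into /app and made executable (the commands are taken from your Dockerfile; adjust the Celery app module and paths to your project):

#!/bin/sh
set -e

# Create the log files gunicorn expects
mkdir -p logs
touch logs/error.log logs/access.log

# Start the Celery worker in the background (detached)
celery -A app.celery worker -l info --detach

# Apply database migrations
python manage.py migrate

# Run gunicorn in the foreground as the container's main process
exec gunicorn --bind :8000 --workers 2 --timeout 300 \
  --error-logfile logs/error.log --access-logfile logs/access.log \
  --capture-output --log-level debug app.wsgi

And in the Dockerfile, replace the four CMD lines with something like:

COPY run.sh .
RUN chmod +x run.sh
CMD ["./run.sh"]

Using exec for the final gunicorn command keeps it as the container's main process, so the container stays alive as long as gunicorn does.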

Upvotes: 1
