magnoz

Reputation: 1995

Problem running celery in a different docker container than the Flask app

I’m running a Flask application with Celery for submitting sub-processes, using docker-compose. However, I cannot get Celery to work when I try to run it in a separate container.

If I run Celery in the same container as the Flask app, it works, but it feels like the wrong approach: I’m coupling two different things in one container. I do this by adding the following line to the startup script, before the Flask app runs:

nohup celery worker -A app.controller.engine.celery -l info &

However, if I add Celery as a separate container in my docker-compose.yml, it doesn’t work. This is my config:

(..)

engine:
  image: engine:latest
  container_name: engine
  ports:
    - 5000:5000
  volumes:
    - $HOME/data/engine-import:/app/import
  depends_on:
    - mongo
    - redis
  environment:
    - HOST=localhost

celery:
  image: engine:latest
  environment:
    - C_FORCE_ROOT=true
  command: ["/bin/bash", "-c", "./start-celery.sh"]
  user: nobody
  depends_on:
    - redis

(..)

And this is the start-celery.sh:

#!/bin/bash
source ./env/bin/activate

cd ..
celery worker -A app.controller.engine.celery -l info

Its logs:

INFO:engineio:Server initialized for eventlet.
INFO:engineio:Server initialized for threading.
[2018-09-12 09:43:19,649: INFO/MainProcess] Connected to redis://redis:6379//
[2018-09-12 09:43:19,664: INFO/MainProcess] mingle: searching for neighbors
[2018-09-12 09:43:20,697: INFO/MainProcess] mingle: all alone
[2018-09-12 09:43:20,714: INFO/MainProcess] celery@8729618bd4bc ready.

And that’s all — the worker starts, but tasks are never submitted to it.

What can be missing?

Upvotes: 1

Views: 1276

Answers (1)

magnoz

Reputation: 1995

I've found that it works only if I add this to the docker-compose definition of the celery service:

environment:
  - C_FORCE_ROOT=true

I wonder, though, why I didn't get any error otherwise.
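
For reference, here is a consolidated sketch of the celery service, combining the definition from the question with the environment fix above (the image name, command, and script path are taken from the question and may differ in your setup):

```yaml
celery:
  image: engine:latest
  environment:
    - C_FORCE_ROOT=true   # flag Celery checks before refusing to run as the root user
  command: ["/bin/bash", "-c", "./start-celery.sh"]
  user: nobody
  depends_on:
    - redis
```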

Upvotes: 1
