Alex Godwin

Reputation: 13

Celery worker in docker won't get correct message broker

I'm creating a flask service using an app factory pattern and I need to use celery for async tasks. I'm also using docker and docker-compose to contain and run everything. My structure looks like this:

server
 |
 +-- manage.py
 +-- docker-compose.yml
 +-- requirements.txt
 +-- Dockerfile
 |
 +-- project
     |
     +-- __init__.py
     +-- api
         |
         +-- tasks.py

My tasks.py file looks like this:

from project import celery_app

@celery_app.task
def celery_check(test):
    print(test)

I start the app with manage.py, which looks like this:

# manage.py

from flask_script import Manager
from project import create_app

app = create_app()
manager = Manager(app)

if __name__ == '__main__':
    manager.run()

And my __init__.py looks like this:

# project/__init__.py

import os
import json
from flask_mongoalchemy import MongoAlchemy
from flask_cas import CAS
from flask import Flask
from itsdangerous import JSONWebSignatureSerializer as JWT
from flask_httpauth import HTTPTokenAuth
from celery import Celery

# instantiate the database and CAS
db = MongoAlchemy()
cas = CAS()

# Auth stuff (ReplaceMe is replaced below in create_app())
jwt = JWT("ReplaceMe")
auth = HTTPTokenAuth('Bearer')
celery_app = Celery(__name__, broker=os.environ.get("CELERY_BROKER_URL"))


def create_app():
    # instantiate the app
    app = Flask(__name__, template_folder='client/templates', static_folder='client/static')

    # set config
    app_settings = os.getenv('APP_SETTINGS')
    app.config.from_object(app_settings)

    # Send new static files every time if debug is enabled
    if app.debug:
        app.config['SEND_FILE_MAX_AGE_DEFAULT'] = 0

    # Get the secret keys
    parse_secret(app.config['CONFIG_FILE'], app)

    celery_app.conf.update(app.config)
    print(celery_app.conf)

    # set up extensions
    db.init_app(app)
    cas.init_app(app)
    # Replace the secret key with the app's
    jwt.secret_key = app.config["SECRET_KEY"]

    parse_config(app.config['CONFIG_FILE'])

    # register blueprints
    from project.api.views import twist_blueprint
    app.register_blueprint(twist_blueprint)

    return app

In my docker-compose I start a worker and define some environment variables like this:

version: '2.1'

services:
  twist-service:
    container_name: twist-service
    build: .
    volumes:
      - '.:/usr/src/app'
    ports:
      - 5001:5000 # expose ports - HOST:CONTAINER
    environment:
      - APP_SETTINGS=project.config.DevelopmentConfig
      - DATABASE_NAME_TESTING=testing
      - DATABASE_NAME_DEV=dev
      - DATABASE_URL=twist-database
      - CONFIG_FILE=./project/default_config.json
      - MONGO_PASSWORD=user
      - CELERY_RESULT_BACKEND=redis://redis:6379
      - CELERY_BROKER_URL=redis://redis:6379/0
      - MONGO_PORT=27017
    depends_on:
      - celery
      - twist-database
  celery:
    container_name: celery
    build: .
    command: celery -A project.api.tasks --loglevel=debug worker
    volumes:
      - '.:/usr/src/app'
  twist-database:
    image: mongo:latest
    container_name: "twist-database"
    environment:
      - MONGO_DATA_DIR=/data/db
      - MONGO_USER=mongo
    volumes:
      - /data/db
    ports:
      - 27017:27017  # expose ports - HOST:CONTAINER
    command: mongod
  redis:
    image: "redis:alpine"
    command: redis-server
    volumes:
      - '/redis'
    ports:
      - '6379:6379'

However, when I run docker-compose and the containers come up, I end up with this in the celery worker logs:

[2017-07-20 16:53:06,721: ERROR/MainProcess] consumer: Cannot connect to amqp://guest:**@127.0.0.1:5672//: [Errno 111] Connection refused.

This means the worker is ignoring the Redis broker that was configured when the Celery app was created, and is instead falling back to Celery's default RabbitMQ broker (amqp on localhost). I've tried changing project.api.tasks to project and to project.celery_app, but to no avail.
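
A quick way to check what the worker actually sees (using the celery service name from the compose file above) is to dump its environment:

docker-compose run --rm celery env | grep CELERY

If that prints nothing, os.environ.get("CELERY_BROKER_URL") in project/__init__.py returns None when the worker imports the project package, which would explain the fallback.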

Upvotes: 1

Views: 1528

Answers (2)

Anis

Reputation: 3094

It seems to me like the celery service should have the environment variables CELERY_RESULT_BACKEND and CELERY_BROKER_URL as well. They are only set on twist-service, so inside the celery container os.environ.get("CELERY_BROKER_URL") returns None and Celery falls back to its default amqp:// broker, which is exactly the connection refused in your log.
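
For example, something like this in your docker-compose.yml (a sketch based on the compose file in the question; the depends_on on redis is my addition, so the worker starts after the broker):

  celery:
    container_name: celery
    build: .
    command: celery -A project.api.tasks --loglevel=debug worker
    volumes:
      - '.:/usr/src/app'
    environment:
      - CELERY_RESULT_BACKEND=redis://redis:6379
      - CELERY_BROKER_URL=redis://redis:6379/0
    depends_on:
      - redis

With these set, the Celery(__name__, broker=os.environ.get("CELERY_BROKER_URL")) call in project/__init__.py picks up the Redis URL inside the worker container too.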

Upvotes: 5

Brian Bruggeman

Reputation: 5324

You need to link the docker services together. The most straightforward mechanism to do this is to add a networks section to your docker-compose.yml.
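
Something along these lines (a sketch; the twist-net name is mine, and note that compose file format v2 already attaches all services in one file to a shared default network, so this mostly makes the wiring explicit):

version: '2.1'

services:
  celery:
    build: .
    command: celery -A project.api.tasks --loglevel=debug worker
    networks:
      - twist-net
  redis:
    image: 'redis:alpine'
    networks:
      - twist-net

networks:
  twist-net: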

Upvotes: 0
