lapinkoira

Reputation: 8978

Developing with celery and docker

I have noticed that when developing with celery in a container, with something like this:

  celeryworker:
    build: .
    user: django
    command: celery -A project.celery worker -Q project -l DEBUG
    links:
     - redis
     - postgres
    depends_on:
      - redis
      - postgres
    env_file: .env
    environment:
      DJANGO_SETTINGS_MODULE: config.settings.celery

if I want to make some changes to a celery task, I have to completely rebuild the docker image in order to have the latest changes.

So I tried:

docker-compose -f celery.yml down
docker-compose -f celery.yml up

Nothing changed, then:

docker-compose -f celery.yml down
docker-compose -f celery.yml build
docker-compose -f celery.yml up

and then I have the new changes.

Is this the way to do it? It seems very slow to me, rebuilding the image all the time. Is it better to run celery locally, outside the docker containers?

Upvotes: 0

Views: 386

Answers (1)

AKX

Reputation: 168913

Mount your . (that is, your working copy) as a volume within the container you're developing in.

That way you're using the fresh code from your working directory without having to rebuild (unless, say, you're changing dependencies or something else that requires a rebuild).

The idea is explained here by Heroku, emphasis mine:

version: '2'
services:
  web:
    build: .
    ports:
      - "5000:5000"
    env_file: .env
    depends_on:
      - db
    volumes:
      - ./webapp:/opt/webapp  # <--- Whatever code your Celery workers need should be here
  db:
    image: postgres:latest
    ports:
      - "5432:5432"
  redis:
    image: redis:alpine
    ports:
      - "6379:6379"
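
Applied to the celeryworker service from the question, that might look like the sketch below (the /app mount target is an assumption; use whatever WORKDIR your image actually installs the code into). Note that the worker imports task code at startup, so after editing a task you still need to restart the container, but not rebuild the image:

  celeryworker:
    build: .
    user: django
    command: celery -A project.celery worker -Q project -l DEBUG
    depends_on:
      - redis
      - postgres
    env_file: .env
    environment:
      DJANGO_SETTINGS_MODULE: config.settings.celery
    volumes:
      - .:/app  # <--- bind-mount the working copy; /app is an assumption

With the mount in place, docker-compose -f celery.yml restart celeryworker picks up edited task code without a rebuild. If you also want automatic restarts on file changes, wrapping the worker command with the watchdog package's watchmedo auto-restart tool is a common approach.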

Upvotes: 2
