Snobby

Reputation: 1155

Docker. Celery and code in different containers

I want to make additional container for celery workers. So the structure should be the following:

celery_container - Celery
code_container - RabbitMQ, DB, code, everything else

I know how to organise a network, so celery is connected to Rabbit in another container.

But I can't figure out whether I should keep my code in both containers.

My tasks are executed both by Celery workers and synchronously, so for now the only option I see is to run both containers with the --volume param, like this:

docker run \
-tid \
-v $(pwd):/home \
--name code_container \
code_container

docker run \
-tid \
-v $(pwd):/home \
--name celery_container \
celery_container
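
For reference, a minimal sketch of wiring both containers onto a shared user-defined network (the network name app_net is just an example), so Celery can reach RabbitMQ in the other container:

# create a shared bridge network once
docker network create app_net

docker run \
-tid \
-v $(pwd):/home \
--network app_net \
--name code_container \
code_container

docker run \
-tid \
-v $(pwd):/home \
--network app_net \
--name celery_container \
celery_container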

Upvotes: 5

Views: 1196

Answers (2)

Snobby

Reputation: 1155

As I understand it, the best way is to keep the code in both containers: the one running the app code and the one running Celery.

It's useful to build something like a base image that contains almost all of the dependencies plus the app code. Then you can build both the code container and the Celery container from that image. If you ever need another container with the code inside, just reuse this base image and extend its Dockerfile with the appropriate process.
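
A minimal sketch of that layout, assuming a Python app with a requirements.txt (the image name myapp-base and the module name myapp are placeholders, not from the question):

# Dockerfile.base -- dependencies and app code shared by every image
FROM python:3.9-slim
WORKDIR /home/app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .

# Dockerfile.celery -- built FROM the base image, only the process differs
FROM myapp-base:latest
CMD ["celery", "-A", "myapp", "worker", "--loglevel=info"]

Build the base first (docker build -t myapp-base -f Dockerfile.base .), then the Celery image; the code container can use the base image directly or add its own CMD.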

Upvotes: 3

Athul Krishna

Reputation: 41

You can easily use docker-compose to link the containers. Create code_container and add it as a link for the celery service in docker-compose, as shown:

celery:
  ports:
    - ":"
  links:
    - code_container

Now the database, or whatever else you need, will be reachable from the Celery container at code_container:port.
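
A slightly fuller docker-compose sketch along those lines (the image names and the RabbitMQ port 5672 are assumptions, not taken from the question):

# code_container runs RabbitMQ, the DB and the app code
code_container:
  image: code_container
  ports:
    - "5672:5672"
# celery links to it, so the hostname code_container resolves inside this container
celery:
  image: celery_container
  links:
    - code_container

With the link in place, the worker can point its broker URL at code_container:5672 (the default RabbitMQ port).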

Upvotes: 2
