Marie

Reputation: 146

Spawning multiple containers from a single container using outside data

I've made two Python applications:

  1. a watchdog application, always running, that looks for new data in a folder. If new data is detected and matches all the conditions, it sends the data to the next step.
  2. a Python analysis application that processes the data and sends the result back to a folder near the watchdog application.

It works well, and the watchdog can call several instances of the Python analysis application without problems.

I now need to use Docker. I'm learning it.

I managed to make two Dockerfiles, and with some testing data already in the containers, the two images work well separately.

I want to find the best way to make it work with Docker.

My questions:

  1. I found on Stack Overflow that if you need two containers working together, you need to define a network in your Compose file. Since I only have one container running 24/7 and the rest are just instances of the same image, should I still make a network?

  2. What is the best way to load the data? Should the watchdog access a folder outside the container, copy the data into a volume (defined in the Compose file), and let the Python analysis containers access it and write their results back to this volume? Can this volume be accessed like a regular folder by non-tech users?

Please be tolerant of my questions; I'm very new to Docker. Many thanks!

Upvotes: 1

Views: 135

Answers (1)

The Fool

Reputation: 20497

If you run your images via Compose, they will automatically be in a Docker network together. That allows them to communicate: either by IP, which is not recommended since the IPs are dynamic, or by container name or Compose service name.

In the example below, the watchdog could reach the analysis service by the name analysis, for example http://analysis:8080.
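
If your services talk over HTTP, a minimal sketch of that call from the watchdog could look like this (the /process endpoint and the JSON payload are hypothetical; adjust them to whatever API your analysis app actually exposes):

import json
import urllib.request

def notify_analysis(filename: str) -> dict:
    # "analysis" resolves to the analysis containers via the Compose network
    payload = json.dumps({"file": filename}).encode("utf-8")
    req = urllib.request.Request(
        "http://analysis:8080/process",  # hypothetical endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)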

You also want to create a volume and let the containers share it, so that the watchdog can see the files the analysis service creates, if I understand correctly.

services:
  analysis:
    image: analysis
    build: ./analysis/
    volumes:
      - ./data:/data:delegated,rw
    deploy:
      replicas: 3

  watchdog:
    image: watchdog
    build: ./watchdog/
    volumes:
      - ./data:/data:cached,ro

Note that this makes use of a bind mount, meaning a local folder is mounted; in this case, the folder named data in the same directory as the Compose file.

There are also other options for volumes; you can read about them in the docs.

Also note the modifiers on the volumes:

  • rw means the container is allowed to read and write to the volume.
  • ro means read-only.
  • delegated and cached are useful to increase performance around bind mounts, at the cost of introducing some risk. You can read more about them here.
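
To make the shared folder concrete, here is a minimal polling sketch for the watchdog side, assuming both services mount the same ./data folder at /data as in the Compose file above (the *.csv pattern is a placeholder for your real matching conditions):

import time
from pathlib import Path

DATA_DIR = Path("/data")  # the shared bind mount from the Compose file
seen = set()

while True:
    for path in DATA_DIR.glob("*.csv"):  # placeholder for your real conditions
        if path not in seen:
            seen.add(path)
            print(f"new file detected: {path}")
            # hand the file off to the analysis step here
    time.sleep(5)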

As per your comment, you want to spawn other containers from the watchdog container.

Since you are already using Python, you could use the Docker Python SDK to do so.

To make this work, you could mount the Docker socket:

services:
  watchdog:
    image: watchdog
    build: ./watchdog/
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
      - ./data:/data

In your watchdog code, you can then instantiate a client like so:

import docker

# connect to the Docker daemon through the mounted socket
client = docker.DockerClient(base_url='unix://var/run/docker.sock')
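
From there, the watchdog could start one analysis container per detected file. Here is a sketch, assuming the image name analysis from the Compose file above. Note that the host path in volumes must be a path on the host machine, because the Docker daemon resolves bind mounts on the host, not inside the watchdog container:

import docker

client = docker.DockerClient(base_url='unix://var/run/docker.sock')

container = client.containers.run(
    "analysis",   # image name, as built by the Compose file above
    detach=True,  # return immediately instead of blocking
    volumes={
        # must be the host's path to the data folder (placeholder here),
        # not /data inside the watchdog container
        "/path/on/host/data": {"bind": "/data", "mode": "rw"},
    },
)
print(container.id)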

Upvotes: 1
