Reputation: 1661
I was wondering if anyone had any idea how to run compose on Batch Shipyard, or short of directly using compose, allowing multiple containers to work in concert on the same node in a job. Is this possible?
To clarify - I have several containers that work together to parse and process a file. The main container utilizes other services via network calls. A simplified example of the compose file that I want to replicate is like so:
```yaml
version: "3"
services:
  primary:
    image: "primaryimage"
    depends_on:
      - secondary
    environment:
      SECONDARY_URL: secondary:8888
  secondary:
    image: secondaryimage
```
Code run in `primary` makes calls to the URL given in `SECONDARY_URL` to perform some transformations on the data, and a response is returned.
Upvotes: 1
Views: 597
Reputation: 2369
Here is a workaround to allow this to work. Note that as a workaround, it is not officially supported and may break at any time.
In the `tasks` specification in `jobs.yaml`:

- `docker_image` is the image you created above
- use `resource_files` or `input_data` to ingress your compose file for the task
- `additional_docker_run_options` should have an item: `-v /var/run/docker.sock:/var/run/docker.sock`
- `command` would be your `docker-compose up` command with whatever arguments you require

`config.yaml` specifies all of the Docker images required by your Docker Compose file and the Docker image that contains Docker/docker-compose itself.

Upvotes: 0
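The steps above might look roughly like the following fragments. This is a hedged sketch, not verified against a live pool: the image name `mycompose/runner`, the storage link alias `mystorage`, and the blob URL are placeholders you would replace with your own, and field names should be checked against the Batch Shipyard configuration docs for your version.

```yaml
# jobs.yaml (sketch) -- one task that runs docker-compose via the host socket
job_specifications:
- id: composejob
  tasks:
  - docker_image: mycompose/runner   # hypothetical image containing docker/docker-compose
    resource_files:
    - file_path: docker-compose.yml
      blob_source: https://<account>.blob.core.windows.net/files/docker-compose.yml  # placeholder
    additional_docker_run_options:
    # expose the host Docker daemon inside the task container
    - -v /var/run/docker.sock:/var/run/docker.sock
    command: docker-compose up

# config.yaml (sketch) -- pre-pull every image the compose file needs,
# plus the image that carries docker-compose itself
global_resources:
  docker_images:
  - primaryimage
  - secondaryimage
  - mycompose/runner
```

Because the compose containers are started by the host daemon (via the mounted socket) rather than by Shipyard, they run outside Shipyard's task lifecycle, which is part of why this is unsupported.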
Reputation: 489
Azure Batch (Shipyard) does not have out-of-the-box support for Docker Compose.
But if you are already familiar with Docker Compose, then it's not too hard to convert it to Shipyard configuration files.
If you want to run MPI/multi-instance tasks (a cluster of nodes cooperating on solving parts of a computation) then take a look at this.
However, Service Fabric does support Docker Compose. So if Docker Compose support is a strict requirement, you could combine your Azure Batch setup with calls to a Service Fabric cluster.
Upvotes: 2