swimagers

Reputation: 71

Multi-service Node.js web application backend in one free Docker repository

I am fairly new to Docker and I haven't found the right workflow for me yet.

My goal is:

  1. To write a web application with a frontend, a GraphQL API and an AuthServer. Other microservices will follow.
  2. To easily deploy the application to my root server whenever I push a commit to my master branch in Bitbucket. I want to use the automated build from Docker Hub.
  3. To stay on the free Docker Hub plan, which includes only one free repository.

Is it possible to achieve these requirements if my project structure looks like the following?

- services
  - react frontend
    (I think it's okay to just put the built static files into the nginx html folder)
  - graphqlapi
    - Dockerfile
  - authservice
    - Dockerfile
  - another service in the future
    - Dockerfile
- docker-compose.yml

I have the docker-compose.yml in the root folder, but the automated build on Docker Hub says it needs a Dockerfile there to build the image.

For me it would be okay to run all the services in just one image/container, because currently I just want to have it all run on the same machine.

So again my question: Is it possible to dockerize a multi service web application into one docker image/container for the free docker hub repository?

Upvotes: 0

Views: 1139

Answers (1)

lifeisfoo

Reputation: 16374

TL;DR

It is possible to run multiple services inside a single container, but I highly discourage you from doing it.

Anyway, you can do this by bundling all services inside the same image and then either (1) running a single container that starts many services with a wrapper script, or (2) running them under a supervisor.

If your only constraint is the single image, you can do better: (3) run multiple containers from the same image, customizing the command each one runs.

Why running multiple processes in a container should be avoided

The Docker documentation says (bold is mine):

It is generally recommended that you separate areas of concern by using one service per container. That service may fork into multiple processes (for example, Apache web server starts multiple worker processes).

and then it continues:

It’s ok to have multiple processes, but to get the most benefit out of Docker, avoid one container being responsible for multiple aspects of your overall application.

This is because a Docker container is tied to a single process (usually defined by ENTRYPOINT/CMD in the Dockerfile), and when this process dies, the entire container is stopped.

Containers are designed to isolate services: if you want an isolated environment with many non-isolated services (as in the first two approaches described above), it's probably better to use a virtual machine.

The base

The common idea behind every approach is to pack all your applications into the same final image using a single Dockerfile:

FROM ubuntu

# RUN: install the dependencies for every app you have
# COPY: add all your binaries/apps (e.g. service 1, 2, 3), or build them here
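As a concrete sketch for the Node.js services described in the question — the base image, folder names, and `npm ci` build step are assumptions, so adjust them to your actual layout:

```dockerfile
# One image bundling every service; the service folders mirror the
# project structure from the question (hypothetical paths).
FROM node:18

WORKDIR /app

# Copy each service and install its dependencies
COPY services/graphqlapi/ ./graphqlapi/
COPY services/authservice/ ./authservice/
RUN cd graphqlapi && npm ci \
 && cd ../authservice && npm ci

# No CMD here on purpose: approaches (1) and (2) below add their own
# CMD, and approach (3) passes the command at `docker run` time.
```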

The wrapper script way (1)

Following the first example in the Docker documentation, you can start multiple services with a wrapper script that launches them in the background and then checks every minute that all of them are still running. When a service crashes, the entire container is stopped.

In this case your image will end with a command line like CMD ./my_wrapper_script.sh.
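A runnable sketch of that wrapper pattern is below. The `sh -c 'exit 1'` and `sleep` commands are stand-ins for your real services, and the one-second poll interval is only for demonstration (the Docker docs example checks every 60 seconds):

```shell
#!/bin/sh
# Wrapper-script pattern: start every service in the background,
# then poll; if one dies, tear the others down and leave.
sh -c 'exit 1' &   # stands in for a service that crashes right away
PID1=$!
sleep 30 &         # stands in for a healthy long-running service
PID2=$!

STATUS=running
# Poll every second; `kill -0` only checks whether the process exists.
# In a real container you would `exit 1` at this point so the whole
# container stops and your orchestrator can restart it.
while sleep 1; do
  kill -0 "$PID1" 2>/dev/null || { STATUS="service-1-died"; break; }
  kill -0 "$PID2" 2>/dev/null || { STATUS="service-2-died"; break; }
done

kill "$PID1" "$PID2" 2>/dev/null || true   # clean up the survivor
echo "$STATUS"
```

Running it prints which placeholder service died first; in a real image you would replace the placeholders with your service binaries and make the script the container's CMD.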

The supervisor way (2)

As suggested in the comment above, you can use a supervisor inside the container to run multiple services, avoiding the problem above. This way you have many processes managed by the supervisor, but if the supervisor crashes, all your services will be taken down (it is a single point of failure).

In this case your image will end with a line like CMD start-supervisor.
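For example, with supervisord a minimal configuration could look like the following — the program names and `/app/...` paths are placeholders, not something from the original setup:

```ini
; hypothetical /etc/supervisor/conf.d/services.conf
[supervisord]
nodaemon=true          ; keep supervisord in the foreground as PID 1

[program:graphql-api]
command=node /app/graphqlapi/index.js
autorestart=true       ; supervisord restarts the service if it crashes

[program:auth-service]
command=node /app/authservice/index.js
autorestart=true
```

The Dockerfile would then install supervisord and end with something like `CMD ["supervisord", "-n"]`.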

The custom command way (3)

If your only constraint is the single image and you can run multiple containers, this is the best approach. Just start multiple containers, passing the service to run as an explicit command (the last parameter):

docker run your-image your-service-1
docker run your-image your-service-2
docker run your-image your-service-3

You can also do this with a docker-compose file.
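A minimal docker-compose.yml sketch for this approach — the image and service names are placeholders for your own:

```yaml
version: "3"
services:
  service-1:
    image: your-dockerhub-user/your-image   # the single Docker Hub repo
    command: your-service-1
  service-2:
    image: your-dockerhub-user/your-image
    command: your-service-2
  service-3:
    image: your-dockerhub-user/your-image
    command: your-service-3
```

Each container runs a single service from the same image, so you still only need one Docker Hub repository.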

With this approach you don't break the one-service-per-container rule, and you get a more resilient deployment.

Upvotes: 1
