Reputation: 1101
Generally it is recommended to run a single process per Docker container. And that makes sense if you are trying to run a single web application that is built from several different tools.
For example, the open source web application Kanboard makes use of several components (a web server, PHP, and a database, for instance).
Now if that were the only web application I was going to run, then it would make sense to run each tool in a separate container to take advantage of Docker's one-process-per-container model.
But say that, instead of running only one web application, I wanted to run multiple web applications on the same host. How can I use Docker to isolate those web applications? I ask because each application might have its own set of backing services (database, cache, and so on).
There are two ways that I know of to use Docker to run those web applications:

1. Run each web application, together with all of its processes, in a single container (multiple processes per container).
2. Stick to one process per container: create one container each for MySQL, Postgres, SQLite, memcache and so on, plus one for each application's code, and use Docker linking to link the related containers together. This is messier; a lot more organizing and management is required.

My question is: is there any other way? And if there isn't, which of the above options should I choose, and why?
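To make the second option concrete, here is roughly what I mean by linking (all container and image names are just examples):

```shell
# One container per backing service...
docker run -d --name app1-db mysql
docker run -d --name app2-cache memcached

# ...and one container per application, linked to the services it needs
docker run -d --name app1 --link app1-db:db my-app1-image
docker run -d --name app2 --link app2-cache:cache my-app2-image
```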
Or maybe am I using the wrong tool (Docker containers) for the job? Perhaps there is another way of accomplishing application isolation without using Docker containers?
Upvotes: 4
Views: 1008
Reputation: 428
You say:

> This is more messy. Lot more organizing and management required.
I think it's completely the other way round. Here are my pros and cons:
multi-process:

pros:

- only one container per application to manage

cons:

- you need an extra supervisor process to start and watch everything (as the container's CMD or ENTRYPOINT). If anything fails you'll end up with a failing container

I had exactly the same task recently and I decided to go the 2nd approach due to the following reasons:

one process per container:

pros:

- each container does exactly one thing, so the database, cache and app containers can be reused, updated and restarted independently

cons:

- more containers to configure and wire together
I really recommend the 2nd approach. With tools like the mentioned docker-compose you "build" your app out of different containers, configured in one single docker-compose.yml.
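As a rough sketch (image names and ports are made up for illustration), such a docker-compose.yml could look like this:

```yaml
# Application container, linked to its database container
web:
  image: example/kanboard
  links:
    - db
  ports:
    - "8080:80"

# Backing service in its own container
db:
  image: mysql:5.6
  environment:
    MYSQL_ROOT_PASSWORD: example
```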
If you then use a tool like https://github.com/jwilder/nginx-proxy (I did; it works like a charm), even the reverse proxying is a simple thing and you can run X different pieces of software on one host.
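For example (VIRTUAL_HOST is the mechanism nginx-proxy uses to discover containers; the hostnames and app images below are placeholders):

```shell
# The proxy listens on port 80 and watches the Docker socket for containers
docker run -d -p 80:80 -v /var/run/docker.sock:/tmp/docker.sock:ro jwilder/nginx-proxy

# Any container started with VIRTUAL_HOST set gets proxied automatically
docker run -d -e VIRTUAL_HOST=redmine.example.com my-redmine-image
docker run -d -e VIRTUAL_HOST=jenkins.example.com my-jenkins-image
```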
This way we set up our Jenkins, Redmine, CMS and many more things for our company. Hope this helps you with your decision.
Upvotes: 0
Reputation: 8695
Your second approach is preferred in principle. Tools like docker-compose can help you fight the messiness of the linking.
Upvotes: 1
Reputation: 1324218
You can run multiple processes per container.
You simply need a base image that is able to manage the lifecycle of all those processes (see the "PID 1 zombie reaping issue"). Use a base image which knows how to do that: phusion/baseimage-docker.
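A minimal sketch of such an image (the service name, tag and start script are placeholders; phusion/baseimage-docker uses runit, with one run script per service under /etc/service):

```dockerfile
FROM phusion/baseimage:0.9.17

# my_init is the PID 1 provided by baseimage-docker; it reaps zombie processes
CMD ["/sbin/my_init"]

# Register the webapp as a runit service so it is started and supervised
RUN mkdir -p /etc/service/webapp
COPY webapp.sh /etc/service/webapp/run
RUN chmod +x /etc/service/webapp/run
```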
You will then have one container per webapp (with all its dependent processes).
Check whether you can factor some of those processes out into a container of their own.
Typically, NGiNX could run in only one additional container, acting as a reverse proxy to all your other webapps and allowing you to access them through the same URL (url/discourse would route to the container managing Discourse, url/plex to the one for Plex, and so on).
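A sketch of what that NGiNX container's configuration could contain (the upstream hostnames and ports are assumptions, not part of the question):

```nginx
# Route path prefixes to the linked application containers
server {
    listen 80;

    location /discourse/ {
        proxy_pass http://discourse:3000/;
    }

    location /plex/ {
        proxy_pass http://plex:32400/;
    }
}
```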
Upvotes: 0