Reputation: 307
I have a simple Dockerfile:
FROM base
RUN <code which installs redis>
RUN npm install redis-adapter
EXPOSE 6379
ENTRYPOINT redis-server --daemonize yes && /app/tasks/redis/entrypoint.sh
And in my entrypoint I'm setting some configuration keys and loading some data into Redis via Node:
#!/bin/sh
redis-cli hset app:cfg env dev
redis-cli hset app:cfg maxconnections 1024
node /app/tasks/redis/init.js
The image builds successfully, but when I run it, nothing happens. What's the problem? What should I do to run Redis in a container and apply some configuration afterwards? Maybe the trouble is that I'm running Redis as a daemon?
Upvotes: 1
Views: 1152
Reputation: 307
Answer from author
TL;DR
There is a pretty similar question on Stack Overflow which helped to fix my problem:
The problem was that a Docker ENTRYPOINT or CMD should spawn a single process. So I moved the Redis startup and the node init.js execution into supervisord as separate programs. Here is the supervisord.conf, for example:
[supervisord]
; stay in the foreground so supervisord remains the container's main process
nodaemon=true
loglevel=debug

[program:redis]
; start Redis first (a lower priority starts earlier)
priority=1
command=redis-server

[program:configurations]
; then run the configuration script
priority=2
command=/bin/sh -c /app/tasks/redis/entrypoint.sh
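
For completeness, a minimal sketch of how the Dockerfile could wire this up; the supervisor install command and the /etc/supervisord.conf path are assumptions that depend on your base image:

FROM base
RUN <code which installs redis>
RUN npm install redis-adapter
# install supervisord; the package name and manager depend on the base image (assumption)
RUN apt-get update && apt-get install -y supervisor
# copy the config shown above to the path supervisord is told to read (assumed path)
COPY supervisord.conf /etc/supervisord.conf
EXPOSE 6379
# supervisord becomes the single foreground process of the container
ENTRYPOINT ["supervisord", "-c", "/etc/supervisord.conf"]

With nodaemon=true, supervisord itself stays in the foreground, so the container keeps running as long as supervisord does.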
Why did I do that?
The main trouble I had with this issue was a misunderstanding of what a Docker container actually is and what ENTRYPOINT or CMD do in Docker. I thought that I should just "run some server in Docker, expose some port, and Docker will do everything by itself", but that is not the way containers work. There is a difference between containers and VMs; see: How is Docker different from a virtual machine?
When you think of a Docker container as a wrapper around one single process, it becomes clear that the code in my Dockerfile could not work the way I expected.
If you need to run multiple processes in a Docker container, you should use something like supervisord, or concurrently if you prefer the Node ecosystem; see the sketch below.
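
As a rough illustration of the concurrently route (untested; assumes concurrently is installed, e.g. via npm install -g concurrently), the ENTRYPOINT could look like this:

# run Redis and the init script side by side under one foreground process
ENTRYPOINT concurrently "redis-server" "/app/tasks/redis/entrypoint.sh"

Note that in either setup the init script may need to wait until Redis accepts connections before issuing redis-cli commands.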
Upvotes: 1