Mickael_86130

Reputation: 213

How to run a Redis server AND another application inside Docker?

I created a Django application that runs inside a Docker container. I needed to run a background task from the Django application, so I used Celery with Redis as the Celery broker. If I install Redis in the Docker image (Ubuntu 14.04):

RUN apt-get update && apt-get -y install redis-server
RUN pip install redis

The Redis server is not launched: the Django application throws an exception because the connection is refused on port 6379. If I start Redis manually, it works.

If I start the Redis server with the following command, the build hangs:

RUN redis-server

If I try to tweak the previous line, it does not work either:

RUN nohup redis-server &

So my question is: is there a way to start Redis in the background and have it restart when the Docker container is restarted?

The Docker "last command" is already used with:

CMD uwsgi --http 0.0.0.0:8000 --module mymodule.wsgi

Upvotes: 16

Views: 13144

Answers (3)

Paul Becotte

Reputation: 9977

When you run a Docker container, there is always a single top-level process. When you fire up your laptop, that top-level process is an "init" script, systemd or the like. A Docker image has an ENTRYPOINT directive. This is the top-level process that runs in your Docker container, with anything else you want to run being a child of that. In order to run Django, a Celery worker, and Redis all inside a single Docker container, you would have to run a process that starts all three of them as child processes. As milan's answer explains, you could set up a Supervisor configuration to do that, and launch supervisord as your parent process.
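A minimal sketch of that last approach, assuming an Ubuntu-based image and a supervisord.conf along the lines of milan's answer below:

RUN apt-get update && apt-get -y install supervisor
COPY supervisord.conf /etc/supervisor/conf.d/supervisord.conf
# -n keeps supervisord in the foreground as the container's top-level process
CMD ["/usr/bin/supervisord", "-n"]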

Another option is to actually boot the init system. This will get you very close to what you want, since it will basically run things as though you had a full-scale virtual machine. However, you lose many of the benefits of containerization by doing that :)

The simplest way altogether is to run several containers using Docker Compose. A container for Django, one for your Celery worker, and another for Redis (and one for your data store as well?) is pretty easy to set up that way. For example...

# docker-compose.yml
web:
    image: myapp
    command: uwsgi --http 0.0.0.0:8000 --module mymodule.wsgi
    links:
      - redis
      - mysql
celeryd:
    image: myapp
    command: celery worker -A myapp.celery
    links:
      - redis
      - mysql
redis:
    image: redis
mysql:
    image: mysql

This would give you four containers for your four top-level processes. redis and mysql would be reachable from your app containers under the DNS names "redis" and "mysql", so instead of pointing at "localhost" you'd point at "redis".
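Concretely, your Django settings would then use those service names instead of localhost; a hypothetical fragment (Celery 3.x-era setting names, adjust to your versions):

# settings.py -- point Celery and the database at the Compose service names
BROKER_URL = "redis://redis:6379/0"
CELERY_RESULT_BACKEND = "redis://redis:6379/1"

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.mysql",
        "NAME": "myapp",
        "USER": "myapp",
        "PASSWORD": "secret",
        "HOST": "mysql",  # the Compose service name, not localhost
        "PORT": "3306",
    }
}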

There is a lot of good info in the Docker Compose docs.

Upvotes: 8

milan

Reputation: 12402

Use supervisord, which can control both processes. The conf file might look like this:

...
[program:redis]
command= /usr/bin/redis-server /srv/redis/redis.conf
stdout_logfile=/var/log/supervisor/redis-server.log
stderr_logfile=/var/log/supervisor/redis-server_err.log
autorestart=true

[program:nginx]
command=/usr/sbin/nginx
stdout_events_enabled=true
stderr_events_enabled=true
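
In the asker's setup, the second supervised program would be uwsgi rather than nginx; a hypothetical block along the same lines:

[program:uwsgi]
command=uwsgi --http 0.0.0.0:8000 --module mymodule.wsgi
autorestart=true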

Upvotes: 6

Henrik Sachse

Reputation: 54162

RUN commands only add new image layers; they are executed during the build of the image, not at container runtime.

Use CMD instead. You can combine multiple commands by externalizing them into a shell script which is invoked by CMD:

CMD start.sh

In the start.sh script you write the following:

#!/bin/bash
# Start Redis in the background, then run uwsgi as the container's foreground process
nohup redis-server &
uwsgi --http 0.0.0.0:8000 --module mymodule.wsgi
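
For CMD start.sh to work, the script also has to be in the image, on the PATH, and executable; hypothetical Dockerfile lines (paths assumed):

COPY start.sh /usr/local/bin/start.sh
RUN chmod +x /usr/local/bin/start.sh

One caveat worth noting: the container's lifetime is then tied to uwsgi (the foreground process), so if redis-server crashes, Docker will not notice or restart it. That is one reason the supervisord and Docker Compose approaches above are more robust.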

Upvotes: 9
