Marcin_S

Reputation: 539

Run Two Services in One Docker Container

Below is my Dockerfile, from which I build an image and start my container:

ENTRYPOINT ["/usr/bin/docker-entrypoint.sh"]

My docker-entrypoint.sh is as follows:

#!/bin/bash
set -eo pipefail

if [ "${#}" -ne 0 ]; then
    exec "${@}"
else
    gunicorn \
        --bind  "0.0.0.0:${SUPERSET_PORT}" \
        --access-logfile '-' \
        --error-logfile '-' \
        --workers 10 \
        --worker-class gthread \
        -k gevent \
        --threads 20 \
        --timeout 6000 \
        --limit-request-line 0 \
        --limit-request-field_size 0 \
        "${FLASK_APP}"
    celery --app=superset.tasks.celery_app:app worker -Ofair -l INFO    
fi

The gunicorn server is running fine, but the celery worker is not coming up. Is there anything I am doing wrong here?

Update after answer:

gunicorn \
    --bind "0.0.0.0:${SUPERSET_PORT}" \
    --access-logfile '-' \
    --error-logfile '-' \
    --workers 10 \
    --worker-class gthread \
    -k gevent \
    --threads 20 \
    --timeout 6000 \
    --limit-request-line 0 \
    --daemon \
    --limit-request-field_size 0 \
    "${FLASK_APP}" &
celery --app=superset.tasks.celery_app:app worker -Ofair -l INFO

Upvotes: 1

Views: 697

Answers (1)

Giorgi Jambazishvili

Reputation: 743

What happens is that once the flow reaches the gunicorn call, it blocks and does not return until gunicorn exits, so the celery line is never reached.
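The blocking behaviour is easy to reproduce with two stand-in commands (the `sleep`/`echo` pairs below are placeholders for gunicorn and celery, not the real services): sending the first job to the background with `&` lets the second one start, and `wait` keeps the entrypoint alive until both finish.

```shell
#!/bin/sh
# Stand-ins for gunicorn and celery: without '&' on the first line,
# the second command would not start until the first one exits.
(sleep 0.2; echo "web server exited") &
web_pid=$!
(sleep 0.1; echo "worker exited") &
worker_pid=$!

# Block the entrypoint until both background jobs finish, so the
# container keeps running while either service is alive.
wait "$web_pid" "$worker_pid"
echo "all services stopped"
```

Note that with this pattern the container stays up even if one service dies, which is one reason a real process supervisor is usually preferred.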

I would consider using supervisor (a process control system).

On the other hand, if you would like a quick fix to test things, I'd suggest using the --daemon (or -D) flag, which will not block the execution flow and will detach the process (see the docs).
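If you go the supervisor route, a minimal supervisord.conf could look like the sketch below. The program names, log settings, and the exact commands are assumptions based on the gunicorn and celery invocations in the question, not a tested Superset config:

```ini
[supervisord]
nodaemon=true            ; keep supervisord in the foreground as PID 1

[program:gunicorn]
command=gunicorn --bind 0.0.0.0:%(ENV_SUPERSET_PORT)s --workers 10 -k gevent --timeout 6000 %(ENV_FLASK_APP)s
autorestart=true
stdout_logfile=/dev/stdout
stdout_logfile_maxbytes=0
stderr_logfile=/dev/stderr
stderr_logfile_maxbytes=0

[program:celery]
command=celery --app=superset.tasks.celery_app:app worker -Ofair -l INFO
autorestart=true
stdout_logfile=/dev/stdout
stdout_logfile_maxbytes=0
stderr_logfile=/dev/stderr
stderr_logfile_maxbytes=0
```

The Dockerfile's ENTRYPOINT would then point at supervisord (e.g. `supervisord -c /etc/supervisor/supervisord.conf`) instead of the shell script, and supervisor takes care of starting and restarting both processes.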

Upvotes: 1
