rura6502

Reputation: 385

Why isn't the postgres process running when RUN/CMD/ENTRYPOINT executes?

On my first try, I got an error with this Dockerfile:

FROM postgres:9.6-alpine
RUN psql

############# this is the error message
 => ERROR [2/2] RUN psql                                                                                                                                                0.6s 
------
 > [2/2] RUN psql:
#6 0.349 psql: could not connect to server: No such file or directory
#6 0.349        Is the server running locally and accepting
#6 0.349        connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5432"?

So I thought the cause of this error was that there is no postgres process. I then tested CMD psql and ENTRYPOINT psql, and when I ran the built images, every container got the same error. Why do these problems happen?

Upvotes: 0

Views: 456

Answers (3)

David Maze

Reputation: 159733

A Docker container only runs one process.

If you set the image's CMD to psql, this runs instead of the base image's normal CMD; you get a PostgreSQL client instead of the PostgreSQL server.

Setting ENTRYPOINT ["psql"] doesn't really make sense to me, since its effect is that the actual CMD gets appended to it as arguments; combined commands like psql bash don't actually make sense. This also replaces the extensive startup-time script in the base image. (But see @rassakra's answer for a specific pattern that uses it.)

If you RUN psql, that actually runs in its own container during the build phase. The Docker image doesn't persist running processes, so this is the same situation as CMD: you have a container running psql and nothing else, so there's no database server for it to connect to.

During the build sequence there are three additional problems. If you have a database running in another container, the image build isn't connected to any network in particular, so it can't connect to the database. If you were planning to run the database with some storage attached, that's not available yet at build time. In fact, the standard Docker Hub database images are configured so that you can't create a database image with embedded data (because the server data is in a declared VOLUME and changes to VOLUME directories are not persisted).
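If the underlying goal was to run SQL as part of setting up the image, the standard postgres image supports a different pattern: files copied into /docker-entrypoint-initdb.d/ are executed by the entrypoint the first time the container starts with an empty data directory, which is when the server actually exists. A minimal sketch (init.sql is an assumed local file containing your setup statements):

```dockerfile
# Sketch: run SQL at first container start, not at build time.
# Assumes a local init.sql with your setup statements.
FROM postgres:9.6-alpine
COPY init.sql /docker-entrypoint-initdb.d/
```

This keeps the base image's ENTRYPOINT and CMD intact, so the server still starts normally.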


If you're just trying to run psql, you can run it directly from the host, even if the database is in a container.

docker run -d -p 5678:5432 -e POSTGRES_PASSWORD=secret postgres
psql -h localhost -p 5678 -U postgres

Or, if installing standard client tools is somehow unacceptable, you can launch a second Docker container connecting to the first one. Both containers need to be on the same Docker network. The psql client container can use the standard postgres image but specify an alternate command to run after the image name.

docker network create some-network
docker run -d --net some-network --name db -e POSTGRES_PASSWORD=secret postgres
docker run --rm -it --net some-network --name client postgres \
  psql -h db -U postgres

Upvotes: 0

atline

Reputation: 31664

See the postgres image's Dockerfile:

Dockerfile:

ENTRYPOINT ["docker-entrypoint.sh"]
CMD ["postgres"]

and its docker-entrypoint.sh:

exec gosu postgres "$BASH_SOURCE" "$@"

And if you define multiple ENTRYPOINT or CMD instructions in a Dockerfile, only the last one takes effect. That means your CMD psql or ENTRYPOINT psql overrides the default server start command, so you still don't have a postgres server process.
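The combination rule can be sketched without Docker at all: the container's final command line is simply the ENTRYPOINT list followed by the CMD list, so swapping CMD swaps the server argument for a client one (a plain-shell illustration of the rule, not Docker itself):

```shell
#!/bin/sh
# Sketch: Docker's final argv = ENTRYPOINT + CMD (exec form).
entrypoint="docker-entrypoint.sh"

default_cmd="postgres"   # the base image's CMD: starts the server
override_cmd="psql"      # what "CMD psql" substitutes: a client

echo "default:  $entrypoint $default_cmd"
echo "override: $entrypoint $override_cmd"
# prints:
#   default:  docker-entrypoint.sh postgres
#   override: docker-entrypoint.sh psql
```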

Upvotes: 0

rassakra

Reputation: 1121

If you want to use psql directly via a Docker image, I have this solution:

FROM alpine:latest
RUN apk --update add postgresql-client

ENTRYPOINT ["psql"]

After that, build the image:

docker build -t test .

and

alias psql='docker run --rm -it test:latest'

And finally, start using psql:

psql --help
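One caveat with this alias: psql runs inside its own container, so localhost refers to that container, not your host. To reach a database running in another container, attach the client to the same Docker network, echoing the pattern in David Maze's answer (the network name some-network and container name db below are assumptions):

```shell
docker network create some-network
docker run -d --net some-network --name db -e POSTGRES_PASSWORD=secret postgres

# Redefine the alias so the client container joins the same network:
alias psql='docker run --rm -it --net some-network test:latest'
psql -h db -U postgres
```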


I hope that this can help you to resolve your issue.

Upvotes: 1
