Reputation: 51
I am trying to use the GitLab CI PostgreSQL service for my integration tests, but it doesn't work.
Here's the code of the stage:
integration_test:
  stage: test
  tags:
    - custom_tag
  services:
    - postgres
  variables:
    POSTGRES_DB: test
    POSTGRES_HOST: postgres
    POSTGRES_USER: postgres
    POSTGRES_PASSWORD: postgres
    POSTGRES_HOST_AUTH_METHOD: trust
  script:
    - docker login -u ${DOCKER_USER} -p ${DOCKER_PASSWORD} ${DOCKER_REGISTRY}
    - docker pull ${DOCKER_IMAGE_CI}
    - export PGPASSWORD=${POSTGRES_PASSWORD}
    - docker run --rm postgres psql -h ${POSTGRES_HOST} -U ${POSTGRES_USER} -d ${POSTGRES_DB} -c "SELECT 'OK' AS status;"
It returns an error like this:
psql: error: could not connect to server: could not translate host name "postgres" to address: Name or service not known
Can anybody help me?
Upvotes: 5
Views: 4215
Reputation: 123
Perhaps it's better to look at dockerizing the test functions. This approach also gives better control over networking by means of a Docker bridge network. Your config could then look like this:
.gitlab-ci.yml:
stages:
  - test

before_script:
  - docker login -u ${DOCKER_USER} -p ${DOCKER_PASSWORD} ${DOCKER_REGISTRY}

integration_test:
  stage: test
  script:
    - docker-compose build
    - docker-compose up
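One hedged aside: with a long-running database service, plain docker-compose up stays attached and the job result won't reflect the test container's exit status. Assuming the compose service is named test-container, as in the docker-compose.yml below, a variant that propagates the test container's exit code could look like:

# Hedged variant of the last script step: --exit-code-from makes the job's
# result follow the test container and implies --abort-on-container-exit,
# so the long-running postgres-db service is stopped once the test finishes.
docker-compose up --exit-code-from test-container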
docker-compose.yml:
version: '3'

networks:
  database:

services:
  postgres-db:
    image: ${DOCKER_IMAGE_CI}
    networks:
      - database
    container_name: postgres
  test-container:
    build:
      context: .
      dockerfile: Dockerfile
    networks:
      - database
    container_name: testcon
Dockerfile:
FROM postgres
ENV POSTGRES_DB=test \
    POSTGRES_HOST=postgres \
    POSTGRES_USER=postgres \
    POSTGRES_PASSWORD=postgres \
    POSTGRES_HOST_AUTH_METHOD=trust
CMD psql -h ${POSTGRES_HOST} -U ${POSTGRES_USER} -d ${POSTGRES_DB} -c "SELECT 'OK' AS status;"
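One caveat with this setup: the test container's CMD runs as soon as the container starts, which can be before postgres-db accepts connections. A hedged sketch of a retry wrapper (the file name test.sh is hypothetical; it would replace the CMD above, e.g. COPY test.sh / and CMD ["/test.sh"]):

#!/bin/sh
# Wait until the database service accepts connections, then run the check.
# pg_isready ships with the postgres image used as the base above.
until pg_isready -h "${POSTGRES_HOST}" -U "${POSTGRES_USER}"; do
  echo "Waiting for ${POSTGRES_HOST}..."
  sleep 1
done
psql -h "${POSTGRES_HOST}" -U "${POSTGRES_USER}" -d "${POSTGRES_DB}" -c "SELECT 'OK' AS status;"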
Upvotes: 1
Reputation: 4366
Your pipeline looks like it uses the shell executor with GitLab services.
The command docker run --rm postgres <command> does not automatically connect to the postgres service's network. You could try running your Docker image with --link postgres; more details here. Note that --link is a legacy feature and may be removed in the future.
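A hedged sketch of that form, assuming the service container on the runner host is actually named postgres (check docker ps there; gitlab-runner may give it a longer generated name, in which case adjust the link source):

# Link the client container to the running postgres container under the
# alias "postgres" so the host name resolves inside the client container.
docker run --rm --link postgres:postgres postgres \
  psql -h postgres -U ${POSTGRES_USER} -d ${POSTGRES_DB} -c "SELECT 'OK' AS status;"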
Personally, I would try running the pipeline job with the Docker executor and an image. If your image is not publicly visible, you can handle registry authentication with DOCKER_AUTH_CONFIG.
With the Docker executor and a password-protected image, the YAML would look like this:
integration_test:
  image: ${DOCKER_IMAGE_CI}
  stage: test
  tags:
    - custom_tag
  services:
    - postgres
  variables:
    POSTGRES_DB: test
    POSTGRES_HOST: postgres
    POSTGRES_USER: postgres
    POSTGRES_PASSWORD: postgres
    POSTGRES_HOST_AUTH_METHOD: trust
  script:
    - PGPASSWORD=${POSTGRES_PASSWORD} psql -h ${POSTGRES_HOST} -U ${POSTGRES_USER} -d ${POSTGRES_DB} -c "SELECT 'OK' AS status;"
The DOCKER_AUTH_CONFIG environment variable would be as follows (GitLab docs):
{
  "auths": {
    "${DOCKER_REGISTRY}": {
      "auth": "(Base64 content from ${DOCKER_USER}:${DOCKER_PASSWORD})"
    }
  }
}
To generate the Base64 auth string you can use echo -n "${DOCKER_USER}:${DOCKER_PASSWORD}" | base64
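A hedged sketch of building the full DOCKER_AUTH_CONFIG value in one go (run locally, then paste the printed JSON into a CI/CD variable of that name):

# Encode the credentials and substitute them into the auths JSON shown above.
AUTH=$(echo -n "${DOCKER_USER}:${DOCKER_PASSWORD}" | base64)
printf '{"auths": {"%s": {"auth": "%s"}}}\n' "${DOCKER_REGISTRY}" "${AUTH}"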
Upvotes: 0