Cannot connect to postgres in Gitlab CI - postgresql

I am trying to use the GitLab CI PostgreSQL service for my integration tests, but it doesn't work.
Here's the code of the stage:
integration_test:
  stage: test
  tags:
    - custom_tag
  services:
    - postgres
  variables:
    POSTGRES_DB: test
    POSTGRES_HOST: postgres
    POSTGRES_USER: postgres
    POSTGRES_PASSWORD: postgres
    POSTGRES_HOST_AUTH_METHOD: trust
  script:
    - docker login -u ${DOCKER_USER} -p ${DOCKER_PASSWORD} ${DOCKER_REGISTRY}
    - docker pull ${DOCKER_IMAGE_CI}
    - export PGPASSWORD=${POSTGRES_PASSWORD}
    - docker run --rm postgres psql -h ${POSTGRES_HOST} -U ${POSTGRES_USER} -d ${POSTGRES_DB} -c "SELECT 'OK' AS status;"
It returns an error like this:
psql: error: could not connect to server: could not translate host name "postgres" to address: Name or service not known
Can anybody help me?

Perhaps it's better to look at dockerizing the test functions. This approach also gives you better control over networking by means of a Docker bridge network.
That way, your config could look like this:
.gitlab-ci.yml:
stages:
  - test
before_script:
  - docker login -u ${DOCKER_USER} -p ${DOCKER_PASSWORD} ${DOCKER_REGISTRY}
integration_test:
  stage: test
  script:
    - docker-compose build
    - docker-compose up
docker-compose.yml:
version: '3'
networks:
  database:
services:
  postgres-db:
    image: ${DOCKER_IMAGE_CI}
    networks:
      - database
    container_name: postgres
  test-container:
    build:
      context: .
      dockerfile: Dockerfile
    networks:
      - database
    container_name: testcon
Dockerfile:
FROM postgres
ENV POSTGRES_DB=test \
    POSTGRES_HOST=postgres \
    POSTGRES_USER=postgres \
    POSTGRES_PASSWORD=postgres \
    POSTGRES_HOST_AUTH_METHOD=trust
CMD psql -h ${POSTGRES_HOST} -U ${POSTGRES_USER} -d ${POSTGRES_DB} -c "SELECT 'OK' AS status;"

Your pipeline looks like it uses the shell executor together with GitLab services.
The command docker run --rm postgres <docker command> does not automatically connect to the postgres service's network. You could try running your Docker image with --link postgres (more details here). Note that --link is a legacy feature and may be removed in the future.
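A minimal, untested sketch of that approach (assuming the service container is reachable under the name postgres on the runner host; the name the runner actually assigns may differ):
# Hedged sketch: link the throwaway psql container to the postgres service container
docker run --rm --link postgres:postgres postgres \
  psql -h postgres -U postgres -d test -c "SELECT 'OK' AS status;"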
Personally, I would try running the pipeline job with a Docker runner and an image. If your image is not publicly visible, you can handle registry authentication with DOCKER_AUTH_CONFIG.
If you used a Docker runner with a password-protected image, the YAML would look like this:
integration_test:
  image: ${DOCKER_IMAGE_CI}
  stage: test
  tags:
    - custom_tag
  services:
    - postgres
  variables:
    POSTGRES_DB: test
    POSTGRES_HOST: postgres
    POSTGRES_USER: postgres
    POSTGRES_PASSWORD: postgres
    POSTGRES_HOST_AUTH_METHOD: trust
  script:
    - PGPASSWORD=${POSTGRES_PASSWORD} psql -h ${POSTGRES_HOST} -U ${POSTGRES_USER} -d ${POSTGRES_DB} -c "SELECT 'OK' AS status;"
The DOCKER_AUTH_CONFIG environment variable would be as follows (GitLab docs):
{
  "auths": {
    "${DOCKER_REGISTRY}": {
      "auth": "(Base64 content from ${DOCKER_USER}:${DOCKER_PASSWORD})"
    }
  }
}
To generate the Base64 auth string, you can use echo -n "${DOCKER_USER}:${DOCKER_PASSWORD}" | base64
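For instance, a small shell sketch (assuming DOCKER_REGISTRY, DOCKER_USER and DOCKER_PASSWORD are already set in your shell) that prints the complete JSON value to paste into the variable:
# Hedged sketch: assemble the full DOCKER_AUTH_CONFIG value in one command
printf '{"auths":{"%s":{"auth":"%s"}}}\n' \
  "${DOCKER_REGISTRY}" \
  "$(echo -n "${DOCKER_USER}:${DOCKER_PASSWORD}" | base64 | tr -d '\n')"
The tr -d '\n' guards against base64 wrapping long user:password strings onto multiple lines.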

Related

problem with postgres docker container inside Gitlab CI

I've been blocked on this problem with my project for a few days; it works on localhost but not on GitLab CI.
I would like to build a test database from the postgres Docker image in GitLab CI, but it doesn't work. I have tried a lot of things and lost a lot of hours before asking this here :'(.
Below is my docker-compose.yml file:
version: "3"
services:
  nginx:
    image: nginx:latest
    container_name: nginx
    depends_on:
      - postgres
      - monapp
    volumes:
      - ./nginx-conf:/etc/nginx/conf.d
      - ./util/certificates/certs:/etc/nginx/certs/localhost.crt
      - ./util/certificates/private:/etc/nginx/certs/localhost.key
    ports:
      - 81:80
      - 444:443
    networks:
      - monreseau
  monapp:
    image: monimage
    container_name: monapp
    depends_on:
      - postgres
    ports:
      - "3000:3000"
    networks:
      - monreseau
    command: "npm run local"
  postgres:
    image: postgres:9.6
    container_name: postgres
    environment:
      POSTGRES_USER: postgres
      POSTGRES_HOST: postgres
      POSTGRES_PASSWORD: postgres
    volumes:
      - ./pgDatas:/var/lib/postgresql/data/
      - ./db_dumps:/home/dumps/
    ports:
      - "5432:5432"
    networks:
      - monreseau
networks:
  monreseau:
And below is my gitlab-ci.yml file:
stages:
  # - build
  - test
image:
  name: docker/compose:latest
services:
  - docker:dind
before_script:
  - docker version
  - docker-compose version
variables:
  DOCKER_HOST: tcp://docker:2375/
# build:
#   stage: build
#   script:
#     - docker build -t monimage .
#     - docker-compose up -d
test:
  stage: test
  script:
    - docker build -t monimage .
    - docker-compose up -d
    - docker ps
    - docker exec -i postgres psql -U postgres -h postgres -f /home/dumps/test/dump_test_001 -c \\q
    - exit
    - docker exec -i monapp ./node_modules/.bin/env-cmd -f ./env/.env.builded-test npx jasmine spec/auth_queries.spec.js
    - exit
This is the output of docker ps on the GitLab CI server:
(screenshot: docker ps on gitlab-CI)
I thought that putting postgres as the host would work, but no; I always get this in the GitLab CI terminal:
psql: could not connect to server: Connection refused
Is the server running on host "postgres" (172.19.0.2) and accepting
TCP/IP connections on port 5432?
I also tried putting docker as the host, but got this error:
psql: could not translate host name "docker" to address: Name or service not known
A small precision: it works on my computer's localhost when I run make builded-test.
Below is my makefile:
builded-test:
	docker build -t monimage .
	docker-compose up -d
	docker ps
	docker exec -i postgres psql -U postgres -h postgres -f /home/dumps/test/dump_test_001 -c \\q
	exit
	docker exec -i monapp ./node_modules/.bin/env-cmd -f ./env/.env.builded-test npx jasmine spec/auth_queries.spec.js
	exit
	docker-compose down
I want to get the postgres image in my docker-compose working on GitLab CI so it can execute my tests. Help me please :) Thanks in advance.
UPDATE
It is now working in gitlab-runner, but still not on GitLab when I push. I updated the files as follows.
I added:
variables:
  POSTGRES_DB: postgres
  POSTGRES_USER: postgres
  POSTGRES_PASSWORD: ""
  POSTGRES_HOST_AUTH_METHOD: trust
and changed:
test:
  stage: test
  script:
    - docker build -t monimage .
    - docker-compose up -d
    - docker ps
    - docker exec postgres psql -U postgres -h postgres -f /home/dumps/test/dump_test_001
    - docker exec monapp ./node_modules/.bin/env-cmd -f ./env/.env.builded-test npx jasmine spec/auth_queries.spec.js
in the .gitlab-ci.yml.
But it still doesn't work when I push it to GitLab; it gives me:
sql: could not connect to server: Connection refused
Is the server running on host "postgres" (172.19.0.2) and accepting
TCP/IP connections on port 5432?
Any ideas? :)
Maybe you need to wait for the PostgreSQL service to be up and running.
Can you add a 10-second delay before trying the psql stuff? Something like:
- sleep 10
If it works, then you can use a more specific solution to wait for PostgreSQL to be initialized, like Docker wait for postgresql to be running.
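For example, a slightly more robust sketch (only a sketch, using pg_isready inside the postgres container from the compose file above) polls until the server accepts connections instead of sleeping a fixed time:
# Hedged sketch: poll until the postgres container accepts connections, up to ~30s
- for i in $(seq 1 30); do docker exec postgres pg_isready -U postgres && break; sleep 1; done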

Passing a sql backup file through the docker daemon to populate a docker database

Context
I need to populate a database inside a docker container from a backup file that I have on the host machine.
I've tried this docker command while the PostGIS container is up (see docker-compose.yml at the end):
sudo docker exec -i db_container_1 ./usr/local/bin/pg_restore --no-owner --role=postgres -h localhost -U postgres -p 5434 -d database_name < ../db/dumps/dump_prod_2020.backup
But I've got this message:
read unix #->/var/run/docker.sock: read: connection reset by peer
As documented here, I also tried using this docker-compose command but it raises the exact same strange message:
docker-compose exec -T db /usr/local/bin/pg_restore --no-owner --role=postgres -h localhost -U postgres -p 5434 -d database_name < ../db/dumps/dump_prod_2020.backup
Question
What am I doing wrong and how could I populate my docker database with my local dump?
More info
Here's the docker-compose.yml used to start the db service (docker ps outputs db_container_1 as the corresponding container name):
version: '3.6'
volumes:
  db_data:
services:
  db:
    image: mdillon/postgis:11-alpine
    environment:
      POSTGRES_DB: database_name
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: password
    ports:
      - '${DB_PORT:-5434}:5432'
    restart: 'no'
    volumes:
      - './docker/db:/docker-entrypoint-initdb.d:ro'
      - 'db_data:/var/lib/postgresql/data' # to persist storage
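A minimal sketch of one possible workaround (an assumption on my part, reusing the db_container_1 name from above): copy the dump into the container with docker cp and run pg_restore from inside it, where the server listens on the default port 5432 rather than the host-mapped 5434:
# Hedged sketch: restore from a copy of the dump placed inside the container
docker cp ../db/dumps/dump_prod_2020.backup db_container_1:/tmp/dump_prod_2020.backup
docker exec db_container_1 pg_restore --no-owner --role=postgres \
  -U postgres -d database_name /tmp/dump_prod_2020.backup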

Docker-compose postgresql password authentication failed

I'm trying to get a setup going with a web service that consumes a postgres database. It should be simple to set up, but I'm getting errors. So, the first thing I want to make sure of is that the database I set up is actually there and running.
To test this, I replace the "consumer" or "client" with an Alpine interactive shell, like so:
version: '3'
services:
  db:
    image: postgres:10.1-alpine
    container_name: db
    expose:
      - 5432
    volumes:
      - "dbdata:/var/lib/postgresql/data"
    environment:
      - POSTGES_USER=user
      - POSTGRES_PASSWORD=pass
      - POSTGRES_DB=db
  web:
    image: alpine:latest
    stdin_open: true
    tty: true
    entrypoint: /bin/sh
    depends_on:
      - db
volumes:
  dbdata:
Then I run the following command to get into the interactive shell:
docker-compose run web
and the following commands to get into the database:
apk --update add postgresql-client && rm -rf /var/cache/apk/*
psql -h db -U user db
I get a plain denial from postgresql:
psql: FATAL: password authentication failed for user "user"
I get the same error message for every combination of username/password/database name I try. Not very helpful.
What am I doing wrong here?
You have a typo in your docker-compose file. You misspelled POSTGRES here:
POSTGES_USER=user
That means the user "user" isn't being created. If I correct that typo, so that I have:
version: '3'
services:
  db:
    image: postgres:10.1-alpine
    expose:
      - 5432
    volumes:
      - "dbdata:/var/lib/postgresql/data"
    environment:
      - POSTGRES_USER=user
      - POSTGRES_PASSWORD=pass
      - POSTGRES_DB=db
  web:
    image: alpine:latest
    stdin_open: true
    tty: true
    entrypoint: /bin/sh
    depends_on:
      - db
volumes:
  dbdata:
Start the environment:
docker-compose up -d
Attach to the web container and install the postgresql client:
$ docker attach project_web_1
/ # apk add --update postgresql-client
Then I can connect without a problem:
/ # psql -h db -U user db
Password for user user:
psql (11.2, server 10.1)
Type "help" for help.
db=#
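Note that the postgres image only creates the configured user when its data directory is empty, so if the dbdata volume was already initialized with the misspelled variable, you may need to recreate the volume before retrying:
# Recreate the volume so postgres re-runs its first-time initialization
docker-compose down -v
docker-compose up -d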

How to create postgres database and run migration when docker-compose up

I'm setting up a simple backend that performs CRUD actions against a postgres database, and I want the database creation and migration to run automatically when docker-compose up runs.
I have already tried adding the code below to the Dockerfile or entrypoint.sh, but neither works.
createdb --host=localhost -p 5432 --username=postgres --no-password pg_development
createdb db:migrate
This code works if run separately after Docker is fully up.
I have already tried adding - ./db-init:/docker-entrypoint-initdb.d to the volumes, but that didn't work either.
This is the Dockerfile
FROM node:10.12.0
# Create app directory
RUN mkdir -p /restify-pg
WORKDIR /restify-pg
EXPOSE 1337
ENTRYPOINT [ "./entrypoint.sh" ]
This is my docker-compose.yml
version: '3'
services:
  db:
    image: "postgres:11.2"
    ports:
      - "5432:5432"
    volumes:
      - ./pgData:/var/lib/psotgresql/data
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD:
      POSTGRES_DB: pg_development
  app:
    build:
      context: .
      dockerfile: Dockerfile
    ports:
      - "3000:3000"
    volumes:
      - .:/restify-pg
    environment:
      DB_HOST: db
entrypoint.sh (here I get createdb: command not found):
#!/bin/bash
cd app
createdb --host=localhost -p 5432 --username=postgres --no-password pg_development
sequelize db:migrate
npm install
npm run dev
I expect that when I run Docker, the migration and the database creation will happen.
entrypoint.sh (in here I get createdb: command not found)
Running createdb in the nodejs container will not work, because it is a postgres-specific command and is not installed by default in the nodejs image.
If you specify the POSTGRES_DB: pg_development env var on the postgres container, the database will be created automatically when the container starts, so there is no need to run createdb in the entrypoint.sh mounted into the nodejs container anyway.
In order to make sequelize db:migrate work you need to:
- add sequelize-cli to dependencies in package.json
- run npm install so it gets installed
- run npx sequelize db:migrate
Here is a proposal:
# docker-compose.yml
version: '3'
services:
  db:
    image: "postgres:11.2"
    ports:
      - "5432:5432"
    volumes:
      - ./pgData:/var/lib/postgresql/data
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD:
      POSTGRES_DB: pg_development
  app:
    working_dir: /restify-pg
    entrypoint: ["/bin/bash", "./entrypoint.sh"]
    image: node:10.12.0
    ports:
      - "3000:3000"
    volumes:
      - .:/restify-pg
    environment:
      DB_HOST: db
# package.json
{
  ...
  "dependencies": {
    ...
    "pg": "^7.9.0",
    "pg-hstore": "^2.3.2",
    "sequelize": "^5.2.9",
    "sequelize-cli": "^5.4.0"
  }
}
# entrypoint.sh
npm install
npx sequelize db:migrate
npm run dev
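One caveat with this proposal: the db container may still be initializing when the app container starts, so the migration can fail on the first run. A hedged sketch of entrypoint.sh that waits for the db service first, using bash's built-in /dev/tcp:
# entrypoint.sh — variant that waits for the db service before migrating
npm install
until (echo > /dev/tcp/db/5432) 2>/dev/null; do sleep 1; done
npx sequelize db:migrate
npm run dev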
If you can run your migrations from nodejs instead of Docker, then consider this solution instead.

Install Postgres extensions in bitbucket pipeline

So I've set up a bitbucket-pipelines.yml for my Python app. It needs a postgres database, so I've followed the tutorial here (https://confluence.atlassian.com/bitbucket/test-with-databases-in-bitbucket-pipelines-856697462.html), which led me to the following config:
image: node
pipelines:
  default:
    - step:
        script:
          - npm install
          - npm test
        services:
          - postgres
definitions:
  services:
    postgres:
      image: postgres
      environment:
        POSTGRES_DB: 'pipelines'
        POSTGRES_USER: 'test_user'
        POSTGRES_PASSWORD: 'test_user_password'
I need some specific extensions in my db; how can I add them? I tried to add an extra step in the script that installs them, but at that point postgres doesn't seem to be up and running yet.
This worked for me, without the need to build my own image (I add 2 extensions to postgres):
image: node:8.11.1 # or any image you need
clone:
  depth: 1 # include the last commit
definitions:
  services:
    postgres:
      image: postgres
      environment:
        POSTGRES_DB: test
        POSTGRES_USER: postgres
        POSTGRES_PASSWORD: your_password
pipelines:
  default:
    - step:
        caches:
          - node
        script:
          - npm install
          - apt-get update
          - apt-get -y install postgresql-client
          - ./bin/utilities/wait-for-it.sh -h localhost -p 5432 -t 30
          - PGPASSWORD=your_password psql -h localhost -p 5432 -c "create extension if not exists \"uuid-ossp\"; create extension if not exists pg_trgm;" -U postgres test;
          - npm test test/drivers/* test/helpers/* test/models/*
        services:
          - postgres
wait-for-it.sh makes sure that postgres is ready to accept connections; you can find it here: https://github.com/vishnubob/wait-for-it
Then I run psql to create the extensions in the test database. Note the PGPASSWORD variable set before running it, and the -h and -p parameters used to connect to the running postgres instance (otherwise it would try to connect over Unix sockets, which doesn't seem to work here).
You need to create your own image based on postgres, then push it to a registry and use it in the pipeline:
definitions:
  services:
    postgres:
      image: your_custom_image_based_on_postgres
      environment:
        POSTGRES_DB: 'pipelines'
        POSTGRES_USER: 'test_user'
        POSTGRES_PASSWORD: 'test_user_password'
You can also find an image that suits your requirements on https://hub.docker.com/
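A minimal sketch of what such a custom image could look like (the extension names here are just examples; adjust them to your needs):
# Dockerfile for a custom postgres image that creates extensions on first init
FROM postgres
# Scripts in /docker-entrypoint-initdb.d run once, when the data directory is empty
RUN echo 'CREATE EXTENSION IF NOT EXISTS "uuid-ossp"; CREATE EXTENSION IF NOT EXISTS pg_trgm;' \
    > /docker-entrypoint-initdb.d/10-extensions.sql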