I've been blocked on this problem with my project for a few days; it works on localhost but not in GitLab CI.
I would like to build a test database on the postgres Docker image in GitLab CI, but it doesn't work. I have tried a lot of things and lost a lot of hours before asking here :'(.
Below is my docker-compose.yml file:
version: "3"
services:
nginx:
image: nginx:latest
container_name: nginx
depends_on:
- postgres
- monapp
volumes:
- ./nginx-conf:/etc/nginx/conf.d
- ./util/certificates/certs:/etc/nginx/certs/localhost.crt
- ./util/certificates/private:/etc/nginx/certs/localhost.key
ports:
- 81:80
- 444:443
networks:
- monreseau
monapp:
image: monimage
container_name: monapp
depends_on:
- postgres
ports:
- "3000:3000"
networks:
- monreseau
command: "npm run local"
postgres:
image: postgres:9.6
container_name: postgres
environment:
POSTGRES_USER: postgres
POSTGRES_HOST: postgres
POSTGRES_PASSWORD: postgres
volumes:
- ./pgDatas:/var/lib/postgresql/data/
- ./db_dumps:/home/dumps/
ports:
- "5432:5432"
networks:
- monreseau
networks:
monreseau:
and below is my .gitlab-ci.yml file:
stages:
  # - build
  - test
image:
  name: docker/compose:latest
services:
  - docker:dind
before_script:
  - docker version
  - docker-compose version
variables:
  DOCKER_HOST: tcp://docker:2375/
# build:
#   stage: build
#   script:
#     - docker build -t monimage .
#     - docker-compose up -d
test:
  stage: test
  script:
    - docker build -t monimage .
    - docker-compose up -d
    - docker ps
    - docker exec -i postgres psql -U postgres -h postgres -f /home/dumps/test/dump_test_001 -c \\q
    - exit
    - docker exec -i monapp ./node_modules/.bin/env-cmd -f ./env/.env.builded-test npx jasmine spec/auth_queries.spec.js
    - exit
This is the output of docker ps on the GitLab CI server:
(screenshot: docker ps on gitlab-CI)
I thought pointing the host at postgres would work, but no, I always get this in the GitLab CI terminal:
psql: could not connect to server: Connection refused
Is the server running on host "postgres" (172.19.0.2) and accepting
TCP/IP connections on port 5432?
I also tried using docker as the host, but then I get this error:
psql: could not translate host name "docker" to address: Name or service not known
One precision: it works on my computer's localhost when I run make builded-test.
Below is my Makefile:
builded-test:
	docker build -t monimage .
	docker-compose up -d
	docker ps
	docker exec -i postgres psql -U postgres -h postgres -f /home/dumps/test/dump_test_001 -c \\q
	exit
	docker exec -i monapp ./node_modules/.bin/env-cmd -f ./env/.env.builded-test npx jasmine spec/auth_queries.spec.js
	exit
	docker-compose down
I want to make the postgres image in my docker-compose work on GitLab CI so I can run my tests. Help me please :) thanks in advance.
UPDATE
Now it works in gitlab-runner, but still not on GitLab when I push. I updated the files as follows.
I added:
variables:
  POSTGRES_DB: postgres
  POSTGRES_USER: postgres
  POSTGRES_PASSWORD: ""
  POSTGRES_HOST_AUTH_METHOD: trust
and changed:
test:
  stage: test
  script:
    - docker build -t monimage .
    - docker-compose up -d
    - docker ps
    - docker exec postgres psql -U postgres -h postgres -f /home/dumps/test/dump_test_001
    - docker exec monapp ./node_modules/.bin/env-cmd -f ./env/.env.builded-test npx jasmine spec/auth_queries.spec.js
in the .gitlab-ci.yml.
But it still doesn't work when I push it to GitLab; it gives me:
psql: could not connect to server: Connection refused
Is the server running on host "postgres" (172.19.0.2) and accepting
TCP/IP connections on port 5432?
Any ideas? :)
Maybe you need to wait for the PostgreSQL service to be up and running.
Can you add a 10-second delay before trying the psql commands? Something like:
- sleep 10
If it works, then you can use a more specific solution to wait for PostgreSQL to be initialized, like "Docker wait for postgresql to be running".
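For example, a rough sketch of such a wait step, assuming the database container is named postgres as in the compose file above (pg_isready ships with the postgres image):
  - |
    for i in $(seq 1 30); do
      docker exec postgres pg_isready -U postgres && break
      echo "waiting for postgres..."
      sleep 2
    done
pg_isready only succeeds once the server accepts connections, so this avoids guessing a fixed delay.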
Related
I have created a Docker container for a postgres service, but when I start it and try to connect to the database, I get errors as if I hadn't defined a user and a database for the Postgres instance. I already tried changing the docker-compose file to find the problem, but I couldn't find it.
Here are the attachments:
Dockerfile:
FROM wyveo/nginx-php-fpm:latest
RUN chmod -R 775 /usr/share/nginx/
RUN export pwd=pwd
docker-compose.yml:
version: '3'
services:
  laravel-app_prm:
    build: .
    ports:
      - "8099:80"
    volumes:
      - ${pwd}/.docker/nginx/:/usr/share/nginx
  postgres_prm:
    image: postgres
    restart: always
    environment:
      - POSTGRES_USER=db_usr
      - POSTGRES_PASSWORD=postgres_password
      - POSTGRES_DB=db_prm
    ports:
      - "5432:5440"
    volumes:
      - ${pwd}/.docker/dbdata:/var/lib/postgresql/data/
When I try to connect to the database directly through the container's bash, I get an error saying that the user and database, both written exactly as defined in docker-compose.yml, do not exist.
sudo docker container exec -it <postgres_container_id> bash
psql -h localhost -U db_usr
... and so on...
And to set up the connection in pgAdmin I got the container IP using:
sudo docker container inspect <postgres_container_id>
and reading the value of the IPAddress attribute.
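As a side note, the IP can also be extracted directly with Docker's --format/-f option (just a convenience sketch, using the same container id placeholder):
sudo docker container inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' <postgres_container_id>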
I am trying to dockerize Nest and Prisma.
Nest is responding correctly to curl requests, and I can connect to the Postgres server fine with this command:
docker compose exec postgres psql -h localhost -U postgres -d webapp_dev
Everything works until I try to run:
npx prisma migrate dev --name init
then I get back:
Error: P1001: Can't reach database server at `postgres`:`5432`
Here is my code:
docker-compose.yml
version: "2"
services:
backend:
build: .
ports:
- 3000:3000
- 9229:9229 # debugger port
volumes:
- .:/usr/src/app
- /usr/src/app/node_modules
command: yarn start:debug
restart: unless-stopped
depends_on:
- postgres
environment:
DATABASE_URL: postgres://postgres#postgres/webapp_dev
PORT: 8000
postgres:
image: postgres:14-alpine
ports:
- 5432:5432
environment:
POSTGRES_DB: webapp_dev
POSTGRES_HOST_AUTH_METHOD: trust
Dockerfile
FROM node:16
# Create app directory, this is in our container
WORKDIR /usr/src/app
# Install app dependencies
# Need to copy both package and lock to work
COPY package.json yarn.lock ./
RUN yarn install
COPY prisma/schema.prisma ./prisma/
RUN npx prisma generate
# Bundle app source
COPY . .
RUN yarn build
EXPOSE 8080
CMD ["node": "dist/main"]
.env
//.env
DATABASE_URL=postgres://postgres#postgres/webapp_dev
Not sure if this is the only issue, but your DB URL does not contain the DB secret in it:
DATABASE_URL: postgres://postgres:mysecret#postgres/webapp_dev?schema=public
I got the same error; I solved it after adding ?connect_timeout=300 to my DATABASE_URL.
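For reference, a typical Prisma Postgres connection string has this shape (mysecret and the timeout value are placeholders, not values from the project above; note the user:password pair is separated from the host by @):
DATABASE_URL=postgresql://postgres:mysecret@postgres:5432/webapp_dev?schema=public&connect_timeout=300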
I am trying to use the GitLab CI PostgreSQL service for my integration tests, but it doesn't work.
Here's the code of the stage:
integration_test:
  stage: test
  tags:
    - custom_tag
  services:
    - postgres
  variables:
    POSTGRES_DB: test
    POSTGRES_HOST: postgres
    POSTGRES_USER: postgres
    POSTGRES_PASSWORD: postgres
    POSTGRES_HOST_AUTH_METHOD: trust
  script:
    - docker login -u ${DOCKER_USER} -p ${DOCKER_PASSWORD} ${DOCKER_REGISTRY}
    - docker pull ${DOCKER_IMAGE_CI}
    - export PGPASSWORD=${POSTGRES_PASSWORD}
    - docker run --rm postgres psql -h ${POSTGRES_HOST} -U ${POSTGRES_USER} -d ${POSTGRES_DB} -c "SELECT 'OK' AS status;"
It returns an error like this:
psql: error: could not connect to server: could not translate host name "postgres" to address: Name or service not known
Can anybody help me?
Perhaps it's better to look at dockerizing the test functions. This approach also provides better control over networking by means of a Docker bridge.
In this way your config could look like this:
.gitlab-ci.yml:
stages:
  - test
before_script:
  - docker login -u ${DOCKER_USER} -p ${DOCKER_PASSWORD} ${DOCKER_REGISTRY}
integration_test:
  stage: test
  script:
    - docker-compose build
    - docker-compose up
docker-compose.yml:
version: '3'
networks:
  database:
services:
  postgres-db:
    image: ${DOCKER_IMAGE_CI}
    networks:
      - database
    container_name: postgres
  test-container:
    build:
      context: .
      dockerfile: Dockerfile
    networks:
      - database
    container_name: testcon
Dockerfile:
FROM postgres
ENV POSTGRES_DB=test \
    POSTGRES_HOST=postgres \
    POSTGRES_USER=postgres \
    POSTGRES_PASSWORD=postgres \
    POSTGRES_HOST_AUTH_METHOD=trust
CMD psql -h ${POSTGRES_HOST} -U ${POSTGRES_USER} -d ${POSTGRES_DB} -c "SELECT 'OK' AS status;"
Your pipeline looks like it uses the shell executor with GitLab services.
The command docker run --rm postgres <docker command> does not automatically connect to the postgres network. You could try running your Docker image with --link postgres; more details here. Note that --link is a legacy feature and may be removed in the future.
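For instance, something along these lines might work (only a sketch: it assumes a container named postgres is reachable on the runner's Docker host, which with the shell executor and GitLab services may not be the case, so check docker ps for the actual service container name first):
docker run --rm --link postgres:postgres postgres psql -h ${POSTGRES_HOST} -U ${POSTGRES_USER} -d ${POSTGRES_DB} -c "SELECT 'OK' AS status;"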
Personally, I would try running the pipeline job using a Docker image. If your image is not publicly visible, you can get around that with DOCKER_AUTH_CONFIG.
If you used the Docker runner with a password-protected image, the YAML would look like:
integration_test:
  image: ${DOCKER_IMAGE_CI}
  stage: test
  tags:
    - custom_tag
  services:
    - postgres
  variables:
    POSTGRES_DB: test
    POSTGRES_HOST: postgres
    POSTGRES_USER: postgres
    POSTGRES_PASSWORD: postgres
    POSTGRES_HOST_AUTH_METHOD: trust
  script:
    - PGPASSWORD=${POSTGRES_PASSWORD} psql -h ${POSTGRES_HOST} -U ${POSTGRES_USER} -d ${POSTGRES_DB} -c "SELECT 'OK' AS status;"
The DOCKER_AUTH_CONFIG environment variable would be as follows (GitLab docs):
{
  "auths": {
    "${DOCKER_REGISTRY}": {
      "auth": "(Base64 content from ${DOCKER_USER}:${DOCKER_PASSWORD})"
    }
  }
}
and to generate the base64 auth string you can use echo -n "${DOCKER_USER}:${DOCKER_PASSWORD}" | base64
I created the docker-compose.yml whose content you can find below. I navigated to the folder where the file resides and ran this command:
docker-compose up -d
This was shown:
Starting postgres ... done
Then I ran this command:
docker-compose ps
Result:
  Name                Command               State    Ports
-----------------------------------------------------------
postgres   docker-entrypoint.sh postgres    Exit 1
Then I wanted to run this command:
docker exec -it postgres psql -h localhost -p 54320 -U robert
This is what I get:
Error response from daemon: Container ae1565a84bcf0b3662b47d4f277efd2830273554b6bcf4437129e33b31c88b35 is not running
Is my container not running, or what? Please help.
docker-compose.yml:
version: "3"
services:
# Create a service named db.
db:
# Use the Docker Image postgres. This will pull the newest release.
image: "postgres"
# Give the container the name my_postgres. You can changes to something else.
container_name: "postgres"
# Setup the username, password, and database name. You can changes these values.
environment:
- POSTGRES_USER=robert
- POSTGRES_PASSWORD=robert
- POSTGRES_DB=mydb
# Maps port 54320 (localhost) to port 5432 on the container. You can change the ports to fix your needs.
ports:
- "54320:5432"
# Set a volume some that database is not lost after shutting down the container.
# I used the name postgres-data but you can changed it to something else.
volumes:
- ./volumes/postgres:/var/lib/postgresql/data
Can you attempt to run this instead:
docker run -it postgres psql -h localhost -p 54320 -U robert
?
$ docker exec --help
Usage: docker exec [OPTIONS] CONTAINER COMMAND [ARG...]
Run a command in a running container
Since your container has the status Exit, you can't use docker exec.
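To see why it exited in the first place, checking the container logs usually helps (postgres is the container_name from your compose file):
docker logs postgres
With a bind-mounted data directory like ./volumes/postgres, leftover data or permission problems are frequent culprits; the log output should show the actual reason.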
Can you use this docker-compose file?
version: "3"
volumes:
postgres_app: ~
services:
# Create a service named db.
postgres:
image: "postgres"
environment:
POSTGRES_USER: robert
POSTGRES_PASSWORD: robert
POSTGRES_DB: "mydb"
volumes:
- "postgres_app:/var/lib/postgresql/data"
ports:
- "54320:5432"
restart: always
And then this command: docker-compose exec postgres psql -U robert -d mydb
I hope this will help!
I executed this file on my computer.
I am using Docker Compose to combine two images (Tomcat with my app, and the database, Postgres).
My compose file looks like this:
version: '3'
services:
  tomcat:
    build: ./tomcat-img
    ports:
      - "8080:8080"
    depends_on:
      - "db"
  db:
    build: ./db-img
    volumes:
      - db-data:/var/lib/postgres/data
    ports:
      - "5433:5432"
volumes:
  db-data:
and here is the Dockerfile for the database image:
FROM postgres:9.5-alpine
ENV POSTGRES_DB mydb
ENV POSTGRES_USER xxxx
ENV POSTGRES_PASSWORD xxxx
COPY init-db.sql /docker-entrypoint-initdb.d/
EXPOSE 5432
CMD ["postgres"]
Next I started my containers with the docker-compose CLI: docker-compose -f docker-compose.yml up
and ran the psql tool with:
docker exec -it container_id psql -d xxxx -U xxxx
and inserted a new record. After that I checked that it was really there:
select * from my_table;
After that I stopped docker compose and removed the containers with:
docker-compose -f docker-compose.yml down
and started it again:
docker-compose -f docker-compose.yml up
When I run the psql tool of the db container again and select data from my_table, the previously inserted record is gone... Can you help me fix it, please? I need to initialize my db with init-db.sql just once and then keep using that persistent storage. Thanks for your answers.
In my dockerized PostgreSQL setup with a data volume, I bind to /var/lib/postgresql and not to /var/lib/postgres/data. Try changing your compose file to:
volumes:
  - db-data:/var/lib/postgresql
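In full, the db service might then look like this (just a sketch of the same idea; the official postgres image keeps its data under /var/lib/postgresql/data by default, so mounting that path or its parent /var/lib/postgresql preserves the data across docker-compose down/up):
  db:
    build: ./db-img
    volumes:
      - db-data:/var/lib/postgresql
    ports:
      - "5433:5432"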