How to create Postgres backups from docker container - postgresql

I have a Postgres 9.5.4 database running inside a Docker container. I am using the official Postgres images and I have been trying to create backups from my database, but no luck so far. According to the Docker documentation, I was expecting the following command to store the dump.tar file in the /releases/ folder on the host machine (outside the temporary Docker container):
docker run -i --rm --link my_postgres:postgres --volume /releases/:/tmp/ postgres:latest pg_dumpall > /tmp/dump.tar
But this throws the following error:
pg_dumpall: could not connect to database "template1": could not connect to server: Connection refused
Is the server running locally and accepting
connections on Unix domain socket "/var/run/postgresql/.s.PGSQL.5432"?
Any idea what could be wrong?

Actually, I have found the solution:
docker run -i --rm --link my_container:postgres -v /releases/:/tmp/ postgres:latest bash -c 'exec pg_dumpall -h "$POSTGRES_PORT_5432_TCP_ADDR" -p "$POSTGRES_PORT_5432_TCP_PORT" -U postgres > /tmp/my_backup.tar'
Where the Docker arguments are:
-i For interactive processes, like a shell (bash, etc.).
--rm Deletes the temporary container once the backup is finished.
--link Links this temporary container to my_container, which contains the currently running Postgres database.
-v /releases/:/tmp/ Creates a shared volume: the content of the folder /tmp/ inside the container (in this case my_backup.tar) is automatically visible in the folder /releases/ on the host machine.
And the bash arguments:
pg_dumpall Exports the whole PostgreSQL cluster into a script file.
-h "$POSTGRES_PORT_5432_TCP_ADDR" Takes the IP address from the environment variable Docker sets.
-p "$POSTGRES_PORT_5432_TCP_PORT" Takes the port from the environment variable Docker sets. (These variables exist only in the temporary container we just created, as a result of the link with my_container.)
-U The username for connecting to the database.
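A simpler alternative sketch, assuming the running database container is named my_container and uses the default postgres superuser: run pg_dumpall inside that container with docker exec and let the shell redirect write the file on the host, so no extra container, link or shared volume is needed.
# the dump runs inside the existing container; the redirect writes the file on the host
docker exec -i my_container pg_dumpall -U postgres > /releases/my_backup.sql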


psql command from official page on dockerhub

I am learning Docker. I have practiced a lot, including testing commands from the official Postgres page on Docker Hub.
I ran this command:
docker run -it --rm --network some-network postgres psql -h some-postgres -U postgres
Could someone give a complete and concrete example to make this command work (I mean with a real, existing container)? I can't see how it could work.
docker run creates a Docker container
-it makes the session interactive: it keeps STDIN open and allocates a TTY, so what we type is passed to the interactive process (here, psql) inside the container
--rm deletes the container when it exits
--network some-network attaches the container to the some-network network
postgres the name of the image
psql -h some-postgres -U postgres connects to PostgreSQL at the some-postgres address using the postgres user.
Putting the whole command together: create a PostgreSQL container and then use the psql command from inside that container to connect to some-postgres as the postgres user (see the sketch below for a concrete example).
For more flags and usage, you can learn from the doc here.
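For a complete, concrete sketch (the names some-network and some-postgres, and the password, are only placeholders), you could first start a server on a user-defined network and then run the command from the question against it:
# create a user-defined bridge network
docker network create some-network
# start a Postgres server on that network (throwaway password, for testing only)
docker run -d --name some-postgres --network some-network -e POSTGRES_PASSWORD=mysecretpassword postgres
# the command from the question can now resolve some-postgres by container name
docker run -it --rm --network some-network postgres psql -h some-postgres -U postgres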
Probably the Docker Hub page is not perfectly clear, but your command is used to connect to an already existing Postgres instance.
So, for example, you first create a container with the command:
docker run -it --rm --name postgresql -p 5432:5432 -e POSTGRES_USER=admin -e POSTGRES_PASSWORD=admin -d postgres:latest
then you can execute your command to connect to it
docker run -it --rm postgres psql -h <your_ip> -U postgres
If your container is running locally, you can get the IP from the shell command ip address.
The --network attribute relates to the container you started first, so you can keep it or drop it from the command depending on how that container was deployed.
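If ip address is not convenient, a sketch using docker inspect (assuming the container is named postgresql as in the example above) prints just the container's IP:
# extract only the IP address of the running container
docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' postgresql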

How to fix: 'Psql: could not connect to server: Connection refused' when using Postgres in a docker container in gitlab-ci

I’m currently trying to run fossology in Gitlab CI. Fossology requires an external database that can be set up from a schema created using pg_dump. When I'm trying to use psql I get the title error.
At the moment, I have a script that sets up a container that runs the required version of postgres (9.6). It then tries to run an .sql script with psql in the postgres container via docker exec. Upon doing so, it gets the error in the title.
I have tried specifying both a port and a host when issuing the psql statement, neither of which worked. I have tried using localhost, 127.0.0.1, the IP address of the postgres container and the name of the container as a host. I have tried rewriting things in different scripts, but nothing seems to work.
After extensive google searching, many people seem to have the same error message but not for the same reasons and not usually when using a docker container to host the database.
When I run the contents of my script on the command line, I do not get this error; the script works fine and I can connect to Fossology. The issue only arises when trying to do the same in GitLab CI.
The sequence of steps (i.e. pasted line by line) that works when using the command line on Mac:
# creates blank database and hosts it in a docker container
docker run -d --name fossdb -p 5432:5432 postgres:9.6
docker cp /fossology_db_schema.sql fossdb:/fossy.sql
docker exec -it fossdb bash
psql postgres -U postgres
# creates user needed for database to work with fossology
create user fossy with password 'fossy';
create database fossology;
grant all privileges on database fossology to fossy;
\q
# builds the fossology database in the hosted blank database
psql fossology < fossy.sql
psql postgres -U postgres
\connect fossology
exit
What I am attempting in GitLab CI:
# creates container with postgres image
docker run -d --name fossdb -p 5432:5432 --network foss-net postgres:9.6
# creates blank database (error occurs here)
docker exec fossdb psql -h $(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' fossdb) -f ./createBlank.sql -U postgres
# builds fossology database from schema
docker exec fossdb psql -h $(docker inspect --format='{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' fossdb) fossology < ./schema.sql -U postgres
createBlank.sql:
create user fossy with password 'fossy';
create database fossology;
grant all privileges on database fossology to fossy;
Expected results: runs createBlank.sql to create a blank database called fossology, then builds fossology database from schema
Actual results: psql: could not connect to server: Connection refused
Is the server running on host "172.19.0.2" and accepting
TCP/IP connections on port 5432?
Are you sure you set up postgres completely?
A few quick checks you can perform:
suggestion 1: Did you tell postgres there is a user with a password? (createuser command)
https://www.postgresql.org/docs/9.2/app-createuser.html
(Excuse me, you DID do that. Go to suggestion 2.)
suggestion 2: Did you tell postgres that user can connect, and how? (TCP or local sockets)
https://www.postgresql.org/docs/9.2/auth-pg-hba-conf.html
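For illustration only, here is a sketch that sidesteps TCP entirely by reusing the docker cp approach from the working Mac sequence and the local Unix socket inside the container (no -h flag), while waiting for the server to finish starting first:
# retry until the server inside the container accepts connections
until docker exec fossdb pg_isready -U postgres; do sleep 1; done
# copy the script into the container and run it over the local Unix socket
docker cp createBlank.sql fossdb:/createBlank.sql
docker exec fossdb psql -U postgres -f /createBlank.sql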

PostgreSQL and pgAdmin both running in Docker containers and connecting

I am attempting to run both PostgreSQL and pgAdmin in Docker containers. The idea is that the PostgreSQL database should be accessible to any applications I have running on the host machine, and also to pgAdmin.
I am using this command to run PostgreSQL:
docker run -d -e POSTGRES_USER=username -e POSTGRES_PASSWORD=password --name postgres -p 5432:5432 postgres
And to run pgAdmin:
docker run -d -p 1111:1111 --name pgadmin -e "PGADMIN_LISTEN_PORT=1111" -e "PGADMIN_DEFAULT_EMAIL=admin@test.com" -e "PGADMIN_DEFAULT_PASSWORD=test" dpage/pgadmin4
If I go to localhost:1111, I can connect to pgAdmin and log in. However, when I try to connect to my local PostgreSQL instance from pgAdmin, I get no response.
Therefore, I tried to run pgAdmin with access to the host network, using --net=host instead of -p 1111:1111:
docker run -d --net=host --name pgadmin -e "PGADMIN_LISTEN_PORT=1111" -e "PGADMIN_DEFAULT_EMAIL=admin@test.com" -e "PGADMIN_DEFAULT_PASSWORD=test" dpage/pgadmin4
Now, when I try to go to localhost:1111 to connect to pgAdmin, I get no response in my browser.
Docker Compose is a possible solution, as I could link the two containers together so they could access each other without having to worry about ports, but I also need pgAdmin to be able to access PostgreSQL instances on other machines, as well as my local one.
I feel like --net=host is broken in Docker. There's a whole thread here with a lot of confusion.
My setup:
Host: Windows 10
Docker: Docker Desktop Community v2.0.0.3 (31259)
Update
I have now tried using --link postgres on the pgAdmin container, and it allows me to connect to my local instance of PostgreSQL but not to non-local ones. The full command is:
docker run -d -p 1111:1111 --link postgres --name pgadmin -e "PGADMIN_LISTEN_PORT=1111" -e "PGADMIN_DEFAULT_EMAIL=admin@test.com" -e "PGADMIN_DEFAULT_PASSWORD=test" dpage/pgadmin4
The commands were not wrong; only the connection settings in pgAdmin were. So the full list of commands is:
For PostgreSQL:
docker run -d -e POSTGRES_USER=username -e POSTGRES_PASSWORD=password --name postgres -p 5432:5432 postgres
And pgAdmin:
docker run -d -p 1111:1111 --name pgadmin -e "PGADMIN_LISTEN_PORT=1111" -e "PGADMIN_DEFAULT_EMAIL=admin@test.com" -e "PGADMIN_DEFAULT_PASSWORD=test" dpage/pgadmin4
Now the pgAdmin container won't connect to localhost but needs the IP of the PostgreSQL container. Run:
docker inspect postgres
Inspect result:
[
    {
        ...
        "NetworkSettings": {
            ...
            "Networks": {
                ...
                "IPAddress": "172.17.0.3",
                ...
            }
        }
    }
]
We're only interested in the IPAddress from the inspect command. This is the IP which pgAdmin should connect to.
pgAdmin is also capable of accessing external IPs from your machine.
When you create Docker containers, they are attached to a bridge network by default. First find the network range of that bridge network; you can use ifconfig to find it. Let's say 172.17.0.5 is the IP of pgAdmin. Then allow connections from that address for your user in pg_hba.conf and grant that user permissions on the database; after that you can connect. Also check whether port 5432 is reachable using telnet.
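As an alternative sketch that avoids hard-coding container IPs, a user-defined bridge network (the name pg-net below is illustrative) lets pgAdmin reach Postgres by container name:
# create a user-defined bridge network and attach both containers to it
docker network create pg-net
docker run -d --network pg-net --name postgres -e POSTGRES_USER=username -e POSTGRES_PASSWORD=password -p 5432:5432 postgres
docker run -d --network pg-net -p 1111:1111 --name pgadmin -e "PGADMIN_LISTEN_PORT=1111" -e "PGADMIN_DEFAULT_EMAIL=admin@test.com" -e "PGADMIN_DEFAULT_PASSWORD=test" dpage/pgadmin4
# in pgAdmin, register the server with host name "postgres" and port 5432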

How can I connect to Postgres database in the container via port 5432

I am running a postgres docker container by using the commands below: (reference: https://docs.docker.com/engine/examples/postgresql_service/)
docker build -t eg_postgresql .
docker run --rm -P --name pg_test eg_postgresql
This works, but the port number is dynamic. I can connect to the database by giving the port number (the port I see in the docker ps output).
I would like to connect to this docker database from Python so I need a static port number.
I tried the parameters below:
-p 127.0.0.1:5432:5432
-p 5432:5432
In that case, the docker container's port was set to 5432. However, I could not connect to the database: I got a "docker user does not exist" error message.
What is your advice?
I took the Dockerfile from the link you posted. After building the container with
docker build -t eg_postgresql .
I started the container with
docker run --rm -p 5432:5432 --name pg_test eg_postgresql
(which binds localhost port 5432 to container port 5432)
and then I tried to connect with
psql -h localhost -p 5432 -d docker -U docker --password
It works like a charm. If you get a message that the docker user does not exist, please double-check that all steps from the Dockerfile executed successfully during the docker build command, as the creation of the docker user is done in this command:
RUN /etc/init.d/postgresql start &&\
    psql --command "CREATE USER docker WITH SUPERUSER PASSWORD 'docker';" &&\
    createdb -O docker docker
Also make sure that you have no PostgreSQL server running on your localhost, so that you can be sure you are connecting to the PostgreSQL server inside the container.
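To double-check which server you are actually reaching on port 5432 (for example, a stray local PostgreSQL instead of the container), a quick sketch from the host, assuming the PostgreSQL client tools are installed there:
# reports whether a server answers on that address and port
pg_isready -h 127.0.0.1 -p 5432
# \conninfo prints which database, user, host and port you ended up connected to
psql -h 127.0.0.1 -p 5432 -d docker -U docker -c '\conninfo'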

can't run docker image of jhipster webapp

I have a jhipster monolithic web app with a postgres database. I built a docker image using
./gradlew bootRepackage -Pprod buildDocker
Now when I try to run the image using docker run, it fails with the following error:
Caused by: org.postgresql.util.PSQLException: Connection to localhost:5432 refused. Check that the hostname and port are correct and that the postmaster is accepting TCP/IP connections.
at org.postgresql.core.v3.ConnectionFactoryImpl.openConnectionImpl(ConnectionFactoryImpl.java:247)
at org.postgresql.core.ConnectionFactory.openConnection(ConnectionFactory.java:65)
I tried a few things, listed below, but I still get the same error:
docker create -v /var/lib/postgresql/data --name spring_app_data postgres:9.5.1
docker run --volumes-from spring_app_data --name spring_app_pg -e POSTGRES_USER=postgres -e POSTGRES_PASSWORD=password -d -P postgres:9.5.1
docker run -it --link spring_app_pg:postgres --rm postgres sh -c 'exec psql -h "$POSTGRES_PORT_5432_TCP_ADDR" -p "$POSTGRES_PORT_5432_TCP_PORT" -U postgres'
docker run --name spring_app_container --link spring_app_pg:spring_app_pg -p 8080:8080 -d wmd_server_pg
Any suggestions on how to run the docker image for a webapp with PostgreSQL? BTW, I get the same kind of error when I use MongoDB.
Going by your example commands, your database won't be accessible as localhost from the app; it will be reachable via the container name. Configure your app's database connection to use spring_app_pg:5432.
Also, don't use links. Use a user-defined network; most likely a bridge is all you will need.
docker network create my_app
docker run --net=my_app --name=spring_app_pg <dbimage>
docker run --net=my_app --name=spring_app_container <appimage>
That should give you the same result as your linked setup.
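For example, a sketch of pointing the app at the database container by name, assuming a Spring Boot/JHipster image that honors the standard SPRING_DATASOURCE_* environment variables and a placeholder database name mydb (adjust both to your setup):
# same user-defined network as above; the database is reachable by its container name
docker run --net=my_app --name=spring_app_container -p 8080:8080 \
  -e SPRING_DATASOURCE_URL=jdbc:postgresql://spring_app_pg:5432/mydb \
  -e SPRING_DATASOURCE_USERNAME=postgres \
  -e SPRING_DATASOURCE_PASSWORD=password \
  <appimage>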