I'm trying to connect Weblate to an external RDS PostgreSQL database.
I'm using a docker compose file that runs the Weblate container. To this container I add the environment variables needed to connect to RDS Postgres.
The Weblate container doesn't connect to RDS Postgres and gives me this error:
psql: error: connection to server at "XXXX.rds.amazonaws.com", port 5432 failed: FATAL: password authentication failed for user "postgres"
but if I try to connect to RDS Postgres from inside the Weblate container via the CLI, it works.
docker compose file:
version: '3'
services:
  weblate:
    image: weblate/weblate
    tmpfs:
      - /app/cache
    volumes:
      - weblate-data:/app/data
    env_file:
      - ./environment
    restart: always
    ports:
      - 80:8080
    environment:
      POSTGRES_PASSWORD: XXXX
      POSTGRES_USER: myuser
      POSTGRES_DATABASE: mydb
      POSTGRES_HOST: XXX.rds.amazonaws.com
      POSTGRES_PORT: 5432
It tries to connect as the postgres user while your configuration states myuser. Maybe the ./environment file overrides that?
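One way to check that suspicion (my suggestion, not something shown in the thread) is to look at the environment the running container actually received; the service name weblate is taken from the compose file above:

# Print the database-related variables as seen inside the container
docker-compose exec weblate printenv | grep POSTGRES_

Whatever shows up here is what the container actually got, regardless of which file it came from.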
I found the problem.
It was the $ character inside the password.
Maybe the library used to connect to Postgres has a bug or simply doesn't allow $ in the password string.
When I removed that character, it worked.
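A plausible explanation, though it is my assumption rather than something confirmed above: Docker Compose treats $ inside docker-compose.yml values as the start of a variable interpolation, so a literal $ has to be doubled. You can see the value Compose will actually pass on:

# A literal $ in a value set under environment: must be written as $$,
# e.g. POSTGRES_PASSWORD: "pa$$sword" for the real password "pa$sword".
# The rendered configuration shows the value after interpolation:
docker-compose config | grep POSTGRES_PASSWORD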
I am trying to connect to a Postgres instance running in a Docker container. In the docker-compose file, the postgres service looks like this:
flask-api-postgres:
  container_name: flask-api-postgres
  image: postgres:13.4-alpine
  env_file:
    - dev.env
  ports:
    - "5433:5433"
  networks:
    flask-network:
With docker inspect I get that the container has the address: 172.19.0.2.
The API works fine, but when trying to access the database from pgAdmin with the config shown in the screenshot (user and password are correctly set), I get the error shown.
I do not know how to access the Postgres instance from pgAdmin.
One approach: you can access the Postgres container from pgAdmin running on your host machine by using 127.0.0.1 instead of 172.19.0.2.
Another way: you can create another container for pgAdmin. In that case you can access PostgreSQL using the container IP (for example 172.19.0.2). Add this to your docker-compose file:
pgadmin:
  image: dpage/pgadmin4
  depends_on:
    - flask-api-postgres
  ports:
    - "5050:80"
  environment:
    PGADMIN_DEFAULT_EMAIL: pgadmin4@pgadmin.org
    PGADMIN_DEFAULT_PASSWORD: admin
  restart: unless-stopped
  networks:
    flask-network:
Make sure both are under same network.
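To verify that (a quick check; compose may prefix the network name with your project name):

# List the networks compose created
docker network ls
# Confirm that both flask-api-postgres and pgadmin are attached to it
docker network inspect flask-network

Inside the pgAdmin container you would then use the service/container name flask-api-postgres as the host, which is more stable than the 172.19.0.2 address.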
Please check the port you are using. The default is 5432.
See experiment:
> docker ps
CONTAINER ID   IMAGE             COMMAND                  CREATED          STATUS          PORTS                              NAMES
0c4d92a623a6   postgres:latest   "docker-entrypoint.s…"   14 minutes ago   Up 14 minutes   5432/tcp, 0.0.0.0:5433->5433/tcp   cannot-access-postgres-instance-running-in-docker-container-from-pgadmin-database-1
> docker exec -it 0c4d92a623a6 sh
# psql "host=127.0.0.1 port=5433"
psql: error: connection to server at "127.0.0.1", port 5433 failed: Connection refused
Is the server running on that host and accepting TCP/IP connections?
# psql "host=127.0.0.1 port=5432"
psql: error: connection to server at "127.0.0.1", port 5432 failed: FATAL: role "root" does not exist
#
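In other words, Postgres inside the container listens on its default port 5432, while the compose file publishes 5433:5433, so the published host port points at nothing. A sketch of the adjustment, assuming you want to keep host port 5433 (credentials come from dev.env):

# In docker-compose.yml, map host port 5433 to the container's port 5432:
#   ports:
#     - "5433:5432"
# Then, from the host (or from pgAdmin running on the host):
psql "host=127.0.0.1 port=5433"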
I have created a PostgreSQL database inside a docker-compose container with the following yaml config:
version: '3'
services:
  ...
  db:
    image: postgres
    ports:
      - "5432:5432"
    environment:
      POSTGRES_DB: my_db
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: my_password
volumes:
  db:
    driver: local
Without the volumes section, I was not able to access the database from outside the container, meaning that I was not able to add any data to it. With this added section, I am now able to run:
docker exec -it container_db_1 psql -U postgres
Which allows me to create databases, tables, add data, etc.
However, I am now trying to connect to the database with Azure Data Studio and I get the error FATAL: password authentication failed for user "postgres". I've triple-checked all the connection settings but I always get this same error.
In the past I was able to connect to a Postgres database created through a Docker container (on its own, not with docker-compose). And I don't understand what is different this time, since I can connect through a terminal in the same way.
I have my Postgres container running, built from this docker-compose file:
version: "3.9"
services:
db:
image: postgres
volumes:
- ./data/db:/var/lib/postgresql/data
ports:
- "5432:5432"
environment:
- POSTGRES_DB=db
- POSTGRES_USER=postgres
- POSTGRES_PASSWORD=password.
It spins up fine & my other dockerized servers can connect to it. But, if I open up a CLI from outside the docker instance & try to connect with
psql postgres://postgres:password@localhost:5432/db
Or try to add a database connection in PyCharm, I get
psql: error: could not connect to server: FATAL: database "db" does not exist
as a response. What do I need to do to allow outside calls to the containerized database? I've tried adding "expose:5432" to the docker compose, but that didn't help.
Answered my own question here: by stopping PgAdmin, which I guess was holding port 5432, the containerized Postgres service could be accessed by the CLI & PyCharm.
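If you run into the same symptom, a quick way to see what is actually holding the port on the host (a diagnostic sketch, not part of the original answer):

# Show which process is listening on 5432 (Linux/macOS)
sudo lsof -i :5432
# Alternative on Linux
sudo ss -ltnp | grep 5432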
I have configured a production PostgreSQL database.
If I need to do debugging work, I don't want to be interacting with the production database, or else that will affect the user base. Instead, I need to create a local environment such that nothing is changed in the production database during debugging.
I am using PostgreSQL 10 and pgAdmin 4.
How can I achieve that?
Thanks.
You could set up a test environment with Docker.
First, a docker-compose.yml file:
version: "3"
services:
db:
image: postgres:10-alpine
volumes:
- ./local_path:/var/lib/postgresql/data
ports:
- "8000:5432"
expose:
- "5432"
admin:
image: dpage/pgadmin4
environment:
- PGADMIN_DEFAULT_EMAIL=admin#admin.com
- PGADMIN_DEFAULT_PASSWORD=admin
ports:
- "8080:80"
See the documentation for the docker postgres image on how to set environment variables to define user/password/db name. https://hub.docker.com/_/postgres/
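For reference, the variables in question are POSTGRES_USER, POSTGRES_PASSWORD and POSTGRES_DB; a quick standalone test outside compose (the values here are placeholders):

docker run --rm -e POSTGRES_USER=testuser -e POSTGRES_PASSWORD=testpass -e POSTGRES_DB=testdb -p 8000:5432 postgres:10-alpine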
I'm not too familiar with pgAdmin, but the container has minimal setup options:
https://hub.docker.com/r/dpage/pgadmin4/
Then you start the containers with sudo docker-compose up.
The db container is publishing its port on 8000 on your host machine, so there should be no conflict with the postgres server running on the host.
To connect:
psql -h localhost -p 8000 -U postgres
The admin page should be available at port 8080 on your host machine.
When you connect the admin to the database in the UI, the hostname is db and the port is 5432
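If the goal is to debug against a copy of production without touching it, one option (a sketch; prod-host, prod_user and prod_db are placeholders, and port 8000 matches the compose file above) is to dump production and restore it into the local container:

# Dump the production database; pg_dump only reads, it does not modify data
pg_dump -h prod-host -U prod_user -Fc prod_db > prod_db.dump
# Restore into the dockerized test instance published on port 8000
pg_restore -h localhost -p 8000 -U postgres -d postgres --create prod_db.dump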
Now that you have a docker container set up, you might consider using it in production too :)
I'm trying to create a database and connect to it within my container network. I don't want to have to ssh into a box to create users/databases etc, as this is not a scalable or easily distributable process.
This is what I have so far:
# docker-compose.yml
db:
  image: postgres:9.4
  volumes:
    - ./db/init.sql:/docker-entrypoint-initdb/10-init.sql
  environment:
    - PGDATA=/tmp
    - PGDATABASE=web
    - PGUSER=docker
    - PGPASSWORD=password
This is my init.sql file:
CREATE DATABASE web;
CREATE USER docker WITH PASSWORD 'password';
GRANT ALL PRIVILEGES ON DATABASE web TO docker;
When I start up the container and try to connect to it, I get this error:
db_1 | FATAL: role "docker" does not exist
db_1 | done
db_1 | server started
db_1 | FATAL: database "web" does not exist
db_1 | psql: FATAL: database "web" does not exist
The first time this happened, I tried to create a role like this:
CREATE ROLE docker with SUPERUSER PASSWORD password;
GRANT web TO docker;
But it did not have any effect. To make matters even more confusing, when I use node-postgres to connect to the db, I get this error:
Error: connect ECONNREFUSED
But how can the connection be refused if the db service isn't even up??
In a nutshell, these are the questions I'm trying to solve:
How can I create a database using only the files in my project (i.e. no manual commands)?
How do I create a user/role using only the files in my project?
How do I connect to this database?
Thank you in advance.
How can I create a database using only the files in my project (i.e. no manual commands)?
The minimal docker-compose.yml config for your defined user and database is:
postgres:
  image: postgres:9.4
  environment:
    - POSTGRES_DB=web
    - POSTGRES_USER=myuser
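On first start the image then creates both the role and the database; a quick way to confirm (using the service name postgres from the snippet above):

docker-compose up -d postgres
# List databases as the created user; "web" should appear
docker-compose exec postgres psql -U myuser -d web -c '\l'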
How do I create a user/role using only the files in my project?
To execute scripts on database initialization, take a look at the official docs for initdb.
To get you started with a quick-and-dirty solution, create a new file, e.g. init_conf.sh, in the same directory as your docker-compose.yml:
#!/bin/bash
set -e

psql -v ON_ERROR_STOP=1 -U "$POSTGRES_USER" -d "$POSTGRES_DB" <<-EOSQL
    CREATE ROLE docker with SUPERUSER PASSWORD 'password';
EOSQL
And add the volumes directive to your docker-compose.yml.
volumes:
  - .:/docker-entrypoint-initdb.d
Recreate your container, because otherwise you won't trigger a new database initialization. That means: docker stop and docker rm the old one first before executing docker-compose up again. STDOUT now gives you some information about the newly introduced script.
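Concretely, something like the following (a sketch; it assumes the postgres service from this answer and no named data volume holding old data, because the init scripts only run when the data directory is empty):

docker-compose stop postgres
docker-compose rm -f postgres
docker-compose up -d postgres
# The entrypoint logs every init script it executes
docker-compose logs -f postgres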
How do I connect to this database?
To connect to your database with docker exec via the terminal:
docker exec -ti folder_postgres_1 psql -U myuser -d web
A docker-compose.yml in one of my production environments looks like the following:
services:
  postgres:
    logging: &logging
      driver: json-file
      options:
        max-size: "10m"
        max-file: "5"
    build: ./docker/postgres # path to custom Dockerfile
    volumes:
      - postgres_data:/var/lib/postgresql/data
      - postgres_backup:/backups
    env_file: .env
    restart: always
  # ... other services like web, celery, redis, etc.
Dockerfile:
FROM postgres:latest
# ...
COPY *.sh /docker-entrypoint-initdb.d/
# ...
The environment variables you are using are wrong. Try this:
version: '3.3'
services:
  db:
    image: postgres:9.4
    restart: always
    environment:
      - POSTGRES_USER=docker
      - POSTGRES_PASSWORD=password
      - POSTGRES_DB=web
    volumes:
      - db_data:/var/lib/postgresql/data
    # optional port
    ports: ["5555:5432"]
volumes:
  db_data:
Then, from any other docker-compose service, you can access the DB at db:5432; from your host machine you can access Postgres on localhost:5555 if you also add the ports mapping.
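For example, a quick sketch using the credentials from the compose file above:

# From the host, through the published port
psql "host=localhost port=5555 user=docker dbname=web"
# From another service in the same compose file, the hostname is the service name
psql "host=db port=5432 user=docker dbname=web"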