How to connect to Postgres from a Go app when using docker-compose? - postgresql

My docker-compose file
version: "2"
services: db:
restart: always
image: postgres:latest
ports:
- "5435:5432"
environment:
POSTGRES_PASSWORD: password
POSTGRES_USER: user
POSTGRES_DB: db adminer:
web:
image: golang:1.7
working_dir: /go/src/app
command: go run bot.go
ports:
- "3000:3000"
volumes:
- ./bot:/go/src/app
links:
- db
environment:
PORT: 3000
CONNECTION_STRING_DEV: postgres://user:password#db/db
and my bot.go, where I try to connect:
db, err = sql.Open("postgres", "user=user password=password host=db dbname=db port=5432 sslmode=verify-full ")
When I bring up my containers, I see errors:
panic: dial tcp 5.61.14.99:5432: getsockopt: connection refused
I changed the port to 5432 and tried to connect like this:
db, err = sql.Open("postgres", "postgres://user:password@db/db")
but I get the same errors
What's wrong with my docker-compose setup?

Your docker-compose looks a little messy but that's probably from copy and pasting. It's likely that postgres is not yet up and running when Go tries to connect. To test if that's the problem, first:
docker-compose up -d db
Then wait until postgres is ready by checking:
docker-compose logs -f db
and look out for a log line like:
db_1 | LOG: database system is ready to accept connections
When that line appears, quit the log command (Ctrl+C) and run your bot:
docker-compose up web
If it is now working, that was indeed your problem.
Solution: Wait until postgres is ready. Easy ways to achieve this are:
sleep for an amount of time (e.g. 1 min) before running web
sleep inside web before connecting
when connecting fails, sleep for 5 seconds and retry indefinitely
The disadvantage of these is that you don't know when postgres is ready, so you could wait too long or not long enough. A better solution is to run your bot only after a successful connection to postgres has been made.
Example from https://docs.docker.com/compose/startup-order/:
#!/bin/bash
# wait-for-postgres.sh
set -e

host="$1"
shift
cmd="$@"

until psql -h "$host" -U "postgres" -c '\l'; do
  >&2 echo "Postgres is unavailable - sleeping"
  sleep 1
done

>&2 echo "Postgres is up - executing command"
exec $cmd
Add this script as wait-for-postgres.sh and in your docker-compose.yml change the command for web like so:
command: ["./wait-for-postgres.sh", "db", "go", "run", "bot.go"]

I found the answer: I was already running a Postgres container for another app. I didn't think of this because docker-compose didn't show any errors when building the db container. I ran docker ps, then docker stop xxxxx to stop the db container from the other app, then built and brought up my app, and the problem was solved.
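On reasonably recent Docker versions you can also ask directly which container is publishing a given port, which makes this kind of clash easier to spot:
docker ps --filter "publish=5432"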

Related

cannot access postgres db running docker container from local machine

I have spent 3-4 hours on this and still have not found a solution.
I can successfully run the docker container and use psql from the container bash, however, when I try to call the db from my local machine I continue to get this error message:
error role "postgres" does not exist
I have already tried editing "listen_addresses" in the postgresql.conf file from the container bash
My setup:
I am using a macbook - Monterey 12.4
my docker compose file:
version: '3.4'
services:
  postgres:
    image: postgres:latest
    ports:
      - "5432:5432"
    environment:
      - POSTGRES_DB=postgres_db
      - POSTGRES_USER=testUser
      - POSTGRES_PASSWORD=testPW
    volumes:
      - postgres-data:/var/lib/postgresql/db
but this issue occurs if I do it through the standard CLI command as well, i.e.:
docker run -d -p 5432:5432 --name my-postgres -e POSTGRES_PASSWORD=mysecretpassword postgres
I tried to follow this tutorial but it didn't work:
https://betterprogramming.pub/connect-from-local-machine-to-postgresql-docker-container-f785f00461a7
when I try this command:
psql -h localhost -p 5432 -U postgres -W
it doesn't work:
psql: error: connection to server at "localhost" (::1), port 5432 failed: FATAL: role "postgres" does not exist
Also for reference, the user "postgres" does exist in postgres - as a superuser
Replace POSTGRES_USER=testUser with POSTGRES_USER=postgres in the compose configuration. Also use the password defined in POSTGRES_PASSWORD. Delete the old container and create a new one.
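Alternatively, keep testUser and connect with the credentials the compose file actually defines (assuming the data volume was initialized with them), for example:
psql -h localhost -p 5432 -U testUser -d postgres_db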
Thank you all for your help on this.
It turns out the issue was that I was running Postgres on my local machine as well, so once I turned that off I was able to connect.
I appreciate your time!
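For anyone else hitting this: a quick way to check whether a locally installed Postgres, rather than the container, is answering on port 5432 is to see what owns the port. On macOS something like the following should work (the Homebrew service name may differ depending on how Postgres was installed):
lsof -i :5432
brew services stop postgresql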

How to connect to a Postgres database running in local Docker container through locally-run psql command?

I'm running a docker container with the vanilla Postgres image on my local machine. I'd like to connect to the database from my local machine (i.e., not from within the container). However, when trying to connect, I get an error.
Here's my docker-compose.yml file:
version: "3.8"
services:
db:
image: postgres
restart: always
ports:
- 5432:5432
environment:
POSTGRES_USER: postgres
POSTGRES_PASSWORD: mypassword
Here's how I start up:
docker-compose run db
Here's how I connect:
psql -h localhost -p 5432 -U postgres
This produces the error:
could not connect to server: Connection refused Is the server running on host "localhost" (127.0.0.1) and accepting TCP/IP connections on port 5432?
If I spin up the database without Docker Compose, the same connection command works as expected:
docker run --name mypg -p 5432:5432 -e POSTGRES_PASSWORD=password postgres
I could just go with the flow and use the command above. But this seems to be pointing to a flaw in how I think about Docker/Compose. For example, maybe Docker Compose's internal DNS resolver makes this approach fail.
Any ideas?
Version info:
psql --version
psql (PostgreSQL) 13.3
I have read through several SO posts, including these, but they don't address or fix the problem I'm seeing:
docker-compose: accessing postgres' shell (psql)
Can't connect to postgres when using docker-compose
Try docker-compose up db instead of run. Using run will run a one-off command against your container, whereas up will turn on the container and leave it running, so another application should be able to access it.
https://docs.docker.com/compose/faq/#whats-the-difference-between-up-run-and-start
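Note that docker-compose run also skips the ports: mapping defined in the compose file unless you pass --service-ports, which is why the connection is refused. With up, something like this should work, using the credentials from the compose file above:
docker-compose up -d db
psql -h localhost -p 5432 -U postgres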

Using Docker: sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) FATAL: password authentication failed for user "username"

Error:
sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) FATAL: password authentication failed for user "username"
Hello, I am trying to run a program locally using Docker and am getting the error in the title, even though it was working before.
I've tried reinstalling Docker, re-cloning the repo, and reinstalling PostgreSQL (the problem started when I installed it for the first time). From reading similar questions, I ensured that the password matches. The password is 'password' for the Docker Postgres database, and I've tried changing it but it still hasn't worked.
I'm using 'docker-compose up -d' and then running tests but I get the error in the title. I've tried running 'docker-compose down' and then redoing it, but I still get the error.
.env file:
FLASK_ENV=development
REDIS_URL=redis://localhost:6379
DATABASE_URL=postgresql+psycopg2://username:password@localhost:5432/programname
PROGRAM_API_APP_NAME=test-prod.compute.random.com
docker-compose.yml:
version: '3'
services:
  postgresql:
    image: postgres:10-alpine
    environment:
      - POSTGRES_DB=programname
      - POSTGRES_PASSWORD=password
      - POSTGRES_USER=username
    ports:
      - "5432"
  redis:
    image: redis:5.0.3
    ports:
      - "6379:6379"
(I don't have the port as 5432:5432 because it didn't work with that and I found an answer to remove the second 5432.)
My boss helped me with this. Apparently, when I installed Postgres, it created a postgres user with a process that was using port 5432, and even when we killed it, it automatically restarted. To solve this specific problem, we changed the docker-compose file to use port 5433 on the host and 5432 in the container. I still have to find out how to get rid of the postgres process.
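For reference, the change described above would look roughly like this (a sketch; names taken from the files quoted earlier):
ports:
  - "5433:5432"
and in the .env file:
DATABASE_URL=postgresql+psycopg2://username:password@localhost:5433/programname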

password authentication failed for user "postgres" with docker-compose up on EC2

On an EC2 Linux server created by docker-machine, when I launch the postgres:10.6 container with docker-compose up, I get these errors in a loop:
FATAL: password authentication failed for user "postgres"
DETAIL: Password does not match for user "postgres".
Connection matched pg_hba.conf line 95: "host all all all md5"
I don't have these errors if I start the container manually:
=> docker run -e POSTGRES_PASSWORD=myPassword postgres:10.6
I don't have these errors in my local docker.
My docker-compose :
db:
  container_name: postgres
  image: postgres:10.6
  restart: always
  environment:
    POSTGRES_PASSWORD: myPassword
  ports:
    - "5432:5432"
Does anyone know how to fix this?
It might be that the volume (or bind-mounted directory) was already initialized by a previous start. The postgres user and database creation only happen on the first start (i.e., /var/lib/postgresql/data must not already contain database files).
Try to run:
docker-compose rm -fv db to delete the container and, in particular, its anonymous volumes.
docker-compose up -d to start your container.
Sorry, I have the answer to my own question: it's not a bug. I just had something permanently trying to connect to Postgres (through port 5432, which is open)...
After searching, I think it's a hacking attempt, because the incoming connections never come from the same IP:
connection received: host=45.120.120.149.81 port=47118
connection received: host=210.4.125.252 port=44774
connection received: host=82.223.55.254 port=36320
etc....
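As a side note (not part of the original answer): if the database does not need to be reachable from the internet, you can bind the published port to the loopback interface only, and/or close port 5432 in the EC2 security group, for example:
ports:
  - "127.0.0.1:5432:5432"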

How do I solve postgresql error "connection attempt failed"?

I wanted to build my Spring Boot project and then dockerize it. But when I built it, I got an error. I think this is caused by my PostgreSQL settings, but I could not find the reason.
Could you please help me?
docker-compose.yml file;
version: '2'
services:
  web:
    build: .
    ports:
      - 8080:8080
  db:
    container_name: productdb
    image: postgres:9.5
    volumes:
      - sample_db:/var/lib/postgresql/data
    environment:
      - POSTGRES_PASSWORD=bright
      - POSTGRES_USER=postgres
      - POSTGRES_DB=productdb
      - PGDATA=/var/lib/postgresql/data/pgdata
    ports:
      - "5432:5432"
volumes:
  productdb: {}
application.yml file;
server:
  port: 8761
eureka:
  client:
    registerWithEureka: false
    fetchRegistry: false
  server:
    enableSelfPreservation: false
    waitTimeInMsWhenSyncEmpty: 0
spring:
  application:
    name: product-service
  datasource:
    url: jdbc:postgresql://db:5432/productdb
    username: postgres
    password: xxxx
    initialization-mode: always
  jpa:
    show-sql: true
    hibernate:
      ddl-auto:
    properties:
      hibernate:
        temp:
          use_jdbc_metadata_defaults: false
Error looks like;
org.postgresql.util.PSQLException: The connection attempt failed.
Thank you
If your docker-compose.yml file is well configured, it should start two containers, which you can check with:
docker ps
(source: https://intelligentbee.com/2017/09/18/setup-docker-symfony-project/)
One for the app and one for the db.
These containers are on the same host, so if your web app needs to connect to the database, you must use the host IP instead of localhost, 127.0.0.1 or 0.0.0.0.
You can get the IP with this:
hostname -I | awk '{printf $1}'
If your web app and your database were on different hosts, you could use the public IP where the database is hosted. But since you are using docker-compose, this is not the case.
I suggest you test whether your database is ready and available before using it in your web app.
To test your database, you can follow one of these approaches:
Check db status with telnet
There are several ways, but the easiest option is the telnet command. For instance, to test whether a mysql container is ready to use on the same machine where it was started:
telnet localhost 3306
If your mysql is ready, telnet will report a successful connection.
Any other result would indicate that your mysql container has exited or is misconfigured.
Note: change 3306 to the correct Postgres port.
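For the Postgres container published on the default port, as in the compose file above, that would be:
telnet localhost 5432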
Check db status with a Database IDE
Another option for UI users is to test the database connection using a database IDE. Just download one of the several Postgres client IDEs and test your database.
Don't hardcode parameters
It is good practice to externalize configuration using environment variables; both Spring and Docker support them.
So, modify your application.yml :
From
datasource:
url: jdbc:postgresql://db:5432/productdb
To
datasource:
url: jdbc:postgresql://${DATABASE_HOST}:5432/productdb
For development, in Eclipse use Run Configurations >> Environment section.
For production you can:
export variable before run
pass it to your docker run command...
docker run -d \
--name my_funny_api \
-p 8080:8080 \
-e "DATABASE_HOST=10.10.01.52" \
-i -t my_funny_api_image
or
export HOST_IP=$(hostname -I| awk '{printf $1}')
docker run -d \
--name my_funny_api \
-p 8080:8080 \
-e "DATABASE_HOST=${DATABASE_HOST}" \
-i -t my_funny_api_image
Finally, to avoid the manual task of managing your variables, you can use: http://github.com/jrichardsz/tachikoma-ops
Using DataGrip and a DB in DigitalOcean, I got the error:
[08001] The connection attempt failed. java.net.SocketTimeoutException: connect timed out.
I made sure my current IP was one of the allowed inbound connections, and that worked. (Even though the error should probably have been different.)
Hope this is useful to someone eventually.
Your DB should accept connections outside of the container
sudo docker run --name pg -p 5432:5432 -v pg_data:/var/lib/postgresql/data -e POSTGRES_DB=mydb -e POSTGRES_USER=pg_user -e POSTGRES_PASSWORD=pg_password -d postgres -c "listen_addresses=*"
"listen_addresses=" It will accept connection outside of the container*
You can use follow credential to connect your spring boot project
db_user=pg_user
db_password=pg_password
db_url=jdbc:postgresql://localhost:5432/mydb
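In Spring Boot's application.yml, those credentials would map to something like this (a sketch mirroring the datasource block shown earlier in the question):
spring:
  datasource:
    url: jdbc:postgresql://localhost:5432/mydb
    username: pg_user
    password: pg_password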