How do I connect DBeaver to PostgreSQL running in Docker on Windows 10? - postgresql

I am studying Docker, but I have an issue: I can't see the database in DBeaver.
When I used the same setup on Linux before, it worked, but now that I'm on Windows it doesn't.
Here is my docker-compose file:
version: "3.8"
services:
backend:
build:
context: .
dockerfile: Dockerfile
# network: auth-module
volumes:
- ./src:/server/src
ports:
- 4000:4000
env_file:
- ./.env.dev
# command: "npm run prod"
links:
- postgres
postgres:
image: postgres:12
environment:
POSTGRES_USERNAME: "postgres"
POSTGRES_DB: "auth"
POSTGRES_PASSWORD: "1234"
ports:
- 5432:5432
networks:
default:
external:
name: auth-module
What is the problem, and how do I fix it?
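There is not enough information here to be certain, but two things are worth checking on Windows. First, the compose file declares the default network as external (auth-module), so that network has to exist before docker-compose up. Second, DBeaver on the Windows host should connect to localhost:5432 (the published port) with user postgres, database auth and password 1234; the official image ignores POSTGRES_USERNAME, so the superuser stays the default postgres. A minimal check, assuming Docker Desktop:

# Create the external network referenced by the compose file (one-time)
docker network create auth-module

# Start the stack and verify the postgres container is up with 5432 published
docker-compose up -d
docker-compose ps

# Optional: test the connection from the host before configuring DBeaver
psql -h localhost -p 5432 -U postgres -d auth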

Related

Password auth failed for user "postgres"

I am trying to run my Postgres server and NestJS project with a Docker script, and it does fire up the server and the database. While starting up it also runs the migrations, but when I open pgAdmin I see no database there, and if I try to create a new server I get a fatal "password incorrect" error. My server also crashes with an error saying "Password authentication failed for user "postgres"". It was running fine yesterday, but today it is not running at all. I tried pruning everything, made a fresh build and then composed up, but nothing changed. Here is the docker script:
version: "3.5"
services:
dev-api:
container_name: xxxxxxxx-api
build:
context: .
dockerfile: Dockerfile
depends_on:
- dev-db
environment:
DATABASE_URL: postgresql://postgres:postgres#dev-db:5432/xxxxxxx_api
APP_ENV: development
PORT: 3030
WAIT_HOSTS: dev-db:5432
ports:
- "3030:3030"
- "9229:9229"
volumes:
- .:/usr/api/
dev-db:
container_name: xxxxxxxx-postgres
image: postgres:13.5-alpine
restart: always
ports:
- "5432:5432"
volumes:
- ./pg-data:/var/lib/postgresql/data
- ./src/db/docker/init.sql:/docker-entrypoint-initdb.d/dbinit.sql
environment:
- POSTGRES_USER=postgres
- POSTGRES_PASSWORD=postgres
- POSTGRES_DB=xxxxxxx_api
expose:
- "5432"
pgadmin:
container_name: xxxxxxx-pgadmin
image: dpage/pgadmin4:6.2
ports:
- 8080:80
volumes:
- pgadmin-data:/var/lib/pgadmin
environment:
- PGADMIN_DEFAULT_EMAIL=user#postgres.com
- PGADMIN_DEFAULT_PASSWORD=postgres
- PGADMIN_LISTEN_PORT=80
depends_on:
- dev-db
volumes:
pgadmin-data:
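A common cause with this kind of setup is the bind-mounted ./pg-data directory: the POSTGRES_* variables only take effect when Postgres initializes an empty data directory, and docker system prune does not touch a bind mount on the host, so the cluster can keep an older password. A hedged fix, assuming the data in ./pg-data is disposable:

# Stop the stack
docker-compose down

# Remove the stale data directory so the entrypoint re-initializes the cluster
# with the credentials from the compose file (this deletes the existing data)
rm -rf ./pg-data

# Rebuild and start again; init.sql and the migrations will run on the fresh cluster
docker-compose up --build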

pgAdmin disable login dialog / automatic login

I'm running pgAdmin using docker-compose with the following script:
version: "3.9"
services:
postgres:
image: "postgres:13"
container_name: "postgres"
environment:
POSTGRES_PASSWORD: pwd
ports:
- "5432:5432"
volumes:
- ./initdb:/docker-entrypoint-initdb.d
pgadmin:
image: "dpage/pgadmin4"
container_name: "pgadmin"
environment:
PGADMIN_DEFAULT_EMAIL: pgadmin#mycomp.com
PGADMIN_DEFAULT_PASSWORD: secret
links:
- postgres:postgres
ports:
- 5050:80
The script uses PGADMIN_DEFAULT_EMAIL and PGADMIN_DEFAULT_PASSWORD to change the default pgAdmin credentials.
However, as I'm running this docker instance on a development machine, I would like to auto-login into pgAdmin.
Is it possible to disable the login / log in automatically?
You have to set the SERVER_MODE parameter to False on your development machine. According to the documentation, you should use PGADMIN_CONFIG_SERVER_MODE in your docker-compose.yml:
version: "3.9"
services:
postgres:
image: "postgres:13"
container_name: "postgres"
environment:
POSTGRES_PASSWORD: pwd
ports:
- "5432:5432"
volumes:
- ./initdb:/docker-entrypoint-initdb.d
pgadmin:
image: "dpage/pgadmin4"
container_name: "pgadmin"
environment:
PGADMIN_DEFAULT_EMAIL: pgadmin#mycomp.com
PGADMIN_DEFAULT_PASSWORD: secret
PGADMIN_CONFIG_SERVER_MODE: 'False'
links:
- postgres:postgres
ports:
- 5050:80
If you then have a problem with a missing crypt key, you can disable the master password requirement by specifying this parameter:
PGADMIN_CONFIG_MASTER_PASSWORD_REQUIRED: 'False'
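Putting the two settings together, the pgadmin service's environment block would look roughly like this (same values as above):

  pgadmin:
    image: "dpage/pgadmin4"
    environment:
      PGADMIN_DEFAULT_EMAIL: pgadmin@mycomp.com
      PGADMIN_DEFAULT_PASSWORD: secret
      PGADMIN_CONFIG_SERVER_MODE: 'False'
      PGADMIN_CONFIG_MASTER_PASSWORD_REQUIRED: 'False'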

How do I run SQL Scripts after DB Initialized from Docker-Compose?

I have the Docker Compose file below. I'm trying to run the following:
Set up Postgres
Run Entity Framework to set up my schemas/tables
Set up PG Admin
Run some SQL scripts on the database.
I can get the first three items done with no problem, but I'm not sure where to put the step that runs my SQL scripts. Right now it's on the last line of the YAML, but I'm sure that's wrong. Where should I put it? I'm not sure how to reference the database I set up earlier so I can run the SQL against it.
version: '3.8'
services:
  #SET UP POSTGRES
  db:
    image: postgres
    restart: always
    environment:
      POSTGRES_USER: marmalade
      POSTGRES_PASSWORD: marmalade
      POSTGRES_DB: marmalade
    ports:
      - "15432:5432"
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U marmalade"]
      interval: 5s
      timeout: 5s
      retries: 5
  #RUN ENTITY FRAMEWORK TO INITIALIZE DATABASE
  db-migrator:
    image: ${DOCKER_REGISTRY-}db-migrator
    build:
      context: ../../../
      dockerfile: src/marmalade/Dockerfile
    environment:
      - DOTNET_ENVIRONMENT=IntegrationTest
    depends_on:
      db:
        condition: service_healthy
  #SET UP PGADMIN
  pgadmin:
    container_name: pgadmin4_container
    image: dpage/pgadmin4
    restart: always
    environment:
      PGADMIN_DEFAULT_EMAIL: admin@admin.com
      PGADMIN_DEFAULT_PASSWORD: marmalade
    ports:
      - "5050:80"
    volumes:
      - ./servers.json:/pgadmin4/servers.json # preconfigured servers/connections
      - ./sql/admin_schema.sql:/docker-entrypoint-initdb.d/admin_schema.sql #<- WHERE DO I PUT THIS?
It's correct, but it needs to be in your db service.
Example:
services:
  my_db:
    image: postgres:latest
    volumes:
      - ./init.sql:/docker-entrypoint-initdb.d/init.sql
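Applied to the compose file in the question, that would mean mounting the script in the db service rather than in pgadmin, roughly like so (keep in mind that scripts in docker-entrypoint-initdb.d only run when the data directory is initialized for the first time):

services:
  db:
    image: postgres
    volumes:
      - ./sql/admin_schema.sql:/docker-entrypoint-initdb.d/admin_schema.sql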
UPDATE:
The problem with running it in any other service is that that service is not going to have the credentials to connect to the database. So you can just create a shell script and run it the old-fashioned way, like so:
services:
  some_service:
    image: your_image
    volumes:
      - ./init.sh:/init.sh
    entrypoint: sh -c "/init.sh"
assuming, of course, that you already have a shell installed in your image.
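As a rough sketch of what such an init.sh could look like, assuming the image has a shell plus the psql client, and reusing the marmalade credentials from the question (the host, port, and script path are assumptions to adjust):

#!/bin/sh
# Hypothetical init.sh: wait until Postgres accepts connections, then run the SQL.
set -e

until pg_isready -h db -p 5432 -U marmalade; do
  echo "waiting for postgres..."
  sleep 2
done

# The script path assumes admin_schema.sql is mounted into this container as well
PGPASSWORD=marmalade psql -h db -p 5432 -U marmalade -d marmalade -f /sql/admin_schema.sql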

Docker / Postgres - Trying to run 2 databases and 2 apis, cannot connect

I am trying to use my docker-compose file to run two instances each of my database and my REST API, so that I can run tests against a test instance of the database.
version: "3.8"
services:
db:
image: postgres:13.2-alpine
container_name: "db-prod"
ports:
- "5432:5432"
environment:
- POSTGRES_DB=postgres
- POSTGRES_USER=postgres
- POSTGRES_PASSWORD=password
networks:
- fullstack
volumes:
- database_postgres:/var/lib/postgresql/data
db_test:
image: postgres:13.2-alpine
container_name: "db-test"
ports:
- "5433:5432"
environment:
- POSTGRES_DB=postgres
- POSTGRES_USER=postgres
- POSTGRES_PASSWORD=password
networks:
- fullstack-test
volumes:
- database_postgres_test:/var/lib/postgresql/data
api:
build: .
container_name: "rest-api"
environment:
DB_USERNAME: "postgres"
DB_PASSWORD: "password"
DB_HOST: "db-prod"
DB_TABLE: "postgres"
DB_DB: "postgres"
DB_PORT: "5432"
ports:
- "8080:8080"
depends_on:
- db
networks:
- fullstack
api_test:
build: .
container_name: "rest-api-test"
environment:
DB_USERNAME: "postgres"
DB_PASSWORD: "password"
DB_HOST: "db-test"
DB_TABLE: "postgres"
DB_DB: "postgres"
DB_PORT: "5433"
ports:
- "8081:8080"
depends_on:
- db_test
networks:
- fullstack-test
volumes:
database_postgres:
database_postgres_test:
networks:
fullstack:
driver: bridge
fullstack-test:
driver: bridge
When I run this, my prod database starts and my regular API connects to it fine.
My test DB also starts, and I can connect to it using
psql -U postgres -h localhost -p 5433
However, my test REST API fails to connect, with:
dial tcp 192.168.112.2:5433: connect: connection refused
The goal is to have my Go tests run against the test DB and simply clear it after each test as needed, without affecting the prod DB.
I am not sure if I am going about this the right way - perhaps there is a better construct for this - and if so please correct me. But regardless, I do not understand why I am getting this error.
I don't get why one connection works well and the other fails.
Edit: Also interesting, I just noticed that if I change the api_test container to use:
DB_HOST: "host.docker.internal"
it works. But I still don't understand why one can use a container name and the other cannot. And I can't leave it this way, as it needs to work on a Mac as well, and host.docker.internal doesn't work on my Mac (hence why the first one was changed to the container name).
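A likely explanation, offered as an assumption rather than a confirmed answer: the mapping "5433:5432" only publishes the port on the host, while inside the fullstack-test network db-test still listens on the container port 5432. Container-to-container connections should therefore use the container name with port 5432, and host.docker.internal:5433 only works because it goes back out through the host mapping. A sketch of the change, keeping the container name and switching the port:

  api_test:
    environment:
      DB_HOST: "db-test"
      DB_PORT: "5432"  # container port; 5433 is only the published host port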

I cannot connect from adminer to postgresql

I am trying to bring up PostgreSQL and Adminer via Docker containers, but from Adminer I cannot log in to PostgreSQL with the user and password I wrote.
SQLSTATE[08006] [7] FATAL: password authentication failed for user "root"
I have tried everything.
version: '3'
services:
  web:
    build: .
    environment:
      - APACHE_RUN_USER=www-data
    volumes:
      - ./blog:/var/www/html/
    ports:
      - 8080:80
    working_dir: /var/www/html/
  db:
    image: postgres
    restart: always
    environment:
      POSTGRES_PASSWORD: kisphp
      POSTGRES_USER: root
      POSTGRES_DB: kisphp
    ports:
      - "5432:5432"
    volumes:
      - ./postgres:/var/lib/postgresql/data
  adminer:
    image: adminer
    restart: always
    ports:
      - "6080:8080"
This docker-compose configuration works fine. The POSTGRES_* variables are only applied when the data directory is initialized for the first time, so if ./postgres was created by an earlier run with different credentials, the old password still applies. Try recreating it from scratch:
Delete the ./postgres folder
docker-compose stop
docker-compose down
docker-compose up -d