Postgres Docker container external access issue when using a bind-mounted data directory with a pre-existing database - postgresql

I am trying to take the data directory from a pre-existing database and bring up a new Postgres Docker container (same version, 9.5) with its '/var/lib/postgresql/data' bind-mounted to that data directory.
I find that even though I am able to bring up the container and use psql within the container to connect to it, external connections fail with an invalid password. This is despite my setting the POSTGRES_USER, POSTGRES_DB and POSTGRES_PASSWORD environment variables.
Is there a way to make this work? I also tried this method but ran into a permission error:
"sh: 1: /usr/local/bin/init.sh: Permission denied"
Thanks

This happens when your user/group ID does not match the owner of the data files. You should run your container with --user.
Please have a look at the "Arbitrary --user Notes" section of https://hub.docker.com/_/postgres
Hope that helps you fix your problem.
For Compose, look at https://docs.docker.com/compose/reference/run/
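A sketch of what that looks like, assuming the existing data directory lives at ./pgdata on the host and is owned by your current user (the container name and password here are placeholders):

```shell
# Run Postgres as your own UID/GID so the server process can read the
# bind-mounted data directory. $(id -u):$(id -g) expands to e.g. 1000:1000.
docker run -d \
  --name pg95 \
  --user "$(id -u):$(id -g)" \
  -v "$PWD/pgdata:/var/lib/postgresql/data" \
  -e POSTGRES_PASSWORD=changeme \
  postgres:9.5
```

Note also that the POSTGRES_* variables only take effect when the data directory is empty; with a pre-existing cluster, the password already stored in the database wins, which would explain the external connections being rejected despite setting them.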

OK, I figured out a way to do this, and it turns out to be very simple. All I did was:
Add a script and copy it into docker-entrypoint-initdb.d in my Dockerfile.
In the script, I had a loop that waited for the db to be up and running before resetting the superuser password and privileges.
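Roughly, such an init script could look like this (a sketch; the user name and password are placeholders, and the script must have the executable bit set, or you get the "Permission denied" error mentioned above):

```shell
#!/bin/sh
# Placed in /docker-entrypoint-initdb.d/ so the entrypoint runs it on startup.
# Wait until the server accepts connections, then reset the superuser.
until pg_isready -U postgres >/dev/null 2>&1; do
  echo "waiting for postgres..."
  sleep 1
done
psql -U postgres -c "ALTER USER postgres WITH PASSWORD 'newpassword';"
```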
Thanks

Related

Export a backup from a container to host machine

I have a Docker container running MongoDB and want to export the database to my local file system.
I created a mongo dump by running
mongodump -u root -p root -o /data/my_dump
which created the dump inside my container. Now I want to bring this folder to my host machine.
I have tried running:
docker cp . mycontainerID:/data/mydump
but nothing really seems to happen. It takes some time and then shows an x in the terminal, without telling me whether there was an error or what went wrong.
Does anyone know what I'm doing wrong here?
I am trying to copy the file to wherever I run the command in the terminal.
Why don't you use volumes to do this? I think that's the best way to persist the data.
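Incidentally, the docker cp arguments in the question look reversed: as written, the command copies the current host directory into the container. To copy the dump out of the container, the container-side path comes first (using the container name and path from the question):

```shell
# docker cp SRC DEST - container-side paths are written as name:path.
docker cp mycontainerID:/data/my_dump ./my_dump
```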

Can't retrieve MongoDB to local drive using SCP from AWS EC2

I have a Docker container using Strapi (which uses MongoDB) on a now-defunct AWS EC2 instance. I need the content off that server - it can't run because the disk is too full. So I tried to retrieve all the files using SCP, which worked a treat apart from downloading the database content (the actual stuff I need - Strapi and Docker boot up fine, but because there is no database content, it is treated as a new instance).
Every time I try to download the contents of the db from AWS I get 'permission denied'.
I'm using SCP something like this:
scp -i /directory/to/***.pem -r user@ec2-xx-xx-xxx-xxx.compute-1.amazonaws.com:strapi-docker/* /your/local/directory/files/to/download
Does anyone know how I can get this entire Docker container running locally with the database content?
You can temporarily change permissions (recursively) on the directory in question to be world-readable using chmod.
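A sketch of that, assuming you can still SSH into the instance (the key path, hostname, and remote data path are placeholders):

```shell
# On the EC2 host: make the data directory world-readable (and directories
# traversable, via the capital X) so scp, running as a non-root user, can copy it.
ssh -i /path/to/key.pem user@ec2-host \
  'sudo chmod -R a+rX strapi-docker/data'

# Then pull the directory down to the local machine.
scp -i /path/to/key.pem -r user@ec2-host:strapi-docker/data ./strapi-data
```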

Pushkin fails to initialize PostgreSQL database

I am following the Pushkin Quickstart guide.
At pushkin init site, I get an error that no container was found for test_db_1:
...
Pulling test_db (postgres:11)...
11: Pulling from library/postgres
Digest: sha256:8e096175da9b7a1d5f073e4ff0b2058a68b3110dc9c26bcee0975d25ad1c008e
Status: Downloaded newer image for postgres:11
Creating pushkin_test_db_1 ... done
Creating pushkin_message-queue_1 ... done
Creating pushkin_api_1 ... done
Creating pushkin_server_1 ... done
Starting test_db ... done
Creating local test database
No container found for test_db_1
Failed to run create database command in test_db container: 1
If I open Docker Desktop > pushkin > pushkin_test_db_1 > Logs, I see
Error: Database is uninitialized and superuser password is not specified.
You must specify POSTGRES_PASSWORD to a non-empty value for the superuser.
For example, "-e POSTGRES_PASSWORD=password" on "docker run".
You may also use "POSTGRES_HOST_AUTH_METHOD=trust" to allow all connections without a password.
This is *not* recommended. See PostgreSQL documentation about "trust":
https://www.postgresql.org/docs/current/auth-trust.html
I tried opening ./pushkin.yaml, adding a password there, and running pushkin init site, but nothing changed. My guess is that the command behind "Creating pushkin_test_db_1" doesn't provide Docker with a password for the database. Looking at the postgres GitHub repository, this behavior seems to be a new "feature."
Does anyone have a recommendation for fixing this issue?
I think I found a fix. First, I deleted all of the old Docker containers to prevent conflicts when creating new ones. Then, from a new directory, I ran pushkin site default to initialize the filesystem. Next, in ./pushkin/docker-compose.dev.yml, update services: test_db to look like the following:
test_db:
  image: 'postgres:11'
  environment:
    POSTGRES_PASSWORD: testpassword
  ports:
    - '5432:5432'
  volumes:
    - 'test_db_volume:/var/lib/postgresql/data'
This will provide a password for the postgres database so that it doesn't complain. It looks like this could be easily updated at the source so that when pushkin site default downloads temp.zip, this change is already implemented in the yaml file.
This is a synchronization problem with the CLI. Sometimes, simply running "pushkin init site" again works. Try that and let me know if you get a different error.
What the pushkin init site command does is download all the dependencies the pushkin project needs. You may want to continue the following steps on the quick start and see if things go well.
And for this issue, try running docker-compose -f pushkin/docker-compose.dev.yml exec -T test_db psql -U postgres -c "create database test_db" and docker-compose -f pushkin/docker-compose.dev.yml stop test_db to see if anything changes.

How do I set the database password for pg_prove in PgTAP?

Just getting started with pgTAP; I'm using a server running in a Docker container on localhost. How do I tell pgTAP what the password for that server is? I don't see anything mentioned in the documentation.
pg_prove respects the standard PostgreSQL environment variables, so you need to set PGPASSWORD in the environment before running it. Since pg_prove connects using psql, I suspect you can also set up a ~/.pgpass file.
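For example (the host, database, and credentials are placeholders):

```shell
# pg_prove honors the standard libpq environment variables.
export PGHOST=localhost
export PGPORT=5432
export PGUSER=postgres
export PGPASSWORD=mysecret
pg_prove -d mydb tests/*.sql
```

Alternatively, a ~/.pgpass file with a line like localhost:5432:mydb:postgres:mysecret (mode 0600) avoids putting the password in the environment.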

How to use docker with mongo to achieve replication and with opening authentication

I want to use Docker to run a MongoDB instance, and at the same time have mongo use my own configuration file, in order to achieve replication and enable authentication.
I have scanned some docs but haven't resolved the problem.
Any ideas?
The Docker mongo image has a docker-entrypoint.sh that it calls in its Dockerfile.
Check if you can:
create your own image which creates the right user and restarts mongo with authentication on: see "umputun/mongo-auth" and its init.sh script;
or mount a createUser.js script into docker-entrypoint-initdb.d.
See "how to make a mongo docker container with auth"
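A sketch of the second approach (user names, passwords, and the replica-set name are placeholders; note that a replica set with auth enabled also needs a shared keyFile for internal member authentication, which is omitted here):

```shell
# createUser.js is executed by the entrypoint on first initialization only
# (i.e. when the data directory is empty).
cat > createUser.js <<'EOF'
db.getSiblingDB('admin').createUser({
  user: 'appuser',
  pwd: 'apppass',
  roles: [{ role: 'readWrite', db: 'appdb' }]
});
EOF

docker run -d --name mongo \
  -e MONGO_INITDB_ROOT_USERNAME=root \
  -e MONGO_INITDB_ROOT_PASSWORD=rootpass \
  -v "$PWD/createUser.js:/docker-entrypoint-initdb.d/createUser.js:ro" \
  mongo --auth --replSet rs0
```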