I created an app on OpenShift and added a PostgreSQL cartridge.
There is no management application like pgAdmin supported by OpenShift.
I manage the DB via PuTTY under Windows.
But how can I import my local data into the DB on OpenShift?
Thank you in advance!
First, take a backup of your database using the command below.
pg_dump dbname > outfile
Then add the PostgreSQL cartridge like so:
rhc-ctl-app -a postgresApp -e add-postgresql-8.4
Then access the remote database and restore the dump. Note that a plain SQL dump like the one above is restored with psql; pg_restore only reads custom-format dumps (those made with pg_dump -Fc).
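For example, a minimal sketch of the restore step, assuming you are already in an SSH session on the gear and that admin/postgresApp are placeholder user and database names (not fixed by OpenShift):
# plain-text dump: feed it straight to psql
psql -U admin -d postgresApp < outfile
# custom-format dump (made with pg_dump -Fc dbname > outfile.dump) uses pg_restore instead
pg_restore -U admin -d postgresApp outfile.dump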
We use Postgres and Flask for our website, and we use a dump of the production database locally pretty often. To get a fresh dump, I use a remote desktop connection (RDC) to connect to pgAdmin on the server, then use RDC again to copy the .bak file from the server and save it locally. Likewise, I use a local instance of pgAdmin to restore the database state from the backup.
My manager asked me to automate this process so that the production database is pulled in each time a local Flask instance is launched. How can I do that?
You could write a shell script that dumps the database using pg_dump, then uses pg_restore to build a new local database from that dump. You could probably even pipe the output from pg_dump straight into pg_restore... something like
pg_dump --format=custom --host <remote-database-host> --dbname <remote-database-name> --username <remote-username> | pg_restore --host <local-database-host> --dbname <local-database-name> --username <local-username>
To get your password into pg_dump / pg_restore you'll probably want to use a .pgpass file, as described here: How to pass in password to pg_dump?
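For reference, each line of ~/.pgpass has the form host:port:database:username:password, and the file must be readable only by you (chmod 600). The entries below just mirror the placeholders from the command above:
remote-database-host:5432:remote-database-name:remote-username:remote-password
local-database-host:5432:local-database-name:local-username:local-password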
If you want this to happen automatically when you launch a Flask instance locally, you could call the shell script from your initialization code using a subprocess call when a LOCAL_INSTANCE environment variable is set, or something along those lines. A sketch of such a script is below.
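A minimal sketch of that script, assuming the placeholder hosts and names from the command above, with credentials supplied via ~/.pgpass (the sync_prod_db.sh name and the LOCAL_INSTANCE check are illustrative, not anything Flask defines):
#!/bin/sh
# sync_prod_db.sh -- refresh the local database from production (hypothetical name)
set -e
# only run when the developer has opted in via the environment
[ "$LOCAL_INSTANCE" = "1" ] || exit 0
# recreate the local database so the restore starts clean
dropdb --if-exists --host local-database-host --username local-username local-database-name
createdb --host local-database-host --username local-username local-database-name
# stream a custom-format dump from production straight into the local restore
pg_dump --format=custom --host remote-database-host --dbname remote-database-name --username remote-username \
  | pg_restore --host local-database-host --dbname local-database-name --username local-username --no-owner
From the Flask side, the initialization code would then just invoke this script (e.g., with Python's subprocess.run) before the app starts.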
Currently we have a single all-in-one Docker container for our production GitLab, using the bundled Postgres and Redis, so everything lives in the same container. We want to use an external Postgres DB, and a separate container for Redis as well, to follow production standards.
How can I migrate from the internal Postgres DB to an external Postgres DB? If anyone can provide the process and steps, that would be really helpful. We are new to this process.
Thank you everyone for your inputs,
PRS
You can follow the article "Migrating GitLab from internal to external PostgreSQL", which involves:
a database dump/reload, using pg_dumpall
sudo -u gitlab-psql /opt/gitlab/embedded/bin/pg_dumpall \
--username=gitlab-psql --host=/var/opt/gitlab/postgresql > /var/lib/pgsql/database.sql
sudo -u postgres psql -f /var/lib/pgsql/database.sql
Note: you can also use a backup of the database, but only if the external PostgreSQL version matches the embedded one exactly (see the version check in the sketch after the configuration steps).
setting its password
sudo -u postgres psql -c "ALTER USER gitlab ENCRYPTED PASSWORD '***' VALID UNTIL 'infinity';"
and modifying the GitLab configuration (/etc/gitlab/gitlab.rb):
# Disable the built-in Postgres
postgresql['enable'] = false
# Fill in the connection details
gitlab_rails['db_adapter'] = 'postgresql'
gitlab_rails['db_encoding'] = 'utf8'
gitlab_rails['db_host'] = '127.0.0.1'
gitlab_rails['db_port'] = 5432
gitlab_rails['db_database'] = "gitlabhq_production"
gitlab_rails['db_username'] = 'gitlab'
gitlab_rails['db_password'] = '***'
and applying your changes:
gitlab-ctl reconfigure && gitlab-ctl restart
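If you want to sanity-check the migration afterwards, something along these lines should work (gitlabhq_production matches the configuration above; the users count is just an illustrative probe):
# confirm the embedded and external PostgreSQL versions match (matters if you restored a backup)
sudo -u gitlab-psql /opt/gitlab/embedded/bin/psql --version
sudo -u postgres psql --version
# verify the data actually landed in the external database
sudo -u postgres psql -d gitlabhq_production -c "SELECT count(*) FROM users;"
# let GitLab run its own self-checks
sudo gitlab-rake gitlab:check SANITIZE=true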
@VonC
Hi, let me know about the process I followed below.
We currently have a single all-in-one Docker GitLab container using the bundled Postgres and Redis. To follow production standards we want to maintain separate Postgres and Redis instances for our prod GitLab. We already have data in the bundled DB, so we took a backup of the current GitLab with bundled Postgres, which generated a .tar file. Next we changed gitlab.rb to point at an external Postgres DB of the same version; we could then connect to GitLab but saw no data, because the fresh external DB was empty. Finally we ran the restore against the external Postgres DB, and now we can see all the data.
Can we do it this way? Our GitLab is now attached to the external Postgres and all the restored data is visible. Will this process work? Any downsides?
How is this process different from pg_dump and import?
I recently switched from SQLite to PostgreSQL and am using the Cloud9 IDE (Rails). With SQLite there was a file, db/development.sqlite, that I could open to examine the DB contents. Where can I find a similar file now that I've switched to PostgreSQL?
You should be able to use the following:
sudo -u postgres psql
to access your PostgreSQL server. Here's some documentation from Cloud9 regarding PostgreSQL.
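PostgreSQL doesn't keep its data in a single user-visible file the way SQLite does, so instead of opening a file you inspect the database from inside psql. A few meta-commands cover most of what you'd do by eyeballing development.sqlite (myapp_development and users are placeholder names for whatever your Rails app created):
\l
\c myapp_development
\dt
SELECT * FROM users LIMIT 5;
Here \l lists the databases, \c connects to one, \dt lists its tables, and the last line is ordinary SQL to peek at a table's rows.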
I am building a Flask app which I have deployed on Heroku. I recently added a Postgres database which I have been running locally and storing data in. I'm now ready to add the remote Postgres database to my Heroku site.
My question is: what are the steps to transfer data from the local database to the remote Postgres database on Heroku?
Thank you in advance!
You just need to export your local database using pg_dump and import it on Heroku using the PGBackups feature.
A full explanation of the process can be found at:
https://devcenter.heroku.com/articles/heroku-postgres-import-export#import
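In outline, the flow from that article looks like this (myuser, mydb, the bucket URL, and my-app are placeholders; the dump must be in custom format and reachable by Heroku, e.g. via a signed S3 URL):
# create a custom-format dump of the local database
pg_dump -Fc --no-acl --no-owner -h localhost -U myuser mydb > mydb.dump
# upload mydb.dump somewhere Heroku can fetch it, then restore it into your app's database
heroku pg:backups:restore 'https://s3.amazonaws.com/your-bucket/mydb.dump' DATABASE_URL --app my-app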