Database transfer from Heroku to DigitalOcean - PostgreSQL

I'm currently in the process of switching my cloud server from Heroku to DigitalOcean. However, is there a way to migrate the database from the Heroku server to the DigitalOcean one? I use PostgreSQL for my database.

I hope you already got a solution, but in case you didn’t, I’ll provide a simple guide on how I did it. I am going to assume that you have already created a Postgres database on DigitalOcean. You also need to navigate to your project directory and log in to Heroku using the Heroku CLI, and you need PostgreSQL installed or at least a psql client (installing PostgreSQL covers this, since it comes with psql).
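If you haven’t authenticated the CLI yet, logging in is just:
heroku login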
Step 1: Create a backup and download the backup from heroku postgres
heroku pg:backups:capture --app <app_name>
heroku pg:backups:download --app <app_name>
The first command creates a backup of your database, and the second downloads it to your current directory as a .dump file.
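If you later need to see what backups exist (for example, to grab a backup ID), you can list them with:
heroku pg:backups --app <app_name>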
Step 2: Connect to your remote (digital ocean’s) database using psql
Before you can do this, you need to add the machine you are connecting from to the database’s list of trusted sources. If you don’t, you’ll get a Connection Timed Out error, because the database’s firewall won’t allow connections from an unlisted machine or resource (for security reasons).
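Once your machine is trusted, connect with psql using the connection string from your DigitalOcean dashboard (all values below are placeholders):
psql "postgresql://<database_username>:<database_password>@<host>:<port>/<database>?sslmode=require"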
Step 3: Import the Database
pg_restore -d "postgresql://<database_username>:<database_password>@<host>:<port>/<database>?sslmode=require" --jobs 4 --clean "/path/to/dump_file.dump"
This will import your database from your dump file. Just substitute the placeholders with the connection parameters you get from your dashboard.
One thing to make clear: sometimes you will see some harmless error messages when running this command, but it will push through anyway.
And that’s it, your database has been migrated. Now, how can you confirm it worked? As for me, I used pgAdmin to connect to the remote database, and I saw the tables and data as expected.
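If you’d rather verify from the command line than with pgAdmin, listing the tables with psql (same connection string as in step 3) is enough:
psql "postgresql://<database_username>:<database_password>@<host>:<port>/<database>?sslmode=require" -c "\dt"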
Hope this helps anyone with the same problem :)

Related

Backup/Restore with pgAdmin

I'm not sure that I actually have a problem, but I am confused. I wanted to move a PostgreSQL database from PythonAnywhere to an AWS RDS server. I connected to both servers with pgAdmin from my Windows PC (ssh tunnel with PuTTY). I then did a backup of the database from PythonAnywhere and then did a restore to a clean database on the RDS server.
The backup had no issues, but, while the restore seemed to run fine, pgAdmin showed the process "Failed". The database on the RDS server looks fine. I checked row counts on a few tables, and they matched what I had on PythonAnywhere. I don't see any messages in pgAdmin other than that the process failed. I don't see anything in the pgAdmin logs to indicate what might be wrong. Do I have a problem? Should I use a command line restore instead?
Thanks for any insights.
--Al
Using the CLI is a better way; the IDE may not show the full errors. pg_dump will work for you. Maybe you should spin up an EC2 instance for this job.
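A rough command-line sketch, with placeholder hosts and names (substitute your PythonAnywhere and RDS connection details; if you tunnel over SSH, the host becomes localhost plus the tunnel's port):
pg_dump -Fc -h <pythonanywhere_host> -U <user> -d <source_db> -f mydb.dump
pg_restore --verbose --no-owner -h <rds_endpoint> -U <user> -d <target_db> mydb.dump
Running pg_restore with --verbose in a terminal shows exactly which object failed, which is the detail pgAdmin's "Failed" status hides.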

Migrating database with heroku pg:pull in detached state

I'm using the Heroku CLI pg:pull command to migrate a Heroku Postgres connected database from one Heroku app (my-source-app) to another (my-target-app) - both of which are in my control.
First, I clear the database on the target application:
heroku pg:reset -a my-target-app
Then I initiate the pg:pull:
heroku pg:pull DATABASE $(heroku config:get DATABASE_URL -a my-target-app) --exclude-table-data='table5;table9' -a my-source-app
It seems to start working (transferring schema then data table-by-table), but is very slow. The original db is ~20GB; large, but not unreasonable. If I monitor the size of the target database (via the Heroku dashboard) it seems to fill at only about 35MB/minute.
My questions:
Is this command routing the data through my local machine, or is it direct machine-to-machine?
Is there a way to "detach" from the process and monitor it later (as I can with Heroku's run:detached command), so I don't need to remain online for the duration?
Is there a better approach for migrating the data here (such as creating a follower and switching it over to the new app somehow; I've tried this without success)?
Answering the specific questions:
The data was not copied via my local machine while running the command.
In the end, I remained connected while the pg:pull operation completed; there doesn't seem to be a way to detach.
A similar feature (which copies everything across) is pg:copy - see docs - which was a viable alternative here.
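For reference, a pg:copy invocation matching the app names in the question would look roughly like this (the source database is referenced as app::CONFIG_VAR):
heroku pg:copy my-source-app::DATABASE_URL DATABASE_URL -a my-target-app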

How do I view a PostgreSQL database on heroku with a GUI?

I have a rails app on Heroku that is using a Postgres database. My database has > 40 tables and > 10,000 rows. I would like to delete a lot of data, but it would be much easier if I were able to view and interact with it in a GUI. I can access my data in the rails console, but it's taking too long.
pgweb is a great cross-platform GUI, and it's easy to connect to your Heroku Postgres when launching from the command line.
I installed via Homebrew on a Mac (brew install pgweb), but instructions for other platforms are listed on the site. Here's how I launch pgweb connected to a Heroku Postgres DB:
heroku config:get DATABASE_URL | xargs pgweb --url
And if you want to connect to your localhost:
pgweb --host localhost
I'm a little late here, but this may help someone else who stumbles across this thread...
Go to your Heroku app's dashboard (through the website) > Settings > "Reveal Config Vars" > DATABASE_URL, and paste that URL into the browser.
I use TablePlus for database management; when I paste the link into the browser, it asks if it can open TablePlus, and then I can edit my production database in real time just like I would in development.
I'm not sure what pasting the URL into the browser will do if you don't have TablePlus. I assume it will request to open any other SQL management app you might have.
As slumdog wrote in the comment to your question, you can use pgAdmin, which comes with your local Postgres installation.
This article explains how to connect your remote Heroku DB with pgAdmin, using Heroku credentials: https://medium.com/@vapurrmaid/getting-started-with-heroku-postgres-and-pgadmin-run-on-part-2-90d9499ed8fb
From the article:
"pgAdmin is a GUI for postgresql databases that can be used to access and modify databases that not only exist locally, but also remotely. For a fresh install of pgAdmin, the dashboard likely contains only one server. This is your local server...
We have to configure a new remote server with its credentials.
right click server(s) > create > server …
Fill out the following:
Name: This is solely for you. Name it whatever you want, I chose ‘Heroku-Run — On’
Under the connection tab: hostname/address. If you go back to your datastores ‘reveal credentials’, this is the host credential. It should look like --**...amazonaws.com
Keep the port at 5432, unless your credentials list otherwise
Maintenance database — this is the database field in the credentials
Username — this is the user field in the credentials
Password — the password field in the credentials. I highly advise checking save password so that you don’t have to copypasta this every time you want to connect.
In the SSL tab, mark SSL mode as require
At this point, if we were to hit ‘save’ (please don’t), something very strange would happen. You’d see hundreds if not thousands of databases appear in pgAdmin. This has to do with how Heroku configures their servers. You’ll still only have access to your specific database, not those of others. In order to avoid parsing so many databases, we have to white list only those databases we care about.
go to the Advanced tab and under db restriction copy the database name (it’s the same value as the Maintenance database field filled earlier)."
The article contains other useful guidelines and screenshots.
Try the GUI of DBeaver.
https://dbeaver.io/
Download it; after that you can connect to your Heroku Postgres using the database credentials.
You can use Heroku's hosted DB viewer on the Overview pane of your dashboard:
Create and click the Dataclip. The Dataclip GUI is fairly easy to use; you can type and customize SQL queries at the top, etc.

How can I copy my local PostgreSQL database to Heroku for a Spring Boot app

I have deployed my Spring Boot app to Heroku. Now I would like to copy my local PostgreSQL database to Heroku.
I have found some information on devcenter.heroku.com.
However, I don't understand enough about the use of the file db.changelog-master.yaml.
Could anyone give me details about the simplest solutions to copy the database?
Create a valid dump of your local Postgres database and host it somewhere publicly available. You will then be able to restore the entire dataset (schema and records) with pg:backups:restore. The sole caveat is that the target database must be completely empty for this to work; you can empty a Heroku Postgres database with heroku pg:reset.
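As a rough sketch, assuming a hypothetical public URL where you've hosted the dump, plus placeholder local and app names:
pg_dump -Fc --no-acl --no-owner -h localhost -U <local_user> <local_db> > mydb.dump
heroku pg:reset DATABASE_URL --app <your_app>
heroku pg:backups:restore 'https://example.com/mydb.dump' DATABASE_URL --app <your_app>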
If you cannot take the approach listed above, then you can run pg_restore directly from your local instance, provided your local version of Postgres is >= the target version of Postgres. This also applies to creating the dump file, and is a requirement because pg utilities are not guaranteed to be forward compatible. See the pg_restore documentation for details.
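A sketch of that direct approach, with placeholders for the Heroku connection details (you can read them with heroku pg:credentials:url); the flags are standard pg_restore options:
pg_restore --verbose --clean --no-acl --no-owner -h <host> -p <port> -U <user> -d <dbname> mydb.dump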

Remote trigger a postgres database backup

I would like to back up my production database before and after running a database migration, from my deploy server (not the database server). I've got a PostgreSQL 8.4 server sitting on a CentOS 5 machine. The website accessing the database is on a Windows 2008 server running an MVC.NET application; it checks out changes in the source code, compiles the project, runs any DB changes, then deploys to IIS.
I have the DB server set up to do a crontab job for daily backups, but I also want a way of triggering a backup from the deploy server during the deploy process. From what I can figure out, there isn't a way to tell the database from a client connection to back itself up. If I call pg_dump from the web server as part of the deploy script, it will create the backup on the web server (not desirable). I've looked at the COPY command, and it probably won't give me what I want. MS SQL Server lets you call the BACKUP command from within a DB connection, which will put the backups on the database machine.
I found this post about MySQL, and that it's not a supported feature in MySQL. Is Postgres the same? Remote backup of MySQL database
What would be the best way to accomplish this? I thought about creating a small application that makes an SSH connection to the DB server and then calls pg_dump. This would mean storing SSH connection information on the server, which I'd really rather not do if possible.
Create a database user pgbackup and assign it read-only privileges to all your database tables.
Set up a new OS user pgbackup on the CentOS server with a /bin/bash shell.
Log in as pgbackup, create a pair of SSH authentication keys without a passphrase, and allow this user to log in using the generated private key:
su - pgbackup
ssh-keygen -q -t rsa -f ~/.ssh/id_rsa -N ""
cp -a ~/.ssh/id_rsa.pub ~/.ssh/authorized_keys
Create a file ~pgbackup/.bash_profile:
exec pg_dump databasename --file=`date +databasename-%F-%H-%M-%S-%N.sql`
Set up your script on Windows to connect using ssh and authenticate using the private key. It will not be able to do anything besides creating a database backup, so it should be reasonably safe.
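From the Windows deploy server, triggering a backup then reduces to a single SSH call with the copied private key (paths and hostname below are placeholders); because .bash_profile uses exec, the login session runs pg_dump and exits:
ssh -i /path/to/pgbackup_id_rsa pgbackup@db-server
The dump file lands in pgbackup's home directory on the database server, which is where you wanted the backups to end up.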
I think this could be possible if you create a trigger that uses the PostgreSQL module dblink to make a remote database connection from within PL/pgSQL.
I'm not sure what you mean, but I think you can just use pg_dump from your Windows computer:
pg_dump --host=centos-server-name databasename > backup.sql
You'd need to install the Windows version of PostgreSQL there so that pg_dump.exe is available, but you don't need to start the PostgreSQL service or even create a tablespace there.
Hi Mike, you are correct.
Using pg_dump, we can save the backup only on the local system. In our case, we created a script on the DB server for taking the base backup, and an expect script on another server which runs that script on the database server.
All our servers are Linux servers; we did this using shell scripts.