I'm using the Heroku CLI pg:pull command to migrate a Heroku Postgres database from one Heroku app (my-source-app) to another (my-target-app) - both of which are in my control.
First, I clear the database on the target application:
heroku pg:reset -a my-target-app
Then I initiate the pg:pull:
heroku pg:pull DATABASE $(heroku config:get DATABASE_URL -a my-target-app) --exclude-table-data='table5;table9' -a my-source-app
It seems to start working (transferring the schema, then data table by table), but it is very slow. The original DB is ~20GB; large, but not unreasonable. If I monitor the size of the target database (via the Heroku dashboard), it seems to fill at only about 35MB/minute.
My questions:
Is this command routing the data through my local machine or is it direct machine-to-machine?
Is there a way to "detach" from the process, and later monitor it (as I can with Heroku's run:detached command) so I don't need to remain online for the duration?
Is there a better approach for migrating the data here (such as creating a follower and switching it over to the new app somehow)? I've tried this without success.
Answering the specific questions:
The data was not copied via my local machine while running the command.
In the end, I remained connected while the pg:pull operation completed; there doesn't seem to be a way to detach.
A similar feature (which copies everything across) is pg:copy - see docs - which was a viable alternative here.
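For reference, the pg:copy invocation for this scenario would look something like the following (app names taken from the question; note that pg:copy overwrites the target database, so it asks for confirmation much like pg:reset does):
heroku pg:copy my-source-app::DATABASE_URL DATABASE_URL -a my-target-app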
I have recently started working on an existing Heroku environment.
How can I tell if there are database backups scheduled?
Assuming you are using Heroku Postgres, you can view backup schedules with the following command:
heroku pg:backups:schedules
You might have to provide the --app argument so Heroku knows which app you're interested in.
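For example, assuming the app is named your-app-name:
heroku pg:backups:schedules --app your-app-name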
I'm currently in the process of switching my cloud server from Heroku to DigitalOcean. However, is there a way to migrate the database from the Heroku server to the DigitalOcean one? I use PostgreSQL for my database.
I hope you already got a solution, but in case you didn't, I'll provide a simple guide on how I did it. I am going to assume that you have already created a Postgres database on DigitalOcean. You also need to navigate to your project directory and log in to Heroku using the Heroku CLI, and you need to have PostgreSQL installed or at least a psql client; installing PostgreSQL covers this, as it comes with psql.
Step 1: Create a backup and download the backup from heroku postgres
heroku pg:backups:capture --app <app_name>
heroku pg:backups:download --app <app_name>
The first command will create a backup of your database and the second will download it to your current directory as a .dump file. If you would like to read more, here is an article.
Step 2: Connect to your remote (digital ocean’s) database using psql
Before you can do this, you need to add the machine you are connecting from to the database's list of trusted sources. If you don't, you'll get a Connection Timed Out error, because the database's firewall doesn't allow connections from unrecognized machines (for security reasons).
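Once your machine is trusted, the connection itself looks something like this (the placeholders are the connection parameters from your DigitalOcean dashboard; DigitalOcean's managed databases require SSL, hence sslmode=require):
psql "postgresql://<database_username>:<database_password>@<host>:<port>/<database>?sslmode=require"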
Step 3: Import the Database
pg_restore -d "postgresql://<database_username>:<database_password>@<host>:<port>/<database>?sslmode=require" --jobs 4 -c "/path/to/dump_file.dump"
This will import your database from your dump file. Just substitute the variables with the connection parameters you get from your dashboard. If you would like to read more, here is another article for this step.
One more thing to note: you may sometimes see harmless error messages when running this command, but the restore will push through anyway. To learn more about pg_restore, read this article.
And that's it; your database has been migrated. Now, how can you confirm it worked? As for me, I used pgAdmin to connect to the remote database, and I saw the tables and data as expected.
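If you prefer the command line to pgAdmin, a quick sanity check is to list the tables with psql (same placeholder connection string as in Step 2):
psql "postgresql://<database_username>:<database_password>@<host>:<port>/<database>?sslmode=require" -c "\dt"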
Hope this helps anyone with the same problem :)
I've decided to save time on the ops side of things and move to Heroku. I'm planning to have a production dyno on Heroku with a postgres database AND another dyno that reads from the same database.
However when I opened the settings of postgres, it said:
Database Credentials
Get credentials for manual connections to this database.
Please note that these credentials are not permanent.
Heroku rotates credentials periodically and updates applications where this database is attached.
What's a good way to go about this?
From the Heroku documentation:
Credentials
Do not copy and paste database credentials to a separate environment or into your application’s code. The database URL is managed by Heroku and will change under some circumstances such as:
User-initiated database credential rotations using heroku pg:credentials:rotate.
Catastrophic hardware failure leading to Heroku Postgres staff recovering your database on new hardware.
Automated failover events on HA enabled plans.
It is best practice to always fetch the database URL config var from the corresponding Heroku app when your application starts. For example, you may follow 12Factor application configuration principles by using the Heroku CLI and invoke your process like so:
DATABASE_URL=$(heroku config:get DATABASE_URL -a your-app-name) your_process
This way, you ensure your process or application always has correct database credentials.
Maybe attaching the same database to two Heroku apps will suit you better. That way, the Postgres credentials are auto-managed by Heroku.
I am also using this technique. I have one client-facing app and another operations app sharing the same database instance.
You can do this either via the UI or via the CLI; see Share database between 2 apps in Heroku.
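Via the CLI, the attachment looks something like this (app names are placeholders, with the database originally provisioned on my-first-app):
heroku addons:attach my-first-app::DATABASE -a my-second-app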
I am migrating from server OLD at the old hosting company to server NEW at the new hosting company.
I want to run the clone command so I can clone the MongoDB database from OLD to NEW.
For OLD:
The public IP address is: 44.55.66.77.
The machine login username is: admin, and the password is: password.
What is the right way to do this?
So far I can't even log into the server OLD.
I have tried the following commands on NEW:
mongo -u admin -p password 44.55.66.77
mongo remote-ip:44.55.77.66 -u admin -p password
Those don't work.
I also tried this from mongo shell:
db.copyDatabase('OldDb', 'NewDb', '44.55.66.77', 'admin', 'password')
and I get the "could not connect to server" error message.
Aside from firewall considerations in order to copy data between MongoDB servers, db.copyDatabase() (aka the copydb command) has a number of important usage caveats including:
copydb does not produce point-in-time snapshots of the source database; writing data to the source or destination database during the copy process will result in divergent data sets
copydb does not lock the destination server during its operation, so the copy will occasionally yield to allow other operations to complete.
There is also a known issue that copydb may not work with the role-based privileges in MongoDB 2.4 if you have authentication enabled (see SERVER-8213, which was recently fixed in the 2.5.x development releases).
A much better approach to migrating your data would be to restore from a normal backup using mongodump/mongorestore or file system snapshots. The Backup & Recovery section of the MongoDB manual has tutorials covering procedures for different deployment types.
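As a rough sketch using the details from the question (the dump directory is an assumption, and you may also need --authenticationDatabase depending on where the admin user is defined):
# Dump the old database over the network, run from NEW:
mongodump --host 44.55.66.77 --username admin --password password --db OldDb --out /tmp/mongo-dump
# Restore the dump into the new server's local MongoDB:
mongorestore --db NewDb /tmp/mongo-dump/OldDb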
I'd like to have my master Postgres DB, which is hosted on Heroku, replicate down to a slave DB on my laptop. Is this possible?
Heroku's documentation talks about both master and slave hosted within Heroku:
https://devcenter.heroku.com/articles/heroku-postgres-follower-databases
Someone else asked whether it's possible to have the master outside Heroku and a slave inside Heroku (it's not):
Follow external database from Heroku
I haven't seen an answer for the reverse -- having the master in Heroku and the slave outside.
Why do I want this? To speed up development. With my app running locally and the DB in the cloud, the round-trip is long so data access is slow. Most data access is read-only. If I could have a local slave, it would speed things up significantly.
Related: what if my laptop is disconnected for a while? Would that cause problems for the master?
You cannot make a follower (slave) outside of the Heroku network: creating a follower requires superuser access, which Heroku Postgres doesn't provide, so you are limited to running followers on Heroku.
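For reference, creating a follower within Heroku looks something like this (the plan name is an assumption, and --follow takes the config var of the database to follow; heroku pg:info lists the exact names):
heroku addons:create heroku-postgresql:standard-0 --follow DATABASE_URL -a your-app-name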
If you want to pull down a copy locally for use/inspection, you can do so with pgbackups: https://devcenter.heroku.com/articles/heroku-postgres-import-export
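In practice that boils down to something like the following (the local database name is a placeholder; the pg_restore flags are the ones the import/export article suggests):
heroku pg:backups:capture -a your-app-name
heroku pg:backups:download -a your-app-name
pg_restore --verbose --clean --no-acl --no-owner -h localhost -d your_local_db latest.dump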
I'd highly recommend the program Parity for this.
It copies down the last Heroku backup to your local machine with a nice command line interface:
development restore production
I'd rather just pull the production database's contents from Heroku every now and then.
heroku db:pull
You can speed that up with a rake task.
# lib/tasks/deployment.rake
namespace :production do
  desc 'Pull the production DB to the local env'
  task :pull_db do
    puts 'Pulling PRODUCTION db to local...'
    system 'heroku db:pull --remote MY_REMOTE_NAME --confirm MY_APP_NAME'
    puts 'Pulled production db to local'
  end
end
You can call rake production:pull_db to overwrite your local development database.