This kind of arbitrary failure seems to be part of the Rails app learning curve, so I apologize for such a simpleminded question, but 'heroku pgbackups:capture' is simply failing. That is, I copied the URL for my PostgreSQL database on Heroku, then pasted it into:
% heroku pgbackups:capture postgres://<secret rest of db URL>
...and get the following response:
Database on ec2-50-19-215-116.compute-1.amazonaws.com ----backup---> b003
Pending... \
! An error occurred and your backup did not finish.
Helpful, eh? Any clues how I can suss this out? Thanks for help with a naive question.
Steve Upstill
If it persists, contact support - there's not a huge amount you can do here.
In my case, the problem was that the connection count for my database was exhausted. I was on a hobby db that only has a max of 20 concurrent connections, and they were all in use. I was able to trigger a backup successfully after freeing up some connections (which you can do e.g. by tuning your connection pools or shutting down some nodes).
You can see the connection count of your database in the Heroku Postgres add-on UI (the database's page on data.heroku.com).
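If you prefer the command line, you can also check this with pg:info, which reports a Connections line for each database (a quick sketch; the app name is a placeholder):
$ heroku pg:info --app <app_name>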
Make sure that you are running the latest version of the Heroku command-line tool:
$ heroku update
then re-install the add-on:
$ heroku addons:add pgbackups
then capture a backup of your primary database (without specifying the DB path, run from your local machine):
$ heroku pgbackups:capture
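To verify the capture worked, you can list your backups afterwards (a sketch, assuming the same pgbackups add-on as above):
$ heroku pgbackups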
Anyone facing something similar? Sadly I'm on a free plan, so I can't open a ticket... Does anyone know of a way to restart the DB's service/machine? Maybe that would just solve it...
OK, so the issue in my case is that Heroku had indeed performed some sort of maintenance on the DB, and apparently its connection params (host/url/user/password) had changed... and since they were embedded throughout all my interfaces (the app, my db tools, an admin app I also have), none were able to connect (they'd time out).
While trying to figure it out, I used Heroku's wonderful CLI tools heroku pg:info and heroku pg:diagnose and even heroku pg:psql, and confirmed that my data was still there... Eventually I went to the online admin, and that's where I saw the connection params had changed. BTW, I've had this project for 2 years and this is the first time this has happened...
I was in the same situation. Heroku will email you that your database is scheduled for maintenance. After it's complete, connecting to the database fails because of this error:
error: no pg_hba.conf entry for host "IP_ADDRESS", user "DB_USER", database "DB_NAME", no encryption
This tells us that incorrect credentials were given, meaning the database connection string has changed. I first checked the credentials on the website for the Heroku Postgres add-on (data.heroku.com), but the connection string there was the same as before; it had not been updated at all, so it was misleading. Instead, the updated connection string is found in the DATABASE_URL config variable, located in the Settings tab of your app on the Heroku dashboard (dashboard.heroku.com), under Config Vars. To avoid having to correct this manually again, read the connection string from the DATABASE_URL config variable directly rather than hard-coding it in your app.
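For example, you can read the current value straight from the CLI instead of copying it by hand (the app name is a placeholder):
$ heroku config:get DATABASE_URL --app <app_name>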
Yes, I'm in the same situation. Heroku are a bunch of amateurs. They did some maintenance on the DB, and after it was done, the credentials don't work, even those listed in the admin/dashboard section of the web. Bunch of losers... #heroku
I'm using the Heroku CLI pg:pull command to migrate a Heroku Postgres connected database from one Heroku app (my-source-app) to another (my-target-app) - both of which are in my control.
First, I clear the database on the target application:
heroku pg:reset -a my-target-app
Then I initiate the pg:pull:
heroku pg:pull DATABASE $(heroku config:get DATABASE_URL -a my-target-app) --exclude-table-data='table5;table9' -a my-source-app
It seems to start working (transferring schema then data table-by-table), but is very slow. The original db is ~20GB; large, but not unreasonable. If I monitor the size of the target database (via the Heroku dashboard) it seems to fill at only about 35MB/minute.
My questions:
Is this command routing the data through my local machine or is it direct machine-to-machine?
Is there a way to "detach" from the process, and later monitor it (as I can with Heroku's run:detached command) so I don't need to remain online for the duration?
Is there a better approach for migrating the data here (such as creating a follower and switching it over to the new app somehow; I've tried this without success)?
Answering the specific questions:
The data was not copied via my local machine while running the command.
In the end, I remained connected while the pg:pull operation completed; there doesn't seem to be a way to detach.
A similar feature (which copies everything across) is pg:copy - see docs - which was a viable alternative here.
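For reference, a pg:copy invocation for this migration might look like the following (a sketch using the app names from the question; note that pg:copy copies everything, with no table exclusions):
$ heroku pg:copy my-source-app::DATABASE_URL DATABASE_URL --app my-target-app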
I'm currently in the process of switching my cloud server from Heroku to DigitalOcean. However, is there a way to migrate the database from the Heroku server to the DigitalOcean one? I use PostgreSQL for my database.
I hope you already got a solution, but in case you didn't, I'll provide a simple guide on how I did it. I am going to assume that you have already created a Postgres database on DigitalOcean. Also, you need to navigate to your project directory and log in to Heroku using the Heroku CLI. And you need to have PostgreSQL installed, or at least a psql client; installing PostgreSQL will do, as it comes with psql.
Step 1: Create a backup and download the backup from heroku postgres
heroku pg:backups:capture --app <app_name>
heroku pg:backups:download --app <app_name>
The first command will create a backup of your database, and the second will download it to your current directory as a .dump file.
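You can also list your backups first to confirm the capture succeeded (a sketch, assuming the current pg:backups CLI):
$ heroku pg:backups --app <app_name>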
Step 2: Connect to your remote (digital ocean’s) database using psql
Before you can do this, you need to add the machine you are connecting from to the database's list of trusted sources. If you don't, you'll get a Connection Timed Out error, because the database's firewall doesn't allow connections from your local machine or resource (for security reasons).
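Once your machine is trusted, the connection looks something like this (a sketch; every value is a placeholder you get from your DigitalOcean dashboard):
$ psql "postgresql://<database_username>:<database_password>@<host>:<port>/<database>?sslmode=require"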
Step 3: Import the Database
pg_restore -d "postgresql://<database_username>:<database_password>@<host>:<port>/<database>?sslmode=require" --jobs 4 -c "/path/to/dump_file.dump"
This will import your database from your dump file. Just substitute the placeholders with the connection parameters you get from your dashboard.
One more thing to make clear: sometimes you will see some harmless error messages when running this command, but it will push through anyway.
And that's it, your database has been migrated. How can you confirm it worked? As for me, I used pgAdmin to connect to the remote database, and I saw the tables and data as expected.
Hope this helps anyone with the same problem :)
I have a Django project that I was initially running on PythonAnywhere using a Postgres database. However, PythonAnywhere doesn't support ASGI, so I am migrating the project over to Heroku. Since Heroku either strongly prefers or mandates use of its own Postgres functionality, I need to migrate the data from ElephantSQL. However, I'm really coming up short on how... I tried running pg_dumpall, but it didn't seem to want to work and/or the file disappeared into the ether. I'm at a loss for how to make this migration... If anyone could help, I'd appreciate it so much. T-T
After hours of searching and doing what I could to scour Heroku's documentation, I found it by running heroku pg:push --help.
For a locally running server, run
heroku pg:push '<db_name>' <heroku_db_name> --app <app_name>
For a hosted one, run
heroku pg:push <postgres_link> <heroku_db_name> --app <app_name>
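For example, with placeholder names (mylocaldb is a database on your local Postgres server, and DATABASE_URL targets the app's primary Heroku Postgres database):
$ heroku pg:push mylocaldb DATABASE_URL --app <app_name>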
I'd like to have my master Postgres DB, which is hosted on Heroku, replicate down to a slave DB on my laptop. Is this possible?
Heroku's documentation talks about both master and slave hosted within Heroku:
https://devcenter.heroku.com/articles/heroku-postgres-follower-databases
Someone else asked whether it's possible to have the master outside Heroku and a slave inside Heroku (it's not):
Follow external database from Heroku
I haven't seen an answer for the reverse -- having the master in Heroku and the slave outside.
Why do I want this? To speed up development. With my app running locally and the DB in the cloud, the round-trip is long so data access is slow. Most data access is read-only. If I could have a local slave, it would speed things up significantly.
Related: what if my laptop is disconnected for a while? Would that cause problems for the master?
You cannot make a follower (slave) outside of the Heroku network – followers need superuser access to create, which Heroku Postgres doesn't provide you, so you are limited to running a follower on Heroku.
If you want to pull down a copy locally for use/inspection, you can do so with pgbackups: https://devcenter.heroku.com/articles/heroku-postgres-import-export
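A minimal sketch of that flow, assuming the current pg:backups commands and a local database named mylocaldb (both the app name and the database name are placeholders):
$ heroku pg:backups:capture --app <app_name>
$ curl -o latest.dump "$(heroku pg:backups:url --app <app_name>)"
$ pg_restore --clean --no-acl --no-owner -d mylocaldb latest.dump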
I'd highly recommend the program Parity for this.
It copies down the last Heroku backup to your local machine with a nice command line interface:
development restore production
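A sketch of the setup, assuming Parity's conventions (it is a Ruby gem, and it expects your Heroku git remotes to be named after the environments):
$ gem install parity
$ development restore production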
I'd rather just pull the production database's contents from Heroku every now and then.
$ heroku db:pull
You can speed that up with a rake task.
# lib/tasks/deployment.rake
namespace :production do
  desc 'Pull the production DB to the local env'
  task :pull_db do
    puts 'Pulling PRODUCTION db to local...'
    system 'heroku db:pull --remote MY_REMOTE_NAME --confirm MY_APP_NAME'
    puts 'Pulled production db to local'
  end
end
You can call rake production:pull_db to overwrite your local development database.