With the Heroku Release Phase, is it possible to run pg:backups:capture? Or is there another method for creating a database backup before running migrations?
Technically this is possible, but you must have the Heroku CLI installed on your dyno and you need to authenticate it somehow. So one solution is to find or write a buildpack that installs the CLI tool, and to add a config variable with authentication credentials.
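A minimal sketch of how that could fit together, assuming a buildpack has put the CLI on the dyno, HEROKU_API_KEY is set as a config var (the CLI reads it for authentication), and HEROKU_APP_NAME is a config var you added yourself; the script name is a placeholder:

```bash
#!/bin/bash
# release-tasks.sh -- hypothetical release script; assumes the Heroku CLI
# was installed by a buildpack and HEROKU_API_KEY is set as a config var.
set -euo pipefail

# HEROKU_APP_NAME is an assumed config var holding the app's name.
heroku pg:backups:capture --app "$HEROKU_APP_NAME"

# Run migrations only after the backup succeeds (a Rails app is assumed).
bundle exec rake db:migrate
```

With `release: bash release-tasks.sh` in your Procfile, this runs before each deploy.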
Another approach is to use a library such as https://github.com/kjohnston/pgbackups-archive. There is a problem, though: it uses the old Heroku API, which will be disabled in April 2017. I don't know of any similar library that uses the new API.
If you just want to back up your data and don't necessarily need pg:backups:capture, you can write a simple script that runs pg_dump against DATABASE_URL with some additional options and uploads the dump file to S3 or any other location. I think this is the easiest solution. Then just add this script as the release command in your Procfile.
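For example, a rough sketch of such a script, assuming the aws CLI is available on the dyno and that AWS credentials plus a BACKUP_BUCKET config var are set (all names here are placeholders):

```bash
#!/bin/bash
# backup.sh -- dump the database and upload it to S3.
set -euo pipefail

STAMP=$(date +%Y%m%d%H%M%S)
DUMP="/tmp/backup-$STAMP.dump"

# Custom-format dump, restorable later with pg_restore.
pg_dump "$DATABASE_URL" --format=custom --no-owner --file="$DUMP"

# BACKUP_BUCKET is an assumed config var naming your S3 bucket.
aws s3 cp "$DUMP" "s3://$BACKUP_BUCKET/$(basename "$DUMP")"
```

The Procfile entry can then chain the backup and the migration, e.g. `release: bash backup.sh && bundle exec rake db:migrate` (Rails assumed).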
I recently finished developing a Laravel 9 app, using WSL2 and Sail, just as the Laravel documentation told me to. Since this is my first time ever deploying to production, I ran into some differences between local and production files such as .env, docker-compose.yml and Dockerfile.
I tried using guides and tutorials but I can't seem to make sense of how to make it work. I have a droplet with a non-root user with sudo privileges, since I followed these two helpful guides:
https://www.digitalocean.com/community/tutorials/how-to-install-and-use-docker-on-ubuntu-20-04
https://www.digitalocean.com/community/tutorials/how-to-install-and-use-docker-compose-on-ubuntu-20-04
After finishing the installation, I tried to clone my app and run it like I do locally, and nothing happened. I realize I can't use Sail on the server, but what is the correct way to make it work?
All three local files (.env, docker-compose.yml and Dockerfile) were never edited.
Use this repository: docker-laravel. It includes PHP, Composer, and MariaDB.
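If you prefer to adapt your own files instead, a production compose file without Sail tends to look something like this minimal sketch; the image version, ports, and credentials below are assumptions, not values taken from that repository:

```yaml
# docker-compose.yml -- minimal production-style sketch for a Laravel app
version: "3.8"
services:
  app:
    build: .                        # your production Dockerfile
    env_file: .env                  # a production .env, not the Sail one
    ports:
      - "80:80"
    depends_on:
      - db
  db:
    image: mariadb:10.6
    environment:
      MARIADB_DATABASE: laravel
      MARIADB_USER: laravel
      MARIADB_PASSWORD: secret      # placeholder
      MARIADB_ROOT_PASSWORD: secret # placeholder
    volumes:
      - dbdata:/var/lib/mysql
volumes:
  dbdata:
```

The key difference from Sail is that the app image is built from your own Dockerfile and started with `docker compose up -d` on the droplet.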
How does the Heroku CLI tool manage DBs? What are the APIs it uses? The tasks I am trying to do from the app are to create/delete a Postgres DB, create a dump, and import a dump, using Python code and not the console or CLI.
There is no publicly defined API for the Heroku Data products, unfortunately. That said, in my experience, the paths are fairly stable and can mostly be reasoned out. This CLI plugin might give you a head start on trying to work out the routes you'd need to hit in order to achieve your goals.
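One caveat: creating and deleting the database itself is ordinary add-on provisioning, which the documented Platform API does cover; it is only the dump/restore operations that go through the undocumented Data API. A sketch of the documented part, where $HEROKU_API_KEY and $APP_NAME are assumed to be set and the plan name is an assumption (plans change over time):

```bash
# Provision a Postgres database (an add-on) via the public Platform API.
curl -X POST "https://api.heroku.com/apps/$APP_NAME/addons" \
  -H "Accept: application/vnd.heroku+json; version=3" \
  -H "Authorization: Bearer $HEROKU_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"plan": "heroku-postgresql:mini"}'

# Deleting the database means deleting the add-on.
curl -X DELETE "https://api.heroku.com/apps/$APP_NAME/addons/heroku-postgresql" \
  -H "Accept: application/vnd.heroku+json; version=3" \
  -H "Authorization: Bearer $HEROKU_API_KEY"
```

The same requests are easy to issue from Python with any HTTP client.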
We have a Cloud Foundry app that is bound to a PostgreSQL service. Right now we have to manually connect to the PostgreSQL database with pgAdmin, and then manually run the queries to create our tables.
Attempted solution:
Do a cloud foundry run-task in which I would:
1) Install psql and connect to the remote database
2) Create the tables
The problem I ran into was that cf run-task has limited permissions to install packages.
What is the best way to automate database table creation for a cloud-foundry application?
Your application will run as a non-root user, so it will not have the ability to install packages, at least in the traditional way. If you want to install a package, you can use the Apt Buildpack to install it. This will install the package, but into a location that does not require root access. It then adjusts your environment variables so that binaries & libraries can be found properly.
Also keep in mind that tasks are associated with an application (they both use the same droplet), so to make this work you'd need to do one of two things:
1.) Use multi-buildpacks to run the Apt Buildpack plus your standard buildpack. This will produce a droplet that has both your required packages and your app bits. Then you can start your app and kick off tasks to set up the DB (see the sketch below).
2.) Use two separate apps. One for your actual app and one for your code that seeds the database.
Either one should work; both are valid ways to seed your database. The other option, which is what I have typically done, is to use some sort of tool to do this. Some frameworks, like Rails, have this built in. If your framework does not, you could bring your own tool, like Flyway. These tools often also help with the evolution of your DB schema, which can be useful too.
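For option 1, the pieces could look roughly like this; the buildpack names, package list, and task command are assumptions to adapt, not a drop-in recipe:

```yaml
# apt.yml -- read by the Apt Buildpack; installs a Postgres client
---
packages:
  - postgresql-client
```

```yaml
# manifest.yml -- multi-buildpack setup; apt runs first, then your normal buildpack
applications:
  - name: my-app                 # placeholder app name
    buildpacks:
      - https://github.com/cloudfoundry/apt-buildpack
      - nodejs_buildpack         # placeholder; use your standard buildpack
```

After `cf push`, a task can call the installed client, e.g. `cf run-task my-app "psql $DATABASE_URL -f create-tables.sql" --name seed-db`, assuming you have exposed the bound service's connection string (from VCAP_SERVICES) as DATABASE_URL.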
We want to restore a database that we got from the client as a backup into our development environment, but we are unable to restore it successfully. Can anyone help us understand the steps involved in this restore process? Thanks in advance.
Vijay, if you plan to make a new database out of checkpoints (plus journals) made on another (physical) server, then I must disappoint you: it is going to be a painful process. Follow these instructions: http://docs.actian.com/ingres/10.0/migration-guide/1375-upgrading-using-upgradedb. The process is basically the same as for upgradedb. However, if the architecture of the development server is different (say, the backup was made on a 32-bit system and the development machine is POWER6-based), then it is impossible to make your development copy of the database using this method.
On top of all this, this method of restoring backups is not officially supported by Actian.
My recommendation is to use the 'unloaddb' tool on the production server, export the database into some directory, SCP that directory to your development server, and then use the generated 'copy.in' file to create the development database. NOTE: this is the method supported by Actian, and you will find more details on this page: http://docs.actian.com/ingres/10.0/migration-guide/1610-how-you-perform-an-upgrade-using-unloadreload. This is the preferred way of migrating databases across platforms.
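As a rough sketch of that workflow (database names and paths are placeholders, and the generated script names can vary by platform and Ingres version):

```bash
# On the production server: unload the database into a working directory.
mkdir /tmp/proddb_unload && cd /tmp/proddb_unload
unloaddb proddb            # generates unload/reload scripts plus copy.out/copy.in
sh unload.ing              # export the schema and data into this directory

# Transfer the export to the development server.
scp -r /tmp/proddb_unload dev-server:/tmp/

# On the development server: create an empty database and reload into it.
createdb devdb
cd /tmp/proddb_unload
sql devdb < copy.in        # reload using the generated copy.in
```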
It really depends on how the database has been backed up and provided to you.
In Ingres there is a snapshot (called a checkpoint) that can be restored into a congruent environment, but that can be quite involved.
There is also output from copydb and unloaddb commands which can be reloaded into another database. Things to look out for here are a change in machine architecture or paths that may have been embedded into the scripts.
Do you know how the database was backed up?
I have a local version of Strapi set up, and the codebase is pushing fine to Netlify for the frontend and Heroku for the backend.
However, I can't work out how to get the content held in the .tmp/data.db file into the mLab instance of the database on Heroku.
The structure is all in sync with my local version.
I've tried exporting tables from SQLite to JSON files and then importing them as collections using the CLI, which says it has imported the documents into Heroku (and I can see them in the mLab interface), but this was a last-ditch attempt, as I couldn't see a way of transferring the entire file.
However, this isn't working as the content types are still empty.
Make sure you have correctly configured ./config/environments/production/database.json with your mLab settings.
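For reference, in Strapi 3 that file takes roughly the following shape; the connector name and keys vary by Strapi version, and DATABASE_URI is an assumed environment variable holding your mLab connection string, so treat every value as a placeholder:

```json
{
  "defaultConnection": "default",
  "connections": {
    "default": {
      "connector": "mongoose",
      "settings": {
        "uri": "${process.env.DATABASE_URI}"
      },
      "options": {
        "ssl": true
      }
    }
  }
}
```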
In development, it looks like you are using SQLite. This database is fine for local development but can't be used on Heroku: Heroku's filesystem is ephemeral, so a file-based database like .tmp/data.db is wiped on every dyno restart.
Be careful: you are using a SQL database in dev and a NoSQL database in production. That is an unusual setup, and depending on your data structure you can run into data-migration issues. I don't suggest you do that; use the same type of database in dev and prod.