Create, restore, and download MongoDB backups automatically on Swisscom Application Cloud - swisscomdev

We can create backups for MongoDB from the Developer Portal, but I'm wondering if this is exposed in any other way through the CLI?
Also, how can I access the backup to inspect it on my machine for example?

There is a CLI tool for making backups: mongodump (in the mongodb-org-tools package)
See also https://docs.mongodb.com/manual/core/backups/

Yes, there are two CLI commands to back up and restore:
Backup
mongodump --out backupPath
Restore
mongorestore -d databasename backupPath
Example:
mongodump --out C:/user/desktop/backup
mongorestore -d DB1 C:/user/desktop/backup/DB1

Turns out you can't do anything outside of the UI. As per their docs: https://docs.developer.swisscom.com/devguide-sc/services/backups.html

There is an API for Service Instance backups and restores (the interface you see in the Developer Portal). See the other question Are mongodb backups made automatically?. There is even a CLI plugin to automate Developer Portal backups. Try it with cf install-plugin -r CF-Community "Swisscom Application Cloud".
You can't download those backups to your local computer. For that you need to use a cf ssh tunnel and mongodump/mongorestore. There is a guide for migrating a Swisscom MongoDB to another DB provider (or your own computer); see here for the exact commands.
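Roughly, that approach looks like the sketch below. The host, port, credentials, and app name are placeholders you would take from your service key or cf env output, so treat this as an assumption-laden outline rather than the official procedure:
cf ssh <your-app> -L 27018:<mongodb-host>:27017 -N &
mongodump --host 127.0.0.1 --port 27018 -u <username> -p <password> --authenticationDatabase <database> -d <database> --out ./mongodb-backup
Once the dump is on your machine you can inspect it locally or feed it to mongorestore against another MongoDB instance.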

Related

How to check if there is a PostgreSQL backup scheduled?

I have recently started working on an existing Heroku environment.
How can I tell if there are database backups scheduled?
Assuming you are using Heroku Postgres, you can view backup schedules with the following command:
heroku pg:backups:schedules
You might have to provide the --app argument so Heroku knows which app you're interested in.
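For example, with a hypothetical app named my-app:
heroku pg:backups:schedules --app my-app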

Database transfer from Heroku to Digital Ocean

I'm currently in the process of switching my cloud server from Heroku to Digital Ocean. However, is there a way to migrate the database from the Heroku server to the Digital Ocean one? I use PostgreSQL for my database.
I hope you already got a solution, but in case you didn't, I'll provide a simple guide on how I did it. I am going to assume that you have already created a Postgres database on DigitalOcean. You also need to navigate to your project directory and log in to Heroku using the Heroku CLI, and you need PostgreSQL installed or at least a psql client (installing PostgreSQL will do, as it comes with psql).
Step 1: Create a backup and download the backup from heroku postgres
heroku pg:backups:capture --app <app_name>
heroku pg:backups:download --app <app_name>
The first command will create a backup of your database, and the second will download it to your current directory as a .dump file. If you would like to read more, here is an article.
Step 2: Connect to your remote (digital ocean’s) database using psql
Before you can do this, you need to add the machine you are connecting from to the database's list of trusted sources. If you don't, you'll get a Connection Timed Out error, because the database's firewall doesn't allow connections from your local machine or resource (for security reasons).
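The connection command for this step might look like the following; the placeholders are assumed values that you would take from your DigitalOcean dashboard:
psql "postgresql://<database_username>:<database_password>@<host>:<port>/<database>?sslmode=require"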
Step 3: Import the Database
pg_restore -d "postgresql://<database_username>:<database_password>@<host>:<port>/<database>?sslmode=require" --jobs 4 -c "/path/to/dump_file.dump"
This will import your database from your dump file. Just substitute the placeholders with the connection parameters you get from your dashboard. If you would like to read more, here is another article for this step.
Another thing to make clear: sometimes you will see some harmless error messages when running this command, but it will push through anyway. To learn more about pg_restore, read this article.
And that's it, your database has been migrated. Now, how can you confirm it worked? As for me, I used pgAdmin to connect to the remote database, and I saw the tables and data as expected.
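If you prefer the command line over pgAdmin, a quick check with the same assumed connection string lists the restored tables:
psql "postgresql://<database_username>:<database_password>@<host>:<port>/<database>?sslmode=require" -c "\dt"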
Hope this helps anyone with the same problem :)

Backup gem pg_dump version mismatch between EC2 and RDS

I am in the process of configuring the backup gem (http://backup.github.io/backup/v4/) to run on my EC2 instance, copy the PostgreSQL database in RDS, and store the backup in a new S3 bucket.
The backup gem runs the pg_dump command; however, AWS doesn't allow for the same version of Postgres to be installed on both EC2 and RDS, resulting in the following error:
pg_dump: server version: 9.4.7; pg_dump version: 9.2.13
pg_dump: aborting because of server version mismatch
This is because the EC2 instance has version:
$ pg_dump --version
pg_dump (PostgreSQL) 9.2.13
And the RDS instance has version:
9.4.7-R1 (with the only other version option of 9.5.2-R1)
On EC2, running yum list postgres* only offers Available Packages up to PostgreSQL 9.3.
So it seems like I am unable to either downgrade RDS or upgrade EC2 to a matching version.
Here is my Backup gem model config if it helps: https://gist.github.com/anonymous/35f6f9e81846f53693fb03662c2192ad
Before too many people start reminding me that RDS has built-in backups, I am aware. My use-case: instead of only having full database fallbacks, I would also like the ability to roll back individual users' data to different time periods without affecting the whole database. I planned on keeping these manual backups and eventually writing a script to pull previous user specific data from them.
My friend recommended another option: If a user wants to roll back, I could spin up a new RDS from the automated snapshots, clone my EC2 instance, connect them to each other, collect the user specific data from that snapshot, and then merge those changes back into the main EC2 instance.
Set up PostgreSQL’s YUM repository on your EC2 instance:
https://yum.postgresql.org/
and install a newer PostgreSQL client version.
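A hedged sketch of that on a CentOS/RHEL-style EC2 instance follows; the repository RPM URL is a placeholder (take the right one for your OS from https://yum.postgresql.org/), and the package name assumes the 9.4 PGDG repository:
sudo yum install <pgdg-repo-rpm-url>
sudo yum install postgresql94
pg_dump --version
The postgresql94 client package ships a 9.4 pg_dump that matches the 9.4.7 server on RDS, which makes the version-mismatch error go away.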

Heroku: Can I schedule PostgreSQL backups using the Scheduler add-on (maybe with pgbackups)?

I have a small app, and I run backups manually using heroku pgbackups:capture on my dev machine.
I'd like to use the Heroku Scheduler to send these backups to my own S3 bucket.
The thing is: pg_dump is not available on Heroku boxes, and heroku pgbackups:capture is a local CLI command, which is also not available.
Is there another way to achieve this using Scheduler?

Remote trigger a postgres database backup

I would like to back up my production database before and after running a database migration from my deploy server (not the database server). I've got a PostgreSQL 8.4 server sitting on a CentOS 5 machine. The website accessing the database is on a Windows 2008 server running an MVC.Net application; it checks out changes to the source code, compiles the project, runs any DB changes, then deploys to IIS.
I have the DB server set up with a crontab job for daily backups, but I also want a way of triggering a backup from the deploy server during the deploy process. From what I can figure out, there isn't a way to tell the database, from a client connection, to back itself up. If I call pg_dump from the web server as part of the deploy script, it will create the backup on the web server (not desirable). I've looked at the COPY command, and it probably won't give me what I want. MS SQL Server lets you call the BACKUP command from within a DB connection, which puts the backups on the database machine.
I found this post about MySQL, and it seems this isn't a supported feature in MySQL. Is Postgres the same? Remote backup of MySQL database
What would be the best way to accomplish this? I thought about creating a small application that makes an SSH connection to the DB server and then calls pg_dump, but this would mean storing SSH connection information on the server, which I'd really rather not do if possible.
Create a database user pgbackup and assign it read-only privileges on all your database tables.
Set up a new OS user pgbackup on the CentOS server with a /bin/bash shell.
Log in as pgbackup and create a pair of SSH authentication keys without a passphrase, and allow this user to log in using the generated key:
su - pgbackup
ssh-keygen -q -t rsa -f ~/.ssh/id_rsa -N ""
cp -a ~/.ssh/id_rsa.pub ~/.ssh/authorized_keys
Create a file ~pgbackup/.bash_profile:
exec pg_dump databasename --file=`date +databasename-%F-%H-%M-%S-%N.sql`
Set up your script on Windows to connect using SSH and authenticate using the private key. It will not be able to do anything besides creating a database backup, so it should be reasonably safe.
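On the deploy side, the call could then be as simple as the line below (the key file name and host are hypothetical); because .bash_profile execs pg_dump, logging in is what triggers the backup on the database machine:
ssh -i pgbackup_id_rsa pgbackup@centos-db-server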
I think this could be possible if you create a trigger that uses the PostgreSQL module dblink to make a remote database connection from within PL/pgSQL.
I'm not sure what you mean, but I think you can just use pg_dump from your Windows computer:
pg_dump --host=centos-server-name databasename > backup.sql
You'd need to install the Windows version of PostgreSQL there so that pg_dump.exe is available, but you don't need to start the PostgreSQL service or even initialize a local database cluster there.
Hi Mike, you are correct.
Using pg_dump, we can save the backup only on the local system. In our case we created a script on the DB server for taking the base backup, and an expect script on another server that runs that script on the database server.
All our servers are Linux servers, and we did this using shell scripts.
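A rough sketch of such an expect wrapper is below; the host, user, password, and script path are hypothetical placeholders, not the actual setup described above:
#!/usr/bin/expect -f
# log in to the DB server, run the backup script there, and wait for it to finish
spawn ssh backupuser@db-server "/home/backupuser/run_base_backup.sh"
expect "password:"
send "secret\r"
expect eof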