MariaDB Backup on Swisscom Cloud - swisscomdev

I'm interested in the MariaDB Service from the Swisscom Cloud.
https://docs.developer.swisscom.com/service-offerings/mariadb.html
What backup capabilities are offered by the Swisscom Cloud?
Is there something similar to what Pivotal Cloud Foundry offers?
https://docs.pivotal.io/p-mysql/backup.html

Perfect timing for asking: we have just released a major update to our platform, and instant backup/restore for MariaDB is a new feature available from today!
You can take backups and restore them from the administration console (GUI)
Michal Maczka
Product Manager Application Cloud

There's a MariaDB backup plugin for the CF CLI:
https://github.com/gsmachado/cf-mariadb-backup-plugin
With it, you can automate backup creation: for example, write a bash script that creates a backup every day, or every two days.

Related

How to back up a Postgres database inside a K8s cluster

I have set up a Postgres database inside a Kubernetes cluster and would now like to back it up, but I don't know how.
Can anyone help me get it done?
Thanks
Sure, you can back up your database. You can set up a CronJob to periodically run pg_dump and upload the dumped data to a cloud bucket. Check this blog post for more details.
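As a minimal sketch of that approach (the image tag, the schedule, and a `DATABASE_URL` environment variable holding the connection string are illustrative assumptions), the CronJob can even be created imperatively with kubectl. The snippet writes the command into a small helper script:

```shell
# Write a helper script containing the kubectl command that creates a
# CronJob running pg_dump every night at 02:00. Replace the trailing echo
# with an upload to your bucket (gsutil, aws s3 cp, ...).
cat > create-pg-backup-cronjob.sh <<'EOF'
#!/bin/sh
kubectl create cronjob postgres-backup \
  --image=postgres:16 \
  --schedule="0 2 * * *" \
  -- /bin/sh -c 'pg_dump "$DATABASE_URL" | gzip > /tmp/backup.sql.gz && echo "TODO: upload backup"'
EOF
chmod +x create-pg-backup-cronjob.sh
```

For anything beyond a quick sketch, a full CronJob manifest with the credentials mounted from a Secret is the more maintainable route.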
However, I recommend using a Kubernetes-native disaster recovery tool like Velero, Stash, or Portworx PX-Backup.
If you use an operator to manage your database, such as zalando/postgres-operator, CrunchyData, or KubeDB, you can use its native database backup functionality.
Disclosure: I am one of the developers of the Stash tool.

MariaDB Backup from the command line

The backup feature in the developer console is great. I would however like the possibility to automate this. Is there a way to do so from the cf command line app?
Thanks
It's not possible from the cf CLI, but there's an API endpoint for triggering backups.
API Docs | Custom Extensions | Swisscom Application Cloud Filter for
Cloud Foundry (CF) Cloud Controller (CC) API. Implements Swisscom
proprietary extensions
POST /custom/service_instances/{service-instance-id}/backups
Creates a backup for a given service instance
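That endpoint can be scripted. In the sketch below, the API URL and the service instance GUID are placeholders, and the authorization header is an assumption based on standard CF OAuth tokens (`cf oauth-token`):

```shell
# Build the backup-trigger URL for the custom extension endpoint.
# Both values below are placeholders; substitute your own.
API="https://api.lyra-836.appcloud.swisscom.com"
SERVICE_INSTANCE_ID="your-service-instance-guid"
URL="$API/custom/service_instances/$SERVICE_INSTANCE_ID/backups"
# With a token from `cf oauth-token`, the backup would be triggered via:
#   curl -X POST "$URL" -H "Authorization: $TOKEN"
echo "$URL"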
For more info, see Service Backup and Restore on docs.developer.swisscom.com:
Create Backup: To create a backup, navigate to the service instance in
the web console and then to the “Backups” tab. There you can click the
“Create” button to trigger a manual backup.
Note: Backups have to be triggered manually from the web console.
Be aware that you can only keep a set number of backups per service
instance. The actual number depends on the service type and service
plan. If you already have the maximum number, you cannot create any
new backups before deleting one of the existing ones.
It may take several minutes to backup your service (depending on the
size of your service instance).
Restore Backup: You can restore any backup at any time. The current
state of your service instance will be overwritten and replaced with
the state saved in the backup. You are advised to create a backup of
the current state before restoring an old one.
Limitations: You can only perform one backup or restore action per
service instance at a time. If an action is still ongoing, you cannot
trigger another one. You cannot exceed the maximum number of backups
per service instance.
We did this by developing a small Node.js application which runs on the cloud in the same space and backs up our MariaDB and MongoDB automatically every night.
EDIT:
You can download the code from here:
https://github.com/theonlyandone/cf-backup-app
Fresh off the press: the Swisscom Application Cloud cf CLI plugin can also automate backup and restore.
The official cf CLI plugin for the Swisscom Application Cloud gives
you access to all the additional features of the App Cloud.
cf install-plugin -r CF-Community "Swisscom Application Cloud"
from 0.1.0 release notes
Service Instance Backups
Add cf backups command (list all backups of a service instance)
Add cf create-backup command (create a new backup of a service instance)
Add cf restore-backup command (restore an existing backup of a service instance)
Add cf delete-backup command (delete an existing backup of a service instance)
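With these commands, a nightly job becomes a few lines of shell. In the sketch below, the API endpoint, org, space, and service instance names are placeholders; credentials are assumed to come from the environment:

```shell
# Write a nightly backup script using the plugin commands listed above.
# Schedule it from cron, e.g.: 0 3 * * * /path/to/nightly-cf-backup.sh
cat > nightly-cf-backup.sh <<'EOF'
#!/bin/sh
# Placeholders: API endpoint, org, space, and instance name.
cf login -a https://api.lyra-836.appcloud.swisscom.com \
         -u "$CF_USER" -p "$CF_PASS" -o my-org -s my-space
cf create-backup my-mariadb-instance
# List backups afterwards to confirm the new one exists.
cf backups my-mariadb-instance
EOF
chmod +x nightly-cf-backup.sh
```

Remember the limitation quoted above: once the maximum number of backups is reached, the script would also need a `cf delete-backup` step before creating a new one.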
Despite the answer from Matthias Winzeler saying it's not possible, it is in fact entirely possible to automate MariaDB backups through the command line.
I developed a plugin for the CF CLI:
https://github.com/gsmachado/cf-mariadb-backup-plugin
In the future, this plugin could be extended to back up any kind of service supported by the Cloud Foundry provider's API (in this case, the Swisscom AppCloud API).

Setting up backup strategy for backing up postgresql database on cloud foundry

We have set up a community PostgreSQL service on Cloud Foundry (IBM Bluemix). This is a free service, and no automated backup and recovery is supported out of the box.
Is there a way to set up a standby server or a regular backup in case there is any data corruption/failure?
IBM Compose and ElephantSQL can provide this service at a cost, but we are not ready for it yet.
PostgreSQL is an experimental service, and there is no dashboard or other advanced features (daily backup, for example) that you can find in the other services you mentioned. If you want a backup, you could write an ad-hoc script that exports all tables as you want, and run it every day.
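Such an ad-hoc export script might look like the sketch below; the host, port, user, and database names are placeholders supplied via environment variables:

```shell
# Write an ad-hoc pg_dump script; PGPASSWORD is expected in the environment.
# Schedule it daily, e.g. with cron: 0 1 * * * /path/to/nightly-pg-export.sh
cat > nightly-pg-export.sh <<'EOF'
#!/bin/sh
# Dump all tables in custom format, stamped with today's date.
pg_dump -h "$DB_HOST" -p "${DB_PORT:-5432}" -U "$DB_USER" \
        -F c -f "backup_$(date +%Y-%m-%d).dump" "$DB_NAME"
EOF
chmod +x nightly-pg-export.sh
```

The custom format (`-F c`) is compressed and lets you restore selectively with pg_restore later.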
If you need managed PostgreSQL, you can create a PostgreSQL by Compose service ($17.50/mo for the first GB and $12 per extra GB).
We used Postgresql Studio and deployed it on IBM Bluemix. The database service was connected to the pgstudio interface (This restricts the access to only connected databases). We also had to make minor changes to pgstudio so that we could use pg_dump with the interface.
The result: We could manually dump the data. This solution works well as we could take regular dumps (though manually).
In the free tier you are right in saying that you can't get backups. Those features are available only in the Compose for PostgreSQL service, but that's a paid service.

How to replicate MySQL database to Cloud SQL Database

I have read that you can replicate a Cloud SQL database to MySQL. Instead, I want to replicate from a MySQL database (that the business uses to keep inventory) to Cloud SQL so it can have up-to-date inventory levels for use on a web site.
Is it possible to replicate MySQL to Cloud SQL? If so, how do I configure that?
This is something that is not yet possible in CloudSQL.
I'm using DBSync to do it, and it's working fine.
http://dbconvert.com/mysql.php
The Sync version does the job you want. It works well with App Engine and Cloud SQL. You must authorize external connections first.
This is a rather old question, but it might be worth noting that this seems now possible by Configuring External Masters.
The high level steps are:
Create a dump of the data from the master and upload the file to a storage bucket
Create a master instance in CloudSQL
Set up a replica of that instance, using the external master's IP, username, and password. Also provide the dump file location
Setup additional replicas if needed
Voilà!
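Step 1 might be sketched as follows; the bucket name, database name, and credential variables are placeholders, and the exact flags Google expects for external masters should be checked against the current Cloud SQL docs:

```shell
# Write a script that dumps the external master and uploads it to a bucket.
cat > dump-and-upload.sh <<'EOF'
#!/bin/sh
# --master-data records the binlog position the replica starts from;
# --single-transaction takes a consistent snapshot without locking tables.
mysqldump --host="$MASTER_HOST" --user="$REPL_USER" --password="$REPL_PASS" \
          --master-data=1 --single-transaction --databases inventory \
  | gzip > inventory.sql.gz
gsutil cp inventory.sql.gz gs://my-replication-bucket/
EOF
chmod +x dump-and-upload.sh
```

The binlog coordinates embedded by `--master-data` are what lets the Cloud SQL replica pick up replication exactly where the dump left off.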

Does azure support things like mongodb and redis?

Can you use mongodb and redis/memcached with azure?
I'm guessing no but just want to make sure.
It turns out they do support things other than .NET; are they using Linux servers then?
You can very easily run mongodb in Windows Azure. I presented this at MongoSV - video here.
EDIT: In December 2011, 10gen published their official MongoDB+Azure code on github. This contains a project for replica-sets, as well as a demo ASP.NET MVC application (taken from the Windows Azure Platform Training Kit) that uses a replica set for its storage.
Standalone servers are straightforward, except you have to deal with scale-out: you can't have multiple instances of a standalone server simultaneously, so you'll need to plan for this: take all but one out of the load balancer, or only launch mongod if you can acquire the Cloud Drive lock.
Replicasets are doable, as I demonstrated at MongoSV. However, I didn't cover the intricacies of graceful shutdown of a replicaset to ensure zero data loss.
You can run memcached as well - see David Aiken's post about this. Note: Now that the AppFabric Cache service is live, you should look into the pros/cons of using that over memcached. Cost-wise, AppFabric Cache should run much less, as you don't have to pay for role instances to host your cache. More info about AppFabric Cache here.
You now also have the option of running Redis in Windows Azure on Linux virtual machines! In the case of Redis, this would allow you to use the "official" build instead of the "unsupported" Windows build. For MongoDB, the choices seem equally valid: running on Linux virtual machines, on "plain" Windows virtual machines, or using 10gen's package to run on "managed" VMs (Cloud Services).
FYI, there's now a Redis installer for Windows Azure available from MS Open Tech (my team). Here's a tutorial on how to use it: http://ossonazure.interoperabilitybridges.com/articles/how-to-deploy-redis-to-windows-azure-using-the-command-line-tool
[UPDATE] Azure now supports MongoDB and Redis.
http://azure.microsoft.com/blog/2014/04/22/announcing-new-mongodb-instances-on-microsoft-azure/
http://azure.microsoft.com/en-us/services/cache/
In the Azure Store you can now select Redis Cloud as an add-on.
Here's the Azure Store description:
"Redis Cloud is a fully-managed cloud service for hosting and running Redis in a highly-available and scalable manner, with predictable and stable top performance. Tell us how much memory you need and get started instantly with your new Redis database."
PUBLISHED DATE 3/31/2014
You can access the store by selecting the "New" button in the Azure portal then "Store". I have yet to use it but it looks promising.
Azure now has a first-party Redis service, currently in preview:
http://azure.microsoft.com/en-us/documentation/articles/cache-dotnet-how-to-use-azure-redis-cache/