How can we do automatic daily backups for a Compute Engine disk in Google Cloud? - google-cloud-storage

I have created an instance in Compute Engine with Windows Server 2012. I can't see any option to take an automatic daily backup of the instance's disk or database. There is a snapshot option, but it has to be run manually. Please suggest a way to back up automatically that can be restored with a single click. If there is any other possibility using Cloud SQL, Cloud Storage, or another storage service, please recommend it.
Thanks.

There's an API to take snapshots; see the API section here:
https://cloud.google.com/compute/docs/disks/create-snapshots#create_your_snapshot
You can write a simple app, triggered from cron or a similar scheduler, to take a snapshot periodically.
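A minimal sketch of such an app in Python, using the google-api-python-client library with Application Default Credentials (the project, zone, and disk names below are placeholders):

    # pip install google-api-python-client
    from datetime import datetime

    from googleapiclient import discovery

    PROJECT = 'my-project'    # placeholder
    ZONE = 'us-central1-a'    # placeholder
    DISK = 'my-windows-disk'  # placeholder

    compute = discovery.build('compute', 'v1')

    # Snapshot names must be unique, so append a UTC timestamp.
    name = '{}-{}'.format(DISK, datetime.utcnow().strftime('%Y%m%d-%H%M%S'))

    operation = compute.disks().createSnapshot(
        project=PROJECT, zone=ZONE, disk=DISK, body={'name': name}).execute()
    print('Started snapshot operation: ' + operation['name'])

A crontab entry such as 0 2 * * * python snapshot.py would then take a snapshot every night at 02:00.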

There is no built-in provision for automatic backups of a Compute Engine disk; you can only do a manual disk backup by creating a snapshot.
The best alternative is to create a bucket and move your files there: Cloud Storage buckets offer durability and versioning features that suit automated backups.
Cloud Storage and Cloud SQL are your options for automated backups in Google Cloud.
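As a sketch, a scheduled job could copy backup files into such a bucket with the google-cloud-storage Python client (the bucket name and file paths below are placeholders):

    # pip install google-cloud-storage
    from google.cloud import storage

    client = storage.Client()
    bucket = client.bucket('my-backup-bucket')  # placeholder bucket name

    # Upload a local backup file. Enabling Object Versioning on the bucket
    # keeps older copies when the same object name is overwritten.
    blob = bucket.blob('backups/db-backup.bak')
    blob.upload_from_filename(r'C:\backups\db-backup.bak')  # placeholder local path
    print('Uploaded to gs://{}/{}'.format(bucket.name, blob.name))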

Related

Google Cloud CloudSQL Cloning

I am trying to find more information on the GCP Cloud SQL cloning feature, but can't find good enough information anywhere, including the Google docs.
Is this feature similar to zero-copy clones, which make instant clones of databases, or does it physically copy blocks from one instance to create the new Cloud SQL instance?
I have a 10 TB Postgres database and am wondering what's the best way to make a copy of it.
Thanks in advance.

Azure Durable Function app with Postgres data store

We need to host an existing Azure Durable Functions app outside of Azure. We can run the function app as a container, but we'll need to configure an alternate data store (it currently uses Azure Storage). I can see that MS SQL is a supported alternative - see here - and this will work for us, but Postgres is more aligned with the direction we're headed, so it would be preferable. Has anyone used Postgres as the storage provider for Azure Durable Functions apps?
The language-specific steps for deploying outside Azure are described in https://learn.microsoft.com/en-us/azure/azure-functions/durable/durable-functions-overview?tabs=python
For the storage considerations, refer to: https://learn.microsoft.com/en-us/azure/azure-functions/storage-considerations
But there is no specific, complete procedure for PostgreSQL yet; we are waiting for an update from Azure.
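For reference, opting into the supported MSSQL provider is a host.json setting; a minimal sketch, following the Durable Task SQL provider documentation (the connection string name here is an example), looks like this:

    {
      "version": "2.0",
      "extensions": {
        "durableTask": {
          "storageProvider": {
            "type": "mssql",
            "connectionStringName": "SQLDB_Connection"
          }
        }
      }
    }

There is no equivalent setting for Postgres at the time of writing.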

Shared Access for Home Directory in Google Cloud Shell

I am currently using the Google Cloud Shell, and I wish to access the persistent disk of another user. (Not using local shell)
More info on topic of inquiry: https://medium.com/google-cloud/no-localhost-no-problem-using-google-cloud-shell-as-my-full-time-development-environment-22d5a1942439
Cloud Shell is a micro VM dedicated to you, free of charge, with a personal persistent disk mounted.
EDIT: Thanks to @John Hanley's comment, you can access someone else's Cloud Shell files with the code he provided. However, you need the credentials of the target Cloud Shell environment, so it's not very secure or recommended.
Alternatively, you can mount a FUSE directory, and the other user can do the same. With FUSE, you navigate a bucket as if it were a directory. But be careful: a storage bucket is not a file system, so performance and usage aren't the same. Moreover, FUSE doesn't guarantee data integrity in the case of simultaneous file access, especially concurrent writes. Use it with caution.
Still, you can have a common workspace this way, if that's your requirement.
If you use Cloud Shell as a dev environment, like a computer or a VM, the same best practices apply. The dev environment has to be considered ephemeral (a computer can have an outage or be lost or stolen; people can leave a company and you no longer have access to their Cloud Shell), so you have to save your sources frequently to a safe place (a Git repository, or Cloud Storage via FUSE).
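As a sketch of the FUSE approach, each user would mount the same bucket from their own Cloud Shell session with gcsfuse (the bucket and mount point names are illustrative):

    # Create a mount point and mount the shared bucket
    mkdir -p ~/shared-workspace
    gcsfuse my-shared-bucket ~/shared-workspace

    # Unmount when finished
    fusermount -u ~/shared-workspace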

Fluentd daemonset alternative to S3 on Azure (Blob)

For log-intensive microservices, I was hoping to persist my logs as blobs in Azure Blob Storage (the S3 alternative). However, I noticed that Fluentd does not seem to support it out of the box.
Is there any alternative for persisting my logs in Azure the way I would in S3?
There are plugins that support Fluentd with Azure Blob Storage, specifically append blobs:
the Azure Storage Append Blob output plugin buffers logs in a local file and uploads them to an Azure Storage append blob periodically.
There's a step-by-step guide available here, which is a Microsoft solution; there's also an external plugin with the same capabilities here.
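A minimal match section for the append blob plugin looks roughly like this (the account, key, and container values are placeholders; check the plugin README for the exact parameter set):

    <match **>
      @type azure-storage-append-blob
      azure_storage_account    mystorageaccount
      azure_storage_access_key <access-key>
      azure_container          fluentd-logs
      auto_create_container    true
      path                     logs/
      azure_object_key_format  %{path}%{time_slice}_%{index}.log
      time_slice_format        %Y%m%d-%H
      <buffer tag,time>
        @type file
        path /var/log/fluent/azure-append-blob
        timekey 3600
        timekey_wait 60
      </buffer>
    </match>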
There is an easy solution using a lightweight log-forwarding agent from Datadog called Vector. It is free to use and, for a non-enterprise use case, a good alternative to Fluentd.
I recently set it up to forward logs from Azure AKS to a storage bucket in near real time. Feel free to check out my blog and YouTube video on the same. I hope it helps.

How to replicate MySQL database to Cloud SQL Database

I have read that you can replicate a Cloud SQL database to MySQL. Instead, I want to replicate from a MySQL database (which the business uses to keep inventory) to Cloud SQL, so a web site can show up-to-date inventory levels.
Is it possible to replicate MySQL to Cloud SQL? If so, how do I configure that?
This is something that is not yet possible in Cloud SQL.
I'm using DBSync to do it, and it's working fine.
http://dbconvert.com/mysql.php
The Sync version does what you want.
It works well with App Engine and Cloud SQL. You must authorize external connections first.
This is a rather old question, but it might be worth noting that this now seems possible by configuring External Masters.
The high-level steps are (a sketch of the API calls follows below):
Create a dump of the data from the master and upload the file to a storage bucket.
Create a master instance in Cloud SQL.
Set up a replica of that instance using the external master's IP, username, and password; also provide the dump file location.
Set up additional replicas if needed.
Voilà!
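A rough sketch of steps 2 and 3 with the SQL Admin API Python client (the instance names, IP address, credentials, and dump path are all placeholders; the same can be done with gcloud):

    # pip install google-api-python-client
    from googleapiclient import discovery

    PROJECT = 'my-project'  # placeholder

    sqladmin = discovery.build('sqladmin', 'v1beta4')

    # Step 2: an instance representing the external (source) MySQL master.
    sqladmin.instances().insert(project=PROJECT, body={
        'name': 'external-mysql-master',
        'region': 'us-central1',
        'databaseVersion': 'MYSQL_5_7',
        'onPremisesConfiguration': {'hostPort': '203.0.113.10:3306'},  # external IP:port
    }).execute()

    # Step 3: a Cloud SQL replica of it, seeded from the dump in Cloud Storage.
    sqladmin.instances().insert(project=PROJECT, body={
        'name': 'cloudsql-replica',
        'region': 'us-central1',
        'databaseVersion': 'MYSQL_5_7',
        'masterInstanceName': 'external-mysql-master',
        'settings': {'tier': 'db-n1-standard-1'},
        'replicaConfiguration': {
            'mysqlReplicaConfiguration': {
                'username': 'replication-user',
                'password': 'replication-password',
                'dumpFilePath': 'gs://my-bucket/dump.sql.gz',
            },
        },
    }).execute()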