How to take a backup of an Azure SQL Managed Instance database to Azure Blob Storage - T-SQL

I have a SQL Managed Instance database and I want to take a backup of it in .bak format to blob storage. The command I am currently using is:
CREATE CREDENTIAL [blob url/container name]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = 'Pasted my SAS token generated from the Azure portal blob storage';
GO
BACKUP DATABASE [DB_Name]
TO URL = 'blob url/container name/testing.bak' WITH CHECKSUM;
But with this I am getting an error:
"BACKUP DATABASE failed. SQL Database Managed Instance supports only COPY_ONLY full database backups which are initiated by user."
I have also tried specifying COPY_ONLY instead of CHECKSUM, but then I face another error:
"Msg 41922, Level 16, State 1, Line 6
The backup operation for a database with service-managed transparent data encryption is not supported on SQL Database Managed Instance.
Msg 3013, Level 16, State 1, Line 6
BACKUP DATABASE is terminating abnormally."
Note: my database is approximately 800 GB in size.

To avoid the original error, and if you are comfortable with the increased security risk, you can remove encryption:
ALTER DATABASE [database_name] SET ENCRYPTION OFF;
GO
USE [database_name];
GO
DROP DATABASE ENCRYPTION KEY;

The backup command should be:
USE [master]
BACKUP DATABASE [SQLTestDB]
TO URL = N'https://msftutorialstorage.blob.core.windows.net/sql-backup/sqltestdb_backup_2020_01_01_000001.bak'
WITH COPY_ONLY, CHECKSUM
GO
You could follow this Azure tutorial:
Quickstart: SQL backup and restore to Azure Blob storage service:
It will help you back up the database (.bak) to Blob Storage step by step:
Create credential
Back up database
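For the credential step, here is a minimal sketch using the same placeholder storage account and container as the backup command above; the credential name must match the container URL, and the SAS token should be pasted without the leading '?':
-- Sketch only: the storage account, container and SAS token are placeholders.
-- The credential name must be the container URL, with no trailing slash.
CREATE CREDENTIAL [https://msftutorialstorage.blob.core.windows.net/sql-backup]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = '<SAS token copied from the Azure portal>';
GO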
Hope this helps

The error is related to service-managed TDE encryption: all databases are encrypted by default, and service-managed TDE does not allow COPY_ONLY backups. You need to either disable service-managed TDE or enable TDE with customer-managed keys to take backups.
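To confirm which databases are affected before deciding, a quick check against the standard catalog views (a sketch; run it on the managed instance):
-- Which databases currently have TDE enabled (is_encrypted = 1).
SELECT name, is_encrypted FROM sys.databases;

-- Per-database encryption details; encryptor_type typically reports
-- CERTIFICATE for service-managed keys and ASYMMETRIC KEY for
-- customer-managed (BYOK) keys.
SELECT DB_NAME(database_id) AS database_name, encryption_state, encryptor_type
FROM sys.dm_database_encryption_keys;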
Since your database size is 800 GB, and a backup larger than 200 GB does not fit in a single block blob, split your backup across multiple files. This is a limitation of block blobs.
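A sketch of such a striped backup (storage account, container and file names are placeholders; BACKUP TO URL accepts up to 64 stripes, and COMPRESSION helps keep each stripe under the limit):
-- Stripe the COPY_ONLY backup across several block blobs so that no
-- single file exceeds the block blob size limit.
BACKUP DATABASE [DB_Name]
TO URL = 'https://<storage account>.blob.core.windows.net/<container>/DB_Name_1.bak',
   URL = 'https://<storage account>.blob.core.windows.net/<container>/DB_Name_2.bak',
   URL = 'https://<storage account>.blob.core.windows.net/<container>/DB_Name_3.bak',
   URL = 'https://<storage account>.blob.core.windows.net/<container>/DB_Name_4.bak'
WITH COPY_ONLY, CHECKSUM, COMPRESSION;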

Related

Can we use pg_dumpall for Azure managed flexible Server

We are trying to devise a backup strategy for Azure Flexible Server. Can we use the native pg_dumpall as a backup and restore mechanism? I am able to take a dumpall by excluding the Azure default database (as we are not superusers on the managed server).
pg_dumpall - Azure Database for PostgreSQL - permission denied for database "azure_maintenance"
Can we somehow restore the dump that we create into the managed Postgres Flexible Server? I am unable to find any support in this regard.
You should be able to restore the dump to a Flexible Server. See if any of these docs help. As for the link you shared: yes, we do not allow pg_dumpall, because some of the internal DBs are not exposed to customers due to security.
Upgrade using dump and restore - Azure Database for PostgreSQL | Microsoft Docs
Dump and restore - Azure Database for PostgreSQL - Single Server | Microsoft Docs

Copy data from Postgres DB (GCP Project A) to another Postgres DB (GCP Project B)

I would be happy to get your help/feedback regarding data loading.
Goal:
Load source data from a Postgres database, which is located in GCP project A to another Postgres database, which is located in GCP project B.
Challenge:
Get a connection to the Postgres DB in GCP Project A (I have an IAM account with sufficient rights to run a COPY TO / COPY FROM command) and copy the table either to a CSV file or create a dump that can be used to insert the data into the other Postgres DB in GCP Project B.
How do I connect to the database (e.g. if I create a key, where shall I store the json keyfile and would that approach even be feasible?) with this IAM email account?
Another approach I've researched is psycopg2: I could use cursor.copy_expert (which doesn't need any superuser rights or Postgres user credentials) to copy the data, but I didn't succeed in connecting to the database with psycopg2 due to challenges with the Cloud SQL proxy.
Another idea was to use pg_dump or gcloud sql export csv.
I would be curious whether some of you have faced a similar challenge, how you solved it, and what the best way/practice might be.
You can try the Database Migration Service. You can set up a continuous migration configuration and use Cloud SQL for PostgreSQL.
Hello, after a lot of searching I've come to these solutions:
If you need continuous copying, use the Database Migration Service; check this documentation.
If you need a one-shot copy:
you can restore your instance; see the bottom of this documentation
you can create a bucket and back up your instance to it, then import it from the other project

Azure Database for PostgreSQL server backup before destroy

I currently have an Azure PostgreSQL server which I would like to permanently delete.
For obvious reasons I would like to make a snapshot/soft delete before turning off and deleting. What is the simplest way to accomplish this? I know there is a backup center option but I feel this is rather sophisticated for what I need.
Thanks
You can take a backup of the Azure PostgreSQL server to an Azure Storage account, then download the backup file locally from the storage account. After that you can delete the storage account and the Azure PostgreSQL server.
Below is an article that shows how to take a backup to Blob Storage.
Backup Azure Database for PostgreSQL to a Blob Storage

Backup of Dashdb on Bluemix - what options available

I tried taking a backup of dashDB from Bluemix cloud using Data Studio. I am getting this error: 'Remote target is unreachable.'
Since this is an admin activity, I assume it should be done on the server. As this is a cloud server, I am trying to understand how this can be done!
Are there any tools which support SSH to the server, and how can I take a backup of the DB? Any documentation in this regard?
Thanks
Which plan are you using?
Have you read the details of backup in the FAQ?
This section might help:
Is my data backed up? Encrypted backups on the full Db2 managed service database are done daily. For the Db2 Warehouse on Cloud Flex Performance plan, the last 7 daily backups are retained. For all other Db2 Warehouse on Cloud plans, the last 2 daily backups are retained. For Db2 on Cloud, the last 14 daily backups are retained. In the Db2 Warehouse on Cloud Flex Performance plan, you can restore your database from any of your retained backups at any time that you choose. In the case of all of the other Db2 Warehouse on Cloud plans, the retained backups are used exclusively by IBM for only system recovery purposes in the event of a disaster or system loss. A request to restore your database from a backup is not supported. You can export your data using Db2 tools such as IBM Data Studio or by using the db2 export command.
For Db2 on Cloud, backups can be stored off site in a different data center or region upon request to IBM Support. These backups are also used exclusively by IBM to recover from only disaster or system loss events. A request to restore your database from a backup is not supported.
Are there any tools which support SSH to the server, and how can I take a backup of the DB? Any documentation in this regard?
SSH is not supported as this is a managed service. This is documented in the FAQ:
How do I access the Db2 managed service database?
You can access your Db2 managed service database through the web console from a browser, or you can connect to the database with a client connection such as JDBC, ODBC, CLI, CLP, or CLPPlus. A direct login to the server with Telnet or a Secure Shell (ssh) is not supported.
If you want to take your own backups at the interval of your choice, exporting your data is your best option. That can be done from the web console or from a database client.
Alternatively, if what you're after is access to historical data, you can use time travel query.

Is there any way I can restore a DB2 backup file onto IBM DashDB?

I am trying to restore a DB2 backup file into my BlueMix DashDB service. How do I go about doing this?
You cannot restore your DB2 backup image into dashDB for several reasons.
In an entry-level, shared dashDB instance you only have access to one schema in a physical database shared by others.
Even if you have a dedicated instance, you need 1) access to the database local disk to upload the image and 2) sufficient privileges (at least SYSMAINT authority) to perform the restore. I doubt either will be available to you.
What you can do is run db2look and db2move locally to extract your database DDL statements and data respectively. You can then run the extracted DDL script against dashDB provided you replace the original schema name(s) with the one available to you in dashDB and, after creating the tables, load your data into them.