How can I back up an Azure DB to local disk? - powershell

I have tried using the Export Data-tier Application option, but I get errors in all of the outputs:
Exporting Database
Extracting Schema
Extracting Schema from database
and there are no details about why it failed.
Is there a way I can back it up on Azure and copy the bacpac file, or do it with PowerShell?

You can use SqlPackage to create a backup (export a bacpac) of your Azure SQL Database to your local drive.
sqlpackage.exe /Action:Export /ssn:tcp:sqlftpbackupserver.database.windows.net /sdn:sqlftpbackupdb /su:alberto /tf:c:\sql\sqlftpbackup.bacpac /sp:yourpwd /p:Storage=File
In the above example, we run an Export against an Azure SQL server named sqlftpbackupserver.database.windows.net. The source database is sqlftpbackupdb, the source user is alberto, and the target file we export to is c:\sql\sqlftpbackup.bacpac. /sp specifies the password of the Azure SQL user, and /p:Storage=File tells SqlPackage to use a file rather than memory as backing storage during the export.
Another example is:
SqlPackage /Action:Export /SourceServerName:SampleSQLServer.sample.net,1433
/SourceDatabaseName:SampleDatabase /TargetFile:"F:\Temp\SampleDatabase.bacpac"
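If you want to run SqlPackage from PowerShell, a minimal sketch could look like the one below. The install path, server, database, user, password, and target file are all placeholders you will need to adjust:
& "C:\Program Files\Microsoft SQL Server\160\DAC\bin\SqlPackage.exe" /Action:Export `
    /SourceServerName:"yourserver.database.windows.net" `
    /SourceDatabaseName:"yourdatabase" `
    /SourceUser:"youruser" `
    /SourcePassword:"yourpassword" `
    /TargetFile:"C:\sql\yourdatabase.bacpac" `
    /p:Storage=File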

You can try backing up the database to Blob Storage in the Portal, then download it to your local disk.
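If you would rather script that Portal flow, a rough sketch with the Az PowerShell module (Az.Sql and Az.Storage) might look like this; the resource group, server, database, storage account, and container names are placeholders:
$storageKey = "<storage account key>"
$adminPwd   = ConvertTo-SecureString "yourpassword" -AsPlainText -Force
# Kick off an export of the database to a bacpac in Blob Storage
$export = New-AzSqlDatabaseExport -ResourceGroupName "myrg" -ServerName "myserver" -DatabaseName "mydb" `
    -StorageKeyType "StorageAccessKey" -StorageKey $storageKey `
    -StorageUri "https://mystorageacct.blob.core.windows.net/backups/mydb.bacpac" `
    -AdministratorLogin "sqladmin" -AdministratorLoginPassword $adminPwd
# Check the status (repeat until it reports Succeeded), then download the bacpac locally
Get-AzSqlDatabaseImportExportStatus -OperationStatusLink $export.OperationStatusLink
$ctx = New-AzStorageContext -StorageAccountName "mystorageacct" -StorageAccountKey $storageKey
Get-AzStorageBlobContent -Container "backups" -Blob "mydb.bacpac" -Destination "D:\temp\mydb.bacpac" -Context $ctx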
For PowerShell, here's a command example that backs up a SQL Azure database to local disk only:
C:\PS>Backup-Database -Name "database1" -DownloadLocation "D:\temp" -Server "mydatabaseserver" -UserName "username" -Password "password" -Verbose
For more details, see Backup SQL Azure Database.
Here's a tutorial, How to backup Azure SQL Database to Local Machine, that covers almost all the ways to back up an Azure SQL Database to local disk.
Hope this helps.

Related

Read .sql files with Azure Automation runbook

I am trying to connect my PowerShell runbook to a storage account (blob) to read .sql files and execute them in my Azure SQL Database. The steps are:
Connect to a blob container
Read the script from the .sql file
Execute the script on the database
When I try Invoke-Sqlcmd, it needs a file on disk to read the script from. However, runbooks run serverless, and as far as I know there is no storage I can use for the files.
Is there a way to read the files (without moving them around) from a PowerShell runbook, or can I store the files somewhere the runbook can read them?
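One possible approach, purely as a sketch: Azure Automation sandboxes do expose a small local temp folder, so the runbook can download the .sql blob there and hand it to Invoke-Sqlcmd -InputFile. All account, container, server, and credential names below are placeholders:
# Download the script into the sandbox temp folder, then execute it against the database
$ctx  = New-AzStorageContext -StorageAccountName "mystorageacct" -StorageAccountKey $storageKey
$dest = Join-Path $env:TEMP "script.sql"
Get-AzStorageBlobContent -Container "sqlscripts" -Blob "script.sql" -Destination $dest -Context $ctx -Force
Invoke-Sqlcmd -ServerInstance "myserver.database.windows.net" -Database "mydb" `
    -Username "sqladmin" -Password "yourpassword" -InputFile $dest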

Azure Database for PostgreSQL server backup before destroy

I currently have an Azure PostgreSQL server which I would like to permanently delete.
For obvious reasons I would like to make a snapshot/soft delete before turning it off and deleting it. What is the simplest way to accomplish this? I know there is the Backup Center option, but that feels rather sophisticated for what I need.
Thanks
You can take a backup of the Azure PostgreSQL server to an Azure Storage Account, then download the backup file locally from the storage account. After that you can delete the storage account and the Azure PostgreSQL server.
Below is an article that shows how to take a backup to Blob Storage:
Backup Azure Database for PostgreSQL to a Blob Storage
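The general idea is to dump the database with pg_dump and upload the dump to Blob Storage before deleting the server. A rough sketch, assuming pg_dump and the Az.Storage module are available locally and with all server, database, and storage names as placeholders:
# Dump the database locally with pg_dump
$env:PGPASSWORD = "yourpassword"
$dumpFile = "C:\temp\mypgdb.dump"
& pg_dump --host=myserver.postgres.database.azure.com --port=5432 `
    --username="adminuser@myserver" --dbname="mypgdb" --format=custom --file=$dumpFile
# Upload the dump to a Blob Storage container
$ctx = New-AzStorageContext -StorageAccountName "mystorageacct" -StorageAccountKey $storageKey
Set-AzStorageBlobContent -File $dumpFile -Container "pg-backups" -Blob "mypgdb.dump" -Context $ctx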

Google Cloud SQL Restore BAK file

I am new to Google Cloud. I created a Cloud SQL instance and I need to restore the data from a .bak file. I have the .bak file in a GCS bucket, and I am trying to restore it using Microsoft Management Studio -> Task -> Restore, but I'm not able to access the file.
Can anyone help me with the procedure on how to restore from a .bak file?
You need to give the Cloud SQL service account access to the bucket where the file is saved.
In Cloud Shell, run the following:
gcloud sql instances describe [INSTANCE_NAME]
In the output, look for the field "serviceAccountEmailAddress" and copy the service account email.
Then, again in Cloud Shell, run the following:
gsutil iam ch serviceAccount:<<SERVICE_ACCOUNT_EMAIL>>:legacyBucketWriter gs://<<BUCKET_NAME>>
gsutil iam ch serviceAccount:<<SERVICE_ACCOUNT_EMAIL>>:objectViewer gs://<<BUCKET_NAME>>
That should give the service account permission to access the bucket and retrieve the file. Here is the guide on doing the import; keep in mind that doing the import will overwrite all the data in the database.
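For reference, the import itself can then be started from Cloud Shell with something along these lines (instance, bucket, file, and database names are placeholders):
gcloud sql import bak my-instance gs://my-bucket/SampleDatabase.bak --database=SampleDatabase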
Also remember that:
You cannot import a database that was exported from a higher version or edition of SQL Server. For example, if you exported from SQL Server 2017 Enterprise, you cannot import it into SQL Server 2017 Standard.

How to take a backup of an Azure SQL Managed Instance database to Azure Blob Storage

I have a SQL Managed Instance database and I want to take a backup in .bak format to Blob Storage. The current command I am using is:
CREATE CREDENTIAL [blob url/container name]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE'
, SECRET = 'Pasted my sas token generated from azure portal blob storage'
GO
BACKUP DATABASE [DB_Name]
TO URL = 'blob url/container name/testing.bak' WITH CHECKSUM;
But with this I am getting an error:
"BACKUP DATABASE failed. SQL Database Managed Instance supports only COPY_ONLY full database backups which are initiated by user."
I have also tried COPY_ONLY instead of CHECKSUM, but then I get another error:
"Msg 41922, Level 16, State 1, Line 6
The backup operation for a database with service-managed transparent data encryption is not supported on SQL Database Managed Instance.
Msg 3013, Level 16, State 1, Line 6
BACKUP DATABASE is terminating abnormally."
Note: the database is approximately 800 GB in size.
To prevent the original error, if you are comfortable with the increased security risk, you can remove encryption:
ALTER DATABASE [database_name] SET ENCRYPTION OFF
GO
USE [database_name]
GO
DROP DATABASE ENCRYPTION KEY
The backup command should be:
USE [master]
BACKUP DATABASE [SQLTestDB]
TO URL = N'https://msftutorialstorage.blob.core.windows.net/sql-backup/sqltestdb_backup_2020_01_01_000001.bak'
WITH COPY_ONLY, CHECKSUM
GO
You could follow this Azure tutorial, Quickstart: SQL backup and restore to Azure Blob storage service. It walks through backing up the database (.bak) to Blob Storage step by step:
Create credential
Back up database
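For the first step, the credential on the managed instance is created from a SAS token. A minimal sketch run via PowerShell, where the storage account, container, instance name, login, and SAS token are placeholders (the credential name must match the container URL):
$createCredential = @'
CREATE CREDENTIAL [https://mystorageacct.blob.core.windows.net/sql-backup]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = '<SAS token without the leading ?>';
'@
Invoke-Sqlcmd -ServerInstance "mymanagedinstance.database.windows.net" -Database "master" `
    -Username "sqladmin" -Password "yourpassword" -Query $createCredential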
Hope this helps
The error is related to service-managed TDE: all databases are encrypted by default, and service-managed TDE does not allow taking COPY_ONLY backups. You need to either disable service-managed TDE or enable TDE with customer-managed keys to take backups.
Since your database is around 800 GB and a backup larger than about 200 GB exceeds what a single block blob can hold, split the backup across multiple files; this is a limitation of block blobs.
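As a sketch of what striping the backup can look like (storage account, container, instance, and database names are placeholders), the BACKUP statement simply lists several URLs, and it can be run from PowerShell with Invoke-Sqlcmd:
$stripedBackup = @'
BACKUP DATABASE [DB_Name]
TO URL = 'https://mystorageacct.blob.core.windows.net/sql-backup/DB_Name_1.bak',
   URL = 'https://mystorageacct.blob.core.windows.net/sql-backup/DB_Name_2.bak',
   URL = 'https://mystorageacct.blob.core.windows.net/sql-backup/DB_Name_3.bak',
   URL = 'https://mystorageacct.blob.core.windows.net/sql-backup/DB_Name_4.bak'
WITH COPY_ONLY, CHECKSUM;
'@
# Large backups take a while, so raise the query timeout
Invoke-Sqlcmd -ServerInstance "mymanagedinstance.database.windows.net" -Database "master" `
    -Username "sqladmin" -Password "yourpassword" -Query $stripedBackup -QueryTimeout 65535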

How to use PowerShell to refresh a dev database from backup of prod

We are using DDBoost to backup and restore SQL Server databases.
Now we want to create a script or a job, so developers can kick off the script or the job to refresh their dev databases without asking a DBA.
I know we can't take input in a SQL Server Agent job, so I want to create a script with T-SQL, CLI, or PowerShell that takes inputs like SourceDB and TargetDB and then refreshes a dev database using those parameters.
I know how to take input in PowerShell, so can someone tell me how to do it, either by:
Using PowerShell to restore from DDBoost
Passing the values from PowerShell to T-SQL or CLI
Any other option in SQL Server Management Studio.
I'm going to ignore the part about DDBoost, because I don't know it.
But I do know dbatools.io, a very solid PowerShell module that does exactly what you are looking for. It can restore an entire database with one line of PowerShell, and you can make it as simple or as advanced as you want.
Prerequisites:
A working SQL Server backup file. This can be created from SQL Server Management Studio or dbatools.io.
A SQL Server.
The executing user needs privileges to restore the database.
Install dbatools
Start PowerShell (Run As Administrator) and install the module:
Install-Module dbatools
Backup database
Here's how you can back up a database using dbatools:
Backup-DbaDatabase -SqlInstance servername -Database databasename -BackupDirectory C:\Temp -CompressBackup
Restore database
Here's how you can restore a database using dbatools:
Restore-DbaDatabase -SqlInstance servername -DatabaseName databasename -Path C:\Temp\filename.bak -WithReplace -useDestinationDefaultDirectories -ReplaceDbNameInFile
The restore script instructs SQL Server to overwrite any existing database. It restores the database files into the folders configured for the SQL Server instance and replaces the database name in the physical file names.
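To make that self-service for developers, the two commands can be wrapped in a small parameterized script. A rough sketch, assuming the dev copies live on the same instance and that the instance name and backup path placeholders below are adjusted:
param(
    [Parameter(Mandatory)] [string] $SourceDb,
    [Parameter(Mandatory)] [string] $TargetDb,
    [string] $SqlInstance = "devsqlserver"
)
# Back up the source database, then pipe the backup history straight into the restore
Backup-DbaDatabase -SqlInstance $SqlInstance -Database $SourceDb -Path C:\Temp -CompressBackup |
    Restore-DbaDatabase -SqlInstance $SqlInstance -DatabaseName $TargetDb `
        -WithReplace -UseDestinationDefaultDirectories -ReplaceDbNameInFile -TrustDbBackupHistory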