Export bacpac to an Azure storage account using PowerShell

I'm trying to create a PowerShell script to back up a SQL database on Azure to a storage account, as below:
$exportRequest = New-AzureRmSqlDatabaseExport -ResourceGroupName $ResourceGroupName -ServerName $ServerName `
    -DatabaseName $DatabaseName -StorageKeytype $StorageKeytype -StorageKey $StorageKey -StorageUri $BacpacUri `
    -AdministratorLogin $creds.UserName -AdministratorLoginPassword $creds.Password
This is the document I'm following:
https://learn.microsoft.com/en-us/azure/sql-database/sql-database-export
I assume the following:
$ResourceGroupName - my Azure resource group
$ServerName - database server name
$DatabaseName - database name
**$StorageKeytype - NOT SURE WHAT VALUE SHOULD BE PLACED HERE**
**$StorageKey - I'm hoping this is one of the access keys under the Azure storage account**
$BacpacUri - Azure storage account bacpac URI path
Please advise what parameters need to be passed here.

StorageKey: Specifies the access key for the storage account.
StorageKeyType: Specifies the type of access key for the storage account. The acceptable values for this parameter are:
StorageAccessKey - this value uses a storage account key.
SharedAccessKey - this value uses a Shared Access Signature (SAS) key.
For more details, refer to this link.
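Putting those values together, a call using a storage account key might be sketched like this (all resource names below are placeholders, and the key would come from the Access keys blade of the storage account):

```powershell
# Placeholder values - substitute your own resource names and credentials
$creds = Get-Credential
$exportRequest = New-AzureRmSqlDatabaseExport -ResourceGroupName "MyResourceGroup" -ServerName "my-sql-server" `
    -DatabaseName "MyDatabase" `
    -StorageKeytype "StorageAccessKey" `
    -StorageKey "<storage account access key>" `
    -StorageUri "https://mystorageaccount.blob.core.windows.net/bacpacs/MyDatabase.bacpac" `
    -AdministratorLogin $creds.UserName -AdministratorLoginPassword $creds.Password

# The export runs asynchronously; the returned object can be polled for status
Get-AzureRmSqlDatabaseImportExportStatus -OperationStatusLink $exportRequest.OperationStatusLink
```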

Related

upload file to blob storage with Azure functions in PowerShell using Azure module

The requirement is to store a file in a storage account through an Azure Function in PowerShell using the Az module. Please help.
$todaydate = Get-Date -Format MM-dd-yy
$LogFull = "AzureScan-$todaydate.log"
$LogItem = New-Item -ItemType File -Name $LogFull
" Text to write" | Out-File -FilePath $LogFull -Append
First of all, what you need to figure out is the input of your function and how you're handling it. If you just want to write a file to blob storage every time an HTTP-triggered Azure Function is executed, then that is simple enough.
There are, however, a number of elements that come into play when working with blob storage from Azure Functions that you will need to understand to develop a working solution.
Managed Identities
Azure Functions can be assigned an identity so that you can grant access to the FunctionApp itself rather than having to authenticate as a user. This means you don't have to handle the authentication aspect of your function to access the storage account content; you just need to grant your FunctionApp the relevant permissions to read/write/delete blob or storage content.
There are a number of built-in RBAC roles in Azure AD which you can grant to access storage accounts, blobs, etc.
You can find the documentation on the RBAC permissions for that here: https://learn.microsoft.com/en-us/azure/role-based-access-control/built-in-roles#storage
and the documentation on how to activate a managed identity on your functionApp can be found here: https://learn.microsoft.com/en-us/azure/app-service/overview-managed-identity?tabs=dotnet#add-a-system-assigned-identity
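As a sketch of how the function would then authenticate once the system-assigned identity is enabled and granted a role (the storage account name below is a placeholder):

```powershell
# Sign in as the FunctionApp's system-assigned managed identity - no credentials needed
Connect-AzAccount -Identity

# Build a storage context that uses the signed-in identity instead of an account key.
# This requires the identity to hold a data-plane RBAC role such as "Storage Blob Data Contributor".
$storageContext = New-AzStorageContext -StorageAccountName 'mystorageaccount' -UseConnectedAccount
```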
Storage Account(s)
Programmatically accessing storage account contents depends on the permissions, but you can use the access keys associated with the storage account, which provide access at the storage account level.
You can read about the access keys here: https://learn.microsoft.com/en-us/azure/storage/common/storage-account-keys-manage?tabs=azure-portal#view-account-access-keys
Just remember that least-privilege access should be adopted and if you leak your keys then someone could access your data.
PowerShell Commands
The PowerShell commands required for programmatically accessing storage accounts and writing blob data can be summarised below
# Variables required - fill these out
$storageAccountName = '<Insert Storage Account Name Here>'
$containerName = '<Insert Storage Container Name Here>'
$subscription = '<Insert Subscription Name or Id Here>'
# Set the context to the subscription you want to use.
# If your FunctionApp has access to more than one subscription it will load the first subscription by default,
# so it is a good habit to be explicit about context.
Set-AzContext -Subscription $subscription
# Get the storage account key to authenticate
$storAccKeys = Get-AzStorageAccountKey -ResourceGroupName 'Storage-ResourceGroup' -Name $storageAccountName
$primaryKey = $storAccKeys | Where-Object keyname -eq 'key1' | Select-Object -ExpandProperty value
# Create a storage context which will be used in the subsequent commands
$storageContext = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $primaryKey
# Attempt to create a container in the storage account. Handle the error appropriately.
try {
    New-AzStorageContainer -Name $containerName -Context $storageContext -ErrorAction Stop
}
catch [Microsoft.WindowsAzure.Commands.Storage.Common.ResourceAlreadyExistException] {
    Write-Output ('Container {0} already exists in Storage Account {1}' -f $containerName, $storageAccountName)
    # Throw here if you want it to fail instead.
}
catch {
    throw $_
}
# Upload your file here. This may vary depending on your function input and how you plan to have your FunctionApp work.
Set-AzStorageBlobContent -Container $containerName -File ".\PlanningData" -Blob "Planning2015" -Context $storageContext
You can see the documentation on Set-AzStorageBlobContent for examples on that here:
https://learn.microsoft.com/en-us/powershell/module/az.storage/set-azstorageblobcontent?view=azps-6.2.1#examples
Generally though you will need a file to upload to blob storage and you can't just write directly to a file in blob storage.
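For example, taking the log-file snippet from the question, a sketch would be to write the content to a local temp file first and then push that file up as a blob (this assumes a $containerName and $storageContext built as in the commands above):

```powershell
# Write the log content to a temporary local file first...
$todaydate = Get-Date -Format MM-dd-yy
$logFile = Join-Path $env:TEMP "AzureScan-$todaydate.log"
" Text to write" | Out-File -FilePath $logFile -Append

# ...then upload that local file as a blob
Set-AzStorageBlobContent -Container $containerName -File $logFile -Blob "AzureScan-$todaydate.log" -Context $storageContext
```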
If you need to read more on the Azure Functions side of things then there is the quickstart guide:
https://learn.microsoft.com/en-us/azure/azure-functions/create-first-function-vs-code-powershell
Or the Developer Reference on MS docs is really detailed:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-reference-powershell?tabs=portal

Accessing Azure Storage through keys managed by Azure Key Vault via Powershell

I am having trouble accessing my storage account keys in a managed key vault.
Here is my code:
$secret = Get-AzKeyVaultManagedStorageAccount -VaultName $keyVaultName -Name $storageAccountName
$ctx = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $secret.SecretValueText
It seems that $secret.SecretValueText is empty/null. How do I retrieve the storage account key correctly? This is the error that appears.
New-AzStorageContext : Cannot validate argument on parameter 'StorageAccountKey'. The argument is null or empty. Provide an argument that is not null or empty, and then try the command again.
It will definitely be empty: the output of Get-AzKeyVaultManagedStorageAccount is a PSKeyVaultManagedStorageAccount, which does not have such a property; SecretValueText is a property of PSKeyVaultSecret.
I also don't think you can get the storage account key via Get-AzKeyVaultManagedStorageAccount. If you want to get the storage account key, you can use the commands below; make sure your logged-in user account/service principal has an RBAC role such as Storage Account Key Operator Service Role, Contributor, or Owner on your storage account.
$key = (Get-AzStorageAccountKey -ResourceGroupName <group-name> -Name <storageaccount-name>).Value[0]
New-AzStorageContext -StorageAccountName <storageaccount-name> -StorageAccountKey $key
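Alternatively, if the key has been stored in the vault as an ordinary secret (rather than via a Key Vault managed storage account), a sketch of reading it with Get-AzKeyVaultSecret would be (vault and secret names below are placeholders):

```powershell
# Read the key from a Key Vault secret (requires "get" secret permission on the vault)
$secret = Get-AzKeyVaultSecret -VaultName 'my-key-vault' -Name 'my-storage-key'

# Convert the SecureString value to plain text for New-AzStorageContext
$key = [System.Net.NetworkCredential]::new('', $secret.SecretValue).Password
New-AzStorageContext -StorageAccountName 'mystorageaccount' -StorageAccountKey $key
```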

DevOps - Linked ARM template - Generate blob storage SAS token using PowerShell

I'm trying to deploy linked ARM template using devops.
Instead of hard coding SAS token, I would like to generate SAS token using powershell script but I'm not familiar with using powershell to generate blob SAS token.
Any help with this powershell will be appreciated!
Updated 0512:
If you want to get the account key automatically, you can use the Get-AzStorageAccountKey cmdlet.
The example:
1. Get both key1 and key2 of your storage account:
Get-AzStorageAccountKey -ResourceGroupName "your_resourceGroupName" -Name "your_storageAccountName"
2. Get just key1 of your storage account:
$s = Get-AzStorageAccountKey -ResourceGroupName "your_resourceGroupName" -Name "your_storageAccountName"
$s[0].Value
Original answer:
If you're using the Azure PowerShell Az module, then you can use the New-AzStorageBlobSASToken cmdlet.
Sample code:
$accountName="xxx"
$accountKey="xxxx"
$context=New-AzStorageContext -StorageAccountName $accountName -StorageAccountKey $accountKey
New-AzStorageBlobSASToken -Container "ContainerName" -Blob "BlobName" -Permission rwd -Context $context
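For a linked-template scenario you typically need a SAS for the whole container holding the templates, with an expiry, rather than for a single blob. A sketch using New-AzStorageContainerSASToken (account and container names are placeholders):

```powershell
# Build a context from the account name and key retrieved above
$context = New-AzStorageContext -StorageAccountName $accountName -StorageAccountKey $accountKey

# Read-only SAS for the container holding the linked templates, valid for 4 hours
$sasToken = New-AzStorageContainerSASToken -Name "templates" -Permission r `
    -ExpiryTime (Get-Date).AddHours(4) -Context $context
```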

What format does the Shared Access Signature (SAS) key have in Powershell cmdlet New-AzureRmSqlDatabaseExport?

I'm trying to export an Azure SQL database to an Azure storage account with the PowerShell cmdlet New-AzureRmSqlDatabaseExport, but I can't seem to figure out how to use the -StorageKeyType "SharedAccessKey" option, where one is supposed to enter a Shared Access Signature (SAS) key for the -StorageKey parameter. It is not the SAS token, is it? Is it part of this token, or how does one find the correct key format?
Turns out it was the SAS token returned by New-AzureStorageContainerSASToken after all, and the format should include the leading "?".
https://learn.microsoft.com/en-us/dotnet/api/microsoft.azure.management.sql.models.exportrequest?view=azure-dotnet
-StorageKeyType "SharedAccessKey"
-StorageKey "?xxxxxxxxxxxxxxxx"
Below an example of how to use the StorageKeyType:
New-AzureRmSqlDatabaseExport -ServerName "xxxxx" -AuthenticationType Sql `
    -AdministratorLogin "xxx#xxxxx" -DatabaseName "xxxxx" `
    -StorageUri "xxxxxxx.blob.core.windows.net/xxxxx" `
    -StorageKey "xxxxxxxxxxxxxxxxxxxxxxxxxx/xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx==" `
    -ResourceGroupName "Default-SQL-SoutheastAsia" -StorageKeytype "StorageAccessKey"
On the ServerName do not use the fully qualified name if you receive any errors.
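Put together, a SharedAccessKey-based export might be sketched like this (all names are placeholders; note that the token returned by New-AzureStorageContainerSASToken already starts with the required "?"):

```powershell
# Generate a SAS token for the container that will hold the bacpac
$ctx = New-AzureStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey "<account key>"
$sasToken = New-AzureStorageContainerSASToken -Name "bacpacs" -Permission rwl -Context $ctx

# Pass the token (including its leading "?") as the StorageKey
New-AzureRmSqlDatabaseExport -ResourceGroupName "MyResourceGroup" -ServerName "my-sql-server" `
    -DatabaseName "MyDatabase" -StorageKeytype "SharedAccessKey" -StorageKey $sasToken `
    -StorageUri "https://mystorageaccount.blob.core.windows.net/bacpacs/MyDatabase.bacpac" `
    -AdministratorLogin "sqladmin" -AdministratorLoginPassword (Read-Host -AsSecureString "Password")
```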

Azure PowerShell Get-AzureSqlDatabaseServiceObjective returns null when specifying certain Premium performance levels

I've created an Azure PowerShell script in a Runbook within the Azure Automation portal in order to automatically scale the database performance level depending on what time it is.
I can successfully retrieve a service objective via "Get-AzureSqlDatabaseServiceObjective" when I want to scale down to a "P1" or "P2" performance level; however, when I want to scale up to "P6" or "P11", I am unable to do so with the same exact block of code:
$Edition = "Premium"
$PerfLevel = "P6"
$ServerCredential = New-Object System.Management.Automation.PSCredential($Credential.UserName, (($Credential).GetNetworkCredential().Password | ConvertTo-SecureString -AsPlainText -Force))
$CTX = New-AzureSqlDatabaseServerContext -ManageUrl "https://$ServerName.database.windows.net" -Credential $ServerCredential
$ServiceObjective = Get-AzureSqlDatabaseServiceObjective $CTX -ServiceObjectiveName $PerfLevel
Set-AzureSqlDatabase $CTX -DatabaseName $DatabaseName -ServiceObjective $ServiceObjective -Edition $Edition -Force
When I specify "P6" as the "ServiceObjectiveName" this cmdlet returns null; however, when I specify "P1" or P2" the cmdlet returns the correct ServiceObjective object, and the code will execute properly.
The MSDN documentation for "Get-AzureSqlDatabaseServiceObjective" only shows "P1, P2, P3" as valid Premium values; however, there has to be a way to scale the database to these higher performance levels (I can specify "P3" as a parameter in this script and it will actually change the database performance level to P3, even though you can't select this performance level manually through the Azure Portal anymore).
Can anyone give advice or maybe another method to achieve scaling up to these higher performance levels via a PowerShell script? I've done hours of research on here and elsewhere and I can't find a solution to this or any other post with a similar problem that was resolved.
The Azure Resource Manager cmdlet for SQL Server, Set-AzureRmSqlDatabase, can be used to upscale a database to any desired edition/performance level.
We have to connect to our Azure subscription and acquire the database instance to upscale it. We can create a runbook to schedule the upgrade/downgrade.
# Read the subscription credentials. The AzureRunAsConnection asset is created as part of the Automation account setup (see link below).
$Conn = Get-AutomationConnection -Name AzureRunAsConnection
# Connect to the subscription (Uninteractive login).
Add-AzureRMAccount -ServicePrincipal -Tenant $Conn.TenantID `
-ApplicationId $Conn.ApplicationID `
-CertificateThumbprint $Conn.CertificateThumbprint
# Set the new performance tier of db.
Set-AzureRmSqlDatabase -DatabaseName $Using:DatabaseName `
-ServerName $Using:SqlServerName `
-ResourceGroupName $Using:ResourceGroupName `
-Edition $Using:Edition `
-RequestedServiceObjectiveName $Using:PerfLevel
Read Authentication runbook with AzureRunAsAccount for details on authentication using runbook connection asset.
NOTE: the backtick (`) is used to break the cmdlet across multiple lines.
Here is my upgrade and downgrade example for Azure DB and Azure DWH: http://microsoft-bitools.blogspot.com/2017/04/scheduled-upgrade-and-downgrade-azure.html