Setting Azure KeyVault Variable in DevOps Pipeline - powershell

Hi, I'm trying to create a file and upload it to a storage account in Azure. I can create the file and send it to the storage account if I hard-code the values, such as the storage account name, access key and container; however, I'd like to store them in Azure Key Vault and reference them from there.
I have an ARM template deployment in the same pipeline that takes my admin username and password, and it works perfectly using "$(adminUser)" and "$(adminPass)".
The code for the storage account is below:
$StorageAccountName = "$env:storageaccountname"
$StorageAccountKey = "$env:secretstoragekey"
$ctx = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey
$ContainerName = "$env:vhdcontainer"
Set-AzStorageBlobContent -file ./$Filename -Container $ContainerName -Blob $Filename -Context $ctx -force
I have tried using $(StorageAccountName), $(secretstoragekey) and $(vhdContainer), but no joy. Can anyone see where I'm going wrong and point me in the right direction, please?
Thanks in advance :)
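In case it helps: Azure DevOps does not map secret variables (which includes Key Vault-linked variables) to environment variables automatically, so $env:secretstoragekey will stay empty unless the secret is explicitly mapped on the task. An alternative worth trying is reading the secrets straight from Key Vault inside the script. A minimal sketch, assuming the Az.KeyVault module is available, the pipeline's service connection has get/list permissions on the vault, and the vault name below is a placeholder:
# Hypothetical vault name - replace with your own
$vaultName = "my-keyvault"
$secret = Get-AzKeyVaultSecret -VaultName $vaultName -Name "secretstoragekey"
# Convert the returned SecureString to plain text (on Az.KeyVault 2.x+ you can pass -AsPlainText instead)
$StorageAccountKey = [System.Net.NetworkCredential]::new('', $secret.SecretValue).Password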

Related

Migrate local bash script with Azure CLI commands to Azure PowerShell task in Azure DevOps

I currently have a bash script that runs some az commands to clean a storage account container and upload the contents of a directory to it:
az storage blob delete-batch --account-name $ACCOUNT_NAME --source $web
az storage blob upload-batch --account-name $ACCOUNT_NAME -s $SOURCE_PATH -d $web
I would like to reuse that functionality inside an Azure PowerShell task that runs on Azure DevOps Services, because I have a lot of other stuff going on in that script besides the storage cleaning and upload.
What's the best way to migrate this? I've been looking in the Azure PowerShell module documentation, but I can't find a proper equivalent to blob delete-batch and blob upload-batch.
I also thought of calling the az command directly, but for that I would have to log in, so I would need a way to pass the service principal details from the Azure PowerShell task into the az login command before executing those lines.
Any ideas are welcome. Thanks in advance.
Use Azure PowerShell to log in to Azure with a service principal.
You can use the following script:
$appId = "your application id"
$password = "your application secret"
$secpasswd = ConvertTo-SecureString $password -AsPlainText -Force
$mycreds = New-Object System.Management.Automation.PSCredential ($appId, $secpasswd)
Add-AzureRmAccount -Credential $mycreds -Tenant "your tenant id" -ServicePrincipal
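For reference, the equivalent login with the newer Az module would be along these lines (a sketch; Connect-AzAccount is the Az-module counterpart of Add-AzureRmAccount):
$appId = "your application id"
$secpasswd = ConvertTo-SecureString "your application secret" -AsPlainText -Force
$mycreds = New-Object System.Management.Automation.PSCredential ($appId, $secpasswd)
# Log in as the service principal against the given tenant
Connect-AzAccount -ServicePrincipal -Credential $mycreds -Tenant "your tenant id"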
I can use Azure CLI command "az storage blob upload-batch" to upload a local directory to Azure storage. How to implement it with Azure PowerShell?
Azure PowerShell does not provide a command like that. It only provides Set-AzureStorageBlobContent, which uploads a single file to Azure Storage, so you need to write a script around that command to upload a whole directory. For example:
$StorageAccountKey = " "
$sourceFileRootDirectory = " "
$StorageAccountName = " "
$ContainerName = " "
$ctx = New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey
$container = Get-AzureStorageContainer -Name $ContainerName -Context $ctx
if ($container) {
    $filesToUpload = Get-ChildItem $sourceFileRootDirectory -Recurse -File
    foreach ($x in $filesToUpload) {
        # The blob name is the file path relative to the root directory, with "/" separators
        $blobName = ($x.FullName.Substring($sourceFileRootDirectory.Length + 1)).Replace("\", "/")
        Set-AzureStorageBlobContent -File $x.FullName -Container $container.Name -Blob $blobName -Context $ctx -Force
    }
}
I can use Azure CLI command "az storage blob delete-batch" to clean up a container. How to implement it with Azure PowerShell?
Azure PowerShell does not provide a command that directly deletes all blobs in a container, so we need to write a script for that too. There are two choices:
1. Delete the container and create a new container with the same name
$StorageAccountKey=" "
$StorageAccountName=" "
$ContainerName=" "
$context = New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey
Remove-AzureStorageContainer -Name $ContainerName -Context $context
New-AzureStorageContainer -Name $ContainerName -Context $context
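One caveat (an assumption based on how the service behaves): container deletion is asynchronous, so the immediate re-create can fail with a conflict until the delete finishes. A minimal retry sketch around the create, with an arbitrary 10-second interval:
# Retry the create until the pending delete has completed
$created = $false
do {
    try {
        New-AzureStorageContainer -Name $ContainerName -Context $context -ErrorAction Stop | Out-Null
        $created = $true
    } catch {
        Start-Sleep -Seconds 10
    }
} until ($created)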
2. Get all blobs in the container, then delete them
$StorageAccountKey = " "
$StorageAccountName = " "
$ContainerName = " "
$Token = $null
$MaxCount = 5000
$context = New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey
do
{
    # List blobs one page at a time, carrying the continuation token between pages
    $Blobs = Get-AzureStorageBlob -Container $ContainerName -MaxCount $MaxCount -ContinuationToken $Token -Context $context
    if ($Blobs.Length -le 0) { break }
    $Token = $Blobs[$Blobs.Count - 1].ContinuationToken
    foreach ($blob in $Blobs) {
        Remove-AzureStorageBlob -Blob $blob.Name -Container $ContainerName -Context $context
    }
}
while ($Token -ne $null)
Why not use the Azure CLI task in Azure DevOps that uses a Service connection for the authentication part? See the documentation here.

Azure Runbook - Get a file from Azure File System Storage

I am creating an Azure workflow runbook in which I have to get a file from Azure File Storage and publish it to an Azure web app.
I tried New-PSDrive, but that command is not supported in runbooks (even inside InlineScript it doesn't work). Could anyone help me with the script? In the code below I need to populate the file path from the Azure file share.
$Conn = Get-AutomationConnection -Name AzureRunAsConnection
Connect-AzureRmAccount -ServicePrincipal -Tenant $Conn.TenantID `
-ApplicationId $Conn.ApplicationID `
-CertificateThumbprint $Conn.CertificateThumbprint
$zipFilePath = ???
Publish-AzureWebsiteProject -Name $siteName -Package $zipFilePath
I searched a lot but couldn't find much information on this.
Are you referring to a file in an Azure Storage account? If so, that is pretty easy to accomplish. Add the following to your runbook, filling in the required information:
$StorageAccountKey = Get-AutomationVariable -Name 'storageKey'
$Context = New-AzureStorageContext -StorageAccountName 'your-storage' `
-StorageAccountKey $StorageAccountKey
Get-AzureStorageFileContent -ShareName 'your-share' -Context $Context `
-path 'your-file' -Destination 'C:\Temp'
$filePath = Join-Path -Path 'C:\Temp' -ChildPath 'your-file'
You also need to create a variable in your Automation Account, called "storageKey", containing your storage account's key.
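If you'd rather create that variable from PowerShell than the portal, a sketch (assuming the AzureRM.Automation module; resource names are placeholders):
# Create an encrypted Automation variable holding the storage key
New-AzureRmAutomationVariable -ResourceGroupName "my-rg" `
    -AutomationAccountName "my-automation" `
    -Name "storageKey" `
    -Value "your-storage-key" `
    -Encrypted $true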
Mounting an Azure file share as a drive is not currently supported in Automation cloud jobs, though it will probably be supported in a few months. In the meantime, use the Get-AzureStorageFile command from the Azure.Storage module to retrieve the file to a temp folder.
Alternatively, run this job on a Hybrid worker. In this case, make sure all the prerequisites are met in order to mount the share as a network drive.
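On a Hybrid worker the share can be mounted like any other SMB share. A sketch, assuming outbound port 445 is open and the names below are placeholders:
# Mount the Azure file share as a drive, authenticating with the storage account name/key
$acctName = "your-storage"
$acctKey = ConvertTo-SecureString "your-storage-key" -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential ("Azure\$acctName", $acctKey)
New-PSDrive -Name Z -PSProvider FileSystem -Root "\\$acctName.file.core.windows.net\your-share" -Credential $cred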

How to copy captured images in Azure from one subscription to another

I am trying to copy a captured Azure image from one subscription to another. In the portal, move support is "coming soon", so I am trying to copy it instead and am searching for a PowerShell script. I don't want to create a VM from the image; I am trying to copy it without creating a VM. The image is backed by a managed disk, and through PowerShell I can copy a managed disk from one subscription to another by creating a VM from it, but I am trying to copy or move the captured image without creating a VM. Is this possible with PowerShell? Does anyone have an idea about this?
I am trying to copy or move a captured image. Is this possible with PowerShell? Does anyone have an idea about this?
No, it is not possible. An image cannot be copied from one subscription to another; you need to copy the image's managed disk to the other subscription instead.
You have two options.
1. Use the image's managed disk to create a snapshot, copy the snapshot to the other subscription, create a managed disk from it there, and then create an image.
# Create a snapshot from the image's managed disk
# ($region, $imageName, $resourceGroupName and $targetSubscriptionId are assumed to be set already)
$disk = "/subscriptions/************/resourceGroups/SHUICLI/providers/Microsoft.Compute/disks/shui_OsDisk_1_21af43450987448184b5e9793da08e54"
$snapshot = New-AzureRmSnapshotConfig -SourceResourceId $disk -CreateOption Copy -Location $region
$snapshotName = $imageName + "-" + $region + "-snap"
New-AzureRmSnapshot -ResourceGroupName $resourceGroupName -Snapshot $snapshot -SnapshotName $snapshotName
# Copy the snapshot to another subscription, same region
$snap = Get-AzureRmSnapshot -ResourceGroupName $resourceGroupName -SnapshotName $snapshotName
# Change to the target subscription
Select-AzureRmSubscription -SubscriptionId $targetSubscriptionId
$snapshotConfig = New-AzureRmSnapshotConfig -OsType Windows `
    -Location $region `
    -CreateOption Copy `
    -SourceResourceId $snap.Id
$snap = New-AzureRmSnapshot -ResourceGroupName $resourceGroupName `
    -SnapshotName $snapshotName `
    -Snapshot $snapshotConfig
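The snippet stops after copying the snapshot; a sketch of the remaining steps in the target subscription (the disk name is a placeholder):
# Create a managed disk from the copied snapshot, then build an image from it
$diskConfig = New-AzureRmDiskConfig -Location $region -CreateOption Copy -SourceResourceId $snap.Id
$disk = New-AzureRmDisk -ResourceGroupName $resourceGroupName -DiskName "copied-image-disk" -Disk $diskConfig
$imageConfig = New-AzureRmImageConfig -Location $region
$imageConfig = Set-AzureRmImageOsDisk -Image $imageConfig -OsType Windows -OsState Generalized -ManagedDiskId $disk.Id
New-AzureRmImage -ImageName $imageName -ResourceGroupName $resourceGroupName -Image $imageConfig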
For more information about this, please refer to this blog.
2. Copy the image's managed disk to a storage account as a VHD, then use that VHD to create a new image.
# Create a SAS to read the image's managed disk
$sas = Grant-AzureRmDiskAccess -ResourceGroupName shui -DiskName shuitest -DurationInSecond 3600 -Access Read
$destContext = New-AzureStorageContext -StorageAccountName contosostorageav1 -StorageAccountKey 'YourStorageAccountKey'
Start-AzureStorageBlobCopy -AbsoluteUri $sas.AccessSAS -DestContainer 'vhds' -DestContext $destContext -DestBlob 'MyDestinationBlobName.vhd'
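Start-AzureStorageBlobCopy is asynchronous, so wait for the copy to finish before building the image from the VHD. A sketch, reusing the placeholder names above and assuming a generalized Windows OS disk:
# Block until the server-side copy completes
Get-AzureStorageBlobCopyState -Container 'vhds' -Blob 'MyDestinationBlobName.vhd' -Context $destContext -WaitForComplete
# Create an image from the copied VHD
$imageConfig = New-AzureRmImageConfig -Location $region
$imageConfig = Set-AzureRmImageOsDisk -Image $imageConfig -OsType Windows -OsState Generalized -BlobUri "https://contosostorageav1.blob.core.windows.net/vhds/MyDestinationBlobName.vhd"
New-AzureRmImage -ImageName $imageName -ResourceGroupName $resourceGroupName -Image $imageConfig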
See this answer.

Running powershell script in Webjob

I am running the PowerShell script below in an Azure WebJob to connect to a storage account and upload a file:
Write-Output "Getting Azure storage context..."
$storageContext = New-AzureStorageContext -StorageAccountName "awemigcitest" -StorageAccountKey "xx+74Z81YJf373p88Emp2jKidMZ8b4/+UTLJT4Rvgqrc8IedRxkg=="
$ProgressPreference="SilentlyContinue"
Set-AzureStorageBlobContent -Blob $azureBlobStorageFileName -File $tempFilename -Container $MigrationReportsContainerName -Context $storageContext -Force
Write-Output "Copied HTML file to Azure blob storage."
But I'm getting the error below. Any idea on this?
New-AzureStorageContext : The Win32 internal error "The handle is invalid" 0x6
[09/30/2017 06:41:20 > 4db5e9: ERR ] occurred while setting character attributes for the console output buffer.
The commands you're using seem OK; they work for me both locally and in an Azure WebJob. Please check whether your script works locally. Also, if possible, try creating a new WebJob to run your script and check whether it works there.
Write-Output "Getting Azure storage context..."
$storageContext = New-AzureStorageContext -StorageAccountName "{account_name}" -StorageAccountKey "{account_key}"
$ProgressPreference="SilentlyContinue"
Set-AzureStorageBlobContent -Blob 'source.txt' -File 'D:\home\data\jobs\continuous\FileIn.txt' -Container 'mycontainer' -Context $storageContext -Force
Write-Output "Copied HTML file to Azure blob storage."
WebJob Logs
Note: I use Kudu Console to access site folder and create FileIn.txt.
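One more thing worth checking (an assumption drawn from the error text): "setting character attributes for the console output buffer" points at progress rendering, and in your script $ProgressPreference is only set after New-AzureStorageContext has already run. Moving it to the very first line may avoid the error:
# Suppress progress rendering before any Azure cmdlet runs
$ProgressPreference = "SilentlyContinue"
$storageContext = New-AzureStorageContext -StorageAccountName "{account_name}" -StorageAccountKey "{account_key}"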

Working with Azure Development Storage from command-line

I need to upload some files to my Azure storage emulator using scripts. The same task for remote Azure storage is performed easily with the Azure PowerShell cmdlets; you just call:
Add-Blob -BlobType Block -FilePath $myFilePath -ContainerName $myContainerName
But how can I do the same thing for local storage emulator?
For those looking for how to do this with Azure SDK (2.1), here is how:
$StorageContext = New-AzureStorageContext -Local
Set-AzureStorageBlobContent -File $SourceFilePath `
    -Container $DestinationContainerName `
    -Blob $DestinationBlobName `
    -Context $StorageContext
If you want to actually upload to an Azure storage account, change the $StorageContext:
New-AzureStorageContext -StorageAccountName $StorageAccountName `
    -StorageAccountKey $StorageAccountKey
You could use the Azure Command Line Tools, available here:
https://github.com/RobBlackwell/AzureCommandLineTools
They run in the normal command prompt; they're not actually PowerShell cmdlets.
SET AZURE_CONNECTION_STRING=UseDevelopmentStorage=true
PutBlob filename [containername[/blobname]]
I found the solution using the PowerShell cmdlets.
You need to pass the -UseDevelopmentStorage option to the cmdlets:
Get-Container -UseDevelopmentStorage
or
Add-Blob -UseDevelopmentStorage -BlobType Block -FilePath $myFilePath -ContainerName $myContainerName