Publish Test Results on Azure from Azure Storage account - azure-devops

I have a container that creates a report.xml file that I wish to use to create a test report in Azure.
I was thinking of doing this during the pipeline, but I am not sure how to get hold of this file during the Azure pipeline.
What is the way to download a file from a storage account and use it for a Test Result during the pipeline?

If I understood correctly, you want to copy a file from blob storage in the pipeline?
For that you can use the Azure CLI task with this command:
az storage blob download
But can you explain what you mean by "use it on a Test Result during the pipeline"?
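For reference, a minimal sketch of what that download could look like as the task's inline script (the account name, container name and destination here are placeholders, and --auth-mode login assumes the service connection identity has a data-plane role such as Storage Blob Data Reader):
az storage blob download `
  --account-name mystorageaccount `
  --container-name reports `
  --name report.xml `
  --file "$(System.DefaultWorkingDirectory)/report.xml" `
  --auth-mode login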

Try using Azure PowerShell to get the file from Azure Blob Storage:
$storageAccount = Get-AzStorageAccount -ResourceGroupName xxx -Name yyy
$ctx = $storageAccount.Context
# download the blob into the pipeline's default working directory
Get-AzStorageBlobContent -Blob "report.xml" `
    -Container $containerName `
    -Destination "$(System.DefaultWorkingDirectory)" `
    -Context $ctx
According to this question: How to copy a file from Azure Storage fileshare to Release pipeline agent
It seems you were trying to download the file from an Azure file share (not a blob), so please simply refer to my reply in that link.

After some digging, this is the solution.
What I have done is mount a storage volume onto the container instance that generates the report.
Afterwards, in the pipeline, I copied the files stored in the storage account to the pipeline agent:
$storageAcct = Get-AzStorageAccount -ResourceGroupName XXX -Name YYY
Get-AzStorageFileContent -Context $storageAcct.Context -ShareName "Sharename" -Path "report.xml" -Destination "$(System.DefaultWorkingDirectory)"
Then it's just a matter of picking those files up from the agent when using the test reporter during the release pipeline (see the sketch below).
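For the reporting step, a sketch of what the publish task could look like in YAML, assuming the report is in JUnit format and was downloaded to the default working directory (adjust testResultsFormat to whatever your container actually emits):
- task: PublishTestResults@2
  inputs:
    testResultsFormat: 'JUnit'
    testResultsFiles: 'report.xml'
    searchFolder: '$(System.DefaultWorkingDirectory)'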


how to delete subfolder in ADLS gen2 via azure databricks

I need to delete an ADLS Gen2 subfolder with a specific name using Databricks dbutils, but I am not able to perform wildcard recursion.
e.g.
adlsstorageaccount/container1/folder2/folder3/folderA/abc/*.parquet
adlsstorageaccount/container1/folder2/folder3/folderB/abc/*.parquet
adlsstorageaccount/container1/folder2/folder3/folderC/abc/*.parquet
I need to delete only the subfolder named "abc" and its contents; the rest of the path is dynamic.
We have reproduced the same folder structure in our environment, and here are the commands that worked:
$fileSystemName = "<Your Container Name>"
$dirname = "folder2/folder3/folderA/abc/"
$ctx = New-AzStorageContext -StorageAccountName '<Your Storage Account>' -StorageAccountKey '<Your Access Key>'
Remove-AzDataLakeGen2Item -Context $ctx -FileSystem $fileSystemName -Path $dirname
and then confirm the prompt with Y ('yes').
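Since the path to abc is dynamic, one possible extension of the same approach (a sketch only, assuming the same $ctx and $fileSystemName as above) is to enumerate the directories recursively and remove every one named abc:
# list everything in the file system recursively, keep only directories named "abc",
# then delete each of them together with its contents (-Force skips the confirmation prompt)
Get-AzDataLakeGen2ChildItem -Context $ctx -FileSystem $fileSystemName -Recurse |
    Where-Object { $_.IsDirectory -and ($_.Path -split '/')[-1] -eq 'abc' } |
    ForEach-Object { Remove-AzDataLakeGen2Item -Context $ctx -FileSystem $fileSystemName -Path $_.Path -Force }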
REFERENCES:
https://learn.microsoft.com/en-us/azure/storage/blobs/data-lake-storage-directory-file-acl-powershell#delete-a-directory
You can also just use the %sh magic command in Databricks and remove it with a wildcard using standard Linux commands, something like:
%sh
# -r removes the matching abc folders together with everything inside them
rm -r /dbfs/mnt/adlsstorageaccount/container1/folder2/folder3/folderA/*abc*

How to get the command line for AZCopy?

I want to send dump files to a storage container and for the copy to work we need to obtain a SAS key for the container we’re copying to.
When you use Azure Storage Explorer you can copy a file to a container and then copy the command it used to the clipboard which looks something like this:
$env:AZCOPY_CRED_TYPE = "Anonymous";
./azcopy.exe copy "C:\temp\test.txt" "https://dbbackups.blob.core.windows.net/memorydumps/test.txt?{SAS-TOKEN}" --overwrite=prompt --from-to=LocalBlob --blob-type Detect --follow-symlinks --put-md5 --follow-symlinks --recursive;
$env:AZCOPY_CRED_TYPE = "";
I copied this from AZ Storage Explorer when copying a file called test.txt from c:\temp to the memorydumps container in a Storage Account.
What I would need help with is creating a PowerShell script that generates the above command line so I can run it on azcopy-less nodes and have the dumps end up in the storage container. Any ideas?
You could use the Azure PowerShell equivalent to upload blobs to your container. The Set-AzStorageBlobContent cmdlet uploads a local file to an Azure Storage blob:
# assumes $containerName is set and $ctx holds an authenticated storage context
Set-AzStorageBlobContent -File "C:\Temp\test.txt" `
    -Container $containerName `
    -Blob "test.txt" `
    -Context $ctx
Refer to this blog post for a detailed walkthrough: File Transfers to Azure Blob Storage Using Azure PowerShell
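If you specifically want to reproduce the AzCopy command line instead, a sketch along these lines could work (it assumes the Az.Storage module is available, that $accountKey holds the storage account key, and that a one-hour write-only SAS is acceptable):
# build a context for the dbbackups account and request a write-only SAS for the memorydumps container
$ctx = New-AzStorageContext -StorageAccountName "dbbackups" -StorageAccountKey $accountKey
$sas = New-AzStorageContainerSASToken -Name "memorydumps" -Permission "w" `
    -ExpiryTime (Get-Date).AddHours(1) -Context $ctx
# compose the same command line Storage Explorer shows and run it
$dest = "https://dbbackups.blob.core.windows.net/memorydumps/test.txt?" + $sas.TrimStart('?')
./azcopy.exe copy "C:\temp\test.txt" $dest --overwrite=prompt --from-to=LocalBlob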

Generate Azure Storage Account SAS Key using PowerShell

I am trying to generate an Azure storage account shared access key so that I can use it with azcopy to retrieve files from all containers in my storage account.
I have generated a key successfully using the Azure Portal and proven that it works with azcopy,
but I am struggling to generate an equivalent key using PowerShell that works.
PowerShell Query
az storage container generate-sas --account-name $SaName --account-key $accountKey --permissions 'rl' --start $start --expiry $expiry --name $SaName --https-only --output tsv
Azure Portal (GUI) Result
sv=2019-12-12
&ss=b
&srt=sco
&sp=rl
&se=2021-02-08T17:40:26Z
&st=2021-02-08T09:40:26Z
&spr=https
&sig=REDACTED
PowerShell Result
st=2021-02-08T17%3A17%3A47Z
&se=2021-02-08T17%3A47%3A47Z
&sp=rl
&spr=https
&sv=2018-11-09
&sr=c
&sig=REDACTED
I guess the first problem is that I have not found a way to add the missing ss=b and srt=sco (not sr); those parameters don't seem to be available. Perhaps if they were there, the sig would have the correct hash.
I have tried this in Azure Cloud Shell as well as on my own machine with az 1.12.1.
The command az storage container generate-sas is not a PowerShell command; it's an Azure CLI command.
In the Azure Portal you're generating an account-level SAS token, but with az storage container generate-sas you're actually generating a container-level SAS token.
To generate an account-level SAS token with the Azure CLI, you should use this command: az storage account generate-sas.
A sample is below:
az storage account generate-sas --account-key "xxxxx" --account-name your_storage_account_name --expiry 2020-02-10 --https-only --permissions rl --resource-types sco --services b
In the result, the ss=b and srt=sco parameters are now generated.
If you want to use PowerShell to generate an account-level SAS token, use the New-AzStorageAccountSASToken cmdlet. A sample is below (you can add other parameters as per your need):
$account_name = "yy1"
$account_key = "xxxxxxx"
$context = New-AzStorageContext -StorageAccountName $account_name -StorageAccountKey $account_key
# you can also add other parameters as per your need, like -StartTime, -ExpiryTime, etc.
New-AzStorageAccountSASToken -Service Blob -ResourceType Service,Container,Object -Permission rl -Context $context
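To then actually use the token with azcopy for downloads, a short sketch (assuming the same $context as above; the container name and local path are placeholders, and the eight-hour expiry is arbitrary):
$sas = New-AzStorageAccountSASToken -Service Blob -ResourceType Service,Container,Object `
    -Permission rl -ExpiryTime (Get-Date).AddHours(8) -Context $context
# append the account SAS to the container URL and let azcopy pull everything down
azcopy copy ("https://$account_name.blob.core.windows.net/mycontainer?" + $sas.TrimStart('?')) "C:\dumps" --recursive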

DevOps - Linked ARM template - Generate blob storage SAS token using PowerShell

I'm trying to deploy a linked ARM template using DevOps.
Instead of hard-coding the SAS token, I would like to generate it using a PowerShell script, but I'm not familiar with using PowerShell to generate a blob SAS token.
Any help with this PowerShell will be appreciated!
Updated 05/12:
If you want to get the account key automatically, you can use the Get-AzStorageAccountKey cmdlet.
The example:
1. Get both key1 and key2 of your storage account:
Get-AzStorageAccountKey -ResourceGroupName "your_resourceGroupName" -Name "your_storageAccountName"
2. Get key1 of your storage account:
$s=Get-AzStorageAccountKey -ResourceGroupName "your_resourceGroupName" -Name "your_storageAccountName"
$s[0].Value
Original answer:
If you're using the Azure PowerShell Az module, you can use the New-AzStorageBlobSASToken cmdlet.
Sample code:
$accountName="xxx"
$accountKey="xxxx"
$context=New-AzStorageContext -StorageAccountName $accountName -StorageAccountKey $accountKey
New-AzStorageBlobSASToken -Container "ContainerName" -Blob "BlobName" -Permission rwd -Context $context
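To avoid hard-coding anything in the pipeline, one option (a sketch, assuming an Azure PowerShell task; the resource names and the sasToken variable name are placeholders I chose) is to generate the token in the task and expose it as a secret pipeline variable that the ARM deployment step can consume:
# look up the account key, build a context and create a read-only SAS for the linked-template container
$key = (Get-AzStorageAccountKey -ResourceGroupName "my-rg" -Name "mystorageaccount")[0].Value
$ctx = New-AzStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey $key
$sas = New-AzStorageContainerSASToken -Name "linkedtemplates" -Permission r `
    -ExpiryTime (Get-Date).AddHours(2) -Context $ctx
# hand the value to later pipeline steps as a secret variable
Write-Host "##vso[task.setvariable variable=sasToken;issecret=true]$sas"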

how to download file from azure storage blob in project repository inside build pipeline (Azure DevOps)

I need a way to download a set of files from Azure Blob Storage into the project repository during the build.
The aim of the process is to CI/CD the mobile app, but the mobile app's icon, background image and some other images are provided by another application, so during the build the images are supposed to be taken from a blob storage container.
You can use Azure PowerShell to do that; it would probably be the easiest way:
$storageAccount = Get-AzStorageAccount -ResourceGroupName xxx -Name yyy
$ctx = $storageAccount.Context
# download the blob
Get-AzStorageBlobContent -Blob "Image001.jpg" `
    -Container $containerName `
    -Destination "D:\_TestImages\Downloads\" `
    -Context $ctx
Reading:
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-powershell#download-blobs
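If it is a whole set of images rather than a single blob, a small extension of the same idea (a sketch, assuming the same $ctx and that the images share a common prefix such as "images/") is to list the blobs and pipe them straight into the download cmdlet:
# download every blob under the "images/" prefix into the build sources directory
Get-AzStorageBlob -Container $containerName -Prefix "images/" -Context $ctx |
    Get-AzStorageBlobContent -Destination $env:BUILD_SOURCESDIRECTORY -Force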