How to use "Set-AzureStorageFileContent" to upload file to HDInsight? - powershell

I think I have set up my PowerShell environment to connect to my Azure account. I want to upload a file to my HDInsight blob storage. I did:
Set-AzureStorageFileContent -ShareName "what is a share name?" -Source "c:\local.file" -Path "test/"
But I got
Set-AzureStorageFileContent : Can not find your azure storage credential.
Please set current storage account using
"Set-AzureSubscription" or set the "AZURE_STORAGE_CONNECTION_STRING" environment variable.
The help information for Set-AzureSubscription is so useless, I have no idea what it is talking about...

A few things here:
Set-AzureStorageFileContent uploads a file into a File Service share and not blob storage. To upload a file into blob storage, you would need to use the Set-AzureStorageBlobContent cmdlet.
I believe the error you're getting is because no storage account is specified. The Set-AzureStorageBlobContent cmdlet has a parameter called Context, and you would need to specify the context, which you can do by calling the New-AzureStorageContext cmdlet.
Sample code:
$storageContext = New-AzureStorageContext -StorageAccountName "accountname" -StorageAccountKey "accountkey"
Set-AzureStorageBlobContent -File "full file path" -Container "container name" -BlobType "Block" -Context $storageContext -Verbose
Please note that the container must already exist in your storage account; you can create it using the New-AzureStorageContainer cmdlet.
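For example, a minimal sketch (reusing the $storageContext from above; the container name is just a placeholder):
# Create the container before uploading into it
New-AzureStorageContainer -Name "container name" -Context $storageContext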

Related

upload file to blob storage with Azure functions in PowerShell using Azure module

The requirement is to store a file in a storage account through Azure Functions in PowerShell using the Az module. Please help.
$todaydate = Get-Date -Format MM-dd-yy
$LogFull = "AzureScan-$todaydate.log"
$LogItem = New-Item -ItemType File -Name $LogFull
" Text to write" | Out-File -FilePath $LogFull -Append
First of all, what you need to figure out is the input of your function and how you're handling that. If you just want to write a file to blob storage every time an HTTP-triggered Azure function is executed, then that is simple enough.
There are, however, a number of elements that come into play when working with blob storage from Azure Functions that you will need to understand to develop a working solution.
Managed Identities
Azure Functions can be assigned an identity so that you can grant access to the FunctionApp itself rather than having to authenticate as a user. This means you don't have to handle the authentication aspect of your function to access the storage account content; you just need to grant your FunctionApp the relevant permissions to read/write/delete blob or storage content.
There are a number of built-in RBAC roles which you can grant to access storage accounts, blobs, etc.
You can find the documentation on the RBAC permissions for that here: https://learn.microsoft.com/en-us/azure/role-based-access-control/built-in-roles#storage
and the documentation on how to activate a managed identity on your functionApp can be found here: https://learn.microsoft.com/en-us/azure/app-service/overview-managed-identity?tabs=dotnet#add-a-system-assigned-identity
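As a rough sketch (not tested against your setup), once the FunctionApp has a system-assigned identity and a suitable RBAC role, the function can authenticate without any keys. The principal ID, scope and storage account name below are placeholders:
# Grant the FunctionApp's identity access to the storage account (run once from your own session)
New-AzRoleAssignment -ObjectId '<FunctionApp principal id>' -RoleDefinitionName 'Storage Blob Data Contributor' -Scope '<storage account resource id>'
# Inside the function: sign in as the managed identity and build a key-less storage context
Connect-AzAccount -Identity
$ctx = New-AzStorageContext -StorageAccountName '<storage account name>' -UseConnectedAccount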
Storage Account(s)
Programmatic access to storage account contents depends on the permissions, but you can use the access keys associated with the storage account, which grant access at the storage account level.
You can read about the access keys here: https://learn.microsoft.com/en-us/azure/storage/common/storage-account-keys-manage?tabs=azure-portal#view-account-access-keys
Just remember that least-privilege access should be adopted and if you leak your keys then someone could access your data.
PowerShell Commands
The PowerShell commands required for programmatically accessing storage accounts and writing blob data can be summarised below
# Variables required - Fill these out
$subscription = '<Insert Subscription Name or Id Here>'
$storageAccountName = '<Insert Storage Account Name Here>'
$containerName = '<Insert Storage Container Name Here>'
# Set the context to the subscription you want to use
# If your functionApp has access to more than one subscription it will load the first subscription by default.
# Possibly a good habit to be explicit about context.
Set-AzContext -Subscription $subscription
# Get the Storage Account Key to authenticate
$storAccKeys = Get-AzStorageAccountKey -ResourceGroupName 'Storage-ResourceGroup' -Name $storageAccountName
$primaryKey = $storAccKeys | Where-Object keyname -eq 'key1' | Select-Object -ExpandProperty value
# Create a Storage Context which will be used in the subsequent commands
$storageContext = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $primaryKey
# Attempt to create a container in the storage account. Handle Error appropriately.
try {
    New-AzStorageContainer -Name $containerName -Context $storageContext -ErrorAction Stop
}
catch [Microsoft.WindowsAzure.Commands.Storage.Common.ResourceAlreadyExistException] {
    Write-Output ('Container {0} already exists in Storage Account {1}' -f $containerName, $storageAccountName)
    # Throw here if you want it to fail instead.
}
catch {
    throw $_
}
# Upload your file here. This may vary depending on your function input and how you plan to have your functionApp work.
Set-AzStorageBlobContent -Container $containerName -File ".\PlanningData" -Blob "Planning2015" -Context $storageContext
You can see the documentation on Set-AzStorageBlobContent for examples on that here:
https://learn.microsoft.com/en-us/powershell/module/az.storage/set-azstorageblobcontent?view=azps-6.2.1#examples
Generally, though, you will need a local file to upload to blob storage; you can't just write directly to a file in blob storage.
If you need to read more on the Azure Functions side of things then there is the quickstart guide:
https://learn.microsoft.com/en-us/azure/azure-functions/create-first-function-vs-code-powershell
Or the Developer Reference on MS docs is really detailed:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-reference-powershell?tabs=portal
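Tying this back to your original snippet, a rough sketch of an HTTP-triggered run.ps1 could look like the following. The Response output binding, resource group, storage account and container names are assumptions for illustration, and the function is assumed to already be authenticated (for example via a managed identity as described above):
using namespace System.Net
param($Request, $TriggerMetadata)

# Build the log file locally first - you can't write directly into a blob
$todaydate = Get-Date -Format MM-dd-yy
$LogFull = Join-Path $env:TEMP "AzureScan-$todaydate.log"
" Text to write" | Out-File -FilePath $LogFull -Append

# Build a storage context from the account key and upload the log file
$storAccKeys = Get-AzStorageAccountKey -ResourceGroupName 'Storage-ResourceGroup' -Name '<storage account name>'
$primaryKey = ($storAccKeys | Where-Object KeyName -eq 'key1').Value
$storageContext = New-AzStorageContext -StorageAccountName '<storage account name>' -StorageAccountKey $primaryKey
Set-AzStorageBlobContent -Container '<container name>' -File $LogFull -Blob (Split-Path $LogFull -Leaf) -Context $storageContext -Force

# Return an HTTP response from the function
Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
    StatusCode = [HttpStatusCode]::OK
    Body = "Uploaded $(Split-Path $LogFull -Leaf)"
})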

Publish Test Results on Azure from Azure Storage account

I have a container that creates a report.xml file that I wish to use to create a test report in Azure.
I was thinking to do this during the pipeline, but I am missing how to get this file during the Azure pipeline.
What is the way to download a file from a storage account and use it for a Test Result during the pipeline?
If I understood correctly, you want to copy a file from blob storage in the pipeline?
For that you can use the Azure CLI task with this command:
az storage blob download
But can you explain what you mean by "use it on a Test Result during the pipeline"?
Try using Azure PowerShell to get the files from Azure Storage blob:
$storageAccount = Get-AzStorageAccount -ResourceGroupName xxx -Name yyy
$ctx = $storageAccount.Context
# download blob
Get-AzStorageBlobContent -Blob "report.xml" `
    -Container $containerName `
    -Destination "$(System.DefaultWorkingDirectory)" `
    -Context $ctx
According to this question: How to copy a file from Azure Storage fileshare to Release pipeline agent
It seems you were trying to download a file from an Azure file share (not blob storage), so please refer to my reply in that link.
After some digging this is the solution.
What I have done is mount a storage volume to the container instance that generates the report.
Afterwards, in the pipeline, I copied the files stored in the storage account to the pipeline agent.
$storageAcct = Get-AzStorageAccount -ResourceGroupName XXX -Name YYY
Get-AzStorageFileContent -Context $storageAcct.Context -ShareName "Sharename" -Path "report.xml" -Destination "$(System.DefaultWorkingDirectory)"
Then it's just a matter of getting those files from the agent when using the test reporter during the release pipeline.

how to download file from azure storage blob in project repository inside build pipeline (Azure DevOps)

Need a way to download a set of files from Azure Blob storage to the project repository during the build.
The aim of the process is CI/CD for the mobile app, but the app's icon, background image and some other images are provided by another application, so during the build the images are supposed to be taken from a blob storage container.
You can use Azure PowerShell to do that; it would probably be the easiest way:
$storageAccount = Get-AzStorageAccount -ResourceGroupName xxx -Name yyy
$ctx = $storageAccount.Context
# download blob
Get-AzStorageBlobContent -Blob "Image001.jpg" `
    -Container $containerName `
    -Destination "D:\_TestImages\Downloads\" `
    -Context $ctx
Reading:
https://learn.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-powershell#download-blobs
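Since the question mentions a set of images rather than a single file, a rough sketch (untested; the container variable and destination folder are assumptions) is to pipe the blob listing into the download cmdlet:
# Make sure the destination folder exists on the build agent
New-Item -ItemType Directory -Path "$env:BUILD_SOURCESDIRECTORY\images" -Force | Out-Null
# Download every blob in the container into the checked-out repository
Get-AzStorageBlob -Container $containerName -Context $ctx |
    Get-AzStorageBlobContent -Destination "$env:BUILD_SOURCESDIRECTORY\images" -Context $ctx -Force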

Function to move/delete files within file share in Azure Storage Explorer?

I'm not proficient in PowerShell yet, so please bear with me if I use the incorrect terminology. (And please correct me if I do.)
I have installed the Az and Azure.Storage modules.
I have also connected to my account using Connect-AzAccount. (Is this the best way, since you need to copy the URL and log in via a browser?)
Then I was just trying to view the files to test the connection, using Get-AzureStorageFile.
This prompts me for a ShareName - I used the name of the folder under File Shares in Azure Storage Explorer. But this failed; see the failure below:
cmdlet Get-AzureStorageFile at command pipeline position 1
Supply values for the following parameters: (Type !? for Help.)
ShareName: bss
get-azurestoragefile : Could not get the storage context. Please pass in a storage context or set the current storage context.
Additional information to note, I do not have access to the Account Key, only the SAS Token.
Any help would be appreciated.
If you use Connect-AzAccount, you should use the Az module cmdlet Get-AzStorageFile instead of Get-AzureStorageFile. Before running the Get-AzStorageFile command, you need to pass a storage context created with New-AzStorageContext to fix the error.
Sample:
$context = New-AzStorageContext -StorageAccountName "<StorageAccountName>" -StorageAccountKey "<StorageAccountKey>"
Get-AzStorageFile -ShareName "<ShareName>" -Path "<ContosoWorkingFolder>" -Context $context
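Since you mentioned you only have a SAS token and not the account key, you can also build the context from the SAS token instead. A sketch with placeholder values (the SAS must grant permissions on the file share):
# Build the context from a SAS token instead of the account key
$context = New-AzStorageContext -StorageAccountName "<StorageAccountName>" -SasToken "<SASToken>"
Get-AzStorageFile -ShareName "<ShareName>" -Path "<ContosoWorkingFolder>" -Context $context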

Using SAS token to upload Blob content

I'm having difficulty in using a Blob SAS token to write a file to a Blob in Azure via Powershell.
The code I'm using to generate the SAS token is:
$storageContext = Get-AzureRmStorageAccount -ResourceGroupName $resourceGroup -Name $storageName
$token = New-AzureStorageBlobSASToken -Container $conName -Context $storageContext.Context -Blob $blobName -ExpiryTime $expiry -Permission rw -FullUri
This generates a token as expected: https://name.blob.core.windows.net/container/test.json?sv=2015-04-05&sr=b&sig=abc123&se=2017-03-07T12%3A58%3A52Z&sp=rw
If I use this in the browser it works fine and downloads the file as expected. However, I can't use it to upload a file. Whenever I try, I receive a (403) Forbidden. The code I'm using to upload is:
$accountContext = New-AzureStorageContext -SasToken $sasToken
Get-AzureStorageContainer -Context $accountContext.Context | Set-AzureStorageBlobContent -File $blobFile
I've successfully been using a method similar to this to set Blob content after making a call to Add-AzureRmAccount to authenticate.
I've also tried to use a Container SAS token but I keep getting a 403 error with that.
The fact that the token works for a read leads me to believe that I'm missing something in my Powershell script - can anyone shed any light on what that is?
The fact that the token works for a read leads me to believe that I'm missing something in my Powershell script - can anyone shed any light on what that is?
I believe the problem is with the following line of code:
Get-AzureStorageContainer -Context $accountContext.Context
Two things here:
This cmdlet tries to list the blob containers in your storage account. In order to list blob containers using a SAS, you would need an Account SAS, whereas the SAS you're using is a Blob SAS (sr=b in the URL), and the Container SAS you tried won't work for this either.
Your SAS only has Read and Write permission. For listing containers, you would need List permission as well.
I would recommend simply using the Set-AzureStorageBlobContent cmdlet and providing the necessary information to it instead of getting the container name through the pipeline:
Set-AzureStorageBlobContent -File $blobFile -Container $conName -Context $accountContext.Context -Blob $blobName
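If you do want to keep the container-listing approach from your original script, a sketch of generating an Account SAS with List permission (reusing the variables from your question; the permission string and expiry are assumptions) would be:
# Account SAS over the Blob service with read, write and list so containers can be enumerated
$accountSas = New-AzureStorageAccountSASToken -Service Blob -ResourceType Service,Container,Object -Permission "rwl" -ExpiryTime $expiry -Context $storageContext.Context
$accountContext = New-AzureStorageContext -StorageAccountName $storageName -SasToken $accountSas
Get-AzureStorageContainer -Context $accountContext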