PowerShell: List all tables on Azure Storage

I am trying to automate a query against tables in various Azure Storage accounts.
The tables are automatically generated with a different name every week.
I am looking for a way to automatically generate a list of the tables available in a given storage account, so that I can then use foreach to query each table.
I've seen a few bits of script here and there but cannot get anything working.
For example:
$response = Invoke-WebRequest -Uri 'https://MyAccountName.table.core.windows.net/Tables/'
[xml]$tables = $response.Content
$tableNames = $tables.feed.entry.content.properties.TableName

To fetch the list of tables in your storage account you can use the Azure PowerShell cmdlets; there's no need to consume the REST API directly. The cmdlet you want is Get-AzureStorageTable.
Here's some sample code:
$StorageAccountName = "your storage account name"
$StorageAccountKey = "your storage account key"
$ctx = New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey
Get-AzureStorageTable -Context $ctx
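Since the goal is to iterate over the weekly generated tables, note that each object returned by the cmdlet exposes the table name via its Name property. A minimal sketch of the loop, using the current Az module equivalents (the older Azure/AzureRM cmdlets above are deprecated) and assuming the same account name/key variables as the sample:

```powershell
# Az module equivalents: New-AzStorageContext / Get-AzStorageTable
$ctx = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey

# Each returned object has a Name property, so the weekly tables
# can be enumerated and queried one by one:
foreach ($table in Get-AzStorageTable -Context $ctx) {
    Write-Host "Querying table:" $table.Name
    # ... run your per-table query here ...
}
```

Running this requires an Azure storage account and valid credentials, so treat it as a sketch rather than a drop-in script.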

Related

Getting blank while trying to get list of blobs from PowerShell

I am trying to fetch the list of blobs present in an Azure blob container from PowerShell.
I have tried using a function in my script to do that, but it returns nothing.
My script is something like this (resource names hidden):
## Connect to Azure Account
Connect-AzAccount
Function GetBlobs
{
## Get the storage account
$storageAcc=Get-AzStorageAccount -ResourceGroupName $resourceGroupName -Name $storageAccountName
## Get all the containers
$containers=Get-AzStorageContainer
## Get all the blobs
$blobs=Get-AzStorageBlob -Container $containerName
## Loop through all the blobs
foreach($blob in $blobs)
{
write-host $blob.Name
}
}
GetBlobs
But it returned nothing, even though I have blobs in my container. I don't know what I'm doing wrong.
Can someone help me out? I'm new to this platform too, so I'm not sure if I've asked my question the right way.
I have tested this in my environment, and it returned the list of blobs successfully.
Try including the storage account context; if you want to know more about this, go through the reference linked below. After including it, you should get the list of blobs successfully.
Please check whether your script looks something like this:
$resourceGroupName = "your_RG"
$storageAccountName= "your_SA"
$containerName= "your_container_name"
## Connect to Azure Account
Connect-AzAccount
## Function to get all the blobs
Function GetBlobs
{
$storageAcc=Get-AzStorageAccount -ResourceGroupName $resourceGroupName -Name $storageAccountName
## Get the storage account context
$context=$storageAcc.Context
## Get all the containers
$containers=Get-AzStorageContainer -Context $context
$blobs=Get-AzStorageBlob -Container $containerName -Context $context
foreach ($blob in $blobs)
{
write-host $blob.Name
}
}
GetBlobs
See the reference below for more detail.
Reference:
How to Get All the Blobs from an Azure Storage Account using PowerShell (c-sharpcorner.com)

Using Only a SAS Token To Upload in PowerShell

I have a SAS Token in the form:
https://name.blob.core.windows.net/container?sv=2015-04-05&sr=b&sig=abc123&se=2017-03-07T12%3A58%3A52Z&sp=rw
I am attempting to use an Az-provided cmdlet in PowerShell to upload content to this blob. I am unable to find one that simply takes the above SAS token and a file to upload.
Reading this reddit post it seems like my options are:
Parse out the StorageAccountName (in the example name), Container (in the example container) and SASToken (in the example above sv=2015-04-05&sr=b&sig=abc123&se=2017-03-07T12%3A58%3A52Z&sp=rw) and then use New-AzStorageContext/Set-AzStorageBlobContent. This more or less is the answer in this StackOverflow Post (Connect to Azure Storage Account using only SAS token?)
Use Invoke-WebRequest or its kin to basically perform the REST call myself.
I would like to use as many Az-provided cmdlets as possible, so starting with option 1: there doesn't seem to be an API to parse this. The closest I can find is this Stack Overflow post (Using SAS token to upload Blob content), which talks about using CloudBlockBlob; however, it is unclear whether that class is available to me in PowerShell.
To that end I've created a regex that appears to work but is most likely brittle. Is there a better way to do this?
$SASUri = 'https://name.blob.core.windows.net/container?sv=2015-04-05&sr=b&sig=abc123&se=2017-03-07T12%3A58%3A52Z&sp=rw'
$fileToUpload = 'Test.json'
$regex = [System.Text.RegularExpressions.Regex]::Match($SASUri, '(?i)\/+(?<StorageAccountName>.*?)\..*\/(?<Container>.*)\?(?<SASToken>.*)')
$storageAccountName = $regex.Groups['StorageAccountName'].Value
$container = $regex.Groups['Container'].Value
$sasToken = $regex.Groups['SASToken'].Value
$storageContext = New-AzStorageContext -StorageAccountName $storageAccountName -SasToken $sasToken
Set-AzStorageBlobContent -File $fileToUpload -Container $container -Context $storageContext -Force
To Restate The Questions
Is there an Az Cmdlet that takes the SAS URI and SAS Token to allow upload?
(If not) Is there an API to parse the SAS URI + SAS Token?
Considering $SASUri is a URI, you can get a System.Uri object using something like:
$uri = [System.Uri] $SASUri
Once you have that, you can get the container name and the SAS token using something like:
$storageAccountName = $uri.DnsSafeHost.Split(".")[0]
$container = $uri.LocalPath.Substring(1)
$sasToken = $uri.Query
After that your code should work just fine:
$storageContext = New-AzStorageContext -StorageAccountName $storageAccountName -SasToken $sasToken
Set-AzStorageBlobContent -File $fileToUpload -Container $container -Context $storageContext -Force
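Applied to the example SAS URI from the question, the pieces come out as follows (a quick sketch to sanity-check the parsing; a container nested deeper in the path would need more handling than Substring(1)):

```powershell
$SASUri = 'https://name.blob.core.windows.net/container?sv=2015-04-05&sr=b&sig=abc123&se=2017-03-07T12%3A58%3A52Z&sp=rw'
$uri = [System.Uri] $SASUri

$uri.DnsSafeHost.Split(".")[0]   # name  (the storage account name)
$uri.LocalPath.Substring(1)      # container
$uri.Query                       # ?sv=2015-04-05&sr=b&sig=abc123&se=2017-03-07T12%3A58%3A52Z&sp=rw
```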

Upload file to blob storage with Azure Functions in PowerShell using the Az module

The requirement is to store a file in a storage account from an Azure Function written in PowerShell, using the Az module. Please help.
$todaydate = Get-Date -Format MM-dd-yy
$LogFull = "AzureScan-$todaydate.log"
$LogItem = New-Item -ItemType File -Name $LogFull
" Text to write" | Out-File -FilePath $LogFull -Append
First of all, you need to figure out what the input of your function is and how you're handling it. If you just want to write a file to blob storage every time an HTTP-triggered Azure Function executes, then that is simple enough.
There are, however, a number of elements that come into play when working with blob storage from Azure Functions that you will need to understand to develop a working solution.
Managed Identities
Azure Functions can be assigned an identity so that you can grant access to the FunctionApp itself rather than having to authenticate as a user. This means you don't have to handle the authentication aspect of your function to access the storage account content; you just need to grant your FunctionApp the relevant permissions to read/write/delete blob or storage content.
There are a number of built-in Azure RBAC roles which you can use to grant access to storage accounts, blobs, etc.
You can find the documentation on the RBAC permissions for that here: https://learn.microsoft.com/en-us/azure/role-based-access-control/built-in-roles#storage
and the documentation on how to activate a managed identity on your functionApp can be found here: https://learn.microsoft.com/en-us/azure/app-service/overview-managed-identity?tabs=dotnet#add-a-system-assigned-identity
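A minimal sketch of what the managed-identity route looks like from the function's PowerShell code, assuming the system-assigned identity has been enabled and granted a storage RBAC role such as Storage Blob Data Contributor on the target account:

```powershell
# Authenticate as the FunctionApp's system-assigned managed identity.
Connect-AzAccount -Identity

# With Azure AD auth in place, a storage context can be built
# without any account key:
$ctx = New-AzStorageContext -StorageAccountName $storageAccountName -UseConnectedAccount
```

This avoids handling account keys in the function entirely; the trade-off is that the RBAC role assignment has to be managed on the Azure side.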
Storage Account(s)
Programmatic access to storage account contents depends on the permissions involved, but you can use the access keys associated with the storage account, which provide access at the storage account level.
You can read about the access keys here: https://learn.microsoft.com/en-us/azure/storage/common/storage-account-keys-manage?tabs=azure-portal#view-account-access-keys
Just remember that least-privilege access should be adopted and if you leak your keys then someone could access your data.
PowerShell Commands
The PowerShell commands required for programmatically accessing storage accounts and writing blob data are summarised below:
# Variables required - Fill these out
$subscription = '<Insert Subscription Name or Id Here>'
$storageAccountName = '<Insert Storage Account Name Here>'
$containerName = '<Insert Storage Container Name Here>'
# Set the context to the subscription you want to use
# If your functionApp has access to more than one subscription it will load the first subscription by default.
# Possibly a good habit to be explicit about context.
Set-AzContext -Subscription $subscription
# Get the Storage Account Key to authenticate
$storAccKeys = Get-AzStorageAccountKey -ResourceGroupName 'Storage-ResourceGroup' -Name $storageAccountName
$primaryKey = $storAccKeys | Where-Object keyname -eq 'key1' | Select-Object -ExpandProperty value
# Create a Storage Context which will be used in the subsequent commands
$storageContext = New-AzStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $primaryKey
# Attempt to create a container in the storage account. Handle Error appropriately.
try {
New-AzStorageContainer -Name $containerName -Context $storageContext -ErrorAction Stop
}
catch [Microsoft.WindowsAzure.Commands.Storage.Common.ResourceAlreadyExistException] {
Write-Output ('Container {0} already exists in Storage Account {1}' -f $containerName, $storageAccountName)
# Throw Here if you want it to fail instead.
}
catch {
throw $_
}
# Upload your file here. This may vary depending on your function input and how you plan to have your functionApp work.
Set-AzStorageBlobContent -Container $containerName -File ".\PlanningData" -Blob "Planning2015" -Context $storageContext
You can see the documentation on Set-AzStorageBlobContent for examples on that here:
https://learn.microsoft.com/en-us/powershell/module/az.storage/set-azstorageblobcontent?view=azps-6.2.1#examples
Generally, though, you will need a local file to upload to blob storage; you can't just write directly to a file in blob storage.
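Tying this back to the log-file snippet in the question, one sketch is to write the log into the function's temp area first and then upload it. This assumes $env:TEMP is writable in your Functions hosting plan, and reuses the $containerName and $storageContext variables from the commands above:

```powershell
# Write the log locally, then push it to blob storage.
$todaydate = Get-Date -Format MM-dd-yy
$logFull = Join-Path $env:TEMP "AzureScan-$todaydate.log"
" Text to write" | Out-File -FilePath $logFull -Append

# Upload the finished file as a blob.
Set-AzStorageBlobContent -Container $containerName -File $logFull -Blob "AzureScan-$todaydate.log" -Context $storageContext -Force
```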
If you need to read more on the Azure Functions side of things then there is the quickstart guide:
https://learn.microsoft.com/en-us/azure/azure-functions/create-first-function-vs-code-powershell
Or the Developer Reference on MS docs is really detailed:
https://learn.microsoft.com/en-us/azure/azure-functions/functions-reference-powershell?tabs=portal

DevOps - Linked ARM template - Generate blob storage SAS token using PowerShell

I'm trying to deploy a linked ARM template using DevOps.
Instead of hard-coding the SAS token, I would like to generate it with a PowerShell script, but I'm not familiar with using PowerShell to generate a blob SAS token.
Any help with this PowerShell will be appreciated!
Updated 0512:
If you want to get the account key automatically, you can use the cmdlet Get-AzStorageAccountKey.
For example:
1. Get both key1 and key2 of your storage account:
Get-AzStorageAccountKey -ResourceGroupName "your_resourceGroupName" -Name "your_storageAccountName"
2. Get just key1 of your storage account:
$s=Get-AzStorageAccountKey -ResourceGroupName "your_resourceGroupName" -Name "your_storageAccountName"
$s[0].Value
Original answer:
If you're using the Azure PowerShell Az module, you can use the New-AzStorageBlobSASToken cmdlet.
Sample code:
$accountName="xxx"
$accountKey="xxxx"
$context=New-AzStorageContext -StorageAccountName $accountName -StorageAccountKey $accountKey
New-AzStorageBlobSASToken -Container "ContainerName" -Blob "BlobName" -Permission rwd -Context $context
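For the linked-template scenario, the deployment ultimately needs the full blob URI with the SAS appended, not just the bare token. A sketch of one way to build it, assuming hypothetical container/blob names and reusing the $context variable from the sample above; the -FullUri switch makes the cmdlet return the blob URI with the token already attached:

```powershell
# Build the URI a linked ARM template deployment can consume.
# Read permission on the template blob is enough for deployment.
$templateUri = New-AzStorageBlobSASToken -Container "ContainerName" -Blob "linkedTemplate.json" `
    -Permission r -ExpiryTime (Get-Date).AddHours(2) -Context $context -FullUri
```

$templateUri can then be passed to the deployment step (e.g. as the templateLink value) instead of a hard-coded SAS URI.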

How can I call New-AzureStorageContext using -SasToken instead of -StorageAccountKey

Using Azure PowerShell v1.3, I'm trying to create a new storage context using an existing Shared Access Signature token which references an existing policy. When I call New-AzureStorageContext with -SasToken and -StorageAccountName I get an error:
PS C:\> $ctx = New-AzureStorageContext -SasToken '?sr=c&sv=2015-02-21&si=ReadOnly&sig=<signature omitted>=&api-version=2015-04-05' -StorageAccountName 'mystorageaccount'
New-AzureStorageContext : An item with the same key has already been added.
At line:1 char:8
I feel like I'm getting the format of the SAS token wrong, or am missing a step or parameter, but there are no examples on what it should look like, and this is the only SAS form I've been able to query from Azure.
Note I don't want to use New-AzureStorageAccountSASToken (which all examples use) because I already have a token, and just want to use it for read-only purposes, so I don't need to use the storage account keys. Creating a new one would require permissions I don't want this client to have.
What is the missing syntax / step?
I think you've discovered a bug in the Storage Client Library. I traced the code from PowerShell into the Storage Client Library, and here's what I found: the PowerShell cmdlet code tries to create a StorageCredentials object by passing in the SAS token.
public StorageCredentials(string sasToken)
{
    CommonUtility.AssertNotNullOrEmpty("sasToken", sasToken);

    this.SASToken = sasToken;
    this.UpdateQueryBuilder();
}

private void UpdateQueryBuilder()
{
    SasQueryBuilder newQueryBuilder = new SasQueryBuilder(this.SASToken);
    newQueryBuilder.Add(Constants.QueryConstants.ApiVersion, Constants.HeaderConstants.TargetStorageVersion);
    this.queryBuilder = newQueryBuilder;
}
Now if you look at the code for UpdateQueryBuilder, it tries to add api-version again without checking whether it is already present.
I created an issue on Github for this: https://github.com/Azure/azure-storage-net/issues/259.
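Since the "same key" error comes from api-version being added a second time, one possible workaround (an untested sketch, using the token format from the question) is to strip that parameter from the token before passing it in:

```powershell
# Workaround sketch: remove the api-version pair that the client library
# re-adds itself, then pass the remaining token.
$rawToken = '?sr=c&sv=2015-02-21&si=ReadOnly&sig=<signature omitted>=&api-version=2015-04-05'
$sasToken = ($rawToken.TrimStart('?') -split '&' |
    Where-Object { $_ -notlike 'api-version=*' }) -join '&'
$ctx = New-AzureStorageContext -SasToken $sasToken -StorageAccountName 'mystorageaccount'
```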
This is an old question, but the storage context now works with SAS tokens:
$resourceGroup="YourResourceGroupName"
$storAccName = "YourStorageAccountName"
# get Storage Key
$storKey = (Get-AzureRmStorageAccountKey -ResourceGroupName $resourceGroup -Name $storAccName).Value[0]
# create main Storage Context
$storCont = New-AzureStorageContext -StorageAccountName $storAccName -StorageAccountKey $storKey
# create SAS token
$storSAS = New-AzureStorageAccountSASToken -Service Blob, Queue -ResourceType Service, Container, Object -Permission "rwdalucp" -Context $storCont
# create SAS-based Storage Context
$storContSAS = New-AzureStorageContext -StorageAccountName $storAccName -SasToken $storSAS