How to rename a blob file using PowerShell

A seemingly simple task: I just want to rename a blob file. I know I have to copy it to rename it, then delete the original, but this is proving tricky. I have created the storage context (New-AzureStorageContext), got the blob (Get-AzureStorageBlob), and found Start-AzureStorageBlobCopy, but how do I actually rename it?
I'd like to do this within the same container if possible. Ideally I'd run it in an Azure Runbook and call it using a webhook from Azure Data Factory v2. I did try to rename the file using 'Add Dynamic Content' in the copy job sink in ADFv2, but I don't think you can. By the way, I just want to append the date to the existing file name. Thank you.

You can use my Rename-AzureStorageBlob convenience function:
function Rename-AzureStorageBlob
{
    [CmdletBinding()]
    Param
    (
        [Parameter(Mandatory=$true, ValueFromPipeline=$true, Position=0)]
        [Microsoft.WindowsAzure.Commands.Common.Storage.ResourceModel.AzureStorageBlob]$Blob,

        [Parameter(Mandatory=$true, Position=1)]
        [string]$NewName
    )

    Process {
        $blobCopyAction = Start-AzureStorageBlobCopy `
            -ICloudBlob $Blob.ICloudBlob `
            -DestBlob $NewName `
            -Context $Blob.Context `
            -DestContainer $Blob.ICloudBlob.Container.Name

        # Wait for the server-side copy to finish before deleting the original
        $status = $blobCopyAction | Get-AzureStorageBlobCopyState
        while ($status.Status -ne 'Success')
        {
            $status = $blobCopyAction | Get-AzureStorageBlobCopyState
            Start-Sleep -Milliseconds 50
        }

        $Blob | Remove-AzureStorageBlob -Force
    }
}
It accepts the blob as pipeline input so you can pipe the result of the Get-AzureStorageBlob to it and just provide a new name:
$connectionString = 'DefaultEndpointsProtocol=https;AccountName....'
$storageContext = New-AzureStorageContext -ConnectionString $connectionString

Get-AzureStorageBlob -Container 'MyContainer' -Context $storageContext -Blob 'myBlob.txt' |
    Rename-AzureStorageBlob -NewName 'MyNewBlob.txt'
To append the date to the existing file name you can use something like:
Get-AzureStorageBlob -Container 'MyContainer' -Context $storageContext -Blob 'myBlob.txt' | ForEach-Object {
    $_ | Rename-AzureStorageBlob -NewName "$($_.Name)$(Get-Date -Format FileDateTime)"
}
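Note that this appends the timestamp after the extension, producing names like myBlob.txt20180924... . If you would rather keep the extension at the end, a small variation could work; this is only a sketch reusing the same example container and blob names as above:

```powershell
# Insert the timestamp before the extension, e.g. "myBlob_<timestamp>.txt".
# Note: for blobs inside virtual folders you'd also need to preserve the "folder" prefix,
# since GetFileNameWithoutExtension strips everything up to the last "/".
Get-AzureStorageBlob -Container 'MyContainer' -Context $storageContext -Blob 'myBlob.txt' | ForEach-Object {
    $base = [System.IO.Path]::GetFileNameWithoutExtension($_.Name)
    $ext  = [System.IO.Path]::GetExtension($_.Name)
    $_ | Rename-AzureStorageBlob -NewName "${base}_$(Get-Date -Format FileDateTime)$ext"
}
```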
Further reading: Rename Azure Storage Blob using PowerShell


The term 'AzCopy' is not recognized as a name of a cmdlet, function, script file

I need to copy tables from table storage into a different storage account. When attempting to execute AzCopy I'm getting the following exception:
The term 'AzCopy' is not recognized as a name of a cmdlet, function,
script file, or executable program. Check the spelling of the name, or
if a path was included, verify that the path is correct and try again.
I'm connected to the terminal from the portal and have a PowerShell prompt.
The issue seems to be with this line:
AzCopy /Source:$SrcTableUrl `
/Dest:$DstBlobUrl/$TableName `
/SourceKey:$SrcAccessKey `
/Destkey:$DstAccessKey
How do we run AzCopy command in the terminal in the Azure portal?
Here's the full PowerShell script that I'm attempting to execute:
# This simple PowerShell script will copy one or more Azure storage tables from one storage account into another
#
# Dependencies :
# https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy
# https://learn.microsoft.com/en-us/powershell/azure/overview?view=azps-1.6.0
#
# Usage :
# Copy-AzureStorageTable -SrcStorageName "" -SrcAccessKey "" -DstStorageName "" -DstAccessKey "" -IncludeTable All
# Copy-AzureStorageTable -SrcStorageName "" -SrcAccessKey "" -DstStorageName "" -DstAccessKey "" -IncludeTable Table1,Table2,Table3
function Copy-AzureStorageTable
{
    param
    (
        [parameter(Mandatory=$true)]
        [String]
        $SrcStorageName,

        [parameter(Mandatory=$true)]
        [String]
        $SrcAccessKey,

        [parameter(Mandatory=$true)]
        [String]
        $DstStorageName,

        [parameter(Mandatory=$true)]
        [String]
        $DstAccessKey,

        [parameter(Mandatory=$true)]
        [String[]]
        $IncludeTable
    )

    # Check if logged in
    Azure-Login

    # Source account storage parameters
    $SrcContext = New-AzureStorageContext -StorageAccountName $SrcStorageName -StorageAccountKey $SrcAccessKey
    $SrcBaseUrl = "https://" + $SrcStorageName + ".table.core.windows.net/"

    # Destination account storage parameters
    $DstContext = New-AzureStorageContext -StorageAccountName $DstStorageName -StorageAccountKey $DstAccessKey
    $DstTempContainer = "temptable"
    $DstBlobUrl = "https://" + $DstStorageName + ".blob.core.windows.net/$DstTempContainer"
    $DstTableUrl = "https://" + $DstStorageName + ".table.core.windows.net"

    # Create a temporary container in the destination blob storage
    Write-Host "Creating container $DstTempContainer in $DstStorageName..."
    New-AzureStorageContainer -Name $DstTempContainer -Permission Off -Context $DstContext

    # Get all tables from the source
    $SrcTables = Get-AzureStorageTable -Name "*" -Context $SrcContext

    foreach($table in $SrcTables)
    {
        $TableName = $table.Name
        Write-Host "Table $TableName"

        # Skip tables that are not in our include list (unless copying all tables)
        if(!$IncludeTable.Contains("All") -and !$IncludeTable.Contains($TableName))
        {
            Write-Host "Skipping table $TableName"
            continue
        }

        Write-Host "Migrating Table $TableName"
        $SrcTableUrl = $SrcBaseUrl + $TableName

        # Copy the table from the source to the destination blob storage. As far as I know
        # there is no way to copy a table to a table directly, so we copy the table
        # temporarily into the destination blob storage.
        # Take note to put the actual path of AzCopy.exe
        Write-Host "Start exporting table $TableName..."
        Write-Host "From : $SrcTableUrl"
        Write-Host "To : $DstBlobUrl/$TableName"
        AzCopy /Source:$SrcTableUrl `
               /Dest:$DstBlobUrl/$TableName `
               /SourceKey:$SrcAccessKey `
               /DestKey:$DstAccessKey

        # Get the newly created blobs
        Write-Host "Get all blobs in $DstTempContainer..."
        $CurrentBlob = Get-AzureStorageBlob -Container $DstTempContainer -Prefix $TableName -Context $DstContext

        # Loop and check for the manifest, then import the blob to the table
        foreach($blob in $CurrentBlob)
        {
            if(!$blob.Name.contains('.manifest'))
            {
                continue
            }

            $manifest = $($blob.Name).split('/')[1]

            Write-Host "Start importing $TableName..."
            Write-Host "Source blob url : $DstBlobUrl/$TableName"
            Write-Host "Dest table url : $DstTableUrl/$TableName"
            Write-Host "Manifest name : $manifest"

            # Import the blob to the table. Insert the entity if missing and update it if it exists.
            AzCopy /Source:$DstBlobUrl/$TableName `
                   /Dest:$DstTableUrl/$TableName `
                   /SourceKey:$DstAccessKey `
                   /DestKey:$DstAccessKey `
                   /Manifest:$manifest `
                   /EntityOperation:"InsertOrReplace"
        }
    }

    # Delete the temporary container after the export and import process
    Write-Host "Removing $DstTempContainer from destination blob storage..."
    Remove-AzureStorageContainer -Name $DstTempContainer -Context $DstContext -Force
}

# Login
function Azure-Login
{
    $needLogin = $true
    Try
    {
        $content = Get-AzureRmContext
        if ($content)
        {
            $needLogin = ([string]::IsNullOrEmpty($content.Account))
        }
    }
    Catch
    {
        if ($_ -like "*Login-AzureRmAccount to login*")
        {
            $needLogin = $true
        }
        else
        {
            throw
        }
    }
    if ($needLogin)
    {
        Login-AzureRmAccount
    }
}
My solution was simply to run the following command in PowerShell:
Install-Module -Name Az -Scope CurrentUser -Repository PSGallery -Force
After that, azcopy is recognized and the script works. Hope it helps.
The Azure Portal Cloud Shell uses AzCopy version 10.6.1 as of 2021.11.10, and the ability to copy between tables was removed after version 7.3.
You need to run the script from a machine where you can download the older version of AzCopy.

Check if table exists in azure storage powershell

I have this PowerShell script with which I want to create a new table in an Azure storage account.
Param(
    [string]$rgName,
    [string]$tableName
)

$storcontext = New-AzureStorageContext -ConnectionString '$(MyConnectionString)'

if(!(Get-AzureStorageTable -Name $tableName -Context $storcontext))
{
    New-AzureStorageTable -Name $tableName -Context $storcontext
}
The New-AzureStorageTable command works perfectly. However, when I added a check to see whether the table already exists, the Get command throws an error saying the table does not exist.
What I want to do is check whether the table exists and, if not, create it.
Is there another way of doing this?
The cmdlet throws an error if the table doesn't exist, so you can set ErrorAction to SilentlyContinue and specify an error variable that you can check afterwards:
Get-AzureStorageTable -Name $tableName -Context $storcontext -ErrorVariable ev -ErrorAction SilentlyContinue
if ($ev) {
    New-AzureStorageTable -Name $tableName -Context $storcontext
}

Error while deploying the tables using powershell script into Azure table storage

I am running the script below and passing the parameters for $fileObj through the arguments section of a VSTS PowerShell task. I have table data in .csv files and I am trying to deploy those table entities into Azure table storage using a PowerShell script. The script below is not deploying the table entities and fails with an error. Could anyone please help me out?
I have attached the error log in onedrive location: https://onedrive.live.com/?authkey=%21AEh2aAOnbmuzq9U&cid=5599285D52BD31F3&id=5599285D52BD31F3%21900&parId=root&action=locate
foreach($fo in $fileObj){
    Write-Host $fo.filepath
    $csv = Import-CSV $fo.filepath
    $cArray = $fo.Cols.split(",")
    foreach($line in $csv)
    {
        Write-Host "$($line.partitionkey), $($line.rowKey)"
        $entity = New-Object -TypeName Microsoft.WindowsAzure.Storage.Table.DynamicTableEntity -ArgumentList $line.partitionkey, $line.rowKey
        foreach($c in $cArray){
            Write-Host "$c,$($line.$c)"
            $entity.Properties.Add($c,$line.$c)
        }
        $result = $table.CloudTable.Execute([Microsoft.WindowsAzure.Storage.Table.TableOperation]::Insert($entity))
    }
}

$subscriptionName = ""
$resourceGroupName = ""
$storageAccountName = ""
$location = ""

# Get the storage key for the storage account
$StorageAccountKey = ""

# Get a storage context
$ctx = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
According to the log you shared, it seems that your csv column names do not correspond to your code, and your CSV file format with the two columns named PartitionKey and RowKey is not correct. Please try the following demo code and CSV file format; it works correctly on my side.
$resourceGroup = "resourceGroup name"
$storageAccount = "storage account Name"
$tableName = "table name"
$storageAccountKey = "storage key"
$ctx = New-AzureStorageContext -StorageAccountName $storageAccount -StorageAccountKey $storageAccountKey

######### Remove and recreate the table #######
try
{
    Write-Host "Start to remove table $tableName, please wait a moment..."
    Remove-AzureStorageTable -Name $tableName -Context $ctx -Force # Remove the Azure table
    Start-Sleep -Seconds 60 # wait for the table to be removed; adjust this for your table
    Write-Host "$tableName table has been removed"
}
catch
{
    Write-Host "$tableName does not exist"
}
Write-Host "Start to create $tableName table"
New-AzureStorageTable -Name $tableName -Context $ctx # Create a new Azure storage table
######### Remove and recreate the table #######

$table = Get-AzureStorageTable -Name $tableName -Context $ctx
$csvPath = 'csv file path'
$cols = "Label_Usage,Label_Value,Usage_Location" # should correspond to your csv columns, excluding PartitionKey and RowKey
$csv = Import-Csv -Path $csvPath
$number = 0
[Microsoft.WindowsAzure.Storage.Table.TableBatchOperation]$batchOperation = New-Object -TypeName Microsoft.WindowsAzure.Storage.Table.TableBatchOperation
foreach($line in $csv)
{
    $number++
    $entity = New-Object -TypeName Microsoft.WindowsAzure.Storage.Table.DynamicTableEntity -ArgumentList $line.partitionkey, $line.rowKey
    $colArray = $cols.split(",")
    Write-Host "$($line.partitionkey), $($line.rowKey)" # output partitionkey and rowKey value
    foreach($colName in $colArray)
    {
        Write-Host "$colName,$($line.$colName)" # output column name and value
        $entity.Properties.Add($colName,$line.$colName)
    }
    $batchOperation.InsertOrReplace($entity)
    if($number -ge 100) # a batch operation accepts at most 100 entities
    {
        $number = 0
        $result = $table.CloudTable.ExecuteBatch($batchOperation)
        [Microsoft.WindowsAzure.Storage.Table.TableBatchOperation]$batchOperation = New-Object -TypeName Microsoft.WindowsAzure.Storage.Table.TableBatchOperation
    }
}
if($batchOperation.Count -ne 0)
{
    $result = $table.CloudTable.ExecuteBatch($batchOperation)
}
Note: a batch operation requires all records in the CSV file to have the same partition key value.
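Given that constraint (and the 100-entity limit per batch), one way to make the batching robust when the CSV contains several partition keys is to group the rows first. This is only a sketch reusing the $table and $csvPath variables from the script above; the entity properties are omitted for brevity and would be added the same way as above:

```powershell
# Group rows by PartitionKey so every batch satisfies the same-partition-key rule,
# then chunk each group into batches of at most 100 entities.
$csv = Import-Csv -Path $csvPath
foreach ($group in ($csv | Group-Object -Property partitionkey))
{
    for ($i = 0; $i -lt $group.Group.Count; $i += 100)
    {
        $batchOperation = New-Object -TypeName Microsoft.WindowsAzure.Storage.Table.TableBatchOperation
        $last = [Math]::Min($i + 99, $group.Group.Count - 1)
        foreach ($line in $group.Group[$i..$last])
        {
            $entity = New-Object -TypeName Microsoft.WindowsAzure.Storage.Table.DynamicTableEntity -ArgumentList $line.partitionkey, $line.rowKey
            $batchOperation.InsertOrReplace($entity)
        }
        $result = $table.CloudTable.ExecuteBatch($batchOperation)
    }
}
```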

Upload files and folder into Azure Blob Storage

I have created a storage account in Azure and created a container. I am trying to upload files stored on my server; the files are stored within 800 folders.
I have tried doing this with this Powershell script however it does not work with the subfolders.
$LocalFolder = "\\Server\Data"

Add-AzureAccount
# Set a default Azure subscription.
Select-AzureSubscription -SubscriptionName 'nameofsubscription' -Default

$Key = Get-AzureStorageKey -StorageAccountName mydatastorename
$Context = New-AzureStorageContext -StorageAccountKey $Key.Primary -StorageAccountName mydatastorename

foreach ($folder in Get-ChildItem $LocalFolder)
{
    ls -Recurse -Path $LocalFolder | Set-AzureStorageBlobContent -Container nameofcontainer -Context $Context -BlobType Block
}
If I set $LocalFolder to "\\Server\Data\subfolders001", the files in subfolder001 get uploaded to the container. But when I keep it as "\\Server\Data", it does not work.
I want the script to upload all the sub folders and files within into the storage container.
I have added the output I get when I run it. I don't get any error message, but I get one warning message per subfolder:
WARNING: Can not upload the directory '\\Server\Data\subfolders001' to azure. If you want to upload directory, please use "ls -File -Recurse | Set-AzureStorageBlobContent -Container containerName".
Then nothing happens; after waiting for a while I have to stop the PowerShell script.
Was just solving this myself, see my solution below:
$StorageAccountName = "storageAccountName"
$StorageAccountKey = "StorageAccountKey"
$ContainerName = "ContainerName"
$sourceFileRootDirectory = "AbsolutePathToStartingDirectory" # i.e. D:\Docs
function Upload-FileToAzureStorageContainer {
    [cmdletbinding()]
    param(
        $StorageAccountName,
        $StorageAccountKey,
        $ContainerName,
        $sourceFileRootDirectory,
        $Force
    )

    $ctx = New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey
    $container = Get-AzureStorageContainer -Name $ContainerName -Context $ctx
    $container.CloudBlobContainer.Uri.AbsoluteUri

    if ($container) {
        $filesToUpload = Get-ChildItem $sourceFileRootDirectory -Recurse -File

        foreach ($x in $filesToUpload) {
            # Use the path relative to the root directory as the blob name,
            # with "/" separators so the virtual folder structure is preserved
            $targetPath = ($x.fullname.Substring($sourceFileRootDirectory.Length + 1)).Replace("\", "/")

            Write-Verbose "Uploading $("\" + $x.fullname.Substring($sourceFileRootDirectory.Length + 1)) to $($container.CloudBlobContainer.Uri.AbsoluteUri + "/" + $targetPath)"
            Set-AzureStorageBlobContent -File $x.fullname -Container $container.Name -Blob $targetPath -Context $ctx -Force:$Force | Out-Null
        }
    }
}
Running:
Upload-FileToAzureStorageContainer -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey -ContainerName $ContainerName -sourceFileRootDirectory $sourceFileRootDirectory -Verbose
You should see that it prints the following output as it runs:
VERBOSE: Uploading testfile.json to https://storagecontainername.blob.core.windows.net/path/testfile.json
VERBOSE: Performing the operation "Set" on target "/path/testfile.json".
ICloudBlob : Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob
BlobType : BlockBlob
Length : 0
ContentType : application/octet-stream
LastModified : 23/03/2017 14:20:53 +00:00
SnapshotTime :
ContinuationToken :
Context : Microsoft.WindowsAzure.Commands.Common.Storage.AzureStorageContext
Name : /path/testfile.json
VERBOSE: Transfer Summary
--------------------------------
Total: 1.
Successful: 1.
Failed: 0.
I admit this isn't perfect, but with a bit of effort you could extend it to cater for more complex file-upload scenarios. It should do the trick if you want to upload the contents of a directory to a target storage container.
Azure Storage containers are essentially directories that can't contain other storage containers. So, in order to implement a folder structure in an Azure Storage container, you need to prefix the file name with the path up to the root folder you started searching from. I.e. if your root is:
D:\Files
And Files contains:
Folder 1
-> File 1.txt
File 2.txt
You need to set the target paths to be (blob URIs use forward slashes):
$StorageContainerObject.CloudBlobContainer.Uri.AbsoluteUri + "/Files/Folder 1/File 1.txt"
$StorageContainerObject.CloudBlobContainer.Uri.AbsoluteUri + "/Files/File 2.txt"
If you view your storage container using a tool like Azure Storage Explorer then it recognises the file structures even though the files are all stored within one container.
Azure Blob Storage has only one level of folders: containers.
Instead, you may use one of the options below:
Use Azure File Services:
https://learn.microsoft.com/en-us/azure/storage/storage-dotnet-how-to-use-files
Use the workaround described here: https://www.codeproject.com/articles/597939/modelingplusaplusdirectoryplusstructureplusonplusa

Get Azure storage blob url after uploading powershell

How can I get the URL of the blob I just uploaded using PowerShell? My code currently is:
$storagekey = Get-AzureStorageKey -StorageAccountName appstorageacc
$ctx = New-AzureStorageContext -StorageAccountName appstorageacc -StorageAccountKey $storagekey.Primary
Set-AzureStorageBlobContent -File C:\Package\StarterSite.zip -Container clouddata -Context $ctx -Force
The blob is uploaded successfully, but how can I get its URL?
Retrieve the blob information using the Get-AzureStorageBlob cmdlet and select the AbsoluteUri:
(Get-AzureStorageBlob -Blob 'StarterSite.zip' -Container 'clouddata' -Context $ctx).ICloudBlob.Uri.AbsoluteUri
Another option: just capture the result of Set-AzureStorageBlobContent:
$result = Set-AzureStorageBlobContent -File C:\Package\StarterSite.zip -Container clouddata -Context $ctx -Force
$blobUri = $result.ICloudBlob.Uri.AbsoluteUri
It's not necessary to call Get-AzureStorageBlob after calling Set-AzureStorageBlobContent.
Set-AzureStorageBlobContent returns the same object type that Get-AzureStorageBlob returns.
More details
The output type is AzureStorageBlob for both Get-AzureStorageBlob and Set-AzureStorageBlobContent
AzureStorageBlob has an ICloudBlob property
The AzureStorageBlob.ICloudBlob getter returns a CloudBlob, which has the Uri property