Upload files and folders into Azure Blob Storage - PowerShell

I have created a storage account in Azure and created a container. I am trying to upload files stored on my server; the files are spread across 800 folders.
I have tried doing this with the PowerShell script below, however it does not work with the subfolders.
$LocalFolder = "\\Server\Data"
Add-AzureAccount
# Set a default Azure subscription.
Select-AzureSubscription -SubscriptionName 'nameofsubscription' -Default
$Key = Get-AzureStorageKey -StorageAccountName mydatastorename
$Context = New-AzureStorageContext -StorageAccountKey $Key.Primary -StorageAccountName mydatastorename
foreach ($folder in Get-ChildItem $LocalFolder)
{
    ls -Recurse -Path $LocalFolder | Set-AzureStorageBlobContent -Container nameofcontainer -Context $Context -BlobType Block
}
If I set $LocalFolder to "\\Server\Data\subfolders001", the files in subfolders001 get uploaded to the container. But when I keep it as "\\Server\Data", it does not work.
I want the script to upload all the subfolders and the files within them into the storage container.
I have added the output I get when I run it.
I don't get any error message, just one warning message per subfolder:
WARNING: Can not upload the directory '\\Server\Data\subfolders001' to azure. If you want to upload directory, please use "ls -File -Recurse | Set-AzureStorageBlobContent -Container containerName".
Then nothing happens; after waiting for a while I have to stop the PowerShell script.
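For what it's worth, the warning itself points at the fix: enumerate files only, not directories, before piping. A minimal sketch of that suggestion, reusing the variables from the script above; note that the folder structure may not be preserved this way, since blobs uploaded from the pipeline can end up named after the file alone (the answer below handles that explicitly):
# Upload files only; directories themselves cannot be piped to Set-AzureStorageBlobContent.
Get-ChildItem -File -Recurse -Path $LocalFolder |
    Set-AzureStorageBlobContent -Container nameofcontainer -Context $Context -BlobType Block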

I was just solving this myself; see my solution below:
$StorageAccountName = "storageAccountName"
$StorageAccountKey = "StorageAccountKey"
$ContainerName = "ContainerName"
$sourceFileRootDirectory = "AbsolutePathToStartingDirectory" # e.g. D:\Docs

function Upload-FileToAzureStorageContainer {
    [cmdletbinding()]
    param(
        $StorageAccountName,
        $StorageAccountKey,
        $ContainerName,
        $sourceFileRootDirectory,
        $Force
    )

    $ctx = New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey
    $container = Get-AzureStorageContainer -Name $ContainerName -Context $ctx
    $container.CloudBlobContainer.Uri.AbsoluteUri

    if ($container) {
        $filesToUpload = Get-ChildItem $sourceFileRootDirectory -Recurse -File
        foreach ($x in $filesToUpload) {
            # Build the blob name from the file path relative to the root, using "/" as the separator.
            $targetPath = ($x.fullname.Substring($sourceFileRootDirectory.Length + 1)).Replace("\", "/")

            Write-Verbose "Uploading $("\" + $x.fullname.Substring($sourceFileRootDirectory.Length + 1)) to $($container.CloudBlobContainer.Uri.AbsoluteUri + "/" + $targetPath)"
            Set-AzureStorageBlobContent -File $x.fullname -Container $container.Name -Blob $targetPath -Context $ctx -Force:$Force | Out-Null
        }
    }
}
Running:
Upload-FileToAzureStorageContainer -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey -ContainerName $ContainerName -sourceFileRootDirectory $sourceFileRootDirectory -Verbose
You should see that it prints the following output as it runs:
VERBOSE: Uploading testfile.json to https://storagecontainername.blob.core.windows.net/path/testfile.json
VERBOSE: Performing the operation "Set" on target "/path/testfile.json".
ICloudBlob : Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob
BlobType : BlockBlob
Length : 0
ContentType : application/octet-stream
LastModified : 23/03/2017 14:20:53 +00:00
SnapshotTime :
ContinuationToken :
Context : Microsoft.WindowsAzure.Commands.Common.Storage.AzureStorageContext
Name : /path/testfile.json
VERBOSE: Transfer Summary
--------------------------------
Total: 1.
Successful: 1.
Failed: 0.
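One usage note: Set-AzureStorageBlobContent prompts before overwriting a blob that already exists, so re-running the function against the same container will stop and ask for confirmation. You can pass the function's -Force parameter through to overwrite silently (same placeholder values as above):
Upload-FileToAzureStorageContainer -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey -ContainerName $ContainerName -sourceFileRootDirectory $sourceFileRootDirectory -Force $true -Verbose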
I admit this isn't perfect, but with a bit of effort you could extend it to cater for more complex file-upload scenarios. It should do the trick if you want to upload the contents of a directory to a target storage container.
Azure storage containers are essentially directories that can't contain other storage containers. So in order to implement a folder structure in an Azure storage container, you need to prefix each file name with its path relative to the root folder you started searching from. I.e. if your root is:
D:\Files
And Files contains:
Folder 1
-> File 1.txt
File 2.txt
You need to set the target paths to be
$StorageContainerObject.CloudBlobContainer.Uri.AbsoluteUri + "/Folder 1/File 1.txt"
$StorageContainerObject.CloudBlobContainer.Uri.AbsoluteUri + "/File 2.txt"
If you view your storage container with a tool like Azure Storage Explorer, it recognises the file structure even though the files are all stored within one container.
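In other words, the "folders" are just prefixes in the blob names. A minimal illustration of the same idea (container name and local path are placeholders):
# A "/" in the blob name is all it takes to create a virtual folder.
Set-AzureStorageBlobContent -File "D:\Files\Folder 1\File 1.txt" `
    -Container "mycontainer" -Blob "Folder 1/File 1.txt" -Context $ctx
# Storage Explorer will then display the blob inside a "Folder 1" folder.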
Thanks,
Jamie

Azure Blob Storage has only one level of folders: containers.
Instead, you may use one of the options below.
Use Azure File services:
https://learn.microsoft.com/en-us/azure/storage/storage-dotnet-how-to-use-files
Or use the workaround described here: https://www.codeproject.com/articles/597939/modeling+a+directory+structure+on+a
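If you need true nested directories rather than blob-name prefixes, the Azure Files route looks roughly like this; a sketch using the classic Azure.Storage cmdlets used elsewhere in this thread, with placeholder share and file names:
$ctx = New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey
# File shares support real nested directories, unlike blob containers.
New-AzureStorageShare -Name "myshare" -Context $ctx
New-AzureStorageDirectory -ShareName "myshare" -Path "Folder 1" -Context $ctx
Set-AzureStorageFileContent -ShareName "myshare" -Source "D:\Files\Folder 1\File 1.txt" -Path "Folder 1/File 1.txt" -Context $ctx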

Related

The term 'AzCopy' is not recognized as a name of a cmdlet, function, script file

I need to copy tables from table storage into a different storage account. When attempting to execute AzCopy I'm getting the following exception:
The term 'AzCopy' is not recognized as a name of a cmdlet, function,
script file, or executable program. Check the spelling of the name, or
if a path was included, verify that the path is correct and try again.
I'm connected to the terminal from the portal and have a PowerShell prompt.
The issue seems to be with this line:
AzCopy /Source:$SrcTableUrl `
/Dest:$DstBlobUrl/$TableName `
/SourceKey:$SrcAccessKey `
/Destkey:$DstAccessKey
How do we run AzCopy command in the terminal in the Azure portal?
Here's the full PowerShell script that I'm attempting to execute:
# This simple PowerShell script will copy one or more Azure storage tables from one location to another
#
# Dependencies :
# https://learn.microsoft.com/en-us/azure/storage/common/storage-use-azcopy
# https://learn.microsoft.com/en-us/powershell/azure/overview?view=azps-1.6.0
#
# Usage :
# Copy-AzureStorageTable -SrcStorageName "" -SrcAccessKey "" -DstStorageName "" -DstAccessKey "" -IncludeTable All
# Copy-AzureStorageTable -SrcStorageName "" -SrcAccessKey "" -DstStorageName "" -DstAccessKey "" -IncludeTable Table1,Table2,Table3
function Copy-AzureStorageTable
{
    param
    (
        [parameter(Mandatory=$true)]
        [String]
        $SrcStorageName,

        [parameter(Mandatory=$true)]
        [String]
        $SrcAccessKey,

        [parameter(Mandatory=$true)]
        [String]
        $DstStorageName,

        [parameter(Mandatory=$true)]
        [String]
        $DstAccessKey,

        [parameter(Mandatory=$true)]
        [String[]]
        $IncludeTable
    )
    # Check if logged in
    Azure-Login

    # Source account storage parameters
    $SrcContext = New-AzureStorageContext -StorageAccountName $SrcStorageName -StorageAccountKey $SrcAccessKey
    $SrcBaseUrl = "https://" + $SrcStorageName + ".table.core.windows.net/"

    # Destination account storage parameters
    $DstContext = New-AzureStorageContext -StorageAccountName $DstStorageName -StorageAccountKey $DstAccessKey
    $DstTempContainer = "temptable"
    $DstBlobUrl = "https://" + $DstStorageName + ".blob.core.windows.net/$DstTempContainer"
    $DstTableUrl = "https://" + $DstStorageName + ".table.core.windows.net"

    # Create the temp container in the destination blob storage
    Write-Host "$DstTempContainer does not exist in $DstStorageName..."
    Write-Host "Creating container $DstTempContainer in $DstStorageName..."
    New-AzureStorageContainer -Name $DstTempContainer -Permission Off -Context $DstContext
    # Get all tables from source
    $SrcTables = Get-AzureStorageTable -Name "*" -Context $SrcContext
    foreach($table in $SrcTables)
    {
        $TableName = $table.Name
        Write-Host "Table $TableName"

        # Skip this table unless we are copying all tables or it is in our include list
        if(!$IncludeTable.Contains("All") -and !$IncludeTable.Contains($TableName))
        {
            Write-Host "Skipping table $TableName"
            continue
        }

        Write-Host "Migrating Table $TableName"
        $SrcTableUrl = $SrcBaseUrl + $TableName

        # Copy the table from source to the destination blob. As far as I know there is no way
        # to copy table to table directly, so we copy the table temporarily into a destination blob.
        # Take note to put the actual path of AzCopy.exe
        Write-Host "Start exporting table $TableName..."
        Write-Host "From : $SrcTableUrl"
        Write-Host "To : $DstBlobUrl/$TableName"
        AzCopy /Source:$SrcTableUrl `
            /Dest:$DstBlobUrl/$TableName `
            /SourceKey:$SrcAccessKey `
            /DestKey:$DstAccessKey
        # Get the newly created blobs
        Write-Host "Get all blobs in $DstTempContainer..."
        $CurrentBlob = Get-AzureStorageBlob -Container $DstTempContainer -Prefix $TableName -Context $DstContext

        # Loop and check for the manifest, then import the blob into the table
        foreach($blob in $CurrentBlob)
        {
            if(!$blob.Name.contains('.manifest'))
            {
                continue
            }

            $manifest = $($blob.Name).split('/')[1]

            Write-Host "Start importing $TableName..."
            Write-Host "Source blob url : $DstBlobUrl/$TableName"
            Write-Host "Dest table url : $DstTableUrl/$TableName"
            Write-Host "Manifest name : $manifest"

            # Import blob to table. Insert entity if missing and update entity if it exists.
            AzCopy /Source:$DstBlobUrl/$TableName `
                /Dest:$DstTableUrl/$TableName `
                /SourceKey:$DstAccessKey `
                /DestKey:$DstAccessKey `
                /Manifest:$manifest `
                /EntityOperation:"InsertOrReplace"
        }
    }

    # Delete the temp container from destination blob storage after the export/import process
    Write-Host "Removing $DstTempContainer from destination blob storage..."
    Remove-AzureStorageContainer -Name $DstTempContainer -Context $DstContext -Force
}
# Login
function Azure-Login
{
    $needLogin = $true
    Try
    {
        $content = Get-AzureRmContext
        if ($content)
        {
            $needLogin = ([string]::IsNullOrEmpty($content.Account))
        }
    }
    Catch
    {
        if ($_ -like "*Login-AzureRmAccount to login*")
        {
            $needLogin = $true
        }
        else
        {
            throw
        }
    }

    if ($needLogin)
    {
        Login-AzureRmAccount
    }
}
My solution was just running this command in PowerShell:
Install-Module -Name Az -Scope CurrentUser -Repository PSGallery -Force
Before running it, AzCopy was not recognized; afterwards azcopy works and the script runs.
Hope it helps.
Azure Portal Cloud Shell shows AzCopy version 10.6.1 as of 2021.11.10. The ability to copy between tables was removed after version 7.3.
You need to run the script from a machine where you can download the older version of AzCopy.
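On such a machine, the script's "actual path of AzCopy.exe" comment comes down to invoking the 7.x executable explicitly. A minimal sketch, assuming AzCopy 7.x sits at its default install path (verify the path on your machine):
# AzCopy 7.x default install path (assumed; adjust to your install).
$AzCopyPath = "C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy\AzCopy.exe"
& $AzCopyPath /Source:$SrcTableUrl `
    /Dest:$DstBlobUrl/$TableName `
    /SourceKey:$SrcAccessKey `
    /DestKey:$DstAccessKey
Note that the azcopy preinstalled in Cloud Shell is v10, which uses the azcopy copy <source> <destination> syntax rather than the /Source:/Dest: flags above, and no longer supports table storage endpoints.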

How to rename a blob file using PowerShell

Seemingly simple task: I just want to rename a blob file. I know I have to copy it to a new name and then delete the original, but this is proving tricky. I have created the storage context (New-AzureStorageContext), got the blob (Get-AzureStorageBlob), and found Start-AzureStorageBlobCopy, but how do I actually rename it?
I'd like to do this within the same container if possible as well. Ideally I'd run it in an Azure Runbook and call it using a webhook in Azure Data Factory v2. I did try to rename the file using 'Add Dynamic Content' in the copy job sink in ADF v2, but I don't think you can. By the way, I just want to append the date to the existing file name. Thank you.
You can use my Rename-AzureStorageBlob convenience function:
function Rename-AzureStorageBlob
{
    [CmdletBinding()]
    Param
    (
        [Parameter(Mandatory=$true, ValueFromPipeline=$true, Position=0)]
        [Microsoft.WindowsAzure.Commands.Common.Storage.ResourceModel.AzureStorageBlob]$Blob,

        [Parameter(Mandatory=$true, Position=1)]
        [string]$NewName
    )

    Process {
        # Copy the blob to the new name within the same container...
        $blobCopyAction = Start-AzureStorageBlobCopy `
            -ICloudBlob $Blob.ICloudBlob `
            -DestBlob $NewName `
            -Context $Blob.Context `
            -DestContainer $Blob.ICloudBlob.Container.Name

        # ...wait for the copy to complete...
        $status = $blobCopyAction | Get-AzureStorageBlobCopyState
        while ($status.Status -ne 'Success')
        {
            $status = $blobCopyAction | Get-AzureStorageBlobCopyState
            Start-Sleep -Milliseconds 50
        }

        # ...then remove the original.
        $Blob | Remove-AzureStorageBlob -Force
    }
}
It accepts the blob as pipeline input, so you can pipe the result of Get-AzureStorageBlob to it and just provide a new name:
$connectionString= 'DefaultEndpointsProtocol=https;AccountName....'
$storageContext = New-AzureStorageContext -ConnectionString $connectionString
Get-AzureStorageBlob -Container 'MyContainer' -Context $storageContext -Blob 'myBlob.txt'|
Rename-AzureStorageBlob -NewName 'MyNewBlob.txt'
To append the date to the existing file name you can use something like:
Get-AzureStorageBlob -Container 'MyContainer' -Context $storageContext -Blob 'myBlob.txt' | ForEach-Object {
$_ | Rename-AzureStorageBlob -NewName "$($_.Name)$(Get-Date -f "FileDateTime")" }
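Note that this appends the timestamp after the extension (myBlob.txt20170323...). If you'd rather keep the extension at the end, a small variation using the .NET path helpers works for blobs without a virtual folder prefix:
Get-AzureStorageBlob -Container 'MyContainer' -Context $storageContext -Blob 'myBlob.txt' | ForEach-Object {
    # Split the name so the timestamp lands before the extension.
    $base = [System.IO.Path]::GetFileNameWithoutExtension($_.Name)
    $ext  = [System.IO.Path]::GetExtension($_.Name)
    $_ | Rename-AzureStorageBlob -NewName "$base$(Get-Date -Format FileDateTime)$ext" }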
Further reading: Rename Azure Storage Blob using PowerShell

Issues using Get-AzureStorageBlobContent to download blob to local directory

I am trying to use Get-AzureStorageBlobContent to download a blob to a local directory. However, when I specify the container directory of the blob, I get an error. My code looks something like this:
$ctx = New-AzureStorageContext $StorageAccountName -StorageAccountKey $StorageAccountKey
$BlobName = "blob123.vhd"
$LocalTargetDirectory = "D:\vhds"
$ContainerName = "transfer/2018/vhds"
Get-AzureStorageBlobContent -Blob $BlobName -Container $ContainerName -Destination $LocalTargetDirectory -Context $ctx
It complains that the container name I specify is not valid:
Container name 'transfer/2018/vhds' is invalid. Valid names start and end with a lower case letter or a number and has in between a lower case letter, number or dash with no consecutive
My question then is: if the blob I am trying to copy is in a folder within a storage account, how do I correctly give its location?
Any help would be appreciated!
Try something like the following:
$ctx = New-AzureStorageContext $StorageAccountName -StorageAccountKey $StorageAccountKey
$BlobName = "2018/vhds/blob123.vhd"
$LocalTargetDirectory = "D:\vhds"
$ContainerName = "transfer"
Get-AzureStorageBlobContent -Blob $BlobName -Container $ContainerName -Destination $LocalTargetDirectory -Context $ctx
I am assuming that your blob container name is transfer and that blob123.vhd is present in the 2018/vhds folder.
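If you're ever unsure where the container name ends and the blob path begins, listing the container with a prefix shows the full blob names (reusing the variables above):
# Blob names include the virtual folder path, e.g. "2018/vhds/blob123.vhd".
Get-AzureStorageBlob -Container "transfer" -Prefix "2018/vhds/" -Context $ctx | Select-Object Name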

Azure storage table entries deletion

I have some existing data in Azure Table storage. When I deploy a CSV file, the latest changes are deployed, but the data that already exists in the table is not overwritten or deleted. For example: I have 3 rows of existing data in Azure storage; when I deploy a CSV file with 5 rows, those 5 rows are deployed, but the old 3 rows are not deleted. The old data should be overwritten, but that isn't happening. Please help me. Subscription details:
$subscriptionName = "Tech Enabled Solutions"
$resourceGroupName = "abc"
$storageAccountName = "defghi"
$location = "North Central US"
$tableName = "TestTable"

# The storage key for the storage account
$storageAccountKey = "12345678990"

# Get a storage context
$ctx = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey

$table = Get-AzureStorageTable -Name $tableName -Context $ctx -ErrorAction Ignore

# If the table exists, start deleting its entities.
if ($table -ne $null)
{
    Get-AzureStorageTableRowAll -table $table | Remove-AzureStorageTableRow -table $table -Context $ctx
}
The Get-AzureStorageTableRowAll command is in the AzureRmStorageTable module, so install it before calling the command.
Add these commands to your script to install that module:
Install-PackageProvider -Name NuGet -Force -Scope CurrentUser
Install-Module -Name AzureRmStorageTable -Force -Verbose -Scope CurrentUser
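With the module installed, the order of operations is to clear the table first and then deploy the CSV, so that only the new rows remain. A rough sketch of that sequence, reusing the variables from the question (here Remove-AzureStorageTableRow is given just the table object, the form the module's examples use):
$ctx = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
$table = Get-AzureStorageTable -Name $tableName -Context $ctx -ErrorAction Ignore
if ($table -ne $null)
{
    # Delete every existing entity before the CSV deployment writes the new rows.
    Get-AzureStorageTableRowAll -table $table | Remove-AzureStorageTableRow -table $table
}
# ...then run the CSV deployment step.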

Get-AzureStorageBlob : Could not get the storage context. Please pass in a storage context or set the current storage context

I am using PowerShell with the Azure cmdlets to try and simply see items in blob storage:
$StorageContext = New-AzureStorageContext -StorageAccountName 'myblobname' -StorageAccountKey '2341231234asdff2352354345=='
$Container = Get-AzureStorageContainer -Name 'mycontainer' -Context $StorageContext
$blobs = Get-AzureStorageBlob -Container $Container
Error:
Get-AzureStorageBlob : Could not get the storage context. Please pass in a storage context or set the current storage context.
I am 100% sure that the credentials are correct (the values above are just random shortened credential data for this post).
Why would I get this error?
Is AzureRM used? The AzureRM version is listed as 3.8.0.
Which versions of which modules would I need for this to work?
You would need to include the storage context here as well:
$blobs = Get-AzureStorageBlob -Container $Container
So your code would be:
$blobs = Get-AzureStorageBlob -Container $Container -Context $StorageContext
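Putting it together with the question's own variables, the working sequence is:
$StorageContext = New-AzureStorageContext -StorageAccountName 'myblobname' -StorageAccountKey '2341231234asdff2352354345=='
# -Container expects the container name, and every storage cmdlet needs -Context
# unless a current storage context has been set.
$blobs = Get-AzureStorageBlob -Container 'mycontainer' -Context $StorageContext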