Issues using Get-AzureStorageBlobContent to download a blob to a local directory - powershell

I am trying to use Get-AzureStorageBlobContent to download a blob to a local directory. However, when I specify the container directory of the blob, I get an error. My code looks something like this:
$ctx = New-AzureStorageContext $StorageAccountName -StorageAccountKey $StorageAccountKey
$BlobName = "blob123.vhd"
$LocalTargetDirectory = "D:\vhds"
$ContainerName = "transfer/2018/vhds"
Get-AzureStorageBlobContent -Blob $BlobName -Container $ContainerName -Destination $LocalTargetDirectory -Context $ctx
It complains that the container name I specify is not valid:
Container name 'transfer/2018/vhds' is invalid. Valid names start and end with a lower case letter or a number and has in between a lower case letter, number or dash with no consecutive
My question is then, if the blob I am trying to copy is in a folder in a storage account, how do I correctly give its location?
Any help would be appreciated!

Try something like the following:
$ctx = New-AzureStorageContext $StorageAccountName -StorageAccountKey $StorageAccountKey
$BlobName = "2018/vhds/blob123.vhd"
$LocalTargetDirectory = "D:\vhds"
$ContainerName = "transfer"
Get-AzureStorageBlobContent -Blob $BlobName -Container $ContainerName -Destination $LocalTargetDirectory -Context $ctx
I am assuming that your blob container name is transfer and that blob123.vhd is present in the 2018/vhds folder.
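If you later need to pull down everything under that virtual folder rather than a single blob, a minimal sketch (reusing the container, prefix, and context from above) is to list the blobs by prefix and pipe them to the download cmdlet:
# List every blob whose name starts with the 2018/vhds/ prefix and download each one
Get-AzureStorageBlob -Container "transfer" -Prefix "2018/vhds/" -Context $ctx |
    Get-AzureStorageBlobContent -Destination $LocalTargetDirectory -Context $ctx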

Related

Azure storage table entries deletion

I have some existing data in Azure Table Storage. When I deploy a CSV file, the latest changes are deployed, but the data that already exists in the table is not overwritten and the old data is not deleted. For example: I have 3 rows of existing data in the table; when I deploy a CSV file that has 5 rows, the 5 rows are deployed but the old 3 rows are not deleted. It should be overwritten, but that is not happening. Please help me. – Subscription Details:
$subscriptionName = "Tech Enabled Solutions"
$resourceGroupName = "abc"
$storageAccountName = "defghi"
$location = "North Central US"
$tableName = "TestTable"
# Get the storage key for the storage account
$storageAccountKey = "12345678990"
# Get a storage context
$ctx = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
$table = Get-AzureStorageTable -Name $tableName -Context $ctx -ErrorAction Ignore
# If the table exists, start deleting its entities.
if ($table -ne $null)
{
    Get-AzureStorageTableRowAll -table $table | Remove-AzureStorageTableRow -table $table -Context $ctx
}
The Get-AzureStorageTableRowAll cmdlet is in the AzureRmStorageTable module, so install that module before calling the command.
Add this command to your script to install that module:
Install-PackageProvider -Name NuGet -Force -Scope CurrentUser
Install-Module -Name AzureRmStorageTable -Force -Verbose -Scope CurrentUser
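If you also want to reload the new CSV data from PowerShell after the delete, here is a minimal sketch. It assumes a hypothetical CSV at D:\data.csv with PartitionKey, RowKey, and Value columns (all of these are placeholders, not from the question); Add-StorageTableRow also ships with the AzureRmStorageTable module.
# Hypothetical CSV path and column layout - adjust to match your deployment file
Import-Csv -Path "D:\data.csv" | ForEach-Object {
    # Insert one entity per CSV row; extra columns go into the -property hashtable
    Add-StorageTableRow -table $table -partitionKey $_.PartitionKey -rowKey $_.RowKey -property @{ "Value" = $_.Value }
}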

Get-AzureStorageBlob : Could not get the storage context. Please pass in a storage context or set the current storage context

I am using PowerShell with the Azure cmdlets to simply list items in blob storage:
$StorageContext = New-AzureStorageContext -StorageAccountName 'myblobname' -StorageAccountKey '2341231234asdff2352354345=='
$Container = Get-AzureStorageContainer -Name 'mycontainer' -Context $StorageContext
$blobs = Get-AzureStorageBlob -Container $Container
Error:
Get-AzureStorageBlob : Could not get the storage context. Please pass in a storage context or set the current storage context.
I am 100% sure that the credentials are correct (just random shortened credential data in this post)
Why would I get this error?
Is AzureRM used? The AzureRM version is listed as 3.8.0.
Which versions of which scripts would I need for this to work?
You would need to include StorageContext here as well:
$blobs = Get-AzureStorageBlob -Container $Container
So your code would be:
$blobs = Get-AzureStorageBlob -Container $Container -Context $StorageContext
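Putting it together, a minimal corrected sketch using the question's own account and container names would be the following. Note also that -Container expects the container name as a string, so it's safer to pass the name (or $Container.Name) than the container object:
$StorageContext = New-AzureStorageContext -StorageAccountName 'myblobname' -StorageAccountKey '2341231234asdff2352354345=='
# -Container takes the container name, and -Context must be supplied on every storage cmdlet call
$blobs = Get-AzureStorageBlob -Container 'mycontainer' -Context $StorageContext
$blobs | Select-Object Name, Length, LastModified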

SAS token error in Azure

I am generating a SAS token from PowerShell, but when I try to use that token from Azure Storage Explorer, it gives the error "Authentication Error. Signature fields not well formed."
Here is the full PowerShell script:
Parameters required:
$StorageAccountName = 'XXXXXX'
$ResourceGroup = 'remoteaccess'
$ContainerName = "vhds"
PowerShell commands:
$AzStrAct = Get-AzureRmStorageAccount -Name $StorageAccountName -ResourceGroupName $ResourceGroup
$AzStrKey = Get-AzureRmStorageAccountKey -Name $StorageAccountName -ResourceGroupName $ResourceGroup
$AzStrCtx = New-AzureStorageContext $StorageAccountName -StorageAccountKey $AzStrKey[0].Value
Get-AzureStorageContainer -Name $ContainerName -Context $AzStrCtx
$ContainerSASTokenURI = New-AzureStorageContainerSASToken -Name $ContainerName -Permission "rwdl" -StartTime "2017-04-12" -ExpiryTime "2017-04-16" -Context $AzStrCtx -FullUri
Write-Host "The SAS Token of container as below:"
$ContainerSASTokenURI
Output:
https://XXXXXXX.blob.core.windows.net/vhds?sv=2015-04-05&sr=c&sig=XXXXXXXXXXXXXXXXXXXXXXXX&st=2017-04-11T18%3A30%3A00Z&se=2017-04-15T18%3A30%3A00Z&sp=rwdl
I tested with your script and it works for me. My Azure Storage Explorer version is 0.8.12; I suggest you upgrade your Azure Storage Explorer to 0.8.12.
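If you want to rule out the token itself, a minimal sketch for sanity-checking it from PowerShell (reusing the question's variables; the one-day expiry is just an example) is to generate the bare token, build a SAS-based context from it, and list the container:
# Generate the token only (no -FullUri) so it can be fed to New-AzureStorageContext
$sasToken = New-AzureStorageContainerSASToken -Name $ContainerName -Permission rwdl -ExpiryTime (Get-Date).AddDays(1) -Context $AzStrCtx
# Build a context that authenticates with the SAS token instead of the account key
$sasCtx = New-AzureStorageContext -StorageAccountName $StorageAccountName -SasToken $sasToken
# If this lists blobs, the token is well formed and carries list permission
Get-AzureStorageBlob -Container $ContainerName -Context $sasCtx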

Upload files and folders into Azure Blob Storage

I have created a storage account in Azure and created a container. I am trying to upload files stored on my server; the files are spread across 800 folders.
I have tried doing this with the following PowerShell script, however it does not work with the subfolders.
$LocalFolder = "\\Server\Data"
Add-AzureAccount
# Set a default Azure subscription.
Select-AzureSubscription -SubscriptionName 'nameofsubscription' -Default
$Key = Get-AzureStorageKey -StorageAccountName mydatastorename
$Context = New-AzureStorageContext -StorageAccountKey $Key.Primary -StorageAccountName mydatastorename
foreach ($folder in Get-ChildItem $LocalFolder)
{
    ls -Recurse -Path $LocalFolder | Set-AzureStorageBlobContent -Container nameofcontainer -Context $Context -BlobType Block
}
If I set $LocalFolder to "\\Server\Data\subfolders001", the files in subfolder001 get uploaded to the container. But when I keep it as "\\Server\Data", it does not work.
I want the script to upload all the sub folders and files within into the storage container.
I have added the output I get when I run it.
I don't get any error message, just one warning message per subfolder:
WARNING: Can not upload the directory '\\Server\Data\subfolders001' to azure. If you want to upload directory, please use "ls -File -Recurse | Set-AzureStorageBlobContent -Container containerName".
Then nothing happens; after waiting for a while I have to stop the PowerShell script.
Was just solving this myself, see my solution below:
$StorageAccountName = "storageAccountName"
$StorageAccountKey = "StorageAccountKey"
$ContainerName = "ContainerName"
$sourceFileRootDirectory = "AbsolutePathToStartingDirectory" # i.e. D:\Docs
function Upload-FileToAzureStorageContainer {
    [cmdletbinding()]
    param(
        $StorageAccountName,
        $StorageAccountKey,
        $ContainerName,
        $sourceFileRootDirectory,
        $Force
    )

    $ctx = New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey
    $container = Get-AzureStorageContainer -Name $ContainerName -Context $ctx
    $container.CloudBlobContainer.Uri.AbsoluteUri

    if ($container) {
        $filesToUpload = Get-ChildItem $sourceFileRootDirectory -Recurse -File

        foreach ($x in $filesToUpload) {
            $targetPath = ($x.fullname.Substring($sourceFileRootDirectory.Length + 1)).Replace("\", "/")

            Write-Verbose "Uploading $("\" + $x.fullname.Substring($sourceFileRootDirectory.Length + 1)) to $($container.CloudBlobContainer.Uri.AbsoluteUri + "/" + $targetPath)"
            Set-AzureStorageBlobContent -File $x.fullname -Container $container.Name -Blob $targetPath -Context $ctx -Force:$Force | Out-Null
        }
    }
}
Running:
Upload-FileToAzureStorageContainer -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey -ContainerName $ContainerName -sourceFileRootDirectory $sourceFileRootDirectory -Verbose
You should see that it prints the following output as it runs:
VERBOSE: Uploading testfile.json to https://storagecontainername.blob.core.windows.net/path/testfile.json
VERBOSE: Performing the operation "Set" on target "/path/testfile.json".
ICloudBlob : Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob
BlobType : BlockBlob
Length : 0
ContentType : application/octet-stream
LastModified : 23/03/2017 14:20:53 +00:00
SnapshotTime :
ContinuationToken :
Context : Microsoft.WindowsAzure.Commands.Common.Storage.AzureStorageContext
Name : /path/testfile.json
VERBOSE: Transfer Summary
--------------------------------
Total: 1.
Successful: 1.
Failed: 0.
I admit this isn't perfect, but with a bit of effort you could extend it to cater for more complex upload scenarios; it should do the trick if you want to upload the contents of a directory to a target storage container.
Azure Storage Containers are essentially directories that can't contain other storage containers. So in order to implement a folder structure in an Azure Storage Container, you need to prefix the file with the path up to the root folder you started searching from. I.e. if your root is:
D:\Files
And Files contains:
Folder 1
-> File 1.txt
File 2.txt
You need to set the target paths to be
$StorageContainerObject.CloudBlobContainer.Uri.AbsoluteUri + "/Files/Folder 1/File 1.txt"
$StorageContainerObject.CloudBlobContainer.Uri.AbsoluteUri + "/Files/File 2.txt"
(Blob virtual folders use forward slashes, hence the "/" separators above.)
If you view your storage container using a tool like Azure Storage Explorer then it recognises the file structures even though the files are all stored within one container.
Thanks,
Jamie
Azure Blob Storage has only one level of folders - containers.
Instead, you may use one of the options below:
Use Azure File Services (a minimal sketch follows after this list)
https://learn.microsoft.com/en-us/azure/storage/storage-dotnet-how-to-use-files
Use the workaround described here: https://www.codeproject.com/articles/597939/modelingplusaplusdirectoryplusstructureplusonplusa
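For the Azure File Services option, a minimal sketch of creating a real directory and uploading a file into it could look like this (the share name, directory name, and file path below are placeholders; $Context is the storage context from the question):
# Azure Files supports genuine directories, unlike blob containers
New-AzureStorageShare -Name "myshare" -Context $Context
New-AzureStorageDirectory -ShareName "myshare" -Path "Folder1" -Context $Context
Set-AzureStorageFileContent -ShareName "myshare" -Source "D:\Files\Folder 1\File 1.txt" -Path "Folder1/File 1.txt" -Context $Context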

Get Azure storage blob URL after uploading - powershell

How can I get the URL of the blob I just uploaded using PowerShell? My code currently is:
$storagekey = Get-AzureStorageKey -StorageAccountName appstorageacc
$ctx = New-AzureStorageContext -StorageAccountName appstorageacc -StorageAccountKey $storagekey.Primary
Set-AzureStorageBlobContent -File C:\Package\StarterSite.zip -Container clouddata -Context $ctx -Force
The blob is uploaded successfully, but how can I get its URL?
Retrieve blob information using the Get-AzureStorageBlob cmdlet and select the AbsoluteUri:
(Get-AzureStorageBlob -Blob 'StarterSite.zip' -Container 'clouddata' -Context $ctx).ICloudBlob.Uri.AbsoluteUri
Another option: just capture the result of Set-AzureStorageBlobContent:
$result = Set-AzureStorageBlobContent -File C:\Package\StarterSite.zip -Container clouddata -Context $ctx -Force
$blobUri = $result.ICloudBlob.Uri.AbsoluteUri
It's not necessary to call Get-AzureStorageBlob after calling Set-AzureStorageBlobContent.
Set-AzureStorageBlobContent returns the same object type that Get-AzureStorageBlob returns.
More details
The output type is AzureStorageBlob for both Get-AzureStorageBlob and Set-AzureStorageBlobContent
AzureStorageBlob has an ICloudBlob property
The AzureStorageBlob.ICloudBlob getter returns a CloudBlob, which has a Uri property
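Putting the pieces together, a minimal end-to-end sketch (using the account, container, and file names from the question) that uploads the file and prints its URL:
$storagekey = Get-AzureStorageKey -StorageAccountName appstorageacc
$ctx = New-AzureStorageContext -StorageAccountName appstorageacc -StorageAccountKey $storagekey.Primary
# Capture the returned AzureStorageBlob object and read the URL straight off it
$result = Set-AzureStorageBlobContent -File C:\Package\StarterSite.zip -Container clouddata -Context $ctx -Force
Write-Host "Uploaded to $($result.ICloudBlob.Uri.AbsoluteUri)"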