I want to send dump files to a storage container, and for the copy to work we need to obtain a SAS key for the container we're copying to.
When you use Azure Storage Explorer you can copy a file to a container and then copy the command it used to the clipboard, which looks something like this:
$env:AZCOPY_CRED_TYPE = "Anonymous";
./azcopy.exe copy "C:\temp\test.txt" "https://dbbackups.blob.core.windows.net/memorydumps/test.txt?{SAS-TOKEN}" --overwrite=prompt --from-to=LocalBlob --blob-type Detect --follow-symlinks --put-md5 --follow-symlinks --recursive;
$env:AZCOPY_CRED_TYPE = "";
I copied this from AZ Storage Explorer when copying a file called test.txt from c:\temp to the memorydumps container in a Storage Account.
What I would need help with is creating a PowerShell script that generates the above command line so I can run it on azcopy-less nodes and have the dumps end up in the storage container. Any ideas?
You could use the Azure PowerShell equivalent to upload blobs to your container. The Set-AzStorageBlobContent cmdlet uploads a local file to an Azure Storage blob:
# $ctx is a storage context created with New-AzStorageContext (account name + key, or a connection string)
$ctx = New-AzStorageContext -StorageAccountName "<storage-account-name>" -StorageAccountKey "<storage-account-key>"
$containerName = "memorydumps"

Set-AzStorageBlobContent -File "C:\Temp\test.txt" `
-Container $containerName `
-Blob "test.txt" `
-Context $ctx
Refer to this blog post for a detailed walkthrough: File Transfers to Azure Blob Storage Using Azure PowerShell
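If you specifically want to generate the azcopy command line with a SAS token, as the question asks, here is a minimal sketch using the Az.Storage cmdlets. The account and container names come from the question; the key placeholder, permissions, and expiry are assumptions to adjust for your environment:
# Sketch only: build a container SAS and the matching azcopy destination URI
$accountName = "dbbackups"
$container = "memorydumps"
$file = "C:\temp\test.txt"

$ctx = New-AzStorageContext -StorageAccountName $accountName -StorageAccountKey "<account-key>"
$sas = (New-AzStorageContainerSASToken -Name $container -Context $ctx -Permission rw -ExpiryTime (Get-Date).AddHours(4)).TrimStart('?')

$blobName = Split-Path $file -Leaf
$destUri = "https://$accountName.blob.core.windows.net/$container/${blobName}?${sas}"

# Emit the azcopy command line so it can be run on a node that has azcopy installed
".\azcopy.exe copy `"$file`" `"$destUri`" --overwrite=prompt --from-to=LocalBlob --blob-type=Detect --put-md5"
Note that the generated command still requires azcopy on whichever machine runs it; the Set-AzStorageBlobContent approach above avoids that dependency entirely.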
I need to delete an ADLS Gen2 subfolder with a specific name using Databricks dbutils, but I am not able to perform wildcard recursion.
e.g.
adlsstorageaccount/container1/folder2/folder3/folderA/abc/*.parquet
adlsstorageaccount/container1/folder2/folder3/folderB/abc/*.parquet
adlsstorageaccount/container1/folder2/folder3/folderC/abc/*.parquet
I need to delete only the subfolder named "abc" and its contents; the rest of the path is dynamic.
We have reproduced the same folders in our environment, and here are the commands that worked:
$FileSystemName = "<Your Container Name>"
$dirname = "folder2/folder3/folderA/abc/"
$ctx = New-AzStorageContext -StorageAccountName '<Your Storage Account>' -StorageAccountKey '<Your Access Key>'

# Deletes the abc directory and everything under it
Remove-AzDataLakeGen2Item -Context $ctx -FileSystem $FileSystemName -Path $dirname
and then confirm the prompt with Y ('yes').
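If the parent folder genuinely varies (folderA, folderB, folderC, ...), a rough sketch along the following lines could enumerate the directories recursively and delete every one named abc; it assumes the same $ctx and container variables as above:
$dirs = Get-AzDataLakeGen2ChildItem -Context $ctx -FileSystem $FileSystemName -Path "folder2/folder3" -Recurse |
    Where-Object { $_.IsDirectory -and $_.Path.TrimEnd('/').Split('/')[-1] -eq 'abc' }

foreach ($dir in $dirs) {
    # -Force removes the directory and its contents without the Y/N confirmation prompt
    Remove-AzDataLakeGen2Item -Context $ctx -FileSystem $FileSystemName -Path $dir.Path -Force
}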
REFERENCES:
https://learn.microsoft.com/en-us/azure/storage/blobs/data-lake-storage-directory-file-acl-powershell#delete-a-directory
You can just use the %sh magic command in Databricks and remove it with a wildcard using standard Linux commands, something like:
%sh
rm -r /dbfs/mnt/adlsstorageaccount/container1/folder2/folder3/*/*abc*
I'm attempting to clone my storage tables into a different storage account. What is the best way to do this with PowerShell?
I've attempted this solution; however, at this point I'm getting this exception:
Copy-AzureStorageTable : The term 'Copy-AzureStorageTable' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the
name, or if a path was included, verify that the path is correct and try again.
How do we copy a table into another storage account?
AzCopy v10 unfortunately doesn't support Azure Table Storage. To export/import data from/to Azure Table Storage, you need to use AzCopy v7.3 instead.
Note that it doesn't support direct table-to-table copy, so you need to export the source table to local disk or Blob Storage first, then import it into the destination table.
We have written the PowerShell script below, which downloads all the tables under the storage account to local disk and then uploads them to the destination storage account; it is working fine for us.
Here is the PowerShell Script:
Connect-AzAccount

$strgName = '<storageAccountName>'
$stcontext = New-AzStorageContext -StorageAccountName $strgName -StorageAccountKey '<StorageAccountKey>'

# Export every table in the source account to local disk with AzCopy v7.3
$tablelist = Get-AzStorageTable -Context $stcontext | Select-Object -Property Uri,Name
foreach ($table in $tablelist) {
    $Sourceuri = $table.Uri
    cd "C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy"
    .\AzCopy /Source:$Sourceuri /Dest:C:\Users\Downloads\azcopy1 /SourceKey:<StorageAccountKey>
}

# Import each exported table (identified by its .manifest file) into the destination account
$localist = Get-ChildItem -Path C:\Users\Downloads\azcopy1\ -Exclude *.json
foreach ($item in $localist) {
    # Derive the destination table name from the manifest file name
    $tbname = $item.Name.Replace('<storageaccountName>_','').Replace('.manifest','').Replace('_','').Replace('.','')
    $manifest = $item.Name
    cd "C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy"
    .\AzCopy /Source:C:\Users\Downloads\azcopy1\ /Dest:https://<DestinationStorageAccount>.table.core.windows.net/$tbname/ /DestKey:<DestinationAccountKey> /Manifest:$manifest /EntityOperation:InsertOrReplace
}
I am copying some files to Azure Blob Storage. I am following the Microsoft docs https://learn.microsoft.com/en-us/powershell/module/azure.storage/new-azurestoragecontainersastoken and https://learn.microsoft.com/en-us/powershell/module/azure.storage/new-azurestoragecontext to create a SAS token. I have the code working, but I can't figure out how to set a custom path in the URL that New-AzureStorageContainerSASToken generates, because I would like to copy to a particular path and not the container root. Is there some sort of flag or something for these cmdlets that will allow me to set this?
# Set AzStorageContext
$destinationContext = New-AzureStorageContext -ConnectionString "DefaultEndpointsProtocol=https;AccountName=xxx;AccountKey=xxx;EndpointSuffix=xxx;"
# Generate SAS URI
$containerSASURI = New-AzureStorageContainerSASToken -Context $destinationContext -ExpiryTime (Get-Date).AddSeconds(3600) -FullUri -Name "xxx" -Permission rw -Protocol HttpsOnly
My issue is that New-AzureStorageContainerSASToken generates a URL that I use as the azcopy destination (azcopy copy "xxx" $containerSASURI), but it copies to the container root and I would like it to copy to a specific directory, e.g. \test\demo.
Assuming you're manually copying the container SAS URL from PowerShell and using it in azcopy, the simplest approach would be to insert the path into the SAS URL.
For example, if your Container SAS URL looks something like:
https://account.blob.core.windows.net/container-name?sv=2020-08-04&se=2021-10-04T18%3A30%3A00Z&sr=c&sp=w&sig={signature}
and you want to upload the files in test/demo folder inside this container, you will change this SAS URL to something like:
https://account.blob.core.windows.net/container-name/test/demo?sv=2020-08-04&se=2021-10-04T18%3A30%3A00Z&sr=c&sp=w&sig={signature}
and use that in azcopy. All the files are then uploaded into the test/demo folder inside the container-name blob container.
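If you would rather handle this in the PowerShell script itself instead of editing the URL by hand, a small sketch (reusing $containerSASURI from the question) could splice the folder path in before the query string:
# Split the full container SAS URI into base URL and query string, then insert the folder path
$uriParts = $containerSASURI -split '\?', 2
$destinationUri = "$($uriParts[0])/test/demo?$($uriParts[1])"

azcopy copy "xxx" $destinationUri --recursive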
I have a container that creates a report.xml file that I wish to use to create a test report in Azure.
I was thinking of doing this during the pipeline, but I am missing how to get this file during the Azure pipeline.
What is the way to download a file from a storage account and use it in a test result during the pipeline?
If I understood correctly, you want to copy a file from blob storage in the pipeline?
For that you can use the Azure CLI task with this command:
az storage blob download
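For example, a rough sketch of the download step inside an Azure CLI task (the account and container names are placeholders):
az storage blob download --account-name <storage-account-name> --container-name <container-name> --name report.xml --file "$(System.DefaultWorkingDirectory)/report.xml" --auth-mode login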
But can you explain what you mean by "use it on a Test Result during the pipeline"?
Try using Azure PowerShell to get the file from Azure Storage blob:
$storageAccount = Get-AzStorageAccount -ResourceGroupName xxx -Name yyy
$ctx = $storageAccount.Context

# download the blob into the pipeline working directory
Get-AzStorageBlobContent -Blob "report.xml" `
-Container $containerName `
-Destination "$(System.DefaultWorkingDirectory)" `
-Context $ctx
According to this question: How to copy a file from Azure Storage fileshare to Release pipeline agent
It seems you were trying to download a file from an Azure file share (not blob), so please refer to my reply in that link.
After some digging, this is the solution.
What I have done is mount a storage volume to the container instance that generates the report.
Afterwards, in the pipeline, I copied the files stored in the storage account to the pipeline agent.
$storageAcct = Get-AzStorageAccount -ResourceGroupName XXX -Name YYY
Get-AzStorageFileContent -Context $storageAcct.context -ShareName "Sharename" -Path "report.xml" -Destination $(System.DefaultWorkingDirectory)
Then it's just a matter of getting those files from the agent when using the test reporter during the release pipeline.
I have an Azure Cloud Service Worker Role which needs a separate Windows Service installed to redirect application tracing to a centralized server. I've placed the installation binaries for this Windows Service in a Storage Account's file storage. My startup task then calls a batch file, which in turn executes a PowerShell script to retrieve the file and install the service.
When Azure deploys a new instance of the role, the script execution fails with the following error:
Cannot find path
'\\{name}.file.core.windows.net\utilities\slab1-1.zip' because it does
not exist
However, when I run the script after connecting through RDP, all is fine. Does anybody know why this might be happening? Here is the script below...
cmdkey /add:$storageAccountName.file.core.windows.net /user:$shareUser /pass:$shareAccessKey
net use * \\$storageAccountName.file.core.windows.net\utilities
mkdir slab
copy \\$storageAccountName.file.core.windows.net\utilities\$package .\slab\$package
I always have problems here and there when using a script to access the mounted Azure file drive. I believe this is more or less because the drive is mounted only for the current user, so it may not always behave the same when called from a script.
I ended up pulling files from Azure Files the hard way, without a network drive.
$source = $storageAccountName
$sourceKey = $shareAccessKey
$sharename = "utilities"
$package = "slab1-1.zip"
$dest = ".\slab\" + $package

# Define the Azure file share context and download the file directly, no mounted drive needed
$ctx = New-AzureStorageContext $source $sourceKey
$share = Get-AzureStorageShare $sharename -Context $ctx
Get-AzureStorageFileContent -Share $share -Destination $dest -Path $package -Confirm:$false
The code example here will get you off to a good start:
https://azure.microsoft.com/en-us/documentation/articles/storage-dotnet-how-to-use-files/
It would be harder to manage if you have a more complex folder structure, but the objects there are CloudFileDirectory and CloudFile; their properties and methods work seamlessly for me in PowerShell 4.0.
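As a very rough PowerShell sketch of that object model (assuming the classic Azure PowerShell module is loaded and reusing the $share from the snippet above), you could walk the share's root directory and download the files it contains:
$rootDir = $share.GetRootDirectoryReference()   # CloudFileDirectory

foreach ($entry in $rootDir.ListFilesAndDirectories()) {
    # Entries are CloudFile or CloudFileDirectory; download files, recurse into directories as needed
    if ($entry -is [Microsoft.WindowsAzure.Storage.File.CloudFile]) {
        $entry.DownloadToFile((Join-Path (Get-Location) $entry.Name), [System.IO.FileMode]::Create)
    }
}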
*The Azure PowerShell module is required for the 'Get-AzureStorageFileContent' cmdlet.