I am running the PowerShell script below in an Azure WebJob to connect to a storage account and upload a file:
Write-Output "Getting Azure storage context..."
$storageContext = New-AzureStorageContext -StorageAccountName "awemigcitest" -StorageAccountKey "xx+74Z81YJf373p88Emp2jKidMZ8b4/+UTLJT4Rvgqrc8IedRxkg=="
$ProgressPreference="SilentlyContinue"
Set-AzureStorageBlobContent -Blob $azureBlobStorageFileName -File $tempFilename -Container $MigrationReportsContainerName -Context $storageContext -Force
Write-Output "Copied HTML file to Azure blob storage."
But I'm getting the error below. Any idea what's causing this?
New-AzureStorageContext : The Win32 internal error "The handle is invalid" 0x6
[09/30/2017 06:41:20 > 4db5e9: ERR ] occurred while setting character attributes for the console output buffer.
The commands you're using seem OK; they work for me both locally and in an Azure WebJob. Please check whether your script works locally. Also, if possible, try creating a new WebJob to run your script and check whether it works there.
Write-Output "Getting Azure storage context..."
$storageContext = New-AzureStorageContext -StorageAccountName "{account_name}" -StorageAccountKey "{account_key}"
$ProgressPreference="SilentlyContinue"
Set-AzureStorageBlobContent -Blob 'source.txt' -File 'D:\home\data\jobs\continuous\FileIn.txt' -Container 'mycontainer' -Context $storageContext -Force
Write-Output "Copied HTML file to Azure blob storage."
WebJob Logs
Note: I use the Kudu console to access the site folder and create FileIn.txt.
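If a fresh WebJob still shows the same error, one more thing worth trying (my assumption, not something the logs confirm): the error text mentions the console output buffer, which is what the progress bar renders to, and the original script only sets $ProgressPreference after New-AzureStorageContext has already run. Setting it first means no storage cmdlet ever touches the console buffer:
# Suppress progress rendering before any storage cmdlet runs; a WebJob has no interactive console
$ProgressPreference = "SilentlyContinue"
$storageContext = New-AzureStorageContext -StorageAccountName "{account_name}" -StorageAccountKey "{account_key}"
Set-AzureStorageBlobContent -Blob 'source.txt' -File 'D:\home\data\jobs\continuous\FileIn.txt' -Container 'mycontainer' -Context $storageContext -Force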
Hi, I'm trying to create a file and upload it to a storage account within Azure. I can create the file and send it to the storage account if I hard-code the values, such as the storage account name, access key and container; however, I'd like to save them in Azure Key Vault and reference them from there.
I have an ARM template running which holds my admin username and password, and it works perfectly using "$(adminUser)" and "$(adminPass)".
The code for the storage account is below:
$StorageAccountName = "$env:storageaccountname"
$StorageAccountKey = "$env:secretstoragekey"
$ctx = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey
$ContainerName = "$env:vhdcontainer"
Set-AzStorageBlobContent -File ./$Filename -Container $ContainerName -Blob $Filename -Context $ctx -Force
I have tried using $(StorageAccountName), $(secretstoragekey) and $(vhdContainer), but no joy. Can anyone see where I'm going wrong and point me in the right direction, please?
Thanks in advance :)
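For reference, pulling the values straight from Key Vault inside the script would look something like this (a sketch, not a confirmed fix: the vault and secret names below are placeholders, it requires the Az.KeyVault module, and the identity running the script needs "get" permission on secrets):
# Read each value from Key Vault at runtime (placeholder vault/secret names)
$StorageAccountName = Get-AzKeyVaultSecret -VaultName "my-vault" -Name "storageaccountname" -AsPlainText
$StorageAccountKey  = Get-AzKeyVaultSecret -VaultName "my-vault" -Name "secretstoragekey" -AsPlainText
$ContainerName      = Get-AzKeyVaultSecret -VaultName "my-vault" -Name "vhdcontainer" -AsPlainText
$ctx = New-AzStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey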
I currently have a bash script that runs some az commands to clean a storage account container and upload the contents of a directory to it:
az storage blob delete-batch --account-name $ACCOUNT_NAME --source $web
az storage blob upload-batch --account-name $ACCOUNT_NAME -s $SOURCE_PATH -d $web
I would like to reuse that functionality inside an Azure PowerShell task that runs on Azure DevOps Services, because I have a lot of other stuff going on in that script besides the storage cleaning and upload.
What's the best way to migrate this? I've been looking in the Azure PowerShell module documentation, but I can't find proper equivalents to blob delete-batch and blob upload-batch.
I also thought of calling the az command directly, but for that I would have to log in, so I would need a way to pass the service principal details from the Azure PowerShell task into the az login command before executing those lines.
Any ideas are welcome. thanks in advance
Use Azure PowerShell to log in to Azure with a service principal.
You can use the following script:
$appId = "your application ID"
$password = "your application secret"
$secpasswd = ConvertTo-SecureString $password -AsPlainText -Force
$mycreds = New-Object System.Management.Automation.PSCredential ($appId, $secpasswd)
Add-AzureRmAccount -Credential $mycreds -Tenant "your tenant ID" -ServicePrincipal
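The Az-module equivalent, if you are on the newer cmdlets (same placeholder values as above):
# Az-module login with the same service principal credential
Connect-AzAccount -Credential $mycreds -Tenant "your tenant ID" -ServicePrincipal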
I can use the Azure CLI command "az storage blob upload-batch" to upload a local directory to Azure Storage. How can I implement it with Azure PowerShell?
Azure PowerShell does not provide a command like that. It only provides Set-AzureStorageBlobContent, which uploads a single file to Azure Storage, so you need to write a script around that command to upload a whole directory. For example:
$StorageAccountKey=" "
$sourceFileRootDirectory=" "
$StorageAccountName=" "
$ContainerName=" "
$ctx = New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey
$container = Get-AzureStorageContainer -Name $ContainerName -Context $ctx
if ($container) {
    # Upload every file under the root directory, preserving the folder structure in the blob names
    $filesToUpload = Get-ChildItem $sourceFileRootDirectory -Recurse -File
    foreach ($x in $filesToUpload) {
        $blobName = ($x.FullName.Substring($sourceFileRootDirectory.Length + 1)).Replace("\", "/")
        Set-AzureStorageBlobContent -File $x.FullName -Container $container.Name -Blob $blobName -Context $ctx -Force
    }
}
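To make the blob naming concrete (the paths here are hypothetical): the Substring/Replace pair strips the root directory from each file's full path and turns backslashes into forward slashes, so the local folder structure survives as virtual directories:
# Hypothetical example of the mapping performed in the loop above
$sourceFileRootDirectory = "C:\site"
$full = "C:\site\css\main.css"
($full.Substring($sourceFileRootDirectory.Length + 1)).Replace("\", "/")   # -> "css/main.css"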
I can use the Azure CLI command "az storage blob delete-batch" to clean up a container. How can I implement it with Azure PowerShell?
Azure PowerShell does not provide a command that directly deletes all the blobs in a container, so we need to write a script for it. There are two choices.
Delete the container and create a new container with the same name:
$StorageAccountKey=" "
$StorageAccountName=" "
$ContainerName=" "
$context = New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey
Remove-AzureStorageContainer -Name $ContainerName -Context $context
New-AzureStorageContainer -Name $ContainerName -Context $context
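One caveat with this option, from my own experience rather than anything in the question: container deletion is asynchronous, so recreating a container with the same name immediately can fail with a 409 "container is being deleted" error for a short while. A simple retry loop covers that:
# Retry the create until the service has finished deleting the old container
do {
    try {
        New-AzureStorageContainer -Name $ContainerName -Context $context -ErrorAction Stop
        $created = $true
    } catch {
        $created = $false
        Start-Sleep -Seconds 10   # deletion can take tens of seconds to complete
    }
} while (-not $created)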
Get all blobs in the container, then delete them:
$StorageAccountKey=" "
$StorageAccountName=" "
$ContainerName=" "
$Token = $null
$Total = 0
$MaxCount = 5000
$context = New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $StorageAccountKey
do
{
    # Fetch up to $MaxCount blobs per page, resuming from the previous continuation token
    $Blobs = Get-AzureStorageBlob -Container $ContainerName -MaxCount $MaxCount -ContinuationToken $Token -Context $context
    if ($Blobs.Count -le 0) { break }
    $Token = $Blobs[$Blobs.Count - 1].ContinuationToken
    $Total += $Blobs.Count   # running count of blobs removed
    foreach ($blob in $Blobs) {
        Remove-AzureStorageBlob -Blob $blob.Name -Container $ContainerName -Context $context
    }
}
while ($Token -ne $null)
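If paging by hand feels heavy, the same cleanup collapses to a pipeline (a sketch using the same cmdlets; without -MaxCount the enumeration fetches every blob for you):
# Compact alternative: enumerate all blobs and pipe them straight to the delete cmdlet
Get-AzureStorageBlob -Container $ContainerName -Context $context | Remove-AzureStorageBlob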
Why not use the Azure CLI task in Azure DevOps, which uses a service connection for the authentication part? See the documentation here.
I am creating an Azure workflow runbook in which I have to get a file from Azure Files storage and publish it to an Azure web app.
I tried New-PSDrive, but that command is not supported in a runbook (even InlineScript doesn't work). Could anyone help me with the script? In the code below, I need to populate the file path from the Azure file share.
$Conn = Get-AutomationConnection -Name AzureRunAsConnection
Connect-AzureRmAccount -ServicePrincipal -Tenant $Conn.TenantID `
-ApplicationId $Conn.ApplicationID `
-CertificateThumbprint $Conn.CertificateThumbprint
$zipFilePath = ???
Publish-AzureWebsiteProject -Name $siteName -Package $zipFilePath
I searched a lot but couldn't find much information on this.
Are you referring to a file in an Azure Storage account? If so, that is pretty easy to accomplish. Add the following to your runbook, filling in the required information:
$StorageAccountKey = Get-AutomationVariable -Name 'storageKey'
$Context = New-AzureStorageContext -StorageAccountName 'your-storage' `
-StorageAccountKey $StorageAccountKey
Get-AzureStorageFileContent -ShareName 'your-share' -Context $Context `
-path 'your-file' -Destination 'C:\Temp'
$filePath = Join-Path -Path 'C:\Temp' -ChildPath 'your-file'
You also need to create a variable in your Automation account, called "storageKey", containing your storage account's key.
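If you would rather create that variable from PowerShell than from the portal, something like this should do it (resource group and account names are placeholders; it uses the AzureRm Automation cmdlets to match the runbook above):
# Create the encrypted Automation variable that holds the storage key
New-AzureRmAutomationVariable -ResourceGroupName "my-rg" -AutomationAccountName "my-automation" `
    -Name "storageKey" -Value "<storage-account-key>" -Encrypted $true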
Mounting an Azure file share as a drive is not currently supported in Automation cloud jobs, though it will probably be supported in a few months. In the meantime, use the Get-AzureStorageFileContent command from the Azure.Storage module (as shown above) to retrieve the file to a temp folder.
Alternatively, run this job on a Hybrid worker. In this case, make sure all the prerequisites are met in order to mount the share as a network drive.
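On a Hybrid worker the mount itself is plain SMB, so something along these lines should work (a sketch with placeholder names; outbound port 445 must be open from the worker):
# Mount the Azure file share as a drive on the Hybrid Runbook Worker
$secKey = ConvertTo-SecureString -String $StorageAccountKey -AsPlainText -Force
$cred = New-Object System.Management.Automation.PSCredential ("Azure\your-storage", $secKey)
New-PSDrive -Name Z -PSProvider FileSystem -Root "\\your-storage.file.core.windows.net\your-share" -Credential $cred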
I am using Set-AzureStorageBlobContent in a PowerShell script to upload multiple files [.bak] in a loop to blob storage. The script runs successfully when I run it manually in the PowerShell ISE; however, it fails when I run it through a SQL Agent job. I have tried to catch the exception, but it seems that the script crashes. The first file in the loop gets uploaded. I have tried including a delay/wait in the loop, but it doesn't help.
Below is the code snippet. I have tried -ErrorAction [Ignore/Stop], but to no avail.
$context = New-AzureStorageContext -StorageAccountName $storagename -StorageAccountKey $key
# Abridged loop; $bakFiles stands in for the question's file list, and $nm/$fn are derived per file
foreach ($f in $bakFiles) {
    Try {
        Set-AzureStorageBlobContent -BlobType Page -Blob $nm -Container backups -File $fn -Context $context -Force
    }
    Catch {
        Write-Output $_.Exception.Message   # log the exception
    }
}
I need to upload some files to my Azure Storage Emulator using scripts. The same task for a remote Azure Storage account is performed easily with the Azure PowerShell cmdlets; you just call:
Add-Blob -BlobType Block -FilePath $myFilePath -ContainerName $myContainerName
But how can I do the same thing for the local storage emulator?
For those looking for how to do this with Azure SDK (2.1), here is how:
$StorageContext = New-AzureStorageContext -Local
Set-AzureStorageBlobContent -File $SourceFilePath `
    -Container $DestinationContainerName `
    -Blob $DestinationBlobName -Context $StorageContext
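To confirm the upload reached the emulator, listing the container with the same context is enough:
# List the blobs in the emulator container to verify the upload
Get-AzureStorageBlob -Container $DestinationContainerName -Context $StorageContext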
If you want to actually upload to an Azure storage account, change the $StorageContext:
New-AzureStorageContext -StorageAccountName $StorageAccountName `
    -StorageAccountKey $StorageAccountKey
You could use the Azure Command Line Tools, available here:
https://github.com/RobBlackwell/AzureCommandLineTools
They run in the normal command prompt; they're not actually PowerShell cmdlets.
SET AZURE_CONNECTION_STRING=UseDevelopmentStorage=true
PutBlob filename [containername[/blobname]]
Found the solution using the PowerShell cmdlets.
You need to specify the -UseDevelopmentStorage option with the cmdlets:
Get-Container -UseDevelopmentStorage
or
Add-Blob -UseDevelopmentStorage -BlobType Block -FilePath $myFilePath -ContainerName $myContainerName