Azure PowerShell: Recovery Services backup restore location

I can start a recovery job using Restore-AzureRmRecoveryServicesBackupItem, which restores the given recovery point (in this case, a VHD) to the storage account provided. Is there a way to derive the location the blob is being written to from the AzureRM API or the PowerShell cmdlets?
Getting the recovery job:
$recoveryJob = Restore-AzureRmRecoveryServicesBackupItem `
-RecoveryPoint $recoveryPoint `
-StorageAccountName $storageAccount.StorageAccountName `
-StorageAccountResourceGroupName $storageAccount.ResourceGroupName
... but $recoveryJob does not have the storage destination. Get-AzureRmRecoveryServicesBackupJobDetails does not provide this information either, and I'm out of ideas.
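For what it's worth, the AzureRM restore examples suggest the destination does surface in the job details once the restore has finished: the Properties dictionary returned by Get-AzureRmRecoveryServicesBackupJobDetails is shown carrying keys like "Target Storage Account Name", "Config Blob Container Name" and "Config Blob Name". A sketch along those lines (the exact key names are the assumption here and may vary by module version):
# Wait for the restore job to finish before asking for details
$recoveryJob = Wait-AzureRmRecoveryServicesBackupJob -Job $recoveryJob -Timeout 43200
$details = Get-AzureRmRecoveryServicesBackupJobDetails -Job $recoveryJob
# Assumed property keys, as seen in AzureRM restore examples; dump
# $details.Properties to confirm what your module version returns
$targetStorageAccountName = $details.Properties["Target Storage Account Name"]
$targetContainerName = $details.Properties["Config Blob Container Name"]
$configBlobName = $details.Properties["Config Blob Name"]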

Related

Publish Test Results on Azure from Azure Storage account

I have a container that creates a report.xml file that I wish to use to create a test report in Azure.
I was thinking of doing this during the pipeline, but I am missing how to get hold of this file during the Azure pipeline.
What is the way to download a file from a storage account and use it for a Test Result during the pipeline?
If I understood correctly, you want to copy a file from blob storage in the pipeline?
For that you can use the Azure CLI task with this command:
az storage blob download
But can you explain what you mean by "use it on a Test Result during the pipeline"?
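For illustration, a fuller invocation might look like this (account name, container, and SAS token are placeholders):
az storage blob download --account-name mystorageaccount --container-name reports --name report.xml --file report.xml --sas-token "<sas-token>"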
Try using Azure PowerShell to get the file from Azure Storage blob:
$storageAccount = Get-AzStorageAccount -ResourceGroupName xxx -Name yyy
$ctx = $storageAccount.Context
# download blob
Get-AzStorageBlobContent -Blob "report.xml" `
-Container $containerName `
-Destination "$(System.DefaultWorkingDirectory)" `
-Context $ctx
According to this question: How to copy a file from Azure Storage fileshare to Release pipeline agent, it seems you were trying to download the file from an Azure file share (not blob storage); please refer to my reply in that link.
After some digging, this is the solution.
What I did was mount a storage volume to the container instance that generates the report.
Afterwards, in the pipeline, I copied the files stored in the storage account to the pipeline agent.
$storageAcct = Get-AzStorageAccount -ResourceGroupName XXX -Name YYY
Get-AzStorageFileContent -Context $storageAcct.Context -ShareName "Sharename" -Path "report.xml" -Destination "$(System.DefaultWorkingDirectory)"
Then it's just a matter of picking those files up from the agent when using the test reporter during the release pipeline.

Function to move/delete files within file share in Azure Storage Explorer?

I'm not proficient in PowerShell yet, so please bear with me if I use incorrect terminology. (And please correct me if I do.)
I have installed the Az and Azure.Storage modules.
I have also connected to my account using Connect-AzAccount. (Is this the best way? Since you need to copy the URL and log in via a browser.)
Then I was just trying to view the files, to test the connection, using Get-AzureStorageFile.
This prompts me for a ShareName; I used the name of the folder under File Shares in Azure Storage Explorer, but this failed (see below):
cmdlet Get-AzureStorageFile at command pipeline position 1
Supply values for the following parameters: (Type !? for Help.)
ShareName: bss
Get-AzureStorageFile : Could not get the storage context. Please pass in a storage context or set the current storage context.
Additional information to note, I do not have access to the Account Key, only the SAS Token.
Any help would be appreciated.
If you use Connect-AzAccount, you should use the Az module cmdlet Get-AzStorageFile rather than Get-AzureStorageFile. Before running Get-AzStorageFile, you need to create a storage context with New-AzStorageContext and pass it in to fix the error.
Sample:
$context = New-AzStorageContext -StorageAccountName "<StorageAccountName>" -StorageAccountKey "<StorageAccountKey>"
Get-AzStorageFile -ShareName "<ShareName>" -Path "<ContosoWorkingFolder>" -Context $context
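Since the question notes that only a SAS token is available (no account key), the context can also be built from the SAS token; a minimal sketch with placeholder values:
# Build the context from a SAS token rather than an account key
$context = New-AzStorageContext -StorageAccountName "<StorageAccountName>" -SasToken "<SasToken>"
Get-AzStorageFile -ShareName "<ShareName>" -Context $context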

Upload on-premises content to SharePoint Online: how to retrieve the logs (using PowerShell)

Based on the following article, I'm uploading file server information to SharePoint Online.
Everything is working except for Step 7: Processing and Monitoring your SPO Migration.
The description there is:
Checking job status: You can check the status of your job by viewing the real-time updates posted in the Azure storage account queue, using the Encryption.EncryptionKey returned in step 6.
Viewing logs: If you're using your own Azure storage account, you can look into the manifest container in the Azure Storage for logs of everything that happened. At this stage, it is now safe to delete those containers if you don't want to keep them as backup in Azure. If there were errors or warnings, .err and .wrn files will be created in the manifest container.
If you're using the temporary Azure storage created by Invoke-SPOMigrationEncryptUploadSubmit in step 6, the import log SAS URL can be obtained by decrypting the Azure queue message with the "Event" value "JobLogFileCreate". With the import log SAS URL, you can download the log file and decrypt it with the same encryption key as returned in Step 6.
I have the EncryptionKey and ReportingQueueUri, but there is no explanation of how to use them. Trying with Azure Storage Explorer, I opened the reporting queue, but everything there is encrypted and there is no option to supply the encryption key.
If anyone has done this or knows how, I'd really appreciate some help.
One has to use two other cmdlets, Get-SPOMigrationJobProgress and Get-SPOMigrationJobStatus:
# Submit the job; the returned object carries the encryption parameters
# and the reporting queue URI needed to read the logs
$job = Invoke-SPOMigrationEncryptUploadSubmit `
-SourceFilesPath $sourceFiles `
-SourcePackagePath $targetPackage `
-Credentials $creds `
-TargetWebUrl $targetWebUrl
$encryption = $job.Encryption
$queueLink = $job.ReportingQueueUri.AbsoluteUri
$jobID = $job.JobId
# Reads and decrypts the progress messages from the reporting queue
Get-SPOMigrationJobProgress -AzureQueueUri $queueLink `
-Credentials $creds `
-TargetWebUrl $targetWebUrl `
-JobIds $jobID `
-EncryptionParameters $encryption
# Reports the overall job status
Get-SPOMigrationJobStatus -TargetWebUrl $targetWebUrl -Credentials $creds -JobId $jobID

Manage Azure Resource Manager storage with PowerShell

I'm facing the following issue:
Switch-AzureMode AzureResourceManager
New-AzureStorageAccount -ResourceGroupName "XYZ" -Name "VmTemplateStorage" -Type "Standard_LRS"
# lists the account
Get-AzureStorageAccount
Set-AzureSubscription -SubscriptionName "ABC" -CurrentStorageAccountName "VmTemplateStorage"
# now this outputs an error saying: Storage account 'VmTemplateStorage' was not found.
Get-AzureStorageContainer
I'm aware that I can create a storage account in "classic" mode, but then I'm unable to use it as a source for VM images deployed using Resource Manager.
This way, however, I'm unable to manage the account and upload blobs using PowerShell.
Any ideas how to manage Resource Manager based storage accounts?
You can use the following new PowerShell commands to manage ARM (Azure Resource Manager) based storage accounts; see the sketch after this list.
Note: you need to update Azure PowerShell to a later version that supports this feature, such as the November 2015 release. Also, the Switch-AzureMode command is deprecated in the latest release.
New-AzureRmStorageAccount
Get-AzureRmStorageAccount
Set-AzureRmStorageAccount
Remove-AzureRmStorageAccount
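A minimal sketch of creating an ARM storage account and then using its context with the regular storage cmdlets (resource group, account name, location, and file path are placeholders; note that storage account names must be lowercase):
# Create the ARM storage account
New-AzureRmStorageAccount -ResourceGroupName "XYZ" -Name "vmtemplatestorage" -Location "West US" -Type "Standard_LRS"
# The account's Context property plugs straight into the storage cmdlets
$account = Get-AzureRmStorageAccount -ResourceGroupName "XYZ" -Name "vmtemplatestorage"
New-AzureStorageContainer -Name "vhds" -Context $account.Context
Set-AzureStorageBlobContent -File "C:\temp\disk0.vhd" -Container "vhds" -Context $account.Context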

Get-AzureStorageBlob throws Can not find your azure storage credential

I have just started using Azure and I am facing issues using the PowerShell cmdlets to work with my storage account.
I have created a storage account and a container in that storage account. Next I installed the Azure PowerShell SDK and cmdlets, and imported the publish settings file. When I run the Get-AzureSubscription or Get-AzureStorageAccount command, it correctly shows my subscription in the PowerShell console along with the various storage endpoints.
However, if I do a Get-AzureStorageBlob call or a Set-AzureStorageBlobContent, I get the following error:
Get-AzureStorageBlob : Can not find your azure storage credential. Please set current storage account using
"Set-AzureSubscription" or set the "AZURE_STORAGE_CONNECTION_STRING" environment variable.
I am literally at wits' end here. A Google search on this error string only brings up references to code on GitHub etc. Would really appreciate some help.
Right, so I finally managed to do this! Here are the overall details on how to use PowerShell to create a blob in Azure and store a file there:
http://www.nikgupta.net/2013/09/azure-blob-storage-powershell/
$context = New-AzureStorageContext -StorageAccountName FunkyStorage -StorageAccountKey {Enter your storage account key here}
Set-AzureStorageBlobContent -Blob "MyFunkyBlob" -Container FunkyContainer -File "c:\temp\1.txt" -Context $context -Force
You may need to set the 'current' subscription to use. For that, you must run Select-AzureSubscription.
If you run Get-AzureSubscription, you'll see all subscriptions in your publish settings; one of them should be set as the default. As you scroll through the result list, you'll see an IsDefault property for each subscription, set to True or False. If the subscription you're using is set to False, run:
Select-AzureSubscription -SubscriptionName mysub
Hopefully that fixes the issue you're running into.
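The error message also suggests setting a current storage account on the subscription; with the classic cmdlets that would look something like this (subscription and account names are placeholders):
# Attach a current storage account to the subscription so the storage
# cmdlets can resolve credentials without an explicit context
Set-AzureSubscription -SubscriptionName mysub -CurrentStorageAccountName FunkyStorage
Get-AzureStorageBlob -Container FunkyContainer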
Just a quick FYI: you can do this another (and faster) way. I built a web language atop Windows PowerShell that heavily integrates with Azure. It's called PowerShell Pipeworks.
You can use 4 cmdlets to interact with the blobs:
Get-Blob
Import-Blob
Export-Blob
Remove-Blob
All of them take -StorageAccount and -StorageKey parameters, and also -StorageAccountSetting and -StorageKeySetting, so you can save credentials to disk (or, for use in a web app, with Add-SecureSetting). Once any blob cmdlet has a storage account, it will continue to reuse it.
Export-Blob is also handy in that you can pipe a directory into it and it will set the right content types, and it provides a -Public switch that marks the container the content is stored in as public.
These cmdlets are a notch older (~3 months) than the Azure ones, run in about three-quarters of the time (I believe a major chunk of the difference is the Azure cmdlets' slower credential lookup), and are worth a try.