Download Azure VHD to local machine using PowerShell

How can I download an Azure VHD to my local machine with PowerShell?
I read the documentation, but I can't find a blob URL like "https://XXX.blob.core.windows.net/vhds/XXX.vhd".
Does anybody know how to do this?
Thanks

According to your description, your VM uses a managed disk, not an unmanaged disk, so you will not find a VHD file in a storage account. For more information about managed disks, please refer to this link.
If you want to download the VHD of a managed disk, you should copy it to a storage account first.
# Create a SAS for the managed disk
$sas = Grant-AzureRmDiskAccess -ResourceGroupName shui -DiskName shuitest -DurationInSecond 3600 -Access Read
$destContext = New-AzureStorageContext -StorageAccountName contosostorageav1 -StorageAccountKey 'YourStorageAccountKey'
Start-AzureStorageBlobCopy -AbsoluteUri $sas.AccessSAS -DestContainer 'vhds' -DestContext $destContext -DestBlob 'MyDestinationBlobName.vhd'
Then, you can use Save-AzureVhd or AzCopy to download the VHD to your local machine.
Please refer to this similar question.
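To put the pieces together, here is a minimal sketch (reusing the resource group, disk, and storage account names from the snippet above, which are placeholders) that waits for the server-side copy to finish and then downloads the blob with Save-AzureVhd:

```powershell
# Wait for the asynchronous server-side copy started above to complete
Get-AzureStorageBlobCopyState -Container 'vhds' -Blob 'MyDestinationBlobName.vhd' `
    -Context $destContext -WaitForComplete

# Download the copied VHD to the local machine (no format conversion is performed)
Save-AzureVhd -Source "https://contosostorageav1.blob.core.windows.net/vhds/MyDestinationBlobName.vhd" `
    -LocalFilePath "C:\vhd\MyDisk.vhd" -StorageKey 'YourStorageAccountKey'
```

Start-AzureStorageBlobCopy only schedules the copy, so waiting on the copy state before downloading avoids pulling a partially copied blob.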

You can use the Save-AzureVhd cmdlet to download the VHD file to your local machine.
The Save-AzureVhd cmdlet enables downloading .vhd images from the blob where they are stored to a local file. It has parameters to configure the download process, such as specifying the number of downloader threads to use or overwriting a file that already exists at the specified file path.
Save-AzureVhd does not do any VHD format conversion; the blob is downloaded as-is.
Example PowerShell cmdlets:
Save-AzureVhd -Source "http://YourStorageaccountname.blob.core.windows.net/yourContainerName/win7baseimage.vhd" -LocalFilePath "C:\vhd\Win7Image.vhd"
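If the download is slow or the target file already exists, the thread and overwrite parameters mentioned above can be added; the URL and path here are the same placeholders as in the example:

```powershell
Save-AzureVhd -Source "http://YourStorageaccountname.blob.core.windows.net/yourContainerName/win7baseimage.vhd" `
    -LocalFilePath "C:\vhd\Win7Image.vhd" -NumberOfThreads 16 -OverWrite
```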
To get the exact URL, log in to the Azure Portal, select the storage account and the VHD file, then copy the path.
You can also use Azure Storage Explorer to download the vhd file easily.

https://storage-account-name.blob.core.windows.net/container-name/vhd-name.vhd
You can also find the URL in Azure Portal. Open your storage account, go into the container and click the VHD you want. You will see "URL" in "Blob properties" blade.
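If you prefer to stay in PowerShell, the blob URL can also be read programmatically; this is a sketch using the Az.Storage module, with the account name, key, container, and blob names as placeholders:

```powershell
$ctx  = New-AzStorageContext -StorageAccountName 'storage-account-name' -StorageAccountKey '<StorageAccountKey>'
$blob = Get-AzStorageBlob -Container 'container-name' -Blob 'vhd-name.vhd' -Context $ctx

# Full URL of the blob, e.g. https://storage-account-name.blob.core.windows.net/container-name/vhd-name.vhd
$blob.ICloudBlob.Uri.AbsoluteUri
```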

Try Azure Storage Explorer.
Download it, install it, log in, then browse to your container, select your .vhd, and click Download.

Related

How do we copy an Azure storage account table to another storage account?

I'm attempting to clone my storage tables into a different storage account. What is the best way to do this with PowerShell?
I've attempted this solution; however, at this point I'm getting this exception:
Copy-AzureStorageTable : The term 'Copy-AzureStorageTable' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the
name, or if a path was included, verify that the path is correct and try again.
How do we copy a table into another storage account?
Unfortunately, AzCopy v10 doesn't support Azure Table Storage. To export/import data from/to Azure Table Storage, you need to use AzCopy v7.3 instead.
Note that it doesn't support direct table-to-table copy, so you need to export the source table to local disk or Blob Storage first, then import it into the destination table.
The PowerShell script below downloads all the tables under the source storage account to your local machine and then uploads them to the destination storage account; it is working fine for us.
Here is the PowerShell Script:
Connect-AzAccount
$strgName = '<storageAccountName>'
$stcontext = New-AzStorageContext -StorageAccountName $strgName -StorageAccountKey <StorageAccountKey>
$tablelist = Get-AzStorageTable -Context $stcontext | Select-Object -Property Uri, Name

# Export every table to local disk with AzCopy v7.3
Set-Location "C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy"
foreach ($table in $tablelist) {
    $Sourceuri = $table.Uri
    .\AzCopy /Source:$Sourceuri /Dest:C:\Users\Downloads\azcopy1 /SourceKey:<StorageAccountKey>
}

# Import each exported table into the destination storage account
$localist = Get-ChildItem -Path C:\Users\Downloads\azcopy1\ -Exclude *.json
foreach ($item in $localist) {
    # Derive the destination table name from the manifest file name
    $tbname = $item.Name.Replace('<storageaccountName>_','').Replace('.manifest','').Replace('_','').Replace('.','')
    $manifest = $item.Name
    .\AzCopy /Source:C:\Users\Downloads\azcopy1\ /Dest:https://<DestinationStorageAccount>.table.core.windows.net/$tbname/ /DestKey:<DestinationAccountKey> /Manifest:$manifest /EntityOperation:InsertOrReplace
}

How to get the command line for AzCopy?

I want to send dump files to a storage container, and for the copy to work we need to obtain a SAS key for the container we're copying to.
When you use Azure Storage Explorer, you can copy a file to a container and then copy the command it used to the clipboard, which looks something like this:
$env:AZCOPY_CRED_TYPE = "Anonymous";
./azcopy.exe copy "C:\temp\test.txt" "https://dbbackups.blob.core.windows.net/memorydumps/test.txt?{SAS-TOKEN}" --overwrite=prompt --from-to=LocalBlob --blob-type Detect --follow-symlinks --put-md5 --follow-symlinks --recursive;
$env:AZCOPY_CRED_TYPE = "";
I copied this from Azure Storage Explorer when copying a file called test.txt from C:\temp to the memorydumps container in a storage account.
What I need help with is creating a PowerShell script that generates the above command line, so I can run it on azcopy-less nodes and have the dumps end up in the storage container. Any ideas?
You could use the Azure PowerShell equivalent to upload blobs to your container. The Set-AzStorageBlobContent cmdlet uploads a local file to an Azure Storage blob.
# $containerName is your target container; $ctx is a storage context from New-AzStorageContext
Set-AzStorageBlobContent -File "C:\Temp\test.txt" `
    -Container $containerName `
    -Blob "test.txt" `
    -Context $ctx
Refer to this blog post for a detailed walkthrough: File Transfers to Azure Blob Storage Using Azure PowerShell
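If you do need the AzCopy-style command line itself (for example, to hand it to a machine that only has azcopy.exe), a sketch like the following can generate it. The account name, key, and container here are taken from the question's example; New-AzStorageContainerSASToken mints a short-lived, write-only SAS scoped to the container:

```powershell
$ctx = New-AzStorageContext -StorageAccountName 'dbbackups' -StorageAccountKey '<AccountKey>'

# Create + write SAS valid for 2 hours, scoped to the target container
$sas = New-AzStorageContainerSASToken -Name 'memorydumps' -Permission 'cw' `
    -ExpiryTime (Get-Date).AddHours(2) -Context $ctx

# Build the command line Storage Explorer would have shown
$file = 'C:\temp\test.txt'
$url  = "https://dbbackups.blob.core.windows.net/memorydumps/$(Split-Path $file -Leaf)$sas"
"./azcopy.exe copy `"$file`" `"$url`" --overwrite=prompt --from-to=LocalBlob --put-md5"
```

The emitted string can then be run anywhere azcopy.exe is available, without the account key ever leaving the machine that generated the SAS.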

Function to move/delete files within file share in Azure Storage Explorer?

I'm not proficient in PowerShell yet, so please bear with me if I use incorrect terminology. (And please correct me if I do.)
I have installed the Az and Azure.Storage modules.
I have also connected to my account using Connect-AzAccount (Is this the best way? Since you need to copy the URL and log in via a browser.)
Then I was just trying to view the files, to test the connection, using Get-AzureStorageFile.
This prompts me for a share name; I used the name of the folder under File Shares in Azure Storage Explorer. But this failed, see the failure below:
cmdlet Get-AzureStorageFile at command pipeline position 1 Supply
values for the following parameters: (Type !? for Help.) ShareName:
bss get-azurestoragefile : Could not get the storage context. Please
pass in a storage context or set the current storage context.
Additional information to note, I do not have access to the Account Key, only the SAS Token.
Any help would be appreciated.
Since you use Connect-AzAccount, you should use the Az module cmdlet Get-AzStorageFile instead of Get-AzureStorageFile. Before running the Get-AzStorageFile command, you need to pass a storage context created with New-AzStorageContext to fix the error.
Sample:
$context = New-AzStorageContext -StorageAccountName "<StorageAccountName>" -StorageAccountKey "<StorageAccountKey>"
Get-AzStorageFile -ShareName "<ShareName>" -Path "<ContosoWorkingFolder>" -Context $context
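Since you mentioned you only have the SAS token and not the account key, the context can also be built from the SAS token instead; the account name, token, and share name are placeholders:

```powershell
$context = New-AzStorageContext -StorageAccountName "<StorageAccountName>" -SasToken "<SasToken>"
Get-AzStorageFile -ShareName "<ShareName>" -Context $context
```

The SAS token must include at least read and list permissions on the file share for this listing to succeed.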

How to transfer an HTML file from an Azure VM to a local machine via Azure PowerShell or Azure CLI

I am developing an automated QA script for my organisation. My goal is to execute Pester scripts through the custom script extension feature of Azure VMs. I got Pester executed and the result exported as an NUnit XML file. I would like to fetch the XML back from the VM to my local machine. One way of doing that is by uploading the XML to Blob Storage from the VM, but since that requires establishing an Azure connection in the VM using an SP account, I'd prefer not to use this method.
I would like to know the best way to retrieve the Pester results and get them outside the VM.
Any help is much appreciated. Thanks.
I'd use a shared access signature (SAS) token for that (link). That way your script doesn't really need an SP; it just needs the token, and the token can limit permissions to only uploading files to a specific container (or even a specific blob).
$sascontext = New-AzureStorageContext -StorageAccountName accountname -SasToken '?tokenvalue'
Set-AzureStorageBlobContent -File path -Container name -Context $sascontext -Force
You can create a new token with New-AzureStorageBlobSASToken or New-AzureStorageContainerSASToken.
Your only requirement would be to install the Azure.Storage module beforehand.
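For example, a write-only container token could be minted ahead of time (where the account key is available) and then baked into the VM script; the account name, container name, key, and file path below are placeholders:

```powershell
# Run once, on a machine where the account key is available
$keyctx = New-AzureStorageContext -StorageAccountName 'accountname' -StorageAccountKey '<AccountKey>'
$token  = New-AzureStorageContainerSASToken -Name 'results' -Permission 'w' `
    -ExpiryTime (Get-Date).AddDays(7) -Context $keyctx

# Inside the VM: only the token is needed, no SP login
$sascontext = New-AzureStorageContext -StorageAccountName 'accountname' -SasToken $token
Set-AzureStorageBlobContent -File 'C:\results\pester.xml' -Container 'results' -Context $sascontext -Force
```

Because the token only grants write access to one container, leaking it from the VM exposes far less than an SP credential would.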

(PowerShell) How to give a ZIP permissions that allow users to edit it directly after downloading from Azure Blob

Good morning friends,
I've been writing a script in PowerShell to replace our current manual process of deploying our application to Azure Blob Storage as a ZIP file during the build process in VS. I'm about done, but I've run into this issue:
When the ZIP that I upload to Azure is downloaded by anyone, it cannot be manipulated without extracting the files first. The current process is able to accomplish this and I don't know how (the current process was written in C# and is done through a GUI). The ZIP needs to be editable in place because the current updater is set to manipulate the ZIP without extracting it first.
So the initial question is: how do I set permissions on a ZIP archive that will follow it to Azure Blob Storage and then, when it's downloaded on a client's machine, allow its contents to be manipulated without extraction? (The error itself at this time is that it cannot delete a file in a child folder.)
Currently, to ZIP my folder up, I use this process:
$src = "$TEMPFOL\$testBuildDrop"
$dst = "$TEMPFOL\LobbyGuard.zip"
Add-Type -AssemblyName "System.IO.Compression.FileSystem"
[System.IO.Compression.ZipFile]::CreateFromDirectory($src, $dst)
and then push it to blob with:
Set-AzureStorageBlobContent -Container test -Blob "LobbyGuard.zip" -File "$TEMPFOL\LobbyGuard.zip" -Context $storageCreds -Force
I've tried to set permissions on the folder prior to upload using
$getTEMPFOLACL = Get-Acl $TEMPFOL
$accessRule = New-Object System.Security.AccessControl.FileSystemAccessRule("Everyone", "FullControl", "Allow")
$getTEMPFOLACL.SetAccessRule($accessRule)
Set-Acl -Path $TEMPFOL -AclObject $getTEMPFOLACL   # apply the modified ACL back to the folder
This works on the current local file, but once downloaded, the permissions on the file are set as:
Owner: BUILTIN\Administrators Access: NT AUTHORITY\SYSTEM Allow FullControl
which is exactly the same permissions as on the file downloaded from the current process. I'm not understanding what I'm missing here to make this work.
If necessary I can provide the DL link to our blob to show the current manual processes folder that can be manipulated IN the ZIP vs. My Scripts ZIP that cannot.
Try unblocking the file after downloading, i.e.
Unblock-File C:\path\yourDownloaded.zip
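Windows marks files downloaded from the internet with a Zone.Identifier alternate data stream, which is what causes the "blocked" behaviour and what Unblock-File removes. A sketch of inspecting and clearing the marker on the client (the path is a placeholder):

```powershell
$zip = 'C:\Downloads\LobbyGuard.zip'

# List the file's data streams; a blocked download shows a 'Zone.Identifier' entry
Get-Item $zip -Stream * | Select-Object -ExpandProperty Stream

# Remove the Zone.Identifier stream so the ZIP can be edited in place
Unblock-File -Path $zip
```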