Copy From Azure Blob to AWS S3 Bucket - PowerShell

I'm trying to write a PowerShell script that copies recursively from Azure Blob Storage to an Amazon AWS S3 bucket, but without much success.
Can someone help me, please?
$container_name = @("<ContainerItemName>")
$connection_string = 'DefaultEndpointsProtocol=https;AccountName=<AccountName>;AccountKey=<AccountKey>'
$storage_account = New-AzureStorageContext -ConnectionString $connection_string
$S3Bucket = "<BucketName>"
foreach ($container_item in $container_name)
{
    $blobs = Get-AzureStorageBlob -Container $container_item -Context $storage_account
    foreach ($blob in $blobs)
    {
        $item = Get-AzureStorageBlobContent -Container $container_item -Blob $blob.Name -Context $storage_account
        Write-S3Object -BucketName $S3Bucket -KeyPrefix $container_item\ -Folder $item -Force
    }
}

One option is to use the Azure Storage Data Movement library. You can download the sample code here: https://github.com/Azure/azure-storage-net-data-movement/tree/master/samples/S3ToAzureSample. The following article contains more information on the Data Movement library: https://azure.microsoft.com/en-us/blog/introducing-azure-storage-data-movement-library-preview-2/

I haven't played with the AWS PowerShell cmdlets (so I may be wrong), but taking a quick look at their documentation, I think you would need to save the blob to your local computer first and then specify that path in the -File parameter of your Write-S3Object cmdlet.
Downloading Azure Blob to a directory:
Get-AzureStorageBlobContent -Container containername -Blob blob -Destination C:\test\
Assuming the blob's name is blob.png, uploading it to S3 (note that S3 key prefixes use forward slashes):
Write-S3Object -BucketName $S3Bucket -KeyPrefix "$container_item/" -File "C:\test\blob.png" -Force
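Putting both steps together, the loop from the question could be reworked along these lines (a sketch, using the same classic Azure and AWS Tools for PowerShell cmdlets as the question; C:\temp\blobcopy is a placeholder staging folder):

```powershell
# Sketch: stage each blob on the local disk, then push the file to S3.
$localRoot = "C:\temp\blobcopy"
New-Item -ItemType Directory -Path $localRoot -Force | Out-Null

foreach ($container_item in $container_name)
{
    $blobs = Get-AzureStorageBlob -Container $container_item -Context $storage_account
    foreach ($blob in $blobs)
    {
        # -Destination must be a local folder; the blob is saved under its own name
        Get-AzureStorageBlobContent -Container $container_item -Blob $blob.Name -Context $storage_account -Destination $localRoot -Force
        # Upload the staged file, keeping the container name as the S3 key prefix
        Write-S3Object -BucketName $S3Bucket -File (Join-Path $localRoot $blob.Name) -Key "$container_item/$($blob.Name)" -Force
    }
}
```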

Related

Azure RunBook - Can We pass the key to a file zip the file and push it to a blob (Key refresh)

I am able to create a runbook that generates a key and passes it into a variable using PowerShell.
Can I move the key into a .txt file, zip it, and move it to a blob?
You may use the script below to accomplish this: it sends the value into a text file, zips it, and then uploads it to a storage blob.
$storageAccountName = "xxxxxxxxxxxxxxxxxxxx"
$storageAccountKey = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx=="
$context = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
$KeyValue = "xxxxxxxxxxx"
$KeyValue | Out-File -FilePath ($env:temp+"/KeyFile.txt")
Compress-Archive -Path ($env:temp+"/KeyFile.txt") -DestinationPath ($env:temp+"/KeyFileZip.zip") -CompressionLevel optimal
Set-AzureStorageBlobContent -File ($env:temp+"/KeyFileZip.zip") -Container "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxx" -BlobType "Block" -Context $context -Verbose
Make sure you update the storage account name, storage account key, and container name before using the script.
For an illustration of how I accomplished it, please check the screenshots below.

Powershell script to copy files from server to azure blob container

I want to copy all the files on my server (an Azure VM) to a container (Azure Blob Storage). Is it possible through PowerShell?
I'm new to PowerShell, so please help me out and share any script you have.
First, make sure you have installed the Az PowerShell module on the VM and on the machine where you want to run the command. In my sample, I use my PC to run the command.
Store the script below on the PC; I use C:\Users\joyw\Desktop\script.ps1. The script will upload all the files in the folder C:\Users\xxxx\Desktop\test on your VM; you can change it to the path you want.
script.ps1:
$context = New-AzStorageContext -StorageAccountName "<StorageAccountName>" -StorageAccountKey "<StorageAccountKey>"
$files = (Get-ChildItem "C:\Users\xxxx\Desktop\test" -recurse).FullName
foreach ($file in $files) {
    Set-AzStorageBlobContent -Context $context -File "$file" -Container "<container name>" -Blob "$file"
}
Then run the command:
Connect-AzAccount
Invoke-AzVMRunCommand -ResourceGroupName 'rgname' -Name 'vmname' -CommandId 'RunPowerShellScript' -ScriptPath 'C:\Users\joyw\Desktop\script.ps1'
If you want to log in in a non-interactive way, see this link.
Change the parameters to what you want. After the files are uploaded to the container, they will appear with the original file structure.

adding variable from Powershell into AWS CLI path

I'll get right to it: my Join-Path idea isn't going to work here, but this is what I am trying to do within PowerShell.
$HOSTNAME = $env:COMPUTERNAME
powershell -Command cd B:\backup\; ./cmd.exe /c "aws s3 sync . s3://backups123/$HOSTNAME/ --dryrun"
I am attempting to take a backup of the folder I'm in within PowerShell and send it to an S3 bucket. The issue is that I have to add the computer name into the path, but the variable isn't being passed through.
Does anyone have a workaround?
If you use the AWS Tools for PowerShell, you can utilise the Write-S3Object cmdlet:
$results = Get-ChildItem .\path\to\files -Recurse -Include *
foreach ($path in $results)
{
    Write-Host $path
    $filename = [System.IO.Path]::GetFileName($path)
    Write-S3Object -BucketName my-bucket -File $path -Key subfolder/$env:COMPUTERNAME/$filename -CannedACLName Private -AccessKey accessKey -SecretKey secretKey
}
Credit to https://volkanpaksoy.com/archive/2014/12/04/uploading-files-to-s3-using-windows-powershell/
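For completeness, the aws CLI approach from the question can also work once the command is run directly in the PowerShell session, so PowerShell expands the variable before the CLI sees it (a sketch, assuming the aws CLI is installed and on PATH, and using the bucket name from the question):

```powershell
# PowerShell expands $env:COMPUTERNAME inside double quotes before invoking the CLI,
# so no extra "powershell -Command" wrapper or cmd.exe hop is needed.
Set-Location B:\backup\
aws s3 sync . "s3://backups123/$env:COMPUTERNAME/" --dryrun
```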

Delete local file after upload to Azure

I'm trying to delete local backup files after they have been uploaded to Azure Storage, but I get the following error:
Get-ChildItem : Cannot find path
'C:\Windows\system32\Microsoft.WindowsAzure.Commands.Common.Storage.ResourceModel.AzureStorageBlob'
because it does not exist.
When running the following code:
$BackupDir = 'C:\BackupDir'
$AzureDir = Get-AzureStorageBlob -Context $context -Container $containername -blob $file.name
Get-ChildItem $AzureDir | ForEach-Object
{
    $FileInBackup = $AzureDir + $_.Name
    If (Test-Path $FileInBackup) {Remove-Item $FileInBackup}
}
Why is it looking in C:\Windows\system32?
If I print the variable $AzureDir to the screen, I see all my blobs.
It's probably obvious, but basically what I want to do is check each file in my backup dir and, if it exists in Azure, delete it; if not, continue on to the upload step. I can share the rest of my code if need be.
RESOLUTION UPDATE:
Thanks to @OmegaMan, who pointed me down the right path, I was able to fix my issue. Here is what I'm now using. It's cycling through 4 blobs correctly and using the results correctly:
$BackupDir = 'C:\BackupDir'
$AzureFiles = Get-AzureStorageBlob -Context $context -Container $containername -blob *.name
foreach ($AzureFile in $AzureFiles)
{
    $FileInBackup = $AzureFile.name
    If (Test-Path $BackupDir\$FileInBackup)
    {
        Remove-Item $BackupDir\$FileInBackup
    }
}
You seem to use $AzureDir in one instance to hold all the blob info, which is fine, but then the line $FileInBackup = $AzureDir + $_.Name treats $AzureDir as if it were a literal directory name.
You need to use the actual base directory instead of $AzureDir in those instances.

How to get the Config Blob Uri parameter from restored VHD from Azure Recovery Services Vault in Powershell?

Currently, I am working on a PowerShell script to restore the VHD file from a virtual machine that is being backed up by Azure Recovery Services vault.
That being said, my difficulty is: how do I get the Config Blob Uri parameter after restoring the VHD using PowerShell? Even using Get-AzureRmRecoveryServicesBackupJobDetails -Job $restoreJob, I don't see any option that provides this information.
As you can see in the image below, the Azure Portal shows the Config Blob Uri parameter
Once PowerShell completes the restore, I'd like to retrieve the Config Blob Uri to create a VM based on that specific VHD file; without this information, I have to get it manually.
Is there any possibility to get it directly from PowerShell?
#get restore job detail
$details = Get-AzureRmRecoveryServicesBackupJobDetails -Job $restorejob
#restored disk properties
$properties = $details.properties
$storageAccountName = $properties["Target Storage Account Name"]
$containerName = $properties["Config Blob Container Name"]
$blobName = $properties["Config Blob Name"]
#Set the Azure storage context and restore the JSON configuration file
Set-AzureRmCurrentStorageAccount -Name $storageAccountName -ResourceGroupName $resourceGroupName
$destination_path = "C:\temp\vmconfig.json"
Get-AzureStorageBlobContent -Container $containerName -Blob $blobName -Destination $destination_path
$obj = ((Get-Content -Path $destination_path -Raw -Encoding Unicode)).TrimEnd([char]0x00) | ConvertFrom-Json
This will download the config JSON file to $destination_path, and you can reference that file when building your VM.
More details at: https://learn.microsoft.com/en-us/azure/backup/backup-azure-vms-automation#restore-an-azure-vm
Also, if you know your Storage Account Name, you can retrieve config uri from there:
$storageAccountName = (Get-AzureRmStorageAccount -ResourceGroupName $resourceGroupName).StorageAccountName
Set-AzureRmCurrentStorageAccount -Name $storageAccountName -ResourceGroupName $resourceGroupName
$storageContainerName = (Get-AzureStorageContainer).Name
$configBlob = Get-AzureStorageBlob -Container $storageContainerName | where {$_.Name -match "json"}
$configName = $configBlob.Name
$configURI = "https://$storageAccountname.blob.core.windows.net/$storageContainerName/$configName"
Hope this helps.