Trying to delete local backup files after they have been uploaded to Azure storage, I get the following error:
Get-ChildItem : Cannot find path
'C:\Windows\system32\Microsoft.WindowsAzure.Commands.Common.Storage.ResourceModel.AzureStorageBlob'
because it does not exist.
When running the following code:
$BackupDir = 'C:\BackupDir'
$AzureDir = Get-AzureStorageBlob -Context $context -Container $containername -Blob $file.name
Get-ChildItem $AzureDir | ForEach-Object
{
    $FileInBackup = $AzureDir + $_.Name
    If (Test-Path $FileInBackup) {Remove-Item $FileInBackup}
}
Why is it looking in C:\Windows*blahblah*?
If I print variable $AzureDir to screen, I see all my blobs.
Basically (it's probably obvious), what I want to do is check each file in my backup directory: if it exists in Azure, delete it locally; if not, continue on to the upload step. I can share the rest of my code if need be.
RESOLUTION UPDATE:
Thanks to @OmegaMan, who pointed me down the right path, I was able to fix my issue. Here is what I'm now using. It's cycling through 4 'blobs' correctly and using the results correctly:
$BackupDir = 'C:\BackupDir'
$AzureFiles = Get-AzureStorageBlob -Context $context -Container $containername -Blob *.name
foreach ($AzureFile in $AzureFiles)
{
    $FileInBackup = $AzureFile.Name
    If (Test-Path $BackupDir\$FileInBackup)
    {
        Remove-Item $BackupDir\$FileInBackup
    }
}
You seem to use $AzureDir in one instance to hold all of the blob info, which is fine, but then the line $FileInBackup = $AzureDir + $_.Name treats $AzureDir as if it were a literal directory name.
It appears you need to use the actual base directory instead of $AzureDir in those instances.
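For example, something along these lines (a minimal sketch that reuses $context, $containername and $BackupDir from your question):
$blobs = Get-AzureStorageBlob -Context $context -Container $containername
foreach ($blob in $blobs)
{
    # Build the local path from the backup directory, not from the blob collection
    $FileInBackup = Join-Path $BackupDir $blob.Name
    if (Test-Path $FileInBackup)
    {
        Remove-Item $FileInBackup
    }
}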
Is there an AzCopy parameter to ensure the destination folder is empty?
If not, how should I check to ensure the folder is empty in a VSTS build pipeline?
What's the destination for your transfer, is it a blob or a local file?
If it's a blob, the following script can check whether the folder is empty:
$ctx=New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $key
$blob = Get-AzureStorageBlob -Container $containerName -Prefix dir1/dir2/ -Context $ctx -MaxCount 1
if ($blob -eq $null)
{
# container/dir1/dir2/ has no blobs, so do the AzCopy transfer
}
If it's a local file, use the following script:
$file = Get-ChildItem c:\dir1\dir2\
if ($file -eq $null)
{
# c:\dir1\dir2\ has no files, so do the AzCopy transfer
}
AFAIK, AzCopy has a feature to copy only the data that doesn't already exist in the destination, so if the goal is simply to avoid overwriting what is already there, it may not be necessary to check whether the destination folder is empty at all.
The /XO and /XN parameters allow you to exclude older or newer source resources from being copied, respectively. If you only want to copy source resources that don't exist in the destination, you can specify both parameters in the AzCopy command, as in the sketch below.
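For illustration, a rough sketch using the classic AzCopy (v8) command-line syntax; the account, container and key are placeholders:
AzCopy /Source:"C:\dir1\dir2" /Dest:"https://<account>.blob.core.windows.net/<container>/dir1/dir2" /DestKey:<key> /S /XO /XN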
I would like to zip a path (with a Windows service running inside).
When the service is stopped, it works perfectly; when the service is running, I get this exception:
The process cannot access the file because it is being used by another
process.
However, when I zip with 7-Zip, I don't get any exception.
My command:
Compress-Archive -Path [PATH] -CompressionLevel Optimal -DestinationPath "[DEST_PATH]" -Force
Do you have any idea to perform the task without this exception?
Copy-Item allows you to access files that are being used in another process.
This is the solution I ended up using in my code:
Copy-Item -Path "C:\Temp\somefolder" -Force -PassThru |
Get-ChildItem |
Compress-Archive -DestinationPath "C:\Temp\somefolder.zip"
The idea is that you pass all the copied items through the pipeline instead of having to copy them to a specific destination first before compressing.
I like to zip up a folder's content rather than the folder itself, so I'm using Get-ChildItem before compressing in the last line.
Sub-folders are already included. There is no need to use -Recurse in the first line to achieve this.
A good method to access files that are being used by another process is to create a snapshot using the Volume Shadow Copy Service.
To do so, you can simply use PowerShell's WMI cmdlets:
$Path = "C:/my/used/folder"
$directoryRoot = [System.IO.Directory]::GetDirectoryRoot($Path).ToString()
$shadow = (Get-WmiObject -List Win32_ShadowCopy).Create($directoryRoot, "ClientAccessible")
$shadowCopy = Get-WmiObject Win32_ShadowCopy | ? { $_.ID -eq $shadow.ShadowID }
$snapshotPath = $shadowCopy.DeviceObject + "\" + $Path.Replace($directoryRoot, "")
Now you can use the $snapshotPath as -Path for your Compress-Archive call.
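For example (a minimal sketch; the destination zip path is just a placeholder, and the shadow copy is removed again afterwards):
# Compress the snapshot of the folder instead of the live one (destination path is an example)
Compress-Archive -Path $snapshotPath -DestinationPath "C:\Backups\UsedFolder.zip" -Force

# Delete the shadow copy once the archive has been created
$shadowCopy.Delete()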
This method can also be used to create backups with symlinks.
From there on you can use the linked folders to copy the backed-up files, or to compress them without those access exceptions.
I created a similar function and a small cmdlet in this Gist: Backup.ps1
I had a similar requirement where only a few file extensions needed to be added to the zip.
With this approach, we can copy all the files, including locked ones, to a temp location, zip them, and then delete the copied logs.
It's a bit of a lengthy process, but it made my day!
$filedate = Get-Date -Format yyyyMMddHHmmss
$zipfile = 'C:\Logs\logfiles' + $filedate + '.zip'

# Create the temp location, copy the logs (including locked ones), zip them, then clean up
New-Item -Path "C:\" -Name "Logs" -ItemType "directory" -ErrorAction SilentlyContinue
Robocopy "<Log Location>" "C:\Logs\" *.txt *.csv *.log /s
Get-ChildItem -Path "C:\Logs\" -Recurse | Compress-Archive -DestinationPath $zipfile -Force -ErrorAction Continue
Remove-Item -Path "C:\Logs\" -Exclude *.zip -Recurse -Force
I am writing a PowerShell script to communicate with my Microsoft Azure account, and I am facing an issue with uploading and downloading files using the Set-AzureStorageBlobContent and Get-AzureStorageBlobContent cmdlets.
$Upload = @{
    Context   = $storageContext;
    Container = $container;
    File      = "C:\... \file.csv";
}
Set-AzureStorageBlobContent @Upload -Force;
The first time I run this, it appears to start uploading, but it stays at 0% and never rises. If I cancel it and try executing my script again, I get an error that says:
"Set-AzureStorageBlobContent : A transfer operation with the same source and destination already exists."
A nearly identical thing happens when I try to download an existing blob from Azure.
$params = @{
    Context     = $storageContext;
    Container   = $container;
    Blob        = $blob;
    Destination = $dest
}
New-Item -Path $dest -ItemType Directory -Force
Get-AzureStorageBlobContent @params
I have tried reinstalling Azure.Storage, which is the module that contains the cmdlets Get-AzureStorageBlobContent and Set-AzureStorageBlobContent, but I had no luck. Am I doing something incorrectly, or is this code just wrong somehow?
It seems to be something with your $Upload variable. Try this instead:
Get-ChildItem -Path C:\Images\* | Set-AzureStorageBlobContent -Container "yourcontainername"
more info: https://learn.microsoft.com/en-us/azure/storage/storage-powershell-guide-full
I'm trying to write a PowerShell script that copies recursively from Azure Blob Storage to an Amazon AWS S3 bucket, but without much success.
Can someone help me, please?
$container_name = #("<ContainerItemName>")
$connection_string = 'DefaultEndpointsProtocol=https;AccountName=<AccountName>;AccountKey=<AccountKey>'
$storage_account = New-AzureStorageContext -ConnectionString $connection_string
$S3Bucket = "<BucketName>"
foreach ($container_item in $container_name)
{
    $blobs = Get-AzureStorageBlob -Container $container_item -Context $storage_account
    foreach ($blob in $blobs)
    {
        $item = Get-AzureStorageBlobContent -Container $container_item -Blob $blob.Name -Context $storage_account
        Write-S3Object -BucketName $S3Bucket -KeyPrefix $container_item\ -Folder $item -Force
    }
}
One option is to use the Azure Storage Data Movement Library. You can download the sample code here: https://github.com/Azure/azure-storage-net-data-movement/tree/master/samples/S3ToAzureSample. The following article contains more information on the Data Movement Library: https://azure.microsoft.com/en-us/blog/introducing-azure-storage-data-movement-library-preview-2/
I haven't played with the AWS PowerShell cmdlets (so I may be wrong), but taking a quick look at their documentation here, I think you would need to save the blob to your local computer first and then specify that path in the -File parameter of your Write-S3Object cmdlet.
Downloading Azure Blob to a directory:
Get-AzureStorageBlobContent -Container containername -Blob blob -Destination C:\test\
Assuming the blob's name is blob.png, uploading it to S3:
Write-S3Object -BucketName $S3Bucket -KeyPrefix $container_item\ -File "C:\test\blob.png" -Force
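Putting the two together, your loop could look roughly like this (a hedged sketch, not tested; it reuses the variables from your script and assumes C:\test\ as a local staging folder):
# Sketch: download each blob to a local staging folder, then upload that file to S3
foreach ($container_item in $container_name)
{
    $blobs = Get-AzureStorageBlob -Container $container_item -Context $storage_account
    foreach ($blob in $blobs)
    {
        # Download the blob to C:\test\<blobname>
        Get-AzureStorageBlobContent -Container $container_item -Blob $blob.Name -Destination "C:\test\" -Context $storage_account -Force

        # Upload the downloaded file to S3 under a key prefix matching the container name
        Write-S3Object -BucketName $S3Bucket -Key "$container_item/$($blob.Name)" -File (Join-Path "C:\test\" $blob.Name) -Force
    }
}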
It's easy to copy multiple files to a folder that doesn't exist and let it be created:
Copy-Item C:\Temp\aa C:\Temp2\DoesNotExist
The command above will create the folder DoesNotExist. That's what I'm after.
But what is the PowerShell syntax to do the same when the source is only a single file?
Copy-Item C:\Temp\test.txt C:\Temp2\DoesNotExist
I tried C:\Temp2\DoesNotExist\ (with the trailing slash), but PowerShell says "The filename, directory name, or volume label syntax is incorrect." and refuses to copy the single file.
If you're looking for a one-liner solution, you can do this:
copy "C:\test2.txt" -Destination (New-Item "C:\Temp2\DoesNotExist\" -Type container -force) -Container -force
I think Geoff Guynn's one-liner should be as follows:
Copy-Item -Path "C:\test2.txt" -Destination (New-Item "C:\Temp2\DoesNotExist\" -ItemType directory -Force) -Force
The parameter for the cmdlet New-Item should be -ItemType and the intended "Type" should be directory.
The additional parameter -Container for the cmdlet Copy-Item seems superfluous to me: on the one hand it is set to $true by default anyway, and on the other hand a single file is being copied, so there is no folder structure to preserve.