Prevent AzCopy.exe from running when the destination folder already exists - azure-devops

Is there an AzCopy parameter to ensure the destination folder is empty?
If not, how should I check to ensure the folder is empty in a VSTS build pipeline?

What's your destination for the transfer: a blob or a local file?
If it's a blob, the following script can check whether the folder is empty:
$ctx=New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $key
$blob = Get-AzureStorageBlob -Container $containerName -Prefix dir1/dir2/ -Context $ctx -MaxCount 1
if ($blob -eq $null)
{
# container/dir1/dir2/ has no blobs, so run the AzCopy transfer
}
If it's a local file, use the following script:
$file = Get-ChildItem c:\dir1\dir2\
if ($file -eq $null)
{
# c:\dir1\dir2\ has no files, so run the AzCopy transfer
}
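Putting the check and the transfer together for the blob case, a minimal sketch might look like this (the AzCopy.exe install path, the account/container names, and the classic v8-style /-parameters are assumptions, not something from your pipeline):
$ctx = New-AzureStorageContext -StorageAccountName $StorageAccountName -StorageAccountKey $key
$blob = Get-AzureStorageBlob -Container $containerName -Prefix dir1/dir2/ -Context $ctx -MaxCount 1
if ($blob -eq $null)
{
    # container/dir1/dir2/ has no blobs, so run the transfer (paths and keys are placeholders)
    & "C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy\AzCopy.exe" /Source:"C:\dir1\dir2" /Dest:"https://$StorageAccountName.blob.core.windows.net/$containerName/dir1/dir2" /DestKey:$key /S
}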

AFAIK, AzCopy has a feature that copies only data that doesn't already exist in the destination, so if you want to prevent AzCopy.exe from running when the destination folder already exists, it may not be necessary to check whether the destination folder is empty at all.
The /XO and /XN parameters allow you to exclude older or newer source resources from being copied, respectively. If you only want to copy source resources that don't exist in the destination, specify both parameters in the AzCopy command.
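For example, a classic AzCopy.exe command combining both flags might look like this (v8-style syntax; the account, container, and key are placeholders):
AzCopy /Source:C:\myfolder /Dest:https://myaccount.blob.core.windows.net/mycontainer /DestKey:<key> /S /XO /XN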

Related

Azure RunBook - Can We pass the key to a file zip the file and push it to a blob (Key refresh)

I am able to create a runbook that generates a key and passes it into a variable using PowerShell.
Can I move the key into a txt file, zip it, and move it to a blob?
You may use the script below to accomplish the requirement: write a value into a text file, zip it, and send the zip to a storage blob.
$storageAccountName = "xxxxxxxxxxxxxxxxxxxx"
$storageAccountKey = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx=="
$context = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
$KeyValue = "xxxxxxxxxxx"
$KeyValue | Out-File -FilePath ($env:temp+"/KeyFile.txt")
Compress-Archive -Path ($env:temp+"/KeyFile.txt") -DestinationPath ($env:temp+"/KeyFileZip.zip") -CompressionLevel optimal
Set-AzureStorageBlobContent -File ($env:temp+"/KeyFileZip.zip") -Container "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxx" -BlobType "Block" -Context $context -Verbose
Make sure you update the storage account name, storage account key and container name correctly before using the script.
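To confirm the upload from the same session, you can list the blob back afterwards (the container name here is the same placeholder as above):
Get-AzureStorageBlob -Container "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxx" -Context $context | Where-Object { $_.Name -eq "KeyFileZip.zip" }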

Powershell script to copy files from server to azure blob container

I want to copy all the files on my server (an Azure VM) to a container (Azure Blob Storage). Is it possible through PowerShell?
I'm new to PowerShell, please help me out.
If you have any script, please share it.
First, make sure you have installed the Az PowerShell module in your VM and on the machine where you want to run the command. In my sample, I use my PC to run the command.
Store the script below on the PC; I use C:\Users\joyw\Desktop\script.ps1. The script will upload all the files in the folder C:\Users\xxxx\Desktop\test on your VM; you can change it to the path you want.
script.ps1:
$context = New-AzStorageContext -StorageAccountName "<StorageAccountName>" -StorageAccountKey "<StorageAccountKey>"
$files = (Get-ChildItem "C:\Users\xxxx\Desktop\test" -recurse).FullName
foreach($file in $files){
Set-AzStorageBlobContent -Context $context -File "$file" -Container "<container name>" -Blob "$file"
}
Then run the command:
Connect-AzAccount
Invoke-AzVMRunCommand -ResourceGroupName 'rgname' -Name 'vmname' -CommandId 'RunPowerShellScript' -ScriptPath 'C:\Users\joyw\Desktop\script.ps1'
If you want to log in in a non-interactive way, see this link.
Change the parameters to what you want. After uploading the files to the container, they will appear with the original file structure.
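Note that the script above uses each file's full local path as the blob name. If you prefer blob names relative to the source folder, a small variation (same placeholders as above) could be:
$root = "C:\Users\xxxx\Desktop\test"
$files = Get-ChildItem $root -Recurse -File
foreach($file in $files){
    # Strip the local root so the blob name starts at the folder contents
    $blobName = $file.FullName.Substring($root.Length + 1)
    Set-AzStorageBlobContent -Context $context -File $file.FullName -Container "<container name>" -Blob $blobName
}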

Truncate azure storage file directory/delete azure storage file using wildcard suffix

I am writing a Powershell script to enable a monthly cleanup of the files in the azure fileshare.
Ideally I would want one script to just truncate the directory/file location.
But will settle for deleting each file individually.
There are about 20 files which are loaded monthly, each file has a unique name and a datetime stamp as a suffix included in the file name. e.g. filename_20190121123515
I've managed to delete a file using the full name, but would need to use wildcard values for the datetime suffix.
What is the correct syntax for what I am trying to achieve? Or how would I truncate the entire folder?
$context = New-AzStorageContext -StorageAccountName "AccountName" -SasToken "?sv=2015-12-11&si=bss-15D97F9B09D&sr=s&sig=xxxxxxxxxxxxxxxxxxxxx"
Remove-AzStorageFile -ShareName "bss" -Path "root/Temp_Clean_up_test_Folder/FileName_%%%%%%%%%%%%%%.csv" -Context $context

$context = New-AzStorageContext -StorageAccountName "AccountName" -SasToken "?sv=2015-12-11&si=bss-15D97F9B09D&sr=s&sig=xxxxxxxxxxxxxxxxxxxxx"
Remove-AzStorageFile -ShareName "bss" -Path "root/Temp_Clean_up_test_Folder/FileName_*" -Context $context
The cmdlet Remove-AzStorageFile does not support wildcards; you need to loop through the files and delete them one by one.
Sample code like below:
$context = New-AzStorageContext -StorageAccountName "AccountName" -SasToken "xxx"
# Since the file name has date, you can specify which date to be deleted.
# In this sample, we try to delete the files' name contains 201901
$pattern ="201901"
# iterate all the files, and delete
Get-AzStorageFile -ShareName "testfolder" -Path "t1/t1sub" -Context $context | Get-AzStorageFile | Where-Object {($_.GetType().Name -eq "CloudFile") -and ($_.Name -like "*$pattern*")} | Remove-AzStorageFile
After running the script, all the files whose names contain "201901" are deleted.

Delete local file after upload to Azure

Trying to delete local backup files after they have been uploaded to Azure storage, I get the following error:
Get-ChildItem : Cannot find path
'C:\Windows\system32\Microsoft.WindowsAzure.Commands.Common.Storage.ResourceModel.AzureStorageBlob'
because it does not exist.
When running the following code:
$BackupDir= 'C:\BackupDir'
$AzureDir= Get-AzureStorageBlob -Context $context -Container $containername -blob $file.name
Get-ChildItem $AzureDir | ForEach-Object
{
$FileInBackup= $AzureDir + $_.Name
If (Test-Path $FileInBackup) {Remove-Item $FileInBackup}
}
Why is it looking in C:\Windows*blahblah*?
If I print variable $AzureDir to screen, I see all my blobs.
Basically (it's probably obvious), what I want to do is check each file in my backup dir; if it exists in Azure, delete it, and if not, continue on to the upload step. I can share the rest of my code if need be.
RESOLUTION UPDATE:
Thanks to @OmegaMan, who pointed me down the right path, I was able to fix my issue. Here is what I'm using now; it cycles through the 4 blobs correctly and uses the results correctly:
$BackupDir = 'C:\BackupDir'
$AzureFiles = Get-AzureStorageBlob -Context $context -Container $containername -blob *.name
foreach ($AzureFile in $AzureFiles)
{
$FileInBackup = $AzureFile.name
If (Test-Path $BackupDir\$FileInBackup)
{
Remove-Item "$BackupDir\$FileInBackup"
}
}
You use $AzureDir in one instance to hold all the blob infos, which is fine, but then the line $FileInBackup = $AzureDir + $_.Name treats $AzureDir as if it were a literal directory name.
You need to use the actual base directory instead of $AzureDir in those places.

Issue with permissions when creating and copy directories

I have created a PowerShell script for copying files to a directory. The script first creates a folder (or forces a new folder even if it exists), then copies a directory from another location. After copying the files, I need to copy the correct web config depending on a value given by the user executing the script. The issue I am having is that I can copy the files, but they are all set to read-only, so when I try to copy the correct web.config over them, the script fails because access is denied.
This is a cut-down version of the script for simplicity.
$WebApp_Root = 'C:\Documents and Settings\user\Desktop\Dummy.Website'
$Preview_WebApp_Root = 'c:\applications\Preview\'
$Choice = read-host("enter 'preview' to deploy to preview, enter Dummy to deploy to Dummy, or enter test to deploy to the test environment")
if (($Choice -eq 'Preview') -or ($Choice -eq 'preview'))
{
$Choice = 'Preview'
$Final_WebApp_Root = $Preview_WebApp_Root
}
write-host("Releasing Build to " + $Choice +'...')
write-host("Emptying web folders or creating them if they don't exist... ")
New-Item $Final_WebApp_Root -type directory -force
write-host("Copying Files... ")
Copy-Item $WebApp_Root $Final_WebApp_Root -recurse
write-host("Copy the correct config file over the top of the dev web config...")
Copy-Item $Final_WebApp_Root\Config\$Choice\Web.configX $Final_WebApp_Root\web.config
write-host("Copying correct nhibernate config over")
Copy-Item $Final_WebApp_Root\Config\$Choice\NHibernate.config $Final_WebApp_Root\NHibernate.config
write-host("Deployed full application to environment")
Try using the -Force parameter to replace read-only files. From the documentation:
PS> help Copy-Item -Par force
-Force [<SwitchParameter>]
Allows the cmdlet to copy items that cannot otherwise be changed,
such as copying over a read-only file or alias.
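Applied to the script above, the copy steps might then look like this:
Copy-Item $WebApp_Root $Final_WebApp_Root -Recurse -Force
Copy-Item $Final_WebApp_Root\Config\$Choice\Web.configX $Final_WebApp_Root\web.config -Force
Copy-Item $Final_WebApp_Root\Config\$Choice\NHibernate.config $Final_WebApp_Root\NHibernate.config -Force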