AzCopy upload then delete local - PowerShell

I'm working on a PowerShell script to upload files to an Azure blob from a local directory and then, if successful, remove the local files. The blob is just a staging location and the files are moved on from there, so I cannot sync. I think the only way to do this might be to check the status of the copy from the log? I thought the snippet below might do what I need, but it just starts uploading all the files again each time it cycles back through and never goes on to remove the source files. Any help would be greatly appreciated.
while ($true) {
    .\azcopy copy "$Upload\*.iso" (SAS URL) --log-level ERROR
    Start-Sleep -Seconds 60 -Verbose
}

Daniel's link got me going in the right direction.
.\azcopy copy "$Upload\*.iso" "SAS" --log-level ERROR
if ($LASTEXITCODE -eq 0) {
    Write-Host "Transfer Successful"
    Write-Host "Remove files after Azure copy is done"
    Remove-Item "Source" -Verbose
}
else {
    Write-Host "Transfer Failed"
}
Stop-Transcript
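For completeness, here is a minimal sketch that folds the exit-code check back into the original polling loop, so local files are only removed after a successful upload ($Upload and the "SAS" URL placeholder are assumed to be set earlier in the script):
while ($true) {
    $files = Get-ChildItem -Path "$Upload\*.iso"
    if ($files) {
        .\azcopy copy "$Upload\*.iso" "SAS" --log-level ERROR
        if ($LASTEXITCODE -eq 0) {
            Write-Host "Transfer Successful, removing local copies"
            $files | Remove-Item -Verbose
        }
        else {
            Write-Host "Transfer Failed with exit code $LASTEXITCODE, keeping local copies"
        }
    }
    Start-Sleep -Seconds 60
}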

if then else not seeing else argument

I'm trying to teach myself some PowerShell scripting to automate some tasks at work.
The latest task I tried to automate was creating a copy of user files on a network folder, so that users can easily relocate their files when swapping computers.
Problem is that my script automatically grabs the first option in the whole shebang; it never picks the "else" option.
I'll walk you through part of the script. (I translated some words to make it easier to read.)
# The script asks whether you want to create a copy, or put a copy back
$question1 = Read-Host "What would you like to do with your backup? make/put back"
if ($question1 -match 'put back')
{
    Write-Host ''
    Write-Host 'Checking for backup'
    Write-Host ''
    # Check for an existing backup
    if (-Not (Test-Path -LiteralPath "G:\backupfolder"))
    {
        Write-Host "no backup has been found"
    }
    elseif (Test-Path -LiteralPath "G:\backupfolder")
    {
        Write-Host "a backup has been found."
        Copy-Item -Path "G:\backupfolder\pictures\" -Destination "C:\Users\$env:USERNAME\ ....
    }
}
Above you see the part where a user would want to put a "backup" back.
It checks whether a "backup" exists on the G-drive. If the script doesn't see a backup folder, it says so. If the script DOES see the backup, it should copy the content from the folders on the G-drive to the similarly named folders in the user's profile folder. Problem is: so far it only acts as if there is never a G:\backupfolder to be found. It seems that I'm doing something wrong with if/then/else.
I tried with if/else, and with if/elseif, but neither works.
I also thought that it could be Test-Path, so I tried adding -LiteralPath, but to no avail.
There is more to the script, but it's just more if/then/else. If I can get it to work on this part, I should be able to get the rest working. What am I not seeing/doing wrong?
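For what it's worth, here is a minimal sketch of just the branching described above, with the existence check done once; the destination under the user profile is hypothetical and would need to match the real folder layout:
$backupPath = "G:\backupfolder"
if (Test-Path -LiteralPath $backupPath)
{
    Write-Host "a backup has been found."
    # Hypothetical destination; adjust to the real user-profile folders
    Copy-Item -Path (Join-Path $backupPath "pictures") -Destination "$env:USERPROFILE\Pictures" -Recurse
}
else
{
    Write-Host "no backup has been found"
}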

How to read the logs from TFS Build/Release and change the task status accordingly?

I am using TFS (on-premises 2015) automated build and release for one of my projects. In the release definition I have an ALM task, and I can see the TFS release log returning "completed successfully: Y (or N)" based on the task completion status in ALM, yet the ALM task always shows success. Is there any way I can read this "completed successfully: N" from the logs and fail the ALM release task itself as an indication of failure?
Thanks in advance for any help!
Well, you're not giving us much to go on here without a better idea of what your script does, but you could do something like this (at the end of your command):
Command -ErrorVariable fail
if ($fail -ne $null) {
    $success = $fail
} else {
    $success = $true
}
You could also pipe the error variable into the log file on the next line if it's a txt log.
Command -ev fail
$fail | Out-File log.txt -Append
Or
Command -ev fail
if ($fail -ne $null) {
    Write-Output "the command failed at $variable" | Out-File log.txt -Append
}
$variable would be whatever variable your loop (or whatever you use) relies on to identify the current task.
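As a rough sketch of how that could look inside a loop (the task names, paths and log file are hypothetical; only the -ErrorVariable pattern is the point):
$tasks = @("task1", "task2")   # hypothetical list of tasks
foreach ($task in $tasks) {
    $fail = $null
    # Hypothetical command; any cmdlet that supports -ErrorVariable works the same way
    Copy-Item "C:\source\$task" "D:\dest\$task" -ErrorVariable fail -ErrorAction SilentlyContinue
    if ($fail -ne $null) {
        Write-Output "the command failed at $task" | Out-File log.txt -Append
        $fail | Out-File log.txt -Append
    }
}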

Moving content from one external storage location to a network share with email confirmation

This is my first non-very-basic attempt at PowerShell scripting, so please excuse any poor etiquette.
I need to transfer approximately 30 GB of video data from USB-attached storage to a local network share. As I started this little project, I quickly identified that the checks I do naturally when performing this task by hand need to be accounted for in the script. So my question is: how do I lay this all out to achieve my end goal of a simple copy while still allowing for those checks?
This is what I have thus far:
### The purpose of this script is to automate the offloading of the Driver Cameras during FP1 and FP2 on a Friday.
### Please remember that the cameras still need to be cleared after fireups on a Thursday.
Write-Host -NoNewLine "This script will move the footage captured on the SD cards of the FWD and RWD cameras and copy them to a defined network share" -ForegroundColor Green `n
Start-Sleep -Seconds 10

# Execute the copy from the forward-facing camera to the network and remove the local files once complete
Write-Host "FWD Camera copy in progress" -ForegroundColor White -BackgroundColor Magenta `n
Start-Sleep -Seconds 5
Get-ChildItem -Path "SourcePath" -Recurse |
    Move-Item -Destination "DestinationPath" -Verbose

# Execute the copy from the rearward-facing camera to the network and remove the local files once complete
Write-Host "RWD Camera copy in progress" -ForegroundColor White -BackgroundColor Magenta `n
Start-Sleep -Seconds 5
Get-ChildItem -Path "SourcePath" -Recurse |
    Move-Item -Destination "DestinationPath" -Verbose

Write-Host "Sending email confirmation" -ForegroundColor Green `n
Send-MailMessage -SmtpServer ServerIP -To "EmailAddress" -From "EmailAddress" -Subject "Camera offload" -Body BodyHere -BodyAsHtml
Write-Host "All tasks have completed" -ForegroundColor Green `n
Read-Host "Press any key to exit..."
exit
What I'd like to add is fault tolerance, with the outcome communicated dynamically via email. Please find the criteria below:
There's a chance the cable connecting the storage to the machine running the script could become disconnected after only some of the items have been moved. Can I add something to handle this?
If a file transfer fails, how do I restart and track this? Can I add a loop to confirm all the items have been moved?
How do I reference a fault code to dynamically update the content of the email sent to the end user?
Finally, are there any other common-practice references I've missed that need to be included?
Many thanks in advance for any help.
This topic is a bit broad, but let me try to address your questions to help you get started. Of course I won't give you the whole code, just an explanation of what to use and why.
There's a chance the cable connecting the storage to the machine running the script could become disconnected and only have moved a number of items, can I add something to aid this?
First of all, as vonPryz said in comments, use robocopy!
It should survive network interruptions (e.g. check this post). As a general approach, I'd first make sure the content has been copied successfully before deleting it. For example, you could use something like this:
(Get-ChildItem -Recurse).FullName.Replace("C:\old\path","D:\new\path") | Test-Path
Of course the above will only check whether a file exists, not whether it has the same content. To check that the files are actually identical, you could use Get-FileHash.
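A rough sketch of that hash comparison for a single file (the paths are placeholders taken from the example above):
$sourceFile = "C:\old\path\video.mp4"                           # placeholder source file
$destFile   = $sourceFile.Replace("C:\old\path", "D:\new\path") # matching destination path

$sourceHash = (Get-FileHash -Path $sourceFile -Algorithm SHA256).Hash
$destHash   = (Get-FileHash -Path $destFile   -Algorithm SHA256).Hash

if ($sourceHash -eq $destHash) {
    Remove-Item -Path $sourceFile -Verbose   # safe to delete, contents match
}
else {
    Write-Host "Hash mismatch for $sourceFile, keeping the source"
}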
If a file transfer fails how do i restart and track this? Can I add a loop to confirm all the items have been moved?
Partially answered above. Robocopy has this feature built-in. And yes, you can add a loop to check.
How do I reference a fault code to dynamically update the content of the email sent to the end user?
Check the example from here:
robocopy b:\destinationdoesnotexist C:\documents /MIR
if ($lastexitcode -eq 0)
{
    Write-Host "Success"
}
else
{
    Write-Host "Failure with exit code:" $lastexitcode
}
Also, there's an article on the Microsoft site listing all the exit codes, which might be helpful for handling the exceptions. All you have to do is add $LASTEXITCODE to the email body.
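As a rough sketch of how that could tie into the email from your script (the robocopy source/destination paths are hypothetical; the SMTP details are the same placeholders you already use):
robocopy "E:\FWDCamera" "\\server\share\FWDCamera" /MOV /E   # hypothetical paths; /MOV removes source files after copying
$exitCode = $LASTEXITCODE

# Robocopy exit codes below 8 indicate success (0 = nothing to copy, 1 = files copied, etc.)
if ($exitCode -lt 8) {
    $body = "Camera offload completed. Robocopy exit code: $exitCode"
}
else {
    $body = "Camera offload FAILED. Robocopy exit code: $exitCode"
}

Send-MailMessage -SmtpServer ServerIP -To "EmailAddress" -From "EmailAddress" -Subject "Camera offload" -Body $body -BodyAsHtml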

Get batch file return code in PowerShell

I'm trying to write a PowerShell script as a "framework". Basically, it calls batch files that install MSI files locally.
There are a few things I am trying to get done:
PowerShell should get whatever exit codes the batch files return.
I need a timer to check whether a batch file is stuck. If it's stuck, kill it, return an error and continue.
I need a progress bar for each batch file installation.
I'm not sure if I can do all (or any) of this. Since every package (the batch files) comes from a different group, I can't use PowerShell for everything and have to use the batch files the other groups sent me.
I was able to get the packages to install with a simple script, but I couldn't do any of the things I listed above. I keep getting the last exit code as 0. I assume that's because the batch file itself ran successfully.
$APP1 = "Microsoft_RDP_8.1_L_EN_01"
Stop-Process -Name reg* -Force
Write-Host "=== $Time -- Starting $APP1 Installation"
try
{
    $Install = "$pwd\cmd\$App1\Install.cmd"
    cmd /c $Install /b 1
    $App1ErrorCode = $LastExitCode
    Write-Host "Final Return Code Of $APP1 Is = $App1ErrorCode"
}
catch
{
    Write-Host "-----------------------------------------------------------" -ForegroundColor Green
    Write-Host "Caught an exception:" -ForegroundColor Red
    Write-Host "Exception Type: $($_.Exception.GetType().FullName)" -ForegroundColor Red
    Write-Host "Exception Message: $($_.Exception.Message)" -ForegroundColor Red
    Write-Host "-----------------------------------------------------------" -ForegroundColor Green
}
$LastExitCode = $null
$Value = $null
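For what it's worth, here is a minimal sketch of one common way to get the real exit code and kill a stuck installer, using Start-Process instead of cmd /c; the 30-minute timeout is an arbitrary example value:
$Install = "$pwd\cmd\$App1\Install.cmd"

# Start the batch file as its own process so it can be watched and its exit code read
$proc = Start-Process -FilePath "cmd.exe" -ArgumentList "/c `"$Install`"" -PassThru -NoNewWindow
$null = $proc.Handle   # touch the handle so ExitCode is populated after the process exits

# Hypothetical timeout: kill the installer if it has not finished after 30 minutes
if (-not $proc.WaitForExit(30 * 60 * 1000)) {
    $proc | Stop-Process -Force
    $App1ErrorCode = -1   # our own marker for "timed out"
}
else {
    $App1ErrorCode = $proc.ExitCode
}

Write-Host "Final Return Code Of $APP1 Is = $App1ErrorCode"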

Azure Blob Copy stuck in Pending State

I'm attempting to copy a VM from one subscription to another. I've done this for 5 other VMs without issue. All of a sudden, with THIS VM, I'm having issues. The process continually gets hung when the OS disk is being copied.
The script I'm using is below:
foreach ($disk in $allDisks)
{
    $blobName = $disk.MediaLink.Segments[2]
    $blobStorageAccount = $disk.MediaLink.Segments[1].Replace('/', '')
    $targetBlob = Start-AzureStorageBlobCopy -SrcContainer $blobStorageAccount -SrcBlob $blobName `
        -DestContainer vhds -DestBlob $blobName `
        -Context $sourceContext -DestContext $destContext -Force
    Write-Host "Copying blob $blobName"
    $copyState = $targetBlob | Get-AzureStorageBlobCopyState
    while ($copyState.Status -ne "Success")
    {
        $percent = ($copyState.BytesCopied / $copyState.TotalBytes) * 100
        Write-Host "Completed $('{0:N2}' -f $percent)%"
        Start-Sleep -Seconds 20
        $copyState = $targetBlob | Get-AzureStorageBlobCopyState
    }
}
When I check the status on $copyState, it's stuck at Pending. I've used Stop-AzureStorageBlobCopy, deleted the destination blob and started over, but no matter what, it's always just stuck in the Pending state with BytesCopied at 0.
The source VM has been stopped (deallocated) prior to the copy. There are no other pending copy operations that I can see, and I've checked every blob in the destination subscription manually.
I even tried a rename operation on the source blob in Azure Storage Explorer, which ended up creating a copy. That copy completed without issue. I then tried copying the COPY of the original file over to the other subscription, and it also got stuck at "Pending".
Any ideas why I can't copy between the subscriptions?