How do you reboot an ADO Microsoft-hosted build agent halfway through a pipeline run and then continue with the next tasks - azure-devops

I need to reboot an ADO Microsoft-hosted build agent halfway through a pipeline run and then continue on with the remaining tasks in that pipeline. The reason is that I need to install Hyper-V on the build agent. I can run Enable-WindowsOptionalFeature -Online -FeatureName Microsoft-Hyper-V -All, but it won't take effect until a reboot. Therefore, a reboot after running that command is needed, and then the remaining tasks (Get-VM etc.) need to run after the reboot has finished.

You can't. Microsoft-hosted agents are ephemeral: the VM is recycled as soon as it shuts down, so there is no way to reboot the agent and continue the pipeline on the same machine.

Related

MDT will not allow BIOS update upon reboot

I have been picking at this issue for a few weeks now without finding a resolution or solution online. When I run a Dell BIOS update with /s and /f during a new Windows 10 21H1 build task sequence, the update runs successfully, with the BIOS update log showing error 2, meaning a reboot is needed to perform the BIOS update. So in the next step of the task sequence I perform a reboot, but the BIOS never does the update; it just boots into Windows. I tried this from the command line, from PowerShell, and as an application with the reboot box checked. Every way I run it, the log says it is ready to update on reboot, but it never does. I can get the update to work if I manually perform the reboot with the mouse before MDT reboots it. This actually performs the update at the reboot as it should! However, this of course creates a dirty environment and MDT is grumpy.
This happens on all the different Dell builds I try that are only one or two revisions newer. We currently use PDQ to run the updates; when I call the install from there, it works fine too. We want to move away from PDQ to a free solution, such as straight from MDT. I found many different ways people have done this via a task sequence, with no mention of this hiccup. What I seem to be running into is that MDT is removing whatever the BIOS update puts into the boot sequence, so it never gets performed. I've tried different credentials, Dell's flash64w.exe, and too much else to list. Things seem to work until the reboot. I'm stumped.
Sample of simple working PowerShell:
# Get model of the system to be updated
$Model = (Get-WmiObject Win32_ComputerSystem).Model
Write-Host "Model Found: $Model"
# Root folder where the BIOS update for this model is stored
$BIOSRoot = "Z:\Applications\BIOSUpdates\Dell\$Model"
Write-Host "BIOSRoot: $BIOSRoot"
# Find the BIOS executable under that folder
$BIOSFile = Get-ChildItem -Path $BIOSRoot -Include *.exe -Recurse -ErrorAction SilentlyContinue
#Write-Host "BIOSFile: $BIOSFile"
# Note: '#(...)' was a typo for the array literal '@(...)', and $ARGS is an
# automatic variable in PowerShell, so use a different name
$BIOSArgs = @('/s', '/f')
Write-Host "BIOSFile and arguments: $BIOSFile $BIOSArgs"
# Start the BIOS update with the completed path
Start-Process -FilePath $BIOSFile.FullName -ArgumentList $BIOSArgs -Wait
Is anyone else having this show-stopping issue?
So after much testing, the answer is to run the Start-Process call at the end of my code twice. Why? I have no idea. I accidentally had it looping and it worked on the second loop. It just needed to run twice. I thought maybe it just needed more time, so I put a sleep at the end of the code, but time is not what it wanted. Very bizarre.
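A minimal sketch of the workaround described above, looping the Start-Process call so the flash utility is invoked twice ($BIOSFile carries over from the earlier script; why the second pass is needed remains unexplained):

```powershell
# Workaround: invoke the Dell flash utility twice; on the affected builds,
# staging the on-reboot update only sticks on the second invocation.
for ($i = 1; $i -le 2; $i++) {
    Write-Host "BIOS update pass $i"
    Start-Process -FilePath $BIOSFile.FullName -ArgumentList @('/s', '/f') -Wait
}
```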

Azure DevOps interactive agent

For running our UI tests we were using the "Visual Studio test agent deployment" task, which has now been deprecated. So I have moved the tasks to use the "Visual Studio Test" task.
This needs agents to be configured as an interactive process rather than a service, so I created a new build server with the interactive agent running under an admin user.
For signing our code we use certificates that get installed on the build server. But for some reason this new build server seems to lose the certificate credentials every single time the pipeline runs.
I have tried running a ps1 file to reinstall the cert, with no success. Below is the code I have for reinstalling the cert; I am using the "PowerShell on target machines" task to run the script. The user running this script is an admin on the box. Is there a different way of getting the ps1 file to run as admin all the time?
Set-Location "C:\Program Files (x86)\Microsoft SDKs\Windows\v10.0A\bin\NETFX 4.6.1 Tools"
Start-Process cmd.exe
Start-Sleep -Seconds 2
$WshShell = New-Object -ComObject WScript.Shell
Start-Sleep -Seconds 2
# Delete the key container, re-import the PFX, then answer the password prompt
$WshShell.SendKeys(".\sn.exe -d $KeyContainer{Enter}")
Start-Sleep -Seconds 2
$WshShell.SendKeys(".\sn.exe -i $PfxCertificatePath $KeyContainer{Enter}")
Start-Sleep -Seconds 2
$WshShell.SendKeys("$passWord{Enter}")
Start-Sleep -Seconds 2
$WshShell.SendKeys("{Enter}")
Any help would be greatly appreciated.
Is there a way to get cmd.exe to run as admin through the pipeline?
As far as I know, only version 3.0 of the PowerShell on Target Machines task currently supports running as admin. Apart from that, the Command Line task and the PowerShell task don't support this feature yet.
So I'm afraid the answer is negative: an Azure DevOps pipeline doesn't have an option to run those tasks (the CMD task, the PS task, and so on) with admin rights. (From your description, you're looking for an alternative to the PowerShell on Target Machines task.)
Sorry for the inconvenience. I do think it would be a good addition, so feel free to add your request (a CMD task with admin rights, etc.) on our UserVoice site, which is our main forum for product suggestions. It may not be accepted by Microsoft if it doesn't get enough votes, though.
Hope it helps :)
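As a side note, the interactive SendKeys sequence in the question can sometimes be avoided by importing the PFX non-interactively. A hedged sketch, reusing the $PfxCertificatePath and $passWord variables from the question; note this puts the certificate into the machine certificate store rather than an sn.exe key container, so it only helps if your signing step can read from the store:

```powershell
# Import the signing certificate into the machine store without SendKeys.
# Import-PfxCertificate is part of the built-in PKI module on Windows Server.
$securePwd = ConvertTo-SecureString $passWord -AsPlainText -Force
Import-PfxCertificate -FilePath $PfxCertificatePath `
                      -CertStoreLocation Cert:\LocalMachine\My `
                      -Password $securePwd
```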

PowerShell script lagging on GitLab CI

I am running a PowerShell script using the posh-ssh package to copy files over SCP from a Windows-based GitLab CI runner to a Linux server.
Write-Output "`r`nUploading $($theme.name)..."
Set-SCPFolder -ComputerName '141.209.15.16' -Credential $sshCredentials -LocalFolder $theme.fullname -RemoteFolder "/home/cmuwebuser/$($theme.name)" -AcceptKey -ErrorAction Stop
Write-Output "Success"
When I run this process in the PowerShell terminal it takes under 5 seconds per file, but when I run it from the CI script it takes over 1 minute per file.
Solved by adding the -NoProgress option to the Set-SCPFolder command. Apparently the GitLab CI terminal view cannot handle the dynamic nature of a PowerShell progress bar and hangs. Removing all progress bars from the script resolved the issue.
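A related mitigation commonly used in non-interactive CI shells is to suppress PowerShell progress rendering globally for the session; whether it applies to your runner is an assumption, but it also covers cmdlets that don't expose a -NoProgress switch:

```powershell
# Suppress all Write-Progress output for this session; dynamic progress
# bars can stall log rendering in CI terminals.
$ProgressPreference = 'SilentlyContinue'

Write-Output "`r`nUploading $($theme.name)..."
Set-SCPFolder -ComputerName '141.209.15.16' -Credential $sshCredentials `
              -LocalFolder $theme.fullname `
              -RemoteFolder "/home/cmuwebuser/$($theme.name)" `
              -AcceptKey -NoProgress -ErrorAction Stop
Write-Output "Success"
```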

How can I run a PowerShell script after reboot?

I have a PowerShell script that tails specific logs. If there is no update to the logs within a specific time span, an alert is sent to Nagios (as this indicates that the service is no longer running).
The PowerShell script works great when run manually, but my issue is that I want it to load on reboot. I've tried creating a scheduled task that repeats every 5 minutes using the arguments '-noexit -file C:\script.ps1'. The problem is that my script doesn't actually work when run as a scheduled task.
The execution policy is set to Unrestricted, so the script runs, but the code doesn't execute and work like it does when manually run.
FWIW, the code is:
function Write-EventlogCustom($msg) {
    Write-EventLog -LogName System -Source System -EventId 12345 -Message $msg
}
Get-Content -Path C:\test.log -Wait | ForEach-Object { Write-EventlogCustom $_ }
So if I update test.log while the PowerShell script is running as a scheduled task, the event log doesn't get updated. However, when I run the script manually and update test.log, the entry does appear in Event Viewer.
I'm hoping that a second set of eyes might find something that I may have missed?
As @Tim Ferrill mentioned, I needed to run the process with Task Scheduler's 'Run with highest privileges' setting. This resolved the issue.
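For reference, that fix can also be applied when the task is created from script. A sketch using the built-in ScheduledTasks module (the task name, account, and trigger are illustrative):

```powershell
# Register the log-tailing script to run at startup with highest privileges,
# equivalent to ticking 'Run with highest privileges' in Task Scheduler.
$action    = New-ScheduledTaskAction -Execute 'powershell.exe' `
             -Argument '-NoExit -File C:\script.ps1'
$trigger   = New-ScheduledTaskTrigger -AtStartup
$principal = New-ScheduledTaskPrincipal -UserId 'SYSTEM' -RunLevel Highest
Register-ScheduledTask -TaskName 'TailLogs' -Action $action `
    -Trigger $trigger -Principal $principal
```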

Installing an .exe on a remote machine from PowerShell

So, I have been trying to do the following via a PowerShell script:
For a list of computers, do:
Ping the computer (via WMI) to see if it's available; if not, log & break, if so, continue on
Create a folder on the root of the C:\ drive (via Invoke-WmiMethod); if fails, log & break, if successful, continue on
Copy files (includes an .exe) from another machine into that folder; if fails, log & break, if successful, continue on
Run the .exe file (via Invoke-WmiMethod); if fails, log & break, if successful, log success, done (with this computer.)
The problem I'm running into is the execution of the .exe (program installer): the Invoke-WmiMethod command usually works, but on some machines it hangs (doesn't fail, just hangs). I've tried a whole bunch of things to get it to run as a job so I can set a timeout on the install: running the Invoke-WmiMethod command with the -AsJob param always returns Failed; Start-Job -Computer $compname { Invoke-WmiMethod ... } returns Completed but the install never happens; and I've made sure the remote machines have Windows Firewall disabled, UAC turned off, etc., but still, if I run the Invoke-WmiMethod command on them without a job, it hangs. (And yes, I'm running PS as a Domain Admin, so I should have rights on the target machines.)
So, being a newb at all things PowerShell, I'm now at a complete loss as to what to try next. How would you tackle running an .exe on a remote system from a PowerShell script? One caveat is that the target machines don't all run PowerShell [V1|V2] (the target PCs are a mix of XP, Vista and 7) or don't have remoting enabled. The other caveat is that the installer is an .exe, not an .msi, and this can't be changed (it's a third-party app).
Thanks in advance to anyone who can point me in the right direction here (and give me some sample code...)
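For context, the remote-execution step described above is typically done through the Win32_Process class. A minimal sketch (the computer name, folder, installer name, and silent-install switch are placeholders; the actual switches depend on the third-party installer):

```powershell
# Launch the installer on the remote machine via WMI; this works without
# PowerShell remoting being enabled on the target.
$result = Invoke-WmiMethod -ComputerName $compName -Class Win32_Process `
          -Name Create -ArgumentList 'C:\InstallFolder\setup.exe /quiet'
if ($result.ReturnValue -eq 0) {
    Write-Host "Install started on $compName (PID $($result.ProcessId))"
} else {
    Write-Host "Install failed on $compName with code $($result.ReturnValue)"
}
```

Note that Win32_Process.Create returns as soon as the process starts, so a success here only means the installer launched, not that it finished.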
What OS is running on the system (the management station or central system) where these scripts are executed? If it's Windows XP, there is a known issue with WMI and -AsJob.
Check this: WMI Query Script as a Job
In that case, I'd suggest moving to a Windows 7 system and then running the script from there to remotely install the .exe on all the other machines.