Returning success and then restarting computer - powershell

I'm integrating Jenkins with a bunch of stuff and I'm using PowerShell to do this. I have a script on a remote machine that is executed after a build succeeds in Jenkins. The script does a bunch of work and then restarts the machine.
What I need to do is:
Return to Jenkins that the script was successful (so that the job ends as SUCCESS)
Then restart the machine
So far I have not managed to send 'EXIT 0' to Jenkins and then restart the machine. Is there any way to do this?
Thanks in advance.
Code example:
Write-Host "Code example"
Exit 0 #for jenkins success
Restart-Computer -Force

This launches a separate command prompt that runs asynchronously from the PowerShell script and restarts the computer after 3 seconds, which is enough time for PowerShell to return the exit code to Jenkins.
Start-Process -FilePath "cmd.exe" -ArgumentList '/c "timeout /t 3 /nobreak && shutdown -r -f -t 0"' -WindowStyle Hidden
Exit 0

As noted in a comment by @Avshalom, your problem is that the Exit statement will unconditionally exit your script without ever executing the Restart-Computer command placed after it.
Restart-Computer, when executed locally, is invariably asynchronous, so your script will continue to execute, at least for a while.
You can therefore try to call Restart-Computer first, and exit 0 afterwards:
Write-Host "Code example"
Restart-Computer -Force
exit 0 # for Jenkins success
However, there's no guarantee that control will return to Jenkins in time and that Jenkins itself will have time to process the successful exit before it is shut down itself.
You can improve the likelihood of that with a delay, via a separate, asynchronously launched PowerShell instance[1], similar to the approach in Evilcat's answer:
Write-Host "Code example"
# Asynchronously start a separate, hidden PowerShell instance
# that sleeps for 5 seconds before initiating the shutdown.
Start-Process -WindowStyle Hidden powershell.exe -Args '-command',
'Start-Sleep 5; Restart-Computer -Force'
exit 0 # for Jenkins success
This still isn't a fully robust solution, however; a fully robust solution requires changing your approach:
Let your script indicate success only, without initiating a restart.
Make Jenkins test for success and, if so, call another script that unconditionally initiates a shutdown (see the sketch below).
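As a rough sketch of that split (the script names build-work.ps1 and restart.ps1 are hypothetical, not something Jenkins prescribes):
# build-work.ps1 - does the build-related work and only reports the result
Write-Host "Code example"
exit 0  # Jenkins records SUCCESS; no restart is initiated here

# restart.ps1 - configured in Jenkins as a separate step that runs
# only if the previous step succeeded
Restart-Computer -Force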
[1] As Evilcat points out, using a background job with Start-Job does not work, because on exiting the calling PowerShell session with exit the background jobs are terminated too.

PowerShell batch script Timeout ERROR: Input redirection is not supported, exiting the process immediately

I need a timeout in PowerShell code where I'm running a batch file, in case the batch file runs for too long or gets stuck. I also have a timeout in the batch script (timeout 300 > nul), which seems to be what produces this error; the script just skips the timeout and executes the next lines. I do not get this error if I remove the timeout from the batch script, but I need timeouts in both places. How do I resolve this?
Error- ERROR: Input redirection is not supported, exiting the process immediately.
PS code:
$bs = {
    cd D:\files\
    cmd.exe /c "mybatchfile.bat"
}
$timeoutseconds = 800
$j = Start-Job -ScriptBlock $bs
if (Wait-Job $j -Timeout $timeoutseconds) { Receive-Job $j }
Remove-Job -Force $j
The batch script is something like this:
cmd1
cmd2
timeout 300> nul
cmd3
Whenever timeout.exe detects that its stdin input is redirected (not attached to a console), it aborts with the error message you saw.
Because of how background jobs launched with Start-Job are implemented, whatever external processes you run from the job's script block invariably see stdin as redirected, so you won't be able to call timeout.exe.[1]
Your best bet is to use thread-based background jobs, where this problem doesn't arise:
Start-ThreadJob runs your script block in a separate thread in the same process, and stdin redirection isn't involved; also, because no new PowerShell (child) process must be started, thread jobs are faster to create and require fewer resources.
Start-ThreadJob is part of the ThreadJob module, which comes with PowerShell (Core) 7+, and can be installed on demand in Windows PowerShell (e.g., with Install-Module -Scope CurrentUser ThreadJob)
Since Start-ThreadJob also integrates with PowerShell's job-management infrastructure and shares the core parameter syntax with Start-Job, all that should be necessary is to replace Start-Job with Start-ThreadJob in your code, as in the sketch below.
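For illustration, a minimal sketch of that substitution applied to the code from the question (this assumes the ThreadJob module is already installed/available):
$bs = {
    cd D:\files\
    cmd.exe /c "mybatchfile.bat"
}
$timeoutseconds = 800
# Start-ThreadJob runs the script block on a thread of the current process,
# so no stdin redirection is involved and timeout.exe can run.
$j = Start-ThreadJob -ScriptBlock $bs
if (Wait-Job $j -Timeout $timeoutseconds) { Receive-Job $j }
Remove-Job -Force $j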
[1] Unless there is a way to reattach stdin to the console in a timeout.exe call - I'm not aware of such a feature, however (<CON creates an Access denied error).
You might be interested in adding a delay another way; one alternative is to wait with:
REM waits a delay before refreshing the status of the service
ping localhost
Indeed, I had a similar issue with remote PowerShell, in a scenario where a GitLab agent install script runs on a remote server. By replacing timeout with ping, the script no longer finishes with an error.
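If you need a specific delay length, a common batch-file idiom (an addition here, not part of the original answer) is to ping the loopback address a fixed number of times, at roughly one ping per second:
REM waits roughly 5 seconds (6 pings, about 1 second apart)
ping -n 6 127.0.0.1 > nul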

Powershell waits on cmd.exe differently depending on environment

Consider the powershell command:
cmd.exe "/c start notepad.exe"
Using powershell.exe (console) this command completes immediately after starting the cmd/notepad process and does not wait for notepad/cmd to exit. If I want it to wait for completion I have to pipe the output to Out-Null.
Using the powershell ISE this command blocks execution until notepad/cmd is closed.
Further, if I create a new process (of powershell.exe) from a script running in powershell.exe using System.Diagnostics.Process and redirect standard output, the script now blocks execution until notepad/cmd is closed. If I don't redirect output, it does not block execution.
But if I use c# to create a new process with the same settings/start info, and run the same script with redirected output it doesn't block execution.
I'm at a loss here. Obviously it has something to do with how the execution and output are set up, and maybe with "cmd.exe". I'm hoping someone understands what's fundamentally happening behind the scenes differently in these cases. I know the ISE and cmd.exe aren't fully supported, but I don't know why the other three cases aren't consistent.
Why do the scripts not run with the same behavior (especially the powershell console ones) and how do I get them to?
Edit:
After some troubleshooting I was able to get all the powershell.exe versions to run the same way (the ISE isn't of importance to me). The odd case where cmd.exe "/c start notepad.exe" blocks execution is when a new process is created in PowerShell to run powershell.exe using System.Diagnostics.Process. If output is redirected (the reason I was using System.Diagnostics.Process in the first place; Start-Process doesn't support redirecting except to a file), then calling WaitForExit() on the process object causes the blocking to occur.
Simply substituting WaitForExit() with Wait-Process (and the process ID) causes the PowerShell script running in the new powershell.exe to execute as expected and exit after running the command. Hopefully this information, combined with @mklement0's answer, will be sufficient for anyone else having similar problems.
To get predictable behavior in both the regular console and in the ISE:
Start a GUI application asynchronously (return to the prompt right away / continue executing the script):
notepad.exe
Invoking Notepad directly makes it run asynchronously, because it is a GUI-subsystem application.
If you still want to track the process and later check whether it is still running and what its exit code was on termination, use -PassThru, which makes Start-Process return a [System.Diagnostics.Process] instance:
$process = Start-Process -PassThru notepad.exe
$process.HasExited later tells you whether the process is still running.
Once it has exited, $process.ExitCode tells you the exit code (which may not tell you much in the case of a GUI application).
To wait synchronously (at some point):
Use Wait-Process -Id $process.Id to wait (indefinitely) for the process to terminate.
Add a -Timeout value in seconds to limit the waiting period; a non-terminating error is reported if the process doesn't terminate within the timeout period, causing $? to reflect $False.
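Putting those pieces together, a small sketch of the wait-with-timeout pattern (the 30-second limit is just an example value):
$process = Start-Process -PassThru notepad.exe

# Wait up to 30 seconds; on timeout a non-terminating error is reported
# and $? becomes $false.
Wait-Process -Id $process.Id -Timeout 30
if ($?) {
    "Process exited with code $($process.ExitCode)"
} else {
    "Process is still running after the timeout."
}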
Start a GUI application synchronously (block until the application terminates):
Start-Process -Wait notepad.exe
-Wait tells Start-Process to wait until the process created terminates; this is the equivalent of cmd /c 'start /wait notepad.exe'.
There's a non-obvious shortcut to Start-Process -Wait: you can repurpose the Out-Null cmdlet, taking advantage of the fact that piping the invocation of a program to it makes Out-Null wait for the program's termination (a GUI application has no stdout or stderr output, so there's nothing for Out-Null to discard; the only effect is synchronous invocation):
notepad.exe | Out-Null
In fact, this approach has two advantages:
If arguments must be passed to the GUI application, they can be passed directly, as usual - rather than indirectly, via Start-Process's -ArgumentList parameter.
In the (rare) event that a GUI application reports a meaningful process exit code (e.g., msiexec.exe), the Out-Null approach (unlike Start-Process -Wait) causes it to be reflected in the automatic $LASTEXITCODE variable.
Note: In rare cases, a GUI application may explicitly attach to the caller's console and write information to it; in order to surface that, pipe to | Write-Output instead (you'll still be able to evaluate $LASTEXITCODE) - see this answer.
Note that for console-subsystem applications (e.g., findstr.exe), synchronous execution is the default; Start-Process is only needed for GUI applications (and for special situations, such as wanting to run an application in a new console window or with elevation (run as admin)).
To run a console application or shell command asynchronously (without opening a new console window), you have the following options:
[Preferred] Use Start-Job to kick off the command, and Receive-Job to receive its output / success status later.
$j = Start-Job { sleep 2; 'hi' }
To synchronously wait for this job to finish (and remove it automatically), use
Receive-Job -Wait -AutoRemoveJob $j
In PowerShell (Core) 6+:
You can use the simpler ... & syntax (as in POSIX-like Unix shells such as bash) in lieu of Start-Job; e.g.:
$j = & { sleep 2; 'hi!' } &
Better yet, you can use the lighter-weight, faster Start-ThreadJob cmdlet, which uses threads for concurrency, but otherwise seamlessly integrates with the other *-Job cmdlets (note that it has no short-hand syntax):
$j = Start-ThreadJob { sleep 2; 'hi' }
[Not advisable] You can use something like Start-Process -NoNewWindow powershell -Args ..., but it is ill-advised:
Passing a shell command to powershell via Start-Process requires intricate quoting based on obscure rules - see this answer of mine for background information.
Any stdout and stderr output produced by the application / shell command will by default arrive asynchronously in the current console, interspersed with what you're doing interactively.
While you can redirect these streams to files with -RedirectStandardOutput and -RedirectStandardError (and stdin input via -RedirectStandardInput), and you can use -PassThru to obtain a process-information object to determine the status and exit code of the process, Start-Job and Receive-Job are a simpler, more convenient alternative.
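For completeness, a rough sketch of what that discouraged variant can look like (the file names and the command string are illustrative assumptions only):
$startParams = @{
    FilePath               = 'powershell.exe'
    ArgumentList           = '-Command', 'Get-Date; Start-Sleep 2'   # illustrative command
    NoNewWindow            = $true
    PassThru               = $true
    RedirectStandardOutput = 'out.txt'
    RedirectStandardError  = 'err.txt'
}
$p = Start-Process @startParams

# ...do other work while the command runs asynchronously...

Wait-Process -Id $p.Id            # wait only when the result is needed
"Exit code: $($p.ExitCode)"
Get-Content out.txt               # the captured stdout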
P.S.: I don't have an explanation for why cmd.exe "/c start notepad.exe" is blocking in the ISE but not in the regular console. Given the solutions above, however, getting to the bottom of this discrepancy may not be needed.

PowerShell Start-Process not setting $lastexitcode

I have a set of test DLLs that I'm running from a PowerShell script that calls OpenCover.Console.exe via the Start-Process command.
I have the -returntargetcode flag set
After execution I check $lastexitcode and $?. They return 1 and True respectively all the time. Even when tests are failing.
Shouldn't $lastexitcode be 0 when all tests pass and 1 when they fail?
By default, Start-Process is asynchronous, so it doesn't wait for your process to exit. If you want your command-line tool to run synchronously, drop the Start-Process and invoke the command directly. That's the only way it will set $LASTEXITCODE. For example, causing CMD.exe to exit with a 2:
cmd /c exit 2
$LASTEXITCODE
You can make Start-Process synchronous by adding the -Wait flag, but it still won't set $LASTEXITCODE. To get the exit code from Start-Process, you add -PassThru to your Start-Process call, which then outputs a [System.Diagnostics.Process] object that you can use to monitor the process and (eventually) get access to its ExitCode property. Here's an example that should help:
$p = Start-Process "cmd" -ArgumentList "/c exit 2" -PassThru -Wait
$p.ExitCode
Of course, the advantage of this approach is that you don't need to wait for the process; later, when it exits, you have the information about its run in $p.
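As a hedged illustration of that deferred check (dropping -Wait so your script keeps running in the meantime):
$p = Start-Process "cmd" -ArgumentList "/c exit 2" -PassThru

# ...do other work...

if (-not $p.HasExited) {
    $p.WaitForExit()          # or: Wait-Process -Id $p.Id
}
"Finished with exit code $($p.ExitCode)"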
When executing a GUI application, dropping the Start-Process does not help, because PowerShell does not wait for GUI applications to complete when they are executed directly this way, so $LASTEXITCODE is not set.
Piping the (non-existent) GUI application output helps, as it makes PowerShell wait for the application to complete.
notepad.exe | Out-Null
echo $LASTEXITCODE
Note that "GUI application" does not necessarily mean that the application has windows. Whether an application is GUI or console is a flag in .exe file header.
Start-Process -PassThru -Wait, as suggested in the answer by @Burt_Harris, works in this case too; it's just a bit more complicated.

Is it possible to get different results on `$?` on the same command?

I have a powershell script where I'm executing a node command which is meant to be executed by a TFS 2013 Build:
node "$Env:TF_BUILD_SOURCESDIRECTORY\app\Proj\App_Build\r.js" -o "$Env:TF_BUILD_SOURCESDIRECTORY\app\Proj\App_Build\build-styles.js"
$success = $?
if (!$success) {
    exit 1
}
When I run this script manually and the command fails $success is false and the script exits 1, but when the build executes the script and the node command fails, $success (and $?) is true.
What can change the behavior of powershell? I have no idea what else to try. So far I eliminated the following:
Changed the Build Service user to the same Admin user that executes the script manually
Tried executing the command with cmd /c node ...
Tried executing the command with Start-Process node...
Ran the Build Service interactively
Ran the build with both VSO Build Controller and an on premise Build Controller
Executed the script manually with the same command used by TFS (per the Build Log)
Thoughts?
Can we restructure this a bit so we have a better feel for what is happening? I tend to avoid $? because it is harder to debug and test with.
try
{
    Write-Host "TF_BUILD_SOURCESDIRECTORY = $Env:TF_BUILD_SOURCESDIRECTORY"
    $result = node "$Env:TF_BUILD_SOURCESDIRECTORY\app\Proj\App_Build\r.js" -o "$Env:TF_BUILD_SOURCESDIRECTORY\app\Proj\App_Build\build-styles.js"
    Write-Host "Result = $result"
}
catch
{
    Write-Error "Command failed"
    Exit 1
}
Sometimes I wrap my command in a Start-Process -NoNewWindow -Wait just to see if that generates a different error message.
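A minimal sketch of that wrapping, reusing the paths from the question:
$nodeArgs = @(
    "$Env:TF_BUILD_SOURCESDIRECTORY\app\Proj\App_Build\r.js"
    '-o'
    "$Env:TF_BUILD_SOURCESDIRECTORY\app\Proj\App_Build\build-styles.js"
)
# -Wait makes the call synchronous; -PassThru exposes the exit code
$p = Start-Process node -ArgumentList $nodeArgs -NoNewWindow -Wait -PassThru
if ($p.ExitCode -ne 0) {
    Exit 1
}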
In your case, I would also try Enter-PSSession to get a non-interactive prompt on the TFS server. I have seen cases where PowerShell acts differently when the shell is not interactive.

Powershell Invoke-Command remotely with -AsJob failed to launch remote program

I want to launch a long-running remote program but don't want to wait for it. So I put the one-line command below in a PS script file and run "Powershell -file xxx.ps1":
Invoke-command server1 {xxx.exe} -AsJob
But the remote program is not running after the script runs; if I run the above command interactively in a PS console, the remote program runs fine. It seems the program is killed when the script terminates.
Why is this?
UPDATE
If I sleep for 1 second at the end of the script, the remote app runs fine.
The best way around this is probably to use Wait-Job. It will wait for the job you launched until it's done. Otherwise the script just closes, probably before the connection has even been fully established.
If you want to wait for all jobs running in the background to be completed, you can use:
Invoke-command server1 {xxx.exe} -AsJob
Get-Job | Wait-Job