How to get background task status from parent window in PowerShell

I have the following PowerShell script to start two background tasks. I am only able to fetch the status of a background task if I use the -Wait parameter.
$TestResult1=start .\TestFile1.bat -NoNewWindow -PassThru -ErrorAction Stop
$TestResult2=start .\TestFile2.bat -NoNewWindow -PassThru -Wait -ErrorAction Stop
if($TestResult1.ExitCode -gt 0){
throw 'Exceptions in TestFile1.bat'
}
if($TestResult2.ExitCode -gt 0){
throw 'Exceptions in TestFile2.bat'
}
Is there any way to fetch the status of a background task without using the -Wait parameter? In the above example, I can only fetch the status of TestFile2.bat.

If you don't use -Wait, you can use Wait-Process with your $TestResult1 and $TestResult2 variables, which, thanks to -PassThru, contain System.Diagnostics.Process instances representing the processes launched:
# Waits synchronously for both processes to terminate.
$TestResult1, $TestResult2 | Wait-Process
# Now you can inspect the exit codes.
# NOTE: The .ExitCode property is only available after a process
# has *terminated*. Before that, it effectively returns `$null`
# (the underlying .NET exception that occurs is swallowed by PowerShell).
$TestResult1.ExitCode, $TestResult2.ExitCode
If you want to perform other operations while waiting for the processes to terminate, you can use the .HasExited property in a loop to periodically test whether the processes have terminated:
$leftToMonitor = $TestResult1, $TestResult2
do {
# Perform foreground operations...
Write-Host . -NoNewLine; Start-Sleep 1
# Check for processes that have already terminated.
$exited, $leftToMonitor = $leftToMonitor.Where({ $_.HasExited }, 'Split')
foreach ($ps in $exited) {
# Output the command line and the exit code as properties of a custom object.
[pscustomobject] @{
CommandLine = $ps.CommandLine
ExitCode = $ps.ExitCode
}
}
} while ($leftToMonitor)
Note that Wait-Process also has a -Timeout parameter, and you can use -Timeout 0 to momentarily test whether processes have exited; however, for each process that hasn't exited, a non-terminating error is reported, which makes checking the .HasExited property more convenient (and also faster).
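For instance, a minimal sketch of such a momentary check, assuming $TestResult1 holds a process object obtained via Start-Process -PassThru as above:
# Non-blocking check via .HasExited (preferred):
if ($TestResult1.HasExited) {
"Process exited with code $($TestResult1.ExitCode)"
} else {
"Process still running."
}
# Momentary Wait-Process check; a still-running process causes a
# non-terminating error, silenced here.
$TestResult1 | Wait-Process -Timeout 0 -ErrorAction SilentlyContinue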
That said, for invisible background tasks I recommend using
PowerShell jobs rather than Start-Process -NoNewWindow, either via Start-Job or, preferably, via the faster and lighter-weight Start-ThreadJob (ships with PowerShell (Core) 7+; installable with Install-Module ThreadJob in Windows PowerShell), because jobs:
avoid the problem of output from the Start-Process -NoNewWindow-launched process printing to the console without a way to capture it, and, in the absence of -Wait, arriving with unpredictable timing.
instead allow you to collect output in a controlled manner on demand via the Receive-Job cmdlet.
Waiting for jobs to finish, optionally with a timeout, is done via the Wait-Job cmdlet.
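A minimal sketch of that workflow, reusing TestFile1.bat from the question (assumes Start-ThreadJob is available; substitute Start-Job otherwise):
# Run the batch file as a thread job; report its exit code as the job's last output object.
# (With Start-Job, use an absolute path: the job runs in a child process whose
# working directory may differ from the caller's.)
$job = Start-ThreadJob { .\TestFile1.bat; $LASTEXITCODE }
# ... perform other foreground work here ...
$job | Wait-Job | Out-Null    # blocks; add -Timeout <seconds> to limit the wait
$output = $job | Receive-Job  # output, with the exit code as the last object
Remove-Job $job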
Note:
Start-Job creates a hidden PowerShell child process in which to run given commands, which is what makes it slow, whereas Start-ThreadJob uses a thread in the current process.
As of PowerShell 7.1, background jobs do not automatically capture the exit code of the most recent external program they executed, unlike in foreground execution, where the automatic $LASTEXITCODE variable reflects this information. Therefore, unfortunately, $LASTEXITCODE must be reported as part of each job's output, which is cumbersome - see below.
GitHub proposal #5422 suggests adding a .LastExitCode property to job objects to address this limitation.
Examples:
Note:
Instead of calling a batch file, the examples below call a cmd.exe command directly, with /c, but the principle is the same.
As stated, the exit code of the cmd.exe call must be reported as part of the job's output, hence the extra ; $LASTEXITCODE statement after the call.
Simplistic example: Wait synchronously for all jobs to terminate, and report the output, which comprises all stdout and stderr output from cmd.exe followed by the process exit code reported via $LASTEXITCODE:
# Start two thread jobs that call cmd.exe, with different outputs
# and different exit codes.
# Note: If you don't have Start-ThreadJob, you can use Start-Job.
$jobs =
(Start-ThreadJob { cmd /c 'echo ONE'; $LASTEXITCODE }),
(Start-ThreadJob { cmd /c 'echo TWO & exit /b 1'; $LASTEXITCODE })
$jobs | Receive-Job -Wait -AutoRemoveJob
The above yields (note that the output order isn't guaranteed):
ONE
0
TWO
1
Example with continued foreground operation while waiting:
# Start two thread jobs that call cmd.exe, with different outputs
# and different exit codes.
# Note: If you don't have Start-ThreadJob, you can use Start-Job.
$jobs =
(Start-ThreadJob { cmd /c echo ONE; $LASTEXITCODE }),
(Start-ThreadJob { cmd /c 'echo TWO & exit /b 1'; $LASTEXITCODE })
do {
# Perform foreground operations...
Write-Host . -NoNewLine; Start-Sleep 1
# Note: You can also capture *ongoing* job output via repeated Receive-Job calls.
# Find all jobs that have finished.
$finished, $jobs = $jobs.Where({ $_.State -in 'Completed', 'Failed', 'Stopped' }, 'Split')
# Process all finished jobs.
foreach ($job in $finished) {
# Get the job's output and separate it into the actual output
# and the exit code, which is the *last* object.
$output = $job | Receive-Job
$i = 0
$lastExitCode, $actualOutput = $output.Where({ ++$i -eq $output.Count }, 'Split')
# Output a custom object that reflects the original command, the output, and the exit code.
[pscustomobject] @{
Command = $job.Command
Output = $($actualOutput) # unwrap a single-object output collection
ExitCode = $lastExitCode
}
# Remove the job
Remove-Job $job
}
} while ($jobs)
Note:
The above uses the fairly cumbersome $_.State -in 'Completed', 'Failed', 'Stopped' to momentarily test for finished jobs, without waiting.
Ideally, Wait-Job -Timeout 0 could more simply be used, but as of PowerShell 7.1 that doesn't work as expected (the minimum wait period is therefore -Timeout 1, i.e. 1 second) - see GitHub issue #14675.
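If a 1-second wait is acceptable, a sketch (the -Any switch makes Wait-Job return as soon as any one job finishes):
# Wait up to 1 second (the effective minimum) for any one of the jobs to finish.
$firstFinished = Wait-Job -Job $jobs -Any -Timeout 1
if ($firstFinished) { $firstFinished | Receive-Job }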

Related

PowerShell: waiting for output redirect file to unlock after killing process

I have a PowerShell script that:
Starts a new process and redirects the output to two files
Waits with a timeout for the process to complete
Kills the process if it timed out
Reads the contents from the redirected output files
It looks something like this:
$_timeout = 30
$_stdoutfile = "./stdout"
$_stderrfile = "./stderr"
# Start the process
$_process = Start-Process powershell.exe -ArgumentList "-file ""$_cmdfile""" -PassThru -NoNewWindow -RedirectStandardError "$_stderrfile" -RedirectStandardOutput "$_stdoutfile"
# Wait for it to complete, or time out
$_timeout_error = $null
$_process | Wait-Process -Timeout $_timeout -ErrorAction SilentlyContinue -ErrorVariable _timeout_error
# Check if it timed out
if ($_timeout_error) {
# Kill it
$_process | Stop-Process -Force
# (1)
# Wait for the process to exit after the kill command
$_kill_timeout_error = $null
$_process | Wait-Process -Timeout 10 -ErrorAction SilentlyContinue -ErrorVariable _kill_timeout_error
# If the process is still running 10 seconds after the kill command die with an error
if ($_kill_timeout_error) {
Write-Error "Failed to terminate process (waited for 10 seconds after initiating termination)."
Exit 1
}
}
# (2)
# Read the stdout and stderr content that was output
$_stdout = [System.IO.File]::ReadAllText("$_stdoutfile")
$_stderr = [System.IO.File]::ReadAllText("$_stderrfile")
# Delete the files after reading them
Remove-Item -Force "$_stdoutfile"
Remove-Item -Force "$_stderrfile"
The majority of this works properly; the process is killed as expected if it runs too long. The problem I'm having is with the ReadAllText functions. They work fine if the process exits on its own, but if they were killed due to a timeout, these functions fail with the error:
The process cannot access the file
'C:\myfilename' because it is being used by another process.
I figured that perhaps it takes the OS a couple seconds to unlock the files after the process is killed, so I inserted a sleep-poll loop at # (2), but often several minutes later, they're still locked.
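A sketch of the shape of that sleep-poll loop (simplified; the actual code isn't shown above):
# Poll until the file can be opened exclusively, or give up after 30 attempts.
$unlocked = $false
for ($i = 0; $i -lt 30 -and -not $unlocked; $i++) {
try {
$fs = [System.IO.File]::Open($_stdoutfile, 'Open', 'Read', 'None')
$fs.Close()
$unlocked = $true
}
catch {
Start-Sleep -Seconds 1
}
}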
@Santiago Squarzon suggested (in the comments) a way to read the file while it's still locked, which may work, but I also need to be able to delete the files after reading them.
So my questions are:
Is there a way to get these files to naturally unlock more quickly after killing the process?
Is there a way to force-unlock these files with a separate PowerShell command/function?
Unrelated, but is the part in my code around the comment # (1) necessary (waiting for the process to stop after the kill command), or will Stop-Process block until the process is actually stopped?

How to execute separate Jmeter test plans one at a time with powershell?

We have received 20 JMeter test plans, each testing one endpoint, which we need to run. On some tests we need to pass parameters and on others we don't.
My idea was to create a powershell script that loops through the directories and runs a test, waits until finished and then runs the next test. When we develop a new endpoint we just create a new test plan and save it in the appropriate folder and the powershell script will include it next time we loop through tests.
I need the tests to finish before starting the next plan, so I'm looking at something like:
Write-Output "Running Test 1"
$proc = Start-Process -FilePath "C:\JmeterLoadTests\apache-jmeter-5.2.1\bin\jmeter" -ArgumentList "-n -t C:\JmeterLoadTests\test\enpointsType1\test-1-1.jmx -Jduration=10"
$proc.WaitForExit()
Write-Output "Proc 1 Done"
Write-Output "Running Proc 2"
$proc2 = Start-Process -FilePath "C:\JmeterLoadTests\apache-jmeter-5.2.1\bin\jmeter" -ArgumentList "-n -t C:\JmeterLoadTests\test\enpointsType1\test-1-1.jmx -Jduration=10"
$proc2.WaitForExit()
This just launches both tests simultaneously.
My question is then how to make Powershell wait for the previous test to finish.
Your immediate problem is that your Start-Process call is missing the -PassThru switch, which is required for the call to return a System.Diagnostics.Process instance representing the newly launched process.
# ...
# Note the use of -PassThru
$proc = Start-Process -PassThru -FilePath "C:\JmeterLoadTests\apache-jmeter-5.2.1\bin\jmeter" -ArgumentList "-n -t C:\JmeterLoadTests\test\enpointsType1\test-1-1.jmx -Jduration=10"
$proc.WaitForExit()
# ...
Alternatively, if you don't need to examine the process exit code (which $proc.ExitCode in the above command would give you), you can simply use the -Wait switch, which makes Start-Process itself wait for the process to terminate:
# ...
# Note the use of -Wait
Start-Process -Wait -FilePath "C:\JmeterLoadTests\apache-jmeter-5.2.1\bin\jmeter" -ArgumentList "-n -t C:\JmeterLoadTests\test\enpointsType1\test-1-1.jmx -Jduration=10"
# ...
Taking a step back:
To synchronously execute console applications or batch files in the current console window, call them directly, do not use Start-Process (or the System.Diagnostics.Process API it is based on).
Aside from being syntactically easier and less verbose, this has two key advantages:
You can directly capture their output.
It allows you to examine the process exit code via the automatic $LASTEXITCODE variable afterwards.
Assuming that jmeter is a console application (the docs suggest it runs as one when invoked with arguments):
# ...
# Direct invocation in the current window.
# Stdout and stderr output will print to the console by default,
# but can be captured or redirected.
# Note: &, the call operator, isn't strictly needed here,
# but would be if your executable path were quoted
# or contained variable references.
& C:\JmeterLoadTests\apache-jmeter-5.2.1\bin\jmeter -n -t C:\JmeterLoadTests\test\enpointsType1\test-1-1.jmx -Jduration=10
# Use $LASTEXITCODE to examine the process exit code.
# ...
See this answer for more information.
It might be the case that you're suffering from Out-Default cmdlet execution; the easiest fix is separating the commands with semicolons, like:
cmd1;cmd2;cmd3;etc;
this way PowerShell will wait for the previous command to complete before starting the next one.
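For example (a sketch with hypothetical test-plan paths, assuming jmeter is on the PATH):
# Each call is a direct, synchronous console invocation; the next
# one starts only after the previous one finishes.
jmeter -n -t test1.jmx; jmeter -n -t test2.jmx; jmeter -n -t test3.jmx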
It might be a better idea to consider switching to the Maven JMeter Plugin, which by default executes all tests it finds under the src/test/jmeter folder relative to the pom.xml file.

Error handling of command prompt commands in Powershell

My goal is to check, disable and remove Scheduled Tasks on numerous Windows servers using Powershell.
Some of the servers are Windows 2008 R2, so Get-ScheduledTask is out of the question. I have to use schtasks.
Here is what I have thus far
$servers = (Get-ADComputer -Server DomainController -Filter 'OperatingSystem -like "*Server*"').DNSHostname
$servers |
ForEach-Object {
if (Test-Connection -Count 1 -Quiet -ComputerName $_) {
Write-Output "$($_) exists, checking for Scheduled Task"
Invoke-Command -ComputerName $_ {
If((schtasks /query /TN 'SOMETASK')) {
Write-Output "Processing removal of scheduled task`n"
schtasks /change /TN 'SOMETASK' /DISABLE
schtasks /delete /TN 'SOMETASK' /F
}
else {
Write-Output "Scheduled Task does not exist`n"
}
}
}
}
This works fine when SOMETASK exists, but when it doesn't, PowerShell spits out an error like this:
ERROR: The system cannot find the file specified.
+ CategoryInfo : NotSpecified: (ERROR: The syst...file specified.:String) [], RemoteException
+ FullyQualifiedErrorId : NativeCommandError
+ PSComputerName : SERVER1
NotSpecified: (:) [], RemoteException
Scheduled Task does not exist
I can circumvent this behavior by setting $ErrorActionPreference to "SilentlyContinue" but this suppresses other errors I may be interested in. I also tried try/catch, but that still generates the error. I don't think I can add an -ErrorAction argument to an if statement. Can anyone please lend a helping hand?
Thank you,
tl;dr:
Use 2>$null to suppress the stderr output from a call to an external program (such as schtasks.exe).
To work around a bug present up to at least PowerShell [Core] 7.0 (see below), make sure that $ErrorActionPreference is not set to 'Stop'.
# Execute with stderr silenced.
# Rely on the presence of stdout output in the success case only
# to make the conditional true.
if (schtasks /query /TN 'SOMETASK' 2>$null) { # success, task exists
"Processing removal of scheduled task`n"
# ...
}
For background information and more general use cases, read on.
Given how the line from the external program's stderr stream manifests as shown in your question,
it sounds like you're running your code in the PowerShell ISE, which I suggest moving away from: The PowerShell ISE is obsolescent and should be avoided going forward (bottom section of the linked answer).
That the ISE surfaces stderr lines via PowerShell's error stream by default is especially problematic - see this GitHub issue.
The regular console doesn't do that, fortunately - it passes stderr lines through to the host (console), and prints them normally (not in red), which is the right thing to do, given that you cannot generally assume that all stderr output represents errors (the stream's name notwithstanding).
With well-behaved external programs, you should only ever derive success vs. failure from their process exit code (as reflected in the automatic $LASTEXITCODE variable[1]), not from the presence of stderr output: exit code 0 indicates success, any nonzero exit code (typically) indicates failure.
As for your specific case:
In the regular console, the value of the $ErrorActionPreference preference variable does not apply to external programs such as schtasks.exe, except in the form of a bug [fixed in PowerShell 7.2+] when you also use a 2> redirection - see GitHub issue #4002; as of PowerShell 7.1.0-preview.6, the corrected behavior is available as the experimental feature PSNotApplyErrorActionToStderr.
Since your schtasks /query /TN 'SOMETASK' command functions as a test, you can do the following:
# Execute with all streams silenced (both stdout and stderr, in this case).
# schtask.exe will indicate the non-existence of the specified task
# with exit code 1
schtasks /query /TN 'SOMETASK' *>$null
if ($LASTEXITCODE -eq 0) { # success, task exists
"Processing removal of scheduled task`n"
# ...
}
# You can also squeeze it into a single conditional, using
# $(...), the subexpression operator.
if (0 -eq $(schtasks /query /TN 'SOMETASK' *>$null; $LASTEXITCODE)) { # success, task exists
"Processing removal of scheduled task`n"
# ...
}
In your specific case, a more concise solution is possible, which relies on your schtasks command (a) producing stdout output in the case of success (if the task exists) and (b) only doing so in the success case:
# Execute with stderr silenced.
# Rely on the presence of stdout output in the success case only
# to make the conditional true.
if (schtasks /query /TN 'SOMETASK' 2>$null) { # success, task exists
"Processing removal of scheduled task`n"
# ...
}
If schtasks.exe produces stdout output (which maps to PowerShell's success output stream, 1), PowerShell's implicit to-Boolean conversion will consider the conditional $true (see the bottom section of this answer for an overview of PowerShell's to-Boolean conversion rules).
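A few illustrative conversions:
[bool] ''                  # $false - empty string
[bool] 'any output'        # $true  - non-empty string
[bool] @()                 # $false - empty collection
[bool] @('line1', 'line2') # $true  - collection with two or more items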
Note that a conditional only ever acts on the success output stream's output (1); other streams are passed through, as the stderr output (2) is in this case (as you've experienced).
2>$null silences stderr output, by redirecting it to the null device.
1 and 2 are the numbers of PowerShell's success output / error streams, respectively; in the case of external programs, they refer to the stdout (standard output) and stderr (standard error) streams, respectively - see about_Redirection.
You can also capture stderr output with a 2> redirection, if you want to report it later (or need to examine it specifically for an ill-behaved program that doesn't use exit codes properly).
2> stderr.txt sends the stderr lines to file stderr.txt; unfortunately, there is currently no way to capture stderr in a variable - see GitHub issue #4332, which proposes syntax 2>&variableName for that.
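A sketch of that file-based approach (stderr.txt is an arbitrary file name):
# Persist stderr to a file, then inspect it only if the call failed.
schtasks /query /TN 'SOMETASK' 2>stderr.txt
if ($LASTEXITCODE -ne 0) {
Get-Content stderr.txt | Write-Warning
}
Remove-Item stderr.txt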
As implied by the aforementioned bug, you must ensure that $ErrorActionPreference isn't set to 'Stop', because the 2> will then mistakenly trigger a script-terminating error.
Aside from the aforementioned bug, using 2> currently has another unexpected side effect [fixed in PowerShell 7.2+]: The stderr lines are unexpectedly also added to the automatic $Error collection, as if they're errors (which they cannot assumed to be).
The root cause of both issues is that stderr lines are unexpectedly routed via PowerShell's error stream, even though there is no good reason to do so - see GitHub issue #11133.
[1] Note that the automatic $? variable that indicates success vs. failure as a Boolean ($true / $false) is also set, but not reliably so: since stderr output is currently (v7.0) unexpectedly routed via PowerShell's error stream if redirected with 2>, the presence of any stderr output invariably sets $? to $false, even if the external program reports overall success via exit code 0. Therefore, the only reliable way to test for success is $LASTEXITCODE -eq 0, not $?.
Personally I prefer to use the Scheduler ComObject to manage scheduled tasks. You can connect to other servers with it, and search them simply enough to manage their tasks.
$Scheduler = New-Object -ComObject Schedule.Service
$servers = (Get-ADComputer -Server DomainController -Filter 'OperatingSystem -like "*Server*"').DNSHostname
$servers |
ForEach-Object {
if (Test-Connection -Count 1 -Quiet -ComputerName $_) {
Write-Output "$($_) exists, checking for Scheduled Task"
$Scheduler.Connect($_)
$RootFolder = $Scheduler.GetFolder("\")
# GetTask() throws if the task doesn't exist, so probe it inside try / catch.
$TargetTask = try { $RootFolder.GetTask('SOMETASK') } catch { $null }
# If the task wasn't found, move on to the next server.
# Note: inside a ForEach-Object script block, use 'return' rather than
# 'continue'; 'continue' would terminate the entire pipeline.
If(!$TargetTask){
Write-Output "Scheduled Task does not exist`n"
return
}
Write-Output "Processing removal of scheduled task`n"
$TargetTask.Enabled = $false
$RootFolder.DeleteTask('SOMETASK')
}
}
It appears you've way over-complicated this effort.
Why disable and then remove, rather than just remove? That seems a bit redundant.
All scheduled tasks are nothing but XML files and registry entries, which you can just delete if you don't want the task any longer. Thus, you can use Get-ChildItem.
# File system:
(Get-ChildItem -Path "$env:windir\System32\Tasks").FullName
# Results
<#
...
C:\Windows\System32\Tasks\Microsoft
...
C:\Windows\System32\Tasks\MicrosoftEdgeUpdateTaskMachineCore
...
#>
# Registry:
Get-ChildItem -Path 'HKLM:\Software\Microsoft\Windows NT\CurrentVersion\Schedule\Taskcache\Tasks'
# Results
<#
Name Property
---- --------
{01C5B377-A7EB-4FF3-9C6C-86852 Path : \Microsoft\Windows\Management\Provisioning\Logon
...
#>
Get-ChildItem -Path 'HKLM:\Software\Microsoft\Windows NT\CurrentVersion\Schedule\Taskcache\Tree'
# Results
<#
Name Property
---- --------
Adobe Acrobat Update Task SD : {1...
#>
Just select your task by name and delete the file and the regkeys using the normal filesystem cmdlets.
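A sketch of such a manual removal, assuming a task named 'SOMETASK' in the root task folder; the GUID-named key under Taskcache\Tasks is looked up via the Tree entry's Id value:
$taskName = 'SOMETASK'
$base = 'HKLM:\Software\Microsoft\Windows NT\CurrentVersion\Schedule\Taskcache'
# Look up the task's GUID before deleting the Tree entry.
$id = (Get-ItemProperty "$base\Tree\$taskName").Id
# Delete the XML file and both registry entries.
Remove-Item -Force "$env:windir\System32\Tasks\$taskName"
Remove-Item -Recurse -Force "$base\Tree\$taskName"
Remove-Item -Recurse -Force "$base\Tasks\$id"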
So you just want to hide the error message from schtasks? One way is to redirect standard error (stream 2) to $null. This is an example anyone can run as admin. The if statement only works because there's no output to standard out when there's an error. It looks like Invoke-Command generates a remote exception when something comes over standard error, but it doesn't stop the commands that follow. I don't see a way to try/catch it.
invoke-command localhost { if (schtasks /query /tn 'foo' 2>$null) { 'yes' }; 'hi' }
hi

kill child processes when parent ends

My goal is to kill (or somehow gracefully shutdown) child processes that were started by powershell script, so that nothing will keep running after the parent process dies (either normally by hitting end of script or via crash or ctrl+c or any other way).
I've tried several approaches but none worked as expected:
# only one line was active at time, all other were commented
start-job -scriptblock { & 'notepad.exe' } # notepad.exe started, script continues to end, notepad.exe keep running
start-job -scriptblock { 'notepad.exe' } # notepad.exe not started, script continues to end
notepad.exe # notepad.exe started, script continues to end, notepad.exe keep running
& notepad.exe # notepad.exe started, script continues to end, notepad.exe keep running
start-Process -passthru -FilePath notepad.exe # notepad.exe started, script continues to end, notepad.exe keep running
# give script some time before ending
Write-Host "Begin of sleep section"
Start-Sleep -Seconds 5
Write-Host "End of sleep section"
You can do this kind of thing with a finally clause. A finally clause gets executed after a try block, even if the execution of the try block threw an exception or was aborted by the user.
So one way to approach your problem would be the following:
keep track of the process IDs of the child processes your script is spawning, and
kill these processes in the finally clause.
try
{
$process = Start-Process 'notepad.exe' -PassThru
# give script some time before ending
Write-Host "Begin of sleep section"
Start-Sleep -Seconds 5
Write-Host "End of sleep section"
}
finally
{
# Kill the process if it still exists after the script ends.
# This throws an exception if the process ended before the script did.
Stop-Process -Id $process.Id
}
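If you'd rather not have that exception when the process has already exited on its own, a variation: replace the Stop-Process call in the finally block with the following (assumes PowerShell 3.0+ for -ErrorAction Ignore):
# Ignore the error if the process already exited before the script did.
Stop-Process -Id $process.Id -ErrorAction Ignore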

Disable close button of MSBuild child process

I have a PowerShell script that launches an MSBuild child process. I would like to disable the "close" button on the child process window, so that the user cannot interrupt the process. However, I have not been able to find any information indicating whether this is possible.
If someone could either confirm (and tell me how I would go about doing this) or deny whether this is possible I would greatly appreciate it.
MSBuild.exe is a console application, and as such it will run in a console window by default. You can't really "disable" the close button any more than you could stop someone (with the right privileges) from simply terminating the msbuild.exe process...
What you could do to mitigate some risk would be to use the jobs feature that was introduced in PowerShell 2.0:
$job = Start-Job -ScriptBlock {
& msbuild app.csproj
if ($LASTEXITCODE -ne 0) { throw "MSBuild failed. Exit code: $LASTEXITCODE" }
}
This will schedule the script block to run in a hidden background PowerShell process, and no window will be shown for msbuild. All of the output will be captured and held until you decide to retrieve the job. You can check the status of all background jobs with the Get-Job cmdlet, and receive the results with Receive-Job:
Wait-Job $job # this line pauses PowerShell/your script until the job returns
$output = $job | Receive-Job
You can do whatever you want with the output - it is worth noting that the exception thrown if the msbuild exit status code is non-zero will be held until you receive the job, at which point it will be raised to your code like any other exception would be. You may want to consider wrapping your call to Receive-Job in a try/catch block to deal with a failed build.
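A sketch of that wrapping:
try {
    Wait-Job $job | Out-Null      # block until the build job finishes
    $output = $job | Receive-Job  # re-raises the exception thrown in the job, if any
}
catch {
    Write-Warning "Build failed: $_"
}
finally {
    Remove-Job $job -Force
}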
Another option if you don't want a separate window to appear:
$buildArgs = "MySolution.sln", "/t:Build", "/p:Configuration=Debug"
Start-Process -FilePath "msbuild" -ArgumentList $buildArgs -NoNewWindow -Wait
Start-Process has other flags to control redirecting output as well.