kill child processes when parent ends - powershell

My goal is to kill (or somehow gracefully shut down) child processes that were started by a PowerShell script, so that nothing keeps running after the parent process dies (whether normally by hitting the end of the script, or via a crash, Ctrl+C, or any other way).
I've tried several approaches but none worked as expected:
# Only one line was active at a time; all others were commented out.
start-job -scriptblock { & 'notepad.exe' } # notepad.exe started, script continues to end, notepad.exe keeps running
start-job -scriptblock { 'notepad.exe' } # notepad.exe not started, script continues to end
notepad.exe # notepad.exe started, script continues to end, notepad.exe keeps running
& notepad.exe # notepad.exe started, script continues to end, notepad.exe keeps running
Start-Process -PassThru -FilePath notepad.exe # notepad.exe started, script continues to end, notepad.exe keeps running
# give script some time before ending
Write-Host "Begin of sleep section"
Start-Sleep -Seconds 5
Write-Host "End of sleep section"

You can do this kind of thing with a finally clause. A finally clause gets executed after a try block, even if the try block threw an exception or execution was aborted by the user.
So one way to approach your problem would be the following:
keep track of the process IDs of the child processes your script is spawning, and
kill these processes in the finally clause.
try
{
    $process = Start-Process 'notepad.exe' -PassThru
    # Give the script some time before ending.
    Write-Host "Begin of sleep section"
    Start-Sleep -Seconds 5
    Write-Host "End of sleep section"
}
finally
{
    # Kill the process if it still exists after the script ends.
    # This reports an error if the process ended before the script did.
    Stop-Process -Id $process.Id
}
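If you want to avoid that error when the child has already exited on its own, a variation of the finally block (a sketch, not part of the original answer) is to check first:
finally
{
    # Only stop the process if it is still running; -ErrorAction SilentlyContinue
    # also covers the race where it exits between the check and the kill.
    if ($process -and -not $process.HasExited)
    {
        Stop-Process -Id $process.Id -ErrorAction SilentlyContinue
    }
}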

Related

PowerShell: waiting for output redirect file to unlock after killing process

I have a PowerShell script that:
Starts a new process and redirects the output to two files
Waits with a timeout for the process to complete
Kills the process if it timed out
Reads the contents from the redirected output files
It looks something like this:
$_timeout = 30
$_stdoutfile = "./stdout"
$_stderrfile = "./stderr"
# Start the process
$_process = Start-Process powershell.exe -ArgumentList "-file ""$_cmdfile""" -PassThru -NoNewWindow -RedirectStandardError "$_stderrfile" -RedirectStandardOutput "$_stdoutfile"
# Wait for it to complete, or time out
$_timeout_error = $null
$_process | Wait-Process -Timeout $_timeout -ErrorAction SilentlyContinue -ErrorVariable _timeout_error
# Check if it timed out
if ($_timeout_error) {
# Kill it
$_process | Stop-Process -Force
# (1)
# Wait for the process to exit after the kill command
$_kill_timeout_error = $null
$_process | Wait-Process -Timeout 10 -ErrorAction SilentlyContinue -ErrorVariable _kill_timeout_error
# If the process is still running 10 seconds after the kill command, die with an error.
if ($_kill_timeout_error) {
Write-Error "Failed to terminate process (waited for 10 seconds after initiating termination)."
Exit 1
}
}
# (2)
# Read the stdout and stderr content that was output
$_stdout = [System.IO.File]::ReadAllText("$_stdoutfile")
$_stderr = [System.IO.File]::ReadAllText("$_stderrfile")
# Delete the files after reading them
Remove-Item -Force "$_stdoutfile"
Remove-Item -Force "$_stderrfile"
The majority of this works properly; the process is killed as expected if it runs too long. The problem I'm having is with the ReadAllText calls. They work fine if the process exits on its own, but if it was killed due to a timeout, they fail with the error:
The process cannot access the file
'C:\myfilename' because it is being used by another process.
I figured that perhaps it takes the OS a couple seconds to unlock the files after the process is killed, so I inserted a sleep-poll loop at # (2), but often several minutes later, they're still locked.
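For illustration, such a poll loop at # (2) might look roughly like this (a hypothetical sketch, not the code from the question; it just retries an exclusive open until it succeeds or a retry budget runs out):
$retries = 30
while ($retries-- -gt 0) {
    try {
        # Try to open the file with no sharing; success means it is no longer locked.
        $fs = [System.IO.File]::Open($_stdoutfile, 'Open', 'Read', 'None')
        $fs.Close()
        break
    } catch {
        # Still locked (or otherwise inaccessible); wait and retry.
        Start-Sleep -Seconds 1
    }
}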
@Santiago Squarzon suggested (in the comments) a way to read the file while it's still locked, which may work, but I also need to be able to delete them after reading them.
So my questions are:
Is there a way to get these files to naturally unlock more quickly after killing the process?
Is there a way to force-unlock these files with a separate PowerShell command/function?
Unrelated, but is the part in my code around the comment # (1) necessary (waiting for the process to stop after the kill command), or will Stop-Process block until the process is actually stopped?

PowerShell script waiting for user input and won't exit

I am trying to run a script silently. It runs fine, but after it has run it displays:
Succeeded : 0
Press 'Enter' to continue
How can I check whether it succeeded and then send the Enter key?
Note that I am running this via Start-Process as shown below, but because the program waits for the user to press Enter, it never exits:
Start-Process -Wait -FilePath "C:\windows\temp\abc.exe" -ArgumentList '/S','/v','/qn' -passthru
Your best bet is to check if your executable supports a command-line parameter (option) that skips the prompt at the end.
If there is none, you can try the following approach, but note that, as with all attempts to control an application via simulated user interaction, the solution is awkward and brittle:
# Comment this out to hide the verbose messages.
$VerbosePreference = 'Continue'
# Load helper assemblies.
Add-Type -ErrorAction Stop -AssemblyName Microsoft.VisualBasic, System.Windows.Forms
# Launch the external program.
# In this simple example, cmd.exe is invoked with its internal pause
# command, which waits for a keystroke to continue.
Write-Verbose 'Launching the external program asynchronously...'
# IMPORTANT: Do NOT use -Wait, as that will block execution
# indefinitely, and prevent you from sending the Enter keystroke.
$process = Start-Process -PassThru cmd '/c pause'
Write-Verbose 'Sleeping for as long as execution is expected to last at a minimum...'
Start-Sleep 5 # Adjust this as needed.
Write-Verbose 'Sending ENTER keystrokes until the window closes...'
while (-not $process.HasExited) {
# To be safe, activate the external program's window. If that fails, it must be closed already.
try { [Microsoft.VisualBasic.Interaction]::AppActivate($process.Id) } catch { break }
# Send the keystroke.
[System.Windows.Forms.SendKeys]::SendWait('{Enter}')
Start-Sleep -Milliseconds 200 # Sleep a little between attempts.
}
Write-Verbose 'The external program''s window is now closed.'
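Applied to the abc.exe call from the question, only the launch line changes; the switches are the ones from the question, and the key point is the absence of -Wait:
# Launch asynchronously so the keystroke loop above can run; switches as in the question.
$process = Start-Process -PassThru -FilePath 'C:\windows\temp\abc.exe' -ArgumentList '/S','/v','/qn'
# ...then reuse the Start-Sleep / AppActivate / SendKeys loop above unchanged.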

How to get background task status from parent window in PowerShell

I have the following PowerShell script to start two background tasks. I am only able to fetch the status of a background task if I use the -Wait parameter.
$TestResult1=start .\TestFile1.bat -NoNewWindow -PassThru -ErrorAction Stop
$TestResult2=start .\TestFile2.bat -NoNewWindow -PassThru -Wait -ErrorAction Stop
if($TestResult1.ExitCode -gt 0){
throw 'Exceptions in TestFile1.bat'
}
if($TestResult2.ExitCode -gt 0){
throw 'Exceptions in TestFile2.bat'
}
Is there any way to fetch the status of a background task without using the -Wait parameter? In the above example, I can fetch the status only from TestFile2.bat.
If you don't use -Wait, you can use Wait-Process with your $TestResult1 and $TestResult2 variables, which, thanks to -PassThru, contain System.Diagnostics.Process instances representing the processes launched:
# Waits synchronously for both processes to terminate.
$TestResult1, $TestResult2 | Wait-Process
# Now you can inspect the exit codes.
# NOTE: The .ExitCode property is only available after a process
# has *terminated*. Before that, it effectively returns `$null`
# (the underlying .NET exception that occurs is swallowed by PowerShell).
$TestResult1.ExitCode, $TestResult2.ExitCode
If you want to perform other operations while waiting for the processes to terminate, you can use the .HasExited property in a loop to periodically test whether the processes have terminated:
$leftToMonitor = $TestResult1, $TestResult2
do {
    # Perform foreground operations...
    Write-Host . -NoNewLine; Start-Sleep 1
    # Check for processes that have already terminated.
    $exited, $leftToMonitor = $leftToMonitor.Where({ $_.HasExited }, 'Split')
    foreach ($ps in $exited) {
        # Output the command line and the exit code as properties of a custom object.
        [pscustomobject] @{
            CommandLine = $ps.CommandLine
            ExitCode    = $ps.ExitCode
        }
    }
} while ($leftToMonitor)
Note that Wait-Process also has a -Timeout parameter, and you can use -Timeout 0 to momentarily test whether processes have exited; however, a non-terminating error is reported for each process that hasn't exited, which makes checking the .HasExited property more convenient (and also faster).
That said, for invisible background tasks I recommend using
PowerShell jobs, either via Start-Job or, preferably, via the faster and lighter-weight Start-ThreadJob (comes with PowerShell (Core) 7+, installable with Install-Module ThreadJob in Windows PowerShell), rather than Start-Process -NoNewWindow, because jobs:
avoid the problem of output from the Start-Process -NoNewWindow-launched process printing straight to the console, where it cannot be captured and, without -Wait, arrives with unpredictable timing.
instead allow you to collect output in a controlled manner, on demand, via the Receive-Job cmdlet.
Waiting for jobs to finish, optionally with a timeout, is done via the Wait-Job cmdlet.
Note:
Start-Job creates a hidden PowerShell child process in which to run given commands, which is what makes it slow, whereas Start-ThreadJob uses a thread in the current process.
As of PowerShell 7.1, background jobs do not automatically capture the exit code of the most recently executed external program, unlike in foreground execution, where the automatic $LASTEXITCODE variable reflects this information. Therefore, unfortunately, $LASTEXITCODE must be reported as part of each job's output, which is cumbersome - see below.
GitHub proposal #5422 suggests adding a .LastExitCode property to job objects to address this limitation.
Examples:
Note:
Instead of calling a batch file, the examples below call a cmd.exe command directly, with /c, but the principle is the same.
As stated, the exit code of the cmd.exe call must be reported as part of the job's output, hence the extra ; $LASTEXITCODE statement after the call.
Simplistic example: Wait synchronously for all jobs to terminate, and report the output, which comprises all stdout and stderr output from cmd.exe followed by the process exit code reported via $LASTEXITCODE:
# Start two thread jobs that call cmd.exe, with different outputs
# and different exit code.
# Note: If you don't have Start-ThreadJob, you can use Start-Job
$jobs =
(Start-ThreadJob { cmd /c 'echo ONE'; $LASTEXITCODE }),
(Start-ThreadJob { cmd /c 'echo TWO & exit /b 1'; $LASTEXITCODE })
$jobs | Receive-Job -Wait -AutoRemoveJob
The above yields (note that the output order isn't guaranteed):
ONE
0
TWO
1
Example with continued foreground operation while waiting:
# Start two thread jobs that call cmd.exe, with different outputs
# and different exit code.
# Note: If you don't have Start-ThreadJob, you can use Start-Job
$jobs =
(Start-ThreadJob { cmd /c echo ONE; $LASTEXITCODE }),
(Start-ThreadJob { cmd /c 'echo TWO & exit /b 1'; $LASTEXITCODE })
do {
    # Perform foreground operations...
    Write-Host . -NoNewLine; Start-Sleep 1
    # Note: You can also capture *ongoing* job output via repeated Receive-Job calls.
    # Find all jobs that have finished.
    $finished, $jobs = $jobs.Where({ $_.State -in 'Completed', 'Failed', 'Stopped' }, 'Split')
    # Process all finished jobs.
    foreach ($job in $finished) {
        # Get the job's output and separate it into the actual output
        # and the exit code, which is the *last* object.
        $output = $job | Receive-Job
        $i = 0
        $lastExitCode, $actualOutput = $output.Where({ ++$i -eq $output.Count }, 'Split')
        # Output a custom object that reflects the original command, the output, and the exit code.
        [pscustomobject] @{
            Command  = $job.Command
            Output   = $($actualOutput) # unwrap a single-object output collection
            ExitCode = $lastExitCode
        }
        # Remove the job.
        Remove-Job $job
    }
} while ($jobs)
Note:
The above uses the fairly cumbersome $_.State -in 'Completed', 'Failed', 'Stopped' to momentarily test for finished jobs, without waiting.
Ideally, Wait-Job -Timeout 0 could more simply be used, but as of PowerShell 7.1 that doesn't work as expected (the minimum wait period is therefore -Timeout 1, i.e. 1 second) - see GitHub issue #14675.
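A momentary check via Wait-Job therefore needs a timeout of at least 1 second; a minimal sketch (not part of the original answer):
# Returns the first job that finishes within ~1 second, or nothing if none does.
$firstFinished = Wait-Job -Job $jobs -Any -Timeout 1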

How to "terminate" Get-Content -Wait after another command has finished in batch?

I'm writing on a batch script for Unity builds with Jenkins.
What I did so far
Unity has the problem that by default it is not very verbose in -batchmode.
So in order to get the output into Jenkins I use -logFile to make Unity write to a specific logfile.
Until now I'm only able to read this file after a build has succeeded or failed, using
Unity.exe -batchmode -logFile JenkinsLOG.txt <more parameters>
type JenkinsLOG.txt
Now, in order to get the content of JenkinsLOG.txt into the Jenkins log view in real time, I'm trying to use start to run the Unity process in a new console and powershell Get-Content <file> -Wait to print the content of the logfile to the console as it is written:
start Unity.exe -batchmode -logFile JenkinsLOG.txt <more parameters>
powershell Get-Content JenkinsLOG.txt -Wait
This works great and I see the output appearing in Jenkins in real time ...
But ... of course the powershell command never terminates, so the build process gets stuck, still waiting for more lines being appended to JenkinsLOG.txt.
So my question is
Is there any possibility to tell this powershell command to terminate after the Unity process has finished?
Get the ID of the process you want to monitor
Spin up a job that tails the log file
Loop in the current console, reading output from the job
Terminate the loop when the process exits
Clean up jobs
Here it is wrapped up in a function. There is probably a more elegant way, but I couldn't find another way to distinguish between a "timeout" exit of Wait-Process and a "process stopped" exit.
function TailFile-UntilProcessStops {
    Param ($processID, $filePath)
    $loopBlock = {
        Param ($filePath) Get-Content $filePath -Wait -Tail 0
    }
    $TailLoopJob = Start-Job -ScriptBlock $loopBlock -ArgumentList $filePath
    try {
        do {
            # Relay whatever the tail job has produced so far.
            $TailLoopJob | Receive-Job
            try {
                # With -ErrorAction Stop, Wait-Process throws if the 1-second
                # timeout elapses while the process is still running...
                Wait-Process -Id $processID -ErrorAction Stop -Timeout 1
                # ...so reaching this line means the process has exited.
                $waitMore = $false
            } catch {
                $waitMore = $true
            }
        } while ($waitMore)
    } finally {
        Stop-Job $TailLoopJob
        Remove-Job $TailLoopJob
    }
}
Here is the test code with Notepad. Make sure the file exists, then modify it. Every time you save, the console should update with more data. Quit Notepad and control returns to the console.
$filename = 'h:\asdf\somefile.txt'
$process = Start-Process -FilePath 'notepad.exe' -ArgumentList @($filename) -PassThru
TailFile-UntilProcessStops -processID $process.id -filepath $filename
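Applied to the Unity scenario from the question, the call might look like this (a sketch; the additional Unity parameters are placeholders for whatever your build needs):
$unity = Start-Process -FilePath 'Unity.exe' -ArgumentList '-batchmode', '-logFile', 'JenkinsLOG.txt' -PassThru
TailFile-UntilProcessStops -processID $unity.Id -filePath 'JenkinsLOG.txt'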

Windows powershell script triggered by PSExec is not killing Powershell process when it finishes running

I need to complete a series of tasks across several Windows 2008 servers that require elevated permissions, so I've had to create a series of scheduled tasks that I'm running via psexec. Since they have to run in sequence, I found and modified a PowerShell script that 'stalls' until the scheduled tasks are completed on the remote machines. The problem I have is that when I launch the script with psexec on the remote machine, once it completes running (indicated by a message in the console output), PowerShell.exe doesn't exit cleanly but rather hangs around and holds up the entire process. I need PowerShell to quit after the delay script exits, but even with an exit keyword at the end it stays in memory and prevents the process from finishing. I'm not terribly experienced with PowerShell, so I'll attach my script in case I'm doing something foolish in it:
while ($true) {
    $status = schtasks /query /tn "AutoDeploy" | select-string -patt "AutoDeploy"
    if ($status.tostring().substring(64,7) -eq "Running") { "Waiting for task to complete..." } else { break }
    start-sleep -s 5
}
"Task complete."
exit
Thanks in advance for any insight.
This works for me (using a different task name) and doesn't hang psexec:
$taskName = "AutoDeploy"
while (1)
{
    $stat = schtasks /query /tn $taskName |
            Select-String "$taskName.*?\s(\w+)\s*$" |
            Foreach {$_.Matches[0].Groups[1].value}
    if ($stat -ne 'Running')
    {
        "Task completed"
        break
    }
    "Waiting for task to complete"
    Start-Sleep 5
}
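If the fixed-column / regex parsing of the schtasks text output ever proves brittle, an alternative sketch (assuming the schtasks on those servers supports CSV output) is to parse the CSV form instead:
# /fo CSV yields a header row that ConvertFrom-Csv turns into objects with a Status property.
$stat = (schtasks /query /tn $taskName /fo CSV | ConvertFrom-Csv).Status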