Timing Out a Command Line Execution in PowerShell

I'm building a PowerShell script to run several Java programs. Some of them may take longer than the maximum time I'd like to allow; if that happens, I want to stop them and continue forward. So far I've used this:
$j = Start-Job -ScriptBlock { java -jar .\Tournament.jar Students.csv 7777 }
Wait-Job $j -Timeout 10 | Out-Null
if ($j.State -eq "Completed") { "Done!" }
elseif ($j.State -eq "Running") { "Program took longer than the max limit." }
Remove-Job -Force $j
It doesn't seem to recognize that the jar file is still running, as it flags the "Completed" condition as true even though the jar should take at least 70 seconds to execute. It does, however, seem to shut down the jar execution, because the jar never creates its output file.
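(A hedged diagnostic aside: in Windows PowerShell, background jobs do not start in the caller's working directory, so java may be failing immediately because it cannot find .\Tournament.jar. Capturing the job's output with Receive-Job before removing the job would show this; the sketch below keeps the structure above and passes the current directory into the job.)
$j = Start-Job -ScriptBlock {
    param($dir)
    Set-Location $dir                              # jobs don't inherit the caller's working directory
    java -jar .\Tournament.jar Students.csv 7777
} -ArgumentList $PWD.Path
Wait-Job $j -Timeout 10 | Out-Null
Receive-Job $j                                     # shows java's stdout/stderr, e.g. "Unable to access jarfile"
if ($j.State -eq "Completed") { "Done!" }
elseif ($j.State -eq "Running") { "Program took longer than the max limit."; Stop-Job $j }
Remove-Job -Force $j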
Update 11/30/2015
Replaced the above code with this and it seems to work:
$p = Start-Process java -ArgumentList '-jar', '.\Tournament.jar', 'Students.csv', '7777' -RedirectStandardError '.\console.err' -NoNewWindow -PassThru
if (-not $p.WaitForExit(10000)) {
    "Program took longer than max time"
    $p.Kill()
}
Any additional comments would be appreciated.

Related

PowerShell: waiting for output redirect file to unlock after killing process

I have a PowerShell script that:
Starts a new process and redirects the output to two files
Waits with a timeout for the process to complete
Kills the process if it timed out
Reads the contents from the redirected output files
It looks something like this:
$_timeout = 30
$_stdoutfile = "./stdout"
$_stderrfile = "./stderr"
# Start the process
$_process = Start-Process powershell.exe -ArgumentList "-file ""$_cmdfile""" -PassThru -NoNewWindow -RedirectStandardError "$_stderrfile" -RedirectStandardOutput "$_stdoutfile"
# Wait for it to complete, or time out
$_timeout_error = $null
$_process | Wait-Process -Timeout $_timeout -ErrorAction SilentlyContinue -ErrorVariable _timeout_error
# Check if it timed out
if ($_timeout_error) {
    # Kill it
    $_process | Stop-Process -Force
    # (1)
    # Wait for the process to exit after the kill command
    $_kill_timeout_error = $null
    $_process | Wait-Process -Timeout 10 -ErrorAction SilentlyContinue -ErrorVariable _kill_timeout_error
    # If the process is still running 10 seconds after the kill command, die with an error
    if ($_kill_timeout_error) {
        Write-Error "Failed to terminate process (waited for 10 seconds after initiating termination)."
        Exit 1
    }
}
# (2)
# Read the stdout and stderr content that was output
$_stdout = [System.IO.File]::ReadAllText("$_stdoutfile")
$_stderr = [System.IO.File]::ReadAllText("$_stderrfile")
# Delete the files after reading them
Remove-Item -Force "$_stdoutfile"
Remove-Item -Force "$_stderrfile"
The majority of this works properly; the process is killed as expected if it runs too long. The problem I'm having is with the ReadAllText calls. They work fine if the process exits on its own, but if it was killed due to a timeout, they fail with the error:
The process cannot access the file
'C:\myfilename' because it is being used by another process.
I figured that perhaps it takes the OS a couple of seconds to unlock the files after the process is killed, so I inserted a sleep-poll loop at # (2), but the files are often still locked several minutes later.
@Santiago Squarzon suggested (in the comments) a way to read the files while they're still locked, which may work, but I also need to be able to delete them after reading them.
So my questions are:
Is there a way to get these files to naturally unlock more quickly after killing the process?
Is there a way to force-unlock these files with a separate PowerShell command/function?
Unrelated, but is the part in my code around the comment # (1) necessary (waiting for the process to stop after the kill command), or will Stop-Process block until the process is actually stopped?
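For reference, a sleep-poll loop around the read, like the one described at # (2), might look like this (a minimal sketch; the attempt count, delay, and exception type are illustrative, not from the original script):
# Retry reading the redirected stdout until the file is no longer locked, or give up after ~30 seconds
$_stdout = $null
for ($i = 0; $i -lt 30 -and $null -eq $_stdout; $i++) {
    try {
        $_stdout = [System.IO.File]::ReadAllText("$_stdoutfile")
    } catch [System.IO.IOException] {
        Start-Sleep -Seconds 1
    }
}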

Start-Job: Call another script within a ScriptBlock (Loop)

I have another problem that I can't solve by myself, even using the search.
I have a script that starts Robocopy as a job, and a second job that watches whether that script is still running; if not, it sends an e-mail.
Now I want to extend the watch job so that it starts the whole script again (a loop):
& "$PSScriptRoot\Sync_Start&Watch.ps1"
Robocopy-Job:
$Script_Block_Sync = {
    param ($rc_logfile, $rc_source, $rc_destination)
    robocopy $rc_source $rc_destination /MIR /SEC /SECFIX /COPYALL /m /r:5 /Mon:1 /Mot:1 /unilog:$rc_logfile
}
Start-Job -Name Robocopy_Sync -ScriptBlock $Script_Block_Sync -ArgumentList $rc_logfile, $rc_source, $rc_destination
Watch_Job:
$Script_Block_Check = {
    param ($MailParams, $mailbody_error, $PSScriptRoot)
    while ((Get-Process Robocopy).Responding) { Start-Sleep -Seconds 30 }
    if (!(Get-Process Robocopy).Responding) {
        Send-MailMessage @MailParams -Body $mailbody_error
        & "$PSScriptRoot\Sync_Start&Watch.ps1"
    }
}
Start-Job -Name Robocopy_Check -ScriptBlock $Script_Block_Check -ArgumentList $MailParams, $mailbody_error, $PSScriptRoot
I've tried $PSScriptRoot, the full path, and a separate $script variable. If I run only that line (F8), or the whole if block (F8), the script starts running.
If it's not possible to start another script, maybe another job which starts the script?
Any idea what I missed, or is it still not possible?
Thank you for any help!
Best regards
After a week of searching and trying multiple variations, I found a way to get the loop I wanted. It's not exactly what I had in mind, but it works. If someone reads this and has another or even better idea, you're very welcome! My idea was to start a script via Task Scheduler that runs everything in the background without showing a "Running" state.
FYI, regarding my problem: I had missed the point that PS jobs only run in the session they were started in, so everything works fine when I test the script in the PowerShell ISE, but not via Task Scheduler, because when the script (session) ends, the jobs started within it end too.
So to get a loop, I use the following code:
while (!(Get-Process Robocopy).Responding) {
    $Script_Block = {...}
    Start-Job -Name Robocopy -ScriptBlock $Script_Block
    Start-Sleep -Seconds 5
    while ((Get-Process Robocopy).Responding) {
        Start-Sleep -Seconds 60
    }
    Send-MailMessage ...
}
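A hedged aside on the Task Scheduler point: since the jobs die with the session, another option is to have the scheduled script itself wait on its jobs before exiting, so the session stays alive for as long as the jobs run. A minimal sketch, reusing the Start-Job calls from above:
Start-Job -Name Robocopy_Sync -ScriptBlock $Script_Block_Sync -ArgumentList $rc_logfile, $rc_source, $rc_destination
Start-Job -Name Robocopy_Check -ScriptBlock $Script_Block_Check -ArgumentList $MailParams, $mailbody_error, $PSScriptRoot
# Keep the parent session (and therefore the jobs) alive until the jobs finish
Get-Job | Wait-Job | Out-Null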

Run a external program with parameters and wait for it to end in Powershell

I've found many proposed solutions to this question, but none of them works.
The program I'd like to run in PowerShell is Reaper, a digital audio workstation, and I'm going to use its command-line tool for batch-processing audio files within a PS script. The code related to Reaper is as below:
reaper -batchconvert $output_path\audio\Reaper_filelist.txt
I want to use Start-Process with the -Wait parameter so that my script waits for it to finish and then goes on to the next line of code, which is a Rename-Item call.
ls $processed_audio_path | Rename-Item -NewName {$_.name.Replace("- ", "")}
If PowerShell doesn't wait for the process to finish, the next line throws an error, something like "no such file was found in the directory".
The suggestions I've found are here, here, and here, but none of them works. The problem is that Reaper doesn't accept the arguments when they're passed separately, as in:
$exe = "reaper"
$arguments = "-batchconvert $output_path\audio\Reaper_filelist.txt"
Start-Process -filepath $exe -argumentlist $arguments -wait
or:
Start-Process -filepath reaper -argumentlist "-batchconvert $output_path\audio\Reaper_filelist.txt" -Wait
or:
Start-Process -filepath reaper -argumentlist @("-batchconvert", "$output_path\audio\Reaper_filelist.txt") -Wait
It only works without a problem when invoked as a single command line, like the first code line above.
So what can I do now?
I've found a solution to this problem.
Some more context: I always have Reaper running in the background with Windows. When the script calls Reaper's BatchConvert function, it fires up another instance of Reaper, so there are two instances while the audio files are being converted. The number of Reaper instances is therefore a reliable condition for gating the following code. I found something useful here and here.
Finally, I got my code like this and it works:
# Batch converting through Reaper FX Chain
reaper -batchconvert $output_path\audio\Reaper_filelist.txt
while (@(Get-Process reaper).Count -eq 2) {
    Start-Sleep -Milliseconds 500
}
# Correct the Wrong file name produced by Reaper
ls $processed_audio_path | Rename-Item -NewName {$_.name.Replace("- ", "")}
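A small variation on the same idea, in case the background instance of Reaper isn't always running: record the instance count before starting the conversion, then wait until it drops back. A sketch; -ErrorAction SilentlyContinue just avoids an error when no Reaper process exists:
# Count the Reaper instances that exist before the batch conversion starts
$before = @(Get-Process reaper -ErrorAction SilentlyContinue).Count
reaper -batchconvert $output_path\audio\Reaper_filelist.txt
# Wait until the extra instance started by -batchconvert has exited
while (@(Get-Process reaper -ErrorAction SilentlyContinue).Count -gt $before) {
    Start-Sleep -Milliseconds 500
}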
As one of the comments mentions, it could be that the process starts another process, causing PowerShell to move along in the script.
If that's the case, you could have a while statement to wait for the file to be created.
while (!(Test-Path "$output_path\audio\Reaper_filelist.txt")) { Start-Sleep 10 }

How do I wait for a series of batch files to complete before continuing with my code execution?

I have a series of 15 batch files I need to run as part of a PowerShell script. I currently have a working version of this, but I execute each batch file as below:
Start-Process -FilePath "Script1.bat" -WorkingDirectory "C:\Root" -Wait
Start-Process -FilePath "Script2.bat" -WorkingDirectory "C:\Root" -Wait
Start-Process -FilePath "Script3.bat" -WorkingDirectory "C:\Root" -Wait
and so on.
The batch files are lumping thousands of SQL scripts into one big script, and some of them are much slower than others (the SQL populates 5 different databases, and 1 of them is much larger than the other 4).
I'm trying to improve the speed at which the 15 scripts run, and I thought a good idea would be to run the batch files in parallel so that all of the SQL files are created at the same time, rather than in sequence. To do this I removed the "-Wait"s, and can see all of the command line windows opening simultaneously.
The problem I'm having is that after I have created the SQL files, I'm using Copy-Item to move the scripts to the desired location. Because the Copy-Item commands are no longer waiting for the batch files to execute, it's attempting to copy the SQL files from the folder they are created in before the batch files have finished creating them.
So basically I'm trying to come up with a way to "wait" for the collection of batch files to run, so I can ensure they have finished before I start copying the files. I've looked online for ages and have tried using PowerShell jobs with the Wait-Job command, but that only waits until every batch file has been launched, not until they have completed. Does anyone have any ideas on how this can be achieved?
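For what it's worth, the Wait-Job approach can work if each job runs its batch file synchronously, so that the job itself does not finish until the batch file does. A hedged sketch, with the script names and working directory taken from the snippets above:
$jobs = 1..15 | ForEach-Object {
    Start-Job -ScriptBlock {
        param($n)
        # -Wait keeps the job running until the batch file has finished
        Start-Process -FilePath "C:\Root\Script$n.bat" -WorkingDirectory "C:\Root" -Wait
    } -ArgumentList $_
}
$jobs | Wait-Job | Out-Null
# The Copy-Item commands can safely run here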
I'm thinking: put all the process objects in an array... and wait for all of them to have exited...
# Start all 15 batch files and keep the returned process objects
$p = 1..15 | ForEach-Object {
    Start-Process -FilePath "Script$_.bat" -WorkingDirectory "C:\Root" -PassThru
}
# Loop while any of the processes has not yet exited
while (($p.HasExited -eq $false).Count) {
    Start-Sleep -Milliseconds 100
}
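Since -PassThru collects the process objects in an array, Wait-Process can also wait on all of them at once and avoid the polling loop; a sketch of the same idea:
$p = 1..15 | ForEach-Object {
    Start-Process -FilePath "Script$_.bat" -WorkingDirectory "C:\Root" -PassThru
}
# Blocks until every process in $p has exited
$p | Wait-Process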
I'd use an event here.
Instead of triggering the batch with what you have, try the following:
$bat3 = Start-Process -FilePath "Script3.bat" -WorkingDirectory "C:\Root" -Wait -PassThru
Notice, you're saving the process object to a variable and you've added a -PassThru parameter (so an object is returned by the cmdlet). Next, create a script block with whatever you want to happen when the batch script finishes (the braces around the variable name are required because it contains hyphens):
${things-todo-when-batchfile-completes} = {
    Copy-Item -Path ... -Destination ...
    Get-EventSubscriber | Unregister-Event
}
Make sure to end the block with Get-EventSubscriber | Unregister-Event. Next, create an event handler:
$job = Register-ObjectEvent -InputObject $bat3 `
    -EventName Exited `
    -SourceIdentifier Batch3Handler `
    -Action ${things-todo-when-batchfile-completes}
Now, when the task finishes running, the handler will execute the actions in ${things-todo-when-batchfile-completes}.
This should let you kick off the batch files, and when they finish running, you execute the file copy you were after.
Microsoft has a blog post on the technique in the piece Weekend Scripter: Playing with PowerShell Processes and Events.

How to "terminate" Get-Content -Wait after another command has finished in batch?

I'm writing a batch script for Unity builds with Jenkins.
What I did so far
Unity has the problem that, by default, it is not very verbose in -batchmode.
So in order to get the output into Jenkins I use -logFile to make Unity write to a specific logfile.
Until now, I've only been able to read this file after a build has succeeded or failed, using
Unity.exe -batchmode -logFile JenkinsLOG.txt <more parameters>
type JenkinsLOG.txt
Now, in order to get the content of JenkinsLOG.txt into the Jenkins log view in real time, I'm trying to use start to run the Unity process in a new console and powershell Get-Content <file> -Wait to print the content of the logfile to the console as it is written:
start Unity.exe -batchmode -logFile JenkinsLOG.txt <more parameters>
powershell Get-Content JenkinsLOG.txt -Wait
This works great and I see the output appearing in Jenkins in real time... but of course the powershell command never terminates, so the build process gets stuck, still waiting for more lines to be appended to JenkinsLOG.txt.
So my question is
Is there any possibility to tell this powershell command to terminate after the Unity process finished?
Get process id you want to monitor
Spin up a job that tails the log file
Loop the current console to read output from the job
Terminate the loop when the process exits
Clean up jobs
Here it is wrapped up in a function. There is probably a more elegant way, but I couldn't find another way to distinguish between a "timeout" exit of Wait-Process and a "process stopped" exit.
function TailFile-UntilProcessStops {
    Param ($processID, $filePath)
    $loopBlock = {
        Param ($filePath)
        Get-Content $filePath -Wait -Tail 0
    }
    $TailLoopJob = Start-Job -ScriptBlock $loopBlock -ArgumentList $filePath
    try {
        do {
            # Relay whatever the tail job has read so far
            $TailLoopJob | Receive-Job
            try {
                # Throws on timeout while the process is still running; returns normally once it exits
                Wait-Process -Id $processID -ErrorAction Stop -Timeout 1
                $waitMore = $false
            } catch {
                $waitMore = $true
            }
        } while ($waitMore)
    } finally {
        Stop-Job $TailLoopJob
        Remove-Job $TailLoopJob
    }
}
Here is the test code with Notepad. Make sure the file exists, then modify it. Every time you save, the console should update with more data. Quit Notepad and control returns to the console.
$filename = 'h:\asdf\somefile.txt'
$process = Start-Process -FilePath 'notepad.exe' -ArgumentList @($filename) -PassThru
TailFile-UntilProcessStops -processID $process.Id -filePath $filename
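To tie this back to the Jenkins scenario, the function could be wired up to the Unity build roughly like this (a sketch; the arguments come from the question, and <more parameters> stays whatever the build needs):
# Launch Unity without waiting, then tail its logfile until the process exits
# Note: as in the Notepad test above, JenkinsLOG.txt should exist before the tail job attaches to it
$unity = Start-Process -FilePath 'Unity.exe' -ArgumentList '-batchmode', '-logFile', 'JenkinsLOG.txt' -PassThru
TailFile-UntilProcessStops -processID $unity.Id -filePath 'JenkinsLOG.txt'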