Highly influenced by other questions here on Stack Overflow, I have ended up with the following method for starting processes from my PowerShell scripts:
function global:system-diagnostics-processstartinfo {
[CmdletBinding(SupportsShouldProcess=$True,ConfirmImpact='Low')]
param
(
[Parameter(Mandatory=$True,HelpMessage='Full path to executable')]
[Alias('executable')]
[string]$exe,
[Parameter(Mandatory=$True,HelpMessage='All arguments to be sent to executable')]
[Alias('args')]
[string]$arguments
)
if (!(Test-Path $exe)) {
$log.errorFormat("Did not find executable={0}, aborting script", $exe)
exit 1
}
$log.infoFormat("Start executable={0} with arguments='{1}'", $exe, $arguments)
$processStartInfo = New-Object System.Diagnostics.ProcessStartInfo($exe)
$processStartInfo.FileName = $exe
$processStartInfo.RedirectStandardError = $true
$processStartInfo.RedirectStandardOutput = $true
$processStartInfo.UseShellExecute = $false
$processStartInfo.Arguments = $arguments
$p = New-Object System.Diagnostics.Process
$p.StartInfo = $processStartInfo
$log.info("Start executable and wait for exit")
$p.Start() | Out-Null
#$p.WaitForExit()
$stdout = $p.StandardOutput.ReadToEnd()
$stderr = $p.StandardError.ReadToEnd()
$log.infoFormat("executable={0} stdout: {1}", $exe, $stdout)
$log.debugFormat("executable={0} stderr: {1}", $exe, $stderr)
$global:ExitCode = $p.ExitCode
$log.debugFormat("executable={0} Exitcode: {1}", $exe, $p.ExitCode)
return $stdout
}
Pretty straightforward, with some added logging etc., and it works in all my current use cases except one. I have created a script that copies the database dump of our production instance of Confluence to our test server. It then uses the above method to drop the existing database, which works fine. But the actual restore just hangs forever, so right now I have to exit the script and then run the following command manually
d:\postgresql\bin\pg_restore.exe -U postgres -d confluencedb -v -1 d:\temp\latest-backup.pgdump
It takes some time and there is quite a lot of output, which makes me believe the cause must be one of the following:
The amount of output overflows a buffer and stalls the script
It takes too much time
Has anyone had similar experiences and can help me resolve this? It would let me schedule the import instead of having to run it manually as I do today.
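For reference, a minimal sketch of one way to avoid this kind of pipe-buffer deadlock is to drain both redirected streams concurrently rather than sequentially; this assumes the $p from the function above and .NET 4.5 or later for ReadToEndAsync:
# Begin async reads of both streams before waiting, so neither pipe can fill up and block the child
$stdoutTask = $p.StandardOutput.ReadToEndAsync()
$stderrTask = $p.StandardError.ReadToEndAsync()
$p.WaitForExit()
$stdout = $stdoutTask.Result
$stderr = $stderrTask.Result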
I had to do the following right after Process.Start():
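(For context, this assumes $process is the started System.Diagnostics.Process and that $StdOut and $StdErr are StringBuilder instances created beforehand, along these lines:)
$StdOut = New-Object System.Text.StringBuilder
$StdErr = New-Object System.Text.StringBuilder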
# Capture output during process execution so we don't hang
# if there is too much output.
do
{
if (!$process.StandardOutput.EndOfStream)
{
[void]$StdOut.AppendLine($process.StandardOutput.ReadLine())
}
if (!$process.StandardError.EndOfStream)
{
[void]$StdErr.AppendLine($process.StandardError.ReadLine())
}
Start-Sleep -Milliseconds 10
}
while (!$process.HasExited)
# Capture any standard output generated between our last poll and process end.
while (!$process.StandardOutput.EndOfStream)
{
[void]$StdOut.AppendLine($process.StandardOutput.ReadLine())
}
# Capture any error output generated between our last poll and process end.
while (!$process.StandardError.EndOfStream)
{
[void]$StdErr.AppendLine($process.StandardError.ReadLine())
}
# Wait for the process to exit.
$process.WaitForExit()
LogWriteFunc ("END process: " + $ProcessName)
if ($process.ExitCode -ne 0)
{
LogWriteFunc ("Error: Script execution failed: " + $process.ExitCode )
$FuncResult = 1
}
# Log and display any standard output.
if ($StdOut.Length -gt 0)
{
LogWriteFunc ($StdOut.ToString())
}
# Log and display any error output.
if ($StdErr.Length -gt 0)
{
LogWriteFunc ($StdErr.ToString())
}
The starter is used to start the target script's process:
# STARTING PS (TARGET) SCRIPT COMPILED TO EXE
$processStartInfo = New-Object System.Diagnostics.ProcessStartInfo
$processStartInfo.FileName = $somePath
$processStartInfo.WorkingDirectory = (Get-Location).Path
$processStartInfo.RedirectStandardInput = $true
$processStartInfo.RedirectStandardError = $true
$processStartInfo.UseShellExecute = $false
$process = [System.Diagnostics.Process]::Start($processStartInfo)
# SOME OTHER CODE ...
# HERE I'M SENDING "EXIT" TO RUNSPACE RUNNING INSIDE TARGET SCRIPT
$process.StandardInput.WriteLineAsync("exit") | Out-Null
The target script (compiled to an *.exe) creates a runspace that synchronously waits for ReadLine data from the starter:
function main {
. createRunspace
while ($true) {
# PARENT LOOP RUNS IN PARALLEL TO RUNSPACE LOOP
sleep -s 1
try {
if ($hash.flags.exit) {
# CLEAN UP AND BREAK
} else {
# RUN OTHER CODE
}
} catch {
# CAN NOT NOTIFY RUNSPACE ABOUT ERROR USING SYNCHRONIZED HASHTABLE,
# BECAUSE RUNSPACE IS STUCK ON `ReadLine`.
# ALSO CAN NOT WRITE TO STANDARD INPUT (DON'T KNOW HOW).
}
}
}
function createRunspace {
#CREATING RUNSPACE WITH SYNCHRONIZED HASHTABLE
$hash = [hashtable]::Synchronized(@{ flags = @{} })
$runspace= [runspacefactory]::CreateRunspace()
$runspace.Open()
$runspace.SessionStateProxy.SetVariable('hash', $hash)
$powershell= [powershell]::Create()
$powershell.Runspace = $runspace
$powershell.AddScript({
# RUNSPACE LOOP
while ($true) {
$value = [Console]::In.ReadLine()
if ($value -eq "exit") {
$hash.flags.exit = $true
break
} elseif ($value -eq "valueFromParent") {
# DO STUFF
}
}
}) | Out-Null
$powershell.BeginInvoke() | Out-Null # START THE RUNSPACE LOOP ASYNCHRONOUSLY
}
# OTHER CODE
. main
Is there a way to send standard input data from the parent to the runspace?
The tool you're using to package your PowerShell script as an *.exe apparently doesn't pass stdin input through to the wrapped script, so your script never receives the "exit" line you send from the caller.
I don't know your exact requirements, but here's a much simplified solution that shows that your approach works in principle:
# The code to execute in the background.
$backgroundScript = {
while ($true) {
$value = [Console]::In.ReadLine()
if ($value -eq "exit") {
"Background: Exiting."
break
}
else {
"Background: Performing task: $value"
}
}
}
# Start the background script.
$processStartInfo = [System.Diagnostics.ProcessStartInfo] @{
FileName = "powershell.exe"
Arguments = '-NoProfile', '-Command', $backgroundScript -replace '"', '\"'
WorkingDirectory = $PWD.ProviderPath
RedirectStandardInput = $true
RedirectStandardError = $true
RedirectStandardOutput = $true
UseShellExecute = $false
}
$process = [System.Diagnostics.Process]::Start($processStartInfo)
# Ask the background script to perform a task.
"Submitting task 'doStuff'"
$process.StandardInput.WriteLine("doStuff")
# Ask the background script to exit.
"Submitting exit request."
$process.StandardInput.WriteLine("exit")
# Wait for the background script's process to exit,
# then print its stdout.
$process.WaitForExit()
$process.StandardOutput.ReadToEnd()
The above yields:
Submitting task 'doStuff'
Submitting exit request.
Background: Performing task: doStuff
Background: Exiting.
I have a small PowerShell program that starts a few threads to do parallel calculations; when they are finished, they append a line with the results to a text file and proceed to do some more. This worked fine in development and testing, but occasionally in production it hangs, and it seems the file is "jammed open". I have the writes wrapped in "try" blocks, but that does not help. I have written a toy application to illustrate the problem; it usually hangs after about 10-15 minutes (and after writing about 3000 lines).
It seems to me I would have been better off with a Python solution using mutexes or something, but I am pretty far down this road now. I'm looking for ideas on how I can easily fix this. I really thought Add-Content would be atomic...
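(For what it's worth, one way to serialize appends across processes is a named system mutex; here is a minimal sketch using the variables from childjob.ps1 below, with an illustrative mutex name:)
# All writers must use the same mutex name for this to serialize them
$mutex = New-Object System.Threading.Mutex($false, "Global\outfileLock")
[void]$mutex.WaitOne()
try
{
    Add-Content -Path $fileappendout -Value "${sout}"
}
finally
{
    $mutex.ReleaseMutex()
}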
Parentjob.ps1
# Start a bunch of jobs
$curdir = "c:\transfer\filecollide"
$tokens = "tok00","tok01","tok02",
"tok03","tok04","tok05",
"tok06","tok07","tok08"
$jobs = @()
foreach ($tok in $tokens)
{
$job = Start-Job -FilePath ".\childjob.ps1" -ArgumentList "${curdir}",$tok,2,1000
Start-Sleep -s 3 # stagger things a bit
Write-Output " Starting:${tok} job"
$jobs += ,$job
}
foreach ($job in $jobs)
{
wait-job $job
$out = receive-job $job
Write-Output($out)
}
childjob.ps1
param(
[string]$curdir = ".",
[string]$tok = "tok?",
[int]$interval = 10,
[int]$ntodo = 1
)
$nwritefails = 0
$nwritesuccess = 0
$nwrite2fails = 0
function singleLine
{
param(
[string]$tok,
[string]$fileappendout = "",
[int]$timeout = 3
)
$curdatetime = (Get-Date)
$sout = "${curdatetime},${tok},${global:nwritesuccess},${global:nwritefails},${global:nwrite2fails}"
$global:nwritesuccess++
try
{
Add-Content -Path $fileappendout -Value "${sout}"
}
catch
{
$global:nwritefails++
try
{
Start-Sleep -s 1
Add-Content -Path $fileappendout -Value "${sout}"
}
catch
{
$global:nwrite2fails++
Write-Output "Failed to write to ${fileappendout}"
}
}
}
Write-Output "Starting to process ${tok}"
#Start of main code
cd "${curdir}"
$ndone = 0
while ($true)
{
singleLine $tok "outfile.txt"
$ndone++
if ($ndone -gt $ntodo){ break }
Start-Sleep -s $interval
}
Write-Output "Successful ${tok} appends:${nwritesuccess} failed:${nwritefails} failed2:${nwrite2fails}"
Why not have the jobs write the results to the output stream, and use Receive-Job in the main thread to collect the results and update the file? You can do this while the jobs are still running. What you're writing to the out stream now looks like it might be more appropriately written to the Progress stream.
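A minimal sketch of that approach, assuming the $jobs array from Parentjob.ps1 above and the same outfile.txt:
# Drain job output periodically while the jobs run; only the parent touches the file
while ($jobs | Where-Object { $_.State -eq "Running" })
{
    $jobs | Receive-Job | Add-Content -Path "outfile.txt"
    Start-Sleep -s 1
}
# Collect anything produced between the last poll and job completion
$jobs | Receive-Job | Add-Content -Path "outfile.txt"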
Is it possible to capture the output from a command line while the command is running? I have a small exe which displays various messages that should be processed by a script, showing the user some information while the script is running.
My script starts the program (the exe) with some parameters and checks if the process is still running. While the program is running I want to capture all messages to a variable to process them. I can't find any solution; some tests with "run.exe 2>&1" etc. fail.
Any ideas?
$oInfo = New-Object System.Diagnostics.ProcessStartInfo
$oInfo.FileName = "ping"
$oInfo.Arguments = "localhost"
$oInfo.UseShellExecute = $False
$oInfo.RedirectStandardOutput = $True
$oProcess = New-Object System.Diagnostics.Process
$oProcess.StartInfo = $oInfo
[Void]$oProcess.Start()
$bDone = $False
while (!$bDone)
{
$char = $oProcess.StandardOutput.Read()
if ($char -eq -1)
{
if ($oProcess.HasExited)
{
$bDone = $True
}
else
{
Wait-Event -Timeout 1
}
}
else
{
Write-Host -NoNewline ([char]$char)
}
}
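Reading one character at a time, rather than line by line, also has the advantage of showing output that doesn't end in a newline, such as a progress counter that rewrites the same line.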
I'm writing a script to download several repositories from GitHub. Here is the command to download a repository:
git clone "$RepositoryUrl" "$localRepoDirectory"
When I run this command it displays some nice progress information in the console window that I want displayed.
The problem is that I also want to be able to detect if any errors have occurred while downloading. I found this post that talks about redirecting the various streams, so I tried:
(git clone "$RepositoryUrl" "$localRepoDirectory") 2> $errorLogFilePath
This pipes any errors from stderr to my file, but no longer displays the nice progress information in the console.
I can use the Tee-Object like so:
(git clone "$RepositoryUrl" "$localRepoDirectory") | Tee-Object -FilePath $errorLogFilePath
and I still get the nice progress output, but this pipes stdout to the file, not stderr; I'm only concerned with detecting errors.
Is there a way that I can store any errors that occur in a file or (preferably) a variable, while still having the progress information piped to the console window? I have a feeling that the answer might lie in redirecting various streams into other streams, as discussed in this post, but I'm not really sure.
======== Update =======
I'm not sure if git.exe is different from your typical executable, but I've done some more testing and here is what I've found:
$output = (git clone "$RepositoryUrl" "$localRepoDirectory")
$output always contains the text "Cloning into '[localRepoDirectory]'...", whether the command completed successfully or produced an error. Also, the progress information is still written to the console when doing this. This leads me to think that the progress information is not written via stdout, but via some other stream?
If an error occurs, the error is written to the console, but in the usual white foreground color, not the typical red for errors and yellow for warnings. When this is called from within a cmdlet function and the command fails with an error, the error is NOT returned via the function's -ErrorVariable (or -WarningVariable) parameter (however, if I do my own Write-Error, that does get returned via -ErrorVariable). This leads me to think that git.exe doesn't write to stderr, but when we do:
(git clone "$RepositoryUrl" "$localRepoDirectory") 2> $errorLogFilePath
the error message is written to the file, so that makes me think that it does write to stderr. So now I'm confused...
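One quick way to check, for what it's worth, is to discard stderr only and see whether the progress display disappears; if it does, git is writing both progress and errors to stderr:
git clone "$RepositoryUrl" "$localRepoDirectory" 2> $null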
======== Update 2 =======
So with Byron's help I've tried a couple more solutions using a new process, but still can't get what I want. When using a new process I never get the nice progress written to the console.
The three new methods that I've tried all use this bit of code in common:
$process = New-Object System.Diagnostics.Process
$process.StartInfo.Arguments = "clone ""$RepositoryUrl"" ""$localRepoDirectory"""
$process.StartInfo.UseShellExecute = $false
$process.StartInfo.RedirectStandardOutput = $true
$process.StartInfo.RedirectStandardError = $true
$process.StartInfo.CreateNoWindow = $true
$process.StartInfo.WorkingDirectory = $WORKING_DIRECTORY
$process.StartInfo.FileName = "git"
Method 1 - Run in new process and read output afterwards:
$process.Start()
$process.WaitForExit()
Write-Host Output - $process.StandardOutput.ReadToEnd()
Write-Host Errors - $process.StandardError.ReadToEnd()
Method 2 - Get output synchronously:
$process.Start()
while (!$process.HasExited)
{
Write-Host Output - $process.StandardOutput.ReadToEnd()
Write-Host Error Output - $process.StandardError.ReadToEnd()
Start-Sleep -Seconds 1
}
Even though this looks like it would write the output while the process is running, it doesn't write anything until after the process exits: ReadToEnd blocks until the stream closes, which only happens when the process ends.
Method 3 - Get output asynchronously:
Register-ObjectEvent -InputObject $process -EventName "OutputDataReceived" -Action {Write-Host Output Data - $args[1].Data }
Register-ObjectEvent -InputObject $process -EventName "ErrorDataReceived" -Action { Write-Host Error Data - $args[1].Data }
$process.Start()
$process.BeginOutputReadLine()
$process.BeginErrorReadLine()
while (!$process.HasExited)
{
Start-Sleep -Seconds 1
}
This does output data while the process is working which is good, but it still doesn't display the nice progress information :(
I think I have your answer. I've been working with PowerShell for a while and have created several build systems. Sorry if the script is a bit long, but it works.
$dir = <your dir>
$global:log = <your log file which must be in the global scope> # Not global = won't work
function Create-Process {
$process = New-Object -TypeName System.Diagnostics.Process
$process.StartInfo.CreateNoWindow = $false
$process.StartInfo.RedirectStandardError = $true
$process.StartInfo.UseShellExecute = $false
return $process
}
function Terminate-Process {
param([System.Diagnostics.Process]$process)
$code = $process.ExitCode
$process.Close()
$process.Dispose()
Remove-Variable process
return $code
}
function Launch-Process {
param([System.Diagnostics.Process]$process, [string]$log, [int]$timeout = 0)
$errorjob = Register-ObjectEvent -InputObject $process -EventName ErrorDataReceived -SourceIdentifier Common.LaunchProcess.Error -action {
if(-not [string]::IsNullOrEmpty($EventArgs.data)) {
"ERROR - $($EventArgs.data)" | Out-File $log -Encoding ASCII -Append
Write-Host "ERROR - $($EventArgs.data)"
}
}
$outputjob = Register-ObjectEvent -InputObject $process -EventName OutputDataReceived -SourceIdentifier Common.LaunchProcess.Output -action {
if(-not [string]::IsNullOrEmpty($EventArgs.data)) {
"Out - $($EventArgs.data)" | Out-File $log -Encoding ASCII -Append
Write-Host "Out - $($EventArgs.data)"
}
}
if($errorjob -eq $null) {
"ERROR - The error job is null" | Out-File $log -Encoding ASCII -Append
Write-Host "ERROR - The error job is null"
}
if($outputjob -eq $null) {
"ERROR - The output job is null" | Out-File $log -Encoding ASCII -Append
Write-Host "ERROR - The output job is null"
}
$process.Start()
$process.BeginErrorReadLine()
if($process.StartInfo.RedirectStandardOutput) {
$process.BeginOutputReadLine()
}
$ret = $null
if($timeout -eq 0)
{
$process.WaitForExit()
$ret = $true
}
else
{
if(-not($process.WaitForExit($timeout)))
{
Write-Host "ERROR - The process is not completed, after the specified timeout: $($timeout)"
$ret = $false
}
else
{
$ret = $true
}
}
# Cancel the event registrations
Remove-Event * -ErrorAction SilentlyContinue
Unregister-Event -SourceIdentifier Common.LaunchProcess.Error
Unregister-Event -SourceIdentifier Common.LaunchProcess.Output
Stop-Job $errorjob.Id
Remove-Job $errorjob.Id
Stop-Job $outputjob.Id
Remove-Job $outputjob.Id
$ret
}
$repo = <your repo>
$process = Create-Process
$process.StartInfo.RedirectStandardOutput = $true
$process.StartInfo.FileName = "git.exe"
$process.StartInfo.Arguments = "clone $($repo)"
$process.StartInfo.WorkingDirectory = $dir
Launch-Process $process $global:log
Terminate-Process $process
The log file must be in the global scope because the routine which runs the event processing is not in the script scope.
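(An alternative that avoids the global, for what it's worth, is to pass the log path into the handler via -MessageData and read it back from $Event.MessageData; a minimal sketch for the output event:)
Register-ObjectEvent -InputObject $process -EventName OutputDataReceived -MessageData $log -action {
    if(-not [string]::IsNullOrEmpty($EventArgs.data)) {
        # $Event.MessageData carries the log path into the event handler's scope
        "Out - $($EventArgs.data)" | Out-File $Event.MessageData -Encoding ASCII -Append
    }
}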
Sample of my log file:
Out - Cloning into ''...
ERROR - Checking out files: 22% (666/2971)
ERROR - Checking out files: 23% (684/2971)
ERROR - Checking out files: 24% (714/2971)
You can do this by putting the git clone command inside an advanced function e.g.:
function Clone-Git {
[CmdletBinding()]
param($repoUrl, $localRepoDir)
git clone $repoUrl $localRepoDir
}
Clone-Git $RepositoryUrl $localRepoDirectory -ev cloneErrors
$cloneErrors
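(-ev is the short form of the common -ErrorVariable parameter, which the function gets automatically from [CmdletBinding()].)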
If you use System.Diagnostics.Process to start Git, you can redirect all the error and output.
I just had to solve this problem for Inkscape:
$si = New-Object System.Diagnostics.ProcessStartInfo
$si.Arguments = YOUR PROCESS ARGS
$si.UseShellExecute = $false
$si.RedirectStandardOutput = $true
$si.RedirectStandardError = $true
$si.WorkingDirectory = $workingDir
$si.FileName = EXECUTABLE LOCATION
$process = [Diagnostics.Process]::Start($si)
while (!($process.HasExited))
{
# Do what you want with stderr and stdout
Start-Sleep -s 1 # Sleep for 1 second
}
You can, of course, wrap this in a function with proper arguments...
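A minimal sketch of such a wrapper (the function name and parameters are illustrative):
function Invoke-CapturedProcess {
    param(
        [string]$FilePath,   # executable location
        [string]$Arguments,  # process arguments as a single string
        [string]$WorkingDir = (Get-Location).Path
    )
    $si = New-Object System.Diagnostics.ProcessStartInfo
    $si.FileName = $FilePath
    $si.Arguments = $Arguments
    $si.WorkingDirectory = $WorkingDir
    $si.UseShellExecute = $false
    $si.RedirectStandardOutput = $true
    $si.RedirectStandardError = $true
    $process = [Diagnostics.Process]::Start($si)
    while (!($process.HasExited))
    {
        # Do what you want with stderr and stdout here
        Start-Sleep -s 1
    }
    return $process.ExitCode
}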
I'm running the DTEXEC.exe command from within a PowerShell script, trying to capture and log the output to a file. Sometimes the output is incomplete and I'm trying to figure out why this is the case and what might be done about it.
DTEXEC: The package execution returned DTSER_SUCCESS(0)
Started: 10:58:43 a.m.
Finished: 10:59:24 a.m.
Elapsed: 41.484 seconds
The output always seems incomplete on packages that execute in less than ~8 seconds, which might be a clue (there isn't much output, or they finish quickly).
I'm using .NET's System.Diagnostics.Process and ProcessStartInfo to set up and run the command, and I'm redirecting stdout and stderr to event handlers that each append to a StringBuilder, which is subsequently written to disk.
The problem feels like a timing issue or a buffering issue. To address the timing issue, I've attempted to use Monitor.Enter/Exit. If it's a buffering issue, I'm not sure how to force the Process not to buffer stdout and stderr.
The environment is
- PowerShell 2 running CLR version 2
- SQL 2008 32-bit DTEXEC.exe
- Host Operating System: XP Service Pack 3.
Here's the code:
function Execute-SSIS-Package
{
param([String]$fileName)
$cmd = GetDTExecPath
$proc = New-Object System.Diagnostics.Process
$proc.StartInfo.FileName = $cmd
$proc.StartInfo.Arguments = "/FILE ""$fileName"" /CHECKPOINTING OFF /REPORTING ""EWP"""
$proc.StartInfo.RedirectStandardOutput = $True
$proc.StartInfo.RedirectStandardError = $True
$proc.StartInfo.WorkingDirectory = Get-Location
$proc.StartInfo.UseShellExecute = $False
$proc.StartInfo.CreateNoWindow = $False
Write-Host $proc.StartInfo.FileName $proc.StartInfo.Arguments
$cmdOut = New-Object System.Text.StringBuilder
$errorEvent = Register-ObjectEvent -InputObj $proc `
-Event "ErrorDataReceived" `
-MessageData $cmdOut `
-Action `
{
param
(
[System.Object] $sender,
[System.Diagnostics.DataReceivedEventArgs] $e
)
try
{
[System.Threading.Monitor]::Enter($Event.MessageData)
Write-Host -ForegroundColor "DarkRed" $e.Data
[void](($Event.MessageData).AppendLine($e.Data))
}
catch
{
Write-Host -ForegroundColor "Red" "Error capturing processes std error" $Error
}
finally
{
[System.Threading.Monitor]::Exit($Event.MessageData)
}
}
$outEvent = Register-ObjectEvent -InputObj $proc `
-Event "OutputDataReceived" `
-MessageData $cmdOut `
-Action `
{
param
(
[System.Object] $sender,
[System.Diagnostics.DataReceivedEventArgs] $e
)
try
{
[System.Threading.Monitor]::Enter($Event.MessageData)
#Write-Host $e.Data
[void](($Event.MessageData).AppendLine($e.Data))
}
catch
{
Write-Host -ForegroundColor "Red" "Error capturing processes std output" $Error
}
finally
{
[System.Threading.Monitor]::Exit($Event.MessageData)
}
}
$isStarted = $proc.Start()
$proc.BeginOutputReadLine()
$proc.BeginErrorReadLine()
while (!$proc.HasExited)
{
Start-Sleep -Milliseconds 100
}
Start-Sleep -Milliseconds 1000
$procExitCode = $proc.ExitCode
$procStartTime = $proc.StartTime
$procFinishTime = Get-Date
$proc.Close()
$proc.CancelOutputRead()
$proc.CancelErrorRead()
$result = New-Object PsObject -Property @{
ExitCode = $procExitCode
StartTime = $procStartTime
FinishTime = $procFinishTime
ElapsedTime = $procFinishTime.Subtract($procStartTime)
StdErr = ""
StdOut = $cmdOut.ToString()
}
return $result
}
The reason that your output is truncated is that PowerShell returns from WaitForExit() and sets the HasExited property before it has processed all the output events in the queue.
One solution is to loop for an arbitrary amount of time with short sleeps to allow the events to be processed; PowerShell event processing appears not to be pre-emptive, so a single long sleep does not allow events to process.
A much better solution is to also register for the Exited event (in addition to the Output and Error events) on the Process. This event is the last in the queue, so if you set a flag when it occurs, you can loop with short sleeps until the flag is set and know that you have processed all the output events.
I have written up a full solution on my blog but the core snippet is:
# Set up a pair of stringbuilders to which we can stream the process output
$global:outputSB = New-Object -TypeName "System.Text.StringBuilder";
$global:errorSB = New-Object -TypeName "System.Text.StringBuilder";
# Flag that shows that final process exit event has not yet been processed
$global:myprocessrunning = $true
$ps = new-object System.Diagnostics.Process
$ps.StartInfo.Filename = $target
$ps.StartInfo.WorkingDirectory = Split-Path $target -Parent
$ps.StartInfo.UseShellExecute = $false
$ps.StartInfo.RedirectStandardOutput = $true
$ps.StartInfo.RedirectStandardError = $true
$ps.StartInfo.CreateNoWindow = $true
# Register Asynchronous event handlers for Standard and Error Output
Register-ObjectEvent -InputObject $ps -EventName OutputDataReceived -action {
if(-not [string]::IsNullOrEmpty($EventArgs.data)) {
$global:outputSB.AppendLine(((get-date).toString('yyyyMMddHHmm')) + " " + $EventArgs.data)
}
} | Out-Null
Register-ObjectEvent -InputObject $ps -EventName ErrorDataReceived -action {
if(-not [string]::IsNullOrEmpty($EventArgs.data)) {
$global:errorSB.AppendLine(((get-date).toString('yyyyMMddHHmm')) + " " + $EventArgs.data)
}
} | Out-Null
Register-ObjectEvent -InputObject $ps -EventName Exited -action {
$global:myprocessrunning = $false
} | Out-Null
$ps.start() | Out-Null
$ps.BeginOutputReadLine();
$ps.BeginErrorReadLine();
# We set a timeout after which the process will be forcibly terminated
$processTimeout = $timeoutseconds * 1000
while (($global:myprocessrunning -eq $true) -and ($processTimeout -gt 0)) {
# We must use lots of shorts sleeps rather than a single long one otherwise events are not processed
$processTimeout -= 50
Start-Sleep -m 50
}
if ($processTimeout -le 0) {
Add-Content -Path $logFile -Value (((get-date).toString('yyyyMMddHHmm')) + " PROCESS EXCEEDED EXECUTION ALLOWANCE AND WAS ABENDED!")
$ps.Kill()
}
# Append the Standard and Error Output to log file, we don't use Add-Content as it appends a carriage return that is not required
[System.IO.File]::AppendAllText($logFile, $global:outputSB)
[System.IO.File]::AppendAllText($logFile, $global:errorSB)
My 2 cents... it's not a PowerShell issue but an issue/bug in the System.Diagnostics.Process class and the underlying shell. I've seen times when wrapping StdError and StdOut does not catch everything, and other times when the 'listening' wrapper application hangs indefinitely because of HOW the underlying application writes to the console (in the C/C++ world there are MANY different ways to do this, e.g. WriteFile, fprintf, cout, etc.).
In addition, there are more than 2 outputs that may need to be captured, but the .NET Framework only shows you those two, given that they are the two primary ones (see this article about command redirection here, as it starts to give hints).
My guess (for both your issue as well as mine) is that it has to do with some low-level buffer flushing and/or ref counting. (If you want to get deep, you can start here.)
One (very hacky) way to get around this is, instead of executing the program directly, to wrap it in a call to cmd.exe with 2>&1, but this method has its own pitfalls and issues.
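A minimal sketch of that workaround, reusing the question's $proc (the dtexec.exe location is illustrative and assumed to be on PATH):
# Let cmd.exe merge stderr into stdout before .NET ever sees the streams
$proc.StartInfo.FileName = "cmd.exe"
$proc.StartInfo.Arguments = "/c dtexec.exe /FILE ""$fileName"" 2>&1"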
The most ideal solution is for the executable to have a logging parameter, and then to parse the log file after the process exits... but most of the time you don't have that option.
But wait, we're using PowerShell... why are you using System.Diagnostics.Process in the first place? You can just call the command directly:
$output = & (GetDTExecPath) /FILE "$fileName" /CHECKPOINTING OFF /REPORTING "EWP"
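For what it's worth, to also capture stderr in $output and detect failure, you can merge the streams and check the native exit code:
$output = & (GetDTExecPath) /FILE "$fileName" /CHECKPOINTING OFF /REPORTING "EWP" 2>&1
if ($LASTEXITCODE -ne 0) {
    Write-Error "DTEXEC failed with exit code $LASTEXITCODE"
}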