PowerShell - after writing to stdin, capturing stdout/stderr no longer works

In a pretty restricted environment I'm basically only allowed to use Powershell + Plink if I want to automate some tasks.
I want to create a function that:
- shows the output as it arrives, if required
- captures all output (stdout and stderr) to make it available later for further parsing or for logging to console/file/whatever
- inputs a password automatically
Unfortunately, output capture stops right after the line where I feed in the password. It used to work when I was only capturing stdout; after I started capturing stderr as well, no more luck.
The code:
function BaseRun {
    param ($command, $arguments, $output = "Console")
    $procInfo = New-Object System.Diagnostics.ProcessStartInfo
    $procInfo.RedirectStandardOutput = $true
    $procInfo.RedirectStandardError = $true
    $procInfo.RedirectStandardInput = $true
    $procInfo.FileName = $command
    $procInfo.Arguments = $arguments
    $procInfo.UseShellExecute = $false
    $process = New-Object System.Diagnostics.Process
    $process.StartInfo = $procInfo
    [void]$process.Start()
    $outputStream = $process.StandardOutput
    $errorStream = $process.StandardError
    $inputStream = $process.StandardInput
    $outputBuffer = New-Object System.Text.StringBuilder
    Start-Sleep -m 2000
    $inputStream.Write("${env:password}`n")
    while (-not $process.HasExited) {
        do {
            $outputLine = $outputStream.ReadLine()
            $errorLine = $errorStream.ReadLine()
            [void]$outputBuffer.Append("$outputLine`n")
            if (($output -eq "All") -or ($output -eq "Console")) {
                Write-Host "$outputLine"
                Write-Host "$errorLine"
            }
        } while (($outputLine -ne $null) -and ($errorLine -ne $null))
    }
    return $outputBuffer.ToString()
}

Based on @w0xx0m's and @Martin Prikryl's help I managed to put together this working solution:
Register-ObjectEvent -InputObject $process `
    -EventName OutputDataReceived -SourceIdentifier processOutputDataReceived `
    -Action {
        $data = $EventArgs.data
        if ($data -ne $null) { Write-Host $data }
    } | Out-Null
Register-ObjectEvent -InputObject $process `
    -EventName ErrorDataReceived -SourceIdentifier processErrorDataReceived `
    -Action {
        $data = $EventArgs.data
        if ($data -ne $null) { Write-Host $data }
    } | Out-Null
[void]$process.Start()
$process.BeginOutputReadLine()
$process.BeginErrorReadLine()
$inputStream = $process.StandardInput
Start-Sleep -m 2000
$inputStream.Write("${env:password}`n")
$process.WaitForExit()
Unregister-Event -SourceIdentifier processOutputDataReceived
Unregister-Event -SourceIdentifier processErrorDataReceived
$inputStream.Close()
For brevity I've removed the process launch section (it's the same as the one from above) and the data processing (in the example I just Write-Host it).

When you are reading both stdout and stderr, you cannot use ReadLine the way you do.
First, if there's only stderr output and no stdout, $outputStream.ReadLine() never returns and you never get to $errorStream.ReadLine().
A bigger problem is that Windows gives each output only a limited pipe buffer. So when there's a lot of stderr before a complete stdout line is produced, the stderr buffer fills up and the application (Plink) blocks on its next attempt to write to stderr, waiting for the stderr buffer to be consumed. That never happens, because you keep waiting on the wrong (stdout) stream. A deadlock.
You have to use a simple Read and never synchronously wait on a stream that has no output available.
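For illustration, here is a minimal sketch of that idea (my own example, not the answerer's code; it assumes .NET 4.5+ for StreamReader.ReadAsync and reuses $process, $outputStream and $errorStream from the question). The script thread only ever inspects reads that have already completed, so it never blocks on a stream that has nothing to say:
# Sketch: poll asynchronous reads instead of blocking on ReadLine (assumption: .NET 4.5+).
function Read-Pending {
    # Drain every read that has already completed; return the still-pending task, or $null at end of stream.
    param($reader, $task, $buffer, $sb)
    while ($task.IsCompleted) {
        if ($task.Result -le 0) { return $null }           # end of stream
        [void]$sb.Append($buffer, 0, $task.Result)         # keep whatever arrived
        $task = $reader.ReadAsync($buffer, 0, $buffer.Length)
    }
    return $task
}
$outBuf  = New-Object 'char[]' 4096
$errBuf  = New-Object 'char[]' 4096
$outText = New-Object System.Text.StringBuilder
$errText = New-Object System.Text.StringBuilder
$outTask = $outputStream.ReadAsync($outBuf, 0, $outBuf.Length)
$errTask = $errorStream.ReadAsync($errBuf, 0, $errBuf.Length)
while ($outTask -or $errTask) {
    if ($outTask) { $outTask = Read-Pending $outputStream $outTask $outBuf $outText }
    if ($errTask) { $errTask = Read-Pending $errorStream $errTask $errBuf $errText }
    Start-Sleep -Milliseconds 50                           # never wait on the streams themselves
}
Both loops end on their own once the process exits and the pipes close, so no output is lost; in practice the Register-ObjectEvent solution shown above achieves the same effect with less code.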

Related

Asynchronously Logging Stdout from a PowerShell Process

I need to start a long-running process in a PowerShell script and capture the StandardOutput to a file as it is being generated. I'm currently using an asynchronous approach and appending the output to a text file each time an OutputDataReceived event is fired. The following code excerpt illustrates:
$PSI = New-Object System.Diagnostics.ProcessStartInfo
$PSI.CreateNoWindow = $true
$PSI.RedirectStandardOutput = $true
$PSI.RedirectStandardError = $true
$PSI.UseShellExecute = $false
$PSI.FileName = $EXECUTE
$PSI.RedirectStandardInput = $true
$PSI.WorkingDirectory = $PWD
$PSI.EnvironmentVariables["TMP"] = $TMPDIR
$PSI.EnvironmentVariables["TEMP"] = $TMPDIR
$PSI.EnvironmentVariables["TMPDIR"] = $TMPDIR
$PROCESS = New-Object System.Diagnostics.Process
$PROCESS.StartInfo = $PSI
[void]$PROCESS.Start()
# asynchronously listen for OutputDataReceived events on the process
$sb = [scriptblock]::Create("`$text = `$Event.SourceEventArgs.Data; `$OUTFIL = `$Event.MessageData; Add-Content $OUTFIL.out `$text")
$EVENT_OBJ = Register-ObjectEvent -InputObject $PROCESS -EventName OutputDataReceived -Action $sb -MessageData "$OUTFIL.out"
# asynchronously listen for ErrorDataReceived events on the process
$sb2 = [scriptblock]::Create("`$text = `$Event.SourceEventArgs.Data; Write-Host `$text;")
$EVENT_OBJ2 = Register-ObjectEvent -InputObject $PROCESS -EventName ErrorDataReceived -Action $sb2
# begin asynchronous read operations on the redirected StandardOutput
$PROCESS.BeginOutputReadLine();
# begin asynchronous read operations on the redirected StandardError
$PROCESS.BeginErrorReadLine();
# write the input file contents to standard input
$WRITER = $PROCESS.StandardInput
$READER = [System.IO.File]::OpenText("$IN_FILE")
try
{
    while (($line = $READER.ReadLine()) -ne $null)
    {
        $WRITER.WriteLine($line)
    }
}
finally
{
    $WRITER.close()
    $READER.close()
}
$Process.WaitForExit() | Out-Null
# end asynchronous read operations on the redirected StandardOutput
$PROCESS.CancelOutputRead();
# end asynchronous read operations on the redirected StandardError
$PROCESS.CancelErrorRead();
Unregister-Event -SubscriptionId $EVENT_OBJ.Id
Unregister-Event -SubscriptionId $EVENT_OBJ2.Id
The issue with this approach is the large amount of output being generated and the latency caused by having to open and close the text file on each event (i.e., Add-Content $OUTFIL.out $text is called each time the event is fired).
$sb = [scriptblock]::Create("`$text = `$Event.SourceEventArgs.Data; `$OUTFIL = `$Event.MessageData; Add-Content $OUTFIL.out `$text")
In each case the actual process will complete minutes before all of the output data has been written to the text file.
Is there a better approach to doing this? A faster way to append text to the file?
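One way to attack that latency (a sketch of mine, not from the original thread; it reuses $PROCESS and $OUTFIL from the code above) is to keep the file out of the event handler entirely: append each line to a StringBuilder passed in via -MessageData and write the file once at the end.
# Buffer the output in memory instead of calling Add-Content on every event.
$outBuilder = New-Object System.Text.StringBuilder
$EVENT_OBJ = Register-ObjectEvent -InputObject $PROCESS -EventName OutputDataReceived `
    -MessageData $outBuilder -Action {
        if ($EventArgs.Data -ne $null) {
            [void]$Event.MessageData.AppendLine($EventArgs.Data)
        }
    }
# ... Start(), BeginOutputReadLine(), feed stdin and WaitForExit() exactly as before ...
# One append at the end replaces thousands of open/write/close cycles:
[System.IO.File]::AppendAllText("$OUTFIL.out", $outBuilder.ToString())
The trade-off is that the whole log lives in memory until the process exits; an alternative along the same lines is to open a single System.IO.StreamWriter before registering the event, WriteLine to it from the handler, and close it after WaitForExit().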

How to cancel Wait-Event with Ctrl-C instead of cancelling the whole script in PowerShell

I am using the event-related commands in PowerShell, and when I register an object event and wait for it, my script is paused by the Wait-Event call.
Based on the documentation,
The Wait-Event cmdlet suspends execution of a script or
function until a particular event is raised. Execution
resumes when the event is detected.
To cancel the wait, press CTRL+C.
But what I found is that when I press Ctrl-C in the PowerShell console, the whole script ends instead of just the Wait-Event call, which I thought might simply return with a $null value.
I don't know if something is wrong in my understanding, so I hope someone can share more insight on it.
$port = New-Object System.IO.Ports.SerialPort COM5,115200,None,8,One
$port.Open()
$subscription = Register-ObjectEvent -InputObject $port -EventName DataReceived -SourceIdentifier "DataReceived"
while($true)
{
    $event = Wait-Event -SourceIdentifier "DataReceived"
    #
    # So how can I check whether the user pressed Ctrl-C to cancel Wait-Event?
    #
    [System.IO.Ports.SerialPort]$sp = $event.Sender;
    $line = $sp.ReadExisting();
    Write-Host $event.EventIdentifier;
    Write-Host $line;
    Remove-Event -EventIdentifier $event.EventIdentifier;
}
Unregister-Event -SourceIdentifier "DataReceived"
$port.Close()
--EDIT---
Thanks for the answer, but I want to point out some known solutions which I have already tried.
[console]::TreatControlCAsInput = $true
if ([console]::KeyAvailable)
{
    $key = [system.console]::readkey($true)
    if (($key.modifiers -band [consolemodifiers]"control") -and ($key.key -eq "C"))
    {
        "Terminating..."
        break
    }
}
This is a good way to handle the common Ctrl-C scenario, but the problem is that my application is paused by the Wait-Event call, so I have no chance to check the key input.
In particular, once I enable Ctrl-C as console input via [console]::TreatControlCAsInput = $true, Ctrl-C is no longer able to cancel Wait-Event at all, which is also a problem here.
Here is a good example which you can try (I rewrote it into a usable one for anyone testing):
$timer = New-Object System.Timers.Timer
[Console]::TreatControlCAsInput = $true;
$subscription = Register-ObjectEvent -InputObject $timer -EventName Elapsed -SourceIdentifier "TimeElapsed"
$event = Wait-Event -SourceIdentifier "DataReceived"
Write-Host "User Cancelled"
Unregister-Event -SubscriptionId $subscription.Id
You can run this script, with and without [Console]::.... to test.
You might have to call [console]::TreatControlCAsInput = $true to tell the console that Ctrl+C should be treated as ordinary input, and then use statements like these in your loop:
if ([console]::KeyAvailable)
{
    $key = [system.console]::readkey($true)
    if (($key.modifiers -band [consolemodifiers]"control") -and ($key.key -eq "C"))
    {
        "Terminating..."
        break
    }
}
Wait-Event isn't handling Ctrl+C; the PowerShell host (the console host) handles Ctrl+C by stopping your script.
You can specify a timeout with Wait-Event and check for a key press after the timeout. Try the following:
$timer = New-Object System.Timers.Timer
$subscription = Register-ObjectEvent -InputObject $timer -EventName Elapsed -SourceIdentifier TimeElapsed
try
{
    $oldTreatControlCAsInput = [Console]::TreatControlCAsInput
    [Console]::TreatControlCAsInput = $true
    Write-Host -NoNewline "Waiting for event "
    do
    {
        $event = Wait-Event -SourceIdentifier DataReceived -Timeout 2
        if ($null -eq $event -and [Console]::KeyAvailable)
        {
            $key = [Console]::ReadKey($true)
            if ($key.KeyChar -eq 3)
            {
                Write-Host "Cancelled"
                break
            }
        }
        Write-Host -NoNewline "."
    } while ($null -eq $event)
}
finally
{
    [Console]::TreatControlCAsInput = $oldTreatControlCAsInput
    Unregister-Event -SourceIdentifier TimeElapsed
}
The smallest timeout is 1 second so you may see a short lag between Ctrl+C and when the loop stops.

Reading exit code from a process in PowerShell

I'm working on a build environment for our JavaScript project. We use require.js (r.js) to combine different js modules into one output js file. We use TeamCity, and I wanted to configure a PowerShell build step that would call r.js, read its standard output and exit code, and pass that exit code back to TeamCity (by exiting from PowerShell with this exit code), so that if the task fails (r.js comes back with exit code 1) it won't proceed with other build steps. I also wanted the standard output of r.js to be saved in the TeamCity log, to allow developers to quickly see the error causing r.js to stop.
This is how I can start the r.js process with its arguments, read its exit code and use it to exit from PowerShell:
$process = start-process r.js.cmd -ArgumentList "-o build-dev.js" -PassThru -Wait
exit $process.ExitCode
If I try to read standard output in this way before exiting:
Write-Host $process.StandardOutput.ReadToEnd();
I get this error, which probably suggests that I can't read StandardOutput in this way as it is a stream:
You cannot call a method on a null-valued expression.
At line:1 char:45
+ Write-Host $process.StandardOutput.ReadToEnd <<<< ();
+ CategoryInfo : InvalidOperation: (ReadToEnd:String) [], Runtime
Exception
+ FullyQualifiedErrorId : InvokeMethodOnNull
Process exited with code 1
Now I found a way of running the process so that I can read the standard output but I'm not able to read the exit code:
$psi = New-object System.Diagnostics.ProcessStartInfo
$psi.UseShellExecute = $false
$psi.RedirectStandardOutput = $true
$psi.RedirectStandardError = $true
#$psi.FileName = "r.js.cmd"
$psi.FileName = "C:\Users\Administrator\AppData\Roaming\npm\r.js.cmd"
$psi.Arguments = @("-o build-dev.js")
#$psi.WorkingDirectory = (Get-Location).Path;
$process = New-Object System.Diagnostics.Process
$process.StartInfo = $psi
$process.Start() | Out-Host
$process.WaitForExit()
$output = $process.StandardOutput.ReadToEnd()
$stderr = $process.StandardError.ReadToEnd()
sleep(10)
$exit_code = $process.ExitCode
write-host "========== OUTPUT =========="
write-host $output
write-host "========== ERROR =========="
write-host $stderr
write-host "========== EXIT CODE =========="
write-host $exit_code
write-host "========== $LastExitCode =========="
#write-host $LastExitCode
#Exit $exit_code
But again this returns the console output, while the exit code is always 0 even when r.js returns 1 because I have errors in my js scripts.
Could anyone advise how I can read standard output with Start-Process, or how I can read the exit code with New-Object System.Diagnostics.ProcessStartInfo?
I can attach screenshots with more details of my TC build step configuration and output saved to the build log if that would help answering the question.
I usually tackle this situation by using a variation of the start-process command mentioned in the question.
$outputLog = "outputFile.log"
$errLog = "errorFile.log"
$process = Start-Process r.js.cmd -ArgumentList "-o build-dev.js" -PassThru -RedirectStandardOutput $outputLog -RedirectStandardError $errLog -Wait
$exitCode = $process.ExitCode
Now the log files $outputLog and $errLog will have the standard output and error contents.
This is how you can do it. Note that .cmd files are not applications; they are associated with cmd, so you might have to target cmd.exe and pass the path to the .cmd file as a parameter (see the snippet just below). Also, UseShellExecute on the process start info stops output and errors from being recorded, so don't enable it if you need the output. UseShellExecute behaves as if you had typed $Path into the Win+R window: if it's a .png, for example, Windows finds the app that is supposed to open it and opens the file with it.
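For a .cmd target such as the r.js wrapper from the question, that would look something like this (illustrative only, plugged into the $Path/$Parameters variables used in the listing below):
# Run the .cmd through cmd.exe /c so the redirected pipes still work.
# Quote the path carefully if it contains spaces; this example path has none.
$Path = "$env:SystemRoot\System32\cmd.exe"
$Parameters = '/c C:\Users\Administrator\AppData\Roaming\npm\r.js.cmd -o build-dev.js'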
$Path = "C:\whatever.exe"
$WorkingDirectory = "C:\"
$CreateNoWindow = $true #$true to not create another window for console application
$Parameters = "-paremeters for -the app"
$WindowStyle = [Diagnostics.ProcessWindowStyle]::Normal
# prepare process start info
$processStartInfo = New-Object -TypeName 'System.Diagnostics.ProcessStartInfo' -ErrorAction 'Stop'
$processStartInfo.FileName = $Path
$processStartInfo.WorkingDirectory = $WorkingDirectory
$processStartInfo.ErrorDialog = $false
$processStartInfo.RedirectStandardOutput = $true
$processStartInfo.RedirectStandardError = $true
$processStartInfo.CreateNoWindow = $CreateNoWindow
If ($Parameters) { $processStartInfo.Arguments = $Parameters }
$processStartInfo.WindowStyle = $WindowStyle
#create a process and assign the process start info object to it
$process = New-Object -TypeName 'System.Diagnostics.Process' -ErrorAction 'Stop'
$process.StartInfo = $processStartInfo
#Add event handler to capture process's standard output redirection
[scriptblock]$processEventHandler = { If (-not [string]::IsNullOrEmpty($EventArgs.Data)) { $Event.MessageData.AppendLine($EventArgs.Data) } }
#create string builders to store the output and errors in
$stdOutBuilder = New-Object -TypeName 'System.Text.StringBuilder' -ArgumentList ''
$stdOutEvent = Register-ObjectEvent -InputObject $process -Action $processEventHandler -EventName 'OutputDataReceived' -MessageData $stdOutBuilder -ErrorAction 'Stop'
$stdErrBuilder = New-Object -TypeName 'System.Text.StringBuilder' -ArgumentList ''
$stdErrEvent = Register-ObjectEvent -InputObject $process -Action $processEventHandler -EventName 'ErrorDataReceived' -MessageData $stdErrBuilder -ErrorAction 'Stop'
#start the process
$null = $process.Start()
#begin reading the output and errors
$process.BeginOutputReadLine()
$process.BeginErrorReadLine()
#Instructs the Process component to wait indefinitely for the associated process to exit.
$process.WaitForExit()
#HasExited indicates that the associated process has terminated, either normally or abnormally. Wait until HasExited returns $true.
While (-not ($process.HasExited)) { $process.Refresh(); Start-Sleep -Seconds 1 }
## Get the exit code for the process
Try {
    [int32]$returnCode = $process.ExitCode
}
Catch [System.Management.Automation.PSInvalidCastException] {
    # Catch exit codes that are out of int32 range
    [int32]$returnCode = 1
}
#unregister the events
If ($stdOutEvent) { Unregister-Event -SourceIdentifier $stdOutEvent.Name -ErrorAction 'Stop'; $stdOutEvent = $null }
If ($stdErrEvent) { Unregister-Event -SourceIdentifier $stdErrEvent.Name -ErrorAction 'Stop'; $stdErrEvent = $null }
$stdOut = $stdOutBuilder.ToString() -replace $null,''
$stdErr = $stdErrBuilder.ToString() -replace $null,''
## Free resources associated with the process, this does not cause process to exit
If ($process) { $process.Dispose() }
# check if we have output and return it
If ($stdOut) {
    Write-Output "Process output:$stdOut"
}
If ($stdErr) {
    Write-Output "Process errors:$stdErr"
}
# return exit code
Write-Output "Exit code:$returnCode"
Start-Sleep -Seconds 5
Exit $returnCode

PowerShell - redirect executable's stderr to file or variable but still have stdout go to console

I'm writing a script to download several repositories from GitHub. Here is the command to download a repository:
git clone "$RepositoryUrl" "$localRepoDirectory"
When I run this command it displays some nice progress information in the console window that I want displayed.
The problem is that I also want to be able to detect if any errors have occurred while downloading. I found this post that talks about redirecting the various streams, so I tried:
(git clone "$RepositoryUrl" "$localRepoDirectory") 2> $errorLogFilePath
This pipes any errors from stderr to my file, but no longer displays the nice progress information in the console.
I can use the Tee-Object like so:
(git clone "$RepositoryUrl" "$localRepoDirectory") | Tee-Object -FilePath $errorLogFilePath
and I still get the nice progress output, but this pipes stdout to the file, not stderr; I'm only concerned with detecting errors.
Is there a way that I can store any errors that occur in a file or (preferably) a variable, while also still having the progress information piped to the console window? I have a feeling that the answer might lie in redirecting various streams into other streams as discussed in this post, but I'm not really sure.
======== Update =======
I'm not sure if the git.exe is different than your typical executable, but I've done some more testing and here is what I've found:
$output = (git clone "$RepositoryUrl" "$localRepoDirectory")
$output always contains the text "Cloning into '[localRepoDirectory]'...", whether the command completed successfully or produced an error. Also, the progress information is still written to the console when doing this. This leads me to think that the progress information is not written via stdout, but by some other stream?
If an error occurs, the error is written to the console, but in the usual white foreground color, not the typical red for errors and yellow for warnings. When this is called from within a cmdlet function and the command fails with an error, the error is NOT returned via the function's -ErrorVariable (or -WarningVariable) parameter (however, if I do my own Write-Error, that does get returned via -ErrorVariable). This leads me to think that git.exe doesn't write to stderr, but when we do:
(git clone "$RepositoryUrl" "$localRepoDirectory") 2> $errorLogFilePath
The error message is written to the file, so that makes me think that it does write to stderr. So now I'm confused...
======== Update 2 =======
So with Byron's help I've tried a couple more solutions using a new process, but still can't get what I want. When using a new process I never get the nice progress written to the console.
The three new methods that I've tried all use this bit of code in common:
$process = New-Object System.Diagnostics.Process
$process.StartInfo.Arguments = "clone ""$RepositoryUrl"" ""$localRepoDirectory"""
$process.StartInfo.UseShellExecute = $false
$process.StartInfo.RedirectStandardOutput = $true
$process.StartInfo.RedirectStandardError = $true
$process.StartInfo.CreateNoWindow = $true
$process.StartInfo.WorkingDirectory = $WORKING_DIRECTORY
$process.StartInfo.FileName = "git"
Method 1 - Run in new process and read output afterwards:
$process.Start()
$process.WaitForExit()
Write-Host Output - $process.StandardOutput.ReadToEnd()
Write-Host Errors - $process.StandardError.ReadToEnd()
Method 2 - Get output synchronously:
$process.Start()
while (!$process.HasExited)
{
Write-Host Output - $process.StandardOutput.ReadToEnd()
Write-Host Error Output - $process.StandardError.ReadToEnd()
Start-Sleep -Seconds 1
}
Even though this looks like it would write the output while the process is running, it doesn't write anything until after the process exits.
Method 3 - Get output asynchronously:
Register-ObjectEvent -InputObject $process -EventName "OutputDataReceived" -Action {Write-Host Output Data - $args[1].Data }
Register-ObjectEvent -InputObject $process -EventName "ErrorDataReceived" -Action { Write-Host Error Data - $args[1].Data }
$process.Start()
$process.BeginOutputReadLine()
$process.BeginErrorReadLine()
while (!$process.HasExited)
{
Start-Sleep -Seconds 1
}
This does output data while the process is working which is good, but it still doesn't display the nice progress information :(
I think I have your answer. I've been working with PowerShell for a while and have created several build systems. Sorry if the script is a bit long, but it works.
$dir = <your dir>
$global:log = <your log file which must be in the global scope> # Not global = won't work
function Create-Process {
    $process = New-Object -TypeName System.Diagnostics.Process
    $process.StartInfo.CreateNoWindow = $false
    $process.StartInfo.RedirectStandardError = $true
    $process.StartInfo.UseShellExecute = $false
    return $process
}
function Terminate-Process {
    param([System.Diagnostics.Process]$process)
    $code = $process.ExitCode
    $process.Close()
    $process.Dispose()
    Remove-Variable process
    return $code
}
function Launch-Process {
    param([System.Diagnostics.Process]$process, [string]$log, [int]$timeout = 0)
    $errorjob = Register-ObjectEvent -InputObject $process -EventName ErrorDataReceived -SourceIdentifier Common.LaunchProcess.Error -action {
        if(-not [string]::IsNullOrEmpty($EventArgs.data)) {
            "ERROR - $($EventArgs.data)" | Out-File $log -Encoding ASCII -Append
            Write-Host "ERROR - $($EventArgs.data)"
        }
    }
    $outputjob = Register-ObjectEvent -InputObject $process -EventName OutputDataReceived -SourceIdentifier Common.LaunchProcess.Output -action {
        if(-not [string]::IsNullOrEmpty($EventArgs.data)) {
            "Out - $($EventArgs.data)" | Out-File $log -Encoding ASCII -Append
            Write-Host "Out - $($EventArgs.data)"
        }
    }
    if($errorjob -eq $null) {
        "ERROR - The error job is null" | Out-File $log -Encoding ASCII -Append
        Write-Host "ERROR - The error job is null"
    }
    if($outputjob -eq $null) {
        "ERROR - The output job is null" | Out-File $log -Encoding ASCII -Append
        Write-Host "ERROR - The output job is null"
    }
    $process.Start()
    $process.BeginErrorReadLine()
    if($process.StartInfo.RedirectStandardOutput) {
        $process.BeginOutputReadLine()
    }
    $ret = $null
    if($timeout -eq 0)
    {
        $process.WaitForExit()
        $ret = $true
    }
    else
    {
        if(-not($process.WaitForExit($timeout)))
        {
            Write-Host "ERROR - The process is not completed, after the specified timeout: $($timeout)"
            $ret = $false
        }
        else
        {
            $ret = $true
        }
    }
    # Cancel the event registrations
    Remove-Event * -ErrorAction SilentlyContinue
    Unregister-Event -SourceIdentifier Common.LaunchProcess.Error
    Unregister-Event -SourceIdentifier Common.LaunchProcess.Output
    Stop-Job $errorjob.Id
    Remove-Job $errorjob.Id
    Stop-Job $outputjob.Id
    Remove-Job $outputjob.Id
    $ret
}
$repo = <your repo>
$process = Create-Process
$process.StartInfo.RedirectStandardOutput = $true
$process.StartInfo.FileName = "git.exe"
$process.StartInfo.Arguments = "clone $($repo)"
$process.StartInfo.WorkingDirectory = $dir
Launch-Process $process $global:log
Terminate-Process $process
The log file must be in the global scope because the routine which runs the event processing is not in the script scope.
Sample of my log file:
Out - Cloning into ''...
ERROR - Checking out files: 22% (666/2971)
ERROR - Checking out files: 23% (684/2971)
ERROR - Checking out files: 24% (714/2971)
You can do this by putting the git clone command inside an advanced function e.g.:
function Clone-Git {
    [CmdletBinding()]
    param($repoUrl, $localRepoDir)
    git clone $repoUrl $localRepoDir
}
Clone-Git $RepositoryUrl $localRepoDirectory -ev cloneErrors
$cloneErrors
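A small follow-up sketch of my own (not part of the original answer): after the call, the stderr lines are available in $cloneErrors and the native exit code in the automatic $LASTEXITCODE variable, so a failure can be detected like this:
# Inspect the captured error records and git's exit code after the clone.
if ($LASTEXITCODE -ne 0 -or $cloneErrors) {
    Write-Warning "git clone reported problems:`n$($cloneErrors -join [Environment]::NewLine)"
}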
If you use System.Diagnostics.Process to start Git, you can redirect all the error and output.
I just had to solve this problem for Inkscape:
$si = New-Object System.Diagnostics.ProcessStartInfo
$si.Arguments = YOUR PROCESS ARGS
$si.UseShellExecute = $false
$si.RedirectStandardOutput = $true
$si.RedirectStandardError = $true
$si.WorkingDirectory = $workingDir
$si.FileName = EXECUTABLE LOCATION
$process = [Diagnostics.Process]::Start($si)
while (!($process.HasExited))
{
    # Do what you want with stderr and stdout here
    Start-Sleep -s 1 # Sleep for 1 second
}
You can, of course, wrap this in a function with proper arguments...

Captured Output of command run by PowerShell Is Sometimes Incomplete

I'm running the DTEXEC.exe command from within a PowerShell script, trying to capture and log the output to a file. Sometimes the output is incomplete and I'm trying to figure out why this is the case and what might be done about it. The lines that never seem to get logged are the most interesting ones:
DTEXEC: The package execution returned DTSER_SUCCESS(0)
Started: 10:58:43 a.m.
Finished: 10:59:24 a.m.
Elapsed: 41.484 seconds
The output always seems incomplete on packages that execute in less than ~ 8 seconds and this might be a clue (there isn't much output or they finish quickly).
I'm using .NET's System.Diagnostics.Process and ProcessStartInfo to set up and run the command, and I'm redirecting stdout and stderr to event handlers that each append to a StringBuilder, which is subsequently written to disk.
The problem feels like a timing issue or a buffering issue. To solve the timing issue, I've attempted to use Monitor.Enter/Exit. If it's a buffering issue, I'm not sure how to force the Process not to buffer stdout and stderr.
The environment is
- PowerShell 2 running CLR version 2
- SQL 2008 32-bit DTEXEC.exe
- Host Operating System: XP Service Pack 3.
Here's the code:
function Execute-SSIS-Package
{
param([String]$fileName)
$cmd = GetDTExecPath
$proc = New-Object System.Diagnostics.Process
$proc.StartInfo.FileName = $cmd
$proc.StartInfo.Arguments = "/FILE ""$fileName"" /CHECKPOINTING OFF /REPORTING ""EWP"""
$proc.StartInfo.RedirectStandardOutput = $True
$proc.StartInfo.RedirectStandardError = $True
$proc.StartInfo.WorkingDirectory = Get-Location
$proc.StartInfo.UseShellExecute = $False
$proc.StartInfo.CreateNoWindow = $False
Write-Host $proc.StartInfo.FileName $proc.StartInfo.Arguments
$cmdOut = New-Object System.Text.StringBuilder
$errorEvent = Register-ObjectEvent -InputObj $proc `
    -Event "ErrorDataReceived" `
    -MessageData $cmdOut `
    -Action `
    {
        param
        (
            [System.Object] $sender,
            [System.Diagnostics.DataReceivedEventArgs] $e
        )
        try
        {
            [System.Threading.Monitor]::Enter($Event.MessageData)
            Write-Host -ForegroundColor "DarkRed" $e.Data
            [void](($Event.MessageData).AppendLine($e.Data))
        }
        catch
        {
            Write-Host -ForegroundColor "Red" "Error capturing processes std error" $Error
        }
        finally
        {
            [System.Threading.Monitor]::Exit($Event.MessageData)
        }
    }
$outEvent = Register-ObjectEvent -InputObj $proc `
    -Event "OutputDataReceived" `
    -MessageData $cmdOut `
    -Action `
    {
        param
        (
            [System.Object] $sender,
            [System.Diagnostics.DataReceivedEventArgs] $e
        )
        try
        {
            [System.Threading.Monitor]::Enter($Event.MessageData)
            #Write-Host $e.Data
            [void](($Event.MessageData).AppendLine($e.Data))
        }
        catch
        {
            Write-Host -ForegroundColor "Red" "Error capturing processes std output" $Error
        }
        finally
        {
            [System.Threading.Monitor]::Exit($Event.MessageData)
        }
    }
$isStarted = $proc.Start()
$proc.BeginOutputReadLine()
$proc.BeginErrorReadLine()
while (!$proc.HasExited)
{
    Start-Sleep -Milliseconds 100
}
Start-Sleep -Milliseconds 1000
$procExitCode = $proc.ExitCode
$procStartTime = $proc.StartTime
$procFinishTime = Get-Date
$proc.Close()
$proc.CancelOutputRead()
$proc.CancelErrorRead()
$result = New-Object PsObject -Property @{
    ExitCode = $procExitCode
    StartTime = $procStartTime
    FinishTime = $procFinishTime
    ElapsedTime = $procFinishTime.Subtract($procStartTime)
    StdErr = ""
    StdOut = $cmdOut.ToString()
}
return $result
}
The reason that your output is truncated is that Powershell returns from WaitForExit() and sets the HasExited property before it has processed all the output events in the queue.
One solution is to loop for an arbitrary amount of time with short sleeps, to allow the events to be processed; PowerShell event processing appears not to be pre-emptive, so a single long sleep does not allow events to process.
A much better solution is to also register for the Exited event (in addition to Output and Error events) on the Process. This event is the last in the queue so if you set a flag when this event occurs then you can loop with short sleeps until this flag is set and know that you have processed all the output events.
I have written up a full solution on my blog but the core snippet is:
# Set up a pair of stringbuilders to which we can stream the process output
$global:outputSB = New-Object -TypeName "System.Text.StringBuilder";
$global:errorSB = New-Object -TypeName "System.Text.StringBuilder";
# Flag that shows that final process exit event has not yet been processed
$global:myprocessrunning = $true
$ps = new-object System.Diagnostics.Process
$ps.StartInfo.Filename = $target
$ps.StartInfo.WorkingDirectory = Split-Path $target -Parent
$ps.StartInfo.UseShellExecute = $false
$ps.StartInfo.RedirectStandardOutput = $true
$ps.StartInfo.RedirectStandardError = $true
$ps.StartInfo.CreateNoWindow = $true
# Register Asynchronous event handlers for Standard and Error Output
Register-ObjectEvent -InputObject $ps -EventName OutputDataReceived -action {
    if(-not [string]::IsNullOrEmpty($EventArgs.data)) {
        $global:outputSB.AppendLine(((get-date).toString('yyyyMMddHHmm')) + " " + $EventArgs.data)
    }
} | Out-Null
Register-ObjectEvent -InputObject $ps -EventName ErrorDataReceived -action {
    if(-not [string]::IsNullOrEmpty($EventArgs.data)) {
        $global:errorSB.AppendLine(((get-date).toString('yyyyMMddHHmm')) + " " + $EventArgs.data)
    }
} | Out-Null
Register-ObjectEvent -InputObject $ps -EventName Exited -action {
    $global:myprocessrunning = $false
} | Out-Null
$ps.start() | Out-Null
$ps.BeginOutputReadLine();
$ps.BeginErrorReadLine();
# We set a timeout after which time the process will be forceably terminated
$processTimeout = $timeoutseconds * 1000
while (($global:myprocessrunning -eq $true) -and ($processTimeout -gt 0)) {
    # We must use lots of short sleeps rather than a single long one, otherwise events are not processed
    $processTimeout -= 50
    Start-Sleep -m 50
}
if ($processTimeout -le 0) {
    Add-Content -Path $logFile -Value (((get-date).toString('yyyyMMddHHmm')) + " PROCESS EXCEEDED EXECUTION ALLOWANCE AND WAS ABENDED!")
    $ps.Kill()
}
# Append the Standard and Error Output to log file, we don't use Add-Content as it appends a carriage return that is not required
[System.IO.File]::AppendAllText($logFile, $global:outputSB)
[System.IO.File]::AppendAllText($logFile, $global:errorSB)
My 2 cents... it's not a PowerShell issue but an issue/bug in the System.Diagnostics.Process class and the underlying shell. I've seen times when wrapping StdError and StdOut does not catch everything, and other times when the 'listening' wrapper application hangs indefinitely because of HOW the underlying application writes to the console (in the C/C++ world there are MANY different ways to do this, e.g. WriteFile, fprintf, cout, etc.).
In addition, there are more than 2 outputs that may need to be captured, but the .NET Framework only shows you those two (given that they are the two primary ones); see the article about command redirection linked here, as it starts to give hints.
My guess (for both your issue as well as mine) is that it has to do with some low-level buffer flushing and/or ref counting. (If you want to get deep, you can start here.)
One (very hacky) way to get around this is, instead of executing the program directly, to wrap it in a call to cmd.exe with 2>&1; a rough sketch follows. This method has its own pitfalls and issues, though.
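Something along these lines (illustrative only; the DTEXEC path and arguments are placeholders, and quoting gets fiddly if they contain spaces):
# Hack sketch: let cmd.exe merge stderr into stdout so only one pipe has to be read.
$psi = New-Object System.Diagnostics.ProcessStartInfo
$psi.FileName = $env:ComSpec                                          # cmd.exe
$psi.Arguments = '/c C:\Tools\DTEXEC.exe /FILE package.dtsx 2>&1'     # placeholder command
$psi.UseShellExecute = $false
$psi.RedirectStandardOutput = $true                                   # everything now arrives here
$proc = [System.Diagnostics.Process]::Start($psi)
$allOutput = $proc.StandardOutput.ReadToEnd()                         # single stream, so ReadToEnd cannot deadlock
$proc.WaitForExit()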
The most ideal solution is for the executable to have a logging parameter, and then go parse the log file after the process exits...but most of the time you don't have that option.
But wait, we're using PowerShell... why are you using System.Diagnostics.Process in the first place? You can just call the command directly:
$output = & (GetDTExecPath) /FILE "$fileName" /CHECKPOINTING OFF /REPORTING "EWP"
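To round this out, a small sketch (my addition, assuming a build step that also needs the error output and the exit code): with the direct call, stderr can be merged in with 2>&1 and the native exit code is available in $LASTEXITCODE.
# Direct invocation: capture stdout+stderr and propagate the exit code, e.g. for a CI step.
$output = & (GetDTExecPath) /FILE "$fileName" /CHECKPOINTING OFF /REPORTING "EWP" 2>&1
$output | Add-Content -Path $logFile    # $logFile as used in the earlier answer
exit $LASTEXITCODE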