It sounds like a reasonable expectation that events fired from one and the same thread should be received in the order in which they were fired. However, that doesn't seem to be the case. Is this a known/documented behavior, and is there any recourse to correct it?
Below are two ready-to-run code snippets that exhibit the issue, tested with PS v5.1 under both Win7 and Win10.
(a) Events fired from a thread in a separate job (i.e. a different process).
$events = 1000
$recvd = 0
$ooseq = 0
$job = Register-EngineEvent -SourceIdentifier 'Posted' -Action {
$global:recvd++
if($global:recvd -ne $event.messageData) {
$global:ooseq++
("-?- event #{0} received as #{1}" -f $event.messageData, $global:recvd)
} }
$run = Start-Job -ScriptBlock {
Register-EngineEvent -SourceIdentifier 'Posted' -Forward
for($n = 1; $n -le $using:events; $n++) {
[void] (New-Event -SourceIdentifier 'Posted' -MessageData $n)
} }
Receive-Job -Job $run -Wait -AutoRemoveJob
Unregister-Event -SourceIdentifier 'Posted'
Receive-Job -Job $job -Wait -AutoRemoveJob
if($events -eq $script:recvd) {
("total {0} events" -f $events)
} else {
("total {0} events events, {1} received" -f $events, $recvd)
}
if($ooseq -ne 0) {
("{0} out-of-sequence events" -f $ooseq)
}
Sample output from a failure case (out of a batch of 100 consecutive runs).
-?- event #545 received as #543
-?- event #543 received as #544
-?- event #546 received as #545
-?- event #544 received as #546
total 1000 events
4 out-of-sequence events
(b) Events fired from a separate runspace (i.e. a different thread).
$recvd = 0
$ooseq = 0
$job = Register-EngineEvent -SourceIdentifier 'Posted' -Action {
$global:recvd++
if($recvd -ne $event.messageData) {
$global:ooseq++
("-?- event #{0} received as #{1}" -f $event.messageData, $recvd)
}}
$sync = [hashTable]::Synchronized(@{})
$sync.Host = $host
$sync.events = 1000
$sync.posted = 0
$rs = [runspaceFactory]::CreateRunspace()
$rs.ApartmentState = "STA"
$rs.ThreadOptions = "ReUseThread"
$rs.Open()
$rs.SessionStateProxy.SetVariable("sync",$sync)
$ps = [powerShell]::Create().AddScript({
for($n = 1; $n -le $sync.events; $n++) {
$sync.Host.Runspace.Events.GenerateEvent('Posted', $null, $null, $n)
$sync.posted++
}})
$ps.runspace = $rs
$thd = $ps.BeginInvoke()
$ret = $ps.EndInvoke($thd)
$ps.Dispose()
Unregister-Event -SourceIdentifier 'Posted'
Receive-Job -Job $job -Wait -AutoRemoveJob
if($sync.events -eq $recvd) {
("total {0} events" -f $sync.events)
} else {
("total {0} events fired, {1} posted, {2} received" -f $sync.events, $sync.posted, $recvd)
}
if($ooseq -ne 0) {
("{0} out-of-sequence events" -f $ooseq)
}
Failure cases resemble the sample one posted under (a) above, except a few runs also had events dropped altogether. This, however, is more likely related to the other question Action-based object events sometimes lost.
total 1000 events fired, 1000 posted, 999 received
484 out-of-sequence events
[ EDIT ] I ran some additional tests for case (b) specifically, and confirmed that:
the receiving Action (where $global:recvd++) is always called on the same managed thread (this was confirmed by saving and comparing the [System.Threading.Thread]::CurrentThread.ManagedThreadId between calls);
the receiving Action is not re-entered during execution (this was confirmed by adding a global "nesting" counter, wrapping the Action between [System.Threading.Interlocked]::Increment/Decrement calls and checking that the counter never takes any values other than 0 and 1).
These eliminate a couple of possible race conditions, but still do not explain why the observed behavior is happening or how to correct it, so the original question remains open.
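For reference, here is roughly how the two checks above can be wired into the -Action block (a sketch, not the exact test code; $global:tid and $global:nest are illustrative names):
$global:tid = $null
$global:nest = 0
$null = Register-EngineEvent -SourceIdentifier 'Posted' -Action {
    # same-thread check
    $tid = [System.Threading.Thread]::CurrentThread.ManagedThreadId
    if ($null -eq $global:tid) { $global:tid = $tid }
    elseif ($global:tid -ne $tid) { ("-?- thread changed: {0} -> {1}" -f $global:tid, $tid) }
    # re-entrancy check
    $depth = [System.Threading.Interlocked]::Increment([ref]$global:nest)
    if ($depth -ne 1) { ("-?- handler re-entered, depth {0}" -f $depth) }
    try {
        $global:recvd++    # ...original handler body...
    } finally {
        $null = [System.Threading.Interlocked]::Decrement([ref]$global:nest)
    }
}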
Is this a known/documented behavior?
"Normally" event handling is asynchronous by design. And this is the case in PowerShell with cmdlets like Register-EngineEvent -Action. This is indeed known and intended behaviour. You can read more about PowerShell eventing here and here. Both Microsoft sources point out, that this way of event handling is asynchronous:
PowerShell Eventing lets you respond to the asynchronous notifications
that many objects support.
and
NOTE These cmdlets can only be used for asynchronous .NET events. It’s
not possible to set up event handlers for synchronous events using the
PowerShell eventing cmdlets. This is because synchronous events all
execute on the same thread and the cmdlets expect (require) that the
events will happen on another thread. Without the second thread, the
PowerShell engine will simply block the main thread and nothing will
ever get executed.
So that's basically what you are doing: you forward events from your background job to the event subscriber that has an action defined, and the action is performed without blocking your background job. As far as I can tell, there is nothing more to expect. There is no requirement specified to process the forwarded events in any particular order. Even the -Forward switch does not guarantee anything beyond passing the events along:
Indicates that the cmdlet sends events for this subscription to the
session on the local computer. Use this parameter when you are
registering for events on a remote computer or in a remote session.
It is hard, and maybe impossible, to find any documentation on the internals of these cmdlets. Keep in mind that, as far as I know, Microsoft does not publish documentation about such internals; instead it is up to the MVPs to guess what happens inside and write books about it (to put it bluntly).
So since there is no requirement to process the events in a certain order, and PowerShell merely has the task of performing actions on an event queue, it is also allowed to perform those actions in parallel to accelerate the processing of the event queue.
Test your scripts on a VM with only one vCPU. The wrong order will still occur sometimes, but far more rarely: less (real) parallelism means fewer opportunities to scramble the order. Of course, you cannot prevent the logical parallelism of different threads being scheduled on one physical core, so some "errors" remain.
Is there any recourse to correct it?
I put "normally" into quotation marks, because there are ways to implement it synchronously. You will have to implement your own event handler of type System.EventHandler. I recommend reading this article to get an example for an implementation.
Another workaround is to store the events in a queue of your own and sort them after collection (runs in the ISE, not yet in the PowerShell console):
$events = 10000
$recvd = 0
$ooseq = 0
$myEventQueue = @()
$job = Register-EngineEvent -SourceIdentifier 'Posted' -Action {$global:myEventQueue += $event}
$run = Start-Job -ScriptBlock {
Register-EngineEvent -SourceIdentifier 'Posted' -Forward
for($n = 1; $n -le $using:events; $n++) {
[void] (New-Event -SourceIdentifier 'Posted' -MessageData $n)
}
}
Receive-Job -Job $run -Wait -AutoRemoveJob
Unregister-Event -SourceIdentifier 'Posted'
Receive-Job -Job $job -Wait -AutoRemoveJob
Write-Host "Wrong event order in unsorted queue:"
$i = 1
foreach ($event in $myEventQueue) {
if ($i -ne $event.messageData) {
Write-Host "Event $($event.messageData) received as event $i"
}
$i++
}
$myEventQueue = $myEventQueue | Sort-Object -Property EventIdentifier
Write-Host "Wrong event order in sorted queue:"
$i = 1
foreach ($event in $myEventQueue) {
if ($i -ne $event.messageData) {
Write-Host "Event $($event.messageData) received as event $i"
}
$i++
}
Archived links:
PowerShell eventing async 1
PowerShell eventing async 2
PowerShell eventing sync
Related
Using Register-EngineEvent -Forward and New-Event, I am trying to forward object events from a remote server to the host server; however, it does not seem to work.
To prove the theory, I tried the simple code below, which does work:
$TargetServer = 'localhost'
Register-EngineEvent -SourceIdentifier TimerEventOccured -Action {
Write-Host "$(Get-Date -format "dd-MM-yyyy hh:mm:ss tt") - $($event.MessageData) received..." -ForegroundColor Green
} | Out-Null
$TimerScriptBlock = {
Register-EngineEvent -SourceIdentifier TimerEventOccured -Forward | Out-Null
$Count = 1
while($Count -lt 3) {
New-Event -SourceIdentifier TimerEventOccured -MessageData 'Timertriggered'
Start-Sleep -Seconds 5
$Count += 1
}
}
$RemoteTimerScriptBlockJob = Invoke-Command -ComputerName $TargetServer -ScriptBlock $TimerScriptBlock -AsJob
while($RemoteTimerScriptBlockJob.State -in @('NotStarted','Running')) {
Write-Host "$(Get-Date -format "dd-MM-yyyy hh:mm:ss tt") - remote timer job still running"
Start-Sleep -Seconds 5
}
Write-Host "$(Get-Date -format "dd-MM-yyyy hh:mm:ss tt") - remote timer job complete"
...whereas the version below, which adds Register-ObjectEvent (what I actually want to achieve), doesn't.
$TargetServer = 'localhost'
Register-EngineEvent -SourceIdentifier TimerEventOccured -Action {
Write-Host "$(Get-Date -format "dd-MM-yyyy hh:mm:ss tt") - $($event.MessageData) received..." -ForegroundColor Green
} | Out-Null
$TimerScriptBlock = {
Register-EngineEvent -SourceIdentifier TimerEventOccured -Forward | Out-Null
$timer = New-Object timers.timer
$timer.Enabled = $true
$timer.Interval = 3000
Register-ObjectEvent -InputObject $timer -EventName elapsed -SourceIdentifier thetimer -Action {
New-Event -SourceIdentifier TimerEventOccured -MessageData 'Timertriggered'
}
$timer.start()
Start-Sleep -Seconds 15 #just wait long enough for timer events to trigger a few times
}
$RemoteTimerScriptBlockJob = Invoke-Command -ComputerName $TargetServer -ScriptBlock $TimerScriptBlock -AsJob
while($RemoteTimerScriptBlockJob.State -in @('NotStarted','Running')) {
Write-Host "$(Get-Date -format "dd-MM-yyyy hh:mm:ss tt") - remote timer job still running"
Start-Sleep -Seconds 5
}
Write-Host "$(Get-Date -format "dd-MM-yyyy hh:mm:ss tt") - remote timer job complete"
Could you please help? Thanks.
Update:
Please note, I could forward the timer event directly to the source server without needing the engine event as an intermediary; the timer event above is only used to illustrate the point. The real work I am dealing with is monitoring the Windows Event Log for certain event IDs (which has become too complex to share here).
So, if I were to use -Forward directly on the event-log listener object, it would create a lot of traffic from the target servers to the host session (i.e. every event written would be dispatched, as opposed to only the ones I am after). I want to process the triggered event on the remote server first (to match the input event IDs) and then forward the filtered event through an engine event, which is where I am stuck.
In short: Register-ObjectEvent isn't the problem in your case - it is the fact that you use a single Start-Sleep call after which you exit immediately, which causes most of the events to be lost.
When you suspend PowerShell's own foreground thread with Start-Sleep, you also suspend its event processing.
Specifically, this plays out as follows in your case:
While Start-Sleep runs, events are queued - the immediate side effect of which is that your events aren't processed in a timely fashion.
When Start-Sleep ends and the foreground thread regains control, the event queue starts getting processed, but since the script block ends right away, only an - unpredictable - subset of the queued events gets processed before overall execution of the remote script block ends. Seemingly, PowerShell doesn't ensure that queued events are processed before exiting.
Thus, if you break your single Start-Sleep -Seconds 15 call into multiple ones, giving PowerShell time to process events in between, your code should work:
1..3 | ForEach-Object { Start-Sleep -Seconds 5 }
Again, note that there's no guarantee that events still queued afterwards will be processed before exiting.
However - as you've later discovered - you can use Wait-Event -Timeout as a superior alternative to Start-Sleep, as it does not block -Action script-block and -Forward event processing while it waits, allowing the forwarded events to be processed in near-realtime.
Note: Wait-Event's (and also Get-Event's) primary purpose is to retrieve and output queued events, i.e. events that are not consumed by Register-ObjectEvent / Register-EngineEvent event subscriptions based on -Action or -Forward and must be retrieved and acted on on demand. However, as a beneficial side effect, Wait-Event also enables registration-based (subscriber-based) event processing (via -Action script blocks and -Forward) to occur while it waits.
The following self-contained example, which builds on your code:
Shows the use of Wait-Event, both in the remote script block and locally.
Retrieves the output produced directly by the remote script block, using Receive-Job
Performs cleanup, both of the remote job and the local event subscription.
For details, refer to the source-code comments.
Note: Because "loopback remoting" is used, the local machine must be set up for remoting and you must run WITH ELEVATION (as admin) - the #Requires -RunAsAdministrator directive enforces the latter.
#Requires -RunAsAdministrator
# Running ELEVATED is a must if you use Invoke-Command -ComputerName with the local machine.
$TargetServer = 'localhost'
$eventJob = Register-EngineEvent -SourceIdentifier TimerEventOccurred -Action {
Write-Host "$(Get-Date -format "dd-MM-yyyy hh:mm:ss tt") - $($event.MessageData) received #$((++$i))..." -ForegroundColor Green
}
$TimerScriptBlock = {
$null = Register-EngineEvent -SourceIdentifier TimerEventOccurred -Forward
$timer = New-Object timers.timer
$timer.Interval = 1000 # Fire every second
$null = Register-ObjectEvent -InputObject $timer -EventName elapsed -SourceIdentifier thetimer -Action {
Write-Host "$(Get-Date -format "dd-MM-yyyy hh:mm:ss tt") - $($event.MessageData) TRIGGERED #$((++$i))..."
New-Event -SourceIdentifier TimerEventOccurred -MessageData 'Timertriggered'
}
$timer.start()
# Produce events for a certain number of seconds.
$secs = 5
# Wait-Event - unlike Start-Sleep - does NOT block the event processing.
# Note that since events created in this remote session are either forwarded
# or handled via an -Action script block, Wait-Event will produce *no output*.
Wait-Event -Timeout $secs
# Hack only to make this sample code work more predictably:
# Ensure that the last event gets processed too:
# -Timeout only accepts *whole* seconds and unpredictable runtime conditions
# can result in the last event to not have been processed yet when Wait-Event returns.
Start-Sleep -Milliseconds 100; Get-Event
"Exiting remote script block after $secs seconds."
}
$remoteTimerScriptBlockJob = Invoke-Command -ComputerName $TargetServer -ScriptBlock $TimerScriptBlock -AsJob
Write-Host "Processing events while waiting for the remote timer job to complete..."
do {
# Note that since the TimerEventOccurred is handled via an -Action script block,
# Wait-Event will produce *no output*, but it enables processing of those script blocks,
# unlike Start-Sleep.
Wait-Event -SourceIdentifier TimerEventOccurred -Timeout 3
} while ($remoteTimerScriptBlockJob.State -in 'NotStarted', 'Running')
Write-Host "$(Get-Date -format "dd-MM-yyyy hh:mm:ss tt") - Remote timer job terminated with the following output:"
# Receive the remote script block's output and clean up the job.
$remoteTimerScriptBlockJob | Receive-Job -Wait -AutoRemoveJob
# Also clean up the local event job.
$eventJob | Remove-Job -Force # -Force is needed, because event jobs run indefinitely.
# Note: This automatically also removes the job as an event subscriber, so there's no need
# for an additional Unregister-Event call.
I'm very new to powershell.
My question is very simple if you know anything about powershell.
In the code below I'm trying to fire an event from a piece of code running asynchronously as a job. For some reason, the code doesn't work.
$callback = {
param($event);
Write "IN CALLBACK";
};
$jobScript = {
while ($true) {
sleep -s 1;
"IN JOB SCRIPT";
New-Event myEvent;
}
}
Register-EngineEvent -SourceIdentifier myEvent -Action $callback;
Start-Job -Name Job -ScriptBlock $jobScript;
while ($true) {
sleep -s 1;
"IN LOOP";
}
Expected output:
IN LOOP
IN JOB SCRIPT
IN CALLBACK
IN LOOP
IN JOB SCRIPT
IN CALLBACK
...
Actual output:
IN LOOP
IN LOOP
IN LOOP
IN LOOP
...
After some reading, I changed this line
Start-Job -Name Job -ScriptBlock $jobScript
to
Start-Job -Name Job -ScriptBlock $jobScript | Wait-Job | Receive-Job;
and I get no output at all, because the job never finishes.
It's kind of asynchronous, but not really.
It would be fairly simple to accomplish in JS.
const fireEvent = (eventName) => { ... }
const subscribeToEvent = (eventName, callback) => { ... }
const callback = () => console.log('IN CALLBACK')
subscribeToEvent('myEvent', callback);
setInterval(() => {
console.log('IN LOOP')
fireEvent('myEvent');
}, 1000)
Please, help!
Running:
while ($true) {
sleep -s 1;
"IN LOOP";
}
will always only give you:
IN LOOP
IN LOOP
IN LOOP
IN LOOP
Running this block:
$callback = {
param($event);
Write "IN CALLBACK";
};
$jobScript = {
while ($true) {
sleep -s 1;
"IN JOB SCRIPT";
New-Event myEvent;
}
}
Register-EngineEvent -SourceIdentifier myEvent -Action $callback;
Start-Job -Name Job -ScriptBlock $jobScript;
gives you a job called Job. This job will run until you stop it. The output of the job will be something like this:
get-job Job | receive-job
IN JOB SCRIPT
RunspaceId : cf385728-926c-4dda-983e-6a5cfd4fd67f
ComputerName :
EventIdentifier : 1
Sender :
SourceEventArgs :
SourceArgs : {}
SourceIdentifier : myEvent
TimeGenerated : 6/15/2019 3:35:09 PM
MessageData :
IN JOB SCRIPT
RunspaceId : cf385728-926c-4dda-983e-6a5cfd4fd67f
ComputerName :
EventIdentifier : 2
Sender :
SourceEventArgs :
SourceArgs : {}
SourceIdentifier : myEvent
TimeGenerated : 6/15/2019 3:35:10 PM
MessageData :
...
JavaScript is really Greek to me, but it seems your issue is that the event is not registered in the scope of the job. If you move the registration to the job, it behaves a bit more like you expect.
If you do this:
$jobScript = {
$callback = {
Write-Host "IN CALLBACK"
}
$null = Register-EngineEvent -SourceIdentifier myEvent -Action $callback
while ($true) {
sleep -s 1
Write-Host "IN JOB SCRIPT"
$Null = New-Event myEvent
}
}
$Null = Start-Job -Name Job -ScriptBlock $jobScript;
while ($true) {
sleep -s 1
"IN LOOP"
Get-Job -Name Job | Receive-Job
}
When running $null = Register-EngineEvent ... or $null = Start-Job ... you prevent the objects these commands create from being displayed at the console. Furthermore, you do not need to terminate your lines with ; in PowerShell.
To complement Axel Anderson's helpful answer:
Register-EngineEvent subscribes to an event in the current session, whereas commands launched with Start-Job run in a background job that is a child process, which by definition is a separate session.
Similarly, any events you raise with New-Event are only seen in the same session, which is why the calling session never saw the events.
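As an aside, that session boundary can be bridged: if the job itself registers a forwarder with Register-EngineEvent -Forward before raising events, they are relayed to the caller's session, where the local -Action subscriber sees them. A minimal sketch along the lines of the original code (note the caller uses Wait-Event rather than Start-Sleep, so that event processing isn't blocked):
$null = Register-EngineEvent -SourceIdentifier myEvent -Action { Write-Host "IN CALLBACK" }
$null = Start-Job -Name Job -ScriptBlock {
    # forward events raised in this job back to the session that started it
    $null = Register-EngineEvent -SourceIdentifier myEvent -Forward
    while ($true) {
        Start-Sleep -Seconds 1
        $null = New-Event -SourceIdentifier myEvent
    }
}
while ($true) {
    # Wait-Event pumps the event loop, letting the -Action block run as events arrive
    Wait-Event -Timeout 1
    "IN LOOP"
}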
By moving all event logic into the background job, as in Axel's answer, the events are processed in the background job, but there's an important limitation - which may or may not matter to you:
You won't be able to capture output from the event handler: while you can make its output print to the console using Write-Host, that output cannot be captured for programmatic processing. Also, such Write-Host output cannot be suppressed.
By contrast, output sent to the success output stream directly from the background job (as "IN JOB SCRIPT" is, implicitly, via the implied use of Write-Output) can be captured via Receive-Job, which retrieves the output from background jobs.
Therefore, perhaps the following is sufficient in your case, which doesn't require use of events at all:
# Code to execute in the background, defined as a script block.
$jobScript = {
while ($true) {
Start-Sleep -Seconds 1
"JOB OUTPUT #$(($i++))"
}
}
# Launch the background job and save the job object in variable $job.
$job = Start-Job -Name Job -ScriptBlock $jobScript;
# Periodically relay the background job's output.
while ($true) {
Start-Sleep -Seconds 1
"IN LOOP, about to get latest job output:"
$latestOutput = Receive-Job $job
"latest output: $latestOutput"
}
For an explanation of the $(($i++)) construct, see this answer.
The above yields the following:
IN LOOP, about to get latest job output:
latest output:
IN LOOP, about to get latest job output:
latest output: JOB OUTPUT #0
IN LOOP, about to get latest job output:
latest output: JOB OUTPUT #1
IN LOOP, about to get latest job output:
latest output: JOB OUTPUT #2
IN LOOP, about to get latest job output:
latest output: JOB OUTPUT #3
IN LOOP, about to get latest job output:
latest output: JOB OUTPUT #4
IN LOOP, about to get latest job output:
latest output: JOB OUTPUT #5
IN LOOP, about to get latest job output:
...
I have a powershell script that starts a job
start-job -scriptblock {
while($true) {
echo "Running"
Start-Sleep 2
}
}
and then it continues executing the rest of the script.
That job is kind of a monitoring one for the PID of that process.
I would like to synchronously print the PID every n seconds, without having to end the job.
For example, as the rest of the script is being executed, i would like to see output in my console.
Is something like that possible in powershell?
Thanks.
Yes, you can use events:
$job = Start-Job -ScriptBlock {
Register-EngineEvent -SourceIdentifier MyNewMessage -Forward
while($true) {
Start-Sleep -Seconds 3
$null = New-Event -SourceIdentifier MyNewMessage -MessageData "Pingback from job."
}
}
$event = Register-EngineEvent -SourceIdentifier MyNewMessage -Action {
Write-Host $event.MessageData;
}
for($i=0; $i -lt 10; $i++) {
Start-Sleep -Seconds 1
Write-Host "Pingback from main."
}
$job,$event| Stop-Job -PassThru| Remove-Job #stop the job and event listener
Credit goes to this answer. Other useful links:
How to Get Windows PowerShell to Notify You When a Job is Complete
Manage Event Subscriptions with PowerShell - Hey, Scripting Guy! Blog
I'm running the DTEXEC.exe command from within a PowerShell script, trying to capture and log the output to a file. Sometimes the output is incomplete and I'm trying to figure out why this the case and what might be done about it. The lines that never seem to get logged are the most interesting:
DTEXEC: The package execution returned DTSER_SUCCESS(0)
Started: 10:58:43 a.m.
Finished: 10:59:24 a.m.
Elapsed: 41.484 seconds
The output always seems incomplete on packages that execute in less than ~ 8 seconds and this might be a clue (there isn't much output or they finish quickly).
I'm using .NETs System.Diagnostics.Process and ProcessStartInfo to setup and run the command, and I'm redirecting stdout and stderror to event handlers that each append to a StringBuilder which is subsequently written to disk.
The problem feels like a timing issue or a buffering issue. To solve the timing issue, I've attempted to use Monitor.Enter/Exit. If it's a buffering issue, I'm not sure how to force the Process to not buffer stdout and stderror.
The environment is
- PowerShell 2 running CLR version 2
- SQL 2008 32-bit DTEXEC.exe
- Host Operating System: XP Service Pack 3.
Here's the code:
function Execute-SSIS-Package
{
param([String]$fileName)
$cmd = GetDTExecPath
$proc = New-Object System.Diagnostics.Process
$proc.StartInfo.FileName = $cmd
$proc.StartInfo.Arguments = "/FILE ""$fileName"" /CHECKPOINTING OFF /REPORTING ""EWP"""
$proc.StartInfo.RedirectStandardOutput = $True
$proc.StartInfo.RedirectStandardError = $True
$proc.StartInfo.WorkingDirectory = Get-Location
$proc.StartInfo.UseShellExecute = $False
$proc.StartInfo.CreateNoWindow = $False
Write-Host $proc.StartInfo.FileName $proc.StartInfo.Arguments
$cmdOut = New-Object System.Text.StringBuilder
$errorEvent = Register-ObjectEvent -InputObj $proc `
-Event "ErrorDataReceived" `
-MessageData $cmdOut `
-Action `
{
param
(
[System.Object] $sender,
[System.Diagnostics.DataReceivedEventArgs] $e
)
try
{
[System.Threading.Monitor]::Enter($Event.MessageData)
Write-Host -ForegroundColor "DarkRed" $e.Data
[void](($Event.MessageData).AppendLine($e.Data))
}
catch
{
Write-Host -ForegroundColor "Red" "Error capturing processes std error" $Error
}
finally
{
[System.Threading.Monitor]::Exit($Event.MessageData)
}
}
$outEvent = Register-ObjectEvent -InputObj $proc `
-Event "OutputDataReceived" `
-MessageData $cmdOut `
-Action `
{
param
(
[System.Object] $sender,
[System.Diagnostics.DataReceivedEventArgs] $e
)
try
{
[System.Threading.Monitor]::Enter($Event.MessageData)
#Write-Host $e.Data
[void](($Event.MessageData).AppendLine($e.Data))
}
catch
{
Write-Host -ForegroundColor "Red" "Error capturing processes std output" $Error
}
finally
{
[System.Threading.Monitor]::Exit($Event.MessageData)
}
}
$isStarted = $proc.Start()
$proc.BeginOutputReadLine()
$proc.BeginErrorReadLine()
while (!$proc.HasExited)
{
Start-Sleep -Milliseconds 100
}
Start-Sleep -Milliseconds 1000
$procExitCode = $proc.ExitCode
$procStartTime = $proc.StartTime
$procFinishTime = Get-Date
$proc.Close()
$proc.CancelOutputRead()
$proc.CancelErrorRead()
$result = New-Object PsObject -Property @{
ExitCode = $procExitCode
StartTime = $procStartTime
FinishTime = $procFinishTime
ElapsedTime = $procFinishTime.Subtract($procStartTime)
StdErr = ""
StdOut = $cmdOut.ToString()
}
return $result
}
The reason that your output is truncated is that Powershell returns from WaitForExit() and sets the HasExited property before it has processed all the output events in the queue.
One solution is to loop for an arbitrary amount of time with short sleeps to allow the events to be processed; PowerShell event processing appears not to be pre-emptive, so a single long sleep does not allow events to be processed.
A much better solution is to also register for the Exited event (in addition to Output and Error events) on the Process. This event is the last in the queue so if you set a flag when this event occurs then you can loop with short sleeps until this flag is set and know that you have processed all the output events.
I have written up a full solution on my blog but the core snippet is:
# Set up a pair of stringbuilders to which we can stream the process output
$global:outputSB = New-Object -TypeName "System.Text.StringBuilder";
$global:errorSB = New-Object -TypeName "System.Text.StringBuilder";
# Flag that shows that final process exit event has not yet been processed
$global:myprocessrunning = $true
$ps = new-object System.Diagnostics.Process
$ps.StartInfo.Filename = $target
$ps.StartInfo.WorkingDirectory = Split-Path $target -Parent
$ps.StartInfo.UseShellExecute = $false
$ps.StartInfo.RedirectStandardOutput = $true
$ps.StartInfo.RedirectStandardError = $true
$ps.StartInfo.CreateNoWindow = $true
# Register Asynchronous event handlers for Standard and Error Output
Register-ObjectEvent -InputObject $ps -EventName OutputDataReceived -action {
if(-not [string]::IsNullOrEmpty($EventArgs.data)) {
$global:outputSB.AppendLine(((get-date).toString('yyyyMMddHHmm')) + " " + $EventArgs.data)
}
} | Out-Null
Register-ObjectEvent -InputObject $ps -EventName ErrorDataReceived -action {
if(-not [string]::IsNullOrEmpty($EventArgs.data)) {
$global:errorSB.AppendLine(((get-date).toString('yyyyMMddHHmm')) + " " + $EventArgs.data)
}
} | Out-Null
Register-ObjectEvent -InputObject $ps -EventName Exited -action {
$global:myprocessrunning = $false
} | Out-Null
$ps.start() | Out-Null
$ps.BeginOutputReadLine();
$ps.BeginErrorReadLine();
# We set a timeout after which time the process will be forceably terminated
$processTimeout = $timeoutseconds * 1000
while (($global:myprocessrunning -eq $true) -and ($processTimeout -gt 0)) {
# We must use lots of shorts sleeps rather than a single long one otherwise events are not processed
$processTimeout -= 50
Start-Sleep -m 50
}
if ($processTimeout -le 0) {
Add-Content -Path $logFile -Value (((get-date).toString('yyyyMMddHHmm')) + " PROCESS EXCEEDED EXECUTION ALLOWANCE AND WAS ABENDED!")
$ps.Kill()
}
# Append the Standard and Error Output to log file, we don't use Add-Content as it appends a carriage return that is not required
[System.IO.File]::AppendAllText($logFile, $global:outputSB)
[System.IO.File]::AppendAllText($logFile, $global:errorSB)
My 2 cents... it's not a PowerShell issue but an issue/bug in the System.Diagnostics.Process class and the underlying shell. I've seen times when wrapping StdError and StdOut does not catch everything, and other times when the 'listening' wrapper application hangs indefinitely because of HOW the underlying application writes to the console (in the C/C++ world there are MANY different ways to do this, e.g. WriteFile, fprintf, cout, etc.).
In addition, there are more than two outputs that may need to be captured, but the .NET Framework only shows you those two (given they are the two primary ones); see this article about command redirection, as it starts to give hints.
My guess (for both your issue as well as mine) is that it has to do with some low-level buffer flushing and/or ref counting. (If you want to get deep, you can start here)
One (very hacky) way to get around this is, instead of executing the program directly, to wrap it in a call to cmd.exe with 2>&1, but this method has its own pitfalls and issues.
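A rough sketch of that cmd.exe wrapper follows; quoting is the usual pitfall (paths containing spaces need extra care), and GetDTExecPath and $fileName are assumed from the question's function:
# let cmd.exe merge stderr into stdout, then capture the single combined stream
$dtexec = GetDTExecPath
$merged = cmd.exe /c "$dtexec /FILE $fileName /CHECKPOINTING OFF /REPORTING EWP 2>&1"
$merged | Set-Content -Path dtexec.log    # placeholder log path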
The most ideal solution is for the executable to have a logging parameter, and then go parse the log file after the process exits...but most of the time you don't have that option.
But wait, we're using PowerShell... why are you using System.Diagnostics.Process in the first place? You can just call the command directly:
$output = & (GetDTExecPath) /FILE "$fileName" /CHECKPOINTING OFF /REPORTING "EWP"
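If stderr is needed as well, the same direct call can merge it into the captured output (error lines then arrive as ErrorRecord objects rather than plain strings):
$output = & (GetDTExecPath) /FILE "$fileName" /CHECKPOINTING OFF /REPORTING "EWP" 2>&1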
A script is executing the following steps in a loop, assume both steps take a long time to complete:
$x = DoSomeWork;
Start-Job -Name "Process $x" { DoSomeMoreWork $x; };
Step 1 blocks the script and step 2 does not, of course.
I can easily monitor the progress/state of the loop and step 1 through the console.
What I'd also like to do is monitor the job status of jobs started by step 2 while the batch is still executing.
In general, is it possible to 'attach' to or query another PowerShell session from another session? (Assuming the monitoring session does not spawn the worker session.)
If I'm following you, then you cannot share state between two different console instances. That is to say, it's not possible in the way you want to do it. However, it's not true that you cannot monitor a job from the same session. You can signal with events from within the job:
Start-Job -Name "bgsignal" -ScriptBlock {
# forward events named "progress" back to job owner
# this even works across machines ;-)
Register-EngineEvent -SourceIdentifier Progress -Forward
$percent = 0
while ($percent -lt 100) {
$percent += 10
# raise a new progress event, redirecting to $null to prevent
# it ending up in the job's output stream
New-Event -SourceIdentifier Progress -MessageData $percent > $null
# wait 5 seconds
sleep -Seconds 5
}
}
Now you have the choice to either use Wait-Event [-SourceIdentifier Progress], Register-EngineEvent -SourceIdentifier Progress [-Action { ... }] or plain old interactive Get-Event to see and/or act on progress from the same session (or a different machine if you started the job on a remote server.)
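For example, a minimal subscriber on the owning side might look like this (a sketch; pick whichever consumption style fits):
# handle forwarded Progress events as they arrive
$null = Register-EngineEvent -SourceIdentifier Progress -Action {
    Write-Host ("job progress: {0}%" -f $event.MessageData)
}

# ...or poll the queue on demand instead of using -Action:
# Wait-Event -SourceIdentifier Progress -Timeout 10
# Get-Event -SourceIdentifier Progress | ForEach-Object {
#     $_.MessageData; Remove-Event -EventIdentifier $_.EventIdentifier
# }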
It's also entirely possible you don't need the Jobs infrastructure if all work is being done on the local machine. Take a look at an old blog post of mine on the RunspaceFactory and PowerShell objects for a rudimentary script "threadpool" implementation:
http://www.nivot.org/2009/01/22/CTP3TheRunspaceFactoryAndPowerShellAccelerators.aspx
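For reference, a bare-bones sketch of that idea using a runspace pool (the worker script block and pool size below are arbitrary placeholders):
$pool = [runspacefactory]::CreateRunspacePool(1, 4)    # at most 4 concurrent runspaces
$pool.Open()
$work = 1..8 | ForEach-Object {
    $ps = [powershell]::Create()
    $ps.RunspacePool = $pool
    $null = $ps.AddScript({ param($n) Start-Sleep -Seconds 1; "worker $n done" }).AddArgument($_)
    @{ PowerShell = $ps; Handle = $ps.BeginInvoke() }
}
# harvest the results as each worker completes
$work | ForEach-Object { $_.PowerShell.EndInvoke($_.Handle); $_.PowerShell.Dispose() }
$pool.Close()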
Hope this helps,
-Oisin
State is easy to monitor:
$job = Start-Job -Name "Process $x" { DoSomeMoreWork $x }
$job.state
If you don't need to retrieve any output data from the function then you can write to output like so:
$job = Start-Job {$i=0; while (1) { "Step $i"; $i++; Start-Sleep -sec 1 }}
while ($job.State -eq 'Running')
{
Receive-Job $job.id
}
If you do need to capture the output, then you could use the progress stream I think:
$job = Start-Job {$i=0; while (1) {
Write-Progress Activity "Step $i"; $i++; Start-Sleep -sec 1 }}
while ($job.State -eq 'Running') {
$progress=$job.ChildJobs[0].progress;
$progress | %{$_.StatusDescription};
$progress.Clear(); Start-Sleep 1 }