PowerShell - How to pass a variable into a scriptblock

I'm trying to understand how I can pass a variable into a scriptblock. In the example script below, when a new file is dropped into the monitored folder it executes the $Action script block, but the $test1 variable just shows up blank. The only way I can make it work is by making it a global variable, but I don't really want to do that.
I've looked into this some and I'm more confused than when I started. Can anyone help me out or point me in the right direction to understand this?
$PathToMonitor = "\\path\to\folder"
$FileSystemWatcher = New-Object System.IO.FileSystemWatcher
$FileSystemWatcher.Path = $PathToMonitor
$FileSystemWatcher.Filter = "*.*"
$FileSystemWatcher.IncludeSubdirectories = $false
$FileSystemWatcher.EnableRaisingEvents = $true
$test1 = "Test variable"
$Action = {
    Write-Host "$test1"
}
$handlers = . {
    Register-ObjectEvent -InputObject $FileSystemWatcher -EventName Created -Action $Action -SourceIdentifier FSCreateConsumer
}
try {
    do {
        Wait-Event -Timeout 5
    } while ($true)
}
finally {
    Unregister-Event -SourceIdentifier FSCreateConsumer
    $handlers | Remove-Job
    $FileSystemWatcher.EnableRaisingEvents = $false
    $FileSystemWatcher.Dispose()
}

Hidden away in the documentation for Register-ObjectEvent, way down in the -Action parameter description is this little tidbit:
The value of the Action parameter can include the $Event, $EventSubscriber, $Sender, $EventArgs, and $Args automatic variables. These variables provide information about the event to the Action script block. For more information, see about_Automatic_Variables.
What this means is PowerShell automatically creates some variables that you can use inside the event handler scriptblock and it populates them when the event is triggered - for example:
$Action = {
    write-host ($Sender | format-list * | out-string)
    write-host ($EventArgs | format-list * | out-string)
}
When you create a file in the watched folder you'll see some output like this:
NotifyFilter : FileName, DirectoryName, LastWrite
Filters : {*}
EnableRaisingEvents : True
Filter : *
IncludeSubdirectories : False
InternalBufferSize : 8192
Path : c:\temp\scratch
Site :
SynchronizingObject :
Container :
ChangeType : Created
FullPath : c:\temp\scratch\New Text Document (3).txt
Name : New Text Document (3).txt
If these contain the information you're after then you don't actually need to pass any parameters into the scriptblock yourself :-).
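For example, a handler that only needs the new file's path can read it straight from $EventArgs - a minimal sketch:
$Action = {
    # $EventArgs is a System.IO.FileSystemEventArgs for Created events
    Write-Host "New file: $($EventArgs.FullPath) ($($EventArgs.ChangeType))"
}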
Update
If you still need to pass your own variables into the event you can use the -MessageData parameter of Register-ObjectEvent to be able to access it as $Event.MessageData inside your event scriptblock - for example:
$Action = {
    write-host ($EventArgs | format-list * | out-string)
    write-host "messagedata before = "
    write-host ($Event.MessageData | ConvertTo-Json)
    $Event.MessageData.Add($EventArgs.FullPath, $true)
    write-host "messagedata after = "
    write-host ($Event.MessageData | ConvertTo-Json)
}
$messageData = @{ };
$handlers = . {
    # note the -MessageData parameter
    Register-ObjectEvent `
        -InputObject $FileSystemWatcher `
        -EventName Created `
        -Action $Action `
        -MessageData $messageData `
        -SourceIdentifier FSCreateConsumer
}
which will output something like this when the event triggers:
ChangeType : Created
FullPath : c:\temp\scratch\New Text Document (16).txt
Name : New Text Document (16).txt
messagedata before =
{}
messagedata after =
{
"c:\\temp\\scratch\\New Text Document (16).txt": true
}
$messageData is technically still a global variable but your $Action doesn't need to know about it anymore as it takes a reference from the $Event.
Note you'll need to use a mutable data structure if you want to persist changes - you can't just assign a new value to $Event.MessageData, and it'll possibly need to be thread-safe as well.
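For instance, a synchronized hashtable (or a ConcurrentDictionary) can serve as the MessageData object - a sketch:
# Thread-safe containers that can safely be mutated from the event's background thread
$messageData = [hashtable]::Synchronized(@{})
# or, if you prefer a typed collection:
# $messageData = [System.Collections.Concurrent.ConcurrentDictionary[string,bool]]::new()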

The event action block runs on a background thread and can't resolve $test1 when dispatched.
One workaround is to explicitly read from and write to a globally-scoped variable (e.g. Write-Host $global:test1), but a better solution is to ensure the $Action block "remembers" the value of $test1 for later - something we can accomplish with a closure.
We'll need to reorganize the code slightly for this, so start by replacing the $test1 string literal with a synchronized hashtable:
$test1 = [hashtable]::Synchronized(@{
    Value = "Test variable"
})
This will allow us to do two things:
we can modify the string value without changing the identity of the object stored in $test1, and
the string value can be modified by multiple background threads without any race conditions occurring
Now we just need to create the closure from the $Action block:
$Action = {
    Write-Host $test1.Value
}.GetNewClosure()
This will bind the value of $test1 (the reference to the synchronized hashtable we just created on the line above) to the $Action block, and it will therefore "remember" that $test1 resolves to the hashtable rather than attempt (and fail) to resolve it at runtime.
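Because the closure captured a reference to the hashtable, the main script can keep updating it and the handler will see the change - illustrative only:
# Later, from the main script:
$test1.Value = "Updated value"
# The next Created event will print "Updated value"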

Related

How to provide options to event registration?

I don't understand why the following code generates error messages. Powershell seems difficult to learn.
$fsw = New-Object IO.FileSystemWatcher ...
$Action = {Param($option)
if ($option -eq "Copy")
{Write-Host "Copy was specified"}
}
Register-ObjectEvent $fsw Created -SourceIdentifier FileCreated -Action $Action
Register-ObjectEvent $fsw Changed -SourceIdentifier FileChanged -Action $Action -ArgumentList Copy
(This is an updated version of question How to provide options to script blocks?)
When in doubt, read the documentation. The Register-ObjectEvent cmdlet does not have a parameter -ArgumentList, so getting an error when trying to call the cmdlet with a parameter that it doesn't have is hardly surprising. And as I already told you in my comment to your previous question, passing arguments to an event action the way you're trying to do does not make much sense in the first place.
If your actions are so similar that defining different action script blocks would yield too much boilerplate you can easily discriminate by event inside the scriptblock, since that information is passed to the action automatically (via the variable $Event).
$Action = {
    $name = $Event.SourceEventArgs.Name
    $type = $Event.SourceEventArgs.ChangeType
    switch ($type) {
        'Created' { Write-Host "file ${name} was created" }
        'Changed' { Write-Host "file ${name} was changed" }
    }
}

Using FileSystemWatcher to alert if multiple files change simultaneously

I am very new to PowerShell. The following code was created by BigTeddy and he gets the full credit (I also made some changes using the while loop).
I want to know how I can create an if/else statement such that if more than one file is changed/edited/created/deleted at the same time (say ten files have been edited simultaneously), a log file will be created saying that this list of files was edited simultaneously at that particular time.
The PowerShell script BigTeddy created basically spits out a log file (and output in the PowerShell ISE) showing when a change/edit/creation/deletion has been made, the time it happened, and what file(s) were affected.
param(
    [string]$folderToWatch = "C:\Users\gordon\Desktop\powershellStart"
    , [string]$filter = "*.*"
    , [string]$logFile = 'C:\Users\gordon\Desktop\powershellDest\filewatcher.log'
)
# In the following line, you can change 'IncludeSubdirectories' to $true if required.
$fsw = New-Object IO.FileSystemWatcher $folderToWatch, $filter -Property @{IncludeSubdirectories = $false; NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite'}
$timeStamp #My changes
$timeStampPrev = $timeStamp #My changes
# This script block is used/called by all 3 events and:
# appends the event to a log file, as well as reporting the event back to the console
$scriptBlock = {
# REPLACE THIS SECTION WITH YOUR PROCESSING CODE
$logFile = $event.MessageData # message data is how we pass in an argument to the event script block
$name = $Event.SourceEventArgs.Name
$changeType = $Event.SourceEventArgs.ChangeType
$timeStamp = $Event.TimeGenerated
while($timeStampPrev -eq $timeStamp) { #My changes
Write-Host "$timeStamp|$changeType|'$name'" -fore green
Out-File -FilePath $logFile -Append -InputObject "$timeStamp|$changeType|'$name'"
# REPLACE THIS SECTION WITH YOUR PROCESSING CODE
}
}
# Here, all three events are registered. You need only subscribe to events that you need:
Register-ObjectEvent $fsw Created -SourceIdentifier FileCreated -MessageData $logFile -Action $scriptBlock
Register-ObjectEvent $fsw Deleted -SourceIdentifier FileDeleted -MessageData $logFile -Action $scriptBlock
Register-ObjectEvent $fsw Changed -SourceIdentifier FileChanged -MessageData $logFile -Action $scriptBlock
# To stop the monitoring, run the following commands:
# Unregister-Event FileDeleted ; Unregister-Event FileCreated ; Unregister-Event FileChanged
#This script uses the .NET FileSystemWatcher class to monitor file events in folder(s).
#The advantage of this method over using WMI eventing is that this can monitor sub-folders.
#The -Action parameter can contain any valid Powershell commands.
#The script can be set to a wildcard filter, and IncludeSubdirectories can be changed to $true.
#You need not subscribe to all three types of event. All three are shown for example.
Have you considered a hash table with the timestamp and action (create/modify/delete) as keys and the file names as values? You could iterate through the dictionary after a specific wait interval and flush the entries to a log file.
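A rough sketch of that idea (this is not BigTeddy's code; it assumes the $fsw watcher and $logFile path from the script above, and buffers events in a synchronized hashtable before flushing):
$buffer = [hashtable]::Synchronized(@{})
$scriptBlock = {
    $buffer = $Event.MessageData
    # key: timestamp + change type; value: list of file names seen for that key
    $key = '{0:yyyy-MM-dd HH:mm:ss}|{1}' -f $Event.TimeGenerated, $Event.SourceEventArgs.ChangeType
    if (-not $buffer.ContainsKey($key)) { $buffer[$key] = [System.Collections.ArrayList]@() }
    [void]$buffer[$key].Add($Event.SourceEventArgs.Name)
}
Register-ObjectEvent $fsw Changed -SourceIdentifier FileChangedBuffered -MessageData $buffer -Action $scriptBlock
while ($true) {
    Start-Sleep -Seconds 5   # wait interval before flushing
    foreach ($key in @($buffer.Keys)) {
        $files = $buffer[$key]
        if ($files.Count -gt 1) {
            Add-Content $logFile "$key|$($files -join ', ')|edited simultaneously"
        }
        $buffer.Remove($key)
    }
}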

Powershell - Listen for file, do something if file exists [duplicate]

Is there any simple way (i.e., a script) to watch a file in PowerShell and run commands if the file changes? I have been googling but can't find a simple solution. Basically I run a script in PowerShell and if the file changes then PowerShell runs other commands.
EDIT
Ok, I think I made a mistake. I don't need a script, I need a function that I can include in my $PROFILE.ps1 file. But still, I've been trying hard and I'm still unable to write it, so I will give a bounty. It has to look like this:
function watch($command, $file) {
if($file #changed) {
#run $command
}
}
There is an NPM module that does what I want, watch, but it only watches folders, not files, and it's not PowerShell xD.
Here is an example I have found in my snippets. Hopefully it is a little bit more comprehensive.
First you need to create a file system watcher and subsequently you subscribe to an event that the watcher is generating. This example listens for “Create” events, but could easily be modified to watch out for “Change”.
$folder = "C:\Users\LOCAL_~1\AppData\Local\Temp\3"
$filter = "*.LOG"
$Watcher = New-Object IO.FileSystemWatcher $folder, $filter -Property @{
    IncludeSubdirectories = $false
    NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite'
}
$onCreated = Register-ObjectEvent $Watcher -EventName Created -SourceIdentifier FileCreated -Action {
    $path = $Event.SourceEventArgs.FullPath
    $name = $Event.SourceEventArgs.Name
    $changeType = $Event.SourceEventArgs.ChangeType
    $timeStamp = $Event.TimeGenerated
    Write-Host "The file '$name' was $changeType at $timeStamp"
    Write-Host $path
    #Move-Item $path -Destination $destination -Force -Verbose
}
I will try to narrow this down to your requirements.
If you run this as part of your "profile.ps1" script you should read The Power of Profiles which explains the different profile scripts available and more.
Also, you should understand that waiting for a change in a folder can't be run as a function in the script. The profile script has to be finished, for your PowerShell session to start. You can, however use a function to register an event.
What this does, is register a piece of code, to be executed every time an event is triggered. This code will be executed in the context of your current PowerShell host (or shell) while the session remains open. It can interact with the host session, but has no knowledge of the original script that registered the code. The original script has probably finished already, by the time your code is triggered.
Here is the code:
Function Register-Watcher {
    param ($folder)
    $filter = "*.*" #all files
    $watcher = New-Object IO.FileSystemWatcher $folder, $filter -Property @{
        IncludeSubdirectories = $false
        EnableRaisingEvents = $true
    }
    $changeAction = [scriptblock]::Create('
        # This is the code which will be executed every time a file change is detected
        $path = $Event.SourceEventArgs.FullPath
        $name = $Event.SourceEventArgs.Name
        $changeType = $Event.SourceEventArgs.ChangeType
        $timeStamp = $Event.TimeGenerated
        Write-Host "The file $name was $changeType at $timeStamp"
    ')
    Register-ObjectEvent $Watcher -EventName "Changed" -Action $changeAction
}
Register-Watcher "c:\temp"
After running this code, change any file in the "C:\temp" directory (or any other directory you specify). You will see an event triggering execution of your code.
Also, valid FileSystemWatcher events you can register are "Changed", "Created", "Deleted" and "Renamed".
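If you want the same handler for several of them, you can register it once per event name - a sketch, reusing the $watcher and $changeAction variables from inside the function above:
foreach ($eventName in 'Changed', 'Created', 'Deleted', 'Renamed') {
    Register-ObjectEvent $watcher -EventName $eventName -Action $changeAction
}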
I will add another answer, because my previous one missed the requirements.
Requirements
Write a function to WAIT for a change in a specific file
When a change is detected the function will execute a predefined command and return execution to the main script
File path and command are passed to the function as parameters
There is already an answer using file hashes. I want to follow up my previous answer and show you how this can be accomplished using FileSystemWatcher.
$File = "C:\temp\log.txt"
$Action = 'Write-Output "The watched file was changed"'
$global:FileChanged = $false
function Wait-FileChange {
    param(
        [string]$File,
        [string]$Action
    )
    $FilePath = Split-Path $File -Parent
    $FileName = Split-Path $File -Leaf
    $ScriptBlock = [scriptblock]::Create($Action)
    $Watcher = New-Object IO.FileSystemWatcher $FilePath, $FileName -Property @{
        IncludeSubdirectories = $false
        EnableRaisingEvents = $true
    }
    $onChange = Register-ObjectEvent $Watcher Changed -Action {$global:FileChanged = $true}
    while ($global:FileChanged -eq $false){
        Start-Sleep -Milliseconds 100
    }
    & $ScriptBlock
    Unregister-Event -SubscriptionId $onChange.Id
}
Wait-FileChange -File $File -Action $Action
Here is the solution I ended up with based on several of the previous answers here. I specifically wanted:
My code to be code, not a string
My code to be run on the I/O thread so I can see the console output
My code to be called every time there was a change, not once
Side note: I've left in the details of what I wanted to run due to the irony of using a global variable to communicate between threads so I can compile Erlang code.
Function RunMyStuff {
    # this is the bit we want to happen when the file changes
    Clear-Host # remove previous console output
    & 'C:\Program Files\erl7.3\bin\erlc.exe' 'program.erl' # compile some erlang
    erl -noshell -s program start -s init stop # run the compiled erlang program:start()
}

Function Watch {
    $global:FileChanged = $false # dirty... any better suggestions?
    $folder = "M:\dev\Erlang"
    $filter = "*.erl"
    $watcher = New-Object IO.FileSystemWatcher $folder, $filter -Property @{
        IncludeSubdirectories = $false
        EnableRaisingEvents = $true
    }
    Register-ObjectEvent $Watcher "Changed" -Action {$global:FileChanged = $true} > $null
    while ($true){
        while ($global:FileChanged -eq $false){
            # We need this to block the IO thread until there is something to run
            # so the script doesn't finish. If we call the action directly from
            # the event it won't be able to write to the console
            Start-Sleep -Milliseconds 100
        }
        # a file has changed, run our stuff on the I/O thread so we can see the output
        RunMyStuff
        # reset and go again
        $global:FileChanged = $false
    }
}
RunMyStuff # run the action at the start so I can see the current output
Watch
You could pass in folder/filter/action into watch if you want something more generic. Hopefully this is a helpful starting point for someone else.
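A possible generic shape for that, as a sketch (the function and parameter names here are illustrative, not part of the original answer):
Function Watch-Folder {
    param (
        [string]$Folder,
        [string]$Filter = '*.*',
        [scriptblock]$Action
    )
    $global:FileChanged = $false
    $watcher = New-Object IO.FileSystemWatcher $Folder, $Filter -Property @{
        IncludeSubdirectories = $false
        EnableRaisingEvents = $true
    }
    Register-ObjectEvent $watcher "Changed" -Action {$global:FileChanged = $true} > $null
    while ($true) {
        while ($global:FileChanged -eq $false) { Start-Sleep -Milliseconds 100 }
        & $Action   # run the caller's code on this thread so its output is visible
        $global:FileChanged = $false
    }
}
# e.g. Watch-Folder -Folder "M:\dev\Erlang" -Filter "*.erl" -Action { RunMyStuff }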
Calculate the hash of a list of files
Store it in a dictionary
Check each hash on an interval
Perform action when hash is different
function watch($f, $command, $interval) {
    $sha1 = New-Object System.Security.Cryptography.SHA1CryptoServiceProvider
    $hashfunction = '[System.BitConverter]::ToString($sha1.ComputeHash([System.IO.File]::ReadAllBytes($file)))'
    $files = @{}
    foreach ($file in $f) {
        $hash = iex $hashfunction
        $files[$file.Name] = $hash
        echo "$hash`t$($file.FullName)"
    }
    while ($true) {
        sleep $interval
        foreach ($file in $f) {
            $hash = iex $hashfunction
            if ($files[$file.Name] -ne $hash) {
                iex $command
            }
        }
    }
}
Example usage:
$c = 'send-mailmessage -to "admin@whatever.com" -from "watch@whatever.com" -subject "$($file.Name) has been altered!"'
$f = ls C:\MyFolder\aFile.jpg
watch $f $c 60
You can use the System.IO.FileSystemWatcher to monitor a file.
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = $searchPath
$watcher.IncludeSubdirectories = $true
$watcher.EnableRaisingEvents = $true
See also this article
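Note that this only sets up the watcher; to actually react to changes you still need to subscribe to one of its events, for example (a minimal sketch):
Register-ObjectEvent $watcher -EventName Changed -Action {
    Write-Host "Changed: $($EventArgs.FullPath)"
}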
Here is another option.
I just needed to write my own to watch and run tests within a Docker container. Jan's solution is much more elegant, but FileSystemWatcher is broken within Docker containers presently. My approach is similar to Vasili's, but much lazier, trusting the file system's write time.
Here's the function I needed, which runs the command block each time the file changes.
function watch($command, $file) {
    $this_time = (get-item $file).LastWriteTime
    $last_time = $this_time
    while($true) {
        if ($last_time -ne $this_time) {
            $last_time = $this_time
            invoke-command $command
        }
        sleep 1
        $this_time = (get-item $file).LastWriteTime
    }
}
Here is one that waits until the file changes, runs the block, then exits.
function waitfor($command, $file) {
    $this_time = (get-item $file).LastWriteTime
    $last_time = $this_time
    while($last_time -eq $this_time) {
        sleep 1
        $this_time = (get-item $file).LastWriteTime
    }
    invoke-command $command
}
I had a similar problem. I first wanted to use Windows events and register for them, but that would be less fault-tolerant than the solution below.
My solution was a polling script (with an interval of 3 seconds). The script has a minimal footprint on the system and notices changes very quickly. During the loop my script can do more things (actually I check 3 different folders).
My polling script is started through the Task Scheduler. The schedule is: start every 5 minutes, with the setting that a new instance is not started when it is already running. This way it will restart after a reboot or after a crash.
Polling every 3 seconds straight from the Task Scheduler would be too frequent for the scheduler, hence the loop inside the script.
When you add a task to the scheduler, make sure you do not use network drives (that would call for extra settings) and give your user batch privileges.
I give my script a clean start by shutting it down a few minutes before midnight. The Task Scheduler starts the script again every morning (the init function of my script will exit within 1 minute of midnight).
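A rough sketch of such a polling loop (the folder paths and log output here are placeholders, not the author's actual script):
$folders = 'C:\watch\in1', 'C:\watch\in2', 'C:\watch\in3'   # placeholder paths
$seen = @{}
while ($true) {
    foreach ($folder in $folders) {
        Get-ChildItem $folder -File | Where-Object { -not $seen.ContainsKey($_.FullName) } | ForEach-Object {
            $seen[$_.FullName] = $true
            Write-Host "New file detected: $($_.FullName)"
        }
    }
    Start-Sleep -Seconds 3
    # exit shortly before midnight so the scheduled task restarts the script with a clean state
    if ((Get-Date).ToString('HH:mm') -eq '23:59') { exit }
}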
I was looking for something I could run as a one-liner from a terminal. This is what I arrived at:
while ($True) { if ((Get-Item .\readme.md).LastWriteTime -ne $LastWriteTime) { "Hello!"; $LastWriteTime = (Get-Item .\readme.md).LastWriteTime }; Sleep 1 }
Another simple version:
$date = get-date
while ( (dir file.txt -ea 0 | % lastwritetime) -lt $date -and $count++ -lt 10) {
sleep 1
}
'file changed or timeout'

Powershell get-members of $EventArgs automatic variable

I want to list all available properties of the variable $EventArgs by piping it to Get-Member, but I am having trouble getting the members of the automatic variable $EventArgs.
In the example I can see that they get the property FullPath from the automatic variable $EventArgs. I want a way to list all the properties that might be useful.
Any ideas on how to get the members of the $EventArgs automatic variable?
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = "C:\"
$watcher.IncludeSubdirectories = $true
$watcher.EnableRaisingEvents = $true
$watcher.Site
$arrary = @()
$watcher.Filter = ""
Register-ObjectEvent $watcher "Created" -Action {
    write-host "Created: $($eventArgs.FullPath)"
    $arrary += $EventArgs | gm
    $arrary += $EventArgs
    write-host $EventArgs | gm
    $EventArgs | gm
}
One quick way to see all of the properties (there are only three) is to do this:
Register-ObjectEvent $watcher "Created" -Action {
    $eventArgs | Select-Object * | Write-Host
}
However, you'll get much more useful information if you do this:
Register-ObjectEvent $watcher "Created" -Action {
    Write-Host $eventArgs.GetType()
}
and then look up the resultant type on MSDN. When you do that, you'll see that $EventArgs is actually an object of type System.IO.FileSystemEventArgs, which is fully documented here. Not only will you see the three properties, you'll see what they mean. For example, the first property ChangeType is really an enumeration of type System.IO.WatcherChangeTypes, and you can learn all of the different values it can have.
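If you just want to see those enumeration values from the shell, you can ask .NET for them directly:
[Enum]::GetNames([System.IO.WatcherChangeTypes])
# Created, Deleted, Changed, Renamed, All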

Captured Output of command run by PowerShell Is Sometimes Incomplete

I'm running the DTEXEC.exe command from within a PowerShell script, trying to capture and log the output to a file. Sometimes the output is incomplete and I'm trying to figure out why this is the case and what might be done about it. The lines that never seem to get logged are the most interesting:
DTEXEC: The package execution returned DTSER_SUCCESS(0)
Started: 10:58:43 a.m.
Finished: 10:59:24 a.m.
Elapsed: 41.484 seconds
The output always seems incomplete on packages that execute in less than ~ 8 seconds and this might be a clue (there isn't much output or they finish quickly).
I'm using .NET's System.Diagnostics.Process and ProcessStartInfo to set up and run the command, and I'm redirecting stdout and stderr to event handlers that each append to a StringBuilder which is subsequently written to disk.
The problem feels like a timing issue or a buffering issue. To solve the timing issue, I've attempted to use Monitor.Enter/Exit. If it's a buffering issue, I'm not sure how to force the Process to not buffer stdout and stderr.
The environment is
- PowerShell 2 running CLR version 2
- SQL 2008 32-bit DTEXEC.exe
- Host Operating System: XP Service Pack 3.
Here's the code:
function Execute-SSIS-Package
{
param([String]$fileName)
$cmd = GetDTExecPath
$proc = New-Object System.Diagnostics.Process
$proc.StartInfo.FileName = $cmd
$proc.StartInfo.Arguments = "/FILE ""$fileName"" /CHECKPOINTING OFF /REPORTING ""EWP"""
$proc.StartInfo.RedirectStandardOutput = $True
$proc.StartInfo.RedirectStandardError = $True
$proc.StartInfo.WorkingDirectory = Get-Location
$proc.StartInfo.UseShellExecute = $False
$proc.StartInfo.CreateNoWindow = $False
Write-Host $proc.StartInfo.FileName $proc.StartInfo.Arguments
$cmdOut = New-Object System.Text.StringBuilder
$errorEvent = Register-ObjectEvent -InputObj $proc `
-Event "ErrorDataReceived" `
-MessageData $cmdOut `
-Action `
{
param
(
[System.Object] $sender,
[System.Diagnostics.DataReceivedEventArgs] $e
)
try
{
[System.Threading.Monitor]::Enter($Event.MessageData)
Write-Host -ForegroundColor "DarkRed" $e.Data
[void](($Event.MessageData).AppendLine($e.Data))
}
catch
{
Write-Host -ForegroundColor "Red" "Error capturing processes std error" $Error
}
finally
{
[System.Threading.Monitor]::Exit($Event.MessageData)
}
}
$outEvent = Register-ObjectEvent -InputObj $proc `
-Event "OutputDataReceived" `
-MessageData $cmdOut `
-Action `
{
param
(
[System.Object] $sender,
[System.Diagnostics.DataReceivedEventArgs] $e
)
try
{
[System.Threading.Monitor]::Enter($Event.MessageData)
#Write-Host $e.Data
[void](($Event.MessageData).AppendLine($e.Data))
}
catch
{
Write-Host -ForegroundColor "Red" "Error capturing processes std output" $Error
}
finally
{
[System.Threading.Monitor]::Exit($Event.MessageData)
}
}
$isStarted = $proc.Start()
$proc.BeginOutputReadLine()
$proc.BeginErrorReadLine()
while (!$proc.HasExited)
{
Start-Sleep -Milliseconds 100
}
Start-Sleep -Milliseconds 1000
$procExitCode = $proc.ExitCode
$procStartTime = $proc.StartTime
$procFinishTime = Get-Date
$proc.Close()
$proc.CancelOutputRead()
$proc.CancelErrorRead()
$result = New-Object PsObject -Property @{
    ExitCode = $procExitCode
    StartTime = $procStartTime
    FinishTime = $procFinishTime
    ElapsedTime = $procFinishTime.Subtract($procStartTime)
    StdErr = ""
    StdOut = $cmdOut.ToString()
}
return $result
}
The reason that your output is truncated is that Powershell returns from WaitForExit() and sets the HasExited property before it has processed all the output events in the queue.
One solution is to loop an arbitrary number of times with short sleeps to allow the events to be processed; PowerShell event processing appears not to be pre-emptive, so a single long sleep does not allow events to process.
A much better solution is to also register for the Exited event (in addition to the Output and Error events) on the Process. This event is the last in the queue, so if you set a flag when it occurs you can loop with short sleeps until the flag is set and know that you have processed all the output events.
I have written up a full solution on my blog but the core snippet is:
# Set up a pair of stringbuilders to which we can stream the process output
$global:outputSB = New-Object -TypeName "System.Text.StringBuilder";
$global:errorSB = New-Object -TypeName "System.Text.StringBuilder";
# Flag that shows that final process exit event has not yet been processed
$global:myprocessrunning = $true
$ps = new-object System.Diagnostics.Process
$ps.StartInfo.Filename = $target
$ps.StartInfo.WorkingDirectory = Split-Path $target -Parent
$ps.StartInfo.UseShellExecute = $false
$ps.StartInfo.RedirectStandardOutput = $true
$ps.StartInfo.RedirectStandardError = $true
$ps.StartInfo.CreateNoWindow = $true
# Register Asynchronous event handlers for Standard and Error Output
Register-ObjectEvent -InputObject $ps -EventName OutputDataReceived -action {
    if(-not [string]::IsNullOrEmpty($EventArgs.data)) {
        $global:outputSB.AppendLine(((get-date).toString('yyyyMMddHHmm')) + " " + $EventArgs.data)
    }
} | Out-Null
Register-ObjectEvent -InputObject $ps -EventName ErrorDataReceived -action {
    if(-not [string]::IsNullOrEmpty($EventArgs.data)) {
        $global:errorSB.AppendLine(((get-date).toString('yyyyMMddHHmm')) + " " + $EventArgs.data)
    }
} | Out-Null
Register-ObjectEvent -InputObject $ps -EventName Exited -action {
    $global:myprocessrunning = $false
} | Out-Null
$ps.start() | Out-Null
$ps.BeginOutputReadLine();
$ps.BeginErrorReadLine();
# We set a timeout after which time the process will be forceably terminated
$processTimeout = $timeoutseconds * 1000
while (($global:myprocessrunning -eq $true) -and ($processTimeout -gt 0)) {
    # We must use lots of short sleeps rather than a single long one otherwise events are not processed
    $processTimeout -= 50
    Start-Sleep -m 50
}
if ($processTimeout -le 0) {
    Add-Content -Path $logFile -Value (((get-date).toString('yyyyMMddHHmm')) + " PROCESS EXCEEDED EXECUTION ALLOWANCE AND WAS ABENDED!")
    $ps.Kill()
}
# Append the Standard and Error Output to log file, we don't use Add-Content as it appends a carriage return that is not required
[System.IO.File]::AppendAllText($logFile, $global:outputSB)
[System.IO.File]::AppendAllText($logFile, $global:errorSB)
My 2 cents... it's not a PowerShell issue but an issue/bug in the System.Diagnostics.Process class and the underlying shell. I've seen times when wrapping StdError and StdOut does not catch everything, and other times when the 'listening' wrapper application will hang indefinitely because of HOW the underlying application writes to the console (in the C/C++ world there are MANY different ways to do this, e.g. WriteFile, fprintf, cout, etc.).
In addition there are more than 2 outputs that may need to be captured, but the .NET framework only shows you those two (given they are the two primary ones); see this article about command redirection as it starts to give hints.
My guess (for both your issue as well as mine) is that it has to do with some low-level buffer flushing and/or ref counting. (If you want to get deep, you can start here.)
One (very hacky) way to get around this is, instead of executing the program directly, to wrap it in a call to cmd.exe with 2>&1, but this method has its own pitfalls and issues.
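A rough sketch of that wrapper approach (the dtexec invocation here is a placeholder, and quoting gets fiddly if $fileName contains spaces):
$output = cmd /c "dtexec.exe /FILE ""$fileName"" /CHECKPOINTING OFF 2>&1"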
The most ideal solution is for the executable to have a logging parameter, and then go parse the log file after the process exits...but most of the time you don't have that option.
But wait, we're using PowerShell... why are you using System.Diagnostics.Process in the first place? You can just call the command directly:
$output = & (GetDTExecPath) /FILE "$fileName" /CHECKPOINTING OFF /REPORTING "EWP"
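If you also need the exit code of the native command, PowerShell exposes it in $LASTEXITCODE after the call:
if ($LASTEXITCODE -ne 0) {
    Write-Warning "DTEXEC exited with code $LASTEXITCODE"
}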