PowerShell File Monitoring and Text-to-Speech

I am new to PowerShell and I'm trying to help a friend write a script that constantly monitors a file and, whenever the file changes, reads the new text aloud (the text file is edited constantly; all of its old content is replaced by new content taken from emails as they arrive).
The script works perfectly as far as pulling the content from the file and reading it aloud, but I'm having one small issue: it reads the contents either two or four times, while I only need it to read once.
Additionally, while PowerShell is speaking the content, it doesn't queue further changes to the file, so if two changes are made while PowerShell is speaking a prior change, the first change is skipped and only the most recent change is read aloud. Is there a way to queue all changes to the file and read them sequentially?
What I currently have is
Add-Type -AssemblyName System.speech
$speak = New-Object System.Speech.Synthesis.SpeechSynthesizer
$speak.Rate = 0 # -10 is slowest, 10 is fastest
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = "C:\Users\Dylan\Desktop\"
$watcher.Filter = "alarm.txt"
$watcher.IncludeSubdirectories = $false
$watcher.EnableRaisingEvents = $true
$AlarmLocation = "C:\Users\Dylan\Desktop\alarm.txt"
$changeAction = {
$Alarm = (Get-Content $AlarmLocation)
$speak.Speak($Alarm)
}
Register-ObjectEvent $watcher "Changed" -Action $changeAction
while ($true) {sleep 5}
Am I missing something obvious here or is there a different function I have to include?
Thank you

The issue with multiple events is explained here:
https://blogs.msdn.microsoft.com/oldnewthing/20140507-00/?p=1053/
One way to deal with it is to keep track of LastWriteTime.
We can run the speaker in a different thread, so it doesn't block the watcher. That way we will detect if the file changes while speaking.
Something like this...
# When the file is changed,
# the content is stored in the queue,
# and the speaker is signaled.
# it breaks, if the file changes very rapidly.
# use a hashtable for all the vars
# for easier transport across scopes
$vars = [hashtable]::Synchronized(@{})
$vars.speakQueue = New-Object System.Collections.Queue
$vars.speakEvent = New-Object System.Threading.AutoResetEvent $false
$vars.speakLastWriteTime = [DateTime]::MinValue
$vars.speakRunning = $true
$vars.speakPS = [System.Management.Automation.PowerShell]::Create().AddScript({
# this is the speaker thread
Param (
$vars
)
Add-Type -AssemblyName System.speech
$speak = New-Object System.Speech.Synthesis.SpeechSynthesizer
$speak.Rate = 0 # -10 is slowest, 10 is fastest
# run until other thread sets running=false
while($vars.speakRunning) {
# other thread sets the event when content is available
if($vars.speakEvent.WaitOne(100)) {
# use System.Threading.Monitor to make queue thread safe
[System.Threading.Monitor]::Enter($vars.SyncRoot)
try {
# get all alarms
$alarm = while($vars.speakQueue.Count){ $vars.speakQueue.Dequeue() }
}
catch {
}
[System.Threading.Monitor]::Exit($vars.SyncRoot)
# speak now
$alarm | ForEach-Object { $speak.Speak($_) }
}
}
}).AddArgument($vars)
# start new thread
$vars.speakPSHandle = $vars.speakPS.BeginInvoke()
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = "C:\Users\Dylan\Desktop\"
$watcher.Filter = "alarm.txt"
$watcher.IncludeSubdirectories = $false
$watcher.EnableRaisingEvents = $true
$watcher.NotifyFilter = [System.IO.NotifyFilters]::LastWrite
$changeAction = {
$vars = $event.MessageData
# FullPath is the path of the changed file
$item = Get-Item $event.SourceEventArgs.FullPath
# only proceed, if LastWriteTime has changed
if($item.LastWriteTime -ne $vars.speakLastWriteTime) {
$vars.speakLastWriteTime = $item.LastWriteTime
$alarm = Get-Content $event.SourceEventArgs.FullPath -Raw
[System.Threading.Monitor]::Enter($vars.SyncRoot)
try {
# put content in queue
$vars.speakQueue.Enqueue($alarm)
}
catch {
}
[System.Threading.Monitor]::Exit($vars.SyncRoot)
# signal speaker in other thread
$vars.speakEvent.Set()
}
}
$job = Register-ObjectEvent $watcher "changed" -Action $changeAction -SourceIdentifier "FileChanged" -MessageData $vars
while($true) {
Start-Sleep -Milliseconds 25
}
# clean-up, if ever needed...
Unregister-Event "FileChanged"
$vars.speakRunning = $false # leaves while-loop in thread
$vars.speakPS.EndInvoke($vars.speakPSHandle) # waits for thread to end

Related

Powershell: Move-Item : Can it sit and wait until the source file is closed and released?

I have an ObjectEvent that sees a new file getting created and then tries to move the file.
It works... except when it goes to move the file, the file is still open, so the Move-Item fails.
So I suppose there are two possible paths... I am open to either (or both!)
First, how would my ObjectEvent fire only after the file is closed? Current objectevent:
Register-ObjectEvent $Watcher -EventName Created -SourceIdentifier FileCreated -Action
Second, is it possible for MoveItem to sit and keep trying for 5 seconds or so before failing? Current Move-Item call:
Move-Item $path -Destination $destination -Force -Verbose
The FileSystemWatcher class has a few events available: Changed, Created, Deleted, Error and Renamed. None of them indicates whether the file was recently unlocked by the system, so there is no direct way to fire your event specifically after the file is closed.
What I would do personally to make sure the files are moved properly is to delay the move until X ms after the file creation event fires.
Here's a summary of how that would work.
Prerequisites
A Filewatcher, which detects the file being created
A timer, which will move the files after their creation
A queue, onto which the Filewatcher enqueues items being created and from which the timer dequeues items that need processing. It is what communicates the information between the two.
Flow
A file is created
The Filewatcher Created event triggers, adds the item to the queue, then starts the timer.
The timer Elapsed event triggers, stops the timer and processes the queue. If any items remain in the queue, it restarts itself.
Here's a base version of all of that.
$Params = @{
# Path we are watching
WatchPath = 'C:\temp\11\test'
# Path we are moving stuff to.
DestinationPath = 'C:\temp\11'
# Stop after X attempts
MoveAttempts = 5
# Timer heartbeat
MoveAttemptsDelay = 1000
}
# Create watcher
$Watcher = [System.IO.FileSystemWatcher]::new()
$Watcher.Path = $Params.WatchPath
$Watcher.EnableRaisingEvents = $true
# Create Timer - stopped state
$WatchTimer = [System.Timers.Timer]::new()
$WatchTimer.Interval = 1000
# Create WatchQueue (each item will be FullPath, TimeStamp, MoveAttempts)
$WatchQueue = [System.Collections.Generic.Queue[psobject]]::new()
$FileWatcherReg = @{
InputObject = $Watcher
EventName = 'Created'
SourceIdentifier = 'FileCreated'
MessageData = @{WatchQueue = $WatchQueue; Timer = $WatchTimer }
Action = {
if ($null -ne $event) {
$Queue = $Event.MessageData.WatchQueue
$Timer = $Event.MessageData.Timer
$Queue.Enqueue([PSCustomObject]@{
FullPath = $Event.SourceArgs.FullPath
TimeStamp = $Event.TimeGenerated
MoveAttempts = 0
})
# We only start the timer if it is not already counting down
# We also don't want to start the timer if item is not 1 since this mean
# the timer logic is already running.
if ($Queue.Count -eq 1 -and ! $Timer.Enabled) { $Timer.Start() }
}
}
}
$TimerReg = @{
InputObject = $WatchTimer
EventName = 'Elapsed'
Sourceidentifier = 'WatchTimerElapsed'
MessageData = @{WatchQueue = $WatchQueue; ConfigParams = $Params }
Action = {
$Queue = $Event.MessageData.WatchQueue
$ConfigParams = $Event.MessageData.ConfigParams
$Event.Sender.Stop()
$SkipItemsCount = 0
while ($Queue.Count -gt 0 + $SkipItemsCount ) {
$Item = $Queue.Dequeue()
$ItemName = Split-Path $item.FullPath -Leaf
while ($Item.MoveAttempts -lt $ConfigParams.MoveAttempts) {
try {
$Item.MoveAttempts += 1
Move-Item -Path $Item.FullPath -Destination "$($ConfigParams.DestinationPath)\$ItemName" -ErrorAction Stop
break
}
catch {
$ex = $_.Exception
if ($Item.MoveAttempts -ge $ConfigParams.MoveAttempts) {
# Do something about it... Log / Warn / etc...
Write-warning "Move attempts: $($ConfigParams.MoveAttempts)"
Write-Warning "FilePath: $($Item.FullPath)"
Write-Warning $ex
continue
}
else {
$Queue.Enqueue($Item)
$SkipItemsCount += 1
}
}
}
# If we skipped any items, we don't want to dequeue until 0 anymore but rather we will stop
if ($SkipItemsCount -gt 0){
$Event.Sender.Start()
}
}
}
}
# ObjectEvent for FileWatcher
Register-ObjectEvent @FileWatcherReg
# ObjectEvent for Timer which process stuff in a delayed fashion
Register-ObjectEvent @TimerReg
while ($true) {
Start-Sleep -Milliseconds 100
}
# Unregister events at the end
Unregister-Event -SourceIdentifier FileCreated | Out-Null # Will fail first time since never registered
Unregister-Event -SourceIdentifier WatchTimerElapsed | Out-Null
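As an aside: although there is no "file closed" event, one indirect way to check whether a file is still in use is to try opening it exclusively. A minimal sketch (Test-FileLocked is a hypothetical helper, not part of the answer above):

```powershell
# Hypothetical helper: attempt an exclusive open; an IOException means
# another process still has the file open.
function Test-FileLocked {
    param([string]$Path)
    try {
        $fs = [System.IO.File]::Open($Path, 'Open', 'ReadWrite', 'None')
        $fs.Close()
        return $false   # exclusive open succeeded, nothing else holds it
    }
    catch [System.IO.IOException] {
        return $true    # sharing violation: the file is still in use
    }
}
```

A Created action (or the timer above) could poll this until it returns $false before attempting the Move-Item.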

Powershell hanging due to Filesystemwatcher

We have a convoluted solution to some printing issues (caused by Citrix and remote servers). Basically, from the main server we force-push a PDF file to the remote PC, and then have a PowerShell script which constantly runs on the remote PC to "catch" the file and push it to the local printer.
This works "fine".
However, we get random dropouts. The PowerShell script doesn't seem to have crashed, because it's still running in Windows, but the actual action no longer seems to process new files.
I have done a lot of reading today, and there's mention of having to name and unregister events when you're done, otherwise it can cause buffer overflow issues and make PowerShell stop processing the action. But I'm unsure where that should actually go within the code. The idea is that this script will run permanently, so do we unregister or remove the event within the action itself, or somewhere else?
I previously had a lot of dummy logging going on within the action to try to find where it failed, but it seems to stop at different points without any justifiable reason (i.e., sometimes it would fail at the command to find files, other times at the command to move, etc.).
### SET FOLDER TO WATCH + FILES TO WATCH + SUBFOLDERS YES/NO
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = "l:\files\cut"
$watcher.Filter = "*.pdf"
$watcher.IncludeSubdirectories = $false
$watcher.EnableRaisingEvents = $true
### DEFINE ACTIONS AFTER AN EVENT IS DETECTED
$action = { $path = $Event.SourceEventArgs.FullPath
$changeType = $Event.SourceEventArgs.ChangeType
$scandir="l:\files\cut"
$scanbackdir="l:\files\cut\back"
$scanlogdir="l:\files\cut\log"
$sumatra="l:\SumatraPDF.exe"
$pdftoprint=""
$printername= "MainLBL"
### Get the List of files in the Directory, print file, wait and then move file
Get-ChildItem -Path $scandir -filter "*.pdf" -Name | % {
$pdftoprint=$_
& $sumatra -silent $scandir\$pdftoprint -print-to $printername
sleep 3
Move-Item -force $scandir\$pdftoprint $scanbackdir
}
}
### Define what happens when script fails
$erroraction = {echo $(get-date) the process crashed | Out-File -Append l:\files\cut\log\errorlog.txt}
### DECIDE WHICH EVENTS SHOULD BE WATCHED
Register-ObjectEvent $watcher "Error" -Action $erroraction
Register-ObjectEvent $watcher "Created" -Action $action
while ($true) {sleep 5}
If you want a script to run in the background, then look to PowerShell background jobs.
If you want a script to run permanently, then you want to make it a service ...
See these:
How to Create a User-Defined Service
How to run a PowerShell script as a Windows service
... or attach that to a scheduled task, that would restart it on reboots.
There are two ways to implement a FileSystemWatcher.
Synchronous
Asynchronous
With a synchronous FileSystemWatcher, by its nature, when a change is detected, control is returned to your script so it can process the change. If another file change occurs while your script is busy and no longer waiting for events, it gets lost, leading to unexpected outcomes.
Using the FileSystemWatcher asynchronously, it continues to log new filesystem changes and processes them once PowerShell is done processing the previous ones.
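For contrast, the synchronous style might look like the sketch below (the watched folder here is just the temp directory): WaitForChanged blocks the thread until a single change or the timeout occurs, and anything that happens while you handle the result is missed.

```powershell
# Sketch of a synchronous watcher: each WaitForChanged call blocks
# until one change (or the 1-second timeout) occurs.
$fsw = New-Object System.IO.FileSystemWatcher ([System.IO.Path]::GetTempPath()), '*.txt'
for ($i = 0; $i -lt 3; $i++) {
    $result = $fsw.WaitForChanged([System.IO.WatcherChangeTypes]::Changed, 1000)
    if ($result.TimedOut) {
        Write-Host 'No change within 1 second'
    }
    else {
        Write-Host "Changed: $($result.Name)"
    }
}
$fsw.Dispose()
```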
An example of an asynchronous FileSystemWatcher:
### New-FileSystemWatcherAsynchronous
# Set the folder target
$PathToMonitor = Read-Host -Prompt 'Enter a folder path'
$FileSystemWatcher = New-Object System.IO.FileSystemWatcher
$FileSystemWatcher.Path = $PathToMonitor
$FileSystemWatcher.IncludeSubdirectories = $true
# Enable raising events
$FileSystemWatcher.EnableRaisingEvents = $true
# Define change actions
$Action = {
$details = $event.SourceEventArgs
$Name = $details.Name
$FullPath = $details.FullPath
$OldFullPath = $details.OldFullPath
$OldName = $details.OldName
$ChangeType = $details.ChangeType
$Timestamp = $event.TimeGenerated
$text = "{0} was {1} at {2}" -f $FullPath, $ChangeType, $Timestamp
Write-Host $text -ForegroundColor Green
# Define change types
switch ($ChangeType)
{
'Changed' { "CHANGE" }
'Created' { "CREATED"}
'Deleted' { "DELETED"
# Set time intensive handler
Write-Host "Deletion Started" -ForegroundColor Gray
Start-Sleep -Seconds 3
Write-Warning -Message 'Deletion complete'
}
'Renamed' {
$text = "File {0} was renamed to {1}" -f $OldName, $Name
Write-Host $text -ForegroundColor Yellow
}
default { Write-Host $_ -ForegroundColor Red -BackgroundColor White }
}
}
# Set event handlers
$handlers = . {
Register-ObjectEvent -InputObject $FileSystemWatcher -EventName Changed -Action $Action -SourceIdentifier FSChange
Register-ObjectEvent -InputObject $FileSystemWatcher -EventName Created -Action $Action -SourceIdentifier FSCreate
Register-ObjectEvent -InputObject $FileSystemWatcher -EventName Deleted -Action $Action -SourceIdentifier FSDelete
Register-ObjectEvent -InputObject $FileSystemWatcher -EventName Renamed -Action $Action -SourceIdentifier FSRename
}
Write-Host "Watching for changes to $PathToMonitor" -ForegroundColor Cyan
try
{
do
{
Wait-Event -Timeout 1
Write-Host '.' -NoNewline
} while ($true)
}
finally
{
# End script actions + CTRL+C executes the remove event handlers
Unregister-Event -SourceIdentifier FSChange
Unregister-Event -SourceIdentifier FSCreate
Unregister-Event -SourceIdentifier FSDelete
Unregister-Event -SourceIdentifier FSRename
# Remaining cleanup
$handlers |
Remove-Job
$FileSystemWatcher.EnableRaisingEvents = $false
$FileSystemWatcher.Dispose()
Write-Warning -Message 'Event Handler completed and disabled.'
}
I have not encountered a script that will run permanently on Windows.
So, with that in mind, we take it for granted that some issue beyond your control, such as the network, power, or a system shutdown, will occur.
With that in mind, we have a lifecycle for this script, and everything should be properly cleaned up at the end. In this case we have a while loop that should theoretically never end; however, if an exception is thrown it will end. Within the while loop, if any of the events have been deregistered, we can re-register them. If the watcher has been disposed, we can recreate it along with the events. If this really is mission-critical code, then I would look at .NET as an alternative, with something like Hangfire plus NLog running as a Windows service.
### WRAP Everything in a try finally so we dispose of events
try {
### SET FOLDER TO WATCH + FILES TO WATCH + SUBFOLDERS YES/NO
$watcherArgs = @{
Path = "l:\files\cut"
Filter = "*.pdf"
IncludeSubdirectories = $false
EnableRaisingEvents = $true
}
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = $watcherArgs.Path
$watcher.Filter = $watcherArgs.Filter
$watcher.IncludeSubdirectories = $watcherArgs.IncludeSubdirectories
$watcher.EnableRaisingEvents = $watcherArgs.EnableRaisingEvents
### DEFINE ACTIONS AFTER AN EVENT IS DETECTED
$action = { $path = $Event.SourceEventArgs.FullPath
$changeType = $Event.SourceEventArgs.ChangeType
$scandir="l:\files\cut"
$scanbackdir="l:\files\cut\back"
$scanlogdir="l:\files\cut\log"
$sumatra="l:\SumatraPDF.exe"
$pdftoprint=""
$printername= "MainLBL"
### Get the List of files in the Directory, print file, wait and then move file
Get-ChildItem -Path $scandir -filter "*.pdf" -Name | % {
$pdftoprint=$_
if($LASTEXITCODE -ne 0) {
# Do something
# Reset so we know when sumatra fails
$LASTEXITCODE = 0
}
& $sumatra -silent $scandir\$pdftoprint -print-to $printername
if($LASTEXITCODE -ne 0) {
# Do something to handle sumatra
}
sleep 3
# Split up copy and delete so we never lose files
[system.io.file]::Copy("$scandir\$pdftoprint", "$scanbackdir", $true)
[system.io.file]::Delete("$scandir\$pdftoprint")
}
}
### Define what happens when script fails
$erroraction = {
echo "$(get-date) the process crashed" | Out-File -Append "l:\files\cut\log\errorlog.txt"
}
### DECIDE WHICH EVENTS SHOULD BE WATCHED
$ErrorEvent = Register-ObjectEvent $watcher "Error" -Action $erroraction
$CreatedEvent = Register-ObjectEvent $watcher "Created" -Action $action
$ListOfEvents = @(
$ErrorEvent
$CreatedEvent
)
while ($true) {
$eventMissing = $false
$ListOfEvents | % {
$e = $_
if (!(Get-Event -SourceIdentifier $e.Name -ErrorAction SilentlyContinue)) {
# Event does not exist
$eventMissing = $true
}
}
if (!$watcher -or $eventMissing) {
# deregister events
$ListOfEvents | % {
$e = $_
try {
Unregister-Event -SourceIdentifier $e.Name
} catch {
# Do Nothing
}
}
if($watcher) {
$watcher.Dispose()
$watcher = $null
} else {
# Create watcher
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = $watcherArgs.Path
$watcher.Filter = $watcherArgs.Filter
$watcher.IncludeSubdirectories = $watcherArgs.IncludeSubdirectories
$watcher.EnableRaisingEvents = $watcherArgs.EnableRaisingEvents
$ErrorEvent = Register-ObjectEvent $watcher "Error" -Action $erroraction
$CreatedEvent = Register-ObjectEvent $watcher "Created" -Action $action
$ListOfEvents = @(
$ErrorEvent
$CreatedEvent
)
}
}
if ($watcher.EnableRaisingEvents -eq $false) {
$watcher.EnableRaisingEvents = $watcherArgs.EnableRaisingEvents
}
sleep 5
}
} finally {
$ListOfEvents | % {
$e = $_
try {
Unregister-Event -SourceIdentifier $e.Name
} catch {
# Do Nothing
}
}
if($watcher) {
$watcher.Dispose();
}
}

Powershell FileSystemWatcher - Avoiding duplicate action on Create

It's a known issue in PowerShell that the FileSystemWatcher fires twice on events. I am trying to work around this, as I am watching for files being created and then pushing them to a printer. The double firing means I am getting duplicate printouts.
I know this question has been asked before, but I am a complete newb when it comes to PowerShell (and scripting in general, really), so some of the answers have gone straight over my head.
In the code, I am watching a folder and then passing the subdirectory names as the printer name for sending the job. This is because the software in use is copying the pdf files from a remote location into those folders (the software doesn't have direct access to the printers due to citrix)
### SET FOLDER TO WATCH + FILES TO WATCH + SUBFOLDERS YES/NO
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = "L:\Label\"
$watcher.Filter = "*.*"
$watcher.IncludeSubdirectories = $true
$watcher.EnableRaisingEvents = $true
### DEFINE ACTIONS AFTER AN EVENT IS DETECTED
$action = { $path = $Event.SourceEventArgs.FullPath
$changeType = $Event.SourceEventArgs.ChangeType
$printer = Split-Path (Split-Path $path -Parent) -Leaf
$logline = "$(Get-Date), $changeType, $path, $printer"
Add-content "c:\prog\log.txt" -value $logline
C:\prog\SumatraPDF.exe -print-to "\\http://srv:631\$printer" $path
}
### DECIDE WHICH EVENTS SHOULD BE WATCHED
Register-ObjectEvent $watcher "Created" -Action $action
while ($true) {sleep 5}
I expect to see the printing command (the Sumatra call) only occur once when a pdf file is dropped into the watch folder
Instead of telling you what you should or shouldn't do, here is how to do what you asked for:
### SET FOLDER TO WATCH + FILES TO WATCH + SUBFOLDERS YES/NO
$global:canDoEvent = $True #NEW
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = "L:\Label\"
$watcher.Filter = "*.*"
$watcher.IncludeSubdirectories = $true
$watcher.EnableRaisingEvents = $true
### DEFINE ACTIONS AFTER AN EVENT IS DETECTED
$action = { if ($global:canDoEvent) { #NEW
$global:canDoEvent = $False #NEW
$path = $Event.SourceEventArgs.FullPath
$changeType = $Event.SourceEventArgs.ChangeType
$printer = Split-Path (Split-Path $path -Parent) -Leaf
$logline = "$(Get-Date), $changeType, $path, $printer"
Add-content "c:\prog\log.txt" -value $logline
C:\prog\SumatraPDF.exe -print-to "\\http://srv:631\$printer" $path
}
}
### DECIDE WHICH EVENTS SHOULD BE WATCHED
Register-ObjectEvent $watcher "Created" -SourceIdentifier newFile -Action $action #NEW
do { #NEW
$global:canDoEvent = $True
Wait-Event -SourceIdentifier newFile -Timeout 1
} while ($True)
It probably requires tuning, I'm no expert, but that's the idea.
Basically: add a global boolean set to $True, put your Wait-Event (with a timeout) inside a do-while loop, set the variable back to true on every loop, then in your event action set it to false. The timeout defines how often the event can fire; a single second should suffice to prevent multiple firings. Obviously, if there are contexts where more than one unique file could be created and printed within the same second, those would be skipped.
I don't think it's a known issue that the FileSystemWatcher fires twice on events; I'm not sure where you got that information.
Regardless, if I were you I wouldn't code the FileSystemWatcher events in PowerShell myself; it's a real pain.
Instead, you can use PowerShellGuard; just pass the print command as the TestCommand. https://github.com/smurawski/PowerShellGuard
All that PowerShellGuard does is abstract the use of the FileSystemWatcher. You can put your print command in a PowerShell script and just have PowerShellGuard call your script with -TestCommand "Print.ps1 .\PathToYourFile"
Final Solution (by the poster himself):
dir \\srv\label\prnlblCuts\*.pdf | New-Guard -TestCommand "C:\PROG\SumatraPDF.exe -print-to \\srv-tsv:631\prnlblCuts" -TestPath "$($_.FullName)" -Wait

Powershell - Listen for file, do something if file exists [duplicate]

Is there any simple way (i.e., a script) to watch a file in PowerShell and run commands if the file changes? I have been googling but can't find a simple solution. Basically, I run a script in PowerShell and, if the file changes, PowerShell runs other commands.
EDIT
OK, I think I made a mistake. I don't need a script; I need a function that I can include in my $PROFILE.ps1 file. But still, I tried hard and I'm still unable to write it, so I will give a bounty. It has to look like this:
function watch($command, $file) {
if($file #changed) {
#run $command
}
}
There is an NPM module that does what I want, watch, but it only watches folders, not files, and it's not PowerShell xD.
Here is an example I have found in my snippets. Hopefully it is a little bit more comprehensive.
First you need to create a file system watcher; subsequently you subscribe to an event that the watcher generates. This example listens for "Created" events, but could easily be modified to watch for "Changed".
$folder = "C:\Users\LOCAL_~1\AppData\Local\Temp\3"
$filter = "*.LOG"
$Watcher = New-Object IO.FileSystemWatcher $folder, $filter -Property @{
IncludeSubdirectories = $false
NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite'
}
$onCreated = Register-ObjectEvent $Watcher -EventName Created -SourceIdentifier FileCreated -Action {
$path = $Event.SourceEventArgs.FullPath
$name = $Event.SourceEventArgs.Name
$changeType = $Event.SourceEventArgs.ChangeType
$timeStamp = $Event.TimeGenerated
Write-Host "The file '$name' was $changeType at $timeStamp"
Write-Host $path
#Move-Item $path -Destination $destination -Force -Verbose
}
I will try to narrow this down to your requirements.
If you run this as part of your "profile.ps1" script you should read The Power of Profiles which explains the different profile scripts available and more.
Also, you should understand that waiting for a change in a folder can't run as a blocking function in the script. The profile script has to finish for your PowerShell session to start. You can, however, use a function to register an event.
What this does is register a piece of code to be executed every time an event is triggered. This code will be executed in the context of your current PowerShell host (or shell) while the session remains open. It can interact with the host session, but has no knowledge of the original script that registered the code. The original script has probably finished already by the time your code is triggered.
Here is the code:
Function Register-Watcher {
param ($folder)
$filter = "*.*" #all files
$watcher = New-Object IO.FileSystemWatcher $folder, $filter -Property @{
IncludeSubdirectories = $false
EnableRaisingEvents = $true
}
$changeAction = [scriptblock]::Create('
# This is the code which will be executed every time a file change is detected
$path = $Event.SourceEventArgs.FullPath
$name = $Event.SourceEventArgs.Name
$changeType = $Event.SourceEventArgs.ChangeType
$timeStamp = $Event.TimeGenerated
Write-Host "The file $name was $changeType at $timeStamp"
')
Register-ObjectEvent $Watcher -EventName "Changed" -Action $changeAction
}
Register-Watcher "c:\temp"
After running this code, change any file in the "C:\temp" directory (or any other directory you specify). You will see an event triggering execution of your code.
Also, valid FileSystemWatcher events you can register are "Changed", "Created", "Deleted" and "Renamed".
I will add another answer, because my previous one missed the requirements.
Requirements
Write a function to WAIT for a change in a specific file
When a change is detected the function will execute a predefined command and return execution to the main script
File path and command are passed to the function as parameters
There is already an answer using file hashes. I want to follow my previous answer and show you how this can be accomplish using FileSystemWatcher.
$File = "C:\temp\log.txt"
$Action = 'Write-Output "The watched file was changed"'
$global:FileChanged = $false
function Wait-FileChange {
param(
[string]$File,
[string]$Action
)
$FilePath = Split-Path $File -Parent
$FileName = Split-Path $File -Leaf
$ScriptBlock = [scriptblock]::Create($Action)
$Watcher = New-Object IO.FileSystemWatcher $FilePath, $FileName -Property @{
IncludeSubdirectories = $false
EnableRaisingEvents = $true
}
$onChange = Register-ObjectEvent $Watcher Changed -Action {$global:FileChanged = $true}
while ($global:FileChanged -eq $false){
Start-Sleep -Milliseconds 100
}
& $ScriptBlock
Unregister-Event -SubscriptionId $onChange.Id
}
Wait-FileChange -File $File -Action $Action
Here is the solution I ended up with based on several of the previous answers here. I specifically wanted:
My code to be code, not a string
My code to be run on the I/O thread so I can see the console output
My code to be called every time there was a change, not once
Side note: I've left in the details of what I wanted to run due to the irony of using a global variable to communicate between threads so I can compile Erlang code.
Function RunMyStuff {
# this is the bit we want to happen when the file changes
Clear-Host # remove previous console output
& 'C:\Program Files\erl7.3\bin\erlc.exe' 'program.erl' # compile some erlang
erl -noshell -s program start -s init stop # run the compiled erlang program:start()
}
Function Watch {
$global:FileChanged = $false # dirty... any better suggestions?
$folder = "M:\dev\Erlang"
$filter = "*.erl"
$watcher = New-Object IO.FileSystemWatcher $folder, $filter -Property @{
IncludeSubdirectories = $false
EnableRaisingEvents = $true
}
Register-ObjectEvent $Watcher "Changed" -Action {$global:FileChanged = $true} > $null
while ($true){
while ($global:FileChanged -eq $false){
# We need this to block the IO thread until there is something to run
# so the script doesn't finish. If we call the action directly from
# the event it won't be able to write to the console
Start-Sleep -Milliseconds 100
}
# a file has changed, run our stuff on the I/O thread so we can see the output
RunMyStuff
# reset and go again
$global:FileChanged = $false
}
}
RunMyStuff # run the action at the start so I can see the current output
Watch
You could pass in folder/filter/action into watch if you want something more generic. Hopefully this is a helpful starting point for someone else.
Calculate the hash of a list of files
Store it in a dictionary
Check each hash on an interval
Perform action when hash is different
function watch($f, $command, $interval) {
$sha1 = New-Object System.Security.Cryptography.SHA1CryptoServiceProvider
$hashfunction = '[System.BitConverter]::ToString($sha1.ComputeHash([System.IO.File]::ReadAllBytes($file)))'
$files = @{}
foreach ($file in $f) {
$hash = iex $hashfunction
$files[$file.Name] = $hash
echo "$hash`t$($file.FullName)"
}
while ($true) {
sleep $interval
foreach ($file in $f) {
$hash = iex $hashfunction
if ($files[$file.Name] -ne $hash) {
iex $command
}
}
}
}
Example usage:
$c = 'send-mailmessage -to "admin@whatever.com" -from "watch@whatever.com" -subject "$($file.Name) has been altered!"'
$f = ls C:\MyFolder\aFile.jpg
watch $f $c 60
You can use the System.IO.FileSystemWatcher to monitor a file.
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = $searchPath
$watcher.IncludeSubdirectories = $true
$watcher.EnableRaisingEvents = $true
See also this article
Here is another option.
I just needed to write my own to watch files and run tests within a Docker container. Jan's solution is much more elegant, but FileSystemWatcher is currently broken within Docker containers. My approach is similar to Vasili's, but much lazier, trusting the file system's write time.
Here's the function I needed, which runs the command block each time the file changes.
function watch($command, $file) {
$this_time = (get-item $file).LastWriteTime
$last_time = $this_time
while($true) {
if ($last_time -ne $this_time) {
$last_time = $this_time
invoke-command $command
}
sleep 1
$this_time = (get-item $file).LastWriteTime
}
}
Here is one that waits until the file changes, runs the block, then exits.
function waitfor($command, $file) {
$this_time = (get-item $file).LastWriteTime
$last_time = $this_time
while($last_time -eq $this_time) {
sleep 1
$this_time = (get-item $file).LastWriteTime
}
invoke-command $command
}
I had a similar problem. I first wanted to use Windows events and register them, but that would be less fault-tolerant than the solution below.
My solution was a polling script (3-second intervals). The script has a minimal footprint on the system and notices changes very quickly. During the loop my script can do more things (I actually check 3 different folders).
My polling script is started through the Task Scheduler. The schedule is: start every 5 minutes, with the stop-when-already-running flag. This way it will restart after a reboot or after a crash.
Using the Task Scheduler itself to poll every 3 seconds would be too frequent for it.
When you add a task to the scheduler, make sure you do not use network drives (those would call for extra settings) and give your user batch privileges.
I give my script a clean start by shutting it down a few minutes before midnight; the Task Scheduler starts the script again every morning (the init function of my script exits within 1 minute around midnight).
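A single pass of the polling loop described above could be sketched like this (Get-ChangedFiles is a hypothetical name; the real script checks its own folders):

```powershell
# Sketch: one polling pass. Emits files whose LastWriteTime differs
# from the value recorded in $Seen, updating $Seen as it goes.
function Get-ChangedFiles {
    param([string[]]$Folders, [hashtable]$Seen)
    foreach ($folder in $Folders) {
        Get-ChildItem $folder -File -ErrorAction SilentlyContinue | ForEach-Object {
            if ($Seen[$_.FullName] -ne $_.LastWriteTime) {
                $Seen[$_.FullName] = $_.LastWriteTime
                $_.FullName   # emit the changed file's path
            }
        }
    }
}
# The watcher itself is then just:
#   $seen = @{}
#   while ($true) { Get-ChangedFiles $folders $seen | ForEach-Object { <act> }; Start-Sleep 3 }
```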
I was looking for something I could run as a one-liner from a terminal. This is what I arrived at:
while ($True) { if ((Get-Item .\readme.md).LastWriteTime -ne $LastWriteTime) { "Hello!"; $LastWriteTime = (Get-Item .\readme.md).LastWriteTime }; Sleep 1 }
Another simple version:
$date = get-date
while ( (dir file.txt -ea 0 | % lastwritetime) -lt $date -and $count++ -lt 10) {
sleep 1
}
'file changed or timeout'

Asynchronously Logging Stdout from a PowerShell Process

I need to start a long-running process in a PowerShell script and capture the StandardOutput to a file as it is being generated. I'm currently using an asynchronous approach and appending the output to a text file each time an OutputDataReceived event is fired. The following code excerpt illustrates:
$PSI = New-Object System.Diagnostics.ProcessStartInfo
$PSI.CreateNoWindow = $true
$PSI.RedirectStandardOutput = $true
$PSI.RedirectStandardError = $true
$PSI.UseShellExecute = $false
$PSI.FileName = $EXECUTE
$PSI.RedirectStandardInput = $true
$PSI.WorkingDirectory = $PWD
$PSI.EnvironmentVariables["TMP"] = $TMPDIR
$PSI.EnvironmentVariables["TEMP"] = $TMPDIR
$PSI.EnvironmentVariables["TMPDIR"] = $TMPDIR
$PROCESS = New-Object System.Diagnostics.Process
$PROCESS.StartInfo = $PSI
[void]$PROCESS.Start()
# asynchronously listen for OutputDataReceived events on the process
$sb = [scriptblock]::Create("`$text = `$Event.SourceEventArgs.Data; `$OUTFIL = `$Event.MessageData; Add-Content $OUTFIL.out `$text")
$EVENT_OBJ = Register-ObjectEvent -InputObject $PROCESS -EventName OutputDataReceived -Action $sb -MessageData "$OUTFIL.out"
# asynchronously listen for ErrorDataReceived events on the process
$sb2 = [scriptblock]::Create("`$text = `$Event.SourceEventArgs.Data; Write-Host `$text;")
$EVENT_OBJ2 = Register-ObjectEvent -InputObject $PROCESS -EventName ErrorDataReceived -Action $sb2
# begin asynchronous read operations on the redirected StandardOutput
$PROCESS.BeginOutputReadLine();
# begin asynchronous read operations on the redirected StandardError
$PROCESS.BeginErrorReadLine();
# write the input file contents to standard input
$WRITER = $PROCESS.StandardInput
$READER = [System.IO.File]::OpenText("$IN_FILE")
try
{
while (($line = $READER.ReadLine()) -ne $null)
{
$WRITER.WriteLine($line)
}
}
finally
{
$WRITER.close()
$READER.close()
}
$Process.WaitForExit() | Out-Null
# end asynchronous read operations on the redirected StandardOutput
$PROCESS.CancelOutputRead();
# end asynchronous read operations on the redirected StandardError
$PROCESS.CancelErrorRead();
Unregister-Event -SubscriptionId $EVENT_OBJ.Id
Unregister-Event -SubscriptionId $EVENT_OBJ2.Id
The issue with this approach is the large amount of output being generated and the latency caused by having to open and close the text file on each event (i.e., Add-Content $OUTFIL.out $text runs each time the event fires):
$sb = [scriptblock]::Create("`$text = `$Event.SourceEventArgs.Data; `$OUTFIL = `$Event.MessageData; Add-Content $OUTFIL.out `$text")
In each case, the actual process completes minutes before all of the output data has been written to the text file.
Is there a better approach to doing this? A faster way to append text to the file?
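For reference on the overhead being described: Add-Content opens and closes the file on every call, whereas a single System.IO.StreamWriter held open for the life of the process opens it once. A sketch (the path is a placeholder; wiring it into the event via -MessageData is left as a comment):

```powershell
# Sketch: a persistent StreamWriter avoids reopening the log file on
# every OutputDataReceived event. The path is a placeholder.
$logPath = Join-Path ([System.IO.Path]::GetTempPath()) 'process-out.log'
$writer = [System.IO.StreamWriter]::new($logPath, $true)   # append mode
$writer.AutoFlush = $false   # let the writer buffer between flushes
# In the OutputDataReceived action, pass $writer via -MessageData and call:
#   $Event.MessageData.WriteLine($Event.SourceEventArgs.Data)
$writer.WriteLine('example line')
$writer.Flush()
$writer.Close()
```

The writer must be closed (or flushed) after WaitForExit, or buffered lines may be lost.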