It's a known issue in PowerShell that the FileSystemWatcher fires twice on events. I am trying to work around this, as I am watching for files being created and then pushing them to a printer. The double firing means I am getting duplicated printouts.
I know this question has been asked before, but I am a complete newb when it comes to PowerShell (and scripting in general, really), so some of the answers have gone straight over my head.
In the code, I am watching a folder and then passing the subdirectory name as the printer name for sending the job. This is because the software in use copies the PDF files from a remote location into those folders (the software doesn't have direct access to the printers due to Citrix).
### SET FOLDER TO WATCH + FILES TO WATCH + SUBFOLDERS YES/NO
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = "L:\Label\"
$watcher.Filter = "*.*"
$watcher.IncludeSubdirectories = $true
$watcher.EnableRaisingEvents = $true
### DEFINE ACTIONS AFTER AN EVENT IS DETECTED
$action = { $path = $Event.SourceEventArgs.FullPath
$changeType = $Event.SourceEventArgs.ChangeType
$printer = Split-Path (Split-Path $path -Parent) -Leaf
$logline = "$(Get-Date), $changeType, $path, $printer"
Add-content "c:\prog\log.txt" -value $logline
C:\prog\SumatraPDF.exe -print-to "\\http://srv:631\$printer" $path
}
### DECIDE WHICH EVENTS SHOULD BE WATCHED
Register-ObjectEvent $watcher "Created" -Action $action
while ($true) {sleep 5}
I expect the printing command (the Sumatra call) to run only once when a PDF file is dropped into the watch folder.
Instead of telling you what you should or shouldn't do, here is how to do what you asked for:
### SET FOLDER TO WATCH + FILES TO WATCH + SUBFOLDERS YES/NO
$global:canDoEvent = $True #NEW
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = "L:\Label\"
$watcher.Filter = "*.*"
$watcher.IncludeSubdirectories = $true
$watcher.EnableRaisingEvents = $true
### DEFINE ACTIONS AFTER AN EVENT IS DETECTED
$action = { if ($global:canDoEvent) { #NEW
$global:canDoEvent = $False #NEW
$path = $Event.SourceEventArgs.FullPath
$changeType = $Event.SourceEventArgs.ChangeType
$printer = Split-Path (Split-Path $path -Parent) -Leaf
$logline = "$(Get-Date), $changeType, $path, $printer"
Add-content "c:\prog\log.txt" -value $logline
C:\prog\SumatraPDF.exe -print-to "\\http://srv:631\$printer" $path
}
}
### DECIDE WHICH EVENTS SHOULD BE WATCHED
Register-ObjectEvent $watcher "Created" -EventName newFile -Action $action #NEW
do { #NEW
$global:canDoEvent = $True
Wait-Event -SourceIdentifier newFile -Timeout 1
} while ($True)
It probably requires tuning (I'm no expert), but that's the idea.
Basically: add a global boolean variable initialized to $True, put your Wait-Event on a timeout inside a do-while loop, set the variable back to $True on every iteration, then in your event action set it to $False. The timeout defines how often the event can fire; a single second should suffice to prevent the duplicate firings. Obviously, if there are contexts where more than one unique file could be created and printed during the same second, those would be skipped.
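If skipping distinct files within the same second is a concern, one possible refinement (just a sketch, untested; the 2-second window and the $global:lastHandled name are illustrative) is to debounce per file path instead of using one global flag:
# Sketch: ignore repeat events for the SAME file within a short window,
# so two different files created in the same second are both handled.
$global:lastHandled = @{}
$action = {
    $path = $Event.SourceEventArgs.FullPath
    $now  = Get-Date
    $last = $global:lastHandled[$path]
    if ($last -and ($now - $last).TotalSeconds -lt 2) { return } # duplicate firing, skip
    $global:lastHandled[$path] = $now
    $printer = Split-Path (Split-Path $path -Parent) -Leaf
    Add-Content "c:\prog\log.txt" -Value "$now, Created, $path, $printer"
    C:\prog\SumatraPDF.exe -print-to "\\http://srv:631\$printer" $path
}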
I don't think it's a known issue that the FileSystemWatcher fires twice on events; I'm not sure where you got that information.
Regardless, if I were you I wouldn't code the FileSystemWatcher events in PowerShell myself; it's a real pain.
Instead, you can use PowerShellGuard; just pass your print command as the TestCommand. https://github.com/smurawski/PowerShellGuard
All PowerShellGuard does is abstract the use of the FileSystemWatcher. You can put your print command in a PowerShell script and have PowerShellGuard call that script with -TestCommand "Print.ps1 .\PathToYourFile"
Final Solution (by the poster himself):
dir \\srv\label\prnlblCuts\*.pdf | New-Guard -TestCommand "C:\PROG\SumatraPDF.exe -print-to \\srv-tsv:631\prnlblCuts" -TestPath "$($_.FullName)" -Wait
Am I missing something?
I can start the debug process with F5, but I cannot end it, and I cannot step through code or do normal debugging.
I assume this is due to the fact that the code is hanging off Register-ObjectEvent ?
(Watching a file system event....)
What is the method to run this code and keep the debugger attached to what is going on?
The code:
$folder_to_watch = 'C:\Users\demouser\Downloads\'
$file_name_filter = '*.aac'
# to archive .aac files
$destination = 'c:\temp\test\arc\'
$DestinationDirMP3 = 'C:\data\personal\hinative-mp3'
$Watcher = New-Object IO.FileSystemWatcher $folder_to_watch, $file_name_filter -Property @{
IncludeSubdirectories = $false
NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite'
}
$VLCExe = 'C:\Program Files\VideoLAN\VLC\vlc.exe'
$onCreated = Register-ObjectEvent $Watcher -EventName Created -SourceIdentifier FileCreated -Action {
$path = $Event.SourceEventArgs.FullPath
$name = $Event.SourceEventArgs.Name
$changeType = $Event.SourceEventArgs.ChangeType
$timeStamp = $Event.TimeGenerated
Write-Host "The file '$name' was $changeType at $timeStamp"
Write-Host $path
# File Checks
while (Test-LockedFile $path) {
Start-Sleep -Milliseconds 200
}
# Move File
Write-Host "moving $path to $destination"
Move-Item $path -Destination $destination -Force -Verbose
# build the path to the archived .aac file
$SourceFileName = Split-Path $path -Leaf
$DestinationAACwoQuotes = Join-Path $destination $SourceFileName
$DestinationAAC = "`"$DestinationAACwoQuotes`""
$MP3FileName = [System.IO.Path]::ChangeExtension($SourceFileName,".mp3")
$DestinationMP3woQuotes = Join-Path $DestinationDirMP3 $MP3FileName
$DestinationMP3 = "`"$DestinationMP3woQuotes`""
$VLCArgs = "-I dummy -vvv $DestinationAAC --sout=#transcode{acodec=mp3,ab=48,channels=2,samplerate=32000}:standard{access=file,mux=ts,dst=$DestinationMP3} vlc://quit"
Write-Host "args $VLCArgs"
Start-Process -FilePath $VLCExe -ArgumentList $VLCArgs
}
function Test-LockedFile {
param ([parameter(Mandatory=$true)][string]$Path)
$oFile = New-Object System.IO.FileInfo $Path
if ((Test-Path -Path $Path) -eq $false)
{
return $false
}
try
{
$oStream = $oFile.Open([System.IO.FileMode]::Open, [System.IO.FileAccess]::ReadWrite, [System.IO.FileShare]::None)
if ($oStream)
{
$oStream.Close()
}
$false
}
catch
{
# file is locked by a process.
return $true
}
}
From the official documentation of Register-ObjectEvent (notes section)
Events, event subscriptions, and the event queue exist only in the current session. If you close the current session, the event queue is discarded and the event subscription is canceled.
Everything above belongs to your session, which is terminated when the process exits. Even if it were somewhere else, your .NET FileSystemWatcher is part of that process and is terminated as soon as your session exits.
Now, when you debug through VS Code / ISE, your session is created beforehand and does not terminate when the script exits, which allows you to evaluate variables from the last execution. It also means your subscriptions, event callbacks and the associated .NET objects remain in memory and active during that time.
That being said, the debugger also detaches the moment your script exits. It also means that if you were not debugging and just running the script, your session would exit immediately after the script finished, and thus your listener, event callbacks and everything else would be torn down without a chance to process anything.
To keep the debugger attached, be able to debug, and have the script work in a normal context at all, you need to somehow keep the process alive.
The usual way to ensure that is to add a loop at the end of the script.
# At the end of the script.
while ($true) {
Start-Sleep -Seconds 1
}
Note that events are dispatched on the main thread, which means that if your script is sleeping, they won't be processed immediately. Therefore, in the example above, if 10 events were to occur within the 1-second period, they would all get processed together when the thread stops sleeping.
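If you need events handled as soon as they arrive rather than in a batch, one alternative (a sketch, not what the script above does) is to register without -Action so each event lands in the session event queue, and drain that queue yourself with Wait-Event / Remove-Event:
# Sketch: queue-based handling instead of -Action (assumes the same $Watcher).
# Without -Action, every raised event is added to the session's event queue.
Register-ObjectEvent $Watcher -EventName Created -SourceIdentifier FileCreated
try {
    while ($true) {
        # Blocks until an event arrives, or returns nothing after 5 seconds.
        $e = Wait-Event -SourceIdentifier FileCreated -Timeout 5
        if ($e) {
            Write-Host "The file '$($e.SourceEventArgs.FullPath)' was $($e.SourceEventArgs.ChangeType) at $($e.TimeGenerated)"
            # ... your processing code here ...
            $e | Remove-Event   # remove the handled event from the queue
        }
    }
}
finally {
    Unregister-Event -SourceIdentifier FileCreated
}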
Debugging note:
To deal with the "event already registered" error during debugging (Register-ObjectEvent : Cannot subscribe to the specified event. A subscriber with the source identifier 'FileCreated' already exists.), you can add cleanup code in the Finally part of a Try / Catch / Finally block.
try {
# At the end of the script...
while ($true) {
Start-Sleep -Seconds 1
}
}
catch {}
Finally {
# Works with a CTRL + C exit too!
Unregister-Event -SourceIdentifier FileCreated
}
References:
Register-ObjectEvent
I am very new to PowerShell. The following code was created by BigTeddy and he gets full credit (I also made some changes using the while loop).
I want to know how I can create an if/else statement such that if more than one file is changed/edited/created/deleted at the same time (say ten files have been edited simultaneously), a log file is created saying that this list of files was edited simultaneously at that particular time.
The PowerShell script BigTeddy created basically spits out a log file (and output in the PowerShell ISE) recording when a change/edit/creation/deletion has been made, the time it was changed, and which file(s) were edited.
param(
[string]$folderToWatch = "C:\Users\gordon\Desktop\powershellStart"
, [string]$filter = "*.*"
, [string]$logFile = 'C:\Users\gordon\Desktop\powershellDest\filewatcher.log'
)
# In the following line, you can change 'IncludeSubdirectories to $true if required.
$fsw = New-Object IO.FileSystemWatcher $folderToWatch, $filter -Property @{IncludeSubdirectories = $false;NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite'}
$timeStamp #My changes
$timeStampPrev = $timeStamp #My changes
# This script block is used/called by all 3 events and:
# appends the event to a log file, as well as reporting the event back to the console
$scriptBlock = {
# REPLACE THIS SECTION WITH YOUR PROCESSING CODE
$logFile = $event.MessageData # message data is how we pass in an argument to the event script block
$name = $Event.SourceEventArgs.Name
$changeType = $Event.SourceEventArgs.ChangeType
$timeStamp = $Event.TimeGenerated
while($timeStampPrev -eq $timeStamp) { #My changes
Write-Host "$timeStamp|$changeType|'$name'" -fore green
Out-File -FilePath $logFile -Append -InputObject "$timeStamp|$changeType|'$name'"
# REPLACE THIS SECTION WITH YOUR PROCESSING CODE
}
}
# Here, all three events are registered. You need only subscribe to events that you need:
Register-ObjectEvent $fsw Created -SourceIdentifier FileCreated -MessageData $logFile -Action $scriptBlock
Register-ObjectEvent $fsw Deleted -SourceIdentifier FileDeleted -MessageData $logFile -Action $scriptBlock
Register-ObjectEvent $fsw Changed -SourceIdentifier FileChanged -MessageData $logFile -Action $scriptBlock
# To stop the monitoring, run the following commands:
# Unregister-Event FileDeleted ; Unregister-Event FileCreated ; Unregister-Event FileChanged
#This script uses the .NET FileSystemWatcher class to monitor file events in folder(s).
#The advantage of this method over using WMI eventing is that this can monitor sub-folders.
#The -Action parameter can contain any valid Powershell commands.
#The script can be set to a wildcard filter, and IncludeSubdirectories can be changed to $true.
#You need not subscribe to all three types of event. All three are shown for example.
Have you considered a hash table with the timestamp and action (create/modify/delete) as the key and the file name(s) as the value? You iterate through the dictionary after a specific wait interval and flush its entries to a log file, which lets you group the files that changed at the same moment.
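A rough sketch of that idea (untested; the $global:buffer name, the 5-second flush interval and grouping by whole seconds are illustrative choices):
# Sketch: buffer events in a hashtable keyed by "timestamp|action",
# then flush periodically so simultaneous edits end up on one log line.
$global:buffer = @{}
$scriptBlock = {
    $key = '{0:yyyy-MM-dd HH:mm:ss}|{1}' -f $Event.TimeGenerated, $Event.SourceEventArgs.ChangeType
    if (-not $global:buffer.ContainsKey($key)) { $global:buffer[$key] = @() }
    $global:buffer[$key] += $Event.SourceEventArgs.Name
}
# ... register the Created/Deleted/Changed events with $scriptBlock as before ...
while ($true) {
    Start-Sleep -Seconds 5
    foreach ($key in @($global:buffer.Keys)) {
        $files = $global:buffer[$key]
        if ($files.Count -gt 1) {
            Out-File -FilePath $logFile -Append -InputObject "$key|these files were edited simultaneously: $($files -join ', ')"
        }
        else {
            Out-File -FilePath $logFile -Append -InputObject "$key|'$($files[0])'"
        }
        $global:buffer.Remove($key)
    }
}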
I wrote a PowerShell script that monitors a specific folder for changes such as: creating / deleting / renaming items.
It then writes all those actions into a log file. It works fine. The only problem is that it only works while the console is open.
Whenever I try to run it using a batch file it doesn't seem to work (nothing is written to the log file). I'm trying to run it as a background process.
This is the monitoring function :
function CreateMonitor($folder)
{
$folder = 'C:\Users\...\Monitored Folder\' # Enter the root path you want to monitor.
$filter = '*.*' # You can enter a wildcard filter here.
$fsw = New-Object IO.FileSystemWatcher $folder, $filter -Property @{IncludeSubdirectories = $false;NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite'}
Register-ObjectEvent $fsw Created -SourceIdentifier FileCreated -Action {
$name = $Event.SourceEventArgs.Name
$changeType = $Event.SourceEventArgs.ChangeType
$timeStamp = $Event.TimeGenerated
Write-Host "The file '$name' was $changeType at $timeStamp" -fore green
Out-File -FilePath C:\Users\...\outlog.txt -Append -InputObject "The file '$name' was $changeType at $timeStamp"}
}
This is the batch file content :
powershell.exe -windowstyle hidden -file C:\Users\....ps1
How to make it run in the background / as a process?
Per the comment, it's likely the Write-Host statement is breaking your script, because it tries to write output to the console, and when you run the script with no window there is no console to write to.
Best practice would be to change this to a Write-Verbose and then add [cmdletbinding()] to your function. Then if/when you want to see the statement, you call your function with a -Verbose switch:
function CreateMonitor
{
[cmdletbinding()]
Param(
$folder = 'C:\Users\...\Monitored Folder\',
$filter = '*.*'
)
$fsw = New-Object IO.FileSystemWatcher $folder, $filter -Property @{IncludeSubdirectories = $false;NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite'}
Register-ObjectEvent $fsw Created -SourceIdentifier FileCreated -Action {
$name = $Event.SourceEventArgs.Name
$changeType = $Event.SourceEventArgs.ChangeType
$timeStamp = $Event.TimeGenerated
Write-Verbose "The file '$name' was $changeType at $timeStamp" -fore green
Out-File -FilePath C:\Users\...\outlog.txt -Append -InputObject "The file '$name' was $changeType at $timeStamp"}
}
You need to have a Param() block to use [cmdletbinding()], so I've moved your $folder parameter into this. I've also set its default value here, whereas in your original function you were overriding it (even if it had been set via a parameter). It also seemed to make sense to have $filter as a parameter too.
If you now want to see the verbose output you would simply invoke your function via:
CreateMonitor -Verbose
Of course, you just wouldn't use this switch when running it in the background, as there would be nowhere to see the output.
For more info on why using Write-Host is generally discouraged see this blog post from the creator of PowerShell: http://www.jsnover.com/blog/2013/12/07/write-host-considered-harmful/
As a guess, I'm assuming your file looks something like
function CreateMonitor { ... }
CreateMonitor
If this is the case, then it is failing because once the .ps1 file finishes executing it exits, since no further commands remain.
If you add
while ($true) { Start-Sleep -Seconds 1 } # sleep inside the loop so it doesn't peg a CPU core
Then the ps1 file will remain running and won't exit the moment it successfully creates the filewatcher.
Is there any simple way (i.e., a script) to watch a file in PowerShell and run commands if the file changes? I have been googling but can't find a simple solution. Basically, I run a script in PowerShell, and if the file changes, then PowerShell runs other commands.
EDIT
OK, I think I made a mistake. I don't need a script, I need a function that I can include in my $PROFILE.ps1 file. But still, I have tried hard and I'm still unable to write it, so I will offer a bounty. It has to look like this:
function watch($command, $file) {
if($file #changed) {
#run $command
}
}
There is an NPM module that does what I want, watch, but it only watches folders, not files, and it's not PowerShell xD.
Here is an example I have found in my snippets. Hopefully it is a little bit more comprehensive.
First you need to create a file system watcher, and subsequently you subscribe to an event that the watcher generates. This example listens for "Created" events, but could easily be modified to watch for "Changed".
$folder = "C:\Users\LOCAL_~1\AppData\Local\Temp\3"
$filter = "*.LOG"
$Watcher = New-Object IO.FileSystemWatcher $folder, $filter -Property @{
IncludeSubdirectories = $false
NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite'
}
$onCreated = Register-ObjectEvent $Watcher -EventName Created -SourceIdentifier FileCreated -Action {
$path = $Event.SourceEventArgs.FullPath
$name = $Event.SourceEventArgs.Name
$changeType = $Event.SourceEventArgs.ChangeType
$timeStamp = $Event.TimeGenerated
Write-Host "The file '$name' was $changeType at $timeStamp"
Write-Host $path
#Move-Item $path -Destination $destination -Force -Verbose
}
I will try to narrow this down to your requirements.
If you run this as part of your "profile.ps1" script you should read The Power of Profiles which explains the different profile scripts available and more.
Also, you should understand that waiting for a change in a folder can't run as a blocking function in the profile script: the profile script has to finish before your PowerShell session can start. You can, however, use a function to register an event.
What this does is register a piece of code to be executed every time an event is triggered. This code will be executed in the context of your current PowerShell host (or shell) while the session remains open. It can interact with the host session, but has no knowledge of the original script that registered the code. The original script has probably finished already by the time your code is triggered.
Here is the code:
Function Register-Watcher {
param ($folder)
$filter = "*.*" #all files
$watcher = New-Object IO.FileSystemWatcher $folder, $filter -Property @{
IncludeSubdirectories = $false
EnableRaisingEvents = $true
}
$changeAction = [scriptblock]::Create('
# This is the code which will be executed every time a file change is detected
$path = $Event.SourceEventArgs.FullPath
$name = $Event.SourceEventArgs.Name
$changeType = $Event.SourceEventArgs.ChangeType
$timeStamp = $Event.TimeGenerated
Write-Host "The file $name was $changeType at $timeStamp"
')
Register-ObjectEvent $Watcher -EventName "Changed" -Action $changeAction
}
Register-Watcher "c:\temp"
After running this code, change any file in the "C:\temp" directory (or any other directory you specify). You will see an event triggering execution of your code.
Also, valid FileSystemWatcher events you can register are "Changed", "Created", "Deleted" and "Renamed".
I will add another answer, because my previous one missed the requirements.
Requirements
Write a function to WAIT for a change in a specific file
When a change is detected the function will execute a predefined command and return execution to the main script
File path and command are passed to the function as parameters
There is already an answer using file hashes. I want to follow up on my previous answer and show you how this can be accomplished using FileSystemWatcher.
$File = "C:\temp\log.txt"
$Action = 'Write-Output "The watched file was changed"'
$global:FileChanged = $false
function Wait-FileChange {
param(
[string]$File,
[string]$Action
)
$FilePath = Split-Path $File -Parent
$FileName = Split-Path $File -Leaf
$ScriptBlock = [scriptblock]::Create($Action)
$Watcher = New-Object IO.FileSystemWatcher $FilePath, $FileName -Property @{
IncludeSubdirectories = $false
EnableRaisingEvents = $true
}
$onChange = Register-ObjectEvent $Watcher Changed -Action {$global:FileChanged = $true}
while ($global:FileChanged -eq $false){
Start-Sleep -Milliseconds 100
}
& $ScriptBlock
Unregister-Event -SubscriptionId $onChange.Id
}
Wait-FileChange -File $File -Action $Action
Here is the solution I ended up with based on several of the previous answers here. I specifically wanted:
My code to be code, not a string
My code to be run on the I/O thread so I can see the console output
My code to be called every time there was a change, not once
Side note: I've left in the details of what I wanted to run due to the irony of using a global variable to communicate between threads so I can compile Erlang code.
Function RunMyStuff {
# this is the bit we want to happen when the file changes
Clear-Host # remove previous console output
& 'C:\Program Files\erl7.3\bin\erlc.exe' 'program.erl' # compile some erlang
erl -noshell -s program start -s init stop # run the compiled erlang program:start()
}
Function Watch {
$global:FileChanged = $false # dirty... any better suggestions?
$folder = "M:\dev\Erlang"
$filter = "*.erl"
$watcher = New-Object IO.FileSystemWatcher $folder, $filter -Property @{
IncludeSubdirectories = $false
EnableRaisingEvents = $true
}
Register-ObjectEvent $Watcher "Changed" -Action {$global:FileChanged = $true} > $null
while ($true){
while ($global:FileChanged -eq $false){
# We need this to block the IO thread until there is something to run
# so the script doesn't finish. If we call the action directly from
# the event it won't be able to write to the console
Start-Sleep -Milliseconds 100
}
# a file has changed, run our stuff on the I/O thread so we can see the output
RunMyStuff
# reset and go again
$global:FileChanged = $false
}
}
RunMyStuff # run the action at the start so I can see the current output
Watch
You could pass folder/filter/action into Watch if you want something more generic. Hopefully this is a helpful starting point for someone else.
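For instance, a generic version might look roughly like this (a sketch; Watch-AndRun is a made-up name and the defaults are illustrative):
# Sketch of a parameterised version of the same pattern (untested).
Function Watch-AndRun {
    param(
        [string]$Folder,
        [string]$Filter = '*.*',
        [scriptblock]$Action
    )
    $global:FileChanged = $false
    $watcher = New-Object IO.FileSystemWatcher $Folder, $Filter -Property @{
        IncludeSubdirectories = $false
        EnableRaisingEvents = $true
    }
    Register-ObjectEvent $watcher "Changed" -Action { $global:FileChanged = $true } > $null
    while ($true) {
        while ($global:FileChanged -eq $false) {
            Start-Sleep -Milliseconds 100 # block here until the event flips the flag
        }
        & $Action                    # run the caller's block on the console thread
        $global:FileChanged = $false # reset and wait for the next change
    }
}
# Usage (illustrative): Watch-AndRun -Folder 'M:\dev\Erlang' -Filter '*.erl' -Action { RunMyStuff }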
Calculate the hash of a list of files
Store it in a dictionary
Check each hash on an interval
Perform action when hash is different
function watch($f, $command, $interval) {
$sha1 = New-Object System.Security.Cryptography.SHA1CryptoServiceProvider
$hashfunction = '[System.BitConverter]::ToString($sha1.ComputeHash([System.IO.File]::ReadAllBytes($file)))'
$files = @{}
foreach ($file in $f) {
$hash = iex $hashfunction
$files[$file.Name] = $hash
echo "$hash`t$($file.FullName)"
}
while ($true) {
sleep $interval
foreach ($file in $f) {
$hash = iex $hashfunction
if ($files[$file.Name] -ne $hash) {
iex $command
}
}
}
}
Example usage:
$c = 'send-mailmessage -to "admin@whatever.com" -from "watch@whatever.com" -subject "$($file.Name) has been altered!"'
$f = ls C:\MyFolder\aFile.jpg
watch $f $c 60
You can use the System.IO.FileSystemWatcher to monitor a file.
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = $searchPath
$watcher.IncludeSubdirectories = $true
$watcher.EnableRaisingEvents = $true
See also this article
Here is another option.
I just needed to write my own to watch and run tests within a Docker container. Jan's solution is much more elegant, but FileSystemWatcher is broken within Docker containers presently. My approach is similar to Vasili's, but much lazier, trusting the file system's write time.
Here's the function I needed, which runs the command block each time the file changes.
function watch($command, $file) {
$this_time = (get-item $file).LastWriteTime
$last_time = $this_time
while($true) {
if ($last_time -ne $this_time) {
$last_time = $this_time
invoke-command $command
}
sleep 1
$this_time = (get-item $file).LastWriteTime
}
}
Here is one that waits until the file changes, runs the block, then exits.
function waitfor($command, $file) {
$this_time = (get-item $file).LastWriteTime
$last_time = $this_time
while($last_time -eq $this_time) {
sleep 1
$this_time = (get-item $file).LastWriteTime
}
invoke-command $command
}
I had a similar problem. I first wanted to use Windows events and register for them, but that would be less fault-tolerant than the solution below.
My solution was a polling script (3-second interval). The script has a minimal footprint on the system and notices changes very quickly. During the loop my script can do more things (in fact I check 3 different folders).
My polling script is started through the Task Scheduler. The schedule starts it every 5 minutes with the stop-when-already-running flag. This way it restarts after a reboot or after a crash.
Using the Task Scheduler itself to poll every 3 seconds would be far too frequent for it.
When you add a task to the scheduler, make sure you do not use network drives (those would call for extra settings) and give your user "log on as a batch job" privileges.
I give my script a clean start by shutting it down a few minutes before midnight. The Task Scheduler starts the script again every morning (the init function of my script exits within a minute or so of midnight).
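The core of such a polling loop could look roughly like this (a sketch, not my actual script; the folder list, the 3-second interval and the midnight cut-off are illustrative):
# Sketch of a 3-second polling loop over several folders (illustrative paths).
# Note: the first pass simply records all existing files as "changes".
$folders = 'C:\watch\one', 'C:\watch\two', 'C:\watch\three'
$snapshot = @{}  # FullName -> last seen LastWriteTime
while ($true) {
    foreach ($folder in $folders) {
        Get-ChildItem $folder -File -ErrorAction SilentlyContinue | ForEach-Object {
            if ($snapshot[$_.FullName] -ne $_.LastWriteTime) {
                $snapshot[$_.FullName] = $_.LastWriteTime
                Write-Host "Change detected: $($_.FullName)"
                # ... handle the new or changed file here ...
            }
        }
    }
    # Exit shortly before midnight so the scheduled task restarts a clean instance.
    if ((Get-Date).TimeOfDay -gt [TimeSpan]'23:57') { break }
    Start-Sleep -Seconds 3
}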
I was looking for something I could run as a one-liner from a terminal. This is what I arrived at:
while ($True) { if ((Get-Item .\readme.md).LastWriteTime -ne $LastWriteTime) { "Hello!"; $LastWriteTime = (Get-Item .\readme.md).LastWriteTime }; Sleep 1 }
Another simple version:
$date = get-date
while ( (dir file.txt -ea 0 | % lastwritetime) -lt $date -and $count++ -lt 10) {
sleep 1
}
'file changed or timeout'
I have my PowerShell project broken into modules. But because they are modules I have to reload them every time I change them. So I wrote a loop that has a FileSystemWatcher and if one of the .psm1 file changes it will either reload or import that module.
The problem is that the above loop isn't going to let me run other scripts in its working environment, so a new environment will not have the same modules loaded/reloaded for it. I need to keep these modules out of the primary default PowerShell modules folder(s). Is there a way to run the script that reloads the modules when they change in the same environment or affect a certain environment?
UPDATE
So I run the following Module-Loader.ps1 script. The code block associated with the 'FileChanged' event does fire when I save a *.psm1 file after it has been modified. However, two issues occur:
1) It fires twice when I save.
2a) If the module is not loaded, it will run Import-Module $PWD\ModuleName, but the module won't actually be loaded, at least not in my session (if I run the same code directly in the session, it loads).
2b) If it is loaded and it tries to remove the module, it errors saying that no such module exists.
# create a FileSystemWatcher on the current directory
$filter = '*.psm1'
$folder = $PWD
$watcher = New-object IO.FileSystemWatcher $folder, $filter -Property @{IncludeSubdirectories = $false; EnableRaisingEvents = $true; NotifyFilter = [IO.NotifyFilters]'LastWrite'}
Register-ObjectEvent $watcher Changed -SourceIdentifier FileChanged -Action {
$name = $Event.SourceEventArgs.Name
$filename = $name.Remove($name.IndexOf('.'), 5)
$loadedModule = Get-Module | ? { $_.Name -eq $filename }
write-host $filename
if ($loadedModule) {
write-host "Reloading Module $folder\$($filename)"
Reload-Module $folder\$filename
} else {
write-host "Importing Module $folder\$($filename)"
Import-Module $folder\$filename
}
}
I am of the opinion that even though this is being run in a session, the code block in the event is not associated with this specific session/environment.
Here is an example from some code I have that copies a folder to a shared folder any time something has changed in it. It's kinda my little dropbox implementation :-)
Any time one of the file system watcher event types such as Changed occurs, the code specified in the -Action parameter of the Register-ObjectEvent cmdlet will fire.
In your -Action code you would call Import-Module with the -Force parameter to overwrite the copy that is already in memory (a sketch of that follows the example below).
function Backup-Folder {
& robocopy.exe "c:\folder" "\\server\share" /MIR /W:10 /R:10
}
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = "c:\folder"
$watcher.IncludeSubdirectories = $true
$watcher.EnableRaisingEvents = $true
Register-ObjectEvent $watcher "Changed" -Action { Backup-Folder }
Register-ObjectEvent $watcher "Created" -Action { Backup-Folder }
Register-ObjectEvent $watcher "Deleted" -Action { Backup-Folder }
Register-ObjectEvent $watcher "Renamed" -Action { Backup-Folder }