How to find the file that triggered a FileSystemWatcher event - PowerShell

I'm using a FileSystemWatcher to check for changed files in a target directory. However, it does not seem like I can access the information about which file triggered the event, or I simply don't know how.
$Action = {
# Output name of trigger file here.
}
$FileSystemWatcher = New-Object System.IO.FileSystemWatcher $TargetDirectory
Register-ObjectEvent -InputObject $FileSystemWatcher -EventName Changed -Action $Action
As I'm waiting for events in multiple target directories, the alternative of waiting synchronously is not an option for me.
Am I doing something wrong?

When using this code:
$Action = {
# Output name of trigger file here.
Write-Host $Event.SourceEventArgs.FullPath
}
$TargetDirectory = "c:\temp\fsw"
$FileSystemWatcher = New-Object System.IO.FileSystemWatcher $TargetDirectory
Register-ObjectEvent -InputObject $FileSystemWatcher -EventName Changed -Action $Action
I get the full path of the file being changed.
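Beyond FullPath, the event object exposes a few other useful properties inside the action block; a minimal sketch (ChangeType and Name come from System.IO.FileSystemEventArgs, TimeGenerated from the PSEventArgs wrapper):

```powershell
$Action = {
    # $Event is populated automatically inside the action scriptblock.
    $fsArgs = $Event.SourceEventArgs
    Write-Host ("{0} | {1} | {2}" -f $Event.TimeGenerated, $fsArgs.ChangeType, $fsArgs.FullPath)
}
```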

How to provide options to event registration?

I don't understand why the following code generates error messages. PowerShell seems difficult to learn.
$fsw = New-Object IO.FileSystemWatcher ...
$Action = {Param($option)
if ($option -eq "Copy")
{Write-Host "Copy was specified"}
}
Register-ObjectEvent $fsw Created -SourceIdentifier FileCreated -Action $Action
Register-ObjectEvent $fsw Changed -SourceIdentifier FileChanged -Action $Action -ArgumentList Copy
(This is an updated version of question How to provide options to script blocks?)
When in doubt, read the documentation. The Register-ObjectEvent cmdlet does not have a parameter -ArgumentList, so getting an error when trying to call the cmdlet with a parameter that it doesn't have is hardly surprising. And as I already told you in my comment to your previous question, passing arguments to an event action the way you're trying to do does not make much sense in the first place.
If your actions are so similar that defining separate action script blocks would produce too much boilerplate, you can easily discriminate by event inside the script block, since that information is passed to the action automatically (via the automatic variable $Event).
$Action = {
$name = $Event.SourceEventArgs.Name
$type = $Event.SourceEventArgs.ChangeType
switch ($type) {
'Created' { Write-Host "file $name was created" }
'Changed' { Write-Host "file $name was changed" }
}
}
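For completeness: if you do need to pass a custom argument to the action, Register-ObjectEvent supports -MessageData (readable inside the action as $Event.MessageData), not -ArgumentList. A sketch using the shared action above:

```powershell
$Action = {
    $name = $Event.SourceEventArgs.Name
    $type = $Event.SourceEventArgs.ChangeType
    $opt  = $Event.MessageData   # whatever was passed via -MessageData, or $null
    Write-Host "file $name was $type (option: $opt)"
}
Register-ObjectEvent $fsw Created -SourceIdentifier FileCreated -Action $Action
Register-ObjectEvent $fsw Changed -SourceIdentifier FileChanged -MessageData 'Copy' -Action $Action
```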

Using FileSystemWatcher to alert if multiple files change simultaneously

I am very new to PowerShell. The following code was created by BigTeddy, and he gets full credit (I also made some changes using a while loop).
I want to know how I can create an if/else statement such that if more than one file is changed/edited/created/deleted at the same time (say, ten files edited simultaneously), a log file will be created listing the files that were edited simultaneously at that particular time.
The PowerShell script BigTeddy created basically writes out a log file (and output in the PowerShell ISE) recording when a change/edit/creation/deletion was made, the time it was changed, and which file(s) were edited.
param(
[string]$folderToWatch = "C:\Users\gordon\Desktop\powershellStart"
, [string]$filter = "*.*"
, [string]$logFile = 'C:\Users\gordon\Desktop\powershellDest\filewatcher.log'
)
# In the following line, you can change IncludeSubdirectories to $true if required.
$fsw = New-Object IO.FileSystemWatcher $folderToWatch, $filter -Property @{IncludeSubdirectories = $false; NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite'}
$timeStamp #My changes
$timeStampPrev = $timeStamp #My changes
# This script block is used/called by all 3 events and:
# appends the event to a log file, as well as reporting the event back to the console
$scriptBlock = {
# REPLACE THIS SECTION WITH YOUR PROCESSING CODE
$logFile = $event.MessageData # message data is how we pass in an argument to the event script block
$name = $Event.SourceEventArgs.Name
$changeType = $Event.SourceEventArgs.ChangeType
$timeStamp = $Event.TimeGenerated
while($timeStampPrev -eq $timeStamp) { #My changes
Write-Host "$timeStamp|$changeType|'$name'" -fore green
Out-File -FilePath $logFile -Append -InputObject "$timeStamp|$changeType|'$name'"
# REPLACE THIS SECTION WITH YOUR PROCESSING CODE
}
}
# Here, all three events are registered. You need only subscribe to events that you need:
Register-ObjectEvent $fsw Created -SourceIdentifier FileCreated -MessageData $logFile -Action $scriptBlock
Register-ObjectEvent $fsw Deleted -SourceIdentifier FileDeleted -MessageData $logFile -Action $scriptBlock
Register-ObjectEvent $fsw Changed -SourceIdentifier FileChanged -MessageData $logFile -Action $scriptBlock
# To stop the monitoring, run the following commands:
# Unregister-Event FileDeleted ; Unregister-Event FileCreated ; Unregister-Event FileChanged
#This script uses the .NET FileSystemWatcher class to monitor file events in folder(s).
#The advantage of this method over using WMI eventing is that this can monitor sub-folders.
#The -Action parameter can contain any valid Powershell commands.
#The script can be set to a wildcard filter, and IncludeSubdirectories can be changed to $true.
#You need not subscribe to all three types of event. All three are shown for example.
Have you considered a hash table with the timestamp and action (create/modify/delete) as the key and the file names as the value? You iterate through the dictionary after a specific wait interval and flush the entries to a log file.
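One possible shape for that buffering idea (an untested sketch; the flush interval is arbitrary and $fsw/$logFile are assumed from the script above). Any key that collects more than one file name represents simultaneous changes:

```powershell
# Shared buffer: "timestamp|changeType" -> array of file names.
$events = [hashtable]::Synchronized(@{})

$scriptBlock = {
    $buffer = $Event.MessageData
    $key = '{0:s}|{1}' -f $Event.TimeGenerated, $Event.SourceEventArgs.ChangeType
    if (-not $buffer.ContainsKey($key)) { $buffer[$key] = @() }
    $buffer[$key] += $Event.SourceEventArgs.Name
}
Register-ObjectEvent $fsw Changed -SourceIdentifier FileChanged -MessageData $events -Action $scriptBlock

# Periodically flush the buffer to the log.
while ($true) {
    Start-Sleep -Seconds 5
    foreach ($key in @($events.Keys)) {
        if ($events[$key].Count -gt 1) {
            "$key|simultaneous: $($events[$key] -join ', ')" | Out-File -FilePath $logFile -Append
        }
        $events.Remove($key)
    }
}
```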

Powershell: Execute a script after file finishes copying

I am having an issue with IO.FileSystemWatcher. I have a script that, when a file is copied to a certain directory, processes that file and copies it to another folder on the network. But there is a problem: it starts copying as soon as the watcher fires the Created event, which generates an error (the file is still open), and I want it to start only after the file has finished copying. Rename and delete work properly.
cloud.ps1 is the script that processes the file and copies it.
This is the code for monitor script:
$watch = '\\HR-ZAG-SR-0011\ACO\ACO2\99. BOX'
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = $watch
$watcher.IncludeSubdirectories = $true
$watcher.EnableRaisingEvents = $true
$changed = Register-ObjectEvent $watcher "Changed" -Action {
# Write-Host "Changed: $($eventArgs.ChangeType)"
# $watch\cloud.ps1 modify "$($eventArgs.FullPath)"
}
$created = Register-ObjectEvent $watcher "Created" -Action {
Write-Host "Created: $($eventArgs.FullPath)"
.\cloud.ps1 create "$($eventArgs.FullPath)"
}
$deleted = Register-ObjectEvent $watcher "Deleted" -Action {
Write-Host "Deleted: $($eventArgs.FullPath)"
.\cloud.ps1 delete "$($eventArgs.FullPath)"
}
$renamed = Register-ObjectEvent $watcher "Renamed" -Action {
Write-Host "Renamed: $($eventArgs.OldFullPath) to $($eventArgs.FullPath)"
.\cloud.ps1 rename "$($eventArgs.OldFullPath)" "$($eventArgs.FullPath)"
}
You can put in a loop that keeps checking whether you can get a write lock on the file:
While ($True)
{
Try {
[IO.File]::OpenWrite($file).Close()
Break
}
Catch { Start-Sleep -Seconds 1 }
}
As long as the file is being written to, you won't be able to get a write lock, and the Try will fail. Once the write is complete and the file is closed, the OpenWrite will succeed, the loop will break and you can proceed with copying the file.
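Wrapped as a reusable helper with a timeout so it cannot spin forever (a sketch; the function name Wait-FileUnlock is made up for illustration):

```powershell
function Wait-FileUnlock {
    param(
        [string]$Path,
        [int]$TimeoutSeconds = 60
    )
    $deadline = (Get-Date).AddSeconds($TimeoutSeconds)
    while ((Get-Date) -lt $deadline) {
        try {
            # Succeeds only once no other process holds the file open for writing.
            [IO.File]::OpenWrite($Path).Close()
            return $true
        }
        catch { Start-Sleep -Seconds 1 }
    }
    return $false   # still locked after the timeout
}

# Inside the Created action, for example:
# if (Wait-FileUnlock $eventArgs.FullPath) { .\cloud.ps1 create "$($eventArgs.FullPath)" }
```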

Monitoring files in Powershell and using ArrayList

I'm new to PowerShell. I would like to add a file path to an ArrayList every time it changes. However, this PowerShell script fails somehow. Any hints what might I be doing wrong?
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = "C:\Mydir"
$watcher.IncludeSubdirectories = $true
$watcher.EnableRaisingEvents = $true
$jobs = New-Object System.Collections.ArrayList
$changed = Register-ObjectEvent $watcher "Changed" -Action {
$changedFile = $($eventArgs.FullPath)
$jobs.Add($changedFile)
}
It's a scope issue: the event action runs in its own scope, so it can't see the $jobs variable you defined. Add the global scope modifier:
$global:jobs.Add($changedFile)
See about_Scopes for more help.
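Put together, the corrected script would look something like this (a sketch; [void] suppresses the index that ArrayList.Add returns):

```powershell
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = "C:\Mydir"
$watcher.IncludeSubdirectories = $true
$watcher.EnableRaisingEvents = $true

$global:jobs = New-Object System.Collections.ArrayList
$changed = Register-ObjectEvent $watcher "Changed" -Action {
    # The action runs in a different scope, so target the global variable explicitly.
    [void]$global:jobs.Add($eventArgs.FullPath)
}
```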

How can I reload modules from one open work environment to affect another working environment

I have my PowerShell project broken into modules, but because they are modules I have to reload them every time I change them. So I wrote a loop with a FileSystemWatcher: if one of the .psm1 files changes, it will either reload or import that module.
The problem is that the above loop isn't going to let me run other scripts in its working environment, so a new environment will not have the same modules loaded/reloaded for it. I need to keep these modules out of the primary default PowerShell modules folder(s). Is there a way to run the script that reloads the modules when they change in the same environment, or to affect a specific environment?
UPDATE
So I run the following Module-Loader.ps1 script. The code block associated with the 'FileChanged' event does fire when I save a *.psm1 file after it has been modified. However, two issues occur:
1) It fires twice when I save.
2a) If the module is not loaded, it will run Import-Module $PWD\ModuleName, but the module won't actually be loaded, at least in my environment (if I run the same code directly in the environment, it will load).
2b) If it is loaded and it tries to remove the module, it errors that none exists.
# create a FileSystemWatcher on the current directory
$filter = '*.psm1'
$folder = $PWD
$watcher = New-Object IO.FileSystemWatcher $folder, $filter -Property @{IncludeSubdirectories = $false; EnableRaisingEvents = $true; NotifyFilter = [IO.NotifyFilters]'LastWrite'}
Register-ObjectEvent $watcher Changed -SourceIdentifier FileChanged -Action {
$name = $Event.SourceEventArgs.Name
$filename = $name.Remove($name.IndexOf('.'), 5)
$loadedModule = Get-Module | ? { $_.Name -eq $filename }
write-host $filename
if ($loadedModule) {
write-host "Reloading Module $folder\$($filename)"
Reload-Module $folder\$filename
} else {
write-host "Importing Module $folder\$($filename)"
Import-Module $folder\$filename
}
}
I am of the opinion that, though this is being run in a session, the code block in the event is not associated with this specific environment.
Here is an example from some code I have that copies a folder to a shared folder any time something in it changes. It's kind of my own little Dropbox implementation :-)
Any time one of the file system watcher event types such as Changed occurs, the code specified in the -Action parameter of the Register-ObjectEvent cmdlet will fire.
In your -Action code you would call Import-Module with the -Force parameter to overwrite the current one in memory.
function Backup-Folder {
& robocopy.exe "c:\folder" "\\server\share" /MIR /W:10 /R:10
}
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = "c:\folder"
$watcher.IncludeSubdirectories = $true
$watcher.EnableRaisingEvents = $true
Register-ObjectEvent $watcher "Changed" -Action { Backup-Folder }
Register-ObjectEvent $watcher "Created" -Action { Backup-Folder }
Register-ObjectEvent $watcher "Deleted" -Action { Backup-Folder }
Register-ObjectEvent $watcher "Renamed" -Action { Backup-Folder }
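Applied to the module-reloading question, the same pattern with Import-Module -Force might look like this (a sketch; it assumes a watcher filtered to *.psm1 as in the question):

```powershell
Register-ObjectEvent $watcher Changed -SourceIdentifier ModuleChanged -Action {
    $path = $Event.SourceEventArgs.FullPath
    # -Force reimports even if a module of the same name is already loaded,
    # replacing the in-memory copy in the session that owns this event subscriber.
    Import-Module $path -Force
}
```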