FileSystemWatcher - New Files During a Task - powershell

I have a folder being watched for new file creations. When a file is created, the task that is run can take 5-30 minutes. Often a new file is created while the last one is still being processed.
From a few test cases it seems that there is some sort of queue for the tasks. Is there a way to allow the tasks to run simultaneously? If not, is there a way to see the queue or be notified when something is put into it?
$folder = 'C:\Users\Public\Recorded TV' # Enter the root path you want to monitor.
$filter = '*.*' # You can enter a wildcard filter here.
$fsw = New-Object IO.FileSystemWatcher $folder, $filter -Property @{IncludeSubdirectories = $false; NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite'}
Register-ObjectEvent $fsw Created -SourceIdentifier FileCreated -Action { <do stuff> }

If you want to do things simultaneously you would want to have the action start a job to run in the background, but then you may need a system to track the jobs. If you don't mind letting things just filter through the event queue, you can leave it as it is and monitor things with Get-Event.
More info on either of these can be found online or in PowerShell with Get-Help about_Jobs or Get-Help Get-Event.
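A minimal sketch of that job-based approach, reusing the $fsw watcher defined above (the work inside the job is a placeholder for your 5-30 minute task):
Register-ObjectEvent $fsw Created -SourceIdentifier FileCreated -Action {
    $path = $Event.SourceEventArgs.FullPath

    # hand the long-running work off to a background job so this handler
    # returns immediately and the next Created event can be picked up
    Start-Job -ScriptBlock {
        param($file)
        # <do stuff with $file here>
    } -ArgumentList $path
}

# rudimentary tracking of the queued work:
Get-Job     # one job per file picked up by the watcher
Get-Event   # any events still sitting in the event queue
Each Created event then returns almost immediately after queuing a job, so several files can be processed at the same time, and Get-Job / Receive-Job give you the tracking system mentioned above.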

Related

robocopy: how to make automatic monitoring and copy contents of a folder when anything changes with robocopy?

In the source (parent) folder there are certain contents (subfolders, files), and changes happen to these contents from time to time. My idea is to monitor this parent folder with the help of robocopy and look for changes every 2 minutes. If a change occurs, copy the changes to the destination folder. Below is my robocopy command:
robocopy C:\Users\username\Desktop\test_robocopy\source /E C:\Users\username\Desktop\test_robocopy\destination /mot:2
At first it detects changes automatically, the command runs, and files are copied from source to destination. The problem is that it is not automated afterwards; it needs a key press in the terminal (for example, hitting Enter) before it runs through again. What am I missing in the robocopy command above to make it automated so that no other intervention is needed?
This isn't exactly what you expect, but I think it accomplishes what you want.
I have based my script on the one found here.
This is a powershell script, so you need to save it as .ps1 and run it.
You can check the original answer for more details.
Basically, what it does is watch for file changes and trigger robocopy whenever a change happens.
In your case, I think you basically need to raise the wait time.
$block = {
    function Do-Something
    {
        param ($message, $event)
        # function to call when an event is raised
        # do a robocopy or whatever
        foreach ($folder in $watchedFolder) {
            # the string replace keeps only the name of the folder to be mirrored,
            # not the whole path - I'm sure there's a way to do this without the
            # string replacement but I still didn't have time to find out how.
            robocopy $folder ("o:\" + $folder.Replace("C:\Users\User\", "")) /MIR /e
        }
    }

    # here I have created an array instead of only one folder. This way I can simply add
    # all my folders here, and when my "foreach" is fired above, it will run for all of them
    $watchedFolder = ("C:\Users\User\Desktop\a", "C:\Users\User\Desktop\b")

    # a FileSystemWatcher can only watch a single path, so create one watcher per folder
    foreach ($folder in $watchedFolder) {
        $watcher = New-Object System.IO.FileSystemWatcher
        $watcher.Path = $folder
        $watcher.EnableRaisingEvents = $true
        Register-ObjectEvent -InputObject $watcher -EventName Created -Action { Do-Something "Created" $event }
        Register-ObjectEvent -InputObject $watcher -EventName Deleted -Action { Do-Something "Deleted" $event }
        Register-ObjectEvent -InputObject $watcher -EventName Changed -Action { Do-Something "Changed" $event }
        Register-ObjectEvent -InputObject $watcher -EventName Renamed -Action { Do-Something "Renamed" $event }
    }
}
$encodedBlock = [Convert]::ToBase64String([Text.Encoding]::Unicode.GetBytes($block))
Start-Process PowerShell.exe -ArgumentList '-WindowStyle Hidden', '-NoExit', '-EncodedCommand', $encodedBlock
# keep this launcher script alive (sleeping 5 seconds per loop) while the hidden
# PowerShell process does the actual watching; in your case, you can use Start-Sleep -Seconds 120
while ($true) { Start-Sleep -Seconds 5 }
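As a side note on the comment in the script above: one way to avoid the string replacement when building the destination path is Split-Path, which returns just the last element of a path (a small sketch using the same example folders):
foreach ($folder in $watchedFolder) {
    $name = Split-Path $folder -Leaf                  # e.g. "a" for C:\Users\User\Desktop\a
    robocopy $folder (Join-Path "o:\" $name) /MIR /e
}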

How to watch a file, whenever it changes, take the new additional line and perform some actions on it with powershell

How can I watch a log file so that each time a line is added to it, I reformat that line and broadcast it to a webhook using Invoke-RestMethod (basically using Discord as a log file), and then possibly also output some of the same info to the console?
This is a similar question to others asked like it. I've been trying to do this for 2 weeks. In the following post they answered most of this question:
Watch file for changes and run command with powershell
However, it does not show how to take the line that was added to the file and perform actions on it, all while still waiting for the log file to be updated again with the next entry. Log files can be updated in fast succession, so I am not sure these methods will keep up if another change comes in while the program is still performing the action on the previously detected change.
So far, I was trying with something like this in a loop (this is not the entire code):
get-content '<file-name>' -tail 1 -wait
But it looks like this may miss lines that arrive in quick succession, as more lines may come in while it is processing the additional actions. So it looks like I may need to do this in a much more complicated way (which is okay). Just trying to figure out which direction to go.
Any suggestions or direction to go are much appreciated. Even a link if that's what it takes.
What you are doing there is only suited to displaying a file in real time on the screen; you cannot manipulate the output that way. The command you are using is not meant for that kind of interactive use case.
You can monitor for file updates without doing that by using a FileSystemWatcher, which lets you watch for file actions that you can then respond to.
'PowerShell filesystemwatcher monitor file'
https://duckduckgo.com/?q=%27PowerShell+filesystemwatcher+monitor+file%27&t=h_&ia=web
For example, from one of the hits from the above link:
https://powershell.one/tricks/filesystem/filesystemwatcher
Monitoring Folders for File Changes
With a FileSystemWatcher, you can monitor folders for file changes and
respond immediately when changes are detected. This way, you can
create “drop” folders and respond to log file changes.
Specifically, as per your use case:
Advanced Mode (asynchronous)
If you expect changes to happen in rapid succession or even
simultaneously, you can use the FileSystemWatcher in asynchronous
mode: the FileSystemWatcher now works in the background and no longer
blocks PowerShell. Instead, whenever a change occurs, an event is
raised. So with this approach, you get a queue and won’t miss any
change.
On the back side, this approach has two challenges:
Handling Events: since PowerShell is single-threaded by nature, it is
not trivial to respond to events, and even more cumbersome to debug
event handler code.
Keeping PowerShell running: ironically, because the FileSystemWatcher no longer blocks PowerShell, this leads to another problem. You need to keep PowerShell waiting for events, but you cannot use Start-Sleep or an endless loop, because as long as PowerShell is busy (and it is considered busy even if it sleeps), no events can be handled.
Implementation
The script below does the exact same thing as the synchronous version
from above, only it is event-based and won’t miss any events anymore:
# find the path to the desktop folder:
$desktop = [Environment]::GetFolderPath('Desktop')

# specify the path to the folder you want to monitor:
$Path = $desktop

# specify which files you want to monitor
$FileFilter = '*'

# specify whether you want to monitor subfolders as well:
$IncludeSubfolders = $true

# specify the file or folder properties you want to monitor:
$AttributeFilter = [IO.NotifyFilters]::FileName, [IO.NotifyFilters]::LastWrite

try
{
    $watcher = New-Object -TypeName System.IO.FileSystemWatcher -Property @{
        Path = $Path
        Filter = $FileFilter
        IncludeSubdirectories = $IncludeSubfolders
        NotifyFilter = $AttributeFilter
    }

    # define the code that should execute when a change occurs:
    $action = {
        # the code is receiving this to work with:
        # change type information:
        $details = $event.SourceEventArgs
        $Name = $details.Name
        $FullPath = $details.FullPath
        $OldFullPath = $details.OldFullPath
        $OldName = $details.OldName

        # type of change:
        $ChangeType = $details.ChangeType

        # when the change occurred:
        $Timestamp = $event.TimeGenerated

        # save information to a global variable for testing purposes
        # so you can examine it later
        # MAKE SURE YOU REMOVE THIS IN PRODUCTION!
        $global:all = $details

        # now you can define some action to take based on the
        # details about the change event:

        # let's compose a message:
        $text = "{0} was {1} at {2}" -f $FullPath, $ChangeType, $Timestamp
        Write-Host ""
        Write-Host $text -ForegroundColor DarkYellow

        # you can also execute code based on change type here:
        switch ($ChangeType)
        {
            'Changed' { "CHANGE" }
            'Created' { "CREATED" }
            'Deleted' { "DELETED"
                # to illustrate that ALL changes are picked up even if
                # handling an event takes a lot of time, we artificially
                # extend the time the handler needs whenever a file is deleted
                Write-Host "Deletion Handler Start" -ForegroundColor Gray
                Start-Sleep -Seconds 4
                Write-Host "Deletion Handler End" -ForegroundColor Gray
            }
            'Renamed' {
                # this executes only when a file was renamed
                $text = "File {0} was renamed to {1}" -f $OldName, $Name
                Write-Host $text -ForegroundColor Yellow
            }

            # any unhandled change types surface here:
            default { Write-Host $_ -ForegroundColor Red -BackgroundColor White }
        }
    }

    # subscribe your event handler to all event types that are
    # important to you. Do this as a scriptblock so all returned
    # event handlers can be easily stored in $handlers:
    $handlers = . {
        Register-ObjectEvent -InputObject $watcher -EventName Changed -Action $action
        Register-ObjectEvent -InputObject $watcher -EventName Created -Action $action
        Register-ObjectEvent -InputObject $watcher -EventName Deleted -Action $action
        Register-ObjectEvent -InputObject $watcher -EventName Renamed -Action $action
    }

    # monitoring starts now:
    $watcher.EnableRaisingEvents = $true

    Write-Host "Watching for changes to $Path"

    # since the FileSystemWatcher is no longer blocking PowerShell
    # we need a way to pause PowerShell while being responsive to
    # incoming events. Use an endless loop to keep PowerShell busy:
    do
    {
        # Wait-Event waits for a second and stays responsive to events
        # Start-Sleep in contrast would NOT work and ignore incoming events
        Wait-Event -Timeout 1

        # write a dot to indicate we are still monitoring:
        Write-Host "." -NoNewline
    } while ($true)
}
finally
{
    # this gets executed when user presses CTRL+C:
    # stop monitoring
    $watcher.EnableRaisingEvents = $false

    # remove the event handlers
    $handlers | ForEach-Object {
        Unregister-Event -SourceIdentifier $_.Name
    }

    # event handlers are technically implemented as a special kind
    # of background job, so remove the jobs now:
    $handlers | Remove-Job

    # properly dispose the FileSystemWatcher:
    $watcher.Dispose()

    Write-Warning "Event Handler disabled, monitoring ends."
}
So, with the above, you tweak it to look for updates/modifications, then use
$CaptureLine = Get-Content -Path 'UNCToTheLogFile' | Select-Object -Last 1
Or
$CaptureLine = Get-Content -Path 'D:\temp\book1.csv' -Tail 1
And do what you want from that.
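For instance, a rough sketch of what the Changed handler's $action could look like for your log-to-Discord scenario; the log path, webhook URL, and message format here are assumptions you would replace with your own:
$action = {
    # assumed log file path and webhook endpoint - adapt to your setup
    $logFile    = 'D:\temp\app.log'
    $webhookUrl = 'https://example.com/webhook'

    if ($event.SourceEventArgs.FullPath -eq $logFile) {
        # grab the newest line appended to the log
        $line = Get-Content -Path $logFile -Tail 1

        # reformat it however you need and post it to the webhook
        $body = @{ content = "LOG: $line" } | ConvertTo-Json
        Invoke-RestMethod -Uri $webhookUrl -Method Post -ContentType 'application/json' -Body $body

        # optionally echo the same info to the console
        Write-Host $line -ForegroundColor Cyan
    }
}
Note that -Tail 1 only returns the final line; if several lines can be appended between two Changed events, you would also need to remember how far you last read (for example, the previous line count) and process everything after that point.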

How to tell if a path is a file or a directory?

tl;dr:
Given a filesystem path, how can I determine whether it refers to a file or a directory in PowerShell?
Firstly, I actually know pretty much nothing about Powershell so please excuse any ignorance I may display in this posting.
I am hoping to create a script which can trigger processes whenever changes occur in a monitored folder. I have found a Powershell script on the Internet which I think has given me a good head start. However, after adapting it a bit and testing it out I have discovered a problem with it.
The script basically creates and appends to a log file of changes to the monitored folder and sub-folders, and it seems to work well. However, if I make any changes inside a sub-folder then, as well as recording those changes in the log file, it also adds a line to the log file stating that the sub-folder itself has changed, as if it were a file that had changed.
I need the script to trigger a process whenever a file changes but it must not trigger an extra process in addition to the processes that are triggered by any changes inside a sub-folder. I hope you can understand what I am saying here.
I therefore need a way (which integrates well into the script that I have found) of distinguishing between when a file is changed and a folder is changed.
The script (which I have adapted a bit) is as follows:
### SET FOLDER TO WATCH + FILES TO WATCH + SUBFOLDERS YES/NO
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = "monitoredfolder"
$watcher.Filter = "*.*"
$watcher.IncludeSubdirectories = $true
$watcher.EnableRaisingEvents = $true
### DEFINE ACTIONS AFTER AN EVENT IS DETECTED
$action = {
    $path = $Event.SourceEventArgs.FullPath
    $Name = $Event.SourceEventArgs.Name
    $OldName = $Event.SourceEventArgs.OldName
    $OldFullPath = $Event.SourceEventArgs.OldFullPath
    $changeType = $Event.SourceEventArgs.ChangeType
    $logline = "$(Get-Date), $changeType, $OldFullPath, $path, $OldName, $Name"
    Add-Content "log.txt" -Value $logline
}
### DECIDE WHICH EVENTS SHOULD BE WATCHED
Register-ObjectEvent $watcher "Created" -Action $action
Register-ObjectEvent $watcher "Changed" -Action $action
Register-ObjectEvent $watcher "Deleted" -Action $action
Register-ObjectEvent $watcher "Renamed" -Action $action
while ($true) {sleep 5}
I am hoping to be able to add a line to the $action section which would create another variable which can be added as a column in the log file. Based purely on guesswork, I have tried the following and others but they did not work:
$pathtype = $Event.SourceEventArgs.FullPathType
$pathAttributes = $Event.SourceEventArgs.FullPathAttributes
$Attributes = $Event.SourceEventArgs.Attributes
$IsDirectory = $Event.SourceEventArgs.IsDirectory
To be honest, I did not really expect them to work but I thought I would give them a go.
Does anyone here know all of the things which can go after the second dot in "$Event.SourceEventArgs."?
Is there another approach I should be taking?
Thank you very much.
Test-Path -Type Leaf -LiteralPath $Event.SourceEventArgs.FullPath
tells you whether path string $Event.SourceEventArgs.FullPath refers to an (existing) file, as opposed to a directory (-Type Container).
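For example, one possible way to plug that check into the $action from your script (just a sketch; the logging line mirrors your existing one):
$action = {
    $path = $Event.SourceEventArgs.FullPath

    # only log/trigger for files; skip the events raised for the sub-folders themselves
    if (Test-Path -LiteralPath $path -PathType Leaf) {
        $changeType = $Event.SourceEventArgs.ChangeType
        $logline = "$(Get-Date), $changeType, $path"
        Add-Content "log.txt" -Value $logline
    }
}
Keep in mind that for Deleted events the path no longer exists by the time the handler runs, so Test-Path cannot tell you what kind of item it was.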
Another way to do this is with System.IO.File and the Exists static method.
[System.IO.File]::Exists($Event.SourceEventArgs.FullPath)
I've found it to be roughly 4 times faster than Test-Path when benchmarked, for not much more typing.
To answer your other question, "Does anyone here know all of the things which can go after the second dot in $Event.SourceEventArgs.?": you can easily find out what properties and methods are available on an object by sending it to Get-Member.
Changing your action to
$action = {
$Event.SourceEventArgs | Get-Member
}
and then retrieving the output from the created job will list the properties, methods, and type of object.
Id Name PSJobTypeName State HasMoreData Location Command
-- ---- ------------- ----- ----------- -------- -------
3 fe46cc0b-51e... NotStarted False ...
PS> Get-Job 3 | Receive-Job -Keep
TypeName: System.IO.FileSystemEventArgs
Name MemberType Definition
---- ---------- ----------
Equals Method bool Equals(System.Object obj)
GetHashCode Method int GetHashCode()
GetType Method type GetType()
ToString Method string ToString()
ChangeType Property System.IO.WatcherChangeTypes ChangeType {get;}
FullPath Property string FullPath {get;}
Name Property string Name {get;}

Copy Files and Folders After Antivirus Scan

I would like to run a PowerShell script on a Windows 2012 Server in the background at startup that does the following:
Watches a folder and all subfolders (not a scheduled task; needs to be "live")
Invokes a antivirus scan (if anyone is familiar with Symantec, I would like to utilize DoScan.exe)
Copy files to a destination folder if they are "clean"; quarantine those that are not
Log all activity
I have found bits and pieces of code and built a skeleton for this, but I am running into various issues. Firstly, the PowerShell script quits after running for less than 10 seconds (it runs continuously in the ISE, however). Secondly, -WindowStyle Hidden does not work as I hoped; I do not want the user to directly know the script is running. Thirdly (this may be more of a Symantec thing), my "invoke scan" routine seems to register as one big scan instead of a scan of each individual file.
Any guidance is appreciated. Please explain why this will or won't work, whether I should use a different language, and whether I should break this into more than one script.
$incomingfile = "E:\"
$filter = "*.*"
$watcher = New-Object IO.FileSystemWatcher $incomingfile, $filter -Property @{
    IncludeSubdirectories = $true
    NotifyFilter = [IO.NotifyFilters] 'FileName, LastWrite'
}
$onCreated = Register-ObjectEvent $watcher Created -SourceIdentifier FileCreated -Action {
    $path = $Event.SourceEventArgs.FullPath
    $name = $Event.SourceEventArgs.Name
    $changeType = $Even.SourceEventArgs.ChangeType
    $timeStamp = $Event.TimeGenerated
    foreach ($incomingfile in $watcher) {
        & "C:\Program Files (x86)\Symantec\Symantec Endpoint Protection\DoScan.exe" /ScanFile $path
    }
}
Several things:
There's a t missing in one of the statements of your watcher action:
$changeType = $Even.SourceEventArgs.ChangeType
# ^^
Shouldn't have an effect, though, since you're not using that variable anyway.
The watcher only remains active while its parent PowerShell process is running (see the keep-alive sketch after the next point).
Your foreach loop is wrong. The watcher fires for each created file, so remove the loop and put the command directly in the action scriptblock.
$onCreated = Register-ObjectEvent ... -Action {
    $path = $Event.SourceEventArgs.FullPath
    & "C:\Pro...can.exe" /ScanFile $path
}
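Regarding the watcher only remaining active while its parent PowerShell process runs: when the script is executed outside the ISE it reaches its last statement and exits, taking the watcher with it, which would explain it quitting after a few seconds. A minimal keep-alive sketch, following the same asynchronous pattern quoted from powershell.one further up this page:
# keep the parent PowerShell process alive so the event subscription keeps firing;
# Wait-Event stays responsive to incoming events while it waits
do {
    Wait-Event -Timeout 5
} while ($true)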
Everything you want the script to do, SEP can already do by itself. There's no need at all to reinvent the wheel with a FileSystemWatcher.

Powershell FileSystemWatcher network drive stops working. Need resetting?

I have a FileSystemWatcher program in PowerShell that is supposed to run on a server for as long as the server is on. The program is started when the server starts. The FSW is supposed to run a program each time a new file is added to the folder it is watching, which is on a network drive. But for some reason, after a while it stops executing the "action" program. After a restart, the program works fine, running the "action" each time a new file arrives. I have not been able to find a clear pattern; it seems to stop responding sometimes after a day, other times after just one "firing" of the action program.
I suspect this is because I am watching a folder on a network drive, which, according to other threads on Stack Overflow, is unreliable and might need resetting: FileSystemWatcher stops catching events
The provided link solves it in C#, so I wonder if a similar resetting could be written in PowerShell?
Here is the program I am trying to run. I am working on a try/catch for the FSW, but from what I have gathered so far, the problem most likely has to be solved by resetting the watcher if the network connection is interrupted.
$ErrorActionPreference = "Stop"
$action = "someprogram.ps1"
function log($string, $color, $logfile)
{
    if ($color -eq $null) { $color = 'white' }
    Write-Host $string -ForegroundColor $color
    $string | Out-File -FilePath $logfile -Append
}

$folder = "somefolder"
$filter = '*.csv'

$fsw = New-Object IO.FileSystemWatcher $folder, $filter -Property @{
    IncludeSubdirectories = $false
    NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite'
}

$onCreated = Register-ObjectEvent $fsw Created -SourceIdentifier FileCreated -Action {
    $global:FilePath = $Event.SourceEventArgs.FullPath
    $time = Get-Date
    log -String "File $FilePath detected at $time" -logfile $logfile
    & $action
}
I am thinking a solution like this would be preferable to running C# code:
while (Test-Path -Path $folder)
{
    Start-Sleep -Milliseconds 100 # to reduce CPU load
    ## FileSystemWatcher code here
}
## Somehow restart file watcher if connection is lost
However, I don't want the script to run again every second or so. So I am thinking I might have to have another script running in parallel that checks whether the folder path exists and, upon a disconnect, runs
Unregister-Event -SourceIdentifier FileCreated
and then restarts the script. Then again, what happens if the connection to the folder is broken for one millisecond while my script is sleeping? In that case, Test-Path will not notice anything, and the script will fail, as the file watcher will no longer be able to detect a new file.
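For what it's worth, here is a rough sketch of that re-register idea in PowerShell; it is untested against a real network outage, and Start-CsvWatcher is just a hypothetical helper wrapping the watcher setup from the script above:
function Start-CsvWatcher {
    # (re)creates the watcher and the FileCreated subscription from scratch
    $fsw = New-Object IO.FileSystemWatcher $folder, $filter -Property @{
        IncludeSubdirectories = $false
        NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite'
    }
    Register-ObjectEvent $fsw Created -SourceIdentifier FileCreated -Action {
        $global:FilePath = $Event.SourceEventArgs.FullPath
        & $action
    } | Out-Null
    $fsw.EnableRaisingEvents = $true
    $fsw
}

$fsw = Start-CsvWatcher
while ($true) {
    Start-Sleep -Seconds 30
    if (-not (Test-Path -Path $folder)) {
        # the share dropped: tear down the old registration, wait for the
        # share to come back, then start a fresh watcher
        Unregister-Event -SourceIdentifier FileCreated -ErrorAction SilentlyContinue
        $fsw.Dispose()
        while (-not (Test-Path -Path $folder)) { Start-Sleep -Seconds 5 }
        $fsw = Start-CsvWatcher
    }
}
A brief outage that falls entirely inside a sleep interval would indeed slip past the Test-Path check. A more robust variant would also subscribe to the watcher's Error event (Register-ObjectEvent $fsw Error -Action { ... }), which is raised when the watcher can no longer monitor the folder, and rebuild the watcher from that handler instead of polling.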