PowerShell - Monitoring folder content

I would like to monitor the changes in a folder for a short period of time during which a lot of files will be created and other changes will be made.
The code below works, but it doesn't catch all the changes.
$folder = 'C:\Data'
$timeout = 1000
$FileSystemWatcher = New-Object System.IO.FileSystemWatcher $folder
Write-Host "Press CTRL+C to abort monitoring $folder"
while ($true) {
    $result = $FileSystemWatcher.WaitForChanged('All', $timeout)
    if ($result.TimedOut -eq $false)
    {
        Write-Warning ('File {0} : {1}' -f $result.ChangeType, $result.Name)
    }
}
Write-Host 'Monitoring aborted.'
If I use this on C:\Data it works, but:
when I create a .txt file it reports "New Text Document.txt" twice,
and when I then type in a name for the new file it outputs that three times, or the other way around.
Please see below the output of:
creating a hello.txt in my folder,
creating a new folder with the name HiThere,
then renaming hello.txt to someTxt.txt,
then deleting them both.
Output:
Press CTRL+C to abort monitoring C:\Data
WARNING: File Created : New Text Document.txt
WARNING: File Changed : New Text Document.txt
WARNING: File Changed : New Text Document.txt
WARNING: File Changed : hello.txt
WARNING: File Created : New folder
WARNING: File Renamed : HiThere
WARNING: File Renamed : someTxt.txt
WARNING: File Changed : someTxt.txt
WARNING: File Changed : someTxt.txt
WARNING: File Changed : someTxt.txt
WARNING: File Deleted : someTxt.txt
WARNING: File Deleted : HiThere
More problems: if I use this on a network drive, not all of the changes are caught. (And that is the whole point of this script: to monitor a folder on a mapped drive.)
You can test the code on your machine by changing only the folder path.
I'm using PowerShell ISE 3.0.

Instead of the while($true) loop, have you tried Register-ObjectEvent?
I just tested one of my scripts using this method and it could easily handle 2000 empty files (generated in PowerShell). Unfortunately, this was on a local machine.
Instructions: define the function as normal and off you go.
The command you use is:
Start-BackupScript -WatchFolder "C:\temp\my watch folder\" -DestinationFolder "C:\temp\backup\"
The script now monitors "C:\temp\my watch folder\" for new files created in that specific folder and will move them to "C:\temp\backup\". It will also append the date and time to the file name.
Let's say you have started the script with the parameters above. You now place "hello_world.txt" in the watch folder. The script will move the file to "C:\temp\backup\" with the new filename being: "hello_world_2016-02-10_10-00-00.txt"
The script runs in the background. If you want to know how it's doing, then use the command:
Receive-Job $backupscript -Keep
There you can see what it has been doing and when. Please note that the -Keep parameter preserves the output in the "log", so you can check it again later.
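When you are done watching, a minimal teardown sketch (assuming $backupscript holds the PSEventJob returned by Register-ObjectEvent in the script below; the job's Name doubles as its event source identifier):
# Stop watching: unregister the event subscription and remove the job
Unregister-Event -SourceIdentifier $backupscript.Name
Remove-Job $backupscript -Force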
Script:
function Start-BackupScript
{
[CmdletBinding()]
Param
(
[Parameter()]
[String]$WatchFolder,
[Parameter()]
[String]$DestinationFolder
)
Process
{
$filter = '*.*'
$fsw = New-Object IO.FileSystemWatcher $WatchFolder, $filter -Property @{IncludeSubdirectories = $false; NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite'}
$action = {
$fileMissing = $false
$FileInUseMessage = $false
$copied = $false
$file = Get-Item $Args.FullPath
$dateString = Get-Date -format "_yyyy-MM-dd_HH-mm-ss"
$DestinationFolder = $event.MessageData
$DestinationFileName = $file.basename + $dateString + $file.extension
$resultfilename = Join-Path $DestinationFolder $DestinationFileName
Write-Output ""
while(!$copied) {
try {
Move-Item -Path $file.FullName -Destination $resultfilename -ErrorAction Stop
$copied = $true
}
catch [System.IO.IOException] {
if(!$FileInUseMessage) {
Write-Output "$(Get-Date -Format "yyyy-MM-dd # HH:mm:ss") - $file in use. Waiting to move file"
$FileInUseMessage = $true
}
Start-Sleep -s 1
}
catch [System.Management.Automation.ItemNotFoundException] {
$fileMissing = $true
$copied = $true
}
}
if($fileMissing) {
Write-Output "$(Get-Date -Format "yyyy-MM-dd # HH:mm:ss") - $file not found!"
} else {
Write-Output "$(Get-Date -Format "yyyy-MM-dd # HH:mm:ss") - Moved $file to backup! `n`tFilename: `"$resultfilename`""
}
}
$backupscript = Register-ObjectEvent -InputObject $fsw -EventName "Created" -Action $action -MessageData $DestinationFolder
Write-Host "Started. WatchFolder: `"$($WatchFolder)`" DestinationFolder: `"$($DestinationFolder)`". Job is in: `$backupscript"
}
}

Have you looked at Register-WMIEvent?
Something like this:
Register-WmiEvent -Query "SELECT * FROM __InstanceModificationEvent WITHIN 5 WHERE TargetInstance ISA 'CIM_DataFile' and TargetInstance.Path = '\\Users\\Administrator\\' and targetInstance.Drive = 'C:' and (targetInstance.Extension = 'txt' or targetInstance.Extension = 'doc' or targetInstance.Extension = 'rtf') and targetInstance.LastAccessed > '$($cur)' " -sourceIdentifier "Accessor" -Action $action
You can monitor a folder and then specific extensions within the folder. And then you set up a PowerShell scriptblock to handle any accesses. There's more here on this if you're interested: https://blog.varonis.com/practical-powershell-for-it-security-part-i-file-event-monitoring/
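For illustration, a minimal sketch of what that $action scriptblock might look like (the handler body is an assumption; in a WMI event handler the matched file arrives via $Event.SourceEventArgs.NewEvent.TargetInstance):
$action = {
    # TargetInstance is the CIM_DataFile instance the query matched
    $file = $Event.SourceEventArgs.NewEvent.TargetInstance
    Write-Host "$($file.Name) was accessed at $(Get-Date)"
}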
Agree with some of the comments above that file monitoring is not reliable -- lags and hiccups. Anyway, hope this helps.


Running a shell script through an automated Powershell folder listener

Over the last couple of days, I've been putting together a PowerShell script to watch a particular folder on my C: drive for new files, with the help of mcpcmag's article. The new files will be created when an email is requested by the end user. The new file is a .sh file containing a cURL command which sends a POST request to the SendGrid API.
Here's the PowerShell script below:
# Create a listener for our emailOut directory
$watcher = New-Object System.IO.FileSystemWatcher
# Set the path and turn on the listener
$watcher.Path = 'C:\emailOut'
$watcher.EnableRaisingEvents = $true
$action =
{
# set the shellScript to the full path of the added file
$shellScript = $event.SourceEventArgs.fullPath
# get the name of the file
$name = (Get-Item $shellScript).Name
# set the path which the file will be moved to on completion
$oldFolderDir = "C:\emailOut\old\" + $name
# run the script
& $shellScript
# set the log file path
$logFile = "C:\emailOut\emailLog_" + $(get-date -format FileDate) + ".txt"
# set the log message
$logInfo = $name + " ---> " + $(get-date)
# add entry to log file
$logInfo | Out-File $logFile -Append -encoding utf8
# Move the script file into the historic folder
Move-Item -Path $shellScript -Destination $oldFolderDir
}
# Start the watcher
Register-ObjectEvent $watcher 'Created' -Action $action
The comments make the process fairly self-explanatory, but to briefly describe, here is what happens (or should happen):
File goes into folder that is being watched
Shell script is run
Entry is added to the logFile
File is moved to the old folder
Done
Unfortunately, there seems to be a problem with the way I am running the shell script. When the process runs, it is obvious that the script executes, as a black box appears for a fraction of a second, but no email is sent. I have checked the original script and there are no issues, because when I double-click the .sh file the email is sent perfectly.
Any ideas?
Thanks in advance for any help or advice.
I think you have a logical problem here:
File goes into folder that is being watched
[PowerShell]Shell script is run
The problem is that after you call Register-ObjectEvent your script ends, the PowerShell process terminates, and this removes the FileSystemWatcher implicitly.
You should keep the script running for the whole time you want to 'catch' events.
You can run it through a scheduled task, with the option 'Do not start a new instance if there is an instance running', set to repeat every N minutes (to restore the script if something fails, someone kills the process, etc.). This PowerShell instance must stay running to keep receiving events.
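On Windows 8 / Server 2012 or later, one way to create such a task is the ScheduledTasks module; a hedged sketch (the task name and script path are assumptions):
$taskAction = New-ScheduledTaskAction -Execute 'powershell.exe' -Argument '-File C:\scripts\watch.ps1'
$taskTrigger = New-ScheduledTaskTrigger -Once -At (Get-Date) -RepetitionInterval (New-TimeSpan -Minutes 5)
Register-ScheduledTask -TaskName 'FolderWatcher' -Action $taskAction -Trigger $taskTrigger
The watcher script itself then looks like this: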
$eventSI = 'CustomSI_FSWCreated'
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = 'S:\SCRIPTS\FileTest'
$watcher.EnableRaisingEvents = $true
$action = { [System.Console]::WriteLine($event.SourceEventArgs.fullPath) }
# Start the watcher
Register-ObjectEvent -InputObject $watcher -EventName 'Created' -Action $action -SourceIdentifier $eventSI | Out-Null
try
{
$l = 1;
while($true)
{
# This timeout only sets the frequency of the dots on your screen. Increase it if you want, or remove the Write-Host calls
Wait-Event -Timeout 1
Write-Host "." -NoNewline
$l++;
if ($l -ge 40)
{
Write-Host ""
$l = 1;
}
}
}
finally
{
# This executes when you press Ctrl+C
Unregister-Event -SourceIdentifier $eventSI
$watcher.EnableRaisingEvents = $false
$watcher.Dispose()
Write-Host "Unregistered FSWE"
}
Just wanted to share my final working solution.
I did try Start-Sleep -seconds 5 prior to running the script but this unfortunately did not have the expected results.
After a bit of trial and error I arrived at a working solution:
# Create a listener for our emailOut directory
$watcher = New-Object System.IO.FileSystemWatcher
# Set the path and turn on the listener
$watcher.Path = 'C:\emailOut'
$watcher.EnableRaisingEvents = $true
$action =
{
# set the shellScript to the full path of the added file
$shellScript = $event.SourceEventArgs.fullPath
# get the name of the file
$name = (Get-Item $shellScript).Name
#set the path which the file will be moved to on completion
$oldFolderDir = "C:\emailOut\old\" + $name
#### SOLUTION START ###
#go to where bash.exe is stored
cd "C:\Program Files\Git\bin"
# run the script
& .\bash.exe $shellScript
### SOLUTION END ###
# set the log file path
$logFile = "C:\emailOut\emailLog_" + $(get-date -format FileDate) + ".txt"
# set the log message
$logInfo = $name + " ---> " + $(get-date)
# add entry to log file
$logInfo | Out-File $logFile -Append -encoding utf8
# Move the script file into the historic folder
Move-Item -Path $shellScript -Destination $oldFolderDir
}
# Start the watcher
Register-ObjectEvent $watcher 'Created' -Action $action
Basically, I ran my script by passing my cURL script to Git's bash.exe.
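Note that the cd into the Git bin folder can likely be avoided by invoking bash.exe via its full path; a minimal sketch, assuming the same install location:
# Call bash.exe by full path instead of changing directory first
& "C:\Program Files\Git\bin\bash.exe" $shellScript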
If you have any further queries or questions, feel free to contact me.
Patrick

Copy files to FTP and archive with today's date

I need to create a script that does the following:
Copies all files in a folder to an FTP site.
If the copy was successful move the files to an archive.
The archive should be a freshly created folder with today's date (so we know when they were transmitted).
I've tried to cannibalise other scripts to get something to work, but I'm not getting anywhere, so I need some help; I've been working on this for hours.
I'm using the WinSCP DLL only because my other (working) script uses SFTP, which needs it. I know normal FTP doesn't, but I couldn't find any easily transferable code, so I'm trying to modify that instead.
So here's the code I have, which doesn't even run, never mind run properly. Can someone help me get it right? Sorry, it's a bit of a mess.
param (
$localPath = "c:\test\source",
$remotePath = "/upload",
$folder = ($_.CreationTime | Get-Date -Format yyyyMMdd)
# not sure this works but don't see how to point the destination
# to the newly created folder
$backupPath = "c:\test\destination\$folder"
)
try
{
# Load WinSCP .NET assembly
Add-Type -Path "C:\Windows\System32\WindowsPowerShell\v1.0\WinSCPnet.dll"
# Setup session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
Protocol = [WinSCP.Protocol]::ftp
HostName = "xxxxxxxx"
UserName = "xxxxxxxx"
Password = "xxxxxxxx"
}
$session = New-Object WinSCP.Session
try
{
# Connect
$session.Open($sessionOptions)
# Upload files, collect results
$transferResult = $session.PutFiles($localPath, $remotePath)
# Iterate over every transfer
foreach ($transfer in $transferResult.Transfers)
{
# Success or error?
if ($transfer.Error -eq $Null)
{
# If today's folder doesn't exist, create it
if (!(Test-Path $BackupPath))
{
New-Item -ItemType Directory -Force -Path $BackupPath
}
Write-Host ("Upload of {0} succeeded, moving to Uploaded folder" -f
$transfer.FileName)
# Upload succeeded, move source file to backup
Move-Item $transfer.FileName $backupPath
}
else
{
Write-Host ("Upload of {0} failed: {1}" -f
$transfer.FileName, $transfer.Error.Message)
}
}
}
finally
{
# Disconnect, clean up
$session.Dispose()
}
exit 0
}
catch [Exception]
{
Write-Host ("Error: {0}" -f $_.Exception.Message)
exit 1
}
So there's the code. I'm happy to use built in PowerShell for the FTP side to simplify it, I just want it to work.
I'm not sure what your concern with the code is. It looks pretty much OK, except for a syntax error when setting $folder:
Why are you even trying to use $_.CreationTime as the folder timestamp? Just use the current date:
$folder = (Get-Date -Format "yyyyMMdd")
See Formatting timestamps in PowerShell in WinSCP documentation.
Also, I do not see the point of setting $folder and $backupPath in the param block. Move them after the param block. If you want this anyway, you are missing a comma after the $folder assignment.
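A sketch of that correction applied to the original snippet:
param (
    $localPath = "c:\test\source",
    $remotePath = "/upload"
)
# Derived values belong after the param block
$folder = (Get-Date -Format "yyyyMMdd")
$backupPath = "c:\test\destination\$folder"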
Other than that, your code should work.
You cannot simplify it by using the built-in PowerShell (or rather .NET) FTP functionality, as it does not have commands as powerful as the WinSCP .NET assembly's.
I'd write the code as:
$localPath = "C:\source\local\path\*"
$remotePath = "/dest/remote/path/"
$folder = (Get-Date -Format "yyyyMMdd")
$backupPath = "C:\local\backup\path\$folder"
# If today's folder doesn't exist, create it
if (!(Test-Path $BackupPath))
{
New-Item -ItemType Directory -Force -Path $BackupPath | Out-Null
}
try
{
# Load WinSCP .NET assembly
Add-Type -Path "WinSCPnet.dll"
# Setup session options
$sessionOptions = New-Object WinSCP.SessionOptions -Property @{
Protocol = [WinSCP.Protocol]::ftp
HostName = "ftp.example.com"
UserName = "username"
Password = "password"
}
$session = New-Object WinSCP.Session
try
{
# Connect
$session.Open($sessionOptions)
# Upload files, collect results
$transferResult = $session.PutFiles($localPath, $remotePath)
# Iterate over every transfer
foreach ($transfer in $transferResult.Transfers)
{
# Success or error?
if ($transfer.Error -eq $Null)
{
Write-Host ("Upload of $($transfer.FileName) succeeded, " +
"moving to backup")
# Upload succeeded, move source file to backup
Move-Item $transfer.FileName $backupPath
}
else
{
Write-Host ("Upload of $($transfer.FileName) failed: " +
"$($transfer.Error.Message)")
}
}
}
finally
{
# Disconnect, clean up
$session.Dispose()
}
exit 0
}
catch [Exception]
{
Write-Host "Error: $($_.Exception.Message)"
exit 1
}
Based on Moving local files to different location after successful upload.

Green frames / no subtitles in converted files of watching directory (Powershell) using HandbrakeCLI + JSON preset

So I needed a script that watches a directory and converts files using HandBrakeCLI. I found part of this PowerShell script here on Stack Overflow and adjusted some things for my project.
$global:watch = "C:\~\cmp\" ### watching directory
$global:convert = "C:\~\convert\" ### handbrakecli and preset location
$global:outp = "C:\~\output_folder\" ### output location
$global:extIn = ".mkv"
$global:extOut = ".mp4"
Write-Host "Watching directory = $watch"
Write-Host "HandbrakeCLI / json preset location = $convert"
Write-Host "Output directory = $outp"
Write-Host "Input extension = $extIn ; Output extension = $extOut"
Write-Host "Waiting for change in directory..."
### SET FOLDER TO WATCH + FILES TO WATCH + SUBFOLDERS YES/NO
$watcher = New-Object System.IO.FileSystemWatcher;
$watcher.Path = $watch;
$watcher.Filter = "*"+$extIn;
$watcher.IncludeSubdirectories = $false;
$watcher.EnableRaisingEvents = $false;
### DEFINE ACTIONS AFTER AN EVENT IS DETECTED
$action =
{
$path = $Event.SourceEventArgs.FullPath;
$handbrakecli = $convert+"HandBrakeCLI.exe";
$fl = Split-Path $path -leaf;
Write-Host "New file found: $fl";
$flName, $flExt = $fl.split('.')
$mp4File = $watch+"\"+$flName+$extOut
$changeType = $Event.SourceEventArgs.ChangeType
$logline = "$(Get-Date), $changeType, $path"
Add-content -path ($convert+"log.txt") -value $logline
Write-Host "Added entry to log"
Write-Host "Start converting using HandbrakeCLI..."
& cmd.exe /c $handbrakecli -i $path -o $mp4File --preset-import-file my_preset.json
Write-Host "Done converting!"
Write-Host "Moving file to folder..."
& cmd.exe /c move /Y $mp4File $outp
Write-Host "File moved!"
& cmd.exe /c del $path /F
Write-Host "$fl has been removed from local folder"
}
### DECIDE WHICH EVENTS SHOULD BE WATCHED
Register-ObjectEvent $watcher "Created" -Action $action
Register-ObjectEvent $watcher "Changed" -Action $action
Register-ObjectEvent $watcher "Renamed" -Action $action
while ($true) {sleep 5}
While at first everything seemed to work, I started to notice that "sometimes" the subtitles were not added, or green frames were inserted (or replaced the original frame) after every frame (normal - green - normal - green - etc.).
An example: I added 2 mkv files to the directory, the 1st one got converted just fine with subtitles while the 2nd file didn't have any subtitles.
I'm an amateur when it comes to this stuff, but I think it has something to do with the & cmd.exe /c. I also found that you could use Start-Process in PowerShell, but I don't know how to use it.
So if someone could help me convert this & cmd.exe /c $handbrakecli -i $path -o $mp4File --preset-import-file my_preset.json to something with Start-Process ..., maybe it will help me out.
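For what it's worth, a hedged sketch of that same call rewritten with Start-Process (the argument quoting is an assumption; adjust to your paths):
# Run HandBrakeCLI directly and wait for it to finish
Start-Process -FilePath $handbrakecli -Wait -NoNewWindow `
    -ArgumentList "-i `"$path`" -o `"$mp4File`" --preset-import-file my_preset.json"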
EDIT
So I made the changes that Tomalak suggested (simpler this way), but Move-Item and Remove-Item don't seem to work.
EDIT 2
Added -LiteralPath as an argument for Move-Item / Remove-Item (needed for filenames containing square brackets)
$inputFolder = "C:\~\cmp\"
$outputFolder = "C:\~\output_folder\"
$handbrake = "C:\~\convert\HandBrakeCLI.exe"
$presetJson ="C:\~\convert\my_preset.json"
$extIn = "mkv"
$extOut = "mp4"
while ($true) {
Get-ChildItem -Path $inputFolder -Filter "*.$extIn" | ForEach-Object {
$inFile = $_.FullName
$outFile = $inputFolder + $_.FullName.split('\.')[-2] + ".$extOut" #changed this because I wanted the file in the same directory as input file
Write-Host "Converting: $inFile"
& $handbrake -i $inFile -o $outFile --preset-import-file $presetJson
if ($LASTEXITCODE -eq 0) {
Move-Item -LiteralPath $outFile $outputFolder -Force #move to output folder
Write-Host "Done: $outFile"
Remove-Item -LiteralPath $inFile -Force #removing the input item, not output
Write-Host "Removed input file!"
} else {
Write-Error "Conversion failed!"
}
}
sleep 5
}
While subtitles are added to all output files, I still get green flickering sometimes. I used 3 files as a test run; result: 1st flickering, 2nd OK, 3rd flickering. I have no clue why some are fine and some get the flickering. So I'm considering using ffmpeg instead.
EDIT 3
For future visitors: use ffmpeg instead of HandbrakeCLI:
ffmpeg.exe -i "C:\~\inputfile.mkv" -filter_complex "subtitles='C\:/Users/~/inputfile.mkv'" -c:v libx264 -preset veryfast -b:v 2750k -c:a aac $outputfile.mp4
Instead of using file system notifications, structure your script around a simple endless loop:
$inputFolder = "C:\~\cmp"
$outputFolder = "C:\~\convert"
$handbrake = "C:\~\convert\HandBrakeCLI.exe"
$presetJson = "C:\~\convert\my_preset.json"
$extIn = "mkv"
$extOut = "mp4"
while ($true) {
Get-ChildItem -Path $inputFolder -Filter "*.$extIn" | ForEach-Object {
$inFile = $_.FullName
$outFile = "$($_.BaseName).$extOut"
if (Test-Path -LiteralPath $outFile) { Remove-Item -LiteralPath $outFile -Force }
Write-Host "Converting: $inFile"
& $handbrake -i $inFile -o $outFile --preset-import-file $presetJson
if ($LASTEXITCODE -eq 0) {
Move-Item -LiteralPath $outFile -Destination $outputFolder -Force
Write-Host "Done: $outFile"
} else {
Write-Error "Conversion not successful."
}
}
sleep 5
}
The & call operator makes PowerShell execute whatever program the $handbrake variable points to.
As an exercise you can convert the top-level variables to script parameters, so that you can re-use the script for other batch jobs.
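A minimal sketch of that exercise (the parameter names are assumptions):
param(
    [string]$InputFolder = "C:\~\cmp",
    [string]$OutputFolder = "C:\~\convert",
    [string]$Handbrake = "C:\~\convert\HandBrakeCLI.exe",
    [string]$PresetJson = "C:\~\convert\my_preset.json",
    [string]$ExtIn = "mkv",
    [string]$ExtOut = "mp4"
)
# The loop body stays the same, reading these parameters instead of the hard-coded values.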

Powershell - Listen for file, do something if file exists [duplicate]

Is there any simple way (i.e., a script) to watch a file in PowerShell and run commands if the file changes? I have been googling but can't find a simple solution. Basically, I run a script in PowerShell, and if the file changes, PowerShell should run other commands.
EDIT
OK, I think I made a mistake. I don't need a script, I need a function that I can include in my $PROFILE.ps1 file. But still, I tried hard and I'm still unable to write it, so I will give a bounty. It has to look like this:
function watch($command, $file) {
if($file #changed) {
#run $command
}
}
There is an NPM module that does what I want, watch, but it only watches folders, not files, and it's not PowerShell xD.
Here is an example I have found in my snippets. Hopefully it is a little bit more comprehensive.
First you need to create a file system watcher, and subsequently you subscribe to an event that the watcher generates. This example listens for "Created" events, but could easily be modified to watch for "Changed".
$folder = "C:\Users\LOCAL_~1\AppData\Local\Temp\3"
$filter = "*.LOG"
$Watcher = New-Object IO.FileSystemWatcher $folder, $filter -Property @{
IncludeSubdirectories = $false
NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite'
}
$onCreated = Register-ObjectEvent $Watcher -EventName Created -SourceIdentifier FileCreated -Action {
$path = $Event.SourceEventArgs.FullPath
$name = $Event.SourceEventArgs.Name
$changeType = $Event.SourceEventArgs.ChangeType
$timeStamp = $Event.TimeGenerated
Write-Host "The file '$name' was $changeType at $timeStamp"
Write-Host $path
#Move-Item $path -Destination $destination -Force -Verbose
}
I will try to narrow this down to your requirements.
If you run this as part of your "profile.ps1" script you should read The Power of Profiles which explains the different profile scripts available and more.
Also, you should understand that waiting for a change in a folder can't be run as a function in the script. The profile script has to finish before your PowerShell session starts. You can, however, use a function to register an event.
What this does, is register a piece of code, to be executed every time an event is triggered. This code will be executed in the context of your current PowerShell host (or shell) while the session remains open. It can interact with the host session, but has no knowledge of the original script that registered the code. The original script has probably finished already, by the time your code is triggered.
Here is the code:
Function Register-Watcher {
param ($folder)
$filter = "*.*" #all files
$watcher = New-Object IO.FileSystemWatcher $folder, $filter -Property @{
IncludeSubdirectories = $false
EnableRaisingEvents = $true
}
$changeAction = [scriptblock]::Create('
# This is the code which will be executed every time a file change is detected
$path = $Event.SourceEventArgs.FullPath
$name = $Event.SourceEventArgs.Name
$changeType = $Event.SourceEventArgs.ChangeType
$timeStamp = $Event.TimeGenerated
Write-Host "The file $name was $changeType at $timeStamp"
')
Register-ObjectEvent $Watcher -EventName "Changed" -Action $changeAction
}
Register-Watcher "c:\temp"
After running this code, change any file in the "C:\temp" directory (or any other directory you specify). You will see an event triggering execution of your code.
Also, valid FileSystemWatcher events you can register are "Changed", "Created", "Deleted" and "Renamed".
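If you want the same handler for several of those events, you can register it in a loop; a minimal sketch reusing $watcher and $changeAction from above:
'Changed', 'Created', 'Deleted', 'Renamed' | ForEach-Object {
    Register-ObjectEvent $watcher -EventName $_ -Action $changeAction
}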
I will add another answer, because my previous one missed the requirements.
Requirements
Write a function to WAIT for a change in a specific file
When a change is detected the function will execute a predefined command and return execution to the main script
File path and command are passed to the function as parameters
There is already an answer using file hashes. I want to follow up my previous answer and show you how this can be accomplished using FileSystemWatcher.
$File = "C:\temp\log.txt"
$Action = 'Write-Output "The watched file was changed"'
$global:FileChanged = $false
function Wait-FileChange {
param(
[string]$File,
[string]$Action
)
$FilePath = Split-Path $File -Parent
$FileName = Split-Path $File -Leaf
$ScriptBlock = [scriptblock]::Create($Action)
$Watcher = New-Object IO.FileSystemWatcher $FilePath, $FileName -Property @{
IncludeSubdirectories = $false
EnableRaisingEvents = $true
}
$onChange = Register-ObjectEvent $Watcher Changed -Action {$global:FileChanged = $true}
while ($global:FileChanged -eq $false){
Start-Sleep -Milliseconds 100
}
& $ScriptBlock
Unregister-Event -SubscriptionId $onChange.Id
}
Wait-FileChange -File $File -Action $Action
Here is the solution I ended up with based on several of the previous answers here. I specifically wanted:
My code to be code, not a string
My code to be run on the I/O thread so I can see the console output
My code to be called every time there was a change, not once
Side note: I've left in the details of what I wanted to run due to the irony of using a global variable to communicate between threads so I can compile Erlang code.
Function RunMyStuff {
# this is the bit we want to happen when the file changes
Clear-Host # remove previous console output
& 'C:\Program Files\erl7.3\bin\erlc.exe' 'program.erl' # compile some erlang
erl -noshell -s program start -s init stop # run the compiled erlang program:start()
}
Function Watch {
$global:FileChanged = $false # dirty... any better suggestions?
$folder = "M:\dev\Erlang"
$filter = "*.erl"
$watcher = New-Object IO.FileSystemWatcher $folder, $filter -Property @{
IncludeSubdirectories = $false
EnableRaisingEvents = $true
}
Register-ObjectEvent $Watcher "Changed" -Action {$global:FileChanged = $true} > $null
while ($true){
while ($global:FileChanged -eq $false){
# We need this to block the IO thread until there is something to run
# so the script doesn't finish. If we call the action directly from
# the event it won't be able to write to the console
Start-Sleep -Milliseconds 100
}
# a file has changed, run our stuff on the I/O thread so we can see the output
RunMyStuff
# reset and go again
$global:FileChanged = $false
}
}
RunMyStuff # run the action at the start so I can see the current output
Watch
You could pass in folder/filter/action into watch if you want something more generic. Hopefully this is a helpful starting point for someone else.
Calculate the hash of a list of files
Store it in a dictionary
Check each hash on an interval
Perform action when hash is different
function watch($f, $command, $interval) {
$sha1 = New-Object System.Security.Cryptography.SHA1CryptoServiceProvider
$hashfunction = '[System.BitConverter]::ToString($sha1.ComputeHash([System.IO.File]::ReadAllBytes($file)))'
$files = @{}
foreach ($file in $f) {
$hash = iex $hashfunction
$files[$file.Name] = $hash
echo "$hash`t$($file.FullName)"
}
while ($true) {
sleep $interval
foreach ($file in $f) {
$hash = iex $hashfunction
if ($files[$file.Name] -ne $hash) {
iex $command
}
}
}
}
Example usage:
$c = 'send-mailmessage -to "admin@whatever.com" -from "watch@whatever.com" -subject "$($file.Name) has been altered!"'
$f = ls C:\MyFolder\aFile.jpg
watch $f $c 60
You can use the System.IO.FileSystemWatcher to monitor a file.
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = $searchPath
$watcher.IncludeSubdirectories = $true
$watcher.EnableRaisingEvents = $true
See also this article
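Note that on its own this snippet only configures the watcher; you still need to subscribe to an event for anything to happen. A minimal sketch:
Register-ObjectEvent $watcher -EventName Changed -Action {
    Write-Host "$($Event.SourceEventArgs.FullPath) was changed"
}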
Here is another option.
I just needed to write my own to watch and run tests within a Docker container. Jan's solution is much more elegant, but FileSystemWatcher is broken within Docker containers presently. My approach is similar to Vasili's, but much lazier, trusting the file system's write time.
Here's the function I needed, which runs the command block each time the file changes.
function watch($command, $file) {
$this_time = (get-item $file).LastWriteTime
$last_time = $this_time
while($true) {
if ($last_time -ne $this_time) {
$last_time = $this_time
invoke-command $command
}
sleep 1
$this_time = (get-item $file).LastWriteTime
}
}
Here is one that waits until the file changes, runs the block, then exits.
function waitfor($command, $file) {
$this_time = (get-item $file).LastWriteTime
$last_time = $this_time
while($last_time -eq $this_time) {
sleep 1
$this_time = (get-item $file).LastWriteTime
}
invoke-command $command
}
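Example usage (the path and action are hypothetical):
# Block until the file changes, then run the scriptblock once
waitfor { Write-Host "log.txt was modified" } "C:\temp\log.txt"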
I had a similar problem. I first wanted to use Windows events and register for them, but this would be less fault-tolerant than the solution below.
My solution was a polling script (intervals of 3 seconds). The script has a minimal footprint on the system and notices changes very quickly. During the loop my script can do more things (actually I check 3 different folders).
My polling script is started through the Task Scheduler. The schedule is: start every 5 minutes, with the option not to start a new instance if one is already running. This way it will restart after a reboot or after a crash.
Polling every 3 seconds directly from the Task Scheduler would be too frequent for it.
When you add a task to the scheduler, make sure you do not use network drives (that would call for extra settings) and give your user batch privileges.
I give my script a clean start by shutting it down a few minutes before midnight; the Task Scheduler starts it again every morning (the init function of my script exits within a minute of midnight).
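The shape of such a polling loop might look like this (a sketch; the folder list, the change check, and the midnight exit are assumptions based on the description above):
while ($true) {
    foreach ($folder in 'D:\in1', 'D:\in2', 'D:\in3') {
        # compare a snapshot of the folder against the previous one and react to differences
    }
    Start-Sleep -Seconds 3
    # exit shortly before midnight so the scheduled task gives the script a clean daily start
    if ((Get-Date).ToString('HH:mm') -eq '23:58') { exit }
}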
I was looking for something I could run as a one-liner from a terminal. This is what I arrived at:
while ($True) { if ((Get-Item .\readme.md).LastWriteTime -ne $LastWriteTime) { "Hello!"; $LastWriteTime = (Get-Item .\readme.md).LastWriteTime }; Sleep 1 }
Another simple version:
$date = get-date
while ( (dir file.txt -ea 0 | % lastwritetime) -lt $date -and $count++ -lt 10) {
sleep 1
}
'file changed or timeout'

Getting error output from a powershell 2.0 script running as a task

TL;DR: the actual question is at the bottom.
I'm trying to troubleshoot a Powershell v1.0 script issue. The script basically downloads a file from an FTP site and puts it on a remote server via UNC and emails the success or failure of the task.
The script runs as a task with a generic ID that is a Domain Admin but is not used to log into systems so the server it runs off of does not contain a profile for it.
If I do a runas for that user and execute the script via the command line, it works flawlessly. However, if I try to run it as a task, it runs then exits instantly. If I open a runas command prompt and run the scheduled task via the command line, all I get back is:
SUCCESS: Attempted to run the scheduled task "Task Name".
I've tried writing variable values to a text file to see what is going on but it never writes even when I write them as the very first step of execution.
What I want to do is capture any script error messages you would normally see when trying to run the script and/or write the variable information to a text file.
Is there any way to do this? BTW, I'm doing this by calling PowerShell with the following arguments:
-file -ExecutionPolicy Bypass "d:\datscript\myscript.ps1"
-I've tried -command instead of -file.
-I've tried "d:\datscript\myscript.ps1 5>&1 test.txt"
-I've tried "d:\datscript\myscript.ps1 9>&1 test.txt"
-I've tried "d:\datscript\myscript.ps1 | out-file d:\datscript\test.txt"
Nothing worked. I'm sure I can fix whatever bug I have but I'm banging my head against the wall trying to get some kind of failure info.
--Update: Here is a copy of the script minus details--
#-------------------------------------------------------------------------------------------------------------------------------------------------------------
#
#Variable Declaration
#
#$path = Path on local server to download DAT to
#$olddat = Old/last DAT downloaded
#$currentdat = Next DAT number
#$ftpsite = McAfee FTP site. Update if path changes
#$ftpuser = FTP user (anon login)
#$ftppass = FTP password (anon login)
#$tempstring = Manipulation variable
#$gotdat = Boolean if updated DAT exists
#$success = Status if a new DAT exists and has been downloaded (used for email notification).
#$thetime = Variable used to hold time-of-day manipulation.
$path = "\\myservername\ftproot\pub\mcafee\datfiles\"
$olddat = ""
$currentdat =""
$ftpsite = "ftp://ftp.nai.com/virusdefs/4.x/"
$ftpuser = "something"
$ftppass = "anything"
$tempstring =""
$gotdat = "False"
$success = ""
$thetime = ""
#
#Normalized functions handles UNC paths
#
function Get-NormalizedFileSystemPath
{
<#
.Synopsis
Normalizes file system paths.
.DESCRIPTION
Normalizes file system paths. This is similar to what the Resolve-Path cmdlet does, except Get-NormalizedFileSystemPath also properly handles UNC paths and converts 8.3 short names to long paths.
.PARAMETER Path
The path or paths to be normalized.
.PARAMETER IncludeProviderPrefix
If this switch is passed, normalized paths will be prefixed with 'FileSystem::'. This allows them to be reliably passed to cmdlets such as Get-Content, Get-Item, etc, regardless of Powershell's current location.
.EXAMPLE
Get-NormalizedFileSystemPath -Path '\\server\share\.\SomeFolder\..\SomeOtherFolder\File.txt'
Returns '\\server\share\SomeOtherFolder\File.txt'
.EXAMPLE
'\\server\c$\.\SomeFolder\..\PROGRA~1' | Get-NormalizedFileSystemPath -IncludeProviderPrefix
Assuming you can access the c$ share on \\server, and PROGRA~1 is the short name for "Program Files" (which is common), returns:
'FileSystem::\\server\c$\Program Files'
.INPUTS
String
.OUTPUTS
String
.NOTES
Paths passed to this command cannot contain wildcards; these will be treated as invalid characters by the .NET Framework classes which do the work of validating and normalizing the path.
.LINK
Resolve-Path
#>
[CmdletBinding()]
param (
[Parameter(Mandatory = $true, ValueFromPipeline = $true, ValueFromPipelineByPropertyName = $true)]
[Alias('PSPath', 'FullName')]
[string[]]
$Path,
[switch]
$IncludeProviderPrefix
)
process
{
foreach ($_path in $Path)
{
$_resolved = $_path
if ($_resolved -match '^([^:]+)::')
{
$providerName = $matches[1]
if ($providerName -ne 'FileSystem')
{
Write-Error "Only FileSystem paths may be passed to Get-NormalizedFileSystemPath. Value '$_path' is for provider '$providerName'."
continue
}
$_resolved = $_resolved.Substring($matches[0].Length)
}
if (-not [System.IO.Path]::IsPathRooted($_resolved))
{
$_resolved = Join-Path -Path $PSCmdlet.SessionState.Path.CurrentFileSystemLocation -ChildPath $_resolved
}
try
{
$dirInfo = New-Object System.IO.DirectoryInfo($_resolved)
}
catch
{
$exception = $_.Exception
while ($null -ne $exception.InnerException)
{
$exception = $exception.InnerException
}
Write-Error "Value '$_path' could not be parsed as a FileSystem path: $($exception.Message)"
continue
}
$_resolved = $dirInfo.FullName
if ($IncludeProviderPrefix)
{
$_resolved = "FileSystem::$_resolved"
}
Write-Output $_resolved
}
} # process
} # function Get-NormalizedFileSystemPath
#
#Get the number of the existing DAT file and increment for the next DAT if the DAT's age is older than today.
# Otherwise, exit the program if DATs age is today.
#
$tempstring = "xdat.exe"
$env:Path = $env:Path + ";d:\datscript"
$path2 ="d:\datscript\debug.txt"
add-content $path2 $path
add-content $path2 $olddat
add-content $path2 $currentdat
add-content $path2 $success
add-content $path2 " "
$path = Get-NormalizedFileSystemPath -Path $path
Set-Location -Path $path
$olddat = dir $path | %{$_.Name.substring(0, 4) }
$olddatfull = "$olddat" + "$tempstring"
if ( ((get-date) - (ls $olddatfull).LastWriteTime).days -lt 1)
{
#***** Commented out for testing!
# exit
}
$currentdat = [INT] $olddat
$currentdat++
$currentdat = "$currentdat" + "$tempstring"
add-content $path2 $olddat
add-content $path2 $currentdat
add-content $path2 $success
add-content $path2 " "
#
#Connect to FTP site and get a current directory listing.
#
[System.Net.FtpWebRequest]$ftp = [System.Net.WebRequest]::Create($ftpsite)
$ftp.Method = [System.Net.WebRequestMethods+FTP]::ListDirectoryDetails
$response = $ftp.getresponse()
$stream = $response.getresponsestream()
$buffer = new-object System.Byte[] 1024
$encoding = new-object System.Text.AsciiEncoding
$outputBuffer = ""
$foundMore = $false
#
# Read all the data available from the ftp directory stream, writing it to the
# output buffer when done. After that the buffer is searched to see if it contains the expected
# latest DAT.
#
do
{
## Allow data to buffer for a bit
start-sleep -m 1000
## Read what data is available
$foundmore = $false
$stream.ReadTimeout = 1000
do
{
try
{
$read = $stream.Read($buffer, 0, 1024)
if($read -gt 0)
{
$foundmore = $true
$outputBuffer += ($encoding.GetString($buffer, 0, $read))
}
} catch { $foundMore = $false; $read = 0 }
} while($read -gt 0)
} while($foundmore)
$gotdat = $outputbuffer.Contains($currentdat)
$target = $path + $currentdat
#
# Downloads DATs and cleans up old DAT file. Returns status of the operation.
# Return 1 = success
# Return 2 = Latest DAT not found and 4pm or later
# Return 3 = DAT available but did not download or is 0 bytes
# Return 4 = Latest DAT not found and before 4pm
#
$success = 0
if ($gotdat -eq "True")
{
$ftpfile = $ftpsite + $ftppath + $currentdat
write-host $ftpfile
write-host $target
$ftpclient = New-Object system.Net.WebClient
$uri = New-Object System.Uri($ftpfile)
$ftpclient.DownloadFile($uri, $target)
Start-Sleep -s 30
if ( ((get-date) - (ls $target).LastWriteTime).days -ge 1)
{
$success = 3
}
else
{
$testlength = (get-item $target).length
if( (get-item $target).length -gt 0)
{
Remove-Item "$olddatfull"
$success = 1
}
else
{
$success = 3
}
}
}
else
{
$thetime = Get-Date
$thetime = $thetime.Hour
if ($thetime -ge 16)
{
$success = 2
}
else
{
$success = 4
exit
}
}
#
# If successful download (success = 1) run push bat
#
if ($success -eq 1)
{
Start-Process "cmd.exe" "/c c:\scripts\mcafeepush.bat"
}
#Email structure
#
#Sends result email based on previous determination
#
#SMTP server name
$smtpServer = "emailserver.domain.com"
#Creating a Mail object
$msg = new-object Net.Mail.MailMessage
#Creating SMTP server object
$smtp = new-object Net.Mail.SmtpClient($smtpServer)
$msg.From = "email1@domain.com"
$msg.ReplyTo = "email2@domain.com"
$msg.To.Add("email2@domain.com")
switch ($success)
{
1 {
$msg.subject = "McAfee Dats $currentdat successful"
$msg.body = ("DAT download completed successfully. Automaton v1.0")
}
2 {
$msg.subject = "McAfee DATs Error"
$msg.body = ("Looking for DAT $currentdat on the FTP site but I coud not find it. Human intervention may be required. Automaton v1.0")
}
3 {
$msg.subject = "McAfee DATs Error"
$msg.body = ("$currentdat is available for download but download has failed. Human intervention will be required. Automaton v1.0")
}
default {
$msg.subject = "DAT Automaton Error"
$msg.body = ("Something broke with the McAfee automation script. Human intervention will be required. Automaton v1.0")
}
}
#Sending email
$smtp.Send($msg)
#Needed to keep the program from exiting too fast.
Start-Sleep -s 30
#debugging stuff
add-content $path2 $olddat
add-content $path2 $currentdat
add-content $path2 $success
add-content $path2 " "
Apparently you have an error in starting PowerShell, either because the execution policy differs on the PowerShell version you start, or on the account, or there is an access error on the scheduled task. To gather the actual error, you can launch the task like so:
cmd /c "powershell.exe -file d:\datscript\myscript.ps1 test.txt 2>&1" >c:\windows\temp\test.log 2&>1
This way, if there is an error starting PowerShell, it will be logged in the c:\windows\temp\test.log file. If the issue is the execution policy, you can create and run (once) a task with the following:
powershell -command "Get-ExecutionPolicy -List | out-file c:/windows/temp/policy.txt; Set-ExecutionPolicy RemoteSigned -Scope LocalMachine -Force"
Running this under the account you plan to run your main task with will first get the policies in effect (so that if setting the machine-level policy doesn't help, you'll know what scope to alter), then set the machine-level policy to "RemoteSigned", the least restrictive level short of allowing every script (which is highly discouraged; there are encoder scripts written in PowerShell that can ruin your data).
Hope this helps.
UPDATE: If it's not policy, there might be errors in how the parameters for the task are written. You can do this: create a .bat file with the string that launches your script and redirects output to, say, test1.txt, then change the scheduled task to cmd.exe /c launcher.bat >test2.txt, properly specifying the home folder. Run the task and review both files; at least one of them should contain an error that prevents your script from launching.