So, I have a script that shows download progress from FTP. I tried many ways to solve this task. One of my conclusions was that the Register-ObjectEvent cmdlet is a really bad idea: async eventing is rather poorly supported in PowerShell. And I stopped here:
$webclient.add_DownloadProgressChanged([System.Net.DownloadProgressChangedEventHandler]$webclient_DownloadProgressChanged)
....
$webclient_DownloadProgressChanged = {
    param([System.Net.DownloadProgressChangedEventArgs]$Global:e)
    $progressbaroverlay1.value = $e.ProgressPercentage
    ....
}
Everything in this script works fine, but as you can see, I did all this for a single file.
Then I started thinking: how can I download several files at the same time and show them in one progress bar?
So, does anyone have any good ideas, or a best way to solve this task?
P.S.
I know, of course, that WebClient can only download one file at a time.
I came up with the same kind of [scriptblock]::Create approach, but I did use Register-ObjectEvent, with more or less success. The async downloads happen as background jobs and use events to communicate their progress back to the main script.
$Progress = @{}
$Isos = 'https://cdimage.debian.org/debian-cd/current/i386/iso-cd/debian-8.8.0-i386-CD-1.iso',
        'https://cdimage.debian.org/debian-cd/current/i386/iso-cd/debian-8.8.0-i386-CD-2.iso'
$Count = 0
$WebClients = $Isos | ForEach-Object {
    $w = New-Object System.Net.WebClient
    $null = Register-ObjectEvent -InputObject $w -EventName DownloadProgressChanged -Action ([scriptblock]::Create(
        "`$Percent = 100 * `$eventargs.BytesReceived / `$eventargs.TotalBytesToReceive; `$null = New-Event -SourceIdentifier MyDownloadUpdate -MessageData @($count,`$Percent)"
    ))
    $w.DownloadFileAsync($_, "C:\PATH_TO_DOWNLOAD_FOLDER\$count.iso")
    $Count = $Count + 1
    $w
}
$event = Register-EngineEvent -SourceIdentifier MyDownloadUpdate -Action {
    $Progress[$event.MessageData[0]] = $event.MessageData[1]
}
$Timer = New-Object System.Timers.Timer
Register-ObjectEvent -InputObject $Timer -EventName Elapsed -Action {
    if ($Progress.Values.Count -gt 0)
    {
        # The stored values are already percentages, so overall progress is their average
        $PercentComplete = ($Progress.Values | Measure-Object -Sum | Select-Object -ExpandProperty Sum) / $Progress.Values.Count
        Write-Progress -Activity "Download Progress" -PercentComplete $PercentComplete
    }
}
$Timer.Interval = 100
$Timer.AutoReset = $true
$Timer.Start()
Telling when the downloads have finished is left as an exercise for the reader.
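One way to close that loop, as a sketch assuming the variables above ($WebClients, $Isos, $Timer) are still in scope: forward each WebClient's DownloadFileCompleted event and wait until all of them have fired. In real code you would register these handlers before calling DownloadFileAsync, so a fast download can't finish before its handler exists.
# Count completions; [ref] gives the event actions a shared mutable cell
$Done = [ref]0
$WebClients | ForEach-Object {
    $null = Register-ObjectEvent -InputObject $_ -EventName DownloadFileCompleted -MessageData $Done -Action {
        $Event.MessageData.Value++   # one more download finished
    }
}
while ($Done.Value -lt $Isos.Count) { Start-Sleep -Milliseconds 200 }
$Timer.Stop()
Write-Progress -Activity "Download Progress" -Completed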
You can use the BitsTransfer module's asynchronous download.
https://technet.microsoft.com/en-us/library/dd819420.aspx
Here is example code that shows the overall progress of three files. You specify equal-length arrays of URLs and download locations; you can build on it further to your liking, e.g. with exception handling:
Import-Module BitsTransfer
[string[]]$url = @();
$url += 'https://www.samba.org/ftp/talloc/talloc-2.1.6.tar.gz';
$url += 'https://e.thumbs.redditmedia.com/pF525auqxnTG-FFj.png';
$url += 'http://bchavez.bitarmory.com/Skins/DayDreaming/images/bg-header.gif';
[string[]]$destination = @();
$destination += 'C:\Downloads\talloc-2.1.6.tar.gz';
$destination += 'C:\Downloads\pF525auqxnTG-FFj.png';
$destination += 'C:\Downloads\bg-header.gif';
$result = Start-BitsTransfer -Source $url -Destination $destination -TransferType Download -Asynchronous
$downloadsFinished = $false;
While ($downloadsFinished -ne $true) {
    sleep 1
    $jobstate = $result.JobState;
    if ($jobstate.ToString() -eq 'Transferred') { $downloadsFinished = $true }
    $percentComplete = ($result.BytesTransferred / $result.BytesTotal) * 100
    Write-Progress -Activity ('Downloading ' + $result.FilesTotal + ' files') -PercentComplete $percentComplete
}
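One caveat worth adding: an asynchronous BITS job keeps its files in a temporary state until the job is explicitly completed, so once the loop sees 'Transferred' the job still has to be finalized:
# Rename the temporary files to their final destinations and remove the job
Complete-BitsTransfer -BitsJob $result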
I see two possible concepts for this:
Create (with [scriptblock]::Create) an anonymous function on the fly, something like:
$Id = 0
... | ForEach {
    $webclient[$Id].add_DownloadProgressChanged([System.Net.DownloadProgressChangedEventHandler][scriptblock]::Create("
        param([System.Net.DownloadProgressChangedEventArgs]`$e)
        `$Global:ProgressPercentage[$Id] = `$e.ProgressPercentage
        `$progressbaroverlay1.value = (`$Global:ProgressPercentage | Measure-Object -Average).Average
        ....
    "))
    $Id++
}
Note that with this idea you need to prevent everything except $Id from being expanded immediately, by escaping it with a backtick.
Or, if the generated function gets too large to read, simplify the [ScriptBlock] to a stub:
[ScriptBlock]::Create("param(`$e); webclient_DownloadProgressChanged $Id `$e")
and call a global function:
function Global:webclient_DownloadProgressChanged($Id, $e) {
    $Global:ProgressPercentage[$Id] = $e.ProgressPercentage
    $progressbaroverlay1.value = ($Global:ProgressPercentage | Measure-Object -Average).Average
}
Create your own custom background workers (threads):
For an example see: PowerShell: Job Event Action with Form not executed
In the main thread, build your UI with progress bars
For each FTP download:
Create a shared (hidden) Windows control (e.g. .TextBox[$Id])
Start a new background worker and share the related control, something like:
$SyncHash = [hashtable]::Synchronized(@{TextBox = $TextBox[$Id]})
Update the shared $SyncHash.TextBox.Text from within the worker(s)
Capture the events (e.g. .Add_TextChanged) on each .TextBox[$Id] in the main thread
Update your progress bars accordingly, based on the average status passed in each .TextBox[$Id].Text; a sketch of this worker setup follows below
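A minimal sketch of that worker idea, assuming a Windows Forms $TextBox control array already exists in the main thread (the download itself is elided):
$SyncHash = [hashtable]::Synchronized(@{ TextBox = $TextBox[$Id] })
# Each worker gets its own runspace; the synchronized hashtable is the bridge
$Runspace = [runspacefactory]::CreateRunspace()
$Runspace.Open()
$Runspace.SessionStateProxy.SetVariable('SyncHash', $SyncHash)
$Worker = [powershell]::Create()
$Worker.Runspace = $Runspace
$null = $Worker.AddScript({
    # ... download the file here, reporting progress through the shared control
    $SyncHash.TextBox.Text = '42'   # e.g. the current percentage
})
$Handle = $Worker.BeginInvoke()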
Related
I have an ObjectEvent that sees a new file getting created and then tries to move the file.
It works... except that when it goes to move the file, the file is still open, so the Move-Item fails.
So I suppose there are two possible paths... I am open to either (or both!)
First, how could my ObjectEvent fire only after the file is closed? The current object event:
Register-ObjectEvent $Watcher -EventName Created -SourceIdentifier FileCreated -Action
Second, is it possible for Move-Item to sit and keep trying for 5 seconds or so before failing? The current Move-Item call:
Move-Item $path -Destination $destination -Force -Verbose
The FileSystemWatcher class has only a few events available: Changed, Created, Deleted, Error and Renamed. There is nothing there to indicate whether the file was recently unlocked by the system, so there is no direct way to fire your event specifically after the file is closed.
What I would personally do to make sure the files are moved properly is delay the move until X ms after the file-creation event fires.
Here's a summary of how that would work.
Prerequisites
A FileSystemWatcher, which detects the file being created
A timer, which moves the files some time after their creation
A queue, onto which the watcher enqueues created items and from which the timer dequeues items that need processing. It is what communicates the information between the two.
Flow
A file is created
The watcher's Created event triggers, adds the item to the queue, then starts the timer.
The timer's Elapsed event triggers, stops the timer and processes the queue. If any items remain in the queue, it restarts itself.
Here's a base version of all of that.
$Params = @{
    # Path we are watching
    WatchPath = 'C:\temp\11\test'
    # Path we are moving stuff to.
    DestinationPath = 'C:\temp\11'
    # Stop after X attempts
    MoveAttempts = 5
    # Timer heartbeat
    MoveAttemptsDelay = 1000
}
# Create watcher
$Watcher = [System.IO.FileSystemWatcher]::new()
$Watcher.Path = $Params.WatchPath
$Watcher.EnableRaisingEvents = $true
# Create timer - stopped state
$WatchTimer = [System.Timers.Timer]::new()
$WatchTimer.Interval = $Params.MoveAttemptsDelay
# Create watch queue (each item will be FullPath, TimeStamp, MoveAttempts)
$WatchQueue = [System.Collections.Generic.Queue[psobject]]::new()
$FileWatcherReg = @{
    InputObject      = $Watcher
    EventName        = 'Created'
    SourceIdentifier = 'FileCreated'
    MessageData      = @{WatchQueue = $WatchQueue; Timer = $WatchTimer }
    Action           = {
        if ($null -ne $Event) {
            $Queue = $Event.MessageData.WatchQueue
            $Timer = $Event.MessageData.Timer
            $Queue.Enqueue([PSCustomObject]@{
                FullPath     = $Event.SourceArgs.FullPath
                TimeStamp    = $Event.TimeGenerated
                MoveAttempts = 0
            })
            # We only start the timer if it is not already counting down.
            # We also don't start it unless this is the only item in the queue,
            # since otherwise the timer logic is already running.
            if ($Queue.Count -eq 1 -and ! $Timer.Enabled) { $Timer.Start() }
        }
    }
}
$TimerReg = @{
    InputObject      = $WatchTimer
    EventName        = 'Elapsed'
    SourceIdentifier = 'WatchTimerElapsed'
    MessageData      = @{WatchQueue = $WatchQueue; ConfigParams = $Params }
    Action           = {
        $Queue = $Event.MessageData.WatchQueue
        $ConfigParams = $Event.MessageData.ConfigParams
        $Event.Sender.Stop()
        $SkipItemsCount = 0
        while ($Queue.Count -gt 0 + $SkipItemsCount) {
            $Item = $Queue.Dequeue()
            $ItemName = Split-Path $Item.FullPath -Leaf
            while ($Item.MoveAttempts -lt $ConfigParams.MoveAttempts) {
                try {
                    $Item.MoveAttempts += 1
                    Move-Item -Path $Item.FullPath -Destination "$($ConfigParams.DestinationPath)\$ItemName" -ErrorAction Stop
                    break
                }
                catch {
                    $ex = $_.Exception
                    if ($Item.MoveAttempts -eq $ConfigParams.MoveAttempts) {
                        # Out of attempts. Do something about it... Log / Warn / etc...
                        Write-Warning "Move attempts: $($ConfigParams.MoveAttempts)"
                        Write-Warning "FilePath: $($Item.FullPath)"
                        Write-Warning $ex
                        continue
                    }
                    else {
                        # Re-queue the item so it is retried on a later timer tick
                        $Queue.Enqueue($Item)
                        $SkipItemsCount += 1
                        break
                    }
                }
            }
            # If we skipped any items, we don't want to dequeue down to 0 anymore;
            # restart the timer instead, so the skipped items are retried later.
            if ($SkipItemsCount -gt 0) {
                $Event.Sender.Start()
            }
        }
    }
}
# ObjectEvent for the FileSystemWatcher
Register-ObjectEvent @FileWatcherReg
# ObjectEvent for the timer, which processes items in a delayed fashion
Register-ObjectEvent @TimerReg
while ($true) {
    Start-Sleep -Milliseconds 100
}
# Unregister events at the end
Unregister-Event -SourceIdentifier FileCreated | Out-Null # Will fail on a first run, since it was never registered
Unregister-Event -SourceIdentifier WatchTimerElapsed | Out-Null
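To exercise the retry path, one quick test (run from a second PowerShell session, since the loop above blocks this one; the file name is made up) is to create a file in the watched folder and hold it open for a few seconds:
# Holding the handle open forces the first move attempts to fail
$fs = [System.IO.File]::Open('C:\temp\11\test\demo.txt', [System.IO.FileMode]::CreateNew)
Start-Sleep -Seconds 3   # warnings/retries happen while the handle is held
$fs.Close()              # after release, a later timer tick moves the file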
I created a tool (to be precise: a Powershell script) that helps me with converting pictures in folders, i.e. it looks for all files of a certain ending (say, *.TIF) and converts them to JPEGs via ImageMagick. It then transfers some EXIF, IPTC and XMP information from the source image to the JPEG via exiftool:
# searching files (done before converting the files, so just listed for reproduction):
$WorkingFiles = @(Get-ChildItem -Path D:\MyPictures\Testfiles -Filter *.tif | ForEach-Object {
    [PSCustomObject]@{
        SourceFullName = $_.FullName
        JPEGFullName   = $_.FullName -Replace 'tif$','jpg'
    }
})
# Then, converting is done. PowerShell will wait until every jpeg is successfully created.
# + + + + The problem occurs somewhere after this line + + + +
# Creating the exiftool process:
$psi = New-Object System.Diagnostics.ProcessStartInfo
$psi.FileName = '.\exiftool.exe'
$psi.Arguments = "-stay_open True -charset utf8 -@ -"
$psi.UseShellExecute = $false
$psi.RedirectStandardInput = $true
$psi.RedirectStandardOutput = $true
$psi.RedirectStandardError = $true
$exiftoolproc = [System.Diagnostics.Process]::Start($psi)
# creating the string argument for every file, then pass it over to exiftool:
for($i=0; $i -lt $WorkingFiles.length; $i++){
[string]$ArgList = "-All:all=`n-charset`nfilename=utf8`n-tagsFromFile`n$($WorkingFiles[$i].SourceFullName)`n-EXIF:All`n-charset`nfilename=utf8`n$($WorkingFiles[$i].JPEGFullName)"
# using -overwrite_original makes no difference
# Also, just as good as above code:
# [string]$ArgList = "-All:All=`n-EXIF:XResolution=300`n-EXIF:YResolution=300`n-charset`nfilename=utf8`n-overwrite_original`n$($WorkingFiles[$i].JPEGFullName)"
$exiftoolproc.StandardInput.WriteLine("$ArgList`n-execute`n")
# no difference using start-sleep:
# Start-Sleep -Milliseconds 25
}
# close exiftool:
$exiftoolproc.StandardInput.WriteLine("-stay_open`nFalse`n")
# read StandardError and StandardOutput of exiftool, then print it:
[array]$outputerror = @($exiftoolproc.StandardError.ReadToEnd().Split("`r`n",[System.StringSplitOptions]::RemoveEmptyEntries))
[string]$outputout = $exiftoolproc.StandardOutput.ReadToEnd()
$outputout = $outputout -replace '========\ ','' -replace '\[1/1]','' -replace '\ \r\n\ \ \ \ '," - " -replace '{ready}\r\n',''
[array]$outputout = @($outputout.Split("`r`n",[System.StringSplitOptions]::RemoveEmptyEntries))
Write-Output "Errors:"
foreach($i in $outputerror){
Write-Output $i
}
Write-Output "Standard output:"
foreach($i in $outputout){
Write-Output $i
}
If you want to reproduce this but do not have/want that many files, there is also a simpler way: let exiftool print its version number 600 times:
$psi = New-Object System.Diagnostics.ProcessStartInfo
$psi.FileName = '.\exiftool.exe'
$psi.Arguments = "-stay_open True -charset utf8 -@ -"
$psi.UseShellExecute = $false
$psi.RedirectStandardInput = $true
$psi.RedirectStandardOutput = $true
$psi.RedirectStandardError = $true
$exiftoolproc = [System.Diagnostics.Process]::Start($psi)
for($i=0; $i -lt 600; $i++){
try{
$exiftoolproc.StandardInput.WriteLine("-ver`n-execute`n")
Write-Output "Success:`t$i"
}catch{
Write-Output "Failed:`t$i"
}
}
# close exiftool:
try{
$exiftoolproc.StandardInput.WriteLine("-stay_open`nFalse`n")
}catch{
Write-Output "Could not close exiftool!"
}
[array]$outputerror = @($exiftoolproc.StandardError.ReadToEnd().Split("`r`n",[System.StringSplitOptions]::RemoveEmptyEntries))
[array]$outputout = @($exiftoolproc.StandardOutput.ReadToEnd().Split("`r`n",[System.StringSplitOptions]::RemoveEmptyEntries))
Write-Output "Errors:"
foreach($i in $outputerror){
Write-Output $i
}
Write-Output "Standard output:"
foreach($i in $outputout){
Write-Output $i
}
As far as I could test, all goes well as long as you stay under ~115 files. If you go above that, the 114th JPEG gets proper metadata, but exiftool stops working after that one: it idles, and my script does so, too. I can reproduce this with different files, paths, and exiftool commands.
Neither StandardOutput nor StandardError shows any irregularities, even with exiftool's -verbose flag; of course they would not, as I have to kill exiftool to get them to show up at all.
Running ISE's / VS Code's debugger shows nothing. Exiftool's window (which only shows up when debugging) shows nothing.
Is there some hard limit on commands run with System.Diagnostics.Process, is this a problem with exiftool, or is this simply down to my inexperience with anything outside the most basic PowerShell cmdlets? Or maybe the better question is: how can I properly debug this?
PowerShell is 5.1; exiftool is 10.80 (production) through 10.94 (latest).
After messing around with different variants of $ArgList, I found that it makes no difference which file commands you use, but commands that produce less StdOut (like -ver) allowed more iterations. From that I took an educated guess that the output buffer was the culprit.
As per Mark Byers' answer to "ProcessStartInfo hanging on “WaitForExit”? Why?":
The problem is that if you redirect StandardOutput and/or StandardError the internal buffer can become full. [...]
The solution is to use asynchronous reads to ensure that the buffer doesn't get full.
Then, it was just a matter of searching for the right things. I found that Alexander Obersht's answer to "How to capture process output asynchronously in powershell?" provides almost everything that I needed.
The script now looks like this:
# searching files (done before converting the files, so just listed for reproduction):
$WorkingFiles = @(Get-ChildItem -Path D:\MyPictures\Testfiles -Filter *.tif | ForEach-Object {
    [PSCustomObject]@{
        SourceFullName = $_.FullName
        JPEGFullName   = $_.FullName -Replace 'tif$','jpg'
    }
})
# Then, converting is done. PowerShell will wait until every jpeg is successfully created.
# Creating the exiftool process:
$psi = New-Object System.Diagnostics.ProcessStartInfo
$psi.FileName = '.\exiftool.exe'
$psi.Arguments = "-stay_open True -charset utf8 -@ -"
$psi.UseShellExecute = $false
$psi.RedirectStandardInput = $true
$psi.RedirectStandardOutput = $true
$psi.RedirectStandardError = $true
# + + + + NEW STUFF (1/2) HERE: + + + +
# Creating process object.
$exiftoolproc = New-Object -TypeName System.Diagnostics.Process
$exiftoolproc.StartInfo = $psi
# Creating string builders to store stdout and stderr.
$exiftoolStdOutBuilder = New-Object -TypeName System.Text.StringBuilder
$exiftoolStdErrBuilder = New-Object -TypeName System.Text.StringBuilder
# Adding event handlers for stdout and stderr.
$exiftoolScriptBlock = {
    if (-not [String]::IsNullOrEmpty($EventArgs.Data)){
        $Event.MessageData.AppendLine($EventArgs.Data)
    }
}
$exiftoolStdOutEvent = Register-ObjectEvent -InputObject $exiftoolproc -Action $exiftoolScriptBlock -EventName 'OutputDataReceived' -MessageData $exiftoolStdOutBuilder
$exiftoolStdErrEvent = Register-ObjectEvent -InputObject $exiftoolproc -Action $exiftoolScriptBlock -EventName 'ErrorDataReceived' -MessageData $exiftoolStdErrBuilder
[Void]$exiftoolproc.Start()
$exiftoolproc.BeginOutputReadLine()
$exiftoolproc.BeginErrorReadLine()
# + + + + END OF NEW STUFF (1/2) + + + +
# creating the string argument for every file, then pass it over to exiftool:
for($i=0; $i -lt $WorkingFiles.length; $i++){
[string]$ArgList = "-All:all=`n-charset`nfilename=utf8`n-tagsFromFile`n$($WorkingFiles[$i].SourceFullName)`n-EXIF:All`n-charset`nfilename=utf8`n$($WorkingFiles[$i].JPEGFullName)"
# using -overwrite_original makes no difference
# Also, just as good as above code:
# [string]$ArgList = "-All:All=`n-EXIF:XResolution=300`n-EXIF:YResolution=300`n-charset`nfilename=utf8`n-overwrite_original`n$($WorkingFiles[$i].JPEGFullName)"
$exiftoolproc.StandardInput.WriteLine("$ArgList`n-execute`n")
}
# + + + + NEW STUFF (2/2) HERE: + + + +
# close exiftool:
$exiftoolproc.StandardInput.WriteLine("-stay_open`nFalse`n")
$exiftoolproc.WaitForExit()
# Unregistering events to retrieve process output.
Unregister-Event -SourceIdentifier $exiftoolStdOutEvent.Name
Unregister-Event -SourceIdentifier $exiftoolStdErrEvent.Name
# read StandardError and StandardOutput of exiftool, then print it:
[array]$outputerror = @($exiftoolStdErrBuilder.ToString().Trim().Split("`r`n",[System.StringSplitOptions]::RemoveEmptyEntries))
[string]$outputout = $exiftoolStdOutBuilder.ToString().Trim() -replace '========\ ','' -replace '\[1/1]','' -replace '\ \r\n\ \ \ \ '," - " -replace '{ready}\r\n',''
[array]$outputout = @($outputout.Split("`r`n",[System.StringSplitOptions]::RemoveEmptyEntries))
# + + + + END OF NEW STUFF (2/2) + + + +
Write-Output "Errors:"
foreach($i in $outputerror){
Write-Output $i
}
Write-Output "Standard output:"
foreach($i in $outputout){
Write-Output $i
}
I can confirm that it works for many, many files (at least 1600).
Is there any simple way (i.e., a script) to watch a file in PowerShell and run commands if the file changes? I have been googling but can't find a simple solution. Basically, I run a script in PowerShell, and if a file changes, PowerShell runs other commands.
EDIT
OK, I think I made a mistake. I don't need a script, but a function that I can include in my $PROFILE.ps1 file. Still, I was trying hard and I'm still unable to write it, so I will give a bounty. It has to look like this:
function watch($command, $file) {
    if ($file #changed) {
        #run $command
    }
}
There is an NPM module that does what I want, watch, but it only watches folders, not files, and it's not PowerShell xD.
Here is an example I found in my snippets. Hopefully it is a little bit more comprehensive.
First you create a file system watcher, and then you subscribe to an event that the watcher generates. This example listens for "Created" events, but could easily be modified to watch for "Changed".
$folder = "C:\Users\LOCAL_~1\AppData\Local\Temp\3"
$filter = "*.LOG"
$Watcher = New-Object IO.FileSystemWatcher $folder, $filter -Property @{
    IncludeSubdirectories = $false
    NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite'
}
$onCreated = Register-ObjectEvent $Watcher -EventName Created -SourceIdentifier FileCreated -Action {
    $path = $Event.SourceEventArgs.FullPath
    $name = $Event.SourceEventArgs.Name
    $changeType = $Event.SourceEventArgs.ChangeType
    $timeStamp = $Event.TimeGenerated
    Write-Host "The file '$name' was $changeType at $timeStamp"
    Write-Host $path
    #Move-Item $path -Destination $destination -Force -Verbose
}
I will try to narrow this down to your requirements.
If you run this as part of your "profile.ps1" script, you should read The Power of Profiles, which explains the different profile scripts available and more.
Also, you should understand that waiting for a change in a folder can't be run as a function in the script. The profile script has to finish for your PowerShell session to start. You can, however, use a function to register an event.
What this does is register a piece of code to be executed every time an event is triggered. This code will be executed in the context of your current PowerShell host (or shell) while the session remains open. It can interact with the host session, but has no knowledge of the original script that registered the code. The original script has probably finished already by the time your code is triggered.
Here is the code:
Function Register-Watcher {
    param ($folder)
    $filter = "*.*" #all files
    $watcher = New-Object IO.FileSystemWatcher $folder, $filter -Property @{
        IncludeSubdirectories = $false
        EnableRaisingEvents = $true
    }
    $changeAction = [scriptblock]::Create('
        # This is the code which will be executed every time a file change is detected
        $path = $Event.SourceEventArgs.FullPath
        $name = $Event.SourceEventArgs.Name
        $changeType = $Event.SourceEventArgs.ChangeType
        $timeStamp = $Event.TimeGenerated
        Write-Host "The file $name was $changeType at $timeStamp"
    ')
    Register-ObjectEvent $watcher -EventName "Changed" -Action $changeAction
}
Register-Watcher "c:\temp"
After running this code, change any file in the "C:\temp" directory (or any other directory you specify). You will see an event triggering execution of your code.
Also, the valid FileSystemWatcher events you can register are "Changed", "Created", "Deleted" and "Renamed"; if you want the same action for several of them, see the loop sketch below.
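A possible sketch, reusing $watcher and $changeAction from the function above, registering all four event types with one action:
'Changed','Created','Deleted','Renamed' | ForEach-Object {
    $null = Register-ObjectEvent $watcher -EventName $_ -Action $changeAction
}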
I will add another answer, because my previous one missed the requirements.
Requirements
Write a function to WAIT for a change in a specific file
When a change is detected, the function executes a predefined command and returns execution to the main script
File path and command are passed to the function as parameters
There is already an answer using file hashes. I want to follow up my previous answer and show you how this can be accomplished using FileSystemWatcher.
$File = "C:\temp\log.txt"
$Action = 'Write-Output "The watched file was changed"'
$global:FileChanged = $false
function Wait-FileChange {
    param(
        [string]$File,
        [string]$Action
    )
    $FilePath = Split-Path $File -Parent
    $FileName = Split-Path $File -Leaf
    $ScriptBlock = [scriptblock]::Create($Action)
    $Watcher = New-Object IO.FileSystemWatcher $FilePath, $FileName -Property @{
        IncludeSubdirectories = $false
        EnableRaisingEvents = $true
    }
    $onChange = Register-ObjectEvent $Watcher Changed -Action {$global:FileChanged = $true}
    while ($global:FileChanged -eq $false){
        Start-Sleep -Milliseconds 100
    }
    & $ScriptBlock
    Unregister-Event -SubscriptionId $onChange.Id
}
Wait-FileChange -File $File -Action $Action
Here is the solution I ended up with, based on several of the previous answers here. I specifically wanted:
My code to be code, not a string
My code to be run on the I/O thread so I can see the console output
My code to be called every time there was a change, not once
Side note: I've left in the details of what I wanted to run, due to the irony of using a global variable to communicate between threads so I can compile Erlang code.
Function RunMyStuff {
    # this is the bit we want to happen when the file changes
    Clear-Host # remove previous console output
    & 'C:\Program Files\erl7.3\bin\erlc.exe' 'program.erl' # compile some erlang
    erl -noshell -s program start -s init stop # run the compiled erlang program:start()
}
Function Watch {
    $global:FileChanged = $false # dirty... any better suggestions?
    $folder = "M:\dev\Erlang"
    $filter = "*.erl"
    $watcher = New-Object IO.FileSystemWatcher $folder, $filter -Property @{
        IncludeSubdirectories = $false
        EnableRaisingEvents = $true
    }
    Register-ObjectEvent $watcher "Changed" -Action {$global:FileChanged = $true} > $null
    while ($true){
        while ($global:FileChanged -eq $false){
            # We need this to block the IO thread until there is something to run
            # so the script doesn't finish. If we call the action directly from
            # the event it won't be able to write to the console
            Start-Sleep -Milliseconds 100
        }
        # a file has changed, run our stuff on the I/O thread so we can see the output
        RunMyStuff
        # reset and go again
        $global:FileChanged = $false
    }
}
RunMyStuff # run the action at the start so I can see the current output
Watch
You could pass folder/filter/action into Watch if you want something more generic; a parameterized sketch follows. Hopefully this is a helpful starting point for someone else.
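One way the generic version might look, as a sketch (the function and parameter names are mine):
Function Watch-AndRun {
    param([string]$Folder, [string]$Filter, [scriptblock]$Action)
    $global:FileChanged = $false
    $watcher = New-Object IO.FileSystemWatcher $Folder, $Filter -Property @{
        IncludeSubdirectories = $false
        EnableRaisingEvents = $true
    }
    Register-ObjectEvent $watcher "Changed" -Action {$global:FileChanged = $true} > $null
    while ($true) {
        while ($global:FileChanged -eq $false) { Start-Sleep -Milliseconds 100 }
        & $Action # run the caller's block on the I/O thread
        $global:FileChanged = $false
    }
}
# e.g.: Watch-AndRun -Folder 'M:\dev\Erlang' -Filter '*.erl' -Action { RunMyStuff }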
Calculate the hash of a list of files
Store it in a dictionary
Check each hash on an interval
Perform action when hash is different
function watch($f, $command, $interval) {
    $sha1 = New-Object System.Security.Cryptography.SHA1CryptoServiceProvider
    $hashfunction = '[System.BitConverter]::ToString($sha1.ComputeHash([System.IO.File]::ReadAllBytes($file)))'
    $files = @{}
    foreach ($file in $f) {
        $hash = iex $hashfunction
        $files[$file.Name] = $hash
        echo "$hash`t$($file.FullName)"
    }
    while ($true) {
        sleep $interval
        foreach ($file in $f) {
            $hash = iex $hashfunction
            if ($files[$file.Name] -ne $hash) {
                # remember the new hash, or the action would fire on every following pass
                $files[$file.Name] = $hash
                iex $command
            }
        }
    }
}
Example usage:
$c = 'send-mailmessage -to "admin@whatever.com" -from "watch@whatever.com" -subject "$($file.Name) has been altered!"'
$f = ls C:\MyFolder\aFile.jpg
watch $f $c 60
You can use the System.IO.FileSystemWatcher to monitor a file.
$watcher = New-Object System.IO.FileSystemWatcher
$watcher.Path = $searchPath
$watcher.IncludeSubdirectories = $true
$watcher.EnableRaisingEvents = $true
Here is another option.
I needed to write my own to watch and run tests within a Docker container. Jan's solution is much more elegant, but FileSystemWatcher is currently broken within Docker containers. My approach is similar to Vasili's, but much lazier, trusting the file system's write time.
Here's the function I needed, which runs the command block each time the file changes.
function watch($command, $file) {
    $this_time = (get-item $file).LastWriteTime
    $last_time = $this_time
    while ($true) {
        if ($last_time -ne $this_time) {
            $last_time = $this_time
            invoke-command $command
        }
        sleep 1
        $this_time = (get-item $file).LastWriteTime
    }
}
Here is one that waits until the file changes, runs the block, then exits.
function waitfor($command, $file) {
    $this_time = (get-item $file).LastWriteTime
    $last_time = $this_time
    while ($last_time -eq $this_time) {
        sleep 1
        $this_time = (get-item $file).LastWriteTime
    }
    invoke-command $command
}
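Usage for both takes the command block first and the file second; a made-up example:
waitfor { Write-Host 'changed once' } .\readme.md   # blocks until one change, runs, returns
watch { Write-Host 'changed' } .\readme.md          # loops forever, runs on every change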
I had a similar problem. I first wanted to use Windows events and Register-ObjectEvent, but that would be less fault-tolerant than the solution below.
My solution was a polling script (with intervals of 3 seconds). The script has a minimal footprint on the system and notices changes very quickly. During the loop my script can do more things (actually, I check 3 different folders).
My polling script is started through the Task Scheduler. The schedule is: start every 5 minutes, with the flag stop-when-already-running. This way it will restart after a reboot or after a crash.
Using the Task Scheduler itself to poll every 3 seconds would be too frequent for it.
When you add a task to the scheduler, make sure you do not use network drives (that would call for extra settings) and give your user batch privileges.
I give my script a clean start by shutting it down a few minutes before midnight; the Task Scheduler starts it again every morning (the init function of my script exits within a minute of midnight).
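The polling loop itself only takes a handful of lines; here is a minimal sketch of the idea described above (the folder names and the midnight cutoff are made up):
$folders = 'C:\Drop1', 'C:\Drop2', 'C:\Drop3'   # the folders to check
$seen = @{}                                     # last write time per file
while ((Get-Date).TimeOfDay -lt [TimeSpan]'23:55:00') {   # clean exit before midnight
    foreach ($folder in $folders) {
        foreach ($item in Get-ChildItem $folder) {
            if ($seen[$item.FullName] -ne $item.LastWriteTime) {
                $seen[$item.FullName] = $item.LastWriteTime
                # react to the new/changed file here
            }
        }
    }
    Start-Sleep -Seconds 3
}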
I was looking for something I could run as a one-liner from a terminal. This is what I arrived at:
while ($True) { if ((Get-Item .\readme.md).LastWriteTime -ne $LastWriteTime) { "Hello!"; $LastWriteTime = (Get-Item .\readme.md).LastWriteTime }; Sleep 1 }
Another simple version:
$date = get-date
while ( (dir file.txt -ea 0 | % lastwritetime) -lt $date -and $count++ -lt 10) {
sleep 1
}
'file changed or timeout'
I want to know if it's bad form to use try blocks to test if a file is locked. Here's the background.
I need to send text output of an application to two serial printers simultaneously. My solution was to use MportMon and a PowerShell script. The way it's supposed to work is that the application prints by default to the MportMon virtual printer port, which actually creates a uniquely named file in a "dropbox" folder. The PowerShell script uses a FileSystemWatcher to monitor the folder, and when a new file is created, it takes the textual content and pushes it out to two serial printers, then deletes the file so as not to fill up the folder.
I was having a problem when trying to read the text from the file that the virtual printer created: I was getting errors because the file was still locked. To fix the problem, I used an FSM to implement the logic. Instead of checking for a lock every time before attempting to get the content from the file, I used a try block that attempts to read the content; if it fails, the catch block just reaffirms the state that the FSM is in, and the process is repeated until successful. It seems to work fine, but I've read somewhere that it's bad practice. Is there any danger in this method, or is it safe and reliable? Below is my code.
$fsw = New-Object system.io.filesystemwatcher
$q = New-Object system.collections.queue
$path = "c:\DropBox"
$fsw.path = $path
$state = "waitforQ"
[string]$tempPath = $null
Register-ObjectEvent -InputObject $fsw -EventName created -Action {
    $q.enqueue( $event.sourceeventargs.fullpath )
}
while($true) {
    switch($state)
    {
        "waitforQ" {
            echo "waitforQ"
            if ($q.count -gt 0 ) {$state = "retrievefromQ"}
        }
        "retrievefromQ" {
            echo "retrievefromQ"
            $tempPath = $q.dequeue()
            $state = "servicefile"
        }
        "servicefile" {
            echo " in servicefile "
            try
            {
                $text = Get-Content -ErrorAction stop $tempPath
                #echo "in try"
                $text | out-printer db1
                $text | out-printer db2
                echo " $text "
                $state = "waitforQ"
                rm $tempPath
            }
            catch
            {
                #echo "in catch"
                $state = "servicefile"
            }
        }
        Default { $state = "waitforQ" }
    }
}
I wouldn't say it's bad practice to test a file to see if it's locked, but it's not as clean as checking the handles used by other processes. Personally, I'd test the file like you do, but adjust a few parts to make it safer/better.
That switch statement looks way too complicated (to me); I'd replace it with a simple if test: "if there are files in the queue, proceed; if not, wait".
You need to slow down. You will try to read the file as many times as possible while it's locked. This is a waste of resources, since it will take some time for the current application to let the file go and save the data to disk. Add some pauses. You won't notice them, but your CPU will love them. The same applies when there are no files in the queue.
You might benefit from adding a timeout, like a maximum of 50 attempts to read the file, to avoid the script getting stuck on one specific file that is never released.
Try:
$fsw = New-Object system.io.filesystemwatcher
$q = New-Object system.collections.queue
$path = "c:\DropBox"
$fsw.path = $path
$MaxTries = 50 # 50 tries * 0.2s sleep = 10s timeout
[string]$tempPath = $null
Register-ObjectEvent -InputObject $fsw -EventName created -Action {
$q.enqueue( $event.sourceeventargs.fullpath )
}
while($true) {
if($q.Count -gt 0) {
#Get next file in queue
$tempPath = $q.dequeue()
#Read file
$text = $null
$i = 0
while($text -eq $null) {
#If locked, wait and try again
try {
$text = Get-Content -Path $tempPath -ErrorAction Stop
} catch {
$i++
if($i -eq $MaxTries) {
#Max attempts reached. Stops script
Write-Error -Message "Script is stuck on locked file '$tempPath'" -ErrorAction Stop
} else {
#Wait
Start-Sleep -Milliseconds 200
}
}
}
#Print file
$text | Out-Printer db1
$text | Out-Printer db2
echo " $text "
#Remove temp-file
Remove-Item $tempPath
}
#Relax..
Start-Sleep -Milliseconds 500
}
I'm running the DTEXEC.exe command from within a PowerShell script, trying to capture and log the output to a file. Sometimes the output is incomplete, and I'm trying to figure out why this is the case and what might be done about it. The lines that never seem to get logged are the most interesting:
DTEXEC: The package execution returned DTSER_SUCCESS(0)
Started: 10:58:43 a.m.
Finished: 10:59:24 a.m.
Elapsed: 41.484 seconds
The output always seems incomplete on packages that execute in less than ~8 seconds, and this might be a clue (there isn't much output, or they finish quickly).
I'm using .NET's System.Diagnostics.Process and ProcessStartInfo to set up and run the command, and I'm redirecting stdout and stderr to event handlers that each append to a StringBuilder, which is subsequently written to disk.
The problem feels like a timing issue or a buffering issue. To solve the timing issue, I've attempted to use Monitor.Enter/Exit. If it's a buffering issue, I'm not sure how to force the Process not to buffer stdout and stderr.
The environment is:
- PowerShell 2 running CLR version 2
- SQL 2008 32-bit DTEXEC.exe
- Host operating system: XP Service Pack 3
Here's the code:
function Execute-SSIS-Package
{
param([String]$fileName)
$cmd = GetDTExecPath
$proc = New-Object System.Diagnostics.Process
$proc.StartInfo.FileName = $cmd
$proc.StartInfo.Arguments = "/FILE ""$fileName"" /CHECKPOINTING OFF /REPORTING ""EWP"""
$proc.StartInfo.RedirectStandardOutput = $True
$proc.StartInfo.RedirectStandardError = $True
$proc.StartInfo.WorkingDirectory = Get-Location
$proc.StartInfo.UseShellExecute = $False
$proc.StartInfo.CreateNoWindow = $False
Write-Host $proc.StartInfo.FileName $proc.StartInfo.Arguments
$cmdOut = New-Object System.Text.StringBuilder
$errorEvent = Register-ObjectEvent -InputObj $proc `
-Event "ErrorDataReceived" `
-MessageData $cmdOut `
-Action `
{
param
(
[System.Object] $sender,
[System.Diagnostics.DataReceivedEventArgs] $e
)
try
{
[System.Threading.Monitor]::Enter($Event.MessageData)
Write-Host -ForegroundColor "DarkRed" $e.Data
[void](($Event.MessageData).AppendLine($e.Data))
}
catch
{
Write-Host -ForegroundColor "Red" "Error capturing processes std error" $Error
}
finally
{
[System.Threading.Monitor]::Exit($Event.MessageData)
}
}
$outEvent = Register-ObjectEvent -InputObj $proc `
-Event "OutputDataReceived" `
-MessageData $cmdOut `
-Action `
{
param
(
[System.Object] $sender,
[System.Diagnostics.DataReceivedEventArgs] $e
)
try
{
[System.Threading.Monitor]::Enter($Event.MessageData)
#Write-Host $e.Data
[void](($Event.MessageData).AppendLine($e.Data))
}
catch
{
Write-Host -ForegroundColor "Red" "Error capturing processes std output" $Error
}
finally
{
[System.Threading.Monitor]::Exit($Event.MessageData)
}
}
$isStarted = $proc.Start()
$proc.BeginOutputReadLine()
$proc.BeginErrorReadLine()
while (!$proc.HasExited)
{
Start-Sleep -Milliseconds 100
}
Start-Sleep -Milliseconds 1000
$procExitCode = $proc.ExitCode
$procStartTime = $proc.StartTime
$procFinishTime = Get-Date
$proc.Close()
$proc.CancelOutputRead()
$proc.CancelErrorRead()
$result = New-Object PsObject -Property @{
ExitCode = $procExitCode
StartTime = $procStartTime
FinishTime = $procFinishTime
ElapsedTime = $procFinishTime.Subtract($procStartTime)
StdErr = ""
StdOut = $cmdOut.ToString()
}
return $result
}
The reason that your output is truncated is that PowerShell returns from WaitForExit() and sets the HasExited property before it has processed all the output events in the queue.
One solution is to loop for an arbitrary amount of time with short sleeps to allow the events to be processed; PowerShell event processing appears not to be pre-emptive, so a single long sleep does not allow events to process.
A much better solution is to also register for the Exited event (in addition to the Output and Error events) on the Process. This event is the last in the queue, so if you set a flag when it occurs, you can loop with short sleeps until the flag is set and know that you have processed all the output events.
I have written up a full solution on my blog but the core snippet is:
# Set up a pair of stringbuilders to which we can stream the process output
$global:outputSB = New-Object -TypeName "System.Text.StringBuilder";
$global:errorSB = New-Object -TypeName "System.Text.StringBuilder";
# Flag that shows that final process exit event has not yet been processed
$global:myprocessrunning = $true
$ps = new-object System.Diagnostics.Process
$ps.StartInfo.Filename = $target
$ps.StartInfo.WorkingDirectory = Split-Path $target -Parent
$ps.StartInfo.UseShellExecute = $false
$ps.StartInfo.RedirectStandardOutput = $true
$ps.StartInfo.RedirectStandardError = $true
$ps.StartInfo.CreateNoWindow = $true
# Register Asynchronous event handlers for Standard and Error Output
Register-ObjectEvent -InputObject $ps -EventName OutputDataReceived -action {
if(-not [string]::IsNullOrEmpty($EventArgs.data)) {
$global:outputSB.AppendLine(((get-date).toString('yyyyMMddHHmm')) + " " + $EventArgs.data)
}
} | Out-Null
Register-ObjectEvent -InputObject $ps -EventName ErrorDataReceived -action {
if(-not [string]::IsNullOrEmpty($EventArgs.data)) {
$global:errorSB.AppendLine(((get-date).toString('yyyyMMddHHmm')) + " " + $EventArgs.data)
}
} | Out-Null
Register-ObjectEvent -InputObject $ps -EventName Exited -action {
$global:myprocessrunning = $false
} | Out-Null
$ps.start() | Out-Null
$ps.BeginOutputReadLine();
$ps.BeginErrorReadLine();
# We set a timeout after which time the process will be forceably terminated
$processTimeout = $timeoutseconds * 1000
while (($global:myprocessrunning -eq $true) -and ($processTimeout -gt 0)) {
# We must use lots of shorts sleeps rather than a single long one otherwise events are not processed
$processTimeout -= 50
Start-Sleep -m 50
}
if ($processTimeout -le 0) {
Add-Content -Path $logFile -Value (((get-date).toString('yyyyMMddHHmm')) + " PROCESS EXCEEDED EXECUTION ALLOWANCE AND WAS ABENDED!")
$ps.Kill()
}
# Append the Standard and Error Output to log file, we don't use Add-Content as it appends a carriage return that is not required
[System.IO.File]::AppendAllText($logFile, $global:outputSB)
[System.IO.File]::AppendAllText($logFile, $global:errorSB)
My 2 cents... it's not a PowerShell issue but an issue/bug in the System.Diagnostics.Process class and the underlying shell. I've seen times when wrapping StdError and StdOut does not catch everything, and other times when the 'listening' wrapper application will hang indefinitely because of HOW the underlying application writes to the console (in the C/C++ world there are MANY different ways to do this, e.g. WriteFile, fprintf, cout, etc.).
In addition, there are more than 2 outputs that may need to be captured, but the .NET framework only shows you those two (given that they are the two primary ones); articles on command redirection start to give hints about this.
My guess (for both your issue and mine) is that it has to do with some low-level buffer flushing and/or reference counting.
One (very hacky) way to get around this is, instead of executing the program directly, to wrap the call in cmd.exe with 2>&1, but this method has its own pitfalls and issues.
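For instance, the wrapper might look something like this (a sketch; the quoting gets fiddly with more complex argument lists):
# Let cmd.exe merge stderr into stdout before .NET/PowerShell ever sees it
$merged = cmd /c "dtexec.exe /FILE ""$fileName"" /CHECKPOINTING OFF 2>&1"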
The most ideal solution is for the executable to have a logging parameter, and to then go parse the log file after the process exits... but most of the time you don't have that option.
But wait, we're using PowerShell... why are you using System.Diagnostics.Process in the first place? You can just call the command directly:
$output = & (GetDTExecPath) /FILE "$fileName" /CHECKPOINTING OFF /REPORTING "EWP"
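If stderr also needs capturing with the direct call, PowerShell's own redirection can merge it into the captured output, and $LASTEXITCODE still carries the exit code (a sketch based on the line above):
$output = & (GetDTExecPath) /FILE "$fileName" /CHECKPOINTING OFF /REPORTING "EWP" 2>&1
$exitCode = $LASTEXITCODE   # native exit code from dtexec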