Perform action after multiple files are created - powershell

I'm new to PowerShell scripting, and I'm struggling with how to identify when multiple files have been created in order to initiate a process.
Ultimately, I need to wait until a handful of files have been created (this usually occurs within a finite window, around the same time each day). These files are created at separate times. Once all files have been created and are at their final location, I need to perform a separate action with these files. The trouble I'm having is:
Identifying when all files are available
Identifying how to initiate a separate process once these files are available
If necessary, unregistering the events (I plan to run this script each morning... I don't know how long these events stay registered)
I've toyed with using the IO.FileSystemWatcher with some success in order to monitor when any individual directory has this file. While this logs appropriately, I don't know how to consolidate the collection of these files. Possibly a flag? Not sure how to implement. I've also considered using Test-Path as a way of checking to see if these files exist -- but a drawback with this is that I'd need to periodically run the script (or Sleep) for a pre-defined period of time.
Does anyone have experience with doing something like this? Or possibly provide guidance?
What I've tried (with respect to IO.FileSystemWatcher) using test data:
$i=0
$paths = "C:\","Z:\"
$fileName = 'Test' + $Datestr.Trim() + '.txt'
foreach ($path in $paths)
{
    $fsw = New-Object IO.FileSystemWatcher $path, $fileName -Property @{IncludeSubdirectories = $true; NotifyFilter = [IO.NotifyFilters]'FileName, LastWrite'}
    Register-ObjectEvent $fsw Created -SourceIdentifier "$i+fileCreated" -Action {
        $name = $Event.SourceEventArgs.Name
        $changeType = $Event.SourceEventArgs.ChangeType
        $fpath = $Event.SourceEventArgs.FullPath
        $timeStamp = $Event.TimeGenerated
        Write-Host "The folder $fpath was $changeType at $timeStamp" -fore green
        Out-File -FilePath Z:\log.txt -Append -InputObject "The folder $fpath was $changeType at $timeStamp"
    }
    $i++
}

I am guessing you do not have control over the process(es) that create the files, or else you would use the completion of that job to trigger the "post processing" you need to do. I have used IO.FileSystemWatcher on a loop/timer like you described, and then I Group-Object on the file names to get a distinct list of the files that have changed. I was using this to monitor for small files (about 100 files at ~100KB each) that did not change frequently, and it still generated a large number of events every time the files changed. If you want to take action/start a process every time a change is made, then IO.FileSystemWatcher is your friend.
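The Group-Object step described above might be sketched like this (the source identifier and event handling here are illustrative, not the exact code used):

```powershell
# Drain PowerShell's event queue and reduce it to a distinct list of changed files.
# Assumes the watcher was registered with -SourceIdentifier 'fileCreated'.
$events = Get-Event -SourceIdentifier 'fileCreated' -ErrorAction SilentlyContinue
$distinctFiles = $events.SourceEventArgs.FullPath | Group-Object | ForEach-Object Name
# Events linger in the queue until removed manually.
$events | Remove-Event
```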
If the files are larger/take longer to generate, and because you only care once they are all done (not when they are created/modified), you may be better off skipping the filewatcher and just checking if all of the files are there. E.g.: the process(es) that generate the files usually finish by 6am, so at 6am you run a script that checks to see if all the files exist. (I would also check the file size/last-modified dttm to help ensure that the process which generates each file is done writing to it.) You still may want to build a loop into this, especially if you want the processing to start as soon as the files are done.
$filesIWatch = @()
$filesIWatch += New-Object -TypeName psobject -Property @{"FullName"="C:\TestFile1.txt"
    "AvgSize"=100}
$filesIWatch += New-Object -TypeName psobject -Property @{"FullName"="C:\TestFile2.txt"
    "AvgSize"=100}
[bool]$filesDone = $true
foreach($file in $filesIWatch){
    # Check if the file is there. Also, maybe check the file size/modified dttm
    # to help be sure that some other process is not still writing to the file.
    if(-not (Test-Path ($file.FullName))){
        $filesDone = $false
    }
}
if($filesDone){
    # do your processing
}
else{
    # throw error/handle them not being done
}
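If you do build the loop the answer mentions, the existence check above can be wrapped in a simple poll; a minimal sketch, with illustrative deadline and interval values:

```powershell
# Re-run the existence check every minute until all files arrive or a deadline passes.
$deadline = (Get-Date).AddMinutes(30)   # illustrative timeout
do {
    $filesDone = $true
    foreach ($file in $filesIWatch) {
        if (-not (Test-Path $file.FullName)) { $filesDone = $false }
    }
    if (-not $filesDone) { Start-Sleep -Seconds 60 }
} while (-not $filesDone -and (Get-Date) -lt $deadline)
```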

Related

Mirroring folders with slight modifications

I have to "almost" mirror two folders.
One folder contains the output from a tool with the file format *_output.txt. Files are continuously added here as the tool runs and it will produce hundreds of thousands of files.
The other one should contain the same files as input for another tool, but that tool expects the format to be *_input.txt
My current solution is a powershell script that loops through the first folder, checks if the renamed file exists and if it doesn't, copies and renames it with Copy-Item. This, however, is proving very inefficient once the file number goes high enough. I would like to improve this.
Is it possible to somehow make use of robocopy's /MIR and also rename files in the second folder? I would like to prevent the original files being mirrored if a renamed file exists.
Is such a thing possible?
You could use FileSystemWatcher:
$watcher = New-Object -TypeName System.IO.FileSystemWatcher -Property @{Path='c:\temp\lib'; Filter='*_output.txt'; NotifyFilter=[IO.NotifyFilters]'FileName, LastWrite'};
Register-ObjectEvent -InputObject $watcher -EventName Created -Action {Copy-Item $event.SourceEventArgs.FullPath $($event.SourceEventArgs.FullPath.Replace("output","input"))};
$watcher.EnableRaisingEvents = $true
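As for robocopy: /MIR cannot rename files as it copies, so if the per-file Test-Path lookups are the bottleneck, an alternative is to list the destination's names into a HashSet once per pass and test membership in memory. A rough sketch (folder paths are placeholders):

```powershell
# One pass: enumerate the destination once, then copy/rename only what is missing.
$src = 'c:\temp\lib'       # placeholder source folder
$dst = 'c:\temp\input'     # placeholder destination folder
$existing = [System.Collections.Generic.HashSet[string]]::new(
    [string[]](Get-ChildItem $dst -Filter '*_input.txt' -Name),
    [System.StringComparer]::OrdinalIgnoreCase)
Get-ChildItem $src -Filter '*_output.txt' | ForEach-Object {
    $newName = $_.Name.Replace('_output.txt', '_input.txt')
    if (-not $existing.Contains($newName)) {
        Copy-Item $_.FullName (Join-Path $dst $newName)
    }
}
```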

How to use powershell to tell when a series of files have finished being added to network

So I have a powershell script, mainly taken from a similar question here, that can detect when a certain type of file is added to a certain network and send emails when this happens:
Function Watcher{
param ($folder, $filter, $to, $Subject)
$watcher = New-Object IO.FileSystemWatcher $folder, $filter -Property @{
IncludeSubdirectories = $true
EnableRaisingEvents = $true
}
$changeAction = [scriptblock]::Create('
$path = $Event.SourceEventArgs.FullPath
$name = $Event.SourceEventArgs.Name
$changeType = $Event.SourceEventArgs.ChangeType
$timeStamp = $Event.TimeGenerated
Write-Host "The file $name was $changeType at $timeStamp"
$Body = "The file $name was $changeType at $timeStamp"
Email $to $Subject $Body
')
Register-ObjectEvent $Watcher -EventName "Created" -Action $changeAction
}
However, I want to modify this so that it can be useful for this application: right now there is a tool adding data (.datx files) to the network (several files per minute), and I would like to receive an email notification the moment the data is done being recorded. How can I most easily modify this so that when the initial watcher is triggered, it waits to see if it triggers again, resets if so, but continues if not? Or would creating a whole new script be best? Basically, how can I make the watcher activated by a lone .datx file being uploaded to the network, but not have it triggered by a stream of them (except for the very last one)?
You can use the following batching approach:
Define the length of a sliding time window that resets if a new file is created within it; keep collecting events while new ones arrive within that sliding window.
To prevent the collection from growing indefinitely, define a maximum batch size at which a batch is processed, even if further events are pending.
Once the time window elapses without new events having arrived, process the batch at hand, i.e., the events collected so far, then start a new batch.
Caveat:
The System.IO.FileSystemWatcher class can report duplicate events.
The code below eliminates duplicates in a given batch, but not across batches, which would require quite a bit more effort - see the source-code comments.
Implementation notes:
Instead of using an -Action script block passed to Register-ObjectEvent to process the events, they are processed synchronously - with a timeout - in a Wait-Event loop.
Wait-Event uses PowerShell's event queue and therefore usually doesn't miss events (although that can happen at the .NET level in high-volume situations); by contrast, the FileSystemWatcher's similar WaitForChanged method does not queue events and only reports a - single - event, if one happens to arrive while the method waits.
try {
# Specify the target folder: the system's temp folder in this example.
$dir = (Get-Item -EA Ignore temp:).FullName; if (-not $dir) { $dir = $env:TEMP }
# Create and initialize the watcher.
# Note the [ordered] to ensure that .EnableRaisingEvents is set last.
$watcher = [System.IO.FileSystemWatcher] [ordered] @{
Filter = '*.datx'
Path = $dir
EnableRaisingEvents = $true
}
# To simulate file creation, create *.datx files in the folder
# mentioned in the following status message.
Write-Host "Watching $dir for creation of $($watcher.Filter) files..."
# Register for (subscribe to) creation events:
# Determine a unique event-source ID...
[string] $sourceId = New-Guid
# ... and register for the watcher's Created event with it.
Register-ObjectEvent $watcher -EventName Created -SourceIdentifier $sourceId
# Initialize the ordered hashtable that collects all file names for a single
# batch.
# Note: Since any given file creation can trigger *multiple* events, we
# use an ordered hashtable (dictionary) to ensure that each file is
# only reported once.
# However, *across batches* duplicates can still occur - see below.
$batch = [ordered] @{}
# Determine the sliding time window during which newly created files are
# considered part of a single batch.
# That is, once a file has been created, each additional file created
# within that time window relative to previous file becomes part of the
# same batch.
# When a time window elapses without a new file having been created, the
# batch is considered complete and processed - see max. batch-size exception
# below.
# IMPORTANT:
# * The granularity is *seconds*, so the time window must be at least 1 sec.
# * Seemingly independently of the length of this window, duplicate events
# are likely to occur across batches the less time has elapsed between
# the end of a batch and the start of a new one - see below.
$batchTimeWindowSecs = 5
# How many names to allow a batch to contain at most, even if more
# files keep getting created in the sliding time window.
$maxBatchSize = 100
while ($true) {
# Run indefinitely; use Ctrl-C to exit.
# Wait for events in a sliding time window of $batchTimeWindowSecs length.
# Note: Using Wait-Event in a loop (1 event per iteration) is *more*
# predictable than the multi-event collecting Get-Event in terms of
# avoiding duplicates, but duplicates do still occur.
$batch.Clear()
while ($evt = Wait-Event -SourceIdentifier $sourceId -Timeout $batchTimeWindowSecs) {
$evt | Remove-Event # By default, events linger in the queue; they must be removed manually.
# Add the new file's name to the batch (unless already present)
# IMPORTANT:
# * Duplicates can occur both in a single batch and across batches.
# * To truly weed out all duplicates, you'd have to create a session-level
# dictionary of the files' names and their creation timestamps.
# With high-volume file creation, this session-level dictionary could
# grow large; periodic removal of obsolete entries would help.
$batch[$evt.SourceArgs.Name] = $null # dummy value; it is the *keys* that matter.
Write-Host ✔ -NoNewline # status output: signal that a new file was created
# If the max. batch size has been reached, submit the batch now, even if further
# events are pending within the timeout window.
if ($batch.Count -ge $maxBatchSize) {
Write-Warning "Max. batch size of $maxBatchSize reached; force-submitting batch."
break
}
}
# Completed batch available?
if ($batch.Count) {
# Simulate processing the batch.
Write-Host "`nBatch complete: Sending email for the following $($batch.Count) files:`n$($batch.Keys -join "`n")"
# Start a new batch.
$batch.Clear()
}
else {
Write-Host . -NoNewline # status output: signal that no new files were created in the most recent time window.
}
}
}
finally {
# Clean up:
# Unregister the event subscription.
Unregister-Event -SourceIdentifier $sourceId
# Dispose of the watcher.
$watcher.Dispose()
}
Sample output from creating a batch of 3 files first, then another with 5:
Watching C:\Users\jdoe\AppData\Local\Temp for creation of *.datx files...
............✔✔✔
Batch complete: Sending email for the following 3 files:
1.datx
2.datx
3.datx
.✔✔✔✔✔
Batch complete: Sending email for the following 5 files:
4.datx
5.datx
6.datx
7.datx
8.datx
....................

Programmatically Moving Emails Efficiently

I'm writing a script that moves all of my read emails older than 2 weeks to a separate PST for archiving. Once it is acceptable, I'll execute it via a rule.
However, my current code takes a very long time to complete (about 8 minutes), while simply doing a drag and drop in Outlook is phenomenally quicker.
Does anyone know of a better way to move large amounts of emails? Maybe via accessing Outlook's index?
Add-Type -AssemblyName "Microsoft.Office.Interop.Outlook"
$Outlook=New-Object -ComObject Outlook.Application
$Namespace = $Outlook.GetNameSpace("MAPI")
$Items=1
while ($Items -gt 0)
{
$Items=0
$SourceFolder = $Namespace.Folders.Item($SourcePSTName).Folders.Item($Folder)
$TargetFolder = $Namespace.Folders.Item($TargetPSTName).Folders.Item($Folder)
$AllOfDem=($SourceFolder.Items | where {$_.SentOn -lt $SentMaxDate -and $_.Unread -eq $False})
foreach ($Mail in $AllOfDem)
{
$Mail.Move($TargetFolder) | Out-Null
$Items++
}
}
I suspect your problem is not so much moving the messages (which can be optimized using Extended MAPI or Redemption (I am its author) to move all messages in a single call), but rather looping through all items in a folder - that is a huge problem.
Instead of looping, use Items.Find/FindNext or Items.Restrict to provide a query that only returns the matching items.
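A hedged sketch of what the Restrict approach might look like against the loop in the question ($SourceFolder, $TargetFolder and $SentMaxDate as defined there; the Jet filter syntax is worth verifying against your locale's date format):

```powershell
# Build a Jet filter so Outlook returns only the matching items,
# instead of piping every item in the folder through Where-Object.
$filter = "[SentOn] < '" + $SentMaxDate.ToString('g') + "' AND [Unread] = False"
$found = $SourceFolder.Items.Restrict($filter)
# Iterate from the end: moving an item re-indexes the collection.
for ($i = $found.Count; $i -ge 1; $i--) {
    $found.Item($i).Move($TargetFolder) | Out-Null
}
```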

Powershell novice looking to create script to trigger notifications on missing files

Long-time lurker, first-time poster. I'm looking (on my own initiative) to see if there is a method by which I can check for missing files that we would expect to receive on a daily basis, and be notified via e-mail.
Our company has what I'd call a relatively unhinged systems infrastructure; since I arrived I've been chipping away here and there, putting in some practices and processes to be more proactive with our monitoring.
Specifically in this case, we receive files via FTP from a vendor that outline our Sales and other data. These files go through some validation and the data is then imported into our ERP platform. However, I am interested in putting in a check that raises an alert when a file has not been received when expected.
The last part of that requirement can potentially change; I'm not sure how specific I can get when trying to raise an alert from an expected file.
I'll outline this by stating I'm a relative novice in this area, but there is really no one in my department any the wiser. So I've been looking into PowerShell.
I've created the following two bits of code so far, which when executed appear to return files that have been created/last written within the last day. This would even be enough, to have this output sent via e-mail; I would be able to spot quickly if an expected file is not in the list.
Get-ChildItem -Path "Path I am checking" |
Where-Object {$_.LastWriteTime -gt (Get-Date).AddDays(-1)}
The above returns one .csv file. I guess if I get a returned file, then I know it's been provided, and if the return is blank/zero, then I know I didn't get a file.
I've used the above for four separate checks, checking other subfolders in the structure.
To outline the folder structure
\"App server"\"Region"\"Vendor"
There are then the following subfolders
Purchases
Sales
Tenders
VAT
Each of the above four folders then has
Incoming
Processed
I am running my checks on the processed folder for each of the four folder outlined above.
Maybe something like this will help you out:
Function Test-NewerFiles {
# We use parameters as it makes things easy when we need to change things
# CmdLetBinding makes sure that we can see our 'Write-Verbose' messages if we want to
[CmdLetBinding()]
Param (
[String]$Path = 'C:\Users\me\Downloads\Input_Test',
[String]$ExportFile = 'C:\Users\me\Downloads\Log_Test\Attachment.txt'
)
# We first save the date, then we don't need to do this every time again
$CompareDate = (Get-Date).AddDays(-1)
# Then we collect only the folders and check each folder for files and count them
Get-ChildItem -Path $Path -Directory -Recurse | ForEach-Object {
$Files = (Get-ChildItem -Path $_.FullName -File | Where-Object {$_.LastWritetime -gt $CompareDate} | Measure-Object).Count
# If we didn't find files the count is 0 and we report this
if ($Files -eq 0) {
Write-Verbose "No files found in folder $($_.FullName)"
Write-Output $_.FullName
}
# If we found files it's ok and we don't report it
else {
Write-Verbose "Files found in folder $($_.FullName)"
}
}
}
# If you don't want to see output you can remove the '-Verbose' switch
Test-NewerFiles -Verbose
$MyNewFiles = Test-NewerFiles
# $ExportFile is a parameter inside the function, so define it here too for the calls below
$ExportFile = 'C:\Users\me\Downloads\Log_Test\Attachment.txt'
$MyNewFiles | Out-File -FilePath $ExportFile -Encoding utf8
$MailParams = @{
    To = 'Chuck.Norris@world.com'
    From = 'MyScriptServer@world.com'
    SmtpServer = 'SMTPServer'
}
if ($MyNewFiles) {
    Send-MailMessage @MailParams -Priority High -Attachments $ExportFile -Body 'We found problems: check attachment for details'
}
else {
    Send-MailMessage @MailParams -Priority Low -Body 'All is ok'
}
The Verbose switch is only used to report progress. So we can see what it does when it's running. But when we use this code in production, we don't need these messages and just use Test-NewerFiles instead of Test-NewerFiles -Verbose.

How do I pass option flags to Folder.CopyHere in PowerShell?

I am trying to write a script that automatically and silently moves a bunch of fonts into the Fonts special folder so they are available as if you had "installed" them from Explorer (by dragging and dropping, copying, or right-click and choosing Install). I have the Shell.Application part down all the way to the copy.
$FONTS = 0x14
$shell = New-Object -ComObject Shell.Application
$source = $shell.Namespace($downloaded_path)
$target = $shell.Namespace($FONTS)
$target.CopyHere($source.Items())
However, some systems may already have the fonts installed and I want the progress dialog to be hidden and any prompts to be silently accepted.
So, I'm investigating the Folder.CopyHere option flags.
4 Do not display a progress dialog box
16 Respond with "Yes to All" for any dialog box that is displayed.
I hope they are supported in this folder (some options are ignored by design). And I think these are in decimal, right? Do they need to be converted? However I pass them in, I still see both dialogs. I have tried
$options = 4 <-- don't expect int to work
$options = 0x4 <-- thought hexadecimal would be ok, the VB documentation shows &H4&
$options = "4" <-- string's the thing?
$options = [byte]4 <-- no luck with bytes
$options = [variant]4 <-- this isn't even a type accelerator!
And, if I can get one option working, how do I get both working? Do I bor them together? What about the formatting?
$options = 4 -bor 16
Or do I add them or convert them to hex?
$options = "{0:X}" -f (4 + 16)
You can use 4 -bor 16. It is hard to tell what this method expects since the type is VARIANT. I would have thought that it would take an integer value. If that doesn't work, this comment from the MSDN topic on Folder.CopyHere implies that a string should work:
function CopyFileProgress
{
    param( $Source, $DstFolder, $CopyType = 0 )
    # Convert the decimal to hex
    $copyFlag = [String]::Format("{0:x}", $CopyType)
    $objShell = New-Object -ComObject "Shell.Application"
    $objFolder = $objShell.NameSpace($DstFolder)
    $objFolder.CopyHere($Source, $copyFlag)
}
Although I wonder if the format string should be "0x{0:x}"?
Just be aware that for normal .NET flags-style enums, you can pass multiple flags to a .NET method (or command parameter) that is strongly typed to the enum like so:
$srv.ReplicationServer.Script('Creation,SomeOtherValue')
Oisin has written up some info on this subject in this blog post.
I had the same problem and found this in another thread; it worked perfectly for me.
If you want it to overwrite AND be silent change 0x10 to 0x14 (docs).
$destinationFolder.CopyHere($zipPackage.Items(), 0x14)
The Folder.CopyHere option flags may simply not work. This makes me sad. I'll have to investigate one of these other methods, all of which leave me in a bit of a bind.
Separate Process
Invoke the copy in a new process and hide the window using the ProcessStartInfo properties. I haven't implemented this yet, but I wonder if it will address the user-prompting for overwriting existing files?
Dim iProcess As New System.Diagnostics.ProcessStartInfo(AppDomain.CurrentDomain.BaseDirectory + "unzip.exe")
iProcess.CreateNoWindow = True
Dim sArgs As String = ZippedFile
iProcess.Arguments = sArgs
iProcess.WindowStyle = ProcessWindowStyle.Hidden
Dim p As New System.Diagnostics.Process
iProcess.UseShellExecute = False
p = System.Diagnostics.Process.Start(iProcess)
p.WaitForExit(30000)
Dim s As Integer = p.ExitCode
iProcess.UseShellExecute = True
p.Dispose()
iProcess = Nothing
For Loop
Only copy non-existing items. This seems to fall down when I actually want to update an existing font with a new font file of the same name.
foreach($File in $Fontdir) {
$fontName = $File.Name.Replace(".ttf", " Regular")
$objFolderItem = $objFolder.ParseName($fontName);
if (!$objFolderItem) {
$objFolder.CopyHere($File.fullname,0x14)
}
}
Remove Existing
I'm thinking of removing all fonts of the same name as the ones I'm copying, then copying the set. Although that's kind of brutal. And I believe that there's another prompt if that font cannot be deleted because it's in use. sigh
The copy flags don't work for me. I set up a job in the install-fonts script that detects the "Installing Fonts" window and sends {Enter} to it so I am not overwriting existing fonts.
Start-Job -Name DetectAndClosePrompt -Scriptblock {
    $i=1
    [void] [System.Reflection.Assembly]::LoadWithPartialName("System.Windows.Forms")
    [void] [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.VisualBasic")
while ($i -eq 1) {
$windowPrompt = Get-Process -ErrorAction SilentlyContinue |? {$_.MainWindowTitle -like "*Installing Fonts*"}
[Microsoft.VisualBasic.Interaction]::AppActivate($windowPrompt.ID)
[System.Windows.Forms.SendKeys]::SendWait("{Enter}")
sleep 2
}
}
After all fonts are copied/installed... I remove the job, by name.
Get-Job DetectAndClosePrompt | Remove-Job -Force
That works for me on Windows 7, 8.x, & 10.
I'm seeing a number of Unzip folder operations, but really no one writing a solution to fit the Fonts folder situation. So I wrote my own! As it turns out, the Fonts folder does implement the Shell.Folder.CopyHere method, but does not honor any overloads passed for the second argument of the method. Why? Who knows! I suspect Raymond Chen of 'The Old new Thing' Windows Developer blog could explain it, but I don't know the answer. So we need instead to intelligently look for our fonts before trying to copy them, or we'll get a nasty message.
In my code, we check to see a font exists or not by checking for a match on the first four characters of the font name with a wildcard search. If the font doesn't exist, we assume this is the first time we're installing fonts on this system and set a special flag called $FirstInstall.
From then on in the script, if $FirstInstall is true, we install every font in the source font directory. On subsequent executions, we check to see if each font is a match, and if so, we abort that copy. If not, we go ahead and copy. This seems to work for most of my clients, thus far.
Here you go!
<#
.SYNOPSIS
Script to quietly handle the installation of fonts from a network source to a system
.DESCRIPTION
We can't just move files into the %windir%\Fonts directory with a script, as a simple copy-paste from the command line doesn't trigger Windows to note the new font.
If we used that approach, the files would exist within the directory, but the font files wouldn't be registered in Windows, nor would applications
display the new font for use. Instead, we can make a new object of the Shell.Application type (effectively an invisible Windows Explorer window) and use its Copy method,
which is the functional equivalent of dragging and dropping font files into the Fonts folder. That does trigger the font to be installed, the same as if you right-clicked the font
and chose Install.
.PARAMETER FontPath
The path of a folder where fonts reside on the network
.EXAMPLE
.\Install-Fonts.ps1 -FontPath "\\corp\fileshare\Scripts\Fonts"
Installing font...C:\temp\Noto\NotoSans-Bold.ttf
Installing font...C:\temp\Noto\NotoSans-BoldItalic.ttf
Installing font...C:\temp\Noto\NotoSans-Italic.ttf
Installing font...C:\temp\Noto\NotoSans-Regular.ttf
In this case, the fonts are copied from the network down to the system and installed silently, minus the logging seen here
.EXAMPLE
.\Install-Fonts.ps1 -FontPath "\\corp\fileshare\Scripts\Fonts"
Font already exists, skipping
Font already exists, skipping
Font already exists, skipping
Font already exists, skipping
In this case, the fonts already existed on the system. Rather than display an annoying 'Overwrite font' dialog, we simply abort the copy and try the next file
.INPUTS
String.
.OUTPUTS
Console output
.NOTES
CREATED: 06/11/2015
Author: sowen@ivision.com
MODIFIED:06/11/2015
Author: sowen@ivision.com - Reserved...
#>
param
(
[Parameter(Mandatory)][string]$FontPath="C:\temp\Noto"
)
#0x14 is a special system folder pointer to the path where fonts live, and is needed below.
$FONTS = 0x14
#Make a reference to Shell.Application
$objShell = New-Object -ComObject Shell.Application
$objFolder = $objShell.Namespace($FONTS)
ForEach ($font in (dir $FontPath -Recurse -Include *.ttf,*.otf)){
    #check for existing font (to suppress the annoying 'do you want to overwrite' dialog box)
    if ((($objShell.NameSpace($FONTS).Items() | where Name -like "$($font.BaseName.Split('-')[0].Substring(0,4))*") | measure).Count -eq 0){
        $firstInstall = $true
    }
    if ($firstInstall -ne $true) {Write-Output "Font already exists, skipping"}
    else{
        $objFolder.CopyHere($font.FullName)
        Write-Output "Installing font...$($font.FullName)"
        $firstInstall = $true
    }
}
.\Install-Fonts.ps1 -FontPath "\\corp\fileshare\Scripts\Fonts"
There are several issues with @FoxDeploy's answer, which is why it is not working. The first issue is that you also want to check the Fonts folder in %USERPROFILE%, or you would get a confirmation dialog. The second issue is that you want to avoid assuming '-' in the font name.
Below is the fixed version that installs fonts from CodeFonts repo as an example:
$ErrorActionPreference = "Stop"
Add-Type -AssemblyName System.Drawing
# Clone chrissimpkins/codeface from which we will install fonts
if (!(Test-Path /GitHubSrc/codeface)){
git clone git://github.com/chrissimpkins/codeface.git /GitHubSrc/codeface
}
#0x14 is a special system folder pointer to the path where fonts live, and is needed below.
$FONTS = 0x14
$fontCollection = new-object System.Drawing.Text.PrivateFontCollection
#Make a reference to Shell.Application
$objShell = New-Object -ComObject Shell.Application
$objFolder = $objShell.Namespace($FONTS)
# local path
$localSysPath = "$Env:USERPROFILE\AppData\Local\Microsoft\Windows\Fonts"
$localSysFonts = Get-ChildItem -Path $localSysPath -Recurse -File -Name | ForEach-Object -Process {[System.IO.Path]::GetFileNameWithoutExtension($_)}
$fontsPath="\GitHubSrc\codeface\fonts"
ForEach ($font in (dir $fontsPath -Recurse -Include *.ttf,*.otf)){
if ($localSysFonts -like $font.BaseName) {
Write-Output "SKIP: Font ${font} already exists in ${localSysPath}"
}
else {
$fontCollection.AddFontFile($font.FullName)
$fontName = $fontCollection.Families[-1].Name
#check for existing font (to suppress annoying 'do you want to overwrite' dialog box
if ((($objShell.NameSpace($FONTS).Items() | where Name -ieq $fontName) | measure).Count -eq 0){
Write-Output "INST: Font ${font}"
$objFolder.CopyHere($font.FullName)
$firstInstall = $true
}
else {
Write-Output "SKIP: Font ${font} already exists in SYSTEM FONTS"
}
}
# Read-Host -Prompt "Press Enter to continue"
}
You can just take the sum of your options. I needed to run CopyHere with two options - SILENT and NOCONFIRMATION. Look at the sample below:
function Unzip-Archive($targetpath, $destination)
{
$shell_app=new-object -com shell.application
$FOF_SILENT_FLAG = 4
$FOF_NOCONFIRMATION_FLAG = 16
$zip_file = $shell_app.namespace("$targetpath")
#Set the destination directory for the extracts
$destination = $shell_app.namespace("$destination")
#unzip the files
$destination.Copyhere($zip_file.items(), $FOF_SILENT_FLAG + $FOF_NOCONFIRMATION_FLAG)
}
I just got this to work by simply using +, i.e.:
function Expand-ZIPFile($file, $destination)
{
$shell = new-object -com shell.application
$zip = $shell.NameSpace($file)
foreach($item in $zip.items())
{
$shell.Namespace($destination).copyhere($item, 16+1024)
}
}