How to compare files under 2 folders with PowerShell

I am a newbie in PowerShell, trying to learn from a few forums and MSDN. Now I have a requirement from my group of learners.
I am trying to compare the files of 2 folders with each other in PowerShell; for effective file comparison I am using MD5 hashes.
So far I have created code like this:
[Cmdletbinding()]
Param
(
[Parameter(Position=0, Mandatory)][ValidateScript({ Test-Path -Path $_ })][string]$SourceFolder,
[Parameter(Position=1, Mandatory)][ValidateScript({ Test-Path -Path $_ })][string]$DestinationFolder
)
$SourceFolderList = @()
$DestinationFolderList = @()
$Sourcefiles = @(Get-ChildItem -Path $SourceFolder -Filter *.log)
foreach($srcFile in $Sourcefiles )
{
$SourceFolderHash = [ordered]@{}
$SourceFolderHash.Name = $srcFile.Name
$SourceFolderHash.FullName = $srcFile.FullName
$obj = New-Object PSObject -Property $SourceFolderHash
$SourceFolderList+= $obj
}
$Destfiles = @(Get-ChildItem -Path $DestinationFolder -Filter *.log)
foreach($Destfile in $Destfiles )
{
$DestinationFolderHash = [ordered]@{}
$DestinationFolderHash.Name = $Destfile.Name
$DestinationFolderHash.FullName = $Destfile.FullName
$obj = New-Object PSObject -Property $DestinationFolderHash
$DestinationFolderList+= $obj
}
$SourceFolderList and $DestinationFolderList are arrays with Name and FullName properties.
Now I am trying to create a new array with the values that match between $SourceFolderList and $DestinationFolderList (I hope I am going about this the right way?!).
But the problem is, I am not sure how to loop through each item in the arrays and get the full names of each file from the 2 folders to pass as parameters to my MD5 hash function.
I am trying it these two ways:
##1
For ($i =$j=0; $i -le $SourceFolderList.Count -and $j -le $DestinationFolderList.Count; $i++ -and $j++)
{
$file1Name = $SourceFolderList[$i].Name
$file1Path = $SourceFolderList[$i].FullName
$file2Name = $DestinationFolderList[$j].Name
$file2Path = $DestinationFolderList[$j].FullName
}
##2
foreach( $file in $SourceFolderList)
{
if($DestinationFolderList.Name -contains $file.Name )
{
Write-Host $file.Name -ForegroundColor Cyan
Write-Host $DestinationFolderList.($file.Name).FullName -ForegroundColor Yellow
}
}
With the 1st way I am not getting the correct file paths (the index is mismatched for the destination folder's file paths).
With the 2nd way I am not getting the full path of the file at all.
Please correct me if I am going about my requirement the wrong way, and please help me solve this issue.
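(For reference, one way to pair the entries by Name between the two arrays, a sketch that is not part of the original post: index the destination list in a hashtable keyed by file name, then look up each source file.)
$destByName = @{}
foreach ($d in $DestinationFolderList) { $destByName[$d.Name] = $d.FullName }
foreach ($s in $SourceFolderList) {
    if ($destByName.ContainsKey($s.Name)) {
        $srcPath = $s.FullName
        $dstPath = $destByName[$s.Name]
        # $srcPath and $dstPath can now be passed to an MD5 hash function,
        # e.g. (Get-FileHash -Path $srcPath -Algorithm MD5).Hash
    }
}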

I think you're making your task more difficult than it is by gathering file info into the arrays. Why don't you just iterate over the files in the source folder and compare their hashes with the hashes of files in the destination folder on the fly:
function Compare-Folders
{
[CmdletBinding()]
Param
(
[Parameter(Mandatory = $true, ValueFromPipeline = $true, ValueFromPipelineByPropertyName = $true)]
[string]$Source,
[Parameter(Mandatory = $true, ValueFromPipelineByPropertyName = $true)]
[string]$Destination,
[Parameter(ValueFromPipelineByPropertyName = $true)]
[string]$Filter
)
Process
{
# Iterate over files in source folder, skip folders
Get-ChildItem -Path $Source -Filter $Filter | Where-Object {!$_.PsIsContainer} | ForEach-Object {
# Generate file name in destination folder
$DstFileName = Resolve-Path -Path (Join-Path -Path $Destination -ChildPath (Split-Path -Path $_.FullName -Leaf))
# Create hashtable with filenames and hashes
$Result = @{
SourceFile = $_.FullName
SourceFileHash = (Get-FileHash -Path $_.FullName -Algorithm MD5).Hash
DestinationFile = $DstFileName
DestinationFileHash = (Get-FileHash -Path $DstFileName -Algorithm MD5).Hash
}
# Check if file hashes are equal and add result to hashtable
$Result.Add('IsEqual', ($Result.SourceFileHash -eq $Result.DestinationFileHash))
# Output PsObject from hashtable
New-Object -TypeName psobject -Property $Result |
Select-Object -Property SourceFile, SourceFileHash , DestinationFile, DestinationFileHash, IsEqual
}
}
}
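A call might then look like this (a sketch with hypothetical folder paths, keeping only the mismatched files):
# compare all .log files in two folders and show only the ones whose MD5 hashes differ
Compare-Folders -Source 'C:\Logs\Source' -Destination 'C:\Logs\Backup' -Filter '*.log' |
    Where-Object { -not $_.IsEqual }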

Related

How to write an advanced function to select path and sort files?

function Sort-Size {
[CmdletBinding()]
param (
[Parameter(Mandatory)]
[string] $Name,
[Parameter(ValueFromPipeline = $true)]
$Path,
$Lenght
)
begin {
$Lenght = @()
}
process {
$Path = Get-ChildItem -recurse -File $Name
}
end {
$Path | Select-Object FullName, @{Name='FileSizeInKb';Expression={$_.Length/1KB}} | Sort-Object -Property FileSizeInKb | Format-Table -AutoSize
}
}
Sort-Size -Name {Test-Path -Name "C:\"}
This code only sorts files in one folder; how can I make it sort files across different folders?
For this you don't need parameters Name and Length.
Instead, I would add an optional file pattern to look for, and a switch for whether or not you want the function to recurse through the various subfolders.
Also, I would not have it return the DISPLAY-ONLY output of Format-Table, but the sorted list of files itself. Then, when calling the function, you can decide if you want to pipe that to Format-* or do something else with the data.
Something like below:
function Sort-Size {
[CmdletBinding()]
param (
[Parameter(ValueFromPipeline = $true, ValueFromPipelineByPropertyName = $true)]
[ValidateScript({ $_ | Test-Path -PathType Container })]
[Alias('FullName')]
[string]$Path,
[string]$Pattern = '*.*',
[switch]$Recurse
)
Write-Verbose "Searching files in '$Path' with pattern '$Pattern'"
Get-ChildItem -Path $Path -File -Filter $Pattern -Recurse:$Recurse |
Select-Object Name, DirectoryName, @{Name='FileSizeInKb';Expression={[Math]::Round($_.Length/1KB, 2)}} |
Sort-Object FileSizeInKb
}
Sort-Size -Path "C:\Users\user" | Format-Table -Property Name,FileSizeInKb -AutoSize
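To sort files from several different folders in one combined list, one option (a sketch assuming the function above; the folder paths are placeholders) is to call it once per folder and re-sort the combined output:
'C:\Folder1','C:\Folder2' | ForEach-Object { Sort-Size -Path $_ -Recurse } |
    Sort-Object -Property FileSizeInKb |
    Format-Table -Property Name, DirectoryName, FileSizeInKb -AutoSize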
function Sort-Size {
[CmdletBinding()]
param (
[Parameter(ValueFromPipeline = $true)]
$Path,
$Lenght,
$Name
)
begin {
$Lenght = @()
}
process {
$Path = Get-ChildItem -recurse -File "$Name"
}
end {
$Path | Select-Object FullName, @{Name='FileSizeInKb';Expression={$_.Length/1KB}} | Sort-Object -Property FileSizeInKb | Format-Table -AutoSize
}
}
#Sort-Size -Path "C:\Users"
Sort-Size -Name "C:\Users\user"
#Get-Help "Sort-size"

How to get a last write time object using Select-Object in PowerShell?

I have some folders; inside each folder there is a file that contains an ID, and I need to select one of the folders by last write time. I tried this way:
Function Test
{
[CmdletBinding()]
param(
[Parameter(Mandatory = $true, Position = 0)]
$Path,
[Parameter(Mandatory = $true, Position = 2)]
[string]$Pattern
)
$global:Result = '' | Select-Object @{Name = 'Exists'; Expression = {$false}}, FileName, Directory, @{Name = 'Attempts'; Expression = {1}}
$file = Select-String -Path $Path -Pattern $Pattern -SimpleMatch -ErrorAction SilentlyContinue | Select-Object -First 1
if ($file) {
$file = Get-Item -Path $file.Path
$global:Result = [PSCustomObject]@{
Exists = $true
FileName = $file.FullName
Directory = $file.DirectoryName
Attempts = 1
}
}
else {
Write-Host "Not Found"
}
}
$ID = "8538"
$IDName = "ID_LST"
$Path = "D:\Folder\*\$IDName\"
Test -Path $Path -Pattern "$ID"
$global:Result | Format-List
The result does not select the folder based on the last write time, but based on the integer in the folder name.
There should be no need to use $global variables because the function simply outputs the (array) of objects.
Also, I would recommend changing the function name to comply with the Verb-Noun convention.
Something like this should work:
function Test-Pattern {
[CmdletBinding()]
param(
[Parameter(Mandatory = $true, Position = 0)]
[string]$Path,
[Parameter(Mandatory = $true, Position = 1)]
[string]$Pattern
)
Get-ChildItem -Path $Path -Recurse -Filter '*.*' | ForEach-Object {
if (Select-String -Path $_.FullName -Pattern $Pattern -SimpleMatch -ErrorAction SilentlyContinue) {
[PSCustomObject]@{
Exists = $true
FileName = $_.FullName
Directory = $_.DirectoryName
LastWriteTime = $_.LastWriteTime
Attempts = 1
}
}
else {
Write-Host "Pattern '$Pattern' not found in file $($_.FullName)"
}
}
}
$ID = "8538"
$IDName = "ID_LST"
$Path = "D:\Folder\*\$IDName"
# now use the function to find files that contain the pattern
# and capture the result in a variable.
$result = Test-Pattern -Path $Path -Pattern $ID
# output on screen
$result | Format-List
# output to csv file
$result | Export-Csv -Path 'D:\patternsearch.csv' -NoTypeInformation
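To then pick the single match with the most recent last write time, building on the $result output above:
# newest match first, keep only the top one
$newest = $result | Sort-Object -Property LastWriteTime -Descending | Select-Object -First 1
$newest | Format-List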
If this is not what you need, then please let me know; I will do my best to understand. This does not work right now, but I have some questions. What is in $SSID? Is your goal to find wireless SSIDs in a folder?
Function Test
{
[CmdletBinding()]
param(
[Parameter(Mandatory = $true, Position = 0)]
$Path,
[Parameter(Mandatory = $true, Position = 1)] # Position = 2 needs to be Position = 1
[string]$Pattern
)
# It looks like you are making an object twice. You do not need this; the PSCustomObject will do the job.
# $global:Result = '' | Select-Object @{Name = 'Exists'; Expression = {$false}}, FileName, Directory, @{Name = 'Attempts'; Expression = {1}}
$file = Select-String -Path $Path -Pattern $Pattern -SimpleMatch -ErrorAction SilentlyContinue # | Select-Object -First 1 # We will need more than one so we can sort it.
if ($file) {
Foreach ($f in $file) { # Feed one object at a time to make your PSObject.
$myfile = Get-Item -Path $f.Path
$global:Result = [PSCustomObject]@{
Exists = $true
FileName = $myfile.FullName
Directory = $myfile.DirectoryName
LastWriteTime = $myfile.LastWritetime
Attempts = 1 # Attempts will always equal 1. I do not know what you are trying to do here, but it will not break the code.
}
$global:Result # output one object at a time
}
}
else {
Write-Host "Not Found"
}
}
Test <# I need help with the input to the function #> | Sort-Object -Property LastWriteTime
<#
$ID = "8538"
$IDName = "ID_LST"
$Path = "D:\Folder\*\$IDName\"
Test-FileWithGui -Path $Path -Pattern "$SSID"
$global:Result | Format-List
#>

PowerShell script is printing out duplicate entries of the same path

My objective is to write a PowerShell script that will recursively check a file server for any directories that are "x" days old or older.
I ran into a few issues initially, and I think I got most of it worked out. One of the issues I ran into was with the path limitation of 248 characters. I found a custom function that I am implementing in my code to bypass this limitation.
The end result is that I would like to output the path and LastAccessTime of each folder and export the information into an easy-to-read CSV file.
Currently everything is working properly, but for some reason some paths are output several times (duplicates, triplicates, even 4 times). I just want each directory and subdirectory output once.
I'd appreciate any guidance I can get. Thanks in advance.
Here's my code
#Add the import and snapin in order to perform AD functions
Add-PSSnapin Quest.ActiveRoles.ADManagement -ea SilentlyContinue
Import-Module ActiveDirectory
#Clear Screen
CLS
Function Get-FolderItem
{
[cmdletbinding(DefaultParameterSetName='Filter')]
Param (
[parameter(Position=0,ValueFromPipeline=$True,ValueFromPipelineByPropertyName=$True)]
[Alias('FullName')]
[string[]]$Path = $PWD,
[parameter(ParameterSetName='Filter')]
[string[]]$Filter = '*.*',
[parameter(ParameterSetName='Exclude')]
[string[]]$ExcludeFile,
[parameter()]
[int]$MaxAge,
[parameter()]
[int]$MinAge
)
Begin
{
$params = New-Object System.Collections.Arraylist
$params.AddRange(@("/L","/S","/NJH","/BYTES","/FP","/NC","/NFL","/TS","/XJ","/R:0","/W:0"))
If ($PSBoundParameters['MaxAge'])
{
$params.Add("/MaxAge:$MaxAge") | Out-Null
}
If ($PSBoundParameters['MinAge'])
{
$params.Add("/MinAge:$MinAge") | Out-Null
}
}
Process
{
ForEach ($item in $Path)
{
Try
{
$item = (Resolve-Path -LiteralPath $item -ErrorAction Stop).ProviderPath
If (-Not (Test-Path -LiteralPath $item -Type Container -ErrorAction Stop))
{
Write-Warning ("{0} is not a directory and will be skipped" -f $item)
Return
}
If ($PSBoundParameters['ExcludeFile'])
{
$Script = "robocopy `"$item`" NULL $Filter $params /XF $($ExcludeFile -join ',')"
}
Else
{
$Script = "robocopy `"$item`" NULL $Filter $params"
}
Write-Verbose ("Scanning {0}" -f $item)
Invoke-Expression $Script | ForEach {
Try
{
If ($_.Trim() -match "^(?<Children>\d+)\s+(?<FullName>.*)")
{
$object = New-Object PSObject -Property @{
ParentFolder = $matches.fullname -replace '(.*\\).*','$1'
FullName = $matches.FullName
Name = $matches.fullname -replace '.*\\(.*)','$1'
}
$object.pstypenames.insert(0,'System.IO.RobocopyDirectoryInfo')
Write-Output $object
}
Else
{
Write-Verbose ("Not matched: {0}" -f $_)
}
}
Catch
{
Write-Warning ("{0}" -f $_.Exception.Message)
Return
}
}
}
Catch
{
Write-Warning ("{0}" -f $_.Exception.Message)
Return
}
}
}
}
Function ExportFolders
{
#================ Global Variables ================
#Path to folders
$Dir = "\\myFileServer\somedir\blah"
#Get all folders
$ParentDir = Get-ChildItem $Dir | Where-Object {$_.PSIsContainer -eq $True}
#Export file to our destination
$ExportedFile = "c:\temp\dirFolders.csv"
#Duration in Days+ the file hasn't triggered "LastAccessTime"
$duration = 800
$cutOffDate = (Get-Date).AddDays(-$duration)
#Used to hold our information
$results = @()
#=============== Done with Variables ===============
ForEach ($SubDir in $ParentDir)
{
$FolderPath = $SubDir.FullName
$folders = Get-ChildItem -Recurse $FolderPath -force -directory| Where-Object { ($_.LastAccessTimeUtc -le $cutOffDate)} | Select-Object FullName, LastAccessTime
ForEach ($folder in $folders)
{
$folderPath = $folder.fullname
$fixedFolderPaths = ($folderPath | Get-FolderItem).fullname
ForEach ($fixedFolderPath in $fixedFolderPaths)
{
#$fixedFolderPath
$getLastAccessTime = $(Get-Item $fixedFolderPath -force).lastaccesstime
#$getLastAccessTime
$details = #{ "Folder Path" = $fixedFolderPath; "LastAccessTime" = $getLastAccessTime}
$results += New-Object PSObject -Property $details
$results
}
}
}
}
ExportFolders
I updated my code a bit and simplified it. Here is the new code.
#Add the import and snapin in order to perform AD functions
Add-PSSnapin Quest.ActiveRoles.ADManagement -ea SilentlyContinue
Import-Module ActiveDirectory
#Clear Screen
CLS
Function ExportFolders
{
#================ Global Variables ================
#Path to user profiles in Barrington
$Dir = "\\myFileServer\somedir\blah"
#Get all user folders
$ParentDir = Get-ChildItem $Dir | Where-Object {$_.PSIsContainer -eq $True} | where {$_.GetFileSystemInfos().Count -eq 0 -or $_.GetFileSystemInfos().Count -gt 0}
#Export file to our destination
$ExportedFile = "c:\temp\dirFolders.csv"
#Duration in Days+ the file hasn't triggered "LastAccessTime"
$duration = 1
$cutOffDate = (Get-Date).AddDays(-$duration)
#Used to hold our information
$results = @()
$details = $null
#=============== Done with Variables ===============
ForEach ($SubDir in $ParentDir)
{
$FolderName = $SubDir.FullName
$FolderInfo = $(Get-Item $FolderName -force) | Select-Object FullName, LastAccessTime #| ft -HideTableHeaders
$FolderLeafs = gci -Recurse $FolderName -force -directory | Where-Object {$_.PSIsContainer -eq $True} | where {$_.GetFileSystemInfos().Count -eq 0 -or $_.GetFileSystemInfos().Count -gt 0} | Select-Object FullName, LastAccessTime #| ft -HideTableHeaders
$details = #{ "LastAccessTime" = $FolderInfo.LastAccessTime; "Folder Path" = $FolderInfo.FullName}
$results += New-Object PSObject -Property $details
ForEach ($FolderLeaf in $FolderLeafs.fullname)
{
$details = #{ "LastAccessTime" = $(Get-Item $FolderLeaf -force).LastAccessTime; "Folder Path" = $FolderLeaf}
$results += New-Object PSObject -Property $details
}
$results
}
}
ExportFolders
The FolderInfo variable is sometimes printed out multiple times, but the FolderLeaf variable is printed out once, from what I can see. The problem is that if I move or remove the $results variable from under the $details that print out the FolderInfo, then the parent directories don't get printed out; only the subdirectories are shown. Also, some directories are empty and don't get printed out, and I want all directories printed out, including empty ones.
The updated code seems to print all directories fine, but as I mentioned I am still getting some duplicate $FolderInfo entries.
I think I have to put in a condition or something to check if it has already been processed, but I'm not sure which condition I would use to do that, so that it wouldn't print out multiple times.
In your ExportFolders you Get-ChildItem -Recurse and then loop over all of the subfolders calling Get-FolderItem. Then in Get-FolderItem you provide Robocopy with the /S flag in $params.AddRange(@("/L", "/S", "/NJH", "/BYTES", "/FP", "/NC", "/NFL", "/TS", "/XJ", "/R:0", "/W:0")). The /S flag means copy subdirectories, but not empty ones, so you are recursing again. Likely you just need to remove the /S flag, so that you are doing all of your recursion in ExportFolders.
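In other words, the Begin block of Get-FolderItem would build the Robocopy argument list without /S, i.e. the original line with only that flag removed:
$params = New-Object System.Collections.Arraylist
$params.AddRange(@("/L","/NJH","/BYTES","/FP","/NC","/NFL","/TS","/XJ","/R:0","/W:0"))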
In response to the edit:
Your $results is inside of the loop, so you will have n duplicates for the first $SubDir, then n-1 duplicates for the second, and so forth.
ForEach ($SubDir in $ParentDir) {
#skipped code
ForEach ($FolderLeaf in $FolderLeafs.fullname) {
#skipped code
}
$results
}
should be
ForEach ($SubDir in $ParentDir) {
#skipped code
ForEach ($FolderLeaf in $FolderLeafs.fullname) {
#skipped code
}
}
$results

Replace command in PowerShell is deleting the whole content

I am new to PowerShell. I created a PowerShell script which needs to search for a string in the path provided in the parameters and replace that string. But it is actually replacing the entire file content with the new string.
I am using Powershell in Windows 10 OS.
Code:
param(
[Parameter(Mandatory=$true, ParameterSetName="Path", Position=0,HelpMessage='Data folder Path')]
[string] $Path,
[Parameter(Mandatory=$true, HelpMessage='Input the string to be replaced')]
[string] $Input,
[Parameter(Mandatory=$true,HelpMessage='Input the new string that need to be replaced')]
[string] $Replace
)
$a = Test-Path $Path
IF ($a -eq $True) {Write-Host "Path Exists"} ELSE {Write-Host "Path Doesnot exits"}
$configFiles = Get-ChildItem -Path $Path -include *.pro, *.rux -recurse
$Append = join-path -path $path \*
$b = test-path $Append -include *.pro, *.rux
If($b -eq $True) {
foreach ($file in $configFiles)
{
(Get-Content $file.PSPath) |
Foreach-Object { $_ -replace [regex]::Escape($Input), $Replace } |
Set-Content $file.PSPath
}
$wshell = New-Object -ComObject Wscript.Shell
$wshell.Popup("Operation Completed",0,"Done",0x0)
}
As best I can read this without directly reproducing it, this is where it goes wrong:
(get-content $file.pspath) gets the entire content of the file, not its name.
Your "foreach" then regexes every line in the file, and finally "set-content" replaces the contents of the file, not its path.
If you want to change the name of a file, you are looking for Rename-Item, not Set-Content. If you want the name of a file, $file.Name will do; you don't need Get-Content, which will ... get its content :)
This should be a working solution.
Param(
[Parameter(Mandatory,
ParameterSetName='Path',
Position=0,
HelpMessage='Data folder Path')]
[String]
$Path,
[Parameter(Mandatory,
HelpMessage='Input the string to be replaced')]
[String]
$StringToReplace,
[Parameter(Mandatory,
HelpMessage='Input the new string that need to be replaced')]
[String]
$ReplacementString
)
If (!(Test-Path $Path)) {
Write-Host 'Path does not exist'
Return
}
Get-ChildItem -Path $Path -Include *.pro,*.rux -Recurse |
? { $_.Name -like "*$StringToReplace*" } |
% { Rename-Item $_ $($ReplacementString+$_.Extension) }
(New-Object -ComObject Wscript.Shell).Popup("Operation Completed",0,"Done",0x0)
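Saved as a script file (the name Rename-Matches.ps1 below is just an example), it could be run like this:
.\Rename-Matches.ps1 -Path 'C:\Data' -StringToReplace 'OldProject' -ReplacementString 'NewProject'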

Improve performance when searching for a string within multiple word files

I have drafted a PowerShell script that searches for a string among a large number of Word files. The script is working fine, but I have around 1 GB of data to search through and it is taking around 15 minutes.
Can anyone suggest any modifications I can make so that it runs faster?
Set-StrictMode -Version latest
$path = "c:\Tester1"
$output = "c:\Scripts\ResultMatch1.csv"
$application = New-Object -comobject word.application
$application.visible = $False
$findtext = "Roaming"
$charactersAround = 30
$results = @()
Function getStringMatch
{
For ($i=1; $i -le 4; $i++) {
$j="D"+$i
$finalpath=$path+"\"+$j
$files = Get-Childitem $finalpath -Include *.docx,*.doc -Recurse | Where-Object { !($_.psiscontainer) }
# Loop through all *.doc files in the $path directory
Foreach ($file In $files)
{
$document = $application.documents.open($file.FullName,$false,$true)
$range = $document.content
If($range.Text -match ".{$($charactersAround)}$($findtext).{$($charactersAround)}"){
$properties = @{
File = $file.FullName
Match = $findtext
TextAround = $Matches[0]
}
$results += New-Object -TypeName PsCustomObject -Property $properties
}
$document.close() # close the document outside the If, so non-matching files are also closed
}
}
If($results){
$results | Export-Csv $output -NoTypeInformation
}
$application.quit()
}
getStringMatch
import-csv $output
As mentioned in comments, you might want to consider using the OpenXML SDK library (you can also get the newest version of the SDK on GitHub), since it's way less overhead than spinning up an instance of Word.
Below I've turned your current function into a more generic one, using the SDK and with no dependencies on the caller/parent scope:
function Get-WordStringMatch
{
param(
[Parameter(Mandatory,ValueFromPipeline)]
[System.IO.FileInfo[]]$Files,
[string]$FindText,
[int]$CharactersAround
)
begin {
# import the OpenXML library
Add-Type -Path 'C:\Program Files (x86)\Open XML SDK\V2.5\lib\DocumentFormat.OpenXml.dll' |Out-Null
# make a "shorthand" reference to the word document type
$WordDoc = [DocumentFormat.OpenXml.Packaging.WordprocessingDocument] -as [type]
# construct the regex pattern
$Pattern = ".{$CharactersAround}$([regex]::Escape($FindText)).{$CharactersAround}"
}
process {
# loop through all the *.doc(x) files
foreach ($File In $Files)
{
# open document, wrap content stream in streamreader
$Document = $WordDoc::Open($File.FullName, $false)
$DocumentStream = $Document.MainDocumentPart.GetStream()
$DocumentReader = New-Object System.IO.StreamReader $DocumentStream
# read entire document
if($DocumentReader.ReadToEnd() -match $Pattern)
{
# got a match? output our custom object
New-Object psobject -Property @{
File = $File.FullName
Match = $FindText
TextAround = $Matches[0]
}
}
}
}
end{
# Clean up
$DocumentReader.Dispose()
$DocumentStream.Dispose()
$Document.Dispose()
}
}
Now that you have a nice function that supports pipeline input, all you need to do is gather your documents and pipe them to it!
# variables
$path = "c:\Tester1"
$output = "c:\Scripts\ResultMatch1.csv"
$findtext = "Roaming"
$charactersAround = 30
# gather the files
$files = 1..4|ForEach-Object {
$finalpath = Join-Path $path "D$_"
Get-Childitem $finalpath -Recurse | Where-Object { !($_.PsIsContainer) -and @('.docx','.doc') -contains $_.Extension }
}
# run them through our new function
$results = $files |Get-WordStringMatch -FindText $findtext -CharactersAround $charactersAround
# got any results? export it all to CSV
if($results){
$results |Export-Csv -Path $output -NoTypeInformation
}
Since all of our components now support pipelining, you could do it all in one go:
1..4|ForEach-Object {
$finalpath = Join-Path $path "D$_"
Get-Childitem $finalpath -Recurse | Where-Object { !($_.PsIsContainer) -and @('.docx','.doc') -contains $_.Extension }
} |Get-WordStringMatch -FindText $findtext -CharactersAround $charactersAround |Export-Csv -Path $output -NoTypeInformation