Only recurse subfolders X number of levels using PowerShell

The PowerShell script below lists all shared folders (excluding hidden shares), then lists all sub-folders, and finally gets the ACL information of each of them and exports it to a CSV file.
However, I'm trying to limit how many sub-folder levels it drills into. For example, if I set the limit to 3, the script should get the ACL information of only the first three sub-folder levels. How can I do this?
Input:
path=\\server\sharefolder0\subfolder01\subfolder02
path=\\server\sharefolder1\subfolder11\subfolder12\subfolder13\subfolder14
path=\\server\sharefolder2
Expected result:
path=\\server\sharefolder0
path=\\server\sharefolder0\subfolder01
path=\\server\sharefolder0\subfolder01\subfolder02
path=\\server\sharefolder1
path=\\server\sharefolder1\subfolder11
path=\\server\sharefolder1\subfolder11\subfolder12
path=\\server\sharefolder2
This is the code:
$getSRVlist = Get-Content .\server.txt
$outputDirPath = ".\DirPathList.txt"
$outputACLInfo = ".\ACLInfo.csv"
$header = "FolderPath,IdentityReference,Rights"
Remove-Item $outputACLInfo -ErrorAction SilentlyContinue
Add-Content -Value $header -Path $outputACLInfo
foreach ($readSRVlist in $getSRVlist)
{
    # Enumerate shares, excluding hidden ones (names ending in $)
    $getShareInfoList = Get-WmiObject Win32_Share -ComputerName $readSRVlist |
        Where-Object { $_.Name -notlike "*$" } | ForEach-Object { $_.Name }
    foreach ($readShareInfoList in $getShareInfoList)
    {
        $getDirPathList = Get-ChildItem "\\$readSRVlist\$readShareInfoList" -Recurse |
            Where-Object { $_.PSIsContainer }
        foreach ($readDirPathList in $getDirPathList)
        {
            $getACLList = (Get-Acl $readDirPathList.FullName).Access
            foreach ($readACLList in $getACLList)
            {
                $a = $readDirPathList.FullName + "," +
                    $readACLList.IdentityReference + "," + $readACLList.FileSystemRights
                Add-Content -Value $a -Path $outputACLInfo
            }
        }
    }
}

Recursion is your friend. Try this:
$maxDepth = 3
function TraverseFolders($folder, $remainingDepth) {
    Get-ChildItem $folder | Where-Object { $_.PSIsContainer } | ForEach-Object {
        $_.FullName   # process each folder here, e.g. with Get-Acl
        if ($remainingDepth -gt 1) {
            TraverseFolders $_.FullName ($remainingDepth - 1)
        }
    }
}
TraverseFolders "C:\BASE\PATH" $maxDepth
Edit: Now I see what you mean. For checking the first three parent folders of a given path try this:
$server = "\\server\"
$path = ($args[0] -replace [regex]::escape($server), "").Split("\")[0..2]
for ($i = 0; $i -lt $path.Length; $i++) {
    Get-ACL ($server + [string]::join("\", $path[0..$i]))
}
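If the snippet is saved as a script and the UNC path passed as the first argument, usage would look something like this (the script name here is hypothetical):
.\Get-TopLevelAcl.ps1 '\\server\sharefolder1\subfolder11\subfolder12\subfolder13\subfolder14'
It strips the \\server\ prefix, keeps at most the first three path segments, and calls Get-ACL once per level.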

In newer versions of PowerShell (5.0 and later) you can use the -Depth parameter. A one-liner can help:
Get-ChildItem -Path \\server\folder -Depth 2 -Directory | Select-Object -Property Name, FullName
It recurses two levels of nested folders and returns the name and full path of each folder found. Tested on PSVersion 5.1.17134.858.
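To tie this back to the original ACL export, a minimal sketch assuming PowerShell 5.0+ and a reachable share (the share path and output file name are placeholders):
Get-ChildItem -Path \\server\sharefolder0 -Directory -Depth 2 | ForEach-Object {
    $folder = $_.FullName
    # One output row per access rule, matching the FolderPath,IdentityReference,Rights layout above
    (Get-Acl $folder).Access | ForEach-Object {
        [pscustomobject]@{
            FolderPath        = $folder
            IdentityReference = $_.IdentityReference
            Rights            = $_.FileSystemRights
        }
    }
} | Export-Csv .\ACLInfo.csv -NoTypeInformation
Note that -Depth 2 returns the share's immediate child folders plus two more levels, i.e. three folder levels in total.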

Related

Powershell script to compare two directories (including sub directories and contents) that are supposed to be identical but on different servers

I would like to run a PowerShell script that can be supplied a directory name by the user, and will then check that directory, its sub-directories, and all file contents of those directories to confirm they are identical across servers. There are 8 servers that should all have identical files and contents. The code below does not appear to be doing what I intended. I have seen the use of Compare-Object, Get-ChildItem, and Get-FileHash, but have not found the right combination that I am certain actually accomplishes the task. Any and all help is appreciated!
$35 = "\\server1\"
$36 = "\\server2\"
$37 = "\\server3\"
$38 = "\\server4\"
$45 = "\\server5\"
$46 = "\\server6\"
$47 = "\\server7\"
$48 = "\\server8\"
do{
    Write-Host "|1 : New   |"
    Write-Host "|2 : Repeat|"
    Write-Host "|3 : Exit  |"
    $choice = Read-Host -Prompt "Please make a selection"
    switch ($choice){
        1{
            $App = Read-Host -Prompt "Input Directory Application"
        }
        2{
            #rerun
        }
        3{
            exit
        }
    }
    $c35 = $35 + "$App" + "\*"
    $c36 = $36 + "$App" + "\*"
    $c37 = $37 + "$App" + "\*"
    $c38 = $38 + "$App" + "\*"
    $c45 = $45 + "$App" + "\*"
    $c46 = $46 + "$App" + "\*"
    $c47 = $47 + "$App" + "\*"
    $c48 = $48 + "$App" + "\*"
    Write-Host "Comparing Server1 -> Server2"
    if((Get-ChildItem $c35 -Recurse | Get-FileHash | Select-Object Hash,Path).hash -eq (Get-ChildItem $c36 -Recurse | Get-FileHash | Select-Object Hash,Path).hash){"Identical"}else{"NOT Identical"}
    Write-Host "Comparing Server1 -> Server3"
    if((Get-ChildItem $c35 -Recurse | Get-FileHash | Select-Object Hash,Path).hash -eq (Get-ChildItem $c37 -Recurse | Get-FileHash | Select-Object Hash,Path).hash){"Identical"}else{"NOT Identical"}
    Write-Host "Comparing Server1 -> Server4"
    if((Get-ChildItem $c35 -Recurse | Get-FileHash | Select-Object Hash,Path).hash -eq (Get-ChildItem $c38 -Recurse | Get-FileHash | Select-Object Hash,Path).hash){"Identical"}else{"NOT Identical"}
    Write-Host "Comparing Server1 -> Server5"
    if((Get-ChildItem $c35 -Recurse | Get-FileHash | Select-Object Hash,Path).hash -eq (Get-ChildItem $c45 -Recurse | Get-FileHash | Select-Object Hash,Path).hash){"Identical"}else{"NOT Identical"}
    Write-Host "Comparing Server1 -> Server6"
    if((Get-ChildItem $c35 -Recurse | Get-FileHash | Select-Object Hash,Path).hash -eq (Get-ChildItem $c46 -Recurse | Get-FileHash | Select-Object Hash,Path).hash){"Identical"}else{"NOT Identical"}
    Write-Host "Comparing Server1 -> Server7"
    if((Get-ChildItem $c35 -Recurse | Get-FileHash | Select-Object Hash,Path).hash -eq (Get-ChildItem $c47 -Recurse | Get-FileHash | Select-Object Hash,Path).hash){"Identical"}else{"NOT Identical"}
    Write-Host "Comparing Server1 -> Server8"
    if((Get-ChildItem $c35 -Recurse | Get-FileHash | Select-Object Hash,Path).hash -eq (Get-ChildItem $c48 -Recurse | Get-FileHash | Select-Object Hash,Path).hash){"Identical"}else{"NOT Identical"}
} until ($choice -eq 3)
Here is an example function that tries to compare one reference directory against multiple difference directories efficiently. It does so by comparing the most easily available information first and stopping at the first difference.
It gets all relevant information about the files in the reference directory once, including hashes (though this could be optimized further by calculating hashes only when necessary).
For each difference directory, it compares in this order:
1. File count - if different, then obviously the directories are different.
2. Relative file paths - if not all paths from the difference directory can be found in the reference directory, the directories are different.
3. File sizes - should be obvious.
4. File hashes - hashes only need to be calculated for files of equal size.
Function Compare-MultipleDirectories {
    param(
        [Parameter(Mandatory)] [string] $ReferencePath,
        [Parameter(Mandatory)] [string[]] $DifferencePath
    )

    # Get basic file information recursively by calling Get-ChildItem, with the addition of the relative file path
    Function Get-ChildItemRelative {
        param( [Parameter(Mandatory)] [string] $Path )
        Push-Location $Path  # Base path for Get-ChildItem and Resolve-Path
        try {
            Get-ChildItem -File -Recurse |
                Select-Object FullName, Length, @{ n = 'RelativePath'; e = { Resolve-Path $_.FullName -Relative } }
        } finally {
            Pop-Location
        }
    }

    Write-Verbose "Reading reference directory '$ReferencePath'"

    # Create hashtable with all infos of reference directory
    $refFiles = @{}
    Get-ChildItemRelative $ReferencePath |
        Select-Object *, @{ n = 'Hash'; e = { (Get-FileHash $_.FullName -Algorithm MD5).Hash } } |
        ForEach-Object { $refFiles[ $_.RelativePath ] = $_ }

    # Compare content of each directory of $DifferencePath with $ReferencePath
    foreach( $diffPath in $DifferencePath ) {
        Write-Verbose "Comparing directory '$diffPath' with '$ReferencePath'"

        $areDirectoriesEqual = $false
        $differenceType = $null
        $diffFiles = Get-ChildItemRelative $diffPath

        # Directories must have same number of files
        if( $diffFiles.Count -eq $refFiles.Count ) {
            # Find first different path (if any)
            $firstDifferentPath = $diffFiles |
                Where-Object { -not $refFiles.ContainsKey( $_.RelativePath ) } |
                Select-Object -First 1

            if( -not $firstDifferentPath ) {
                # Find first different content (if any) by file size comparison
                $firstDifferentFileSize = $diffFiles |
                    Where-Object { $refFiles[ $_.RelativePath ].Length -ne $_.Length } |
                    Select-Object -First 1

                if( -not $firstDifferentFileSize ) {
                    # Find first different content (if any) by hash comparison
                    $firstDifferentContent = $diffFiles |
                        Where-Object { $refFiles[ $_.RelativePath ].Hash -ne (Get-FileHash $_.FullName -Algorithm MD5).Hash } |
                        Select-Object -First 1

                    if( -not $firstDifferentContent ) {
                        $areDirectoriesEqual = $true
                    }
                    else {
                        $differenceType = 'Content'
                    }
                }
                else {
                    $differenceType = 'FileSize'
                }
            }
            else {
                $differenceType = 'Path'
            }
        }
        else {
            $differenceType = 'FileCount'
        }

        # Output comparison result
        [PSCustomObject]@{
            ReferencePath  = $ReferencePath
            DifferencePath = $diffPath
            Equal          = $areDirectoriesEqual
            DiffCause      = $differenceType
        }
    }
}
Usage example:
# compare each of directories B, C, D, E, F against A
Compare-MultipleDirectories -ReferencePath 'A' -DifferencePath 'B', 'C', 'D', 'E', 'F' -Verbose
Output example:
ReferencePath DifferencePath Equal DiffCause
------------- -------------- ----- ---------
A             B               True
A             C              False FileCount
A             D              False Path
A             E              False FileSize
A             F              False Content
DiffCause column gives you the information why the function thinks the directories are different.
Note:
Select-Object -First 1 is a neat trick to stop searching after we got the first result. It is efficient because it doesn't process all input first and drop everything except first item, but instead it actually cancels the pipeline after the 1st item has been found.
A hashtable keyed by RelativePath stores the reference file information so each file can be looked up quickly by its relative path (Group-Object RelativePath -AsHashTable would be an alternative way to build it).
Empty sub directories are ignored, because the function only looks at files. E. g. if reference path contains some empty directories but difference path does not, and the files in all other directories are equal, the function treats the directories as equal.
I've chosen the MD5 algorithm because it is faster than the default SHA-256 algorithm used by Get-FileHash, but it is insecure: someone could deliberately manipulate a file that is different to have the same MD5 hash as the original file. In a trusted environment this won't matter though. Remove -Algorithm MD5 if you need a more secure comparison.
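For illustration, the algorithm is just a parameter switch on Get-FileHash (the file name below is a placeholder):
Get-FileHash .\somefile.bin                  # SHA-256, the default
Get-FileHash .\somefile.bin -Algorithm MD5   # faster, but collision-prone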
A simple place to start:
compare (dir -r dir1) (dir -r dir2) -Property name,length,lastwritetime
You can also add -PassThru to see the original objects, or -IncludeEqual to see the equal elements. The order of each array doesn't matter without -SyncWindow. I'm assuming all the lastwritetime values are in sync, to the millisecond. Don't assume you can skip specifying the properties to compare. See also Comparing folders and content with PowerShell.
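For example, -PassThru keeps the original FileInfo objects in the output (with a SideIndicator property attached), so a sketch like this shows which directory each differing file came from:
compare (dir -r dir1) (dir -r dir2) -Property name,length,lastwritetime -PassThru |
    Select-Object SideIndicator, FullName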
I was looking into calculated properties for things like the relative path, but it appears you can't name them in Compare-Object, even in PowerShell 7. I'm chopping off the first four path elements, 0..3:
compare (dir -r foo1) (dir -r foo2) -Property length,lastwritetime,@{e={($_.fullname -split '\\')[4..$_.fullname.length] -join '\'}}
length lastwritetime ($_.fullname -split '\\')[4..$_.fullname.length] -join '\' SideIndicator
------ ------------- ---------------------------------------------------------- -------------
16 11/12/2022 11:30:20 AM foo2\file2 =>
18 11/12/2022 11:30:20 AM foo1\file2 <=

Find similarly-named files, and if present, remove the files without a specific string using PowerShell

In a directory, there are files with the following filenames:
ExampleFile.mp3
ExampleFile_pn.mp3
ExampleFile2.mp3
ExampleFile2_pn.mp3
ExampleFile3.mp3
I want to iterate through the directory, and IF there is a filename that contains the string '_pn.mp3', I want to test if there is a similarly named file without the '_pn.mp3' in the same directory. If that file exists, I want to remove it.
In the above example, I'd want to remove:
ExampleFile.mp3
ExampleFile2.mp3
and I'd want to keep ExampleFile3.mp3
Here's what I have so far:
$pattern = "_pn.mp3"
$files = Get-ChildItem -Path '$path' | Where-Object {! $_.PSIsContainer}
Foreach ($file in $files) {
If($file.Name -match $pattern){
# filename with _pn.mp3 exists
Write-Host $file.Name
# search in the current directory for the same filename without _pn
<# If(Test-Path $currentdir $filename without _pn.mp3) {
Remove-Item -Force}
#>
}
enter code here
You could use Group-Object to group all files by their BaseName (with the pattern removed), and then loop over the groups where there are more than one file. The result of grouping the files and filtering by count would look like this:
$files | Group-Object { $_.BaseName.Replace($pattern,'') } |
Where-Object Count -GT 1
Count Name Group
----- ---- -----
2 ExampleFile {ExampleFile.mp3, ExampleFile_pn.mp3}
2 ExampleFile2 {ExampleFile2.mp3, ExampleFile2_pn.mp3}
Then if we loop over these groups we can search for the files that do not end with the $pattern:
@'
ExampleFile.mp3
ExampleFile_pn.mp3
ExampleFile2.mp3
ExampleFile2_pn.mp3
ExampleFile3.mp3
'@ -split '\r?\n' -as [System.IO.FileInfo[]] | Set-Variable files

$pattern = "_pn"
$files | Group-Object { $_.BaseName.Replace($pattern,'') } |
    Where-Object Count -GT 1 | ForEach-Object {
        $_.Group.Where({-not $_.BaseName.EndsWith($pattern)})
    }
This is how your code would look; remove the -WhatIf switch once you've confirmed the code is doing what you want.
$pattern = "_pn.mp3"
$files = Get-ChildItem -Path -Filter *.mp3 -File
$files | Group-Object { $_.BaseName.Replace($pattern,'') } |
Where-Object Count -GT 1 | ForEach-Object {
$toRemove = $_.Group.Where({-not $_.BaseName.Endswith($pattern)})
Remove-Item $toRemove -WhatIf
}
I think you can get by here by adding file names into a hash map as you go. If you encounter a file with the ending you are interested in, check if a similarly named file was already added; if so, remove that similar match (the file without the ending). This relies on Get-ChildItem returning names in sorted order, so ExampleFile.mp3 is seen before ExampleFile_pn.mp3.
$ending = "_pn.mp3"
$files = Get-ChildItem -Path $path -File | Where-Object { ! $_.PSIsContainer }
$hash = #{}
Foreach ($file in $files) {
# Check if file has an ending we are interested in
If ($file.Name.EndsWith($ending)) {
$similar = $file.Name.Split($ending)[0] + ".mp3"
# Check if we have seen the similar file in the hashmap
If ($hash.Contains($similar)) {
Write-Host $file.Name
Write-Host $similar
Remove-Item -Force $file
Remove-Item -Force $hash[$similar]
# Remove similar from hashmap as it is removed and no longer of interest
$hash.Remove($similar)
}
}
else {
# Add entry for file name and reference to the file
$hash.Add($file.Name, $file)
}
}
Just get a list of the files with the _pn then process against the rest.
$pattern = "*_pn.mp3"
$files = Get-ChildItem -Path "$path" -File -filter "$pattern"
Foreach ($file in $files) {
$TestFN = $file.name -replace("_pn","")
If (Test-Path -Path $(Join-Path -Path $Path -ChildPath $TestFN)) {
$file | Remove-Item -force
}
} #End Foreach

Powershell Script to find duplicate files

I found a PowerShell script on TechNet to help locate duplicate files in folders. However, when I run it, I am getting an error on what appears to be every folder\file. Not sure what switch is supposed to be used in this.
$Path = '\\servername\Share\Folders' # define path to folders to find duplicate files
$Files = gci -File -Recurse -Path $Path | Select-Object -Property FullName, Length
$Count = 1
$TotalFiles = $Files.Count
$MatchedSourceFiles = @()
ForEach ($SourceFile in $Files)
{
    Write-Progress -Activity "Processing Files" -Status "Processing File $Count / $TotalFiles" -PercentComplete ($Count / $TotalFiles * 100)
    $MatchingFiles = @()
    $MatchingFiles = $Files | Where-Object {$_.Length -eq $SourceFile.Length}
    Foreach ($TargetFile in $MatchingFiles)
    {
        if (($SourceFile.FullName -ne $TargetFile.FullName) -and !(($MatchedSourceFiles |
            Select-Object -ExpandProperty File) -contains $TargetFile.FullName))
        {
            Write-Verbose "Matching $($SourceFile.FullName) and $($TargetFile.FullName)"
            Write-Verbose "File sizes match."
            if ((fc.exe /A $SourceFile.FullName $TargetFile.FullName) -contains "FC: no differences encountered")
            {
                Write-Verbose "Match found."
                $MatchingFiles += $TargetFile.FullName
            }
        }
    }
    if ($MatchingFiles.Count -gt 0)
    {
        $NewObject = [pscustomobject][ordered]@{
            File = $SourceFile.FullName
            MatchingFiles = $MatchingFiles
        }
        $MatchedSourceFiles += $NewObject
    }
    $Count += 1
}
$MatchedSourceFiles
Errors
FC: Insufficient number of file specifications
fc.exe : FC: Invalid Switch
At line:18 char:12
gci : Could not find a part of the path
At line:2 char:8
fc.exe : FC: Invalid Switch
At line:18 char:12
To fix your fc.exe error and optimize your script, I also recommend @rich-moss's solution.
But if you only want to find duplicates, you can easily accomplish that by comparing their hashes.
Example:
Example:
$Duplicates = Get-ChildItem -File -Recurse | Get-FileHash | Group-Object -Property Hash | Where-Object Count -gt 1
If ($Duplicates.Count -lt 1) {
    $null # 'No duplicates found. Do stuff ...'
} else {
    $result = foreach ($d in $Duplicates) {
        $d.Group | Select-Object -Property Path, Hash
    }
}
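From there, $result (built by the snippet above) can be exported or displayed however you like, e.g. (the output file name is arbitrary):
$result | Sort-Object Hash | Export-Csv .\Duplicates.csv -NoTypeInformation
Sorting by hash keeps each set of duplicates together in the CSV.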
The script you provided is very inefficient and provides false positives in my tests. It's inefficient because it compares every file twice (Source->Target and Target->Source) and because it iterates through all files regardless of size. Here's a quicker version that gathers the files into groups of similarly sized files and only executes FC.EXE once per pair of files:
$Path = 'C:\Temp'
$SameSizeFiles = gci -Path $Path -File -Recurse | Select FullName, Length | Group-Object Length | ? {$_.Count -gt 1} # the list of files with same size
$MatchingFiles = @()
$GroupNdx = 1
Foreach($SizeGroup in ($SameSizeFiles | Select Group)){
    For($FromNdx = 0; $FromNdx -lt $SizeGroup.Group.Count - 1; $FromNdx++){
        For($ToNdx = $FromNdx + 1; $ToNdx -lt $SizeGroup.Group.Count; $ToNdx++){
            If( (fc.exe /A $SizeGroup.Group[$FromNdx].FullName $SizeGroup.Group[$ToNdx].FullName) -contains "FC: no differences encountered"){
                $MatchingFiles += [pscustomobject]@{ File = $SizeGroup.Group[$FromNdx].FullName; Match = $SizeGroup.Group[$ToNdx].FullName }
            }
        }
    }
    Write-Progress -Activity "Finding Duplicates" -Status "Processing group $GroupNdx of $($SameSizeFiles.Count)" -PercentComplete ($GroupNdx / $SameSizeFiles.Count * 100)
    $GroupNdx += 1
}
$MatchingFiles
Efficiency will be even more important if you're running it over the network. You may find it quicker to execute the script on the server itself, rather than from a share. There is some discussion here about the fastest way to compare files in .Net.

PowerShell to get Folder Owner 3 Folders Deep

I need to get a list of all the folders owners on a shared network drive. However, I want to limit the recursion to just 3 folders deep (some of our users will create folders several levels deep, despite us telling them not to). I've found the below script, and slightly modified it to just give folder owner (it originally returned a lot more information for ACLs), but it still goes down through every folder level. How can I modify this to only return 3 folder levels?
$OutFile = "C:\temp\FolderOwner.csv" # indicates where to input your logfile#
$Header = "Folder Path;Owner"
Add-Content -Value $Header -Path $OutFile
$RootPath = "G:\" # which directory/folder you would like to extract the acl permissions#
$Folders = dir $RootPath -Recurse | where {$_.PSIsContainer -eq $true}
foreach ($Folder in $Folders){
    $Owner = (Get-Acl $Folder.FullName).Owner
    $OutInfo = $Folder.FullName + ";" + $Owner
    Add-Content -Value $OutInfo -Path $OutFile
}
You should be able to add a '*' to your path for each level. For example, this should return items three levels deep under C:\Temp:
dir c:\temp\*\*\*
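If the depth needs to be a parameter rather than hard-coded, a tiny helper can build the wildcard string; Get-WildcardPath below is not a built-in cmdlet, just a sketch:
function Get-WildcardPath([string]$Base, [int]$Depth) {
    # Append one '\*' segment per level, e.g. depth 3 gives C:\Temp\*\*\*
    $Base + ('\*' * $Depth)
}
dir (Get-WildcardPath 'C:\Temp' 3)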
Here's a sample function you can use (it's written for PowerShell v3 or higher, but it can be modified to work for version 2):
function Get-FolderOwner {
    param(
        [string] $Path = "."
    )
    Get-ChildItem $Path -Directory | ForEach-Object {
        # Get-Acl throws terminating errors, so we need to wrap it in
        # a ForEach-Object block; included -ErrorAction Stop out of habit
        try {
            $Owner = $_ | Get-Acl -ErrorAction Stop | select -exp Owner
        }
        catch {
            $Owner = "Error: {0}" -f $_.Exception.Message
        }
        [PSCustomObject] @{
            Path = $_.FullName
            Owner = $Owner
        }
    }
}
Then you could use it like this:
Get-FolderOwner c:\temp\*\*\* | Export-Csv C:\temp\FolderOwner.csv
If you're after all items up to and including 3 levels deep, you can modify the function like this:
function Get-FolderOwner {
    param(
        [string] $Path = ".",
        [int] $RecurseDepth = 1
    )
    $RecurseDepth--
    Get-ChildItem $Path -Directory | ForEach-Object {
        # Get-Acl throws terminating errors, so we need to wrap it in
        # a ForEach-Object block; included -ErrorAction Stop out of habit
        try {
            $Owner = $_ | Get-Acl -ErrorAction Stop | select -exp Owner
        }
        catch {
            $Owner = "Error: {0}" -f $_.Exception.Message
        }
        [PSCustomObject] @{
            Path = $_.FullName
            Owner = $Owner
        }
        if ($RecurseDepth -gt 0) {
            Get-FolderOwner -Path $_.FullName -RecurseDepth $RecurseDepth
        }
    }
}
And use it like this:
Get-FolderOwner c:\temp -RecurseDepth 3 | Export-Csv C:\temp\FolderOwner.csv
Does this help?
resolve-path $RootPath\*\* |
    where { (Get-Item $_).PSIsContainer } -PipelineVariable Path |
    Get-Acl |
    Select @{l='Folder';e={$Path}},Owner

Recursively count files in subfolders

I am trying to count the files in all subfolders in a directory and display them in a list.
For instance the following dirtree:
TEST
  /VOL01
    file.txt
    file.pic
  /VOL02
    /VOL0201
      file.nu
      /VOL020101
        file.jpg
        file.erp
        file.gif
  /VOL03
    /VOL0301
      file.org
Should give as output:
PS> DirX C:\TEST
Directory               Count
-----------------------------
VOL01                       2
VOL02                       0
VOL02/VOL0201               1
VOL02/VOL0201/VOL020101     3
VOL03                       0
VOL03/VOL0301               1
I started with the following:
Function DirX($directory)
{
    foreach ($file in Get-ChildItem $directory -Recurse)
    {
        Write-Host $file
    }
}
Now I have a question: why is my Function not recursing?
Something like this should work:
dir -recurse | ?{ $_.PSIsContainer } | %{ Write-Host $_.FullName (dir $_.FullName | Measure-Object).Count }
dir -recurse lists all files under current directory and pipes (|) the result to
?{ $_.PSIsContainer } which filters directories only then pipes again the resulting list to
%{ Write-Host $_.FullName (dir $_.FullName | Measure-Object).Count } which is a foreach loop that, for each member of the list ($_) displays the full name and the result of the following expression
(dir $_.FullName | Measure-Object).Count which provides a list of files under the $_.FullName path and counts members through Measure-Object
?{ ... } is an alias for Where-Object
%{ ... } is an alias for ForEach-Object
Similar to David's solution, this will work in PowerShell 3.0 and later, and does not use aliases, in case someone is not familiar with them:
Get-ChildItem -Directory | ForEach-Object { Write-Host $_.FullName $(Get-ChildItem $_ | Measure-Object).Count}
Answer Supplement
Based on a comment about keeping your function and loop structure, I provide the following. Note: I do not condone this solution, as it is ugly and the built-in cmdlets handle this very well. However, I like to help, so here is an update of your script.
Function DirX($directory)
{
    $output = @{}
    foreach ($singleDirectory in (Get-ChildItem $directory -Recurse -Directory))
    {
        $count = 0
        foreach($singleFile in Get-ChildItem $singleDirectory.FullName)
        {
            $count++
        }
        $output.Add($singleDirectory.FullName,$count)
    }
    $output | Out-String
}
For each $singleDirectory it counts all files using $count (which gets reset before the next sub-loop) and adds each finding to a hashtable. At the end, it outputs the hashtable as a string. In your question it looked like you wanted object output instead of straight text.
Well, the way you are doing it, the entire Get-ChildItem cmdlet needs to complete before the foreach loop can begin iterating. Are you sure you're waiting long enough? If you run that against very large directories (like C:\) it is going to take a pretty long time.
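A minimal illustration of the difference between collecting first and streaming (both walk the same tree; they differ only in when output starts appearing):
# foreach statement: Get-ChildItem must finish before the loop body runs at all
foreach ($f in Get-ChildItem C:\some\dir -Recurse) { $f.Name }
# ForEach-Object in a pipeline: each item is processed as it streams out
Get-ChildItem C:\some\dir -Recurse | ForEach-Object { $_.Name }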
Edit: saw you asked earlier for a way to make your function do what you are asking, here you go.
Function DirX($directory)
{
    foreach ($file in Get-ChildItem $directory -Recurse -Directory)
    {
        [pscustomobject] @{
            'Directory' = $file.FullName
            'Count' = (GCI $file.FullName -Recurse).Count
        }
    }
}
DirX D:\
DirX D:\
The foreach loop only gets directories, since that is all we care about; then inside the loop a custom object is created for each iteration, with the full path of the folder and the count of the items inside the folder.
Also, please note that this will only work in PowerShell 3.0 or newer, since the -Directory parameter did not exist in 2.0.
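For PowerShell 2.0, a rough equivalent of the loop above filters on PSIsContainer instead (a sketch, since -Directory and the [pscustomobject] accelerator are both 3.0 features):
foreach ($file in Get-ChildItem $directory -Recurse | Where-Object { $_.PSIsContainer })
{
    New-Object PSObject -Property @{
        'Directory' = $file.FullName
        'Count'     = (Get-ChildItem $file.FullName -Recurse).Count
    }
}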
Get-ChildItem $rootFolder -Recurse -Directory |
    Select-Object FullName,
        @{Name="FileCount"; Expression={ (Get-ChildItem $_ -File | Measure-Object).Count }}
My version - slightly cleaner and dumps content to a file
Original - Recursively count files in subfolders
Second Component - Count items in a folder with PowerShell
$FOLDER_ROOT = "F:\"
$OUTPUT_LOCATION = "F:\DLS\OUT.txt"
Function DirX($directory)
{
    Remove-Item $OUTPUT_LOCATION -ErrorAction SilentlyContinue
    foreach ($singleDirectory in (Get-ChildItem $directory -Recurse -Directory))
    {
        $count = Get-ChildItem $singleDirectory.FullName -File | Measure-Object | %{$_.Count}
        $summary = $singleDirectory.FullName + " " + $count + " " + $singleDirectory.LastAccessTime
        Add-Content $OUTPUT_LOCATION $summary
    }
}
DirX $FOLDER_ROOT
I modified David Brabant's solution just a bit so I could evaluate the result:
$FileCounter = gci "$BaseDir" -Recurse | ?{ $_.PSIsContainer } | %{ (gci "$($_.FullName)" | Measure-Object).Count }
Write-Host "File Count=$FileCounter"
If ($FileCounter -gt 0) {
    ... take some action...
}