PowerShell: get total size of files which each user owns

I need a PowerShell script which will go through all users on the system and find the total size of all files each user owns... I have a script that goes through all users, but I have no idea how to continue with counting the total size owned by each user.
Here is the script I have now:
$users = Get-WmiObject -Class Win32_UserAccount
foreach ($user in $users) {
    $name = $user.Name
    $fullName = $user.FullName
    if (Test-Path "C:\Users\$name") {
        $path = "C:\Users\$name"
    } else {
        $path = "C:\Users\Public"
    }
    $dirSize = (Get-ChildItem $path -Recurse | Measure-Object -Property Length -Sum)
    "{0:N2}" -f ($dirSize.Sum / 1Gb) + " Gb"
    echo "$dirSize"
    Add-Content -Path "pathototxt..." -Value "$name $fullName $path"
}
I would be more than happy if somebody knows the answer and can tell me...
Thank you

If there are a lot of files, you might want to consider:
$oSIDs = @{}
Get-ChildItem <filespec> |
    foreach {
        $oSID = $_.GetAccessControl().Sddl -replace '^o:(.+?).:.+','$1'
        $oSIDs[$oSID] += $_.Length
    }
Then resolve the SIDs when you're done. Parsing the owner SID or well-known security principal ID out of the SDDL string saves the provider from having to do a lot of repetitive name resolution to give you back the "friendly" names.
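For example, here is a minimal sketch of that resolution step (building on the $oSIDs table above; the try/catch fallback is my own addition):
foreach ($entry in $oSIDs.GetEnumerator()) {
    $sid = $entry.Key
    try {
        # Translate the raw SID into a friendly DOMAIN\name
        $name = (New-Object System.Security.Principal.SecurityIdentifier($sid)).Translate([System.Security.Principal.NTAccount]).Value
    } catch {
        $name = $sid   # well-known SDDL aliases (e.g. 'BA') may not translate directly
    }
    '{0} : {1:N2} GB' -f $name, ($entry.Value / 1GB)
}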

I have no idea what you're asking for here.
"to continue with counting total size which user owns for each user" - huh? Do you want to check every file on the system, or just the user folders as you currently do?
Your script works fine if you just tweak it to include the file size in the output. Personally I'd consider using a CSV to store this, because not all users will have e.g. a full name (admin, guest etc.). Also, at the moment your script counts the Public folder multiple times (once for each user that doesn't have a profile): e.g. admin (if it has never logged in) and guest might both get it listed. One way around that is shown below.
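A guard like this (my variation, not something the updated script below does) skips accounts with no profile folder instead of falling back to Public:
if (-not (Test-Path "C:\Users\$name")) { continue }   # skip accounts that have no profile folder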
Updated script that outputs both a text file and a CSV:
$users = Get-WmiObject -Class Win32_UserAccount
$out = @()
#If you want to append to a csv-file, replace the $out line above with the one below
#$out = Import-Csv "file.csv"
foreach ($user in $users) {
    $name = $user.Name
    $fullName = $user.FullName
    if (Test-Path "C:\Users\$name") {
        $path = "C:\Users\$name"
    } else {
        $path = "C:\Users\Public"
    }
    $dirSize = (Get-ChildItem $path -Recurse -ErrorAction SilentlyContinue | ? { !$_.PSIsContainer } | Measure-Object -Property Length -Sum)
    $size = "{0:N2}" -f ($dirSize.Sum / 1Gb) + " Gb"
    #Saving as textfile
    #Add-Content -path "pathototxt..." -value "$name $fullName $path $size"
    Add-Content -Path "file.txt" -Value "$name $fullName $path $size"
    #CSV-way
    $o = New-Object psobject -Property @{
        Name     = $name
        FullName = $fullName
        Path     = $path
        Size     = $size
    }
    $out += $o
}
#Exporting to csv format
$out | Export-Csv "file.csv" -NoTypeInformation
EDIT: Another solution, using the answers provided by @mjolinor and @C.B., modified to scan your C:\ drive while excluding some "root folders" like "Program Files", "Windows" etc. It exports the result to a CSV file ready for Excel:
$oSIDs = @{}
$exclude = @("Program Files", "Program Files (x86)", "Windows", "Perflogs")
Get-ChildItem C:\ | ? { $exclude -notcontains $_.Name } | % { Get-ChildItem $_.FullName -Recurse -ErrorAction SilentlyContinue | ? { !$_.PSIsContainer } } | % {
    $oSID = $_.GetAccessControl().Sddl -replace '^o:(.+?).:.+','$1'
    $oSIDs[$oSID] += $_.Length
}
$out = @()
$oSIDs.GetEnumerator() | % {
    $user = (New-Object System.Security.Principal.SecurityIdentifier($_.Key)).Translate([System.Security.Principal.NTAccount]).Value
    $out += New-Object psobject -Property @{
        User       = if ($user) { $user } else { $_.Key }
        "Size(GB)" = $oSIDs[$_.Key] / 1GB
    }
}
$out | Export-Csv file.csv -NoTypeInformation -Delimiter ";"

Related

How to optimize Powershell script with Get-ChildItem consuming all RAM

I have this script which parses all shares on a file server to gather information on share size, ACLs, and counts of files and folders. The script works great on smaller file servers, but on hosts with large shares it consumes all RAM and crashes the host. I can't seem to figure out how to optimize the Get-ChildItem portion of the script so it doesn't consume all RAM.
I found a few articles which mentioned using a foreach loop and piping out what I need. I am a PowerShell beginner and can't figure out how to get it to work like that. What can I try next?
$ScopeName = Read-Host "Enter scope name to gather data on"
$SavePath = Read-Host "Path to save results and log to"
$SaveCSVPath = "$SavePath\ShareData.csv"
$TranscriptLog = "$SavePath\Transcript.log"
Write-Host
Start-Transcript -Path $TranscriptLog
$StartTime = Get-Date
$Start = $StartTime | Select-Object -ExpandProperty DateTime
$Exclusions = {$_.Description -ne "Remote Admin" -and $_.Description -ne "Default Share" -and $_.Description -ne "Remote IPC" }
$FileShares = Get-SmbShare -ScopeName $ScopeName | Where-Object $Exclusions
$Count = $FileShares.Count
Write-Host
Write-Host "Gathering data for $Count shares" -ForegroundColor Green
Write-Host
Write-Host "Results will be saved to $SaveCSVPath" -ForegroundColor Green
Write-Host
ForEach ($FileShare in $FileShares)
{
    $ShareName = $FileShare.Name
    $Path = $FileShare.Path
    Write-Host "Working on: $ShareName - $Path" -ForegroundColor Yellow
    $GetObjectInfo = Get-ChildItem -Path $Path -Recurse -Force -ErrorAction SilentlyContinue
    $ObjSize = $GetObjectInfo | Measure-Object -Property Length -Sum -ErrorAction SilentlyContinue
    $ObjectSizeMB = "{0:N2}" -f ($ObjSize.Sum / 1MB)
    $ObjectSizeGB = "{0:N2}" -f ($ObjSize.Sum / 1GB)
    $ObjectSizeTB = "{0:N2}" -f ($ObjSize.Sum / 1TB)
    $NumFiles = ($GetObjectInfo | Where-Object {-not $_.PSIsContainer}).Count
    $NumFolders = ($GetObjectInfo | Where-Object {$_.PSIsContainer}).Count
    $ACL = Get-Acl -Path $Path
    $LastAccessTime = Get-ItemProperty $Path | Select-Object -ExpandProperty LastAccessTime
    $LastWriteTime = Get-ItemProperty $Path | Select-Object -ExpandProperty LastWriteTime
    $Table = [PSCustomObject]@{
        'ScopeName' = $FileShare.ScopeName
        'Sharename' = $ShareName
        'SharePath' = $Path
        'Owner' = $ACL.Owner
        'Permissions' = $ACL.AccessToString
        'LastAccess' = $LastAccessTime
        'LastWrite' = $LastWriteTime
        'Size (MB)' = $ObjectSizeMB
        'Size (GB)' = $ObjectSizeGB
        'Size (TB)' = $ObjectSizeTB
        'Total File Count' = $NumFiles
        'Total Folder Count' = $NumFolders
        'Total Item Count' = $GetObjectInfo.Count
    }
    $Table | Export-CSV -Path $SaveCSVPath -Append -NoTypeInformation
}
$EndTime = Get-Date
$End = $EndTime | Select-Object -ExpandProperty DateTime
Write-Host
Write-Host "Script start time: $Start" -ForegroundColor Green
Write-Host "Script end time: $End" -ForegroundColor Green
Write-Host
$ElapsedTime = $(($EndTime-$StartTime))
Write-Host "Elapsed time: $($ElapsedTime.Days) Days $($ElapsedTime.Hours) Hours $($ElapsedTime.Minutes) Minutes $($ElapsedTime.Seconds) Seconds $($ElapsedTime.MilliSeconds) Milliseconds" -ForegroundColor Cyan
Write-Host
Write-Host "Results saved to $SaveCSVPath" -ForegroundColor Green
Write-Host
Write-Host "Transcript saved to $TranscriptLog" -ForegroundColor Green
Write-Host
Stop-Transcript
To correctly use the PowerShell pipeline (and preserve memory, as each item is streamed separately), use the ForEach-Object cmdlet (unlike the ForEach statement), avoid assigning the pipeline to a variable (as you're doing with $FileShares = ...), and don't put parentheses ((...)) around the pipeline:
Get-SmbShare -ScopeName $ScopeName | Where-Object $Exclusions | ForEach-Object {
And replace all $FileShare references in your loop with the current item variable $_ (e.g. $FileShare.Name → $_.Name); a rough skeleton follows.
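Put together, the reworked loop might look like this (a sketch only, with the per-share body elided):
Get-SmbShare -ScopeName $ScopeName | Where-Object $Exclusions | ForEach-Object {
    $ShareName = $_.Name
    $Path = $_.Path
    Write-Host "Working on: $ShareName - $Path" -ForegroundColor Yellow
    # ...gather the sizes, counts and ACL for this single share, build the
    # [PSCustomObject] and Export-Csv -Append here, exactly as before...
    # Each share is processed and released before the next one streams in.
}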
For the Get-ChildItem part you might do the same thing (stream! meaning: use the mighty PowerShell pipeline rather than piling everything up in $GetObjectInfo):
$ObjSize = Get-ChildItem -Path $Path -Recurse -Force -ErrorAction SilentlyContinue |
    Measure-Object -Property Length -Sum -ErrorAction SilentlyContinue
As an aside, you might simplify your 3 size properties into a single smarter size property; see: How to convert value to KB, MB, or GB depending on digit placeholders? A helper along those lines is sketched below.
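For instance, a small helper like this (Format-Size is an illustrative name, not an existing cmdlet) picks the unit from the magnitude:
function Format-Size {
    param([long]$Bytes)
    # Pick the largest unit that keeps the number readable
    if     ($Bytes -ge 1TB) { '{0:N2} TB' -f ($Bytes / 1TB) }
    elseif ($Bytes -ge 1GB) { '{0:N2} GB' -f ($Bytes / 1GB) }
    elseif ($Bytes -ge 1MB) { '{0:N2} MB' -f ($Bytes / 1MB) }
    else                    { '{0:N2} KB' -f ($Bytes / 1KB) }
}
Format-Size 123456789   # -> 117.74 MB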
Addition
"But isn't putting everything into $ObjSize just swapping one variable for another?"
No it is not, think of the PowerShell pipeline as an assembly line. In this case, at the first station you take each single file information and pass it to the next (last) station where you just sum the length property and the current (file) object disposed.
Where in your question example, you read the information of all files in once and store it into $GetObjectInfo and than go to the whole list to just use (add) the length property of the (quiet heavy) PowerShell file objects.
But why don't you try it?:
Open a new PowerShell session and run:
$Path = '.'
$GetObjectInfo = Get-Childitem -Path $Path -Recurse -Force -ErrorAction SilentlyContinue
$ObjSize = $GetObjectInfo | Measure-Object -Property Length -Sum -ErrorAction SilentlyContinue
Get-Process -ID $PID
Now, open a new session again and use the PowerShell pipeline:
$Path = '.'
$ObjSize = Get-ChildItem -Path $Path -Recurse -Force -ErrorAction SilentlyContinue |
    Measure-Object -Property Length -Sum -ErrorAction SilentlyContinue
Get-Process -ID $PID
Notice the difference in memory usage (WS(M)).
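To put a rough number on just the managed heap from inside each session, you could also run this one-liner (absolute numbers will vary per machine):
'{0:N0} MB' -f ([System.GC]::GetTotalMemory($false) / 1MB)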
You are buffering the entire collection of [FileSystemInfo] objects on $FileShare into a variable with...
$GetObjectInfo = Get-Childitem -Path $Path -Recurse -Force -ErrorAction SilentlyContinue
So, if there are a million directories and files on that share, then that's a million [FileSystemInfo] instances stored in a million-element array, none of which can be garbage-collected during that iteration of the foreach loop. You can use Group-Object to improve that a bit...
$groupsByPSIsContainer = Get-ChildItem -Path $Path -Recurse -Force -ErrorAction SilentlyContinue |
    Group-Object -Property 'PSIsContainer' -AsHashTable
# $groupsByPSIsContainer is a [Hashtable] with two keys:
#   - $true gets the collection of directories
#   - $false gets the collection of files
$ObjSize = $groupsByPSIsContainer[$false] | Measure-Object -Property Length -Sum -ErrorAction SilentlyContinue
$NumFiles = $groupsByPSIsContainer[$false].Count
$NumFolders = $groupsByPSIsContainer[$true].Count
...but that still ends up storing all of the [FileSystemInfo]s in the two branches of the [Hashtable]. Instead, I would just enumerate and count the results myself...
$ObjSize = 0L # Stores the total file size directly; use $ObjSize instead of $ObjSize.Sum
$NumFiles = 0
$NumFolders = 0
foreach ($fileSystemInfo in Get-ChildItem -Path $Path -Recurse -Force -ErrorAction SilentlyContinue)
{
    if ($fileSystemInfo.PSIsContainer)
    {
        $NumFolders++
    }
    else
    {
        $NumFiles++
        $ObjSize += $fileSystemInfo.Length
    }
}
That stores only the current enumeration result in $fileSystemInfo and never the entire sequence.
Note that if you weren't summing the files' sizes, Group-Object would work well...
$groupsByIsContainer = Get-ChildItem -Path $Path -Recurse -Force -ErrorAction SilentlyContinue |
    Group-Object -Property 'PSIsContainer' -NoElement
$NumFiles = ($groupsByIsContainer | Where-Object -Property 'Name' -EQ -Value $false).Count
$NumFolders = ($groupsByIsContainer | Where-Object -Property 'Name' -EQ -Value $true).Count
-NoElement prevents the resulting group objects from storing the grouped elements; we just care about the count of members in each grouping, not the members themselves. If we passed -AsHashTable we'd lose the convenient Count property, which is why the two groups have to be accessed in this awkward way.

PowerShell filter for matching AD account to folder

I am trying to match the samaccountname to the current folder path. It would work fine if the folder path were "d:\profile\username", but instead of username it's username_S-1-5-21*....
I use the code below; is it possible to filter out everything behind the _S-1... so the username can be matched to the samaccountname?
Any help would be appreciated.
[EDIT: added complete script]
Below is the complete script. The issue is that I have several user folders with _S-1-5-21* behind the username in our FXLogic profile folder, and I need to match the samaccountname with the folder (username - _S-1-5-21*).
I hope this explanation is more clear, and yes, it's a SID not a GUID; I always get them mixed up.
param(
    [Parameter(Mandatory=$true)]
    $FXLogicFolderPath,
    $MoveFolderPath,
    $SearchBase,
    [string[]]$ExcludePath,
    [switch]$FolderSize,
    [switch]$MoveDisabled,
    [switch]$DisplayAll,
    [switch]$UseRobocopy,
    [switch]$RegExExclude,
    [switch]$CheckFXLogicDirectory)

# Check if FXLogicFolderPath is found, exit with warning message if path is incorrect
if (!(Test-Path -LiteralPath $FXLogicFolderPath)){
    Write-Warning "FXLogicFolderPath not found: $FXLogicFolderPath"
    exit
}

# Check if MoveFolderPath is found, exit with warning message if path is incorrect
if ($MoveFolderPath) {
    if (!(Test-Path -LiteralPath $MoveFolderPath)){
        Write-Warning "MoveFolderPath not found: $MoveFolderPath"
        exit
    }
}

# Main loop, for each folder found under FXLogic folder path AD is queried to find a matching samaccountname
$ListOfFolders = Get-ChildItem -LiteralPath "$FXLogicFolderPath" -Force | Where-Object {$_.PSIsContainer}

# Exclude folders if the ExcludePath parameter is given
if ($ExcludePath) {
    $ExcludePath | ForEach-Object {
        $CurrentExcludePath = $_
        if ($RegExExclude) {
            $ListOfFolders = $ListOfFolders | Where-Object {$_.FullName -notmatch $CurrentExcludePath}
        } else {
            $ListOfFolders = $ListOfFolders | Where-Object {$_.FullName -ne $CurrentExcludePath}
        }
    }
}

$ListOfFolders | ForEach-Object {
    $CurrentPath = Split-Path -Path $_ -Leaf

    # Construct AD Searcher, add SearchRoot attribute if SearchBase parameter is specified
    $ADSearcher = New-Object DirectoryServices.DirectorySearcher -Property @{
        Filter = "(samaccountname=$CurrentPath)"
    }
    if ($SearchBase) {
        $ADSearcher.SearchRoot = [adsi]$SearchBase
    }

    # Use the FullName path to look for a FXLogicdirectory attribute and replace the backslash by the \5C LDAP escape character
    if ($CheckFXLogicDirectory) {
        $ADSearcher.Filter = "(FXLogicdirectory=$($_.FullName -replace '\\','\5C')*)"
    }

    # Execute AD Query and store in $ADResult
    $ADResult = $ADSearcher.FindOne()

    # If no matching samaccountname is found this code is executed and displayed
    if (!($ADResult)) {
        $HashProps = @{
            'Error' = 'Account does not exist and has a FXLogic folder'
            'FullPath' = $_.FullName
        }
        if ($FolderSize) {
            $HashProps.SizeinBytes = [long](Get-ChildItem -LiteralPath $_.FullName -Recurse -Force -ErrorAction SilentlyContinue |
                Measure-Object -Property Length -Sum -ErrorAction SilentlyContinue | Select-Object -Exp Sum)
            $HashProps.SizeinMegaBytes = "{0:n2}" -f ($HashProps.SizeinBytes/1MB)
        }
        if ($MoveFolderPath) {
            $HashProps.DestinationFullPath = Join-Path -Path $MoveFolderPath -ChildPath (Split-Path -Path $_.FullName -Leaf)
            if ($UseRobocopy) {
                robocopy $($HashProps.FullPath) $($HashProps.DestinationFullPath) /E /MOVE /R:2 /W:1 /XJD /XJF | Out-Null
            } else {
                Move-Item -LiteralPath $HashProps.FullPath -Destination $HashProps.DestinationFullPath -Force
            }
        }
        # Output the object
        New-Object -TypeName PSCustomObject -Property $HashProps

    # If samaccountname is found but the account is disabled this information is displayed
    } elseif (([boolean]((-join $ADResult.Properties.useraccountcontrol) -band 2))) {
        $HashProps = @{
            'Error' = 'Account is disabled and has a FXLogic folder'
            'FullPath' = $_.FullName
        }
        if ($FolderSize) {
            $HashProps.SizeinBytes = [long](Get-ChildItem -LiteralPath $_.FullName -Recurse -Force -ErrorAction SilentlyContinue |
                Measure-Object -Property Length -Sum -ErrorAction SilentlyContinue | Select-Object -Exp Sum)
            $HashProps.SizeinMegaBytes = "{0:n2}" -f ($HashProps.SizeinBytes/1MB)
        }
        if ($MoveFolderPath -and $MoveDisabled) {
            $HashProps.DestinationFullPath = Join-Path -Path $MoveFolderPath -ChildPath (Split-Path -Path $_.FullName -Leaf)
            Move-Item -LiteralPath $HashProps.FullPath -Destination $HashProps.DestinationFullPath -Force
        }
        # Output the object
        New-Object -TypeName PSCustomObject -Property $HashProps

    # Folders that do have active user accounts are displayed if -DisplayAll switch is set
    } elseif ($ADResult -and $DisplayAll) {
        $HashProps = @{
            'Error' = $null
            'FullPath' = $_.FullName
        }
        if ($FolderSize) {
            $HashProps.SizeinBytes = [long](Get-ChildItem -LiteralPath $_.FullName -Recurse -Force -ErrorAction SilentlyContinue |
                Measure-Object -Property Length -Sum -ErrorAction SilentlyContinue | Select-Object -Exp Sum)
            $HashProps.SizeinMegaBytes = "{0:n2}" -f ($HashProps.SizeinBytes/1MB)
        }
        # Output the object
        New-Object -TypeName PSCustomObject -Property $HashProps
    }
}
The S-1-5-21... part is not a GUID, it's a SID (the principal's Security Identifier).
You can use the -replace operator to remove that part of the folder name:
$folderName = 'username_S-1-5-21-2855571654-3033049851-1520320983-9328'
$userName = $folderName -replace '_S-1-5.*$'
After which you can construct the desired LDAP query filter:
$ADSearcher = New-Object DirectoryServices.DirectorySearcher -Property @{
    Filter = "(samaccountname=$userName)"
}
Not sure where your username is being derived from, but you could do something simple with a regex replace on $CurrentPath:
$currentPath = 'username_S-1-5-21-2855571654-3033049851-1520320983-9328'
$currentPath = $currentPath -replace '_.+'
# AD Searcher here
This will replace everything after the underscore, but all usernames will have to be in a consistent format, or you'll have to take into account all possible formats that could be encountered. Spliced into the script from the question, it could look like the sketch below.
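For example ($accountName is just an illustrative variable name), the replace would go right after the Split-Path call:
$CurrentPath = Split-Path -Path $_ -Leaf
# Strip the SID suffix: 'jdoe_S-1-5-21-...' -> 'jdoe'
$accountName = $CurrentPath -replace '_S-1-5-21.*$'
$ADSearcher = New-Object DirectoryServices.DirectorySearcher -Property @{
    Filter = "(samaccountname=$accountName)"
}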

Powershell Script is printing out duplicate entries of the same path

My objective is to write a PowerShell script that will recursively check a file server for any directories that are "x" (insert days) old or older.
I ran into a few issues initially, and I think I got most of them worked out. One of the issues was the path limitation of 248 characters; I found a custom function that I am implementing in my code to bypass this limitation.
The end result is I would like to output the path and LastAccessTime of each folder and export the information into an easy-to-read CSV file.
Currently everything is working properly, but for some reason some paths are output several times (duplicates, triples, even 4 times). I just want each directory and subdirectory output once.
I'd appreciate any guidance I can get. Thanks in advance.
Here's my code
#Add the import and snapin in order to perform AD functions
Add-PSSnapin Quest.ActiveRoles.ADManagement -ea SilentlyContinue
Import-Module ActiveDirectory
#Clear Screen
CLS
Function Get-FolderItem
{
    [cmdletbinding(DefaultParameterSetName='Filter')]
    Param (
        [parameter(Position=0,ValueFromPipeline=$True,ValueFromPipelineByPropertyName=$True)]
        [Alias('FullName')]
        [string[]]$Path = $PWD,
        [parameter(ParameterSetName='Filter')]
        [string[]]$Filter = '*.*',
        [parameter(ParameterSetName='Exclude')]
        [string[]]$ExcludeFile,
        [parameter()]
        [int]$MaxAge,
        [parameter()]
        [int]$MinAge
    )
    Begin
    {
        $params = New-Object System.Collections.Arraylist
        $params.AddRange(@("/L","/S","/NJH","/BYTES","/FP","/NC","/NFL","/TS","/XJ","/R:0","/W:0"))
        If ($PSBoundParameters['MaxAge'])
        {
            $params.Add("/MaxAge:$MaxAge") | Out-Null
        }
        If ($PSBoundParameters['MinAge'])
        {
            $params.Add("/MinAge:$MinAge") | Out-Null
        }
    }
    Process
    {
        ForEach ($item in $Path)
        {
            Try
            {
                $item = (Resolve-Path -LiteralPath $item -ErrorAction Stop).ProviderPath
                If (-Not (Test-Path -LiteralPath $item -Type Container -ErrorAction Stop))
                {
                    Write-Warning ("{0} is not a directory and will be skipped" -f $item)
                    Return
                }
                If ($PSBoundParameters['ExcludeFile'])
                {
                    $Script = "robocopy `"$item`" NULL $Filter $params /XF $($ExcludeFile -join ',')"
                }
                Else
                {
                    $Script = "robocopy `"$item`" NULL $Filter $params"
                }
                Write-Verbose ("Scanning {0}" -f $item)
                Invoke-Expression $Script | ForEach {
                    Try
                    {
                        If ($_.Trim() -match "^(?<Children>\d+)\s+(?<FullName>.*)")
                        {
                            $object = New-Object PSObject -Property @{
                                ParentFolder = $matches.fullname -replace '(.*\\).*','$1'
                                FullName = $matches.FullName
                                Name = $matches.fullname -replace '.*\\(.*)','$1'
                            }
                            $object.pstypenames.insert(0,'System.IO.RobocopyDirectoryInfo')
                            Write-Output $object
                        }
                        Else
                        {
                            Write-Verbose ("Not matched: {0}" -f $_)
                        }
                    }
                    Catch
                    {
                        Write-Warning ("{0}" -f $_.Exception.Message)
                        Return
                    }
                }
            }
            Catch
            {
                Write-Warning ("{0}" -f $_.Exception.Message)
                Return
            }
        }
    }
}
Function ExportFolders
{
    #================ Global Variables ================
    #Path to folders
    $Dir = "\\myFileServer\somedir\blah"
    #Get all folders
    $ParentDir = Get-ChildItem $Dir | Where-Object {$_.PSIsContainer -eq $True}
    #Export file to our destination
    $ExportedFile = "c:\temp\dirFolders.csv"
    #Duration in Days+ the file hasn't triggered "LastAccessTime"
    $duration = 800
    $cutOffDate = (Get-Date).AddDays(-$duration)
    #Used to hold our information
    $results = @()
    #=============== Done with Variables ===============
    ForEach ($SubDir in $ParentDir)
    {
        $FolderPath = $SubDir.FullName
        $folders = Get-ChildItem -Recurse $FolderPath -Force -Directory | Where-Object { ($_.LastAccessTimeUtc -le $cutOffDate) } | Select-Object FullName, LastAccessTime
        ForEach ($folder in $folders)
        {
            $folderPath = $folder.FullName
            $fixedFolderPaths = ($folderPath | Get-FolderItem).FullName
            ForEach ($fixedFolderPath in $fixedFolderPaths)
            {
                #$fixedFolderPath
                $getLastAccessTime = $(Get-Item $fixedFolderPath -Force).LastAccessTime
                #$getLastAccessTime
                $details = @{ "Folder Path" = $fixedFolderPath; "LastAccessTime" = $getLastAccessTime }
                $results += New-Object PSObject -Property $details
                $results
            }
        }
    }
}
ExportFolders
I updated my code a bit and simplified it. Here is the new code.
#Add the import and snapin in order to perform AD functions
Add-PSSnapin Quest.ActiveRoles.ADManagement -ea SilentlyContinue
Import-Module ActiveDirectory
#Clear Screen
CLS
Function ExportFolders
{
    #================ Global Variables ================
    #Path to user profiles in Barrington
    $Dir = "\\myFileServer\somedir\blah"
    #Get all user folders
    $ParentDir = Get-ChildItem $Dir | Where-Object {$_.PSIsContainer -eq $True} | where {$_.GetFileSystemInfos().Count -eq 0 -or $_.GetFileSystemInfos().Count -gt 0}
    #Export file to our destination
    $ExportedFile = "c:\temp\dirFolders.csv"
    #Duration in Days+ the file hasn't triggered "LastAccessTime"
    $duration = 1
    $cutOffDate = (Get-Date).AddDays(-$duration)
    #Used to hold our information
    $results = @()
    $details = $null
    #=============== Done with Variables ===============
    ForEach ($SubDir in $ParentDir)
    {
        $FolderName = $SubDir.FullName
        $FolderInfo = $(Get-Item $FolderName -Force) | Select-Object FullName, LastAccessTime #| ft -HideTableHeaders
        $FolderLeafs = gci -Recurse $FolderName -Force -Directory | Where-Object {$_.PSIsContainer -eq $True} | where {$_.GetFileSystemInfos().Count -eq 0 -or $_.GetFileSystemInfos().Count -gt 0} | Select-Object FullName, LastAccessTime #| ft -HideTableHeaders
        $details = @{ "LastAccessTime" = $FolderInfo.LastAccessTime; "Folder Path" = $FolderInfo.FullName }
        $results += New-Object PSObject -Property $details
        ForEach ($FolderLeaf in $FolderLeafs.FullName)
        {
            $details = @{ "LastAccessTime" = $(Get-Item $FolderLeaf -Force).LastAccessTime; "Folder Path" = $FolderLeaf }
            $results += New-Object PSObject -Property $details
        }
        $results
    }
}
ExportFolders
The FolderInfo variable is sometimes printed multiple times, while from what I can see the FolderLeaf variable is printed only once. The problem is, if I move or remove the $results variable from under the $details that print out the FolderInfo, then the parent directories don't get printed out; only all the subdirectories are shown. Also, some directories are empty and don't get printed out, and I want all directories printed out, including empty ones.
The updated code seems to print all directories fine, but as I mentioned I am still getting some duplicate $FolderInfo entries.
I think I have to put in a condition to check whether a folder has already been processed, but I'm not sure which condition I would use, so that it doesn't print out multiple times.
In your ExportFolders you Get-ChildItem -Recurse and then loop over all of the subfolders calling Get-FolderItem. Then, in Get-FolderItem, you provide robocopy with the /S flag in $params.AddRange(@("/L", "/S", "/NJH", "/BYTES", "/FP", "/NC", "/NFL", "/TS", "/XJ", "/R:0", "/W:0")). The /S flag means "copy subdirectories, but not empty ones", so you are recursing again. Likely you just need to remove the /S flag, so that all of your recursion happens in ExportFolders; the one-line change is sketched below.
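That is, the Begin block of Get-FolderItem would build the robocopy argument list without /S (an untested sketch of the change):
$params = New-Object System.Collections.Arraylist
$params.AddRange(@("/L","/NJH","/BYTES","/FP","/NC","/NFL","/TS","/XJ","/R:0","/W:0"))   # /S removed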
In response to the edit:
Your $results is inside of the loop, so you will get n duplicates for the first $SubDir, then n-1 duplicates for the second, and so forth.
ForEach ($SubDir in $ParentDir) {
    #skipped code
    ForEach ($FolderLeaf in $FolderLeafs.fullname) {
        #skipped code
    }
    $results
}
should be
ForEach ($SubDir in $ParentDir) {
    #skipped code
    ForEach ($FolderLeaf in $FolderLeafs.fullname) {
        #skipped code
    }
}
$results

Keyword Search Across All Servers - Powershell

Afternoon All,
I need to run a search across all of our servers.
I have the list of servers in a text document and a list of keywords in another
$Servers = get-content -path 'C:\support\Server Search\Server Test.txt'
$Keywords = get-content -path "C:\Support\Server Search\Keyword Test.txt"
Foreach ($Server in $Servers){
    Foreach ($Keyword in $Keywords){
        Get-ChildItem "$Server" -Recurse | Where-Object {$_.Name -like "$Keyword"}
        $i++
        Write-Host "$found: $i - Current $ $_"
        New-Object -TypeName PSCustomObject -Property @{
            Directory = $_.Directory
            Name = $_.Name
            Length = $_.Length / 1024
            CreationTime = $_.CreationTime
            LastWriteTime = $_.LastWriteTime
            LastAccessTime = $_.LastAccessTime} |
            select Directory,Name,Length,CreationTime,LastWriteTime,LastAccessTime |
            Export-Csv "C:\support\server search\$Server.csv" -Append -NoTypeInformation
    }
}
$i = 0
Is there a way to indicate when a keyword has been located, and the total number of keywords found? I feel like I need to change this line, but I cannot fathom what I would actually put; I've tried $Keywords, but that just changes the keyword every time the directory changes:
$i++
Write-Host "$found: $i - Current $ $_"
I'm assuming your $Server is set up something like "\\servername\c$\".
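If the text file holds bare server names, the UNC admin-share paths could be built up front (a hedged sketch; adjust the share to your environment):
$Servers = Get-Content 'C:\support\Server Search\Server Test.txt' | ForEach-Object { "\\$_\c`$" }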
To show when a keyword has been located and the total keyword files found:
$Servers = Get-Content -Path 'C:\support\Server Search\Server Test.txt'
$Keywords = Get-Content -Path "C:\Support\Server Search\Keyword Test.txt"
$num = 0 #Total keyword files found
Foreach ($Server in $Servers){
    Foreach ($Keyword in $Keywords){
        #Keyword found check
        $Found = Get-ChildItem -Path "$Server" -Recurse -Include "$Keyword"
        if($Found){
            Foreach($File in $Found){
                $num++ #increment num of keyword files found by 1
                Write-Host "found: $num - $File"
                New-Object -TypeName PSCustomObject -Property @{
                    Directory = $File.Directory
                    Name = $File.Name
                    Length = $File.Length / 1024
                    CreationTime = $File.CreationTime
                    LastWriteTime = $File.LastWriteTime
                    LastAccessTime = $File.LastAccessTime} |
                    select Directory,Name,Length,CreationTime,LastWriteTime,LastAccessTime |
                    Export-Csv "C:\support\server search\$Server.csv" -Append -NoTypeInformation
            }
        }
    }
}
Please let me know if this helps you progress. I can assist further if requested.

Delete old files in recycle bin with powershell

Ok, I have a script I am writing in PowerShell that will delete old files in the recycle bin. I want it to delete all files from the recycle bin that were deleted more than 2 days ago. I have done lots of research on this and have not found a suitable answer.
This is what I have so far (found the script online, I don't know much PowerShell):
$Path = 'C' + ':\$Recycle.Bin'
Get-ChildItem $Path -Force -Recurse -ErrorAction SilentlyContinue |
#Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-3) } |
Remove-Item -Recurse -exclude *.ini -ErrorAction SilentlyContinue
It is working great with one exception: it checks the file's LastWriteTime attribute. That is fine if the user deletes the file the same day they modified it; otherwise it fails.
How can I modify this code so that it checks when the file was deleted, not when it was written?
On a side note: if I run this script from an administrator account on Microsoft Server 2008, will it work for all users' recycle bins or just mine?
Answer:
the code that worked for me is:
$Shell = New-Object -ComObject Shell.Application
$Global:Recycler = $Shell.NameSpace(0xa)
foreach($item in $Recycler.Items())
{
    $DeletedDate = $Recycler.GetDetailsOf($item,2) -replace "\u200f|\u200e",""
    $dtDeletedDate = Get-Date $DeletedDate
    If($dtDeletedDate -lt (Get-Date).AddDays(-3))
    {
        Remove-Item -Path $item.Path -Confirm:$false -Force -Recurse
    }#EndIf
}#EndForeach item
It works awesome for me; however, two questions remain: how do I do this with multiple drives, and will this apply to all users or just me?
WMF 5 includes the new "Clear-RecycleBin" cmdlet.
PS > Clear-RecycleBin -DriveLetter C
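(Clear-RecycleBin prompts for confirmation by default; adding the -Force switch suppresses the prompt:)
PS > Clear-RecycleBin -DriveLetter C -Force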
These two lines will empty all files in the recycle bin:
$Recycler = (New-Object -ComObject Shell.Application).NameSpace(0xa)
$Recycler.items() | foreach { rm $_.path -force -recurse }
This article has answers to all your questions
http://baldwin-ps.blogspot.be/2013/07/empty-recycle-bin-with-retention-time.html
Code for posterity:
# -----------------------------------------------------------------------
#
# Author : Baldwin D.
# Description : Empty Recycle Bin with Retention (Logoff Script)
#
# -----------------------------------------------------------------------
$Global:Collection = @()
$Shell = New-Object -ComObject Shell.Application
$Global:Recycler = $Shell.NameSpace(0xa)
$csvfile = "\\YourNetworkShare\RecycleBin.txt"
$LogFailed = "\\YourNetworkShare\RecycleBinFailed.txt"

function Get-recyclebin
{
    [CmdletBinding()]
    Param
    (
        $RetentionTime = "7",
        [Switch]$DeleteItems
    )
    $User = $env:USERNAME
    $Computer = $env:COMPUTERNAME
    $DateRun = Get-Date

    foreach($item in $Recycler.Items())
    {
        $DeletedDate = $Recycler.GetDetailsOf($item,2) -replace "\u200f|\u200e","" #Invisible Unicode Characters
        $DeletedDate_datetime = Get-Date $DeletedDate
        [Int]$DeletedDays = (New-TimeSpan -Start $DeletedDate_datetime -End $(Get-Date)).Days

        If($DeletedDays -ge $RetentionTime)
        {
            $Size = $Recycler.GetDetailsOf($item,3)
            $SizeArray = $Size -split " "
            $Decimal = $SizeArray[0] -replace ",","."
            If ($SizeArray[1] -contains "bytes") { $Size = [int]$Decimal / 1024 }
            If ($SizeArray[1] -contains "KB") { $Size = [int]$Decimal }
            If ($SizeArray[1] -contains "MB") { $Size = [int]$Decimal * 1024 }
            If ($SizeArray[1] -contains "GB") { $Size = [int]$Decimal * 1024 * 1024 }

            $Object = New-Object Psobject -Property @{
                Computer = $Computer
                User = $User
                DateRun = $DateRun
                Name = $item.Name
                Type = $item.Type
                SizeKb = $Size
                Path = $item.Path
                "Deleted Date" = $DeletedDate_datetime
                "Deleted Days" = $DeletedDays }
            $Object

            If ($DeleteItems)
            {
                Remove-Item -Path $item.Path -Confirm:$false -Force -Recurse
                if ($?)
                {
                    $Global:Collection += @($Object)
                }
                else
                {
                    Add-Content -Path $LogFailed -Value $error[0]
                }
            }#EndIf $DeleteItems
        }#EndIf($DeletedDays -ge $RetentionTime)
    }#EndForeach item
}#EndFunction

Get-recyclebin -RetentionTime 7 #-DeleteItems #Remove the comment if you wish to actually delete the content

if (@($Collection).Count -gt "0")
{
    $Collection = $Collection | Select-Object "Computer","User","DateRun","Name","Type","Path","SizeKb","Deleted Days","Deleted Date"
    $CsvData = $Collection | ConvertTo-Csv -NoTypeInformation
    $Null, $Data = $CsvData
    Add-Content -Path $csvfile -Value $Data
}

[System.Runtime.Interopservices.Marshal]::ReleaseComObject($Shell)
#ScriptEnd
Had to do a bit of research on this myself. On every drive, the recycle bin in Windows 10 contains two files for each deleted file (in Windows 7 the files are stored as-is, so this script is overkill there and needs to be cut down, especially for PowerShell 2.0; Windows 8 is untested): an info file created at the time of deletion, $I (perfect for ascertaining the date of deletion), and the original file, $R. I found the COM object method ignored more files than I liked, though on the upside it had info I was interested in about the original deleted file. After a bit of exploring I found that a simple Get-Content of the info file includes the original file location; after cleaning that up with a bit of regex, I came up with this:
# Refresh Desktop Ability
$definition = @'
[System.Runtime.InteropServices.DllImport("Shell32.dll")]
private static extern int SHChangeNotify(int eventId, int flags, IntPtr item1, IntPtr item2);
public static void Refresh() {
    SHChangeNotify(0x8000000, 0x1000, IntPtr.Zero, IntPtr.Zero);
}
'@
Add-Type -MemberDefinition $definition -Namespace WinAPI -Name Explorer

# Set safe within-deleted-days threshold and get physical drive letters
$ignoreDeletedWithinDays = 2
$drives = (gwmi -Class Win32_LogicalDisk | ? {$_.drivetype -eq 3}).deviceid

# Process discovered drives
$drives | % {$drive = $_
    gci -Path ($drive+'\$Recycle.Bin\*\$I*') -Recurse -Force | ? {($_.LastWriteTime -lt [datetime]::Now.AddDays(-$ignoreDeletedWithinDays)) -and ($_.Name -like "`$*.*")} | % {
        # Just a few calcs
        $infoFile = $_
        $originalFile = gi ($drive+"\`$Recycle.Bin\*\`$R$($infoFile.Name.Substring(2))") -Force
        $originalLocation = [regex]::Match([string](gc $infoFile.FullName -Force -Encoding Unicode),($drive+'[^<>:"/|?*]+\.[\w\-_\+]+')).Value
        $deletedDate = $infoFile.LastWriteTime
        $sid = $infoFile.FullName.Split('\') | ? {$_ -like "S-1-5*"}
        $user = try{(gpv "HKLM:\Software\Microsoft\Windows NT\CurrentVersion\ProfileList\$($sid)" -Name ProfileImagePath).Replace("$(gpv 'HKLM:\Software\Microsoft\Windows NT\CurrentVersion\ProfileList' -Name ProfilesDirectory)\",'')}catch{$sid}

        # Various info
        $originalLocation
        $deletedDate
        $user
        $sid
        $infoFile.FullName
        ((gi $infoFile -Force).Length / 1mb).ToString('0.00MB')
        $originalFile.FullName
        ((gi $originalFile -Force).Length / 1mb).ToString('0.00MB')
        ""

        # Blow it all away
        #ri $infoFile -Recurse -Force -Confirm:$false -WhatIf
        #ri $originalFile -Recurse -Force -Confirm:$false -WhatIf
        # remove the comment character from the two lines above and drop '-WhatIf' to actually delete the files
    }
}
# Refresh desktop icons
[WinAPI.Explorer]::Refresh()
This also works well as a script with the Task Scheduler.
Clear-RecycleBin -Force