Keyword Search Across All Servers - PowerShell

Afternoon All,
I need to run a search across all of our servers.
I have the list of servers in a text document and a list of keywords in another
$Servers = get-content -path 'C:\support\Server Search\Server Test.txt'
$Keywords = get-content -path "C:\Support\Server Search\Keyword Test.txt"
Foreach ($Server in $Servers){
Foreach ($Keyword in $Keywords){
Get-ChildItem "$Server" -Recurse | Where-Object {$_.Name -like "$Keyword"}
$i++
Write-Host "$found: $i - Current $ $_"
New-Object -TypeName PSCustomObject -Property @{
Directory = $_.Directory
Name = $_.Name
Length = $_.Length /1024
CreationTime = $_.CreationTime
LastWriteTime = $_.LastWriteTime
LastAccessTime = $_.LastAccessTime}|
select Directory,Name,Length,CreationTime,LastWriteTime,LastAccessTime |
Export-Csv "C:\support\server search\$Server.csv" -Append -NoTypeInformation
}}
$i = 0
Is there a way to indicate when a keyword has been located and the total number of keywords found? I feel like I need to change this line, but I cannot fathom what I would actually put. I've tried $Keywords, but that just changes the keyword every time the directory changes:
$i++
Write-Host "$found: $i - Current $ $_"

I'm assuming your $Server is set up something like "\\servername\c$\".
To show when a keyword has been located and the total number of keyword files found:
$Servers = get-content -path 'C:\support\Server Search\Server Test.txt'
$Keywords = get-content -path "C:\Support\Server Search\Keyword Test.txt"
$num = 0 #Total Keyword files Found
Foreach ($Server in $Servers){
Foreach ($Keyword in $Keywords){
#Keyword found Check
$Found = Get-ChildItem -Path "$Server" -Recurse -Include "$Keyword"
if($Found){
Foreach($File in $Found){
$num++ #increment num of keyword files found by 1
Write-Host "found: $num - $File"
New-Object -TypeName PSCustomObject -Property @{
Directory = $File.Directory
Name = $File.Name
Length = $File.Length /1024
CreationTime = $File.CreationTime
LastWriteTime = $File.LastWriteTime
LastAccessTime = $File.LastAccessTime}|
select Directory,Name,Length,CreationTime,LastWriteTime,LastAccessTime |
Export-Csv "C:\support\server search\$Server.csv" -Append -NoTypeInformation
}
}
}
}
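If you also want a per-keyword tally on top of the overall total, a small variant (untested; it reuses the same $Servers and $Keywords and omits the CSV export for brevity) could track the counts in a hashtable:
    $perKeyword = @{}   # keyword -> number of matching files
    $num = 0            # overall total
    Foreach ($Server in $Servers){
        Foreach ($Keyword in $Keywords){
            $Found = Get-ChildItem -Path "$Server" -Recurse -Include "$Keyword" -ErrorAction SilentlyContinue
            Foreach ($File in $Found){
                $num++
                $perKeyword[$Keyword]++
                Write-Host "found: $num - [$Keyword] $File"
            }
        }
    }
    # per-keyword summary at the end
    $perKeyword.GetEnumerator() | Sort-Object Key | ForEach-Object {
        Write-Host "$($_.Key): $($_.Value) file(s)"
    }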
Please let me know if this helps you progress. I can assist further if requested.

Related

How to optimize a PowerShell script with Get-ChildItem consuming all RAM

I have this script which parses all shares on a file server to gather information on share size, ACLs, and the count of files and folders. The script works great on smaller file servers, but on hosts with large shares it consumes all RAM and crashes the host. I can't seem to figure out how to optimize the Get-ChildItem portion of the script so it doesn't consume all available RAM.
I found a few articles that mentioned using a foreach loop and piping out what I need, but I am a PowerShell beginner and can't figure out how to get it to work like that. What can I try next?
$ScopeName = Read-Host "Enter scope name to gather data on"
$SavePath = Read-Host "Path to save results and log to"
$SaveCSVPath = "$SavePath\ShareData.csv"
$TranscriptLog = "$SavePath\Transcript.log"
Write-Host
Start-Transcript -Path $TranscriptLog
$StartTime = Get-Date
$Start = $StartTime | Select-Object -ExpandProperty DateTime
$Exclusions = {$_.Description -ne "Remote Admin" -and $_.Description -ne "Default Share" -and $_.Description -ne "Remote IPC" }
$FileShares = Get-SmbShare -ScopeName $ScopeName | Where-Object $Exclusions
$Count = $FileShares.Count
Write-Host
Write-Host "Gathering data for $Count shares" -ForegroundColor Green
Write-Host
Write-Host "Results will be saved to $SaveCSVPath" -ForegroundColor Green
Write-Host
ForEach ($FileShare in $FileShares)
{
$ShareName = $FileShare.Name
$Path = $Fileshare.Path
Write-Host "Working on: $ShareName - $Path" -ForegroundColor Yellow
$GetObjectInfo = Get-Childitem -Path $Path -Recurse -Force -ErrorAction SilentlyContinue
$ObjSize = $GetObjectInfo | Measure-Object -Property Length -Sum -ErrorAction SilentlyContinue
$ObjectSizeMB = "{0:N2}" -f ($ObjSize.Sum / 1MB)
$ObjectSizeGB = "{0:N2}" -f ($ObjSize.Sum / 1GB)
$ObjectSizeTB = "{0:N2}" -f ($ObjSize.Sum / 1TB)
$NumFiles = ($GetObjectInfo | Where-Object {-not $_.PSIsContainer}).Count
$NumFolders = ($GetObjectInfo | Where-Object {$_.PSIsContainer}).Count
$ACL = Get-Acl -Path $Path
$LastAccessTime = Get-ItemProperty $Path | Select-Object -ExpandProperty LastAccessTime
$LastWriteTime = Get-ItemProperty $Path | Select-Object -ExpandProperty LastWriteTime
$Table = [PSCustomObject]@{
'ScopeName' = $FileShare.ScopeName
'Sharename' = $ShareName
'SharePath' = $Path
'Owner' = $ACL.Owner
'Permissions' = $ACL.AccessToString
'LastAccess' = $LastAccessTime
'LastWrite' = $LastWriteTime
'Size (MB)' = $ObjectSizeMB
'Size (GB)' = $ObjectSizeGB
'Size (TB)' = $ObjectSizeTB
'Total File Count' = $NumFiles
'Total Folder Count' = $NumFolders
'Total Item Count' = $GetObjectInfo.Count
}
$Table | Export-CSV -Path $SaveCSVPath -Append -NoTypeInformation
}
$EndTime = Get-Date
$End = $EndTime | Select-Object -ExpandProperty DateTime
Write-Host
Write-Host "Script start time: $Start" -ForegroundColor Green
Write-Host "Script end time: $End" -ForegroundColor Green
Write-Host
$ElapsedTime = $(($EndTime-$StartTime))
Write-Host "Elapsed time: $($ElapsedTime.Days) Days $($ElapsedTime.Hours) Hours $($ElapsedTime.Minutes) Minutes $($ElapsedTime.Seconds) Seconds $($ElapsedTime.MilliSeconds) Milliseconds" -ForegroundColor Cyan
Write-Host
Write-Host "Results saved to $SaveCSVPath" -ForegroundColor Green
Write-Host
Write-Host "Transcript saved to $TranscriptLog" -ForegroundColor Green
Write-Host
Stop-Transcript
To correctly use the PowerShell pipeline (and preserve memory, as each item is streamed separately), use the PowerShell ForEach-Object cmdlet (unlike the foreach statement), avoid assigning the pipeline to a variable (as you are doing with $FileShares = ...), and don't put parentheses ((...)) around the pipeline:
Get-SmbShare -ScopeName $ScopeName | Where-Object $Exclusions | ForEach-Object {
Then replace all $FileShare references in your loop with the current item variable $_ (e.g. $FileShare.Name → $_.Name).
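A rough sketch of what the restructured outer loop might look like (only a skeleton; the per-share gathering is elided):
    Get-SmbShare -ScopeName $ScopeName | Where-Object $Exclusions | ForEach-Object {
        $ShareName = $_.Name
        $Path      = $_.Path
        Write-Host "Working on: $ShareName - $Path" -ForegroundColor Yellow
        # ...gather the per-share numbers here and emit one [PSCustomObject] per share...
    } | Export-Csv -Path $SaveCSVPath -NoTypeInformation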
For the Get-Childitem part you might do the same thing (stream! meaning: use the mighty PowerShell pipeline rather than piling everything up in $GetObjectInfo):
$ObjSize = Get-Childitem -Path $Path -Recurse -Force -ErrorAction SilentlyContinue |
Measure-Object -Property Length -Sum -ErrorAction SilentlyContinue
As an aside, you might simplify your three size properties into a single smarter size property; see: How to convert value to KB, MB, or GB depending on digit placeholders?
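For instance, a small helper along these lines (illustrative only, not taken from the linked answer) picks the unit based on magnitude:
    function Format-Size {
        param([long]$Bytes)
        switch ($Bytes) {
            { $_ -ge 1TB } { return '{0:N2} TB' -f ($Bytes / 1TB) }
            { $_ -ge 1GB } { return '{0:N2} GB' -f ($Bytes / 1GB) }
            { $_ -ge 1MB } { return '{0:N2} MB' -f ($Bytes / 1MB) }
            { $_ -ge 1KB } { return '{0:N2} KB' -f ($Bytes / 1KB) }
            default        { return "$Bytes bytes" }
        }
    }
    # e.g. 'Size' = Format-Size $ObjSize.Sum instead of the three separate size columns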
Addition:
"But isn't putting everything into $ObjSize just swapping one variable for another?"
No, it is not. Think of the PowerShell pipeline as an assembly line: at the first station you take the information for a single file and pass it on to the next (and last) station, where you just add its Length to the running sum, after which the current file object can be disposed of.
In your question's example, by contrast, you read the information for all files at once, store it in $GetObjectInfo, and then walk through the whole list just to add up the Length property of the (quite heavy) PowerShell file objects.
But why not try it yourself?
Open a new PowerShell session and run:
$Path = '.'
$GetObjectInfo = Get-Childitem -Path $Path -Recurse -Force -ErrorAction SilentlyContinue
$ObjSize = $GetObjectInfo | Measure-Object -Property Length -Sum -ErrorAction SilentlyContinue
Get-Process -ID $PID
Now, open a new session again and use the PowerShell pipeline:
$Path = '.'
$ObjSize = Get-Childitem -Path $Path -Recurse -Force -ErrorAction SilentlyContinue |
Measure-Object -Property Length -Sum -ErrorAction SilentlyContinue
Get-Process -ID $PID
Notice the difference in memory usage (WS(M)).
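Another rough indicator (managed memory only) is to ask .NET for the heap size in each session:
    [System.GC]::GetTotalMemory($true) / 1MB   # managed memory in MB after a forced garbage collection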
You are buffering the entire collection of [FileSystemInfo] on $FileShare into a variable with...
$GetObjectInfo = Get-Childitem -Path $Path -Recurse -Force -ErrorAction SilentlyContinue
So, if there are a million directories and files on that share, then that's a million [FileSystemInfo] instances stored in a million-element array, none of which can be garbage collected during that iteration of the foreach loop. You can use Group-Object to improve that a bit...
$groupsByPSIsContainer = Get-Childitem -Path $Path -Recurse -Force -ErrorAction SilentlyContinue |
Group-Object -Property 'PSIsContainer' -AsHashTable
# $groupsByPSIsContainer is a [Hashtable] with two keys:
# - $true gets the collection of directories
# - $false gets the collection of files
$ObjSize = $groupsByPSIsContainer[$false] | Measure-Object -Property Length -Sum -ErrorAction SilentlyContinue
$NumFiles = $groupsByPSIsContainer[$false].Count
$NumFolders = $groupsByPSIsContainer[$true].Count
...but that still ends up storing all of the [FileSystemInfo]s in the two branches of the [Hashtable]. Instead, I would just enumerate and count the results myself...
$ObjSize = 0L # Stores the total file size directly; use $ObjSize instead of $ObjSize.Sum
$NumFiles = 0
$NumFolders = 0
foreach ($fileSystemInfo in Get-Childitem -Path $Path -Recurse -Force -ErrorAction SilentlyContinue)
{
if ($fileSystemInfo.PSIsContainer)
{
$NumFolders++
}
else
{
$NumFiles++
$ObjSize += $fileSystemInfo.Length
}
}
That stores only the current enumeration result in $fileSystemInfo and never the entire sequence.
Note that if you weren't summing the files' sizes Group-Object would work well...
$groupsByIsContainer = Get-Childitem -Path $Path -Recurse -Force -ErrorAction SilentlyContinue |
Group-Object -Property 'PSIsContainer' -NoElement
$NumFiles = ($groupsByIsContainer | Where-Object -Property 'Name' -EQ -Value $false).Count
$NumFolders = ($groupsByIsContainer | Where-Object -Property 'Name' -EQ -Value $true ).Count
-NoElement prevents the resulting group objects from storing the grouped elements; we only care about the count of members in each grouping, not the members themselves. If we passed -AsHashTable we'd lose the convenient Count property, which is why the two groups have to be accessed in this awkward way.
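Putting it together with the rest of the script, the body of the per-share loop could end up looking roughly like this (an untested sketch that reuses the question's variable and column names and collapses the three size columns into one):
    ForEach ($FileShare in $FileShares)
    {
        $ShareName = $FileShare.Name
        $Path      = $FileShare.Path
        Write-Host "Working on: $ShareName - $Path" -ForegroundColor Yellow
        # stream the enumeration: count and sum as we go, keep nothing else in memory
        $ObjSize    = 0L
        $NumFiles   = 0
        $NumFolders = 0
        foreach ($fsi in Get-ChildItem -Path $Path -Recurse -Force -ErrorAction SilentlyContinue)
        {
            if ($fsi.PSIsContainer) { $NumFolders++ }
            else                    { $NumFiles++; $ObjSize += $fsi.Length }
        }
        $ACL = Get-Acl -Path $Path
        [PSCustomObject]@{
            'ScopeName'          = $FileShare.ScopeName
            'Sharename'          = $ShareName
            'SharePath'          = $Path
            'Owner'              = $ACL.Owner
            'Permissions'        = $ACL.AccessToString
            'Size (GB)'          = '{0:N2}' -f ($ObjSize / 1GB)
            'Total File Count'   = $NumFiles
            'Total Folder Count' = $NumFolders
            'Total Item Count'   = $NumFiles + $NumFolders
        } | Export-Csv -Path $SaveCSVPath -Append -NoTypeInformation
    }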

Folders with more than 40,000 files

I have this script I received to check folders and subfolders on a network drive. I wonder how it could be modified to check only folders and subfolders and write to the CSV any folder that contains more than 40,000 files, along with the number of files. The image shows sample output from the script as it is now; I do not need it to show any files as it currently does.
$dir = "D:\test"
$results = @()
gci $dir -Recurse -Depth 1 | % {
$temp = [ordered]@{
NAME = $_
SIZE = "{0:N2} MB" -f ((gci $_.Fullname -Recurse | measure -Property Length -Sum -ErrorAction SilentlyContinue).Sum / 1MB)
FILE_COUNT = (gci -File $_.FullName -Recurse | measure | select -ExpandProperty Count)
FOLDER_COUNT = (gci -Directory $_.FullName -Recurse | measure | select -ExpandProperty Count)
DIRECTORY_PATH = $_.Fullname
}
$results += New-Object PSObject -Property $temp
}
$results | export-csv -Path "C:\temp\output.csv" -NoTypeInformation
Instead of executing so many Get-ChildItem cmdlets, here's an approach that uses robocopy to do the heavy lifting of counting the number of files, folders and total sizes:
# set the rootfolder to search
$dir = 'D:\test'
# switches for robocopy
$roboSwitches = '/L','/E','/NJH','/BYTES','/NC','/NP','/NFL','/XJ','/R:0','/W:0','/MT:16'
# regex to parse the output from robocopy
$regEx = '\s*Total\s*Copied\s*Skipped\s*Mismatch\s*FAILED\s*Extras' +
'\s*Dirs\s*:\s*(?<DirCount>\d+)(?:\s+\d+){3}\s+(?<DirFailed>\d+)\s+\d+' +
'\s*Files\s*:\s*(?<FileCount>\d+)(?:\s+\d+){3}\s+(?<FileFailed>\d+)\s+\d+' +
'\s*Bytes\s*:\s*(?<ByteCount>\d+)(?:\s+\d+){3}\s+(?<BytesFailed>\d+)\s+\d+'
# loop through the directories directly under $dir
$result = Get-ChildItem -Path $dir -Directory | ForEach-Object {
$path = $_.FullName # or if you like $_.Name
$summary = (robocopy.exe $_.FullName NULL $roboSwitches | Select-Object -Last 8) -join [Environment]::NewLine
if ($summary -match $regEx) {
$numFiles = [int64] $Matches['FileCount']
if ($numFiles -gt 40000) {
[PsCustomObject]@{
PATH = $path
SIZE = [int64] $Matches['ByteCount']
FILE_COUNT = [int64] $Matches['FileCount']
FOLDER_COUNT = [int64] $Matches['DirCount']
}
}
}
else {
Write-Warning -Message "Path '$path' output from robocopy was in an unexpected format."
}
}
# output on screen
$result | Format-Table -AutoSize
# output to CSV file
$result | Export-Csv -Path "C:\temp\output.csv" -NoTypeInformation
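If you want to sanity-check $regEx before pointing the script at live shares, you can run it against a captured summary block. The sample text below is illustrative only, not verbatim robocopy output:
    $sample = @'
               Total    Copied   Skipped  Mismatch    FAILED    Extras
    Dirs :        81         0        81         0         0         0
   Files :     54210         0     54210         0         0         0
   Bytes :  12638274100      0  12638274100      0         0         0
'@
    if ($sample -match $regEx) {
        $Matches['FileCount']   # '54210'
        $Matches['ByteCount']   # '12638274100'
    }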

Exporting objects and strings to CSV using PowerShell

The purpose of this code is to transfer files from one location to another and to log whether the transfer was a success or a failure.
Everything works except I am having issues with the log. I want the log to be in CSV format and there to be 3 columns: success/failure, from location, and to location. This is outputting the results all into rows with one column.
I've tried the Export-Csv option, but that looks for objects/properties, so it only displays the length (I have strings too). Add-Content works, but there is only one column. Any suggestions?
#LOCATION OF CSV
$csv = Import-Csv C:\test2.csv
#SPECIFY DATE (EXAMPLE-DELETE FILES > 7 YEARS. 7 YEARS=2555 DAYS SO YOU WOULD ENTER "-2555" BELOW)
$Daysback = "-1"
#FILE DESTINATION
$storagedestination = "C:\Users\mark\Documents\Test2"
#LOG LOCATION
$loglocation = "C:\Users\mark\Documents\filetransferlog.csv"
$s = "SUCCESS"
$f = "FAIL"
$CurrentDate = Get-Date
foreach ($line in $csv) {
$Path = $line | Select-Object -ExpandProperty FullName
$DatetoDelete = $CurrentDate.AddDays($DaysBack)
$objects = Get-ChildItem $Path -Recurse | Select-Object FullName, CreationTime, LastWriteTime, LastAccessTime | Where-Object { $_.LastWriteTime -lt $DatetoDelete }
foreach ($object in $objects) {
try
{
$sourceRoot = $object | Select-Object -ExpandProperty FullName
Copy-Item -Path $sourceRoot -Recurse -Destination $storagedestination
Remove-Item -Path $sourceRoot -Force -Recurse
$temp = $s, $sourceRoot, $storagedestination
$temp | add-content $loglocation
}
catch
{
$temp2 = $f, $sourceRoot, $storagedestination
$temp2 | add-content $loglocation
}
}
}
All your | Select-Object -ExpandProperty calls are superfluous; simply attach the property name to the variable, e.g. $Path = $line.FullName.
Why calculate $DatetoDelete inside the foreach every time?
Output the success/fail result as a [PSCustomObject] and gather them in a variable assigned directly from the foreach.
Untested:
$csv = Import-Csv C:\test2.csv
$Daysback = "-1"
$destination = "C:\Users\mark\Documents\Test2"
$loglocation = "C:\Users\mark\Documents\filetransferlog.csv"
$s = "SUCCESS"
$f = "FAIL"
$CurrentDate = Get-Date
$DatetoDelete = $CurrentDate.Date.AddDays($DaysBack)
$Log = foreach ($line in $csv) {
$objects = Get-ChildItem $line.FullName -Rec |
Where-Object LastWriteTime -lt $DatetoDelete
foreach ($object in $objects) {
$Result = $s
$sourceRoot = $object.FullName
try {
Copy-Item -Path $sourceRoot -Recurse -Destination $destination
Remove-Item -Path $sourceRoot -Recurse -Force
} catch {
$Result = $f
}
[PSCustomObject]@{
'Success/Fail' = $Result
Source = $sourceRoot
Destination = $destination
}
}
}
$Log | Export-Csv $loglocation -NoTypeInformation
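As a side note: because the results are emitted as objects, you could also stream the log straight to disk instead of collecting everything in $Log first, which helps if the file list is very large. A rough sketch (the inner loop body is the same as above):
    & {
        foreach ($line in $csv) {
            # ...same inner foreach as above, emitting one [PSCustomObject] per file...
        }
    } | Export-Csv $loglocation -NoTypeInformation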

Check files on remote computers for time stamp older than X hours and export results to CSV

We are trying to run a script against a pile of remote computers to check the date stamps of files in a fixed folder that are older than, say, 12 hours and return the results to a CSV. The date range needs to be flexible, as it's a set time of 6 pm yesterday which will move as time moves on.
$computers = Get-Content -Path computers.txt
$filePath = "c:\temp\profile"
$numdays = 0
$numhours = 12
$nummins = 5
function ShowOldFiles($filepath, $days, $hours, $mins)
{
$files = $computers #(get-childitem $filepath -include *.* -recurse | where {($_.LastWriteTime -lt (Get-Date).AddDays(-$days).AddHours(-$hours).AddMinutes(-$mins)) -and ($_.psIsContainer -eq $false)})
if ($files -ne $NULL)
{
for ($idx = 0; $idx -lt $files.Length; $idx++)
{
$file = $files[$idx]
write-host ("Old: " + $file.Name) -Fore Red
}
}
}
Write-output $computers, $numdays, $numhours, $nummins >> computerlist.txt
You could run the following script against all of your remote machines:
$computers = Get-Content -Path computers.txt
$logFile = "\\ServerName\C$\Logfile.txt"
$date = "12/03/2002 12:00"
$limit = Get-Date $date
$computers | %{
$filePath = "\\$_\C$\temp\profile"
$files = $null
$files = Get-ChildItem -Path $filePath -Recurse -Force | `
Where-Object {$_.CreationTime -lt $limit }
If($files -ne $null){
"-------------------------[$($_)]------------------------">> $logFile
$files | Foreach {$_.FullName >> $logFile}
}
}
This will check the given folder ($filePath) for files that are older than the given limit. Files older than the limit will have their full file path logged to the given network location $logFile.
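To keep that limit flexible rather than hard-coded, you could calculate "6 pm yesterday" (or "12 hours ago") at run time, for example:
    # 6:00 pm yesterday, recalculated every time the script runs
    $limit = (Get-Date).Date.AddDays(-1).AddHours(18)
    # or simply "12 hours ago"
    $limit = (Get-Date).AddHours(-12)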
With a small alteration to @chard's earlier code I managed to get a workable solution.
The output log file only returns the files that are older than the date in the code.
This can be manipulated in Excel with other outputs for our needs.
I will try the updated code above in a bit.
$computers = Get-Content -Path "C:\temp\computers.txt"
$logFile = "\\SERVER\logs\output.txt"
$numdays = 3
$numhours = 10
$nummins = 5
$limit = (Get-Date).AddDays(-$numdays).AddHours(-$numhours).AddMinutes(-$nummins)
$computers | %{
$filePath = "\\$_\C$\temp\profile\runtime.log"
Get-ChildItem -Path $filePath -Recurse -Force | `
Where-Object {$_.LastWriteTime -lt $limit } |
foreach {"$($_)">> $logFile}
}
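If you need the results as a CSV rather than a plain text log (as the question title asks), the same loop can emit objects and pipe them to Export-Csv. A rough, untested sketch (the output path is just an example):
    $computers | ForEach-Object {
        $computer = $_
        $filePath = "\\$computer\C$\temp\profile\runtime.log"
        Get-ChildItem -Path $filePath -Recurse -Force -ErrorAction SilentlyContinue |
            Where-Object { $_.LastWriteTime -lt $limit } |
            Select-Object @{n='Computer';e={$computer}}, FullName, LastWriteTime
    } | Export-Csv "\\SERVER\logs\output.csv" -NoTypeInformation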

PowerShell: get total size of files which each user owns

I need a PowerShell script which will go through all users on the system and find the total size of all the files each user owns... I have a script which goes through all users, but I have no idea how to continue with counting the total size each user owns.
Here is the script I have now:
$users = Get-WmiObject -class Win32_UserAccount
foreach($user in $users) {
$name = $user.Name
$fullName = $user.FullName;
if(Test-Path "C:\Users\$name") {
$path = "C:\Users\$name"
} else {
$path = "C:\Users\Public"
}
$dirSize = (Get-ChildItem $path -recurse | Measure-Object -property length -sum)
"{0:N2}" -f ($dirSize.sum / 1Gb) + " Gb"
echo "$dirSize"
Add-Content -path "pathototxt..." -value "$name $fullName $path"
}
I would be more than happy if somebody knows the answer and can tell me...
Thank you
If there are a lot of files, you might want to consider:
$oSIDs = @{}
get-childitem <filespec> |
foreach {
$oSID = $_.GetAccessControl().Sddl -replace '^o:(.+?).:.+','$1'
$oSIDs[$oSID] += $_.length
}
Then resolve the SIDs when you're done. Parsing the owner SID or well-known security principal ID from the SDDL string saves the provider from having to do a lot of repetitive name resolution to give you back the "friendly" names.
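Resolving the accumulated SIDs afterwards could look something like this (a sketch; it falls back to the raw SID if the translation fails):
    $oSIDs.GetEnumerator() | ForEach-Object {
        $sid = $_.Key
        try {
            $objSid  = New-Object System.Security.Principal.SecurityIdentifier($sid)
            $account = $objSid.Translate([System.Security.Principal.NTAccount]).Value
        } catch {
            $account = $sid   # keep the raw SID if it cannot be translated
        }
        [PSCustomObject]@{ Owner = $account; 'Size(GB)' = '{0:N2}' -f ($_.Value / 1GB) }
    }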
I'm not entirely sure what you're asking for here.
"to continue with counting total size which user owns for each user" - do you want to check every file on the system, or just the user folders as you currently do?
Your script works fine if you just tweak it to include the file size in the output. Personally I'd consider using a CSV to store this, because not all users will have e.g. a full name (admin, guest, etc.). Also, at the moment your script counts the Public folder multiple times (each time a user doesn't have a profile); e.g. admin (if it has never logged in) and guest might both get it reported.
Updated script that outputs both a text file and a CSV:
$users = Get-WmiObject -class Win32_UserAccount
$out = @()
#If you want to append to a csv-file, replace the $out line above with the one below
#$out = Import-Csv "file.csv"
foreach($user in $users) {
$name = $user.Name
$fullName = $user.FullName;
if(Test-Path "C:\Users\$name") {
$path = "C:\Users\$name"
} else {
$path = "C:\Users\Public"
}
$dirSize = (Get-ChildItem $path -Recurse -ErrorAction SilentlyContinue | ? { !$_.PSIsContainer } | Measure-Object -Property Length -Sum)
$size = "{0:N2}" -f ($dirSize.Sum / 1Gb) + " Gb"
#Saving as textfile
#Add-Content -path "pathototxt..." -value "$name $fullName $path $size"
Add-Content -path "file.txt" -value "$name $fullName $path $size"
#CSV-way
$o = New-Object psobject -Property @{
Name = $name
FullName = $fullName
Path = $path
Size = $size
}
$out += $o
}
#Exporting to csv format
$out | Export-Csv "file.csv" -NoTypeInformation
EDIT: Another solution, using the answer provided by @mjolinor and @C.B., modified to scan your C:\ drive while excluding some "root folders" like "Program Files", "Windows", etc. It exports the result to a CSV file ready for Excel:
$oSIDs = @{}
$exclude = @("Program Files", "Program Files (x86)", "Windows", "Perflogs");
Get-ChildItem C:\ | ? { $exclude -notcontains $_.Name } | % { Get-ChildItem $_.FullName -Recurse -ErrorAction SilentlyContinue | ? { !$_.PSIsContainer } } | % {
$oSID = $_.GetAccessControl().Sddl -replace '^o:(.+?).:.+','$1'
$oSIDs[$oSID] += $_.Length
}
$out = @()
$oSIDs.GetEnumerator() | % {
$user = (New-Object System.Security.Principal.SecurityIdentifier($_.Key)).Translate([System.Security.Principal.NTAccount]).Value
$out += New-Object psobject -Property @{
User = if($user) { $user } else { $_.Key }
"Size(GB)" = $oSIDs[$_.Key]/1GB
}
}
$out | Export-Csv file.csv -NoTypeInformation -Delimiter ";"