PowerShell Process CPU checking

I have the following, which works OK, but with an issue, in PowerShell:
$FileName = "E:\Work\ps\Inventory.htm"
$serverlist = "E:\Work\ps\Monitored_computers.txt"
foreach ($server in Get-Content $serverlist)
{
$servern=$server.split(",")[0]
$ip=$server.split(",")[1]
$cpu = gwmi Win32_PerfFormattedData_PerfProc_Process -Computer $servern -filter "Name <> '_Total' and Name <> 'Idle'" | Sort-Object PercentProcessorTime -Descending | where { $_.PercentProcessorTime -gt 0 }| select -First 1
if ($cpu.PercentProcessorTime -ge "92") {
write-host $servern ' ' $cpu.Name ' ' $cpu.PercentProcessorTime
}
}
I have seen some other code in PowerShell that takes an average, but it almost seems like an "average of an average", which is meaningless. And this is for overall CPU usage:
gwmi win32_processor | Measure-Object -property LoadPercentage -Average | Foreach {$_.Average}
Now, if we can take the same logic and apply it to our per-process issue:
gwmi Win32_PerfFormattedData_PerfProc_Process | Sort-Object PercentProcessorTime -Descending | where { $_.PercentProcessorTime -gt 0 } | select -First 1 | Measure-Object -property PercentProcessorTime -Average | Foreach {$_.PercentProcessorTime}
What I am trying to ask is: I do get the CPU percentage, but it seems to be a "point in time". How do I locate the true CPU percentage? This is why I am pointing out the average; I really want to get around the "point in time" part of this.
The point being, on several occasions we have seen high CPU for a process on a server, but by the time we log in to the server the high CPU has subsided. That is not to say it happens every time, but we know that sometimes a CPU will spike and then quiet down.
Thanks for any insight!

First issue: you are stuck at a point in time because when you execute your script it captures a snapshot of what is happening right then and there. What you are looking for is historical data, so you can figure out the average CPU usage of processes over a set amount of time and pinpoint the process that's bogging down your CPU. Do you have performance monitors in place to track CPU usage for individual processes? You may need to set up performance logging if you want to be able to get the numbers you're looking for after the fact.
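If full performance logging isn't an option, one way to soften the point-in-time problem is to take several samples yourself and average them. Here is a minimal sketch using Get-Counter (not part of the original answer; it assumes you have performance-counter access to the target, reuses the $servern variable from the question, and note that this counter is per-core, so values can exceed 100 on multi-core machines):

# Sample per-process CPU every 5 seconds, 12 times (about a minute),
# then average the readings for each process instance.
$samples = Get-Counter -Counter '\Process(*)\% Processor Time' -SampleInterval 5 -MaxSamples 12 -ComputerName $servern
$samples.CounterSamples |
    Where-Object { $_.InstanceName -notin '_total', 'idle' } |
    Group-Object InstanceName |
    ForEach-Object {
        [pscustomobject]@{
            Process   = $_.Name
            AvgCpuPct = [math]::Round(($_.Group.CookedValue | Measure-Object -Average).Average, 1)
        }
    } |
    Sort-Object AvgCpuPct -Descending |
    Select-Object -First 5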
Secondly, I think that you misunderstand how Measure-Object works. If you run Get-Help on the cmdlet and check the output, you'll see that it outputs a GenericMeasureInfo object. This object has a property for the relevant stat that you are looking for, which in your case is the Average property. It is not an average of an average; the most common usage I see is to calculate something, like a Sum or Average, and then output the value of that property.
Let's try a simple example...
Find the average size of the files in a folder. First we use Get-ChildItem to get a collection of files and pipe it to Measure-Object. We pass the -Average argument so that the average is calculated, and -Property length so that it knows what to average:
GCI C:\Temp\* -file | Measure-Object -Average -Property length
This outputs a GenericMeasureInfo object like this:
Count    : 30
Average  : 55453155
Sum      :
Maximum  :
Minimum  :
Property : Length
That lets me know that it had 30 files piped to it, and it found the Average for the Length property. Now, sometimes you want to calculate more than one thing, so you can use more than one argument, such as -Sum and -Maximum, and those values will be populated as well:
Count    : 30
Average  : 55453155
Sum      : 1663594650
Maximum  : 965376000
Minimum  :
Property : Length
So it looks like my average file is ~55MB, but out of the 1.6GB in the whole folder I've got one file that's 965MB! That file is undoubtedly skewing my numbers. With that output I could find folders that have multiple files but where one file is taking up over half of the space for the folder, and spot anomalies... such as the ISO that I have saved to my C:\temp folder for some reason. Looks like I need to do some file maintenance.

Thanks to @TheMadTechnician I have been able to sort this out. I had the wrong property: I needed
$_.Average
where I had
$_.PercentProcessorTime
and that would never work. Here is the correct script:
$serverlist = "D:\Work\ps\Monitored_computers.txt"
foreach ($server in Get-Content $serverlist) {
$servern=$server.split(",")[0]
$ip=$server.split(",")[1]
$cpu = gwmi Win32_PerfFormattedData_PerfProc_Process -Computer $ip | `
Where-Object {$_.Name -like "*tomcat*"} | `
Measure-Object -property PercentProcessorTime -Average | `
Foreach {$_.Average}
if ($cpu -ge "20") {
write-host $servern $cpu ' has a tomcat process greater than 20'
}
}

Related

powershell: cpu usage, function does not work for value 100

I am trying to write a short script to inform me when the CPU usage is over some limit, but it only works if the CPU is below 100%; it fails when the CPU is at 100%.
$trashhold=90
$doit=Get-WmiObject Win32_Processor | Measure-Object -Property LoadPercentage -Average | Select Average | Out-String
if ($doit -match "\d+"){
$doit = $matches[0];
}
if ($doit -gt $trashhold)
{
send email bla bla
}
else
{
Write-Output "less than $limit, do nothing"
}
Basically the regex extracts the number from the Get-WmiObject output, and the script works if the CPU is 99 or less. For 100 it takes the else branch even though it is bigger than the threshold ($trashhold = 90). Where is the mistake?
I tried to catch an error but without results (the regex is returning the correct numbers).
The mistake is that after the regex match $doit is a string, and -gt with a string on the left performs a lexical comparison, so "100" -gt "90" is $false (because "1" sorts before "9"). There is no need for Out-String and no need for the regex either; you're just overcomplicating it:
$threshold = 90
$cpuUsage = (Get-CimInstance Win32_Processor | Measure-Object -Property LoadPercentage -Average).Average
if($cpuUsage -gt $threshold) {
# send email blah blah with `$cpuUsage`
}
else {
"Less than $threshold, do nothing"
}
What you're looking to do is obtain the property value from the LoadPercentage property. This can be done via member-access enumeration (as shown in this answer), or using Select-Object -ExpandProperty LoadPercentage. ForEach-Object LoadPercentage would also work (see the -MemberName parameter for more info). There are other methods too, though these are the most common ones.
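For illustration, a quick sketch of those three options side by side (each returns the raw LoadPercentage value(s) rather than an object with that property):

$instances = Get-CimInstance Win32_Processor

# 1. Member-access enumeration (PowerShell 3+): asking the collection for a
#    property returns that property's value from every element.
$instances.LoadPercentage

# 2. Select-Object -ExpandProperty: unwraps the property value(s) instead of
#    returning objects that merely contain the property.
$instances | Select-Object -ExpandProperty LoadPercentage

# 3. ForEach-Object with a member name (the -MemberName parameter set).
$instances | ForEach-Object LoadPercentage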

How can I improve the speed and memory usage of calculating the size of the N largest files?

I am getting the total number of bytes of the 32 largest files in the folder:
$big32 = Get-ChildItem c:\temp -recurse |
    Sort-Object length -descending |
    select-object -first 32 |
    measure-object -property length -sum
$big32.sum /1gb
However, it's working very slowly. We have about 10 TB of data in 1.4 million files.
The following implements improvements using only PowerShell cmdlets. Using System.IO.Directory.EnumerateFiles() as a basis, as suggested by this answer, might give another performance improvement, but you should do your own measurements to compare.
(Get-ChildItem c:\temp -Recurse -File).ForEach('Length') |
    Sort-Object -Descending -Top 32 |
    Measure-Object -Sum
This should reduce memory consumption considerably as it only sorts an array of numbers instead of an array of FileInfo objects. Maybe it's also somewhat faster due to better caching (an array of numbers is stored in a contiguous, cache-friendly block of memory, whereas an array of objects only stores the references contiguously, while the objects themselves can be scattered all around in memory).
Note the use of .ForEach('Length') instead of just .Length because of member enumeration ambiguity.
By using Sort-Object parameter -Top we can get rid of the Select-Object cmdlet, further reducing pipeline overhead.
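As a quick illustration of that ambiguity (a hedged sketch, not part of the original answer):

# An array has its own Length property (the element count), and member-access
# enumeration lets the collection's own member win over the elements' members.
$files = Get-ChildItem c:\temp -Recurse -File

$files.Length             # element count of the array, NOT the file sizes
$files.ForEach('Length')  # the Length (size in bytes) of each individual file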
I can think of some improvements, especially to memory usage, but the following should be considerably faster than Get-ChildItem:
[System.IO.Directory]::EnumerateFiles('c:\temp', '*.*', [System.IO.SearchOption]::AllDirectories) |
    Foreach-Object {
        [PSCustomObject]@{
            filename = $_
            length = [System.IO.FileInfo]::New($_).Length
        }
    } |
    Sort-Object length -Descending |
    Select-Object -First 32
Edit
I would look at trying to implement an implicit heap to reduce memory usage without hurting performance (it possibly even improves it... to be tested).
Edit 2
If the filenames are not required, the easiest gain on memory is to not include them in the results.
[System.IO.Directory]::EnumerateFiles('c:\temp', '*.*', [System.IO.SearchOption]::AllDirectories) |
    Foreach-Object {
        [System.IO.FileInfo]::New($_).Length
    } |
    Sort-Object -Descending |
    Select-Object -First 32
Firstly, if you're going to use Get-ChildItem then you should pass the -File switch parameter so that [System.IO.DirectoryInfo] instances never enter the pipeline.
Secondly, you're not passing the -Force switch parameter to Get-ChildItem, so any hidden files in that directory structure won't be retrieved.
Thirdly, note that your code is retrieving the 32 largest files, not the files with the 32 largest lengths. That is, if files 31, 32, and 33 are all the same length, then file 33 will be arbitrarily excluded from the final count. If that distinction is important to you, you could rewrite your code like this...
$filesByLength = Get-ChildItem -File -Force -Recurse -Path 'C:\Temp\' |
    Group-Object -AsHashTable -Property Length
$big32 = $filesByLength.Keys |
    Sort-Object -Descending |
    Select-Object -First 32 |
    ForEach-Object -Process { $filesByLength[$_] } |
    Measure-Object -Property Length -Sum
$filesByLength is a [Hashtable] that maps from a length to the file(s) with that length. The Keys property contains all of the unique lengths of all of the retrieved files, so we get the 32 largest keys/lengths and use each one to send all the files of that length down the pipeline.
Most importantly, sorting the retrieved files to find the largest ones is problematic for several reasons:
Sorting cannot start until all of the input data is available, meaning at that point in time all 1.4 million [System.IO.FileInfo] instances will be present in memory.
I'm not sure how Sort-Object buffers the incoming pipeline data, but I imagine it would be some kind of list that doubles in size every time it needs more capacity, leading to further garbage in memory to be cleaned up.
Each of the 1.4 million [System.IO.FileInfo] instances will be accessed a second time to get their Length property, all the while whatever sorting manipulations (depending on what algorithm Sort-Object uses) are occurring, too.
Since we only care about a mere 32 largest files/lengths out of 1.4 million files, what if we only kept track of those 32 instead of all 1.4 million? Consider if we only wanted to find the single largest file...
$largestFileLength = 0
$largestFile = $null
foreach ($file in Get-ChildItem -File -Force -Recurse -Path 'C:\Temp\')
{
    # Track the largest length in a separate variable to avoid two comparisons...
    # if ($largestFile -eq $null -or $file.Length -gt $largestFile.Length)
    if ($file.Length -gt $largestFileLength)
    {
        $largestFileLength = $file.Length
        $largestFile = $file
    }
}
Write-Host "The largest file is named ""$($largestFile.Name)"" and has length $largestFileLength."
As opposed to Get-ChildItem ... | Sort-Object -Property Length -Descending | Select-Object -First 1, this has the advantage of only one [FileInfo] object being "in-flight" at a time and the complete set of [System.IO.FileInfo]s being enumerated only once. Now all we need to do is to take the same approach but expanded from 1 file/length "slot" to 32...
$basePath = 'C:\Temp\'
$lengthsToKeep = 32
$includeZeroLengthFiles = $false
$listType = 'System.Collections.Generic.List[System.IO.FileInfo]'
# A SortedDictionary[,] could be used instead to avoid having to fully enumerate the Keys
# property to find the new minimum length, but add/remove/retrieve performance is worse
$dictionaryType = "System.Collections.Generic.Dictionary[System.Int64, $listType]"
# Create a dictionary pre-sized to the maximum number of lengths to keep
$filesByLength = New-Object -TypeName $dictionaryType -ArgumentList $lengthsToKeep
# Cache the minimum length currently being kept
$minimumKeptLength = -1L
Get-ChildItem -File -Force -Recurse -Path $basePath |
    ForEach-Object -Process {
        if ($_.Length -gt 0 -or $includeZeroLengthFiles)
        {
            $list = $null
            if ($filesByLength.TryGetValue($_.Length, [ref] $list))
            {
                # The current file's length is already being kept
                # Add the current file to the existing list for this length
                $list.Add($_)
            }
            else
            {
                # The current file's length is not being kept
                if ($filesByLength.Count -lt $lengthsToKeep)
                {
                    # There are still available slots to keep more lengths
                    $list = New-Object -TypeName $listType
                    # The current file's length will occupy an empty slot of kept lengths
                }
                elseif ($_.Length -gt $minimumKeptLength)
                {
                    # There are no available slots to keep more lengths
                    # The current file's length is large enough to keep
                    # Get the list for the minimum length
                    $list = $filesByLength[$minimumKeptLength]
                    # Remove the minimum length to make room for the current length
                    $filesByLength.Remove($minimumKeptLength) |
                        Out-Null
                    # Reuse the list for the now-removed minimum length instead of allocating a new one
                    $list.Clear()
                    # The current file's length will occupy the newly-vacated slot of kept lengths
                }
                else
                {
                    # There are no available slots to keep more lengths
                    # The current file's length is too small to keep
                    return
                }
                $list.Add($_)
                $filesByLength.Add($_.Length, $list)
                $minimumKeptLength = ($filesByLength.Keys | Measure-Object -Minimum).Minimum
            }
        }
    }
# Unwrap the files in each by-length list
foreach ($list in $filesByLength.Values)
{
    foreach ($file in $list)
    {
        $file
    }
}
I went with the approach, described above, of retrieving the files with the 32 largest lengths. A [Dictionary[Int64, List[FileInfo]]] is used to track those 32 largest lengths and the corresponding files with that length. For each input file, we first check if its length is among the largest so far and, if so, add the file to the existing List[FileInfo] for that length. Otherwise, if there's still room in the dictionary we can unconditionally add the input file and its length, or if the input file is at least bigger than the smallest tracked length we can remove that smallest length and add in its place the input file and its length. Once there are no more input files we output all of the [FileInfo] objects from all of the [List[FileInfo]]s in the [Dictionary[Int64, List[FileInfo]]].
I ran this simple benchmarking template...
1..5 |
    ForEach-Object -Process {
        [GC]::Collect()
        return Measure-Command -Expression {
            # Code to test
        }
    } | Measure-Object -Property 'TotalSeconds' -Minimum -Maximum -Average
...on PowerShell 7.2 against my $Env:WinDir directory (325,000 files) with these results:
# Code to test                                                                        Minimum      Maximum      Average       Memory usage*
Get-ChildItem -File -Force -Recurse -Path $Env:WinDir                                 69.7240896   79.727841    72.81731518   +260 MB
Get $Env:WinDir files with 32 largest lengths using -AsHashtable, Sort-Object         82.7488729   83.5245153   83.04068032   +1 GB
Get $Env:WinDir files with 32 largest lengths using dictionary of by-length lists     81.6003697   82.7035483   82.15654538   +235 MB
* As observed in the Task Manager → Details tab → Memory (active private working set) column
I'm a little disappointed that my solution is only about 1% faster than the code using the Keys of a [Hashtable], but perhaps grouping the files using a compiled cmdlet vs. not grouping or sorting them but with more (interpreted) PowerShell code is a wash. Still, the difference in memory usage is significant, though I can't explain why the Get-ChildItem call to simply enumerate all files ended up using a bit more.

How can I use powershell to group process names and show the sum of memory used

I am trying to wrap my head around combining PowerShell options in order to produce a simple table of the top 10 memory users on my system (server, PC, etc.). My PC is Windows 7 with no timeline in sight for an upgrade to Windows 10 due to Covid-19. I cannot add applications to my work PC that have not gone through a vetting process (read: it takes forever), so most of the time I create my own.
I would like to produce a result that looks something like this:
Count  Name     Memory Sum in MB
10     Firefox  5000
3      javaw    1000
I would like to be able to select the sort order by changing a property in the PowerShell options, so that I can, for example, sort by count, name, or memory. My sample table is not set in stone.
I have come across the following 2 pieces of powershell and have been trying to adapt them but get errors.
(Get-Process | Measure-Object WorkingSet -sum).sum /1gb
Get-Process | Group-Object -Property Name -NoElement | Where-Object {$_.Count -gt 1}
For sake of learning, I don't mind seeing an "ugly" version and an optimized version.
You can use this:
$proc=ps|select -eXp name;$proc2=@()
$proc|%{
    if(!("$($_)" -in $proc2)){$proc2+="$($_)"
        $mem=0;ps $_|select -eXp workingSet|%{$mem+=$_/1MB}
        [pscustomobject][ordered]@{
            'Count'=(ps $_ -ea silentlyContinue).Count
            'Name'=$_
            'Memory in MB'=$mem
        }}}
The [PSCustomObject] accelerator was introduced in PowerShell v3, so I don't know whether the output looks like a table in Windows 7; however, the following pipeline returns the desired properties even in PowerShell v2:
Get-Process |
    Group-Object -Property Name -NoElement |
    Where-Object { $_.Count -gt 1 } |
    ForEach-Object {
        [PSCustomObject]@{
            Count = $_.Count
            Name  = $_.Name
            'Memory Sum in MB' = [math]::Round(( Get-Process -Name $_.Name |
                Measure-Object WorkingSet -sum).sum /1Mb, 3)
        }
    } # | Sort-Object -Property 'Memory Sum in MB'
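To make the sort order selectable, as the asker wanted, the commented-out Sort-Object at the end can be driven by a variable. A small sketch (the $sortBy variable is just an illustration, not part of the original answer):

# Pick any property emitted by the pipeline: 'Count', 'Name', or 'Memory Sum in MB'
$sortBy = 'Memory Sum in MB'

Get-Process |
    Group-Object -Property Name -NoElement |
    Where-Object { $_.Count -gt 1 } |
    ForEach-Object {
        [PSCustomObject]@{
            Count = $_.Count
            Name  = $_.Name
            'Memory Sum in MB' = [math]::Round((Get-Process -Name $_.Name |
                Measure-Object WorkingSet -Sum).Sum / 1MB, 3)
        }
    } |
    Sort-Object -Property $sortBy -Descending |
    Select-Object -First 10   # top 10, as requested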

How to get max CPU percentage from 5 trials

I am new to Powershell and struggling with syntax.
I want to write a script which gives me max CPU usage by a process out of 5 attempts.
$properties = @(
    @{Name="Process Name"; Expression = {$_.name}},
    @{Name="CPU (%)"; Expression = {$_.PercentProcessorTime}},
    @{Name="Memory (MB)"; Expression = {[Math]::Round(($_.workingSetPrivate / 1mb),2)}}
)
Get-WmiObject -class Win32_PerfFormattedData_PerfProc_Process | Select-Object $properties
I have to run the above process 5 times and pick the top process which has max CPU usage.
This should get you what you want (remember to also include your definition of $properties):
1 .. 5 |
ForEach-Object {
Get-WmiObject -class Win32_PerfFormattedData_PerfProc_Process
} | Where-Object Name -notin '_Total','Idle' |
Sort-Object -Property 'PercentProcessorTime' -Descending |
Select-Object -First 1 -Property $properties
1 .. 5 is the range operator, which generates the set of numbers 1,2,3,4,5. This is just a quick hack to run ForEach-Object 5 times.
Where-Object Name -notin '_Total','Idle' excludes some 'processes' that always have high values but are unlikely to be what you're looking for. Generally it is more efficient to update the call to Get-WmiObject to exclude these at that stage (a sketch of that follows these notes), but for clarity I went with this technique.
Sort-Object -Property 'PercentProcessorTime' -Descending takes all of the readings and sorts them in order from largest CPU value to lowest.
Select-Object -First 1 -Property $properties selects just the first object in the sorted list (i.e. the one with the highest value). Note that it is better to do this last rather than after each call to Get-WmiObject, as it creates a new custom object for each WMI object returned, almost all of which we discard further along the line - it is more efficient to do this 'duplication' only for the final object we select.
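For reference, a sketch of that more efficient variant, filtering at the WMI level rather than with Where-Object (it assumes the same $properties definition shown in the question):

1 .. 5 |
    ForEach-Object {
        # Exclude _Total and Idle in the WQL filter so they never enter the pipeline
        Get-WmiObject -Class Win32_PerfFormattedData_PerfProc_Process -Filter "Name <> '_Total' AND Name <> 'Idle'"
    } |
    Sort-Object -Property 'PercentProcessorTime' -Descending |
    Select-Object -First 1 -Property $properties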

PowerShell script to calculate the memory usage per user

We have an environment of 240 VMs. Clients use ICA/RDP connections to connect to these servers. Sometimes users hog the memory, causing slowness and crashes on that particular server.
I would like to have a PowerShell script to calculate the memory usage for each user connected to the server. I spent hours and hours searching and trying different scripts but was not successful.
Some scripts give me the working set value using Get-WmiObject Win32_Process and GetOwner(), but the calculation is not correct.
What I need is exactly the format that I can see in the users tab in Task Manager. The main information which I need is the memory usage, but it would be nice to have the disk and CPU usage per user as well.
Here is the code which I am using. When I run this script, after a minute or two it returns an error which says $.GetOwner() cannot be found, and on another line it gives me the user name which is using more memory than the others, but the calculation is not correct when I compare it with the Task Manager Users tab.
$h = @{}
get-wmiobject win32_process | foreach {
    $u = $_.getowner().user;
    if ( $u -ne $null)
    {
        if ( !$h.ContainsKey($u) )
        {
            $h.add( $u, $_.WS);
        }
        else
        {
            $h.item($u) = $h.item($u) + $_.WS;
        }
    }
}
$h.GetEnumerator() | sort value -desc
try this:
get-wmiobject win32_process |
    select @{N='User';E={$_.getowner().user}}, WorkingSetSize |
    group user |
    select Name, @{N='Memory (MB)';E={($_.Group.WorkingSetSize | Measure-Object -Sum).Sum / 1Mb }}
gwmi win32_process |
    select @{N='User';E={$_.getowner().user}},WorkingSetSize |
    group User |
    select Name,@{N='RAM';E={[math]::Round(($_.Group.WorkingSetSize | Measure-Object -Sum).Sum/1MB) }} |
    sort RAM -Descending | select -first 1 ;
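Neither snippet covers the CPU part of the question. Here is a hedged sketch using Get-Process -IncludeUserName (not from the answers above; it requires an elevated session and PowerShell 4.0 or later, and the CPU column is cumulative processor seconds, not a live percentage):

Get-Process -IncludeUserName |
    Where-Object UserName |
    Group-Object UserName |
    ForEach-Object {
        [pscustomobject]@{
            User          = $_.Name
            Processes     = $_.Count
            'Memory (MB)' = [math]::Round(($_.Group | Measure-Object WorkingSet64 -Sum).Sum / 1MB)
            'CPU (s)'     = [math]::Round(($_.Group | Where-Object CPU | Measure-Object CPU -Sum).Sum, 1)
        }
    } |
    Sort-Object 'Memory (MB)' -Descending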