PowerShell: stop a process using 45% CPU

Does anyone know how to stop a process that is hung at around 45% CPU usage using PowerShell?
I run into having to do this manually a lot and would like to:
First, develop a script that can find the process by name (in this case it's a process called dfileman.exe, used by an application running on a Windows 2003 server), check to see if it's stuck at or above 45% CPU usage for more than 5 minutes, and stop it if it meets the criteria. The application will start a new process the next time it needs it, so I'm not worried about restarting it.
Second, use MS SCOM to monitor the dfileman.exe process, run the above script whenever it gets hung, and send me an email whenever the script runs.
Any help would be greatly appreciated. Even if it's just helping me with the script.
What I have so far is:
$ProcessName = "defilman.exe"
$PSEmailServer = "smtp.company.com"
foreach ($proc in (Get-WmiObject Win32_Processor)) {
    if ($proc.NumberOfCores -eq $null) {
        # Older systems don't expose NumberOfCores; count the processor as one.
        $cores++
    } else {
        $cores = $cores + $proc.NumberOfCores
    }
}
$cpuusage = [Math]::round(((((Get-Counter "\Process($ProcessName)\% Processor Time" -MaxSamples 2).Countersamples)[0].CookedValue)/$cores),2)
if ($cpuusage -gt 45)
Send-MailMessage -To "Me (myaddress)" -From "Me (myaddress)" -Subject "DFileMan Process Hung" -body "An instance of $ProcessName on Server has reached a CPU Percentage of $cpuusage %. Please Kill Process Immediately"
else
Exit

Instead of a custom script, take a look at Performance Monitor. It has built-in functionality for taking action when a counter stays at a specific value for long enough.
After Perfmon has detected that the app is using too much CPU, you can use Powershell or whatever to kill the process.
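For the kill step itself, a one-liner is usually enough; a minimal sketch, assuming the dfileman process name from the question:
# Stop every instance of the hung process; SilentlyContinue avoids an error
# if the process already exited before the alert fired.
Get-Process -Name dfileman -ErrorAction SilentlyContinue | Stop-Process -Force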

An if statement is written like this: if (test) { code to run if test passes }. Right now you are missing the { } from that. Corrected code:
if ($cpuusage -gt 45){
    Send-MailMessage -To "Me (myaddress)" -From "Me (myaddress)" -Subject "DFileMan Process Hung" -body "An instance of $ProcessName on Server has reached a CPU Percentage of $cpuusage %. Please Kill Process Immediately"
}
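To cover the rest of the original requirement (sustained load for five minutes, then kill and notify), something along these lines could work as a rough sketch; the one-minute sample interval and the email addresses are placeholders, not anything from the original post:
$ProcessName = "dfileman"                # counter instance name, no .exe
$cores = [int]$env:NUMBER_OF_PROCESSORS # normalize the counter per core
$stuck = $true
# Sample once a minute for 5 minutes; bail out early if usage ever drops below 45%.
for ($i = 0; $i -lt 5; $i++) {
    $raw = (Get-Counter "\Process($ProcessName)\% Processor Time").CounterSamples[0].CookedValue
    if (($raw / $cores) -lt 45) { $stuck = $false; break }
    Start-Sleep -Seconds 60
}
if ($stuck) {
    Stop-Process -Name $ProcessName -Force
    Send-MailMessage -SmtpServer "smtp.company.com" -From "me@company.com" -To "me@company.com" `
        -Subject "DFileMan Process Hung" -Body "$ProcessName held at or above 45% CPU for 5 minutes and was stopped."
}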

Related

Invoke-Command takes a long time, but only occasionally

I have a PowerShell script using Invoke-Command that has been in use for about a year that suddenly has been having very bad performance issues. What normally took 1 to 2 seconds to run now takes about 90 seconds. Confusingly, it doesn't always take a long time. In fact, I've been testing it many times throughout today and have seen it run perfectly fine with every attempt over a 10-20 minute period, and then it goes back to being abysmally slow for the next 20-40 minute test period.
A simple look at my test code:
$cred = New-Object System.Management.Automation.PSCredential $username, $securePassword
Write-Host "Running command..."
Invoke-Command -ComputerName $target -Credential $cred -ScriptBlock {
Write-Host "Hello"
}
Write-Host "Done"
The timing of the results goes something like this:
"Running command..."
65 seconds...
"Hello"
25 seconds...
"Done"
For my usage, I need to wait on the process and then see the result of the command, so I can't just throw -AsJob into the script and skip waiting for the output. What should I be looking for to find what's slowing this down? I've checked the target machine during a slow response and don't see unusual CPU or memory usage.
I think I finally found part of what's causing this... And the answer has to do with some details I didn't provide in my initial question.
The target machine for my script is a domain controller, and the network is set up with two domain controllers (for fault tolerance). If I make calls to -ComputerName "my.domain.com", I occasionally see the long delays... but if I just use the machine's IP address instead of the domain name, it goes through immediately.
I still don't know why this only now just started having issues, and what the real root problem is... but this gives me something to at least have the script working until we can upgrade our environment.
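For anyone hitting the same thing, a sketch of the workaround (the IP is hypothetical, and note that authenticating to an IP over HTTP makes WinRM fall back to NTLM, so the address may need to be added to the client's TrustedHosts first):
$target = "10.0.0.10"   # hypothetical address of the domain controller
# One-time client-side change so NTLM auth to an IP address is allowed:
Set-Item WSMan:\localhost\Client\TrustedHosts -Value $target -Concatenate -Force
Invoke-Command -ComputerName $target -Credential $cred -ScriptBlock {
    Write-Host "Hello"
}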

PowerShell script to fetch logs from a remote desktop

We are working on an application that has a module called replicator; every h hours it runs and reports a status of either failed or successful. I want to create a PowerShell script that checks the logs on my remote desktop server and emails me the status of the replicator.
Can somebody help me with this?
Are the logs in a text file? Does it have a particular format? Something like:
$logline = Get-Content \\path\to\file.txt -Tail 1
Send-MailMessage -SmtpServer smtp0 -From 'me@me.com' -To 'me@me.com' -Subject "Latest Result is $logline"
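If the status is on the last line, you could go one step further and surface failures in the subject; a sketch assuming a UNC path to the log and a plain-text "fail" keyword, both of which are placeholders:
# Read the newest log line and derive a status from it.
$logline = Get-Content '\\server\share\replicator.log' -Tail 1
$status = if ($logline -match 'fail') { 'FAILED' } else { 'successful' }
Send-MailMessage -SmtpServer 'smtp0' -From 'me@me.com' -To 'me@me.com' `
    -Subject "Replicator status: $status" -Body $logline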

Getting Processes via Get-Counter does not refresh when processes change and throws errors if processes end

My script monitors the CPU usage of processes, looping the code every 5 seconds and writing the results to a file. That works fine.
But I found that when a new process starts, my script will not find it until I stop the script and rerun it.
Also, if a process ends, the script gives this error:
Get-Counter : The data in one of the performance counter samples is
not valid. View the Status property for each
PerformanceCounterSample object to make sure it contains valid data.
At line:2 char:34
It seems PowerShell retrieves the Process information only once and caches it.
If I run the script below (which is part of my full script), it runs perfectly:
while ($true) {
    $ProcessId = (Get-Counter "\Process(*)\ID Process").CounterSamples
    $ProcessId.Count
    Start-Sleep -Seconds 5
}
If I have 50 processes it gives 50, but if a new process starts it keeps giving 50 until I restart the script.
If I stop any process, it gives the same error as above.
Any idea how to solve this problem and force PowerShell to reread the process list without restarting the script?
You could use PowerShell Jobs to execute the check in a new background process on each iteration, and use -ErrorAction SilentlyContinue to suppress the error messages that can occur if one or more processes stop during a check. Because each job runs in a fresh process, the counter instance list is re-enumerated on every iteration rather than served from the first session's cache:
while ($true) {
    $ProcessId = Start-Job -ScriptBlock { (Get-Counter "\Process(*)\ID Process" -ErrorAction SilentlyContinue).CounterSamples } |
        Wait-Job | Receive-Job
    $ProcessId.Count
    Start-Sleep -Seconds 5
}
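To get back to the stated goal of writing the usage to a file, the job output can be exported on each pass; a sketch with a hypothetical output path (note that Export-Csv -Append needs PowerShell 3.0 or later):
while ($true) {
    $samples = Start-Job -ScriptBlock {
        (Get-Counter "\Process(*)\% Processor Time" -ErrorAction SilentlyContinue).CounterSamples
    } | Wait-Job | Receive-Job
    # Append one row per process instance to the log file.
    $samples | Select-Object InstanceName, CookedValue |
        Export-Csv -Path C:\perf\cpu-usage.csv -Append -NoTypeInformation
    Start-Sleep -Seconds 5
}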

Read a time stamp from a text file using a batch script

I'm looking for a solution to the issue below:
We reload our application every day, and it creates a new log file daily.
Sometimes the log stops being written to because of issues related to CPU usage and other strange things.
Now we want to monitor the log file every 5 minutes; if the log file isn't being updated, it should trigger an email.
The log file keeps updating every minute when the reload goes smoothly.
I need to read the timestamp in the log file. The log file has a timestamp column and content is written to it every 5 minutes, so I need to read the timestamp inside the file every 5 minutes; if nothing newer than 5 minutes shows up, I should trigger an alert.
Is there any way to implement the above scenario using a batch script or PowerShell? Any other ideas for monitoring the log file are welcome.
Thanks for the help.
You definitely can use PowerShell for this purpose!
Make sure you have changed the PowerShell execution policy to allow scripts. You can change it like this:
Set-ExecutionPolicy -ExecutionPolicy Unrestricted
Here's a simple script that would handle it. It could be written as a one- or two-liner of course, but I tried to make it more readable:
$FilePath = 'C:\log.txt' # Log file location
$Minutes = 5             # Alert when LastWriteTime is older than 5 minutes.
# General email parameters
$SMTP = 'your.smtp.server.local'
$From = 'yourserviceaccount@contoso.com'
$To = 'someperson@contoso.com'
$Subject = 'Action Required'
$Body = 'Log file has not been written to in more than {0} minutes!' -f $Minutes
$LastWriteTime = (Get-ItemProperty -LiteralPath $FilePath).LastWriteTime
# Fire only when the last write is OLDER than the cutoff time.
if ($LastWriteTime -lt (Get-Date).AddMinutes(-$Minutes))
{
    Send-MailMessage -SmtpServer $SMTP -Port 25 -From $From -To $To -Subject $Subject -Body $Body -Priority High
}
You could then create a Scheduled Task that triggers the script every N minutes.
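A sketch of that scheduling piece with the classic schtasks.exe; the task name and script path are placeholders:
# Run the check every 5 minutes under the SYSTEM account.
schtasks /Create /SC MINUTE /MO 5 /TN "LogFileMonitor" /RU SYSTEM `
    /TR "powershell.exe -ExecutionPolicy Bypass -File C:\Scripts\Check-Log.ps1"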

How to pull a range of failed services from a remote server after a reboot

Caveat: without spiking the CPU while a Get-WmiObject call parses the whole event log to match my specified filter.
Situation: I am working on a script that remotely runs some checks, then reboots a PC. I want it to check the health once the server reboots (after sleeping for some time) to make sure the services that were supposed to start did. I've been running into "Automatic" services that start and then shut down (as intended), but my current version picks them up as failed if they've already run. It was suggested that I check the event log for "Service Control Manager" errors and report on those. The only problem now is that with the script below, we have servers whose event logs range anywhere from 20K to several hundred thousand events; on a 2K server with 20K events, this takes roughly 20 seconds to complete, and the CPU pegs near 100% while it's running.
I'm still learning powershell/wmi, so any advice would be appreciated.
function Check_Startup_Events {
    BEGIN {
        # Cutoff: only events generated in the last 15 minutes, in WMI's DMTF date format.
        $time = [System.Management.ManagementDateTimeConverter]::ToDmtfDateTime((Get-Date).AddMinutes(-15))
    }
    PROCESS {
        $results = Get-WmiObject Win32_NTLogEvent -ComputerName $_ -Filter "LogFile='System' and SourceName='Service Control Manager' and TimeGenerated>='$time' and EventType=1" |
            Format-Table -AutoSize EventCode, Message
        $results
    }
}
$time = (Get-Date).AddMinutes(-15) # Get-EventLog expects a DateTime, not a DMTF string
$results = Get-EventLog -ComputerName w2kserver -LogName System -After $time
foreach ($result in $results) {
    if ($result.Source -eq "Service Control Manager" -and $result.EntryType -eq "Error") {
        Write-Host $result.Message # $result, not $_, inside a foreach statement
    }
}
I ran this against a 60K-entry event log on a W2K server in our environment. It takes a while to run, but it runs locally and does not tax the server. I'm not sure how you would want to output the data, but I think Get-EventLog will do what you want.
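Get-EventLog can also do the filtering for you through its -Source and -EntryType parameters, which drops the post-processing loop entirely; a short sketch:
$time = (Get-Date).AddMinutes(-15)
# Filter by source and severity up front instead of looping over everything.
Get-EventLog -ComputerName w2kserver -LogName System -After $time `
    -Source "Service Control Manager" -EntryType Error |
    Select-Object TimeGenerated, EventID, Message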