I'm curious if you can answer this or point me in the right direction.
I've written a script that tests/monitors URLs. I'm not posting the code (unless you want me to) because there is no error in the code itself; it works great, and I can even run it as a script block via Start-Job. The issue is that I can't run more than 3 jobs at a time, or they hang, and I'm not sure why. I can run it against 15 URLs throttled to 3 and it's fine. If I run the same 15 URLs with 4 as my limit, the jobs hang, and I can kill them one at a time until only 3 remain, and those 3 will finish. So it seems I can only start a total of 3 PowerShell instances before they hang. Can anyone explain why? All my searches lead to pages that show how to throttle, and throttling isn't really my issue.
Watching the processes, each consumes about 25 MB of memory and sits there idle. If I kill one, the other 3 start using CPU, grow to maybe 30 MB, and terminate as completed. The system has 8 GB of memory and a quad-core i5-2400 CPU @ 3.10 GHz. As requested, here is the code:
Param(
    $file
)

$testscript = {
    Param(
        [string]$url,
        #[ValidateSet('InternetExplorer','Chrome','Firefox','Safari','Opera', IgnoreCase = $true)]
        [string]$browser = "InternetExplorer",
        [string]$teststring = "Solution Center",
        [int]$timeout = 20,
        [int]$retry
    )
    $i = 0
    do {
        $userAgent = [Microsoft.PowerShell.Commands.PSUserAgent]::$browser
        $data = Invoke-WebRequest $url -UserAgent $userAgent -TimeoutSec $timeout
        $data.Content
        $findit = $data.Content.Contains($teststring)
        $i++
        If ($findit) {
            break
        }
    } while ($i -lt $retry)
    if (!$findit) {
        Echo "opcmsg a=PSURLCheck o=NHTSA msg_t='$teststring was not found on $url or $url failed to load'"
    }
}

$urls = Import-Csv $file | % {
    Start-Job -ScriptBlock $testscript -ArgumentList $_.url, $_.browser, $_.teststring, $_.retry
}

While (@(Get-Job | Where { $_.State -eq "Running" }).Count -ne 0) {
    Write-Host "Processing URLs..."
    Get-Job
    Start-Sleep -Seconds 5
}

$Data = ForEach ($Job in (Get-Job)) {
    Receive-Job $Job
    Remove-Job $Job
}
$data | select *
So I've used New-Object System.Net.WebClient, and I've even tried doing this with [System.Collections.Queue], but all three methods use jobs. So it appears I cannot run more than three Start-Job instances at any one time.
Are you sure your code is fine? If you're spawning separate PowerShell sessions multiple times, memory can be consumed very quickly. Check Process Monitor for high CPU or memory usage and make sure your script blocks are terminating. Or post the code.
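One thing worth ruling out: in Windows PowerShell, Invoke-WebRequest parses responses with the Internet Explorer COM engine by default, and that engine can block in non-interactive sessions such as background jobs. This is only a guess at the cause, but a variant of the request that skips IE parsing is cheap to test:

# Sketch: same request without the IE parsing engine.
# -UseBasicParsing skips the COM-based DOM and returns raw content,
# which is all the Contains() check in the script block needs.
$data = Invoke-WebRequest $url -UserAgent $userAgent -TimeoutSec $timeout -UseBasicParsing
$findit = $data.Content.Contains($teststring)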
Related
I'm trying to routinely check the presence of particular strings in text files on hundreds of computers on our domain.
foreach ($computer in $computers) {
    $hostname = $computer.DNSHostName
    if (Test-Connection $hostname -Count 2 -Quiet) {
        $FilePath = "\\" + $hostname + "\c$\SomeDirectory\SomeFile.txt"
        if (Test-Path -Path $FilePath) {
            # Check for string
        }
    }
}
For the most part, the pattern of Test-Connection and then Test-Path is effective and fast. There are certain computers, however, that ping successfully but Test-Path takes around 60 seconds to resolve to FALSE. I'm not sure why, but it may be a domain trust issue.
For situations like this, I would like to have a timeout for Test-Path that defaults to FALSE if it takes more than 2 seconds.
Unfortunately the solution in a related thread (How can I wrap this Powershell cmdlet into a timeout function?) does not apply to my situation. The proposed do-while loop gets hung up in the code block.
I've been experimenting with Jobs but it appears even this won't force quit the Test-Path command:
Start-Job -ScriptBlock {param($Path) Test-Path $Path} -ArgumentList $Path | Wait-Job -Timeout 2 | Remove-Job -Force
The job continues to hang in the background. Is this the cleanest way I can achieve my requirements above? Is there a better way to timeout Test-Path so the script doesn't hang besides spawning asynchronous activities? Many thanks.
Wrap your code in a [powershell] object and call BeginInvoke() to execute it asynchronously, then use the associated WaitHandle to wait for it to complete only for a set amount of time.
$sleepDuration = Get-Random 2,3
$ps = [powershell]::Create().AddScript("Start-Sleep -Seconds $sleepDuration; 'Done!'")

# execute it asynchronously
$handle = $ps.BeginInvoke()

# Wait 2500 milliseconds for it to finish
if (-not $handle.AsyncWaitHandle.WaitOne(2500)) {
    throw "timed out"
}

# WaitOne() returned $true, let's fetch the result
$result = $ps.EndInvoke($handle)
return $result
In the example above, we randomly sleep for either 2 or 3 seconds, but set a 2 and a half second timeout - try running it a couple of times to see the effect :)
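Applied to the Test-Path question above, a sketch of the same pattern might look like this ($FilePath is the UNC path from that question; on timeout we default to $false, as requested):

$ps = [powershell]::Create().AddCommand('Test-Path').AddParameter('Path', $FilePath)
$handle = $ps.BeginInvoke()

# Give Test-Path 2 seconds; treat a timeout as "file not found"
if ($handle.AsyncWaitHandle.WaitOne(2000)) {
    $exists = [bool]$ps.EndInvoke($handle)[0]
}
else {
    $ps.Stop()
    $exists = $false
}
$ps.Dispose()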
I need some help with a PowerShell script to kill all PIDs that have been non-responsive for 3 minutes.
This is my script, but it is not doing the trick. The script runs, but I need it to run forever in a while loop, since the computer stays on until the end of the working day.
I need a list of all the processes that have been unresponsive for a period of 3 minutes. After 3 minutes, if the processes on that list still have the status Not Responding, kill them. I don't want to kill processes that are unresponsive for 5 seconds or so, only those that have been hanging for more than 3 minutes.
My purpose is to kill the PIDs that have had the status Not Responding for more than 3 minutes.
As you know, processes are sometimes unresponsive for a couple of seconds, e.g. IE hangs for 7 seconds until the server responds with the DOM, hence I need to close only the PIDs that have been hanging with the status Not Responding for more than 3 minutes.
while (1) {
    # $allProcesses = get-process -name $pN -errorAction SilentlyContinue
    $allProcesses = get-process -errorAction SilentlyContinue
    foreach ($oneProcess in $allProcesses) {
        if (-not $oneProcess.Responding) {
            write "Status = Not Responding: Kill & Restart.."
            $oneProcess.kill()
            ## restart ..
        }
        else {
            write "Status = either normal or not detectable (no Window-Handle)"
        }
    }
    start-sleep 5
}
A quick and dirty solution, not tested, is based on the idea of storing the process info in a hashtable and performing a re-check after the sleep period. Like so,
while ($true) {
    # Get a list of non-responding processes
    $ps = get-process | ? { $_.Responding -eq $false }
    $ht = @{}

    # Store process info in a hash table.
    foreach ($p in $ps) {
        $o = new-object psobject -Property @{ "name" = $p.Name; "status" = $p.Responding; "time" = (get-date); "pid" = $p.Id }
        $ht.Add($o.pid, $o)
    }

    # Sleep for a while
    start-sleep -minutes 3

    # Get a list of non-responding processes, again
    $ps = get-process | ? { $_.Responding -eq $false }
    foreach ($p in $ps) {
        # Check if the process already is in the hash table
        if ($ht.ContainsKey($p.Id)) {
            # Compare the timestamp recorded for the process with the current
            # time; if it is 3+ minutes old, the process has been hanging for
            # the whole period, so kill it
            if (((get-date) - $ht[$p.Id].time).TotalMinutes -ge 3) {
                # The actual killing
                $p.kill()
            }
        }
    }
}
It's certainly possible to store entire process objects in the hashtable, but in most cases all you need is the process id. Mind that process ids are recycled, though: if the machine is spawning a lot of processes, it is worth also storing the process's StartTime and comparing it on the re-check, so that a newly created process that happens to reuse an old id isn't killed by mistake.
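As a sketch of that guard (assuming the hashtable entries above also record a "start" key holding $p.StartTime when they are first created):

# Sketch: on the re-check, only kill if the id still refers to the same
# process instance (same StartTime) that was recorded as unresponsive
if ($ht.ContainsKey($p.Id) -and $p.StartTime -eq $ht[$p.Id].start) {
    if (((get-date) - $ht[$p.Id].time).TotalMinutes -ge 3) {
        $p.kill()
    }
}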
I've written a script in PowerShell 3.0 to monitor a log file for specific errors. The script starts a background job, which monitors the file. When anything gets written to the file, the background job simply passes it to the foreground process if it matches the proper format (a datestamped line). The foreground process then counts the number of errors.
Everything works correctly with no errors. The issue is that, as the source logfile grows in size, the memory consumed by PowerShell increases dramatically. These logs are capped at ~24 MB before they are rotated, which amounts to ~250K lines. In my tests, by the time the log reaches ~80K lines or so, the monitor is consuming 250 MB of RAM (foreground and background processes combined; they consume ~70 MB combined when they first start). This type of growth is unacceptable in our environment. What can I do to decrease it?
Here's the script:
# Constants.
$F_IN = "C:\Temp\test.log"
$RE = "^\d+-\d+-\d+ \d+:\d+:\d+,.+ERROR.+Foo$"
$MAX_RESTARTS = 3   # Max restarts for failed background job.
$SLEEP_DELAY = 60   # In seconds.

# Background job.
$SCRIPT_BLOCK = { param($f, $r)
    Get-Content -Path $f -Tail 0 -Wait -EA SilentlyContinue |
        Where { $_ -match $r }
}

function Start-FileMonitor {
    Param([parameter(Mandatory=$true,Position=0)][alias("f")]
          [String]$file,
          [parameter(Mandatory=$true,Position=1)][alias("b")]
          [ScriptBlock]$SCRIPT_BLOCK,
          [parameter(Mandatory=$true,Position=2)][alias("re","r")]
          [String]$regex)
    $j = Start-Job -ScriptBlock $SCRIPT_BLOCK -Arg $file,$regex
    return $j
}

function main {
    # Tail log file in the background, return any errors.
    $job = Start-FileMonitor -b $SCRIPT_BLOCK -f $F_IN -r $RE
    $restarts = 0   # Current number of restarts.

    # Poll background $job every $SLEEP_DELAY seconds.
    While ($true) {
        $a = (Receive-Job $job | Measure-Object)
        If ($job.JobStateInfo.State -eq "Running") {
            $restarts = 0
            If ($a.Count -gt 0) {
                $t0 = $a.Count
                Write-Host "Error Count: ${t0}"
            }
        }
        Else {
            If ($restarts -lt $MAX_RESTARTS) {
                $job = Start-FileMonitor -b $SCRIPT_BLOCK -f $F_IN -r $RE
                $restarts++
                Write-Host "Background job not running. Attempted restart ${restarts}."
            }
            Else {
                Write-Host "`$MAX_RESTARTS (${MAX_RESTARTS}) exceeded. Exiting."
                Break
            }
        }
        # Sleep for $SLEEP_DELAY.
        Start-Sleep -Seconds $SLEEP_DELAY
    }
    Write-Host "Done."
}

# Execute script.
main
...and here's the sample data:
2015-11-19 00:00:00, WARN Foo
2015-11-19 00:00:00, ERROR Foo
In order to replicate this issue:

1. Paste the sample data lines into the file C:\Temp\test.log and save.
2. Start the monitoring script.
3. Paste additional sample data lines into the log and save. Wait for the Error Count: line to confirm that everything is working correctly.
4. Continue to paste additional lines and watch the memory consumption for powershell.exe in Task Manager (or use the snippet below). Note how much it increases at 400 lines...800 lines...8,000 lines...80,000 lines...
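As a scriptable alternative to Task Manager in the last step, this snippet (added here for convenience; it is not part of the original repro) samples the working set of every powershell.exe instance:

# Sketch: snapshot memory use of all PowerShell processes
Get-Process -Name powershell |
    Select-Object Id, @{ n = 'WS(MB)'; e = { [math]::Round($_.WorkingSet64 / 1MB, 1) } } |
    Format-Table -AutoSize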
I have this script to capture the IPs on a network, to compare against historic packet captures as part of a larger problem-solving exercise.
function qp($comp)
{
    $p = ping $comp -n 1 -w 2 -4
    IF ($? -eq $true) { $out = $p[1].split("[")[1].split("]")[0] }
    else { $out = $False }
    return $out
}

$comps = Get-Content C:\PacketCapture\comps.txt

DO
{
    foreach ($comp in $comps)
    {
        ECHO "$(qp $comp);$comp" >>"C:\PacketCapture\IP_$(Get-date -format HHmm-ddMMyy).txt"
    }
    Start-sleep 3600
}
until ($null -eq "WANG")
I set this off initially a fortnight ago; by the middle of last week the terminal it was running on was grinding to a halt, as memory use for the PowerShell process was nearly at 2 GB.
I stopped and restarted it, and we were back at 1.2 GB of RAM use this morning.
While not particularly critical, I've modified this to run once and then stop/start itself. I'm interested to know which element is causing the memory leak and how I would identify that in the future.
You could try invoking garbage collection manually. I think the do/until loop is a good place for it:
DO
{
    foreach ($comp in $comps)
    {
        ECHO "$(qp $comp);$comp" >>"C:\PacketCapture\IP_$(Get-date -format HHmm-ddMMyy).txt"
    }
    Start-sleep 3600
    [GC]::Collect()
}
until ($null -eq "WANG")
Try introducing garbage collection into your code with [System.GC]::Collect()
Source : https://dmitrysotnikov.wordpress.com/2012/02/24/freeing-up-memory-in-powershell-using-garbage-collector
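To check whether the collection actually reclaims anything, a hedged addition to the loop (for diagnosis only; not from the linked article) logs the script's own memory footprint on each pass:

# Sketch: log managed heap and working set after each collection
[GC]::Collect()
$managedMB = [math]::Round([GC]::GetTotalMemory($false) / 1MB, 1)
$wsMB = [math]::Round((Get-Process -Id $PID).WorkingSet64 / 1MB, 1)
Write-Host "Managed: $managedMB MB  WorkingSet: $wsMB MB"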
I am trying to execute a function or a script block in PowerShell and set a timeout for the execution.
Basically I have the following (translated into pseudocode):
function query {
    # query remote system for something
}

$computerList = Get-Content "C:\scripts\computers.txt"

foreach ($computer in $computerList) {
    $result = query
    # do something with $result
}
The query can range from a WMI query using Get-WmiObject to a HTTP request and the script has to run in a mixed environment, which includes Windows and Unix machines which do not all have a HTTP interface.
Some of the queries will therefore necessarily hang or take a VERY long time to return.
In my quest for optimization I have written the following:
$blockofcode = {
    # query remote system for something
}

foreach ($computer in $computerList) {
    $Job = Start-Job -ScriptBlock $blockofcode -ArgumentList $computer
    Wait-Job $Job.ID -Timeout 10 | out-null
    $result = Receive-Job $Job.ID
    # do something with result
}
But unfortunately jobs seem to carry a LOT of overhead. In my tests a query that executes in 1.066 seconds (according to timers inside $blockofcode) took 6.964 seconds to return a result when executed as a Job. Of course it works, but I would really like to reduce that overhead. I could also start all jobs together and then wait for them to finish, but the jobs can still hang or take ridiculous amounts of time to complete.
So, on to the question: is there any way to execute a statement, function, script block, or even a script with a timeout that does not carry the kind of overhead that comes with jobs? If possible I would like to run the commands in parallel, but that is not a deal-breaker.
Any help or hints would be greatly appreciated!
EDIT: running PowerShell v3 in a mixed Windows/Unix environment
Today, I ran across a similar question, and noticed that there wasn't an actual answer to this question. I created a simple PowerShell class, called TimedScript. This class provides the following functionality:
Method: Start(), to kick off the job when you're ready
Method: GetResult(), to retrieve the output of the script
Constructor: a constructor that takes two parameters:
    the ScriptBlock to execute
    an [int] timeout period, in milliseconds
It currently lacks:
    Passing in arguments to the PowerShell ScriptBlock (a possible tweak is sketched after the usage example below)
    Other useful features you think up
Class: TimedScript
class TimedScript {
    [System.Timers.Timer] $Timer = [System.Timers.Timer]::new()
    [powershell] $PowerShell
    [runspace] $Runspace = [runspacefactory]::CreateRunspace()
    [System.IAsyncResult] $IAsyncResult

    TimedScript([ScriptBlock] $ScriptBlock, [int] $Timeout) {
        $this.PowerShell = [powershell]::Create()
        $this.PowerShell.AddScript($ScriptBlock)
        $this.PowerShell.Runspace = $this.Runspace
        $this.Timer.Interval = $Timeout

        Register-ObjectEvent -InputObject $this.Timer -EventName Elapsed -MessageData $this -Action ({
            $Job = $event.MessageData
            $Job.PowerShell.Stop()
            $Job.Runspace.Close()
            $Job.Timer.Enabled = $False
        })
    }

    ### Method: Call this when you want to start the job.
    [void] Start() {
        $this.Runspace.Open()
        $this.Timer.Start()
        $this.IAsyncResult = $this.PowerShell.BeginInvoke()
    }

    ### Method: Once the job has finished, call this to get the results.
    [object[]] GetResult() {
        return $this.PowerShell.EndInvoke($this.IAsyncResult)
    }
}
Example Usage of TimedScript Class
# EXAMPLE: The timeout period is set longer than the execution time of the script, so this will succeed
$Job1 = [TimedScript]::new({ Start-Sleep -Seconds 2 }, 4000)
# EXAMPLE: This script will fail. Even though Get-Process returns quickly, the Start-Sleep call will cause it to be terminated by its Timer.
$Job2 = [TimedScript]::new({ Get-Process -Name s*; Start-Sleep -Seconds 3 }, 2000)
# EXAMPLE: This job will fail, because the timeout is less than the script execution time.
$Job3 = [TimedScript]::new({ Start-Sleep -Seconds 3 }, 1000)
$Job1.Start()
$Job2.Start()
$Job3.Start()
Code is also hosted on GitHub Gist.
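If you needed the missing argument support, one hedged tweak (a hypothetical $Arguments constructor parameter, not part of the class above) would be to chain AddArgument() calls after AddScript():

# Sketch: inside the constructor, after $this.PowerShell.AddScript($ScriptBlock)
foreach ($arg in $Arguments) {
    # Each AddArgument() supplies one positional argument to the script block
    [void]$this.PowerShell.AddArgument($arg)
}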
I think you might want to investigate using PowerShell runspaces:
http://learn-powershell.net/2012/05/13/using-background-runspaces-instead-of-psjobs-for-better-performance/
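The gist of that approach, sketched against the question's own variables ($blockofcode and $computerList come from the question; the pool size of 5 and the 10 second per-query timeout are arbitrary choices here):

# Sketch: run $blockofcode for each computer on a runspace pool
# instead of jobs, avoiding the per-job process overhead
$pool = [runspacefactory]::CreateRunspacePool(1, 5)   # at most 5 run concurrently
$pool.Open()

$work = foreach ($computer in $computerList) {
    $ps = [powershell]::Create().AddScript($blockofcode).AddArgument($computer)
    $ps.RunspacePool = $pool
    [pscustomobject]@{ PowerShell = $ps; Handle = $ps.BeginInvoke() }
}

foreach ($w in $work) {
    # Wait up to 10 seconds per query; abandon it on timeout
    if ($w.Handle.AsyncWaitHandle.WaitOne(10000)) {
        $result = $w.PowerShell.EndInvoke($w.Handle)
        # do something with $result
    }
    else {
        $w.PowerShell.Stop()
    }
    $w.PowerShell.Dispose()
}
$pool.Close()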