How to Execute a PowerShell Pipeline Asynchronously

How can I make PowerShell execute pipeline stages asynchronously instead of sequentially?
An example to illustrate this:
Function PipeDelay($Name, $Milliseconds){
    Process {
        Start-Sleep -Milliseconds $Milliseconds
        Write-Host "$Name : $_"
        $_
    }
}
1..7 | PipeDelay Fast 100 | PipeDelay Slow 300 | PipeDelay Slowest 900
The output in both the ISE and the PowerShell console is the following:
Fast : 1
Slow : 1
Slowest : 1
1
Fast : 2
Slow : 2
Slowest : 2
2
Fast : 3
Slow : 3
Slowest : 3
3
...
It is as if PowerShell ran the whole pipeline for one input object before processing the next input object. The execution looks something like this:
Fast : 1 2 3
Slow : 1---1 2---2 3---3
Slowest : 1---------------1 2---------------2 3-----...
Is it possible, with some setting/environment variable/etc., to make the pipeline stages run independently/asynchronously, perhaps with a setting for pipeline buffer size? So that the execution would look something like this:
Fast : 1 2 3 4 5 6 7
Slow : 1---1 2---2 3---3 4---4 5---5 6---6 7---7
Slowest : 1---------------1 2---------------2 3-----...
NOTE
I thought it was because of STA/MTA mode. I don't understand them completely, but getting the same result in the ISE (STA) and the PowerShell console (MTA) seems to rule out STA/MTA mode as the cause.
I also thought Write-Host was the issue forcing the pipeline to be processed sequentially, but even if I substitute New-Event for Write-Host, the sequential processing still applies.

I don't think you will be able to leverage the pipeline in an asynchronous manner in the way you desire without being quite expensive with resource usage. I tried to capture the spirit of what you are trying to accomplish, but in a different way: I used a slightly different example to illustrate how asynchronous [Automation.PowerShell] invocation works.
#1. You have a list of room requests you want to process in a function. TimeToClean was added as a controllable thread block.
$roomsToClean = @( ([psCustomObject]@{Name='Bedroom';TimeToClean=2}),
    ([psCustomObject]@{Name='Kitchen';TimeToClean=5}),
    ([psCustomObject]@{Name='Bathroom';TimeToClean=3}),
    ([psCustomObject]@{Name='Living room';TimeToClean=1}),
    ([psCustomObject]@{Name='Dining room';TimeToClean=1}),
    ([psCustomObject]@{Name='Foyer';TimeToClean=1})
)
#2. A function that cleans a room and returns a custom PowerShell object with a message.
Function Clean-Room{
    param([string]$RoomName,[int]$Seconds)
    Start-Sleep -Seconds $Seconds
    [psCustomObject]@{Message = "The robot cleaned the $RoomName in $Seconds seconds."}
}
#3. Executing this list synchronously will result in an approximately 13-second runtime.
Write-Host "===== Synchronous Results =====" -ForegroundColor Green
$stopwatch = [system.diagnostics.stopwatch]::StartNew()
foreach($item in $roomsToClean){
    $obj = Clean-Room $item.Name $item.TimeToClean
    Write-Output $obj.Message
}
$stopwatch.Stop()
Write-Host "Execution time for synchronous function was $($stopwatch.Elapsed)." -ForegroundColor Green
#4. Now let's run this function asynchronously for all of these items. Expected runtime will be approximately 5 seconds.
#=============== Setting up an asynchronous [Automation.PowerShell] object and attaching it to a runspace pool.
#Many [Automation.PowerShell] objects can be attached to a given runspace pool, and the pool will manage queueing/dequeueing of each PS object as it completes.
#Create a runspace pool with a maximum of 5 runspaces.
$minRunspaces = 2
$maxRunspaces = 5
$runspacePool = [RunspaceFactory]::CreateRunspacePool($minRunspaces, $maxRunspaces)
$runspacePool.ApartmentState = 'STA' #MTA = multi-threaded apartment; STA = single-threaded apartment.
$runspacePool.Open() #runspace pool must be opened before it can be used.
#For each room object, create an [Automation.PowerShell] object and attach it to the runspace pool.
#Asynchronously invoke the function for all objects in the collection.
$ps_collections = foreach($room in $roomsToClean){
    try{
        $ps = [System.Management.Automation.PowerShell]::Create()
        $ps.RunspacePool = $runspacePool
        #Add your custom function to the [Automation.PowerShell] object.
        #Adding arguments by parameter name aids readability. You may use AddArgument as an alternative, but know your positional arguments.
        [void] $ps.AddScript(${function:Clean-Room})
        [void] $ps.AddParameter('RoomName',$room.Name)       #Add parameterName,value
        [void] $ps.AddParameter('Seconds',$room.TimeToClean) #Add parameterName,value
        #Extend the PS management object with an AsyncResult property for receiving results at a later time.
        $ps | Add-Member -MemberType NoteProperty -Name 'AsyncResult' -Value $ps.BeginInvoke() #invoke asynchronously
        $ps | Add-Member -MemberType ScriptMethod -Name 'GetAsyncResult' -Value { $this.EndInvoke($this.AsyncResult) } -PassThru
    }
    catch{
        throw $_ #handle custom errors here.
    }
}
#After the function has been asynchronously invoked for all room objects, grab the results of the asynchronous calls.
Write-Host "===== Asynchronous Results =====" -ForegroundColor Green
$stopwatch = [system.diagnostics.stopwatch]::StartNew()
foreach($ps in $ps_collections){
    $obj = $ps.GetAsyncResult()
    [void] $ps.Dispose() #dispose of the object after use.
    Write-Output $obj.Message
}
$stopwatch.Stop()
Write-Host "Execution time for asynchronous function was
$($stopwatch.Elapsed)." -ForegroundColor Green
#Runspace cleanup.
If($runspacePool){
    [void] $runspacePool.Close()
    [void] $runspacePool.Dispose()
}
Result times will vary slightly but should look similar to this:
===== Synchronous Results =====
The robot cleaned the Bedroom in 2 seconds.
The robot cleaned the Kitchen in 5 seconds.
The robot cleaned the Bathroom in 3 seconds.
The robot cleaned the Living room in 1 seconds.
The robot cleaned the Dining room in 1 seconds.
The robot cleaned the Foyer in 1 seconds.
Execution time for synchronous function was 00:00:13.0719157.
===== Asynchronous Results =====
The robot cleaned the Bedroom in 2 seconds.
The robot cleaned the Kitchen in 5 seconds.
The robot cleaned the Bathroom in 3 seconds.
The robot cleaned the Living room in 1 seconds.
The robot cleaned the Dining room in 1 seconds.
The robot cleaned the Foyer in 1 seconds.
Execution time for asynchronous function was 00:00:04.9909951.
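Worth noting as a lighter-weight alternative where it is available: PowerShell 7+ adds ForEach-Object -Parallel, which runs each input object's script block on its own runspace thread. This parallelizes a single pipeline stage rather than decoupling successive stages, but it often achieves the same throughput goal. A minimal sketch:
# PowerShell 7+ only. Each of the 7 inputs is processed concurrently,
# up to -ThrottleLimit threads at a time; output order is not guaranteed.
1..7 | ForEach-Object -Parallel {
    Start-Sleep -Milliseconds 900   # stand-in for the question's "Slowest" stage
    "Slowest : $_"
} -ThrottleLimit 7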

Related

PowerShell script to kill all PIDs that are non-responsive for 3 minutes

I need some help with a PowerShell script to kill all PIDs that are non-responsive for 3 minutes.
This is my script, but it is not doing the trick. The script runs, but I need it to run in a while loop, forever, since the computer keeps running until the end of the working day.
I need a list of all the processes that are unresponsive for a period of 3 minutes. After 3 minutes, if those processes from the list still have the status Not Responding, kill them. I don't want to kill processes that are not responding for 5 seconds or so, only those that have been hanging for more than 3 minutes.
My purpose is to kill the PIDs that have had the status Not Responding for more than 3 minutes.
As you know, processes are sometimes unresponsive for a couple of seconds, e.g. IE hangs for 7 seconds until the server responds with the DOM, hence I need to close only the PIDs that have been hanging with the status Not Responding for more than 3 minutes.
while (1) {
    # if ( $allProcesses = get-process -name $pN -errorAction SilentlyContinue ) {
    foreach ($oneProcess in $allProcesses) {
        if ( -not $oneProcess.Responding ) {
            write "Status = Not Responding: Kill & Restart.."
            $oneProcess.kill()
            ## restart ..
        } else {
            write "Status = either normal or not detectable (no Window-Handle)"
        }
    }
    start-sleep 5
}
A quick and dirty solution, not tested, based on the idea of storing the process info in a hashtable and performing a re-check after the sleep period. Like so:
while($true){
    # Get a list of non-responding processes
    $ps = get-process | ? { $_.responding -eq $false }
    $ht = @{}
    # Store process info in a hash table.
    foreach($p in $ps) {
        $o = new-object psobject -Property @{ "name"=$p.name; "status"=$p.responding; "time"=get-date; "pid"=$p.id }
        $ht.Add($o.pid, $o)
    }
    # Sleep for a while
    start-sleep -minutes 3
    # Get a list of non-responding processes, again
    $ps = get-process | ? { $_.responding -eq $false }
    foreach($p in $ps) {
        # Check if the process already is in the hash table
        if($ht.ContainsKey($p.id)) {
            # Calculate the time difference, in minutes, between the
            # process' first sighting and the current time.
            # If the first sighting is older than 3 minutes, kill the process.
            if( ((get-date)-$ht[$p.id].Time).TotalMinutes -ge 3 ) {
                # The actual killing
                $p.kill()
            }
        }
    }
}
It's certainly possible to store whole process objects in the hashtable, but in most cases all you need is the process id. Mind that process ids are recycled: if a lot of processes are being spawned, it might be reasonable to also check the process start time, so that a newly created process that happens to reuse a PID isn't killed instead.
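For illustration, a rough sketch of that start-time guard (assuming the hashtable records each process's StartTime on the first sighting):
# Record each hung process's StartTime on first sighting; only kill if the PID
# *and* StartTime still match after the wait, i.e. it is the same process instance.
# (.StartTime can throw for processes you lack access to, hence the try/catch.)
$firstSeen = @{}
Get-Process | Where-Object { -not $_.Responding } | ForEach-Object {
    try { $firstSeen[$_.Id] = $_.StartTime } catch { }
}
Start-Sleep -Minutes 3
Get-Process | Where-Object { -not $_.Responding } | ForEach-Object {
    if ($firstSeen.ContainsKey($_.Id) -and $firstSeen[$_.Id] -eq $_.StartTime) {
        $_.Kill()   # same instance, still hung after 3 minutes
    }
}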

Copy-Item using Invoke-Async in PowerShell

This article shows how to use Invoke-Async in PowerShell: https://sqljana.wordpress.com/2018/03/16/powershell-sql-server-run-in-parallel-collect-sql-results-with-print-output-from-across-your-sql-farm-fast/
I wish to run the Copy-Item cmdlet in PowerShell in parallel, because the alternative is to use FileSystemObject via Excel and copy one file at a time, out of a total of millions of files.
I have cobbled together the following:
<#
.SYNOPSIS
<Brief description>
For examples type:
Get-Help .\<filename>.ps1 -examples
.DESCRIPTION
Copies files from one path to another
.PARAMETER FileList
e.g. C:\path\to\list\of\files\to\copy.txt
.PARAMETER NumCopyThreads
default is 8 (but can be 100 if you want to stress the machine to maximum!)
.EXAMPLE
.\CopyFilesToBackup -filelist C:\path\to\list\of\files\to\copy.txt
.NOTES
#>
[CmdletBinding()]
Param(
    [String] $FileList = "C:\temp\copytest.csv",
    [int] $NumCopyThreads = 8
)
$filesToCopy = New-Object "System.Collections.Generic.List[fileToCopy]"
$csv = Import-Csv $FileList
foreach($item in $csv)
{
    $file = New-Object fileToCopy
    $file.SrcFileName = $item.SrcFileName
    $file.DestFileName = $item.DestFileName
    $filesToCopy.add($file)
}
$sb = [scriptblock] {
    param($file)
    Copy-Item -Path $file.SrcFileName -Destination $file.DestFileName
}
$results = Invoke-Async -Set $filesToCopy -SetParam file -ScriptBlock $sb -Verbose -Measure:$true -ThreadCount 8
$results | Format-Table
Class fileToCopy {
    [String]$SrcFileName = ""
    [String]$DestFileName = ""
}
the csv input for which looks like this:
SrcFileName,DestFileName
C:\Temp\dummy-data\101438\101438-0154723869.zip,\\backupserver\Project Archives\101438\0154723869.zip
C:\Temp\dummy-data\101438\101438-0165498273.xlsx,\\backupserver\Project Archives\101438\0165498273.xlsx
What am I missing to get this working? When I run .\CopyFiles.ps1 -FileList C:\Temp\test.csv, nothing happens. The files exist in the source path, but the file objects aren't being pulled from the -Set collection (unless I have misunderstood how the collection is used?).
No, I can't use robocopy to do this because there are millions of files which resolve to different paths depending upon their original location.
I have no explanation for your symptom based on the code in your question (see bottom section), but I suggest basing your solution on the (now) standard Start-ThreadJob cmdlet (comes with PowerShell Core; in Windows PowerShell, install it with Install-Module ThreadJob -Scope CurrentUser, for instance[1]):
Such a solution is more efficient than use of the third-party Invoke-Async function, which as of this writing is flawed in that it waits for jobs to finish in a tight loop, which creates unnecessary processing overhead.
Start-ThreadJob jobs are a lightweight, thread-based alternative to the process-based Start-Job background jobs, yet they integrate with the standard job-management cmdlets, such as Wait-Job and Receive-Job.
Here's a self-contained example based on your code that demonstrates its use:
Note: Whether you use Start-ThreadJob or Invoke-Async, you won't be able to explicitly reference custom classes such as [fileToCopy] in the script block that runs in separate threads (runspaces; see the bottom section), so the solution below simply uses [pscustomobject] instances with the properties of interest, for simplicity and brevity.
# Create sample CSV file with 10 rows.
$FileList = Join-Path ([IO.Path]::GetTempPath()) "tmp.$PID.csv"
@'
Foo,SrcFileName,DestFileName,Bar
1,c:\tmp\a,\\server\share\a,baz
2,c:\tmp\b,\\server\share\b,baz
3,c:\tmp\c,\\server\share\c,baz
4,c:\tmp\d,\\server\share\d,baz
5,c:\tmp\e,\\server\share\e,baz
6,c:\tmp\f,\\server\share\f,baz
7,c:\tmp\g,\\server\share\g,baz
8,c:\tmp\h,\\server\share\h,baz
9,c:\tmp\i,\\server\share\i,baz
10,c:\tmp\j,\\server\share\j,baz
'@ | Set-Content $FileList
# How many threads at most to run concurrently.
$NumCopyThreads = 8
Write-Host 'Creating jobs...'
$dtStart = [datetime]::UtcNow
# Import the CSV data and transform it to [pscustomobject] instances
# with only .SrcFileName and .DestFileName properties - they take
# the place of your original [fileToCopy] instances.
$jobs = Import-Csv $FileList | Select-Object SrcFileName, DestFileName |
    ForEach-Object {
        # Start the thread job for the file pair at hand.
        Start-ThreadJob -ThrottleLimit $NumCopyThreads -ArgumentList $_ {
            param($f)
            $simulatedRuntimeMs = 2000 # How long each job (thread) should run for.
            # Delay output for a random period.
            $randomSleepPeriodMs = Get-Random -Minimum 100 -Maximum $simulatedRuntimeMs
            Start-Sleep -Milliseconds $randomSleepPeriodMs
            # Produce output.
            "Copied $($f.SrcFileName) to $($f.DestFileName)"
            # Wait for the remainder of the simulated runtime.
            Start-Sleep -Milliseconds ($simulatedRuntimeMs - $randomSleepPeriodMs)
        }
    }
Write-Host "Waiting for $($jobs.Count) jobs to complete..."
# Synchronously wait for all jobs (threads) to finish and output their results
# *as they become available*, then remove the jobs.
# NOTE: Output will typically NOT be in input order.
Receive-Job -Job $jobs -Wait -AutoRemoveJob
Write-Host "Total time lapsed: $([datetime]::UtcNow - $dtStart)"
# Clean up the temp. file
Remove-Item $FileList
The above yields something like:
Creating jobs...
Waiting for 10 jobs to complete...
Copied c:\tmp\b to \\server\share\b
Copied c:\tmp\g to \\server\share\g
Copied c:\tmp\d to \\server\share\d
Copied c:\tmp\f to \\server\share\f
Copied c:\tmp\e to \\server\share\e
Copied c:\tmp\h to \\server\share\h
Copied c:\tmp\c to \\server\share\c
Copied c:\tmp\a to \\server\share\a
Copied c:\tmp\j to \\server\share\j
Copied c:\tmp\i to \\server\share\i
Total time lapsed: 00:00:05.1961541
Note that the output received does not reflect the input order, and that the overall runtime is roughly 2 times the per-thread runtime of 2 seconds (plus overhead), because 2 "batches" have to be run due to the input count being 10, whereas only 8 threads were made available.
If you upped the thread count to 10 or more (50 is the default), the overall runtime would drop to 2 seconds plus overhead, because all jobs then run concurrently.
Caveat: The above numbers stem from running in PowerShell Core on Microsoft Windows 10 Pro (64-bit; Version 1903), using version 2.0.1 of the ThreadJob module.
Inexplicably, the same code is much slower in Windows PowerShell v5.1.18362.145.
However, for performance and memory consumption it is better to use batching (chunking) in your case, i.e., to process multiple file pairs per thread.
The following solution demonstrates this approach; tweak $chunkSize to find a batch size that works for you.
# Create sample CSV file with 10 rows.
$FileList = Join-Path ([IO.Path]::GetTempPath()) "tmp.$PID.csv"
@'
Foo,SrcFileName,DestFileName,Bar
1,c:\tmp\a,\\server\share\a,baz
2,c:\tmp\b,\\server\share\b,baz
3,c:\tmp\c,\\server\share\c,baz
4,c:\tmp\d,\\server\share\d,baz
5,c:\tmp\e,\\server\share\e,baz
6,c:\tmp\f,\\server\share\f,baz
7,c:\tmp\g,\\server\share\g,baz
8,c:\tmp\h,\\server\share\h,baz
9,c:\tmp\i,\\server\share\i,baz
10,c:\tmp\j,\\server\share\j,baz
'@ | Set-Content $FileList
# How many threads at most to run concurrently.
$NumCopyThreads = 8
# How many files to process per thread
$chunkSize = 3
# The script block to run in each thread, which now receives a
# $chunkSize-sized *array* of file pairs.
$jobScriptBlock = {
    param([pscustomobject[]] $filePairs)
    $simulatedRuntimeMs = 2000 # How long each job (thread) should run for.
    # Delay output for a random period.
    $randomSleepPeriodMs = Get-Random -Minimum 100 -Maximum $simulatedRuntimeMs
    Start-Sleep -Milliseconds $randomSleepPeriodMs
    # Produce output for each pair.
    foreach ($filePair in $filePairs) {
        "Copied $($filePair.SrcFileName) to $($filePair.DestFileName)"
    }
    # Wait for the remainder of the simulated runtime.
    Start-Sleep -Milliseconds ($simulatedRuntimeMs - $randomSleepPeriodMs)
}
Write-Host 'Creating jobs...'
$dtStart = [datetime]::UtcNow
$jobs = & {
    # Process the input objects in chunks.
    $i = 0
    $chunk = [pscustomobject[]]::new($chunkSize)
    Import-Csv $FileList | Select-Object SrcFileName, DestFileName | ForEach-Object {
        $chunk[$i % $chunkSize] = $_
        if (++$i % $chunkSize -ne 0) { return }
        # Note the need to wrap $chunk in a single-element helper array (, $chunk)
        # to ensure that it is passed *as a whole* to the script block.
        Start-ThreadJob -ThrottleLimit $NumCopyThreads -ArgumentList (, $chunk) -ScriptBlock $jobScriptBlock
        $chunk = [pscustomobject[]]::new($chunkSize) # we must create a new array
    }
    # Process any remaining objects.
    # Note: $chunk -ne $null returns those elements in $chunk, if any, that are non-null.
    if ($remainingChunk = $chunk -ne $null) {
        Start-ThreadJob -ThrottleLimit $NumCopyThreads -ArgumentList (, $remainingChunk) -ScriptBlock $jobScriptBlock
    }
}
Write-Host "Waiting for $($jobs.Count) jobs to complete..."
# Synchronously wait for all jobs (threads) to finish and output their results
# *as they become available*, then remove the jobs.
# NOTE: Output will typically NOT be in input order.
Receive-Job -Job $jobs -Wait -AutoRemoveJob
Write-Host "Total time lapsed: $([datetime]::UtcNow - $dtStart)"
# Clean up the temp. file
Remove-Item $FileList
While the output is effectively the same, note how only 4 jobs were created this time (ceil(10 / 3) = 4), each of which processed (up to) $chunkSize (3) file pairs.
As for what you tried:
The screen shot you show suggests that the problem is that your custom class, [fileToCopy], isn't visible to the script block run by Invoke-Async.
Since Invoke-Async invokes the script block via the PowerShell SDK in separate runspaces that know nothing about the caller's state, it is to be expected that these runspaces don't know your class (this equally applies to Start-ThreadJob).
However, it is unclear why that is a problem in your code, because your script block doesn't make an explicit reference to your class: your script-block parameter $file is not type-constrained (it is implicitly [object]-typed).
Therefore, simply accessing the properties of your custom-class instance inside the script block should work, and indeed does in my tests on Windows PowerShell v5.1.18362.145 on Microsoft Windows 10 Pro (64-bit; Version 1903).
However, if your real script-block code were to explicitly reference the custom class [fileToCopy] - such as by defining the parameter as param([fileToCopy] $file) - you would see the symptom.
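To illustrate with hypothetical script blocks:
# Works in a separate runspace: $file is implicitly [object]-typed, and its
# properties are resolved late, at run time.
$sbOk  = { param($file) Copy-Item -Path $file.SrcFileName -Destination $file.DestFileName }
# Fails in a separate runspace: the [fileToCopy] class exists only in the
# caller's runspace, so parameter binding cannot resolve the type there.
$sbBad = { param([fileToCopy] $file) Copy-Item -Path $file.SrcFileName -Destination $file.DestFileName }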
[1] In Windows PowerShell v3 and v4, which do not come with the PowerShellGet module, Install-Module isn't available by default. However, the module can be installed on demand, as described in Installing PowerShellGet.

Why is the command in ScriptBlock not working?

I want to measure the elapsed time of a cmdlet:
Invoke-ASCmd
I am using it in the following way:
$elapsedTime = [system.diagnostics.stopwatch]::StartNew()
$j = Start-Job -ScriptBlock {
    Invoke-ASCmd -InputFile $XMLF -Server "$Server[-1]" >$output_file
}
do {
    write-progress -activity "Syncing..." -status "$([string]::Format("Time Elapsed: {0:d2}:{1:d2}:{2:d2}", $elapsedTime.Elapsed.hours, $elapsedTime.Elapsed.minutes, $elapsedTime.Elapsed.seconds))"
    #-percentcomplete ($_/10);
    Start-Sleep -milliseconds 250
} while ($j.State -eq 'Running')
Receive-Job -Job $j
$elapsedTime.stop()
However, all I see on the console is a flashing blue progress bar that doesn't even appear to be elapsing the time... and frankly, I don't think the script block (the Invoke cmdlet) is being executed at all.
Why is that?
It also appears to last only 1 second.
I know that the script block is not working because the syncing is supposed to take at least 20 seconds, so something is wrong.
Also, I would like to get the percentage (circles animation/progress); this is not working:
-percentcomplete ($_/10);
One last thing: I would like to save the final elapsed time to a variable $FinalTime; would I do it inside the loop or outside?
I am combining these two answers here and modifying them for my needs:
https://stackoverflow.com/a/9813370/8397835
https://stackoverflow.com/a/8468024/8397835
Yes, the progress is quick because it takes PowerShell 1 second to load the module before erroring out. We can see the error message with Receive-Job:
PS C:\> Receive-Job $j
InputFile "" not found
+ CategoryInfo : InvalidData: (:) [Invoke-ASCmd], FileNotFoundException
+ FullyQualifiedErrorId : DataValidation,Microsoft.AnalysisServices.PowerShell.Cmdlets.ExecuteScriptCommand
+ PSComputerName : localhost
InputFile "" not found indicates that the variables were empty. They are empty because you can't reference variables directly inside of the Script Block. Using Start-Job, you must pass it into the Script Block as an argument, and receive it as a parameter inside the Script Block. Something like this:
$j = Start-Job -Arg $XMLF, $Server, $output_file -ScriptBlock {
    Param($XMLF, $Server, $output_file)
    Invoke-ASCmd -InputFile $XMLF -Server "$Server" >$output_file
}
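Alternatively, a small sketch of the same fix using the $using: scope modifier (available since PowerShell 3.0), which copies the caller's variable values into the job without -ArgumentList:
$j = Start-Job -ScriptBlock {
    # $using:<name> expands to the value the caller's variable had when the
    # job started; the job receives a copy, not a live reference.
    Invoke-ASCmd -InputFile $using:XMLF -Server "$using:Server" |
        Out-File -FilePath $using:output_file
}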
As for progress: since there is no "direct" way to measure how far along we are toward 100%, we "fake it". Since we know it takes about 20 seconds to execute, we simply have the progress bar do some math, mapping the elapsed time from 0 to 20 seconds onto 0 to 100 percent:
[Math]::Min(100*($elapsedTime.Elapsed.Seconds / 20),100)
Essentially, this uses $elapsedTime to go from 0 to 100 percent over 20 seconds. The 20 seconds can be changed to any number that is close to the approximate execution time. Using [Math]::Min, we ensure that if execution takes longer than 20 seconds, the progress bar shows 100 percent while the status continues to show the elapsed time. So it would look like this:
do {
    write-progress -activity "Syncing..." -status "$($elapsedTime.Elapsed.ToString())" -percentcomplete ([Math]::Min(100*($elapsedTime.Elapsed.Seconds / 20),100));
    Start-Sleep -Milliseconds 250
} while ($j.State -eq 'Running')
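As for the $FinalTime question: capture it after the loop exits (outside the loop), since the stopwatch keeps running until then. A small sketch:
# After the do/while loop has exited (the job is no longer 'Running'):
Receive-Job -Job $j
$elapsedTime.Stop()
$FinalTime = $elapsedTime.Elapsed   # a [TimeSpan]; use .ToString() to format it
Write-Host "Final elapsed time: $FinalTime"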

Limited PowerShell start-jobs

I'm curious if you can answer this or point me in the right direction.
I've written a script that tests/monitors URLs. I'm not posting the code (unless you want me to) because there is no error in the code; it works great. I can even run it as a script block as part of Start-Job. The issue seems to be that I cannot run more than 3 jobs at a time, or they hang. I'm not sure why this is. I can run it for a total of 15 URLs throttled to 3 and it's great. If I try to run it on 15 URLs with 4 as my run limit, they hang... and I can kill them one at a time until only 3 remain, and those will finish. So it seems that I can only start a total of 3 PowerShell instances or they hang. Can anyone explain why this is? All my searches lead me to pages that show how to throttle, and that's not really my issue.
Watching the processes, each consumes about 25 MB of memory and sits there idle... If I kill one, the other 3 will start using CPU, go up to maybe 30 MB of memory, and run to completion. The system has 8 GB of memory and a quad-core i5-2400 CPU @ 3.10 GHz. As requested...
Param(
    $file
)
$testscript =
{
    Param(
        [string]$url,
        #[ValidateSet('InternetExplorer','Chrome','Firefox','Safari','Opera', IgnoreCase = $true)]
        [string]$browser="InternetExplorer",
        [string]$teststring="Solution Center",
        [int]$timeout=20,
        [int]$retry
    )
    $i=0
    do {
        $userAgent = [Microsoft.PowerShell.Commands.PSUserAgent]::$browser
        $data = Invoke-WebRequest $url -UserAgent $userAgent -TimeoutSec $timeout
        $data.Content
        $findit = $data.Content.Contains($teststring)
        $i++
        If ($findit){
            break
        }
    }
    while ($i -lt $retry)
    if(!$findit) {
        Echo "opcmsg a=PSURLCheck o=NHTSA msg_t='$teststring was not found on $url or $url failed to load'"
    }
}
$urls = Import-Csv $file | % {
    Start-Job -ScriptBlock $testscript -ArgumentList $_.url, $_.browser, $_.teststring, $_.retry
}
While (@(Get-Job | Where { $_.State -eq "Running" }).Count -ne 0) {
    Write-Host "Processing URLs..."
    Get-Job
    Start-Sleep -Seconds 5
}
$Data = ForEach ($Job in (Get-Job)) {
    Receive-Job $Job
    Remove-Job $Job
}
$data | select *
So I've used New-Object System.Net.WebClient and I've even tried doing this with [System.Collections.Queue]... but all three methods use jobs... so it appears I cannot run more than three Start-Job instances at any one time.
Are you sure your code is fine? If you're spinning up separate PowerShell sessions multiple times, memory can be consumed very quickly. Check Process Monitor for high CPU or memory usage and ensure your script blocks are terminating. Or post the code.

PowerShell command timeout

I am trying to execute a function or a scriptblock in powershell and set a timeout for the execution.
Basically I have the following (translated into pseudocode):
function query{
    #query remote system for something
}
$computerList = Get-Content "C:\scripts\computers.txt"
foreach ($computer in $computerList){
    $result = query
    #do something with $result
}
The query can range from a WMI query using Get-WmiObject to an HTTP request, and the script has to run in a mixed environment, which includes Windows and Unix machines that do not all have an HTTP interface.
Some of the queries will therefore necessarily hang or take a VERY long time to return.
In my quest for optimization I have written the following:
$blockofcode = {
    #query remote system for something
}
foreach ($computer in $computerList){
    $Job = Start-Job -ScriptBlock $blockofcode -ArgumentList $computer
    Wait-Job $Job.ID -Timeout 10 | out-null
    $result = Receive-Job $Job.ID
    #do something with result
}
But unfortunately jobs seem to carry a LOT of overhead. In my tests, a query that executes in 1.066 seconds (according to timers inside $blockofcode) took 6.964 seconds to return a result when executed as a job. Of course it works, but I would really like to reduce that overhead. I could also start all jobs together and then wait for them to finish, but the jobs can still hang or take ridiculous amounts of time to complete.
So, on to the question: is there any way to execute a statement, function, script block, or even a script with a timeout that does not involve the kind of overhead that comes with jobs? If possible I would like to run the commands in parallel, but that is not a deal-breaker.
Any help or hints would be greatly appreciated!
EDIT: running PowerShell v3 in a mixed Windows/Unix environment.
Today, I ran across a similar question and noticed that there wasn't an actual answer to this question. So I created a simple PowerShell class, called TimedScript. This class provides the following functionality:
Method: Start(), to kick off the job when you're ready
Method: GetResult(), to retrieve the output of the script
Constructor: a constructor that takes two parameters:
the ScriptBlock to execute
an [int] timeout period, in milliseconds
It currently lacks:
Passing arguments in to the PowerShell ScriptBlock (though see the sketch after the usage examples below)
Other useful features you think up
Class: TimedScript
class TimedScript {
    [System.Timers.Timer] $Timer = [System.Timers.Timer]::new()
    [powershell] $PowerShell
    [runspace] $Runspace = [runspacefactory]::CreateRunspace()
    [System.IAsyncResult] $IAsyncResult

    TimedScript([ScriptBlock] $ScriptBlock, [int] $Timeout) {
        $this.PowerShell = [powershell]::Create()
        $this.PowerShell.AddScript($ScriptBlock)
        $this.PowerShell.Runspace = $this.Runspace
        $this.Timer.Interval = $Timeout
        Register-ObjectEvent -InputObject $this.Timer -EventName Elapsed -MessageData $this -Action ({
            $Job = $event.MessageData
            $Job.PowerShell.Stop()
            $Job.Runspace.Close()
            $Job.Timer.Enabled = $False
        })
    }

    ### Method: Call this when you want to start the job.
    [void] Start() {
        $this.Runspace.Open()
        $this.Timer.Start()
        $this.IAsyncResult = $this.PowerShell.BeginInvoke()
    }

    ### Method: Once the job has finished, call this to get the results.
    [object[]] GetResult() {
        return $this.PowerShell.EndInvoke($this.IAsyncResult)
    }
}
Example Usage of TimedScript Class
# EXAMPLE: The timeout period is set longer than the execution time of the script, so this will succeed
$Job1 = [TimedScript]::new({ Start-Sleep -Seconds 2 }, 4000)
# EXAMPLE: This script will fail. Even though Get-Process returns quickly, the Start-Sleep call will cause it to be terminated by its Timer.
$Job2 = [TimedScript]::new({ Get-Process -Name s*; Start-Sleep -Seconds 3 }, 2000)
# EXAMPLE: This job will fail, because the timeout is less than the script execution time.
$Job3 = [TimedScript]::new({ Start-Sleep -Seconds 3 }, 1000)
$Job1.Start()
$Job2.Start()
$Job3.Start()
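That said, because $PowerShell is a public property, argument passing can be bolted on from the outside without changing the class; a hedged sketch (not part of the original class):
# Append arguments to the underlying [powershell] object *before* Start()
# (i.e., before BeginInvoke) is called; they bind positionally to the
# script block's param() declaration.
$Job4 = [TimedScript]::new({ param($seconds) Start-Sleep -Seconds $seconds; "slept $seconds" }, 5000)
[void] $Job4.PowerShell.AddArgument(2)   # binds to param($seconds)
$Job4.Start()
$Job4.GetResult()   # blocks until done, then returns "slept 2"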
Code is also hosted on GitHub Gist.
I think you might want to investigate using PowerShell runspaces:
http://learn-powershell.net/2012/05/13/using-background-runspaces-instead-of-psjobs-for-better-performance/
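A minimal sketch of that runspace approach (the query body, computer name, and 10-second timeout are illustrative placeholders):
# One [powershell] instance per query; AsyncWaitHandle gives a cheap per-query timeout.
$ps = [powershell]::Create()
[void] $ps.AddScript({ param($computer)
    Get-WmiObject Win32_OperatingSystem -ComputerName $computer   # the actual query
}).AddArgument('server01')
$handle = $ps.BeginInvoke()
if ($handle.AsyncWaitHandle.WaitOne(10000)) {   # wait up to 10 seconds
    $result = $ps.EndInvoke($handle)            # query finished in time
} else {
    $ps.Stop()                                  # timed out; abandon this query
}
$ps.Dispose()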