$vcconnect has 50 machines and I need to run this job on all 50, but when I just run it, it crashes the shell.
I would like to limit parallel execution to 10 at a time.
I tried a do-while loop, but I was missing something: it still executed on all 50 at the same time and crashed my shell.
foreach($vci in $vcconnect){
[array]$jobstart += Start-Job -Name top2 -ArgumentList @($vci, $cred, $from, $to) -ScriptBlock $importcode
}
If you want to run scripts in parallel, and control the maximum number of concurrently running instances, use a RunspacePool:
# Create a RunspacePool, of maximum 10 concurrently running runspaces
$RSPool = [runspacefactory]::CreateRunspacePool(1,10)
$RSPool.Open()
# Start a new "job" for each server
$Jobs = foreach($vci in $vcconnect){
$PS = [PowerShell]::Create().AddScript($importcode)
$PS.RunspacePool = $RSPool
$vci, $cred, $from, $to |ForEach-Object {
[void]$PS.AddArgument($_)
}
New-Object psobject -Property @{
Shell = $PS
ComputerName = $vci
ResultHandle = $PS.BeginInvoke()
}
}
# Wait for the "jobs" to finish
do{
Start-Sleep -Milliseconds 500
} while ($Jobs |Where-Object { -not $_.ResultHandle.IsCompleted })
# Collect results, suppress (but warn on) errors
$Results = foreach($Job in $Jobs){
$Job.Shell.EndInvoke($Job.ResultHandle)
if($Job.Shell.HadErrors){
Write-Warning "$($Job.ComputerName) had $($Job.Shell.Streams.Error.Count) errors:"
$Job.Shell.Streams.Error |Write-Warning
}
}
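On PowerShell 7 and later, the same 10-at-a-time throttling can be expressed with ForEach-Object -Parallel; this is a minimal sketch assuming $vcconnect, $cred, $from, $to, and the $importcode scriptblock from the question:

```powershell
# PowerShell 7+ only. -ThrottleLimit caps concurrent runspaces at 10.
# Script blocks cannot cross runspaces via $using:, so pass the code as a
# string and rebuild it inside each parallel runspace.
$code = $importcode.ToString()
$Results = $vcconnect | ForEach-Object -Parallel {
    $sb = [scriptblock]::Create($using:code)
    & $sb $_ $using:cred $using:from $using:to
} -ThrottleLimit 10
```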
Related
#my script is an excerpt of https://www.codeproject.com/Tips/895840/Multi-Threaded-PowerShell-Cookbook (the first example)
#the issue is with ".AddScript($secb)": all jobs finish sequentially. Could anyone explain why?
#why is .AddScript($sb) concurrent in my script?
$numThreads = 5
# Create session state
$myString = "this is session state!"
$sessionState = [System.Management.Automation.Runspaces.InitialSessionState]::CreateDefault()
$sessionstate.Variables.Add((New-Object -TypeName System.Management.Automation.Runspaces.SessionStateVariableEntry -ArgumentList "myString", $myString, "example string"))
# Create runspace pool consisting of $numThreads runspaces
$RunspacePool = [RunspaceFactory]::CreateRunspacePool(1, $numThreads, $sessionState, $Host)
$RunspacePool.Open()
$Jobs = @()
$sb={
param ($data)
$r=Get-Random
Write-Host "before $r"
Start-Sleep -Seconds 3
Write-Host "after $r"
}
$secb={
param ($block)
Invoke-Command -ScriptBlock $block
}
1..5 | % {
#the line below is not concurrent; I don't know why
$Job = [powershell]::Create().AddScript($secb) # with .AddScript($sb) instead, it runs multi-threaded
$Job.AddArgument($sb)
$Job.RunspacePool = $RunspacePool
$Jobs += New-Object PSObject -Property @{
RunNum = $_
Job = $Job
Result = $Job.BeginInvoke()
}
}
Write-Host "Waiting.." -NoNewline
Do {
Write-Host "." -NoNewline
Start-Sleep -Seconds 1
} While ( $Jobs.Result.IsCompleted -contains $false) #Jobs.Result is a collection
I'm by far not an expert on runspaces; whenever I need multithreading I usually use the ThreadJob module. I'm not sure what you're doing wrong in your code, but I can show you a working example of what you're trying to do. I took this example from this answer (credit to Mathias), which is excellent, and modified it a bit.
Code:
cls
$elapsedTime = [System.Diagnostics.Stopwatch]::StartNew()
$numThreads = 5
# Create session state
$myString = "this is session state!"
$sessionState = [System.Management.Automation.Runspaces.InitialSessionState]::CreateDefault()
$sessionstate.Variables.Add((New-Object -TypeName System.Management.Automation.Runspaces.SessionStateVariableEntry -ArgumentList "myString", $myString, "example string"))
# Create runspace pool consisting of $numThreads runspaces
$RunspacePool = [RunspaceFactory]::CreateRunspacePool(1, $numThreads, $sessionState, $Host)
$RunspacePool.Open()
$runspaces = foreach($i in 1..5)
{
$PSInstance = [powershell]::Create().AddScript({
param ($TestNumber)
Write-Output "Test Number: $TestNumber"
$r={Get-Random}
Write-Output "before $(& $r)"
Start-Sleep -Seconds 3
Write-Output "after $(& $r)"
}).AddParameter('TestNumber',$i)
$PSInstance.RunspacePool = $RunspacePool
[pscustomobject]@{
Instance = $PSInstance
Result = $PSInstance.BeginInvoke()
}
}
while($runspaces|Where-Object{-not $_.Result.IsCompleted})
{
Start-Sleep -Milliseconds 500
}
$resultRunspace = [collections.generic.list[string]]::new()
$Runspaces|ForEach-Object {
$resultRunspace.Add($_.Instance.EndInvoke($_.Result))
}
$elapsedTime.Stop()
"Elapsed Time: {0}" -f $elapsedTime.Elapsed.TotalSeconds
""
$resultRunspace
Result:
Elapsed Time: 3.1271587
Test Number: 1 before 474010429 after 2055432874
Test Number: 2 before 1639634857 after 1049683678
Test Number: 3 before 72786850 after 2046654322
Test Number: 4 before 1958738687 after 1832326064
Test Number: 5 before 1944958392 after 1652518661
Now, again, if you are able to install modules I would recommend ThreadJob, as it is a lot easier to use and performs about as fast as runspaces.
I wrote a script some time ago that loops through all the directories in $HOME and counts the files in each one; it was meant to compare linear loops vs. ThreadJob vs. runspaces, and the results are why I would always recommend ThreadJob.
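For reference, a minimal sketch of the same throttled pattern with ThreadJob (Start-ThreadJob ships with PowerShell 7; on Windows PowerShell 5.1 it needs Install-Module ThreadJob first):

```powershell
# Start-ThreadJob queues the jobs and runs at most -ThrottleLimit of them at
# once, in-process on runspaces, so startup is far cheaper than Start-Job.
$jobs = foreach ($i in 1..5) {
    Start-ThreadJob -ThrottleLimit 5 -ArgumentList $i -ScriptBlock {
        param($n)
        "Test Number: $n"
        Start-Sleep -Seconds 3
    }
}
$jobs | Receive-Job -Wait -AutoRemoveJob
```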
I need to generate some load on about 100 Windows 2012 R2 servers. The idea is to use PowerShell remoting to kick off my script to generate the CPU load. I am also trying to control the load by checking the current load and, if it is over X%, not spawning a new job. It's not the most elegant script, but it works when I execute it locally. When I run it remotely, things go wrong:
it seems as if the foreach loop does nothing, so I just have an idle process;
it will start about 15 jobs with no issues, then the rest all report as having failed.
I have tried various methods of calling the function. I have modified the function so that it contains a server variable, but I am still not getting the desired result. I have replaced Start-Job with Invoke-Command. PSRemoting is working; I can do basic tasks on the remote servers.
Any guidance would be appreciated
Below is my code
function Invoke-CPUStress
{
[CmdletBinding()]
param (
$duration
)
$start = (get-date).AddMinutes(1)
$end = $start.AddMinutes($duration)
while ((Get-Date) -lt $start)
{
sleep -Seconds 10
}
while ((get-date) -lt $end)
{
$Load = (Get-WmiObject Win32_Processor | Measure-Object -Property LoadPercentage -Average).Average
If ($Load -lt 60)
{
Start-Job -ScriptBlock {
$result = 1;
foreach ($number in 1..2147483647)
{
$result = $result * $number
sleep -Milliseconds 1
}
}
}
}
Get-Job | Stop-Job
Get-Job | Remove-Job
}
Invoke-CPUStress -duration 5
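One thing worth checking: jobs created by Start-Job inside a remoting session belong to that session and are torn down with it, which would explain jobs that seem to do nothing. A sketch of an alternative that makes the stress loop itself the remote payload (the server names are placeholders):

```powershell
# One remote job per server; Invoke-Command -AsJob keeps the work alive in
# the remote session while the local job object tracks it.
$servers = 'server01', 'server02'   # placeholder names
$job = Invoke-Command -ComputerName $servers -AsJob -ScriptBlock {
    param($duration)
    $end = (Get-Date).AddMinutes($duration)
    while ((Get-Date) -lt $end) {
        # busy work to generate CPU load
        $null = 1..100000 | ForEach-Object { [math]::Sqrt($_) }
    }
} -ArgumentList 5
Wait-Job $job | Receive-Job
```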
Long story short: we are experiencing issues with some of our servers that have crippling effects on them, and I am looking for a way to monitor them. I have a script that checks the RDP port to make sure it is open, and I am thinking I also want to use Get-Service and then return whether or not it pulled any data.
Here is the issue: I don't know how to limit the time it will wait for a response before returning false.
[bool](Get-Process -ComputerName MYSERVER)
Although I like Ansgar's answer with a time-limited job, I think a separate runspace and async invocation fit this task better.
The major difference here is that a runspace reuses the in-process thread pool, whereas the PSJob method launches a new process, with the overhead that entails: OS/kernel resources for spawning and managing a child process, serializing and deserializing data, etc.
Something like this:
function Timeout-Statement {
param(
[scriptblock[]]$ScriptBlock,
[object[]]$ArgumentList,
[int]$Timeout
)
$Runspace = [runspacefactory]::CreateRunspace()
$Runspace.Open()
$PS = [powershell]::Create()
$PS.Runspace = $Runspace
$PS = $PS.AddScript($ScriptBlock)
foreach($Arg in $ArgumentList){
$PS = $PS.AddArgument($Arg)
}
$IAR = $PS.BeginInvoke()
if($IAR.AsyncWaitHandle.WaitOne($Timeout)){
return $PS.EndInvoke($IAR)
}
return $false
}
Then use that to do:
$ScriptBlock = {
param($ComputerName)
Get-Process @PSBoundParameters
}
$Timeout = 2500 # 2 and a half seconds (2500 milliseconds)
Timeout-Statement $ScriptBlock -ArgumentList "mycomputer.fqdn" -Timeout $Timeout
You could run your check as a background job:
$sb = { Get-Process -ComputerName $args[0] }
$end = (Get-Date).AddSeconds(5)
$job = Start-Job -ScriptBlock $sb -ArgumentList 'MYSERVER'
do {
Start-Sleep -Milliseconds 100
$finished = (Get-Job -Id $job.Id).State -eq 'Completed'
} until ($finished -or (Get-Date) -gt $end)
if (-not $finished) {
Stop-Job -Id $job.Id
}
Receive-Job $job.Id
Remove-Job $job.Id
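The polling loop can also be collapsed using Wait-Job's built-in -Timeout parameter; a sketch of the same idea:

```powershell
$sb  = { Get-Process -ComputerName $args[0] }
$job = Start-Job -ScriptBlock $sb -ArgumentList 'MYSERVER'
# Wait-Job emits the job only if it finished within 5 seconds.
if (-not (Wait-Job $job -Timeout 5)) {
    Stop-Job $job
}
Receive-Job $job
Remove-Job $job -Force
```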
This is a known issue: https://connect.microsoft.com/PowerShell/feedback/details/645165/add-timeout-parameter-to-get-wmiobject
A workaround is provided at the same link:
Function Get-WmiCustom([string]$computername,[string]$namespace,[string]$class,[int]$timeout=15)
{
$ConnectionOptions = new-object System.Management.ConnectionOptions
$EnumerationOptions = new-object System.Management.EnumerationOptions
$timeoutseconds = new-timespan -seconds $timeout
$EnumerationOptions.set_timeout($timeoutseconds)
$assembledpath = "\\" + $computername + "\" + $namespace
#write-host $assembledpath -foregroundcolor yellow
$Scope = new-object System.Management.ManagementScope $assembledpath, $ConnectionOptions
$Scope.Connect()
$querystring = "SELECT * FROM " + $class
#write-host $querystring
$query = new-object System.Management.ObjectQuery $querystring
$searcher = new-object System.Management.ManagementObjectSearcher
$searcher.set_options($EnumerationOptions)
$searcher.Query = $querystring
$searcher.Scope = $Scope
trap { $_ } $result = $searcher.get()
return $result
}
You can call the function like this:
get-wmicustom -class Win32_Process -namespace "root\cimv2" -computername MYSERVER -timeout 1
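On systems where the CIM cmdlets are available, the same query takes a timeout directly, with no custom function needed (a sketch; the relevant parameter is -OperationTimeoutSec):

```powershell
# -OperationTimeoutSec bounds how long the CIM query may run before it
# fails with an error instead of hanging.
Get-CimInstance -ClassName Win32_Process -Namespace root\cimv2 `
    -ComputerName MYSERVER -OperationTimeoutSec 1
```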
I have a small PowerShell program that starts a few threads to do parallel calculations; when they finish, they append a line with the results to a text file and proceed to do some more. This worked fine in development and testing, but occasionally in production it hangs, and it seems the file is "jammed open". I have the writes wrapped in try blocks, but that does not help. I have written a toy application to illustrate the problem; it usually hangs after about 10-15 minutes (and after writing about 3000 lines).
It seems to me I would have been better off with a Python solution using mutexes or something, but I am pretty far down this road now. I'm looking for ideas on how I can easily fix this. I really thought Add-Content would be atomic...
Parentjob.ps1
# Start a bunch of jobs
$curdir = "c:\transfer\filecollide"
$tokens = "tok00","tok01","tok02",
"tok03","tok04","tok05",
"tok06","tok07","tok08"
$jobs = @()
foreach ($tok in $tokens)
{
$job = Start-Job -FilePath ".\childjob.ps1" -ArgumentList "${curdir}",$tok,2,1000
Start-Sleep -s 3 # stagger things a bit
Write-Output " Starting:${tok} job"
$jobs += ,$job
}
foreach ($job in $jobs)
{
wait-job $job
$out = receive-job $job
Write-Output($out)
}
childjob.ps1
param(
[string]$curdir = ".",
[string]$tok = "tok?",
[int]$interval = 10,
[int]$ntodo = 1
)
$nwritefails = 0
$nwritesuccess = 0
$nwrite2fails = 0
function singleLine
{
param(
[string]$tok,
[string]$fileappendout = "",
[int]$timeout = 3
)
$curdatetime = (Get-Date)
$sout = "${curdatetime},${tok},${global:nwritesuccess},${global:nwritefails},${global:nwrite2fails}"
$global:nwritesuccess++
try
{
Add-Content -Path $fileappendout -Value "${sout}"
}
catch
{
$global:nwritefails++
try
{
Start-Sleep -s 1
Add-Content -Path $fileappendout -Value "${sout}"
}
catch
{
$global:nwrite2fails++
Write-Output "Failed to write to ${fileappendout}"
}
}
}
Write-Output "Starting to process ${tok}"
#Start of main code
cd "${curdir}"
$ndone = 0
while ($true)
{
singleLine $tok "outfile.txt"
$ndone++
if ($ndone -gt $ntodo){ break }
Start-Sleep -s $interval
}
Write-Output "Successful ${tok} appends:${nwritesuccess} failed:${nwritefails} failed2:${nwrite2fails}"
Why not have the jobs write their results to the output stream, and use Receive-Job in the main thread to collect the results and update the file? You can do this while the jobs are still running. What you're writing to the output stream now looks like it might be more appropriately written to the progress stream.
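A sketch of that pattern, with the parent owning the file (this assumes the child jobs simply emit their result lines to the output stream instead of calling Add-Content themselves):

```powershell
# All file writes happen on this one thread, so writers never collide.
while ($jobs | Where-Object State -eq 'Running') {
    $jobs | Receive-Job | Add-Content -Path .\outfile.txt
    Start-Sleep -Seconds 1
}
# Drain anything emitted between the last poll and completion.
$jobs | Receive-Job | Add-Content -Path .\outfile.txt
```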
Run a job for each server in a list, with only 5 jobs running at a time. When a job completes, it should start a new job on the next server in the list. Here's what I have so far, but I can't get it to start a new job after the first 5 jobs have run:
$MaxJobs = 5
$list = Get-Content ".\list.csv"
$Queue = New-Object System.Collections.Queue
$CurrentJobQueue = Get-Job -State Running
$JobQueueCount = $CurrentJobQueue.count
ForEach($Item in $list)
{
Write-Host "Adding $Item to queue"
$Queue.Enqueue($Item)
}
Function Global:Remote-Install
{
$Server = $queue.Dequeue()
$j = Start-Job -Name $Server -ScriptBlock{
If($JobQueueCount -gt 0)
{
Test-Connection $Server -Count 15
}##EndIf
}##EndScriptBlock
}
For($i = 0 ;$i -lt $MaxJobs; $i++)
{
Remote-Install
}
PowerShell will do this for you if you use Invoke-Command e.g.:
Invoke-Command -ComputerName $serverArray -ScriptBlock { .. script here ..} -ThrottleLimit 5 -AsJob
BTW, I don't think your use of a .NET Queue is going to work, because Start-Job fires up another PowerShell process to execute the job.
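If you do want to stay with Start-Job, one workable pattern is to poll the running-job count and only start a new job when a slot frees up; a sketch using the question's variable names:

```powershell
$MaxJobs = 5
$list = Get-Content '.\list.csv'
foreach ($Server in $list) {
    # Block until fewer than $MaxJobs jobs are running.
    while ((Get-Job -State Running).Count -ge $MaxJobs) {
        Start-Sleep -Seconds 1
    }
    Start-Job -Name $Server -ArgumentList $Server -ScriptBlock {
        param($s)
        Test-Connection $s -Count 15
    }
}
Get-Job | Wait-Job | Receive-Job
```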
You may take a look at the cmdlet Split-Pipeline of the module SplitPipeline.
The code will look like:
Import-Module SplitPipeline
$MaxJobs = 5
$list = Get-Content ".\list.csv"
$list | Split-Pipeline -Count $MaxJobs -Load 1,1 {process{
# process an item from $list represented by $_
...
}}
-Count $MaxJobs limits the number of parallel jobs. -Load 1,1 tells it to pipe exactly one item to each job.
The advantage of this approach is that the code itself is invoked synchronously, and it outputs results from jobs as if everything had run sequentially (even the output order can be preserved with the switch Order).
But this approach does not use remoting. The code works in the current PowerShell session in several runspaces.