PowerShell Start-Job throttling - powershell

I am having trouble with throttling of jobs and "hung" or "failed" jobs. Here is basically what I am trying to do.
$allServers = Import-Csv "C:\temp\input.csv"
$job = $allServers | % {
    while (@(Get-Job -State Running).Count -ge 6) {
        Start-Sleep -Seconds 2
    }
    Start-Job -Name $_.computerName -ScriptBlock {
        param ($cpn,$dom)
        (DO QUERIES HERE)
        (OUTPUT TO OBJECT HERE)
    } -ArgumentList $_.computerName,$_.Domain
}
$jobsdone = $job | Wait-Job | Receive-Job
I would like to run 5 concurrent jobs, simple enough.
The issue is when I query a server that does not respond, the job hangs and the script never ends. I have tried adding...
Wait-Job -Name $_.computerName -Timeout 20
...above the last curly brace, but all that does is effectively limit the concurrency to one job at a time: it waits until 20 seconds go by, then abandons the hung job and moves on to other jobs. The whole script still doesn't finish in that case.
This code works fine without the throttling and job waiting, so long as I don't get a non-responsive server.

Inside your while loop, check how long each job has been running. If that exceeds a timeout you determine, stop that job, e.g.:
while (@(Get-Job -State Running).Count -ge 6) {
    $now = Get-Date
    foreach ($job in @(Get-Job -State Running)) {
        if ($now - (Get-Job -Id $job.Id).PSBeginTime -gt [TimeSpan]::FromMinutes(2)) {
            Stop-Job $job
        }
    }
    Start-Sleep -sec 2
}
You might want to check out this PowerShell team blog post on how to throttle jobs using a queue.
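Putting the two pieces together, here is a rough sketch of how the timeout check could be folded into the original loop. The two-minute cap is only an illustrative value, the -ge 5 check matches the "5 concurrent jobs" goal from the question, and the query placeholder is kept as-is:
$allServers = Import-Csv "C:\temp\input.csv"
$timeout = [TimeSpan]::FromMinutes(2)
$jobs = $allServers | % {
    while (@(Get-Job -State Running).Count -ge 5) {
        # stop anything that has been running longer than the cap
        Get-Job -State Running |
            Where-Object { ((Get-Date) - $_.PSBeginTime) -gt $timeout } |
            Stop-Job
        Start-Sleep -Seconds 2
    }
    Start-Job -Name $_.computerName -ScriptBlock {
        param ($cpn,$dom)
        # (DO QUERIES HERE)
    } -ArgumentList $_.computerName,$_.Domain
}
# stopped (timed-out) jobs still return whatever they produced before Stop-Job
$jobsdone = $jobs | Wait-Job -Timeout 120 | Receive-Job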

Related

Terminate part of powershell script and continue

I made a PowerShell script that reads remote PCs' registry keys and prints them out to an HTML page.
Sometimes remote PCs freeze or hang, and each frozen PC adds around 40 seconds before the final HTML page is ready.
How can I time just a part of my script, say 1-2 commands, and if that time gets too large, terminate that command and continue the script with the next PC in the PC name array?
Or maybe the solution is not in timing; is there another way? Thanks!
Something like:
$Reg = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey('Users', $remote[$i]), with a timer running in parallel and an if condition on the timer count; when the counter reaches the threshold, terminate OpenRemoteBaseKey and continue.
Execute the statement as a job. Monitor the time outside the job. If the job runs longer than you prefer, kill it.
$job = Invoke-Command `
    -Session $s `
    -ScriptBlock { $Reg = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey('Users', $remote[$i]) } `
    -AsJob `
    -JobName foo
$int = 0
while (($job.State -like "Running") -and ($int -lt 3)) {
    Start-Sleep -Seconds 1
    $int++
}
if ($job.State -like "Running") { $job | Stop-Job }
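To come back to the original goal of moving on to the next PC in the name array, here is a minimal sketch along the same lines using a plain Start-Job per PC. The input file, the 10-second cap, and the particular registry key read are only illustrative assumptions:
$pcNames = Get-Content 'C:\temp\pcs.txt'   # hypothetical input list of PC names
foreach ($pc in $pcNames) {
    $job = Start-Job -ArgumentList $pc -ScriptBlock {
        param ($pc)
        # do the registry reads inside the job so only plain values come back
        $reg = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey('LocalMachine', $pc)
        $key = $reg.OpenSubKey('SOFTWARE\Microsoft\Windows NT\CurrentVersion')
        [pscustomobject]@{ PC = $pc; ProductName = $key.GetValue('ProductName') }
    }
    if (-not (Wait-Job $job -Timeout 10)) {
        Stop-Job $job; Remove-Job $job -Force   # frozen PC: give up and move on to the next name
        Write-Warning "Skipping $pc (no response within 10 seconds)"
        continue
    }
    Receive-Job $job       # emit the result for the HTML report
    Remove-Job $job
}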

Job doing Stop-Job on self

Just looking for verification here. Can a job Stop-Job itself? I have a script that creates a job that suppresses a service for as long as the main script is running (by way of a passed $PID), and I am currently using this:
Start-Job -name:'SuppressAdAppMgrSvc' -argumentList $id -scriptBlock {
    param (
        $id
    )
    do {
        if ((Get-Service AdAppMgrSvc -errorAction:silentlyContinue).Status -eq 'Running') {
            Stop-Service AdAppMgrSvc -force -warningAction:silentlyContinue -errorAction:silentlyContinue
        }
        Start-Sleep -s 5
        $powershellProcess = Get-Process -id:$id -errorAction:silentlyContinue
    } while ($powershellProcess)
    Stop-Job 'SuppressAdAppMgrSvc' -warningAction:silentlyContinue -errorAction:silentlyContinue
    Remove-Job 'SuppressAdAppMgrSvc' -warningAction:silentlyContinue -errorAction:silentlyContinue
}
My thinking is the job will run, and when $powershellProcess no longer exists, the Stop-Job will run. But I suspect the Remove-Job would not, since this is the very job that just got stopped. In general it probably isn't a problem, as 99% of the time I do a reboot when my script completes, but I am curious if there is a pattern for dealing with this? Or is it something of an edge case?
How do you expect Remove-Job to run inside a stopped job? And why would you want to do that from inside the job in the first place?
The job will automatically enter the Stopped state when the code in the scriptblock terminates, so all you need to do is to wait for that to happen and then remove the job:
Start-Job -Name 'SuppressAdAppMgrSvc' -ArgumentList $id -ScriptBlock {
...
} | Wait-Job | Remove-Job

Powershell: Run multiple jobs in parallel and view streaming results from background jobs

Overview
Looking to call a Powershell script that takes in an argument, runs each job in the background, and shows me the verbose output.
Problem I am running into
The script appears to run, but I want to verify this for sure by streaming the results of the background jobs as they are running.
Code
###StartServerUpdates.ps1 Script###
#get list of servers to update from text file and store in array
$servers=get-content c:\serverstoupdate.txt
#run all jobs, using multi-threading, in background
ForEach($server in $servers){
Start-Job -FilePath c:\cefcu_it\psscripts\PSPatch.ps1 -ArgumentList $server
}
#Wait for all jobs
Get-Job | Wait-Job
#Get all job results
Get-Job | Receive-Job
What I am currently seeing:
Id Name State HasMoreData Location Command
-- ---- ----- ----------- -------- -------
23 Job23 Running True localhost #patch server ...
25 Job25 Running True localhost #patch server ...
What I want to see:
Searching for approved updates ...
Update Found: Security Update for Windows Server 2003 (KB2807986)
Update Found: Windows Malicious Software Removal Tool - March 2013 (KB890830)
Download complete. Installing updates ...
The system must be rebooted to complete installation.
cscript exited on "myServer" with error code 3.
Reboot required...
Waiting for server to reboot (35)
Searching for approved updates ...
There are no updates to install.
cscript exited on "myServer" with error code 2.
Servername "myServer" is fully patched after 2 loops
I want to be able to see the output or store that somewhere so I can refer back to be sure the script ran and see which servers rebooted, etc.
Conclusion:
In the past, I ran the script and it went through updating the servers one at a time and gave me the output I wanted, but when I started doing more servers, this task took too long, which is why I am trying to use background jobs with "Start-Job".
Can anyone help me figure this out, please?
You may take a look at the module SplitPipeline.
It is specifically designed for such tasks. The working demo code is:
# import the module (not necessary in PS V3)
Import-Module SplitPipeline
# some servers (from 1 to 10 for the test)
$servers = 1..10
# process servers by parallel pipelines and output results immediately
$servers | Split-Pipeline {process{"processing server $_"; sleep 1}} -Load 1, 1
For your task replace "processing server $_"; sleep 1 (simulates a slow job) with a call to your script and use the variable $_ as input, the current server.
If each job is not processor intensive then increase the parameter Count (the default is processor count) in order to improve performance.
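For example, a rough sketch reusing the PSPatch.ps1 call and server list from the question; -Count is the parameter mentioned above, and 10 parallel pipelines is an arbitrary choice:
Import-Module SplitPipeline
$servers = Get-Content c:\serverstoupdate.txt
# up to 10 parallel pipelines; each server's output streams back as it is produced
$servers | Split-Pipeline -Count 10 {process{ & c:\cefcu_it\psscripts\PSPatch.ps1 $_ }}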
This is not a new question, but I feel it is missing an answer that uses PowerShell workflows and their parallel capabilities, available from PowerShell version 3. It is less code and maybe easier to understand than starting and waiting for jobs, which of course works well too.
I have two files: TheScript.ps1 which coordinates the servers and BackgroundJob.ps1 which does some kind of check. They need to be in the same directory.
The Write-Output in the background job file writes to the same stream you see when starting TheScript.ps1.
TheScript.ps1:
workflow parallelCheckServer {
    param ($Servers)
    foreach -parallel ($Server in $Servers)
    {
        Invoke-Expression -Command ".\BackgroundJob.ps1 -Server $Server"
    }
}
parallelCheckServer -Servers @("host1.com", "host2.com", "host3.com")
Write-Output "Done with all servers."
BackgroundJob.ps1 (for example):
param (
[Parameter(Mandatory=$true)] [string] $server
)
Write-Host "[$server]`t Processing server $server"
Start-Sleep -Seconds 5
So when starting TheScript.ps1 it will write "Processing server" 3 times, but it will not take 15 seconds, only about 5, because the servers are processed in parallel.
[host3.com] Processing server host3.com
[host2.com] Processing server host2.com
[host1.com] Processing server host1.com
Done with all servers.
In your ForEach loop you'll want to grab the output generated by the Jobs already running.
Example Not Tested
$sb = {
    "Starting Job on $($args[0])"
    #Do something
    "$($args[0]) => Do something completed successfully"
    "$($args[0]) => Now for something completely different"
    "Ending Job on $($args[0])"
}
Foreach($computer in $computers){
    Start-Job -ScriptBlock $sb -Args $computer | Out-Null
    Get-Job | Receive-Job
}
Now if you do this, all your results will be mixed together. You might want to put a stamp on your output to tell which computer it came from (see the sketch after the second variant below).
Or
Foreach($computer in $computers){
    Start-Job -ScriptBlock $sb -Args $computer | Out-Null
    Get-Job | ? {$_.State -eq 'Completed' -and $_.HasMoreData} | % {Receive-Job $_}
}
while((Get-Job -State Running).Count){
    Get-Job | ? {$_.State -eq 'Completed' -and $_.HasMoreData} | % {Receive-Job $_}
    Start-Sleep -Seconds 1
}
It will show all the output as soon as each job finishes, without it being mixed up.
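Picking up the earlier suggestion of stamping the output so you can tell which computer it came from, here is a small sketch; the prefix format is just one option:
$sb = {
    param ($computer)
    "[$computer] Starting job"
    # Do something
    "[$computer] Do something completed successfully"
    "[$computer] Ending job"
}
Foreach($computer in $computers){
    Start-Job -ScriptBlock $sb -ArgumentList $computer | Out-Null
}
while((Get-Job -State Running).Count){
    Get-Job | ? {$_.State -eq 'Completed' -and $_.HasMoreData} | % {Receive-Job $_}
    Start-Sleep -Seconds 1
}
# pick up anything left over from jobs that finished between checks
Get-Job | ? {$_.HasMoreData} | % {Receive-Job $_}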
If you want multiple jobs in progress, you'll probably want to massage the output to help keep what output goes with which job straight on the console.
$BGList = 'Black','Green','DarkBlue','DarkCyan','Red','DarkGreen'
$JobHash = @{}; $ColorHash = @{}; $i = 0
ForEach($server in $servers)
{
    Start-Job -FilePath c:\cefcu_it\psscripts\PSPatch.ps1 -ArgumentList $server |
        foreach {
            $ColorHash[$_.ID] = $BGList[$i++]
            $JobHash[$_.ID] = $Server
        }
}
While ((Get-Job).State -match 'Running')
{
    foreach ($Job in Get-Job | where {$_.HasMoreData})
    {
        [System.Console]::BackgroundColor = $ColorHash[$Job.ID]
        Write-Host $JobHash[$Job.ID] -ForegroundColor Black -BackgroundColor White
        Receive-Job $Job
    }
    Start-Sleep -Seconds 5
}
[System.Console]::BackgroundColor = 'Black'
You can get the results by doing something like this after all the jobs have been received:
$array = @()
Get-Job -Name * | foreach {$array += $_.ChildJobs.Output}
.ChildJobs.output will have anything that was returned in each job.
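If you would rather keep the output grouped per job instead of in one flat array, something like this sketch collects it into a hashtable keyed by job name:
$results = @{}
foreach ($j in Get-Job) {
    # ChildJobs.Output holds whatever each job returned
    $results[$j.Name] = $j.ChildJobs.Output
}
$results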
function OutputJoblogs {
    [CmdletBinding(DefaultParameterSetName='Name')]
    Param
    (
        [Parameter(Mandatory=$true, Position=0)]
        [System.Management.Automation.Job] $job,
        [Parameter(Mandatory=$true, Position=1)]
        [string] $logFolder,
        [Parameter(Mandatory=$true, Position=2)]
        [string] $logTimeStamp
    )
    #Output all logs
    while ($job.State -eq "Running" -or $job.HasMoreData){
        Start-Sleep -Seconds 1
        foreach($remotejob in $job.ChildJobs){
            if($remotejob.HasMoreData){
                $output = (Receive-Job $remotejob)
                if($output){
                    $remotejob.Location +": "+ (($output) | Tee-Object -Append -FilePath ("$logFolder\$logTimeStamp."+$remotejob.Location+".txt"))
                }
            }
        }
    }
    #Output errors
    foreach($remotejob in $job.ChildJobs){
        if($remotejob.Error.Count -gt 0){$remotejob.Location +": "}
        foreach($myerr in $remotejob.Error){
            $myerr 2>&1 | Tee-Object -Append -FilePath ("$logFolder\$logTimeStamp."+$remotejob.Location+".ERROR.txt")
        }
        if($remotejob.JobStateInfo.Reason.ErrorRecord.Count -gt 0){$remotejob.Location +": "}
        foreach($myerr in $remotejob.JobStateInfo.Reason.ErrorRecord){
            $myerr 2>&1 | Tee-Object -Append -FilePath ("$logFolder\$logTimeStamp."+$remotejob.Location+".ERROR.txt")
        }
    }
}
#example of usage
$logfileDate="$((Get-Date).ToString('yyyy-MM-dd-HH.mm.ss'))"
$job = Invoke-Command -ComputerName "servername1","servername2" -ScriptBlock {
    for ($i=1; $i -le 5; $i++) {
        "$i`n"
        if($i -gt 2){
            Write-Error "Bad thing happened"}
        if($i -eq 4){
            throw "Super Bad thing happened"
        }
        Start-Sleep -Seconds 1
    }
} -AsJob
OutputJoblogs -Job $job -logFolder "$PSScriptRoot\logs" -logTimeStamp $logfileDate

Managing the running time of background jobs, timing out if not completed after x seconds

I would like to time my background jobs (started with Start-Job) and time them out after x seconds. I find it hard, however, to keep track of the running time of each separate job (I am running approximately 400 jobs).
I wish there was a way to time out the job and set it to failed if not completed in X seconds, but I find no timeout-parameter.
What would be a good way to track the individual run-time of the jobs?
I guess I could create a hashtable with the start time and ID of each job, check against the running state, and do a manual timeout, but that sounds kind of like reinventing the wheel.
Any ideas?
Edit
Thank you everyone for a fruitful discussion and great inspiration on this topic!
You can use a hash table of timers:
$jobtimer = @{}
foreach ($job in $jobs){
    Start-Job -Name $job -ScriptBlock {scriptblock commands}
    $jobtimer[$job] = [System.Diagnostics.Stopwatch]::StartNew()
}
The running time of each job will be in $jobtimer[$job].elapsed
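A sketch of the watch loop that goes with it, assuming $jobs holds the job names used above and a 60-second cap per job (the value is arbitrary):
$maxSeconds = 60
while (Get-Job -State Running) {
    foreach ($entry in $jobtimer.GetEnumerator()) {
        $j = Get-Job -Name $entry.Key -ErrorAction SilentlyContinue
        if ($j -and $j.State -eq 'Running' -and
            $entry.Value.Elapsed.TotalSeconds -gt $maxSeconds) {
            # past the cap: stop the job and treat it as failed
            Stop-Job $j
        }
    }
    Start-Sleep -Seconds 2
}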
Just walk through the list of running jobs and stop any that have run past your timeout spec e.g.:
$timeout = [timespan]::FromMinutes(1)
$now = Get-Date
Get-Job | Where {$_.State -eq 'Running' -and
(($now - $_.PSBeginTime) -gt $timeout)} | Stop-Job
BTW there are more properties to a job object than the default formatting shows e.g.:
3 > $job | fl *
State : Running
HasMoreData : True
StatusMessage :
Location : localhost
Command : Start-sleep -sec 30
JobStateInfo : Running
Finished : System.Threading.ManualResetEvent
InstanceId : de370ea8-763b-4f3b-ba0e-d45f402c8bc4
Id : 3
Name : Job3
ChildJobs : {Job4}
PSBeginTime : 3/18/2012 11:07:20 AM
PSEndTime :
PSJobType : BackgroundJob
Output : {}
Error : {}
Progress : {}
Verbose : {}
Debug : {}
Warning : {}
You can specify the timeout option of Wait-Job:
-Timeout
Determines the maximum wait time for each background job, in seconds.
The default, -1, waits until the job completes, no matter how long it
runs. The timing starts when you submit the Wait-Job command, not the
Start-Job command.
If this time is exceeded, the wait ends and the command prompt
returns, even if the job is still running. No error message is
displayed.
Here's some example code:
This part just makes some test jobs:
Remove-Job -Name *
$jobs = @()
1..10 | % {
    $jobs += Start-Job -ScriptBlock {
        Start-Sleep -Seconds (Get-Random -Minimum 5 -Maximum 20)
    }
}
The variable $timedOutJobs contains jobs that timed out. You can then restart them or what have you.
$jobs | Wait-Job -Timeout 10
$timedOutJobs = Get-Job | ? {$_.State -eq 'Running'} | Stop-Job -PassThru
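One way to act on them, sketched against the same test jobs, is to re-create each timed-out job from its Command text; in a real script you would more likely re-dispatch your own scriptblock with its original arguments:
foreach ($tj in $timedOutJobs) {
    # start a fresh job running the same command, then discard the stopped one
    Start-Job -ScriptBlock ([scriptblock]::Create($tj.Command)) | Out-Null
    Remove-Job $tj -Force
}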
For completeness, this answer combines a maximum runtime per job with a maximum number of concurrently running jobs, as this is what most people are after.
The example below retrieves the printer configuration for each print server. There can be over 3000 printers, so we added throttling.
$i = 0
$maxConcurrentJobs = 40
$maxSecondsPerJob = 60
$jobTimer = @{ }

$StopLongRunningJobs = {
    $jobTimer.GetEnumerator().where( {
        ($_.Value.IsRunning) -and
        ($_.Value.Elapsed.TotalSeconds -ge $maxSecondsPerJob)
    }).Foreach( {
        $_.Value.Stop()
        Write-Verbose "Stop job '$($_.Name.Name)' that ran for '$($_.Value.Elapsed.TotalSeconds)' seconds"
        Stop-Job $_.Name
    })
}

Foreach ($Computer in @($GetPrinterJobResults.Where( { $_.Data }) )) {
    foreach ($Printer in $Computer.Data) {
        do {
            & $StopLongRunningJobs
            $running = @(Get-Job -State Running)
            $Wait = $running.Count -ge $maxConcurrentJobs
            if ($Wait) {
                Write-Verbose 'Waiting for jobs to finish'
                $null = $running | Wait-Job -Any -Timeout 5
            }
        } while ($Wait)
        $i++
        Write-Verbose "$i $($Computer.ComputerName) Get print config '$($Printer.Name)'"
        $Job = $Printer | Get-PrintConfiguration -AsJob -EA Ignore
        $jobTimer[$Job] = [System.Diagnostics.Stopwatch]::StartNew()
    }
}

$JobResult = Get-Job | Wait-Job -Timeout $maxSecondsPerJob -EA Ignore
$JobResult = Get-Job | Stop-Job -EA Ignore # add this line if you want to stop the jobs that reached the maximum wait time (timeout)
$JobResult = Get-Job | Receive-Job -EA Ignore
$JobResult.count

Monitoring jobs in a PowerShell session from another PowerShell session

A script is executing the following steps in a loop, assume both steps take a long time to complete:
$x = DoSomeWork;
Start-Job -Name "Process $x" { DoSomeMoreWork $x; };
Step 1 blocks the script and step 2 does not, of course.
I can easily monitor the progress/state of the loop and step 1 through the console.
What I'd also like to do is monitor the job status of jobs started by step 2 while the batch is still executing.
In general, is it possible to 'attach' to or query another PowerShell session from a separate session? (Assuming the monitoring session does not spawn the worker session.)
If I'm following you, then you cannot share state between two different console instances; it's not possible in the way you want to do it. However, you can monitor a job from within the same session by signaling with events from inside the job:
Start-Job -Name "bgsignal" -ScriptBlock {
# forward events named "progress" back to job owner
# this even works across machines ;-)
Register-EngineEvent -SourceIdentifier Progress -Forward
$percent = 0
while ($percent -lt 100) {
$percent += 10
# raise a new progress event, redirecting to $null to prevent
# it ending up in the job's output stream
New-Event -SourceIdentifier Progress -MessageData $percent > $null
# wait 5 seconds
sleep -Seconds 5
}
}
Now you have the choice to either use Wait-Event [-SourceIdentifier Progress], Register-EngineEvent -SourceIdentifier Progress [-Action { ... }] or plain old interactive Get-Event to see and/or act on progress from the same session (or a different machine if you started the job on a remote server.)
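For example, handling the events as they arrive in the owning session might look like this; the message text is just an illustration:
Register-EngineEvent -SourceIdentifier Progress -Action {
    # $event.MessageData carries the percentage sent from inside the job
    Write-Host "Background job is $($event.MessageData)% done"
} > $null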
It's also entirely possible you don't need the Jobs infrastructure if all work is being done on the local machine. Take a look at an old blog post of mine on the RunspaceFactory and PowerShell objects for a rudimentary script "threadpool" implementation:
http://www.nivot.org/2009/01/22/CTP3TheRunspaceFactoryAndPowerShellAccelerators.aspx
Hope this helps,
-Oisin
State is easy to monitor:
$job = Start-Job -Name "Process $x" { DoSomeMoreWork $x }
$job.state
If you don't need to retrieve any output data from the function then you can write to output like so:
$job = Start-Job {$i=0; while (1) { "Step $i"; $i++; Start-Sleep -sec 1 }}
while ($job.State -eq 'Running')
{
Receive-Job $job.id
}
If you do need to capture the output, then you could use the progress stream I think:
$job = Start-Job {$i=0; while (1) {
    Write-Progress -Activity "Activity" -Status "Step $i"; $i++; Start-Sleep -sec 1 }}
while ($job.State -eq 'Running') {
    $progress = $job.ChildJobs[0].Progress
    $progress | %{$_.StatusDescription}
    $progress.Clear(); Start-Sleep 1 }