Terminate part of a PowerShell script and continue - powershell

I made a PowerShell script that reads remote PCs' registry keys and prints them out to an HTML page.
Sometimes remote PCs freeze or hang, which delays the final HTML page by around 40 seconds for each frozen PC.
How can I time just a part of my script, say 1-2 commands, and if that time gets too long, terminate that command and continue the script with the next PC in the PC name array?
Or maybe the solution is not in timing; is there another way? Thanks!
Something like:
$Reg = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey('Users', $remote[$i]) + a timer running in parallel + an if condition on the timer count, so that when the counter reaches a threshold, OpenRemoteBaseKey is terminated and the script continues.

Execute the statement as a job. Monitor the time outside the job. If the job runs longer than you prefer, kill it.
$job = Invoke-Command `
    -Session $s `
    -ScriptBlock { $Reg = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey('Users', $using:remote[$using:i]) } `
    -AsJob `
    -JobName foo
# note: local variables such as $remote and $i must be passed into the
# remote script block with $using:
# poll the job state for up to 3 seconds
$int = 0
while (($job.State -like "Running") -and ($int -lt 3)) {
    Start-Sleep -Seconds 1
    $int++
}
# still running after the timeout: kill it and move on
if ($job.State -like "Running") { $job | Stop-Job }
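For the original per-PC scenario, the same pattern can be wrapped around each name in the array. Here is a minimal sketch using Start-Job and Wait-Job -Timeout instead of a manual sleep loop; the array name $remote comes from the question, while the 10-second threshold and the GetSubKeyNames() call are illustrative assumptions:
foreach ($pc in $remote) {
    # run the slow registry call in a background job so it can be abandoned
    $job = Start-Job -ScriptBlock {
        param($name)
        $reg = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey('Users', $name)
        # read the needed keys here and return plain data, because live
        # RegistryKey objects do not survive job serialization
        $reg.GetSubKeyNames()
    } -ArgumentList $pc
    # Wait-Job returns the job when it finishes, or $null on timeout
    if (Wait-Job $job -Timeout 10) {
        $subKeys = Receive-Job $job
        # ... build the HTML fragment from $subKeys ...
    } else {
        Stop-Job $job
        Write-Warning "$pc did not respond in time, skipping"
    }
    Remove-Job $job -Force
}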

Related

Cannot break do-while loop in Powershell

I have been working on this PowerShell script. I'm still pretty new at this. The way it works is that I give it a list of servers, which it goes through, restarting the App Pools one by one. I have been having a problem with the snippet below. I do not use Restart-WebAppPool because it sometimes starts the App Pool before it's ready and leaves it stopped. I cannot show the whole script because it's big and has proprietary info. The problem I'm having is that I can't seem to break the do-while loop. In it I'm checking the App Pool status to make sure that it's stopped.
What I get for $PL_Break appears to be a valid string showing "Stopped". However, even when it shows "Stopped", it doesn't break the loop.
$PL_Timeout = New-TimeSpan -Seconds 95
foreach ($PL_Server in $PL_ServerName)
{
    $PL_Stopwatch = [System.Diagnostics.Stopwatch]::StartNew()
    Write-Host "`n`n`nRestarting App Pool : $PL_AppPool"
    Write-Host "Stopping: " $PL_Server -f Green
    Invoke-Command -ComputerName $PL_Server -ArgumentList $PL_AppPool -ScriptBlock {param($PL_App) Stop-WebAppPool -Name $PL_App}
    do {
        sleep 5
        $PL_Br = Invoke-Command -ComputerName $PL_Server -ArgumentList $PL_AppPool -ScriptBlock {param($PL_App) Get-IISAppPool $PL_App | Select-Object State}
        $PL_Break = [string]$PL_Br.State.value
    } while (($PL_Stopwatch.elapsed -lt $PL_Timeout) -or ($PL_Break -ne "Stopped"))
} # Foreach - Server
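The loop never exits because of the -or: the condition keeps the loop running while the stopwatch is under the timeout or the pool is not yet stopped, so it can only exit once the timeout has elapsed and the pool reports "Stopped" at the same time. A minimal sketch of the corrected exit logic, using -and so the loop ends as soon as the pool stops or the timeout is hit (casting $PL_Br.State directly also avoids the .value property, which the state enum does not expose):
do {
    Start-Sleep -Seconds 5
    $PL_Br = Invoke-Command -ComputerName $PL_Server -ArgumentList $PL_AppPool -ScriptBlock {param($PL_App) Get-IISAppPool $PL_App | Select-Object State}
    $PL_Break = [string]$PL_Br.State
    # keep looping only while the pool has NOT stopped AND the timeout has NOT elapsed
} while (($PL_Break -ne "Stopped") -and ($PL_Stopwatch.Elapsed -lt $PL_Timeout))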

How to loop until a process is running, let it run 60 seconds, kill the process and break the loop?

I tried to repurpose another script to loop until the BCR_MODE_SET process is running, let it run for 60 seconds, kill the process, and then break the loop.
If the process is already running and I run just the contents of the loop, it kills the process as it should after 60 seconds. However, if I run the whole script, it never kills the process once the process has started running.
Start-Process C:\Userdata\Barcode2COM.exe
for ($i=0; $i -le $max_iterations; $i++)
{
    $proc = Get-Process -Name BCR_MODE_SET
    # keep track of timeout event
    $timeouted = $null # reset any previously set timeout
    # wait up to x seconds for normal termination
    $proc | Wait-Process -Timeout 60 -ErrorAction SilentlyContinue -ErrorVariable timeouted
    if ($timeouted)
    {
        # terminate the process
        $proc | kill
    }
    elseif ($proc.ExitCode -ne 0)
    {
    }
}
I ended up simplifying to the following script which accomplishes what I need:
Start-Process C:\Userdata\Barcode2COM.exe
do { $ProcessActive = Get-Process BCR_MODE_SET -ErrorAction SilentlyContinue }
while ($ProcessActive -eq $null)
Start-Sleep -Seconds 45
$ProcessActive | kill
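One caveat with the simplified version: the do loop polls as fast as it can and never gives up if BCR_MODE_SET never appears. A minimal sketch with a polling delay and a startup timeout; the 30-second limit is an assumption:
Start-Process C:\Userdata\Barcode2COM.exe
$deadline = (Get-Date).AddSeconds(30)   # give the process 30 s to appear
do {
    Start-Sleep -Milliseconds 500
    $ProcessActive = Get-Process BCR_MODE_SET -ErrorAction SilentlyContinue
} while (-not $ProcessActive -and (Get-Date) -lt $deadline)
if ($ProcessActive) {
    Start-Sleep -Seconds 45
    $ProcessActive | Stop-Process   # kill is an alias for Stop-Process
}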

PowerShell Start-Job throttling

I am having trouble with throttling of jobs and "hung" or "failed" jobs. Here is basically what I am trying to do.
$allServers = Import-Csv "C:\temp\input.csv"
$job = $allServers | % {
    while (@(Get-Job -State Running).Count -ge 6) {
        Start-Sleep -Seconds 2
    }
    Start-Job -Name $_.computerName -ScriptBlock {
        param ($cpn,$dom)
        (DO QUERIES HERE)
        (OUTPUT TO OBJECT HERE)
    } -ArgumentList $_.computerName,$_.Domain
}
$jobsdone = $job | Wait-Job | Receive-Job
I would like to run 5 concurrent jobs, simple enough.
The issue is when I query a server that does not respond, the job hangs and the script never ends. I have tried adding...
Wait-Job -Name $_.computerName -Timeout 20
...above the last curly brace, but all that does is effectively limit the concurrency to one thread until 20 seconds go by, then abandon the hung job to do other jobs. The whole script still doesn't finish in that instance.
This code works fine without the throttling and job waiting, so long as I don't get a non-responsive server.
Inside your while loop, check how long each job has been running. If that exceeds some timeout you determine, stop that job, e.g.:
while (@(Get-Job -State Running).Count -ge 6) {
    $now = Get-Date
    foreach ($job in @(Get-Job -State Running)) {
        if ($now - (Get-Job -Id $job.Id).PSBeginTime -gt [TimeSpan]::FromMinutes(2)) {
            Stop-Job $job
        }
    }
    Start-Sleep -Seconds 2
}
You might want to check out this PowerShell team blog post on how to throttle jobs using a queue.
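A minimal sketch of that queue idea combined with the timeout check above; the five-job throttle and the two-minute limit are assumptions, and the column names computerName and Domain come from the question's CSV:
$queue = New-Object System.Collections.Queue
Import-Csv "C:\temp\input.csv" | ForEach-Object { $queue.Enqueue($_) }
$timeout = [TimeSpan]::FromMinutes(2)
while ($queue.Count -gt 0 -or @(Get-Job).Count -gt 0) {
    # stop jobs that have been running too long
    foreach ($j in @(Get-Job -State Running)) {
        if ((Get-Date) - $j.PSBeginTime -gt $timeout) { Stop-Job $j }
    }
    # harvest and remove finished (or stopped) jobs to free throttle slots
    foreach ($j in @(Get-Job | Where-Object { $_.State -ne 'Running' })) {
        Receive-Job $j
        Remove-Job $j
    }
    # top up to five running jobs from the queue
    while ($queue.Count -gt 0 -and @(Get-Job -State Running).Count -lt 5) {
        $server = $queue.Dequeue()
        Start-Job -Name $server.computerName -ScriptBlock {
            param ($cpn,$dom)
            # (DO QUERIES HERE)
        } -ArgumentList $server.computerName,$server.Domain | Out-Null
    }
    Start-Sleep -Seconds 2
}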

PowerShell: Run multiple jobs in parallel and view streaming results from background jobs

Overview
Looking to call a PowerShell script that takes in an argument, runs each job in the background, and shows me the verbose output.
Problem I am running into
The script appears to run, but I want to verify this for sure by streaming the results of the background jobs as they are running.
Code
###StartServerUpdates.ps1 Script###
# get list of servers to update from text file and store in array
$servers = Get-Content c:\serverstoupdate.txt
# run all jobs, using multi-threading, in background
ForEach ($server in $servers) {
    Start-Job -FilePath c:\cefcu_it\psscripts\PSPatch.ps1 -ArgumentList $server
}
# wait for all jobs
Get-Job | Wait-Job
# get all job results
Get-Job | Receive-Job
What I am currently seeing:
Id Name State HasMoreData Location Command
-- ---- ----- ----------- -------- -------
23 Job23 Running True localhost #patch server ...
25 Job25 Running True localhost #patch server ...
What I want to see:
Searching for approved updates ...
Update Found: Security Update for Windows Server 2003 (KB2807986)
Update Found: Windows Malicious Software Removal Tool - March 2013 (KB890830)
Download complete. Installing updates ...
The system must be rebooted to complete installation.
cscript exited on "myServer" with error code 3.
Reboot required...
Waiting for server to reboot (35)
Searching for approved updates ...
There are no updates to install.
cscript exited on "myServer" with error code 2.
Servername "myServer" is fully patched after 2 loops
I want to be able to see the output or store that somewhere so I can refer back to be sure the script ran and see which servers rebooted, etc.
Conclusion:
In the past, I ran the script and it went through updating the servers one at a time and gave me the output I wanted, but when I started doing more servers, this task took too long, which is why I am trying to use background jobs with "Start-Job".
Can anyone help me figure this out, please?
You may take a look at the module SplitPipeline.
It is specifically designed for such tasks. The working demo code is:
# import the module (not necessary in PS V3)
Import-Module SplitPipeline
# some servers (from 1 to 10 for the test)
$servers = 1..10
# process servers by parallel pipelines and output results immediately
$servers | Split-Pipeline {process{"processing server $_"; sleep 1}} -Load 1, 1
For your task, replace "processing server $_"; sleep 1 (which simulates a slow job) with a call to your script, using the variable $_ as input, the current server.
If each job is not processor intensive, then increase the parameter Count (the default is the processor count) to improve performance.
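For instance, raising the parallelism for I/O-bound work looks like this; the count of 10 is an arbitrary assumption:
# 10 parallel pipelines, one server piped to each at a time
$servers | Split-Pipeline -Count 10 -Load 1,1 {process{ "processing server $_"; sleep 1 }}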
Not a new question, but I feel it is missing an answer using PowerShell workflows and their parallel capabilities, available from PowerShell version 3. It is less code and maybe more understandable than starting and waiting for jobs, which of course works well too.
I have two files: TheScript.ps1 which coordinates the servers and BackgroundJob.ps1 which does some kind of check. They need to be in the same directory.
The Write-Output in the background job file writes to the same stream you see when starting TheScript.ps1.
TheScript.ps1:
workflow parallelCheckServer {
    param ($Servers)
    foreach -parallel ($Server in $Servers)
    {
        Invoke-Expression -Command ".\BackgroundJob.ps1 -Server $Server"
    }
}
parallelCheckServer -Servers @("host1.com", "host2.com", "host3.com")
Write-Output "Done with all servers."
BackgroundJob.ps1 (for example):
param (
    [Parameter(Mandatory=$true)] [string] $server
)
Write-Host "[$server]`t Processing server $server"
Start-Sleep -Seconds 5
So when starting TheScript.ps1 it will write "Processing server" 3 times, but it will not take 15 seconds, only about 5, because the iterations run in parallel.
[host3.com] Processing server host3.com
[host2.com] Processing server host2.com
[host1.com] Processing server host1.com
Done with all servers.
In your ForEach loop you'll want to grab the output generated by the Jobs already running.
Example Not Tested
$sb = {
    "Starting Job on $($args[0])"
    # Do something
    "$($args[0]) => Do something completed successfully"
    "$($args[0]) => Now for something completely different"
    "Ending Job on $($args[0])"
}
Foreach ($computer in $computers) {
    Start-Job -ScriptBlock $sb -Args $computer | Out-Null
    Get-Job | Receive-Job
}
Now if you do this, all your results will be mixed together. You might want to put a stamp on your output to tell which computer it came from.
Or
Foreach ($computer in $computers) {
    Start-Job -ScriptBlock $sb -Args $computer | Out-Null
    # note: the job state is 'Completed', not 'Complete'
    Get-Job | ? {$_.State -eq 'Completed' -and $_.HasMoreData} | % {Receive-Job $_}
}
while ((Get-Job -State Running).Count) {
    Get-Job | ? {$_.State -eq 'Completed' -and $_.HasMoreData} | % {Receive-Job $_}
    Start-Sleep -Seconds 1
}
It will show all the output as soon as each job finishes, without anything being mixed up.
If you want multiple jobs in progress, you'll probably want to massage the output to keep straight on the console which output goes with which job.
$BGList = 'Black','Green','DarkBlue','DarkCyan','Red','DarkGreen'
$JobHash = @{}; $ColorHash = @{}; $i = 0
ForEach ($server in $servers)
{
    Start-Job -FilePath c:\cefcu_it\psscripts\PSPatch.ps1 -ArgumentList $server |
        foreach {
            $ColorHash[$_.ID] = $BGList[$i++]
            $JobHash[$_.ID] = $Server
        }
}
While ((Get-Job).State -match 'Running')
{
    foreach ($Job in Get-Job | where {$_.HasMoreData})
    {
        [System.Console]::BackgroundColor = $ColorHash[$Job.ID]
        Write-Host $JobHash[$Job.ID] -ForegroundColor Black -BackgroundColor White
        Receive-Job $Job
    }
    Start-Sleep -Seconds 5
}
[System.Console]::BackgroundColor = 'Black'
You can get the results by doing something like this after all the jobs have been received:
$array = @()
Get-Job -Name * | where { $array += $_.ChildJobs.Output }
.ChildJobs.Output will contain anything that was returned by each job.
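Using where purely for its side effect works, but it is easy to misread; a minimal equivalent sketch with a plain foreach:
$array = @()
foreach ($j in Get-Job) {
    # each child job's Output collection holds whatever the job returned
    $array += $j.ChildJobs.Output
}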
function OutputJoblogs {
    [CmdletBinding(DefaultParameterSetName='Name')]
    Param
    (
        [Parameter(Mandatory=$true, Position=0)]
        [System.Management.Automation.Job] $job,
        [Parameter(Mandatory=$true, Position=1)]
        [string] $logFolder,
        [Parameter(Mandatory=$true, Position=2)]
        [string] $logTimeStamp
    )
    # output all logs while the job is running or still has data
    while ($job.State -eq "Running" -or $job.HasMoreData) {
        Start-Sleep -Seconds 1
        foreach ($remotejob in $job.ChildJobs) {
            if ($remotejob.HasMoreData) {
                $output = (Receive-Job $remotejob)
                if ($output) {
                    $remotejob.Location + ": " + ($output | Tee-Object -Append -FilePath ("$logFolder\$logTimeStamp." + $remotejob.Location + ".txt"))
                }
            }
        }
    }
    # output errors
    foreach ($remotejob in $job.ChildJobs) {
        if ($remotejob.Error.Count -gt 0) { $remotejob.Location + ": " }
        foreach ($myerr in $remotejob.Error) {
            $myerr 2>&1 | Tee-Object -Append -FilePath ("$logFolder\$logTimeStamp." + $remotejob.Location + ".ERROR.txt")
        }
        if ($remotejob.JobStateInfo.Reason.ErrorRecord.Count -gt 0) { $remotejob.Location + ": " }
        foreach ($myerr in $remotejob.JobStateInfo.Reason.ErrorRecord) {
            $myerr 2>&1 | Tee-Object -Append -FilePath ("$logFolder\$logTimeStamp." + $remotejob.Location + ".ERROR.txt")
        }
    }
}
# example of usage
$logfileDate = "$((Get-Date).ToString('yyyy-MM-dd-HH.mm.ss'))"
$job = Invoke-Command -ComputerName "servername1","servername2" -ScriptBlock {
    for ($i=1; $i -le 5; $i++) {
        "$i`n"
        if ($i -gt 2) {
            Write-Error "Bad thing happened"
        }
        if ($i -eq 4) {
            throw "Super Bad thing happened"
        }
        Start-Sleep -Seconds 1
    }
} -AsJob
OutputJoblogs -Job $job -logFolder "$PSScriptRoot\logs" -logTimeStamp $logfileDate

PowerShell Job Queue

Run a job for each server in a list. I only want 5 jobs running at a time. When a job completes, it should start a new job on the next server in the list. Here's what I have so far, but I can't get it to start a new job after the first 5 jobs have run:
$MaxJobs = 5
$list = Get-Content ".\list.csv"
$Queue = New-Object System.Collections.Queue
$CurrentJobQueue = Get-Job -State Running
$JobQueueCount = $CurrentJobQueue.Count
ForEach ($Item in $list)
{
    Write-Host "Adding $Item to queue"
    $Queue.Enqueue($Item)
}
Function Global:Remote-Install
{
    $Server = $Queue.Dequeue()
    $j = Start-Job -Name $Server -ScriptBlock {
        If ($JobQueueCount -gt 0)
        {
            Test-Connection $Server -Count 15
        }##EndIf
    }##EndScriptBlock
}
For ($i = 0; $i -lt $MaxJobs; $i++)
{
    Remote-Install
}
PowerShell will do this for you if you use Invoke-Command, e.g.:
Invoke-Command -ComputerName $serverArray -ScriptBlock { .. script here ..} -ThrottleLimit 5 -AsJob
BTW I don't think your use of a .NET Queue is going to work because Start-Job fires up another PowerShell process to execute the job.
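If you do want a queue with Start-Job, the dequeuing has to stay in the parent session, and values have to be passed into the job explicitly (variables like $Server and $JobQueueCount are not visible inside the job's script block). A minimal sketch along those lines; the Test-Connection payload comes from the question:
$MaxJobs = 5
$Queue = New-Object System.Collections.Queue
Get-Content ".\list.csv" | ForEach-Object { $Queue.Enqueue($_) }
while ($Queue.Count -gt 0 -or @(Get-Job -State Running).Count -gt 0) {
    # start new jobs in the parent session until the throttle is reached
    while ($Queue.Count -gt 0 -and @(Get-Job -State Running).Count -lt $MaxJobs) {
        $Server = $Queue.Dequeue()
        Start-Job -Name $Server -ScriptBlock {
            param($s)
            Test-Connection $s -Count 15
        } -ArgumentList $Server | Out-Null
    }
    # harvest finished jobs so throttle slots free up
    Get-Job -State Completed | ForEach-Object { Receive-Job $_; Remove-Job $_ }
    Start-Sleep -Seconds 1
}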
You may take a look at the cmdlet Split-Pipeline of the module SplitPipeline.
The code will look like:
Import-Module SplitPipeline
$MaxJobs = 5
$list = Get-Content ".\list.csv"
$list | Split-Pipeline -Count $MaxJobs -Load 1,1 {process{
    # process an item from $list represented by $_
    ...
}}
-Count $MaxJobs limits the number of parallel jobs. -Load 1,1 tells it to pipe exactly one item to each job at a time.
The advantage of this approach is that the code itself is invoked synchronously, and it outputs results from the jobs as if everything were invoked sequentially (even the output order can be preserved, with the switch Order).
Note that this approach does not use remoting: the code runs in the current PowerShell session, in several parallel runspaces.