I'm trying to track progress across two scripts and I think it's a scope issue. The two functions live in Functions.ps1 and are called from another script, main.ps1. The progress is tracked fine until it gets to the Invoke-Command in Function 1. Why is the remote session resetting the progress tracker? Here is what the code and output look like:
main.ps1
. "C:\Functions.ps1" #dot source so functions available
$steps = 10
Write-ProgressHelper -stepTotal $steps -Status "reset"
Functions.ps1
Function Write-ProgressHelper {
    param (
        [int]$stepTotal,
        [string]$Status
    )
    Switch ($Status)
    {
        "run" {
            $global:Counter++
            Write-Host "$Counter"
            Write-Progress -Activity "STIG" -PercentComplete (($Counter / $stepTotal) * 100)
        }
        "reset" {
            $Counter = 0
            Write-Host "Counter Reset"
        }
    }
}
Function 1 {
    Write-Host "StepNumber:1"
    Write-ProgressHelper -stepTotal $steps -Status "run"
    #some code here
    Write-Host "StepNumber:2"
    $sysinfo = {
        . "C:\Functions.ps1" #dot source so Write-ProgressHelper is available within Invoke-Command
        #more code
        Write-Host "StepNumber:3"
        Write-ProgressHelper -stepTotal $using:steps -Status "run"
        Write-Host "StepNumber:4"
        #more code
        Write-ProgressHelper -stepTotal $using:steps -Status "run"
    }#end sysinfo
    Invoke-Command -ComputerName $Servers -ScriptBlock $sysinfo
    Write-Host "StepNumber:10"
    Write-ProgressHelper -stepTotal $using:steps -Status "run"
}#end Function 1
There are 10 steps to track in Function 1, and I output the $Counter variable at each one. Steps 1, 2, and 10 are within Function 1; steps 3-9 are inside the Invoke-Command session. I tried declaring the counter as global and that didn't work (I had to declare it with script scope for it to increment at all). Here is the output:
StepNumber:1
1
StepNumber:2
2
StepNumber:3
1
StepNumber:4
2
StepNumber:5
3
StepNumber:6
4
StepNumber:7
5
StepNumber:8
6
StepNumber:9
7
StepNumber:10
3
Remote variables don't get saved back to local variables (see about_Remote_Variables). So even though you define $Counter as a global variable, it is only global within that PowerShell session; it is not available on the remote computer. When Invoke-Command is called, a new PowerShell session is created on the remote computer, and its own $Counter starts at 0. It increments there and is discarded when the session returns. The existing value of $Counter on the local computer is retained and continues on.
When we look at the value of $global:Counter, we have to realize that the variable is only available on the computer where the script is executing:
Local Computer Server1
----------------- -------------------
$global:Counter $global:Counter
_________________ ___________________
StepNumber:1 1 (0)
1
StepNumber:2 2 (0)
2
Invoke-Command -Computer Server1
|---------------->
StepNumber:3 (2) 1
1
StepNumber:4 (2) 2
2
StepNumber:5 (2) 3
3
StepNumber:6 (2) 4
4
StepNumber:7 (2) 5
5
StepNumber:8 (2) 6
6
StepNumber:9 (2) 7
7
<-----------------|
StepNumber:10 3
3
See how the value of $global:Counter is incremented for steps 1 and 2. When Invoke-Command runs, PowerShell creates a new session on the remote computer, where $global:Counter starts at 0. That is why you see the value apparently "reset". After steps 3-9 execute, control returns to the local computer, where $global:Counter is still 2. That is why step 10 shows 3.
To have an effective counter, you will have to pass the counter value into Invoke-Command via -ArgumentList and return the updated value back.
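A minimal sketch of that pattern, assuming a single remote computer ($Server) for simplicity (with several computers, Invoke-Command returns one value per computer and you would need to reconcile them):

```powershell
# Seed the remote counter from the local one, run the remote steps,
# then bring the final value back into the local counter.
$global:Counter = Invoke-Command -ComputerName $Server -ArgumentList $global:Counter, $steps -ScriptBlock {
    param($startCount, $stepTotal)
    . "C:\Functions.ps1"            # makes Write-ProgressHelper available remotely
    $global:Counter = $startCount   # continue counting where the local side left off
    Write-ProgressHelper -stepTotal $stepTotal -Status "run"   # steps 3-9 go here
    $global:Counter                 # last output: send the updated count back
}
```

The final `$global:Counter` in the script block is the script block's output, so the assignment on the local side picks up where the remote session finished.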
Related
I have a recursive function that is executed around ~750 times, iterating over XML files and processing them. The code runs using Start-Job.
Example below:
$job = Start-Job -ScriptBlock {
    function Test-Function {
        Param (
            $count
        )
        Write-Host "Count is: $count"
        $count++
        Test-Function -count $count
    }
    Test-Function -count 1
}
Output:
$job | Receive-Job
Count is: 224
Count is: 225
Count is: 226
Count is: 227
The script failed due to call depth overflow.
The depth overflow occurs at 227 consistently on my machine. If I remove Start-Job, I can reach ~750 (and further). I am using jobs for batch processing.
Is there a way to configure the depth overflow value when using Start-Job?
Is this a limitation of PowerShell Jobs?
I can't answer about the specifics of the call depth overflow limits in PS 5.1 / 7.2, but you could base your recursion on a queue within the job.
So instead of recursing within the function, you drive it from outside (still within the job, though).
Here's what that looks like:
$job = Start-Job -ScriptBlock {
    $Queue = [System.Collections.Queue]::new()
    function Test-Function {
        Param (
            $count
        )
        Write-Host "Count is: $count"
        $count++
        # Next item to process.
        $Queue.Enqueue($count)
    }
    # Call the function once
    Test-Function -count 1
    # Process the queue
    while ($Queue.Count -gt 0) {
        Test-Function -count $Queue.Dequeue()
    }
}
Reference:
.NET Queue class
Not an answer that solves the problem, but rather an informative one: you could use an instance of the PowerShell SDK's [powershell] class instead of Start-Job, which can handle far more levels of recursion, in case that helps. Here are my results:
Technique            Iterations  PowerShell Version  Operating System
---------            ----------  ------------------  ----------------
Start-Job            ~226        5.1                 Windows 10
Start-Job            ~2008       7.2.1               Linux
PowerShell Instance  ~4932       5.1                 Windows 10
PowerShell Instance  ~11193      7.2.1               Linux
Code to reproduce
$instance = [powershell]::Create().AddScript({
    function Test-Function {
        Param($count)
        Write-Host "Count is: $count"
        $count++
        Test-Function -count $count
    }
    Test-Function -count 1
})
$handle = $instance.BeginInvoke()
do {
    $done = $handle.AsyncWaitHandle.WaitOne(500)
} until ($done)
$instance.Streams.Information[-1]
$instance.Dispose()
I want to measure elapsed time from a cmdlet
Invoke-ASCmd
I am using it the following way
$elapsedTime = [system.diagnostics.stopwatch]::StartNew()
$j = Start-Job -ScriptBlock {
    Invoke-ASCmd -InputFile $XMLF -Server "$Server[-1]" >$output_file
}
do {
    Write-Progress -Activity "Syncing..." -Status "$([string]::Format("Time Elapsed: {0:d2}:{1:d2}:{2:d2}", $elapsedTime.Elapsed.Hours, $elapsedTime.Elapsed.Minutes, $elapsedTime.Elapsed.Seconds))"
    #-PercentComplete ($_/10);
    Start-Sleep -Milliseconds 250
} while ($j.State -eq 'Running')
Receive-Job -Job $j
$elapsedTime.Stop()
However, all I see on the console is a flashing blue progress bar that doesn't appear to be elapsing the time at all... and frankly, I don't think the script block (the Invoke-ASCmd cmdlet) is being executed at all.
Why is that?
It also appears to last only 1 second.
I know the script block is not working because the syncing is supposed to take at least 20 seconds, so something is wrong.
Also, I would like to get the percentage (the circles animation/progress); this is not working:
-percentcomplete ($_/10);
One last thing: I would like to save the final elapsed time to a variable $FinalTime. Would I do that inside the loop or outside?
I am combining these two answers here and modifying for my needs:
https://stackoverflow.com/a/9813370/8397835
https://stackoverflow.com/a/8468024/8397835
Yes, the progress is quick because it takes PowerShell 1 second to load the module before erroring out. We can see the error message with Receive-Job:
PS C:\> Receive-Job $j
InputFile "" not found
+ CategoryInfo : InvalidData: (:) [Invoke-ASCmd], FileNotFoundException
+ FullyQualifiedErrorId : DataValidation,Microsoft.AnalysisServices.PowerShell.Cmdlets.ExecuteScriptCommand
+ PSComputerName : localhost
InputFile "" not found indicates that the variables were empty. They are empty because you can't reference local variables directly inside the script block. With Start-Job, you must pass them into the script block as arguments and receive them as parameters inside the script block. Something like this:
$j = Start-Job -Arg $XMLF, $Server, $output_file -ScriptBlock {
    Param($XMLF, $Server, $output_file)
    Invoke-ASCmd -InputFile $XMLF -Server "$Server" >$output_file
}
As for progress: since there is no "direct" way to measure how close the command is to 100%, we "fake it". We know it takes about 20 seconds to execute, so we have the progress bar do some math, mapping the elapsed time from 0 to 20 seconds onto 0 to 100 percent:
[Math]::Min(100 * ($elapsedTime.Elapsed.TotalSeconds / 20), 100)
(Note the use of Elapsed.TotalSeconds; Elapsed.Seconds is only the seconds component and wraps back to 0 every minute.) Essentially, we use $elapsedTime for 0 to 100 percent over 20 seconds. The 20 can be changed to any number close to the approximate execution time. Using [Math]::Min, we ensure that if it takes longer than 20 seconds the progress shows 100 percent while the status continues to show the time. So it would look like this:
do {
    Write-Progress -Activity "Syncing..." -Status "$($elapsedTime.Elapsed.ToString())" -PercentComplete ([Math]::Min(100 * ($elapsedTime.Elapsed.TotalSeconds / 20), 100))
    Start-Sleep -Milliseconds 250
} while ($j.State -eq 'Running')
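As for the last question: save $FinalTime outside the loop, after the job finishes, so it captures the total run time; inside the loop the value would still be changing. A minimal sketch:

```powershell
Receive-Job -Job $j
$elapsedTime.Stop()
$FinalTime = $elapsedTime.Elapsed   # a TimeSpan; format with $FinalTime.ToString()
```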
I'm trying to get the active processes for powershell (as an example) every 5 seconds by running the script below. I killed 2 PowerShell sessions, but the script doesn't update the active session count to 3; instead it still displays 5 sessions. Please help me see where I am going wrong.
$process = Get-Process powershell*
$count = $process.count
Do {
    $count
    sleep -Seconds 5
} until ($count -eq 1)
Output:
You just need to put your first two statements inside your do block.
do
{
    $process = Get-Process powershell*
    $count = $process.count
    $count
    sleep -Seconds 5
} until ($count -eq 1)
That way you recalculate $count each time you loop; otherwise the value never changes, as you observed.
How to make Powershell execute pipelines asynchronously instead of 'sequentially'?
An example to illustrate this:
Function PipeDelay($Name, $Milliseconds){
    Process {
        Sleep -Milliseconds $Milliseconds
        Write-Host "$Name : $_"
        $_
    }
}
1..7 | PipeDelay Fast 100 | PipeDelay Slow 300 | PipeDelay Slowest 900
The output in both the ISE and the PowerShell console is the following:
Fast : 1
Slow : 1
Slowest : 1
1
Fast : 2
Slow : 2
Slowest : 2
2
Fast : 3
Slow : 3
Slowest : 3
3
...
It is as if PowerShell runs the whole pipeline sequentially for one input object before processing the next input object. The execution looks something like this:
Fast : 1 2 3
Slow : 1---1 2---2 3---3
Slowest : 1---------------1 2---------------2 3-----...
Is it possible, with some setting, environment variable, etc., to make the pipeline stages run independently/asynchronously, perhaps with a setting for pipeline buffer size? So that the execution would look something like this:
Fast : 1 2 3 4 5 6 7
Slow : 1---1 2---2 3---3 4---4 5---5 6---6 7---7
Slowest : 1---------------1 2---------------2 3-----...
NOTE
I thought it was because of STA/MTA mode. I don't understand them completely, but the same result in the ISE (STA) and the PowerShell console (MTA) seems to rule out STA/MTA mode as the cause.
Also, I thought Write-Host was the issue forcing the pipeline to be processed sequentially, but even if I substitute Write-Host with New-Event, the sequential processing still applies.
I don't think you will be able to leverage the pipeline asynchronously in the way you desire without it being quite expensive in resource usage. I tried to capture the spirit of what you are trying to accomplish, but in a different way. I used a slightly different example to illustrate how [Automation.PowerShell] async invocation works.
#1. You have a list of room requests you want to process in a function. TimeToClean was added as a controllable per-item delay.
$roomsToClean = @( ([psCustomObject]@{Name='Bedroom';TimeToClean=2}),
                   ([psCustomObject]@{Name='Kitchen';TimeToClean=5}),
                   ([psCustomObject]@{Name='Bathroom';TimeToClean=3}),
                   ([psCustomObject]@{Name='Living room';TimeToClean=1}),
                   ([psCustomObject]@{Name='Dining room';TimeToClean=1}),
                   ([psCustomObject]@{Name='Foyier';TimeToClean=1})
                 )
#2. We will clean each room and return a custom PowerShell object with a message.
Function Clean-Room{
    param([string]$RoomName,[int]$Seconds)
    Sleep -Seconds $Seconds
    Write-Output ([psCustomObject]@{Message = "The robot cleaned the $RoomName in $Seconds seconds."})
}
#3. Executing this list synchronously will result in an approximate 13 second runtime.
Write-Host "===== Synchronous Results =====" -ForegroundColor Green
$stopwatch = [system.diagnostics.stopwatch]::StartNew()
foreach($item in $roomsToClean){
    $obj = Clean-Room $item.Name $item.TimeToClean
    Write-Output $obj.Message
}
$stopwatch.Stop()
Write-Host "Execution time for synchronous function was $($stopwatch.Elapsed)." -ForegroundColor Green
#4. Now let's run this function asynchronously for all of these items. Expected runtime will be approximately 5 seconds.
#=============== Setting up an asynchronous PowerShell Automation object and attaching it to a runspace.
#Many [Automation.PowerShell] objects can be attached to a given runspace pool, and the pool will manage queueing/dequeueing of each PS object as it completes.
#Create a runspace pool with 2 to 5 runspaces.
$minRunSpaces = 2
$maxRunsSpaces = 5
$runspacePool = [RunspaceFactory]::CreateRunspacePool($minRunSpaces, $maxRunsSpaces)
$runspacePool.ApartmentState = 'STA' #STA = single-threaded apartment; MTA = multithreaded apartment.
$runspacePool.Open() #The runspace pool must be opened before it can be used.
#For each room object, create an [Automation.PowerShell] object and attach it to the runspace pool.
#Asynchronously invoke the function for all objects in the collection.
$ps_collections = foreach($room in $roomsToClean){
    try{
        $ps = [System.Management.Automation.PowerShell]::Create()
        $ps.RunspacePool = $runspacePool
        #Add your custom functions to the [Automation.PowerShell] object.
        #Add each argument with its parameter name for readability. You may use AddArgument instead, but know your positional arguments.
        [void] $ps.AddScript(${function:Clean-Room})
        [void] $ps.AddParameter('RoomName',$room.Name)       #Add parameterName,value
        [void] $ps.AddParameter('Seconds',$room.TimeToClean) #Add parameterName,value
        #Extend the PS management object with an AsyncResult property holding the IAsyncResult, for receiving results later.
        $ps | Add-Member -MemberType NoteProperty -Name 'AsyncResult' -Value $ps.BeginInvoke() #Invoke asynchronously.
        $ps | Add-Member -MemberType ScriptMethod -Name 'GetAsyncResult' -Value { $this.EndInvoke($this.AsyncResult) } -PassThru
    }
    catch{
        throw $_ #Handle custom errors here.
    }
}
#After the function has been asynchronously called for all room objects, Grab results from asynchronous function calls.
Write-Host "===== Asynchronous Results =====" -ForegroundColor Green
$stopwatch = [system.diagnostics.stopwatch]::StartNew()
foreach($ps in $ps_collections){
    $obj = $ps.GetAsyncResult()
    [void] $ps.Dispose() #Dispose of the object after use.
    Write-Output $obj.Message
}
$stopwatch.Stop()
Write-Host "Execution time for asynchronous function was $($stopwatch.Elapsed)." -ForegroundColor Green
#Runspace cleanup.
If($runspacePool){
    [void] $runspacePool.Close()
    [void] $runspacePool.Dispose()
}
Result times will vary slightly but should look similar to this:
===== Synchronous Results =====
The robot cleaned the Bedroom in 2 seconds.
The robot cleaned the Kitchen in 5 seconds.
The robot cleaned the Bathroom in 3 seconds.
The robot cleaned the Living room in 1 seconds.
The robot cleaned the Dining room in 1 seconds.
The robot cleaned the Foyier in 1 seconds.
Execution time for synchronous function was 00:00:13.0719157.
===== Asynchronous Results =====
The robot cleaned the Bedroom in 2 seconds.
The robot cleaned the Kitchen in 5 seconds.
The robot cleaned the Bathroom in 3 seconds.
The robot cleaned the Living room in 1 seconds.
The robot cleaned the Dining room in 1 seconds.
The robot cleaned the Foyier in 1 seconds.
Execution time for asynchronous function was 00:00:04.9909951.
I'm working on my first PowerShell script and can't figure the loop out.
I have the following, which will repeat $ActiveCampaigns number of times:
Write-Host "Creating $PQCampaign1 Pre-Qualified Report"
Invoke-Item "$PQCampaignPath1\PQ REPORT $PQCampaign1.qvw"
Write-Host "Waiting 1 minute for QlikView to update"
sleep -seconds 60 # Wait 1 minute for QlikView to Reload, create Report and Save.
DO{
    Write-Host "Daily Qlikview Reports"
    Write-Host "Waiting for QlikView to create the $PQCampaign1 PQ Report"
    Get-Date
    Write-Host "Checking...."
    sleep -seconds 1
    Write-Host ""
    Write-Host "Not Done Yet"
    Write-Host "Will try again in 5 seconds."
    Write-Host ""
    sleep -seconds 5
}
Until (Test-Path "$PQCampaignPath1\$PQCampaign1 $PQReportName $ReportDate.xlsx" -pathType leaf)
Get-Date
Write-Host "Done with $PQCampaign1 PQ Report. Wait 10 seconds."
sleep -seconds 10
These parameters need to increase with one for each loop:
$PQCampaign1 (should become $PQCampaign2, then 3, etc.)
$PQCampaignPath1 (should become $PQCampaignPath2, then 3, etc.)
So if $ActiveCampaigns is set to 8 on a certain day, this needs to repeat 8 times, and the last time it must open $PQCampaign8, which lies in $PQCampaignPath8.
How can I fix this?
Use:
1..10 | % { write "loop $_" }
Output:
PS D:\temp> 1..10 | % { write "loop $_" }
loop 1
loop 2
loop 3
loop 4
loop 5
loop 6
loop 7
loop 8
loop 9
loop 10
This may be what you are looking for:
for ($i=1; $i -le $ActiveCampaigns; $i++)
{
    $PQCampaign = Get-Variable -Name "PQCampaign$i" -ValueOnly
    $PQCampaignPath = Get-Variable -Name "PQCampaignPath$i" -ValueOnly
    # Do stuff with $PQCampaign and $PQCampaignPath
}
Here is a simple way to loop any number of times in PowerShell.
It does the same job as the for loop above, but is easier for newer programmers and scripters to understand. It uses a range and foreach. A range is defined as:
$range = lower..upper
for example:
$range = 1..10
A range can be used directly in a foreach loop. While not the most optimal approach, any performance loss or additional instruction to process would be unnoticeable. The solution is below:
foreach($i in 1..10){
    Write-Host $i
}
Or in your case:
$ActiveCampaigns = 10
foreach($i in 1..$ActiveCampaigns)
{
    Write-Host $i
    If($i -eq $ActiveCampaigns){
        # Do your stuff on the last iteration here
    }
}
See this link. It shows you how to dynamically create variables in PowerShell.
Here is the basic idea:
Use New-Variable and Get-Variable:
for ($i=1; $i -le 5; $i++)
{
    New-Variable -Name "var$i" -Value $i
    Get-Variable -Name "var$i" -ValueOnly
}
(It is taken from the link provided, and I don't take credit for the code.)