I'm trying to use the PowerShell pipeline for some recurring tasks and checks, such as performing a certain check X times, or moving on once the response in the pipeline changes state.
The simplest script I can write to do such checks is something like this:
do {
    $result = Update-ACMEIdentifier dns1 -ChallengeType dns-01
    if ($result.Status -ne 'pending') {
        "Hurray"
        break
    }
    "Still pending"
    Start-Sleep -s 3
} while ($true)
The question is: how can I write this script as a single pipeline? It looks like the only thing I need is an infinite producer to start the pipeline with:
1..Infinity |
%{ Start-Sleep -Seconds 1 | Out-Null; $_ } |
%{Update-ACMEIdentifier dns1 -ChallengeType dns-01 } |
Select -ExpandProperty Status | ?{$_ -eq 'pending'} |
#some code here to stop infinity producer or stop the pipeline
So is there a simple one-liner that lets me put an infinite object producer at the start of the pipeline? A good example of such a producer would be a tick generator that emits the current timestamp into the pipeline every 13 seconds.
@PetSerAl gave the crucial pointer in a comment on the question: a script block containing an infinite loop, invoked with the call operator (&), creates an infinite source of objects that can be sent through a pipeline:
& { while ($true) { ... } }
A later pipeline segment can then stop the pipeline on demand.
Note:
As of PS v5, only Select-Object is capable of directly stopping a pipeline.
An imperfect generic pipeline-stopping function can be found in this answer of mine.
Using break to stop the pipeline is tricky, because it doesn't just stop the pipeline, but breaks out of any enclosing loop - safe use requires wrapping the pipeline in a dummy loop.
Alternatively, a Boolean variable can be used to terminate the infinite producer.
Here are examples demonstrating each approach:
A working example with Select-Object -First:
& { while ($true) { Get-Date; Start-Sleep 1 } } | Select-Object -First 5
This executes Get-Date every second indefinitely, but is stopped by Select-Object after 5 iterations.
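Applied to the polling scenario from the question, the same pattern collapses into one pipeline. This is a sketch that assumes, as shown in the question, that Update-ACMEIdentifier returns an object with a Status property:
& { while ($true) { Update-ACMEIdentifier dns1 -ChallengeType dns-01; Start-Sleep -Seconds 3 } } |
    Where-Object { $_.Status -ne 'pending' } |
    Select-Object -First 1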
An equivalent example with break and a dummy loop:
do {
    & { while ($true) { Get-Date; Start-Sleep 1 } } |
        % { $i = 0 } { $_; if (++$i -eq 5) { break } } # `break` stops the pipeline and
                                                       # breaks out of the dummy loop
} while ($false)
An equivalent example with a Boolean variable that terminates the infinite producer:
& { while (-not $done) { Get-Date; Start-Sleep 1 } } |
% { $done = $false; $i = 0 } { $_; if (++$i -eq 5) { $done = $true } }
Note how even though $done is only initialized in the 2nd pipeline segment - namely in the ForEach-Object (%) cmdlet's (implicit) -Begin block - that initialization still happens before the 1st pipeline segment - the infinite producer - starts executing. Thanks again, @PetSerAl.
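You can verify that ordering with a tiny experiment; the producer here just emits a single string that interpolates $done:
& { "producer sees done=$done" } | % { $done = 42 } { $_ }  # -> producer sees done=42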
Not sure why you'd want to use a pipeline over a loop in this scenario, but it is possible by using a bit of C#; e.g.
$Source = @"
using System.Collections.Generic;
public static class Counter
{
    public static bool Running = false;
    public static IEnumerable<long> Run()
    {
        Running = true;
        while (Running)
        {
            for (long l = 0; l <= long.MaxValue; l++)
            {
                yield return l;
                if (!Running) {
                    break;
                }
            }
        }
    }
}
"@
Add-Type -TypeDefinition $Source -Language CSharp
[Counter]::Run() | %{
    start-sleep -seconds 1
    $_
} | %{
    "Hello $_"
    if ($_ -eq 12) {
        [Counter]::Running = $false;
    }
}
NB: Because the numbers are generated in parallel with the pipeline execution it's possible that the generator may create a backlog of numbers before it's stopped. In my testing that didn't happen; but I believe that scenario is possible.
You'll also notice that I've stuck a for loop inside the while loop; that's to ensure that the values produced are valid; i.e. so I don't overrun the max value for the data type.
Update
Per @PetSerAl's comment above, here's an adapted version in pure PowerShell:
$run = $true; & { for ($i = 0; $run; $i++) { $i } } | % { # infinite loop outputting to pipeline demo
    "hello $_"
    if ($_ -eq 10) { "stop"; $run = $false <# stopping condition demo #> }
}
Related
I have a function that flattens directories in parallel for multiple folders. It works great when I call it in a non-pipeline fashion:
$Files = Get-Content $FileList
Merge-FlattenDirectory -InputPath $Files
But now I want to update my function to work both on the pipeline as well as when called off the pipeline. Someone on Discord recommended that the best way to do this is to defer all processing to the end block, and use the begin and process blocks to add pipeline input to a list. Basically this:
function Merge-FlattenDirectory {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory, Position = 0, ValueFromPipeline)]
        [string[]]
        $InputPath
    )
    begin {
        $List = [System.Collections.Generic.List[PSObject]]@()
    }
    process {
        if (($InputPath.GetType().BaseType.Name) -eq "Array") {
            Write-Host "Array detected"
            $List = $InputPath
        } else {
            $List.Add($InputPath)
        }
    }
    end {
        $List | ForEach-Object -Parallel {
            # Code here...
        } -ThrottleLimit 16
    }
}
However, this is still not working on the pipeline for me. When I do this:
$Files | Merge-FlattenDirectory
It actually passes individual arrays of length 1 to the function. So testing for ($InputPath.GetType().BaseType.Name) -eq "Array" isn't really the way forward, as only the first pipeline value gets used.
My million dollar question is the following:
What is the most robust way in the process block to differentiate between pipeline input and non-pipeline input? The function should add all pipeline input to a generic list, and in the case of non-pipeline input, should skip this step and process the collection as-is moving directly to the end block.
The only thing I could think of is the following:
if ((($InputPath.GetType().BaseType.Name) -eq "Array") -and ($InputPath.Length -gt 1)) {
    $List = $InputPath
} else {
    $List.Add($InputPath)
}
But this just doesn't feel right. Any help would be extremely appreciated.
You might just do
function Merge-FlattenDirectory {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory, Position = 0, ValueFromPipeline)]
        [string[]]
        $InputPath
    )
    begin {
        $List = [System.Collections.Generic.List[String]]::new()
    }
    process {
        $InputPath.ForEach{ $List.Add($_) }
    }
    end {
        $List | ForEach-Object -Parallel {
            # Code here...
        } -ThrottleLimit 16
    }
}
Which will process the input values either from the pipeline or the input parameter.
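For example, both invocation styles end up with the same list (the paths are hypothetical):
# Pipeline input: the process block runs once per item.
'C:\Temp\A', 'C:\Temp\B' | Merge-FlattenDirectory
# Parameter input: the process block runs once, with $InputPath holding both items.
Merge-FlattenDirectory -InputPath 'C:\Temp\A', 'C:\Temp\B'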
But that doesn't comply with the Strongly Encouraged Development Guidelines to Support Well Defined Pipeline Input (SC02), especially the guidance to Implement for the Middle of a Pipeline.
This means that if you want to implement the PowerShell pipeline correctly, you should process your items (in parallel) directly in the process block and immediately output any results from there:
function Merge-FlattenDirectory {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory, Position = 0, ValueFromPipeline)]
        [string[]]
        $InputPath
    )
    begin {
        # Note: New-ThreadPool and the -ThreadPool parameter below are conceptual
        # placeholders, not built-in commands/parameters; the idea being sketched
        # is sharing one pool across all process-block invocations.
        $SharedPool = New-ThreadPool -Limit 16
    }
    process {
        $InputPath | ForEach-Object -Parallel -ThreadPool $Using:SharedPool {
            # Process your current item ($_) here ...
        }
    }
}
In general, script authors are advised to use idiomatic PowerShell, which often comes down to fewer object manipulations and usually results in a correct PowerShell pipeline implementation with lower memory usage.
Please let me know if you intend to collect (and e.g. order) the output based on this suggestion.
Caveat
The full invocation of the ForEach-Object -Parallel cmdlet itself is somewhat inefficient, as you open and close a new pipeline on each iteration. Resolving this makes my general statement about idiomatic PowerShell fall apart a bit, but it can be done by using a steppable pipeline.
To implement this, you might use the ForEach-Object cmdlet as a template:
[System.Management.Automation.ProxyCommand]::Create((Get-Command ForEach-Object))
And set the ThrottleLimit of the thread pool in the begin block:
function Merge-FlattenDirectory {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory, Position = 0, ValueFromPipeline)]
        [string[]]
        $InputPath
    )
    begin {
        $PSBoundParameters += @{
            ThrottleLimit = 4
            Parallel      = {
                Write-Host (Get-Date).ToString('HH:mm:ss.s') 'Started' $_
                Start-Sleep -Seconds 3
                Write-Host (Get-Date).ToString('HH:mm:ss.s') 'finished' $_
            }
        }
        $wrappedCmd = $ExecutionContext.InvokeCommand.GetCommand('ForEach-Object', [System.Management.Automation.CommandTypes]::Cmdlet)
        $scriptCmd = { & $wrappedCmd @PSBoundParameters }
        $steppablePipeline = $scriptCmd.GetSteppablePipeline($myInvocation.CommandOrigin)
        $steppablePipeline.Begin($PSCmdlet)
    }
    process {
        $InputPath.ForEach{ $steppablePipeline.Process($_) }
    }
    end {
        $steppablePipeline.End()
    }
}
1..5 |Merge-FlattenDirectory
17:57:40.40 Started 3
17:57:40.40 Started 2
17:57:40.40 Started 1
17:57:40.40 Started 4
17:57:43.43 finished 3
17:57:43.43 finished 1
17:57:43.43 finished 4
17:57:43.43 finished 2
17:57:43.43 Started 5
17:57:46.46 finished 5
Here's how I would write it, with comments where I have changed it.
function Merge-FlattenDirectory {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory, Position = 0, ValueFromPipeline)]
        $InputPath  # this may be a string, a path object, a file object, or an array
    )
    begin {
        $List = @() # Use an array for less than 100K objects.
    }
    process {
        # Even if InputPath is a string, foreach will iterate once and set $p.
        # If it is an array of strings, add each one. If it is one or more objects,
        # try to find the right property for the path.
        foreach ($p in $inputPath) {
            if ($p -is [String]) { $list += $p }
            elseif ($p.Path)     { $list += $p.Path }
            elseif ($p.FullName) { $list += $p.FullName }
            elseif ($p.PSPath)   { $list += $p.PSPath }
            else                 { Write-Warning "$P makes no sense" }
        }
    }
    end {
        $List | ForEach-Object -Parallel {
            # Code here...
        } -ThrottleLimit 16
    }
}
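Because the process block probes Path, FullName, and PSPath, mixed pipeline input should work too; a sketch with hypothetical paths:
# Strings and FileSystemInfo objects can be mixed on the pipeline:
'C:\Temp\A', (Get-Item 'C:\Windows') | Merge-FlattenDirectory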
@iRon: that "write for the middle of the pipeline" guidance in the docs does not mean write everything in the process block.
function one { @(1,2,3,4,5) }
function two {
    param ([parameter(ValueFromPipeline=$true)] $p)
    begin   { Write-Host "Two begins"; $a = @() }
    process { Write-Host "Two received $P"; $a += $p }
    end     { Write-Host "Two ending"; $a; Write-Host "Two ended" }
}
function three {
    param ([parameter(ValueFromPipeline=$true)] $p)
    begin   { Write-Host "three Starts" }
    process { Write-Host "Three received $P" }
    end     { Write-Host "Three ended" }
}
one | two | three
One is treated as an end block.
One, two and three all run their begins (one's is empty).
One's output goes to the process block in two, which just collects the data. Two's end block starts after one's end block finishes, and sends its output; at this point three's process block receives input. After two's end block ends, three's end block runs.
Two is "in the middle": it has a process block to deal with multiple piped items (if it were all one end block, it would only process the last one).
I'm solving a code kata. Here is the solution I made first:
using namespace System.Linq
$DebugPreference = "Continue"
class kata {
    static [string] FirstNonRepeatingLetter ([String]$string_) {
        [Enumerable]::GroupBy([char[]]$string_, [func[char, char]]{ $args[0] }).ForEach{
            if ($_.Count -eq 1) {
                Write-Debug ($_.Key)
                return $_.Key
            }
        }
        return "~~~"
    }
}
Write-Debug ([kata]::FirstNonRepeatingLetter("stress"))
It returns:
DEBUG: t
DEBUG: r
DEBUG: e
DEBUG: ~~~
This is not what I expected, so I tried this:
using namespace System.Linq
$DebugPreference = "Continue"
class kata {
    static [string] FirstNonRepeatingLetter ([String]$string_) {
        $groups = [Enumerable]::GroupBy([char[]]$string_, [func[char, char]]{ $args[0] })
        foreach ($group in $groups) {
            if ($group.Count -eq 1) {
                return $group.Key
            }
        }
        return "~~~"
    }
}
Write-Debug ([kata]::FirstNonRepeatingLetter("stress"))
and then I got what I wanted to see:
DEBUG: t
Now I don't understand why these two versions behave differently.
You're executing return inside a script block ({ ... }) you've passed to the .ForEach() array method:
In script blocks passed as arguments (as opposed to blocks that are integral to statements such as if and foreach), return exits only the block itself.
Therefore, your .ForEach() enumeration continues after each return; that is, while execution inside the script block is terminated, execution continues with the next .ForEach() iteration.
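A minimal demonstration of that behavior, independent of the kata:
(1..3).ForEach{ if ($_ -eq 2) { return }; "saw $_" }  # -> saw 1, saw 3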
There actually is no direct way to stop the enumeration of .ForEach() (by contrast, the related .Where() array method has an optional mode argument, 'First', that stops enumerating after the first match).
You have two options:
Set a Boolean flag inside your script block to indicate whether a result has been found, and exit the script block right away if the flag is set (see the flag-based sketch after the dummy-loop implementation below).
The obvious downside to this is that the enumeration always runs to completion.
Use a dummy loop so that you can repurpose a break statement to break out of the loop and thereby also out of the .ForEach() enumeration.
Note: Without the dummy loop, break would be looking up the entire call stack for a loop to break out of; if it finds none, execution would terminate overall - see this answer for more information.
Either way, you need a return statement outside the script block in order to return a value from your class method.
Here's how to implement the dummy-loop approach:
using namespace System.Linq
$DebugPreference = 'Continue'
class kata {
    static [string] FirstNonRepeatingLetter ([String]$string_) {
        $result = '~~~'
        do { # dummy loop
            [Enumerable]::GroupBy([char[]]$string_, [func[char, char]]{ $args[0] }).ForEach{
                if ($_.Count -eq 1) {
                    Write-Debug ($_.Key)
                    $result = $_.Key
                    break # break out of the dummy loop
                }
            }
        } while ($false)
        return $result
    }
}
Write-Debug ([kata]::FirstNonRepeatingLetter('stress'))
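For completeness, here is the flag-based approach from option 1 as a sketch (kata2 is just a name chosen to avoid clashing with the class above; note that the enumeration still runs to completion):
using namespace System.Linq
class kata2 {
    static [string] FirstNonRepeatingLetter ([String]$string_) {
        $found = $false
        $result = '~~~'
        [Enumerable]::GroupBy([char[]]$string_, [func[char, char]]{ $args[0] }).ForEach{
            if (-not $found -and $_.Count -eq 1) {
                $result = $_.Key
                $found = $true # subsequent iterations now do nothing
            }
        }
        return $result
    }
}
[kata2]::FirstNonRepeatingLetter('stress') # -> t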
If you can live with doing that without a class, using a normal function instead, this should work:
function Get-FirstNonRepeatingCharacter ([string]$string) {
    ($string.ToCharArray() | Group-Object | Where-Object { $_.Count -eq 1 } | Select-Object -First 1).Name
}
Get-FirstNonRepeatingCharacter "sTress" # --> T
Here's a simple function that uses ValidateSet:
function TestLongValidateSet
{
    param
    (
        [ValidateSet(...)]
        $abc
    )
    $abc
}
My version has 3001 items in place of the `...`.
If you'd like to follow along at home, here's a way to generate a 3001 element list suitable for placing in there:
(0..3000 | foreach { (Get-Random -Count 30 (65..90 | ForEach-Object { [char]$_ })) -join '' } | ForEach-Object { "`"$_`"" }) -join ', ' | Out-File test.txt
Anyway, the above function loads into PowerShell just fine. However, the first attempt at using IntelliSense with it triggers a multi-minute delay. PowerShell ISE also proceeds to consume a couple of gigabytes of RAM. After this delay, the RAM usage drops back to normal, IntelliSense works, and everything's responsive. Even the completion on the $abc variable is responsive.
Is there any way to avoid the long initial delay?
Try this. It creates a custom enum type and uses that instead of ValidateSet:
0..3000 | foreach { (Get-Random -Count 30 (65..90 | ForEach-Object { [char]$_ })) -join '' } | sv enumarray
$i=0
$enumlist = ($enumarray | foreach {'{0} = {1}' -f $_,$i++}) -join ', '
$enum = "
namespace myspace
{
public enum myenum
{
$enumlist
}
}
"
Add-Type -TypeDefinition $enum -Language CSharpVersion3
You can put that inside the function, but on my system, creating the enum takes about 200ms. If you're going to run this inside a loop, I'd create it in the parent scope so the function doesn't have to do it every time it runs.
function TestLongValidateSet
{
    param
    (
        [myspace.myenum]$abc
    )
    $abc
}
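A quick sanity check of the generated type and the parameter, assuming the list-generation snippet above has been run:
[enum]::GetNames([myspace.myenum]).Count # -> 3001
TestLongValidateSet -abc ([enum]::GetNames([myspace.myenum])[0])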
In PowerShell v2, the following line:
1..3| foreach { Write-Host "Value : $_"; $_ }| select -First 1
Would display:
Value : 1
1
Value : 2
Value : 3
Since all elements were pushed down the pipeline. However, in v3 the above line displays only:
Value : 1
1
The pipeline is stopped before 2 and 3 are sent to Foreach-Object (Note: the -Wait switch for Select-Object allows all elements to reach the foreach block).
How does Select-Object stop the pipeline, and can I now stop the pipeline from a foreach or from my own function?
Edit: I know I can wrap a pipeline in a do...while loop and continue out of the pipeline. I have also found that in v3 I can do something like this (it doesn't work in v2):
function Start-Enumerate ($array) {
do{ $array } while($false)
}
Start-Enumerate (1..3)| foreach {if($_ -ge 2){break};$_}; 'V2 Will Not Get Here'
But Select-Object doesn't require either of these techniques so I was hoping that there was a way to stop the pipeline from a single point in the pipeline.
Check this post on how you can cancel a pipeline:
http://powershell.com/cs/blogs/tobias/archive/2010/01/01/cancelling-a-pipeline.aspx
In PowerShell 3.0 it's an engine improvement. From the CTP1 samples folder ('\Engines Demos\Misc\ConnectBugFixes.ps1'):
# Connect Bug 332685
# Select-Object optimization
# Submitted by Shay Levi
# Connect Suggestion 286219
# PSV2: Lazy pipeline - ability for cmdlets to say "NO MORE"
# Submitted by Karl Prosser
# Stop the pipeline once the objects have been selected
# Useful for commands that return a lot of objects, like dealing with the event log
# In PS 2.0, this took a long time even though we only wanted the first 10 events
Start-Process powershell.exe -Args '-Version 2 -NoExit -Command Get-WinEvent | Select-Object -First 10'
# In PS 3.0, the pipeline stops after retrieving the first 10 objects
Get-WinEvent | Select-Object -First 10
After trying several methods - including throwing StopUpstreamCommandsException, ActionPreferenceStopException, and PipelineClosedException, calling $PSCmdlet.ThrowTerminatingError, and $ExecutionContext.Host.Runspace.GetCurrentlyRunningPipeline().stopper.set_IsStopping($true) - I finally found that just utilizing Select-Object was the only thing that didn't abort the whole script (versus just the pipeline). [Note that some of the items mentioned above require access to private members, which I accessed via reflection.]
# This looks like it should put a zero in the pipeline, but on PS 3.0 it doesn't
function stop-pipeline {
    $sp = { select-object -f 1 }.GetSteppablePipeline($MyInvocation.CommandOrigin)
    $sp.Begin($true)
    $x = $sp.Process(0) # this call doesn't return
    $sp.End()
}
A new method follows, based on a comment from the OP. Unfortunately this method is a lot more complicated and uses private members. Also, I don't know how robust this is - I just got the OP's example to work and stopped there. So FWIW:
# wh is alias for write-host
# sel is alias for select-object
# The following two use reflection to access private members:
# invoke-method invokes private methods
# select-properties is similar to select-object, but it gets private properties
# Get the system.management.automation assembly
$smaa=[appdomain]::currentdomain.getassemblies()|
? location -like "*system.management.automation*"
# Get the StopUpstreamCommandsException class
$upcet=$smaa.gettypes()| ? name -like "*upstream*"
filter x {
    [CmdletBinding()]
    param(
        [parameter(ValueFromPipeline=$true)]
        [object] $inputObject
    )
    process {
        if ($inputObject -ge 5) {
            # Create a StopUpstreamCommandsException
            $upce = [activator]::CreateInstance($upcet, @($pscmdlet))
            $PipelineProcessor = $pscmdlet.CommandRuntime | select-properties PipelineProcessor
            $commands = $PipelineProcessor | select-properties commands
            $commandProcessor = $commands[0]
            $null = $upce.RequestingCommandProcessor | select-properties *
            $upce.RequestingCommandProcessor.commandinfo =
                $commandProcessor | select-properties commandinfo
            $upce.RequestingCommandProcessor.Commandruntime =
                $commandProcessor | select-properties commandruntime
            $null = $PipelineProcessor |
                invoke-method recordfailure @($upce, $commandProcessor.command)
            1..($commands.count-1) | % {
                $commands[$_] | invoke-method DoComplete
            }
            wh throwing
            throw $upce
        }
        wh "< $inputObject >"
        $inputObject
    } # end process
    end {
        wh in x end
    }
} # end filter x
filter y {
    [CmdletBinding()]
    param(
        [parameter(ValueFromPipeline=$true)]
        [object] $inputObject
    )
    process {
        $inputObject
    }
    end {
        wh in y end
    }
}
1..5| x | y | measure -Sum
PowerShell code to retrieve PipelineProcessor value through reflection:
$t_cmdRun = $pscmdlet.CommandRuntime.gettype()
# Get pipelineprocessor value ($pipor)
$bindFlags = [Reflection.BindingFlags]"NonPublic,Instance"
$piporProp = $t_cmdRun.getproperty("PipelineProcessor", $bindFlags )
$pipor=$piporProp.GetValue($PSCmdlet.CommandRuntime,$null)
PowerShell code to invoke a method through reflection:
$proc = (gps)[12] # semi-random process
$methinfo = $proc.gettype().getmethod("GetComIUnknown", $bindFlags)
# Return ComIUnknown as an IntPtr
$comIUnknown = $methinfo.Invoke($proc, @($true))
I know that throwing a PipelineStoppedException stops the pipeline. The following example simulates, in v2.0, what you see with Select -First 1 in v3.0:
filter Select-Improved($first) {
    begin {
        $count = 0
    }
    process {
        $_
        $count++
        if ($count -ge $first) { throw (new-object System.Management.Automation.PipelineStoppedException) }
    }
}
trap { continue }
1..3 | foreach { Write-Host "Value : $_"; $_ } | Select-Improved -first 1
write-host "after"
I have written a simple PowerShell filter that pushes the current object down the pipeline if its date is between the specified begin and end date. The objects coming down the pipeline are always in ascending date order, so as soon as the date exceeds the specified end date I know my work is done, and I would like to tell the pipeline that the upstream commands can abandon their work so that the pipeline can finish. I am reading some very large log files, and I will frequently want to examine just a portion of the log. I am pretty sure this is not possible, but I wanted to ask to be sure.
It is possible to break a pipeline with anything that would otherwise break an outside loop or halt script execution altogether (like throwing an exception). The solution then is to wrap the pipeline in a loop that you can break if you need to stop the pipeline. For example, the below code will return the first item from the pipeline and then break the pipeline by breaking the outside do-while loop:
do {
    Get-ChildItem | % { $_; break }
} while ($false)
This functionality can be wrapped into a function like this, where the last line accomplishes the same thing as above:
function Breakable-Pipeline([ScriptBlock]$ScriptBlock) {
    do {
        . $ScriptBlock
    } while ($false)
}
Breakable-Pipeline { Get-ChildItem|% { $_;break } }
It is not possible to stop an upstream command from a downstream command; the downstream command will continue to filter out objects that do not match your criteria, but the first command will process everything it was set to process.
The workaround is to do more filtering in the upstream cmdlet or function/filter. Working with log files makes it a bit more complicated, but perhaps using Select-String and a regular expression to filter out the undesired dates might work for you.
Unless you know how many lines you want to take and from where, the whole file will be read to check for the pattern.
You can throw an exception when ending the pipeline.
gc demo.txt -ReadCount 1 | %{$num=0}{$num++; if($num -eq 5){throw "terminated pipeline!"}else{write-host $_}}
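To keep the rest of the script running, you can also catch the exception yourself; a sketch, with demo.txt standing in for any input file:
try {
    gc demo.txt -ReadCount 1 | %{$num=0}{$num++; if($num -eq 5){throw "terminated pipeline!"}else{write-host $_}}
} catch {
    # Swallow the terminating exception so execution continues below.
}
write-host "after the pipeline"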
or
Look at this post about how to terminate a pipeline: https://web.archive.org/web/20160829015320/http://powershell.com/cs/blogs/tobias/archive/2010/01/01/cancelling-a-pipeline.aspx
Not sure about your exact needs, but it may be worth your time to look at Log Parser to see if you can't use a query to filter the data before it even hits the pipe.
If you're willing to use non-public members, here is a way to stop the pipeline. It mimics what select-object does. invoke-method (alias im) is a function to invoke non-public methods. select-property (alias selp) is a function to select non-public properties (similar to select-object); however, it automatically acts like -ExpandProperty if only one matching property is found. (I wrote select-property and invoke-method at work, so I can't share the source code of those.)
# Get the system.management.automation assembly
$script:smaa = [appdomain]::currentdomain.getassemblies() |
    ? location -like "*system.management.automation*"
# Get the StopUpstreamCommandsException class
$script:upcet = $smaa.gettypes() | ? name -like "*StopUpstreamCommandsException*"
function stop-pipeline {
    # Create a StopUpstreamCommandsException
    $upce = [activator]::CreateInstance($upcet, @($pscmdlet))
    $PipelineProcessor = $pscmdlet.CommandRuntime | select-property PipelineProcessor
    $commands = $PipelineProcessor | select-property commands
    $commandProcessor = $commands[0]
    $ci = $commandProcessor | select-property commandinfo
    $upce.RequestingCommandProcessor | im set_commandinfo @($ci)
    $cr = $commandProcessor | select-property commandruntime
    $upce.RequestingCommandProcessor | im set_commandruntime @($cr)
    $null = $PipelineProcessor |
        invoke-method recordfailure @($upce, $commandProcessor.command)
    if ($commands.count -gt 1) {
        $doCompletes = @()
        1..($commands.count-1) | % {
            write-debug "Stop-pipeline: added DoComplete for $($commands[$_])"
            $doCompletes += $commands[$_] | invoke-method DoComplete -returnClosure
        }
        foreach ($DoComplete in $doCompletes) {
            $null = & $DoComplete
        }
    }
    throw $upce
}
EDIT: per mklement0's comment:
Here is a link to the Nivot Ink blog post on the "poke" module, which similarly gives access to non-public members.
As far as additional comments, I don't have meaningful ones at this point. This code just mimics what a decompilation of select-object reveals. The original MS comments (if any) are of course not in the decompilation. Frankly I don't know the purpose of the various types the function uses. Getting that level of understanding would likely require a considerable amount of effort.
My suggestion: get Oisin's poke module. Tweak the code to use that module. And then try it out. If you like the way it works, then use it and don't worry how it works (that's what I did).
Note: I haven't studied "poke" in any depth, but my guess is that it doesn't have anything like -returnClosure. However, adding that should be as easy as this:
if (-not $returnClosure) {
    $methodInfo.Invoke($arguments)
} else {
    { $methodInfo.Invoke($arguments) }.GetNewClosure()
}
Here's an - imperfect - implementation of a Stop-Pipeline cmdlet (requires PS v3+), gratefully adapted from this answer:
#requires -version 3
Filter Stop-Pipeline {
    $sp = { Select-Object -First 1 }.GetSteppablePipeline($MyInvocation.CommandOrigin)
    $sp.Begin($true)
    $sp.Process(0)
}
# Example
1..5 | % { if ($_ -gt 2) { Stop-Pipeline }; $_ } # -> 1, 2
Caveat: I don't fully understand how it works, though fundamentally it takes advantage of Select -First's ability to stop the pipeline prematurely (PS v3+). However, in this case there is one crucial difference from how Select -First terminates the pipeline: downstream cmdlets (commands later in the pipeline) do not get a chance to run their end blocks.
Therefore, aggregating cmdlets (those that must receive all input before producing output, such as Sort-Object, Group-Object, and Measure-Object) will not produce output if placed later in the same pipeline; e.g.:
# !! NO output, because Sort-Object never finishes.
1..5 | % { if ($_ -gt 2) { Stop-Pipeline }; $_ } | Sort-Object
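If you do need such downstream aggregation, let Select-Object itself do the stopping; it terminates the upstream commands while still letting later segments run their end blocks:
1..5 | Select-Object -First 2 | Sort-Object # -> 1, 2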
Background info that may lead to a better solution:
Thanks to PetSerAl, my answer here shows how to produce the same exception that Select-Object -First uses internally to stop upstream cmdlets.
However, there the exception is thrown from inside the cmdlet that is itself connected to the pipeline to stop, which is not the case here:
Stop-Pipeline, as used in the examples above, is not connected to the pipeline that should be stopped (only the enclosing ForEach-Object (%) block is), so the question is: How can the exception be thrown in the context of the target pipeline?
Try these filters. They force the pipeline to stop after the first object (or the first n elements) and store it (or them) in a variable; you need to pass the name of the variable - if you don't, the object(s) are pushed down the pipeline but cannot be assigned to a variable.
filter FirstObject ([string]$vName = '') {
    if ($vName) { sv $vName $_ -s 1 } else { $_ }
    break
}
filter FirstElements ([int]$max = 2, [string]$vName = '') {
    if ($max -le 0) { break } else { $_arr += ,$_ }
    if (!--$max) {
        if ($vName) { sv $vName $_arr -s 1 } else { $_arr }
        break
    }
}
# can't assign to a variable directly
$myLog = get-eventLog security | ... | firstObject
# pass the varName
get-eventLog security | ... | firstObject myLog
$myLog
# can't assign to a variable directly
$myLogs = get-eventLog security | ... | firstElements 3
# pass the number of elements and the varName
get-eventLog security | ... | firstElements 3 myLogs
$myLogs
####################################
get-eventLog security | % {
    if ($_.timegenerated -lt (date 11.09.08) -and`
        $_.timegenerated -gt (date 11.01.08)) { $log1 = $_; break }
}
#
$log1
Another option would be to use the -file parameter on a switch statement. Using -file will read the file one line at a time, and you can use break to exit immediately without reading the rest of the file.
switch -file $someFile {
    # Parse current line for later matches.
    { $script:line = [DateTime]$_ } { }
    # If less than min date, keep looking.
    { $line -lt $minDate } { Write-Host "skipping: $line"; continue }
    # If greater than max date, stop checking.
    { $line -gt $maxDate } { Write-Host "stopping: $line"; break }
    # Otherwise, date is between min and max.
    default { Write-Host "match: $line" }
}