Is it possible to set parameter positions for default cmdlets? - PowerShell

For example, I would like the Select-Object cmdlet to interpret Get-ChildItem | Select-Object 1 as Get-ChildItem | Select-Object -First 1.
I have been getting by with "wrappers" like this for my most common cmdlets:
function Select-Object{
[CMDletbinding()]
param (
[Parameter(ValueFromPipeline)]
$InputObject,
[Parameter(Position = 1)]
[int]$First,
[Parameter(Position = 2)]
[int]$Last
)
$input | Select-Object -First $First -Last $Last
}
But these are sometimes buggy, and I always end up rewriting them to add more parameters.
I have been reading the docs and have not found anything other than parameter splicing.
It does not have to be an official solution, so if anyone has come up with a way to do this, I would like to know.
PS: I know this can lead to confusing code, but I intend to use it only in interactive/terminal sessions. Typing gci | sel 1 is infinitely preferable to Get-ChildItem | Select-Object -First 1.
Any help would be greatly appreciated!

For wrappers like this you should use a proxy command. Most of the code below can be auto-generated for you via the ProxyCommand class's .Create(..) method:
[System.Management.Automation.ProxyCommand]::Create((Get-Command Select-Object))
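If you want to inspect or adapt the generated proxy before trimming it down, you can write it out to a file first (the file name here is just an example):
[System.Management.Automation.ProxyCommand]::Create((Get-Command Select-Object)) | Set-Content .\SelectObjectProxy.ps1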
Using only the two parameters you're interested in, -First and -Last, in addition to the pipeline parameter, the function would look like this:
function sel {
[CmdletBinding()]
param(
[Parameter(ValueFromPipeline)]
[object] $InputObject,
[Parameter(Position = 1)]
[int] $First,
[Parameter(Position = 2)]
[int] $Last
)
begin {
$wrapper = { Microsoft.PowerShell.Utility\Select-Object @PSBoundParameters }
$pipeline = $wrapper.GetSteppablePipeline($MyInvocation.CommandOrigin)
$pipeline.Begin($PSCmdlet)
}
process {
$pipeline.Process($InputObject)
}
end {
$pipeline.End()
}
}
Now you can use positional binding without problems:
0..10 | sel 1 # for `-First 1`
0..10 | sel 1 1 # for `-First 1` `-Last 1`
As for "parameter splicing", you might be referring to what's known as splatting in PowerShell. For that, see about_Splatting. As an example, the code above uses splatting with the automatic variable $PSBoundParameters.
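For reference, here is a minimal standalone splatting example (the values are arbitrary); the hashtable keys are matched to parameter names, not positions:
$params = @{ First = 2; Last = 1 }
0..10 | Microsoft.PowerShell.Utility\Select-Object @params
# Equivalent to: 0..10 | Select-Object -First 2 -Last 1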

Related

Select-Object has different behavior using piping vs. -InputObject

Can someone shed light on why example 2 does not have the same result as example 1? I would think that $a and $b2 should be the same, but $b2 is null! I am writing a script where using the method in example 2 is preferable.
example 1:
$a = Get-Content $some_text_file | Select-Object -Skip 1
example 2:
$b1 = Get-Content $some_text_file
$b2 = Select-Object -InputObject $b1 -Skip 1
edit: using this syntax gets me where I need to be.
$b1 = Get-Content $file
$b2 = $b1 | Select-Object -Skip 1
As Lee_Dailey notes, this is expected behavior.
can someone shed light on why
This has to do with how cmdlets execute in the pipeline.
As you might know, the core functionality of a cmdlet is made up of three methods:
BeginProcessing()
ProcessRecord()
EndProcessing()
(The begin/process/end blocks in an advanced function correspond to these.)
BeginProcessing() and EndProcessing() are always executed exactly once. How many times ProcessRecord() executes depends on whether the cmdlet is the first command in the pipeline or not.
When a cmdlet occurs as the first element in a pipeline (i.e. there is no | sign to the left of it), ProcessRecord() executes once.
When a cmdlet receives input from an upstream command in its pipeline, however, ProcessRecord() is run once for each input item coming in through the pipeline.
With this in mind, please consider this simplified version of Select-Object:
function Select-FakeObject {
param(
[Parameter(Mandatory, ValueFromPipeline)]
[object[]]$InputObject,
[Parameter()]
[int]$Skip = 0
)
begin {
$skipped = 0
}
process {
if($skipped -lt $Skip){
$skipped++
Write-Host "Skipping $InputObject"
}
else{
Write-Host "Selecting $InputObject"
Write-Output $InputObject
}
}
}
Now, let's try both scenarios with this dummy function:
PS C:\> $a = 1,2,3
PS C:\> $b1 = $a |Select-FakeObject -Skip 1
We'll see that PowerShell indeed calls the process block once per input item:
Skipping 1
Selecting 2
Selecting 3
whereas if we pass the object like in your second example:
PS C:\> $a = 1,2,3
PS C:\> $b2 = Select-FakeObject -InputObject $a -Skip 1
Skipping 1 2 3
We now see that the process block only executes once, over all of $a rather than the individual items.
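A quick way to see the consequence of that single process-block call (illustrative):
PS C:\> $r = Select-Object -InputObject (1,2,3) -First 1
PS C:\> $r.Count
3
The "first object" is the entire array, because it arrived as one item; piping the array instead yields 1, 2, 3 as separate items, so -First 1 returns just 1.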

How to loop through arrays in hash table - passing parameters based on values read from a CSV file

Curious about how to loop through a hash table where each value is an array. Example:
$test = @{
a = "a","1";
b = "b","2";
c = "c","3";
}
Then I would like to do something like:
foreach ($T in $test) {
write-output $T
}
Expected result would be something like:
name value
a a
b b
c c
a 1
b 2
c 3
That's not what currently happens, and my use case is basically to pass a hash of parameters to a function in a loop. My approach might be all wrong, but I figured I would ask and see if anyone has tried to do this.
Edit:
A bit more clarification. What I'm basically trying to do is pass a lot of array values into a function and loop through those in the hash table prior to passing to a nested function. Example:
First something like:
$parameters = import-csv .\NewComputers.csv
Then something like
$parameters | New-LabVM
Lab VM Code below:
function New-LabVM
{
[CmdletBinding()]
Param (
# Param1 help description
[Parameter(Mandatory=$true,
Position=0,
ValueFromPipeline=$true,
ValueFromPipelineByPropertyName=$true)]
[Alias("p1")]
[string[]]$ServerName,
# Param2 help description
[Parameter(Position = 1)]
[int[]]$RAM = 2GB,
# Param3 help description
[Parameter(Position=2)]
[int[]]$ServerHardDriveSize = 40gb,
# Parameter help description
[Parameter(Position=3)]
[int[]]$VMRootPath = "D:\VirtualMachines",
[Parameter(Position=4)]
[int[]]$NetworkSwitch = "VM Switch 1",
[Parameter(Position=4)]
[int[]]$ISO = "D:\ISO\Win2k12.ISO"
)
process
{
New-Item -Path $VMRootPath\$ServerName -ItemType Directory
$Arguments = @{
Name = $ServerName;
MemoryStartupBytes = $RAM;
NewVHDPath = "$VMRootPath\$ServerName\$ServerName.vhdx";
NewVHDSizeBytes = $ServerHardDriveSize
SwitchName = $NetworkSwitch;}
foreach ($Argument in $Arguments){
# Create Virtual Machines
New-VM @Arguments
# Configure Virtual Machines
Set-VMDvdDrive -VMName $ServerName -Path $ISO
Start-VM $ServerName
}
# Create Virtual Machines
New-VM @Arguments
}
}
What you're looking for is parameter splatting.
The most robust way to do that is via hashtables, so you must convert the custom-object instances output by Import-Csv to hashtables:
Import-Csv .\NewComputers.csv | ForEach-Object {
# Convert the custom object at hand to a hashtable.
$htParams = @{}
$_.psobject.properties | ForEach-Object { $htParams[$_.Name] = $_.Value }
# Pass the hashtable via splatting (@) to the target function.
New-LabVM @htParams
}
Note that since parameter binding via splatting is key-based (the hashtable keys are matched against the parameter names), it is fine to use a regular hashtable with its unpredictable key ordering (no need for an ordered hashtable, [ordered] @{ ... }, in this case).
Try this:
for($i=0;$i -lt $test.Count; $i++)
{$test.keys | %{write-host $test.$_[$i]}}
Weirdly, it outputs everything in the wrong order (a plain hashtable does not preserve insertion order, so $test.keys can come back in any order).
EDIT: Here's your solution.
Using the [System.Collections.Specialized.OrderedDictionary] type, you guarantee that the output will come out in the same order as you entered it.
$test = [ordered] @{
a = "a","1";
b = "b","2";
c = "c","3";
}
After running the same solution code as before, you get exactly the output you wanted.
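Putting the pieces together (note: the inner loop bound here is 2, the length of each value array, rather than $test.Count, which is the number of keys):
$test = [ordered] @{
a = "a","1";
b = "b","2";
c = "c","3";
}
for($i=0; $i -lt 2; $i++)
{$test.keys | %{write-host $test[$_][$i]}}
# Output, one value per line: a, b, c, 1, 2, 3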

How to write a streaming function in PowerShell

I tried to create a function that emulates Linux's head:
Function head( )
{
[CmdletBinding()]
param (
[parameter(mandatory=$false, ValueFromPipeline=$true)] [Object[]] $inputs,
[parameter(position=0, mandatory=$false)] [String] $liness = "10",
[parameter(position=1, ValueFromRemainingArguments=$true)] [String[]] $filess
)
$lines = 0
if (![int]::TryParse($liness, [ref]$lines)) {
$lines = 10
$filess = ,$liness + (@{$true=@();$false=$filess}[$null -eq $filess])
}
$read = 0
$input | select-object -First $lines
if ($filess) {
get-content -TotalCount $lines $filess
}
}
The problem is that this will actually read all the content (whether from $filess or from $input) and then print the first lines, whereas I'd want head to read the first lines and forget about the rest so it can work with large files.
How can this function be rewritten?
Well, as far as I know, you are overdoing it slightly...
"Beginning in Windows PowerShell 3.0, Select-Object includes an optimization feature that prevents commands from creating and processing objects that are not used. When you include a Select-Object command with the First or Index parameter in a command pipeline, Windows PowerShell stops the command that generates the objects as soon as the selected number of objects is generated, even when the command that generates the objects appears before the Select-Object command in the pipeline. To turn off this optimizing behavior, use the Wait parameter."
So all you need to do is:
Get-Content -Path somefile | Select-Object -First 10 #or pass a variable
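You can see the optimization at work with a cheap experiment (PowerShell 3.0 or later); the upstream command is stopped as soon as the first three items have been selected:
1..1000000 | ForEach-Object { Write-Host "generating $_"; $_ } | Select-Object -First 3
# Prints "generating 1/2/3" interleaved with 1, 2, 3, then stops;
# the remaining 999,997 items are never generated.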

Does PowerShell have an Aggregate/Reduce function?

I realize there are related questions, but all the answers seem to be work-arounds that avoid the heart of the matter. Does PowerShell have an operation that can use a scriptblock to aggregate elements of an array into a single value? This is what is known in other languages as aggregate, reduce, or fold.
I can write it myself pretty easily, but given that it's the base operation of any list processing, I would assume there's something built in that I just don't know about.
So what I'm looking for is something like this
1..10 | Aggregate-Array {param($memo, $x); $memo * $x}
There isn't anything so obviously named as Reduce-Object, but you can achieve your goal with ForEach-Object:
1..10 | Foreach {$total=1} {$total *= $_} {$total}
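The same three-scriptblock (begin/process/end) pattern works for any seed and operation; for example, a running sum:
1..10 | Foreach {$total=0} {$total += $_} {$total}
# 55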
BTW there also isn't a Join-Object to merge two sequences of data based on some matching property.
If you need Maximum, Minimum, Sum or Average you can use Measure-Object; sadly it doesn't handle any other aggregate method.
Get-ChildItem | Measure-Object -Property Length -Minimum -Maximum -Average
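For instance, summing a range:
(1..10 | Measure-Object -Sum).Sum
# 55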
This is something I had wanted to start for a while. Seeing this question, I just wrote a pslinq (https://github.com/manojlds/pslinq) utility. The first and only cmdlet as of now is Aggregate-List, which can be used like below:
1..10 | Aggregate-List { $acc * $input } -seed 1
#3628800
Sum:
1..10 | Aggregate-List { $acc + $input }
#55
String reverse:
"abcdefg" -split '' | Aggregate-List { $input + $acc }
#gfedcba
PS: This is more of an experiment.
Ran into a similar issue recently. Here's a pure PowerShell solution. It doesn't handle arrays within arrays or strings like the JavaScript version does, but it may be a good starting point.
function Reduce-Object {
[CmdletBinding()]
[Alias("reduce")]
[OutputType([Int])]
param(
# Meant to be passed in through pipeline.
[Parameter(Mandatory=$True,
ValueFromPipeline=$True,
ValueFromPipelineByPropertyName=$True)]
[Array] $InputObject,
# Position=0 because we assume pipeline usage by default.
[Parameter(Mandatory=$True,
Position=0)]
[ScriptBlock] $ScriptBlock,
[Parameter(Mandatory=$False,
Position=1)]
[Int] $InitialValue
)
begin {
# Track whether we already have a value, so 0/$false accumulators are not mistaken for "unset".
$HasValue = $PSBoundParameters.ContainsKey('InitialValue')
if ($HasValue) { $Accumulator = $InitialValue }
}
process {
foreach($Value in $InputObject) {
if ($HasValue) {
# Execute script block given as param with values.
$Accumulator = $ScriptBlock.InvokeReturnAsIs($Accumulator, $Value)
} else {
# Contingency for no initial value given: the first item seeds the accumulator.
$Accumulator = $Value
$HasValue = $true
}
}
}
end {
return $Accumulator
}
}
1..10 | reduce {param($a, $b) $a + $b}
# Or
reduce -inputobject @(1,2,3,4) {param($a, $b) $a + $b} -InitialValue 2
There's a functional module I came upon, https://github.com/chriskuech/functional, that has Reduce-Object, plus Merge-Object, Test-Equality, etc. It's surprising that PowerShell has map (ForEach-Object) and filter (Where-Object) but not reduce; see https://medium.com/swlh/functional-programming-in-powershell-876edde1aadb. Even JavaScript has reduce for arrays. A reducer is actually very powerful: you can define map and filter in terms of it.
1..10 | reduce-object { $a * $b }
3628800
Measure-Object can't sum [timespan] values:
1..10 | % { [timespan]"$_" } | Reduce-Object { $a + $b } | ft
Days Hours Minutes Seconds Milliseconds
---- ----- ------- ------- ------------
55 0 0 0 0
Merge two objects:
@{a=1},@{b=2} | % { [pscustomobject]$_ } | Merge-Object -Strategy fail
b a
- -
2 1

How does Select-Object stop the pipeline in PowerShell v3?

In PowerShell v2, the following line:
1..3| foreach { Write-Host "Value : $_"; $_ }| select -First 1
Would display:
Value : 1
1
Value : 2
Value : 3
Since all elements were pushed down the pipeline. However, in v3 the above line displays only:
Value : 1
1
The pipeline is stopped before 2 and 3 are sent to Foreach-Object (Note: the -Wait switch for Select-Object allows all elements to reach the foreach block).
How does Select-Object stop the pipeline, and can I now stop the pipeline from a foreach or from my own function?
Edit: I know I can wrap a pipeline in a do...while loop and continue out of the pipeline. I have also found that in v3 I can do something like this (it doesn't work in v2):
function Start-Enumerate ($array) {
do{ $array } while($false)
}
Start-Enumerate (1..3)| foreach {if($_ -ge 2){break};$_}; 'V2 Will Not Get Here'
But Select-Object doesn't require either of these techniques so I was hoping that there was a way to stop the pipeline from a single point in the pipeline.
Check this post on how you can cancel a pipeline:
http://powershell.com/cs/blogs/tobias/archive/2010/01/01/cancelling-a-pipeline.aspx
In PowerShell 3.0 it's an engine improvement. From the CTP1 samples folder ('\Engines Demos\Misc\ConnectBugFixes.ps1'):
# Connect Bug 332685
# Select-Object optimization
# Submitted by Shay Levi
# Connect Suggestion 286219
# PSV2: Lazy pipeline - ability for cmdlets to say "NO MORE"
# Submitted by Karl Prosser
# Stop the pipeline once the objects have been selected
# Useful for commands that return a lot of objects, like dealing with the event log
# In PS 2.0, this took a long time even though we only wanted the first 10 events
Start-Process powershell.exe -Args '-Version 2 -NoExit -Command Get-WinEvent | Select-Object -First 10'
# In PS 3.0, the pipeline stops after retrieving the first 10 objects
Get-WinEvent | Select-Object -First 10
After trying several methods, including throwing StopUpstreamCommandsException, ActionPreferenceStopException, and PipelineClosedException, calling $PSCmdlet.ThrowTerminatingError, and calling $ExecutionContext.Host.Runspace.GetCurrentlyRunningPipeline().stopper.set_IsStopping($true), I finally found that just utilizing Select-Object was the only thing that didn't abort the whole script (versus just the pipeline). [Note that some of the items mentioned above require access to private members, which I accessed via reflection.]
# This looks like it should put a zero in the pipeline but on PS 3.0 it doesn't
function stop-pipeline {
$sp = {select-object -f 1}.GetSteppablePipeline($MyInvocation.CommandOrigin)
$sp.Begin($true)
$x = $sp.Process(0) # this call doesn't return
$sp.End()
}
A new method follows, based on a comment from the OP. Unfortunately this method is a lot more complicated and uses private members. Also, I don't know how robust it is - I just got the OP's example to work and stopped there. So, FWIW:
# wh is alias for write-host
# sel is alias for select-object
# The following two use reflection to access private members:
# invoke-method invokes private methods
# select-properties is similar to select-object, but it gets private properties
# Get the system.management.automation assembly
$smaa=[appdomain]::currentdomain.getassemblies()|
? location -like "*system.management.automation*"
# Get the StopUpstreamCommandsException class
$upcet=$smaa.gettypes()| ? name -like "*upstream*"
filter x {
[CmdletBinding()]
param(
[parameter(ValueFromPipeline=$true)]
[object] $inputObject
)
process {
if ($inputObject -ge 5) {
# Create a StopUpstreamCommandsException
$upce = [activator]::CreateInstance($upcet,@($pscmdlet))
$PipelineProcessor=$pscmdlet.CommandRuntime|select-properties PipelineProcessor
$commands = $PipelineProcessor|select-properties commands
$commandProcessor= $commands[0]
$null = $upce.RequestingCommandProcessor|select-properties *
$upce.RequestingCommandProcessor.commandinfo =
$commandProcessor|select-properties commandinfo
$upce.RequestingCommandProcessor.Commandruntime =
$commandProcessor|select-properties commandruntime
$null = $PipelineProcessor|
invoke-method recordfailure @($upce, $commandProcessor.command)
1..($commands.count-1) | % {
$commands[$_] | invoke-method DoComplete
}
wh throwing
throw $upce
}
wh "< $inputObject >"
$inputObject
} # end process
end {
wh in x end
}
} # end filter x
filter y {
[CmdletBinding()]
param(
[parameter(ValueFromPipeline=$true)]
[object] $inputObject
)
process {
$inputObject
}
end {
wh in y end
}
}
1..5| x | y | measure -Sum
PowerShell code to retrieve PipelineProcessor value through reflection:
$t_cmdRun = $pscmdlet.CommandRuntime.gettype()
# Get pipelineprocessor value ($pipor)
$bindFlags = [Reflection.BindingFlags]"NonPublic,Instance"
$piporProp = $t_cmdRun.getproperty("PipelineProcessor", $bindFlags )
$pipor=$piporProp.GetValue($PSCmdlet.CommandRuntime,$null)
PowerShell code to invoke a method through reflection:
$proc = (gps)[12] # semi-random process
$methinfo = $proc.gettype().getmethod("GetComIUnknown", $bindFlags)
# Return ComIUnknown as an IntPtr
$comIUnknown = $methinfo.Invoke($proc, @($true))
I know that throwing a PipelineStoppedException stops the pipeline. The following example simulates, in v2.0, what you see with Select -First 1 in v3.0:
filter Select-Improved($first) {
begin{
$count = 0
}
process{
$_
$count++
if($count -ge $first){throw (new-object System.Management.Automation.PipelineStoppedException)}
}
}
trap{continue}
1..3| foreach { Write-Host "Value : $_"; $_ }| Select-Improved -first 1
write-host "after"