How do I forward all pipeline input and arguments to a command inside an alias function?
For example, if I wanted to alias tail:
function tail {
    coreutils tail @args
}
works fine with tail -n 5 test.txt,
but not with cat test.txt | tail -n 5,
even though cat test.txt | coreutils tail -n 5 works.
In the simplest case, use the following:
function tail {
    if ($MyInvocation.ExpectingInput) { # Pipeline input present.
        # $Input passes the collected pipeline input through.
        $Input | coreutils tail @args
    } else {
        coreutils tail @args
    }
}
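With this in place, both invocation styles from the question should work, e.g. (assuming coreutils is on your PATH and test.txt exists):
tail -n 5 test.txt
cat test.txt | tail -n 5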
The down-side of this approach is that all pipeline input is collected in memory first, before it is relayed to the target program.
A streaming solution, where input objects (lines) are passed through as they become available, requires more effort:
function tail {
    [CmdletBinding(PositionalBinding=$false)]
    param(
        [Parameter(ValueFromPipeline)]
        $InputObject,
        [Parameter(ValueFromRemainingArguments)]
        [string[]] $PassThruArgs
    )
    begin
    {
        # Set up a steppable pipeline.
        $scriptCmd = { coreutils tail $PassThruArgs }
        $steppablePipeline = $scriptCmd.GetSteppablePipeline($myInvocation.CommandOrigin)
        $steppablePipeline.Begin($PSCmdlet)
    }
    process
    {
        # Pass the current pipeline input through.
        $steppablePipeline.Process($_)
    }
    end
    {
        $steppablePipeline.End()
    }
}
The above advanced function is a so-called proxy function, explained in more detail in this answer.
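As a quick check (again assuming coreutils is on your PATH; hugefile.log is a placeholder), pipeline input now streams through instead of being collected in memory first:
Get-Content hugefile.log | tail -n 5   # lines are relayed one at a time
tail -n 5 hugefile.log                 # direct arguments still work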
Related
I'm looking to have a function in a script where I can use a ScriptBlock passed in either as a predicate or with Where-Object.
I can write
cat .\.gitignore | Where-Object { $_.contains('pp') }
and this works; as does:
$f = { $_.contains('pp') }; cat .gitignore | Where-Object $f
however trying
$f.Invoke( 'apple' )
results in
MethodInvocationException: Exception calling "Invoke" with "1" argument(s): "You cannot call a method on a null-valued expression."
Whereas I expected True. So clearly $_ wasn't set.
Likewise
$ff = { echo "args: $args`nauto: $_" }; $ff.Invoke( 'apple' )
outputs
args: apple
auto:
So $_ is clearly not getting set.
'apple' | %{ $_.contains('pp') }
works, but I want the scriptblock to be a variable, and
$f = { $_.contains('pp') }; 'apple' | %$f
is a compile error.
tl;dr: So how do I set/pass the value of $_ inside a scriptblock I am invoking?
Note, this answer only covers how $_ gets populated in the context of the process block of a script block. Other use cases can be found in the about_PSItem documentation.
In the context of a process block of a Script Block, the $_ ($PSItem) variable is automatically populated and represents each element coming from the pipeline, i.e.:
$f = { process { $_.contains('pp') }}
'apple' | & $f # True
You can however achieve the same using the InvokeWithContext method from the ScriptBlock class:
$f = { $_.contains('pp') }
$f.InvokeWithContext($null, [psvariable]::new('_', 'apple')) # True
Do note, this method always returns Collection`1. Output is not enumerated.
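For example, a quick way to verify this (same script block as above):
$f = { $_.contains('pp') }
$result = $f.InvokeWithContext($null, [psvariable]::new('_', 'apple'))
$result.GetType().Name # Collection`1
$result[0]             # True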
Worth noting, as zett42 points out, that the scoping rules of script blocks invoked via their methods or via the call operator & still apply.
Script blocks are able to see parent-scope variables (this does not include remoting):
$foo = 'hello'
{ $foo }.Invoke() # hello
But are not able to update them:
$foo = 'hello'
{ $foo = 'world' }.Invoke()
$foo # hello
Unless using a scope modifier (applies only to Value Types):
$foo = 'hello'
{ $script:foo = 'world' }.Invoke()
$foo # world
Or via the dot sourcing operator .:
$foo = 'hello'
. { $foo = 'world' }
$foo # world
# still applies with pipelines!
$foo = 'hello'
'world' | . { process { $foo = $_ }}
$foo # world
See about Scopes for more details.
Using the .Invoke() method (and its variants, .InvokeReturnAsIs() and .InvokeWithContext()) to execute a script block in PowerShell code is best avoided, because it changes the semantics of the call in several respects - see this answer for more information.
While the PowerShell-idiomatic equivalent is &, the call operator, it is not enough here, given that you want the automatic $_ variable to be defined in your script block.
The easiest way to define $_ based on input is indeed ForEach-Object (one of whose built-in aliases is %):
$f = { $_.contains('pp') }
ForEach-Object -Process $f -InputObject 'apple' # -> $true
Note, however, that -InputObject only works meaningfully for a single input object (though you may pass an array / collection in which case $_ then refers to it as a whole); to provide multiple ones, use the pipeline:
'apple', 'pear' | ForEach-Object $f # $true, $false
# Equivalent, with alias
'apple', 'pear' | % $f
If, by contrast, your intent is simply for your script block to accept arguments, you don't need $_ at all and can simply make your script block either formally declare parameter(s) or use the automatic $args variable, which contains all (unbound) positional arguments:
# With $args: $args[0] is the first positional argument.
$f = { $args[0].contains('pp') }
& $f 'apple'
# With declared parameter.
$f = { param([string] $fruit) $fruit.contains('pp') }
& $f 'apple'
For more information about the parameter-declaration syntax, see the conceptual about_Functions help topic (script blocks are basically unnamed functions, and only the param(...) declaration style can be used in script blocks).
I got it to work by wrapping $f in () like
$f = { $_.contains('pp') }; 'apple' | %($f)
...or (thanks to @zett42) by placing a space between the % and the $, like
$f = { $_.contains('pp') }; 'apple' | % $f
You can even pass in the value from a variable:
$f = { $_.contains('pp') }; $a = 'apple'; $a | %($f)
Or use it inside an if statement:
$f = { $_.contains('pp') }; $a = 'apple'; If ( $a | %($f) ){ echo 'yes' }
So it appears that $_ is only set by having things piped (via |) into it? But why this is, how it works, and whether this can be done through .Invoke() is unknown to me. If anyone can explain this, please do.
From What does $_ mean in PowerShell? and the related documentation, it seems like $PSItem is indeed a better name since it isn't like Perl's $_
I am trying to write a PowerShell script that can get pipeline input (and is expected to do so), but trying something like
ForEach-Object {
# do something
}
doesn't actually work when using the script from the commandline as follows:
1..20 | .\test.ps1
Is there a way?
Note: I know about functions and filters. This is not what I am looking for.
In v2 you can also accept pipeline input (ByValue or ByPropertyName), add parameter aliases, etc.:
function Get-File {
    param(
        [Parameter(
            Position=0,
            Mandatory=$true,
            ValueFromPipeline=$true,
            ValueFromPipelineByPropertyName=$true)]
        [Alias('FullName')]
        [String[]]$FilePath
    )
    process {
        foreach ($path in $FilePath) {
            Write-Host "file path is: $path"
        }
    }
}
# test ValueFromPipelineByPropertyName
dir | Get-File
# test ValueFromPipeline (byValue)
"D:\scripts\s1.txt","D:\scripts\s2.txt" | Get-File
- or -
dir *.txt | foreach {$_.fullname} | Get-File
This works and there are probably other ways to do it:
foreach ($i in $input) {
    $i
}
17:12:42 PS>1..20 | .\cmd-input.ps1
1
2
3
-- snip --
18
19
20
Search for "powershell $input variable" and you will find more information and examples.
A couple are here:
PowerShell Functions and Filters PowerShell Pro!
(see the section on "Using the PowerShell Special Variable “$input”")
"Scripts, functions, and script blocks all have access to the $input variable, which provides an enumerator over the elements in the incoming pipeline. "
or
$input gotchas « Dmitry’s PowerBlog PowerShell and beyond
"... basically $input in an enumerator which provides access to the pipeline you have."
This is for the PowerShell command line, not the Windows Command Processor (cmd.exe) command line.
You can either write a filter which is a special case of a function like so:
filter SquareIt([int]$num) { $_ * $_ }
or you can create a similar function like so:
function SquareIt([int]$num) {
    Begin {
        # Executes once before first item in pipeline is processed
    }
    Process {
        # Executes once for each pipeline object
        $_ * $_
    }
    End {
        # Executes once after last pipeline object is processed
    }
}
The above works as an interactive function definition, or if in a script it can be dotted into your global session (or another script). However, your example indicated you wanted a script, so here it is in a script that is directly usable (no dotting required):
--- Contents of test.ps1 ---
param([int]$num)
Begin {
    # Executes once before first item in pipeline is processed
}
Process {
    # Executes once for each pipeline object
    $_ * $_
}
End {
    # Executes once after last pipeline object is processed
}
With PowerShell V2, this changes a bit with "advanced functions", which imbue functions with the same parameter-binding features that compiled cmdlets have. See this blog post for an example of the differences. Also note that in the advanced-function case you don't use $_ to access the pipeline object. With advanced functions, pipeline objects get bound to a parameter, just like they do with a cmdlet.
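For instance, a minimal advanced-function version of SquareIt might look like this (a sketch; the parameter name is arbitrary):
function SquareIt {
    [CmdletBinding()]
    param(
        [Parameter(ValueFromPipeline)]
        [int] $Num
    )
    process {
        # The current pipeline object is bound to $Num; no $_ needed.
        $Num * $Num
    }
}
1..3 | SquareIt   # 1, 4, 9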
The following are the simplest possible examples of scripts/functions that use piped input. Each behaves the same as piping to the "echo" cmdlet.
As Scripts:
# Echo-Pipe.ps1
Begin {
    # Executes once before first item in pipeline is processed
}
Process {
    # Executes once for each pipeline object
    echo $_
}
End {
    # Executes once after last pipeline object is processed
}
# Echo-Pipe2.ps1
foreach ($i in $input) {
    $i
}
As functions:
Function Echo-Pipe {
    Begin {
        # Executes once before first item in pipeline is processed
    }
    Process {
        # Executes once for each pipeline object
        echo $_
    }
    End {
        # Executes once after last pipeline object is processed
    }
}

Function Echo-Pipe2 {
    foreach ($i in $input) {
        $i
    }
}
E.g.
PS > . theFileThatContainsTheFunctions.ps1 # This includes the functions into your session
PS > echo "hello world" | Echo-Pipe
hello world
PS > cat aFileWithThreeTestLines.txt | Echo-Pipe2
The first test line
The second test line
The third test line
In bash, I can do this:
if this_command >/dev/null 2>&1; then
    ANSWER="this_command"
elif that_command >/dev/null 2>&1; then
    ANSWER="that_command"
else
    ANSWER="neither command"
fi
but in Powershell, I have to do this:
this_command >/dev/null 2>&1
if ($?) {
    $ANSWER = "this_command"
} else {
    that_command >/dev/null 2>&1
    if ($?) {
        $ANSWER = "that_command"
    } else {
        $ANSWER = "neither command"
    }
}
or something similar with ($LASTEXITCODE -eq 0). How do I make the PowerShell version look like the bash one? I'm not a PowerShell expert, but I cannot believe that it doesn't provide some means of running a command and checking its return code in a single statement, in a way that could be used in an if-elseif-else statement. This statement would become increasingly difficult to read with every external command that must be tested this way.
For PowerShell cmdlets you can do the exact same thing you do in bash. You don't even need to do individual assignments in each branch. Just output what you want to assign and collect the output of the entire conditional in a variable.
$ANSWER = if (Do-Something >$null 2>&1) {
    'this_command'
} elseif (Do-Other >$null 2>&1) {
    'that_command'
} else {
    'neither command'
}
For external commands it's slightly different, because PowerShell would evaluate the command output, not the exit code/status (with empty output evaluating to "false"). But you can run the command in a subexpression and output the status to get the desired result.
$ANSWER = if ($(this_command >$null 2>&1; $?)) {
    'this_command'
} elseif ($(that_command >$null 2>&1; $?)) {
    'that_command'
} else {
    'neither command'
}
Note that you must use a subexpression ($(...)), not a grouping expression ((...)), because you effectively need to run 2 commands in a row (run external command, then output status), which the latter doesn't support.
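A quick illustration of the difference, using the same placeholder command:
# Parse error: a grouping expression takes a single pipeline, not two statements.
# if ((this_command >$null 2>&1; $?)) { 'this_command' }

# Works: the subexpression runs the command, then outputs $? as the condition.
if ($(this_command >$null 2>&1; $?)) { 'this_command' }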
You can't do it inline like in bash, but you can one-line this with two statements on one line, separated by a semi-colon ;:
MyProgram.exe -param1 -param2 -etc *>$null; if ($LASTEXITCODE -eq 0) {
    # Success code here
} else {
    # Fail code here
}
Also, you can't evaluate an external command directly in the if condition (PowerShell would test its output, not its exit status), which is why we run it first and then check $LASTEXITCODE -eq 0.
Note that you CAN evaluate cmdlets inline, just not external commands. For example:
if (Test-Connection stackoverflow.com) {
    "success"
} else {
    "fail"
}
Another approach is to have it output an empty string if it's false:
if (echo hi | findstr there) { 'yes' }
if (echo hi | findstr hi) { 'yes' }
yes
PowerShell's native error handling works completely differently from the exit-code-based error signaling performed by external programs, and, unfortunately, error handling with external programs in PowerShell is cumbersome, requiring explicit checks of the automatic $? or $LASTEXITCODE variables.
PowerShell [Core]:
introduced support for Bash-style && and || pipeline-chain operators in v7 - see this answer.
but this will not also enable use of external-program calls in if statements, because there PowerShell will continue to operate on the output from commands, not on their implied success status / exit code; see this answer for more information.
Solutions:
PowerShell [Core] 7.0+:
$ANSWER = this_command *>$null && "this_command" ||
    (that_command *>$null && "that_command" || "neither command")
Note:
If this_command or that_command don't exist (can't be found), a statement-terminating error occurs, i.e. the statement fails as a whole.
Note the need to enclose the 2nd chain in (...) so that && "that_command" doesn't also kick in when this_command succeeds.
*>$null is used to conveniently silence all streams with a single redirection.
Unlike an if-based solution, this technique passes (non-suppressed) output from the external programs through.
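To see the chain in action on Windows without real this_command/that_command programs, cmd /c exit can stand in for a failing and a succeeding external command (purely illustrative):
$ANSWER = cmd /c exit 1 *>$null && "this_command" ||
    (cmd /c exit 0 *>$null && "that_command" || "neither command")
$ANSWER # -> that_command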
Windows PowerShell and PowerShell Core 6.x:
If the external-program calls produce no output or you actively want to suppress their output, as in your question:
See the $(...)-based technique in Ansgar Wiechers' helpful answer.
If you do want the external programs' output:
An auxiliary dummy do loop allows for a fairly "low-noise" solution:
$ANSWER = do {
    this_command # Note: No output suppression
    if ($?) { "this_command"; break }
    that_command # Note: No output suppression
    if ($?) { "that_command"; break }
    "neither command"
} while ($false)
Currently, I'm writing a PowerShell module which automatically configures aliases for all git commands, inspired by git-sh.
To that end, I wrote the functions below.
The Enable-GitAliases function is the entry point to configure aliases automatically.
It collects git's subcommands via Get-GitCommands, which parses git --help -a to get all of git's subcommands.
Then it defines the wrapper functions for the collected git commands.
My question is: why is git --help -a called so many times (possibly infinitely) when invoking Enable-GitAliases, causing a significant slowdown?
After writing the code, I found that Enable-GitAliases takes too much time (I've never seen it finish).
According to the Task Manager, the git --help -a command is launched and exits repeatedly.
I expected the git --help -a command to be called only once.
Actually, Get-GitCommands | % { echo $_ } calls git --help -a only once.
What is the difference, and what is the best way to fix this?
function Get-GitCommands {
    -Split (git --help -a | select-string -pattern '^ [-a-zA-Z0-9.]+\s*')
}
function Enable-GitAliases($avoidConflicts = $true) {
    Get-GitCommands | % {
        $aliasName = $_
        if (-not ($avoidConflicts -and (Get-Command $aliasName 2> $null) -ne $null)) {
            Enable-GitAliases $aliasName
        }
    }
}
function Enable-GitAlias($commandName) {
    $wrapper = @'
function global:{0} {{
    git {0} $args
}}
'@ -f $commandName
    Invoke-Expression $wrapper
}
You call Enable-GitAliases recursively, but is this intended?
Is your intention this?
function Enable-GitAliases($avoidConflicts = $true) {
    Get-GitCommands | % {
        $aliasName = $_
        if (-not ($avoidConflicts -and (Get-Command $aliasName 2> $null) -ne $null)) {
            # Enable-GitAliases -> Enable-GitAlias
            Enable-GitAlias $aliasName
        }
    }
}
I want to make grep case insensitive by applying a default option -i to grep. The standard way in PowerShell is to use a function:
function grep {
    env grep -i $args
}
Grep also accepts text to search via standard input (cat file | grep search).
A simple way to achieve that is:
function grep($search) {
    $input | env grep -i $search
}
Can I combine these two so that function grep knows it was called in a pipeline? Or is there an even simpler way?
I'm going to assume that env grep means that you have a Unix-ish environment with a grep.exe somewhere. The proper way of wrapping that in a function that can handle parameter and pipeline input looks somewhat like this:
function grep {
    [CmdletBinding(DefaultParameterSetName='content')]
    Param(
        [Parameter(Position=0, Mandatory=$true)]
        [string]$Search,

        [Parameter(
            ParameterSetName='content',
            Mandatory=$true,
            ValueFromPipeline=$true,
            ValueFromPipelineByPropertyName=$true
        )]
        [string]$InputObject,

        [Parameter(
            ParameterSetName='file',
            Position=1,
            Mandatory=$true
        )]
        [string[]]$File
    )
    Begin {
        # Resolve the path to the grep binary once.
        $grep = & env which grep
    }
    Process {
        if ($PSCmdlet.ParameterSetName -eq 'file') {
            & $grep -i $Search $File
        } else {
            $InputObject | & $grep -i $Search
        }
    }
}
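Usage then looks like this (file names are placeholders):
grep 'pattern' some.txt, other.txt      # 'file' parameter set
Get-Content some.txt | grep 'pattern'   # 'content' (pipeline) parameter set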
I finally figured it out. This might only work interactively and not in a script (because of $input), but aliasing is generally considered an interactive technique (although providing default options to commands isn't necessarily).
In this example I'm using ag, the Silver Searcher, as it's case-insensitive (actually "smart-case"), recursive, and enables color by default. It searches the current directory if no path is given (which is the first "test case" to show that the function is working as intended).
function ag {
    if ($input.Count) {
        $input.Reset()
        $input | env ag -i $args
    }
    else {
        env ag --depth -1 --hidden --skip-vcs-ignores $args
    }
}
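For example (search string and file name are placeholders):
ag foo                          # recursive search of the current directory
Get-Content some.txt | ag foo   # searches the piped input instead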