I want to make grep case insensitive by applying a default option -i to grep. The standard way in PowerShell is to use a function:
function grep {
env grep -i $args
}
Grep also accepts text to search via standard input (cat file | grep search).
A simple way to achieve that is:
function grep($search) {
$input | env grep -i $search
}
Can I combine these two so that function grep knows it was called in a pipeline? Or is there an even simpler way?
I'm going to assume that env grep means that you have a Unix-ish environment with a grep.exe somewhere. The proper way of wrapping that in a function that can handle both parameter and pipeline input looks somewhat like this:
function grep {
[CmdletBinding(DefaultParameterSetName='content')]
Param(
[Parameter(Position=0,Mandatory=$true)]
[string]$Search,
[Parameter(
ParameterSetName='content',
Mandatory=$true,
ValueFromPipeline=$true,
ValueFromPipelineByPropertyName=$true
)]
[string]$InputObject,
[Parameter(
ParameterSetName='file',
Position=1,
Mandatory=$true
)]
[string[]]$File
)
Begin {
# Resolve the grep binary once (assumes grep.exe is discoverable via PATH);
# `& env grep` would run grep immediately instead of storing a command to invoke.
$grep = (Get-Command grep).Source
}
Process {
if ($PSCmdlet.ParameterSetName -eq 'file') {
& $grep -i $Search $File
} else {
$InputObject | & $grep -i $Search
}
}
}
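A couple of hedged usage checks, assuming a grep.exe on your PATH (the file name is made up):
'foo bar', 'FOO baz' | grep foo   # pipeline input selects parameter set 'content'
grep foo .\notes.txt              # a positional file argument selects parameter set 'file'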
I finally figured it out. This might only work interactively and not in a script, because of $input, but aliasing is generally considered an interactive technique (although providing standard options to commands isn't necessarily).
In this example I'm using ag, the Silver Searcher, as it's case-insensitive (actually "case-smart"), recursive, and enables color by default. It searches the current directory if no path is given, which is the first test case showing that the function works as intended:
function ag {
if ($input.Count) {
# Pipeline input present: rewind the enumerator and feed it to ag via stdin.
$input.Reset()
$input | env ag -i $args
}
else {
# No pipeline input: search the given path (or the current directory) recursively.
env ag --depth -1 --hidden --skip-vcs-ignores $args
}
}
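A couple of hedged usage checks (the log file name is made up):
ag foo                         # recursive, case-insensitive search of the current directory
Get-Content app.log | ag foo   # pipeline input is fed to ag via stdin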
I am trying to write a PowerShell script that can get pipeline input (and is expected to do so), but trying something like
ForEach-Object {
# do something
}
doesn't actually work when using the script from the command line as follows:
1..20 | .\test.ps1
Is there a way?
Note: I know about functions and filters. This is not what I am looking for.
In v2 you can also accept pipeline input (by property name or by value), add parameter aliases, etc.:
function Get-File{
param(
[Parameter(
Position=0,
Mandatory=$true,
ValueFromPipeline=$true,
ValueFromPipelineByPropertyName=$true)
]
[Alias('FullName')]
[String[]]$FilePath
)
process {
foreach($path in $FilePath)
{
Write-Host "file path is: $path"
}
}
}
# test ValueFromPipelineByPropertyName
dir | Get-File
# test ValueFromPipeline (byValue)
"D:\scripts\s1.txt","D:\scripts\s2.txt" | Get-File
- or -
dir *.txt | foreach {$_.fullname} | Get-File
This works and there are probably other ways to do it:
foreach ($i in $input) {
$i
}
17:12:42 PS>1..20 | .\cmd-input.ps1
1
2
3
-- snip --
18
19
20
Search for "powershell $input variable" and you will find more information and examples.
A couple are here:
PowerShell Functions and Filters PowerShell Pro!
(see the section on "Using the PowerShell Special Variable “$input”")
"Scripts, functions, and script blocks all have access to the $input variable, which provides an enumerator over the elements in the incoming pipeline. "
or
$input gotchas « Dmitry’s PowerBlog PowerShell and beyond
"... basically $input in an enumerator which provides access to the pipeline you have."
This applies to the PowerShell command line, not to the Windows Command Processor (cmd.exe).
You can either write a filter which is a special case of a function like so:
filter SquareIt([int]$num) { $_ * $_ }
or you can create a similar function like so:
function SquareIt([int]$num) {
Begin {
# Executes once before first item in pipeline is processed
}
Process {
# Executes once for each pipeline object
$_ * $_
}
End {
# Executes once after last pipeline object is processed
}
}
The above works as an interactive function definition, or, if placed in a script, it can be dotted into your global session (or another script). However, your example indicated you wanted a script, so here it is in a script that is directly usable (no dotting required):
--- Contents of test.ps1 ---
param([int]$num)
Begin {
# Executes once before first item in pipeline is processed
}
Process {
# Executes once for each pipeline object
$_ * $_
}
End {
# Executes once after last pipeline object is processed
}
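A quick check, assuming the script was saved as test.ps1 in the current directory:
PS > 1..3 | .\test.ps1
1
4
9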
With PowerShell V2, this changes a bit with "advanced functions", which imbue functions with the same parameter binding features that compiled cmdlets have. See this blog post for an example of the differences. Also note that in the advanced-function case you don't use $_ to access the pipeline object. With advanced functions, pipeline objects get bound to a parameter just like they do with a cmdlet.
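A minimal sketch of the advanced-function version; note that the pipeline object binds to the parameter rather than to $_:
function SquareIt {
    [CmdletBinding()]
    param(
        [Parameter(ValueFromPipeline=$true)]
        [int]$Num
    )
    Process {
        # The current pipeline object is bound to $Num here, not to $_.
        $Num * $Num
    }
}
1..3 | SquareIt   # 1 4 9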
The following are the simplest possible examples of scripts/functions that use piped input. Each behaves the same as piping to the "echo" cmdlet.
As Scripts:
# Echo-Pipe.ps1
Begin {
# Executes once before first item in pipeline is processed
}
Process {
# Executes once for each pipeline object
echo $_
}
End {
# Executes once after last pipeline object is processed
}
# Echo-Pipe2.ps1
foreach ($i in $input) {
$i
}
As functions:
Function Echo-Pipe {
Begin {
# Executes once before first item in pipeline is processed
}
Process {
# Executes once for each pipeline object
echo $_
}
End {
# Executes once after last pipeline object is processed
}
}
Function Echo-Pipe2 {
foreach ($i in $input) {
$i
}
}
E.g.
PS > . theFileThatContainsTheFunctions.ps1 # This includes the functions into your session
PS > echo "hello world" | Echo-Pipe
hello world
PS > cat aFileWithThreeTestLines.txt | Echo-Pipe2
The first test line
The second test line
The third test line
I'm using C:\Users\User\Documents\Powershell\Microsoft.PowerShell_profile.ps1 for some alias commands for my every day work.
Example:
function MyAliasCommand { .\custom\script.ps1 }
New-Alias my MyAliasCommand
It works if I call it from my PowerShell session using the command "my".
I'm happy.
What doesn't work is passing flags to my, for example:
my -d one
How can I do this?
Is there a way to let PowerShell pass all my arguments to the function MyAliasCommand?
Something like: function MyAliasCommand { .\custom\script.ps1 $allTheFlags }?
I would do this as follows:
--- Contents of ./custom/script.ps1 ---
[CmdletBinding()]
param (
[Parameter()]
[string]
$d
)
Write-Host $d
Then you can write:
function MyCommand {
& $PSScriptRoot/custom/script.ps1 #args
}
New-Alias myAlias MyCommand
myAlias -d 'one'
Take into account that I used the call operator (&) and splatting (@args).
$args will contain all the arguments you supply.
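The distinction matters: $args passes the whole array as a single argument, while @args splats it, so each element (and each named parameter) is forwarded individually. A minimal sketch with made-up function names:
function Forward-AsOneArg { .\custom\script.ps1 $args }   # script receives one array in its first parameter
function Forward-Splatted { .\custom\script.ps1 @args }   # script receives -d 'one' as a named parameter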
How do I forward all pipeline input and arguments to a command inside an alias function?
For example if I wanted to alias tail
function tail {
coreutils tail @args
}
works fine with tail -n 5 test.txt
but not with cat test.txt | tail -n 5
even though cat test.txt | coreutils tail -n 5 works
In the simplest case, use the following:
function tail {
if ($MyInvocation.ExpectingInput) { # Pipeline input present.
# $Input passes the collected pipeline input through.
$Input | coreutils tail @args
} else {
coreutils tail @args
}
}
The down-side of this approach is that all pipeline input is collected in memory first, before it is relayed to the target program.
A streaming solution, where input objects (lines) are passed through as they become available, requires more effort:
function tail {
[CmdletBinding(PositionalBinding=$false)]
param(
[Parameter(ValueFromPipeline)]
$InputObject,
[Parameter(ValueFromRemainingArguments)]
[string[]] $PassThruArgs
)
begin
{
# Set up a steppable pipeline.
$scriptCmd = { coreutils tail $PassThruArgs }
$steppablePipeline = $scriptCmd.GetSteppablePipeline($myInvocation.CommandOrigin)
$steppablePipeline.Begin($PSCmdlet)
}
process
{
# Pass the current pipeline input through.
$steppablePipeline.Process($_)
}
end
{
$steppablePipeline.End()
}
}
The above advanced function is a so-called proxy function, explained in more detail in this answer.
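Hedged usage checks, assuming the coreutils launcher from the question is on PATH:
tail -n 5 test.txt                 # -n, 5, and the file name all land in -PassThruArgs
Get-Content test.txt | tail -n 5   # lines stream through the process block one at a time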
I am unable to understand parameter passing behavior in PowerShell.
Say, I have a script callScript.ps1
param($a="DefaultA",$b="DefaultB")
echo $a, $b
Say, another script calls callScript.ps1.
.\callScript.ps1
# outputs DefaultA followed by DefaultB as expected
.\callScript.ps1 -a 2 -b 3
# outputs 2 followed by 3 as expected
$arguments='-a 2 -b 3'
.\callScript.ps1 $arguments
# I expected the same output as the previous statement, but it is as follows:
# -a 2 -b 3
# DefaultB
How can I run a PowerShell script by constructing the command dynamically as above?
Can you please explain why $arguments is interpreted as the $a parameter in callScript.ps1?
What's happening here is that you're passing a single string, '-a 2 -b 3', as the argument for $a.
You need to pass the values separately. If you really need to do it the way you have above (there's definitely a better way, though), you could use Invoke-Expression (alias iex):
function doprint {
param( $a,$b )
$a ; $b
}
$arg = '-a "yes" -b "no"'
"doprint $arg" | iex
you could also change your function to take in an array of values like this:
function doprint {
param( [string[]]$a )
$a[0] ; $a[1]
}
$arg = @('yes','no')
doprint $arg
As has already been hinted at, you can't pass a single string when your script is expecting two parameters: the string is taken as input for $a, whilst $b takes its default value.
You can however build a simple hash table containing your arguments, and then use splatting to pass them to the script.
The changes to your code are minimal:
$arguments = @{a="2"; b="3"}
.\callScript.ps1 @arguments
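If positional binding is acceptable, a hedged alternative is to splat an array instead of a hash table:
$arguments = 2, 3
.\callScript.ps1 @arguments   # 2 binds to -a and 3 to -b by position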
Assuming PowerShell has a limit of N characters in its command line, how can I pass more than N characters to a cmdlet? Based on https://support.microsoft.com/en-in/kb/830473, the limit seems to be 8191 characters, but that article is about cmd.exe, and I'm not sure what the limit is for PowerShell. So if I have an input of more than 8k characters, can I redirect the input to PowerShell to circumvent this problem (a solution based on what is mentioned in the referenced document)?
Eg:
powershell console $> echo "a very long string" // The whole command, including the echo and the string, totals less than 8192 chars; when I execute this I get the whole string as output on the console.
powershell console $> echo "a very long string // If I try to add characters so that the total goes above 8192, PowerShell doesn't let me add any more, so I guess I have reached the limit on the number of characters I can enter.
What I want:
powershell console $> echo // Place my input (which is more than 8192 chars) in a file and provide that as input to echo; echo should display the complete string on the console, thereby circumventing the limit on the number of chars in a command.
The command echo I have used is only for representation purpose and I want to use a custom cmdlet instead of that so please consider this a valid scenario.
Edit 2:
psm1 file:
Function DoSomething {
[CmdletBinding()]
Param(
[Parameter(Mandatory = $False)]
[string]$v1,
[Parameter(Mandatory = $False)]
[string]$v2
)
Begin {}
Process {
Write-Output "hello $v1 | $v2"
}
}
Text file content, say content.txt (short for representation purposes, but assume it can be more than 8k characters):
-v1 "t1" -v2 "qwe"
Now when I do
powershell Console$> DoSomething (Get-Content content.txt)
the output that I get is
hello -v1 "t1" -v2 "qwe" |
I expect the output to be
hello -v1 "t1" | -v2 "qwe"
so that the execution of the cmdlet can happen without any issues. I tried this with an example of more than 8k characters in the text file and it is able to print the output; it's just that the parameters aren't getting separated. The command that provides the input to the cmdlet doesn't have to be Get-Content; it can be anything, as long as it works.
You misunderstand how parameters in PowerShell functions work. The output of Get-Content is an array of strings (one string for each line in the file), but the entire array is passed to the first parameter. Also, a string isn't magically split so that the substrings can go to several parameters. How should PowerShell know which way to split the string?
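A hedged sketch of what actually happens with the one-line file from the question:
$lines = Get-Content content.txt   # @('-v1 "t1" -v2 "qwe"'): one line, one string
DoSomething $lines                 # the whole string binds to -v1; -v2 stays empty
# output: hello -v1 "t1" -v2 "qwe" |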
A better way to deal with such input data is to have your function accept input from the pipeline:
Function DoSomething {
[CmdletBinding()]
Param(
[Parameter(
Mandatory=$false,
ValueFromPipeline=$true,
ValueFromPipelineByPropertyName=$true
)]
[string]$v1,
[Parameter(
Mandatory=$false,
ValueFromPipeline=$true,
ValueFromPipelineByPropertyName=$true
)]
[string]$v2
)
Process {
Write-Output "hello $v1 | $v2"
}
}
and define the data as a CSV (column names matching the parameter names):
v1,v2
"-v1 ""t1""","-v2 ""qwe"""
so that you can pipe the data into the function:
Import-Csv content.csv | DoSomething
With the function built like this you could also define the data as a hashtable and splat it:
$data = @{
'v1' = '-v1 "t1"'
'v2' = '-v2 "qwe"'
}
DoSomething @data
For more information about function parameters see about_Parameters and about_Functions_Advanced_Parameters.
Alternatively, you can pass a file path and read that file's content inside your PowerShell script. There is no limit then.
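A minimal sketch of that approach (the script and parameter names are assumptions):
# process-file.ps1
param([string]$Path)
# Read the whole payload inside the script; only the short path crosses the command line.
$content = Get-Content -Path $Path -Raw
Write-Output "received $($content.Length) characters"
Invoke it as .\process-file.ps1 -Path .\content.txt.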