How to use a dash argument in PowerShell?

I am porting a script from bash to PowerShell, and I would like to keep the same support for argument parsing in both. In bash, one of the possible arguments is --, and I want to detect that argument in PowerShell as well. However, nothing I've tried so far has worked. I cannot define it as a parameter like param($-), as that causes a compile error. Also, if I completely forgo PowerShell argument processing and just use $args, everything appears fine, but when I run the function, the -- argument is missing.
Function Test-Function {
    Write-Host $args
}

Test-Function -- -args go -here # Prints "-args go -here"
I know about $PSBoundParameters as well, but the value isn't there, because I can't bind a parameter named $-. Are there any other mechanisms I can try, or any solution?
For a bit more context, note that my use of PowerShell is a side effect: this isn't expected to be used as a normal PowerShell command. I have also written a batch wrapper around this, but the logic of the wrapper is more complex than I wanted to write in batch, so the batch wrapper just calls the PowerShell function, which then does the more complex processing.

I found a way to do this, but instead of a double hyphen you have to pass three of them.
This is a simple function; you can change the code as you want:
function Test-Hyphen {
    param(
        ${-}
    )

    if (${-}) {
        Write-Host "You used triple-hyphen"
    } else {
        Write-Host "You didn't use triple-hyphen"
    }
}
Sample 1
Test-Hyphen
Output
You didn't use triple-hyphen
Sample 2
Test-Hyphen ---
Output
You used triple-hyphen

As an aside: PowerShell allows a surprising range of variable names, but you have to enclose them in {...} in order for them to be recognized; that is, ${-} technically works, but it doesn't solve your problem.
The challenge is that PowerShell quietly strips -- from the list of arguments, and the only way to preserve that token is to precede it with the PSv3+ stop-parsing symbol, --%, which, however, fundamentally changes how the arguments are passed and is obviously an extra requirement, which is what you're trying to avoid.
Your best bet is to try - suboptimal - workarounds:
Option A: In your batch-file wrapper, translate -- to a special argument that PowerShell does preserve and pass it instead; the PowerShell script will then have to re-translate that special argument to --.
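For instance, a minimal sketch of Option A; the sentinel token __DOUBLE_DASH__ is a hypothetical choice, and any token that PowerShell passes through verbatim will do:
# In the PowerShell script: map the sentinel that the batch wrapper
# substituted for -- (e.g., via a %ARGS:--=__DOUBLE_DASH__% replacement)
# back to the real token.
$argsTranslated = foreach ($a in $args) {
    if ($a -eq '__DOUBLE_DASH__') { '--' } else { $a }
}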
Option B: Perform custom argument parsing in PowerShell:
You can analyze $MyInvocation.Line, which contains the raw command line that invoked your script, and look for the presence of -- there.
Getting this right and making it robust is nontrivial, however.
Here's a reasonably robust approach:
# Don't use `param()` or `$args` - instead, do your own argument parsing:
# Extract the argument list from the invocation command line.
$argList = ($MyInvocation.Line -replace ('^.*' + [regex]::Escape($MyInvocation.InvocationName)) -split '[;|]')[0].Trim()
# Use Invoke-Expression with a Write-Output call to parse the raw argument list,
# performing evaluation and splitting it into an array:
$customArgs = if ($argList) { @(Invoke-Expression "Write-Output -- $argList") } else { @() }
# Print the resulting arguments array for verification:
$i = 0
$customArgs | % { "Arg #$((++$i)): [$_]" }
Note:
There are undoubtedly edge cases where the argument list may not be correctly extracted or where the re-evaluation of the raw arguments causes side effects, but for the majority of cases - especially when called from outside PowerShell - this should do.
While useful here, Invoke-Expression should generally be avoided.
If your script is named foo.ps1 and you invoked it as ./foo.ps1 -- -args go -here, you'd see the following output:
Arg #1: [--]
Arg #2: [-args]
Arg #3: [go]
Arg #4: [-here]

I came up with the following solution, which also works well inside pipelines and multi-line expressions. I am using the PowerShell parser to parse the invocation expression string (while ignoring any incomplete tokens, which might be present at the end of the $MyInvocation.Line value) and then Invoke-Expression with Write-Output to get the actual argument values:
# Parse the whole invocation line
$code = [System.Management.Automation.Language.Parser]::ParseInput($MyInvocation.Line.Substring($MyInvocation.OffsetInLine - 1), [ref]$null, [ref]$null)
# Find our invocation expression without redirections
$myline = $code.Find({$args[0].CommandElements}, $true).CommandElements | % { $_.ToString() } | Join-String -Separator ' '
# Get the argument values
$command, $arguments = Invoke-Expression ('Write-Output -- ' + $myline)
# Fine-tune arguments to be always an array
if ( $arguments -is [string] ) { $arguments = @($arguments) }
if ( $null -eq $arguments ) { $arguments = @() }
Please be aware that the original values in the function call are reevaluated by Invoke-Expression, so any local variables might shadow the values of the actual arguments. Because of that, you can also use this (almost) one-liner at the top of your function, before any local variables are defined, which avoids the shadowing problem:
# Parse arguments
$command, $arguments = Invoke-Expression ('Write-Output -- ' + ([System.Management.Automation.Language.Parser]::ParseInput($MyInvocation.Line.Substring($MyInvocation.OffsetInLine - 1), [ref]$null, [ref]$null).Find({$args[0].CommandElements}, $true).CommandElements | % { $_.ToString() } | Join-String -Separator ' '))
# Fine-tune arguments to be always an array
if ( $arguments -is [string] ) { $arguments = @($arguments) }
if ( $null -eq $arguments ) { $arguments = @() }
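Note that Join-String shipped with PowerShell 6.2; on Windows PowerShell 5.1 you can substitute the -join operator for that pipeline stage (a drop-in equivalent for the line above):
$myline = ($code.Find({$args[0].CommandElements}, $true).CommandElements | % { $_.ToString() }) -join ' '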

Related

PowerShell: How to remove space between $ and text[0]

Code:
$text=Get-Content -Path "E:\1.txt"
$text.GetType() | Format-Table -AutoSize
For($i=0; $i -le 5; $i++)
{
    $var=Write-host '$'text[$i]
    $var
}
Actual Result:
$ text[0]
$ text[1]
$ text[2]
$ text[3]
$ text[4]
$ text[5]
I need the result below:
$text[0]
$text[1]
$text[2]
$text[3]
$text[4]
$text[5]
If you must use the quotes, using -Separator also works:
$text=Get-Content -Path "E:\1.txt"
$text.GetType() | Format-Table -AutoSize
For($i=0; $i -le 5; $i++)
{
    $var=Write-host '$'text[$i] -Separator ''
    $var
}
Your code fundamentally doesn't do what you intend it to do, due to the mistaken use of Write-Host:
# !! Doesn't capture anything in $var, prints directly to the screen.
$var=Write-host '$'text[$i]
$var # !! is effectively $null and produces no output.
See the bottom section for details.
Instead, what you want is an expandable string (aka interpolating string, "..."-enclosed), with selective `-escaping of the $ character you want to be treated verbatim:
$var= "`$text[$i]" # Expandable string; ` escapes the $ char., $i is expanded
$var
There are other ways to construct the desired string:
$var = '$text[{0}]' -f $i, using -f, the format operator.
$var = '$' + "text[$i]", using string concatenation with +
but the above approach is simplest in your case.
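For example, with $i = 0, all three approaches produce the identical string:
$i = 0
"`$text[$i]"          # expandable string, escaped $ -> $text[0]
'$text[{0}]' -f $i    # format operator              -> $text[0]
'$' + "text[$i]"      # string concatenation         -> $text[0]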
As for what you tried:
Write-Host is typically - and definitely in your case - the wrong tool to use, unless the intent is to write to the display only, bypassing the success output stream and with it the ability to send output to other commands, capture it in a variable, or redirect it to a file. To output a value, use it by itself; e.g., $value instead of Write-Host $value (or use Write-Output $value, though that is rarely needed); see this answer.
What you thought of as a single argument, '$'text[$i], was actually passed as two arguments, verbatim $ and expanded text[$i] (e.g., text[0]) and because Write-Host simply space-concatenates multiple arguments, a space was effectively inserted in the (for-display) output.
That '$'text[$i] becomes two arguments is a perhaps surprising PowerShell idiosyncrasy; unlike in POSIX-compatible shells such as bash, composing a single string argument from a mix of unquoted and (potentially differently) quoted parts only works if the argument starts with an unquoted substring or (mere) variable reference; for instance:
Write-Output foo'bar none' does pass a single argument (passes foobar none), whereas
Write-Output 'bar none'foo does not (passes bar none and foo)
See this answer for more information.
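To see the capture problem in isolation, here is a minimal session sketch (in PowerShell 5+, Write-Host writes to the information stream, which a variable assignment does not capture):
$var = Write-Host 'hello'   # 'hello' goes straight to the display
$null -eq $var              # -> True: nothing was captured
$var = 'hello'              # emitting the value itself is capturable
$var                        # -> hello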

How to pass parameters to a PS script invoked through Start-Job?

I want to use start-job to run a .ps1 script requiring a parameter. Here's the script file:
# Test-Job.ps1
Param (
    [Parameter(Mandatory=$True)][String]$input
)
$output = "$input to output"
return $output
and here is how I am running it:
$input = "input"
Start-Job -FilePath 'C:\PowerShell\test_job.ps1' -ArgumentList $input -Name "TestJob"
Get-Job -name "TestJob" | Wait-Job | Receive-Job
Get-Job -name "TestJob" | Remove-Job
Run like this, it returns " to output", so $input is null in the script run by the job.
I've seen other questions similar to this, but they mostly use -Scriptblock in place of -FilePath. Is there a different method for passing parameters to files through Start-Job?
tl;dr
$input is an automatic variable (value supplied by PowerShell) and shouldn't be used as a custom variable.
Simply renaming $input to, say, $InputObject solves your problem.
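A minimal sketch of the fix - only the parameter name changes:
# Test-Job.ps1
Param (
    [Parameter(Mandatory=$True)][String]$InputObject
)
$output = "$InputObject to output"
return $output
# Invoked as in the question, Receive-Job now returns "input to output".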
As Lee_Dailey notes, $input is an automatic variable and shouldn't be assigned to (it is automatically managed by PowerShell to provide an enumerator of pipeline input in non-advanced scripts and functions).
Regrettably and unexpectedly, several automatic variables, including $input, can be assigned to: see this answer.
$input is a particularly insidious example, because if you use it as a parameter variable, any value you pass to it is quietly discarded, because in the context of a function or script $input invariably is an enumerator for any pipeline input.
Here's a simple example to demonstrate the problem:
PS> & { param($input) "[$input]" } 'hi'
# !! No output - the argument was quietly discarded.
That the built-in definition of $input takes precedence can be demonstrated as follows:
PS> 'ho' | & { param($input) "[$input]" } 'hi'
ho # !! pipeline input took precedence
While you can technically get away with using $input as a regular variable (rather than a parameter variable) as long as you don't cross scope boundaries, custom use of $input should still be avoided:
& {
    $input = 'foo'   # TO BE AVOIDED
    "[$input]"       # Technically works: -> '[foo]'
    & { "[$input]" } # FAILS, due to child scope: -> '[]'
}

Why does the scope of variables change depending on whether it's a .ps1 or .psm1 file, and how can this be mitigated?

I have a function that executes a script block. For convenience, the script block does not need to have explicitly defined parameters, but instead can use $_ and $A to refer to the inputs.
In the code, this is done as such:
$_ = $Value
$A = $Value2
& $ScriptBlock
This whole thing is wrapped in a function. Minimal example:
function F {
    param(
        [ScriptBlock]$ScriptBlock,
        [Object]$Value,
        [Object]$Value2
    )

    $_ = $Value
    $A = $Value2
    & $ScriptBlock
}
If this function is written in a PowerShell script file (.ps1), but imported using Import-Module, the behaviour of F is as expected:
PS> F -Value 7 -Value2 1 -ScriptBlock {$_ * 2 + $A}
15
PS>
However, when the function is written in a PowerShell module file (.psm1) and imported using Import-Module, the behaviour is unexpected:
PS> F -Value 7 -Value2 1 -ScriptBlock {$_ * 2 + $A}
PS>
Using {$_ + 1} instead gives 1. It seems that $_ has a value of $null instead. Presumably, some security measure restricts the scope of the $_ variable or otherwise protects it. Or, possibly, the $_ variable is assigned by some automatic process. Regardless, if only the $_ variable was affected, the first unsuccessful example would return 1.
Ideally, the solution would involve the ability to explicitly specify the environment in which a script block is run. Something like:
Invoke-ScriptBlock -Variables @{"_" = $Value; "A" = $Value2} -InputObject $ScriptBlock
In conclusion, the questions are:
Why can't script blocks in module files access variables defined in functions from which they were called?
Is there a method for explicitly specifying the variables accessible by a script block when invoking it?
Is there some other way of solving this that does not involve including an explicit parameter declaration in the script block?
Out of order:
Is there some other way of solving this that does not involve including an explicit parameter declaration in the script block?
Yes, if you just want to populate $_, use ForEach-Object!
ForEach-Object executes in the caller's local scope, which helps you work around the issue - except you won't have to, because it also automatically binds input to $_/$PSItem:
# this will work both in module-exported commands and standalone functions
function F {
    param(
        [ScriptBlock]$ScriptBlock,
        [Object]$Value
    )

    ForEach-Object -InputObject $Value -Process $ScriptBlock
}
Now F will work as expected:
PS C:\> F -Value 7 -ScriptBlock {$_ * 2}
14
Ideally, the solution would involve the ability to explicitly specify the environment in which a script block is run. Something like:
Invoke-ScriptBlock -Variables @{"_" = $Value; "A" = $Value2} -InputObject $ScriptBlock
Execute the script block using ScriptBlock.InvokeWithContext():
$functionsToDefine = @{
    'Do-Stuff' = {
        param($a, $b)
        Write-Host "$a - $b"
    }
}
$variablesToDefine = @(
    [PSVariable]::new("var1", "one")
    [PSVariable]::new("var2", "two")
)
$argumentList = @()

{Do-Stuff -a $var1 -b two}.InvokeWithContext($functionsToDefine, $variablesToDefine, $argumentList)
# -> prints: one - two
Or, wrapped in a function like your original example:
function F
{
    param(
        [scriptblock]$ScriptBlock,
        [object]$Value
    )

    $ScriptBlock.InvokeWithContext(@{}, @([PSVariable]::new('_', $Value)), @())
}
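Usage sketch, given the definition above:
PS C:\> F -Value 7 -ScriptBlock {$_ * 2}
14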
Now that you know how to solve your problem, let's get back to the question(s) about module scoping.
First, it's worth noting that you could actually achieve the above using modules, but sort of in reverse.
(In the following, I use in-memory modules defined with New-Module, but the module scope resolution behavior described is the same as when you import a script module from disk.)
While module scoping "bypasses" normal scope resolution rules (see below for explanation), PowerShell actually supports the inverse - explicit execution in a specific module's scope.
Simply pass a module reference as the first argument to the & call operator, and PowerShell will treat the subsequent arguments as a command to be invoked in said module:
# Our non-module test function
$twoPlusTwo = { return $two + $two }
$two = 2
& $twoPlusTwo # yields 4
# let's try it with explicit module-scoped execution
$myEnv = New-Module {
    $two = 2.5
}
& $myEnv $twoPlusTwo # Hell froze over, 2+2=5 (returns 5)
Why can't script blocks in module files access variables defined in functions from which they were called?
If they can, why can't the $_ automatic variable?
Because loaded modules maintain state, and the implementers of PowerShell wanted to isolate module state from the caller's environment.
Why might that be useful, and why might one preclude the other, you ask?
Consider the following example, a non-module function to test for odd numbers:
$two = 2
function Test-IsOdd
{
    param([int]$n)
    return $n % $two -ne 0
}
If we run the above statements in a script or at an interactive prompt, subsequently invoking Test-IsOdd should yield the expected result:
PS C:\> Test-IsOdd 123
True
So far, so great, but relying on the non-local $two variable carries a flaw in this scenario: if, somewhere in our script or in the shell, we accidentally reassign the variable $two, we might break Test-IsOdd completely:
PS C:\> $two = 1 # oops!
PS C:\> Test-IsOdd 123
False
This is expected since, by default, variable scope resolution just wanders up the call stack until it reaches the global scope.
But sometimes you might require state to be kept across executions of one or more functions, like in our example above.
Modules solve this by following slightly different scope resolution rules - module-exported functions defer to something we call module scope (before reaching the global scope).
To illustrate how this solves our problem from before, consider this module-exported version of the same function:
$oddModule = New-Module {
    function Test-IsOdd
    {
        param([int]$n)
        return $n % $two -ne 0
    }

    $two = 2
}
Now, if we invoke our new module-exported Test-IsOdd, we predictably get the expected result, regardless of "contamination" in the caller's scope:
PS C:\> Test-IsOdd 123
True
PS C:\> $two = 1
PS C:\> Test-IsOdd 123 # still works
True
This behavior, while maybe surprising, basically serves to solidify the implicit contract between the module author and the user: the module author doesn't need to worry too much about what's "out there" (the caller's session state), and the user can expect whatever is going on "in there" (the loaded module's state) to work correctly without worrying about what they assign to variables in the local scope.
Module scoping behavior is poorly documented in the help files, but is explained in some depth in chapter 8 of Bruce Payette's "PowerShell in Action" (ISBN 9781633430297).

Is there a PowerShell equivalent for Python's doctest module?

I've just come across the doctest module in Python, which helps you perform automated testing against example code that's embedded within Python doc-strings. This ultimately helps ensure consistency between the documentation for a Python module, and the module's actual behavior.
Is there an equivalent capability in PowerShell, so I can test examples in the .EXAMPLE sections of PowerShell's built-in help?
This is an example of what I would be trying to do:
function MyFunction ($x, $y) {
    <#
    .EXAMPLE
    > MyFunction -x 2 -y 2
    4
    #>
    return $x + $y
}
MyFunction -x 2 -y 2
You could do this, although I'm not aware of any all-in-one built-in way to do it.
Method 1 - Create a scriptblock and execute it
Help documentation is an object, so it can be leveraged to index into examples and their code. Below is the simplest example I could think of that executes your example code.
I'm not sure if this is what doctest does - it seems a bit dangerous to me, but it might be what you're after! It's the simplest solution, and I think it will give you the most accurate results.
Function Test-Example {
    param (
        $Module
    )

    # Get the examples
    $examples = Get-Help $Module -Examples

    # Loop over the code of each example
    foreach ($exampleCode in $examples.examples.example.code) {
        # create a scriptblock of your example code
        $scriptBlock = [scriptblock]::Create($exampleCode)
        # execute the scriptblock
        $scriptBlock.Invoke()
    }
}
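A usage sketch, assuming the MyFunction from the question has been dot-sourced into the session (and that its .EXAMPLE code line omits the leading "> " prompt, which [scriptblock]::Create() would otherwise treat as a redirection operator):
Test-Example -Module MyFunction
# -> 4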
Method 2 - Parse the example/function and make manual assertions
I think a potentially better way to do this would be to parse your example and parse the function to make sure it's valid. The downside is this can get quite complex, especially if you're writing complex functions.
Here's some code that checks the example has the correct function name, parameters and valid values. It could probably be refactored (first time dealing with [System.Management.Automation.Language.Parser]) and doesn't deal with advanced functions at all.
If you care about things like Mandatory, ParameterSetName, ValidatePattern etc this probably isn't a good solution as it will require a lot of extension.
Function Check-Example {
    param (
        $Function
    )

    # we'll use this to get the example command later
    # source: https://vexx32.github.io/2018/12/20/Searching-PowerShell-Abstract-Syntax-Tree/
    $commandAstPredicate = {
        param([System.Management.Automation.Language.Ast]$AstObject)
        return ($AstObject -is [System.Management.Automation.Language.CommandAst])
    }

    # Get the examples
    $examples = Get-Help $Function -Examples

    # Parse the function
    $parsedFunction = [System.Management.Automation.Language.Parser]::ParseInput((Get-Content Function:$Function), [ref]$null, [ref]$null)

    # Loop over the code of each example
    foreach ($exampleCode in $examples.examples.example.code) {
        # parse the example code
        $parsedExample = [System.Management.Automation.Language.Parser]::ParseInput($exampleCode, [ref]$null, [ref]$null)
        # get the command, which gives us useful properties we can use
        $parsedExampleCommand = $parsedExample.Find($commandAstPredicate, $true).CommandElements
        # check the command name is correct
        "Function is correctly named: $($parsedExampleCommand[0].Value -eq $Function)"
        # loop over the command elements; skip the first one, which we assume is the function name
        foreach ($element in ($parsedExampleCommand | select -Skip 1)) {
            "" # new line
            # check the parameter in the example exists in the function definition
            if ($element.ParameterName) {
                "Checking parameter $($element.ParameterName)"
                $parameterDefinition = $parsedFunction.ParamBlock.Parameters | where {$_.Name.VariablePath.UserPath -eq $element.ParameterName}
                if ($parameterDefinition) {
                    "Parameter $($element.ParameterName) exists"
                    # store the parameter name so we can use it to check the value, which we should find in the next loop
                    # this falls apart for switches, which have no value, so they'll need some additional logic
                    $previousParameterName = $element.ParameterName
                }
            }
            # check the value has the same type as defined in the function, or can at least be cast to it
            elseif ($element.Value) {
                "Checking value $($element.Value) of parameter $previousParameterName"
                $parameterDefinition = $parsedFunction.ParamBlock.Parameters | where {$_.Name.VariablePath.UserPath -eq $previousParameterName}
                "Parameter $previousParameterName has the same type: $($element.StaticType.Name -eq $parameterDefinition.StaticType.Name)"
                "Parameter $previousParameterName can be cast to correct type: $(-not [string]::IsNullOrEmpty($element.Value -as $parameterDefinition.StaticType))"
            }
            else {
                "Unexpected command element:"
                $element
            }
        }
    }
}
Method 3 - Use Pester (maybe out of scope)
I think this one is a bit off topic, but worth mentioning. Pester is the test framework for PowerShell and has features that could be helpful here. You could have a generic test that takes a script/function as argument and runs tests against the parsed examples/functions.
This could involve executing the script like in method 1, or checking the parameters like in method 2. Pester has a HaveParameter assertion that allows you to check certain things about your function.
HaveParameter documentation, copied from the link above:
Get-Command "Invoke-WebRequest" | Should -HaveParameter Uri -Mandatory
function f ([String] $Value = 8) { }
Get-Command f | Should -HaveParameter Value -Type String
Get-Command f | Should -Not -HaveParameter Name
Get-Command f | Should -HaveParameter Value -DefaultValue 8
Get-Command f | Should -HaveParameter Value -Not -Mandatory

PowerShell Switch Parameter to add to end of expression

Here's what I'm trying to do:
param([Switch]$myparameter)
If ($myparameter -eq $true) { $export = Export-CSV c:\temp\temp.csv }
Get-MyFunction | $export
If $myparameter is passed, export the data to said location. Else, just display the normal output (in other words, ignore the $export). What doesn't work here is setting $export to the "Export-csv...". Wrapping it in quotes does not work.
I'm trying to avoid an if, then statement saying "if it's passed, export this. If it's not passed, output data"
I have a larger module that everything works in so there is a reason behind why I am looking to do it this way. Please let me know if any additional information is needed.
Thank you everyone in advance.
tl;dr:
param([Switch] $myparameter)

# Define the core command as a *script block* (enclosed in { ... }),
# to be invoked later, either with operator . (no child variable scope)
# or & (with child variable scope)
$scriptBlock = { Get-MyFunction }

# Invoke the script block with . (or &), and pipe it to the Export-Csv cmdlet,
# if requested.
If ($myparameter) { # short for: ($myparameter -eq $True), because $myparameter is a switch
    . $scriptBlock | Export-Csv c:\temp\temp.csv
} else {
    . $scriptBlock
}
TessellatingHeckler's answer is concise, works, and uses a number of advanced features cleverly - however, while it avoids an if statement, as requested, doing so may not yield the best or most readable solution in this case.
What you're looking for is to store a command in a variable for later execution, but your own attempt to do so:
If ($myparameter -eq $true) { $export = Export-CSV c:\temp\temp.csv }
results in immediate execution, which is not only unintended, but fails, because the Export-Csv cmdlet is missing input in the above statement.
You can store a snippet of source code for later execution in a variable via a script block, simply by enclosing the snippet in { ... }, which in your case would mean:
If ($myparameter -eq $true) { $export = { Export-Csv c:\temp\temp.csv } }
Note that what you pass to if is itself a script block, but it is by definition one that is executed as soon as the if condition is found to be true.
A variable containing a script block can then be invoked on demand, using one of two operators:
., the "dot-sourcing" operator, which executes the script block in the current scope.
&, the call operator, which executes the script block in a child scope with respect to potential variable definitions.
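A quick demonstration of the difference between the two operators:
$sb = { $x = 42 }
& $sb; "after &: [$x]"   # -> after &: []   ($x was set only in a discarded child scope)
. $sb; "after .: [$x]"   # -> after .: [42] ($x persists in the current scope)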
However, given that you only need the pipeline with an additional command if switch $myparameter is specified, it's better to change the logic:
Store the shared core command, Get-MyFunction, in a script block, in variable $scriptBlock.
Invoke that script block in an if statement, either standalone (by default), or by piping it to Export-Csv (if -MyParameter was specified).
I'm trying to avoid an if, then statement
Uh, if you insist...
param([Switch]$myparameter)

# A pair of (cmdlet, parameter-hashtable) entries, indexed by the switch:
# as an array index, $myparameter coerces to 0 ($false) or 1 ($true).
$cmdlet, $params = (('Write-Output', @{}),
                    ('Export-Csv', @{'LiteralPath'='c:\temp\temp.csv'}))[$myparameter]

Get-MyFunction | & $cmdlet @params