I am trying to call a PS script via batch file, like so
Powershell.exe -file "C:\Scripts\Blah\Blah\Blah.ps1" -webUID "usernameValue" -webPWD "passwordValue" -Param "param value" -Param2 "param 2 value"
The issue seems to be that the batch file is confusing Param and Param2. It thinks I am setting Param2 twice; however, Param and Param2 are separate parameters altogether. Has anyone experienced this? Is there perhaps a way to explicitly state the param names? Thanks
Param block
# Parameters
Param
(
[string]$WebUID,
[string]$WebPWD,
[string]$Param,
[string]$Param2
)
In an effort to support concise command-line use, PowerShell's "elastic syntax" allows specifying unambiguous prefix substrings of parameter names so that you only need to type as much of a parameter name as is necessary to identify it without ambiguity;
e.g., typing -p to refer to -Path is enough, if no other parameters start with p.
However, an exact match is always recognized, so specifying -Param in your case should unambiguously match the -Param parameter, even though its full name happens to be a prefix substring of the different parameter -Param2.
If the problem were an issue of ambiguity (it isn't), you'd see a different error message. For instance, were you to use the ambiguous -Para, you'd see:
Parameter cannot be processed because the parameter name 'para' is ambiguous. Possible matches include: -Param -Param2.
Instead, the wording of your error message suggests that the exact same parameter name - -Param2 - was indeed specified more than once - even though your sample code doesn't show that.
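For instance, a duplicate specification like the following (a hypothetical repro, not necessarily your actual command line) produces an error along those lines:
Powershell.exe -file "C:\Scripts\Blah\Blah\Blah.ps1" -webUID "usernameValue" -webPWD "passwordValue" -Param2 "a" -Param2 "b"
# -> Cannot bind parameter because parameter 'Param2' is specified more than once. ...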
I've tested the behavior in PSv2 and PSv5.1 / 6.0 alpha 10 - it's conceivable, however, that other versions act differently due to a bug. Do let us know.
Consider an alternative approach:
If you invoked your script from within PowerShell, you could use a single, array-valued parameter - e.g. [string[]] $Params - and then simply pass as many parameters as needed, comma-separated, without needing to specify a distinct parameter name for each value.
Sadly, when invoking a script from outside of PowerShell, this approach won't work, because passing arrays isn't supported from the outside.
There is a workaround, however:
Declare the array-valued parameter decorated with [parameter(ValueFromRemainingArguments=$true)]
Invoke the script with the parameters as a space-separated list at the end of the command.
Applied to your scenario:
If your script defined its parameters as follows:
Param
(
[string]$WebUID,
[string]$WebPWD,
[parameter(ValueFromRemainingArguments=$true)]
[string[]] $Params
)
You could then invoke your script as follows:
Powershell.exe -file "C:\Scripts\Blah\Blah\Blah.ps1" `
-webUID "usernameValue" `
-webPWD "passwordValue" `
"param value" "param 2 value"
and $Params would receive an array of values: $Params[0] would receive param value, and $Params[1] would receive param 2 value.
Note that when calling from outside of PowerShell:
you must not use parameter name -Params in the invocation - just specify the values at the end.
you must not use , to separate the values - use spaces.
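Inside the script, you could then map the array elements back to individual values if needed; a minimal sketch, assuming the invocation order shown above:
# Inside Blah.ps1 (sketch):
$Param  = $Params[0]  # 'param value'
$Param2 = $Params[1]  # 'param 2 value'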
I'm no guru, but this looks like it's related to "Partial Parameters" and "Parameter Completion". See this article for more information.
Simply changing Param to Param1 should fix the issue.
Related
I have a PS script I call from a Windows shortcut. I drop several files or directories on it, and it works fine.
I would like to add some optional named parameters (let's call them -Param1 and -Param2) that can be used, of course, only from a PowerShell prompt.
param (
[switch]$CreateShortcut
)
A switch parameter works.
But if I add a string parameter:
param (
[switch]$CreateShortcut,
[string]$Param1
)
Of course, it no longer works when I call my script through the Windows shortcut: $Param1 receives the first file.
Is there a solution?
Thanks
When you drop files/folders on a shortcut file, their full paths are passed as individual, unnamed arguments to the shortcut's executable (script).
PowerShell allows you to collect such unnamed arguments in a single, array-valued parameter, by declaring it as ValueFromRemainingArguments:
[CmdletBinding(PositionalBinding=$false)]
param (
[switch] $CreateShortcut,
# Collect all unnamed arguments in this parameter:
[Parameter(ValueFromRemainingArguments)]
[string[]] $FilesOrFolders
)
[CmdletBinding(PositionalBinding=$false)] ensures that any parameters not explicitly marked with a Position property must be passed as named arguments (i.e., the argument must be preceded by the name of the target parameter, e.g. -Path foo).
This isn't necessary to support [switch] parameters, because they are implicitly named-only, but it allows you to support additional, non-[switch] parameters that can only be bound by name (i.e., by explicitly naming them in the invocation).
Alternatively, if you do not need support for additional predeclared non-switch parameters, you can omit [CmdletBinding(PositionalBinding=$false)] and the $FilesOrFolders parameter declaration and access any arguments that do not bind to predeclared parameters via the automatic $args variable.
Generally, note that use of a [Parameter()] attribute on any of the predeclared parameters would make $args unavailable, as the presence of [CmdletBinding()] does.
The reason is that the use of either attribute makes a script or function an advanced one, i.e., makes it cmdlet-like, and therefore disallows passing arguments that do not bind to declared parameters; to put it differently: $args is then by definition always empty (an empty array).
Advanced scripts or functions automatically gain additional features, notably support for common parameters such as -Verbose.
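For example, dropping two items on the shortcut effectively results in an invocation along these lines (hypothetical paths; your shortcut's actual target line may differ), with the dropped paths landing in $FilesOrFolders:
powershell.exe -File C:\path\to\YourScript.ps1 "C:\Data\file1.txt" "C:\Data\Folder2"
# Inside the script:
#   $CreateShortcut -> $false (not specified)
#   $FilesOrFolders -> @('C:\Data\file1.txt', 'C:\Data\Folder2')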
Issue
The called PowerShell script accepts parameters, but not all of them:
Current Set-Up and code:
I have a common folder where two .ps1 scripts are located:
DoWork.ps1
Workmanager.ps1
Workmanager.ps1 calls DoWork.ps1:
$targetPath="M:\target"
echo "target path: $targetPath"
start powershell {.\DoWork.ps1 -target $targetPath -tempdrive D:\}
output (as expected):
target path: M:\target
DoWork.ps1 contains some start code:
param
(
[string]$tempdrive,
[string]$target,
[int] $threads = 8,
[int] $queuelength = -1
)
echo "variables:"
echo "temp drive: $tempdrive"
echo "target path: $target"
Unexpectedly, $target is not being assigned. Previously I had the variable named $targetpath, which did not work either.
variables:
temp drive: D:\
target path:
Findings
It appears that the issue lies in Workmanager.ps1. Specifying the parameter as a fixed string rather than as a variable passes the parameter correctly. Any solution for this?
start powershell {.\DoWork.ps1 -target "foo" -tempdrive D:\}
When you use a ScriptBlock as an argument to powershell.exe, variables aren't going to be evaluated until after the new session starts. $targetPath has not been set in the child PowerShell process called by Workmanager.ps1 and so it has no value. This is actually an expected behavior of a ScriptBlock in general and behaves this way in other contexts too.
The solution is mentioned in the help text for powershell -?:
[-Command { - | <script-block> [-args <arg-array>] <========== THIS GUY
| <string> [<CommandParameters>] } ]
You must provide the -args parameter, whose values will be passed to the ScriptBlock on execution (separate multiple arguments with a ,). Arguments are passed positionally, and must be referenced as though you were processing the arguments to a function manually, using the $args array. For example:
$name = 'Bender'
& powershell { Write-Output "Hello, $($args[0])" } -args $name
However, especially with more complicated ScriptBlock bodies, having to remember which index of $args[i] contains the value you want at a given time is a pain in the butt. Luckily, we can use a little trick with defining parameters within the ScriptBlock to help:
$name = 'Bender'
& powershell { param($name) Write-Output "Hello, $name" } -args $name
This will print Hello, Bender as expected.
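Applied to your Workmanager.ps1 scenario, a sketch (using the same variable names; see also the note below about not needing Start-Process):
$targetPath = "M:\target"
# Bind the passed values to parameters declared inside the script block:
& powershell { param($target, $tempdrive) .\DoWork.ps1 -target $target -tempdrive $tempdrive } -args $targetPath, 'D:\'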
Some additional pointers:
The ScriptBlock can be multiline, as though you were defining a function. The examples above are single-line due to their simplicity.
A ScriptBlock is just an unnamed function, which is why defining parameters and referencing arguments within one works the same way.
To exemplify this behavior outside of powershell.exe -Command, Invoke-Command requires you to pass variables to its ScriptBlock in a similar fashion. Note, however, that that answer uses an already-defined function body as the ScriptBlock (which is totally valid to do).
You don't need to use Start-Process here (start is its alias), at least as demonstrated in your example. You can simply use the call operator & unless you need to do something more complex than "run the program and wait for it to finish". See this answer of mine for more information.
If you opt to pass a string to powershell.exe instead, you don't need to provide arguments and your variables will get rendered in the current PowerShell process. However, so will any other unescaped variables that might be intended to be set within the child process, so be careful with this approach. Personally, I prefer using a ScriptBlock regardless, and just deal with the extra parameter definition and arguments.
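For instance, a sketch of that string-based alternative ($targetPath is expanded in the current session before the child process starts):
powershell -Command ".\DoWork.ps1 -tempdrive D:\ -target $targetPath"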
Using the call & operator is optional when you are not executing a path rendered as a string. It can be omitted in the examples above, but is more useful like so:
& "C:\The\Program Path\Contains\spaces.exe"
& $programPathAsAVariable
I was wondering if there's any simple way to make aliases for powershell like cmd.
For example: In cmd, doskey art=php artisan $* where $* is optional. Currently, I'm using the following alias in powershell.
function runArtisanCommand
{
param(
[Parameter(Mandatory=$false, Position = 0, ValueFromRemainingArguments = $true)]
$command
)
php artisan $command
}
Set-Alias art runArtisanCommand
This works somewhat, but it doesn't take flags. For example, I can't write art -h or art route:list -c. With art -h, it prints the output of php artisan and doesn't read the flag at all, but with art route:list -c, it errors out with:
runArtisanCommand : Missing an argument for parameter 'command'. Specify a parameter of type 'System.Object' and try again.
At line:1 char:16
+ art route:list -c
+ ~~
+ CategoryInfo : InvalidArgument: (:) [runArtisanCommand], ParameterBindingException
+ FullyQualifiedErrorId : MissingArgument,runArtisanCommand
I would love a simpler solution than this. Thanks in advance.
The simplest and most convenient way to pass unknown arguments through is by splatting
the automatic $args array - as @args - in a simple function or script (one that uses neither the [CmdletBinding()] attribute nor [Parameter()] attributes):
# Note: @args rather than $args makes the function work with named
# arguments for PowerShell commands too - see explanation below.
function runArtisanCommand { php artisan @args }
# As in your question: Define alias 'art' for the function
# Note: Of course, you could directly name your *function* 'art'.
# If you do want the function to have a longer name, consider one
# that adheres to PowerShell's Verb-Noun naming convention, such as
# 'Invoke-ArtisanCommand'.
Set-Alias art runArtisanCommand
As an aside: Since the target executable, php, is neither quoted nor specified based on a variable or expression, it can be invoked as-is; otherwise, you would need &, the call operator - see this answer for background information.
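With this simple function in place, flags pass straight through; for example (a usage sketch):
art -h             # -h ends up in $args and is passed through to php artisan
art route:list -c  # so do route:list and -c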
As for what you tried:
The problem was that use of -c as a pass-through argument only works if you precede it with --:
# OK, thanks to '--'
art -- route:list -c
-- tells PowerShell to treat all remaining arguments as unnamed (positional) arguments, instead of trying to interpret tokens such as -c as parameter names.
Without --, -c is interpreted as referring to your -command parameter (the parameter you declared as $command with ValueFromRemainingArguments = $true), given that PowerShell allows you to specify name prefixes in lieu of full parameter names, as long as the given prefix is unambiguous.
Because a parameter of any type other than [switch] requires an associated argument, -c (aka -command) failed with an error message to that effect.
You could have avoided the collision by naming your parameter so that it doesn't collide with any pass-through parameters, such as by naming it -_args (with parameter variable $_args):
function runArtisanCommand
{
param(
# Note: `Mandatory = $false` and `Position = 0` are *implied*.
[Parameter(ValueFromRemainingArguments)]
$_args
)
php artisan @_args
}
However, given that use of a [Parameter()] attribute implicitly makes your function an advanced function, it invariably also accepts common parameters, such as -ErrorAction, -OutVariable, -Verbose... - all of which can be passed by unambiguous prefix / short alias too; e.g., -outv for -OutVariable, or alias -ea for -ErrorAction; collisions with them cannot be avoided.
Therefore, intended pass-through arguments such as -e still wouldn't work:
# FAILS, because -e ambiguously matches common parameters -ErrorAction
# and -ErrorVariable.
PS> art route:list -e
Parameter cannot be processed because the parameter name 'e' is ambiguous.
Possible matches include: -ErrorAction -ErrorVariable.
Again, -- is needed:
# OK, thanks to '--'
art -- route:list -e
Summary:
Especially for functions wrapping calls to external programs, such as php.exe, using a simple function with @args, as shown at the top, is not only simpler, but also more robust.
For functions wrapping PowerShell commands (with explicitly declared parameters):
a simple function with @args works too,
but if you also want support for tab-completion and showing a syntax diagram with the supported parameters, by passing -?, or via Get-Help, consider defining an (invariably advanced) proxy (wrapper) function via the PowerShell SDK - see below.
Optional background information: Pass-through arguments in PowerShell
As Mathias R. Jessen points out, the simplest way to pass (undeclared) arguments passed to a function or script through to another command is to use the automatic $args variable, which is an automatically populated array of all the arguments passed to a simple function or script (one that isn't advanced, through use of the [CmdletBinding()] and/or [Parameter()] attributes).
As for why @args (splatting) rather than $args should be used:
Using $args as-is in your wrapper function only works for passing positional arguments through (those not prefixed by the parameter name; e.g., *.txt), as opposed to named arguments (e.g., -Path *.txt).
If the ultimate target command is an external program (such as php.exe in this case), this isn't a problem, because PowerShell of necessity then treats all arguments as positional arguments (it cannot know the target program's syntax).
However, if a PowerShell command (with formally declared parameters) is ultimately called, only splatting the $args array - which syntactically means use of @args instead - supports passing named arguments through.[1]
Therefore, as a matter of habit, I suggest always using @args in simple wrapper functions, which equally works with external programs.[2]
To give an example with a simple wrapper function for Get-ChildItem:
# Simple wrapper function for Get-ChildItem that lists recursively
# and by relative path only.
function dirTree {
# Use @args to make sure that named arguments are properly passed through.
Get-ChildItem -Recurse -Name @args
}
# Invoke it with a *named* argument passed through to Get-ChildItem
# If $args rather than @args were used inside the function, this call would fail.
dirTree -Filter *.txt
Using a proxy function for more sophisticated pass-through processing:
The use of @args is convenient, but comes at the expense of not supporting the following:
tab-completion, given that tab-completion only works with formally declared parameters (typically with a param(...) block).
showing a syntax diagram with the supported parameters, by passing -?, or via Get-Help
To overcome these limitations, the parameter declarations of the ultimate target command must be duplicated in the (then advanced) wrapper function; while that is cumbersome, PowerShell can automate the process by scaffolding a so-called proxy (wrapper) function via the PowerShell SDK - see this answer.
Note:
With respect to common parameters such as -ErrorAction, it is the proxy function itself that (automatically) processes them, but that shouldn't make a difference to the caller.
Scaffolding a proxy function only works with PowerShell commands, given that PowerShell has no knowledge of the syntax of external programs.
However, you can manually duplicate the parameter declarations of the external target program.
[1] Note that the automatic $args array has built-in magic to support this; passing named arguments through with splatting is not supported with a custom array and requires use of a hash table instead, as discussed in the help topic about splatting linked to above.
[2] In fact, only @args also supports the correct interpretation of --%, the stop-parsing symbol.
I'm struggling to understand the outputs of the below function
function testApp
{
param(
[string] $appName,
[switch] $sw = $false,
[string[]] $test,
[string[]] $test2
)
Write-Host $appName - $sw - $test - $test2
}
testApp -appName "TestApp" -sw $true -test "one", "two" -test2 "three","four"
Output: TestApp - True - one two - three four
testApp -appName "TestApp" -sw $true -test "one", "two"
Output: TestApp - True - one two - True
The first output is as expected. But I cannot understand why the second output has "True" for the test2 array when I did not pass it. Can anyone help me in understanding the reason for the behavior? Thanks.
To summarize and complement the helpful comments on the question by Lee_Dailey, Matthew and mclayton:
[switch] parameters in PowerShell (aka flags in other shells):
switch parameters are meant to imply $true vs. $false by their presence in an invocation: e.g., passing -sw by itself signals $true, whereas omitting -sw signals $false.
It is possible to pass a Boolean value explicitly, for the purpose of passing a programmatically determined value; e.g.: -sw:$var
Note the required : following the switch name, which tells PowerShell that the Boolean value belongs to the switch parameter; without it, PowerShell thinks the value is a positional argument meant for a different parameter (see below).
Caveat: Commands may interpret -sw:$false differently from omitting -sw; a prominent example is the use of the common parameter -Confirm:$false to override the effective $ConfirmPreference value.
If you need to make this distinction in your own code, use $PSBoundParameters.ContainsKey('sw') -and -not $sw to detect the -sw:$false case.
Do not assign a default value to a switch parameter variable: while technically possible, the convention is that switches default to $false (which is a [switch] instance's default value anyway); that is, a [switch] parameter should always have opt-in logic.
A [switch] parameter variable effectively behaves like a Boolean value in most contexts:
That is, you can say if ($sw) { ... }, for instance.
If you need to access the wrapped Boolean value explicitly, access the .IsPresent property (note that the property name is somewhat confusing, because in a -sw:$false invocation the switch is still present, but its value, as reflected in .IsPresent, is $false).
An example of where .IsPresent is needed is the use of a Boolean as an implicit array index, notably to emulate a ternary conditional[1]: ('falseValue', 'trueValue')[$sw.IsPresent]; without the .IsPresent, the effective Boolean value wouldn't be recognized as such and wouldn't automatically be mapped to index 0 (from $false) or 1 (from $true).
Ultimately, your problem was that you thought $true was an argument for -sw, whereas it became a positional argument implicitly bound to the -test2 parameter.
[switch] parameters never need a value, so the next argument becomes a separate, positional argument - unless you explicitly indicate that the argument belongs to the switch by following the switch name with :, as shown above.[2]
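In other words, to get the output you expected without $test2 being bound, invoke it like this (a sketch against your own function):
testApp -appName "TestApp" -sw -test "one", "two"        # -sw by itself implies $true
testApp -appName "TestApp" -sw:$true -test "one", "two"  # or with an explicit value
# Output in both cases: TestApp - True - one two -   ($test2 stays empty)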
Positional vs. named argument passing in PowerShell:
Terminology note: For conceptual clarity, the term argument is used to refer to a value passed to a declared parameter. This avoids the ambiguity of using parameter to refer situationally either to the language construct that receives a value or to a given value itself.
Named argument passing (binding) refers to explicitly placing the target parameter name before the argument (typically separated by a space, alternatively by :); e.g., -AppName foo.
The order in which named arguments are passed never matters.
Positional (unnamed) argument passing refers to passing an argument without preceding it by the name of its target parameter; e.g., foo.
The passing is positional in the sense that the relative position (order) among other unnamed arguments determines what target parameter is implied.
[switch] parameters are the exception in that they:
are typically passed by name only (-sw), implying value $true, and if a value is passed, require : to separate the name from the value.
never support positional binding.
You may combine named passing with positional passing, in which case the named arguments are bound first, after which the positional ones are then considered (in order) for binding to the not-yet-bound parameters.
PowerShell functions are simple functions by default. In order to exercise control over positional binding, use of the [CmdletBinding()] and / or [Parameter()] attributes is necessary (see below), which invariably turn a simple function into an advanced function.
Making a simple function an advanced one has larger behavioral implications (mostly beneficial ones), which are detailed in this answer.
By default, PowerShell functions accept positional arguments for any parameter (other than those of type [switch]), in the order in which the parameters were declared.
Additionally, simple functions accept arbitrary additional arguments for which no parameters were declared, which are collected in the automatic $args array variable.
To prevent your function from accepting any positional arguments by default, place a [CmdletBinding(PositionalBinding=$false, ...)] attribute above the param(...) block.
Since this makes your function an advanced one, this also disables passing arbitrary additional arguments ($args no longer applies and isn't populated).
As an aside: when you implement a cmdlet (a command implemented as a binary, typically via C#), this behavior is implied.
To selectively support positional arguments, decorate individual parameter declarations with a [Parameter(Position=<n>, ...)] attribute (e.g., [Parameter(Position=0)] [string] $Path).
Note: Whether you start your numbering with 0 or 1 doesn't matter, as long as the numbers used reflect the desired ordering among all positional parameters; 0 is advisable as a self-documenting convention, because it is unambiguous.
Attribute [Parameter(Position=<n>)] is an explicit opt-in that selectively overrides [CmdletBinding(PositionalBinding=$false)]: that is, the latter disables positional binding unless explicitly indicated by individual parameter declarations; in fact, the latter is implied by the former, in that once you use one [Parameter(Position=<n>)] attribute, you must use it on all other parameters you want to bind positionally as well.
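To illustrate these attributes together, a minimal sketch for a hypothetical script.ps1 (not tied to your function):
[CmdletBinding(PositionalBinding=$false)]
param (
  [Parameter(Position=0)] [string] $Path,  # may be passed positionally or by name
  [string] $Label,                         # must now be passed by name: -Label <value>
  [switch] $Force                          # switches are named-only anyway
)
# OK:     .\script.ps1 foo -Label bar -Force
# FAILS:  .\script.ps1 foo bar    # 'bar' cannot bind positionally to -Label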
[1] Note that PowerShell [Core] 7.0+ supports ternary conditionals natively: $sw ? 'trueValue' : 'falseValue'
[2] In effect, [switch] parameters are the only type for which PowerShell supports an optional argument. See this answer for more information.
Here's my script (test.ps1):
[CmdLetBinding()]
Param
(
[parameter(Mandatory=$true)][string]$environment,
[switch][bool]$continue=$true
)
Write-Host $environment
Write-Host $continue
Question:
If I invoke this script by giving an argument which is a substring of the parameter I specified in the script like this: PS> .\test.ps1 -envi:blah, PowerShell doesn't seem to check the argument name. I want PowerShell to enforce parameter spelling, i.e., it should only accept -environment which matches the parameter name in the script. For anything else, it should raise an exception. Is that doable? How do I do that?
Thanks.
It's not pretty, but it will keep you from using anything except -environment as a parameter name.
Param
(
[parameter(Mandatory=$true)][string]$environment,
[parameter()]
[ValidateScript({throw "Invalid parameter. 'environment' required."})]
[string]$environmen,
[switch][bool]$continue=$true
)
Write-Host $environment
Write-Host $continue
Edit: As Matt noted in his comment, the automatic disambiguation will force you to specify enough of the parameter name to find a unique substring match. What I'm doing here is basically giving it a decoy parameter that matches all but the last character of -environment, so that any shorter prefix is rejected as ambiguous, and throwing an error to prevent you from using the decoy itself.
And, FWIW, that could well be the ugliest parameter validation I've ever done but I don't have any better ideas right now.
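With the decoy parameter in place, the expected behavior is roughly (a sketch):
PS> .\test.ps1 -envi:blah
# -> parameter name 'envi' is ambiguous. Possible matches include: -environment -environmen.
PS> .\test.ps1 -environmen blah
# -> the ValidateScript block throws: Invalid parameter. 'environment' required.
PS> .\test.ps1 -environment blah
# -> binds as intended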