Wrapper function in PowerShell: Pass remaining parameters - powershell

I’m trying to write a wrapper function in PowerShell that evaluates its first parameter and, based on that, runs a program on the computer. All remaining parameters passed to the wrapper function should then be passed through to the program that is run.
So it should look something like this:
function test ( [string] $option )
{
    if ( $option -eq 'A' )
    {
        Write-Host $args
    }
    elseif ( $option -eq 'B' )
    {
        . 'C:\Program Files\some\program.exe' $args
    }
}
Now just adding $args does not work, so what do I have to do to make it work? Another option would probably be using Invoke-Expression, but that feels a bit like eval, so I want to avoid it if possible; in addition, I think doing it that way would limit me to string-only parameters, right? If possible I would want full support for the wrapped program/cmdlet - basically like a dynamic alias. Is that even possible?

This sort of does what you ask. You may run into trouble if you need to pass dash-prefixed options to the executable that conflict or cause ambiguity with the PowerShell common parameters. But this may get you started.
function Invoke-MyProgram
{
    [CmdletBinding()]
    Param
    (
        [parameter(mandatory=$true, position=0)][string]$Option,
        [parameter(mandatory=$false, position=1, ValueFromRemainingArguments=$true)]$Remaining
    )
    if ($Option -eq 'A')
    {
        Write-Host $Remaining
    }
    elseif ($Option -eq 'B')
    {
        & 'C:\Program Files\some\program.exe' @Remaining # NOTE: @ not $ (splatting)
    }
}

What you have written does work. Note that $args contains the unnamed arguments - those over and above the parameters expected by the function.
So if you call test as
test -option "A" 1 2 3
$args will have 1,2,3
Note that if you call test as
test -option "A" -other "B" 1 2 3
$args will have -other,B,1,2,3
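To see this in action, here is a minimal, runnable sketch (the function simply echoes what it received; the values are illustrative):
function test ( [string] $option )
{
    "option = $option"
    "args   = $($args -join ', ')"
}
test -option "A" 1 2 3             # -> option = A / args = 1, 2, 3
test -option "A" -other "B" 1 2 3  # -> option = A / args = -other, B, 1, 2, 3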

Your solution works as-is for external programs (such as your C:\Program Files\some\program.exe example): you can always pass an array of values (which is what $args is) to an external program, and its elements will be passed as individual arguments (stringified, if necessary).
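For instance, a minimal sketch of the external-program case (cmd.exe's built-in echo merely stands in for the wrapped program):
function Show-Args { cmd /c echo $args }
Show-Args hello 42 world   # each element of $args becomes a separate argument -> hello 42 world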
You can make your solution work with any command if you change $args to @args[1], to take advantage of a PowerShell parameter-passing technique called splatting:
function test ( [string] $option )
{
    if ( $option -eq 'A' )
    {
        Write-Host $args
    }
    elseif ( $option -eq 'B' )
    {
        # Use @args to also support passing *named* arguments
        # through to *PowerShell* commands.
        & $someCommand @args
    }
}
Caveats:
The automatic $args variable, which collects all arguments for which no parameter was declared, is only available in simple (non-advanced) functions and scripts; advanced functions and scripts - those that use the [CmdletBinding()] attribute and/or [Parameter()] attributes - require that all potential parameters be declared.
PowerShell has built-in magic that makes the automatic array variable $args also support passing named parameters through via splatting, which no custom array or collection supports.
By contrast, custom arrays and collections only support splatting positional (unnamed) arguments, which, however, covers all calls to external programs.
When calling PowerShell commands, this limitation is problematic: for instance, if you wanted to pass the named argument -Path C:\ through to the Set-Location cmdlet via splatting, using a custom collection parameter declared via ValueFromRemainingArguments, as shown in OldFart's answer (Set-Location @Remaining), would not work; to support passing through named arguments (other than via @args, if available), you must use hashtable-based splatting.
Therefore, if your function is an advanced one and you need to support passing named arguments through to other PowerShell commands, a different approach is required: this answer shows two alternatives.
[1] With external programs, there is a corner case where @args behaves differently from $args, namely if the $args array contains --%, the stop-parsing symbol: @args recognizes it, $args treats it as a literal.
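A minimal sketch contrasting the two behaviors (Set-Location serves purely as an example target; the wrapper names are illustrative):
function Invoke-ViaArgs { Set-Location @args }   # simple function: @args preserves named arguments
function Invoke-ViaRest {
    param([Parameter(ValueFromRemainingArguments)] $Rest)
    Set-Location @Rest                           # custom collection: elements are splatted positionally
}
Invoke-ViaArgs -Path C:\   # works: -Path C:\ is passed through as a named argument
Invoke-ViaRest -Path C:\   # fails: "A positional parameter cannot be found that accepts argument 'C:\'."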

Related

Powershell console different to script

Can anyone tell me why this command works fine in the PowerShell console, returning a single thumbprint, but when run as a script it just returns all the certificates' thumbprints:
$crt = (Get-ChildItem -Path Cert:\LocalMachine\WebHosting\ | Where-Object {$_.Subject.Contains($certcn)}).thumbprint
$certcn is a string containing a domain. eg "www.test.com"
I figured it out. $certcn was derived from $args[0]. It turns out $args[0] is not a string, and even though PS would quite happily use it as a string in other commands, it would not do this with Where-Object.
Not sure what type $args[0] actually is, but doing $certcn = $args[0].tostring() fixed it.
The only explanation for your symptom is that the value of variable $certcn:
either: is the empty string ('') because 'someString'.Contains('') returns $true for any input string.
or: is implicitly converted to the empty string, though that wouldn't happen often in practice; here are some examples (see GitHub issue # for the [pscustomobject] stringification bug mentioned below):
# An empty array stringifies to ''
'someString'.Contains(@()) # -> $true
# A single-element array containing $null stringifies to ''
'someString'.Contains(@($null)) # -> $true
# Due to a longstanding bug, [pscustomobject] instances, when
# stringified via .ToString(), convert to the empty string.
# This makes the command equivalent to `.Contains(@(''))`, which is again
# the same as `.Contains('')`
'someString'.Contains(@([pscustomobject] @{ foo=1 })) # -> $true
$args[0] is not a string
The automatic $args variable is an array that contains all positional arguments that weren't bound to declared parameters, if any.
$args can contain elements of any data type, and the type of each element is solely determined by the caller.
However, if you formally declare a parameter, you can type it, which means that if the caller passes an argument of a different data type, an attempt is made to convert the argument to the parameter's type (which may fail, but at least the failure will be "loud", and the reason obvious).
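For example, a hypothetical function with a typed parameter; the point is only that a mismatched argument fails loudly instead of being silently misused:
function Show-Count { param([int] $Count) $Count }
Show-Count -Count 42       # OK -> 42
Show-Count -Count 'many'   # fails loudly: cannot convert value "many" to type "System.Int32"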
A robust solution for your script:
param(
    [Parameter(Mandatory)] # Ensure that the caller passes a value.
    [string] $CertCN       # Type-constrain to a string.
    # , ... declare other parameters as needed
)

# $CertCN is now guaranteed to be a *string* that is *non-empty*.
$crt =
    (Get-ChildItem -Path Cert:\LocalMachine\WebHosting |
        Where-Object { $_.Subject.Contains($CertCN) }).thumbprint
Note:
The use of the [Parameter()] attribute in the parameter declaration block (param(...)) makes your script an advanced one, which means that $args isn't supported, requiring all arguments to bind to explicitly declared parameters; however, you can define a catch-all parameter with [Parameter(ValueFromRemainingArguments)], if needed. (The other thing that makes a script or function an advanced one is use of the [CmdletBinding()] attribute above the param(...) block as a whole.)
[Parameter(Mandatory)], in addition to ensuring that the caller passes a value for the parameter, implicitly also prevents passing the empty string (or $null) - though you could explicitly allow that with [AllowEmptyString()]
Additionally, advanced scripts and functions automatically prevent passing arrays to [string]-typed parameters, which is desirable. (By contrast, simple functions and scripts simply stringify arrays, as would happen in expandable strings (string interpolation); e.g., & { param([string] $foo) $foo } 1, 2 binds '1 2', which is also what you'd get with "$(1, 2)")
Caveat:
When passing a value to a [string]-typed parameter, PowerShell accepts a scalar (non-collection) of any type, and any non-string type is automatically converted to a string, via .ToString(). This is usually desirable and convenient, but can result in useless stringifications; e.g.:
& { param([string] $str) $str } @{} # -> 'System.Collections.Hashtable'
Instances of hashtables (@{ ... }) stringify to their type name, which is unhelpful, and this is the default behavior for any type that doesn't explicitly implement a meaningful string representation by overriding the .ToString() method.
If that is a concern, you can modify your script to ensure that the argument value being passed already is a string, using a [ValidateScript()] attribute.
param(
    [Parameter(Mandatory)]
    # Ensure that the argument value is a [string] to begin with.
    # Note: The `ErrorMessage` property requires PowerShell (Core) 7+
    [ValidateScript({ $_ -is [string] }, ErrorMessage='Please pass a *string* argument.')]
    $CertCN # Do not type-constrain, so that the original type can be inspected.
    # , ... declare other parameters as needed
)
# ...
As stated in the code comments, use of the ErrorMessage property requires PowerShell (Core) 7+, unfortunately. In Windows PowerShell a standard error message is invariably shown, which isn't user-friendly at all.
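If Windows PowerShell compatibility is needed, one possible workaround is to throw a custom message from inside the validation script block itself, which then surfaces in the resulting error (a minimal sketch):
param(
    [Parameter(Mandatory)]
    [ValidateScript({
        if ($_ -is [string]) { $true } else { throw 'Please pass a *string* argument.' }
    })]
    $CertCN
)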

start powershell wont accept variable as parameter

Issue
The called PowerShell script will accept parameters, but not all of them:
Current Set-Up and code:
I have a common folder where two .ps1 scripts are located:
DoWork.ps1
Workmanager.ps1
Workmanager.ps1 calls the Dowork.ps1:
$targetPath="M:\target"
echo "target path: $targetPath"
start powershell {.\DoWork.ps1 -target $targetPath -tempdrive D:\}
output (as expected):
target path: M:\target
DoWork.ps1 contains some start code:
param
(
    [string]$tempdrive,
    [string]$target,
    [int] $threads = 8,
    [int] $queuelength = -1
)
echo "variables:"
echo "temp drive: $tempdrive"
echo "target path: $target"
Unexpectedly, $target is not being assigned. Previously I had the variable named $targetpath, which did not work either.
variables:
temp drive: D:\
target path:
Findings
It appears that the issue lies in Workmanager.ps1. Specifying the parameter as a fixed string rather than as a variable will load the parameter. Any solution for this?
start powershell {.\DoWork.ps1 -target "foo" -tempdrive D:\}
When you use a ScriptBlock as an argument to powershell.exe, variables aren't going to be evaluated until after the new session starts. $targetPath has not been set in the child PowerShell process called by Workmanager.ps1 and so it has no value. This is actually an expected behavior of a ScriptBlock in general and behaves this way in other contexts too.
The solution is mentioned in the help text for powershell -?:
[-Command { - | <script-block> [-args <arg-array>] <========== THIS GUY
| <string> [<CommandParameters>] } ]
You must provide the -args parameter which will be passed to the ScriptBlock on execution (separate multiple arguments with a ,). Passed arguments are passed positionally, and must be referenced as though you were processing the arguments to a function manually using the $args array. For example:
$name = 'Bender'
& powershell { Write-Output "Hello, $($args[0])" } -args $name
However, especially with more complicated ScriptBlock bodies, having to remember which index of $args[i] contains the value you want at a given time is a pain in the butt. Luckily, we can use a little trick with defining parameters within the ScriptBlock to help:
$name = 'Bender'
& powershell { param($name) Write-Output "Hello, $name" } -args $name
This will print Hello, Bender as expected.
Some additional pointers:
The ScriptBlock can be multiline, as though you were defining a function. The examples above are single-line due to their simplicity.
A ScriptBlock is just an unnamed function, which is why defining parameters and referencing arguments within one works the same way.
To exemplify this behavior outside of powershell.exe -Command, Invoke-Command requires you to pass variables to its ScriptBlock in a similar fashion. Note, however, that the answer linked there uses an already-defined function body as the ScriptBlock (which is totally valid to do).
You don't need to use Start-Process here (start is its alias), at least as demonstrated in your example. You can simply use the call operator & unless you need to do something more complex than "run the program and wait for it to finish". See this answer of mine for more information.
If you opt to pass a string to powershell.exe instead, you don't need to provide arguments, and your variables will get expanded in the current PowerShell process (see the sketch at the end of this answer). However, so will any other unescaped variables that might be intended to be set within the child process, so be careful with this approach. Personally, I prefer using a ScriptBlock regardless, and just deal with the extra parameter definition and arguments.
Using the call operator & is optional when you are not executing a path rendered as a string. It can be omitted in the examples above, but is more useful like so:
& "C:\The\Program Path\Contains\spaces.exe"
& $programPathAsAVariable
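To illustrate the string-based alternative mentioned above: because the command is a string, variables are expanded in the calling session before the child process ever sees them (a sketch based on the question's DoWork.ps1; the embedded quotes guard against spaces in the value):
$targetPath = 'M:\target'
powershell -Command ".\DoWork.ps1 -target `"$targetPath`" -tempdrive D:\"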

PowerShell - accessing script parameters, sent from Windows cmd shell

Questions
How can you access the parameters sent to PowerShell script (.ps1 file)?
Can you access parameters A: by name, B: by position, C: a mix of either?
Context
I recently tried to write a PowerShell script (.ps1) that would be called from a Windows batch file (.bat) (or potentially a cmd shell or AutoHotKey script), which would pass parameters into the .ps1 script for it to use (to display a toast notification). Thanks to the instructions on ss64.com, I have used $args to do this kind of thing in the past; however, for some reason I could not access the parameters this way (despite passing parameters, $args[0] = '' (empty string) and $args.Count = 0), so I eventually had to remove all the $args code and replace it with a Param() block instead.
I'm still not quite sure why, but thought this is something I should get to the bottom of before I try to write my next script...
Code Example 1: Args (un-named parameters)
ToastNotificationArgs.ps1
-------------------------
Write-Debug "The script has been passed $($args.Count) parameters"
If (!$args[0]) {
    # Handle first parameter missing
}
If (!$args[1]) {
    # Handle second parameter missing
}
Import-Module -Name BurntToast
New-BurntToastNotification -Text "$args[0], $args[1]"
^ I thought the above code was correct, but like I say, I kept struggling to access the parameters for some reason and could not figure out why. (If anyone can spot what I was doing wrong, please shout!)
Is $args[] a valid approach? I assume so given its use on ss64.com, but maybe there are some pitfalls / limitations I need to be aware of?
Code Example 2: Param (named parameters)
ToastNotificationParams.ps1
---------------------------
Param(
    [Parameter(Mandatory=$false, Position=0, ValueFromPipeline=$true)] [string]$Title,
    [Parameter(Mandatory=$false, Position=1, ValueFromPipeline=$true)] [string]$Message
)
Import-Module -Name BurntToast
New-BurntToastNotification -Text "$Title, $Message"
^ This was the only way I could get my script working in the end. However, when I passed the parameters in, my calling cmd script sent them by position, i.e. (pwsh.exe -File "ToastNotificationParams.ps1" "This is the title" "Message goes here"), rather than as named pairs. (Not sure if this is best practice, but it is how my script was initially intended to be used, so I left it for now.)
While Param() got my script working this time (and I also realise the inherent dangers of position-based parameters), there are times when a position-based approach might be necessary (e.g. when the number of parameters is unknown)...
Code Example 3: Hybrid
ToastNotificationMix.ps1
------------------------
Param(
    [Parameter(Mandatory=$false, Position=0, ValueFromPipeline=$true)] [string]$Title
)
Import-Module -Name BurntToast
For ( $i = 1; $i -lt $args.count; $i++ ) {
    New-BurntToastNotification -Text "$Title, $args[i]"
}
Is something like this valid?.. If not (or there is a better solution), any help would be greatly appreciated!
Thanks in advance!
The automatic $args variable is only available in simple (non-advanced) functions / scripts. A script automatically becomes an advanced one by using the [CmdletBinding()] attribute and/or at least one per-parameter [Parameter()] attribute.
Using $args allows a function/script to accept an open-ended number of positional arguments, usually instead of, but also in addition to using explicitly declared parameters.
But it doesn't allow passing named arguments (arguments prefixed by a predeclared target parameter name, e.g., -Title)
For robustness, using an advanced (cmdlet-like) function or script is preferable; such functions / scripts:
They require declaring parameters explicitly.
They accept no arguments other than ones that bind to declared parameters.
However, you can define a single catch-all parameter that collects all positional arguments that don't bind to any of the other predeclared parameters, using [Parameter(ValueFromRemainingArguments)].
Explicitly defined parameters are positional by default, in the order in which they are declared inside the param(...) block.
You can turn off this default with [CmdletBinding(PositionalBinding=$false)],
which then allows you to selectively enable positional binding, using the Position property of the individual [Parameter()] attributes.
When you call a PowerShell script via the PowerShell CLI's -File parameter, the invocation syntax is fundamentally the same as when calling the script from inside PowerShell; that is, you can pass named arguments and/or - if supported - positional arguments.
Constraints:
The arguments are treated as literals.
Passing array arguments (,-separated elements) is not supported.
If you do need your arguments to be interpreted as they would be from inside PowerShell, use the -Command / -c CLI parameter instead.
See this answer for guidance on when to use -File vs. -Command.
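For instance, assuming a hypothetical Show-Arg.ps1 that contains just param($a) "$($a.GetType().Name): $a", the difference shows up as follows when calling from cmd.exe:
:: -File: the argument is taken as a literal string
pwsh -File .\Show-Arg.ps1 1,2,3
:: -> String: 1,2,3
:: -Command: the argument is parsed by PowerShell, so 1,2,3 becomes a 3-element array
pwsh -Command ".\Show-Arg.ps1 1,2,3"
:: -> Object[]: 1 2 3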
To put it all together:
ToastNotificationMix.ps1:
[CmdletBinding(PositionalBinding=$false)]
Param(
    [Parameter(Position=0)]
    [string] $Title,

    [Parameter(Mandatory, ValueFromRemainingArguments)]
    [string[]] $Rest
)
Import-Module -Name BurntToast
foreach ($restArg in $Rest) {
    New-BurntToastNotification -Text "$Title, $restArg"
}
You can then call your script from cmd.exe as follows, for instance (I'm using pwsh.exe, the PowerShell (Core) CLI; for Windows PowerShell, use powershell.exe):
Positional binding only:
:: "foo" binds to $Title, "bar" to $Rest
pwsh -File ./ToastNotificationMix.ps1 foo bar
Mix of named and positional binding:
:: "bar" and "baz" both bind to $Rest
pwsh -File ./ToastNotificationMix.ps1 -Title foo bar baz

Wrapper function for cmdlet - pass remaining parameters

I'm writing a function that wraps a cmdlet using ValueFromRemainingArguments (as discussed here).
The following simple code demonstrates the problem:
works
function Test-WrapperArgs {
    Set-Location @args
}
Test-WrapperArgs -Path C:\
does not work
function Test-WrapperUnbound {
    Param(
        [Parameter(ValueFromRemainingArguments)] $UnboundArgs
    )
    Set-Location @UnboundArgs
}
Test-WrapperUnbound -Path C:\
Set-Location: F:\cygwin\home\thorsten\.config\powershell\test.ps1:69
Line |
  69 |  Set-Location @UnboundArgs
     |  ~~~~~~~~~~~~~~~~~~~~~~~~~
     | A positional parameter cannot be found that accepts argument 'C:\'.
I tried getting to the bottom of the issue with GetType and EchoArgs from the PowerShell Community Extensions, to no avail. At the moment I'm almost considering this a bug (maybe related to this ticket?).
The best solution for an advanced function (one that uses a [CmdletBinding()] attribute and/or a [Parameter()] attribute) is to scaffold a proxy (wrapper) function via the PowerShell SDK, as shown in this answer.
This involves essentially duplicating the target command's parameter declarations (albeit in an automatic, but static fashion).
If you do not want to use this approach, your only option is to perform your own parsing of the $UnboundArgs array (technically, it is an instance of [System.Collections.Generic.List[object]]), which is cumbersome, however, and not foolproof:
function Test-WrapperUnbound {
    Param(
        [Parameter(ValueFromRemainingArguments)] $UnboundArgs
    )

    # (Incompletely) emulate PowerShell's own argument parsing by
    # building a hashtable of parameter-argument pairs to pass through
    # to Set-Location via splatting.
    $htPassThruArgs = @{}; $key = $null
    switch -regex ($UnboundArgs) {
        '^-(.+)' { if ($key) { $htPassThruArgs[$key] = $true } $key = $Matches[1] }
        default  { $htPassThruArgs[$key] = $_; $key = $null }
    }
    if ($key) { $htPassThruArgs[$key] = $true } # trailing switch param.

    # Pass the resulting hashtable via splatting.
    Set-Location @htPassThruArgs
}
Note:
This isn't foolproof in that your function won't be able to distinguish between an actual parameter name (e.g., -Path) and a string literal that happens to look like a parameter name (e.g., '-Path')
Also, unlike with the scaffolding-based proxy-function approach mentioned at the top, you won't get tab-completion for any pass-through parameters and the pass-through parameters won't be listed with -? / Get-Help / Get-Command -Syntax.
If you don't mind having neither tab-completion nor syntax help and/or your wrapper function must support pass-through to multiple or not-known-in-advance target commands, using a simple (non-advanced) function with @args (as in your working example; see also below) is the simplest option, assuming your function doesn't itself need to support common parameters (which requires an advanced function).
Using a simple function also implies that common parameters are passed through to the wrapped command only (whereas an advanced function would interpret them as meant for itself, though their effect usually propagates to calls inside the function; with a common parameter such as -OutVariable, however, the distinction matters).
As for what you tried:
While PowerShell does support splatting via arrays (or array-like collections such as [System.Collections.Generic.List[object]]) in principle, this only works as intended if all elements are to be passed as positional arguments and/or if the target command is an external program (about whose parameter structure PowerShell knows nothing, and always passes arguments as a list/array of tokens).
In order to pass arguments with named parameters to other PowerShell commands, you must use hashtable-based splatting, where each entry's key identifies the target parameter and the value the parameter value (argument).
Even though the automatic $args variable is technically also an array ([object[]]), PowerShell has built-in magic that allows splatting with @args to also work with named parameters - this does not work with any custom array or collection.
Note that the automatic $args variable, which collects all arguments for which no parameter was declared - is only available in simple (non-advanced) functions and scripts; advanced functions and scripts - those that use the [CmdletBinding()] attribute and/or [Parameter()] attributes - require that all potential parameters be declared.
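To illustrate the hashtable-based splatting mentioned above (the values are illustrative):
$params = @{ Path = 'C:\'; PassThru = $true }
Set-Location @params   # equivalent to: Set-Location -Path 'C:\' -PassThru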

Pass a variable to another script

I have two PowerShell scripts.
The first script has the following code:
$var = "abc"
$DIR = "C:\"
$SCRIPT_NAME = "abc.ps1"
&"${DIR}\${SCRIPT_NAME}" #execute the second script
If I want to pass the variable $var to the second script, how do I achieve that? What code do I need to put in both the first and the second script?
Parameters (Recommended): Use parameters to pass values to the second script.
Step2.ps1:
param ($myparameter)
write-host $myparameter
Step1.ps1:
$var = "abc"
$DIR = "C:\"
$SCRIPT_NAME = "step2.ps1"
&"${DIR}\${SCRIPT_NAME}" -myparameter $var
Alternative: You could also have used arguments ($args - extra values not bound to any parameter). You can access the first argument using $args[0]. I would, however, always recommend parameters, as arguments need to be passed in a specific order (if multiple arguments are passed), etc.
Step2.ps1:
write-host $args[0]
Step1.ps1:
$var = "abc"
$DIR = "C:\"
$SCRIPT_NAME = "step2.ps1"
&"${DIR}\${SCRIPT_NAME}" $var
There are several ways to do what you want, two of which have already been suggested by @FrodeF.
Pass the variable as a (named) parameter:
# script1.ps1
$var = 'foo'
$dir = 'C:\some\folder'
$scriptname = "script2.ps1"
& "${dir}\${scriptname}" -Foo $var
# script2.ps1
Param($foo)
Write-Output $foo
This is the cleanest solution. You have a well-defined interface and pass the variable in a clear-cut way from one script to another.
Parameter definitions will also allow you to make a parameter mandatory (so that the script will ask the user to provide input if the parameter was omitted), require a particular data type, easily incorporate validation routines, or add comment-based help.
# script2.ps1
<#
.SYNOPSIS
Short description of the script or function.
.DESCRIPTION
Longer description of what the script or function actually does.
.PARAMETER Foo
Description of the parameter Foo.
#>
[CmdletBinding()]
Param(
    [Parameter(Mandatory=$true, Position=0, ValueFromPipeline=$true)]
    [ValidateRange(2,42)]
    [int]$foo
)
Write-Output $foo
See Get-Help about_Function_Advanced_Parameters for more information.
Pass the variable as an unnamed argument:
# script1.ps1
$var = 'foo'
$dir = 'C:\some\folder'
$scriptname = "script2.ps1"
& "${dir}\${scriptname}" $var
# script2.ps1
Write-Output $args[0]
This is the second best approach, because you still pass the variable in a clear-cut way, but the interface isn't as well defined as before.
Define the variable as an environment variable:
# script1.ps1
$env:var = 'foo'
$dir = 'C:\some\folder'
$scriptname = "script2.ps1"
& "${dir}\${scriptname}"
# script2.ps1
Write-Output $env:var
This is a less clean approach than the argument-based ones, as the variable is passed using a "side-channel" (the process environment, which is inherited by child processes).
Just define the variable in the first script and use it in the second one:
# script1.ps1
$var = 'foo'
$dir = 'C:\some\folder'
$scriptname = "script2.ps1"
& "${dir}\${scriptname}"
# script2.ps1
Write-Output $var
This will work as well because, by using the call operator (&), the second script runs in the same session (in a child scope of the caller) and can therefore read the caller's variables. However, "passing" a variable like this will easily break if someone runs the second script in a different context/scope or modifies it without being aware of the implicit dependency.
If you want to go this route it's usually better to use the first script for variable (and function) definitions only, and dot-source it in the second script, so that the definitions are imported into the scope of the second script:
# script1.ps1
$var = 'foo'
# script2.ps1
. 'C:\path\to\script1.ps1'
Write-Output $var
Technically, passing values via a file would be another option. However, I would recommend against using this approach for several reasons:
it's prone to errors due to improper permissions (could be mitigated by creating the file in the $env:TEMP folder),
it's prone to littering the filesystem if you don't clean up the file afterwards,
it needlessly generates disk I/O when simple in-memory operations provided by the language would suffice.
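For completeness, here is a minimal sketch of that file-based approach (using $env:TEMP, as suggested above; the file name is arbitrary):
# script1.ps1
$var = 'foo'
Set-Content -Path (Join-Path $env:TEMP 'passed-var.txt') -Value $var
& 'C:\some\folder\script2.ps1'
# script2.ps1
$var = Get-Content -Path (Join-Path $env:TEMP 'passed-var.txt')
Write-Output $var
Remove-Item (Join-Path $env:TEMP 'passed-var.txt')   # clean up to avoid littering the filesystem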