start powershell won't accept variable as parameter - powershell

Issue
The called PowerShell script will accept parameters, but not all of them:
Current Set-Up and code:
I have a common folder where two .ps1 scripts are located:
DoWork.ps1
Workmanager.ps1
Workmanager.ps1 calls DoWork.ps1:
$targetPath="M:\target"
echo "target path: $targetPath"
start powershell {.\DoWork.ps1 -target $targetPath -tempdrive D:\}
output (as expected):
target path: M:\target
DoWork.ps1 contains some start code:
param
(
[string]$tempdrive,
[string]$target,
[int] $threads = 8,
[int] $queuelength = -1
)
echo "variables:"
echo "temp drive: $tempdrive"
echo "target path: $target"
Unexpectedly, $target is not being assigned. Previously I had the variable named $targetpath, which did not work either.
variables:
temp drive: D:\
target path:
Findings
It appears that the issue lies in Workmanager.ps1. Specifying the parameter as a fixed string rather than as a variable will load the parameter. Any solution for this?
start powershell {.\DoWork.ps1 -target "foo" -tempdrive D:\}

When you use a ScriptBlock as an argument to powershell.exe, variables aren't going to be evaluated until after the new session starts. $targetPath has not been set in the child PowerShell process called by Workmanager.ps1 and so it has no value. This is actually an expected behavior of a ScriptBlock in general and behaves this way in other contexts too.
The solution is mentioned in the help text for powershell -?:
[-Command { - | <script-block> [-args <arg-array>] <========== THIS GUY
| <string> [<CommandParameters>] } ]
You must provide the -args parameter, which will be passed to the ScriptBlock on execution (separate multiple arguments with a ,). Arguments are passed positionally, and must be referenced as though you were processing the arguments to a function manually, using the $args array. For example:
$name = 'Bender'
& powershell { Write-Output "Hello, $($args[0])" } -args $name
However, especially with more complicated ScriptBlock bodies, having to remember which index of $args[i] contains the value you want at a given time is a pain in the butt. Luckily, we can use a little trick with defining parameters within the ScriptBlock to help:
$name = 'Bender'
& powershell { param($name) Write-Output "Hello, $name" } -args $name
This will print Hello, Bender as expected.
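Applied to the original scripts, a minimal sketch of the corrected call in Workmanager.ps1 might look like this (the parameter names inside the ScriptBlock are placeholders of my choosing; only -target and -tempdrive come from DoWork.ps1):
# In Workmanager.ps1, using the $targetPath assigned above:
# declare matching parameters inside the ScriptBlock and feed the values via -args;
# they bind positionally, in the order listed after -args.
& powershell { param($target, $tempdrive) .\DoWork.ps1 -target $target -tempdrive $tempdrive } -args $targetPath, 'D:\'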
Some additional pointers:
The ScriptBlock can be multiline, as though you were defining a function. The examples above are single-line due to their simplicity.
A ScriptBlock is just an unnamed function, which is why defining parameters and referencing arguments within one works the same way.
To exemplify this behavior outside of powershell.exe -Command, Invoke-Command requires you to pass variables to its ScriptBlock in a similar fashion. Note, however, that the linked answer uses an already-defined function body as the ScriptBlock (which is totally valid to do).
You don't need to use Start-Process here (start is its alias), at least as demonstrated in your example. You can simply use the call operator & unless you need to do something more complex than "run the program and wait for it to finish" (a sketch follows this list). See this answer of mine for more information.
If you opt to pass a string to powershell.exe instead, you don't need to provide arguments and your variables will get rendered in the current PowerShell process. However, so will any other unescaped variables that might be intended to be set within the child process, so be careful with this approach. Personally, I prefer using a ScriptBlock regardless, and just deal with the extra parameter definition and arguments.
Using the call operator & is optional when you are not executing a path rendered as a string. It can be omitted in the examples above, but is more useful like so:
& "C:\The\Program Path\Contains\spaces.exe"
& $programPathAsAVariable
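For example, applied to the original scripts, the direct-invocation alternative mentioned above could look like this (assuming DoWork.ps1 is in the current directory, as in the question):
# In Workmanager.ps1, instead of start powershell { ... } - runs DoWork.ps1
# in the current session and waits for it to finish:
& .\DoWork.ps1 -target $targetPath -tempdrive D:\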

Related

Dot-sourcing a self-elevate script

I have a self elevate snippet which is quite wordy, so I decided instead of duplicating it at the top of every script that needs to be run as admin to move it into a separate .ps1:
function Switch-ToAdmin {
# Self-elevate the script if required
if (-not ([Security.Principal.WindowsPrincipal] [Security.Principal.WindowsIdentity]::GetCurrent()).IsInRole([Security.Principal.WindowsBuiltInRole] 'Administrator')) {
if ([int](Get-CimInstance -Class Win32_OperatingSystem | Select-Object -ExpandProperty BuildNumber) -ge 6000) {
$Cmd = @(
"-Command Set-Location `"$(Get-Location)`"; & `"$PSCommandPath`""
"-$($PSBoundParameters.Keys)"
)
$ProcArgs = @{
FilePath = 'PowerShell.exe'
Verb = 'RunAs'
ArgumentList = $Cmd
}
Start-Process @ProcArgs
Exit
}
}
}
So for every script that needs elevation I'd prepend
. "$PSScriptRoot\self-elevate.ps1"
Switch-ToAdmin
# rest of script
Doing the above successfully triggers the UAC prompt, but the rest of the script won't get executed.
Is this sorta stuff disallowed?
Darin and iRon have provided the crucial pointers:
Darin points out that the automatic $PSCommandPath variable in your Switch-ToAdmin function does not contain the full path of the script from which the function is called, but that of the script in which the function is defined, even if that script's definitions are loaded directly into the scope of your main script via ., the dot-sourcing operator.
The same applies analogously to the automatic $PSScriptRoot variable, which reflects the defining script's full directory path.
Also, more generally, the automatic $PSBoundParameters variable inside a function reflects that function's bound parameters, not its enclosing script's.
iRon points out that the Get-PSCallStack cmdlet can be used to get information about a script's callers, starting at index 1; the first object returned (index 0, when Get-PSCallStack output is captured in an array) represents the current command. Index 1 therefore refers to the immediate caller, which from the perspective of your dot-sourced script is your main script.
Therefore:
Replace $PSCommandPath with $MyInvocation.PSCommandPath, via the automatic $MyInvocation variable. $MyInvocation.PSCommandPath truly reflects the caller's full script path, irrespective of where the called function was defined.
Alternatively, use (Get-PSCallStack)[1].ScriptName, which despite what the property name suggests, returns the full path of the calling script too.
Replace $PSBoundParameters with (Get-PSCallStack)[1].InvocationInfo.BoundParameters
Note that there's also (Get-PSCallStack)[1].Arguments, but it seems to contain a single string only, containing a representation of all arguments that is only semi-structured and therefore doesn't allow robust reconstruction of the individual parameters.
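Put together, a minimal sketch of the revised $Cmd construction inside Switch-ToAdmin (the second element keeps the original single-switch pass-through, whose limitations are discussed in the aside below):
$Cmd = @(
# $MyInvocation.PSCommandPath reflects the *calling* script's full path.
"-Command Set-Location `"$(Get-Location)`"; & `"$($MyInvocation.PSCommandPath)`""
# The caller's bound parameters, via the call stack (fragile; see the aside below).
"-$((Get-PSCallStack)[1].InvocationInfo.BoundParameters.Keys)"
)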
As an aside:
Even if $PSBoundParameters contained the intended information, "-$($PSBoundParameters.Keys)" would only succeed in passing the bound parameters through if your script defines only one parameter, if that parameter is a [switch] parameter, and if it is actually passed in every invocation.
Passing arguments through robustly in this context is hard to do, and has inherent limitations - see this answer for a - complex - attempt to make it work as well as possible.

PowerShell - accessing script parameters, sent from Windows cmd shell

Questions
How can you access the parameters sent to PowerShell script (.ps1 file)?
Can you access parameters A: by name, B: by position, C: a mix of either?
Context
I recently tried to write a PowerShell script (.ps1) that would be called from a Windows batch file (.bat) (or potentially cmd shell, or AutoHotKey script) - which would pass parameters into the .ps1 script for it to use (to display a toast notification). Thanks to the instructions on ss64.com, I have used $args to do this kind of thing in the past, however for some reason I could not access the parameters this way (despite passing parameters, $args[0] = '' (empty string) and $args.Count = 0), so eventually I had to remove all the $args code and replace it with a Param() block instead.
I'm still not quite sure why, but thought this is something I should get to the bottom of before I try to write my next script...
Code Example 1: Args (un-named parameters)
ToastNotificationArgs.ps1
-------------------------
Write-Debug "The script has been passed $($args.Count) parameters"
If (!$args[0]) { <# Handle first parameter missing #> }
If (!$args[1]) { <# Handle second parameter missing #> }
Import-Module -Name BurntToast
New-BurntToastNotification -Text "$args[0], $args[1]"
^ I thought the above code was correct, but like I say, I kept struggling to access the parameters for some reason and could not figure out why. (If anyone can spot what I was doing wrong, please shout!)
Is $args[] a valid approach? I assume so given its use on ss64.com, but maybe there are some pitfalls / limitations I need to be aware of?
Code Example 2: Param (named parameters)
ToastNotificationParams.ps1
---------------------------
Param(
[Parameter(Mandatory=$false, Position=0, ValueFromPipeline=$true)] [string]$Title,
[Parameter(Mandatory=$false, Position=1, ValueFromPipeline=$true)] [string]$Message
)
Import-Module -Name BurntToast
New-BurntToastNotification -Text "$Title, $Message"
^ This was the only way to get my script working in the end. However, when I passed the parameters in, my calling cmd script sent the parameters by position, i.e. (pwsh.exe -File "ToastNotificationParams.ps1" "This is the title" "Message goes here"), rather than by named pairs. (Not sure if this is best practice, but that is how my script was initially intended to be used, so I left it for now.)
While Param() got my script working this time (and I also realise the inherent dangers of position-based parameters), there are times when a position-based approach might be necessary (e.g. the number of parameters is unknown)...
Code Example 3: Hybrid
ToastNotificationMix.ps1
------------------------
Param(
[Parameter(Mandatory=$false, Position=0, ValueFromPipeline=$true)] [string]$Title
)
Import-Module -Name BurntToast
For ( $i = 1; $i -lt $args.count; $i++ ) {
New-BurntToastNotification -Text "$Title, $args[i]"
}
Is something like this valid? If not (or there is a better solution), any help would be greatly appreciated!
Thanks in advance!
The automatic $args variable is only available in simple (non-advanced) functions / scripts. A script automatically becomes an advanced one by using the [CmdletBinding()] attribute and/or at least one per-parameter [Parameter()] attribute.
Using $args allows a function/script to accept an open-ended number of positional arguments, usually instead of, but also in addition to using explicitly declared parameters.
But it doesn't allow passing named arguments (arguments prefixed by a predeclared target parameter name, e.g., -Title)
For robustness, using an advanced (cmdlet-like) function or script is preferable; such functions / scripts:
They require declaring parameters explicitly.
They accept no arguments other than ones that bind to declared parameters.
However, you can define a single catch-all parameter that collects all positional arguments that don't bind to any of the other predeclared parameters, using [Parameter(ValueFromRemainingArguments)].
Explicitly defined parameters are positional by default, in the order in which they are declared inside the param(...) block.
You can turn off this default with [CmdletBinding(PositionalBinding=$false)],
which then allows you to selectively enable positional binding, using the Position property of the individual [Parameter()] attributes.
When you call a PowerShell script via the PowerShell CLI's -File parameter, the invocation syntax is fundamentally the same as when calling the script from inside PowerShell; that is, you can pass named arguments and/or - if supported - positional arguments.
Constraints:
The arguments are treated as literals.
Passing array arguments (,-separated elements) is not supported.
If you do need your arguments to be interpreted as they would be from inside PowerShell, use the -Command / -c CLI parameter instead
See this answer for guidance on when to use -File vs. -Command.
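As a hypothetical illustration of the difference (using the ToastNotificationMix.ps1 script defined below):
:: With -File, (Get-Date) is passed as the literal string "(Get-Date)"
pwsh -File ./ToastNotificationMix.ps1 -Title "(Get-Date)" bar
:: With -Command, the arguments are parsed as PowerShell code,
:: so (Get-Date) is evaluated before the script sees it
pwsh -Command "./ToastNotificationMix.ps1 -Title (Get-Date) bar"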
To put it all together:
ToastNotificationMix.ps1:
[CmdletBinding(PositionalBinding=$false)]
Param(
[Parameter(Position=0)]
[string]$Title
,
[Parameter(Mandatory, ValueFromRemainingArguments)]
[string[]] $Rest
)
Import-Module -Name BurntToast
foreach ($restArg in $Rest) {
New-BurntToastNotification -Text "$Title, $restArg"
}
You can then call your script from cmd.exe as follows, for instance (I'm using pwsh.exe, the PowerShell (Core) CLI; for Windows PowerShell, use powershell.exe):
Positional binding only:
:: "foo" binds to $Title, "bar" to $Rest
pwsh -File ./ToastNotificationMix.ps1 foo bar
Mix of named and positional binding:
:: "bar" and "baz" both bind to $Rest
pwsh -File ./ToastNotificationMix.ps1 -Title foo bar baz

What is the simplest way to make Alias in powershell?

I was wondering if there's any simple way to make aliases for powershell like cmd.
For example: In cmd, doskey art=php artisan $* where $* is optional. Currently, I'm using the following alias in powershell.
function runArtisanCommand
{
param(
[Parameter(Mandatory=$false, Position = 0, ValueFromRemainingArguments = $true)]
$command
)
php artisan $command
}
Set-Alias art runArtisanCommand
This works somewhat but doesn't take flags. For example: I can't write art -h or art route:list -c. With art -h, it prints the output of php artisan and doesn't read the flag at all, but with art route:list -c, it errors out with:
runArtisanCommand : Missing an argument for parameter 'command'. Specify a parameter of type 'System.Object' and try again.
At line:1 char:16
+ art route:list -c
+ ~~
+ CategoryInfo : InvalidArgument: (:) [runArtisanCommand], ParameterBindingException
+ FullyQualifiedErrorId : MissingArgument,runArtisanCommand
I would love a simpler solution than this. Thanks in advance.
The simplest and most convenient way to pass unknown arguments through is by splatting
the automatic $args array - as @args - in a simple function or script (one that uses neither a [CmdletBinding()] attribute nor [Parameter()] attributes):
# Note: @args rather than $args makes the function work with named
# arguments for PowerShell commands too - see explanation below.
function runArtisanCommand { php artisan @args }
# As in your question: Define alias 'art' for the function
# Note: Of course, you could directly name your *function* 'art'.
# If you do want the function to have a longer name, consider one
# that adheres to PowerShell's Verb-Noun naming convention, such as
# 'Invoke-ArtisanCommand'.
Set-Alias art runArtisanCommand
As an aside: Since the target executable, php, is neither quoted nor specified based on a variable or expression, it can be invoked as-is; otherwise, you would need &, the call operator - see this answer for background information.
As for what you tried:
The problem was that use of -c as a pass-through argument only works if you precede it with --:
# OK, thanks to '--'
art -- route:list -c
-- tells PowerShell to treat all remaining arguments as unnamed (positional) arguments, instead of trying to interpret tokens such as -c as parameter names.
Without --, -c is interpreted as referring to your -command parameter (the parameter you declared as $command with ValueFromRemainingArguments = $true), given that PowerShell allows you to specify name prefixes in lieu of full parameter names, as long as the given prefix is unambiguous.
Because a parameter of any type other than [switch] requires an associated argument, -c (aka -command) failed with an error message to that effect.
You could have avoided the collision by naming your parameter so that it doesn't collide with any pass-through parameters, such as by naming it -_args (with parameter variable $_args):
function runArtisanCommand
{
param(
# Note: `Mandatory = $false` and `Position = 0` are *implied*.
[Parameter(ValueFromRemainingArguments)]
$_args
)
php artisan @_args
}
However, given that use of a [Parameter()] attribute implicitly makes your function an advanced function, it invariably also accepts common parameters, such as -ErrorAction, -OutVariable, -Verbose... - all of which can be passed by unambiguous prefix / short alias too; e.g., -outv for -OutVariable, or the alias -ea for -ErrorAction; collisions with them cannot be avoided.
Therefore, intended pass-through arguments such as -e still wouldn't work:
# FAILS, because -e ambiguously matches common parameters -ErrorAction
# and -ErrorVariable.
PS> art route:list -e
Parameter cannot be processed because the parameter name 'e' is ambiguous.
Possible matches include: -ErrorAction -ErrorVariable.
Again, -- is needed:
# OK, thanks to '--'
art -- route:list -e
Summary:
Especially for functions wrapping calls to external programs, such as php.exe, using a simple function with @args, as shown at the top, is not only simpler, but also more robust.
For functions wrapping PowerShell commands (with explicitly declared parameters):
a simple function with @args works too,
but if you also want support for tab-completion and showing a syntax diagram with the supported parameters, by passing -?, or via Get-Help, consider defining an (invariably advanced) proxy (wrapper) function via the PowerShell SDK - see below.
Optional background information: Pass-through arguments in PowerShell
As Mathias R. Jessen points out, the simplest way to pass (undeclared) arguments passed to a function or script through to another command is to use the automatic $args variable, which is an automatically populated array of all the arguments passed to a simple function or script (one that isn't advanced, through use of the [CmdletBinding()] and/or [Parameter()] attributes).
As for why @args (splatting) rather than $args should be used:
Using $args as-is in your wrapper function only works for passing positional arguments through (those not prefixed by the parameter name; e.g., *.txt), as opposed to named arguments (e.g., -Path *.txt).
If the ultimate target command is an external program (such as php.exe in this case), this isn't a problem, because PowerShell of necessity then treats all arguments as positional arguments (it cannot know the target program's syntax).
However, if a PowerShell command (with formally declared parameters) is ultimately called, only splatting the $args array - which syntactically means use of @args instead - supports passing named arguments through.[1]
Therefore, as a matter of habit, I suggest always using @args in simple wrapper functions, which equally works with external programs.[2]
To give an example with a simple wrapper function for Get-ChildItem:
# Simple wrapper function for Get-ChildItem that lists recursively
# and by relative path only.
function dirTree {
# Use @args to make sure that named arguments are properly passed through.
Get-ChildItem -Recurse -Name @args
}
# Invoke it with a *named* argument passed through to Get-ChildItem
# If $args rather than @args were used inside the function, this call would fail.
dirTree -Filter *.txt
Using a proxy function for more sophisticated pass-through processing:
The use of @args is convenient, but comes at the expense of not supporting the following:
tab-completion, given that tab-completion only works with formally declared parameters (typically with a param(...) block).
showing a syntax diagram with the supported parameters, by passing -?, or via Get-Help
To overcome these limitations, the parameter declarations of the ultimate target command must be duplicated in the (then advanced) wrapper function; while that is cumbersome, PowerShell can automate the process by scaffolding a so-called proxy (wrapper) function via the PowerShell SDK - see this answer.
Note:
With respect to common parameters such as -ErrorAction, it is the proxy function itself that (automatically) processes them, but that shouldn't make a difference to the caller.
Scaffolding a proxy function only works with PowerShell commands, given that PowerShell has no knowledge of the syntax of external programs.
However, you can manually duplicate the parameter declarations of the external target program.
[1] Note that the automatic $args array has built-in magic to support this; passing named arguments through with splatting is not supported with a custom array and requires use of a hash table instead, as discussed in the help topic about splatting linked to above.
[2] In fact, only @args also supports the correct interpretation of --%, the stop-parsing symbol.

Wrapper function for cmdlet - pass remaining parameters

I'm writing a function that wraps a cmdlet using ValueFromRemainingArguments (as discussed here).
The following simple code demonstrates the problem:
works
function Test-WrapperArgs {
Set-Location @args
}
Test-WrapperArgs -Path C:\
does not work
function Test-WrapperUnbound {
Param(
[Parameter(ValueFromRemainingArguments)] $UnboundArgs
)
Set-Location @UnboundArgs
}
Test-WrapperUnbound -Path C:\
Set-Location: F:\cygwin\home\thorsten\.config\powershell\test.ps1:69
Line |
69 | Set-Location @UnboundArgs
| ~~~~~~~~~~~~~~~~~~~~~~~~~
| A positional parameter cannot be found that accepts argument 'C:\'.
I tried getting to the issue with GetType and EchoArgs from the PowerShell Community Extensions to no avail. At the moment I'm almost considering a bug (maybe related to this ticket??).
The best solution for an advanced function (one that uses a [CmdletBinding()] attribute and/or a [Parameter()] attribute) is to scaffold a proxy (wrapper) function via the PowerShell SDK, as shown in this answer.
This involves essentially duplicating the target command's parameter declarations (albeit in an automatic, but static fashion).
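For reference, a sketch of how such a proxy-function body can be generated with the PowerShell SDK (shown here, hypothetically, for Set-Location; the output is source code that you paste into your own wrapper function and then customize):
# Generate the source code of a proxy (wrapper) function for Set-Location.
$cmd = Get-Command Set-Location
$metadata = [System.Management.Automation.CommandMetadata]::new($cmd)
[System.Management.Automation.ProxyCommand]::Create($metadata)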
If you do not want to use this approach, your only option is to perform your own parsing of the $UnboundArgs array (technically, it is an instance of [System.Collections.Generic.List[object]]), which is cumbersome, however, and not foolproof:
function Test-WrapperUnbound {
Param(
[Parameter(ValueFromRemainingArguments)] $UnboundArgs
)
# (Incompletely) emulate PowerShell's own argument parsing by
# building a hashtable of parameter-argument pairs to pass through
# to Set-Location via splatting.
$htPassThruArgs = @{}; $key = $null
switch -regex ($UnboundArgs) {
'^-(.+)' { if ($key) { $htPassThruArgs[$key] = $true } $key = $Matches[1] }
default { $htPassThruArgs[$key] = $_; $key = $null }
}
if ($key) { $htPassThruArgs[$key] = $true } # trailing switch param.
# Pass the resulting hashtable via splatting.
Set-Location @htPassThruArgs
}
Note:
This isn't foolproof in that your function won't be able to distinguish between an actual parameter name (e.g., -Path) and a string literal that happens to look like a parameter name (e.g., '-Path')
Also, unlike with the scaffolding-based proxy-function approach mentioned at the top, you won't get tab-completion for any pass-through parameters and the pass-through parameters won't be listed with -? / Get-Help / Get-Command -Syntax.
If you don't mind having neither tab-completion nor syntax help and/or your wrapper function must support pass-through to multiple or not-known-in-advance target commands, using a simple (non-advanced) function with @args (as in your working example; see also below) is the simplest option, assuming your function doesn't itself need to support common parameters (which requires an advanced function).
Using a simple function also implies that common parameters are passed through to the wrapped command only (whereas an advanced function would interpret them as meant for itself, though their effect usually propagates to calls inside the function; with a common parameter such as -OutVariable, however, the distinction matters).
As for what you tried:
While PowerShell does support splatting via arrays (or array-like collections such as [System.Collections.Generic.List[object]]) in principle, this only works as intended if all elements are to be passed as positional arguments and/or if the target command is an external program (about whose parameter structure PowerShell knows nothing, and always passes arguments as a list/array of tokens).
In order to pass arguments with named parameters to other PowerShell commands, you must use hashtable-based splatting, where each entry's key identifies the target parameter and the value the parameter value (argument).
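A minimal sketch of hashtable-based splatting (hypothetical values):
# Keys name the target command's parameters; values are the arguments.
$locArgs = @{ Path = 'C:\'; PassThru = $true }
Set-Location @locArgs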
Even though the automatic $args variable is technically also an array ([object[]]), PowerShell has built-in magic that allows splatting with #args to also work with named parameters - this does not work with any custom array or collection.
Note that the automatic $args variable, which collects all arguments for which no parameter was declared - is only available in simple (non-advanced) functions and scripts; advanced functions and scripts - those that use the [CmdletBinding()] attribute and/or [Parameter()] attributes - require that all potential parameters be declared.

"Cannot bind parameter because Param2 is specified more than once"

I am trying to call a PS script via batch file, like so
Powershell.exe -file "C:\Scripts\Blah\Blah\Blah.ps1" -webUID "usernameValue" -webPWD "passwordValue" -Param "param value" -Param2 "param 2 value"
The issue seems to be the batch file is confusing Param and Param2. It thinks I am setting Param2 twice however Param and Param2 are separate parameters altogether. Has anyone experienced this? Is there perhaps a way to explicitly state the param names? Thanks
Param block
# Parameters
Param
(
[string]$WebUID,
[string]$WebPWD,
[string]$Param,
[string]$Param2
)
In an effort to support concise command-line use, PowerShell's "elastic syntax" allows specifying unambiguous prefix substrings of parameter names so that you only need to type as much of a parameter name as is necessary to identify it without ambiguity;
e.g., typing -p to refer to -Path is enough, if no other parameters start with p.
However, an exact match is always recognized, so that specifying -Param in your case should unambiguously match the -Param parameter, even though its full name happens to be a prefix substring of the different parameter -Param2.
If the problem were an issue of ambiguity (it isn't), you'd see a different error message. For instance, were you to use the ambiguous -Para, you'd see:
Parameter cannot be processed because the parameter name 'para' is ambiguous. Possible matches include: -Param -Param2.
Instead, the wording of your error message suggests that the exact same parameter name - -Param2 - was indeed specified more than once - even though your sample code doesn't show that.
I've tested the behavior in PSv2 and PSv5.1 / 6.0 alpha 10 - it's conceivable, however, that other versions act differently due to a bug. Do let us know.
Consider an alternative approach:
If you invoked your script from within PowerShell, you could use a single, array-valued parameter - e.g. [string[]] $Params - and then simply pass as many parameters as needed, comma-separated, without needing to specify a distinct parameter name for each value.
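For instance (a hypothetical invocation from inside PowerShell, assuming the script declared a [string[]] $Params parameter):
# ,-separated values become an array bound to -Params
.\Blah.ps1 -WebUID usernameValue -WebPWD passwordValue -Params 'param value', 'param 2 value'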
Sadly, when invoking a script from outside of PowerShell, this approach won't work, because passing arrays isn't supported from the outside.
There is a workaround, however:
Declare the array-valued parameter decorated with [parameter(ValueFromRemainingArguments=$true)]
Invoke the script with the parameters as a space-separated list at the end of the command.
Applied to your scenario:
If your script defined its parameters as follows:
Param
(
[string]$WebUID,
[string]$WebPWD,
[parameter(ValueFromRemainingArguments=$true)]
[string[]] $Params
)
You could then invoke your script as follows:
Powershell.exe -file "C:\Scripts\Blah\Blah\Blah.ps1" `
-webUID "usernameValue" `
-webPWD "passwordValue" `
"param value" "param 2 value"
and $Params would receive an array of values: $Params[0] would receive param value, and $Params[1] would receive param 2 value.
Note that when calling from outside of PowerShell:
you must not use parameter name -Params in the invocation - just specify the values at the end.
you must not use , to separate the values - use spaces.
I'm no guru, but this looks like it's related to "Partial Parameters" and "Parameter Completion". See this article for more information.
Simply changing Param to Param1 should fix the issue.