Call Windows executable from PowerShell script, passing all arguments

I've looked all over, and I can't find the answer. Most things I've seen have answers that go around in circles and talk about all sorts of complicated things with no direct answer.
My question is simple. I have the following Windows batch file foo.bat:
@ECHO OFF
bar.exe %*
If I call foo.bat foobar 123 from the command line, it invokes bar.exe foobar 123. This works with no command-line arguments. This works with multiple command-line arguments.
What is the equivalent PowerShell script that does basically the same thing: invoke another executable, passing all CLI parameters the user provided?
I wouldn't expect this would be difficult, but I sure can't find any straightforward answer.

The PowerShell equivalent of your foo.bat file is a foo.ps1 file with the following content:
# Passes all arguments received by this script to bar.exe
bar.exe @args
PowerShell exposes all (unbound) positional arguments[1] as an array stored in the automatic $args variable.
By prefixing $args with @ instead of $, argument splatting is employed, which means that the elements of the $args array are passed as individual arguments to bar.exe.
Note: This isn't strictly necessary when calling external programs (such as bar.exe) - $args would work there too - but is more versatile in that it can also pass named arguments correctly through to other PowerShell commands, which typically have declared parameters that can be bound by name (e.g., -Path C:\temp to bind value C:\temp to declared parameter -Path)
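For example, here is a quick sketch of that difference, using Get-ChildItem as a stand-in PowerShell command (the function name is just illustrative):
# Sketch: @args passes named arguments through to PowerShell commands too.
function myDir { Get-ChildItem @args }
myDir -Path C:\temp   # -Path C:\temp is passed through and binds by name
# With $args instead of @args, the two tokens would be passed as data
# (positionally) rather than as a named parameter binding.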
As for working with $args in general:
$args.Count tells you how many (unbound) positional arguments were passed,
$args[0] returns the first such argument, $args[1] the second, and so on.
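For instance, a minimal sketch of a foo.ps1 that both inspects and forwards its arguments (the output comments assume a call such as .\foo.ps1 foobar 123):
# foo.ps1 - sketch only
"Received $($args.Count) argument(s)."      # -> Received 2 argument(s).
"First: $($args[0]), second: $($args[1])"   # -> First: foobar, second: 123
bar.exe @args                               # forward all of them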
However, it is usually preferable to formally declare parameters in PowerShell scripts and functions, which can then also be bound by name (e.g., -FooParam foo instead of just foo). See this answer for more information.
[1] If your script doesn't formally declare any parameters (via a param(...) block - see about_Scripts and the linked answer), all arguments are by definition unbound (i.e. not mapped to a declared parameter) and positional (not prefixed by a target parameter name). However, if your script does declare parameters, $args only contains those arguments, if any, that were passed in addition to those binding to declared parameters. If your script is an advanced script (or function), $args isn't supported at all, because passing unbound arguments is then categorically prevented. See this answer for more information.

Comment-based Help and parameter args[]

I have a PS script I call from a Windows shortcut. I drop several files or directories on it, and it works fine.
I would like to add some optional named parameters (let's call them -Param1 and -Param2) that can, of course, only be used from a PowerShell prompt.
param (
[switch]$CreateShortcut
)
A switch parameter works.
But if I add a string parameter:
param (
[switch]$CreateShortcut,
[string]$Param1
)
Of course, it no longer works when I call my script through the Windows shortcut: $Param1 receives the first file.
Is there a solution?
Thanks
When you drop files/folders on a shortcut file, their full paths are passed as individual, unnamed arguments to the shortcut's executable (script).
PowerShell allows you to collect such unnamed arguments in a single, array-valued parameter, by declaring it as ValueFromRemainingArguments:
[CmdletBinding(PositionalBinding=$false)]
param (
[switch] $CreateShortcut,
# Collect all unnamed arguments in this parameter:
[Parameter(ValueFromRemainingArguments)]
[string[]] $FilesOrFolders
)
[CmdletBinding(PositionalBinding=$false)] ensures that any parameters not explicitly marked with a Position property must be passed as named arguments (i.e., the argument must be preceded by the name of the target parameter, e.g. -Path foo).
This isn't necessary to support [switch] parameters, because they are implicitly named-only, but it allows you to support additional, non-[switch] parameters that can be bound by explicit invocation (only).
Alternatively, if you do not need support for additional predeclared non-switch parameters, you can omit [CmdletBinding(PositionalBinding=$false)] and the $FilesOrFolders parameter declaration and access any arguments that do not bind to predeclared parameters via the automatic $args variable.
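A rough sketch of that simpler alternative (the loop body is just illustrative):
param (
[switch] $CreateShortcut
)
# No [CmdletBinding()] / [Parameter()] attributes, so any arguments that
# don't bind to declared parameters - i.e. the dropped paths - land in $args:
$args | ForEach-Object { "Dropped: $_" }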
Generally, note that use of a [Parameter()] attribute on any of the predeclared parameters would make $args unavailable, as the presence of [CmdletBinding()] does.
The reason is that the use of either attribute makes a script or function an advanced one, i.e., makes it cmdlet-like, and therefore disallows passing arguments that do not bind to declared parameters; to put it differently: $args is then by definition always empty (an empty array).
Advanced scripts or functions automatically gain additional features, notably support for common parameters such as -Verbose.
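For illustration, assuming the advanced script above is saved as Drop.ps1 (a hypothetical name), both invocation styles would look roughly like this:
# From a PowerShell prompt - named switch plus unnamed paths:
.\Drop.ps1 -CreateShortcut C:\Data\a.txt C:\Data\b.txt
# Dropping files on a shortcut to the script effectively results in:
.\Drop.ps1 C:\Data\a.txt C:\Data\b.txt
# In both cases the paths end up collected in $FilesOrFolders.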

start powershell won't accept variable as parameter

Issue
The called PowerShell script will accept parameters, but not all of them:
Current Set-Up and code:
I have a common folder where two .ps1 scripts are located:
DoWork.ps1
Workmanager.ps1
Workmanager.ps1 calls DoWork.ps1:
$targetPath="M:\target"
echo "target path: $targetPath"
start powershell {.\DoWork.ps1 -target $targetPath -tempdrive D:\}
output (as expected):
target path: M:\target
DoWork.ps1 contains some start code:
param
(
[string]$tempdrive,
[string]$target,
[int] $threads = 8,
[int] $queuelength = -1
)
echo "variables:"
echo "temp drive: $tempdrive"
echo "target path: $target"
Unexpectedly, $target is not being assigned. Previously I had the variable named $targetpath, which did not work either.
variables:
temp drive: D:\
target path:
Findings
It appears that the issue lies in Workmanager.ps1. Specifying the parameter as a fixed string rather than as a variable does pass the value. Any solution for this?
start powershell {.\DoWork.ps1 -target "foo" -tempdrive D:\}
When you use a ScriptBlock as an argument to powershell.exe, variables aren't going to be evaluated until after the new session starts. $targetPath has not been set in the child PowerShell process called by Workmanager.ps1 and so it has no value. This is actually an expected behavior of a ScriptBlock in general and behaves this way in other contexts too.
The solution is mentioned in the help text for powershell -?:
[-Command { - | <script-block> [-args <arg-array>] <========== THIS GUY
| <string> [<CommandParameters>] } ]
You must provide the -args parameter which will be passed to the ScriptBlock on execution (separate multiple arguments with a ,). Passed arguments are passed positionally, and must be referenced as though you were processing the arguments to a function manually using the $args array. For example:
$name = 'Bender'
& powershell { Write-Output "Hello, $($args[0])" } -args $name
However, especially with more complicated ScriptBlock bodies, having to remember which index of $args[i] contains the value you want at a given time is a pain in the butt. Luckily, we can use a little trick with defining parameters within the ScriptBlock to help:
$name = 'Bender'
& powershell { param($name) Write-Output "Hello, $name" } -args $name
This will print Hello, Bender as expected.
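Applied to the question's scripts, a sketch of Workmanager.ps1's call might look like this (assuming both scripts live in the current directory, as in the question):
$targetPath = 'M:\target'
& powershell {
param($target, $tempdrive)
.\DoWork.ps1 -target $target -tempdrive $tempdrive
} -args $targetPath, 'D:\'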
Some additional pointers:
The ScriptBlock can be multiline, as though you were defining a function. The examples above are single-line due to their simplicity.
A ScriptBlock is just an unnamed function, which is why defining parameters and referencing arguments within one works the same way.
To exemplify this behavior outside of powershell.exe -Command, Invoke-Command requires you to pass variables to its ScriptBlock in a similar fashion. Note, however, that that answer uses an already-defined function body as the ScriptBlock (which is totally valid to do).
You don't need to use Start-Process here (start is its alias), at least as demonstrated in your example. You can simply use the call operator & unless you need to do something more complex than "run the program and wait for it to finish". See this answer of mine for more information.
If you opt to pass a string to powershell.exe instead, you don't need to provide arguments and your variables will get rendered in the current PowerShell process. However, so will any other unescaped variables that might be intended to be set within the child process, so be careful with this approach. Personally, I prefer using a ScriptBlock regardless, and just deal with the extra parameter definition and arguments. (A sketch of both of these alternatives follows these pointers.)
Using the call & operator is optional when you are not executing a path rendered as a string. It can be omitted in the examples above, but is more useful like so:
& "C:\The\Program Path\Contains\spaces.exe"
& $programPathAsAVariable
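To illustrate the two pointers above about avoiding Start-Process and about passing a string instead, here is a sketch using the question's script names (-tempdrive is given first in the string form only to avoid ending the quoted string with a backslash):
# Direct invocation in the current session - no child process needed:
$targetPath = 'M:\target'
.\DoWork.ps1 -target $targetPath -tempdrive D:\

# Passing a string instead of a ScriptBlock - $targetPath is expanded *here*,
# in the calling session, before the child powershell.exe ever sees it:
powershell -Command ".\DoWork.ps1 -tempdrive D:\ -target $targetPath"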

What is the simplest way to make an alias in PowerShell?

I was wondering if there's any simple way to make aliases for powershell like cmd.
For example: In cmd, doskey art=php artisan $* where $* is optional. Currently, I'm using the following alias in powershell.
function runArtisanCommand
{
param(
[Parameter(Mandatory=$false, Position = 0, ValueFromRemainingArguments = $true)]
$command
)
php artisan $command
}
Set-Alias art runArtisanCommand
This works somewhat, but doesn't accept flags. For example, I can't write art -h or art route:list -c. With art -h, it prints the output of php artisan and doesn't read the flag at all, but art route:list -c errors out with:
runArtisanCommand : Missing an argument for parameter 'command'. Specify a parameter of type 'System.Object' and try again.
At line:1 char:16
+ art route:list -c
+ ~~
+ CategoryInfo : InvalidArgument: (:) [runArtisanCommand], ParameterBindingException
+ FullyQualifiedErrorId : MissingArgument,runArtisanCommand
I would love a simpler solution than this. Thanks in advance.
The simplest and most convenient way to pass unknown arguments through is by splatting the automatic $args array - as @args - in a simple function or script (one that uses neither the [CmdletBinding()] attribute nor [Parameter()] attributes):
# Note: @args rather than $args makes the function work with named
# arguments for PowerShell commands too - see explanation below.
function runArtisanCommand { php artisan @args }
# As in your question: Define alias 'art' for the function
# Note: Of course, you could directly name your *function* 'art'.
# If you do want the function to have a longer name, consider one
# that adheres to PowerShell's Verb-Noun naming convention, such as
# 'Invoke-ArtisanCommand'.
Set-Alias art runArtisanCommand
As an aside: Since the target executable, php, is neither quoted nor specified based on a variable or expression, it can be invoked as-is; otherwise, you would need &, the call operator - see this answer for background information.
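For illustration, with this simple function in place the calls from the question should pass the flags through verbatim:
art -h             # -> php artisan -h
art route:list -c  # -> php artisan route:list -c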
As for what you tried:
The problem was that use of -c as a pass-through argument only works if you precede it with --:
# OK, thanks to '--'
art -- route:list -c
-- tells PowerShell to treat all remaining arguments as unnamed (positional) arguments, instead of trying to interpret tokens such as -c as parameter names.
Without --, -c is interpreted as referring to your -command parameter (the parameter you declared as $command with ValueFromRemainingArguments = $true), given that PowerShell allows you to specify name prefixes in lieu of full parameter names, as long as the given prefix is unambiguous.
Because a parameter of any type other than [switch] requires an associated argument, -c (aka -command) failed with an error message to that effect.
You could have avoided the collision by naming your parameter so that it doesn't collide with any pass-through parameters, such as by naming it -_args (with parameter variable $_args):
function runArtisanCommand
{
param(
# Note: `Mandatory = $false` and `Position = 0` are *implied*.
[Parameter(ValueFromRemainingArguments)]
$_args
)
php artisan @_args
}
However, given that use of a [Parameter()] attribute implicitly makes your function an advanced function, it invariably also accepts common parameters, such as -ErrorAction, -OutVariable, -Verbose... - all of which can be passed by unambiguous prefix / short alias too; e.g., -outv for -OutVariable, or the alias -ea for -ErrorAction; collisions with them cannot be avoided.
Therefore, intended pass-through arguments such as -e still wouldn't work:
# FAILS, because -e ambiguously matches common parameters -ErrorAction
# and -ErrorVariable.
PS> art route:list -e
Parameter cannot be processed because the parameter name 'e' is ambiguous.
Possible matches include: -ErrorAction -ErrorVariable.
Again, -- is needed:
# OK, thanks to '--'
art -- route:list -e
Summary:
Especially for functions wrapping calls to external programs, such as php.exe, using a simple function with @args, as shown at the top, is not only simpler, but also more robust.
For functions wrapping PowerShell commands (with explicitly declared parameters):
a simple function with @args works too,
but if you also want support for tab-completion and showing a syntax diagram with the supported parameters, by passing -?, or via Get-Help, consider defining an (invariably advanced) proxy (wrapper) function via the PowerShell SDK - see below.
Optional background information: Pass-through arguments in PowerShell
As Mathias R. Jessen points out, the simplest way to pass (undeclared) arguments passed to a function or script through to another command is to use the automatic $args variable, which is an automatically populated array of all the arguments passed to a simple function or script (one that hasn't been made advanced through use of the [CmdletBinding()] and/or [Parameter()] attributes).
As for why @args (splatting) rather than $args should be used:
Using $args as-is in your wrapper function only works for passing positional arguments through (those not prefixed by the parameter name; e.g., *.txt), as opposed to named arguments (e.g., -Path *.txt).
If the ultimate target command is an external program (such as php.exe in this case), this isn't a problem, because PowerShell of necessity then treats all arguments as positional arguments (it cannot know the target program's syntax).
However, if a PowerShell command (with formally declared parameters) is ultimately called, only splatting the $args array - which syntactically means use of @args instead - supports passing named arguments through.[1]
Therefore, as a matter of habit, I suggest always using @args in simple wrapper functions, which works equally well with external programs.[2]
To give an example with a simple wrapper function for Get-ChildItem:
# Simple wrapper function for Get-ChildItem that lists recursively
# and by relative path only.
function dirTree {
# Use @args to make sure that named arguments are properly passed through.
Get-ChildItem -Recurse -Name @args
}
# Invoke it with a *named* argument passed through to Get-ChildItem
# If $args rather than @args were used inside the function, this call would fail.
dirTree -Filter *.txt
Using a proxy function for more sophisticated pass-through processing:
The use of @args is convenient, but comes at the expense of not supporting the following:
tab-completion, given that tab-completion only works with formally declared parameters (typically with a param(...) block).
showing a syntax diagram with the supported parameters, by passing -?, or via Get-Help
To overcome these limitations, the parameter declarations of the ultimate target command must be duplicated in the (then advanced) wrapper function; while that is cumbersome, PowerShell can automate the process by scaffolding a so-called proxy (wrapper) function via the PowerShell SDK - see this answer.
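As a rough sketch of that scaffolding step (assuming PowerShell 5+ for the ::new() syntax; see the linked answer for details):
# Generate the source code of a proxy (wrapper) function for Get-ChildItem:
$command  = Get-Command Get-ChildItem
$metadata = [System.Management.Automation.CommandMetadata]::new($command)
[System.Management.Automation.ProxyCommand]::Create($metadata)
# The emitted text is the body of an advanced function; paste it into a
# function definition and customize it (e.g., hard-code -Recurse -Name).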
Note:
With respect to common parameters such as -ErrorAction, it is the proxy function itself that (automatically) processes them, but that shouldn't make a difference to the caller.
Scaffolding a proxy function only works with PowerShell commands, given that PowerShell has no knowledge of the syntax of external programs.
However, you can manually duplicate the parameter declarations of the external target program.
[1] Note that the automatic $args array has built-in magic to support this; passing named arguments through with splatting is not supported with a custom array and requires use of a hash table instead, as discussed in the help topic about splatting linked to above.
[2] In fact, only @args also supports the correct interpretation of --%, the stop-parsing symbol.

Powershell seems to get the wrong number of elements in a param array

Assume I have (In file test.ps1):
param (
[string[]] $one
)
Write-Host $one.Count
If I do:
powershell -File test.ps1 -one "hello","cat","Dog"
I get:
1
But I expect:
3
Why?
"-one" is getting passed in as a whole string as the converting happens before calling the method.
You could alternatively call it like the following
powershell -Command {.\test.ps1 -one "hello","cat","Dog"}
To complement Kevin Smith's helpful answer:
The only way to pass an array to a PowerShell script via the PowerShell CLI (powershell.exe; pwsh for PowerShell Core) is to use -Command (-c).
By contrast, -File interprets the arguments as literal values, and does not recognize arrays, variable references ($foo), ...; in the case at hand, the script ends up seeing a single string with literal contents hello,cat,Dog (due to double-quote removal).
From inside PowerShell:
Use -Command with a script block ({ ... }), as shown in Kevin's answer. This not only simplifies the syntax (just use regular syntax inside the block), but also produces type-rich output (not just strings, as with other external programs), because the target PowerShell instance uses the CLIXML serialization format to output its results, which the calling session automatically deserializes - the same way that PowerShell remoting / background jobs work (as with the latter, however, the type fidelity of the deserialization is invariably limited; see this answer).
Note, however, that from within PowerShell you generally don't need the CLI, which creates a (costly) child process, and can just invoke a *.ps1 script file directly:
.\test.ps1 -one hello, cat, Dog
From outside PowerShell - typically cmd.exe / a batch file - use -Command with a single, double-quoted string containing the PowerShell code to execute, given that using script blocks isn't supported from the outside.
powershell -Command ".\test.ps1 -one hello, cat, Dog"
Note that, with -Command, just as with direct invocation inside PowerShell, you must use .\test.ps1 rather than just test.ps1 in order to execute a file by that name in the current directory, which is a security feature.
Also note that with your simple argument values, "..."-enclosing them is optional, which is why the commands above use just hello, cat, Dog instead of "hello", "cat", "Dog"; in fact, using embedded " chars. in an overall "..." command string can get quite tricky - see this answer.
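Putting the two CLI styles side by side (run from cmd.exe; the output is shown beneath each call, as in the question):
powershell -File test.ps1 -one "hello","cat","Dog"
1
powershell -Command ".\test.ps1 -one hello, cat, Dog"
3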

Calling powershell script with many args from cmd

I have a simple powershell script. It takes two parameters. Both are used with cmdlets (Get-ResourcePool) that take string arguments.
If I call the function definition in the Powershell script, like so:
functName CI *FMS
That works fine.
The function call in PowerShell is (written like so because this script will be called from outside):
FuncName $($args[0], $args[1])
I try to call this from Powershell editor, where I have the snapins I need all installed, like so:
C:\Script.ps1 A "*s"
Where Script.ps1 is the name of my .ps1 file. There is one function. This, however, fails with an error that the argument is null or empty.
Any ideas why?
EDIT:
The function signature is:
function RevertToSnapshot($otherarg, $wildcard)
I use $wildcard here:
$SearchString = [System.String]::Concat("*",
$VMWildcard)
Name $SearchString (Name is a parameter to get-vm in powercli).
This calling style:
FuncName $($args[0], $args[1])
Will result in just one argument passed to FuncName - a single array with two elements. You probably want:
FuncName $args[0] $args[1]
In general, with PowerShell you call cmdlets, functions, aliases using space separated arguments and no parens. Calling .NET methods is the one exception to this rule where you have to use parens and commas to separate arguments e.g:
[string]::Concat('ab','c')
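To make the contrast concrete, here is a short sketch with an illustrative function:
function FuncName($first, $second) { "first=$first second=$second" }
FuncName 'CI' '*FMS'            # correct: two arguments -> first=CI second=*FMS
FuncName('CI', '*FMS')          # wrong: one 2-element array -> first=CI *FMS second=
[string]::Concat('CI', '*FMS')  # .NET method call: parentheses and commas required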