The tabexpansion function only works partially when I override it like so:
function tabexpansion {
    param($line, $lastWord)
    if ($line -eq "hey ") {
        "you", "Joe"
    }
}
The custom completions work as expected, but now I only get the default autocomplete behavior for cmdlet names, not parameters. So New-<Tab> works fine, but New-Alias -<Tab> doesn't. How do I get the regular completions back after overriding tabexpansion?
File-name and cmdlet-name expansion is handled by the shell itself if the function doesn't produce anything for those; everything else, including static members, parameters to cmdlets, etc., is handled by the function. If you take a look at Function:\TabExpansion, there is quite a bit going on in there that you may want to keep if you want the other tab-completion features to still work.
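One way to get both (a sketch, not from the original answer; the TabExpansionOriginal name is mine) is to keep a copy of the built-in function and delegate to it for everything your override doesn't handle:
# Keep a copy of the built-in completer (run once, e.g. in your profile, before overriding).
if (-not (Test-Path Function:\TabExpansionOriginal)) {
    Copy-Item Function:\TabExpansion Function:\TabExpansionOriginal
}
function tabexpansion {
    param($line, $lastWord)
    if ($line -eq "hey ") {
        "you", "Joe"
    }
    else {
        # Delegate parameters, static members, etc. to the original implementation.
        TabExpansionOriginal $line $lastWord
    }
}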
Here's a script to list directories / files passed on the command line -- recursively or not:
param( [switch] $r )

$gci_args = @{
    Recurse     = $r
    ErrorAction = Ignore
}

$args | gci @gci_args
Now this does not work because Ignore is interpreted as a literal. What's the canonical way to pass an ErrorAction?
I see that both "Ignore" and (in PS7) { Ignore } work as values, but neither seems to make a difference for my use case (bad file name created under Linux, which stops PS5 regardless of the ErrorAction, but does not bother PS7 at all). So I'm not even sure the parameter has any effect.
"because Ignore is interpreted as a literal"
No, Ignore is interpreted as a command to execute, because it is parsed in argument mode (command invocation, like a shell) rather than in expression mode (like a traditional programming language) - see this answer for more information.
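A quick illustration of the difference (my own example, not from the linked answer):
Get-ChildItem -ErrorAction Ignore   # OK: in argument mode the bareword Ignore is taken as a string
$ea = Ignore                        # error: the bareword is parsed as a command named 'Ignore'
$ea = 'Ignore'                      # OK: quoting makes it a string literal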
While using a [System.Management.Automation.ActionPreference] enumeration value explicitly, as in filimonic's helpful answer, is definitely an option, you can take advantage of the fact that PowerShell automatically converts back and forth between enum values and their symbolic string representations.
Therefore, you can use string 'Ignore' as a more convenient alternative to [System.Management.Automation.ActionPreference]::Ignore:[1]
$gci_args = @{
    # ...
    ErrorAction = 'Ignore'
}
Note that it is the quoting ('...') that signals to PowerShell that expression-mode parsing should be used, i.e. that the token is a string literal rather than a command.
Also note that -ErrorAction only operates on non-terminating errors (which are the typical kind, however) - see this answer for more information.
As for discovery of the permissible -ErrorAction values:
The conceptual about_CommonParameters help topic covers all common parameters, of which -ErrorAction is one.
Many common parameters have corresponding preference variables (which accept the same values), covered in about_Preference_Variables, which allow you to preset common parameters.
Interactively, you can use tab-completion to see the permissible values (as unquoted symbolic names, which you simply need to wrap in quotes); e.g.:
# Pressing the Tab key repeatedly where indicated
# cycles through the acceptable arguments.
Get-ChildItem -ErrorAction <tab>
[1] Note that using a string does not mean giving up type safety, if the context unambiguously calls for a specific enum type, such as in this case. Validation only happens at runtime either way, given that PowerShell is an interpreted language.
However, it is possible for a PowerShell-aware editor - such as Visual Studio Code with the PowerShell extension - to flag incorrect values at design time. As of version 2020.6.0 of the extension, that does not yet appear to be the case. Fortunately, tab-completion and IntelliSense work as expected, so the problem may not arise.
That said, as zett42 points out, in the context of defining a hashtable entry for later splatting the expected type is not (yet) known, so explicit use of [System.Management.Automation.ActionPreference] does have advantages: (a) IntelliSense in the editor can guide you, and (b) - assuming that Set-StrictMode -Version 2 or higher is in effect - an invalid value will be reported earlier at runtime, namely at the point of assignment, which makes troubleshooting easier. As of PowerShell 7.1, a caveat regarding Set-StrictMode -Version 2 or higher is that you will not be able to use the intrinsic (PowerShell-supplied) .Count property on objects that don't have it type-natively, due to the bug described in GitHub issue #2798.
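For instance (a hypothetical typo, just to illustrate the strict-mode point above):
Set-StrictMode -Version 2
$gci_args = @{
    # The misspelled enum member is reported right here, at assignment time,
    # instead of silently evaluating to $null.
    ErrorAction = [System.Management.Automation.ActionPreference]::Ignor
}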
I think the best way is to use the native type.
$ErrorActionPreference.GetType().FullName # System.Management.Automation.ActionPreference
So, use
$gci_args = @{
    Recurse     = $r
    ErrorAction = [System.Management.Automation.ActionPreference]::Ignore
}
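A quick usage check, assuming the hashtable above inside the script (the path is made up):
Get-ChildItem C:\NoSuchFolder @gci_args   # the missing path produces no error output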
I am trying to create an alias (named which) of the Get-Command cmdlet in a way that it doesn't run if I'm not sending any arguments (because if it's run without arguments it outputs all the commands that are available).
I know this can be done using a function but I would like to keep the tab completion functionality without having to write a sizeable function that is to be placed into my $PROFILE.
In short, I only want the alias to work if it is being passed arguments.
You can't do it with an alias, because PowerShell aliases can only refer to another command name or path, and can therefore neither include arguments nor custom logic.
Therefore you do need a function, but it can be a short and simple one:
function which { if ($args.count) { Get-Command #args } else { Throw "Missing command name." } }
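For example (my own invocation examples):
which Get-Item   # forwards to: Get-Command Get-Item
which foo*       # wildcard arguments are passed through to Get-Command as well
which            # throws: Missing command name.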
Note that while passing -? for showing Get-Command's help does work, tab completion of arguments does not.
In order to get tab completion as well, you'll need to write a wrapper (proxy) function or at least replicate Get-Command's parameter declarations - which then does make the function definition sizable.
If the concern is just the size of the $PROFILE file itself, you can write a proxy script instead - which.ps1 - which you can invoke with just which as well, assuming you place it in one of the directories listed in $env:Path[1]; see next section.
Defining a wrapper (proxy) script or function:
Defining a wrapper (proxy) function or script is a nontrivial undertaking, but allows you to implement a robust wrapper that supports tab completion and even forwarding to the original command's help.
Note:
Bug alert: As zett42 points out, as of PowerShell [Core] 7.1, System.Management.Automation.ProxyCommand.Create neglects to include dynamic parameters if the target command is an (advanced) function or script; however, compiled cmdlets are not affected; see GitHub issue #4792 and this answer for a workaround.
For simplicity, the following creates a wrapper script, which.ps1, and saves it in the current directory. As stated, if you place it in one of the directories listed in $env:PATH, you'll be able to invoke it as just which.
The code below can easily be adapted to create a wrapper function instead: simply take the contents of the $wrapperCmdSource variable below and enclose it in function which { ... }.
As of PowerShell Core 7.0.0-preview.5, there are some problems with the auto-generated code, which may or may not affect you; they will be fixed at some point; to learn more and to learn how to manually correct them, see GitHub issue #10863.
# Create the wrapper scaffolding as source code (outputs a single [string])
$wrapperCmdSource =
[System.Management.Automation.ProxyCommand]::Create((Get-Command Get-Command))
# Write the auto-generated source code to a script file
$wrapperCmdSource > which.ps1
Note:
Even though System.Management.Automation.ProxyCommand.Create requires a System.Management.Automation.CommandMetadata instance to identify the target command, the System.Management.Automation.CommandInfo instances output by Get-Command can be used as-is.
Re comment-based help: By default, the proxy function simply forwards to the original cmdlet's help; however, you can optionally pass a string to serve as the comment-based help as the 2nd argument.
By using [System.Management.Automation.ProxyCommand]::GetHelpComments() in combination with output from Get-Help, you could start with a copy of the original command's help and modify it:
[System.Management.Automation.ProxyCommand]::GetHelpComments((Get-Help Get-Command))
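For example, a sketch that bakes a copy of the original help into the generated proxy, using the optional second parameter mentioned above:
# GetHelpComments turns Get-Help output into a comment-based help block (a string),
# which Create accepts as its optional second argument.
$helpComments = [System.Management.Automation.ProxyCommand]::GetHelpComments((Get-Help Get-Command))
$wrapperCmdSource = [System.Management.Automation.ProxyCommand]::Create((Get-Command Get-Command), $helpComments)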
You now have a fully functional which.ps1 wrapper script that behaves like Get-Command itself.
You can invoke it as follows:
./which # Same as: Get-Command; tab completion of parameters supported.
./which -? # Shows Get-Command's help.
You can now edit the script file to perform the desired customization.
Note: The auto-generated source code contains a lot of boilerplate code; however, typically only one or two places need tweaking to implement the custom functionality.
Specifically, place the following command as the first statement inside the begin { ... } block:
if (-not $MyInvocation.ExpectingInput -and -not ($Name -or $CommandType -or $Module -or $FullyQualifiedModule)) {
    Throw "Missing command name or filter."
}
This causes the script to throw an error if the caller didn't provide some way of targeting a specific command or group of commands, either by direct argument or via the pipeline.
If you invoke the modified script without arguments now, you should see the desired error:
PS> ./which.ps1
Missing command name or filter.
...
Other common types of customizations are:
Removing parameters from the wrapper, by simply removing the parameter declaration.
Adding additional parameters to the invocation of the wrapped command, by modifying the following line in the begin block (a sketch follows this list):
# Add parameters, as needed.
$scriptCmd = { & $wrappedCmd @PSBoundParameters }
Preprocessing pipeline input before passing it to the wrapped command, by customizing the process block and replacing $_ with your preprocessed input in the following line:
# Replace $_ with a preprocessed version of it, as needed.
$steppablePipeline.Process($_)
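As a concrete (hypothetical) example of the parameter-adding customization above, the following defaults the wrapped Get-Command call to cmdlets and functions unless the caller specifies a type:
# Inside the begin block, before $scriptCmd is built:
if (-not $PSBoundParameters.ContainsKey('CommandType')) {
    $PSBoundParameters['CommandType'] = 'Cmdlet', 'Function'
}
$scriptCmd = { & $wrappedCmd @PSBoundParameters }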
For an example of a complete implementation of a proxy function, see this answer.
[1] Caveat for Linux users: since the Linux file-system is case-sensitive, invocation of your script won't work case-insensitively, the way commands normally work in PowerShell. E.g., if your script file name is Get-Foo.ps1, only Get-Foo - using the exact same case - will work, not also get-foo, for instance.
I am attempting to add auto-complete features in PowerShell. In this case, I would like to be able to type "test" in my console and then type Get-Se[TAB] to auto-complete to Get-Search using tab expansion.
PS > Get-Se[TAB]
PS > Get-Search
function test
{
    [CmdletBinding()]
    param()

    # label the while loop "outer"
    :outer while ($true) {
        $x = Read-Host

        # split $x into two parts
        $first, $second = $x -split '\s', 2

        # switch evaluating the first part
        switch ($first) {
            'Get-Search' {
                # Searching
            }
            default {
                Write-Host "False"
            }
        }
    }
}
Additional Information:
Goal:
I'd like to be able to use arguments that look like cmdlets to have the PowerShell feel.
About the original script:
I have created a script to automate queries from several APIs, for many different users. What I have right now for search is "s", and I'd like it to be "Get-Search". So Read-Host waits for input, the user types "Get-Search 'value'", and a formatted JSON response is returned.
PS > Get-Search foobar
#Returns JSON
I had a hard time understanding your intention at first, but I think I get it now.
You want to implement tab completion (tab expansion) inside the Read-Host prompt.
Unfortunately, there is no way to do that.
If you share why you want this, there may be better ways to achieve your ultimate goal.
Based on your additional information, I have a different approach.
Create actual functions for each of your queries, like Get-Search, etc. You can even add aliases for them so that s corresponds directly if you want.
Wrap all of these functions in a proper module, so that you can import them (see next step).
Create a constrained runspace that only allows the user to execute the specific functions and aliases you want (this is easier with a module, but the module is not a requirement).
This gives your end users access (even remotely) to a PowerShell session which can only use the functions you've created and allowed to be executed. Other cmdlets/functions and even language features like using variables will be restricted and unavailable.
That way, you get true PowerShell tab expansion and semantics, and you end up with a real set of functions that can be used in an automated way as well.
You don't have to write any prompting or parsing.
Further, the session can be secured, allowing only specific users and groups to connect to it.
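Here's a minimal sketch of such a constrained runspace (the Get-Search body and the s alias below are placeholders; a real setup would expose the functions from your module instead):
using namespace System.Management.Automation.Runspaces

# Start from an empty session state: nothing is available except what you add explicitly.
$iss = [InitialSessionState]::Create()
$iss.Commands.Add([SessionStateFunctionEntry]::new('Get-Search', 'param($Query) "Searching for $Query ..."'))
$iss.Commands.Add([SessionStateAliasEntry]::new('s', 'Get-Search'))

$rs = [runspacefactory]::CreateRunspace($iss)
$rs.Open()

$ps = [powershell]::Create()
$ps.Runspace = $rs
$ps.AddCommand('Get-Search').AddParameter('Query', 'foobar').Invoke()   # works; other commands aren't available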
Let's take the PowerShell statement below as an example:
powershell.exe c:\temp\windowsbroker.ps1 IIS
Is it possible to have windowsbroker.ps1 check for that IIS string and, if it's present, run a specific install script? The broker script is intended to install different applications depending on what string follows it when it is called.
This may seem like an odd question, but I've been using CloudFormation to spin up application environments and I'm specifying an "ApplicationStack" parameter that will be referenced at the time when the powershell script is run so it knows which script to run to install the correct application during bootup.
What you're trying to do is called argument or parameter handling. In its simplest form PowerShell provides all arguments to a script in the automatic variable $args. That would allow you to check for an argument IIS like this:
if ($args -contains 'iis') {
    # do something
}
or like this if you want the check to be case-sensitive (which I wouldn't recommend, since Windows and PowerShell usually aren't):
if ($args -ccontains 'IIS') {
    # do something
}
However, since apparently you want to use the argument as a switch to trigger specific behavior of your script, there are better, more sophisticated ways of doing this. You could add a Param() section at the top of your script and check if the parameter was present in the arguments like this (for a list of things to install):
Param(
    [Parameter()]
    [string[]]$Install
)

$Install | ForEach-Object {
    switch ($_) {
        'IIS' {
            # do something
        }
        ...
    }
}
or like this (for a single option):
Param(
    [switch]$IIS
)

if ($IIS.IsPresent) {
    # do something
}
You'd run the script like this:
powershell "c:\temp\windowsbroker.ps1" -Install "IIS",...
or like this respectively:
powershell "c:\temp\windowsbroker.ps1" -IIS
Usually I'd prefer switches over parameters with array arguments (unless you have a rather extensive list of options), because with the latter you have to worry about the spelling of the array elements, whereas with switches you get a built-in spell check.
Using a Param() section will also automatically add a short usage description to your script:
PS C:\temp> Get-Help windowsbroker.ps1
windowsbroker.ps1 [-IIS]
You can further enhance this built-in usage help via comment-based help.
Using parameters has a lot of other advantages on top of that (even though they probably aren't of that much use in your scenario). You can do parameter validation, make parameters mandatory, define default values, read values from the pipeline, make parameters depend on other parameters via parameter sets, and so on. See here and here for more information.
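For instance, a sketch (with hypothetical stack names) that combines a few of these features:
Param(
    # Accept stack names from the pipeline, restrict them to a known set
    # (which also gives tab completion for the values), and default to IIS.
    [Parameter(ValueFromPipeline = $true)]
    [ValidateSet('IIS', 'SQL', 'Redis')]
    [string[]]$Install = 'IIS'
)

process {
    foreach ($app in $Install) {
        "Installing $app ..."
    }
}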
Yes, they are called positional parameters. You declare the parameters at the beginning of your script:
Param(
    [string]$appToInstall
)
You could then write your script as follows:
switch ($appToInstall) {
    "IIS" { "Install IIS here" }
}
I have a bunch of functions that I call that produce output that is displayed to the console. Functions might look something like the following:
exec { & .\xunit.console.clr4 tests.xunit }
#or
exec { & .\nuget.exe pack $source_dir\ZocMonLib\NuSpec\ZocMon.nuspec -OutputDirectory $build_dir\local -Symbols -Version $version }
Now I know I could do something like powershell indentation but that only works if I control the output.
How do I do the indenting of output for these private functions?
Ok, I wrote a version that does the line wrapping right. But it's slightly complicated. I posted it on PoshCode http://poshcode.org/3386
That should work for Write-Host or Write-Verbose, but it will not work if those functions are actually outputting objects -- you'd have to pipe to Write-Host.
The function on PoshCode will (optionally) auto-indent based on the stack depth, but also allow you to specify -Pad 5 or something to manually indent, so you can call nuget.exe ... | write-host -pad 5 or just stick | Write-Host wherever you need it, and then set $WriteVerboseAutoIndent = $true ...
Hope that helps -- it does do manual line wrapping on the output of exes, so it should work.
There's not a great solution, because PowerShell doesn't always run in the console window. Other hosting applications might or might not support tab characters, and might not even support Write-Host. If your goal is strictly to support console display, consider writing a "Format-Console" function.
nuget list NuGetPowerTools | Format-Console
Inside that function, you can capture the pipeline input (which I presume would be strings since this is an external command). Each line of output would be a single String object, so...
Write-Host "    $x"
would display that line indented by four spaces.
function Format-Console {
    [CmdletBinding()]
    param([Parameter(ValueFromPipeline=$True)][string[]]$inputObject)
    PROCESS { Write-Host "    $inputObject" }
}
That's kinda quick and dirty, but assuming you only ever pipe strings to it, it'll work. Building this as a function lets it be more reusable; using the Format- verb cues other users that the output of this isn't intended to be consumable. It technically isn't a true "Format" cmdlet, since it doesn't output internal formatting directives, but it's consistent with the usage pattern for the other Format- cmdlets.
Can't you assign the result of your private functions to a string and "tab" that string?
$x = nuget list NuGetPowerTools
Write-Host "`t`t$x"
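Note that if the command emits multiple lines, $x is an array and the interpolation above joins everything onto a single line; to indent each line separately you could do something like:
nuget list NuGetPowerTools | ForEach-Object { Write-Host "`t`t$_" }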