It was pointed out to me (in the question "PowerShell, replicate bash parallel ping") that I can load a function from the internet as follows:
iex (irm https://raw.githubusercontent.com/proxb/AsyncFunctions/master/Test-ConnectionAsync.ps1)
The referenced URL, Test-ConnectionAsync.ps1, contains two functions: Ping-Subnet and Test-ConnectionAsync.
This made me wonder whether I could define bypass functions in my personal module: dummy functions that are permanently overridden as soon as they are first invoked, e.g.
function Ping-Subnet <mimic the switches of the function to be loaded> {
    if <function is not already loaded from internet> {
        iex (irm https://raw.githubusercontent.com/proxb/AsyncFunctions/master/Test-ConnectionAsync.ps1)
    }
    # Now, somehow, permanently overwrite Ping-Subnet with the function loaded from the URL
    Ping-Subnet <pass the switches that we mimicked to the required function that we have just loaded>
}
This would very simply allow me to reference a number of useful scripts directly from my module, but without having to download them all from the internet when the module loads; i.e., the functions are only loaded on demand, when I invoke them, and I will often never invoke them unless I need them.
You could use the Parser to find the functions in the remote script and load them into your scope. This will not be a self-updating function, but it should be safer than what you're trying to accomplish.
using namespace System.Management.Automation.Language

function Load-Function {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory, ValueFromPipeline)]
        [uri] $URI
    )

    process {
        try {
            $funcs = Invoke-RestMethod $URI
            $ast = [Parser]::ParseInput($funcs, [ref] $null, [ref] $null)
            foreach ($func in $ast.FindAll({ $args[0] -is [FunctionDefinitionAst] }, $true)) {
                if ($func.Name -in (Get-Command -CommandType Function).Name) {
                    Write-Warning "$($func.Name) is already loaded! Skipping"
                    continue
                }
                New-Item -Name "script:$($func.Name)" -Path function: -Value $func.Body.GetScriptBlock()
            }
        }
        catch {
            Write-Warning $_.Exception.Message
        }
    }
}
Load-Function https://raw.githubusercontent.com/proxb/AsyncFunctions/master/Test-ConnectionAsync.ps1
Ping-Subnet # => now available in your current session.
function Ping-Subnet {
    # Download the script and strip a leading UTF-8 BOM, if present.
    $toImport = (IRM "https://raw.githubusercontent.com/proxb/AsyncFunctions/master/Test-ConnectionAsync.ps1").
        Replace([Text.Encoding]::UTF8.GetString((239,187,191)), "")
    # NMO is the built-in alias for New-Module: the dynamic module auto-exports
    # the script's functions into the session.
    NMO ([ScriptBlock]::Create($toImport)) | Out-Null
    # Re-invoke the original command line, which now resolves to the downloaded function.
    $MyInvocation.Line | IEX
}

function Test-ConnectionAsync {
    $toImport = (IRM "https://raw.githubusercontent.com/proxb/AsyncFunctions/master/Test-ConnectionAsync.ps1").
        Replace([Text.Encoding]::UTF8.GetString((239,187,191)), "")
    NMO ([ScriptBlock]::Create($toImport)) | Out-Null
    $MyInvocation.Line | IEX
}
Ping-Subnet -Result Success
Test-ConnectionAsync -Computername $env:COMPUTERNAME
Result:
Computername Result
------------ ------
192.168.1.1 Success
192.168.1.2 Success
192.168.1.146 Success
Computername IPAddress Result
------------ --------- ------
HOME-PC fe80::123:1234:ABCD:EF12 Success
Yes, it should work. Calling Test-ConnectionAsync.ps1 from within a function will create the functions defined within it in the wrapping function's scope. You will be able to call any wrapped function until the wrapping function's scope ends.
If you name the wrapper and wrapped functions differently, you can check whether the function has been declared with something like...
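For example, a minimal sketch (the wrapper name Invoke-PingSubnet is hypothetical; Ping-Subnet is the wrapped, downloaded function):
function Invoke-PingSubnet {
    # Download only if the wrapped function isn't declared yet.
    if (-not (Get-Command -Name Ping-Subnet -CommandType Function -ErrorAction SilentlyContinue)) {
        iex (irm https://raw.githubusercontent.com/proxb/AsyncFunctions/master/Test-ConnectionAsync.ps1)
    }
    Ping-Subnet @args   # forward the wrapper's arguments
    # Note: per the scoping caveat above, the functions loaded here live only in
    # this wrapper's scope, so each top-level call re-downloads them.
}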
Otherwise, you need to get more creative.
That said, PROCEED WITH CAUTION. Remote code execution like this is fraught with security issues, especially in the way we're talking about it here, i.e., with no validation of Test-ConnectionAsync.ps1.
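If you do go down this road anyway, one possible mitigation is to pin the downloaded content to a known SHA-256 hash before executing it. A hedged sketch; $knownGoodHash is a placeholder you would have to compute and maintain yourself:
$code  = Invoke-RestMethod 'https://raw.githubusercontent.com/proxb/AsyncFunctions/master/Test-ConnectionAsync.ps1'
$bytes = [Text.Encoding]::UTF8.GetBytes($code)
$hash  = [BitConverter]::ToString(
    [System.Security.Cryptography.SHA256]::Create().ComputeHash($bytes)) -replace '-'
if ($hash -ne $knownGoodHash) { throw 'Remote script hash mismatch; refusing to execute.' }
iex $code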
Fors1k's answer deserves the credit for coming up with the clever fundamentals of the approach:
Download and execute the remote script's content in a dynamic module created with New-Module (whose built-in alias is nmo), which causes the script's functions to be auto-exported and to become available session-globally.[1]
Note that dynamic modules aren't easy to discover, because they're not shown in Get-Module's output; however, you can discover them indirectly, via the .Source property of the command-info objects output by Get-Command:
Get-Command | Where-Object Source -like '__DynamicModule_*'
That the downloaded functions become available session-globally may be undesired if you're trying to use the technique inside a script that shouldn't affect the session's global state - see the bottom section for a solution.
Then re-invoke the function, under the assumption that the original stub function has been replaced with the downloaded version of the same name, passing the received arguments through.
While Fors1k's solution will typically work, here is a streamlined, robust alternative that prevents potential, inadvertent re-execution of code:
function Ping-Subnet {

  $uri = 'https://raw.githubusercontent.com/proxb/AsyncFunctions/master/Test-ConnectionAsync.ps1'

  # Define and session-globally import a dynamic module based on the remote
  # script's content.
  # Any functions defined in the script would automatically be exported.
  # However, unlike with persisted modules, *aliases* are *not* exported by
  # default, which the appended Export-ModuleMember call below compensates for.
  # If desired, also add -Variable * in order to export variables too.
  # Conversely, if you only care about functions, remove the Export-ModuleMember call.
  $dynMod = New-Module ([scriptblock]::Create(
    (Invoke-RestMethod $uri) + "`nExport-ModuleMember -Function * -Alias *"
  ))

  # If this stub function shadows the newly defined function in the dynamic
  # module, remove it first, so that re-invocation by name uses the new function.
  # Note: This happens if this stub function is run in a child scope, such as
  #       in a (non-dot-sourced) script rather than in the global scope.
  #       If run in the global scope, curiously, the stub function seemingly
  #       disappears from view right away - not even Get-Command -All shows it later.
  $myName = $MyInvocation.MyCommand.Name
  if ((Get-Command -Type Function $myName).ModuleName -ne $dynMod.Name) {
    Remove-Item -LiteralPath "function:$myName"
  }

  # Now invoke the newly defined function of the same name, passing the arguments
  # through (@args splatting).
  & $myName @args

}
Specifically, this implementation ensures:
That aliases defined in the remote script are exported as well (just remove + "`nExport-ModuleMember -Function * -Alias *" from the code above if that is undesired).
That the re-invocation robustly targets the new, module-defined implementation of the function - even if the stub function runs in a child scope, such as in a (non-dot-sourced) script.
When run in a child scope, $MyInvocation.Line|IEX (iex is a built-in alias of the Invoke-Expression cmdlet) would result in an infinite loop, because the stub function itself is still in effect at that time.
That all received arguments are passed through on re-invocation without re-evaluation.
Using the built-in magic of splatting the automatic $args variable (@args) passes only the received, already expanded arguments through, supporting both named and positional arguments.[2]
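As a minimal illustration of this splatting magic (both function names are hypothetical):
function Invoke-Downstream {
    param([string] $Name, [switch] $Force)
    "Name=$Name Force=$Force"
}
function Invoke-Wrapper {
    # No param() block: everything lands in $args and is passed through
    # verbatim, named and positional arguments alike.
    Invoke-Downstream @args
}
Invoke-Wrapper -Name foo -Force   # => Name=foo Force=True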
$MyInvocation.Line|IEX has two potential problems:
If the invoking command line contained multiple commands, they are all repeated.
You can solve this particular problem by substituting (Get-PSCallStack)[1].Position.Text for $MyInvocation.Line, but that still wouldn't address the next problem.
Both $MyInvocation.Line and (Get-PSCallStack)[1].Position.Text contain the arguments that were passed in unexpanded (unevaluated) form, which causes Invoke-Expression to re-evaluate them. The peril is that, at least hypothetically, this re-evaluation could involve lengthy commands whose output served as arguments or, worse, commands with side effects that cannot or should not be repeated.
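To illustrate the first problem with a hypothetical demo function:
function Show-Line { $MyInvocation.Line }
'before'; Show-Line; 'after'
# Show-Line outputs:  'before'; Show-Line; 'after'
# i.e., the entire command line; piping that to Invoke-Expression would
# re-run 'before' and 'after' as well.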
Scoping the technique to a given local script:
That the downloaded functions become available session-globally may be undesired if you're trying to use the technique inside a script that shouldn't affect the session's global state; that is, you may want the functions exported via the dynamic module to disappear when the script exits.
This requires two extra steps:
Piping the dynamic module to Import-Module, which is the prerequisite for being able to unload it before exiting with Remove-Module.
Calling Remove-Module with the dynamic module before exiting in order to unload it.
function Ping-Subnet {

  $uri = 'https://raw.githubusercontent.com/proxb/AsyncFunctions/master/Test-ConnectionAsync.ps1'

  # Save the module in a script-level variable, and pipe it to Import-Module
  # so that it can be removed before the script exits.
  $script:dynMod = New-Module ([scriptblock]::Create(
    (Invoke-RestMethod $uri) + "`nExport-ModuleMember -Function * -Alias *"
  )) | Import-Module -PassThru

  # If this stub function shadows the newly defined function in the dynamic
  # module, remove it first, so that re-invocation by name uses the new function.
  # Note: This happens if this stub function is run in a child scope, such as
  #       in a (non-dot-sourced) script rather than in the global scope.
  #       If run in the global scope, curiously, the stub function seemingly
  #       disappears from view right away - not even Get-Command -All shows it later.
  $myName = $MyInvocation.MyCommand.Name
  if ((Get-Command -Type Function $myName).ModuleName -ne $dynMod.Name) {
    Remove-Item -LiteralPath "function:$myName"
  }

  # Now invoke the newly defined function of the same name, passing the arguments
  # through (@args splatting).
  & $myName @args

}
# Sample commands to perform in the script.
Ping-Subnet -?
Get-Command Ping-Subnet, Test-ConnectionAsync | Format-Table
# Before exiting, remove (unload) the dynamic module.
$dynMod | Remove-Module
[1] This assumes that the New-Module call itself is made outside of a module; if it is made inside a module, at least that module's commands see the auto-exported functions; if that module uses implicit exporting behavior (which is rare and not advisable), the auto-exported functions from the dynamic module would be included in that module's exports and therefore again become available session-globally.
[2] This magic has one limitation, which, however, will only rarely surface: [switch] parameters with a directly attached Boolean argument aren't supported (e.g., -CaseSensitive:$true) - see this answer.
In bash, I have a bunch of aliases that add parameters to existing programs/functions, for example:
alias grep='grep --color'
I know that's not the best analogy, but is there a simple way to do that in PowerShell? It seems like Set-Alias doesn't let you specify parameters.
You can create an alias for a cmdlet, but you cannot create an alias
for a command that consists of a cmdlet and its parameters.
They suggest creating a new cmdlet to do so, but I'd prefer to be able to pass additional parameters without having to hardcode all the allowed parameters in the new cmdlet (as New-ProxyCommand seems to require). That way, I don't have to notice when the proxied/aliased cmdlet's parameters change and update my proxy cmdlet accordingly.
So what's the best solution for:
Not statically duplicating the parameter definitions of the aliased/proxied cmdlet; letting the original cmdlet do the validation, or referring to it dynamically.
Using an alias or differently named cmdlet, so you have to do something explicit to get the different behavior.
Having the alias/new cmdlet pass values to existing parameters of the aliased/proxied cmdlet.
The closest I can think of is something like the below, though the syntax is probably wrong. It also seems like it wouldn't play the nicest with piping, but that can probably be worked around somehow.
& $proxiedcommand $additionaldefaultparams $rawparamsfromread-host
Or is there a way to use the things for proxy cmdlets to dynamically instantiate parameters like below?
function aliased-cmdlet
{
    [CmdletBinding((Get-Command Original-Cmdlet)._cmdletBindingsettings_)]
    Param(
        (Get-Command Original-Cmdlet)._paramsettings_
    )
    Original-Cmdlet -CustomDefault Value -Whatever Else
}
If the only change you want to override is a default parameter value, there's already a built-in facility for that. Use the $PSDefaultParameterValues automatic variable:
PS C:\> ('a a a' |Select-String 'a').Matches.Count
1
PS C:\> $PSDefaultParameterValues['Select-String:AllMatches']=$true
PS C:\> ('a a a' |Select-String 'a').Matches.Count
3
If you want to override default parameter values in some instances, but not change the default behavior of the cmdlet, create a proxy command and set default values for the proxy command:
# Gather required info
$OriginalCommand = Get-Command Select-String
$NewCommandName = 'Select-AllMatches'
$Metadata = [System.Management.Automation.CommandMetadata]::new($OriginalCommand)
# Create proxy command
$ProxyString = [System.Management.Automation.ProxyCommand]::Create($Metadata)
New-Item -Path function:\ -Name $NewCommandName -Value $ProxyString
# Set default parameter values for proxy command
$PSDefaultParameterValues["$NewCommandName`:AllMatches"] = $true
Now the parameter default value is only overridden for Select-AllMatches:
PS C:\> ('a a a' |Select-String 'a').Matches.Count
1
PS C:\> ('a a a' |Select-AllMatches 'a').Matches.Count
3
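To undo the proxy later, remove both the function and its default value entry (a small cleanup sketch):
Remove-Item function:\Select-AllMatches
$PSDefaultParameterValues.Remove('Select-AllMatches:AllMatches')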
I want to list all my PowerShell functions from one directory. The following command works:
Get-ChildItem -Path ($env:USERPROFILE + "\somewhere\*.psm1") -Recurse | ForEach-Object {Get-Command -Module $_.BaseName}
Now I tried to pipe the output from Get-ChildItem directly to the cmdlet Get-Command. Something like this, which does not work:
Get-ChildItem -Path ($env:USERPROFILE + "\somewhere\*.psm1") -Recurse | Get-Command -Module {$_.BaseName}
Obviously, I do not really understand how to correctly pipe the objects from Get-ChildItem to the -Module parameter of Get-Command.
I have two questions:
Do you have a hint how to pipe correctly?
Is it possible to pipe to a specific parameter like -Module or is the object always handed over to one default parameter?
Parameters can be bound in four different ways (see the short demo after this list):
By location in the argument list, e.g., Get-ChildItem C:\ (only certain parameters)
By name in the argument list, e.g. Get-ChildItem -Path C:\
By value from the pipeline, e.g. 1..5 | Get-Random (only certain parameters)
By name from the pipeline, e.g. 'C:\Windows' | Get-ChildItem (only certain parameters)
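A quick way to see the two pipeline-binding modes side by side (a minimal sketch using built-in cmdlets):
# ByValue: the piped object itself binds to the pipeline-enabled parameter.
1..5 | Get-Random
# ByPropertyName: a property whose name matches the parameter binds to it.
[pscustomobject]@{ Path = 'C:\Windows' } | Get-Item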
You can inspect the various ways of parameter binding via Get-Help <command> -Parameter *. You can then see that Get-Command allows the Module parameter to be bound only by property name:
-Module [<String[]>]
    Specifies an array of modules. ...

    Required?                    false
    Position?                    named
    Default value                none
    Accept pipeline input?       True (ByPropertyName)
    Accept wildcard characters?  false
So the input has to be an object that has a Module property, to allow binding. In your case you thus need an additional step in between:
Get-ChildItem -Path ($env:USERPROFILE + "\somewhere\*.psm1") -Recurse |
    Select-Object @{l='Module';e={$_.BaseName}} |
    Get-Command
Now, this particular case is a bit annoying: the Module parameter is bound by property name, but most things don't give you an object with a Module property. Heck, even Get-Module doesn't, since the object it returns uses Name as the property name, so you can't even do
Get-Module | Get-Command
However, many other cases (notably anything concerning paths) work very well automatically. And if you can control your input objects, e.g. when reading from a CSV file or another data source, you can end up with rather nice and concise code.
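For instance, a hedged sketch of that controlled-input pattern, using a hypothetical paths.csv with a Path column and Get-ChildItem, whose -Path parameter does bind by property name:
# paths.csv:
#   Path
#   C:\Windows
#   C:\Temp
Import-Csv .\paths.csv | Get-ChildItem -Directory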
EDIT: Ansgar Wiechers notes that, while the Select-Object workaround above should work with Get-Command, it actually doesn't. This may be a shortfall of PowerShell's parameter-binding algorithm (which is quite complex, as seen above; we never got it to work correctly in Pash either), or the Get-Command cmdlet may declare its parameters in a way that simply cannot allow this binding.
In JavaScript it is possible to do so, I think. In PowerShell I'm not sure how to:
Let's say I want to override every call to Write-Host with my custom method, but at some point I want to execute the native Write-Host inside my override. Is it possible to store the native implementation under another name, so as to call it later from the new implementation?
Update: it seems to me that the answer https://serverfault.com/a/642299/236470 does not fully answer the second part of my question. How do I store and call the native implementation?
Calls to functions will override cmdlets. You can read more on this from about_Command_Precedence on TechNet ...
If you do not specify a path, Windows PowerShell uses the following
precedence order when it runs commands:
Alias
Function
Cmdlet
Native Windows commands
So simply making a function of the same name as a native cmdlet will get you what you want.
function Write-Host {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory, ValueFromPipeline)]
        $string
    )
    process {
        # Executes once for each pipeline object
        if ($string -match "bagels") {
            Microsoft.PowerShell.Utility\Write-Host $string -ForegroundColor Green
        } else {
            Microsoft.PowerShell.Utility\Write-Host $string
        }
    }
}
So now Write-Host works with pipeline input that we can filter on. Calling the "real" cmdlet is as easy as module-qualifying the call; you can see I have done that twice in the code sample above.
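Sample usage might look like this (a rough sketch; the line containing "bagels" renders in green):
'I bought some bagels', 'plain message' | Write-Host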
Be careful that you don't forget you have done this if you save it in a profile or something of that nature. Use Get-Command Write-Host whenever in doubt. In my case you can remove the override by calling Remove-Item function:Write-Host.
You can also look into what are called proxy functions but I think that is overkill for what you intend to do.
Yes you can. I have an answer for that on Server Fault, but since it's a different site I can't close this as a duplicate, so I'll copy it here.
Yes, you can override Get-ChildItem or any other cmdlet in Powershell.
Name Your Function The Same
If you make a function with the same name in the same scope, yours will be used.
Example:
Function Get-ChildItem {
    [CmdletBinding()]
    param(
        # Simulate the parameters here
    )
    # ... do stuff
}
Using Aliases
Create your own function, and then create an alias to that function, with the same name as the cmdlet you want to override.
Example:
Function My-GetChildItem {
    [CmdletBinding()]
    param(
        # Simulate the parameters here
    )
    # ... do stuff
}

New-Alias -Name 'Get-ChildItem' -Value 'My-GetChildItem' -Scope Global
This way is nice because it's easier to test your function without
stomping on the built-in function, and you can control when the cmdlet
is overridden or not within your code.
To remove the alias:
Remove-Item 'Alias:\Get-ChildItem' -Force
I have a module (a psm1 file) containing a set of functions. I need to call all the functions in that module, each of which accepts one parameter (an array of PSToken). Obviously, I can call all the functions directly, but I need changes in the module not to require changes in the calling script. How can I do this?
You can use the Get-Command cmdlet to iterate over the functions in a given module, and then call each function using dot-sourcing:
Import-Module MyPowershellModule
$arrPsToken = @($token1, $token2, $token3)

Get-Command -Module MyPowershellModule |
    Select-Object -ExpandProperty Name |
    ForEach-Object {
        . "$_" $arrPsToken
    }
Keep in mind that this code assumes that all functions have the same signature, which is risky.
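If you want to guard against that, one hedged option is to filter on parameter metadata before invoking; the sketch below assumes the convention that eligible functions expose a parameter named TokenArray (a hypothetical name):
Get-Command -Module MyPowershellModule -CommandType Function |
    Where-Object { $_.Parameters.ContainsKey('TokenArray') } |
    ForEach-Object { & $_.Name -TokenArray $arrPsToken }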