User cashfoley has posted what appears to be a fairly elegant set of code on CodePlex for a "module" called PSClass.
When I dot-source the psclass code into some code of my own, I am able to write code like:
$Animal = New-PSClass Animal {
    constructor {
        param( $name, $legs )
        # ...
    }
    method -override ToString {
        "A $($this.Class.ClassName) named $($this.name) with $($this.Legs) Legs"
    }
}
When I tried to create a module out of the PSClass code, however, I started getting errors. The constructor and method names are no longer recognized.
Looking at the actual implementation, what I see is that constructor, method, etc. are actually nested functions inside the New-PSClass function.
Thus, it seems to me that when I dot-source the PSClass.ps1 file, my script-blocks are allowed to contain references to functions nested inside other local functions. But when the PSClass code becomes a module, with the New-PSClass function exported (I tried both using a manifest and using Export-ModuleMember), the names are no longer visible.
Can someone explain to me how the script blocks, scoping rules, and visibility rules for nested functions work together?
Also, kind of separately, is there a better class definition protocol for pure PowerShell scripting? (Specifically, one that does not involve "just write it in C# and then do this...")
The variables in your script blocks don't get evaluated until they are executed. If the variables in the script block don't exist in the current scope when the block is executed, the variables won't have any values. Script blocks aren't closures: they don't capture the context at instantiation time.
Remove-Variable FooBar -ErrorAction SilentlyContinue

function New-ScriptBlock
{
    $FooBar = 1
    $scriptBlock = {
        Write-Host "FooBar: $FooBar"
    }
    $FooBar = 2
    & $scriptBlock # Outputs "FooBar: 2" because $FooBar was set to 2 before invocation
    return $scriptBlock
}

function Invoke-ScriptBlock
{
    param(
        $ScriptBlock
    )
    & $ScriptBlock
}

$scriptBlock = New-ScriptBlock
& $scriptBlock                  # Outputs "FooBar: " since $FooBar doesn't exist in this scope
$FooBar = 3
Invoke-ScriptBlock $scriptBlock # Outputs "FooBar: 3" since $FooBar is set to 3 here
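If you do want a script block to remember the value a variable had at a particular point, you can turn it into a closure with its GetNewClosure() method; a minimal sketch:

$FooBar = 1
$closure = { Write-Host "FooBar: $FooBar" }.GetNewClosure()   # captures $FooBar = 1
$FooBar = 2
& $closure   # still outputs "FooBar: 1"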
Related
In the single-module scenario, running Set-Var returns 10.
# m.psm1
function Set-Var {
    $MyVar = 10
    Get-Var
}

function Get-Var {
    $MyVar
}
In the nested-modules scenario, running Set-Var does not return any value.
# m1.psm1
function Get-Var {
    $MyVar
}

# m.psm1
Import-Module .\m1.psm1

function Set-Var {
    $MyVar = 10
    Get-Var
}
How do I achieve the same effect as a single module with nested modules? Using $script:MyVar also does not work. However, I would like to keep the scope of the variable local to enable concurrent executions with different values.
Your code doesn't work because local variables are not inherited by functions that live in a nested module; each module has its own scope chain.
You can do this instead:
$null = New-Module {
    function Get-Var {
        [CmdletBinding()] param()
        $PSCmdlet.SessionState.PSVariable.Get('MyVar').Value
    }
}
The New-Module command creates an in-memory (dynamic) module; this is needed because the technique only works when the caller is in a different module or script than the function.
Use the CmdletBinding attribute to create an advanced function. This is a prerequisite for using the automatic $PSCmdlet variable, which we need in the next step.
Use its SessionState.PSVariable member to get or set a variable from the parent (module) scope.
This answer shows an example how to set a variable in the parent (module) scope.
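For instance, a minimal sketch of the setter side (the Set-CallerVar function name is made up here):

$null = New-Module {
    function Set-CallerVar {
        [CmdletBinding()] param($Name, $Value)
        # Creates or updates the variable in the caller's (parent) scope
        # via the caller's session state.
        $PSCmdlet.SessionState.PSVariable.Set($Name, $Value)
    }
}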
See also: Is there any way for a powershell module to get at its caller's scope?
Is it possible for a powershell module to call functions that live in its importer's scope?
For example, say I have module.psm1, and script.ps1. Inside script.ps1 I import module.psm1, and define a function called Execute-Payload. Is there any way for the code in module.psm1 to call Execute-Payload?
If I understood correctly what you're trying to do, there are two commonly used ways to load custom functions into your main script. I'll give you a few examples, since I'm not sure there is a single best practice for this.
The script being executed will always be main.ps1 in all of the examples below.
Example 1: All functions are stored on one file
Folder structure
../path/to/script/main.ps1
../path/to/script/Functions/functions.ps1
Functions.ps1
function myCustomFunction1 {
    ....
}

function myCustomFunction2 {
    ....
}

function myCustomFunction3 {
    ....
}
main.ps1
On the first lines of code you could add something like this:
$ErrorActionPreference = 'Stop'

# Option 1: Using Import-Module.
try
{
    Import-Module "$PSScriptRoot\Functions\functions.ps1"
}
catch
{
    "Failed to load dependency Functions.`nError: $_"
}

# Option 2: Dot Sourcing
try
{
    . "$PSScriptRoot\Functions\functions.ps1"
}
catch
{
    "Failed to load dependency Functions.`nError: $_"
}
Note: both options will load ALL functions.
Example 2: Many functions stored in different files. This is used when you have lots of complex and/or lengthy functions.
Folder structure
../path/to/script/main.ps1
../path/to/script/Functions/myCustomFunction1.ps1
../path/to/script/Functions/myCustomFunction2.ps1
../path/to/script/Functions/myCustomFunction3.ps1
myCustomFunction1.ps1
function myCustomFunction1 {
    ....
}

myCustomFunction2.ps1
function myCustomFunction2 {
    ....
}

myCustomFunction3.ps1
function myCustomFunction3 {
    ....
}
main.ps1
On the first lines of code you could add something like this:
$ErrorActionPreference = 'Stop'

# Option 1: Using Import-Module.
try
{
    Get-ChildItem "$PSScriptRoot\Functions\*.ps1" | Import-Module
}
catch
{
    "Failed to load dependency Functions.`nError: $_"
}

# Option 2: Dot Sourcing
try
{
    Get-ChildItem "$PSScriptRoot\Functions\*.ps1" | ForEach-Object {
        . $_.FullName
    }
}
catch
{
    "Failed to load dependency Functions.`nError: $_"
}
From inside an (advanced) function in your module, you can use $PSCmdlet.InvokeCommand.InvokeScript() with argument $PSCmdlet.SessionState and a script block created from a string to execute arbitrary code in the caller's scope.
The technique was gratefully adapted from this comment on GitHub.
For brevity, the following code demonstrates the technique with a dynamic module created with New-Module, but it equally applies to regular, persisted modules:
# The function to call from the module.
function Execute-Payload {
    "Inside Execute-Payload in the script's scope."
}

# Create (and implicitly import) a dynamic module.
$null = New-Module {

    # Define the advanced module function that calls `Execute-Payload`
    # in its caller's scope.
    function Invoke-Something {
        [CmdletBinding()]
        param()

        $PSCmdlet.InvokeCommand.InvokeScript(
            $PSCmdlet.SessionState,
            # Pass a *string* with arbitrary code to execute in the caller's scope to
            # [scriptblock]::Create().
            # !! It is imperative that [scriptblock]::Create() be used, with
            # !! a *string*, as it creates an *unbound* script block that then
            # !! runs in the specified session.
            # !! If needed, use string expansion to "bake" module-local variable
            # !! values into the string, or pass parameters as additional arguments.
            # !! However, if a script block is passed *by the caller*,
            # !! bound to *its* context, it *can* be used as-is.
            [scriptblock]::Create(' Execute-Payload ')
        )
    }
}
# Now invoke the function defined in the module.
Invoke-Something
The above outputs Inside Execute-Payload in the script's scope., proving that the script's function was called.
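To illustrate the note about string expansion in the comments above, here is a hedged variant (the Invoke-SomethingElse name and the greeting value are made up) that bakes a module-local value into the code string before the unbound script block is created:

$null = New-Module {
    function Invoke-SomethingElse {
        [CmdletBinding()]
        param()

        $localGreeting = 'Hello from the module'   # module-local value

        $PSCmdlet.InvokeCommand.InvokeScript(
            $PSCmdlet.SessionState,
            # The double-quoted string is expanded here, inside the module,
            # so $localGreeting's value is "baked" into the code that then
            # runs in the caller's scope.
            [scriptblock]::Create("Write-Host '$localGreeting'; Execute-Payload")
        )
    }
}

Invoke-SomethingElse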
In PowerShell you can define functions with function name { commands } and have those functions take arguments like this:
function myFunction {
    param($var1, $var2)
}
but you can also accomplish this with
function myFunction($var1, $var2) {}
and they would be the same.
For example, if I made a function func1 be:
function func1 {
    param($var1, $var2)
    echo "$var1 $var2"
}
I would call it by using func1 1 2 where $var1 would be equal to 1 and $var2 would be equal to 2.
Input:
PS C:\Users\Neko> func1 1 2
Output:
1 2
However, if I do the same thing but instead use the other method of passing arguments to functions:
function func2($var1, $var2) {
    echo "$var1 $var2"
}
I would also call it the same exact way, calling it by using func2 1 2 where $var1 would be equal to 1 and $var2 would be equal to 2 like the previous function.
Input:
PS C:\Users\Neko> func2 1 2
Output:
1 2
So everything seems the same and constant between the two renditions of the function, so my question is, is there a difference between the two methods of passing arguments to functions or are they both actually the same? Even if it is the most minor of details, or just a parsing difference, I would like to know any differences between the two in functions specifically since param has other uses as well.
UPDATE: The attributes you can use inside param, like [parameter(Mandatory=$true, ValueFromPipeline=$true)] and [String[]], are not unique to param. You can also accomplish this in the other 'non-param' example by doing:
function func2(
    [parameter(Mandatory=$true, ValueFromPipeline=$true, etc)]
    [String[]]
    $var1, $var2
) {
    echo "$var1 $var2"
}
To complement 7cc's helpful answer:
While the two syntax forms are mostly interchangeable when you define a function's parameters, only the param(...) block syntax works in the following circumstances:
If you want to use a [CmdletBinding()] attribute and its properties to (explicitly) make your function or script an advanced function or script.[1]
If you're writing a script file (*.ps1) or script block ({ ... }): the only way to declare parameters for them is by placing a param(...) block at the beginning.
Therefore, you may opt to always use the param(...) block syntax, for consistency across function and script parameter definitions.
If a [CmdletBinding(...)] attribute is used, it must directly precede the param(...) block.
As for:
I would call it by using func1(1)(2)
No, you would call it as follows:
func1 1 2
That is, PowerShell functions are called like shell commands: without parentheses, separated by whitespace; while your invocation happens to work too, the use of (...) around the arguments can change their interpretation:
without the enclosing (...), the arguments are parsed in argument mode, where, notably, strings needn't be quoted;
with the enclosing (...), they are parsed in expression mode, where strings do need to be quoted.
See this answer for more information.
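To make the difference concrete, here is a small illustration (the Show-Args function is just a made-up example):

function Show-Args { param($a, $b) "a=$a b=$b" }

Show-Args hello world           # argument mode: bare words are strings -> a=hello b=world
Show-Args ('hello') ('world')   # expression mode inside each (...): strings must be quoted
Show-Args (2 + 3) (4 * 5)       # (...) also lets you pass expression results -> a=5 b=20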
[1] While you can place a [CmdletBinding(...)] attribute inside the parentheses with the function Foo (...) { ... } syntax without provoking an error, doing so is effectively ignored. Separately, in the absence of an (effective) explicit [CmdletBinding(...)] attribute, with either syntax, if you happen to decorate at least one parameter with a [Parameter()] attribute, you get the default behaviors of an advanced function (e.g., support for automatic common parameters such as -Verbose), because using [Parameter()] implicitly makes a function an advanced one (as if a [CmdletBinding()] attribute - without explicit property values - were in effect). However, if you need an explicit [CmdletBinding(...)] attribute, so as to opt into non-default advanced-function behaviors, via property values such as PositionalBinding=$false or SupportsShouldProcess=$true, use of a param(...) block is your only option.
One difference is that the CmdletBinding attribute requires a param(...) block:
function Echo-Confirm
{
    # Here
    [CmdletBinding(SupportsShouldProcess=$true, ConfirmImpact="High")]
    Param ($val=1)

    if ($PSCmdlet.ShouldProcess($val) -eq $true) {
        Write-Output "Confirmed $val"
    }
}
Edit after this comment
The syntax is fine, but CmdletBinding has no effect
Function foo (
    [CmdletBinding(SupportsShouldProcess=$true, ConfirmImpact="High")]
    [Parameter()]$val=1
) {
    # never confirms
    if ($PSCmdlet.ShouldProcess($val) -eq $true) {
        Write-Output "always here"
    }
    else {
        Write-Output "never here"
    }
}
foo -Confirm
# throws an error
foo: A parameter cannot be found that matches parameter name 'confirm'.
From About Functions - Functions with Parameters - Named Parameters:
You can define any number of named parameters. You can include a default value for named parameters, as described later in this topic. You can define parameters inside the braces using the Param keyword, as shown in the following sample syntax:

function <name> {
    param ([type]$parameter1[,[type]$parameter2])
    <statement list>
}

You can also define parameters outside the braces without the Param keyword, as shown in the following sample syntax:

function <name> [([type]$parameter1[,[type]$parameter2])] {
    <statement list>
}

…

While the first method is preferred, there is no difference between these two methods.
Edit
@mklement0, thanks for the helpful explication of the latter (emphasized) statement.
I originally wrote that this statement (there is no difference between these two methods) was valid and dismissed 7cc's answer as guesswork; however, 7cc's answer is right, as explained in mklement0's comments below and his updated answer.
In About Functions Advanced Parameters => Attributes of parameters, there are some allusions to the relationship between the CmdletBinding and Parameter attributes and advanced functions (advanced functions use the CmdletBinding attribute to identify them as functions that act similar to cmdlets):
… if you omit the CmdletBinding attribute, then to be recognized as an advanced function, the function must include the Parameter attribute…

… to be recognized as an advanced function, rather than a simple function, a function must have either the CmdletBinding attribute or the Parameter attribute, or both.
I can't comprehend PowerShell inventors' motivation for such (confusing for me) design…
I have been working with the PowerShell AST to create some custom rules for PSScriptAnalyzer.
In a lot of the example code for AST, there is one line that I don't understand. Here is an example.
First parse a file, in this case, the current open file in the ISE.
$AbstractSyntaxTree = [System.Management.Automation.Language.Parser]::ParseInput(
    $psISE.CurrentFile.Editor.Text, [ref]$null, [ref]$null)
This makes sense so far. Let's say that we want to look for all the ParameterAst objects. The code that I have seen to do this is below.
$params = $AbstractSyntaxTree.FindAll({$args[0] -is [System.Management.Automation.Language.ParameterAst]}, $true)
This line of code is calling FindAll and passing in a scriptblock, that seems to be acting as a filter, so that only ParameterAst objects are returned.
What I don't understand here is how $args[0] fits into this call. How are any parameters actually getting passed into the scriptblock when the FindAll method is invoked?
The FindAll method has the following signature (from MSDN):
public IEnumerable<Ast> FindAll(
    Func<Ast, bool> predicate,
    bool searchNestedScriptBlocks
)
So the first argument is a delegate that takes an Ast as input and returns a bool.
In PowerShell you can create such a delegate like this:
$delegate = { param($ast) $ast -is [System.Management.Automation.Language.ParameterAst] }
Or without declaring parameter:
$delegate = { $args[0] -is [System.Management.Automation.Language.ParameterAst] }
The FindAll method will then do something like this (pseudocode):
foreach ($node in $allNodes) {
    $shouldAdd = & $delegate $node   # <-- this is how $node gets passed to your delegate
    if ($shouldAdd) {
        # <add the node to the output list>
    }
}
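For completeness, here is a sketch (mine, not from the answer) showing that either delegate form can be passed straight to FindAll; PowerShell converts the script block to the required Func<Ast,bool> automatically:

$params = $AbstractSyntaxTree.FindAll(
    { param($ast) $ast -is [System.Management.Automation.Language.ParameterAst] },
    $true)
$params | ForEach-Object { $_.Name }   # e.g. list the parameter names that were found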
Think of the scriptblock as an anonymous callback function.
It's really the same thing that happens when you use Where-Object { $someCondition }.
.FindAll finds all the (things) and for each one it calls the function you provided it. It's apparently expecting a [bool] result, and returning the objects that satisfied the conditions present in the callback.
In a function, script, or script block in PowerShell, you can have named parameters that are explicitly defined, or you can reference parameters without declaring them using the $args array, which is what's happening here.
Using a scriptblock as a callback is similar to using it for an event:
$Args
Contains an array of the undeclared parameters and/or parameter values that are passed to a function, script, or script block. When you create a function, you can declare the parameters by using the param keyword or by adding a comma-separated list of parameters in parentheses after the function name.
In an event action, the $Args variable contains objects that represent the event arguments of the event that is being processed.
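A hedged sketch of the event case (the timer example is made up for illustration; in an event action you can also use the $EventArgs automatic variable instead of indexing $Args):

$timer = New-Object System.Timers.Timer -Property @{ Interval = 1000; AutoReset = $false }
Register-ObjectEvent -InputObject $timer -EventName Elapsed -Action {
    # $Args holds the event's arguments: for Elapsed, the sender and the ElapsedEventArgs object.
    Write-Host "Timer fired at $($Args[1].SignalTime)"
} | Out-Null
$timer.Start()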
I have some PowerShell helper functions in a file. I'd like to make them available to the scope of another file that I am writing, but not pollute the global scope.
Helpers.ps1
function global:Helper1
{
    # this function pollutes the global scope
}

function Helper2
{
    # this function is not visible to the Utilities.ps1 file.
}
Utilities.ps1
&{
    ./Helpers.ps1

    function global:Utility1
    {
        Helper1
    }

    function global:Utility2
    {
        Helper2
    }
}
I found this question:
How do I dynamically create functions that are accessible in a parent scope? but the answers discuss adding functions to the global scope. What I really want to do is make the Helper functions from one PS1 file available to a calling PS1 file, without polluting the global scope with the helpers.
I want to avoid defining the functions as variables, which is possible with Set-Variable and the -Scope parameter. The closest I've seen (from the linked thread) is using Set-Item in the function: drive.
Any help would be appreciated!
Edit: here is the solution expanded from Mike's answer
Helpers.ps1
function Helper1
{
}
Utilities.ps1
&{
    function global:Utility
    {
        . ./Helpers.ps1
        Helper1
    }
}
Using the dot-source syntax to load Helpers.ps1 puts its contents in the scope of the Utility function. Dot-sourcing Helpers.ps1 outside the Utility function would put its contents in the &{...} scope instead, but that scope ends once the functions are defined.
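A usage sketch under that layout (assuming Utilities.ps1 and Helpers.ps1 sit in the current directory):

.\Utilities.ps1                                     # defines global:Utility only
Utility                                             # works: Helpers.ps1 is dot-sourced into Utility's own scope at call time
Get-Command Helper1 -ErrorAction SilentlyContinue   # returns nothing: Helper1 never reaches the global scope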
You can use this snippet in the Utilities.ps1 file. What we do is get all current functions, then dot-source the helpers. We then diff the before and after function lists and, from the diff, recreate the new functions in the global scope.
$beforeFunctions = ls function:
. .\helpers.ps1
$afterFunctions = ls function:
$functionDiff = @(Compare-Object $beforeFunctions $afterFunctions)
foreach ($diffEntry in $functionDiff) {
    $func = $diffEntry.InputObject
    Invoke-Expression "function global:$($func.Name) { $($func.Definition) }"
}
If you dot-source a .ps1 file in a function, the definitions that are in the ps1 file are not global, unless the function was itself dot-sourced.
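A minimal sketch of that rule (Import-Helpers is a made-up name, assumed to live in a script that sits next to Helpers.ps1):

function Import-Helpers {
    . "$PSScriptRoot\Helpers.ps1"   # the helper functions are defined in Import-Helpers' scope only
}

Import-Helpers     # normal call: the helpers disappear when Import-Helpers returns
. Import-Helpers   # dot-sourced call: the helpers land in the caller's scope instead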