Access a variable from parent scope - PowerShell

In the single-module scenario, running Set-Var returns 10:
# m.psm1
function Set-Var {
    $MyVar = 10
    Get-Var
}
function Get-Var {
    $MyVar
}
In the nested-modules scenario, running Set-Var does not return any value:
# m1.psm1
function Get-Var {
    $MyVar
}
# m.psm1
Import-Module .\m1.psm1
function Set-Var {
    $MyVar = 10
    Get-Var
}
How do I achieve the same effect as in the single-module scenario with nested modules? Using $script:MyVar also does not work. However, I would like to keep the variable's scope local, to allow concurrent executions with different values.

Your code doesn't work because each module has its own scope hierarchy; a function in one module does not inherit the local variables of a caller that lives in a different module (or script).
You can do this instead:
$null = New-Module {
    function Get-Var {
        [CmdletBinding()] param()
        $PSCmdlet.SessionState.PSVariable.Get('MyVar').Value
    }
}
The New-Module command creates an in-memory (dynamic) module; a module is needed here because this technique only works when the caller is in a different module or script.
Use the CmdletBinding attribute to create an advanced function. This is a prerequisite for using the automatic $PSCmdlet variable, which we need in the next step.
Use its SessionState.PSVariable member to get or set a variable in the parent (module) scope.
This answer shows an example of how to set a variable in the parent (module) scope.
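For example, a minimal sketch of the set case (Set-CallerVar is a hypothetical name):
$null = New-Module {
    function Set-CallerVar {
        [CmdletBinding()] param($Name, $Value)
        # Hypothetical helper: writes the variable into the caller's
        # (parent) scope rather than into this module's scope.
        $PSCmdlet.SessionState.PSVariable.Set($Name, $Value)
    }
}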
See also: Is there any way for a powershell module to get at its caller's scope?
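Applied to the original two-module layout, the same body can live directly in m1.psm1 (a sketch, assuming m.psm1 imports it as before):
# m1.psm1
function Get-Var {
    [CmdletBinding()] param()
    # Reads MyVar from the caller's scope (Set-Var's locals in m.psm1)
    $PSCmdlet.SessionState.PSVariable.Get('MyVar').Value
}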

Related

Use value of parameter in inner (global) function

In PowerShell, I'm trying to customise the prompt inside a function that creates a development shell. I do that by creating an inner function, prompt, with global scope.
function Enter-DevEnvironment {
    Param(
        [Parameter()] [ValidateSet('Debug', 'Release')] $flavor = 'Debug'
    )
    function global:prompt {
        "[$flavor] $($executionContext.SessionState.Path.CurrentLocation)>"
    }
}
The problem is that while the function Enter-DevEnvironment has a variable $flavor, this variable is not available to the prompt function.
I've worked around this by creating yet another global variable ($global:DevFlavor = $flavor) and using $DevFlavor inside prompt, but it left me wondering whether a cleaner solution is available, i.e. creating an inner function that captures values from the outer scope by value, rather than referring to a variable that may or may not be defined.
This can be done without creating a global variable, by defining the prompt function using New-Item. This lets us pass a ScriptBlock and use its GetNewClosure() method to bake the value of the -flavor parameter into the function.
function Enter-DevEnvironment {
    Param(
        [Parameter()] [ValidateSet('Debug', 'Release')] $flavor = 'Debug'
    )
    $null = New-Item Function:\global:prompt -Force -Value {
        "[$flavor] $($executionContext.SessionState.Path.CurrentLocation)>"
    }.GetNewClosure()
}

Can a PowerShell module call functions in its importer's scope?

Is it possible for a PowerShell module to call functions that live in its importer's scope?
For example, say I have module.psm1, and script.ps1. Inside script.ps1 I import module.psm1, and define a function called Execute-Payload. Is there any way for the code in module.psm1 to call Execute-Payload?
If I understood correctly what you're trying to do, there are two commonly used ways to load custom functions into your main script. I'll give you a few examples, since I'm not sure there is a single best practice for this.
The script being executed will always be main.ps1 in all given examples.
Example 1: All functions are stored on one file
Folder structure
../path/to/script/main.ps1
../path/to/script/Functions/functions.ps1
functions.ps1
function myCustomFunction1 {
    # ...
}
function myCustomFunction2 {
    # ...
}
function myCustomFunction3 {
    # ...
}
main.ps1
On the first lines of code you could add something like this:
$ErrorActionPreference = 'Stop'
# Option 1: Using Import-Module.
try
{
    Import-Module "$PSScriptRoot\Functions\functions.ps1"
}
catch
{
    "Failed to load dependency Functions.`nError: $_"
}
# Option 2: Dot Sourcing
try
{
    . "$PSScriptRoot\Functions\functions.ps1"
}
catch
{
    "Failed to load dependency Functions.`nError: $_"
}
Note: both options will load ALL functions.
Example 2: Many functions stored in different files. This is used when you have lots of complex and/or lengthy functions.
Folder structure
../path/to/script/main.ps1
../path/to/script/Functions/myCustomFunction1.ps1
../path/to/script/Functions/myCustomFunction2.ps1
../path/to/script/Functions/myCustomFunction3.ps1
myCustomFunction1.ps1
function myCustomFunction1 {
    # ...
}
myCustomFunction2.ps1
function myCustomFunction2 {
    # ...
}
myCustomFunction3.ps1
function myCustomFunction3 {
    # ...
}
main.ps1
On the first lines of code you could add something like this:
$ErrorActionPreference = 'Stop'
# Option 1: Using Import-Module.
try
{
    Get-ChildItem "$PSScriptRoot\Functions\*.ps1" | Import-Module
}
catch
{
    "Failed to load dependency Functions.`nError: $_"
}
# Option 2: Dot Sourcing
try
{
    Get-ChildItem "$PSScriptRoot\Functions\*.ps1" | ForEach-Object {
        . $_.FullName
    }
}
catch
{
    "Failed to load dependency Functions.`nError: $_"
}
From inside an (advanced) function in your module, you can use $PSCmdlet.InvokeCommand.InvokeScript() with argument $PSCmdlet.SessionState and a script block created from a string to execute arbitrary code in the caller's scope.
The technique was gratefully adapted from this comment on GitHub.
For brevity, the following code demonstrates the technique with a dynamic module created with New-Module, but it equally applies to regular, persisted modules:
# The function to call from the module.
function Execute-Payload {
    "Inside Execute-Payload in the script's scope."
}

# Create (and implicitly import) a dynamic module.
$null = New-Module {
    # Define the advanced module function that calls `Execute-Payload`
    # in its caller's scope.
    function Invoke-Something {
        [CmdletBinding()]
        param()
        $PSCmdlet.InvokeCommand.InvokeScript(
            $PSCmdlet.SessionState,
            # Pass a *string* with arbitrary code to execute in the caller's scope to
            # [scriptblock]::Create().
            # !! It is imperative that [scriptblock]::Create() be used, with
            # !! a *string*, as it creates an *unbound* script block that then
            # !! runs in the specified session.
            # !! If needed, use string expansion to "bake" module-local variable
            # !! values into the string, or pass parameters as additional arguments.
            # !! However, if a script block is passed *by the caller*,
            # !! bound to *its* context, it *can* be used as-is.
            [scriptblock]::Create(' Execute-Payload ')
        )
    }
}

# Now invoke the function defined in the module.
Invoke-Something
The above outputs "Inside Execute-Payload in the script's scope.", proving that the script's function was called.
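As the comments note, a script block passed by the caller is already bound to the caller's context and can be used as-is. A minimal sketch of that variant (Invoke-InCallerScope is a hypothetical name, reusing the Execute-Payload function above):
$null = New-Module {
    function Invoke-InCallerScope {
        [CmdletBinding()]
        param([Parameter(Mandatory)] [scriptblock] $Payload)
        # $Payload was created in the caller's scope, so invoking it against
        # the caller's session state runs the caller's code.
        $PSCmdlet.InvokeCommand.InvokeScript($PSCmdlet.SessionState, $Payload)
    }
}
Invoke-InCallerScope -Payload { Execute-Payload }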

PowerShell AST FindAll method take a scriptblock with $args[0]

I have been working with the PowerShell AST to create some custom rules for PSScriptAnalyzer.
In a lot of the example code for AST, there is one line that I don't understand. Here is an example.
First parse a file, in this case, the current open file in the ISE.
$AbstractSyntaxTree = [System.Management.Automation.Language.Parser]::ParseInput(
    $psISE.CurrentFile.Editor.Text, [ref]$null, [ref]$null)
This makes sense so far. Let's say that we want to look for all the ParameterAst objects. The code that I have seen to do this is below.
$params = $AbstractSyntaxTree.FindAll({$args[0] -is [System.Management.Automation.Language.ParameterAst]}, $true)
This line of code is calling FindAll and passing in a scriptblock, that seems to be acting as a filter, so that only ParameterAst objects are returned.
What I don't understand here is how $args[0] fits into this call. How are any parameters actually getting passed into the scriptblock when the FindAll method is invoked?
The FindAll method has the following signature (from MSDN):
public IEnumerable<Ast> FindAll(
    Func<Ast, bool> predicate,
    bool searchNestedScriptBlocks
)
So the first argument is a delegate that takes an Ast as input and returns a bool.
In PowerShell you can create such a delegate like this:
$delegate = { param($ast) $ast -is [System.Management.Automation.Language.ParameterAst] }
Or without declaring a parameter:
$delegate = { $args[0] -is [System.Management.Automation.Language.ParameterAst] }
The FindAll method will then do something like this (pseudocode):
foreach ($node in $allNodes) {
    $shouldAdd = & $delegate $node # this is how $node gets passed to your delegate
    if ($shouldAdd) {
        # add the node to the output list
    }
}
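Putting it together, here is a small self-contained demo that parses a string instead of the ISE buffer (the sample function is illustrative):
$code = 'function Test { param($a, [int]$b) $a + $b }'
$ast = [System.Management.Automation.Language.Parser]::ParseInput(
    $code, [ref]$null, [ref]$null)
# Both delegate styles find the same two ParameterAst nodes ($a and $b):
$byArgs = $ast.FindAll({ $args[0] -is [System.Management.Automation.Language.ParameterAst] }, $true)
$byParam = $ast.FindAll({ param($node) $node -is [System.Management.Automation.Language.ParameterAst] }, $true)
$byArgs.Count # 2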
Think of the scriptblock as an anonymous callback function.
It's really the same thing that happens when you use Where-Object { $someCondition }.
.FindAll finds all the nodes and, for each one, calls the function you provided. It expects a [bool] result and returns the objects that satisfied the condition in the callback.
In a function, script, or script block in PowerShell, you can have named parameters that are explicitly declared, or you can reference parameters without declaring them through the $args array, which is what's happening here.
Using a scriptblock as a callback is similar to using it for an event:
$Args
Contains an array of the undeclared parameters and/or parameter values that are passed to a function, script, or script block. When you create a function, you can declare the parameters by using the param keyword or by adding a comma-separated list of parameters in parentheses after the function name.
In an event action, the $Args variable contains objects that represent the event arguments of the event that is being processed.
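For example, with Register-ObjectEvent the -Action script block is exactly such a callback; inside it, the automatic $EventArgs variable carries the event data (a minimal sketch):
$timer = New-Object System.Timers.Timer -Property @{ Interval = 1000; AutoReset = $false }
$null = Register-ObjectEvent -InputObject $timer -EventName Elapsed -Action {
    # $EventArgs holds the ElapsedEventArgs for this invocation
    Write-Host "Timer fired at $($EventArgs.SignalTime)"
}
$timer.Start()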

How does name lookup work in Powershell script blocks?

User cashfoley has posted what appears to be a fairly elegant set of code at CodePlex for a "module" called PSClass.
When I dot-source the psclass code into some code of my own, I am able to write code like:
$Animal = New-PSClass Animal {
    constructor {
        param( $name, $legs )
        # ...
    }
    method -override ToString {
        "A $($this.Class.ClassName) named $($this.name) with $($this.Legs) Legs"
    }
}
When I tried to create a module out of the PSClass code, however, I started getting errors. The constructor and method names are no longer recognized.
Looking at the actual implementation, what I see is that constructor, method, etc. are actually nested functions inside the New-PSClass function.
Thus, it seems to me that when I dot-source the PSClass.ps1 file, my script-blocks are allowed to contain references to functions nested inside other local functions. But when the PSClass code becomes a module, with the New-PSClass function exported (I tried both using a manifest and using Export-ModuleMember), the names are no longer visible.
Can someone explain to me how the script blocks, scoping rules, and visibility rules for nested functions work together?
Also, kind of separately: is there a better class-definition protocol for pure PowerShell scripting? (Specifically, one that does not involve "just write it in C# and then do this...")
The variables in your script blocks don't get evaluated until they are executed. If the variables in the script block don't exist in the current scope when the block is executed, the variables won't have any values. Script blocks aren't closures: they don't capture the context at instantiation time.
Remove-Variable FooBar -ErrorAction SilentlyContinue
function New-ScriptBlock
{
    $FooBar = 1
    $scriptBlock = {
        Write-Host "FooBar: $FooBar"
    }
    $FooBar = 2
    & $scriptBlock # Outputs FooBar: 2 because $FooBar was set to 2 before invocation
    return $scriptBlock
}
function Invoke-ScriptBlock
{
    param(
        $ScriptBlock
    )
    & $ScriptBlock
}
$scriptBlock = New-ScriptBlock
& $scriptBlock # Prints "FooBar: " with an empty value, since $FooBar doesn't exist in this scope
$FooBar = 3
Invoke-ScriptBlock $scriptBlock # Prints FooBar: 3, since $FooBar was set to 3 in the calling scope
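If capture-at-creation semantics are what you want, GetNewClosure() returns a script block that snapshots the variables it references. A minimal sketch (New-CapturedScriptBlock is a hypothetical name):
function New-CapturedScriptBlock
{
    $FooBar = 1
    # The closure bakes in the value $FooBar has right now
    { Write-Host "FooBar: $FooBar" }.GetNewClosure()
}
$captured = New-CapturedScriptBlock
& $captured # Prints FooBar: 1, regardless of $FooBar in the calling scope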

Add PowerShell function to the parent scope

I have some PowerShell helper functions in a file. I'd like to make them available to the scope of another file that I am writing, but not pollute the global scope.
Helpers.ps1
function global:Helper1
{
    # this function pollutes the global scope
}
function Helper2
{
    # this function is not visible to the Utilities.ps1 file
}
Utilities.ps1
& {
    ./Helpers.ps1
    function global:Utility1
    {
        Helper1
    }
    function global:Utility2
    {
        Helper2
    }
}
I found this question:
How do I dynamically create functions that are accessible in a parent scope? but the answers discuss adding functions to the global scope. What I really want to do is make the Helper functions from one PS1 file available to a calling PS1 file, without polluting the global scope with the helpers.
I want to avoid defining the functions as variables, which is possible with Set-Variable and the -Scope parameter. The closest I've seen (from the linked thread) is using Set-Item in the function: drive.
Any help would be appreciated!
Edit: here is the solution expanded from Mike's answer
Helpers.ps1
function Helper
{
}
Utilities.ps1
& {
    function global:Utility
    {
        . ./Helpers.ps1
        Helper
    }
}
Using the dot-source syntax to load Helpers.ps1 puts its contents in the scope of the Utility function. Dot-sourcing Helpers.ps1 outside the Utility function would place its contents in the &{...} scope instead, but that scope ends once the functions are defined.
You can use this snippet in the Utilities.ps1 file. What we do is get all current functions, then we dot-source the helpers. We then make a diff of the before and after function lists. From the diff, we recreate the functions in the global scope.
$beforeFunctions = ls function:
. .\helpers.ps1
$afterFunctions = ls function:
$functionDiff = @(Compare-Object $beforeFunctions $afterFunctions)
foreach ($diffEntry in $functionDiff) {
    $func = $diffEntry.InputObject
    Invoke-Expression "function global:$($func.Name) { $($func.Definition) }"
}
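Since the question mentions Set-Item on the Function: drive, the Invoke-Expression line can presumably be replaced with something like this (same diff loop assumed):
foreach ($diffEntry in $functionDiff) {
    $func = $diffEntry.InputObject
    # Define the function in the global scope via the Function: drive
    Set-Item -Path "Function:\global:$($func.Name)" -Value $func.ScriptBlock
}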
If you dot-source a .ps1 file inside a function, the definitions in that file are not global, unless the function itself was dot-sourced.
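A quick sketch of that last point (Import-Helpers is a hypothetical wrapper):
function Import-Helpers
{
    . .\Helpers.ps1 # Helper lands in this function's scope
}
Import-Helpers   # normal call: Helper disappears when the function returns
. Import-Helpers # dot-sourced call: Helper persists in the caller's scope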