Add PowerShell function to the parent scope - powershell

I have some PowerShell helper functions in a file. I'd like to make them available to the scope of another file that I am writing, but not pollute the global scope.
Helpers.ps1
function global:Helper1
{
# this function pollutes the global scope
}
function Helper2
{
# this function is not visible to the Utilities.ps1 file.
}
Utilities.ps1
&{
./Helpers.ps1
function global:Utility1
{
Helper1
}
function global:Utility2
{
Helper2
}
}
I found this question:
How do I dynamically create functions that are accessible in a parent scope? but the answers discuss adding functions to the global scope. What I really want to do is make the Helper functions from one PS1 file available to a calling PS1 file, without polluting the global scope with the helpers.
I want to avoid defining the functions as variables, which is possible with Set-Variable and the -Scope parameter. The closest I've seen (from the linked thread) is using Set-Item in the function: drive.
Any help would be appreciated!
Edit: here is the solution expanded from Mike's answer
Helpers.ps1
function Helper
{
}
Utilities.ps1
&{
function global:Utility
{
. ./Helpers.ps1
Helper
}
}
Using the dot-source syntax to load Helpers.ps1 puts its contents in the scope of the Utility function. Dot-sourcing Helpers.ps1 outside the Utility function instead places its contents in the &{...} scope, but that scope ends once the functions are defined.

You can use this snippet in the Utilities.ps1 file. It takes a snapshot of the currently defined functions, dot-sources the helpers, takes a second snapshot, and diffs the two. From the diff it recreates each new function in the global scope.
$beforeFunctions = ls function:
. .\helpers.ps1
$afterFunctions = ls function:
$functionDiff = @(Compare-Object $beforeFunctions $afterFunctions)
foreach($diffEntry in $functionDiff){
$func = $diffEntry.InputObject
Invoke-Expression "function global:$($func.Name) { $($func.Definition) }"
}

If you dot-source a .ps1 file in a function, the definitions that are in the ps1 file are not global, unless the function was itself dot-sourced.

Related

Use value of parameter in inner (global) function

In PowerShell, I'm trying to customise the prompt inside a function that creates a development shell. I do that by creating an inner function, prompt, with global scope.
function Enter-DevEnvironment {
Param(
[Parameter()] [ValidateSet('Debug', 'Release')] $flavor = 'Debug'
)
function global:prompt {
"[$flavor] $($executionContext.SessionState.Path.CurrentLocation)>"
}
}
The problem is that while the function Enter-DevEnvironment has a variable $flavor, this variable is not available for the prompt function.
I've worked around this by creating yet another global variable ($global:DevFlavor = $flavor) and using DevFlavor inside prompt, but it left me wondering whether a cleaner solution is available, i.e. creating an inner function that takes values from the outer scope by value, rather than referring to a variable that may or may not be defined.
This can be done without creating a global variable, by defining the prompt function using New-Item. This allows us to pass a ScriptBlock and use its method GetNewClosure() to bake the value of the -flavor parameter into the function.
function Enter-DevEnvironment {
Param(
[Parameter()] [ValidateSet('Debug', 'Release')] $flavor = 'Debug'
)
$null = New-Item Function:\global:prompt -Force -Value {
"[$flavor] $($executionContext.SessionState.Path.CurrentLocation)>"
}.GetNewClosure()
}

Access a variable from parent scope

In Single Module Scenario: Running Set-Var returns 10.
# m.psm1
function Set-Var {
$MyVar = 10
Get-Var
}
function Get-Var {
$MyVar
}
In Nested Modules Scenario: Running Set-Var does not return any value.
# m1.psm1
function Get-Var {
$MyVar
}
# m.psm1
Import-Module .\m1.psm1
function Set-Var {
$MyVar = 10
Get-Var
}
How do I achieve the same effect as a single module with nested modules? Using $script:MyVar also does not work. However, I would like to keep the scope of the variable local to enable concurrent executions with different values.
Your code doesn't work because local variables are not inherited by functions in nested module context.
You can do this instead:
$null = New-Module {
function Get-Var {
[CmdletBinding()] param()
$PSCmdlet.SessionState.PSVariable.Get('MyVar').Value
}
}
The New-Module command creates an in-memory module, because this code only works when the caller is in a different module or script.
Use the CmdletBinding attribute to create an advanced function. This is a prerequisite to use the automatic $PSCmdlet variable, which we need in the next step.
Use its SessionState.PSVariable member to get or set a variable from the parent (module) scope.
This answer shows an example of how to set a variable in the parent (module) scope.
See also: Is there any way for a powershell module to get at its caller's scope?

Can a powershell module call functions in its importer's scope?

Is it possible for a powershell module to call functions that live in its importer's scope?
For example, say I have module.psm1, and script.ps1. Inside script.ps1 I import module.psm1, and define a function called Execute-Payload. Is there any way for the code in module.psm1 to call Execute-Payload?
If I understood correctly what you're trying to do, there are 2 commonly used ways to load custom functions into your main script. I'll give you a few examples, since I'm not sure if there is a best practice for this.
The script being executed will always be main.ps1 on all given examples.
Example 1: All functions are stored on one file
Folder structure
../path/to/script/main.ps1
../path/to/script/Functions/functions.ps1
Functions.ps1
function myCustomFunction1 {
....
}
function myCustomFunction2 {
....
}
function myCustomFunction3 {
....
}
main.ps1
On the first lines of code you could add something like this:
$ErrorActionPreference = 'Stop'
# Option 1: Using Import-Module.
try
{
Import-Module "$PSScriptRoot\Functions\functions.ps1"
}
catch
{
"Failed to load dependency Functions.`nError: $_"
}
# Option 2: Dot Sourcing
try
{
. "$PSScriptRoot\Functions\functions.ps1"
}
catch
{
"Failed to load dependency Functions.`nError: $_"
}
Note: both options will load ALL functions.
Example 2: Many functions stored on different files. This is used when you have lots of complex and/or lengthy functions.
Folder structure
../path/to/script/main.ps1
../path/to/script/Functions/myCustomFunction1.ps1
../path/to/script/Functions/myCustomFunction2.ps1
../path/to/script/Functions/myCustomFunction3.ps1
myCustomFunction1.ps1
function myCustomFunction1 {
....
}
myCustomFunction2.ps1
function myCustomFunction2 {
....
}
myCustomFunction3.ps1
function myCustomFunction3 {
....
}
main.ps1
On the first lines of code you could add something like this:
$ErrorActionPreference = 'Stop'
# Option 1: Using Import-Module.
try
{
Get-ChildItem "$PSScriptRoot\Functions\*.ps1" | Import-Module
}
catch
{
"Failed to load dependency Functions.`nError: $_"
}
# Option 2: Dot Sourcing
try
{
Get-ChildItem "$PSScriptRoot\Functions\*.ps1" | ForEach-Object {
. $_.FullName
}
}
catch
{
"Failed to load dependency Functions.`nError: $_"
}
From inside an (advanced) function in your module, you can use $PSCmdlet.InvokeCommand.InvokeScript() with argument $PSCmdlet.SessionState and a script block created from a string to execute arbitrary code in the caller's scope.
The technique was gratefully adapted from this comment on GitHub.
For brevity, the following code demonstrates the technique with a dynamic module created with New-Module, but it equally applies to regular, persisted modules:
# The function to call from the module.
function Execute-Payload {
"Inside Execute-Payload in the script's scope."
}
# Create (and implicitly import) a dynamic module.
$null = New-Module {
# Define the advanced module function that calls `Execute-Payload`
# in its caller's scope.
function Invoke-Something {
[CmdletBinding()]
param()
$PSCmdlet.InvokeCommand.InvokeScript(
$PSCmdlet.SessionState,
# Pass a *string* with arbitrary code to execute in the caller's scope to
# [scriptblock]::Create().
# !! It is imperative that [scriptblock]::Create() be used, with
# !! a *string*, as it creates an *unbound* script block that then
# !! runs in the specified session.
# !! If needed, use string expansion to "bake" module-local variable
# !! values into the string, or pass parameters as additional arguments.
# !! However, if a script block is passed *by the caller*,
# !! bound to *its* context, it *can* be used as-is.
[scriptblock]::Create(' Execute-Payload ')
)
}
}
# Now invoke the function defined in the module.
Invoke-Something
The above outputs Inside Execute-Payload in the script's scope., proving that the script's function was called.

What is the setlocal / endlocal equivalent for PowerShell?

Objective
Isolate environmental variable changes to a code block.
Background
If I want to create a batch script to run a command that requires an environmental variable set, I know I can do this:
setlocal
set MYLANG=EN
my-cmd dostuff -o out.csv
endlocal
However, I tend to use PowerShell when I need a shell scripting language. I know how to set the environment variable ($env:TEST="EN"), and of course this is just a simple example. However, I am not sure how to achieve the same effect that I could with a batch script. Surprisingly, I don't see any questions asking this either.
I am aware that setting something with $env:TEST="EN" is process scoped, but that isn't practical if I'm using scripts as small utilities in a single terminal session.
My current approaches:
Entered setlocal. But that wasn't a cmdlet... I hoped.
Save the current variable to a temp variable, run my command, change it back... kinda silly.
Function level scope (though I doubted it would work, since $env: seems to behave much like $global:)
Function scope doesn't trump the reference to $env:
$env:TEST="EN"
function tt {
$env:TEST="GB"
($env:TEST)
}
($env:TEST)
tt
($env:TEST)
Output:
C:\Users\me> .\eg.ps1
EN
GB
GB
In batch files, all shell variables are environment variables too; therefore, setlocal ... endlocal provides a local scope for environment variables too.
By contrast, in PowerShell, shell variables (e.g., $var) are distinct from environment variables (e.g., $env:PATH) - a distinction that is generally beneficial.
Given that the smallest scope for setting environment variables is the current process - and therefore the entire PowerShell session - you must manage a smaller custom scope manually if you want to do this in-process. That is what setlocal ... endlocal does in cmd.exe, and PowerShell has no built-in equivalent for it; to custom-scope regular shell variables, use & { $var = ...; ... }:
In-process approach: manual management of a custom scope:
To ease the pain somewhat, you can use a script block ({ ... }) to provide a distinct visual grouping of the commands; when invoked with &, it also creates a new local scope, so that any auxiliary variables you define in the script block automatically go out of scope (you can also write this as a one-liner with ;-separated commands):
& {
$oldVal, $env:MYLANG = $env:MYLANG, 'EN'
my-cmd dostuff -o out.csv
$env:MYLANG = $oldVal
}
More simply, if there's no preexisting MYLANG value that must be restored:
& { $env:MYLANG='EN'; my-cmd dostuff -o out.csv; $env:MYLANG=$null }
$oldVal, $env:MYLANG = $env:MYLANG, 'EN' saves the old value (if any) of $env:MYLANG in $oldVal while changing the value to 'EN'; this technique of assigning to multiple variables at once (known as destructuring assignment in some languages) is explained in Get-Help about_Assignment_Operators, section "ASSIGNING MULTIPLE VARIABLES".
A more proper and robust but more verbose solution is to use try { ... } finally { ... }:
try {
# Temporarily set/create $env:MYLANG to 'EN'
$prevVal = $env:MYLANG; $env:MYLANG = 'EN'
my-cmd dostuff -o out.csv # run the command(s) that should see $env:MYLANG as 'EN'
} finally { # remove / restore the temporary value
# Note: if $env:MYLANG didn't previously exist, $prevVal is $null,
# and assigning that back to $env:MYLANG *removes* it, as desired.
$env:MYLANG = $prevVal
}
Note, however, that if you only ever call external programs with the temporarily modified environment, there is no strict need for try / finally, because external programs never cause PowerShell errors as of PowerShell 7.1, though that may change in the future.
To facilitate this approach, this answer to a related question offers convenience function
Invoke-WithEnvironment, which allows you to write the same invocation as:
# Define env. var. $env:MYLANG only for the duration of executing the commands
# in { ... }
Invoke-WithEnvironment @{ MYLANG = 'EN' } { my-cmd dostuff -o out.csv }
Alternatives, using an auxiliary process:
By using an auxiliary process and only setting the transient environment variable there,
you avoid the need to restore the environment after invocation
but you pay a performance penalty, and invocation complexity is increased.
Using an aux. cmd.exe process:
cmd /c "set `"MYLANG=EN`" & my-cmd dostuff -o out.csv"
Note:
Outer "..." quoting was chosen so that you can reference PowerShell variables in your command; embedded " must then be escaped as `"
Additionally, the arguments to the target command must be passed according to cmd.exe's rules (makes no difference with the simple command at hand).
Using an aux. child PowerShell session:
# In PowerShell *Core*, use `pwsh` in lieu of `powershell`
powershell -nop -c { $env:MYLANG = 'EN'; my-cmd dostuff -o out.csv }
Note:
Starting another PowerShell session is expensive.
Output from the script block ({ ... }) is subject to serialization and later deserialization in the calling scope; for string output, that doesn't matter, but complex objects such as [System.IO.FileInfo] deserialize to emulations of the originals (which may or may not be a problem).
There is a way to achieve this in PowerShell:
Local Scope:
& { [System.Environment]::SetEnvironmentVariable('TEST', 'WORK Local', [System.EnvironmentVariableTarget]::Process)
[System.Environment]::GetEnvironmentVariable("TEST", [System.EnvironmentVariableTarget]::Process) }
This creates the environment variable in the scope of the process, same as above. Any call to it outside the scope will return nothing.
For a global one you just change the target to Machine (note that writing to the Machine target requires an elevated session):
& { [System.Environment]::SetEnvironmentVariable('TEST', 'WORK Global', [System.EnvironmentVariableTarget]::Machine) }
Any call to this outside the scope will return 'WORK Global'
Putting it all together:
## create local variable and print
& { [System.Environment]::SetEnvironmentVariable('TEST', 'WORK Local', [System.EnvironmentVariableTarget]::Process)
[System.Environment]::GetEnvironmentVariable("TEST", [System.EnvironmentVariableTarget]::Process) }
function tt {
($env:TEST)
}
& { $TEST="EN"; $env:TEST="EN"; tt }
& { $TEST="change1"; $env:TEST="change1"; tt }
& { $TEST="change1"; $env:TEST="change2"; tt }
[System.Environment]::GetEnvironmentVariable("TEST", [System.EnvironmentVariableTarget]::Process)
& { [System.Environment]::SetEnvironmentVariable('TEST', 'WORK Global', [System.EnvironmentVariableTarget]::Machine) } ## create global variable
## Create local variable and print ( overrides global )
& { [System.Environment]::SetEnvironmentVariable('TEST', 'WORK Local', [System.EnvironmentVariableTarget]::Process)
[System.Environment]::GetEnvironmentVariable("TEST", [System.EnvironmentVariableTarget]::Process) }
[System.Environment]::GetEnvironmentVariable("TEST", [System.EnvironmentVariableTarget]::Machine) ## get global variable
[System.Environment]::SetEnvironmentVariable("TEST", $null, [System.EnvironmentVariableTarget]::Machine) ## remove global variable
This gives us the following output:
WORK Local
EN
change1
change2
change2
WORK Local
WORK Global
I would probably just use a try { } finally { }:
try {
$OriginalValue = $env:MYLANG
$env:MYLANG= 'GB'
my-cmd dostuff -o out.csv
}
finally {
$env:MYLANG = $OriginalValue
}
That should force the values to be set back to their original values even if an error is encountered in your script. It's not bulletproof, but most things that would break it would also make it very obvious that something went wrong.
You could also do this:
try {
$env:MYLANG= 'GB'
my-cmd dostuff -o out.csv
}
finally {
$env:MYLANG = [System.Environment]::GetEnvironmentVariable('MYLANG', 'User')
}
That should retrieve the value from HKEY_CURRENT_USER\Environment. You may need 'Machine' instead of 'User' and that will pull from HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\Session Manager\Environment. Which you need depends if it's a user environment variable or a computer environment variable. This works because the Env: provider drive doesn't persist environment variable changes, so changes to those variables won't change the registry.
Forgive me if I've missed something, as there are parts of this post I'm a bit unclear on.
I would use the scope modifiers $local:, $script: and $global:.
Example
$env:TEST="EN"
function tt {
$local:env:TEST="GB"
($local:env:TEST)
}
$t = {
$local:env:TEST="DE"
($local:env:TEST)
}
($env:TEST)
tt
($env:TEST)
. $t
($env:TEST)
Output with comments
EN # ($env:TEST)
GB # tt
EN # ($env:TEST)
DE # . $t
EN # ($env:TEST)

How does name lookup work in Powershell script blocks?

User cashfoley has posted what appears to be a fairly elegant set of code on CodePlex for a "module" called PSClass.
When I dot-source the psclass code into some code of my own, I am able to write code like:
$Animal = New-PSClass Animal {
constructor {
param( $name, $legs )
# ...
}
method -override ToString {
"A $($this.Class.ClassName) named $($this.name) with $($this.Legs) Legs"
}
}
When I tried to create a module out of the PSClass code, however, I started getting errors. The constructor and method names are no longer recognized.
Looking at the actual implementation, what I see is that constructor, method, etc. are actually nested functions inside the New-PSClass function.
Thus, it seems to me that when I dot-source the PSClass.ps1 file, my script-blocks are allowed to contain references to functions nested inside other local functions. But when the PSClass code becomes a module, with the New-PSClass function exported (I tried both using a manifest and using Export-ModuleMember), the names are no longer visible.
Can someone explain to me how the script blocks, scoping rules, and visibility rules for nested functions work together?
Also, kind of separately, is there a better class definition protocol for pure Powershell scripting? (Specifically, one that does not involve "just write it in C# and then do this...")
The variables in your script blocks don't get evaluated until they are executed. If the variables in the script block don't exist in the current scope when the block is executed, the variables won't have any values. Script blocks aren't closures: they don't capture the context at instantiation time.
Remove-Variable FooBar -ErrorAction SilentlyContinue
function New-ScriptBlock
{
$FooBar = 1
$scriptBlock = {
Write-Host "FooBar: $FooBar"
}
$FooBar = 2
& $scriptBlock # Outputs FooBar: 2 because $FooBar was set to 2 before invocation
return $scriptBlock
}
function Invoke-ScriptBlock
{
param(
$ScriptBlock
)
& $ScriptBlock
}
$scriptBlock = New-ScriptBlock
& $scriptBlock # Prints "FooBar: " with an empty value, since $FooBar doesn't exist in this scope
$FooBar = 3
Invoke-ScriptBlock $scriptBlock # Prints FooBar: 3 since $FooBar was set to 3 in the caller's scope