What is the setlocal / endlocal equivalent for PowerShell? - powershell

Objective
Isolate environment variable changes to a code block.
Background
If I want to create a batch script to run a command that requires an environment variable to be set, I know I can do this:
setlocal
set MYLANG=EN
my-cmd dostuff -o out.csv
endlocal
However, I tend to use PowerShell when I need a shell scripting language. I know how to set an environment variable ($env:TEST="EN"), and of course this is just a simple example. However, I am not sure how to achieve the same effect that I could with a batch script. Surprisingly, I don't see any questions asking this either.
I am aware that setting something with $env:TEST="EN" is process-scoped, but that isn't practical if I'm using scripts as small utilities in a single terminal session.
My current approaches:
Entered setlocal. But that isn't a cmdlet... I had hoped.
Saved the current variable to a temp variable, ran my command, then changed it back... kinda silly.
Function-level scope (though I doubted this would work, since $env: seems to behave much like $global:).
Function scope doesn't trump the reference to $env:, as this example shows:
$env:TEST = "EN"
function tt {
    $env:TEST = "GB"
    ($env:TEST)
}
($env:TEST)
tt
($env:TEST)
Output:
C:\Users\me> .\eg.ps1
EN
GB
GB

In batch files, all shell variables are environment variables too; therefore, setlocal ... endlocal provides a local scope for environment variables too.
By contrast, in PowerShell, shell variables (e.g., $var) are distinct from environment variables (e.g., $env:PATH) - a distinction that is generally beneficial.
Given that the smallest scope for setting environment variables is the current process - and therefore the entire PowerShell session - you must manage a smaller custom scope manually if you want to do this in-process. (This is what setlocal ... endlocal does in cmd.exe, for which PowerShell has no built-in equivalent; to custom-scope shell variables, use & { $var = ...; ... }.)
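To see the contrast (DEMO is a hypothetical variable name used only for illustration):
$var = 'outer'
& { $var = 'inner' } # assigning inside & { ... } creates a scope-local copy
$var                 # -> 'outer' (unchanged)
$env:DEMO = 'outer'
& { $env:DEMO = 'inner' }
$env:DEMO            # -> 'inner' (the process-wide value was changed)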
In-process approach: manual management of a custom scope:
To ease the pain somewhat, you can use a script block ({ ... }) to provide a distinct visual grouping of the commands, which, when invoked with &, also creates a new local scope, so that any auxiliary variables you define in the script block automatically go out of scope (you can write this as a one-liner with ;-separated commands):
& {
    $oldVal, $env:MYLANG = $env:MYLANG, 'EN'
    my-cmd dostuff -o out.csv
    $env:MYLANG = $oldVal
}
More simply, if there's no preexisting MYLANG value that must be restored:
& { $env:MYLANG='EN'; my-cmd dostuff -o out.csv; $env:MYLANG=$null }
$oldVal, $env:MYLANG = $env:MYLANG, 'EN' saves the old value (if any) of $env:MYLANG in $oldVal while changing the value to 'EN'; this technique of assigning to multiple variables at once (known as destructuring assignment in some languages) is explained in Get-Help about_Assignment_Operators, section "ASSIGNING MULTIPLE VARIABLES".
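For instance, a quick illustration of multiple assignment (the values are arbitrary):
$a, $b = 1, 2   # $a is 1, $b is 2
$a, $b = $b, $a # swap: $a is now 2, $b is now 1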
A more robust but more verbose solution is to use try { ... } finally { ... }:
try {
    # Temporarily set/create $env:MYLANG with value 'EN'
    $prevVal = $env:MYLANG; $env:MYLANG = 'EN'
    my-cmd dostuff -o out.csv # run the command(s) that should see $env:MYLANG as 'EN'
} finally { # remove / restore the temporary value
    # Note: if $env:MYLANG didn't previously exist, $prevVal is $null,
    # and assigning that back to $env:MYLANG *removes* it, as desired.
    $env:MYLANG = $prevVal
}
Note, however, that if you only ever call external programs with the temporarily modified environment, there is no strict need for try / finally, because external programs never cause PowerShell errors as of PowerShell 7.1, though that may change in the future.
To facilitate this approach, this answer to a related question offers the convenience function Invoke-WithEnvironment, which allows you to write the same invocation as:
# Define env. var. $env:MYLANG only for the duration of executing the commands
# in { ... }
Invoke-WithEnvironment @{ MYLANG = 'EN' } { my-cmd dostuff -o out.csv }
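In case the link is unavailable, here is a minimal sketch of what such a convenience function might look like (parameter names and details are assumptions; the linked answer's actual implementation is more featureful):
function Invoke-WithEnvironment {
    param(
        [Parameter(Mandatory)] [hashtable] $Environment,
        [Parameter(Mandatory)] [scriptblock] $ScriptBlock
    )
    $previous = @{}
    try {
        foreach ($name in $Environment.Keys) {
            # Save each old value ($null if undefined), then set the temporary one.
            $previous[$name] = [Environment]::GetEnvironmentVariable($name)
            [Environment]::SetEnvironmentVariable($name, $Environment[$name])
        }
        & $ScriptBlock
    } finally {
        # Restore old values; assigning $null removes variables that didn't previously exist.
        foreach ($name in $previous.Keys) {
            [Environment]::SetEnvironmentVariable($name, $previous[$name])
        }
    }
}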
Alternatives, using an auxiliary process:
By using an auxiliary process and only setting the transient environment variable there,
you avoid the need to restore the environment after invocation
but you pay a performance penalty, and invocation complexity is increased.
Using an aux. cmd.exe process:
cmd /c "set `"MYLANG=EN`" & my-cmd dostuff -o out.csv"
Note:
Outer "..." quoting was chosen so that you can reference PowerShell variables in your command; embedded " must then be escaped as `"
Additionally, the arguments to the target command must be passed according to cmd.exe's rules (makes no difference with the simple command at hand).
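For example, to reference a PowerShell variable inside the command string (assuming $lang holds the desired value):
$lang = 'EN'
cmd /c "set `"MYLANG=$lang`" & my-cmd dostuff -o out.csv"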
Using an aux. child PowerShell session:
# In PowerShell *Core*, use `pwsh` in lieu of `powershell`
powershell -nop -c { $env:MYLANG = 'EN'; my-cmd dostuff -o out.csv }
Note:
Starting another PowerShell session is expensive.
Output from the script block ({ ... }) is subject to serialization and later deserialization in the calling scope; for string output, that doesn't matter, but complex objects such as [System.IO.FileInfo] deserialize to emulations of the originals (which may or may not be a problem).
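One way to observe this effect (the specific command is just an illustration):
$item = powershell -nop -c { Get-Item $PSHOME }
$item | Get-Member # TypeName: Deserialized.System.IO.DirectoryInfo - a property-only emulation of the original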

There is a way to achieve this in PowerShell:
Local Scope:
& {
    [System.Environment]::SetEnvironmentVariable('TEST', 'WORK Local', [System.EnvironmentVariableTarget]::Process)
    [System.Environment]::GetEnvironmentVariable('TEST', [System.EnvironmentVariableTarget]::Process)
}
This creates the environment variable in the scope of the current process, same as above. A call to it from outside the process (e.g., from a new session) will return nothing.
For a global one you just change the target to Machine:
& { [System.Environment]::SetEnvironmentVariable('TEST', 'WORK Global', [System.EnvironmentVariableTarget]::Machine) }
Any call to this outside the scope will return 'WORK Global'. (Note that writing to the Machine target persists the value to the registry and requires an elevated session.)
Putting it all together:
## create local variable and print
& {
    [System.Environment]::SetEnvironmentVariable('TEST', 'WORK Local', [System.EnvironmentVariableTarget]::Process)
    [System.Environment]::GetEnvironmentVariable('TEST', [System.EnvironmentVariableTarget]::Process)
}
function tt {
    ($env:TEST)
}
& { $TEST="EN"; $env:TEST="EN"; tt }
& { $TEST="change1"; $env:TEST="change1"; tt }
& { $TEST="change1"; $env:TEST="change2"; tt }
[System.Environment]::GetEnvironmentVariable("TEST", [System.EnvironmentVariableTarget]::Process)
& { [System.Environment]::SetEnvironmentVariable('TEST', 'WORK Global', [System.EnvironmentVariableTarget]::Machine) } ## create global variable
## create local variable and print (overrides global)
& {
    [System.Environment]::SetEnvironmentVariable('TEST', 'WORK Local', [System.EnvironmentVariableTarget]::Process)
    [System.Environment]::GetEnvironmentVariable('TEST', [System.EnvironmentVariableTarget]::Process)
}
[System.Environment]::GetEnvironmentVariable("TEST", [System.EnvironmentVariableTarget]::Machine) ## get global variable
[System.Environment]::SetEnvironmentVariable("TEST", $null, "Machine") ## remove global variable
This gives us the following output:
WORK Local
EN
change1
change2
change2
WORK Local
WORK Global

I would probably just use a try { } finally { }:
try {
    $OriginalValue = $env:MYLANG
    $env:MYLANG = 'GB'
    my-cmd dostuff -o out.csv
}
finally {
    $env:MYLANG = $OriginalValue
}
That should force the value to be set back to its original even if an error is encountered in your script. It's not bulletproof, but most things that would break it would also make it very obvious that something went wrong.
You could also do this:
try {
    $env:MYLANG = 'GB'
    my-cmd dostuff -o out.csv
}
finally {
    $env:MYLANG = [System.Environment]::GetEnvironmentVariable('MYLANG', 'User')
}
That should retrieve the value from HKEY_CURRENT_USER\Environment. You may need 'Machine' instead of 'User', which will pull from HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\Session Manager\Environment. Which one you need depends on whether it's a user environment variable or a machine environment variable. This works because the Env: provider drive doesn't persist environment variable changes, so changes to those variables won't change the registry.
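A quick way to confirm that behaviour (MYLANG is hypothetical; this assumes no persisted MYLANG value exists):
$env:MYLANG = 'temporary' # changes the Env: drive, i.e. the current process only
[System.Environment]::GetEnvironmentVariable('MYLANG', 'User') # still $null - the registry is untouched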

Forgive me if I've missed something, as there are parts of this post I'm a bit unclear on.
I would use the $local:, $script: and $global: scope modifiers.
Example
$env:TEST = "EN"
function tt {
    $local:env:TEST = "GB"
    ($local:env:TEST)
}
$t = {
    $local:env:TEST = "DE"
    ($local:env:TEST)
}
($env:TEST)
tt
($env:TEST)
. $t
($env:TEST)
Output with comments
EN # ($env:TEST)
GB # tt
EN # ($env:TEST)
DE # . $t
EN # ($env:TEST)

Related

PowerShell, cannot append a variable inside a function

I'm a bit confused about something in PowerShell.
I have a function that, at the end, appends some text to a variable so that I can log what happened. But when I do this, $x just contains the header and nothing else. What am I doing wrong?
$x = "Header`n=========="
function xxx ($z) {
    $z = "$z ... 1"
    $x += "`noutput from $z"
}
xxx 123
xxx 234
xxx 345
xxx 456
To summarise the comments, and lean on this answer by @mklement0 from a similar question - https://stackoverflow.com/a/38675054/3156906 - the default behaviour is for PowerShell to let you read variables from a parent scope, but if you assign a value it creates a new variable in the child scope which "hides" the parent variable until the child scope exits.
The about_Scopes documentation says something similar as well, but doesn't really lay it out in such specific detail unfortunately:
If you create an item in a scope, and the item shares its name with an item in a different scope, the original item might be hidden under the new item, but it is not overridden or changed.
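A minimal demonstration of that hiding behaviour (the function name is illustrative):
$x = 'parent'
function Test-Hiding {
    $x           # reads the parent's value: 'parent'
    $x = 'child' # creates a *new* local $x, hiding the parent's
    $x           # 'child'
}
Test-Hiding
$x               # still 'parent' - the parent variable was never changed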
You can assign values to the variable in the parent scope if you refer to it explicitly by scope number - e.g:
$x = "some value"
function Invoke-MyFunction
{
    # "-Scope 1" means "immediate parent scope"
    Set-Variable x -Value "another value" -Scope 1
}
Invoke-MyFunction
write-host $x
In your case though, you might want to wrap your logging logic into separate functions rather than litter your code with lots of Set-Variable x -Value ($x + "another log line") -Scope 1 (which is an implementation detail of your logging approach).
Your current approach will also degrade performance over time, because it creates a new (and increasingly long) string object every time you add some logging output. It will also make it hard to change your logging mechanism at a later date (e.g. what if you decide you want to log to a file instead?), as you'll need to go back to every logging line and rewrite it to use the new mechanism.
What you could do instead is something like this:
Logging Code
$script:LogEntries = New-Object System.Collections.ArrayList
function Write-LogEntry
{
    param( [string] $Value )
    $null = $script:LogEntries.Add($Value)
}
function Get-LogEntries
{
    return $script:LogEntries
}
Application Code
function Invoke-MyFunction
{
    Write-LogEntry -Value "another logging line"
}
and then your logging mechanism is abstracted away from your main application, and your application just has to call Write-LogEntry.
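Usage would then look something like this:
Invoke-MyFunction
Get-LogEntries # -> "another logging line"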
Note that $script:<varname> is another way of referencing variables in the containing script's root scope using Scope Modifiers - the link describes some other options including global, local and private.

Access a variable from parent scope

In Single Module Scenario: Running Set-Var returns 10.
# m.psm1
function Set-Var {
    $MyVar = 10
    Get-Var
}
function Get-Var {
    $MyVar
}
In Nested Modules Scenario: Running Set-Var does not return any value.
# m1.psm1
function Get-Var {
    $MyVar
}
# m.psm1
Import-Module .\m1.psm1
function Set-Var {
    $MyVar = 10
    Get-Var
}
How do I achieve the same effect as a single module with nested modules? Using $script:MyVar also does not work. However, I would like to keep the scope of the variable local to enable concurrent executions with different values.
Your code doesn't work because local variables are not inherited by functions in a nested module context.
You can do this instead:
$null = New-Module {
    function Get-Var {
        [CmdletBinding()] param()
        $PSCmdlet.SessionState.PSVariable.Get('MyVar').Value
    }
}
The New-Module command creates an in-memory module; a module is used here because this technique only works when the caller is in a different module or script.
Use the CmdletBinding attribute to create an advanced function. This is a prerequisite to use the automatic $PSCmdlet variable, which we need in the next step.
Use its SessionState.PSVariable member to get or set a variable from the parent (module) scope.
This answer shows an example of how to set a variable in the parent (module) scope.
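In case the link goes stale, here is a sketch of the idea, using PSVariable.Set to write through to the caller's scope (the variable name MyVar mirrors the question; the details are assumptions):
$null = New-Module {
    function Set-Var {
        [CmdletBinding()] param([object] $Value)
        # Creates or updates MyVar in the *caller's* scope, not the module's.
        $PSCmdlet.SessionState.PSVariable.Set('MyVar', $Value)
    }
}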
See also: Is there any way for a powershell module to get at its caller's scope?

Can a powershell module call functions in its importer's scope?

Is it possible for a powershell module to call functions that live in its importer's scope?
For example, say I have module.psm1, and script.ps1. Inside script.ps1 I import module.psm1, and define a function called Execute-Payload. Is there any way for the code in module.psm1 to call Execute-Payload?
If I understood correctly what you're trying to do, there are two commonly used ways to load custom functions into your main script. I'll give you a few examples, since I'm not sure there is a single best practice for this.
The script being executed will always be main.ps1 on all given examples.
Example 1: All functions are stored on one file
Folder structure
../path/to/script/main.ps1
../path/to/script/Functions/functions.ps1
functions.ps1
function myCustomFunction1 {
    ....
}
function myCustomFunction2 {
    ....
}
function myCustomFunction3 {
    ....
}
main.ps1
On the first lines of code you could add something like this:
$ErrorActionPreference = 'Stop'
# Option 1: Using Import-Module.
try
{
    Import-Module "$PSScriptRoot\Functions\functions.ps1"
}
catch
{
    "Failed to load dependency Functions.`nError: $_"
}
# Option 2: Dot Sourcing
try
{
    . "$PSScriptRoot\Functions\functions.ps1"
}
catch
{
    "Failed to load dependency Functions.`nError: $_"
}
Note: both options will load ALL functions.
Example 2: Many functions stored on different files. This is used when you have lots of complex and/or lengthy functions.
Folder structure
../path/to/script/main.ps1
../path/to/script/Functions/myCustomFunction1.ps1
../path/to/script/Functions/myCustomFunction2.ps1
../path/to/script/Functions/myCustomFunction3.ps1
myCustomFunction1.ps1
function myCustomFunction1 {
    ....
}
myCustomFunction2.ps1
function myCustomFunction2 {
    ....
}
myCustomFunction3.ps1
function myCustomFunction3 {
    ....
}
main.ps1
On the first lines of code you could add something like this:
$ErrorActionPreference = 'Stop'
# Option 1: Using Import-Module.
try
{
    Get-ChildItem "$PSScriptRoot\Functions\*.ps1" | Import-Module
}
catch
{
    "Failed to load dependency Functions.`nError: $_"
}
# Option 2: Dot Sourcing
try
{
    Get-ChildItem "$PSScriptRoot\Functions\*.ps1" | ForEach-Object {
        . $_.FullName
    }
}
catch
{
    "Failed to load dependency Functions.`nError: $_"
}
From inside an (advanced) function in your module, you can use $PSCmdlet.InvokeCommand.InvokeScript() with argument $PSCmdlet.SessionState and a script block created from a string to execute arbitrary code in the caller's scope.
The technique was gratefully adapted from this comment on GitHub.
For brevity, the following code demonstrates the technique with a dynamic module created with New-Module, but it equally applies to regular, persisted modules:
# The function to call from the module.
function Execute-Payload {
    "Inside Execute-Payload in the script's scope."
}

# Create (and implicitly import) a dynamic module.
$null = New-Module {
    # Define the advanced module function that calls `Execute-Payload`
    # in its caller's scope.
    function Invoke-Something {
        [CmdletBinding()]
        param()
        $PSCmdlet.InvokeCommand.InvokeScript(
            $PSCmdlet.SessionState,
            # Pass a *string* with arbitrary code to execute in the caller's scope to
            # [scriptblock]::Create().
            # !! It is imperative that [scriptblock]::Create() be used, with
            # !! a *string*, as it creates an *unbound* script block that then
            # !! runs in the specified session.
            # !! If needed, use string expansion to "bake" module-local variable
            # !! values into the string, or pass parameters as additional arguments.
            # !! However, if a script block is passed *by the caller*,
            # !! bound to *its* context, it *can* be used as-is.
            [scriptblock]::Create(' Execute-Payload ')
        )
    }
}

# Now invoke the function defined in the module.
Invoke-Something
The above outputs Inside Execute-Payload in the script's scope., proving that the script's function was called.
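As the comments in the code note, module-local values can also be passed to the caller-scope code as arguments rather than baked into the string; for example, a hypothetical variation on the body of Invoke-Something:
# Inside the module function:
$moduleLocal = 42
$PSCmdlet.InvokeCommand.InvokeScript(
    $PSCmdlet.SessionState,
    [scriptblock]::Create('param($n) Execute-Payload; "Received $n from the module."'),
    $moduleLocal # bound to $n via the param() block
)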

Default value of mandatory parameter in PowerShell

Is it possible to set a default value on a mandatory parameter in a function?
It works without having it set as mandatory...
I.e.:
Function Get-Hello {
    [CmdletBinding()]
    Param(
        [Parameter(Mandatory=$true)]
        [String]$Text = $Script:Text
    )
    BEGIN {
    }
    PROCESS {
        Write-Host "$Script:Text"
        Write-Host "$Text"
    }
    END {
    }
}
$Text = "hello!"
Get-Hello
The reason for asking is that I have a function with some required parameters, and it works perfectly when called with those parameters. However, I also want the values to be definable in the scripts that use the function, in a more presentable and editable way, while still being able to run the function without explicitly passing the required parameters.
Hence, if a value is defined in the script scope it should be taken as the default; otherwise the function should prompt for the value.
Thanks in Advance,
If you are targeting PowerShell v3+, you can use the $PSDefaultParameterValues preference variable:
$PSDefaultParameterValues['Get-Hello:Text'] = {
    if (Test-Path Variable::Script:Text) {
        # Check that the variable exists, so that we don't return $null
        # or produce an error in strict mode.
        $Script:Text
    }
}
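With that in place, usage would look something like this (assuming the Get-Hello function above):
$Script:Text = 'hello!'
Get-Hello # binds 'hello!' via $PSDefaultParameterValues; no prompt
Remove-Variable Text -Scope Script
Get-Hello # the script block returns nothing, so the mandatory parameter prompts again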

How does name lookup work in Powershell script blocks?

User cashfoley has posted what appears to be a fairly elegant set of code on CodePlex for a "module" called PSClass.
When I dot-source the psclass code into some code of my own, I am able to write code like:
$Animal = New-PSClass Animal {
    constructor {
        param( $name, $legs )
        # ...
    }
    method -override ToString {
        "A $($this.Class.ClassName) named $($this.name) with $($this.Legs) Legs"
    }
}
When I tried to create a module out of the PSClass code, however, I started getting errors. The constructor and method names are no longer recognized.
Looking at the actual implementation, what I see is that constructor, method, etc. are actually nested functions inside the New-PSClass function.
Thus, it seems to me that when I dot-source the PSClass.ps1 file, my script-blocks are allowed to contain references to functions nested inside other local functions. But when the PSClass code becomes a module, with the New-PSClass function exported (I tried both using a manifest and using Export-ModuleMember), the names are no longer visible.
Can someone explain to me how the script blocks, scoping rules, and visibility rules for nested functions work together?
Also, kind of separately: is there a better class-definition protocol for pure PowerShell scripting? (Specifically, one that does not involve "just write it in C# and then do this...")
The variables in your script blocks don't get evaluated until they are executed. If the variables in the script block don't exist in the current scope when the block is executed, the variables won't have any values. Script blocks aren't closures: they don't capture the context at instantiation time.
Remove-Variable FooBar -ErrorAction SilentlyContinue # avoid an error if $FooBar doesn't exist yet
function New-ScriptBlock
{
    $FooBar = 1
    $scriptBlock = {
        Write-Host "FooBar: $FooBar"
    }
    $FooBar = 2
    & $scriptBlock # Outputs "FooBar: 2" because $FooBar was 2 at invocation time
    return $scriptBlock
}
function Invoke-ScriptBlock
{
    param(
        $ScriptBlock
    )
    & $ScriptBlock
}
$scriptBlock = New-ScriptBlock
& $scriptBlock # Prints "FooBar: " with no value, since $FooBar doesn't exist in this scope
$FooBar = 3
Invoke-ScriptBlock $scriptBlock # Prints "FooBar: 3" since $FooBar is 3 in the invoking scope
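If you do want capture-at-creation semantics, GetNewClosure() returns a script block bound to a copy of the enclosing scope's variables at the time of the call (the function name below is illustrative):
function New-ClosureBlock
{
    $FooBar = 1
    $closure = { Write-Host "FooBar: $FooBar" }.GetNewClosure() # captures $FooBar = 1
    $FooBar = 2 # has no effect on the already-created closure
    return $closure
}
$closure = New-ClosureBlock
& $closure # Prints "FooBar: 1" regardless of $FooBar in this scope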