How to automatically call Pop-Location at end of scope - powershell

Let's say I have a simple scope that is book-ended with Push-Location and Pop-Location:
Function MyFunction($Location)
{
    Push-Location $Location
    # do other stuff here
    Pop-Location
}
Is there any way to set it up at the beginning of the scope so that I don't have to remember to put the Pop-Location at the end? Something like this:
Function MyFunction($Location)
{
    Setup-BothPushAndPopHere $Location
    # do other stuff here
    # at the end of the scope, Pop-Location is automatically called
}

Short answer: No.
My take on Push-Location and Pop-Location is that you should generally avoid them, and adapt your script to use the path names in commands instead; in other words, instead of:
Push-Location $Location
Get-ChildItem
Pop-Location
Just do:
Get-ChildItem $Location
(simplified example)
If you must use the pattern, consider try/finally:
Push-Location $Location
try {
    # ...
} finally {
    Pop-Location
}
This ensures the location is restored even when an unexpected exception is thrown or the user interrupts execution.
I typically use the try/finally pattern when the code is outside my control, most often when loading the SQLPS module, since it changes the current location to the SQL Server provider, which in my experience makes everything that uses the current location much slower.
As Eris points out, it's also useful when dealing with native applications. This can be especially true if it's painful to escape quotes around path names with spaces, or the application wouldn't handle them correctly anyway.
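For that native-application case, a minimal sketch of the pattern (the tool name and path here are hypothetical):

Push-Location 'C:\Program Files\Legacy Tool'   # hypothetical path with spaces
try {
    # Run the tool with a relative path, sidestepping the quoting problem.
    & .\legacy-tool.exe .\input.dat            # hypothetical native executable
}
finally {
    Pop-Location   # runs even if the tool throws or the user hits Ctrl+C
}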

Depending on what you do in the # do other stuff here section, you could run those commands as a script block in a child process. A skeleton-code example:
$ScriptBlock = {
    param($Location)
    Push-Location $Location
    # commands
}
# Pass the path in via -ArgumentList; the job cannot see your session's variables.
$Job = Start-Job -ScriptBlock $ScriptBlock -ArgumentList $Location
# Check the status of the job:
Get-Job -Id $Job.Id
# Block until the job state is 'Completed' (no do-while needed):
Wait-Job $Job
# If your ScriptBlock writes any output, collect it in a variable:
$Output = Receive-Job $Job
# Clean up:
Remove-Job $Job
The point of this approach is that you spawn a job to do the work in the background, and you needn't worry about Pop-Location: you just let that child scope exit while you carry on with whatever you need to do in your main script.
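If you don't need to poll the job in the meantime, the wait/collect/cleanup steps can be collapsed into one pipeline. A sketch using the same $ScriptBlock as above (Receive-Job's -Wait and -AutoRemoveJob parameters require PowerShell 3.0 or later):

$Output = Start-Job -ScriptBlock $ScriptBlock -ArgumentList $Location |
    Receive-Job -Wait -AutoRemoveJob   # blocks until the job finishes, then deletes it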
There are other posts here on Stack Exchange that go into more detail about PowerShell jobs.

Here is a cool way to do it with script blocks:
function withPath($action, $path) {
    Push-Location $path
    try
    {
        & $action
    }
    finally
    {
        Pop-Location
    }
}
withPath {
    Get-Location | Write-Host
    withPath {
        Get-Location | Write-Host
        throw "oh no, an inner exception!"
    } 'Program Files' # relative path
} 'C:\'
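Assuming a machine where C:\ and C:\Program Files exist, the nested calls print something like:

C:\
C:\Program Files

and then the inner exception propagates; both finally blocks still fire, so the caller's original location is restored.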

How to implement Invoke-SilentlyAndReturnExitCode as a Powershell module function?

Please, observe:
The method
PS C:\> (Get-Command Invoke-SilentlyAndReturnExitCode).ScriptBlock
param([scriptblock]$Command, $Folder)
$ErrorActionPreference = 'Continue'
Push-Location $Folder
try
{
    & $Command > $null 2>&1
    $LASTEXITCODE
}
catch
{
    -1
}
finally
{
    Pop-Location
}
PS C:\>
The command to silence
PS C:\> $ErrorActionPreference = "Stop"
PS C:\> $Command = { cmd /c dir xo-xo-xo }
PS C:\> & $Command > $null 2>&1
cmd : File Not Found
At line:1 char:14
+ $Command = { cmd /c dir xo-xo-xo }
+ ~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (File Not Found:String) [], RemoteException
+ FullyQualifiedErrorId : NativeCommandError
PS C:\>
As you can see, it fails with an exception. But we can silence it easily, right?
PS C:\> $ErrorActionPreference = 'SilentlyContinue'
PS C:\> & $Command > $null 2>&1
PS C:\> $LASTEXITCODE
1
PS C:\>
All is good. Now my function does the same, so let us try it:
PS C:\> $ErrorActionPreference = "Stop"
PS C:\> Invoke-SilentlyAndReturnExitCode $Command
-1
PS C:\>
Yikes! It returns -1, not 1.
The problem appears to be that setting $ErrorActionPreference inside the function does not actually propagate to the command scope. Indeed, let me add some output:
PS C:\> (Get-Command Invoke-SilentlyAndReturnExitCode).ScriptBlock
param([scriptblock]$Command, $Folder)
$ErrorActionPreference = 'Continue'
Push-Location $Folder
try
{
    Write-Host $ErrorActionPreference
    & $Command > $null 2>&1
    $LASTEXITCODE
}
catch
{
    -1
}
finally
{
    Pop-Location
}
PS C:\> $Command = { Write-Host $ErrorActionPreference ; cmd /c dir xo-xo-xo }
PS C:\> Invoke-SilentlyAndReturnExitCode $Command
Continue
Stop
-1
PS C:\>
So, the problem is really around $ErrorActionPreference: why does it not propagate? PowerShell uses dynamic scoping, so the command definition should not capture its value, but use the one from the function. So, what is going on, and how can it be fixed?
tl;dr
Because your Invoke-SilentlyAndReturnExitCode function is defined in a module, you must recreate your script block in the scope of that module for it to see the module-local $ErrorActionPreference value of Continue:
# Use an in-memory module to demonstrate the behavior.
$null = New-Module {
    Function Invoke-SilentlyAndReturnExitCode {
        param([scriptblock] $Command, $Folder)
        $ErrorActionPreference = 'Continue'
        Push-Location $Folder
        try
        {
            Write-Host $ErrorActionPreference # local value
            # *Recreate the script block in the scope of this module*,
            # which makes it see the module's variables.
            $Command = [scriptblock]::Create($Command.ToString())
            # Invoke the recreated script block, suppressing all output.
            & $Command *>$null
            # Output the exit code.
            $LASTEXITCODE
        }
        catch
        {
            -1
        }
        finally
        {
            Pop-Location
        }
    }
}
$ErrorActionPreference = 'Stop'
$Command = { Out-Host -InputObject $ErrorActionPreference; cmd /c dir xo-xo-xo }
Invoke-SilentlyAndReturnExitCode $Command
On Windows, the above now prints the following, as expected:
Continue
Continue
1
That is, the recreated $Command script block saw the function-local $ErrorActionPreference value, and the catch block was not triggered.
Caveat:
This will only work if the $Command script block contains no references to variables in the originating scope other than variables in the global scope.
The alternative, which avoids this limitation, is to define the function outside of a module (assuming you're also calling it from code that lives outside modules).
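A minimal sketch of that caveat (the helper and variable names here are hypothetical):

$null = New-Module {
    function Invoke-Recreated([scriptblock] $Block) {
        # Recreate the block so it binds to this module's scope domain.
        & ([scriptblock]::Create($Block.ToString()))
    }
}
function Test-Caveat {
    $callerLocal = 'only visible in the caller''s scope domain'
    Invoke-Recreated { $callerLocal }   # outputs nothing: $callerLocal is not
                                        # defined anywhere in the module's scope chain
}
Test-Caveat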
Background Information
The behavior implies that your Invoke-SilentlyAndReturnExitCode function is defined in a module, and each module has its own domain of scopes (hierarchy of scopes).
Your $Command script block, because it was defined outside that module, is bound to the default scope domain, and even when executed from inside a module, it continues to see the variables from the scope domain in which it was defined.
Therefore, $Command still sees the Stop $ErrorActionPreference value, even though for module-originated code inside the function it would be Continue, due to setting a local copy of $ErrorActionPreference inside the module function.
Perhaps surprisingly, it is still the $ErrorActionPreference in effect inside $Command that controls the behavior, not the function-local value.
With a redirection such as 2>$null or *>$null in effect while Stop is the effective $ErrorActionPreference value, the mere presence of stderr output from an external program - whether it indicates a true error or not - triggers a terminating error and therefore the catch branch.
This particular behavior - where the explicit intent to suppress stderr output triggers an error - should be considered a bug, and has been reported in this GitHub issue.
The general behavior, however - a script block executing in the scope in which it was defined - while non-obvious, is by design.
Note: The remainder of this answer is in its original form; it contains general background information that, however, does not cover the module aspect discussed above.
*> $null can be used to silence all output from a command - no need for suppressing the success output stream (>, implied 1>) and the error output stream (2>) separately.
Generally, $ErrorActionPreference has no effect on error output from external programs (such as git), because stderr output from external programs bypasses PowerShell's error stream by default.
There is one exception, however: setting $ErrorActionPreference to 'Stop' actually makes redirections such as 2>&1 and *>$null throw a terminating error if an external program such as git produces any stderr output.
This unexpected behavior is discussed in this GitHub issue.
Otherwise, a call to an external program never triggers a terminating error that a try / catch statement would handle. Success or failure can only be inferred from the automatic $LASTEXITCODE variable.
Therefore, write your function as follows if you define (and call) it outside a module:
function Invoke-SilentlyAndReturnExitCode {
    param([scriptblock]$Command, $Folder)
    # Set a local copy of $ErrorActionPreference,
    # which will go out of scope on exiting this function.
    # For *> $null to effectively suppress stderr output from
    # external programs *without triggering a terminating error*
    # any value other than 'Stop' will do.
    $ErrorActionPreference = 'Continue'
    Push-Location $Folder
    try {
        # Invoke the script block and suppress all of its output.
        # Note that if the script block calls an *external program*, the
        # catch handler will never get triggered - unless the external program
        # cannot be found.
        & $Command *> $null
        $LASTEXITCODE
    }
    catch {
        # Output the exit code used by POSIX-like shells such
        # as Bash to signal that an executable could not be found.
        127
    } finally {
        Pop-Location
    }
}
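A quick usage sketch, reusing the failing directory listing from the question:

$exitCode = Invoke-SilentlyAndReturnExitCode -Command { cmd /c dir xo-xo-xo } -Folder C:\
$exitCode   # 1 on Windows: cmd's dir reports the missing file via stderr and exit code 1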

Make a PowerShell script respond to Register-ObjectEvent events in script mode

I have a simple PowerShell script that I wrote in the PowerShell ISE. The gist of it is that it watches a named pipe for a write as a signal to perform an action, while at the same time monitoring its boss process. When the boss process exits, the script exits as well. Simple.
After struggling to get the named pipe working in PowerShell without crashing, I managed to get working code, which is shown below. However, while this functions great in the PowerShell ISE and interactive terminals, I've had no luck getting it to work as a standalone script.
$bosspid = 16320
# Create the named pipe
$pipe = New-Object System.IO.Pipes.NamedPipeServerStream(
    -join('named-pipe-', $bosspid),
    [System.IO.Pipes.PipeDirection]::InOut,
    1,
    [System.IO.Pipes.PipeTransmissionMode]::Byte,
    [System.IO.Pipes.PipeOptions]::Asynchronous
)
# If we don't do it this way, Powershell crashes
# Brazenly stolen from github.com/Tadas/PSNamedPipes
Add-Type @"
using System;
public sealed class CallbackEventBridge
{
    public event AsyncCallback CallbackComplete = delegate {};

    private void CallbackInternal(IAsyncResult result)
    {
        CallbackComplete(result);
    }

    public AsyncCallback Callback
    {
        get { return new AsyncCallback(CallbackInternal); }
    }
}
"@
$cbbridge = New-Object CallBackEventBridge
Register-ObjectEvent -InputObject $cbbridge -EventName CallBackComplete -Action {
    param($asyncResult)
    $pipe.EndWaitForConnection($asyncResult)
    $pipe.Disconnect()
    $pipe.BeginWaitForConnection($cbbridge.Callback, 1)
    Write-Host 'The named pipe has been written to!'
}
# Make sure to close when boss closes
$bossproc = Get-Process -Id $bosspid -ErrorAction SilentlyContinue
$exitsequence = {
    $pipe.Dispose()
    [Environment]::Exit(0)
}
if (-not $bossproc) { $exitsequence.Invoke() }
Register-ObjectEvent $bossproc -EventName Exited -Action { $exitsequence.Invoke() }
# Begin watching for events until boss closes
$pipe.BeginWaitForConnection($cbbridge.Callback, 1)
The first problem is that the script terminates before doing anything meaningful. Delaying the end of execution with tricks like while($true) loops, the -NoExit flag, the pause command, or even commands that seem made for the purpose, like Wait-Event, keeps the process open but still doesn't make it respond to the events.
I gave up on doing it the "proper" way and have instead reverted to using synchronous code wrapped in while-true blocks and Job control.
$bosspid = (get-process -name notepad).id
# Construct the named pipe's name
$pipename = -join('named-pipe-',$bosspid)
$fullpipename = -join("\\.\pipe\", $pipename)
# This will run in a separate thread
$asyncloop = {
    param($pipename, $bosspid)
    # Create the named pipe
    $pipe = New-Object System.IO.Pipes.NamedPipeServerStream($pipename)
    # The core loop
    while ($true) {
        $pipe.WaitForConnection()
        # The specific signal I'm using to let the loop continue is
        #   echo m > %pipename%
        # in CMD. Powershell's echo will *not* work. Anything other than m
        # will trigger the exit condition.
        if ($pipe.ReadByte() -ne 109) {
            break
        }
        $pipe.Disconnect()
        # (The action this loop is supposed to perform on trigger goes here)
    }
    $pipe.Dispose()
}
# Set up the exit sequence
$bossproc = Get-Process -Id $bosspid -ErrorAction SilentlyContinue
$exitsequence = {
    # While PS's echo doesn't work for passing messages, it does
    # open and close the pipe, which is enough to trigger the exit condition.
    & { echo q > $fullpipename } 2> $null
    [Environment]::Exit(0)
}
if ((-not $bossproc) -or $bossproc.HasExited) { $exitsequence.Invoke() }
# Begin watching for events until boss closes
Start-Job -ScriptBlock $asyncloop -Name "miniloop" -ArgumentList $pipename, $bosspid
while ($true) {
    Start-Sleep 1
    if ($bossproc.HasExited) { $exitsequence.Invoke() }
}
This code works just fine now and does the job I need.
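For completeness, the signal can also be sent from PowerShell itself with a pipe client instead of CMD's echo. A sketch (109 is the byte value of 'm' that the loop checks for):

$client = New-Object System.IO.Pipes.NamedPipeClientStream('.', $pipename, [System.IO.Pipes.PipeDirection]::Out)
$client.Connect(1000)     # wait up to one second for the server
$client.WriteByte(109)    # 'm' lets the loop continue; any other byte ends it
$client.Dispose()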

Run functions in parallel

Is there any way to run functions in parallel in PowerShell?
Something like:
Function BuildParallel($configuration)
{
    $buildJob = {
        param($configuration)
        Write-Host "Building with configuration $configuration."
        RunBuilder $configuration;
    }
    $unitJob = {
        param()
        Write-Host "Running unit."
        RunUnitTests;
    }
    Start-Job $buildJob -ArgumentList $configuration
    Start-Job $unitJob
    While (Get-Job -State "Running")
    {
        Start-Sleep 1
    }
    Get-Job | Receive-Job
    Get-Job | Remove-Job
}
This does not work: it complains about not recognizing RunUnitTests and RunBuilder, which are functions declared in the same script file. Apparently this happens because the script block is a new context and does not know anything about the functions declared in the same file.
I could try to use -InitializationScript in Start-Job, but both RunUnitTests and RunBuilder call more functions declared in the same file or referenced from other files, so...
I'm sure there's a way to do this, since it's just modular programming (functions, routines and all that stuff).
You could have the functions in a separate file and import them into the current context wherever needed via dot-sourcing. I do this in my PowerShell profile to make my custom functions available.
$items = Get-ChildItem "$PSprofilePath\functions"
$items | ForEach-Object {
    . $_.FullName
}
If you wanted to import one file, it would just be:
. C:\some\path\RunUnitTests.ps1
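Applying that to the original jobs problem: since each job runs in a child process, one sketch is to dot-source the shared definitions at the top of the job's script block (the file path here is hypothetical; it's assumed to be a .ps1 defining RunBuilder and friends):

$buildJob = {
    param($configuration)
    # Bring RunBuilder (and the functions it calls) into the child process's scope.
    . C:\some\path\BuildFunctions.ps1
    RunBuilder $configuration
}
Start-Job -ScriptBlock $buildJob -ArgumentList $configuration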

Powershell: Checking if script was invoked with -ErrorAction?

Scripts (with CmdletBinding) and cmdlets all have the standard -ErrorAction parameter available when being invoked. Is there a way, then, to tell from within your script whether it was indeed invoked with -ErrorAction?
The reason I ask is that I want to know whether the value of the automatic $ErrorActionPreference variable, as far as my script is concerned, was set by -ErrorAction or is coming from the session level.
$ErrorActionPreference is a variable in the global (session) scope. If you run a script and don't specify the -ErrorAction parameter, it inherits the value from the global scope ($global:ErrorActionPreference).
If you specify the -ErrorAction parameter, $ErrorActionPreference is changed for your private scope, meaning it stays the same throughout the script except while running code where you specified something else (e.g. you call another script with another -ErrorAction value). Example to test:
Test.ps1
[CmdletBinding()]
param()
Write-Host "Session: $($global:ErrorActionPreference)"
Write-Host "Script: $($ErrorActionPreference)"
Output:
PS-ADMIN > $ErrorActionPreference
Continue
PS-ADMIN > .\Test.ps1
Session: Continue
Script: Continue
PS-ADMIN > .\Test.ps1 -ErrorAction Ignore
Session: Continue
Script: Ignore
PS-ADMIN > $ErrorActionPreference
Continue
If you want to test whether the script was called with the -ErrorAction parameter, you could use, for example:
if ($global:ErrorActionPreference -ne $ErrorActionPreference) { Write-Host "changed" }
If you don't know what scopes are, type this in a PowerShell console: Get-Help about_scopes
Check the $MyInvocation.BoundParameters object. You could use the built-in $PSBoundParameters variable, but I found that in some instances it was empty (not directly related to this question), so in my opinion it's safer to use $MyInvocation.
function Test-ErrorAction
{
    param()
    if ($MyInvocation.BoundParameters.ContainsKey('ErrorAction'))
    {
        'The ErrorAction parameter has been specified'
    }
    else
    {
        'The ErrorAction parameter was not specified'
    }
}
Test-ErrorAction
If you need to know whether the cmdlet was called with the -ErrorAction parameter, do it like this:
[CmdletBinding()]
param()
if ($MyInvocation.Line.ToLower() -match "-erroraction")
{
    "yessss"
}
else
{
    "nooooo"
}
This is true also when the parameter has the same value as the global $ErrorActionPreference.

How to properly use the -verbose and -debug parameters in a custom cmdlet

By default, any named function that has the [CmdletBinding()] attribute accepts the -Debug and -Verbose parameters (and a few others) and has the predefined $debug and $verbose variables. I'm trying to figure out how to pass them on to other cmdlets that get called within the function.
Let's say I have a cmdlet like this:
function DoStuff() {
    [CmdletBinding()]
    param()
    PROCESS {
        new-item Test -type Directory
    }
}
If -debug or -verbose was passed into my function, I want to pass that flag into the new-item cmdlet. What's the right pattern for doing this?
$PSBoundParameters isn't what you're looking for. The [CmdletBinding()] attribute allows the use of $PSCmdlet within your script, in addition to providing a Verbose flag. It is in fact this same Verbose that you're supposed to use.
Through [CmdletBinding()], you can access the bound parameters through $PSCmdlet.MyInvocation.BoundParameters. Here's a function that uses CmdletBinding and simply enters a nested prompt immediately, in order to examine the variables available inside the function scope.
PS D:\> function hi { [CmdletBinding()]param([string] $Salutation) $host.EnterNestedPrompt() }; hi -Salutation Yo -Verbose
PS D:\>>> $PSBoundParameters
____________________________________________________________________________________________________
PS D:\>>> $PSCmdlet.MyInvocation.BoundParameters
Key        Value
---        -----
Salutation Yo
Verbose    True
So in your example, you would want the following:
function DoStuff
{
    [CmdletBinding()]
    param ()
    process
    {
        new-item Test -type Directory `
            -Verbose:($PSCmdlet.MyInvocation.BoundParameters["Verbose"].IsPresent -eq $true)
    }
}
This covers -Verbose, -Verbose:$false, -Verbose:$true, and the case where the switch is not present at all.
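A quick sketch of those cases:

DoStuff -Verbose          # New-Item runs with verbose output
DoStuff -Verbose:$true    # same as above
DoStuff -Verbose:$false   # explicit false: no verbose output
DoStuff                   # switch absent: no verbose output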
Perhaps it sounds strange, but there isn't any easy way for a cmdlet to know its verbose or debug mode. Take a look at the related question:
How does a cmdlet know when it really should call WriteVerbose()?
One not perfect, but practically reasonable, option is to introduce your own cmdlet parameters (for example, $MyVerbose and $MyDebug) and use them in the code explicitly:
function DoStuff {
    [CmdletBinding()]
    param
    (
        # Unfortunately, we cannot use Verbose name with CmdletBinding
        [switch]$MyVerbose
    )
    process {
        if ($MyVerbose) {
            # Do verbose stuff
        }
        # Pass $MyVerbose in the cmdlet explicitly
        New-Item Test -Type Directory -Verbose:$MyVerbose
    }
}
DoStuff -MyVerbose
UPDATE
When we need only a switch (not, say, a verbosity level value) then the approach with $PSBoundParameters is perhaps better than proposed in the first part of this answer (with extra parameters):
function DoStuff {
    [CmdletBinding()]
    param()
    process {
        if ($PSBoundParameters['Verbose']) {
            # Do verbose stuff
        }
        New-Item Test -Type Directory -Verbose:($PSBoundParameters['Verbose'] -eq $true)
    }
}
DoStuff -Verbose
It's all not perfect anyway. If there are better solutions then I would really like to know them myself.
There is no need. PowerShell already does this, as the code below proves.
function f { [cmdletbinding()]Param()
    "f is called"
    Write-Debug Debug
    Write-Verbose Verbose
}
function g { [cmdletbinding()]Param()
    "g is called"
    f
}
g -Debug -Verbose
The output is
g is called
f is called
DEBUG: Debug
VERBOSE: Verbose
It is not done as directly as passing -Debug to the next cmdlet, though. It is done through the $DebugPreference and $VerbosePreference variables. Write-Debug and Write-Verbose act like you would expect, but if you want to do something different with debug or verbose output you can read here how to check for yourself.
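If you do want to branch on the inherited mode yourself, a sketch is to test the preference variables directly (note that -Debug sets $DebugPreference to Inquire on Windows PowerShell but Continue on PowerShell 7+, so comparing against SilentlyContinue is the safer test):

function f {
    [CmdletBinding()]param()
    if ($VerbosePreference -eq 'Continue') {
        # custom verbose-only work here
    }
    if ($DebugPreference -ne 'SilentlyContinue') {
        # custom debug-only work here
    }
}
f -Verbose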
Here's my solution:
function DoStuff {
    [CmdletBinding()]
    param ()
    BEGIN
    {
        $CMDOUT = @{
            Verbose = If ($PSBoundParameters.Verbose -eq $true) { $true } else { $false };
            Debug = If ($PSBoundParameters.Debug -eq $true) { $true } else { $false }
        }
    } # BEGIN ENDS
    PROCESS
    {
        New-Item Example -ItemType Directory @CMDOUT
    } # PROCESS ENDS
    END
    {
    } # END ENDS
}
What this does differently from the other examples is that it respects -Verbose:$false or -Debug:$false. It only sets -Verbose/-Debug to $true if you use the following:
DoStuff -Verbose
DoStuff -Verbose:$true
DoStuff -Debug
DoStuff -Debug:$true
You could build a new hash table based on the bound Debug or Verbose parameters and then splat it to the internal command. If you're just specifying switches (and aren't passing a false switch, like -Debug:$false), you can just check for the existence of Debug or Verbose:
function DoStuff() {
    [CmdletBinding()]
    param()
    PROCESS {
        $HT = @{ Verbose = $PSBoundParameters.ContainsKey('Verbose'); Debug = $PSBoundParameters.ContainsKey('Debug') }
        new-item Test -type Directory @HT
    }
}
If you want to pass the parameter value it's more complicated, but can be done with:
function DoStuff {
    [CmdletBinding()]
    param()
    PROCESS {
        $v, $d = $null
        if (!$PSBoundParameters.TryGetValue('Verbose', [ref]$v)) { $v = $false }
        if (!$PSBoundParameters.TryGetValue('Debug', [ref]$d)) { $d = $false }
        $HT = @{ Verbose = $v; Debug = $d }
        new-item Test -type Directory @HT
    }
}
The best way to do it is by setting $VerbosePreference. This enables verbose output for the entire script. Do not forget to disable it at the end of the script.
Function test
{
    [CmdletBinding()]
    param($param1)
    if ($psBoundParameters['verbose'])
    {
        $VerbosePreference = "Continue"
        Write-Verbose "Verbose mode is on"
    }
    else
    {
        $VerbosePreference = "SilentlyContinue"
        Write-Verbose "Verbose mode is off"
    }
    # <Your code>
}
You can set the VerbosePreference as a global variable on starting your script and then check for the global variable in your custom cmdlet.
Script:
$global:VerbosePreference = $VerbosePreference
Your-CmdLet
Your-CmdLet:
if ($global:VerbosePreference -eq 'Continue') {
    # verbose code
}
Checking explicitly for 'Continue' makes the script behave as if -Verbose:$false had been passed when you call the CmdLet from a script that doesn't set the global variable (in which case it's $null).
You do not have to do any checks or comparisons. Even though -Verbose (and -Debug) are of type [switch], they accept not just $true and $false but also their preference variable. The preference variable is also inherited correctly by all child functions that are called. I tried this on PowerShell 7.3.2 and it works as expected.
function Parent {
    [CmdletBinding()]param()
    Child
}
function Child {
    [CmdletBinding()]param()
    New-Item C:\TEST\SomeDir -Force -ItemType Directory -Verbose:$VerbosePreference -Debug:$DebugPreference
}
Parent -Verbose
Parent -Debug
I think this is the easiest way:
Function Test {
    [CmdletBinding()]
    Param (
        [parameter(Mandatory=$False)]
        [String]$Message
    )
    Write-Host "This is INFO message"
    if ($PSBoundParameters.debug) {
        Write-Host -fore cyan "This is DEBUG message"
    }
    if ($PSBoundParameters.verbose) {
        Write-Host -fore green "This is VERBOSE message"
    }
    ""
}
Test -Verbose -Debug