How do I clear $error and $LASTEXITCODE set by an external cmdlet or executable? - powershell

I have a custom module that wraps an external command (csrun.exe) and parses its output so I can use it in PowerShell.
Everything just about works, except when the external command writes to stderr; clearing the error in my cmdlet doesn't seem to fully work. It does clear (i.e. $error.Count is 0 and $LASTEXITCODE is 0), but once I return to the script that calls my cmdlet, $error and $LASTEXITCODE are no longer clear, and the entry in $error references the underlying exception from the external command:
System.Management.Automation.RemoteException: The compute emulator is not running.
I've attempted try/catch blocks and clearing the variables mentioned above. Regardless, the calling script retains a reference to the error.
CustomModule.psm1
$__azureEmulatorPath = "C:\Program Files\Microsoft SDKs\Azure\Emulator\"
$__azureEmulator = $__azureEmulatorPath + "csrun.exe"

function Get-EmulatorStatus() {
    [OutputType([ComputeEmulatorStatus])]
    [CmdletBinding()]
    param()

    $output = (& $__azureEmulator /status | Out-String)

    if ($Error.Count -gt 0 -or $LASTEXITCODE -ne 0) {
        Write-Host ($Error | Format-List -Force | Out-String)
        Write-Host "Clearing Error and Continuing"
        $Error.Clear()
        $LASTEXITCODE = 0
    }

    # error from command cleared here
    return $output
}

Export-ModuleMember -Function *
Test.ps1
Import-Module "CustomModule.psm1" # defines the Get-EmulatorStatus cmdlet
$status = Get-EmulatorStatus
# even though the error was cleared in the cmdlet, it is still here
Write-Host "Error" $LASTEXITCODE, $Error.Count
Write-Host ($Error | Format-List -Force | Out-String)

Try one of two options:
use exit from your cmdlet, e.g. exit 0 (preferred);
use the global scope when setting the codes explicitly, e.g.
$global:LASTEXITCODE
I ran into this calling robocopy, which sets non-zero exit codes even on success and interfered with Jenkins automation.
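For example, a minimal sketch of the second option applied to the Get-EmulatorStatus wrapper above (a sketch only; the point is writing to the global scope so the caller sees the reset values):
function Get-EmulatorStatus() {
    [CmdletBinding()]
    param()

    $output = (& $__azureEmulator /status | Out-String)

    if ($Error.Count -gt 0 -or $LASTEXITCODE -ne 0) {
        # Reset the *global* copies; a plain $LASTEXITCODE = 0 only creates a
        # function-local variable that vanishes when the function returns.
        $global:Error.Clear()
        $global:LASTEXITCODE = 0
    }

    return $output
}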

Related

How to implement Invoke-SilentlyAndReturnExitCode as a Powershell module function?

Please, observe:
The method
PS C:\> (Get-Command Invoke-SilentlyAndReturnExitCode).ScriptBlock
param([scriptblock]$Command, $Folder)
$ErrorActionPreference = 'Continue'
Push-Location $Folder
try
{
    & $Command > $null 2>&1
    $LASTEXITCODE
}
catch
{
    -1
}
finally
{
    Pop-Location
}
PS C:\>
The command to silence
PS C:\> $ErrorActionPreference = "Stop"
PS C:\> $Command = { cmd /c dir xo-xo-xo }
PS C:\> & $Command > $null 2>&1
cmd : File Not Found
At line:1 char:14
+ $Command = { cmd /c dir xo-xo-xo }
+ ~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (File Not Found:String) [], RemoteException
+ FullyQualifiedErrorId : NativeCommandError
PS C:\>
As you can see, it fails with an exception. But we can silence it easily, right?
PS C:\> $ErrorActionPreference = 'SilentlyContinue'
PS C:\> & $Command > $null 2>&1
PS C:\> $LASTEXITCODE
1
PS C:\>
All is good. Now my function does the same, so let us try it:
PS C:\> $ErrorActionPreference = "Stop"
PS C:\> Invoke-SilentlyAndReturnExitCode $Command
-1
PS C:\>
Yikes! It returns -1, not 1.
The problem appears to be that setting $ErrorActionPreference inside the function does not actually propagate to the command scope. Indeed, let me add some output:
PS C:\> (Get-Command Invoke-SilentlyAndReturnExitCode).ScriptBlock
param([scriptblock]$Command, $Folder)
$ErrorActionPreference = 'Continue'
Push-Location $Folder
try
{
    Write-Host $ErrorActionPreference
    & $Command > $null 2>&1
    $LASTEXITCODE
}
catch
{
    -1
}
finally
{
    Pop-Location
}
PS C:\> $Command = { Write-Host $ErrorActionPreference ; cmd /c dir xo-xo-xo }
PS C:\> Invoke-SilentlyAndReturnExitCode $Command
Continue
Stop
-1
PS C:\>
So, the problem is really around $ErrorActionPreference - why does it not propagate? PowerShell uses dynamic scoping, so the command definition should not capture its value but should use the one from the function. So what is going on, and how do I fix it?
tl;dr
Because your Invoke-SilentlyAndReturnExitCode function is defined in a module, you must recreate your script block in the scope of that module for it to see the module-local $ErrorActionPreference value of Continue:
# Use an in-memory module to demonstrate the behavior.
$null = New-Module {
    Function Invoke-SilentlyAndReturnExitCode {
        param([scriptblock] $Command, $Folder)
        $ErrorActionPreference = 'Continue'
        Push-Location $Folder
        try
        {
            Write-Host $ErrorActionPreference # local value
            # *Recreate the script block in the scope of this module*,
            # which makes it see the module's variables.
            $Command = [scriptblock]::Create($Command.ToString())
            # Invoke the recreated script block, suppressing all output.
            & $Command *>$null
            # Output the exit code.
            $LASTEXITCODE
        }
        catch
        {
            -1
        }
        finally
        {
            Pop-Location
        }
    }
}
$ErrorActionPreference = 'Stop'
$Command = { Out-Host -InputObject $ErrorActionPreference; cmd /c dir xo-xo-xo }
Invoke-SilentlyAndReturnExitCode $Command
On Windows, the above now prints the following, as expected:
Continue
Continue
1
That is, the recreated $Command script block saw the function-local $ErrorActionPreference value, and the catch block was not triggered.
Caveat:
This will only work if the $Command script block contains no references to variables in the originating scope other than variables in the global scope.
The alternative to avoid this limitation is to define the function outside of a module (assuming you're also calling it from code that lives outside modules).
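A quick demonstration of that caveat, using a throwaway in-memory module and made-up names; after recreation, the script block no longer sees the caller's local variables:
$null = New-Module {
    function Invoke-Rebound([scriptblock] $sb) {
        # Re-bind the block to this module's scope domain, as above.
        $sb = [scriptblock]::Create($sb.ToString())
        & $sb
    }
}
function Test-Caveat {
    $localVar = 'caller-local'
    Invoke-Rebound { "[$localVar]" }   # prints "[]": the caller's local variable
                                       # is not visible from the module's scope domain
}
Test-Caveat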
Background Information
The behavior implies that your Invoke-SilentlyAndReturnExitCode function is defined in a module, and each module has its own domain of scopes (hierarchy of scopes).
Your $Command script block, because it was defined outside that module, is bound to the default scope domain, and even when executed from inside a module, it continues to see the variables from the scope domain in which it was defined.
Therefore, $Command still sees the Stop $ErrorActionPreference value, even though for module-originated code inside the function it would be Continue, due to setting a local copy of $ErrorActionPreference inside the module function.
Perhaps surprisingly, it is still the $ErrorActionPreference in effect inside $Command that controls the behavior, not the function-local value.
With a redirection such as 2>$null or *>$null in effect while Stop is the effective $ErrorActionPreference value, the mere presence of stderr output from an external program - whether it indicates a true error or not - triggers a terminating error and therefore the catch branch.
This particular behavior - where the explicit intent to suppress stderr output triggers an error - should be considered a bug, and has been reported in this GitHub issue.
The general behavior, however - a script block executing in the scope in which it was defined - while non-obvious, is by design.
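As a stripped-down illustration of that by-design behavior (the module and variable names here are made up): a script block keeps resolving variables against the scope domain it was created in, even when a module function invokes it.
$x = 'caller'                      # lives in the default (global) scope domain
$null = New-Module {
    function Invoke-It([scriptblock] $s) {
        $x = 'module'              # module-local value
        & $s                       # prints 'caller': the block stays bound to
                                   # the scope domain in which it was created
    }
}
Invoke-It { $x }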
Note: The remainder of this answer is in its original form; it contains general background information that, however, does not cover the module aspect discussed above.
*> $null can be used to silence all output from a command - no need for suppressing the success output stream (>, implied 1>) and the error output stream (2>) separately.
Generally, $ErrorActionPreference has no effect on error output from external programs (such as git), because stderr output from external programs bypasses PowerShell's error stream by default.
There is one exception, however: setting $ErrorActionPreference to 'Stop' actually makes redirections such as 2>&1 and *>$null throw a terminating error if an external program such as git produces any stderr output.
This unexpected behavior is discussed in this GitHub issue.
Otherwise, a call to an external program never triggers a terminating error that a try / catch statement would handle. Success or failure can only be inferred from the automatic $LASTEXITCODE variable.
Therefore, write your function as follows if you define (and call) it outside a module:
function Invoke-SilentlyAndReturnExitCode {
    param([scriptblock]$Command, $Folder)
    # Set a local copy of $ErrorActionPreference,
    # which will go out of scope on exiting this function.
    # For *> $null to effectively suppress stderr output from
    # external programs *without triggering a terminating error*,
    # any value other than 'Stop' will do.
    $ErrorActionPreference = 'Continue'
    Push-Location $Folder
    try {
        # Invoke the script block and suppress all of its output.
        # Note that if the script block calls an *external program*, the
        # catch handler will never get triggered - unless the external program
        # cannot be found.
        & $Command *> $null
        $LASTEXITCODE
    }
    catch {
        # Output the exit code used by POSIX-like shells such
        # as Bash to signal that an executable could not be found.
        127
    } finally {
        Pop-Location
    }
}
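For illustration, a hypothetical call mirroring the earlier transcript; with the function defined outside a module, the made-up directory name makes cmd report exit code 1 instead of tripping the catch block:
$ErrorActionPreference = 'Stop'
Invoke-SilentlyAndReturnExitCode { cmd /c dir xo-xo-xo }    # -> 1
Invoke-SilentlyAndReturnExitCode { no-such-tool.exe }       # -> 127 (command not found)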

redirect stdout, stderr from powershell script as admin through start-process

Inside a PowerShell script, I'm running a command that starts a new PowerShell as admin (if I'm not already, and only if needed, depending on $arg) and then runs the script again.
I'm trying to redirect stdout and stderr to the first terminal.
To make things harder still, there are arguments too.
param([string]$arg="help")

if($arg -eq "start" -Or $arg -eq "stop")
{
    if(![bool](([System.Security.Principal.WindowsIdentity]::GetCurrent()).groups -match "S-1-5-32-544"))
    {
        Start-Process powershell -Verb runas -ArgumentList " -file servicemssql.ps1 $arg"
        exit
    }
}

$Services = "MSSQLSERVER", "SQLSERVERAGENT", "MSSQLServerOLAPService", "SSASTELEMETRY", "SQLBrowser", `
            "SQLTELEMETRY", "MSSQLLaunchpad", "SQLWriter", "MSSQLFDLauncher"

function startsql {
    "starting SQL services"
    Foreach ($s in $Services) {
        "starting $s"
        Start-Service -Name "$s"
    }
}

function stopsql {
    "stopping SQL services"
    Foreach ($s in $Services) {
        "stopping $s"
        Stop-Service -Force -Name "$s"
    }
}

function statussql {
    "getting SQL services status"
    Foreach ($s in $Services) {
        Get-Service -Name "$s"
    }
}

function help {
    "usage: StartMssql [status|start|stop]"
}

Switch ($arg) {
    "start" { startsql }
    "stop" { stopsql }
    "status" { statussql }
    "help" { help }
    "h" { help }
}
Using the following answers on SO doesn't work:
Capturing standard out and error with Start-Process
Powershell: Capturing standard out and error with Process object
How do I deal with double quotes inside double quotes while preserving the variable ($arg) expansion?
PowerShell's Start-Process cmdlet:
does have -RedirectStandardOut and -RedirectStandardError parameters,
but syntactically they cannot be combined with -Verb Runas, the argument required to start a process elevated (with administrative privileges).
This constraint is also reflected in the underlying .NET API, where setting the .UseShellExecute property on a System.Diagnostics.ProcessStartInfo instance to true - the prerequisite for being able to use .Verb = "RunAs" in order to run elevated - means that you cannot use the .RedirectStandardOutput and .RedirectStandardError properties.
Overall, this suggests that you cannot directly capture an elevated process' output streams from a non-elevated process.
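A quick sketch of that .NET constraint, for reference (the exact exception text can differ between .NET versions):
$psi = New-Object System.Diagnostics.ProcessStartInfo 'powershell.exe'
$psi.Verb = 'RunAs'
$psi.UseShellExecute = $true           # required for the RunAs verb to take effect
$psi.RedirectStandardOutput = $true    # incompatible with UseShellExecute = $true
try {
    [System.Diagnostics.Process]::Start($psi) | Out-Null
} catch {
    $_.Exception.Message   # e.g. "... UseShellExecute property set to false in order to redirect IO streams."
}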
A pure PowerShell workaround is not trivial:
param([string] $arg='help')

if ($arg -in 'start', 'stop') {

    if (-not (([System.Security.Principal.WindowsPrincipal] [System.Security.Principal.WindowsIdentity]::GetCurrent()).IsInRole('Administrators'))) {

        # Invoke the script via -Command rather than -File, so that
        # a redirection can be specified.
        $passThruArgs = '-command', '&', 'servicemssql.ps1', $arg, '*>', "`"$PSScriptRoot\out.txt`""

        Start-Process powershell -Wait -Verb RunAs -ArgumentList $passThruArgs

        # Retrieve the captured output streams here:
        Get-Content "$PSScriptRoot\out.txt"

        exit
    }
}
# ...
Instead of -File, -Command is used to invoke the script, because that allows appending a redirection to the command: *> redirects all output streams.
soleil suggests using Tee-Object as an alternative, so that the output produced by the elevated process is not only captured, but also printed to the (invariably new window's) console as it is being produced:
..., $arg, '|', 'Tee-Object', '-FilePath', "`"$PSScriptRoot\out.txt`""
Caveat: While it doesn't make a difference in this simple case, it's important to know that arguments are parsed differently in -File and -Command modes. In a nutshell, with -File the arguments following the script name are treated as literals, whereas the arguments following -Command form a command that is evaluated according to normal PowerShell rules in the target session. This has implications for escaping, for instance; notably, values with embedded spaces must be surrounded with quotes as part of the value (a quick illustration follows below).
The $PSScriptRoot\ path component in output-capture file $PSScriptRoot\out.txt ensures that the file is created in the same folder as the calling script (elevated processes default to $env:SystemRoot\System32 as the working dir.)
Similarly, this means that script file servicemssql.ps1, if it is invoked without a path component, must be in one of the directories listed in $env:PATH in order for the elevated PowerShell instance to find it; otherwise, a full path is also required, such as $PSScriptRoot\servicemssql.ps1.
-Wait ensures that control doesn't return until the elevated process has exited, at which point file $PSScriptRoot\out.txt can be examined.
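To make the -File vs. -Command parsing caveat above concrete, a hypothetical comparison (paths and file names are made up):
# -File: everything after the script name is a literal argument.
powershell -File .\servicemssql.ps1 start
# -Command: the remaining arguments are joined and evaluated as PowerShell source,
# so a value with spaces needs embedded quotes that survive the outer parsing.
powershell -Command "& .\servicemssql.ps1 start *> 'C:\path with spaces\out.txt'"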
As for the follow-up question:
To go even further, could we have a way to run the admin shell invisibly, and read the file as we go, with the Unix equivalent of tail -f, from the non-privileged shell?
It is possible to run the elevated process itself invisibly, but note that you'll still get the UAC confirmation prompt. (If you were to turn UAC off (not recommended), you could use Start-Process -NoNewWindow to run the process in the same window.)
To also monitor output as it is being produced, tail -f-style, a PowerShell-only solution is both nontrivial and not the most efficient; to wit:
param([string]$arg='help')

if ($arg -in 'start', 'stop') {

    if (-not (([System.Security.Principal.WindowsPrincipal] [System.Security.Principal.WindowsIdentity]::GetCurrent()).IsInRole('Administrators'))) {

        # Delete any old capture file.
        $captureFile = "$PSScriptRoot\out.txt"
        Remove-Item -ErrorAction Ignore $captureFile

        # Start the elevated process *hidden and asynchronously*, passing
        # a [System.Diagnostics.Process] instance representing the new process out,
        # which can be used to monitor the process.
        $passThruArgs = '-noprofile', '-command', '&', "servicemssql.ps1", $arg, '*>', $captureFile
        $ps = Start-Process powershell -WindowStyle Hidden -PassThru -Verb RunAs -ArgumentList $passThruArgs

        # Wait for the capture file to appear, so we can start
        # "tailing" it.
        While (-not $ps.HasExited -and -not (Test-Path -LiteralPath $captureFile)) {
            Start-Sleep -Milliseconds 100
        }

        # Start an aux. background job that removes the capture file when the elevated
        # process exits. This will make Get-Content -Wait below stop waiting.
        $jb = Start-Job {
            # Wait for the process to exit.
            # Note: $using:ps cannot be used directly, because, due to
            # serialization/deserialization, it is not a live object.
            $ps = (Get-Process -Id $using:ps.Id)
            while (-not $ps.HasExited) { Start-Sleep -Milliseconds 100 }
            # Get-Content -Wait only checks once every second, so we must make
            # sure that it has seen the latest content before we delete the file.
            Start-Sleep -Milliseconds 1100
            # Delete the file, which will make Get-Content -Wait exit (with an error).
            Remove-Item -LiteralPath $using:captureFile
        }

        # Output the content of $captureFile and wait for new content to appear
        # (-Wait), similar to tail -f.
        # -OutVariable capturedLines collects all output in
        # variable $capturedLines for later inspection.
        Get-Content -ErrorAction SilentlyContinue -Wait -OutVariable capturedLines -LiteralPath $captureFile

        Remove-Job -Force $jb # Remove the aux. job

        Write-Verbose -Verbose "$($capturedLines.Count) line(s) captured."

        exit
    }
}
# ...

Executing command doesn't result in script exception

We have a PowerShell script that deploys a database script. However, if the database script fails, the failure doesn't surface as an exception in the PowerShell script.
Below is an example of the .ps1 file:
function Publish-DatabaseProject
{
    sqlcmd -S . -b -v DatabaseName=Integration -q "alter table xx add test Varchar(10)"
}

function Add-Timestamp {
    process {
        if ($_.GetType() -eq [string]) {
            "[$(Get-Date -Format o)] $_"
        } else {
            $_
        }
    }
}

function Write-LogFile {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory=$true)] [string] $Path,
        [Parameter(Mandatory=$true, ValueFromPipeline=$true)] [object[]] $InputObject
    )
    begin {
        $root = Split-Path -Path $Path -Parent
        if ($root -and !(Test-Path -Path $root)) { New-Item -Path $root -Type Directory | Out-Null }
    }
    process {
        $InputObject |
            Add-Timestamp |
            Tee-Object -File $Path -Append
    }
}

Publish-DatabaseProject -ErrorVariable DeployError 2>&1 |
    Write-LogFile -Path "C:\output.log"

if ($DeployError -and $DeployError.Count -ne 0)
{
    Write-Output "Failed"
} else
{
    Write-Output "Succeeded"
}
The query in question is executing against a non-existent table. The text output shows:
[2015-12-11T14:42:45.1973944+00:00] Msg 4902, Level 16, State 1, Server ABDN-DEV-PC1, Line 1
[2015-12-11T14:42:45.2053944+00:00] Cannot find the object "xx" because it does not exist or you do not have permissions.
Succeeded
I am expecting the last line to read: Failed.
If you run the sqlcmd line on its own, and follow it up with $LastExitCode, it does spit out a non-zero exit code.
> sqlcmd -S . -b -v DatabaseName=Integration -q "alter table xx add test Varchar(10)"
Msg 4902, Level 16, State 1, Server ABDN-DEV-PC1, Line 1
Cannot find the object "xx" because it does not exist or you do not have permissions.
> $LastExitCode
1
For various reasons, we cannot use Invoke-SqlCmd, and need to stick with SQLCMD.exe.
How can we make exceptions from within SQLCMD bubble out correctly to the calling script?
Your -ErrorVariable DeployError would only be populated if the Publish-DatabaseProject function itself wrote an error. As that function is mostly a wrapper around sqlcmd.exe, there isn't any logic to bubble up this error. We can add it, though, by using the $LastExitCode automatic variable.
function Publish-DatabaseProject
{
    sqlcmd -S . -b -v DatabaseName=Integration -q "alter table xx add test Varchar(10)"
    if ($LastExitCode -ne 0)
    {
        Write-Error $LastExitCode
        throw $LastExitCode
    }
}
Now, PowerShell will catch this error from the .exe, and you can use -ErrorVariable.
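As a sketch of the calling side under that approach: note that common parameters such as -ErrorVariable and -ErrorAction are only available if the function also declares [CmdletBinding()], which is an assumption added here; this variant keeps only the non-terminating Write-Error so the rest of the script keeps running (the throw variant pairs with the try/catch shown in the update below).
function Publish-DatabaseProject
{
    [CmdletBinding()]   # assumption: needed so -ErrorVariable / -ErrorAction are accepted
    param()
    sqlcmd -S . -b -v DatabaseName=Integration -q "alter table xx add test Varchar(10)"
    if ($LastExitCode -ne 0)
    {
        # A non-terminating error populates -ErrorVariable without stopping the caller.
        Write-Error "sqlcmd exited with code $LastExitCode"
    }
}

Publish-DatabaseProject -ErrorVariable DeployError 2>&1 | Write-LogFile -Path "C:\output.log"
if ($DeployError) { Write-Output "Failed" } else { Write-Output "Succeeded" }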
Update
So, since you want to keep running and not abandon ship when encountering an error, we need to wrap the call to your Publish-DatabaseProject function in a try{}catch{} block, to catch the error we're generating without stopping execution.
try {
    Publish-DatabaseProject -ErrorAction Stop -ErrorVariable DeployError 2>&1 |
        Write-LogFile -Path "C:\output.log"
}
catch {
    Write-Output "Current job $($_), failed"
    Write-Output "Exception $($_.Exception.Message)"
}
Now we're properly generating an exception from a CMD, bubbling it up through our Function, and handling it as if it were a normal function, all using Native PowerShell. I believe this is what you were seeking to do. If not, post a gist of the whole script and I'll help you in a more targeted fashion.

Powershell: How do I get the exit code returned from a process run inside a PsJob?

I have the following job in powershell:
$job = start-job {
...
c:\utils\MyToolReturningSomeExitCode.cmd
} -ArgumentList $JobFile
How do I access the exit code returned by c:\utils\MyToolReturningSomeExitCode.cmd? I have tried several options, but the only one I could find that works is this:
$job = start-job {
...
c:\utils\MyToolReturningSomeExitCode.cmd
$LASTEXITCODE
} -ArgumentList $JobFile
...
# collect the output
$exitCode = $job | Wait-Job | Receive-Job -ErrorAction SilentlyContinue
# output all, except the last line
$exitCode[0..($exitCode.Length - 2)]
# the last line is the exit code
exit $exitCode[-1]
I find this approach too wry for my delicate taste. Can anyone suggest a nicer solution?
Importantly, I have read in the documentation that PowerShell must be run as administrator in order for the job-related remoting stuff to work. I cannot run it as administrator, hence -ErrorAction SilentlyContinue. So, I am looking for solutions not requiring admin privileges.
Thanks.
If all you need is to do something in the background while the main script does something else, then the PowerShell class is enough (and it is normally faster). Besides, it allows passing in a live object in order to return something in addition to output via parameters.
$code = @{}
$job = [PowerShell]::Create().AddScript({
    param($JobFile, $Result)
    cmd /c exit 42
    $Result.Value = $LASTEXITCODE
    'some output'
}).AddArgument($JobFile).AddArgument($code)

# start the job
$async = $job.BeginInvoke()

# do some other work while $job is working
#.....

# end the job, get results
$job.EndInvoke($async)

# the exit code is $code.Value
"Code = $($code.Value)"
UPDATE
The original code used a [ref] object. It works in PS V3 CTP2 but does not work in V2, so I corrected it; we can use other objects instead (a hashtable, for example) in order to return some data via parameters.
One way you can detect if the background job failed or not based on an exit code is to evaluate the exit code inside the background job itself and throw an exception if the exit code indicates an error occurred. For instance, consider the following example:
$job = Start-Job {
    # ...
    $output = & C:\utils\MyToolReturningSomeExitCode.cmd 2>&1
    if ($LASTEXITCODE -ne 0) {
        throw "Job failed. The error was: {0}." -f ([string] $output)
    }
} -ArgumentList $JobFile

$myJob = $job | Wait-Job
if ($myJob.State -eq 'Failed') {
    Receive-Job -Job $myJob
}
A couple of things are of note in this example. I am redirecting the standard error stream to the standard output stream to capture all textual output from the batch script, and returning it if the exit code is non-zero, indicating it failed to run. By throwing an exception this way, the background job object's State property will let us know the result of the job.
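For illustration, a self-contained sketch in which cmd /c exit 3 stands in for the real tool, showing how the failure surfaces in the caller:
$myJob = Start-Job {
    cmd /c exit 3    # placeholder for the real tool
    if ($LASTEXITCODE -ne 0) {
        throw "Job failed with exit code $LASTEXITCODE."
    }
} | Wait-Job

$myJob.State             # Failed
Receive-Job -Job $myJob  # re-surfaces the thrown error in the calling session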

Redirecting output to $null in PowerShell, but ensuring the variable remains set

I have some code:
$foo = someFunction
This outputs a warning message which I want to redirect to $null:
$foo = someFunction > $null
The problem is that when I do this, while successfully supressing the warning message, it also has the negative side-effect of NOT populating $foo with the result of the function.
How do I redirect the warning to $null, but still keep $foo populated?
Also, how do you redirect both standard output and standard error to null? (In Linux, it's 2>&1.)
I'd prefer this way to redirect standard output (native PowerShell)...
($foo = someFunction) | out-null
But this works too:
($foo = someFunction) > $null
To redirect just standard error after defining $foo with result of "someFunction", do
($foo = someFunction) 2> $null
This is effectively the same as mentioned above.
Or to redirect any standard error messages from "someFunction" and then defining $foo with the result:
$foo = (someFunction 2> $null)
To redirect both you have a few options:
2>&1>$null
2>&1 | out-null
ADDENDUM:
Please note that (Windows) PowerShell has many more streams than a Linux-based OS. Per the Microsoft docs, they are: 1 (Success), 2 (Error), 3 (Warning), 4 (Verbose), 5 (Debug), 6 (Information), and * (All).
Thus you can redirect all streams using the wildcard with *>$null, and you can also use a file instead of $null.
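For instance (someFunction is the question's hypothetical function; the first line assumes its message is actually emitted via Write-Warning, i.e. on stream 3):
# Discard only the warning stream (3), keeping the return value for $foo:
$foo = someFunction 3> $null
# Or send every stream to a log file instead of discarding it:
someFunction *> "$env:TEMP\someFunction-all-streams.log"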
This should work.
$foo = someFunction 2>$null
If it's errors you want to hide you can do it like this
$ErrorActionPreference = "SilentlyContinue"; #This will hide errors
$someObject.SomeFunction();
$ErrorActionPreference = "Continue"; #Turning errors back on
Warning messages should be written using the Write-Warning cmdlet, which allows the warning messages to be suppressed with the -WarningAction parameter or the $WarningPreference automatic variable. A function needs to use CmdletBinding to implement this feature.
function WarningTest {
    [CmdletBinding()]
    param($n)

    Write-Warning "This is a warning message for: $n."
    "Parameter n = $n"
}

$a = WarningTest 'test one' -WarningAction SilentlyContinue

# To turn off warnings for multiple commands,
# use the WarningPreference variable
$WarningPreference = 'SilentlyContinue'

$b = WarningTest 'test two'
$c = WarningTest 'test three'

# Turn messages back on.
$WarningPreference = 'Continue'
$c = WarningTest 'test four'
To make it shorter at the command prompt, you can use -wa 0:
PS> WarningTest 'parameter alias test' -wa 0
Write-Error, Write-Verbose and Write-Debug offer similar functionality for their corresponding types of messages.
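As a hedged sketch of that analogy, the verbose stream (4) behaves the same way:
function VerboseTest {
    [CmdletBinding()]
    param($n)
    Write-Verbose "Verbose detail for: $n."
    "Parameter n = $n"
}

$a = VerboseTest 'one'                       # verbose message hidden by default
$b = VerboseTest 'two' -Verbose              # shown
$c = VerboseTest 'three' -Verbose 4> $null   # requested, then discarded via stream 4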
using a function:
function run_command ($command)
{
    invoke-expression "$command *>$null"
    return $_
}

if (!(run_command "dir *.txt"))
{
    if (!(run_command "dir *.doc"))
    {
        run_command "dir *.*"
    }
}
or if you like one-liners:
function run_command ($command) { invoke-expression "$command "|out-null; return $_ }
if (!(run_command "dir *.txt")) { if (!(run_command "dir *.doc")) { run_command "dir *.*" } }
Recently, I had to shut up PowerShell on a Linux host, and this wasn't that obvious to figure out. After some back and forth I found out that wrapping a command in $( ) and adding an explicit redirection after the wrapper works.
Anything else I tried wouldn't work - I still don't know why, since the PowerShell docs are of desirable quality (and full of inconsistency...).
To import all modules on startup, I added the following. This produced some stderr output from PowerShell that couldn't be put to rest by ErrorAction or redirection without the wrapping...
If anyone could elaborate on the why, that would be very appreciated.
# import installed modules on launch
$PsMods = $(Get-InstalledModule);
$($PsMods.forEach({ Import-Module -Name $_.Name -ErrorAction Ignore })) *> /dev/null