Powershell: capturing remote output streams in Invoke-Command + Invoke-Expression combination - powershell

As I didn't find a solution by searching the forum and spent some time finding out how to do it properly, I'm posting the issue here along with the working solution.
Scenario: in PowerShell, I need to remotely execute a script block stored in a variable and capture its output for further processing. No output should appear on the screen unless the script generates it on purpose. The script block can contain Write-Warning commands.

Note that the behaviors of interest apply to PowerShell commands generally, not just in the context of Invoke-Command and the - generally to be avoided - Invoke-Expression; in your case, the latter is only needed to work around a bug.[1]
Your own answer shows how to redirect a single, specific output stream to the success output stream; e.g., 3>&1 redirects (>&) the warning stream (3) to the success (output) stream (1).
The & indicates that the redirection target is a stream, as opposed to a file; for more information about PowerShell's output streams, see about_Redirection.
If you want to redirect all output streams to the success output stream, use redirection *>&1.
By redirecting all streams to the output stream, their combined output can be captured in a variable, redirected to a file, or sent through the pipeline, whereas by default only the success output stream (1) is captured.
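For example, all of a script block's streams can be merged and captured in a single variable (a minimal sketch; the verbose and debug streams would additionally need to be turned on, as discussed below):

```powershell
# Merge all streams into the success stream and capture the combined output.
$allOutput = & {
    Write-Output  'success output'
    Write-Warning 'a warning'
    Write-Error   'an error'
} *>&1

# $allOutput now contains one object per emitted record, irrespective of stream:
# a [string], a [WarningRecord], and an [ErrorRecord].
$allOutput.Count   # -> 3
```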
Separately, for some streams you can use the common -*Variable parameters to capture individual streams' output in variables, namely:
Stream 1 (success): -OutVariable
Stream 2 (error): -ErrorVariable
Stream 3 (warning): -WarningVariable
Stream 6 (information): -InformationVariable
Be sure to specify the target variable by name only, without the $ prefix; e.g., to capture warnings in variable $warnings, use
-WarningVariable warnings, such as in the following example:
Write-Warning hi -WarningVariable warnings; "warnings: $warnings"
Note that with -*Variable, the stream output is collected in the variable whether or not you silence or even ignore that stream otherwise, with the notable exception of -ErrorAction Ignore, in which case an -ErrorVariable variable is not populated (and the error is also not recorded in the automatic $Error variable that otherwise records all errors that occur in the session).
Generally, -{StreamName}Action SilentlyContinue seems to be equivalent to {StreamNumber}>$null.
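To illustrate the -ErrorAction Ignore exception described above (a minimal sketch; the nonexistent path is made up for illustration):

```powershell
# SilentlyContinue: the error is not displayed, but IS recorded.
Get-Item NoSuchFile -ErrorAction SilentlyContinue -ErrorVariable errs
$errs.Count    # -> 1

# Ignore: the error is neither displayed nor recorded;
# the -ErrorVariable variable stays empty.
Get-Item NoSuchFile -ErrorAction Ignore -ErrorVariable errs2
$errs2.Count   # -> 0
```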
Note the absence of the verbose (4) and the debug (5) streams above; you can only capture them indirectly, via 4>&1 and 5>&1 (or *>&1), which then requires you to extract the output of interest from the combined stream, via filtering by output-object type:
Important:
The verbose (4) and debug (5) streams are the only two streams that are silent at the source by default; that is, unless these streams are explicitly turned on via -Verbose / -Debug or their preference-variable equivalents, $VerbosePreference = 'Continue' / $DebugPreference = 'Continue', nothing is emitted and nothing can be captured.
The information stream (6) is silent only on output by default; that is, writing to the information stream (with Write-Information) always writes objects to the stream, but they're not displayed by default (they're only displayed with -InformationAction Continue / $InformationPreference = 'Continue').
Since v5, Write-Host now too writes to the information stream, though its output does print by default, but can be suppressed with 6>$null or -InformationAction Ignore (but not -InformationAction SilentlyContinue).
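A quick demonstration of these Write-Host behaviors (PSv5+):

```powershell
Write-Host 'shown by default'            # prints, via the information stream (6)
Write-Host 'suppressed' 6>$null          # silenced with a stream-6 redirection
$captured = Write-Host 'captured' 6>&1   # merged into stream 1 and captured
$captured.GetType().Name                 # -> InformationRecord
```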
# Sample function that produces success and verbose output.
# Note that -Verbose is required for the message to actually be emitted.
function foo { Write-Output 1; Write-Verbose -Verbose 4 }
# Get combined output, via 4>&1
$combinedOut = foo 4>&1
# Extract the verbose-stream output records (objects).
# For the debug output stream (5), the object type is
# [System.Management.Automation.DebugRecord]
$verboseOut = $combinedOut.Where({ $_ -is [System.Management.Automation.VerboseRecord] })
[1] Stream-capturing bug, as of PowerShell v7.0:
In a nutshell: In the context of remoting (such as Invoke-Command -Session here), background jobs, and so-called minishells (passing a script block to the PowerShell CLI to execute commands in a child process), only the success (1) and error (2) streams can be captured as expected; all others are unexpectedly passed through to the host (display) - see this GitHub issue.
Your command should - but currently doesn't - work as follows, which would obviate the need for Invoke-Expression:
# !! 3>&1 redirection is BROKEN as of PowerShell 7.0, if *remoting* is involved
# !! (parameters -Session or -ComputerName).
$RemoteOutput =
Invoke-Command -Session $Session $Commands 3>&1 -ErrorVariable RemoteError 2>$null
That is, in principle you should be able to pass a $Commands variable that contains a script block directly as the (implied) -ScriptBlock argument to Invoke-Command.

The script block is contained in the $Commands variable. $Session is an already established PowerShell remoting session.
The task is resolved by the below command:
$RemoteOutput =
Invoke-Command -Session $Session {
Invoke-Expression $Using:Commands 3>&1
} -ErrorVariable RemoteError 2>$null
After the command is executed all output of the script block is contained in $RemoteOutput. Errors generated during remote code execution are placed in $RemoteError.
Additional clarifications: Write-Warning in the Invoke-Expression code block writes to its own output stream, which is not captured by Invoke-Command. The only way to capture it in a variable is to redirect that stream to the success stream of Invoke-Expression by using 3>&1. Commands in the code block writing to the other output streams (verbose, debug) seem not to be captured even by adding 4>&1 and 5>&1 redirections to Invoke-Expression. However, stream #2 (errors) is properly captured by Invoke-Command in the way shown above.

Related

Get output from `New-AzResourceGroupDeployment` including `WhatIf`

I want to write a wrapper around New-AzResourceGroupDeployment in PowerShell. So let's assume the following script:
New-AzResourceGroupDeployment `
-Name 'test' `
-ResourceGroupName 'rg-test' `
-TemplateFile .\main.bicep `
-TemplateParameterFile .\parameters\parameters.json `
-Verbose `
-WhatIf
This will output something like this:
VERBOSE: Using Bicep v0.4.1008
...
What if: Performing the operation "Creating Deployment" on target "rg-test".
So the problem here is that I won't get any results from the WhatIf. I guess it's because WhatIf runs a different process in the background.
So is there a way to capture the output of the WhatIf?
The output produced by using the common -WhatIf parameter can unfortunately not be captured in-session - it prints directly to the console (host).
The only workaround is to use PowerShell's CLI (powershell.exe for Windows PowerShell, pwsh for PowerShell (Core) 7+)
$whatIfAndVerboseOutput = powershell.exe -c @'
New-AzResourceGroupDeployment `
-Name 'test' `
-ResourceGroupName 'rg-test' `
-TemplateFile .\main.bicep `
-TemplateParameterFile .\parameters\parameters.json `
-Verbose `
-WhatIf
'@
Note the use of a verbatim here-string (@'<newline>...<newline>'@) to avoid the need for escaping the embedded quote characters.
Caveats:
Such a call is expensive.
Since a new, independent session that runs in a child process is created, it knows nothing about the state of the caller (except that it inherits environment variables).
Assuming the argument data types are simple enough, you can use an expandable string ("...") to embed values from the caller in the command text.
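For instance, an expandable here-string (@"<newline>...<newline>"@) lets you embed a caller-side value in the command text (a hypothetical sketch; $rgName is made up for illustration):

```powershell
# Hypothetical caller-side value embedded into the command text via string expansion.
$rgName = 'rg-test'
$whatIfAndVerboseOutput = powershell.exe -c @"
New-AzResourceGroupDeployment -Name 'test' -ResourceGroupName '$rgName' -TemplateFile .\main.bicep -Verbose -WhatIf
"@
```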
Note: As of PowerShell 7.2, PowerShell's CLI, when a command string is passed to -c (-Command) - which is the only option when calling from outside PowerShell - not only sends direct-to-console output to stdout, but all of its output streams there as well - including error - which is problematic: see the bottom section of this answer for details; here, this means that the -Verbose output is captured as well.
Alternatively, for better - but not complete - type fidelity, pass a script block ({ ... }) to -c (-Command) instead of a string, to which you can pass values from the caller with -args. Note that using a script block is only an option when also calling from PowerShell, and that doing so will insert an empty line after each what-if output message.
Note: The script-block approach limits the output that is mapped to the caller's success stream (via stdout) to direct-to-console output (such as from -WhatIf) and the command's success-stream output. Output to any of the other streams is mapped to the corresponding streams of the caller, though you can use redirections inside the script block to merge streams into the success output stream.
In the case at hand, the -Verbose output would be passed through to the caller's verbose stream by default and would therefore not be captured in a variable capturing the command output. However, placing 4>&1 inside the script block would include it.
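Sketched with a generic command standing in for New-AzResourceGroupDeployment:

```powershell
# Script block passed to the CLI; the 4>&1 *inside* the block merges the
# verbose stream into the success stream, so it is captured too.
$output = powershell.exe -Command {
    & { Write-Verbose -Verbose 'status message'; 'data' } 4>&1
}
```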

Start-Process, Invoke-Command or?

Using the program got your back or GYB. I run the following command
Start-Process -FilePath 'C:\Gyb\gyb.exe' -ArgumentList @("--email <Email Address>", "--action backup", "--local-folder $GYBfolder", "--service-account", "--batch-size 4") -Wait
The issue is that when the process is done my script does not complete.
$GYBfolder = $GYBfolder.Replace('"', "")
$output = [PSCustomObject]@{
Name = $SourceGYB
Folder = $GYBfolder
}
$filename = "C:\reports\" + $SourceGYB.Split("@")[0] + "_Backup.csv"
$output | Export-Csv $filename -NoTypeInformation | Format-Table text-align=left -AutoSize
Return $filename
For some reason the script stops right before the return.
I am curious to know if I should be using a different command to run GYB?
Any thoughts on why the script does not process the return?
There's great information in the comments, but let me attempt a systematic overview:
To synchronously execute external console applications and capture their output, call them directly (C:\Gyb\gyb.exe ... or & 'C:\Gyb\gyb.exe' ...), do not use Start-Process - see this answer.
Only if gyb.exe were a GUI application would you need Start-Process -Wait in order to execute it synchronously.
A simple, but non-obvious shortcut is to pipe the invocation to another command, such as Out-Null, which also forces PowerShell to wait (e.g. gyb.exe | Out-Null) - see below.
When Start-Process is appropriate, the most robust way to pass all arguments is as a single string encoding all arguments, with appropriate embedded "..." quoting, as needed; this is unfortunate, but required as a workaround for a long-standing bug: see this answer.
Invoke-Command's primary purpose is to invoke commands remotely; while it can be used locally, there's rarely a good reason to do so, as &, the call operator is both more concise and more efficient - see this answer.
When you use an array to pass arguments to an external application, each element must contain just one argument, where parameter names and their values count as distinct arguments; e.g., you must use @('--action', 'backup', ...) rather than
@('--action backup', ...)
Therefore, use the following to run your command synchronously:
If gyb.exe is a console application:
# Note: Enclosing @(...) is optional
$argList = '--email', $emailAddress, '--action', 'backup', '--local-folder', $GYBfolder, '--service-account', '--batch-size', 4
# Note: Stdout and stderr output will print to the current console, unless captured.
& 'C:\Gyb\gyb.exe' $argList
If gyb.exe is a GUI application, which necessitates use of Start-Process -Wait (a here-string is used, because it makes embedded quoting easier):
# Note: A GUI application typically has no stdout or stderr output, and
# Start-Process never returns the application's *output*, though
# you can ask to have a *process object* returned with -PassThru.
Start-Process -Wait 'C:\Gyb\gyb.exe' @"
--email $emailAddress --action backup --local-folder "$GYBfolder" --service-account --batch-size 4
"@
The shortcut mentioned above - piping to another command in order to force waiting for a GUI application to exit - despite being obscure, has two advantages:
Normal argument-passing syntax can be used.
The automatic $LASTEXITCODE variable is set to the external program's process exit code, which does not happen with Start-Process. While GUI applications rarely report meaningful exit codes, some do, notably msiexec.
# Pipe to | Out-Null to force waiting (argument list shortened).
# $LASTEXITCODE will reflect gyb.exe's exit code.
# Note: In the rare event that the target GUI application explicitly
# attaches to the caller's console and produces output there,
# pipe to `Write-Output` instead, and possibly apply 2>&1 to
# the application call so as to also capture std*err* output.
& 'C:\Gyb\gyb.exe' --email $emailAddress --action backup | Out-Null
Note: If the above unexpectedly does not run synchronously, the implication is that gyb.exe itself launches another, asynchronous operation. There is no generic solution for that, and an application-specific one would require you to know the internals of the application and would be nontrivial.
A note re argument passing with direct / &-based invocation:
Passing an array as-is to an external program essentially performs splatting implicitly, without the need to use @argList.[1] That is, it passes each array element as its own argument.
By contrast, if you were to pass $argList to a PowerShell command, it would be passed as a single, array-valued argument, so @argList would indeed be necessary in order to pass the elements as separate, positional arguments. However, the more typical form of splatting used with PowerShell commands is to use a hashtable, which allows named arguments to be passed (parameter name-value pairs); e.g., to pass a value to a PowerShell command's
-LiteralPath parameter:
$argHash = @{ LiteralPath = $somePath; ... }; Set-Content @argHash
[1] $argList and @argList are largely identical in this context, but, strangely, @argList honors use of --%, the stop-parsing symbol, even though it only makes sense in a literally specified argument list.

Why does PowerShell interpret kind/kubectl STDOUT as STDERR and How to Prevent it?

We are moving our DevOps pipelines to a new cluster and while at it, we bumped into a weird behavior when calling kind with PowerShell. This applies to kubectl also.
The below should be taken only as a repro, not a real world application. In other words, I'm not looking to fix the below code but I am searching for an explanation why the error happens:
curl.exe -Lo kind-windows-amd64.exe https://kind.sigs.k8s.io/dl/v0.10.0/kind-windows-amd64
Move-Item .\kind-windows-amd64.exe c:\temp\kind.exe -Force
$job = Start-Job -ScriptBlock { iex "$args" } -ArgumentList c:\temp\kind.exe, get, clusters
$job | Receive-Job -Wait -AutoRemoveJob
Now, if I directly execute the c:\temp\kind.exe get clusters command in the PowerShell window, the error won't happen:
In other words, why does PowerShell (any version) consider the STDOUT of kind/kubectl as STDERR? And how can I prevent this from happening?
There must be an environmental factor to it as the same exact code runs fine in one system while on another it throws an error...
tl;dr
kind outputs its status messages to stderr, which in the context of PowerShell jobs surface via PowerShell's error output stream, which makes them print in red (and makes them susceptible to $ErrorActionPreference = 'Stop' and -ErrorAction Stop).
Either:
Silence stderr: Use 2>$null as a general mechanism or, as David Kruk suggests, use a program-specific option to achieve the same effect, which in the case of kind is -q (--quiet).
Re-route stderr output through PowerShell's success output stream, merged with stdout output, using *>&1.
Caveat: The original output sequencing between stdout and stderr lines is not necessarily maintained on output.
Also, if you want to know whether the external program reported failure or success, you need to include the value of the automatic $LASTEXITCODE variable, which contains the most recently executed external program's process exit code, in the job's output (the exit code is the only reliable success/failure indicator - not the presence or absence of stderr output).
A simplified example with *>&1 (for Windows; on Unix-like platforms, replace cmd and /c with sh and -c):
$job = Start-Job -ScriptBlock {
param($exe)
& $exe $args *>&1
$LASTEXITCODE # Also output the process exit code.
} -ArgumentList cmd, /c, 'echo data1 & echo status >&2 & echo data2'
$job | Receive-Job -Wait -AutoRemoveJob
As many utilities do, kind apparently reports status messages via stderr.
Given that stdout is for data, it makes sense to use the only other available output stream, stderr, for anything that isn't data, so as to prevent pollution of the data output. The upshot is that stderr output doesn't necessarily indicate actual errors (success vs. failure should solely be inferred from an external program's process exit code).
PowerShell (for its own commands only) commendably has a more diverse system of output streams, documented in the conceptual about_Redirection help topic, allowing you to report status messages via Write-Verbose, for instance.
PowerShell maps an external program's output streams to its own streams as follows:
Stdout output:
Stdout output is mapped to PowerShell's success output stream (the stream with number 1, analogous to how stdout can be referred to in cmd.exe and POSIX-compatible shells), allowing it to be captured in a variable ($output = ...) or redirected to a file (> output.txt) or sent through the pipeline to another command.
Stderr output:
In local, foreground processing in a console (terminal), stderr is by default not mapped at all, and is passed through to the display (not colored in red) - unless a 2> redirection is used, which allows you to suppress stderr output (2>$null) or to send it to a file (2>errs.txt)
This is appropriate, because PowerShell cannot and should not assume that stderr output represents actual errors, whereas PowerShell's error stream is meant to be used for errors exclusively.
Unfortunately, as of PowerShell 7.2, in the context of PowerShell jobs (created with Start-Job or Start-ThreadJob) and remoting (e.g., in Invoke-Command -ComputerName ... calls), stderr output is mapped to PowerShell's error stream (the stream with number 2, analogous to how stderr can be referred to in cmd.exe and POSIX-compatible shells).
Caveat: This means that if $ErrorActionPreference = 'Stop' is in effect or -ErrorAction Stop is passed to Receive-Job or Invoke-Command, for instance, any stderr output from external programs will trigger a script-terminating error - even with stderr output comprising status messages only. Due to a bug in PowerShell 7.1 and below this can also happen in local, foreground invocation if a 2> redirection is used.
The upshot:
To silence stderr output, apply 2>$null - either at the source (inside the job or remote command), or on the receiving end.
To route all streams, including stderr output, via the success output stream / stdout, i.e. to merge all streams, use *>&1.
To prevent the stderr lines from printing in red (when originating from jobs or remote commands), apply this redirection at the source - which also guards against side effects from $ErrorActionPreference = 'Stop' / -ErrorAction Stop on the caller side.
Note: If you merge all streams with *>&1, the order in which stdout and stderr lines are output is not guaranteed to reflect the original output order, as of PowerShell 7.2.
If needed, PowerShell still allows you to later separate the output lines based on whether they originated from stdout or stderr - see this answer.
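The separation technique referenced above relies on stderr lines surfacing as [ErrorRecord] instances after merging (a sketch, using cmd on Windows):

```powershell
# Merge stderr into the success stream, then partition by object type:
# stderr lines surface as ErrorRecord instances, stdout lines as strings.
$merged = cmd /c 'echo from-stdout & echo from-stderr >&2' 2>&1
$stderrLines = $merged.Where({ $_ -is [System.Management.Automation.ErrorRecord] })
$stdoutLines = $merged.Where({ $_ -isnot [System.Management.Automation.ErrorRecord] })
```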

Error handling of command prompt commands in Powershell

My goal is to check, disable and remove Scheduled Tasks on numerous Windows servers using Powershell.
Some of the servers are Windows 2008 R2, so Get-ScheduledTask is out of the question. I have to use schtasks.
Here is what I have thus far
$servers = (Get-ADComputer -Server DomainController -Filter 'OperatingSystem -like "*Server*"').DNSHostname
$servers |
ForEach-Object {
if (Test-Connection -Count 1 -Quiet -ComputerName $_) {
Write-Output "$($_) exists, checking for Scheduled Task"
Invoke-Command -ComputerName $_ {
If((schtasks /query /TN 'SOMETASK')) {
Write-Output "Processing removal of scheduled task`n"
schtasks /change /TN 'SOMETASK' /DISABLE
schtasks /delete /TN 'SOMETASK' /F
}
else {
Write-Output "Scheduled Task does not exist`n"
}
}
}
}
This works fine for when SOMETASK exists but when it doesn't, Powershell spits an error, like this:
ERROR: The system cannot find the file specified.
+ CategoryInfo : NotSpecified: (ERROR: The syst...file specified.:String) [], RemoteException
+ FullyQualifiedErrorId : NativeCommandError
+ PSComputerName : SERVER1
NotSpecified: (:) [], RemoteException
Scheduled Task does not exist
I can circumvent this behavior by setting $ErrorActionPreference to "SilentlyContinue" but this suppresses other errors I may be interested in. I also tried Try, Catch but that still generates the error. I don't think I can add -ErrorHandling argument to an IF statement. Can anyone please lend a helping hand?
Thank you,
tl;dr:
Use 2>$null to suppress the stderr output from a call to an external program (such as schtasks.exe).
To work around a bug present up to at least PowerShell [Core] 7.0 (see below), make sure that $ErrorActionPreference is not set to 'Stop'.
# Execute with stderr silenced.
# Rely on the presence of stdout output in the success case only
# to make the conditional true.
if (schtasks /query /TN 'SOMETASK' 2>$null) { # success, task exists
"Processing removal of scheduled task`n"
# ...
}
For background information and more general use cases, read on.
Given how the line from the external program's stderr stream manifests as shown in your question,
it sounds like you're running your code in the PowerShell ISE, which I suggest moving away from: The PowerShell ISE is obsolescent and should be avoided going forward (bottom section of the linked answer).
That the ISE surfaces stderr lines via PowerShell's error stream by default is especially problematic - see this GitHub issue.
The regular console doesn't do that, fortunately - it passes stderr lines through to the host (console), and prints them normally (not in red), which is the right thing to do, given that you cannot generally assume that all stderr output represents errors (the stream's name notwithstanding).
With well-behaved external programs, you should only ever derive success vs. failure from their process exit code (as reflected in the automatic $LASTEXITCODE variable[1]), not from the presence of stderr output: exit code 0 indicates success, any nonzero exit code (typically) indicates failure.
As for your specific case:
In the regular console, the value of the $ErrorActionPreference preference variable does not apply to external programs such as schtasks.exe, except in the form of a bug [fixed in PowerShell 7.2+] when you also use a 2> redirection - see GitHub issue #4002; as of PowerShell 7.1.0-preview.6, the corrected behavior is available as the experimental feature PSNotApplyErrorActionToStderr.
Since your schtasks /query /TN 'SOMETASK' command functions as a test, you can do the following:
# Execute with all streams silenced (both stdout and stderr, in this case).
# schtasks.exe will indicate the non-existence of the specified task
# with exit code 1
schtasks /query /TN 'SOMETASK' *>$null
if ($LASTEXITCODE -eq 0) { # success, task exists
"Processing removal of scheduled task`n"
# ...
}
# You can also squeeze it into a single conditional, using
# $(...), the subexpression operator.
if (0 -eq $(schtasks /query /TN 'SOMETASK' *>$null; $LASTEXITCODE)) { # success, task exists
"Processing removal of scheduled task`n"
# ...
}
In your specific case, a more concise solution is possible, which relies on your schtasks command (a) producing stdout output in the case of success (if the task exists) and (b) only doing so in the success case:
# Execute with stderr silenced.
# Rely on the presence of stdout output in the success case only
# to make the conditional true.
if (schtasks /query /TN 'SOMETASK' 2>$null) { # success, task exists
"Processing removal of scheduled task`n"
# ...
}
If schtasks.exe produces stdout output (which maps to PowerShell's success output stream, 1), PowerShell's implicit to-Boolean conversion will consider the conditional $true (see the bottom section of this answer for an overview of PowerShell's to-Boolean conversion rules).
Note that a conditional only ever acts on the success output stream's output (1), other streams are passed through, such as the stderr output (2) would be in this case (as you've experienced).
2>$null silences stderr output, by redirecting it to the null device.
1 and 2 are the numbers of PowerShell's success output / error streams, respectively; in the case of external programs, they refer to their stdout (standard output) and stderr (standard error) streams, respectively - see about_Redirection.
You can also capture stderr output with a 2> redirection, if you want to report it later (or need to examine it specifically for an ill-behaved program that doesn't use exit codes properly).
2> stderr.txt sends the stderr lines to file stderr.txt; unfortunately, there is currently no way to capture stderr in a variable - see GitHub issue #4332, which proposes syntax 2>&variableName for that.
As implied by the aforementioned bug, you must ensure that $ErrorActionPreference isn't set to 'Stop', because the 2> will then mistakenly trigger a script-terminating error.
Aside from the aforementioned bug, using 2> currently has another unexpected side effect [fixed in PowerShell 7.2+]: The stderr lines are unexpectedly also added to the automatic $Error collection, as if they're errors (which they cannot assumed to be).
The root cause of both issues is that stderr lines are unexpectedly routed via PowerShell's error stream, even though there is no good reason to do so - see GitHub issue #11133.
[1] Note that the automatic $? variable that indicates success vs. failure as a Boolean ($true / $false) is also set, but not reliably so: since stderr output is currently (v7.0) unexpectedly routed via PowerShell's error stream if a 2> redirection is used, the presence of any stderr output invariably sets $? to $false, even if the external program reports overall success via $LASTEXITCODE reporting 0. Therefore, the only reliable way to test for success is $LASTEXITCODE -eq 0, not $?.
Personally I prefer to use the Scheduler ComObject to manage scheduled tasks. You can connect to other servers with it, and search them simply enough to manage their tasks.
$Scheduler = New-Object -ComObject Schedule.Service
$servers = (Get-ADComputer -Server DomainController -Filter 'OperatingSystem -like "*Server*"').DNSHostname
$servers |
ForEach-Object {
if (Test-Connection -Count 1 -Quiet -ComputerName $_) {
Write-Output "$($_) exists, checking for Scheduled Task"
$Scheduler.Connect($_)
$RootFolder = $Scheduler.GetFolder("\")
$TargetTask = $RootFolder.GetTask('SOMETASK')
# If the task wasn't found continue to the next server
If(!$TargetTask){
Write-Output "Scheduled Task does not exist`n"
Continue
}
Write-Output "Processing removal of scheduled task`n"
$TargetTask.Enabled = $false
$RootFolder.DeleteTask('SOMETASK')
}
}
This appears like you've way over-complicated execution of this effort.
Why disable and remove vs just remove, as that seems a bit redundant?
All scheduled tasks are nothing but XML files and registry entries, which you can just delete if you don't want the task any longer. Thus, you can use Get-ChildItem.
# File system:
(Get-ChildItem -Path "$env:windir\System32\Tasks").FullName
# Results
<#
...
C:\Windows\System32\Tasks\Microsoft
...
C:\Windows\System32\Tasks\MicrosoftEdgeUpdateTaskMachineCore
...
#>
# Registry:
Get-ChildItem -Path 'HKLM:\Software\Microsoft\Windows NT\CurrentVersion\Schedule\Taskcache\Tasks'
# Results
<#
Name Property
---- --------
{01C5B377-A7EB-4FF3-9C6C-86852 Path : \Microsoft\Windows\Management\Provisioning\Logon
...
#>
Get-ChildItem -Path 'HKLM:\Software\Microsoft\Windows NT\CurrentVersion\Schedule\Taskcache\Tree'
# Results
<#
Name Property
---- --------
Adobe Acrobat Update Task SD : {1...
#>
Just select your task by name and delete the file and the regkeys using the normal filesystem cmdlets.
So you just want to hide the error message from schtasks? One way is to redirect standard error or "2" to $null. This is an example anyone can run as admin. The if statement only works because there's no output to standard out when there's an error. It looks like invoke-command generates a remote exception when something comes over standard error, but it doesn't stop the commands that follow. I don't see a way to try/catch it.
invoke-command localhost { if (schtasks /query /tn 'foo' 2>$null) {
'yes' } ; 'hi'}
hi

Standard output in powershell debug console

I have a long script in powershell which calls an even longer function located in a separate .ps1 file. The function runs some svn update commands and some compiled executables which produce standard output. When I run these directly from the script the output gets redirected to the debug console in Powershell ISE. When I run them through the function I can tell they are running but I get no standard output in the console.
How do I redirect standard output from my function back to the powershell debug console where I can see it?
Thanks.
EDIT
I am importing the function as follows:
. "D:\common.ps1"
and calling it as follows:
$MedianValue = Run-Comparison $LastRev $FirstRev $ScriptPath $SolutionPath $DevenvPath $TestPath $refFile $SVNPAth
Within the function, one of the calls is as follows
svn update $FirstRev
Start-Process ExecutableName Argument
It is for the above two statements that I cannot see the standard output for when I call their containing function.
If you're capturing a script's / function's output and that script / function contains a mix of PowerShell-native output statements and external-program calls producing stdout output, both types of output are sent to PowerShell's regular success output streams.
Therefore, unless you redirect at the source, you cannot selectively pass stdout output from external programs through to the host (e.g., a regular console window or the console pane in the ISE), because you won't be able to tell which output objects (lines) come from where.
To redirect at the source - if you have control over the callee's source code - you have several options, the simplest being Write-Host, as the following example demonstrates:
function Run-Comparison {
'PS success output'
cmd /c 'echo external stdout output' | Write-Host
}
# Captures 'PS success output', but passes the cmd.exe output through to the console.
$MedianValue = Run-Comparison
The above selectively sends the cmd.exe command's output to the host.
In PSv5+, where Write-Host writes to the newly introduced information stream (number 6), you can optionally suppress the to-host output with 6>$null on invocation.
To reverse the logic, use Write-Information instead of Write-Host (PSv5+ only), which is silent by default and allows you to turn on output with $InformationPreference = 'Continue'.
If you want silent-by-default behavior in PSv4-, use Write-Verbose or Write-Debug, but note that such output will be a different color, with each line having a prefix (VERBOSE: and DEBUG:, respectively).
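The Write-Information variant of the earlier example might look like this (a sketch; PSv5+ only):

```powershell
function Run-Comparison {
    'PS success output'
    # Silent by default; the caller opts in to seeing this output.
    cmd /c 'echo external stdout output' | Write-Information
}

$InformationPreference = 'Continue'  # turn the to-host display on
$MedianValue = Run-Comparison        # captures only 'PS success output'
```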