I have a function where I call an application with the & operator. The application produces several lines of command-line output, downloads some files, and returns a string:
& "app.exe" | Out-Host
$var = ...
return $var
It seems that the output produced by app.exe only appears on the console after app.exe terminates. The user gets no real-time information about which file is currently downloading. Is there a way to continuously update the console while app.exe is running?
Many console applications buffer their output stream if they detect that it is redirected; this is actually standard behavior of the C runtime library. So the buffering is done by app.exe because of the redirection, not by Out-Host.
The solution is to not redirect the output of app.exe, even when the outer command is redirected. For that, you need to know the exact conditions under which PowerShell does not redirect the output stream of a console application but instead connects it directly to its own output stream, which is the console in an interactive PowerShell.exe session. The conditions are:
The command is the last item in the pipeline.
The command is piped to Out-Default.
So one solution is to wrap the command in a script block and pipe that script block to Out-Default:
& { & "app.exe" } | Out-Default
The other solution is to use the Start-Process cmdlet with the -NoNewWindow and -Wait parameters:
Start-Process "app.exe" -NoNewWindow -Wait
Here is the program I am using: Dell Command | Configure. The command-line command is as follows:
"C:\Program Files (x86)\Dell\Command Configure\X86_64>cctk.exe" --wakeonlan
In PowerShell you can navigate to the folder and run:
./cctk.exe --wakeonlan
I can pipe the above command into a variable and get the information I need, but this requires my shell to cd into the folder first and run it from there.
$test = ./cctk.exe --wakeonlan
This will give you output. However, when you use Start-Process, you get no output, as this is a command-line command: a cmd screen appears and runs the command. So I added the -NoNewWindow and -Wait flags. The output now appears on the screen, but I can't seem to capture it.
$test = Start-Process "C:\Program Files (x86)\Dell\Command Configure\X86_64\cctk.exe" -ArgumentList @("--wakeonlan") -NoNewWindow -Wait
At this point $test is empty. I tried using Out-File to capture the information as well, with no success. The command outputs to the screen but nowhere else.
I also tried the cmd method, where you pass the command in using the /C flag.
$test = Start-Process cmd -ArgumentList '/C start "C:\Program Files (x86)\Dell\Command Configure\X86_64\cctk.exe" "--wakeonlan"' -NoNewWindow -Wait
However, I have tried many variations of this command with no luck. Some say C:\Program is not recognized; some just open a command prompt. The above says --wakeonlan is an unknown command.
Any pointers would help greatly.
There are various ways to run this without the added complication of Start-Process.
Add to the path temporarily:
$env:path += ';C:\Program Files (x86)\Dell\Command Configure\X86_64;'
cctk
Call operator:
& 'C:\Program Files (x86)\Dell\Command Configure\X86_64\cctk'
Backquote all spaces and parentheses:
C:\Program` Files` `(x86`)\Dell\Command` Configure\X86_64\cctk
To elaborate on js2010's helpful answer:
In short: Because your executable path is quoted, direct invocation requires use of &, the call operator, for syntactic reasons - see this answer for details.
To synchronously execute console applications or batch files and capture their output, call them directly ($output = c:\path\to\some.exe ... or $output = & $exePath ...), do not use Start-Process (or the System.Diagnostics.Process API it is based on) - see this answer for more information.
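Applied to the cctk.exe call from the question, for example:
$output = & 'C:\Program Files (x86)\Dell\Command Configure\X86_64\cctk.exe' --wakeonlan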
If you do use Start-Process, which may be necessary in special situations, such as needing to run with a different user identity:
The only way to capture output is in text files, via the -RedirectStandardOutput / -RedirectStandardError parameters. Note that the character encoding of the output files is determined by the encoding stored in [Console]::OutputEncoding[1], which reflects the current console output code page, which defaults to the system's active legacy OEM code page.
By contrast, even with -NoNewWindow -Wait, directly capturing output with $output = ... does not work, because the launched process writes directly to the console, bypassing PowerShell's success output stream, which is the one variable assignments capture.
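For illustration, a minimal sketch of the file-redirection approach, using the cctk.exe call from the question (the temporary-file path is just an example):
$outFile = Join-Path $env:TEMP 'cctk-out.txt'
Start-Process -Wait -NoNewWindow `
  -FilePath 'C:\Program Files (x86)\Dell\Command Configure\X86_64\cctk.exe' `
  -ArgumentList '--wakeonlan' `
  -RedirectStandardOutput $outFile
$test = Get-Content $outFile  # the file's encoding reflects [Console]::OutputEncoding, as noted above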
[1] PowerShell uses the same encoding to decode output from external programs in direct invocations - see this answer for details.
We are moving our DevOps pipelines to a new cluster and while at it, we bumped into a weird behavior when calling kind with PowerShell. This applies to kubectl also.
The below should be taken only as a repro, not a real world application. In other words, I'm not looking to fix the below code but I am searching for an explanation why the error happens:
curl.exe -Lo kind-windows-amd64.exe https://kind.sigs.k8s.io/dl/v0.10.0/kind-windows-amd64
Move-Item .\kind-windows-amd64.exe c:\temp\kind.exe -Force
$job = Start-Job -ScriptBlock { iex "$args" } -ArgumentList c:\temp\kind.exe, get, clusters
$job | Receive-Job -Wait -AutoRemoveJob
Now, if I directly execute the c:\temp\kind.exe get clusters command in the PowerShell window, the error won't happen.
In other words, why does PowerShell (any version) consider the STDOUT of kind/kubectl as STDERR? And how can I prevent this from happening?
There must be an environmental factor to it as the same exact code runs fine in one system while on another it throws an error...
tl;dr
kind outputs its status messages to stderr, which in the context of PowerShell jobs surface via PowerShell's error output stream, which makes them print in red (and susceptible to $ErrorActionPreference = 'Stop' and -ErrorAction Stop).
Either:
Silence stderr: Use 2>$null as a general mechanism or, as David Kruk suggests, use a program-specific option to achieve the same effect, which in the case of kind is -q (--quiet).
Re-route stderr output through PowerShell's success output stream, merged with stdout output, using *>&1.
Caveat: The original output sequencing between stdout and stderr lines is not necessarily maintained on output.
Also, if you want to know whether the external program reported failure or success, you need to include the value of the automatic $LASTEXITCODE variable, which contains the most recently executed external program's process exit code, in the job's output (the exit code is the only reliable success/failure indicator - not the presence or absence of stderr output).
A simplified example with *>&1 (for Windows; on Unix-like platforms, replace cmd and /c with sh and -c, and separate the commands with ; instead of &):
$job = Start-Job -ScriptBlock {
param($exe)
& $exe $args *>&1
$LASTEXITCODE # Also output the process exit code.
} -ArgumentList cmd, /c, 'echo data1 & echo status >&2 & echo data2'
$job | Receive-Job -Wait -AutoRemoveJob
As many utilities do, kind apparently reports status messages via stderr.
Given that stdout is for data, it makes sense to use the only other available output stream, stderr, for anything that isn't data, so as to prevent pollution of the data output. The upshot is that stderr output doesn't necessarily indicate actual errors (success vs. failure should solely be inferred from an external program's process exit code).
PowerShell (for its own commands only) commendably has a more diverse system of output streams, documented in the conceptual about_Redirection help topic, allowing you to report status messages via Write-Verbose, for instance.
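For instance, a minimal illustrative sketch (the function name is made up) of reporting status via the verbose stream while keeping the success stream for data:
function Get-Data {
    [CmdletBinding()] param()
    Write-Verbose 'Downloading file 1 of 2...'  # status message, shown only with -Verbose
    'data1', 'data2'                            # data, emitted to the success output stream
}
$captured = Get-Data -Verbose  # status prints to the console; only the data is captured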
PowerShell maps an external program's output streams to its own streams as follows:
Stdout output:
Stdout output is mapped to PowerShell's success output stream (the stream with number 1, analogous to how stdout can be referred to in cmd.exe and POSIX-compatible shells), allowing it to be captured in a variable ($output = ...) or redirected to a file (> output.txt) or sent through the pipeline to another command.
Stderr output:
In local, foreground processing in a console (terminal), stderr is by default not mapped at all, and is passed through to the display (not colored in red) - unless a 2> redirection is used, which allows you to suppress stderr output (2>$null) or to send it to a file (2>errs.txt).
This is appropriate, because PowerShell cannot and should not assume that stderr output represents actual errors, whereas PowerShell's error stream is meant to be used for errors exclusively.
Unfortunately, as of PowerShell 7.2, in the context of PowerShell jobs (created with Start-Job or Start-ThreadJob) and remoting (e.g., in Invoke-Command -ComputerName ... calls), stderr output is mapped to PowerShell's error stream (the stream with number 2, analogous to how stderr can be referred to in cmd.exe and POSIX-compatible shells).
Caveat: This means that if $ErrorActionPreference = 'Stop' is in effect or -ErrorAction Stop is passed to Receive-Job or Invoke-Command, for instance, any stderr output from external programs will trigger a script-terminating error - even with stderr output comprising status messages only. Due to a bug in PowerShell 7.1 and below this can also happen in local, foreground invocation if a 2> redirection is used.
The upshot:
To silence stderr output, apply 2>$null - either at the source (inside the job or remote command), or on the receiving end.
To re-route stderr output through the success output stream / stdout, i.e. to merge all streams, use *>&1.
To prevent the stderr lines from printing in red (when originating from jobs or remote commands), apply this redirection at the source - which also guards against side effects from $ErrorActionPreference = 'Stop' / -ErrorAction Stop on the caller side.
Note: If you merge all streams with *>&1, the order in which stdout and stderr lines are output is not guaranteed to reflect the original output order, as of PowerShell 7.2.
If needed, PowerShell still allows you to later separate the output lines based on whether they originated from stdout or stderr - see this answer.
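Applied to the repro from the question, silencing stderr at the source might look like the following sketch (alternatively, kind's own -q option could be used):
$job = Start-Job -ScriptBlock {
  # Silence kind's stderr status messages at the source so they don't
  # surface via the job's error stream.
  & c:\temp\kind.exe get clusters 2>$null
  $LASTEXITCODE  # also output the exit code, as recommended above
}
$job | Receive-Job -Wait -AutoRemoveJob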
This is my code:
set-location [PATH]
$A = Start-Process -FilePath .\refresh.bat -Wait
set-location C:\
When executed in PowerShell, the system opens a Command Prompt window and executes the bat file without issue. The problem is that the window closes and I cannot see whether there was an error or whether it succeeded.
I want to keep the CMD window open.
I also tried at the end of the bat file:
:END
cmd /k
but no luck.
First, unless you specifically need to run the batch file in a new window, do not use Start-Process - use direct invocation instead, which is implicitly synchronous and allows you to capture or redirect output:
# Invoke the batch file synchronously (wait for it to exit)
# and capture its (standard) output in variable $A
# To print the batch file's output to the console instead, just use:
# .\refresh.bat
$A = .\refresh.bat
See this answer for more information.
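If you also want to detect whether the batch file itself reported a failure, you can additionally inspect the automatic $LASTEXITCODE variable after the direct invocation - a small sketch:
$A = .\refresh.bat
if ($LASTEXITCODE -ne 0) {
    Write-Warning "refresh.bat exited with code $LASTEXITCODE"
}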
Also note that Start-Process never allows you to capture the invoked program's output directly (you can only redirect it to files with -RedirectStandardOutput and -RedirectStandardError); your specific command captures nothing[1] in $A; adding -PassThru does return something, but it is not the program's output - rather, it is a process-information object (System.Diagnostics.Process).
If you do need to run the batch file in a new window and want to keep that window open:
Start-Process -Wait -FilePath cmd -ArgumentList '/k .\refresh.bat'
Relying on positional parameter binding, the above can be simplified to:
Start-Process -Wait cmd '/k .\refresh.bat'
[1] Strictly speaking, $A is assigned the [System.Management.Automation.Internal.AutomationNull]::Value singleton, which in most contexts behaves like $null.
Thank you mklement0, your post gave me the solution I wanted. This is how I solved it:
set-location [PATH]
$A = Start-Process -FilePath .\refresh.bat -Wait -NoNewWindow
set-location C:\
-NoNewWindow allowed me to run my batch file in the same PowerShell window and get the bat file's feedback. That way I see the errors, if any, and the success status if there are none.
Thanks!
Consider the powershell command:
cmd.exe "/c start notepad.exe"
Using powershell.exe (console) this command completes immediately after starting the cmd/notepad process and does not wait for notepad/cmd to exit. If I want it to wait for completion I have to pipe the output to Out-Null.
Using the powershell ISE this command blocks execution until notepad/cmd is closed.
Further, if I create a new process (of powershell.exe) in a script running in powershell.exe using System.Diagnostics.Process and redirect standard output, the script now blocks execution until notepad/cmd is closed. If I don't redirect output, it does not block execution.
But if I use C# to create a new process with the same settings/start info, and run the same script with redirected output, it doesn't block execution.
I'm at a loss here. Obviously it has something to do with the setup of the execution and output, and maybe cmd.exe. I'm hoping someone understands what's fundamentally happening behind the scenes differently in these cases. I know the ISE and cmd.exe aren't fully supported, but I don't know why the other 3 aren't consistent.
Why do the scripts not run with the same behavior (especially the powershell console ones) and how do I get them to?
Edit:
After some troubleshooting I was able to get all the powershell.exe versions to run the same (the ISE isn't of importance to me). The odd case - where cmd.exe "/c start notepad.exe" blocks execution - is when a new process is created in PowerShell to run powershell.exe using System.Diagnostics.Process. If output is redirected (the reason I was using System.Diagnostics.Process in the first place; Start-Process doesn't support redirecting except to a file), then calling WaitForExit() on the process object causes the blocking to occur.
Simply substituting WaitForExit() with Wait-Process (and the process ID) causes the PowerShell script running in the new powershell.exe to execute as expected and exit after running the command. Hopefully this information, combined with @mklement0's answer, will be sufficient for anyone else having similar problems.
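For reference, a rough sketch of the workaround described in the edit above - the powershell.exe command line is only a placeholder:
$psi = New-Object System.Diagnostics.ProcessStartInfo
$psi.FileName = 'powershell.exe'
$psi.Arguments = '-NoProfile -File C:\path\to\script.ps1'  # hypothetical script path
$psi.UseShellExecute = $false
$psi.RedirectStandardOutput = $true                        # the redirection that triggered the issue
$proc = [System.Diagnostics.Process]::Start($psi)
$stdout = $proc.StandardOutput.ReadToEnd()                 # read the redirected output first
Wait-Process -Id $proc.Id                                  # wait via Wait-Process instead of $proc.WaitForExit()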
To get predictable behavior in both the regular console and in the ISE:
Start a GUI application asynchronously (return to the prompt right away / continue executing the script):
notepad.exe
Invoking Notepad directly makes it run asynchronously, because it is a GUI-subsystem application.
If you still want to track the process and later check whether it is still running and what its exit code was on termination, use -PassThru, which makes Start-Process return a [System.Diagnostics.Process] instance:
$process = Start-Process -PassThru notepad.exe
$process.HasExited later tells you whether the process is still running.
Once it has exited, $process.ExitCode tells you the exit code (which may not tell you much in the case of a GUI application).
To wait synchronously (at some point):
Use Wait-Process -Id $process.Id to wait (indefinitely) for the process to terminate.
Add a -Timeout value in seconds to limit the waiting period; a non-terminating error is reported if the process doesn't terminate within the timeout period, causing $? to reflect $false.
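For example, building on the $process object captured above (the 30-second timeout is just an illustrative value):
Wait-Process -Id $process.Id -Timeout 30
if (-not $?) { Write-Warning 'Notepad is still running after 30 seconds.' }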
Start a GUI application synchronously (block until the application terminates):
Start-Process -Wait notepad.exe
-Wait tells Start-Process to wait until the process created terminates; this is the equivalent of cmd /c 'start /wait notepad.exe'.
There's a non-obvious shortcut to Start-Process -Wait: you can repurpose the Out-Null cmdlet, taking advantage of the fact that piping the invocation of a program to it makes Out-Null wait for the program's termination (a GUI application has no stdout or stderr output, so there's nothing for Out-Null to discard; the only effect is synchronous invocation):
notepad.exe | Out-Null
In fact, this approach has two advantages:
If arguments must be passed to the GUI application, they can be passed directly, as usual - rather than indirectly, via Start-Process's -ArgumentList parameter.
In the (rare) event that a GUI application reports a meaningful process exit code (e.g., msiexec.exe), the Out-Null approach (unlike Start-Process -Wait) causes it to be reflected in the automatic $LASTEXITCODE variable - as sketched after the note below.
Note: In rare cases, a GUI application may explicitly attach to the caller's console and write information to it; in order to surface that, pipe to | Write-Output instead (you'll still be able to evaluate $LASTEXITCODE) - see this answer.
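For instance, a hedged sketch of the msiexec.exe case mentioned above (the .msi path is hypothetical):
msiexec.exe /i C:\path\to\some.msi /qn | Out-Null  # /qn: quiet install, no UI
$LASTEXITCODE  # 0 means success; 3010 means success, but a reboot is required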
Note that for console-subsystem applications (e.g., findstr.exe), synchronous execution is the default; Start-Process is only needed for GUI applications (and for special situations, such as wanting to run an application in a new console window or with elevation (run as admin)).
To run a console application or shell command asynchronously (without opening a new console window), you have the following options:
[Preferred] Use Start-Job to kick off the command, and Receive-Job to receive its output / success status later.
$j = Start-Job { sleep 2; 'hi' }
To synchronously wait for this job to finish (and remove it automatically), use
Receive-Job -Wait -AutoRemoveJob $j
In PowerShell (Core) 6+:
You can use the simpler ... & syntax (as in POSIX-like Unix shells such as bash) in lieu of Start-Job; e.g.:
$j = & { sleep 2; 'hi!' } &
Better yet, you can use the lighter-weight, faster Start-ThreadJob cmdlet, which uses threads for concurrency, but otherwise seamlessly integrates with the other *-Job cmdlets (note that it has no short-hand syntax):
$j = Start-ThreadJob { sleep 2; 'hi' }
[Not advisable] You can use something like Start-Process -NoNewWindow powershell -Args ..., but it is ill-advised:
Passing a shell command to powershell via Start-Process requires intricate quoting based on obscure rules - see this answer of mine for background information.
Any stdout and stderr output produced by the application / shell command will by default arrive asynchronously in the current console, interspersed with what you're doing interactively.
While you can redirect these streams to files with -RedirectStandardOutput and -RedirectStandardError (and stdin input via -RedirectStandardInput), and you can use -PassThru to obtain a process-information object to determine the status and exit code of the process, Start-Job and Receive-Job are a simpler, more convenient alternative.
P.S.: I don't have an explanation for why cmd.exe "/c start notepad.exe" is blocking in the ISE but not in the regular console. Given the solutions above, however, getting to the bottom of this discrepancy may not be needed.
I have a GUI.exe application that sometimes prints messages back to the console it was started in. In git-bash, for example, I would kick it off like this: gui.exe &. This will place it into the background but allows it to write back to the shell. In PowerShell I can do the following:
& 'C:\pathTo\gui.exe' | Out-Default
This will write back to the console but will block the shell.
I can also do the following:
Start-Job {& 'C:\pathTo\gui.exe' | Out-Default}
But in this case I would have to call Receive-Job (maybe with -Keep) to extract the messages.
Is there a way to have the process both print to Out-Default and run in the background? I'm also not sure why I have to force the pipe to | Out-Default, as both cmd.exe and bash can print the messages. PSVersion is 5.0.10586.117.
Edit
As per the suggestion of @metix:
Start-Process -NoNewWindow 'C:\pathTo\gui.exe'
This does put it into the background but doesn't print back to the console.
If I add -RedirectStandardOutput C:\tmp\testout.txt, then this writes to testout.txt. Is there a way to RedirectStandardOutput to the console?
This should run the process in the background and also print to the console:
Start-Process -NoNewWindow 'C:\pathTo\gui.exe'