I have a GUI .exe application that sometimes prints messages back to the console it was started in. In git-bash, for example, I would kick it off like this: gui.exe &. This places it in the background but still allows it to write back to the shell. In PowerShell I can do the following:
& 'C:\pathTo\gui.exe' | Out-Default
This will write back to the console but will block the shell.
I can also do the following:
Start-Job { & 'C:\pathTo\gui.exe' | Out-Default }
But in this case I would have to call Receive-Job (maybe with -Keep) to extract the messages.
Is there a way to have the process both print to Out-Default and run in the background? I'm also not sure why I have to force the pipe to | Out-Default, since both cmd.exe and bash print the messages without it. My PowerShell version is 5.0.10586.117.
Edit
As per the suggestion of @metix:
Start-Process -NoNewWindow 'C:\pathTo\gui.exe'
This does put it into the background but doesn't print back to the console.
If I add -RedirectStandardOutput C:\tmp\testout.txt, then this writes to testout.txt. Is there a way to RedirectStandardOutput to the console?
This should run the process in the background and also print to the console:
Start-Process -NoNewWindow 'C:\pathTo\gui.exe'
Related
This is my code:
set-location [PATH]
$A = Start-Process -FilePath .\refresh.bat -Wait
set-location C:\
When executed in PowerShell, the system opens a Command Prompt window and executes the .bat file without issue. The problem is that the window closes, so I cannot see whether there was an error or it succeeded.
I want to keep the CMD window open.
I also tried adding the following at the end of the .bat file:
:END
cmd /k
but no luck.
First, unless you specifically need to run the batch file in a new window, do not use Start-Process - use direct invocation instead, which is implicitly synchronous and allows you to capture or redirect output:
# Invoke the batch file synchronously (wait for it to exit)
# and capture its (standard) output in variable $A
# To print the batch file's output to the console instead, just use:
# .\refresh.bat
$A = .\refresh.bat
See this answer for more information.
Also note that Start-Process never allows you to capture the invoked program's output directly (you can only redirect it to files with -RedirectStandardOutput and -RedirectStandardError); your specific command captures nothing[1] in $A; adding -PassThru does return something, but not the program's output: a process-information object (System.Diagnostics.Process).
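For illustration, a minimal sketch of the difference (the batch file name and output file path are placeholders):

# Direct invocation: $A receives the batch file's output.
$A = .\refresh.bat

# Start-Process: output can only be redirected to a file; -PassThru yields process info, not output.
$p = Start-Process .\refresh.bat -Wait -PassThru -RedirectStandardOutput C:\tmp\refresh-out.txt
$p.ExitCode   # exit code of the process, not its output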
If you do need to run the batch file in a new window and want to keep that window open:
Start-Process -Wait -FilePath cmd -ArgumentList '/k .\refresh.bat'
Relying on positional parameter binding, the above can be simplified to:
Start-Process -Wait cmd '/k .\refresh.bat'
[1] Strictly speaking, $A is assigned the [System.Management.Automation.Internal.AutomationNull]::Value singleton, which in most contexts behaves like $null.
Thank you mklement0, your post gave me the solution I wanted. This is how I solved it:
set-location [PATH]
$A = Start-Process -FilePath .\refresh.bat -Wait -NoNewWindow
set-location C:\
-NoNewWindow allowed me to run my batch file in the same PowerShell window and get the feedback of the .bat file. That way I see the errors, if any, and the success status if there are none.
Thanks!
Consider the powershell command:
cmd.exe "/c start notepad.exe"
Using powershell.exe (console) this command completes immediately after starting the cmd/notepad process and does not wait for notepad/cmd to exit. If I want it to wait for completion I have to pipe the output to Out-Null.
Using the powershell ISE this command blocks execution until notepad/cmd is closed.
Further, if I create a new process (of powershell.exe) in a script running in powershell.exe using System.Diagnostics.Process and redirect standard output, the script now blocks execution until notepad/cmd is closed. If I don't redirect output, it does not block execution.
But if I use C# to create a new process with the same settings/start info and run the same script with redirected output, it doesn't block execution.
I'm at a loss here. Obviously it has something to do with the setup of the execution and output, and maybe cmd.exe. I'm hoping someone understands what's fundamentally happening behind the scenes differently in these cases. I know the ISE and cmd.exe aren't fully supported, but I don't know why the other three cases aren't consistent.
Why do the scripts not run with the same behavior (especially the powershell console ones) and how do I get them to?
Edit:
After some troubleshooting I was able to get all the powershell.exe versions to run the same (the ISE isn't of importance to me). The odd case where cmd.exe "/c start notepad.exe" blocks execution is when a new process is created in PowerShell to run powershell.exe using System.Diagnostics.Process. If output is redirected (the reason I was using System.Diagnostics.Process in the first place; Start-Process doesn't support redirecting except to a file), then calling WaitForExit() on the process object causes the blocking to occur.
Simply substituting WaitForExit() with Wait-Process (and the process ID) causes the PowerShell script running in the new powershell.exe to execute as expected and exit after running the command. Hopefully this information, combined with @mklement0's answer, will be sufficient for anyone else having similar problems.
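For reference, a hedged sketch of that substitution (the script path and arguments are placeholders, not the original code):

$psi = New-Object System.Diagnostics.ProcessStartInfo
$psi.FileName               = 'powershell.exe'
$psi.Arguments              = '-File .\myscript.ps1'   # placeholder for the actual script
$psi.UseShellExecute        = $false
$psi.RedirectStandardOutput = $true
$proc = [System.Diagnostics.Process]::Start($psi)

# $proc.WaitForExit()        # this is the call that blocked until notepad/cmd was closed
Wait-Process -Id $proc.Id    # returns as soon as the powershell.exe process itself exits

$output = $proc.StandardOutput.ReadToEnd()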
To get predictable behavior in both the regular console and in the ISE:
Start a GUI application asynchronously (return to the prompt right away / continue executing the script):
notepad.exe
Invoking Notepad directly makes it run asynchronously, because it is a GUI-subsystem application.
If you still want to track the process and later check whether it is still running and what its exit code was on termination, use -PassThru, which makes Start-Process return a [System.Diagnostics.Process] instance:
$process = Start-Process -PassThru notepad.exe
$process.HasExited later tells you whether the process is still running.
Once it has exited, $process.ExitCode tells you the exit code (which may not tell you much in the case of a GUI application).
To wait synchronously (at some point):
Use Wait-Process $process.ID to wait (indefinitely) for the process to terminate.
Add a -Timeout value in seconds to limit the waiting period; a non-terminating error is reported if the process doesn't terminate within the timeout period, causing $? to reflect $false (see the sketch below).
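Putting those pieces together, a small sketch (using notepad.exe as a stand-in for the GUI application):

$process = Start-Process -PassThru notepad.exe
$process.HasExited                        # $false while Notepad is still running

Wait-Process -Id $process.Id -Timeout 10  # wait at most 10 seconds
if (-not $?) { 'Still running after 10 seconds.' }

$process.ExitCode                         # populated once the process has exited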
Start a GUI application synchronously (block until the application terminates):
Start-Process -Wait notepad.exe
-Wait tells Start-Process to wait until the process created terminates; this is the equivalent of cmd /c 'start /wait notepad.exe'.
There's a non-obvious shortcut to Start-Process -Wait: you can repurpose the Out-Null cmdlet, taking advantage of the fact that piping a program's invocation to it makes Out-Null wait for the program's termination (a GUI application has no stdout or stderr output, so there's nothing for Out-Null to discard; the only effect is synchronous invocation):
notepad.exe | Out-Null
In fact, this approach has two advantages:
If arguments must be passed to the GUI application, they can be passed directly, as usual - rather than indirectly, via Start-Process's -ArgumentList parameter.
In the (rare) event that a GUI application reports a meaningful process exit code (e.g., msiexec.exe), the Out-Null approach (unlike Start-Process -Wait) causes it to be reflected in the automatic $LASTEXITCODE variable, as sketched below.
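For instance, a hedged sketch with msiexec.exe (the .msi path and switches are placeholders):

# Arguments are passed directly, and the exit code surfaces in $LASTEXITCODE:
msiexec.exe /i C:\path\to\package.msi /qn | Out-Null
$LASTEXITCODE   # e.g., 0 on success, 1603 on a fatal installation error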
Note: In rare cases, a GUI application may explicitly attach to the caller's console and write information to it; in order to surface that, pipe to | Write-Output instead (you'll still be able to evaluate $LASTEXITCODE) - see this answer.
Note that for console-subsystem applications (e.g., findstr.exe), synchronous execution is the default; Start-Process is only needed for GUI applications (and for special situations, such as wanting to run an application in a new console window or with elevation (run as admin)).
To run a console application or shell command asynchronously (without opening a new console window), you have the following options:
[Preferred] Use Start-Job to kick off the command, and Receive-Job to receive its output / success status later.
$j = Start-Job { sleep 2; 'hi' }
To synchronously wait for this job to finish (and remove it automatically), use
Receive-Job -Wait -AutoRemoveJob $j
In PowerShell (Core) 6+:
You can use the simpler ... & syntax (as in POSIX-like Unix shells such as bash) in lieu of Start-Job; e.g.:
$j = & { sleep 2; 'hi!' } &
Better yet, you can use the lighter-weight, faster Start-ThreadJob cmdlet, which uses threads for concurrency, but otherwise seamlessly integrates with the other *-Job cmdlets (note that it has no short-hand syntax):
$j = Start-ThreadJob { sleep 2; 'hi' }
[Not advisable] You can use something like Start-Process -NoNewWindow powershell -Args ..., but it is ill-advised:
Passing a shell command to powershell via Start-Process requires intricate quoting based on obscure rules - see this answer of mine for background information.
Any stdout and stderr output produced by the application / shell command will by default arrive asynchronously in the current console, interspersed with what you're doing interactively.
While you can redirect these streams to files with -RedirectStandardOutput and -RedirectStandardError (and redirect stdin input via -RedirectStandardInput), and you can use -PassThru to obtain a process-information object to determine the status and exit code of the process, Start-Job and Receive-Job are a simpler, more convenient alternative (a sketch of the redirection approach follows below anyway).
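If you do go down that route anyway, here's a minimal sketch (the command and file paths are placeholders):

$p = Start-Process -NoNewWindow -PassThru powershell `
       -ArgumentList '-NoProfile -Command "Get-Date; Start-Sleep 5"' `
       -RedirectStandardOutput C:\tmp\out.txt -RedirectStandardError C:\tmp\err.txt

# Later: check on the process and inspect the redirected output.
$p.HasExited
Get-Content C:\tmp\out.txt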
P.S.: I don't have an explanation for why cmd.exe "/c start notepad.exe" is blocking in the ISE but not in the regular console. Given the solutions above, however, getting to the bottom of this discrepancy may not be needed.
In cmd I'm trying to run Method1 which is in a PowerShell script, script1.
Method1 is a method that takes a few hours, and I simply want to fire and forget.
The following is working for me:
c:\temp> powershell
PS c:\temp> . .\script1.ps1;Method1
When I do the lines above, everything works fine as long as I keep the cmd/PowerShell window open. Once I close the PowerShell window, it kills Method1.
So, from cmd, in one line, I want to somehow make Method1 run without depending on the PowerShell window - maybe by creating a new process; I am not really sure.
I've tried:
c:\temp> cmd /c powershell . .\script1.ps1;Method1
It runs for a few seconds, but when the cmd window closes, Method1 also terminates.
I also tried
c:\temp>cmd /c powershell -noexit "& { . .\script.ps1;Method1 }"
Again, once I do this, it works. However, a PowerShell window opens, and if I close it, Method1 terminates.
From your help, I've tried:
c:\temp> cmd /c powershell start-process cmd /c powershell . .\script1.ps1;Method1
But I get an exception:
Start-Process : A positional parameter cannot be found that accepts
argument 'powershell'.
But still, I am not able to make it work.
Alternatively if you want a pure PowerShell solution (note this needs to be running as Admin):
Invoke-Command LocalHost -ScriptBlock $script -InDisconnectedSession
The -InDisconnectedSession switch runs it in a separate PowerShell session that will not be terminated when you close the PowerShell window. You can also use Get-PSSession and pass the session to Enter-PSSession to interact with it during or after execution. Remember that in that state, closing the window will kill it, so you'll want to use Exit-PSSession to keep it alive.
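For example, a minimal sketch (assuming $script holds the long-running work, and that localhost remoting is enabled):

$script = { Start-Sleep -Seconds 3600; 'Method1 finished' }   # placeholder workload
Invoke-Command -ComputerName localhost -ScriptBlock $script -InDisconnectedSession

# Later - even from a new PowerShell window - reconnect and collect the output:
Get-PSSession -ComputerName localhost | Receive-PSSession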
There is, however, a problem - you can't do any remoting tasks, at least not easily. This incurs the wrath of the "double hop" problem: you remote to one computer (your own in this case), then to another, and for security PowerShell refuses to pass any credentials to the second remoting session, so it can't connect, even if you enter the credentials manually. If you do need to do remoting, I recommend sticking with launching a hidden PowerShell process.
You can use PowerShell jobs for that, so just:
Start-Job -FilePath somepath
And add a method call at the end of the script, or pass in a Scriptblock like:
Start-Job -ScriptBlock {. .\path_to_ps1; Method1}
Or perhaps use the hackish:
Start-Process cmd -WindowStyle Hidden -ArgumentList '/c powershell . .\script1.ps1;Method1'
Actually, you can just launch PowerShell directly, without cmd; I am not sure why I was using a cmd approach:
Start-Process powershell -WindowStyle Hidden -ArgumentList ". .\script1.ps1;Method1"
Easy answer, y'all: just paste a "start" command into your PS window (whether in a remote session or not) and it works fine:
Start C:\Windows\SysWOW64\WindowsPowerShell\v1.0\powershell.exe -file 'driveletter:\path\yourpowershellscript.ps1'
I have a function where I call an application with the & operator. The application produces several lines of command-line output, downloads some files, and returns a string:
& "app.exe" | Out-Host
$var = ...
return $var
It seems that the output produced by app.exe appears on the console only after app.exe terminates. The user does not get any real-time information about which file is downloading. Is there a way to continuously update the console while app.exe is running?
Many console applications buffer their output stream if it is known to be redirected; this is the standard behavior of the C runtime library. So the buffering is done by app.exe because of the redirection, not by Out-Host.
The solution is to not redirect the output of app.exe, even when the outer command is redirected. For that you need to know the exact conditions under which PowerShell does not redirect the output stream of a console application but links it directly to its own output stream, which is the console for an interactive powershell.exe session. The conditions are:
Command is last item in pipeline.
Command is piped to Out-Default.
So one solution is to wrap the command in a script block and pipe that script block to Out-Default:
& { & "app.exe" } | Out-Default
The other solution is to use the Start-Process cmdlet with the -NoNewWindow and -Wait parameters:
Start-Process "app.exe" -NoNewWindow -Wait
I have a set of test DLLs that I'm running from a PowerShell script that calls OpenCover.Console.exe via the Start-Process command.
I have the -returntargetcode flag set
After execution I check $lastexitcode and $?. They return 1 and True respectively all the time. Even when tests are failing.
Shouldn't $lastexitcode be 0 when all tests pass and 1 when they fail?
By default, Start-Process is asynchronous, so it doesn't wait for your process to exit. If you want your command-line tool to run synchronously, drop the Start-Process and invoke the command directly. That's the only way it will set $LASTEXITCODE. For example, causing CMD.exe to exit with a 2:
cmd /c exit 2
$LASTEXITCODE
You can make Start-Process synchronous by adding the -Wait flag, but it still won't set $LASTEXITCODE. To get the exit code from Start-Process, add -PassThru, which outputs a [System.Diagnostics.Process] object you can use to monitor the process and (eventually) access its ExitCode property. Here's an example that should help:
$p = Start-Process "cmd" -ArgumentList "/c exit 2" -PassThru -Wait
$p.ExitCode
Of course, the advantage of this approach is that you don't need to wait for the process; later, when it exits, you still have the information about its run in $p.
When executing a GUI application, dropping the Start-Process does not help, because PowerShell does not wait for GUI applications to complete when executing them directly this way, so $LASTEXITCODE is not set.
Piping the (nonexistent) GUI application output helps, as it makes PowerShell wait for the application to complete:
notepad.exe | Out-Null
echo $LASTEXITCODE
Note that "GUI application" does not necessarily mean that the application has windows. Whether an application is GUI or console is a flag in .exe file header.
Start-Process -PassThru -Wait, as suggested in the answer by @Burt_Harris, works in this case too; it's just a bit more complicated.