Consider the powershell command:
cmd.exe "/c start notepad.exe"
Using powershell.exe (console) this command completes immediately after starting the cmd/notepad process and does not wait for notepad/cmd to exit. If I want it to wait for completion I have to pipe the output to Out-Null.
Using the powershell ISE this command blocks execution until notepad/cmd is closed.
Further, if I create a new powershell.exe process from a script running in powershell.exe using System.Diagnostics.Process and redirect standard output, the script now blocks execution until notepad/cmd is closed. If I don't redirect output, it does not block execution.
But if I use c# to create a new process with the same settings/start info, and run the same script with redirected output it doesn't block execution.
I'm at a loss here. Obviously it has something to do with the setup of the execution and output and maybe "cmd.exe". I'm hoping someone understands what's fundamentally happening behind the scenes differently in these cases. I know the ISE and cmd.exe aren't fully supported, but I don't know why the other three cases aren't consistent.
Why do the scripts not run with the same behavior (especially the powershell console ones) and how do I get them to?
Edit:
After some troubleshooting I was able to get all the powershell.exe versions to run the same (the ISE isn't important to me). The oddball case where cmd.exe "/c start notepad.exe" blocks execution is when a new process is created in PowerShell to run powershell.exe using System.Diagnostics.Process. If output is redirected (the reason I was using System.Diagnostics.Process in the first place; Start-Process doesn't support redirecting except to a file), then calling WaitForExit() on the process object causes the blocking to occur.
Simply substituting WaitForExit() with Wait-Process (and the process ID) causes the PowerShell script running in the new powershell.exe to execute as expected and exit after running the command. Hopefully this information, combined with mklement0's answer, will be sufficient for anyone else having similar problems.
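For reference, here is a minimal sketch of the working pattern described above (the script path is a placeholder; adjust as needed):
$psi = [System.Diagnostics.ProcessStartInfo]::new()
$psi.FileName  = 'powershell.exe'
$psi.Arguments = '-File C:\path\to\script.ps1'   # placeholder script path
$psi.UseShellExecute = $false
$psi.RedirectStandardOutput = $true
$proc = [System.Diagnostics.Process]::Start($psi)
$output = $proc.StandardOutput.ReadToEnd()
# Using Wait-Process instead of $proc.WaitForExit() avoids the blocking described above.
Wait-Process -Id $proc.Id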
To get predictable behavior in both the regular console and in the ISE:
Start a GUI application asynchronously (return to the prompt right away / continue executing the script):
notepad.exe
Invoking Notepad directly makes it run asynchronously, because it is a GUI-subsystem application.
If you still want to track the process and later check whether it is still running and what its exit code was on termination, use -PassThru, which makes Start-Process return a [System.Diagnostics.Process] instance:
$process = Start-Process -PassThru notepad.exe
$process.HasExited later tells you whether the process is still running.
Once it has exited, $process.ExitCode tells you the exit code (which may not tell you much in the case of a GUI application).
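Putting those pieces together, a minimal sketch (using Notepad as the example):
$process = Start-Process -PassThru notepad.exe
# ... do other work, then check on the process later:
if ($process.HasExited) {
  "Notepad exited with code $($process.ExitCode)."
} else {
  "Notepad (PID $($process.Id)) is still running."
}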
To wait synchronously (at some point):
Use Wait-Process $process.ID to wait (indefinitely) for the process to terminate.
Add a -Timeout value in seconds to limit the waiting period; a non-terminating error is reported if the process doesn't terminate within the timeout period, causing $? to reflect $False.
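For example, a minimal sketch that waits at most 30 seconds:
Wait-Process -Id $process.Id -Timeout 30
if (-not $?) { "Process $($process.Id) did not exit within 30 seconds." }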
Start a GUI application synchronously (block until the application terminates):
Start-Process -Wait notepad.exe
-Wait tells Start-Process to wait until the process created terminates; this is the equivalent of cmd /c 'start /wait notepad.exe'.
There's a non-obvious shortcut to Start-Process -Wait: you can repurpose the Out-Null cmdlet, taking advantage of the fact that piping the invocation of a program to it makes PowerShell wait for the program's termination (a GUI application has no stdout or stderr output, so there's nothing for Out-Null to discard; the only effect is synchronous invocation):
notepad.exe | Out-Null
In fact, this approach has two advantages:
If arguments must be passed to the GUI application, they can be passed directly, as usual - rather than indirectly, via Start-Process's -ArgumentList parameter.
In the (rare) event that a GUI application reports a meaningful process exit code (e.g., msiexec.exe), the Out-Null approach (unlike Start-Process -Wait) causes it to be reflected in the automatic $LASTEXITCODE variable - see the sketch below.
Note: In rare cases, a GUI application may explicitly attach to the caller's console and write information to it; in order to surface that, pipe to | Write-Output instead (you'll still be able to evaluate $LASTEXITCODE) - see this answer.
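For instance, a minimal sketch combining both advantages (the .msi path is a placeholder):
msiexec.exe /i C:\installers\example.msi /qn | Out-Null   # waits, because of the pipe to Out-Null
"msiexec exited with code $LASTEXITCODE"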
Note that for console-subsystem applications (e.g., findstr.exe), synchronous execution is the default; Start-Process is only needed for GUI applications (and for special situations, such as wanting to run an application in a new console window or with elevation (run as admin)).
To run a console application or shell command asynchronously (without opening a new console window), you have the following options:
[Preferred] Use Start-Job to kick off the command, and Receive-Job to receive its output / success status later.
$j = Start-Job { sleep 2; 'hi' }
To synchronously wait for this job to finish (and remove it automatically), use
Receive-Job -Wait -AutoRemoveJob $j
In PowerShell (Core) 6+:
You can use the simpler ... & syntax (as in POSIX-like Unix shells such as bash) in lieu of Start-Job; e.g.:
$j = & { sleep 2; 'hi!' } &
Better yet, you can use the lighter-weight, faster Start-ThreadJob cmdlet, which uses threads for concurrency, but otherwise seamlessly integrates with the other *-Job cmdlets (note that it has no short-hand syntax):
$j = Start-ThreadJob { sleep 2; 'hi' }
[Not advisable] You can use something like Start-Process -NoNewWindow powershell -Args ..., but it is ill-advised:
Passing a shell command to powershell via Start-Process requires intricate quoting based on obscure rules - see this answer of mine for background information.
Any stdout and stderr output produced by the application / shell command will by default arrive asynchronously in the current console, interspersed with what you're doing interactively.
While you can redirect these streams to files with -RedirectStandardOutput and -RedirectStandardError (and stdin input via -RedirectStandardInput), and you can use -PassThru to obtain a process-information object to determine the status and exit code of the process, Start-Job and Receive-Job are a simpler, more convenient alternative.
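For completeness, a minimal sketch of that ill-advised approach (the output file names are placeholders):
$p = Start-Process -PassThru -NoNewWindow powershell `
       -ArgumentList '-NoProfile -Command "Get-Date; Start-Sleep 2"' `
       -RedirectStandardOutput out.txt -RedirectStandardError err.txt
# ... do other work, then check on the process and its redirected output:
if ($p.HasExited) { Get-Content out.txt }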
P.S.: I don't have an explanation for why cmd.exe "/c start notepad.exe" is blocking in the ISE but not in the regular console. Given the solutions above, however, getting to the bottom of this discrepancy may not be needed.
Related
I'm trying to install software using Start-Process in PowerShell, and I would like the script to wait until each command has completed before proceeding to the next one. I'm not experienced; I tried the script below, but it did not work.
Start-Process -Wait -FilePath "C:\Temp\Latitude_5X10_Precision_3550_1.15.0.exe" -ArgumentList "/S" -PassThru
Your Start-Process call is correct, but -Wait invariably only tracks the lifetime of the directly launched process (C:\Temp\Latitude_5X10_Precision_3550_1.15.0.exe in your case).
That is, you're out of luck if the target process itself spawns yet another process in order to perform its task and then returns before that child process has terminated.
Additional work is then needed, if even feasible:
If you know the name of the child process, you can try to find and track it via Get-Process.
Alternatively, if you know of an indirect sign that the task has completed, such as the existence of a directory or a registry entry, look for that (both ideas are sketched below).
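A hedged sketch of both ideas; the child-process name and the flag-file path are hypothetical placeholders:
Start-Process -Wait -FilePath "C:\Temp\Latitude_5X10_Precision_3550_1.15.0.exe" -ArgumentList "/S"
# Option 1: track a known child process by name (it may take a moment to appear):
Get-Process -Name ChildInstaller -ErrorAction SilentlyContinue | Wait-Process
# Option 2: poll for an indirect sign that the installation has completed:
while (-not (Test-Path 'C:\Program Files\SomeApp\installed.flag')) { Start-Sleep -Seconds 2 }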
As an aside: console(-subsystem) applications can be invoked directly for synchronous (blocking) execution (e.g., foo.exe bar baz or & $fooExePath bar baz), which is the preferred method, because it connects the application's output streams to PowerShell's streams.
We have received 20 JMeter test plans, each testing one endpoint, which we need to run. On some tests we need to pass parameters and on others we don't.
My idea was to create a PowerShell script that loops through the directories and runs a test, waits until it finishes, and then runs the next test. When we develop a new endpoint, we just create a new test plan and save it in the appropriate folder, and the PowerShell script will include it the next time we loop through the tests.
I need the tests to finish before starting the next plan, so I'm looking at something like:
Write-Output "Running Test 1"
$proc = Start-Process -FilePath "C:\JmeterLoadTests\apache-jmeter-5.2.1\bin\jmeter" -ArgumentList "-n -t C:\JmeterLoadTests\test\enpointsType1\test-1-1.jmx -Jduration=10"
$proc.WaitForExit()
Write-Output "Proc 1 Done"
Write-Output "Running Proc 2"
$proc2 = Start-Process -FilePath "C:\JmeterLoadTests\apache-jmeter-5.2.1\bin\jmeter" -ArgumentList "-n -t C:\JmeterLoadTests\test\enpointsType1\test-1-1.jmx -Jduration=10"
$proc2.WaitForExit()
This just launches both tests simultaneously.
My question is then how to make Powershell wait for the previous test to finish.
Your immediate problem is that your Start-Process call is missing the -PassThru switch, which is required for the call to return a System.Diagnostics.Process instance representing the newly launched process.
# ...
# Note the use of -PassThru
$proc = Start-Process -PassThru -FilePath "C:\JmeterLoadTests\apache-jmeter-5.2.1\bin\jmeter" -ArgumentList "-n -t C:\JmeterLoadTests\test\enpointsType1\test-1-1.jmx -Jduration=10"
$proc.WaitForExit()
# ...
Alternatively, if you don't need to examine the process exit code (which $proc.ExitCode in the above command would give you), you can simply use the -Wait switch, which makes Start-Process itself wait for the process to terminate:
# ...
# Note the use of -Wait
Start-Process -Wait -FilePath "C:\JmeterLoadTests\apache-jmeter-5.2.1\bin\jmeter" -ArgumentList "-n -t C:\JmeterLoadTests\test\enpointsType1\test-1-1.jmx -Jduration=10"
# ...
Taking a step back:
To synchronously execute console applications or batch files in the current console window, call them directly; do not use Start-Process (or the System.Diagnostics.Process API it is based on).
Aside from being syntactically easier and less verbose, this has two key advantages:
You can directly capture their output.
It allows you to examine the process exit code via the automatic $LASTEXITCODE variable afterwards.
Assuming that jmeter is a console application (the docs suggest it runs as one when invoked with arguments):
# ...
# Direct invocation in the current window.
# Stdout and stderr output will print to the console by default,
# but can be captured or redirected.
# Note: &, the call operator, isn't strictly needed here,
# but would be if your executable path were quoted
# or contained variable references.
& C:\JmeterLoadTests\apache-jmeter-5.2.1\bin\jmeter -n -t C:\JmeterLoadTests\test\enpointsType1\test-1-1.jmx -Jduration=10
# Use $LASTEXITCODE to examine the process exit code.
# ...
See this answer for more information.
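As a hedged sketch, a sequential loop over all test plans could then look like this (paths taken from the question; on Windows the jmeter launcher may need a .bat extension):
$jmeter = 'C:\JmeterLoadTests\apache-jmeter-5.2.1\bin\jmeter'
Get-ChildItem C:\JmeterLoadTests\test -Recurse -Filter *.jmx | ForEach-Object {
    Write-Output "Running $($_.Name)"
    & $jmeter -n -t $_.FullName -Jduration=10
    Write-Output "$($_.Name) finished with exit code $LASTEXITCODE"
}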
It might be the case that you're suffering from Out-Default cmdlet execution; the easiest fix is separating the commands with semicolons, like:
cmd1;cmd2;cmd3;etc;
This way, PowerShell will wait for the previous command to complete before starting the next one.
It might be a better idea to consider switching to the Maven JMeter Plugin, which by default executes all tests it finds under the src/test/jmeter folder relative to the pom.xml file.
This is really basic, but I can't find the answer. The installer sets up my path so that I can just type the command:
ng serve
at the command prompt and the script runs. I don't want to wait for this program to finish (it's a server, after all). How do I launch the same script (it's a CMD script as far as I can tell) from Powershell without waiting for it to finish (and without having to find the source directory for the script)?
If it's acceptable to terminate the server when the PowerShell session exits, use a background job:
In PowerShell (Core) 7+
ng serve &
In Windows PowerShell, explicit use of Start-Job is required:
Start-Job { ng serve }
Both commands return a job-information object, which you can either save in a variable ($jb = ...) or discard ($null = ...)
If the server process produces output you'd like to monitor, you can use the Receive-Job cmdlet.
See the conceptual about_Jobs topic for more information.
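A minimal sketch of that approach:
$jb = Start-Job { ng serve }
# ... later, check what the server has written so far (-Keep retains the output):
Receive-Job $jb -Keep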
If the server must continue to run even after the launching PowerShell session exits, use the Start-Process cmdlet, which on Windows launches an independent process in a new console window (by default); use the -WindowStyle parameter to control the visibility / state of that window:
Start-Process ng serve # short for: Start-Process -FilePath ng -ArgumentList serve
Note: On Unix-like platforms, where Start-Process doesn't support creating independent new terminal windows, you must additionally use nohup - see this answer.
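For instance, to keep the new console window hidden on Windows (a hedged sketch):
Start-Process ng serve -WindowStyle Hidden   # short for: -FilePath ng -ArgumentList serve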
I am trying to call a Start-Job in powershell. When I do, it spawns a background powershell with the following arguments:
C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe -Version 5.0 -s -NoLogo -NoProfile -EncodedCommand [encoded command I want to run in base64]
However, the powershell command never seems to complete, whatever command I send it.
I tried spawning a powershell instance like this:
powershell.exe -s
This also creates an instance that seems frozen, not executing or doing anything. Looking online, I cannot find any reference to the -s argument.
Does anybody know what it's for or how to get rid of it so that my start-jobs work properly?
Edit: It's possible that -s is shorthand for -sta, but my command does not freeze when using -sta, whereas it does when using -s.
Edit2: I have since found out that -s is shorthand for -ServerMode, apparently a legacy PowerShell 2.0 option. I have no idea why that is added when using Start-Job.
Edit3: The command I use is:
$deploymentsJobs += Start-Job -InitializationScript { SomeSmallFunction } (AnotherFunction) -ArgumentList $arg1, $arg2, $arg3
tl;dr:
The -s option is an expected part of the command line used to launch a background job via a new PowerShell process - it puts the new process in server mode, which is required to communicate with the calling process for background job management.
It is not a legacy option, but it isn't documented either, because it is only meant to be used internally by PowerShell.
Given that everything you describe is as expected, the problem is likely with the specific commands you run via -InitializationScript and in the main script block (the implied -ScriptBlock argument).
As you've discovered, a Start-Job call spawns a powershell -s -NoLogo -NoProfile call behind the scenes (discoverable via Task Manager).
That is, a new PowerShell process is created to run the commands in the background.
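For instance, you can inspect the spawned process's command line programmatically (a hedged sketch; assumes Windows PowerShell, where the child process is powershell.exe):
$j = Start-Job { Start-Sleep 60 }
# List all powershell.exe processes; the background job's child shows the -s -NoLogo -NoProfile arguments.
Get-CimInstance Win32_Process -Filter "Name = 'powershell.exe'" |
    Select-Object ProcessId, CommandLine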
An -EncodedCommand argument with a Base64-encoded command string is only present if you invoked Start-Job with the -InitializationScript parameter - the main script block (the (implied) -ScriptBlock argument) is not passed via the command line (see below).
-s is always used PowerShell-internally to invoke background jobs, and, as you've also discovered, it is an alias for the -servermode switch. (Given that only -STA is documented, one would expect -s to be short for -STA, but it is not.)
-s / -servermode is an implementation detail, used only by PowerShell itself, which is why it isn't documented.
This location in the PowerShell Core source code on GitHub shows you how the command line for the background process is constructed.
Server mode is the mode the background process must be in in order to communicate with the calling process via its standard streams (stdin, stdout, stderr): That is, the command to execute in the background is sent to the background process through its stdin stream, and the background process reports its output via its stdout and stderr streams.[1]
Note that XML-based serialization / deserialization happens during this inter-process communication, using the same infrastructure as PowerShell remoting - see this answer for more information.
[1] Ohad Schneider points out that it is possible to accidentally disrupt this communication if the main script block contains commands such as Start-Process -NoNewWindow with a console program, which directly writes to the background process's stdout stream - see this answer.
I have a set of test DLL's that I'm running from a powershell script that calls OpenCover.Console.exe via the Start-Process command.
I have the -returntargetcode flag set
After execution I check $lastexitcode and $?. They return 1 and True respectively all the time. Even when tests are failing.
Shouldn't $lastexitcode be 0 when all tests pass and 1 when they fail?
By default, Start-Process is asynchronous, so it doesn't wait for your process to exit. If you want your command-line tool to run synchronously, drop the Start-Process and invoke the command directly. That's the only way it will set $LASTEXITCODE. For example, causing CMD.exe to exit with a 2:
cmd /c exit 2
$LASTEXITCODE
You can make Start-Process synchronous by adding the -Wait flag, but it still won't set $LASTEXITCODE. To get the exit code from Start-Process, add -PassThru to your Start-Process call, which then outputs a [System.Diagnostics.Process] object that you can use to monitor the process and (eventually) get access to its ExitCode property. Here's an example that should help:
$p = Start-Process "cmd" -ArgumentList "/c exit 2" -PassThru -Wait
$p.ExitCode
Of course, the advantage of this approach is that you don't need to wait for the process, but later, when it exits, you have the information about its run in $p.
When executing a GUI application, dropping the Start-Process does not help, as PowerShell does not wait for GUI applications to complete when executing them directly this way, so $LASTEXITCODE is not set.
Piping the (nonexistent) GUI application output helps, as it makes PowerShell wait for the application to complete.
notepad.exe | Out-Null
echo $LASTEXITCODE
Note that "GUI application" does not necessarily mean that the application has windows. Whether an application is GUI or console is a flag in .exe file header.
Start-Process -PassThru -Wait, as suggested in the answer by Burt Harris, works too in this case; it's just a bit more complicated.
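A minimal sketch of that alternative:
$p = Start-Process notepad.exe -PassThru -Wait
$p.ExitCode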