Conditionally piping to Out-Null - powershell

I'm writing a PowerShell script to msbuild a bunch of solutions. I want to count how many solutions build successfully and how many fail. I also want to see the compiler errors, but only from the first one that fails (I'm assuming the others will usually have similar errors and I don't want to clutter my output).
My question is about how to run an external command (msbuild in this case), but conditionally pipe its output. If I'm running it and haven't gotten any failures yet, I don't want to pipe its output; I want it to output directly to the console, with no redirection, so it will color-code its output. (Like many programs, msbuild turns off color-coding if it sees that its stdout is redirected.) But if I have gotten failures before, I want to pipe to Out-Null.
Obviously I could do this:
if ($SolutionsWithErrors -eq 0) {
    msbuild $Path /nologo /v:q /consoleloggerparameters:ErrorsOnly
} else {
    msbuild $Path /nologo /v:q /consoleloggerparameters:ErrorsOnly | Out-Null
}
But it seems like there's got to be a way to do it without the duplication. (Okay, it doesn't have to be duplication -- I could leave off /consoleloggerparameters if I'm piping to null anyway -- but you get the idea.)
There may be other ways to solve this, but for today, I specifically want to know: is there a way to run a command, but only pipe its output if a certain condition is met (and otherwise not pipe it or redirect its output at all, so it can do fancy stuff like color-coded output)?

You can define the output command as a variable and use either Out-Default or Out-Null:
# set the output command depending on the condition
$output = if ($SolutionsWithErrors -eq 0) {'Out-Default'} else {'Out-Null'}
# invoke the command with the variable output
msbuild $Path /nologo /v:q /consoleloggerparameters:ErrorsOnly | & $output
UPDATE
The above code loses MSBuild's colors. To preserve the colors and still avoid duplicating code, this approach can be used:
# define the command once as a script block
$command = {msbuild $Path /nologo /v:q /consoleloggerparameters:ErrorsOnly}
# invoke the command with output depending on the condition
if ($SolutionsWithErrors -eq 0) {& $command} else {& $command | Out-Null}
is there a way to run a command, but only pipe its output if a certain condition is met (and otherwise not pipe it or redirect its output at all, so it can do fancy stuff like color-coded output)?
Most likely there is no such built-in way. But it can be implemented with a function, and the function can then be reused like this:
function Invoke-WithOutput($OutputCondition, $Command) {
    if ($OutputCondition) { & $Command } else { $null = & $Command }
}
Invoke-WithOutput ($SolutionsWithErrors -eq 0) {
    msbuild $Path /nologo /v:q /consoleloggerparameters:ErrorsOnly
}
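Putting it together, here is a sketch of the outer build loop the question describes; the variable `$SolutionPaths` is a hypothetical list of solution paths, not something from the original question:

```powershell
# Sketch: count successes/failures; show compiler output only until the first failure.
$SolutionsWithErrors = 0
$SolutionsBuilt = 0
foreach ($Path in $SolutionPaths) {
    # Output goes to the console (with colors) only while no solution has failed yet.
    Invoke-WithOutput ($SolutionsWithErrors -eq 0) {
        msbuild $Path /nologo /v:q /consoleloggerparameters:ErrorsOnly
    }
    # msbuild's exit code propagates through the function call.
    if ($LASTEXITCODE -ne 0) { $SolutionsWithErrors++ } else { $SolutionsBuilt++ }
}
Write-Host "Succeeded: $SolutionsBuilt, failed: $SolutionsWithErrors"
```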

Related

Can you pipe a string or variable into a downloaded .ps1 file?

I have file with the following contents hosted on dropbox:
(I know it is not currently calling anything)
function test_echo {
    [CmdletBinding()]
    param (
        [Parameter(Position = 0, Mandatory = $True, ValueFromPipeline = $True)]
        [string]$e
    )
    echo $e
}
Is there a way to pipe information into this file before it is downloaded and executed?
ex:
"test" | iwr DropBoxLink | iex
and to make this echo out test
this honestly probably has no practical application, but I found myself wondering if it is possible so thought I'd ask.
I know I could define the string as a variable first and execute it, but I just want to know if you can pipe it, for principle's sake:
$testEcho = "string"; iwr DropBoxLink | iex
> string
"test" | % -begin { iex (irm $DropBoxUrl) } -process { $_ | test_echo }
The above uses a ForEach-Object call's -Begin block to download and evaluate the script (the usual caveats re iex (Invoke-Expression) apply - you should trust the source), which defines the test_echo function contained in the downloaded script.
The -Begin script block executes before pipeline input is processed, which means that by the time the -Process script block processes (each) pipeline input object, the test_echo function is already defined.
Also note that irm (Invoke-RestMethod) rather than iwr (Invoke-WebRequest) is used, given that you're only interested in the content of the script.
Of course, this doesn't gain you much, as you could simply use two statements, which has the added advantage that all pipeline input (should there be multiple input objects) is handled by a single test_echo invocation:
iex (irm $DropBoxUrl) # Download and effectively dot-source the script.
"test" | test_echo # Pipe to the script's `test_echo` function.
A general caveat is that if the downloaded script contains an exit statement that is hit during evaluation, the calling shell exits as a whole.
Since in effect you need to dot-source the downloaded script in order to make its functions available, the only way to solve the exit problem is to download to a (temporary) file first, and dot-source that.
If dot-sourcing isn't needed, calling via the PowerShell CLI (child process) may be an option - see below.
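A minimal sketch of the download-to-file workaround described above (it assumes `$DropBoxUrl` points at the script that defines `test_echo`):

```powershell
# Download the script to a temp file, dot-source it, then clean up.
$tmpScript = Join-Path ([IO.Path]::GetTempPath()) "downloaded-$(Get-Random).ps1"
Invoke-RestMethod $DropBoxUrl | Set-Content -LiteralPath $tmpScript
try {
    . $tmpScript      # dot-source: defines test_echo in the current scope
    'test' | test_echo
}
finally {
    Remove-Item -LiteralPath $tmpScript
}
```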
GitHub issue #5909 proposes a future enhancement that would allow piping Invoke-WebRequest (iwr) calls to Invoke-Command (icm) for direct-from-the-web downloading and execution, without the exit problem (iwr $DropboxLink | icm).
Note that if your downloaded script were to accept pipeline input directly, you could use [scriptblock]::Create() as follows (Invoke-Expression is not an option, because it doesn't accept pipeline input):
# Assumes that the script located at $DropBoxUrl
# *itself* accepts pipeline input.
# Use . (...) if you want to *dot-source* the script, as
# Invoke-Expression would implicitly do.
"test" | & ([scriptblock]::Create((irm $DropBoxUrl))
To work around the exit problem, you can call via the CLI and a script block, using pwsh, the PowerShell (Core) 7+ CLI, in this example; use powershell.exe for Windows PowerShell:
# Assumes that the script located at $DropBoxUrl
# *itself* accepts pipeline input.
'test' |
pwsh -NoProfile {
$input | & ([scriptblock]::Create($args[0]))
} -args (irm $DropBoxUrl)

Powershell Invoke-Expressions pauses

I wrote a Powershell script that uses Steam's command line tool to login and check for updates for a community server I am running. See below:
$steamcmdFolder = 'C:\download\steam'
$steamcmdExec = $steamcmdFolder+"\steamcmd.exe"
$forceinstall = 'force_install_dir'+$steamcmdFolder
$appupdate = 'app_update 258550'
$cmdOutput = "$steamcmdExec +login anonymous"
do {
    Write-Host Checking for an update....
    Invoke-Expression $cmdOutput
    Invoke-Expression $forceinstall
    Invoke-Expression $appupdate
}
while ($Update = 1)
The Invoke-Expression lines are individual command-line statements I want executed in the order I have them. For some reason, the first Invoke-Expression works fine but the others do not -- everything just stops. I can type in the value of $forceinstall on the PowerShell command-line and it works as expected. But why can't I do this using PowerShell? Any suggestions are welcome!
If you expand the other two lines to the strings they actually contain, you can see they are not valid commands on their own:
#Invoke-expression $forceinstall
Invoke-Expression "force_install_dirC:\download\steam"
#Invoke-expression $appupdate
Invoke-Expression "app_update 258550"
Looking at the SteamCMD documentation, it appears you want to combine these into a single command:
Invoke-Expression "steamcmd +login anonymous +force_install_dir C:\download\steam +app_update 258550 +quit"
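Since no string expansion is actually needed here, the call operator `&` avoids Invoke-Expression entirely. A sketch using the variables already defined in the question:

```powershell
# Invoke steamcmd.exe directly; each +command is passed as a separate argument.
& $steamcmdExec +login anonymous +force_install_dir $steamcmdFolder +app_update 258550 +quit
```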

Powershell - Run external powershell script and capture output - Can't pass parameters

I have a problem running a powershell script from within another powershell script, passing parameters and capturing the output. I have tried using the ampersand operator, calling it via powershell.exe -Command but nothing seems to work.
What seems to work is using fixed parameters and values stored in a variable, like this: C:\path\to\script.ps1 -arg1 $value.
This may present a solution if nothing else works, but I would like to run the command similar to this: & $pathToScript $params 2>&1 (the 2>&1 is for capturing error output as well as standard output).
Sometimes the construct prints just the path to the script, sometimes it says Cannot run file in the middle of pipeline, and sometimes it complains that it cannot find the mentioned script file (I sometimes have spaces in my path, but I thought quoting would suffice; quoting was done like this: $path = "`"C:\path\with space\to\script.ps1`"").
This is the simplified function I want to use this in:
Function captureScriptOutput
{
    # the function receives the script path and parameters
    param($path, $params)

    # this works if no params are passed, but I need params!
    $output = & $path $params 2>&1 | Out-String
}
I solved the problem with the help of a colleague.
We added a level of indirection: cd into the respective directory first, then run the command afterwards. This works like a charm.
Solution source code:
Function captureScriptOutput
{
    param($fileName, $params)
    cd "C:\into\path with\space"
    $output = & .\$fileName $params 2>&1 | Out-String
}
This works and even captures the error output, I hope some other folks encountering this kind of problem can use this to fix their problems.
Cheerioh and thanks for reply!
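For reference, the likely root cause of the original failure is that `& $path $params` passes `$params` as a single argument. If `$params` is an array, splatting it with `@params` passes each element as a separate argument, which would avoid the cd workaround entirely. A sketch (not from the original answer):

```powershell
Function captureScriptOutput
{
    # $params is expected to be an array, e.g. @('-arg1', $value)
    param($path, $params)
    # @params (splatting) passes the array elements as individual arguments
    $output = & $path @params 2>&1 | Out-String
    return $output
}
```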
Try Invoke-Expression, but you will need to test how many quotes are needed:
Invoke-expression "$path $param"

Call a PowerShell script in a new, clean PowerShell instance (from within another script)

I have many scripts. After making changes, I like to run them all to see if I broke anything. I wrote a script to loop through each, running it on fresh data.
Inside my loop I'm currently running powershell.exe -command <path to script>. I don't know if that's the best way to do this, or if the two instances are totally separate from each other.
What's the preferred way to run a script in a clean instance of PowerShell? Or should I be saying "session"?
Using powershell.exe seems to be a good approach but with its pros and cons, of course.
Pros:
Each script is invoked in a separate clean session.
Even crashes do not stop the whole testing process.
Cons:
Invoking powershell.exe is somewhat slow.
Testing depends on exit codes but 0 does not always mean success.
Neither of these cons is mentioned in the question as a potential problem.
The demo script is below. It has been tested with PS v2 and v3. Script names may include special characters like spaces, apostrophes, brackets, backticks, and dollar signs. One requirement mentioned in the comments is the ability for scripts to get their own paths from within their code. With the proposed approach a script can get its own path as
$MyInvocation.MyCommand.Path
# make a script list, use the full paths or explicit relative paths
$scripts = @(
'.\test1.ps1' # good name
'.\test 2.ps1' # with a space
".\test '3'.ps1" # with apostrophes
".\test [4].ps1" # with brackets
'.\test `5`.ps1' # with backticks
'.\test $6.ps1' # with a dollar
'.\test ''3'' [4] `5` $6.ps1' # all specials
)
# process each script in the list
foreach($script in $scripts) {
    # make a command; mind &, ' around the path, and escaping of '
    $command = "& '" + $script.Replace("'", "''") + "'"
    # invoke the command, i.e. the script, in a separate process
    powershell.exe -command $command
    # check the exit code (assuming 0 means success)
    if ($LastExitCode) {
        # in this demo just write a warning
        Write-Warning "Script $script failed."
    }
    else {
        Write-Host "Script $script succeeded."
    }
}
If you're on PowerShell 2.0 or higher, you can use jobs to do this. Each job runs in a separate PowerShell process e.g.:
$scripts = ".\script1.ps1", ".\script2.ps1"
$jobs = @()
foreach ($script in $scripts)
{
    $jobs += Start-Job -FilePath $script
}
Wait-Job $jobs
foreach ($job in $jobs)
{
    "*" * 60
    "Status of '$($job.Command)' is $($job.State)"
    "Script output:"
    Receive-Job $job
}
Also, check out the PowerShell Community Extensions. It has a Test-Script command that can detect syntax errors in a script file. Of course, it won't catch runtime errors.
One tip for PowerShell V3 users: we (the PowerShell team) added a new API on the Runspace class called ResetRunspace(). This API resets the global variable table back to the initial state for that runspace (as well as cleaning up a few other things). What it doesn't do is clean out function definitions, types and format files or unload modules. This allows the API to be much faster. Also note that the Runspace has to have been created using an InitialSessionState object, not a RunspaceConfiguration instance. ResetRunspace() was added as part of the Workflow feature in V3 to support parallel execution efficiently in a script.
The two instances are totally separate, because they are two different processes. Generally, starting a PowerShell process for every script run is not the most efficient approach. Depending on the number of scripts and how often you re-run them, it may be affecting your overall performance. If it's not, I would leave everything as is.
Another option would be to run in the same runspace (this is the correct word for it), but clean everything up every time. See this answer for a way to do it. Or use the extract below:
$sysvars = get-variable | select -Expand name
function remove-uservars {
get-variable |
where {$sysvars -notcontains $_.name} |
remove-variable
}
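A usage sketch for the extract above, assuming it is dot-sourced at the top of the testing session (so `$sysvars` captures the baseline before any test script runs):

```powershell
# Run a script in the current runspace, then discard variables it created.
& '.\test1.ps1'
remove-uservars
```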

Execute process conditionally in Windows PowerShell (e.g. the && and || operators in Bash)

I'm wondering if anybody knows of a way to conditionally execute a program depending on the exit success/failure of the previous program. Is there any way for me to execute a program2 immediately after program1 if program1 exits successfully without testing the LASTEXITCODE variable? I tried the -band and -and operators to no avail, though I had a feeling they wouldn't work anyway, and the best substitute is a combination of a semicolon and an if statement. I mean, when it comes to building a package somewhat automatically from source on Linux, the && operator can't be beaten:
# Configure a package, compile it and install it
./configure && make && sudo make install
PowerShell would require me to do the following, assuming I could actually use the same build system in PowerShell:
# Configure a package, compile it and install it
.\configure ; if ($LASTEXITCODE -eq 0) { make ; if ($LASTEXITCODE -eq 0) { sudo make install } }
Sure, I could use multiple lines, save it in a file and execute the script, but the idea is for it to be concise (save keystrokes). Perhaps it's just a difference between PowerShell and Bash (and even the built-in Windows command prompt which supports the && operator) I'll need to adjust to, but if there's a cleaner way to do it, I'd love to know.
You could create a function to do this, but there is not a direct way to do it that I know of.
function run-conditionally($commands) {
    $ranAll = $false
    foreach ($command in $commands) {
        invoke-command $command
        if ($LASTEXITCODE -ne 0) {
            $ranAll = $false
            break
        }
        $ranAll = $true
    }
    Write-Host "Finished: $ranAll"
    return $ranAll
}
Then call it similar to
run-conditionally(@(".\configure","make","sudo make install"))
There are probably a few errors in there; this is off the cuff, without a PowerShell environment handy.
I was really hurting for the lack of && too, so I wrote the following simple script based on GrayWizardX's answer (which doesn't work as-is):
foreach ($command in $args)
{
    $error.Clear()
    invoke-command $command
    if ($error) { break }
}
If you save it as rc.ps1 (for "run conditionally") in a directory in your path, you can use it like:
rc {.\configure} {make} {make install}
Using script blocks (the curly braces) as arguments instead of strings means you can use tab completion while typing out the commands, which is much nicer. This script is almost as good as &&, and works.
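As a closing note: PowerShell 7.0 and later support the pipeline chain operators && and || natively, so the Bash one-liner from the question carries over directly (this does not apply to Windows PowerShell 5.1 and earlier):

```powershell
# PowerShell 7+ only: each step runs only if the previous one succeeded
.\configure && make && sudo make install
```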