Run ps1 file in foreground - powershell

How to run a ps1 file in foreground?
I noticed that when I execute my ps1 file, instead of seeing the output of the script, a background job is started.
Is there any way to run a ps1 file and get the same behavior we have when executing a sh or batch file?
Updates:
My ps1 file:
$scratchOrgName=$args[0]
Write-Host "Hello " & $scratchOrgName
ps1 file execution (screenshot: a background-job object is printed instead of the expected output):
The & starts the command as a background job. (It's called the background operator.)
Change the code into something like
Write-Host "Hello" $scratchOrgName
or
Write-Host "Hello $scratchOrgName"

tl;dr
Unless you explicitly request that commands be run in the background (as you accidentally did, see next section), PowerShell commands do run in the foreground.
To achieve what you were (presumably) trying to do:
$scratchOrgName=$args[0]
"Hello $scratchOrgName"
Michaël Hompus' helpful answer provides the crucial pointers, but let me attempt a systematic overview:
Write-Host "Hello " & $scratchOrgName is composed of two statements:
Write-Host "Hello " & submits command Write-Host "Hello " as a background job, in PowerShell (Core) v6+ (in Windows PowerShell (v5.1-), you'd get an error, saying that & is reserved for future use). An object representing the newly created job is output and prints to the screen, as shown in your screenshot.
The post-positional use of & - i.e. placed after a command - is indeed the background operator, and is therefore equivalent to Start-Job { Write-Host "Hello " }
This contrasts with pre-positional use of &, which then acts as the call operator, for invoking command names or paths that are potentially quoted or contain / are variable values (e.g. & 'C:\Program Files\Node.js\node.exe')
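A minimal sketch of the pre-positional (call-operator) use, reusing the Node.js path from the example above (the path is illustrative and may not exist on your machine):

```powershell
# Pre-positional &: the call operator invokes a command whose name or path
# is quoted or stored in a variable (path below is illustrative only).
$exe = 'C:\Program Files\Node.js\node.exe'
& $exe --version
```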
$scratchOrgName - by virtue of PowerShell's implicit output behavior - outputs the value of that variable, which prints to the screen by default.
As for what you intended:
& is VBScript's string-concatenation operator; its PowerShell equivalent is +
A string-concatenation operation is an expression, and as such it must be enclosed in (...) in order to be passed as an argument to a command such as Write-Host.
Therefore, the direct PowerShell expression of your intent would be:
Write-Host ("Hello " + $scratchOrgName)
But, as also shown in Michaël's answer, this is more easily expressed via an expandable (double-quoted) string ("..."):
Write-Host "Hello $scratchOrgName"
Taking a step back: Write-Host is typically the wrong tool to use, unless the intent is to write to the display only, bypassing the success output stream and with it the ability to send output to other commands, capture it in a variable, or redirect it to a file.
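A small sketch of that difference (run interactively; names are illustrative):

```powershell
# Write-Host writes to the display (information stream), so its output
# cannot be captured via assignment; success-stream output can.
$fromHost   = Write-Host 'shown on screen only'
$fromStream = "Hello world"
$null -eq $fromHost    # True: nothing was captured from Write-Host
$fromStream            # the implicitly output string was captured
```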
To output a value, use it by itself, taking advantage of the aforementioned implicit output behavior (or use Write-Output, though that is rarely needed):
"Hello $scratchOrgName"
See this answer for more information.

Related

Is it possible to dot source a string variable in PowerShell?

I know I can dot source a file:
. .\MyFunctions.ps1
But, I would like to dot source the commands in a string variable:
. $myFunctions
I see that this is possible:
.{$x=2}
And $x equals 2 after the script block is sourced.
But... .{$myFunctions} does not work.
I tried $myFunctions | Invoke-Expression, but it doesn't keep the source function in the current scope. The closest I have been able to come up with is to write the variable to a temporary file, dot source the file, and then remove the file.
Inevitably, someone will ask: "What are you trying to do?" So here is my use case:
I want to obfuscate some functions I intend to call from another script. I don't want to obfuscate the master script, just my additional functions. I have a user base that will need to adjust the master script to their network, directory structure and other local factors, but I don't want certain functions modified. I would also like to protect the source code. So, an alternate question would be: What are some good ways to protect PowerShell script code?
I started with the idea that PowerShell will execute a Base64-encoded string, but only when passed on the command line with -EncodedCommand.
I first wanted to dot source an encoded command, but I couldn't figure that out. I then decided that it would be "obfuscated" enough for my purposes if I converted my Base64 file into a decoded string and dot sourced the value of the string variable. However, without writing the decoded source to a file, I cannot figure out how to dot source it.
It would satisfy my needs if I could Import-Module -EncodedCommand .\MyEncodedFile.dat
Actually, there is a way to achieve that and you were almost there.
First, as you already stated, the source or dot operator works either by providing a path (as string) or a script block. See also: . (source or dot operator).
So, when trying to dot-source a string variable, PowerShell thinks it is a path. But, thanks to the possibility of dot-sourcing script blocks, you could do the following:
# Make sure everything is properly escaped.
$MyFunctions = "function Test-DotSourcing { Write-Host `"Worked`" }"
. { Invoke-Expression $MyFunctions }
Test-DotSourcing
And you successfully dot-sourced your functions from a string variable!
Explanation:
With Invoke-Expression the string is evaluated and run in the child scope (script block).
Then with . the evaluated expressions are added to the current scope.
See also:
Invoke-Expression
About scopes
While @dwettstein's answer is a viable approach using Invoke-Expression to handle the fact that the function is stored as a string, there are other approaches, shown below, that achieve the same result.
One thing I'm not crystal clear on is the scoping itself: Invoke-Expression doesn't create a new scope, so there isn't strictly a need to dot source at that point...
#Define your function as a string
PS> $MyUselessFunction = "function Test-WriteSomething { 'It works!' }"
#Invoke-Expression would let you use the function
PS> Invoke-Expression $MyUselessFunction
PS> Test-WriteSomething
It works!
#Dot sourcing works fine if you use a script block
PS> $ScriptBlock = [ScriptBlock]::Create($MyUselessFunction)
PS> . $ScriptBlock
PS> Test-WriteSomething
It works!
#Or just create the function as a script block initially
PS> $MyUselessFunction = {function Test-WriteSomething { 'It works!' }}
PS> . $MyUselessFunction
PS> Test-WriteSomething
It works!
In other words, there are probably a myriad of ways to get something similar to what you want - some of them documented, and some of them divined from the existing documentation. If your functions are defined as strings, then Invoke-Expression might be needed, or you can convert them into script blocks and dot source them.
At this time it is not possible to dot source a string variable.
I stand corrected! . { Invoke-Expression $MyFunctions } definitely works!

How can I pass unbound arguments from one script as parameters to another?

I have little experience with PowerShell in particular.
I'm trying to refactor some very commonly re-used code into a single script that can be sourced where it's needed, instead of copying and pasting this same code into n different scripts.
The scenario I'm trying to get looks (I think) like this:
#common.ps1:
param(
# Sure'd be great if clients didn't need to know about these
$some_params_here
...
)
function Common-Func-Uses-Params {
...
}
⋮
# foo/bar/bat.ps1:
# sure would love not to have to redefine all the common params() here...
. common.ps1 <pass-the-arguments>
Common-Func-Uses-Params $specific_Foo/Bar/Bat_Data
As the pseudo-comments above indicate, I've only been able to do this so far by capturing the params in the calling script as well.
I want to be in a situation where I can update the common code (say with a -Debug or -DryRun or -Url or whatever parameter) and not have to worry about updating all of the client code to match.
Is this possible?
You're missing two key things:
$args - which captures all of (and only) the unbound arguments to the script
splatting (@) - which is used to pass arrays or hashtables to a command rather than flattening them like you'd get with $args
When you combine these, you can easily pass all arguments onto another script, like so:
# foo.ps1
. common.ps1 @args
With a sourced file like this:
#common.ps1
param ([string]$foo = "foo")
echo "`$foo is $foo"
You get this output:
> foo.ps1 returns $foo is foo
> foo.ps1 -Foo bar returns $foo is bar
Note that if you're trying to use the PowerShell ISE, it might take you a while to figure this out or debug any of it. When you're in the debugger, both $args and $MyInvocation.UnboundArguments will do their best to hide that information from you. They'll appear to be completely empty.
You can print the args with >> echo "$(@args)", but that also has the very weird side effect of telling the debugger to continue. I think the splatting is adding an extra newline, and that's ending up in the Command Window.
The best workaround I have for that is to add $theargs = $args at the top of your script and remember to use $theargs in the debugger.

Call an executable with quotes at specific positions

I want to call an executable from a PowerShell script that requires quotes at specific positions in the argument list. Although I found similar questions, I did not find a solution.
This is what the command must look like on the command line:
reptool.exe --profile="C:\My profile"
The parameter value ("C:\Profiles...") is supposed to be generated dynamically using a variable:
$repToolProfile = "C:\My profile"
This is what I have already tried:
&"reptool.exe" --profile=$repToolProfile
Fails as the argument is given as "--profile=C:\My profile" (quotes around the whole argument).
&"reptool.exe" --profile="$repToolProfile"
Fails as the argument is given as "--profile=C:\My profile" (quotes around the whole argument, same as above).
&"reptool.exe" "--profile=`"$repToolProfile`""
Fails as the argument is given as "--profile="C:\My profile"" (quotes around the whole argument and the value).
I cannot use single quotes or the "verbatim operator" (--%) as I have to use a PowerShell variable, nor can I use Start-Process as it runs asynchronously (even when I use the -Wait parameter), and I want to check the exit code. I don't want to convert my arguments to Base64.
This worked for me:
$command = '& "reptool.exe" --% ' + "--profile=`"$repToolProfile`""
Invoke-Expression $command

Build up a string to be passed to call operator

I need to build a string that is actually a command-line, and then execute the contents of that command-line. I'd hoped the call operator (&) would help, but it appears not. Here is a simple contrived example. The following works as expected, it pings a website:
$command = "ping"
$website = "www.bbc.co.uk"
& $command $website
however if I change it to this:
$command = "ping"
$website = "www.bbc.co.uk"
$cmd = "$command $website"
& $cmd
I get an error:
The term 'ping www.bbc.co.uk' is not recognized as the name of a
cmdlet, function, script file, or operable program.
Is there a way to dynamically build up a command-line as a string, and then execute it?
Yes, but you need to use Invoke-Expression (which is just like eval), instead of the call operator. Note that you also need to ensure that all your quoting is correct in that case. E.g.
$cmd = "'$command' '$website'"
would work in your trivial example, unless $command or $website contained single quotes. The problem here is essentially that everything you put into the string is subject to the usual parsing rules of PowerShell.
Generally, if you can, stay as far away from Invoke-Expression as you can. There are a few problems that need it, but invoking external programs ... not so much.
A much better alternative, especially if you have an arbitrary number of arguments, is to just collect the arguments in an array and use the splat operator (note the @ in the code example below):
$command = 'ping'
$arguments = '-t','www.bbc.co.uk'
& $command @arguments
This ensures that arguments are properly quoted when necessary and generally avoids a lot of headaches you're going to get with Invoke-Expression.
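As a complementary sketch (not from the answer above): splatting also works with hashtables, whose keys bind by parameter name rather than by position. The parameters below are real Get-ChildItem parameters; the values are illustrative:

```powershell
# Hashtable splatting: keys become parameter names.
$params = @{ Path = '.'; Filter = '*.ps1' }
Get-ChildItem @params    # same as: Get-ChildItem -Path . -Filter *.ps1
```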
(Side note: Whenever you have a problem in PowerShell and think »Oh, I'm just going to use a string«, it's often time to rethink that. This includes handling file names, or command lines. PowerShell has objects, reducing it to the capabilities of earlier shells just yields the same pain you have elsewhere too, e.g. multiple levels of escaping, sometimes with different syntaxes, etc. And most of the time there are better ways of solving the problem.)

command execution ordering inside a PowerShell scriptblock

I got excited with PowerShell ScriptBlock at first but I was confused recently with its executing ordering inside blocks. For example:
$test_block = {
write-host "show 1"
ps
write-host "show 2"
Get-Date
}
The output by calling $test_block.Invoke():
show 1
show 2
<result of command 'ps'>
<result of command 'get-date'>
Do commands that output something run first?
This behaviour occurs because Write-Host doesn't put its output on the pipeline. The other commands place their output on the pipeline, so it is not displayed until the method (.Invoke()) returns.
To get the behaviour I believe you were expecting, use Write-Output instead; the results of all the commands will then be returned in the pipeline.
$test_block = {
write-output "show 1"
ps
write-output "show 2"
Get-Date
}
$test_block.Invoke()
To complement David Martin's helpful answer:
Avoiding Write-Host (which is often the wrong tool to use) in favor of Write-Output, i.e. outputting to the success output stream - rather than printing to the display with Write-Host[1] - solves your immediate problem.
However, you could have avoided the problem by using &, the call operator instead of the .Invoke() method:
# Invokes the script block and *streams* its output.
& $test_block
Using & is generally preferable, not just to avoid .Invoke()'s collect-all-success-output-stream-first behavior - see next section.
As an aside:
This answer describes a common problem with similar symptoms (success output (pipeline output) appearing out of order relative to other streams), although it is technically unrelated and also occurs with &:
# !! The Write-Host output prints FIRST, due to implicit, asynchronous
# !! table formatting of the [pscustomobject] instance.
& { [pscustomobject] @{ Foo = 'Bar' }; Write-Host 'Why do I print first?' }
Why &, not .Invoke(), should be used to invoke script blocks ({ ... }):
Script blocks ({ ... }) are normally invoked with &, the call operator, in argument (parsing) mode (like cmdlets and external programs), not via their .Invoke() method, which allows for more familiar syntax; e.g.:
& $test_block rather than $test_block.Invoke()
with arguments: & $test_block arg1 ... rather than $test_block.Invoke(arg1, ...)
Perhaps more importantly, using this operator-based invocation syntax (& { ... } ... or . { ... } ...) has the following advantages:
It preserves normal streaming semantics, meaning that success output (too) is emitted from the script block as it is being produced, whereas .Invoke() collects all success output first, in a [System.Collections.ObjectModel.Collection[psobject]] instance, which it then returns - by contrast, the Write-Host output goes straight to the display in both cases.
As a beneficial side effect, your specific output-ordering problem goes away, but note that in PSv5+ there can generally still be an output-ordering problem, although its cause is unrelated:
Implicit tabular output (implied Format-Table) for output types without predefined format data is asynchronous in an effort to determine suitable column widths (your specific code happens to only use cmdlets with predefined format data).
See this answer for more information; a simple repro:
[pscustomobject] @{ foo = 1 }; Write-Host 'Should print after, but prints first.'
It allows you to pass named arguments (e.g. -Foo Bar), whereas .Invoke() supports only positional (unnamed) ones (e.g. Bar).
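A small sketch of this difference, using a hypothetical script block with a -Name parameter:

```powershell
# & supports named arguments; .Invoke() is positional-only.
$sb = { param($Name) "Hello, $Name" }
& $sb -Name 'World'    # named argument works with the call operator
$sb.Invoke('World')    # .Invoke() accepts positional arguments only
```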
It preserves normal semantics for script-terminating errors:
# OK: & throw aborts the entire script; 'after' never prints.
& { throw 'fatal' }; 'after'
# !! .Invoke() "eats" the script-terminating error and
# !! effectively converts it to a *statement*-terminating one.
# !! Therefore, execution continues, and 'after' prints.
{ throw 'fatal' }.Invoke(); 'after'
Additionally, using operator-based invocation gives you the option to use ., the dot-sourcing operator, in lieu of &, so as to run a script block directly in the caller's scope, whereas an .Invoke() method call only runs in a child scope (as & does):
# Dot-sourcing the script block runs it in the caller's scope.
. { $foo='bar' }; $foo # -> 'bar'
Use of .Invoke() is best limited to PowerShell SDK projects (which are typically C#-based, where use of PowerShell operators isn't an option).
[1] Technically, since PowerShell v5 Write-Host outputs to the information stream (stream number 6), which, however, prints to the display by default, while getting ignored in the pipeline and redirections. See this answer for a juxtaposition of Write-Host and Write-Output, and why the latter is typically not even needed.