Execute a parameter passed into a PowerShell script as if it were a line in the script

I've got a wrapper PowerShell script that I'm hoping to use to automate a few things. It's pretty basic, and accepts a parameter that I want the script to run as if it were a line in the script. I absolutely cannot get it to work.
Example:
param( [string[]] $p)
echo $p
# Adds the base cmdlets
Add-PSSnapin VMware.VimAutomation.Core
# Add the following if you want to do things with Update Manager
Add-PSSnapin VMware.VumAutomation
# This script adds some helper functions and sets the appearance. You can pick and choose parts of this file for a fully custom appearance.
. "C:\Program Files (x86)\VMware\Infrastructure\vSphere PowerCLI\Scripts\Initialize-VIToolkitEnvironment.ps1"
$p
In the example above, I want $p to execute as if it were a line in the script. I know this isn't secure, and that's probably where the problem lies.
Here is how I try running the script and passing in a parameter for $p:
D:\ps\test>powershell -command "D:\ps\test\powershell_wrapper.ps1" 'Suspend-VM servername -Verbose -Confirm:$False'
How can I get my parameter of 'Suspend-VM servername -Verbose -Confirm:$False' to run inside my script? If I just include the value in the script instead of passing it in as a parameter, it runs without any issues...

You can basically approach this two ways, depending on what your needs really are and how you want to structure your code.
Approach #1 - Invoke-Expression
Invoke-Expression basically allows you to treat a string like an expression and evaluate it. Consider the following trivial example:
Invoke-Expression '"Hello World"'
That will evaluate the string as if it were an expression typed in directly, and place the string "Hello World" on the pipeline. You could use that to take your string parameter and run it on-the-fly in your script.
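Applied to the wrapper from the question, the trailing $p line could become something like the following. This is just a sketch: the parameter is narrowed to a single [string] for simplicity, and it blindly trusts whatever the caller passes in.
param( [string] $p )
# ... load the PowerCLI snap-ins and the initialization script as in the original wrapper ...
# Evaluate the passed-in string as if it were a line of this script
Invoke-Expression $p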
Approach #2 - Using a ScriptBlock
PowerShell has a special data type called a ScriptBlock, where you can bind a script to a variable, and then invoke that script as part of your code. Again, here is a trivial example:
function Test-SB([ScriptBlock]$sb) {
    $sb.Invoke()
}
Test-SB -sb {"Hello World"}
This example creates a function with a single parameter $sb that is of type ScriptBlock. Notice the parameter is bound to the actual chunk of code {"Hello World"}? That code is assigned to the $sb parameter, and then a call to the .Invoke method actually executes the code. You could adapt your code to take in a ScriptBlock and invoke it as part of your script.
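Adapted to the wrapper from the question, that might look roughly like the sketch below. Note that this shape works most naturally when the wrapper is itself invoked from another PowerShell session, where the caller can pass a real script block; coming from cmd.exe the argument arrives as a string, which is where Invoke-Expression or [scriptblock]::Create() come back into play.
param( [ScriptBlock] $p )
# ... load the PowerCLI snap-ins and the initialization script as in the original wrapper ...
# Run the caller-supplied code
& $p
# Example call from another PowerShell session:
# .\powershell_wrapper.ps1 -p { Suspend-VM servername -Verbose -Confirm:$False }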
Approach #3 - Updating your profile
OK, so I said there were two ways to approach it. There is actually a third... sort of... You could add the VMWare cmdlets to your $profile so they are always present and you don't need your wrapper to load in those libraries. Granted, this is a pretty big hammer - but it might make sense if this is the environment you are constantly working in. You could also just create a shortcut to PowerShell that runs a .ps1 on startup that includes those libraries and hangs around (this is what MS did with the SharePoint admin shell and several others). Take a look at this TechNet page to get more info on the $profile and if it can help you out:
http://msdn.microsoft.com/en-us/library/windows/desktop/bb613488.aspx
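For example, you could append the snap-in loads from the question to your profile (a sketch; assumes your $profile file already exists):
Add-Content -Path $profile -Value @'
Add-PSSnapin VMware.VimAutomation.Core
Add-PSSnapin VMware.VumAutomation
. "C:\Program Files (x86)\VMware\Infrastructure\vSphere PowerCLI\Scripts\Initialize-VIToolkitEnvironment.ps1"
'@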

Related

Are these alternatives to Invoke-Expression really any safer? And why?

I don't understand whether Invoke-Expression is internally flawed, making it more dangerous, or whether the problem is that it combines text-to-code conversion, code execution, and possibly execution in the local scope, all in a single command.
What I'm wanting to do is create a class in C# with a public event EventHandler MyEvent; member via Add-Type, and then inherit from that class in PowerShell by writing the PowerShell in a string (e.g. 'class psMessages : csMessages { <code for class> }'), converting the string into a script block, and then executing it.
I found these methods for creating the script block will work:
$ScriptBlock = ([System.Management.Automation.Language.Parser]::ParseInput($psMessages, [ref]$null, [ref]$null)).GetScriptBlock()
# or
$ScriptBlock = [scriptblock]::Create($psMessages)
And either of these commands will execute the script block in the current scope:
. $ScriptBlock
# or
Invoke-Command -NoNewScope $ScriptBlock
Additional info: These commands fail, I believe because they execute the script block in a new scope - please correct me if I'm wrong:
& $ScriptBlock
# or
$ScriptBlock.Invoke()
# or
Invoke-Command $ScriptBlock
So, are any of these methods safer to use than Invoke-Expression? Or are they all just as dangerous? And, if any are safer, why?
What makes any command dangerous is the blind execution of source code from an unknown / untrusted source.
As such, the execution mechanism is incidental to the problem.
Conversely, this means that if you fully control or implicitly trust a given piece of source code, use of Invoke-Expression - which is generally to be avoided - is acceptable.
Note that the code executed by Invoke-Expression invariably runs in the current scope; you could wrap the input string in & { ... } in order to execute it in a child scope.
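For illustration (not from the original answer), a quick demonstration of that scoping difference:
$x = 1
Invoke-Expression '$x = 2'        # runs in the current scope; $x is now 2
Invoke-Expression '& { $x = 3 }'  # the & { ... } wrapper runs the code in a child scope; $x stays 2
$x                                # -> 2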
As Santiago Squarzon points out, [scriptblock]::Create() enables a middle ground:
As demonstrated in this answer of his, it is possible to constrain what may be executed in terms of permissible commands, read access to specific variables, and whether read access to environment variables is allowed.
Additionally, a script block instance returned by [scriptblock]::Create() allows potentially reusable invocation on demand, with the choice to either execute it in the current scope, with ., the dot-sourcing operator, or a child scope, with &, the call operator.
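A rough sketch of that middle ground, combining [scriptblock]::Create() with the CheckRestrictedLanguage() constraint check and the two invocation styles (the sample code string is made up):
$untrusted = 'Get-Date; $PSVersionTable'
# Create a reusable script block from the string
$sb = [scriptblock]::Create($untrusted)
# Optionally constrain what the code may do; this throws if the code references
# anything outside the allow-lists below
$sb.CheckRestrictedLanguage(
    [string[]] @('Get-Date'),        # permissible commands
    [string[]] @('PSVersionTable'),  # variables the code may read
    $false                           # no access to environment variables
)
& $sb    # invoke in a child scope
. $sb    # invoke in the current scope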
As for the commands listed under "Additional info:"
They should not fail; they should all execute the script block in a child scope.
However:
Using the .Invoke() method on script blocks should be avoided, because it changes the semantics of the call in several respects - see this answer.
Similarly, there is no good reason to use Invoke-Command for (local) invocation of script blocks - see this answer.

start powershell won't accept variable as parameter

Issue
The called PowerShell script accepts some of the parameters, but not all of them:
Current Set-Up and code:
I have a common folder where two .ps1 scripts are located:
DoWork.ps1
Workmanager.ps1
Workmanager.ps1 calls DoWork.ps1:
$targetPath="M:\target"
echo "target path: $targetPath"
start powershell {.\DoWork.ps1 -target $targetPath -tempdrive D:\}
output (as expected):
target path: M:\target
DoWork.ps1 contains some start code:
param
(
    [string]$tempdrive,
    [string]$target,
    [int] $threads = 8,
    [int] $queuelength = -1
)
echo "variables:"
echo "temp drive: $tempdrive"
echo "target path: $target"
Unexpectedly, $target is not being assigned. Previously I had the variable named $targetpath, which did not work either.
variables:
temp drive: D:\
target path:
Findings
It appears that the issue lies in Workmanager.ps1. Specifying the parameter as a fixed string rather than as a variable will load the parameter. Any solution for this?
start powershell {.\DoWork.ps1 -target "foo" -tempdrive D:\}
When you use a ScriptBlock as an argument to powershell.exe, variables inside it aren't evaluated until after the new session starts. $targetPath has not been set in the child PowerShell process called by Workmanager.ps1, so it has no value. This is expected behavior for a ScriptBlock in general; it behaves this way in other contexts too.
The solution is mentioned in the help text for powershell -?:
[-Command { - | <script-block> [-args <arg-array>] <========== THIS GUY
| <string> [<CommandParameters>] } ]
You must provide the -args parameter, whose values are passed to the ScriptBlock on execution (separate multiple arguments with a ,). Arguments are passed positionally and must be referenced as though you were manually processing the arguments to a function, using the $args array. For example:
$name = 'Bender'
& powershell { Write-Output "Hello, $($args[0])" } -args $name
However, especially with more complicated ScriptBlock bodies, having to remember which index of $args[i] contains the value you want at a given time is a pain in the butt. Luckily, we can use a little trick with defining parameters within the ScriptBlock to help:
$name = 'Bender'
& powershell { param($name) Write-Output "Hello, $name" } -args $name
This will print Hello, Bender as expected.
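Applied to the Workmanager.ps1 call from the question, that could look something like this (a sketch reusing the question's script and drive names):
$targetPath = 'M:\target'
echo "target path: $targetPath"
# Declare parameters inside the script block and hand the values over via -args
& powershell {
    param($target, $tempdrive)
    .\DoWork.ps1 -target $target -tempdrive $tempdrive
} -args $targetPath, 'D:\'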
Some additional pointers:
The ScriptBlock can be multiline, as though you were defining a function. The examples above are single-line only for simplicity.
A ScriptBlock is just an unnamed function, which is why defining parameters and referencing arguments within one works the same way.
To exemplify this behavior outside of powershell.exe -Command, Invoke-Command requires you to pass variables to its ScriptBlock in a similar fashion. Note, however, that that answer uses an already-defined function body as the ScriptBlock (which is totally valid to do).
You don't need to use Start-Process here (start is its alias), at least as demonstrated in your example. You can simply use the call operator & unless you need to do something more complex than "run the program and wait for it to finish". See this answer of mine for more information.
If you opt to pass a string to powershell.exe instead, you don't need to provide arguments, and your variables get rendered in the current PowerShell process. However, so will any other unescaped variables that you intend to be set within the child process, so be careful with this approach (see the sketch after this list). Personally, I prefer using a ScriptBlock regardless, and just deal with the extra parameter definition and arguments.
Using the call operator & is optional when you are not executing a path rendered as a string. It can be omitted in the examples above, but is more useful like so:
& "C:\The\Program Path\Contains\spaces.exe"
& $programPathAsAVariable
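For completeness, the string-based variant mentioned in the list above might look like this for the question's call (a sketch; the double-quoted string is expanded before the child process starts):
$targetPath = 'M:\target'
# $targetPath is rendered by the *current* session; the inner quotes keep the value intact
& powershell -Command ".\DoWork.ps1 -target '$targetPath' -tempdrive 'D:\'"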

Is there a way to create an alias to a cmdlet in a way that it only runs if arguments are passed to the alias?

I am trying to create an alias (named which) of the Get-Command cmdlet in a way that it doesn't run if I'm not sending any arguments (because if it's run without arguments it outputs all the commands that are available).
I know this can be done using a function but I would like to keep the tab completion functionality without having to write a sizeable function that is to be placed into my $PROFILE.
In short, I only want the alias to work if it is being passed arguments.
You can't do it with an alias, because PowerShell aliases can only refer to another command name or path, and can therefore neither include arguments nor custom logic.
Therefore you do need a function, but it can be a short and simple one:
function which { if ($args.count) { Get-Command @args } else { Throw "Missing command name." } }
Note that while passing -? for showing Get-Command's help does work, tab completion of arguments does not.
In order to get tab completion as well, you'll need to write a wrapper (proxy) function or at least replicate Get-Command's parameter declarations - which then does make the function definition sizable.
If the concern is just the size of the $PROFILE file itself, you can write a proxy script instead - which.ps1 - which you can invoke with just which as well, assuming you place it in one of the directories listed in $env:Path[1]; see next section.
Defining a wrapper (proxy) script or function:
Defining a wrapper (proxy) function or script is a nontrivial undertaking, but allows you to implement a robust wrapper that supports tab completion and even forwarding to the original command's help.
Note:
Bug alert: As zett42 points out, as of PowerShell [Core] 7.1, System.Management.Automation.ProxyCommand.Create neglects to include dynamic parameters if the target command is an (advanced) function or script; however, compiled cmdlets are not affected; see GitHub issue #4792 and this answer for a workaround.
For simplicity, the following creates a wrapper script, which.ps1, and saves it in the current directory. As stated, if you place it in one of the directories listed in $env:PATH, you'll be able to invoke it as just which.
The code below can easily be adapted to create a wrapper function instead: simply take the contents of the $wrapperCmdSource variable below and enclose it in function which { ... }.
As of PowerShell Core 7.0.0-preview.5, there are some problems with the auto-generated code, which may or may not affect you; they will be fixed at some point; to learn more and to learn how to manually correct them, see GitHub issue #10863.
# Create the wrapper scaffolding as source code (outputs a single [string])
$wrapperCmdSource =
[System.Management.Automation.ProxyCommand]::Create((Get-Command Get-Command))
# Write the auto-generated source code to a script file
$wrapperCmdSource > which.ps1
Note:
Even though System.Management.Automation.ProxyCommand.Create requires a System.Management.Automation.CommandMetadata instance to identify the target command, the System.Management.Automation.CommandInfo instances output by Get-Command can be used as-is.
Re comment-based help: By default, the proxy function simply forwards to the original cmdlet's help; however, you can optionally pass a string to serve as the comment-based help as the 2nd argument.
By using [System.Management.Automation.ProxyCommand]::GetHelpComments() in combination with output from Get-Help, you could start with a copy of the original command's help and modify it:
[System.Management.Automation.ProxyCommand]::GetHelpComments((Get-Help Get-Command))
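If you'd rather end up with a wrapper function than a script, as mentioned above, one way (a sketch) is to wrap the generated source in a function definition yourself:
# Wrap the auto-generated source in a function definition (e.g. for pasting into $PROFILE)
"function which {`n$wrapperCmdSource`n}" > which-function.ps1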
You now have a fully functional which.ps1 wrapper script that behaves like Get-Command itself.
You can invoke it as follows:
./which # Same as: Get-Command; tab completion of parameters supported.
./which -? # Shows Get-Command's help.
You can now edit the script file to perform the desired customization.
Note: The auto-generated source code contains a lot of boilerplate code; however, typically only one or two places need tweaking to implement the custom functionality.
Specifically, place the following command as the first statement inside the begin { ... } block:
if (-not $MyInvocation.ExpectingInput -and -not ($Name -or $CommandType -or $Module -or $FullyQualifiedModule)) {
    Throw "Missing command name or filter."
}
This causes the script to throw an error if the caller didn't provide some way of targeting a specific command or group of commands, either by direct argument or via the pipeline.
If you invoke the modified script without arguments now, you should see the desired error:
PS> ./which.ps1
Missing command name or filter.
...
Other common types of customizations are:
Removing parameters from the wrapper, by simply removing the parameter declaration.
Adding additional parameters to the invocation of the wrapped command, by modifying the following line in the begin block:
# Add parameters, as needed.
$scriptCmd = { & $wrappedCmd @PSBoundParameters }
Preprocessing pipeline input before passing it to the wrapped command, by customizing the process block and replacing $_ with your preprocessed input in the following line:
# Replace $_ with a preprocessed version of it, as needed.
$steppablePipeline.Process($_)
For an example of a complete implementation of a proxy function, see this answer.
[1] Caveat for Linux users: since the Linux file-system is case-sensitive, invocation of your script won't work case-insensitively, the way commands normally work in PowerShell. E.g., if your script file name is Get-Foo.ps1, only Get-Foo - using the exact same case - will work, not also get-foo, for instance.

Shorter execution of powershell script

Say I have a script that I want to execute with a single short command. How do I do it?
Like, say I have a powershell script saved at E:\Fldr\scrpt.ps1.
Now, if I normally execute that script in PowerShell ISE, I have to use:
& "E:\Fldr\scrpt.ps1"
and the scrpt.ps1 gets executed.
But what I want is this: when I type a single word, say "exeScrpt", instead of & "E:\Fldr\scrpt.ps1", I want scrpt.ps1 to get executed.
Is there a way to do this?
Thank you for checking in.
You can wrap your call to the script in a function:
function Invoke-Script
{
    E:\Fldr\scrpt.ps1
}
Then you can run your script by executing a single command anywhere after the definition:
Invoke-Script
Note that it is good practice to name your functions according to the Verb-Noun cmdlet naming standard, so something like Invoke-Script instead of exeScrpt. If you really want a single word as the name, then you can additionally create an alias for your function:
New-Alias -Name exeScrpt -Value Invoke-Script
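If you want the function and the alias available in every session, one option (a sketch, assuming your $PROFILE file already exists) is to append both definitions to it:
Add-Content -Path $PROFILE -Value @'

function Invoke-Script { E:\Fldr\scrpt.ps1 }
New-Alias -Name exeScrpt -Value Invoke-Script
'@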

Referencing text after script is called within PS1 Script

Let's take the PowerShell statement below as an example:
powershell.exe c:\temp\windowsbroker.ps1 IIS
Is it possible to have windowsbroker.ps1 check for that IIS string and, if it's present, run a specific install script? The broker script would be intended to install different applications depending on what string follows it when it is called.
This may seem like an odd question, but I've been using CloudFormation to spin up application environments and I'm specifying an "ApplicationStack" parameter that will be referenced at the time when the powershell script is run so it knows which script to run to install the correct application during bootup.
What you're trying to do is called argument or parameter handling. In its simplest form PowerShell provides all arguments to a script in the automatic variable $args. That would allow you to check for an argument IIS like this:
if ($args -contains 'iis') {
    # do something
}
or like this if you want the check to be case-sensitive (which I wouldn't recommend, since Windows and PowerShell usually aren't):
if ($args -ccontains 'IIS') {
    # do something
}
However, since apparently you want to use the argument as a switch to trigger specific behavior of your script, there are better, more sophisticated ways of doing this. You could add a Param() section at the top of your script and check if the parameter was present in the arguments like this (for a list of things to install):
Param(
    [Parameter()]
    [string[]]$Install
)

$Install | ForEach-Object {
    switch ($_) {
        'IIS' {
            # do something
        }
        ...
    }
}
or like this (for a single option):
Param(
    [switch]$IIS
)

if ($IIS.IsPresent) {
    # do something
}
You'd run the script like this:
powershell "c:\temp\windowsbroker.ps1" -Install "IIS",...
or like this respectively:
powershell "c:\temp\windowsbroker.ps1" -IIS
Usually I'd prefer switches over parameters with array arguments (unless you have a rather extensive list of options), because with the latter you have to worry about the spelling of the array elements, whereas with switches you get a built-in spell check.
Using a Param() section will also automatically add a short usage description to your script:
PS C:\temp> Get-Help windowsbroker.ps1
windowsbroker.ps1 [-IIS]
You can further enhance this online help to your script via comment-based help.
Using parameters has a lot of other advantages on top of that (even though they probably aren't of that much use in your scenario). You can do parameter validation, make parameters mandatory, define default values, read values from the pipeline, make parameters depend on other parameters via parameter sets, and so on. See here and here for more information.
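To give a flavor of a few of those features combined, here is a sketch (the application names and messages are made up, not from the original answer):
Param(
    [Parameter(Mandatory, ValueFromPipeline)]
    [ValidateSet('IIS', 'SQL', 'Tomcat')]   # hypothetical application names
    [string[]]$Install
)

process {
    foreach ($app in $Install) {
        "Installing $app ..."   # placeholder for the real install logic
    }
}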
Yes, these are called positional parameters. You declare them at the beginning of your script:
Param(
    [string]$appToInstall
)
You could then write your script as follows:
switch ($appToInstall) {
    "IIS" {"Install IIS here"}
}
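With that in place, the string after the script path binds positionally to $appToInstall, so the call from the question works as expected (when invoked from a PowerShell session, the # starts a comment):
powershell.exe c:\temp\windowsbroker.ps1 IIS   # -> "Install IIS here"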