Dot-sourcing a self-elevate script - powershell

I have a self-elevate snippet which is quite wordy, so instead of duplicating it at the top of every script that needs to be run as admin, I decided to move it into a separate .ps1:
function Switch-ToAdmin {
    # Self-elevate the script if required
    if (-not ([Security.Principal.WindowsPrincipal] [Security.Principal.WindowsIdentity]::GetCurrent()).IsInRole([Security.Principal.WindowsBuiltInRole] 'Administrator')) {
        if ([int](Get-CimInstance -Class Win32_OperatingSystem | Select-Object -ExpandProperty BuildNumber) -ge 6000) {
            $Cmd = @(
                "-Command Set-Location `"$(Get-Location)`"; & `"$PSCommandPath`""
                "-$($PSBoundParameters.Keys)"
            )
            $ProcArgs = @{
                FilePath     = 'PowerShell.exe'
                Verb         = 'RunAs'
                ArgumentList = $Cmd
            }
            Start-Process @ProcArgs
            Exit
        }
    }
}
So for every script that needs elevation I'd prepend
. "$PSScriptRoot\self-elevate.ps1"
Switch-ToAdmin
# rest of script
Doing the above successfully triggers the UAC prompt, but the rest of the script doesn't get executed.
Is this sort of thing disallowed?

Darin and iRon have provided the crucial pointers:
Darin points out that the automatic $PSCommandPath variable in your Switch-ToAdmin function does not contain the full path of the script from which the function is called, but that of the script in which the function is defined, even if that script's definitions are loaded directly into the scope of your main script via ., the dot-sourcing operator.
The same applies analogously to the automatic $PSScriptRoot variable, which reflects the defining script's full directory path.
Also, more generally, the automatic $PSBoundParameters variable inside a function reflects that function's bound parameters, not its enclosing script's.
iRon points out that the Get-PSCallStack cmdlet can be used to get information about a script's callers, starting at index 1; the first object returned - index 0, when Get-PSCallStack output is captured in an array - represents the current command. Index 1 therefore refers to the immediate caller, which, from the perspective of your dot-sourced script, is your main script.
Therefore:
Replace $PSCommandPath with $MyInvocation.PSCommandPath, via the automatic $MyInvocation variable. $MyInvocation.PSCommandPath truly reflects the caller's full script path, irrespective of where the called function was defined.
Alternatively, use (Get-PSCallStack)[1].ScriptName, which, despite what the property name suggests, returns the full path of the calling script too.
Replace $PSBoundParameters with (Get-PSCallStack)[1].InvocationInfo.BoundParameters, as shown in the sketch after this list.
Note that there's also (Get-PSCallStack)[1].Arguments, but it seems to contain a single string only, containing a representation of all arguments that is only semi-structured and therefore doesn't allow robust reconstruction of the individual parameters.
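Putting these pointers together, here is a minimal sketch of a corrected Switch-ToAdmin; for simplicity it assumes that all of the caller's bound parameters are [switch] parameters (robust pass-through is harder - see the aside below):
function Switch-ToAdmin {
    # Self-elevate the *calling* script if required.
    if (-not ([Security.Principal.WindowsPrincipal] [Security.Principal.WindowsIdentity]::GetCurrent()).IsInRole([Security.Principal.WindowsBuiltInRole] 'Administrator')) {
        # The caller's full script path - NOT $PSCommandPath, which would refer to self-elevate.ps1.
        $callerPath = $MyInvocation.PSCommandPath
        # The caller's bound parameters - NOT this function's own $PSBoundParameters.
        $callerParams = (Get-PSCallStack)[1].InvocationInfo.BoundParameters
        # Naive pass-through that assumes all bound parameters are [switch] parameters.
        $passThruArgs = @($callerParams.Keys | ForEach-Object { "-$_" }) -join ' '
        $procArgs = @{
            FilePath     = 'PowerShell.exe'
            Verb         = 'RunAs'
            ArgumentList = "-Command Set-Location `"$(Get-Location)`"; & `"$callerPath`" $passThruArgs"
        }
        Start-Process @procArgs
        exit
    }
}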
As an aside:
Even if $PSBoundParameters contained the intended information, "-$($PSBoundParameters.Keys)" would only succeed in passing the bound parameters through if your script defines only one parameter, if that parameter is a [switch] parameter, and if it is actually passed in every invocation.
Passing arguments through robustly in this context is hard to do, and has inherent limitations - see this answer for a - complex - attempt to make it work as well as possible.

Related

PowerShell, auto load functions from internet on demand

It was pointed out to me (in PowerShell, replicate bash parallel ping) that I can load a function from the internet as follows:
iex (irm https://raw.githubusercontent.com/proxb/AsyncFunctions/master/Test-ConnectionAsync.ps1)
The referenced URL, Test-ConnectionAsync.ps1, contains two functions: Ping-Subnet and Test-ConnectionAsync.
This made me wonder if I could then define bypass functions in my personal module - dummy functions that will be permanently overridden as soon as they are invoked - e.g.:
function Ping-Subnet <mimic the switches of the function to be loaded> {
    if <function is not already loaded from internet> {
        iex (irm https://raw.githubusercontent.com/proxb/AsyncFunctions/master/Test-ConnectionAsync.ps1)
    }
    # Now, somehow, permanently overwrite Ping-Subnet to be the function that loaded from the URL
    Ping-Subnet <pass the switches that we mimicked to the required function that we have just loaded>
}
This would very simply allow me to reference a number of useful scripts directly from my module, but without having to load them all from the internet when the module itself loads (i.e., the functions are only loaded on demand, when I invoke them, and often I will never invoke them unless I need them).
You could use the Parser to find the functions in the remote script and load them into your scope. This will not be a self-updating function, but it should be safer than what you're trying to accomplish.
using namespace System.Management.Automation.Language
function Load-Function {
    [cmdletbinding()]
    param(
        [parameter(Mandatory, ValueFromPipeline)]
        [uri] $URI
    )

    process {
        try {
            $funcs = Invoke-RestMethod $URI
            $ast = [Parser]::ParseInput($funcs, [ref] $null, [ref] $null)
            foreach($func in $ast.FindAll({ $args[0] -is [FunctionDefinitionAst] }, $true)) {
                if($func.Name -in (Get-Command -CommandType Function).Name) {
                    Write-Warning "$($func.Name) is already loaded! Skipping"
                    continue
                }
                New-Item -Name "script:$($func.Name)" -Path function: -Value $func.Body.GetScriptBlock()
            }
        }
        catch {
            Write-Warning $_.Exception.Message
        }
    }
}
Load-Function https://raw.githubusercontent.com/proxb/AsyncFunctions/master/Test-ConnectionAsync.ps1
Ping-Subnet # => now is available in your current session.
function Ping-Subnet {
    # Download the remote script, stripping a leading UTF-8 BOM if present.
    $toImport = (IRM "https://raw.githubusercontent.com/proxb/AsyncFunctions/master/Test-ConnectionAsync.ps1").
        Replace([Text.Encoding]::UTF8.GetString((239,187,191)),"")
    # Create a dynamic module from the script (NMO is the built-in alias of New-Module),
    # which auto-exports its functions session-globally, replacing this stub.
    NMO([ScriptBlock]::Create($toImport))|Out-Null
    # Re-invoke the original command line, which now targets the downloaded function.
    $MyInvocation.Line|IEX
}
function Test-ConnectionAsync {
    # Same technique as above, for the second function in the remote script.
    $toImport = (IRM "https://raw.githubusercontent.com/proxb/AsyncFunctions/master/Test-ConnectionAsync.ps1").
        Replace([Text.Encoding]::UTF8.GetString((239,187,191)),"")
    NMO([ScriptBlock]::Create($toImport))|Out-Null
    $MyInvocation.Line|IEX
}
Ping-Subnet -Result Success
Test-ConnectionAsync -Computername $env:COMPUTERNAME
Result:
Computername Result
------------ ------
192.168.1.1 Success
192.168.1.2 Success
192.168.1.146 Success
Computername IPAddress Result
------------ --------- ------
HOME-PC fe80::123:1234:ABCD:EF12 Success
Yes, it should work. Calling Test-ConnectionAsync.ps1 from within a function will create the functions defined within it in the wrapping function's scope. You will be able to call any wrapped functions until the function's scope ends.
If you name the wrapper and wrapped functions differently, you can check whether the function has been declared with something like...
Otherwise, you need to get more creative.
This said, PROCEED WITH CAUTION. Remote code execution like this is fraught with security issues, especially in the way we're talking about it here, i.e., with no validation of Test-ConnectionAsync.ps1.
Fors1k's answer deserves the credit for coming up with the clever fundamentals of the approach:
Download and execute the remote script's content in a dynamic module created with New-Module (whose built-in alias is nmo), which causes the script's functions to be auto-exported and to become available session-globally.[1]
Note that dynamic modules aren't easy to discover, because they're not shown in Get-Module's output; however, you can discover them indirectly, via the .Source property of the command-info objects output by Get-Command:
Get-Command | Where Source -like __DynamicModule_*
That the downloaded functions become available session-globally may be undesired if you're trying to use the technique inside a script that shouldn't affect the session's global state - see the bottom section for a solution.
Then re-invoke the function, under the assumption that the original stub function has been replaced with the downloaded version of the same name, passing the received arguments through.
While Fors1k's solution will typically work, here is a streamlined, robust alternative that prevents potential, inadvertent re-execution of code:
function Ping-Subnet {
    $uri = 'https://raw.githubusercontent.com/proxb/AsyncFunctions/master/Test-ConnectionAsync.ps1'
    # Define and session-globally import a dynamic module based on the remote
    # script's content.
    # Any functions defined in the script are automatically exported.
    # However, unlike with persisted modules, *aliases* are *not* exported by
    # default, which the appended Export-ModuleMember call below compensates for.
    # If desired, also add -Variable * in order to export variables too.
    # Conversely, if you only care about functions, remove the Export-ModuleMember call.
    $dynMod = New-Module ([scriptblock]::Create(
        (Invoke-RestMethod $uri) + "`nExport-ModuleMember -Function * -Alias *")
    )
    # If this stub function shadows the newly defined function in the dynamic
    # module, remove it first, so that re-invocation by name uses the new function.
    # Note: This happens if this stub function is run in a child scope, such as
    # in a (non-dot-sourced) script rather than in the global scope.
    # If run in the global scope, curiously, the stub function seemingly
    # disappears from view right away - not even Get-Command -All shows it later.
    $myName = $MyInvocation.MyCommand.Name
    if ((Get-Command -Type Function $myName).ModuleName -ne $dynMod.Name) {
        Remove-Item -LiteralPath "function:$myName"
    }
    # Now invoke the newly defined function of the same name, passing the arguments
    # through.
    & $myName @args
}
Specifically, this implementation ensures:
That aliases defined in the remote script are exported as well (just remove + "`nExport-ModuleMember -Function * -Alias *" from the code above if that is undesired).
That the re-invocation robustly targets the new, module-defined implementation of the function - even if the stub function runs in a child scope, such as in a (non-dot-sourced) script.
When run in a child scope, $MyInvocation.Line|IEX (iex is a built-in alias of the Invoke-Expression cmdlet) would result in an infinite loop, because the stub function itself is still in effect at that time.
That all received arguments are passed through on re-invocation without re-evaluation.
Using the built-in magic of splatting the automatic $args variable (@args) passes only the received, already expanded arguments through, supporting both named and positional arguments.[2]
$MyInvocation.Line|IEX has two potential problems:
If the invoking command line contained multiple commands, they are all repeated.
You can solve this particular problem by substituting (Get-PSCallStack)[1].Position.Text for $MyInvocation.Line, but that still wouldn't address the next problem.
Both $MyInvocation.Line and (Get-PSCallStack)[1].Position.Text contain the arguments that were passed in unexpanded (unevaluated) form, which causes their re-evaluation by Invoke-Expression. The perils of that are that, at least hypothetically, this re-evaluation could involve lengthy commands whose output served as arguments - or, worse, commands with side effects that cannot or should not be repeated.
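To illustrate the second problem with a hypothetical example (Some-Stub and Get-Next are made-up names):
# A side-effecting command whose output serves as an argument:
$script:count = 0
function Get-Next { ++$script:count; $script:count }

# If a stub were invoked as:
#   Some-Stub -Value (Get-Next)
# then re-invoking via $MyInvocation.Line | Invoke-Expression would run
# Get-Next a *second* time ($script:count would end up at 2), whereas
# & $myName @args passes the already-evaluated value, 1, through as-is.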
Scoping the technique to a given local script:
That the downloaded functions become available session-globally may be undesired if you're trying to use the technique inside a script that shouldn't affect the session's global state; that is, you may want the functions exported via the dynamic module to disappear when the script exits.
This requires two extra steps:
Piping the dynamic module to Import-Module, which is the prerequisite for being able to unload it before exiting with Remove-Module
Calling Remove-Module with the dynamic module before exiting in order to unload it.
function Ping-Subnet {
    $uri = 'https://raw.githubusercontent.com/proxb/AsyncFunctions/master/Test-ConnectionAsync.ps1'
    # Save the module in a script-level variable, and pipe it to Import-Module
    # so that it can be removed before the script exits.
    $script:dynMod = New-Module ([scriptblock]::Create(
        (Invoke-RestMethod $uri) + "`nExport-ModuleMember -Function * -Alias *")
    ) | Import-Module -PassThru
    # If this stub function shadows the newly defined function in the dynamic
    # module, remove it first, so that re-invocation by name uses the new function.
    # Note: This happens if this stub function is run in a child scope, such as
    # in a (non-dot-sourced) script rather than in the global scope.
    # If run in the global scope, curiously, the stub function seemingly
    # disappears from view right away - not even Get-Command -All shows it later.
    $myName = $MyInvocation.MyCommand.Name
    if ((Get-Command -Type Function $myName).ModuleName -ne $dynMod.Name) {
        Remove-Item -LiteralPath "function:$myName"
    }
    # Now invoke the newly defined function of the same name, passing the arguments
    # through.
    & $myName @args
}
# Sample commands to perform in the script.
Ping-Subnet -?
Get-Command Ping-Subnet, Test-ConnectionAsync | Format-Table
# Before exiting, remove (unload) the dynamic module.
$dynMod | Remove-Module
[1] This assumes that the New-Module call itself is made outside of a module; if it is made inside a module, at least that module's commands see the auto-exported functions; if that module uses implicit exporting behavior (which is rare and not advisable), the auto-exported functions from the dynamic module would be included in that module's exports and therefore again become available session-globally.
[2] This magic has one limitation, which, however, will only rarely surface: [switch] parameters with a directly attached Boolean argument aren't supported (e.g., -CaseSensitive:$true) - see this answer.

start powershell won't accept variable as parameter

Issue
The called PowerShell script will accept parameters, but not all of them:
Current Set-Up and code:
I have a common folder where two .ps1 scripts are located:
DoWork.ps1
Workmanager.ps1
Workmanager.ps1 calls DoWork.ps1:
$targetPath="M:\target"
echo "target path: $targetPath"
start powershell {.\DoWork.ps1 -target $targetPath -tempdrive D:\}
output (as expected):
target path: M:\target
DoWork.ps1 contains some start code:
param
(
    [string]$tempdrive,
    [string]$target,
    [int] $threads = 8,
    [int] $queuelength = -1
)
echo "variables:"
echo "temp drive: $tempdrive"
echo "target path: $target"
Unexpectedly, $target is not being assigned. Previously I had the variable named $targetpath, which did not work either.
variables:
temp drive: D:\
target path:
Findings
It appears that the issue lies in Workmanager.ps1. Specifying the parameter as a fixed string rather than as a variable will load the parameter. Any solution for this?
start powershell {.\DoWork.ps1 -target "foo" -tempdrive D:\}
When you use a ScriptBlock as an argument to powershell.exe, variables aren't going to be evaluated until after the new session starts. $targetPath has not been set in the child PowerShell process called by Workmanager.ps1 and so it has no value. This is actually an expected behavior of a ScriptBlock in general and behaves this way in other contexts too.
The solution is mentioned in the help text for powershell -?:
[-Command { - | <script-block> [-args <arg-array>]     <========== THIS GUY
              | <string> [<CommandParameters>] } ]
You must provide the -args parameter which will be passed to the ScriptBlock on execution (separate multiple arguments with a ,). Passed arguments are passed positionally, and must be referenced as though you were processing the arguments to a function manually using the $args array. For example:
$name = 'Bender'
& powershell { Write-Output "Hello, $($args[0])" } -args $name
However, especially with more complicated ScriptBlock bodies, having to remember which index of $args[i] contains the value you want at a given time is a pain in the butt. Luckily, we can use a little trick with defining parameters within the ScriptBlock to help:
$name = 'Bender'
& powershell { param($name) Write-Output "Hello, $name" } -args $name
This will print Hello, Bender as expected.
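Applied to the question's scenario, the Workmanager.ps1 call could therefore look something like the following sketch (assuming a separate process is really needed; otherwise, see the pointers below about the & call operator):
$targetPath = 'M:\target'
& powershell {
    param($target, $tempdrive)
    .\DoWork.ps1 -target $target -tempdrive $tempdrive
} -args $targetPath, 'D:\'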
Some additional pointers:
The ScriptBlock can be multiline, as though you were defining a function. The examples above are single-line due to their simplicity.
A ScriptBlock is just an unnamed function, which is why defining parameters and referencing arguments within one works the same way.
To exemplify this behavior outside of powershell.exe -Command, Invoke-Command requires you to pass variables to its ScriptBlock in a similar fashion. Note, however, that that answer uses an already-defined function body as the ScriptBlock (which is totally valid to do).
You don't need to use Start-Process here (start is its alias), at least as demonstrated in your example. You can simply use the call operator & unless you need to do something more complex than "run the program and wait for it to finish". See this answer of mine for more information.
If you opt to pass a string to powershell.exe instead, you don't need to provide arguments and your variables will get rendered in the current PowerShell process. However, so will any other unescaped variables that might be intended to set within the child process, so be careful with this approach. Personally, I prefer using ScriptBlock regardless, and just deal with the extra parameter definition and arguments.
Using the call operator & is optional when you are not executing a path rendered as a string. It can be omitted in the examples above, but it is more useful like so:
& "C:\The\Program Path\Contains\spaces.exe"
& $programPathAsAVariable

Is it possible to have a scriptblock evaluated inside a string?

I would like for the resulting value in $s to be "now is then for today"
PS H:\> $s = "now is $({if (1 -eq 1){'then'}}) for today"
PS H:\> $s
now is if (1 -eq 1){'then'} for today
It's definitely possible, and pretty easy with subexpressions
You were close, just need to remove the outer set of curly braces
$s = "now is $(if (1 -eq 1){'then'}) for today"
$s
Delay-bind script-block arguments are an implicit feature that:
only works with parameters that are designed to take pipeline input,
of any type except the following, in which case regular parameter binding happens[1]:
[scriptblock]
[object] ([psobject], however, does work, and therefore [pscustomobject] too)
(no type specified), which is effectively the same as [object]
whether such parameters accept pipeline input by value (ValueFromPipeline) or by property name (ValueFromPipelineByPropertyName) is irrelevant.
enables per-input-object transformations via a script block passed instead of a type-appropriate argument; the script block is evaluated for each pipeline object, which is accessible inside the script block as $_, as usual, and the script block's output - which is assumed to be type-appropriate for the parameter - is used as the argument.
Since such ad-hoc script blocks by definition do not match the type of the parameter you're targeting, you must always use the parameter name explicitly when passing them.
Delay-bind script blocks unconditionally provide access to the pipeline input objects, even if the parameter would ordinarily not be bound by a given pipeline object, if it is defined as ValueFromPipelineByPropertyName and the object lacks a property by that name.
This enables techniques such as the following call to Rename-Item, where the pipeline input from Get-Item is - as usual - bound to the -LiteralPath parameter, but passing a script block to -NewName - which would ordinarily only bind to input objects with a .NewName property - enables access to the same pipeline object and thus deriving the destination filename from the input filename:
Get-Item file | Rename-Item -NewName { $_.Name + '1' } # renames 'file' to 'file1'; input binds to both -LiteralPath (implicitly) and the -NewName script block.
Note: Unlike script blocks passed to ForEach-Object or Where-Object, for example, delay-bind script blocks run in a child variable scope[2], which means that you cannot directly modify the caller's variables, such as incrementing a counter across input objects.
As a workaround, use a [ref]-typed variable declared in the caller's scope and access its .Value property inside the script block - see this answer for an example.
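As a concrete illustration, here is a minimal sketch of that workaround (with -WhatIf, so that nothing is actually renamed):
$counter = 0
$counterRef = [ref] $counter  # a [ref] wrapper pointing at the caller's variable
Get-ChildItem *.txt | Rename-Item -WhatIf -NewName {
    # The delay-bind script block runs in a child scope, so assigning to
    # $counter directly wouldn't work - but going through .Value does.
    $counterRef.Value++
    '{0:D3}_{1}' -f $counterRef.Value, $_.Name
}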

Wrapper function for cmdlet - pass remaining parameters

I'm writing a function that wraps a cmdlet using ValueFromRemainingArguments (as discussed here).
The following simple code demonstrates the problem:
works
function Test-WrapperArgs {
    Set-Location @args
}
Test-WrapperArgs -Path C:\
does not work
function Test-WrapperUnbound {
    Param(
        [Parameter(ValueFromRemainingArguments)] $UnboundArgs
    )
    Set-Location @UnboundArgs
}
Test-WrapperUnbound -Path C:\
Set-Location: F:\cygwin\home\thorsten\.config\powershell\test.ps1:69
Line |
  69 |      Set-Location @UnboundArgs
     |      ~~~~~~~~~~~~~~~~~~~~~~~~~
     | A positional parameter cannot be found that accepts argument 'C:\'.
I tried getting to the issue with GetType and EchoArgs from the PowerShell Community Extensions, to no avail. At the moment I'm almost considering a bug (maybe related to this ticket?).
The best solution for an advanced function (one that uses a [CmdletBinding()] attribute and/or a [Parameter()] attribute) is to scaffold a proxy (wrapper) function via the PowerShell SDK, as shown in this answer.
This involves essentially duplicating the target command's parameter declarations (albeit in an automatic, but static fashion).
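For instance, a sketch of generating such scaffolding for Set-Location (the output is the source code of a wrapper function body, which you can then customize):
# Generate proxy-function source code for Set-Location via the PowerShell SDK:
$cmd  = Get-Command Set-Location
$meta = [System.Management.Automation.CommandMetadata]::new($cmd)
[System.Management.Automation.ProxyCommand]::Create($meta)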
If you do not want to use this approach, your only option is to perform your own parsing of the $UnboundArgs array (technically, it is an instance of [System.Collections.Generic.List[object]]), which is cumbersome, however, and not foolproof:
function Test-WrapperUnbound {
    Param(
        [Parameter(ValueFromRemainingArguments)] $UnboundArgs
    )

    # (Incompletely) emulate PowerShell's own argument parsing by
    # building a hashtable of parameter-argument pairs to pass through
    # to Set-Location via splatting.
    $htPassThruArgs = @{}; $key = $null
    switch -regex ($UnboundArgs) {
        '^-(.+)' { if ($key) { $htPassThruArgs[$key] = $true } $key = $Matches[1] }
        default  { $htPassThruArgs[$key] = $_; $key = $null }
    }
    if ($key) { $htPassThruArgs[$key] = $true } # trailing switch param.

    # Pass the resulting hashtable via splatting.
    Set-Location @htPassThruArgs
}
Note:
This isn't foolproof in that your function won't be able to distinguish between an actual parameter name (e.g., -Path) and a string literal that happens to look like a parameter name (e.g., '-Path').
Also, unlike with the scaffolding-based proxy-function approach mentioned at the top, you won't get tab-completion for any pass-through parameters and the pass-through parameters won't be listed with -? / Get-Help / Get-Command -Syntax.
If you don't mind having neither tab-completion nor syntax help, and/or your wrapper function must support pass-through to multiple or not-known-in-advance target commands, using a simple (non-advanced) function with @args (as in your working example; see also below) is the simplest option, assuming your function doesn't itself need to support common parameters (which requires an advanced function).
Using a simple function also implies that common parameters are passed through to the wrapped command only (whereas an advanced function would interpret them as meant for itself, though their effect usually propagates to calls inside the function; with a common parameter such as -OutVariable, however, the distinction matters).
As for what you tried:
While PowerShell does support splatting via arrays (or array-like collections such as [System.Collections.Generic.List[object]]) in principle, this only works as intended if all elements are to be passed as positional arguments and/or if the target command is an external program (about whose parameter structure PowerShell knows nothing, and always passes arguments as a list/array of tokens).
In order to pass arguments with named parameters to other PowerShell commands, you must use hashtable-based splatting, where each entry's key identifies the target parameter and the value the parameter value (argument).
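A quick demonstration of the difference, using a hypothetical Show function:
function Show { param($a, $b) "a=$a b=$b" }

# Array splatting binds positionally:
$arr = 'one', 'two'
Show @arr    # -> a=one b=two

# Hashtable splatting binds by parameter name:
$ht = @{ b = 'two'; a = 'one' }
Show @ht     # -> a=one b=two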
Even though the automatic $args variable is technically also an array ([object[]]), PowerShell has built-in magic that allows splatting with @args to also work with named parameters - this does not work with any custom array or collection.
Note that the automatic $args variable, which collects all arguments for which no parameter was declared - is only available in simple (non-advanced) functions and scripts; advanced functions and scripts - those that use the [CmdletBinding()] attribute and/or [Parameter()] attributes - require that all potential parameters be declared.

How can I prevent variable injection in PowerShell?

I was triggered again by a comment from @Ansgar Wiechers on a recent PowerShell question: DO NOT use Invoke-Expression. It relates to a security question that has been in the back of my mind for a long time and that I need to ask.
The strong statement (with a reference to the Invoke-Expression considered harmful article) suggests that an invocation of a script that can overwrite variables is considered harmful.
Also the PSScriptAnalyzer advises against using Invoke-Expression, see the AvoidUsingInvokeExpression rule.
But I once used a technique myself to update a common variable in a recursive script, one which can actually overwrite a value in any of its parent scopes, and which is as simple as:
([Ref]$ParentVariable).Value = $NewValue
As far as I can determine, a potentially malicious script could use this technique too, to inject variables no matter how it is invoked...
Consider the following "malicious" Inject.ps1 script:
([Ref]$MyValue).Value = 456
([Ref]$MyString).Value = 'Injected string'
([Ref]$MyObject).Value = [PSCustomObject]@{Name = 'Injected'; Value = 'Object'}
My Test.ps1 script:
$MyValue = 123
$MyString = "MyString"
$MyObject = [PSCustomObject]@{Name = 'My'; Value = 'Object'}
.\Inject.ps1
Write-Host $MyValue
Write-Host $MyString
Write-Host $MyObject
Result:
456
Injected string
#{Name=Injected; Value=Object}
As you can see, all three variables in the Test.ps1 scope are overwritten by the Inject.ps1 script. This can also be done using the Invoke-Command cmdlet, and it doesn't even matter whether I set the scope of a variable to Private:
New-Variable -Name MyValue -Value 123 -Scope Private
$MyString = "MyString"
$MyObject = [PSCustomObject]@{Name = 'My'; Value = 'Object'}
Invoke-Command {
    ([Ref]$MyValue).Value = 456
    ([Ref]$MyString).Value = 'Injected string'
    ([Ref]$MyObject).Value = [PSCustomObject]@{Name = 'Injected'; Value = 'Object'}
}
Write-Host $MyValue
Write-Host $MyString
Write-Host $MyObject
Is there a way to completely isolate an invoked script/command from overwriting variables in the current scope?
If not, can this be considered as a security risk for invoking scripts in any way?
The advice against use of Invoke-Expression is primarily about preventing unintended execution of code (code injection).
If you invoke a piece of PowerShell code - whether directly or via Invoke-Expression - it can indeed (possibly maliciously) manipulate parent scopes, including the global scope.
Note that this potential manipulation isn't limited to variables: for instance, functions and aliases can be modified as well.
Caveat: Running unknown code is problematic in two respects:
Primarily for the potential to perform unwanted / destructive actions directly.[1]
Secondarily, for the potential to maliciously modify the caller's state (variables, ...), which is the only aspect the solutions below guard against.
To provide the desired isolation, you have two basic choices:
Run the code in a child process:
By starting another PowerShell instance; e.g. (use powershell instead of pwsh in Windows PowerShell):
pwsh -c { ./someUntrustedScript.ps1 }
By starting a background job; e.g.:
Start-Job { ./someUntrustedScript.ps1 } | Receive-Job -Wait -AutoRemoveJob
Run the code in a separate thread in the same process:
As a thread job, via the Start-ThreadJob cmdlet (ships with PowerShell [Core] 6+; in Windows PowerShell, it can be installed from the PowerShell Gallery with something like Install-Module -Scope CurrentUser ThreadJob); e.g.:
Start-ThreadJob { ./someUntrustedScript.ps1 } | Receive-Job -Wait -AutoRemoveJob
By creating a new runspace via the PowerShell SDK; e.g.:
[powershell]::Create().AddScript('./someUntrustedScript.ps1').Invoke()
Note that you'll have to do extra work to get the output streams other than the success one, notably the error stream's output; also, .Dispose() should be called on the PowerShell instance on completion of the command.
A child process-based solution will be slow and limited in terms of data types you can return (due to serialization / deserialization being involved), but it provides isolation against the invoked code crashing the process.
A thread-based job is much faster, can return any data type, but can crash the entire process.
In all cases you will have to pass any values from the caller that the invoked code needs access to as arguments or, with background jobs and thread jobs, alternatively via the $using: scope specifier.
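For example, a caller's value can be passed into a thread job via $using: like this (sketch):
$limit = 3
# The child thread sees the caller's $limit value via the $using: scope specifier:
Start-ThreadJob { 1..$using:limit } | Receive-Job -Wait -AutoRemoveJob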
js2010 mentions other, less desirable alternatives:
Start-Process (child process-based, with text-only arguments and output)
PowerShell Workflows, which are obsolescent (they weren't ported to PowerShell Core and won't be).
Using Invoke-Command with "loopback remoting" (-ComputerName localhost) is hypothetically also an option, but then you incur the double overhead of a child process and HTTP-based communication; also, your computer must be set up for remoting, and you must run with elevation (as administrator).
[1] A way to mitigate the problem is to limit which commands, statements, types, ... are permitted to be called when the string is evaluated, which can be achieved via the PowerShell SDK in combination with language modes and/or by explicitly constructing an initial session state. See this answer for an example of SDK use with language modes.
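As a minimal sketch of that SDK-plus-language-mode approach (assuming ConstrainedLanguage mode suffices for your scenario):
# Create an initial session state that enforces ConstrainedLanguage mode,
# which, among other things, blocks direct .NET method invocation:
$iss = [initialsessionstate]::CreateDefault()
$iss.LanguageMode = [System.Management.Automation.PSLanguageMode]::ConstrainedLanguage
$ps = [powershell]::Create($iss)
$null = $ps.AddScript('[System.IO.File]::ReadAllText("C:\Windows\win.ini")')
$ps.Invoke()          # the blocked method call surfaces as an error, not file content
$ps.Streams.Error     # inspect the resulting error(s)
$ps.Dispose()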