I'm using SSIS and Powershell to check if a file is locked or not.
I have the below Expression in a variable called 'Cmd':
"-NoProfile -ExecutionPolicy ByPass -Command \"try { [IO.File]::OpenWrite(‘” + #[User::TestFilePath] + “‘).close();0 } catch {999}"
Which evaluates to this:
-NoProfile -ExecutionPolicy ByPass -Command "try { [IO.File]::OpenWrite(‘” + #[User::TestFilePath] + “‘).close();0 } catch {999}
Using the Execute Process Task I then call the Cmd variable above and have Success and Failure constraints after it. The process always reports a Success, even if I open the file in question, rename it, or even delete it.
If I then amend it to the below, the task always fails, even if the file is not open:
"-NoProfile -ExecutionPolicy ByPass -Command \"try { [IO.File]::OpenWrite(‘” + #[User::TestFilePath] + “‘).close();exit 0} catch {exit 999}"
What am I missing?
If it helps anyone, I found a resolution. Instead of calling the PS script via an expression, I just called the actual PS file via a Process Task, with the below in the PS file:
$file = "\\xxxx\xxxxx\xxxxxxxx\test.log"
try { [IO.File]::OpenWrite($file).close();exit 0 } catch { exit 999}
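For anyone wiring this up, here is a hedged sketch of the Execute Process Task settings I'd expect to work with this approach (the UNC path below is a placeholder, not the real one): point Executable at powershell.exe and pass the script via -File, so that the exit 0 / exit 999 inside the .ps1 becomes the process exit code that the Success and Failure constraints can act on:
-NoProfile -ExecutionPolicy Bypass -File "\\server\share\CheckFileLock.ps1"
With the task's SuccessValue left at its default of 0 (and FailTaskIfReturnCodeIsNotSuccessValue set to True), an exit code of 999 sends execution down the Failure constraint.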
function Test-IsAdministrator
{
$Identity = [System.Security.Principal.WindowsIdentity]::GetCurrent()
$Principal = New-Object System.Security.Principal.WindowsPrincipal($Identity)
$Principal.IsInRole([System.Security.Principal.WindowsBuiltInRole]::Administrator)
}
function Test-IsUacEnabled
{
(Get-ItemProperty HKLM:\Software\Microsoft\Windows\CurrentVersion\Policies\System).EnableLua -ne 0
}
if (!(Test-IsAdministrator))
{
if (Test-IsUacEnabled)
{
[string[]]$argList = @('-NoProfile', '-NoExit', '-File', $MyInvocation.MyCommand.Path)
$argList += $MyInvocation.BoundParameters.GetEnumerator() | ForEach-Object {"-$($_.Key)", "$($_.Value)"}
$argList += $MyInvocation.UnboundArguments
Start-Process PowerShell.exe -Verb Runas -WorkingDirectory $pwd -ArgumentList $argList
return
}
else
{
throw "You must be an administrator to run this script."
}
}
If I run the script above, it successfully spawns another PowerShell instance with elevated privileges but the current working directory is lost and automatically set to C:\Windows\System32. Bound Parameters are also lost or incorrectly parsed.
After reading similar questions I learned that when using Start-Process with -Verb RunAs, the -WorkingDirectory argument is only honored if the target executable is a .NET executable. For some reason PowerShell 5 doesn't honor it:
The problem exists at the level of the .NET API that PowerShell uses behind the scenes (see System.Diagnostics.ProcessStartInfo), as of this writing (.NET 6.0.0-preview.4.21253.7).
Quote from this related question:
In practice - and the docs do not mention that - the -WorkingDirectory parameter is not respected if you start a process elevated (with administrative privileges, which is what -Verb RunAs - somewhat obscurely - does): the location defaults to $env:SYSTEMROOT\system32 (typically, C:\Windows\System32).
So the most common solution I've seen involves using -Command instead of -File, i.e.:
Start-Process -FilePath powershell.exe -Verb Runas -ArgumentList '-Command', 'cd C:\ws; & .\script.ps1'
This looks really hack-ish but works. The only problem is I can't manage to get an implementation that can pass both bound and unbound parameters to the script being called via -Command.
I am trying my hardest to find the most robust implementation of self-elevation possible so that I can nicely wrap it into a function (and eventually into a module I'm working on) such as Request-AdminRights which can then be cleanly called immediately in new scripts that require admin privileges and/or escalation. Pasting the same auto-elevation code at the beginning of every script that needs admin rights feels really sloppy.
I'm also wondering whether I might be overthinking things and should just leave elevation at the script level instead of wrapping it in a function.
Any input at all is greatly appreciated.
Note: On 15 Nov 2021 a bug was fixed in the code below in order to make it work properly with advanced scripts - see this answer for details.
The closest you can get to a robust, cross-platform self-elevating script solution that supports:
both positional (unnamed) and named arguments
while preserving type fidelity within the constraints of PowerShell's serialization (see this answer)
preserving the caller's working directory.
On Unix-like platforms only: synchronous, same-window execution with exit-code reporting (via the standard sudo utility).
is the following monstrosity (I certainly wish this were easier):
Note:
For (relative) brevity, I've omitted your Test-IsUacEnabled test, and simplified the test for whether the current session is already elevated to [bool] (net.exe session 2>$null)
You can drop everything between # --- BEGIN: Helper function for self-elevation. and # --- END: Helper function for self-elevation. into any script to make it self-elevating.
If you find yourself in repeated need of self-elevation, in different scripts, you can copy the code into your $PROFILE file or - better suited to wider distribution - convert the dynamic (in-memory) module used below (via New-Module) into a regular persisted module that your scripts can (auto-)load (a sketch follows these notes). With the Ensure-Elevated function available via an auto-loading module, all you need in a given script is to call Ensure-Elevated, without arguments (or with -Verbose for verbose output).
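For instance, a minimal sketch of persisting the function as a regular module (the module name SelfElevation and the profile-relative path are placeholders I've chosen for illustration):
# Create a module folder in one of the locations listed in $env:PSModulePath.
$modDir = Join-Path (Split-Path $PROFILE) 'Modules\SelfElevation'
New-Item -ItemType Directory -Path $modDir -Force | Out-Null
# Put the Ensure-Elevated function definition (the body of the New-Module
# script block below) into SelfElevation.psm1 inside that folder:
#   function Ensure-Elevated { ... }
#   Export-ModuleMember -Function Ensure-Elevated
Any script can then simply call Ensure-Elevated and rely on module auto-loading.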
# Sample script parameter declarations.
# Note: Since there is no [CmdletBinding()] attribute and no [Parameter()] attributes,
# the script also accepts *unbound* arguments.
param(
[object] $First,
[int] $Second,
[array] $Third
)
# --- BEGIN: Helper function for self-elevation.
# Define a dynamic (in-memory) module that exports a single function, Ensure-Elevated.
# Note:
# * In real life you would put this function in a regular, persisted module.
# * Technically, 'Ensure' is not an approved verb, but it seems like the best fit.
$null = New-Module -Name "SelfElevation_$PID" -ScriptBlock {
function Ensure-Elevated {
[CmdletBinding()]
param()
$isWin = $env:OS -eq 'Windows_NT'
# Simply return, if already elevated.
if (($isWin -and (net.exe session 2>$null)) -or (-not $isWin -and 0 -eq (id -u))) {
Write-Verbose "(Now) running as $(("superuser", "admin")[$isWin])."
return
}
# Get the relevant variable values from the calling script's scope.
$scriptPath = $PSCmdlet.GetVariableValue('PSCommandPath')
$scriptBoundParameters = $PSCmdlet.GetVariableValue('PSBoundParameters')
$scriptArgs = $PSCmdlet.GetVariableValue('args')
Write-Verbose ("This script, `"$scriptPath`", requires " + ("superuser privileges, ", "admin privileges, ")[$isWin] + ("re-invoking with sudo...", "re-invoking in a new window with elevation...")[$isWin])
# Note:
# * On Windows, the script invariably runs in a *new window*, and by design we let it run asynchronously, in a stay-open session.
# * On Unix, sudo runs in the *same window, synchronously*, and we return to the calling shell when the script exits.
# * -inputFormat xml -outputFormat xml are NOT used:
# * The use of -encodedArguments *implies* CLIXML serialization of the arguments; -inputFormat xml presumably only relates to *stdin* input.
# * On Unix, the CLIXML output created by -outputFormat xml is not recognized by the calling PowerShell instance and is passed through as text.
# * On Windows, the elevated session's working dir. is set to the same as the caller's (happens by default on Unix, and also in PS Core on Windows - but not in *WinPS*)
# Determine the full path of the PowerShell executable running this session.
# Note: The (obsolescent) ISE doesn't support the same CLI parameters as powershell.exe, so we use the latter.
$psExe = (Get-Process -Id $PID).Path -replace '_ise(?=\.exe$)'
if (0 -ne ($scriptBoundParameters.Count + $scriptArgs.Count)) {
# ARGUMENTS WERE PASSED, so the CLI must be called with -encodedCommand and -encodedArguments, for robustness.
# !! To work around a bug in the deserialization of [switch] instances, replace them with Boolean values.
foreach ($key in @($scriptBoundParameters.Keys)) {
if (($val = $scriptBoundParameters[$key]) -is [switch]) { $null = $scriptBoundParameters.Remove($key); $null = $scriptBoundParameters.Add($key, $val.IsPresent) }
}
# Note: If the enclosing script is non-advanced, *both*
# $scriptBoundParameters and $scriptArgs may be present.
# !! Be sure to pass @() when $args is $null (advanced script), otherwise a scalar $null will be passed on reinvocation.
# Use the same serialization depth as the remoting infrastructure (1).
$serializedArgs = [System.Management.Automation.PSSerializer]::Serialize(($scriptBoundParameters, (@(), $scriptArgs)[$null -ne $scriptArgs]), 1)
# The command that receives the (deserialized) arguments.
# Note: Since the new window running the elevated session must remain open, we do *not* append `exit $LASTEXITCODE`, unlike on Unix.
$cmd = 'param($bound, $positional) Set-Location "{0}"; & "{1}" @bound @positional' -f (Get-Location -PSProvider FileSystem).ProviderPath, $scriptPath
if ($isWin) {
Start-Process -Verb RunAs $psExe ('-noexit -encodedCommand {0} -encodedArguments {1}' -f [Convert]::ToBase64String([Text.Encoding]::Unicode.GetBytes($cmd)), [Convert]::ToBase64String([Text.Encoding]::Unicode.GetBytes($serializedArgs)))
}
else {
sudo $psExe -encodedCommand ([Convert]::ToBase64String([Text.Encoding]::Unicode.GetBytes($cmd))) -encodedArguments ([Convert]::ToBase64String([Text.Encoding]::Unicode.GetBytes($serializedArgs)))
}
}
else {
# NO ARGUMENTS were passed - simple reinvocation of the script with -c (-Command) is sufficient.
# Note: While -f (-File) would normally be sufficient, it leaves $args undefined, which could cause the calling script to break.
# Also, on WinPS we must set the working dir.
if ($isWin) {
Start-Process -Verb RunAs $psExe ('-noexit -c Set-Location "{0}"; & "{1}"' -f (Get-Location -PSProvider FileSystem).ProviderPath, $scriptPath)
}
else {
# Note: On Unix, the working directory is always automatically inherited.
sudo $psExe -c "& `"$scriptPath`"; exit $LASTEXITCODE"
}
}
# EXIT after reinvocation, passing the exit code through, if possible:
# On Windows, since Start-Process was invoked asynchronously, all we can report is whether *it* failed on invocation.
exit ($LASTEXITCODE, (1, 0)[$?])[$isWin]
}
}
# --- END: Helper function for self-elevation.
"Current location: $($PWD.ProviderPath)"
# Call the self-elevation helper function:
# * If this session is already elevated, the call is a no-op and execution continues,
# in the current console window.
# * Otherwise, the function exits the script and re-invokes it with elevation,
# passing all arguments through and preserving the working directory.
# * On Windows:
# * UAC will prompt for confirmation / admin credentials every time.
# * Of technical necessity, the elevated session runs in a *new* console window,
# asynchronously, and the window running the elevated session remains open.
# Note: The new window is a regular *console window*, irrespective of the
# environment you're calling from (including Windows Terminal, VSCode,
# or the (obsolescent) ISE).
# * Due to running asynchronously in a new window, the calling session won't know
# the elevated script call's exit code.
# * On Unix:
# * The `sudo` utility used for elevation will prompt for a password,
# and by default remember it for 5 minutes for repeat invocations.
# * The elevated script runs in the *current* window, *synchronously*,
# and $LASTEXITCODE reflects the elevated script's exit code.
# That is, the elevated script runs and returns control to the non-elevated caller.
# Note that $LASTEXITCODE is only meaningful if the elevated script
# sets it intentionally, via `exit $n`.
# Omit -Verbose to suppress verbose output.
Ensure-Elevated -Verbose
# For illustration:
# Print the arguments received in diagnostic form.
Write-Verbose -Verbose '== Arguments received:'
[PSCustomObject] @{
PSBoundParameters = $PSBoundParameters.GetEnumerator() | Select-Object Key, Value, @{ n='Type'; e={ $_.Value.GetType().Name } } | Out-String
# Only applies to non-advanced scripts
Args = $args | ForEach-Object { [pscustomobject] @{ Value = $_; Type = $_.GetType().Name } } | Out-String
CurrentLocation = $PWD.ProviderPath
} | Format-List
Sample call:
If you save the above code to file script.ps1 and invoke it as follows:
./script.ps1 -First (get-date) -Third ('foo', 'bar') -Second 42 @{ unbound=1 } 'last unbound'
you'll see the following:
In the non-elevated session, which triggers the UAC / sudo password prompt (Windows example):
Current location: C:\Users\jdoe\sample
VERBOSE: This script, "C:\Users\jdoe\sample\script.ps1", requires admin privileges, re-invoking in a new window with elevation...
In the elevated session (which on Unix runs transiently in the same window):
VERBOSE: (Now) running as admin.
VERBOSE: == Arguments received:
PSBoundParameters :
Key Value Type
--- ----- ----
First 10/30/2021 12:30:08 PM DateTime
Third {foo, bar} Object[]
Second 42 Int32
Args :
Value Type
----- ----
{unbound} Hashtable
last unbound String
CurrentLocation : C:\Users\jdoe\sample
I figured out a really short solution:
if (-Not ([Security.Principal.WindowsPrincipal] [Security.Principal.WindowsIdentity]::GetCurrent()).IsInRole([Security.Principal.WindowsBuiltInRole] 'Administrator')) {
if ([int](Get-CimInstance -Class Win32_OperatingSystem | Select-Object -ExpandProperty BuildNumber) -ge 6000) {
$CommandLine = "-NoExit -c cd '$pwd'; & `"" + $MyInvocation.MyCommand.Path + "`""
Start-Process powershell -Verb runas -ArgumentList $CommandLine
Exit
}
}
#Elevated script content after that
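A hedged variation, in case the script also takes arguments (this is my own extension, not part of the original snippet): append the caller's positional arguments to the re-invocation so they survive elevation. This simple string-based forwarding only handles plain values; anything with spaces or complex types needs the more elaborate approach above.
if (-Not ([Security.Principal.WindowsPrincipal] [Security.Principal.WindowsIdentity]::GetCurrent()).IsInRole([Security.Principal.WindowsBuiltInRole] 'Administrator')) {
    # Re-invoke this script elevated, forwarding any positional arguments (simple values only).
    $CommandLine = "-NoExit -c cd '$pwd'; & `"" + $MyInvocation.MyCommand.Path + "`" " + ($args -join ' ')
    Start-Process powershell -Verb RunAs -ArgumentList $CommandLine
    Exit
}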
Please, observe:
The method
PS C:\> (Get-Command Invoke-SilentlyAndReturnExitCode).ScriptBlock
param([scriptblock]$Command, $Folder)
$ErrorActionPreference = 'Continue'
Push-Location $Folder
try
{
& $Command > $null 2>&1
$LASTEXITCODE
}
catch
{
-1
}
finally
{
Pop-Location
}
PS C:\>
The command to silence
PS C:\> $ErrorActionPreference = "Stop"
PS C:\> $Command = { cmd /c dir xo-xo-xo }
PS C:\> & $Command > $null 2>&1
cmd : File Not Found
At line:1 char:14
+ $Command = { cmd /c dir xo-xo-xo }
+ ~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (File Not Found:String) [], RemoteException
+ FullyQualifiedErrorId : NativeCommandError
PS C:\>
As you can see, it fails with an exception. But we can silence it easily, right?
PS C:\> $ErrorActionPreference = 'SilentlyContinue'
PS C:\> & $Command > $null 2>&1
PS C:\> $LASTEXITCODE
1
PS C:\>
All is good. Now my function does the same, so let us try it:
PS C:\> $ErrorActionPreference = "Stop"
PS C:\> Invoke-SilentlyAndReturnExitCode $Command
-1
PS C:\>
Yikes! It returns -1, not 1.
The problem appears to be that setting $ErrorActionPreference inside the function does not actually propagate to the command scope. Indeed, let me add some output:
PS C:\> (Get-Command Invoke-SilentlyAndReturnExitCode).ScriptBlock
param([scriptblock]$Command, $Folder)
$ErrorActionPreference = 'Continue'
Push-Location $Folder
try
{
Write-Host $ErrorActionPreference
& $Command > $null 2>&1
$LASTEXITCODE
}
catch
{
-1
}
finally
{
Pop-Location
}
PS C:\> $Command = { Write-Host $ErrorActionPreference ; cmd /c dir xo-xo-xo }
PS C:\> Invoke-SilentlyAndReturnExitCode $Command
Continue
Stop
-1
PS C:\>
So, the problem is really around $ErrorActionPreference - why does it not propagate? PowerShell uses dynamic scoping, so the command definition should not capture its value but should use the one from the function. So, what is going on, and how can it be fixed?
tl;dr
Because your Invoke-SilentlyAndReturnExitCode function is defined in a module, you must recreate your script block in the scope of that module for it to see the module-local $ErrorActionPreference value of Continue:
# Use an in-memory module to demonstrate the behavior.
$null = New-Module {
Function Invoke-SilentlyAndReturnExitCode {
param([scriptblock] $Command, $Folder)
$ErrorActionPreference = 'Continue'
Push-Location $Folder
try
{
Write-Host $ErrorActionPreference # local value
# *Recreate the script block in the scope of this module*,
# which makes it see the module's variables.
$Command = [scriptblock]::Create($Command.ToString())
# Invoke the recreated script block, suppressing all output.
& $Command *>$null
# Output the exit code.
$LASTEXITCODE
}
catch
{
-1
}
finally
{
Pop-Location
}
}
}
$ErrorActionPreference = 'Stop'
$Command = { Out-Host -InputObject $ErrorActionPreference; cmd /c dir xo-xo-xo }
Invoke-SilentlyAndReturnExitCode $Command
On Windows, the above now prints the following, as expected:
Continue
Continue
1
That is, the recreated $Command script block saw the function-local $ErrorActionPreference value, and the catch block was not triggered.
Caveat:
This will only work if the $Command script block contains no references to variables in the originating scope other than variables in the global scope.
The alternative to avoid this limitation is to define the function outside of a module (assuming you're also calling it from code that lives outside modules).
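To make the caveat concrete, here is a small, contrived illustration of my own (the Demo function and $localName variable are made up for this purpose): because the recreated block is re-bound to the module's scope domain, a variable that is local to the caller (and not global) is no longer visible to it:
function Demo {
    $localName = 'xo-xo-xo'                    # local to Demo, not global
    $Command = { cmd /c dir $localName }       # sees $localName when invoked here
    & $Command                                 # "File Not Found"; $LASTEXITCODE is 1
    Invoke-SilentlyAndReturnExitCode $Command  # recreated block: $localName expands to nothing,
                                               # so a plain 'dir' runs and 0 is returned
}
Demo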
Background Information
The behavior implies that your Invoke-SilentlyAndReturnExitCode function is defined in a module, and each module has its own domain of scopes (hierarchy of scopes).
Your $Command script block, because it was defined outside that module, is bound to the default scope domain, and even when executed from inside a module, it continues to see the variables from the scope domain in which it was defined.
Therefore, $Command still sees the Stop $ErrorActionPreference value, even though for module-originated code inside the function it would be Continue, due to setting a local copy of $ErrorActionPreference inside the module function.
Perhaps surprisingly, it is still the $ErrorActionPreference in effect inside $Command that controls the behavior, not the function-local value.
With a redirection such as 2>$null or *>$null in effect while Stop is the effective $ErrorActionPreference value, the mere presence of stderr output from an external program - whether it indicates a true error or not - triggers a terminating error and therefore the catch branch.
This particular behavior - where the explicit intent to suppress stderr output triggers an error - should be considered a bug, and has been reported in this GitHub issue.
The general behavior, however - a script block executing in the scope in which it was defined - while non-obvious, is by design.
Note: The remainder of this answer is in its original form; it contains general background information that, however, does not cover the module aspect discussed above.
*> $null can be used to silence all output from a command - no need for suppressing the success output stream (>, implied 1>) and the error output stream (2>) separately.
Generally, $ErrorActionPreference has no effect on error output from external programs (such as git), because stderr output from external programs bypasses PowerShell's error stream by default.
There is one exception, however: setting $ErrorActionPreference to 'Stop' actually makes redirections such as 2>&1 and *>$null throw a terminating error if an external program such as git produces any stderr output.
This unexpected behavior is discussed in this GitHub issue.
Otherwise, a call to an external program never triggers a terminating error that a try / catch statement would handle. Success or failure can only be inferred from the automatic $LASTEXITCODE variable.
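A minimal illustration, with $ErrorActionPreference at its default of Continue (git here is just a stand-in for any external program):
git --no-such-option 2>$null        # no terminating error, even though git fails
if ($LASTEXITCODE -ne 0) { "git failed with exit code $LASTEXITCODE" }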
Therefore, write your function as follows if you define (and call) it outside a module:
function Invoke-SilentlyAndReturnExitCode {
param([scriptblock]$Command, $Folder)
# Set a local copy of $ErrorActionPreference,
# which will go out of scope on exiting this function.
# For *> $null to effectively suppress stderr output from
# external programs *without triggering a terminating error*
# any value other than 'Stop' will do.
$ErrorActionPreference = 'Continue'
Push-Location $Folder
try {
# Invoke the script block and suppress all of its output.
# Note that if the script block calls an *external program*, the
# catch handler will never get triggered - unless the external program
# cannot be found.
& $Command *> $null
$LASTEXITCODE
}
catch {
# Output the exit code used by POSIX-like shells such
# as Bash to signal that an executable could not be found.
127
} finally {
Pop-Location
}
}
I have a vbscript to call a PowerShell script in hopes of returning the PowerShell output to an HTA (HTML Application) GUI. Right now I just want to see if I can return the PowerShell output into a MsgBox in the vbscript. I am not having much luck with this.
VBScript Code:
Set shell = CreateObject("WScript.Shell")
return = shell.Run("powershell.exe -executionpolicy bypass -noprofile -file pathToScript\PowerShellToVBA.ps1", , true)
MsgBox return
PowerShell Code:
Clear-Host
return 50
I am trying to keep the return value extremely simple until it works. With this, I would expect the MsgBox to show '50', however it is returning '0' instead. Any ideas what I'm doing wrong?
I think you just want the exit command to get the return value:
VBScript
pscommand = ".\myscript.ps1; exit $LASTEXITCODE"
cmd = "powershell.exe -noprofile -command " & pscommand
Set shell = CreateObject("WScript.Shell")
rv = shell.Run(cmd, , True)
MsgBox "PowerShell returned: " & rv, vbSystemModal
Powershell
exit 50;
EDIT #1
Or if you want to grab a return string:
VBScript
pscommand = ".\myscript2.ps1"
cmd = "powershell.exe -noprofile -command " & pscommand
Set shell = CreateObject("WScript.Shell")
Set executor = shell.Exec(cmd)
executor.StdIn.Close
MsgBox executor.StdOut.ReadAll
Powershell
Write-Output '***SUCCESS***';
As a follow-up to the VBScript above that grabs a return string: in your PS script, if you use e.g.
write-host "Error: some specific thing happened"
to set various error or success messages, then
MsgBox executor.StdOut.ReadAll
will display all of what you write via write-host.
A helpful way to display the outcome of the PS script to your VBA user.
Just FYI.
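To round that out, here is a hedged sketch of what the PowerShell side (myscript2.ps1 from the VBScript above) could look like so that both patterns work: it writes a human-readable status line for the StdOut.ReadAll approach and sets an exit code for the shell.Run approach:
# myscript2.ps1 - illustrative only
try {
    # ... do the real work here ...
    Write-Host '***SUCCESS***'
    exit 0
}
catch {
    Write-Host "Error: $($_.Exception.Message)"
    exit 1
}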
I'm trying to run a .bat file that calls a PowerShell .ps1 file.
I've tested my script directly in PowerShell and it works there.
But when I run the .bat, an error occurs saying something like [ The string started with: (...) does not contain the terminator " ]
My .bat file:
powershell.exe -command "& C:\Users\I\Desktop\teste.ps1"
My .ps1 file:
$scripts = 'C:\Users\I\Desktop\Teste_0.1\Teste\Teste_run.bat', 'C:\Users\I\Desktop\Teste_0.2\Teste\Teste_run.bat','C:\Users\I\Desktop\Teste_0.3\Teste\Teste_run.bat' |%{ Start-Job –scriptblock (iex "[Scriptblock] { $_ } ")}| wait-job
To call another script from powershell, I don't think you need to use -command. You should just be able to call the script directly.
powershell -nol -noe C:\Users\I\Desktop\teste.ps1
Alternately, here is another clever way to do it.
@echo off
more +4 "%~dpnx0" >> temp.ps1 && powershell -nol -noe .\temp.ps1
exit /b
$scripts = 'C:\Users\I\Desktop\Teste_0.1\Teste\Teste_run.bat', 'C:\Users\I\Desktop\Teste_0.2\Teste\Teste_run.bat','C:\Users\I\Desktop\Teste_0.3\Teste\Teste_run.bat' |%{ Start-Job -scriptblock (iex "[Scriptblock] { $_ } ")}| wait-job
I have a script that may be run manually or may be run by a scheduled task. I need to programmatically determine if I'm running in -noninteractive mode (which is set when run via scheduled task) or normal mode. I've googled around and the best I can find is to add a command line parameter, but I don't have any feasible way of doing that with the scheduled tasks nor can I reasonably expect the users to add the parameter when they run it manually.
Does noninteractive mode set some kind of variable or something I could check for in my script?
Edit:
I actually inadvertently answered my own question but I'm leaving it here for posterity.
I stuck a read-host in the script to ask the user for something and when it ran in noninteractive mode, boom, terminating error. Stuck it in a try/catch block and do stuff based on what mode I'm in.
Not the prettiest code structure, but it works. If anyone else has a better way please add it!
I didn't like any of the other answers as a complete solution. [Environment]::UserInteractive reports whether the user is interactive, not specifically if the process is interactive. The api is useful for detecting if you are running inside a service. Here's my solution to handle both cases:
function Assert-IsNonInteractiveShell {
# Test each Arg for match of abbreviated '-NonInteractive' command.
$NonInteractive = [Environment]::GetCommandLineArgs() | Where-Object{ $_ -like '-NonI*' }
if ([Environment]::UserInteractive -and -not $NonInteractive) {
# We are in an interactive shell.
return $false
}
return $true
}
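For example, a typical guard at the top of a script might look like this (the branch bodies are placeholders):
if (Assert-IsNonInteractiveShell) {
    # Non-interactive (e.g. scheduled task): use defaults, never prompt.
}
else {
    # Interactive: Read-Host / Get-Credential prompts are safe here.
}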
You can check how powershell was called using Get-WmiObject for WMI objects:
(gwmi win32_process | ? { $_.processname -eq "powershell.exe" }) | select commandline
#commandline
#-----------
#"C:\WINDOWS\system32\WindowsPowerShell\v1.0\powershell.exe" -noprofile -NonInteractive
UPDATE: 2020-10-08
Starting in PowerShell 3.0, this cmdlet has been superseded by Get-CimInstance
(Get-CimInstance win32_process -Filter "ProcessID=$PID" | ? { $_.processname -eq "pwsh.exe" }) | select commandline
#commandline
#-----------
#"C:\Program Files\PowerShell\6\pwsh.exe"
I think the question needs a more thorough evaluation.
"interactive" means the shell is running as REPL - a continuous read-execute-print loop.
"non-interactive" means the shell is executing a script, command, or script block and terminates after execution.
If PowerShell is run with any of the options -Command, -EncodedCommand, or -File, it is non-interactive. Unfortunately, you can also run a script without options (pwsh script.ps1), so there is no bullet-proof way of detecting whether the shell is interactive.
So are we out of luck then? No, fortunately PowerShell automatically adds options that we can test for when it runs a script block or is run via ssh to execute commands (ssh user@host command).
function IsInteractive {
# not including `-NonInteractive` since it apparently does nothing
# "Does not present an interactive prompt to the user" - no, it does present!
$non_interactive = '-command', '-c', '-encodedcommand', '-e', '-ec', '-file', '-f'
# alternatively `$non_interactive [-contains|-eq] $PSItem`
-not ([Environment]::GetCommandLineArgs() | Where-Object -FilterScript {$PSItem -in $non_interactive})
}
Now test in your PowerShell profile whether this is in interactive mode, so the profile is not run when you execute a script, command or script block (you still have to remember to run pwsh -f script.ps1 - not pwsh script.ps1)
if (-not (IsInteractive)) {
exit
}
Testing for interactivity should probably take both the process and the user into account. Looking for the -NonInteractive (minimally -noni) powershell switch to determine process interactivity (very similar to @VertigoRay's script) can be done using a simple filter with a lightweight -like condition:
function Test-Interactive
{
<#
.Synopsis
Determines whether both the user and process are interactive.
#>
[CmdletBinding()] Param()
[Environment]::UserInteractive -and
!([Environment]::GetCommandLineArgs() |? {$_ -ilike '-NonI*'})
}
This avoids the overhead of WMI, process exploration, imperative clutter, double negative naming, and even a full regex.
I wanted to put an updated answer here because it seems that [Environment]::UserInteractive doesn't behave the same between a .NET Core (container running microsoft/nanoserver) and .NET Full (container running microsoft/windowsservercore).
While [Environment]::UserInteractive will return True or False in 'regular' Windows, it will return $null in 'nanoserver'.
If you want a way to check interactive mode regardless of the value, add this check to your script:
($null -eq [Environment]::UserInteractive -or [Environment]::UserInteractive)
EDIT: To answer the comment asking why not just check truthiness, consider the following truth table for a plain truthiness check:
left | right | result
=======================
$null | $true | $false
$null | $false | $true (!) <--- not what you intended
This will return a Boolean when the -Noninteractive switch is used to launch the PowerShell prompt.
[Environment]::GetCommandLineArgs().Contains('-NonInteractive')
I came up with a posh port of existing and proven C# code that uses a fair bit of P/Invoke to determine all the corner cases. This code is used in my PowerShell Build Script that coordinates several build tasks around Visual Studio projects.
# Some code can be better expressed in C#...
#
Add-Type @'
using System;
using System.Runtime.InteropServices;
public class Utils
{
[DllImport("kernel32.dll")]
private static extern uint GetFileType(IntPtr hFile);
[DllImport("kernel32.dll")]
private static extern IntPtr GetStdHandle(int nStdHandle);
[DllImport("kernel32.dll")]
private static extern IntPtr GetConsoleWindow();
[DllImport("user32.dll")]
private static extern bool IsWindowVisible(IntPtr hWnd);
public static bool IsInteractiveAndVisible
{
get
{
return Environment.UserInteractive &&
GetConsoleWindow() != IntPtr.Zero &&
IsWindowVisible(GetConsoleWindow()) &&
GetFileType(GetStdHandle(-10)) == 2 && // STD_INPUT_HANDLE is FILE_TYPE_CHAR
GetFileType(GetStdHandle(-11)) == 2 && // STD_OUTPUT_HANDLE
GetFileType(GetStdHandle(-12)) == 2; // STD_ERROR_HANDLE
}
}
}
'@
# Use the interactivity check somewhere:
if (![Utils]::IsInteractiveAndVisible)
{
return
}
C:\> powershell -NoProfile -NoLogo -NonInteractive -Command "[Environment]::GetCommandLineArgs()"
C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe
-NoProfile
-NoLogo
-NonInteractive
-Command
[Environment]::GetCommandLineArgs()
Implement two scripts, one core.ps1 to be manually launched, and one scheduled.ps1 that launches core.ps1 with a parameter.
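A minimal sketch of that layout (core.ps1 and scheduled.ps1 are the names suggested above; the -Scheduled switch name is my own choice):
# core.ps1 - run manually with no arguments, or via scheduled.ps1 with -Scheduled
param([switch] $Scheduled)
if ($Scheduled) {
    # non-interactive path: no prompts, rely on defaults
}
else {
    # interactive path: prompting the user is fine here
}

# scheduled.ps1 - the script the scheduled task actually invokes
& "$PSScriptRoot\core.ps1" -Scheduled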
powerShell -NonInteractive { Get-WmiObject Win32_Process -Filter "Name like '%powershell%'" | select-Object CommandLine }
powershell -Command { Get-WmiObject Win32_Process -Filter "Name like '%powershell%'" | select-Object CommandLine }
In the first case, you'll get the "-NonInteractive" param. In the latter you won't.
Script: IsNonInteractive.ps1
function Test-IsNonInteractive()
{
#ref: http://www.powershellmagazine.com/2013/05/13/pstip-detecting-if-the-console-is-in-interactive-mode/
#powershell -NoProfile -NoLogo -NonInteractive -File .\IsNonInteractive.ps1
return [bool]([Environment]::GetCommandLineArgs() -Contains '-NonInteractive')
}
Test-IsNonInteractive
Example Usage (from command prompt)
pushd c:\My\Powershell\Scripts\Directory
::run in non-interactive mode
powershell -NoProfile -NoLogo -NonInteractive -File .\IsNonInteractive.ps1
::run in interactive mode
powershell -File .\IsNonInteractive.ps1
popd
More Involved Example Powershell Script
#script options
$promptForCredentialsInInteractive = $true
#script starts here
function Test-IsNonInteractive()
{
#ref: http://www.powershellmagazine.com/2013/05/13/pstip-detecting-if-the-console-is-in-interactive-mode/
#powershell -NoProfile -NoLogo -NonInteractive -File .\IsNonInteractive.ps1
return [bool]([Environment]::GetCommandLineArgs() -Contains '-NonInteractive')
}
function Get-CurrentUserCredentials()
{
return [System.Net.CredentialCache]::DefaultCredentials
}
function Get-CurrentUserName()
{
return ("{0}\{1}" -f $env:USERDOMAIN,$env:USERNAME)
}
$cred = $null
$user = Get-CurrentUserName
if (Test-IsNonInteractive)
{
$msg = 'non interactive'
$cred = Get-CurrentUserCredentials
}
else
{
$msg = 'interactive'
if ($promptForCredentialsInInteractive)
{
$cred = (get-credential -UserName $user -Message "Please enter the credentials you wish this script to use when accessing network resources")
$user = $cred.UserName
}
else
{
$cred = Get-CurrentUserCredentials
}
}
$msg = ("Running as user '{0}' in '{1}' mode" -f $user,$msg)
write-output $msg