How to not error on cmdlets that are missing - PowerShell

I have a lot of PowerShell scripts, so I created a custom logging PowerShell module with a couple of cmdlets. The problem is that we have had trouble getting all of the users to download the custom module and put it in their PSModulePath.
Ideally I'd like the script to continue silently and not show any errors if the cmdlet cannot be found, but I haven't found a good way to do it. Right now we get messages like this:
C:> foo-bar
foo-bar : The term 'foo-bar' is not recognized as the name of a cmdlet, function, script file, or operable program...
Setting $ErrorActionPreference to SilentlyContinue will suppress the message, but it also suppresses all other error messages, which isn't desirable.
I could write a custom function in every script that checks whether the module is loaded before calling the custom cmdlets, but it seems like there should be a better way.

For a given cmdlet foo-bar, you can do the following, once, at the beginning of the script:
if (-not (Get-Command foo-bar -ErrorAction Ignore)) {
    # Define a dummy function that will be a no-op when invoked.
    function foo-bar { }
}
All subsequent calls to foo-bar then either call the actual cmdlet, if available, or the dummy function, resulting in a quiet no-op.

You can use try/catch to suppress that in a different way. Since the command itself cannot be resolved, -ErrorAction won't work directly, so you can try like this:
try {
    foo-bar
}
catch [Management.Automation.CommandNotFoundException] {
    # Swallow only "command not found"; other errors still surface.
}

Related

PowerShell: Why can't I add aliases in functions?

I tried adding several aliases in my PowerShell profile script. For organizational reasons I wanted to keep them in a function which I would call at the end of said profile. I discovered that, while the function can be called from inside or outside the script without any problems, the aliases don't apply to my current PowerShell session. Only when I add them directly in the profile script, one at a time, are they usable.
This is my script as it is right now
function Populate-Aliases() {
    New-Alias grep Select-String
    New-Alias touch New-Item
    New-Alias lsa dir -Force
}
Populate-Aliases
I'm also certain the script is executed when I create a new PowerShell session, as proven by inserting any output statement into the function. It's just the aliases that don't apply to my session.
I tried creating the aliases via function in the profile script, which didn't work. I also tried declaring a function from within the terminal as such:
function al(){New-Alias lsa dir -Force}
al
lsa
This also did not work, which leads me to believe that I'm making some kind of mistake, or that creating aliases in functions is not supported (though I cannot quite understand why that would be the case).
Creating an alias via New-Alias in the CLI works without any problem. Also, just adding the New-Alias statement to the profile script works, when it is not enclosed in a function.
From the New-Alias documentation for -Scope:
Specifies the scope in which this alias is valid. The default value is Local. For more information, see about_Scopes.
This means that, by default, the alias in question is only available in the scope of the function:
function Test {
    New-Alias Show Write-Host
    Show 'This works'
}
Test
Show 'but this does not work'
Unless you set -Scope to Global:
function Test {
    New-Alias -Scope Global Show Write-Host
    Show 'This works'
}
Test
Show 'And this works too'
To complement iRon's helpful answer:
While a function name such as Populate-Aliases - or, perhaps better, one using an approved verb, such as Add-CustomAliases - does suggest modification of the caller's state, it is generally better to let the caller choose to have its state modified. That is what ., the dot-sourcing operator, is for: it executes the specified function or script directly in the caller's scope, rather than in a child scope (the default, and what happens when you use &, the call operator).
Thus, you could leave your function as-is and simply invoke it in your $PROFILE file as follows:
# Dot-source the function call, so that it runs directly in the current scope
# (which inside $PROFILE is the *global* scope), causing the aliases to
# become globally defined.
. Populate-Aliases
Note that this technique also allows you to out-source the alias definitions to a script file; say you place them in a CustomAliases.ps1 file alongside your $PROFILE file - you can then define them globally as follows:
# Ditto, via an external .ps1 file.
. $PSScriptRoot/CustomAliases.ps1
The only challenge is that not using . for invocation then becomes effectively a quiet no-op. The function's / script's comment-based help could make that clear, but you can also implement a runtime check to enforce dot-sourced invocation:
function Add-CustomAliases {
    # Ensure that the function was invoked with dot-sourcing.
    if ($MyInvocation.InvocationName -ne '.') {
        throw "Please invoke this function dot-sourced."
    }
    New-Alias grep Select-String
    New-Alias touch New-Item
    New-Alias lsa dir -Force
}
. Add-CustomAliases # OK
Add-CustomAliases # Throws an error, because it was invoked without dot-sourcing.
Note: With the script-file implementation, an extended check is necessary for robustness (see this answer).
# Content of CustomAliases.ps1
# Ensure that the script was invoked with dot-sourcing.
if (-not ($MyInvocation.InvocationName -eq '.' -or $MyInvocation.Line -eq '')) {
    throw "Please invoke this script dot-sourced."
}
New-Alias grep Select-String
New-Alias touch New-Item
New-Alias lsa dir -Force

PowerShell function call causes missing function error using PowerShell v7 on Windows 10

I wrote a script to build all .net projects in a folder.
Issue
The issue is I am getting a missing function error when I call Build-Sollution.
What I tried
I made sure that the function was declared before I used it, so I am not really sure why it says that it is not defined.
I am new to PowerShell, but I would think a function calling another function should work like this?
Thanks in advance!
Please see below for the error message and code.
Error Message
Line |
3 | Build-Sollution $_
| ~~~~~~~~~~~~~~~
The term 'Build-Sollution' is not recognized as the name of a cmdlet, function, script file, or operable program.
Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
Build-Sollution:
Code
param (
    #[Parameter(Mandatory=$true)][string]$plugin_path,
    [string]$depth = 5
)

$plugin_path = 'path/to/sollutions/'

function Get-Sollutions {
    Get-ChildItem -File -Path $plugin_path -Include *.sln -Recurse
}

function Build-Sollution($solution) {
    dotnet build $solution.fullname
}

function Build-Sollutions($solutions) {
    $solutions | ForEach-Object -Parallel {
        Build-Sollution $_
    }
}
$solutions_temp = Get-Sollutions
Build-Sollutions $solutions_temp
From PowerShell ForEach-Object Parallel Feature | PowerShell
Script blocks run in a context called a PowerShell runspace. The runspace context contains all of the defined variables, functions and loaded modules.
...
And each runspace must load whatever module is needed and have any variable be explicitly passed in from the calling script.
So in this case, the easiest solution is to define Build-Sollution inside Build-Sollutions.
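A sketch of that fix, keeping the question's (misspelled) names: the helper is redefined inside the -Parallel script block, because each parallel runspace starts without the caller's functions.

```powershell
function Build-Sollutions($solutions) {
    $solutions | ForEach-Object -Parallel {
        # Redefine the helper here: parallel runspaces do not inherit
        # functions defined in the calling script.
        function Build-Sollution($solution) {
            dotnet build $solution.fullname
        }
        Build-Sollution $_
    }
}
```

Alternatively, the function body can be captured as a string (e.g. via ${function:Build-Sollution}.ToString()) and passed into the block with $using:, then recreated there.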
As for this...
I am new to powershell but I would think a function calling another
functions should work like this?
... you cannot use the functions until you load your code into memory. You need to run the code before the functions are available.
If you are in the ISE or VSCode and the script is not saved, select all and use the run key: in the ISE, F8 runs the selection and F5 runs everything; in VSCode, F8 runs the selection and Ctrl+F5 runs everything. You can also just click the menu options.
If you are doing this from the console host, then run the script using dot-sourcing:
. .\UncToYourScript.ps1
It's OK to be new, we all started somewhere, but it's vital that you get ramped up first. So, beyond what I address here, be sure to spend time on YouTube searching for beginner, intermediate, and advanced PowerShell videos. There are tons of free training resources all over the web, and using the built-in help files would have given you the answer as well.
about_Scripts
SCRIPT SCOPE AND DOT SOURCING

Each script runs in its own scope. The functions, variables, aliases,
and drives that are created in the script exist only in the script
scope. You cannot access these items or their values in the scope in
which the script runs.
To run a script in a different scope, you can specify a scope, such as
Global or Local, or you can dot source the script.
The dot sourcing feature lets you run a script in the current scope
instead of in the script scope. When you run a script that is dot
sourced, the commands in the script run as though you had typed them
at the command prompt. The functions, variables, aliases, and drives
that the script creates are created in the scope in which you are
working. After the script runs, you can use the created items and
access their values in your session.
To dot source a script, type a dot (.) and a space before the script
path.
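A minimal illustration of the difference described above (the script file and function names are hypothetical):

```powershell
# Create a throwaway script that defines a function.
Set-Content -Path .\Greet.ps1 -Value 'function Get-Greeting { "Hello" }'

& .\Greet.ps1      # call operator: runs in a child scope, so
                   # Get-Greeting disappears when the script ends

. .\Greet.ps1      # dot-sourcing: runs in the current scope...
Get-Greeting       # ...so the function is now available here
```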
See also:
'powershell .net projects build run scripts'
'powershell build all .net projects in a folder'
Simple build script using Power Shell
Update
As per your comments below:
Sure the script should be saved, using whatever editor you choose.
The ISE does not use PSv7 by design; it uses WPSv5x and earlier.
The editor for PSv7 is VSCode. If you run a function that contains another function, you have explicitly loaded everything in that call, and as such it's available.
However, you are saying, you are using PSv7, so, you need to run your code in the PSv7 consolehost or VSCode, not the ISE.
Windows PowerShell (powershell.exe and powershell_ise.exe) and PowerShell Core (pwsh.exe) are two different environments, with two different executables, designed to run side by side on Windows. You do have to explicitly choose which to use, or write your code to branch to a code segment relative to the host you started.
For example, let's say I wanted to run a console command and I am in the ISE, but I need to run that in Pwsh. I use a function like this that I have in a custom module autoloaded via my PowerShell profiles:
# Call code by console executable
Function Start-ConsoleCommand
{
    [CmdletBinding(SupportsShouldProcess)]
    [Alias('scc')]
    Param
    (
        [string]$ConsoleCommand,
        [switch]$PoSHCore
    )

    If ($PoSHCore)
    { Start-Process pwsh -ArgumentList "-NoExit","-Command &{ $ConsoleCommand }" -PassThru -Wait }
    Else
    { Start-Process powershell -ArgumentList "-NoExit","-Command &{ $ConsoleCommand }" -PassThru -Wait }
}
All this code is doing is taking whatever command I send it and if I use the PoSHCore switch...
scc -ConsoleCommand 'SomeCommand' -PoSHCore
... it will shell out to PSCore and run the code; otherwise, it just runs from the ISE.
If you want to use the ISE with PSv7 and not do the shell-out thing, you need to force the ISE to use PSv7 to run code. See:
Using PowerShell Core 6 and 7 in the Windows PowerShell ISE

Invoking custom cmdlet from script block in PowerShell

I'm very new to PowerShell. I wrote a cmdlet which works fine. However, when I try and invoke it inside a job...
. .\MyCmdlet.ps1 # Dot source
$GetProcesssJob = Start-Job -ScriptBlock {
    MyCmdlet
} -Credential $specialCredentials
...I get the error that it's "not recognized as the name of a cmdlet, function, script file, or operable program". What am I doing wrong?
My problems were two-fold. As TheIncorrigible1 pointed out, I needed to put the dot-sourcing inside the ScriptBlock. However, I had tried that previously and it didn't work; I now realize that's because the credentials I was using in $specialCredentials didn't have access privileges to the file MyCmdlet.ps1!
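Putting both fixes together, the working shape looks roughly like this (the full path is hypothetical; the point is that the job's credentials must be able to read it):

```powershell
$GetProcesssJob = Start-Job -ScriptBlock {
    # Dot-source inside the job's runspace: jobs do not inherit the
    # caller's functions, and may not inherit its working directory,
    # so use a full path that the job's credentials can read.
    . 'C:\Scripts\MyCmdlet.ps1'
    MyCmdlet
} -Credential $specialCredentials
Receive-Job $GetProcesssJob -Wait
```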

How to generate comprehensive log file from PowerShell script

I am trying to write a migration/deployment script to deploy an application to an environment (DEV, QA, PROD) and I need to be able to fully log all output. This would include any status message I specifically put in the output stream (not a problem) as well as verbose output from all commands. For instance, if I'm calling Copy-Item, I want the full listing of each item copied.
I'm trying to figure out a way to do this reliably throughout the entire script. In other words, I don't want to rely on including -Verbose on every command (as it could be missed when someone else maintains the script in the future). I've been looking at things like $VerbosePreference, as well as the possibility of calling my main cmdlet/function with -Verbose, with the hope that either would apply to the entire script. But that appears not to be the case. While any Write-Verbose commands I use respect either approach, calls to Copy-Item only show the verbose listing if I specifically pass -Verbose to them. I'm really hoping I'm just missing something! Surely it's possible to do what I'm wanting!
Sample code:
function Main () {
    [CmdletBinding()]
    Param()
    Begin {
        Copy-Item C:\Temp\src\* -Destination C:\Temp\dest -Recurse -Force
        Write-Output 'Main output'
        Write-Verbose 'Main verbose'
        Child
    }
}

function Child () {
    [CmdletBinding()]
    Param()
    Begin {
        Copy-Item C:\Temp\src\* -Destination C:\Temp\dest -Recurse -Force
        Write-Output 'Child output'
        Write-Verbose 'Child verbose'
    }
}

$VerbosePreference = 'SilentlyContinue'
Write-Output $VerbosePreference
Main
''
Main -Verbose
''
''
$VerbosePreference = 'Continue'
Write-Output $VerbosePreference
Main
''
Main -Verbose
Produces output:
SilentlyContinue
Main output
Child output
Main output
VERBOSE: Main verbose
Child output
VERBOSE: Child verbose
Continue
Main output
VERBOSE: Main verbose
Child output
VERBOSE: Child verbose
Main output
VERBOSE: Main verbose
Child output
VERBOSE: Child verbose
So, clearly $VerbosePreference and -Verbose are affecting Write-Verbose, but that's about it. Copy-Item is not displaying ANY verbose output whatsoever (though it will if I specifically use -Verbose directly on that command).
Any thoughts? Am I going about this all wrong? Please help!
How about leveraging...
Tip: Create a Transcript of What You Do in Windows PowerShell
The PowerShell console includes a transcript feature to help you
record all your activities at the prompt. As of this writing, you
cannot use this feature in the PowerShell application. Commands you
use with transcripts include the following:
https://technet.microsoft.com/en-us/library/ff687007.aspx
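A minimal sketch of the transcript approach (the log path and copy paths are assumptions; Start-Transcript captures everything shown at the console, including verbose output, into the file):

```powershell
# Record everything shown at the console into a log file from here on.
Start-Transcript -Path "$env:TEMP\deploy.log" -Append
try {
    Copy-Item C:\Temp\src\* -Destination C:\Temp\dest -Recurse -Force -Verbose
    Write-Output 'Deployment finished'
}
finally {
    Stop-Transcript   # always close the transcript, even if something throws
}
```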
... or the approaches provided / detailed here:
Enhanced Script Logging module (automatic console output captured to
file)
Automatically copy PowerShell console output to a log file (from
Output, Error, Warning, Verbose and Debug streams), while still
displaying the output at the console. Log file output is prepended
with date/time and an indicator of which stream originated the line
https://gallery.technet.microsoft.com/scriptcenter/Enhanced-Script-Logging-27615f85
Write-Log PowerShell Logging Function
The Write-Log PowerShell advanced function is designed to be a simple
logger function for other cmdlets, advanced functions, and scripts.
Often when running scripts one needs to keep a log of what happened
and when. The Write-Log accepts a string and a path to a log file and
ap
https://gallery.technet.microsoft.com/scriptcenter/Write-Log-PowerShell-999c32d0
*Update as per the OP comment*
See this discussion...
Powershell apply verbosity at a global level
where the -verbose flag is not supplied to the ni command. Is there a
way to set the Verbosity at a global PSSession level if I were to run
this script to force verbosity? The reason I ask is that I have a
group of about 60 scripts which are interdependent and none of these
supply -verbose to any commands they issue and I'd like to see the
entire output when I call the main entry point powershell script.
Use PowerShell Default Parameter Values to Simplify Scripts
Changing default parameter values
When I was asked to write about my favorite Windows PowerShell 3.0
feature, my #1 $PSDefaultParameterValues came to mind immediately.
From my point of view, this was something I was looking for, for a
long time.
How does it work? With $PSDefaultParameterValues, you can define
(overwrite) default values of parameters for Windows PowerShell
cmdlets.
https://blogs.technet.microsoft.com/heyscriptingguy/2012/12/03/use-powershell-default-parameter-values-to-simplify-scripts/
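Applied to the question's scenario, this lets you force -Verbose on without touching each call (a sketch; the '*:Verbose' key uses the documented wildcard form, matching every command that supports the common -Verbose parameter):

```powershell
# Session/script-wide default: every cmdlet that supports -Verbose gets it.
$PSDefaultParameterValues['*:Verbose'] = $true
Copy-Item C:\Temp\src\* -Destination C:\Temp\dest -Recurse -Force  # now verbose

# Or scope it to a single cmdlet:
$PSDefaultParameterValues['Copy-Item:Verbose'] = $true

# Remove the override when done:
$PSDefaultParameterValues.Remove('*:Verbose')
```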
See also:
Script Tracing and Logging
While Windows PowerShell already has the LogPipelineExecutionDetails
Group Policy setting to log the invocation of cmdlets, PowerShell’s
scripting language has plenty of features that you might want to log
and/or audit. The new Detailed Script Tracing feature lets you enable
detailed tracking and analysis of Windows PowerShell scripting use on
a system. After you enable detailed script tracing, Windows PowerShell
logs all script blocks to the ETW event log,
Microsoft-Windows-PowerShell/Operational. If a script block creates
another script block (for example, a script that calls the
Invoke-Expression cmdlet on a string), that resulting script block is
logged as well.
Logging of these events can be enabled through the Turn on PowerShell
Script Block Logging Group Policy setting (in Administrative Templates
-> Windows Components -> Windows PowerShell).
https://learn.microsoft.com/en-us/powershell/wmf/5.0/audit_script

Scope of functions defined in modules when used in psake tasks

I have a psake Task looking something like below (this is simplified for clarity):
Task Invoke-Deploy {
    Import-Module "somefunctions.psm1"
    Import-Module "morefunctions.psm1"
    Set-Something # This is a function defined in morefunctions.psm1
}
Function Set-Something (which is defined in module morefunctions.psm1) attempts to call function Get-Something (which is defined in somefunctions.psm1). When it does I get an error:
The term 'Get-Something' is not recognized as the name of a cmdlet,
function, script file, or operable program.
Interestingly, I modified morefunctions.psm1 to also Import-Module "somefunctions.psm1", and at that point everything worked fine. I would rather not have to do this, however, as I want my modules to be loosely coupled, insofar as they don't need to rely on the existence of other modules.
My knowledge of function/variable scope in PowerShell is limited, but I thought that functions in two different imported modules lived in the same scope, and hence a function in one of those modules would be able to call a function in the other.
I suspect that the scope is being affected by the fact that I'm inside a psake task. I'm hoping that someone here can confirm that, and also advise on what I should do to fix this. TIA.
I created a script module test-module.psm1:
function Invoke-Test {
    Import-Module ".\somefunctions.psm1"
    Import-Module ".\morefunctions.psm1"
    Set-Something # This is a function defined in morefunctions.psm1
}
and a couple of dummy modules, somefunctions.psm1:
function Get-Something {
    'Get-Something'
}
and morefunctions.psm1:
function Set-Something {
    Get-Something
    'Set-Something'
}
If I call
Import-Module .\test-module.psm1
Invoke-Test
then I get the error "Get-Something : The term 'Get-Something' is not recognized as the name of a cmdlet, function, script file, or operable program." So it looks like a generic PowerShell issue dealing with script modules. I tried PowerShell v2.0, v3.0, and v4.0.
Perhaps this cannot be resolved in psake without workarounds, because it is a script module. You can use the similar tool Invoke-Build. It is implemented as a script and avoids issues like these. It works fine with this build script:
Task Invoke-Deploy {
    Import-Module ".\somefunctions.psm1"
    Import-Module ".\morefunctions.psm1"
    Set-Something # This is a function defined in morefunctions.psm1
}
It outputs, as expected:
Build Invoke-Deploy ...\.build.ps1
Task /Invoke-Deploy
Get-Something
Set-Something
Done /Invoke-Deploy 00:00:00.0150008
Build succeeded. 1 tasks, 0 errors, 0 warnings 00:00:00.1450083
I ran into this today and got it to work.
In your module morefunctions.psm1, you need to export the function you want, like this:
Export-ModuleMember -Function Set-Something
In your psake task, you need to prepend the module name in front of the function so PowerShell can find it:
Import-Module "morefunctions.psm1"
morefunctions\Set-Something