I'm a Powershell beginner, although not a programming n00b. I'm trying to create an IDisposable/RAII-style failsafe pattern, sort of like in:
http://www.sbrickey.com/Tech/Blog/Post/IDisposable_in_PowerShell
So I have:
Function global:FailSafeGuard
{
param (
[parameter(Mandatory=$true)] [ScriptBlock] $execute,
[parameter(Mandatory=$true)] [ScriptBlock] $cleanup
)
Try { &$execute }
Finally { &$cleanup }
}
I'm trying to use it to perform a bunch of tasks in a different directory, using Push-Location on the way in and Pop-Location on the way out. So I have:
Function global:Push-Location-FailSafe
{
param (
$location,
[ScriptBlock] $execute
)
FailSafeGuard {
Push-Location $location;
&$execute
} { Pop-Location }
}
I find that the $execute param in Push-Location-FailSafe collides with the $execute param in the FailSafeGuard function.
Push-Location-FailSafe "C:\" {dir}
The expression after '&' in a pipeline element produced an invalid object. It must result in a command name, script block or CommandInfo object.
At C:\TEMP\b807445c-1738-49ff-8109-18db972ab9e4.ps1:line:20 char:10
+ &$ <<<< execute
The reason I think it's a name-collision is that if I rename $execute to $execute2 in Push-Location-FailSafe, it works fine:
Push-Location-FailSafe "C:\" {dir}
Directory: C:\
Mode LastWriteTime Length Name
---- ------------- ------ ----
d---- 2011-08-18 21:34 cygwin
d---- 2011-08-17 01:46 Dell
[snip]
What's wrong in my understanding of parameters?
Your problem is with scriptblocks and how they handle variables. Variables inside a scriptblock aren't expanded until the scriptblock is executed. Because of this you are hitting a loop. Let me show you:
When you call the Push-Location-FailSafe function, your variable looks like this:
[DBG]: PS C:\>> (Get-Variable execute).Value
dir
But when you call your inner function FailSafeGuard, your $execute variable changes to this:
[DBG]: PS C:\>> (Get-Variable execute).Value
Push-Location $location;
& $execute
Now when your Try { } block starts executing, it begins to expand the variables. When it expands $execute, it will look like this:
Try {
Push-Location $location;
& $execute
}
Then it expands $execute again. Your try block is now:
Try {
Push-Location $location;
& {
Push-Location $location;
& $execute
}
}
And you've got yourself an infinite loop caused by recursion. To fix this, you can expand your $execute variable inside a string and then create a scriptblock from that string, like this:
Function global:Push-Location-FailSafe
{
param (
$location,
[ScriptBlock] $execute
)
FailSafeGuard ([ScriptBlock]::Create("
Push-Location $location;
& $execute")) { Pop-Location }
}
Be aware that this particular solution will have a problem when $execute contains variables of its own, e.g. $execute = { $h = dir }, because it will try to expand $h when it creates the scriptblock.
An easier and better way to solve it is simply to use different variable names so there's no collision in the first place :-)
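For reference, a minimal sketch of Push-Location-FailSafe with the parameter renamed (the question's own $execute2 experiment shows this is all that's needed):
Function global:Push-Location-FailSafe
{
    param (
        $location,
        [ScriptBlock] $execute2   # renamed so it no longer shadows FailSafeGuard's $execute
    )
    FailSafeGuard {
        Push-Location $location
        & $execute2
    } { Pop-Location }
}
Push-Location-FailSafe "C:\" { dir }   # lists C:\ and pops back even if the block throws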
After reading a lot of Q&A here on SO about Start-Job, I still cannot understand what I am doing wrong...
The main idea: I need to run a lot of functions that call other functions with different parameters. Something like this:
function Base-Function {
PARAM(
[string]
$Param
)
# I will do something with $Param
}
function Copy-File {
PARAM(
[string]
$CopyFileParam
)
Base-Function -Param $CopyFileParam
}
function Read-File {
PARAM(
[string]
$ReadFileParam
)
Base-Function -Param $ReadFileParam
}
function Move-File {
PARAM(
[string]
$MoveFileParam
)
Base-Function -Param $MoveFileParam
}
So - I am trying to call Copy-File, Read-File and Move-File simultaneously:
function Main-Function {
$copyFileArgs = @{ "CopyFileParam" = 1 }
Start-Job -ScriptBlock ${Function:Copy-File} -ArgumentList $copyFileArgs
$readFileArgs = @{ "ReadFileParam" = 2 }
Start-Job -ScriptBlock ${Function:Read-File} -ArgumentList $readFileArgs
...
...
}
but of course I cannot call Base-Function inside the Copy-File function this way, so I added the -InitializationScript argument:
$init = {
function Base-Function {
PARAM(
[string]
$Param
)
# I will do something with $Param
}
}
and then I call it like this:
function Main-Function {
$copyFileArgs = @{ "CopyFileParam" = 1 }
Start-Job -ScriptBlock ${Function:Copy-File} -ArgumentList $copyFileArgs -InitializationScript $init
}
but then I get an error:
OpenError: [localhost] An error occurred while starting the background process. Error reported: An error occurred trying to start process 'C:\Program Files\PowerShell\7\pwsh.exe' with working directory 'C:\Projects\powershell\project'. The filename or extension is too long..
So my questions are:
Any suggestions on how to simultaneously call different functions that in turn call other in-script functions?
Why do I get the error The filename or extension is too long?
Here is a link to a PowerShell script as an example: Gist
Run the script and let it finish.
In the same shell window, check for jobs: Get-Job
Check the output of the running job: Receive-Job xxx
See that the output of the job is:
ObjectNotFound: The term 'Base-Function' is not recognized as a name of a cmdlet, function, script file, or executable program.
Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
Sorry for misinforming you; the code above is correct and functional (many thanks to @mklement0 for hints and suggestions).
The actual problem that I encountered was in this line:
OpenError: [localhost] An error occurred while starting the background process.
Error reported: An error occurred trying to start process 'C:\Program Files\PowerShell\7\pwsh.exe' with working directory 'C:\Projects\powershell\project'.
The filename or extension is too long..
The filename or extension is too long.. -> this means that there is a character-length limit on what can be passed in the -InitializationScript parameter. You can verify it with the Gist example above - everything works OK.
Here is the Stack Overflow question that gave me this idea: Max size of ScriptBlock / InitializationScript for Start-Job in PowerShell
Once I moved all the code from the -InitializationScript parameter into a .ps1 script file and dot-sourced it inside the function, everything started working perfectly.
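Roughly, the arrangement looks like this (file and variable names here are illustrative, not taken from the Gist):
# base-functions.ps1 -- contains the shared helpers such as Base-Function.
# The job dot-sources that file instead of receiving the code via -InitializationScript,
# so nothing large has to be passed on the background process's command line:
$helperPath = Join-Path $PSScriptRoot 'base-functions.ps1'
$job = Start-Job -ScriptBlock {
    param($CopyFileParam, $HelperPath)
    . $HelperPath                        # loads Base-Function into the job's session
    Base-Function -Param $CopyFileParam
} -ArgumentList 1, $helperPath
Receive-Job -Job $job -Wait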
Thanks again @mklement0
I guess this has been answered a thousand times, but for the love of all I can't find a good (matching) answer to my problem.
I have a large PS script with a good dozen global variables that are used in various functions. For variables like $homedir I did not bother to include them in the invocation of each function, since virtually all of the functions need them.
But now I need to write another script and reuse ~80% of the functions. Obviously I don't want to just copy & paste, since maintenance would be a nightmare, so I told myself "let's finally learn to write PS modules" - basically cutting functions from the script and pasting them into a module.
So far so good, but almost immediately I discovered that variables from the script are not passed to the module. I am not surprised by this, I just don't know what the best practice is for refactoring my code (provided I don't really want to create functions with 10+ variables as input).
For now, I have started adding the necessary variables to each function, but the effect is that while before the "working directory" variable was a given, now it has to be declared for each function. Hardly nice clean code there.
Is there a way to "init" a module, populating it with global variables?
EDIT:
Let's say I have a following code within a single script:
Function New-WorkDir {
if (Test-Path "$workDirectory") {
$null = Remove-Item -Recurse -Force $workDirectory
}
$null = New-Item -ItemType "directory" -Path "$workDirectory"
}
Function Set-Stage {
$null = New-Item -ItemType "file" -Force -Value $stage -Path "$workDirectory" -Name "ExecutionStage.txt"
}
$workDirectory = "C:\Temp"
New-WorkDir
$stage = "1"
Set-Stage
Now, I want to split the functions off into a separate module. In order for this to work, I need to add function parameters explicitly, like so:
Function New-WorkDir {
Param(
[Parameter(Mandatory = $True, Position = 0)] [String] $WorkDirectory
)
if (Test-Path "$WorkDirectory") {
$null = Remove-Item -Recurse -Force $WorkDirectory
}
$null = New-Item -ItemType "directory" -Path "$WorkDirectory"
}
Function Set-Stage {
Param(
[Parameter(Mandatory = $True, Position = 0)] [String] $Stage,
[Parameter(Mandatory = $True, Position = 1)] [String] $WorkDirectory
)
$null = New-Item -ItemType "file" -Force -Value $Stage -Path "$WorkDirectory" -Name "ExecutionStage.txt"
}
And the main script becomes:
Import-Module new-module.psm1
$workDirectory = "C:\Temp"
New-WorkDir -WorkDirectory $workDirectory
$stage = "1"
Set-Stage -WorkDirectory $workDirectory -Stage $stage
So far, so good. My problem is that since virtually every function uses "$workDirectory", now I need to add an additional parameter to each of those functions, and what's worse - I need to add it everywhere in the code, severely impacting readability.
I was hoping that maybe there's some mechanism to "init" an internal module variable, something like (pseudocode):
Import-Module new-module.psm1
Set-Variables -module new-module -WorkDirectory $workDirectory
$workDirectory = "C:\Temp"
New-WorkDir
$stage = "1"
Set-Stage -Stage $stage
Help, please?
While modules have state and you could set module variables through module functions that assign to $script:YourVariableName, I wouldn't recommend doing so. Although they are scoped to the module, module variables still smell like an anti-pattern similar to global variables. Having functions depend on state outside of the function makes maintenance and testing much harder. I recommend to use module variables only for constants.
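For completeness, the discouraged pattern would look roughly like this inside the module (a sketch; Set-MyModuleWorkDirectory is a made-up name):
# new-module.psm1 (sketch) -- module-scoped state set through an init function
$script:WorkDirectory = $null
function Set-MyModuleWorkDirectory {
    param([Parameter(Mandatory = $True)] [String] $WorkDirectory)
    # $script: here refers to the module's own scope, not the caller's script
    $script:WorkDirectory = $WorkDirectory
}
function New-WorkDir {
    if (Test-Path "$script:WorkDirectory") {
        $null = Remove-Item -Recurse -Force $script:WorkDirectory
    }
    $null = New-Item -ItemType "directory" -Path $script:WorkDirectory
}
The caller would run Set-MyModuleWorkDirectory once after Import-Module; the rest of this answer explains why explicit parameters are usually the better trade-off.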
A better pattern is to pass everything to the module functions via parameters. If it turns out that your functions have many common parameters, you could pass these via a single object parameter.
Say you have:
Function MyModuleFun1( $commonParam1, $commonParam2, $foo ) {
Write-Output $commonParam1 $commonParam2 $foo
}
Function MyModuleFun2( $commonParam1, $commonParam2, $bar ) {
Write-Output $commonParam1 $commonParam2 $bar
}
This could be refactored to...
Function MyModuleFun1( $commonParams, $foo ) {
Write-Output $commonParams.param1 $commonParams.param2 $foo
}
Function MyModuleFun2( $commonParams, $bar ) {
Write-Output $commonParams.param1 $commonParams.param2 $bar
}
... and called like this:
$common = [PSCustomObject]@{ param1 = 42; param2 = 21 }
MyModuleFun1 -commonParams $common -foo theFoo
MyModuleFun2 -commonParams $common -bar theBar
In this example the common parameter values are the same for all function calls, so we could use $PSDefaultParameterValues to pass them implicitly:
$PSDefaultParameterValues = @{
'MyModule*:commonParams' = [PSCustomObject]@{ param1 = 42; param2 = 21 }
}
MyModuleFun1 -foo theFoo
MyModuleFun2 -bar theBar
It is advisable to use a common prefix for all your module functions, to make sure that your $PSDefaultParameterValues don't leak into other functions. In my example all module functions start with prefix 'MyModule', so I could write MyModule*:commonParams to pass the common parameter values only to functions that start with 'MyModule' prefix.
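If you want to be extra careful, you can also remove the entry again when you are done (an optional sketch):
$PSDefaultParameterValues['MyModule*:commonParams'] = [PSCustomObject]@{ param1 = 42; param2 = 21 }
try {
    MyModuleFun1 -foo theFoo
    MyModuleFun2 -bar theBar
}
finally {
    # clean up so the default does not linger in the session
    $PSDefaultParameterValues.Remove('MyModule*:commonParams')
}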
For added type safety you could create a class for the common parameters within your module...
class MyModuleCommonParams {
[int] $param1
[String] $param2 = 'DefaultValue'
}
... and change your function signatures like this:
Function MyModuleFun1( [MyModuleCommonParams] $commonParams, $foo )
The function calls can stay the same, but now the function will check that only members defined in the class, with the correct type [1], are passed:
$common = [PSCustomObject]@{ param1 = 42; xyz = 21 }
MyModuleFun1 -commonParams $common -foo theFoo
PowerShell will report an error, because the member xyz is not defined in class MyModuleCommonParams.
[1] Actually, it is sufficient that the argument type is convertible to the class member type. You could pass the string '42' to $param1, because it will be automatically converted to int.
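A quick way to see that conversion in action (assuming the class and function above):
$common = [PSCustomObject]@{ param1 = '42'; param2 = 'hello' }   # '42' is a [string]
MyModuleFun1 -commonParams $common -foo theFoo                   # works; param1 arrives as [int] 42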
Is it possible to use the -WhatIf argument when executing external commands? I want to be able to run a script with -WhatIf and have it print out a full list of all the external commands and arguments it's going to run without actually running them.
I've tried doing stuff like the following:
Function Invoke-Checked
{
param([ScriptBlock]$s)
if ($PSCmdlet.ShouldProcess($s.ToString(), "Execute"))
{
Invoke-Command $s
}
}
But that won't expand any variables that are present in the scriptblock - doing something like:
$s = { & dir $test }
Invoke-Checked $s
just prints
Performing the operation "Execute" on target " & dir $test ".
not particularly helpful.
Is there any way to do what I want?
First of all - you need to make sure that your 'wrapper' function supports WhatIf.
Another thing: you can expand the scriptblock, but I'm not really convinced that's a smart thing to do: e.g. if $test = 'Some path with spaces', it would stop working after expansion.
That being said, here are two options that work for me: using the GetNewClosure() method on the scriptblock, and expanding the whole thing:
function Invoke-ExpandedChecked {
[CmdletBinding(
SupportsShouldProcess = $true,
ConfirmImpact = 'Medium'
)]
param([ScriptBlock]$ScriptBlock)
$expanded = $ExecutionContext.InvokeCommand.ExpandString($ScriptBlock)
$script = [scriptblock]::Create($expanded)
if ($PSCmdlet.ShouldProcess($script.ToString(), "Execute"))
{
& $script
}
}
function Invoke-Checked {
[CmdletBinding(
SupportsShouldProcess = $true,
ConfirmImpact = 'Medium'
)]
param([ScriptBlock]$ScriptBlock)
$newClosure = $ScriptBlock.GetNewClosure()
if ($PSCmdlet.ShouldProcess($newClosure.ToString(), "Execute"))
{
& $newClosure
}
}
$test = '.\DSCDemo.ps_'
$s = { cmd /c dir $test}
Invoke-Checked $s -WhatIf
Invoke-Checked $s
Invoke-ExpandedChecked $s -WhatIf
Invoke-ExpandedChecked $s
And an example of the results for a path with spaces:
$test = 'C:\Program Files'
Invoke-Checked $s
Invoke-ExpandedChecked $s
It works fine for the one with the new closure. With the expanded one:
cmd : File Not Found
At line:1 char:2
+ cmd /c dir C:\Program Files
I'm going to interpret the question to mean "how do I use -WhatIf when running external commands?", since that's how I found this question.
# myscript.ps1
[cmdletbinding(SupportsShouldProcess=$True)]
Param($path) # put Param() if no parameters
if ($pscmdlet.ShouldProcess($Path, 'creating folder')) { # not -whatif
cmd /c mkdir $path
}
.\myscript foo -whatif
What if: Performing the operation "creating folder" on target "foo".
I have two Powershell files, a module and a script that calls the module.
Module: test.psm1
Function Get-Info {
$MyInvocation.MyCommand.Name
}
Script: myTest.ps1
Import-Module C:\Users\moomin\Documents\test.psm1 -force
Get-Info
When I run ./myTest.ps1 I get
Get-Info
I want to return the name of the calling script (myTest.ps1). How can I do that?
Use $MyInvocation.PSCommandPath instead in your module:
Example test.psm1
function Get-Info{
$MyInvocation.PSCommandPath
}
Example myTest.ps1
Import-Module C:\Users\moomin\Documents\test.psm1 -force
Get-Info
Output:
C:\Users\moomin\Documents\myTest.ps1
If you want only the name of the script, that can be managed by doing:
GCI $MyInvocation.PSCommandPath | Select -Expand Name
That would output:
myTest.ps1
I believe you could use the Get-PSCallStack cmdlet, which returns an array of stack frame objects. You can use this to identify the calling script down to the line of code.
Module: test.psm1
Function Get-Info {
$callstack = Get-PSCallStack
$callstack[1].Location
}
Output:
myTest.ps1: Line 2
Using $MyInvocation.MyCommand is relative to its scope.
A simple example (of a script located at C:\Dev\Test-Script.ps1):
$cmd = $MyInvocation.MyCommand.Name;
$path = $MyInvocation.MyCommand.Path;
function Get-Invocation(){
$path = $MyInvocation.MyCommand.Path;
$cmd = $MyInvocation.MyCommand.Name;
write-host "Command : $cmd - Path : $path";
}
write-host "Command : $cmd - Path : $path";
Get-Invocation;
The output when running .\c:\Dev\Test-Script.ps1 :
Command : C:\Dev\Test-Script.ps1 - Path : C:\Dev\Test-Script.ps1
Command : Get-Invocation - Path :
As you can see, $MyInvocation is relative to the scoping. If you want the path of your script, do not enclose it in a function. If you want the invocation of the command, then you wrap it.
You could also use the callstack as suggested, but be aware of scoping rules.
I used this today after trying a couple of techniques.
$scriptPath = split-path -parent $MyInvocation.MyCommand.Definition
$ScriptName = $MyInvocation.MyCommand | select -ExpandProperty Name
Invoke-Expression ". $scriptPath\$ScriptName"
To refer to the invocation info of the calling script, use:
@(Get-PSCallStack)[1].InvocationInfo
e.g.:
@(Get-PSCallStack)[1].InvocationInfo.MyCommand.Name
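In module form that might look like this (a sketch; the index [1] assumes the module function is called directly by the script):
Function Get-Info {
    # frame 0 is Get-Info itself, frame 1 is the caller (e.g. myTest.ps1)
    @(Get-PSCallStack)[1].InvocationInfo.MyCommand.Name
}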
This provides the script path (with a trailing backslash) as one variable and the script name as another.
The path works with PowerShell 2.0, 3.0 and 4.0, and probably 5.0, where $PSScriptRoot is now also available.
$_INST = $myinvocation.mycommand.path.substring(0,($myinvocation.mycommand.path.length - $MyInvocation.mycommand.name.length))
$_ScriptName = $myinvocation.mycommand.path.substring($MyInvocation.MyCommand.Definition.LastIndexOf('\'),($MyInvocation.mycommand.name.length +1))
$_ScriptName = $_ScriptName.TrimStart('\')
If you want a more reusable approach, you can use:
function Get-CallingFileName
{
$cStack = @(Get-PSCallStack)
$cStack[$cStack.Length-1].InvocationInfo.MyCommand.Name
}
The challenge I had was having a function that could be reused within the module. Everything else assumed that the script was calling the module function directly; if it was removed even one step, the result would be the module file name. If, however, the source script calls a function in the module which in turn calls another function in the module, then this is the only answer I've seen that ensures you get the source script info.
Of course, this approach is based on what @iRon and @James posted.
For you googlers looking for a quick copy-paste solution, here is what works in PowerShell 5.1.
Inside your module:
$Script = (Get-PSCallStack)[2].Command
This will output just the script name (ScriptName.ps1) of the script that invoked a function located in the module.
I use this in my module:
function Get-ScriptPath {
[CmdletBinding()]
param (
[string]
$Extension = '.ps1'
)
# Allow module to inherit '-Verbose' flag.
if (($PSCmdlet) -and (-not $PSBoundParameters.ContainsKey('Verbose'))) {
$VerbosePreference = $PSCmdlet.GetVariableValue('VerbosePreference')
}
# Allow module to inherit '-Debug' flag.
if (($PSCmdlet) -and (-not $PSBoundParameters.ContainsKey('Debug'))) {
$DebugPreference = $PSCmdlet.GetVariableValue('DebugPreference')
}
$callstack = Get-PSCallStack
$i = 0
$max = 100
while ($true) {
if (!$callstack[$i]) {
Write-Verbose "Cannot detect callstack frame '$i' in 'Get-ScriptPath'."
return $null
}
$path = $callstack[$i].ScriptName
if ($path) {
Write-Verbose "Callstack frame '$i': '$path'."
$ext = [IO.Path]::GetExtension($path)
if (($ext) -and $ext -eq $Extension) {
return $path
}
}
$i++
if ($i -gt $max) {
Write-Verbose "Exceeded the maximum of '$max' callstack frames in 'Get-ScriptPath'."
return $null
}
}
return $null
}
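Usage from any other function in the module is then simply (a small illustration):
$callerScript = Get-ScriptPath            # full path of the nearest .ps1 on the call stack, or $null
if ($callerScript) {
    $callerName = Split-Path -Leaf $callerScript   # just the file name, e.g. myTest.ps1
}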
You can grab the automatic variable MyInvocation from the parent scope and get the name from there.
Get-Variable -Scope:1 -Name:MyInvocation -ValueOnly
I did a basic test to check whether it would always just get the direct parent scope; it worked a treat and is extremely fast compared to Get-PSCallStack:
function ScopeTest () {
Write-Information -Message:'ScopeTest'
}
Write-nLog -Message:'nLog' -Type:110 -SetLevel:Verbose
ScopeTest
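Wrapped in a helper, the lookup might be used roughly like this (a sketch with both functions defined in the same script; calling across a module boundary may behave differently because modules have their own scope chain):
function Get-CallerName {
    # -Scope 1 is the immediate caller's scope; its MyInvocation describes the caller
    (Get-Variable -Scope 1 -Name MyInvocation -ValueOnly).MyCommand.Name
}
function ScopeTest2 {
    Get-CallerName   # returns 'ScopeTest2' when invoked from here
}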
By default, any named function that has the [CmdletBinding()] attribute accepts the -debug and -verbose (and a few others) parameters and has the predefined $debug and $verbose variables. I'm trying to figure out how to pass them on to other cmdlet's that get called within the function.
Let's say I have a cmdlet like this:
function DoStuff() {
[CmdletBinding()]
PROCESS {
new-item Test -type Directory
}
}
If -debug or -verbose was passed into my function, I want to pass that flag into the new-item cmdlet. What's the right pattern for doing this?
$PSBoundParameters isn't what you're looking for. The use of the [CmdletBinding()] attribute allows the usage of $PSCmdlet within your script, in addition to providing a Verbose flag. It is in fact this same Verbose that you're supposed to use.
Through [CmdletBinding()], you can access the bound parameters through $PSCmdlet.MyInvocation.BoundParameters. Here's a function that uses CmdletBinding and simply enters a nested prompt immediately in order to examine the variables available inside the function scope.
PS D:\> function hi { [CmdletBinding()]param([string] $Salutation) $host.EnterNestedPrompt() }; hi -Salutation Yo -Verbose
PS D:\>>> $PSBoundParameters
____________________________________________________________________________________________________
PS D:\>>> $PSCmdlet.MyInvocation.BoundParameters
Key Value
--- -----
Salutation Yo
Verbose True
So in your example, you would want the following:
function DoStuff `
{
[CmdletBinding()]
param ()
process
{
new-item Test -type Directory `
-Verbose:($PSCmdlet.MyInvocation.BoundParameters["Verbose"].IsPresent -eq $true)
}
}
This covers -Verbose, -Verbose:$false, -Verbose:$true, and the case where the switch is not present at all.
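A quick illustration of how the switch then propagates (note that re-running will fail once the Test directory exists):
DoStuff -Verbose          # New-Item receives -Verbose:$true and writes a VERBOSE: line
DoStuff -Verbose:$false   # New-Item receives -Verbose:$false; no verbose output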
Perhaps it sounds strange, but there isn't any easy way for a cmdlet to know its verbose or debug mode. Take a look at the related question:
How does a cmdlet know when it really should call WriteVerbose()?
One not perfect, but practically reasonable, option is to introduce your own cmdlet parameters (for example, $MyVerbose and $MyDebug) and use them in the code explicitly:
function DoStuff {
[CmdletBinding()]
param
(
# Unfortunately, we cannot use Verbose name with CmdletBinding
[switch]$MyVerbose
)
process {
if ($MyVerbose) {
# Do verbose stuff
}
# Pass $MyVerbose in the cmdlet explicitly
New-Item Test -Type Directory -Verbose:$MyVerbose
}
}
DoStuff -MyVerbose
UPDATE
When we need only a switch (not, say, a verbosity level value), then the approach with $PSBoundParameters is perhaps better than the one proposed in the first part of this answer (with extra parameters):
function DoStuff {
[CmdletBinding()]
param()
process {
if ($PSBoundParameters['Verbose']) {
# Do verbose stuff
}
New-Item Test -Type Directory -Verbose:($PSBoundParameters['Verbose'] -eq $true)
}
}
DoStuff -Verbose
It's all not perfect anyway. If there are better solutions then I would really like to know them myself.
There is no need. PowerShell already does this as the code below proves.
function f { [cmdletbinding()]Param()
"f is called"
Write-Debug Debug
Write-Verbose Verbose
}
function g { [cmdletbinding()]Param()
"g is called"
f
}
g -Debug -Verbose
The output is
g is called
f is called
DEBUG: Debug
VERBOSE: Verbose
It is not done as directly as passing -Debug to the next cmdlet, though. It is done through the $DebugPreference and $VerbosePreference variables. Write-Debug and Write-Verbose act like you would expect, but if you want to do something different with debug or verbose, you can read here how to check for yourself.
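If you do want to branch on it yourself, one common (if not bulletproof) check is against the preference variable, e.g.:
function f { [cmdletbinding()]Param()
    if ($VerbosePreference -ne 'SilentlyContinue') {
        # the caller passed -Verbose (or set $VerbosePreference); do extra verbose-only work here
        "doing verbose-only work"
    }
    Write-Verbose Verbose
}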
Here's my solution:
function DoStuff {
[CmdletBinding()]
param ()
BEGIN
{
$CMDOUT = @{
Verbose = If ($PSBoundParameters.Verbose -eq $true) { $true } else { $false };
Debug = If ($PSBoundParameters.Debug -eq $true) { $true } else { $false }
}
} # BEGIN ENDS
PROCESS
{
New-Item Example -ItemType Directory @CMDOUT
} # PROCESS ENDS
END
{
} #END ENDS
}
What this does differently from the other examples is that it will respect "-Verbose:$false" or "-Debug:$false". It will only set -Verbose/-Debug to $true if you use the following:
DoStuff -Verbose
DoStuff -Verbose:$true
DoStuff -Debug
DoStuff -Debug:$true
You could build a new hashtable based on the bound Debug or Verbose parameters and then splat it to the internal command. If you're just specifying switches (and aren't passing a false switch, like -Debug:$false), you can just check for the existence of Debug or Verbose:
function DoStuff() {
[CmdletBinding()]
param()
PROCESS {
$HT = @{ Verbose = $PSBoundParameters.ContainsKey('Verbose'); Debug = $PSBoundParameters.ContainsKey('Debug') }
new-item Test -type Directory @HT
}
}
If you want to pass the parameter value it's more complicated, but can be done with:
function DoStuff {
[CmdletBinding()]
param()
PROCESS {
$v,$d = $null
if(!$PSBoundParameters.TryGetValue('Verbose',[ref]$v)){$v=$false}
if(!$PSBoundParameters.TryGetValue('Debug',[ref]$d)){$d=$false}
$HT = @{ Verbose = $v; Debug = $d }
new-item Test -type Directory @HT
}
}
The best way to do it is by setting $VerbosePreference. This will enable the verbose level for the entire script. Do not forget to disable it at the end of the script.
Function test
{
[CmdletBinding()]
param($param1)
if ($psBoundParameters['verbose'])
{
$VerbosePreference = "Continue"
Write-Verbose " Verbose mode is on"
}
else
{
$VerbosePreference = "SilentlyContinue"
Write-Verbose " Verbose mode is Off"
}
# <Your code>
}
You can set $VerbosePreference as a global variable at the start of your script and then check for the global variable in your custom cmdlet.
Script:
$global:VerbosePreference = $VerbosePreference
Your-CmdLet
Your-CmdLet:
if ($global:VerbosePreference -eq 'Continue') {
# verbose code
}
Checking explicitly for 'Continue' makes the script behave as if -Verbose:$false had been passed when you call the cmdlet from a script that doesn't set the global variable (in which case it is $null).
You do not have to do any checks or comparisons. Even though -Verbose (and -Debug) are of type [switch], they accept not just $true and $false but also their corresponding preference variable. The preference variable also gets inherited correctly by all child functions that are called. I tried this on PowerShell 7.3.2 and it works as expected.
function Parent {
[CmdletBinding()]param()
Child
}
function Child {
[CmdletBinding()]param()
New-Item C:\TEST\SomeDir -Force -ItemType Directory -Verbose:$VerbosePreference -Debug:$DebugPreference
}
Parent -Verbose
Parent -Debug
I think this is the easiest way:
Function Test {
[CmdletBinding()]
Param (
[parameter(Mandatory=$False)]
[String]$Message
)
Write-Host "This is INFO message"
if ($PSBoundParameters.debug) {
Write-Host -fore cyan "This is DEBUG message"
}
if ($PSBoundParameters.verbose) {
Write-Host -fore green "This is VERBOSE message"
}
""
}
Test -Verbose -Debug