How can I find the source path of an executing script? [duplicate] - powershell

This question already has answers here:
What's the best way to determine the location of the current PowerShell script?
(15 answers)
Closed 8 years ago.
I want to be able to tell what path my executing script was run from.
This will often not be $pwd.
I need to call other scripts that are in a folder structure relative to my script, and while I could hard-code the paths, that's both distasteful and a pain in the neck when trying to promote from "dev" to "test" to "production".

The ubiquitous script originally posted by Jeffrey Snover of the PowerShell team (given in Skyler's answer) and the variations posted by Keith, Cédric, and EBGreen all suffer from a serious drawback: whether the code reports what you expect depends on where you call it!
My code below overcomes this problem by simply referencing script scope instead of parent scope:
function Get-ScriptDirectory
{
Split-Path $script:MyInvocation.MyCommand.Path
}
To illustrate the problem, I created a test vehicle that evaluates the target expression in four different ways. (The bracketed terms are the keys to the following result table.)
inline code [inline]
inline function, i.e. function in the main program [inline function]
Dot-sourced function, i.e. the same function moved to a separate .ps1 file [dot source]
Module function, i.e. the same function moved to a separate .psm1 file [module]
The last two columns show the result of using script scope (i.e. $script:) or parent scope (with -Scope 1). A result of "script" means the invocation correctly reported the location of the script. A result of "module" means the invocation reported the location of the module containing the function rather than the script that called the function; this reveals a drawback shared by both functions: you cannot put the function in a module.
Setting the module issue aside, the remarkable observation from the table is that the parent-scope approach fails most of the time (in fact, twice as often as it succeeds).
Finally, here is the test vehicle:
function DoubleNested()
{
"=== DOUBLE NESTED ==="
NestCall
}
function NestCall()
{
"=== NESTED ==="
"top level:"
Split-Path $script:MyInvocation.MyCommand.Path
#$foo = (Get-Variable MyInvocation -Scope 1).Value
#Split-Path $foo.MyCommand.Path
"immediate func call"
Get-ScriptDirectory1
"dot-source call"
Get-ScriptDirectory2
"module call"
Get-ScriptDirectory3
}
function Get-ScriptDirectory1
{
Split-Path $script:MyInvocation.MyCommand.Path
# $Invocation = (Get-Variable MyInvocation -Scope 1).Value
# Split-Path $Invocation.MyCommand.Path
}
. .\ScriptDirFinder.ps1
Import-Module ScriptDirFinder -force
"top level:"
Split-Path $script:MyInvocation.MyCommand.Path
#$foo = (Get-Variable MyInvocation -Scope 1).Value
#Split-Path $foo.MyCommand.Path
"immediate func call"
Get-ScriptDirectory1
"dot-source call"
Get-ScriptDirectory2
"module call"
Get-ScriptDirectory3
NestCall
DoubleNested
Contents of ScriptDirFinder.ps1:
function Get-ScriptDirectory2
{
Split-Path $script:MyInvocation.MyCommand.Path
# $Invocation = (Get-Variable MyInvocation -Scope 1).Value
# Split-Path $Invocation.MyCommand.Path
}
Contents of ScriptDirFinder.psm1:
function Get-ScriptDirectory3
{
Split-Path $script:MyInvocation.MyCommand.Path
# $Invocation = (Get-Variable MyInvocation -Scope 1).Value
# Split-Path $Invocation.MyCommand.Path
}
I am not familiar with what was introduced in PowerShell 2, but it could very well be that script scope did not exist in PowerShell 1, at the time Jeffrey Snover published his example.
I was surprised that, although I found his code example proliferated far and wide on the web, it failed immediately when I tried it! But that was because I used it differently from Snover's example: I called it not at script top but from inside another function (my "nested twice" case).
2011.09.12 Update
You can read about this with other tips and tricks on modules in my just-published article on Simple-Talk.com:
Further Down the Rabbit Hole: PowerShell Modules and Encapsulation.

You tagged your question PowerShell version 1.0; however, if you have access to PowerShell version 3.0, you now have $PSCommandPath and $PSScriptRoot, which make getting the script path a little easier. Please refer to the "OTHER SCRIPT FEATURES" section on this page for more information.
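A minimal sketch of those two automatic variables in use (assuming PowerShell 3.0 or later; the dot-sourced file name is a placeholder):

```powershell
# $PSCommandPath : full path of the script file being executed (PowerShell 3.0+)
# $PSScriptRoot  : directory that contains the executing script (PowerShell 3.0+)
"This script: $PSCommandPath"
"Its folder : $PSScriptRoot"

# Call a sibling script relative to this script's location
# ('Helper.ps1' is a hypothetical file name)
. (Join-Path $PSScriptRoot 'Helper.ps1')
```

Note that both variables are only populated when the code runs from a saved script file, not when pasted into the console.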

We've been using code like this in most of our scripts for several years with no problems:
#--------------------------------------------------------------------
# Dot source support scripts
#--------------------------------------------------------------------
$ScriptPath = $MyInvocation.MyCommand.Path
$ScriptDir = Split-Path -Parent $ScriptPath
. $ScriptDir\BuildVars.ps1
. $ScriptDir\LibraryBuildUtils.ps1
. $ScriptDir\BuildReportUtils.ps1

I ran into the same issue recently. The following article helped me solve the problem: http://blogs.msdn.com/powershell/archive/2007/06/19/get-scriptdirectory.aspx
If you're not interested in how it works, here's all the code you need per the article:
function Get-ScriptDirectory
{
$Invocation = (Get-Variable MyInvocation -Scope 1).Value
Split-Path $Invocation.MyCommand.Path
}
And then you get the path by simply doing:
$path = Get-ScriptDirectory

I think you can find the path of your running script using
$MyInvocation.MyCommand.Path
Hope it helps !
Cédric

This is one of those oddities (to my mind at least) in PS. I'm sure there is a perfectly good reason for it, but it still seems odd to me. So:
If you are in a script but not in a function then $myInvocation.InvocationName will give you the full path including the script name. If you are in a script and inside a function then $myInvocation.ScriptName will give you the same thing.
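A quick sketch of that distinction (the file name demo.ps1 is hypothetical):

```powershell
# Contents of demo.ps1 (hypothetical file), run from a console as .\demo.ps1

# At script level: per the answer above, InvocationName reports how the script was invoked
"script level: $($MyInvocation.InvocationName)"

function Show-Location {
    # Inside a function: ScriptName gives the path of the script containing the call
    "in function : $($MyInvocation.ScriptName)"
}
Show-Location
```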

Thank you msorens! This really helped me with my custom module. In case anyone is interested in making their own, here is how mine is structured.
MyModule (folder)
- MyModule.psd1 (help New-ModuleManifest)
- MyScriptFile.ps1 (ps1 files are easy to test)
You then reference MyScriptFile.ps1 in MyModule.psd1. Referencing the .ps1 in the NestedModules array will place the functions in the module session state rather than the global session state. (How to Write a Module Manifest)
NestedModules = @('.\MyScriptFile.ps1','.\MyOtherScriptFile.ps1')
Content of MyScriptFile.ps1
function Get-ScriptDirectory {
Split-Path $script:MyInvocation.MyCommand.Path
}
try {
Export-ModuleMember -Function "*-*"
}
catch{}
The try/catch hides the error from Export-ModuleMember when MyScriptFile.ps1 is run directly rather than imported as part of the module.
Copy the MyModule directory to one of the paths found here $env:PSModulePath
PS C:\>Import-Module MyModule
PS C:\>Get-Command -Module MyModule
CommandType Name ModuleName
----------- ---- ----------
Function Get-ScriptDirectory MyModule

Related

PowerShell ignoring Write-Verbose while running Import-Module

To present the problem, I have this simple script saved as a PowerShell module (test.psm1):
Write-Verbose 'Verbose message'
In real life, it includes command to import additional functions, but that is irrelevant at the moment.
If I run Import-Module .\test.psm1 -Verbose -Force I get only
VERBOSE: Loading module from path 'C:\tmp\test.psm1'.
My Write-Verbose is ignored 😟
I tried adding [CmdletBinding()], but it also did not work.
[cmdletbinding()]
param()
Write-Verbose 'Verbose message'
Any clue how to provide Verbose output while importing the PowerShell module?
P.S. I do not want to display Verbose information always, but only if -Verbose is specified. Here would be my expected output for these two different cases:
PS C:\> Import-Module .\test.psm1 -Verbose -Force # with verbose output
VERBOSE: Loading module from path 'C:\tmp\test.psm1'.
VERBOSE: Verbose message
PS C:\> Import-Module .\test.psm1 -Force # without verbose output
PS C:\>
That is an interesting situation. I have a theory, but if anyone can prove me wrong, I would be more than happy.
The short answer: you probably cannot do what you want by playing with -Verbose only. There may be some workarounds, but the shortest path could be setting $VerbosePreference.
First of all, we need to understand the lifetime of a module when it is imported:
When a module is imported, a new session state is created for the
module, and a System.Management.Automation.PSModuleInfo object is
created in memory. A session-state is created for each module that is
imported (this includes the root module and any nested modules). The
members that are exported from the root module, including any members
that were exported to the root module by any nested modules, are then
imported into the caller's session state. [..] To send output to the host, users should run the Write-Host cmdlet.
The last line is the first hint that pointed me to a solution: when a module is imported, a new session state is created, but only exported elements are attached to the global session state. This means that test.psm1 code is executed in a session different than the one where you run Import-Module, therefore the -Verbose option, related to that single command, is not propagated.
Instead, and this is an assumption of mine, since I did not find it on the documentation, configurations from the global session state are visible to all the child sessions. Why is this important? Because there are two ways to turn on verbosity:
-Verbose option, not working in this case because it is local to the command
$VerbosePreference, that sets the verbosity for the entire session using a preference variable.
I tried the second approach and it worked, despite not being so elegant.
$VerbosePreference = "Continue" # print all the verbose messages, disabled by default
Import-Module .\test.psm1 -Force
$VerbosePreference = "SilentlyContinue" # restore default value
Now some considerations:
Specifying -Verbose on the Import-Module command is redundant
You can still override the verbosity configuration inside your module script, by using
Write-Verbose -Message "Verbose message" -Verbose:$false
As @Vesper pointed out, $false will always suppress the Write-Verbose output. Instead, you may want to parameterize that option with a boolean variable assigned in a previous check. Something like:
if (...)
{
$forceVerbose=$true
}
else
{
$forceVerbose=$false
}
Write-Verbose -Message "Verbose message" -Verbose:$forceVerbose
There might be other less invasive workarounds (for instance centered on Write-Host), or even a real solution. As I said, it is just a theory.
Marco Luzzara's answer is spot on (and deserves the bounty in my opinion) in regards to the module being run in its own session state, and that by design you can't access those variables.
An alternative solution to setting $VerbosePreference and restoring it, is to have your module take a parameter specifically for this purpose. You touched on this a little bit by trying to add [CmdletBinding()] to your module; the problem is you have no way to pass in named parameters, only unnamed arguments, via Import-Module -ArgumentList, so you can't specifically pass in a $true for -Verbose.
Instead you can specify your own parameter and use it.
(psm1)
[CmdletBinding()]param([bool]$myverbose)
Write-Verbose "Message" -Verbose:$myverbose
followed with:
Import-Module test.psm1 -Force -ArgumentList $true
In the above example, it would apply only to a specific command, where you were setting -Verbose:$myverbose every time.
But you could apply it to the module's $VerbosePreference:
[CmdletBinding()]param([bool]$myverbose)
$VerbosePreference = if ($myverbose) { 'Continue' } else { 'SilentlyContinue' }
Write-Verbose "Message"
That way it applies throughout.
At this point I should mention the drawback of what I'm showing: you might notice I didn't include -Verbose in the Import-Module call, and that's because it doesn't change the behavior inside the module. The verbose messages from inside will be shown purely based on the argument you passed in, regardless of the -Verbose setting on Import-Module.
An all-in-one solution then goes back to Marco's answer: manipulating $VerbosePreference on the caller's side. I think it's the only way to get both behaviors aligned, but only if you don't use -Verbose switch on Import-Module to override.
On the other hand, within a scope, like within an advanced function that can take -Verbose, setting the switch changes the local value of $VerbosePreference. That can lead us to wrap Import-Module in our own function:
function Import-ModuleVerbosely {
[CmdletBinding()]
param($Name, [Switch]$Force)
Import-Module $Name -Force:$Force
}
Great! Now we can call Import-ModuleVerbosely test.psm1 -Force -Verbose. But... it didn't work. Import-Module recognized the verbose setting, but this time it didn't make it down into the module.
Although I haven't been able to find a way to confirm it, I suspect it's because the variable is set to Private (even though Get-Variable seems to say otherwise), so the value doesn't propagate this time. Whatever the reason, we can go back to making our module accept a value. This time let's make it the same type for ease of use:
(psm1)
[CmdletBinding()]param([System.Management.Automation.ActionPreference]$myverbose)
if ($myverbose) { $VerbosePreference = $myverbose }
Write-Verbose "message"
Then let's change the function:
function Import-ModuleVerbosely {
[CmdletBinding()]
param($Name, [Switch]$Force)
Import-Module $Name -Force:$Force -ArgumentList $VerbosePreference
}
Hey now we're getting somewhere! But.. it's kind of clunky isn't it?
You could go farther with it, making a full on proxy function for Import-Module, then making an alias to it called Import-Module to replace the real one.
Ultimately you're trying to do something not really supported, so it depends how far you want to go.

PowerShell functions load from function

I have a module with several files with functions and module loader.
The example function:
Function1.ps1
function Init() {
echo "I am the module initialization logic"
}
function DoStuff() {
echo "Me performing important stuff"
}
Module loader file:
Module1.psm1:
$script:Functions = Get-ChildItem $PSScriptRoot\*.ps1
function LoadModule {
Param($path)
foreach ($import in @($path)) {
. $import.FullName
}
}
LoadModule $script:Functions
Init # function not found
So I'm trying to load the functions from Function1.ps1 via the LoadModule procedure.
Debugging shows that the external functions are loaded inside LoadModule, but once LoadModule finishes they are no longer accessible, so the script fails on the Init line.
However, the module loader rewritten without the LoadModule function works fine:
Module1.psm1:
Get-ChildItem $PSScriptRoot\*.ps1 | %{
. $_.FullName
}
Init # In this case - all works fine
So, as I understand it, functions loaded from a file inside a function are placed in some isolated scope, and to access them I need to add some scope flag.
Maybe somebody knows what I should add to make the function Init() accessible from the Module1.psm1 script body, but not accessible externally (without using Export-ModuleMember)?
Note: Edit 1, a clarification on what dot sourcing actually does, is included at the end.
First up, you are intermingling terminology and usage for Functions and Modules. Modules, which have the .psm1 extension, should be imported into the terminal using the Import-Module cmdlet. When dot sourcing, such as what you are doing here, you should only be targeting script files which contain functions, which are files with the .ps1 extension.
I too am relatively new to PowerShell, and I ran into the same problem. After spending around an hour reading up on the issue I was unable to find a solution, but a lot of the information I found points to it being an issue of scope. So I created a test, utilising three files.
foo.ps1
function foo {
Write-Output "foo"
}
bar.psm1
function bar {
Write-Output "bar"
}
scoping.ps1
function loader {
echo "dot sourcing file"
. ".\foo.ps1"
foo
echo "Importing module"
Import-Module -Name ".\bar.psm1"
bar
}
foo
bar
loader
foo
bar
pause
Let's walk through what this script does.
First we define a dummy loader function. This isn't a practical loader, but it is sufficient for testing scopes and the availability of functions within files that are loaded. This function dot sources the ps1 file containing the function foo, and uses Import-Module for the file containing the function bar.
Next, we call on the functions foo and bar, which will produce errors, in order to establish that neither are within the current scope. While not strictly necessary, this helps to illustrate their absence.
Next, we call the loader function. After dot sourcing foo.ps1, we see foo successfully executed because foo is within the current scope of the loader function. After using Import-Module for bar.psm1, we see bar also successfully executed. Now we exit the scope of the loader function and return to the main script.
Now we see the execution of foo fail with an error. This is because we dot sourced foo.ps1 within the scope of a function. However, because we imported bar.psm1, bar successfully executes. This is because modules are imported into the Global scope by default.
How can we use this to improve your LoadModule function? The main thing for this functionality is that you need to switch to using modules for your imported functions. Note that, from my testing, you cannot Import-Module the loader function; this only works if you dot source the loader.
LoadModule.ps1
function LoadModule($Path) {
Get-ChildItem -Path "$Path" -Filter "*.psm1" -Recurse -File -Name | ForEach-Object {
$File = "$Path$_"
echo "Import-Module -Name $File"
Import-Module -Name "$File" -Force
}
}
And now in a terminal:
. ".\LoadModule.ps1"
LoadModule ".\"
foo
bar
Edit 1: A further clarification on dot sourcing
Dot sourcing is equivalent to copy-pasting the contents of the specified file into the file performing the dot source. The file performing the operation "imports" the contents of the target verbatim, performing no additional actions before proceeding to execute the "imported" code. e.g.
foo.ps1
Write-Output "I am foo"
. ".\bar.ps1"
bar.ps1
Write-Output "I am bar"
is effectively
Write-Output "I am foo"
Write-Output "I am bar"
Edit: You don't actually need to use Import-Module. So long as you have the modules in your $env:PSModulePath PowerShell will autoload any exported functions when they are first called. Source.
Depending on the specifics of your use case, there's another method you can use. This method addresses when you want to mass-import modules into a PowerShell session.
When you start PowerShell, it looks at the value of the environment variable $env:PSModulePath to determine where it should look for modules. It then looks under those directories for folders containing psm1 and psd1 files. You can modify this variable during the session and then import modules by name. Here's an example, using what I've added to my PowerShell profile.ps1 file:
$MyPSPath = [Environment]::GetFolderPath("MyDocuments") + "\WindowsPowerShell"
$env:PSModulePath = $env:PSModulePath + ";$MyPSPath\Custom\Modules"
Import-Module `
-Name Confirm-Directory, `
Confirm-File, `
Get-FileFromURL, `
Get-RedirectedURL, `
Get-RemoteFileName, `
Get-ReparseTarget, `
Get-ReparseType, `
Get-SpecialPath, `
Test-ReparsePoint
In the event that you're new to PowerShell profiles (they're pretty much the same as Unix's ~/.profile file), you can find:
more information about PowerShell profiles here.
a summary of what profile files are used and when here.
While this may not seem as convenient as an auto-loader, installing & importing modules is the intended and accepted approach for this. Unless you have a specific reason not to, you should try to follow the established standards so that you aren't later fighting your way out of bad habits.
You can also modify the registry to achieve this.
After some research, I found that during the execution of the LoadModule function, all registered functions are added to the Function: provider.
So from the LoadModule function body they can be enumerated via Get-ChildItem -Path Function:
[DBG]: PS > Get-ChildItem -Path Function:
CommandType Name Version Source
----------- ---- ------- ------
Function C:
Function Close-VSCodeHtmlContentView 0.2.0 PowerShellEditorServices.VSCode
Function Init 0.0 Module1
Function ConvertFrom-ScriptExtent 0.2.0
Function Module1 0.0 Module1
So we can store the function list in a variable at the beginning of LoadModule's invocation:
$loadedFunctions = Get-ChildItem -Path Function:
and, after the dot-source loading, retrieve the list of newly added functions:
Get-ChildItem -Path Function: | where { $loadedFunctions -notcontains $_ }
So the modified LoadModule function will look like:
function LoadModule {
param ($path)
$loadRef = Get-PSCallStack
$loadedFunctions = Get-ChildItem -Path Function:
foreach ($import in @($path)) {
. $import.FullName
}
$functions= Get-ChildItem -Path Function: | `
Where-Object { $loadedFunctions -notcontains $_ } | `
ForEach-Object{ Get-Item function:$_ }
return $functions
}
The next step just assigns the functions to a list (More about this):
$script:functions = LoadModule $script:Private ##Function1.ps1
$script:functions += LoadModule $script:PublicFolder
After this step, we can invoke the initializer:
$initScripts = $script:functions | Where-Object { $_.Name -eq 'Initalize' } # filter
$initScripts | ForEach-Object { & $_ } ## execute
and export Public functions:
$script:functions | `
where { $_.Name -notlike '_*' } | ` # do not export _Name functions
%{ Export-ModuleMember -Function $_.Name }
The full code of the module-load function I moved to the ModuleLoader.ps1 file; it can be found in the GitHub repo PowershellScripts.
And the complete version of the Module.psm1 file is:
if($ModuleDevelopment){
. $PSScriptRoot\..\Shared-Functions\ModuleLoader.ps1 "$PSScriptRoot"
}
else {
. $PSScriptRoot\Shared\ModuleLoader.ps1 "$PSScriptRoot"
}

Creating powershell modules from multiple files, referencing with module

I'm creating a PowerShell script module using separate source files. What is the canonical way to reference functions internal to the module from other internal source files?
For example if my module is created from PS source code in files "foo" and "bar"; and a function in "foo" needs to call a function in "bar", what is the best way to do that?
It doesn't seem like dot-sourcing would be a good idea. Nor does making the component files ("foo" and "bar") psm1 files. Is this the idea behind the "ScriptsToProcess" field in the psd1 file?
Am I thinking about this wrong (non-"PowerShelly")? Should I just dump everything into a single psm1?
I've personally followed the practice laid out by RamblingCookieMonster in his blog here: http://ramblingcookiemonster.github.io/Building-A-PowerShell-Module/
Which is to organise your functions in to separate .ps1 files under sub-folders \Public and \Private. Public contains the functions the user should be able to call directly, Private is for the functions that are only used internally by your module.
Then in the .psm1 file you load the functions via a loop and dot sourcing as follows:
#Get public and private function definition files.
$Public = @( Get-ChildItem -Path $PSScriptRoot\Public\*.ps1 -ErrorAction SilentlyContinue )
$Private = @( Get-ChildItem -Path $PSScriptRoot\Private\*.ps1 -ErrorAction SilentlyContinue )
#Dot source the files
Foreach($import in @($Public + $Private))
{
Try
{
. $import.fullname
}
Catch
{
Write-Error -Message "Failed to import function $($import.fullname): $_"
}
}
# Here I might...
# Read in or create an initial config file and variable
# Export Public functions ($Public.BaseName) for WIP modules
# Set variables visible to the module and its functions only
Export-ModuleMember -Function $Public.Basename
Source of this example: https://github.com/RamblingCookieMonster/PSStackExchange/blob/db1277453374cb16684b35cf93a8f5c97288c41f/PSStackExchange/PSStackExchange.psm1
You should then also explicitly list your Public function names in your .psd1 module manifest file under the FunctionsToExport setting. Doing this allows these functions to be discoverable and the module to be auto-loaded when they are used.
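For example, the corresponding manifest entry might look like this (a sketch; the function names are placeholders for your module's public functions):

```powershell
# Fragment of a .psd1 module manifest (function names are illustrative)
FunctionsToExport = @('Get-Stuff', 'Set-Stuff')
```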
Since I recently had to do this myself, I am sharing my solution. I have recently started grouping functions in psm1 files. These can be compiled into a single module with a single manifest.
This allows me to have groups of functions that can be packaged with multiple modules.
Write-BarFunctions.psm1
Function Write-Bar {
return "Bar"
}
Function Write-Baz {
return "Baz"
}
Write-FooFunctions.psm1
Function Write-Foo {
return "Foo"
}
Function Write-FooBar {
$foo = Write-Foo
$bar = Write-Bar
return ("{0}{1}" -f $foo, $bar)
}
Function Write-FooBarBaz {
$foobar = Write-FooBar
$baz = Write-Baz
return ("{0}{1}" -f $foobar, $baz)
}
Which are combined into a single module like this:
(formatted for readability)
New-ModuleManifest
-Path .\Write-FooBarBazCombos
-NestedModules @('.\FooFunctions\Write-FooFunctions.psm1', '.\BarFunctions\Write-BarFunctions.psm1')
-Guid (New-Guid)
-ModuleVersion '1.0.0.0'
-Description 'demonstrate multiple psm1 files as 1 powershell module with 1 powershell module manifest'
-PowerShellVersion $PSVersionTable.PSVersion.ToString()
-FunctionsToExport @('Write-Foo', 'Write-Bar','Write-FooBar', 'Write-FooBarBaz')
PowerShell output:
PS C:\LWC\scripting-misc\module-manifest-multiple-files-example> New-ModuleManifest -Path .\Write-FooBarBazCombos.psd1 -NestedModules @('.\Write-FooFunctions.psm1', '.\Write-BarFunctions.psm1') -Guid (New-Guid) -ModuleVersion '1.0.0.0' -Description 'demonstrate multiple psm1 files as 1 powershell module with 1 powershell module manifest' -PowerShellVersion $PSVersionTable.PSVersion.ToString() -FunctionsToExport @('Write-Foo', 'Write-Bar','Write-FooBar', 'Write-FooBarBaz')
PS C:\LWC\scripting-misc\module-manifest-multiple-files-example> Import-Module .\Write-FooBarBazCombos.psd1
PS C:\LWC\scripting-misc\module-manifest-multiple-files-example> Get-Command -Module Write-FooBarBazCombos
CommandType Name Version Source
----------- ---- ------- ------
Function Write-Bar 1.0.0.0 Write-FooBarBazCombos
Function Write-Foo 1.0.0.0 Write-FooBarBazCombos
Function Write-FooBar 1.0.0.0 Write-FooBarBazCombos
Function Write-FooBarBaz 1.0.0.0 Write-FooBarBazCombos
Note that Write-Baz is not exposed in the imported module, as it is excluded from the FunctionsToExport parameter, so Write-FooBarBaz will error (intentional, to show the behavior).
PS C:\LWC\scripting-misc\module-manifest-multiple-files-example> Write-FooBar
FooBar
What you're left with in the directory:
PS C:\LWC\scripting-misc\module-manifest-multiple-files-example> Get-ChildItem | Select-Object Name
Name
----
Write-BarFunctions.psm1
Write-FooBarBazCombos.psd1
Write-FooFunctions.psm1
Addendum - I expanded on this answer in another question - here:
https://stackoverflow.com/a/56171985/7710456
@Ryan
I similarly assumed that dot sourcing wasn't the best choice here, but I'm not so sure anymore. I've used the NestedModules approach as well, but have run up against a specific problem. I've asked the question here:
PowerShell module, call function in NestedModule from another NestedModule
In summary I find that the PrimaryModule can call any function in any NestedModule. But one NestedModule is not able to call a function in another NestedModule.
Splitting your code out into many logical files is Developer 101 basics. So I'm really surprised there isn't a standard way of handling this.
Any help here much appreciated. Please read the linked question, it gives plenty of detail. Is the consensus that dot sourcing has to be used? Because I'm finding the module manifest way of splitting out the code very limiting.

Creating functions dynamically in a module in PowerShell

Suppose I have the following code in a module (called MyModule.psm1, in the proper place for a module):
function new-function{
$greeting='hello world'
new-item -path function:\ -name write-greeting -value {write-output $greeting} -Options AllScope
write-greeting
}
After importing the module and running new-function I can successfully call the write-greeting function (created by new-function).
When I try to call the write-greeting function outside the scope of the new-function call, it fails because the function does not exist.
I've tried dot-sourcing new-function, but that doesn't help. I've supplied -Options AllScope, but apparently that only includes it in child scopes.
I've also tried explicitly following the new-item call with an export-modulemember write-greeting which doesn't give an error, but also doesn't create the function.
I want to be able to create a function dynamically (i.e. via new-item because the contents and name of the function will vary based on input) from a function inside a module and have the newly created function available to call outside of the module.
Specifically, I want to be able to do this:
Import-module MyModule
New-Function
write-greeting
and see "hello world" as output
Any ideas?
Making the function visible is pretty easy: just change the name of your function in New-Item to have the global: scope modifier:
new-item -path function:\ -name global:write-greeting -value {write-output $greeting} #-Options AllScope
You're going to have a new problem with your example, though, because $greeting will only exist in the new-function scope, which won't exist when you call write-greeting. You're defining the module with an unbound scriptblock, which means it will look for $greeting in its scope (it's not going to find it), then it will look in any parent scopes. It won't see the one from new-function, so the only way you'll get any output is if the module or global scope contain a $greeting variable.
I'm not exactly sure what your real dynamic functions will look like, but the easiest way to work around the new issue is to create a new closure around your scriptblock like this:
new-item -path function:\ -name global:write-greeting -value {write-output $greeting}.GetNewClosure()
That will create a new dynamic module with a copy of the state available at the time. Of course, that creates a new problem in that the function won't go away if you call Remove-Module MyModule. Without more information, I'm not sure if that's a problem for you or not...
You were close with needing to dot source, but you were missing Export-ModuleMember. Here is a complete example:
function new-function
{
$greeting='hello world'
Invoke-Expression "function write-greeting { write-output '$greeting' }"
write-greeting
}
. new-function
Export-ModuleMember -Function write-greeting
You also did not need or want -Scope AllScope.
Using the global: scope qualifier appears to work, but isn't the ideal solution. First, your function could stomp on another function in the global scope, which modules normally shouldn't do. Second, your global function would not be removed if you remove the module. Last - your global function won't be defined in the scope of the module, so if it needed access to non-exported functions or variables in your module, you can't (easily) get at them.
Thanks to the other solutions I was able to come up with a little helper that allows me to add plain script files as functions and export them from the module in one step.
I have added the following function to my .psm1
function AddModuleFileAsFunction {
param (
[string] $Name,
[switch] $Export
)
$content = Get-Content (Join-Path $PSScriptRoot "$Name.ps1") -Raw
# Write-Host $content
$expression = @"
function $Name {
$content
}
"@
Invoke-Expression $expression
if ($Export) {
Export-ModuleMember -Function $Name
}
}
this allows me to load scripts as functions:
. AddModuleFileAsFunction "Get-WonderfulThings" -Export
( loads Get-WonderfulThings.ps1 body and exports it as function:Get-WonderfulThings )

Internal functions in PowerShell Module

I'm writing a module, and I have some helper functions that I don't want exposed, but I do want available to module functions internally. I have set up my directory structure like:
root\
..Private
....Invoke-PrivateTest.ps1
....private.psd1
....private.psm1
..Public
....Get-Something.ps1
....Public.psd1
....Public.psm1
..test.psd1
I've setup a repository on github https://github.com/jpbruckler/test that has all the module files in it.
The behavior that I'm expecting is that Get-Something is a public function: when running Get-Command -Module Test it should be listed, whereas Invoke-PrivateTest should not appear in that output.
When calling Get-Something, it should output the text "Invoke-PrivateTest called". Instead, I get an error stating that the command Invoke-PrivateTest doesn't exist.
I am explicitly saying in test.psd1 that only the Get-Something function is to be exported.
Both the Private module and the public module are being called via the NestedModules property in test.psd1. Any help or pointers would be appreciated.
Unless you have other reasons to put the code into separate (sub)modules, I'd keep it in one folder and control what's exported via the function names. Use the official <Verb>-<Noun> notation for the names of public functions and omit the hyphen in the names of private functions (<Verb><Noun>). That way you can export the public functions in your top-level .psd1 like this:
FunctionsToExport = '*-*'
@PetSerAl pointed me in the right direction. Ultimately this came down to a scoping issue. The way I had the module arranged, each sub-module would need to make a call to load the private module, which is a bunch of code duplication - and also what I was hoping to avoid by splitting out some helper functions.
To get it all to work, instead of multiple sub modules, I just broke up the Public folder into sub folders that will hold scripts that do similar things, basically removing all the .psd1 and .psm1 files from the Public directory. I did the same thing for the Private directory. This left me with a bunch of loose .ps1 files that I load in test.psm1 with the following code:
$Private = (Get-ChildItem -Path (Join-Path $PSScriptRoot 'Private') -Filter *.ps1)
$Public = (Get-ChildItem -Path (Join-Path $PSScriptRoot 'Public') -Filter *.ps1 -Recurse)
foreach ($Script in $Public) {
. $Script.FullName
Export-ModuleMember $Script.BaseName
}
foreach ($Script in $Private) {
. $Script.FullName
}
I've modified the test module at https://github.com/jpbruckler/test to reflect the changes I made.