I'm writing a module, and I have some helper functions that I don't want exposed but do want available internally to the module's functions. I have set up my directory structure like this:
root\
..Private
....Invoke-PrivateTest.ps1
....private.psd1
....private.psm1
..Public
....Get-Something.ps1
....Public.psd1
....Public.psm1
..test.psd1
I've set up a repository on GitHub (https://github.com/jpbruckler/test) that has all the module files in it.
The behavior I'm expecting is that Get-Something is a public function: when running Get-Command -Module Test it should be listed, while Invoke-PrivateTest should not appear in that command's output.
When calling Get-Something it should output the text "Invoke-PrivateTest called". Instead, I get an error stating that the command Invoke-PrivateTest doesn't exist.
I am explicitly saying in test.psd1 that only the Get-Something function is to be exported.
Both the Private module and the public module are being called via the NestedModules property in test.psd1. Any help or pointers would be appreciated.
Unless you have other reasons to put the code into separate (sub)modules, I'd keep it in one folder and control what's exported via the function names: use the official <Verb>-<Noun> notation for the names of public functions and omit the hyphen in the names of private functions (<Verb><Noun>). That way you can export the public functions in your top-level .psd1 like this:
FunctionsToExport = '*-*'
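A minimal sketch of that convention, borrowing names from the question (illustrative, not the actual module):

# test.psm1 - one module, no sub-modules
function InvokePrivateTest { # private: no hyphen, so '*-*' won't match it
    'Invoke-PrivateTest called'
}

function Get-Something {     # public: <Verb>-<Noun>, matched by '*-*'
    InvokePrivateTest
}

With FunctionsToExport = '*-*' in test.psd1, Get-Command -Module test then lists Get-Something but not InvokePrivateTest.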
@PetSerAl pointed me in the right direction. Ultimately this came down to a scoping issue. The way I had the module arranged, each sub-module would need to make a call to load the private module, which is a bunch of code duplication - and also exactly what I was hoping to avoid by splitting out some helper functions.
To get it all to work, instead of multiple sub-modules, I broke the Public folder into sub-folders that hold scripts that do similar things, removing all the .psd1 and .psm1 files from the Public directory. I did the same thing for the Private directory. This left me with a bunch of loose .ps1 files that I load in test.psm1 with the following code:
$Private = (Get-ChildItem -Path (Join-Path $PSScriptRoot 'Private') -Filter *.ps1)
$Public  = (Get-ChildItem -Path (Join-Path $PSScriptRoot 'Public') -Filter *.ps1 -Recurse)

foreach ($Script in $Public) {
    . $Script.FullName
    Export-ModuleMember $Script.BaseName
}

foreach ($Script in $Private) {
    . $Script.FullName
}
I've modified the test module at https://github.com/jpbruckler/test to reflect the changes I made.
Related
I have a module with several files containing functions, plus a module loader.
An example function file:
Function1.ps1
function Init() {
    echo "I am the module initialization logic"
}

function DoStuff() {
    echo "Me performing important stuff"
}
Module loader file:
Module1.psm1:
$script:Functions = Get-ChildItem $PSScriptRoot\*.ps1

function LoadModule {
    Param($path)
    foreach ($import in @($path)) {
        . $import.FullName
    }
}

LoadModule $script:Functions
Init # function not found
So I'm trying to load the functions from the file Function1.ps1 via the LoadModule function.
Debugging LoadModule shows the external functions being loaded, but once LoadModule finishes, the functions are no longer accessible, so the script fails on the Init line.
But the module loader rewritten without a LoadModule function works fine:
Module1.psm1:
Get-ChildItem $PSScriptRoot\*.ps1 | %{
    . $_.FullName
}
Init # in this case, everything works fine
So, as I understand it, functions loaded from inside a function are placed in some isolated scope, and to be able to access them I need to add some scope flag.
Maybe somebody knows what I should add to make the function Init() accessible from the Module1.psm1 script body, but not accessible externally (without using Export-ModuleMember)?
Note: Edit 1, a clarification on what dot sourcing actually does, is included at the end.
First up, you are intermingling terminology and usage for functions and modules. Modules, which have the .psm1 extension, should be imported into the session using the Import-Module cmdlet. When dot sourcing, as you are doing here, you should only be targeting script files that contain functions, which are files with the .ps1 extension.
I too am relatively new to PowerShell, and I ran into the same problem. After spending around an hour reading up on the issue I was unable to find a solution, but a lot of the information I found points to it being an issue of scope. So I created a test, utilising three files.
foo.ps1
function foo {
    Write-Output "foo"
}
bar.psm1
function bar {
    Write-Output "bar"
}
scoping.ps1
function loader {
    echo "dot sourcing file"
    . ".\foo.ps1"
    foo
    echo "Importing module"
    Import-Module -Name ".\bar.psm1"
    bar
}
foo
bar
loader
foo
bar
pause
Let's walk through what this script does.
First we define a dummy loader function. This isn't a practical loader, but it is sufficient for testing scopes and the availability of functions within files that are loaded. This function dot sources the ps1 file containing the function foo, and uses Import-Module for the file containing the function bar.
Next, we call the functions foo and bar, which will produce errors, in order to establish that neither is in the current scope. While not strictly necessary, this helps to illustrate their absence.
Next, we call the loader function. After dot sourcing foo.ps1, we see foo successfully executed because foo is within the current scope of the loader function. After using Import-Module for bar.psm1, we see bar also successfully executed. Now we exit the scope of the loader function and return to the main script.
Now we see the execution of foo fail with an error. This is because we dot sourced foo.ps1 within the scope of a function. However, because we imported bar.psm1, bar successfully executes. This is because modules are imported into the Global scope by default.
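As an aside, if you wanted the dot-sourced definitions to survive the call, you could invoke the loader itself with the dot operator, which runs the function body in the caller's scope; a sketch:

# Dot sourcing the *call* runs loader's body in the current scope,
# so the dot sourcing of .\foo.ps1 inside it takes effect here as well.
. loader
foo   # now succeeds after loader returns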
How can we use this to improve your LoadModule function? The main thing for this functionality is that you need to switch to using modules for your imported functions. Note that, from my testing, you cannot Import-Module the loader function; this only works if you dot source the loader.
LoadModule.ps1
function LoadModule($Path) {
    Get-ChildItem -Path "$Path" -Filter "*.psm1" -Recurse -File -Name | ForEach-Object {
        $File = "$Path$_"
        echo "Import-Module -Name $File"
        Import-Module -Name "$File" -Force
    }
}
And now in a terminal:
. ".\LoadModule.ps1"
LoadModule ".\"
foo
bar
Edit 1: A further clarification on dot sourcing
Dot sourcing is equivalent to copy-pasting the contents of the specified file into the file performing the dot source. The target's contents are "imported" verbatim, with no additional actions taken before the "imported" code is executed. E.g.
foo.ps1
Write-Output "I am foo"
. ".\bar.ps1"
bar.ps1
Write-Output "I am bar"
is effectively
Write-Output "I am foo"
Write-Output "I am bar"
Edit: You don't actually need to use Import-Module. So long as you have the modules in your $env:PSModulePath PowerShell will autoload any exported functions when they are first called. Source.
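For instance, with a hypothetical module MyTools (the module and function names here are placeholders) installed under one of the $env:PSModulePath directories:

# No Import-Module needed; the first call triggers autoloading.
Get-Widget                 # hypothetical function exported by MyTools
Get-Module -Name MyTools   # shows the module is now loaded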
Depending on the specifics of your use case, there's another method you can use. This method addresses when you want to mass-import modules into a PowerShell session.
When you start PowerShell, it looks at the value of the environment variable $env:PSModulePath to determine where it should look for modules. It then looks under those directories for directories containing .psm1 and .psd1 files. You can modify this variable during the session and then import modules by name. Here's an example, using what I've added to my PowerShell profile.ps1 file:
$MyPSPath = [Environment]::GetFolderPath("MyDocuments") + "\WindowsPowerShell"
$env:PSModulePath = $env:PSModulePath + ";$MyPSPath\Custom\Modules"
Import-Module `
-Name Confirm-Directory, `
Confirm-File, `
Get-FileFromURL, `
Get-RedirectedURL, `
Get-RemoteFileName, `
Get-ReparseTarget, `
Get-ReparseType, `
Get-SpecialPath, `
Test-ReparsePoint
In the event that you're new to PowerShell profiles (they're pretty much the same as Unix's ~/.profile file), you can find:
more information about PowerShell profiles here.
a summary of what profile files are used and when here.
While this may not seem as convenient as an auto-loader, installing & importing modules is the intended and accepted approach for this. Unless you have a specific reason not to, you should try to follow the established standards so that you aren't later fighting your way out of bad habits.
You can also modify the registry to achieve this.
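For example, rather than hand-editing the registry, a sketch of persisting the same change for the current user (the added path is illustrative):

# Append a custom module directory to the user-level PSModulePath.
# User environment variables live under HKCU:\Environment; new sessions pick this up.
$current = [Environment]::GetEnvironmentVariable('PSModulePath', 'User')
[Environment]::SetEnvironmentVariable('PSModulePath', "$current;C:\MyModules", 'User')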
After some research, I found that during the execution of the LoadModule function, all registered functions are added to the Function: provider.
So from the LoadModule function body they can be enumerated via Get-ChildItem -Path Function:
[DBG]: PS > Get-ChildItem -Path Function:

CommandType  Name                            Version  Source
-----------  ----                            -------  ------
Function     C:
Function     Close-VSCodeHtmlContentView    0.2.0    PowerShellEditorServices.VSCode
Function     Init                            0.0      Module1
Function     ConvertFrom-ScriptExtent        0.2.0
Function     Module1                         0.0      Module1
So we can store the function list in a variable at the beginning of the LoadModule invocation:
$loadedFunctions = Get-ChildItem -Path Function:
and, after the dot-sourcing, retrieve the list of newly added functions:
Get-ChildItem -Path Function: | where { $loadedFunctions -notcontains $_ }
So the modified LoadModule function will look like:
function LoadModule {
    param ($path)
    $loadRef = Get-PSCallStack
    $loadedFunctions = Get-ChildItem -Path Function:
    foreach ($import in @($path)) {
        . $import.FullName
    }
    $functions = Get-ChildItem -Path Function: |
        Where-Object { $loadedFunctions -notcontains $_ } |
        ForEach-Object { Get-Item function:$_ }
    return $functions
}
The next step just assigns the functions to a list (More about this):
$script:functions = LoadModule $script:Private ##Function1.ps1
$script:functions += LoadModule $script:PublicFolder
After this step, we can invoke the initializer:
$initScripts = $script:functions | Where-Object { $_.Name -eq 'Init' } # filter for the initializer
$initScripts | ForEach-Object { & $_ } ## execute
and export Public functions:
$script:functions |
    Where-Object { $_.Name -notlike '_*' } | # do not export _Name functions
    ForEach-Object { Export-ModuleMember -Function $_.Name }
I moved the full code of the module load function to the ModuleLoader.ps1 file; it can be found in the PowershellScripts GitHub repo.
And the complete version of the Module1.psm1 file is:
if ($ModuleDevelopment) {
    . $PSScriptRoot\..\Shared-Functions\ModuleLoader.ps1 "$PSScriptRoot"
}
else {
    . $PSScriptRoot\Shared\ModuleLoader.ps1 "$PSScriptRoot"
}
I'm creating a PowerShell script module using separate source files. What is the canonical way to reference source functions internal to the module from other internal source files?
For example if my module is created from PS source code in files "foo" and "bar"; and a function in "foo" needs to call a function in "bar", what is the best way to do that?
It doesn't seem like dot-sourcing would be a good idea. Nor does making the component files ("foo" and "bar") psm1 files. Is this the idea behind the "ScriptsToProcess" field in the psd1 file?
Am I thinking about this wrong (non-"PowerShelly")? Should I just dump everything into a single psm1?
I've personally followed the practice laid out by RamblingCookieMonster in his blog here: http://ramblingcookiemonster.github.io/Building-A-PowerShell-Module/
Which is to organise your functions into separate .ps1 files under sub-folders \Public and \Private. Public contains the functions the user should be able to call directly; Private is for the functions that are only used internally by your module.
Then in the .psm1 file you load the functions via a loop and dot sourcing as follows:
#Get public and private function definition files.
$Public  = @( Get-ChildItem -Path $PSScriptRoot\Public\*.ps1 -ErrorAction SilentlyContinue )
$Private = @( Get-ChildItem -Path $PSScriptRoot\Private\*.ps1 -ErrorAction SilentlyContinue )
#Dot source the files
Foreach ($import in @($Public + $Private))
{
    Try
    {
        . $import.fullname
    }
    Catch
    {
        Write-Error -Message "Failed to import function $($import.fullname): $_"
    }
}
# Here I might...
# Read in or create an initial config file and variable
# Export Public functions ($Public.BaseName) for WIP modules
# Set variables visible to the module and its functions only
Export-ModuleMember -Function $Public.Basename
Source of this example: https://github.com/RamblingCookieMonster/PSStackExchange/blob/db1277453374cb16684b35cf93a8f5c97288c41f/PSStackExchange/PSStackExchange.psm1
You should then also explicitly list your Public function names in your .psd1 module manifest file under the FunctionsToExport setting. Doing this allows these functions to be discoverable and the module to be auto-loaded when they are used.
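For example, in the .psd1 (the function names here are illustrative):

# MyModule.psd1 (excerpt) - an explicit list enables discovery and autoloading
FunctionsToExport = @(
    'Get-Something',
    'Set-Something'
)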
Since I recently had to do this myself, I'm sharing my solution. I've started grouping functions in .psm1 files, which can be combined into a single module with a single manifest.
This allows me to have groups of functions that can be packaged with multiple modules.
Write-BarFunctions.psm1
Function Write-Bar {
    return "Bar"
}

Function Write-Baz {
    return "Baz"
}
Write-FooFunctions.psm1
Function Write-Foo {
    return "Foo"
}

Function Write-FooBar {
    $foo = Write-Foo
    $bar = Write-Bar
    return ("{0}{1}" -f $foo, $bar)
}

Function Write-FooBarBaz {
    $foobar = Write-FooBar
    $baz = Write-Baz
    return ("{0}{1}" -f $foobar, $baz)
}
Which are combined into a single module like this:
(formatted for readability)
New-ModuleManifest `
    -Path .\Write-FooBarBazCombos.psd1 `
    -NestedModules @('.\FooFunctions\Write-FooFunctions.psm1', '.\BarFunctions\Write-BarFunctions.psm1') `
    -Guid (New-Guid) `
    -ModuleVersion '1.0.0.0' `
    -Description 'demonstrate multiple psm1 files as 1 powershell module with 1 powershell module manifest' `
    -PowerShellVersion $PSVersionTable.PSVersion.ToString() `
    -FunctionsToExport @('Write-Foo', 'Write-Bar', 'Write-FooBar', 'Write-FooBarBaz')
PowerShell output:
PS C:\LWC\scripting-misc\module-manifest-multiple-files-example> New-ModuleManifest -Path .\Write-FooBarBazCombos.psd1 -NestedModules @('.\Write-FooFunctions.psm1', '.\Write-BarFunctions.psm1') -Guid (New-Guid) -ModuleVersion '1.0.0.0' -Description 'demonstrate multiple psm1 files as 1 powershell module with 1 powershell module manifest' -PowerShellVersion $PSVersionTable.PSVersion.ToString() -FunctionsToExport @('Write-Foo', 'Write-Bar','Write-FooBar', 'Write-FooBarBaz')
PS C:\LWC\scripting-misc\module-manifest-multiple-files-example> Import-Module .\Write-FooBarBazCombos.psd1
PS C:\LWC\scripting-misc\module-manifest-multiple-files-example> Get-Command -Module Write-FooBarBazCombos
CommandType  Name             Version  Source
-----------  ----             -------  ------
Function     Write-Bar        1.0.0.0  Write-FooBarBazCombos
Function     Write-Foo        1.0.0.0  Write-FooBarBazCombos
Function     Write-FooBar     1.0.0.0  Write-FooBarBazCombos
Function     Write-FooBarBaz  1.0.0.0  Write-FooBarBazCombos
Note that Write-Baz is not exposed in the imported module, as it is excluded from the FunctionsToExport parameter, so Write-FooBarBaz will error (intentional, to show the behavior).
PS C:\LWC\scripting-misc\module-manifest-multiple-files-example> Write-FooBar
FooBar
What you're left with in the directory:
PS C:\LWC\scripting-misc\module-manifest-multiple-files-example> Get-ChildItem | Select-Object Name
Name
----
Write-BarFunctions.psm1
Write-FooBarBazCombos.psd1
Write-FooFunctions.psm1
Addendum - I expanded on this answer in another question - here:
https://stackoverflow.com/a/56171985/7710456
@Ryan
I similarly assumed that dot sourcing wasn't the best choice here, but I'm not so sure anymore. I've used the NestedModules approach as well, but have run up against a specific problem. I've asked the question here:
PowerShell module, call function in NestedModule from another NestedModule
In summary I find that the PrimaryModule can call any function in any NestedModule. But one NestedModule is not able to call a function in another NestedModule.
Splitting your code out into many logical files is Developer 101 basics. So I'm really surprised there isn't a standard way of handling this.
Any help here much appreciated. Please read the linked question, it gives plenty of detail. Is the consensus that dot sourcing has to be used? Because I'm finding the module manifest way of splitting out the code very limiting.
I'm using classes in PS with WinSCP PowerShell Assembly. In one of the methods I'm using various types from WinSCP.
This works fine as long as I already have the assembly added - however, because of the way PowerShell reads the script when using classes (I assume?), an error is thrown before the assembly can be loaded.
In fact, even if I put a Write-Host at the top, it will not load.
Is there any way of forcing something to run before the rest of the file is parsed?
Transfer() {
    $this.Logger = [Logger]::new()
    try {
        Add-Type -Path $this.Paths.WinSCP
        $ConnectionType = $this.FtpSettings.Protocol.ToString()
        $SessionOptions = New-Object WinSCP.SessionOptions -Property @{
            Protocol = [WinSCP.Protocol]::$ConnectionType
            HostName = $this.FtpSettings.Server
            UserName = $this.FtpSettings.Username
            Password = $this.FtpSettings.Password
        }
Results in an error like this:
Protocol = [WinSCP.Protocol]::$ConnectionType
Unable to find type [WinSCP.Protocol].
But it doesn't matter where I load the assembly. Even if I put the Add-Type cmdlet on the topmost line with a direct path to WinSCPnet.dll, it won't load - it detects the missing types before running anything, it seems.
As you've discovered, PowerShell refuses to run scripts that contain class definitions referencing then-unavailable (not-yet-loaded) types - the script-parsing stage fails.
As of PSv5.1, even a using assembly statement at the top of a script does not help in this case, because in your case the type is referenced in the context of a PS class definition - this may get fixed in PowerShell Core, however; the required work, along with other class-related issues, is being tracked in GitHub issue #6652.
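For reference, the statement in question looks like this; it must appear at the very top of the script, and, as noted, it does not currently help when the type is referenced inside a class:

using assembly C:\path\to\WinSCPnet.dll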
The proper solution is to create a script module (*.psm1) whose associated manifest (*.psd1) declares the assembly containing the referenced types a prerequisite, via the RequiredAssemblies key.
See alternative solution at the bottom if using modules is not an option.
Here's a simplified walk-through:
Create test module tm as follows:
Create module folder ./tm and manifest (*.psd1) in it:
# Create module folder (remove a preexisting ./tm folder if this fails).
$null = New-Item -Type Directory -ErrorAction Stop ./tm
# Create manifest file that declares the WinSCP assembly a prerequisite.
# Modify the path to the assembly as needed; you may specify a relative path, but
# note that the path must not contain variable references (e.g., $HOME).
New-ModuleManifest ./tm/tm.psd1 -RootModule tm.psm1 `
-RequiredAssemblies C:\path\to\WinSCPnet.dll
Create the script module file (*.psm1) in the module folder:
Create file ./tm/tm.psm1 with your class definition; e.g.:
class Foo {
    # As a simple example, return the full name of the WinSCP type.
    [string] Bar() {
        return [WinSCP.Protocol].FullName
    }
}
Note: In the real world, modules are usually placed in one of the standard locations defined in $env:PSModulePath, so that the module can be referenced by name only, without needing to specify a (relative) path.
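For instance, a sketch of such a placement (the destination path assumes Windows PowerShell's default per-user module location):

# Copy the module folder into a standard location; afterwards
# 'using module tm' works without a path.
Copy-Item -Recurse ./tm "$HOME\Documents\WindowsPowerShell\Modules\tm"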
Use the module:
PS> using module ./tm; [Foo]::new().Bar()
WinSCP.Protocol
The using module statement imports the module and - unlike Import-Module -
also makes the class defined in the module available to the current session.
Since importing the module implicitly loaded the WinSCP assembly thanks to the RequiredAssemblies key in the module manifest, instantiating class Foo, which references the assembly's types, succeeded.
If you need to determine the path to the dependent assembly dynamically in order to load it, or even to compile one ad hoc (in which case a RequiredAssemblies manifest entry isn't an option), you should be able to use the approach recommended in Justin Grote's helpful answer: a ScriptsToProcess manifest entry that points to a *.ps1 script that calls Add-Type to dynamically load dependent assemblies before the script module (*.psm1) is loaded. However, this doesn't actually work as of PowerShell 7.2.0-preview.9: while the definition of the class in the *.psm1 file relying on the dependent assembly's types succeeds, the caller doesn't see the class until a script with a using module ./tm statement is executed a second time:
Create a sample module:
# Create module folder (remove a preexisting ./tm folder if this fails).
$null = New-Item -Type Directory -ErrorAction Stop ./tm

# Create a helper script that loads the dependent assembly.
# In this simple example, the assembly is created dynamically,
# with a type [demo.FooHelper].
@'
Add-Type @"
namespace demo {
    public class FooHelper {
    }
}
"@
'@ > ./tm/loadAssemblies.ps1

# Create the root script module.
# Note how the [Foo] class definition references the
# [demo.FooHelper] type created in the loadAssemblies.ps1 script.
@'
class Foo {
    # Simply return the full name of the dependent type.
    [string] Bar() {
        return [demo.FooHelper].FullName
    }
}
'@ > ./tm/tm.psm1
# Create the manifest file, designating loadAssemblies.ps1
# as the script to run (in the caller's scope) before the
# root module is parsed.
New-ModuleManifest ./tm/tm.psd1 -RootModule tm.psm1 -ScriptsToProcess loadAssemblies.ps1
Now, still as of PowerShell 7.2.0-preview.9, trying to use the module's [Foo] class inexplicably succeeds only after calling using module ./tm twice - which you cannot do in a single script, rendering this approach useless for now:
# As of PowerShell 7.2.0-preview.9:
# !! First attempt FAILS:
PS> using module ./tm; [Foo]::new().Bar()
InvalidOperation: Unable to find type [Foo]
# Second attempt: OK
PS> using module ./tm; [Foo]::new().Bar()
demo.FooHelper
The problem is a known one, as it turns out, and dates back to 2017 - see GitHub issue #2962.
If your use case doesn't allow the use of modules:
In a pinch, you can use Invoke-Expression, but note that it's generally better to avoid Invoke-Expression in the interest of robustness and so as to avoid security risks.[1]
# Adjust this path as needed.
Add-Type -LiteralPath C:\path\to\WinSCPnet.dll
# By placing the class definition in a string that is invoked at *runtime*
# via Invoke-Expression, *after* the WinSCP assembly has been loaded, the
# class definition succeeds.
Invoke-Expression @'
class Foo {
    # Simply return the full name of the WinSCP type.
    [string] Bar() {
        return [WinSCP.Protocol].FullName
    }
}
'@
[Foo]::new().Bar()
Alternatively, use a two-script approach:
A main script that loads the dependent assemblies,
which then dot-sources a second script that contains the class definitions relying on the types from the dependent assemblies.
This approach is demonstrated in Takophiliac's helpful answer.
[1] It's not a concern in this case, but generally, given that Invoke-Expression can invoke any command stored in a string, applying it to strings not fully under your control can result in the execution of malicious commands - see this answer for more information.
This caveat applies analogously to other languages, such as Bash's built-in eval command.
An additional solution is to put your Add-Type logic into a separate .ps1 file (name it AssemblyBootStrap.ps1 or something) and then add it to the ScriptsToProcess section of your module manifest. ScriptsToProcess runs before the root script module (*.psm1), and the assemblies will be loaded at the time the class definitions are looking for them.
Although it's not the solution per se, I worked around it. However, I'll leave the question open, as it still stands.
Instead of using WinSCP types, I just use strings, seeing as I already have enums that are identical to WinSCP.Protocol:
Enum Protocols {
    Sftp
    Ftp
    Ftps
}
And having set Protocol in FtpSettings:
$FtpSettings.Protocol = [Protocols]::Sftp
I can set the protocol like this
$SessionOptions = New-Object WinSCP.SessionOptions -Property @{
    Protocol = $this.FtpSettings.Protocol.ToString()
    HostName = $this.FtpSettings.Server
    UserName = $this.FtpSettings.Username
    Password = $this.FtpSettings.Password
}
I used a similar approach for [WinSCP.TransferMode]:
$TransferOptions.TransferMode = "Binary" #[WinSCP.TransferMode]::Binary
First, I would recommend mklement0's answer.
However, there is a bit of running around you can do to get much the same effect with a bit less work, which can be helpful in smaller projects or in the early stages.
It's possible to merely dot source another .ps1 file in your code, one which contains your classes referencing a not-yet-loaded library, after you load the referenced assembly.
##########
MyClasses.ps1
Class myClass
{
    [3rdParty.Fancy.Object] $MyFancyObject
}
Then you can call your custom class library from your main script with a .
#######
MyMainScriptFile.ps1
#Load fancy object's library
Import-Module Fancy.Module #If it's in a module
Add-Type -Path "c:\Path\To\FancyLibrary.dll" #if it's in a dll you have to reference
. C:\Path\to\MyClasses.ps1
The original parsing will pass muster, the script will start, your reference will be added, and then as the script continues, the . sourced file will be read and parsed, adding your custom classes without issue as their reference library is in memory by the time the code is parsed.
It's still very much better to make and use a module with a proper manifest, but this gets by easily and is very easy to remember and use.
Add-Type -LiteralPath C:\Windows\Microsoft.NET\Framework\v4.0.30319\System.IO.Compression.FileSystem.dll
$psclasses = GC "C:\Windows\Temp\foobarclass.ps1" -Raw
Invoke-Expression $psclasses
[Foo]::new().Bar()
I am trying to make my way through the PowerShell documentation and have hit a point of confusion.
I have created a cmdlet assembly using the sample code located here.
I can load the module by issuing the command:
Import-Module -Name NameOfAssembly
This is, of course, if the assembly is located in a folder where PowerShell can find it.
If I create a module manifest, the only way I have been able to get the module manifest to load the assembly is to add the assembly on the RequiredModules line of the manifest. The documentation (located here) states that this doesn't actually load any modules. From what I have observed this is contradictory to what actually happens. Am I reading / understanding this incorrectly? If not, what am I missing? Is there a better way to get a cmdlet assembly (or assemblies) (I think they are called binary modules) deployed?
First of all, the same as with script modules, you can get NameOfModule.dll to load by simply putting it in a NameOfModule subfolder of any of the folders listed in $env:PSModulePath:
$dir = mkdir $profile\..\Modules\Greetings -Force
$dllPath = Join-Path -Path $dir.FullName -ChildPath Greetings.dll
Add-Type @'
using System.Management.Automation;

namespace SendGreeting
{
    [Cmdlet(VerbsCommunications.Send, "Greeting")]
    public class SendGreetingCommand : Cmdlet
    {
        [Parameter(Mandatory = true)]
        public string Name
        {
            get { return name; }
            set { name = value; }
        }
        private string name;

        protected override void ProcessRecord()
        {
            WriteObject("Hello " + name + "!");
        }
    }
}
'@ -OutputAssembly $dllPath
Import-Module Greetings -PassThru
In case you really need a manifest (e.g. for some metadata, or external files), you have two options, depending on the PowerShell version:
v2 - ModuleToProcess (works with newer versions, but causes a warning)
v3+ - RootModule (fails on v2)
The key you've used, RequiredModules, is there to provide a way to name your dependencies, i.e. modules that your code depends on. RequiredAssemblies kind of works, because adding any assembly that contains PowerShell cmdlets "just works" - but that approach hides from a future user where your commands are defined.
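So, for a binary module, a sketch of the v3+ option from the list above (the paths and names are illustrative):

# Generate Greetings.psd1 next to Greetings.dll; RootModule names the assembly.
New-ModuleManifest .\Greetings\Greetings.psd1 `
    -RootModule Greetings.dll `
    -CmdletsToExport Send-Greeting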
Suppose I have the following code in a module (called MyModule.psm1, in the proper place for a module):
function new-function {
    $greeting = 'hello world'
    new-item -path function:\ -name write-greeting -value { write-output $greeting } -Options AllScope
    write-greeting
}
After importing the module and running new-function I can successfully call the write-greeting function (created by new-function).
When I try to call the write-greeting function outside the scope of the new-function call, it fails because the function does not exist.
I've tried dot-sourcing new-function, but that doesn't help. I've supplied the -option Allscope, but apparently that only includes it in child scopes.
I've also tried explicitly following the new-item call with an export-modulemember write-greeting which doesn't give an error, but also doesn't create the function.
I want to be able to create a function dynamically (i.e. via new-item because the contents and name of the function will vary based on input) from a function inside a module and have the newly created function available to call outside of the module.
Specifically, I want to be able to do this:
Import-module MyModule
New-Function
write-greeting
and see "hello world" as output
Any ideas?
Making the function visible is pretty easy: just change the name of your function in New-Item to have the global: scope modifier:
new-item -path function:\ -name global:write-greeting -value {write-output $greeting} #-Options AllScope
You're going to have a new problem with your example, though, because $greeting will only exist in the new-function scope, which won't exist when you call write-greeting. You're defining the new function with an unbound scriptblock, which means it will look for $greeting in its own scope (it's not going to find it), then in any parent scopes. It won't see the one from new-function, so the only way you'll get any output is if the module or global scope contains a $greeting variable.
I'm not exactly sure what your real dynamic functions will look like, but the easiest way to work around the new issue is to create a new closure around your scriptblock like this:
new-item -path function:\ -name global:write-greeting -value {write-output $greeting}.GetNewClosure()
That will create a new dynamic module with a copy of the state available at the time. Of course, that creates a new problem in that the function won't go away if you call Remove-Module MyModule. Without more information, I'm not sure if that's a problem for you or not...
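With the closure in place, the usage from the question then works; a quick sketch:

Import-Module MyModule
New-Function     # creates global:write-greeting, closing over $greeting
write-greeting   # outputs: hello world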
You were close with needing to dot source, but you were missing Export-ModuleMember. Here is a complete example:
function new-function
{
    $greeting = 'hello world'
    Invoke-Expression "function write-greeting { write-output '$greeting' }"
    write-greeting
}

. new-function
Export-ModuleMember -Function write-greeting
You also did not need or want -Scope AllScope.
Using the global: scope qualifier appears to work, but isn't the ideal solution. First, your function could stomp on another function in the global scope, which modules normally shouldn't do. Second, your global function would not be removed if you remove the module. Last - your global function won't be defined in the scope of the module, so if it needed access to non-exported functions or variables in your module, you can't (easily) get at them.
Thanks to the other solutions, I was able to come up with a little helper that allows me to add plain script files as functions and export them for the module in one step.
I have added the following function to my .psm1
function AddModuleFileAsFunction {
    param (
        [string] $Name,
        [switch] $Export
    )
    $content = Get-Content (Join-Path $PSScriptRoot "$Name.ps1") -Raw
    # Write-Host $content
    $expression = @"
function $Name {
$content
}
"@
    Invoke-Expression $expression
    if ($Export) {
        Export-ModuleMember -Function $Name
    }
}
this allows me to load scripts as functions:
. AddModuleFileAsFunction "Get-WonderfulThings" -Export
(loads the body of Get-WonderfulThings.ps1 and exports it as function:Get-WonderfulThings)