Creating functions dynamically in a module in PowerShell

Suppose I have the following code in a module (called MyModule.psm1, in the proper place for a module):
function new-function{
$greeting='hello world'
new-item -path function:\ -name write-greeting -value {write-output $greeting} -Options AllScope
write-greeting
}
After importing the module and running new-function I can successfully call the write-greeting function (created by new-function).
When I try to call the write-greeting function outside the scope of the new-function call, it fails because the function does not exist.
I've tried dot-sourcing new-function, but that doesn't help. I've supplied -Options AllScope, but apparently that only includes it in child scopes.
I've also tried explicitly following the new-item call with an export-modulemember write-greeting which doesn't give an error, but also doesn't create the function.
I want to be able to create a function dynamically (i.e. via new-item because the contents and name of the function will vary based on input) from a function inside a module and have the newly created function available to call outside of the module.
Specifically, I want to be able to do this:
Import-module MyModule
New-Function
write-greeting
and see "hello world" as output
Any ideas?

Making the function visible is pretty easy: just change the name of your function in New-Item to have the global: scope modifier:
new-item -path function:\ -name global:write-greeting -value {write-output $greeting} #-Options AllScope
You're going to have a new problem with your example, though, because $greeting will only exist in the new-function scope, which won't exist when you call write-greeting. You're defining the module with an unbound scriptblock, which means it will look for $greeting in its scope (it's not going to find it), then it will look in any parent scopes. It won't see the one from new-function, so the only way you'll get any output is if the module or global scope contain a $greeting variable.
I'm not exactly sure what your real dynamic functions will look like, but the easiest way to work around the new issue is to create a new closure around your scriptblock like this:
new-item -path function:\ -name global:write-greeting -value {write-output $greeting}.GetNewClosure()
That will create a new dynamic module with a copy of the state available at the time. Of course, that creates a new problem in that the function won't go away if you call Remove-Module MyModule. Without more information, I'm not sure if that's a problem for you or not...
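For reference, here is a minimal sketch of what MyModule.psm1 could look like with both changes applied (the global: name plus GetNewClosure(); the parentheses around the script block are my addition so the method call parses cleanly in argument mode):
# MyModule.psm1 -- sketch combining the global: name and a closure
function new-function {
    $greeting = 'hello world'
    new-item -path function:\ -name global:write-greeting -value ({ write-output $greeting }.GetNewClosure())
    write-greeting
}
After Import-Module MyModule and New-Function, calling write-greeting from the prompt should print "hello world", since the closure captured $greeting.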

You were close with needing to dot source, but you were missing Export-ModuleMember. Here is a complete example:
function new-function
{
$greeting='hello world'
Invoke-Expression "function write-greeting { write-output '$greeting' }"
write-greeting
}
. new-function
Export-ModuleMember -Function write-greeting
You also did not need or want -Options AllScope.
Using the global: scope qualifier appears to work, but isn't the ideal solution. First, your function could stomp on another function in the global scope, which modules normally shouldn't do. Second, your global function would not be removed if you remove the module. Last, your global function won't be defined in the scope of the module, so if it needs access to non-exported functions or variables in your module, you can't (easily) get at them.
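If you do go the global: route anyway, one way to mitigate the second drawback is to register a cleanup handler inside the module. A rough sketch (this OnRemove pattern is my addition, not part of the answer above; the function name is the one from the question):
# In MyModule.psm1: remove the dynamically created global function
# when the module itself is removed.
$ExecutionContext.SessionState.Module.OnRemove = {
    Remove-Item -Path function:\global:write-greeting -ErrorAction SilentlyContinue
}
With that in place, Remove-Module MyModule should also take write-greeting with it.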

Thanks to the other solutions I was able to come up with a little helper that allows me to add plain script files as functions and export them from the module in one step.
I have added the following function to my .psm1:
function AddModuleFileAsFunction {
    param (
        [string] $Name,
        [switch] $Export
    )
    $content = Get-Content (Join-Path $PSScriptRoot "$Name.ps1") -Raw
    # Write-Host $content
    $expression = @"
function $Name {
$content
}
"@
    Invoke-Expression $expression
    if ($Export) {
        Export-ModuleMember -Function $Name
    }
}
This allows me to load scripts as functions:
. AddModuleFileAsFunction "Get-WonderfulThings" -Export
(loads the Get-WonderfulThings.ps1 body and exports it as function:Get-WonderfulThings)
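To illustrate what gets wrapped, a hypothetical Get-WonderfulThings.ps1 could contain nothing but a function body (a param block plus statements); the helper turns the whole file into function Get-WonderfulThings { ... }:
# Get-WonderfulThings.ps1 -- hypothetical content, invented for illustration
param(
    [int] $Count = 3
)
1..$Count | ForEach-Object { "Wonderful thing #$_" }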

Related

Powershell: Export-ModuleMember with alias; set Scope

I have a Powershell module (.psm1 and .psd1) with several functions and aliases in it.
I added aliases like this:
function MyFunction {
<#
.SYNOPSIS
A Method.
.DESCRIPTION
Method does something.
#>
[Alias("MyAlias")]
param(
[String][Parameter(Mandatory = $true)] $ParamOne,
[String][Parameter(Mandatory = $false)] $ParamTwo
)
# Do Something
}
Later I export the function and the alias like this:
Export-ModuleMember -Function MyFunction -Alias MyAlias
After importing the module (Import-Module (Join-Path $env:MODULE_PATH 'MyModule') -Force), I am able to call MyFunction but I can't call MyAlias, as it is not found.
In the .psd1 file I exported all aliases like this AliasesToExport = '*' or like this AliasesToExport = @('MyAlias'). I still can't call MyAlias.
However it works, when I remove the Alias-Annotation from the function and export the function and alias like this:
Export-ModuleMember -Function MyFunction
New-Alias -Name MyAlias -Value MyFunction -Scope 'Global' -Force
It only works when the scope is set to Global, so my conclusion is that Export-ModuleMember exports aliases at a lower scope.
I think the initial version worked before, probably with an earlier version of PowerShell (but I'm not sure about that...).
Now my question: can I set the scope for exporting aliases with Export-ModuleMember, or can I perhaps set the scope in my .psd1 file?
Or do I have to change all my Export-ModuleMember calls and add New-Alias calls everywhere? (There are quite a lot.)

PowerShell functions load from function

I have a module with several files with functions and module loader.
The example function:
Function1.ps1
function Init() {
echo "I am the module initialization logic"
}
function DoStuff() {
echo "Me performing important stuff"
}
Module loader file:
Module1.psm1:
$script:Functions = Get-ChildItem $PSScriptRoot\*.ps1
function LoadModule {
    Param($path)
    foreach ($import in @($path)) {
        . $import.FullName
    }
}
LoadModule $script:Functions
Init # function isn't found
So I'm trying to load the functions from Function1.ps1 via the LoadModule function.
Debugging LoadModule shows the external functions being loaded, but once LoadModule finishes the functions are no longer accessible, so the script fails at the Init line.
However, the module loader rewritten without a LoadModule function works fine:
Module1.psm1:
Get-ChildItem $PSScriptRoot\*.ps1 | %{
    . $_.FullName
}
Init # In this case - all works fine
So as I understand it, functions loaded from within a function are placed in some isolated scope, and to access them I need to add some scope flag.
Does anybody know what I should add to make Init() accessible from the module .psm1 script body, but not accessible externally (without using Export-ModuleMember)?
Note: Edit 1, a clarification on what dot sourcing actually does, is included at the end.
First up, you are intermingling terminology and usage for Functions and Modules. Modules, which have the .psm1 extension, should be imported into the terminal using the Import-Module cmdlet. When dot sourcing, such as what you are doing here, you should only be targeting script files which contain functions, which are files with the .ps1 extension.
I too am relatively new to PowerShell, and I ran into the same problem. After spending around an hour reading up on the issue I was unable to find a solution, but a lot of the information I found points to it being an issue of scope. So I created a test, utilising three files.
foo.ps1
function foo {
Write-Output "foo"
}
bar.psm1
function bar {
Write-Output "bar"
}
scoping.ps1
function loader {
    echo "dot sourcing file"
    . ".\foo.ps1"
    foo
    echo "Importing module"
    Import-Module -Name ".\bar.psm1"
    bar
}
foo
bar
loader
foo
bar
pause
Let's walk through what this script does.
First we define a dummy loader function. This isn't a practical loader, but it is sufficient for testing scopes and the availability of functions within files that are loaded. This function dot sources the ps1 file containing the function foo, and uses Import-Module for the file containing the function bar.
Next, we call on the functions foo and bar, which will produce errors, in order to establish that neither are within the current scope. While not strictly necessary, this helps to illustrate their absence.
Next, we call the loader function. After dot sourcing foo.ps1, we see foo successfully executed because foo is within the current scope of the loader function. After using Import-Module for bar.psm1, we see bar also successfully executed. Now we exit the scope of the loader function and return to the main script.
Now we see the execution of foo fail with an error. This is because we dot sourced foo.ps1 within the scope of a function. However, because we imported bar.psm1, bar successfully executes. This is because modules are imported into the Global scope by default.
How can we use this to improve your LoadModule function? The main thing for this functionality is that you need to switch to using modules for your imported functions. Note that, from my testing, you cannot Import-Module the loader function; this only works if you dot source the loader.
LoadModule.ps1
function LoadModule($Path) {
    Get-ChildItem -Path "$Path" -Filter "*.psm1" -Recurse -File -Name | ForEach-Object {
        $File = "$Path$_"
        echo "Import-Module -Name $File"
        Import-Module -Name "$File" -Force
    }
}
And now in a terminal:
. ".\LoadModule.ps1"
LoadModule ".\"
foo
bar
Edit 1: A further clarification on dot sourcing
Dot sourcing is equivalent to copy-pasting the contents of the specified file into the file performing the dot source. The file performing the operation "imports" the contents of the target verbatim, performing no additional actions before proceeding to execute the "imported" code. e.g.
foo.ps1
Write-Output "I am foo"
. ".\bar.ps1"
bar.ps1
Write-Output "I am bar"
is effectively
Write-Output "I am foo"
Write-Output "I am bar"
Edit: You don't actually need to use Import-Module. So long as the modules are in your $env:PSModulePath, PowerShell will autoload any exported functions when they are first called. Source.
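As a quick sketch of that autoloading behaviour (assuming bar.psm1 has been placed in a folder named bar under one of the $env:PSModulePath locations and exports the function bar):
$env:PSModulePath -split ';'   # the folders PowerShell searches for modules
bar                            # no Import-Module needed; the module autoloads on first call
Get-Module bar                 # confirms the module was loaded automatically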
Depending on the specifics of your use case, there's another method you can use. This method addresses when you want to mass-import modules into a PowerShell session.
When you start PowerShell it looks at the value of the $env:PSModulePath environment variable to determine where it should look for modules. It then looks under those directories for directories containing psm1 and psd1 files. You can modify this variable during the session and then import modules by name. Here's an example, using what I've added to my PowerShell profile.ps1 file:
$MyPSPath = [Environment]::GetFolderPath("MyDocuments") + "\WindowsPowerShell"
$env:PSModulePath = $env:PSModulePath + ";$MyPSPath\Custom\Modules"
Import-Module `
-Name Confirm-Directory, `
Confirm-File, `
Get-FileFromURL, `
Get-RedirectedURL, `
Get-RemoteFileName, `
Get-ReparseTarget, `
Get-ReparseType, `
Get-SpecialPath, `
Test-ReparsePoint
In the event that you're new to PowerShell profiles (they're pretty much the same as Unix's ~/.profile file), you can find:
more information about PowerShell profiles here.
a summary of what profile files are used and when here.
While this may not seem as convenient as an auto-loader, installing & importing modules is the intended and accepted approach for this. Unless you have a specific reason not to, you should try to follow the established standards so that you aren't later fighting your way out of bad habits.
You can also modify the registry to achieve this.
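For example, a persistent per-user change can be made like this (a sketch; the extra path is illustrative, and the 'User' target is what ends up in the registry under HKCU\Environment):
# Sketch: persist an extra module directory for the current user.
$current  = [Environment]::GetEnvironmentVariable('PSModulePath', 'User')
$extra    = Join-Path ([Environment]::GetFolderPath('MyDocuments')) 'WindowsPowerShell\Custom\Modules'
$newValue = if ($current) { "$current;$extra" } else { $extra }
[Environment]::SetEnvironmentVariable('PSModulePath', $newValue, 'User')   # new sessions will pick this up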
After some research, I found that during the execution of the LoadModule function, all registered functions are added to the Function: provider.
So from within the LoadModule function body they can be enumerated via Get-ChildItem -Path Function:
[DBG]: PS > Get-ChildItem -Path Function:
CommandType Name Version Source
----------- ---- ------- ------
Function C:
Function Close-VSCodeHtmlContentView 0.2.0 PowerShellEditorServices.VSCode
Function Init 0.0 Module1
Function ConvertFrom-ScriptExtent 0.2.0
Function Module1 0.0 Module1
So we can store the list of functions in a variable at the beginning of the LoadModule invocation:
$loadedFunctions = Get-ChildItem -Path Function:
and, after the dot-source loading, retrieve the list of newly added functions:
Get-ChildItem -Path Function: | where { $loadedFunctions -notcontains $_ }
So the modified LoadModule function will look like:
function LoadModule {
    param ($path)
    $loadRef = Get-PSCallStack
    $loadedFunctions = Get-ChildItem -Path Function:
    foreach ($import in @($path)) {
        . $import.FullName
    }
    $functions = Get-ChildItem -Path Function: |
        Where-Object { $loadedFunctions -notcontains $_ } |
        ForEach-Object { Get-Item function:$_ }
    return $functions
}
The next step just assigns the functions to a list:
$script:functions = LoadModule $script:Private ##Function1.ps1
$script:functions += LoadModule $script:PublicFolder
After this step, we can invoke the initializer:
$initScripts = $script:functions | Where-Object { $_.Name -eq 'Initialize' }   # filter
$initScripts | ForEach-Object { & $_ }   ## execute
and export Public functions:
$script:functions |
    where { $_.Name -notlike '_*' } |   # do not export _Name functions
    %{ Export-ModuleMember -Function $_.Name }
I moved the full code of the module load function to the ModuleLoader.ps1 file; it can be found in the GitHub repo PowershellScripts.
And the complete version of the Module.psm1 file is:
if ($ModuleDevelopment) {
    . $PSScriptRoot\..\Shared-Functions\ModuleLoader.ps1 "$PSScriptRoot"
}
else {
    . $PSScriptRoot\Shared\ModuleLoader.ps1 "$PSScriptRoot"
}

How to create an alias with fixed/static parameters in Powershell [duplicate]

I'm trying to set up a Windows PowerShell alias to run MinGW's g++ executable with certain parameters. However, these parameters need to come after the file name and other arguments. I don't want to go through the hassle of trying to set up a function and all of that. Is there a way to simply say something like:
alias mybuild="g++ {args} -lib1 -lib2 ..."
or something along those lines? I am not all that familiar with PowerShell, and I'm having a difficult time finding a solution. Anyone?
You want to use a function, not an alias, as Roman mentioned. Something like this:
function mybuild { g++ $args -lib1 -lib2 ... }
To try this out, here's a simple example:
PS> function docmd { cmd /c $args there }
PS> docmd echo hello
hello there
PS>
You might also want to put this in your profile in order to have it available whenever you run PowerShell. The name of your profile file is contained in $profile.
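If you want it available in every session, a minimal sketch of persisting it (creates the profile file first if it doesn't exist):
# Persist the wrapper function in the PowerShell profile (sketch).
if (-not (Test-Path $profile)) { New-Item -ItemType File -Path $profile -Force | Out-Null }
Add-Content -Path $profile -Value 'function mybuild { g++ $args -lib1 -lib2 }'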
There is no such way built in. IMHO, a wrapper function is the best way to go so far. But I know that some workarounds were invented, for example:
https://web.archive.org/web/20120213013609/http://huddledmasses.org/powershell-power-user-tips-bash-style-alias-command
To build a function, store it as an alias, and persist the whole thing in your profile for later, use:
$g=[guid]::NewGuid();
echo "function G$g { COMMANDS }; New-Alias -Force ALIAS G$g">>$profile
where you have replaced ALIAS with the alias you want and COMMANDS with the command or string of commands to execute.
Of course, instead of doing that you can (and should!) make an alias for the above by:
echo 'function myAlias {
$g=[guid]::NewGuid();
$alias = $args[0]; $commands = $args[1]
echo "function G$g { $commands }; New-Alias -Force $alias G$g">>$profile
}; New-Alias alias myAlias'>>$profile
Just in case your brain got turned inside out from all the recursion (aliasing of aliases, etc.), after pasting the second code block to your PowerShell (and restarting PowerShell), a simple example of using it is:
alias myEcho 'echo $args[0]'
or without args:
alias myLs 'ls D:\MyFolder'
If you don't have a profile yet
The above method will fail if you don't have a profile yet!
In that case, use New-Item -type file -path $profile -force from this answer.
This is a sample function that will do different things based on how it was called:
Function Do-Something {
    [CmdletBinding()]
    [Alias('DOIT')]
    Param(
        [string] $option1,
        [string] $option2,
        [int] $option3)
    #$MyInvocation|select *|FL
    If ($MyInvocation.InvocationName -eq 'DOIT') {write-host "You told me to do it...so i did!" -ForegroundColor Yellow}
    Else {Write-Host "you were boring and said do something..." -ForegroundColor Green}
}
Creating a 'filter' is also an option, a lighter alternative to functions. It processes each element in the pipeline, assigning it the $_ automatic variable. So, for instance:
filter test { Write-Warning "$args $_" }
'foo','bar' | test 'This is'
returns:
WARNING: This is foo
WARNING: This is bar

Creating powershell modules from multiple files, referencing with module

I am creating a PowerShell script module using separate source files. What is the canonical way to reference source functions internal to the module from other internal source files?
For example if my module is created from PS source code in files "foo" and "bar"; and a function in "foo" needs to call a function in "bar", what is the best way to do that?
It doesn't seem like dot-sourcing would be a good idea. Nor does making the component files ("foo" and "bar") psm1 files. Is this the idea behind the "ScriptsToProcess" field in the psd1 file?
Am I thinking about this wrong (non-"PowerShelly")? Should I just dump everything into a single psm1?
I've personally followed the practice laid out by RamblingCookieMonster in his blog here: http://ramblingcookiemonster.github.io/Building-A-PowerShell-Module/
Which is to organise your functions in to separate .ps1 files under sub-folders \Public and \Private. Public contains the functions the user should be able to call directly, Private is for the functions that are only used internally by your module.
Then in the .psm1 file you load the functions via a loop and dot sourcing as follows:
#Get public and private function definition files.
$Public  = @( Get-ChildItem -Path $PSScriptRoot\Public\*.ps1 -ErrorAction SilentlyContinue )
$Private = @( Get-ChildItem -Path $PSScriptRoot\Private\*.ps1 -ErrorAction SilentlyContinue )
#Dot source the files
Foreach ($import in @($Public + $Private))
{
    Try
    {
        . $import.fullname
    }
    Catch
    {
        Write-Error -Message "Failed to import function $($import.fullname): $_"
    }
}
# Here I might...
# Read in or create an initial config file and variable
# Export Public functions ($Public.BaseName) for WIP modules
# Set variables visible to the module and its functions only
Export-ModuleMember -Function $Public.Basename
Source of this example: https://github.com/RamblingCookieMonster/PSStackExchange/blob/db1277453374cb16684b35cf93a8f5c97288c41f/PSStackExchange/PSStackExchange.psm1
You should then also explicitly list your Public function names in your .psd1 module manifest file under the FunctionsToExport setting. Doing this allows these functions to be discoverable and the module to be auto-loaded when they are used.
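For illustration, the relevant part of the manifest might look like this (module and function names are placeholders, not taken from the answer above):
# MyModule.psd1 (excerpt; names are illustrative)
@{
    RootModule        = 'MyModule.psm1'
    ModuleVersion     = '1.0.0'
    FunctionsToExport = @('Get-PublicThing', 'Set-PublicThing')   # list public functions explicitly rather than '*'
}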
Since I recently had to do this myself, I am sharing my solution. I have started grouping functions in psm1 files. These can be compiled into a single module with a single manifest.
This allows me to have groups of functions that can be packaged with multiple modules.
Write-BarFunctions.psm1
Function Write-Bar {
return "Bar"
}
Function Write-Baz {
return "Baz"
}
Write-FooFunctions.psm1
Function Write-Foo {
return "Foo"
}
Function Write-FooBar {
$foo = Write-Foo
$bar = Write-Bar
return ("{0}{1}" -f $foo, $bar)
}
Function Write-FooBarBaz {
$foobar = Write-FooBar
$baz = Write-Baz
return ("{0}{1}" -f $foobar, $baz)
}
Which are combined into a single module like this:
(formatted for readability)
New-ModuleManifest `
    -Path .\Write-FooBarBazCombos `
    -NestedModules @('.\FooFunctions\Write-FooFunctions.psm1', '.\BarFunctions\Write-BarFunctions.psm1') `
    -Guid (New-Guid) `
    -ModuleVersion '1.0.0.0' `
    -Description 'demonstrate multiple psm1 files as 1 powershell module with 1 powershell module manifest' `
    -PowerShellVersion $PSVersionTable.PSVersion.ToString() `
    -FunctionsToExport @('Write-Foo', 'Write-Bar', 'Write-FooBar', 'Write-FooBarBaz')
PowerShell output:
PS C:\LWC\scripting-misc\module-manifest-multiple-files-example> New-ModuleManifest -Path .\Write-FooBarBazCombos.psd1 -NestedModules @('.\Write-FooFunctions.psm1', '.\Write-BarFunctions.psm1') -Guid (New-Guid) -ModuleVersion '1.0.0.0' -Description 'demonstrate multiple psm1 files as 1 powershell module with 1 powershell module manifest' -PowerShellVersion $PSVersionTable.PSVersion.ToString() -FunctionsToExport @('Write-Foo', 'Write-Bar','Write-FooBar', 'Write-FooBarBaz')
PS C:\LWC\scripting-misc\module-manifest-multiple-files-example> Import-Module .\Write-FooBarBazCombos.psd1
PS C:\LWC\scripting-misc\module-manifest-multiple-files-example> Get-Command -Module Write-FooBarBazCombos
CommandType Name Version Source
----------- ---- ------- ------
Function Write-Bar 1.0.0.0 Write-FooBarBazCombos
Function Write-Foo 1.0.0.0 Write-FooBarBazCombos
Function Write-FooBar 1.0.0.0 Write-FooBarBazCombos
Function Write-FooBarBaz 1.0.0.0 Write-FooBarBazCombos
Note that Write-Baz is not exposed in the imported module because it is excluded from the FunctionsToExport parameter, so Write-FooBarBaz will error (intentional, to show the behavior).
PS C:\LWC\scripting-misc\module-manifest-multiple-files-example> Write-FooBar
FooBar
What you're left with in the directory:
PS C:\LWC\scripting-misc\module-manifest-multiple-files-example> Get-ChildItem | Select-Object Name
Name
----
Write-BarFunctions.psm1
Write-FooBarBazCombos.psd1
Write-FooFunctions.psm1
Addendum - I expanded on this answer in another question - here:
https://stackoverflow.com/a/56171985/7710456
@Ryan
I similarly assumed that dot sourcing wasn't the best choice here, but I'm not so sure anymore. I've used the NestedModules approach as well, but have run up against a specific problem. I've asked the question here:
PowerShell module, call function in NestedModule from another NestedModule
In summary I find that the PrimaryModule can call any function in any NestedModule. But one NestedModule is not able to call a function in another NestedModule.
Splitting your code out into many logical files is Developer 101 basics. So I'm really surprised there isn't a standard way of handling this.
Any help here much appreciated. Please read the linked question, it gives plenty of detail. Is the consensus that dot sourcing has to be used? Because I'm finding the module manifest way of splitting out the code very limiting.

How can I find the source path of an executing script? [duplicate]

This question already has answers here:
What's the best way to determine the location of the current PowerShell script?
(15 answers)
Closed 8 years ago.
I want to be able to tell what path my executing script was run from.
This will often not be $pwd.
I need to call other scripts that are in a folder structure relative to my script and while I could hard code the paths, that's both distasteful and a bit of a pain in the neck when trying to promote from "dev" to "test" to "production".
The ubiquitous script originally posted by Jeffrey Snover of the PowerShell team (given in Skyler's answer) and the variations posted by Keith, Cédric, and EBGreen all suffer from a serious drawback: whether the code reports what you expect depends on where you call it!
My code below overcomes this problem by simply referencing script scope instead of parent scope:
function Get-ScriptDirectory
{
Split-Path $script:MyInvocation.MyCommand.Path
}
To illustrate the problem, I created a test vehicle that evaluates the target expression in four different ways. (The bracketed terms are the keys to the following result table.)
inline code [inline]
inline function, i.e. function in the main program [inline function]
Dot-sourced function, i.e. the same function moved to a separate .ps1 file [dot source]
Module function, i.e. the same function moved to a separate .psm1 file [module]
The last two columns show the result of using script scope (i.e. $script:) versus parent scope (with -Scope 1). A result of "script" means that the invocation correctly reported the location of the script. The "module" result means the invocation reported the location of the module containing the function rather than the script that called the function; this indicates a drawback common to both approaches: you cannot put the function in a module.
Setting the module issue aside, the remarkable observation from the table is that the parent-scope approach fails most of the time (in fact, twice as often as it succeeds).
Finally, here is the test vehicle:
function DoubleNested()
{
"=== DOUBLE NESTED ==="
NestCall
}
function NestCall()
{
"=== NESTED ==="
"top level:"
Split-Path $script:MyInvocation.MyCommand.Path
#$foo = (Get-Variable MyInvocation -Scope 1).Value
#Split-Path $foo.MyCommand.Path
"immediate func call"
Get-ScriptDirectory1
"dot-source call"
Get-ScriptDirectory2
"module call"
Get-ScriptDirectory3
}
function Get-ScriptDirectory1
{
Split-Path $script:MyInvocation.MyCommand.Path
# $Invocation = (Get-Variable MyInvocation -Scope 1).Value
# Split-Path $Invocation.MyCommand.Path
}
. .\ScriptDirFinder.ps1
Import-Module ScriptDirFinder -force
"top level:"
Split-Path $script:MyInvocation.MyCommand.Path
#$foo = (Get-Variable MyInvocation -Scope 1).Value
#Split-Path $foo.MyCommand.Path
"immediate func call"
Get-ScriptDirectory1
"dot-source call"
Get-ScriptDirectory2
"module call"
Get-ScriptDirectory3
NestCall
DoubleNested
Contents of ScriptDirFinder.ps1:
function Get-ScriptDirectory2
{
Split-Path $script:MyInvocation.MyCommand.Path
# $Invocation = (Get-Variable MyInvocation -Scope 1).Value
# Split-Path $Invocation.MyCommand.Path
}
Contents of ScriptDirFinder.psm1:
function Get-ScriptDirectory3
{
Split-Path $script:MyInvocation.MyCommand.Path
# $Invocation = (Get-Variable MyInvocation -Scope 1).Value
# Split-Path $Invocation.MyCommand.Path
}
I am not familiar with what was introduced in PowerShell 2, but it could very well be that script scope did not exist in PowerShell 1, at the time Jeffrey Snover published his example.
I was surprised when, though I found his code example proliferated far and wide on the web, it failed immediately when I tried it! But that was because I used it differently than Snover's example (I called it not at script-top but from inside another function (my "nested twice" example).)
2011.09.12 Update
You can read about this with other tips and tricks on modules in my just-published article on Simple-Talk.com:
Further Down the Rabbit Hole: PowerShell Modules and Encapsulation.
You tagged your question for PowerShell version 1.0; however, if you have access to PowerShell version 3.0 you now have $PSCommandPath and $PSScriptRoot, which make getting the script path a little easier. Please refer to the "OTHER SCRIPT FEATURES" section on this page for more information.
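A quick sketch of how those automatic variables are typically used (the file name Helpers.ps1 is illustrative):
# PowerShell 3.0+ -- no helper function needed
$PSCommandPath                              # full path of the running script
$PSScriptRoot                               # directory containing the running script
. (Join-Path $PSScriptRoot 'Helpers.ps1')   # dot source a sibling script relative to this one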
We've been using code like this in most of our scripts for several years with no problems:
#--------------------------------------------------------------------
# Dot source support scripts
#--------------------------------------------------------------------
$ScriptPath = $MyInvocation.MyCommand.Path
$ScriptDir = Split-Path -Parent $ScriptPath
. $ScriptDir\BuildVars.ps1
. $ScriptDir\LibraryBuildUtils.ps1
. $ScriptDir\BuildReportUtils.ps1
I ran into the same issue recently. The following article helped me solve the problem: http://blogs.msdn.com/powershell/archive/2007/06/19/get-scriptdirectory.aspx
If you're not interested in how it works, here's all the code you need per the article:
function Get-ScriptDirectory
{
$Invocation = (Get-Variable MyInvocation -Scope 1).Value
Split-Path $Invocation.MyCommand.Path
}
And then you get the path by simply doing:
$path = Get-ScriptDirectory
I think you can find the path of your running script using
$MyInvocation.MyCommand.Path
Hope it helps !
Cédric
This is one of those oddities (to my mind at least) in PS. I'm sure there is a perfectly good reason for it, but it still seems odd to me. So:
If you are in a script but not in a function then $myInvocation.InvocationName will give you the full path including the script name. If you are in a script and inside a function then $myInvocation.ScriptName will give you the same thing.
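A small sketch illustrating that difference (the file name is illustrative):
# WhereAmI.ps1 -- run as .\WhereAmI.ps1
"At script level: $($MyInvocation.InvocationName)"      # the path/name the script was invoked with

function Show-Location {
    "Inside a function: $($MyInvocation.ScriptName)"     # full path of the script containing the call
}
Show-Location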
Thank you msorens! This really helped me with my custom module. In case anyone is interested in making their own, here is how mine is structured.
MyModule (folder)
- MyModule.psd1 (help New-ModuleManifest)
- MyScriptFile.ps1 (ps1 files are easy to test)
You then reference MyScriptFile.ps1 in MyModule.psd1. Referencing the .ps1 in the NestedModules array will place the functions in the module session state rather than the global session state. (How to Write a Module Manifest)
NestedModules = @('.\MyScriptFile.ps1','.\MyOtherScriptFile.ps1')
Content of MyScriptFile.ps1
function Get-ScriptDirectory {
    Split-Path $script:MyInvocation.MyCommand.Path
}
try {
    Export-ModuleMember -Function "*-*"
}
catch {}
The try/catch hides the error from Export-ModuleMember when running MyScriptFile.ps1
Copy the MyModule directory to one of the paths found here $env:PSModulePath
PS C:\>Import-Module MyModule
PS C:\>Get-Command -Module MyModule
CommandType Name ModuleName
----------- ---- ----------
Function Get-ScriptDirectory MyModule