I have a PowerShell module, and in the manifest I have declared the primary module and two nested modules.
The structure of the module is as follows:
- [dir] Pivot.DockerAdmin
- [manifest] Pivot.DockerAdmin.psd1
- [main module file] Pivot.DockerAdmin.psm1
- [nested script] DockerfileScripts.ps1
- [nested script] DockerCliScripts.ps1
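The relevant manifest entries look something like this (a simplified sketch; the version and other fields are illustrative, not copied from the actual manifest):
Pivot.DockerAdmin.psd1:
@{
    RootModule    = 'Pivot.DockerAdmin.psm1'
    ModuleVersion = '1.0.0'
    NestedModules = @('DockerfileScripts.ps1', 'DockerCliScripts.ps1')
}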
What works
The Primary Module (Pivot.DockerAdmin.psm1) can call functions in the Nested Module files (both DockerfileScripts.ps1 and DockerCliScripts.ps1) without a problem. Note: there's no specific logic to include these files, other than the entries in the manifest file.
What does NOT work
One Nested Module script file (DockerfileScripts.ps1) cannot call functions in the other Nested Module script file (DockerCliScripts.ps1).
The nested modules are just simple script files. So in effect, I'm using the NestedModule concept to logically group some functions in other files.
The module is set up correctly. I'm confident about this, because I even have Pester tests running on a build box without any special treatment.
I expect to be able to call a function in a nested module from another nested module, in the same way the primary module can call functions in any nested module, but this fails with an unrecognised command error.
If this is not possible are there any recommendations around organising script files within PS modules, so that a similar division of scripts / separation of concerns is possible?
So if you look at the example I posted here:
https://stackoverflow.com/a/55064995/7710456
I'll expand on it a bit.
I took another look at it and created a module manifest for each of the modules. All of these modules need to follow the standards for PowerShell modules: each lives in a folder with the same name as the module, in a location that is present in $env:PSModulePath.
Write-BazFunctions.psm1:
Function Write-Baz {
    return "Baz"
}
Write-BarFunctions.psm1:
Function Write-Bar {
    return "Bar"
}

Function Write-BarBaz {
    $bar = Write-Bar;
    $baz = Write-Baz;
    return ("{0}{1}" -f $bar, $baz)
}
Write-FooFunctions.psm1:
Function Write-Foo {
    return "Foo"
}

Function Write-FooBar {
    $foo = Write-Foo
    $bar = Write-Bar
    return ("{0}{1}" -f $foo, $bar)
}

Function Write-FooBarBaz {
    $foobar = Write-FooBar
    $baz = Write-Baz
    return ("{0}{1}" -f $foobar, $baz)
}

Function Write-FooBazBar {
    $foo = Write-Foo
    $bar = Write-Bar
    $baz = Write-Baz
    return ("{0}{1}{2}" -f $foo, $bar, $baz)
}
Now - differences. In the manifest for Write-BarFunctions (note required, not nested):
RequiredModules = @('Write-BazFunctions')
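In context, Write-BarFunctions.psd1 would contain something like this (a sketch; only the relevant entry is real, the other fields are illustrative):
@{
    RootModule        = 'Write-BarFunctions.psm1'
    ModuleVersion     = '1.0.0'
    RequiredModules   = @('Write-BazFunctions')
    FunctionsToExport = @('Write-Bar', 'Write-BarBaz')
}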
Note another difference from my original answer linked above: there I was targeting the psm1 files directly; instead, reference the modules just by name.
Once I did this, I was able to import Write-FooFunctions and all of the functions became available. Since Write-BarBaz in Write-BarFunctions calls Write-Baz from Write-BazFunctions, you can see that this allows nested modules to reference one another.
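To illustrate end to end (assuming Write-FooFunctions likewise lists Write-BarFunctions in its RequiredModules):
Import-Module Write-FooFunctions
Write-FooBar     # FooBar
Write-BarBaz     # BarBaz - one required module calling into another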
Background
I am running a Jenkins job, called Job A, that feeds its build parameters into a Perl script, called ScriptA.pl, in the format below:
#Check the input params
my $PARAM1 = $ENV{ "PARAM1" };
my $PARAM2 = $ENV{ "PARAM2" };
# ... more params fed in the same way

if ( $PARAM1 eq "" ) {
    print "PARAM1 is a required parameter.\n";
    exit 1;
}
if ( $PARAM2 eq "" ) {
    print "PARAM2 is a required parameter.\n";
    exit 1;
}
# ... more param checks done in the same way

### Script then runs a bunch of execution statements ###
Problem
I am trying to run this script from the Linux command line in csh, in the following way, to test that the execution component works:
% ScriptA.pl jetFuel steelBeams
And it thinks that no parameters have been entered, since this error is returned:
PARAM1 is a required parameter.
Question
How do I properly pass Jenkins-style parameters into the script from the command line?
If you can't modify the Perl script so it reads command-line parameters (which would improve its general usability to boot), maybe create a simple wrapper script:
#!/bin/sh
env PARAM1="$1" PARAM2="$2" perl ScriptA.pl
(Yes, sh. Nobody should use csh for anything in 2019.)
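A quick way to test the wrapper from the csh prompt (assuming it's saved as runwrapper.sh next to ScriptA.pl and made executable; the file name is made up):
% chmod +x runwrapper.sh
% ./runwrapper.sh jetFuel steelBeams
And if the Perl script can be changed after all, reading @ARGV with an environment fallback keeps both invocation styles working (a sketch; // requires Perl 5.10+):
# Take positional arguments first, fall back to Jenkins-provided env vars
my $PARAM1 = shift(@ARGV) // $ENV{"PARAM1"} // "";
my $PARAM2 = shift(@ARGV) // $ENV{"PARAM2"} // "";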
I'm writing a fork of DBI::Log; my purpose is to make it conditionally pluggable, so that SQL queries can be logged more flexibly, e.g. only from a particular module or after a specified call.
I've run into a strange problem: Sub::Override cannot override the DBI::db::* and DBI::st::execute methods.
The overriding scheme is as follows:
1) Save a reference to the original method in a variable, e.g. my $orig_execute = \&DBI::st::execute;
2) Create a new function that adds some logging code, e.g.
sub _execute {
    my ( $sth, @args ) = @_;
    warn "Execute is working!";
    my $log = dbilog( "execute", $sth->{Database}, $sth->{Statement}, \@args );
    my $retval = $orig_execute->( $sth, @args );
    dbilog2($log);
    return $retval;
}
3) Replace the old function with the new one using Sub::Override:
my $sub = Sub::Override->new;
$sub->replace( 'DBI::st::execute', \&_execute );
Here is the full code of the changed DBI::Log module. It should do the same as the original DBI::Log, just using Sub::Override, so the original unit tests must pass.
If I run test.pl with some added debug output, I can see that Sub::Override is doing its work, but for some reason the overridden function never runs - there is no "Execute is working!" message.
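To double-check the approach itself, the same scheme does work on a plain package (a minimal, self-contained sketch; Greeter is a made-up package, not part of DBI), so the failure seems specific to DBI's handle methods:
use strict;
use warnings;
use Sub::Override;

package Greeter;
sub hello { return "hello" }

package main;
# Same scheme: keep a reference to the original, install a logging wrapper
my $orig_hello = \&Greeter::hello;
my $sub = Sub::Override->new;
$sub->replace( 'Greeter::hello', sub {
    warn "Override is working!";
    return $orig_hello->(@_);
} );
print Greeter::hello(), "\n";   # warns "Override is working!", then prints "hello"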
I want to import an external function from a file, without converting the file to a module (we have hundreds of file-per-function scripts, so treating them all as modules is overkill).
Here is a code explanation. Please note that I have some additional logic in Import-Function, such as adding the scripts' root folder and checking file existence to throw a special error, so that this code isn't duplicated in each script which requires this kind of import.
C:\Repository\Foo.ps1:
Function Foo {
    Write-Host 'Hello world!'
}
C:\InvocationTest.ps1:
# Wrapper func
Function Import-Function ($Name) {
    # Checks and exception throwing are omitted
    . "C:\Repository\$Name.ps1"
    # Foo function can be invoked in this scope
}
# Wrapped import
Import-Function -Name 'Foo'
Foo # Exception: The term 'Foo' is not recognized
# Direct import
. "C:\Repository\Foo.ps1"
Foo # 'Hello world!'
Is there any trick to dot-source into the global scope?
You can't make the script run in a parent scope, but you can create a function in the global scope by explicitly scoping it.
Would something like this work for you?
# Wrapper func
Function Import-Function ($Path) {
    # Checks and exception throwing are omitted
    $script = Get-Content $Path -Raw
    $script = $script -replace '(?m)^function\s+((?!global[:]|local[:]|script[:]|private[:])[\w-]+)', 'function Global:$1'
    . ([scriptblock]::Create($script))
}
The above regex only targets root functions (functions left-justified; no whitespace to the left of the word function). In order to target all functions, regardless of spacing (including sub-functions), change the -replace line to:
$script = $script -replace '(?m)^\s*function\s+((?!global[:]|local[:]|script[:]|private[:])[\w-]+)', 'function Global:$1'
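With that in place, the wrapper behaves the way the question hoped (using the question's hypothetical paths):
Import-Function -Path 'C:\Repository\Foo.ps1'
Foo   # 'Hello world!' - Foo was re-scoped to Global: before dot-sourcing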
You can change the functions that are defined in the dot-sourced files so that they are defined in the global scope:
function Global:Get-SomeThing {
    # ...
}
When you dot-source that from within a function, the function defined in the dot-sourced file will be global. Not saying this is the best idea, just another possibility.
Just dot-source the function as well:
. Import-Function -Name 'Foo'
Foo # Hello world!
I can't remember a way to run a function in global scope right now. You could do something like this:
$name = "myscript"
$myimportcode= {
# Checks and exception throwing are omitted
. .\$name.ps1
# Foo function can be invoked in this scope
}
Invoke-Expression -Command $myimportcode.ToString()
When you convert the scriptblock to a string with .ToString(), the variable will expand.
I have a main script that I am running. It reads through a directory filled with other PowerShell scripts, dot-sources them all, and runs a predefined function in each, made up of 'do' plus the first portion of the dot-delimited file name. Example:
Run master.ps1
Master.ps1 dot sources .\resource\sub.ps1
Sub.ps1 has defined a function called 'dosub'
Master.ps1 runs 'dosub' using Invoke-Expression
Also defined in sub.ps1 is the function 'saysomething'. Implemented in 'dosub' is a call to 'saysomething'.
My problem is I keep getting the error:
The term 'saysomething' is not recognized as the name of a cmdlet,
function, script file, or operable program. Check the spelling of the
name, or if a path was included, verify that the path is correct and
try again.
Why can't the method 'dosub' find the method 'saysomething' which is defined in the same file?
master.ps1:
$handlersDir = "handlers"
$handlers = #(Get-ChildItem $handlersDir)
foreach ( $handler in $handlers ) {
. .\$handlersDir\$handler
$fnParts = $handler.Name.split(".")
$exp = "do" + $fnParts[0]
Invoke-Expression $exp
}
sub.ps1:
function saysomething() {
    Write-Host "I'm here to say something!"
}

function dosub() {
    saysomething
    Write-Host "In dosub!"
}
Your code works on my system. However you can simplify it a bit:
$handlersDir = "handlers"
$handlers = #(Get-ChildItem $handlersDir)
foreach ( $handler in $handlers )
{
. .\$handlersDir\$handler
$exp = "do" + $handler.BaseName
Write-Host "Calling $exp"
& $exp
}
Note the availability of the BaseName property. You also don't need to use Invoke-Expression; you can just call the named command using the call operator (&).
What you have given works as needed. You probably don't have the directories etc. set up properly on your machine, or you are running something else and posting different (working!) code here.
You can also make the following corrections:
. .\$handlersDir\$handler
instead of the above, you can do:
. $handler.fullname
Instead of splitting the filename, you can do:
$exp = "do" + $handler.basename
I want to write a couple of commands for the NuGet Package Manager Console to insert Gists from GitHub. I have 4 basic commands:
List-Gists 'user'
Gist-Info 'gistId'
Gist-Contents 'gistId' 'fileName'
Gist-Insert 'gistId' 'fileName'
All of my commands depend on a couple of utility functions, and I'm struggling with whether they need to be global or not.
# Json Parser
# (Assumes $serializer and $webClient are initialized elsewhere, e.g.:
#   Add-Type -AssemblyName System.Web.Extensions
#   $serializer = New-Object System.Web.Script.Serialization.JavaScriptSerializer
#   $webClient  = New-Object System.Net.WebClient)
function parseJson([string]$json, [bool]$throwError = $true) {
    try {
        $result = $serializer.DeserializeObject( $json );
        return $result;
    } catch {
        if($throwError) { throw "ERROR: Parsing Error" }
        else { return $null }
    }
}

function downloadString([string]$stringUrl) {
    try {
        return $webClient.DownloadString($stringUrl)
    } catch {
        throw "ERROR: Problem downloading from $stringUrl"
    }
}

function parseUrl([string]$url) {
    return parseJson(downloadString($url));
}
Can I just have these utility functions outside of my global functions, or will I need to include them in each global function's definition scope somehow?
No, they don't. From your init.ps1 you can import a PowerShell module that you wrote (a psm1 file), and moving forward this will be the way we recommend adding methods to the console environment.
Your init.ps1 would look something like this:
param($installPath, $toolsPath)
Import-Module (Join-Path $toolsPath MyModule.psm1)
In MyModule.psm1:
function MyPrivateFunction {
    "Hello World"
}

function Get-Value {
    MyPrivateFunction
}

# Export only the Get-Value method from this module so that's what gets added to the NuGet console environment
Export-ModuleMember Get-Value
You can get more information on modules here http://msdn.microsoft.com/en-us/library/dd878340(v=VS.85).aspx
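For completeness, in the Package Manager Console the result would look roughly like this (expected behavior, not a captured session):
PM> Get-Value
Hello World
PM> MyPrivateFunction
# fails with "The term 'MyPrivateFunction' is not recognized..." because it was never exported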