Calling a local function from a dot-sourced file - PowerShell

I have a main script that I am running. It reads through a directory filled with other PowerShell scripts, dot-sources them all, and runs a predefined function in each one, named after the first portion of the dot-delimited file name. Example:
Run master.ps1
Master.ps1 dot-sources .\resource\sub.ps1
Sub.ps1 has defined a function called 'dosub'
Master.ps1 runs 'dosub' using Invoke-Expression
Also defined in sub.ps1 is the function 'saysomething'. Implemented in 'dosub' is a call to 'saysomething'.
My problem is I keep getting the error:
The term 'saysomething' is not recognized as the name of a cmdlet,
function, script file, or operable program. Check the spelling of the
name, or if a path was included, verify that the path is correct and
try again.
Why can't the method 'dosub' find the method 'saysomething' which is defined in the same file?
master.ps1:
$handlersDir = "handlers"
$handlers = @(Get-ChildItem $handlersDir)
foreach ( $handler in $handlers ) {
    . .\$handlersDir\$handler
    $fnParts = $handler.Name.split(".")
    $exp = "do" + $fnParts[0]
    Invoke-Expression $exp
}
sub.ps1:
function saysomething() {
    Write-Host "I'm here to say something!"
}
function dosub() {
    saysomething
    Write-Host "In dosub!"
}

Your code works on my system. However, you can simplify it a bit:
$handlersDir = "handlers"
$handlers = @(Get-ChildItem $handlersDir)
foreach ( $handler in $handlers )
{
    . .\$handlersDir\$handler
    $exp = "do" + $handler.BaseName
    Write-Host "Calling $exp"
    & $exp
}
Note the availability of the BaseName property. You also don't need to use Invoke-Expression. You can just call the named command using the call (&) operator.
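For example, a quick check in the console (a sketch, assuming a handlers\sub.ps1 like the one in the question has been dot-sourced):
(Get-Item .\handlers\sub.ps1).BaseName            # sub
& ("do" + (Get-Item .\handlers\sub.ps1).BaseName) # invokes dosub via the call operator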

What you have posted works as expected. You probably don't have the directories etc. set up properly on your machine, or you are running something else and posting different (working!) code here.
You can also make the following corrections:
. .\$handlersDir\$handler
instead of above you can do:
. $handler.fullname
Instead of splitting the filename, you can do:
$exp = "do" + $handler.basename
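Putting both suggestions together, the loop reduces to something like this (a sketch, assuming the same handlers directory as in the question):
$handlers = @(Get-ChildItem ".\handlers")
foreach ($handler in $handlers) {
    . $handler.FullName          # dot-source by full path
    & ("do" + $handler.BaseName) # call the handler's entry function
}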


catch error from param in PowerShell without using try-catch

I'm writing a PowerShell script to call a file conversion function (to execute an ANYTRAN file).
I was told to put param() in a try-catch by my boss but that seems to cause an error.
Then how can I catch the error from param()?
I think it's possible to use an if statement in a parent shell.
Please give me some advice.
Below is the code.
$ErrorActionPreference = "Stop"
try {
    #-----------------------------------------------------------
    # Initialization
    #-----------------------------------------------------------
    #---# Define common environment variables
    #---& ".\commonEnv.ps1"
    # Include common functions
    . (Resolve-Path ".\commonFunc.ps1").Path
    # Get parameters
    Param(
        $inFile
        ,$outFile
        ,$flgZeroByte
    )
Your issue is not in parameter passing. param, unless its syntax is incorrect, logically can't crash (and even if it could, you couldn't handle that except with error handling at a scope higher than the one where the function is defined).
function sc1 {
    param ( $v_f = '' )
    process {
        Write-Host $v_f
    }
}
sc1 '& 2'
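To illustrate that last point, here is a minimal sketch (hypothetical Convert-File and parameter names): a parameter validation failure surfaces in the caller's scope, so that is where the try/catch has to live.
function Convert-File {
    param(
        [Parameter(Mandatory)][ValidateNotNullOrEmpty()][string]$inFile,
        [string]$outFile
    )
    Write-Host "Converting $inFile to $outFile"
}
try {
    Convert-File -inFile ''   # binding fails here, before the function body runs
}
catch {
    Write-Host "Caught: $($_.Exception.Message)"
}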

PowerShell module, call function in NestedModule from another NestedModule

I have a PowerShell module, and in the manifest I have declared the primary module and two nested modules.
The structure of the module is as follows:
- [dir] Pivot.DockerAdmin
- [manifest] Pivot.DockerAdmin.psd1
- [main module file] Pivot.DockerAdmin.psm1
- [nested script] DockerfileScripts.ps1
- [nested script] DockerCliScripts.ps1
What works
The Primary Module (Pivot.DockerAdmin.psm1) can call functions in the Nested Module files (both DockerfileScripts.ps1, DockerCliScripts.ps1), without a problem. Note, there's no specific logic to include these files, other than the entry in the manifest file.
What does NOT work
One Nested Module script file (DockerfileScripts.ps1) cannot call functions in the other Nested Module script file (DockerCliScripts.ps1).
The nested modules are just simple script files. So in effect, I'm using the NestedModule concept to logically group some functions in other files.
The module is setup correctly. I'm confident about this, because I even have Pester tests running on a build box without any special treatment.
I expect to be able to call a function in a nested module from another nested module, in the same way the primary module can call functions in any nested module, but this fails with an unrecognised command error.
If this is not possible are there any recommendations around organising script files within PS modules, so that a similar division of scripts / separation of concerns is possible?
So if you look at the example I posted here:
https://stackoverflow.com/a/55064995/7710456
I'll expand on it a bit.
I took another look at it and created a module manifest for each of the modules; all of those modules need to follow the standards for PowerShell modules (each in a folder with the same name as the module, in a location that is present in $env:PSModulePath).
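The on-disk layout that implies looks like this (module names taken from the listings below; the parent folder must be on $env:PSModulePath):
- [dir] Write-BazFunctions
- [manifest] Write-BazFunctions.psd1
- [module file] Write-BazFunctions.psm1
- [dir] Write-BarFunctions
- [manifest] Write-BarFunctions.psd1
- [module file] Write-BarFunctions.psm1
- [dir] Write-FooFunctions
- [manifest] Write-FooFunctions.psd1
- [module file] Write-FooFunctions.psm1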
Write-BazFunctions.psm1:
Function Write-Baz {
    return "Baz"
}
Write-BarFunctions.psm1:
Function Write-Bar {
    return "Bar"
}
Function Write-BarBaz {
    $bar = Write-Bar;
    $baz = Write-Baz;
    return ("{0}{1}" -f $bar, $baz)
}
Write-FooFunctions.psm1:
Function Write-Foo {
    return "Foo"
}
Function Write-FooBar {
    $foo = Write-Foo
    $bar = Write-Bar
    return ("{0}{1}" -f $foo, $bar)
}
Function Write-FooBarBaz {
    $foobar = Write-FooBar
    $baz = Write-Baz
    return ("{0}{1}" -f $foobar, $baz)
}
Function Write-FooBazBar {
    $foo = Write-Foo
    $bar = Write-Bar
    $baz = Write-Baz
    return ("{0}{1}{2}" -f $foo, $bar, $baz)
}
Now, the differences. In the manifest for Write-BarFunctions (note RequiredModules, not NestedModules):
RequiredModules = @('Write-BazFunctions')
Note another difference from my original answer linked above: there I was targeting the .psm1 files directly; here, reference them just by module name.
Once I did this, I was able to import Write-FooFunctions and all of the functions became available. Since Write-BarBaz in Write-BarFunctions calls Write-Baz from Write-BazFunctions, you can see that this allows the modules to reference one another.
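For reference, a minimal sketch of what the Write-BarFunctions.psd1 manifest might contain (values assumed; New-ModuleManifest generates a full skeleton with a real GUID):
@{
    RootModule        = 'Write-BarFunctions.psm1'
    ModuleVersion     = '1.0.0'
    GUID              = '00000000-0000-0000-0000-000000000000' # placeholder
    RequiredModules   = @('Write-BazFunctions')
    FunctionsToExport = @('Write-Bar', 'Write-BarBaz')
}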

Perl to exec a program with arguments containing "@"

I am a newbie in Perl and need help with a small problem.
Situation:
I have to execute a command line program through perl.
The arguments to this command line are email addresses
These email addresses are passed to me through another module.
Problem:
I have written the code to create the argument list from these email addresses, but am having a problem running exec().
NOTE: If I pass hardcoded strings with the escaped "@" character to exec() as command args, it works perfectly.
Sub creating cmd args map
sub create_cmd_args {
    my($self, $msginfo) = @_;
    my @gd_args_msg = ('--op1');
    my $mf = $msginfo->sender_smtp;
    $mf =~ s/@/\\@/ig; ## Tried escaping @, incorrect results.
    push @gd_args_msg, '-f="'.$mf.'"';
    for my $r (@{$msginfo->per_recip_data}) {
        my $recip = $r->recip_addr_smtp;
        $recip =~ s/@/\\@/ig; ## Tried escaping @, incorrect results.
        push @gd_args_msg, '-r="'.($recip).'"';
    }
    return @gd_args_msg;
}
}
Sub that uses this args map to exec the program
sub check {
    my($self, $msginfo) = @_;
    my $cmd = $g_command;
    my @cmd_args = create_cmd_args($self, $msginfo);
    exec($cmd, @cmd_args); ### ******* fails here
}
Sample run:
INPUT:
sender_smtp: <ashish@isthisreal.com>
receiver_smtp: <areyouarealperson@somedomain.com>
Could someone please guide me what is wrong here?
As an argument to a command in the shell,
-f="<ashish@isthisreal.com>"
causes the string
-f=<ashish@isthisreal.com>
to be passed to the program. Your program passes
-f="<ashish\@isthisreal.com>"
to the program. The problem isn't the @; the problem is the " and \ you are adding.
my $mf = $msginfo->sender_smtp;
push @gd_args_msg, "-f=$mf"; # Assuming $mf is <ashish@isthisreal.com>
If you look at the post Trying to convert Perl to PHP, and the code within the md5sum implementation that calls the command line, you will see an approach that saves you from needing to worry about escaping characters.

Running Access Macro in Powershell

I'm trying to run an Access 2010 macro in PowerShell (v4.0 Windows 8.1) with the below code:
$Access = New-Object -com Access.Application
$Access.OpenCurrentDatabase("SomePath", $False, "Password")
$Access.Run("SomeProc")
$Access.CloseCurrentDatabase()
$Access.Quit()
[System.Runtime.InteropServices.Marshal]::ReleaseComObject($Access)
Remove-Variable Access
I get an error on the line $Access.Run("SomeProc") saying that not enough parameters were specified:
Exception calling "Run" with "1" argument(s): "Invalid number of parameters. (Exception
from HRESULT: 0x8002000E (DISP_E_BADPARAMCOUNT))"
The procedure SomeProc does not require any parameters.
I've read the msdn article on the run method and only one parameter is required.
I've also tried this workaround which also failed to work for an unrelated reason.
Does anyone know what the cause of the error could be and how to get the method working?
This is a driver issue where the OLEDB libraries aren't loading correctly.
I was able to reproduce your error exactly, and I was able to work around it by opening PowerShell from the SysWOW64 directory instead of System32.
Try opening this version of PowerShell (you'll have to run Set-ExecutionPolicy again) and see if it'll execute your script:
%SystemRoot%\syswow64\WindowsPowerShell\v1.0\powershell.exe
Helpful link: https://social.msdn.microsoft.com/Forums/en-US/4500877f-0031-426e-869d-bda33d9fe254/microsoftaceoledb120-provider-cannot-be-found-it-may-not-be-properly-installed?forum=adodotnetdataproviders
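In other words, launch the 32-bit host and run your script from it, for example (script name assumed):
& "$env:SystemRoot\SysWOW64\WindowsPowerShell\v1.0\powershell.exe" `
    -ExecutionPolicy Bypass -File .\RunAccessMacro.ps1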
The C# signature is something like this:
public object Run(string Procedure, ref object Arg1, ... ref object Arg30) ...
It means that the Arg arguments, which are optional in COM, are not optional in .NET because they are explicitly marked as ref. You need to provide all 31 values (the procedure name plus the 30 Arg slots) even if you don't use them.
Assuming you have the following VBA code:
Public Sub Greeting(ByVal strName As String)
    MsgBox ("Hello, " & strName & "!"), vbInformation, "Greetings"
End Sub
You can call it like this:
$Access = New-Object -com Access.Application
$Access.OpenCurrentDatabase("Database1.accdb")
$runArgs = @([System.Reflection.Missing]::Value) * 31
$runArgs[0] = "Greeting" #Method Name
$runArgs[1] = "Jeno" #First Arg
$Access.GetType().GetMethod("Run").Invoke($Access, $runArgs)
In your case it will be:
$runArgs = @([System.Reflection.Missing]::Value) * 31
$runArgs[0] = "SomeProc"
$Access.GetType().GetMethod("Run").Invoke($Access, $runArgs)
I would probably try to add a helper to the access object:
Add-Member -InputObject $Access -MemberType ScriptMethod -Name "Run2" -Value {
    $runArgs = @([System.Reflection.Missing]::Value) * 31
    for ($i = 0; $i -lt $args.Length; $i++) { $runArgs[$i] = $args[$i] }
    $this.GetType().GetMethod("Run").Invoke($this, $runArgs)
}
Then you can use Run2 as you would expect:
$Access.Run2("Greeting", "Jeno")
$Access.Run2("SomeProc")

Dot-sourcing functions from file to global scope inside of function

I want to import an external function from a file without converting it to a module (we have hundreds of file-per-function scripts, so treating them all as modules is overkill).
Here is a code explanation. Please notice that I have some additional logic in Import-Function, like adding the scripts' root folder, checking file existence, and throwing a special error, to avoid duplicating this code in each script that requires that kind of import.
C:\Repository\Foo.ps1:
Function Foo {
    Write-Host 'Hello world!'
}
C:\InvocationTest.ps1:
# Wrapper func
Function Import-Function ($Name) {
    # Checks and exception throwing are omitted
    . "C:\Repository\$Name.ps1"
    # Foo function can be invoked in this scope
}
# Wrapped import
Import-Function -Name 'Foo'
Foo # Exception: The term 'Foo' is not recognized
# Direct import
. "C:\Repository\Foo.ps1"
Foo # 'Hello world!'
Is there any trick, to dot source to global scope?
You can't make the script run in a parent scope, but you can create a function in the global scope by explicitly scoping it.
Would something like this work for you?
# Wrapper func
Function Import-Function ($Path) {
    # Checks and exception throwing are omitted
    $script = Get-Content $Path
    # Prefix unscoped function declarations with Global: so they survive the wrapper's scope
    $script = $script -replace '^function\s+((?!global[:]|local[:]|script[:]|private[:])[\w-]+)', 'function Global:$1'
    . ([scriptblock]::Create($script -join "`n"))
}
The above regex only targets root functions (functions that are left-justified, with no white space to the left of the word function). In order to target all functions, regardless of spacing (including sub-functions), change the -replace line to:
$script = $script -replace '^\s*function\s+((?!global[:]|local[:]|script[:]|private[:])[\w-]+)', 'function Global:$1'
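With that in place, usage would look like this (a sketch, assuming the C:\Repository\Foo.ps1 file from the question):
Import-Function -Path 'C:\Repository\Foo.ps1'
Foo # 'Hello world!' - Foo now lives in the global scope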
You can change the functions that are defined in the dot-sourced files so that they are defined in the global scope:
function Global:Get-SomeThing {
    # ...
}
When you dot-source that from within a function, the function defined in the dot-sourced file will be global. I'm not saying this is the best idea, just another possibility.
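A quick sketch of that approach, reusing the wrapper from the question (and assuming Foo.ps1 now declares function Global:Foo):
Function Import-Function ($Name) {
    . "C:\Repository\$Name.ps1" # the file declares 'function Global:Foo { ... }'
}
Import-Function -Name 'Foo'
Foo # works: Foo was created in the global scope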
Just dot-source the Import-Function call as well:
. Import-Function -Name 'Foo'
Foo # Hello world!
I can't remember a way to run a function in global scope right now. You could do something like this:
$name = "myscript"
$myimportcode = {
    # Checks and exception throwing are omitted
    . .\$name.ps1
    # Foo function can be invoked in this scope
}
Invoke-Expression -Command $myimportcode.ToString()
When you convert the script block to a string with .ToString() and run it through Invoke-Expression, the code executes in the current scope, so the $name variable is resolved there and the dot-source takes effect in that scope.