Shared function in script and script block - PowerShell

I have a function that I use in my main script, and I also need to create a Job, which uses the same function. Right now I just have the code written twice, once in the main script, and once in the script block handed to the Job. I know I can add code to a variable, but not sure how to then unpack that variable in the code block, so the same code is effectively used in both places.
If it makes a difference, I am limited to v2 unfortunately.

Use a Module
Define your function(s) in the module.
In your main script, you import the module:
Import-Module MyModule
In your job, you import the module:
Start-Job -ScriptBlock {
    Import-Module MyModule
    # Invoke-MyFunction
}
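If creating a module isn't practical, the same idea works on v2 with a plain script file that both the main script and the job dot-source. A minimal sketch, assuming the shared function lives in a file called SharedFunctions.ps1 (the file and function names here are placeholders, not from the question):

# SharedFunctions.ps1 - holds the function used in both places
function Get-Report {
    param([string]$Name)
    "Report for $Name"
}

# Main.ps1 - dot-source the shared file locally...
$here = Split-Path -Parent $MyInvocation.MyCommand.Path   # v2-safe; $PSScriptRoot in scripts requires v3+
. (Join-Path $here 'SharedFunctions.ps1')
Get-Report -Name 'local run'

# ...and load the same file again inside the job's script block.
$job = Start-Job -ScriptBlock {
    param($folder)
    . (Join-Path $folder 'SharedFunctions.ps1')
    Get-Report -Name 'background job'
} -ArgumentList $here

$job | Wait-Job | Receive-Job

This keeps a single copy of the function on disk; the job has to load it again because a background job runs in a separate process and cannot see the caller's function definitions.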

Related

Have an entry point in a PowerShell file like Python's __name__ == '__main__' [duplicate]

I am really fond of python's capability to do things like this:
if __name__ == '__main__':
    # setup testing code here
    # or set up a call to a function with parameters and human-format the output
    # etc...
This is nice because I can treat a Python script file as something that can be run from the command line, while its functions and classes remain available to import into a separate Python script without triggering the default run-from-the-command-line behaviour.
Does PowerShell have a similar facility that I could exploit? And if it doesn't, how should I be organizing my library of function files so that I can easily execute some of them while I am developing them?
$MyInvocation.InvocationName has information about how the script was started.
If ($MyInvocation.InvocationName -eq '&') {
    "Called using operator: '$($MyInvocation.InvocationName)'"
} ElseIf ($MyInvocation.InvocationName -eq '.') {
    "Dot sourced: '$($MyInvocation.InvocationName)'"
} ElseIf ((Resolve-Path -Path $MyInvocation.InvocationName).ProviderPath -eq $MyInvocation.MyCommand.Path) {
    "Called using path: '$($MyInvocation.InvocationName)'"
}
$MyInvocation has lots of information about the current context, and those of callers. Maybe this could be used to detect if a script is being dot-sourced (i.e. imported) or executed as a script.
A script can act like a function: use param as the first non-comment, non-whitespace content in the file to define parameters. It is not clear (one would need to try different combinations) what happens if you dot-source a script that starts with param...
Modules can directly execute code as well as export functions, variables, ... and can take parameters. Maybe $MyInvocation in a module would allow the two cases to be detected.
EDIT: Additional:
$MyInvocation.Line contains the command line used to execute the current script or function. When dot-sourcing, this text will start with "." but not when the file is run as a script (obviously a case to use a regex match to allow for variable whitespace around the period).
In a script run as a function
As of now I see two options that work:
if ($MyInvocation.InvocationName -ne '.') {
    # do main stuff
}
and
if ($MyInvocation.CommandOrigin -eq 'Runspace') {
    # do main stuff
}
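Either check works as a rough equivalent of Python's guard when placed at the bottom of the script file. A sketch building on the first option (the file and function names are hypothetical):

# MyLibrary.ps1 - defines reusable functions
function Get-Greeting {
    param([string]$Name = 'world')
    "Hello, $Name"
}

# Rough equivalent of Python's `if __name__ == '__main__':` block.
# It runs when the file is executed directly (.\MyLibrary.ps1), but not
# when the file is dot-sourced (. .\MyLibrary.ps1) to import its functions.
if ($MyInvocation.InvocationName -ne '.') {
    Get-Greeting -Name 'command line'
}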
Disclaimer: This is only tested on PowerShell Core on Linux. It may not work the same on Windows; if anyone tries it on Windows, I would appreciate it if you could verify in the comments.
function IsMain() {
    (Get-Variable MyInvocation -Scope Local).Value.PSCommandPath -eq (Get-Variable MyInvocation -Scope Global).Value.InvocationName
}
Demonstrated with a gist
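A usage sketch for this approach, with the guard placed at the bottom of the script file that defines IsMain (the surrounding script is hypothetical):

# at the bottom of the script that defines IsMain
if (IsMain) {
    # setup / testing code, analogous to Python's __main__ block
    Write-Host 'Running as the entry-point script'
}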

Import-Module not working when called after a param() statement

I have a script which uses a parameter to pass details to it, and which needs to import the WebAdministration module.
The start of the script is:
param(
    [parameter(position=0)]
    [string]$iisAppName
)
Import-Module -name WebAdministration
However, when I run the script I get errors from the cmdlets that use the module saying they're not found, so the module obviously hasn't been loaded.
If I put the Import-Module statement before the param() then the parameter isn't loaded. If I don't have the param() statement at all it works fine.
This script is for removing a website, but the companion creation script (which doesn't use param) works fine. In fact if I run that one it works, and if I then run this one (where the module is still loaded from the first) it works fine (annoyingly... since I didn't spot the issue in testing!), so I know I'm calling those cmdlets correctly.
Is there an alternate way I need to call one or both of these to allow both of them to work in my script?
I think this is to do with session states but would need more information to be sure.
https://msdn.microsoft.com/en-us/powershell/reference/5.1/microsoft.powershell.core/import-module#-global
By default, the commands in a module, including commands from nested modules, are imported into the caller's session state.
When you import a module from the global session state, it's available to the console and all modules. When the module is imported from another module, it will only be available to the module(s) that imported it. I think that when you include param() it is treated differently, perhaps running in a script session state instead of the global session state.
Try using Import-Module -Name WebAdministration -Global which, regardless of where it is called, should import the module into the global session state and make it available to everything.
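Applied to the question's snippet, the start of the script would look something like this (a sketch of the suggested fix, not a tested change):

param(
    [parameter(position=0)]
    [string]$iisAppName
)

# Import into the global session state so the WebAdministration cmdlets
# remain visible no matter which session state runs the rest of the script.
Import-Module -Name WebAdministration -Global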

Scope of functions defined in modules when used in psake tasks

I have a psake Task looking something like below (this is simplified for clarity):
Task Invoke-Deploy {
    Import-Module "somefunctions.psm1"
    Import-Module "morefunctions.psm1"
    Set-Something # This is a function defined in morefunctions.psm1
}
Function Set-Something (which is defined in module morefunctions.psm1) attempts to call function Get-Something (which is defined in somefunctions.psm1). When it does I get an error:
The term 'Get-Something' is not recognized as the name of a cmdlet,
function, script file, or operable program.
Interestingly, I modified "morefunctions.psm1" to also Import-Module "somefunctions.psm1", and at that point everything worked fine. I would rather not have to do this, however, as I want my modules to be loosely coupled, insofar as they don't need to rely on the existence of other modules.
My knowledge of function/variable scope in Powershell is limited but I thought that functions in two different imported modules lived in the same scope and hence a function in one of those modules would be able to call a function in the other.
I suspect that scope is being affected by the fact that I'm inside a psake task. I'm hoping that someone here can confirm that and also advise on what I should do to fix this. TIA.
I created a script module test-module.psm1:
function Invoke-Test {
    Import-Module ".\somefunctions.psm1"
    Import-Module ".\morefunctions.psm1"
    Set-Something # This is a function defined in morefunctions.psm1
}
and a couple of dummy modules, somefunctions.psm1:
function Get-Something {
    'Get-Something'
}
and morefunctions.psm1:
function Set-Something {
    Get-Something
    'Set-Something'
}
If I call
Import-Module .\test-module.psm1
Invoke-Test
then I get the error "Get-Something : The term 'Get-Something' is not recognized as the name of a cmdlet, function, script file, or operable program." So it looks like a generic PowerShell issue dealing with script modules. I tried PowerShell v2.0, v3.0, and v4.0.
Perhaps this cannot be resolved in psake without workarounds because it is a script module. You can use the similar tool Invoke-Build. It is implemented as a script and avoids issues like these. It works fine with this build script:
Task Invoke-Deploy {
    Import-Module ".\somefunctions.psm1"
    Import-Module ".\morefunctions.psm1"
    Set-Something # This is a function defined in morefunctions.psm1
}
It outputs, as expected:
Build Invoke-Deploy ...\.build.ps1
Task /Invoke-Deploy
Get-Something
Set-Something
Done /Invoke-Deploy 00:00:00.0150008
Build succeeded. 1 tasks, 0 errors, 0 warnings 00:00:00.1450083
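For context, a build like the one above would typically be started along these lines (the exact parameter names are an assumption about the Invoke-Build command line, not taken from the answer):

# Assumes the build script above is saved as .build.ps1 and Invoke-Build is installed
Invoke-Build -Task Invoke-Deploy -File .\.build.ps1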
I ran into this today and got it to work.
In your module "morefunctions.psm1" you need to export the method you want like this:
Export-ModuleMember -Function Set-Something
In your psake task, you need to prefix the function with the module name so PowerShell can find it:
Import-Module "morefunctions.psm1"
morefunctions\Set-Something
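Putting those two pieces together, a sketch of how the module and task might end up looking (this just combines the snippets above; it is not verified against the original psake setup):

# morefunctions.psm1
function Set-Something {
    Get-Something
    'Set-Something'
}
Export-ModuleMember -Function Set-Something

# psake build script
Task Invoke-Deploy {
    Import-Module ".\somefunctions.psm1"
    Import-Module ".\morefunctions.psm1"
    morefunctions\Set-Something # module-qualified so the command resolves inside the task
}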

How can I use a PowerShell module in a script without leaving the module loaded in the user's session?

I have a script I wish to use interactively from the PowerShell prompt. The script needs to use a local script module.
I cannot see how to import/use the module such that it's not left loaded in the current session.
Example
A module (MyModule.psm1)...
function Test-Method
{
    Write-Host "Test-Method invoked"
}
... and a script (script.ps1)
Import-Module .\MyModule
Test-Method
Now running the script at the PowerShell prompt ...
PS C:\temp> Get-Module | % {$_.Name}
Microsoft.PowerShell.Management
Microsoft.PowerShell.Utility
PS C:\temp> .\script.ps1
Test-Method invoked
PS C:\temp> Get-Module | % {$_.Name}
Microsoft.PowerShell.Management
Microsoft.PowerShell.Utility
MyModule
How can my script import and use MyModule.psm1 without it being left loaded in the caller's current session? Bear in mind that the caller may have already imported the module and would not want it unloaded by the script (so simply removing the module at the end of the script is not really good enough).
I've considered dot-sourcing the module rather than importing it, but I want the module for the reasons covered in PowerShell Import-Module vs Dot Sourcing
It sounds like you already described in pseudo-code what you wanted. Here it is in actual code:
$checkCmds = Get-Command -Module MyModule
Import-Module MyModule
# Do stuff here . . .
# unload only if we loaded it
if ($checkCmds -eq $null) { Remove-Module MyModule }
As far as I can tell, you don't get that automatic cleanup behavior from a "script" importing a module. OTOH if you import a module from within another module, when the parent module is removed then any modules it imported will be removed if there are no other modules using them (or unless ipmo -global was specified).
This builds on the previous answer and uses the following property.
If you import a module from within another module, when the parent module is removed then any modules it imported will be removed
You can exploit several techniques to create a wrapper:
Importing a module from a module
Anonymous modules
Call operator with the context of a module
Set script.ps1 to
& (New-Module {
    function Invoke-Function {
        Import-Module .\MyModule
        Test-Method
    }
}) { Invoke-Function }
If you run script.ps1 and then (Get-Module).Name, MyModule will not be listed in the output.
Note: In this example Invoke-Function is just another scope, and can be omitted, letting the New-Module just run when defined. In one line:
& (New-Module { Import-Module .\MyModule; Test-Method }) {}
You can import the module with -Scope local to restrict a module to your script's scope. If the module happens to also be loaded in the global scope, then it will still be available after your script exits.
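A sketch of that approach (this assumes PowerShell 3.0 or later, where Import-Module gained its -Scope parameter; MyModule is the question's example module):

# script.ps1
# Import the module's commands into the script's local scope only,
# per the answer above, instead of the caller's session.
Import-Module .\MyModule -Scope Local
Test-Method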

PowerShell Add-PSSnapIn from an advanced (cmdlet) function

I would like to create a module with an advanced (cmdlet) function which performs some logic and adds some PowerShell snap-ins. This is the code:
function Add-DefaultSnapIns
{
    [CmdletBinding()]
    param()
    begin {}
    process {
        # ...
        Add-PSSnapin SnapInName
    }
    end {}
}
Export-ModuleMember -Function Add-DefaultSnapIns
If I invoke the function from any point (for instance, a PowerShell prompt), the operation succeeds, but the snap-in is not available outside the scope of the function. The snap-in appears registered, but none of its cmdlets have been exported to the global scope. How can I solve this?
Well, the idea is that modules are self-contained and don't spill too much of their "stuff" into the global session state, apart from the cmdlets, functions, and aliases they export. It might be better to add the snap-ins yourself as part of the module initialization and then export those snap-ins' cmdlets via Export-ModuleMember.
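A sketch of that suggestion, using the question's placeholder snap-in name (untested; it assumes Get-Command's -Module parameter resolves the snap-in name, which it accepts via its PSSnapin alias):

# MyDefaults.psm1 - hypothetical module that loads the snap-in at import time
# and re-exports its cmdlets so they are visible to whoever imports the module.
if (-not (Get-PSSnapin -Name SnapInName -ErrorAction SilentlyContinue)) {
    Add-PSSnapin SnapInName
}

# Re-export the snap-in's cmdlets (plus any functions this module defines).
$snapinCmdlets = Get-Command -Module SnapInName -CommandType Cmdlet | Select-Object -ExpandProperty Name
Export-ModuleMember -Cmdlet $snapinCmdlets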