Access PowerShell script parameters from inside module - powershell

I have a PowerShell module that contains a number of common management and deployment functions. This is installed on all our client workstations. This module is called from a large number of scripts that get executed at login, via scheduled tasks or during deployments.
From within the module, it is possible to get the name of the calling script:
function Get-CallingScript {
return ($script:MyInvocation.ScriptName)
}
However, from within the module, I have not found any way of accessing the parameters originally passed to the calling script. For my purposes, I'd prefer to access them in the form of a dictionary object, but even the original command line would do. Unfortunately, given my use case, accessing the parameters from within the script and passing them to the module is not an option.
Any ideas? Thank you.

From about_Scopes:
Sessions, modules, and nested prompts are self-contained environments,
but they are not child scopes of the global scope in the session.
That being said, this worked for me from within a module:
$Global:MyInvocation.UnboundArguments
Note that I was calling my script with an unnamed parameter when the script was defined without parameters, so UnboundArguments makes sense. You might need this instead if you have defined parameters:
$Global:MyInvocation.BoundParameters
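Putting the pieces together, a minimal sketch of a module function that exposes the calling script's invocation details (the property names come from the answer above; the function and module names are illustrative):

```powershell
# In MyTools.psm1 (hypothetical module name).
function Get-CallingScriptInfo {
    # $Global:MyInvocation describes the invocation of the top-level script,
    # not of this module function.
    $inv = $Global:MyInvocation
    [pscustomobject]@{
        Script           = $inv.ScriptName
        BoundParameters  = $inv.BoundParameters    # named parameters, as a dictionary
        UnboundArguments = $inv.UnboundArguments   # extra/positional arguments
    }
}
Export-ModuleMember -Function Get-CallingScriptInfo
```

A script that imports the module can then call Get-CallingScriptInfo and inspect BoundParameters as the dictionary the question asks for.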

I can see how this in general would be a security concern. For instance, if there was a credential passed to the function up the stack from you, you would be able to access that credential.
The arguments passed to the current function can be accessed via $PSBoundParameters, but there isn't a mechanism for looking at the parameters of functions further up the call stack.

Related

Using dot source function in params

I'm going to start by saying I'm still pretty much a rookie at PowerShell and hoping there is a way to do this.
We have a utils.ps1 script that contains just functions, which we dot-source in other scripts. One of the functions returns a default value if a value is not passed in. I know I could check $args and such, but what I wanted was to use the function to supply the default values in the parameters.
param(
[string]$dbServer=$(Get-DefaultParam "dbServer"),
[string]$appServer=$(Get-DefaultParam "appServer")
)
This doesn't work since the Utils script hasn't been sourced yet. I can't put the dot-source first, because then the param block doesn't work, as it must be the first statement in the script. Utils.ps1 isn't a module, and I can't use #Requires.
What I got working was this
param(
[ValidateScript({ return $false; })]
[bool]$loadScript=$(. ./Utils.ps1; $true),
[string]$dbServer=$(Get-DefaultParam "dbServer"),
[string]$appServer=$(Get-DefaultParam "appServer")
)
Create a parameter that loads the script, and prevent a value from being passed into that parameter. This loads the script in the correct scope; if I load it inside the ValidateScript block, it isn't in the correct scope. The rest of the parameters then have access to the functions in Utils.ps1. This is probably an unsupported side effect (a hack): if I move the loadScript parameter below the others, they fail because the script hasn't been loaded yet.
Does PowerShell guarantee that parameters will always be evaluated sequentially?
Should we instead put all the functions in Utils.ps1 in the global scope? This would require running Utils.ps1 before the other scripts, which seems fine in scripting but less than ideal when running the scripts by hand.
Is there a more supported way of doing this besides modules and #Requires?
Or is it better not to use default parameter values at all, and instead code all the checks after sourcing, checking $args if we need to run the function?
It would be better to turn that script into a PowerShell module, despite your stated desire to avoid one. That way, your functions are always available for use as long as the module is installed. Also, despite your not wanting to use it, the #Requires directive is how you put execution constraints on your script, such as the PowerShell version or the modules that must be installed for the script to function.
If you really don't want to put this into a module, you can dot-source utils.ps1 from the executing user's $profile. As long as you don't run powershell.exe with the -NoProfile parameter, the profile loads with each session and your functions will be available for use.
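A hedged sketch of wiring that up, assuming utils.ps1 lives at a known path (the path below is an assumption; adjust as needed):

```powershell
# Dot-source utils.ps1 from the user's profile so its functions are
# available in every interactive session.
$line = '. "C:\Scripts\utils.ps1"'
if (-not (Test-Path $PROFILE)) {
    New-Item -ItemType File -Path $PROFILE -Force | Out-Null
}
# Avoid appending the same line twice.
if (-not (Select-String -Path $PROFILE -Pattern 'utils\.ps1' -Quiet)) {
    Add-Content -Path $PROFILE -Value $line
}
```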

Return objects from Powershell Script

How would one go about returning an object from PowerShell into another PowerShell script? I am looking to automate some of my deployments using PowerShell so that we can have easier-to-repeat deployments with a minimum amount of human intervention.
The idea would be to have a "library" of scripts for the various processes that occur during a deployment that take a series of arguments, and then have a main deployment script that just calls each of those subscripts with arguments for the files being used. For example, for one deployment, I might have to create a login on a SQL Server, add some functions or stored procedures to a database, deploy SSRS reports, update the shared data sources for SSRS to use an AD service account, etc.
I am able to cram everything into a single script with a bunch of functions, but for easier reusability, I would like to take each basic task (run SQL scripts, get a credential from Secret Server, run a folder of SQL scripts, deploy SSRS reports, etc.) and place it in its own script with parameters that can be called from my main script. This would allow me to have a main script that just calls each task script with parameters. To do this, though, for things like updating the AD credentials, I would need a way to return the PSCredential object that the function currently returns from a separate script instead.
You can explicitly return an object by using the return keyword:
return $myObject
Or you can return the object implicitly, either by using Write-Output explicitly or by placing the object bare on a line:
Write-Output $myObject
Or
$myObject
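Applied to the question's scenario, a sketch of a sub-script that returns a PSCredential to the main deployment script (the file name is hypothetical, and the Secret Server call is stood in for by Get-Credential):

```powershell
# Get-DeployCredential.ps1 (hypothetical file name)
param([string]$SecretName)

# In the real script this would query Secret Server; Get-Credential is a stand-in.
$cred = Get-Credential -Message "Credential for $SecretName"
return $cred   # the [pscredential] object reaches the caller intact
```

The main script captures it as an object, e.g. `$adCred = .\Get-DeployCredential.ps1 -SecretName 'SSRS-Service'`.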

luigi: command-line parameters not becoming part of a task's signature?

In luigi, I know how to use its parameter mechanism to pass command-line parameters into a task. However, if I do so, the parameter becomes part of the task's signature.
But there are some cases -- for example, if I want to optionally pass a --debug or --verbose flag on the command line -- where I don't want the command-line parameter to become part of the task's signature.
I know I can do this outside of the luigi world, such as by running my tasks via a wrapper script which can optionally set environment variables to be read within my luigi code. However, is there a way I can accomplish this via luigi, directly?
Just declare them as insignificant parameters, i.e. instantiate the parameter class passing significant=False as a keyword argument.
Example:
class MyTask(DateTask):
    other = luigi.Parameter(significant=False)

Powershell naming conventions/scoping

I want to create a Powershell function/cmdlet which installs (and one that uninstalls) a web application: copies files, creates an app pool, creates the web application, sets up all kinds of IIS properties, does some web.config modifications, etc. I'm confused about how I should name it. Powershell has this verb-object naming convention, and it's all nice, but the names I want to use (New-WebApplication, etc.) are already taken by the WebAdministration module (which this new function will use internally). Is there a nice way to scope my functions to make it clear that it's a different module? Like mymodule.New-WebApplication, My-New-WebApplication, New-MyWebApplication? Or I could call it Install-WebApplication but that could lead to confusion because of reusing the same name.
I just ran into this recently for a similar issue. This could have many opinionated answers, but this one addresses your goal of scoping the functions to make it clear they belong to a different module.
You could use the -Prefix parameter of Import-Module
Import-Module mymodule -Prefix Super
So when you go to use your cmdlet you would call it with
New-SuperWebApplication
Alternatively, you can also explicitly call the cmdlet with the module path
mymodule\New-WebApplication
I agree with Matt's answer, but I wanted to offer another perspective.
I wrote a module where the intention was specifically to recreate the functionality of an existing cmdlet. I named my function differently, but I also exported functions from the module that allow the caller to override the existing cmdlet with mine (using an alias, which is interpreted first), and then to undo that process.
This allowed someone to explicitly call the function without needing to use -Prefix nor use the \ syntax, using the new name with new code, but it also allowed one to use my function as a drop-in replacement for existing code by calling a single new command.
Here's that module if you want to take a look:
DnsCmdletFixes
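A sketch of that alias pattern (all names are hypothetical): the module exports a pair of commands that install and remove a global alias shadowing the built-in cmdlet.

```powershell
# In MyWebModule.psm1 (hypothetical): New-MyWebApplication is this
# module's own implementation.
function Enable-MyWebApplicationOverride {
    # Aliases are resolved before cmdlets, so this shadows
    # WebAdministration\New-WebApplication for existing code.
    Set-Alias -Name New-WebApplication -Value New-MyWebApplication -Scope Global
}

function Disable-MyWebApplicationOverride {
    Remove-Item -Path Alias:\New-WebApplication -ErrorAction SilentlyContinue
}
```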

How can I have a parent script but maintain separation of concerns with Powershell Scripts?

So I am working on some IIS management scripts for a specific IIS Site setup exclusive to a product to complete tasks such as:
- Create Site
- Create App Pool
- Create Virtual directories
The problem, is I would like to keep separate scripts for each concern and reference them in a parent script. The parent script could be ran to do a full deployment/setup. Or you could run the individual scripts for a specific task. The problem is that they are interactive, so they will request for the user information relevant to completing the task.
I am not sure how to approach the problem: each script has a script body that acquires information from the user, yet when it is loaded into the parent script, that specific script's body should not prompt the user.
NOTE: I know I could put them into modules, and fire off the individual "Exported to the environment" functions, but this script is going to be moved around to the environment that needs setup, and having to manually put modules (psm1) files into the proper PowerShell module folders just to run the scripts is a route I am not particularly fond of.
I am new to scripting with Powershell, any thoughts or recommendations?
Possible answer
This might be a solution: I found I could Import-Module from the working directory and from there have access to those exported functions.
I am interested in any other suggestions as well.
The way I would address it is to implement a param block at the top of each sub script that collects the information it needs to run. If a sub script is run individually, the param block prompts the user for the data needed to run that individual script. It also allows the parent script to pass the needed data as it calls each sub script; that data can be hard-coded in the parent script, prompted for, or some mixture thereof. That way the sub scripts can run either silently or with user interaction. You get the user interaction for free from PowerShell's parameter-handling mechanism: in the sub scripts, add a parameter attribute so that PowerShell requests those parameter values from the user if they are not already provided by the calling script.
At the top of your sub scripts, use a parameter block to collected needed data.
param
(
[parameter(Mandatory=$true, HelpMessage="This is required, please enter a value.")]
[string] $SomeParameter
)
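With that param block in place, the parent script can run the sub script silently by supplying the value, while an interactive user who omits it gets prompted automatically (the paths are illustrative):

```powershell
# From the parent script: silent, no prompt.
.\create-site.ps1 -SomeParameter 'MySite'

# Run by hand with no arguments: PowerShell prompts for SomeParameter
# because it is marked Mandatory.
.\create-site.ps1
```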
You can have a deploy.ps1 script which dot sources the individual scripts and then calls the necessary functions within them:
. $scriptDir\create-site.ps1
. $scriptDir\create-apppool.ps1
. $scriptDir\create-virtualdirectories.ps1
Prompt-Values
Create-Site -site test
Create-AppPool -pool test
Create-VirtualDirectories -vd test
In the individual functions, you can check whether the needed values were passed in by the caller (deploy.ps1 or the command line).
For example, create-site.ps1 will be like:
function Create-Site($site){
if(-not $site){
Prompt-Values
}
}
The ideal is to make the module take care of storing its own settings, depending on the distribution concerns of that module, and to provide commands for working with those settings.
I.e.
Write a Set-SiteInfo -Name -Pool -VirtualDirectory command and have it store the values in the registry or in the module's own directory ($PSScriptRoot), then have the other commands in the module use that stored data.
If the module is being put in a location where there's no file write access for low-rights users (i.e. a web site directory, or $psHome), then it's a notch better to store the values in the registry.
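A sketch of that pattern, storing the settings as JSON next to the module (the file name and the choice of JSON are assumptions; as noted above, the registry is the better fit when the module directory is not writable):

```powershell
# In the module: persist site settings beside the module files.
$script:SettingsPath = Join-Path $PSScriptRoot 'SiteInfo.json'

function Set-SiteInfo {
    param([string]$Name, [string]$Pool, [string]$VirtualDirectory)
    @{ Name = $Name; Pool = $Pool; VirtualDirectory = $VirtualDirectory } |
        ConvertTo-Json | Set-Content -Path $script:SettingsPath
}

function Get-SiteInfo {
    if (Test-Path $script:SettingsPath) {
        Get-Content -Path $script:SettingsPath -Raw | ConvertFrom-Json
    }
}
```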
Hope this helps.