I'm refactoring some of my older PowerShell scripts to a) improve them, b) clean them up, and c) modularize them.
In the script I'm working on now, there are 10-15 functions that operate on a specific directory - let's call it the work directory. Currently it's defined globally and loaded from a config file; it never changes after initialization (does that make it a constant?).
I want to wrap some of the functions in a separate module. The question is: should I rewrite them so the variable is passed explicitly as a parameter, or can I leave things as they are, on the assumption that every script I use this module (library?) in will have the variable initialized? If the latter, how can I make sure the module "detects" that the variable is uninitialized and throws an error?
And, last but not least, currently it's just a variable - should I use some specific construct so that it's obvious it is global, and not to be modified?
should I rewrite them so the variable is passed explicitly as a parameter
As long as there's no legitimate use case for overriding it in a single call, I wouldn't pass it as a parameter.
If your functions are packaged as a module, I'd strongly recommend utilizing module-scoped variables rather than globals.
Assuming you're talking about a script module, this is as simple as:
Set-Variable -Scope Script -Name ModuleTargetDirectory -Value $config.TargetDirectory
from inside the module file or a module function that runs during import (the script: scope is the same as the module scope inside a module), and then in the consuming functions:
function Get-Something
{
# ...
$targetDirectory = $script:ModuleTargetDirectory
# ...
}
Or wrap the entire config storage in a private helper function:
# don't export this function
function Get-MyModuleConfig
{
# assuming we stored a dictionary or custom object with the config options in a module-scoped variable named `config`
return $script:config
}
And then always just call $config = Get-MyModuleConfig in the begin block of functions that need access to the config data.
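To address the "detect the variable is uninitialized and throw" part of the question: if you go the module-scope route, you can make the module fail loudly when the consuming script never initialized the value. A minimal sketch, assuming a setter-based design (the function and variable names here are illustrative, not part of any real module):

```powershell
# inside the module (.psm1); do not export this function
function Assert-ModuleConfigured
{
    if (-not $script:ModuleTargetDirectory)
    {
        throw "Module not configured. Call Set-ModuleTargetDirectory before using these functions."
    }
}

# exported setter that consuming scripts call once after import
function Set-ModuleTargetDirectory
{
    param([Parameter(Mandatory)][string]$Path)
    $script:ModuleTargetDirectory = $Path
}

function Get-Something
{
    Assert-ModuleConfigured
    # safe to use the module-scoped value from here on
    $script:ModuleTargetDirectory
}
```

Every public function starts with the one-line guard, so a consumer that forgot to initialize gets a descriptive error on the first call instead of silent misbehavior.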
(PS 5.1) I put a using namespace directive in a psm1 to shorten some code. It then no longer recognizes a custom type I defined. Without the directive, all is peachy. It seems that a using namespace directive overrides things I did not expect it to.
Example: I have a main.ps1, and two modules.
main.ps1
using module .\classes.psm1 # using modules is needed to bring in classes, Import-Module won't cut it
Import-Module .\process.psm1 -Force # force reload
$MyList = Get-List # call function in module
$MyList[0].Bar # show a result
classes.psm1
Class Foo
{
[int]$Bar
}
process.psm1
function Get-List {
$f = New-Object Foo
$f.Bar = 42
$list = [System.Collections.Generic.List[Foo]]::new()
$list.Add($f)
$list
}
This works fine. The trouble starts when I want to shorten things in process.psm1:
using namespace System.Collections.Generic
function Get-List {
$f = New-Object Foo
$f.Bar = 42
$list = [List[Foo]]::new() # I just want a shorter way to declare a list
$list.Add($f)
$list
}
This complains about not recognizing type Foo. Why? (When I bring in using module .\classes.psm1 in process.psm1, all is fine again.)
My point/question is: how does using namespace affect a module's ability to recognize other modules/files within a 'solution'? I find it rather counter-intuitive, but I am not a PS expert.
By default, PowerShell defers type name resolution until runtime - unless a script file contains at least one of the following:
A using statement
A class definition
An enum definition
Without the using namespace System.Collections.Generic statement, PowerShell has no reason to attempt resolving any type names in process.psm1 until after [Foo] has been loaded from classes.psm1 and is resolvable/visible.
With the using ... statement on the other hand, PowerShell attempts to resolve the [Foo] type parameter in [List[Foo]] at parse-time - before process.psm1 has actually been loaded into main.ps1, and so the error you see is thrown.
As you've found, adding an explicit using module reference to the module file containing [Foo] solves it, as process.psm1 then no longer depends on the caller's type resolution scope (which is not accessible at parse-time).
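To make that concrete, a version of process.psm1 that owns its own dependency might look like this (a sketch; the point is that the using module line makes [Foo] resolvable at parse-time, independently of the caller):

```powershell
# process.psm1
using module .\classes.psm1          # makes [Foo] visible at parse-time
using namespace System.Collections.Generic

function Get-List {
    $f = [Foo]::new()
    $f.Bar = 42
    $list = [List[Foo]]::new()       # the short form now resolves fine
    $list.Add($f)
    $list
}
```

With this in place, main.ps1 no longer needs its own using module line for process.psm1 to parse, though it still needs one if it wants to reference [Foo] directly.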
I have a dynamically loaded PowerShell script block that accepts multiple parameters (including Mandatory, type, default value,...). I also have a function that has its own parameter sets and internally, it calls the script block. The script block is dynamically loaded based on other parameters, and I'd like to allow the user to pass the parameters to my function and forward them to script block. For that, I'd like to read the possible parameters from the script block, which I can do using $block.Ast.ParamBlock.Parameters and then create matching parameters for my function using DynamicParam.
Example:
Script block:
$block = {
param([Parameter(Mandatory)][string]$test = 10)
echo $test
}
Function:
function Fn {
param($FnParam1, $FnParam2)
DynamicParam {
# what to write here
}
& $block # how to efficiently forward the parameters here?
}
It should be possible to call Fn like this:
Fn -FnParam1 value -FnParam2 value -test 20
However, DynamicParam is created using RuntimeDefinedParameter and related classes, and AST of the script block returns Management.Automation.Language.ParameterAst and other instances from the same namespace. I could manually create the DynamicParam parameters by converting the AST node-by-node, but it seems like there should be a simpler way to automatically convert these.
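As far as I know there is no built-in converter from ParameterAst to RuntimeDefinedParameter, but the node-by-node loop is shorter than it sounds. A rough sketch that copies only the parameter name and static type (it deliberately ignores attribute arguments such as Mandatory and default values, which would need extra translation):

```powershell
$block = {
    param([Parameter(Mandatory)][string]$test)
    $test
}

$paramDictionary = [System.Management.Automation.RuntimeDefinedParameterDictionary]::new()
foreach ($p in $block.Ast.ParamBlock.Parameters) {
    $name = $p.Name.VariablePath.UserPath           # parameter name without the leading $
    $type = if ($p.StaticType) { $p.StaticType } else { [object] }
    $attributes = [System.Collections.ObjectModel.Collection[System.Attribute]]::new()
    $attributes.Add([Parameter]::new())             # bare [Parameter()] attribute
    $paramDictionary.Add($name,
        [System.Management.Automation.RuntimeDefinedParameter]::new($name, $type, $attributes))
}
# inside Fn's DynamicParam block you would return $paramDictionary;
# in the body, forward the bound values with splatting: & $block @PSBoundParameters
```

For forwarding, splatting $PSBoundParameters (after removing Fn's own parameters such as FnParam1/FnParam2) is the usual approach.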
I've made a powershell script which validates some parameters. In the process of validation I need to create some strings. I also need these strings later in the script.
To avoid rebuilding the same strings again, can I reuse variables defined within validation blocks? Perhaps I can use functions in validation blocks somehow? Or maybe global variables? I'm not sure what's possible here, or what's good practice.
Example:
Test.ps1
Function Test {
param(
[string]
[Parameter(Mandatory=$true)]
$thing1,
[string]
[Parameter(Mandatory=$true)]
$thing2,
[string]
[Parameter(Mandatory=$true)]
[ValidateScript({
$a = Get-A $thing1
$b = Get-B $thing2
$c = $a + $b
$d = Get-D $c
if(-not($d -contains $_)) {
throw "$_ is not a valid value for the thing3 parameter."
}
return $true
})]
$thing3
)
# Here I'd like to use $c
# At worst, calling Get-A and Get-B again may be expensive
# Or it could just be annoying duplication of code
}
Bonus question, if this is possible, could I reuse those variables in a subsequent validation block?
You could use a [ref] variable.
This lets the function modify the variable passed to it, so you can both have a return value and a parameter affected by the execution of your function.
About Ref
You can pass variables to functions by reference or by value.
When you pass a variable by value, you are passing a copy of the data.
In the following example, the function changes the value of the
variable passed to it. In PowerShell, integers are value types so they
are passed by value. Therefore, the value of $var is unchanged outside
the scope of the function.
Function Test {
    Param($thing1, $thing2, [ref]$c)
    $c.Value = New-Guid
    return $true
}

#$ThisIsC = $null
Test -c ([ref] $ThisIsC)
Write-Host $ThisIsC -ForegroundColor Green
Alternatively, you can use the $script or the $global scope.
For a simple script, the $script scope will quickly expose your variable. If you intend to distribute your function, a [ref] parameter might be easier on the end user, since it makes explicit that a reference is being passed.
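Applied to the validation-block question, caching the computed value in the script scope from inside ValidateScript looks roughly like this (a sketch that assumes the function lives in a script file; the Get-A/Get-B calls from the question are replaced by a trivial string operation):

```powershell
Function Test-Thing {
    param(
        [Parameter(Mandatory=$true)]
        [ValidateScript({
            # stand-in for the expensive Get-A/Get-B/Get-D work from the question
            $script:cachedC = "prefix-$_"
            $true   # validation passes
        })]
        [string]$thing3
    )
    # reuse the value computed during validation instead of rebuilding it
    $script:cachedC
}

Test-Thing -thing3 'abc'   # → prefix-abc
```

A later validation block in the same param() can read $script:cachedC the same way, since parameters are bound (and validated) in the order they are passed.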
See About Scopes documentation.
Scopes in PowerShell have both names and numbers. The named scopes
specify an absolute scope. The numbers are relative and reflect the
relationship between scopes.
Global: The scope that is in effect when PowerShell starts. Variables
and functions that are present when PowerShell starts have been
created in the global scope, such as automatic variables and
preference variables. The variables, aliases, and functions in your
PowerShell profiles are also created in the global scope.
Local: The current scope. The local scope can be the global scope or
any other scope.
Script: The scope that is created while a script file runs. Only the
commands in the script run in the script scope. To the commands in a
script, the script scope is the local scope.
Private: Items in private scope cannot be seen outside of the current
scope. You can use private scope to create a private version of an
item with the same name in another scope.
Numbered Scopes: You can refer to scopes by name or by a number that
describes the relative position of one scope to another. Scope 0
represents the current, or local, scope. Scope 1 indicates the
immediate parent scope. Scope 2 indicates the parent of the parent
scope, and so on. Numbered scopes are useful if you have created many
recursive scopes.
I have a problem with global variables in Smartface. I created a dataset and gave it a criterion with a value named param1. Although I define a variable named param1 in the Global file, and the code runs correctly, I always get an error like:
Can't find variable:
param1
*undefined
*1
*global code
As I said, my code runs correctly, but why do I always get this error?
I have run into an error like that before. You probably defined that variable inside one of the functions in the Global file. You should define global variables directly in the Global file itself, outside of any function.
function Global_Events_OnStart(e) {
...
}
--> For example, you should define your global variables here, outside of all the functions.
function Global_Events_OnError(e) {
...
}
I'd like to use data that is loaded into my workspace in a Matlab function. This is the beginning of my function.
function [totalProfit] = compute(p,exit)
%% Declaration of variables
entry=0;
T = length(data);
.
.
.
end
I'm getting an error:
Undefined function or variable 'data'.
Where is the error?
The variable data was probably defined outside of the function, so it is out of scope.
Pass data as a parameter to compute and then it will be available inside the function.
You can use evalin to work with variables from another workspace. In your example this could be
T = evalin('caller','length(data)')
But please note that in most cases you get cleaner code if you define the variable as an input argument of the function. For your case this would be
function [totalProfit] = compute(p,exit,data)
T = length(data) ;
end
Ran is correct, but I wanted to mention something else. In general, only variables that are passed as arguments to a function are able to be used inside that function, so if you want to use your existing variables inside the function, pass them as input arguments.
It is possible to create global variables which allow you to use them inside functions without passing them as arguments, but it's usually not the best way of writing code. The times where I have used global variables are where I am calling multiple functions from a single script, and I have some constants that will be used by all the functions (for example gravity is a common one). An alternative to global variables is to use a struct, with the variables you want to pass to the function in it, so you only need one extra input argument, but you still have to be a bit careful.