Second level inclusion - powershell

I have a function that includes other scripts:
function include-function($fileName)
{
.$fileName
}
I store this function in another script
From my main script I want to first include this script and then include another script:
."c:\1.ps1" #include first file
include-function "c:\2.ps1" #call function to include other functions
xtest "bbb" #function from 2.ps1 that should be included
The problem is that the function xtest from 2.ps1 is not visible in the main script; it's only visible in the scope of include-function. Is there a way to pass xtest to the main script?
My include function doesn't really load the file (it gets it as a string from an API), so I can't call it directly from the main script. As a workaround I changed include-function to return the contents of the file, and then from the main script I call Invoke-Expression (include-function "c:\2.ps1")
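That workaround can be sketched as follows (Get-ScriptContent is a hypothetical stand-in for the API call that returns the script text):

```powershell
function Get-ScriptContent($fileName)
{
    # stand-in for the API call; returns the whole script as one string
    Get-Content -Path $fileName -Raw
}

# Invoke-Expression runs the text in the caller's scope,
# so xtest becomes visible in the main script
Invoke-Expression (Get-ScriptContent "c:\2.ps1")
xtest "bbb"
```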
Thanks

The explanation is the scope of your variables and functions: if, in 2.ps1, you declare your variables and functions as global, they will be visible in the global scope.
Example of 2.ps1:
$global:Var2="Coucou"
function global:Test2 ([string]$Param)
{
write-host $Param $Param
}
Usage in test.ps1:
function include-function($fileName)
{
.$fileName
}
Clear-Host
include-function "c:\silogix\2.ps1"
Test2 "Hello"
gives:
Hello Hello
As you tagged your question with PowerShell 2.0, you'd better have a look at modules. Using modules will result in better-structured code; see about_Modules.
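If you do move to a module, a minimal sketch might look like this (the file name Tools.psm1 is an assumption; xtest is carried over from the question):

```powershell
# c:\Tools.psm1  (hypothetical module file)
function xtest ([string]$Param)
{
    Write-Host $Param
}

# main.ps1
Import-Module "c:\Tools.psm1"  # exported functions become visible to the caller
xtest "bbb"
```

Unlike dot-sourcing inside a helper function, Import-Module makes the module's exported functions available in the importing scope without any global: qualifiers.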

Related

how can i pass a powershell variable to terraform

I need to pass a PowerShell/DevOps variable to Terraform; is there a way of doing this? As in the example below, I want the PR number to be used as a variable in Terraform.
testvalue-$(System.PullRequest.PullRequestNumber)
As far as I know, there is no way to define a variable from the output of a command, but you can take a look at the external data source.
The idea is that you define a bash script (or any program) and use its output as parameters for other resources.
Example
data "external" "PullRequest" {
program = [
"${path.module}/scriptWhichReturnsPullRequestName.sh",
]
}
...
resource ... {
value = data.external.PullRequest.result.property
}
I put my variables in a variables.tf file and trigger the Terraform execution from a PowerShell script. Prior to that execution I just replace certain strings in variables.tf.
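As a side note, if the goal is just to get a pipeline value into Terraform from PowerShell, Terraform also reads input variables from environment variables named TF_VAR_&lt;name&gt;; a sketch (the variable name pr_number is an assumption):

```powershell
# variables.tf would declare:  variable "pr_number" {}
# In the pipeline's PowerShell step, export the value before running terraform;
# $(System.PullRequest.PullRequestNumber) is expanded by Azure DevOps before the script runs
$env:TF_VAR_pr_number = "$(System.PullRequest.PullRequestNumber)"
terraform plan
```

This avoids rewriting variables.tf on every run.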

How to make the output in the pipeline appear on the console when running pester tests?

By default, the output in the pipeline is hidden, but sometimes I really want to know the output at that time.
Of course, I know I could add additional commands, such as Write-Host or Out-Default.
But does Pester itself have a mechanism to display the output properly?
I checked the help documentation and didn't find anything relevant, so I came here for help.
It is possible to write a custom Should wrapper (proxy) using this technique. The wrapper can write the pipeline objects to the console.
Here is a complete example of such a Should wrapper and how to override Pester's Should.
To apply it to your case, edit the process{} block of the wrapper like this:
process {
try {
# Here $_ is the current pipeline object
Write-Host "Current test case input:`n$( $_ | Out-String )"
# forward it to "process" block of original "Should"
$steppablePipeline.Process( $_ )
} catch {
throw
}
}

Generic collection as parameters

I would like to use a generic collection as a parameter type for a PowerShell function like so:
function Execute-Tokenlist
{
param([System.Collections.Generic.List[WTToken]]$Tokenlist)
}
[WTToken] is just a custom type.
I create a new generic collection list with WTToken objects:
$TokenList1 = New-Object -TypeName System.Collections.Generic.List[WTToken]
But when I try to call the function
Execute-Tokenlist -Tokenlist $TokenList1
the result is a ParameterBindingArgumentTransformationException because PowerShell turns System.Collections.Generic.List`1[WTToken] into a new type:
System.Collections.Generic.List`1[[WTToken, TokenListeAuswerten.ps1, Version=0.0.0.0, Culture=neutral, PublicKeyToken=null]]
So it adds the script name as an assembly name to the type name.
My question is: how can I prevent PowerShell from doing this kind of type transformation?
I know I could use Object as a parameter type but I don't like giving up type checking for no good reason.
I found the solution and (of course) it has a simple explanation.
PowerShell includes, in the full name of any type that is based on a class definition, the path of the .ps1 file that contains the class definition, or just "\powershell" if the definition is not in a .ps1 file.
My mistake was that I opened a new script window inside the ISE and put in the code defining the class.
Then I tried to put an object based on that class into a generic collection that was already defined inside a .ps1 file, and whose Add method therefore expects objects whose type name includes that .ps1 path. Since the new object's type name didn't contain that path, I got error messages that didn't make sense to me (now they do).
Here is the complete PowerShell script code again:
class WTToken
{
[String]$Type
[String]$Name
[Bool]$Value
}
$TokenList = [System.Collections.Generic.List[WTToken]]::new()
$Token = [WTToken]::new()
$TokenList.Add($Token)
$TokenList.GetType().FullName
Start the ISE, paste in that code, save it into a .ps1 file, e.g. test.ps1, and run it with F5.
Now open a new script window and paste in the following PowerShell code:
class WTToken
{
[String]$Type
[String]$Name
[Bool]$Value
}
$Token = [WTToken]::new()
$TokenList.Add($Token)
Run this and an error message should appear.

How do I make a variable in a Powershell module accessible to other functions in that module?

I have a PowerShell module that defines some basic functions that write log events according to the corporate standard, plus a single function which creates the folder the log file should go into and restarts the logging app service.
Each function that writes a given severity of log event needs to use the $LogFileLocation variable (set in the function that creates the folder, restarts the service, and generally gets the system ready to start logging), but $LogFileLocation is only available inside that setup function.
How can I make it available to the other functions, including to functions in a script which imports the module? I've tried Global:$LogFileLocation instead of just $LogFileLocation, but this doesn't seem to make it a global variable.
Define this function in your module:
Function Set-MyModuleLogPath {
[CmdletBinding()]
param (
[Parameter(Mandatory=$true)]
[ValidateScript({Test-Path $_ -PathType Leaf})] #include this if you want to ensure the path exists before using (or add logic to create the path below)
[String]$Path
)
process {
(New-Variable -Name 'LogFileLocation' -Value (Resolve-Path $Path).Path -Option 'Constant' -Scope 1 -PassThru).Value
#For more info on Scope see: https://msdn.microsoft.com/en-us/powershell/reference/5.1/microsoft.powershell.core/about/about_scopes
}
}
This uses the New-Variable command to assign your path value to a variable, setting that variable as a constant (as presumably once set you don't want other functions to change this path at runtime; since that could lead to unpredictable results with a widely scoped variable).
The Scope parameter takes an argument of 1, meaning that this variable is scoped to the container of the function's definition; which in your case is the module script. i.e. Defining this function in the module then calling at runtime has the same effect as writing $LogFileLocation = 'c:\path\to\file.log' in the module's code; only you now have a way to avoid hard-coding that path in your module.
The ValidateScript option on path has the logic Test-Path $_ -PathType Leaf. i.e. we want to ensure that the path being referred to is valid / already exists; and that it's a file, not a directory. Of course, you may want to only validate the directory exists then create a new file at runtime; or maybe create anything that doesn't already exist at runtime... you can tweak this logic as you require.
The Resolve-Path is used in case someone passes in a relative path (e.g. '.\default.log'), since if the working directory changes as the script runs, the file this refers to would also change. By resolving it to an absolute path when set, the location is locked down.
Rather than referring to the variable by name elsewhere in the module (i.e. $Script:LogFileLocation or $LogFileLocation), I'd recommend adding a Get method to allow logic to check that this variable was set, then using that. Of course, that may be an additional overhead that's not worth it (i.e. if performance is more important than robustness)... depending on your requirements.
Function Get-MyModuleLogPath {
[CmdletBinding(DefaultParameterSetName='ErrorIfNotSet')]
param (
[Parameter(ParameterSetName='DefaultIfNotSet', Mandatory=$true)]
[switch]$DefaultIfNotSet
,
[Parameter(ParameterSetName='DefaultIfNotSet')]
[string]$DefaultPath = '.\default.log'
)
process {
try {
$path = (Get-Variable -Name 'LogFileLocation' -Scope 1 -ValueOnly -ErrorAction Stop)
} catch {
if ($DefaultIfNotSet) {
$path = Set-MyModuleLogPath $DefaultPath #once we've used the default we want it to become locked as the constant; so this log path won't change at runtime
} else {
throw "Please run 'Set-MyModuleLogPath' to define a log path before calling 'Get-MyModuleLogPath', or use the '-DefaultIfNotSet' switch to allow the default path to be used"
}
}
$path
}
}
Scope: https://msdn.microsoft.com/en-us/powershell/reference/5.1/microsoft.powershell.core/about/about_scopes
New-Variable: https://msdn.microsoft.com/en-us/powershell/reference/5.1/microsoft.powershell.utility/new-variable
Get-Variable: https://msdn.microsoft.com/en-us/powershell/reference/5.1/microsoft.powershell.utility/get-variable
Set-Variable: https://msdn.microsoft.com/en-us/powershell/reference/5.1/microsoft.powershell.utility/set-variable
Remove-Variable: https://msdn.microsoft.com/en-us/powershell/reference/5.1/microsoft.powershell.utility/remove-variable
Clear-Variable: https://msdn.microsoft.com/en-us/powershell/reference/5.1/microsoft.powershell.utility/clear-variable
Further Reading: http://www.powershellatoms.com/shell-environment/constant-and-read-only-variables/
Put the variable and its assignment ($LogFile = 'C:\path\to\file.log') at the top of the .psm1 file and it'll become a module-level variable, making it available to all the functions in the module.
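A minimal sketch of that layout (the file, function, and path names are placeholders):

```powershell
# MyLogging.psm1
$LogFileLocation = 'C:\Logs\app.log'  # module-level; readable by every function below

function Set-LogFolder {
    # assigning from inside a function needs the script: qualifier,
    # otherwise a new local variable would shadow the module-level one
    $script:LogFileLocation = Join-Path 'C:\Logs' ("{0:yyyyMMdd}.log" -f (Get-Date))
}

function Write-InfoLog ([string]$Message) {
    Add-Content -Path $script:LogFileLocation -Value "INFO  $Message"
}
```

Reading the variable works with plain $LogFileLocation, but any function that assigns to it must use $script:LogFileLocation, or it will create a function-local copy instead.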

Passing a Variable to a script calling itself

My script stores a ClearCase view in a variable. To operate in this view, the script needs to call itself inside the view again, after it has started the view.
The Code looks like this
if($params{ViewSet} eq 'no')
{
# Start the View
# Store the View in $View
# Call the Script in the new-set View with parameter -ViewSet yes
}
if($params{ViewSet} eq 'yes')
{
# Do Work inside the View
}
The problem is that, obviously, the variable $View is not defined when I call my script the second time, since it is defined in the first if block.
Can I pass the view I stored in $View when I call the script the second time?
Setting the view before entering the if statements would not work; I would start the view twice then.
Call the Script in the new-set View with parameter -ViewSet
If that involves calling cleartool setview, don't: setview spawns a subshell in which what you have defined in your script won't be visible.
When your script needs to access the dynamic view started, do use the full dynamic view path:
/view/myDynView
# under which you would see:
/view/myDynView/vobs/MyVob