Is there any way to test functions in a PowerShell script without executing the script? - powershell

I would like to define stand-alone functions in my PowerShell script and be able to Pester test the functions without executing the rest of the script. Is there any way to do this without defining the functions in a separate file?
In the following pseudocode example, how do I test functionA or functionB without executing mainFunctionality?
script.ps1:
functionA
functionB
...
mainFunctionality
script.Tests.ps1:
BeforeAll {
    . $PSScriptRoot/script.ps1 # This will execute the mainFunctionality, which is what I want to avoid
}
Describe 'functionA' {
# ... tests
}
I believe in Python you can do this by wrapping your "mainFunctionality" inside this condition, so I am looking for something similar in PowerShell.
if __name__ == '__main__':
    mainFunctionality
Ref: What does if __name__ == "__main__": do?

Using the PowerShell Abstract Syntax Tree (AST) to just grab functionA and invoke it:
$ScriptBlock = {
    function functionA {
        Write-Host 'Do something A'
    }
    function functionB {
        Write-Host 'Do something B'
    }
    function mainFunctionality {
        # Do something
        functionA
        # Do something
        functionB
        # Do something
    }
    mainFunctionality
}
using namespace System.Management.Automation.Language
$Ast = [Parser]::ParseInput($ScriptBlock, [ref]$null, [ref]$null) # or: ParseFile
$FunctionA = $Ast.FindAll({
    $Args[0] -is [ScriptBlockAst] -and $Args[0].Parent.Name -eq 'functionA'
}, $True)
Invoke-Expression $FunctionA.EndBlock
Do something A

You could use $MyInvocation.PSCommandPath to determine who invoked your script; it's the closest thing I can think of to Python's if __name__ == '__main__':. This property gives you the absolute path of the caller. From there you can extract the script name, e.g. with Path.GetFileName, and then decide what to do: for example, call mainFunctionality only if the caller's name equals main.ps1, or only if it is not script.Tests.ps1.
Here is a short example.
myScript.ps1
function A {
    "I'm function A"
}
function B {
    "I'm function B"
}
function mainFunctionality {
    "I'm function mainFunctionality"
}
A # Calls A
B # Calls B
# Call `mainFunctionality` only if my caller's name is `main.ps1`
if ([System.IO.Path]::GetFileName($MyInvocation.PSCommandPath) -eq 'main.ps1') {
    mainFunctionality
}
Then if calling myScript.ps1 from main.ps1 you would see:
I'm function A
I'm function B
I'm function mainFunctionality
And if calling myScript.ps1 from anywhere else (console or other script with a different name) you would see:
I'm function A
I'm function B

Yes, you can use the Invoke-Expression cmdlet, which executes a string as if it were a command, to call functions in a PowerShell script.
For example, if you have a function called Test-Function in your script, you can test it with the following command: Invoke-Expression -Command "Test-Function"
function functionA {
    # Do something
}
function functionB {
    # Do something
}
function mainFunctionality {
    # Do something
    functionA
    # Do something
    functionB
    # Do something
}
mainFunctionality
Yes, you can test functions in a PowerShell script without executing the rest of the script. To do this, you can use the Invoke-Pester command to run specific tests in the script. For example, if you wanted to test the functions functionA and functionB, you could use the following command:
Invoke-Pester -Script .\MyScript.ps1 -TestName functionA,functionB
This will execute the tests for the specified functions without executing the rest of the script.
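For -TestName to select anything, the names passed must match Describe block names in the tests file. A minimal sketch of what that could look like (assuming Pester v4-style syntax; the assertions here are made up):

```powershell
# script.Tests.ps1 -- Describe names line up with the -TestName values
Describe 'functionA' {
    It 'produces output' {
        functionA | Should -Not -BeNullOrEmpty
    }
}
Describe 'functionB' {
    It 'produces output' {
        functionB | Should -Not -BeNullOrEmpty
    }
}
```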

Related

Using Pipeline to pass an object to another Powershell script [duplicate]

I am trying to write a PowerShell script that can get pipeline input (and is expected to do so), but trying something like
ForEach-Object {
    # do something
}
doesn't actually work when using the script from the commandline as follows:
1..20 | .\test.ps1
Is there a way?
Note: I know about functions and filters. This is not what I am looking for.
In V2 you can also accept pipeline input (ByPropertyName or ByValue), add parameter aliases, etc.:
function Get-File {
    param(
        [Parameter(
            Position=0,
            Mandatory=$true,
            ValueFromPipeline=$true,
            ValueFromPipelineByPropertyName=$true)]
        [Alias('FullName')]
        [String[]]$FilePath
    )
    process {
        foreach ($path in $FilePath)
        {
            Write-Host "file path is: $path"
        }
    }
}
# test ValueFromPipelineByPropertyName
dir | Get-File
# test ValueFromPipeline (byValue)
"D:\scripts\s1.txt","D:\scripts\s2.txt" | Get-File
- or -
dir *.txt | foreach {$_.fullname} | Get-File
This works and there are probably other ways to do it:
foreach ($i in $input) {
    $i
}
17:12:42 PS>1..20 | .\cmd-input.ps1
1
2
3
-- snip --
18
19
20
Search for "powershell $input variable" and you will find more information and examples.
A couple are here:
PowerShell Functions and Filters PowerShell Pro!
(see the section on "Using the PowerShell Special Variable “$input”")
"Scripts, functions, and script blocks all have access to the $input variable, which provides an enumerator over the elements in the incoming pipeline. "
or
$input gotchas « Dmitry’s PowerBlog PowerShell and beyond
"... basically $input is an enumerator which provides access to the pipeline you have."
This applies to the PowerShell command line, not the DOS-style Windows Command Processor command line.
You can either write a filter which is a special case of a function like so:
filter SquareIt([int]$num) { $_ * $_ }
or you can create a similar function like so:
function SquareIt([int]$num) {
    Begin {
        # Executes once before first item in pipeline is processed
    }
    Process {
        # Executes once for each pipeline object
        $_ * $_
    }
    End {
        # Executes once after last pipeline object is processed
    }
}
The above works as an interactive function definition, or in a script that is dotted into your global session (or another script). However, your example indicated you wanted a script, so here it is in a script that is directly usable (no dotting required):
--- Contents of test.ps1 ---
param([int]$num)
Begin {
    # Executes once before first item in pipeline is processed
}
Process {
    # Executes once for each pipeline object
    $_ * $_
}
End {
    # Executes once after last pipeline object is processed
}
With PowerShell V2, this changes a bit with "advanced functions", which imbue functions with the same parameter-binding features that compiled cmdlets have. See this blog post for an example of the differences. Also note that in the advanced-function case you don't use $_ to access the pipeline object. With advanced functions, pipeline objects get bound to a parameter just like they do with a cmdlet.
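As a sketch of that difference, the SquareIt function above could be rewritten as an advanced function (the name SquareIt2 is made up here) where each pipeline item binds to a parameter instead of $_:

```powershell
function SquareIt2 {
    [CmdletBinding()]
    param(
        [Parameter(ValueFromPipeline=$true)]
        [int]$Num
    )
    process {
        # The current pipeline object is bound to $Num; $_ is not used
        $Num * $Num
    }
}

1..3 | SquareIt2   # outputs 1, 4, 9
```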
The following are the simplest possible examples of scripts/functions that use piped input. Each behaves the same as piping to the "echo" cmdlet.
As Scripts:
# Echo-Pipe.ps1
Begin {
    # Executes once before first item in pipeline is processed
}
Process {
    # Executes once for each pipeline object
    echo $_
}
End {
    # Executes once after last pipeline object is processed
}
# Echo-Pipe2.ps1
foreach ($i in $input) {
    $i
}
As functions:
Function Echo-Pipe {
    Begin {
        # Executes once before first item in pipeline is processed
    }
    Process {
        # Executes once for each pipeline object
        echo $_
    }
    End {
        # Executes once after last pipeline object is processed
    }
}

Function Echo-Pipe2 {
    foreach ($i in $input) {
        $i
    }
}
E.g.
PS > . theFileThatContainsTheFunctions.ps1 # This includes the functions into your session
PS > echo "hello world" | Echo-Pipe
hello world
PS > cat aFileWithThreeTestLines.txt | Echo-Pipe2
The first test line
The second test line
The third test line

Reference Function Defined in a ScriptBlock from Another Context

I'm trying to call a function declared inside a scriptblock outside of the scriptblock but PS can't resolve it. Here's my code
$ScriptBlock = {
    function Get-Baz() {
        Write-Host "Baz executed"
    }
    function Get-Foo() {
        Write-Host "Foo executed"
    }
}
Get-Baz   # <-- The term 'Get-Baz' is not recognized as the name of a cmdlet, function, script...
Defining a script block doesn't execute anything inside of it.
Usually you execute a scriptblock with the call operator & but that executes it in a different scope and won't work.
Instead, you need to execute the scriptblock in the current scope. To do that, use the dot sourcing operator .:
$ScriptBlock = {
    function Get-Baz() {
        Write-Host "Baz executed"
    }
    function Get-Foo() {
        Write-Host "Foo executed"
    }
}
. $ScriptBlock
Get-Baz

How do I keep function and variable names from colliding?

I am making a script to install several programs.
Install.ps1
$here = Split-Path -Parent $MyInvocation.MyCommand.Path
. "$here\includes\script1.ps1"
. "$here\includes\script2.ps1"
Write-Host "Installing program 1"
Install-ProgramOne
Write-Host "Installing program 2"
Install-ProgramTwo
script1.ps1
param (
    [string] $getCommand = "msiexec /a program1.msi /q"
)
function Get-Command {
    $getCommand
}
function Install-ProgramOne {
    iex $(Get-Command)
}
script2.ps1
param (
    [string] $getCommand = "msiexec /a program2.msi /q"
)
function Get-Command {
    $getCommand
}
function Install-ProgramTwo {
    iex $(Get-Command)
}
The $getCommand variable will get overwritten when both files are included.
There are namespaces in C# and modules in Ruby, but I cannot figure out how to keep namespaces separate in Powershell.
The $getCommand variable is not a variable per se but a parameter, one that has a default value specified. That said, it isn't a great idea to have script parameters on a dot-sourced script file. These types of files usually just contain a library of functions and shared/global variables.
A better approach in V2 and higher is to use a module. A module is a container of variables and functions in which you control what is exported and what is private. This is what I would do with your two scripts:
script1.psm1
# private to this module
$getCommand = "msiexec /a program1.msi /q"
function Get-Command {
    $getCommand
}
function Install-ProgramOne {
    iex $(Get-Command)
}
Export-ModuleMember -Function Install-ProgramOne
script2.psm1
# private to this module
$getCommand = "msiexec /a program2.msi /q"
function Get-Command {
    $getCommand
}
function Install-ProgramTwo {
    iex $(Get-Command)
}
Export-ModuleMember -Function Install-ProgramTwo
Then use them like so:
Import-Module $PSScriptRoot\script1.psm1
Import-Module $PSScriptRoot\script2.psm1
Install-ProgramOne
Install-ProgramTwo
You are "dot sourcing" your scripts instead of running them. This basically means "dump everything into the GLOBAL namespace". If you simply run the scripts instead of dot-sourcing them, then they each get their own local scope. In general, I think scripts with parameters should be run, not dot-sourced.
The problem with not dot-sourcing is that the functions you are declaring, by default, will go out of the scope when the script completes. To avoid this, you can define your function like this instead:
function global:Install-ProgramOne
{
}
And then merely run the script instead of dot-sourcing, and $getcommand will be local to each script you run.
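To see the global: trick in isolation, here is a minimal sketch (the file and function names are made up):

```powershell
# defs.ps1 -- note the global: scope modifier on the function name
function global:Say-Hello {
    "Hello from a global function"
}
```

Running `.\defs.ps1` (rather than dot-sourcing it) still leaves Say-Hello defined afterwards, because the function was declared into the global scope even though the script ran in its own local scope.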

Recursive -Verbose in Powershell

Is there an easy way to make the -Verbose switch "passthrough" to other function calls in Powershell?
I know I can probably search $PSBoundParameters for the flag and do an if statement:
Function Invoke-CustomCommandA {
    [CmdletBinding()]
    param()
    Write-Verbose "Invoking Custom Command A..."
    if ($PSBoundParameters.ContainsKey("Verbose")) {
        Invoke-CustomCommandB -Verbose
    } else {
        Invoke-CustomCommandB
    }
}
Invoke-CustomCommandA -Verbose
It seems rather messy and redundant to do it this way however... Thoughts?
One way is to use $PSDefaultParameterValues at the top of your advanced function:
$PSDefaultParameterValues = @{"*:Verbose"=($VerbosePreference -eq 'Continue')}
Then every command you invoke with a -Verbose parameter will have it set depending on whether or not you used -Verbose when you invoked your advanced function.
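Put together, that looks something like the following sketch (Invoke-CustomCommandB is assumed to be another advanced function that writes to the verbose stream):

```powershell
function Invoke-CustomCommandA {
    [CmdletBinding()]
    param()
    # Mirror this function's own -Verbose setting onto everything it calls
    $PSDefaultParameterValues = @{"*:Verbose" = ($VerbosePreference -eq 'Continue')}
    Write-Verbose "Invoking Custom Command A..."
    Invoke-CustomCommandB   # picks up -Verbose automatically
}
```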
If you have just a few commands, then do this:
$verbose = [bool]$PSBoundParameters["Verbose"]
Invoke-CustomCommandB -Verbose:$verbose
I began using KeithHill's $PSDefaultParameterValues technique in some powershell modules. I ran into some pretty surprising behavior which I'm pretty sure resulted from the effect of scope and $PSDefaultParameterValues being a sort-of global variable. I ended up writing a cmdlet called Get-CommonParameters (alias gcp) and using splat parameters to achieve explicit and terse cascading of -Verbose (and the other common parameters). Here is an example of how that looks:
function f1 {
    [CmdletBinding()]
    param()
    process
    {
        $cp = &(gcp)
        f2 @cp
        # ... some other code ...
        f2 @cp
    }
}
function f2 {
    [CmdletBinding()]
    param()
    process
    {
        Write-Verbose 'This gets output to the Verbose stream.'
    }
}
f1 -Verbose
The source for cmdlet Get-CommonParameters (alias gcp) is in this github repository.
How about:
$vb = $PSBoundParameters.ContainsKey('Verbose')
Invoke-CustomCommandB -Verbose:$vb

How to define a subroutine in PowerShell

In C#, a RemoveAllFilesByExtenstion subroutine could, for example, be declared like this:
void RemoveAllFilesByExtenstion(string targetFolderPath, string ext)
{
    ...
}
and used like:
RemoveAllFilesByExtenstion("C:\Logs\", ".log");
How can I define and call a subroutine with the same signature from a PowerShell script file (.ps1)?
Pretty simple to convert this to PowerShell:
function RemoveAllFilesByExtenstion([string]$targetFolderPath, [string]$ext)
{
    ...
}
But the invocation has to use space-separated arguments, and doesn't require quotes unless there's a PowerShell special character in the string:
RemoveAllFilesByExtenstion C:\Logs\ .log
OTOH, if the function is indicative of what you want to do, this can be done in PowerShell easily:
Get-ChildItem $targetFolderPath -r -filter $ext | Remove-Item
There are no subroutines in PowerShell, you need a function:
function RemoveAllFilesByExtenstion
{
    param(
        [string]$TargetFolderPath,
        [string]$ext
    )
    ... code...
}
To invoke it :
RemoveAllFilesByExtenstion -TargetFolderPath C:\Logs -Ext *.log
If you don't want the function to return any value, make sure you capture any results returned from the commands inside the function.
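For example, .NET methods like ArrayList.Add return a value that would otherwise leak into the function's output. A sketch of capturing it (assuming $Ext is a wildcard pattern such as *.log):

```powershell
function RemoveAllFilesByExtenstion
{
    param(
        [string]$TargetFolderPath,
        [string]$Ext
    )
    $removed = New-Object System.Collections.ArrayList
    Get-ChildItem $TargetFolderPath -Filter $Ext | ForEach-Object {
        # Add returns the new index; capture it or it becomes function output
        $null = $removed.Add($_.FullName)
        Remove-Item $_.FullName
    }
}
```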