Testing for mandatory parameters with Pester - powershell

I'm trying to figure out how to have Pester test for parameters that are missing:
Find-Waldo.Tests.ps1
$here = Split-Path -Parent $MyInvocation.MyCommand.Path
$sut = (Split-Path -Leaf $MyInvocation.MyCommand.Path) -replace '\.Tests\.', '.'
Describe 'Mandatory parameters' {
    it 'ComputerName' {
        {
            $Params = @{
                #ComputerName = 'MyPc'
                ScriptName = 'Test'
            }
            . "$here\$sut" @Params
        } | Should throw
    }
}
Find-Waldo.ps1
Param (
    [Parameter(Mandatory)]
    [String]$ComputerName,
    [String]$ScriptName
)
Function Find-Waldo {
    [CmdletBinding()]
    Param (
        [String]$FilePath
    )
    'Do something'
}
Every time I try to assert the result or simply run the test, it will prompt me for the ComputerName parameter instead of failing the test.
Am I missing something super obvious here? Is there a way to test for the presence of mandatory parameters?

Per the comments from Mathias, you can't really test whether a Mandatory parameter is missing, because PowerShell prompts for it rather than throwing an error. Per the comment he linked to from the Pester team, you could use Get-Command to test for the Mandatory parameter setting in the script (assuming it is the only parameter attribute set for that variable):
(Get-Command "$here\$sut").Parameters['ComputerName'].Attributes.Mandatory | Should Be $true
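For example, a minimal sketch of that check wrapped in a Pester test (using the same Should Be syntax as the line above):
Describe 'Mandatory parameters' {
    it 'ComputerName is mandatory' {
        # inspect the script's parameter metadata instead of invoking it
        (Get-Command "$here\$sut").Parameters['ComputerName'].Attributes.Mandatory | Should Be $true
    }
}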
An alternative option would be to not use Mandatory parameters in this instance, and instead have a script block that does a Throw as the default value of the parameter:
Param (
    [String]$ComputerName = $(Throw '-ComputerName is required'),
    [String]$ScriptName
)
If the script is always used as part of an automated process (instead of via user execution) this might be preferred as it allows you to control/capture its behavior and avoids it getting stuck during execution. You can then test the script as you had originally proposed:
Describe 'Mandatory parameters' {
    it 'ComputerName' {
        {
            $Params = @{
                #ComputerName = 'MyPc'
                ScriptName = 'Test'
            }
            . "$here\$sut" @Params
        } | Should throw '-ComputerName is required'
    }
}

Although the accepted answer indicates that this isn't possible, it actually is possible. Here is the solution that I developed to solve this problem.
It 'Should fail when no priority is specified, for a valid process name' {
    {
        $ScriptBlock = {
            Import-Module -Name $args[0]
            Set-ProcessPriority -Name System
        }
        Start-Job -ScriptBlock $ScriptBlock -ArgumentList $HOME/git/ProcessPriority/src/ProcessPriority | Wait-Job | Receive-Job
    } | Should -Throw
}
What you'll notice from the above example is:
🚀 The code being tested has been wrapped in a PowerShell ScriptBlock
🚀 We invoke a PowerShell background job, containing the test code
🚀 We wait for the background job to complete, and then receive the results
🚀 If you run the Get-Job command, you'll notice that there is a job in the Blocked status
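For instance, the blocked job could be spotted like this (an illustrative one-liner, not part of the original test):
Get-Job | Where-Object { $_.State -eq 'Blocked' }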
The exception that's thrown by the background job is similar to the following:
The Wait-Job cmdlet cannot finish working, because one or more jobs are blocked waiting for user interaction. Process interactive job output by using the Receive-Job cmdlet, and then try again.
You'll notice that I hard-coded the filesystem path to the module. I am not sure how to pass this as an argument into the "outer" ScriptBlock that Pester is invoking for us. Perhaps someone has a suggestion on how to accomplish that final piece of the puzzle.
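One possible way to avoid the hard-coded path, sketched under the assumption that this is Pester v5, where variables defined in a BeforeAll block are visible inside It blocks:
BeforeAll {
    # Hypothetical variable holding the module path
    $ModulePath = "$HOME/git/ProcessPriority/src/ProcessPriority"
}
It 'Should fail when no priority is specified, for a valid process name' {
    {
        Start-Job -ScriptBlock {
            Import-Module -Name $args[0]   # the path arrives via -ArgumentList
            Set-ProcessPriority -Name System
        } -ArgumentList $ModulePath | Wait-Job | Receive-Job
    } | Should -Throw
}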
What's uniquely interesting about PowerShell background jobs is that you can actually resume a job in the Blocked status, and it will prompt you for input, even though it threw the earlier exception.

Related

Start-Job: Call other functions in a script

After reading a lot of Q&A here on SO about Start-Job, I still cannot understand what I am doing wrong...
The main idea: I need to run a lot of functions that call other functions with different parameters. Something like this:
function Base-Function {
    PARAM(
        [string]
        $Param
    )
    # I will do something with $Param
}
function Copy-File {
    PARAM(
        [string]
        $CopyFileParam
    )
    Base-Function -Param $CopyFileParam
}
function Read-File {
    PARAM(
        [string]
        $ReadFileParam
    )
    Base-Function -Param $ReadFileParam
}
function Move-File {
    PARAM(
        [string]
        $MoveFileParam
    )
    Base-Function -Param $MoveFileParam
}
So - I am trying to call Copy-File, Read-File and Move-File simultaneously:
function Main-Function {
    $copyFileArgs = @{ "CopyFileParam" = 1 }
    Start-Job -ScriptBlock ${Function:Copy-File} -ArgumentList $copyFileArgs
    $readFileArgs = @{ "ReadFileParam" = 2 }
    Start-Job -ScriptBlock ${Function:Read-File} -ArgumentList $readFileArgs
    ...
    ...
}
but of course I cannot call Base-Function inside the Copy-File function this way, so I added the -InitializationScript argument:
$init = {
    function Base-Function {
        PARAM(
            [string]
            $Param
        )
        # I will do something with $Param
    }
}
and then I call it like this:
function Main-Function {
    $copyFileArgs = @{ "CopyFileParam" = 1 }
    Start-Job -ScriptBlock ${Function:Copy-File} -ArgumentList $copyFileArgs -InitializationScript $init
}
but then I get an error:
OpenError: [localhost] An error occurred while starting the background process. Error reported: An error occurred trying to start process 'C:\Program Files\PowerShell\7\pwsh.exe' with working directory 'C:\Projects\powershell\project'. The filename or extension is too long..
So my question is:
Any suggestions on how to simultaneously call different functions that in turn call other in-script functions?
Why do I get the error "The filename or extension is too long"?
Here is a link to an example PowerShell script: Gist
Run the script and let it finish
In the same shell window, check for jobs: Get-Job
Check the output of the running job: Receive-Job xxx
See that the output of the job is:
ObjectNotFound: The term 'Base-Function' is not recognized as a name of a cmdlet, function, script file, or executable program.
Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
ObjectNotFound: The term 'Base-Function' is not recognized as a name of a cmdlet, function, script file, or executable program.
Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
Sorry for misinforming you; the code above is correct and functional (many thanks to @mklement0 for hints and suggestions).
The actual problem that I encountered was in this line:
OpenError: [localhost] An error occurred while starting the background process.
Error reported: An error occurred trying to start process 'C:\Program Files\PowerShell\7\pwsh.exe' with working directory 'C:\Projects\powershell\project'.
The filename or extension is too long..
The filename or extension is too long. -> this means there is a character-length limit on what can be passed via the -InitializationScript parameter. You can check it in the Gist example above - everything works OK.
Here is the Stack Overflow question that gave me this idea: Max size of ScriptBlock / InitializationScript for Start-Job in PowerShell
Once I moved all the code from the -InitializationScript parameter into a .ps1 script file and then dot-sourced it inside the function, everything started working perfectly.
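A minimal sketch of that workaround (the file name BaseFunctions.ps1 and its path are hypothetical): the shared functions live in a .ps1 file, so the initialization script shrinks to a single dot-source line and stays well under the length limit:
$init = { . 'C:\Projects\powershell\project\BaseFunctions.ps1' }   # hypothetical file containing Base-Function etc.
Start-Job -ScriptBlock ${Function:Copy-File} -ArgumentList $copyFileArgs -InitializationScript $init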
Thanks again @mklement0

PowerShell Jobs, writing to a file

Having some problems getting a Start-Job script block to output to a file. The following three lines of code work without any problem:
$about_name = "C:\0\ps_about_name.txt"
$about = get-help about_* | select Name,Synopsis
if (-not (Test-Path $about_name)) { ($about | select Name | sort Name | Out-String).replace("[Aa]bout_", "") > $about_name }
The file is created in C:\0\
But I need to do a lot of collections like this, so I naturally looked at stacking them in parallel as separate jobs. I followed online examples and so put the last line in the above as a script block invoked by Start-Job:
Start-Job { if (-not (Test-Path $about_name)) { ($about | select Name | sort Name | Out-String).replace("[Aa]bout_", "") > $about_name } }
The Job is created, goes to status Running, and then to status Completed, but no file is created. Without Start-Job everything works; with Start-Job, nothing... I've tried a lot of variations on this but cannot get it to create the file. Can someone advise what I am doing wrong here, please?
IMO, the simplest way to get around this problem is by use of the $using scope modifier.
$about_name = "C:\0\ps_about_name.txt"
$about = get-help about_* | select Name,Synopsis
$sb = {
    if (-not (Test-Path $using:about_name)) {
        $using:about.Name -replace '^about_' | Sort-Object > $using:about_name
    }
}
Start-Job -Scriptblock $sb
Explanation:
$using allows you to access local variables in a remote command. This is particularly useful when running Start-Job and Invoke-Command. The syntax is $using:localvariable.
This particular problem is a variable scope issue. Start-Job creates a background job with its own scope. When using -Scriptblock parameter, you are working within that scope. It does not know about variables defined in your current scope/session. Therefore, you must use a technique that will define the variable within the scope, pass in the variable's value, or access the local scope from the script block. You can read more about scopes at About_Scopes.
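For instance, the same mechanism works with Invoke-Command (Server01 is a hypothetical computer name):
$localVar = 'hello'
Invoke-Command -ComputerName Server01 -ScriptBlock { "Remote sees: $using:localVar" }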
As an aside, character classes like [Aa] are not supported in the .NET .Replace() method, which replaces literal strings only. You need to switch to the -replace operator to use them. I updated the code to perform the replace case-insensitively with -replace.
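A quick illustration of the difference (example strings made up):
'About_Scopes'.Replace('[Aa]bout_', '')   # literal .NET replace: no such substring, string returned unchanged
'About_Scopes' -replace '^[Aa]bout_'      # regex replace, case-insensitive by default: returns 'Scopes'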
HCM's perfectly fine solution uses a technique that passes the value into the job's script block. By defining a parameter within the script block, you can pass a value into that parameter by use of -ArgumentList.
Another option is to just define your variables within the Start-Job script block.
$sb = {
    $about_name = "C:\0\ps_about_name.txt"
    $about = get-help about_* | select Name,Synopsis
    if (-not (Test-Path $about_name)) {
        $about.Name -replace '^about_' | Sort-Object > $about_name
    }
}
Start-Job -Scriptblock $sb
You've got to send your parameters to your job.
This does not work:
$file = "C:\temp\_mytest.txt"
start-job {"_" | out-file $file}
While this does:
$file = "C:\temp\_mytest.txt"
start-job -ArgumentList $file -scriptblock {
    Param($file)
    "_" | out-file $file
}

Powershell workflow function issue

I'm trying to reuse code in my SMA runbooks, but everything I try to put inside a function doesn't seem to work as expected.
For example, if I do this, it works and returns the username of the credential:
workflow RB_Test
{
$credent = Get-AutomationPSCredential -Name "CRED_TESTE"
$var = $credent.Username
"result = ${var}"
}
But if I turn it into this, it doesn't work anymore (returns null):
workflow RB_Test
{
    function FN_Test
    {
        $credent = Get-AutomationPSCredential -Name "CRED_TESTE"
        $var = $credent.Username
        "result = ${var}"
    }
    FN_Test
}
I've tried different things but without success. The debug/verbose screen doesn't show anything different. This also doesn't work:
Inlinescript {
. FN_Test
}
My goal would be to put several functions into a separate module and then import it in my runbooks for reusability, but this really seems not to work.
This is a runbook (PowerShell workflow) created in Service Management Automation (SMA).
I've read that there are some restrictions in PowerShell workflows compared to pure PowerShell, but I am not sure if I am hitting one of them:
https://blogs.technet.microsoft.com/heyscriptingguy/2013/01/02/powershell-workflows-restrictions/
Thanks
Here's what I've had to do to get functions to work:
workflow FunctionTest {
    function log {
        param(
            [string]$Message
        )
        Write-Output $Message
        Write-Output "Filename: $Filename"
        Write-Output "using:Filename: $using:Filename"
        Write-Output "workflow:Filename: $workflow:Filename"
        Write-Output "----"
        ## Under what conditions is 'global' used? Can't be used in a workflow...Hey Scripting Guy!
    }
    workflow DoSomething {
        param(
            [string]$Filename
        )
        log "Starting DoSomething"
    }
    $Filename = "LogFile_2017.csv"
    log "Starting workflow"
    ## Variables need to be passed into workflow from parent-workflow
    DoSomething -Filename $Filename
    log "End workflow"
}
FunctionTest
I found you need to define your functions before using them. The tricky part was discovering that you have to pass your variables into the child-workflow.
The scope of the variables takes some getting used to.

Powershell: Checking if script was invoked with -ErrorAction?

Scripts (with CmdletBinding) and cmdlets all have a standard -ErrorAction parameter available when being invoked. Is there a way then, within your script, to tell whether your script was indeed invoked with -ErrorAction?
Reason I ask is because I want to know if the automatic variable value for $ErrorActionPreference, as far as your script is concerned, was set by -ErrorAction or if it is coming from your session level.
$ErrorActionPreference is a variable in the global(session) scope. If you run a script and don't specify the -ErrorAction parameter, it inherits the value from the global scope ($global:ErrorActionPreference).
If you specify the -ErrorAction parameter, $ErrorActionPreference is changed for your private scope, meaning it stays the same throughout the script except while running code where you specified something else (e.g. you call another script with another -ErrorAction value). Example to test:
Test.ps1
[CmdletBinding()]
param()
Write-Host "Session: $($global:ErrorActionPreference)"
Write-Host "Script: $($ErrorActionPreference)"
Output:
PS-ADMIN > $ErrorActionPreference
Continue
PS-ADMIN > .\Test.ps1
Session: Continue
Script: Continue
PS-ADMIN > .\Test.ps1 -ErrorAction Ignore
Session: Continue
Script: Ignore
PS-ADMIN > $ErrorActionPreference
Continue
If you want to test whether the script was called with the -ErrorAction parameter, you could use, for example:
if ($global:ErrorActionPreference -ne $ErrorActionPreference) { Write-Host "changed" }
If you don't know what scopes are, type this in a PowerShell console: Get-Help about_scopes
Check the $MyInvocation.BoundParameters object. You could use the built-in $PSBoundParameters variable, but I found that in some instances it was empty (not directly related to this question), so IMO it's safer to use $MyInvocation.
function Test-ErrorAction
{
    param()
    if ($MyInvocation.BoundParameters.ContainsKey('ErrorAction'))
    {
        'The ErrorAction parameter has been specified'
    }
    else
    {
        'The ErrorAction parameter was not specified'
    }
}
Test-ErrorAction
If you need to know whether the cmdlet was called with the -ErrorAction parameter, do it like this:
[CmdletBinding()]
param()
if ($myinvocation.Line.ToLower() -match "-erroraction")
{
    "yessss"
}
else
{
    "nooooo"
}
This is true even when the parameter has the same value as the global $ErrorActionPreference.
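For example (a hypothetical session, assuming the snippet above is saved as Test2.ps1); note that the $global: comparison shown earlier would miss this case, because both values are Continue:
PS-ADMIN > $ErrorActionPreference
Continue
PS-ADMIN > .\Test2.ps1 -ErrorAction Continue
yessss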

Powershell: How do I get the exit code returned from a process run inside a PsJob?

I have the following job in powershell:
$job = start-job {
    ...
    c:\utils\MyToolReturningSomeExitCode.cmd
} -ArgumentList $JobFile
How do I access the exit code returned by c:\utils\MyToolReturningSomeExitCode.cmd? I have tried several options, but the only one I could find that works is this:
$job = start-job {
    ...
    c:\utils\MyToolReturningSomeExitCode.cmd
    $LASTEXITCODE
} -ArgumentList $JobFile
...
# collect the output
$exitCode = $job | Wait-Job | Receive-Job -ErrorAction SilentlyContinue
# output all, except the last line
$exitCode[0..($exitCode.Length - 2)]
# the last line is the exit code
exit $exitCode[-1]
I find this approach too wry for my delicate taste. Can anyone suggest a nicer solution?
Important: I have read in the documentation that PowerShell must be run as administrator in order for the job-related remoting stuff to work. I cannot run it as administrator, hence -ErrorAction SilentlyContinue. So, I am looking for solutions not requiring admin privileges.
Thanks.
If all you need is to do something in the background while the main script does something else, then the PowerShell class is enough (and it is normally faster). Besides, it allows passing in a live object in order to return something in addition to output via parameters.
$code = @{}
$job = [PowerShell]::Create().AddScript({
    param($JobFile, $Result)
    cmd /c exit 42
    $Result.Value = $LASTEXITCODE
    'some output'
}).AddArgument($JobFile).AddArgument($code)
# start the job
$async = $job.BeginInvoke()
# do some other work while $job is working
#.....
# end the job, get results
$job.EndInvoke($async)
# the exit code is $code.Value
"Code = $($code.Value)"
UPDATE
The original code used a [ref] object. It works in PS V3 CTP2 but does not work in V2, so I corrected it: we can use other objects instead, a hashtable for example, in order to return some data via parameters.
One way you can detect whether the background job failed based on an exit code is to evaluate the exit code inside the background job itself and throw an exception if it indicates an error occurred. For instance, consider the following example:
$job = start-job {
    # ...
    $output = & C:\utils\MyToolReturningSomeExitCode.cmd 2>&1
    if ($LASTEXITCODE -ne 0) {
        throw ("Job failed. The error was: {0}." -f ([string] $output))
    }
} -ArgumentList $JobFile
$myJob = $job | Wait-Job
if ($myJob.State -eq 'Failed') {
    Receive-Job -Job $myJob
}
A couple of things to note in this example. I am redirecting the standard error output stream to the standard output stream to capture all textual output from the batch script, and returning it if the exit code is non-zero, indicating that it failed to run. By throwing an exception this way, the background job object's State property will let us know the result of the job.