PowerShell piping multiple PSCustomObjects drops all but first object

I'm writing some modules to make REST API calls easier for our software (each resource uses different logic for how it can be found and how it should be called, but that's a separate story). I'm making Get-, Set-, New-, and Remove- functions for each resource and just found that I can't pipe more than one object to functions that take arrays from the pipeline. Here's an illustration of the problem that you can copy-paste into your environment:
function Get-MyTest {
    param(
        [string[]]$MyInput
    )
    [array]$Output = @()
    $i = 0
    foreach ($Item in $MyInput) {
        $i++
        $Output += [pscustomobject]@{
            Name = $Item
            Id   = $i
        }
    }
    return $Output
}
function Get-IncorrectResults {
    param(
        [parameter(ValueFromPipeline)][array]$MyIncorrectInput
    )
    foreach ($Item in $MyIncorrectInput) {
        Write-Host "Current object: $Item `n "
    }
}
The Get function can return more than one object. Each object returned is a PSCustomObject, so if more than one is returned, the result becomes an array of PSCustomObjects. Here's the problem:
This works:
$Results = Get-MyTest -MyInput "Test1","Test2","Test3"
Get-IncorrectResults -MyIncorrectInput $Results
This only returns the first item:
$Results = Get-MyTest -MyInput "Test1","Test2","Test3"
$Results | Get-IncorrectResults
If the Get part returns more than one object, only the first object is passed on to Remove-MyModule. I tried changing the parameter definition from [pscustomobject[]] to [array], but it's the same result. What is the reason for dropping all but the first array item when piping, but not when passing it as a parameter?

The structure of an advanced PowerShell function is as follows:
function Function-Name {
    param(
        <# parameter definitions go here #>
    )
    begin {
        <# this runs once when the pipeline starts #>
    }
    process {
        <# this runs once for every input item piped to the function #>
    }
    end {
        <# this runs once at the end of the pipeline #>
    }
}
When you omit the begin/process/end blocks, PowerShell assumes your function body is really the end block - so this function definition:
function Function-Name {
    param()
    Do-Something
}
... is exactly the same as:
function Function-Name {
    param()
    begin {}
    process {}
    end { Do-Something }
}
Notice that the process block - the only piece of the function body that repeats with new input - is assumed empty, and the Do-Something statement will not be executed until after the last input item has been received.
This is why it looks like it "drops" everything but the last input object.
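A minimal way to observe this (the function name is just for illustration):

function Test-PipelineBinding {
    param(
        [Parameter(ValueFromPipeline)][string]$Value
    )
    # No process block: this body is an implicit end block,
    # so $Value only holds the last item that was bound.
    Write-Host "Value is: $Value"
}
"one", "two", "three" | Test-PipelineBinding
# Value is: three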
Place the code that consumes the pipeline-bound variable inside the process block and it'll work:
function Remove-MyModule {
    param(
        [parameter(ValueFromPipeline)][pscustomobject[]]$MyModules
    )
    process {
        foreach ($Module in $MyModules) {
            Invoke-RestMethod -Method Delete -Uri "MyURL" -Body (ConvertTo-Json $Module) -UseDefaultCredentials
        }
    }
}
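Applied to the Get-IncorrectResults function from the question, the same fix looks like this:

function Get-IncorrectResults {
    param(
        [parameter(ValueFromPipeline)][array]$MyIncorrectInput
    )
    process {
        # Now runs once for every piped item instead of once at the end
        foreach ($Item in $MyIncorrectInput) {
            Write-Host "Current object: $Item `n "
        }
    }
}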
For more information about the syntax and semantics of advanced functions, see the about_Functions_Advanced_Methods help file

Related

Using Pipeline to pass an object to another Powershell script [duplicate]

I am trying to write a PowerShell script that can get pipeline input (and is expected to do so), but trying something like
ForEach-Object {
    # do something
}
doesn't actually work when using the script from the command line as follows:
1..20 | .\test.ps1
Is there a way?
Note: I know about functions and filters. This is not what I am looking for.
In v2 you can also accept pipeline input (by property name or by value), add parameter aliases, etc.:
function Get-File {
    param(
        [Parameter(
            Position = 0,
            Mandatory = $true,
            ValueFromPipeline = $true,
            ValueFromPipelineByPropertyName = $true)]
        [Alias('FullName')]
        [String[]]$FilePath
    )
    process {
        foreach ($path in $FilePath) {
            Write-Host "file path is: $path"
        }
    }
}
# test ValueFromPipelineByPropertyName
dir | Get-File
# test ValueFromPipeline (byValue)
"D:\scripts\s1.txt","D:\scripts\s2.txt" | Get-File
- or -
dir *.txt | foreach {$_.fullname} | Get-File
This works and there are probably other ways to do it:
foreach ($i in $input) {
    $i
}
17:12:42 PS>1..20 | .\cmd-input.ps1
1
2
3
-- snip --
18
19
20
Search for "powershell $input variable" and you will find more information and examples.
A couple are here:
PowerShell Functions and Filters PowerShell Pro!
(see the section on "Using the PowerShell Special Variable “$input”")
"Scripts, functions, and script blocks all have access to the $input variable, which provides an enumerator over the elements in the incoming pipeline. "
or
$input gotchas « Dmitry’s PowerBlog PowerShell and beyond
"... basically $input in an enumerator which provides access to the pipeline you have."
This applies to the PowerShell command line, not the DOS-style Windows Command Processor (cmd.exe).
You can either write a filter, which is a special case of a function, like so:
filter SquareIt([int]$num) { $_ * $_ }
or you can create a similar function like so:
function SquareIt([int]$num) {
    Begin {
        # Executes once before first item in pipeline is processed
    }
    Process {
        # Executes once for each pipeline object
        $_ * $_
    }
    End {
        # Executes once after last pipeline object is processed
    }
}
The above works as an interactive function definition, or, if placed in a script, it can be dot-sourced into your global session (or another script). However, your example indicated you wanted a script, so here it is in a script that is directly usable (no dotting required):
--- Contents of test.ps1 ---
param([int]$num)
Begin {
    # Executes once before first item in pipeline is processed
}
Process {
    # Executes once for each pipeline object
    $_ * $_
}
End {
    # Executes once after last pipeline object is processed
}
With PowerShell V2, this changes a bit with "advanced functions", which imbue functions with the same parameter binding features that compiled cmdlets have. See this blog post for an example of the differences. Also note that in this advanced-function case you don't use $_ to access the pipeline object. With advanced functions, pipeline objects get bound to a parameter just like they do with a cmdlet.
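For example, an advanced-function version of the squaring example might look like this (a sketch; the function and parameter names are illustrative):

function SquareIt2 {
    [CmdletBinding()]
    param(
        [Parameter(ValueFromPipeline)]
        [int]$Number
    )
    process {
        # The current pipeline object is bound to $Number; no $_ needed
        $Number * $Number
    }
}
1..5 | SquareIt2   # 1 4 9 16 25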
The following are the simplest possible examples of scripts/functions that use piped input. Each behaves the same as piping to the "echo" cmdlet.
As Scripts:
# Echo-Pipe.ps1
Begin {
    # Executes once before first item in pipeline is processed
}
Process {
    # Executes once for each pipeline object
    echo $_
}
End {
    # Executes once after last pipeline object is processed
}

# Echo-Pipe2.ps1
foreach ($i in $input) {
    $i
}
As functions:
Function Echo-Pipe {
    Begin {
        # Executes once before first item in pipeline is processed
    }
    Process {
        # Executes once for each pipeline object
        echo $_
    }
    End {
        # Executes once after last pipeline object is processed
    }
}

Function Echo-Pipe2 {
    foreach ($i in $input) {
        $i
    }
}
E.g.
PS > . theFileThatContainsTheFunctions.ps1 # This includes the functions into your session
PS > echo "hello world" | Echo-Pipe
hello world
PS > cat aFileWithThreeTestLines.txt | Echo-Pipe2
The first test line
The second test line
The third test line

What is the best way to structure an advanced multi-threaded PowerShell function to work both on the pipeline as well as with non-pipeline calls?

I have a function that flattens directories in parallel for multiple folders. It works great when I call it in a non-pipeline fashion:
$Files = Get-Content $FileList
Merge-FlattenDirectory -InputPath $Files
But now I want to update my function to work both on the pipeline as well as when called off the pipeline. Someone on Discord recommended that the best way to do this is to defer all processing to the end block, and use the begin and process blocks to add pipeline input to a list. Basically this:
function Merge-FlattenDirectory {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory, Position = 0, ValueFromPipeline)]
        [string[]]
        $InputPath
    )
    begin {
        $List = [System.Collections.Generic.List[PSObject]]@()
    }
    process {
        if (($InputPath.GetType().BaseType.Name) -eq "Array") {
            Write-Host "Array detected"
            $List = $InputPath
        } else {
            $List.Add($InputPath)
        }
    }
    end {
        $List | ForEach-Object -Parallel {
            # Code here...
        } -ThrottleLimit 16
    }
}
However, this is still not working on the pipeline for me. When I do this:
$Files | Merge-FlattenDirectory
It actually passes individual arrays of length 1 to the function. So testing for ($InputPath.GetType().BaseType.Name) -eq "Array" isn't really the way forward, as only the first pipeline value gets used.
My million dollar question is the following:
What is the most robust way in the process block to differentiate between pipeline input and non-pipeline input? The function should add all pipeline input to a generic list, and in the case of non-pipeline input, should skip this step and process the collection as-is moving directly to the end block.
The only thing I could think of is the following:
if ((($InputPath.GetType().BaseType.Name) -eq "Array") -and ($InputPath.Length -gt 1)) {
    $List = $InputPath
} else {
    $List.Add($InputPath)
}
But this just doesn't feel right. Any help would be extremely appreciated.
You might just do
function Merge-FlattenDirectory {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory, Position = 0, ValueFromPipeline)]
        [string[]]
        $InputPath
    )
    begin {
        $List = [System.Collections.Generic.List[String]]::new()
    }
    process {
        $InputPath.ForEach{ $List.Add($_) }
    }
    end {
        $List | ForEach-Object -Parallel {
            # Code here...
        } -ThrottleLimit 16
    }
}
Which will process the input values either from the pipeline or the input parameter.
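For example, both invocation styles should feed the same values into the end block (the paths are illustrative):

# Pipeline input: process runs once per item
'C:\Temp\A', 'C:\Temp\B' | Merge-FlattenDirectory

# Parameter input: process runs once with the whole array
Merge-FlattenDirectory -InputPath 'C:\Temp\A', 'C:\Temp\B'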
But that doesn't comply with the Strongly Encouraged Development Guidelines to Support Well Defined Pipeline Input (SC02), especially Implement for the Middle of a Pipeline.
This means that if you want to implement the PowerShell pipeline correctly, you should process your items (in parallel) directly in the process block and immediately output any results from there:
function Merge-FlattenDirectory {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory, Position = 0, ValueFromPipeline)]
        [string[]]
        $InputPath
    )
    begin {
        $SharedPool = New-ThreadPool -Limit 16
    }
    process {
        $InputPath | ForEach-Object -Parallel -ThreadPool $Using:SharedPool {
            # Process your current item ($_) here ...
        }
    }
}
In general, script authors are advised to use idiomatic PowerShell, which often comes down to fewer object manipulations and usually results in a correct PowerShell pipeline implementation with less memory usage.
Please let me know if you intend to collect (and e.g. order) the output based on this suggestion.
Caveat
The full invocation of the ForEach-Object -Parallel cmdlet itself is somewhat inefficient, as you open and close a new pipeline on each iteration. Resolving this makes my general statement about idiomatic PowerShell fall apart a bit, but it should be doable by using a steppable pipeline.
To implement this, you might use the ForEach-Object cmdlet as a template:
[System.Management.Automation.ProxyCommand]::Create((Get-Command ForEach-Object))
And set the ThrottleLimit of the thread pool in the begin block:
function Merge-FlattenDirectory {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory, Position = 0, ValueFromPipeline)]
        [string[]]
        $InputPath
    )
    begin {
        $PSBoundParameters += @{
            ThrottleLimit = 4
            Parallel      = {
                Write-Host (Get-Date).ToString('HH:mm:ss.s') 'Started' $_
                Start-Sleep -Seconds 3
                Write-Host (Get-Date).ToString('HH:mm:ss.s') 'finished' $_
            }
        }
        $wrappedCmd = $ExecutionContext.InvokeCommand.GetCommand('ForEach-Object', [System.Management.Automation.CommandTypes]::Cmdlet)
        $scriptCmd = { & $wrappedCmd @PSBoundParameters }
        $steppablePipeline = $scriptCmd.GetSteppablePipeline($myInvocation.CommandOrigin)
        $steppablePipeline.Begin($PSCmdlet)
    }
    process {
        $InputPath.ForEach{ $steppablePipeline.Process($_) }
    }
    end {
        $steppablePipeline.End()
    }
}
1..5 | Merge-FlattenDirectory
17:57:40.40 Started 3
17:57:40.40 Started 2
17:57:40.40 Started 1
17:57:40.40 Started 4
17:57:43.43 finished 3
17:57:43.43 finished 1
17:57:43.43 finished 4
17:57:43.43 finished 2
17:57:43.43 Started 5
17:57:46.46 finished 5
Here's how I would write it with comments where I have changed it.
function Merge-FlattenDirectory {
    [CmdletBinding()]
    param (
        [Parameter(Mandatory, Position = 0, ValueFromPipeline)]
        $InputPath # <this may be a string, a path object, a file object,
                   # or an array
    )
    begin {
        $List = @() # Use an array for less than 100K objects.
    }
    process {
        # Even if InputPath is a string, foreach will iterate once and set $p.
        # If it is an array of strings, add each. If it is one or more objects,
        # try to find the right property for the path.
        foreach ($p in $InputPath) {
            if     ($p -is [String]) { $List += $p }
            elseif ($p.Path)         { $List += $p.Path }
            elseif ($p.FullName)     { $List += $p.FullName }
            elseif ($p.PSPath)       { $List += $p.PSPath }
            else   { Write-Warning "$p makes no sense" }
        }
    }
    end {
        $List | ForEach-Object -Parallel {
            # Code here...
        } -ThrottleLimit 16
    }
}
@iRon That "write for the middle of the pipeline" in the docs does not mean write everything in the process block.
function one { @(1, 2, 3, 4, 5) }
function two {
    param ([parameter(ValueFromPipeline = $true)] $p)
    begin   { Write-Host "Two begins"; $a = @() }
    process { Write-Host "Two received $P"; $a += $p }
    end     { Write-Host "Two ending"; $a; Write-Host "Two ended" }
}
function three {
    param ([parameter(ValueFromPipeline = $true)] $p)
    begin   { Write-Host "three Starts" }
    process { Write-Host "Three received $P" }
    end     { Write-Host "Three ended" }
}
one | two | three
One's body is treated as an end block.
One, two and three all run their begin blocks (one's is empty).
One's output goes to the process block in two, which just collects the data. Two's end block starts after one's end block finishes, and sends output.
At this point three's process block gets input. After two's end block ends, three's end block runs.
Two is "in the middle": it has a process block to deal with multiple piped items (if it were all one 'end' block, it would only process the last one).

Passing an array of information to a separate script in PowerShell

I have a script that grabs a series of information from SQL. It then parses the information and passes it to a series of arrays. I want to then pass each array to a separate script.
I've read that Start-Job should be able to do this, but from my testing it didn't seem to work. This is what I have tried. Each script individually works, and I am currently just using CSVs to pass the information.
Once the information is in the script I need to be able to call specific properties from each object. I did get it to just print the array as a string, but I couldn't call anything specific.
Invoke-Sqlcmd -Query $Q1 -ServerInstance $I -Database $DB | Export-Csv "$Files\Employees.csv"
$emps = Import-Csv "$Files\Employees.csv"
$newaccounts = @()
$deacaccounts = @()
$changedusers = @()
if (Test-Path -Path "$Files\Employees.csv") {
    foreach ($emp in $emps) {
        if ($emp.emp_num.trim() -ne $emp.EmpNum) {
            $newaccounts += $emp
        }
        if ($emp.emp_num.trim() -eq $emp.EmpNum) {
            if ($emp.fname -ne $emp.GivenName -and $emp.lname -ne $emp.SurName) {
                $deacaccounts += $emp
                $newaccounts += $emp
            }
            elseif ($emp.dept -ne $emp.DepartmentNumber -or $emp.job_title -ne $emp.JobTitle) {
                $changedusers += $emp
            }
        }
    }
}
Start-job -path "script" -argumentlist (,$deacaccounts)
Start-job -path "script" -argumentlist (,$changedusers)
Start-job -path "script" -argumentlist (,$newaccounts)
EDIT:
The information passed to the scripts would be multiple lines of employee data. I need to be able to grab that info in the "sub" scripts and perform actions based on them.
EX:
Deacaccounts =
fname  lname  empnum
ted    kaz    1234
sam    cart   245
If you really need background jobs - it turns out that you don't - note that Start-Job doesn't have a -Path parameter; you'd have to use -ScriptBlock { & "$script" } instead.
To simply invoke the script in the foreground, in sequence, use the following (script representing your .ps1 file path(s)):
& "script" $deacaccounts
& "script" $changedusers
& "script" $newaccounts
Note: &, the call operator, is only needed if the script / executable path is quoted and/or contains variable references (or subexpresions); e.g., a script with path c:\foo\bar.ps1 may be invoked without &; e.g.
c:\foo\bar.ps1 $deacaccounts
Note that your script(s) will receive a single argument each, containing an array of values.
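For example, the receiving script might declare a single array parameter like this (a sketch; the parameter name is illustrative):

# script.ps1
param(
    [object[]]$Accounts   # the whole array arrives as one argument
)
foreach ($account in $Accounts) {
    # act on each employee record's properties
    Write-Host "Processing $($account.fname) $($account.lname)"
}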
If, instead, you wanted to pass the array elements as individual (positional) arguments, you'd have to use splatting, where you use sigil @ instead of $ to pass your variable (e.g., & "script" @deacaccounts).
If you need to enumerate the arrays and pass each object individually as a parameter, use the following:
foreach ($obj in $deaccounts) { & "script" $obj }
foreach ($obj in $changedusers) { & "script" $obj }
foreach ($obj in $newaccounts) { & "script" $obj }
If each object should be splatted positionally based on its property values:
foreach ($obj in $deacaccounts) {
    $vals = $obj.psobject.Properties.Value
    & "script" @vals
}
# ... ditto for $changedusers and $newaccounts
If each object should be splatted by property names, based on both property names and values, you need to convert each object to a hashtable first:
foreach ($obj in $deacaccounts) {
    $params = @{}
    foreach ($prop in $obj.psobject.Properties) {
        $params[$prop.Name] = $prop.Value
    }
    & "script" @params
}
# ... ditto for $changedusers and $newaccounts
As an aside: incrementally extending arrays in a loop with += is inefficient, because arrays are of fixed size, so a new array must be created behind the scenes in every iteration.
In general, a much more efficient approach is to use a foreach loop as an expression and let PowerShell itself collect the outputs in an array: [array] $outputs = foreach (...) { ... } - see this answer.
In case you need to create arrays manually, e.g. to create multiple ones, such as in your case, use an efficiently extensible list type, such as [System.Collections.Generic.List[object]] - see this answer.
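For example, a sketch of the question's code using such a list (only the first bucket shown):

$newaccounts = [System.Collections.Generic.List[object]]::new()
foreach ($emp in $emps) {
    if ($emp.emp_num.Trim() -ne $emp.EmpNum) {
        $newaccounts.Add($emp)   # appends in place; no array copy per iteration
    }
}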

Echo in while loop gets added to my return value

I wasn't sure how to describe this problem in the title, so here goes.
I call a function from a script in another script. In that function I have a while loop that basically keeps looping through a set of IPs and looks up their hostnames, until the while loop times out or we have all the hostnames.
It then returns the hostnames.
My problem is that the return value contains every single Write-Host I'm doing in that function.
I know it's because Write-Host puts stuff on the pipeline and the return just returns whatever it has.
How do I go about fixing this?
The entire script I run gets logged in a log file, which is why I want to have some verbose logging.
| Out-Null on Write-Host fixes the issue, but then it doesn't print the Write-Host values in the script.
In main.psm1 I have a function like so:
$nodes = @("ip1", "ip2", "ip3", "ip4")
$nodesnames = DoStuff -nodes $nodes
Then in functions.psm1 I have functions like:
Function DoStuff
{
    param($nodes)
    $timeout = 300
    $timetaken = 0
    $sleepseconds = 5
    $nodenames = @("$env:COMPUTERNAME")
    while (($nodenames.count -lt $nodes.count) -and ($timetaken -lt $timeout))
    {
        try
        {
            Write-Host "Stuff"
            foreach ($node in $nodes)
            {
                $nodename = SuperawesomeFunction $node
                Write-Host "$nodename"
                if ($nodenames -notcontains $nodename)
                {
                    $nodenames += @($nodename)
                }
            }
        }
        catch
        {
            Write-Host "DoStuff Failed because $_"
        }
        Start-Sleep $sleepseconds
        $timetaken += $sleepseconds
    }
    return $nodenames
}
Function SuperawesomeFunction
{
    param($node)
    $nodename = [System.Net.Dns]::GetHostEntry("$node")
    return $nodename
}
Thanks.
So the answer is: your function is working like it is by design. In PowerShell, a function returns its output to the pipeline unless specifically directed otherwise.
You used Echo before, which is an alias of Write-Output, and output is passed down the pipe as I mentioned before. As such it would be collected along with the returned $nodenames array.
Replacing Echo with Write-Host changes everything because Write-Host specifically tells PowerShell to send the information to the host application (usually the PowerShell Console or PowerShell ISE).
How do you avoid this? You could add a parameter specifying a path for a logfile, and have your function update the logfile directly, and only output the relevant data.
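A minimal sketch of that log-file approach (the -LogPath parameter is illustrative):

Function DoStuff
{
    param($nodes, $LogPath)
    $nodenames = @("$env:COMPUTERNAME")
    foreach ($node in $nodes)
    {
        Add-Content -Path $LogPath -Value "Resolving $node"   # goes to the file, not the output stream
        $nodenames += SuperawesomeFunction $node
    }
    return $nodenames   # only the data is returned
}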
Or you can make an object with a pair of properties that gets passed back down the pipe: the DNS results in one property, and the errors in another.
You could use Write-Error in the function, and set it up as an advanced function to support -ErrorVariable and capture the errors in a separate variable. To be honest, I'm not sure how to do that - I've never done it - but I'm 90% sure that it can be done.
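For reference, a minimal sketch of that idea, assuming the Write-Host calls in the catch block can be replaced:

Function DoStuff
{
    [CmdletBinding()]   # enables common parameters such as -ErrorVariable
    param($nodes)
    foreach ($node in $nodes)
    {
        try { SuperawesomeFunction $node }
        catch { Write-Error "DoStuff failed on $node because $_" }
    }
}
$nodenames = DoStuff -nodes $nodes -ErrorVariable DoStuffErrors
# $DoStuffErrors now holds the errors, separate from $nodenames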

How to Pass Multiple Objects via the Pipeline Between Two Functions in PowerShell

I am attempting to pass a list of objects from one function to another, one by one.
First function: generate a list of users (objects) near expiry;
Second function: send an email to each user (object)
The first function works fine and outputs a group of objects (or so it would seem) and the second function will accept input and email a single user without issue.
Issues arise only when multiple objects are passed from the first function to the second.
Relevant code snippets are below:
The first function creates a custom object for each located user and adds it to an array, which is then output in the end block. Below is an extremely simplified snippet of the code with the essential object-creation step:
Function 01
{
    # param block goes here etc...
    Foreach ($user in $users)
    {
        $userOutput = @()
        $userTable = New-Object PSObject -Property @{
            name                 = $User.Name
            SamAccountName       = $User.SamAccountName
            emailAddress         = $User.EmailAddress
            expired              = $user.PasswordExpired
            expiryDate           = $ExpiryDate.ToShortDateString()
            daysTillExpiry       = $daysTillExpiry
            smtpRecipientAddress = $User.EmailAddress
            smtpRecipientName    = $User.Name
        }
        $userOutput += $userTable
    }
    Write-Output $userOutput
}
I have also tried writing each custom object ($userTable) straight to the console within each iteration of the Foreach (users) loop.
The second function accepts pipeline input for a number of matching parameters from the first function, e.g.:
[Parameter(Mandatory=$true,ValueFromPipeline=$true,ValueFromPipelineByPropertyName=$true)][string]$smtpRecipientName
The second function also calls a third function designed specifically to send smtp mail and contains no loops, it just takes the current object from the pipeline and deals with it.
I haven't included the full code for either mail function because it is largely irrelevant. I just want to know whether the objects outputted from the first function can be dealt with one-by-one by the second.
At present, the mail function deals with the first object passed to it, and no others.
Update:
This is what I have in mind (but the second function only deals with the last object that was piped in):
Function Test-UserExp
{
    $iteration = 0
    For ($i = 0; $i -le 9; $i++)
    {
        $iteration++
        $userTable = New-Object PSObject -Property @{
            expiryDate           = "TestExpDate_$iteration"
            daysTillExpiry       = "TestDaysTillExpiry_$iteration"
            smtpRecipientAddress = "TestSMTPRecipientAddress_$iteration"
            smtpRecipientName    = "TestSMTPRecipientName_$iteration"
        }
        $userTable
    }
}
Function Test-MailSend
{
    Param
    (
        [Parameter(ValueFromPipelineByPropertyName = $true)][string]$expiryDate,
        [Parameter(ValueFromPipelineByPropertyName = $true)][string]$daysTillExpiry,
        [Parameter(ValueFromPipelineByPropertyName = $true)][string]$smtpRecipientAddress,
        [Parameter(ValueFromPipelineByPropertyName = $true)][string]$smtpRecipientName
    )
    Write-Host 'Output from Test-MailSend:'
    $expiryDate
    $daysTillExpiry
    $smtpRecipientAddress
    $smtpRecipientName
}
First of all: if you want to process objects in a pipeline one at a time, do not kill the pipeline experience by collecting all the objects first - that's only necessary if you intend to do something with the whole collection at some point. If not, then just output objects as soon as you get them:
foreach ($user in $users) {
    New-Object PSObject -Property @{
        name           = $User.Name
        SamAccountName = $User.SamAccountName
        emailAddress   = $User.EmailAddress
        # ...
    }
}
In your case you output the whole collection at the end. That's hardly a pipeline experience, if you ask me.
For the second command: if you intend to create a parameter for each property, just leave out the 'ValueFromPipeline' part. Otherwise you may end up with the whole object converted to a string... If you want to take the object as a whole, leave out 'ValueFromPipelineByPropertyName' and specify the correct type. And make sure you have process {} wrapped around the code that uses the parameters taken from the pipeline.
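For example, taking the whole object by value might look like this (a sketch; the parameter name is illustrative):

Function Test-MailSend
{
    Param
    (
        [Parameter(ValueFromPipeline = $true)]
        [psobject]$User
    )
    Process
    {
        # runs once per piped object
        "$($User.smtpRecipientName) <$($User.smtpRecipientAddress)>"
    }
}
Test-UserExp | Test-MailSend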
And finally: why would you write a function to send mail? You have Send-MailMessage, so unless you need something this cmdlet doesn't cover, you probably don't need a hand-crafted replacement...
In function 1 you want to create the array before the ForEach loop, so you aren't re-creating the array every iteration.
In the param block for the second function, you want to declare the parameter as an array of strings, not just a string.
Finally, when accepting pipeline input for the second function you will need to use the Begin, Process, and End blocks. The part of the function that repeats for each item should be in the Process block.
Here is a short working sample below:
Function fun1 {
    $users = @(1, 2, 3)
    $userOutput = @()
    Foreach ($user in $users) {
        $userTable = New-Object PSObject -Property @{
            emailAddress = "$user@blah.com"
        }
        $userOutput += $userTable
    }
    $userOutput
}
Function fun2 {
    param(
        [parameter(ValueFromPipeLine = $true)]
        [String[]]$Recipients
    )
    begin {}
    process {
        ForEach ($Recipient in $Recipients) {
            $_
        }
    }
    end {}
}
fun1 | Select emailAddress | fun2
This will give you the output below:
emailAddress
------------
1@blah.com
2@blah.com
3@blah.com
Here is a great breakdown of how the Begin/Process/End blocks work in PowerShell http://technet.microsoft.com/en-us/magazine/hh413265.aspx
function Set-UserExpiry {
    1..10 | foreach {
        [PSCustomObject]@{
            ExpiryDate           = "TestExpDate_$_"
            DaysTillExpiry       = "TestDaysTillExpiry_$_"
            SmtpRecipientAddress = "TestSMTPRecipientAddress_$_"
            SmtpRecipientName    = "TestSMTPRecipientName_$_"
        }
    }
}
function Test-UserExpiry {
    param
    (
        [Parameter(ValueFromPipelineByPropertyName = $true)]
        [string]$ExpiryDate,
        [Parameter(ValueFromPipelineByPropertyName = $true)]
        [string]$DaysTillExpiry,
        [Parameter(ValueFromPipelineByPropertyName = $true)]
        [string]$SmtpRecipientAddress,
        [Parameter(ValueFromPipelineByPropertyName = $true)]
        [string]$SmtpRecipientName
    )
    process {
        Write-Output 'Output from Test-MailSend:'
        $ExpiryDate
        $DaysTillExpiry
        $SmtpRecipientAddress
        $SmtpRecipientName
        Write-Output ''
    }
}
Set-UserExpiry | Test-UserExpiry