Scheduling PowerShell changes ObjectType - powershell

I've written a little script that checks for differences between 2 text files.
$new = get-content $outPutFile
$old = get-content $outPutFileYesterday
$result = $null
$result = Compare-Object $old $new
$resultHTML = $result.GetEnumerator() | ConvertTo-Html
Send-MailMessage -SmtpServer 10.14.23.4 -From me@mail.com -To $toAddress -Subject "DiffTest" -Body "$resultHTML" -BodyAsHtml
When I run it from an active PowerShell prompt, all is well. However, when I try to schedule it to run daily I get this error on the run-time (the block above is in a try catch that mails any execution errors):
Method invocation failed because [System.Management.Automation.PSCustomObject] doesn't contain a method named 'GetEnumerator'.
How can I fix this?

The script may run in a different user context when scheduled, potentially with a different set of read/write permissions on the filesystem.
However, in PowerShell arrays are automatically enumerated when used in expressions, so you don't need to
call the GetEnumerator() method before passing the result to ConvertTo-Html.
You could start by changing your script to:
$resultHTML = $result | ConvertTo-Html
and see how it impacts the result.

Compare-Object returns one of three things:
$null: if the ReferenceObject and the DifferenceObject are equal
an object of type PSCustomObject: if only one item differs (1)
an array of objects: if multiple differences have been found
Of these return values only the last one (the array) has a GetEnumerator() method. ConvertTo-Html produces the expected output when fed any of these return values, so you can safely drop the .GetEnumerator() call (as mentioned by Enrico). Another option is to wrap $result in an array, which changes the $resultHTML line of your script to:
$resultHTML = @($result).GetEnumerator() | ConvertTo-Html
(1) This is the return value for Compare-Object in your script
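A small sketch of the three return shapes, using made-up inputs, and why the @() wrapping makes the method call safe:

```powershell
# Hypothetical inputs to illustrate the three possible return shapes:
$none = Compare-Object @('a','b') @('a','b')       # $null: inputs are equal
$one  = Compare-Object @('a','b') @('a','b','c')   # a single PSCustomObject
$many = Compare-Object @('a','b') @('c','d')       # an array of objects

$one.GetType().Name       # PSCustomObject - no GetEnumerator() method here
@($one).GetEnumerator()   # wrapping in @() always yields an array, so this succeeds
```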

Related

PowerShell not using my variable in command

I have this bit of simple code where I'm passed a variable for the username. I need to parse that and then use it in a path for a copy command.
I've been writing the variables out to a text file to help troubleshoot the problem.
The variable parses and outputs to the text file properly, but when I use it in my path variable it shows up empty.
The Code:
param ([String] $mdmUserName)
$mdmUserName | Out-File "C:\Windows\Temp\test.txt"
$FullUserSplit = $mdmUserName.Split("\")
$FullUserSplit | Out-File -append "C:\Windows\Temp\test.txt"
$localusername = $FullUserSplit[2]
$localusername | Out-File -Append "C:\Windows\Temp\test.txt"
$from = "C:\Windows\Temp\Normaltest.dotm"
$to = "C:\Users\$localusername\AppData\Roaming\Microsoft\Templates\"
$to | Out-File -Append "C:\Windows\Temp\test.txt"
Copy-Item $from $to -Force
The output of the test.txt file:
Win11\User
Win11
User
C:\Users\\AppData\Roaming\Microsoft\Templates\
You can see that it outputs the $localusername variable correctly to the test.txt, but then when added to the path it is not there. I feel like I am missing something simple.
I also tried setting $mdmUserName manually to "Win11\User" with the same result.
Arrays are zero based (they start with 0), thus: $FullUserSplit[1] or $FullUserSplit[-1] (which selects the last entry) – iRon
iRon's comment was the answer: I needed to use $FullUserSplit[1].
Outputting $FullUserSplit to the troubleshooting file was throwing me off, as it output the full array on two lines.
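A quick sketch of why index [2] came back empty:

```powershell
# "Win11\User".Split("\") produces a two-element, zero-based array:
$FullUserSplit = "Win11\User".Split("\")
$FullUserSplit[0]    # Win11
$FullUserSplit[1]    # User
$FullUserSplit[-1]   # User (negative indices count back from the end)
$FullUserSplit[2]    # $null - out of range, which is what left the path empty
```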

Suppress Array List Add method pipeline output

I am using an Array List to build a sequence of log items to later log. Works a treat, but the Add method emits the current index to the pipeline. I can address this by sending it to $null, like this
$strings.Add('junk') > $null
but I wonder if there is some mechanism to globally change the behavior of the Add method. Right now I have literally hundreds of > $null repetitions, which is just ugly. Especially when I forget one.
I really would like to see some sort of global variable that suppresses all automatic pipelining. When writing a large script I want to intentionally send to the pipeline, as unexpected automatic send to pipeline is a VERY large fraction of my total bugs, and the hardest to find.
You could wrap your ArrayList in a custom object with a custom Add() method.
$log = New-Object -Type PSObject -Property @{
Log = New-Object Collections.ArrayList
}
$log | Add-Member -Type ScriptMethod -Name Add -Value {
Param(
[Parameter(Mandatory=$true)]
[string]$Message
)
$this.Log.Add($Message) | Out-Null
}
$log.Log.Add('some message') # output on this is suppressed
So, this thread getting resurrected has led me to "answer" my own question, since I discovered long ago that ArrayLists have been deprecated and Collections.Generic.List<T> is the preferred solution, as pointed out by @santiago-squarzon today.
So, for anyone wondering
$log = [System.Collections.Generic.List[String]]::new()
or the older New-Object way
$log = New-Object System.Collections.Generic.List[String]
to instantiate the collection, then happily
$log.Add('Message')
with no pipeline pollution to worry about. You can also add multiple items at once with
$log.AddRange()
With the range being another list, or an array if you cast to List first.
And you can insert a message with something like
$log.Insert(0, 'Message')
So yeah, lots of flexibility and no pollution. Winning.
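The difference between the two collection types can be seen side by side in a small sketch:

```powershell
# ArrayList.Add() returns the index of the new element, which leaks into the
# pipeline; List[string].Add() returns void, so there is nothing to suppress.
$arrayList = [System.Collections.ArrayList]::new()
$leaked = $arrayList.Add('first')      # $leaked is 0: the index Add() emitted

$list = [System.Collections.Generic.List[string]]::new()
$list.Add('first')                     # no output at all
$list.AddRange([string[]]@('second', 'third'))
$list.Insert(0, 'zeroth')
$list -join ', '                       # zeroth, first, second, third
```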

Using pipeline object to populate mail -to and -attachment

First ever PowerShell script, so any advice or recommendations are appreciated. I'm parsing a .csv into smaller .csv's to send out information about servers to recipients, and I'm running into a problem in my foreach. How do I get this to work?
One interesting thing is that the -To parameter of Send-MailMessage supposedly doesn't accept pipeline objects. It still throws an error, but it still sends the emails; the attachment, however, never sends.
#had to set this as a variable because @ was throwing splatting errors
$Mail = "@Email.com"
#Import csv and split information, exports UID.csv
Import-csv C:\path\info.csv | Group-Object UID |
ForEach-Object {
$_.Group | Export-csv "C:\path\$($_.Name).csv" -NoTypeInformation
}
#Import file again to get unique list of UID and send mail with
#respective UID.csv
Import-csv C:\path\info.csv | Group-Object UID |
ForEach-Object {
$_.UID | Send-MailMessage -From "<Me@email.com>" -To "<$($_.Name)$Mail>" `
-Attachments "C:\path\$($_.Name).csv" `
-Subject "Testing" -Body "Please Ignore This" -Priority High `
-SmtpServer smtp.server.com
}
in Send-MailMessage, -to should not accept pipeline objects
In principle it does, namely if the pipeline objects have a .To property (which is not the case for you).
However, with your current approach, you don't need pipeline input at all, given that you're supplying all input as arguments.
Additionally, your pipeline input is incorrect, because $_.UID sends $null through the pipeline, given that $_ - a group-info object output by Group-Object - doesn't have a .UID property.
Using delay-bind script blocks ({ ... }), you can simplify your command as follows, obviating the need for a ForEach-Object call:
Import-csv C:\path\info.csv | Group-Object UID |
Send-MailMessage -From "<Me@email.com>" -To { "<$($_.Name)@Email.com>" } `
-Attachments { "C:\path\$($_.Name).csv" } `
-Subject "Testing" -Body "Please Ignore This" -Priority High `
-SmtpServer smtp.server.com
In short, the script blocks passed to -To and -Attachments are evaluated for each input object, and their output determines the parameter value in each iteration. In the script block, $_ represents the pipeline object at hand, as usual.
Note that such delay-bind script blocks can only be used with parameters that are designed to accept pipeline input (irrespective of whether by value (whole object) or by a specific property's value).
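Delay-bind script blocks aren't specific to Send-MailMessage; the same mechanism works with any cmdlet parameter that accepts pipeline input. A minimal sketch with a hypothetical folder (-WhatIf so nothing is actually renamed):

```powershell
# -NewName accepts pipeline input by property name, so the script block is
# evaluated once per input file, with $_ bound to the current FileInfo object.
Get-ChildItem C:\logs\*.txt |
  Rename-Item -NewName { $_.Name -replace '\.txt$', '.log' } -WhatIf
```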

Send-MailMessage shortens lines in message when triggered by Task Scheduler and breaks FullNames

I'm not able to find a reason why lines in emails are shortened when the script is triggered by Task Scheduler (lines aren't shortened when the script is executed manually from the ISE!). I'd like to pass FullName to the email and use it as a link to the document (when the path and filename don't contain spaces, the link works great).
If I use "format-list" instead of "format-table" it looks better (even when triggered by Task Scheduler), and I have to add "$body = $newdoc | Out-String -Width 255" to prevent the lines breaking - but spaces in filenames still break the links.
The next problem is that FullName contains spaces - I tried many ways to work around them (like $variable.replace; $var = $var -replace " ","` "; etc.) without success.
#date and time formating
$culture = Get-Culture
$culture.DateTimeFormat.LongTimePattern = 'HH:mm'
$culture.DateTimeFormat.ShortDatePattern = 'dd-MM-yyyy'
Set-Culture $culture
#find files changed during last hour, sort descending
$newdoc = get-childitem -File -Path \\ottm09itoms01\OTA-IT_Operators\ -Recurse | ? {$_.LastWriteTime -gt (Get-Date).AddHours(-1)} | sort lastwritetime -Descending | Format-table -Property LastWriteTime, fullname
$body = $newdoc | Out-String
$enc = New-Object System.Text.utf8encoding
Send-MailMessage -From $sender -To $receiver2 -Subject "Documents updated" -body $body -Encoding $enc -SmtpServer $SMTPserver
After some digging, I suspect the problem is your use of Out-String which truncates output based on the -width parameter, which you have left unspecified. To quote the documentation:
-width
Specifies the number of characters in each line of output. Any additional characters are truncated, not wrapped. If you omit this parameter, the width is determined by the characteristics of the host program. The default value for the Windows PowerShell console is 80 (characters).
In other words, when you run this script in the ISE, Out-String probably sets the width to whatever the ISE's buffer width is, but when Task Scheduler runs it, it uses the default 80-character width.
So basically just add -width 120 (or the value of your choosing) to your Out-String and see if that fixes the problem.
To fix the links breaking on whitespace you might have to manually generate some HTML for them using -replace. Something like:
$body = $body -replace '(\\\\.*[^\s])','<a href="$1">$1</a>'
$body = $body.trim() -replace "`n","<br>`n"
This assumes that all your paths are UNC paths (i.e., paths starting with \\). You'd then need to throw the -BodyAsHtml on your Send-MailMessage command. This is kind of thrown together and I'm sure there's probably a better way of doing things, but it should work.
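To see what that -replace does, here is the idea applied to one sample line (the date and path are made up):

```powershell
# The regex grabs everything from the leading \\ to the last non-space
# character, so the whole path - spaces included - ends up inside one href
# and mail clients no longer break the link at the first space.
$line = '01-02-2024 09:15 \\server\share\My Document.docx'
$html = $line -replace '(\\\\.*[^\s])', '<a href="$1">$1</a>'
$html   # 01-02-2024 09:15 <a href="\\server\share\My Document.docx">\\server\share\My Document.docx</a>
```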

Powershell - avoid repeating arguments of Out-File

Is there any way to avoid passing parameters to a function, like "-Append $outfile" to Out-File, every time? I have a script which collects data from the system, something like:
... collect OS information ... | Out-File -Append $output
... collect local users ... | Out-File -Append $output
... collect logfile permissions ... | Out-File -Append $output
etc.
The last command in the pipe is most of the time Out-File -Append $output - can this be done more elegant? I had different ideas:
Create a wrapper function which passes the needed parameters to Out-File command - already tried, but I had problems to make it pipe-compatible
Write all output into a String-Variable and write the content at the end of all commands into the file - needs a lot of memory
Create something like an Output-Writer-Object which only receives once at initialization the necessary paramters - not tried yet
Thank you very much for your help!
You don't appear to be using a lot of arguments for this to be incredibly useful, but a good suggestion would be to use splatting. I added some more parameters to illustrate how clean it can make code appear while still being functional.
$options = @{
Append = $True
FilePath = $output
Encoding = "Unicode"
Width = 400
}
Build a hashtable of options and splat the cmdlet with them:
... collect OS information ... | Out-File @options
... collect local users ... | Out-File @options
... collect logfile permissions ... | Out-File @options
Outside of that, a wrapper function (or filter, if it is easier), like you suggest, would be another option. Look at the options in this answer, specifically the filter.
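A minimal sketch of such a filter (Out-Log is a hypothetical name; $output is assumed to hold the log path, as in the question):

```powershell
# A pipeline-compatible wrapper: 'filter' is shorthand for a function whose
# body runs once per pipeline object, with the object bound to $_.
filter Out-Log { $_ | Out-File -Append -FilePath $output }

# usage - the repeated Out-File arguments disappear from every call site:
Get-Date | Out-Log
```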
You want to use the $PSDefaultParameterValues preference variable. Something like this:
$PSDefaultParameterValues = @{
"Out-File:Encoding"="utf8";
"Out-File:Append"=$true;
"Out-File:FilePath"=$output
}
This feature is especially useful when you must specify the same alternate parameter value nearly every time you use the command or when a particular parameter value is difficult to remember, such as an email server name or project GUID.
Or put everything inside a function or script block. Note that Out-File defaults to UTF-16 encoding and can mix encodings, as opposed to Add-Content.
& {
... collect OS information ...
... collect local users ...
... collect logfile permissions ...
} | add-content $output