Making variables visible inside a PowerShell workflow - powershell

I have five variables defined before a workflow that need to be available inside the workflow, but I can't figure out how to do it.
Moving the variables inside the workflow makes them visible, but then the CSV import adds extra workflow-related properties to the objects that I don't want.
Code as follows:
$source = 'C:\Users\Koozer\a place\'
$rotateParams = 90, 90, 270
$cropParams = @(64, 64), (32, 0)
$images = Import-Csv "${source}images.csv"
$colNames = $images[0].psobject.properties.Name

Workflow StitchCropWorkflow {
    foreach -parallel ($imageSet in $images) {
        $magickRotParams = ''
        $n = 0
        foreach ($image in $colNames) {
            $magickRotParams += '`( '''+$source+$imageSet.($image)+''' -rotate '+$rotateParams[$n]+' `) '
            $n++
        }
        $finfo = [io.fileinfo]$imagePathSets[0]
        $command = 'magick '+$magickRotParams+' +append -crop '+$cropParams[0][0]+'x'+$cropParams[0][1]+'+'+$cropParams[1][0]+'+'+$cropParams[1][1]+' +repage '''+$finfo.DirectoryName+'\'+$finfo.BaseName+'_stitch_crop'+$finfo.Extension+''''
        echo $command
        Invoke-Expression $command
    }
}
StitchCropWorkflow

You can pass parameters to a workflow just as you would for a function:
$source = 'C:\Users\Koozer\a place\'
$rotateParams = 90, 90, 270
$cropParams = @(64, 64), (32, 0)
$images = Import-Csv "${source}images.csv"
$colNames = $images[0].psobject.properties.Name

Workflow StitchCropWorkflow {
    param (
        $source,
        $rotateParams,
        $cropParams,
        $images,
        $colNames
    )
    # your code here
}
StitchCropWorkflow $source $rotateParams $cropParams $images $colNames
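The positional call above relies on the parameters being bound in declaration order. If you prefer to be explicit, a minimal variant of the same invocation binds each workflow parameter by name (same variables as above):

# Same call, but with each workflow parameter bound by name
StitchCropWorkflow -source $source `
                   -rotateParams $rotateParams `
                   -cropParams $cropParams `
                   -images $images `
                   -colNames $colNames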

If you define a variable in a script that contains a workflow, the syntax to access that variable within the workflow is $using:variable.
As an example, here is a PowerShell script containing a workflow with a single param.
param([string]$ComputerName)
$VerbosePreference = "Continue"

workflow Start-Reboot {
    param($server)
    Write-Verbose "Input server is $server"
    InlineScript {
        New-Item -Path C:\Scripts\Logging\Start-Reboot-Single.log -ItemType File -ErrorAction SilentlyContinue | Out-Null
    }
    Sequence {
        InlineScript
        {
            Try
            {
                Get-WmiObject -ComputerName $using:server -Class Win32_Service -Filter "Name='HealthService'" | Invoke-WmiMethod -Name PauseService | Out-Null
                $operatingSystem = Get-WmiObject Win32_OperatingSystem -ComputerName $using:server -ErrorAction stop
                $LastReboot = [Management.ManagementDateTimeConverter]::ToDateTime($operatingSystem.LastBootUpTime).ToString().Trim()
                Write-Verbose "Last reboot time for $using:server is $LastReboot"
                Write-Verbose "Here we restart $using:server, for testing no reboot is done"
                Restart-Computer -ComputerName $using:server -Wait -For Wmi -Force
                $OSRecheck = Get-WmiObject Win32_OperatingSystem -ComputerName $using:server -ErrorAction stop
                $CurrentReboot = [Management.ManagementDateTimeConverter]::ToDateTime($OSRecheck.LastBootUpTime).ToString().Trim()
                $props = [Ordered]@{
                    Server=$using:server
                    LastReboot=$LastReboot
                    CurrentReboot=$CurrentReboot
                }
            } Catch {
                Write-Verbose "Oh no, problem with $using:server"
                $rnd = Get-Random -Minimum 1 -Maximum 5
                Start-Sleep -Seconds $rnd
                $err = $_.Exception.GetType().FullName
                $props = [Ordered]@{
                    Server=$using:server
                    LastReboot=$err
                    CurrentReboot=$null
                }
            } Finally {
                $object = New-Object -TypeName PSObject -Property $props
                $random = Get-Random -Minimum 2 -Maximum 15
                Start-Sleep -Seconds $random
                Write-Output $object | Out-File -Append -FilePath C:\Scripts\Logging\Start-Reboot-Single.log
            }
        }#inline end
    }#sequence end
}#end workflow block
Start-Reboot -server $ComputerName
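Stripped down to its essentials, the pattern the example relies on is that a variable living at the workflow level is read inside an InlineScript block through the $using: scope modifier. A minimal sketch with hypothetical names:

workflow Show-UsingScope {
    param($path)                 # workflow-level variable
    InlineScript {
        # $path is not visible here by itself; $using:path copies its value in
        Write-Output "InlineScript sees: $using:path"
    }
}
Show-UsingScope -path 'C:\temp'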

Related

PowerShell: How to incorporate if/elseif statements for adding columns into a CSV file?

I'd like the following code to add the specified columns if it finds the appropriate graphics adapter in a PC.
Right now, my if/elseif statements are throwing all kinds of errors, and I'm thinking it's because I put them in the wrong section of the code. The columns are not being generated the way I would like.
Any advice?
# User needs to create a txt file containing hostnames.
function ReadHostnames($initialDirectory) {
    [void] [System.Reflection.Assembly]::LoadWithPartialName('System.Windows.Forms')
    $OpenFileDialog = New-Object System.Windows.Forms.OpenFileDialog
    if ($initialDirectory) { $OpenFileDialog.initialDirectory = $initialDirectory }
    $OpenFileDialog.filter = 'All files (*.*)|*.*'
    [void] $OpenFileDialog.ShowDialog()
    return $OpenFileDialog.FileName
}
($FilePermissions = ReadHostnames C:\)
$FilePermissions = Get-Content $FilePermissions
write-host "Please wait while gathering information..."
$counter = 0
foreach ($computernames in $FilePermissions)
{
    Write-host "Processing $computernames ($counter/$($FilePermissions.count))"
    IF (Test-Connection -BufferSize 32 -Count 1 -ComputerName $computernames -Quiet)
    {
        $Computersystem = Get-WMIObject Win32_ComputerSystem -ComputerName $computernames -AsJob
        $videocontroller = Get-WmiObject win32_videocontroller -ComputerName $computernames -AsJob
        $bioscontroller1 = Get-WmiObject win32_bios -ComputerName $computernames -AsJob
        $bioscontroller2 = Get-WmiObject -Class:Win32_ComputerSystem -ComputerName $computernames -AsJob
        $userlogon = Get-CimInstance -ClassName Win32_ComputerSystem -Property UserName -ComputerName $computernames
        Wait-Job -Job $Computersystem,$videocontroller,$bioscontroller -Timeout 10 | out-Null
        $computersystem_output = Receive-Job $Computersystem
        $intelvideocontroller_output = Receive-Job $videocontroller | ? {$_.name -ilike "*Intel*"}
        $nvidiavideocontroller_output = Receive-Job $videocontroller | ? {$_.name -ilike "*NVIDIA*"}
        $AMDvideocontroller_output = Receive-Job $videocontroller | ? {$_.name -ilike "*AMD*"}
        $bioscontroller1_output = Receive-Job $bioscontroller1
        $bioscontroller2_output = Receive-Job $bioscontroller2
        # Creating spreadsheet headers
        $newrow = [Pscustomobject] @{
            Host_name = $computersystem_output.name
            Model_Name = $bioscontroller2_output.Model
            Serial_Number = $bioscontroller1_output.SerialNumber
            BIOS_Version = $bioscontroller1_output.SMBIOSBIOSVersion
            Last_User_Logon = $userlogon.UserName
            If ($intelvideocontroller_output -ilike "*Intel*")
            { Intel_Card = $intelvideocontroller_output.name
              IntelGraphics_Version = $intelvideocontroller_output.DriverVersion}
            ElseIf ($nvidiavideocontroller_output -ilike "*NVIDIA*")
            { Nvidia_Card = $nvidiavideocontroller_output.name
              NvidiaGraphics_Version = $nvidiavideocontroller_output.DriverVersion }
            ElseIf ( $AMDvideocontroller_output -ilike "*AMD*")
            { AMD_Card = $AMDvideocontroller_output.name
              AMDGraphics_Version = $AMDvideocontroller_output.DriverVersion }
        }
        $newrow | export-csv -path c:\HostnameData.csv -Append -NoTypeInformation
        Remove-Job -job $Computersystem,$videocontroller,$bioscontroller1,$bioscontroller2 -Force
        $counter++
    }
    Else
    {
        write-Host "The remote computer "$computernames" is Offline"
    }
}
The if/elseif logic would have to be wrapped inside some type of operator, such as the grouping or sub-expression operator, to be allowed inside a property assignment like that. A more concise solution is to use a switch statement before your pscustomobject construct and assign the property names dynamically.
$card_type,
$card_version =
    switch -Wildcard ($videocontroller.Name)
    {
        '*Intel*'  { 'Intel_Card', 'IntelGraphics_Version' }
        '*NVIDIA*' { 'Nvidia_Card', 'NvidiaGraphics_Version' }
        '*AMD*'    { 'AMD_Card', 'AMDGraphics_Version' }
        $_         { 'N/A_Card', 'N/A_Version' }
    }

$newrow = [pscustomobject]@{
    Host_name = $computersystem_output.name
    Model_Name = $bioscontroller2_output.Model
    Serial_Number = $bioscontroller1_output.SerialNumber
    BIOS_Version = $bioscontroller1_output.SMBIOSBIOSVersion
    Last_User_Logon = $userlogon.UserName
    $card_type = $videocontroller.Name
    $card_version = $videocontroller.DriverVersion
}
Now the card's column name gets saved to $card_type and the driver-version column name to $card_version; it also accounts for cards that don't match any of those criteria, defaulting to 'N/A'.
On another note, I personally don't see why you're querying the same Win32_ComputerSystem class more than once while using jobs and having them wait. You are also using two different types of cmdlets that do the same thing when querying those classes. You should only need the first variable assignments, and you'd only have to reference them once.
EDIT:
You also can't append objects whose properties have different names to a CSV whose columns were already created under different names.
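Given that Export-Csv -Append limitation, one way around it is to give every row the same set of properties and keep the vendor-specific details in generic columns. A sketch along those lines (the Graphics_Card and Graphics_Version column names are my own, and it replaces the three per-vendor Receive-Job calls), under the assumption that one adapter per machine is enough:

# Pick the first matching adapter; fall back to 'N/A' so every
# exported row carries an identical set of columns.
$gpu = Receive-Job $videocontroller -Keep |
       Where-Object { $_.Name -match 'Intel|NVIDIA|AMD' } |
       Select-Object -First 1
if ($gpu) {
    $gpuName    = $gpu.Name
    $gpuVersion = $gpu.DriverVersion
} else {
    $gpuName    = 'N/A'
    $gpuVersion = 'N/A'
}

$newrow = [pscustomobject]@{
    Host_name        = $computersystem_output.Name
    Model_Name       = $bioscontroller2_output.Model
    Serial_Number    = $bioscontroller1_output.SerialNumber
    BIOS_Version     = $bioscontroller1_output.SMBIOSBIOSVersion
    Last_User_Logon  = $userlogon.UserName
    Graphics_Card    = $gpuName
    Graphics_Version = $gpuVersion
}
$newrow | Export-Csv -Path C:\HostnameData.csv -Append -NoTypeInformation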

Accessing a PowerShell variable inside a function / workflow

I am working on a PowerShell script with a function and a workflow. Unfortunately, I am unable to access variables inside the function. Here is an example:
$location = "c:\temp"
function PingComputer
{
Param($ip)
$res = Test-Connection -ComputerName $ip -quiet -Count 1
If ($res -eq "true")
{
Try
{
#Some tasks if pings are ok
#For example : copy-item -path $location -destination $dest -force -recurse
}
Catch
{
#Catch exceptions
}
}
Else
{
#Ping fail
}
}
workflow parallelPingCOmputer {
Param($ips)
$i=0
foreach -parallel($ip in $ips)
{
PingComputer($ip)
$workflow:i++
$count = $ips.Count
InlineScript {
#write-host "$using:i : " $using:ips.count " : $using:ips "
Write-Progress -Activity "Poste : $using:ip" -Status "Postes effectués : $using:i sur $using:count" -PercentComplete (($using:i / $using:Count) * 100)
sleep -s 1
}
}
}
$request = parallelPingComputer -ips $ip_list | Select-object date, computer, result | out-gridview
This is a simplified version of my current script. As you can see, the variable $location can't be accessed inside my function PingComputer. I tried changing its scope to global or script, but nothing works.
The message I get from the copy-item is "path is null"... How can I make my variable accessible?
If you want to reuse the function elsewhere, copy its logic inside the workflow and also keep the function outside. Otherwise, move the logic inside the workflow and remove the function outside, like the code below; that solves the problem without using a function inside the workflow at all.
I made an example on my GitHub:
Workflow Get-Ping{
    Param(
        [Parameter(Mandatory = $true)][string[]]$Computers
    )
    Foreach -Parallel ($computer in $Computers){
        $ping = $null
        $version = $null
        if(Test-Connection -ComputerName $computer -Count 1 -Quiet){
            $ping = "Online"
            $version = Get-WmiObject -Namespace "root\cimv2" -Class "Win32_OperatingSystem" -PSComputerName $computer | select Version
        }
        else{
            $ping = "Offline"
        }
        #if no gwmi use -ComputerName $computer
        $arrayResults = New-Object -Type PSObject -Property @{
            Hostname = $computer
            Ping = $ping
            Version = $version.Version
        }
        return($arrayResults)
    }
}

$computers = Get-Content ".\Computers.txt"
Write-Host "$($computers.Count) computers found" -ForegroundColor Green
Get-Ping -Computers $computers | Select-Object Hostname, Ping, Version | Sort-Object Hostname | Out-GridView -Title "Powershell Workflow - Ping"
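If the separate PingComputer function is worth keeping, the other route is the one from the first answer on this page: pass the variable in explicitly as a parameter. A minimal sketch under the assumption that calling the script-level function from the workflow already works for you (as it appears to in the question); $ip_list and $dest are the question's own, still-assumed variables:

$location = "c:\temp"

function PingComputer
{
    Param($ip, $location)   # $location now arrives as a parameter
    if (Test-Connection -ComputerName $ip -Quiet -Count 1)
    {
        # e.g. copy-item -path $location -destination $dest -force -recurse
        "$ip is reachable, would copy from $location"
    }
    else
    {
        "$ip did not respond"
    }
}

workflow parallelPingComputer {
    Param($ips, $location)
    foreach -parallel ($ip in $ips)
    {
        PingComputer -ip $ip -location $location
    }
}

parallelPingComputer -ips $ip_list -location $location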

Split-Pipeline parallel jobs script won't finish the last job

So I've been using the Split-Pipeline module for some time now, and what bothers me about it is that it rarely finishes cleanly. I always get stuck with the last job in the queue hanging and have to either stop the script or kill ISE:
VERBOSE: Split-Pipeline: Jobs = 1; Load = End; Queue = 0
Example: 300 servers, scanning them for hotfixes using Split-Pipeline. I'm using pretty standard parameters:
$servers | Split-Pipeline -Verbose -Count 10 { process {#some code for scanning#}}
So 10 jobs, each loaded with 30 servers; the first, let's say, 250 servers are scanned really fast, then it slows down a little, and when only the last job remains it never finishes...
Has anyone experienced something similar? I've tested this on several machines and it's always the same, so I don't think it's related to the machine running the script.
EDIT: here's the code
$servers = (Get-Clipboard).trim() #server1..100
$KB = (Get-Clipboard).trim() -split ', ' #KB4022714, KB4022717, KB4022718, KB4022722, KB4022727, KB4022883, KB4022884, KB4022887, KB4024402, KB4025339, KB4025342, KB4021903, KB4021923, KB4022008, KB4022010, KB4022013, KB3217845
$servers | Split-Pipeline -Verbose -Variable KB -Count 10 { process {
    $hash = [ordered]@{}
    try
    {
        $hash.Hostname = $_
        $os = gwmi win32_operatingsystem -ComputerName $_ -ErrorAction Stop
        $hash.OS = $os.Caption
        $hash.Architecture = $os.OSArchitecture
        $today = [Management.ManagementDateTimeConverter]::ToDateTime($os.LocalDateTime)
        $hash.LastReboot = [Management.ManagementDateTimeConverter]::ToDateTime($os.LastBootUpTime)
        $hash.DaysSinceReboot = ($today - $hash.LastReboot).Days
    }
    catch
    {
        $hash.OS = $Error[0]
        $hash.Architecture = 'N/A'
        $hash.LastReboot = 'N/A'
        $hash.DaysSinceReboot = 'N/A'
    }
    try
    {
        $hash.PendingReboot = icm -cn $_ {
            if (Get-ChildItem "HKLM:\Software\Microsoft\Windows\CurrentVersion\Component Based Servicing\RebootPending" -EA SilentlyContinue) { return $true }
            if (Get-Item "HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\WindowsUpdate\Auto Update\RebootRequired" -EA SilentlyContinue) { return $true }
            if (Get-ItemProperty "HKLM:\SYSTEM\CurrentControlSet\Control\Session Manager" -Name PendingFileRenameOperations -Ea SilentlyContinue) { return $true }
            try
            {
                $util = [wmiclass]"\\.\root\ccm\clientsdk:CCM_ClientUtilities"
                $status = $util.DetermineIfRebootPending()
                if(($status -ne $null) -and $status.RebootPending)
                {
                    return $true
                }
            }
            catch{}
            return $false
        }
    }
    catch
    {
        $hash.PendingReboot = 'N/A'
    }
    try
    {
        $hotfix = Get-HotFix -ComputerName $_ -Id $KB -ErrorAction Stop
        if ($hotfix)
        {
            $hash.Hotfix = $hotfix.HotFixID -join ','
        }
        else
        {
            $hash.Hotfix = "No Hotfix from the list applied"
        }
    }
    catch
    {
        $hash.Hotfix = $Error[0]
    }
    $obj = New-Object -TypeName PSObject -Property $hash
    $obj | Export-Csv c:\temp\hotfixes.csv -NoTypeInformation -Append -Force
}
}
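Not an explanation for the hang, but worth noting: every one of the ten runspaces appends to the same c:\temp\hotfixes.csv from inside the pipeline block. Split-Pipeline passes whatever the block emits back down the main pipeline, so an alternative sketch (built from the same per-server logic as above) is to emit the objects and export once at the end, keeping all file access in the calling runspace:

# Emit the result object instead of exporting inside the parallel block,
# then export once from the single calling runspace.
$results = $servers | Split-Pipeline -Verbose -Variable KB -Count 10 { process {
    $hash = [ordered]@{ Hostname = $_ }
    try {
        $os = gwmi win32_operatingsystem -ComputerName $_ -ErrorAction Stop
        $hash.OS = $os.Caption
    }
    catch {
        $hash.OS = $Error[0]
    }
    # ... remaining checks from the original block go here, filling $hash ...
    New-Object -TypeName PSObject -Property $hash    # sent to Split-Pipeline's output
}}
$results | Export-Csv c:\temp\hotfixes.csv -NoTypeInformation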

Resolve-DnsName inside Test-Connection

I was wondering how I could return the Resolve-DnsName output from my Test-Connection script and add it to the CSV I created.
I'd like to capture the Name, Type, TTL, and Section from that, please.
Resolve-DnsName should only be invoked when the ping is not successful.
$servers = Get-Content "servers.txt"
$collection = $()
foreach ($server in $servers)
{
    $status = @{ "ServerName" = $server; "TimeStamp" = (Get-Date -f s) }
    $result = Test-Connection $server -Count 1 -ErrorAction SilentlyContinue
    if ($result)
    {
        $status.Results = "Up"
        $status.IP = ($result.IPV4Address).IPAddressToString
    }
    else
    {
        $status.Results = "Down"
        $status.IP = "N/A"
        $status.DNS = if (-not(Resolve-DnsName -Name $server -ErrorAction SilentlyContinue))
        {
            Write-Output -Verbose "$server -- Not Resolving"
        }
        else
        {
            "$server resolving"
        }
    }
    New-Object -TypeName PSObject -Property $status -OutVariable serverStatus
    $collection += $serverStatus
}
$collection | Export-Csv -LiteralPath .\ServerStatus3.csv -NoTypeInformation
But nothing new is added to the CSV.
You ran into a PowerShell gotcha. PowerShell determines the columns displayed in tabular/CSV output from the first object processed. If that object doesn't have a DNS property, that column won't be shown in the output, even if other objects in the list do have it. If other objects lack properties that were present in the first object, those values are displayed as empty.
Demonstration:
PS C:\> $a = (New-Object -Type PSObject -Property @{'a'=1; 'b'=2}),
>> (New-Object -Type PSObject -Property @{'a'=3; 'b'=4; 'c'=5}),
>> (New-Object -Type PSObject -Property @{'b'=6; 'c'=7})
>>
PS C:\> $a | Format-Table -AutoSize

a b
- -
1 2
3 4
  6

PS C:\> $a[1..2] | Format-Table -AutoSize

c b a
- - -
5 4 3
7 6
If you want to generate tabular output, always create your objects uniformly with the same set of properties. Choosing sensible defaults even allows you to reduce the total amount of code.
$collection = foreach ($server in $servers) {
    $status = New-Object -Type PSObject -Property @{
        'ServerName' = $server
        'TimeStamp'  = Get-Date -f s
        'Results'    = 'Down'
        'IP'         = 'N/A'
        'HasDNS'     = [bool](Resolve-DnsName -Name $server -EA SilentlyContinue)
    }
    $result = Test-Connection $server -Count 1 -EA SilentlyContinue
    if ($result) {
        $status.Results = 'Up'
        $status.IP = ($result.IPV4Address).IPAddressToString
    }
    $status
}
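To capture the specific record fields asked for (Name, Type, TTL, Section) rather than just a boolean, and only when the ping fails, the same uniform-property idea can be extended. A sketch along those lines; the DnsName, DnsType, DnsTTL and DnsSection column names are my own choice:

$collection = foreach ($server in $servers) {
    # Every object gets the full set of properties up front,
    # so the CSV columns are identical for every row.
    $status = [pscustomobject]@{
        ServerName = $server
        TimeStamp  = Get-Date -f s
        Results    = 'Down'
        IP         = 'N/A'
        DnsName    = 'N/A'
        DnsType    = 'N/A'
        DnsTTL     = 'N/A'
        DnsSection = 'N/A'
    }
    $result = Test-Connection $server -Count 1 -ErrorAction SilentlyContinue
    if ($result) {
        $status.Results = 'Up'
        $status.IP      = ($result.IPV4Address).IPAddressToString
    }
    else {
        # Only resolve when the ping failed, as requested
        $dns = Resolve-DnsName -Name $server -ErrorAction SilentlyContinue | Select-Object -First 1
        if ($dns) {
            $status.DnsName    = $dns.Name
            $status.DnsType    = $dns.Type
            $status.DnsTTL     = $dns.TTL
            $status.DnsSection = $dns.Section
        }
    }
    $status
}
$collection | Export-Csv -LiteralPath .\ServerStatus3.csv -NoTypeInformation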

Creating workflow for parallel scheduled server reboots with logging

I'm currently using the following code to schedule a server reboot. This works pretty well for a handful of servers but becomes a problem when there are many servers (over 80) because Register-ScheduledJob takes a long time per server.
$user = Get-Credential -UserName $env:USERNAME -Message "UserName/password for scheduled Reboot"
$trigger = New-JobTrigger -once -at $date
$script = [ScriptBlock]::Create("D:\Scripts\Scheduled-Reboot-Single.ps1 -server $server")
Register-ScheduledJob -Name $server -Credential $user -Trigger $trigger -ScriptBlock $script
My research pointed to using workflow and foreach -parallel.
The problem I run into is with accurate logging: my log file is created, but the columns are not ordered correctly.
workflow Do-ScheduledReboot{
    Param([string[]]$servers)
    foreach -parallel($server in $servers) {
        InlineScript {
            try {
                $LastReboot = Get-EventLog -ComputerName $using:server -LogName system |
                    Where-Object {$_.EventID -eq '6005'} |
                    Select -ExpandProperty TimeGenerated |
                    select -first 1
                #New loop with counter, exit script if server did not reboot.
                $max = 20; $i = 0
                do {
                    if ($i -gt $max) {
                        $hash = @{
                            "Server" = $using:server
                            "Status" = "FailedToReboot!"
                            "LastRebootTime" = "$LastReboot"
                            "CurrentRebootTime" = "FailedToReboot!"
                        }
                        $newRow = New-Object PsObject -Property $hash
                        $rnd = Get-Random -Minimum 5 -Maximum 40
                        Start-Sleep -Seconds $rnd
                        Export-Csv D:\workflow-results.csv -InputObject $newrow -Append -Force
                        exit
                    }#exit script and log failed to reboot.
                    $i++
                    Start-Sleep -Seconds 15
                } while (Test-path "\\$using:server\c$")
                $max = 20; $i = 0
                do {
                    if ($i -gt $max) {
                        $hash = @{
                            "Server" = $using:server
                            "Status" = "FailedToComeOnline!"
                            "LastRebootTime" = "$LastReboot"
                            "CurrentRebootTime" = "FailedToReboot!"
                        }
                        $newRow = New-Object PsObject -Property $hash
                        $rnd = Get-Random -Minimum 5 -Maximum 40
                        Start-Sleep -Seconds $rnd
                        Export-Csv D:\workflow-results.csv -InputObject $newrow -Append -Force
                        exit
                    }#exit script and log failed to come online.
                    $i++
                    Start-Sleep -Seconds 15
                } while (-not(Test-path "\\$using:server\c$"))
                $CurrentReboot = Get-EventLog -ComputerName $using:server -LogName system | Where-Object {$_.EventID -eq '6005'} | Select -ExpandProperty TimeGenerated | select -first 1
                $hash = @{
                    "Server" = $using:server
                    "Status" = "RebootSuccessful"
                    "LastRebootTime" = $LastReboot
                    "CurrentRebootTime" = "$CurrentReboot"
                }
                $newRow = New-Object PsObject -Property $hash
                $rnd = Get-Random -Minimum 5 -Maximum 40
                Start-Sleep -Seconds $rnd
                Export-Csv D:\workflow-results.csv -InputObject $newrow -Append -Force
            } catch {
                $errMsg = $_.Exception
                "Failed with $errMsg"
            }#end catch
        }#end inline script
    }#end foreach parallel
}#end workflow
$mylist = gc D:\Servers.txt
Do-ScheduledReboot -servers $mylist
Create ordered hashtables:
$hash = [ordered]@{
    'Server' = $using:server
    'Status' = ...
    'LastRebootTime' = ...
    'CurrentRebootTime' = ...
}
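The ordered hashtable fixes the column order because New-Object PSObject -Property otherwise enumerates a plain hashtable in arbitrary order. An equivalent, slightly shorter variant (a sketch, not the answer's exact code, meant to sit inside the same InlineScript block) builds the row directly as a [pscustomobject] literal, which also preserves the declaration order:

# Property order in a [pscustomobject] literal is preserved as written,
# so every exported row has the same column layout.
$newRow = [pscustomobject]@{
    Server            = $using:server
    Status            = 'RebootSuccessful'
    LastRebootTime    = $LastReboot
    CurrentRebootTime = "$CurrentReboot"
}
Export-Csv D:\workflow-results.csv -InputObject $newRow -Append -Force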