In PowerShell, I want to loop twice through a text file, but in the second loop I want to continue from the end of the first loop - powershell

I have a text file to process. The text file contains some configuration data and some networking commands. I want to run all those network commands and redirect the output to a log file.
At the start of the text file there is some configuration information, like the file name and file location, which can be used for naming the log file and choosing its location. These lines start with special characters like '<#:', just to signal that the rest of the line is configuration data about the file, not a command to execute.
Now, before I start executing the networking commands (which start with special characters like '<:'), I first want to read all the configuration information about the file, i.e. file name, location, overwrite flag, etc. Then I can run all the commands and dump their output into the log file.
I used the Get-Content iterator to loop over the entire text file.
Question: Is there any way to start looping over the file again from a specific line?
That way I could process the configuration information first (loop until I first encounter a command to execute and remember that line number), create the log file, and then keep running commands and redirecting output to the log file (looping from the last remembered line number).
Config File looks like:
<#Result_File_Name:dump1.txt
<#Result_File_Location:C:\powershell
<:ping www.google.com
<:ipconfig
<:traceroute www.google.com
<:netsh interface ip show config
My PowerShell script looks like:
$content = Get-Content C:\powershell\config.txt
foreach ($line in $content)
{
    if ($line.StartsWith("<#Result_File_Name:")) # every time I am doing this, even for command lines
    {
        $result_file_arr = $line.split(":")
        $result_file_name = $result_file_arr[1]
        Write-Host $result_file_name
    }
    #if ($line.StartsWith("<#Result_File_Location:")) # every time I am doing this, even for command lines
    #{
    #    $result_file_arr = $line.split(":")
    #    $result_file_name = $result_file_arr[1]
    #}
    if ($conf_read_over -eq 1) # '-eq', not '=', which would assign and always pass
    {
        break
    }
    if ($line.StartsWith("<:")) # in this if block, I need to run all commands
    {
        $items = $line.split("<:") # splits on each of the characters '<' and ':', so the command is in $items[2]
        #$items[0]
        invoke-expression $items[2] > $result_file_name
    }
}

If all the config information starts with <#, just process those lines out first separately. Once that is done, you can assume the rest are commands.
# Collect config lines and process
$config = $content | Where-Object{$_.StartsWith('<#')} | ForEach-Object{
    $_.Trim("<#") -replace "\\","\\" -replace "^(.*?):(.*)" , '$1 = $2'
} | ConvertFrom-StringData
# Process all the lines that are command lines.
$content | Where-Object{!$_.StartsWith('<#') -and ![string]::IsNullOrEmpty($_)} | ForEach-Object{
    Invoke-Expression $_.TrimStart("<:")
}
I went a little overboard with the config section. What I did was convert it into a hashtable. Now you have your config options, as they were in the file, accessible as an object.
$config
Name                 Value
----                 -----
Result_File_Name     dump1.txt
Result_File_Location C:\powershell
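For example, once $config is populated you can combine the two values to build the log path (a quick usage sketch; the values shown assume the sample file from the question):
$logPath = Join-Path $config.Result_File_Location $config.Result_File_Name
# $logPath is now C:\powershell\dump1.txt
Add-Content -Path $logPath -Value "Log created $(Get-Date)"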
A small reconfiguration of your code, with some parts missing, would look like the following. You will most likely need to tweak this to your own needs.
# Collect config lines and process
$config = ($content | Where-Object{$_.StartsWith('<#')} | ForEach-Object{
    $_.Trim("<#") -replace "\\","\\" -replace "^(.*?):(.*)" , '$1 = $2'
} | Out-String) | ConvertFrom-StringData
# Process all the lines that are command lines.
$content | Where-Object{!$_.StartsWith('<#') -and ![string]::IsNullOrEmpty($_)} | ForEach-Object{
    Invoke-Expression $_.TrimStart("<:") | Add-Content -Path $config.Result_File_Name
}

As per your comment, you are still curious about the restart loop logic from your original question, so I will add this as a separate answer. I would still prefer my other approach.
# Use a flag to determine if we have already restarted. Assume false.
$restarted = $false
$restartIndexPoint = 4
$restartIndex = 2
for($contentIndex = 0; $contentIndex -lt $content.Length; $contentIndex++){
    Write-Host ("Line#{0} : {1}" -f $contentIndex, $content[$contentIndex])
    # Check to see if we are on the $restartIndexPoint for the first time
    if(!$restarted -and $contentIndex -eq $restartIndexPoint){
        # Set the flag so this does not get repeated.
        $restarted = $true
        # Reset the index to repeat some steps over again.
        $contentIndex = $restartIndex
    }
}
Remember that array indexing is 0-based when you are setting your numbers. Line 20 is element 19 in the string array, for example.
Inside the loop we run a check. If it passes, we change the current index to an earlier one. The Write-Host just prints the lines so you can see the "restart" portion. We need the flag so that we do not end up in an infinite loop.
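For illustration, assuming $content holds the six-line sample config file from the question, the output would be as below. Note that after the index is reset to 2, the for loop's increment moves it to 3 before printing resumes:
Line#0 : <#Result_File_Name:dump1.txt
Line#1 : <#Result_File_Location:C:\powershell
Line#2 : <:ping www.google.com
Line#3 : <:ipconfig
Line#4 : <:traceroute www.google.com
Line#3 : <:ipconfig
Line#4 : <:traceroute www.google.com
Line#5 : <:netsh interface ip show config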

Related

How to speed up processing of ~million lines of text in log file

I am trying to parse a very large log file that consists of space-delimited text across about 16 fields. Unfortunately the app logs a blank line between each legitimate one (effectively doubling the lines I must process). It also causes fields to shift, because it uses space both as a delimiter and for empty fields. I couldn't get around this in LogParser. Fortunately PowerShell lets me reference fields from the end as well, making it easier to get the later fields affected by the shift.
After a bit of testing with smaller sample files, I've determined that processing line by line as the file streams with Get-Content is slower than reading the file completely using Get-Content -ReadCount 0 and then processing from memory. This part is relatively fast (<1 min).
The problem comes when processing each line, even though it's in memory. It is taking hours for a 75MB file with 561,178 legitimate lines of data (minus all the blank lines).
I'm not doing much in the code itself. I'm doing the following:
Splitting line via space as delimiter
One of the fields is an IP address that I am reverse DNS resolving, which is obviously going to be slow. So I have wrapped this into more code to create an in-memory arraylist cache of previously resolved IPs and pulling from it when possible. The IPs are largely the same so after a few hundred lines, resolution shouldn't be an issue any longer.
Saving the needed array elements into my pscustomobject
Adding pscustomobject to arraylist to be used later.
During the loop I'm tracking how many lines I've processed and outputting that info in a progress bar (I know this adds extra time, but I'm not sure how much). I really want to know the progress.
All in all, it's processing some 30-40 lines per second, but obviously this is not very fast.
Can someone offer alternative methods/objectTypes to accomplish my goals and speed this up tremendously?
Below are some samples of the log showing the field shift (note this is a Windows DNS debug log), followed by my code.
10/31/2022 12:38:45 PM 2D00 PACKET 000000B25A583FE0 UDP Snd 127.0.0.1 6c94 R Q [8385 A DR NXDOMAIN] AAAA (4)pool(3)ntp(3)org(0)
10/31/2022 12:38:45 PM 2D00 PACKET 000000B25A582050 UDP Snd 127.0.0.1 3d9d R Q [8081 DR NOERROR] A (4)pool(3)ntp(3)org(0)
NOTE: the issue in this case being [8385 A DR NXDOMAIN] (4 fields) vs [8081 DR NOERROR] (3 fields)
Other examples would be the "R Q" where sometimes it's " Q".
$Logfile = "C:\Temp\log.txt"
[System.Collections.ArrayList]$LogEntries = #()
[System.Collections.ArrayList]$DNSCache = #()
# Initialize log iteration counter
$i = 1
# Get Log data. Read entire log into memory and save only lines that begin with a date (ignoring blank lines)
$LogData = Get-Content $Logfile -ReadCount 0 | % {$_ | ? {$_ -match "^\d+\/"}}
$LogDataTotalLines = $LogData.Length
# Process each log entry
$LogData | ForEach-Object {
$PercentComplete = [math]::Round(($i/$LogDataTotalLines * 100))
Write-Progress -Activity "Processing log file . . ." -Status "Processed $i of $LogDataTotalLines entries ($PercentComplete%)" -PercentComplete $PercentComplete
# Split line using space, including sequential spaces, as delimiter.
# NOTE: Due to how app logs events, some fields may be blank leading split yielding different number of columns. Fortunately the fields we desire
# are in static positions not affected by this, except for the last 2, which can be referenced backwards with -2 and -1.
$temp = $_ -Split '\s+'
# Resolve DNS name of IP address for later use and cache into arraylist to avoid DNS lookup for same IP as we loop through log
If ($DNSCache.IP -notcontains $temp[8]) {
$DNSEntry = [PSCustomObject]#{
IP = $temp[8]
DNSName = Resolve-DNSName $temp[8] -QuickTimeout -DNSOnly -ErrorAction SilentlyContinue | Select -ExpandProperty NameHost
}
# Add DNSEntry to DNSCache collection
$DNSCache.Add($DNSEntry) | Out-Null
# Set resolved DNS name to that which came back from Resolve-DNSName cmdlet. NOTE: value could be blank.
$ResolvedDNSName = $DNSEntry.DNSName
} Else {
# DNSCache contains resolved IP already. Find and Use it.
$ResolvedDNSName = ($DNSCache | ? {$_.IP -eq $temp[8]}).DNSName
}
$LogEntry = [PSCustomObject]#{
Datetime = $temp[0] + " " + $temp[1] + " " + $temp[2] # Combines first 3 fields Date, Time, AM/PM
ClientIP = $temp[8]
ClientDNSName = $ResolvedDNSName
QueryType = $temp[-2] # Second to last entry of array
QueryName = ($temp[-1] -Replace "\(\d+\)",".") -Replace "^\.","" # Last entry of array. Replace any "(#)" characters with period and remove first period for friendly name
}
# Add LogEntry to LogEntries collection
$LogEntries.Add($LogEntry) | Out-Null
$i++
}
Here is a more optimized version you can try.
What changed:
Removed Write-Progress, especially because it's not known if Windows PowerShell is used; PowerShell versions below 6 take a big performance hit from Write-Progress
Changed $DNSCache to Generic Dictionary for fast lookups
Changed $LogEntries to Generic List
Switched from Get-Content to switch -Regex -File
$Logfile = 'C:\Temp\log.txt'
$LogEntries = [System.Collections.Generic.List[psobject]]::new()
$DNSCache = [System.Collections.Generic.Dictionary[string, psobject]]::new([System.StringComparer]::OrdinalIgnoreCase)
# Process each log entry
switch -Regex -File ($Logfile) {
    '^\d+\/' {
        # Split line using space, including sequential spaces, as delimiter.
        # NOTE: Due to how the app logs events, some fields may be blank, leading the split to yield a different number of columns. Fortunately the fields we desire
        # are in static positions not affected by this, except for the last 2, which can be referenced backwards with -2 and -1.
        $temp = $_ -Split '\s+'
        $ip = [string] $temp[8]
        $resolvedDNSRecord = $DNSCache[$ip]
        if ($null -eq $resolvedDNSRecord) {
            $resolvedDNSRecord = [PSCustomObject]@{
                IP      = $ip
                DNSName = Resolve-DnsName $ip -QuickTimeout -DnsOnly -ErrorAction Ignore | select -ExpandProperty NameHost
            }
            $DNSCache[$ip] = $resolvedDNSRecord
        }
        $LogEntry = [PSCustomObject]@{
            Datetime      = $temp[0] + ' ' + $temp[1] + ' ' + $temp[2] # Combines first 3 fields: Date, Time, AM/PM
            ClientIP      = $ip
            ClientDNSName = $resolvedDNSRecord.DNSName
            QueryType     = $temp[-2] # Second to last entry of array
            QueryName     = ($temp[-1] -Replace '\(\d+\)', '.') -Replace '^\.', '' # Last entry of array. Replace any "(#)" markers with a period and remove the leading period for a friendly name
        }
        # Add LogEntry to LogEntries collection
        $LogEntries.Add($LogEntry)
    }
}
If it's still slow, there is still the option of using Start-ThreadJob as a multithreading approach with chunked lines (like 10000 per job), roughly as sketched below.
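A rough sketch of that idea follows. This is an illustration only, not something tested against the real log: it assumes the ThreadJob module is available, that $LogData already holds the filtered lines, and it omits the DNS cache (sharing one across threads would need a thread-safe structure such as a ConcurrentDictionary). The chunk size of 10000 is arbitrary.
# Parse the log in parallel: one thread job per chunk of 10000 lines
$chunkSize = 10000
$jobs = for ($start = 0; $start -lt $LogData.Count; $start += $chunkSize) {
    $chunk = $LogData[$start..([Math]::Min($start + $chunkSize, $LogData.Count) - 1)]
    Start-ThreadJob -ArgumentList (,$chunk) -ScriptBlock {
        param($lines)
        foreach ($line in $lines) {
            $temp = $line -split '\s+'
            [PSCustomObject]@{
                Datetime  = $temp[0..2] -join ' '
                ClientIP  = $temp[8]
                QueryType = $temp[-2]
                QueryName = ($temp[-1] -replace '\(\d+\)', '.') -replace '^\.', ''
            }
        }
    }
}
# Collect the parsed objects as the jobs finish
$LogEntries = $jobs | Receive-Job -Wait -AutoRemoveJob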

Parse MDT Log using PowerShell

I am trying to set up a log which pulls different information from another log file, to log assets built by MDT, using PowerShell. I can extract a line of the log using a simple get-content | select-string to get the lines I need, so the output looks like this:
[LOG[Validate Domain Credentials [domain\user]]LOG]!
time="16:55:42.000+000" date="10-20-2017" component="Wizard"
context="" type="1" thread="" file="Wizard"
and I am curious whether there is a way of capturing things like domain\user, time and date in separate variables, so they can later be combined with other data captured in a similar way and written to an output file as a single line.
This is how you could do it:
$line = Get-Content "<your_log_path>" | Select-String "Validate Domain Credentials" | select -First 1
$regex = '\[(?<domain>[^\\[]+)\\(?<user>[^]]+)\].*time="(?<time>[^"]*)".*date="(?<date>[^"]*)".*component="(?<component>[^"]*)".*context="(?<context>[^"]*)".*type="(?<type>[^"]*)".*thread="(?<thread>[^"]*)".*file="(?<file>[^"]*)"'
if ($line -match $regex) {
    $user = $Matches.user
    $date = $Matches.date
    $time = $Matches.time
    # ... now do stuff with your variables ...
}
You might want to build in some error checking etc. (e.g. for when no line is found or it does not match).
Also, you could greatly simplify the regex if you only need those three values; I designed it so that all fields from the line are included.
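For instance, a trimmed-down pattern capturing only the user, time and date could look like this (an untested sketch built from the full regex above):
$regex = '\\(?<user>[^]]+)\].*time="(?<time>[^"]*)".*date="(?<date>[^"]*)"'
if ($line -match $regex) {
    $user = $Matches.user
    $time = $Matches.time
    $date = $Matches.date
}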
Also, you could convert the values into more appropriate types, which (depending on what you want to do with them afterwards) might make handling them easier:
$type = [int]$Matches.type
$credential = New-Object System.Net.NetworkCredential($Matches.user, $null, $Matches.domain)
$datetime = [DateTime]::ParseExact(($Matches.date + $Matches.time), "MM-dd-yyyyHH:mm:ss.fff+000", [CultureInfo]::InvariantCulture)

PowerShell - Sorry, we couldn't find Microsoft.PowerShell.Core\FileSystem::

I'm trying to modify the script created by Boe Prox that combines multiple CSV files to one Excel workbook to run on a network share.
When I run it locally, the script executes great and combines multiple .csv files into one Excel workbook.
Clear-Host
$OutputFile = "ePortalMonthlyReport.xlsx"
$ChildDir = "C:\MonthlyReport\*.csv"
cd "C:\MonthlyReport\"
echo "Combining .csv files into Excel workbook"
. C:\PowerShell\ConvertCSVtoExcel.ps1
Get-ChildItem $ChildDir | ConvertCSVtoExcel -output $OutputFile
echo " "
But when I modify it to run from a network share with the following changes:
Clear-Host
# Variables
$OutputFile = "ePortalMonthlyReport.xlsx"
$NetworkDir = "\\sqltest2\dev_ePortal\Monthly_Report"
$ChildDir = "\\sqltest2\dev_ePortal\Monthly_Report\*.csv"
cd "\\sqltest2\dev_ePortal\Monthly_Report"
echo "Combining .csv files into Excel workbook"
. $NetworkDir\ConvertCSVtoExcel.ps1
Get-ChildItem $ChildDir | ConvertCSVtoExcel -output $OutputFile
echo " "
I am getting an error where it looks like it is using the network path twice, and I am not sure why:
Combining .csv files into Excel workbook
Converting \\sqltest2\dev_ePortal\Monthly_Report\001_StatsByCounty.csv
naming worksheet 001_StatsByCounty
--done
opening csv Microsoft.PowerShell.Core\FileSystem::\\sqltest2\dev_ePortal\Monthly_Report\\\sqltest2\dev_ePortal\Monthly_Report\001_StatsByCounty.csv) in excel in temp workbook
Sorry, we couldn't find Microsoft.PowerShell.Core\FileSystem::\\sqltest2\dev_ePortal\Monthly_Report\\\sqltest2\dev_ePortal\Monthly_Report\001_StatsByCounty.csv. Is it possible it was moved, renamed or deleted?
Anyone have any thoughts on resolving this issue?
Thanks,
Because the script uses the following regex:
[regex]$regex = "^\w\:\\"
which matches a path beginning with a drive letter; e.g. c:\data\file.csv will match and data\file.csv will not. It uses this because (apparently) Excel needs a complete path, so if the file path does not match, it will add the current directory to the front of it:
#Open the CSV file in Excel; must be converted into a complete path if not already done
If ($regex.ismatch($input)) {
    $tempcsv = $excel.Workbooks.Open($input)
}
ElseIf ($regex.ismatch("$($input.fullname)")) {
    $tempcsv = $excel.Workbooks.Open("$($input.fullname)")
}
Else {
    $tempcsv = $excel.Workbooks.Open("$($pwd)\$input")
}
Your file paths will be \\server\share\data\file.csv and it doesn't see a drive letter, so it hits the last option and jams $pwd - an automatic variable of the current working directory - onto the beginning of the file path.
You might get away with it if you edit his script and change the regex to:
[regex]$regex = "^\w\:\\|^\\\\"
which will match a path beginning with \\ as OK to use without changing it, as well.
Or maybe edit the last option (~ line 111) to say ...Open("$($input.fullname)") as well, like the second option does.
Most of the issues are caused by the script calling $pwd rather than $PSScriptRoot in almost every instance. Replace all instances with a quick find and replace.
$pwd looks like:
PS Microsoft.PowerShell.Core\FileSystem::\\foo\bar
$PSScriptRoot looks like:
\\foo\bar
The second part I fixed for myself is what @TessellatingHeckler pointed out. I took a longer approach.
It's not the most efficient way... but to me it is clear.
[regex]$regex = "^\w\:\\"
[regex]$regex2 = "^\\\\"
$test = 0
If ($regex.ismatch($input) -and $test -eq 0 ) {
$tempcsv = $excel.Workbooks.Open($input)
$test = 1 }
If ($regex.ismatch("$($input.fullname)") -and $test -eq 0) {
$tempcsv = $excel.Workbooks.Open("$($input.fullname)")
$test = 1}
If ($regex2.ismatch($input) -and $test -eq 0) {
$tempcsv = $excel.Workbooks.Open($input)
$test = 1 }
If ($regex2.ismatch("$($input.fullname)") -and $test -eq 0) {
$tempcsv = $excel.Workbooks.Open("$($input.fullname)")
$test = 1}
If ($test -eq 0) {
$tempcsv = $excel.Workbooks.Open("$($PSScriptRoot)\$input")
$test = 0 }
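For reference, the same logic can be collapsed into a single if/elseif chain without the $test flag, combining the extended regex from the first answer with $PSScriptRoot (an untested sketch):
[regex]$regex = "^\w\:\\|^\\\\"
If ($regex.ismatch($input)) {
    $tempcsv = $excel.Workbooks.Open($input)
}
ElseIf ($regex.ismatch("$($input.fullname)")) {
    $tempcsv = $excel.Workbooks.Open("$($input.fullname)")
}
Else {
    $tempcsv = $excel.Workbooks.Open("$($PSScriptRoot)\$input")
}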

Run command, extract a field, run a resultant command

Apologies if this is an insanely simple question, but I'm at something of a loss.
What I'm trying to do is take a command output - in this case from NetApp DFM:
dfm event list
ID Source Name Severity Timestamp
------- ------- ------------- ----------- ------------
1 332 volume-online Normal 20 Apr 10:16
2 443 volume-online Normal 20 Apr 10:17
3 3222 volume-online Normal 20 Apr 10:18
I have about 17,000 events - I want to delete them all by ID, by running:
dfm event delete <ID>
I know exactly how I'd do this on Unix (and used to, when this was our platform):
for i in `dfm event list | awk '{print $1}'`
do
dfm event delete $i
done
For bonus points - a 'grep'-type criterion? I apologise in advance for the basic nature of the question - I've tried looking on Google for a suitable example, but haven't found anything.
I've made a start with:
dfm event list > dfmevent.txt
foreach ($line in get-content dfmevent.txt) {
    echo $line
}
But I thought I would ask if there's a better way.
I don't have access to your environment to test, but if you are just trying to get at that first element, which is the ID, then that should be straightforward.
dfm event list | ForEach-Object{$_.Split(" ",2)[0]} | Where-Object{$_ -match '^\d+$'} | ForEach-Object{
    # For testing
    Write-Host "Id: $_ will be deleted"
    # Then do something
    # dfm event delete $_
}
I'm sure the output is already delimited with newlines, so sending it to a file first might be redundant.
We take each line and try to split it on the first space, then pass along the first element of that array. Next we ensure that element is indeed a number with a simple regex check, so that we only get numbers. I had thought about skipping the first two lines instead, but this also handles any other occurrences of text.
The last loop is for processing the ID. I left a Write-Host there for testing. Assuming you get the IDs you are looking for, you should just be able to uncomment the last line with dfm event delete $_.
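As for the 'grep'-type bonus: a Where-Object with -match placed at the front of the pipeline filters the events before the IDs are extracted. The 'volume-online' pattern here is just an example; adjust it to whatever criteria you need:
dfm event list | Where-Object{$_ -match 'volume-online'} | ForEach-Object{$_.Split(" ",2)[0]} | Where-Object{$_ -match '^\d+$'} | ForEach-Object{
    Write-Host "Id: $_ would be deleted"
    # dfm event delete $_
}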
Capturing the output of a DOS command in PowerShell is a challenge.
Using a native snapin or module from NetApp would be easier; it might be worth checking whether one is available.
Otherwise, your method of writing to a text file and reading it back in is actually quite a good idea. This is one way of reading it back and pushing the data into the command you need:
$a = get-content dfmevent.txt
foreach ($i in $a) {
    # ReadCount -gt 2 skips the two header lines
    if ($i.ReadCount -gt 2) { dfm event delete ($i.Substring(0, $i.IndexOf(" "))) }
}
This will assign the IDs to the variable $result only:
$a = get-content dfmevent.txt
$result = @()
foreach ($i in $a) {
    if ($i.ReadCount -gt 2) { $result += $i.Substring(0, $i.IndexOf(" ")) }
}
And if you did not want to write to a text file, you could use the .NET method of capturing the output directly
$ProcessInfo = New-Object System.Diagnostics.ProcessStartInfo
$ProcessInfo.FileName = "dfm"
$ProcessInfo.RedirectStandardOutput = $true
$ProcessInfo.UseShellExecute = $false
$ProcessInfo.Arguments = "event list"
$Process = New-Object System.Diagnostics.Process
$Process.StartInfo = $ProcessInfo
$Process.Start() | Out-Null
$Process.WaitForExit()
$output = $Process.StandardOutput.ReadToEnd()
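From there, $output can be split into lines and the IDs fed back into the delete command, reusing the header-skipping idea from above (a sketch, untested against real dfm output):
$output -split "`r?`n" | Select-Object -Skip 2 | ForEach-Object {
    if ($_ -match '^\d+') { dfm event delete $Matches[0] }
}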

Powershell assistance

I am currently using the below PS script to check whether the current month's MS patches are installed on the system. The script is set to check the $env:COMPUTERNAME.mbsa and Patch_NA.txt files and send the result to the $env:COMPUTERNAME.csv file.
I now need to modify this script to also pull information from the other POS devices in the same location (C:\Users\Cambridge\SecurityScans) and send the results to the $env:COMPUTERNAME.csv file.
The POS devices are listed like this:
172.26.210.1.mbsa
172.26.210.2.mbsa
172.26.210.3.mbsa
and so forth.
The IP range at all our locations (last octet) is 1 - 60. Any ideas on how I can set this up?
Script:
$logname = "C:\temp\PatchVerify\$env:COMPUTERNAME.csv"
[xml]$x=type "C:\Users\Cambridge\SecurityScans\$env:COMPUTERNAME.mbsa"
#This list is created based on a text file that is provided.
$montlyPatches = type "C:\Temp\PatchVerify\Patches_NA.txt"|
foreach{if ($_ -mat"-KB(? <KB>\d+)"){$matches.KB}}
$patchesNotInstalled=$x.SecScan.check | where {$_.id -eq 500} |foreach{`
$_.detail.updatedata|where {$_.isinstalled -eq "false"}}|Select -expandProperty KBID
$patchesInstalled =$x.SecScan.check | where {$_.id -eq 500} |foreach{`
$_.detail.updatedata|where {$_.isinstalled -eq "true"}}|Select -expandProperty KBID
"Store,Patch,Present"> $logname
$store = "$env:COMPUTERNAME"
foreach ($patch in $montlyPatches)
{
$result = "Unknown"
if ( $patchesInstalled -contains $patch)
{
$result = "YES"
}
if ( $patchesNotInstalled -contains $patch)
{
$result = "NO"
}
"$store,KB$($patch),$result" >>$logname
}
You can find lots of information on creating functions on the web, but a simple example would be:
Function Check-Patches{
    Param($FileName)
    $logname = "C:\temp\PatchVerify\$FileName.csv"
    [xml]$x = type "C:\Users\Cambridge\SecurityScans\$FileName.mbsa"
    # ...the rest of your existing code goes here...
}
Check-Patches "$env:ComputerName"
For($i=1; $i -le 60; $i++){
    Check-Patches "172.26.210.$i"
}
If you need me to break down anything in that let me know and I'll go into further explanation, but from what you already have it looks like you have a decent grasp on PowerShell theory and just needed to know what resources are available.
Edit: I updated my example to better fit your script, having it accept a file name, and then applying that file name to the $logname and $x variables within the function.
The breakdown...
First we declare that we are creating a Function using the Function keyword. Following that is the name of the function that you will use later to call it, and an opening curly brace to start the scriptblock that makes up the actual function.
Next is the Param line, which in this case is very simple, declaring only one variable as input. This could alternatively be done as Function Check-Patches ($FileName){, but when you start getting into more advanced functions that only gets confusing, so my recommendation is to stick with putting the parameters inside the function's scriptblock. This is the first thing you want inside your function in most cases, excluding any help text that you would write for the function.
Then we have updated lines for $logname and [xml]$x that use the $FileName that the function gets as input.
After that comes all of your code that parses the patch logs, and outputs to your CSV, and the closing curly brace that ends the scriptblock, and the function.
Then we call it for the ComputerName, and run a For loop. The For loop runs everything between 1 and 60, and for each loop it uses that number as the last octet of the file name to feed into the function and check those files.
A few comments on the rest of your code. The $monthlyPatches assignment could be changed to type "C:\Temp\PatchVerify\Patches_NA.txt" | ?{$_ -match "-KB(?<KB>\d+)"} | %{$matches.KB} so that the results are filtered before the ForEach loop, which could cut down on some time.
On the $patchesInstalled and $patchesNotInstalled lines you don't need the backtick at the end of the line; you can naturally have a line break after the opening of a ForEach-Object scriptblock. Having the backtick there makes problems hard to see later if the script breaks, and if there is anything after it (including a space) the script can break and throw errors that are hard to track down.
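To illustrate with one of the lines from your script (a sketch; both forms are equivalent, but the second has no trailing backtick to break):
# Fragile: trailing backtick, breaks if anything (even a space) follows it
$patchesInstalled = $x.SecScan.check | where {$_.id -eq 500} | foreach{`
    $_.detail.updatedata | where {$_.isinstalled -eq "true"}} | Select -ExpandProperty KBID

# Safe: the line break after the scriptblock's opening brace needs no backtick
$patchesInstalled = $x.SecScan.check | where {$_.id -eq 500} | foreach{
    $_.detail.updatedata | where {$_.isinstalled -eq "true"}} | Select -ExpandProperty KBID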
Lastly, you loop through $x twice, and then $monthlyPatches once, and do a lot of individual writes to the log file. I would suggest creating an array, filling it with custom objects that have 3 properties (Store, Patch, and Present), and then outputting that at the end of the function. That changes things a little, but then your function outputs an object, which you could pipe to Export-CSV, or perhaps later have it do something else, but at least then you'd have it. To do that I'd run $x through a switch to see what is installed, then flesh out the array by setting all of the monthly patches that aren't already in it to Unknown. That would go something like:
Function Check-Patches{
    Param($FileName)
    $logname = "C:\temp\PatchVerify\$FileName.csv"
    [xml]$x = type "C:\Users\Cambridge\SecurityScans\$FileName.mbsa"
    $PatchStatus = @()
    #This list is created based on a text file that is provided.
    $monthlyPatches = GC "C:\Temp\PatchVerify\Patches_NA.txt" | ?{$_ -match "-KB(?<KB>\d+)"} | %{$matches.KB}
    #Create objects for all the patches in the update log that were in the monthly list.
    Switch($x.SecScan.Check | ?{$_.KBID -in $monthlyPatches -and $_.id -eq 500}){
        {$_.detail.updatedata.isinstalled -eq "true"} {$PatchStatus += [PSCustomObject][Ordered]@{Store=$FileName;Patch=$_.KBID;Present="YES"}; Continue}
        {$_.detail.updatedata.isinstalled -eq "false"} {$PatchStatus += [PSCustomObject][Ordered]@{Store=$FileName;Patch=$_.KBID;Present="NO"}; Continue}
    }
    #Populate all of the monthly patches that weren't found on the machine as installed or failed
    $monthlyPatches | ?{$_ -notin $PatchStatus.Patch} | %{$PatchStatus += [PSCustomObject][Ordered]@{Store=$FileName;Patch=$_;Present="Unknown"}}
    #Output results
    $PatchStatus
}
#Check patches on current computer
Check-Patches "$env:ComputerName" | Export-Csv "C:\temp\PatchVerify\$env:ComputerName.csv" -NoTypeInformation
#Check patches on POS devices
For($i=1; $i -le 60; $i++){
    Check-Patches "172.26.210.$i" | Export-Csv "C:\temp\PatchVerify\172.26.210.$i.csv" -NoTypeInformation
}