Why does this PowerShell ForEach loop get slower with each iteration?

My code is working as expected. I'm really just curious to know if anyone has any idea why what I describe below might be happening. That said, if anyone has any ideas for further optimising the routine, I'd gratefully accept them in the spirit of every day being a school day!
The script is querying all our domain controllers to get the most recent lastLogon attribute for all users in a particular OU. (I am doing this instead of using the lastLogonTimeStamp attribute because I need the absolute most recent logon as of the moment the script is run.)
During testing, to check the code was doing what I expected it to do, I added some console output (included in the snippet below).
When I did this I noticed that with each iteration of the SECOND ForEach ( $DC in $AllDCs ) loop, there was a noticeable pause before the nested loop wrote its first line of console output. The duration of the pause increased with each iteration of the outer loop, and the speed of the inner loop's subsequent output also dropped noticeably. Over the course of the run, looking at the output of a dozen or so DCs, I'd estimate the rate at which console lines were written dropped by at least a factor of 4.
$AllDCs = Get-ADDomainController -Filter *
$AllRecords = @{}
ForEach ( $DC in $AllDCs ) {
    $UserList = Get-ADUser -filter * -Server $DC -SearchBase $ini.OUDN.NewStartsInternal -SearchScope OneLevel -Properties lastLogon
    $UserList = $UserList | Where { $_.SamAccountName -match $ini.RegEx.PayNo }
    $AllRecords.Add($DC.Hostname,$UserList)
}
$Logons = @{}
ForEach ( $DC in $AllDCs ) { # this loop is the one I'm talking about
    ForEach ( $User in $AllRecords.$($DC.HostName) ) {
        If ( $Logons.ContainsKey($User.SamAccountName) ) { # this line amended on advice from mklement0
            If ( $Logons.$($User.SamAccountName) -lt $User.lastLogon ) {
                $Logons.$($User.SamAccountName) = $User.lastLogon
                Write-Host "Updated $($User.SamAccountName) to $($User.lastLogon)" -ForegroundColor Green
            } Else {
                Write-Host "Ignored $($User.SamAccountName)"
            }
        } Else {
            $Logons.Add( $User.SamAccountName , $User.lastLogon )
            Write-Host "Created $($User.SamAccountName)" -ForegroundColor Yellow
        }
    }
}
I'm not really under any time constraints here, as we're only talking about a couple of hundred users and a dozen or so domain controllers. I've already reduced the runtime by a huge amount anyway: previously the script looped through the users and queried every DC for every user, one by one, which, unsurprisingly, took far longer.
UPDATE:
I implemented mklement0's suggestion from the first comment below, and if anything the script is actually running more slowly than before. The delays between iterations of the outer loop are longer, and the output from the inner loop seems subject to random delays. On average I'd say the inner loop is getting through about 2 to 3 iterations per second, which to my mind is extremely slow for looping through data that is already held in local memory. I know PowerShell has a reputation for being slow, but this seems exceptional.
The script normally runs on a VM so I tested it on my own computer and it was a LOT slower, so this isn't a resource issue with the VM.
UPDATE 2:
I removed all the Write-Host commands and simply timed each iteration of the outer loop.
First of all, removing all the console writes increased performance dramatically, which I expected, although I didn't realise by how much. It easily cut the run time to a fifth of what it had been.
In terms of the loop times, the strange behaviour is still there. Out of twelve iterations, the first seven are done within 1 second, and getting through the final five takes about 35 seconds. This behaviour repeats more or less the same every time. There is nothing different about the hostnames of the final five DCs compared to the first seven that may be slowing down the hashtable lookup.
I'm very happy with the run time as it is now, but still utterly perplexed about this weird behaviour.
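For anyone wanting to reproduce the timing, a [System.Diagnostics.Stopwatch] works well for this kind of measurement. A minimal sketch, assuming the $AllDCs and $AllRecords variables from the snippet above and omitting the actual comparison logic:
$sw = [System.Diagnostics.Stopwatch]::new()
$timings = foreach ( $DC in $AllDCs ) {
    $sw.Restart()
    ForEach ( $User in $AllRecords.$($DC.HostName) ) {
        # ... same inner-loop logic as above, without the Write-Host calls ...
    }
    [pscustomobject]@{ DC = $DC.HostName; Seconds = $sw.Elapsed.TotalSeconds }
}
$timings | Format-Table -AutoSize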

So, this is my take on your code. I didn't change many things, but I have a feeling this should be a bit faster (I may be wrong).
Note, I'm using LDAPFilter = '(LastLogon=*)' instead of Filter = '*' because, since it's an attribute that is not replicated across the domain, it might save time when querying each Domain Controller. Change it back to Filter = '*' if that doesn't work :(
It should avoid returning users that don't have the LastLogon attribute set, which could save some time.
$AllDCs = Get-ADDomainController -Filter *
$logons = @{}
$params = @{
    LDAPFilter  = '(LastLogon=*)' # Use this instead if that didn't work => Filter = '*'
    Server      = ''
    SearchBase  = $ini.OUDN.NewStartsInternal
    SearchScope = 'OneLevel'
    Properties  = 'lastLogon'
}
foreach($DC in $AllDCs) {
    $params.Server = $DC
    $UserList = Get-ADUser @params
    foreach($user in $UserList) {
        if($user.samAccountName -notmatch $ini.RegEx.PayNo) {
            continue
        }
        if($logons[$user.samAccountName].LastLogon -lt $user.LastLogon) {
            # On the first loop iteration we should always enter
            # this condition because
            # $null -lt [datetime]::MaxValue => True AND
            # $null -lt [datetime]::MinValue => True
            # Assuming $user.LastLogon is always a populated attribute,
            # which also explains my LDAPFilter = '(LastLogon=*)' from before
            $logons[$user.samAccountName] = $user
        }
    }
}
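One side note, in case it's useful: lastLogon comes back from Get-ADUser as a raw FILETIME (an Int64), so the -lt comparison above compares the raw numbers, but for a readable report you would probably convert it at the end. A rough sketch, assuming $logons has been populated as above:
$logons.Values | ForEach-Object {
    # Convert the raw FILETIME to a DateTime; leave $null if the attribute is 0/empty
    $last = if ($_.LastLogon) { [DateTime]::FromFileTime($_.LastLogon) } else { $null }
    [pscustomobject]@{
        SamAccountName = $_.SamAccountName
        LastLogon      = $last
    }
} | Sort-Object LastLogon -Descending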

Related

Powershell - Test if user is part of any Groups

I'm having trouble getting a small PS script to work the way I want it to.
Powershell version: 5.1
Ultimate goal: I want to parse all existing local User accounts and generate a list of those that are not part of any local Groups. (This is not an AD environment, nor part of a domain. This is all just local OS accounts and Groups.) Also, I am only interested in the Groups that I have manually created - I don't care about the default System Groups. (More on how I try to achieve this, later.) And then, after I get this part working, this output (list of User names) will then be used (in the future - not shown in this code) as input to another block that will add these not-in-a-Group users to a Group.
I found an example bit of code - which worked, and resulted in the correct set of user names. But it was slow - it took >5 minutes. So I wanted something faster.
The approach that I'm working with now generally seems like it will work, and is pretty quick. I'm just having trouble getting it to restrict things down correctly. It seems I'm having trouble referencing properties of objects returned by the cmdlets. (See code a bit further down. There are various counters and write-host steps in here, too, that are purely for helping me see what's going on and debugging - they aren't required for the actual task.)
My general outline of how I am going about this:
Compile a list of all Users in all Groups
But, I only want Groups that have no Description - these are the ones that I have created on the machine. The OS default Groups all have a Description. This will help me narrow down my list of Users.
Loop over the list of all Users on the system
Compare each user to the List from Step 1
If User is on the List, then that User is IN a Group, so skip it
If User is not on the List, then save/report that Name back
[Side note - the user 'WDAGUtilityAccount' turned out to also not be in any Groups, but I don't want to change that one; so have a specific test to exclude it from the list.]
[original version of code]
# --- Part 1
$UsersList = foreach ( $aGroup in get-localgroup ) {
    $groupcount++
    # Only want Groups that don't have a Description
    if ( $null -eq $aGroup.Description ) {
        Get-LocalGroupMember $aGroup.name
    }
    write-host "$groupCount -- $aGroup.Name _ $aGroup.Description"
}
# Just for debugging - to see the content of Part 1
write-host "List = $UsersList"
# ---- Part 2
Get-LocalUser | ForEach-Object {
    $count++
    $name = $_.Name
    if ( $name -eq "WDAGUtilityAccount" ) {
        # nothing - skip this one
    } elseif ( $UsersList.Name -inotcontains "$env:COMPUTERNAME\$_" ) {
        $count2++
        write-host "$Count ($Count2) : $name -- not a part of any groups"
    }
}
It appears that my attempts to extract the Properties of the Group in Part 1 are failing - I am getting the literal text 'Name' and 'Description', instead of the aGroup.Name and aGroup.Description property values. So my output looks like "MyGroup1.Name" instead of "MyGroup1" (assuming the actual name of the group is 'MyGroup1').
I believe that I am approaching the 'Group' object correctly - when I do 'get-localgroup | get-member' it says that it is a 'LocalGroup' object, with various properties (Name and Description being two of those).
There may be many other approaches to this - I'm interested in hearing general ideas; but I would also like to know the specific issues with my current code - it's a learning exercise. :)
Thanks, J
[ Version 2 of code - after some suggestions... ]
$UsersList = foreach ( $aGroup in get-localgroup ) {
    $groupcount++
    #Write-Host $aGroup.Description
    # Only want Groups that don't have a Description (i.e. are the Groups that we created, not the default System/OS groups)
    if ( $null -eq $($aGroup.Description) ) {
        $UserList += Get-LocalGroupMember $($aGroup.name)
        write-host "$groupCount -- $($aGroup.Name) _ $($aGroup.Description)"
    }
}
write-host "List = $UsersList"
Get-LocalUser | ForEach-Object {
    $count++
    $name = $_.Name
    if ( $name -eq "WDAGUtilityAccount" ) {
        # nothing - skip this one
    } elseif ( $($UsersList.name) -inotcontains "$env:COMPUTERNAME\$_" ) {
        $count2++
        write-host "$Count ($Count2) : $name -- not a part of any groups"
    }
}
I believe this could be reduced to:
Store all members of any group where the group's Description is null.
Get all local users and filter where their user's Name is not equal to WDAGUtilityAccount and they are not part of the stored member's SID array.
$members = Get-LocalGroup | Where-Object { -not $_.Description } | Get-LocalGroupMember
Get-LocalUser | Where-Object {
    $_.Name -ne 'WDAGUtilityAccount' -and $_.SID -notin $members.SID
} | Format-Table -AutoSize
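As for why the original version printed literal 'Name' and 'Description': inside a double-quoted string, PowerShell only expands the variable itself, so "$aGroup.Name" expands $aGroup (the whole object) and then appends the literal text '.Name'. The $() subexpression operator, which Version 2 already uses, is the fix. A quick illustration:
$aGroup = Get-LocalGroup | Select-Object -First 1
Write-Host "$aGroup.Name"      # expands $aGroup, then appends the literal text ".Name"
Write-Host "$($aGroup.Name)"   # subexpression: expands the value of the Name property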

Powershell - Remotely Query if User Exists Across Domain [Fastest]

Abstract
So I work for a company that has roughly 10k computer assets on my domain. My issue is the time it takes to query if a user exists on a computer to see if they've ever logged into said computer. We need this functionality for audits in case they've done something they shouldn't have.
I have two methods in mind that I've researched to complete this task, and a third alternative solution I have not thought of:
-Method A: Querying every computer for "C:\Users\<USER>" to see if the LocalPath exists
-Method B: Checking every computer's registry for "HKU:\<SID>" to see if the SID exists
-Method C: You are all smarter than me and have a better way? XD
Method A Function
$AllCompFound = @()
$AllADComputers = Get-ADComputer -Properties Name -SearchBase "WhatsItToYa" -filter 'Name -like "*"' | Select-Object Name
ForEach($Computer in $AllADComputers) {
    $CName = $Computer.Name
    if (Get-CimInstance -ComputerName "$CName" -ClassName Win32_Profile | ? {"C:\Users\'$EDIPI'" -contains $_.LocalPath}) {
        $AllCompFound += $CName
    } else {
        #DOOTHERSTUFF
    }
}
NOTE: I have another function that prompts me to enter a username to check for. Where I work they are numbers, so case sensitivity is not an issue. My issue with this function is that I believe the 'if' statement returns true every time because it ran, rather than because it matched the username.
Method B Function
$AllCompFound = @()
$AllADComputers = Get-ADComputer -Properties Name -SearchBase "WhatsItToYa" -filter 'Name -like "*"' | Select-Object Name
$hive = [Microsoft.Win32.RegistryHive]::Users
ForEach($Computer in $AllADComputers) {
    try {
        $base = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey($hive, $Computer.Name)
        $key = $base.OpenSubKey($strSID)
        if (!$key) {
            #DOSTUFF
        } else {
            $AllCompFound += $Computer.Name
            #DOOTHERSTUFF
        }
    } catch {
        #IDONTTHROWBECAUSEIWANTITTOCONTINUE
    } finally {
        if ($key) {
            $key.Close()
        }
        if ($base) {
            $base.Close()
        }
    }
}
NOTE: I have another function that converts the username into a SID prior to this function. It works.
Where my eyes start to glaze over is using Invoke-Command and actually returning a value back, and whether or not to run each of these queries in its own PSSession. My Method A returns false positives and my Method B seems to hang on some computers.
Neither of these methods is really fast enough to get through 10k results; I've been using smaller pools of computers to test the results when requested. I'm by no means an expert, but I think I have a good understanding, so any help is appreciated!
First, use WMI Win32_UserProfile, not C:\Users or the registry.
Second, have each PC report into some central database, rather than querying every PC from a server. That usually works much better.
About GPO: if you get access, you can add/remove a scheduled task for such reports through GPP (not GPO) from time to time.
Third: use PoshRSJob to make parallel queries.
Get-WmiObject -Class 'Win32_UserProfile' |
    Select @(
        'SID',
        @{
            Name       = 'LastUseTime'
            Expression = { $_.ConvertToDateTime($_.LastUseTime) }
        },
        @{
            Name       = 'NTAccount'
            Expression = { [System.Security.Principal.SecurityIdentifier]::new($_.SID).Translate([System.Security.Principal.NTAccount]) }
        }
    )
Be careful with translating to NTAccount: if a SID does not translate, it will throw an error, so it may be better not to collect NTAccount from the user profiles at all.
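For example, a guarded translation might look something like this (just a sketch; the function name is made up):
# Translate a SID to an NTAccount, returning $null instead of throwing on orphaned SIDs
Function Convert-SidToNTAccount {
    Param([string]$Sid)
    try {
        [System.Security.Principal.SecurityIdentifier]::new($Sid).Translate([System.Security.Principal.NTAccount]).Value
    } catch {
        $null   # the SID no longer resolves (deleted account, broken trust, etc.)
    }
}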
If you have no other options, run parallel jobs using PoshRSJob.
Example of parallelizing (there may be some typos):
$ToDo = [System.Collections.Concurrent.ConcurrentQueue[string]]::new() # This is Queue (list) of computers that SHOULD be processed
<# Some loop through your computers #>
<#...#> $ToDo.Enqueue($computerName)
<#LoopEnd#>
$result = [System.Collections.Concurrent.ConcurrentBag[Object]]::new() # This is Bag (list) of processing results
# This function has ComputerName on input, and outputs some single value (object) as a result of processing this computer
Function Get-MySpecialComputerStats
{
    Param(
        [String]$ComputerName
    )
    <#Some magic#>
    # Here we make a PSCustomObject from a Hashtable. This is the result object
    return [PSCustomObject]@{
        ComputerName        = $ComputerName
        Result              = 'OK'
        SomeAdditionalInfo1 = 'whateverYouWant'
        SomeAdditionalInfo2 = 42 # Because 42
    }
}
# This is the script that runs in the background. It cannot output anything.
# It takes 2 args: 1st is the input queue, 2nd is the output bag
$JobScript = [scriptblock]{
    $inQueue = [System.Collections.Concurrent.ConcurrentQueue[string]]$args[0]
    $outBag = [System.Collections.Concurrent.ConcurrentBag[Object]]$args[1]
    $compName = $null
    # Logging inside, if you need it
    $log = [System.Text.StringBuilder]::new()
    # We work until inQueue is empty (then TryDequeue will return false)
    while($inQueue.TryDequeue([ref] $compName) -eq $true)
    {
        $r = $null
        try
        {
            $r = Get-MySpecialComputerStats -ComputerName $compName -EA Stop
            [void]$log.AppendLine("[_]: $($compName) : OK!")
            [void]$outBag.Add($r) # We append the result to outBag
        }
        catch
        {
            [void]$log.AppendLine("[E]: $($compName) : $($_.Exception.Message)")
        }
    }
    # We return the log
    return $log.ToString()
}
# Some progress counters
$i_max = $ToDo.Count
$i_cur = $i_max
# We start 20 jobs. Don't forget to make our function available inside the jobs
$jobs = @(1..20) <# Run 20 threads #> | % { Start-RSJob -ScriptBlock $JobScript -ArgumentList @($ToDo, $result) -FunctionsToImport 'Get-MySpecialComputerStats' }
# And once every 3 seconds we check how many entries are left in the queue ($ToDo)
while ($i_cur -gt 0)
{
    Write-Progress -Activity 'Working' -Status "$($i_cur) left of $($i_max) computers" -PercentComplete (100 - ($i_cur / $i_max * 100))
    Start-Sleep -Seconds 3
    $i_cur = $ToDo.Count
}
# When it reaches zero, we wait for the jobs to finish their last items and return their logs, which we then collect
$logs = $jobs | % { Wait-RSJob -Job $_ } | % { Receive-RSJob -Job $_ }
# Logs is LOGS, not result
# Result is in the result variable.
$result | Export-Clixml -Path 'P:/ath/to/file.clixml' # Exporting result to CliXML file, or whatever you want
Please be careful: $JobScript produces no output while it runs, so it has to work correctly on its own, and the Get-MySpecialComputerStats function must be tested against unusual cases so that it always returns a value that can be interpreted.

Which part of this Powershell code snippet is making it take a long time to run?

I'm tasked with making a report of the last logon time for each user in our AD env. I obviously first asked mother Google for something that I could repurpose, but couldn't find anything that would check multiple Domain Controllers and reconcile the most recent logon, and then spit out whether it was past an arbitrarily set date/number of days.
Here's the code:
foreach ($user in $usernames) {
    $percentCmpUser = [math]::Truncate(($usernames.IndexOf($user)/$usernames.count)*100)
    Write-Progress -Id 3 -Activity "Finding Inactive Accounts" -Status "$($percentCmpUser)% Complete:" -PercentComplete $percentCmpUser
    $allLogons = $AllUsers | Where-Object {$_.SamAccountName -match $user}
    $finalLogon = $allLogons | Sort-Object LastLogon -Descending | Select-Object -First 1
    if ($finalLogon.LastLogon -lt $time.ToFileTime()) {
        $inactiveAccounts += $finalLogon
    }
}
$usernames is a list of about 6000 usernames
$AllUsers is a list of 18000 users; it includes 10 different properties that I'd like to have access to in my final report. The way I got it was by hitting three of our 20 or so DCs for all users in the specific OUs that I'm concerned with. The final script will actually be 6k*20 because I do need to hit every DC to make sure I don't miss any user's logon.
Here's how $time is calculated:
$DaysInactive = 60
$todayDate = Get-Date
$time = ($todayDate).Adddays(-($DaysInactive))
Each variable is used elsewhere in the script, which is why I break it out like that.
Before you suggest LastLogonTimestamp, I was told it's not current enough and when I asked about changing the replication time to be more current I was told "no, not gonna happen".
Search-ADAccount also doesn't seem to offer an accurate view of inactive users.
I'm open to all suggestions about how to make this specific snippet run faster or on how to use a different methodology to achieve the same result in a fast time.
As of now hitting each DC for all users in specific OUs takes about 10-20sec per DC and then the above snippet takes 30-40 min.
Couple of things stand out, but likely the biggest performance killer here is these two statements:
$percentCmpUser = [math]::Truncate(($usernames.IndexOf($user)/$usernames.count)*100)
# and
$allLogons = $AllUsers | Where-Object {$_.SamAccountName -match $user}
... both of these statements will exhibit O(N^2) (or quadratic) performance characteristics - that is, every time you double the input size, the time taken quadruples!
Array.IndexOf() is effectively a loop
Let's look at the first one:
$percentCmpUser = [math]::Truncate(($usernames.IndexOf($user)/$usernames.count)*100)
It might not be self-evident, but this method call, $usernames.IndexOf(), might require iterating through the entire list of $usernames every time it executes - by the time you reach the last $user, it needs to go through and compare $user against all 6000 items.
Two ways you can address this:
Use a regular for loop:
for($i = 0; $i -lt $usernames.Count; $i++) {
    $user = $usernames[$i]
    $percent = ($i / $usernames.Count) * 100
    # ...
}
Stop outputting progress altogether
Write-Progress is really slow - even if the caller suppresses Progress output (eg. $ProgressPreference = 'SilentlyContinue'), using the progress stream still carries overhead, especially when called in every loop iteration.
Removing Write-Progress altogether would remove the requirement for calculating percentage :)
If you still need to output progress information you can shave off some overhead by only calling Write-Progress sometimes - for example once every 100 iterations:
for($i = 0; $i -lt $usernames.Count; $i++) {
    $user = $usernames[$i]
    if($i % 100 -eq 0){
        $percent = ($i / $usernames.Count) * 100
        Write-Progress -Id 3 -Activity "Finding Inactive Accounts" -PercentComplete $percent
    }
    # ...
}
... |Where-Object is also just a loop
Now for the second one:
$allLogons = $AllUsers | Where-Object {$_.SamAccountName -match $user}
... 6000 times, powershell has to enumerate all 18000 objects in $AllUsers and test them for the Where-Object filter.
Instead of using an array and Where-Object, consider loading all users into a hashtable:
# Only need to run this once, before the loop
$AllLogonsTable = @{}
$AllUsers | ForEach-Object {
    # Check if the hashtable already contains an item associated with the user name
    if(-not $AllLogonsTable.ContainsKey($_.SamAccountName)){
        # Before adding the first item, create an array we can add subsequent items to
        $AllLogonsTable[$_.SamAccountName] = @()
    }
    # Add the item to the array associated with the username
    $AllLogonsTable[$_.SamAccountName] += $_
}
foreach($user in $usernames){
    # This will be _much faster_ than $AllUsers |Where-Object ...
    $allLogons = $AllLogonsTable[$user]
}
Hashtables have crazy-fast lookups - finding an object by key is much faster than using Where-Object on an array.
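As a side note, another way to build the same lookup (not required, just an alternative to the loop above) is Group-Object with -AsHashTable, which produces the same username-to-array mapping in one line:
# Alternative sketch: each hashtable value is the array of user objects sharing that SamAccountName
$AllLogonsTable = $AllUsers | Group-Object -Property SamAccountName -AsHashTable -AsString
$allLogons = $AllLogonsTable[$user]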

Powershell While Loop not Working as Intended

So here's what I'm attempting to do:
I manually input a name, and then I want to get a list of users who work under the person whose name I input (extensionattribute9 is who the user works under). However, for each person that works under that person, I also want to run the process for them, and see if anyone works under them as well. I want this process to continue until no one works under the current user.
I've managed do to this up to 3 times without a while loop, but as I don't know how deep I would have to go to get everyone, I feel using a while loop would be better overall, especially in terms of code length.
Here is the code I currently have:
$users = Get-ADUser -Filter * -Properties extensionattribute9,Displayname,mail,title
$users | ForEach-Object {
    if ($_.extensionattribute9 -like '*Lynn, *')
    {
        $_ | select Displayname,userprincipalname,title,extensionattribute9
        $name = $_.Displayname
        while ($_.extensionattribute9 -ne $null){
            $users | ForEach-Object {
                if ($_.extensionattribute9 -eq $name)
                {
                    $_ | select Displayname,userprincipalname,title,extensionattribute9
                    $name = $_.Displayname
                }
            }
        }
    }
}
When I run the code I get a user (User A) under 'Lynn', and then a user under User A. After that, nothing. The program still continues to run, but nothing gets returned. I'm guessing it's stuck in an infinite cycle, but I don't know where better to put the while loop. Help?
It sounds like you are trying to do a recursive search with nested while/for-each loops which can end badly. You can try something like this:
Function Get-Manager {
    param([Object]$User)
    $ManagerName = $User.extensionattribute9
    # Recursion base case
    if ($ManagerName -eq $null){
        return
    }
    # Do what you want with the user here
    $User | select Displayname, userprincipalname, title, extensionattribute9
    # Recursive call to find the manager's manager
    $Manager = Get-ADUser -Filter "Name -like '$ManagerName'" -Properties extensionattribute9,Displayname,mail,title
    Get-Manager $Manager
}
# Loop through all AD users as you did before
$Users = Get-ADUser -Filter * -Properties extensionattribute9,Displayname,mail,title
Foreach ($User in $Users) {
    Get-Manager $User
}
Please note I don't have experience using Powershell with Active Directory so the syntax may be incorrect, but shouldn't be hard to fix. Hope this helps!
I'm not familiar with Powershell, but one possible reason you're having trouble is that $_ is being used to mean two different things, depending on whether you use it inside the while loop or not. Is Powershell really smart enough to know what you mean?
More important: the code
$_ | select Displayname,userprincipalname,title,extensionattribute9
$name = $_.Displayname
appears in two places close together. This is a definite code smell. It should appear once and only once.
When you're traversing a hierarchy and you don't know how deep it will go, you must use a recursive algorithm (a function that calls itself). Here it is in pseudocode:
function myFunc ($node) {
    //report the name of this node
    echo "node $node found." ;
    // check to see if this node has any child nodes
    array $children = getChildNodes ($node) ;
    if (count($children) == 0) {
        //no child nodes, get out of here
        return ;
    }
    //repeat the process for each child
    foreach($children as $child) {
        myFunc($child) ;
    }
}
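Translated into PowerShell for this specific question, the same idea might look roughly like this (an untested sketch: it assumes extensionattribute9 holds the manager's display name as in the question, fetches $users once up front, and has no protection against cycles in the hierarchy):
$users = Get-ADUser -Filter * -Properties extensionattribute9,Displayname,mail,title

Function Get-Reports {
    Param([string]$ManagerName)
    # Everyone whose extensionattribute9 says they work under $ManagerName
    foreach ($report in ($users | Where-Object { $_.extensionattribute9 -eq $ManagerName })) {
        $report | Select-Object Displayname, userprincipalname, title, extensionattribute9
        Get-Reports -ManagerName $report.Displayname   # recurse: anyone working under this person
    }
}

# Start from the person whose name was typed in (pattern taken from the question)
$users | Where-Object { $_.extensionattribute9 -like '*Lynn, *' } | ForEach-Object {
    $_ | Select-Object Displayname, userprincipalname, title, extensionattribute9
    Get-Reports -ManagerName $_.Displayname
}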

Powershell assistance

I am currently using the below PS script to check if the current month's MS patches are installed on the system. The script is set to check the $env:COMPUTERNAME.mbsa and the Patch_NA.txt file and send the result to the $env:COMPUTERNAME.csv file.
I now need to modify this script to also pull information from other POS devices in the same location (C:\Users\Cambridge\SecurityScans) and send the results to the $env:COMPUTERNAME.csv file.
The POS devices are listed like this:
172.26.210.1.mbsa
172.26.210.2.mbsa
172.26.210.3.mbsa
and so forth.
The IP range at all our locations (last octet) is 1 - 60. Any ideas on how I can set this up?
Script:
$logname = "C:\temp\PatchVerify\$env:COMPUTERNAME.csv"
[xml]$x = type "C:\Users\Cambridge\SecurityScans\$env:COMPUTERNAME.mbsa"
#This list is created based on a text file that is provided.
$montlyPatches = type "C:\Temp\PatchVerify\Patches_NA.txt" |
    foreach{ if ($_ -match "-KB(?<KB>\d+)"){ $matches.KB } }
$patchesNotInstalled = $x.SecScan.check | where {$_.id -eq 500} | foreach{`
    $_.detail.updatedata | where {$_.isinstalled -eq "false"}} | Select -expandProperty KBID
$patchesInstalled = $x.SecScan.check | where {$_.id -eq 500} | foreach{`
    $_.detail.updatedata | where {$_.isinstalled -eq "true"}} | Select -expandProperty KBID
"Store,Patch,Present" > $logname
$store = "$env:COMPUTERNAME"
foreach ($patch in $montlyPatches)
{
    $result = "Unknown"
    if ( $patchesInstalled -contains $patch)
    {
        $result = "YES"
    }
    if ( $patchesNotInstalled -contains $patch)
    {
        $result = "NO"
    }
    "$store,KB$($patch),$result" >> $logname
}
You can find lots of information on creating functions on the web, but a simple example would be:
Function Check-Patches{
    Param($FileName)
    $logname = "C:\temp\PatchVerify\$FileName.csv"
    [xml]$x=type "C:\Users\Cambridge\SecurityScans\$FileName.mbsa"
    The rest of your existing code goes here...
}
Check-Patches "$env:ComputerName"
For($i=1;$i -le 60;$i++){
    Check-Patches "172.26.210.$i"
}
If you need me to break down anything in that let me know and I'll go into further explanation, but from what you already have it looks like you have a decent grasp on PowerShell theory and just needed to know what resources are available.
Edit: I updated my example to better fit your script, having it accept a file name, and then applying that file name to the $logname and $x variables within the function.
The break down...
First we declare that we are creating a Function using the Function keyword. Following that is the name of the function that you will use later to call it, and an opening curly brace to start the scriptblock that makes up the actual function.
Next is the Param line, which in this case is very simple only declaring one variable as input. This could alternatively be done as Function Check-Patches ($FileName){ but when you start getting into more advanced functions that only gets confusing, so my recommendation is to stick with putting the parameters inside the function's scriptblock. This is the first thing you want inside of your function in most cases, excluding any Help that you would write up for the function.
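In other words, for a simple one-parameter function the two forms behave the same; the Param block just scales better once you add parameter attributes or defaults. A tiny illustration (the function names here are made up):
Function Check-PatchesA {
    Param($FileName)
    "Checking $FileName"
}
Function Check-PatchesB ($FileName) {
    "Checking $FileName"
}
Check-PatchesA "$env:ComputerName"   # both calls produce the same output
Check-PatchesB "$env:ComputerName"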
Then we have updated lines for $logname and [xml]$x that use the $FileName that the function gets as input.
After that comes all of your code that parses the patch logs, and outputs to your CSV, and the closing curly brace that ends the scriptblock, and the function.
Then we call it for the ComputerName, and run a For loop. The For loop runs everything between 1 and 60, and for each loop it uses that number as the last octet of the file name to feed into the function and check those files.
A few comments on the rest of your code. $monthlypatches = could be changed to = type | ?{$_ -match "-KB(?<KB>\d+)"}|%{$matches.KB} so that the results are filtered before the ForEach loop, which could cut down on some time.
On the $patchesInstalled and $patchesNotInstalled lines you don't need the backtick at the end of that line. You can naturally have a linebreak after the beginning of the scriptblock for a ForEach loop. Having it there can be hard to see later if the script breaks, and if there is anything after it (including a space) the script can break and throw errors that are hard to track down.
Lastly, you loop through $x twice, and then $monthlyPatches once, and do a lot of individual writes to the log file. I would suggest creating an array, filling it with custom objects that have 3 properties (Store, Patch, and Present), and then outputting that at the end of the function. That changes things a little bit, but then your function outputs an object, which you could pipe to Export-CSV, or maybe later you could want it to do something else, but at least then you'd have it. To do that I'd run $x through a switch to see if things are installed, then I'd flush out the array by setting all of the monthlypatches that aren't already in that array to Unknown. That would go something like:
Function Check-Patches{
    Param($FileName)
    $logname = "C:\temp\PatchVerify\$FileName.csv"
    [xml]$x=type "C:\Users\Cambridge\SecurityScans\$FileName.mbsa"
    $PatchStatus = @()
    #This list is created based on a text file that is provided.
    $monthlyPatches = GC "C:\Temp\PatchVerify\Patches_NA.txt" | ?{$_ -match "-KB(?<KB>\d+)"} | %{$matches.KB}
    #Create objects for all the patches in the updatelog that were in the monthly list.
    Switch($x.SecScan.Check | ?{$_.KBID -in $monthlyPatches -and $_.id -eq 500}){
        {$_.detail.updatedata.isinstalled -eq "true"}{$PatchStatus += [PSCustomObject][Ordered]@{Store=$FileName;Patch=$_.KBID;Present="YES"};Continue}
        {$_.detail.updatedata.isinstalled -eq "false"}{$PatchStatus += [PSCustomObject][Ordered]@{Store=$FileName;Patch=$_.KBID;Present="NO"};Continue}
    }
    #Populate all of the monthly patches that weren't found on the machine as installed or failed
    $monthlyPatches | ?{$_ -notin $PatchStatus.Patch} | %{$PatchStatus += [PSCustomObject][Ordered]@{Store=$FileName;Patch=$_;Present="Unknown"}}
    #Output results
    $PatchStatus
}
#Check patches on current computer
Check-Patches "$env:ComputerName" | Export-Csv "C:\temp\PatchVerify\$env:ComputerName.csv" -NoTypeInformation
#Check patches on POS Devices
For($i=1;$i -le 60;$i++){
    Check-Patches "172.26.210.$i" | Export-Csv "C:\temp\PatchVerify\172.26.210.$i.csv" -NoTypeInformation
}