Compare 2 arrays with wildcards in PowerShell

Say we have two .txt files like so:
users.txt
user.name1
user.name2
admin.name1
admin.name2
shares.txt
\\someserver\somefolder\user.name1
\\someserver\somefolder\user.name2
\\someserver\someotherfolder\admin.name1
\\someotherserver\somefolder\admin.name2
\\someplaceelse\somefolder\no.name
I want to compare these two sets of data and show only the rows in shares.txt that match a name in users.txt. I figure we need to use some sort of wildcard search, since the username is always included in the UNC path but will never be exactly the same. Here is an example of what I have tried so far, without success:
$users = Get-Content ".\users.txt"
$shares = Get-Content ".\shares.txt"
foreach ($line in $users)
{
if ("*$line*" -like "$shares")
{
Write-Host "this line matches $line"
}
}

foreach ($line in $users)
{
if ($shares -like "*$line*")
{
Write-Host "$($shares | where {$_ -like "*$line*"}) this line matches $line"
}
}
To get the intended information, the current value of the loop variable must be included in the search pattern.
The problem in your code is that the -like operator only accepts wildcards on the right-hand side of the expression, so the comparison has to be written the other way around.
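For illustration, a minimal sketch using the sample data above shows how the operand order matters:
# the pattern (with wildcards) must be the right-hand operand
'\\someserver\somefolder\user.name1' -like '*user.name1*'    # True
'*user.name1*' -like '\\someserver\somefolder\user.name1'    # False - the left side is treated as the value, not the pattern
# with an array on the left, -like returns the matching elements rather than a Boolean
$shares -like '*user.name1*'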

Related

Powershell - Matching text from array and string?

I have an array that simply pulls a list of numbers in one long column. I am trying to match it against some data in a string, and if it matches, I want it to state Down; otherwise it will state Up in the output CSV.
Here is my code:
IF($RESULTS -like $TEST)
{$Output = "DOWN"
}ELSE{
$Output = "UP"
}
$RESULTS is the array, and $TEST is the string. If I use -match it works, but -match matches on substrings, so it gives false positives. For example, if there is a 3 in the list as well as 638, it will mark them both as down. None of the other operators seem to work either: -eq, -like, etc.
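For illustration, the substring match that causes the false positive looks like this:
'638' -match '3'   # True  - the regex '3' is found inside '638'
'638' -eq '3'      # False - exact comparison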
What am I missing please?
Thanks much for any assistance!
EDIT:
Sample of Data in $TEST
2
3
5
Sample of Output of $RESULTS
5
628
Since 5 exists in both, my expected output would be DOWN and everything else would be UP.
It sounds like you have two arrays of numbers, and you want to test if the input array contains at least one of the values in the test array.
You can use Compare-Object, which compares two arrays and indicates which elements are different and, with -IncludeEqual, also which ones are the same:
if (
(Compare-Object -IncludeEqual $RESULTS $TEST).
Where({ $_.SideIndicator -eq '==' }).
Count -gt 0
) {
$Output = "DOWN"
}
else {
$Output = "UP"
}
As an aside:
You can use an if statement as an expression, which means you only need to specify $Output once:
$Output =
IF (<# conditional #>) {
"DOWN"
}
ELSE {
"UP"
}
In PowerShell (Core) 7+, you can use ?:, the ternary conditional operator for a more concise solution:
$Output = <# conditional #> ? 'DOWN' : 'UP'
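Combining the two, a sketch assuming the same $RESULTS and $TEST variables could read:
$Output = ((Compare-Object -IncludeEqual $RESULTS $TEST).Where({ $_.SideIndicator -eq '==' }).Count -gt 0) ? 'DOWN' : 'UP'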
I would do it by using foreach. Hope this might be helpful:
foreach ($result in $RESULTS){
if ($result -like $Test){
$OUTPUT = "Down"}
else{
$OUTPUT= "UP"}
}
In your edited question you show that variable $TEST is also an array, so in that case you can do
$TEST = 2,3,5
$RESULTS = 5,628
# compare both arrays for equal items
$Output = if ($TEST | Where-Object { $RESULTS -contains $_ }) {"DOWN"} else {"UP"}
$Output
In this case, $Output will be DOWN because both arrays have the number 5
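For reference, -contains simply tests array membership:
$RESULTS -contains 5    # True  - 5 is in 5,628
$RESULTS -contains 3    # False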
If, however, variable $TEST contains a multiline string, then first create an array out of it, like this:
$TEST = @'
2
3
5
'@
# convert the string to array by splitting at the newline
$TEST = $TEST -split '\r?\n' -ne ''
$RESULTS = 5,628
# compare both arrays for equal items
$Output = if ($TEST | Where-Object { $RESULTS -contains $_ }) {"DOWN"} else {"UP"}
$Output

How could I perform a search using an array file in PowerShell?

I'd like to know how I can perform a concatenation "inside" a PowerShell array in order to run a Windows command and get a specific part of its output. I produced the code below, but it doesn't work.
[string[]]$a = Get-Content -Path @("C:\users\me\file.txt")
foreach($name in $a){
$b = "Get-Process | net user '$a' /do"
Invoke-Expression $b | Select-String "Nome completo="
}
Thanks to anyone who can help me.
Use $name to refer to the individual array items being iterated ($a refers to the whole array):
foreach($name in $a){
Get-LocalUser $name |ForEach-Object FullName
}
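If the goal was specifically the net user output from the original snippet, a corrected version of that loop might look like the sketch below (this assumes domain accounts, hence /domain, and keeps the "Nome completo=" label from the question):
foreach ($name in $a) {
net user $name /domain | Select-String "Nome completo="
}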

Find lines in one object based on strings in another object

Finding lines in one file based on rows in another file.
I have one object $A with some rows like:
0c7d3283-bec2-4db1-9078-ebb79d21afdf
200bc957-26dd-4e8e-aa6e-00dc357c4ac2
218e0d2a-0e8b-4a68-8136-8f5dd749a614
I want to find matches in object $B for those rows and print the lines with the matches to an output file.
I've been trying for a week now (my first week in powershell :) )
I've come to:
$F = $B | ForEach-Object{ $A | Select-String -Pattern $_$ -AllMatches| Select-Object line }
but this doesn't return any results.
Who is willing to help me out?
If you want to match your first array against something that should match part of a string in a second array, you can do something like the code below:
$A = @("0c7d3283-bec2-4db1-9078-ebb79d21afdf", "200bc957-26dd-4e8e-aa6e-00dc357c4ac2", "218e0d2a-0e8b-4a68-8136-8f5dd749a614")
$B = @("Something 0c7d3283-bec2-4db1-9078-ebb79d21afdf", "Something else 200bc957-26dd-4e8e-aa6e-00dc357c4ac2", "Something also e3df3978-beb7-4545-bc48-ff40d8453be1")
foreach ($Line in $A) {
if($B -match $Line) {
$B | Where-Object {$_ -match $Line}
}
}
We first loop through all the lines in the first object, then check whether each line matches anything in the second array. If we find a match, we go through array B to find where the line from A matches.
You could make this code a hell of a lot prettier, but this is the most understandable way I can write it.
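As a sketch of a tidier variant: since -match applied to an array already returns the matching elements, the inner Where-Object isn't strictly needed ([regex]::Escape is added here in case a line contains regex metacharacters):
foreach ($Line in $A) { $B -match [regex]::Escape($Line) }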
Old Answer
You could use the Compare-Object cmdlet to compare the two arrays, then use the -IncludeEqual switch to show where there are matches and then use the -ExcludeDifferent switch to remove the results that do not match.
Then take that output and put it in a file. A simple test could be something like this:
$A = @("0c7d3283-bec2-4db1-9078-ebb79d21afdf", "200bc957-26dd-4e8e-aa6e-00dc357c4ac2", "218e0d2a-0e8b-4a68-8136-8f5dd749a614")
$B = @("0c7d3283-bec2-4db1-9078-ebb79d21afdf", "200bc957-26dd-4e8e-aa6e-00dc357c4ac2", "e3df3978-beb7-4545-bc48-ff40d8453be1")
(Compare-Object -ReferenceObject $A -DifferenceObject $B -ExcludeDifferent -IncludeEqual).InputObject | Out-File .\output.txt
This should output a file in your Shells current working directory with the two GUIDs that match:
0c7d3283-bec2-4db1-9078-ebb79d21afdf
200bc957-26dd-4e8e-aa6e-00dc357c4ac2
The one that didn't match is not included.

Fastest way to parse thousands of small files in PowerShell

I have over 16000 inventory log files ranging in size from 3-5 KB on a network share.
Sample file looks like this:
## System Info
SystemManufacturer:=:Dell Inc.
SystemModel:=:OptiPlex GX620
SystemType:=:X86-based PC
ChassisType:=:6 (Mini Tower)
## System Type
isLaptop=No
I need to put them into a DB, so I started parsing them and creating a custom object for each that I can later use to check duplicates, normalize etc...
The initial parse with the code snippet below took about 7.5 minutes.
Foreach ($invlog in $invlogs) {
$content = gc $invlog.FullName -ReadCount 0
foreach ($line in $content) {
if ($line -match '^#|^\s*$') { continue }
$invitem,$value=$line -split ':=:'
[PSCustomObject]@{Name=$invitem;Value=$value}
}
}
I started optimizing it, and after several rounds of trial and error ended up with this, which takes 2 minutes and 4 seconds:
Foreach ($invlog in $invlogs) {
foreach ($line in ([System.IO.File]::ReadLines("$($invlog.FullName)") -match '^\w') ) {
$invitem,$value=$line -split ':=:'
[PSCustomObject]@{name=$invitem;Value=$value} #2.04mins
}
}
I also tried using a hash instead of PSCustomObject, but to my surprise it took much longer (5mins 26secs)
Foreach ($invlog in $invlogs) {
$hash=@{}
foreach ($line in ([System.IO.File]::ReadLines("$($invlog.FullName)") -match $propertyline) ) {
$invitem,$value=$line -split ':=:'
$hash[$invitem]=$value #5.26mins
}
}
What would be the fastest method to use here?
See if this is any faster:
Foreach ($invlog in $invlogs) {
@(gc $invlog.FullName -ReadCount 0) -notmatch '^#|^\s*$' |
foreach {
$invitem,$value = $_ -split ':=:'
[PSCustomObject]@{Name=$invitem;Value=$value}
}
}
The -match and -notmatch operators, when applied to an array, return all the elements that satisfy the match, so you can eliminate having to test every line individually for the lines to exclude.
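A minimal illustration with lines shaped like the sample file:
@('## System Info', 'SystemModel:=:OptiPlex GX620', '') -notmatch '^#|^\s*$'
# -> SystemModel:=:OptiPlex GX620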
Are you really wanting to create a PS Object for every line, or just one for every file?
If you want one object per file, see if this is any quicker:
The multi-line regex eliminates the line array, and a filter is used in place of the foreach to create the hash entries.
$regex = [regex]'(?ms)^(\w+):=:([^\r]+)'
filter make-hash { @{$_.groups[1].value = $_.groups[2].value} }
Foreach ($invlog in $invlogs) {
$regex.matches([io.file]::ReadAllText($invlog.fullname)) | make-hash
}
The objective of switching to the multi-line regex and [io.file]::ReadAllText() is to simplify what PowerShell is doing with the file input internally. The result of [io.file]::ReadAllText() is a string object, which is a much simpler type of object than the array of strings that [io.file]::ReadAllLines() produces, and requires less overhead to construct internally.
A filter is essentially just the Process block of a function: it runs once for every object that comes to it from the pipeline, so it emulates the action of ForEach-Object but actually runs slightly faster (I don't know the internals well enough to tell you exactly why).
Both of these changes require more coding and only result in a marginal increase in performance. In my testing, switching to the multi-line regex gained about 0.1 ms per file, and changing from ForEach-Object to the filter another 0.1 ms. You probably don't see these techniques used very often because of the low return compared to the additional coding work required, but it becomes significant when you start to multiply those fractions of a millisecond by 16K iterations.
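If you want to compare the approaches on your own data, a rough timing sketch like the one below can help (it assumes $invlogs already holds the file objects and that $regex and the make-hash filter above are defined):
Measure-Command {
foreach ($invlog in $invlogs) {
$regex.Matches([IO.File]::ReadAllText($invlog.FullName)) | make-hash | Out-Null
}
} | Select-Object TotalSeconds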
Try this:
Foreach ($invlog in $invlogs) {
$output = @{}
foreach ($line in ([IO.File]::ReadLines("$($invlog.FullName)") -ne '') ) {
if ($line.Contains(":=:")) {
$item, $value = $line.Split(":=:") -ne ''
$output[$item] = $value
}
}
New-Object PSObject -Property $output
}
As a general rule, Regex is sometimes cool but always slower.
Wouldn't you want an object per system, and not per key-value pair? :S
Like this. By replacing Get-Content with the .NET method you could probably save some time.
Get-ChildItem -Filter *.txt -Path <path to files> | ForEach-Object {
$ht = @{}
Get-Content $_ | Where-Object { $_ -match ':=:' } | ForEach-Object {
$ht[($_ -split ':=:')[0].Trim()] = ($_ -split ':=:')[1].Trim()
}
[pscustomobject]$ht
}
ChassisType SystemManufacturer SystemType SystemModel
----------- ------------------ ---------- -----------
6 (Mini Tower) Dell Inc. X86-based PC OptiPlex GX620

Powershell: Search data in *.txt files to export into *.csv

First of all, this is my first question here. I often come here to browse existing topics, but now I'm stuck on my own problem, and I haven't found a helpful resource so far. My biggest concern is that it might not be possible in PowerShell at all. At the moment I'm trying to build a small PowerShell tool to save me a lot of time. For those who don't know cw-sysinfo, it is a tool that collects information about any host system (e.g. Hardware-ID, Product Key and stuff like that) and generates *.txt files.
My point is, if you have 20, 30 or 80 servers in a project, it takes a huge amount of time to browse all the files, look for just the lines you need, and put them together in a *.csv file.
What I have working is more or less the basis of the tool: it browses all *.txt files in a specific path and checks for my keywords. And here is the problem: I can only search for the words that come before the ones I really need, as seen below:
Operating System: Windows XP
Product Type: Professional
Service Pack: Service Pack 3
...
I don't know how I can tell PowerShell to search for the "Product Type:" line and pick out the following "Professional" instead. Later on, with keys or serial numbers, it will be the same problem, which is why I can't simply search for "Standard" or "Professional".
I placed my keywords ($controls) in an extra file that I can attach to the project folders, so I don't need to edit the PowerShell code each time. The code looks like this:
Function getStringMatch
{
# Loop through the project directory
Foreach ($file In $files)
{
# Check all keywords
ForEach ($control In $controls)
{
$result = Get-Content $file.FullName | Select-String $control -quiet -casesensitive
If ($result -eq $True)
{
$match = $file.FullName
# Write the filename according to the entry
"Found : $control in: $match" | Out-File $output -Append
}
}
}
}
getStringMatch
I think this is the kind of thing you need. I've changed Select-String to not use the -Quiet option; this returns a MatchInfo object, one of whose properties is the matching line. I then split the line on the ':' and trim any spaces. The results are placed into a new PSObject, which in turn is added to an array. The array is then put back on the pipeline at the end.
I also moved the call to Get-Content so that each file is read only once.
# Create an array for results
$results = @()
# Loop through the project directory
Foreach ($file In $files)
{
# load the content once
$content = Get-Content $file.FullName
# Check all keywords
ForEach ($control In $controls)
{
# find the line containing the control string
$result = $content | Select-String $control -casesensitive
If ($result)
{
# tidy up the results and add to the array
$line = $result.Line -split ":"
$results += New-Object PSObject -Property @{
FileName = $file.FullName
Control = $line[0].Trim()
Value = $line[1].Trim()
}
}
}
}
# return the results
$results
Adding the results to a CSV is just a case of piping them to Export-Csv:
$results | Export-Csv -Path "results.csv" -NoTypeInformation
If I understand your question correctly, you want some way to parse each line from your report files and extract values for some "keys". Here are a few lines to give you an idea of how you could proceed. The example is for one file, but can be generalized very easily.
$config = Get-Content ".\config.txt"
# The stuff you are searching for
$keys = @(
"Operating System",
"Product Type",
"Service Pack"
)
foreach ($line in $config)
{
$keys | %{
$regex = "\s*?$($_)\:\s*(?<value>.*?)\s*$"
if ($line -match $regex)
{
$value = $matches.value
Write-Host "Key: $_`t`tValue: $value"
}
}
}
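To get from there to the CSV the question asks for, a sketch along these lines (reusing $config and $keys from above, and adding [regex]::Escape in case a key contains regex metacharacters) collects objects and pipes them to Export-Csv, as in the first answer:
$rows = foreach ($line in $config) {
foreach ($key in $keys) {
if ($line -match "\s*?$([regex]::Escape($key))\:\s*(?<value>.*?)\s*$") {
[pscustomobject]@{ Key = $key; Value = $Matches.value }
}
}
}
$rows | Export-Csv -Path "results.csv" -NoTypeInformation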