Flatten netstat command output in PowerShell

I am attempting to use netstat -bano and collect the output in PowerShell for some very specific reporting requirements.
I have a working regex that should be able to parse this output without a problem, but since each entry in the output spans multiple lines, the regex isn't matching correctly.
In the raw netstat output, the owner information added by -b appears indented on its own line(s) below each TCP/UDP entry.
The desired output is something like this (all on one line):
TCP 0.0.0.0:135 0.0.0.0:0 LISTENING 1092 RpcSs [svchost.exe]
TCP 0.0.0.0:445 0.0.0.0:0 LISTENING 4 Can not obtain ownership information
TCP 0.0.0.0:623 0.0.0.0:0 LISTENING 7404 [LMS.exe]
TCP 0.0.0.0:3389 0.0.0.0:0 LISTENING 1224 TermService [svchost.exe]
Using tools that don't ship with Windows isn't possible, so I'm confined to the built-in ones.
Using Get-Process matching on PID also won't work, as it hides sub-process information under svchost and lsass. netstat with -b is perfect because it shows both svchost.exe and the process that utilizes the port.
I've scoured the internet for a viable solution, but most threads end with a different resolution.
EDIT: here is my final script, using input from you guys:
$data = (netstat -bano | select -skip 4 | Out-String) -replace '(?m)^ (TCP|UDP)', '$1' -replace '\r?\n\s+([^\[])', "`t`$1" -replace '\r?\n\s+\[', "`t[" -split "`n"
[regex]$regex = '(?<protocol>TCP|UDP)\s+(?<address>\d+.\d+.\d+.\d+|\[::\]|\[::1\]):(?<port>\d+).+(?<state>LISTENING|\*:\*)\s+(?<pid>\d+)\s+(?<service>Can not obtain ownership information|\[\w+.exe\]|\w+\s+\[\w+.exe\])'
$output = @()
$data | foreach {
    $_ -match $regex
    $outputobj = @{
        protocol   = [string]$matches.protocol
        address    = [string]$matches.address -replace '\[::\]','[..]' -replace '\[::1\]','[..1]'
        port       = [int]$matches.port
        state      = [string]$matches.state -replace "\*:\*",'NA'
        pid        = [int]$matches.pid
        service    = ([string]$matches.service -replace 'Can not obtain ownership information','[System' -split '.*\[')[1] -replace '\]',''
        subservice = ([string]$matches.service -replace 'Can not obtain ownership information','' -split '\[.*\]')[0]
    }
    $output += New-Object -TypeName PSObject -Property $outputobj
}
$output | select address,port,protocol,pid,state,service,subservice

I would probably do something like this:
mangle the output into a single string:
netstat -bano | Out-String
remove the indentation of the lines beginning with UDP or TCP to make them distinguishable from the other lines:
-replace '(?m)^ (TCP|UDP)', '$1'
join all indented lines that don't begin with a square bracket to the line preceding them:
-replace '\r?\n\s+([^\[])', "`t`$1"
join all indented lines that do begin with a square bracket to the line preceding them:
-replace '\r?\n\s+\[', "`t["
Complete statement:
(netstat -bano | Out-String) -replace '(?m)^ (TCP|UDP)', '$1' -replace '\r?\n\s+([^\[])', "`t`$1" -replace '\r?\n\s+\[', "`t["
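If you then want objects rather than a regex over raw text, here's a minimal sketch (not part of the original answer; the property names are made up) that splits the flattened string back into lines and columns:
# Minimal sketch; assumes the flattened string produced by the complete statement above.
$flat = (netstat -bano | Out-String) -replace '(?m)^ (TCP|UDP)', '$1' -replace '\r?\n\s+([^\[])', "`t`$1" -replace '\r?\n\s+\[', "`t["
$flat -split '\r?\n' |
    Where-Object { $_ -match '^\s*TCP\b' } |   # UDP lines have no State column, so keep the sketch TCP-only
    ForEach-Object {
        $parts = $_.Trim() -split '\s+', 5     # proto, local, remote, state, PID + owner info
        [PSCustomObject]@{
            Protocol      = $parts[0]
            LocalAddress  = $parts[1]
            RemoteAddress = $parts[2]
            State         = $parts[3]
            PidAndOwner   = $parts[4]
        }
    }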

Building off what TheMadTechnician had, I found a few cases that broke that output. Specifically, netstat sometimes skips columns, and when a friendly app name and an executable are both output on separate lines, it breaks the CSV formatting. Here's an updated version of the code that takes those cases into account:
$netstat = (netstat -abn | Select -Skip 2) -replace "Address\s{2,}State", "Address,State,Application" -join "`n" `
    -split "(?= [TU][CD]P\s+(?:\d+\.|\[\w*:\w*:))" | % {
        $_.Trim() -replace "`n", ' ' `
            -replace '\*\:\*', '*:*,' `
            -replace '\s{2,}', ',' `
            -replace '(.*?),(.*?),(.*?),(.*?),(.*?),', '$1,$2,$3,$4,$5 '
    } | ConvertFrom-Csv

You can join the output of netstat so that it becomes one large multi-line string, then split it on lines that start with whitespace followed by TCP or UDP, followed by an IP address (to remove false positives of an application having a name of 'TCP tracker' or something). Then trim any whitespace from the beginning or end of the line, replace anywhere that there's two or more spaces with a comma, and push the results to ConvertFrom-Csv to create objects. From that you could filter, group, or just simply pipe to Format-Table to see the results.
$Netstat = (netstat -bano | Select -skip 2) -join "`n" -split "(?= [TU][CD]P\s+(?:\d+\.|\[\w*:\w*:))" |
ForEach-Object {$_.trim() -replace "`n",' ' -replace '\s{2,}',','} |
ConvertFrom-Csv
# Filter the results for TCP connections and pipe the results to Format-Table
$Netstat | Where {$_.Proto -eq 'TCP'} | Format-Table
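From those objects you can go further than Format-Table; a couple of illustrative follow-ups (not part of the original answer; Proto and State are property names taken from netstat's own header row):
$Netstat | Where-Object { $_.State -eq 'LISTENING' } | Format-Table
$Netstat | Group-Object Proto | Select-Object Name, Count
$Netstat | Export-Csv .\netstat.csv -NoTypeInformation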

How to make netstat output's headings show properly in out-gridview?

When I use:
netstat -f | out-gridview
in PowerShell 7.3, I get the window, but it has only one column, which is a string. I don't know why it's not properly creating a column for each of the headings like Proto, Local Address, etc.
How can I fix this?
While commenter Toni makes a good point to use Get-NetTCPConnection | Out-GridView instead, this answer addresses the question as asked.
To be able to show output of netstat in grid view, we have to parse its textual output into objects.
Fortunately, all fields are separated by at least two space characters, so after replacing these with a comma, we can simply use ConvertFrom-Csv (thanks to an idea of commenter Doug Maurer).
netstat -f |
# Skip unwanted lines at the beginning
Select-Object -skip 3 |
# Replace two or more white space characters by comma, except at start of line
ForEach-Object { $_ -replace '(?<!^)\s{2,}', ',' } |
# Convert into an object and add it to grid view
ConvertFrom-Csv | Out-GridView
For a detailed explanation of the RegEx pattern used with the -replace operator, see this RegEx101 demo page.
This is the code of my original answer, which is functionally equivalent. I'll keep it as an example of how choosing the right tool for the job can greatly simplify code.
$headers = @()
# Skip first 3 lines of output which we don't need
netstat -f | Select-Object -Skip 3 | ForEach-Object {
    # Split each line into columns
    $columns = $_.Trim() -split '\s{2,}'
    if( -not $headers ) {
        # First line is the header row
        $headers = $columns
    }
    else {
        # Create an ordered hashtable
        $objectProperties = [ordered] @{}
        $i = 0
        # Loop over the columns and use the header columns as property names
        foreach( $key in $headers ) {
            $objectProperties[ $key ] = $columns[ $i++ ]
        }
        # Convert the hashtable into an object that can be shown by Out-GridView
        [PSCustomObject] $objectProperties
    }
} | Out-GridView
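For completeness, the object-based route mentioned at the top (Get-NetTCPConnection, as commenter Toni suggested) needs no text parsing at all; a rough sketch for TCP connections:
# Rough sketch of the object-based alternative (Windows, NetTCPIP module); real objects, no parsing.
Get-NetTCPConnection |
    Select-Object LocalAddress, LocalPort, RemoteAddress, RemotePort, State, OwningProcess |
    Out-GridView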

How to add quotes and commas in PowerShell?

I have a CSV file from which I only need one column, called "SerialNumber". I need to combine the lines, remove any blank lines, wrap each value in quotes, and separate them with commas.
So far I have multiple lines of code that work, but they add quotes at the end and don't add quotes at the beginning.
$SerialList = import-csv .\log.csv | select -ExpandProperty Serialnumber | Out-File -FilePath .\Process.txt
(gc process.txt) | ? {$_.trim() -ne "" } | set-content process.txt
gc .\process.txt | %{$_ -replace '$','","'} | out-file process1.csv
Get-Content .\process1.csv| foreach {
$out = $out + $_
}
$out| Out-File .\file2.txt
Output:
SerialNumber
1234
1234
4567
4567
Expected Output:
"1234","1234","4567","4567"
Try the following (PSv3+):
(Import-Csv .\log.csv).SerialNumber -replace '^.*$', '"$&"' -join "," > .\file2.txt
(Import-Csv .\log.csv).SerialNumber imports the CSV file and .SerialNumber uses member-access enumeration to extract the SerialNumber column values as an array.
-replace '^.*$', '"$&"' encloses each array element in "...".
Regex ^.*$ matches each array element in full.
Replacement expression "$&" replaces the element with what was matched ($&) enclosed in " chars. - for background, see this answer
-join "," joins the resulting array elements with , as the separator.

How to avoid double quote when using export-csv in Powershell [duplicate]

I am using ConvertTo-Csv to get comma separated output
get-process | convertto-csv -NoTypeInformation -Delimiter ","
It outputs like:
"__NounName","Name","Handles","VM","WS",".....
However I would like to get output without quotes, like
__NounName,Name,Handles,VM,WS....
Here is a way to remove the quotes
get-process | convertto-csv -NoTypeInformation -Delimiter "," | % {$_ -replace '"',''}
But it has a serious drawback: if one of the items contains a ", it will be removed!
Hmm, I have PowerShell 7 preview 1 on my Mac, and Export-Csv has a -UseQuotes option that you can set to AsNeeded. :)
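On PowerShell 7 or later that looks roughly like this (both ConvertTo-Csv and Export-Csv accept the parameter; the other values are Always and Never):
# PowerShell 7+ only: quote a field only when it contains the delimiter, a quote, or a newline.
Get-Process | ConvertTo-Csv -Delimiter ',' -UseQuotes AsNeeded
Get-Process | Export-Csv .\processes.csv -UseQuotes AsNeeded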
I was working on a table today, thought about this very question while previewing the CSV file in Notepad, and decided to see what others had come up with. It seems many have over-complicated the solution.
Here's a real simple way to remove the quote marks from a CSV file generated by the Export-Csv cmdlet in PowerShell.
Create a TEST.csv file with the following data.
"ID","Name","State"
"5","Stephanie","Arizona"
"4","Melanie","Oregon"
"2","Katie","Texas"
"8","Steve","Idaho"
"9","Dolly","Tennessee"
Save As: TEST.csv
Store file contents in a $Test variable
$Test = Get-Content .\TEST.csv
Load $Test variable to see results of the get-content cmdlet
$Test
Load $Test variable again and replace all ( "," ) with a comma, then trim start and end by removing each quote mark
$Test.Replace('","',",").TrimStart('"').TrimEnd('"')
Save/Replace TEST.csv file
$Test.Replace('","',",").TrimStart('"').TrimEnd('"') | Out-File .\TEST.csv -Force -Confirm:$false
Test new file Output with Import-Csv and Get-Content:
Import-Csv .\TEST.csv
Get-Content .\TEST.csv
To sum it all up, the work can be done with two lines of code:
$Test = Get-Content .\TEST.csv
$Test.Replace('","',",").TrimStart('"').TrimEnd('"') | Out-File .\TEST.csv -Force -Confirm:$false
I ran into this issue, found this question, but was not satisfied with the answers because they all seem to suffer if the data you are using contains a delimiter, which should remain quoted. Getting rid of the unneeded double quotes is a good thing.
The solution below appears to solve this issue for a general case, and for all variants that would cause issues.
I found this answer elsewhere, Removing quotes from CSV created by PowerShell, and have used it to code up an example answer for the SO community.
Attribution: Credit for the regex, goes 100% to Russ Loski.
Code in a Function, Remove-DoubleQuotesFromCsv
function Remove-DoubleQuotesFromCsv
{
    param (
        [Parameter(Mandatory=$true)]
        [string]
        $InputFile,

        [string]
        $OutputFile
    )

    if (-not $OutputFile)
    {
        $OutputFile = $InputFile
    }

    $inputCsv = Import-Csv $InputFile
    $quotedData = $inputCsv | ConvertTo-Csv -NoTypeInformation

    $outputCsv = $quotedData | % {$_ -replace `
        '\G(?<start>^|,)(("(?<output>[^,"]*?)"(?=,|$))|(?<output>".*?(?<!")("")*?"(?=,|$)))' `
        ,'${start}${output}'}

    $outputCsv | Out-File $OutputFile -Encoding utf8 -Force
}
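A sample invocation (the paths are illustrative); per the parameter handling above, omitting -OutputFile rewrites the input file in place:
Remove-DoubleQuotesFromCsv -InputFile .\quoted.csv -OutputFile .\unquoted.csv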
Test Code
$csvData = @"
id,string,notes,number
1,hello world.,classic,123
2,"a comma, is in here","test data 1",345
3,",a comma, is in here","test data 2",346
4,"a comma, is in here,","test data 3",347
5,"a comma, is in here,","test data 4`r`nwith a newline",347
6,hello world2.,classic,123
"@
$data = $csvData | ConvertFrom-Csv
"`r`n---- data ---"
$data
$quotedData = $data | ConvertTo-Csv -NoTypeInformation
"`r`n---- quotedData ---"
$quotedData
# this regular expression comes from:
# http://www.sqlmovers.com/removing-quotes-from-csv-created-by-powershell/
$fixedData = $quotedData | % {$_ -replace `
'\G(?<start>^|,)(("(?<output>[^,"\n]*?)"(?=,|$))|(?<output>".*?(?<!")("")*?"(?=,|$)))' `
,'${start}${output}'}
"`r`n---- fixedData ---"
$fixedData
$fixedData | Out-File e:\test.csv -Encoding ascii -Force
"`r`n---- e:\test.csv ---"
Get-Content e:\test.csv
Test Output
---- data ---
id string notes number
-- ------ ----- ------
1 hello world. classic 123
2 a comma, is in here test data 1 345
3 ,a comma, is in here test data 2 346
4 a comma, is in here, test data 3 347
5 a comma, is in here, test data 4... 347
6 hello world2. classic 123
---- quotedData ---
"id","string","notes","number"
"1","hello world.","classic","123"
"2","a comma, is in here","test data 1","345"
"3",",a comma, is in here","test data 2","346"
"4","a comma, is in here,","test data 3","347"
"5","a comma, is in here,","test data 4
with a newline","347"
"6","hello world2.","classic","123"
---- fixedData ---
id,string,notes,number
1,hello world.,classic,123
2,"a comma, is in here",test data 1,345
3,",a comma, is in here",test data 2,346
4,"a comma, is in here,",test data 3,347
5,"a comma, is in here,","test data 4
with a newline","347"
6,hello world2.,classic,123
---- e:\test.csv ---
id,string,notes,number
1,hello world.,classic,123
2,"a comma, is in here",test data 1,345
3,",a comma, is in here",test data 2,346
4,"a comma, is in here,",test data 3,347
5,"a comma, is in here,","test data 4
with a newline","347"
6,hello world2.,classic,123
This is pretty similar to the accepted answer but it helps to prevent unwanted removal of "real" quotes.
$delimiter = ','
Get-Process | ConvertTo-Csv -Delimiter $delimiter -NoTypeInformation | foreach { $_ -replace '^"','' -replace "`"$delimiter`"",$delimiter -replace '"$','' }
This will do the following:
Remove quotes that begin a line
Remove quotes that end a line
Replace quotes that wrap a delimiter with the delimiter alone.
Therefore, the only way this would go wrong is if one of the values actually contained not only quotes, but specifically a quote-delimiter-quote sequence, which hopefully should be pretty uncommon.
Once the file is generated, you can run
set-content FILENAME.csv ((get-content FILENAME.csv) -replace '"')
Depending on how pathological (or "full-featured") your CSV data is, one of the posted solutions will already work.
The solution posted by Kory Gill is almost perfect - the only remaining issue is that quotes are also removed for cells containing the line separator \r\n, which causes issues in many tools.
The fix is to add a newline to the character class expression:
$fixedData = $quotedData | % {$_ -replace `
'\G(?<start>^|,)(("(?<output>[^,"\n]*?)"(?=,|$))|(?<output>".*?(?<!")("")*?"(?=,|$)))' `
,'${start}${output}'}
I wrote this for my needs:
function ConvertTo-Delimited {
    [CmdletBinding()]
    param(
        [Parameter(ValueFromPipeline=$true,Mandatory=$true)]
        [psobject[]]$InputObject,
        [string]$Delimiter='|',
        [switch]$ExcludeHeader
    )
    Begin {
        if ( $ExcludeHeader -eq $false ) {
            @(
                $InputObject[0].PsObject.Properties |
                    Select-Object -ExpandProperty Name
            ) -Join $Delimiter
        }
    }
    Process {
        foreach ($item in $InputObject) {
            @(
                $item.PsObject.Properties |
                    Select-Object Value |
                    ForEach-Object {
                        if ( $null -ne $_.Value ) {$_.Value.ToString()}
                        else {''}
                    }
            ) -Join $Delimiter
        }
    }
    End {}
}
Usage:
$Data = @(
    [PSCustomObject]@{
        A = $null
        B = Get-Date
        C = $null
    }
    [PSCustomObject]@{
        A = 1
        B = Get-Date
        C = 'Lorem'
    }
    [PSCustomObject]@{
        A = 2
        B = Get-Date
        C = 'Ipsum'
    }
    [PSCustomObject]@{
        A = 3
        B = $null
        C = 'Lorem Ipsum'
    }
)
# with headers
PS> ConvertTo-Delimited $Data
A|B|C
1|7/17/19 9:07:23 PM|Lorem
2|7/17/19 9:07:23 PM|Ipsum
||
# without headers
PS> ConvertTo-Delimited $Data -ExcludeHeader
1|7/17/19 9:08:19 PM|Lorem
2|7/17/19 9:08:19 PM|Ipsum
||
Here's another approach:
Get-Process | ConvertTo-Csv -NoTypeInformation -Delimiter "," |
foreach { $_ -replace '^"|"$|"(?=,)|(?<=,)"','' }
This replaces matches with the empty string, in each line. Breaking down the regex above:
| is like an OR, used to unite the following 4 sub-regexes
^" matches quotes in the beginning of the line
"$ matches quotes in the end of the line
"(?=,) matches quotes that are immediately followed by a comma
(?<=,)" matches quotes that are immediately preceded by a comma
I found that Kory's answer didn't work for the case where the original string included more than one blank field in a row. I.e. "ABC",,"0" was fine but "ABC",,,"0" wasn't handled properly. It stopped replacing quotes after the ",,,". I fixed it by adding "|(?<output>)" near the end of the first parameter, like this:
% {$_ -replace `
'\G(?<start>^|,)(("(?<output>[^,"]*?)"(?=,|$))|(?<output>".*?(?<!")("")*?"(?=,|$))|(?<output>))', `
'${start}${output}'}
I haven't spent much time looking into removing the quotes, but here is a workaround.
get-process | Export-Csv -NoTypeInformation -Verbose -Path $env:temp\test.csv
$csv = Import-Csv -Path $env:temp\test.csv
This is a quick workaround and there may be a better way to do this.
A slightly modified variant of JPBlanc's answer:
I had an existing csv file which looked like this:
001,002,003
004,005,006
I wanted to export only the first and third column to a new csv file. And for sure I didn't want any quotes ;-)
It can be done like this:
Import-Csv -Path .\source.csv -Delimiter ',' -Header A,B,C | select A,C | ConvertTo-Csv -NoTypeInformation -Delimiter ',' | % {$_ -replace '"',''} | Out-File -Encoding utf8 .\target.csv
Couldn't find an answer to a similar question so I'm posting what I've found here...
For exporting as Pipe Delimited with No Quotes for string qualifiers, use the following:
$objtable | convertto-csv -Delimiter "|" -notypeinformation | select -Skip $headers | % { $_ -replace '"\|"', "|"} | % { $_ -replace '""', '"'} | % { $_ -replace "^`"",''} | % { $_ -replace "`"$",''} | out-file "$OutputPath$filename" -fo -en ascii
This was the only thing I could come up with that could handle quotes and commas within the text, especially things like a quote and a comma next to each other at the beginning or end of a text field.
This function takes a powershell csv object from the pipeline and outputs like convertto-csv but without adding quotes (unless needed).
function convertto-unquotedcsv {
    param([Parameter(ValueFromPipeline=$true)]$csv, $delimiter=',', [switch]$noheader=$false)
    begin {
        $NeedQuotesRex = "($([regex]::escape($delimiter))|[\n\r\t])"
        if ($noheader) { $names = @($true) } else { $names = @($false) }
    }
    process {
        $psop = $_.psobject.properties
        if (-not $names) {
            $names = $psop.name | % {if ($_ -match $NeedQuotesRex) {'"' + $_ + '"'} else {$_}}
            $names -join $delimiter # unquoted csv header
        }
        $values = $psop.value | % {if ($_ -match $NeedQuotesRex) {'"' + $_ + '"'} else {$_}}
        $values -join $delimiter # unquoted csv line
    }
    end {
    }
}
$names gets an array of NoteProperty names and $values gets an array of NoteProperty values. That extra step is needed so the header is output only once. The process block receives the CSV objects one at a time.
Here is a test run
$delimiter = ','; $csvData = @"
id,string,notes,"points per 1,000",number
4,"a delimiter$delimiter is in here,","test data 3",1,348
5,"a comma, is in here,","test data 4`r`nwith a newline",0.5,347
6,hello world2.,classic,"3,000",123
"@
$csvdata | convertfrom-csv | sort number | convertto-unquotedcsv -delimiter $delimiter
id,string,notes,"points per 1,000",number
6,hello world2.,classic,"3,000",123
5,"a comma, is in here,","test data 4
with a newline",0.5,347
4,"a delimiter, is in here,",test data 3,1,348

Set-Content not working

Basically, all that I'm trying to do is replace some text in a text file based on line number. In this example, only 2 out of 3 Set-Content actually work when I run the script. However, when I run the Set-Content that doesn't work with a breakpoint, or highlight the block and run it separately, it magically works. It also works if I remove the other two set-content blocks.
I've tried putting in multiple Start-Sleep calls, and have tried on Windows Server 2012 R2 and Windows 10, both with some version of PS 5. Get-Content is in parentheses to ensure that that operation is complete before continuing. I've tried putting a Get-Content between each operation. The full script has multiple Set-Content calls in between the first and the last, and they all fail no matter which order they occur in.
You can test it yourself. Create a text file with this content:
;12.1 - MyName
$ScriptVer = "12.1"
If $VAR<"1.2.3"
Then run this:
#Declare Paths
$Temp = "\\FileShare\e$\Temp\file.txt"
#Get-Content
$KIXOLD = (Get-Content $Temp)
[decimal]$OLDVER = 12.1
$NEWVER = ($oldver + .1)
#Update Version Number in File - THIS WORKS
#I can put in multiple of these anywhere in the script and they all work
#I can even move this block to the end and it still works
$VerLine = Select-String -Pattern $oldver -Path $temp |
select -ExpandProperty LineNumber |
select -Index 1
$KIXOLD[$VerLine - 1] = "`$ScriptVer = `"$NEWVER`""
$KIXOLD | Set-Content $temp
#Find the old version in the text file and replace with new
#This FAILS unless there's a breakpoint or it's run separately
#It doesn't matter if it's the first set-content last, or middle, this fails
$CONV = $KIXOLD | where {$_ -like "*If `$VAR<`"1.2.3`""}
($kixold).Replace("$CONV", " If `$VAR<`"1.2.4`"") | Set-Content $Temp
#Update notes to contain current version - THIS WORKS
#I can put in multiple of these anywhere in the script and they all work
$linenum = Select-String -Pattern $oldver -Path $Temp |
select -ExpandProperty LineNumber |
select -Index 0
$NewLine = [int]$linenum +1
$KIXOLD[$linenum] = ";$NewVer - MyName"
$KIXOLD | Set-Content $temp
You'll find that the resulting text file looks like this:
;12.1 - MyName
;12.2 - MyName
$ScriptVer = "12.2"
If $VAR<"1.2.3"
when it should look like this:
;12.1 - MyName
;12.2 - MyName
$ScriptVer = "12.2"
If $VAR<"1.2.4"
To reiterate, the modification of If $VAR<"1.2.3" DOES occur if there is a breakpoint or if I run that selection separately.
No matter what I try, only the first and the last Set-Content work unless there's a breakpoint or it's run separately. I'm at a loss, any help would be appreciated.
The .Replace() method applied to an object does not change that object; it returns a new result. Use
$CONV = $KIXOLD | where {$_ -like "*If `$VAR<`"1.2.3`""}
$KIXOLD = $kixold.Replace("$CONV", " If `$VAR<`"1.2.4`"")
$KIXOLD | Set-Content $Temp
or
$CONV = $KIXOLD | where {$_ -like "*If `$VAR<`"1.2.3`""}
($kixold.Replace("$CONV", " If `$VAR<`"1.2.4`"")) | Set-Content $Temp
$KIXOLD = (Get-Content $Temp)
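A quick made-up demonstration of the point: the result of .Replace() has to be assigned to something, otherwise the original array is left untouched:
$lines = @('If $VAR<"1.2.3"', 'some other line')                # made-up data
$null  = $lines.Replace('If $VAR<"1.2.3"', 'If $VAR<"1.2.4"')   # result discarded
$lines[0]                                                       # still: If $VAR<"1.2.3"
$lines = $lines.Replace('If $VAR<"1.2.3"', 'If $VAR<"1.2.4"')   # assigned back
$lines[0]                                                       # now:   If $VAR<"1.2.4"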

Need to get the string in between two patterns in PowerShell

I need to match a pattern, e.g. "Commodity Name", and get the string on the next line between the <dd> and </dd> tags.
Sample Input file:
C:\Users\rpm\Desktop\sample.txt:133: <dt>Commodity Name</dt>
C:\Users\rpm\Desktop\sample.txt:134: <dd>Grocery</dd>
C:\Users\rpm\Desktop\sample.txt:136: <dt>IP address</dt>
C:\Users\rpm\Desktop\sample.txt:137: <dd>XXX.XXX.XXX.XXX port 8000</dd>
C:\Users\rpm\Desktop\sample.txt:144: <dt>Commodity Serial #</dt>
C:\Users\rpm\Desktop\sample.txt:145: <dd>0055500000</dd>
C:\Users\rpm\Desktop\sample.txt:147: <dt>Client IP</dt>
C:\Users\rpm\Desktop\sample.txt:148: <dd>xxx.xxx.xxx.xxx</dd>
C:\Users\rpm\Desktop\sample.txt:150: <dt>Client Logged In As</dt>
C:\Users\rpm\Desktop\sample.txt:151: <dd>rpm123</dd>
C:\Users\rpm\Desktop\sample.txt:153: <dt>User is member of</dt>
C:\Users\rpm\Desktop\sample.txt:154: <dd>BP-RPM\COMD_CSO_ITM-AVAI_Def,BP-RPM\user</dd>
Need to match patterns such as
Commodity Name
IP address
Commodity Serial #
Client IP
Client Logged In As
User is member of
and get the values on the line following each matched pattern, between the <dd> and </dd> tags.
Desired output:
Grocery | XXX.XXX.XXX.XXX port 8000 | 0055500000 | xxx.xxx.xxx.xxx | rpm123 | BP-RPM\COMD_CSO_ITM-AVAI_Def,BP-RPM\user
I would start by creating an array defining your keywords:
$keywords = @(
    '<dt>Commodity Name</dt>'
    '<dt>IP address</dt>'
    '<dt>Commodity Serial #</dt>'
    '<dt>Client IP</dt>'
    '<dt>Client Logged In As</dt>'
    '<dt>User is member of</dt>'
)
Now you can join the keywords with | to use them with the Select-String cmdlet:
$file = 'C:\Users\rpm\Desktop\sample.txt'
$content = Get-Content $file
$content | Select-String -Pattern ($keywords -join '|')
This will give you the line number of each matched keyword. Now you can iterate over the result, access the next line by index, and strip the <dd> prefix and </dd> suffix:
ForEach-Object {
    [regex]::Match($content[$_.LineNumber], '<dd>(.+)</dd>').Groups[1].Value
}
Output:
Grocery
XXX.XXX.XXX.XXX port 8000
0055500000
xxx.xxx.xxx.xxx
rpm123
BP-RPM\COMD_CSO_ITM-AVAI_Def,BP-RPM\user
Finally, you have to join the results with ' | ' to get the desired output. Here is the whole script:
$keywords = @(
    '<dt>Commodity Name</dt>'
    '<dt>IP address</dt>'
    '<dt>Commodity Serial #</dt>'
    '<dt>Client IP</dt>'
    '<dt>Client Logged In As</dt>'
    '<dt>User is member of</dt>'
)
$file = 'C:\Users\rpm\Desktop\sample.txt'
$content = Get-Content $file
($content | Select-String -Pattern ($keywords -join '|') |
    ForEach-Object {
        [regex]::Match($content[$_.LineNumber], '<dd>(.+)</dd>').Groups[1].Value
    }) -join ' | '
Output:
Grocery | XXX.XXX.XXX.XXX port 8000 | 0055500000 | xxx.xxx.xxx.xxx | rpm123 | BP-RPM\COMD_CSO_ITM-AVAI_Def,BP-RPM\user
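A variation on the same idea (just a sketch, not part of the original answer): Select-String's -Context parameter can capture the line after each match directly, which avoids indexing back into $content:
# Reuses $file and $keywords from the script above; -Context 0,1 keeps one line of trailing context.
(Select-String -Path $file -Pattern ($keywords -join '|') -Context 0,1 |
    ForEach-Object {
        [regex]::Match($_.Context.PostContext[0], '<dd>(.+)</dd>').Groups[1].Value
    }) -join ' | '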