Adding multiple rows to CSV file at once through PowerShell

Background
I've been looking through several posts here on Stack and can only find answers to "how to add one single row of data to a CSV file" (notably this one). While they are good, they only cover the specific case of adding a single entry from memory. Suppose I have 100,000 rows to add to a CSV file; writing each row to the file individually will be orders of magnitude slower. I imagine it would be much faster to keep everything in memory and, once I've built a variable containing all the data I want to add, only then write it to file.
Current situation
I have log files that I receive from customers containing about half a million rows. Some of these rows begin with a datetime and how much memory the server is using. To get a better view of what the memory usage looks like, I want to plot the memory usage over time using this information. (Note: yes, the best solution would be to ask the developers to add this information, as it is fairly common we need it, but since we don't have that yet, I need to work with what I've got.)
I am able to read the log files, extract the contents, and create two variables called $timeStamp and $memoryUsage that hold all the relevant entries. The problem occurs when I try to add these to a custom PSObject. It would seem that using $csvObject += $newRow only adds a pointer to the $newRow variable rather than the actual row itself. Here's the code that I've got so far:
$header1 = "Time Stamp"
$header2 = "Memory Usage"
$csvHeaders = @"
$header1;$header2
"@
# The following two lines are a workaround to make sure that the $csvObject becomes a PSObject that matches the output I'm trying to achieve.
$csvHeaders | Out-File -FilePath $csvFullPath
$csvObject = Import-Csv -Path $csvFullPath -Delimiter ";"
foreach ($TraceFile in $traceFilesToLookAt) {
$curTraceFile = Get-Content $TraceFile.FullName
Write-Host "Starting on file: $($TraceFile.Name)`n"
foreach ($line in $curTraceFile) {
try {
if (($line.Substring(4,1) -eq '-') -and ($line.Substring(7,1) -eq '-')) {
$TimeStamp = $line.Split("|",4)[0]
$memoryUsage = $($line.Split("|",4)[2]).Replace(",","")
$newRow = New-Object PSObject -Property @{
$header1 = $TimeStamp;
$header2 = $memoryUsage
}
$reorderedRow = $newRow | Select-Object -Property $header1,$header2
$reorderedRow | Export-Csv -Path $csvFullPath -Append -Delimiter ";"
}
} catch {
Out-Null
}
}
}
This works fine, as it appends each row to the CSV file as soon as it is found. The problem is that it's not very efficient.
End goal
I would ideally like to solve it with something like:
$newRow = New-Object PSObject -Property @{
$header1 = $TimeStamp;
$header2 = $memoryUsage
}
$rowsToAddToCSV += $newRow
And then in the final step do a:
$rowsToAddToCSV | Export-Csv -Path $csvFullPath -Append -Delimiter ";"
I have not been able to create any form of workaround for this. Among other things, PowerShell tells me that op_Addition is not part of the object, that the object I'm trying to export (the collection of rows) doesn't match the CSV file etc.

Anything that appends thousands of items to an array in a loop is bound to perform poorly, because each time an item is appended, the array will be re-created with its size increased by one, all existing items are copied, and then the new item is put in the new free slot.
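If you do want to build the whole collection in memory first, a resizable list sidesteps that copying cost. A minimal sketch (the parsing of $TimeStamp and $memoryUsage is assumed to happen exactly as in the question's loop):

```powershell
# Collect rows in a List[object]; Add() is amortized O(1), unlike array +=,
# which re-creates the array on every append.
$rows = [System.Collections.Generic.List[object]]::new()
foreach ($line in $curTraceFile) {
    # ... parse $TimeStamp and $memoryUsage as before ...
    $rows.Add([PSCustomObject]@{
        'Time Stamp'   = $TimeStamp
        'Memory Usage' = $memoryUsage
    })
}
# One single write at the end
$rows | Export-Csv -Path $csvFullPath -Delimiter ';' -NoTypeInformation
```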
Any particular reason why you can't simply do something like this?
$traceFilesToLookAt | ForEach-Object {
Get-Content $_.FullName | ForEach-Object {
if ($_.Substring(4, 1) -eq '-' -and $_.Substring(7, 1) -eq '-') {
$line = $_.Split('|', 4)
New-Object PSObject -Property @{
'Time Stamp' = $line[0]
'Memory Usage' = $line[2].Replace(',', '')
}
}
}
} | Export-Csv -Path $csvFullPath -Append -Delimiter ";"
A regular expression match might be an even more elegant approach to extracting timestamp and memory usage from the input files, but I'm going to leave that as an exercise for you.
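For instance, a sketch of that regex approach. The pattern here is an assumption about the log line layout (a leading yyyy-MM-dd timestamp field and a comma-grouped memory figure as the third |-delimited field); adjust it to the real format:

```powershell
# Hypothetical pattern; named capture groups land in the $Matches automatic variable
if ($line -match '^(?<stamp>\d{4}-\d{2}-\d{2}[^|]*)\|[^|]*\|(?<mem>[\d,]+)') {
    [PSCustomObject]@{
        'Time Stamp'   = $Matches['stamp']
        'Memory Usage' = $Matches['mem'] -replace ','
    }
}
```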

Related

PowerShell Keeping track of updated and rejected rows while cleaning up files

Add-Member, hashtables, arrays and such confuse me a bit so I'm not sure the best way to approach this. My goal is to take an input.CSV, perform clean up and send those cleaned rows to Fixed.CSV, and send any 'reject rows' that couldn't be handled to reject.CSV with an explanation of why they were rejected.
My original script was splitting the 'good' from the 'bad' based on a single characteristic (e.g. a missing Account ID), but as I get into the clean-up, there are other things that would cause a row to error out and I don't want to read the data into memory with .Where() and continually 'split' it - especially considering I'd like to finish with only 3 files total (OG-input.CSV, Fixed.CSV, Junk-reject.CSV).
$data, $rejectData = (Import-CSV $CSV).Where({![string]::IsNullOrEmpty($_."Account ID")}, 'Split')
If($rejectData){
$rejectData | Add-Member -NotePropertyName "Reject Reason" -NotePropertyValue "Account ID missing"
$rejectData | Export-CSV -LiteralPath "$($CSV.DirectoryName)\$($CSV.BaseName)_reject.csv" -NoTypeInformation
}
My output file was basically created after I had performed a bunch of steps on each row of $data above.
$outputFile = New-Object System.Collections.ArrayList
Foreach($row in $data){
# Do stuff, check using If, make updates, etc.
[void]$outputFile.Add($row)
}
$outputFile | Export-CSV -LiteralPath "$($CSV.DirectoryName)\$($CSV.BaseName)Fixed.csv" -NoTypeInformation
What I'm thinking at this point is that instead of splitting the data initially, I should just iterate through all rows; if I can update them, I will, and send them to $outputFixed. If there is an error that can't be corrected, I'll send them to $outputReject. But here's the caveat: I want to add a new column for "Reject Reason" and update it as I go, because there could be multiple reasons a row gets rejected and I'd like to track each one.
I've gotten it somewhat close, but creating the new column is giving me trouble. I was originally going to use Add-Member the first time I add the column, and then just update the value in that column for each $row; something like $row."Reject Reason" = "$($row."Reject Reason")|New Reason", as this gets me a pipe-delimited list of reasons a row was rejected. Then I found "Powershell add-member. Add a member that's an ArrayList?", which got me thinking that maybe the reasons within Reject Reason could be a list themselves rather than just delimited. However, I'm not sure I quite understand the nuances of the answers proposed and can't figure out what might work best for me.
Nested arrays/lists are great, but you'll have to consider how you want to store and display your data.
A CSV file, like a table, doesn't properly handle nested objects like lists or arrays. This can be fine if you know your data, and don't mind converting your RejectReason field from/to a delimited string when you read it. For example, you could use Where-Object's filter block to find all the entries in $outputRejected with a specific reason:
# similar to what you had before
$csv = Import-Csv $path
$report = foreach ($row in $csv) {
$row | Add-Member -NotePropertyName 'RejectCode' -NotePropertyValue ''
if ($row.id -lt 5) { $row.RejectCode = $row.RejectCode+'Too Low|' }
if ($row.id -gt 3) { $row.RejectCode = $row.RejectCode+'Too High|' }
# Output the finalized row
$row
}
# Example: filter by reason code
$OutputRejected | Where-Object {($_.RejectCode -split '\|') -contains 'Too High'}
ID RejectCode
-- ----------
4 Too Low|Too High|
5 Too High|
For what you are doing, this usually works just fine. You have to be careful of your additional separator characters, but since you're defining the RejectCode yourself, it shouldn't be an issue.
For anything more complicated, I tend to create a PSCustomObject from each $row and set each property to what I need. This tends to work a little better for me than using Add-Member:
$report = foreach ($row in $csv) {
# custom object with manually defined properties
$reportRow = [PSCustomObject][Ordered]@{
ID = $row.ID
Name = $row.Name
Data = $row.Data # run some commands to fix bad data
Reasons = @() # list object
}
# can edit properties as normal
if ($row.id -lt 5) { $reportRow.Reasons += 'Too Low' }
if ($row.id -gt 3) { $reportRow.Reasons += 'Too High' }
$reportrow
}
Just be aware that PowerShell's CSV commands tend to squish properties into the unhelpful System.Object[] text when your properties aren't simple values like strings or ints. A better option for saving nested objects like this is a structured format like JSON, e.g.: $report | ConvertTo-Json | Out-File $path.
Without seeing any of your CSV, You could do something like this:
$csvPath = 'X:\Temp'
$original = Import-CSV -Path (Join-Path -Path $csvPath -ChildPath 'OG-input.CSV')
# create a List object to collect the rejected items
$rejects = [System.Collections.Generic.List[object]]::new()
$correct = foreach ($item in $original) {
$reason = $null
if ([string]::IsNullOrWhiteSpace($item.'Account ID')) { $reason = "Empty 'Account ID' field" }
elseif ($item.'Account ID'.Length -gt 20) { $reason = "'Account ID' field exceeds maximum length" }
# more elseif checks go here
# after all checks are done
if (!$reason) {
# all OK for this row; just output so it gets collected in $correct
$item
}
else {
# it's a rejected item, add an object to the $rejects list
$obj = $item | Select-Object *, @{Name = 'Reason'; Expression = {$reason}}
$rejects.Add($obj)
}
}
# save both files
$correct | Export-Csv -Path (Join-Path -Path $csvPath -ChildPath 'Fixed.CSV') -NoTypeInformation
$rejects | Export-Csv -Path (Join-Path -Path $csvPath -ChildPath 'Junk-reject.CSV') -NoTypeInformation
You need to fill in the rest of the checks and reasons for rejection, of course.
Here's what I ended up going with. I think it works, as the output looks about like I expected.
Foreach($row in $data){
#Process all reject reasons first and reject those rows
If([string]::IsNullOrEmpty($row."Account ID")){
$row | Add-Member -NotePropertyName "Reject Reason" -NotePropertyValue ("$($row."Reject Reason")", "Missing Account ID" -Join "|").TrimStart("|") -Force
}
If([string]::IsNullOrEmpty($row."Service Start Dates") -And ([string]::IsNullOrEmpty($row."Service End Dates"))){
$row | Add-Member -NotePropertyName "Reject Reason" -NotePropertyValue ("$($row."Reject Reason")", "Missing Both Service Dates" -Join "|").TrimStart("|") -Force
}
If(Get-Member -InputObject $row "Reject Reason"){
[void]$outputReject.Add($row)
Continue
}
If([string]::IsNullOrEmpty($row."Birth Date")){
$row."Birth Date" = $dte
}
If([string]::IsNullOrEmpty($row."Gender")){
$row."Gender" = "Female"
}
If( [string]::IsNullOrEmpty($row."Service Start Dates") -And !( [string]::IsNullOrEmpty($row."Service End Dates"))){
$row."Service Start Dates" = $row."Service End Dates"
}
[void]$outputFixed.Add($row)
}
$outputFixed | Export-CSV -LiteralPath "$($inputFile.DirectoryName)\$($inputFile.BaseName)Fixed.csv" -NoTypeInformation
If($outputReject){
$outputReject | Export-CSV -LiteralPath "$($inputFile.DirectoryName)\$($inputFile.BaseName)RejectedRows.csv" -NoTypeInformation
}
Basically I'm still collecting each row in an ArrayList that will be output once the entire file has been processed. I'm using Add-Member with -Force to 'overwrite' the reject reason(s), and a -Join of the text with a .TrimStart("|") to get rid of the leading pipe. This will definitely work for me (plus it was easy to implement with what I already had written).

Powershell to present 'Net View' data

happy Easter!
I am trying to write a script in Powershell that takes a list of hosts from a txt (or csv) and then for each does a "net view /all" on it, returning the presented shares in a csv.
I got something working, but I need a column to show the host it's looking at for each row, otherwise I can't map them back.
Attempt 1 returns the data and the host but looks VERY messy and is proving difficult to dissect in Excel:
$InputFile = 'M:\Sources\Temp\net_view_list.txt'
$addresses = get-content $InputFile
foreach($address in $addresses) {
$sharedFolders = (NET.EXE VIEW $address /all)
foreach ($item in $sharedfolders)
{
$str_list = $address + "|" + $item
$obj_list = $str_list | select-object @{Name='Name';Expression={$_}}
$obj_list | export-csv -append M:\sources\temp\netview.csv -notype
}
}
Attempt 2 works better, but I can't get the hostname listed, plus the comments seem to appear in the "Used as" section (only using one host to test the theory (didn't work!)):
$command = net view hostname #/all
$netview = $command -split '\n'
$comp = $netview[0].trim().split()[-1]
$result = $netview -match '\w' | foreach {
convertfrom-string $_.trim() -delim '\s{2,}' -propertynames 'Share','Type', 'Used as', 'Comment'
}
$result[0] = $null
$result | format-table 'Share', 'Type', 'Used as', 'Comment' -hidetableheaders
Also, neither of these accounts for issues where the host either isn't accessible or has 0 shares.
I have literally spent all day on these - grateful for any guidance!
I will provide the way to get what you want in your 1st example. The main reason it is not appearing like you are expecting is that you are not dealing with a PowerShell object; you are getting the raw output from an external command. What you need to do is take the data and create a PS custom object, then you can use it as you will. Below is the code that you should add after you have $sharedFolders populated, heavily commented to explain what each part is for.
# Create Array to hold PSCustom Object and variable to tell when the DO loop is done
$share_list = @()
$completed = $false
# Loop through each line in the output
for($x=0;$x -lt $sharedFolders.count;$x++){
$next_line = $x + 1
# If the line is a bunch of - then we know the next line contains the 1st share name.
if($sharedFolders[$x] -like "*-------*"){
# Here we will loop until we find the end of the list of shares
do {
# Take the line and split it in to an array. Note when you
# use -split vs variable.split allows you to use regular
# expressions. the '\s+' will consider x number of spaces as one
# the single quotes are important when using regex. Double
# quotes use variable expansion. Single quotes don't
$content = $sharedFolders[$next_line] -split '\s+'
$share_name = $content[0].Trim()
# Create a PS Custom Object. This is a bit overkill for one item
# but shows you how to create a custom Object. Note the Object lasts
# just one loop, thus you create a new one each go round, then add it to
# the Array before the loop starts over.
$custom_object = new-object PSObject
$custom_object | add-member -MemberType NoteProperty -name 'Share Name' -Value $share_name
# Add the Custom Object to the Array
$share_list += $custom_object
# This exits the Do loop by setting $completed to true
if($sharedFolders[$next_line] -like "*command completed*"){
$completed = $true
}
# Set to the next line
$next_line++
} until ($completed)
}
}
$share_list
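The question also asked about hosts that aren't accessible, which the loop above doesn't cover. One possible addition (a sketch, assuming $address holds the host you just queried): external commands set the $LASTEXITCODE automatic variable, so a failed net view can be detected before parsing:

```powershell
# Suppress the error stream and check the exit code of the external command
$sharedFolders = NET.EXE VIEW $address /all 2>$null
if ($LASTEXITCODE -ne 0) {
    # Record the failure instead of parsing non-existent output
    $share_list += New-Object PSObject -Property @{ 'Share Name' = "UNREACHABLE: $address" }
}
```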

Remove Columns in multiple CSVs POWERSHELL [duplicate]

I need to remove several columns from a CSV file without importing the CSV file in Powershell. Below is an example of my input CSV and what I hope the output CSV can look like.
Input.csv
A,1,2,3,4,5
B,6,7,8,9,10
C,11,12,13,14,15
D,15,16,17,18,19,20
Idealoutput.csv
A,3,5
B,8,10
C,13,15
D,17,20
I have tried doing this with the following code, but it is giving me plenty of errors and saying that I cannot use the "Delete" method this way (which I have done in the past)... Any ideas?
$Workbook1 = $Excel.Workbooks.open($file.FullName)
$header = $Workbook1.ActiveSheet.Range("A1:A68").EntireRow
$unneededcolumns1 = $Workbook1.ActiveSheet.Range("A1:O1").EntireColumn
$unneededcolumns2 = $Workbook1.ActiveSheet.Range("B1:K1").EntireColumn
$unneededcolumns3 = $Workbook1.ActiveSheet.Range("F1:I1").EntireColumn
$unneededcolumns4 = $Workbook1.ActiveSheet.Range("G1:I1").EntireColumn
$unneededcolumns5 = $Workbook1.ActiveSheet.Range("H1:O1").EntireColumn
$unneededcolumns6 = $Workbook1.ActiveSheet.Range("J1:AL1").EntireColumn
$unneededcolumns7 = $Workbook1.ActiveSheet.Range("K1").EntireColumn
$unneededcolumns8 = $Workbook1.ActiveSheet.Range("L1:AK1").EntireColumn
$unneededcolumns9 = $Workbook1.ActiveSheet.Range("F1:I1").EntireColumn
$unneededcolumns10 = $Workbook1.ActiveSheet.Range("M1:AB1").EntireColumn
$unneededcolumns11 = $Workbook1.ActiveSheet.Range("N1:X1").EntireColumn
$unneededcolumns12 = $Workbook1.ActiveSheet.Range("O1:BA1").EntireColumn
$unneededcolumns13 = $Workbook1.ActiveSheet.Range("P1:U1").EntireColumn
$header.Delete()
$unneededcolumns1.Delete()
$unneededcolumns2.Delete()
$unneededcolumns3.Delete()
$unneededcolumns4.Delete()
$unneededcolumns5.Delete()
$unneededcolumns6.Delete()
$unneededcolumns7.Delete()
$unneededcolumns8.Delete()
$unneededcolumns9.Delete()
$unneededcolumns10.Delete()
$unneededcolumns11.Delete()
$unneededcolumns12.Delete()
$unneededcolumns13.Delete()
$Workbook1.SaveAs("\\output.csv")
I am just going to add this anyway since I hope to convince you how easy it will be to avoid having to use Excel.
$source = "c:\temp\file.csv"
$destination = "C:\temp\newfile.csv"
(Import-CSV $source -Header 1,2,3,4,5,6 |
Select "1","4","6" |
ConvertTo-Csv -NoTypeInformation |
Select-Object -Skip 1) -replace '"' | Set-Content $destination
We assign arbitrary headers to the object, and that way we can call the 1st, 4th and 6th columns by position. Once exported, the file will have the following contents, which match what I think you want and not what you had in the question. Your last line had an extra value (20) on it, which I don't know if was on purpose or not.
A,3,5
B,8,10
C,13,15
D,17,19
If this is not viable I am really interested as to why.
Excel Approach
Alright, so the file is enormous, so Import-CSV is not a viable option. Keeping with your Excel idea, I came up with this. What it will do is take column indexes and delete any column that is not in those indices.
Wait, you say, that won't work, since the column indexes change as you remove columns. Using the indices we want to keep, we get the inverse to delete, based on the used range of the sheet. We then take each of those columns to delete and subtract from each a value equal to its array position. The reason is that when a column is actually deleted, the next value has already been adjusted to account for the shift.
$file = "c:\temp\file.csv"
$ColumnsToKeep = 1,4,6
# Create the com object
$excel = New-Object -comobject Excel.Application
$excel.DisplayAlerts = $False
$excel.visible = $False
# Open the CSV File
$workbook = $excel.Workbooks.Open($file)
$sheet = $workbook.Sheets.Item(1)
# Determine the number of rows in use
$maxColumns = $sheet.UsedRange.Columns.Count
$ColumnsToRemove = Compare-Object $ColumnsToKeep (1..$maxColumns) | Where-Object{$_.SideIndicator -eq "=>"} | Select-Object -ExpandProperty InputObject
0..($ColumnsToRemove.Count - 1) | %{$ColumnsToRemove[$_] = $ColumnsToRemove[$_] - $_}
$ColumnsToRemove | ForEach-Object{
[void]$sheet.Cells.Item(1,$_).EntireColumn.Delete()
}
# Save the edited file
$workbook.SaveAs("C:\temp\newfile.csv", 6)
# Close excel and release the com object.
$workbook.Close($true)
$excel.Quit()
[void][System.Runtime.Interopservices.Marshal]::ReleaseComObject($excel)
Remove-Variable excel
I was having issues with Excel remaining open even after reading up on the "correct" way to do it. The inner logic is what is important. Don't forget to change your paths as needed.
Here's a better approach that I use, but it's not the most performant on large files. Both have been tested on 1GB files.
Powershell:
Import-Csv '.\inputfile.csv' |
select ColumnName1,ColumnName2,ColumnName3 |
Export-Csv -Path .\outputfile.csv -NoTypeInformation
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/export-csv?view=powershell-5.1
If you want to get rid of those pesky quotes that the tool adds, upgrade to Powershell 7.
Powershell 7+:
Import-Csv '.\inputfile.csv'
| select ColumnName1,ColumnName2,ColumnName3
| Export-Csv -Path .\outputfile.csv -NoTypeInformation -UseQuotes Never
https://learn.microsoft.com/en-us/powershell/module/microsoft.powershell.utility/export-csv?view=powershell-7

Powershell - Export array to CSV in different columns

I am trying to automate below API calls from a csv file.
http_uri
/ModuleName/api/12345/moverequest/MoveRequestQueue?batchSize=200
/ModuleName/api/Portal/GetGarageLocations?email=Dummy@mail.com
/ModuleName/api/DeliveryDate/CommitEta?ref=H7J3M1EA4LF
/ModuleName/api/35345/moverequest/MoveRequestQueue?batchSize=500
The output should be like below in a csv file.
ScenarioName Parameter Value
MoveRequestQueue batchSize 200
GetGarageLocations email Dummy@mail.com
CommitEta ref H7J3M1EA4LF
MoveRequestQueue batchSize 500
I am using below code
$csv = Import-Csv C:\Powershell\Documents\Source.csv
$scenario = @()
ForEach ($row in $csv){
$httpuri = $($row.http_uri)
#Iterating through CSV rows and segregate values
if ($httpuri -match "="){
$equalarr = $httpuri -split '='
if ($equalarr[0] -match "\?"){
$questionarr = $equalarr[0] -split '\?'
$scenarionamearr = $questionarr[0] -split '/'
$totalelements = $scenarionamearr.Count
$scenarioname = $scenarionamearr[$totalelements-1]
$Scenario += $scenarioname
$Scenario += $questionarr[1]
$Scenario += $equalarr[1]
}
}
}
#Adding columns to csv
$columnName = '"Scenario","Parameter","Value"'
Add-Content -Path C:\Powershell\Documents\Output.csv -Value $columnName
#Writing values to CSV
$Scenario | foreach { Add-Content -Path C:\Powershell\Documents\Output.csv -Value $_ }
But output is generated like below:
Scenario Parameter Value
DequeueMoveRequestQueue
batchSize
200
GetCarrierLocations
email
x-qldanxqldanx
Since I am a newbie, I searched a lot to solve this issue but couldn't succeed. Please throw some light on this.
Thanks in advance....
If you store your scenarios in structured objects you can use Powershell's built in Export-Csv command to generate your csv.
So, instead of
$Scenario += $scenarioname
$Scenario += $questionarr[1]
$Scenario += $equalarr[1]
store an array of powershell objects:
$Scenario += [PSCustomObject]@{
"Scenario" = $scenarioname;
"Parameter" = $questionarr[1];
"Value" = $equalarr[1];}
Then, when creating the csv file, just use Export-Csv:
$Scenario | Export-Csv -NoTypeInformation -Path C:\Powershell\Documents\Output.csv
So the issue is that you make an empty array, then add strings to it one at a time, which just makes it an array of strings. Then when you output it to the file it just adds each string to the file on its own line. What you want to do is create an array of objects, then use the Export-Csv cmdlet to output it to a CSV file.
Creating an array, and then adding things to it one at a time is not a good way to do it. PowerShell has to recreate the array each time you add something the way you're doing it. Better would be to have a pipeline that outputs what you want (objects, rather than strings), and capture them all at once creating the array one time. Or even better, just output them to the CSV file and not recollect them in general.
$CSV = Import-Csv C:\Powershell\Documents\Source.csv
$CSV.http_uri -replace '^.*/(.*)$','$1'|ForEach-Object{
$Record = $_ -split '[=\?]'
[PSCustomObject]@{
ScenarioName = $Record[0]
Parameter = $Record[1]
Value = $Record[2]
}
} | Export-Csv -Path C:\Powershell\Documents\Output.csv -Append

Powershell: Search data in *.txt files to export into *.csv

First of all, this is my first question here. I often come here to browse existing topics, but now I'm hung up on my own problem, and I didn't find a helpful resource. My biggest concern is that it won't work in PowerShell at all... At the moment I am trying to build a small Powershell tool to save me a lot of time. For those who don't know cw-sysinfo, it is a tool that collects information from any host system (e.g. Hardware-ID, Product Key and stuff like that) and generates *.txt files.
My point is, if you have 20, 30 or 80 servers in a project, it is a huge amount of time to browse all the files, pick out just the lines you need, and put them together in a *.csv file.
What I have working is more like the basis of the tool: it browses all *.txt files in a specific path and checks for my keywords. And here is the problem: I can only search for the keywords themselves, not the values that follow them, as seen below:
Operating System: Windows XP
Product Type: Professional
Service Pack: Service Pack 3
...
I don't know how I can tell Powershell to search for the "Product Type:" line and pick the following "Professional" instead. Later on, with keys or serial numbers, it will be the same problem; that is why I just can't browse for "Standard" or "Professional".
I placed my keywords($controls) in an extra file that I can attach the project folders and don't need to edit in Powershell each time. Code looks like this:
Function getStringMatch
{
# Loop through the project directory
Foreach ($file In $files)
{
# Check all keywords
ForEach ($control In $controls)
{
$result = Get-Content $file.FullName | Select-String $control -quiet -casesensitive
If ($result -eq $True)
{
$match = $file.FullName
# Write the filename according to the entry
"Found : $control in: $match" | Out-File $output -Append
}
}
}
}
getStringMatch
I think this is the kind of thing you need. I've changed Select-String to not use the -quiet option; this will return a matches object, one of the properties of which is the line. I then split the line on the ':' and trim any spaces. These results are then placed into a new PSObject, which in turn is added to an array. The array is then put back on the pipeline at the end.
I also moved the call to get-content to avoid reading each file more than once.
# Create an array for results
$results = @()
# Loop through the project directory
Foreach ($file In $files)
{
# load the content once
$content = Get-Content $file.FullName
# Check all keywords
ForEach ($control In $controls)
{
# find the line containing the control string
$result = $content | Select-String $control -casesensitive
If ($result)
{
# tidy up the results and add to the array
$line = $result.Line -split ":"
$results += New-Object PSObject -Property @{
FileName = $file.FullName
Control = $line[0].Trim()
Value = $line[1].Trim()
}
}
}
}
# return the results
$results
Adding the results to a csv is just a case of piping the results to Export-Csv:
$results | Export-Csv -Path "results.csv" -NoTypeInformation
If I understand your question correctly, you want some way to parse each line from your report files and extract values for some "keys". Here are a few lines to give you an idea of how you could proceed. The example is for one file, but can be generalized very easily.
$config = Get-Content ".\config.txt"
# The stuff you are searching for
$keys = @(
"Operating System",
"Product Type",
"Service Pack"
)
foreach ($line in $config)
{
$keys | %{
$regex = "\s*?$($_)\:\s*(?<value>.*?)\s*$"
if ($line -match $regex)
{
$value = $matches.value
Write-Host "Key: $_`t`tValue: $value"
}
}
}