PowerShell: output text to a file as a table/columns

In a loop, the code outputs text to a file, basically two variables; however, the width of each variable varies and the output looks ugly.
Is there a way to output the two vars in "columns", as a table?
Current code in the loop (the variables are concatenated in the sample code):
$matches[0] + $cal | Add-Content -LiteralPath $cleancalendarslist

I agree CSV is a plain-text solution, and it doesn't require opening with Excel, but I think this is what you are looking for; it uses PadLeft():
$matches = @('short','this is a long string','a med string')
$maxlength = ($matches | Measure-Object -Maximum -Property Length).Maximum
$cal = '20190107' # not sure what cal is...
foreach ($s in $matches) {
    $paddedstring = $s.PadLeft($maxlength, ' ')
    $paddedstring + ' ' + $cal | Add-Content -LiteralPath $cleancalendarslist
}
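As an alternative sketch (my addition, not from the original answer): the -f format operator can pad each value to a fixed column width in one expression. The width of 30 and the variable names ($matches, $cal, $cleancalendarslist) are assumptions carried over from the example above.
# Sketch: left-align each string in a 30-character column, then append the date.
foreach ($s in $matches) {
    '{0,-30} {1}' -f $s, $cal | Add-Content -LiteralPath $cleancalendarslist
}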

Related

Save property with comma to CSV

I'm creating a CSV file in powershell.
Right now my code is:
Add-content -Path $filePath -Value "$($variable.Property)"
This works fine for the most part EXCEPT if the property contains a comma, e.g. "test, organization".
When I open the CSV, the comma is carried along (which is what I want), but it causes an extra separation. How do I save "test, organization" to one column?
Referring to the documentation for Export-CSV, you will need to use a different delimiter, like a semicolon.
When you read the CSV you should specify the delimiter as well; see Import-CSV.
Try to quote your properties (CSV fields are quoted with double quotes, which need to be escaped inside a double-quoted PowerShell string):
Add-content -Path $filePath -Value "`"$($variable.Property)`""
Or use one of the built-in CSV commands, which automatically quote all values:
$foo.Bar | Export-Csv -Path $filePath
$foo.Bar | ConvertTo-Csv | Out-File -FilePath $filePath
If you just want to avoid issues with commas, you can change the delimiter between fields:
$foo | Export-Csv -Path $filePath -Delimiter '|'
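For illustration, here is a small sketch (the object and its Name/Org properties are made up) showing that Export-Csv quotes every field, so an embedded comma stays in one column:
# Sketch: [pscustomobject] needs PowerShell 3.0+; the property names are hypothetical.
$row = [pscustomobject]@{ Name = 'test'; Org = 'test, organization' }
$row | Export-Csv -Path $filePath -NoTypeInformation
# Produces: "test","test, organization" - the comma stays inside one column.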
Here is an article on how to use out-file or add-member with some cells of the row having commas in the variable values and some not.
https://imjustanengineer.blogspot.com/2022/01/so-youre-trying-to-use-powershell-out.html
Here is a code snippet; a more detailed explanation with a full working function is in the link. $outputArr is an array of the individual cell values for one row of CSV data you want to write. The loop checks each entry to see if it contains a comma and wraps the entry in quotes if it does; otherwise no adjustment is necessary. The entries are joined with commas into $output, and the trailing comma is removed after the last entry.
$index = 0;
foreach ($outputTemp in $outputArr)
{
    if ($outputTemp.ToString().Contains(","))
    {
        $output += "`"$outputTemp`",";
    }
    else
    {
        $output += $outputTemp + ",";
    }
    $index++;
    if ($index -eq $outputArr.Count)
    {
        if ($output.EndsWith(","))
        {
            $output = $output.Remove($output.Length - 1);
        }
    }
}
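As a hedged follow-up (not part of the linked article): once $output holds the assembled row, it can be written out and reset before building the next row; $csvPath is an assumed output path.
# Sketch: write the finished row, then clear $output for the next one.
Add-Content -Path $csvPath -Value $output
$output = ''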
I found a diabolically simple answer after I opened a CSV file in Excel and added text and commas to one column. When I saved, closed, and reopened the file, the column still had all the words and commas properly formatted. So I opened the file in Notepad++ and this is what I found:
column1text, column2text,"column3,text,with,commas"
In case it's not clear (it took me a fair bit to spot the little detail that makes all the difference): the opening double quote cannot have a space after the preceding comma.
column1text, column2text, "column3,text,with,commas"
splits all the words into separate columns because there is a space between
column2text, "column3,etc"
Take that space away
column2text,"column3,etc"
and everything within the double quotes stays in one column.
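To produce such a line from PowerShell (a small sketch; the variable names and path are made up), keep the opening quote immediately after the comma:
# Sketch: note there is no space between the comma and the opening quote before $col3.
$col1 = 'column1text'; $col2 = 'column2text'; $col3 = 'column3,text,with,commas'
"$col1, $col2,`"$col3`"" | Add-Content -Path 'c:\temp\out.csv'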
Example using an Active Directory distinguishedName such as CN=somename,OU=Domain Controllers,DC=foo,DC=bar:
$computers = Get-ADComputer -Filter * -Properties LastLogonDate,serialNumber,whenCreated # extended properties are not returned by default
foreach ($computer in $computers) {
    $deviceName = $computer.Name
    $dn = '"' + $computer.DistinguishedName + '"'
    $guid = $computer.objectGUID
    $lastLogon = $computer.LastLogonDate
    $serialNumber = $computer.serialNumber
    $whenCreated = $computer.whenCreated
    "$guid, $lastLogon, $deviceName, $serialNumber, $whenCreated,$dn" | Add-Content "c:\temp\filename.csv"
}
It does not work if a space is added between $whenCreated, and $dn like so:
"$guid, $lastLogon, $deviceName, $serialNumber, $whenCreated, $dn" | add-content "c:\temp\filename.csv"
This took up an afternoon, so I hope this saves somebody some time and frustration.

How can I count the number of CSV columns when the file has multiline data and no header

My CSV files have no headers and multi line entries like this:
11;"multi line
col12";13;foobar;foobar
21;22;23;24;25
And I'd like to count the number of columns. So 5 in this example. How do I do that?
What I tried:
Import-CSV doesn't work without the header parameter due to duplicate entries on the first line.
(Import-Csv .\bad.csv -Delimiter ";" | get-member -type NoteProperty).count
Adding a header parameter skews the count.
(Import-Csv .\bad.csv -Delimiter ";" -Header (1..99) | get-member -type NoteProperty).count
I gave up on reading the file manually via Get-Content because of all the parsing I would have to handle myself: escaped characters, multi-line entries, and so on.
My version of PowerShell is 3 and I have to port my script to version 2 later on.
If you are willing to accept the caveat that this could miscount the number of columns when there are quoted delimiters inside a string, this could be good enough for you.
$path = "c:\temp\test.txt"
$delimiter = ";"
$numberOfColumns = Get-Content $path |
ForEach-Object{($_.split($delimiter)).Count} |
Measure-Object -Maximum |
Select-Object -ExpandProperty Maximum
Import-Csv $path -Header (1..$numberOfColumns) -Delimiter $delimiter
Read in the file with Get-Content and isolate the maximum number of columns by splitting each line on its delimiter, then use that value to import the CSV. If the file is large, you can read it once with Get-Content and then use ConvertFrom-Csv once you know your column count.
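A sketch of that read-once idea (using the same $path and $delimiter assumptions as above):
# Sketch: read the raw lines once, derive the column count, then parse the same lines from memory.
$raw = Get-Content $path
$numberOfColumns = ($raw | ForEach-Object { ($_ -split $delimiter).Count } | Measure-Object -Maximum).Maximum
$raw | ConvertFrom-Csv -Header (1..$numberOfColumns) -Delimiter $delimiter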
If every line contained an embedded line break, the above logic would fail. Still, we can temporarily scrub the data by removing the line breaks inside quoted values in order to get an accurate count.
$delimiter = ";"
$fileData = (Get-Content $path | Out-String)
$numberOfColumns = ((($fileData -replace "(`"[^;]+?)`r`n",'$1') -split "`r`n" | Select -First 1).split($delimiter)).Count
$fileData | ConvertFrom-Csv -Header (1..$numberOfColumns) -Delimiter $delimiter
What this does is find a double quote followed by data that does not contain the delimiter, plus the newline that follows, and drop that newline in the replacement. Once that is done we know the first line is complete, so we split and count it just like before.
Since Excel knows, let's ask it:
$path = "path\to\bad.csv"
$excel = New-Object -ComObject Excel.Application
$workbook = $excel.Workbooks.Open($path)
$sheet = $workbook.ActiveSheet
$columnIndex = 1
while ($sheet.Cells.Item(1, $columnIndex).Text -ne "") {
    $columnIndex++
}
"There are $($columnIndex - 1) columns in CSV file $path"
Start-Sleep -Seconds 1
Get-Process excel | Stop-Process -Force
As pointed out by Ansgar Wiechers in the comments, there is a much shorter solution:
$path = "path\to\bad.csv"
$excel = New-Object -ComObject Excel.Application
$workbook = $excel.Workbooks.Open($path)
$sheet = $workbook.ActiveSheet
$columnCount = $sheet.UsedRange.Columns.Count
"There are $columnCount columns in CSV file $path"
Start-Sleep -Seconds 1
Get-Process excel | Stop-Process -Force
(I know my way of killing Excel is dirty, but IIRC closing it cleanly takes quite a bit more code.)
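For reference, a cleaner shutdown is possible (a sketch, not from the original answer): close the workbook, quit Excel, and release the COM reference.
# Sketch: close without saving, quit the application, then release the COM object.
$workbook.Close($false)
$excel.Quit()
[System.Runtime.InteropServices.Marshal]::ReleaseComObject($excel) | Out-Null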
I know this is very old, but I came across a similar situation today (I did not have rows of varying column counts) and found my own solution, so I thought I would share it for anyone else in this situation. My solution was to use Get-Content on the first row of the CSV and -split on the delimiter (,) to create an array, then return the count of that array. As mentioned in the replies above, this will not account for delimiters inside quotation marks.
((Get-Content $PathToCsv)[0] -split ",").count
I had the same issue and went with AAgent's suggestion.
$CommaCount = ((Get-Content $PathToCsv)[0] -split ",").count
$SemicolonCount = ((Get-Content $PathToCsv)[0] -split ";").count
if ($CommaCount -gt $SemicolonCount){
    $CMSlist = Import-Csv ($PathToCsv) -Delimiter ","
}
else{
    $CMSlist = Import-Csv ($PathToCsv) -Delimiter ";"
}

How to modify contents of a pipe-delimited text file with PowerShell

I have a pipe-delimited text file. The file contains "records" of various types. I want to modify certain columns for each record type. For simplicity, let's say there are 3 record types: A, B, and C. A has 3 columns, B has 4 columns, and C has 5 columns. For example, we have:
A|stuff|more_stuff
B|123|other|x
C|something|456|stuff|more_stuff
B|78903|stuff|x
A|1|more_stuff
I want to append the prefix "P" to all desired columns. For A, the desired column is 2. For B, the desired column is 3. For C, the desired column is 4.
So, I want the output to look like:
A|Pstuff|more_stuff
B|123|Pother|x
C|something|456|Pstuff|more_stuff
B|78903|Pstuff|x
A|P1|more_stuff
I need to do this in PowerShell. The file could be very large, so I'm thinking about going with the .NET File class. If it were a simple string replacement, I would do something like:
$content = [System.IO.File]::ReadAllText("H:\test_modify_contents.txt").Replace("replace_text","something_else")
[System.IO.File]::WriteAllText("H:\output_file.txt", $content)
But, it's not so simple in my particular situation. So, I'm not even sure if ReadAllText and WriteAllText is the best solution. Any ideas on how to do this?
I would use ConvertFrom-Csv so you can treat each line as an object. In this code I added a header, mainly for readability; the header is cut out of the output on the last line anyway:
# $input is an automatic variable in PowerShell, so use a different name for the source path
$inputFile = "H:\test_modify_contents.txt"
$output = "H:\output_file.txt"
$data = Get-Content -Path $inputFile | ConvertFrom-Csv -Delimiter '|' -Header 'Column1','Column2','Column3','Column4','Column5'
$data | % {
    If ($_.Column5) {
        # type C:
        $_.Column4 = "P$($_.Column4)"
    } ElseIf ($_.Column4) {
        # type B:
        $_.Column3 = "P$($_.Column3)"
    } Else {
        # type A:
        $_.Column2 = "P$($_.Column2)"
    }
}
$data | Select Column1,Column2,Column3,Column4,Column5 | ConvertTo-Csv -Delimiter '|' -NoTypeInformation | Select-Object -Skip 1 | Set-Content -Path $output
It does add extra | for the type A and B lines. Output:
"A"|"Pstuff"|"more_stuff"||
"B"|"123"|"Pother"|"x"|
"C"|"something"|"456"|"Pstuff"|"more_stuff"
"B"|"78903"|"Pstuff"|"x"|
"A"|"P1"|"more_stuff"||
If your file sizes are large then reading the complete file contents at once using Import-Csv or ReadAllText is probably not a good idea. I would use the Get-Content cmdlet with the -ReadCount parameter, which streams the file one row at a time, and then use a regex for the processing. Something like this:
Get-Content your_in_file.txt -ReadCount 1 | % {
    $_ -replace '^(A\||B\|[^\|]+\||C\|[^\|]+\|[^\|]+\|)(.*)$', '$1P$2'
} | Set-Content your_out_file.txt
EDIT:
This version should output faster:
$d = Get-Date
Get-Content input.txt -ReadCount 1000 | % {
    $_ | % {
        $_ -replace '^(A\||B\|[^\|]+\||C\|[^\|]+\|[^\|]+\|)(.*)$', '$1P$2'
    } | Add-Content output.txt
}
(New-TimeSpan $d (Get-Date)).TotalMilliseconds
For me this processed 50k rows in 350 milliseconds. You can probably get more speed by tweaking the -ReadCount value to find the ideal amount.
Given the large input file, I would not use either ReadAllText or Get-Content, since they end up reading the entire file into memory. Consider using something along the lines of:
$filename = ".\input2.csv"
$outfilename = ".\output2.csv"
function ProcessFile($inputfilename, $outputfilename)
{
    $reader = [System.IO.File]::OpenText($inputfilename)
    $writer = New-Object System.IO.StreamWriter $outputfilename
    $record = $reader.ReadLine()
    while ($record -ne $null)
    {
        $writer.WriteLine(($record -replace '^(A\||B\|[^\|]+\||C\|[^\|]+\|[^\|]+\|)(.*)$', '$1P$2'))
        $record = $reader.ReadLine()
    }
    $reader.Close()
    $reader.Dispose()
    $writer.Close()
    $writer.Dispose()
}
ProcessFile $filename $outfilename
EDIT: After testing all the suggestions on this page, I borrowed the regex from Dave Sexton and this is the fastest implementation. It processes a 1 GB+ file in 175 seconds. All other implementations are significantly slower on large input files.

Output PowerShell variables to a text file

I'm new to PowerShell and have a script which loops through Active Directory searching for certain computers. I get several variables and then run functions to check things like WMI and registry settings.
In the console, my script runs great and simple Write-Host command prints the data on the screen as I want. I know about Export-Csv when using the pipeline...but I'm not looking to print from the pipeline.
I want to write the variables to a text file, continue the loop, and check the next computer in AD...output the next iteration of the same variables on the next line. Here is my Write-Host:
Write-Host ($computer)","($Speed)","($Regcheck)","($OU)
Output file:
$computer,$Speed,$Regcheck | out-file -filepath C:\temp\scripts\pshell\dump.txt -append -width 200
It gives me the data, but each variable is on its own line. Why? I'd like all the variables on one line with comma separation. Is there a simple way to do this akin to VB writeline? My PowerShell version appears to be 2.0.
Use this:
"$computer, $Speed, $Regcheck" | out-file -filepath C:\temp\scripts\pshell\dump.txt -append -width 200
I usually construct custom objects in these loops, and then add these objects to an array that I can easily manipulate, sort, export to CSV, etc.:
# Construct an out-array to use for data export
$OutArray = @()

# The computer loop you already have
foreach ($server in $serverlist)
{
    # Construct an object
    $myobj = "" | Select "computer", "Speed", "Regcheck"

    # Fill the object
    $myobj.computer = $computer
    $myobj.speed = $speed
    $myobj.regcheck = $regcheck

    # Add the object to the out-array
    $outarray += $myobj

    # Wipe the object just to be sure
    $myobj = $null
}

# After the loop, export the array to CSV
$outarray | export-csv "somefile.csv"
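One caveat (my addition): on PowerShell 2.0, Export-Csv writes a "#TYPE ..." line at the top of the file unless you suppress it:
# Sketch: -NoTypeInformation drops the #TYPE header line from the CSV.
$outarray | Export-Csv "somefile.csv" -NoTypeInformation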
You can concatenate an array of values together using PowerShell's -join operator. Here is an example:
$FilePath = '{0}\temp\scripts\pshell\dump.txt' -f $env:SystemDrive;
$Computer = 'pc1';
$Speed = 9001;
$RegCheck = $true;
$Computer,$Speed,$RegCheck -join ',' | Out-File -FilePath $FilePath -Append -Width 200;
Output
pc1,9001,True
$computer,$Speed,$Regcheck creates an array, and Out-File writes each element on its own line, so the variables end up on separate lines. If you construct a single string from the variables first, it will show up on a single line. Like this:
"$computer,$Speed,$Regcheck" | out-file -filepath C:\temp\scripts\pshell\dump.txt -append -width 200
The simple solution is to avoid creating an array before piping to Out-File. Rule #1 of PowerShell is that the comma is a special delimiter, and the default behavior is to create an array. Concatenation is done like this.
$computer + "," + $Speed + "," + $Regcheck | out-file -filepath C:\temp\scripts\pshell\dump.txt -append -width 200
This creates an array of three items.
$computer,$Speed,$Regcheck
FYKJ
100
YES
vs. concatenation of three items separated by commas.
$computer + "," + $Speed + "," + $Regcheck
FYKJ,100,YES
I was led here by my Google searching. In a show of good faith, I have included what I pieced together from parts of this code and other code I've gathered along the way.
# This script is useful if you have attributes or properties that span across several commandlets
# and you wish to export a certain data set but all of the properties you wish to export are not
# included in only one commandlet so you must use more than one to export the data set you want
#
# Created: Joshua Biddle 08/24/2017
# Edited: Joshua Biddle 08/24/2017
#
$A = Get-ADGroupMember "YourGroupName"

# Construct an out-array to use for data export
$Results = @()

foreach ($B in $A)
{
    # Construct an object
    $myobj = Get-ADUser $B.samAccountName -Properties ScriptPath,Office

    # Fill the object
    $Properties = @{
        samAccountName = $myobj.samAccountName
        Name = $myobj.Name
        Office = $myobj.Office
        ScriptPath = $myobj.ScriptPath
    }

    # Add the object to the out-array
    $Results += New-Object psobject -Property $Properties

    # Wipe the object just to be sure
    $myobj = $null
}

# After the loop, export the array to CSV
$Results | Select "samAccountName", "Name", "Office", "ScriptPath" | Export-CSV "C:\Temp\YourData.csv"
Cheers

Extracting columns from text file using PowerShell

I have to extract columns from a text file explained in this post:
Extracting columns from text file using Perl one-liner: similar to Unix cut
but I have to do this also on a Windows Server 2008 machine which does not have Perl installed. How could I do this using PowerShell? Any ideas or resources? I'm a PowerShell noob...
Try this:
Get-Content test.txt | Foreach {($_ -split '\s+',4)[0..2]}
And if you want the data in those columns printed on the same line:
Get-Content test.txt | Foreach {"$(($_ -split '\s+',4)[0..2])"}
Note that this requires PowerShell 2.0 for the -split operator. Also, the ,4 tells the split operator the maximum number of substrings you want, but keep in mind the last substring will always contain all the remaining text, unsplit.
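A quick illustration of that maximum-count behaviour (a sketch; the sample line is made up):
# Sketch: with a maximum of 4 substrings, the last element keeps the remainder unsplit.
'a b c d e f' -split '\s+',4
# Returns: a, b, c, "d e f"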
For fixed width columns, here's one approach for column width equal to 7 ($w=7):
$res = Get-Content test.txt | Foreach {
    $i = 0; $w = 7; $c = 0
    while ($i + $w -lt $_.length -and $c++ -lt 2) {
        $_.Substring($i, $w)
        $i = $i + $w - 1
    }
}
$res will contain each column for all rows. To set the max columns change $c++ -lt 2 from 2 to something else. There is probably a more elegant solution but don't have time right now to ponder it. :-)
Assuming it's white space delimited this code should do.
$fileName = "someFilePath.txt"
$columnToGet = 2
$columns = gc $fileName |
%{ $_.Split(" ",[StringSplitOptions]"RemoveEmptyEntries")[$columnToGet] }
Or, more plainly:
type foo.bar | % { $_.Split(" ") | select -first 3 }
Try this. It will help you skip initial rows if you want, extract/iterate through the columns, edit the column data, and rebuild the record:
$header3 = @("Field_1","Field_2","Field_3","Field_4","Field_5")
Import-Csv $fileName -Header $header3 -Delimiter "`t" | select -skip 3 | Foreach-Object {
    $record = $indexName
    foreach ($property in $_.PSObject.Properties){
        # doSomething $property.Name, $property.Value
        if($property.Name -like '*CUSIP*'){
            $record = $record + "," + '"' + $property.Value + '"'
        }
        else{
            $record = $record + "," + $property.Value
        }
    }
    $array.add($record) | out-null
    # write-host $record
}