PowerShell Word table single row error

I have the following function, to which I provide three arrays as variables:
$columnHeaders = @('Ticket ID', 'Date Raised', 'Title', 'Status')
$columnProperties = @('number', 'opened_at', 'short_description', 'state')
$contents
$contents has multiple rows of data matching the columns above, but sometimes it may have only one row. When $contents has only one row, the function below errors out and doesn't print the data.
Using the ISE, I traced the issue to $contents.Count not returning a value. Why is this, and is there a way to get around it?
function TableOutput ($columnHeaders, $columnProperties, $contents){
    # Number of columns
    $columnCount = $columnHeaders.Count
    # Create a new table
    $docTable = $Word.ActiveDocument.Tables.Add($Word.Selection.Range, $contents.Count, $columnCount)
    # Table style
    $docTable.Style = "Adapt Table"
    # Insert the column headers into the table
    for ($col = 0; $col -lt $columnCount; $col++) {
        $cell = $docTable.Cell(1, $col + 1).Range
        $cell.Font.Bold = $true
        $cell.InsertAfter($columnHeaders[$col])
    }
    $docTable.Rows.Add() > $null
    # Load the data into the table
    $i = 1
    $j = $contents.Count
    for ($row = 2; $row -lt ($contents.Count + 2); $row++){
        if ($row -gt 2){
        }
        for ($col = 1; $col -le $columnCount; $col++){
            Write-Progress -Activity "Processing Table Information" -Status "Adding Row entry $i of $j" -PercentComplete (100*$i/$j)
            $cell = $docTable.Cell($row, $col).Range
            $cell.Font.Name = "Calibri"
            $cell.Font.Size = "10"
            $cell.Font.Bold = $FALSE
            $cell.Text = $contents[$row-2].($columnProperties[$col-1])
        }
        $i++
    }
    $docTable.Columns.AutoFit()
}
Any help is greatly appreciated.

Cast $contents as an array of strings and see if that doesn't work better for you:
function TableOutput ($columnHeaders, $columnProperties, [string[]]$contents){
Edit: Sorry, my bad, you are passing objects with properties as described in $columnHeaders, so you would need to cast it as an array of objects instead:
function TableOutput ($columnHeaders, $columnProperties, [object[]]$contents){
Tested on my end: it works fine with a single object being passed to the function, as well as an array of two objects.
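Another workaround, not from the original answer, is to force the value into an array with the @() array-subexpression operator, so .Count is reliable whether one row or many rows are passed. A minimal, made-up demonstration:
# Hypothetical single ticket object, shaped like the items in $contents
$oneRow = [PSCustomObject]@{ number = 'INC0001'; opened_at = '2016-01-01'; short_description = 'Test'; state = 'Open' }

# On some PowerShell versions a single object does not expose .Count;
# wrapping in @() always yields an array, so Count is 1
@($oneRow).Count

# Inside the function you could likewise do: $contents = @($contents)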

How to dynamically add PSCustomObjects to a list [duplicate]

I am creating a script to parse a CSV file, where I store the content of each indexed field in the CSV as a NoteProperty in a PSCustomObject.
As I parse the file line by line, I add the PSCustomObject to a list type. When I output my list, I want to be able to do something like:
$list | Format-Table
and have a nice view of each row in the csv file, separated into columns with the heading up top.
Problem
When I add a PSCustomObject to the list, it changes the type of the list to a PSCustomObject. In practice, this has the apparent effect of applying any updates made to that PSCustomObject to every element in the list retroactively.
Here is a sample:
$list = [System.Collections.Generic.List[object]]::new()
$PSCustomObject = [PSCustomObject]@{ count = 0 }
Foreach ($i in 1..5) {
    $PSCustomObject.count += 1
    $list.Add($PSCustomObject)
}
Expected Output:
PS>$list
count
-----
1
2
3
4
5
Actual Output:
PS>$list
count
-----
5
5
5
5
5
Question
Is there any way to get the expected output?
Limitations / additional context if it helps
I'm trying to optimize performance, as I may parse very large CSV files. This is why I am stuck with a list. I understand the Add method in lists is faster than recreating an array with += for every row. I am also using a runspace pool to parse each field separately and update the object via $list.$field[$lineNumber] = <field value>, so this is why I need a way to dynamically update the PSCustomObject. A larger view of my code is:
$out = [hashtable]::Synchronized(@{})
$out.Add($key, @{'dataSets' = [List[object]]::new() } ) ### $key is the file name as I loop through each csv in a directory.
$rowTemplate = [PSCustomObject]@{rowNum = 0}
### Additional steps to prepare the $out dictionary and some other variables
...
...
try {
    ### Skip lines prior to the line with the headers
    $fileParser = [System.IO.StreamReader]$path
    Foreach ( $i in 1..$headerLineNumber ) {
        [void]$fileParser.ReadLine()
    }
    ### Load the file into a variable, and add empty PSCustomObjects for each line as a placeholder.
    while ($null -ne ($line = $fileParser.ReadLine())) {
        [void]$fileContents.Add($line)
        $rowTemplate.RowNum += 1
        [void]$out.$key.dataSets.Add($rowTemplate)
    }
}
finally { $fileParser.Close(); $fileParser.Dispose() }
### Prepare the script block for each runspace
$runspaceScript = {
    Param( $fileContents, $column, $columnIndex, $delimiter, $key, $out )
    $columnValues = [System.Collections.ArrayList]::new()
    $linecount = 0
    Foreach ( $line in $fileContents) {
        $entry = $line.split($delimiter)[$columnIndex]
        $out.$key.dataSets[$linecount].$column = $entry
        $linecount += 1
    }
}
### Instantiate the runspace pool.
PS Version (5.1.19041)
You're (re-)adding the same object to the list, over and over.
You need to create a new object every time your loop runs, but you can still "template" the objects - just use a hashtable/dictionary instead of a custom object:
# this hashtable will be our object "template"
$scaffold = @{ Count = 0 }
foreach($i in 1..5){
    $scaffold.Count += 1
    $newObject = [pscustomobject]$scaffold
    $list.Add($newObject)
}
As mklement0 suggests, if you're templating objects with multiple properties you might want to consider using an ordered dictionary to retain the order of the properties:
# this hashtable will be our object "template"
$scaffold = [ordered]@{ ID = 0; Count = 0 }
foreach($i in 1..5){
    $scaffold['ID'] = Get-Random
    $scaffold['Count'] = $i
    $newObject = [pscustomobject]$scaffold
    $list.Add($newObject)
}
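Another option, not mentioned in the answer above, is to keep a PSCustomObject as the template and clone it on each iteration with its PSObject.Copy() method, which produces a shallow copy; a rough sketch:
$list = [System.Collections.Generic.List[object]]::new()
$template = [pscustomobject]@{ ID = 0; Count = 0 }
foreach ($i in 1..5) {
    # Copy() returns a new (shallow) copy, so each element in the list is a distinct object
    $clone = $template.psobject.Copy()
    $clone.ID = Get-Random
    $clone.Count = $i
    $list.Add($clone)
}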

PowerShell: get column number from CSV file header

I need to get the column number from an imported CSV based on a particular column name ($_."Status"). Once I have the correct column number, I can assign it to a variable and use it in a foreach loop to write text to the corresponding cells, e.g. $wsSource.cells.item($tr,49) = "Added by xyz)". Note that the column position often varies from file to file.
I already have the index/row number via $tr = $source.IndexOf($row), but I'm struggling with the column number.
Thanks in advance,
Jason
Incomplete code from a much larger PS script that writes two different Excel files in the one loop:
$source = Import-Csv $csvFile
$i = 2
foreach($row in $source.where{$_.Contacted -like "*Invalid"})
{
    $tr = ($source.IndexOf($row)+2)
    $wsTemp.cells.item($i,4) = $timeStamp
    $wsTemp.cells.item($i,10) = $row."Last Name"
    $wsSource.cells.item($tr,49) = "Added by xyz)"
    $wsSource.cells.item($tr,49).Interior.ColorIndex = 19
    $i++
}
}
elseif ...
You need to create a hashtable that maps the Excel column header names to their indices:
# create a hash with Excel column header names and their indices
$colMax = $wsSource.UsedRange.Columns.Count
$xlHeaders = @{}
for ($col = 1; $col -le $colMax; $col++) {
    $name = $wsSource.Cells.Item(1, $col).Value() # assuming the first row has the headers
    $xlHeaders[$name] = $col
}
Now you can match the column from the CSV with the column index in Excel like this:
if (!$xlHeaders.ContainsKey('Status')) {
    Write-Warning "Excel sheet does not have a column named 'Status'"
}
else {
    $xlColumn = $xlHeaders['Status']
    $wsSource.Cells.Item($tr, $xlColumn) = "Added by xyz)"
    $wsSource.Cells.Item($tr, $xlColumn).Interior.ColorIndex = 19
}
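Put together with the loop from the question, it might look like this (illustrative only; the variable names are taken from the question's snippet):
foreach ($row in $source.where{ $_.Contacted -like "*Invalid" })
{
    $tr = ($source.IndexOf($row) + 2)
    if ($xlHeaders.ContainsKey('Status')) {
        # look up the column position by header name instead of hard-coding 49
        $xlColumn = $xlHeaders['Status']
        $wsSource.Cells.Item($tr, $xlColumn) = "Added by xyz)"
        $wsSource.Cells.Item($tr, $xlColumn).Interior.ColorIndex = 19
    }
}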

for loop through 5 textboxes

I have created a GUI with five textboxes, named $textboxHost1 through $textboxHost5.
I also have an array in which I save up to five values and then want to write each value, in order, into the textboxes. The first value in the array should be written into $textboxHost1.
To do that, I would like to use a for loop, and I have written this code:
# $hostnameneingabe: array in which the values are saved
$hostnameneingabeCount = $hostnameneingabe.Count
for($i = 0; $i -le $hostnameneingabeCount; $i++) {
    # code here
}
Now I'm looking for a way to go through them in order, so that $textboxHost1 comes first, and so on.
To be precise, the number in the $textboxHost variable name should be incremented in the loop, and the value at position $i in the array should be written into that textbox.
Something like:
for($i = 0; $i -le $hostnameneingabeCount; $i++) {
    $textboxHost$i =
}
I suppose you are looking for something like this?
$textboxHosts = Get-Variable | ? {$_.Name -match "textBoxHost[0-9]" -and $_.Value -ne $null} | sort Name
After this you can process that variable with e.g. a foreach:
foreach ($textboxHost in $textboxHosts) {<# Do some stuff #>}
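If you prefer to keep the indexed for loop from the question, Get-Variable can also resolve the concatenated name directly. A sketch, assuming $textboxHost1 through $textboxHost5 and $hostnameneingabe already exist:
for ($i = 0; $i -lt $hostnameneingabe.Count; $i++) {
    # "textboxHost$($i + 1)" builds the variable name; -ValueOnly returns the textbox object itself
    $textbox = Get-Variable -Name "textboxHost$($i + 1)" -ValueOnly
    $textbox.Text = $hostnameneingabe[$i]
}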
You have to use an array, because otherwise you can't loop through them:
$textboxHost = @(0..4)
# Textbox 0
$textboxHost[0] = New-Object System.Windows.Forms.TextBox
$textboxHost[0].Text = "test"
# Textbox 1
$textboxHost[1] = New-Object System.Windows.Forms.TextBox
$textboxHost[1].Text = "test"
foreach ($textbox in $textboxHost){
    # Do whatever you want with the textbox
    $textbox =
}
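With the array-based approach, filling the boxes from the value array then becomes a simple indexed loop (illustrative sketch, assuming $hostnameneingabe holds the values):
for ($i = 0; $i -lt $hostnameneingabe.Count; $i++) {
    $textboxHost[$i].Text = $hostnameneingabe[$i]
}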

Is this the best way to replace text in all of an object's properties in PowerShell?

I have a large CSV file in which some fields have a new line embedded. Excel 2016 produces errors when importing a CSV with rows which have fields with a new line embedded.
Based on this post, I wrote code to replace any new line in any field with a space. Below is a code block that duplicates the functionality and issue. Option 1 works. Option 2, which is commented out, casts my object to a string. I was hoping Option 2 might run faster.
Question: Is there a better way to do this to optimize for performance processing very large files?
$array = @([PSCustomObject]@{"ID"="1"; "Name"="Joe`nSmith"},
           [PSCustomObject]@{"ID"="2"; "Name"="Jasmine Baker"})
$array = $array | ForEach-Object {
    # Option 1: produces an Object, but is code optimized?
    foreach ($n in $_.PSObject.Properties.Name) {
        $_.PSObject.Properties[$n].Value = `
            $_.PSObject.Properties[$n].Value -replace "`n"," "
    }
    # Option 2: produces a string, not an object
    #$_ = $_ -replace "`n"," "
    $_
}
Keep in mind that in my real-world use case, each row has > 15 fields and any combination of them may have one or more new lines embedded.
Use the fast TextFieldParser to read, process, and build the CSV from the file (PowerShell 3+):
[Reflection.Assembly]::LoadWithPartialName('Microsoft.VisualBasic') > $null
$parser = New-Object Microsoft.VisualBasic.FileIO.TextFieldParser 'r:\1.csv'
$parser.SetDelimiters(',')
$header = $parser.ReadFields()
$CSV = while (!$parser.EndOfData) {
    $i = 0
    $row = [ordered]@{}
    foreach ($field in $parser.ReadFields()) {
        $row[$header[$i++]] = $field.replace("`n", ' ')
    }
    [PSCustomObject]$row
}
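If the goal is to hand the cleaned data back to Excel, you could then export it again (the output path here is just a placeholder):
$CSV | Export-Csv 'r:\1_clean.csv' -NoTypeInformation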
Or modify each field in-place in an already existing CSV array:
foreach ($row in $CSV) {
    foreach ($field in $row.PSObject.Properties) {
        $field.value = $field.value.replace("`n", ' ')
    }
}
Notes:
The foreach statement is much faster than piping to ForEach-Object (which is also aliased as foreach).
$stringVariable.Replace() is faster than the -replace operator (note that .Replace() does a literal replacement, while -replace uses a regex).
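If you want to verify those claims on your own data, Measure-Command gives a quick comparison (illustrative only; the sample array is made up):
$sample = @("Joe`nSmith") * 100000

# String .Replace() inside a foreach statement
(Measure-Command { foreach ($s in $sample) { $null = $s.Replace("`n", ' ') } }).TotalMilliseconds

# -replace operator inside ForEach-Object
(Measure-Command { $sample | ForEach-Object { $null = $_ -replace "`n", ' ' } }).TotalMilliseconds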

Creating dynamic variable array names and then adding object to them

What I'm trying to do is create array variable names dynamically, and then with a loop, add the object to its relevant array based on the hash table value being equal to the counter variable.
$hshSite = @{} # values like CO,1  NE,2  IA,3
$counter = $hshSite.count
For($i = $counter; $i -gt 0; $i--) {
    New-Variable -Name "arr$i" -Value @()
}
If $counter = 3, I would create arrays $arr1, $arr2, $arr3
$csv = Import-CSV....
ForEach ($x in $csv) {
    # if $hshSite.Name = $x.location (ie CO), look up hash value (1),
    # and add the object to $arr1. If $hshSite.Name = NE, add to $arr2
}
I tried creating the dynamic arrays with New-Variable, but I'm having issues trying to add to those arrays. Is it possible to concatenate two variable names into a single variable name? So taking $arr + $i to form $arr1, $arr2 and $arr3, and then I can essentially just do $arr0 += $_
The end goal is to group things based on CO, NE, IA for further sorting/grouping/processing. And I'm open to other ideas of getting this accomplished. Thanks for your help!
Just make your hash table values the arrays, and accumulate the values to them directly:
$Sites = 'CO','NE','IA'
$hshSite = @{}
Foreach ($Site in $Sites){ $hshSite[$Site] = @() }
ForEach ($x in $csv)
{
    $hshSite[$x.location] += <whatever it is you're adding>
}
If there are a lot of entries in the CSV, you might consider creating those values as ArrayLists instead of arrays.
$Sites = 'CO','NE','IA'
$hshSite = @{}
Foreach ($Site in $Sites){ $hshSite[$Site] = New-Object Collections.ArrayList }
ForEach ($x in $csv)
{
    $hshSite[$x.location].Add('<whatever it is you are adding>') > $null
}
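A generic list works just as well and avoids the output redirection, because its Add() method returns nothing; a sketch (not part of the original answer, and it simply adds the CSV row itself):
$Sites = 'CO','NE','IA'
$hshSite = @{}
Foreach ($Site in $Sites){ $hshSite[$Site] = [System.Collections.Generic.List[object]]::new() }
ForEach ($x in $csv)
{
    # List[object].Add() returns void, so there is no output to suppress
    $hshSite[$x.location].Add($x)
}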
You could quite easily add items to a dynamically named array variable using the Get-Variable cmdlet, similar to the following:
$MyArrayVariable123 = @()
$VariableNamePrefix = "MyArrayVariable"
$VariableNameNumber = "123"
$DynamicallyRetrievedVariable = Get-Variable -Name ($VariableNamePrefix + $VariableNameNumber)
$DynamicallyRetrievedVariable.Value += "added item"
After running the above code, the $MyArrayVariable123 variable would be an array holding the single string "added item".
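For the stated end goal of grouping rows by site for further processing, Group-Object is another option worth knowing; this is a sketch, not from either answer:
# builds a hashtable of location -> array of matching CSV rows in one pass
$bySite = $csv | Group-Object -Property location -AsHashTable -AsString

# e.g. all rows whose location column is 'CO' (assuming such rows exist)
$bySite['CO']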