PowerShell Select-Object outputs array on one line

I have around 20 arrays which contain over 100 values each.
I want to output these to a csv file with column headings.
If I type any of these arrays at a PowerShell command prompt they display on multiple lines, and I can select individual items from the array using $arrayname[14] for example, so I think they are being stored correctly.
If I use the following line in my script:
"" | select-object #{Name="Column1"; Expression={"$Array1"}},#{Name="Column2"; Expression={"$Array2"}},#{Name="Column3"; Expression={"$Array3"}} | export-csv $exportLocation -notypeinformation
Then it creates the columns with the heading but each array variable is displayed on one line.
How can I get the output to display the arrays in the respective columns on a line of their own?

You need to convert your arrays into a single array of objects, one property per column. Try this (shown here with 4 arrays):
$Array1 = @(...)
$Array2 = @(...)
$Array3 = @(...)
$Array4 = @(...)

$len1   = [Math]::Max($Array1.Length, $Array2.Length)
$len2   = [Math]::Max($Array3.Length, $Array4.Length)
$maxlen = [Math]::Max($len1, $len2)

$csv = for ($i = 0; $i -lt $maxlen; $i++) {
    New-Object -Type PSCustomObject -Property @{
        'Column1' = $Array1[$i]
        'Column2' = $Array2[$i]
        'Column3' = $Array3[$i]
        'Column4' = $Array4[$i]
    }
}
$csv | Export-Csv 'C:\path\to\output.csv'
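If the column order in the CSV matters, a variant of the loop above (a sketch assuming PowerShell 3.0 or later) is to emit [pscustomobject] literals, which preserve the property order as written:
# Sketch, assuming PowerShell 3.0+ and reusing $Array1..$Array4 / $maxlen from above;
# [pscustomobject]@{...} keeps the properties in the order they are written
$csv = for ($i = 0; $i -lt $maxlen; $i++) {
    [pscustomobject]@{
        Column1 = $Array1[$i]
        Column2 = $Array2[$i]
        Column3 = $Array3[$i]
        Column4 = $Array4[$i]
    }
}
$csv | Export-Csv 'C:\path\to\output.csv' -NoTypeInformation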

Related

Powershell - Search every element of a large array against every element of another large array

I have two large arrays. One is an array (call it Array1) of 100,000 PSCustomObjects, each of which has a property called "Token". And the other array is simply an array of strings, the size of this second array being 2500 elements.
The challenge is that EVERY element of Array1 needs to be checked against all the elements in Array2 and tagged accordingly, i.e., if the Token value from Array1 matches any of the elements from Array2, label it as "Match found!".
Looping through would actually make it extremely slow. Is there a better way to do this?
P.S.: The items in Array1 have an ordinal number property as well, and the array is sorted in that order.
Here is the code:
$Array1 = @()
$Array2 = @()

# Sample object:
$obj = New-Object -TypeName PSCustomObject
$obj | Add-Member -MemberType NoteProperty -Name Token -Value "SOMEVALUEHERE"
$obj | Add-Member -MemberType NoteProperty -Name TokenOrdinalNum -Value 1
$Array1 += $obj # This array has 100K such objects

$Array2 = @("VAL1", "SOMEVALUEHERE", ......) # Array2 has 2500 such strings.
The output of this would need to be a new array of objects, say 'ArrayFinal', that has an additional noteproperty called 'MatchFound'.
Please help.
I would create a Hashtable for fast lookups from the values in your $Array2.
For clarity, I have renamed $Array1 and $Array2 into $objects and $tokens.
# the object array
$objects = [PsCustomObject]@{ Token = 'SOMEVALUEHERE'; TokenOrdinalNum = 1 },
           [PsCustomObject]@{ Token = 'VAL1'; TokenOrdinalNum = 123 },
           [PsCustomObject]@{ Token = 'SomeOtherValue'; TokenOrdinalNum = 555 } # etcetera
# the array with token keywords to check
$tokens = 'VAL1', 'SOMEVALUEHERE', 'ShouldNotFindThis' # etcetera
# create a lookup Hashtable from the array of token values for FAST lookup
# you can also use a HashSet ([System.Collections.Generic.HashSet[string]]::new())
# see https://learn.microsoft.com/en-us/dotnet/api/system.collections.generic.hashset-1
$lookup = @{}
$tokens | ForEach-Object { $lookup[$_] = $true } # it's only the Keys that matter, the value is not important
# now loop over the objects in the first array and check their 'Token' values
$ArrayFinal = foreach ($obj in $objects) {
    $obj | Select-Object *, @{Name = 'MatchFound'; Expression = { $lookup.ContainsKey($obj.Token) }}
}
# output on screen
$ArrayFinal | Format-Table -AutoSize
# write to Csv ?
$ArrayFinal | Export-Csv -Path 'Path\To\MatchedObjects.csv' -NoTypeInformation
Output:
Token          TokenOrdinalNum MatchFound
-----          --------------- ----------
SOMEVALUEHERE                1       True
VAL1                       123       True
SomeOtherValue             555      False
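For completeness, here is a sketch of the HashSet alternative mentioned in the comment above; the PowerShell 5+ ::new() syntax and the case-insensitive comparer are assumptions, not part of the original answer:
# Sketch: HashSet-based lookup (assumes PowerShell 5+ for ::new();
# the case-insensitive comparer is an assumption about the matching rules)
$set = [System.Collections.Generic.HashSet[string]]::new([System.StringComparer]::OrdinalIgnoreCase)
foreach ($t in $tokens) { [void]$set.Add($t) }

$ArrayFinal = foreach ($obj in $objects) {
    $obj | Select-Object *, @{Name = 'MatchFound'; Expression = { $set.Contains($obj.Token) }}
}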
100kb (102,400) objects isn't too big. Here's an example using Compare-Object. By default it checks every object against every other object (919 ms). EDIT: OK, if I change the order of $b, it takes much longer (13 minutes). Sorting both lists first should work well if most of the positions end up the same (1.99 s with Measure-Command). If every item were off by one position, it would still take a long time (e.g. $b = 1,$b).
$a = foreach ($i in 1..100kb) { [pscustomobject]@{ token = Get-Random } }
$a = $a | sort-object token
$b = $a.token | sort-object
compare-object $a.token $b -IncludeEqual
InputObject SideIndicator
----------- -------------
1507400001 ==
120471924 ==
28523825 ==
...

Compare PSCustomObject to Object

I have created a PSCustomObject; when the variable is called in ISE, it displays a table of the relevant data. However, if I try to compare the PSCustomObject with another object, the PSCustomObject isn't read correctly. I'd like to tell the script: if any of the lines in the existing CSV match the PSCustomObject, do not export the data to the CSV; in other words, skip duplicate rows in the CSV file. The CSV may or may not have multiple rows.
$fileInfo = @(
    [pscustomobject]@{
        user_id          = $user
        studio           = $studio
        function         = $Task
        end_time_local   = $creationTime
        asin             = $ASIN
        variant          = $variant
        process_class_id = $processClass
    }
)

$currentData = Import-Csv "$scansFolder\$fileName.csv"

if ($fileInfo -ne $currentData) {
    $fileInfo | Export-Csv "$scansFolder\$fileName.csv" -Append -NoTypeInformation -Force
}
[pscustomobject] is a .NET reference type, so comparing two instances[1] with -eq will test for reference equality (identity), i.e. if the two instances are one and the same object[2] - which is obviously not the case in your scenario.
Assuming that the properties of your custom objects are instances of value types or strings (which appears to be the case), you can use Compare-Object to compare objects by their property values, with the ability to compare two collections:
$fileInfo = @(
    [pscustomobject]@{
        user_id          = $user
        studio           = $studio
        function         = $Task
        end_time_local   = $creationTime
        asin             = $ASIN
        variant          = $variant
        process_class_id = $processClass
    }
)
# Get the property names.
# This assumes that the CSV data has (at least) the same
# set of properties (columns).
$propNames = $fileInfo[0].psobject.properties.Name
$currentData = Import-Csv "$scansFolder\$fileName.csv"
# Compare the $fileInfo custom object(s) to the custom objects read
# from the CSV file and only append those that are unique to $fileInfo
# ('<='), i.e., rows that aren't already present in the file.
Compare-Object -Property $propNames $fileInfo $currentData |
    Where-Object SideIndicator -eq '<=' | Select-Object -Property $propNames |
        Export-Csv "$scansFolder\$fileName.csv" -Append -NoTypeInformation -Force
[1] Import-Csv outputs [pscustomobject] instances too.
[2] See the Equality Comparison help topic (written for C#, but applies analogously to PowerShell's -eq operator).
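One practical caveat about the snippet above (an assumption about the surrounding script, not something the question states): Import-Csv fails if the CSV does not exist yet, so a guard along these lines may be needed:
# Sketch: only compare when the CSV already exists; otherwise just create it
$csvPath = "$scansFolder\$fileName.csv"
if (Test-Path -LiteralPath $csvPath) {
    $currentData = Import-Csv -LiteralPath $csvPath
    # ... Compare-Object logic from above ...
} else {
    $fileInfo | Export-Csv -LiteralPath $csvPath -NoTypeInformation
}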

Hashtable with multiple values in GridView

I am storing data in a hashtable with multiple values like this:
$hash = @{}
$folders = dir (...) | where (...)
foreach ($folder in $folders) {
    $num1 = (...)
    $num2 = (...)
    $hash.Add($folder.Name, @($num1, $num2))
}
So this is a hash with an array in its value part. The array always has two items. When the foreach part has finished I want to show the data with Out-GridView like this:
$hash | select -Property @{Expression={$_.Name};Label="FolderName"},
                         @{Expression={$_.Name[0]};Label="num1"},
                         @{Expression={$_.Name[1]};Label="num2"} | Out-GridView
But as you can imagine, this is not working. How can I split the stored array in the value part of my hash into two new columns to show them in overall three columns in the GridView?
Should be something like Name, Value1, Value2 ...
And then multiple items which are stored in the hashtable as multiple rows.
Hashtables are not lists of objects with a Name and a Value property. That's just how PowerShell displays the data structure for your convenience. For processing a hashtable the way you tried you need an enumerator to produce such objects:
$hash.GetEnumerator() |
    Select-Object @{n='FolderName';e={$_.Name}},
                  @{n='num1';e={$_.Value[0]}},
                  @{n='num2';e={$_.Value[1]}} |
    Out-GridView
Or you can enumerate the keys of the hashtable, use them as the current objects in the pipeline, and look up the values by the respective key and index:
$hash.Keys |
    Select-Object @{n='FolderName';e={$_}},
                  @{n='num1';e={$hash[$_][0]}},
                  @{n='num2';e={$hash[$_][1]}} |
    Out-GridView
If you don't know the number of array elements beforehand you need an inner loop for processing the nested arrays, e.g. like this:
$hash.Keys | ForEach-Object {
    $o = New-Object -Type PSObject -Property @{ 'FolderName' = $_ }
    $a = $hash[$_]
    for ($i = 1; $i -le $a.Count; $i++) {
        $o | Add-Member -Type NoteProperty -Name "num$i" -Value $a[$i-1]
    }
    $o
} | Out-GridView
If you have a variable number of array elements, beware that PowerShell determines by the first object which properties will be displayed.
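To work around that display caveat, one sketch (assuming PowerShell 3.0+ for [ordered] and [pscustomobject]) is to determine the largest element count first and emit every numN property on every object:
# Sketch: pad every row to the largest array length so all numN columns show up
$max = ($hash.Values | ForEach-Object { $_.Count } | Measure-Object -Maximum).Maximum
$hash.Keys | ForEach-Object {
    $o = [ordered]@{ FolderName = $_ }
    for ($i = 1; $i -le $max; $i++) {
        $o["num$i"] = $hash[$_][$i - 1]   # $null when this array has fewer elements
    }
    [pscustomobject]$o
} | Out-GridView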

Powershell Multidimensional Arrays

I have a way of doing arrays in other languages like this:
$x = "David"
$arr = @()
$arr[$x]["TSHIRTS"]["SIZE"] = "M"
This generates an error.
You are trying to create an associative array (hash). Try the following sequence of commands:
$arr = @{}
$arr["david"] = @{}
$arr["david"]["TSHIRTS"] = @{}
$arr["david"]["TSHIRTS"]["SIZE"] = "M"
$arr.david.tshirts.size
Note the difference between hashes and arrays
$a = @{} # hash
$a = @() # array
Arrays can only have non-negative integers as indexes
from powershell.com:
PowerShell supports two types of multi-dimensional arrays: jagged arrays and true multidimensional arrays.
Jagged arrays are normal PowerShell arrays that store arrays as elements. This is very cost-effective storage because dimensions can be of different size:
$array1 = 1,2,(1,2,3),3
$array1[0]
$array1[1]
$array1[2]
$array1[2][0]
$array1[2][1]
True multi-dimensional arrays always resemble a square matrix. To create such an array, you will need to access .NET. The next line creates a two-dimensional array with 10 and 20 elements resembling a 10x20 matrix:
$array2 = New-Object 'object[,]' 10,20
$array2[4,8] = 'Hello'
$array2[9,16] = 'Test'
$array2
For a 3-dimensional array (10 x 20 x 10):
$array3 = New-Object 'object[,,]' 10,20,10
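For example, elements of that array are addressed with three comma-separated indices (a quick sketch reusing $array3 from above):
$array3[3,15,7] = 'Hello'   # assign one element
$array3[3,15,7]             # returns 'Hello'
$array3.Rank                # 3 (number of dimensions)
$array3.GetLength(1)        # 20 (size of the second dimension)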
To extend on what manojlds said above: you can nest hashtables. It may not be a true multi-dimensional array, but it gives you some ideas about how to structure the data. An example:
$hash = @{}
$computers | %{
    $hash.Add(($_.Name), (@{
        "Status" = ($_.Status)
        "Date"   = ($_.Date)
    }))
}
What's cool about this is that you can reference things like:
($hash."Name1").Status
Also, it is far faster than arrays for finding stuff. I use this to compare data rather than use matching in Arrays.
$hash.ContainsKey("Name1")
Hope some of that helps!
-Adam
Knowing that PowerShell pipes objects between cmdlets, it is more common in PowerShell to use an array of PSCustomObjects:
$arr = @(
    [PSCustomObject]@{Name = 'David'; Article = 'TShirt'; Size = 'M'}
    [PSCustomObject]@{Name = 'Eduard'; Article = 'Trouwsers'; Size = 'S'}
)
Or for older PowerShell versions (PSv2):
$arr = @(
    New-Object PSObject -Property @{Name = 'David'; Article = 'TShirt'; Size = 'M'}
    New-Object PSObject -Property @{Name = 'Eduard'; Article = 'Trouwsers'; Size = 'S'}
)
And grep your selection like:
$arr | Where {$_.Name -eq 'David' -and $_.Article -eq 'TShirt'} | Select Size
Or in newer PowerShell (Core) versions:
$arr | Where Name -eq 'David' | Where Article -eq 'TShirt' | Select Size
Or (just get the size):
$arr.Where{$_.Name -eq 'David' -and $_.Article -eq 'TShirt'}.Size
Addendum 2020-07-13
Syntax and readability
As mentioned in the comments, using an array of custom objects is more straightforward and saves typing. If you want to take this further, you might even use the ConvertFrom-Csv (or the Import-Csv) cmdlet for building the array:
$arr = ConvertFrom-Csv @'
Name,Article,Size
David,TShirt,M
Eduard,Trouwsers,S
'@
Or more readable:
$arr = ConvertFrom-Csv @'
Name, Article, Size
David, TShirt, M
Eduard, Trouwsers, S
'@
Note: values that contain spaces or special characters need to be double quoted
Or use an external cmdlet like ConvertFrom-SourceTable which reads fixed width table formats:
$arr = ConvertFrom-SourceTable '
Name Article Size
David TShirt M
Eduard Trouwsers S
'
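ConvertFrom-SourceTable is not built into PowerShell; assuming it is published to the PowerShell Gallery under that same name, it would be installed with something like:
# Assumption: the module shares the cmdlet's name on the PowerShell Gallery
Install-Module -Name ConvertFrom-SourceTable -Scope CurrentUser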
Indexing
The disadvantage of using an array of custom objects is that searching it is slower than a keyed lookup in a hash table.
The advantage of using an array of custom objects is that you can easily search on any property, e.g. everybody that wears a TShirt with size M:
$arr | Where Article -eq 'TShirt' | Where Size -eq 'M' | Select Name
To build a nested hash table index from the array of objects:
$h = @{}
$arr | ForEach-Object {
    If (!$h.ContainsKey($_.Name)) { $h[$_.Name] = @{} }
    If (!$h[$_.Name].ContainsKey($_.Article)) { $h[$_.Name][$_.Article] = @{} }
    $h[$_.Name][$_.Article] = $_ # Or: $h[$_.Name][$_.Article]['Size'] = $_.Size
}
$h.david.tshirt.size
M
Note: under Set-StrictMode, referencing a hash table key that doesn't exist will cause an error:
Set-StrictMode -Version 2
$h.John.tshirt.size
PropertyNotFoundException: The property 'John' cannot be found on this object. Verify that the property exists.
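A sketch of a guarded lookup that avoids that error (the names looked up here are just examples):
# Sketch: check each level before drilling down, so missing keys don't throw
$name = 'John'; $article = 'tshirt'
if ($h.ContainsKey($name) -and $h[$name].ContainsKey($article)) {
    $h[$name][$article].Size
} else {
    "No entry for $name / $article"
}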
Here is a simple multidimensional array of strings.
$psarray = @(
    ('Line', 'One'),
    ('Line', 'Two')
)
foreach ($item in $psarray)
{
    $item[0]
    $item[1]
}
Output:
Line
One
Line
Two
Two-dimensional arrays can also be defined as a jagged array:
$array = New-Object system.Array[][] 5,5
This has the nice feature that
$array[0]
outputs a one-dimensional array, containing $array[0][0] to $array[0][4].
Depending on your situation you might prefer it over $array = New-Object 'object[,]' 5,5.
(I would have commented to CB above, but stackoverflow does not let me yet)
You could also use System.Collections.ArrayList to make an array of arrays or whatever you want.
Here is an example:
$resultsArray = New-Object System.Collections.ArrayList
[void] $resultsArray.Add(@(@('$hello'),2,0,0,0,0,0,0,1,1))
[void] $resultsArray.Add(@(@('$test', '$testagain'),3,0,0,1,0,0,0,1,2))
[void] $resultsArray.Add("ERROR")
[void] $resultsArray.Add(@(@('$var', '$result'),5,1,1,0,1,1,0,2,3))
[void] $resultsArray.Add(@(@('$num', '$number'),3,0,0,0,0,0,1,1,2))
One problem, if you would call it a problem: you cannot set a fixed size. Also, you need to use [void], otherwise the index returned by each Add() call ends up in your output.
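Reading the nested values back out is plain indexing, for example:
$resultsArray[1][0]      # the inner name array: '$test', '$testagain'
$resultsArray[1][0][1]   # '$testagain'
$resultsArray[2]         # 'ERROR' (a plain string element)
$resultsArray.Count      # 5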
Using the .NET syntax (like CB pointed out above) also adds type coherence to your 'tabular' array: if you define a typed array and try to store different types in it, PowerShell will 'alert' you:
$a = New-Object 'byte[,]' 4,4
$a[0,0] = 111   # OK
$a[0,1] = 1111  # Error: 1111 does not fit in a byte
Of course PowerShell will 'help' you with the obvious conversions:
$a = New-Object 'string[,]' 2,2
$a[0,0] = "1111"  # OK
$a[0,1] = 111     # OK too; the number is converted to a string
Another thread pointed here regarding how to add to a multidimensional array in PowerShell. I don't know if there is some reason not to use this method, but it worked for my purposes.
$array = @()
$array += ,@( "1", "test1", "a" )
$array += ,@( "2", "test2", "b" )
$array += ,@( "3", "test3", "c" )
I found a pretty cool solution for making arrays within an array.
$GroupArray = @()
foreach ($Array in $ArrayList) {
    $GroupArray += @($Array, $null)
}
$GroupArray = $GroupArray | Where-Object { $_ -ne $null }
Borrowed from above:
$arr = ConvertFrom-Csv @'
Name,Article,Size
David,TShirt,M
Eduard,Trouwsers,S
'@
Print the $arr:
$arr
Name   Article   Size
----   -------   ----
David  TShirt    M
Eduard Trouwsers S
Now select 'David'
$arr.Where({$_.Name -eq "david"})
Name  Article Size
----  ------- ----
David TShirt  M
Now if you want to know the Size of 'David'
$arr.Where({$_.Name -eq "david"}).size
M

Powershell Select-Object from array not working

I am trying to separate values in an array so I can pass them to another function.
I am using Select-Object within a for loop to go through each line and separate the timestamp and value fields.
However, no matter what I do, the code below only displays the first Select-Object variable for each line. The second Select-Object command doesn't seem to work, as my output is a blank line for each of the 6 rows.
Any ideas on how to get both values?
$ReportData = $SystemStats.get_performance_graph_csv_statistics( (,$Query) )
### Allocate a new encoder and turn the byte array into a string
$ASCII = New-Object -TypeName System.Text.ASCIIEncoding
$csvdata = $ASCII.GetString($ReportData[0].statistic_data)
$csv2 = convertFrom-CSV $csvdata
$newarray = $csv2 | Where-Object {$_.utilization -ne "0.0000000000e+00" -and $_.utilization -ne "nan" }
for ($n = 0; $n -lt $newarray.Length; $n++)
{
    $nTime = $newarray[$n]
    $nUtil = $newarray[$n]

    $util = $nUtil | Select-Object Utilization
    $util

    $tstamp = $nTime | Select-Object timestamp
    $tstamp
}
Let me slightly modify the processing code, if it will help.
$csv2 |
Where-Object {$_.utilization -ne "0.0000000000e+00" -and $_.utilization -ne "nan" } |
Select-Object Utilization,TimeStamp
It will produce somewhat different output, but it should be easier to work with.
The result is a set of objects with Utilization and TimeStamp properties. You can pass them to the other function as you mentioned.
Generally it is better to use pipes instead of for loops. You don't need to care about indexes and it works with arrays as well as with scalar values.
If my updated code won't work: is the TimeStamp property really filled with any value?
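To hand those objects to another function, you can pipe them straight in; here is a minimal sketch where the function name and its output format are made up:
# Sketch: a hypothetical downstream function; its name and output format are made up
function Show-Utilization {
    param(
        [Parameter(ValueFromPipeline = $true)] $Stat
    )
    process {
        "{0} -> {1}" -f $Stat.TimeStamp, $Stat.Utilization
    }
}

$csv2 |
    Where-Object { $_.utilization -ne "0.0000000000e+00" -and $_.utilization -ne "nan" } |
    Select-Object Utilization, TimeStamp |
    Show-Utilization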