Some objects have been saved to a .txt file that looks like this:
@{flightNumber=01; flightDate=2010-01-10; flightIdentification=201001}
@{flightNumber=01; flightDate=2010-01-10; flightIdentification=201002}
I'm trying to read them in another program and convert them back into objects. The problem is that each of the "objects" is read in as a string, and I have been unable to cast it back into an object.
$list = Get-Content -Path 'C:\Users\XXXXX\Downloads\TemplateObject.txt'
foreach ($object in $list) {
Write-Host $object.flightNumber
}
From what I've shown, I would expect to see 2 different objects with the variables flightNumber, flightDate and flightIdentification
I've tried piping it by using ConvertFrom-StringData
I've tried casting to an object
I expect 2 separate objects containing 3 variables in each.
Don't pipe objects directly to files!
As has been pointed out, take advantage of built-in options for serialization to disk, like ConvertTo-Csv/Export-Csv for flat objects, ConvertTo-Json or Export-Clixml for more complex objects.
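For example, a minimal round-trip sketch using Export-Clixml / Import-Clixml (the file path is just an illustration):
# Serialize the objects to disk...
$flights = [pscustomobject]@{ flightNumber = '01'; flightDate = '2010-01-10'; flightIdentification = '201001' },
           [pscustomobject]@{ flightNumber = '01'; flightDate = '2010-01-10'; flightIdentification = '201002' }
$flights | Export-Clixml -Path 'C:\Temp\flights.xml'
# ... and read them back later as real objects, not strings.
$restored = Import-Clixml -Path 'C:\Temp\flights.xml'
$restored[0].flightNumber   # -> 01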
As a one-off thing, if you need to recover and re-encode this data, you could use the regex -replace operator to add quotes around the values, at which point the parser should accept them as hashtable entries and you can cast it to an object:
$string = '#{flightNumber=01; flightDate=2010-01-10; flightIdentification=201001}'
# Place double-quotes around anything found between a `=` and `;` or `}`
$quotedString = $string -replace '(?<=\=)([^=;}]+)(?=\s*(?:;|}))', '"$1"'
# Parse the resulting string as if it was PowerShell code
$errors = @()
$objectAST = [System.Management.Automation.Language.Parser]::ParseInput($quotedString, [ref]$null,[ref]$errors)
$objects = if(-not $errors){
# This is pretty dangerous, you should NEVER do this in a production script
$objectAST.GetScriptBlock().Invoke() |ForEach-Object {
[pscustomobject]$_
}
}
# This variable now contains the re-animated objects
$objects
You can convert a string to a hashtable using ConvertFrom-StringData after some manipulation:
$a = '@{flightNumber=01; flightDate=2010-01-10; flightIdentification=201001}'
$a = $a -replace '@{' -replace '}' -replace ';',"`n" | ConvertFrom-StringData
[pscustomobject]$a
flightNumber flightIdentification flightDate
------------ -------------------- ----------
01 201001 2010-01-10
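Applied to the whole file from the question (using the path as posted there), each line becomes its own object; this is just a sketch of the same idea:
$objects = Get-Content -Path 'C:\Users\XXXXX\Downloads\TemplateObject.txt' | ForEach-Object {
    # strip the @{ } wrapper, turn ; into newlines, then parse and cast
    [pscustomobject]($_ -replace '@{' -replace '}' -replace ';', "`n" | ConvertFrom-StringData)
}
$objects | ForEach-Object { $_.flightNumber }   # -> 01, 01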
Related
A Microsoft utility returns strings in the following format:
"Author: First.Last; Name: RootConfiguration; Version: 2.0.0; GenerationDate: 06/01/2022 13:18:10; GenerationHost: Server;"
I would like to convert those strings into simple objects. If this were true JSON, I'd just use ConvertFrom-Json. To reinvent the wheel as little as possible, what's the most straightforward way to convert that into an object (with keys Author, Name, Version, GenerationDate, and GenerationHost, and the obvious values)? It's fine if the values are all treated as dumb strings.
If "you just have to grind it out by tokenizing the string bite by bite" is the answer, I can do that, but it seems there should be a simpler way, like if I could tell ConvertFrom-JSON (or even ConvertFrom-String!) "Do your thing, but process the semicolons as newlines, accept spaces on the right hand side, etc."
A solution that combines manual parsing with ConvertFrom-StringData, but note that input order of the entries isn't preserved, given that the latter returns a [hashtable] instance with inherently unordered entries:
# Sample input string.
$str = 'Author: First.Last; Name: RootConfiguration; Version: 2.0.0; GenerationDate: 06/01/2022 13:18:10; GenerationHost: Server;'
# Replace ":" with "=", split into individual lines, so
# that ConvertFrom-StringData recognizes the format.
$str -replace ': ', '=' -replace '; ?', "`n" | ConvertFrom-StringData
# Note: The above outputs a [hashtable].
# You could cast it to [pscustomobject], as shown below,
# but the input order of entries is lost either way.
As zett42 points out, if the values (as opposed to the keys) in the input string contained \ chars., they'd need to be doubled in order to be retained as such - see his comment below.
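A small hedged sketch of that workaround (the LogPath entry is a made-up example value containing backslashes):
$str = 'Name: RootConfiguration; LogPath: C:\Temp\gen.log;'
# Double every backslash first so ConvertFrom-StringData doesn't treat \ as an escape.
($str -replace '\\', '\\' -replace ': ', '=' -replace '; ?', "`n") | ConvertFrom-StringData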
A solution with manual parsing only:
# Sample input string.
$str = 'Author: First.Last; Name: RootConfiguration; Version: 2.0.0; GenerationDate: 06/01/2022 13:18:10; GenerationHost: Server;'
# Initialize an ordered hashtable (dictionary)
$dict = [ordered] @{}
# Split the string by ";", then each entry into key and value by ":".
$str -split '; ?' |
ForEach-Object { $key, $value = $_ -split ': ', 2; $dict[$key] = $value }
# Convert the ordered hashtable (dictionary) to a custom object.
[pscustomobject] $dict
I usually don't answer questions that don't show a coding attempt, but figured this might help others. Given that the delimiter is a semicolon, I was thinking of converting to CSV first, but then would have to worry about the header. So, instead of converting to CSV, we can split the string on that delimiter and process the values one at a time:
"Author: First.Last; Name: RootConfiguration; Version: 2.0.0; GenerationDate: 06/01/2022 13:18:10; GenerationHost: Server;".Split(";").Trim() |
ForEach-Object -Process {
$header,$value = $_ -split ":",2
New-Object -TypeName PSCustomObject -Property @{
$header = $value
}
} | ConvertTo-Json
To make this work we need to split at just the first colon (:), leaving the rest intact, in case there are other colons in the value, as you see in the GenerationDate property.
This was achieved using $_ -split ":",2.
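For example (illustration only), the 2-limit split keeps any further colons inside the value intact:
'GenerationDate: 06/01/2022 13:18:10' -split ':', 2
# -> GenerationDate
# ->  06/01/2022 13:18:10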
Finally, what's left is to assign the header and value to a PSCustomObject and convert the results to JSON using ConvertTo-Json.
Note: I am restricted to a "strict language mode" on my work system, so it's best to use the type accelerator [PSCustomObject]@{..} to create the object, rather than New-Object.
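A sketch of the same pipeline using the type accelerator (the Where-Object filter, which drops the empty element left by the trailing ';', is my addition):
"Author: First.Last; Name: RootConfiguration; Version: 2.0.0; GenerationDate: 06/01/2022 13:18:10; GenerationHost: Server;".Split(";").Trim() |
    Where-Object { $_ } |
    ForEach-Object -Process {
        $header, $value = $_ -split ":", 2
        [PSCustomObject]@{
            $header = $value
        }
    } | ConvertTo-Json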
Complementing the existing helpful answers, here is another one using the Regex.Matches() function:
$testInput = 'Author: First.Last; Name: RootConfiguration; Version: 2.0.0; GenerationDate: 06/01/2022 13:18:10; GenerationHost: Server;'
# Create a temporary, ordered Hashtable to collect keys and values in the original order.
$ht = [ordered] @{}
# Use a regular expression to find all key/value pairs.
foreach( $match in [regex]::Matches( $testInput, '\s*([^:]+):\s*([^;]+);') ) {
# Enter a key/value pair into the Hashtable
$ht[ $match.Groups[1].Value ] = $match.Groups[2].Value
}
# Convert the temporary Hashtable to PSCustomObject.
[PSCustomObject] $ht
Output:
Author : First.Last
Name : RootConfiguration
Version : 2.0.0
GenerationDate : 06/01/2022 13:18:10
GenerationHost : Server
The RegEx pattern consists of two capturing groups ( ), where the first one captures a key and the second one captures a value.
For a detailed explanation see regex101, where you can also play around with the pattern.
I would like to write out a hash table to a file with an array as one of the hash table items. My array item is written out, but it contains files=System.Object[]
Note - Once this works, I will want to reverse the process and read the hash table back in again.
clear-host
$resumeFile="c:\users\paul\resume.log"
$files = Get-ChildItem *.txt
$files.GetType()
write-host
$types="txt"
$in="c:\users\paul"
Remove-Item $resumeFile -ErrorAction SilentlyContinue
$resumeParms=@{}
$resumeParms['types']=$types
$resumeParms['in']=($in)
$resumeParms['files']=($files)
$resumeParms.GetEnumerator() | ForEach-Object {"{0}={1}" -f $_.Name,$_.Value} | Set-Content $resumeFile
write-host "Contents of $resumefile"
get-content $resumeFile
Results
IsPublic IsSerial Name BaseType
-------- -------- ---- --------
True True Object[] System.Array
Contents of c:\users\paul\resume.log
files=System.Object[]
types=txt
in=c:\users\paul
The immediate fix is to create your own array representation, by enumerating the elements and separating them with commas (,), enclosing string values in '...':
# Sample input hashtable. [ordered] preserves the entry order.
$resumeParms = [ordered] @{ foo = 42; bar = 'baz'; arr = (Get-ChildItem *.txt) }
$resumeParms.GetEnumerator() |
ForEach-Object {
"{0}={1}" -f $_.Name, (
$_.Value.ForEach({
(("'{0}'" -f ($_ -replace "'", "''")), $_)[$_.GetType().IsPrimitive]
}) -join ','
)
}
Note that this represents all non-primitive .NET types as strings, by their .ToString() representation, which may or may not be good enough.
The above outputs something like:
foo=42
bar='baz'
arr='C:\Users\jdoe\file1.txt','C:\Users\jdoe\file2.txt','C:\Users\jdoe\file3.txt'
See the bottom section for a variation that creates a *.psd1 file that can later be read back into a hashtable instance with Import-PowerShellDataFile.
Alternatives for saving settings / configuration data in text files:
If you don't mind taking on a dependency on a third-party module:
Consider using the PSIni module, which uses the Windows initialization file (*.ini) file format; see this answer for a usage example.
Adding support for initialization files to PowerShell itself (not present as of 7.0) is being proposed in GitHub issue #9035.
Consider using YAML as the file format; e.g., via the FXPSYaml module.
Adding support for YAML files to PowerShell itself (not present as of 7.0) is being proposed in GitHub issue #3607.
The Configuration module provides commands to write to and read from *.psd1 files, based on persisted PowerShell hashtable literals, as you would declare them in source code.
Alternatively, you could modify the output format in the code at the top to produce such files yourself, which allows you to read them back in via
Import-PowerShellDataFile, as shown in the bottom section.
As of PowerShell 7.0 there's no built-in support for writing such a representation; that is, there is no complementary Export-PowerShellDataFile cmdlet.
However, adding this ability is being proposed in GitHub issue #11300.
If creating a (mostly) plain-text file is not a must:
The solution that provides the most flexibility with respect to the data types it supports is the XML-based CLIXML format that Export-Clixml creates, as Lee Dailey suggests, whose output can later be read with Import-Clixml.
However, this format too has limitations with respect to type fidelity, as explained in this answer.
Saving a JSON representation of the data, as Lee also suggests, via ConvertTo-Json / ConvertFrom-Json, is another option, which makes for human-friendlier output than XML, but is still not as friendly as a plain-text representation; notably, all \ chars. in file paths must be escaped as \\ in JSON.
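A minimal JSON round-trip sketch (the settings.json file name is just an example; note that ConvertFrom-Json returns [pscustomobject] instances rather than hashtables by default):
@{ foo = 42; bar = 'baz'; dir = 'C:\Users\jdoe' } | ConvertTo-Json | Set-Content settings.json
$settings = Get-Content settings.json -Raw | ConvertFrom-Json
$settings.dir   # -> C:\Users\jdoe (stored as "C:\\Users\\jdoe" in the file)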
Writing a *.psd1 file that can be read with Import-PowerShellDataFile
Within the stated constraints regarding data types - in essence, anything that isn't a number or a string becomes a string - it is fairly easy to modify the code at the top to write a PowerShell hashtable-literal representation to a *.psd1 file so that it can be read back in as a [hashtable] instance via Import-PowerShellDataFile:
As noted, if you don't mind installing a module, consider the Configuration module, which has this functionality built in.
# Sample input hashtable.
$resumeParms = [ordered] @{ foo = 42; bar = 'baz'; arr = (Get-ChildItem *.txt) }
# Create a hashtable-literal representation and save it to file settings.psd1
#"
#{
$(
($resumeParms.GetEnumerator() |
ForEach-Object {
" {0}={1}" -f $_.Name, (
$_.Value.ForEach({
(("'{0}'" -f ($_ -replace "'", "''")), $_)[$_.GetType().IsPrimitive]
}) -join ','
)
}
) -join "`n"
)
}
"# > settings.psd1
If you read settings.psd1 with Import-PowerShellDataFile settings.psd1 later, you'll get a [hashtable] instance whose entries you can access as usual and which produces the following display output:
Name Value
---- -----
bar baz
arr {C:\Users\jdoe\file1.txt, C:\Users\jdoe\file2.txt, C:\Users\jdoe\file3.txt}
foo 42
Note how the order of entries (keys) was not preserved, because hashtable entries are inherently unordered.
On writing the *.psd1 file you can preserve the key(-creation) order by declaring the input hashtable (System.Collections.Hashtable) as [ordered], as shown above (which creates a System.Collections.Specialized.OrderedDictionary instance), but the order is, unfortunately, lost on reading the *.psd1 file.
As of PowerShell 7.0, even if you place [ordered] before the opening @{ in the *.psd1 file, Import-PowerShellDataFile quietly ignores it and creates an unordered hashtable nonetheless.
This is a problem I deal with all the time and it drives me mad. I really think that there should be a function specifically for this action... so I wrote one.
function ConvertHashTo-CSV
{
Param (
[Parameter(Mandatory=$true)]
$hashtable,
[Parameter(Mandatory=$true)]
$OutputFileLocation
)
$hastableAverage = $NULL #This will only work for hashtables where each entry is consistent. This checks for consistency.
foreach ($hashtabl in $hashtable)
{
$hastableAverage = $hastableAverage + $hashtabl.count #Counts the amount of headings.
}
$Paritycheck = $hastableAverage / $hashtable.count #Gets the average amount of headings
if ( ($parity = $Paritycheck -is [int]) -eq $False) #if the average is not an int the hashtable is not consistent
{
write-host "Error. Hashtable is inconsistent" -ForegroundColor red
Start-Sleep -Seconds 5
return
}
$HashTableHeadings = $hashtable[0].GetEnumerator().name #Get the hashtable headings
$HashTableCount = ($hashtable[0].GetEnumerator().name).count #Count the headings
$HashTableString = $null # String to hold the CSV
foreach ($HashTableHeading in $HashTableHeadings) #Creates the first row containing the column headings
{
$HashTableString += $HashTableHeading
$HashTableString += ", "
}
$HashTableString = $HashTableString -replace ".{2}$" #Removes the trailing ", " added by the loop above
$HashTableString += "`n"
foreach ($hashtabl in $hashtable) #Adds the data
{
for($i=0;$i -lt $HashTableCount;$i++)
{
$HashTableString += $hashtabl[$i]
if ($i -lt ($HashTableCount - 1))
{
$HashTableString += ", "
}
}
$HashTableString += "`n"
}
$HashTableString | Out-File -FilePath $OutputFileLocation #writes the CSV to a file
}
To use this copy the function into your script, run it, and then
ConvertHashTo-CSV -hashtable $Hasharray -OutputFileLocation c:\temp\data.CSV
The code is annotated, but briefly: it steps through the array of hashtables, adds each one to a string with the formatting required to make that string a CSV file, and then writes the string out to a file.
The main limitation is that the hashtables in the array all have to contain the same number of fields. To get around this, if a hashtable has a field that doesn't contain data, ensure it contains at least a space.
More on this can be found here : https://grumpy.tech/powershell-convert-hashtable-to-csv/
I want to check whether a word from one text file exists in another text file and print "match" or "not match". My 1st text file contains xxaavv6J, my 2nd file contains 6J6SCa.yB.
If it is match, it return like this:
Match found:
Match found:
Match found:
Match found:
Match found:
Match found: 6J
Match found:
Match found:
Match found:
My expectation is to print just "match" or "not match".
$X = Get-Content "C:\Users\2.txt"
$Data = Get-Content "C:\Users\d.txt"
$Split = $Data -split '(..)'
$Y = $X.Substring(0, 6)
$Z = $Y -split '(..)'
foreach ($i in $Z) {
foreach ($j in $Split) {
if ($i -like $j) {
Write-Host ("Match found: {0}" -f $i, $j)
}
}
}
The operation -split '(..)' does not produce the result you think it does. If you take a look at the output of the following command you'll see that you're getting a lot of empty results:
PS C:\> 'xxaavv6J' -split '(..)' | % { "-$_-" }
--
-xx-
--
-aa-
--
-vv-
--
-6J-
--
Those empty values are the additional matches you're getting from $i -like $j.
I'm not quite sure why -split '(..)' gives you any non-empty values in the first place, because I would have expected it to produce 5 empty strings for an input string "xxaavv6J". Apparently it has to do with the grouping parentheses, since -split '..' (without the grouping parentheses) actually does behave as expected. Looks like with the capturing group the captured matches are returned on top of the results of the split operation.
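For comparison (illustration only), without the capturing group you really do get nothing but empty strings:
PS C:\> 'xxaavv6J' -split '..' | % { "-$_-" }
--
--
--
--
--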
Anyway, to get the behavior you want replace
... -split '(..)'
with
... |
Select-String '..' -AllMatches |
Select-Object -Expand Matches |
Select-Object -Expand Value
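Applied to the sample string, that pipeline yields only the two-character chunks (illustration):
PS C:\> 'xxaavv6J' | Select-String '..' -AllMatches | Select-Object -Expand Matches | Select-Object -Expand Value
xx
aa
vv
6J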
You can also replace the nested loop with something like this:
foreach ($i in $Z) {
if ($Split -contains $i) {
Write-Host "Match found: ${i}"
}
}
A slightly different approach using regex '.Match()' should also do it.
I have added a lot of explaining comments for you:
$Test = Get-Content "C:\Users\2.txt" -Raw # Read as single string. Contains "xxaavv6J"
$Data = (Get-Content "C:\Users\d.txt") -join '' # Read as array and join the lines with an empty string.
# This will remove Newlines. Contains "6J6SCa.yB"
# Split the data and make sure every substring has two characters
# In each substring, the regex special characters need to be Escaped.
# When this is done, we join the substrings together using the pipe symbol.
$Data = ($Data -split '(.{2})' | # split on every two characters
Where-Object { $_.Length -eq 2 } | # don't care about any left over character
ForEach-Object { [Regex]::Escape($_) } ) -join '|' # join with the '|' which is an OR in regular expression
# $Data is now a string to use with regular expression: "6J|6S|Ca|\.y"
# Using '.Match()' works Case-Sensitive. To have it compare Case-Insensitive, we do this:
$Data = '(?i)' + $Data
# See if we can find one or more matches
$regex = [regex]$Data
$match = $regex.Match($Test)
# If we have found at least one match:
if ($match.Success) {
while ($match.Success) {
# matched text: $match.Value
# match start: $match.Index
# match length: $match.Length
Write-Host ("Match found: {0}" -f $match.Value)
$match = $match.NextMatch()
}
}
else {
Write-Host "Not Found"
}
Result:
Match found: 6J
Further to the excellent Ansgar Wiechers' answer: if you are running Windows PowerShell 4.0 or above, then you could apply the .Where() method described in Kirk Munro's exhaustive article ForEach and Where magic methods:
With the release of Windows PowerShell 4.0, two new “magic” methods
were introduced for collection types that provide a new syntax for
accessing ForEach and Where capabilities in Windows PowerShell.
These methods are aptly named ForEach and Where. I call
these methods “magic” because they are quite magical in how they work
in PowerShell. They don’t show up in Get-Member output, even if you
apply -Force and request -MemberType All. If you roll up your
sleeves and dig in with reflection, you can find them; however, it
requires a broad search because they are private extension methods
implemented on a private class. Yet even though they are not
discoverable without peeking under the covers, they are there when you
need them, they are faster than their older counterparts, and they
include functionality that was not available in their older
counterparts, hence the “magic” feeling they leave you with when you
use them in PowerShell. Unfortunately, these methods remain
undocumented even today, almost a year since they were publicly
released, so many people don’t realize the power that is available in
these methods.
…
The Where method
Where is a method that allows you to filter a collection of objects.
This is very much like the Where-Object cmdlet, but the Where
method is also like Select-Object and Group-Object as well,
includes several additional features that the Where-Object cmdlet
does not natively support by itself. This method provides faster
performance than Where-Object in a simple, elegant command. Like
the ForEach method, any objects that are output by this method are
returned in a generic collection of type
System.Collections.ObjectModel.Collection`1[psobject].
There is only one version of this method, which can be described as
follows:
Where(scriptblock expression[, WhereOperatorSelectionMode mode[, int numberToReturn]])
As indicated by the square brackets, the expression script block is
required and the mode enumeration and the numberToReturn integer
argument are optional, so you can invoke this method using 1, 2, or 3
arguments. If you want to use a particular argument, you must provide
all arguments to the left of that argument (i.e. if you want to
provide a value for numberToReturn, you must provide values for
mode and expression as well).
Applied to your case (using the simplest variant Where(scriptblock expression) of the .Where() method):
$X = '6J6SCa.yB' # Get-Content "C:\Users\2.txt"
$Data = 'xxaavv6J' # Get-Content "C:\Users\d.txt"
$Split = ($Data -split '(..)').Where({$_ -ne ''})
$Y = $X.Substring(0, 6)
$Z = ($Y -split '(..)').Where{$_ -ne ''} # without parentheses
For instance, Ansgar's example changes as follows:
PS > ('xxaavv6J' -split '(..)').Where{$_ -ne ''} | % { "-$_-" }
-xx-
-aa-
-vv-
-6J-
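As an aside (my own illustration, not from the article), the optional mode and numberToReturn arguments look like this:
# 'First' selection mode with numberToReturn = 2 returns the first two matching elements.
(1..10).Where({ $_ -gt 3 }, 'First', 2)   # -> 4, 5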
I am importing a CSV file with two fields per record, "Name" and "Path".
$softwareList = Import-Csv C:\Scripts\NEW_INSTALLER\softwareList.csv
$count = 0..($softwareList.count -1)
foreach($i in $count){
Write-Host $softwareList[$i].Name,$softwareList[$i].Path
}
What I am trying to do is dynamically assign the Name and Path of each record to a WPFCheckbox variable based on the $i variable. The names for these checkboxes are named something such as WPFCheckbox0, WPFCheckbox1, WPFCheckbox2 and so on. These objects have two properties I planned on using, "Command" to store the $SoftwareList[$i].path and "Content" to store the $SoftwareList[$i].Name
I cannot think of a way to properly loop through these variables and assign the properties from the CSV to the properties on their respective WPFCheckboxes.
Any suggestions would be very appreciated.
Invoke-Expression is one way, though note Mathias' commented concerns on the overall approach.
Within your foreach loop, you can do something like:
invoke-expression "`$WPFCheckbox$i`.Command = $($SoftwareList[$i].Path)"
invoke-expression "`$WPFCheckbox$i`.Content= $($SoftwareList[$i].Name)"
The back-tick ` just before the $WPFCheckBox prevents what would be an undefined variable from being immediately evaluated (before the expression is invoked), but the $I is. This gives you a string with your $WPFCheckbox1, to which you then append the property names and values. The $SoftwareList values are immediately processed into the raw string.
The Invoke-Expression then evaluates and executes the entire string as if it were a regular statement.
Here's a stand-alone code snippet to play with:
1..3 |% {
invoke-expression "`$MyVariable$_` = New-Object PSObject"
invoke-expression "`$MyVariable$_` | add-member -NotePropertyName Command -NotePropertyValue [String]::Empty"
invoke-expression "`$MyVariable$_`.Command = 'Path #$_'"
}
$MyVariable1 | Out-String
$MyVariable2 | Out-String
$MyVariable3 | Out-String
As a side note (since I can't comment yet on your original question,) creating an array just to act as iterator through the lines of the file is really inefficient. There are definitely better ways to do that.
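For example, a sketch of iterating the imported rows directly while keeping a counter for the checkbox names (assuming the same CSV as above):
$softwareList = Import-Csv C:\Scripts\NEW_INSTALLER\softwareList.csv
$i = 0
foreach ($entry in $softwareList) {
    Write-Host $entry.Name, $entry.Path   # $i is still available for "$WPFCheckbox$i"
    $i++
}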
I am using a text file as the backend for an application that I am developing. I first started off leaving the text file in a human-readable format, but I decided that there was no sense in that and figured it would be best to leave out formatting.
Where I am now in the backend dev process is creating a single-line hashtable with identical keys but different values for each entry. Seems logical and easy to work with.
Here is a mock-up of the entries in the text file:
@{'bName'='1xx'; 'bTotal'='1yy'; 'bSet'='1zz'}
@{'bName'='2xx'; 'bTotal'='2yy'; 'bSet'='2zz'}
@{'bName'='3xx'; 'bTotal'='3yy'; 'bSet'='3zz'}
As you can see, the keys for each entry are identical, however, the values are going to be different. (The numerical and repetitious nature of the values are purely coincidental and put in place for the sake of a mock-up. Actual values will not be numerically-oriented and won't be repetitious as seen in the example.)
I am able to access keys and values by typing:
$hash = Get-Content .\Desktop\Test.txt | Out-String | iex
which outputs:
Name Value
---- -----
bName 1xx
bTotal 1yy
bSet 1zz
bName 2xx
bTotal 2yy
bSet 2zz
bName 3xx
bTotal 3yy
bSet 3zz
What I ultimately want to do is gather each of the values for bName, bTotal, and bSet so that I can append each to a separate WinForms ComboBox. The WinForms part will be simple, I am just having a bit of an issue with getting the values from each hashtable in the text file.
I tried:
$hash.Values | ?{$hash.Keys -contains 'bName'}
but it just prints out every $hash.Value regardless of the $hash.Key match given in the pipe.
I understand that $hash is an array and I figured I may have to pipe out each iteration in a foreach ($hash | %{}) loop but I'm not quite sure the correct way to do this. For example, when I try:
$hash | $_.Keys
or
$hash | $_.Values
it isn't treating each iteration like a hashtable.
What am I doing wrong here? Am I going about it in a convoluted way while there is a much easier way to accomplish this? I am open to all sorts of ideas or suggestions.
As an afterthought: It is kind of funny how often an obvious solution presents itself when you step away and divert your attention towards something else.
I went to grab lunch and I can't, for the life of me, begin to comprehend why I didn't realize that I could just very easily do this:
$hash.bName
or:
$hash.bTotal
or:
$hash.bSet
That will do exactly what I was wanting to do. However, considering the answers provided, I may go a different route in terms of using an .ini file in CSV format rather than creating an array of hashtables.
One way of storing hashtables in a text file is the INI format.
[hashtable1]
bName=1xx
bTotal=1yy
bSet=1zz
[hashtable2]
bName=2xx
bTotal=2yy
bSet=2zz
[hashtable3]
bName=3xx
bTotal=3yy
bSet=3zz
INI files are basically a hashtable of hashtables in text form. They can be read like this:
$ht = @{}
Get-Content 'C:\path\to\hashtables.txt' | ForEach-Object {
$_.Trim()
} | Where-Object {
$_ -notmatch '^(;|$)'
} | ForEach-Object {
if ($_ -match '^\[.*\]$') {
$section = $_ -replace '\[|\]'
$ht[$section] = @{}
} else {
$key, $value = $_ -split '\s*=\s*', 2
$ht[$section][$key] = $value
}
}
and written like this:
$ht.Keys | ForEach-Object {
'[{0}]' -f $_
foreach ($key in $ht[$_].Keys) {
'{0}={1}' -f $key, $ht[$_][$key]
}
} | Set-Content 'C:\path\to\hashtables.txt'
Individual values in such a hashtable of hashtables can be accessed like this:
$ht['section']['key']
or like this:
$ht.section.key
Another option would be to store each hashtable in a separate file
hashtable1.txt:
bName=1xx
bTotal=1yy
bSet=1zz
hashtable2.txt:
bName=2xx
bTotal=2yy
bSet=2zz
hashtable3.txt:
bName=3xx
bTotal=3yy
bSet=3zz
That would allow you to import each file into a hashtable via ConvertFrom-StringData:
$ht1 = Get-Content 'C:\path\to\hashtable1.txt' | Out-String |
ConvertFrom-Stringdata
Writing the files would basically be the same as above (there is no ConvertTo-StringData cmdlet):
$ht1.Keys | ForEach-Object {
'{0}={1}' -f $_, $ht[$_]
} | Set-Content 'C:\path\to\hashtables1.txt'
PowerShell has built-in CSV handling, which makes it a good choice to use in this case. So, assuming you had your data stored in a file in the standard CSV format with headers:
"bName","bTotal","bSet"
"1xx","1yy","1zz"
"2xx","2yy","2zz"
"3xx","3yy","3zz"
Then you import your data like this:
$data = Import-Csv $path
Now you have an array of PSCustomObjects, and each header in the CSV file is a property of each object. So if, for example, you wanted to get the bTotal of the second object, you would do the following:
$data[1].bTotal
2yy
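As an aside (PowerShell 3.0+), member enumeration also lets you grab an entire column at once, which is handy for populating the ComboBoxes mentioned in the question:
$data.bName    # -> 1xx, 2xx, 3xx
$data.bTotal   # -> 1yy, 2yy, 3yy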