foreach through hashtable values - powershell

I know that this question has already been answered for hashtable keys... but it does not seem to work for hashtable values.
I'm creating a hash of VMs based on the cluster they reside in, so the hashtable looks like this:
$clusters[$clustername][$clustervms] = @{}
The reason each VM is a hashtable is because I'm trying to associate it with its VM tag as well (VMware).
This code works incredibly fast but destroys the keys by injecting values as keys... or in other words, rather than key/value pairs, values become keys and keys become values... it's just a shit show.
foreach ($value in $($clusters.values)) {
    $clusters[$value] = (Get-TagAssignment -Entity ($value).name).tag
}
This code works - but it is unbelievably slow.
foreach ($key in $($clusters.keys)) {
    $vms = (Get-Cluster -Name $key | Get-Vm).name
    foreach ($vm in $vms) {
        $clusters[$key][$vm] = @{};
        $tag = (Get-TagAssignment -Entity $vm).tag;
        $clusters[$key][$vm] = $tag;
    }
}
When I say unbelievably slow - I mean getting the VM names takes about 5 seconds. Getting the tag assignments through the first code (codename: shit show) takes about 7 seconds. I've waited a minute on this code, and it's only gone through 6 VMs in that time. So I know there's a better way.
Thanks,

As I commented above, I wrote an example script which should make this clearer. Also note this PowerShell is meant to be illustrative; some or all of it could be done in a more efficient manner.
# For example, I'm just using a SourceData variable to make this clearer.
# You would normally be populating this hash programmatically.
# Let's say a VM has this payload data:
# @{ vm_name="bar"; os="win" }
$SourceData = @(
    @{
        cluster_name = "foo";
        vms = @( @{ vm_name="bar"; os="win" }, @{ vm_name="baz"; os="linux" } )
    }, @{
        cluster_name = "taco";
        vms = @( @{ vm_name="guac"; os="win" }, @{ vm_name="hot"; os="win" } )
    })
$clusters = @{}
# load the sourcedata into our clusters catalog
$SourceData | %{
    $clusternm = $_.cluster_name
    $clusters[ $clusternm ] = @{}
    $_.vms | %{
        $vmnm = $_.vm_name
        $clusters[ $clusternm ][ $vmnm ] = $_
    }
}
# show the whole thing
$clusters | ConvertTo-Json | Write-Output
<#
{
"taco": {
"hot": {
"os": "win",
"vm_name": "hot"
},
"guac": {
"os": "win",
"vm_name": "guac"
}
},
"foo": {
"bar": {
"os": "win",
"vm_name": "bar"
},
"baz": {
"os": "linux",
"vm_name": "baz"
}
}
}
#>
# show just a vm
$clusters['foo']['bar'] | ConvertTo-Json | Write-Output
<#
{
"os": "win",
"vm_name": "bar"
}
#>
And finally, to assure you that iterating hashes takes no appreciable time:
# now lets iterate each cluster, and each vm in that cluster. in this example, just dump the OS of each vm in each cluster
$clusters.Keys | %{
    $clusternm = $_
    $clusters[$clusternm].Keys | %{
        $vmnm = $_
        Write-Output "${clusternm}/${vmnm}: os: $( $clusters[$clusternm][$vmnm].os )"
    }
}
<#
taco/hot: os: win
taco/guac: os: win
foo/bar: os: win
foo/baz: os: linux
#>
The whole script runs immediately; only the JSON conversions added for illustrative output cost about 0.1 s.


Powershell access caller scope variables across modules

I have the following function declared in a module I've called Common.psm1:
function Test-Any ([ScriptBlock]$FilterScript = $null)
{
begin {
$done = $false
}
process {
if (!$done)
{
if (!$FilterScript -or ($FilterScript | Invoke-Expression)){
$done = $true
}
}
}
end {
$done
}
}
Set-Alias any Test-Any -Scope Global
Now in another module, I have the following validation:
$id = 1
if($notifications | any { $_.Id -eq $id })
{
# do stuff here
return
}
I receive the following error:
Invoke-Expression : The variable '$id' cannot be retrieved because it has not been set.
The interesting thing is that if I move the Test-Any definition to the calling module, it works like a charm.
How can I make this work without copying Test-Any to my other modules and without changing this syntax:
if($notifications | any { $_.Id -eq $id })
EDIT 1:
There seems to be some debate about whether or not my code should work. Feel free to try this on your own machine:
function Test-Any ([ScriptBlock]$FilterScript = $null)
{
begin {
$done = $false
}
process {
if (!$done)
{
if (!$FilterScript -or ($FilterScript | Invoke-Expression)){
$done = $true
}
}
}
end {
$done
}
}
Set-Alias any Test-Any -Scope Global
$id = 3
$myArray = @(
@{Id = 1},
@{Id = 2},
@{Id = 3},
@{Id = 4},
@{Id = 5},
@{Id = 6},
@{Id = 7},
@{Id = 8}
)
$myEmptyArray = @()
$myArray | any #returns true
$myArray | any {$_.Id -eq $id} #returns true
$myEmptyArray | any #returns false
$myEmptyArray | any {$_.Id -eq $id} #returns false
EDIT 2:
I just discovered that you only encounter this issue when Test-Any resides in one loaded module and the calling code resides in a second module using Set-StrictMode -Version Latest. If you turn off StrictMode, you don't get the error, but it also doesn't work.
EDIT 3:
Needless to say this works perfectly fine:
$sb = [Scriptblock]::Create("{ `$_.Id -eq $id }")
if($notifications | any $sb)
But that seriously takes away from the simplicity and intuitiveness I am trying to obtain.
Invoke-Expression (which, when possible, should be avoided) implicitly recreates the script block passed from the caller's scope, via its string representation, in the context of the module, which invalidates any references to the caller's state in the script-block code (because modules generally don't see an outside caller's state, except for the global scope).
The solution is to execute the script block as-is, but provide it pipeline input as passed to the module function:
# Note: New-Module creates a *dynamic* (in-memory only) module,
# but the behavior applies equally to regular, persisted modules.
$null = New-Module {
function Test-Any ([ScriptBlock] $FilterScript)
{
begin {
$done = $false
}
process {
if (!$done)
{
# Note the use of $_ | ... to provide pipeline input
# and the use of ForEach-Object to evaluate the script block.
if (!$FilterScript -or ($_ | ForEach-Object $FilterScript)) {
$done = $true
}
}
}
end {
$done
}
}
}
# Sample call. Should yield $true
$id = 1
@{ Id = 2 }, @{ Id = 1 } | Test-Any { $_.Id -eq $id }
Note: The Test-Any function in this answer uses a similar approach, but tries to optimize processing by stopping further pipeline processing - which, however, comes at the expense of incurring an on-demand compilation penalty the first time the function is called in the session, because - as of PowerShell 7.2 - you cannot (directly) stop a pipeline on demand from user code - see GitHub issue #3821.

PowerShell - Remotely Query if User Exists Across Domain [Fastest]

Abstract
So I work for a company that has roughly 10k computer assets on my domain. My issue is the time it takes to query if a user exists on a computer to see if they've ever logged into said computer. We need this functionality for audits in case they've done something they shouldn't have.
I have two methods in mind that I've researched to complete this task, and a third alternative solution I have not thought of:
-Method A: Querying every computer for "C:\Users\<USER>" to see if the LocalPath exists
-Method B: Checking every computer's registry for "HKU:\<SID>" to see if the SID exists
-Method C: You are all smarter than me and have a better way? XD
Method A Function
$AllCompFound = @()
$AllADComputers = Get-ADComputer -Properties Name -SearchBase "WhatsItToYa" -Filter 'Name -like "*"' | Select-Object Name
ForEach($Computer in $AllADComputers) {
$CName = $Computer.Name
if (Get-CimInstance -ComputerName "$CName" -ClassName Win32_Profile | ? {"C:\Users\'$EDIPI'" -contains $_.LocalPath}) {
$AllCompFound += $CName
} else {
#DOOTHERSTUFF
}
}
NOTE: I have another function that prompts me to enter a username to check for. Where I work they are numbers, so case sensitivity is not an issue. My issue with this function is I believe the 'if' statement returns true every time because it ran, rather than because it matched the username.
Method B Function
$AllCompFound = @()
$AllADComputers = Get-ADComputer -Properties Name -SearchBase "WhatsItToYa" -Filter 'Name -like "*"' | Select-Object Name
$hive = [Microsoft.Win32.RegistryHive]::Users
ForEach($Computer in $AllADComputers) {
try {
$base = [Microsoft.Win32.RegistryKey]::OpenRemoteBaseKey($hive, $Computer.Name)
$key = $base.OpenSubKey($strSID)
if (!$key) {
#DOSTUFF
} else {
$AllCompFound += $Computer.Name
#DOOTHERSTUFF
}
} catch {
#IDONTTHROWBECAUSEIWANTITTOCONTINUE
} finally {
if($key) {
$key.Close()
}
if ($base) {
$base.Close()
}
}
}
NOTE: I have another function that converts the username into a SID prior to this function. It works.
Where my eyes start to glaze over is using Invoke-Command and actually returning a value, and whether or not to run each of these queries in its own PSSession. My Method A returns false positives and my Method B seems to hang on some computers.
Neither of these methods is really fast enough to get through 10k results; I've been using smaller pools of computers to test these results when requested. I'm by no means an expert, but I think I have a good understanding, so any help is appreciated!
First, use WMI Win32_UserProfile, not C:\Users or the registry.
Second, have each PC report to some database, rather than querying every PC from the server. This is usually much better.
About GPO: if you have access, you can add/remove a scheduled task for such reports through GPP (not GPO) from time to time.
Third: use PoshRSJob to make parallel queries.
Get-WmiObject -Class 'Win32_UserProfile' |
Select-Object @(
'SID',
@{
Name = 'LastUseTime';
Expression = {$_.ConvertToDateTime($_.LastUseTime)}},
@{
Name = 'NTAccount';
Expression = { [System.Security.Principal.SecurityIdentifier]::new($_.SID).Translate([System.Security.Principal.NTAccount])}}
)
Be careful with translating to NTAccount: if a SID does not translate, it will cause an error, so it may be better not to collect the NTAccount for user-space SIDs.
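A hedged sketch of guarding that translation with try/catch, falling back to the raw SID string (the SID value here is just an example):

```powershell
# Example SID (Local System); any SID string works the same way
$sid = 'S-1-5-18'
try {
    $sidObj = [System.Security.Principal.SecurityIdentifier]::new($sid)
    # Throws if the SID has no account mapping on this machine
    $acct = $sidObj.Translate([System.Security.Principal.NTAccount]).Value
} catch {
    $acct = $sid   # fall back to the raw SID string
}
```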
If you have no other options, use parallel jobs via PoshRSJob.
Example of parallelizing (there may be some typos):
$ToDo = [System.Collections.Concurrent.ConcurrentQueue[string]]::new() # This is Queue (list) of computers that SHOULD be processed
<# Some loop through your computers #>
<#...#> $ToDo.Enqueue($computerName)
<#LoopEnd#>
$result = [System.Collections.Concurrent.ConcurrentBag[Object]]::new() # This is Bag (list) of processing results
# This function has ComputerName on input, and outputs some single value (object) as a result of processing this computer
Function Get-MySpecialComputerStats
{
Param(
[String]$ComputerName
)
<#Some magic#>
# Here we make a PSCustomObject from a hashtable. This is the result object
return [PSCustomObject]@{
ComputerName = $ComputerName;
Result = 'OK'
SomeAdditionalInfo1 = 'whateverYouWant'
SomeAdditionalInfo2 = 42 # Because 42
}
}
# This is script that runs on background. It can not output anything.
# It takes 2 args: 1st is Input queue, 2nd is output queue
$JobScript = [scriptblock]{
$inQueue = [System.Collections.Concurrent.ConcurrentQueue[string]]$args[0]
$outBag = [System.Collections.Concurrent.ConcurrentBag[Object]]$args[1]
$compName = $null
# Logging inside, if you need it
$log = [System.Text.StringBuilder]::new()
# we work until inQueue is empty ( then TryDequeue will return false )
while($inQueue.TryDequeue([ref] $compName) -eq $true)
{
$r= $null
try
{
$r = Get-MySpecialComputerStats -ComputerName $compName -EA Stop
[void]$log.AppendLine("[_]: $($compName) : OK!")
[void]$outBag.Add($r) # We append result to outBag
}
catch
{
[void]$log.AppendLine("[E]: $($compName) : $($_.Exception.Message)")
}
}
# we return log.
return $log.ToString()
}
# Some progress counters
$i_max = $ToDo.Count
$i_cur = $i_max
# We start 20 jobs. Don't forget to make our functions available inside the jobs
$jobs = @(1..20) <# Run 20 threads #> | % { Start-RSJob -ScriptBlock $JobScript -ArgumentList @($ToDo, $result) -FunctionsToImport 'Get-MySpecialComputerStats' }
# And once per 3 seconds we check, how much entries left in Queue ($todo)
while ($i_cur -gt 0)
{
Write-Progress -Activity 'Working' -Status "$($i_cur) left of $($i_max) computers" -PercentComplete (100 - ($i_cur / $i_max * 100))
Start-Sleep -Seconds 3
$i_cur = $ToDo.Count
}
# When it reaches zero, we wait for the jobs to complete their last items, and we collect the logs
$logs = $jobs | % { Wait-RSJob -Job $_ } | % { Receive-RSJob -Job $_ }
# Logs is LOGS, not result
# Result is in the result variable.
$result | Export-Clixml -Path 'P:/ath/to/file.clixml' # Exporting result to CliXML file, or whatever you want
Please be careful: $JobScript produces no output of its own, so it must be written carefully, and the function Get-MySpecialComputerStats must be tested against unusual inputs so that it always returns a value that can be interpreted.

comparison of 2 csv files using powershell

The format of the two files is the same, as follows:
ServiceName Status computer State
AdobeARMservice OK NEE Running
Amazon Assistan OK NEE Running
The requirement is: I have to check the service name and computer name; if both are the same, then I have to check whether the state of that particular service is the same in both files or not, and if it is not the same, display it.
$preser = import-csv C:\info.csv
$postser = import-csv C:\serviceinfo.csv
foreach($ser1 in $preser)
{
foreach($ser2 in $postser)
{
if(($ser1.computer -eq $ser2.computer) -and ($ser1.ServiceName -eq $ser2.ServiceName))
{
if($ser1.State -eq $ser2.State)
{
}
else
{
write-host $ser1,$ser2
}
}
}
}
This code is working fine, but as the files are very long, the execution time is considerable.
Is there an alternative method to reduce the execution time?
Thank you
Although Import-Csv on very large files will take its time, maybe this will be faster:
$preser = Import-Csv -Path 'C:\info.csv'
$postser = Import-Csv -Path 'C:\serviceinfo.csv'
# build a lookup Hashtable for $preser
$hash = @{}
foreach ($item in $preser) {
# combine the ServiceName and Computer to form the hash key
$key = '{0}#{1}' -f $item.ServiceName, $item.computer
$hash[$key] = $item
}
# now loop through the items in $postser
foreach ($item in $postser) {
$key = '{0}#{1}' -f $item.ServiceName, $item.computer
if ($hash.ContainsKey($key)) {
if ($hash[$key].State -ne $item.State) {
# create a new object for output
$out = $hash[$key] | Select-Object * -ExcludeProperty State
$out | Add-Member -MemberType NoteProperty -Name 'State in Preser' -Value $hash[$key].State
$out | Add-Member -MemberType NoteProperty -Name 'State in Postser' -Value $item.State
$out
}
}
}
The output on screen will look something like this:
ServiceName : AdobeARMservice
Status : OK
computer : NEE
State in Preser : Running
State in Postser : Stopped
Of course, you can capture this output and save it as a new CSV if you do
$result = foreach ($item in $postser) {
# rest of the above foreach loop
}
# output on screen
$result
# output to new csv
$result | Export-Csv -Path 'C:\ServiceInfoDifference.csv' -NoTypeInformation
There are a few ways to do this:
1. Sorting the columns
If the columns are unsorted in the files, sort them first, and then try finding a match by using linear search.
2. Binary search
What you are currently doing is an implementation of a linear search. You can implement binary search (works best on sorted lists) to find a result faster.
Taken from dfinkey's github repo
function binarySearch {
param($sortedArray, $seekElement, $comparatorCallback)
$comparator = New-Object Comparator $comparatorCallback
$startIndex = 0
$endIndex = $sortedArray.length - 1
while ($startIndex -le $endIndex) {
$middleIndex = $startIndex + [Math]::floor(($endIndex - $startIndex) / 2)
# If we've found the element just return its position.
if ($comparator.equal($sortedArray[$middleIndex], $seekElement)) {
return $middleIndex
}
# Decide which half to choose for seeking next: left or right one.
if ($comparator.lessThan($sortedArray[$middleIndex], $seekElement)) {
# Go to the right half of the array.
$startIndex = $middleIndex + 1
}
else {
# Go to the left half of the array.
$endIndex = $middleIndex - 1
}
}
return -1
}
3. Hashes
I am not completely sure of this method, but you can load the columns into hashtables and then compare them. Hash lookups are generally faster than array comparisons.
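A minimal sketch of that idea (hypothetical rows standing in for Import-Csv output): build a hashtable keyed on the join column once, then probe it per row instead of scanning an array.

```powershell
# Hypothetical sample rows; real code would use Import-Csv
$old = @( [pscustomobject]@{ Name = 'svc1'; State = 'Running' },
          [pscustomobject]@{ Name = 'svc2'; State = 'Stopped' } )
$new = @( [pscustomobject]@{ Name = 'svc1'; State = 'Stopped' } )

# One pass to build the lookup table
$lookup = @{}
foreach ($row in $old) { $lookup[$row.Name] = $row.State }

# One pass to compare; each probe is a hash lookup, not an array scan
$diffs = foreach ($row in $new) {
    if ($lookup.ContainsKey($row.Name) -and $lookup[$row.Name] -ne $row.State) {
        '{0}: {1} -> {2}' -f $row.Name, $lookup[$row.Name], $row.State
    }
}
$diffs   # svc1: Running -> Stopped
```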

Remove PowerShell object properties that match pattern

Given an example object (converted from JSON):
{
"Id": 1,
"Name": "Pablo",
"UnwantedProperty1XOXO": true,
"UnwantedProperty2XOXO": false,
"Things": [
{
"Name": "Something",
"UnwantedProperty3XOXO": true
}
]
...
}
How can I remove all properties that match a pattern? In the example I want to remove the three properties that end in XOXO.
My current approach is to use -ExcludeProperty like this:
$myObject | Select-Object -Property * -ExcludeProperty *XOXO
That only removes the first two properties. I need to reach deeper into the collection of Things as well. The object will change as well so I can't hardcode a check for Things and there could be many collections.
Indeed, Select-Object -ExcludeProperty does not act recursively - it only acts on the immediate properties - so a custom solution is needed.
Defining function Remove-Property, printed below, should provide the desired recursive logic:
$sampleJson = @'
{
"Id": 1,
"Name": "Pablo",
"UnwantedProperty1XOXO": true,
"UnwantedProperty2XOXO": false,
"Things": [
{
"Name": "Something",
"UnwantedProperty3XOXO": true
}
]
}
'@
$sampleJson | ConvertFrom-Json |
Remove-Property -NamePattern *XOXO |
ConvertTo-Json
An important aside: ConvertTo-Json limits serialization to a depth of just 2 levels by default, so you may have to specify a greater depth with -Depth <n>.
This problematic default behavior is discussed in GitHub issue #8393.
The result is as follows - note how all properties ending in XOXO, across all levels of the hierarchy, were removed:
{
"Id": 1,
"Name": "Pablo",
"Things": [
{
"Name": "Something"
}
]
}
Remove-Property source code
Important: Remove-Property:
assumes that the input objects are custom objects ([pscustomobject]), such as created by ConvertFrom-Json.
it modifies these objects in place, in addition to outputting the modified object; this differs from Select-Object, which creates new objects from the input.
function Remove-Property {
param(
[Parameter(Mandatory, ValueFromPipeline, Position = 0)]
[object] $InputObject,
[Parameter(Mandatory, Position = 1)]
[string] $NamePattern
)
process {
foreach ($el in $InputObject) {
foreach ($propName in $el.psobject.Properties.Name) {
if ($propName -like $NamePattern) {
$el.psobject.Properties.Remove($propName)
}
else {
$null = Remove-Property -InputObject $el.$propName -NamePattern $NamePattern
}
}
}
$InputObject
}
}
I don't prefer this solution, but it does seem easier than recursively traversing an object's nested properties of unknown depths.
$json = @'
{
"Id": 1,
"Name": "Pablo",
"UnwantedProperty1XOXO": true,
"UnwantedProperty2XOXO": false,
"Things": [
{
"Name": "Something",
"UnwantedProperty3XOXO": true
}
]
}
'@
$filter = "XOXO"
$json -replace ".*$filter.*\r?\n" -replace ",(?=\r?\n\W+})" | ConvertFrom-Json
Maybe this will work.
filter Remove-Property ($Name) {
$queue = [Collections.Generic.Queue[object]]::new(@(Get-Variable _))
while ($queue.Count) {
foreach ($elem in $queue.Dequeue().Value) {
$props = $elem.psobject.Properties
foreach ($p in $props) {
if ($p.Name -like $Name) { $props.Remove($p.Name) } else { $queue.Enqueue($p) }
}
}
}
}
The usage is as follows.
$myObject | Remove-Property -Name "*XOXO"
The ConvertFrom-Json Cmdlet by default has a depth of 2. This is most likely causing your issue.
To fix, use this ConvertFrom-Json command:
ConvertFrom-Json $input -Depth 10
Reference: ConvertFrom-Json

Merging hashtables in PowerShell: how?

I am trying to merge two hashtables, overwriting key-value pairs in the first if the same key exists in the second.
To do this I wrote this function, which first removes all key-value pairs in the first hashtable if the same key exists in the second hashtable.
When I type this into PowerShell line by line, it works. But when I run the entire function, PowerShell asks me to provide (what it considers) missing parameters to ForEach-Object.
function mergehashtables($htold, $htnew)
{
$htold.getenumerator() | foreach-object
{
$key = $_.key
if ($htnew.containskey($key))
{
$htold.remove($key)
}
}
$htnew = $htold + $htnew
return $htnew
}
Output:
PS C:\> mergehashtables $ht $ht2
cmdlet ForEach-Object at command pipeline position 1
Supply values for the following parameters:
Process[0]:
$ht and $ht2 are hashtables containing two key-value pairs each, one of them with the key "name" in both hashtables.
What am I doing wrong?
Merge-Hashtables
Instead of removing keys you might consider simply overwriting them:
$h1 = @{a = 9; b = 8; c = 7}
$h2 = @{b = 6; c = 5; d = 4}
$h3 = @{c = 3; d = 2; e = 1}
Function Merge-Hashtables {
$Output = @{}
ForEach ($Hashtable in ($Input + $Args)) {
If ($Hashtable -is [Hashtable]) {
ForEach ($Key in $Hashtable.Keys) {$Output.$Key = $Hashtable.$Key}
}
}
$Output
}
For this cmdlet you can use several syntaxes and you are not limited to two input tables:
Using the pipeline: $h1, $h2, $h3 | Merge-Hashtables
Using arguments: Merge-Hashtables $h1 $h2 $h3
Or a combination: $h1 | Merge-Hashtables $h2 $h3
All above examples return the same hash table:
Name Value
---- -----
e 1
d 2
b 6
c 3
a 9
If there are any duplicate keys in the supplied hash tables, the value of the last hash table is taken.
(Added 2017-07-09)
Merge-Hashtables version 2
In general, I prefer more generic functions which can be customized with parameters for specific needs, as in the original question: "overwriting key-value pairs in the first if the same key exists in the second". Why let the last one overrule and not the first? Why remove anything at all? Maybe someone else wants to merge or join the values, take the largest value, or just the average...
The version below no longer supports supplying hash tables as arguments (you can only pipe hash tables to the function), but it has a parameter that lets you decide how to treat the value array of duplicate entries, by operating on the value array assigned to the hash key presented in the current object ($_).
Function
Function Merge-Hashtables([ScriptBlock]$Operator) {
$Output = @{}
ForEach ($Hashtable in $Input) {
If ($Hashtable -is [Hashtable]) {
ForEach ($Key in $Hashtable.Keys) {$Output.$Key = If ($Output.ContainsKey($Key)) {@($Output.$Key) + $Hashtable.$Key} Else {$Hashtable.$Key}}
}
}
If ($Operator) {ForEach ($Key in @($Output.Keys)) {$_ = @($Output.$Key); $Output.$Key = Invoke-Command $Operator}}
$Output
}
Syntax
HashTable[] <Hashtables> | Merge-Hashtables [-Operator <ScriptBlock>]
Default
By default, all values from duplicated hash table entries will be added to an array:
PS C:\> $h1, $h2, $h3 | Merge-Hashtables
Name Value
---- -----
e 1
d {4, 2}
b {8, 6}
c {7, 5, 3}
a 9
Examples
To get the same result as version 1 (using the last values) use the command: $h1, $h2, $h3 | Merge-Hashtables {$_[-1]}. If you would like to use the first values instead, the command is: $h1, $h2, $h3 | Merge-Hashtables {$_[0]} or the largest values: $h1, $h2, $h3 | Merge-Hashtables {($_ | Measure-Object -Maximum).Maximum}.
More examples:
PS C:\> $h1, $h2, $h3 | Merge-Hashtables {($_ | Measure-Object -Average).Average} # Take the average values"
Name Value
---- -----
e 1
d 3
b 7
c 5
a 9
PS C:\> $h1, $h2, $h3 | Merge-Hashtables {$_ -Join ""} # Join the values together
Name Value
---- -----
e 1
d 42
b 86
c 753
a 9
PS C:\> $h1, $h2, $h3 | Merge-Hashtables {$_ | Sort-Object} # Sort the values list
Name Value
---- -----
e 1
d {2, 4}
b {6, 8}
c {3, 5, 7}
a 9
I see two problems:
The open brace should be on the same line as Foreach-object
You shouldn't modify a collection while enumerating through a collection
The example below illustrates how to fix both issues:
function mergehashtables($htold, $htnew)
{
$keys = $htold.getenumerator() | foreach-object {$_.key}
$keys | foreach-object {
$key = $_
if ($htnew.containskey($key))
{
$htold.remove($key)
}
}
$htnew = $htold + $htnew
return $htnew
}
Not a new answer; this is functionally the same as @Josh-Petitt's, with improvements.
In this answer:
Merge-HashTable uses correct PowerShell syntax if you want to drop this into a module
The original wasn't idempotent: I added cloning of the hashtable input, otherwise your input was clobbered, which was not the intention
Added a proper example of usage
function Merge-HashTable {
param(
[hashtable] $default, # Your original set
[hashtable] $uppend # The set you want to update/append to the original set
)
# Clone for idempotence
$default1 = $default.Clone();
# We need to remove any key-value pairs in $default1 that we will
# be replacing with key-value pairs from $uppend
foreach ($key in $uppend.Keys) {
if ($default1.ContainsKey($key)) {
$default1.Remove($key);
}
}
# Union both sets
return $default1 + $uppend;
}
# Real-life example of dealing with IIS AppPool parameters
$defaults = @{
enable32BitAppOnWin64 = $false;
runtime = "v4.0";
pipeline = 1;
idleTimeout = "1.00:00:00";
};
$options1 = @{ pipeline = 0; };
$options2 = @{ enable32BitAppOnWin64 = $true; pipeline = 0; };
$results1 = Merge-HashTable -default $defaults -uppend $options1;
# Name Value
# ---- -----
# enable32BitAppOnWin64 False
# runtime v4.0
# idleTimeout 1.00:00:00
# pipeline 0
$results2 = Merge-HashTable -default $defaults -uppend $options2;
# Name Value
# ---- -----
# idleTimeout 1.00:00:00
# runtime v4.0
# enable32BitAppOnWin64 True
# pipeline 0
In case you want to merge the whole hashtable tree
function Join-HashTableTree {
param (
[Parameter(Mandatory = $true, ValueFromPipeline = $true)]
[hashtable]
$SourceHashtable,
[Parameter(Mandatory = $true, Position = 0)]
[hashtable]
$JoinedHashtable
)
$output = $SourceHashtable.Clone()
foreach ($key in $JoinedHashtable.Keys) {
$oldValue = $output[$key]
$newValue = $JoinedHashtable[$key]
$output[$key] =
if ($oldValue -is [hashtable] -and $newValue -is [hashtable]) { $oldValue | ~+ $newValue }
elseif ($oldValue -is [array] -and $newValue -is [array]) { $oldValue + $newValue }
else { $newValue }
}
$output;
}
Then, it can be used like this:
Set-Alias -Name '~+' -Value Join-HashTableTree -Option AllScope
@{
a = 1;
b = @{
ba = 2;
bb = 3
};
c = @{
val = 'value1';
arr = @(
'Foo'
)
}
} |
~+ @{
b = @{
bb = 33;
bc = 'hello'
};
c = @{
arr = @(
'Bar'
)
};
d = @(
42
)
} |
ConvertTo-Json
It will produce the following output:
{
"a": 1,
"d": 42,
"c": {
"val": "value1",
"arr": [
"Foo",
"Bar"
]
},
"b": {
"bb": 33,
"ba": 2,
"bc": "hello"
}
}
I just needed to do this and found this works:
$HT += $HT2
The contents of $HT2 get added to the contents of $HT.
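One caveat worth adding: the hashtable + and += operators throw if the same key exists in both tables, so this shortcut only works when the key sets are disjoint. A minimal illustration:

```powershell
$HT  = @{ a = 1 }
$HT2 = @{ b = 2 }
$HT += $HT2          # fine: keys do not overlap, $HT now has a and b

$HT3 = @{ a = 9 }
# $HT += $HT3        # would throw: "Item has already been added"
```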
The open brace has to be on the same line as ForEach-Object or you have to use the line continuation character (backtick).
This is the case because the code within { ... } is really the value for the -Process parameter of ForEach-Object cmdlet.
-Process <ScriptBlock[]>
Specifies the script block that is applied to each incoming object.
This will get you past the current issue at hand.
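A minimal illustration of both options; either way the script block is bound to -Process:

```powershell
# Option 1: open brace on the same line as ForEach-Object
$doubled = 1..3 | ForEach-Object {
    $_ * 2
}

# Option 2: escape the newline with a backtick
$doubled2 = 1..3 | ForEach-Object `
{
    $_ * 2
}
```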
I think the most compact code to merge (without overwriting existing keys) would be this:
function Merge-Hashtables($htold, $htnew)
{
$htnew.keys | where {$_ -notin $htold.keys} | foreach {$htold[$_] = $htnew[$_]}
}
I borrowed it from Union and Intersection of Hashtables in PowerShell
I wanted to point out that one should not reference base properties of the hashtable indiscriminately in generic functions, as they may have been overridden (shadowed) by items of the hashtable.
For instance, the hashtable $hash = @{'keys'='lots of them'} will have the base hashtable property Keys overridden by the item keys, and thus doing foreach ($key in $hash.Keys) will instead enumerate the value of the hashed item keys, instead of the base property Keys.
Instead, the GetEnumerator method or the Keys property of the PSBase property, which cannot be overridden, should be used in functions that may have no idea whether the base properties have been overridden.
Thus, Jon Z's answer is the best.
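A quick sketch of the pitfall and the safe alternatives (the comments reflect the shadowing described above):

```powershell
$hash = @{ 'keys' = 'lots of them' }

# $hash.Keys may now resolve to the item's value ('lots of them')
# rather than the real key collection, so avoid it in generic code.

# Safe: PSBase bypasses any shadowing by items
$realKeys = $hash.PSBase.Keys

# Also safe: enumerate the entries directly
$enumKeys = $hash.GetEnumerator() | ForEach-Object { $_.Key }
```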
To 'inherit' key-values from a parent hashtable ($htOld) into a child hashtable ($htNew), without modifying the values of keys that already exist in the child hashtable:
function MergeHashtable($htOld, $htNew)
{
$htOld.Keys | %{
if (!$htNew.ContainsKey($_)) {
$htNew[$_] = $htOld[$_];
}
}
return $htNew;
}
Please note that this will modify the $htNew object.
Here is a function version that doesn't use the pipeline (not that the pipeline is bad, just another way to do it). It returns the merged hashtable; note that it modifies the first hashtable in place.
function MergeHashtable($a, $b)
{
foreach ($k in $b.keys)
{
if ($a.containskey($k))
{
$a.remove($k)
}
}
return $a + $b
}
I just wanted to expand on or simplify Jon Z's answer. There just seem to be too many lines and missed opportunities to use Where-Object. Here is my simplified version:
Function merge_hashtables($htold, $htnew) {
$htold.Keys | ? { $htnew.ContainsKey($_) } | % {
$htold.Remove($_)
}
$htold += $htnew
return $htold
}