I'm new to Scala, so I'm struggling to understand the syntax. Please check the code below.
def myDef(entityMap: Seq[(DataName.Value, DataFrame)]): Seq[Map[Int, Info]] = {
  val depenInfo = Seq[Map[Int, Info]]()
  entityMap.foldLeft(depenInfo)((info, entity) => {
    val (dataName: DataName.Value, df: DataFrame) = entity
    info ++ df.createDepenInfo(dataName)
  })
}
What I understand so far: myDef takes a Seq of tuples of two types, (DataName.Value, DataFrame), and its return type is a Seq of Map.
After that, it creates an empty Seq of Map and feeds it to entityMap.foldLeft so that more values can be added to it.
The remaining part I'm stuck on. Can anyone help me understand what's happening, if possible with a very simple example similar to the above, with output?
Thanks
Since there are many user-defined classes I don't know, I'll mock your data types as follows:
import scala.collection.{immutable, Seq}

object Example {
  object DataName {
    type Value = Int
  }

  case class DataFrame(fakeData: String) {
    def createDepenInfo(value: DataName.Value): Seq[Map[Int, Info]] = Seq(Map(value -> fakeData))
  }

  type Info = String

  def myDef(entityMap: Seq[(DataName.Value, DataFrame)]): Seq[Map[Int, Info]] = {
    val depenInfo = Seq[Map[Int, Info]]()
    entityMap.foldLeft(depenInfo)((info: Seq[Map[Int, Info]], entity: (DataName.Value, DataFrame)) => {
      // Pattern matching on a tuple:
      // we extract (dataName: DataName.Value, df: DataFrame) from the tuple entity: (DataName.Value, DataFrame)
      // see: https://docs.scala-lang.org/tour/tuples.html
      val (dataName: DataName.Value, df: DataFrame) = entity
      // ++ is a method on Seq; it concatenates two Seqs into one,
      // e.g. Seq(1,2,3) ++ Seq(4,5,6) = Seq(1,2,3,4,5,6)
      info ++ df.createDepenInfo(dataName)
    })
  }

  def main(args: Array[String]): Unit = {
    val data: immutable.Seq[(DataName.Value, DataFrame)] = (1 to 5).map(i => (i, DataFrame((i + 'a').toChar.toString)))
    // Vector((1,DataFrame(b)), (2,DataFrame(c)), (3,DataFrame(d)), (4,DataFrame(e)), (5,DataFrame(f)))
    println(data)
    val res = myDef(data)
    // List(Map(1 -> b), Map(2 -> c), Map(3 -> d), Map(4 -> e), Map(5 -> f))
    println(res)
  }
}
raw data: Vector((1,DataFrame(b)), (2,DataFrame(c)), (3,DataFrame(d)), (4,DataFrame(e)), (5,DataFrame(f)))
Let's say info ++ df.createDepenInfo(dataName) is result:
info = Seq(), entity = (1,DataFrame(b)), result = Seq(Map(1 -> b))
info = Seq(Map(1 -> b)), entity = (2,DataFrame(c)), result = Seq(Map(1 -> b), Map(2 -> c))
info = Seq(Map(1 -> b), Map(2 -> c)), entity = (3,DataFrame(d)), result = Seq(Map(1 -> b), Map(2 -> c), Map(3 -> d))
and so on...
You see, during each calculation the value info is "saved" (starting with the initial value depenInfo), and the entity value is "read" from entityMap.
So the final result is List(Map(1 -> b), Map(2 -> c), Map(3 -> d), Map(4 -> e), Map(5 -> f))
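If it helps, here is a minimal sketch (assuming the mocked DataFrame/Info definitions above are in scope) that prints the accumulator at each step, so you can watch the fold grow:
// Assumes the mocked DataFrame / Info types from the example above.
val data = Seq((1, DataFrame("b")), (2, DataFrame("c")), (3, DataFrame("d")))
val traced = data.foldLeft(Seq[Map[Int, Info]]()) { (info, entity) =>
  println(s"info = $info, entity = $entity")   // shows the accumulator before each step
  val (dataName, df) = entity
  info ++ df.createDepenInfo(dataName)
}
println(traced)
// info = List(), entity = (1,DataFrame(b))
// info = List(Map(1 -> b)), entity = (2,DataFrame(c))
// info = List(Map(1 -> b), Map(2 -> c)), entity = (3,DataFrame(d))
// List(Map(1 -> b), Map(2 -> c), Map(3 -> d))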
In your code, info is the accumulator, depenInfo is the initial value, and entity is a map item (i.e. a key-value tuple). Here's a simpler example where acc is the accumulator and kv is the key-value pair being "read" from the map.
Map(1 -> 2).foldLeft(0)((acc, kv) => {
  val (k, v) = kv
  println(s"$k, $v, $acc")
  acc + k + v
})
// prints: 1, 2, 0
// result: 3
To read about the accumulator pattern: https://www.arothuis.nl/posts/accumulators-and-folds/
As for the ++, that is the operator/method on Seq (sequence) which concatenates one sequence with another. A simple example concatenating two length-1 sequences:
Seq(1) ++ Seq(2)
// Seq(1, 2)
Sometimes I use a Map as a memoization cache. With mutable maps, I use getOrElseUpdate:
mutableMap.getOrElseUpdate(key, {
  val value = <compute the value>
  value
})
Immutable maps don't have getOrElseUpdate. So I want to do this
immutableMap.getOrElse(key, {
  val value = <compute the value>
  immutableMap += key -> value
  value
})
This seems to work in practice, I have good arguments to believe it works in theory, and it's more or less readable -- is it a terrible idea for some reason I'm missing?
The other alternatives I'm considering are
immutableMap.get(key) match {
  case Some(value) => value
  case None =>
    val value = <compute the value>
    immutableMap += key -> value
    value
}
which is not much different and is more cumbersome, or
if (immutableMap.contains(key)) {
  immutableMap(key)
} else {
  val value = <compute the value>
  immutableMap += key -> value
  value
}
which is the dumbest and probably least idiomatic.
In principle I'd rather not go for a solution that uses a helper to return the value and the updated map, unless it's unarguably the superior way.
Sure, it seems reasonable except for one small issue... it's not updating your collection! If you're using an immutable Map, then that Map is immutable. You can not change it, ever.
In fact, immutable Map from Scala collection does not even have a += method defined on it, see immutable.Map. All the methods with "append" or "add" new values to the Map actually return a new Map. So for what you've written above to compile, you'd have to not be using something immutable.
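For example, a quick REPL sketch showing that + returns a new Map and leaves the original untouched:
scala> val m = Map(1 -> "a")
m: scala.collection.immutable.Map[Int,String] = Map(1 -> a)
scala> val m2 = m + (2 -> "b")
m2: scala.collection.immutable.Map[Int,String] = Map(1 -> a, 2 -> b)
scala> m
res0: scala.collection.immutable.Map[Int,String] = Map(1 -> a)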
To do this with an immutable map, you'll need to work with a var and replace that var with the new Map (which can lead to issues with threading) or you have to adopt a State Monad type pattern in which you return not only the new value but also the new Map.
def getOrCalc[Key, Value](m: Map[Key, Value], k: Key)(f: Key => Value): (Map[Key, Value], Value) = {
  if (m.contains(k)) (m, m(k))
  else {
    val value = f(k)
    (m + (k -> value), value)
  }
}
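For completeness, here is a minimal sketch of the var-based alternative mentioned above (the names are illustrative, and the reassignment is not thread-safe):
var cache = Map.empty[Int, String]   // a var holding an immutable Map

def getOrCalc(k: Int)(f: Int => String): String =
  cache.getOrElse(k, {
    val value = f(k)
    cache += k -> value   // rebinds the var to a new Map; the old Map itself is never mutated
    value
  })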
My only recommendation (regarding the reasons why you chose a var instead of mutable.Map or Java's ConcurrentMap) is to wrap it into a DSL, like:
case class Mutable[K, V](var m: Map[K, V]) {
  def orElseUpdate(key: K, compute: => V) = m.getOrElse(key, {
    val value = compute
    m += key -> value
    value
  })
}
scala> val a = Mutable(Map(1 -> 2))
a: Mutable[Int,Int] = Mutable(Map(1 -> 2))
scala> a.orElseUpdate(2, 4)
res10: Int = 4
scala> a.orElseUpdate(2, 6)
res11: Int = 4
scala> a.orElseUpdate(3, 6)
res12: Int = 6
Another option (if your computation is lightweight) is just:
m += key -> m.getOrElse(key, compute)
m(key)
Example:
scala> var m = Map(1 -> 2)
m: scala.collection.immutable.Map[Int,Int] = Map(1 -> 2)
scala> m += 3 -> m.getOrElse(3, 5)
scala> m
res1: scala.collection.immutable.Map[Int,Int] = Map(1 -> 2, 3 -> 5)
scala> m += 3 -> m.getOrElse(3, 5)
scala> m
res3: scala.collection.immutable.Map[Int,Int] = Map(1 -> 2, 3 -> 5)
scala> m += 3 -> m.getOrElse(3, 6)
scala> m
res5: scala.collection.immutable.Map[Int,Int] = Map(1 -> 2, 3 -> 5)
You can wrap it into a DSL as well:
implicit class RichMap[K, V](m: Map[K, V]) {
  def kvOrElse(k: K, v: V) = k -> m.getOrElse(k, v)
}
scala> m += m.kvOrElse(3, 7)
scala> m
res7: scala.collection.immutable.Map[Int,Int] = Map(1 -> 2, 3 -> 5)
scala> m += m.kvOrElse(4, 7)
scala> m
res9: scala.collection.immutable.Map[Int,Int] = Map(1 -> 2, 3 -> 5, 4 -> 7)
I have defined the object below, but I don't understand why the mapValues body only executes in test2, i.e. why the output is:
Calling test1
Calling test2
Mapping: One
Mapping: Two
Mapped: Map(1 -> Xx, 2 -> Xx)
I have tested it with both scala 2.10 and 2.11 with the same results.
object Test {

  def test1: Option[String] = {
    val map = Map(1 -> "One", 2 -> "Two")
    val mapped = map.mapValues { v =>
      println("Mapping: " + v)
      "Xx"
    }
    None
  }

  def test2: Option[String] = {
    val map = Map(1 -> "One", 2 -> "Two")
    val mapped = map.mapValues { v =>
      println("Mapping: " + v)
      "Xx"
    }
    println("Mapped: " + mapped)
    None
  }

  def main(args: Array[String]): Unit = {
    println("Calling test1")
    test1
    println("Calling test2")
    test2
  }
}
mapValues actually returns a view, so the results are computed lazily. From the scaladoc for mapValues:
return a map view which maps every key of this map to f(this(key)). The resulting map wraps the original map without copying any elements.
So for example:
val mapped = Map(1 -> "One", 2 -> "Two").mapValues { v =>
  println("Mapping: " + v)
  "Xx"
}
On its own, this will print nothing when declared. But as soon as mapped is accessed, the values will be computed and the statements will be printed. (In fact, the values will be re-computed every time you access mapped.)
In Test.test1, there is nothing accessing mapped, so the values are never computed.
In Test.test2, you're printing out mapped, which triggers the computation of the values.
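If you want the computation (and the side effects) to happen exactly once, one simple option is to materialize the view into a strict map, for example with toMap (a small sketch):
val mapped = Map(1 -> "One", 2 -> "Two").mapValues { v =>
  println("Mapping: " + v)
  "Xx"
}.toMap   // forces the computation once; mapped is now a plain strict Map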
The other answer explains the problem, but as a solution, if you want a strict map, just use a normal map:
val m = Map(1 -> "One", 2 -> "Two")
val mapped = m.map {
  case (k, v) => k -> {
    println("Mapping: " + v)
    "Xx"
  }
}
Alternatively, you can define your own extension method to do what you want:
import scala.collection.GenTraversableLike
import scala.collection.generic.CanBuildFrom

implicit class HasMapVals[T, U, Repr](val self: GenTraversableLike[(T, U), Repr]) extends AnyVal {
  def mapVals[R, That](f: U => R)(implicit bf: CanBuildFrom[Repr, (T, R), That]) = {
    self.map { case (k, v) => k -> f(v) }
  }
}
val m = Map(1 -> "One", 2 -> "Two")
val mapped = m.mapVals { v =>
  println("Mapping: " + v)
  "Xx"
}
I'm trying to use Scala's LinkedHashMap as an LRU cache, but I'm not sure how to remove the oldest entry of such a map. I know Java's LinkedHashMap has a method removeEldestEntry, but there does not seem to be a similar method in Scala's implementation. I'd prefer not to convert to Java's implementation just to have access to removeEldestEntry. How can I achieve this?
This will do what you want:
def removeOldestEntry[K](m: scala.collection.mutable.LinkedHashMap[K, _]): m.type =
  m -= m.head._1
(Kudos to Jasper-M for pointing out that head will give the oldest entry)
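A quick usage sketch (keys and values are just illustrative):
import scala.collection.mutable

val m = mutable.LinkedHashMap(1 -> "a", 2 -> "b", 3 -> "c")
removeOldestEntry(m)
println(m)   // 1 -> "a" (the oldest entry) is gone; 2 -> "b" and 3 -> "c" remain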
You can do this in the following way:
object myApp {
  def main(args: Array[String]) {
    val myMap = new MyLinkedHashMap[Int, String]()
    myMap.add(1, "a")    // Map(1 -> a)
    myMap.add(2, "b")    // Map(1 -> a, 2 -> b)
    myMap.add(3, "c")    // Map(1 -> a, 2 -> b, 3 -> c)
    myMap.add(4, "d")    // Map(1 -> a, 2 -> b, 3 -> c, 4 -> d)
    myMap.removeEldest   // Map(2 -> b, 3 -> c, 4 -> d)
    myMap.get(2)         // Map(3 -> c, 4 -> d, 2 -> b)
    myMap.removeEldest   // Map(4 -> d, 2 -> b)
  }
}

class MyLinkedHashMap[K, V] {
  import scala.collection.mutable.LinkedHashMap

  var map = new LinkedHashMap[K, V]()

  /* adds an element to the HashMap */
  def add(key: K, value: V) {
    map.put(key, value)
  }

  /* removes the LRU element from the HashMap */
  def removeEldest {
    if (!map.isEmpty) {
      map = map.drop(1)
    }
  }

  /* gets the value for the given key and moves it to the end of the HashMap */
  def get(key: K): Option[V] = {
    val value = map.remove(key)
    if (value != None) {
      map.put(key, value.get)
    }
    value
  }
}
I have a list of parent keys, each of which could possibly have zero or more associated values. I am not sure which collection to use.
I am using Map[Int,List[String]]
I am declaring the Map as
var nodes = new HashMap[Int, List[String]]
Then I have two methods to handle adding new elements. The first is to add new keys addNode and the second is to add new values addValue. Initially, the key will not have any values associated with it. Later on, during execution, new values will be associated.
def addNode(key: Int) = nodes += (key -> "")
def addValue(key: Int, value: String) = ???
I am not sure how to implement addValue.
Update:
In response to @oxbow_lakes' answer, this is the error I am receiving. Please note that keys need not have values associated with them.
scala> var nodes = Map.empty[Int, List[String]]
nodes: scala.collection.immutable.Map[Int,List[String]] = Map()
scala> nodes += (1->null)
scala> nodes += (1 -> ("one" :: (nodes get 1 getOrElse Nil)))
java.lang.NullPointerException
at .<init>(<console>:9)
at .<clinit>(<console>)
at .<init>(<console>:11)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:704)
at scala.tools.nsc.interpreter.IMain$Request$$anonfun$14.apply(IMain.scala:920)
at scala.tools.nsc.interpreter.Line$$anonfun$1.apply$mcV$sp(Line.scala:43)
at scala.tools.nsc.io.package$$anon$2.run(package.scala:25)
at java.lang.Thread.run(Thread.java:680)
Update 2:
The problem with the code above is the line nodes += (1 -> null); the key should be associated with Nil instead. Below is the working code.
scala> var nodes = Map.empty[Int, List[String]]
nodes: scala.collection.immutable.Map[Int,List[String]] = Map()
scala> nodes += (1->Nil)
scala> nodes += (1 -> ("one" :: (nodes get 1 getOrElse Nil)))
scala> nodes
res27: scala.collection.immutable.Map[Int,List[String]] = Map(1 -> List(one))
Using MultiMap
You possibly want to use MultiMap, which is a mutable collection isomorphic to Map[K, Set[V]]. Use as follows:
import collection.mutable
val mm = new mutable.HashMap[Int, mutable.Set[String]] with mutable.MultiMap[Int, String]
Then you add your nodes:
mm addBinding (key, value)
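For example (a small sketch; the keys and values are just illustrative):
mm.addBinding(1, "a")
mm.addBinding(1, "b")
mm.addBinding(2, "c")
// mm is now roughly Map(1 -> Set(a, b), 2 -> Set(c))
mm.removeBinding(1, "a")   // removes a single value for a key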
Without MultiMap
The alternative is to stick with immutable values. Assuming you want to avoid using lenses (see scalaz), you can add nodes as follows:
nodes += (key -> (value :: (nodes get key getOrElse Nil)))
Here it is working (in response to your comment):
scala> var nodes = Map.empty[Int, List[String]]
nodes: scala.collection.immutable.Map[Int,List[String]] = Map()
scala> def addNode(key: Int, value: String) =
| nodes += (key -> (value :: (nodes get key getOrElse Nil)))
addNode: (key: Int, value: String)Unit
scala> addNode(1, "Hi")
scala> addNode(1, "Bye")
scala> nodes
res2: scala.collection.immutable.Map[Int,List[String]] = Map(1 -> List(Bye, Hi))
Using Scalaz
Using the scalaz library, you can realize that this is simply using the Empty pattern:
nodes += (key -> (value :: ~(nodes get key)))
Or you could take advantage of the fact that Map is a monoid:
nodes = nodes |+| Map(key -> List(value))
In addition to @oxbow_lakes' answer, here's an idea for how you could use an addMap method that correctly adds two maps together (i.e., combining lists for matching keys, adding new lists for new keys):
class EnhancedListMap(self: Map[Int, List[String]]) {

  def addMap(other: Map[Int, List[String]]) =
    (this.ungroup ++ enhanceListMap(other).ungroup)
      .groupBy(_._1)
      .mapValues(_.map(_._2))

  def ungroup() =
    self.toList.flatMap { case (k, vs) => vs.map(k -> _) }
}

implicit def enhanceListMap(self: Map[Int, List[String]]) = new EnhancedListMap(self)
And you'd use it like this:
val a = Map(1 -> List("a","b"), 2 -> List("c","d"))
val b = Map(2 -> List("e","f"), 3 -> List("g","h"))
a addMap b
//Map(3 -> List(g, h), 1 -> List(a, b), 2 -> List(c, d, e, f))
You can include addNode, addValue, and addValues the same way (to EnhancedListMap above):
def addNode(key: Int) =
  if (self contains key) self else self + (key -> Nil)

def addValue(key: Int, value: String) =
  self + (key -> (value :: (self get key getOrElse Nil)))

def addValues(key: Int, values: List[String]) =
  self + (key -> (values ::: (self get key getOrElse Nil)))
And then use them together:
var nodes = Map.empty[Int, List[String]]
// Map()
nodes = nodes.addNode(1)
// Map(1 -> List())
nodes = nodes.addValue(1,"a")
// Map(1 -> List(a))
nodes = nodes.addValue(2,"b")
// Map(1 -> List(a), 2 -> List(b))
nodes = nodes.addValues(2,List("c","d"))
// Map(1 -> List(a), 2 -> List(c, d, b))
nodes = nodes.addValues(3,List("e","f"))
// Map(1 -> List(a), 2 -> List(c, d, b), 3 -> List(e, f))
nodes = nodes.addMap(Map(3 -> List("g","h"), 4-> List("i","j")))
// Map(1 -> List(a), 2 -> List(c, d, b), 3 -> List(e, f, g, h), 4 -> List(i, j))
I quite like the getOrElseUpdate method provided by mutable maps:
import scala.collection.mutable._

private val nodes = new HashMap[Int, Buffer[String]]

def addNode(key: Int): Unit =
  nodes.getOrElseUpdate(key, new ArrayBuffer)

def addValue(key: Int, value: String): Unit =
  nodes.getOrElseUpdate(key, new ArrayBuffer) += value
I have a List of Map[String, Double], and I'd like to merge their contents into a single Map[String, Double]. How should I do this in an idiomatic way? I imagine that I should be able to do this with a fold. Something like:
val newMap = Map[String, Double]() /: listOfMaps { (accumulator, m) => ... }
Furthermore, I'd like to handle key collisions in a generic way. That is, if I add a key to the map that already exists, I should be able to specify a function that returns a Double (in this case) and takes the existing value for that key, plus the value I'm trying to add. If the key does not yet exist in the map, then just add it and its value unaltered.
In my specific case I'd like to build a single Map[String, Double] such that if the map already contains a key, then the Double will be added to the existing map value.
I'm working with mutable maps in my specific code, but I'm interested in more generic solutions, if possible.
Well, you could do:
mapList reduce (_ ++ _)
except for the special requirement for collision.
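(To see why plain ++ doesn't meet that requirement: on a duplicate key the right-hand map simply wins, nothing is summed.)
Map("hello" -> 1.1, "world" -> 2.2) ++ Map("hello" -> 4.4)
// Map(hello -> 4.4, world -> 2.2) -- the 1.1 is discarded rather than added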
Since you do have that special requirement, perhaps the best would be doing something like this (2.8):
def combine(m1: Map[Symbol, Double], m2: Map[Symbol, Double]): Map[Symbol, Double] = {
  val k1 = Set(m1.keysIterator.toList: _*)
  val k2 = Set(m2.keysIterator.toList: _*)
  val intersection = k1 & k2

  val r1 = for (key <- intersection) yield (key -> (m1(key) + m2(key)))
  val r2 = m1.filterKeys(!intersection.contains(_)) ++ m2.filterKeys(!intersection.contains(_))
  r2 ++ r1
}
You can then add this method to the map class through the Pimp My Library pattern, and use it in the original example instead of "++":
class CombiningMap(m1: Map[Symbol, Double]) {
  def combine(m2: Map[Symbol, Double]) = {
    val k1 = Set(m1.keysIterator.toList: _*)
    val k2 = Set(m2.keysIterator.toList: _*)
    val intersection = k1 & k2

    val r1 = for (key <- intersection) yield (key -> (m1(key) + m2(key)))
    val r2 = m1.filterKeys(!intersection.contains(_)) ++ m2.filterKeys(!intersection.contains(_))
    r2 ++ r1
  }
}
// Then use this:
implicit def toCombining(m: Map[Symbol, Double]) = new CombiningMap(m)
// And finish with:
mapList reduce (_ combine _)
This was written for 2.8; for 2.7, keysIterator becomes keys, filterKeys might need to be written in terms of filter and map, & becomes **, and so on, but it shouldn't be too different.
How about this one:
def mergeMap[A, B](ms: List[Map[A, B]])(f: (B, B) => B): Map[A, B] =
  (Map[A, B]() /: (for (m <- ms; kv <- m) yield kv)) { (a, kv) =>
    a + (if (a.contains(kv._1)) kv._1 -> f(a(kv._1), kv._2) else kv)
  }
val ms = List(Map("hello" -> 1.1, "world" -> 2.2), Map("goodbye" -> 3.3, "hello" -> 4.4))
val mm = mergeMap(ms)((v1, v2) => v1 + v2)
println(mm) // prints Map(hello -> 5.5, world -> 2.2, goodbye -> 3.3)
And it works in both 2.7.5 and 2.8.0.
I'm surprised no one's come up with this solution yet:
myListOfMaps.flatten.toMap
Does exactly what you need:
Merges the list to a single Map
Weeds out any duplicate keys
Example:
scala> List(Map('a -> 1), Map('b -> 2), Map('c -> 3), Map('a -> 4, 'b -> 5)).flatten.toMap
res7: scala.collection.immutable.Map[Symbol,Int] = Map('a -> 4, 'b -> 5, 'c -> 3)
flatten turns the list of maps into a flat list of tuples, toMap turns the list of tuples into a map with all the duplicate keys removed
Starting in Scala 2.13, another solution that handles duplicate keys and uses only the standard library consists in merging the Maps as sequences (flatten) before applying the new groupMapReduce operator, which (as its name suggests) is the equivalent of a groupBy followed by a mapping and a reduce step over the grouped values:
List(Map("hello" -> 1.1, "world" -> 2.2), Map("goodbye" -> 3.3, "hello" -> 4.4))
.flatten
.groupMapReduce(_._1)(_._2)(_ + _)
// Map("world" -> 2.2, "goodbye" -> 3.3, "hello" -> 5.5)
This:
flattens (concatenates) the maps as a sequence of tuples (List(("hello", 1.1), ("world", 2.2), ("goodbye", 3.3), ("hello", 4.4))), which keeps all key/values (even duplicate keys)
groups elements based on their first tuple part (_._1) (group part of groupMapReduce)
maps grouped values to their second tuple part (_._2) (map part of groupMapReduce)
reduces mapped grouped values (_+_) by taking their sum (but it can be any reduce: (T, T) => T function) (reduce part of groupMapReduce)
The groupMapReduce step can be seen as a one-pass version equivalent of:
list.groupBy(_._1).mapValues(_.map(_._2).reduce(_ + _))
Interesting, noodling around with this a bit, I got the following (on 2.7.5):
General Maps:
def mergeMaps[A, B](collisionFunc: (B, B) => B)(listOfMaps: Seq[scala.collection.Map[A, B]]): Map[A, B] = {
  listOfMaps.foldLeft(Map[A, B]()) { (m, s) =>
    Map(
      s.projection.map { pair =>
        if (m contains pair._1)
          (pair._1, collisionFunc(m(pair._1), pair._2))
        else
          pair
      }.force.toList: _*)
  }
}
But man, that is hideous with the projection and forcing and toList and whatnot. Separate question: what's a better way to deal with that within the fold?
For mutable Maps, which is what I was dealing with in my code, and with a less general solution, I got this:
def mergeMaps[A, B](collisionFunc: (B, B) => B)(listOfMaps: List[mutable.Map[A, B]]): mutable.Map[A, B] = {
  listOfMaps.foldLeft(mutable.Map[A, B]()) { (m, s) =>
    for (k <- s.keys) {
      if (m contains k)
        m(k) = collisionFunc(m(k), s(k))
      else
        m(k) = s(k)
    }
    m
  }
}
That seems a little bit cleaner, but will only work with mutable Maps as it's written. Interestingly, I first tried the above (before I asked the question) using /: instead of foldLeft, but I was getting type errors. I thought /: and foldLeft were basically equivalent, but the compiler kept complaining that I needed explicit types for (m, s). What's up with that?
I read this question quickly, so I'm not sure if I'm missing something (like it having to work for 2.7.x, or no scalaz):
import scalaz._
import Scalaz._
val ms = List(Map("hello" -> 1.1, "world" -> 2.2), Map("goodbye" -> 3.3, "hello" -> 4.4))
ms.reduceLeft(_ |+| _)
// returns Map(goodbye -> 3.3, hello -> 5.5, world -> 2.2)
You can change the monoid definition for Double and get another way to accumulate the values, here getting the max:
implicit val dbsg: Semigroup[Double] = semigroup((a,b) => math.max(a,b))
ms.reduceLeft(_ |+| _)
// returns Map(goodbye -> 3.3, hello -> 4.4, world -> 2.2)
I wrote a blog post about this, check it out:
http://www.nimrodstech.com/scala-map-merge/
Basically, using a scalaz semigroup you can achieve this pretty easily.
It would look something like:
import scalaz.Scalaz._
listOfMaps reduce(_ |+| _)
A one-liner helper function whose usage reads almost as cleanly as using scalaz:
def mergeMaps[K, V](m1: Map[K, V], m2: Map[K, V])(f: (V, V) => V): Map[K, V] =
  (m1 -- m2.keySet) ++ (m2 -- m1.keySet) ++ (for (k <- m1.keySet & m2.keySet) yield { k -> f(m1(k), m2(k)) })
val ms = List(Map("hello" -> 1.1, "world" -> 2.2), Map("goodbye" -> 3.3, "hello" -> 4.4))
ms.reduceLeft(mergeMaps(_,_)(_ + _))
// returns Map(goodbye -> 3.3, hello -> 5.5, world -> 2.2)
For ultimate readability, wrap it in an implicit custom type:
class MyMap[K, V](m1: Map[K, V]) {
  def merge(m2: Map[K, V])(f: (V, V) => V) =
    (m1 -- m2.keySet) ++ (m2 -- m1.keySet) ++ (for (k <- m1.keySet & m2.keySet) yield { k -> f(m1(k), m2(k)) })
}

implicit def toMyMap[K, V](m: Map[K, V]) = new MyMap(m)
val ms = List(Map("hello" -> 1.1, "world" -> 2.2), Map("goodbye" -> 3.3, "hello" -> 4.4))
ms reduceLeft { _.merge(_)(_ + _) }