Fold from Map[String,List[Int]] to Map[String,Int] - scala

I'm fairly new to Scala and functional approaches in general. I have a Map that looks something like this:
val myMap: Map[String, List[Int]]
I want to end up with something that maps each key to the total of the associated list:
val totalsMap: Map[String, Int]
My initial hunch was to use a for comprehension:
val totalsMap = for (kvPair <- myMap) {
  kvPair._2.foldLeft(0)(_+_)
}
But I have no idea what I would put in the yield() clause in order to get a map out of the for comprehension.

You can use mapValues for this,
val totalMap = myMap.mapValues(_.sum)
But mapValues only wraps the original map, so it will recalculate the sum every time you get a key from the Map. E.g. if you do totalMap("a") multiple times, it will recalculate the sum each time.
If you don't want this, you should use
val totalMap = myMap map {
  case (k, v) => k -> v.sum
}
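On Scala 2.13 and later, mapValues on Map is deprecated in favour of going through a view; a minimal sketch of the strict, compute-once equivalent (assuming a 2.13+ standard library) is:

// Scala 2.13+: mapValues lives on MapView; toMap materializes a strict Map,
// so each sum is computed once rather than on every lookup.
val totalsMap: Map[String, Int] = myMap.view.mapValues(_.sum).toMap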

mapValues would be more suited for this case:
val m = Map[String, List[Int]]("a" -> List(1,2,3), "b" -> List(4,5,6))
m.mapValues(_.foldLeft(0)(_+_))
res1: scala.collection.immutable.Map[String,Int] = Map(a -> 6, b -> 15)
Or without foldLeft:
m.mapValues(_.sum)

val m = Map("hello" -> Seq(1, 1, 1, 1), "world" -> Seq(1, 1))
for ((k, v) <- m) yield (k, v.sum)
yields
Map(hello -> 4, world -> 2)
The for comprehension will return whatever monadic type you give it. In this case, m is a Map, so that's what's going to come out. The yield must return a tuple. The first element (which becomes the key in each Map entry) is the word you're counting, and the second element (you guessed it, the value in each Map entry) becomes the sum of the original sequence of counts.
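Since this answer leans on how the for comprehension desugars, here is a minimal sketch (reusing the same m as above) of the equivalent direct map call:

// Equivalent to the for/yield above: map over the entries and
// rebuild each (key, value) pair with the summed value.
val totals: Map[String, Int] = m.map { case (k, v) => (k, v.sum) }
// totals == Map("hello" -> 4, "world" -> 2)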


How to add all the values of a map without using recurrsion or var

I want to add all the values in a map without using var or any mutable structures. I have tried to do something like this but it doesn't work:
val mymap = Map("a" -> 1, "b" -> 2)
val sum_of_alcohol_consumption =
  for ((k, v) <- mymap) yield (sum_of_alcohol_consumption += v)
I have been told that I can use .sum on a list
Please help
Thanks
You can use the .values function of a Map to get an Iterable of its values (all of the integers) and then call the .sum function on that:
val myMap = Map("a" -> 1, "b" -> 2)
val sum = myMap.values.sum
println(sum) // Outputs: 3
An equivalent answer to the more elegant use of sum is to use a fold operation. sum is implemented in a manner similar to this:
val myMap = Map("a" -> 1, "b" -> 2)
val sumAlcoholConsumption = myMap.values.foldLeft(0)(_ + _)
values returns a collection of just the values in the map. The first foldLeft argument is the zero value (think of it as the initial value of the accumulator). The second argument is a function that adds the current accumulator value to the current element, returning the sum of the two; it is applied to each value in turn. That said, sum is a lot more convenient.
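If you would rather fold over the entries themselves instead of going through values, a minimal sketch (same myMap as above) looks like this:

// Fold over the (key, value) pairs, accumulating only the values.
val total = myMap.foldLeft(0) { case (acc, (_, v)) => acc + v }
// total == 3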
To get only the values of a map, Map provides a values function which returns an Iterable; we can apply the sum function to it directly.
scala> val mymap = Map("a" -> 1, "b" -> 2)
mymap: scala.collection.immutable.Map[String,Int] = Map(a -> 1, b -> 2)
scala> mymap.values.sum
res7: Int = 3

How to sum a List[(Char,Int)] into a Map[Char,Int] in Scala?

I've got list of pairs:
List(('a',3),('b',3),('a',1))
and I would like to transform it by grouping by _1 and summing _2. The result should be like
Map('a'->4, 'b' -> 3)
I'm very new to Scala so please be kind :)
A more direct version. We fold over the list, using a Map as the accumulator. The withDefaultValue means we don't have to test whether the entry is already in the map.
val xs = List(('a',3),('b',3),('a',1))
xs.foldLeft(Map[Char, Int]() withDefaultValue 0) {
  case (m, (c, i)) => m updated (c, m(c) + i)
}
//> res0: scala.collection.immutable.Map[Char,Int] = Map(a -> 4, b -> 3)
list.groupBy(_._1).mapValues(_.map(_._2).sum)
which can be written as
list.groupBy(_._1).mapValues { tuples =>
  val ints = tuples.map { case (c, i) => i }
  ints.sum
}
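On Scala 2.13 and later there is also groupMapReduce, which does the grouping, mapping and summing in one pass; a minimal sketch (assuming a 2.13+ standard library):

val list = List(('a', 3), ('b', 3), ('a', 1))
// group by the Char, map each pair to its Int, reduce the Ints by summing
list.groupMapReduce(_._1)(_._2)(_ + _)
// Map('a' -> 4, 'b' -> 3)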

How to un-nest a spark rdd that has the following type ((String, scala.collection.immutable.Map[String,scala.collection.immutable.Map[String,Int]]))

It's a nested map with contents like this when I print it to the screen:
(5, Map("ABCD" -> Map("3200" -> 3,
                      "3350.800" -> 4,
                      "200.300" -> 3)))
(1, Map("DEF" -> Map("1200" -> 32,
                     "1320.800" -> 4,
                     "2100" -> 3)))
I need to get something like this:
CaseClass(5, ABCD, 3200, 3)
CaseClass(5, ABCD, 3350.800, 4)
CaseClass(5, ABCD, 200.300, 3)
CaseClass(1, DEF, 1200, 32)
CaseClass(1, DEF, 1320.800, 4)
etc. etc.
Basically, a list of case classes.
And I need to map each entry to a case class object so that I can save it to Cassandra.
I have tried flatMapValues, but that un-nests the map only one level. I also used flatMap; that doesn't work either, or I'm making mistakes.
Any suggestions?
Fairly straightforward using a for-comprehension and some pattern matching to destructure things:
val in = List((5, Map("ABCD" -> Map("3200" -> 3, "3350.800" -> 4, "200.300" -> 3))),
              (1, Map("DEF" -> Map("1200" -> 32, "1320.800" -> 4, "2100" -> 3))))
case class Thing(a:Int, b:String, c:String, d:Int)
for { (index, m) <- in
      (k, v) <- m
      (innerK, innerV) <- v }
yield Thing(index, k, innerK, innerV)
//> res0: List[maps.maps2.Thing] = List(Thing(5,ABCD,3200,3),
// Thing(5,ABCD,3350.800,4),
// Thing(5,ABCD,200.300,3),
// Thing(1,DEF,1200,32),
// Thing(1,DEF,1320.800,4),
// Thing(1,DEF,2100,3))
So let's pick apart the for-comprehension.
(index, m) <- in
This is the same as
t <- in
(index, m) = t
In the first line t will successively be set to each element of in.
t is therefore a tuple (Int, Map(...))
Pattern matching lets us put that "pattern" for the tuple on the left-hand side, and the compiler picks apart the tuple, setting index to the Int and m to the Map.
(k, v) <- m
As before this is equivalent to
u <- m
(k, v) = u
And this time u takes each element of the Map, which again are tuples of key and value. So k is set successively to each key and v to the corresponding value.
And v is your inner map, so we do the same thing again with the inner map:
(innerK, innerV) <- v
Now we have everything we need to create the case class. yield just says make a collection of whatever is "yielded" each time through the loop.
yield Thing(index, k, innerK, innerV)
Under the hood, this just translates to a set of maps/flatmaps
The yield is just the value Thing(index, k, innerK, innerV)
We get one of those for each element of v
v.map { x => val (innerK, innerV) = x; Thing(index, k, innerK, innerV) }
but there's an inner map per element of the outer map
m.flatMap { y => val (k, v) = y; v.map { x => val (innerK, innerV) = x; Thing(index, k, innerK, innerV) } }
(flatMap because we get a List of Lists if we just did a map and we want to flatten it to just the list of items)
Similarly, we do one of those for every element in the List
in.flatMap { z => val (index, m) = z; m.flatMap { y => val (k, v) = y; v.map { x => val (innerK, innerV) = x; Thing(index, k, innerK, innerV) } } }
Let's do that in _1, _2 style:
in.flatMap { z => z._2.flatMap { y => y._2.map { x => Thing(z._1, y._1, x._1, x._2) } } }
which produces exactly the same result. But isn't it clearer as a for-comprehension?
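Since the original question is about a Spark RDD rather than a plain List, here is a minimal sketch of the same destructuring applied through RDD.flatMap (assuming rdd: RDD[(Int, Map[String, Map[String, Int]])] and the Thing case class above):

// Each (index, outer map) element of the RDD is expanded into one Thing
// per innermost (key, value) pair, exactly like the for-comprehension above.
val things = rdd.flatMap { case (index, m) =>
  for {
    (k, inner) <- m
    (innerK, innerV) <- inner
  } yield Thing(index, k, innerK, innerV)
}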
You can do it like this if you prefer collection operations:
case class Record(v1: Int, v2: String, v3: Double, v4: Int)
val data = List(
  (5, Map("ABC" ->
    Map(
      3200.0   -> 3,
      3350.800 -> 4,
      200.300  -> 3))
  ),
  (1, Map("DEF" ->
    Map(
      1200.0   -> 32,
      1320.800 -> 4,
      2100.0   -> 3))
  )
)
val rdd = sc.parallelize(data)
val result = rdd.flatMap(p => {
  p._2.toList
    .flatMap(q => q._2.toList.map(l => (q._1, l)))
    .map((p._1, _))
}).map(p => Record(p._1, p._2._1, p._2._2._1, p._2._2._2))
println(result.collect.toList)
//List(
// Record(5,ABC,3200.0,3),
// Record(5,ABC,3350.8,4),
// Record(5,ABC,200.3,3),
// Record(1,DEF,1200.0,32),
// Record(1,DEF,1320.8,4),
// Record(1,DEF,2100.0,3)
//)

Best way to merge two maps and sum the values of same key?

val map1 = Map(1 -> 9 , 2 -> 20)
val map2 = Map(1 -> 100, 3 -> 300)
I want to merge them, and sum the values of same keys. So the result will be:
Map(2->20, 1->109, 3->300)
Now I have 2 solutions:
val list = map1.toList ++ map2.toList
val merged = list.groupBy(_._1).map { case (k, v) => k -> v.map(_._2).sum }
and
val merged = (map1 /: map2) { case (map, (k, v)) =>
  map + (k -> (v + map.getOrElse(k, 0)))
}
But I want to know if there are any better solutions.
The shortest answer I know of that uses only the standard library is
map1 ++ map2.map{ case (k,v) => k -> (v + map1.getOrElse(k,0)) }
Scalaz has the concept of a Semigroup which captures what you want to do here, and leads to arguably the shortest/cleanest solution:
scala> import scalaz._
import scalaz._
scala> import Scalaz._
import Scalaz._
scala> val map1 = Map(1 -> 9 , 2 -> 20)
map1: scala.collection.immutable.Map[Int,Int] = Map(1 -> 9, 2 -> 20)
scala> val map2 = Map(1 -> 100, 3 -> 300)
map2: scala.collection.immutable.Map[Int,Int] = Map(1 -> 100, 3 -> 300)
scala> map1 |+| map2
res2: scala.collection.immutable.Map[Int,Int] = Map(1 -> 109, 3 -> 300, 2 -> 20)
Specifically, the binary operator for Map[K, V] combines the keys of the maps, folding V's semigroup operator over any duplicate values. The standard semigroup for Int uses the addition operator, so you get the sum of values for each duplicate key.
Edit: A little more detail, as per user482745's request.
Mathematically a semigroup is just a set of values, together with an operator that takes two values from that set, and produces another value from that set. So integers under addition are a semigroup, for example - the + operator combines two ints to make another int.
You can also define a semigroup over the set of "all maps with a given key type and value type", so long as you can come up with some operation that combines two maps to produce a new one which is somehow the combination of the two inputs.
If there are no keys that appear in both maps, this is trivial. If the same key exists in both maps, then we need to combine the two values that the key maps to. Hmm, haven't we just described an operator which combines two entities of the same type? This is why in Scalaz a semigroup for Map[K, V] exists if and only if a Semigroup for V exists - V's semigroup is used to combine the values from two maps which are assigned to the same key.
So because Int is the value type here, the "collision" on the 1 key is resolved by integer addition of the two mapped values (as that's what Int's semigroup operator does), hence 100 + 9. If the values had been Strings, a collision would have resulted in string concatenation of the two mapped values (again, because that's what the semigroup operator for String does).
(And interestingly, because string concatenation is not commutative - that is, "a" + "b" != "b" + "a" - the resulting semigroup operation isn't either. So map1 |+| map2 is different from map2 |+| map1 in the String case, but not in the Int case.)
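To make the idea concrete without pulling in Scalaz, here is a minimal hand-rolled sketch (Semigroup, combine and mergeMaps are illustrative names here, not Scalaz's actual API):

// A value type V that can be combined; Map[K, V] then gets a merge for free.
trait Semigroup[V] {
  def combine(a: V, b: V): V
}

implicit val intSemigroup: Semigroup[Int] = new Semigroup[Int] {
  def combine(a: Int, b: Int): Int = a + b
}

def mergeMaps[K, V](m1: Map[K, V], m2: Map[K, V])(implicit sg: Semigroup[V]): Map[K, V] =
  m2.foldLeft(m1) { case (acc, (k, v)) =>
    // on a key collision, fall back to V's semigroup to combine the two values
    acc.updated(k, acc.get(k).fold(v)(existing => sg.combine(existing, v)))
  }

mergeMaps(Map(1 -> 9, 2 -> 20), Map(1 -> 100, 3 -> 300))
// Map(1 -> 109, 2 -> 20, 3 -> 300)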
Quick solution:
(map1.keySet ++ map2.keySet).map {i=> (i,map1.getOrElse(i,0) + map2.getOrElse(i,0))}.toMap
Well, the Scala library (at least in 2.10) now has something like what you wanted: the merged function. But it's only available on HashMap, not on Map, which is somewhat confusing. The signature is also cumbersome: I can't imagine why I'd need the key twice, or when I'd need to produce a pair with a different key. Nevertheless, it works and is much cleaner than the previous "native" solutions.
val map1 = collection.immutable.HashMap(1 -> 11 , 2 -> 12)
val map2 = collection.immutable.HashMap(1 -> 11 , 2 -> 12)
map1.merged(map2)({ case ((k,v1),(_,v2)) => (k,v1+v2) })
The Scaladoc also mentions that
The merged method is on average more performant than doing a
traversal and reconstructing a new immutable hash map from
scratch, or ++.
This can be implemented as a Monoid with just plain Scala. Here is a sample implementation. With this approach, we can merge not just 2, but a list of maps.
// Monoid trait
trait Monoid[M] {
  def zero: M
  def op(a: M, b: M): M
}
A Map-based implementation of the Monoid trait that merges two maps:
val mapMonoid = new Monoid[Map[Int, Int]] {
  override def zero: Map[Int, Int] = Map()
  override def op(a: Map[Int, Int], b: Map[Int, Int]): Map[Int, Int] =
    (a.keySet ++ b.keySet).map { k =>
      (k, a.getOrElse(k, 0) + b.getOrElse(k, 0))
    }.toMap
}
Now, if you have a list of maps that needs to be merged (in this case, only 2), it can be done like below.
val map1 = Map(1 -> 9 , 2 -> 20)
val map2 = Map(1 -> 100, 3 -> 300)
val maps = List(map1, map2) // The list can have more maps.
val merged = maps.foldLeft(mapMonoid.zero)(mapMonoid.op)
map1 ++ ( for ( (k,v) <- map2 ) yield ( k -> ( v + map1.getOrElse(k,0) ) ) )
I wrote a blog post about this, check it out:
http://www.nimrodstech.com/scala-map-merge/
Basically, using a Scalaz Semigroup you can achieve this pretty easily. It would look something like:
import scalaz.Scalaz._
map1 |+| map2
You can also do that with Cats.
import cats.implicits._
val map1 = Map(1 -> 9 , 2 -> 20)
val map2 = Map(1 -> 100, 3 -> 300)
map1 combine map2 // Map(2 -> 20, 1 -> 109, 3 -> 300)
Starting in Scala 2.13, another solution based only on the standard library replaces the groupBy part of your solution with groupMapReduce, which (as its name suggests) is equivalent to a groupBy followed by mapValues and a reduce step:
// val map1 = Map(1 -> 9, 2 -> 20)
// val map2 = Map(1 -> 100, 3 -> 300)
(map1.toSeq ++ map2).groupMapReduce(_._1)(_._2)(_+_)
// Map[Int,Int] = Map(2 -> 20, 1 -> 109, 3 -> 300)
This:
Concatenates the two maps as a sequence of tuples (List((1,9), (2,20), (1,100), (3,300))). For conciseness, map2 is implicitly converted to Seq to adapt to the type of map1.toSeq - but you could choose to make it explicit by using map2.toSeq,
groups elements based on their first tuple part (group part of groupMapReduce),
maps grouped values to their second tuple part (map part of groupMapReduce),
reduces mapped values (_+_) by summing them (reduce part of groupMapReduce).
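As a side note, the same approach extends naturally to any number of maps (a sketch, still assuming Scala 2.13+):

val maps = List(map1, map2) // could hold more than two maps
// flatten every map into one sequence of pairs, then group/map/reduce as above
maps.flatten.groupMapReduce(_._1)(_._2)(_ + _)
// Map(1 -> 109, 2 -> 20, 3 -> 300)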
Andrzej Doyle's answer contains a great explanation of semigroups which allows you to use the |+| operator to join two maps and sum the values for matching keys.
There are many ways something can be defined to be an instance of a typeclass, and unlike the OP you might not want to sum the values specifically. Or, you might want to operate on an intersection rather than a union. Scalaz also adds extra functions to Map for this purpose:
https://oss.sonatype.org/service/local/repositories/snapshots/archive/org/scalaz/scalaz_2.11/7.3.0-SNAPSHOT/scalaz_2.11-7.3.0-SNAPSHOT-javadoc.jar/!/index.html#scalaz.std.MapFunctions
You can do
import scalaz.Scalaz._
map1 |+| map2 // As per other answers
map1.intersectWith(map2)(_ + _) // Do things other than sum the values
The fastest and simplest way:
val m1 = Map(1 -> 1.0, 3 -> 3.0, 5 -> 5.2)
val m2 = Map(0 -> 10.0, 3 -> 3.0)
val merged = m2.foldLeft(m1) { (acc, v) =>
  acc + (v._1 -> (v._2 + acc.getOrElse(v._1, 0.0)))
}
This way, each element is added to the accumulated map immediately.
The second ++ way is:
map1 ++ map2.map { case (k,v) => k -> (v + map1.getOrElse(k,0)) }
Unlike the first way, the second way builds a transformed copy of the second map (the case expression destructures each entry via the tuple's unapply method) and then concatenates it to the first map.
Here's what I ended up using:
(a.toSeq ++ b.toSeq).groupBy(_._1).mapValues(_.map(_._2).sum)
This is what I came up with...
def mergeMap(m1: Map[Char, Int], m2: Map[Char, Int]): Map[Char, Int] = {
  var map: Map[Char, Int] = Map[Char, Int]() ++ m1
  for (p <- m2) {
    map = map + (p._1 -> (p._2 + map.getOrElse(p._1, 0)))
  }
  map
}
Using the typeclass pattern, we can merge any Numeric type:
object MapSyntax {
  implicit class MapOps[A, B](a: Map[A, B]) {
    def plus(b: Map[A, B])(implicit num: Numeric[B]): Map[A, B] = {
      b ++ a.map { case (key, value) => key -> num.plus(value, b.getOrElse(key, num.zero)) }
    }
  }
}
Usage:
import MapSyntax.MapOps
map1 plus map2
Merging a sequence of maps:
maps.reduce(_ plus _)
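One caveat: reduce throws on an empty sequence, so if the list of maps might be empty, folding from an empty map is safer (a sketch, assuming Map[String, Int] values just for concreteness):

import MapSyntax.MapOps

val maps: Seq[Map[String, Int]] = Seq(Map("a" -> 1), Map("a" -> 2, "b" -> 3))
// Folding from an empty map means an empty sequence simply yields Map()
val merged = maps.foldLeft(Map.empty[String, Int])(_ plus _)
// Map("a" -> 3, "b" -> 3)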
I've got a small function to do the job; it's in my small library of frequently used functionality that isn't in the standard library.
It should work for all types of maps, mutable and immutable, not only HashMaps.
Here is the usage:
scala> import com.daodecode.scalax.collection.extensions._
scala> val merged = Map("1" -> 1, "2" -> 2).mergedWith(Map("1" -> 1, "2" -> 2))(_ + _)
merged: scala.collection.immutable.Map[String,Int] = Map(1 -> 2, 2 -> 4)
https://github.com/jozic/scalax-collection/blob/master/README.md#mergedwith
And here's the body:
def mergedWith(another: Map[K, V])(f: (V, V) => V): Repr =
  if (another.isEmpty) mapLike.asInstanceOf[Repr]
  else {
    val mapBuilder = new mutable.MapBuilder[K, V, Repr](mapLike.asInstanceOf[Repr])
    another.foreach { case (k, v) =>
      mapLike.get(k) match {
        case Some(ev) => mapBuilder += k -> f(ev, v)
        case _ => mapBuilder += k -> v
      }
    }
    mapBuilder.result()
  }
https://github.com/jozic/scalax-collection/blob/master/src%2Fmain%2Fscala%2Fcom%2Fdaodecode%2Fscalax%2Fcollection%2Fextensions%2Fpackage.scala#L190
For anyone coming across an AnyVal error, convert the values as follows.
Error:
"could not find implicit value for parameter num: Numeric[AnyVal]"
(m1.toSeq ++ m2.toSeq).groupBy(_._1).mapValues(_.map(_._2.asInstanceOf[Number].intValue()).sum)
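The Numeric[AnyVal] error usually means the map's value type was inferred as AnyVal (for example because the values mixed Int and Double); a hedged alternative to the cast is to pin the value type explicitly so the right Numeric instance can be found:

// With the value type fixed to Int, Numeric[Int] resolves and plain .sum works
// without any asInstanceOf.
val m1: Map[String, Int] = Map("a" -> 1, "b" -> 2)
val m2: Map[String, Int] = Map("a" -> 10, "c" -> 3)
(m1.toSeq ++ m2.toSeq).groupBy(_._1).mapValues(_.map(_._2).sum)
// Map("a" -> 11, "b" -> 2, "c" -> 3)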