How to sum a List[(Char,Int)] into a Map[Char,Int] in Scala?

I've got a list of pairs:
List(('a',3),('b',3),('a',1))
and I would like to transform it by grouping on _1 and summing _2. The result should look like
Map('a'->4, 'b' -> 3)
I'm very new to Scala, so please be kind :)

A more direct version: we fold over the list, using a Map as the accumulator. The withDefaultValue means we don't have to test whether we already have an entry in the map.
val xs = List(('a',3),('b',3),('a',1))
xs.foldLeft(Map[Char, Int]() withDefaultValue 0)
{case (m, (c, i)) => m updated (c,m(c)+i)}
//> res0: scala.collection.immutable.Map[Char,Int] = Map(a -> 4, b -> 3)
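For comparison, here is a sketch of the same fold without withDefaultValue, using getOrElse to supply the missing-key default instead (same xs as above):
xs.foldLeft(Map[Char, Int]()) {
  case (m, (c, i)) => m.updated(c, m.getOrElse(c, 0) + i)  // getOrElse plays the role of the default value
}
//> Map(a -> 4, b -> 3)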

list.groupBy(_._1).mapValues(_.map(_._2).sum)
which can be written as
list.groupBy(_._1).mapValues { tuples =>
val ints = tuples.map { case (c, i) => i }
ints.sum
}
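On Scala 2.13 or later, groupMapReduce (which also comes up in a related answer below) should do the grouping, the projection to _2 and the summing in a single pass; a sketch:
list.groupMapReduce(_._1)(_._2)(_ + _)  // group by _1, map each pair to _2, reduce with +
//> Map(a -> 4, b -> 3)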

Related

foldLeft : if exists add to a map else update values

I wanted to try out foldLeft, similar to the reduceByKey function.
If the letter already exists, increment its count by the value; otherwise add the tuple to the HashMap.
The code below fails:
val output = input.toLowerCase.filter(Character.isLetter).map(x => (x,1)).foldLeft(HashMap.empty[Char,Int].withDefaultValue(0)){case (acc,(x,y))=> acc += x }
Please suggest.
With Scala 2.13 you can use the new groupMapReduce().
val output = "In-Pint".collect{case c if c.isLetter => c.toLower}
.groupMapReduce(identity)(_ => 1)(_+_)
//output: Map[Char,Int] = Map(p -> 1, t -> 1, i -> 2, n -> 2)
Breaking down your code snippet:
.toLowerCase.filter(Character.isLetter)
As showcased in #jwvh's answer, this can be simplified to .collect{case c if c.isLetter => c.toLower}
.map(x => (x, 1))
This transformation is unnecessary if you intend to use foldLeft.
.foldLeft(HashMap.empty[Char,Int].withDefaultValue(0)){case (acc, (x,y)) => acc += x}
This wouldn't compile: an immutable HashMap has no += method, so acc += x desugars to acc = acc + x, and acc, being foldLeft's accumulator parameter, cannot be reassigned (nor is a bare Char something you can add to a Map[Char,Int]).
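As an aside, += does exist on a mutable map, where it adds a key/value pair and returns the map itself, so a mutable sketch along these lines should compile:
import scala.collection.mutable
"In-Pint".toLowerCase.filter(Character.isLetter)
  .foldLeft(mutable.HashMap.empty[Char, Int]) { case (acc, c) =>
    acc += (c -> (acc.getOrElse(c, 0) + 1))  // += returns the map, so the fold keeps threading it
  }
//> HashMap(i -> 2, n -> 2, p -> 1, t -> 1)
The immutable approach below is the more idiomatic fix, though.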
For counting distinct characters in a string, your foldLeft can be formulated as shown below:
"abac".foldLeft(Map[Char, Int]()){
case (m, c) => m + (c -> (m.getOrElse(c, 0) + 1))
}
// res1: scala.collection.immutable.Map[Char,Int] = Map(a -> 2, b -> 1, c -> 1)
The idea is simple: in foldLeft's binary operator we add c -> m(c) + 1 to the accumulated Map if c is already there; otherwise getOrElse falls back to 0 and we add c -> 0 + 1.
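And if you want to keep the original shape (the (x, 1) pairs plus withDefaultValue), here is a sketch that compiles, using updated on the immutable accumulator instead of += (plain Map rather than HashMap, for brevity):
val output = input.toLowerCase.filter(Character.isLetter)
  .map(x => (x, 1))
  .foldLeft(Map.empty[Char, Int].withDefaultValue(0)) {
    case (acc, (x, y)) => acc.updated(x, acc(x) + y)  // updated returns a new Map; no mutation needed
  }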

Reduce/fold over scala sequence with grouping

In Scala, given an Iterable of pairs, say Iterable[(String, Int)],
is there a way to accumulate or fold over the ._2 values grouped by the ._1 values? For example, in the following, add up all the numbers that come after "A" and, separately, those that come after "B":
List(("A", 2), ("B", 1), ("A", 3))
I could do this in two steps with groupBy:
val mapBy1 = list.groupBy( _._1 )
for ((key,sublist) <- mapBy1) yield (key, sublist.foldLeft(0) (_+_._2))
but then I would be allocating the sublists, which I would rather avoid.
You could build the Map as you go and convert it back to a List after the fact.
listOfPairs.foldLeft(Map[String,Int]().withDefaultValue(0)){
case (m,(k,v)) => m + (k -> (v + m(k)))
}.toList
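Applied to the example list from the question (result ordering may vary):
val listOfPairs = List(("A", 2), ("B", 1), ("A", 3))
listOfPairs.foldLeft(Map[String, Int]().withDefaultValue(0)) {
  case (m, (k, v)) => m + (k -> (v + m(k)))
}.toList
//> List((A,5), (B,1))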
You could do something like:
list.foldLeft(Map[String, Int]()) {
case (map, (k,v)) => map + (k -> (map.getOrElse(k, 0) + v))
}
You could also use groupBy with mapValues:
list.groupBy(_._1).mapValues(_.map(_._2).sum).toList
res1: List[(String, Int)] = List((A,5), (B,1))
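On Scala 2.13+, groupMapReduce should give the same result without materializing the per-key sublists, since it reduces while it groups; a sketch:
list.groupMapReduce(_._1)(_._2)(_ + _).toList
//> List((A,5), (B,1))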

How to un-nest a spark rdd that has the following type ((String, scala.collection.immutable.Map[String,scala.collection.immutable.Map[String,Int]]))

It's a nested map; the contents look like this when I print it to the screen:
(5, Map("ABCD" -> Map("3200" -> 3,
                      "3350.800" -> 4,
                      "200.300" -> 3)))
(1, Map("DEF" -> Map("1200" -> 32,
                     "1320.800" -> 4,
                     "2100" -> 3)))
I need to get something like this
CaseClass(5, ABCD, 3200, 3)
CaseClass(5, ABCD, 3350.800, 4)
CaseClass(5, ABCD, 200.300, 3)
CaseClass(1, DEF, 1200, 32)
CaseClass(1, DEF, 1320.800, 4)
etc.
basically a list of case class instances,
mapped to a case class so that I can save them to Cassandra.
I have tried flatMapValues, but that un-nests the map only one level. I also used flatMap; that doesn't work either, or I'm making mistakes.
Any suggestions?
Fairly straightforward using a for-comprehension and some pattern matching to destructure things:
val in = List((5, Map ( "ABCD" -> Map("3200" -> 3, "3350.800" -> 4, "200.300" -> 3))),
(1, Map ("DEF" -> Map("1200" -> 32, "1320.800" -> 4, "2100" -> 3))))
case class Thing(a:Int, b:String, c:String, d:Int)
for { (index, m) <- in
(k,v) <-m
(innerK, innerV) <- v}
yield Thing(index, k, innerK, innerV)
//> res0: List[maps.maps2.Thing] = List(Thing(5,ABCD,3200,3),
// Thing(5,ABCD,3350.800,4),
// Thing(5,ABCD,200.300,3),
// Thing(1,DEF,1200,32),
// Thing(1,DEF,1320.800,4),
// Thing(1,DEF,2100,3))
So let's pick apart the for-comprehension.
(index, m) <- in
This is the same as
t <- in
(index, m) = t
In the first line t will successively be set to each element of in.
t is therefore a tuple (Int, Map(...))
Pattern matching lets us put that pattern for the tuple on the left-hand side, and the compiler picks the tuple apart, setting index to the Int and m to the Map.
(k, v) <- m
As before, this is equivalent to
u <- m
(k, v) = u
And this time u takes each element of the Map, which again are key/value tuples. So k is set successively to each key and v to the corresponding value.
And v is the inner map, so we do the same thing again with it:
(innerK, innerV) <- v
Now we have everything we need to create the case class. yield just says make a collection of whatever is "yielded" each time through the loop.
yield Thing(index, k, innerK, innerV)
Under the hood, this just translates to a chain of map and flatMap calls.
The yield is just the value Thing(index, k, innerK, innerV)
We get one of those for each element of v
v.map{ x => val (innerK, innerV) = x; Thing(index, k, innerK, innerV) }
but there's an inner map per element of the outer map
m.flatMap{ y => val (k, v) = y; v.map{ x => val (innerK, innerV) = x; Thing(index, k, innerK, innerV) } }
(flatMap because we get a List of Lists if we just did a map and we want to flatten it to just the list of items)
Similarly, we do one of those for every element in the List
in.flatMap{ z => val (index, m) = z; m.flatMap{ y => val (k, v) = y; v.map{ x => val (innerK, innerV) = x; Thing(index, k, innerK, innerV) } } }
Let's do that in _1/_2 style:
in.flatMap{ z => z._2.flatMap{ y => y._2.map{ x => Thing(z._1, y._1, x._1, x._2) } } }
which produces exactly the same result. But isn't it clearer as a for-comprehension?
You can do it like this if you prefer collection operations:
case class Record(v1: Int, v2: String, v3: Double, v4: Int)
val data = List(
(5, Map ( "ABC" ->
Map(
3200.0 -> 3,
3350.800 -> 4,
200.300 -> 3))
),
(1, Map ( "DEF" ->
Map(
1200.0 -> 32,
1320.800 -> 4,
2100.0 -> 3))
)
)
val rdd = sc.parallelize(data)
val result = rdd.flatMap(p => {
p._2.toList
.flatMap(q => q._2.toList.map(l => (q._1, l)))
.map((p._1, _))
}).map(p => Record(p._1, p._2._1, p._2._2._1, p._2._2._2))
println(result.collect.toList)
//List(
// Record(5,ABC,3200.0,3),
// Record(5,ABC,3350.8,4),
// Record(5,ABC,200.3,3),
// Record(1,DEF,1200.0,32),
// Record(1,DEF,1320.8,4),
// Record(1,DEF,2100.0,3)
//)

Fold from Map[String,List[Int]] to Map[String,Int]

I'm fairly new to Scala and functional approaches in general. I have a Map that looks something like this:
val myMap: Map[String, List[Int]]
I want to end up with something that maps each key to the total of its associated list:
val totalsMap: Map[String, Int]
My initial hunch was to use a for comprehension:
val totalsMap = for (kvPair <- myMap) {
kvPair._2.foldLeft(0)(_+_)
}
But I have no idea what I would put in the yield() clause in order to get a map out of the for comprehension.
You can use mapValues for this,
val totalMap = myMap.mapValues(_.sum)
But mapValues will recalculate the sum every time you look up a key in the Map, e.g. if you call totalMap("a") multiple times, it will recalculate the sum each time.
If you don't want this, you should use
val totalMap = myMap map {
case (k, v) => k -> v.sum
}
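On Scala 2.13, where mapValues on a Map is deprecated in favour of the view-based form, the strict equivalent would be along these lines:
val totalMap = myMap.view.mapValues(_.sum).toMap  // toMap forces the view, so each sum is computed once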
mapValues would be more suited for this case:
val m = Map[String, List[Int]]("a" -> List(1,2,3), "b" -> List(4,5,6))
m.mapValues(_.foldLeft(0)(_+_))
res1: scala.collection.immutable.Map[String,Int] = Map(a -> 6, b -> 15)
Or without foldLeft:
m.mapValues(_.sum)
val m = Map("hello" -> Seq(1, 1, 1, 1), "world" -> Seq(1, 1))
for ((k, v) <- m) yield (k, v.sum)
yields
Map(hello -> 4, world -> 2)
The for comprehension will return whatever monadic type you give it. In this case, m is a Map, so that's what's going to come out. The yield must return a tuple. The first element (which becomes the key in each Map entry) is the word you're counting, and the second element (you guessed it, the value in each Map entry) becomes the sum of the original sequence of counts.

Rules on using a case statement to destruct a tuple in Scala

I have the following code:
val xs = List(('a', 1), ('a', 2), ('b', 3), ('b', 4))
I want to transform this into a Map. e.g. Map('a' -> Seq(1,2), 'b' -> Seq(3,4)). So I proceed to write the transformation:
xs.groupBy(_._1) map {
case (k, v) => (k, v.map(_._2))
}
Why does the argument to map need to be wrapped in braces { }? When I started, I assumed I could do the following:
xs.groupBy(_._1).map(case (k, v) => (k, v.map(_._2)))
But that doesn't compile.
Because the .map method accepts a function.
What you've actually written is
map({
case (k, v) => (k, v.map(_._2))
})
and the { case (k, v) => (k, v.map(_._2)) } is a shorthand definition for a pattern matching anonymous function (SLS §8.5), which is one kind of function literal:
val isOdd: PartialFunction[Int, String] = {
case x if x % 2 == 1 => x+" is odd"
}
val upcastedIsOdd: Function[Int, String] = {
case x if x % 2 == 1 => x+" is odd"
}
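The practical difference shows up when the input doesn't match: the PartialFunction can be queried up front, while the plain Function1 just blows up. A quick sketch:
isOdd.isDefinedAt(2)   //> false
isOdd(3)               //> "3 is odd"
upcastedIsOdd(2)       //> throws scala.MatchError at runtime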
You cannot omit the curly braces (you would lose the partial function and pattern matching niceties), but you can skip the plain parentheses (and still retain the partial function), just like in the snippet below:
scala> List(1,2,3).take(1)
//res0: List[Int] = List(1)
scala> List(1,2,3) take 1
//res1: List[Int] = List(1)
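For the original xs example, a parenthesized call does compile once you pass an ordinary function instead of a case sequence; a sketch (result ordering may vary):
xs.groupBy(_._1).map(t => (t._1, t._2.map(_._2)))
//> Map(a -> List(1, 2), b -> List(3, 4))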
It seems the real question here is when one can use parentheses ( ) in place of braces { } to represent an anonymous function. I recommend having a look at Daniel Sobral's answer to the question: What is the formal difference in Scala between braces and parentheses, and when should they be used?