Applying an operation to corresponding elements of an Array - Scala

I want to sum the corresponding elements of the inner array and multiply the results, while keeping the label associated with the array element, so
("a",Array((0.5,1.0),(0.667,2.0)))
becomes:
(a, (0.5 + 0.667) * (1.0 + 2.0))
Here is my code to express this for a single array element:
val data = Array(("a",Array((0.5,1.0),(0.667,2.0))), ("b",Array((0.6,2.0), (0.6,2.0))))
//> data : Array[(String, Array[(Double, Double)])] = Array((a,Array((0.5,1.0),
//| (0.667,2.0))), (b,Array((0.6,2.0), (0.6,2.0))))
val v1 = (data(0)._1, data(0)._2.map(m => m._1).sum)
//> v1 : (String, Double) = (a,1.167)
val v2 = (data(0)._1, data(0)._2.map(m => m._2).sum)
//> v2 : (String, Double) = (a,3.0)
val total = (v1._1 , (v1._2 * v2._2)) //> total : (String, Double) = (a,3.5010000000000003)
I just want to apply this function to all elements of the array, so that the val data above becomes:
Map[String, Double] = Map(a -> 3.5010000000000003, b -> 4.8)
But I'm not sure how to combine the above code into a single function that maps over all the array elements?
Update: the inner Array can be of variable length, so this is also valid:
val data = Array(("a",Array((0.5,1.0,2.0),(0.667,2.0,1.0))), ("b",Array((0.6,2.0), (0.6,2.0))))

Pattern matching is your friend! You can use it for tuples and arrays. If there are always two elements in the inner array, you can do it this way:
val data = Array(("a",Array((0.5,1.0),(0.667,2.0))), ("b",Array((0.6,2.0), (0.6,2.0))))
data.map {
  case (s, Array((x1, x2), (x3, x4))) => s -> (x1 + x3) * (x2 + x4)
}
// Array[(String, Double)] = Array((a,3.5010000000000003), (b,4.8))
res6.toMap // res6 is the REPL's name for the result above
// scala.collection.immutable.Map[String,Double] = Map(a -> 3.5010000000000003, b -> 4.8)
If the inner elements are variable length, you could do it this way (a for comprehension instead of explicit maps):
for {
  (s, tuples) <- data
  sum1 = tuples.map(_._1).sum
  sum2 = tuples.map(_._2).sum
} yield s -> sum1 * sum2
Note that while this is a very clear solution, it's not the most efficient possible, because we're iterating over the tuples twice. You could use a fold instead (sketched below), but it would be much harder to read (for me, anyway :)).
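For reference, a single-pass version with foldLeft might look like this (just a sketch, assuming Tuple2 entries as in the original data):
for {
  (s, tuples) <- data
  // accumulate both component sums in a single traversal
  (sum1, sum2) = tuples.foldLeft((0.0, 0.0)) {
    case ((acc1, acc2), (x1, x2)) => (acc1 + x1, acc2 + x2)
  }
} yield s -> sum1 * sum2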
Finally, note that .sum will produce zero on an empty collection. If that's not what you want, you could do this instead:
val emptyDefault = 1.0 // Or whatever, depends on your use case
for {
  (s, tuples) <- data
  sum1 = tuples.map(_._1).reduceLeftOption(_ + _).getOrElse(emptyDefault)
  sum2 = tuples.map(_._2).reduceLeftOption(_ + _).getOrElse(emptyDefault)
} yield s -> sum1 * sum2

You can use the Algebird numeric library:
val data = Array(("a",Array((0.5,1.0),(0.667,2.0))), ("b",Array((0.6,2.0), (0.6,2.0))))
import com.twitter.algebird.Operators._
def sumAndProduct(a: Array[(Double, Double)]) = {
  val sums = a.reduceLeft((m, n) => m + n)
  sums._1 * sums._2
}
data.map { case (x, y) => (x, sumAndProduct(y)) }
// Array((a,3.5010000000000003), (b,4.8))
It will work fine for variable-size arrays as well.
val data = Array(("a",Array((0.5,1.0))), ("b",Array((0.6,2.0), (0.6,2.0))))
// Array((a,0.5), (b,4.8))
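The Operators._ import is what makes m + n compile for tuples: Algebird supplies element-wise addition via its Semigroup instances, so (roughly):
import com.twitter.algebird.Operators._
(0.5, 1.0) + (0.667, 2.0) // tuples are summed component-wise: (1.167, 3.0)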

Like this? Does your array always have only 2 pairs?
val m = data map { case (label, Array(a, b)) => (label, (a._1 + b._1) * (a._2 + b._2)) }
m.toMap

Related

How to sum up pair elements individually in Scala

I have the following method to sum up the pair elements in an array of pairs. I am new to Scala and feel like there must be a better way than the following piece of code.
def accumulate(results: Array[(Int, Int)]): (Int, Int) = {
  var x: Int = 0
  var y: Int = 0
  for (elem <- results) {
    x = x + elem._1
    y = y + elem._2
  }
  (x, y)
}
Yes, you can use foldLeft.
(BTW, I would also use List instead of Array.)
results.foldLeft((0, 0)) {
  case ((accX, accY), (x, y)) =>
    (accX + x, accY + y)
}
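Wrapped back into the original method signature, for reference (keeping Array, though List works the same way):
def accumulate(results: Array[(Int, Int)]): (Int, Int) =
  results.foldLeft((0, 0)) { case ((accX, accY), (x, y)) =>
    (accX + x, accY + y)
  }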
All of the operations in scala.collection.ArrayOps are available on Array[T]. In particular, you can unzip an array of pairs into a pair of arrays:
val (xs, ys) = results.unzip
Summing a container is a standard use of fold:
val x = xs.fold(0)(_ + _)
val y = ys.fold(0)(_ + _)
And then you can return the pair of values
(x, y)
https://scalafiddle.io/sf/meEKv6T/0 has a complete working example.
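Since the unzipped arrays are numeric, the two folds can also be collapsed to sum (an equivalent shorthand):
val (xs, ys) = results.unzip
(xs.sum, ys.sum) // same result as the folds above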

Failed to print count result in for loop

I have been trying to count inside a foreach loop, but the count just comes out as parentheses. I am just printing out the key of the map here.
var count = 0
xs.foreach(x => (myMap += ((count += 1).toString + "," + java.util.UUID.randomUUID.toString -> x)))
Output:
(),901e9926-be1e-4dc4-b3e3-6c3b2feea2c4
Expected output:
1,901e9926-be1e-4dc4-b3e3-6c3b2feea2c4
Within your foreach, count += 1 would be of type Unit. If I understand your question correctly, the example below (using an arbitrary xs collection) might be what you're looking for:
val xs = List("a", "b", "c", "d")
var count = 0
var myMap = Map[String, String]()
xs.foreach { x =>
  count += 1
  myMap += ((count.toString + "," + java.util.UUID.randomUUID.toString) -> x)
}
myMap.keys
// res1: Iterable[String] = Set(
// 1,bd971c44-b9d0-41a0-b59f-3acbf2e0dee0, 2,5459eed9-309d-4f9c-afd7-10aced9df2a0,
// 3,5816ea42-d8ed-4beb-8b30-0376d0674700, 4,30f6f22f-1e6d-4eec-86af-5bc6734d5196
// )
In case you want a more idiomatic approach, using zip for the count and foldLeft for the Map aggregation would produce a similar result:
val myMap = Map[String, String]()
val resultMap = xs.zip(Stream from 1).foldLeft(myMap)(
  (m, x) => m + ((x._2.toString + "," + java.util.UUID.randomUUID.toString) -> x._1)
)
What you are printing here is actually (count += 1).toString. In Scala, an assignment like this evaluates to Unit, which prints as (). That's why you print () and not the value of count. If you check the count variable afterwards, you will see that it has been incremented as expected.
Additionally, what you are trying to do could be expressed in a better way, e.g. you could do:
val myMap = xs.zipWithIndex.map(x => (x._2 + 1) + "," + java.util.UUID.randomUUID -> x._1).toMap

Multiplication of Double values in Scala

I want to multiply two sparse matrices in Spark using Scala. I am passing these matrices as arguments and storing the result in another argument.
The matrices are text files where each matrix element is represented as: row, column, value.
I am not able to multiply two Double values in Scala.
import org.apache.spark.{SparkConf, SparkContext}

object MultiplySpark {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Multiply")
    conf.setMaster("local[2]")
    val sc = new SparkContext(conf)

    val M = sc.textFile(args(0)).flatMap(entry => {
      val rec = entry.split(",")
      val row = rec(0).toInt
      val column = rec(1).toInt
      val value = rec(2).toDouble
      for { pointer <- 1 until rec.length } yield ((row, column), value)
    })
    val N = sc.textFile(args(0)).flatMap(entry => {
      val rec = entry.split(",")
      val row = rec(0).toInt
      val column = rec(1).toInt
      val value = rec(2).toDouble
      for { pointer <- 1 until rec.length } yield ((row, column), value)
    })

    val Mmap = M.map(e => (e._2, e))
    val Nmap = N.map(d => (d._2, d))

    val MNjoin = Mmap.join(Nmap).map { case (k, (e, d)) => e._2.toDouble + "," + d._2.toDouble }

    val result = MNjoin.reduceByKey((a, b) => a * b)
      .map(entry => {
        ((entry._1._1, entry._1._2), entry._2)
      })
      .reduceByKey((a, b) => a + b)

    result.saveAsTextFile(args(2))
    sc.stop()
  }
}
How can I multiply Double values in Scala?
Please note: I tried a.toDouble * b.toDouble
The error is: value * is not a member of (Double, Double)
This reduceByKey would work if you had RDD[((Int, Int), Double)] (or RDD[(SomeType, Double)] more generally) and join gives you RDD[((Int, Int), (Double, Double))]. So you are trying to multiply pairs (Double, Double), not Doubles.
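For what it's worth, the usual way to key a sparse matrix multiply in Spark looks roughly like this (a sketch only, assuming M and N are RDD[((Int, Int), Double)] of ((row, col), value) entries as built above):
// Key M by its column index and N by its row index, so the join
// matches up the shared inner dimension j of M(i, j) * N(j, k).
val mByCol = M.map { case ((i, j), v) => (j, (i, v)) }
val nByRow = N.map { case ((j, k), w) => (j, (k, w)) }
val product = mByCol.join(nByRow)
  .map { case (_, ((i, v), (k, w))) => ((i, k), v * w) } // multiply Doubles, not pairs
  .reduceByKey(_ + _)                                    // sum partial products per (i, k)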

Sum values of PairRDD

I have an RDD of type:
dataset: org.apache.spark.rdd.RDD[(String, Double)] = MapPartitionsRDD[26]
which is equivalent to (Pedro, 0.0833), (Hello, 0.001828), ...
I'd like to sum all the values, 0.0833 + 0.001828 + ..., but I can't find a proper solution.
Considering your input data, you can do the following:
// example
val datasets = sc.parallelize(List(("Pedro", 0.0833), ("Hello", 0.001828)))
datasets.map(_._2).sum()
// res3: Double = 0.085128
// or
datasets.map(_._2).reduce(_ + _)
// res4: Double = 0.085128
// or even
datasets.values.sum()
// res5: Double = 0.085128
Like this?
dataset.map(_._2).reduce((x, y) => x + y)
Breakdown: map each tuple to just the double value, then reduce the RDD by summing.

Averaging values in List contained in Map

Here is code I wrote to average coordinate values contained within the values of a Map:
val averaged = Map((2,10) -> List((2.0,11.0), (5.0,8.0)))
//> averaged : scala.collection.immutable.Map[(Int, Int),List[(Double, Double)
//| ]] = Map((2,10) -> List((2.0,11.0), (5.0,8.0)))
averaged.mapValues(m => {
  val s1 = m.map(m => m._1).sum
  val s2 = m.map(m => m._2).sum
  (s1 / m.size, s2 / m.size)
}) //> res0: scala.collection.immutable.Map[(Int, Int),(Double, Double)] = Map((2,
//| 10) -> (3.5,9.5))
This code works as expected, but the mapValues function requires a number of passes equal to the length of the List. Is there a more idiomatic method of achieving the same in Scala?
If I'm understanding your question correctly, you are asking if it is possible to avoid the traversal of m on each access. The mapValues method returns a view of a Map, meaning that there will be repeated work on access. To avoid that, just use map instead:
val averaged = Map((2, 10) -> List((2.0, 11.0), (5.0, 8.0)))
val result = averaged.map {
  case (key, m) =>
    val (s1, s2) = m.unzip
    key -> (s1.sum / m.size, s2.sum / m.size)
}
println(result)
// Map((2,10) -> (3.5,9.5))
Using unzip additionally means that the code won't traverse m more than once.
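If you prefer to keep the mapValues shape on Scala 2.13+, where mapValues lives on views, forcing the view once gives the same strict result (a sketch):
val result = averaged.view.mapValues { m =>
  val (s1, s2) = m.unzip
  (s1.sum / m.size, s2.sum / m.size)
}.toMap // toMap forces the view, so the averaging runs only once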