Idiomatic Scala for nested loop

I am trying to write idiomatic Scala code to loop through two lists of lists and to generate a new list containing only the differences between the two lists.
In procedural Scala I would do something like this:
val first: List[List[Int]] = List(List(1,2,3,4,5), List(1,2,3,4,5), List(1,2,3,4,5))
val second: List[List[Int]] = List(List(1,2,3,4,5), List(1,23,3,45,5), List(1,2,3,4,5))
var diff: List[String] = List[String]()
for (i <- List.range(0, first.size)) {
  for (j <- List.range(0, first(0).size)) {
    println(first(i)(j) + " " + second(i)(j))
    if (first(i)(j) != second(i)(j)) diff = diff ::: (s"${second(i)(j)}" :: Nil)
  }
}
Of course I do not like this. I have attempted to write a solution using a for comprehension, but without success.
The closest thing I could get to was this:
for {(lf,ls) <- (first zip second) } yield if (lf == ls) lf else ls
but from that for comprehension I cannot generate a list of String, since it is a different type from the input one.
Any suggestion?

The idiomatic Scala would be something like this:
(
  for {
    (row1, row2) <- (first, second).zipped   // go through rows with the same index
    (value1, value2) <- (row1, row2).zipped  // go through values with the same indexes
    if value1 != value2                      // leave only different values in the list
  } yield value2.toString
).toList
It's better to use zipped than zip here, because zipped doesn't build the whole zipped List in memory.
Also, you have to call toList at the end, due to a quirk in type inference.
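Note that on newer Scala versions (2.13+), Tuple2.zipped is deprecated in favour of lazyZip. A rough sketch of a 2.13-friendly variant, using plain zip plus collect instead (the intermediate tuples are accepted here for simplicity; lazyZip could be substituted to avoid them):
val diff13: List[String] =
  first.zip(second).flatMap { case (row1, row2) =>
    row1.zip(row2).collect { case (v1, v2) if v1 != v2 => v2.toString }
  }
// diff13: List[String] = List(23, 45)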

Something like this produces the same results
val diff2 = for {
  (firstInner, i) <- first.zipWithIndex
  (v1, j) <- firstInner.zipWithIndex
  v2 = second(i)(j)
  if v1 != v2
} yield s"$v2"
println(diff2 mkString ", ") // prints 23, 45
But this of course fails with an IndexOutOfBoundsException if the lists aren't the same size.

Related

How to access previous element when using yield in for loop chisel3

This is a mixed Chisel / Scala question.
Background: I need to sum up a lot of numbers (the number of input signals is configurable). Due to timing constraints I had to split the sum into groups of 4 and pipe (register) each partial result, which is then fed into the next stage (each stage being 4 times smaller, until I reach one).
This is my code:
// log4 aux function
def log4(n: Int): Int = math.ceil(math.log10(n.toDouble) / math.log10(4.0)).toInt

// stage
def Adder4PipeStage(len: Int, in: Vec[SInt]): Vec[SInt] = {
  require(in.length % 4 == 0) // will not work if not a multiple of 4
  val pipe = RegInit(VecInit(Seq.fill(len / 4)(0.S(in(0).getWidth.W))))
  pipe.zipWithIndex.foreach { case (p, j) => p := in.slice(j * 4, (j + 1) * 4).reduce(_ +& _) }
  pipe
}

// the pipeline
val adderPiped = for (j <- 1 to log4(len)) yield Adder4PipeStage(len / j, if (j == 1) io.in else <what here ?>)
How do I access the previous stage? I am also open to hearing about other ways to implement the above.
There are several things you could do here:
You could just use a var for the "previous" value:
var prev: Vec[SInt] = io.in
val adderPiped = for (j <- 1 to log4(len)) yield {
  prev = Adder4PipeStage(len / j, prev)
  prev
}
It is a little weird using a var with a for yield (since the former is fundamentally mutable while the latter tends to be used with immutable-style code).
You could alternatively use a fold, building up a List:
// Build up backwards and reverse (typical in functional programming)
val adderPiped = (1 to log4(len)).foldLeft(io.in :: Nil) {
  case (pipes, j) => Adder4PipeStage(len / j, pipes.head) :: pipes
}.reverse
 .tail // .tail drops "io.in", which was the 1st element in the result List
If you don't like the backwards construction of the previous fold, you could use a fold with a Vector (better for appending than a List):
val adderPiped = (1 to log4(len)).foldLeft(Vector(io.in)) {
  case (pipes, j) => pipes :+ Adder4PipeStage(len / j, pipes.last)
}.tail // .tail drops "io.in", which was the 1st element in the result Vector
Finally, if you don't like these immutable ways of doing it, you could always just embrace mutability and write something similar to what one would in Java or Python:
For loop and mutable collection
val pipes = new mutable.ArrayBuffer[Vec[SInt]]
for (j <- 1 to log4(len)) {
  pipes += Adder4PipeStage(len / j, if (j == 1) io.in else pipes.last)
}
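Stripped of the Chisel types, the foldLeft chaining pattern looks like this in plain Scala, with Ints standing in for the Vec[SInt] stages (stage and the starting value 256 are made-up stand-ins for Adder4PipeStage and io.in):
// Plain-Scala sketch of the foldLeft chaining pattern above
def stage(prev: Int): Int = prev / 4 // hypothetical stand-in for Adder4PipeStage

val stages = (1 to 3).foldLeft(List(256)) { // 256 stands in for io.in
  case (acc, _) => stage(acc.head) :: acc
}.reverse.tail
// stages: List[Int] = List(64, 16, 4)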

How to add to a list you're returning?

Sorry if this is a stupid question as I am a total beginner. I have a function factors which looks like this:
def factors(n: Int): List[Int] = {
  var xs = List[Int]()
  for (i <- 2 to (n - 1)) {
    if (n % i == 0) { xs :+ i }
  }
  return xs
}
However if I do println(factors(10)) I always get List().
What am I doing wrong?
The :+ operation returns a new List; you never assign it back to xs.
def factors(n: Int): List[Int] = {
  var xs = List[Int]()
  for (i <- 2 to (n - 1)) {
    if (n % i == 0) { xs = xs :+ i }
  }
  return xs
}
But, you really shouldn't be using var. We don't like them very much in Scala.
Also don't don't don't use return in Scala. It is a much more loaded keyword than you might think. Read about it here
Here is a better way of doing this.
def factors(n: Int): List[Int] =
  for {
    i <- (2 to (n - 1)).toList
    if (n % i) == 0
  } yield i
factors(10)
You don't need .toList either, but I didn't want to mess with your return type. You are welcome to adjust.
Working link: https://scastie.scala-lang.org/haGESfhKRxqDdDIpaHXfpw
You can think of this problem as a filtering operation. You start with all the possible factors and you keep the ones where the remainder when dividing the input by that number is 0. The operation that does this in Scala is filter, which keeps values where a particular test is true and removes the others:
def factors(n: Int): List[Int] =
  (2 until n).filter(n % _ == 0).toList
To keep the code short I have also used the short form of a function where _ stands for the argument to the function, so n % _ means the remainder when n is divided by the current number being tested.
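For example, the filter-based version gives the same results as the original:
factors(10) // List(2, 5)
factors(12) // List(2, 3, 4, 6)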

Scala - conditional product/join of two arrays with default values using for comprehensions

I have two Sequences, say:
val first = Array("B", "L", "T")
val second = Array("T70", "B25", "B80", "A50", "M100", "B50")
How do I get a product such that each element of the first array is joined with every element of the second array that startsWith the former, and also yields a default result when no element in the second array meets the condition?
Effectively to get an Output:
expectedProductArray = Array("B-B25", "B-B80", "B-B50", "L-Default", "T-T70")
I tried doing,
val myProductArray: Array[String] = for {
  f <- first
  s <- second if s.startsWith(f)
} yield s"""$f-$s"""
and I get:
myProductArray = Array("B-B25", "B-B80", "B-B50", "T-T70")
Is there an idiomatic way of adding a default value for values in the first sequence that have no corresponding value in the second sequence under the given criteria? Appreciate your thoughts.
Here's one approach by making array second a Map and looking up the Map for elements in array first with getOrElse:
val first = Array("B", "L", "T")
val second = Array("T70", "B25", "B80", "A50", "M100", "B50")
val m = second.groupBy(_(0).toString)
// m: scala.collection.immutable.Map[String,Array[String]] =
// Map(M -> Array(M100), A -> Array(A50), B -> Array(B25, B80, B50), T -> Array(T70))
first.flatMap(x => m.getOrElse(x, Array("Default")).map(x + "-" + _))
// res1: Array[String] = Array(B-B25, B-B80, B-B50, L-Default, T-T70)
In case you prefer using a for comprehension:
for {
  x <- first
  y <- m.getOrElse(x, Array("Default"))
} yield s"$x-$y"

General comprehensions in Scala

As far as I understand, the Scala for-comprehension notation relies on the first generator to define how elements are to be combined. Namely, for (i <- list) yield i returns a list and for (i <- set) yield i returns a set.
I was wondering if there was a way to specify how elements are combined independently of the properties of the first generator. For instance, I would like to get "the set of all elements from a given list", or "the sum of all elements from a given set". The only way I have found is to first build a list or a set as prescribed by the for-comprehension notation, then apply a transformation function to it - building a useless data structure in the process.
What I have in mind is a general "algebraic" comprehension notation as it exists for instance in Ateji PX:
`+ { i | int i : set } // the sum of all elements from a given set
set() { i | int i : list } // the set of all elements from a given list
concat(",") { s | String s : list } // string concatenation with a separator symbol
Here the first element (`+, set(), concat(",")) is a so-called "monoid" that defines how elements are combined, independently of the structure of the first generator (there can be multiple generators and filters, I just tried to keep the examples concise).
Any idea about how to achieve a similar result in Scala while keeping a nice and concise notation ? As far as I understand, the for-comprehension notation is hard-wired in the compiler and cannot be upgraded.
Thanks for your feedback.
About the for comprehension
The for comprehension in Scala is syntactic sugar for calls to flatMap, filter, map and foreach. In exactly the same way as for direct calls to those methods, the type of the target collection determines the type of the returned collection. That is:
list map f //is a List
vector map f // is a Vector
This property is one of the underlying design goals of the Scala collections library and would be seen as desirable in most situations.
Answering the question
You do not need to construct any intermediate collection of course:
(list.view map (_.prop)).toSet //uses list.view
(list.iterator map (_.prop)).toSet //uses iterator
(for { l <- list.view} yield l.prop).toSet //uses view
(Set.empty[Prop] /: coll) { _ + _.prop } //uses foldLeft
These will all yield Sets without generating unnecessary intermediate collections. My personal preference is the first. In terms of idiomatic Scala collection manipulation, each "collection" comes with these methods:
//Conversions
toSeq
toSet
toArray
toList
toIndexedSeq
iterator
toStream
//Strings
mkString
//accumulation
sum
The last is used where the element type of a collection has an implicit Numeric instance in scope; such as:
Set(1, 2, 3, 4).sum //10
Set('a, 'b).sum //does not compile
Note that the String concatenation example in scala looks like:
list.mkString(",")
And in the scalaz FP library might look something like (which uses Monoid to sum Strings):
list.intercalate(",").asMA.sum
Your suggestions do not look anything like Scala; I'm not sure whether they are inspired by another language.
foldLeft? That's what you're describing.
The sum of all elements from a given set:
(0 /: Set(1,2,3))(_ + _)
the set of all elements from a given list
(Set[Int]() /: List(1,2,3,2,1))((acc,x) => acc + x)
String concatenation with a separator symbol:
("" /: List("a", "b"))(_ + _) // (edit - ok concat a bit more verbose:
("" /: List("a", "b"))((acc,x) => acc + (if (acc == "") "" else ",") + x)
You can also force the result type of the for comprehension by explicitly supplying the implicit CanBuildFrom parameter as scala.collection.breakout and specifying the result type.
Consider this REPL session:
scala> val list = List(1, 1, 2, 2, 3, 3)
list: List[Int] = List(1, 1, 2, 2, 3, 3)
scala> val res = for(i <- list) yield i
res: List[Int] = List(1, 1, 2, 2, 3, 3)
scala> val res: Set[Int] = (for(i <- list) yield i)(collection.breakOut)
res: Set[Int] = Set(1, 2, 3)
It results in a type error when not specifying the CanBuildFrom explicitly:
scala> val res: Set[Int] = for(i <- list) yield i
<console>:8: error: type mismatch;
found : List[Int]
required: Set[Int]
val res: Set[Int] = for(i <- list) yield i
^
For a deeper understanding of this I suggest the following read:
http://www.scala-lang.org/docu/files/collections-api/collections-impl.html
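Note that collection.breakOut no longer exists in Scala 2.13. A sketch of the usual 2.13 approach is to convert explicitly at the end, for example going through an iterator to avoid building the intermediate List (nums and asSet are illustrative names):
val nums = List(1, 1, 2, 2, 3, 3)
val asSet: Set[Int] = nums.iterator.map(i => i).to(Set)
// asSet: Set[Int] = Set(1, 2, 3)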
If you want to use for comprehensions and still be able to combine your values into some result value, you could do the following.
case class WithCollector[B, A](init: B)(p: (B, A) => B) {
  var x: B = init
  val collect = { (y: A) => x = p(x, y) }
  def apply(pr: (A => Unit) => Unit) = {
    pr(collect)
    x
  }
}

// Some examples
object Test {
  def main(args: Array[String]): Unit = {
    // It's still functional
    val r1 = WithCollector[Int, Int](0)(_ + _) { collect =>
      for (i <- 1 to 10; if i % 2 == 0; j <- 1 to 3) collect(i + j)
    }
    println(r1) // 120

    import collection.mutable.Set
    val r2 = WithCollector[Set[Int], Int](Set[Int]())(_ += _) { collect =>
      for (i <- 1 to 10; if i % 2 == 0; j <- 1 to 3) collect(i + j)
    }
    println(r2) // Set(9, 10, 11, 6, 13, 4, 12, 3, 7, 8, 5)
  }
}

What is Scala's yield?

I understand Ruby and Python's yield. What does Scala's yield do?
I think the accepted answer is great, but it seems many people have failed to grasp some fundamental points.
First, Scala's for comprehensions are equivalent to Haskell's do notation, and they are nothing more than syntactic sugar for composition of multiple monadic operations. As this statement will most likely not help anyone who needs help, let's try again… :-)
Scala's for comprehensions are syntactic sugar for composition of multiple operations with map, flatMap and filter. Or foreach. Scala actually translates a for-expression into calls to those methods, so any class providing them, or a subset of them, can be used with for comprehensions.
First, let's talk about the translations. There are very simple rules:
This
for(x <- c1; y <- c2; z <- c3) {...}
is translated into
c1.foreach(x => c2.foreach(y => c3.foreach(z => {...})))
This
for(x <- c1; y <- c2; z <- c3) yield {...}
is translated into
c1.flatMap(x => c2.flatMap(y => c3.map(z => {...})))
This
for(x <- c; if cond) yield {...}
is translated on Scala 2.7 into
c.filter(x => cond).map(x => {...})
or, on Scala 2.8, into
c.withFilter(x => cond).map(x => {...})
with a fallback into the former if method withFilter is not available but filter is. Please see the section below for more information on this.
This
for(x <- c; y = ...) yield {...}
is translated into
c.map(x => (x, ...)).map((x,y) => {...})
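To make the flatMap/map rule concrete, here is a small, hand-desugared example (the compiler's actual output may differ cosmetically):
val a = for (x <- List(1, 2); y <- List(10, 20)) yield (x, y)
val b = List(1, 2).flatMap(x => List(10, 20).map(y => (x, y)))
// a == b == List((1,10), (1,20), (2,10), (2,20))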
When you look at very simple for comprehensions, the map/foreach alternatives look, indeed, better. Once you start composing them, though, you can easily get lost in parenthesis and nesting levels. When that happens, for comprehensions are usually much clearer.
I'll show one simple example, and intentionally omit any explanation. You can decide which syntax was easier to understand.
l.flatMap(sl => sl.filter(el => el > 0).map(el => el.toString.length))
or
for {
sl <- l
el <- sl
if el > 0
} yield el.toString.length
withFilter
Scala 2.8 introduced a method called withFilter, whose main difference is that, instead of returning a new, filtered, collection, it filters on-demand. The filter method has its behavior defined based on the strictness of the collection. To understand this better, let's take a look at some Scala 2.7 with List (strict) and Stream (non-strict):
scala> var found = false
found: Boolean = false
scala> List.range(1,10).filter(_ % 2 == 1 && !found).foreach(x => if (x == 5) found = true else println(x))
1
3
7
9
scala> found = false
found: Boolean = false
scala> Stream.range(1,10).filter(_ % 2 == 1 && !found).foreach(x => if (x == 5) found = true else println(x))
1
3
The difference happens because filter is immediately applied with List, returning a list of odds -- since found is false. Only then foreach is executed, but, by this time, changing found is meaningless, as filter has already executed.
In the case of Stream, the condition is not immediately applied. Instead, as each element is requested by foreach, filter tests the condition, which enables foreach to influence it through found. Just to make it clear, here is the equivalent for-comprehension code:
for (x <- List.range(1, 10); if x % 2 == 1 && !found)
if (x == 5) found = true else println(x)
for (x <- Stream.range(1, 10); if x % 2 == 1 && !found)
if (x == 5) found = true else println(x)
This caused many problems, because people expected the if to be considered on-demand, instead of being applied to the whole collection beforehand.
Scala 2.8 introduced withFilter, which is always non-strict, no matter the strictness of the collection. The following example shows List with both methods on Scala 2.8:
scala> var found = false
found: Boolean = false
scala> List.range(1,10).filter(_ % 2 == 1 && !found).foreach(x => if (x == 5) found = true else println(x))
1
3
7
9
scala> found = false
found: Boolean = false
scala> List.range(1,10).withFilter(_ % 2 == 1 && !found).foreach(x => if (x == 5) found = true else println(x))
1
3
This produces the result most people expect, without changing how filter behaves. As a side note, Range was changed from non-strict to strict between Scala 2.7 and Scala 2.8.
It is used in sequence comprehensions (like Python's list-comprehensions and generators, where you may use yield too).
It is applied in combination with for and writes a new element into the resulting sequence.
Simple example (from scala-lang)
/** Turn command line arguments to uppercase */
object Main {
def main(args: Array[String]) {
val res = for (a <- args) yield a.toUpperCase
println("Arguments: " + res.toString)
}
}
The corresponding expression in F# would be
[ for a in args -> a.toUpperCase ]
or
from a in args select a.toUpperCase
in Linq.
Ruby's yield has a different effect.
Yes, as Earwicker said, it's pretty much the equivalent to LINQ's select and has very little to do with Ruby's and Python's yield. Basically, where in C# you would write
from ... select ???
in Scala you have instead
for ... yield ???
It's also important to understand that for-comprehensions don't just work with sequences, but with any type which defines certain methods, just like LINQ:
If your type defines just map, it allows for-expressions consisting of a single generator.
If it defines flatMap as well as map, it allows for-expressions consisting of several generators.
If it defines foreach, it allows for-loops without yield (both with single and multiple generators).
If it defines filter, it allows for-filter expressions starting with an if in the for expression.
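To illustrate, here is a minimal, made-up container that becomes usable in for-expressions with multiple generators simply by defining map and flatMap (it defines neither foreach nor withFilter, so yield-less loops and if guards are not available on it):
// Box is a hypothetical type, shown only to demonstrate the desugaring rules
final case class Box[A](value: A) {
  def map[B](f: A => B): Box[B] = Box(f(value))
  def flatMap[B](f: A => Box[B]): Box[B] = f(value)
}

val product = for {
  a <- Box(2)
  b <- Box(3)
} yield a * b
// product: Box[Int] = Box(6)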
Unless you get a better answer from a Scala user (which I'm not), here's my understanding.
It only appears as part of an expression beginning with for, which states how to generate a new list from an existing list.
Something like:
var doubled = for (n <- original) yield n * 2
So there's one output item for each input (although I believe there's a way of dropping duplicates).
This is quite different from the "imperative continuations" enabled by yield in other languages, where it provides a way to generate a list of any length, from some imperative code with almost any structure.
(If you're familiar with C#, it's closer to LINQ's select operator than it is to yield return).
Consider the following for-comprehension
val A = for (i <- Int.MinValue to Int.MaxValue; if i > 3) yield i
It may be helpful to read it out loud as follows
"For each integer i, if it is greater than 3, then yield (produce) i and add it to the list A."
In terms of mathematical set-builder notation, the above for-comprehension is analogous to
A = { i ∈ ℤ : i > 3 }
which may be read as
"For each integer i, if it is greater than 3, then it is a member of the set A."
or alternatively as
"A is the set of all integers i, such that each i is greater than 3."
The keyword yield in Scala is simply syntactic sugar which can be easily replaced by a map, as Daniel Sobral already explained in detail.
On the other hand, yield is absolutely misleading if you are looking for generators (or continuations) similar to those in Python. See this SO thread for more information: What is the preferred way to implement 'yield' in Scala?
yield works together with a for loop as if there were a buffer we cannot see: on each iteration, the next item is added to that buffer. When the for loop finishes running, it returns the collection of all the yielded values. yield can be used with simple arithmetic expressions or in combination with arrays.
Here are two simple examples for your better understanding
scala> for (i <- 1 to 5) yield i * 3
res: scala.collection.immutable.IndexedSeq[Int] = Vector(3, 6, 9, 12, 15)
scala> val nums = Seq(1,2,3)
nums: Seq[Int] = List(1, 2, 3)
scala> val letters = Seq('a', 'b', 'c')
letters: Seq[Char] = List(a, b, c)
scala> val res = for {
| n <- nums
| c <- letters
| } yield (n, c)
res: Seq[(Int, Char)] = List((1,a), (1,b), (1,c), (2,a), (2,b), (2,c), (3,a), (3,b), (3,c))
Hope this helps!!
val aList = List( 1,2,3,4,5 )
val res3 = for ( al <- aList if al > 3 ) yield al + 1
val res4 = aList.filter(_ > 3).map(_ + 1)
println( res3 )
println( res4 )
These two pieces of code are equivalent.
val res3 = for (al <- aList) yield al + 1 > 3
val res4 = aList.map( _+ 1 > 3 )
println( res3 )
println( res4 )
These two pieces of code are also equivalent.
Map is as flexible as yield and vice-versa.
val doubledNums = for (n <- nums) yield n * 2
val ucNames = for (name <- names) yield name.capitalize
Notice that both of those for-expressions use the yield keyword:
Using yield after for is the “secret sauce” that says, “I want to yield a new collection from the existing collection that I’m iterating over in the for-expression, using the algorithm shown.”
taken from here
According to the Scala documentation, it clearly says "yield a new collection from the existing collection".
Another Scala documentation says, "Scala offers a lightweight notation for expressing sequence comprehensions. Comprehensions have the form for (enums) yield e, where enums refers to a semicolon-separated list of enumerators. An enumerator is either a generator which introduces new variables, or it is a filter. "
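For example, a comprehension combining all three enumerator forms (a generator, a value definition, and a filter):
val evenSquares = for {
  n <- (1 to 10).toList // generator
  sq = n * n            // value definition
  if sq % 2 == 0        // filter
} yield sq
// evenSquares: List[Int] = List(4, 16, 36, 64, 100)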
yield is more flexible than map(); see the example below.
val aList = List( 1,2,3,4,5 )
val res3 = for ( al <- aList if al > 3 ) yield al + 1
val res4 = aList.map( _+ 1 > 3 )
println( res3 )
println( res4 )
yield will print a result like List(5, 6), which is good,
while map() will return a result like List(false, false, true, true, true), which is probably not what you intend.