Strange (?) for comprehension evaluation in Scala

Now, it took me a while to figure out why my recursion is somehow managing to blow the stack. Here it is, the part causing this problem:
scala> for {
| i <- List(1, 2, 3)
| j = { println("why am I evaluated?"); 10 } if false
| } yield (i, j)
why am I evaluated?
why am I evaluated?
why am I evaluated?
res0: List[(Int, Int)] = List()
Isn't this, like, insane? Why evaluate j = ... at all if the definition ends in if false, so the value will never be used?
I have learned the hard way what happens when, instead of { println ... }, you have a recursive call (and a recursion guard instead of if false). :<
Why?!

I'm going to go out on a limb and say the accepted answer could say more.
This is a parser bug.
Guards can immediately follow a generator, but otherwise a semi is required (actual or inferred).
Here is the syntax.
In the following, the line for res4 should not compile.
scala> for (i <- (1 to 5).toList ; j = 2 * i if j > 4) yield j
res4: List[Int] = List(6, 8, 10)
scala> for (i <- (1 to 5).toList ; j = 2 * i ; if j > 4) yield j
res5: List[Int] = List(6, 8, 10)
What happens is that the val def of j gets merged with the i generator to make a new generator of pairs (i,j). Then the guard looks like it just follows the (synthetic) generator.
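Roughly, the desugared form looks like this (a sketch of the translation, not the compiler's exact output), which shows why the println runs for every i before the guard is ever consulted:
List(1, 2, 3)
  .map { i => val j = { println("why am I evaluated?"); 10 }; (i, j) }
  .withFilter { case (i, j) => false }
  .map { case (i, j) => (i, j) }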
But the syntax is still wrong. Syntax is our friend! It was our BFF long before the type system.
On the line for res5, it's pretty obvious that the guard does not guard the val def.
Update:
The implementation bug was downgraded (or upgraded, depending on your perspective) to a specification bug.
Checking for this usage, where a guard looks like a trailing if controlling the valdef that precedes it, like in Perl, falls under the purview of your favorite style checker.

If you structure your loop like this, it will solve your problem:
scala> for {
| i <- List(1, 2, 3)
| if false
| j = { println("why am I evaluated?"); 10 }
| } yield (i, j)
res0: List[(Int, Int)] = List()
Scala syntax in a for-loop treats the if statement as a sort of filter; this tutorial has some good examples.
One way to think of it is to walk through the for loop imperatively, and when you reach an if statement, if that statement evaluates to false, you continue to the next iteration of the loop.
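Desugared, the reordered comprehension puts the guard in a withFilter call that runs before j is ever defined, so the println never executes (again a rough sketch, not the compiler's exact output):
List(1, 2, 3)
  .withFilter(i => false)
  .map { i => val j = { println("why am I evaluated?"); 10 }; (i, j) }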

When I have questions like this I like to look at the disassembled code (feeding the .class files to JD-GUI, for instance).
The beginning of this for-comprehension disassembled code looks like this:
((TraversableLike)List..MODULE$.apply(Predef..MODULE$.wrapIntArray(new int[] { 1, 2, 3 })).map(new AbstractFunction1() {
  public static final long serialVersionUID = 0L;
  public final Tuple2<Object, BoxedUnit> apply(int i) {
    Predef..MODULE$.println("why am I evaluated?");
    BoxedUnit j = BoxedUnit.UNIT;
    return new Tuple2(BoxesRunTime.boxToInteger(i), j);
  }
}...//continues
Here we can see that the wrapped array of ints is mapped with an AbstractFunction1() whose apply method first performs the println no matter what, then assigns Unit to j, and finally returns a Tuple2 of (i, j) that is piped into the further filter/map operations (omitted). So the if false condition has no effect on whether the println runs; it only filters the already-built tuples later on.

Related

For until square root

In Scala, I want to write the equivalent of the following C++ code:
for(int i = 1 ; i * i < n ; i++)
So far I did this, but it looks ugly and I think it goes up until n:
for(i <- 1 to n
if(i * i < n))
Is there a nicer way of writing this code?
Not nicer but different approach
Using a stream
(1 to n).toStream map (i => i * i) takeWhile (_ < n)
Example for n = 100
scala> val res = (1 to 100).toStream map(i => i * i) takeWhile (_ < 100)
res: scala.collection.immutable.Stream[Int] = Stream(1, ?)
scala> res.toList
res16: List[Int] = List(1, 4, 9, 16, 25, 36, 49, 64, 81)
Explanation
A Stream allows you to request values on demand, i.e. lazy evaluation. So the mapped function is only applied when the next value is requested.
First of all, declare a function to generate a lazy stream of squares:
def squares(i: Int = 1): Stream[Int] = Stream.cons(i * i, squares(i + 1))
then use takeWhile to get the value when i * i is smaller than n. For example:
scala> squares().takeWhile(_ < 50).foreach(println)
1
4
9
16
25
36
49
The solution you have might not be the nicest, but it might be the most efficient; everything else is internally more complicated, so it might have some overhead. (In most situations not a noticeable overhead, though, and it might be optimized very well.)
I would not go for the solution using Streams suggested in another answer. While streams are computed lazily, they do cache the computed results, which is not required in this case and might take a lot of memory if the range iterated over is large. Instead I would use an Iterator. Operations on Iterators are typically lazy as well and do not cache anything.
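For example, the original loop can be written directly with an Iterator (a small sketch, with n standing for the bound from the question):
val n = 100
for (i <- Iterator.from(1).takeWhile(i => i * i < n)) println(i)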
If you need this more often, you could add another "operator" using an implicit class like this:
implicit class UntilHelper(start: Int) {
  def aslong(cond: Int => Boolean) =
    Iterator.from(start).takeWhile(cond)
}
Your loop then looks like this:
for (i <- 1 aslong (Math.pow(_, 2) < 1000)) {
  println(i)
}
From a quick micro-benchmark it looks like this is about 3 times faster than the stream solution and a little bit slower than a simple while loop. These things are however notoriously hard to measure without any context.
Remark on computing squares
A nice way of computing a sequence of squares is by adding the difference between consecutive squares, since (n+1)^2 = n^2 + 2n + 1. This can be done using the scanLeft method on a Stream or an Iterator.
val itr = Iterator.from(1).scanLeft(1)((a, b) => a + 2 * b + 1)
println(itr.take(10).toList) // List(1, 4, 9, 16, 25, 36, 49, 64, 81, 100)

General comprehensions in Scala

As far as I understand, the Scala for-comprehension notation relies on the first generator to define how elements are to be combined. Namely, for (i <- list) yield i returns a list and for (i <- set) yield i returns a set.
I was wondering if there was a way to specify how elements are combined independently of the properties of the first generator. For instance, I would like to get "the set of all elements from a given list", or "the sum of all elements from a given set". The only way I have found is to first build a list or a set as prescribed by the for-comprehension notation, then apply a transformation function to it - building a useless data structure in the process.
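For example, the workaround described above looks something like this (a sketch), building an intermediate collection only to throw it away:
val list = List(1, 1, 2, 2, 3)
val asSet = (for (i <- list) yield i).toSet       // builds a List first, then converts it
val total = (for (i <- Set(1, 2, 3)) yield i).sum // builds a Set first, then sums it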
What I have in mind is a general "algebraic" comprehension notation as it exists for instance in Ateji PX:
`+ { i | int i : set } // the sum of all elements from a given set
set() { i | int i : list } // the set of all elements from a given list
concat(",") { s | String s : list } // string concatenation with a separator symbol
Here the first element (`+, set(), concat(",")) is a so-called "monoid" that defines how elements are combined, independently of the structure of the first generator (there can be multiple generators and filters, I just tried to keep the examples concise).
Any idea about how to achieve a similar result in Scala while keeping a nice and concise notation? As far as I understand, the for-comprehension notation is hard-wired in the compiler and cannot be upgraded.
Thanks for your feedback.
About the for comprehension
The for comprehension in Scala is syntactic sugar for calls to flatMap, filter, map and foreach. Exactly as with direct calls to those methods, the type of the target collection determines the type of the returned collection. That is:
list map f //is a List
vector map f // is a Vector
This property is one of the underlying design goals of the scala collections library and would be seen as desirable in most situations.
Answering the question
You do not need to construct any intermediate collection of course:
(list.view map (_.prop)).toSet //uses list.view
(list.iterator map (_.prop)).toSet //uses iterator
(for { l <- list.view} yield l.prop).toSet //uses view
(Set.empty[Prop] /: coll) { _ + _.prop } //uses foldLeft
These will all yield Sets without generating unnecessary intermediate collections. My personal preference is for the first. In terms of idiomatic Scala collection manipulation, each "collection" comes with these methods:
//Conversions
toSeq
toSet
toArray
toList
toIndexedSeq
iterator
toStream
//Strings
mkString
//accumulation
sum
The last is used where the element type of a collection has an implicit Numeric instance in scope; such as:
Set(1, 2, 3, 4).sum //10
Set('a, 'b).sum //does not compile
Note that the String concatenation example in scala looks like:
list.mkString(",")
And in the scalaz FP library it might look something like this (using a Monoid to sum Strings):
list.intercalate(",").asMA.sum
Your suggestions do not look anything like Scala; I'm not sure whether they are inspired by another language.
foldLeft? That's what you're describing.
The sum of all elements from a given set:
(0 /: Set(1,2,3))(_ + _)
the set of all elements from a given list
(Set[Int]() /: List(1,2,3,2,1))((acc,x) => acc + x)
String concatenation with a separator symbol:
("" /: List("a", "b"))(_ + _) // (edit - ok concat a bit more verbose:
("" /: List("a", "b"))((acc,x) => acc + (if (acc == "") "" else ",") + x)
You can also force the result type of the for comprehension by explicitly supplying the implicit CanBuildFrom parameter as scala.collection.breakOut and specifying the result type.
Consider this REPL session:
scala> val list = List(1, 1, 2, 2, 3, 3)
list: List[Int] = List(1, 1, 2, 2, 3, 3)
scala> val res = for(i <- list) yield i
res: List[Int] = List(1, 1, 2, 2, 3, 3)
scala> val res: Set[Int] = (for(i <- list) yield i)(collection.breakOut)
res: Set[Int] = Set(1, 2, 3)
It results in a type error when not specifying the CanBuildFrom explicitly:
scala> val res: Set[Int] = for(i <- list) yield i
<console>:8: error: type mismatch;
found : List[Int]
required: Set[Int]
val res: Set[Int] = for(i <- list) yield i
^
For a deeper understanding of this I suggest the following read:
http://www.scala-lang.org/docu/files/collections-api/collections-impl.html
If you want to use for comprehensions and still be able to combine your values in some result value you could do the following.
case class WithCollector[B, A](init: B)(p: (B, A) => B) {
  var x: B = init
  val collect = { (y: A) => x = p(x, y) }
  def apply(pr: (A => Unit) => Unit) = {
    pr(collect)
    x
  }
}
// Some examples
object Test {
  def main(args: Array[String]): Unit = {
    // It's still functional
    val r1 = WithCollector[Int, Int](0)(_ + _) { collect =>
      for (i <- 1 to 10; if i % 2 == 0; j <- 1 to 3) collect(i + j)
    }
    println(r1) // 120
    import collection.mutable.Set
    val r2 = WithCollector[Set[Int], Int](Set[Int]())(_ += _) { collect =>
      for (i <- 1 to 10; if i % 2 == 0; j <- 1 to 3) collect(i + j)
    }
    println(r2) // Set(9, 10, 11, 6, 13, 4, 12, 3, 7, 8, 5)
  }
}

Incrementing the for loop (loop variable) in scala by power of 5

I had asked this question on Javaranch, but couldn't get a response there. So posting it here as well:
I have this particular requirement where the increment in the loop variable is to be done by multiplying it with 5 after each iteration. In Java we could implement it this way:
for(int i=1;i<100;i=i*5){}
In Scala I was trying the following code:
var j = 1
for (i <- 1.to(100).by(scala.math.pow(5, j).toInt)) {
  println(i + " " + j)
  j = j + 1
}
But it's printing the following output:
1 1
6 2
11 3
16 4
21 5
26 6
31 7
36 8
....
....
It's always incrementing by 5. So how do I go about actually multiplying by 5 each iteration instead of adding 5?
Let's first explain the problem. This code:
var j = 1
for (i <- 1.to(100).by(scala.math.pow(5, j).toInt)) {
  println(i + " " + j)
  j = j + 1
}
is equivalent to this:
var j = 1
val range: Range = Predef.intWrapper(1).to(100)
val increment: Int = scala.math.pow(5, j).toInt
val byRange: Range = range.by(increment)
byRange.foreach { i =>
  println(i + " " + j)
  j = j + 1
}
So, by the time you get to mutate j, increment and byRange have already been computed. And Range is an immutable object -- you can't change it. Even if you produced new ranges while you did the foreach, the object doing the foreach would still be the same.
Now, to the solution. Simply put, Range is not adequate for your needs. You want a geometric progression, not an arithmetic one. To me (and pretty much everyone else answering, it seems), the natural solution would be to use a Stream or Iterator created with iterate, which computes the next value based on the previous one.
for (i <- Iterator.iterate(1)(_ * 5) takeWhile (_ < 100)) {
  println(i)
}
EDIT: About Stream vs Iterator
Stream and Iterator are very different data structures that share the property of being non-strict. This property is what enables iterate to even exist, since this method is creating an infinite collection (1), from which takeWhile will create a new (2) collection which is finite. Let's see here:
val s1 = Stream.iterate(1)(_ * 5) // s1 is infinite
val s2 = s1.takeWhile(_ < 100) // s2 is finite
val i1 = Iterator.iterate(1)(_ * 5) // i1 is infinite
val i2 = i1.takeWhile(_ < 100) // i2 is finite
These infinite collections are possible because the collection is not pre-computed. On a List, all elements inside the list are actually stored somewhere by the time the list has been created. On the above examples, however, only the first element of each collection is known in advance. All others will only be computed if and when required.
As I mentioned, though, these are very different collections in other respects. Stream is an immutable data structure. For instance, you can print the contents of s2 as many times as you wish, and it will show the same output every time. On the other hand, Iterator is a mutable data structure. Once you used a value, that value will be forever gone. Print the contents of i2 twice, and it will be empty the second time around:
scala> s2 foreach println
1
5
25
scala> s2 foreach println
1
5
25
scala> i2 foreach println
1
5
25
scala> i2 foreach println
scala>
Stream, on the other hand, is a lazy collection. Once a value has been computed, it will stay computed, instead of being discarded or recomputed every time. See below one example of that behavior in action:
scala> val s2 = s1.takeWhile(_ < 100) // s2 is finite
s2: scala.collection.immutable.Stream[Int] = Stream(1, ?)
scala> println(s2)
Stream(1, ?)
scala> s2 foreach println
1
5
25
scala> println(s2)
Stream(1, 5, 25)
So Stream can actually fill up the memory if one is not careful, whereas Iterator occupies constant space. On the other hand, one can be surprised by Iterator, because of its side effects.
(1) As a matter of fact, Iterator is not a collection at all, even though it shares a lot of the methods provided by collections. On the other hand, from the problem description you gave, you are not really interested in having a collection of numbers, just in iterating through them.
(2) Actually, though takeWhile will create a new Iterator on Scala 2.8.0, this new iterator will still be linked to the old one, and changes in one have side effects on the other. This is subject to discussion, and they might end up being truly independent in the future.
In a more functional style:
scala> Stream.iterate(1)(i => i * 5).takeWhile(i => i < 100).toList
res0: List[Int] = List(1, 5, 25)
And with more syntactic sugar:
scala> Stream.iterate(1)(_ * 5).takeWhile(_ < 100).toList
res1: List[Int] = List(1, 5, 25)
Maybe a simple while-loop would do?
var i = 1;
while (i < 100) {
  println(i);
  i *= 5;
}
or if you want to also print the number of iterations
var i = 1;
var j = 1;
while (i < 100) {
  println(j + " : " + i);
  i *= 5;
  j += 1;
}
It seems you guys like functional, so how about a recursive solution?
import scala.annotation.tailrec

@tailrec def quints(n: Int): Unit = {
  println(n)
  if (n * 5 < 100) quints(n * 5)
}
Update: Thanks for spotting the error... it should of course be power, not multiply:
Annoyingly, there doesn't seem to be an integer pow function in the standard library!
Try this:
def pow5(i:Int) = math.pow(5,i).toInt
Iterator from 1 map pow5 takeWhile (100>=) toList
Or if you want to use it in-place:
Iterator from 1 map pow5 takeWhile (100>=) foreach {
j => println("number:" + j)
}
and with the indices:
val iter = Iterator from 1 map pow5 takeWhile (100>=)
iter.zipWithIndex foreach { case (j, i) => println(i + " = " + j) }
(0 to 2).map (math.pow (5, _).toInt).zipWithIndex
res25: scala.collection.immutable.IndexedSeq[(Int, Int)] = Vector((1,0), (5,1), (25,2))
produces a Vector, with i,j in reversed order.

scala loop through a linkedlist

In Scala, what is a good way to loop through a linked list (scala.collection.mutable.LinkedList) of objects? For example, I want a 'for' loop to traverse each object in the linked list and process it.
With foreach:
Welcome to Scala version 2.8.0.final (Java HotSpot(TM) Client VM, Java 1.6.0_21).
Type in expressions to have them evaluated.
Type :help for more information.
scala> val ll = scala.collection.mutable.LinkedList[Int](1,2,3)
ll: scala.collection.mutable.LinkedList[Int] = LinkedList(1, 2, 3)
scala> ll.foreach(i => println(i * 2))
2
4
6
or, if your processing of each object returns a new value, use map:
scala> ll.map(_ * 2)
res3: scala.collection.mutable.LinkedList[Int] = LinkedList(2, 4, 6)
Some people prefer for comprehensions instead of foreach and map. They look like this:
scala> for (i <- ll) println(i)
1
2
3
scala> for (i <- ll) yield i * 2
res5: scala.collection.mutable.LinkedList[Int] = LinkedList(2, 4, 6)
To expand on the previous answer...
foreach and map are higher-order functions - they take a function as an argument, and for comprehensions desugar into calls to them - so starting here:
val list = List(1,2,3)
list.foreach(i => println(i * 2))
You have a number of ways that you can make the code more declarative in nature, and cleaner at the same time.
First, you don't really need to use the name - i - for each member of the collection; you can use _ as a placeholder instead. (Note that the placeholder must stand for the whole argument - println(_ * 2) would not compile - so the doubling moves into a map:)
list.map(_ * 2).foreach(println(_))
You can also separate the logic out into a distinct method, and continue to use placeholder syntax:
def printTimesTwo(i:Int) = println(i * 2)
list.foreach(printTimesTwo(_))
Even cleaner, just pass the raw function without specifying parameters (look ma, no placeholders!)
list.foreach(printTimesTwo)
And to take it to a logical conclusion, this can be made cleaner still by using infix syntax. Which I show here working with a standard library method. Note: you could even use a method imported from a java library, if you wanted:
list foreach println
This thinking extends to anonymous functions and partially-applied functions and also to the map operation:
// "2 *" creates an anonymous function that will double its one-and-only argument
list map { 2 * }
For-comprehensions aren't really very useful when working at this level, they just add boilerplate. But they do come into their own when working with deeper nested structures:
//a list of lists, print out all the numbers
val grid = List(List(1, 2, 3), List(4, 5, 6), List(7, 8, 9))
grid foreach { _ foreach println } //hmm, could get confusing
for(line <- grid; cell <- line) println(cell) //that's clearer
I didn't need the yield keyword there, as nothing is being returned. But if I wanted to get back a list of Strings (un-nested):
for(line <- grid; cell <- line) yield { cell.toString }
With lots of generators, you'll want to split them over multiple lines:
for {
  listOfGrids <- someMasterCollection
  grid <- listOfGrids
  line <- grid
  cell <- line
} yield {
  cell.toString
}

What is Scala's yield?

I understand Ruby and Python's yield. What does Scala's yield do?
I think the accepted answer is great, but it seems many people have failed to grasp some fundamental points.
First, Scala's for comprehensions are equivalent to Haskell's do notation: nothing more than syntactic sugar for composing multiple monadic operations. As this statement will most likely not help anyone who needs help, let's try again… :-)
Scala's for comprehensions are syntactic sugar for composing multiple operations with map, flatMap and filter. Or foreach. Scala actually translates a for-expression into calls to those methods, so any class providing them, or a subset of them, can be used with for comprehensions.
First, let's talk about the translations. There are very simple rules:
This
for(x <- c1; y <- c2; z <-c3) {...}
is translated into
c1.foreach(x => c2.foreach(y => c3.foreach(z => {...})))
This
for(x <- c1; y <- c2; z <- c3) yield {...}
is translated into
c1.flatMap(x => c2.flatMap(y => c3.map(z => {...})))
This
for(x <- c; if cond) yield {...}
is translated on Scala 2.7 into
c.filter(x => cond).map(x => {...})
or, on Scala 2.8, into
c.withFilter(x => cond).map(x => {...})
with a fallback into the former if method withFilter is not available but filter is. Please see the section below for more information on this.
This
for(x <- c; y = ...) yield {...}
is translated into
c.map(x => (x, ...)).map { case (x, y) => {...} }
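A concrete instance of that last rule (a sketch of the expansion, not the compiler's exact output):
// for (x <- List(1, 2, 3); y = x * 10) yield x + y
// expands to roughly:
List(1, 2, 3).map(x => (x, x * 10)).map { case (x, y) => x + y } // List(11, 22, 33)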
When you look at very simple for comprehensions, the map/foreach alternatives look, indeed, better. Once you start composing them, though, you can easily get lost in parenthesis and nesting levels. When that happens, for comprehensions are usually much clearer.
I'll show one simple example, and intentionally omit any explanation. You can decide which syntax was easier to understand.
l.flatMap(sl => sl.filter(el => el > 0).map(el => el.toString.length))
or
for {
sl <- l
el <- sl
if el > 0
} yield el.toString.length
withFilter
Scala 2.8 introduced a method called withFilter, whose main difference is that, instead of returning a new, filtered, collection, it filters on-demand. The filter method has its behavior defined based on the strictness of the collection. To understand this better, let's take a look at some Scala 2.7 with List (strict) and Stream (non-strict):
scala> var found = false
found: Boolean = false
scala> List.range(1,10).filter(_ % 2 == 1 && !found).foreach(x => if (x == 5) found = true else println(x))
1
3
7
9
scala> found = false
found: Boolean = false
scala> Stream.range(1,10).filter(_ % 2 == 1 && !found).foreach(x => if (x == 5) found = true else println(x))
1
3
The difference happens because filter is immediately applied with List, returning a list of odds -- since found is false. Only then foreach is executed, but, by this time, changing found is meaningless, as filter has already executed.
In the case of Stream, the condition is not immediately applied. Instead, as each element is requested by foreach, filter tests the condition, which enables foreach to influence it through found. Just to make it clear, here is the equivalent for-comprehension code:
for (x <- List.range(1, 10); if x % 2 == 1 && !found)
if (x == 5) found = true else println(x)
for (x <- Stream.range(1, 10); if x % 2 == 1 && !found)
if (x == 5) found = true else println(x)
This caused many problems, because people expected the if to be considered on-demand, instead of being applied to the whole collection beforehand.
Scala 2.8 introduced withFilter, which is always non-strict, no matter the strictness of the collection. The following example shows List with both methods on Scala 2.8:
scala> var found = false
found: Boolean = false
scala> List.range(1,10).filter(_ % 2 == 1 && !found).foreach(x => if (x == 5) found = true else println(x))
1
3
7
9
scala> found = false
found: Boolean = false
scala> List.range(1,10).withFilter(_ % 2 == 1 && !found).foreach(x => if (x == 5) found = true else println(x))
1
3
This produces the result most people expect, without changing how filter behaves. As a side note, Range was changed from non-strict to strict between Scala 2.7 and Scala 2.8.
It is used in sequence comprehensions (like Python's list-comprehensions and generators, where you may use yield too).
It is applied in combination with for and writes a new element into the resulting sequence.
Simple example (from scala-lang)
/** Turn command line arguments to uppercase */
object Main {
  def main(args: Array[String]) {
    val res = for (a <- args) yield a.toUpperCase
    println("Arguments: " + res.toString)
  }
}
The corresponding expression in F# would be
[ for a in args -> a.toUpperCase ]
or
from a in args select a.toUpperCase
in Linq.
Ruby's yield has a different effect.
Yes, as Earwicker said, it's pretty much the equivalent to LINQ's select and has very little to do with Ruby's and Python's yield. Basically, where in C# you would write
from ... select ???
in Scala you have instead
for ... yield ???
It's also important to understand that for-comprehensions don't just work with sequences, but with any type which defines certain methods, just like LINQ:
If your type defines just map, it allows for-expressions consisting of a single generator.
If it defines flatMap as well as map, it allows for-expressions consisting of several generators.
If it defines foreach, it allows for-loops without yield (both with single and multiple generators).
If it defines filter, it allows for-filter expressions starting with an if in the for expression.
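For instance, a minimal type of your own (a hypothetical Box class, not from any library) only needs map and flatMap to support for-expressions with several generators:
case class Box[A](value: A) {
  def map[B](f: A => B): Box[B] = Box(f(value))
  def flatMap[B](f: A => Box[B]): Box[B] = f(value)
}

val product = for {
  a <- Box(2)
  b <- Box(3)
} yield a * b // Box(6)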
Unless you get a better answer from a Scala user (which I'm not), here's my understanding.
It only appears as part of an expression beginning with for, which states how to generate a new list from an existing list.
Something like:
var doubled = for (n <- original) yield n * 2
So there's one output item for each input (although I believe there's a way of dropping duplicates).
This is quite different from the "imperative continuations" enabled by yield in other languages, where it provides a way to generate a list of any length, from some imperative code with almost any structure.
(If you're familiar with C#, it's closer to LINQ's select operator than it is to yield return).
Consider the following for-comprehension
val A = for (i <- Int.MinValue to Int.MaxValue; if i > 3) yield i
It may be helpful to read it out loud as follows
"For each integer i, if it is greater than 3, then yield (produce) i and add it to the list A."
In terms of mathematical set-builder notation, the above for-comprehension is analogous to
A = { i ∈ ℤ : i > 3 }
which may be read as
"For each integer i, if it is greater than 3, then it is a member of the set A."
or alternatively as
"A is the set of all integers i, such that each i is greater than 3."
The keyword yield in Scala is simply syntactic sugar which can be easily replaced by a map, as Daniel Sobral already explained in detail.
On the other hand, yield is absolutely misleading if you are looking for generators (or continuations) similar to those in Python. See this SO thread for more information: What is the preferred way to implement 'yield' in Scala?
Yield is similar to a for loop with a buffer that we cannot see: on each iteration, the yielded value is appended to that buffer, and when the for loop finishes running, the whole collection of yielded values is returned. Yield can be used with simple arithmetic expressions or in combination with multiple collections.
Here are two simple examples for your better understanding
scala> for (i <- 1 to 5) yield i * 3
res: scala.collection.immutable.IndexedSeq[Int] = Vector(3, 6, 9, 12, 15)
scala> val nums = Seq(1,2,3)
nums: Seq[Int] = List(1, 2, 3)
scala> val letters = Seq('a', 'b', 'c')
letters: Seq[Char] = List(a, b, c)
scala> val res = for {
| n <- nums
| c <- letters
| } yield (n, c)
res: Seq[(Int, Char)] = List((1,a), (1,b), (1,c), (2,a), (2,b), (2,c), (3,a), (3,b), (3,c))
Hope this helps!!
val aList = List( 1,2,3,4,5 )
val res3 = for ( al <- aList if al > 3 ) yield al + 1
val res4 = aList.filter(_ > 3).map(_ + 1)
println( res3 )
println( res4 )
These two pieces of code are equivalent.
val res3 = for (al <- aList) yield al + 1 > 3
val res4 = aList.map( _+ 1 > 3 )
println( res3 )
println( res4 )
These two pieces of code are also equivalent.
Map is as flexible as yield and vice-versa.
val doubledNums = for (n <- nums) yield n * 2
val ucNames = for (name <- names) yield name.capitalize
Notice that both of those for-expressions use the yield keyword:
Using yield after for is the “secret sauce” that says, “I want to yield a new collection from the existing collection that I’m iterating over in the for-expression, using the algorithm shown.”
taken from here
The Scala documentation clearly says it will "yield a new collection from the existing collection".
Another Scala documentation says, "Scala offers a lightweight notation for expressing sequence comprehensions. Comprehensions have the form for (enums) yield e, where enums refers to a semicolon-separated list of enumerators. An enumerator is either a generator which introduces new variables, or it is a filter. "
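As an illustration, here is a comprehension combining the enumerator kinds discussed in this thread - a generator, a value definition, and a filter (a small sketch):
val evensDoubled = for {
  n <- 1 to 10    // generator
  d = n * 2       // definition
  if d % 4 == 0   // filter
} yield d         // Vector(4, 8, 12, 16, 20)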
yield is more flexible than map(), see example below
val aList = List( 1,2,3,4,5 )
val res3 = for ( al <- aList if al > 3 ) yield al + 1
val res4 = aList.map( _+ 1 > 3 )
println( res3 )
println( res4 )
yield will produce a result like List(5, 6), which is good,
while map() will return a result like List(false, false, true, true, true), which is probably not what you intend.