Smartly deal with Option[T] in Scala - scala

I am developing some code in Scala and I am trying to smartly resolve a basic transformation between collections that contain some Option[T] values.
Let's say that we have the following list
val list: List[(A, Option[B])] = // Initialization stuff
and we want to apply a transformation to list to obtain the following list
val transformed: List[(B, A)]
for all Option[B]s that evaluate to Some[B]. The best way I found to do this is to apply the following chain of transformations:
val transformed =
list.filter(_._2.isDefined)
.map { case (a, Some(b)) => (b, a) }
However, I feel that I am missing something. What is the best way to deal with Option[T]s?

You can use collect:
val transformed = list.collect {
case (a, Some(b)) => (b, a)
}
Collect, as defined in the docs:
Builds a new collection by applying a partial function to all elements of this list on which the function is defined.
Meaning, it yields a result only for elements which match any of the cases defined in your partial function. I like to think of it as a combined filter and map.
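As a concrete sketch, here is collect applied to a small hypothetical list (the sample data is made up for illustration):

```scala
// A small demo of collect on a List[(A, Option[B])].
val list: List[(String, Option[Int])] =
  List(("a", Some(1)), ("b", None), ("c", Some(3)))

// collect keeps only the elements matched by the partial function,
// applying the transformation in the same pass.
val transformed: List[(Int, String)] = list.collect {
  case (a, Some(b)) => (b, a)
}
// transformed == List((1, "a"), (3, "c"))
```

The pair ("b", None) matches no case, so it is simply dropped, with no separate filter step.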

Related

Merging 3 immutable maps in scala

I have 3 immutable maps
as: Map[UUID, A]
bs: Map[UUID, B]
cs: Map[UUID, C]
and I want to merge them so the result is of type:
Map[UUID, (Option[A], Option[B], Option[C])]
What is the best way to do this? And by best I mean fewest lines of code.
I think you need to iterate over all the keys and construct the value for each of them. Something like this:
val keys = as.keySet ++ bs.keySet ++ cs.keySet
val merged = keys.map(key => (key, (as.get(key), bs.get(key), cs.get(key)))).toMap
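For a concrete picture, here is that key-union merge run on tiny sample maps (the keys and values below are hypothetical, for illustration only):

```scala
import java.util.UUID

// Hypothetical sample maps: k1 appears in as and bs, k2 in bs and cs.
val k1 = UUID.randomUUID()
val k2 = UUID.randomUUID()

val as = Map(k1 -> "a1")
val bs = Map(k1 -> 10, k2 -> 20)
val cs = Map(k2 -> 'c')

// Union of all keys, then a lookup in each map per key;
// get returns an Option, so absence becomes None.
val keys = as.keySet ++ bs.keySet ++ cs.keySet
val merged: Map[UUID, (Option[String], Option[Int], Option[Char])] =
  keys.map(key => (key, (as.get(key), bs.get(key), cs.get(key)))).toMap

// merged(k1) == (Some("a1"), Some(10), None)
// merged(k2) == (None, Some(20), Some('c'))
```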
You could probably use a for comprehension instead (note the trailing .toMap, which turns the resulting set of pairs into a Map):
(for {
k <- as.keySet ++ bs.keySet ++ cs.keySet
} yield k -> (as.get(k), bs.get(k), cs.get(k))).toMap
Ideally, you want a better data type that knows that at least one of the elements has to be defined. And since you mentioned cats you may do this:
import cats.syntax.all._
val result = ((as align bs) align cs)
That gives you a Map[UUID, Ior[Ior[A, B], C]], which properly represents that, for each key, the result can hold one, two, or all three of the values.

Understand Stream scala interleaved transformations behavior

I'm reading and having fun with the examples and exercises in the book Functional Programming in Scala. I'm studying the strictness and laziness chapter, which covers Stream.
I can't understand the output produced by the following code excerpt:
sealed trait Stream[+A]{
def foldRight[B](z: => B)(f: (A, => B) => B): B =
this match {
case Cons(h,t) => f(h(), t().foldRight(z)(f))
case _ => z
}
def map[B](f: A => B): Stream[B] = foldRight(Stream.empty[B])((h,t) => {println(s"map h:$h"); Stream.cons(f(h), t)})
def filter(f:A=>Boolean):Stream[A] = foldRight(Stream.empty[A])((h,t) => {println(s"filter h:$h"); if(f(h)) Stream.cons(h,t) else t})
}
case object Empty extends Stream[Nothing]
case class Cons[+A](h: () => A, t: () => Stream[A]) extends Stream[A]
object Stream {
def cons[A](hd: => A, tl: => Stream[A]): Stream[A] = {
lazy val head = hd
lazy val tail = tl
Cons(() => head, () => tail)
}
def empty[A]: Stream[A] = Empty
def apply[A](as: A*): Stream[A] =
if (as.isEmpty) empty else cons(as.head, apply(as.tail: _*))
}
Stream(1,2,3,4,5,6).map(_+10).filter(_%2==0)
When I execute this code, I receive this output:
map h:1
filter h:11
map h:2
filter h:12
My questions are:
Why are the map and filter outputs interleaved?
Could you explain all steps involved from the Stream creation until the last step for obtaining this behavior?
Where are the other elements of the list that also pass the filter transformation, i.e. 4 and 6?
The key to understanding this behavior, I think, is in the signature of the foldRight.
def foldRight[B](z: => B)(f: (A, => B) => B): B = ...
Note that the 2nd argument, f, is a function that takes two parameters, an A and a by-name (lazy) B. Take away that laziness, f: (A, B) => B, and not only do you get the expected method grouping (all the map() steps before all the filter() steps), the steps also come in reverse order, with 6 processed first and 1 processed last, as you'd expect from a foldRight.
How does one little => perform all that magic? It basically says that the 2nd argument to f() is going to be held in reserve until it is required.
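A minimal sketch of that deferral, independent of Stream (the maybeUse and expensive names are hypothetical):

```scala
// A by-name parameter (=> B) is not evaluated at the call site;
// it is evaluated only when (and each time) the body references it.
def maybeUse(cond: Boolean, b: => String): String =
  if (cond) b else "skipped"

var evaluated = false
def expensive(): String = { evaluated = true; "computed" }

val r1 = maybeUse(cond = false, expensive())
val evaluatedAfterR1 = evaluated // still false: expensive() never ran

val r2 = maybeUse(cond = true, expensive())
val evaluatedAfterR2 = evaluated // true: b was forced inside maybeUse
```

This is exactly what lets foldRight hand each step a "promise" of the rest of the computation instead of the computed rest itself.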
So, attempting to answer your questions.
Why are the map and filter outputs interleaved?
Because each call to map() and filter() is delayed until the point when the values are requested.
Could you explain all steps involved from the Stream creation until the last step for obtaining this behavior?
Not really. That would take more time and SO answer space than I'm willing to contribute, but let's take just a few steps into the morass.
We start with a Stream, which looks like a series of Cons, each holding an Int and a reference to the next Cons, but that's not completely accurate. Each Cons really holds two functions: when invoked, the 1st produces an Int and the 2nd produces the next Cons.
Call map() and pass it the "+10" function. map() creates a new function: "Given h and t (both values), create a new Cons. The head function of the new Cons, when invoked, will be the "+10" function applied to the current head value. The new tail function will produce the t value as received." This new function is passed to foldRight.
foldRight receives the new function but the evaluation of the function's 2nd parameter will be delayed until it is needed. h() is called to retrieve the current head value, t() will be called to retrieve the current tail value and a recursive call to foldRight will be called on it.
Call filter() and pass it the "isEven" function. filter() creates a new function: "Given h and t, create a new Cons if h passes the isEven test. If not then return t." That's the real t. Not a promise to evaluate its value later.
Where are the other elements of the list that also pass the filter transformation, i.e. 4 and 6?
They are still there waiting to be evaluated. We can force that evaluation by using pattern matching to extract the various Cons one by one.
val c0 @ Cons(_,_) = Stream(1,2,3,4,5,6).map(_+10).filter(_%2==0)
// **STDOUT**
//map h:1
//filter h:11
//map h:2
//filter h:12
c0.h() //res0: Int = 12
val c1 @ Cons(_,_) = c0.t()
// **STDOUT**
//map h:3
//filter h:13
//map h:4
//filter h:14
c1.h() //res1: Int = 14
val c2 @ Cons(_,_) = c1.t()
// **STDOUT**
//map h:5
//filter h:15
//map h:6
//filter h:16
c2.h() //res2: Int = 16
c2.t() //res3: Stream[Int] = Empty

Scala: Difference between Map.map and Map.transform? Why Map.map requires pattern matching in its parameter?

For an immutable Map,
val original = Map("A"->1, "B"->2)
I can either use
original.map { case (k, v) => (k, v + 1) }
Or
original.transform((_, v) => v + 1)
to transform the values.
But why does the map() method require case pattern matching while transform() doesn't? Is it because these methods are defined on different implicit types?
Someone has marked my question as a duplicate of another question, "Difference between mapValues and transform in Map". It is not the same: I am asking about Map.map, not Map.mapValues. I am also asking about the different ways of using the two methods.
With the map method you can change (I don't want to use the word transform here) the whole Map, converting it to another Map, a List, etc.
val m = Map(1->"a")
m.map { case (k,v) => (k+1) -> (v + 1) } // Map(2 -> a1)
m.map { case (k,v) => k+v } // List(1a)
With the transform method you can change only the values, taking their keys into account:
m.transform { case (k, v) => v + 1 } // Map(1 -> a1)
transform takes a function with two parameters as input: the first is the key and the second is the value. Pattern matching is not needed since the two values are passed in individually.
On the other hand, the function passed to map takes a single tuple containing the key and value of the element. Pattern matching is used to break this tuple into its components. You don't have to use pattern matching, but that would mean working with the tuple object instead of its contents.
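Putting the two side by side on the question's original map (the m2 variant with tuple accessors is added only for comparison):

```scala
val original = Map("A" -> 1, "B" -> 2)

// transform passes the key and value as two separate parameters.
val t = original.transform((_, v) => v + 1)

// map passes a single (key, value) tuple; `case` destructures it.
val m1 = original.map { case (k, v) => (k, v + 1) }

// Without pattern matching you work with the tuple directly.
val m2 = original.map(kv => (kv._1, kv._2 + 1))

// All three produce Map("A" -> 2, "B" -> 3)
```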
The difference is in the function they receive. As you can see in the API
def transform[W, That](f: (K, V) ⇒ W)(implicit bf: CanBuildFrom[Map[K, V], (K, W), That]): That
def map[B](f: ((K, V)) ⇒ B): Iterable[B]
transform's function receives the key and value as two separate arguments, f: (K, V) ⇒ W, while map's function receives a single value, which for a Map is a key-value tuple: f: ((K, V)) ⇒ B.
So if you want to treat the key and value separately in an easy-to-read way, you should use the case keyword.
You can also do something like this, but it is way less readable:
original.map(r => (r._1, r._2+1))

Iterable with two elements?

We have Option, which is an Iterable over 0 or 1 elements.
I would like to have such a thing with two elements. The best I have is
Array(foo, bar).map{...}, while what I would like is:
(foo, bar).map{...}
(such that Scala recognizes there are two elements in the Iterable).
Does such a construction exist in the standard library?
EDIT: another solution is to create a map method:
def map(a:Foo) = {...}
val (mappedFoo, mappedBar) = (map(foo), map(bar))
If all you want to do is map on tuples of the same type, a simple version is:
implicit class DupleOps[T](t: (T,T)) {
def map[B](f : T => B) = (f(t._1), f(t._2))
}
Then you can do the following:
val t = (0,1)
val (x,y) = t.map( _ +1) // x = 1, y = 2
There's no specific type in the scala standard library for mapping over exactly 2 elements.
I can suggest the following (I suppose foo and bar have the same type T):
(foo, bar) // -> Tuple2[T,T]
.productIterator // -> Iterator[Any]
.map(_.asInstanceOf[T]) // -> Iterator[T]
.map(x => // some works)
No, it doesn't.
You could
Make one yourself.
Write an implicit conversion from 2-tuples to a Seq of the common supertype. But this won't yield 2-tuples from operations.
object TupleOps {
implicit def tupleToSeq[C, A <: C, B <: C](tuple: (A, B)): Seq[C] = Seq(tuple._1, tuple._2)
}
import TupleOps._
(0, 1).map(_ + 1)
Use HLists from shapeless. These provide operations on heterogeneous lists, whereas you (probably?) have a homogeneous list, but it should work.

Corecursion vs Recursion understanding in scala

Recursion is almost clear to me, for example:
import scala.annotation.tailrec

def product2(ints: List[Int]): Int = {
@tailrec
def productAccumulator(ints: List[Int], accum: Int): Int = {
ints match {
case Nil => accum
case x :: tail => productAccumulator(tail, accum * x)
}
}
productAccumulator(ints, 1)
}
I am not sure about to the corecursion. According to the Wikipedia article, "corecursion allows programs to produce arbitrarily complex and potentially infinite data structures, such as streams". For example construction like this
list.filter(...).map(...)
makes it possible to produce intermediate streams for the filter and map operations:
after filter, the stream will contain only the filtered elements, and then map will transform those elements. Correct?
Do functional combinators like map and filter use recursive execution under the hood?
Does anybody have a good example in Scala comparing recursion and corecursion?
The simplest way to understand the difference is to think that recursion consumes data while corecursion produces data. Your example is recursion since it consumes the list you provide as a parameter. Also, foldLeft and foldRight are recursion too, not corecursion. Now, an example of corecursion. Consider the following function:
def unfold[S, A](z: S)(f: S => Option[(A, S)]): Stream[A]
Just by looking at its signature you can see this function is intended to produce a potentially infinite stream of data. It takes an initial state, z of type S, and a function from S to an optional tuple containing the next value of the stream (of type A) and the next state. If the result of f is empty (None) then unfold stops producing elements; otherwise it continues with the next state, and so on. Here is its implementation:
def unfold[S, A](z: S)(f: S => Option[(A, S)]): Stream[A] = f(z) match {
case Some((a, s)) => a #:: unfold(s)(f)
case None => Stream.empty[A]
}
You can use this function to implement other productive functions. E.g. the following function will produce a stream of, at most, numOfValues elements of type A:
def elements[A](element: A, numOfValues: Int): Stream[A] = unfold(numOfValues) { x =>
if (x > 0) Some((element, x - 1)) else None
}
Usage example in REPL:
scala> elements("hello", 3)
res10: Stream[String] = Stream(hello, ?)
scala> res10.toList
res11: List[String] = List(hello, hello, hello)
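As a closing aside, since Scala 2.13 the standard library ships an equivalent of this unfold as LazyList.unfold. A minimal sketch contrasting recursion (consuming a list) with corecursion (producing a lazy stream); the product and fibs names are just for illustration:

```scala
// Recursion consumes data: folding an existing list down to one value.
def product(ints: List[Int]): Int = ints.foldLeft(1)(_ * _)

// Corecursion produces data: unfolding a seed into a lazy stream.
// LazyList.unfold is in the standard library since Scala 2.13.
val fibs: LazyList[Int] =
  LazyList.unfold((0, 1)) { case (a, b) => Some((a, (b, a + b))) }

// product(List(1, 2, 3, 4)) == 24
// fibs.take(7).toList == List(0, 1, 1, 2, 3, 5, 8)
```

Note that fibs never terminates on its own; the state function always returns Some, and only take bounds how much of it is forced.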