Applicative instance for ZipList in Scala

This is a follow-up to one of my recent previous questions:
I would like to define a zip Applicative instance for List (and probably Set and Map). For example:
val xs: List[Int] = List(1, 2, 3)
val fs: List[Int => Int] = List(f1, f2, f3)
val ys: List[Int] = xs <*> fs // expected to be List(f1(1), f2(2), f3(3))
So I defined a ZipList and its Applicative:
case class ZipList[A](val list: List[A])

implicit val zipListApplicative = new Applicative[ZipList] {
  def point[A](a: => A): ZipList[A] = ZipList(List(a))
  def ap[A, B](za: => ZipList[A])(zf: => ZipList[A => B]): ZipList[B] = {
    val bs = (za.list zip zf.list) map { case (a, f) => f(a) }
    ZipList(bs)
  }
}
and can use it as follows:
scala> val xs: List[Int] = List(1, 2, 3)
xs: List[Int] = List(1, 2, 3)
scala> val fs: List[Int => Int] = List(_ + 2, _ + 2, _ +1)
fs: List[Int => Int] = List(<function1>, <function1>, <function1>)
scala> ZipList(xs) <*> ZipList(fs)
res4: ZipList[Int] = ZipList(List(3, 4, 4))
It seems to be working but maybe I am missing something.
Does zipListApplicative comply with the applicative laws?
Is ZipList supposed to be a stream, because point should generate an infinite stream of values? Why?

Applicatives should satisfy the identity law
v <*> point(identity) == v
which yours does not, since
ZipList(List(1, 2, 3)) <*> point(identity) == ZipList(List(1))
pure a (point in Scalaz) for a zip list should return an infinite stream of a, which is why you need a lazy data structure.
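A minimal sketch of that fix, using a Stream-backed wrapper (the name ZipStream and this instance are mine, not from the question): point produces an infinite stream, so zipping it against a value of any length keeps every element and the identity law holds.
case class ZipStream[A](stream: Stream[A])
implicit val zipStreamApplicative = new Applicative[ZipStream] {
  // point repeats the value forever, so it never truncates whatever it is zipped with
  def point[A](a: => A): ZipStream[A] = ZipStream(Stream.continually(a))
  def ap[A, B](za: => ZipStream[A])(zf: => ZipStream[A => B]): ZipStream[B] =
    ZipStream((za.stream zip zf.stream) map { case (a, f) => f(a) })
}
// ZipStream(Stream(1, 2, 3)) <*> zipStreamApplicative.point((x: Int) => x) now yields ZipStream(Stream(1, 2, 3))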

Related

Combine 2 partial functions

I have two partial functions returning Unit (f1, f2). For instance, something like this:
val f1 = {
  case s: Arg => // do something
  // etc... lots of cases
}
val f2 = {
  case s: AnotherArg => // do something
  // lots of cases
}
Is there a concise way to compose these two partial functions so that:
f(x) = { f1(x); f2(x) }  if f1.isDefinedAt(x) && f2.isDefinedAt(x)
f(x) = f1(x)             if f1.isDefinedAt(x) && !f2.isDefinedAt(x)
f(x) = f2(x)             if !f1.isDefinedAt(x) && f2.isDefinedAt(x)
orElse
f1 orElse f2
Scala REPL
scala> val f: PartialFunction[Int, Int] = { case 1 => 1 }
f: PartialFunction[Int,Int] = <function1>
scala> val g: PartialFunction[Int, Int] = { case 2 => 2 }
g: PartialFunction[Int,Int] = <function1>
scala> val h = f orElse g
h: PartialFunction[Int,Int] = <function1>
scala> h(1)
res3: Int = 1
scala> h(2)
res4: Int = 2
scala> h.isDefinedAt(1)
res6: Boolean = true
scala> h.isDefinedAt(2)
res7: Boolean = true
To have both functions execute on common cases, use a List of partial functions and foldLeft:
Scala REPL
scala> val f: PartialFunction[Int, Int] = { case 1 => 1 case 3 => 3}
f: PartialFunction[Int,Int] = <function1>
scala> val g: PartialFunction[Int, Int] = { case 2 => 2 case 3 => 3}
g: PartialFunction[Int,Int] = <function1>
scala> val h = f orElse g
h: PartialFunction[Int,Int] = <function1>
scala> h(3)
res10: Int = 3
scala> h(3)
res11: Int = 3
scala> val h = List(f, g)
h: List[PartialFunction[Int,Int]] = List(<function1>, <function1>)
scala> def i(arg: Int) = h.foldLeft(0){(result, f) => if (f.isDefinedAt(arg)) result + f(arg) else result }
i: (arg: Int)Int
scala> i(3)
res12: Int = 6
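For the Unit-returning partial functions in the question, a direct sketch of the spec is also possible; the helper name runBoth is mine:
def runBoth[A](f1: PartialFunction[A, Unit], f2: PartialFunction[A, Unit]): PartialFunction[A, Unit] = {
  // defined wherever at least one of f1, f2 is defined; runs whichever apply
  case x if f1.isDefinedAt(x) || f2.isDefinedAt(x) =>
    if (f1.isDefinedAt(x)) f1(x)
    if (f2.isDefinedAt(x)) f2(x)
}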
Although pamu's answer is good, I don't like that it is bound to the specific Int type. Unfortunately you didn't specify the result type precisely enough, so I see 3 alternatives:
You want to get a list of the results of all defined functions, and you don't care which function produced which result. In this case something like this would work:
def callAll[A, B](funcs: List[PartialFunction[A, B]], a: A): List[B] = funcs.foldRight(List.empty[B])((f, acc) => if (f.isDefinedAt(a)) f.apply(a) :: acc else acc)
If the order of the results is not important, you may use
def callAll[A, B](funcs: List[PartialFunction[A, B]], a: A): List[B] = funcs.foldLeft(List.empty[B])((acc, f) => if (f.isDefinedAt(a)) f.apply(a) :: acc else acc)
which will probably be a bit faster (note that the accumulator comes first in foldLeft's function).
You want to get an Option for each function: Some if the corresponding function is defined at the point, None otherwise. In that case something like this would work:
def callAllOption[A, B](funcs: List[PartialFunction[A, B]], a: A): List[Option[B]] = funcs.map(f => f.lift.apply(a))
If you don't want to create the List explicitly, you can use varargs:
def callAllOptionVarArg[A, B](a: A, funcs: PartialFunction[A, B]*): List[Option[B]] = funcs.map(f => f.lift.apply(a)).toList
or a curried version that takes the value after the functions:
def callAllOptionVarArg2[A, B](funcs: PartialFunction[A, B]*)(a: A): List[Option[B]] = funcs.map(f => f.lift.apply(a)).toList
You call the functions purely for their side effects and the return value is not important, in which case you can safely use the second (slightly faster) callAll definition; see the sketch after the examples below.
Examples:
val f: PartialFunction[Int, Int] = {
  case 1 => 1
  case 3 => 3
}
val g: PartialFunction[Int, Int] = {
  case 2 => 2
  case 3 => 4
}
val fl = List(f, g)
println(callAll(fl, 1))
println(callAll(fl, 3))
println(callAllOption(fl, 2))
println(callAllOptionVarArg(1, f, g))
println(callAllOptionVarArg2(f, g)(3))
List(1)
List(3, 4)
List(None, Some(2))
List(Some(1), None)
List(Some(3), Some(4))
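For alternative 3 (side effects only), a minimal sketch; the name callAllUnit is mine:
def callAllUnit[A](funcs: List[PartialFunction[A, Unit]], a: A): Unit =
  funcs.foreach(f => if (f.isDefinedAt(a)) f(a))   // runs every function defined at a, discarding results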

Scala views -- non-strict vs. lazy

I am trying to create a "lazy" map of objects (actually, they are actors, but I am asking my question with a more trivial example).
Scala views are, in a sense, lazy. But their laziness is really just non-strictness. That's to say, the values are effectively call-by-name, which is in turn to say that the values are evaluated, when required, by invoking a Function0 (a no-parameter function).
What I'm interested in is a collection that is evaluated lazily, but is evaluated only once. Here's the kind of thing I'm looking for:
val x = Map(1->2, 2->2).view
val y = x map {case (k,v) => (k,{println("Hello");v.toString})}
val z1 = y.find{case (k,_) => k==1}
val z2 = y.find{case (k,_) => k==1}
When I put this into a Scala worksheet, what I get is:
x: scala.collection.IterableView[(Int, Int),scala.collection.immutable.Map[Int,Int]] = IterableView(...)
y: scala.collection.IterableView[(Int, String),Iterable[_]] = IterableViewM(...)
Hello
z1: Option[(Int, String)] = Some((1,2))
Hello
z2: Option[(Int, String)] = Some((1,2))
Everything is just as it should be. Except that I don't want to see that second "Hello". In other words, I only want the mapped function (toString) to be invoked once -- when needed.
Does anyone have a suggestion of how to achieve my goal? It's not super-important but I'm curious if it can be done.
You can almost get what you want using a Stream:
scala> val x = TreeMap(1->2, 2->2) // to preserve order
x: scala.collection.immutable.TreeMap[Int,Int] = Map(1 -> 2, 2 -> 2)
scala> val y = x.toStream map {case (k,v) => (k,{println(s"Hello $k");v.toString})}
Hello 1
y: scala.collection.immutable.Stream[(Int, String)] = Stream((1,2), ?)
scala> y.find{case (k,_) => k==1}
res8: Option[(Int, String)] = Some((1,2))
scala> y.find{case (k,_) => k==2}
Hello 2
res9: Option[(Int, String)] = Some((2,2))
As you can see, the first element is evaluated strictly, but the others are evaluated and memoized on demand.
If you make the stream itself a lazy val, you get what you want:
scala> val x = TreeMap(1->2, 2->2) // to preserve order
x: scala.collection.immutable.TreeMap[Int,Int] = Map(1 -> 2, 2 -> 2)
scala> lazy val y = x.toStream map {case (k,v) => (k,{println(s"Hello $k");v.toString})}
y: scala.collection.immutable.Stream[(Int, String)] = <lazy>
scala> y.find{case (k,_) => k==1}
Hello 1
res10: Option[(Int, String)] = Some((1,2))
scala> y.find{case (k,_) => k==1}
res11: Option[(Int, String)] = Some((1,2))
If you don't mind evaluating the whole collection at once when you use it, you just need a lazy val, and the collection can stay what it is (Map, List, etc.):
val x = TreeMap(1->2, 2->2)
lazy val y = x map {case (k,v) => (k,{println(s"Hello $k");v.toString})}
I don't think you can have a (really) lazy map, but I'd be happy if someone proved me wrong :)
edit:
You can have a (sort of) lazy map by wrapping your values like this:
class Lazy[T](x: => T) {
  lazy val value = x
  override def toString = value.toString
}

object Lazy {
  implicit def toStrict[T](l: Lazy[T]): T = l.value
}
val x = TreeMap(1->2, 2->2)
lazy val y = x map {case (k,v) => (k, new Lazy({println(s"Hello $k");v.toString}))}
y.find{case (k,v) => v.indexOf("x");k==1} // let's use v to evaluate it, otherwise nothing gets printed
y.find{case (k,v) => v.indexOf("x");k==1}
The implicit conversion allows you to use the wrapped values as if they were of their original type.
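A small usage sketch: thanks to the implicit conversion, a wrapped value can be read back as its original type (here it was already forced by the find above, so nothing new is printed):
val first: String = y(1)   // Lazy[String] unwrapped via Lazy.toStrict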
I would propose an alternative to laziness: make v a function rather than a value. That way you control execution whenever you need it, without relying on the collection's laziness:
val y = x map {case (k,v) => (k,() => {println("Hello");v.toString})}
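A usage sketch for this variant: the thunk only runs when applied, and it runs again on every call (no memoization):
y.find { case (k, _) => k == 1 }.foreach { case (_, thunk) => println(thunk()) }   // prints "Hello", then "2"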
I do not know of any collection API that offers that kind of laziness. However, I think you can achieve what you want with function memoization as described here:
case class Memo[I <% K, K, O](f: I => O) extends (I => O) {
  import collection.mutable.{Map => Dict}
  val cache = Dict.empty[K, O]
  override def apply(x: I) = cache getOrElseUpdate (x, f(x))
}
val x = Map(1->2, 2->2).view
val memo = Memo { v: Int =>
  println("Hello")
  v.toString
}
val y = x.map { case (k, v) =>
  (k, memo(v))
}
val z1 = y.find{case (k,_) => k==1}
val z2 = y.find{case (k,_) => k==1}
output:
Hello
z1: Option[(Int, String)] = Some((1,2))
z2: Option[(Int, String)] = Some((1,2))
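A follow-up observation on this sketch: the cache keys on the mapped value, so looking up the other entry (whose value is also 2) reuses the cached result:
val z3 = y.find { case (k, _) => k == 2 }   // no new "Hello" is printed; z3 is Some((2,2))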

Composing functions that return an option

Suppose I have a few functions of type Int => Option[Int]:
def foo(n: Int): Int => Option[Int] = {x => if (x == n) none else x.some}
val f0 = foo(0)
val f1 = foo(1)
I can compose them with >=> as follows:
val composed: Int => Option[Int] = Kleisli(f0) >=> Kleisli(f1)
Suppose now I need to compose all functions from a list:
val fs: List[Int => Option[Int]] = List(0, 1, 2).map(n => foo(n))
I can do it with map and reduce:
val composed: Int => Option[Int] = fs.map(f => Kleisli(f)).reduce(_ >=> _)
Can it (the composed above) be simplified?
If you want the composition monoid (as opposed to the "run each and sum the results" monoid), you'll have to use the Endomorphic wrapper:
import scalaz._, Scalaz._
val composed = fs.foldMap(Endomorphic.endoKleisli[Option, Int])
And then:
scala> composed.run(10)
res11: Option[Int] = Some(10)
The monoid for kleisli arrows only requires a monoid instance for the output type, while the composition monoid requires the input and output types to be the same, so it makes sense that the latter is only available via a wrapper.
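For comparison, a minimal sketch without Scalaz: the composition is just flatMap chaining over the fs from the question:
val composedPlain: Int => Option[Int] =
  fs.reduce((f, g) => x => f(x).flatMap(g))   // left-to-right Kleisli composition by hand
composedPlain(10)   // Some(10)
composedPlain(1)    // None, because foo(1) sends 1 to None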
Kleisli[Option, A, A] forms a Semigroup via Compose (for any A), so we can use foldMap1:
val composed: Int => Option[Int] = fs.foldMap1(f => Kleisli(f))
Interestingly this doesn't work, though if we pass the correct instance explicitly then it does:
scala> val gs = NonEmptyList(fs.head, fs.tail: _*)
gs: scalaz.NonEmptyList[Int => Option[Int]] = NonEmptyList(<function1>, <function1>, <function1>)
scala> gs.foldMap1(f => Kleisli(f))(Kleisli.kleisliCompose[Option].semigroup[Int])
res20: scalaz.Kleisli[Option,Int,Int] = Kleisli(<function1>)
scala> gs.foldMap1(f => Kleisli(f))(Kleisli.kleisliCompose[Option].semigroup[Int]).apply(1)
res21: Option[Int] = None
I'm not sure where the instance that seems to take priority is coming from.

Suffering from Nothing

How does A end up being Nothing in the process?
def seq2map[A](src: Seq[A]): Map[A, A] = {
  def pair = for {
    f <- src.headOption
    s <- src.headOption
  } yield (f, s)
  Stream continually pair takeWhile (_ isDefined) toMap
}
error: Expression of type Map[Nothing, Nothing] doesn't conform to expected type Map[A, A]
Thank you!
I get
<console>:12: error: Cannot prove that Option[(A, A)] <:< (T, U).
Stream continually pair takeWhile(_ isDefined) toMap
^
because
scala> val src = (1 to 10).toSeq
src: scala.collection.immutable.Range = Range(1, 2, 3, 4, 5, 6, 7, 8, 9, 10)
scala> def pair = for {
| f <- src.headOption
| s <- src.headOption
| } yield (f, s)
pair: Option[(Int, Int)]
is not a pair, but an Option.
scala> (Stream continually pair takeWhile (_.isDefined)).flatten
res0: scala.collection.immutable.Stream[(Int, Int)] = Stream((1,1), ?)
is a stream of pairs.
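To actually build a Map from it, flatten first and bound the stream; a small sketch using the REPL's src and pair (note that the original takeWhile(_.isDefined) never ends for a non-empty src, so calling toMap on it would not terminate):
val m = (Stream continually pair take 3).flatten.toMap   // Map(1 -> 1)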
Just waiting for the game to start.

Universal quantification in generic function type

Reading the paper on types and polymorphism in programming languages, I wondered whether it is possible to express similar universal quantification over types in Scala. An example from the paper:
type GenericID = ∀A.A ↦ A
which is the type of the generic identity function, and the following example in the paper's language Fun is well-typed:
value inst = fun(f: ∀a.a ↦ a) (f[Int], f[Bool])
value intId = fst(inst(id)) // return a function Int ↦ Int
Is there some way to express the similar thing in Scala?
This is not the same as the type constructor type GenericId[A] = A => A, because that is a type operator, whereas ∀A.A ↦ A is the type of a generic function.
Following on from my comment above:
scala> type Gen[+_] = _ => _
defined type alias Gen
scala> def f(x: List[Int]): Gen[List[Int]] = x map (y => s"{$y!$y}")
f: (x: List[Int])Gen[List[Int]]
scala> f(List(1, 4, 9))
res0: Function1[_, Any] = List({1!1}, {4!4}, {9!9})
In other words, identity of types has not been preserved by Gen[+_] = _ => _.
Addendum
scala> type Identity[A] = A => A
defined type alias Identity
scala> def f(x: List[Int]): Identity[List[Int]] = x => x.reverse
f: (x: List[Int])List[Int] => List[Int]
scala> f(List(1, 4, 9))
res1: List[Int] => List[Int] = <function1>
scala> def g(x: List[Int]): Identity[List[Int]] = x => x map (y => s"{$y!$y}")
<console>:35: error: type mismatch;
found : List[String]
required: List[Int]
def g(x: List[Int]): Identity[List[Int]] = x => x map (y => s"{$y!$y}")
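For comparison, a rank-2 "forall" such as ∀a.a ↦ a is usually encoded in Scala as a trait with a polymorphic apply method; a sketch, with names that mirror the Fun example:
trait GenericId {
  def apply[A](a: A): A   // the quantifier lives on the method, not on the trait
}
val id = new GenericId { def apply[A](a: A): A = a }
def inst(f: GenericId): (Int => Int, Boolean => Boolean) =
  (f.apply[Int] _, f.apply[Boolean] _)
val intId: Int => Int = inst(id)._1   // a genuine Int => Int, unlike Gen above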
Try: type Gen[+_] = _ => _
scala> def f(x:List[Int]):Gen[List[Int]] = x.reverse
f: (x: List[Int])Gen[List[Int]]
scala> f(List(3,4))
res0: Function1[_, Any] = List(4, 3)
scala> def f(x:List[Number]):Gen[List[Number]] = x.reverse
f: (x: List[Number])Gen[List[Number]]
scala> f(List(3,4))
res1: Function1[_, Any] = List(4, 3)