I have a (vague) idea: to pass (or chain) some implicit value in this manner, without introducing parameters to the block f:
def block(x: Int)(f: => Unit)(implicit v: Int) = {
implicit val nv = v + x
f
}
def fun(implicit v: Int) = println(v)
such that if I wrote something like:
implicit val ii: Int = 0
block(1) {
block(2) {
fun
}
}
It would print 3.
It would work if I could say def block(x: Int)(f: implicit Int => Unit).
In other words, I'm looking for a design pattern that will allow me to implement this DSL: access some cumulative value inside nested blocks, but without explicitly passing it as a parameter. Is it possible? (Implicits are not necessary, just a hint to emphasize that I don't want to pass that accumulator explicitly.) Of course, the code above will print 0.
EDIT: One possible usage: composing HTTP routes, in the following manner:
prefix("path") {
prefix("subpath") {
post("action1") { (req, res) => do action }
get("action2") { (req, res) => do action }
}
}
Here post and get will access (how?) the accumulated prefix, say List("path", "subpath") or "/path/subpath/".
Consider using DynamicVariable for this. It's really simple to use, and thread-safe:
import scala.util.DynamicVariable

val acc: DynamicVariable[Int] = new DynamicVariable(0)
def block(x: Int)(f: => Unit) = {
acc.withValue(acc.value + x)(f)
}
def fun = println(acc.value)
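With these definitions, the nested example from the question prints 3 with no implicits involved. The same trick carries over to the route-prefix DSL; the prefix and get below are my own hypothetical illustration of the idea, not a real routing library:

block(1) {
  block(2) {
    fun // prints 3 (0 + 1 + 2)
  }
}

val currentPrefix = new DynamicVariable[List[String]](Nil)

def prefix(segment: String)(body: => Unit): Unit =
  currentPrefix.withValue(currentPrefix.value :+ segment)(body)

def get(action: String)(handler: String => Unit): Unit =
  handler((currentPrefix.value :+ action).mkString("/", "/", ""))

prefix("path") {
  prefix("subpath") {
    get("action2") { fullPath => println(fullPath) } // prints /path/subpath/action2
  }
}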
Passing state via implicits is dirty and will lead to unexpected, hard-to-track-down bugs. What you're asking for is a function that composes in such a way that nested calls accumulate over some operation, and anything else uses the accumulated value to execute the function:
case class StateAccum[S](init: S)(op: (S, S) => S) {
  def flatMap[A <: S](f: S => StateAccum[A]): StateAccum[S] = {
    val StateAccum(out) = f(init)
    StateAccum(op(init, out))(op)
  }
  def apply[A](f: S => A): A = f(init)
}
which could allow you to do exactly what you're after, with a slight change in how you call it.
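For example, with the StateAccum above and addition as the accumulating operation, a chained call might look like this (a sketch):

val accum = StateAccum(1)(_ + _).flatMap(_ => StateAccum(2)(_ + _))
accum(v => println(v)) // prints 3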
Now, if you really want the nested control structures, your apply would have to use an implicit value to distinguish the return types, so that it applies the function directly in one case and flatMaps over a returned StateAccum in the other. It gets crazy, but it looks like the following:
def apply[A](f: S => A)(implicit mapper: Mapper[S, A]): mapper.Out = mapper(this, f)
trait Mapper[S, A]{
type Out
def apply(s: StateAccum[S], f: S => A): Out
}
object Mapper extends LowPriorityMapper {
  implicit def accum[S, A <: S]: Mapper[S, StateAccum[A]] { type Out = StateAccum[S] } =
    new Mapper[S, StateAccum[A]] {
      type Out = StateAccum[S]
      def apply(s: StateAccum[S], f: S => StateAccum[A]) = s.flatMap(f)
    }
}
trait LowPriorityMapper {
  implicit def fallback[S, A]: Mapper[S, A] { type Out = A } =
    new Mapper[S, A] {
      type Out = A
      def apply(s: StateAccum[S], f: S => A) = f(s.init)
    }
}
I am trying to implicitly add Async and Sync instances in my code for a doobie repository. Sync[F] and Async[F] work fine with IO; I want to convert them to Future and am facing a problem.
I have tried to create my own Async[Future] from IO:
def futureAsync(implicit F: MonadError[Future, Throwable]): Async[Future] = new Async[Future] {
override def async[A](k: (Either[Throwable, A] => Unit) => Unit): Future[A] = IO.async(k).unsafeToFuture()
override def asyncF[A](k: (Either[Throwable, A] => Unit) => Future[Unit]): Future[A] =
throw new Exception("Not implemented Future.asyncF")
override def suspend[A](thunk: => Future[A]): Future[A] = thunk
override def bracketCase[A, B](acquire: Future[A])(use: A => Future[B])(release: (A, ExitCase[Throwable]) => Future[Unit]): Future[B] =
throw new Exception("Not implemented Future.bracketCase")
override def raiseError[A](e: Throwable): Future[A] = F.raiseError(e)
override def handleErrorWith[A](fa: Future[A])(f: Throwable => Future[A]): Future[A] = F.handleErrorWith(fa)(_ => f(new Exception("")))
override def pure[A](x: A): Future[A] = F.pure(x)
override def flatMap[A, B](fa: Future[A])(f: A => Future[B]): Future[B] = F.flatMap(fa)(f)
override def tailRecM[A, B](a: A)(f: A => Future[Either[A, B]]): Future[B] = F.tailRecM(a)(f)
}
I am stuck on the implementation of two functions in there: asyncF and bracketCase.
Can someone help?
As Reactormonk says in a comment above, it's not possible to write an instance of Async for Future that has the right semantics, because Async extends Sync, and Sync requires a representation of a computation that can be run repeatedly, while Scala's futures begin running when they're defined and can't be re-run.
An unlawful instance
It's instructive to see this for yourself, though, and I'd encourage you to try to write your own compile-able but (necessarily) unlawful Async[Future] instance without looking at the next block of code. For the sake of the example, though, here's a quick sketch off the top of my head:
import scala.concurrent.{ExecutionContext, Future, Promise}
import scala.util.{Failure, Success}
import cats.effect.{Async, ExitCase, IO}
def futureAsync(implicit c: ExecutionContext): Async[Future] = new Async[Future] {
def async[A](k: (Either[Throwable, A] => Unit) => Unit): Future[A] =
IO.async(k).unsafeToFuture()
def asyncF[A](k: (Either[Throwable, A] => Unit) => Future[Unit]): Future[A] = {
val p = Promise[A]()
val f = k {
case Right(a) => p.success(a)
case Left(e) => p.failure(e)
}
f.flatMap(_ => p.future)
}
def suspend[A](thunk: => Future[A]): Future[A] = Future(thunk).flatten
def bracketCase[A, B](acquire: Future[A])(use: A => Future[B])(
release: (A, ExitCase[Throwable]) => Future[Unit]
): Future[B] = acquire.flatMap { a =>
use(a).transformWith {
case Success(b) => release(a, ExitCase.Completed).map(_ => b)
case Failure(e) => release(a, ExitCase.Error(e)).flatMap(_ => Future.failed(e))
}
}
def raiseError[A](e: Throwable): Future[A] = Future.failed(e)
def handleErrorWith[A](fa: Future[A])(f: Throwable => Future[A]): Future[A] =
fa.recoverWith { case t => f(t) }
def pure[A](x: A): Future[A] = Future.successful(x)
def flatMap[A, B](fa: Future[A])(f: A => Future[B]): Future[B] = fa.flatMap(f)
def tailRecM[A, B](a: A)(f: A => Future[Either[A, B]]): Future[B] = f(a).flatMap {
case Right(b) => Future.successful(b)
case Left(a) => tailRecM(a)(f)
}
}
This will compile just fine, and would probably work for some situations (but please don't actually use it!). We've said it can't have the right semantics, though, and we can show that by using cats-effect's laws module.
Checking the laws
First we need some boilerplate-y stuff you don't really need to worry about:
import cats.kernel.Eq, cats.implicits._
import org.scalacheck.Arbitrary
implicit val throwableEq: Eq[Throwable] = Eq.by[Throwable, String](_.toString)
implicit val nonFatalArbitrary: Arbitrary[Throwable] =
Arbitrary(Arbitrary.arbitrary[Exception].map(identity))
implicit def futureEq[A](implicit A: Eq[A], ec: ExecutionContext): Eq[Future[A]] =
new Eq[Future[A]] {
private def liftToEither(f: Future[A]): Future[Either[Throwable, A]] =
f.map(Right(_)).recover { case e => Left(e) }
def eqv(fx: Future[A], fy: Future[A]): Boolean =
scala.concurrent.Await.result(
liftToEither(fx).zip(liftToEither(fy)).map {
case (rx, ry) => rx === ry
},
scala.concurrent.duration.Duration(1, "second")
)
}
Then we can define a test that checks the Async laws for our instance:
import cats.effect.laws.discipline.{AsyncTests, Parameters}
import org.scalatest.FunSuite
import org.typelevel.discipline.scalatest.Discipline
object FutureAsyncSuite extends FunSuite with Discipline {
implicit val ec: ExecutionContext = ExecutionContext.global
implicit val params: Parameters =
Parameters.default.copy(allowNonTerminationLaws = false)
checkAll(
"Async",
AsyncTests[Future](futureAsync).async[String, String, String]
)
}
And then we can run the law tests:
scala> FutureAsyncSuite.execute()
FutureAsyncSuite:
- Async.async.acquire and release of bracket are uncancelable
- Async.async.ap consistent with product + map
- Async.async.applicative homomorphism
...
You'll see that most of the tests are green; this instance gets a lot of things right.
Where it breaks the law
It does show three failed tests, though, including the following:
- Async.async.repeated sync evaluation not memoized *** FAILED ***
GeneratorDrivenPropertyCheckFailedException was thrown during property evaluation.
(Discipline.scala:14)
Falsified after 1 successful property evaluations.
Location: (Discipline.scala:14)
Occurred when passed generated values (
arg0 = "淳칇멀",
arg1 = org.scalacheck.GenArities$$Lambda$7154/1834868832#1624ea25
)
Label of failing property:
Expected: Future(Success(驅ṇ숆㽝珅뢈矉))
Received: Future(Success(淳칇멀))
If you look at the laws definitions, you'll see that this is a test that defines a Future value with delay and then sequences it multiple times, like this:
val change = F.delay { /* observable side effect here */ }
val read = F.delay(cur)
change *> change *> read
The other two failures are similar "not memoized" violations. These tests should see the side effect happen twice, but in our case it's not possible to write delay or suspend for Future in such a way that that would happen (it's worth trying, though, to convince yourself that this is the case).
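If you want a quick, self-contained way to convince yourself (this is my own sketch, not part of the laws module), compare how a shared side effect behaves under Future and under IO:

import scala.concurrent.{Await, ExecutionContext, Future}
import scala.concurrent.duration._
import cats.effect.IO

implicit val ec: ExecutionContext = ExecutionContext.global

var counter = 0
val fut = Future { counter += 1; counter }
Await.result(fut.flatMap(_ => fut), 1.second)
println(counter) // 1: the Future's side effect ran once, however often we sequence the value

counter = 0
val io = IO { counter += 1; counter }
io.flatMap(_ => io).unsafeRunSync()
println(counter) // 2: the IO's side effect runs every time the value is sequenced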
What you should do instead
To sum up: you can write an Async[Future] instance that will pass something like 75 of the 78 Async laws tests, but it's not possible to write an instance that will pass all of them, and using an unlawful instance is a really bad idea: both potential users of your code and libraries like Doobie will assume that your instances are lawful, and if you don't live up to this assumption you're opening the door to complex and annoying bugs.
It's worth noting that it's not too hard to write a minimal wrapper for Future that has a lawful Async instance (for example I've got a wrapper for Twitter's future called Rerunnable in my catbird library). You really should just stick with cats.effect.IO, though, and use the provided conversions to convert to and from futures in any parts of your code where you're working with traditional Future-based APIs.
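For what it's worth, the boundary conversions that last sentence refers to look roughly like this (a sketch assuming the cats-effect 2.x API; legacyApi is a made-up stand-in for some Future-based API):

import scala.concurrent.{ExecutionContext, Future}
import cats.effect.{ContextShift, IO}

implicit val cs: ContextShift[IO] = IO.contextShift(ExecutionContext.global)

def legacyApi(): Future[Int] = Future.successful(42) // stand-in for a Future-based API

// Future -> IO at the edge of the program
val fromLegacy: IO[Int] = IO.fromFuture(IO(legacyApi()))

// IO -> Future only where a caller insists on a Future
val forLegacyCaller: Future[Int] = fromLegacy.unsafeToFuture()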
Background
I have been reading the book Functional Programming in Scala, and have some questions regarding the content in Chapter 7: Purely functional parallelism.
Here is the code for the answers in the book: Par.scala, but I am confused about a certain part of it.
Here is the first part of the code of Par.scala, which stands for Parallelism:
import java.util.concurrent._
object Par {
type Par[A] = ExecutorService => Future[A]
def unit[A](a: A): Par[A] = (es: ExecutorService) => UnitFuture(a)
private case class UnitFuture[A](get: A) extends Future[A] {
def isDone = true
def get(timeout: Long, units: TimeUnit): A = get
def isCancelled = false
def cancel(evenIfRunning: Boolean): Boolean = false
}
def map2[A, B, C](a: Par[A], b: Par[B])(f: (A, B) => C): Par[C] =
(es: ExecutorService) => {
val af = a(es)
val bf = b(es)
UnitFuture(f(af.get, bf.get))
}
def fork[A](a: => Par[A]): Par[A] =
(es: ExecutorService) => es.submit(new Callable[A] {
def call: A = a(es).get
})
def lazyUnit[A](a: => A): Par[A] =
fork(unit(a))
def run[A](es: ExecutorService)(a: Par[A]): Future[A] = a(es)
def asyncF[A, B](f: A => B): A => Par[B] =
a => lazyUnit(f(a))
def map[A, B](pa: Par[A])(f: A => B): Par[B] =
map2(pa, unit(()))((a, _) => f(a))
}
The simplest possible model for Par[A] might be ExecutorService => Future[A], and run simply returns the Future.
unit promotes a constant value to a parallel computation by returning a UnitFuture, which is a simple implementation of Future that just wraps a constant value.
map2 combines the results of two parallel computations with a binary function.
fork marks a computation for concurrent evaluation. The evaluation won't actually occur until forced by run. Here it is with its simplest and most natural implementation. Even though it has its problems, let's put them aside for now.
lazyUnit wraps its unevaluated argument in a Par and marks it for concurrent evaluation.
run extracts a value from a Par by actually performing the computation.
asyncF converts any function A => B to one that evaluates its result asynchronously.
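To make these descriptions concrete, here is a small usage sketch of my own (it assumes the Par object above is in scope):

import java.util.concurrent.Executors

val es = Executors.newFixedThreadPool(2)

// both arguments are built with lazyUnit, so each multiplication is submitted as its own task
val sumPar: Par.Par[Int] =
  Par.map2(Par.lazyUnit(3 * 7), Par.lazyUnit(2 * 10))(_ + _)

println(Par.run(es)(sumPar).get) // 41
es.shutdown()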
Questions
The fork function confuses me a lot here, because it takes a lazy argument, which will only be evaluated later, when it is needed. My questions are then mostly about when we should use fork, i.e., when we need lazy evaluation and when we need to have the value directly.
Here is an exercise from the book:
EXERCISE 7.5
Hard: Write this function, called sequence. No additional primitives are required. Do not call run.
def sequence[A](ps: List[Par[A]]): Par[List[A]]
And here is the answers (offered here).
First
def sequence_simple[A](l: List[Par[A]]): Par[List[A]] =
l.foldRight[Par[List[A]]](unit(List()))((h, t) => map2(h, t)(_ :: _))
What is the difference between the above code and the following:
def sequence_simple[A](l: List[Par[A]]): Par[List[A]] =
l.foldLeft[Par[List[A]]](unit(List()))((t, h) => map2(h, t)(_ :: _))
Additionally
def sequenceRight[A](as: List[Par[A]]): Par[List[A]] =
as match {
case Nil => unit(Nil)
case h :: t => map2(h, fork(sequenceRight(t)))(_ :: _)
}
def sequenceBalanced[A](as: IndexedSeq[Par[A]]): Par[IndexedSeq[A]] = fork {
if (as.isEmpty) unit(Vector())
else if (as.length == 1) map(as.head)(a => Vector(a))
else {
val (l,r) = as.splitAt(as.length/2)
map2(sequenceBalanced(l), sequenceBalanced(r))(_ ++ _)
}
}
In sequenceRight, fork is used when the recursive function is called directly. However, in sequenceBalanced, fork is used outside of the whole function body.
Then, what is the difference between the above code and the following (where we have switched the places of fork):
def sequenceRight[A](as: List[Par[A]]): Par[List[A]] = fork {
as match {
case Nil => unit(Nil)
case h :: t => map2(h, sequenceRight(t))(_ :: _)
}
}
def sequenceBalanced[A](as: IndexedSeq[Par[A]]): Par[IndexedSeq[A]] =
if (as.isEmpty) unit(Vector())
else if (as.length == 1) map(as.head)(a => Vector(a))
else {
val (l,r) = as.splitAt(as.length/2)
map2(fork(sequenceBalanced(l)), fork(sequenceBalanced(r)))(_ ++ _)
}
Finally, given the sequence defined above, we have the following function:
def parMap[A,B](ps: List[A])(f: A => B): Par[List[B]] = fork {
val fbs: List[Par[B]] = ps.map(asyncF(f))
sequence(fbs)
}
I would like to know: can I also implement the function in the following way, by applying the lazyUnit defined at the beginning? Is this implementation, lazyUnit(ps.map(f)), lazy?
def parMapByLazyUnit[A, B](ps: List[A])(f: A => B): Par[List[B]] =
lazyUnit(ps.map(f))
I did not completely understand your doubt, but I see a major problem with the following solution:
def parMapByLazyUnit[A, B](ps: List[A])(f: A => B): Par[List[B]] =
lazyUnit(ps.map(f))
To understand the problem, let's look at def lazyUnit:
def fork[A](a: => Par[A]): Par[A] =
(es: ExecutorService) => es.submit(new Callable[A] {
def call: A = a(es).get
})
def lazyUnit[A](a: => A): Par[A] =
fork(unit(a))
So lazyUnit takes an expression of type => A, submits it to the ExecutorService to be evaluated, and returns the result of this parallel computation wrapped as a Par[A].
In parMap, for every element of ps: List[A], we not only have to evaluate the corresponding mapping using the function f: A => B, but we also have to do these evaluations in parallel.
But our solution lazyUnit(ps.map(f)) will submit the whole { ps.map(f) } evaluation as a single task to our ExecutorService, which means we are not doing it in parallel.
What we need to do is make sure that for each element a in ps: List[A], the function f: A => B is executed as a separate task on our ExecutorService.
Now, as we learned from our implementation, we can run an expression exp of type => A by using lazyUnit(exp) to get a result: Par[A].
So, we will do exactly that for every a: A in ps: List[A],
val parMappedTmp = ps.map( a => lazyUnit(f(a) ) )
// or
val parMappedTmp = ps.map( a => asyncF(f)(a) )
// or
val parMappedTmp = ps.map(asyncF(f))
But now our parMappedTmp is a List[Par[B]], whereas we need a Par[List[B]].
So, you will need a function with the following signature to get what you wanted,
def sequence[A](ps: List[Par[A]]): Par[List[A]]
Once you have it,
val parMapped = sequence(parMappedTmp)
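Putting the pieces together gives essentially the parMap from the question. A rough way to observe the difference between the two versions (my own sketch; it assumes sequence, parMap and parMapByLazyUnit have been added to the Par object above):

import java.util.concurrent.Executors

val es = Executors.newCachedThreadPool()
val xs = List(1, 2, 3, 4)

// one forked task per element (plus the outer fork)
val manyTasks = Par.run(es)(Par.parMap(xs)(_ * 2)).get

// a single forked task that evaluates xs.map(_ * 2) all by itself
val oneTask = Par.run(es)(Par.parMapByLazyUnit(xs)(_ * 2)).get

println(manyTasks == oneTask) // true: same result, very different degree of parallelism
es.shutdown()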
I am currently reading Hutton and Meijer's paper on parser combinators in Haskell: http://www.cs.nott.ac.uk/~pszgmh/monparsing.pdf. Just for the sake of it, I am trying to implement them in Scala. I would like to construct something that is easy to code and extend, and also simple and elegant. I have come up with two solutions for the following Haskell code:
-- Haskell code
type Parser a = String -> [(a,String)]
result :: a -> Parser a
result v = \inp -> [(v,inp)]
zero :: Parser a
zero = \inp -> []
item :: Parser Char
item = \inp -> case inp of
[] -> []
(x:xs) -> [(x,xs)]
/* Scala Code */
object Hutton1 {
type Parser[A] = String => List[(A, String)]
def Result[A](v: A): Parser[A] = str => List((v, str))
def Zero[A]: Parser[A] = str => List()
def Character: Parser[Char] = str => if (str.isEmpty) List() else List((str.head, str.tail))
}
object Hutton2 {
trait Parser[A] extends (String => List[(A, String)])
case class Result[A](v: A) extends Parser[A] {
def apply(str: String) = List((v, str))
}
case object Zero extends Parser[T forSome {type T}] {
def apply(str: String) = List()
}
case object Character extends Parser[Char] {
def apply(str: String) = if (str.isEmpty) List() else List((str.head, str.tail))
}
}
object Hutton extends App {
object T1 {
import Hutton1._
def run = {
val r: List[(Int, String)] = Zero("test") ++ Result(5)("test")
println(r.map(x => x._1 + 1) == List(6))
println(Character("abc") == List(('a', "bc")))
}
}
object T2 {
import Hutton2._
def run = {
val r: List[(Int, String)] = Zero("test") ++ Result(5)("test")
println(r.map(x => x._1 + 1) == List(6))
println(Character("abc") == List(('a', "bc")))
}
}
T1.run
T2.run
}
Question 1
In Haskell, zero is a function value that can be used as it is, expressing all failed parsers, whether they are of type Parser[Int] or Parser[String]. In Scala we achieve the same by calling the function Zero (1st approach), but this way I believe I just generate a different function every time Zero is called. Is this statement true? Is there a way to mitigate this?
Question 2
In the second approach, the Zero case object extends Parser using the existential type Parser[T forSome {type T}]. If I replace the type with Parser[_] I get the compile error
Error:(19, 28) class type required but Hutton2.Parser[_] found
case object Zero extends Parser[_] {
^
I thought these two expressions were equivalent. Is this the case?
Question 3
Which of the two approaches do you think will yield better results in expressing the combinators, in terms of elegance and simplicity?
I use Scala 2.11.8.
Note: I didn't compile it, but I know the problem and can propose two solutions.
The more Haskellish way would be to not use subtyping, but to define zero as a polymorphic value. In that style, I would propose to define parsers not as objects deriving from a function type, but as values of one case class:
final case class Parser[T](run: String => List[(T, String)])
def zero[T]: Parser[T] = Parser(...)
As shown by #Alec, yes, this will produce a new value every time, since a def is compiled to a method.
If you want to use subtyping, you need to make Parser covariant. Then you can give zero a bottom result type:
trait Parser[+A] extends (String => List[(A, String)])
case object Zero extends Parser[Nothing] {...}
These are in some way quite related: in System F_<:, which is the basis of what Scala uses, the types ⊥ (written _|_, aka Nothing) and ∀T <: Any. T behave the same (this is hinted at in Types and Programming Languages, chapter 28). The two possibilities given here are a consequence of this fact.
I'm not that familiar with existentials, but I think that while the unbounded T forSome {type T} will behave like Nothing, Scala does not allow inheritance from an existential type.
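A quick illustration of why the covariant version addresses Question 1 (assuming the elided Zero body just returns an empty list):

val zeroInt: Parser[Int] = Zero   // fine: Parser[Nothing] <: Parser[Int]
val zeroChar: Parser[Char] = Zero // the same singleton value, no new function allocated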
Question 1
I think that you are right, and here is why: Zero1 below prints "hi" every time you use it. The solution, Zero2, involves using a val instead.
def Zero1[A]: Parser[A] = { println("hi"); str => List() }
val Zero2: Parser[Nothing] = str => List()
Question 2
No idea. I'm still just starting out with Scala. Hope someone answers this.
Question 3
The trait one will play better with Scala's for (since you can define custom flatMap and map), which turns out to be (somewhat) like Haskell's do. The following is all you need.
trait Parser[A] extends (String => List[(A, String)]) {
def flatMap[B](f: A => Parser[B]): Parser[B] = {
val p1 = this
new Parser[B] {
def apply(s1: String) = for {
(a,s2) <- p1(s1)
p2 = f(a)
(b,s3) <- p2(s2)
} yield (b,s3)
}
}
def map[B](f: A => B): Parser[B] = {
val p = this
new Parser[B] {
def apply(s1: String) = for ((a,s2) <- p(s1)) yield (f(a),s2)
}
}
}
Of course, to do anything interesting you need more parsers. I'll propose one simple parser combinator: choice(p1: Parser[A], p2: Parser[A]): Parser[A], which tries both parsers. (And I'll rewrite your existing parsers more in my style.)
def choice[A](p1: Parser[A], p2: Parser[A]): Parser[A] = new Parser[A] {
def apply(s: String): List[(A,String)] = { p1(s) ++ p2(s) }
}
def unit[A](x: A): Parser[A] = new Parser[A] {
def apply(s: String): List[(A,String)] = List((x,s))
}
val character: Parser[Char] = new Parser[Char] {
  def apply(s: String): List[(Char,String)] = if (s.isEmpty) List() else List((s.head, s.tail))
}
Then, you can write something like the following:
val parser: Parser[(Char,Char)] = for {
  x <- choice(unit('x'), character)
  y <- character
} yield (x,y)
And calling parser("xyz") gives you List((('x','x'),"yz"), (('x','y'),"z")).
I have a collection of methods that return different types:
Either[ErrorResponse, X]
Future[Either[ErrorResponse, X]]
Option[ErrorResponse]
These methods need the result from a previous method to perform their computation. The methods:
type Parameters = Map[String, String]
// allows me to flatmap on an either
implicit def toRightProjection[Failure, Success](e: Either[Failure, Success]) =
e.right
// converts anything to a future
implicit def toFuture[T](t: T) =
Future.successful(t)
// retrieves the request paramters from the given request
def requestParameters(request: RequestHeader): Either[ErrorResponse, Parameters] = ???
// retrieves the response type from the given parameters
def responseType(p: Parameters): Either[ErrorResponse, String] = ???
// retrieves the client id from the given parameters
def clientId(p: Parameters): Either[ErrorResponse, String] = ???
// retrieves the client using the given client id
def client(clientId: String): Future[Either[ErrorResponse, Client]] = ???
// validates the response type of the client
def validateResponseType(client: Client, responseType: String): Option[ErrorResponse] = ???
I can then wire them together with the following for comprehension (note that I wrote down some types to clarify the contents of specific parts of the computation).
val result: Either[ErrorResponse, Future[Either[ErrorResponse, Client]]] =
for {
parameters <- requestParameters(request)
clientId <- clientId(parameters)
responseType <- responseType(parameters)
} yield {
val result: Future[Either[ErrorResponse, Either[ErrorResponse, Client]]] =
for {
errorOrClient <- client(clientId)
client <- errorOrClient
} yield validateResponseType(client, responseType).toLeft(client)
result.map(_.joinRight)
}
val wantedResult: Future[Either[ErrorResponse, Client]] =
result.left.map(Future successful Left(_)).merge
The above code is quite messy and I feel this can be done differently. I read about monads and monad transformers. The concept of those is very new to me and I can not get my head around it.
Most of the examples only deal with two types of results: Either[X, Y] and Future[Either[X, Y]]. I still find it very hard to bend my mind around it.
How can I write a nice and clean for comprehension that replaces the above one?
Something like this would be awesome (I am not sure if that is even possible):
val result: Future[Either[ErrorResponse, Client]] =
for {
parameters <- requestParameters(request)
clientId <- clientId(parameters)
responseType <- responseType(parameters)
client <- client(clientId)
_ <- validateResponseType(client, responseType)
} yield client
OK, here is my attempt at this:
import scala.concurrent.{ExecutionContext, Future}
import ExecutionContext.Implicits.global
import scalaz._, Scalaz._
implicit val futureMonad = new Monad[Future] {
override def point[A](a: ⇒ A): Future[A] = Future(a)
override def bind[A, B](fa: Future[A])(f: A ⇒ Future[B]): Future[B] =
fa.flatMap(f)
}
import EitherT._
val result: EitherT[Future, ErrorResponse, Client] =
for {
parameters <- fromEither(Future(requestParameters(request)))
clientId <- fromEither(Future(clientId(parameters)))
responseType <- fromEither(Future(responseType(parameters)))
client <- fromEither(client(clientId))
response <- fromEither[Future, ErrorResponse, Client](Future(validateResponseType(client, responseType).toLeft(client)))
} yield response
val x: Future[\/[ErrorResponse, Client]] = result.run
scala.util.Either is not a Monad, but the scalaz library has a great implementation.
import scala.concurrent.Future
import scalaz.{ Monad, Functor, EitherT, \/, -\/, \/- }
import scalaz.syntax.ToIdOps
object Test extends ToIdOps {
implicit val FutureFunctor = new Functor[Future] {
def map[A, B](a: Future[A])(f: A => B): Future[B] = a map f
}
implicit val FutureMonad = new Monad[Future] {
def point[A](a: => A): Future[A] = Future(a)
def bind[A, B](fa: Future[A])(f: (A) => Future[B]): Future[B] = fa flatMap f
}
def someMethod: Future[\/[InvalidData, ValidData]] = {
// things went well
ValidData.right // this comes from ToIdOps
// or something went wrong
InvalidData.left
}
def someOtherMethod: Future[\/[InvalidData, ValidData]] = ??? // same as above
val seq = for {
d <- EitherT(someMethod)
y <- EitherT(someOtherMethod)
} yield { /* whatever */ }
// you can now Await.result(seq.run, duration)
// you can map or match etc with \/- and -\/
val result = seq.run map {
case -\/(left) => // invalid data
case \/-(right) => // game on
}
}
There is no really clean way to do comprehensions over multiple monad types. In Scalaz there is OptionT that might help; it's worth checking out. You could also transform your Eithers to Options, or the other way around, so you end up with a little bit less of a mess. A third option might be to create your own kind of wrapper that combines Future[Either|Option] into the same monad and then comprehend over that (a sketch of this is given below).
For reference, I asked roughly the same question on the Play framework mailing list recently and got some good links in the replies: https://groups.google.com/d/topic/play-framework/JmCsXNDvAns/discussion
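To illustrate that third option, here is a minimal sketch of such a hand-rolled wrapper; the FutureEither name and its helpers are hypothetical, just to show the shape of the idea:

import scala.concurrent.{ExecutionContext, Future}

final case class FutureEither[L, R](run: Future[Either[L, R]]) {
  def map[R2](f: R => R2)(implicit ec: ExecutionContext): FutureEither[L, R2] =
    FutureEither(run.map(_.right.map(f)))
  def flatMap[R2](f: R => FutureEither[L, R2])(implicit ec: ExecutionContext): FutureEither[L, R2] =
    FutureEither(run.flatMap {
      case Left(l)  => Future.successful(Left(l))
      case Right(r) => f(r).run
    })
}

object FutureEither {
  def fromEither[L, R](e: Either[L, R]): FutureEither[L, R] =
    FutureEither(Future.successful(e))
  // an Option[L] is treated as "Some means failure", matching validateResponseType
  def fromOptionError[L, R](o: Option[L], otherwise: => R): FutureEither[L, R] =
    FutureEither(Future.successful(o.toLeft(otherwise)))
}

// with an implicit ExecutionContext in scope, the desired for comprehension then becomes:
// for {
//   parameters   <- FutureEither.fromEither(requestParameters(request))
//   clientId     <- FutureEither.fromEither(clientId(parameters))
//   responseType <- FutureEither.fromEither(responseType(parameters))
//   client       <- FutureEither(client(clientId))
//   _            <- FutureEither.fromOptionError(validateResponseType(client, responseType), ())
// } yield client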
I'm basically following the example given at the Scala API page for delimited continuations. The code below works fine:
import scala.util.continuations._
import scala.collection.mutable.HashMap
val sessions = new HashMap[Int, Int=>Unit]
def ask(prompt: String): Int #cps[Unit] = shift {
ret: (Int => Unit) => {
val id = sessions.size
printf("%s\nrespond with: submit(0x%x, ...)\n", prompt, id)
sessions += id -> ret
}
}
def submit(id: Int, addend: Int): Unit = {
sessions.get(id) match {
case Some(continueWith) => continueWith(addend)
}
}
def go = reset {
println("Welcome!")
val first = ask("Please give me a number")
val second = ask("Please enter another number")
printf("The sum of your numbers is: %d\n", first + second)
}
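For reference, a typical REPL interaction with this version looks roughly like this (the session ids are just whatever sessions.size happens to be at the time of each ask):

scala> go
Welcome!
Please give me a number
respond with: submit(0x0, ...)

scala> submit(0, 1)
Please enter another number
respond with: submit(0x1, ...)

scala> submit(1, 2)
The sum of your numbers is: 3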
However, when I modify go to the following:
def go = reset {
println("Welcome!")
List("First?","Second?").map[Int #cps[Unit]](ask)
}
I get this error:
error: wrong number of type parameters for method map: [B, That](f: String => B)
(implicit bf: scala.collection.generic.CanBuildFrom[List[String],B,That])That
List("First?","Second?").map[Int #cps[Unit]](ask)
^
Adding Any as a second type parameter doesn't help. Any idea what types I should be supplying?
The reason is that this is simply not possible without creating a CPS-transformed map method on List: the CPS annotations make the compiler turn your methods “inside out” in order to pass the continuation back to where it is needed and the standard List.map does not obey the transformed contract. If you want to have your mind wrapped in Klein bottles for a while you may look at the class files produced from your source, in particular the method signatures.
This is the primary reason why the CPS plugin will never be a complete generic solution to this problem, which is not due to a deficiency but caused by an inherent mismatch between “straight” code and continuation passing style.
You need to give the correct type parameters for the CanBuildFrom implicit to be found:
List("First?","Second?").map[Int #cps[Unit], List[Int #cps[Unit]]](ask)
But do you really need to be explicit about the type? Maybe just doing .map(ask) will work.
Here's the closest thing I could work out. It uses shiftR to reify the continuation rather than reset it, uses a foldRight to construct the suspended continuation chain, uses a shift/reset block to get the continuation after the suspension, and an "animate" method to kick off the suspended continuation.
import scala.collection.mutable.HashMap
import scala.util.continuations._
val sessions = new HashMap[Int, (Unit=>Unit, Int)]
val map = new HashMap[Int, Int]
def ask(pair:(String, Int)) = pair match {
case (prompt, index) => shiftR { (ret: Unit => Unit) => {
val id = sessions.size
printf("%s\nrespond with: submit(0x%x, ...)\n", prompt, id)
sessions += id -> (ret, index)
()
}}
}
def submit(id: Int, addend: Int): Unit = {
sessions.get(id) match {
case Some((continue, index)) => { map.put(index, addend); continue() }
}
}
def sum(m:HashMap[Int,Int]) : Int = {
m.fold[(Int, Int)]((0, 0))((a, b) => (0, {a._2+b._2}))._2
}
type Suspended = ControlContext[Unit,Unit,Unit]
class AnimateList(l:List[Suspended]) {
def suspend(k: => Unit) = (c: Unit) => k
def animate(k:Unit => Unit): Unit = {
l.foldRight(k)(
(elem: Suspended, acc: Unit => Unit) => suspend(elem.fun(acc, ex => ())))()
}
}
implicit def listToAnimateList(l:List[Suspended]) = new AnimateList(l)
reset {
val conts = List("First?","Second?","Third?").zipWithIndex.map(ask)
shift { conts.animate }
println(sum(map))
}