How to simplify future result handling in Akka/Futures?

I want to simplify my for-comprehension code as much as possible.
Here is the code:
import akka.actor.{Actor, ActorSystem, Props}
import akka.pattern.ask
import akka.util.Timeout
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

case object Message
class SimpleActor extends Actor {
def receive = {
case Message => sender ! Future { "Hello" }
}
}
object SimpleActor extends App {
val test = ActorSystem("Test")
val sa = test.actorOf(Props[SimpleActor])
implicit val timeout = Timeout(2.seconds)
val fRes = for {
f <- (sa ? Message).asInstanceOf[Future[Future[String]]]
r <- f
} yield r
println {
Await.result(fRes, 5.seconds)
}
}
Is it possible to get rid of this part
.asInstanceOf[Future[Future[String]]]
?

Look at the pipeTo function, which is all about passing a Future's result between actors. See the Akka documentation on the ask pattern.
myFuture pipeTo sender
With pipeTo, the actor replies with a plain String, but ask still returns Future[Any], so you need mapTo[String] to actually get the result typed as a String. Your for-comprehension can then be dropped and the result obtained directly:
val fRes = (sa ? Message).mapTo[String]
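For completeness, here is a minimal sketch of the actor side using pipeTo; it reuses the Message object from the question, and the context.dispatcher import is an assumption that supplies the ExecutionContext both the Future and pipeTo need:

import akka.actor.Actor
import akka.pattern.pipe
import scala.concurrent.Future

class SimpleActor extends Actor {
  import context.dispatcher // ExecutionContext for the Future and for pipeTo

  def receive = {
    case Message =>
      // Keep the Future inside the actor and pipe its result (a plain String)
      // back to the asker, instead of replying with the Future itself.
      Future { "Hello" }.pipeTo(sender())
  }
}

With the actor replying with a String rather than a Future, (sa ? Message).mapTo[String] yields the result directly.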

Related

Wait N seconds for actor to return in Scala

I am working on a project that uses Akka actors to handle some backend processes. I have created a method that calls the actor, but it does not return instantly. Therefore, I want a timeout of 5 seconds: if the actor returns within 5 seconds, my method returns that result; if not, a default value is returned.
Just to be clear, the method calling the actor is not itself within another actor; it is in a regular class. Secondly, I have tried using Timeout and Await, and even after importing them, the compiler says they are not found.
Example code:
service.scala
override def foo(): MyType = {
var toReturn = MyType(false, None, None)
implicit val timeout: Timeout = 5.seconds
val result = Await.result(myActor ? fetchData, timeout.duration).asInstanceOf[MyType]
if(result.healthy == true){
toReturn = result
}
toReturn
}
actorClass.scala
override def receive: Receive = {
case fetchData =>
actorFoo()
case _ =>
logger.error("...")
}
...
private def actorFoo(): MyType = {
val num1 = list1.size
val num2 = list2.size
val data = MyType(true, Some(num1), Some(num2))
println(data)
data
}
When I run this code, the data is printed within the actorFoo method, but then nothing happens after. I even tried printing "SUCCESS" after the Await to ensure it has not broken, but even that is not printed.
The ask pattern requires a reply to be sent, and your actor doesn't send a reply:
override def receive: Receive = {
case fetchData =>
actorFoo()
case _ =>
logger.error("...")
}
private def actorFoo(): MyType = {
val num1 = list1.size
val num2 = list2.size
val data = MyType(true, Some(num1), Some(num2))
println(data)
data
}
actorFoo() has a result type of MyType, but the result gets thrown away via a silent conversion to Unit (it's a Scala wart... enabling -Ywarn-value-discard will show the warning).
To explicitly send the result of actorFoo() to the requestor, try something like:
override def receive: Receive = {
case fetchData =>
sender ! actorFoo()
}
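With the reply in place, the caller side can still fall back to a default when nothing arrives within the timeout. A minimal sketch of that part, reusing MyType, myActor and fetchData from the question (the Try wrapper and the default value are illustrative choices, not part of the original code):

import akka.pattern.ask
import akka.util.Timeout
import scala.concurrent.Await
import scala.concurrent.duration._
import scala.util.Try

override def foo(): MyType = {
  implicit val timeout: Timeout = 5.seconds
  // If the actor does not reply in time, the ask/Await fails with a timeout
  // exception and we fall back to the default value instead.
  Try(Await.result((myActor ? fetchData).mapTo[MyType], timeout.duration))
    .getOrElse(MyType(false, None, None))
}

Using mapTo also avoids the asInstanceOf cast from the original snippet.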

MVar tryPut returns true and isEmpty also returns true

I wrote a simple callback (handler) function which I pass to an async API, and I want to wait for the result:
object Handlers {
val logger: Logger = Logger("Handlers")
implicit val cs: ContextShift[IO] =
IO.contextShift(ExecutionContext.Implicits.global)
class DefaultHandler[A] {
val response: IO[MVar[IO, A]] = MVar.empty[IO, A]
def onResult(obj: Any): Unit = {
obj match {
case obj: A =>
println(response.flatMap(_.tryPut(obj)).unsafeRunSync())
println(response.flatMap(_.isEmpty).unsafeRunSync())
case _ => logger.error("Wrong expected type")
}
}
def getResponse: A = {
response.flatMap(_.take).unsafeRunSync()
}
}
}
But for some reason both tryPut and isEmpty (when I manually call the onResult method) return true, so when I call getResponse it sleeps forever.
This is my test:
class HandlersTest extends FunSuite {
test("DefaultHandler.test") {
val handler = new DefaultHandler[Int]
handler.onResult(3)
val response = handler.getResponse
assert(response != 0)
}
}
Can somebody explain why tryPut returns true but nothing is actually put? And what is the right way to use MVar/channels in Scala?
IO[X] means that you have a recipe to create some X. So in your example, you are putting into one MVar and then taking from another.
Here is how I would do it.
object Handlers {
trait DefaultHandler[A] {
def onResult(obj: Any): IO[Unit]
def getResponse: IO[A]
}
object DefaultHandler {
def apply[A : ClassTag]: IO[DefaultHandler[A]] =
MVar.empty[IO, A].map { response =>
new DefaultHandler[A] {
override def onResult(obj: Any): IO[Unit] = obj match {
case obj: A =>
for {
r1 <- response.tryPut(obj)
_ <- IO(println(r1))
r2 <- response.isEmpty
_ <- IO(println(r2))
} yield ()
case _ =>
IO(logger.error("Wrong expected type"))
}
override def getResponse: IO[A] =
response.take
}
}
}
}
The "unsafe" is sort of a hint, but every time you call unsafeRunSync, you should basically think of it as an entire new universe. Before you make the call, you can only describe instructions for what will happen, you can't actually change anything. During the call is when all the changes occur. Once the call completes, that universe is destroyed, and you can read the result but no longer change anything. What happens in one unsafeRunSync universe doesn't affect another.
You need to call it exactly once in your test code. That means your test code needs to look something like:
val test = for {
handler <- Handlers.DefaultHandler[Int]
_ <- handler.onResult(3)
response <- handler.getResponse
} yield response
assert(test.unsafeRunSync() == 3)
Note this doesn't really buy you much over just using the MVar directly. I think you're trying to mix side effects inside IO and outside it, but that doesn't work. All the side effects need to be inside.
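To illustrate "using the MVar directly", here is a minimal sketch of a single IO program in cats-effect 2 style, matching the imports implied by the question; the Int element type and the value 3 are illustrative:

import cats.effect.{ContextShift, IO}
import cats.effect.concurrent.MVar
import scala.concurrent.ExecutionContext

implicit val cs: ContextShift[IO] = IO.contextShift(ExecutionContext.global)

// One IO program: create the MVar, put a value into it, take it back out.
val program: IO[Int] =
  for {
    mvar   <- MVar.empty[IO, Int]
    _      <- mvar.put(3)
    result <- mvar.take
  } yield result

println(program.unsafeRunSync()) // prints 3

The key point is that the same MVar value is threaded through one for-comprehension, rather than re-created by running the IO[MVar[...]] recipe twice.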

Iterate data source asynchronously in batch and stop while remote return no data in Scala

Let's say we have a fake data source which returns the data it holds in batches:
class DataSource(size: Int) {
private var s = 0
implicit val g = scala.concurrent.ExecutionContext.global
def getData(): Future[List[Int]] = {
s = s + 1
Future {
Thread.sleep(Random.nextInt(s * 100))
if (s <= size) {
List.fill(100)(s)
} else {
List()
}
}
}
}
object Test extends App {
val source = new DataSource(100)
implicit val g = scala.concurrent.ExecutionContext.global
def process(v: List[Int]): Unit = {
println(v)
}
def next(f: (List[Int]) => Unit): Unit = {
val fut = source.getData()
fut.onComplete {
case Success(v) => {
f(v)
v match {
case h :: t => next(f)
}
}
}
}
next(process)
Thread.sleep(1000000000)
}
I have my attempt above, but the problem is that part of it is not pure. Ideally, I would like to wrap the Future for each batch into one big future, with the wrapper future succeeding when the last batch returns an empty list. My situation differs a little from this post: the next() there is a synchronous call, while mine is async.
Is it even possible to do what I want? The next batch should only be fetched once the previous one has resolved; in the end, whether to fetch the next batch depends on the size returned.
What's the best way to walk through this type of data source? Are there any existing Scala frameworks that provide the feature I am looking for? Are Play's Iteratee, Enumerator and Enumeratee the right tool? If so, can anyone provide an example of how to use those facilities to implement what I am looking for?
Edit:
With help from chunjef, I have just tried it out, and it actually worked for me. However, I made a small change based on his answer.
Source.fromIterator(()=>Iterator.continually(source.getData())).mapAsync(1) (f=>f.filter(_.size > 0))
.via(Flow[List[Int]].takeWhile(_.nonEmpty))
.runForeach(println)
However, can someone give a comparison between Akka Streams and Play's Iteratee? Is it worth my trying out Iteratee as well?
Code snippet 1:
Source.fromIterator(() => Iterator.continually(ds.getData)) // line 1
.mapAsync(1)(identity) // line 2
.takeWhile(_.nonEmpty) // line 3
.runForeach(println) // line 4
Code snippet 2: Assuming getData depends on some output of another flow, I would like to concatenate it with the flow below. However, it yields a "too many open files" error. I am not sure what causes it; mapAsync has its throughput limited to 1, if I understood correctly.
Flow[Int].mapConcat[Future[List[Int]]](c => {
Iterator.continually(ds.getData(c)).to[collection.immutable.Iterable]
}).mapAsync(1)(identity).takeWhile(_.nonEmpty).runForeach(println)
The following is one way to achieve the same behavior with Akka Streams, using your DataSource class:
import scala.concurrent.Future
import scala.util.Random
import akka.actor.ActorSystem
import akka.stream._
import akka.stream.scaladsl._
object StreamsExample extends App {
implicit val system = ActorSystem("Sandbox")
implicit val materializer = ActorMaterializer()
val ds = new DataSource(100)
Source.fromIterator(() => Iterator.continually(ds.getData)) // line 1
.mapAsync(1)(identity) // line 2
.takeWhile(_.nonEmpty) // line 3
.runForeach(println) // line 4
}
class DataSource(size: Int) {
...
}
A simplified line-by-line overview:
line 1: Creates a stream source that continually calls ds.getData if there is downstream demand.
line 2: mapAsync is a way to deal with stream elements that are Futures. In this case, the stream elements are of type Future[List[Int]]. The argument 1 is the level of parallelism: we specify 1 here because DataSource internally uses a mutable variable, and a parallelism level greater than one could produce unexpected results. identity is shorthand for x => x, which basically means that for each Future, we pass its result downstream without transforming it.
line 3: Essentially, ds.getData is called as long as the result of the Future is a non-empty List[Int]. If an empty List is encountered, processing is terminated.
line 4: runForeach here takes a function List[Int] => Unit and invokes that function for each stream element.
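Note also that runForeach materializes a Future[Done] that completes when takeWhile stops the stream, so this already gives you the single "wrapper future" the question asks for. A minimal sketch building on the snippet above (same ds and system; the system.terminate() call is just one illustrative use of the completion):

import akka.Done
import scala.concurrent.Future
import system.dispatcher // ExecutionContext for onComplete

// The materialized value completes once the first empty batch ends the stream.
val done: Future[Done] =
  Source.fromIterator(() => Iterator.continually(ds.getData))
    .mapAsync(1)(identity)
    .takeWhile(_.nonEmpty)
    .runForeach(println)

// For example, shut the actor system down once all batches are processed.
done.onComplete(_ => system.terminate())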
Ideally, I would like to wrap the Future for each batch into a big future, and the wrapper future succeeds when the last batch returns an empty list.
I think you are looking for a Promise.
You would set up a Promise before you start the first iteration.
This gives you promise.future, a Future that you can then use to follow the completion of everything.
In your onComplete, you add a case _ => promise.success().
Something like
def loopUntilDone(f: (List[Int]) => Unit): Future[Unit] = {
val promise = Promise[Unit]
def next(): Unit = source.getData().onComplete {
case Success(v) =>
f(v)
v match {
case h :: t => next()
case _ => promise.success(())
}
case Failure(e) => promise.failure(e)
}
// get going
next()
// return the Future for everything
promise.future
}
// future for everything, this is a `Future[Unit]`
// its `onComplete` will be triggered when there is no more data
val everything = loopUntilDone(process)
You are probably looking for a reactive streams library. My personal favorite (and one I'm most familiar with) is Monix. This is how it will work with DataSource unchanged
import scala.concurrent.duration.Duration
import scala.concurrent.Await
import monix.reactive.Observable
import monix.execution.Scheduler.Implicits.global
object Test extends App {
val source = new DataSource(100)
val completed = // <- this is Future[Unit], completes when foreach is done
Observable.repeatEval(Observable.fromFuture(source.getData())) // re-evaluates getData() for every batch
.flatten // <- Here it's Observable[List[Int]], it has collection-like methods
.takeWhile(_.nonEmpty)
.foreach(println)
Await.result(completed, Duration.Inf)
}
I just figured out that flatMapConcat achieves what I wanted. There is no point in starting another question, as I already have the answer; I am putting my sample code here in case someone is looking for a similar solution.
This type of API is very common for integration between traditional enterprise applications. The DataSource mocks the API, while the Test app object demonstrates how client code can use Akka Streams to consume it.
In my small project the API was provided over SOAP, and I used scalaxb to transform the SOAP interface into Scala async style. With the client calls demonstrated in the Test app object, we can consume the API with Akka Streams. Thanks to all for the help.
class DataSource(size: Int) {
private var transactionId: Long = 0
private val transactionCursorMap: mutable.HashMap[TransactionId, Set[ReadCursorId]] = mutable.HashMap.empty
private val cursorIteratorMap: mutable.HashMap[ReadCursorId, Iterator[List[Int]]] = mutable.HashMap.empty
implicit val g = scala.concurrent.ExecutionContext.global
case class TransactionId(id: Long)
case class ReadCursorId(id: Long)
def startTransaction(): Future[TransactionId] = {
Future {
synchronized {
transactionId += 1
}
val t = TransactionId(transactionId)
transactionCursorMap.update(t, Set(ReadCursorId(0)))
t
}
}
def createCursorId(t: TransactionId): ReadCursorId = {
synchronized {
val c = transactionCursorMap.getOrElseUpdate(t, Set(ReadCursorId(0)))
val currentId = c.foldLeft(0l) { (acc, a) => acc.max(a.id) }
val cId = ReadCursorId(currentId + 1)
transactionCursorMap.update(t, c + cId)
cursorIteratorMap.put(cId, createIterator)
cId
}
}
def createIterator(): Iterator[List[Int]] = {
(for {i <- 1 to 100} yield List.fill(100)(i)).toIterator
}
def startRead(t: TransactionId): Future[ReadCursorId] = {
Future {
createCursorId(t)
}
}
def getData(cursorId: ReadCursorId): Future[List[Int]] = {
synchronized {
Future {
Thread.sleep(Random.nextInt(100))
cursorIteratorMap.get(cursorId) match {
case Some(i) => i.next()
case _ => List()
}
}
}
}
}
object Test extends App {
val source = new DataSource(10)
implicit val system = ActorSystem("Sandbox")
implicit val materializer = ActorMaterializer()
implicit val g = scala.concurrent.ExecutionContext.global
val s = Source.fromFuture(source.startTransaction())
.map { e =>
source.startRead(e)
}
.mapAsync(1)(identity)
.flatMapConcat(
e => {
Source.fromIterator(() => Iterator.continually(source.getData(e)))
})
.mapAsync(5)(identity)
.via(Flow[List[Int]].takeWhile(_.nonEmpty))
.runForeach(println)
/*
val done = Source.fromIterator(() => Iterator.continually(source.getData())).mapAsync(1)(identity)
.via(Flow[List[Int]].takeWhile(_.nonEmpty))
.runFold(List[List[Int]]()) { (acc, r) =>
// println("=======" + acc + r)
r :: acc
}
done.onSuccess {
case e => {
e.foreach(println)
}
}
done.onComplete(_ => system.terminate())
*/
}

Response in akka using ask pattern

I am trying to get a response from my actor using the ask pattern.
In the ask response I get List(scala.concurrent.impl.CallbackRunnable@33d40b3), but I expect to get the string "1,2".
How can I get the expected result?
This is my code:
class Storage extends Actor {
var map:ListBuffer[List[String]] = new ListBuffer
val logger = Logging(context.system,this)
override def receive = {
case setRequest(url, urlType)=>
map+=List( url, urlType)
logger.info(s"Putting ${url} to storage")
sender() ! Status.Success
case getRequest()=>
if (map.length >=1){
var response= map(0).mkString(",")
logger.info(s"Send ${response}")
map = map.filter(x => x.mkString(",") != response)
sender ! response
}
else{
sender()! Status.Failure(new emptyStorage)
}
case getLength()=>
sender()! map.length
}
}
object Main extends App{
implicit val timeout = Timeout(5 seconds)
val system = ActorSystem.create("default-dispatcher", ConfigFactory.load().getConfig("MyDispatcherExample"))
val storage = system.actorOf(Props(new Storage))
storage ! setRequest("1", "2")
val result = Future {storage ? getRequest }
result onComplete{
case Success(result)=> println(result)
case Failure(result)=> println("some error")
}
}
The ask pattern by itself returns a Future, which you're wrapping in an additional Future, giving you Future[Future[Any]]. There's no need for that:
val result = storage ? getRequest
result onComplete {
case Success(res) => println(res)
case Failure(e) => println(e)
}
Additionally, when you pass data around in Akka, especially generic data types which are subject to type erasure, it is recommended to wrap them in a case class. For example:
case class Response(result: String)
Also, the Future returned by ask is untyped (a Future[Any]). It is recommended to use mapTo to cast it to a typed response:
val result: Future[Response] = (storage ? getRequest).mapTo[Response]
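Putting both pieces together, here is a minimal sketch; it assumes an implicit Timeout and ExecutionContext are in scope as in the question's Main, and the actor-side lines are shown as comments since they slot into the existing Storage receive:

import akka.pattern.ask
import scala.concurrent.Future
import scala.util.{Failure, Success}

case class Response(result: String)

// In Storage's receive, reply with the wrapper instead of a bare String:
//   case getRequest() =>
//     ...
//     sender() ! Response(response)

// On the calling side, send an instance (getRequest()) so it matches the
// actor's case getRequest() pattern, then narrow the ask result:
val result: Future[Response] = (storage ? getRequest()).mapTo[Response]
result.onComplete {
  case Success(Response(value)) => println(value) // expected "1,2"
  case Failure(e)               => println(s"some error: $e")
}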

only once in while loop with scala

I'm beginning with Scala. I have a program with a method containing a while loop that runs until the program ends.
But for my test, I need to execute this method only once (or twice). In Java, I would have used a mutable variable that I decrement in order to stop the processing.
Maybe a condition inside my while loop that I override for my test?
def receive = {
val iterator = stream.iterator()
while (iterator.hasNext && my_condition()) {
something_to_do
}
}
I know it's a stupid question, but could you please advise me?
Try:
iterator.takeWhile(my_condition).foreach(something_to_do)
or:
iterator.take(n).foreach(something_to_do)
if you just want the first n entries.
Or, if something_to_do returns a result (rather than Unit), and you want to return an iterator of those results, you can use:
iterator.takeWhile(my_condition).map(something_to_do)
(or .take(n).map(...) )
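For a concrete feel of the difference, here is a small self-contained example on a plain iterator; the values are illustrative stand-ins for my_condition and something_to_do:

val numbers = Iterator(1, 2, 3, 4, 5)

// Run the body only for the first entry, e.g. in a test:
numbers.take(1).foreach(println) // prints 1

// Or run it only while a condition holds:
val more = Iterator(2, 3, 4, 5)
more.takeWhile(_ < 4).foreach(println) // prints 2 and 3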
Consider this for comprehension,
for (_ <- iterator if my_condition()) something_to_do
where each iterated value is ignored (note _) and the todo part is invoked while the condition holds.
I think an approach like the following is acceptable:
import akka.actor.{Props, Actor}
import scala.io.Source
object TestableActor {
def props = Props(new TestableActor())
def testProps = Props(new TestableActor(true))
case class Message(stream: Stream[Int]) // element type Int is illustrative
}
class TestableActor(doOnce: Boolean = false) extends Actor {
import TestableActor._
val stream: Stream[Int] = ???
def receive = {
case Message(stream) =>
val iterator = stream.iterator
if(doOnce) {
something_to_do
} else {
while (iterator.hasNext && my_condition()) {
something_to_do
}
}
}
def my_condition(): Boolean = ???
def something_to_do: Unit = ???
}
In your production code, use
context.actorOf(TestableActor.props)
In your test use
TestActorRef[TestableActor](TestableActor.testProps)