scala-cats EitherT: chaining futures - scala

I am following this. The structure of my program is:
(for {
  data <- myService.fetch(id).eitherT
  values <- anotherService.fetch(data.id).eitherT
  // todo: process values as part of the for-comprehension
} yield data.id).value
myService.fetch returns Future[...].
anotherService.fetch also returns Future[...], and values is a Seq.
I want to process the entries in values using foreach:
values.foreach(value => service.moreAction(value))
moreAction also returns Future[...].
Hence my aim is to chain it with the previous Futures in the for-comprehension itself.
What is the idiomatic way of doing it?

In Cats, if you want to go through all the values of some data, apply an effect to each of them, and then get the data back inside that effect:
DataOf[A] => (A => Effect[B]) => Effect[DataOf[B]]
you use the Traverse type class with its extension methods.
(Think of it as a generalization of Future.sequence, which takes Seq[Future[A]] and returns Future[Seq[A]], but one which performs seq.map(toFuture) and Future.sequence in one step.)
import cats.syntax.traverse._
for {
  data <- myService.fetch(id).eitherT
  values <- anotherService.fetch(data.id).eitherT
  // Traverse is not available for Seq here; it requires a specific type, e.g. List
  result <- values.toList.traverse(value => service.moreAction(value).eitherT)
} yield result
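For completeness, here is a minimal, self-contained sketch of the same shape. The service methods, the String error type and the Result alias are hypothetical stand-ins (not the question's code), and it assumes cats is on the classpath with an implicit ExecutionContext in scope:
import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global
import cats.data.EitherT
import cats.implicits._

object TraverseSketch {
  type Result[A] = EitherT[Future, String, A] // String stands in for your error type

  def fetchIds(id: Int): Result[List[Int]] =
    EitherT.rightT[Future, String](List(1, 2, 3)) // stand-in for anotherService.fetch

  def moreAction(value: Int): Result[Int] =
    EitherT.rightT[Future, String](value * 2) // stand-in for service.moreAction

  def program(id: Int): Future[Either[String, List[Int]]] =
    (for {
      values <- fetchIds(id)
      results <- values.traverse(moreAction) // List of effects collapsed into Result[List[Int]]
    } yield results).value
}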

Related

Losing types on sequencing Futures

I'm trying to do this:
case class ConversationData(members: Seq[ConversationMemberModel], messages: Seq[MessageModel])
val membersFuture: Future[Seq[ConversationMemberModel]] = ConversationMemberPersistence.searchByConversationId(conversationId)
val messagesFuture: Future[Seq[MessageModel]] = MessagePersistence.searchByConversationId(conversationId)
Future.sequence(List(membersFuture, messagesFuture)).map { result =>
  // some magic here
  self ! ConversationData(members, messages)
}
But when I'm sequencing the two futures, the compiler is losing the types. It says that the type of result is List[Seq[Product with Serializable]]. At the beginning I expected to do something like
Future.sequence(List(membersFuture, messagesFuture)).map{ members, messages => ...
But it looks like sequencing futures doesn't work like this... I also tried using a collect inside the map but I get similar errors.
Thanks for your help
When using Future.sequence, the assumption is that the underlying types produced by the multiple Futures are the same (or extend from the same parent type). With sequence, you basically invert a Seq of Futures for a particular type to a single Future for a Seq of that particular type. A concrete example is probably more illustrative of that point:
val f1:Future[Foo] = ...
val f2:Future[Foo] = ...
val f3:Future[Foo] = ...
val futures:List[Future[Foo]] = List(f1, f2, f3)
val aggregateFuture:Future[List[Foo]] = Future.sequence(futures)
So you can see that I went from a List of Future[Foo] to a single Future wrapping a List[Foo]. You use this when you already have a bunch of Futures for results of the same type (or base type) and you want to aggregate all of the results for the next processing step. The sequence method produces a new Future that won't be completed until all of the aggregated Futures are done, and it will then contain the aggregated results of all of those Futures. This works especially well when you have an indeterminate or variable number of Futures to process.
For your case, it seems that you have a fixed number of Futures to handle. As @Zoltan suggested, a simple for-comprehension is probably a better fit here because the number of Futures is known. So solving your problem like so:
for {
  members <- membersFuture
  messages <- messagesFuture
} {
  self ! ConversationData(members, messages)
}
is probably the best way to go for this specific example.
What are you trying to achieve with the sequence call? I'd just use a for-comprehension instead:
val membersFuture: Future[Seq[ConversationMemberModel]] = ConversationMemberPersistence.searchByConversationId(conversationId)
val messagesFuture: Future[Seq[MessageModel]] = MessagePersistence.searchByConversationId(conversationId)
for {
  members <- membersFuture
  messages <- messagesFuture
} yield (self ! ConversationData(members, messages))
Note that it is important that you declare the two futures outside the for-comprehension, because otherwise your messagesFuture wouldn't be submitted until the membersFuture is completed.
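For contrast, here is a sketch of the sequential version, reusing the question's calls: with the calls inlined in the generators, the second query does not start until the first one has completed.
for {
  members <- ConversationMemberPersistence.searchByConversationId(conversationId) // starts first
  messages <- MessagePersistence.searchByConversationId(conversationId)           // starts only after members completes
} yield (self ! ConversationData(members, messages))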
You could also use zip:
membersFuture.zip(messagesFuture).map {
  case (members, messages) => self ! ConversationData(members, messages)
}
but I'd prefer the for-comprehension.

Filtering and modifying immutable sequence in loop and have changes effective in subsequent filter calls

I guess I kinda murdered the title but I could not express it another way.
I have a trait like this:
trait Flaggable[T <: Flaggable[T]] { self: T =>
  def setFlag(key: String, value: Boolean): T
  def getFlag(key: String): Boolean
}
This trait itself is not that important, but the main thing here is that a class implementing it should be immutable, as setFlag returns a new instance. An example class extending this trait:
class ExampleData(val flags: Map[String, Boolean] = Map())
  extends Flaggable[ExampleData] {
  def setFlag(key: String, value: Boolean): ExampleData =
    new ExampleData(flags + (key -> value))
  def getFlag(key: String): Boolean = flags(key)
}
While iterating over the collection I set flags on elements, and I want those flags to be effective in subsequent iterations. Something like
val seq: Seq[ExampleData] = ???
seq.view.filter(el => !el.getFlag("visited")).foreach { el =>
  // do things that may set the visited flag to true on elements
  // further along in the seq; if that happens I do want those
  // elements to be filtered out
}
Now, AFAIK, one option is to make seq mutable and assign the new instances returned from setFlag back to seq. Another option is to make the whole Flaggable thing mutable and modify the instances in place in the collection. Do I have any other option without making either of these (the class and the collection) mutable? I do not even know how I could modify and filter at the same time in that case.
I guess I should explain my situation more. Specifically, I am trying to implement the DBSCAN clustering algorithm. I have a function that returns the distance between two data points. For each data point, I need to get the data points that are closer than an epsilon to it and mark those as visited, and I do not want to process data points that are already marked as visited. For example, for the data point with index 0, the list of indices of data points closer than epsilon might be [2, 4, 5]. In this case I want to flag those data points as visited and skip over them without processing.
Just use map instead of foreach and swap the order of the operations:
seq.view.map { el =>
  // do things that may set the visited flag to true and return the new
  // instance, or el if no change is needed.
}.filter(el => !el.getFlag("visited"))
Update:
Since the filter and the update are related to each other, use a mutable collection. I prefer that to mutable data objects, since the mutability can be limited to that scope (e.g. use seq.to[ListBuffer]). After you are done, all the mutation is gone, which keeps the mutable code local.
Nevertheless, depending on your algorithm, there may be a better collection for that, like a Zipper.
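To make that suggestion concrete, here is a rough sketch of the locally-scoped mutability idea, reusing ExampleData from the question. neighboursOf is a hypothetical stand-in for "indices closer than epsilon", and getFlag is assumed to default to false for unset keys (e.g. via flags.getOrElse(key, false)); it uses Scala 2.13 collection syntax.
import scala.collection.mutable.ListBuffer

def visitAll(seq: Seq[ExampleData], neighboursOf: Int => Seq[Int]): Seq[ExampleData] = {
  val buf = seq.to(ListBuffer) // the mutation is confined to this method
  for (i <- buf.indices if !buf(i).getFlag("visited")) {
    neighboursOf(i).foreach { j =>
      buf(j) = buf(j).setFlag("visited", value = true) // later iterations see the updated flag
    }
  }
  buf.toSeq // hand an immutable Seq back to callers
}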
I think you could extract a function to handle what is done inside your foreach with a signature like:
def transform(in: ExampleData): ExampleData
with that you could use a for comprehension:
for {
  elem <- seq if !elem.getFlag("visited")
  result = transform(elem) if result.getFlag("Foo")
} yield result
If you have multiple operations you can just append:
for {
  elem <- seq if !elem.getFlag("visited")
  r1 = transform(elem) if r1.getFlag("Foo")
  r2 = transform2(r1) if r2.getFlag("Bar")
} yield r2
The result would be a new Seq of new ExampleData according to the transformations and filters applied.
In general, if you want to both filter and process elements you would usually use the collect function and possibly chain them:
seq.collect {
  case elem if !elem.getFlag("visited") => transform(elem)
}.collect {
  case elem if elem.getFlag("Foo") => transform2(elem)
}.filter(_.getFlag("Bar"))

Composing Futures with For Comprehension

I have a Play Framework application using ReactiveMongo with MongoDB, and I have the following code:
def categories(id: String): Future[Vector[Category]] = {...}
....
val categoriesFuture = categories(id)
for {
  categories: Vector[Category] <- categoriesFuture
  categoryIdsWithoutPerson: Vector[BSONObjectID] <- findCategoryIdsWithoutPerson(categories.map(_.id), personId) // Returns Future[Vector[BSONObjectID]]
  categoriesWithoutPerson: Vector[Category] <- categories.filter(category => categoryIdsWithoutPerson.contains(category.id)) // Play cites the error here
} yield categoryIdsWithoutPerson
To explain this code, I fetch a Vector of Categories wrapped in a Future because that's how ReactiveMongo rolls. In the for comprehension, I use that Vector to then fetch a list of ids from the database. Finally, I use a filter call to keep only those categories whose ids can be found in that id list.
It all seems fairly straightforward. The problem is that Play gives me the following compilation error on the last line of the for comprehension:
pattern type is incompatible with expected type;
found : Vector[com.myapp.Category]
required: com.myapp.Category
I am not sure why the required type is a single instance of Category.
I could use some insight into what I am doing wrong and/or if there is a simpler or more idiomatic way of accomplishing this.
It looks like you're trying to compose Futures with a Vector. The generators in a Scala for-comprehension all have to be of the same higher-kinded type, which in your case is Future. When you unroll the 'sugar' of the for-comprehension, it's just calling flatMap on everything.
for {
  categories <- categoriesFuture
  // I'm not sure what the return type is here, but I'm guessing it's a future as well
  categoryIdsWithoutPerson <- findCategoryIdsWithoutPerson(categories.map(_.id), personId)
  // Here we use = to get around the need to flatMap
  categoriesWithoutPerson = categories.filter(category => categoryIdsWithoutPerson.contains(category.id))
} yield categoryIdsWithoutPerson
Your code de-sugared:
categoriesFuture.flatMap(categories =>
  findCategoryIdsWithoutPerson(categories.map(_.id), personId).flatMap(categoryIdsWithoutPerson =>
    categories.filter(category => categoryIdsWithoutPerson.contains(category.id)).map(_ => categoryIdsWithoutPerson)))
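For comparison, the corrected version that uses = is roughly equivalent to the following (the compiler's actual translation tuples the two bindings, but it simplifies to this). The filter now runs inside a plain map, so the Vector never has to be flatMapped into a Future:
categoriesFuture.flatMap(categories =>
  findCategoryIdsWithoutPerson(categories.map(_.id), personId).map { categoryIdsWithoutPerson =>
    val categoriesWithoutPerson =
      categories.filter(category => categoryIdsWithoutPerson.contains(category.id))
    categoryIdsWithoutPerson
  })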

Inverting a key to values mapping

Let's say I have a set of a class Action, like this: actions: Set[Action], and each Action has a val consequences: Set[Consequence], where Consequence is a case class.
I wish to get a map from Consequence to Set[Action] to determine which actions cause a specific Consequence. Obviously since an Action can have multiple Consequences it can appear in multiple sets in the map.
I have been trying to get my head around this (I am new to Scala), wondering if I can do it with something like map() and groupBy(), but I'm a bit lost. I don't wish to revert to imperative programming, especially if there is some Scala mapping function that can help.
What is the best way to achieve this?
Not exactly elegant because groupBy doesn't handle the case of operating already on a Tuple2, so you end up doing a lot of tupling and untupling:
case class Conseq()
case class Action(conseqs: Set[Conseq])
def gimme(actions: Seq[Action]): Map[Conseq, Set[Action]] =
  actions.flatMap(a => a.conseqs.map(_ -> a))
    .groupBy(_._1)
    .mapValues(_.map(_._2)(collection.breakOut))
The first line "zips" each action with all of its consequences, yielding a Seq[(Conseq, Action)]; grouping this by the first tuple element gives a Map[Conseq, Seq[(Conseq, Action)]]. So the last step needs to transform the map's values from Seq[(Conseq, Action)] to Set[Action]. This can be done with mapValues. Without the explicit builder factory, it would produce a Seq[Action], so one would have to write .mapValues(_.map(_._2).toSet). Passing collection.breakOut in the second parameter list of map makes it possible to save one step and have map produce the Set collection type directly.
Another possibility is to use nested folds:
def gimme2(actions: Seq[Action]) = (Map.empty[Conseq, Set[Action]] /: actions) {
  (m, a) => (m /: a.conseqs) {
    (m1, c) => m1.updated(c, m1.getOrElse(c, Set.empty) + a)
  }
}
This is perhaps more readable. We start with an empty result map, traverse the actions, and in the inner fold traverse each action's consequences which get merged into the result map.
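For readers on Scala 2.13+, the same nested fold can be spelled with foldLeft; the behaviour is identical, it just avoids the symbolic /: operator, which is deprecated there:
def gimme3(actions: Seq[Action]): Map[Conseq, Set[Action]] =
  actions.foldLeft(Map.empty[Conseq, Set[Action]]) { (m, a) =>
    a.conseqs.foldLeft(m) { (m1, c) =>
      m1.updated(c, m1.getOrElse(c, Set.empty[Action]) + a)
    }
  }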

Scala's for-comprehension `if` statements

Is it possible in Scala to specialize on the conditions inside an if within a for-comprehension? I'm thinking along the lines of:
val collection: SomeGenericCollection[Int] = ...
trait CollectionFilter
case object Even extends CollectionFilter
case object Odd extends CollectionFilter
val evenColl = for { i <- collection if(Even) } yield i
//evenColl would be a SomeGenericEvenCollection instance
val oddColl = for { i <- collection if(Odd) } yield i
//oddColl would be a SomeGenericOddCollection instance
The gist is that by yielding i, I get a new collection of a potentially different type (hence my referring to it as "specialization"), as opposed to just a filtered-down version of the same GenericCollection type.
The reason I ask is that I saw something that I couldn't figure out (an example can be found on line 33 of this ScalaQuery example). What it does is create a query for a database (i.e. SELECT ... FROM ... WHERE ...), where I would have expected it to iterate over the results of said query.
So, I think you are asking if it is possible for the if statement in a for-comprehension to change the result type. The answer is "yes, but...".
First, understand how for-comprehensions are expanded. There are questions here on Stack Overflow discussing it, and there are parameters you can pass to the compiler so it will show you what's going on.
Anyway, this code:
val evenColl = for { i <- collection if(Even) } yield i
Is translated as:
val evenColl = collection.withFilter(i => Even).map(i => i)
So, if the withFilter method changes the collection type, it will do what you want, at least in this simple case. In more complex cases, that alone won't work:
for {
  x <- xs
  y <- ys
  if cond
} yield (x, y)
is translated as
xs.flatMap(x => ys.withFilter(y => cond).map(y => (x, y)))
In this case it is flatMap that decides what type will be returned. If it takes its cue from the result that was returned, then it can work.
Now, on Scala Collections, withFilter doesn't change the type of the collection. You could write your own classes that would do that, however.
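As a toy sketch of that last point, reusing the CollectionFilter objects from the question (all other names here are made up, and calling the predicate with a dummy element is a shortcut that only works because the guard ignores its argument):
final class MyColl(xs: Seq[Int]) {
  // withFilter decides which wrapper type to return based on the filter object it receives
  def withFilter(f: Int => CollectionFilter): FilteredColl = f(0) match {
    case Even => new FilteredColl(xs.filter(_ % 2 == 0))
    case Odd => new FilteredColl(xs.filter(_ % 2 != 0))
  }
}

final class FilteredColl(xs: Seq[Int]) {
  def map[B](g: Int => B): Seq[B] = xs.map(g) // could just as well produce another custom type
}

// for { i <- new MyColl(1 to 10) if Even } yield i
// desugars to: new MyColl(1 to 10).withFilter(i => Even).map(i => i)
val evens: Seq[Int] = for { i <- new MyColl(1 to 10) if Even } yield i // Seq(2, 4, 6, 8, 10)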
Yes, you can; please refer to this tutorial for an easy example. The ScalaQuery example you cited is also iterating over the collection; it is then using that data to build the query.