In this example, I have a stream of Ticker instances, which have a sequence attribute.
I want to drop any messages where the sequence number is lower than the previous one.
I can do something like the following, but it's pretty ugly. Is there a simpler approach? And is there a name for this pattern?
source
.scan(TickerInOrder())((state, ticker) => TickerInOrder(state, ticker))
.collect { case TickerInOrder(Some(ticker), Some(inOrder)) if inOrder => ticker }
// ~~~~~~~~
object TickerInOrder {
def apply(state: TickerInOrder, ticker: Ticker): TickerInOrder = {
val inOrder = state.ticker match {
case Some(prev) => ticker.sequence > prev.sequence
case None => true
}
TickerInOrder(Some(ticker), Some(inOrder))
}
}
case class TickerInOrder(ticker: Option[Ticker] = None, inOrder: Option[Boolean] = None)
You can use statefulMapConcat; see the docs: https://doc.akka.io/docs/akka/current/stream/operators/Source-or-Flow/statefulMapConcat.html
import akka.actor.ActorSystem
import akka.stream.scaladsl.Source
object Stateful {
def main(args: Array[String]): Unit = {
implicit val system: ActorSystem = ActorSystem("Stateful")
Source(List(1,3,2,4,5,6,0,7)).statefulMapConcat {
() =>
var prev = 0L
element =>
if (element > prev) {
prev = element
element :: Nil
} else {
prev = element
Nil
}
}.runForeach(println)
// 1 3 4 5 6 7
}
}
It is simple to change the code to work with Ticker and sequence.
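For example, a minimal sketch of that adaptation, assuming Ticker exposes a numeric sequence field (the Ticker definition and the Flow name below are just placeholders):
import akka.NotUsed
import akka.stream.scaladsl.Flow
// stand-in for the real Ticker from the question
final case class Ticker(sequence: Long)
val dropOutOfOrder: Flow[Ticker, Ticker, NotUsed] =
  Flow[Ticker].statefulMapConcat { () =>
    var prevSeq = Long.MinValue        // per-materialization state, like prev in the example above
    ticker =>
      if (ticker.sequence > prevSeq) {
        prevSeq = ticker.sequence
        ticker :: Nil                  // in order: emit it
      } else {
        prevSeq = ticker.sequence      // remember it, mirroring the scan-based version
        Nil                            // out of order: drop it
      }
  }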
Let's say we have a fake data source which returns the data it holds in batches:
import scala.concurrent.Future
import scala.util.{Random, Success}

class DataSource(size: Int) {
  private var s = 0
  implicit val g = scala.concurrent.ExecutionContext.global
  def getData(): Future[List[Int]] = {
    s = s + 1
    Future {
      Thread.sleep(Random.nextInt(s * 100))
      if (s <= size) {
        List.fill(100)(s)
      } else {
        List()
      }
    }
  }
}
object Test extends App {
val source = new DataSource(100)
implicit val g = scala.concurrent.ExecutionContext.global
def process(v: List[Int]): Unit = {
println(v)
}
def next(f: (List[Int]) => Unit): Unit = {
val fut = source.getData()
fut.onComplete {
case Success(v) => {
f(v)
v match {
case h :: t => next(f)
}
}
}
}
next(process)
Thread.sleep(1000000000)
}
Mine is above, but the problem is that part of it is not pure. Ideally, I would like to wrap the Future for each batch into one big Future, with the wrapper Future succeeding when the last batch returns an empty list. My situation is slightly different from that post: the next() there is a synchronous call, while mine is also async.
Or is it even possible to do what I want? The next batch should only be fetched when the previous one has resolved; in the end, whether to fetch the next batch depends on the size of the list returned.
What's the best way to walk through this type of data source? Are there any existing Scala frameworks that provide the feature I am looking for? Is Play's Iteratee/Enumerator/Enumeratee the right tool? If so, can anyone provide an example of how to use those facilities to implement what I am looking for?
Edit:
With help from chunjef, I have just tried it out, and it actually worked for me. However, I made a small change based on his answer:
Source.fromIterator(()=>Iterator.continually(source.getData())).mapAsync(1) (f=>f.filter(_.size > 0))
.via(Flow[List[Int]].takeWhile(_.nonEmpty))
.runForeach(println)
However, can someone give a comparison between Akka Streams and Play's Iteratee? Is it worth also trying out Iteratee?
Code snippet 1:
Source.fromIterator(() => Iterator.continually(ds.getData)) // line 1
.mapAsync(1)(identity) // line 2
.takeWhile(_.nonEmpty) // line 3
.runForeach(println) // line 4
Code snippet 2: Assume getData depends on some output of another flow, and I would like to concatenate it with the flow below. However, it yields a "too many open files" error. I am not sure what causes this error; mapAsync has been limited to a parallelism of 1, if I understood that correctly.
Flow[Int].mapConcat[Future[List[Int]]](c => {
Iterator.continually(ds.getData(c)).to[collection.immutable.Iterable]
}).mapAsync(1)(identity).takeWhile(_.nonEmpty).runForeach(println)
The following is one way to achieve the same behavior with Akka Streams, using your DataSource class:
import scala.concurrent.Future
import scala.util.Random
import akka.actor.ActorSystem
import akka.stream._
import akka.stream.scaladsl._
object StreamsExample extends App {
implicit val system = ActorSystem("Sandbox")
implicit val materializer = ActorMaterializer()
val ds = new DataSource(100)
Source.fromIterator(() => Iterator.continually(ds.getData)) // line 1
.mapAsync(1)(identity) // line 2
.takeWhile(_.nonEmpty) // line 3
.runForeach(println) // line 4
}
class DataSource(size: Int) {
...
}
A simplified line-by-line overview:
line 1: Creates a stream source that continually calls ds.getData if there is downstream demand.
line 2: mapAsync is a way to deal with stream elements that are Futures. In this case, the stream elements are of type Future[List[Int]]. The argument 1 is the level of parallelism: we specify 1 here because DataSource internally uses a mutable variable, and a parallelism level greater than one could produce unexpected results. identity is shorthand for x => x, which basically means that for each Future, we pass its result downstream without transforming it.
line 3: Essentially, ds.getData is called as long as the result of the Future is a non-empty List[Int]. If an empty List is encountered, processing is terminated.
line 4: runForeach here takes a function List[Int] => Unit and invokes that function for each stream element.
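One small addition, not part of the original answer: runForeach also materializes a Future[Done], which you can use to shut the ActorSystem down once the stream has finished. A hedged sketch, reusing the names from the example above:
import akka.Done
import scala.concurrent.Future
import scala.util.{Failure, Success}
import system.dispatcher // execution context for the callback
val done: Future[Done] =
  Source.fromIterator(() => Iterator.continually(ds.getData))
    .mapAsync(1)(identity)
    .takeWhile(_.nonEmpty)
    .runForeach(println)
done.onComplete {
  case Success(_) => system.terminate()
  case Failure(e) =>
    println(s"Stream failed: $e")
    system.terminate()
}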
Ideally, I would like to wrap the Future for each batch into one big Future, with the wrapper Future succeeding when the last batch returns an empty list.
I think you are looking for a Promise.
You would set up a Promise before you start the first iteration.
This gives you promise.future, a Future that you can then use to follow the completion of everything.
In your onComplete, you add a case _ => promise.success(()).
Something like:
// assumes an implicit ExecutionContext in scope, e.g. the global one from the question
import scala.concurrent.{Future, Promise}
import scala.util.{Failure, Success}

def loopUntilDone(f: (List[Int]) => Unit): Future[Unit] = {
  val promise = Promise[Unit]()
  def next(): Unit = source.getData().onComplete {
    case Success(v) =>
      f(v)
      v match {
        case h :: t => next()
        case _      => promise.success(())
      }
    case Failure(e) => promise.failure(e)
  }
  // get going
  next()
  // return the Future for everything
  promise.future
}
// future for everything, this is a `Future[Unit]`
// its `onComplete` will be triggered when there is no more data
val everything = loopUntilDone(process)
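For example (a hypothetical usage, assuming the implicit ExecutionContext from the question is still in scope):
everything.onComplete {
  case Success(_) => println("All batches have been consumed.")
  case Failure(e) => println(s"Processing failed: $e")
}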
You are probably looking for a reactive streams library. My personal favorite (and the one I'm most familiar with) is Monix. This is how it would work with DataSource unchanged:
import scala.concurrent.duration.Duration
import scala.concurrent.Await
import monix.reactive.Observable
import monix.execution.Scheduler.Implicits.global
object Test extends App {
val source = new DataSource(100)
val completed = // <- this is Future[Unit], completes when foreach is done
Observable.repeat(Observable.fromFuture(source.getData()))
.flatten // <- Here it's Observable[List[Int]], it has collection-like methods
.takeWhile(_.nonEmpty)
.foreach(println)
Await.result(completed, Duration.Inf)
}
I just figured out that using flatMapConcat can achieve what I wanted. There is no point in starting another question, since I already have the answer; I am putting my sample code here in case someone is looking for a similar solution.
This type of API is very common for integration between traditional enterprise applications. The DataSource mocks the API, while the Test object demonstrates how client code can use Akka Streams to consume it.
In my small project the API was provided over SOAP, and I used scalaxb to transform the SOAP interface into Scala async style. With the client calls demonstrated in the Test object, we can consume the API with Akka Streams. Thanks to all for the help.
import scala.collection.mutable
import scala.concurrent.Future
import scala.util.Random
import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.{Flow, Source}

class DataSource(size: Int) {
private var transactionId: Long = 0
private val transactionCursorMap: mutable.HashMap[TransactionId, Set[ReadCursorId]] = mutable.HashMap.empty
private val cursorIteratorMap: mutable.HashMap[ReadCursorId, Iterator[List[Int]]] = mutable.HashMap.empty
implicit val g = scala.concurrent.ExecutionContext.global
case class TransactionId(id: Long)
case class ReadCursorId(id: Long)
def startTransaction(): Future[TransactionId] = {
Future {
synchronized {
transactionId += 1
}
val t = TransactionId(transactionId)
transactionCursorMap.update(t, Set(ReadCursorId(0)))
t
}
}
def createCursorId(t: TransactionId): ReadCursorId = {
synchronized {
val c = transactionCursorMap.getOrElseUpdate(t, Set(ReadCursorId(0)))
val currentId = c.foldLeft(0L) { (acc, a) => acc.max(a.id) }
val cId = ReadCursorId(currentId + 1)
transactionCursorMap.update(t, c + cId)
cursorIteratorMap.put(cId, createIterator())
cId
}
}
def createIterator(): Iterator[List[Int]] = {
(for {i <- 1 to 100} yield List.fill(100)(i)).toIterator
}
def startRead(t: TransactionId): Future[ReadCursorId] = {
Future {
createCursorId(t)
}
}
def getData(cursorId: ReadCursorId): Future[List[Int]] = {
synchronized {
Future {
Thread.sleep(Random.nextInt(100))
cursorIteratorMap.get(cursorId) match {
case Some(i) => i.next()
case _ => List()
}
}
}
}
}
object Test extends App {
val source = new DataSource(10)
implicit val system = ActorSystem("Sandbox")
implicit val materializer = ActorMaterializer()
implicit val g = scala.concurrent.ExecutionContext.global
//
// def process(v: List[Int]): Unit = {
// println(v)
// }
//
// def next(f: (List[Int]) => Unit): Unit = {
// val fut = source.getData()
// fut.onComplete {
// case Success(v) => {
// f(v)
// v match {
//
// case h :: t => next(f)
//
// }
// }
//
// }
//
// }
//
// next(process)
//
// Thread.sleep(1000000000)
val s = Source.fromFuture(source.startTransaction())
.map { e =>
source.startRead(e)
}
.mapAsync(1)(identity)
.flatMapConcat(
e => {
Source.fromIterator(() => Iterator.continually(source.getData(e)))
})
.mapAsync(5)(identity)
.via(Flow[List[Int]].takeWhile(_.nonEmpty))
.runForeach(println)
/*
val done = Source.fromIterator(() => Iterator.continually(source.getData())).mapAsync(1)(identity)
.via(Flow[List[Int]].takeWhile(_.nonEmpty))
.runFold(List[List[Int]]()) { (acc, r) =>
// println("=======" + acc + r)
r :: acc
}
done.onSuccess {
case e => {
e.foreach(println)
}
}
done.onComplete(_ => system.terminate())
*/
}
I am trying to implement a generic method in Scala that operates on a database using the Quill library. Type T will only ever be a case class that works with Quill.
def insertOrUpdate[T](inserting: T, equality: (T,T) => Boolean)(implicit ctx: Db.Context): Unit = {
import ctx._
val existingQuery = quote {
query[T].filter { dbElement: T =>
equality(dbElement, inserting)
}
}
val updateQuery = quote {
query[T].filter { dbElement =>
equality(dbElement, lift(inserting))
}.update(lift(inserting))
}
val insertQuery = quote { query[T].insert(lift(inserting)) }
val existing = ctx.run(existingQuery)
existing.size match {
case 1 => ctx.run(updateQuery)
case _ => ctx.run(insertQuery)
}
}
But I am getting two kinds of compile errors:
Error:(119, 12) Can't find an implicit `SchemaMeta` for type `T`
query[T].filter { dbElement: T =>
Error:(125, 33) Can't find Encoder for type 'T'
equality(dbElement, lift(inserting))
How can I modify my code to let it work?
As I said in the issue that @VojtechLetal mentioned in his answer, you have to use macros.
I added code implementing a generic insert-or-update to my example Quill project.
It defines a trait Queries that is mixed into the context:
trait Queries {
this: JdbcContext[_, _] =>
def insertOrUpdate[T](entity: T, filter: (T) => Boolean): Unit = macro InsertOrUpdateMacro.insertOrUpdate[T]
}
This trait uses a macro that implements your code with minor changes:
import scala.reflect.macros.whitebox.{Context => MacroContext}
class InsertOrUpdateMacro(val c: MacroContext) {
import c.universe._
def insertOrUpdate[T](entity: Tree, filter: Tree)(implicit t: WeakTypeTag[T]): Tree =
q"""
import ${c.prefix}._
val updateQuery = ${c.prefix}.quote {
${c.prefix}.query[$t].filter($filter).update(lift($entity))
}
val insertQuery = quote {
query[$t].insert(lift($entity))
}
run(${c.prefix}.query[$t].filter($filter)).size match {
case 1 => run(updateQuery)
case _ => run(insertQuery)
}
()
"""
}
Usage examples:
import io.getquill.{PostgresJdbcContext, SnakeCase}
package object genericInsertOrUpdate {
val ctx = new PostgresJdbcContext[SnakeCase]("jdbc.postgres") with Queries
def example1(): Unit = {
val inserting = Person(1, "")
ctx.insertOrUpdate(inserting, (p: Person) => p.name == "")
}
def example2(): Unit = {
import ctx._
val inserting = Person(1, "")
ctx.insertOrUpdate(inserting, (p: Person) => p.name == lift(inserting.name))
}
}
P.S. Because update() returns the number of updated records, your code can be simplified to:
class InsertOrUpdateMacro(val c: MacroContext) {
import c.universe._
def insertOrUpdate[T](entity: Tree, filter: Tree)(implicit t: WeakTypeTag[T]): Tree =
q"""
import ${c.prefix}._
if (run(${c.prefix}.quote {
${c.prefix}.query[$t].filter($filter).update(lift($entity))
}) == 0) {
run(quote {
query[$t].insert(lift($entity))
})
}
()
"""
}
As one of the quill contributors said in this issue:
If you want to make your solution generic then you have to use macros because Quill generates queries at compile time and T type has to be resolved at that time.
TL;DR: The following did not work either; I was just playing around.
Anyway, just out of curiosity, I tried to fix the issue by following the error you mentioned. I changed the definition of the function to:
def insertOrUpdate[T: ctx.Encoder : ctx.SchemaMeta](...)
which yielded the following log
[info] PopulateAnomalyResultsTable.scala:71: Dynamic query
[info] case _ => ctx.run(insertQuery)
[info]
[error] PopulateAnomalyResultsTable.scala:68: exception during macro expansion:
[error] scala.reflect.macros.TypecheckException: Found the embedded 'T', but it is not a case class
[error] at scala.reflect.macros.contexts.Typers$$anonfun$typecheck$2$$anonfun$apply$1.apply(Typers.scala:34)
[error] at scala.reflect.macros.contexts.Typers$$anonfun$typecheck$2$$anonfun$apply$1.apply(Typers.scala:28)
It starts promisingly, since Quill apparently gave up on static compilation and made the query dynamic. I checked the source code of the failing macro, and it seems that Quill is trying to get a constructor for T, which is not known in the current context.
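For contrast, a non-generic, per-type version compiles statically because the case class is known at compile time. A minimal sketch, assuming a Person(id: Int, name: String) case class and a JDBC context named ctx (both names are assumptions, not part of the question):
import ctx._
def insertOrUpdatePerson(person: Person): Unit = {
  // running an update action returns the number of affected rows
  val updated = ctx.run(
    query[Person].filter(_.id == lift(person.id)).update(lift(person))
  )
  if (updated == 0) {
    ctx.run(query[Person].insert(lift(person)))
  }
  ()
}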
For more details, see my answer to "Generic macro with quill" or the implementation below.
The complete project can be found in quill-generic.
CrudMacro:
package pl.jozwik.quillgeneric.quillmacro
import scala.reflect.macros.whitebox.{ Context => MacroContext }
class CrudMacro(val c: MacroContext) extends AbstractCrudMacro {
import c.universe._
def callFilterOnIdTree[K: c.WeakTypeTag](id: Tree)(dSchema: c.Expr[_]): Tree =
callFilterOnId[K](c.Expr[K](q"$id"))(dSchema)
protected def callFilterOnId[K: c.WeakTypeTag](id: c.Expr[K])(dSchema: c.Expr[_]): Tree = {
val t = weakTypeOf[K]
t.baseClasses.find(c => compositeSet.contains(c.asClass.fullName)) match {
case None =>
q"$dSchema.filter(_.id == lift($id))"
case Some(base) =>
val query = q"$dSchema.filter(_.id.fk1 == lift($id.fk1)).filter(_.id.fk2 == lift($id.fk2))"
base.fullName match {
case `compositeKey4Name` =>
q"$query.filter(_.id.fk3 == lift($id.fk3)).filter(_.id.fk4 == lift($id.fk4))"
case `compositeKey3Name` =>
q"$query.filter(_.id.fk3 == lift($id.fk3))"
case `compositeKey2Name` =>
query
case x =>
c.abort(NoPosition, s"$x not supported")
}
}
}
def createAndGenerateIdOrUpdate[K: c.WeakTypeTag, T: c.WeakTypeTag](entity: Tree)(dSchema: c.Expr[_]): Tree = {
val filter = callFilter[K, T](entity)(dSchema)
q"""
import ${c.prefix}._
val id = $entity.id
val q = $filter
val result = run(
q.updateValue($entity)
)
if (result == 0) {
run($dSchema.insertValue($entity).returningGenerated(_.id))
} else {
id
}
"""
}
def createWithGenerateIdOrUpdateAndRead[K: c.WeakTypeTag, T: c.WeakTypeTag](entity: Tree)(dSchema: c.Expr[_]): Tree = {
val filter = callFilter[K, T](entity)(dSchema)
q"""
import ${c.prefix}._
val id = $entity.id
val q = $filter
val result = run(
q.updateValue($entity)
)
val newId =
if (result == 0) {
run($dSchema.insertValue($entity).returningGenerated(_.id))
} else {
id
}
run($dSchema.filter(_.id == lift(newId)))
.headOption
.getOrElse(throw new NoSuchElementException(s"$$newId"))
"""
}
}
I got the error
found : scala.concurrent.Future[Option[models.ProcessTemplatesModel]]
required: Option[models.ProcessTemplatesModel]
My function is below
def createCopyOfProcessTemplate(processTemplateId: Int): Future[Option[ProcessTemplatesModel]] = {
val action = processTemplates.filter(_.id === processTemplateId).result.map(_.headOption)
val result: Future[Option[ProcessTemplatesModel]] = db.run(action)
result.map { case (result) =>
result match {
case Some(r) => {
var copy = (processTemplates returning processTemplates.map(_.id)) += ProcessTemplatesModel(None, "[Copy of] " + r.title, r.version, r.createdat, r.updatedat, r.deadline, r.status, r.comment, Some(false), r.checkedat, Some(false), r.approvedat, false, r.approveprocess, r.trainingsprocess)
val composedAction = copy.flatMap { id =>
processTemplates.filter(_.id === id).result.headOption
}
db.run(composedAction)
}
}
}
}
What is my problem in this case?
Edit:
My controller function looks like this:
def createCopyOfProcessTemplate(processTemplateId: Int) = Action.async {
processTemplateDTO.createCopyOfProcessTemplate(processTemplateId).map { process =>
Ok(Json.toJson(process))
}
}
Is that where my mistake is?
Looking at your code, there are the following issues:
1. You use two db.run calls, which both return futures, but the inner future is never composed into the outer one. To resolve this, you should compose the futures with flatMap or a for-comprehension.
2. You use only one partial-function case, Some(_) =>, in the pattern match and don't handle the other possible value, None.
3. You can use a single db.run with action composition.
Your code can look like this:
def createCopyOfProcessTemplate(processTemplateId: Int): Future[Option[ProcessTemplatesModel]] = {
val action = processTemplates.filter(...).result.map(_.headOption)
val composedAction = action.flatMap {
case Some(r) =>
val copyAction = (processTemplates returning processTemplates...)
copyAction.flatMap { id =>
processTemplates.filter(_.id === id).result.headOption
}
case _ =>
DBIO.successful(None) // issue #2 has been resolved here
}
db.run(composedAction) // issue #3 has been resolved here
}
We also get rid of issue #1, because we use action composition.
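On the controller side, a hedged sketch of also handling the case where no template was found (the NotFound branch is my addition, not something from the question):
def createCopyOfProcessTemplate(processTemplateId: Int) = Action.async {
  processTemplateDTO.createCopyOfProcessTemplate(processTemplateId).map {
    case Some(copy) => Ok(Json.toJson(copy))
    case None       => NotFound(s"No process template found for id $processTemplateId")
  }
}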
How do I prevent an error when someone does not choose one of the options in Scala? I am using a Map to hold the options, and I tried to implement try and catch blocks around the case options, but it does not work. I'm not sure if this is the right way to do it; if there is another way, let me know. The error is: Exception in thread "main" java.lang.NumberFormatException: For input string: "e".
object main extends menu {
def main(args: Array[String]): Unit = {
var opt = 0
do { opt = readOption }
while (menu(opt))
}
}
import scala.io.StdIn

class menu extends database {
def menu(option: Int): Boolean = try {
actionMap.get(option) match {
case Some(a) => a()
case None => println("That didn't work.")
false
}
} catch {
case _: NumberFormatException => true
}
val actionMap = Map[Int, () => Boolean](1 -> cWords, 2 -> cCharacters, 3 -> exit)
def readOption: Int = {
println(
"""|Please select one of the following:
| 1 - Count Words
| 2 - Count Characters in words
| 3 - quit""".stripMargin)
StdIn.readInt()
  }
}
Use scala.util.Try around readInt():
import scala.io._
import scala.util._
Try(StdIn.readInt()).toOption
// returns Some(123) for input 123
Try(StdIn.readInt()).toOption
// returns None for input 1a3
Thus readOption delivers Option[Int]. Then
def menu(option: Option[Int]): Boolean = option match {
case Some(a) => actionMap(a)()
case None => println("Try again..."); false
}
Note
A more concise version of main,
def main(args: Array[String]): Unit = while (menu(readOption)) ()
Namely: while menu is true, do Unit (i.e. ()).
Here is a working implementation:
import scala.io.StdIn
import scala.util.Try
object Main extends Menu with App {
while (menu(readChoice)) ()
}
class Menu {
val actionMap = Map[Int, () => Boolean](1 -> (() => true), 2 -> (() => true), 3 -> (() => false))
def menu(choice: Option[Int]): Boolean = {
choice.flatMap(actionMap.get)
.map(action => action())
.getOrElse({ println("That didn't work."); false })
}
def readChoice: Option[Int] = {
println(
"""|Please select one of the following:
| 1 - Count Words
| 2 - Count Characters in words
| 3 - quit""".stripMargin)
Try(StdIn.readInt).toOption
}
}
First, you can mix in the App trait to skip the main method boilerplate.
You can simplify your do/while loop like this; it has to do nothing inside, so you need some expression or block there, and a unit value () can serve as the expression that does nothing.
In Scala we name classes using camel case, starting with a capital letter.
As readInt throws whenever the input is wrong, you can catch that using Try, which will return Success(result) or Failure(exception), and convert the result to an Option to discard the exception.
What happens in menu is shorthand for:
choice match {
case Some(number) =>
actionMap.get(number) match {
case Some(action) =>
action()
case None =>
println("That didn't work.")
false
}
case None =>
println("That didn't work.")
false
}
It could just as well be written with a for-comprehension:
(for {
number <- choice
action <- actionMap.get(number)
} yield {
action()
}) getOrElse {
println("That didn't work.")
false
}
On a sidenote, you named the user's choice an "option", which unfortunately is also a Scala class used extensively here, so I renamed the variables to avoid confusion.
I would make readOption return a Try[Int] (a Try surrounding StdIn.readInt()), then deal with the possible cases in the menu function.
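A minimal sketch of that idea, reusing the actionMap from the question (the messages are placeholders):
import scala.io.StdIn
import scala.util.{Failure, Success, Try}
def readOption: Try[Int] = Try(StdIn.readInt())
def menu(choice: Try[Int]): Boolean = choice match {
  case Success(n) =>
    actionMap.get(n) match {
      case Some(action) => action()
      case None         => println("Unknown option."); false
    }
  case Failure(_) =>
    println("Please enter a number."); false
}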
I'm new to Scala and functional programming. I'm trying out the usual beginner applications and scripts (obviously using a bit of over-engineering).
Anyway, I have this code for a calculator that takes arguments and a switch to dictate the operation to apply to them.
object Main {
def main(args: Array[String]): Unit = {
var calc = calculate( "" , _:List[Int])
var values:List[Int] = List()
if(args.size < 1) println("No arguments supplied") else{
args foreach { _ match {
case arg if arg.contains("-") => {
if(values.size>0){
calc(values)
values = List()}
calc = calculate( arg , _:List[Int])
}
case arg => {
try{
val value=arg.toInt
values = values ::: List(value)
}catch{case e:NumberFormatException=>println("\nError:Invalid argument:\""+arg+"\"\nCannot convert to Integer.\n")}
}
}
}
calc(values)
}
}
def sum(values:List[Int]) { println("The sum is:"+(values.foldLeft(0)((sum,value) => sum + value))) }
def subtract(values:List[Int]) {
val initial:Int = values(0)
var total:Int = 0
for(i <- 1 until values.size){
total = total + values(i)
}
val diff:Int = initial - total
println("The difference is:"+diff)
}
def calculate(operation:String,values:List[Int]) = operation match {
case "-sum" => sum(values)
case "-diff" => subtract(values)
case _ => println("Default operation \"Sum\" will be applied");sum(values)
}
}
Some points I'd like to find a better way to handle: for example, removing the try/catch statement.
A better way to compose this application would be very welcome.
How about this one?
object Main extends App {
require(args.size > 0, "Please, supply more arguments")
@annotation.tailrec
def parseArguments(arguments: List[String], operation: String, values: List[Int] = Nil): Unit =
  arguments match {
    case op :: unprocessed if op.startsWith("-") =>
      // a new switch starts: run the pending operation before switching
      // (values were prepended, so restore their original order)
      if (values.nonEmpty) calculate(operation, values.reverse)
      parseArguments(unprocessed, op)
    case maybeNumber :: unprocessed =>
      val newValues = try {
        maybeNumber.toInt :: values
      } catch {
        case _: NumberFormatException =>
          println("\nError:Invalid argument:\"" + maybeNumber + "\"\nCannot convert to Integer.\n")
          values
      }
      parseArguments(unprocessed, operation, newValues)
    case Nil =>
      // done processing: run whatever is still pending
      if (values.nonEmpty) calculate(operation, values.reverse)
  }
parseArguments(args.toList, "")
def diff(values:List[Int]) = {
val initial::tail = values
val total = tail.sum
initial - total
}
def calculate(operation:String, values:List[Int]) = operation match {
case "-sum" => println("The sum is " + values.sum)
case "-diff" => println("The difference is: " + diff(values))
case _ =>
println("""Default operation "Sum" will be applied""")
println("The sum is " + values.sum)
}
}