How to start a gRPC server/client using ScalaPB on Spark?

I have a problem running a gRPC server/client using ScalaPB on Spark.
It works totally fine when I run my code using "sbt run". I want to run this code using Spark because next I'll import my Spark model to predict some labels. But when I submit my jar to Spark, it gives me an error like this:
Exception in thread "main" io.grpc.ManagedChannelProvider$ProviderNotFoundException:
No functional server found. Try adding a dependency on the grpc-netty artifact
This is my build.sbt:
scalaVersion := "2.11.7"
PB.targets in Compile := Seq(
scalapb.gen() -> (sourceManaged in Compile).value
)
val scalapbVersion =
scalapb.compiler.Version.scalapbVersion
val grpcJavaVersion =
scalapb.compiler.Version.grpcJavaVersion
libraryDependencies ++= Seq(
// protobuf
"com.thesamet.scalapb" %% "scalapb-runtime" % scalapbVersion % "protobuf",
//for grpc
"io.grpc" % "grpc-netty" % grpcJavaVersion ,
"com.thesamet.scalapb" %% "scalapb-runtime-grpc" % scalapbVersion
)
assemblyMergeStrategy in assembly := {
case PathList("META-INF", xs # _*) => MergeStrategy.discard
case x => MergeStrategy.first
}
Using shade rules still doesn't work:
assemblyShadeRules in assembly := Seq(ShadeRule.rename("com.google.**" -> "shadegoogle.#1").inAll)
And this is my main:
import java.util.logging.Logger
import io.grpc.{Server, ServerBuilder}
import org.apache.spark.ml.tuning.CrossValidatorModel
import org.apache.spark.sql.SparkSession
import testproto.test.{Email, EmailLabel, RouteGuideGrpc}
import scala.concurrent.{ExecutionContext, Future}
object HelloWorldServer {
private val logger = Logger.getLogger(classOf[HelloWorldServer].getName)
def main(args: Array[String]): Unit = {
val server = new HelloWorldServer(ExecutionContext.global)
server.start()
server.blockUntilShutdown()
}
private val port = 50051
}
class HelloWorldServer(executionContext: ExecutionContext) {
self =>
private[this] var server: Server = null
private def start(): Unit = {
server = ServerBuilder.forPort(HelloWorldServer.port).addService(RouteGuideGrpc.bindService(new RouteGuideImpl, executionContext)).build.start
HelloWorldServer.logger.info("Server started, listening on " + HelloWorldServer.port)
sys.addShutdownHook {
System.err.println("*** shutting down gRPC server since JVM is shutting down")
self.stop()
System.err.println("*** server shut down")
}
}
private def stop(): Unit = {
if (server != null) {
server.shutdown()
}
}
private def blockUntilShutdown(): Unit = {
if (server != null) {
server.awaitTermination()
}
}
private class RouteGuideImpl extends RouteGuideGrpc.RouteGuide {
override def getLabel(request: Email): Future[EmailLabel] = {
val replay = EmailLabel(emailId = request.emailId, label = "aaaaa")
Future.successful(replay)
}
}
}
Thanks.

It looks like grpc-netty is not found when an uber jar is made: gRPC locates its server transport through java.util.ServiceLoader registration files under META-INF/services, and the merge strategy above discards everything in META-INF. Instead of using ServerBuilder, change your code to use io.grpc.netty.NettyServerBuilder.
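A minimal sketch of that change in start() above (only the builder expression changes; the grpc-netty dependency from build.sbt is assumed to stay in place):
import io.grpc.netty.NettyServerBuilder
server = NettyServerBuilder
  .forPort(HelloWorldServer.port)
  .addService(RouteGuideGrpc.bindService(new RouteGuideImpl, executionContext))
  .build
  .start
Alternatively, the generic ServerBuilder should keep working if the ServiceLoader registrations survive assembly, e.g. by concatenating the service files instead of discarding all of META-INF:
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", "services", xs @ _*) => MergeStrategy.concat
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x => MergeStrategy.first
}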

Related

[Akka]: Spawn an Actor at runtime with Scala Reflection and monitor its behavior using Grafana and Prometheus

I set up Grafana and Prometheus with Akka to monitor the behaviour of my system.
If I spawn Actors at compile time, it works and I can see them on the dashboard.
Now I'd like to compile an Actor at runtime and monitor its behaviour.
In order to achieve that, I run
val toolbox = currentMirror.mkToolBox()
// Class instance
val actorCode = q"""
import akka.actor._
object HelloActor {
def props(name : String) = Props(new HelloActor(name))
}
class HelloActor(myName: String) extends Actor {
def receive = {
case "hello" => println("hello from %s".format(myName))
case _ => println("'huh?', said %s".format(myName))
}
}
return HelloActor.props("Jane")
"""
Then I compile the Actor, get the Props, and send a message to it in this way:
val compiledCode = toolbox.compile(actorCode)()
val actorSystem = ActorSystem("firstActorSystem")
val myProps = compiledCode.asInstanceOf[Props]
val helloActor = actorSystem.actorOf(myProps)
helloActor ! "hello"
Everything works fine, but if I go to the Prometheus dashboard I cannot see the Actor instance or the messages that have been sent.
Any tips to solve this issue?
Actually, I guess with current versions (Scala 2.13.8, Akka 2.6.20, Cinnamon 2.17.0, Prometheus 2.38.0, Grafana 9.1.6) the actors are displayed.
src/main/scala/Main.scala
import scala.reflect.runtime.universe._
import scala.reflect.runtime
import scala.tools.reflect.ToolBox
import akka.actor._
object Main extends App {
object HelloActor {
def props(name: String) = Props(new HelloActor(name))
}
class HelloActor(myName: String) extends Actor {
def receive = {
case "hello" => println("hello from %s".format(myName))
case _ => println("'huh?', said %s".format(myName))
}
}
val myProps = HelloActor.props("Jane")
val actorSystem = ActorSystem("firstActorSystem")
val helloActor = actorSystem.actorOf(myProps)
helloActor ! "hello"
val rm = runtime.currentMirror
val tb = rm.mkToolBox()
val actorCode =
q"""
import akka.actor._
object HelloActor1 {
def props(name : String) = Props(new HelloActor1(name))
}
class HelloActor1(myName: String) extends Actor {
def receive = {
case "hello" => println("hello from %s".format(myName))
case _ => println("'huh?', said %s".format(myName))
}
}
HelloActor1.props("Jane1")
"""
val compiledCode = tb.compile(actorCode)()
val myProps1 = compiledCode.asInstanceOf[Props]
val helloActor1 = actorSystem.actorOf(myProps1)
helloActor1 ! "hello"
}
src/main/resources/application.conf
cinnamon.application = "telemetry"
cinnamon.akka {
actors {
"/user/*" {
report-by = instance
}
}
}
cinnamon.prometheus {
exporters += http-server
}
project/build.properties
sbt.version = 1.7.1
project/plugins.sbt
addSbtPlugin("com.lightbend.cinnamon" % "sbt-cinnamon" % "2.17.0")
build.sbt
lazy val root = (project in file(".")).
settings(
inThisBuild(List(
organization := "com.example",
scalaVersion := "2.13.8"
)),
name := "telemetry",
libraryDependencies ++=
Seq(
"com.typesafe.akka" %% "akka-actor" % "2.6.20",
Cinnamon.library.cinnamonAkka,
Cinnamon.library.cinnamonPrometheus,
Cinnamon.library.cinnamonPrometheusHttpServer,
scalaOrganization.value % "scala-reflect" % scalaVersion.value,
scalaOrganization.value % "scala-compiler" % scalaVersion.value,
),
mainClass := Some("Main"),
) enablePlugins Cinnamon
run / cinnamon := true
test / cinnamon := true
lightbend.sbt (mykey is from https://www.lightbend.com/account/lightbend-platform/credentials)
lazy val mykey = "..."
ThisBuild / resolvers += "lightbend-commercial-mvn" at
s"https://repo.lightbend.com/pass/$mykey/commercial-releases"
ThisBuild / resolvers += Resolver.url(
"lightbend-commercial-ivy",
url(s"https://repo.lightbend.com/pass/$mykey/commercial-releases")
)(Resolver.ivyStylePatterns)
I downloaded Prometheus from https://prometheus.io/download/, Grafana from https://grafana.com/grafana/download (standalone binaries).
prometheus-2.38.0.linux-amd64/prometheus.yml
# Scrape configuration for default Cinnamon Prometheus HTTP Server on localhost
scrape_configs:
- job_name: 'cinnamon'
scrape_interval: 10s
static_configs:
- targets: ['localhost:9001']
I run
/prometheus-2.38.0.linux-amd64$ ./prometheus --config.file=prometheus.yml
/grafana-9.1.6/bin$ ./grafana-server
sbt clean compile run
(or you can run it in the IDE with the VM option -javaagent:/home/dmitin/.cache/coursier/v1/https/repo.lightbend.com/pass/[mykey]/commercial-releases/com/lightbend/cinnamon/cinnamon-agent/2.17.0/cinnamon-agent-2.17.0.jar).
I can see Cinnamon Prometheus Http Server (datasource) metrics at http://localhost:9001/metrics (http://localhost:9001), Prometheus Server at http://localhost:9090, http://localhost:9090/metrics, Grafana at http://localhost:3000.
I log in to Grafana (admin:admin), create the datasource Cinnamon Prometheus (see: How to add the Kafka Exporter as a data source to Grafana?)
(the Prometheus plugin was already installed), enable the datasource, import the Akka Actors dashboard, go to the dashboard and see the actors.
Please notice the actor __wrapper$1$10f8024d0bba48c08a394d97fd56b2f0.__wrapper$1$10f8024d0bba48c08a394d97fd56b2f0$HelloActor1$3. It is HelloActor1 created by the Toolbox.
Links:
https://developer.lightbend.com/docs/telemetry/current/sandbox/prometheus-sandbox.html
https://medium.com/akka-scala/akka-monitor-your-applications-with-lightbend-telemetry-prometheus-and-grafana-dashboard-1b7353e281c1
https://developer.lightbend.com/docs/telemetry/current/visualizations/grafana.html
https://developer.lightbend.com/docs/akka-platform-guide/telemetry/prometheus-backend.html
https://developer.lightbend.com/docs/telemetry/current/plugins/prometheus/prometheus.html
https://developer.lightbend.com/docs/telemetry/current/setup/cinnamon-agent-sbt.html

Testing Kafka and Spark with testcontainers

I am trying to test a streaming pipeline with testcontainers as an integration test, but I don't know how to get the bootstrapServers, at least in the latest testcontainers version, or how to create a specific topic there. How can I use 'containerDef' to extract the bootstrapServers and add a topic?
import com.dimafeng.testcontainers.{ContainerDef, KafkaContainer}
import com.dimafeng.testcontainers.scalatest.TestContainerForAll
import munit.FunSuite
import org.apache.spark.sql.SparkSession
class Mykafkatest extends FunSuite with TestContainerForAll {
//val kafkaContainer: KafkaContainer = KafkaContainer("confluentinc/cp-kafka:5.4.3")
override val containerDef: ContainerDef = KafkaContainer.Def()
test("do something")(withContainers { container =>
val sparkSession: SparkSession = SparkSession
.builder()
.master("local[*]")
.appName("Unit testing")
.getOrCreate()
// How to add a topic in that container?
// This is not possible:
val servers=container.bootstrapServers
val df = sparkSession.readStream
.format("kafka")
.option("kafka.bootstrap.servers", servers)
.option("subscribe", "topic1")
.load()
df.show(false)
})
}
My sbt configuration:
lazy val root = project
.in(file("./pipeline"))
.settings(
organization := "org.example",
name := "spark-stream",
version := "0.1",
scalaVersion := "2.12.10",
libraryDependencies := Seq(
"org.apache.spark" %% "spark-sql-kafka-0-10" % "3.0.3" % Compile,
"org.apache.spark" %% "spark-sql" % "3.0.3" % Compile,
"com.dimafeng" %% "testcontainers-scala-munit" % "0.39.5" % Test,
"org.dimafeng" %% "testcontainers-scala-kafka" % "0.39.5" % Test,
"org.scalameta" %% "munit" % "0.7.28" % Test
),
testFrameworks += new TestFramework("munit.Framework"),
Test / fork := true
)
Documentation does not show a complete example: https://www.testcontainers.org/modules/kafka/
The only problem here is that you are explicitly upcasting that KafkaContainer.Def to ContainerDef.
The type of container provided by withContainers, Containers, is decided by a path-dependent type in the provided ContainerDef:
trait TestContainerForAll extends TestContainersForAll { self: Suite =>
val containerDef: ContainerDef
final override type Containers = containerDef.Container
override def startContainers(): containerDef.Container = {
containerDef.start()
}
// inherited from TestContainersSuite
def withContainers[A](runTest: Containers => A): A = {
val c = startedContainers.getOrElse(throw IllegalWithContainersCall())
runTest(c)
}
}
trait ContainerDef {
type Container <: Startable with Stoppable
protected def createContainer(): Container
def start(): Container = {
val container = createContainer()
container.start()
container
}
}
The moment you explicitly specify the type ContainerDef in override val containerDef: ContainerDef = KafkaContainer.Def(), this breaks the whole "type trickery", and the Scala compiler is left with a type Container <: Startable with Stoppable instead of a KafkaContainer.
So, just remove that explicit ContainerDef type annotation, and val servers = container.bootstrapServers will work as expected.
import com.dimafeng.testcontainers.KafkaContainer
import com.dimafeng.testcontainers.munit.TestContainerForAll
import munit.FunSuite
class Mykafkatest extends FunSuite with TestContainerForAll {
override val containerDef = KafkaContainer.Def()
test("do something")(withContainers { container =>
//...
val servers = container.bootstrapServers
println(servers)
//...
})
}
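For the other half of the question (creating a specific topic), one option is the plain Kafka AdminClient pointed at the container. A minimal sketch, assuming the kafka-clients classes are on the test classpath (spark-sql-kafka-0-10 pulls them in transitively); the topic name, partition count, and replication factor are illustrative:
import java.util.{Collections, Properties}
import org.apache.kafka.clients.admin.{AdminClient, AdminClientConfig, NewTopic}
val props = new Properties()
props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, container.bootstrapServers)
val admin = AdminClient.create(props)
// create "topic1" with 1 partition and replication factor 1, then block until done
admin.createTopics(Collections.singletonList(new NewTopic("topic1", 1, 1.toShort))).all().get()
admin.close()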

PlaySpec not found in IntelliJ

Below is a Scala test of a websocket:
import java.util.function.Consumer
import play.shaded.ahc.org.asynchttpclient.AsyncHttpClient
import play.api.inject.guice.GuiceApplicationBuilder
import play.api.test.{Helpers, TestServer, WsTestClient}
import scala.compat.java8.FutureConverters
import scala.concurrent.Await
import scala.concurrent.duration._
import org.scalatestplus.play._
import org.scalatest.concurrent.ScalaFutures
class SocketTest extends PlaySpec with ScalaFutures {
"HomeController" should {
"reject a websocket flow if the origin is set incorrectly" in WsTestClient.withClient { client =>
// Pick a non standard port that will fail the (somewhat contrived) origin check...
lazy val port: Int = 31337
val app = new GuiceApplicationBuilder().build()
Helpers.running(TestServer(port, app)) {
val myPublicAddress = s"localhost:$port"
val serverURL = s"ws://$myPublicAddress/ws"
val asyncHttpClient: AsyncHttpClient = client.underlying[AsyncHttpClient]
val webSocketClient = new WebSocketClient(asyncHttpClient)
try {
val origin = "ws://example.com/ws"
val consumer: Consumer[String] = new Consumer[String] {
override def accept(message: String): Unit = println(message)
}
val listener = new WebSocketClient.LoggingListener(consumer)
val completionStage = webSocketClient.call(serverURL, origin, listener)
val f = FutureConverters.toScala(completionStage)
Await.result(f, atMost = 1000.millis)
listener.getThrowable mustBe a[IllegalStateException]
} catch {
case e: IllegalStateException =>
e mustBe an[IllegalStateException]
case e: java.util.concurrent.ExecutionException =>
val foo = e.getCause
foo mustBe an[IllegalStateException]
}
}
}
}
}
But compilation fails on the line import org.scalatestplus.play._ with the error:
Cannot resolve symbol scalatestplus
Following https://www.playframework.com/documentation/2.8.x/ScalaTestingWithScalaTest, I have added scalatest and play to the build:
build.sbt:
name := "testproject"
version := "1.0"
lazy val `testproject` = (project in file(".")).enablePlugins(PlayScala)
resolvers += "scalaz-bintray" at "https://dl.bintray.com/scalaz/releases"
resolvers += "Akka Snapshot Repository" at "https://repo.akka.io/snapshots/"
scalaVersion := "2.12.2"
libraryDependencies ++= Seq( jdbc , ehcache , ws , guice , specs2 % Test)
// https://mvnrepository.com/artifact/com.typesafe.scala-logging/scala-logging
libraryDependencies += "com.typesafe.scala-logging" %% "scala-logging" % "3.9.2"
libraryDependencies ++= Seq(
"org.scalatestplus.play" %% "scalatestplus-play" % "3.0.0" % "test"
)
unmanagedResourceDirectories in Test <+= baseDirectory ( _ /"target/web/public/test" )
I've tried rebuilding the project and module with IntelliJ's "Build" option, and with the "Build" option shown when I right-click on build.sbt, but the import is not found.
Running sbt dist from the IntelliJ "sbt shell", then File -> "Invalidate Caches" with a restart of IntelliJ, seems to fix the issue.
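If the symbol still cannot be resolved after a reimport, it is also worth checking that the scalatestplus-play version matches the Play version: the build above pulls 3.0.0, which targets Play 2.6, while the linked 2.8.x documentation corresponds to the 5.x series. A sketch of the dependency for Play 2.8 (pick the version matching your Play release):
libraryDependencies += "org.scalatestplus.play" %% "scalatestplus-play" % "5.1.0" % Test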

Running http4s server with ZIO Env

I am trying to learn the ZIO library, so I decided to create a basic web service app. The idea is pretty basic: use the http4s lib for the server and route endpoints, and print "hello world" on an endpoint call.
With the help of the docs and examples I found, I produced this code:
object Main extends ManagedApp {
type AppEnvironment = Clock with Console with HelloRepository
type AppTask[A] = RIO[AppEnvironment, A]
override def run(args: List[String]): ZManaged[ZEnv, Nothing, Int] = {
val httpApp: HttpApp[AppTask] = Router[AppTask]("/" -> helloWorldService).orNotFound
val server = ZIO.runtime[AppEnvironment].flatMap { implicit rts =>
BlazeServerBuilder[AppTask]
.bindHttp(8080, "0.0.0.0")
.withHttpApp(CORS(httpApp))
.serve
.compile[AppTask, AppTask, ExitCode]
.drain
}
(for {
_ <- ZManaged.environment[ZEnv] >>> server.toManaged_
} yield ())
.foldM(err => putStrLn(s"Execution failed with: $err").as(1).toManaged_, _ => ZManaged.succeed(0))
}
val dsl: Http4sDsl[AppTask] = Http4sDsl[AppTask]
import dsl._
val helloWorldService: HttpRoutes[AppTask] = HttpRoutes.of[AppTask] {
case GET -> Root / "hello" / name => Ok(Repo.getHello(name))
}
}
trait HelloRepository extends Serializable {
val helloRepository: HelloRepository.Service[Any]
}
object HelloRepository extends Serializable {
trait Service[R] extends Serializable {
def getHello(name: String): ZIO[R, Nothing, String]
}
}
object Repo extends HelloRepository.Service[HelloRepository] {
override def getHello(name: String): ZIO[HelloRepository, Nothing, String] = ZIO.succeed(s"Hello $name")
}
I create the router: Router[AppTask]("/" ...
I create the server: ZIO.runtime[AppEnvironment].flatMap ...
Then I try to start the server with the ZIO environment,
but I am missing something, as this line:
_ <- ZManaged.environment[ZEnv] >>> server.toManaged_
is incorrect and throws an error on build:
Error:(34, 39) inferred type arguments [touch.Main.AppEnvironment,Throwable,Unit] do not conform to method >>>'s type parameter bounds [R1 >: zio.ZEnv,E1,B]
_ <- ZManaged.environment[ZEnv] >>> server.toManaged_
Error:(34, 39) inferred type arguments [touch.Main.AppEnvironment,Throwable,Unit] do not conform to method >>>'s type parameter bounds [R1 >: zio.ZEnv,E1,B]
Error:(34, 50) type mismatch;
found : zio.ZManaged[touch.Main.AppEnvironment,Throwable,Unit]
(which expands to) zio.ZManaged[zio.clock.Clock with zio.console.Console with touch.HelloRepository,Throwable,Unit]
required: zio.ZManaged[R1,E1,B]
Maybe someone can help me with the correct syntax?
I would also appreciate some explanation, or a link to the docs where this is explained.
I would like to explain more, but I don't know where you got your code sample or what your build.sbt looks like. I happen to have some http4s code lying around, so I took the liberty of adding some import statements and simplifying it a bit. You can always add back the complexity I took out.
Here's what worked for me.
/tmp/http4s/test.scala
import org.http4s.implicits._
import org.http4s.server.blaze._
import org.http4s.server.Router
import org.http4s.server.middleware.CORS
import org.http4s._
import org.http4s.dsl.Http4sDsl
import zio._
import zio.clock._
import zio.console._
import zio.interop.catz._
trait HelloRepository
{
def getHello(name: String): ZIO[AppEnvironment, Nothing, String]
}
trait AppEnvironment extends Console with Clock
{
val helloRepository: HelloRepository
}
object Main extends App {
type AppTask[A] = RIO[AppEnvironment, A]
val dsl: Http4sDsl[AppTask] = Http4sDsl[AppTask]
import dsl._
val httpApp: HttpApp[AppTask] = Router[AppTask](
"/" -> HttpRoutes.of[AppTask] {
case GET -> Root / "hello" / name => Ok( ZIO.accessM[AppEnvironment](_.helloRepository.getHello(name)) )
}
).orNotFound
val program = for {
server <- ZIO.runtime[AppEnvironment]
.flatMap {
implicit rts =>
BlazeServerBuilder[AppTask]
.bindHttp(8080, "0.0.0.0")
.withHttpApp(CORS(httpApp))
.serve
.compile
.drain
}
} yield server
val runEnv = new AppEnvironment with Console.Live with Clock.Live
{
val helloRepository = new HelloRepository
{
def getHello(name: String): ZIO[AppEnvironment, Nothing, String] = ZIO.succeed(s"Hello $name")
}
}
def run(args: List[String]) =
program
.provide(runEnv)
.foldM(err => putStrLn(s"Execution failed with: $err") *> ZIO.succeed(1), _ => ZIO.succeed(0))
}
/tmp/http4s/build.sbt
val Http4sVersion = "0.20.0"
val CatsVersion = "2.0.0"
val ZioCatsVersion = "2.0.0.0-RC3"
val ZioVersion = "1.0.0-RC13"
val LogbackVersion = "1.2.3"
lazy val root = (project in file("."))
.settings(
organization := "example",
name := "example",
version := "0.0.1-SNAPSHOT",
scalaVersion := "2.12.8",
scalacOptions ++= Seq("-Ypartial-unification"),
libraryDependencies ++= Seq(
"org.typelevel" %% "cats-effect" % CatsVersion,
"dev.zio" %% "zio" % ZioVersion,
"dev.zio" %% "zio-interop-cats" % ZioCatsVersion,
"org.http4s" %% "http4s-blaze-server" % Http4sVersion,
"org.http4s" %% "http4s-dsl" % Http4sVersion,
"ch.qos.logback" % "logback-classic" % LogbackVersion,
),
addCompilerPlugin("org.spire-math" %% "kind-projector" % "0.9.6"),
addCompilerPlugin("com.olegpy" %% "better-monadic-for" % "0.2.4")
)
scalacOptions ++= Seq(
"-deprecation", // Emit warning and location for usages of deprecated APIs.
"-encoding", "UTF-8", // Specify character encoding used by source files.
"-language:higherKinds", // Allow higher-kinded types
"-language:postfixOps", // Allows operator syntax in postfix position (deprecated since Scala 2.10)
"-feature", // Emit warning and location for usages of features that should be imported explicitly.
"-Ypartial-unification", // Enable partial unification in type constructor inference
"-Xfatal-warnings", // Fail the compilation if there are any warnings
)
sample execution
bash-3.2$ cd /tmp/http4s
bash-3.2$ sbt
...
sbt:example> compile
...
[info] Done compiling.
[success] Total time: 5 s, completed Oct 24, 2019 11:20:53 PM
sbt:example> run
...
[info] Running Main
23:21:03.720 [zio-default-async-1-163838348] INFO org.http4s.blaze.channel.nio1.NIO1SocketServerGroup - Service bound to address /0:0:0:0:0:0:0:0:8080
23:21:03.725 [blaze-selector-0] DEBUG org.http4s.blaze.channel.nio1.SelectorLoop - Channel initialized.
23:21:03.732 [zio-default-async-1-163838348] INFO org.http4s.server.blaze.BlazeServerBuilder -
_ _ _ _ _
| |_| |_| |_ _ __| | | ___
| ' \ _| _| '_ \_ _(_-<
|_||_\__|\__| .__/ |_|/__/
|_|
23:21:03.796 [zio-default-async-1-163838348] INFO org.http4s.server.blaze.BlazeServerBuilder - http4s v0.20.0 on blaze v0.14.0 started at http://[0:0:0:0:0:0:0:0]:8080/
23:21:11.070 [blaze-selector-1] DEBUG org.http4s.blaze.channel.nio1.SelectorLoop - Channel initialized.
23:21:11.070 [blaze-selector-1] DEBUG org.http4s.blaze.channel.nio1.NIO1HeadStage - Starting up.
23:21:11.070 [blaze-selector-1] DEBUG org.http4s.blaze.channel.nio1.NIO1HeadStage - Stage NIO1HeadStage sending inbound command: Connected
23:21:11.070 [blaze-selector-1] DEBUG org.http4s.server.blaze.Http1ServerStage$$anon$1 - Starting HTTP pipeline
23:21:11.072 [blaze-selector-1] DEBUG org.http4s.blazecore.IdleTimeoutStage - Starting idle timeout stage with timeout of 30000 ms
At this point after opening http://localhost:8080/hello/there I observed the expected output in the browser.
Hope this helps.
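For reference, the original >>> error comes from the environment types not lining up: server needs AppEnvironment (Clock with Console with HelloRepository), but ZManaged.environment[ZEnv] only supplies ZEnv, which contains no HelloRepository, so the R1 >: ZEnv bound of >>> cannot be satisfied. In the original ManagedApp style, the server effect can instead be given a concrete environment. A sketch, assuming some appEnv value (hypothetical here) implementing Clock with Console with HelloRepository:
(for {
  _ <- server.toManaged_.provide(appEnv)
} yield ())
  .foldM(err => putStrLn(s"Execution failed with: $err").as(1).toManaged_, _ => ZManaged.succeed(0))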

Akka.actor.dispatcher NoSuchMethod exception

I am trying to learn about Akka actors and I am running the following example. My problem is that when I run it through the IntelliJ IDEA IDE it works perfectly fine. But when I run it using the jar created by sbt assembly, it throws
java.lang.NoSuchMethodError: akka.actor.ActorSystem.dispatcher()Lscala/concurrent/ExecutionContextExecutor
which I cannot debug because it works fine in the IDE.
import akka.actor.{ActorRef, ActorSystem, Props}
import akka.pattern.ask
import akka.util.Timeout
import scala.collection.mutable
import scala.concurrent.Future
import scala.concurrent.duration._
object Runner {
def main(args: Array[String]) {
run()
}
def run() = {
val system = ActorSystem("my-system")
import system.dispatcher
val props = Props[Manager]
val pool = mutable.ArrayBuffer.empty[(Int, ActorRef)]
for (i <- 1 to 10) {
pool += ((i, system.actorOf(props)))
}
val futures = pool.map {
case (x: Int, y: ActorRef) =>
val future = ask(y, Echo(x))(Timeout(100 seconds)).mapTo[Int]
println(future.toString)
future
}
/*Next line causes Exception*/
val futureList = Future.sequence(futures)
val result = futureList.map(x => {
x.sum
})
result onSuccess {
case sum => println(sum)
}
pool.foreach(x => system.stop(x._2))
system.shutdown()
}
}
The sbt file I am using is the following.
lazy val commonSettings = Seq(
organization := "foobar",
version := "1.0",
scalaVersion := "2.10.6",
test in assembly := {}
)
lazy val root = (project).aggregate(redis).settings(commonSettings: _*).
settings(
name := "scala_code_root",
version := "1.0",
scalaVersion := "2.10.6"
exportJars := false
)
lazy val myakka =(project in file("myakka")).settings(commonSettings: _*).settings(
libraryDependencies += "com.typesafe.akka" % "akka-actor_2.10" % "2.3.15"
)
The exception is thrown at the line val futureList = Future.sequence(futures). Apparently the method is there, because both IDEA and sbt-assembly use the same sbt file. What could be the cause of the Exception?
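This pattern (works in the IDE, fails from the assembled jar with NoSuchMethodError) usually means the uber jar ends up containing, or running against, a different binary version of akka-actor or scala-library than the one compiled against. A small diagnostic sketch that could be dropped into main to see which jar actually supplied ActorSystem at runtime and what signature dispatcher has there:
// print the jar the ActorSystem class was loaded from
println(classOf[akka.actor.ActorSystem].getProtectionDomain.getCodeSource.getLocation)
// print the runtime signature(s) of the dispatcher method
classOf[akka.actor.ActorSystem].getMethods
  .filter(_.getName == "dispatcher")
  .foreach(println)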