Akka.actor.dispatcher NoSuchMethod exception - scala

I am trying to learn about Akka actors and I am running the following example. When I run it through the IntelliJ IDEA IDE it works perfectly fine, but when I run it using the jar created by sbt assembly it throws java.lang.NoSuchMethodError: akka.actor.ActorSystem.dispatcher()Lscala/concurrent/ExecutionContextExecutor, which I cannot debug because it works fine in the IDE.
import akka.actor.{ActorRef, ActorSystem, Props}
import akka.pattern.ask
import akka.util.Timeout
import scala.collection.mutable
import scala.concurrent.Future
import scala.concurrent.duration._

object Runner {
  def main(args: Array[String]) {
    run()
  }

  def run() = {
    val system = ActorSystem("my-system")
    import system.dispatcher

    // Manager (an actor) and Echo (a message) are defined elsewhere in the project
    val props = Props[Manager]
    val pool = mutable.ArrayBuffer.empty[(Int, ActorRef)]
    for (i <- 1 to 10) {
      pool += ((i, system.actorOf(props)))
    }
    val futures = pool.map {
      case (x: Int, y: ActorRef) =>
        val future = ask(y, Echo(x))(Timeout(100 seconds)).mapTo[Int]
        println(future.toString)
        future
    }
    /* Next line causes the exception */
    val futureList = Future.sequence(futures)
    val result = futureList.map(x => {
      x.sum
    })
    result onSuccess {
      case sum => println(sum)
    }
    pool.foreach(x => system.stop(x._2))
    system.shutdown()
  }
}
The sbt file I am using is the following.
lazy val commonSettings = Seq(
  organization := "foobar",
  version := "1.0",
  scalaVersion := "2.10.6",
  test in assembly := {}
)

lazy val root = (project).aggregate(redis).settings(commonSettings: _*).
  settings(
    name := "scala_code_root",
    version := "1.0",
    scalaVersion := "2.10.6",
    exportJars := false
  )

lazy val myakka = (project in file("myakka")).settings(commonSettings: _*).settings(
  libraryDependencies += "com.typesafe.akka" % "akka-actor_2.10" % "2.3.15"
)
The exception is thrown at the line val futureList = Future.sequence(futures). Apparently the method is there, because both IDEA and sbt-assembly use the same sbt file. What could be the cause of the exception?

Related

Why does sbt try to pull my interproject dependency?

I have a multi-project build with a build.sbt that looks as follows:
import lmcoursier.CoursierConfiguration
import lmcoursier.definitions.Authentication

ThisBuild / version := "0.1.0-SNAPSHOT"
ThisBuild / scalaVersion := "2.12.12"

val adoMavenUsername = "."
val adoMavenPassword = "ADO_PAT"
val adoRepoIdWithView = "ADO-id"

val adoMavenRepos = Vector(
  MavenRepository(adoRepoIdWithView, s"https://adoMavenHost/adoOrganization/adoProject/_packaging/${adoRepoIdWithView.replace("#", "%40")}/maven/v1")
)

val adoAuthentication =
  Authentication(user = adoMavenUsername, password = adoMavenPassword)
    .withOptional(false)
    .withHttpsOnly(true)
    .withPassOnRedirect(false)

val coursierConfiguration = {
  val initial =
    CoursierConfiguration()
      .withResolvers(adoMavenRepos)
      .withClassifiers(Vector("", "sources"))
      .withHasClassifiers(true)
  adoMavenRepos.foldLeft(initial) {
    case (conf, repo) ⇒
      conf.addRepositoryAuthentication(repo.name, adoAuthentication)
  }
}

lazy val mainSettings = Seq(
  organization := "org.some",
  csrConfiguration := coursierConfiguration,
  updateClassifiers / csrConfiguration := coursierConfiguration
)

lazy val root = (project in file("."))
  .settings(mainSettings: _*)
  .settings(
    name := "sbt-test",
  ).aggregate(core, util)

lazy val core = (project in file("core"))
  .settings(mainSettings: _*)
  .settings(
    name := "core",
  ).dependsOn(util)

lazy val util = (project in file("util"))
  .settings(mainSettings: _*)
  .settings(
    name := "util"
  )
For some reason, coursier attempts to download the util package externally during the core/update task. This is not what I want, as it should resolve it internally as part of the project. The package is not added to libraryDependencies, so I'm baffled why it would attempt the download.
The above example will fail because the Azure DevOps credentials and Maven repository are incorrect, but it shows the attempt to download util.
It seems somehow related to this GitHub issue.
The default CoursierConfiguration constructor sets the interProjectDependencies property to an empty Vector. To fix this, manually add resolvers on top of sbt's csrConfiguration taskKey using .withResolvers.
This is what the solution looks like applied to my question, largely based on this GitHub comment:
val adoMavenUsername = "."
val adoMavenPassword = "ADO_PAT"
val adoRepoIdWithView = "ADO-id"
val adoMavenHost = "pkgs.dev.azure.com"

val adoMavenRepos = Vector(
  MavenRepository(adoRepoIdWithView, s"https://$adoMavenHost/adoOrganization/adoProject/_packaging/$adoRepoIdWithView/maven/v1")
)

lazy val mainSettings = Seq(
  organization := "org.some",
  csrConfiguration := {
    val resolvers = csrResolvers.value ++ adoMavenRepos
    val conf = csrConfiguration.value.withResolvers(resolvers.toVector)
    val adoCredentialsOpt = credentials.value.collectFirst {
      case creds: DirectCredentials if creds.host == adoMavenHost => creds
    }
    val newConfOpt = adoCredentialsOpt.map { adoCredentials =>
      val auths =
        resolvers
          .collect {
            case repo: MavenRepository if repo.root.startsWith(s"https://$adoMavenHost/") =>
              repo.name ->
                Authentication(adoCredentials.userName, adoCredentials.passwd)
          }
      auths.foldLeft(conf) { case (conf, (repoId, auth)) => conf.addRepositoryAuthentication(repoId, auth) }
    }
    newConfOpt.getOrElse(conf)
  },
  updateClassifiers / csrConfiguration := coursierConfiguration
)

how to start server/client grpc using scalapb on spark?

I have a problem running a gRPC server/client using ScalaPB on Spark.
It works totally fine when I run my code using "sbt run". I want to run this code on Spark because I will later import my Spark model to predict some labels. But when I submit my jar to Spark, I get the following error:
Exception in thread "main" io.grpc.ManagedChannelProvider$ProviderNotFoundException:
No functional server found. Try adding a dependency on the grpc-netty artifact
This is my build.sbt:
scalaVersion := "2.11.7"
PB.targets in Compile := Seq(
scalapb.gen() -> (sourceManaged in Compile).value
)
val scalapbVersion =
scalapb.compiler.Version.scalapbVersion
val grpcJavaVersion =
scalapb.compiler.Version.grpcJavaVersion
libraryDependencies ++= Seq(
// protobuf
"com.thesamet.scalapb" %% "scalapb-runtime" % scalapbVersion % "protobuf",
//for grpc
"io.grpc" % "grpc-netty" % grpcJavaVersion ,
"com.thesamet.scalapb" %% "scalapb-runtime-grpc" % scalapbVersion
)
assemblyMergeStrategy in assembly := {
case PathList("META-INF", xs # _*) => MergeStrategy.discard
case x => MergeStrategy.first
}
Using shade rules still does not work:
assemblyShadeRules in assembly := Seq(ShadeRule.rename("com.google.**" -> "shadegoogle.@1").inAll)
And this is my main:
import java.util.logging.Logger

import io.grpc.{Server, ServerBuilder}
import org.apache.spark.ml.tuning.CrossValidatorModel
import org.apache.spark.sql.SparkSession
import testproto.test.{Email, EmailLabel, RouteGuideGrpc}

import scala.concurrent.{ExecutionContext, Future}

object HelloWorldServer {
  private val logger = Logger.getLogger(classOf[HelloWorldServer].getName)

  def main(args: Array[String]): Unit = {
    val server = new HelloWorldServer(ExecutionContext.global)
    server.start()
    server.blockUntilShutdown()
  }

  private val port = 50051
}

class HelloWorldServer(executionContext: ExecutionContext) {
  self =>
  private[this] var server: Server = null

  private def start(): Unit = {
    server = ServerBuilder.forPort(HelloWorldServer.port).addService(RouteGuideGrpc.bindService(new RouteGuideImpl, executionContext)).build.start
    HelloWorldServer.logger.info("Server started, listening on " + HelloWorldServer.port)
    sys.addShutdownHook {
      System.err.println("*** shutting down gRPC server since JVM is shutting down")
      self.stop()
      System.err.println("*** server shut down")
    }
  }

  private def stop(): Unit = {
    if (server != null) {
      server.shutdown()
    }
  }

  private def blockUntilShutdown(): Unit = {
    if (server != null) {
      server.awaitTermination()
    }
  }

  private class RouteGuideImpl extends RouteGuideGrpc.RouteGuide {
    override def getLabel(request: Email): Future[EmailLabel] = {
      val replay = EmailLabel(emailId = request.emailId, label = "aaaaa")
      Future.successful(replay)
    }
  }
}
thanks
It looks like grpc-netty is not found when an uber jar is made. Instead of using ServerBuilder, change your code to use io.grpc.netty.NettyServerBuilder.
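For example, the server construction in start() could be changed along these lines (a minimal sketch; the rest of the class stays the same):
import io.grpc.netty.NettyServerBuilder

// Referencing the Netty transport directly avoids the ServiceLoader-based
// provider lookup, whose META-INF/services entries the discard merge
// strategy above removes from the uber jar.
server = NettyServerBuilder
  .forPort(HelloWorldServer.port)
  .addService(RouteGuideGrpc.bindService(new RouteGuideImpl, executionContext))
  .build
  .start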

Defining implicit encoder using scala meta and quasiquotes

I am trying to create an implicit encoder using Circe. However, this encoder will be created by an annotation, hence I am using Scalameta. Here is my code; the compiler complains about having an override statement within the quasiquotes.
class HalResource extends StaticAnnotation {
  inline def apply(defn: Any): Any = meta {
    val q"..$mods class $tName (..$params) extends $template {..$stats}" = defn
    q"object $tName {${createApply(tName)}}"
  }

  private def createApply(className: Type.Name): Defn.Def = {
    q"""
      import _root_.io.circe.Json
      import _root_.io.circe.syntax._
      import _root_.io.circe.Encoder
      implicit def encoder = Encoder[$className] {
        override def apply(a: $className): Json = {
          val (simpleFields: Seq[Term.Param], nonSimpleFields: Seq[Term.Param]) =
            params.partition(field => field.decltpe.fold(false) {
              case _: Type.Name => true
              case _ => false
            })
          val embedded: Seq[(String, Json)] = nonSimpleFields.map(field => field.name.syntax -> field.name.value.asJson)
          val simpleJsonFields: Seq[(String, Json)] = simpleFields.map(field => field.name.syntax -> field.name.value.asJson)
          val baseSeq: Seq[(String, Json)] = Seq(
            "_links" -> Json.obj(
              "href" -> Json.obj(
                "self" -> Json.fromString("self_reference")
              )
            ),
            "_embedded" -> Json.fromFields(embedded),
          ) ++ simpleJsonFields
          val result: Seq[(String, Json)] = baseSeq ++ simpleJsonFields
          Json.fromFields(result)
        }
      }
    """
  }
}
The build file is as follows:
import sbt.Keys.{scalaVersion, scalacOptions}

val circeVersion = "0.8.0"

lazy val circeDependencies = Seq(
  "io.circe" %% "circe-core",
  "io.circe" %% "circe-generic",
  "io.circe" %% "circe-parser"
).map(_ % circeVersion)

lazy val commonSettings = Seq(
  name := "Annotation",
  version := "1.0",
  scalaVersion := "2.12.2",
  scalacOptions ++= Seq("-unchecked", "-deprecation", "-feature"),
  resolvers += Resolver.sonatypeRepo("releases")
)

lazy val macroAnnotationSettings = Seq(
  addCompilerPlugin("org.scalameta" % "paradise" % "3.0.0-M9" cross CrossVersion.full),
  scalacOptions += "-Xplugin-require:macroparadise",
  scalacOptions in (Compile, console) ~= (_ filterNot (_ contains "paradise"))
)

lazy val projectThatDefinesMacroAnnotations = project.in(file("annotation-definition"))
  .settings(commonSettings)
  .settings(
    name := "HalResource",
    libraryDependencies += "org.scalameta" %% "scalameta" % "1.8.0" % Provided,
    macroAnnotationSettings)

lazy val annotation = project.in(file("."))
  .settings(commonSettings)
  .settings(macroAnnotationSettings)
  .settings(
    libraryDependencies ++= circeDependencies
  ).dependsOn(projectThatDefinesMacroAnnotations)
As a result I still get:
macro annotation could not be expanded (the most common reason for that is that you need to enable the macro paradise plugin; another possibility is that you try to use macro annotation in the same compilation run that defines it)
You are just missing new before Encoder[$className] { (there may be other errors, but that's the immediate one).
Because of this, the compiler thinks you are trying to call a generic method Encoder with the block
{
override def apply(a: $className): Json = ...
...
}
as the argument, and local methods can't be marked override.
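A minimal sketch of the corrected fragment inside the quasiquote (only the new keyword is added; the body stays as in the question):
implicit def encoder = new Encoder[$className] {
  // With `new`, this is an anonymous Encoder instance, so the
  // override of its abstract apply method is legal.
  override def apply(a: $className): Json = {
    // ... body unchanged from the question ...
  }
}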

sbt autoplugin: add javaagent for task

I have an sbt autoplugin, and when the user runs a task I want to fork a new JVM with a -javaagent. The task should measure memory using jamm.
import sbt._
import sbt.Keys._
import org.github.jamm.MemoryMeter // jamm's memory-measuring class

object SbtMemory extends AutoPlugin {
  object autoImport {
    val agentTest = inputKey[Unit]("Run task with javaagent")
  }
  import autoImport._ // bring agentTest into scope for projectSettings

  def makeAgentOptions(classpath: Classpath): String = {
    val jammJar = classpath.map(_.data).filter(_.toString.contains("jamm")).head
    s"-javaagent:$jammJar"
  }

  override lazy val projectSettings =
    Seq(
      agentTest := agentTask.value,
      fork in agentTest := true,
      javaOptions in agentTest += (dependencyClasspath in Test).map(makeAgentOptions).value
    )

  lazy val agentTask = Def.task {
    val o = new Array[Byte](1024 * 1024)
    val mm = new MemoryMeter()
    println("Size of new Array[Byte](1024*1024): " + mm.measureDeep(o))
  }
}
When I run sbt perf from the command line, I get the following exception:
java.lang.IllegalStateException: Instrumentation is not set; Jamm must be set as -javaagent
I also tried printing the javaOptions and the -javaagent option was not set.
How can I add the -javaagent javaOption inside the plugin to run the task with jamm?
Thanks!
Apparently, fork is only available for the run and test tasks. I added my own forking code and moved the measuring code to a separate class MemoryMeasure:
val mainClass: String = "MemoryMeasure"
val forkOptions = ForkOptions(
  bootJars = (fullClasspath in Test).value.files,
  runJVMOptions = Seq(
    (dependencyClasspath in Test).map(makeAgentOptions).value
  )
)
val process = Fork.java.fork(forkOptions, mainClass +: arguments)

def cancel() = {
  process.destroy()
  1
}

val exitCode = try process.exitValue() catch { case e: InterruptedException => cancel() }
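The MemoryMeasure class itself is not shown above; a minimal sketch of what such a main class could look like (the class name and the measured object are taken from the question, the rest is an assumption):
import org.github.jamm.MemoryMeter

// Runs in the forked JVM, which was started with -javaagent:<jamm jar>,
// so MemoryMeter can obtain the Instrumentation instance it needs.
object MemoryMeasure {
  def main(args: Array[String]): Unit = {
    val mm = new MemoryMeter()
    val o = new Array[Byte](1024 * 1024)
    println("Size of new Array[Byte](1024*1024): " + mm.measureDeep(o))
  }
}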

Idiomatically defining dynamic tasks in SBT 0.13?

I'm moving an SBT plugin from 0.12 over to 0.13. At various points in my plugin I schedule a dynamic set of tasks onto the SBT build graph.
Below is my old code. Is this still the idiomatic way to express this, or is it possible to leverage the macros to make everything prettier?
import sbt._
import Keys._

object Toplevel extends Build
{
  lazy val ordinals = taskKey[Seq[String]]("A list of things")
  lazy val times = taskKey[Int]("Number of times to list things")
  lazy val inParallel = taskKey[Seq[String]]("Strings to log in parallel")

  lazy val Foo = Project( id="Foo", base=file("foo"),
    settings = Defaults.defaultSettings ++ Seq(
      scalaVersion := "2.10.2",
      ordinals := Seq( "First", "Second", "Third", "Four", "Five" ),
      times := 3,
      inParallel <<= (times, ordinals, streams) flatMap
      { case (t, os, s) =>
        os.map( o => toTask( () =>
        {
          (0 until t).map( _ => o ).mkString(",")
        } ) ).join
      }
    )
  )
}
Apologies for the entirely contrived example!
EDIT
So, taking Mark's advice into account I have the following tidier code:
import sbt._
import Keys._

object Toplevel extends Build
{
  lazy val ordinals = taskKey[Seq[String]]("A list of things")
  lazy val times = taskKey[Int]("Number of times to list things")
  lazy val inParallel = taskKey[Seq[String]]("Strings to log in parallel")

  def parTask = Def.taskDyn
  {
    val t = times.value
    ordinals.value.map(o => ordinalTask(o, t)).join
  }

  def ordinalTask(o: String, t: Int) = Def.task
  {
    (0 until t).map(_ => o).mkString(",")
  }

  lazy val Foo = Project( id="Foo", base=file("foo"),
    settings = Defaults.defaultSettings ++ Seq(
      scalaVersion := "2.10.2",
      ordinals := Seq( "First", "Second", "Third", "Four", "Five" ),
      times := 3,
      inParallel := parTask.value
    )
  )
}
This seems to be nearly there, but fails the build with:
[error] /home/alex.wilson/tmp/sbt0.13/project/build.scala:13: type mismatch;
[error] found : sbt.Def.Initialize[Seq[sbt.Task[String]]]
[error] required: sbt.Def.Initialize[sbt.Task[?]]
[error] ordinals.value.map(o => ordinalTask(o, t)).join
You can use Def.taskDyn, which provides the new syntax for flatMap. The difference from Def.task is that the expected return type is a task (Initialize[Task[T]]) instead of just T. Translating your example, and using joinWith(_.join) instead of plain .join so that the Seq[Task[String]] is collapsed into a single task (this is what causes the type mismatch above):
inParallel := parTask.value

def parTask = Def.taskDyn {
  val t = times.value
  ordinals.value.map(o => ordinalTask(o, t)).joinWith(_.join)
}

def ordinalTask(o: String, t: Int) = Def.task {
  (0 until t).map(_ => o).mkString(",")
}