Why can't sbt resolve akka-remote? - scala

When I build and run this program with sbt, I get the following errors:
[info] Updating {file:/opt/ifkaar/akkaprojects/calculation/}calculation...
[info] Resolving com.typesafe.akka#akka-remote;2.3.4 ...
[warn] module not found: com.typesafe.akka#akka-remote;2.3.4
[warn] ==== local: tried
[warn] /home/sarawaheed/.ivy2/local/com.typesafe.akka/akka-remote/2.3.4/ivys/ivy.xml
[warn] ==== public: tried
[warn] http://repo1.maven.org/maven2/com/typesafe/akka/akka-remote/2.3.4/akka-remote-2.3.4.pom
[warn] ==== Typesafe Repository: tried
[warn] http://repo.typesafe.com/typesafe/releases/com/typesafe/akka/akka-remote/2.3.4/akka-remote-2.3.4.pom
[info] Resolving jline#jline;2.11 ...
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: UNRESOLVED DEPENDENCIES ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: com.typesafe.akka#akka-remote;2.3.4: not found
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[trace] Stack trace suppressed: run last *:update for the full output.
[error] (*:update) sbt.ResolveException: unresolved dependency: com.typesafe.akka#akka-remote;2.3.4: not found
[error] Total time: 2 s, completed Jul 17, 2014 3:38:42 PM
Here is my build.sbt file:
name := "calculation"
version := "1.0"
scalaVersion := "2.11.1"
resolvers += "Typesafe Repository" at "http://repo.typesafe.com/typesafe/releases/"
libraryDependencies += "com.typesafe.akka" %% "akka-actor" % "2.3.4,"com.typesafe.akka" %% "akka-remote" % "2.3.4""
Here is my PI.scala code:
import akka.actor._
import akka.routing.RoundRobinRouter
import akka.util.Duration
import akka.util.duration._

object Pi extends App {

  calculate(nrOfWorkers = 4, nrOfElements = 10000, nrOfMessages = 10000)

  sealed trait PiMessage
  case object Calculate extends PiMessage
  case class Work(start: Int, nrOfElements: Int) extends PiMessage
  case class Result(value: Double) extends PiMessage
  case class PiApproximation(pi: Double, duration: Duration)

  class Worker extends Actor {

    def calculatePiFor(start: Int, nrOfElements: Int): Double = {
      var acc = 0.0
      for (i ← start until (start + nrOfElements))
        acc += 4.0 * (1 - (i % 2) * 2) / (2 * i + 1)
      acc
    }

    def receive = {
      case Work(start, nrOfElements) ⇒
        sender ! Result(calculatePiFor(start, nrOfElements)) // perform the work
    }
  }

  class Master(nrOfWorkers: Int, nrOfMessages: Int, nrOfElements: Int, listener: ActorRef)
    extends Actor {

    var pi: Double = _
    var nrOfResults: Int = _
    val start: Long = System.currentTimeMillis

    val workerRouter = context.actorOf(
      Props[Worker].withRouter(RoundRobinRouter(nrOfWorkers)), name = "workerRouter")

    def receive = {
      case Calculate ⇒
        for (i ← 0 until nrOfMessages) workerRouter ! Work(i * nrOfElements, nrOfElements)
      case Result(value) ⇒
        pi += value
        nrOfResults += 1
        if (nrOfResults == nrOfMessages) {
          // Send the result to the listener
          listener ! PiApproximation(pi, duration = (System.currentTimeMillis - start).millis)
          // Stops this actor and all its supervised children
          context.stop(self)
        }
    }
  }

  class Listener extends Actor {
    def receive = {
      case PiApproximation(pi, duration) ⇒
        println("\n\tPi approximation: \t\t%s\n\tCalculation time: \t%s"
          .format(pi, duration))
        context.system.shutdown()
    }
  }

  def calculate(nrOfWorkers: Int, nrOfElements: Int, nrOfMessages: Int) {
    // Create an Akka system
    val system = ActorSystem("PiSystem")

    // create the result listener, which will print the result and shutdown the system
    val listener = system.actorOf(Props[Listener], name = "listener")

    // create the master
    val master = system.actorOf(
      Props(new Master(nrOfWorkers, nrOfMessages, nrOfElements, listener)),
      name = "master")

    // start the calculation
    master ! Calculate
  }
}
Note: I'm following this tutorial.

The libraryDependencies entry in your build.sbt file should look like this:
libraryDependencies ++= Seq(
  "com.typesafe.akka" %% "akka-actor" % "2.3.4",
  "com.typesafe.akka" %% "akka-remote" % "2.3.4"
)
The ++= operator is a list concatenation operator: it tells sbt to append this new collection of dependencies to the existing ones, whatever those may be.
The += operator appends a single item rather than concatenating a whole collection. For example:
libraryDependencies += "com.typesafe.akka" %% "akka-actor" % "2.3.4"
libraryDependencies += "com.typesafe.akka" %% "akka-remote" % "2.3.4"
The %% operator is shorthand for explicitly specifying the Scala version the required libraries are compiled against; there is a different artifact name for each Scala version. For example, instead of writing:
"com.typesafe.akka" % "akka-actor_2.11" % "2.3.4"
You write:
"com.typesafe.akka" %% "akka-actor" % "2.3.4"
The declared scalaVersion setting is used to "expand" the %% operator. Note that only the X.Y part of the full X.Y.Z version is used, because Scala releases are binary compatible when only Z changes, but not when X or Y change.
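For example, with the scalaVersion declared in the question (2.11.1), these two lines resolve the same artifact:
  // %% appends the binary suffix derived from scalaVersion := "2.11.1":
  libraryDependencies += "com.typesafe.akka" %% "akka-remote" % "2.3.4"
  // ...which is the same as spelling the suffix out yourself:
  libraryDependencies += "com.typesafe.akka" % "akka-remote_2.11" % "2.3.4"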
You can also use the following to avoid duplication:
libraryDependencies ++= Seq("actor", "remote").map("akka-" + _).map("com.typesafe.akka" %% _ % "2.3.4")
And since akka-remote depends on akka-actor, the following alone suffices:
libraryDependencies += "com.typesafe.akka" %% "akka-remote" % "2.3.4"

Related

Scala Flink get java.lang.NoClassDefFoundError: scala/Product$class after using case class for customized DeserializationSchema

It works fine when using a regular class, but I get the java.lang.NoClassDefFoundError: scala/Product$class error after changing the class to a case class.
I'm not sure whether it is an sbt packaging problem or a code problem.
I'm using:
sbt
scala: 2.11.12
java: 8
sbt assembly to package
package example
import java.util.Properties
import java.nio.charset.StandardCharsets
import org.apache.flink.api.scala._
import org.apache.flink.streaming.util.serialization.{DeserializationSchema, SerializationSchema}
import org.apache.flink.streaming.api.scala.{DataStream, StreamExecutionEnvironment}
import org.apache.flink.streaming.connectors.kafka.{FlinkKafkaConsumer, FlinkKafkaProducer}
import org.apache.flink.streaming.api.watermark.Watermark
import org.apache.flink.streaming.api.functions.AssignerWithPunctuatedWatermarks
import org.apache.flink.api.common.typeinfo.TypeInformation
import Config._
case class Record(
  id: String,
  startTime: Long
) {}

class RecordDeSerializer extends DeserializationSchema[Record] with SerializationSchema[Record] {
  override def serialize(record: Record): Array[Byte] = {
    return "123".getBytes(StandardCharsets.UTF_8)
  }
  override def deserialize(b: Array[Byte]): Record = {
    Record("1", 123)
  }
  override def isEndOfStream(record: Record): Boolean = false
  override def getProducedType: TypeInformation[Record] = {
    createTypeInformation[Record]
  }
}

object RecordConsumer {
  def main(args: Array[String]): Unit = {
    val config: Properties = {
      var p = new Properties()
      p.setProperty("zookeeper.connect", Config.KafkaZookeeperServers)
      p.setProperty("bootstrap.servers", Config.KafkaBootstrapServers)
      p.setProperty("group.id", Config.KafkaGroupID)
      p
    }

    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.enableCheckpointing(1000)

    var consumer = new FlinkKafkaConsumer[Record](
      Config.KafkaTopic,
      new RecordDeSerializer(),
      config
    )
    consumer.setStartFromEarliest()

    val stream = env.addSource(consumer).print

    env.execute("record consumer")
  }
}
Error
2020-08-05 04:07:33,963 INFO org.apache.flink.runtime.checkpoint.CheckpointCoordinator - Discarding checkpoint 1670 of job 4de8831901fa72790d0a9a973cc17dde.
java.lang.NoClassDefFoundError: scala/Product$class
...
build.sbt
My first thought was that a version might be wrong, but everything works fine when using a normal class.
Here is build.sbt:
ThisBuild / resolvers ++= Seq(
  "Apache Development Snapshot Repository" at "https://repository.apache.org/content/repositories/snapshots/",
  Resolver.mavenLocal
)

name := "deedee"
version := "0.1-SNAPSHOT"
organization := "dexterlab"

ThisBuild / scalaVersion := "2.11.8"

val flinkVersion = "1.8.2"

val flinkDependencies = Seq(
  "org.apache.flink" %% "flink-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-streaming-java" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-connector-kafka" % flinkVersion,
)

val thirdPartyDependencies = Seq(
  "com.github.nscala-time" %% "nscala-time" % "2.24.0",
  "com.typesafe.play" %% "play-json" % "2.6.14",
)

lazy val root = (project in file(".")).
  settings(
    libraryDependencies ++= flinkDependencies,
    libraryDependencies ++= thirdPartyDependencies,
    libraryDependencies += "org.scala-lang" % "scala-compiler" % scalaVersion.value,
  )

assembly / mainClass := Some("dexterlab.TelecoDataConsumer")

// make run command include the provided dependencies
Compile / run := Defaults.runTask(Compile / fullClasspath,
  Compile / run / mainClass,
  Compile / run / runner
).evaluated

// stay inside the sbt console when we press "ctrl-c" while a Flink programme executes with "run" or "runMain"
Compile / run / fork := true
Global / cancelable := true

// exclude Scala library from assembly
assembly / assemblyOption := (assembly / assemblyOption).value.copy(includeScala = false)

autoCompilerPlugins := true
I finally got it to work after adding this line to build.sbt:
  assembly / assemblyOption := (assembly / assemblyOption).value.copy(includeScala = true)
This includes the Scala library in the jar produced by sbt assembly. (scala/Product$class exists only in the Scala 2.11 library and earlier, so the error means the 2.11-compiled job classes could not find a matching scala-library at runtime; bundling it fixes that.)

Apache flink (1.9.1) runtime exception when using case classes in scala (2.12.8)

I am using a case class in a Scala (2.12.8) Apache Flink (1.9.1) application. I get the following exception when I run the code below: Caused by: java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)V.
NOTE: I have added a default constructor as per the suggestion in "java.lang.NoSuchMethodException for init method in Scala case class", but that does not work in my case.
Here is the complete code:
package com.zignallabs
import org.apache.flink.api.scala._
/**
// Implements the program that reads from a Element list, Transforms it into tuple and outputs to TaskManager
*/
case class AddCount(firstP: String, count: Int) {
  def this() = this("default", 1) // No help when added default constructor as per https://stackoverflow.com/questions/51129809/java-lang-nosuchmethodexception-for-init-method-in-scala-case-class
}

object WordCount {
  def main(args: Array[String]): Unit = {
    // set up the execution environment
    val env = ExecutionEnvironment.getExecutionEnvironment

    // get input data
    val input = env.fromElements(" one", "two", "three", "four", "five", "end of test")

    // ***** Line 31 throws the exception
    // Caused by: java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)V
    //   at com.zignallabs.AddCount.<init>(WordCount.scala:7)
    //   at com.zignallabs.WordCount$.$anonfun$main$1(WordCount.scala:31)
    //   at org.apache.flink.api.scala.DataSet$$anon$1.map(DataSet.scala:490)
    //   at org.apache.flink.runtime.operators.chaining.ChainedMapDriver.collect(ChainedMapDriver.java:79)
    //   at org.apache.flink.runtime.operators.util.metrics.CountingCollector.collect(CountingCollector.java:35)
    //   at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:196)
    //   at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
    //   at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
    //   at java.lang.Thread.run(Thread.java:748)
    val transform = input.map { w => AddCount(w, 1) } // <- Throwing exception

    // execute and print result
    println(transform)
    transform.print()
    transform.printOnTaskManager(" Word")
    env.execute()
  }
}
The runtime exception is:
at com.zignallabs.AddCount.<init>(WordCount.scala:7)
at com.zignallabs.WordCount$.$anonfun$main$1(WordCount.scala:31)
at org.apache.flink.api.scala.DataSet$$anon$1.map(DataSet.scala:490)
at org.apache.flink.runtime.operators.chaining.ChainedMapDriver.collect(ChainedMapDriver.java:79)
at org.apache.flink.runtime.operators.util.metrics.CountingCollector.collect(CountingCollector.java:35)
at org.apache.flink.runtime.operators.DataSourceTask.invoke(DataSourceTask.java:196)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
at java.lang.Thread.run(Thread.java:748)
I am building and running Flink locally, using a local Flink cluster with Flink version 1.9.1.
Here is the build.sbt file:
name := "flink191KafkaScala"
version := "0.1-SNAPSHOT"
organization := "com.zignallabs"
scalaVersion := "2.12.8"
val flinkVersion = "1.9.1"
//javacOptions ++= Seq("-source", "1.7", "-target", "1.7")
val http4sVersion = "0.16.6"
resolvers ++= Seq(
  "Local Ivy" at "file:///" + Path.userHome + "/.ivy2/local",
  "Local Ivy Cache" at "file:///" + Path.userHome + "/.ivy2/cache",
  "Local Maven Repository" at "file:///" + Path.userHome + "/.m2/repository",
  "Artifactory Cache" at "https://zignal.artifactoryonline.com/zignal/zignal-repos"
)
val excludeCommonsLogging = ExclusionRule(organization = "commons-logging")
libraryDependencies ++= Seq(
  "org.apache.flink" %% "flink-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-clients" % "1.9.1",
  // Upgrade to flink-connector-kafka_2.11
  "org.apache.flink" %% "flink-connector-kafka-0.11" % "1.9.1",
  //"org.scalaj" %% "scalaj-http" % "2.4.2",
  "com.squareup.okhttp3" % "okhttp" % "4.2.2"
)
publishTo := Some("Artifactory Realm" at "https://zignal.artifactoryonline.com/zignal/zignal")
credentials += Credentials("Artifactory Realm", "zignal.artifactoryonline.com", "buildserver", "buildserver")
//mainClass in Compile := Some("com.zignallabs.StoryCounterTopology")
mainClass in Compile := Some("com.zignallabs.WordCount")
scalacOptions ++= Seq(
  "-feature",
  "-unchecked",
  "-deprecation",
  "-language:implicitConversions",
  "-Yresolve-term-conflict:package",
  "-language:postfixOps",
  "-target:jvm-1.8")
lazy val root = project.in(file(".")).configs(IntegrationTest)
If you want default arguments for a case class constructor, it's much more idiomatic Scala to define them like this:
case class AddCount ( firstP: String = "default", count: Int = 1)
This is roughly equivalent to hand-writing auxiliary constructors like the following (under the hood the compiler generates default-argument methods rather than constructor overloads):
case class AddCount(firstP: String, count: Int) {
  def this() = this("default", 1)
  def this(firstP: String) = this(firstP, 1)
  def this(count: Int) = this("default", count)
}
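A quick usage sketch of the default-argument version (the values here are illustrative):
  // With case class AddCount(firstP: String = "default", count: Int = 1):
  val a = AddCount()          // AddCount("default", 1)
  val b = AddCount("flink")   // AddCount("flink", 1)
  val c = AddCount(count = 5) // AddCount("default", 5)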
I am now able to run this application using Scala 2.12. The issue was in the environment: I needed to make sure there were no conflicting binaries, especially ones built for Scala 2.11 mixed with ones built for Scala 2.12.
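One way to guard against that, sketched under the assumption that the job is built with sbt: declare Scala dependencies with %% so the artifact suffix always follows the project's scalaVersion, and then inspect what was actually resolved.
  // build.sbt sketch: %% appends the project's Scala binary suffix (_2.12 here),
  // so changing scalaVersion alone keeps all Scala artifacts consistent.
  scalaVersion := "2.12.8"
  libraryDependencies += "org.apache.flink" %% "flink-streaming-scala" % "1.9.1" % "provided"
From the sbt shell, evicted and show update list the resolved artifacts; any name ending in _2.11 inside a 2.12 build is exactly the kind of conflicting binary meant above.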

Scala Akka Streams project fails to compile, "value ~> is not a member of akka.streams.Outlet[In]"

I am using the Akka Streams API in a Scala project, working in IntelliJ IDEA with the sbt plugin. I have a worker pool Flow as described here: https://doc.akka.io/docs/akka/current/scala/stream/stream-cookbook.html. Here is my code:
package streams
import akka.NotUsed
import akka.stream.scaladsl.{Balance, Flow, GraphDSL, Merge}
import akka.stream.{FlowShape, Graph}
object WorkerPoolFlow {
  def apply[In, Out](
      worker: Flow[In, Out, Any],
      workerCount: Int): Graph[FlowShape[In, Out], NotUsed] = {
    GraphDSL.create() { implicit b =>
      val balance = b.add(Balance[In](workerCount, waitForAllDownstreams = true))
      val merge = b.add(Merge[Out](workerCount))

      for (i <- 0 until workerCount)
        balance.out(i) ~> worker.async ~> merge.in(i)

      FlowShape(
        balance.in,
        merge.out)
    }
  }
}
For some reason the project is now failing to compile, giving this error: value ~> is not a member of akka.stream.Outlet[In].
It compiled fine until today. The only changes I am aware of making are installing the scalafmt formatting plugin and adding a few new library dependencies in build.sbt. Here is my build.sbt:
name := "myProject"
version := "0.1"
scalaVersion := "2.11.11"
unmanagedJars in Compile += file("localDep1.jar")
unmanagedJars in Compile += file("localDep2.jar")
libraryDependencies += "io.spray" %% "spray-json" % "1.3.3"
libraryDependencies += "com.typesafe.akka" %% "akka-actor" % "2.5.6"
libraryDependencies += "com.typesafe.akka" %% "akka-stream" % "2.5.6"
libraryDependencies += "com.typesafe.akka" %% "akka-testkit" % "2.5.6" % Test
libraryDependencies += "com.typesafe.akka" %% "akka-stream-testkit" % "2.5.6" % Test
libraryDependencies += "com.47deg" %% "fetch" % "0.6.0"
libraryDependencies += "ch.qos.logback" % "logback-classic" % "1.2.3"
I have tried reloading SBT, building from SBT outside of IDEA, removing and re-adding dependencies, and cleaning the project, with no luck.
Import GraphDSL.Implicits._ inside apply; that import is what provides the ~> operator:
object WorkerPoolFlow {
  def apply[In, Out](
      worker: Flow[In, Out, Any],
      workerCount: Int): Graph[FlowShape[In, Out], NotUsed] = {
    import GraphDSL.Implicits._

    GraphDSL.create() { implicit b =>
      ...
    }
  }
}
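For completeness, a usage sketch under the question's Akka 2.5.x setup (the worker flow, names, and numbers here are illustrative, not from the original project):
  import akka.NotUsed
  import akka.actor.ActorSystem
  import akka.stream.ActorMaterializer
  import akka.stream.scaladsl.{Flow, Sink, Source}

  // Assumes the WorkerPoolFlow object above, with the GraphDSL.Implicits._ import added.
  object WorkerPoolExample extends App {
    implicit val system: ActorSystem = ActorSystem("pool-example")
    implicit val materializer: ActorMaterializer = ActorMaterializer()

    val worker: Flow[Int, Int, NotUsed] = Flow[Int].map(_ * 2)
    val pooled = Flow.fromGraph(WorkerPoolFlow(worker, workerCount = 4))

    // Doubles 1..10 across four async worker copies and prints the results.
    Source(1 to 10).via(pooled).runWith(Sink.foreach(println))
  }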

ToolBox Import Error

I'm getting the following error when compiling the following toy class:
package com.example
import scala.tools.reflect.ToolBox
import scala.reflect.runtime.{currentMirror => m}
object Hello {
  def main(args: Array[String]): Unit = {
    println("Hello, world!")
  }
}
[info] Loading project definition from /Users/me/Temp/Bar/project
[info] Set current project to Bar (in build file:/Users/me/Temp/Bar/)
[info] Compiling 1 Scala source to /Users/me/Temp/Bar/target/scala-2.11/classes...
[error] /Users/me/Temp/Bar/src/main/scala/com/example/Hello.scala:3: object tools is not a member of package scala
[error] import scala.tools.reflect.ToolBox
[error] ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
This is my build.sbt file:
name := """Bar"""
version := "1.0"
scalaVersion := "2.11.8"
// Change this to another test framework if you prefer
libraryDependencies += "org.scalatest" %% "scalatest" % "2.2.4" % "test"
libraryDependencies += "org.scala-lang" % "scala-reflect" % "2.11.8"
// Uncomment to use Akka
//libraryDependencies += "com.typesafe.akka" %% "akka-actor" % "2.3.11"
The following dependency fixed the issue:
libraryDependencies += "org.scala-lang" % "scala-compiler" % "2.11.8"
Is this the best solution?
The ToolBox class is part of the compiler, not the public reflection API, so the scala-compiler dependency is indeed needed.
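It is a reasonable fix. A variant that avoids hard-coding the version is to let the compiler artifact track the project's Scala version, and with that in place the ToolBox can be used at runtime; a minimal sketch (the eval expression is only an illustration):
  // build.sbt
  libraryDependencies += "org.scala-lang" % "scala-compiler" % scalaVersion.value

  // Demo.scala
  import scala.reflect.runtime.{currentMirror => m}
  import scala.tools.reflect.ToolBox

  object ToolBoxDemo extends App {
    val tb = m.mkToolBox()
    println(tb.eval(tb.parse("1 + 1"))) // prints 2
  }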

Using Spring-Data-Neo4j advanced mapping mode in SBT

I'd like to use the advanced mapping mode of spring-data-neo4j in my Scala sbt project (hosted on GitHub).
I can store Node entities in the database with the repository, but I cannot make the AspectJ weaving work.
This is what I have so far:
build.sbt:
resolvers ++= Seq(
  "spring" at "http://repo.spring.io/milestone",
  "neo4j-releases" at "http://m2.neo4j.org/releases/"
)

libraryDependencies ++= Seq(
  "org.springframework.data" % "spring-data-neo4j" % "3.0.0.M1" % "compile",
  "org.springframework.data" % "spring-data-neo4j-aspects" % "3.0.0.M1" % "compile",
  "javax.persistence" % "persistence-api" % "1.0" % "compile",
  "javax.validation" % "validation-api" % "1.0.0.GA" % "compile",
  "junit" % "junit" % "4.11" % "test",
  "com.novocode" % "junit-interface" % "0.9" % "test",
  "org.springframework" % "spring-test" % "4.0.0.RELEASE" % "test"
)

Seq(aspectjSettings: _*)

verbose in Aspectj := false

showWeaveInfo in Aspectj := false

inputs in Aspectj <+= compiledClasses

binaries in Aspectj <++= update map { report: UpdateReport =>
  report.matching(
    moduleFilter(organization = "org.springframework.data", name = "spring-data-neo4j-aspects")
  )
}

products in Compile <<= products in Aspectj

products in Runtime <<= products in Compile
project/plugins.sbt:
addSbtPlugin("com.typesafe.sbt" % "sbt-aspectj" % "0.9.4")
Node class and repository:
@NodeEntity
class Node {
  @GraphId
  private var graphId: java.lang.Long = _
}
trait NodeRepository extends GraphRepository[Node]
Test:
@ContextConfiguration(locations = Array("classpath*:/META-INF/spring/module-context.xml"))
@RunWith(classOf[SpringJUnit4ClassRunner])
class SDNTest extends AbstractJUnit4SpringContextTests {
  @Autowired private var nodeRepository: NodeRepository = null

  @Test
  def persist {
    val node = new Node()
    //nodeRepository.save(node)
    node.persist()
  }
}
When I try to run the test, I get these errors:
$ sbt test
[info] Weaving 1 input with 1 AspectJ binary to target/scala-2.10/aspectj/classes...
[error] error at sdntest/Node.scala::0 The type sdntest.Node must implement the inherited abstract method org.springframework.data.neo4j.aspects.core.GraphBacked.setPersistentState(Ljava/lang/Object;)
[error] see also: org/springframework/data/neo4j/aspects/core/GraphBacked.java::0
[error] see also: org/springframework/data/neo4j/aspects/support/node/Neo4jNodeBacking.aj:66::0
[error] error at sdntest/Node.scala::0 The type sdntest.Node must implement the inherited abstract method org.springframework.data.neo4j.mapping.ManagedEntity.setPersistentState(Ljava/lang/Object;)
[error] see also: org/springframework/data/neo4j/mapping/ManagedEntity.java::0
[error] see also: org/springframework/data/neo4j/aspects/support/node/Neo4jNodeBacking.aj:66::0
[warn] warning at /home/felix/.ivy2/cache/org.springframework.data/spring-data-neo4j-aspects/jars/spring-data-neo4j-aspects-3.0.0.M1.jar!org/springframework/data/neo4j/aspects/support/relationship/Neo4jRelationshipBacking.class:64::0 advice defined in org.springframework.data.neo4j.aspects.support.relationship.Neo4jRelationshipBacking has not been applied [Xlint:adviceDidNotMatch]
[warn] warning at /home/felix/.ivy2/cache/org.springframework.data/spring-data-neo4j-aspects/jars/spring-data-neo4j-aspects-3.0.0.M1.jar!org/springframework/data/neo4j/aspects/support/relationship/Neo4jRelationshipBacking.class:167::0 advice defined in org.springframework.data.neo4j.aspects.support.relationship.Neo4jRelationshipBacking has not been applied [Xlint:adviceDidNotMatch]
[warn] warning at /home/felix/.ivy2/cache/org.springframework.data/spring-data-neo4j-aspects/jars/spring-data-neo4j-aspects-3.0.0.M1.jar!org/springframework/data/neo4j/aspects/support/relationship/Neo4jRelationshipBacking.class:174::0 advice defined in org.springframework.data.neo4j.aspects.support.relationship.Neo4jRelationshipBacking has not been applied [Xlint:adviceDidNotMatch]
org.aspectj.bridge.AbortException: AspectJ failed
at com.typesafe.sbt.SbtAspectj$Ajc$.runAjcMain(SbtAspectj.scala:220)
...
What am I doing wrong?
If you bump the sbt version to 0.13.2, it works:
project/build.properties:
  sbt.version=0.13.2
My guess is that something about the AspectJ plugin doesn't work in older sbt builds.