Can't assemble scala/akka project - scala

I'm a newbie with sbt and Scala. I just want to make a jar file to run on a server, but when I try to build the jar I get an error. The program code is very simple, only one class. When I run it in IntelliJ everything is fine, but sbt assembly fires this error:
[info] Merging files...
[error] scala.MatchError: akka\stream\OverflowStrategies$.class (of class java.lang.String)
[error] at $a019333dc409d47a4d92$.$anonfun$$sbtdef$2(D:\workspace_scala\TestSbt2\build.sbt:25)
[error] at sbtassembly.Assembly$.$anonfun$applyStrategies$6(Assembly.scala:115)
[error] at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:234)
[error] at scala.collection.Iterator.foreach(Iterator.scala:929)
[error] at scala.collection.Iterator.foreach$(Iterator.scala:929)
[error] at scala.collection.AbstractIterator.foreach(Iterator.scala:1417)
[error] at scala.collection.IterableLike.foreach(IterableLike.scala:71)
[error] at scala.collection.IterableLike.foreach$(IterableLike.scala:70)
[error] at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
[error] at scala.collection.TraversableLike.map(TraversableLike.scala:234)
[error] at scala.collection.TraversableLike.map$(TraversableLike.scala:227)
[error] at scala.collection.AbstractTraversable.map(Traversable.scala:104)
[error] at sbtassembly.Assembly$.applyStrategies(Assembly.scala:114)
[error] at sbtassembly.Assembly$.x$1$lzycompute$1(Assembly.scala:26)
[error] at sbtassembly.Assembly$.x$1$1(Assembly.scala:24)
[error] at sbtassembly.Assembly$.stratMapping$lzycompute$1(Assembly.scala:24)
[error] at sbtassembly.Assembly$.stratMapping$1(Assembly.scala:24)
[error] at sbtassembly.Assembly$.inputs$lzycompute$1(Assembly.scala:68)
[error] at sbtassembly.Assembly$.inputs$1(Assembly.scala:58)
[error] at sbtassembly.Assembly$.apply(Assembly.scala:85)
[error] at sbtassembly.Assembly$.$anonfun$assemblyTask$1(Assembly.scala:249)
[error] at scala.Function1.$anonfun$compose$1(Function1.scala:44)
[error] at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:39)
[error] at sbt.std.Transform$$anon$4.work(System.scala:66)
[error] at sbt.Execute.$anonfun$submit$2(Execute.scala:262)
[error] at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:16)
[error] at sbt.Execute.work(Execute.scala:271)
[error] at sbt.Execute.$anonfun$submit$1(Execute.scala:262)
[error] at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:174)
[error] at sbt.CompletionService$$anon$2.call(CompletionService.scala:36)
[error] at java.util.concurrent.FutureTask.run(Unknown Source)
[error] at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
[error] at java.util.concurrent.FutureTask.run(Unknown Source)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
[error] at java.lang.Thread.run(Unknown Source)
[error] (assembly) scala.MatchError: akka\stream\OverflowStrategies$.class (of class java.lang.String)
[error] Total time: 1 s, completed Jun 20, 2018 1:20:02 PM
build.sbt
name := "TestSbt2"
version := "0.1"
scalaVersion := "2.12.4"
libraryDependencies += "com.typesafe.akka" %% "akka-actor" % "2.5.13"
libraryDependencies += "com.typesafe.akka" %% "akka-stream" % "2.5.13"
libraryDependencies += "com.typesafe.akka" %% "akka-http" % "10.1.0"
assemblyMergeStrategy in assembly := {
  case PathList("reference.conf") => MergeStrategy.concat
}
project/assembly.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.6")
Program code:
import akka.actor._
import akka.http.scaladsl._
import akka.http.scaladsl.model.ws.{Message, TextMessage}
import akka.http.scaladsl.server.Directives._
import akka.stream._
import akka.stream.scaladsl._
import scala.io.StdIn
object Server2 {

  def main(args: Array[String]): Unit = {
    implicit val system = ActorSystem()
    implicit val materializer = ActorMaterializer()

    def echoFlow: Flow[Message, Message, Any] =
      Flow[Message].map {
        case tm: TextMessage.Strict => TextMessage.Strict("Test " + tm.text)
        case _ => TextMessage("Message type unsupported")
      }

    val websocketRoute =
      path("chat") {
        handleWebSocketMessages(echoFlow)
      }

    val bindingFuture = Http().bindAndHandle(websocketRoute, "127.0.0.1", 8080)

    // the rest of the sample code will go here
    println("- Started server at 127.0.0.1:8080, press enter to kill server")
    StdIn.readLine()
    system.terminate()
  }
}

The problem is that your assemblyMergeStrategy is a partial function that only matches "reference.conf"; any other path in the merge (such as akka\stream\OverflowStrategies$.class) falls through and raises the MatchError. If you add a default case to your assemblyMergeStrategy definition, it should work just fine.
For example:
assemblyMergeStrategy in assembly := {
  case "reference.conf" => MergeStrategy.concat
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}
A little edit: the sbt-assembly repo actually says:
assemblyMergeStrategy in assembly expects a function. You can't do assemblyMergeStrategy in assembly := MergeStrategy.first
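For illustration, a minimal sketch of that distinction (my example, not from the repo):

// does not compile: a bare MergeStrategy is not a String => MergeStrategy function
// assemblyMergeStrategy in assembly := MergeStrategy.first

// compiles (it is a function), but applies "first" to every path, discarding the sane defaults
assemblyMergeStrategy in assembly := { case _ => MergeStrategy.first }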

The fact that it says "MatchError", that it is thrown from the build.sbt file, and that your sbt file says

assemblyMergeStrategy in assembly := {
  case PathList("reference.conf") => MergeStrategy.concat
}

tells me that you want a fallback case: the function matches only reference.conf, so every other file path fails the match.
Add this as a second case in the same match (after your case PathList):

case x =>
  val oldStrategy = (assemblyMergeStrategy in assembly).value
  oldStrategy(x)
(copied from GitHub-sbt-assembly)

Thanks people!
This helped:
mainClass in assembly := Some("chat.Server2")

assemblyMergeStrategy in assembly := {
  case "reference.conf" => MergeStrategy.concat
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}
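For anyone landing here later: the reference.conf case matters because every Akka module ships its own reference.conf with defaults Akka needs at runtime, so those files must be concatenated rather than picked first. A minimal sketch of building and running the result, assuming sbt-assembly's default <name>-assembly-<version>.jar naming:

sbt assembly
java -jar target/scala-2.12/TestSbt2-assembly-0.1.jar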

Related

scala sbt libraryDependencies provided - Avoid downloading 3rd party library

I have the following Spark Scala code that references 3rd-party libraries:
package com.protegrity.spark

import org.apache.spark.sql.api.java.UDF2
import com.protegrity.spark.udf.ptyProtectStr
import com.protegrity.spark.udf.ptyProtectInt

class ptyProtectStr extends UDF2[String, String, String] {
  def call(input: String, dataElement: String): String = {
    return ptyProtectStr(input, dataElement);
  }
}

class ptyUnprotectStr extends UDF2[String, String, String] {
  def call(input: String, dataElement: String): String = {
    return ptyUnprotectStr(input, dataElement);
  }
}

class ptyProtectInt extends UDF2[Integer, String, Integer] {
  def call(input: Integer, dataElement: String): Integer = {
    return ptyProtectInt(input, dataElement);
  }
}

class ptyUnprotectInt extends UDF2[Integer, String, Integer] {
  def call(input: Integer, dataElement: String): Integer = {
    return ptyUnprotectInt(input, dataElement);
  }
}
I want to create a JAR file using SBT. My build.sbt looks like the following:
name := "Protegrity UDF"
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
  "com.protegrity.spark" % "udf" % "2.3.2" % "provided",
  "org.apache.spark" %% "spark-core" % "2.3.2" % "provided",
  "org.apache.spark" %% "spark-sql" % "2.3.2" % "provided"
)
As you can see, I am trying to create a thin JAR file using the "provided" option, as my Spark environment already contains those libraries.
In spite of using "provided", sbt is trying to download from Maven and throwing the error below:
[warn] Note: Unresolved dependencies path:
[error] sbt.librarymanagement.ResolveException: Error downloading com.protegrity.spark:udf:2.3.2
[error] Not found
[error] Not found
[error] not found: C:\Users\user1\.ivy2\local\com.protegrity.spark\udf\2.3.2\ivys\ivy.xml
[error] not found: https://repo1.maven.org/maven2/com/protegrity/spark/udf/2.3.2/udf-2.3.2.pom
[error] at lmcoursier.CoursierDependencyResolution.unresolvedWarningOrThrow(CoursierDependencyResolution.scala:249)
[error] at lmcoursier.CoursierDependencyResolution.$anonfun$update$35(CoursierDependencyResolution.scala:218)
[error] at scala.util.Either$LeftProjection.map(Either.scala:573)
[error] at lmcoursier.CoursierDependencyResolution.update(CoursierDependencyResolution.scala:218)
[error] at sbt.librarymanagement.DependencyResolution.update(DependencyResolution.scala:60)
[error] at sbt.internal.LibraryManagement$.resolve$1(LibraryManagement.scala:52)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$12(LibraryManagement.scala:102)
[error] at sbt.util.Tracked$.$anonfun$lastOutput$1(Tracked.scala:69)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$20(LibraryManagement.scala:115)
[error] at scala.util.control.Exception$Catch.apply(Exception.scala:228)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11(LibraryManagement.scala:115)
[error] at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11$adapted(LibraryManagement.scala:96)
[error] at sbt.util.Tracked$.$anonfun$inputChanged$1(Tracked.scala:150)
[error] at sbt.internal.LibraryManagement$.cachedUpdate(LibraryManagement.scala:129)
[error] at sbt.Classpaths$.$anonfun$updateTask0$5(Defaults.scala:2950)
[error] at scala.Function1.$anonfun$compose$1(Function1.scala:49)
[error] at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:62)
[error] at sbt.std.Transform$$anon$4.work(Transform.scala:67)
[error] at sbt.Execute.$anonfun$submit$2(Execute.scala:281)
[error] at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:19)
[error] at sbt.Execute.work(Execute.scala:290)
[error] at sbt.Execute.$anonfun$submit$1(Execute.scala:281)
[error] at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:178)
[error] at sbt.CompletionService$$anon$2.call(CompletionService.scala:37)
[error] at java.util.concurrent.FutureTask.run(Unknown Source)
[error] at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
[error] at java.util.concurrent.FutureTask.run(Unknown Source)
[error] at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
[error] at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
[error] at java.lang.Thread.run(Unknown Source)
[error] (update) sbt.librarymanagement.ResolveException: Error downloading com.protegrity.spark:udf:2.3.2
[error] Not found
[error] Not found
[error] not found: C:\Users\user1\.ivy2\local\com.protegrity.spark\udf\2.3.2\ivys\ivy.xml
[error] not found: https://repo1.maven.org/maven2/com/protegrity/spark/udf/2.3.2/udf-2.3.2.pom
What change in build.sbt should I make to skip the Maven download for "com.protegrity.spark"? Interestingly, I don't face this issue for "org.apache.spark" in the same build.
Assuming that you have the JAR file available (but not through Maven or another artifact repository) wherever you're compiling the code, just place the JAR in (by default) the lib directory within your project (the path can be changed with the unmanagedBase setting in build.sbt if you need to do that for some reason).
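If you do relocate that directory, a minimal sketch (the custom-lib name here is just an example):

// build.sbt: look for unmanaged JARs in <project-root>/custom-lib instead of lib/
unmanagedBase := baseDirectory.value / "custom-lib"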
Note that this will result in the unmanaged JAR being included in an assembly JAR. If you want to build a "slightly less fat" JAR that excludes the unmanaged JAR, you'll have to filter it out. One way to accomplish this is with
assemblyExcludedJars in assembly := {
  val cp = (fullClasspath in assembly).value
  cp.filter(_.data.getName == "name-of-unmanaged.jar")
}
If you don't have the JAR (or perhaps something very close to the JAR) handy, how exactly do you expect the compiler to typecheck your calls into the JAR?

Spark Streaming + Kafka Integration 0.8.2.1

I have problems integrating Spark with Kafka. I am using spark-streaming-kafka-0-8 and I compile with SBT.
This is my code:
import org.apache.spark.SparkConf
import org.apache.spark.streaming._
import org.apache.kafka.clients.consumer.ConsumerRecord
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.streaming.kafka._

object sparkKafka {
  def main(args: Array[String]) {
    val sparkConf = new SparkConf().setAppName("KafkaWordCount").setMaster("local[*]")
    val ssc = new StreamingContext(sparkConf, Seconds(2))

    val kafkaStream = KafkaUtils.createStream(ssc,
      "localhost:2181", "spark stream", Map("customer" -> 2))

    kafkaStream.print()
    ssc.start()
    ssc.awaitTermination()
  }
}
I received this error:
[info] Running sparkKafka
[error] (run-main-0) java.lang.NoClassDefFoundError: scala/Product$class
[error] java.lang.NoClassDefFoundError: scala/Product$class
[error] at org.apache.spark.SparkConf$DeprecatedConfig.<init>(SparkConf.scala:723)
[error] at org.apache.spark.SparkConf$.<init>(SparkConf.scala:571)
[error] at org.apache.spark.SparkConf$.<clinit>(SparkConf.scala)
[error] at org.apache.spark.SparkConf.set(SparkConf.scala:92)
[error] at org.apache.spark.SparkConf.set(SparkConf.scala:81)
[error] at org.apache.spark.SparkConf.setAppName(SparkConf.scala:118)
[error] at sparkKafka$.main(sparkKafka.scala:15)
[error] at sparkKafka.main(sparkKafka.scala)
[error] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[error] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[error] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[error] at java.lang.reflect.Method.invoke(Method.java:498)
[error] Caused by: java.lang.ClassNotFoundException: scala.Product$class
[error] at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
[error] at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
[error] at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
[error] at org.apache.spark.SparkConf$DeprecatedConfig.<init>(SparkConf.scala:723)
[error] at org.apache.spark.SparkConf$.<init>(SparkConf.scala:571)
[error] at org.apache.spark.SparkConf$.<clinit>(SparkConf.scala)
[error] at org.apache.spark.SparkConf.set(SparkConf.scala:92)
[error] at org.apache.spark.SparkConf.set(SparkConf.scala:81)
[error] at org.apache.spark.SparkConf.setAppName(SparkConf.scala:118)
[error] at sparkKafka$.main(sparkKafka.scala:15)
[error] at sparkKafka.main(sparkKafka.scala)
[error] at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[error] at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[error] at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[error] at java.lang.reflect.Method.invoke(Method.java:498)
[error] Nonzero exit code: 1
[error] (Compile / run) Nonzero exit code: 1
[error] Total time: 6 s, completed Jan 14, 2019 2:19:15 PM
This is my build.sbt file:
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.2.0"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.11" % "2.2.0"
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % "2.2.0"
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-8_2.11" % "2.2.0"
How can I integrate Spark Streaming with Kafka? I have the same problem even with spark-streaming-kafka-0-10...
Thanks
This is a version issue with Scala or Spark. Make sure you are using Scala 2.11 first: the missing scala/Product$class is a compiler artifact that exists in Scala 2.11 but not in 2.12, so running Spark's _2.11 jars on a 2.12 build fails exactly like this.
If you are using Kafka 0.10 or higher (which, if you've set up Kafka recently and are only running it locally, you likely are), then you shouldn't be using the kafka-0-8 package.
Do not mix spark-streaming-kafka-0-8 with spark-streaming-kafka-0-10
So, if you wanted to use 0-10, as answered previously, the package needs to be org.apache.spark.streaming.kafka010, not org.apache.spark.streaming.kafka
Also, note that the 0-8 does use Zookeeper (localhost:2181, for example), and 0-10 does not.
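As a hedged sketch, a build.sbt consistent with that advice might look like this (versions are illustrative; match them to your installed Spark):

scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.2.0",
  "org.apache.spark" %% "spark-streaming" % "2.2.0",
  // pick exactly one integration; for Kafka 0.10+ use kafka010
  // (and import org.apache.spark.streaming.kafka010._ in code, not ...kafka._)
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.2.0"
)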

Spark streaming from kafka topic using scala

I am new to Scala/Spark development. I have created a simple streaming application from a Kafka topic using sbt and Scala. I have the following code:
build.sbt
name := "kafka-streaming"
version := "1.0"
assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)
assemblyMergeStrategy in assembly := {
  case PathList("org", "apache", "spark", "unused", "UnusedStubClass.class") => MergeStrategy.first
  case PathList(pl @ _*) if pl.contains("log4j.properties") => MergeStrategy.concat
  case PathList("META-INF", "io.netty.versions.properties") => MergeStrategy.last
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}
scalaVersion := "2.11.8"
resolvers += "jitpack" at "https://jitpack.io"
// still want to be able to run in sbt
// https://github.com/sbt/sbt-assembly#-provided-configuration
run in Compile <<= Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run))
fork in run := true
javaOptions in run ++= Seq(
  "-Dlog4j.debug=true",
  "-Dlog4j.configuration=log4j.properties")

libraryDependencies ++= Seq(
  "com.groupon.sparklint" %% "sparklint-spark162" % "1.0.4" excludeAll (
    ExclusionRule(organization = "org.apache.spark")
  ),
  "org.apache.spark" %% "spark-core" % "2.4.0",
  "org.apache.spark" %% "spark-sql" % "2.4.0",
  "org.apache.spark" %% "spark-streaming" % "2.4.0" % "provided",
  "org.apache.spark" %% "spark-streaming-kafka" % "1.6.3"
)
WeatherDataStream.scala
package com.supergloo

import kafka.serializer.StringDecoder
import org.apache.log4j.Logger
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.dstream.{DStream, InputDStream}
import org.apache.spark.streaming.kafka.KafkaUtils

/**
 * Stream from Kafka
 */
object WeatherDataStream {

  val localLogger = Logger.getLogger("WeatherDataStream")

  def main(args: Array[String]) {
    // update
    // val checkpointDir = "./tmp"
    val sparkConf = new SparkConf().setAppName("Raw Weather")
    sparkConf.setIfMissing("spark.master", "local[5]")

    val ssc = new StreamingContext(sparkConf, Seconds(2))

    val kafkaTopicRaw = "spark-topic"
    val kafkaBroker = "127.0.01:9092"

    val topics: Set[String] = kafkaTopicRaw.split(",").map(_.trim).toSet
    val kafkaParams = Map[String, String]("metadata.broker.list" -> kafkaBroker)

    localLogger.info(s"connecting to brokers: $kafkaBroker")
    localLogger.info(s"kafkaParams: $kafkaParams")
    localLogger.info(s"topics: $topics")

    val rawWeatherStream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](ssc, kafkaParams, topics)

    localLogger.info(s"Manaaaaaaaaaf --->>>: $rawWeatherStream")

    //Kick off
    ssc.start()
    ssc.awaitTermination()
    ssc.stop()
  }
}
I have created the jar file using the command
sbt package
and run the application using the command
./spark-submit --master spark://myserver:7077 --class
com.supergloo.WeatherDataStream
/home/Manaf/kafka-streaming_2.11-1.0.jar
But I got an error like this:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/streaming/kafka/KafkaUtils$
at com.supergloo.WeatherDataStream$.main(WeatherDataStream.scala:37)
at com.supergloo.WeatherDataStream.main(WeatherDataStream.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:849)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:195)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:924)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:933)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.streaming.kafka.KafkaUtils$
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
Based on my Stack Overflow analysis, I got the idea to create the jar using the assembly command
sbt assembly
But I got the error below when executing the assembly command:
[error] 153 errors were encountered during merge
[trace] Stack trace suppressed: run last *:assembly for the full output.
[error] (*:assembly) deduplicate: different file contents found in the following:
[error] C:\Users\amanaf\.ivy2\cache\org.apache.arrow\arrow-vector\jars\arrow-vector-0.10.0.jar:git.properties
[error] C:\Users\amanaf\.ivy2\cache\org.apache.arrow\arrow-format\jars\arrow-format-0.10.0.jar:git.properties
[error] C:\Users\amanaf\.ivy2\cache\org.apache.arrow\arrow-memory\jars\arrow-memory-0.10.0.jar:git.properties
[error] deduplicate: different file contents found in the following:
[error] C:\Users\amanaf\.ivy2\cache\javax.inject\javax.inject\jars\javax.inject-1.jar:javax/inject/Inject.class
[error] C:\Users\amanaf\.ivy2\cache\org.glassfish.hk2.external\javax.inject\jars\javax.inject-2.4.0-b34.jar:javax/inject/Inject.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\amanaf\.ivy2\cache\javax.inject\javax.inject\jars\javax.inject-1.jar:javax/inject/Named.class
[error] C:\Users\amanaf\.ivy2\cache\org.glassfish.hk2.external\javax.inject\jars\javax.inject-2.4.0-b34.jar:javax/inject/Named.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\amanaf\.ivy2\cache\javax.inject\javax.inject\jars\javax.inject-1.jar:javax/inject/Provider.class
[error] C:\Users\amanaf\.ivy2\cache\org.glassfish.hk2.external\javax.inject\jars\javax.inject-2.4.0-b34.jar:javax/inject/Provider.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\amanaf\.ivy2\cache\javax.inject\javax.inject\jars\javax.inject-1.jar:javax/inject/Qualifier.class
[error] C:\Users\amanaf\.ivy2\cache\org.glassfish.hk2.external\javax.inject\jars\javax.inject-2.4.0-b34.jar:javax/inject/Qualifier.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\amanaf\.ivy2\cache\javax.inject\javax.inject\jars\javax.inject-1.jar:javax/inject/Scope.class
[error] C:\Users\amanaf\.ivy2\cache\org.glassfish.hk2.external\javax.inject\jars\javax.inject-2.4.0-b34.jar:javax/inject/Scope.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\amanaf\.ivy2\cache\javax.inject\javax.inject\jars\javax.inject-1.jar:javax/inject/Singleton.class
[error] C:\Users\amanaf\.ivy2\cache\org.glassfish.hk2.external\javax.inject\jars\javax.inject-2.4.0-b34.jar:javax/inject/Singleton.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\amanaf\.ivy2\cache\org.lz4\lz4-java\jars\lz4-java-1.4.0.jar:net/jpountz/lz4/LZ4BlockInputStream.class
[error] C:\Users\amanaf\.ivy2\cache\net.jpountz.lz4\lz4\jars\lz4-1.2.0.jar:net/jpountz/lz4/LZ4BlockInputStream.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\amanaf\.ivy2\cache\org.lz4\lz4-java\jars\lz4-java-1.4.0.jar:net/jpountz/lz4/LZ4BlockOutputStream.class
[error] C:\Users\amanaf\.ivy2\cache\net.jpountz.lz4\lz4\jars\lz4-1.2.0.jar:net/jpountz/lz4/LZ4BlockOutputStream.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\amanaf\.ivy2\cache\org.lz4\lz4-java\jars\lz4-java-1.4.0.jar:net/jpountz/lz4/LZ4Compressor.class
[error] C:\Users\amanaf\.ivy2\cache\net.jpountz.lz4\lz4\jars\lz4-1.2.0.jar:net/jpountz/lz4/LZ4Compressor.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\amanaf\.ivy2\cache\org.lz4\lz4-java\jars\lz4-java-1.4.0.jar:net/jpountz/lz4/LZ4Constants.class
[error] C:\Users\amanaf\.ivy2\cache\net.jpountz.lz4\lz4\jars\lz4-1.2.0.jar:net/jpountz/lz4/LZ4Constants.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\amanaf\.ivy2\cache\org.lz4\lz4-java\jars\lz4-java-1.4.0.jar:net/jpountz/lz4/LZ4Factory.class
[error] C:\Users\amanaf\.ivy2\cache\net.jpountz.lz4\lz4\jars\lz4-1.2.0.jar:net/jpountz/lz4/LZ4Factory.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\amanaf\.ivy2\cache\org.lz4\lz4-java\jars\lz4-java-1.4.0.jar:net/jpountz/lz4/LZ4FastDecompressor.class
[error] C:\Users\amanaf\.ivy2\cache\net.jpountz.lz4\lz4\jars\lz4-1.2.0.jar:net/jpountz/lz4/LZ4FastDecompressor.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\amanaf\.ivy2\cache\org.lz4\lz4-java\jars\lz4-java-1.4.0.jar:net/jpountz/lz4/LZ4HCJNICompressor.class
[error] C:\Users\amanaf\.ivy2\cache\net.jpountz.lz4\lz4\jars\lz4-1.2.0.jar:net/jpountz/lz4/LZ4HCJNICompressor.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\amanaf\.ivy2\cache\org.lz4\lz4-java\jars\lz4-java-1.4.0.jar:net/jpountz/lz4/LZ4HCJavaSafeCompressor$HashTable.class
[error] C:\Users\amanaf\.ivy2\cache\net.jpountz.lz4\lz4\jars\lz4-1.2.0.jar:net/jpountz/lz4/LZ4HCJavaSafeCompressor$HashTable.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\amanaf\.ivy2\cache\org.lz4\lz4-java\jars\lz4-java-1.4.0.jar:net/jpountz/lz4/LZ4HCJavaSafeCompressor.class
[error] C:\Users\amanaf\.ivy2\cache\net.jpountz.lz4\lz4\jars\lz4-1.2.0.jar:net/jpountz/lz4/LZ4HCJavaSafeCompressor.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\amanaf\.ivy2\cache\org.lz4\lz4-java\jars\lz4-java-1.4.0.jar:net/jpountz/lz4/LZ4HCJavaUnsafeCompressor$HashTable.class
[error] C:\Users\amanaf\.ivy2\cache\net.jpountz.lz4\lz4\jars\lz4-1.2.0.jar:net/jpountz/lz4/LZ4HCJavaUnsafeCompressor$HashTable.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\amanaf\.ivy2\cache\org.lz4\lz4-java\jars\lz4-java-1.4.0.jar:net/jpountz/lz4/LZ4HCJavaUnsafeCompressor.class
[error] C:\Users\amanaf\.ivy2\cache\net.jpountz.lz4\lz4\jars\lz4-1.2.0.jar:net/jpountz/lz4/LZ4HCJavaUnsafeCompressor.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\amanaf\.ivy2\cache\org.lz4\lz4-java\jars\lz4-java-1.4.0.jar:net/jpountz/lz4/LZ4JNI.class
[error] C:\Users\amanaf\.ivy2\cache\net.jpountz.lz4\lz4\jars\lz4-1.2.0.jar:net/jpountz/lz4/LZ4JNI.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\amanaf\.ivy2\cache\org.lz4\lz4-java\jars\lz4-java-1.4.0.jar:net/jpountz/lz4/LZ4JNICompressor.class
[error] C:\Users\amanaf\.ivy2\cache\net.jpountz.lz4\lz4\jars\lz4-1.2.0.jar:net/jpountz/lz4/LZ4JNICompressor.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\amanaf\.ivy2\cache\org.lz4\lz4-java\jars\lz4-java-1.4.0.jar:net/jpountz/lz4/LZ4JNIFastDecompressor.class
[error] C:\Users\amanaf\.ivy2\cache\net.jpountz.lz4\lz4\jars\lz4-1.2.0.jar:net/jpountz/lz4/LZ4JNIFastDecompressor.class
[error] deduplicate: different file contents found in the following:
[error] C:\Users\amanaf\.ivy2\cache\org.lz4\lz4-java\jars\lz4-java-1.4.0.jar:net/jpountz/lz4/LZ4JNISafeDecompressor.class
[error] C:\Users\amanaf\.ivy2\cache\net.jpountz.lz4\lz4\jars\lz4-1.2.0.jar:net/jpountz/lz4/LZ4JNISafeDecompressor.class
This issue is related to library versions. I have just updated my build.sbt like this:

name := "kafka-streaming"
version := "1.0"

assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)

assemblyMergeStrategy in assembly := {
  case PathList("org", "apache", "spark", "unused", "UnusedStubClass.class") => MergeStrategy.first
  case PathList(pl @ _*) if pl.contains("log4j.properties") => MergeStrategy.concat
  case PathList("META-INF", "io.netty.versions.properties") => MergeStrategy.last
  case x =>
    val oldStrategy = (assemblyMergeStrategy in assembly).value
    oldStrategy(x)
}

scalaVersion := "2.11.8"

resolvers += "jitpack" at "https://jitpack.io"

// still want to be able to run in sbt
// https://github.com/sbt/sbt-assembly#-provided-configuration
run in Compile <<= Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run))

fork in run := true

javaOptions in run ++= Seq(
  "-Dlog4j.debug=true",
  "-Dlog4j.configuration=log4j.properties")

libraryDependencies ++= Seq(
  "com.groupon.sparklint" %% "sparklint-spark162" % "1.0.4" excludeAll (
    ExclusionRule(organization = "org.apache.spark")
  ),
  "org.apache.spark" %% "spark-core" % "1.6.2" % "provided",
  "org.apache.spark" %% "spark-sql" % "1.6.2" % "provided",
  "org.apache.spark" %% "spark-streaming" % "1.6.2" % "provided",
  "org.apache.spark" %% "spark-streaming-kafka" % "1.6.2",
  "com.datastax.spark" %% "spark-cassandra-connector" % "1.6.0"
)
Now the issue is resolved.
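One hedged follow-up: with the Spark modules marked "provided" they stay out of the fat jar (the cluster supplies them at runtime), while the Kafka and Cassandra connectors are bundled in. The submit command should then point at the assembly artifact; assuming sbt-assembly's default naming, roughly:

./spark-submit --master spark://myserver:7077 --class com.supergloo.WeatherDataStream /home/Manaf/kafka-streaming-assembly-1.0.jar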
This is the basic code to ingest messages from Kafka into Spark Streaming to do a word-frequency count. The code is customized for a local machine.
import org.apache.kafka.clients.consumer.ConsumerConfig
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming._
import org.apache.spark.sql.SparkSession
import org.apache.spark.streaming.kafka010.LocationStrategies._
import org.apache.spark.streaming.kafka010.ConsumerStrategies._
import org.apache.spark.streaming.kafka010._
object WordFreqCount {
  def main(args: Array[String]) {
    println("Start")

    val conf = new SparkConf()
      .setMaster("local[*]")
      .setAppName("KafkaReceiver")
      .set("spark.driver.bindAddress", "127.0.0.1")
    println("conf created")

    val spark = SparkSession.builder().config(conf).getOrCreate()
    val sc = spark.sparkContext
    val ssc = new StreamingContext(sc, Seconds(10))
    println("ssc created")

    val topics = "wctopic"
    val brokers = "127.0.0.1:9092"
    val groupId = "wcgroup"
    val topicsSet = topics.split(",").toSet
    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> brokers,
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> groupId,
      "auto.offset.reset" -> "latest",
      "enable.auto.commit" -> (false: java.lang.Boolean)
    )

    val messages = KafkaUtils.createDirectStream[String, String](
      ssc,
      PreferConsistent,
      Subscribe[String, String](topicsSet, kafkaParams))

    // Get the lines, split them into words, count the words and print
    val lines = messages.map(_.value)
    val words = lines.flatMap(_.split(" "))
    val wordCounts = words.map(x => (x, 1L)).reduceByKey(_ + _)
    wordCounts.print()

    //val kafkaStream = KafkaUtils.createStream(ssc, "127.0.0.1:2181", "wcgroup",Map("wctopic" -> 1))
    messages.print() //prints the stream of data received

    ssc.start()
    ssc.awaitTermination()
    println("End")
  }
}
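A hedged note: this example assumes the 0-10 integration artifact is on the classpath, i.e. something like the following in build.sbt (match the version to the Spark in use):

libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka-0-10" % "2.4.0"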

ScalaPB with Scala.js and Scala(jvm) - There were linking errors

I have multiple subprojects in my sbt build: one is a server (based upon Play Framework), another is the client side (Scala.js), and the third is the communication between the two in the form of protobuf (ScalaPB).
Now, this is my build.sbt:
lazy val generalSettings = Seq(
  organization := "tld.awesomeness",
  version := "0.0.1",
  scalaVersion := "2.12.1"
)

val CrossDependencies = new {
  val scalaTest = "org.scalatest" %% "scalatest" % "3.0.1" % "test"
  val scalactic = "org.scalactic" %% "scalactic" % "3.0.1"
  val scalaTags = "com.lihaoyi" %% "scalatags" % "0.6.2"
}

lazy val proto = (project in file("modules/proto"))
  .settings(generalSettings: _*)
  .settings(
    PB.targets in Compile := Seq(
      scalapb.gen() -> (sourceManaged in Compile).value
    ),
    // If you need scalapb/scalapb.proto or anything from google/protobuf/*.proto
    libraryDependencies ++= Seq(
      "com.trueaccord.scalapb" %% "scalapb-runtime" % com.trueaccord.scalapb.compiler.Version.scalapbVersion % "protobuf",
      "com.trueaccord.scalapb" %%% "scalapb-runtime" % com.trueaccord.scalapb.compiler.Version.scalapbVersion,
      "com.trueaccord.scalapb" %%% "scalapb-runtime" % com.trueaccord.scalapb.compiler.Version.scalapbVersion % "protobuf"
    )
  )

lazy val play = (project in file("modules/play"))
  .enablePlugins(PlayScala)
  .settings(generalSettings: _*)
  .settings(
    name := "play",
    libraryDependencies ++= Seq(
      CrossDependencies.scalaTest,
      CrossDependencies.scalactic,
      CrossDependencies.scalaTags,
      "com.typesafe.play" %% "play-json" % "2.6.0-M1"),
    scalaJSProjects := Seq(client),
    pipelineStages in Assets := Seq(scalaJSPipeline),
    compile in Compile := ((compile in Compile) dependsOn scalaJSPipeline).value
  )
  .aggregate(slick)
  .dependsOn(slick)
  .aggregate(flyway)
  .dependsOn(flyway)
  .aggregate(proto)
  .dependsOn(proto)

lazy val client = (project in file("modules/client"))
  .enablePlugins(ScalaJSPlugin, ScalaJSWeb)
  .settings(generalSettings: _*)
  .settings(
    name := "client",
    libraryDependencies += CrossDependencies.scalaTags,
    persistLauncher := true
  )
  .aggregate(proto)
  .dependsOn(proto)

// Loads the jvm project at sbt startup
onLoad in Global := (Command.process("project play", _: State)) compose (onLoad in Global).value

fork in run := true
and this is the plugins.sbt:
// Scala.JS
addSbtPlugin("org.scala-js" % "sbt-scalajs" % "0.6.14")
addSbtPlugin("com.vmunier" % "sbt-web-scalajs" % "1.0.2")
// Play
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.6.0-SNAPSHOT")
// Proto
addSbtPlugin("com.thesamet" % "sbt-protoc" % "0.99.3" exclude ("com.trueaccord.scalapb", "protoc-bridge_2.10"))
libraryDependencies += "com.trueaccord.scalapb" %% "compilerplugin-shaded" % "0.5.47"
This is one proto file:
syntax = "proto3";
package tld.awesomeness.proto;
message Test {
int32 id = 1;
string email = 2;
}
After compilation I get Test.class
Now in the client I try to:
private def doSend(ws: WebSocket): Unit = {
  val msg = Test().withId(1337)
  val a: ArrayBuffer = new ArrayBuffer(msg.toByteArray.length)
  msg.toByteArray
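  // (note: the result of this bare toByteArray call is discarded, so the
  // ArrayBuffer above is never filled with the message bytes; that is a
  // separate issue from the linking errors below)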
  ws.send(a)
}
(The websocket itself worked just fine when I sent strings through it!)
Now I get this huge stacktrace:
[info] Fast optimizing /home/sorona/awesomeness/modules/client/target/scala-2.12/client-fastopt.js
[error] Referring to non-existent class tld.awesomeness.proto.Test.Test$
[error] called from tld.awesomeness.ScalaJSTest$.doSend(org.scalajs.dom.raw.WebSocket)scala.Unit
[error] called from tld.awesomeness.ScalaJSTest$.tld$awesomeness$ScalaJSTest$$$anonfun$call$1(org.scalajs.dom.raw.Event,org.scalajs.dom.raw.WebSocket)org.scalajs.dom.raw.Event
[error] called from tld.awesomeness.ScalaJSTest$.call()scala.Unit
[error] called from tld.awesomeness.Main$.main()scala.Unit
[error] called from scala.scalajs.js.JSApp.$$js$exported$meth$main()java.lang.Object
[error] called from tld.awesomeness.Main$.$$js$exported$meth$main()java.lang.Object
[error] called from tld.awesomeness.Main$.main
[error] exported to JavaScript with @JSExport
[error] involving instantiated classes:
[error] tld.awesomeness.ScalaJSTest$
[error] tld.awesomeness.Main$
[error] Referring to non-existent class tld.awesomeness.proto.Test.Test
[error] called from tld.awesomeness.ScalaJSTest$.doSend(org.scalajs.dom.raw.WebSocket)scala.Unit
[error] called from tld.awesomeness.ScalaJSTest$.tld$awesomeness$ScalaJSTest$$$anonfun$call$1(org.scalajs.dom.raw.Event,org.scalajs.dom.raw.WebSocket)org.scalajs.dom.raw.Event
[error] called from tld.awesomeness.ScalaJSTest$.call()scala.Unit
[error] called from tld.awesomeness.Main$.main()scala.Unit
[error] called from scala.scalajs.js.JSApp.$$js$exported$meth$main()java.lang.Object
[error] called from tld.awesomeness.Main$.$$js$exported$meth$main()java.lang.Object
[error] called from tld.awesomeness.Main$.main
[error] exported to JavaScript with @JSExport
[error] involving instantiated classes:
[error] tld.awesomeness.ScalaJSTest$
[error] tld.awesomeness.Main$
[error] Referring to non-existent method tld.awesomeness.proto.Test.Test.toByteArray()[scala.Byte
[error] called from tld.awesomeness.ScalaJSTest$.doSend(org.scalajs.dom.raw.WebSocket)scala.Unit
[error] called from tld.awesomeness.ScalaJSTest$.tld$awesomeness$ScalaJSTest$$$anonfun$call$1(org.scalajs.dom.raw.Event,org.scalajs.dom.raw.WebSocket)org.scalajs.dom.raw.Event
[error] called from tld.awesomeness.ScalaJSTest$.call()scala.Unit
[error] called from tld.awesomeness.Main$.main()scala.Unit
[error] called from scala.scalajs.js.JSApp.$$js$exported$meth$main()java.lang.Object
[error] called from tld.awesomeness.Main$.$$js$exported$meth$main()java.lang.Object
[error] called from tld.awesomeness.Main$.main
[error] exported to JavaScript with @JSExport
[error] involving instantiated classes:
[error] tld.awesomeness.ScalaJSTest$
[error] tld.awesomeness.Main$
[error] Referring to non-existent method tld.awesomeness.proto.Test.Test.withId(scala.Int)tld.awesomeness.proto.Test.Test
[error] called from tld.awesomeness.ScalaJSTest$.doSend(org.scalajs.dom.raw.WebSocket)scala.Unit
[error] called from tld.awesomeness.ScalaJSTest$.tld$awesomeness$ScalaJSTest$$$anonfun$call$1(org.scalajs.dom.raw.Event,org.scalajs.dom.raw.WebSocket)org.scalajs.dom.raw.Event
[error] called from tld.awesomeness.ScalaJSTest$.call()scala.Unit
[error] called from tld.awesomeness.Main$.main()scala.Unit
[error] called from scala.scalajs.js.JSApp.$$js$exported$meth$main()java.lang.Object
[error] called from tld.awesomeness.Main$.$$js$exported$meth$main()java.lang.Object
[error] called from tld.awesomeness.Main$.main
[error] exported to JavaScript with @JSExport
[error] involving instantiated classes:
[error] tld.awesomeness.ScalaJSTest$
[error] tld.awesomeness.Main$
[error] Referring to non-existent method tld.awesomeness.proto.Test.Test$.apply$default$2()java.lang.String
[error] called from tld.awesomeness.ScalaJSTest$.doSend(org.scalajs.dom.raw.WebSocket)scala.Unit
[error] called from tld.awesomeness.ScalaJSTest$.tld$awesomeness$ScalaJSTest$$$anonfun$call$1(org.scalajs.dom.raw.Event,org.scalajs.dom.raw.WebSocket)org.scalajs.dom.raw.Event
[error] called from tld.awesomeness.ScalaJSTest$.call()scala.Unit
[error] called from tld.awesomeness.Main$.main()scala.Unit
[error] called from scala.scalajs.js.JSApp.$$js$exported$meth$main()java.lang.Object
[error] called from tld.awesomeness.Main$.$$js$exported$meth$main()java.lang.Object
[error] called from tld.awesomeness.Main$.main
[error] exported to JavaScript with @JSExport
[error] involving instantiated classes:
[error] tld.awesomeness.ScalaJSTest$
[error] tld.awesomeness.Main$
[error] Referring to non-existent method tld.awesomeness.proto.Test.Test$.apply$default$1()scala.Int
[error] called from tld.awesomeness.ScalaJSTest$.doSend(org.scalajs.dom.raw.WebSocket)scala.Unit
[error] called from tld.awesomeness.ScalaJSTest$.tld$awesomeness$ScalaJSTest$$$anonfun$call$1(org.scalajs.dom.raw.Event,org.scalajs.dom.raw.WebSocket)org.scalajs.dom.raw.Event
[error] called from tld.awesomeness.ScalaJSTest$.call()scala.Unit
[error] called from tld.awesomeness.Main$.main()scala.Unit
[error] called from scala.scalajs.js.JSApp.$$js$exported$meth$main()java.lang.Object
[error] called from tld.awesomeness.Main$.$$js$exported$meth$main()java.lang.Object
[error] called from tld.awesomeness.Main$.main
[error] exported to JavaScript with @JSExport
[error] involving instantiated classes:
[error] tld.awesomeness.ScalaJSTest$
[error] tld.awesomeness.Main$
[error] Referring to non-existent method tld.awesomeness.proto.Test.Test.<init>(scala.Int,java.lang.String)
[error] called from tld.awesomeness.ScalaJSTest$.doSend(org.scalajs.dom.raw.WebSocket)scala.Unit
[error] called from tld.awesomeness.ScalaJSTest$.tld$awesomeness$ScalaJSTest$$$anonfun$call$1(org.scalajs.dom.raw.Event,org.scalajs.dom.raw.WebSocket)org.scalajs.dom.raw.Event
[error] called from tld.awesomeness.ScalaJSTest$.call()scala.Unit
[error] called from tld.awesomeness.Main$.main()scala.Unit
[error] called from scala.scalajs.js.JSApp.$$js$exported$meth$main()java.lang.Object
[error] called from tld.awesomeness.Main$.$$js$exported$meth$main()java.lang.Object
[error] called from tld.awesomeness.Main$.main
[error] exported to JavaScript with @JSExport
[error] involving instantiated classes:
[error] tld.awesomeness.ScalaJSTest$
[error] tld.awesomeness.Main$
java.lang.RuntimeException: There were linking errors
at scala.sys.package$.error(package.scala:27)
at org.scalajs.core.tools.linker.frontend.BaseLinker.linkInternal(BaseLinker.scala:133)
at org.scalajs.core.tools.linker.frontend.BaseLinker.linkInternal(BaseLinker.scala:86)
at org.scalajs.core.tools.linker.frontend.LinkerFrontend$$anonfun$4.apply(LinkerFrontend.scala:54)
at org.scalajs.core.tools.linker.frontend.LinkerFrontend$$anonfun$4.apply(LinkerFrontend.scala:54)
at org.scalajs.core.tools.logging.Logger$class.time(Logger.scala:28)
at org.scalajs.sbtplugin.Loggers$SbtLoggerWrapper.time(Loggers.scala:7)
at org.scalajs.core.tools.linker.frontend.LinkerFrontend.link(LinkerFrontend.scala:53)
at org.scalajs.core.tools.linker.Linker$$anonfun$link$1.apply$mcV$sp(Linker.scala:50)
at org.scalajs.core.tools.linker.Linker$$anonfun$link$1.apply(Linker.scala:49)
at org.scalajs.core.tools.linker.Linker$$anonfun$link$1.apply(Linker.scala:49)
at org.scalajs.core.tools.linker.Linker.guard(Linker.scala:67)
at org.scalajs.core.tools.linker.Linker.link(Linker.scala:49)
at org.scalajs.core.tools.linker.ClearableLinker$$anonfun$link$1.apply(ClearableLinker.scala:51)
at org.scalajs.core.tools.linker.ClearableLinker$$anonfun$link$1.apply(ClearableLinker.scala:51)
at org.scalajs.core.tools.linker.ClearableLinker.linkerOp(ClearableLinker.scala:62)
at org.scalajs.core.tools.linker.ClearableLinker.link(ClearableLinker.scala:51)
at org.scalajs.sbtplugin.ScalaJSPluginInternal$$anonfun$org$scalajs$sbtplugin$ScalaJSPluginInternal$$scalaJSStageSettings$4$$anonfun$apply$6$$anonfun$apply$7.apply(ScalaJSPluginInternal.scala:251)
at org.scalajs.sbtplugin.ScalaJSPluginInternal$$anonfun$org$scalajs$sbtplugin$ScalaJSPluginInternal$$scalaJSStageSettings$4$$anonfun$apply$6$$anonfun$apply$7.apply(ScalaJSPluginInternal.scala:239)
at sbt.FileFunction$$anonfun$cached$1.apply(Tracked.scala:253)
at sbt.FileFunction$$anonfun$cached$1.apply(Tracked.scala:253)
at sbt.FileFunction$$anonfun$cached$2$$anonfun$apply$3$$anonfun$apply$4.apply(Tracked.scala:267)
at sbt.FileFunction$$anonfun$cached$2$$anonfun$apply$3$$anonfun$apply$4.apply(Tracked.scala:263)
at sbt.Difference.apply(Tracked.scala:224)
at sbt.Difference.apply(Tracked.scala:206)
at sbt.FileFunction$$anonfun$cached$2$$anonfun$apply$3.apply(Tracked.scala:263)
at sbt.FileFunction$$anonfun$cached$2$$anonfun$apply$3.apply(Tracked.scala:262)
at sbt.Difference.apply(Tracked.scala:224)
at sbt.Difference.apply(Tracked.scala:200)
at sbt.FileFunction$$anonfun$cached$2.apply(Tracked.scala:262)
at sbt.FileFunction$$anonfun$cached$2.apply(Tracked.scala:260)
at org.scalajs.sbtplugin.ScalaJSPluginInternal$$anonfun$org$scalajs$sbtplugin$ScalaJSPluginInternal$$scalaJSStageSettings$4$$anonfun$apply$6.apply(ScalaJSPluginInternal.scala:256)
at org.scalajs.sbtplugin.ScalaJSPluginInternal$$anonfun$org$scalajs$sbtplugin$ScalaJSPluginInternal$$scalaJSStageSettings$4$$anonfun$apply$6.apply(ScalaJSPluginInternal.scala:237)
at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:40)
at sbt.std.Transform$$anon$4.work(System.scala:63)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:228)
at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
at sbt.Execute.work(Execute.scala:237)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:228)
at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:159)
at sbt.CompletionService$$anon$2.call(CompletionService.scala:28)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
[error] (client/compile:fastOptJS) There were linking errors
My IDE finds everything, but obviously I am doing something wrong. I already had a look at https://github.com/thesamet/scalapbjs-test but to no avail.
The problem appears with the line val msg = Test().withId(1337)
Edit: after the comment I changed build.sbt:
lazy val proto = (crossProject in file("modules/proto"))
  .settings(generalSettings: _*)
  .settings(
    PB.targets in Compile := Seq(
      scalapb.gen() -> (sourceManaged in Compile).value
    ))
  .jvmSettings(
    libraryDependencies += "com.trueaccord.scalapb" %% "scalapb-runtime" % com.trueaccord.scalapb.compiler.Version.scalapbVersion % "protobuf",
    PB.targets in Compile := Seq(
      scalapb.gen() -> (sourceManaged in Compile).value
    )
  )
  .jsSettings(
    libraryDependencies ++= Seq(
      "com.trueaccord.scalapb" %%% "scalapb-runtime" % com.trueaccord.scalapb.compiler.Version.scalapbVersion,
      "com.trueaccord.scalapb" %%% "scalapb-runtime" % com.trueaccord.scalapb.compiler.Version.scalapbVersion % "protobuf"
    ),
    PB.targets in Compile := Seq(
      scalapb.gen() -> (sourceManaged in Compile).value
    )
  )

lazy val protoJs = proto.js
lazy val protoJVM = proto.jvm

lazy val play = (project in file("modules/play"))
  .enablePlugins(PlayScala)
  .settings(generalSettings: _*)
  .settings(
    name := "play",
    libraryDependencies ++= Seq(
      CrossDependencies.scalaTest,
      CrossDependencies.scalactic,
      CrossDependencies.scalaTags,
      "com.typesafe.play" %% "play-json" % "2.6.0-M1"),
    scalaJSProjects := Seq(client),
    pipelineStages in Assets := Seq(scalaJSPipeline),
    compile in Compile := ((compile in Compile) dependsOn scalaJSPipeline).value
  )
  .aggregate(slick)
  .dependsOn(slick)
  .aggregate(flyway)
  .dependsOn(flyway)
  .aggregate(protoJVM)
  .dependsOn(protoJVM)

lazy val client = (project in file("modules/client"))
  .enablePlugins(ScalaJSPlugin, ScalaJSWeb)
  .settings(generalSettings: _*)
  .settings(
    name := "client",
    libraryDependencies += CrossDependencies.scalaTags,
    persistLauncher := true
  )
  .aggregate(protoJs)
  .dependsOn(protoJs)
Now neither play nor client can resolve the proto class :(
(Also, I am aware of the redundant PB.targets in Compile...; I just thought that sharing might not work there, so I added it to both distinct settings again.)
With a pure CrossProject you need to specify the actual path where ScalaPB should look for proto files (the value it guesses is wrong). Here is a minimal example:
lazy val proto = (crossProject.crossType(CrossType.Pure) in file("proto"))
  .settings(
    PB.targets in Compile := Seq(
      scalapb.gen() -> (sourceManaged in Compile).value
    ),
    // The trick is in this line:
    PB.protoSources in Compile := Seq(file("proto/src/main/protobuf")),
    libraryDependencies ++= Seq(
      "com.trueaccord.scalapb" %%% "scalapb-runtime" % com.trueaccord.scalapb.compiler.Version.scalapbVersion,
      "com.trueaccord.scalapb" %%% "scalapb-runtime" % com.trueaccord.scalapb.compiler.Version.scalapbVersion % "protobuf"
    )
  )
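For completeness, a hedged sketch of wiring the two halves into the other projects (project names taken from the question, settings abbreviated):

lazy val protoJS = proto.js
lazy val protoJVM = proto.jvm

// the Play (JVM) server depends on the JVM half...
lazy val play = (project in file("modules/play")).dependsOn(protoJVM)

// ...and the Scala.js client on the JS half
lazy val client = (project in file("modules/client")).enablePlugins(ScalaJSPlugin).dependsOn(protoJS)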

Unable to run scala.js on PhantomJSEnv, requiresDOM setting is forced to false

I'd like to use websockets:
import org.scalajs.dom
import scala.scalajs.js.JSApp
import org.scalajs.dom.{CloseEvent, ErrorEvent, Event, MessageEvent}
object ExampleJS extends JSApp {
  def main(): Unit = {
    val data = ""
    val ws = new dom.WebSocket("ws://127.0.0.1:8182")
    // val ws = new dom.WebSocket("ws://127.0.0.1:8182", "whatever")
    ws.onmessage = (x: MessageEvent) => Console.println(x.data.toString)
    ws.onopen = (x: Event) => {}
    ws.onerror = (x: ErrorEvent) => Console.println("some error has occurred " + x.message)
    ws.onclose = (x: CloseEvent) => {}
    ws.send(data)
  }
}
But I'm still getting an error like this:
[error] /home/lisak/src/viagraphs/scalajs-gremlin-client/js/target/scala-2.11/js-fastopt.js:1221
[error] var ws = new ScalaJS.g["WebSocket"]("ws://127.0.0.1:8081");
[error] ^
[error] TypeError: undefined is not a function
[error] at ScalaJS.c.Lcom_viagraphs_ExampleJS$.main__V (/home/lisak/src/viagraphs/scalajs-gremlin-client/js/target/scala-2.11/js-fastopt.js:1221:12)
[error] at ScalaJS.c.Lcom_viagraphs_ExampleJS$.$$js$exported$meth$main__O (/home/lisak/src/viagraphs/scalajs-gremlin-client/js/target/scala-2.11/js-fastopt.js:1243:16)
[error] at ScalaJS.c.Lcom_viagraphs_ExampleJS$.main (/home/lisak/src/viagraphs/scalajs-gremlin-client/js/target/scala-2.11/js-fastopt.js:1246:15)
[error] at [stdin]:17:91
[error] at Object.<anonymous> ([stdin]-wrapper:6:22)
[error] at Module._compile (module.js:456:26)
[error] at evalScript (node.js:532:25)
[error] at Socket.<anonymous> (node.js:154:11)
[error] at Socket.EventEmitter.emit (events.js:117:20)
[error] at _stream_readable.js:920:16
> last fastOptStage::run
[info] Running com.viagraphs.ExampleJS
[debug] with JSEnv of type class scala.scalajs.sbtplugin.env.nodejs.NodeJSEnv
[debug] with classpath of type class scala.scalajs.tools.classpath.CompleteCIClasspath$SimpleCompleteCIClasspath
[error]
[error] /home/lisak/src/viagraphs/scalajs-gremlin-client/js/target/scala-2.11/js-fastopt.js:1221
[error] var ws = new ScalaJS.g["WebSocket"]("ws://127.0.0.1:8081");
[error] ^
[error] TypeError: undefined is not a function
[error] at ScalaJS.c.Lcom_viagraphs_ExampleJS$.main__V (/home/lisak/src/viagraphs/scalajs-gremlin-client/js/target/scala-2.11/js-fastopt.js:1221:12)
[error] at ScalaJS.c.Lcom_viagraphs_ExampleJS$.$$js$exported$meth$main__O (/home/lisak/src/viagraphs/scalajs-gremlin-client/js/target/scala-2.11/js-fastopt.js:1243:16)
[error] at ScalaJS.c.Lcom_viagraphs_ExampleJS$.main (/home/lisak/src/viagraphs/scalajs-gremlin-client/js/target/scala-2.11/js-fastopt.js:1246:15)
[error] at [stdin]:17:91
[error] at Object.<anonymous> ([stdin]-wrapper:6:22)
[error] at Module._compile (module.js:456:26)
[error] at evalScript (node.js:532:25)
[error] at Socket.<anonymous> (node.js:154:11)
[error] at Socket.EventEmitter.emit (events.js:117:20)
[error] at _stream_readable.js:920:16
java.lang.RuntimeException: node.js exited with code 8
at scala.sys.package$.error(package.scala:27)
at scala.scalajs.sbtplugin.env.ExternalJSEnv.runJS(ExternalJSEnv.scala:65)
at scala.scalajs.sbtplugin.env.nodejs.NodeJSEnv.scala$scalajs$sbtplugin$env$nodejs$NodeJSEnv$$super$runJS(NodeJSEnv.scala:76)
at scala.scalajs.sbtplugin.env.nodejs.NodeJSEnv$$anonfun$runJS$1.apply$mcV$sp(NodeJSEnv.scala:76)
at scala.scalajs.sbtplugin.env.nodejs.NodeJSEnv$$anonfun$runJS$1.apply(NodeJSEnv.scala:76)
at scala.scalajs.sbtplugin.env.nodejs.NodeJSEnv$$anonfun$runJS$1.apply(NodeJSEnv.scala:76)
at scala.scalajs.sbtplugin.env.nodejs.NodeJSEnv$$anonfun$withLibCache$1.apply(NodeJSEnv.scala:43)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
at scala.scalajs.sbtplugin.env.nodejs.NodeJSEnv.withLibCache(NodeJSEnv.scala:42)
at scala.scalajs.sbtplugin.env.nodejs.NodeJSEnv.runJS(NodeJSEnv.scala:76)
at scala.scalajs.sbtplugin.ScalaJSPluginInternal$.scala$scalajs$sbtplugin$ScalaJSPluginInternal$$jsRun(ScalaJSPluginInternal.scala:356)
at scala.scalajs.sbtplugin.ScalaJSPluginInternal$$anonfun$48$$anonfun$apply$18$$anonfun$apply$19.apply(ScalaJSPluginInternal.scala:420)
at scala.scalajs.sbtplugin.ScalaJSPluginInternal$$anonfun$48$$anonfun$apply$18$$anonfun$apply$19.apply(ScalaJSPluginInternal.scala:414)
at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
at sbt.$tilde$greater$$anonfun$$u2219$1.apply(TypeFunctions.scala:42)
at sbt.std.Transform$$anon$4.work(System.scala:64)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:237)
at sbt.Execute$$anonfun$submit$1$$anonfun$apply$1.apply(Execute.scala:237)
at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:18)
at sbt.Execute.work(Execute.scala:244)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:237)
at sbt.Execute$$anonfun$submit$1.apply(Execute.scala:237)
at sbt.ConcurrentRestrictions$$anon$4$$anonfun$1.apply(ConcurrentRestrictions.scala:160)
at sbt.CompletionService$$anon$2.call(CompletionService.scala:30)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
[error] (js/compile:fastOptStage::run) node.js exited with code 8
I tried RhinoJSEnv and NodeJSEnv; both end with an error like this. I can't run it on PhantomJSEnv either; I have it installed on Linux, it is on PATH and working. The problem is that the requiresDOM setting is always false even though I explicitly set it to true, thus PhantomJSEnv is never chosen as a runtime:
override lazy val settings =
  super.settings ++ Seq(
    version := "0.0.1",
    scalaVersion := "2.11.2",
    resolvers += Resolver.mavenLocal,
    offline := true
  )

lazy val js = project.in(file("js")).settings(
  Seq(
    libraryDependencies ++= Seq(
      "org.scala-lang.modules.scalajs" %%% "scalajs-dom" % "0.7-SNAPSHOT",
      "com.lihaoyi" %%% "utest" % "0.2.0" % "test"
    ),
    test in Test := (test in (Test, fastOptStage)).value,
    testFrameworks += new TestFramework("utest.runner.JvmFramework"),
    requiresDOM := true
  ) ++ Plugin.internal.utestJsSettings ++ scalaJSSettings: _*
)
Change your build definition to the following:
lazy val js = project.in(file("js"))
  .settings(scalaJSSettings: _*)
  .settings(Plugin.internal.utestJsSettings: _*)
  .settings(
    libraryDependencies ++= Seq(
      "org.scala-lang.modules.scalajs" %%% "scalajs-dom" % "0.7-SNAPSHOT",
      "com.lihaoyi" %%% "utest" % "0.2.0" % "test"
    ),
    test in Test := (test in (Test, fastOptStage)).value,
    testFrameworks += new TestFramework("utest.runner.JvmFramework"),
    requiresDOM := true
  )
Note the change of order (using .settings multiple times doesn't change anything; it's just cleaner, IMHO). If you put your project-specific settings first and the Scala.js settings after them, the defaults in scalaJSSettings will override your settings and requiresDOM will be false. With scalaJSSettings applied first, your requiresDOM := true takes effect.