I'm trying to create a basic Scala project in IntelliJ using the Activator UI.
I imported the project into the IDE and it compiles fine.
But when I try to run simple code, I get:
Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class
at akka.util.Collections$EmptyImmutableSeq$.<init>(Collections.scala:15)
at akka.util.Collections$EmptyImmutableSeq$.<clinit>(Collections.scala)
at akka.japi.Util$.immutableSeq(JavaAPI.scala:209)
at akka.actor.ActorSystem$Settings.<init>(ActorSystem.scala:150)
at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:470)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
at reactivemongo.api.MongoDriver$.reactivemongo$api$MongoDriver$$defaultSystem(api.scala:378)
at reactivemongo.api.MongoDriver$$anonfun$3.apply(api.scala:305)
at reactivemongo.api.MongoDriver$$anonfun$3.apply(api.scala:305)
at scala.Option.getOrElse(Option.scala:120)
at reactivemongo.api.MongoDriver.<init>(api.scala:305)
at example.App$.main(App.scala:10)
at example.App.main(App.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
When the project is loaded, there is an error in the project structure:
sbt: scala 2.11.2 not in use
What went wrong with the Activator UI IntelliJ project generation?
Thanks,
miki
I came across this when trying to run Spark. It is a binary-incompatibility error between the Scala version used to compile a dependency and the Scala version used to run your project.
Removing my Scala version specification was a hacky way to solve the problem:
// build.sbt
name := "SparkTest"
version := "1.0"
scalaVersion := "2.11.4" <-- remove this
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.3.0"
I have an Apache Spark 2.0 application written in Scala (2.11.12), built with sbt 1.2.8. When I try to run the app in IntelliJ (2020.3.2 Ultimate), I get the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: com/google/common/util/concurrent/internal/InternalFutureFailureAccess
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
.....
Caused by: java.lang.ClassNotFoundException: com.google.common.util.concurrent.internal.InternalFutureFailureAccess
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
On googling and searching Stack Overflow, it seems this is caused by some weird Guava dependency issue. I have these added to my Dependencies.scala:
dependencies += "com.google.guava" % "guava" % "30.1-jre"
dependencies += "com.google.guava" % "failureaccess" % "1.0"
That didn't solve the issue. I also tried adding "com.google.guava" % "listenablefuture" % "1.0" to the dependencies, but that didn't help either. I also tried File -> Invalidate Caches/Restart in IntelliJ, but I still get the error.
Could someone please help?
In my case, adding com.google.guava:failureaccess to my project's external libraries (File -> Project Structure -> Libraries) helped.
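If you prefer declaring it in the build rather than in the IDE, the sbt equivalent would be something like the line below (the 1.0.1 version is an assumption; check Maven Central for the current one):

libraryDependencies += "com.google.guava" % "failureaccess" % "1.0.1"  // version is an assumption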
After creating a useful application in Scala with dependencies, how do I deploy (create a binary) for it?
I would like to know the most idiomatic way which hopefully is the simplest way.
For me that would be the usual sbt compile, then look for the main class:
./target/scala-2.12/classes/scala_pandoc/Main.class
Then execute it:
$ CLASSPATH="$CLASSPATH:./target/scala-2.12/classes/" scala scala_pandoc.Main --unwrap-explain
Picked up _JAVA_OPTIONS: -Xms256m -Xmx300m
java.lang.ClassNotFoundException: ujson.Value
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at scala_pandoc.Main$.main(Main.scala:51)
at scala_pandoc.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at scala.reflect.internal.util.ScalaClassLoader.$anonfun$run$2(ScalaClassLoader.scala:106)
at scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:41)
at scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:37)
at scala.reflect.internal.util.ScalaClassLoader$URLClassLoader.asContext(ScalaClassLoader.scala:132)
at scala.reflect.internal.util.ScalaClassLoader.run(ScalaClassLoader.scala:106)
at scala.reflect.internal.util.ScalaClassLoader.run$(ScalaClassLoader.scala:98)
at scala.reflect.internal.util.ScalaClassLoader$URLClassLoader.run(ScalaClassLoader.scala:132)
at scala.tools.nsc.CommonRunner.run(ObjectRunner.scala:28)
at scala.tools.nsc.CommonRunner.run$(ObjectRunner.scala:27)
at scala.tools.nsc.ObjectRunner$.run(ObjectRunner.scala:45)
at scala.tools.nsc.CommonRunner.runAndCatch(ObjectRunner.scala:35)
at scala.tools.nsc.CommonRunner.runAndCatch$(ObjectRunner.scala:34)
at scala.tools.nsc.ObjectRunner$.runAndCatch(ObjectRunner.scala:45)
at scala.tools.nsc.MainGenericRunner.runTarget$1(MainGenericRunner.scala:73)
at scala.tools.nsc.MainGenericRunner.run$1(MainGenericRunner.scala:92)
at scala.tools.nsc.MainGenericRunner.process(MainGenericRunner.scala:103)
at scala.tools.nsc.MainGenericRunner$.main(MainGenericRunner.scala:108)
at scala.tools.nsc.MainGenericRunner.main(MainGenericRunner.scala)
But as we can see, it somehow does not find the dependencies. When I compile the project, a bunch of files are downloaded/created under ~/.sbt/ and ~/.ivy2, but adding those (or all their subfolders) to CLASSPATH does not solve the issue.
The aforementioned procedure works for projects without external dependencies.
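One likely reason the ~/.ivy2 folders don't help: the JVM searches classpath directories only for .class files, while jar files must be listed individually (or matched with a * wildcard). A manual run would therefore look something like this sketch, where the cache path merely illustrates the Ivy layout (not verified) and ujson's transitive jars would also be needed:

$ CLASSPATH="$CLASSPATH:./target/scala-2.12/classes/:$HOME/.ivy2/cache/com.lihaoyi/ujson_2.12/jars/ujson_2.12-0.7.1.jar" scala scala_pandoc.Main --unwrap-explain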
Workaround:
Use https://github.com/sbt/sbt-assembly, which is great (it creates an executable .jar that I can run with java -jar myjar.jar), but it feels hackish/non-official/fragile, and it also adds more dependencies to my project.
build.sbt:
lazy val scalatest = "org.scalatest" %% "scalatest" % "3.0.5"
lazy val ujson = "com.lihaoyi" %% "ujson" % "0.7.1"
name := "scala_pandoc"
organization := "org.fmv1992"
licenses += "GPLv2" -> url("https://www.gnu.org/licenses/gpl-2.0.html")
lazy val commonSettings = Seq(
  version := "0.0.1-SNAPSHOT",
  scalaVersion := "2.12.8",
  pollInterval := scala.concurrent.duration.FiniteDuration(50L, "ms"),
  maxErrors := 10,
  // The "% Test" suffix makes test artifacts importable only by the test files:
  // libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.5" % Test,
  // ↑↑↑↑↑
  // Removed on commit 'cd9d482' to let 'trait ScalaInitiativesTest' define
  // 'namedTest'.
  libraryDependencies ++= Seq(scalatest, ujson),
  scalacOptions ++= Seq("-feature", "-deprecation", "-Xfatal-warnings")
)

lazy val root = (project in file("."))
  .settings(commonSettings)
  .settings(assemblyJarName in assembly := "scala_pandoc.jar")
project/build.properties:
sbt.version=1.2.8
project/plugins.sbt:
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.9")
Related question: Deploy Scala binaries without dependencies
sbt-assembly is not hacky: it is maintained by one of the sbt creators (Eugene Yokota) and lives in the official sbt organization, so it is the official way of deploying Scala JARs with sbt.
Well, one of several official ways. You can also take a look at sbt-native-packager. The thing is, there are so many possible targets of an sbt build that the authors decided even building an uberjar should not be a special snowflake; it should be done via a plugin.
So just use sbt-assembly and don't feel guilty about it. That is the idiomatic way.
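With the build.sbt above (assemblyJarName set to scala_pandoc.jar), the workflow would be roughly as follows; the scala-2.12 directory follows from scalaVersion := "2.12.8":

$ sbt assembly
$ java -jar target/scala-2.12/scala_pandoc.jar --unwrap-explain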
I'm new to Scala and Spark, and I've been frustrated by how hard it has been to get things working with IntelliJ. Currently, I can't run the code below. I'm sure it's something simple, but I can't get it to work.
I'm trying to run:
import org.apache.spark.{SparkConf, SparkContext}

object TestScala {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
    conf.setAppName("Datasets Test")
    conf.setMaster("local[2]")
    val sc = new SparkContext(conf)
    println(sc)
  }
}
The error I get is:
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
at org.apache.spark.util.Utils$.getCallSite(Utils.scala:1413)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:77)
at TestScala$.main(TestScala.scala:13)
at TestScala.main(TestScala.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
My build.sbt file:
name := "sparkBook"
version := "1.0"
scalaVersion := "2.12.1"
Change your scalaVersion to 2.11.8 and add the Spark dependency to your build.sbt:
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.2"
Another scenario: IntelliJ points to Scala 2.12.4 while all the Maven/sbt dependencies are built with 2.11.8 (Scala dependency version 2.11).
I stepped back from 2.12.4 to 2.11.8 in IntelliJ's Global Libraries, and it started working.
Details:
My Maven pom.xml points to 2.11.8, but the Scala SDK in IntelliJ's Global Libraries is 2.12.4. This is what causes
java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
I stepped back to 2.11.8 in Global Libraries, and that was it: problem solved, no more errors when executing the program.
Conclusion: fixing the Maven dependencies alone does not solve the problem; the Scala SDK in Global Libraries must be configured as well, since the error occurs at run time when running a Spark program locally from IntelliJ.
If you use Spark 2.4.3, you need to use Scala 2.11, even though the Spark website (https://spark.apache.org/docs/latest/) says to use Scala 2.12. This avoids:
scala.Predef$.refArrayOps([Ljava/lang/Object;)[Ljava/lang/Object;
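For example, a minimal sbt sketch (the 2.11.12 patch version is an assumption; any 2.11.x should match the _2.11 build of spark-core):

scalaVersion := "2.11.12"  // 2.11.x assumed
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.3"  // resolves spark-core_2.11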
I am trying to use Scala 2.12.0-M5 and Akka 2.4.7 in a project, but I get this error when I try to start Akka. I also tried using M4.
I am sure I must be missing something in my setup, as this clearly must work. This is pretty much just what I had with Scala 2.11.8 and Akka 2.4.6.
Any help would be appreciated, thanks!
build.sbt:
name := "AKKA-2.4.8"
version := "1.0"
scalaVersion := "2.12.0-M5"
// https://mvnrepository.com/artifact/com.typesafe.akka/akka-actor_2.11
libraryDependencies += "com.typesafe.akka" % "akka-actor_2.11" % "2.4.8"
code:
package testing

import akka.actor.ActorSystem

/**
 * Created by on 7/8/16.<br>
 * <br>
 * AkkaActorStarter demonstrates my problem when starting Akka.
 */
object AkkaActorStarter extends App {
  val actorSystem = ActorSystem("testAkka")
}
Error:
[error] (run-main-0) java.lang.NoClassDefFoundError: scala/Product$class
...
Caused by: java.lang.ClassNotFoundException: scala.Product$class
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at akka.util.Timeout.<init>(Timeout.scala:13)
at akka.actor.ActorSystem$Settings.<init>(ActorSystem.scala:171)
at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:522)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:142)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:109)
at testing.AkkaActorStarter$.delayedEndpoint$testing$AkkaActorStarter$1(AkkaActorStarter.scala:11)
at testing.AkkaActorStarter$delayedInit$body.apply(AkkaActorStarter.scala:10)
at scala.Function0.apply$mcV$sp$(Function0.scala:34)
at scala.Function0.apply$mcV$sp(Function0.scala:34)
at scala.App.$anonfun$main$1$adapted(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:376)
at scala.App.main$(App.scala:76)
at scala.App.main(App.scala:74)
at testing.AkkaActorStarter.main(AkkaActorStarter.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
The problem is that you are using Akka for Scala 2.11 (akka-actor_2.11) with Scala 2.12. Scala minor versions are not binary compatible, so you have to use the Akka library compiled for your exact Scala version, 2.12.0-M5: "com.typesafe.akka" % "akka-actor_2.12.0-M5" % "2.4.8". Or use %%, which picks the proper artifact according to your scalaVersion: "com.typesafe.akka" %% "akka-actor" % "2.4.8".
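With %%, the corrected build.sbt becomes:

name := "AKKA-2.4.8"
version := "1.0"
scalaVersion := "2.12.0-M5"
libraryDependencies += "com.typesafe.akka" %% "akka-actor" % "2.4.8"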
I created a small program, and to test it I wrote a small ScalaTest class. But when I try to execute the test I get the error below. Please advise:
java.lang.NoSuchMethodError: scala.collection.immutable.$colon$colon.hd$1()Ljava/lang/Object;
at org.scalatest.tools.Runner$.argTooShort$1(Runner.scala:1490)
at org.scalatest.tools.Runner$.parseReporterArgsIntoConfigurations(Runner.scala:1507)
at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:898)
at org.scalatest.tools.Runner$.run(Runner.scala:858)
at org.scalatest.tools.Runner.run(Runner.scala)
at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.runScalaTest2(ScalaTestRunner.java:137)
at org.jetbrains.plugins.scala.testingSupport.scalaTest.ScalaTestRunner.main(ScalaTestRunner.java:28)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
I found the resolution. Thanks all for your replies.
The problem was my ScalaTest version.
I am using Scala 2.11, and my ScalaTest version was not compatible with that Scala version.
libraryDependencies += "org.scalatest" % "scalatest_2.11" % "2.2.1" % "test"
I added the line above to my .sbt file and refreshed. Now it works as expected.
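Equivalently, the %% operator lets sbt append the matching Scala suffix automatically, so the dependency stays in sync if scalaVersion changes later:

libraryDependencies += "org.scalatest" %% "scalatest" % "2.2.1" % "test"  // resolves scalatest_2.11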