I am running Scala 2.10.4 with Spark 1.5.0-cdh5.5.2 and am getting the following error when running a GraphFrames job:
scala> val g = GraphFrame(v, e)
error: bad symbolic reference. A signature in Logging.class refers to type LazyLogging
in package com.typesafe.scalalogging.slf4j which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling Logging.class.
I am starting my spark-shell with the following command:
spark-shell --jars /data/spark-jars/scalalogging-slf4j_2.10-1.1.0.jar,/data/spark-jars/graphframes-0.2.0-spark1.5-s_2.10.jar
I have tried different versions of scalalogging, but nothing seems to work.
Thanks for the help.
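One quick check (a diagnostic sketch; it only assumes the shell itself still starts) is to test from the REPL whether the type the error complains about is actually on the classpath:

scala> Class.forName("com.typesafe.scalalogging.slf4j.LazyLogging")

If that throws ClassNotFoundException, the scalalogging jar passed via --jars does not provide the LazyLogging trait that the GraphFrames build was compiled against, which points to a version mismatch rather than a missing jar.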
Related
I just started working with MLlib for Spark and tried to run the provided examples, more specifically https://github.com/apache/spark/blob/master/examples/src/main/scala/org/apache/spark/examples/ml/DCTExample.scala
However, compilation using the IntelliJ IDE fails with the message
Error:(41, 35) No TypeTag available for (org.apache.spark.ml.linalg.Vector,)
val df = spark.createDataFrame(data.map(Tuple1.apply)).toDF("features")
The project setup uses JDK 1.8.0_121, Spark 2.1.0 built for Scala 2.11 (spark_2.11-2.1.0), and Scala 2.10.6.
Any ideas on why the example fails to run? I followed this tutorial during installation: https://www.supergloo.com/fieldnotes/intellij-scala-spark/
You can't use Spark built for Scala 2.11 (that's what the _2.11 suffix in the name means) with Scala 2.10, though this specific error looks quite strange. Switch to Scala 2.11.8.
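For reference, a minimal build.sbt sketch along those lines, assuming an sbt build; the version numbers mirror the ones mentioned above, so adjust them to your actual setup:

// build.sbt -- the Scala version must match the _2.11 suffix of the Spark artifacts
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql"   % "2.1.0",
  "org.apache.spark" %% "spark-mllib" % "2.1.0"  // DCTExample uses org.apache.spark.ml
)

The %% operator makes sbt pick the artifacts with the matching _2.11 suffix, so the compiler and the Spark jars stay on the same Scala binary version.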
I used Spark 1.6.2 and Scala 2.11.8 to compile my project. The generated uber jar with dependencies is placed inside Spark Job Server, which seems to use Scala 2.10.4 (SCALA_VERSION=2.10.4 is specified in its .sh file).
There is no problem starting the server or uploading the context/app jars, but at runtime the following error occurs:
java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaUniverse$JavaMirror
The question "Why do Scala 2.11 and Spark with scallop lead to 'java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror'?" talks about using Scala 2.10 to compile the sources. Is that true?
Any suggestions please...
Use Scala 2.10.4 to compile your project. Otherwise you need to compile Spark with Scala 2.11 as well.
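Before recompiling, it can help to confirm which Scala version the Job Server actually runs your code on. A minimal sketch, assuming you can add a print or log line somewhere in the submitted job:

// Prints e.g. "version 2.10.4" at runtime; the binary version (2.10 vs 2.11)
// must match the one the fat jar was compiled against.
println(scala.util.Properties.versionString)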
I am new to Scala and am trying to read a file using the following code:
scala> val textFile = sc.textFile("README.md")
scala> textFile.count()
But I keep getting the following error
error: not found: value sc
I have tried everything, but nothing seems to work. I am using Scala version 2.10.4 and Spark 1.1.0 (I have even tried Spark 1.2.0, but it doesn't work either). I have sbt installed and the project compiles, yet I am not able to run sbt/sbt assembly. Is the error because of this?
You should run this code using ./spark-shell. It is the Scala REPL with a SparkContext (sc) already provided. You can find it in your Apache Spark distribution in the spark-1.4.1/bin folder.
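If you want to run the same two lines outside the shell, you have to create the SparkContext yourself, because sc is only pre-defined inside spark-shell. A minimal sketch for a standalone Spark 1.x app; the app name and the local[*] master are placeholders:

import org.apache.spark.{SparkConf, SparkContext}

object CountLines {
  def main(args: Array[String]): Unit = {
    // spark-shell creates this context for you and binds it to `sc`.
    val conf = new SparkConf().setAppName("CountLines").setMaster("local[*]")
    val sc = new SparkContext(conf)
    val textFile = sc.textFile("README.md")
    println(textFile.count())
    sc.stop()
  }
}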
I am using Scala 2.11, Spark, and Scallop (https://github.com/scallop/scallop). I used sbt to build an application fat jar without Spark provided dependencies (this is at: analysis/target/scala-2.11/dtex-analysis_2.11-0.1.jar)
I am able to run the program fine in sbt.
I tried to run it from the command line as follows:
time ADD_JARS=analysis/target/scala-2.11/dtex-analysis_2.11-0.1.jar java -cp /Applications/spark-1.2.0-bin-hadoop2.4/lib/spark-assembly-1.2.0-hadoop2.4.0.jar:analysis/target/scala-2.11/dtex-analysis_2.11-0.1.jar com.dtex.analysis.transform.GenUserSummaryView -d /Users/arun/DataSets/LME -p output -s txt -o /Users/arun/tmp/LME/LME
I get the following error message:
Exception in thread "main" java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaUniverse$JavaMirror;
  at org.rogach.scallop.package$.<init>(package.scala:37)
  at org.rogach.scallop.package$.<clinit>(package.scala)
  at com.dtex.analysis.transform.GenUserSummaryView$Conf.delayedEndpoint$com$dtex$analysis$transform$GenUserSummaryView$Conf$1(GenUserSummaryView.scala:27)
  at com.dtex.analysis.transform.GenUserSummaryView$Conf$delayedInit$body.apply(GenUserSummaryView.scala:26)
  at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
  at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
  at org.rogach.scallop.AfterInit$class.delayedInit(AfterInit.scala:12)
  at org.rogach.scallop.ScallopConf.delayedInit(ScallopConf.scala:26)
  at com.dtex.analysis.transform.GenUserSummaryView$Conf.<init>(GenUserSummaryView.scala:26)
  at com.dtex.analysis.transform.GenUserSummaryView$.main(GenUserSummaryView.scala:54)
  at com.dtex.analysis.transform.GenUserSummaryView.main(GenUserSummaryView.scala)
The issue is that you've used incompatible Scala versions, i.e. Spark was compiled with Scala 2.10 and you were trying to use Scala 2.11.
Move everything to Scala 2.10 version and make sure you update your SBT as well.
You may also try to compile the Spark sources for Scala 2.11.7 and use that build instead.
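For the first option (moving everything to Scala 2.10), a minimal build.sbt sketch; the scallop version is a placeholder, and the Spark version matches the 1.2.0 assembly used above:

// build.sbt -- sketch for staying on Scala 2.10
scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.2.0" % "provided",  // resolves to spark-core_2.10
  "org.rogach"       %% "scallop"    % "0.9.5"                 // placeholder version; goes into the fat jar
)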
I also encountered the same issue with spark-submit. In my case:
The Spark job was compiled with: Scala 2.10.8
The Scala version with which Spark was compiled on the cluster: Scala 2.11.8
To check the Spark and Scala versions on the cluster, use the spark-shell command.
After recompiling the Spark job source with Scala 2.11.8 and submitting the job again, it worked.
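For reference, a quick check from spark-shell on the cluster (the startup banner also shows the Scala version):

scala> sc.version
scala> scala.util.Properties.versionString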
Is it mandatory to use sbt for cluster usage in Akka? I have tried to add a few jars to the classpath. While compiling goes well, running the relevant class generates an error.
scala -cp ../akka-2.2.1/lib/akka/akka-cluster_2.10-2.2.1.jar:../akka-2.2.1/lib/akka/netty-3.6.6.Final.jar:../akka-2.2.1/lib/akka/akka-remote_2.10-2.2.1.jar:../akka-2.2.1/lib/akka/protobuf-java-2.4.1.jar:./ TransformationFrontend 2551
Here's the error encountered:
java.lang.NoSuchMethodException: akka.cluster.ClusterActorRefProvider.<init>(java.lang.String, akka.actor.ActorSystem$Settings, akka.event.EventStream, akka.actor.Scheduler, akka.actor.DynamicAccess)
  at java.lang.Class.getConstructor0(Class.java:2800)
  at java.lang.Class.getDeclaredConstructor(Class.java:2043)
This is the official Akka cluster example. Can someone shed some light on this?
The issue here is probably that you have an akka-actor.jar in your Scala distribution that is Akka 2.1.x, and you're trying to use Akka 2.2.x.
You'll have to run your code with the java command, adding scala-library.jar, the correct akka-actor.jar, and typesafe-config.jar to the classpath.
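For example, something along these lines (all paths and the config jar version are assumptions based on the layout shown above; substitute the ones from your own Scala and Akka 2.2.1 distributions):

java -cp /path/to/scala/lib/scala-library.jar:../akka-2.2.1/lib/akka/akka-actor_2.10-2.2.1.jar:../akka-2.2.1/lib/akka/config-1.0.2.jar:../akka-2.2.1/lib/akka/akka-cluster_2.10-2.2.1.jar:../akka-2.2.1/lib/akka/akka-remote_2.10-2.2.1.jar:../akka-2.2.1/lib/akka/netty-3.6.6.Final.jar:../akka-2.2.1/lib/akka/protobuf-java-2.4.1.jar:./ TransformationFrontend 2551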
Are you using Scala 2.10? This is the Scala version you need for Akka 2.2.
What does the following yield?
scala -version
It should show something like
$ scala -version
Scala code runner version 2.10.3 -- Copyright 2002-2013, LAMP/EPFL