Exception while running Spark program with SQL context in Scala

I am trying to run a simple Spark Scala program built with Maven.
Below is the source code:
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

case class Person(name: String, age: Int)

object parquetoperations {
  def main(args: Array[String]) {
    val sparkconf = new SparkConf().setAppName("spark1").setMaster("local")
    val sc = new SparkContext(sparkconf)
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._
    val peopleRDD = sc.textFile(args(0))
    val peopleDF = peopleRDD.map(_.split(","))
      .map(attributes => Person(attributes(0), attributes(1).trim.toInt))
      .toDF()
    peopleDF.createOrReplaceTempView("people")
    val adultsDF = sqlContext.sql("select * from people where age > 18")
    //adultsDF.map(x => "Name: " + x.getAs[String]("name") + " age is: " + x.getAs[Int]("age")).show()
  }
}
And below are the Maven dependencies I have:
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>2.10.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>2.1.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.10</artifactId>
<version>2.1.0</version>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-xml</artifactId>
<version>2.11.0-M4</version>
</dependency>
It throws the error below. I tried to debug in various ways with no luck:
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$scope()Lscala/xml/TopScope$;
Looks like this is an error related to loading the Spark web UI.

All your dependencies are on Scala 2.10, but your scala-xml dependency is on Scala 2.11.
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-xml</artifactId>
<version>2.11.0-M4</version>
</dependency>
Btw, unless you really have a strong reason to do so, I would suggest you move to Scala 2.11.8. Everything is so much better with 2.11 compared to 2.10.
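For reference, a consistent Scala 2.11 setup would look roughly like this (a sketch, assuming you take the 2.11.8 suggestion; the explicit scala-xml dependency can then be dropped, since the _2.11 Spark artifacts should pull in a compatible scala-xml transitively):
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>2.11.8</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.1.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>2.1.0</version>
</dependency>
<!-- no explicit scala-xml entry: the _2.11 Spark artifacts bring in a compatible version transitively -->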

Related

Getting dependency error for SparkSession and SQLContext

I am getting a dependency error for SQLContext and SparkSession in my Spark program:
val sqlContext = new SQLContext(sc)
val spark = SparkSession.builder()
Error for SQLContext:
Symbol 'type org.apache.spark.Logging' is missing from the classpath. This symbol is required by 'class org.apache.spark.sql.SQLContext'. Make sure that type Logging is in your classpath and check for conflicting dependencies with -Ylog-classpath. A full rebuild may help if 'SQLContext.class' was compiled against an incompatible version of org.apache.spark.
Error for SparkSession:
not found: value SparkSession
Below are the Spark dependencies in my pom.xml:
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.10</artifactId>
<version>1.6.0-cdh5.15.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>2.0.0-cloudera1-SNAPSHOT</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-catalyst_2.10</artifactId>
<version>1.6.0-cdh5.15.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-test-tags_2.10</artifactId>
<version>1.6.0-cdh5.15.1</version>
</dependency>
You can't have both Spark 2 and Spark 1.6 dependencies defined in your project.
org.apache.spark.Logging is not available in Spark 2 anymore.
Change
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>2.0.0-cloudera1-SNAPSHOT</version>
</dependency>
to
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>1.6.0-cdh5.15.1</version>
</dependency>
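Note also that SparkSession only exists from Spark 2.0 onwards, which is why you get "not found: value SparkSession": with all artifacts on 1.6.0-cdh5.15.1 you will need to stick with SQLContext, or upgrade every Spark dependency to a matching 2.x version if you want SparkSession.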

Creating a Kafka consumer using Spark Streaming

I have started Kafka and created a topic and a producer. Now I want to read the messages sent from that producer. My code:
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

def main(args: Array[String]) {
  val sparkConf = new SparkConf()
  val spark = new SparkContext(sparkConf)
  val streamingContext = new StreamingContext(spark, Seconds(5))
  val kafkaStream = KafkaUtils.createStream(streamingContext,
    "localhost:2181", "test-group", Map("test" -> 1))
  kafkaStream.print()
  streamingContext.start()
  streamingContext.awaitTermination()
}
The dependencies I use
<properties>
<spark.version>1.6.2</spark.version>
</properties>
<dependencies>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>${spark.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.11</artifactId>
<version>${spark.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>${spark.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-kafka_2.11</artifactId>
<version>${spark.version}</version>
</dependency>
</dependencies>
But every time I try to run it in IDEA I get:
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;
at org.apache.spark.util.Utils$.getSystemProperties(Utils.scala:1582)
at org.apache.spark.SparkConf.<init>(SparkConf.scala:59)
at org.apache.spark.SparkConf.<init>(SparkConf.scala:53)
at com.mypackage.KafkaConsumer$.main(KafkaConsumer.scala:10)
at com.mypackage.KafkaConsumer.main(KafkaConsumer.scala)
Other questions here point to conflicts between dependencies.
I use Scala 2.10.5 and Spark 1.6.2. I have tried them in other projects, and they worked fine.
Line 10 in this case is val sparkConf = new SparkConf().
I am running the app in IDEA without packaging it.
What could be the reason for this problem?
It's an error with the Scala version: you are using different Scala versions in your code and in your dependencies.
You said you're using Scala 2.10, but you import spark-XX_2.11 dependencies. Unify your Scala version.
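As a sketch of one way to unify them (assuming you stay on Scala 2.10.5 as stated, switch every Spark artifact to its _2.10 build; alternatively, move the project to Scala 2.11 and keep the _2.11 artifacts):
<properties>
<spark.version>1.6.2</spark.version>
</properties>
<!-- all Spark artifacts on the _2.10 build to match Scala 2.10.5 -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>${spark.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.10</artifactId>
<version>${spark.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.10</artifactId>
<version>${spark.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-kafka_2.10</artifactId>
<version>${spark.version}</version>
</dependency>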

Error while using SparkSession or SQLContext

I am new to Spark. I am just trying to parse a JSON file using SparkSession or SQLContext.
But whenever I run it, I get the following error:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.internal.config.package$.CATALOG_IMPLEMENTATION()Lorg/apache/spark/internal/config/ConfigEntry;
at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$sessionStateClassName(SparkSession.scala:930)
at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:112)
at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:110)
at org.apache.spark.sql.DataFrameReader.<init>(DataFrameReader.scala:535)
at org.apache.spark.sql.SparkSession.read(SparkSession.scala:595)
at org.apache.spark.sql.SQLContext.read(SQLContext.scala:504)
at joinAssetsAndAd$.main(joinAssetsAndAd.scala:21)
at joinAssetsAndAd.main(joinAssetsAndAd.scala)
As of now, I have created a Scala project in the Eclipse IDE, configured it as a Maven project, and added the Spark core and SQL dependencies.
My dependencies:
<dependencies>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.1.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>2.0.0</version>
</dependency>
</dependencies>
Could you please explain why I am getting this error and how to correct it?
Try to use the same version for spark-core and spark-sql. Change the version of spark-sql to 2.1.0.
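That is, assuming spark-core stays at 2.1.0:
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>2.1.0</version>
</dependency>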

Getting an error while trying to run a simple Spark Streaming Kafka example

I am trying to run a simple Kafka Spark Streaming example. Here is the error I am getting:
16/10/02 20:45:43 INFO SparkEnv: Registering OutputCommitCoordinator
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$scope()Lscala/xml/TopScope$;
at org.apache.spark.ui.jobs.StagePage.<init>(StagePage.scala:44)
at org.apache.spark.ui.jobs.StagesTab.<init>(StagesTab.scala:34)
at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:62)
at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:215)
at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:157)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:443)
at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:836)
at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:84)
at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:138)
at com.application.SparkConsumer.App.main(App.java:27)
I set up this example using the following pom. I tried to find the missing scala.Predef class, added a dependency on spark-streaming-kafka-0-8-assembly, and I can see the class when I explore that jar.
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka_2.11</artifactId>
<version>0.8.2.0</version>
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-clients</artifactId>
<version>0.8.2.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.0.0</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.11</artifactId>
<version>2.0.0</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
<version>2.0.0</version>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>2.11.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka-0-8-assembly_2.11</artifactId>
    <version>2.0.0</version>
</dependency>
I have tried a simple Spark word-count example and it works fine. It is only when I use spark-streaming-kafka that I run into trouble. I have tried looking up this error, but no luck.
Here is the code snippet:
import java.util.HashMap;
import java.util.Map;
import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaPairReceiverInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;
import org.apache.spark.streaming.kafka.KafkaUtils;

SparkConf sparkConf = new SparkConf().setAppName("someapp").setMaster("local[2]");
// Create the context with a 2-second batch size
JavaStreamingContext jssc = new JavaStreamingContext(sparkConf, new Duration(2000));
int numThreads = Integer.parseInt(args[3]);
Map<String, Integer> topicMap = new HashMap<String, Integer>();
topicMap.put("fast-messages", 1);
Map<String, String> kafkaParams = new HashMap<String, String>();
kafkaParams.put("metadata.broker.list", "localhost:9092");
JavaPairReceiverInputDStream<String, String> messages =
    KafkaUtils.createStream(jssc, "zoo1", "my-consumer-group", topicMap);
There seems to be a problem with the 2.11 build of Kafka 0.8.2.0. After switching to the 2.10 build it worked fine.
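In other words, a sketch of the change described above (keeping everything else in the pom the same):
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka_2.10</artifactId>
<version>0.8.2.0</version>
</dependency>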

Scala 2.11.4, akka 2.3.7, spray 1.3.1 giving type mismatch error

I am new to Scala. While trying spray with Akka, I am getting the following error:
Error:(17, 17) type mismatch;
found : String("Welcome to Scala")
required: spray.httpx.marshalling.ToResponseMarshallable
complete("Welcome to Scala")
Code:
import spray.routing._
import akka.actor._

object SampleApplication extends App with SimpleRoutingApp {
  implicit val actorSystem = ActorSystem()

  startServer(interface = "localhost", port = 8080) {
    get {
      path("hello") {
        complete {
          "Welcome to Scala"
        }
      }
    }
  }
}
Maven Dependencies:
<dependencies>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>${scala.version}</version>
</dependency>
<dependency>
<groupId>org.specs</groupId>
<artifactId>specs</artifactId>
<version>1.2.5</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>io.spray</groupId>
<artifactId>spray-routing</artifactId>
<version>${spray.version}</version>
</dependency>
<dependency>
<groupId>io.spray</groupId>
<artifactId>spray-can</artifactId>
<version>${spray.version}</version>
</dependency>
<dependency>
<groupId>io.spray</groupId>
<artifactId>spray-httpx</artifactId>
<version>${spray.version}</version>
</dependency>
<dependency>
<groupId>io.spray</groupId>
<artifactId>spray-client</artifactId>
<version>${spray.version}</version>
</dependency>
<dependency>
<groupId>io.spray</groupId>
<artifactId>spray-json_2.11</artifactId>
<version>${spray.version}</version>
</dependency>
<dependency>
<groupId>com.typesafe.akka</groupId>
<artifactId>akka-actor_2.11</artifactId>
<version>${akka.version}</version>
</dependency>
</dependencies>
The IDE used is IntelliJ IDEA 14.
Are dependencies strictly bound to the Scala version?
Please help in solving the issue.
Your example works for me. I suspect something is wrong with your dependencies which causes spray.httpx.marshalling not to be found.
Overall, I highly recommend not using Maven with Scala; stick to SBT instead. With SBT you get access to the same dependencies (just like in Maven), but you can also specify a dependency directly from GitHub. You also get incremental compilation. The ability to start a shell or a servlet container from within sbt is convenient too. If you use IntelliJ IDEA, it can open an SBT configuration as a project. And SBT comes with the tasks you'd probably need (test, run, doc, publish-local, console).