I am new to Spark and IntelliJ. Here is my build.sbt file:
name := "TestSpark"
version := "1.0"
scalaVersion := "2.11.8"
// https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.11
libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11" % "1.6.2",
  "org.apache.hadoop" % "hadoop-client" % "2.6.2"
)
I also have TestMain.scala at src/main/scala-2.11/TestMain.scala:
import org.apache.spark.{SparkContext,SparkConf}
/**
 * Created by tuannv5 on 24/08/2016.
 */
object TestMain {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setMaster("local[*]")
      .setAppName("Test Spark")
      .set("spark.executor.memory", "2g")
    val sc = new SparkContext(conf)
    val data = sc.parallelize(1 to 1000000).filter(_ < 10000)
    data.foreach(println)
  }
}
For some reason, when I run the app in IntelliJ, I get the following error:
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/08/24 10:29:28 INFO SparkContext: Running Spark version 1.6.2
16/08/24 10:29:29 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/08/24 10:29:29 ERROR SparkContext: Error initializing SparkContext.
java.net.UnknownHostException: master: master: unknown error
at java.net.InetAddress.getLocalHost(InetAddress.java:1505)
at org.apache.spark.util.Utils$.findLocalInetAddress(Utils.scala:789)
at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress$lzycompute(Utils.scala:782)
at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress(Utils.scala:782)
at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:839)
at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:839)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.util.Utils$.localHostName(Utils.scala:839)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:420)
at TestMain$.main(TestMain.scala:8)
at TestMain.main(TestMain.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: java.net.UnknownHostException: master: unknown error
at java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method)
at java.net.InetAddress$2.lookupAllHostAddr(InetAddress.java:928)
at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1323)
at java.net.InetAddress.getLocalHost(InetAddress.java:1500)
... 15 more
16/08/24 10:29:29 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" java.net.UnknownHostException: master: master: unknown error
at java.net.InetAddress.getLocalHost(InetAddress.java:1505)
at org.apache.spark.util.Utils$.findLocalInetAddress(Utils.scala:789)
at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress$lzycompute(Utils.scala:782)
at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress(Utils.scala:782)
at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:839)
at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:839)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.util.Utils$.localHostName(Utils.scala:839)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:420)
at TestMain$.main(TestMain.scala:8)
at TestMain.main(TestMain.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: java.net.UnknownHostException: master: unknown error
at java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method)
at java.net.InetAddress$2.lookupAllHostAddr(InetAddress.java:928)
at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1323)
at java.net.InetAddress.getLocalHost(InetAddress.java:1500)
... 15 more
Process finished with exit code 1
Can someone explain the problem stated in this error? Is it because my dependencies were not installed correctly, or is there another reason?
Your hostname is not configured properly.
Run hostname and check whether the same name appears in /etc/hosts. For example:
root@google:/home/makbul# hostname
google
The same name has to appear in /etc/hosts:
127.0.1.1 google
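If you can't edit /etc/hosts, another option is to point Spark at an explicit address so it never looks up the machine's hostname. A minimal sketch, assuming a purely local run where the loopback address is acceptable (spark.driver.host is a standard Spark setting; the SPARK_LOCAL_IP environment variable has a similar effect):

val conf = new SparkConf()
  .setMaster("local[*]")
  .setAppName("Test Spark")
  // Bind the driver to loopback so SparkContext does not need to
  // resolve the machine's hostname via InetAddress.getLocalHost().
  .set("spark.driver.host", "127.0.0.1")
val sc = new SparkContext(conf)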
Related
I am trying to run a simple Flink streaming job on AWS EMR. The purpose is very simple for now:
Consume data from Kafka in Flink.
Load it into another Kafka topic.
I am using the following dependencies:
scalaVersion := "2.11.8"
val flinkVersion = "1.11.1"
libraryDependencies ++= Seq(
  "org.apache.flink" %% "flink-scala" % flinkVersion,
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion,
  "org.apache.flink" %% "flink-connector-kafka" % flinkVersion
)
The Flink code I am using is:
private val serdeSchema = new SimpleStringSchema

val env = StreamExecutionEnvironment.getExecutionEnvironment
val stream = env
  .addSource(createKafkaConsumer(kafkaInputTopic,
    kafkaBrokers, kafkaConfig("consumerGroupId").toString,
    kafkaConfig("defaultReset").toString))
stream
  .map((s: String) => s)
  .addSink(createKafkaProducer(kafkaOutputTopic, kafkaBrokers))
env.execute(jobConfig("jobName").toString)
}

def createKafkaProducer(kafkaTopic: String, kafkaBrokers: String): FlinkKafkaProducer[String] = {
  val producer = new FlinkKafkaProducer[String](kafkaBrokers, kafkaTopic, serdeSchema)
  producer
}

def createKafkaConsumer(kafkaInputTopic: String,
                        kafkaBrokers: String,
                        consumerGroup: String,
                        defaultReset: String): FlinkKafkaConsumer[String] = {
  val properties = new Properties()
  properties.setProperty("bootstrap.servers", kafkaBrokers)
  properties.setProperty("group.id", consumerGroup)
  properties.setProperty("enable.auto.commit", "false")
  properties.setProperty("auto.offset.reset", defaultReset)
  val consumer = new FlinkKafkaConsumer[String](kafkaInputTopic, serdeSchema, properties)
  consumer
}
I generate an assembly jar using sbt. I use the following command to run the job on EMR:
/bin/flink run -c com.example.FlinkConsumer flink/target/scala-2.11/flink-assembly-0.1.jar
Below is the stack trace:
Caused by: org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: c458520153e875811c46c386b9ec605e)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:112)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$21(RestClusterClient.java:565)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$8(FutureUtils.java:291)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:147)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:110)
... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:110)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:76)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:192)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:186)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:180)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:484)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:380)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:279)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:194)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
at akka.actor.Actor$class.aroundReceive(Actor.scala:517)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
at akka.actor.ActorCell.invoke(ActorCell.scala:561)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
at akka.dispatch.Mailbox.run(Mailbox.scala:225)
at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.NoSuchMethodError: org.apache.flink.api.common.serialization.SerializationSchema.open(Lorg/apache/flink/api/common/serialization/SerializationSchema$InitializationContext;)V
at org.apache.flink.streaming.connectors.kafka.internals.KafkaSerializationSchemaWrapper.open(KafkaSerializationSchemaWrapper.java:61)
at org.apache.flink.streaming.connectors.kafka.FlinkKafkaProducer.open(FlinkKafkaProducer.java:808)
at org.apache.flink.api.common.functions.util.FunctionUtils.openFunction(FunctionUtils.java:36)
at org.apache.flink.streaming.api.operators.AbstractUdfStreamOperator.open(AbstractUdfStreamOperator.java:102)
at org.apache.flink.streaming.api.operators.StreamSink.open(StreamSink.java:48)
at org.apache.flink.streaming.runtime.tasks.StreamTask.initializeStateAndOpen(StreamTask.java:1007)
at org.apache.flink.streaming.runtime.tasks.StreamTask.lambda$beforeInvoke$0(StreamTask.java:454)
at org.apache.flink.streaming.runtime.tasks.StreamTaskActionExecutor$SynchronizedStreamTaskActionExecutor.runThrowing(StreamTaskActionExecutor.java:94)
at org.apache.flink.streaming.runtime.tasks.StreamTask.beforeInvoke(StreamTask.java:449)
at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:461)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:707)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:532)
at java.lang.Thread.run(Thread.java:748)
It looks like a version issue, but I have tried various versions and I don't see this open method anywhere. I think the serializer calls the open method internally and is unable to find it. Can someone please help? I am new to Flink.
If you're using EMR's Flink support, then most Flink libraries should be flagged as "provided" so that they're not in your jar, as they're on the classpath from the Flink installation that EMR is providing. You'll still need to explicitly include anything that's not provided by EMR (e.g. flink-connector-kafka).
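A sketch of what that can look like in build.sbt, assuming EMR's Flink installation supplies the core Flink jars at runtime (verify the exact list against your EMR release):

libraryDependencies ++= Seq(
  // On the cluster classpath via EMR's Flink installation, so excluded from the assembly jar
  "org.apache.flink" %% "flink-scala" % flinkVersion % "provided",
  "org.apache.flink" %% "flink-streaming-scala" % flinkVersion % "provided",
  // Not provided by EMR, so it must ship inside the assembly jar
  "org.apache.flink" %% "flink-connector-kafka" % flinkVersion
)

It is also worth pinning flinkVersion to the exact Flink version your EMR release runs: SerializationSchema.open(InitializationContext) was only added in Flink 1.11, so compiling the Kafka connector against 1.11.1 while the cluster provides an older flink-core can produce exactly the NoSuchMethodError above.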
I am using scalaVersion := "2.10.5" and libraryDependencies += "org.rogach" %% "scallop" % "3.1.2".
Getting the following error:
Exception in thread "main" java.lang.NoSuchMethodError: scala.collection.immutable.$colon$colon.hd$1()Ljava/lang/Object;
at org.rogach.scallop.DefaultConverters$$anon$2.parse(DefaultConverters.scala:27)
at org.rogach.scallop.ValueConverter$class.parseCached(ValueConverter.scala:21)
at org.rogach.scallop.DefaultConverters$$anon$2.parseCached(DefaultConverters.scala:24)
at org.rogach.scallop.Scallop$$anonfun$verify$17.apply(Scallop.scala:632)
at org.rogach.scallop.Scallop$$anonfun$verify$17.apply(Scallop.scala:630)
at scala.collection.immutable.List.foreach(List.scala:381)
at org.rogach.scallop.Scallop.verify(Scallop.scala:630)
at org.rogach.scallop.ScallopConfBase.verifyBuilder(ScallopConfBase.scala:405)
at org.rogach.scallop.ScallopConfBase.verify(ScallopConfBase.scala:744)
at com.unity3d.ads.conf.OperativeEventConverterConf.<init>(OperativeEventConverterConf.scala:50)
at com.unity3d.ads.analytics.TestClass$.main(TestClass.scala:51)
at com.unity3d.ads.analytics.TestClass.main(TestClass.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
The exact same code works fine with scalaVersion := "2.11.8".
Unfortunately, I have to use 2.10.5 because I am using Spark version 1.6.
sample code:
import org.rogach.scallop.{ScallopConf, ScallopOption, Serialization, ValueConverter, singleArgConverter}

class TestClass(args: Seq[String]) extends ScallopConf(args) with Serialization {
  val testInput: ScallopOption[String] =
    opt[String](
      name = "test.input",
      descr = "test",
      required = false,
      default = Option("testPath"))
  verify()
}
Is there any workaround I can use here to make it work with Scala 2.10.5?
Answering my own question in case someone else faces a similar issue.
This turned out to be a classpath issue. The root cause: I was using Spark 2.1 to run code compiled against Spark 1.6. Spark 1.6 is built with Scala 2.10, while Spark 2.1 is built with Scala 2.11, and the two Scala binary versions are incompatible.
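A minimal sketch of keeping the versions aligned in build.sbt, assuming the cluster runs Spark 2.1 (the %% operator appends the Scala binary version to the artifact name, so scalaVersion must match what the cluster's Spark was built with):

// Spark 2.1.x is built against Scala 2.11, so compile with 2.11 as well.
scalaVersion := "2.11.8"
// "provided" keeps the cluster's own Spark jars from clashing with the assembly.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0" % "provided"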
I'm using the following code to authenticate against a DataStax Cassandra cluster (with DSE authentication enabled), but I'm getting an exception. Could someone help me identify and fix the issue?
code:
import com.datastax.driver.core.Cluster

object MySecondScalaWorksheet {
  println("Welcome to the Scala worksheet")

  def dseConnect(args: Array[String]) {
    val cluster = Cluster.builder()
      .addContactPoint("server01.lab.east.com")
      .withCredentials("cassandra", "cassandra")
      .build()
    val session = cluster.connect()
    println(session.getCluster())
  }
}
Exception:
java.lang.NoClassDefFoundError: org/slf4j/LoggerFactory
at com.datastax.driver.core.Cluster.<clinit>(Cluster.java:64)
at MySecondScalaWorksheet$$anonfun$main$1.apply$mcV$sp(MySecondScalaWorksheet.scala:6)
at org.scalaide.worksheet.runtime.library.WorksheetSupport$$anonfun$$execute$1.apply$mcV$sp(WorksheetSupport.scala:76)
at org.scalaide.worksheet.runtime.library.WorksheetSupport$.redirected(WorksheetSupport.scala:65)
at org.scalaide.worksheet.runtime.library.WorksheetSupport$.$execute(WorksheetSupport.scala:75)
at MySecondScalaWorksheet$.main(MySecondScalaWorksheet.scala:3)
at MySecondScalaWorksheet.main(MySecondScalaWorksheet.scala)
Caused by: java.lang.ClassNotFoundException: org.slf4j.LoggerFactory
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
It seems the SLF4J library is missing from your build setup: the root cause in the trace is java.lang.ClassNotFoundException: org.slf4j.LoggerFactory, which the DataStax driver needs when the Cluster class is initialized. Try adding the below to your sbt build file.
libraryDependencies += "org.slf4j" % "slf4j-api" % "1.7.25"
I have created a simple sbt project from a tutorial:
build.sbt:
lazy val sorm_test = (project in file(".")).
  settings(
    name := "SORM_TEST",
    scalaVersion := "2.11.7",
    libraryDependencies ++= Seq(
      "org.sorm-framework" % "sorm" % "0.3.18",
      "com.h2database" % "h2" % "1.4.188"
    )
  )
test.Main.scala:
package test

case class Artist(
  names : Map[Locale, Seq[String]],
  genres : Set[Genre]
)

case class Genre(
  names : Map[Locale, Seq[String]]
)

case class Locale(
  code : String
)

import sorm._

object Db extends Instance(
  entities = Set(
    Entity[Artist](),
    Entity[Genre](),
    Entity[Locale](unique = Set() + Seq("code"))
  ),
  url = "jdbc:h2:mem:test",
  user = "",
  password = "",
  initMode = InitMode.Create
)

object Main extends App {
  // init
  Db.##
}
When I run this project in IntelliJ, I get these exceptions:
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Exception in thread "main" java.lang.BootstrapMethodError: java.lang.NoClassDefFoundError: scala/runtime/java8/JFunction1
at scala.tools.reflect.ToolBoxFactory$ToolBoxImpl.parse(ToolBoxFactory.scala:414)
at sorm.persisted.PersistedClass$.createClass(PersistedClass.scala:107)
at sorm.persisted.PersistedClass$$anon$1$$anonfun$resolve$1.apply(PersistedClass.scala:125)
at sorm.persisted.PersistedClass$$anon$1$$anonfun$resolve$1.apply(PersistedClass.scala:125)
at scala.collection.mutable.MapLike$class.getOrElseUpdate(MapLike.scala:194)
at scala.collection.mutable.AbstractMap.getOrElseUpdate(Map.scala:80)
at sorm.persisted.PersistedClass$$anon$1.resolve(PersistedClass.scala:125)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at sorm.persisted.PersistedClass$.apply(PersistedClass.scala:129)
at sorm.Instance$Initialization$$anonfun$9$$anonfun$apply$16.apply(Instance.scala:239)
at sorm.Instance$Initialization$$anonfun$9$$anonfun$apply$16.apply(Instance.scala:239)
at embrace.package$EmbraceAny$.$$extension(package.scala:6)
at sorm.Instance$Initialization$$anonfun$9.apply(Instance.scala:239)
at sorm.Instance$Initialization$$anonfun$9.apply(Instance.scala:239)
at scala.collection.immutable.Set$Set3.foreach(Set.scala:145)
at sorm.Instance$Initialization.<init>(Instance.scala:239)
at sorm.Instance.<init>(Instance.scala:38)
at test.Db$.<init>(Main.scala:15)
at test.Db$.<clinit>(Main.scala)
at test.Main$.delayedEndpoint$test$Main$1(Main.scala:29)
at test.Main$delayedInit$body.apply(Main.scala:27)
at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.App$$anonfun$main$1.apply(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:381)
at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
at scala.App$class.main(App.scala:76)
at test.Main$.main(Main.scala:27)
at test.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:140)
Caused by: java.lang.NoClassDefFoundError: scala/runtime/java8/JFunction1
... 38 more
Caused by: java.lang.ClassNotFoundException: scala.runtime.java8.JFunction1
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 38 more
It works fine when I use sbt run. This exception is also thrown when I integrate SORM with the Play framework. How can I solve this problem?
I solved this problem by adding the following to my sbt configuration:
dependencyOverrides += "org.scala-lang" % "scala-compiler" % scalaVersion.value
I receive a very similar exception in a Maven project at runtime:
Caused by: java.lang.BootstrapMethodError: java.lang.NoClassDefFoundError: scala/runtime/java8/JFunction0$mcI$sp
I am currently using
<scala.version>2.11.7</scala.version>
<scala.compat.version>2.11</scala.compat.version>
<scalatest.version>2.2.2</scalatest.version>
<scala-maven-plugin.version>3.2.0</scala-maven-plugin.version>
in the pom properties and refer to these properties in all other entries, like
<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-library</artifactId>
  <version>${scala.version}</version>
</dependency>
Maven packaging/installation works fine. Is this a similar problem? Can someone explain the solution above (or maybe translate it to the Maven case)?
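A hedged translation of the sbt override to Maven: declaring scala-compiler as a direct dependency pins its version, because Maven prefers a direct declaration over whatever version a library pulls in transitively. The classes under scala/runtime/java8 ship with Scala 2.12 and later, so the error suggests a newer Scala artifact leaked onto the 2.11 classpath.

<!-- Sketch: pin the compiler to the project's own Scala version -->
<dependency>
  <groupId>org.scala-lang</groupId>
  <artifactId>scala-compiler</artifactId>
  <version>${scala.version}</version>
</dependency>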
I am trying to perform the following operations, but I am getting an error when using the groupByKey transformation. I'm using Spark in standalone mode.
sample.sbt contains:
name := "Spark Join"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0"
fork := true
My Scala code:
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import java.util.Properties
object yelpDataJoin {
  def main(args: Array[String]) {
    val reviewFile = "/home/prasad/Desktop/BigData/psp150030_HW3/data/review3.csv"
    val conf = new SparkConf().setAppName("SparkJoins")
    val sc = new SparkContext(conf)
    val reviewData = sc.textFile(reviewFile, 2)
    val groupReviewData = reviewData.map(line => line.split("::"))
      .map(word => (word(2), (word(20), 1)))
      .groupByKey()
      .foreach(println)
  }
}
I'm getting the following error message:
15/07/20 16:10:48 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 32.0 KB, free 265.4 MB)
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/rdd/RDD$
at yelpDataJoin$.main(HW3_Question2.scala:14)
at yelpDataJoin.main(HW3_Question2.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.rdd.RDD$
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 9 more
Please let me know if I'm doing anything wrong here.
Thanks & Regards,
Prasad