I am trying to follow the Slick 2 examples using MySQL. However, when trying to connect to the database I am getting an exception.
import scala.slick.driver.MySQLDriver.simple._
import slickTest.DB.SuppliersEntity
object slickTestDB {
  def main(args: Array[String]) {
    println("CreateDb!")
    val schemaName = "slickTest"
    val conn_str = "jdbc:mysql://localhost:3306/" + schemaName
    val database = Database.forURL(conn_str, user = "", password = "", driver = "com.mysql.jdbc.Driver")
  }
}
This is the exception I get:
Exception in thread "main" java.lang.ClassNotFoundException: com.mysql.jdbc.Driver
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:190)
at scala.slick.jdbc.JdbcBackend$DatabaseFactoryDef$$anon$4.<init>(JdbcBackend.scala:62)
at scala.slick.jdbc.JdbcBackend$DatabaseFactoryDef$class.forURL(JdbcBackend.scala:61)
at scala.slick.jdbc.JdbcBackend$$anon$2.forURL(JdbcBackend.scala:23)
at slickTest.slickTestDB$.main(slickTestDB.scala:18)
at slickTest.slickTestDB.main(slickTestDB.scala)
You probably need to add the MySQL Connector library as a dependency in your project.
libraryDependencies += "mysql" % "mysql-connector-java" % "latest.release"
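For reference, here is a minimal build.sbt sketch with a pinned connector version (the Slick and connector versions shown are illustrative assumptions, not taken from the question):

// build.sbt -- illustrative versions, adjust to your project
scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  // Slick 2.x, matching the scala.slick.driver imports above
  "com.typesafe.slick" %% "slick"                % "2.1.0",
  // JDBC driver that provides com.mysql.jdbc.Driver
  "mysql"              %  "mysql-connector-java" % "5.1.38"
)

Once the connector jar is on the classpath, Database.forURL can load com.mysql.jdbc.Driver by name.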
Related
I'm using the following code to authenticate against a DataStax Cassandra cluster (with DSE authentication enabled), but I'm getting an exception. Could someone help me identify and fix the issue?
Code:
import com.datastax.driver.core.Cluster
object MySecondScalaWorksheet {
  println("Welcome to the Scala worksheet")

  def dseConnect(args: Array[String]) {
    val cluster = Cluster.builder().addContactPoint("server01.lab.east.com").withCredentials("cassandra", "cassandra").build()
    val session = cluster.connect()
    println(session.getCluster())
  }
}
Exception:
java.lang.NoClassDefFoundError: org/slf4j/LoggerFactory
at com.datastax.driver.core.Cluster.<clinit>(Cluster.java:64)
at MySecondScalaWorksheet$$anonfun$main$1.apply$mcV$sp(MySecondScalaWorksheet.scala:6)
at org.scalaide.worksheet.runtime.library.WorksheetSupport$$anonfun$$execute$1.apply$mcV$sp(WorksheetSupport.scala:76)
at org.scalaide.worksheet.runtime.library.WorksheetSupport$.redirected(WorksheetSupport.scala:65)
at org.scalaide.worksheet.runtime.library.WorksheetSupport$.$execute(WorksheetSupport.scala:75)
at MySecondScalaWorksheet$.main(MySecondScalaWorksheet.scala:3)
at MySecondScalaWorksheet.main(MySecondScalaWorksheet.scala)
Caused by: java.lang.ClassNotFoundException: org.slf4j.LoggerFactory
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
The stack trace shows that org.slf4j.LoggerFactory cannot be found, so the SLF4J library is missing from your build setup (the DataStax driver also needs Guava at runtime). Try adding the lines below to your sbt build file:
libraryDependencies += "org.slf4j" % "slf4j-api" % "1.7.25"
libraryDependencies += "com.google.guava" % "guava" % "21.0"
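slf4j-api only provides the logging interface; if you also want the driver's log output to go somewhere, add a binding as well. A minimal sketch, assuming the simple console binding is acceptable (the version is illustrative):

// SLF4J binding that writes to the console; any other binding (logback, log4j) works too
libraryDependencies += "org.slf4j" % "slf4j-simple" % "1.7.25"

Without a binding, SLF4J falls back to a no-op logger and prints a warning at startup.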
I'm trying to use phantom to handle my Cassandra database.
This is the example I am learning from (thanks to Thiago):
https://github.com/iamthiago/cassandra-phantom
I can run 'SongTest' successfully, but I have trouble running database.create from my Scala main object.
The same code runs fine under ScalaTest, but fails when called from the main object.
This is my source code:
package com.cassandra.phantom.modeling
import com.cassandra.phantom.modeling.database._
import com.cassandra.phantom.modeling.connector._
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._
/**
* Created by karamko on 2017. 3. 28..
*/
class TestCass extends EmbeddedDatabase with Connector.connector.Connector {
  def create() {
    database.create(5.seconds)
  }
}

object Main extends App {
  val test = new TestCass
  test.create()
}
And this is my error:
Exception in thread "main" java.lang.NoClassDefFoundError: scala/reflect/runtime/package$
at com.outworkers.phantom.column.AbstractColumn$class.com$outworkers$phantom$column$AbstractColumn$$_name(AbstractColumn.scala:55)
at com.outworkers.phantom.column.Column.com$outworkers$phantom$column$AbstractColumn$$_name$lzycompute(Column.scala:24)
at com.outworkers.phantom.column.Column.com$outworkers$phantom$column$AbstractColumn$$_name(Column.scala:24)
at com.outworkers.phantom.column.AbstractColumn$class.name(AbstractColumn.scala:58)
at com.outworkers.phantom.column.Column.name(Column.scala:24)
at com.outworkers.phantom.column.PrimitiveColumn.qb(PrimitiveColumn.scala:38)
at com.outworkers.phantom.builder.query.RootCreateQuery$$anonfun$lightweight$1.apply(CreateQuery.scala:48)
at com.outworkers.phantom.builder.query.RootCreateQuery$$anonfun$lightweight$1.apply(CreateQuery.scala:48)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
at scala.collection.immutable.Set$Set4.foreach(Set.scala:200)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
at scala.collection.AbstractSet.scala$collection$SetLike$$super$map(Set.scala:47)
at scala.collection.SetLike$class.map(SetLike.scala:92)
at scala.collection.AbstractSet.map(Set.scala:47)
at com.outworkers.phantom.builder.query.RootCreateQuery.lightweight(CreateQuery.scala:48)
at com.outworkers.phantom.builder.query.RootCreateQuery.ifNotExists(CreateQuery.scala:71)
at com.outworkers.phantom.CassandraTable.autocreate(CassandraTable.scala:92)
at com.cassandra.phantom.modeling.database.SongsDatabase$$anon$1$$anonfun$createQueries$1.apply(SongsDatabase.scala:15)
at com.cassandra.phantom.modeling.database.SongsDatabase$$anon$1$$anonfun$createQueries$1.apply(SongsDatabase.scala:15)
at com.outworkers.phantom.database.ExecutableCreateStatementsList.future(Database.scala:173)
at com.outworkers.phantom.database.Database.createAsync(Database.scala:85)
at com.outworkers.phantom.database.Database.create(Database.scala:75)
at com.cassandra.phantom.modeling.SongsStreaming$.main(Main.scala:29)
at com.cassandra.phantom.modeling.SongsStreaming.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: java.lang.ClassNotFoundException: scala.reflect.runtime.package$
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 30 more
I solved my problem by adding the dependency below:
libraryDependencies += "org.scala-lang" % "scala-reflect" % scalaVersion.value
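For context, a minimal build.sbt sketch showing how that line fits alongside phantom (the Scala and phantom versions are illustrative assumptions):

scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  // phantom DSL from com.outworkers, as seen in the stack trace above
  "com.outworkers" %% "phantom-dsl"   % "2.7.6",
  // scala-reflect must match the compiler version, hence scalaVersion.value
  "org.scala-lang" %  "scala-reflect" % scalaVersion.value
)

Using scalaVersion.value instead of a hard-coded number keeps scala-reflect in sync with the Scala version the project is compiled with.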
Thanks a lot
I tried to run the following code.
package test
import java.util.Properties
import org.apache.flink.streaming.api.scala.{StreamExecutionEnvironment, _}
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08
import org.apache.flink.streaming.util.serialization.SimpleStringSchema
object FlinkKafkaStreaming {
  def main(args: Array[String]) {
    val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.enableCheckpointing(5000)

    val properties = new Properties()
    properties.setProperty("bootstrap.servers", "www.iteblog.com:9092")
    // only required for Kafka 0.8
    properties.setProperty("zookeeper.connect", "www.iteblog.com:2181")
    properties.setProperty("group.id", "iteblog")

    val stream = env.addSource(new FlinkKafkaConsumer08[String]("iteblog",
      new SimpleStringSchema(), properties))
    stream.setParallelism(4).writeAsText("hdfs:///tmp/iteblog/data")
    env.execute("IteblogFlinkKafkaStreaming")
  }
}
But I got the following error:
/Library/Java/JavaVirtualMachines/jdk1.7.0_80.jdk/Contents/Home/bin/java -Didea.launcher.port=7533 "-Didea.launcher.bin.path=/Applications/IntelliJ IDEA CE.app/Contents/bin" -Dfile.encoding=UTF-8 -classpath "/Library/Java/JavaVirtualMachines/jdk1.7.0_80.jdk/Contents/Home/jre/lib/charsets.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_80.jdk/Contents/Home/jre/lib/deploy.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_80.jdk/Contents/Home/jre/lib/ext/dnsns.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_80.jdk/Contents/Home/jre/lib/ext/localedata.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_80.jdk/Contents/Home/jre/lib/ext/sunec.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_80.jdk/Contents/Home/jre/lib/ext/sunjce_provider.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_80.jdk/Contents/Home/jre/lib/ext/sunpkcs11.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_80.jdk/Contents/Home/jre/lib/ext/zipfs.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_80.jdk/Contents/Home/jre/lib/htmlconverter.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_80.jdk/Contents/Home/jre/lib/javaws.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_80.jdk/Contents/Home/jre/lib/jce.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_80.jdk/Contents/Home/jre/lib/jfr.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_80.jdk/Contents/Home/jre/lib/jfxrt.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_80.jdk/Contents/Home/jre/lib/jsse.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_80.jdk/Contents/Home/jre/lib/management-agent.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_80.jdk/Contents/Home/jre/lib/plugin.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_80.jdk/Contents/Home/jre/lib/resources.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_80.jdk/Contents/Home/jre/lib/rt.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_80.jdk/Contents/Home/lib/ant-javafx.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_80.jdk/Contents/Home/lib/dt.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_80.jdk/Contents/Home/lib/javafx-doclet.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_80.jdk/Contents/Home/lib/javafx-mx.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_80.jdk/Contents/Home/lib/jconsole.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_80.jdk/Contents/Home/lib/sa-jdi.jar:/Library/Java/JavaVirtualMachines/jdk1.7.0_80.jdk/Contents/Home/lib/tools.jar:/Users/zhenhao.li/ethan-stream/target/classes:/Users/zhenhao.li/.m2/repository/org/apache/flink/flink-scala_2.10/1.0.2/flink-scala_2.10-1.0.2.jar:/Users/zhenhao.li/.m2/repository/org/apache/flink/flink-core/1.0.2/flink-core-1.0.2.jar:/Users/zhenhao.li/.m2/repository/org/apache/flink/flink-annotations/1.0.2/flink-annotations-1.0.2.jar:/Users/zhenhao.li/.m2/repository/org/apache/commons/commons-lang3/3.3.2/commons-lang3-3.3.2.jar:/Users/zhenhao.li/.m2/repository/org/slf4j/slf4j-api/1.7.7/slf4j-api-1.7.7.jar:/Users/zhenhao.li/.m2/repository/org/slf4j/slf4j-log4j12/1.7.7/slf4j-log4j12-1.7.7.jar:/Users/zhenhao.li/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar:/Users/zhenhao.li/.m2/repository/org/apache/flink/force-shading/1.0.2/force-shading-1.0.2.jar:/Users/zhenhao.li/.m2/repository/com/esotericsoftware/kryo/kryo/2.24.0/kryo-2.24.0.jar:/Users/zhenhao.li/.m2/repository/com/esotericsoftware/minlog/minlog/1.2/minlog-1.2.jar:/Users/zhenhao.li/.m2/repository/org/objenesis/objenesis/2.1/objenesis-2.1.jar:/Users/zhenhao.li/.m2/repository/org/apache/avro/avro/1.7.6/avro-1.7.6.jar:/Users/zhenhao.li/.m2/repository/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar:/Users/zhenhao.li/.m2/repository/org/codehaus/ja
ckson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar:/Users/zhenhao.li/.m2/repository/com/thoughtworks/paranamer/paranamer/2.3/paranamer-2.3.jar:/Users/zhenhao.li/.m2/repository/org/apache/flink/flink-shaded-hadoop2/1.0.2/flink-shaded-hadoop2-1.0.2.jar:/Users/zhenhao.li/.m2/repository/commons-cli/commons-cli/1.2/commons-cli-1.2.jar:/Users/zhenhao.li/.m2/repository/org/apache/commons/commons-math3/3.5/commons-math3-3.5.jar:/Users/zhenhao.li/.m2/repository/xmlenc/xmlenc/0.52/xmlenc-0.52.jar:/Users/zhenhao.li/.m2/repository/commons-codec/commons-codec/1.4/commons-codec-1.4.jar:/Users/zhenhao.li/.m2/repository/commons-io/commons-io/2.4/commons-io-2.4.jar:/Users/zhenhao.li/.m2/repository/commons-net/commons-net/3.1/commons-net-3.1.jar:/Users/zhenhao.li/.m2/repository/commons-collections/commons-collections/3.2.1/commons-collections-3.2.1.jar:/Users/zhenhao.li/.m2/repository/javax/servlet/servlet-api/2.5/servlet-api-2.5.jar:/Users/zhenhao.li/.m2/repository/org/mortbay/jetty/jetty-util/6.1.26/jetty-util-6.1.26.jar:/Users/zhenhao.li/.m2/repository/com/sun/jersey/jersey-core/1.9/jersey-core-1.9.jar:/Users/zhenhao.li/.m2/repository/commons-el/commons-el/1.0/commons-el-1.0.jar:/Users/zhenhao.li/.m2/repository/commons-logging/commons-logging/1.1.3/commons-logging-1.1.3.jar:/Users/zhenhao.li/.m2/repository/com/jamesmurty/utils/java-xmlbuilder/0.4/java-xmlbuilder-0.4.jar:/Users/zhenhao.li/.m2/repository/commons-lang/commons-lang/2.6/commons-lang-2.6.jar:/Users/zhenhao.li/.m2/repository/commons-configuration/commons-configuration/1.7/commons-configuration-1.7.jar:/Users/zhenhao.li/.m2/repository/commons-digester/commons-digester/1.8.1/commons-digester-1.8.1.jar:/Users/zhenhao.li/.m2/repository/org/xerial/snappy/snappy-java/1.0.5/snappy-java-1.0.5.jar:/Users/zhenhao.li/.m2/repository/com/jcraft/jsch/0.1.42/jsch-0.1.42.jar:/Users/zhenhao.li/.m2/repository/org/apache/zookeeper/zookeeper/3.4.6/zookeeper-3.4.6.jar:/Users/zhenhao.li/.m2/repository/org/apache/commons/commons-compress/1.4.1/commons-compress-1.4.1.jar:/Users/zhenhao.li/.m2/repository/org/tukaani/xz/1.0/xz-1.0.jar:/Users/zhenhao.li/.m2/repository/commons-beanutils/commons-beanutils-bean-collections/1.8.3/commons-beanutils-bean-collections-1.8.3.jar:/Users/zhenhao.li/.m2/repository/commons-daemon/commons-daemon/1.0.13/commons-daemon-1.0.13.jar:/Users/zhenhao.li/.m2/repository/javax/xml/bind/jaxb-api/2.2.2/jaxb-api-2.2.2.jar:/Users/zhenhao.li/.m2/repository/javax/xml/stream/stax-api/1.0-2/stax-api-1.0-2.jar:/Users/zhenhao.li/.m2/repository/javax/activation/activation/1.1/activation-1.1.jar:/Users/zhenhao.li/.m2/repository/com/google/inject/guice/3.0/guice-3.0.jar:/Users/zhenhao.li/.m2/repository/javax/inject/javax.inject/1/javax.inject-1.jar:/Users/zhenhao.li/.m2/repository/aopalliance/aopalliance/1.0/aopalliance-1.0.jar:/Users/zhenhao.li/.m2/repository/org/apache/flink/flink-java/1.0.2/flink-java-1.0.2.jar:/Users/zhenhao.li/.m2/repository/org/apache/flink/flink-optimizer_2.10/1.0.2/flink-optimizer_2.10-1.0.2.jar:/Users/zhenhao.li/.m2/repository/org/apache/flink/flink-runtime_2.10/1.0.2/flink-runtime_2.10-1.0.2.jar:/Users/zhenhao.li/.m2/repository/org/scala-lang/scala-reflect/2.10.4/scala-reflect-2.10.4.jar:/Users/zhenhao.li/.m2/repository/org/scala-lang/scala-library/2.10.4/scala-library-2.10.4.jar:/Users/zhenhao.li/.m2/repository/org/scala-lang/scala-compiler/2.10.4/scala-compiler-2.10.4.jar:/Users/zhenhao.li/.m2/repository/org/scalamacros/quasiquotes_2.10/2.0.1/quasiquotes_2.10-2.0.1.jar:/Users/zhenhao.li/.m2/repository/org/apache/flink/f
link-streaming-scala_2.10/1.0.2/flink-streaming-scala_2.10-1.0.2.jar:/Users/zhenhao.li/.m2/repository/org/apache/flink/flink-streaming-java_2.10/1.0.2/flink-streaming-java_2.10-1.0.2.jar:/Users/zhenhao.li/.m2/repository/org/apache/flink/flink-clients_2.10/1.0.2/flink-clients_2.10-1.0.2.jar:/Users/zhenhao.li/.m2/repository/org/apache/commons/commons-math/2.2/commons-math-2.2.jar:/Users/zhenhao.li/.m2/repository/org/apache/sling/org.apache.sling.commons.json/2.0.6/org.apache.sling.commons.json-2.0.6.jar:/Users/zhenhao.li/.m2/repository/io/netty/netty-all/4.0.27.Final/netty-all-4.0.27.Final.jar:/Users/zhenhao.li/.m2/repository/org/javassist/javassist/3.18.2-GA/javassist-3.18.2-GA.jar:/Users/zhenhao.li/.m2/repository/com/typesafe/akka/akka-actor_2.10/2.3.7/akka-actor_2.10-2.3.7.jar:/Users/zhenhao.li/.m2/repository/com/typesafe/config/1.2.1/config-1.2.1.jar:/Users/zhenhao.li/.m2/repository/com/typesafe/akka/akka-remote_2.10/2.3.7/akka-remote_2.10-2.3.7.jar:/Users/zhenhao.li/.m2/repository/io/netty/netty/3.8.0.Final/netty-3.8.0.Final.jar:/Users/zhenhao.li/.m2/repository/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar:/Users/zhenhao.li/.m2/repository/org/uncommons/maths/uncommons-maths/1.2.2a/uncommons-maths-1.2.2a.jar:/Users/zhenhao.li/.m2/repository/com/typesafe/akka/akka-slf4j_2.10/2.3.7/akka-slf4j_2.10-2.3.7.jar:/Users/zhenhao.li/.m2/repository/org/clapper/grizzled-slf4j_2.10/1.0.2/grizzled-slf4j_2.10-1.0.2.jar:/Users/zhenhao.li/.m2/repository/com/github/scopt/scopt_2.10/3.2.0/scopt_2.10-3.2.0.jar:/Users/zhenhao.li/.m2/repository/io/dropwizard/metrics/metrics-core/3.1.0/metrics-core-3.1.0.jar:/Users/zhenhao.li/.m2/repository/io/dropwizard/metrics/metrics-jvm/3.1.0/metrics-jvm-3.1.0.jar:/Users/zhenhao.li/.m2/repository/io/dropwizard/metrics/metrics-json/3.1.0/metrics-json-3.1.0.jar:/Users/zhenhao.li/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.4.2/jackson-databind-2.4.2.jar:/Users/zhenhao.li/.m2/repository/com/fasterxml/jackson/core/jackson-annotations/2.4.0/jackson-annotations-2.4.0.jar:/Users/zhenhao.li/.m2/repository/com/fasterxml/jackson/core/jackson-core/2.4.2/jackson-core-2.4.2.jar:/Users/zhenhao.li/.m2/repository/jline/jline/0.9.94/jline-0.9.94.jar:/Users/zhenhao.li/.m2/repository/junit/junit/3.8.1/junit-3.8.1.jar:/Users/zhenhao.li/.m2/repository/com/twitter/chill_2.10/0.7.4/chill_2.10-0.7.4.jar:/Users/zhenhao.li/.m2/repository/com/twitter/chill-java/0.7.4/chill-java-0.7.4.jar:/Users/zhenhao.li/.m2/repository/org/apache/flink/flink-connector-kafka-0.8_2.10/1.1-SNAPSHOT/flink-connector-kafka-0.8_2.10-1.1-20160514.040356-150.jar:/Users/zhenhao.li/.m2/repository/org/apache/flink/flink-connector-kafka-base_2.10/1.1-SNAPSHOT/flink-connector-kafka-base_2.10-1.1-20160514.040350-150.jar:/Users/zhenhao.li/.m2/repository/org/apache/kafka/kafka_2.10/0.8.2.2/kafka_2.10-0.8.2.2.jar:/Users/zhenhao.li/.m2/repository/com/101tec/zkclient/0.7/zkclient-0.7.jar:/Users/zhenhao.li/.m2/repository/com/google/code/findbugs/jsr305/1.3.9/jsr305-1.3.9.jar:/Users/zhenhao.li/.m2/repository/org/apache/kafka/kafka-clients/0.8.2.2/kafka-clients-0.8.2.2.jar:/Users/zhenhao.li/.m2/repository/net/jpountz/lz4/lz4/1.2.0/lz4-1.2.0.jar:/Users/zhenhao.li/.m2/repository/com/yammer/metrics/metrics-core/2.2.0/metrics-core-2.2.0.jar:/Applications/IntelliJ IDEA CE.app/Contents/lib/idea_rt.jar" com.intellij.rt.execution.application.AppMain com.sky.ethan.stream.example.FlinkKafkaStreaming
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/flink/util/Preconditions
at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumerBase.<init>(FlinkKafkaConsumerBase.java:113)
at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08.<init>(FlinkKafkaConsumer08.java:180)
at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08.<init>(FlinkKafkaConsumer08.java:164)
at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer08.<init>(FlinkKafkaConsumer08.java:131)
at com.sky.ethan.stream.example.FlinkKafkaStreaming$.main(Example.scala:32)
at com.sky.ethan.stream.example.FlinkKafkaStreaming.main(Example.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
Caused by: java.lang.ClassNotFoundException: org.apache.flink.util.Preconditions
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 11 more
Process finished with exit code 1
I played around with it and found that I couldn't call the FlinkKafkaConsumer08 constructor. I am using Kafka 0.8.2, Java 7 and Scala 2.10.
What might be wrong here?
It looks like a version mismatch between the Flink version you compiled your program against and the version on your runtime classpath: the classpath above contains flink-core 1.0.2 alongside flink-connector-kafka-0.8 1.1-SNAPSHOT, and the newer connector references org.apache.flink.util.Preconditions, which the 1.0.2 jars do not provide. Make sure all Flink artifacts use the same version.
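A minimal sbt sketch of the idea, pinning every Flink artifact to one version through a single value (the same applies to a Maven properties entry; the version shown is illustrative):

val flinkVersion = "1.0.2"

libraryDependencies ++= Seq(
  "org.apache.flink" %% "flink-streaming-scala"     % flinkVersion,
  "org.apache.flink" %% "flink-clients"             % flinkVersion,
  "org.apache.flink" %% "flink-connector-kafka-0.8" % flinkVersion
)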
I am trying to perform the following operations, but I am getting an error when using the groupByKey transformation. I'm using Spark in standalone mode.
sample.sbt contains:
name := "Spark Join"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.0"
fork := true
My Scala code:
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import java.util.Properties
object yelpDataJoin {
  def main(args: Array[String]) {
    val reviewFile = " /home/prasad/Desktop/BigData/psp150030_HW3/data/review3.csv"
    val conf = new SparkConf().setAppName("SparkJoins")
    val sc = new SparkContext(conf)
    val reviewData = sc.textFile(reviewFile, 2)
    val groupReviewData = reviewData
      .map(line => line.split("::"))
      .map(word => (word(2), (word(20), 1)))
      .groupByKey()
      .foreach(println)
  }
}
I'm getting the following error message:
15/07/20 16:10:48 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 32.0 KB, free 265.4 MB)
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/rdd/RDD$
at yelpDataJoin$.main(HW3_Question2.scala:14)
at yelpDataJoin.main(HW3_Question2.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.rdd.RDD$
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 9 more
Please let me know if I'm doing anything wrong here.
Thanks & Regards,
Prasad
Hi, I am using standalone HBase and I want to test Spark on it. There is no Hadoop on my machine.
When I try to get the count of a table using HBaseTest.scala (from the Scala examples), I get the following error:
ERROR TableInputFormat: java.io.IOException: java.lang.reflect.InvocationTargetException
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:416)
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:393)
at org.apache.hadoop.hbase.client.HConnectionManager.getConnection(HConnectionManager.java:274)
at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:194)
at org.apache.hadoop.hbase.client.HTable.<init>(HTable.java:156)
at org.apache.hadoop.hbase.mapreduce.TableInputFormat.setConf(TableInputFormat.java:101)
at org.apache.spark.rdd.NewHadoopRDD.getPartitions(NewHadoopRDD.scala:91)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:219)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:217)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:217)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1632)
at org.apache.spark.rdd.RDD.count(RDD.scala:1012)
at org.apache.spark.examples.HBaseTest$.main(HBaseTest.scala:59)
at org.apache.spark.examples.HBaseTest.main(HBaseTest.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:607)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:190)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:414)
... 23 more
Caused by: java.lang.VerifyError: class org.apache.hadoop.hbase.protobuf.generated.ClientProtos$Result overrides final method getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at org.apache.hadoop.hbase.protobuf.ProtobufUtil.<clinit>(ProtobufUtil.java:176)
at org.apache.hadoop.hbase.ClusterId.parseFrom(ClusterId.java:64)
at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:69)
at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:83)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.retrieveClusterId(HConnectionManager.java:857)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:662)
... 28 more
Exception in thread "main" java.io.IOException: No table was provided.
at org.apache.hadoop.hbase.mapreduce.TableInputFormatBase.getSplits(TableInputFormatBase.java:154)
at org.apache.spark.rdd.NewHadoopRDD.getPartitions(NewHadoopRDD.scala:95)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:219)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:217)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:217)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1632)
at org.apache.spark.rdd.RDD.count(RDD.scala:1012)
at org.apache.spark.examples.HBaseTest$.main(HBaseTest.scala:59)
at org.apache.spark.examples.HBaseTest.main(HBaseTest.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:607)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:167)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:190)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
I am not able to figure out what the issue is here.
HBaseTest.scala:
object HBaseTest {
  def main(args: Array[String]) {
    val sparkConf = new SparkConf().setAppName("HBaseTest").setMaster("local")
    val sc = new SparkContext(sparkConf)
    val conf = HBaseConfiguration.create()

    // Other options for configuring scan behavior are available. More information available at
    // http://hbase.apache.org/apidocs/org/apache/hadoop/hbase/mapreduce/TableInputFormat.html
    conf.set("zookeeper.znode.parent", "/hbase-unsecure")
    conf.set("hbase.zookeeper.quorum", "localhost")
    conf.set("hbase.zookeeper.property.clientPort", "2181")
    conf.addResource(new Path("/usr/lib/hbase/hbase-0.94.8/conf/hbase-site.xml"))
    conf.set(TableInputFormat.INPUT_TABLE, "test")

    // Initialize hBase table if necessary
    val admin = new HBaseAdmin(conf)
    if (!admin.isTableAvailable("test")) {
      print("inside if statement")
      val tableDesc = new HTableDescriptor(TableName.valueOf("test"))
      admin.createTable(tableDesc)
    }

    val hBaseRDD = sc.newAPIHadoopRDD(conf, classOf[TableInputFormat],
      classOf[org.apache.hadoop.hbase.io.ImmutableBytesWritable],
      classOf[org.apache.hadoop.hbase.client.Result])

    hBaseRDD.count()
    sc.stop()
  }
}
You are using the TableInputFormat class as the input format. TableInputFormat is part of HBase's MapReduce integration and builds on the Hadoop MapReduce API, so you need Hadoop (and its jars on the classpath) to use TableInputFormat.
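A minimal sbt sketch of the extra dependencies, assuming HBase client/server artifacts alongside Hadoop (the versions are illustrative and must match your actual HBase and Hadoop installation):

libraryDependencies ++= Seq(
  "org.apache.hadoop" % "hadoop-client" % "2.6.0",
  "org.apache.hbase"  % "hbase-client"  % "0.98.12-hadoop2",
  // hbase-server is the module that contains org.apache.hadoop.hbase.mapreduce.TableInputFormat
  "org.apache.hbase"  % "hbase-server"  % "0.98.12-hadoop2"
)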