I'm using the following code, as given in the tutorial:
http://s3.thinkaurelius.com/docs/titan/0.5.0/hbase.html
TitanGraph graph = TitanFactory.build()
        .set("storage.backend","hbase")
        .open();
I used this Maven dependency:
<dependency>
    <groupId>com.thinkaurelius.titan</groupId>
    <artifactId>titan-hbase</artifactId>
    <version>${titan.version}</version>
</dependency>
The following error is shown:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/MasterNotRunningException
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at com.thinkaurelius.titan.util.system.ConfigurationUtil.instantiate(ConfigurationUtil.java:42)
at com.thinkaurelius.titan.diskstorage.Backend.getImplementationClass(Backend.java:479)
at com.thinkaurelius.titan.diskstorage.Backend.getStorageManager(Backend.java:413)
at com.thinkaurelius.titan.graphdb.configuration.GraphDatabaseConfiguration.<init>(GraphDatabaseConfiguration.java:1320)
at com.thinkaurelius.titan.core.TitanFactory.open(TitanFactory.java:94)
at com.thinkaurelius.titan.core.TitanFactory$Builder.open(TitanFactory.java:135)
at pluradj.titan.tinkerpop3.example.JavaExample2.main(JavaExample2.java:26)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.MasterNotRunningException
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 9 more
If possible, could you tell me how to do the same for Cassandra?
See this question on how to set your runtime classpath in Eclipse: How do I set the runtime classpath in Eclipse 4.2?
The NoClassDefFoundError means that jars are missing from your runtime classpath. For this specific error, locate the lib directory of your HBase installation and add it to your classpath.
If you use Cassandra, you'll need to put the Cassandra jars on your classpath in the same way.
Titan's HBase client configuration accepts arbitrary keys from hbase-site.xml (if it's on the CLASSPATH), so I recommend putting the directory containing that file on your classpath as well.
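For concreteness, here is a minimal sketch of both configurations in code. It assumes Titan 0.5.x with titan-hbase (and, for the second part, titan-cassandra) on the classpath; the hostname values are illustrative:

// Sketch: HBase backend; extra settings are read from hbase-site.xml when it is on the CLASSPATH
TitanGraph hbaseGraph = TitanFactory.build()
        .set("storage.backend", "hbase")
        .set("storage.hostname", "127.0.0.1") // illustrative ZooKeeper quorum host
        .open();

// Sketch: Cassandra (Thrift) backend; requires the titan-cassandra dependency instead
TitanGraph cassandraGraph = TitanFactory.build()
        .set("storage.backend", "cassandrathrift")
        .set("storage.hostname", "127.0.0.1") // illustrative Cassandra node
        .open();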
Related
I'm using a spark-shell instance to test the pulling of data from a client's kafka source. To launch the instance I am using the command spark-shell --jars spark-sql-kafka-0-10_2.11-2.5.0-palantir.8.jar, kafka_2.12-2.5.0.jar, kafka-clients-2.5.0.jar (all jars are present in the working dir).
However, when I run the command val df = spark.read.format("kafka")........... after a few seconds it crashes with the below:
java.lang.NoClassDefFoundError: org/apache/spark/sql/sources/v2/StreamingWriteSupportProvider
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:455)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:367)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:411)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:344)
at java.util.ServiceLoader$LazyIterator.nextService(ServiceLoader.java:370)
at java.util.ServiceLoader$LazyIterator.next(ServiceLoader.java:404)
at java.util.ServiceLoader$1.next(ServiceLoader.java:480)
at scala.collection.convert.Wrappers$JIteratorWrapper.next(Wrappers.scala:43)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.collection.TraversableLike$class.filterImpl(TraversableLike.scala:247)
at scala.collection.TraversableLike$class.filter(TraversableLike.scala:259)
at scala.collection.AbstractTraversable.filter(Traversable.scala:104)
at org.apache.spark.sql.execution.datasources.DataSource$.lookupDataSource(DataSource.scala:533)
at org.apache.spark.sql.execution.datasources.DataSource.providingClass$lzycompute(DataSource.scala:89)
at org.apache.spark.sql.execution.datasources.DataSource.providingClass(DataSource.scala:89)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:304)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:178)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:146)
... 48 elided
Caused by: java.lang.ClassNotFoundException: org.apache.spark.sql.sources.v2.StreamingWriteSupportProvider
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 79 more
HOWEVER, if I change the order of the jars in the spark-shell command to spark-shell --jars kafka_2.12-2.5.0.jar, kafka-clients-2.5.0.jar, spark-sql-kafka-0-10_2.11-2.5.0-palantir.8.jar, it instead crashes with:
java.lang.NoClassDefFoundError: org/apache/kafka/common/serialization/ByteArrayDeserializer
at org.apache.spark.sql.kafka010.KafkaSourceProvider$.<init>(KafkaSourceProvider.scala:376)
at org.apache.spark.sql.kafka010.KafkaSourceProvider$.<clinit>(KafkaSourceProvider.scala)
at org.apache.spark.sql.kafka010.KafkaSourceProvider.validateBatchOptions(KafkaSourceProvider.scala:330)
at org.apache.spark.sql.kafka010.KafkaSourceProvider.createRelation(KafkaSourceProvider.scala:113)
at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:309)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:178)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:146)
... 48 elided
Caused by: java.lang.ClassNotFoundException: org.apache.kafka.common.serialization.ByteArrayDeserializer
at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 55 more
I am developing behind a very strict proxy managed by our client and am unable to use --packages instead. I'm at a bit of a loss here: am I unable to load all 3 dependencies at the launch of the shell? Am I missing another step somewhere?
In the Structured Streaming + Kafka Integration Guide it says:
For experimenting on spark-shell, you need to add this above library and its dependencies too when invoking spark-shell.
The library you are using seems to be customized and not publicly available in the Maven Central repository, which means I cannot look into its dependencies.
However, looking at the latest stable version 2.4.5, its dependency according to Maven Central is kafka-clients version 2.0.0.
You are also mixing two Scala versions (2.11 and 2.12) across libraries.
Use the same Scala version for all libraries; below is how to load them into spark-shell:
spark-shell --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.4.5,org.apache.kafka:kafka_2.11:2.4.1,org.apache.kafka:kafka-clients:2.4.1
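Also, since you are behind a proxy and must use --jars with local files, note that --jars expects a single comma-separated list with no spaces; with spaces, the shell treats everything after the first comma as separate arguments, and those jars are likely never attached. A sketch using local copies of the same artifacts as above (Scala versions aligned on 2.11):

spark-shell --jars spark-sql-kafka-0-10_2.11-2.4.5.jar,kafka_2.11-2.4.1.jar,kafka-clients-2.4.1.jar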
One occasionally disruptive issue is dealing with dependency conflicts in cases where a user application and Spark itself both depend on the same library. This comes up relatively rarely, but when it does, it can be vexing for users. Typically, this will manifest itself when a NoSuchMethodError, a ClassNotFoundException, or some other JVM exception related to class loading is thrown during the execution of a Spark job.
There are two solutions to this problem. The first is to modify your application to depend on the same version of the third-party library that Spark does. The second is to modify the packaging of your application using a procedure that is often called "shading." The Maven build tool supports shading through advanced configuration of the plug-in shown in Example 7-5 (in fact, the shading capability is why the plugin is named maven-shade-plugin). Shading allows you to make a second copy of the conflicting package under a different namespace and rewrites your application's code to use the renamed version. This somewhat brute-force technique is quite effective at resolving runtime dependency conflicts. For specific instructions on how to shade dependencies, see the documentation for your build tool.
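For illustration, a minimal maven-shade-plugin configuration that relocates a conflicting package might look like the sketch below; the relocated package (Guava's com.google.common) and the shaded prefix are hypothetical placeholders:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>3.2.4</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <relocations>
                    <!-- Hypothetical example: rename Guava classes so they no
                         longer clash with the copy Spark ships -->
                    <relocation>
                        <pattern>com.google.common</pattern>
                        <shadedPattern>myapp.shaded.com.google.common</shadedPattern>
                    </relocation>
                </relocations>
            </configuration>
        </execution>
    </executions>
</plugin>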
I would check the Scala version of your spark-shell, because it could be a Scala version issue:
scala> util.Properties.versionString
res3: String = version 2.11.8
If not, then check which Spark version you are using and which third-party library versions you depend on, because there is likely one that is newer or older than your Spark version supports.
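You can check the Spark version from the same shell (the value shown here is just illustrative):

scala> spark.version
res4: String = 2.4.5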
I hope it helps.
I'm using JetBrains IntelliJ IDEA with the Scala plugin and I'm trying to execute some code that uses Apache Spark. However, whenever I try to run it, the code doesn't execute properly because of this exception:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream
at org.apache.spark.SparkConf.loadFromSystemProperties(SparkConf.scala:76)
at org.apache.spark.SparkConf.<init>(SparkConf.scala:71)
at org.apache.spark.SparkConf.<init>(SparkConf.scala:58)
at KMeans$.main(kmeans.scala:71)
at KMeans.main(kmeans.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.FSDataInputStream
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 5 more
Running spark-shell from the terminal doesn't give me any problems; the warning "unable to load native-hadoop library for your platform" doesn't even appear.
I've read some questions similar to mine, but in those cases they had problems with spark-shell or with cluster configuration.
I was using spark-core_2.12-2.4.3.jar without its dependencies. I solved the issue by adding the spark-core library through Maven, which pulled in all of its transitive dependencies automatically.
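For reference, the declaration would look something like this sketch, matching the jar I had been using; adjust the artifact and version to your own setup:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>2.4.3</version>
</dependency>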
I am trying to create an executable jar from my Scala project, so I followed a tutorial. It suggested creating a Java main class within Eclipse which calls the Scala entry point. This works fine when executing within Eclipse, and after adding this class I was able to export an executable jar. However, it won't work with
java -jar myjar.jar
I made sure to activate "Package required library into jar" when exporting. My Java main class looks like this (where Driver is also located in the package core):
package core;

public class Main {
    public static void main(String[] args) {
        Driver.main(args);
    }
}
And when executing the exported jar, the following error is thrown:
Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.eclipse.jdt.internal.jarinjarloader.JarRsrcLoader.main(JarRsrcLoader.java:58)
Caused by: java.lang.NoClassDefFoundError: scala/collection/Seq
at core.Driver$.main(Driver.scala:14)
at core.Driver.main(Driver.scala)
at core.Main.main(Main.java:5)
... 5 more
Caused by: java.lang.ClassNotFoundException: scala.collection.Seq
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
You could use the One-JAR tool to guarantee all the necessary jars end up in the jar file; I suspect the scala-library jar is either not in it or not on the classpath of your command line.
Maven: https://code.google.com/p/onejar-maven-plugin/
SBT: https://github.com/sbt/sbt-onejar
If you aren't using either Maven or SBT, I would suggest switching to one of them, as they are the main supported build systems for Scala.
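As a quick check that the missing piece really is the Scala runtime, you can also run the exported jar with scala-library.jar explicitly on the classpath; the path below is illustrative (use ; instead of : as the separator on Windows):

# Sketch: put the Scala runtime next to your jar on the classpath and name the main class
java -cp "myjar.jar:/path/to/scala-library.jar" core.Main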
I am working on code which uses OpenNLP. My code runs perfectly in Eclipse, but when I run its jar on a cluster, I get the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: opennlp/tools/util/ObjectStream
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:270)
at org.apache.hadoop.util.RunJar.main(RunJar.java:153)
Caused by: java.lang.ClassNotFoundException: opennlp.tools.util.ObjectStream
at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
at java.lang.ClassLoader.loadClass(ClassLoader.java:323)
at java.lang.ClassLoader.loadClass(ClassLoader.java:268)
... 3 more
You need to have the OpenNLP jar available on the classpath of your tasks. There are several options:
-libjars and HADOOP_CLASSPATH (sketched after this list); see Using the libjars option with Hadoop
'fat jar': build a jar that contains all the necessary jars, submit the fat jar instead
install the 3rd-party jars on all nodes (i.e. make the cluster "3rd-party aware")
use the HDFS distributed cache and download the necessary jars in your code
For a lengthier discussion see How-to: Include Third-Party Libraries in Your MapReduce Job
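For the first option, the client-side invocation might look like the sketch below. It assumes your main class runs through ToolRunner/GenericOptionsParser (otherwise -libjars is silently ignored); the jar name, paths, and class name are illustrative:

# Make OpenNLP visible to the local JVM that submits the job
export HADOOP_CLASSPATH=/path/to/opennlp-tools-1.5.3.jar
# Ship the jar to the task nodes; works only if the driver parses generic options
hadoop jar myjob.jar com.example.MyJob -libjars /path/to/opennlp-tools-1.5.3.jar in out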
We are trying to create a functional example with Hibernate, JBoss7, Beans and Servlets using Eclipse as an IDE.
In another example project we were able to make working servlets and to use Hibernate.
We created three Eclipse projects:
a Dynamic Web project, an Enterprise Java Beans (EJB) project, and an EAR project connecting the other two.
Running a simple Test.java file which uses Hibernate (and worked in other projects), we get the following errors:
Initial SessionFactory creation failed.org.hibernate.cfg.beanvalidation.IntegrationException: Error activating Bean Validation integration
Exception in thread "main" java.lang.ExceptionInInitializerError
at exercicio.SessionFactoryUtil.<clinit>(SessionFactoryUtil.java:20)
at exercicio.DataBaseInterface.<init>(DataBaseInterface.java:17)
at exercicio.Test.main(Test.java:9)
Caused by: org.hibernate.cfg.beanvalidation.IntegrationException: Error activating Bean Validation integration
at org.hibernate.cfg.beanvalidation.BeanValidationIntegrator.integrate(BeanValidationIntegrator.java:156)
at org.hibernate.internal.SessionFactoryImpl.<init>(SessionFactoryImpl.java:303)
at org.hibernate.cfg.Configuration.buildSessionFactory(Configuration.java:1750)
at exercicio.SessionFactoryUtil.configureSessionFactory(SessionFactoryUtil.java:32)
at exercicio.SessionFactoryUtil.<clinit>(SessionFactoryUtil.java:17)
... 2 more
Caused by: java.lang.NoClassDefFoundError: org/slf4j/LoggerFactory
at org.hibernate.validator.util.LoggerFactory.make(LoggerFactory.java:29)
at org.hibernate.validator.util.Version.<clinit>(Version.java:24)
at org.hibernate.validator.engine.ConfigurationImpl.<clinit>(ConfigurationImpl.java:59)
at org.hibernate.validator.HibernateValidator.createGenericConfiguration(HibernateValidator.java:41)
at javax.validation.Validation$GenericBootstrapImpl.configure(Validation.java:269)
at javax.validation.Validation.buildDefaultValidatorFactory(Validation.java:111)
at org.hibernate.cfg.beanvalidation.TypeSafeActivator.getValidatorFactory(TypeSafeActivator.java:445)
at org.hibernate.cfg.beanvalidation.TypeSafeActivator.activate(TypeSafeActivator.java:96)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.hibernate.cfg.beanvalidation.BeanValidationIntegrator.integrate(BeanValidationIntegrator.java:150)
... 6 more
Caused by: java.lang.ClassNotFoundException: org.slf4j.LoggerFactory
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
... 19 more
The problem now seems to be around the jars that we included using project Properties > Java Build Path > Add External Jars for the EJB project:
We added slf4j-jdk14-1.7.5.jar and the Hibernate jars, but the exception log appears to indicate that we are still missing some jar.
If we remove the jars from the Java Build Path, the exception log is the same. So we think the jars are not being deployed correctly, or some extra configuration is required... even though they appear in the /lib folder inside the EJB project deployment folder.
Is there any procedure we are missing, or any probable causes to investigate? I'll add more info if needed. Thanks.
When using JBoss 7, not all jars are available to the application by default (slf4j, for example). You have to specify which modules should be included on the application's classpath by putting this information into a descriptor file inside your application.
I always do it by adding a jboss-deployment-structure.xml to my application (the EAR, in your case).
Here's how: https://docs.jboss.org/author/display/AS7/Class+Loading+in+AS7
(The name of the module in your case would be org.slf4j.)
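A minimal jboss-deployment-structure.xml along these lines (placed in the EAR's META-INF) would declare the dependency; this is a sketch, so adjust the module list to what your application actually needs:

<jboss-deployment-structure>
    <deployment>
        <dependencies>
            <!-- Expose the slf4j module that ships with JBoss AS7 -->
            <module name="org.slf4j"/>
        </dependencies>
    </deployment>
</jboss-deployment-structure>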
Throwing the slf4j jar in the libs folder can create a conflict with the slf4j module already included in JBoss 7 by default... you have to add the dependency to your jboss-deployment-structure.xml instead.
But if your Hibernate jars are in the libs folder, they could fail to resolve the dependency as well... the way to go in JBoss 7 is to make a module for Hibernate too. The nightly builds of JBoss 7 already include a Hibernate module, so you could copy the module from there and then add it to your jboss-deployment-structure.xml.