IConversionManager not found in documents4j while converting docx to pdf (have all the binaries)

Exception in thread "main" java.lang.NoClassDefFoundError:
com/documents4j/conversion/IConversionManager at
com.test.resume.Documents4jPOC.main(Documents4jPOC.java:29)
Caused by: java.lang.ClassNotFoundException:
com.documents4j.conversion.IConversionManager
at java.net.URLClassLoader$1.run(URLClassLoader.java:428)
at java.net.URLClassLoader$1.run(URLClassLoader.java:417)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:416)
at java.lang.ClassLoader.loadClass(ClassLoader.java:494)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:357)
at java.lang.ClassLoader.loadClass(ClassLoader.java:427)
I'm getting the above exception when trying to convert docx to pdf using documents4j. What am I missing?

You are missing the API artifact, which contains this class. Ideally, use Maven or another dependency management tool, which avoids such errors.
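For example, a Maven setup for local conversion with documents4j might look like the following sketch (the artifact IDs come from the documents4j project; the version shown is an assumption, so check Maven Central for the current release):

```xml
<!-- local converter; pulls in documents4j-api (and IConversionManager) transitively -->
<dependency>
    <groupId>com.documents4j</groupId>
    <artifactId>documents4j-local</artifactId>
    <version>1.1.12</version> <!-- example version; verify on Maven Central -->
</dependency>
<!-- MS Word transformer, needed for docx-to-pdf on a machine with Word installed -->
<dependency>
    <groupId>com.documents4j</groupId>
    <artifactId>documents4j-transformer-msoffice-word</artifactId>
    <version>1.1.12</version>
</dependency>
```

With these declared, Maven places the API jar on the runtime classpath automatically instead of you tracking binaries by hand.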

Related

Solving Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream

I'm using JetBrains IntelliJ IDEA with the Scala plugin and I'm trying to execute some code that uses Apache Spark. However, whenever I try to run it, the code doesn't execute properly because of the exception
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream
at org.apache.spark.SparkConf.loadFromSystemProperties(SparkConf.scala:76)
at org.apache.spark.SparkConf.<init>(SparkConf.scala:71)
at org.apache.spark.SparkConf.<init>(SparkConf.scala:58)
at KMeans$.main(kmeans.scala:71)
at KMeans.main(kmeans.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.FSDataInputStream
at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 5 more
Running spark-shell from the terminal doesn't give me any problems; the warning "unable to load native-hadoop library for your platform" doesn't appear either.
I've read some questions similar to mine, but in those cases they had problems with spark-shell or with the cluster configuration.
I was using spark-core_2.12-2.4.3.jar without its dependencies. I solved the issue by adding the spark-core library through Maven, which automatically pulled in all the dependencies.
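Concretely, a dependency declaration along these lines should work (matching the jar version mentioned above; hadoop-client, which provides FSDataInputStream, comes in transitively):

```xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>2.4.3</version>
</dependency>
```

The single jar from the Spark distribution does not bundle its Hadoop dependencies, which is why the class was missing at runtime.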

java.lang.ClassNotFoundException: fr.bordeaux.contactapp.Exceptions.AuthenticationException after renaming package

I moved/renamed the package containing my servlets and now I get this error:
java.lang.ClassNotFoundException: fr.bordeaux.contactapp.Exceptions.AuthenticationException
I'm new to Eclipse and Java. I undid the rename, but the errors persist.
Does anybody have an idea what I have to do?
Thanks for your help.
Here is the trace:
Caused by: java.lang.NoClassDefFoundError: fr/bordeaux/contactapp/Exceptions/AuthenticationException
at java.lang.Class.getDeclaredFields0(Native Method)
at java.lang.Class.privateGetDeclaredFields(Class.java:2499)
at java.lang.Class.getDeclaredFields(Class.java:1811)
at org.apache.catalina.util.Introspection.getDeclaredFields(Introspection.java:106)
at org.apache.catalina.startup.WebAnnotationSet.loadFieldsAnnotation(WebAnnotationSet.java:256)
at org.apache.catalina.startup.WebAnnotationSet.loadApplicationServletAnnotations(WebAnnotationSet.java:132)
at org.apache.catalina.startup.WebAnnotationSet.loadApplicationAnnotations(WebAnnotationSet.java:65)
at org.apache.catalina.startup.ContextConfig.applicationAnnotationsConfig(ContextConfig.java:334)
at org.apache.catalina.startup.ContextConfig.configureStart(ContextConfig.java:774)
at org.apache.catalina.startup.ContextConfig.lifecycleEvent(ContextConfig.java:305)
at org.apache.catalina.util.LifecycleSupport.fireLifecycleEvent(LifecycleSupport.java:117)
at org.apache.catalina.util.LifecycleBase.fireLifecycleEvent(LifecycleBase.java:90)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5053)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
... 6 more
Caused by: java.lang.ClassNotFoundException: fr.bordeaux.contactapp.Exceptions.AuthenticationException
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1305)
at org.apache.catalina.loader.WebappClassLoaderBase.loadClass(WebappClassLoaderBase.java:1157)
... 20 more
I'm using an older version of Tomcat, so this may or may not help.
Check for a file named web.xml within the WEB-INF folder. I'm using NetBeans, so for me it's under src/main/webapp/WEB-INF - Eclipse may place it elsewhere.
Within that file, you specify the classes used to run your servlets or filters. It's possible that your refactoring did not update the contents of the web.xml file.
Good luck!
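For instance, a servlet declaration in web.xml references the fully qualified class name, which must match the package after the rename (the servlet name and class below are hypothetical examples, not taken from the question):

```xml
<servlet>
    <servlet-name>ContactServlet</servlet-name>
    <!-- this must be updated by hand if the IDE refactoring missed it -->
    <servlet-class>fr.bordeaux.contactapp.servlets.ContactServlet</servlet-class>
</servlet>
```

The same applies to any listener, filter, or init-param entries that name classes from the renamed package.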
What you need to focus on in your error stack trace is:
Caused by: java.lang.NoClassDefFoundError: fr/bordeaux/contactapp/Exceptions/AuthenticationException
It seems that you are still referencing the class AuthenticationException in the package fr.bordeaux.contactapp.Exceptions, and this class no longer exists there...
PS1: Java is case-sensitive.
PS2: by convention, package names in Java are lowercase.
In my case, the error was fixed by adding "/" to the url-pattern, which I had forgotten to use. After adding "/", the server started successfully. E.g.:
<servlet-mapping>
    <servlet-name>imp_servlet</servlet-name>
    <url-pattern>/implement_servlet</url-pattern>
</servlet-mapping>

MapReduce code working in Eclipse but not on the cluster

I am working on code which uses OpenNLP. My code runs in Eclipse perfectly, but when I run its jar on a cluster, I get the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: opennlp/tools/util/ObjectStream
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:270)
at org.apache.hadoop.util.RunJar.main(RunJar.java:153)
Caused by: java.lang.ClassNotFoundException: opennlp.tools.util.ObjectStream
at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
at java.lang.ClassLoader.loadClass(ClassLoader.java:323)
at java.lang.ClassLoader.loadClass(ClassLoader.java:268)
... 3 more
You need to have the OpenNLP jar available on the classpath of your tasks. There are several options:
-libjars and HADOOP_CLASSPATH, see Using the libjars option with Hadoop
'fat jar': build a jar that contains all the necessary jars, and submit the fat jar instead
install the 3rd-party jars on all nodes (i.e. make the cluster '3rd-party aware')
use the HDFS distributed cache and download the necessary jars in your code
For a lengthier discussion, see How-to: Include Third-Party Libraries in Your MapReduce Job
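The first option can be sketched like this (the paths, jar names, and driver class here are hypothetical placeholders; adapt them to your job):

```shell
# make the OpenNLP classes visible to the client JVM that submits the job
export HADOOP_CLASSPATH=/path/to/opennlp-tools.jar

# -libjars ships the jar to the task JVMs via the distributed cache
hadoop jar myjob.jar com.example.MyDriver \
    -libjars /path/to/opennlp-tools.jar input output
```

Note that -libjars is only honoured when your driver parses its arguments through ToolRunner/GenericOptionsParser; otherwise the option is silently passed to your own main method.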

JOGL throwing ClassNotFoundException?

I've seen this question brought up a couple of times on this website, but never really seen a clear answer, so excuse me for repeating it. While programming with JOGL and Java3D I've encountered some errors. I was trying to create a project that I might eventually put on the Android app store. I began the project just using Java3D and JOGL and putting them in the system library on my Mac, where they worked fine. Then, to make the project portable, I moved the Java3D and JOGL files inside the project so they could be compiled into a jar file that would be runnable without needing to install Java3D and JOGL. But then every time I ran the project it threw this error:
Exception in thread "main" java.lang.NoClassDefFoundError: javax/media/opengl/GL
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:190)
at javax.media.j3d.Pipeline$PipelineCreator.run(Pipeline.java:73)
at javax.media.j3d.Pipeline$PipelineCreator.run(Pipeline.java:61)
at java.security.AccessController.doPrivileged(Native Method)
at javax.media.j3d.Pipeline.createPipeline(Pipeline.java:90)
at javax.media.j3d.MasterControl.loadLibraries(MasterControl.java:832)
at javax.media.j3d.VirtualUniverse.<clinit>(VirtualUniverse.java:274)
at javax.media.j3d.GroupRetained.<init>(GroupRetained.java:155)
at javax.media.j3d.TransformGroupRetained.<init>(TransformGroupRetained.java:116)
at javax.media.j3d.TransformGroup.createRetained(TransformGroup.java:114)
at javax.media.j3d.SceneGraphObject.<init>(SceneGraphObject.java:114)
at javax.media.j3d.Node.<init>(Node.java:172)
at javax.media.j3d.Group.<init>(Group.java:549)
at javax.media.j3d.TransformGroup.<init>(TransformGroup.java:87)
at src.Project.<clinit>(Project.java:47)
at src.ProjectPanel.<clinit>(ProjectPanel.java:8)
Caused by: java.lang.ClassNotFoundException: javax.media.opengl.GL
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 17 more
I'm using Eclipse as an IDE and have the jogl-all.jar and gluegen-rt.jar files in the classpath of the project, as well as all of the required Java3D jars, but it cannot find the GL class for some reason.
Thanks in advance for the help.
When you export your application as a Runnable JAR, use one of these Library handling options:
Copy required libraries into a sub-folder next to the generated JAR
or
Package required libraries into generated JAR
More information is available in the jogamp jogl wiki:
http://jogamp.org/wiki/index.php/Setting_up_a_JogAmp_project_in_your_favorite_IDE
http://jogamp.org/wiki/index.php/JogAmp_JAR_File_Handling
Also, you will need to run your application from the command line with java -jar yourapp.jar.

Import LWJGL.jar error

I'm getting a new error:
Starting up
Exception in thread "main" java.lang.UnsatisfiedLinkError: no lwjgl in java.library.path
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1734)
at java.lang.Runtime.loadLibrary0(Runtime.java:823)
at java.lang.System.loadLibrary(System.java:1028)
at org.lwjgl.Sys$1.run(Sys.java:75)
at java.security.AccessController.doPrivileged(Native Method)
at org.lwjgl.Sys.doLoadLibrary(Sys.java:68)
at org.lwjgl.Sys.loadLibrary(Sys.java:84)
at org.lwjgl.Sys.<clinit>(Sys.java:101)
at org.lwjgl.opengl.Display.<clinit>(Display.java:128)
at org.newdawn.slick.AppGameContainer$1.run(AppGameContainer.java:39)
at java.security.AccessController.doPrivileged(Native Method)
at org.newdawn.slick.AppGameContainer.<clinit>(AppGameContainer.java:36)
at GameStarter$.main(SnakeGame.scala:43)
at GameStarter.main(SnakeGame.scala)
I have imported the lwjgl.jar library.
This is not a problem with Scala. LWJGL consists of a part written in Java and another part that is native. The native part comes in the form of a .dll (Windows) or .so (*nix) file. You need to put the path to these native libraries in the java.library.path system property.
See the FAQ: http://lwjgl.org/faq.php#faq1
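A minimal sketch of such an invocation, assuming the native files were unpacked into a hypothetical natives/ folder next to the jars (on Windows, use ; instead of : as the classpath separator):

```shell
# tell the JVM where lwjgl.dll / liblwjgl.so live before any LWJGL class loads;
# the stack trace above shows Slick2D on the classpath as well
java -Djava.library.path=natives/ -cp "lwjgl.jar:slick.jar:." GameStarter
```

The property must be set when the JVM starts, because org.lwjgl.Sys loads the native library in a static initializer.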