Creating an uber jar for Spring - Eclipse

This is a follow-up to this question: creating an uber jar with spring dependencies
I have created a web service using Eclipse, which is running on Windows. I need to run it as a jar on a Solaris station and there I get the ClassNotFoundException:
Caused by: java.lang.ClassNotFoundException: org.springframework.boot.SpringApplication
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
I want to create a big jar with all dependencies, but I don't understand the answer to that question above... where do I add what he wrote? And do I then just export a jar as usual using Eclipse's export option?

What no one ever said in the answers to other questions is that you need to use Maven to create the jar, not Eclipse's export-to-jar option.
What you need to do is:
1) Download Maven from https://maven.apache.org/download.cgi
2) The Maven dir contains a 'bin' folder. Add this folder to your "Path" environment variable (on Windows 8: right-click "This PC" -> Properties -> Advanced System Settings -> Environment Variables -> under System Variables find "Path" -> double-click it and append the bin folder's path, the same way the other paths there are listed).
3) Open CMD.
4) Navigate to your project's folder (the one containing pom.xml).
5) Type mvn package.
The jar file is created inside the "target" folder.
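For this to work, your pom.xml must build an executable jar. Since your stack trace shows Spring Boot, that is what the spring-boot-maven-plugin's repackage goal does; a minimal sketch of the relevant pom.xml section (the plugin version is inherited if you use spring-boot-starter-parent):

<build>
  <plugins>
    <plugin>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-maven-plugin</artifactId>
      <executions>
        <execution>
          <!-- "repackage" rewrites the plain jar produced by "mvn package"
               into a self-contained executable jar with all dependencies -->
          <goals>
            <goal>repackage</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>

You can then run the result on the Solaris station with java -jar target/yourapp.jar (the jar name depends on your artifactId and version).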
Good luck

Related

Classes not auto compiling in Eclipse

I am using Eclipse. In struts.xml, if I use:
<action name="hello">
<result>
/hello.jsp
</result>
</action>
This works when I access localhost:8080/app/hello
But if I add a self-defined action class like:
<action name="hello" class="com.hah">
<result>
/hello.jsp
</result>
</action>
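(For context, com.hah here stands for a minimal self-defined action class, roughly like this sketch, which assumes the usual ActionSupport base class:)

package com;

import com.opensymphony.xwork2.ActionSupport;

// minimal Struts 2 action; returning SUCCESS selects the <result> above
public class hah extends ActionSupport {
    @Override
    public String execute() throws Exception {
        return SUCCESS;
    }
}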
hah.java does not compile automatically, and it gives me this exception:
Unable to load configuration. - action - file:/D:/workspace/.metadata/.plugins/org.eclipse.wst.server.core/tmp0/wtpwebapps/Struts2/WEB-INF/classes/struts.xml:17:46
at com.opensymphony.xwork2.config.ConfigurationManager.getConfiguration(ConfigurationManager.java:58)
at org.apache.struts2.dispatcher.Dispatcher.init_PreloadConfiguration(Dispatcher.java:360)
at org.apache.struts2.dispatcher.Dispatcher.init(Dispatcher.java:403)
at org.apache.struts2.dispatcher.ng.InitOperations.initDispatcher(InitOperations.java:69)
at org.apache.struts2.dispatcher.ng.filter.StrutsPrepareAndExecuteFilter.init(StrutsPrepareAndExecuteFilter.java:48)
at org.apache.catalina.core.ApplicationFilterConfig.initFilter(ApplicationFilterConfig.java:279)
at org.apache.catalina.core.ApplicationFilterConfig.getFilter(ApplicationFilterConfig.java:260)
at org.apache.catalina.core.ApplicationFilterConfig.<init>(ApplicationFilterConfig.java:105)
at org.apache.catalina.core.StandardContext.filterStart(StandardContext.java:4658)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5277)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:147)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1408)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1398)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Caused by: Action class [com.hah] not found - action - file:/D:/workspace/.metadata/.plugins/org.eclipse.wst.server.core/tmp0/wtpwebapps/Struts2/WEB-INF/classes/struts.xml:17:46
at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.verifyAction(XmlConfigurationProvider.java:405)
at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.addAction(XmlConfigurationProvider.java:355)
at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.addPackage(XmlConfigurationProvider.java:460)
at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.loadPackages(XmlConfigurationProvider.java:265)
at org.apache.struts2.config.StrutsXmlConfigurationProvider.loadPackages(StrutsXmlConfigurationProvider.java:111)
at com.opensymphony.xwork2.config.impl.DefaultConfiguration.reloadContainer(DefaultConfiguration.java:189)
at com.opensymphony.xwork2.config.ConfigurationManager.getConfiguration(ConfigurationManager.java:55)
... 16 more
But I can make it work properly by compiling hah.java manually (using javac hah.java) and moving the compiled hah.class into the /D:/workspace/.metadata/.plugins/org.eclipse.wst.server.core/tmp0/wtpwebapps/Struts2/WEB-INF/classes/ folder.
I have checked Build Automatically, and there is an entry mapping /src to WEB-INF/classes in Deployment Assembly.
So the problem is that the action class does not compile automatically. What can I do to make it compile when Tomcat starts?
PS: In the end I reinstalled Eclipse for Java EE and used struts-2.3.29, and it works.
But I still do not know why...
Check whether Project > Build Automatically is selected.
Also, right-click the project, open its Properties, and select Deployment Assembly; check that the project's compiled classes are copied into WEB-INF/classes.
If these do not solve it, I suggest you create an empty project from the Dynamic Web Project template and copy your files into this new project.
I finally found a solution to this.
Here's what was happening: I had the exact same problem. Eclipse was not compiling the Java class into a .class file, even though the package folders were being created under the path below:
[workspace_root]\.metadata\.plugins\org.eclipse.wst.server.core\tmp0\wtpwebapps\2_StrutsReadingProject\WEB-INF\classes\
I tried manually compiling the class at the cmd prompt. It turns out I was missing JAR files in the BUILD PATH. All my JARs were correct and there was no duplication, but it seems Eclipse needs these JARs both in your build path and in your WEB-INF/lib folder.
After I added the JARs in both places and built the project again from Eclipse, the classes were compiled properly.
So here is how to add JARs to your Eclipse Build Path: right-click the project node in the "Project Explorer" > Build Path > Configure Build Path > switch to the "Libraries" tab > click "Add External JARs" > browse and select all your required JARs > click "Apply" > click "OK".
Hopefully this will solve this error for many people! If it doesn't, try compiling the class manually; the CMD prompt will show you the exact errors.
MORAL OF THE STORY: Eclipse does not show any errors in such a scenario, where the source of error is in its own operations!
I had the same problem; apparently this can be caused by deleting a library file (.jar) from disk (i.e. from the lib folder) without removing it from the build path.
How I resolved it, after deleting the .jar library from disk:
1) Open the Java Build Path: in Project Explorer, right-click the project name and choose Properties, then in the Properties window choose "Java Build Path".
2) Locate the respective entry (the file name you deleted) and click Remove.
3) Click "Apply and Close".
This will automatically rebuild the project if "Build Automatically" is selected (Menu: Project>"Build Automatically").

Eclipse: confusing add to Build Path options

I'm not a "real" developer, but I should at least be able to write some code and add some Jars to the Eclipse build path without spending hours trying to figure out whether the Jars are actually in the Build Path.
My problem (error below) was resolved in the question "NoClassDefFoundError, cannot run MapReduceColorCount (Avro 1.7.7)" by adding the correct Jars.
[cloudera#localhost ~]$ hadoop jar avroColorCount.jar exos.MapReduceColorCount2 inavro01 outavro01
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/avro/mapreduce/AvroKeyInputFormat
at exos.MapReduceColorCount2.run(MapReduceColorCount2.java:71)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
at exos.MapReduceColorCount2.main(MapReduceColorCount2.java:86)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
The following are the different ways I've tried to add Jars to the build path:
1. Maven: adding dependencies through the POM file; they appear afterwards under "Maven Dependencies" (see the POM sketch after this list).
2. "Configure Build Path": the Jars are actually located in my local file system, thus I add the (library) folders, and the folders appear under the "Referenced Libraries".
3. Create a "lib" folder in the project folder, copy/paste the Jars (located in my local file system), do a project Refresh (the lib folder appears in the Package Explorer), select all Jars and right-click "Add to Build Path"
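For reference, method 1 would look roughly like this in the POM (the coordinates below are illustrative for the stock Avro artifacts; the classifier varies by Avro version, and the CDH builds would additionally need Cloudera's repository configured):

<dependencies>
  <dependency>
    <groupId>org.apache.avro</groupId>
    <artifactId>avro</artifactId>
    <version>1.7.5</version>
  </dependency>
  <!-- the mapreduce classes (e.g. AvroKeyInputFormat) live in avro-mapred -->
  <dependency>
    <groupId>org.apache.avro</groupId>
    <artifactId>avro-mapred</artifactId>
    <version>1.7.5</version>
    <classifier>hadoop2</classifier>
  </dependency>
</dependencies>

Note that a POM dependency, like the other build-path methods, mainly affects the compile-time classpath; getting the jars onto the cluster at run time is a separate step, which is what the workaround further below addresses.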
I confirm that my code shows no warnings/errors with any of these methods. I usually do an "Export ..." of the Jar file in order to execute it.
Example: I've tried adding to the Build Path external Jars from Cloudera's CDH5 (Hadoop 2.3.0-cdh5.1.2 and Avro 1.7.5-cdh5.1.2), which are located locally in /opt/lib.
The only method that really worked was method 3. Why doesn't it work with methods 1 or 2?
Thank you in advance for your support
Later I could not reproduce the success with method 3: I received a "cannot cast to namespace.customClass" error instead of the "NoClassDefFoundError" error.
I found an answer for the latter error: a workaround based on exporting two variables:
export LIBJARS=avrojar1,avrojar2,jar3
export HADOOP_CLASSPATH=avrojar1:avrojar2:jar3
and then running the hadoop jar command with -libjars ${LIBJARS}.
This was tested with method 1. and method 3. respectively.
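Putting the workaround together with the original command, it looks something like this (the avro jar names below are illustrative):

# comma-separated list for -libjars, colon-separated for the local client classpath
export LIBJARS=/opt/lib/avro-1.7.5-cdh5.1.2.jar,/opt/lib/avro-mapred-1.7.5-cdh5.1.2.jar
export HADOOP_CLASSPATH=/opt/lib/avro-1.7.5-cdh5.1.2.jar:/opt/lib/avro-mapred-1.7.5-cdh5.1.2.jar
# the generic -libjars option must come before the job's own arguments,
# so that ToolRunner/GenericOptionsParser can pick it up
hadoop jar avroColorCount.jar exos.MapReduceColorCount2 -libjars ${LIBJARS} inavro01 outavro01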
In conclusion, my case was specific to the avro-related Jars only.
Thanks

Access configuration resources in Scala IDE

Some of my colleagues use Eclipse 3.7.2 and Scala IDE 2.1 for development. I want to use Typesafe's config module for application configuration, using the convention-based default configuration location. According to the examples and documentation, the default config can be found at the following path relative to the project root:
/src/main/resources/application.conf
But when I run my project using Scala IDE's Scala Application loader, the SimpleConfig type is unable to load any configuration values set in this file. An alternative is to pass in a config-file system property via sbt, but I don't want to have to explicitly set this path somewhere. Can anyone point out what I'm doing wrong?
Exception in thread "main" java.lang.ExceptionInInitializerError
at com.foo.dataservices.MyServer.main(MyServer.scala)
Caused by: com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'bar'
at com.typesafe.config.impl.SimpleConfig.findKey(SimpleConfig.java:115)
at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:138)
at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:150)
at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:155)
at com.typesafe.config.impl.SimpleConfig.getConfigNumber(SimpleConfig.java:170)
at com.typesafe.config.impl.SimpleConfig.getInt(SimpleConfig.java:181)
You need to add your resources folder to your Java build path:
Right-click on your project in the Project Explorer
Properties -> click Java Build Path -> select the "Source" tab
Click "Add Folder..." and add your src/main/resources folder
Update: if you are using the sbteclipse plugin, you can configure it to add the resources folder to the classpath automatically:
(from: sbteclipse docs)
EclipseKeys.createSrc := EclipseCreateSrc.Default + EclipseCreateSrc.Resource
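Once the resources folder is on the build path, the convention-based lookup needs no system property at all. A minimal sketch of the consuming side (using the standard ConfigFactory API; the object name and the key "bar" are taken from the stack trace above):

import com.typesafe.config.ConfigFactory

object MyServer {
  // ConfigFactory.load() looks for application.conf on the classpath,
  // which is why src/main/resources must be a build-path source folder
  private val config = ConfigFactory.load()

  def main(args: Array[String]): Unit = {
    println(config.getInt("bar")) // throws ConfigException.Missing when the file isn't visible
  }
}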

Execute Schemagen (Jena) from the command line / classpath setting

I am learning the Jena API and I want to use schemagen to create classes like the ones in the package com.hp.hpl.jena.vocabulary, but for my own vocabulary.
I downloaded Jena from http://www.apache.org/dist/incubator/jena/apache-jena-2.7.0-incubating/, unzipped it, and left it as it is.
C:\Users\moi\NetBeansProjects\apache-jena-2.7.0-incubating\apache-jena-2.7.0-incubating
is the folder containing the bat folder, the bin folder, the javadoc-arq folder, etc.
I tested Jena in one of my projects using all the libraries in C:\Users\moi\NetBeansProjects\apache-jena-2.7.0-incubating\apache-jena-2.7.0-incubating\lib with a relative link, and it works.
To make it simpler to use from the command line, I moved my file "MyKnowledgeBase.rdf" into the lib folder.
I tried from the lib folder
java jena.schemagen -i "myKnowledgeBase.rdf"
and get this
Exception in thread "main" java.lang.NoClassDefFoundError: jena/schemagen
Caused by: java.lang.ClassNotFoundException: jena.schemagen
at java.net.URLClassLoader$1.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
Could not find the main class: jena.schemagen. Program will exit.
So I tried to set the classpath:
C:\Users\moi\NetBeansProjects\apache-jena-2.7.0-incubating\apache-jena-2.7.0-incubating\lib>
set CLASSPATH=commons-codec-1.5.jar;httpclient-4.1.2.jar;httpcore-4.1.3.jar;icu4j3.4.4.jar;jena.arq-2.9.0-incubating.jar;jena.core-2.7.0-incubating.jar;jena.iri0.9.0-incubating.jar;log4j-1.2.16.jar;slf4j-api-1.6.4.jar;slf4j-log4j12-1.6.4.jar;xercesImpl-2.10.0.jar; xml-apis-1.4.01.jar;
But I still have the same error. I also tried with
java -cp commons-codec-1.5.jar;httpclient-4.1.2.jar;httpcore-4.1.3.jar;icu4j3.4.4.jar;jena.arq-2.9.0-incubating.jar;jena.core-2.7.0-incubating.jar;jena.iri0.9.0-incubating.jar;log4j-1.2.16.jar;slf4j-api-1.6.4.jar;slf4j-log4j12-1.6.4.jar;xercesImpl-2.10.0.jar; xml-apis-1.4.01.jar; jena.schemagen -i myKnowledgeBase.rdf
when I do
echo %CLASSPATH%
I get what I entered
I tried using set CLASSPATH with the absolute path for each jar, and that doesn't work either.
So now I don't know what to do.
Using NetBeans, I found schemagen.class in the package "jena" inside jena-core-2.7.0-incubating.jar.
With Explorer I didn't find the class file.
I have already run several projects from the command line using java -jar, so Java and the command line are OK.
Thank you for your help
Edit:
I removed the space between the -classpath argument and %CLASSPATH%, and I get something different \o/ It still doesn't work, but it's progress!
Now I get "Unrecognized option" and "Could not create the Java virtual machine".
Edit 2:
As I was unable to solve this, I created a new project in NetBeans, made a copy of the schemagen class, set it as the main class, and included all the jars as libraries.
And then:
java -jar "C:\Users\moi\NetBeansProjects\MyJena\dist\MyJena.jar" -i "myKnowledgeBase.rdf" -o "C:\Users\moi\NetBeansProjects\apache-jena-2.7.0-incubating\apache-jena-2.7.0-incubating\lib" --ontology
In all recent releases, including Jena 2.7.0, Linux shell and Windows batch scripts are provided for all of the Jena command line tools. These scripts set the CLASSPATH appropriately. Since you seem to be using Windows, you should use bat\schemagen.bat.
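For example, from the unpacked Jena directory (paths as in the question; depending on the release, the script may first require the JENAROOT environment variable to point at that directory):

REM run from the root of the unpacked Jena distribution
cd C:\Users\moi\NetBeansProjects\apache-jena-2.7.0-incubating\apache-jena-2.7.0-incubating
bat\schemagen.bat -i lib\myKnowledgeBase.rdf --ontology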
I had the same problem. I'm using Jena 3.10.
If anyone is having the same problem, the solution is to use the schemagen bat file located in the bat folder.
I used this command line to generate the vocabulary:
C:\Jena\apache-jena-3.10.0\bat\schemagen.bat -i "FileName"

Jar works with standalone Hadoop, but not on the actual cluster (java.lang.ClassNotFoundException: org.jfree.data.xy.XYDataset)

I am trying to build my project using Eclipse on Windows and execute it on a Linux cluster. The project depends on some external jars, which I bundled using Eclipse's "Export -> Runnable JAR -> Package required libraries into jar" build option. I checked that the jar contains the classes within a folder structure, and that the external jars are in its root folder.
On standalone Hadoop, on Cygwin, and on Linux this works fine, but on an actual Hadoop Linux cluster it fails when it tries to access a class from the first external jar, throwing a ClassNotFoundException.
Is there a way to force Hadoop to search inside the jar? I thought this would work.
10/07/16 11:44:59 INFO mapred.JobClient: Task Id : attempt_201007161003_0005_m_000001_0, Status : FAILED
Error: java.lang.ClassNotFoundException: org.jfree.data.xy.XYDataset
at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
at org.akintayo.analysis.ecg.preprocess.ReadPlotECG.plotECG(ReadPlotECG.java:27)
at org.akintayo.analysis.ecg.preprocess.BuildECGImages.writeECGImages(BuildECGImages.java:216)
at org.akintayo.analysis.ecg.preprocess.BuildECGImages.converSingleECGToImage(BuildECGImages.java:305)
at org.akintayo.analysis.ecg.preprocess.BuildECGImages.main(BuildECGImages.java:457)
at org.akintayo.hadoop.HadoopECGPreprocessByFile$MapTest.map(HadoopECGPreprocessByFile.java:208)
at org.akintayo.hadoop.HadoopECGPreprocessByFile$MapTest.map(HadoopECGPreprocessByFile.java:1)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:358)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
at org.apache.hadoop.mapred.Child.main(Child.java:170)
Java cannot use jars that are inside another jar :/ (the classloaders can't handle this).
So what you have to do is install those packages separately on each machine in the cluster, or, if that is not possible, add the jars at run time. To do this, run the job as hadoop jar myjar.jar -libjars mylib.jar, and this should work.
Wojtek's answer is correct. Using -libjars will put your external jars in the distributed cache and make them available to all of your Hadoop nodes.
However, if your external jars are not changing frequently, you may find it more convenient to copy the jar files to each node's hadoop/lib directory manually. Once you restart Hadoop, your external jars will be added to the classpath of your jobs.
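For reference, a sketch of the -libjars variant for this job (the jar and argument names are illustrative; org.jfree.data.xy.XYDataset comes from the JFreeChart library, so that is what needs to be shipped):

# ship the external jars to every task via the distributed cache;
# generic options like -libjars must precede the job's own arguments
hadoop jar ecg-job.jar org.akintayo.hadoop.HadoopECGPreprocessByFile \
    -libjars jfreechart-1.0.13.jar,jcommon-1.0.16.jar input output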