I'm trying to build a large Open Source project in Eclipse; it uses Maven, so I've installed the various plugins (m2eclipse, etc.), but I'm a little unfamiliar with this setup.
I can build and run the particular JAR I'm interested in with no issues. However, when the newly built JAR tries to open a large ZIP file, I get this:
Exception in thread "main" java.lang.RuntimeException: java.util.zip.ZipException: invalid CEN header (bad signature)
    at org.opentripplanner.graph_builder.impl.GtfsGraphBuilderImpl.buildGraph(GtfsGraphBuilderImpl.java:179)
    at org.opentripplanner.graph_builder.GraphBuilderTask.run(GraphBuilderTask.java:127)
    at org.opentripplanner.graph_builder.GraphBuilderMain.main(GraphBuilderMain.java:51)
Caused by: java.util.zip.ZipException: invalid CEN header (bad signature)
    at java.util.zip.ZipFile.open(Native Method)
    at java.util.zip.ZipFile.<init>(Unknown Source)
I did some research and it seems that this error means that java.util.zip.ZipFile can't read the file because it's in ZIP64 format. Apparently this was fixed in Java 1.7, so I've dutifully updated the JDK on OS X and tried to change the Maven project by right-clicking on it in Eclipse and altering the Project Facet, which in turn seems to have updated the JRE libraries in that project to 1.7.
However, this doesn't work - I still get the error even having re-built the whole project.
Is it possible that the old java.util.zip is still being pulled in from somewhere? I'm not too familiar with how linking works in Java: can older JDKs be 'embedded' like this within dependencies? Or is the java.util.zip on the target machine (which is definitely JRE 1.7) always the one that gets used? I know for a fact that the code throwing the exception is contained within a separate JAR that's pulled in as a Maven dependency.
Do I need to track down and rebuild this external JAR against Java 1.7; is that the issue here? Or is there a concept of a Maven 'parent project' that's regressing my new JRE 1.7 back to 1.6? Sorry if these questions are naive.
I originally thought that it would be as simple as just updating the JRE on the runtime machine, but apparently not. So how do I resolve this error?
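For reference, the failure comes down to a single java.util.zip call; on a 1.6 JRE this throws the same ZipException for any ZIP64 archive (the file name here is illustrative):

import java.util.zip.ZipFile;

public class Zip64Check {
    public static void main(String[] args) throws Exception {
        // On a 1.6 JRE this fails with "invalid CEN header (bad signature)"
        // for ZIP64 archives; Java 1.7's java.util.zip added ZIP64 support.
        ZipFile zip = new ZipFile("large-gtfs.zip"); // illustrative file name
        System.out.println("Entries: " + zip.size());
        zip.close();
    }
}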
Assuming that the problem is really caused by using an older version of Java, then rebuilding is not going to make any difference. The real problem is that your application JAR is running on an older JRE.
In the command shell you are using to run your application, run java -version. That will tell you what JDK / JRE will be used when you then run java -jar yourApp.jar ...
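For example (the exact output depends on your installation; if it still reports 1.6, the runtime you're using doesn't have the ZIP64 fix):

$ java -version
java version "1.7.0_25"
Java(TM) SE Runtime Environment (build 1.7.0_25-b15)
Java HotSpot(TM) 64-Bit Server VM (build 23.25-b01, mixed mode)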
The most important point to bear in mind with Maven is that the source of truth is the pom.xml file, NOT the IDE. So the compiler version must be changed in pom.xml and NOT in the IDE.
Maven ignores which JRE you are using in Eclipse; you have to force it with the maven-compiler-plugin configuration. This is shown at the following link, but use version 1.7 instead of 1.3: http://maven.apache.org/plugins/maven-compiler-plugin/examples/compile-using-different-jdk.html
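Following that example, the relevant section of the pom.xml would look something like this (a sketch setting source/target rather than the full <executable> fork shown on that page):

<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-compiler-plugin</artifactId>
      <configuration>
        <source>1.7</source>
        <target>1.7</target>
      </configuration>
    </plugin>
  </plugins>
</build>

Note that source/target only control what bytecode the compiler emits; the ZIP64 fix lives in the JRE's own java.util.zip, so the JAR also has to actually run on a 1.7 runtime, as the answer above points out.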
Related
I'm working on updating a large old Java application to Java 17.
It was on Java 11. Note that we build in Eclipse, and all the projects that are part of this application have Java 1.8 compatibility turned on.
Now that I've updated it all to Java 17 (and updated the truckload of .jar files we use to their latest versions as well), I'm getting a problem with a code generator plugin we wrote.
I was able to rebuild the plugin .jar and it seems to be valid. The problem is that when it runs, it fails with this message:
The method parse(File) from the type DataModelParser refers to the missing type ParserConfigurationException
Since the class definitely exists, I'm assuming it's running into the issue where Java doesn't like finding the class in two places.
Using Ctrl-Shift-T, I find the class available from Java 17 (expected) and from "C:\Users\JohnLuss.p2\pool\plugins\javax.xml_1.3.4.v201005080400.jar".
That jar is part of Eclipse - so why does Eclipse have jars that conflict with java 17?
HOW can I make this work? Is there any way to exclude the Eclipse jar?
I have a plain Eclipse installation without much of anything, and a workspace with a Maven (or better yet Tycho) project. Everything worked until I decided to change the JDK (Preferences -> Installed JREs); now whenever I start a Maven build or the plug-in unit tests I get the following error:
Error occurred during initialization of VM
java.lang.UnsatisfiedLinkError: java.lang.Class.getClassLoader0()Ljava/lang/ClassLoader;
<<no stack trace available>>
I have restarted Eclipse and tried a different installation, and since I thought the problem was with the JRE, I changed it back to the original one. Still the same error.
The corresponding 64bit JRE works, but sadly the build has an integration test for 32bit (Cannot load 32-bit SWT libraries on 64-bit JVM), so that's why I need to get 32bit to work again.
How do I fix it?
Deleting the workspace's .metadata folder and setting the workspace up again helped.
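For example, with Eclipse closed (the workspace path is illustrative; this also wipes workspace preferences and project imports, so those have to be set up again):

rm -rf ~/workspace/.metadata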
I know this question has been asked many times before, but I couldn't find anything for this specific case that's up to date. How do I install the Eclipse MapReduce plugin with Hadoop 2.5.2? I've found multiple tutorials addressing this for other versions, and I know that it can be compiled from here, but that link, while it says 2.x, only goes up to 2.4.1. I tried to compile it anyway against 2.5.2 using the following command:
ant jar -Dversion=2.4.1 -Dhadoop.version=2.5.2 -Declipse.home=/opt/eclipse -Dhadoop.home=/usr/local/hadoop
Buildfile: build.xml
This attempted to compile, but then returned
Warning: Could not find file /usr/local/hadoop/share/hadoop/common/lib/jackson-core-asl-1.8.8.jar to copy.
Which makes sense, because the version of this jar in Hadoop 2.5.2 is 1.9.13.
Then I tried to use version 2.5.2, even though it is not mentioned in the jar, since it says that they support 2.x.
ant jar -Dversion=2.5.2 -Dhadoop.version=2.5.2 -Declipse.home=/opt/eclipse -Dhadoop.home=/usr/local/hadoop
Buildfile: build.xml
In this case, it said BUILD SUCCESSFUL, but there was no output - no jar produced, and no output printed under the 'compile:' and 'jar:' steps.
I even tried downloading the compiled jar from this guy's github, but of course that didn't work either: the plugin didn't show up in Eclipse when I added the jar to the plugins folder.
Is there any way to use the plugin with Hadoop 2.5.2, or do I have to downgrade to 2.4.1? I am using Ubuntu 14.04, Hadoop 2.5.2, and Eclipse Luna, but can downgrade if necessary.
I downloaded hadoop-eclipse-plugin-2.6.0.jar (the latest at the time of writing) from https://github.com/winghc/hadoop2x-eclipse-plugin/tree/master/release and pasted the jar file into the plugins folder of Eclipse. It gave me the "Map/Reduce" wizard and I got the "Run on Hadoop" option! I am using Eclipse Luna.
You can use the "Hadoop" wizard to define the HDFS server.
If you just need to run MapReduce jobs from Eclipse, then you need to:
1. Create a Java project in Eclipse
2. Add the Hadoop jar files to the project's referenced libraries
MapReduce programs can then be compiled and executed from Eclipse (see the sketch below).
Note: by default, Eclipse considers the local filesystem for input and output files.
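As a sketch of what runs this way, here is a minimal self-contained WordCount using the Hadoop 2.x mapreduce API (the class name and input/output paths are illustrative; with no cluster configuration on the classpath, Hadoop falls back to the local job runner and local filesystem):

import java.io.IOException;
import java.util.StringTokenizer;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {
    public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();
        @Override
        public void map(Object key, Text value, Context context) throws IOException, InterruptedException {
            // Emit (word, 1) for every token in the input line
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, ONE);
            }
        }
    }

    public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context) throws IOException, InterruptedException {
            // Sum the counts for each word
            int sum = 0;
            for (IntWritable v : values) sum += v.get();
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration(); // no cluster settings: local job runner + local fs
        Job job = Job.getInstance(conf, "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path("input"));    // local directory by default
        FileOutputFormat.setOutputPath(job, new Path("output")); // must not already exist
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}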
I have a web project working in eclipse juno using tomcat 7 on one machine. I exported this project to subversion, and imported it into a fresh workspace of eclipse juno on another machine which also has tomcat 7 installed. To fix errors, I then set up the runtime "Apache Tomcat v7.0" on the second machine, and selected Projects/Clean/Clean all.
Now, my project has a list of libraries that includes "Apache Tomcat v7.0 [Apache Tomcat v7.0]", under which are listed all the jars in my tomcat installation, including servlet-api.jar, which eclipse is able to tell includes the package javax.servlet.http. However, I have hundreds of compiler errors from my source of the form "The import javax.servlet.http cannot be resolved" etc. Short of deleting and recreating the project (which might fix the second machine, but I worry that it will stop the project working on the original machine) what can I do to fix this?
Correction: the machines are running eclipse juno, not indigo as I previously stated.
If you did a refresh on the tree and a clean/build and the errors still exist, it must be the case that the jars are not on the build path (check the .classpath file).
So even though you may have checked it before, you should go there again, see whether they are on the build path, and maybe delete and re-add them.
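For reference, the Tomcat runtime should show up in the .classpath file as a single container entry along these lines (the exact container ID can differ between Eclipse/WTP versions, so treat this as illustrative):

<classpathentry kind="con" path="org.eclipse.jst.server.core.container/org.eclipse.jst.server.tomcat.runtimeTarget/Apache Tomcat v7.0"/>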
Additionally, maybe this article about the specific problem helps.
Turns out that this is a fairly simple and annoying but otherwise harmless eclipse bug -- it seems as though the list of libraries that constitutes a particular runtime is cached when the project is loaded, and doesn't get updated if the library list changes (e.g. because the runtime didn't exist when the project was imported, but was created afterwards). Restarting eclipse causes it to reload the list, after which a project clean gets rid of the errors.
I was trying to compile a Grails application referencing third-party JARs on Mac OS X. Although my system's JRE and JDK are set to Java 1.6, when accessing classes in the JAR I always got a compilation error:
java.lang.UnsupportedClassVersionError: Bad version number in .class file
Also, when testing an existing Grails app, the default stats of the app were showing that it's running with Java 1.6. So I really didn't have any more ideas than to try changing the Java settings in Eclipse. I thought that was stupid, as I'm not using Eclipse for the development of this application, but voila: now the compilation of my app on the command line works just fine!
Can anyone explain to me what Eclipse is doing here behind the scenes?
I had set JAVA_HOME manually before, with no effect.
The JDK (JAVA_HOME) used to launch Eclipse is not necessarily the one used to compile your project.
To see which JREs you can select for your project, check the preferences:
Java > Installed JREs
By default, if you have not added any JRE, the only one declared will be the one used to launch Eclipse (which can be defined in your eclipse.ini).
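For example, in eclipse.ini (the JDK path is illustrative; -vm and its value must be on two separate lines, and must appear before -vmargs):

-vm
/path/to/jdk1.7.0/bin/java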
You can add any other JRE you want, including one compatible with your project.
After that, you will need to check in your project properties (or in the general preferences) which JRE is used, and with what compliance level.