TrueZIP 7 requires Java 7? NoClassDefFoundError: java/nio/file/Path on Java 6 - truezip

TFile depends on java.nio.file.Path (its toPath() method returns a java.nio.file.Path), which isn't available on Java 6, so calling any TFile method on Java 6 throws "java.lang.NoClassDefFoundError: java/nio/file/Path".
How do you manage to use TFile on Java 6? All I can think of is getting the sources, re-compiling them without this method, and using a patched version, which is a rather unpleasant solution.

No, TrueZIP 7 does not require JSE 7; JSE 6 is enough, as the home page documents. However, some features are only available on JSE 7 (e.g. the TrueZIP Path module), and hence a run-time test is performed.
With correct class loader implementations you will never see a NoClassDefFoundError. However, some environments have broken class loader implementations which do eager class loading, despite the lazy class loading mandated by the spec. Only then would you get a NoClassDefFoundError.
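For illustration only, here is a minimal sketch of how such a run-time capability test could be written (this is not TrueZIP's actual code; the class name is made up):

// Hypothetical sketch: detect at run time whether the NIO.2 API (java.nio.file.Path)
// is present before exposing any Path-based features.
public final class Nio2Detector {
    private Nio2Detector() { }

    public static boolean isNio2Available() {
        try {
            Class.forName("java.nio.file.Path");
            return true;
        } catch (ClassNotFoundException e) {
            return false; // running on JSE 6 or earlier
        }
    }
}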
On another note, please mind the Eclipse license of the project. If you really wanted to fix this by patching (you can't, because the circular dependency between java.io.File and java.nio.file.Path is the reason for this design), then you would have to publish your fork.
Appendix:
The Java Language Specification for Java 6, chapter 12.2.1 "The Loading Process" reads:
Different subclasses of ClassLoader may implement different loading policies. In particular, a class loader may cache binary representations of classes and interfaces, prefetch them based on expected usage, or load a group of related classes together. These activities may not be completely transparent to a running application if, for example, a newly compiled version of a class is not found because an older version is cached by a class loader. It is the responsibility of a class loader, however, to reflect loading errors only at points in the program where they could have arisen without prefetching or group loading.
English isn't my mother tongue, but I take it from the last sentence that eager class loading is OK as long as the class loader doesn't throw up just because an unused class failed to load eagerly. So if a class loader throws up because TFile.toPath() needs to return a java.nio.file.Path although you never call this method, then I consider this to be a problem with the class loader. As an aside, TFile.toPath() throws an UnsupportedOperationException - please check the Javadoc for details.
I would have preferred to take another route, but the circular dependency between java.io.File.toPath() and java.nio.file.Path.toFile() left me with no choice.

Related

How to create a custom annotation processor for Eclipse

I'm trying to create a custom annotation processor that generates code at compile time (as hibernate-jpamodelgen does). I've looked on the web and found custom annotation processors that work with Maven, but they do nothing when added under Annotation Processing > Factory Path. How can I create a processor that works this way? I have not found a tutorial that works.
My idea is, for example, to annotate an entity to automatically generate a base DTO, a base mapper, etc. that can be extended in the final code.
Thank you all
OK, I already found the problem. The tutorial I had found didn't specify that, for the compiler to be able to apply the annotation processor, there must be a META-INF/services/javax.annotation.processing.Processor file that contains the fully qualified class name of the processor (or processors).
I created the file pointing to my processor class, generated the JAR, added it to Annotation Processing > Factory Path, and everything worked correctly.
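For reference, here is a minimal sketch of such a processor (the annotation com.example.GenerateDto and the class names are made up). The file META-INF/services/javax.annotation.processing.Processor then contains a single line with the processor's fully qualified name, e.g. com.example.MyProcessor:

package com.example;

import java.util.Set;
import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.RoundEnvironment;
import javax.annotation.processing.SupportedAnnotationTypes;
import javax.annotation.processing.SupportedSourceVersion;
import javax.lang.model.SourceVersion;
import javax.lang.model.element.Element;
import javax.lang.model.element.TypeElement;
import javax.tools.Diagnostic;

// Hypothetical processor: a real one would write generated sources via processingEnv.getFiler().
@SupportedAnnotationTypes("com.example.GenerateDto")
@SupportedSourceVersion(SourceVersion.RELEASE_7)
public class MyProcessor extends AbstractProcessor {

    @Override
    public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
        for (TypeElement annotation : annotations) {
            for (Element element : roundEnv.getElementsAnnotatedWith(annotation)) {
                processingEnv.getMessager().printMessage(Diagnostic.Kind.NOTE,
                        "Would generate a base DTO/mapper for " + element.getSimpleName());
            }
        }
        // Returning false leaves the annotations unclaimed so later processors can still see them.
        return false;
    }
}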
Just be careful to keep the processors in the correct order (for example, the Hibernate model generator claims the classes, so no further generation will happen after it), and change the JAR file name each time you want to replace the library (it seems Eclipse keeps a cache). These two things gave me a good headache.
Thanks all

GWT Hosted Mode RPC Serialization file with bad class definition causes IncompatibleRemoteServiceException

I have a GWT project in Eclipse that throws a com.google.gwt.user.client.rpc.IncompatibleRemoteServiceException when using hosted mode because the code server RPC file hashcode does not match the server RPC file hashcode.
I've tracked this down to a couple of classes that implement com.extjs.gxt.ui.client.data.BeanModelTag. These classes appear to be included in the code-server-generated RPC file incorrectly. Additionally, the class names appear mangled.
For example, instead of com.acme.beans.MyBean the class is referenced as com.acme.beans.BeanModel_com_acme_beans_MyBean.
I suspect this has something to do with the class path for my debug target incorrectly including some JAR, source directory, or other project, but I don't have a good feel for how to debug this further.
GXT 2 (the current version is 3, and 4 should reach beta soon) had a feature where it could generate BaseModelData types based on a Java bean or POJO, allowing for reflection-like features that GXT 2 used to render templates and grid cells (GXT 3 has compile-time features that work out that property access instead). The BeanModels are not meant to be sent over the wire - instead, you should be sending your original MyBean over the wire.
The generated BeanModel instance is designed to wrap the original MyBean and is only available to client code. To pass data back to the server, unwrap the bean - use getBean() to get the underlying POJO.
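As a small sketch of that unwrapping step (assuming a GXT 2 BeanModel that wraps a MyBean instance; the surrounding helper class and method are placeholders):

import com.acme.beans.MyBean;
import com.extjs.gxt.ui.client.data.BeanModel;

// Sketch only: unwrap the generated BeanModel before passing anything to GWT-RPC.
public class BeanModelRpcHelper {
    public static MyBean unwrapForRpc(BeanModel model) {
        // getBean() returns the original POJO that the BeanModel wraps;
        // send this over the wire, never the BeanModel itself.
        MyBean bean = model.getBean();
        return bean;
    }
}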

Java 8: Spliterator, Iterator, Collection and "default" implementations in Interfaces (Duplicate methods named spliterator)

Have an interesting situation following the release of Java 1.8.0_25 into the wild... I believe the root of my issue relates primarily to the new (in 1.8) feature of "default" implementations within interfaces.
The application I am working on currently targets 1.7 and until now has been working well. Now that our users have started updating to 1.8, our hand is being forced somewhat into moving to 1.8 support.
We have fixed most of the issues (mainly relating to changes to the JavaFX packages between 1.7 and 1.8) but have one vexing issue remaining.
In my wisdom, or lack thereof, I decided some time ago to create a SortedList<T> which extends AbstractList<T>. Until now this class has worked fine; however, when running on a 1.8 runtime, I get:
Duplicate methods named spliterator with the parameters () and () are inherited
from the types Collection<T> and Iterable<T>
This, to me, appears to be caused by the "default" implementations in some of the interfaces that are implemented by AbstractList<T> (my SortedList<T> class does not implement any additional interfaces other than Serializable). Implementing Serializable is another constraint for us, as we need to support deserialisation of SortedList<T> objects - there's no way around that!
I can get rid of the error by providing an overriding implementation of spliterator() in my SortedList<T> class. However, if it is built that way, it no longer runs in a Java 1.7 environment. If I attempt to use SortedList<T> with a 1.7 runtime, I get:
Problem:
Error: Unresolved compilation problems:
The import java.util.Spliterator cannot be resolved
Spliterator cannot be resolved to a type
com.xxxx.xxxx.util.SortedList.<init>(SortedList.java:13)
This error is pretty obvious: since we've now overridden the spliterator() method in SortedList<T>, it needs to import java.util.Spliterator, but that doesn't exist in 1.7.
Ideally we would like to NOT require our customers to update to Java 1.8 if they don't want to.
Is our hand being forced here? Do we need to force users to update to 1.8 and also roll out a new version to any users who have updated to 1.8 by themselves?
Does anyone know a way around this issue?
On a more philosophical note, why has the interface been corrupted with implementation :-(. It might be a nifty new feature, but they really should have avoided doing anything that would result in breaking changes to existing code, particularly in something as fundamental as lists/collections etc.
Any help or suggestions regarding this predicament would be greatly appreciated.
Cheers,
Mark
The whole point of default methods is to avoid the situation you describe. The code below compiles and runs as expected with Java 7 and 8:
import java.io.Serializable;
import java.util.AbstractList;

public class SortedList<T> extends AbstractList<T> implements Serializable {
    @Override public T get(int index) { return null; }
    @Override public int size() { return 0; }

    public static void main(String[] args) {
        SortedList<String> s = new SortedList<>();
        System.out.println(s.size());
    }
}
OK, so both of you (@Holger and @assylias) were correct... But our situation is a little more complicated.
The environment we're working in is Eclipse 3.8.1, which doesn't support Java 8 (and won't in the future, to my knowledge), so we can't just switch to a Java 8 compiler to fix the issues.
Our product is a sizeable Eclipse RCP application. Upgrading our IDE is not currently an option, as there would be major rework involved, so we will need to continue to develop in a Java 1.7 environment.
If anyone is interested, we have resolved the issue by:
Creating fragments (one per Java version that causes issues, so three in our case) for our main plugin. These fragments are configured as patch fragments.
Adding the JavaFX JARs to the fragments (this was done to resolve some issues with JavaFX in an earlier release, and again for the 1.8.0_25 release).
Adding, also in the fragments and in the same namespace as the main plugin, the implementation of the SortedList class. The code is identical in each case, but the fragment for Java 8 is compiled specifically with a Java 8 compiler. Overriding the spliterator() method wasn't necessary in the end (when compiled with the Java 8 compiler it works OK, and it still compiles with the 1.7 compiler as there is no longer any reference to the Spliterator class).
This is probably not an ideal solution, but we think it will work :-).
Thanks for your input & suggestions, much appreciated.
Try creating an abstract class that overrides spliterator() with the preferred behaviour, i.e.:
import java.util.AbstractCollection;
import java.util.Spliterator;

abstract class Java8_AbstractCollection<E> extends AbstractCollection<E> {
    /* (non-Javadoc)
     * @see java.util.Collection#spliterator()
     */
    @Override
    public Spliterator<E> spliterator() {
        return super.spliterator();
    }
}

Scala error when extending java.util.Stack: error while loading Vector$

I'm using Scala v2.10.2 and Eclipse with the Scala plugin v3.0.1. The full error message is:
error while loading Vector$1, class file 'C:\Program
Files\Java\jre7\lib\rt.jar(java/util/Vector$1.class)' is broken (class
java.util.NoSuchElementException/key not found: E)
It occurs when attempting to extend java.util.Stack:
import java.util.Stack
class MyStack[T] extends Stack[T]{}
It's worth noting that java.util.Stack is a subclass of java.util.Vector.
java.util.Stack extends the essentially deprecated java.util.Vector, and thus is also essentially deprecated (neither is actually deprecated, but the docs always recommend using newer alternatives if you're running a newer version of Java). The Javadoc for Stack recommends using the java.util.Deque interface instead:
A more complete and consistent set of LIFO stack operations is provided by the Deque interface and its implementations, which should be used in preference to this class. For example: Deque<Integer> stack = new ArrayDeque<Integer>();
Using the Deque interface and java.util.ArrayDeque will probably solve your problem since, as pretzels1337's answer points out, this seems to be a Vector-specific bug.
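For completeness, a tiny Java sketch of the Javadoc's recommendation, using ArrayDeque as a LIFO stack:

import java.util.ArrayDeque;
import java.util.Deque;

public class DequeStackExample {
    public static void main(String[] args) {
        // A Deque used as a LIFO stack, as the Stack Javadoc recommends.
        Deque<Integer> stack = new ArrayDeque<Integer>();
        stack.push(1);
        stack.push(2);
        System.out.println(stack.pop());  // prints 2
        System.out.println(stack.peek()); // prints 1
    }
}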
This same issue may be part of a larger bug report:
https://issues.scala-lang.org/browse/SI-7455
The report claims it is fixed in Scala 2.10.3-RC1 and Scala 2.11.0-M6.
I'm waiting for the next stable Scala IDE update before verifying the fix (lazy, I know), but a simple workaround in the meantime is to change the class definitions to extend scala.collection.mutable.Stack instead.
--
Most people running into this issue are trying to use swing; for you I can only recommend trying one of the fixed builds of scala.

Problems compiling routes after migrating to Play 2.1

After migrating to Play 2.1 I ran into a problem: the routes compiler stopped working for my routes file. It was completely fine with Play 2.0.4, but now I'm getting a build error and can't find any workaround for it.
In my project I'm using the cake pattern, so controller actions are visible not through <package>.<controller class>.<action> but through <package>.<component registry>.<controller instance>.<action>. The new Play routes compiler uses all action path components except the last two to form the package name used in the managed sources (as far as I can tell from the code in https://github.com/playframework/Play20/blob/2.1.0/framework/src/routes-compiler/src/main/scala/play/router/RoutesCompiler.scala). In my case this leads to a situation where <package>.<component registry> is chosen as the package name, which results in an error during the build:
[error] server/target/scala-2.10/src_managed/main/com/grumpycats/mmmtg/componentsRegistry/routes.java:5: componentsRegistry is already defined as object componentsRegistry
[error] package com.grumpycats.mmmtg.componentsRegistry;
I made a sample project to demonstrate this problem: https://github.com/rmihael/play-2.1-routes-problem
Is it possible to work around this problem somehow without dropping the cake pattern for controllers? It's a pity that I can't proceed with Play 2.1 due to this problem.
Because of my reputation I cannot create a comment.
The convention is that classes and objects start with an upper-case letter. This convention applies to pattern matching as well. Looking at a string, there seems to be no difference between a package object and a normal object (apart from the case). I am not sure how Play 2.1 handles things, which is why this is really a comment rather than an answer.
You could try the new @ syntax in the router. That allows you to create the controller instance from the Global class. You would still specify <package>.<controller class>.<action>, but in Global you obtain the instance from somewhere else (for example, a component registry).
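As an illustration of that mechanism, here is a minimal Java sketch (the registry map is a stand-in for whatever actually wires up your controllers, e.g. a cake-pattern component registry; a Scala GlobalSettings can override the same method). In the routes file the controller class would be prefixed with @ so that Play asks Global for the instance:

import java.util.HashMap;
import java.util.Map;

import play.GlobalSettings;

// Hypothetical Global: hand controller instantiation over to your own registry
// so the routes file can keep referencing <package>.<controller class>.<action>.
public class Global extends GlobalSettings {

    // Stand-in for a component registry; how it is populated is up to the application.
    private final Map<Class<?>, Object> registry = new HashMap<Class<?>, Object>();

    @Override
    @SuppressWarnings("unchecked")
    public <A> A getControllerInstance(Class<A> controllerClass) throws Exception {
        Object controller = registry.get(controllerClass);
        // Fall back to Play's default behaviour if nothing is registered.
        return controller != null ? (A) controller : controllerClass.newInstance();
    }
}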
You can find a bit of extra information under 'Managed Controller classes instantiation' here: http://www.playframework.com/documentation/2.1.0/Highlights
This demo project shows its usage: https://github.com/guillaumebort/play20-spring-demo