@MappedSuperclass static weaving with EclipseLink and multiple jars - jpa

My entity classes are scattered across multiple jars.
In jar A I have a base class named MyBase, which is annotated with @MappedSuperclass.
In jar B there is an entity class that derives from MyBase.
The problem is that, because the weaving is done in the context of a single jar file (I'm using the maven plugin), the base class (MyBase) isn't instrumented (although it should be).
If I move the derived class from jar B to jar A, then the weaving process handles the base class as well.
Since I'm working on a large project, it is critical for me to develop in a modular way.
Doesn't EclipseLink support such a methodology?
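To make the setup concrete, the layout is roughly the following (MyEntity and its field are just placeholder names for the derived class in jar B):

// --- jar A: MyBase.java ---
import javax.persistence.Id;
import javax.persistence.MappedSuperclass;

@MappedSuperclass
public abstract class MyBase {
    @Id
    protected Long id;
}

// --- jar B: MyEntity.java ---
import javax.persistence.Entity;

@Entity
public class MyEntity extends MyBase {
    private String name;
}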

The only way I found to work around this limitation is to add a temporary entity class to the jar where the @MappedSuperclass base class is defined and remove it after the weaving procedure.
Sad, but true ;-)

I'm not sure about the maven plugin, but you should be able to use the static weaver on both jars. You will need to call it twice to weave both, and you will need both jars on the weaver's classpath for both calls.
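As a rough sketch of those two weaving passes, here is what it could look like using EclipseLink's StaticWeaveProcessor API (the machinery behind the org.eclipse.persistence.tools.weaving.jpa.StaticWeave command-line tool). The jar names, output paths and persistence.xml location are placeholders, and the exact method names should be checked against your EclipseLink version:

import org.eclipse.persistence.tools.weaving.jpa.StaticWeaveProcessor;

public class WeaveBothJars {
    public static void main(String[] args) throws Exception {
        // Run one weaving pass per jar. Both jars must be resolvable on the
        // classpath of this process so that MyBase in jar A can be seen while
        // the entity in jar B is being woven.
        weave("target/jarA.jar", "target/jarA-woven.jar");
        weave("target/jarB.jar", "target/jarB-woven.jar");
    }

    private static void weave(String source, String target) throws Exception {
        StaticWeaveProcessor processor = new StaticWeaveProcessor(source, target);
        // Directory or jar that contains META-INF/persistence.xml (placeholder path).
        processor.setPersistenceInfo("jarB/src/main/resources");
        processor.performWeaving();
    }
}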

Alternatively, you can specify the jar containing your superclass as the inpath, as explained here and here:
Managing multiple projects
Building AspectJ source code requires two distinct phases: compiling
the source in .java and .aj files to generate .class files, and then
applying the aspects to the generated .class files. This second phase,
known as weaving, is the key difference between AspectJ and Java
compilers. The Java compilation process is controlled by the classpath
setting, which makes types available for resolution by the compiler.
The same classpath setting is used by the AspectJ compilation process
and it is configured in exactly the same way in Eclipse. However, this
setting is not sufficient to control both the compilation and weaving
steps in all situations. This is why there are two extra settings
available for AspectJ projects.
First, there is the inpath setting. Anything specified here will be
made available to the weaver and so any aspects that apply will be
woven in. Entries can be added to a project's inpath by right-clicking
on the project, selecting Properties, then going to the AspectJ InPath
section. Entries can be either JAR files or directories (class
folders), such as the bin directory of another project. Anything on
the inpath is sent to the project's output, after potentially being
woven with aspects.
The second additional setting is the aspectpath. Whereas the inpath
controls the list of things that get woven, the aspectpath controls
what is woven into that list. In other words, any aspects specified on
the aspectpath are made available to the weaving process, just as if
they were present in source form in the project. This setting is
controlled from the AspectJ Aspect Path property page and can contain
either JAR files or directories.
An output JAR setting is also present in the AspectJ section of each
project's property page. This setting causes the compiler to output
class files directly to a JAR file, instead of to the project's output
folder.
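Outside Eclipse, the equivalent knobs are ajc's -inpath and -aspectpath options. A minimal sketch, invoking the compiler programmatically (the jar names and output path are made up for illustration):

import org.aspectj.tools.ajc.Main;

public class WeaveJarWithAspects {
    public static void main(String[] args) {
        // -inpath: classes/jars to be woven (the jar containing the superclass goes here).
        // -aspectpath: compiled aspects that get woven into the inpath entries.
        Main.main(new String[] {
            "-inpath", "libs/jarA.jar",
            "-aspectpath", "libs/my-aspects.jar",
            "-classpath", "libs/aspectjrt.jar",
            "-outjar", "target/jarA-woven.jar"
        });
    }
}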
Drove me crazy just like you - hope this helps. ;)

Related

Include one project's jar libraries in another project depending on the first

I am working on two projects in Eclipse.
Project A depends on some jar files that come with the project, and those jar files have been added to the “Libraries” tab in Project A's “Java Build Path” property in Eclipse.
Project B depends on Project A, as well as directly using some classes in some of those jar files in Project A's build path.
I had assumed that adding Project A to Project B's Java Build Path would also add the jars already on Project A's build path, but that appears not to be the case.
Do I have to manually add those jars to Project B's build path, or am I overlooking a setting? If so, why is that a useful standard behaviour?
You have to manually add the jars to project B's classpath.
Adding a project dependency means project B depends on the compiled output of project A. Project A's output (its compiled .class files) doesn't contain the .jar files it depends on.
Why is this? I don't know the rationale of the Eclipse authors, but my guess is that they want to keep the classpath as simple and explicit as possible. Things can get confusing if you have multiple versions of the same library on the classpath.
In vanilla Java you can provide directory names for the classpath. When loading a class, the JVM will search these directories in order. Eclipse encourages a stricter approach where each jar is specified manually. Note that you can add multiple jars at once, so it's trivial to add all of project A's jars to project B's classpath.

AspectJ aspect not called for loaded classes

In an RCP OSGi-based application, I want to load classes from disk at runtime. This loading is independent of the OSGi infrastructure.
I have a custom classloader which can do this, and it works in general (without AspectJ).
Now these loaded classes shall have aspects applied.
But the aspect code is not called.
I have the .class files built with the Ant iajc compiler. Do I need load-time weaving?
When load-time weaving is needed, is it sufficient to add the dependency "org.aspectj.weaver" and make the classloader extend WeavingURLClassLoader?
If it is WeavingURLClassLoader, can I use a URL like "platform:/plugin/myplugin/my_package/Aspect.aj" to point to an aspect located in another plugin? Or without the .aj extension?
How can I debug this?
Frank
I have got it working now, using the constructor WeavingURLClassLoader(classesUrls, aspectUrls, parent), where:
classesUrls needs to be URLs pointing to folders or jars; this list must include the AspectJ runtime jar.
aspectUrls needs to be URLs pointing to folders or jars; this list must not include the AspectJ runtime jar.
URLs pointing to specific aspect files within the jars do not work. Trying to use an OSGi bundle URL did not work either; for me, I needed to find the JAR files directly on the file system.
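A minimal sketch of that setup, with placeholder paths and a made-up name for the dynamically loaded class:

import java.io.File;
import java.net.URL;
import org.aspectj.weaver.loadtime.WeavingURLClassLoader;

public class LoadAndWeave {
    public static void main(String[] args) throws Exception {
        // Folders/jars holding the classes to be loaded and woven;
        // per the answer above, the AspectJ runtime jar must be included here.
        URL[] classesUrls = {
            new File("external/classes/").toURI().toURL(),
            new File("lib/aspectjrt.jar").toURI().toURL()
        };
        // Folders/jars holding the compiled aspects;
        // the AspectJ runtime jar must NOT be listed here.
        URL[] aspectUrls = {
            new File("lib/my-aspects.jar").toURI().toURL()
        };
        ClassLoader loader =
            new WeavingURLClassLoader(classesUrls, aspectUrls, LoadAndWeave.class.getClassLoader());
        Class<?> loaded = loader.loadClass("my_package.SomeLoadedClass");
        System.out.println("Loaded and wove " + loaded.getName());
    }
}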

Don't load/scan class files from a specific jar

I'd like to know how to configure the maven-bundle-plugin (backed by bnd) to completely ignore the classes contained within an embedded jar.
Background
I'm working in a controlled environment where the platform my code runs on is defined by a single company (including all the tools). The code is Java and uses OSGi to define module dependencies.
Some of the provided modules contain what look like invalid class files; I can only assume that the system will 'correct' these class files before it tries to load them into any kind of JVM. In any case, these class files work when deployed onto the target system.
I'm trying to create a build system based on Maven that can produce packages the system understands, and I have hit a problem where these invalid class files are read by bnd (via Apache Felix), which causes errors.
I'd like a way to have the jars that contain these class files on the classpath of the bundle, but where the contained .class files aren't read or processed by bnd. I could settle for simply ignoring the errors and continuing, but I can't find a way to do that either without Felix aborting the entire build phase.
I just found the -failok directive; I don't know why I didn't find it before. Adding <_failok>true</_failok> to the instructions allows me to continue working.
See instructions-ref
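In the maven-bundle-plugin configuration that ends up looking roughly like this (plugin version omitted; bnd directives that start with a dash are written with a leading underscore, and the only part taken from the answer above is the <_failok> instruction):

<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-bundle-plugin</artifactId>
  <extensions>true</extensions>
  <configuration>
    <instructions>
      <!-- Tell bnd to keep going instead of failing the build on errors
           such as unreadable/invalid class files in embedded jars. -->
      <_failok>true</_failok>
    </instructions>
  </configuration>
</plugin>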

Using JavaCompiler with Classpath referencing jars within ear

I am working on a project in which an enterprise archive (ear) deployed on a JBoss server needs to compile (and run) a class dynamically. I am using the JavaCompiler class to do this - the complication comes from the fact that the class being compiled has references to some of the classes contained within the ejb jar within the ear.
This is not a problem when the deployed ear is 'exploded' on deployment, so it is just a directory rather than an archive - in this case I am able to specify the required jar in the -classpath option of the compiler, and compilation works fine. Unfortunately due to constraints of the systems I am working with, it is not an acceptable solution to deploy these ears 'exploded', and the compiler seems not to be able to 'see' the required jar when it's wrapped up in an archive.
Given that the dynamic compilation is taking place from the ear in question, and therefore the system's class loader has access to the contents of the required jar, is there any way I can tell the compiler to just use the classes as loaded by the system class loader?
I appreciate this is something of a wordy question, but any help would be appreciated.
Thanks
It seems that there is no simple way to have the JavaCompiler load dependencies of compiled code from a ClassLoader. However, one could implement JavaFileManager directly and redirect the operations for StandardLocation.CLASS_PATH to resource lookups on the context ClassLoader (getResource(<class/resource name>)). This would remove the limitation of the StandardJavaFileManager, which operates directly on files.
Someone seems to have already prototyped that approach:
http://atamur.blogspot.de/2009/10/using-built-in-javacompiler-with-custom.html
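A condensed sketch of that idea follows. Only the CLASS_PATH/class-kind case and only file: resources are handled; jar: URLs, error handling and the remaining JavaFileManager plumbing are left out, and all class and package names here are made up:

import javax.tools.*;
import java.io.*;
import java.net.URI;
import java.net.URL;
import java.util.*;

// A class "file" whose bytes come from a ClassLoader resource rather than the file system.
class ClassLoaderJavaFileObject extends SimpleJavaFileObject {
    private final String binaryName;

    ClassLoaderJavaFileObject(String binaryName, URI uri) {
        super(uri, Kind.CLASS);
        this.binaryName = binaryName;
    }

    String binaryName() {
        return binaryName;
    }

    @Override
    public InputStream openInputStream() throws IOException {
        return uri.toURL().openStream();
    }
}

// Redirects CLASS_PATH listings to a ClassLoader; everything else is delegated.
class ClassLoaderFileManager extends ForwardingJavaFileManager<StandardJavaFileManager> {
    private final ClassLoader loader;

    ClassLoaderFileManager(StandardJavaFileManager delegate, ClassLoader loader) {
        super(delegate);
        this.loader = loader;
    }

    @Override
    public String inferBinaryName(Location location, JavaFileObject file) {
        if (file instanceof ClassLoaderJavaFileObject) {
            return ((ClassLoaderJavaFileObject) file).binaryName();
        }
        return super.inferBinaryName(location, file);
    }

    @Override
    public Iterable<JavaFileObject> list(Location location, String packageName,
                                         Set<JavaFileObject.Kind> kinds, boolean recurse)
            throws IOException {
        List<JavaFileObject> result = new ArrayList<>();
        super.list(location, packageName, kinds, recurse).forEach(result::add);
        if (location == StandardLocation.CLASS_PATH && kinds.contains(JavaFileObject.Kind.CLASS)) {
            String packagePath = packageName.replace('.', '/');
            Enumeration<URL> resources = loader.getResources(packagePath);
            while (resources.hasMoreElements()) {
                URL pkg = resources.nextElement();
                if (!"file".equals(pkg.getProtocol())) {
                    continue; // jar:/vfs: resources would need their own handling
                }
                // Simplified: assumes the path contains no URL-encoded characters.
                File[] classFiles = new File(pkg.getPath())
                        .listFiles((dir, name) -> name.endsWith(".class"));
                if (classFiles == null) {
                    continue;
                }
                for (File f : classFiles) {
                    String simpleName =
                        f.getName().substring(0, f.getName().length() - ".class".length());
                    result.add(new ClassLoaderJavaFileObject(packageName + "." + simpleName, f.toURI()));
                }
            }
        }
        return result;
    }
}

You would then pass new ClassLoaderFileManager(compiler.getStandardFileManager(null, null, null), Thread.currentThread().getContextClassLoader()) as the file manager to JavaCompiler.getTask(...), so that references to classes inside the deployed ear resolve through the class loader instead of the file system.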

Eclipse CDT thinks it's broken

I'm using Eclipse for some embedded development, and recently it started to give me these errors every time I save a file or do a build. It's annoying, but for the most part it doesn't seem to be causing any problems (it even still highlights warnings/errors in the source). What's going on here?
Plug-in org.eclipse.cdt.cross.arm.gnu was unable to load class
org.eclipse.cdt.managedbuilder.internal.scannerconfig.DefaultGnuWinScannerInfoCollector.
Plug-in org.eclipse.cdt.cross.arm.gnu was unable to load class
org.eclipse.cdt.managedbuilder.internal.scannerconfig.ManagedGCCScannerInfoConsoleParser
It looks like the Eclipse wiki FAQ says:
The most likely reason is that an exception was thrown in the static initializer for a class declared by the offending plug-in. Check the .log file to see whether that indeed happened.
The Eclipse Platform loader will not load a plug-in when exceptions are thrown during the initialization of the Java classes that make up the plug-in.
Another common reason for this error is the lack of an appropriate constructor for the class being loaded. Most classes declared in extension points must have a public zero-argument constructor. Check the extension point documentation to see what constructor is required for the classes that you declare in an extension.
If the problem only occurs when deploying a packaged plug-in (i.e., when it is not started in a runtime workbench via PDE) it is usually a good idea to check the Bundle-ClassPath attribute in the MANIFEST.MF file.
The JAR file that contains the plug-in classes must be listed in the Bundle-ClassPath. Even if the plug-in's proper classes are all listed, class loading may still fail because a .class file may contain references to other classes that cannot be resolved at runtime. In this case, the missing classes need to be identified (usually by looking at the import statements of the problematic class) and the necessary entries need to be added to the Bundle-ClassPath. If additional JAR files are required, those JARs also need to be listed in the build.properties file so that they are included when the plug-in is packaged.
(See this thread as an illustration of that last point)
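As a concrete illustration of that last point, a bundle that ships an additional jar (lib/extra.jar is a made-up name) would list it in both places:
In META-INF/MANIFEST.MF:
Bundle-ClassPath: ., lib/extra.jar
In build.properties:
bin.includes = META-INF/, ., lib/extra.jar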
So, for instance, in this thread, for another issue back in eclipse3.0 time:
The plugin.xml file specifies "org.eclipse.core.runtime.compatablity" as a required plugin. However, I am using Eclipse Version 3.0.1 and should be using "org.eclipse.core.runtime_3.0.1".
Solution:
Replace the line in the Plugin.xml
<import plugin="org.eclipse.core.runtime.compatability"/>
with
<import plugin="org.eclipse.core.runtime"/>
VonC is right -- with a fair bit of detail on what might go wrong with class loading...
In this case, your arm.cross toolchain is referencing internal classes in CDT's managed build which aren't accessible. This is an incompatibility between your ARM toolchain and CDT. You should file a bug with them about this error, after first trying a newer version.