Eclipse Virgo - classloader returns old class after bundle reload

I have bundle A and bundle B. Bundle B imports packages from bundle A, and bundle B has Eclipse-RegisterBuddy set to bundle A. Bundle A loads a class exported by B via Java reflection (Class.forName). When bundle B is redeployed, bundle A still holds a reference to the classloader of the old version of B, so Class.forName returns the old class version from B. As a result, Class.isInstance called in A returns false when the argument is an instance created in B and passed to a method in A.
Is there some way to refresh the classloader in A so that it returns the new versions of classes from B? Running the refresh A command from the Virgo console solves the problem, but the refresh stops and restarts all dependent bundles (B and others). This is not acceptable in our application, because bundle B and the other bundles that import packages from A are long-running batch jobs and cannot be stopped.

Class.forName() is an anti-pattern in OSGi; avoid it: http://wiki.osgi.org/wiki/Avoid_Classloader_Hacks
You should define an interface for your class. The interface should be stable. Put this interface into a separate bundle, e.g. B.API. In bundle A, program against this interface instead of the concrete class. Once the stable interface lives in its own bundle, refreshing B will no longer restart your dependent bundles. You then have several options to get the instance you want:
Option 1
You should think in terms of OSGi services and their lifecycles. Bundle B should register an OSGi service that bundle A consumes. You can write a ServiceTracker or use Declarative Services to get the service.
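As a sketch of option 1 with Declarative Services: the interface name b.api.Worker, the implementation class, and the method are all hypothetical placeholders, not names from the question. Bundle B publishes the implementation, and bundle A gets it injected; DS rebinds the reference automatically when B is redeployed, so A never holds on to a stale classloader.

```java
// In bundle B: publish the implementation as an OSGi service.
// b.api.Worker and WorkerImpl are hypothetical names.
package b.internal;

import org.osgi.service.component.annotations.Component;
import b.api.Worker;

@Component(service = Worker.class)
public class WorkerImpl implements Worker {
    @Override
    public void run() { /* ... */ }
}
```

```java
// In bundle A: consume the service through the stable interface only.
package a.internal;

import org.osgi.service.component.annotations.Component;
import org.osgi.service.component.annotations.Reference;
import b.api.Worker;

@Component
public class Consumer {
    @Reference
    private Worker worker; // rebound by DS when B is redeployed
}
```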
Option 2
Prefer option 1; I do not recommend option 2, but it can work if for some reason you do not want to use OSGi services. Obtain the Bundle object for B inside bundle A, call bundleB.loadClass() to load the class, instantiate it, and cast the instance to the interface located in B.API.
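A minimal sketch of option 2, assuming a hypothetical symbolic name bundle.b, implementation class b.impl.WorkerImpl, and shared interface b.api.Worker (framework code, shown for shape only):

```java
import org.osgi.framework.Bundle;
import org.osgi.framework.BundleContext;
import b.api.Worker; // the stable interface, exported by B.API

public class WorkerLookup {
    // Loads the implementation class through B's *current* classloader,
    // so a redeployed B yields the new class version.
    public static Worker lookup(BundleContext ctx) throws Exception {
        for (Bundle b : ctx.getBundles()) {
            if ("bundle.b".equals(b.getSymbolicName())) {            // hypothetical name
                Class<?> cls = b.loadClass("b.impl.WorkerImpl");     // hypothetical class
                return (Worker) cls.getDeclaredConstructor().newInstance();
            }
        }
        throw new IllegalStateException("bundle B not found");
    }
}
```

The cast succeeds because only the implementation class comes from B's classloader, while the Worker interface is loaded from the stable B.API bundle that both A and B import.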

Related

How to use two versions of the same JAR in two bundles of OSGi?

I have an OSGi setup with maven.
The parent bundle's pom.xml has a dependency on JAR A, which in turn depends on JAR B (version 1).
The parent has two child bundles. Child bundle 1 uses JAR B version 1.
Child bundle 2 requires JAR B version 2.
The JARs are not backward compatible, so I cannot simply upgrade JAR B from version 1 to 2.
I need to use version 1 of the JAR in bundle 1 and version 2 of the same JAR in bundle 2, without any class-loading errors.
Right now, one of the bundles fails during class loading with a NoClassDefFoundError because two versions are in play.
How do I resolve this dependency using OSGi?
One of the advantages of OSGi (arguably the most important one) is that you can use different versions of the same library in different bundles. In the few cases where that is a problem, you would get either a loader constraint violation or a ClassCastException. The OSGi troubleshooting doc for the Vespa search engine shows a few examples. This happens if one of your bundles calls an API in another bundle involving classes from JAR B (as argument or return value), and the two bundles use different versions of B.
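To make that failure mode concrete, here is a small self-contained sketch (plain Java, no OSGi; assumes the file lives in the default package so the class-file resource path is simply "Demo$Probe.class"). The "same" class defined by two classloaders is two distinct runtime types, which is exactly what produces the ClassCastException and failed instanceof checks:

```java
import java.io.IOException;
import java.io.InputStream;

public class Demo {
    public static class Probe {} // the class we will "reload"

    // A loader that re-defines Probe itself instead of delegating to its parent,
    // simulating a second bundle classloader after a redeploy.
    public static class IsolatingLoader extends ClassLoader {
        public IsolatingLoader() { super(Demo.class.getClassLoader()); }

        @Override
        protected Class<?> loadClass(String name, boolean resolve) throws ClassNotFoundException {
            if (name.equals("Demo$Probe")) {
                try (InputStream in = getParent().getResourceAsStream("Demo$Probe.class")) {
                    byte[] bytes = in.readAllBytes();
                    return defineClass(name, bytes, 0, bytes.length);
                } catch (IOException e) {
                    throw new ClassNotFoundException(name, e);
                }
            }
            return super.loadClass(name, resolve);
        }
    }

    public static void main(String[] args) throws Exception {
        Class<?> reloaded = new IsolatingLoader().loadClass("Demo$Probe");
        // Same fully-qualified name, different defining loader => different type:
        System.out.println(reloaded == Probe.class);          // prints false
        System.out.println(reloaded.isInstance(new Probe())); // prints false
    }
}
```

In the JVM, a class's identity is the pair (fully-qualified name, defining classloader), which is why embedding or properly versioning the library, rather than letting two copies leak across a bundle boundary, is the fix.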
In your case, the problem is that either the failing bundle is missing an Import-Package, and/or there is no bundle exporting the required version of the package with the missing class. You can examine the bundles' manifest.mf to verify.
The simplest and safest way to solve this is to embed the desired version of B into each of your child bundles. This way, your child bundles will use their own version of B, living in separate class loaders.
You don't mention how you package your bundles, but if you're using the maven-bundle-plugin, it has an Embed-Dependency config option. Please note that you also have to embed the transitive dependencies of B by using the Embed-Transitive directive to ensure that all code used inside B is available at runtime.
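For reference, a maven-bundle-plugin configuration along these lines in each child bundle's pom.xml (the artifactId jar-b is a placeholder for your actual JAR B coordinates):

```xml
<plugin>
  <groupId>org.apache.felix</groupId>
  <artifactId>maven-bundle-plugin</artifactId>
  <extensions>true</extensions>
  <configuration>
    <instructions>
      <!-- embed JAR B plus everything it needs at runtime -->
      <Embed-Dependency>jar-b</Embed-Dependency>
      <Embed-Transitive>true</Embed-Transitive>
    </instructions>
  </configuration>
</plugin>
```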
Another solution is possible if both versions of B are packaged as OSGi bundles, and each exports the package (of the missing class) with a unique version number. (Again, verify from looking at B's manifest.mf). Then, you can deploy both B bundles, and have each of your child bundles import the correct version.
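Assuming the exported package is com.example.b (a placeholder), child bundle 1's MANIFEST.MF would pin the version range like this:

```
Import-Package: com.example.b;version="[1.0,2.0)"
```

while child bundle 2 would declare version="[2.0,3.0)". The framework then wires each child to the matching exporter, and the two versions coexist in separate classloaders.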

Play dependency injection reload

I am developing a Play Framework (version 2.5.9) application in Scala using the proposed dependency injection mechanism (https://www.playframework.com/documentation/2.5.x/ScalaDependencyInjection) with Guice.
The application uses a custom module (published as a jar and imported as an external dependency) that has singleton classes (A, B) which use other classes (1, 2) injected through the dependency injection mechanism.
The same injected classes (1, 2) are used in the current application in the same way.
So for example:
Class A belonging to the custom module uses class 1 that is provided by dependency injection
Class C belonging to the application currently developed uses class 1 that is provided by dependency injection
This set up works fine until changes in the code during the development process triggers sbt to recompile and reload the application.
When this happens, the application is restarted, but the classes in the module are not given new instances of the injected classes (e.g. 1). So if, for example, class 1 is the database trait in Play, the instance held by the module stops working, forcing the developer to restart the application from scratch.
Is there any way to prevent this, and to have sbt/play "autoreload" the application in a way that keeps working correctly?
I hope I've made my problem somehow clear :)

How to deploy two cross-dependent EARs in JBoss 7 in order to prevent ClassCastExceptions?

I have a problem when deploying two ear files in Jboss 7 and would be thankful for your help.
Following scenario:
EAR 1 contains EJBs which are looked up by EAR 2 (also) at server startup. Thus, EAR 1 has to be deployed before EAR 2 (via jboss-deployment-structure.xml dependency settings).
Right after being deployed, EAR 1 also needs access to classes contained in EAR 2 because of Hibernate and JNDI related class loading (among others).
But as EAR 2 isn't deployed at that time, there's a need for EAR 1 to contain a client-jar file of EAR 2.
Now, the problem is that in the course of EAR 1 and EAR 2 configuration (at server startup) ClassCastExceptions occur because...
(non-EJB) Java object obj1, whose class C was loaded by the classloader of EAR 1, is bound in JNDI
and after being looked up, supposed to be cast to object obj2 whose class C was loaded by the classloader of EAR 2
Now I wonder, if there's a possibility that these common classes of EAR 1 and EAR 2 are being loaded with the same classloader in JBoss 7. I already tried to put them in a server module, which didn't work out.
Thanks a lot for your help in advance!
PS: I'm aware of the poor design declared above. But due to restrictions, I have to follow up on it.
To avoid class cast exceptions, the common libraries need to be put in a classloader that is common to all applications in the two EARs, and no other copies of those libraries should exist in each application.
If it's an option to use only one EAR instead of two, put all the WARs inside a single EAR, remove the common classes from the WARs and put them on the EAR library folder.
This will associate the common classes with the EAR classloader, which is shared by all the applications running inside an EAR, and that would solve the problem.
If you must use two EARS, then the common classes need to be put in a classloader at the level of the server shared by all applications.
JBoss 7 uses a modular classloading system (JBoss Modules) that allows you to create an isolated module from a few jars and link it to an application. So try this:
create a module with the common library AND its dependencies, see instructions here
link the module to all applications
remove the copies of those libraries from all applications and leave them only at the module
It will work fine, but don't forget to put the dependencies of the common library in the module as well otherwise it will not work, the dependencies need to be visible at the level of the module, and there can be no duplicate jars between module and applications (otherwise class cast exceptions might happen).
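A sketch of the module approach, with placeholder names (module name com.example.shared, jar names illustrative). The module descriptor goes under $JBOSS_HOME/modules/com/example/shared/main/module.xml:

```xml
<!-- $JBOSS_HOME/modules/com/example/shared/main/module.xml -->
<module xmlns="urn:jboss:module:1.1" name="com.example.shared">
  <resources>
    <resource-root path="shared-classes.jar"/>
    <!-- jars for the shared library's own dependencies go here too -->
  </resources>
</module>
```

and each EAR declares a dependency on it in META-INF/jboss-deployment-structure.xml:

```xml
<jboss-deployment-structure>
  <deployment>
    <dependencies>
      <module name="com.example.shared" export="true"/>
    </dependencies>
  </deployment>
</jboss-deployment-structure>
```

With this in place, both EARs see the common classes from the single module classloader, so the JNDI cast succeeds.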
Another, far less recommendable, way that would also work is to put the common classes in a server-wide classloader, see this diagram. In plain Tomcat that folder is $CATALINA_HOME/lib (JBoss's web container is internally a Tomcat fork); in JBoss 7 the equivalent is a global module visible to all deployments.
Especially if there are other applications running on that server, don't try the second option. It could also help to have a look at a tool I built for this type of issue, JHades.

GWT complains "RequestFactory ValidationTool must be run" on a sub-module (project) when launching the main project

GWT 2.5.0/Google Plugin for Eclipse/m2e/GWT-maven-plugin 2.5.0/Request Factory
I configured my project according to the GWT wiki page on working with Maven, and it works pretty well, but I have some trouble with my sub-project.
Suppose I have two projects, A and B. A is a standard GWT project; B is a sub-project with one GWT module that contains some common UI widgets, common entity proxies, and a RequestFactory. Project A depends on project B through a Maven dependency, and A's Module.gwt.xml also inherits B's module.
The problem is that when I try to launch project A using GPE, it complains:
The RequestFactory ValidationTool must be run for the … XXXRequestFactory type
where XXXRequestFactory is in project B. I have to close project B in Eclipse so it stops complaining, which is cumbersome when I want to modify something in B that is used in A: to see the changes I have to close B, check the result, then open B again and make more changes...
I wonder if there is a way to solve this problem so my life would be easier.
Thanks.
One more thing: I also use maven-processor-plugin and build-helper-maven-plugin in project B, and I make sure the goals run when I call mvn install on B, but it doesn't seem to help.
I also had this problem, and here is the solution that fixed it. This answer assumes that you need to run the GWT app in dev mode (as you mentioned you tried with the GWT Eclipse plugin).
As you may already know, RequestFactory must validate the interfaces, domain types and proxies before execution. So you need to enable annotation processing, which creates mapping data for the server-side components in addition to performing that validation. If this process does not succeed, it throws the error you mentioned.
You can enable RequestFactory validation for project B in the project properties: go to the compiler properties and enable annotation processing, providing the path to requestfactory-apt.jar. After this, when you compile the project, you will see an .apt_generated folder in your project home directory containing the mapping files. If you open one of them, you can see the generated mappings for your proxies.
Launch the application (project A in your case) and it should run without any errors
In the Maven world, you have to declare a dependency on this apt jar. In addition, you might get compiler errors in the generated classes when running mvn compile; to resolve that, simply delete the contents of .apt_generated.
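In Maven terms, the apt jar dependency mentioned above looks roughly like this (the version should match your GWT version, 2.5.0 per the question):

```xml
<dependency>
  <groupId>com.google.web.bindery</groupId>
  <artifactId>requestfactory-apt</artifactId>
  <version>2.5.0</version>
  <scope>provided</scope>
</dependency>
```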

jaxws and EclipseLink refuses to use enums from lib

I have a Maven multi-module project with an EJB module containing a webservice, a lib, and a batch app. The batch app and the EJB module share some enums, which are therefore located in the lib module. When attempting to return one of these enums from the lib in a webservice method, it claims there are no valid EJBs in the EJB jar file. Also, when using another of these enums as an attribute in a JPA entity with @Enumerated(EnumType.STRING), I get an error saying
"...is not a valid type for an enumerated mapping. The attribute must be defined as a Java enum."
I am simply wondering why using these enums in this way is a problem? Are there any workarounds besides declaring them twice?
I ran into the same problem, and it was because I was testing with Arquillian and for some reason had forgotten to add the package containing the actual enum to the ShrinkWrap archive.
So maybe there is something preventing the persistence provider (eclipselink in my case) from seeing your enum class. That's what I would bet is happening in your case, because you have multiple modules.
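For the Arquillian case, the fix is simply making sure the enum's package goes into the deployment archive; a fragment along these lines (package and resource names are placeholders):

```java
import org.jboss.shrinkwrap.api.ShrinkWrap;
import org.jboss.shrinkwrap.api.spec.JavaArchive;

JavaArchive jar = ShrinkWrap.create(JavaArchive.class, "test.jar")
    .addPackage("com.example.enums")                 // package containing the enum
    .addAsResource("META-INF/persistence.xml");      // so EclipseLink sees the entities
```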
I had the same problem with a project I was working on. I have a common bundle that holds the general interfaces (and enums), which the persistence bundle did not recognise. As a result, I got the above exception (even though the persistence bundle had dependencies on the common bundle through imported packages).
I solved this problem by including the common bundle in the Java build path of the persistence bundle:
project -> project properties -> Java build path / Projects ;//add the bundle that contains the enums here