GWT complains "The RequestFactory ValidationTool must be run" for a sub-module (project) when launching the main project

GWT 2.5.0/Google Plugin for Eclipse/m2e/GWT-maven-plugin 2.5.0/Request Factory
I configured my project according to the GWT wiki page on working with Maven, and it works pretty well, but I have some trouble with my sub-project.
Suppose I have two projects, A and B. A is a standard GWT project; B is a sub-project with a single GWT module that contains some common UI widgets, common entity proxies, and a RequestFactory. Project A depends on project B through a Maven dependency, and A's Module.gwt.xml also inherits B's module.
The problem is that when I try to launch project A using GPE, it complains:
The RequestFactory ValidationTool must be run for the … XXXRequestFactory type
where XXXRequestFactory lives in project B. If I close project B in Eclipse the error goes away, but this is cumbersome: whenever I want to modify something in B that is used in A and see the changes, I have to close B to see the changes, then reopen B to make further changes...
I wonder if there is a way to solve this problem so my life would be easier.
Thanks.
One more thing: I also use the maven-processor-plugin and build-helper-maven-plugin in project B and make sure their goals run when I call mvn install on B, but that doesn't seem to help.

I also had this problem, and here is the solution that fixed it for me. This answer assumes you need to run the GWT app in dev mode (as you mentioned, you tried with the GWT Eclipse plugin).
You may already know this: RequestFactory must validate the interfaces, domain types, and proxies before execution. So you need to enable annotation processing, which creates the mapping data for the server-side components in addition to performing that validation. If this step does not succeed, you get the error you mentioned.
You can enable RequestFactory validation for project B in the project properties: go to the Java compiler settings, enable annotation processing, and provide the path to requestfactory-apt.jar. After that, when you compile the project, you will see an .apt_generated folder in the project's root directory containing the mapping files; open one of them and you can see the generated mappings for your proxies.
Launch the application (project A in your case) and it should run without any errors.
In the Maven world you have to declare a dependency on this apt jar. You might also get compiler errors in those generated classes when running mvn compile; to resolve that, simply delete the contents of .apt_generated.
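For reference, a minimal dependency declaration for the apt jar looks roughly like this (the version is assumed to match the GWT 2.5.0 mentioned in the question):

<!-- Annotation processor that generates the RequestFactory mapping data;
     provided scope keeps it off the runtime classpath -->
<dependency>
  <groupId>com.google.web.bindery</groupId>
  <artifactId>requestfactory-apt</artifactId>
  <version>2.5.0</version>
  <scope>provided</scope>
</dependency>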

Related

Integrate GWT with maven-spring without the Google Plugin for Eclipse

I am facing this weird requirement where I am supposed to create a web page using GWT widgets in an existing Spring/Maven web project, but corporate security doesn't allow me to install any Eclipse plugins. I have the latest SDK, but that's about it.
Is there any way to achieve this?
The Google Plugin for Eclipse (GPE), just like so many other Eclipse plugins, is not mandatory; it's just an aid.
But first, if "the corp security doesn't allow me to install any plugin" only means you're not allowed to use the Eclipse Marketplace or contact update sites, it's worth mentioning that you can download the update site as a ZIP to be used locally: https://developers.google.com/eclipse/docs/install-from-zip
If that isn't allowed either, then let's look at the features provided by the GPE and how you can possibly do the same without the plugin:
Wizard for creating new projects: you're in a Maven project, so you're not concerned.
Running and debugging: you can do the same with a Java Application launcher. Choose com.google.gwt.dev.DevMode as the main class, add the com.google.gwt:gwt-dev JAR to the classpath if needed (you can also add it as a dependency with scope provided and ignore the warning printed by the gwt-maven-plugin), add your source folders to the classpath, and pass the appropriate arguments (see the launch-configuration sketch after this list).
Wizards: let's be honest, they won't boost your productivity that much.
GWT Compilation: you can do the same with a Java Application launcher. Choose the com.google.gwt.dev.Compiler as the Main Class, add gwt-dev and your source folders to the classpath and pass the appropriate arguments.
Editors: you'll lose the formatting and highlighting of JSNI methods, as well as reference checking of your JSNI references, the auto-complete in UiBinder, and validation of UiBinder and ClientBundle references. All those will be done only when you GWT-compile your project.
RPC: you'll lose the validation of your RPC interfaces and quick-fix to keep your sync and async interfaces in sync. Validation will be done only when you GWT-compile your project.
JUnit: you can do the same with a JUnit launcher: just make sure you add gwt-dev and your source folders to the classpath, and pass the appropriate options as a gwt.args system property (see “Passing Arguments to the Test Infrastructure” in the docs).
Nothing insurmountable.
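To illustrate the "Running and debugging" item above, a stripped-down Java Application .launch file could look roughly like this; the project name, module name, and arguments are placeholders, not values taken from the question:

<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!-- Plain JDT "Java Application" launch configuration; no Google Plugin for Eclipse involved -->
<launchConfiguration type="org.eclipse.jdt.launching.localJavaApplication">
  <!-- DevMode entry point provided by gwt-dev -->
  <stringAttribute key="org.eclipse.jdt.launching.MAIN_TYPE" value="com.google.gwt.dev.DevMode"/>
  <stringAttribute key="org.eclipse.jdt.launching.PROJECT_ATTR" value="my-gwt-webapp"/>
  <!-- war folder, startup page and module name are examples only -->
  <stringAttribute key="org.eclipse.jdt.launching.PROGRAM_ARGUMENTS" value="-war src/main/webapp -startupUrl index.html com.example.app.MyModule"/>
  <stringAttribute key="org.eclipse.jdt.launching.VM_ARGUMENTS" value="-Xmx1g"/>
</launchConfiguration>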

In Eclipse, how do I exclude the test folder of one Maven project from another project that has it as a dependency?

I have two maven projects imported into Eclipse in the same workspace. Both have a class with the same name and package, but different implementations and in different locations in each project. Let's call this class com.namespace.Factory
Project A has Factory under its test folder, i.e: /src/test/java/com/namespace/Factory.java
Project B has Factory under its main source folder, i.e: /src/main/java/com/namespace/Factory.java
There is also a Project C which depends on both. Project C also uses the Factory from Project B for some of its unit tests. The problem now is that Eclipse can't compile Project C because it can't differentiate between the two Factory classes. If I build all the projects on the command line, there are no issues.
You would think that Eclipse would ignore the Project A Factory class since it is in test.
I am using the m2e plugin. My current workaround is to configure m2e not to resolve Project C's dependencies within the workspace. This forces it to download the jar, which does not contain the test classes. However, this means that whenever I change something in A or B, I have to manually install A or B to push the latest jar to the local repo, and then update Project C's dependencies to pull down the latest jars.
Is there a way to exclude the Project A test folder from the build path in Project C so that I can continue resolving everything within the workspace? It feels like Eclipse is breaking something that is fundamental to maven projects.
I think you're just another user affected by the upstream bug: https://bugs.eclipse.org/bugs/show_bug.cgi?id=376616
To sum it up, the bug report discussion includes:
JDT implements just one buildpath per Eclipse project. This is very unlikely to change, since this was fundamental design choice and many APIs and implementation details rely on that.
Well, but that doesn't really answer your question I guess.
So I see multiple options here, depending on how much influence you might have on the projects:
either try to rename one of the classes => names would be unique
or, if the classes contain basically the same functionality, play with dependencies between projects, or even create a new one that the other two would depend upon
that's pretty much what comes to my mind right now
Perhaps you can try this:
In project A's Properties dialog (get there by right clicking the project and then click Properties),
click Deployment Assembly on the left.
Eclipse will show all source folders.
Select the test folder (/test) and click Remove.

Sharing RemoteService implementations between 2 GWT projects

I have been struggling for a while now trying to reuse the RemoteService implementation from one GWT project in a new one.
Here's the big picture:
I have a working smartgwt-mobile project and we now decided we wanted a desktop version of the same project, using regular smart-gwt. The GUI of this new app will obviously be different but the server side code will be exactly the same.
I tried to just "borrow" the RemoteService interface, its async counterpart, and the whole server package, either by linking the package folders from the other project into the new source structure (I am using Eclipse with the GWT plugin) or by adding the borrowed code path as a filtered source folder on the build path. While this satisfies the Eclipse dependency checker, the GWT compiler is unable to find the borrowed code and suggests I need to add "inherits" declarations to the module .gwt.xml file.
When I do this and recompile, it then expects a second module .gwt.xml file in the root of the borrowed code, which is not acceptable because it would affect the other project.
I have been reading up on the GWT module documentation but I fail to see how to implement such a scheme. It may actually be impossible to do what I am trying to achieve.
I would be willing, if that solves the problem, to create a third project that simply defines a GWT RemoteService module that then will be inherited by both the mobile and desktop smartgwt projects.
Does anybody have suggestions about how to tackle this issue?
I'd agree with the "third project that simply defines a GWT RemoteService module that then will be inherited by both the mobile and desktop smartgwt projects" approach.
Why? Because I'm already doing exactly this. In my case it is a DAO project (DB layer) that holds all my DB business-logic methods.
And it's always better to maintain a separate DAO layer to expose your data to services (e.g. web services).
So here's how I solved the issue.
The problem with linking to an existing GWT project's source folders is that the GWT compiler always (at least, that's what it looks like) expects to find a GWT module definition file (.gwt.xml). I have not been able to link in the source folders in such a way that the GWT compiler is happy, even though the Eclipse dependency resolver has no problem with it.
So I created a third project using the GWT Eclipse plugin. I unchecked the "Create Sample Code" option, so I ended up with an empty GWT project. I then selected 'Add' > 'New' > 'Other' > 'Google' > 'Module', entered a module name, e.g. 'myModule', a package name, e.g. 'com.myCompany.myModule' and clicked 'Finish'. The GWT New Module wizard created the package and a child package under it named 'com.myCompany.myModule.client' and I created 'com.myCompany.myModule.server' myself.
Now I copied the RemoteService and related classes (the implementation and the Async version), plus all the server-side code that the RemoteService calls, from the original project I wanted to borrow from and pasted them into the new project. Very soon I had all dependencies satisfied. I then opened the Build Path dialog of the new Smart-GWT web app project and included the GWT RemoteService module project on the Projects tab. The last thing to do was adding an inherits element to the app's .gwt.xml file:
<inherits name='com.myCompany.myModule.MyModule'/>
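For completeness, the shared module's own descriptor (com/myCompany/myModule/MyModule.gwt.xml, created by the wizard) ends up looking something like the sketch below; the exact inherits entries depend on what the shared code actually uses:

<?xml version="1.0" encoding="UTF-8"?>
<module>
  <!-- Base GWT module; add further inherits as the shared code requires -->
  <inherits name='com.google.gwt.user.User'/>
  <!-- Only the client package is translatable; server-side code stays outside of it -->
  <source path='client'/>
</module>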
Voilà: that's all there is to it. If you select 'GWT Compile Project', it compiles and runs in dev mode without warnings.
I now still have to delete the shared code from the first project and inherit from the module, but that is simply a repetition of what I already did.
In the end this was much less painful than I imagined it would be, so I recommend this approach.

What are best practices for using Hibernate's hbm2java?

I am using Hibernate, Maven, and Eclipse (STS build) to build a project. I'm using hbm.xml files to specify my schema. I want to use Hibernate's hbm2java to generate my model classes. I have it working well and generating the kind of code I want.
It runs perfectly from the command line, generating the model code and then building and testing as expected.
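For context, the generation step is wired into the Maven build roughly like this (plugin version, output path, and configuration file location are illustrative, not copied from my pom):

<!-- hbm2java bound to generate-sources so the POJOs exist before compilation -->
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>hibernate3-maven-plugin</artifactId>
  <version>2.2</version>
  <executions>
    <execution>
      <phase>generate-sources</phase>
      <goals>
        <goal>hbm2java</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <components>
      <component>
        <name>hbm2java</name>
        <outputDirectory>target/generated-sources/hibernate3</outputDirectory>
      </component>
    </components>
    <componentProperties>
      <!-- hibernate.cfg.xml that lists the hbm.xml mapping files; path is an assumption -->
      <configurationfile>src/main/resources/hibernate.cfg.xml</configurationfile>
    </componentProperties>
  </configuration>
</plugin>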
However, Eclipse seems unable to handle it. It will periodically "lose its mind" and be unable to resolve very simple imports and classes referenced in my DAO classes, which are hand-coded. The things it can't find are classes like HibernateUtil. Ironically, it appears to not have any trouble finding the model classes.
The unresolved classes are in the target/classes/blah-blah folder at the end of the run, so they're apparently getting copied to the right place.
In a "continuous integration" environment, is it best to generate the sources once, commit them to my version control, and then disable code gen? Or is it possible to have the code generated each time, thus ensuring I pick up any database changes without human intervention?
IMHO, entities should be the core of your application, and should be designed, implemented and documented with care. They're supposed to be objects, with methods encapsulating behavior. Having them autogenerated is an absurdity, IMO.
Generating them at the very beginning might be an option to get you started, but once they've been generated, hand-craft them and don't generate them again. Add necessary properties and methods as the schema changes, and refactor existing code.
BTW, I really prefer using annotations for the mapping, because it's less verbose, less error-prone, and all the information is in a single place.
Try this:
From command line traverse to your project directory where the project's pom.xml is present and run:
mvn eclipse:clean eclipse:eclipse
If it says it is unable to find the eclipse plugin, then run:
mvn eclipse:install-plugin
first, and then try the command above again.
This way, all the Maven and project dependencies will also be resolved at the Eclipse level.
Let me know if this is not what you were looking for.

Eclipse: one classpath for compiling, another for launching

example:
For logging, my code uses log4j, but other jars my code depends on use slf4j instead, so both jars must be on the build path. Unfortunately, it is now possible for my code to use (depend on) slf4j directly, either through content assist or through some other developer's changes. I would like any use of slf4j to show up as an error, but my application (and its tests) will still need it on the classpath when running.
explanation:
I'd like to find out if this is possible in Eclipse. This scenario happens often for me: I'll have a large project that uses a lot of third-party libraries, and of course those third-party jars have their own dependencies as well. So I have to include all the dependencies in the classpath ("build path" in Eclipse) for the application and its tests to compile and run (from within Eclipse).
But I don't want my code to use all of those jars, just the few direct dependencies I've decided upon myself. So if my code accidentally uses a dependency of a dependency, I want it to show up as a compilation error. Ideally, as class not found, but any error would do.
I know I can manually configure the classpath when running outside of Eclipse, and even within Eclipse I can modify the classpath for a specific class I'm running (in the run configurations), but that's not manageable if you run a lot of individual test cases or have a lot of main() classes.
It sounds like your project has enough dependency relationships that you might consider structuring it with OSGi bundles (plug-ins). Each bundle gets its own classloader and gets to specify what bundles (and optionally what version ranges, etc.) it depends on, what packages it exports, whether it re-exports stuff from its dependencies, etc.
Eclipse itself is structured out of Eclipse plug-ins and fragments, which are just OSGi bundles with an optional tiny bit of additional Eclipse wiring (plugin.xml, which is used to declare Eclipse "extension points" and "extensions") attached. Eclipse thus has fairly good tooling for creating and managing bundles built-in (via the Plug-in Development Environment). Much of what you find out there may lead you to conflate "OSGi bundle" with "plug-in that extends the Eclipse IDE", but the two concepts are quite separable.
The Eclipse tooling does distinguish rather clearly (and sometimes annoyingly, but in the "helpful medicine" way) between the bundles in your build environment vs. the bundles that a particular run configuration includes.
After a few years of living in OSGi land, the default Java "flat classpath" feels weird and even kind of broken to me, largely because (as you've experienced) it throws all JARs into one giant arena and hopes they can sort of work things out. The OSGi environment gives me a lot more control over dependency relationships, and as a "side effect" also naturally demands clarification of those relationships. Between these clear declarations and the tooling's enforcement of them, the project's structure is more obvious to everyone on the team.
if my code accidentally uses a dependency of a dependency, I want it to show up as a compilation error. Ideally, as class not found, but any error would do.
Put your code in one plug-in, your direct dependencies in other plug-ins, their dependencies in other plug-ins, etc. and declare each plug-in's dependencies. Eclipse will immediately do exactly what you want. You won't be offered dependencies' dependencies' contents in autocompletes; you'll get red squiggles and build errors; etc.
Why not use access rules to keep your code clean?
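For instance, you can mark the packages you don't want your code to touch as forbidden on the library's classpath entry; a hand-edited .classpath excerpt would look roughly like this (jar path, version, and pattern are placeholders), and Eclipse then reports any direct reference to org.slf4j as a "forbidden reference" error while the jar stays available at run time:

<!-- Excerpt from .classpath: forbid direct references to org.slf4j from this project's sources -->
<classpathentry kind="lib" path="lib/slf4j-api-1.6.4.jar">
  <accessrules>
    <accessrule kind="nonaccessible" pattern="org/slf4j/**"/>
  </accessrules>
</classpathentry>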
It looks like this would be better managed with Maven, integrated into Eclipse with m2eclipse.
That way, you can execute only part of the Maven build lifecycle, and you can manage a separate set of dependencies per build step.
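For instance, declaring slf4j with runtime scope keeps it off the compile classpath, so a plain mvn compile fails on any direct use in your own code while the jar is still there for tests and for the packaged application (group/artifact/version below are only an example; note that Eclipse's flat project classpath may still offer the classes unless you also add an access rule):

<!-- log4j stays on the compile classpath; slf4j is only available at run/test time,
     so code that references org.slf4j directly fails to compile in the Maven build -->
<dependency>
  <groupId>log4j</groupId>
  <artifactId>log4j</artifactId>
  <version>1.2.17</version>
</dependency>
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-api</artifactId>
  <version>1.6.4</version>
  <scope>runtime</scope>
</dependency>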
In my experience it helps to be more restrictive: I made the team fill out (paper) forms explaining why each jar is needed and under what license...
and they would rather type a few lines of code than drag along 20 jars just to open a file with a single line of code, or for some other fancy 'feature'.
Using Maven can help for a while, but when you first spot jars with names like nightly-build or snapshot, you will know you're in jar hell.
Conclusion: choose your dependencies well.
Would using the slf4j-log4j12 binding jar be useful? It allows code to use slf4j while the actual logging goes to log4j.
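If you go that route, it is just one extra binding dependency, something like the following (the version is only an example):

<!-- SLF4J binding that routes slf4j calls to log4j at run time -->
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-log4j12</artifactId>
  <version>1.6.4</version>
  <scope>runtime</scope>
</dependency>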