Maven plugin loading classes

I have an application with legacy Struts actions extending org.apache.struts.StrutsActions. I would like to ensure that all of my classes extending StrutsActions carry a custom annotation.
To enforce this I have written a small Maven enforcer rule to validate the requirement. However, I don't know how to load my classes in my Mojo in order to validate them.
So far I have done something inelegant: I inject the outputDirectory and, with a custom class loader, recursively load all classes from my build folder.
Thanks

All classes? What do you mean by that? Maybe you mean target/classes/** (the default output location for compiled classes), or maybe a list of multiple directory tree locations?
Can you explain more clearly what your Mojo does and which phase and goal you want it bound to?
I think you may be applying Maven's build lifecycle to your project incorrectly. Could you explain better what your plugin does; perhaps it is doing "packaging" work?
But if I understand you correctly, you want the plugin's execution to pick up the additional classpath entry for target/classes/**, so that it can load code and resources from the project itself to change some dynamic behaviour inside the Maven plugin?
The default way to do this is a <dependency>, but of course that requires a fixed artifact.
Other plugins that allow for this behaviour (like maven-antrun-plugin) provide a mechanism to change the classpath inside the Mojo, driven by the <configuration> section of the pom.xml. It is not clear whether the plugin you are using is a stock Maven one or one you have written.
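To illustrate the antrun mechanism just mentioned: it exposes the project's classpaths to its Ant tasks as references, so a sketch along these lines (com.example.Validator is a hypothetical project class) can run code straight out of target/classes:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-antrun-plugin</artifactId>
    <executions>
        <execution>
            <phase>process-classes</phase>
            <goals>
                <goal>run</goal>
            </goals>
            <configuration>
                <target>
                    <!-- maven.compile.classpath is a classpath reference antrun provides -->
                    <java classname="com.example.Validator"
                          classpathref="maven.compile.classpath" fork="true"/>
                </target>
            </configuration>
        </execution>
    </executions>
</plugin>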
Validation during packaging: that is a valid use case. But I question why it needs the "classpath"; I would guess you are binding to the process-classes phase.
That is, the classpath exists to provide code/resources for the runtime to execute, but in your case you have an input directory rather than a classpath requirement.
It is possible in a Mojo to set up a directory scanner over an input directory for **/*.class files and then, using a bytecode library, open each file and inspect its annotations without loading the class.
This also gives a good separation between unreliable input data and the consistent behaviour of the plugin code itself. What happens if a project decides to implement the same package and/or class as one used in the implementation of the plugin itself?
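As a sketch of that inspect-without-loading approach, assuming the ASM library (the annotation descriptor com/example/RequiresAudit is a placeholder for your own annotation):

import org.objectweb.asm.*;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.*;
import java.util.stream.Stream;

public class AnnotationScan {

    // Reads one .class file with ASM and reports whether it carries the
    // given class-level annotation, without ever loading the class.
    static boolean hasAnnotation(Path classFile, String annotationDesc) {
        try {
            final boolean[] found = {false};
            new ClassReader(Files.readAllBytes(classFile))
                    .accept(new ClassVisitor(Opcodes.ASM9) {
                        @Override
                        public AnnotationVisitor visitAnnotation(String desc, boolean visible) {
                            if (desc.equals(annotationDesc)) {
                                found[0] = true;
                            }
                            return null;
                        }
                    }, ClassReader.SKIP_CODE);
            return found[0];
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) throws IOException {
        String desc = "Lcom/example/RequiresAudit;"; // placeholder descriptor
        try (Stream<Path> classes = Files.walk(Paths.get("target/classes"))) {
            classes.filter(p -> p.toString().endsWith(".class"))
                   .filter(p -> !hasAnnotation(p, desc))
                   .forEach(p -> System.out.println("missing annotation: " + p));
        }
    }
}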
UPDATE: If you really are loading the classes you are checking into the JVM from your Mojo, then at least implement your own ClassLoader to do it; this is not necessarily a simple problem to solve. Make that ClassLoader find things specified by configuration in the input directory.
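If you do go the class-loading route, a minimal starting point is a URLClassLoader rooted at the configured output directory (a sketch only; the isolation concerns above are where the real work is):

import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;

public final class OutputDirClassLoaders {

    // Builds a loader over the project's output directory. The plugin's own
    // loader stays the parent, so project classes cannot shadow plugin classes.
    public static ClassLoader forOutputDirectory(File outputDirectory) throws Exception {
        URL url = outputDirectory.toURI().toURL(); // e.g. target/classes/
        return new URLClassLoader(new URL[] { url },
                OutputDirClassLoaders.class.getClassLoader());
    }
}

// Usage inside the rule (class name hypothetical):
// Class<?> action = forOutputDirectory(outputDir).loadClass("com.example.MyAction");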

I have done it with the help of the Reflections library:
<dependency>
    <groupId>org.reflections</groupId>
    <artifactId>reflections</artifactId>
    <version>0.9.5</version>
</dependency>
My implementation looks like this:
public void execute(EnforcerRuleHelper helper) throws EnforcerRuleException {
    URL url;
    try {
        // Resolve target/classes of the current project; getURL(..) is this rule's own helper.
        url = getURL(helper.evaluate("${project.build.outputDirectory}"));
    } catch (ExpressionEvaluationException e) {
        throw new EnforcerRuleException("Cannot resolve project.build.outputDirectory", e);
    }
    // Restrict scanning to the configured supertype and to the required annotation.
    Predicate<String> subtypeFilter = new FilterBuilder().include(getTargetSuperTypeForSubtypes());
    Predicate<String> annotationFilter = new FilterBuilder().include(getMustHaveAnnotation());
    Reflections reflections = new Reflections(new ConfigurationBuilder()
            .setScanners(
                    new TypeAnnotationsScanner().filterResultsBy(annotationFilter),
                    new SubTypesScanner().filterResultsBy(subtypeFilter))
            .setUrls(url));
    validate(reflections);
}
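For completeness, a custom rule like this is wired into the build through the maven-enforcer-plugin; the coordinates, rule element name, and implementation class below are placeholders for whatever your rule project defines:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-enforcer-plugin</artifactId>
    <dependencies>
        <dependency>
            <groupId>com.example</groupId>
            <artifactId>custom-enforcer-rules</artifactId>
            <version>1.0</version>
        </dependency>
    </dependencies>
    <executions>
        <execution>
            <id>enforce-annotated-actions</id>
            <goals>
                <goal>enforce</goal>
            </goals>
            <configuration>
                <rules>
                    <annotatedActionsRule implementation="com.example.AnnotatedActionsRule"/>
                </rules>
            </configuration>
        </execution>
    </executions>
</plugin>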

Related

How to create a custom annotation processor for Eclipse

I'm trying to create a custom annotation processor that generates code at compile time (as hibernate-jpamodelgen does). I've looked around the web and found custom annotation processors that work with Maven, but they do nothing when added under Eclipse's Annotation Processing > Factory Path option. How can I create a processor that is compatible in this way? I have not found a tutorial that works.
My idea is, for example, to annotate an entity and automatically generate a base DTO, a base mapper, etc. that can be extended for use in the final code.
Thank you all
OK, I already found the problem. The tutorial I had found didn't specify that, for the compiler to be able to apply the annotation processor, there must be a META-INF/services/javax.annotation.processing.Processor file containing the fully qualified class name of the processor (or processors).
I created the file pointing to my processor class, generated the jar, added it to Annotation Processing > Factory Path, and everything worked correctly.
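For reference, the registration amounts to a service file naming the processor class, plus a processor along these lines (all names here are hypothetical):

// src/main/resources/META-INF/services/javax.annotation.processing.Processor
// contains the single line: com.example.processor.DtoGeneratingProcessor

package com.example.processor;

import java.util.Set;
import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.RoundEnvironment;
import javax.annotation.processing.SupportedAnnotationTypes;
import javax.annotation.processing.SupportedSourceVersion;
import javax.lang.model.SourceVersion;
import javax.lang.model.element.TypeElement;

@SupportedAnnotationTypes("com.example.GenerateDto") // hypothetical annotation
@SupportedSourceVersion(SourceVersion.RELEASE_8)
public class DtoGeneratingProcessor extends AbstractProcessor {

    @Override
    public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
        // Generate sources through processingEnv.getFiler() here.
        // Returning false leaves the annotations unclaimed, so processors
        // that run after this one still see them (ordering matters, see below).
        return false;
    }
}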
Just be careful to keep the processors in the correct order (for example, the Hibernate model generator claims the classes, so no further generation happens after it), and change the jar file name each time you replace the library (Eclipse seems to keep a cache). These two things gave me a good headache.
Thanks all

How to unkeep or force remove @javax.persistence.Transient annotated methods in JPA entities in ProGuard?

I'm extracting JPA entities into a separate fat/uber jar for an external system to use.
ProGuard is used through com.github.wvengen:proguard-maven-plugin to shrink all other unused code.
I want to keep all methods in the JPA entities except the ones annotated with @javax.persistence.Transient. I've found the "!transient" modifier for fields in ProGuard rules, but !@javax.persistence.Transient does not seem to work for methods :(
Can I achieve the same effect for methods in some other way?
Unfortunately I did not get an answer to this question and was not able to solve it with ProGuard + the Maven plugin directly, but I resolved the problem with one additional step before running ProGuard: I used Byte Buddy and its Maven plugin on a Maven phase that runs before the ProGuard Maven plugin, which then optimizes/removes the remaining unused code. See details about the Byte Buddy instrumentation step here: byte-buddy remove/strip methods
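The linked post covers the Byte Buddy route. As an illustration of the same pre-ProGuard idea, here is a hedged sketch using plain ASM instead: one pass records every method carrying @javax.persistence.Transient, and a second pass rewrites the class without them.

import org.objectweb.asm.*;
import java.util.HashSet;
import java.util.Set;

public class TransientMethodStripper {

    public static byte[] strip(byte[] classBytes) {
        final Set<String> doomed = new HashSet<>();
        // Pass 1: record every method annotated with @javax.persistence.Transient.
        new ClassReader(classBytes).accept(new ClassVisitor(Opcodes.ASM9) {
            @Override
            public MethodVisitor visitMethod(int access, String name, String desc,
                                             String sig, String[] exceptions) {
                return new MethodVisitor(Opcodes.ASM9) {
                    @Override
                    public AnnotationVisitor visitAnnotation(String adesc, boolean visible) {
                        if ("Ljavax/persistence/Transient;".equals(adesc)) {
                            doomed.add(name + desc);
                        }
                        return null;
                    }
                };
            }
        }, 0);
        // Pass 2: copy the class, dropping the recorded methods.
        ClassWriter cw = new ClassWriter(0);
        new ClassReader(classBytes).accept(new ClassVisitor(Opcodes.ASM9, cw) {
            @Override
            public MethodVisitor visitMethod(int access, String name, String desc,
                                             String sig, String[] exceptions) {
                if (doomed.contains(name + desc)) {
                    return null; // returning null omits the method from the output
                }
                return super.visitMethod(access, name, desc, sig, exceptions);
            }
        }, 0);
        return cw.toByteArray();
    }
}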

Setting a configuration that lives as a parent of the compile configuration in Gradle

I added a custom configuration in my plugin:
Configuration customCompile = project.configurations.create("customCompile")
.setVisible(false).setTransitive(true)
I want to do something like
configuration.compile.addExtendsFrom(customCompile)
So that in my plugin I can isolate certain dependencies to add to the classpath of something I'm running (via project.configurations.customCompile), while keeping them on the regular compile path as well.
What I did was this:
Configuration compile = project.configurations.getByName('compile')
Set updated = WrapUtil.asSet(compile.getExtendsFrom()) // mutable copy; getExtendsFrom() returns an immutable set
updated.add(customCompile)
compile.setExtendsFrom(updated)
It works, but it feels a little convoluted; extendsFrom seems to have the opposite direction from the inheritance I'm used to with Java classes. Is there a better way to do this?
a.extendsFrom(b) is analogous to "a inherits from b", and you can simply do configurations.compile.extendsFrom(customCompile). (Note: extendsFrom, not addExtendsFrom or getExtendsFrom.)

Can GWT analyze dependencies like Maven does?

How can I get GWT to provide the same dependency insights as mvn dependency:analyze?
Maven can report on dependencies (used undeclared dependencies and unused declared dependencies). I'd like GWT to do the same, because determining the missing inherits in my gwt.xml is proving difficult.
Is there a good way to analyze the state of GWT module dependencies?
Thanks
Peter
I'm not aware of any such tool, and while I think a utility to analyze and report on GWT dependencies could be interesting, I also think it would be difficult to define well.
Used, undeclared dependencies
Before trying to solve this, what is the problem? In Maven, this category means that a class is loaded from a dependency that isn't directly depended on but is instead pulled in transitively. This gets into the whole issue of transitive dependencies (which exist in GWT) and scopes (which don't). If A uses a class in C but only depends on B, which depends on C, then C will be listed among the "used, undeclared dependencies".
In GWT, however, we very rarely list every single dependency we use directly. Instead, we assume that transitive dependencies will stay transitive; we don't bother to inherit com.google.gwt.user.RemoteService for RPC as long as we already have com.google.gwt.user.User listed.
So how could we tell whether we are using an undeclared dependency, the kind that warns when we do a gwt:compile? Perhaps such a tool could find every .gwt.xml file on the classpath and read through its <source> and <super-source> rules, looking for somewhere that a class we are using is declared. Or, in the case of invoking GWT.create on something and getting back a non-concrete type, it could look for <replace-with> and <generate-with> rules. As long as your code already compiles in Java the classes are on the classpath, but you still run the risk that while the class is there, the .java or .gwt.xml files might not be.
Unused, declared dependencies
This seems like an easier problem: analyze the modules we are inheriting and look for any module that could be pruned out. Unfortunately, as the discussion above notes, we can't just look at the classes, the packages they are in, and which <source> and <super-source> elements are unused; we would also need to look at <replace-with> and <generate-with> rules. Consider something like com.google.gwt.user.RemoteService, which only adds a rule and some configuration details, or even com.google.gwt.user.RemoteServiceObfuscateTypeNames, which modifies only a single setting of the RemoteService module. If RemoteServiceObfuscateTypeNames were removed, everything would still compile, but there might now be information about your RPC classes compiled into your app that you don't expect to be there.
With these in mind, perhaps such a tool could watch all possible rebind rules in the current build, plus all configuration settings, properties, etc., and see whether any of those rules went unused during a gwt:compile. It could then indicate which modules had unused parts; and if a module (together with all of its inherited modules) were entirely unused, it could be shown to the user as removable.
One more important piece: order matters when defining <inherits> statements. If I add an inherits for com.google.gwt.logging.Logging and follow it with com.google.gwt.logging.LoggingDisabled, the logging classes will be on the source path and will compile, but will have no effect. If they are ordered the other way around, they will not only be on the source path but will all be functional. So any analysis of used and unused modules would also need to take into account transitive inherits statements and their order.

Error with Groovy AST transformations when cleaning project in Eclipse

I'm trying to work through Groovy's Implementing Local AST Transformations tutorial, but whenever I clean my project I get this error in each file that has the @WithLogging annotation in it:
Groovy:Could not find class for Transformation Processor AC.LoggingASTTransformation declared by AC.WithLogging
So you have a package named "AC" that contains both "WithLogging.groovy" and "LoggingASTTransformation.groovy" classes? Does it also contain any classes that implement the "WithLogging" interface?
If so, I'd suggest you move the class(es) that use your annotation to a location outside of the package that defines the annotation (the default package will suffice for diagnostic purposes); order of compilation matters with transformations. See this post on the Groovy users mailing list for more on that.
Also try changing the annotation from @WithLogging to @AC.WithLogging.
As far as cleaning in Eclipse is concerned, I had a similar issue and found that after a clean I had to make a trivial modification to any file that contained my annotation (i.e. add a space somewhere) and then save the file. That rebuilds everything properly.