AspectJ Maven Plugin <weaveDependency> - aspectj

I am trying to use the AspectJ Maven plugin in our project, which has multiple modules, following the instructions given at http://mojo.codehaus.org/aspectj-maven-plugin/weaveJars.html
I am using @AspectJ annotations. My aspect is in a separate Maven module with
artifactId - consumer
and the class whose method I want to intercept or advise is in
artifactId - producer
I have added the following configuration to the POM of the consumer module:
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>aspectj-maven-plugin</artifactId>
  <version>1.4</version>
  <configuration>
    <source>1.6</source>
    <target>1.6</target>
    <showWeaveInfo>true</showWeaveInfo>
    <weaveDependencies>
      <weaveDependency>
        <groupId>com.home.demo</groupId>
        <artifactId>producer</artifactId>
      </weaveDependency>
    </weaveDependencies>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>compile</goal>
      </goals>
    </execution>
  </executions>
</plugin>
I have also added producer as a dependency in the same POM.
When I run mvn clean install for the consumer module, the following appears in the console:
[INFO] [aspectj:compile {execution: default}]
[INFO] Join point 'method-execution(void com.home.demo.producer.messaging.MomServiceEndpointListener.handle(com.home.messaging.service.MessageContext, com.home.messaging.service.MessageContext))' in
Type 'com.home.demo.producer.messaging.MomServiceEndpointListener' (MomServiceEndpointListener.java:21) advised by before advice from 'com.home.demo.ods.app.OdsConsumer' (OdsConsumer.java:38)
But when I run the application, the aspect is not invoked.
I cannot figure out what I am missing.
I am also confused about which module the plugin configuration shown above should go in: consumer (where my aspects are) or producer.

The problem is that weaveDependencies act as sources only.
Your consumer module takes the original classes from the weaveDependencies (producer), weaves them with the aspects, and puts the woven classes into consumer's (!) target/classes.
Therefore, the producer artifact never knows about the aspects, and you keep using it unchanged.
You would have to rebuild the producer JAR from the classes in consumer/target/classes.
I don't find that convenient, so I gave up on using the plugin this way.
Also, several weaveDependencies will be merged into one heap of classes.
You are better off packaging the aspects in an external JAR dependency and putting the plugin configuration into producer itself, so the woven classes end up in the producer artifact.
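For illustration, here is a minimal sketch of that alternative in producer's pom.xml. It assumes the aspects are packaged in a separate, hypothetical com.home.demo:aspects artifact; the aspectLibraries element tells the plugin to weave producer's own classes against the aspects found in that dependency, so the woven classes land in the producer JAR:

<!-- producer/pom.xml (sketch only; the aspects artifact name and version are assumptions) -->
<dependencies>
  <dependency>
    <groupId>com.home.demo</groupId>
    <artifactId>aspects</artifactId>
    <version>1.0-SNAPSHOT</version>
  </dependency>
</dependencies>
<build>
  <plugins>
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>aspectj-maven-plugin</artifactId>
      <version>1.4</version>
      <configuration>
        <source>1.6</source>
        <target>1.6</target>
        <showWeaveInfo>true</showWeaveInfo>
        <aspectLibraries>
          <aspectLibrary>
            <groupId>com.home.demo</groupId>
            <artifactId>aspects</artifactId>
          </aspectLibrary>
        </aspectLibraries>
      </configuration>
      <executions>
        <execution>
          <goals>
            <goal>compile</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>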

Related

Apache CXF using Eclipse: A required class was missing while executing org.apache.cxf:cxf-codegen-plugin:3.2.0:wsdl2java

When trying to build our project from within Eclipse I keep getting the following error:
Execution generate-sources of goal
org.apache.cxf:cxf-codegen-plugin:3.2.0:wsdl2java failed: A required
class was missing while executing
org.apache.cxf:cxf-codegen-plugin:3.2.0:wsdl2java:
javax/xml/bind/annotation/adapters/HexBinaryAdapter
The reason is that, while we still compile for a Java 8 target environment, the tool chain (i.e. Eclipse, M2E (Eclipse's Maven plugin), Maven, and CXF) is executed using Java 11.
In Java 9+, javax.xml.bind is no longer part of the default runtime, hence the class is missing when the plugin tries to start up. Elsewhere I found that one can enable it by specifying the "--add-modules java.xml.bind" JVM option.
I tried adding that option to the MAVEN_OPTS environment variable, but that is apparently ignored when M2E starts up Maven (and with it the CXF plugin) in a separate VM.
Next I tried to specify that option in the plugin's configuration in the pom.xml like so:
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.cxf</groupId>
      <artifactId>cxf-codegen-plugin</artifactId>
      <version>${cxf.version}</version>
      <executions>
        <execution>
          <id>generate-sources</id>
          <phase>generate-sources</phase>
          <configuration>
            <fork>true</fork>
            <additionalJvmArgs>--add-modules java.xml.bind</additionalJvmArgs>
            ...
          </configuration>
          <goals>
            <goal>wsdl2java</goal>
          </goals>
        </execution>
      </executions>
      ...
... but that also didn't fly. :-(
Does anyone have an idea how and where one can specify that option, or how I can make the former standard javax classes available to a Maven plugin running under Java 9+ (when executed from Eclipse/M2E)?
Just in case: this is NOT an Eclipse or M2E issue! Even when I start Maven on the command line using Java 9+ I get:
...
[ERROR] Failed to execute goal org.apache.cxf:cxf-codegen-plugin:3.2.0:wsdl2java (generate-sources) on project my_project: Execution generate-sources of goal org.apache.cxf:cxf-codegen-plugin:3.2.0:wsdl2java failed: A required class was missing while executing org.apache.cxf:cxf-codegen-plugin:3.2.0:wsdl2java: javax/xml/bind/annotation/adapters/HexBinaryAdapter
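For what it's worth, one workaround often used for this class of problem (not part of the original post; the jaxb-api coordinates and version below are my assumptions) is to put the missing JAXB classes on the plugin's own classpath via a plugin-level <dependencies> block, which does not rely on --add-modules at all:

<plugin>
  <groupId>org.apache.cxf</groupId>
  <artifactId>cxf-codegen-plugin</artifactId>
  <version>${cxf.version}</version>
  <dependencies>
    <!-- Supplies javax.xml.bind.* (including HexBinaryAdapter) to the plugin on Java 9+ -->
    <dependency>
      <groupId>javax.xml.bind</groupId>
      <artifactId>jaxb-api</artifactId>
      <version>2.3.1</version>
    </dependency>
  </dependencies>
  ...
</plugin>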

Why does maven ignore the add-source-goal? [duplicate]

This is a snippet of my pom file.
....
<plugins>
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-dependency-plugin</artifactId>
    <version>2.4</version>
    <executions>
      <execution>
        <phase>install</phase>
        <goals>
          <goal>copy-dependencies</goal>
        </goals>
        <configuration>
          ......
        </configuration>
      </execution>
    </executions>
  </plugin>
</plugins>
...
I use it successfully with the command
mvn install
But when I try to enclose it in the "pluginManagement" tag, the maven-dependency-plugin stops working when I run mvn install.
Why does the "pluginManagement" tag change the build behavior? Or should I use another goal or option?
You still need to add
<plugins>
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-dependency-plugin</artifactId>
  </plugin>
</plugins>
in your build, because pluginManagement is only a way to share the same plugin configuration across all your project modules.
From Maven documentation:
pluginManagement: is an element that is seen along side plugins. Plugin Management contains plugin elements in much the same way, except that rather than configuring plugin information for this particular project build, it is intended to configure project builds that inherit from this one. However, this only configures plugins that are actually referenced within the plugins element in the children. The children have every right to override pluginManagement definitions.
The difference between <pluginManagement/> and <plugins/> is that a <plugin/> under:
<pluginManagement/> defines the settings for plugins that will be inherited by modules in your build. This is great for cases where you have a parent pom file and would like to avoid having to copy the same code for the configuration of the plugin over to each of these modules.
<plugins/> is a section for the actual invocation of the plugins. It may or may not be inherited from a <pluginManagement/>.
You don't need to have a <pluginManagement/> in your project if it's not a parent POM. However, if it is a parent POM, then in the child's POM you need to have a declaration like:
<plugins>
  <plugin>
    <groupId>com.foo</groupId>
    <artifactId>bar-plugin</artifactId>
  </plugin>
</plugins>
Notice how you aren't defining any configuration. You can inherit it from the parent, unless you need to further adjust your invocation as per the child project's needs.
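For context, here is a minimal sketch of what the corresponding entry in the parent POM could look like (the bar-plugin coordinates are the hypothetical ones from above; the version and configuration values are placeholders):

<pluginManagement>
  <plugins>
    <plugin>
      <groupId>com.foo</groupId>
      <artifactId>bar-plugin</artifactId>
      <version>1.0.0</version>
      <configuration>
        <!-- shared defaults, inherited by every child that declares the plugin -->
      </configuration>
    </plugin>
  </plugins>
</pluginManagement>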
For more specific information, you can check:
The Maven pom.xml reference: Plugins
The Maven pom.xml reference: Plugin Management
You use pluginManagement in a parent POM to configure a plugin in case any child POM wants to use it, but not every child necessarily does. An example is a parent POM that defines some options for the Maven Javadoc plugin.
Not every child POM may want to use Javadoc, so you define those defaults in a pluginManagement section. A child POM that wants to use the Javadoc plugin just declares a plugin section and inherits the configuration from the pluginManagement definition in the parent POM.
pluginManagement: is an element that is seen along side plugins. Plugin Management contains plugin elements in much the same way, except that rather than configuring plugin information for this particular project build, it is intended to configure project builds that inherit from this one. However, this only configures plugins that are actually referenced within the plugins element in the children. The children have every right to override pluginManagement definitions.
From http://maven.apache.org/pom.html#Plugin%5FManagement
Copied from :
Maven2 - problem with pluginManagement and parent-child relationship
<pluginManagement>, just like <dependencyManagement>, is used to share only configuration between a parent and its sub-modules.
To do that, we define the common configuration of dependencies and plugins in the parent project, and then we only have to declare the dependency/plugin in the sub-modules that use it, without having to define its configuration (i.e. version, executions, goals, etc.). This does not prevent us from overriding that configuration in a sub-module.
In contrast, <dependencies> and <plugins> are inherited along with their configurations and should not be redeclared in the sub-modules; otherwise a conflict would occur.
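For illustration, a minimal sketch of the same pattern using dependencyManagement (the coordinates and version are hypothetical):

<!-- Parent POM: pins the version once -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.foo</groupId>
      <artifactId>bar-lib</artifactId>
      <version>1.2.3</version>
    </dependency>
  </dependencies>
</dependencyManagement>

<!-- Child POM: declares the dependency without a version and inherits 1.2.3 -->
<dependencies>
  <dependency>
    <groupId>com.foo</groupId>
    <artifactId>bar-lib</artifactId>
  </dependency>
</dependencies>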

Why is maven running the same pom differently on two computers?

My workmate and I are running the same Maven command (mvn site) on exactly the same POM and getting totally different output.
The part we think is going wrong is the javadoc plugin we added recently:
<!-- https://maven.apache.org/plugins/maven-javadoc-plugin/ -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-javadoc-plugin</artifactId>
<version>${version.javadoc.plugin}</version>
<configuration>
<destDir>javadoc</destDir>
<charset>UTF-8</charset>
<docencoding>UTF-8</docencoding>
<doctitle>${project.name} API Documentation
${project.version}.${svn_revision}</doctitle>
<encoding>UTF-8</encoding>
<failonerror>false</failonerror>
<footer>Specification: ${specification.title}</footer>
<header>${project.name} API Documentation
${project.version}.${svn_revision}</header>
<source>1.8</source>
<use>true</use>
<version>true</version>
<windowtitle>${project.name} API Documentation
${project.version}.${svn_revision}</windowtitle>
<additionalparam>-Xdoclint:none</additionalparam>
</configuration>
<executions>
<execution>
<id>attach-javadocs</id>
<phase>deploy</phase>
<goals>
<goal>jar</goal>
</goals>
</execution>
</executions>
</plugin>
Running this generates the Javadoc correctly in the target folder for me. But when I pushed it to the SVN repository and my workmate checked it out, it did not work for him.
There is no error and no warning; it just does not generate the Javadoc.
Additional info:
We are not using any local settings.xml.
The output of mvn site -X (debug mode) shows no difference regarding the javadoc plugin.
He has already reinstalled the JDK and re-set his $JAVA_HOME.
Same Maven version
What could be the problem?
Thank you in advance
Run mvn -v to make sure you're using the same Maven and Java versions. The command also prints the path to the Java runtime; make sure it is the same and correct on both machines.
If that checks out, run mvn help:effective-pom to see what Maven will actually execute. Redirect the output to a file on both machines and compare them.
Next, try to invoke the plugin directly from the command line. If that works, the binding to the lifecycle is not working for some reason. If it doesn't work, check for error messages and use -X to inspect the plugin configuration.
If everything else fails, delete your local Maven repository (or at least the involved plugins).

Requestfactory Validation on Multi-Project Setup

I tried switching to the release version of GWT 2.4 and ran into a problem. I use multiple projects in my setup: one project with server-side code, one project with shared code that can be used by different GWT projects, and a GWT project binding everything together. I build everything with Maven. I followed the instructions for annotation processing found here:
http://code.google.com/p/google-web-toolkit/wiki/RequestFactoryInterfaceValidation
When I compile my shared project, where the proxies and services are, the folder "generated-sources\apt\" containing DeobfuscatorBuilder.java is created. I have the sources of this project as a dependency of my main project and try to run the validator there as well, but DeobfuscatorBuilder.java is not created. Everything compiles, but when I invoke a call through the RequestFactory I get the error:
com.google.web.bindery.requestfactory.server.UnexpectedException: No RequestContext for operation ZwI9iqZS626uTt_TFwRtUwPYSOE=
I guess there is a mistake in my setup, but I couldn't find where.
Does anybody know how to solve this problem?
Regards
arne
UPDATE:
I added this to my pom:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>unpack</id>
      <phase>initialize</phase>
      <goals>
        <goal>unpack</goal>
        <!-- <goal>build-classpath</goal> -->
      </goals>
      <configuration>
        <artifactItems>
          <artifactItem>
            <groupId>com.myproject.core</groupId>
            <artifactId>shared</artifactId>
            <version>${shared.version}</version>
            <classifier>sources</classifier>
            <overWrite>true</overWrite>
            <outputDirectory>${project.build.directory}/com.myproject.shared</outputDirectory>
          </artifactItem>
        </artifactItems>
      </configuration>
    </execution>
  </executions>
</plugin>
This unpacks the sources of my dependencies and puts them into my target folder.
Then I added:
<configuration>
  <sourceDirectory>target/com.myproject.shared</sourceDirectory>
</configuration>
to my processor-plugin.
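As a rough sketch of how those pieces might fit together (the plugin coordinates, goal binding, and the RfValidator processor are my assumptions, not stated in the post; version omitted):

<plugin>
  <groupId>org.bsc.maven</groupId>
  <artifactId>maven-processor-plugin</artifactId>
  <executions>
    <execution>
      <id>process</id>
      <phase>generate-sources</phase>
      <goals>
        <goal>process</goal>
      </goals>
      <configuration>
        <!-- Point the processor at the shared sources unpacked by the
             maven-dependency-plugin execution above. -->
        <sourceDirectory>target/com.myproject.shared</sourceDirectory>
        <!-- RequestFactory interface validation annotation processor (assumption). -->
        <processors>
          <processor>com.google.web.bindery.requestfactory.apt.RfValidator</processor>
        </processors>
      </configuration>
    </execution>
  </executions>
</plugin>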
This way it is not necessary to have all the projects in the workspace, and it should work with a continuous integration system. I wouldn't have figured that out without Andy's reply, though :)
I had the same issue and spent hours scouring the web for an answer without any luck. If I add the processor plugin to the shared project, it generates the DeobfuscatorBuilder class, but I get the same No RequestContext exception as you. If I just have the processor plugin on the GWT war project, the builder isn't generated at all.
With a fair amount of trial and error, I found that adding the source directory from the shared project to the processor plugin configuration in the war project worked:
http://code.google.com/p/android-shuffle/source/browse/shuffle-app-engine/pom.xml#269
It's a bit dirty, but it does the trick. If there's an official method that doesn't require cross-project hackery I'd be more than happy to switch, but I haven't seen anything suggested yet.
Cheers
Andy

Compile test source files with scala fsc and maven

Using the maven-scala-plugin, I'm managing to use the fsc daemon to compile my main classes, thanks to this previous answer: Fastest way to compile scala with maven.
However, this doesn't work for test source files. I can add a Maven execution for the test-compile phase, but if I specify the cc goal it compiles the src/main classes (fast, but the wrong classes). If I specify the testCompile goal, it compiles the src/test classes using the standard compiler (the right classes, but slow).
What am I missing?
... some progress has been made, reported in the answer I've posted below.
However, this has revealed that the compile server is not starting up. It appears that the scala.tools.nsc.MainGenericRunner class is failing to find scala.tools.nsc.CompileServer on the classpath. I know it is on the Java classpath, as it's in the same jar file that provides MainGenericRunner, but do I need to specify a 'user' classpath somehow?
The command being run to start the CompileServer by the maven plugin looks like this:
cmd.exe /C C:\Progra~1\Java\jdk1.7.0\jre\bin\java -classpath
C:\projects\m2\repository\org\scala-lang\scala-library\2.9.0-1\scala-library-2.9.0-1.jar;C:\projects\m2\repository\org\scala-lang\scala-compiler\2.9.0-1\scala-compiler-2.9.0-1.jar
-Xbootclasspath/a:C:\projects\m2\repository\org\scala-lang\scala-library\2.9.0-1\scala-library-2.9.0-1.jar
scala.tools.nsc.MainGenericRunner
scala.tools.nsc.CompileServer
-target:jvm-1.5 -unchecked
>C:\Users\...\AppData\Local\Temp\scala.tools.nsc.MainGenericRunner.out
2>C:\Users\...\AppData\Local\Temp\scala.tools.nsc.MainGenericRunner.err
Running it produces this error in the MainGenericRunner.err file:
Exception in thread "main" java.lang.RuntimeException: Cannot figure out how to run target: scala.tools.nsc.CompileServer
at scala.sys.package$.error(package.scala:27)
at scala.tools.nsc.GenericRunnerCommand.scala$tools$nsc$GenericRunnerCommand$$guessHowToRun(GenericRunnerCommand.scala:38)
at scala.tools.nsc.GenericRunnerCommand$$anonfun$2.apply(GenericRunnerCommand.scala:48)
at scala.tools.nsc.GenericRunnerCommand$$anonfun$2.apply(GenericRunnerCommand.scala:48)
at scala.Option.getOrElse(Option.scala:109)
at scala.tools.nsc.GenericRunnerCommand.<init>(GenericRunnerCommand.scala:48)
at scala.tools.nsc.GenericRunnerCommand.<init>(GenericRunnerCommand.scala:17)
at scala.tools.nsc.MainGenericRunner.process(MainGenericRunner.scala:33)
at scala.tools.nsc.MainGenericRunner$.main(MainGenericRunner.scala:89)
at scala.tools.nsc.MainGenericRunner.main(MainGenericRunner.scala)
Suggestions welcome..!
You may want to consider using SBT.
It has all the advantages of fsc (i.e. keeping a "warm" compiler around to speed things up).
Unlike Maven, you don't need tricky manual configuration to enable dependency tracking (only recompiling what you really need to).
Unlike fsc, it also doesn't tie you to the version of Scala installed on your path, and doesn't break in the face of a misconfigured hostname (and other similar problems).
I had the same problem; the other solutions did not solve it correctly or were not appropriate. What worked for me was using a snapshot version of the maven-scala-plugin:
<plugin>
  <groupId>org.scala-tools</groupId>
  <artifactId>maven-scala-plugin</artifactId>
  <version>2.15.3-SNAPSHOT</version>
  <executions>
    <execution>
      <goals>
        <goal>compile</goal>
        <goal>testCompile</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <args>
      <arg>-unchecked</arg>
      <arg>-deprecation</arg>
      <arg>-explaintypes</arg>
    </args>
  </configuration>
</plugin>
Further investigation has produced this solution: add a new execution with a cc goal and a fixed-up main source path:
<execution>
  <id>cc-compiletest</id>
  <phase>test-compile</phase>
  <goals>
    <goal>cc</goal>
  </goals>
  <configuration>
    <mainSourceDir>${project.build.sourceDirectory}/../../test/scala</mainSourceDir>
    <useFsc>true</useFsc>
    <once>true</once>
    <displayCmd>true</displayCmd>
  </configuration>
</execution>
This runs the cc 'fast' compile goal against the src/test/scala directory rather than the default src/main/scala.
Is this the best/only way to do this?