My workmate and I are calling the same Maven command (mvn site) on exactly the same pom and getting totally different output.
We suspect the problem lies in the javadoc plugin configuration we added recently:
<!-- https://maven.apache.org/plugins/maven-javadoc-plugin/ -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-javadoc-plugin</artifactId>
  <version>${version.javadoc.plugin}</version>
  <configuration>
    <destDir>javadoc</destDir>
    <charset>UTF-8</charset>
    <docencoding>UTF-8</docencoding>
    <doctitle>${project.name} API Documentation ${project.version}.${svn_revision}</doctitle>
    <encoding>UTF-8</encoding>
    <failonerror>false</failonerror>
    <footer>Specification: ${specification.title}</footer>
    <header>${project.name} API Documentation ${project.version}.${svn_revision}</header>
    <source>1.8</source>
    <use>true</use>
    <version>true</version>
    <windowtitle>${project.name} API Documentation ${project.version}.${svn_revision}</windowtitle>
    <additionalparam>-Xdoclint:none</additionalparam>
  </configuration>
  <executions>
    <execution>
      <id>attach-javadocs</id>
      <phase>deploy</phase>
      <goals>
        <goal>jar</goal>
      </goals>
    </execution>
  </executions>
</plugin>
Running this gets me the correct javadoc generation in the target folder. When I pushed it to the SVN repository and my workmate checked it out, it did not work for him.
There is no error and no warning; it just does not generate the javadoc.
Additional info:
We are not using any local settings.xml.
The output of mvn site -X (debug mode) shows no differences regarding the javadoc plugin.
He has already reinstalled the JDK and re-set his JAVA_HOME.
We use the same Maven version.
What could be the problem?
Thank you in advance
Run mvn -v to make sure you're both using the same Maven and Java versions. The command prints the path to the Java runtime; make sure the paths are the same and correct.
If that checks out, run mvn help:effective-pom to see what Maven will actually execute. Redirect the output to a file on both machines and compare the two files.
Next, try to invoke the plugin directly from the command line. If that works, the binding to the lifecycle isn't working for some reason. If it doesn't work, check for error messages and use -X to inspect the plugin configuration.
If everything else fails, delete your local Maven repository (or at least the plugins involved).
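A minimal sketch of these checks from the command line (the javadoc:javadoc goal and the file names are assumptions; adjust them to your setup):
# compare tool versions and the Java runtime path they report
mvn -v
# dump the effective POM on each machine, then diff the two files
mvn help:effective-pom > effective-pom-mine.xml
diff effective-pom-mine.xml effective-pom-workmate.xml
# invoke the javadoc plugin directly, bypassing the site lifecycle
mvn javadoc:javadoc -X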
Currently I generate files with an annotation processor in eclipse for a project by
Right click on project > Run As > Maven Clean
Right click on project > Run As > Maven install
This is quite time consuming. How do I set up Eclipse so that it runs the annotation processor on save?
I have the "Build Automatically" feature enabled, but it seems to ignore the annotation processors. BTW I am using the m2e-apt plugin with "Automatically configure JDT APT" activated.
I have annotation processing working in Eclipse for some of my projects; for me, it IS working on save, and I don't have to run mvn install (it works differently than Maven, as Eclipse runs its own compiler).
I'm also using m2e-apt plugin for this.
As noted above, Eclipse runs its own compiler; that means its output can differ slightly from Maven's (when you "Right click on project > Run As > Maven Clean / Install" you're invoking Maven, not Eclipse). I'm mentioning this because it is entirely possible that your processors have a problem and work in Maven but not in Eclipse (although most of the time they produce the same output; I've seen some differences, but very small ones). I'd keep an eye on Eclipse's error log if I were you, because that's where annotation processing errors are written.
So here is what I suggest:
Post a picture with your Maven / Annotation Processing settings in Eclipse (even though you do seem to have the correct option activated).
Post a picture with Java/Compiler settings (there is a checkmark in there that needs to be activated; it doesn't work without).
Posting your pom.xml would, strangely, be helpful, especially if you have custom configuration for maven-compiler-plugin. Some of that config, such as compiler arguments, is interpreted by m2e-apt (see the sketch after this list).
Look for a file called .factorypath. That's where m2e-apt keeps the list of jars that it scans for annotation processing (you'll find all the jars of your project in there, even though they don't actually contain processors; that is, unless your maven-compiler-plugin is configured as such to only consider a specific list of processors). If the jar containing your processor is not in .factorypath, it won't work.
Last but not least, there is another thing that can cause problems. If the project containing the actual annotation processor (so NOT the "client") is in the same workspace as the "client" project, then m2e-apt will simply ignore your annotation processor; I don't know why. Closing your annotation processor project would be enough in this case (you don't have to delete it from workspace).
Edit: Forgot to say that if you do run your annotation processing via Maven (and you're invoking Maven just to process annotations), then mvn compile should be enough. Also, you don't need to run it separately (first mvn clean then mvn compile). You can run it in one shot with mvn clean compile; it is supposed to have the exact same effect.
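For reference, a minimal maven-compiler-plugin setup of the kind m2e-apt can interpret might look like the following. The processor coordinates are placeholders, and whether your m2e-apt version reads <annotationProcessorPaths> is an assumption; older setups pick the processor up from the project's regular dependencies instead.
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>3.8.1</version>
  <configuration>
    <annotationProcessorPaths>
      <!-- placeholder coordinates: point this at the artifact that actually contains your processor -->
      <path>
        <groupId>com.example</groupId>
        <artifactId>my-annotation-processor</artifactId>
        <version>1.0.0</version>
      </path>
    </annotationProcessorPaths>
  </configuration>
</plugin>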
Make sure your Java project settings (accessible with right-click on project > Java Compiler > Annotation Processing) do enable annotation processing and that the settings match your expectations.
For a Maven project, m2e is supposed to configure those settings according to the pom.xml content. However, this does not work smoothly for all Maven plugins (some are supported out of the box, others require a specific connector...).
I think you need a trigger to run the Maven goal, so:
You have to add a valid m2e lifecycle mapping.
Here is an example for a jar which is automatically deployed locally by the maven-install-plugin:
<build>
  <!-- ... -->
  <pluginManagement>
    <plugins>
      <plugin>
        <groupId>org.eclipse.m2e</groupId>
        <artifactId>lifecycle-mapping</artifactId>
        <version>1.0.0</version>
        <configuration>
          <lifecycleMappingMetadata>
            <pluginExecutions>
              <pluginExecution>
                <pluginExecutionFilter>
                  <groupId>org.apache.maven.plugins</groupId>
                  <artifactId>maven-jar-plugin</artifactId>
                  <versionRange>[2.0,)</versionRange>
                  <goals>
                    <goal>jar</goal>
                  </goals>
                </pluginExecutionFilter>
                <action>
                  <execute>
                    <runOnConfiguration>true</runOnConfiguration>
                    <runOnIncremental>true</runOnIncremental>
                  </execute>
                </action>
              </pluginExecution>
              <pluginExecution>
                <pluginExecutionFilter>
                  <groupId>org.apache.maven.plugins</groupId>
                  <artifactId>maven-install-plugin</artifactId>
                  <versionRange>[2.5.0,)</versionRange>
                  <goals>
                    <goal>install</goal>
                  </goals>
                </pluginExecutionFilter>
                <action>
                  <execute>
                    <runOnConfiguration>true</runOnConfiguration>
                    <runOnIncremental>true</runOnIncremental>
                  </execute>
                </action>
              </pluginExecution>
            </pluginExecutions>
          </lifecycleMappingMetadata>
        </configuration>
      </plugin>
    </plugins>
  </pluginManagement>
</build>
Hint: this relates to Maven Project Builder is invoked every time I change a source file (GWT). As a warning: install typically runs your tests if you have included them in your normal Maven build cycle.
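If that is a concern for these manual runs, tests can be skipped with a standard Maven flag (not specific to this setup):
mvn clean install -DskipTests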
I need an update on this issue. I found a thread back from 2009 here, but the answer was to use Maven 2, and I'm not sure whether Q4E works with Maven 3. I need some properties files filtered during the mvn package phase for the resulting war to be functional. The resource filtering works fine with a command-line mvn install, but when I do "Run on server" / "Debug on server", the filtering no longer happens.
The author of that thread ended up using Q4E, claiming Q4E gets the resource filtering right. I have Q4E installed alongside m2e, but it still doesn't work, so I don't know whether Q4E doesn't work with Maven 3 or I'm doing something wrong.
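For context, the filtering setup involved is roughly the following (the directory and include pattern are illustrative, not my actual project layout):
<build>
  <resources>
    <resource>
      <directory>src/main/resources</directory>
      <!-- substitute ${...} placeholders in the properties files at build time -->
      <filtering>true</filtering>
      <includes>
        <include>**/*.properties</include>
      </includes>
    </resource>
  </resources>
</build>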
Thanks,
David
Updated to the latest m2e-wtp plugin, 0.15 (a resource filtering bug fix since 0.12); it works fine now.
I'm not sure if this matches your problem, but I wanted to populate my web.xml file with properties from the pom during the build, and I put a Groovy script in the pom to do it. It worked a treat and might work for you too. It definitely works both in Eclipse and on the command line. Here is my pom fragment:
<plugin>
  <!-- Groovy script to set the description and version in the web.xml display name -->
  <groupId>org.codehaus.groovy.maven</groupId>
  <artifactId>gmaven-plugin</artifactId>
  <version>1.0</version>
  <executions>
    <execution>
      <id>groovy-magic</id>
      <phase>prepare-package</phase>
      <goals>
        <goal>execute</goal>
      </goals>
      <configuration>
        <source>
          def file = new File("src/main/webapp/WEB-INF/web.xml");
          def fileText = file.text;
          def match = "<display-name>[^<]*</display-name>";
          def replace = "<display-name>" + project.description + " " + project.version + "</display-name>";
          fileText = fileText.replaceAll(match, replace);
          file.write(fileText);
          println "Updated web.xml"
        </source>
      </configuration>
    </execution>
  </executions>
</plugin>
I tried changing to the release version of GWT 2.4 and ran into a problem. I use multiple projects in my setup: a project with server-side code, a project with shared code that can be used in different GWT projects, and a GWT project binding everything together. I build everything with Maven. I followed the instructions for annotation processing found here:
http://code.google.com/p/google-web-toolkit/wiki/RequestFactoryInterfaceValidation
When I compile my shared project, where the proxies and services are, the folder "generated-sources\apt\" with DeobfuscatorBuilder.java is created. I have the sources of this project as a dependency of my main project and try to run the validator there as well, but DeobfuscatorBuilder.java is not created. Everything compiles, but when I invoke a call to the RequestFactory I get the error:
com.google.web.bindery.requestfactory.server.UnexpectedException: No RequestContext for operation ZwI9iqZS626uTt_TFwRtUwPYSOE=
I guess there is a mistake in my setup, but I couldn't find where.
Does anybody know how to solve this problem?
Regards
arne
UPDATE:
I added this to my pom:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-dependency-plugin</artifactId>
  <executions>
    <execution>
      <id>unpack</id>
      <phase>initialize</phase>
      <goals>
        <goal>unpack</goal>
        <!-- <goal>build-classpath</goal> -->
      </goals>
      <configuration>
        <artifactItems>
          <artifactItem>
            <groupId>com.myproject.core</groupId>
            <artifactId>shared</artifactId>
            <version>${shared.version}</version>
            <classifier>sources</classifier>
            <overWrite>true</overWrite>
            <outputDirectory>${project.build.directory}/com.myproject.shared</outputDirectory>
          </artifactItem>
        </artifactItems>
      </configuration>
    </execution>
  </executions>
</plugin>
This unpacks the sources of my dependencies and puts them into my target folder.
Then I added:
<configuration>
  <sourceDirectory>${project.build.directory}/com.myproject.shared</sourceDirectory>
</configuration>
to my processor-plugin.
This way it is not necessary to have all the projects in the workspace, and it should work with a continuous integration system. I wouldn't have figured that out without Andy's reply though :)
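Put together, the processor plugin section then looks roughly like this. It is a sketch assuming the org.bsc.maven maven-processor-plugin from the GWT wiki page linked above; the version, phase and id are placeholders, and the rest of the validation setup (such as the requestfactory-apt processor dependency) still follows that page:
<plugin>
  <groupId>org.bsc.maven</groupId>
  <artifactId>maven-processor-plugin</artifactId>
  <version>2.0.5</version>
  <executions>
    <execution>
      <id>process</id>
      <phase>generate-sources</phase>
      <goals>
        <goal>process</goal>
      </goals>
      <configuration>
        <!-- point the validator at the unpacked sources of the shared project -->
        <sourceDirectory>${project.build.directory}/com.myproject.shared</sourceDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>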
I had the same issue and spent hours scouring the web for an answer without any luck. If I add the processor plugin to the shared project, it generates the DeobfuscatorBuilder class, but I get the same No RequestContext exception as you. If I just have the processor plugin on the GWT war project, the builder isn't generated at all.
With a fair amount of trial and error I found adding the source directory from the shared project into the processor plugin configuration on the war project worked...
http://code.google.com/p/android-shuffle/source/browse/shuffle-app-engine/pom.xml#269
It's a bit dirty, but it does the trick. If there's an official method that doesn't require cross project hackery I'd be more than welcome to switch, but I haven't seen anything suggested yet.
Cheers
Andy
Using the Maven Scala plugin, I'm managing to use the fsc daemon to compile my main classes, thanks to this previous answer: Fastest way to compile scala with maven
However, this doesn't work for the test source files. I can add a Maven execution for the test-compile phase, but if I specify the cc goal it compiles the src/main classes (fast, but the wrong classes). If I specify the testCompile goal, it compiles the src/test classes using the standard compiler (the right classes, but slow).
What am I missing?
... some progress has been made, reported in the answer I've posted below.
However, this has revealed that the compile server is not starting up. It appears that the scala.tools.nsc.MainGenericRunner class is failing to find scala.tools.nsc.CompileServer on the classpath. I know it is on the Java classpath, as it's in the same jar file that provides MainGenericRunner, but do I need to specify a 'user' classpath somehow?
The command being run to start the CompileServer by the maven plugin looks like this:
cmd.exe /C C:\Progra~1\Java\jdk1.7.0\jre\bin\java -classpath
C:\projects\m2\repository\org\scala-lang\scala-library\2.9.0-1\scala-library-2.9.0-1.jar;C:\projects\m2\repository\org\scala-lang\scala-compiler\2.9.0-1\scala-compiler-2.9.0-1.jar
-Xbootclasspath/a:C:\projects\m2\repository\org\scala-lang\scala-library\2.9.0-1\scala-library-2.9.0-1.jar
scala.tools.nsc.MainGenericRunner
scala.tools.nsc.CompileServer
-target:jvm-1.5 -unchecked
>C:\Users\...\AppData\Local\Temp\scala.tools.nsc.MainGenericRunner.out
2>C:\Users\...\AppData\Local\Temp\scala.tools.nsc.MainGenericRunner.err
And running it gets this error in the MainGenericRunner.err file
Exception in thread "main" java.lang.RuntimeException: Cannot figure out how to run target: scala.tools.nsc.CompileServer
at scala.sys.package$.error(package.scala:27)
at scala.tools.nsc.GenericRunnerCommand.scala$tools$nsc$GenericRunnerCommand$$guessHowToRun(GenericRunnerCommand.scala:38)
at scala.tools.nsc.GenericRunnerCommand$$anonfun$2.apply(GenericRunnerCommand.scala:48)
at scala.tools.nsc.GenericRunnerCommand$$anonfun$2.apply(GenericRunnerCommand.scala:48)
at scala.Option.getOrElse(Option.scala:109)
at scala.tools.nsc.GenericRunnerCommand.<init>(GenericRunnerCommand.scala:48)
at scala.tools.nsc.GenericRunnerCommand.<init>(GenericRunnerCommand.scala:17)
at scala.tools.nsc.MainGenericRunner.process(MainGenericRunner.scala:33)
at scala.tools.nsc.MainGenericRunner$.main(MainGenericRunner.scala:89)
at scala.tools.nsc.MainGenericRunner.main(MainGenericRunner.scala)
Suggestions welcome..!
You may want to consider using SBT.
It has all the advantages of FSC (i.e. keeping a "warm" compiler around to speed things up)
Unlike maven, you don't need tricky manual configuration to enable support for dependency tracking (only recompiling what you really need to).
Unlike fsc, it also doesn't tie you to the version of scala installed on your path, and doesn't break in the face of a misconfigured hostname (and other similar problems)
I had the same problem; the other solutions did not solve it correctly or were not appropriate for me. What worked was to use a snapshot version of the maven-scala-plugin:
<plugin>
  <groupId>org.scala-tools</groupId>
  <artifactId>maven-scala-plugin</artifactId>
  <version>2.15.3-SNAPSHOT</version>
  <executions>
    <execution>
      <goals>
        <goal>compile</goal>
        <goal>testCompile</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <args>
      <arg>-unchecked</arg>
      <arg>-deprecation</arg>
      <arg>-explaintypes</arg>
    </args>
  </configuration>
</plugin>
Further investigation has produced this solution: add a new execution with a cc goal and a fixed-up main source path:
<execution>
  <id>cc-compiletest</id>
  <phase>test-compile</phase>
  <goals>
    <goal>cc</goal>
  </goals>
  <configuration>
    <mainSourceDir>${project.build.sourceDirectory}/../../test/scala</mainSourceDir>
    <useFsc>true</useFsc>
    <once>true</once>
    <displayCmd>true</displayCmd>
  </configuration>
</execution>
This runs the cc 'fast' compile goal against the /test/scala directory rather than the default /main/scala.
Is this the best/only way to do this?
I am now trying to use the Oracle WebLogic Maven plugin to deploy an application to an admin server with an administration port.
I am using the t3s protocol to connect, but I am wondering whether I can set my custom keystore and certs in the Maven plugin parameters, either in pom.xml or on the command line.
I cannot find the solution on the internet.
Help would greatly be appreciated.
In theory you can set the WebLogic SSL system properties in MAVEN_OPTS, like so:
-Dweblogic.security.TrustKeyStore=CustomTrust -Dweblogic.security.CustomTrustKeyStoreFileName=
But the plugin doesn't seem to pick these up, whereas weblogic.Deployer does. This is a little odd, since the Maven plugin just runs the deployer anyway.
I've also tried setting the java keystore to a custom one (also with no luck)
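For what it's worth, setting them via MAVEN_OPTS would look roughly like this; the keystore path and passphrase are placeholders, and as noted above the plugin may still ignore the properties:
# placeholders: adjust the keystore path and passphrase to your environment
export MAVEN_OPTS="-Dweblogic.security.TrustKeyStore=CustomTrust \
  -Dweblogic.security.CustomTrustKeyStoreFileName=/path/to/custom-trust.jks \
  -Dweblogic.security.TrustKeystoreType=JKS \
  -Dweblogic.security.CustomTrustKeyStorePassPhrase=changeit"
# then run whatever goal invokes the weblogic-maven-plugin, e.g.
mvn install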
Well, the question is "old" :) but there seems to be no conclusive answer around, and since this question pops up in Google's top 10, here is what I did to make the Maven -> WebLogic deployment work.
Using: Maven 3.2.3 to deploy to WLS 12.1.3 and the WLS 12.1.3 DEV distribution (do not forget to execute the configure script prior to starting, well, anything).
Setup (done once)
Follow the Oracle docs for the Maven plugin to set up the plugin. In short:
You will mainly install a Maven plugin from the WLS DEV zip in order to install another Maven plugin:
cd %WL_HOME%\oracle_common\plugins\maven\com\oracle\maven\oracle-maven-sync\12.1.3
mvn install:install-file -DpomFile=oracle-maven-sync-12.1.3.pom -Dfile=oracle-maven-sync-12.1.3.jar
Install the plugin that will be used to deploy:
mvn com.oracle.maven:oracle-maven-sync:push -DoracleHome=%WL_HOME%
Verify the plugin is ok:
mvn help:describe -DgroupId=com.oracle.weblogic -DartifactId=weblogic-maven-plugin -Dversion=12.1.3-0-0
If you need this added to a Maven repository proxy, you can temporarily change the path to your local repository, execute those commands, and whatever lands there is what will be required (around 230 MB in my case). I would add another third-party repository on the Maven proxy and put everything in there, in case you need to clean up later.
Then use the InstallCert tool to import the admin server's SSL certificate into a new keystore. We will place this keystore in the Maven module that creates the EAR file and executes the deployment.
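A rough sketch of that step (InstallCert writes a jssecacerts keystore in the current directory; the host, port and target path are placeholders for your environment):
# accept the admin server's certificate into a new keystore (default passphrase: changeit)
java -cp . InstallCert HOSTNAME_HERE:7101
# keep the generated keystore with the module that runs the deployment
cp jssecacerts path/to/deploying-module/src/main/keystore/cacerts.dev.jks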
Deployment
Once you have your EAR file ready you need to add this to your build section:
(Note: the SSL / keystore messing around is only required when using t3s; you can obviously skip the property setting if there is no self-signed certificate involved.)
The TrustKeyStore=CustomTrust property is somehow required, and its value must not be changed.
<build>
  <plugins>
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>properties-maven-plugin</artifactId>
      <version>1.0-alpha-2</version>
      <configuration>
        <properties>
          <weblogic.security.TrustKeyStore>CustomTrust</weblogic.security.TrustKeyStore>
          <weblogic.security.CustomTrustKeyStoreFileName>${basedir}/src/main/keystore/cacerts.dev.jks</weblogic.security.CustomTrustKeyStoreFileName>
          <weblogic.security.TrustKeystoreType>JKS</weblogic.security.TrustKeystoreType>
          <weblogic.security.CustomTrustKeyStorePassPhrase>changeit</weblogic.security.CustomTrustKeyStorePassPhrase>
        </properties>
      </configuration>
      <executions>
        <execution>
          <goals>
            <goal>set-system-properties</goal>
          </goals>
          <phase>initialize</phase>
        </execution>
      </executions>
    </plugin>
    <plugin>
      <groupId>com.oracle.weblogic</groupId>
      <artifactId>weblogic-maven-plugin</artifactId>
      <version>12.1.3-0-0</version>
      <configuration>
        <adminurl>t3s://HOSTNAME_HERE:7101</adminurl>
        <user>WLS-USER-IN-DEPLOYERS-GROUP</user>
        <password>WLS-USER-PASSWORD</password>
        <source>${project.build.directory}/${project.build.finalName}.${project.packaging}</source>
        <targets>TARGET_SERVERNAME_IN_WLS_TO_DEPLOY_TO</targets>
        <verbose>true</verbose>
        <name>YourApplicationName</name>
        <remote>true</remote>
        <upload>true</upload>
      </configuration>
      <executions>
        <execution>
          <id>wls-deploy-dev</id>
          <phase>install</phase>
          <goals>
            <goal>deploy</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
The above configuration will deploy the EAR during the install phase; feel free to change the phase of the weblogic-maven-plugin. It could also go in a profile, I guess.
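So, with the binding above, a plain build is enough to trigger the deployment:
mvn clean install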
Happy Deploying :)
Links:
weblogic.Deployer command line reference
Documentation of the WLS Maven plugin
Properties Maven Plugin
InstallCert Tool