I have successfully set up a few projects which use Maven to automatically deploy the Maven-generated site to the gh-pages branch of their git repository. GitHub then serves these files at a public URL on a personal subdomain. I'm looking to utilize this functionality to serve a rich client-side only GWT application.
I have modified my pom.xml to compile the GWT application to the target/site/ directory. The two main goals I am still attempting to achieve are:
How do I prevent the standard Maven site plugin from running during the site phase?
What is required so gwt:compile executes during the site phase?
A goal can be bound to a phase by specifying a new execution for the plugin. I'm assuming you've got some custom stuff you need to make most of this work correctly, so I'm just going to focus on what should work to bind a plugin goal to a particular phase.
<plugin>
<artifactId>gwt-maven-plugin</artifactId>
...
<executions>
<execution>
<id>gwt-site</id>
<phase>site</phase><!-- phase to bind to -->
<goals>
<goal>compile</goal><!-- goal to run in that phase -->
</goals>
<configuration>
<!-- Your magic configuration stuff goes here -->
</configuration>
</execution>
<!-- Possible other executions might be defined here -->
</executions>
</plugin>
Preventing the default Maven site from being generated is more interesting, as site is a phase with a variety of goals bound to it. The standard site:site goal can be prevented from running in the site phase by explicitly specifying an execution with no goals. This may vary slightly between Maven 2 and 3, so I'm going to be a little general here. Take a look at your build logs to see what is currently specified (execution id, group/artifact id) and correct any oversights in my example:
<plugin>
<artifactId>maven-site-plugin</artifactId>
...
<executions>
<execution>
<phase>site</phase>
<goals></goals><!-- This is empty to indicate that no goals should be run in this phase -->
</execution>
</executions>
</plugin>
I have a Scala app (v2.13) created using Maven v3. My resources path is:
src -> main -> resources -> application.conf and application.prod.conf
When I generate the JAR file for production, I want the configuration from application.conf to be used, but overridden by application.prod.conf.
I cannot find a solution for this; all the examples I have found are for the Play framework or for older Maven versions.
The JAR file is generated with the mvn package command.
application.prod.conf file
include "application.conf"
# override default (DEV) settings
http {
host = "99.999.999.9"
port = 1111
}
The following example doesn't work for me, because I only take the JAR file from the target path and move it to production:
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<version>3.0.0</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>run</goal>
</goals>
<configuration>
<target>
<delete file="${project.build.outputDirectory}/application.conf"/>
<copy file="src/main/resources/application.prod.conf"
tofile="${project.build.outputDirectory}/application.conf"/>
</target>
</configuration>
</execution>
</executions>
</plugin>
A few options here:
If your application.prod.conf is static and ships with the JAR, why can't you have logic in the code that loads the appropriate configuration based on the environment the app is running in?
Is it a Typesafe Config file? If so, when running in prod you can pass the -Dconfig.resource=/application.prod.conf Java command-line argument.
Or, if application.prod.conf is not shipped with the JAR, you can pass -Dconfig.file=/path/to/application.prod.conf.
Maven has a concept of phases (here we are talking about the package phase, to be precise), which are logical places in the lifecycle where plugins can be invoked. Some plugins, like the one that creates the JAR, are bound to phases automatically (out of the box); others you declare explicitly and associate with a phase (like the maven-antrun-plugin, which runs during the package phase as you've shown in the code snippet).
With that in mind, is it possible that the copy is attempted only after the JAR has already been built, i.e. the antrun plugin is invoked after the artifacts were packaged into the JAR?
If so, the easiest solution is to move it one phase earlier, for example prepare-package:
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<version>3.0.0</version>
<executions>
<execution>
<phase>prepare-package</phase> <!-- Note the change here -->
<goals>
<goal>run</goal>
</goals>
<configuration>
<target>
<delete file="${project.build.outputDirectory}/application.conf"/>
<copy file="src/main/resources/application.prod.conf"
tofile="${project.build.outputDirectory}/application.conf"/>
</target>
</configuration>
</execution>
</executions>
</plugin>
Assuming you are on Maven 3, the list of available phases is documented in the Maven build lifecycle reference (https://maven.apache.org/guides/introduction/introduction-to-the-lifecycle.html).
Having said that, it's probably not a good idea to "bake" the production configuration file into the artifact; there are two issues here:
Your source code contains information about production (hosts, ports, maybe even sensitive data like passwords or keys), which really shouldn't happen.
From the build's point of view, your artifact is coupled to a concrete environment, which is also generally considered bad practice.
The techniques to resolve this are beyond the scope of the question but at least you've been warned :)
I was going through this tutorial https://www.youtube.com/watch?v=k5ncCJBarRI&t=1443s
Around 1:07:30 the author mentions that "Gradle has continuous build", and later on it detects changes in the tests and automatically regenerates the AsciiDoc. I was wondering if anyone knows how to set this up in Maven?
I have looked through the Spring and Asciidoctor plugin docs, but was not able to find anything related to this.
I was able to get Maven to re-render the HTML whenever there is a change in index.adoc by changing the <goal> from process-asciidoc to auto-refresh. However, this does not watch for changes in the tests.
Question
Is there a way to tell Maven to watch our test files and re-compile when changes are made?
POM.XML
<plugin>
<groupId>org.asciidoctor</groupId>
<artifactId>asciidoctor-maven-plugin</artifactId>
<version>1.5.7.1</version>
<executions>
<execution>
<id>generate-docs</id>
<phase>prepare-package</phase>
<goals>
<goal>auto-refresh</goal>
</goals>
<configuration>
<sourceDocumentName>index.adoc</sourceDocumentName>
<backend>html</backend>
<attributes>
<snippets>${project.build.directory}/generated-snippets</snippets>
</attributes>
</configuration>
</execution>
</executions>
</plugin>
Thank you.
This is not a continuous-build solution, but it works similarly. However, the process does take some time because it essentially re-packages the project every time there is a change, so it may not be ideal for some use cases.
I found a plugin that watches files. https://github.com/fizzed/maven-plugins
Change the watch directory to where your test files are, and change the goal from compile to package.
The watcher will run the package goal when a change is detected, and the Asciidoctor Maven plugin will then re-render the docs as part of re-packaging the project.
Add this to your plugins section:
<plugin>
<groupId>com.fizzed</groupId>
<artifactId>fizzed-watcher-maven-plugin</artifactId>
<version>1.0.6</version>
<configuration>
<touchFile>target/classes/watcher.txt</touchFile>
<watches>
<watch>
<directory>src/test/[your test package]</directory>
</watch>
</watches>
<goals>
<goal>package</goal>
<!-- <goal>compile</goal> -->
</goals>
<profiles>
<profile>optional-profile-to-activate</profile>
</profiles>
</configuration>
</plugin>
Maven does not have an equivalent of Gradle's continuous build. If you want changes in the tests to be detected and to trigger recompilation of the tests and execution of all of the tasks that depend (directly or indirectly) on the compiled test classes, you'll have to use Gradle.
I have an application which I want to deploy in Karaf. I have created a feature file and I am able to add features through this file using the Karaf console. What I want to achieve now is to create this feature file through Maven commands instead of creating it manually, and then create a custom Karaf distribution using this feature file. How can I achieve this?
My approach so far is to create a Maven module to generate the feature file using karaf-maven-plugin, and then create another module to generate the custom Karaf distribution, so that we don't need access to Maven in the production environment.
Is this approach correct? Do I really need two different modules to achieve this? How can I access the feature file from the second module?
These are my POMs:
all dependencies
<build>
<finalName>${project.artifactId}-${project.version}</finalName>
<plugins>
<plugin>
<groupId>org.apache.karaf.tooling</groupId>
<artifactId>karaf-maven-plugin</artifactId>
<version>4.0.5</version>
<extensions>true</extensions>
<executions>
<execution>
<id>generate</id>
<phase>generate-resources</phase>
<goals>
<goal>features-generate-descriptor</goal>
</goals>
<configuration>
<startLevel>80</startLevel>
<aggregateFeatures>true</aggregateFeatures>
<includeTransitiveDependency>true</includeTransitiveDependency>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
I am not able to figure out the second part yet. Any help with that is really appreciated.
To generate a custom Karaf you just need to use the karaf-maven-plugin.
For example the following will generate a fully working custom Karaf:
<plugin>
<groupId>org.apache.karaf.tooling</groupId>
<artifactId>karaf-maven-plugin</artifactId>
<version>4.0.0</version>
<extensions>true</extensions>
<configuration>
<!-- no startupFeatures -->
<bootFeatures>
<feature>minimal</feature>
</bootFeatures>
<installedFeatures>
<feature>wrapper</feature>
<feature>spring/4.0.7.RELEASE_1</feature>
</installedFeatures>
</configuration>
</plugin>
This will generate a custom Karaf based on the minimal set of features needed to create the minimal distribution. If you want to base it on the standard distribution instead, just exchange minimal for standard.
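As for getting the features descriptor generated by your first module into this second, assembly module: a common pattern (just a sketch; the coordinates below are placeholders for your features module) is to declare the generated features XML as a dependency with type xml and classifier features, which the karaf-maven-plugin picks up as a feature repository:
<dependency>
    <!-- hypothetical coordinates of the module running features-generate-descriptor -->
    <groupId>com.example</groupId>
    <artifactId>my-app-features</artifactId>
    <version>${project.version}</version>
    <classifier>features</classifier>
    <type>xml</type>
    <scope>runtime</scope>
</dependency>
The features defined in that descriptor can then be listed by name in bootFeatures or installedFeatures, just like minimal and wrapper above.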
By the way, all of this is also documented in the Karaf documentation.
We decided to use GWT modules in our application about a week ago. We use the GWT-Maven-Eclipse trio and have already configured phases and goals. We are also doing context deployment to reduce development and testing time.
BUT:
When we package or tomcat:deploy our application, the GWT modules are recompiled (including unchanged ones).
<set-property name="user.agent" value="gecko1_8"></set-property>
<extend-property name="locale" values="en_UK"></extend-property>
I have already set these properties to speed up compile time, but this is not exactly what I want...
I also configured the Maven lifecycle mapping in Eclipse to fire gwt:compile process-resources resources:testResources whenever any resources change, but it blocks Eclipse and did not help with compile time either.
This is the gwt-maven-plugin configuration in pom.xml:
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>gwt-maven-plugin</artifactId>
<version>2.3.0</version>
<executions>
<execution>
<goals>
<goal>compile</goal>
</goals>
</execution>
</executions>
<!--
Plugin configuration. There are many available options, see
gwt-maven-plugin documentation at codehaus.org
-->
<configuration>
<runTarget>A.jsp</runTarget>
<runTarget>B.jsp</runTarget>
<hostedWebapp>${webappDirectory}</hostedWebapp>
</configuration>
</plugin>
Any ideas to help me?
gwt-maven-plugin tries (hard) to avoid recompiling modules when the code hasn't changed, but even that check takes a bit of time (still less than recompiling the module; and unfortunately, if it detects that the module does need to be recompiled, the check time is added on top of the GWT compile time).
If you know you don't need gwt:compile, you can pass -Dgwt.compiler.skip=true to your Maven build to skip the goal and keep "running" your previously compiled code. Similarly, if you know you need gwt:compile, you can pass -Dgwt.compiler.force=true to bypass the "up-to-date check".
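If you do not want to type that property on every build, here is a small sketch of the same idea as a profile (the profile id is made up, and this assumes the plugin resolves gwt.compiler.skip from project properties as well as from the command line):
<profiles>
    <profile>
        <id>no-gwt</id><!-- hypothetical profile name -->
        <properties>
            <!-- same effect as passing -Dgwt.compiler.skip=true -->
            <gwt.compiler.skip>true</gwt.compiler.skip>
        </properties>
    </profile>
</profiles>
You would then run mvn package -Pno-gwt when you know the previously compiled output is still current.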
I want to include this file when running locally, but exclude it when I deploy. I tried the following, but it doesn't seem to work.
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>2.3</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>jar</goal>
</goals>
<configuration>
<excludes>
<exclude>filename.properties</exclude>
</excludes>
</configuration>
</execution>
</executions>
</plugin>
You should put that properties file into src/test/resources; then it will be available for unit testing etc., but it will not be deployed.
I want to use this property file in development, but in production, I want to be able to use a different file.
There are several ways to achieve this but this is a perfect use case for profiles (and optionally filtering).
The Building For Different Environments with Maven 2 page explains how you could manage different property files and use the Maven Antrun plugin to pick up the "right" one depending on the profile used.
Instead of using the Antrun plugin, you could declare environment-specific values as properties in different profiles in your POM and use Maven filtering. See Using Maven profiles and resource filtering for a full example. This is especially nice if you need to protect some information (which you can "hide" in a profile inside ~/.m2/settings.xml). See also Tips and Tricks for an illustration of this.
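A minimal sketch of that profiles-plus-filtering approach (the profile ids, property name, and values below are made up for illustration): filename.properties would contain a placeholder such as host=${app.host}, and the POM would define the per-environment values and enable filtering:
<profiles>
    <profile>
        <id>dev</id>
        <activation>
            <activeByDefault>true</activeByDefault>
        </activation>
        <properties>
            <!-- value used for local development builds -->
            <app.host>localhost</app.host>
        </properties>
    </profile>
    <profile>
        <id>prod</id>
        <properties>
            <!-- value used for production builds -->
            <app.host>prod.example.com</app.host>
        </properties>
    </profile>
</profiles>
<build>
    <resources>
        <resource>
            <directory>src/main/resources</directory>
            <!-- replaces ${app.host} tokens in filename.properties at build time -->
            <filtering>true</filtering>
        </resource>
    </resources>
</build>
Running mvn package -Pprod would then produce an artifact whose filename.properties carries the production value.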
Or, instead of putting properties inside profiles, you could put them in "filter files" and apply the right filter depending on the profile. This is a little variation of the above solution. See A Maven2 multi-environment filter setup for an example.