How to set a property in a gwt.xml file via Maven properties? - gwt

I have a property in my root pom.xml file: gecko1_8. I want to place this value into my gwt.xml file.
So I put this property into gwt.xml and added the following to the build section:
<resources>
<resource>
<directory>src/main/resources</directory>
<filtering>true</filtering>
<excludes>
<exclude>VAADIN.themes/*</exclude>
</excludes>
</resource>
</resources>
But at the end build failed with the error:
ERROR: Invalid property value '${gwt.user.agents}'
ERROR: Failure while parsing XML
How can I place values from pom.xml into a gwt.xml file via properties?
UPDATED
Interesting thing: when I use "mvn resources:resources", the property's value is written correctly to the gwt.xml file, but if I run "mvn clean install -pl com.myproject.module:submodule" it fails with "invalid property value".

You have to define a Maven profile (better to define a specific profile for each case) in your pom like this:
<profile>
<id>gecko</id>
<activation>
<activeByDefault>false</activeByDefault>
</activation>
<properties>
<user.agent.gecko>
<![CDATA[<set-property name="user.agent" value="gecko,gecko1_8" />]]>
</user.agent.gecko>
</properties>
<build>
<resources>
<resource>
<directory>src/main/resources</directory>
<includes>
<include>**/*YourGWTModule.gwt.xml</include>
</includes>
<filtering>true</filtering>
</resource>
</resources>
<defaultGoal>process-resources</defaultGoal>
</build>
</profile>
<properties>
<!--<user.agent>ie6,ie8,gecko,gecko1_8,opera,safari</user.agent>-->
<user.agent.all> </user.agent.all>
<user.agent.ie6> </user.agent.ie6>
<user.agent.ie8> </user.agent.ie8>
<user.agent.gecko> </user.agent.gecko>
<user.agent.opera> </user.agent.opera>
<user.agent.safari> </user.agent.safari>
</properties>
Then set it in your YourGWTModule.gwt.xml like this:
<set-property name="locale" value="default" />
<!-- Specified through pom.xml profiles -->
${user.agent.all}
${user.agent.ie6}
${user.agent.ie8}
${user.agent.gecko}
${user.agent.safari}
${user.agent.opera}
</module>
Finally run maven with profile:
mvn -P gecko install
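With the gecko profile active, resource filtering replaces the ${user.agent.gecko} placeholder in the copied module descriptor, so the gwt.xml under target/classes should end up containing roughly:
<set-property name="user.agent" value="gecko,gecko1_8" />
while the remaining ${user.agent.*} placeholders expand to the blank defaults declared above.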

There's one thing that I didn't mention, but it turned out to be important. Plugins that are used during the build can have their own goals, and one of these goals can "redo" what the maven-resources-plugin did.
So I had the vaadin-maven-plugin:
<plugin>
<groupId>com.vaadin</groupId>
<artifactId>vaadin-maven-plugin</artifactId>
<version>${vaadin.plugin.version}</version>
<configuration>
...
</configuration>
<executions>
<execution>
<goals>
<goal>resources</goal>
<goal>compile</goal>
</goals>
</execution>
</executions>
</plugin>
Removing the <goal>resources</goal> entry fixed the issue, and now mvn clean install filters the properties in my gwt.xml correctly.
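For reference, the working plugin block then looks roughly like this (same configuration as before, just without the resources goal):
<plugin>
<groupId>com.vaadin</groupId>
<artifactId>vaadin-maven-plugin</artifactId>
<version>${vaadin.plugin.version}</version>
<configuration>
<!-- unchanged -->
</configuration>
<executions>
<execution>
<goals>
<goal>compile</goal>
</goals>
</execution>
</executions>
</plugin>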
Hope this solution helps those who run into the same issue.


Bind Maven resource filtering to specific phase

I am on Eclipse 2019-09 and trying the following:
I created a file src/main/resources/build/build.properties to hold some information about the current build:
build.version=${project.version}
build.number=${buildNumber}
build.date=${buildTimestamp}
When building the project, the variables get replaced correctly and the file inside the target folder is rendered correctly. When changing code in Eclipse (without a Maven build), the buildNumber gets replaced by its literal variable name again.
I suppose it has something to do with m2e doing the filtering too often (not only when clicking build via the menu). The buildNumber is only available during the "real" build and so it is not available on other occasions, hence it is not getting replaced.
Can I tell Maven to only execute the variable replacement during a specific build phase?
Relevant excerpt from the pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>123</groupId>
<artifactId>456</artifactId>
<version>789</version>
<properties>
<maven.build.timestamp.format>dd.MM.yyyy HH:mm:ss</maven.build.timestamp.format>
<buildTimestamp>${maven.build.timestamp}</buildTimestamp>
</properties>
<!-- Dummy for Build Number Plugin -->
<scm>
<connection>scm:svn:http://127.0.0.1/dummy</connection>
<developerConnection>scm:svn:https://127.0.0.1/dummy</developerConnection>
<tag>HEAD</tag>
<url>http://127.0.0.1/dummy</url>
</scm>
<build>
<finalName>${project.artifactId}-${project.version}.${buildNumber}</finalName>
<resources>
<resource>
<directory>src/main/resources</directory>
<filtering>true</filtering>
</resource>
</resources>
<plugins>
<!-- Maven Build Number Plugin -->
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>buildnumber-maven-plugin</artifactId>
<version>1.4</version>
<executions>
<execution>
<id>buildnumber</id>
<phase>clean</phase>
<goals>
<goal>create</goal>
</goals>
</execution>
</executions>
<configuration>
<format>{0,number}</format>
<items>
<item>buildNumber</item>
</items>
<doCheck>false</doCheck>
<doUpdate>false</doUpdate>
<revisionOnScmFailure>unknownbuild</revisionOnScmFailure>
</configuration>
</plugin>
</plugins>
</build>
</project>
Working example showing my problems:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.whatever</groupId>
<artifactId>test</artifactId>
<version>0.0.1-SNAPSHOT</version>
<properties>
<maven.compiler.source>12</maven.compiler.source>
<maven.compiler.target>12</maven.compiler.target>
<project.build.sourceEncoding>Cp1252</project.build.sourceEncoding>
<maven-jar-plugin.version>3.5.0</maven-jar-plugin.version>
<maven.build.timestamp.format>dd.MM.yyyy HH:mm:ss</maven.build.timestamp.format>
<buildTimestamp>${maven.build.timestamp}</buildTimestamp>
</properties>
<!-- Dummy for Build Number Plugin -->
<scm>
<connection>scm:svn:http://127.0.0.1/dummy</connection>
<developerConnection>scm:svn:https://127.0.0.1/dummy</developerConnection>
<tag>HEAD</tag>
<url>http://127.0.0.1/dummy</url>
</scm>
<build>
<finalName>${project.artifactId}-${project.version}.${buildNumber}</finalName>
<resources>
<resource>
<directory>src/main/resources</directory>
<filtering>true</filtering>
</resource>
</resources>
<plugins>
<!-- Copy dependencies into target/lib -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<executions>
<execution>
<id>copy-dependencies</id>
<phase>package</phase>
<goals>
<goal>copy-dependencies</goal>
</goals>
<configuration>
<outputDirectory>${project.build.directory}/lib</outputDirectory>
<overWriteReleases>false</overWriteReleases>
<overWriteSnapshots>false</overWriteSnapshots>
<overWriteIfNewer>true</overWriteIfNewer>
</configuration>
</execution>
</executions>
</plugin>
<!-- Create JAR -->
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<configuration>
<archive>
<manifest>
<mainClass>org.whatever.Main</mainClass>
</manifest>
</archive>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
</plugin>
<!-- Maven Build Number Plugin -->
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>buildnumber-maven-plugin</artifactId>
<version>1.4</version>
<executions>
<execution>
<id>buildnumber</id>
<goals>
<goal>create</goal>
</goals>
</execution>
</executions>
<configuration>
<format>{0,number}</format>
<items>
<item>buildNumber</item>
</items>
<doCheck>false</doCheck>
<doUpdate>false</doUpdate>
<revisionOnScmFailure>unknownbuild</revisionOnScmFailure>
</configuration>
</plugin>
</plugins>
</build>
</project>
Create a simple class org.whatever.test.Main.java with just a sysout:
package org.whatever.test;
public class Main {
public static void main(String... args) {
System.out.println("Hey!");
}
}
And create a build.properties under src/main/resources:
build.version=${project.version}
build.number=${buildNumber}
build.date=${buildTimestamp}
Building increases the build number by three. Changing your Main.java (causing recompilation) makes the replaced version of the build.properties in the target directory lose the build number.
EDIT: Fixed the increment-by-three issue, thanks to khmarbaise. I am still left with the replacement issue. https://maven.apache.org/plugins/maven-resources-plugin/resources-mojo.html states
Binds by default to the lifecycle phase: process-resources.
which comes just before compile. Eclipse probably compiles on every code change (just guessing) and replaces the resources. In this phase no buildNumber is available (and I do not want to increase the number on every code change). I cannot figure out how to bind to another phase or find a more elegant way.
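One thing to experiment with (a sketch, untested here) is overriding Maven's built-in binding, whose execution id is default-resources, and moving it to a later phase so filtering runs after the build number exists:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-resources-plugin</artifactId>
<executions>
<execution>
<!-- default-resources is the id of the built-in resources:resources binding -->
<id>default-resources</id>
<phase>prepare-package</phase>
<goals>
<goal>resources</goal>
</goals>
</execution>
</executions>
</plugin>
Be aware that m2e may still run its own incremental resource filtering regardless of the phase, so this would mainly affect command-line builds.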
EDIT: The more I am thinking about it the more I am thinking of keeping it this way. Building via Maven increments my build number. The build process also zips my class files, libraries etc. Inside the target directory and in the zip files, now there is the correct build number. Changing my code results in the build number being reset to ${buildNumber} in target/classes which is technically correct. I am not on the same build as I was before changing the code, I am somewhere between builds.
I think the Eclipse build won't interfere with Maven's build.
The real problem is that the maven-resources-plugin is invoked in the 'process-resources' phase, and your buildnumber-maven-plugin runs after that.
Here is my solution; I hope it works for you.
https://stackoverflow.com/a/70604055/13049551
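If the linked approach doesn't fit, a simpler variation (a sketch, not the linked answer) is to make buildnumber:create run before process-resources, for example by binding it to the validate phase:
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>buildnumber-maven-plugin</artifactId>
<version>1.4</version>
<executions>
<execution>
<id>buildnumber</id>
<!-- validate runs before process-resources, so ${buildNumber} is set before filtering -->
<phase>validate</phase>
<goals>
<goal>create</goal>
</goals>
</execution>
</executions>
</plugin>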

How to correctly manage feature configuration deployment in JBoss Fuse 6.2.1?

I am trying to evaluate JBoss Fuse as an integration platform, and I have the following question regarding deployment.
I am trying to set up a fabric and use profiles, more specifically feature repositories for camel/blueprint component deployment.
I am having the following issue with externalizing the component configuration: when I update the snapshot of the configuration file artifact, the configuration changes are not picked up by the container.
Moreover, when I completely remove the profile from the container, the PID config file stays on the server in the etc/ folder.
There is also an additional issue during deployment: the Camel bundle gets activated before the config PID file is loaded, resulting in an exception in Aries Blueprint, and I have to refresh the OSGi bundle manually.
Here is what the feature repository file looks like:
<?xml version="1.0" encoding="UTF-8"?>
<features name="fuse-poc">
<feature name="fuse-poc-common" version="${project.version}">
<bundle>mvn:com.myorg.fuse/common/${project.version}</bundle>
</feature>
<feature name="fuse-poc-camel" version="${project.version}">
<feature>fuse-poc-common</feature>
<config name="com.myorg.fuse.poc.camel">
test.value=ENC(5XdDgfKwwhMTmbo1z874eQ==)
</config>
<bundle>mvn:com.myorg.fuse/fuse-poc-camel/${project.version}</bundle>
</feature>
<feature name="fuse-poc-activemq" version="${project.version}">
<feature>fuse-poc-common</feature>
<configfile finalname="etc/com.myorg.fuse.poc.jms.cfg">
mvn:com.myorg.fuse/feature/${project.version}/cfg/dev
</configfile>
<bundle>mvn:com.myorg.fuse/fuse-poc-camel-activemq/${project.version}</bundle>
</feature>
</features>
The projects themselves are simple Camel archetype projects, one with a basic route with logging and one with a route using ActiveMQ and cm:property-placeholder in blueprint.xml.
Here is the corresponding build section in maven:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>com.myorg.fuse</groupId>
<artifactId>fuse-poc-parent</artifactId>
<version>1.0.0-SNAPSHOT</version>
</parent>
<artifactId>feature</artifactId>
<packaging>pom</packaging>
<name>FUSE PoC Feature Repository</name>
<build>
<resources>
<resource>
<directory>src/main/resources</directory>
<filtering>true</filtering>
</resource>
</resources>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-resources-plugin</artifactId>
<version>2.6</version>
<executions>
<execution>
<id>filter</id>
<phase>generate-resources</phase>
<goals>
<goal>resources</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>build-helper-maven-plugin</artifactId>
<version>1.10</version>
<executions>
<execution>
<id>attach-artifacts</id>
<phase>package</phase>
<goals>
<goal>attach-artifact</goal>
</goals>
<configuration>
<artifacts>
<artifact>
<file>target/classes/fuse-poc.xml</file>
<type>xml</type>
<classifier>features</classifier>
</artifact>
<artifact>
<file>src/main/resources/env/dev/com.myorg.fuse.poc.jms.cfg</file>
<type>cfg</type>
<classifier>dev</classifier>
</artifact>
</artifacts>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
Here are the commands I use to deploy the feature:
fabric:version-create 1.1
fabric:profile-create --parent jboss-fuse-full fuse-poc
fabric:profile-edit --repository mvn:com.myorg.fuse/feature/1.0.0-SNAPSHOT/xml/features fuse-poc 1.1
fabric:profile-edit --feature fuse-poc-camel fuse-poc 1.1
fabric:profile-edit --feature fuse-poc-activemq fuse-poc 1.1
fabric:container-upgrade 1.1 root
fabric:container-add-profile root fuse-poc
After I manually do osgi:refresh <bundle id>, the bundle starts fine.
fabric:container-remove-profile root fuse-poc
All the config PID entries stay in the config, and all OSGi bundles also stay installed. How do I correctly undeploy artifacts so that the container is clean and an updated version of the same artifact can be deployed without side effects?
I suspect I am doing something conceptually wrong, because ultimately the issue above leads to the following problem: if I add a property to either the config element in the features file or the .cfg file, install the project using Maven again, and then do container-remove-profile, profile-refresh, and container-add-profile, the config does not change at all. It can only be redeployed correctly if I manually run the config:delete command on my PID in the console.
So I finally had time to dig through the sources and cross-reference the Karaf, Fabric and JBoss documentation.
Here is how it works:
The code responsible for loading/unloading bundles through the feature repository system is located in fabric-agent-1.2.0.redhat-621084.jar.
Notably, the class io.fabric8.agent.service.FeatureConfigInstaller is responsible for the configuration entries, and the class io.fabric8.agent.service.Agent does the overall deployment.
There is no code at all to uninstall the config file or remove config entries. This makes it troublesome to do development with SNAPSHOTs.
However, there is a useful 'override' property that can be specified on the <configfile> element: if it is set to true, the config PID file will be overwritten on deployment, which is exactly what I wanted for local development.
For the <config> element, there is an undocumented 'append' property which, when set to true, is supposed to append new entries to the config. However, it is hilariously bugged: it turns out it also removes all previously defined properties when trying to append. Conclusion: use the <configfile> functionality instead.
Additionally, for externalizing environment configuration I've come to the conclusion that the simplest way is to produce multiple feature XML descriptors from the same file through different Maven build profiles. The project files consequently look like this:
pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>com.myorg.fuse</groupId>
<artifactId>fuse-poc-parent</artifactId>
<version>1.0.0-SNAPSHOT</version>
</parent>
<artifactId>feature</artifactId>
<packaging>pom</packaging>
<name>FUSE PoC Feature Repository</name>
<properties>
<build.environment>dev</build.environment>
</properties>
<build>
<resources>
<resource>
<directory>src/main/resources</directory>
<filtering>true</filtering>
</resource>
</resources>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-resources-plugin</artifactId>
<version>2.6</version>
<executions>
<execution>
<id>filter</id>
<phase>generate-resources</phase>
<goals>
<goal>resources</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>build-helper-maven-plugin</artifactId>
<version>1.10</version>
<executions>
<execution>
<id>attach-artifacts</id>
<phase>package</phase>
<goals>
<goal>attach-artifact</goal>
</goals>
<configuration>
<artifacts>
<artifact>
<file>target/classes/fuse-poc.xml</file>
<type>xml</type>
<classifier>features-${build.environment}</classifier>
</artifact>
<artifact>
<file>src/main/resources/env/${build.environment}/com.myorg.fuse.poc.cfg</file>
<type>cfg</type>
<classifier>${build.environment}</classifier>
</artifact>
</artifacts>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
<profiles>
<profile>
<id>local</id>
<properties>
<build.environment>local-dev</build.environment>
</properties>
</profile>
<profile>
<id>dev</id>
<properties>
<build.environment>dev</build.environment>
</properties>
</profile>
<profile>
<id>uat</id>
<properties>
<build.environment>uat</build.environment>
</properties>
</profile>
<profile>
<id>prod</id>
<properties>
<build.environment>prod</build.environment>
</properties>
</profile>
</profiles>
</project>
feature.xml:
<?xml version="1.0" encoding="UTF-8"?>
<features name="fuse-poc">
<feature name="fuse-poc-common" version="${project.version}">
<bundle>mvn:com.myorg.fuse/common/${project.version}</bundle>
</feature>
<feature name="fuse-poc-camel" version="${project.version}">
<feature>fuse-poc-common</feature>
<configfile finalname="etc/com.myorg.fuse.poc.cfg" override="true">
mvn:com.myorg.fuse/feature/${project.version}/cfg/${build.environment}
</configfile>
<bundle>mvn:com.myorg.fuse/fuse-poc-camel/${project.version}</bundle>
</feature>
</features>
This way you can directly specify the environment when creating the profile for deploying the feature:
fabric:profile-create --parent jboss-fuse-full fuse-poc
fabric:profile-edit --repository mvn:com.myorg.fuse/feature/1.0.0-SNAPSHOT/xml/features-local-dev fuse-poc
Note: set up the continuous integration tool to build the feature module for all profiles except 'local-dev', since that one needs to be used by developers locally and should not be downloaded from the central repository.
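For example, a CI job could run one build per environment profile (profile ids as defined in the pom above), while developers build the local variant themselves:
mvn clean deploy -P dev
mvn clean deploy -P uat
mvn clean deploy -P prod
mvn clean install -P local (run by developers on their own machines)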
On a final note, once I resolved the config handling issues and the bundles started deploying correctly, the platform also started undeploying bundles correctly when removing the profile from the container. I suspect there is some issue with uninstalling bundles whenever the original deployment ended in failure.

SEAM GWT Integration

I am trying to integrate GWT with Seam. I followed the docs and tried to run the example as follows:
I created a GWT project using Eclipse Galileo and created the classes as given in the example.
I then added the Seam 2.0.2 jars to the build path.
I compiled the application using the Google GWT compiler via the Eclipse UI.
Finally, I ran the application.
First I would like to know whether the above steps are correct. After running the application I do not get the desired result.
Also, is this the only way to integrate GWT with Seam?
Update
I have got this example running using Ant, but the aim of my exercise is to run it via the Eclipse UI.
I created my own project named GWTTest and tried to recreate the example in the Eclipse UI. There are a few things that I have noticed: GWT Compile via the Eclipse UI creates a directory named gwttest inside the war file, whereas the directory structure created by Ant is different.
In the example there is a piece of code in AskQuestionWidget's getService function, as follows:
String endpointURL = GWT.getModuleBaseURL() + "seam/resource/gwt";
How do I modify this code to suit my directory structure?
We use Seam + RichFaces + GWT and it works very well. Although we build everything with Maven, I suppose you can use Ant as well. The general idea is to start the whole web application in GWT development mode. You don't have to compile everything (which takes a long time with the GWT compiler); development mode compiles requested resources on demand. By running the GWT application this way, you can also debug client-side code.
It is also possible to call GWT methods in response to Seam actions.
Update:
I can elaborate on our solution a bit:
Maven
Your project should be configured with packaging: war. There are official instructions on setting up Seam with Maven (and RichFaces):
http://docs.jboss.org/seam/2.2.1.CR2/reference/en-US/html/dependencies.html#d0e34791
http://docs.jboss.org/richfaces/latest_3_3_X/en/devguide/html/SettingsForDifferentEnvironments.html
For GWT, add the following sections to your pom.xml:
<dependency>
<groupId>com.google.gwt</groupId>
<artifactId>gwt-user</artifactId>
<version>2.1.0</version>
<scope>provided</scope> <!-- prevents from including this in war -->
</dependency>
<dependency>
<groupId>com.google.gwt</groupId>
<artifactId>gwt-dev</artifactId>
<version>2.1.0</version>
<scope>provided</scope> <!-- prevents from including this in war -->
</dependency>
<dependency>
<groupId>pl.ncdc.gwt</groupId>
<artifactId>gwt-servlet-war</artifactId>
<version>2.1.0</version>
<type>war</type> <!-- adds gwt-servlet.jar to your war, but not to your classpath -->
</dependency>
<!-- build section -->
<build>
<resources>
<resource>
<directory>src/main/resources</directory>
</resource>
<resource>
<directory>src/main/java</directory>
<includes>
<include>**/client/**/*.java</include>
<include>**/client/**/*.properties</include>
<include>**/shared/**/*.java</include>
<include>**/shared/**/*.properties</include>
<include>**/*.gwt.xml</include>
</includes>
</resource>
</resources>
<testResources>
<testResource>
<directory>src/test/java</directory>
<includes>
<include>**/client/**/*.java</include>
<include>**/client/**/*.properties</include>
<include>**/shared/**/*.java</include>
<include>**/shared/**/*.properties</include>
<include>**/*.gwt.xml</include>
</includes>
</testResource>
</testResources>
<plugins>
<plugin> <!-- dirty hack for GWT issue #3439 - it is not really fixed -->
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-antrun-plugin</artifactId>
<executions>
<execution>
<id>remove-javax</id>
<phase>compile</phase>
<configuration>
<tasks>
<delete dir="${project.build.directory}/classes/javax" />
</tasks>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>gwt-maven-plugin</artifactId>
<version>1.3.2.google</version>
<configuration>
<extraJvmArgs>-Xmx512M</extraJvmArgs>
<gwtVersion>${gwt.version}</gwtVersion>
<modules>
<module>com.company.gwt.project.module.Module</module>
</modules>
<soyc>false</soyc>
<draftCompile>${gwt.draft.compile}</draftCompile> <!-- you can control this with profiles -->
<localWorkers>2</localWorkers><!-- in theory should speed things up on our quad CPU hudson -->
<style>${gwt.style}</style> <!-- you can control this with profiles -->
</configuration>
<executions>
<execution>
<id>compile</id>
<phase>prepare-package</phase>
<goals>
<goal>compile</goal>
</goals>
</execution>
<execution>
<id>gwt-test</id>
<phase>integration-test</phase>
<goals>
<goal>test</goal>
</goals>
<configuration>
<includes>**/*GwtTestSuite.java</includes>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-war-plugin</artifactId>
<version>2.1</version>
<configuration>
<archiveClasses>true</archiveClasses>
<warSourceDirectory>src/main/webapp-empty</warSourceDirectory> <!-- just empty dir for workaround -->
<webResources>
<resource>
<directory>src/main/webapp</directory>
<excludes>
<exclude>app.*</exclude> <!-- name of you gwt module(s) - rename-to in gwt.xml -->
<exclude>WEB-INF/web.xml</exclude>
</excludes>
</resource>
<resource>
<directory>src/main/webapp</directory>
<includes>
<include>WEB-INF/web.xml</include>
</includes>
<filtering>true</filtering>
</resource>
</webResources>
</configuration>
</plugin>
</plugins>
</build>
This configuration should produce a war with both Seam and GWT compiled. If you want to use such a project in development mode, also put this in your pom.xml:
<dependency>
<groupId>com.xemantic.tadedon</groupId>
<artifactId>tadedon-gwt-dev</artifactId>
<version>1.0-SNAPSHOT</version>
<scope>provided</scope>
</dependency>
Then add -server com.xemantic.tadedon.gwt.dev.JettyLauncher to your Google web application launcher. This is a Maven-friendly Jetty launcher, which might be necessary in some situations.
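For example, the program arguments of the launch configuration might then look roughly like this (the war directory, startup URL and module name below are placeholders for your own project):
-server com.xemantic.tadedon.gwt.dev.JettyLauncher -war target/yourapp -startupUrl index.html com.company.gwt.project.module.Module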
I hope this helps. Are you interested in communication between the GWT and RichFaces parts of the application?
If you want, take a look at <SEAM_HOME>/examples/remoting/gwt. From there, run (make sure you have installed Ant first):
ant
Here is its readme.txt file:
You can view the example at:
http://localhost:8080/seam-helloworld/org.jboss.seam.example.remoting.gwt.HelloWorld/HelloWorld.html
GWT: If you want to rebuild the GWT front end, you will need to download GWT, and configure build.properties to point to it. You can then run "ant gwt-compile" from this directory. It is pre-built by default. If you want to use the GWT hosted mode, well, read all about it from the GWT docs !

Problem when importing Maven2 project in Eclipse

In my project, I have a resources directory (src/main/resources) that contains properties and XML files.
I want to filter only the properties files, but not any other kind of file (XML, for example). Thus, I've set this in my pom.xml:
<build>
<resources>
<resource>
<directory>src/main/resources</directory>
<filtering>true</filtering>
<includes>
<include>**/*.properties</include>
</includes>
</resource>
<resource>
<directory>src/main/resources</directory>
<filtering>false</filtering>
<excludes>
<exclude>**/*.properties</exclude>
</excludes>
</resource>
</resources>
</build>
This is working well when I run a Maven 2 package command, i.e. both XML and properties files are included in my final JAR, and only properties files have been filtered.
However, as I want to import this project into Eclipse, when I run the command mvn eclipse:eclipse and import the project, I have a problem with the source folders declared in my project properties.
In the "Java Build Path" settings of Eclipse for my project, in the "Source" tab, I see the src/main/resources directory, but Eclipse also adds filters that exclude all Java files (Excluded: **/*.java) and include only properties files (Included: **/*.properties).
In the .classpath file generated, I get this line:
<classpathentry kind="src" path="src/main/resources" including="**/*.properties" excluding="**/*.java"/>
This way, the JAR built by Eclipse is not correct, as my XML files are missing from it.
How can I solve this problem?
Edit: regarding this page, I've added this to my pom.xml:
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-eclipse-plugin</artifactId>
<version>2.6</version>
<configuration>
<sourceIncludes>
<sourceInclude>**/*.xml</sourceInclude>
</sourceIncludes>
</configuration>
</plugin>
</plugins>
However, the generated .classpath is not modified with the appropriate information...
Edit again.
The addition in my previous edit works only for version 2.6.1+ of the Eclipse plugin, not for 2.6. So, I've tried with version 2.7. However, I don't know how to force the Eclipse plugin to not define the including attribute:
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-eclipse-plugin</artifactId>
<version>2.7</version>
<configuration>
<sourceIncludes>
<sourceInclude>*</sourceInclude>
</sourceIncludes>
</configuration>
</plugin>
</plugins>
If I run the mvn eclipse:eclipse command, I get the following error:
Request to merge when 'filtering' is not identical. Original=resource src/main/resources: output=target/classes, include=[**/*.properties], exclude=[**/*.java], test=false, filtering=true, merging with=resource src/main/resources: output=target/classes, include=[], exclude=[**/*.properties|**/*.java], test=false, filtering=false
Well, the maven-eclipse-plugin is very buggy :( and we use version 2.5. The solution we found is to copy resources to a different folder. For a regular Maven build we use default filtering, but for Eclipse we have a special profile. Here is the maven-resources-plugin configuration in this profile:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-resources-plugin</artifactId>
<executions>
<execution>
<id>copy-resources-step1</id>
<goals>
<goal>copy-resources</goal>
</goals>
<phase>initialize</phase>
<configuration>
<outputDirectory>${project.build.directory}/for-eclipse/resources</outputDirectory>
<resources>
<resource>
<directory>${basedir}/src/main/resources</directory>
<filtering>true</filtering>
</resource>
</resources>
</configuration>
</execution>
<execution>
<id>copy-resources-step2</id>
<goals>
<goal>copy-resources</goal>
</goals>
<phase>initialize</phase>
<configuration>
<outputDirectory>${basedir}/src/main/resources</outputDirectory>
<resources>
<resource>
<directory>${project.build.directory}/for-eclipse/resources</directory>
</resource>
</resources>
</configuration>
</execution>
</executions>
</plugin>
Of course, you must modify the include/exclude sections for your needs.
I did some testing with the maven-eclipse-plugin versions 2.6, 2.5.1, 2.7 and 2.8-SNAPSHOT, and indeed none of them produces the expected result.
The only workaround I've been able to find is to use another directory for the resources you don't want to filter. Something like this:
...
<build>
<resources>
<resource>
<directory>src/main/resources1</directory>
<filtering>true</filtering>
<includes>
<include>**/*.properties</include>
</includes>
</resource>
<resource>
<directory>src/main/resources2</directory>
<filtering>false</filtering>
<includes>
<include>**/*.xml</include>
</includes>
</resource>
</resources>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-eclipse-plugin</artifactId>
<version>2.6</version>
</plugin>
</plugins>
</build>
I agree that this is ugly. But at least, it produces a .classpath with the following entries:
<classpathentry kind="src" path="src/main/resources1" including="**/*.properties" excluding="**/*.java"/>
<classpathentry kind="src" path="src/main/resources2" including="**/*.xml" excluding="**/*.java"/>
That should allow you to deploy on Tomcat.
By the way, I wouldn't use version 2.7 of the plugin because it doesn't work (see this thread). Version 2.6 may not produce the expected output, but at least it works.
Use properties to have Eclipse and Maven build into different directories. Now you can use Eclipse to quickly develop your code and use Maven to deploy it as a JAR.
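A minimal sketch of that idea, assuming you drive the output directory from a property (the directory names are just examples):
<properties>
<build.dir>target</build.dir>
</properties>
<build>
<directory>${build.dir}</directory>
</build>
<profiles>
<profile>
<!-- activate in the Eclipse project settings or with -P eclipse -->
<id>eclipse</id>
<properties>
<build.dir>target-eclipse</build.dir>
</properties>
</profile>
</profiles>
With this, command-line Maven keeps packaging into target while Eclipse builds into target-eclipse, so the two no longer overwrite each other's output.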
I suggest that you run the mvn package goal as an external tool from Eclipse. I don't think that Eclipse can handle complex Maven workflows.
Even the m2eclipse plugin, which I use and highly recommend, has trouble with complex workflows.

Using Maven for deployment

I have this task for a project with 4 nested subprojects using Maven:
For each child: jar up the resource directory, including project dependencies.
Move up to the parent project.
With a single command, extract all created archives into various remote destinations (full install), which may include an HTTP server, app server, file server, etc. (mostly *NIX). The destination is provided at the subproject level.
It should also be possible to unzip/copy from an individual subproject (partial install).
The files are not Java; they are mostly various scripts and HTML.
I'm looking at various plugins to help with the task: assembly, dependency, antrun, unzip. Dependency looks promising, but I need to unzip not only the dependency jars but the (sub)project content as well. Also, since I can't really tie the operation to the Maven lifecycle, how would I trigger the remote install? mvn dependency:unpack? That's not very descriptive or intuitive. Is it possible to create a custom goal (e.g. project:install) without writing a plugin?
Using Maven is a company standard, so please do not offer alternatives; I'm pretty much stuck with what I have.
Ok, I think the following might do what you need. The drawback of this approach is that there will be an interval between each deployment as the subsequent build is executed. Is this acceptable?
Define a profile in each project with the same name (say "publish"). Within that profile you can define a configuration to use the antrun-plugin to deliver the files with FTP (see below).
In the parent project you'll have a modules element, defining each project as a module. If you run mvn install -P publish, each project will be built in turn with the publish profile enabled, and the final artifact published to the target during the install phase. If you need to deploy additional files, modify the include element accordingly.
Note that the parameters for the FTP task have been set as properties; this allows them to be overridden from the command line and/or inherited from the parent POM.
<profiles>
<profile>
<id>publish</id>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-antrun-plugin</artifactId>
<executions>
<execution>
<id>ftp</id>
<phase>install</phase>
<configuration>
<tasks>
<ftp action="send"
server="${ftp.host}" remotedir="${ftp.remotedir}"
userid="${ftp.userid}" password="${ftp.password}"
depends="${ftp.depends}" verbose="${ftp.verbose}">
<fileset dir="${project.build.directory}">
<include
name="${project.build.finalName}.${project.packaging}"/>
</fileset>
</ftp>
</tasks>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
<dependencies>
<dependency>
<groupId>commons-net</groupId>
<artifactId>commons-net</artifactId>
<version>1.4.1</version>
</dependency>
<dependency>
<groupId>ant</groupId>
<artifactId>ant-commons-net</artifactId>
<version>1.6.5</version>
</dependency>
<dependency>
<groupId>ant</groupId>
<artifactId>ant-nodeps</artifactId>
<version>1.6.5</version>
</dependency>
</dependencies>
</plugin>
</plugins>
</build>
<properties>
<ftp.host>hostname</ftp.host>
<ftp.remotedir>/opt/path/to/install</ftp.remotedir>
<ftp.userid>user</ftp.userid>
<ftp.password>mypassword</ftp.password>
<ftp.depends>yes</ftp.depends>
<ftp.verbose>no</ftp.verbose>
</properties>
</profile>
</profiles>
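Since the FTP parameters are plain properties, they can be overridden per target at invocation time, for example (host and credentials here are placeholders):
mvn install -P publish -Dftp.host=deploy.example.com -Dftp.userid=deployer -Dftp.password=secret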
Update, based on your comment: you could use the dependency plugin to download each dependency, except that a parent can't have a dependency on a child, and the parent is built before the child, so it would have to be another project. You also need to store the information about where to deploy each artifact somewhere; at the moment you have the target information in the individual projects, so it isn't accessible in the deployer project.
Taking this approach, you can define multiple profiles in the new project, one for each artifact. Each profile defines a dependency:copy execution to obtain the jar and an antrun execution for one of the projects. Common configuration (such as the dependencies for the antrun plugin) can be pulled out of the profiles. Also be aware that the properties will be merged if you define multiple profiles, so you may need to qualify them with the artifact name, for example ftp.artifact1.host.
<profiles>
<profile>
<id>deploy-artifact1</id>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<executions>
<execution>
<id>copy-dependency</id>
<phase>prepare-package</phase>
<goals>
<goal>copy</goal>
</goals>
<configuration>
<artifactItems>
<artifactItem>
<groupId>name.seller.rich</groupId>
<artifactId>artifact1</artifactId>
<version>1.0.0</version>
<type>jar</type>
<overWrite>false</overWrite>
</artifactItem>
</artifactItems>
<outputDirectory>${project.build.directory}/deploy-staging</outputDirectory>
<overWriteReleases>false</overWriteReleases>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-antrun-plugin</artifactId>
<executions>
<execution>
<id>ftp</id>
<phase>install</phase>
<configuration>
<tasks>
<ftp action="send"
server="${ftp.host}" remotedir="${ftp.remotedir}"
userid="${ftp.userid}" password="${ftp.password}"
depends="${ftp.depends}" verbose="${ftp.verbose}">
<fileset dir="${project.build.directory}" includes="deploy-staging/"/>
</ftp>
</tasks>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
<properties>
<!--if the properties differ between targets, qualify them with the artifact name-->
<ftp.host>hostname</ftp.host>
<ftp.remotedir>/opt/path/to/install</ftp.remotedir>
<ftp.userid>user</ftp.userid>
<ftp.password>mypassword</ftp.password>
<ftp.depends>yes</ftp.depends>
<ftp.verbose>no</ftp.verbose>
</properties>
</profile>
</profiles>
The POM snippet below will help copy jar files from the project build directory to a remote SFTP/FTP server.
Use the command mvn install -Dftp.password=password
Since I want to pass the password from the command prompt for security reasons, I have used -Dftp.password=password.
After executing the above command, all the jar files from the Maven project's target folder will be deployed to the /MAVEN/ folder on server.com.
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-antrun-plugin</artifactId>
<executions>
<execution>
<id>ftp</id>
<phase>install</phase>
<configuration>
<tasks>
<scp todir="user@server.com:/MAVEN/"
sftp="true" port="22" trust="true" password="${ftp.password}"
failonerror="false" verbose="true" passphrase="">
<fileset dir="${project.build.directory}">
<include name="*.jar" />
</fileset>
</scp>
</tasks>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
<dependencies>
<dependency>
<groupId>org.apache.ant</groupId>
<artifactId>ant-jsch</artifactId>
<version>1.9.4</version>
</dependency>
</dependencies>
</plugin>
This does not work without a passphrase.
<profile>
<id>publish</id>
<build>
<plugins>
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<executions>
<execution>
<id>scp</id>
<phase>deploy</phase>
<configuration>
<tasks>
<scp todir="user@host:some/remote/dir"
sftp="true"
keyfile="${user.home}/.ssh/devel-deploy.id_dsa"
failonerror="false"
verbose="true"
passphrase="nopass"
>
<fileset dir="${project.build.directory}">
<include
name="${project.build.finalName}.${project.packaging}"/>
</fileset>
</scp>
</tasks>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
<dependencies>
<dependency>
<groupId>org.apache.ant</groupId>
<artifactId>ant-jsch</artifactId>
<version>1.9.4</version>
</dependency>
</dependencies>
</plugin>
</plugins>
</build>
</profile>
However, my favourite is
<profile>
<id>upload-devel</id>
<build>
<plugins>
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<executions>
<execution>
<id>upload-devel</id>
<phase>deploy</phase>
<configuration>
<target>
<exec executable="rsync" failonerror="false">
<arg value="-aiz" />
<arg value="${project.build.directory}/${project.artifactId}.${project.packaging}" />
<arg value="user#host:some/remote/dir/." />
</exec>
</target>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
</profile>
though I don't know how portable that is across different platforms.
I would look at using the maven-assembly-plugin to do this.
Something like this can be used to grab the files from the child projects and stuff them in output directories.
<assembly>
<id>xyzzy</id>
<formats>
<format>zip</format>
</formats>
<fileSets>
<fileSet>
<directory>../subproject1/target/</directory>
<outputDirectory>/foo</outputDirectory>
<includes>
<include>*.jar</include>
</includes>
</fileSet>
<fileSet>
<directory>../subproject1/target/html-output/</directory>
<outputDirectory>/foo</outputDirectory>
<includes>
<include>*.html</include>
<include>*.js</include>
<include>*.css</include>
</includes>
</fileSet>
<fileSet>
<directory>../subproject2/target/</directory>
<outputDirectory>/bar</outputDirectory>
<includes>
<include>**/**</include>
</includes>
<excludes>
<exclude>**/*.exclude-this</exclude>
</excludes>
</fileSet>
</fileSets>
</assembly>
Maven is not really designed to deploy jars to a remote location; its main use is compiling and packaging artifacts. The assembly and dependency plugins are primarily used to gather dependencies and files to package into an artifact.
Having said that, Maven does have a deploy goal, which uses a component called Wagon. This is primarily intended to deploy to a Maven repository. There is a plugin called Cargo that can be used to deploy artifacts to a remote server, but it doesn't explode the jar contents by itself (it relies on the target app server to do all that). You might be able to extend the Maven Wagon functionality yourself.
Also, it is possible to package a custom lifecycle, but that is getting into some pretty low-level Maven mojo (pun intended).