Scalatest failed with error "java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z" - scala

We are working on a Spark project that was working fine. Recently we had a requirement to upgrade the Hadoop HDFS version to 3.1.1, so I added an explicit dependency on hadoop-client and excluded the hadoop-client (version 2.6.5) that spark-core pulls in transitively, as shown in my pom file below.
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>3.1.1</version>
</dependency>
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.11</artifactId>
  <version>2.4.0</version>
  <scope>provided</scope>
  <exclusions>
    <exclusion>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
    </exclusion>
  </exclusions>
</dependency>
The Maven build succeeds when tests are skipped, and the application even runs fine on the Spark cluster. But the ScalaTest suites are failing with the error
"ERROR Utils: Aborting task
java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:640)
at org.apache.hadoop.fs.FileUtil.canRead(FileUtil.java:1223)"
We are using the maven-surefire-plugin and the scalatest-maven-plugin in our pom file. We have the Hadoop 2.7.1 binaries copied under a lib directory within our project, and the HADOOP_HOME environment variable in the scalatest plugin configuration points to that path. After making the dependency changes above to get hadoop-client 3.1.1, the Maven build with tests started failing. I tried downloading the Hadoop 3.1.1 binaries (with winutils.exe and hadoop.dll) and updating HADOOP_HOME to point to that path, but I am still getting the same error.
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <version>2.19.1</version>
  <configuration>
    <argLine>-Xms512m -Xmx1024m</argLine>
    <skipTests>true</skipTests>
    <reportsDirectory>${project.build.directory}/scalatest-reports</reportsDirectory>
  </configuration>
</plugin>
<plugin>
  <groupId>org.scalatest</groupId>
  <artifactId>scalatest-maven-plugin</artifactId>
  <version>2.0.0</version>
  <configuration>
    <argLine>-Xms512m -Xmx1024m</argLine>
    <reportsDirectory>${project.build.directory}/scalatest-reports</reportsDirectory>
    <junitxml>jUnitResults</junitxml>
    <parallel>false</parallel>
    <skipTests>false</skipTests>
    <environmentVariables>
      <HADOOP_HOME>${project.basedir}/../lib/hadoop-2.7.1</HADOOP_HOME>
    </environmentVariables>
  </configuration>
  <executions>
    <execution>
      <id>test</id>
      <goals>
        <goal>test</goal>
      </goals>
    </execution>
  </executions>
</plugin>
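One detail worth checking (a sketch, assuming the downloaded 3.1.1 binaries end up under lib/hadoop-3.1.1, a hypothetical path next to the old 2.7.1 copy): NativeIO resolves hadoop.dll through the JVM's java.library.path or the Windows PATH, not through HADOOP_HOME alone, and the DLL build must match the 3.1.1 client jars. An older hadoop.dll found first (for example from the 2.7.1 copy, or one sitting in C:\Windows\System32) typically lacks the access0 method and produces exactly this UnsatisfiedLinkError. The native bin directory can be passed to the forked test JVM explicitly:
<argLine>-Xms512m -Xmx1024m -Djava.library.path=${project.basedir}/../lib/hadoop-3.1.1/bin</argLine>
<environmentVariables>
  <HADOOP_HOME>${project.basedir}/../lib/hadoop-3.1.1</HADOOP_HOME>
</environmentVariables>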
Any pointers to fix this issue, specifically for running the ScalaTests as part of the build, would be greatly appreciated.

Related

Spark Maven dependency incompatibility between delta-core and spark-avro

I'm trying to add delta-core to my Scala Spark project, running Spark 2.4.4.
A weird behaviour I'm seeing is that it seems to conflict with spark-avro. The Maven build succeeds, but at runtime I'm getting errors.
If the delta-core dependency is declared first, I get a runtime error that spark-avro is not installed:
User class threw exception: org.apache.spark.sql.AnalysisException:
Failed to find data source: avro. Avro is built-in but external data
source module since Spark 2.4. Please deploy the application as per
the deployment section of "Apache Avro Data Source Guide".;
<dependencies>
  <dependency>
    <groupId>io.delta</groupId>
    <artifactId>delta-core_2.11</artifactId>
    <version>0.6.1</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-avro_2.11</artifactId>
    <version>2.4.4</version>
  </dependency>
</dependencies>
If spark-avro is declared first, Avro works, but delta gets an exception:
User class threw exception: java.lang.ClassNotFoundException: Failed
to find data source: delta. Please find packages at
http://spark.apache.org/third-party-projects.html
<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-avro_2.11</artifactId>
    <version>2.4.4</version>
  </dependency>
  <dependency>
    <groupId>io.delta</groupId>
    <artifactId>delta-core_2.11</artifactId>
    <version>0.6.1</version>
  </dependency>
</dependencies>
I thought this could be some kind of dependency conflict so I tried:
<exclusions>
  <exclusion>
    <groupId>*</groupId>
    <artifactId>*</artifactId>
  </exclusion>
</exclusions>
on both, but it didn't help.
Got an answer for this on the delta-core issues page. Thanks zsxwing!
The complete solution is based on this previous Stack Overflow answer: merge the service files under META-INF/services so the different Spark data sources won't override each other.
The complete solution - I changed the maven-assembly-plugin to this:
<plugin>
  <artifactId>maven-assembly-plugin</artifactId>
  <version>3.3.0</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <descriptors>
      <descriptor>${project.basedir}\src\assembly\tvm_assembly.xml</descriptor>
    </descriptors>
  </configuration>
</plugin>
And in the new tvm_assembly.xml file (based on the original jar-with-dependencies descriptor, with the containerDescriptorHandlers added for merging):
<assembly xmlns="http://maven.apache.org/ASSEMBLY/2.1.0"
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:schemaLocation="http://maven.apache.org/ASSEMBLY/2.1.0 http://maven.apache.org/xsd/assembly-2.1.0.xsd">
  <!-- TODO: a jarjar format would be better -->
  <id>jar-with-dependencies</id>
  <formats>
    <format>jar</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <dependencySets>
    <dependencySet>
      <outputDirectory>/</outputDirectory>
      <useProjectArtifact>true</useProjectArtifact>
      <unpack>true</unpack>
      <scope>runtime</scope>
    </dependencySet>
  </dependencySets>
  <containerDescriptorHandlers>
    <containerDescriptorHandler>
      <handlerName>metaInf-services</handlerName>
    </containerDescriptorHandler>
    <containerDescriptorHandler>
      <handlerName>metaInf-spring</handlerName>
    </containerDescriptorHandler>
    <containerDescriptorHandler>
      <handlerName>plexus</handlerName>
    </containerDescriptorHandler>
  </containerDescriptorHandlers>
</assembly>
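For context on the root cause: both spark-avro and delta-core register their data sources in META-INF/services/org.apache.spark.sql.sources.DataSourceRegister, and an unpacking assembly keeps only one copy of that file, so whichever dependency is unpacked last wins. As an alternative sketch (not what the answer above used), the maven-shade-plugin can merge those service files with its ServicesResourceTransformer:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <!-- concatenates META-INF/services files instead of overwriting them -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.ServicesResourceTransformer"/>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>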

Liquibase maven plugin with JDK11 failing with ClassNotFoundException: XmlElement

I'm using
Apache Maven 3.3.9
Java version: 11.0.5
And the latest version of liquibase-maven-plugin as follows:
<plugin>
  <groupId>org.liquibase</groupId>
  <artifactId>liquibase-maven-plugin</artifactId>
  <configuration>
    <changeLogFile>src\main\resources\changelog.yaml</changeLogFile>
    <driver>oracle.jdbc.OracleDriver</driver>
    <url>thin_url</url>
    <username>user</username>
    <password>password</password>
  </configuration>
</plugin>
I've added the following dependencies in my pom.xml:
<dependency>
  <groupId>javax.xml.bind</groupId>
  <artifactId>jaxb-api</artifactId>
  <version>2.3.1</version>
</dependency>
<dependency>
  <groupId>com.sun.xml.bind</groupId>
  <artifactId>jaxb-core</artifactId>
  <version>2.3.0.1</version>
</dependency>
<dependency>
  <groupId>com.sun.xml.bind</groupId>
  <artifactId>jaxb-impl</artifactId>
  <version>2.3.1</version>
</dependency>
but each time I execute the plugin with mvn liquibase:update I get a Caused by: java.lang.ClassNotFoundException: javax.xml.bind.annotation.XmlElement exception.
Any clue what I'm doing wrong?
I was adding the dependencies in the wrong place: as 'regular' dependencies in the pom. I didn't know that I had to add them to the plugin itself, as follows:
<plugin>
  <groupId>org.liquibase</groupId>
  <artifactId>liquibase-maven-plugin</artifactId>
  <version>${liquibase.version}</version>
  <configuration>
    <changeLogFile>resources\changelog2.yml</changeLogFile>
    <driver>oracle.jdbc.OracleDriver</driver>
    <url>url</url>
    <username>user</username>
    <password>password</password>
    <verbose>true</verbose>
  </configuration>
  <dependencies>
    <dependency>
      <groupId>jakarta.xml.bind</groupId>
      <artifactId>jakarta.xml.bind-api</artifactId>
      <version>2.3.2</version>
    </dependency>
  </dependencies>
</plugin>
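For what it's worth, a sketch of the same fix using the javax artifacts from the question; the key point is that a Maven plugin runs with its own classpath, so the dependencies must sit inside the plugin's own dependencies block:
<plugin>
  <groupId>org.liquibase</groupId>
  <artifactId>liquibase-maven-plugin</artifactId>
  <dependencies>
    <dependency>
      <groupId>javax.xml.bind</groupId>
      <artifactId>jaxb-api</artifactId>
      <version>2.3.1</version>
    </dependency>
    <dependency>
      <groupId>com.sun.xml.bind</groupId>
      <artifactId>jaxb-impl</artifactId>
      <version>2.3.1</version>
    </dependency>
  </dependencies>
</plugin>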
Java 9 (and above) removed some classes that used to be part of the standard Java runtime: the java.xml.bind module, which contains javax.xml.bind.annotation.XmlElement, was deprecated in Java 9 and removed entirely in Java 11, so the JAXB classes must now be supplied as explicit dependencies.
See this question and answer for more details:
How to resolve java.lang.NoClassDefFoundError: javax/xml/bind/JAXBException in Java 9

Maven build with jenkins for scala spark program : "No primary artifact to install, installing attached artifacts instead"

I have a project with multiple Scala Spark programs. When I run mvn install through Eclipse I get the correct jar generated, which is used with the spark-submit command to run.
After pushing the code to Git we are trying to build it using Jenkins, as we want to automatically push the jar file to our Hadoop cluster using Ansible.
We have a Jenkinsfile with the build goals "compile package install -X".
The logs show:
[DEBUG] (f) artifact = com.esi.rxhome:PROJECT1:jar:0.0.1-SNAPSHOT
[DEBUG] (f) attachedArtifacts = [com.esi.rxhome:PROJECT1:jar:jar-with-dependencies:0.0.1-SNAPSHOT, com.esi.rxhome:PROJECT1:jar:jar-with-dependencies:0.0.1-SNAPSHOT]
[DEBUG] (f) createChecksum = false
[DEBUG] (f) localRepository = id: local
url: file:///home/jenkins/.m2/repository/
layout: default
snapshots: [enabled => true, update => always]
releases: [enabled => true, update => always]
[DEBUG] (f) packaging = jar
[DEBUG] (f) pomFile = /opt/jenkins-data/workspace/ng_datahub-pipeline_develop-JYTJLDEXV65VZWDCZAXG5Y7SHBG2534GFEF3OF2WC4543G6ANZYA/pom.xml
[DEBUG] (s) skip = false
[DEBUG] (f) updateReleaseInfo = false
[DEBUG] -- end configuration --
[INFO] No primary artifact to install, installing attached artifacts instead
I saw the error in a similar post -
Maven: No primary artifact to install, installing attached artifacts instead
But there the answer says to remove auto clean; I am not sure how to stop that while Jenkins is building the jar file.
Below is the pom.xml-
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.esi.rxhome</groupId>
  <artifactId>PROJECT1</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <packaging>jar</packaging>
  <name>${project.artifactId}</name>
  <description>RxHomePreprocessing</description>
  <inceptionYear>2015</inceptionYear>
  <licenses>
    <license>
      <name>My License</name>
      <url>http://....</url>
      <distribution>repo</distribution>
    </license>
  </licenses>
  <properties>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
    <encoding>UTF-8</encoding>
    <scala.version>2.10.6</scala.version>
    <scala.compat.version>2.10</scala.compat.version>
  </properties>
  <dependencies>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>${scala.version}</version>
    </dependency>
    <!-- Test -->
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.11</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.specs2</groupId>
      <artifactId>specs2-core_${scala.compat.version}</artifactId>
      <version>2.4.16</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.scalatest</groupId>
      <artifactId>scalatest_${scala.compat.version}</artifactId>
      <version>2.2.4</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.hive</groupId>
      <artifactId>hive-jdbc</artifactId>
      <version>1.2.1000.2.6.0.3-8</version>
    </dependency>
    <!-- <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>2.1.0</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.10</artifactId>
      <version>2.1.0</version>
    </dependency> -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.6.3</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-sql_2.10</artifactId>
      <version>1.6.3</version>
    </dependency>
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-hive_2.10</artifactId>
      <version>1.6.3</version>
    </dependency>
    <dependency>
      <groupId>com.databricks</groupId>
      <artifactId>spark-csv_2.10</artifactId>
      <version>1.5.0</version>
    </dependency>
  </dependencies>
  <build>
    <sourceDirectory>src/main/scala</sourceDirectory>
    <testSourceDirectory>src/test/scala</testSourceDirectory>
    <plugins>
      <plugin>
        <!-- see http://davidb.github.com/scala-maven-plugin -->
        <groupId>net.alchim31.maven</groupId>
        <artifactId>scala-maven-plugin</artifactId>
        <version>3.2.0</version>
        <executions>
          <execution>
            <goals>
              <goal>compile</goal>
              <goal>testCompile</goal>
            </goals>
            <configuration>
              <args>
                <arg>-make:transitive</arg>
                <arg>-dependencyfile</arg>
                <arg>${project.build.directory}/.scala_dependencies</arg>
              </args>
            </configuration>
          </execution>
        </executions>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <version>2.18.1</version>
        <configuration>
          <useFile>false</useFile>
          <disableXmlReport>true</disableXmlReport>
          <!-- If you have classpath issue like NoDefClassError,... -->
          <!-- useManifestOnlyJar>false</useManifestOnlyJar -->
          <includes>
            <include>**/*Test.*</include>
            <include>**/*Suite.*</include>
          </includes>
        </configuration>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-jar-plugin</artifactId>
        <version>2.4</version>
        <configuration>
          <skipIfEmpty>true</skipIfEmpty>
        </configuration>
        <executions>
          <execution>
            <goals>
              <goal>jar</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-assembly-plugin</artifactId>
        <version>3.0.0</version>
        <configuration>
          <descriptorRefs>
            <descriptorRef>jar-with-dependencies</descriptorRef>
          </descriptorRefs>
          <archive>
            <manifest>
              <mainClass>com.esi.spark.storedprocedure.Test_jdbc_nospark</mainClass>
            </manifest>
          </archive>
        </configuration>
        <executions>
          <execution>
            <id>make-assembly</id>
            <phase>package</phase>
            <goals>
              <goal>single</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
      <plugin>
        <artifactId>maven-clean-plugin</artifactId>
        <version>3.0.0</version>
        <configuration>
          <skip>true</skip>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>
I tried specifying
1 - "jar" for packaging in pom.xml.
2 - changing the maven goals to
"install"
"clean install"
"compile package install"
But the above attempts did not help get rid of the message, and the jar created was of no use.
When I try to execute the spark-submit command-
spark-submit --driver-java-options -Djava.io.tmpdir=/home/EH2524/tmp --conf spark.local.dir=/home/EH2524/tmp --driver-memory 2G --executor-memory 2G --total-executor-cores 1 --num-executors 10 --executor-cores 10 --class com.esi.spark.storedprocedure.Test_jdbc_nospark --master yarn /home/EH2524/PROJECT1-0.0.1-20171124.213717-1-jar-with-dependencies.jar
Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
java.lang.ClassNotFoundException: com.esi.spark.storedprocedure.Test_jdbc_nospark
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.util.Utils$.classForName(Utils.scala:175)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:703)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Here, Test_jdbc_nospark is a Scala object.
I'm not sure, but your maven-jar-plugin configuration looks suspicious. Normally, the execution would specify a phase, as in
<execution>
  <phase>package</phase>
  <goals>
    <goal>jar</goal>
  </goals>
</execution>
(from this example). Perhaps omitting that is causing your default jar not to be built? Certainly the error message sounds like your default jar is not being built, but you didn't actually say so.
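Applied to the plugin from the question, that suggestion would look roughly like this (a sketch; the skipIfEmpty flag is also dropped here, which the answer below identifies as the actual culprit):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-jar-plugin</artifactId>
  <version>2.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>jar</goal>
      </goals>
    </execution>
  </executions>
</plugin>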
This message was because of the maven-jar-plugin having <skipIfEmpty> set to true. Once I removed this, the build no longer gives the message "No primary artifact to install, installing attached artifacts instead".
The empty jar was getting created because of an incorrect path in the pom.xml.
Initially -
<build>
  <sourceDirectory>src/main/scala</sourceDirectory>
Since Jenkins was building from the code in Git and the pom was inside the project folder, the source directory had to include that folder:
<build>
  <sourceDirectory>folder_name_in_git/src/main/scala</sourceDirectory>

How to compile a widgetSet in vaadin?

I am trying to use the visualizationsForVaadin add-on. The problem is that I have to compile a custom widgetset. I've been dealing with this the whole day and still cannot compile it. First of all, here is my configuration:
Vaadin 6.7.1
gwt 2.3.0
The following dependencies are in my pom file:
gwt-ajaxloader 1.1.0
validation-api 1.0.0.GA
gwt-visualization 1.0.2
gwt-user 2.3.0
visualizationsforvaadin 1.1.2.
When I try to compile the widgetset with the gwt-maven-plugin I get an exception:
Loading inherited module 'com.google.gwt.core.XSLinker'
[ERROR] Line 22: Unexpected element 'when-linker-added'
[ERROR] Failure while parsing XML
An interesting thing is that the gwt-dev library, which is loaded automatically (as far as I know), is version 2.0.3.
I have tried everything possible (even impossible) and still nothing. At some point I had other exceptions complaining that the imports of the validation classes could not be resolved; I think that has been resolved by some other dependencies. Please help. Thank you.
POM configuration:
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>gwt-maven-plugin</artifactId>
  <!-- Version 2.1.0-1 works at least with Vaadin 6.5 -->
  <version>2.3.0</version>
  <configuration>
    <!-- if you don't specify any modules, the plugin will find them -->
    <!--modules>
      ..
    </modules-->
    <webappDirectory>${project.build.directory}/${project.build.finalName}/VAADIN/widgetsets</webappDirectory>
    <extraJvmArgs>-Xmx512M -Xss1024k</extraJvmArgs>
    <runTarget>clean</runTarget>
    <hostedWebapp>${project.build.directory}/${project.build.finalName}</hostedWebapp>
    <noServer>true</noServer>
    <port>8080</port>
    <soyc>false</soyc>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>resources</goal>
        <goal>compile</goal>
      </goals>
    </execution>
  </executions>
  <dependencies>
    <dependency>
      <groupId>com.google.gwt</groupId>
      <artifactId>gwt-dev</artifactId>
      <version>2.3.0</version>
    </dependency>
    <dependency>
      <groupId>com.google.gwt</groupId>
      <artifactId>gwt-user</artifactId>
      <version>${gwt.version}</version>
    </dependency>
  </dependencies>
</plugin>
And here are the GWT dependencies
<dependency>
  <groupId>com.google.gwt.google-apis</groupId>
  <artifactId>gwt-visualization</artifactId>
  <version>1.0.2</version>
</dependency>
<dependency>
  <groupId>javax.validation</groupId>
  <artifactId>validation-api</artifactId>
  <version>1.0.0.GA</version>
</dependency>
<dependency>
  <groupId>com.google.gwt.google-apis</groupId>
  <artifactId>gwt-ajaxloader</artifactId>
  <version>1.1.0</version>
</dependency>
<dependency>
  <groupId>org.vaadin.addons</groupId>
  <artifactId>visualizationsforvaadin</artifactId>
  <version>1.1.2</version>
</dependency>
<dependency>
  <groupId>javax</groupId>
  <artifactId>javaee-web-api</artifactId>
  <version>6.0</version>
  <scope>provided</scope>
</dependency>
<dependency>
  <groupId>com.vaadin</groupId>
  <artifactId>vaadin</artifactId>
  <version>${vaadin.version}</version>
</dependency>
<dependency>
  <groupId>com.google.gwt</groupId>
  <artifactId>gwt-user</artifactId>
  <version>${gwt.version}</version>
</dependency>
I struggled with this over the past 2 days.
For future reference:
What you'll need:
<dependency>
  <groupId>com.vaadin</groupId>
  <artifactId>vaadin-client-compiler</artifactId>
  <version>${versions.vaadin}</version>
</dependency>
Configure gwt-maven-plugin as follows (you may not need all configs that I used, of course):
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>gwt-maven-plugin</artifactId>
  <version>${versions.gwt}</version>
  <configuration>
    <extraJvmArgs>-Xmx512M -Xss1024k</extraJvmArgs>
    <webappDirectory>${basedir}/src/main/webapp/VAADIN/widgetsets</webappDirectory>
    <hostedWebapp>${basedir}/src/main/webapp/VAADIN/widgetsets</hostedWebapp>
    <noServer>true</noServer>
    <draftCompile>true</draftCompile>
    <compileReport>false</compileReport>
    <style>DETAILED</style>
    <runTarget>http://localhost:8080/</runTarget>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>resources</goal>
        <goal>compile</goal>
      </goals>
    </execution>
  </executions>
</plugin>
The vaadin-maven-plugin:
<plugin>
  <groupId>com.vaadin</groupId>
  <artifactId>vaadin-maven-plugin</artifactId>
  <version>${versions.vaadin}</version>
  <executions>
    <execution>
      <configuration>
        <!--<modules>
          <module>org.vaadin.aceeditor.AceEditorWidgetSet</module>
        </modules>-->
      </configuration>
      <goals>
        <goal>update-widgetset</goal>
      </goals>
    </execution>
  </executions>
</plugin>
You may also want to configure your maven-clean-plugin:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-clean-plugin</artifactId>
  <version>2.5</version>
  <configuration>
    <filesets>
      <fileset>
        <directory>${basedir}/src/main/webapp/VAADIN/widgetsets</directory>
      </fileset>
    </filesets>
  </configuration>
</plugin>
In web.xml for your servlet (Ace Editor as an example):
<init-param>
  <param-name>widgetset</param-name>
  <param-value>org.vaadin.aceeditor.AceEditorWidgetSet</param-value>
</init-param>
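In context, that init-param sits inside the servlet definition, roughly like this (a sketch for a Vaadin 6 application servlet; the servlet name and application class here are placeholders):
<servlet>
  <servlet-name>MyApplicationServlet</servlet-name>
  <servlet-class>com.vaadin.terminal.gwt.server.ApplicationServlet</servlet-class>
  <init-param>
    <param-name>application</param-name>
    <param-value>com.example.MyApplication</param-value>
  </init-param>
  <init-param>
    <param-name>widgetset</param-name>
    <param-value>org.vaadin.aceeditor.AceEditorWidgetSet</param-value>
  </init-param>
</servlet>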
Note that I did NOT include gwt as a dependency in my pom.xml.
I have a fully functional Vaadin project with custom widgets in Eclipse and I can't find any references to gwt-dev anywhere in the project. Go to Project properties -> Java Build Path -> Libraries and delete all references to gwt-dev. Also remove it from your pom.xml and try to recompile the widgetset.
edit: It could also be your gwt-user dependency. Try setting the version of gwt-user to 2.3.0 (if it isn't already) and set the scope to provided.
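That suggestion as a pom snippet (a sketch):
<dependency>
  <groupId>com.google.gwt</groupId>
  <artifactId>gwt-user</artifactId>
  <version>2.3.0</version>
  <scope>provided</scope>
</dependency>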

Simply GWT 2.3 and Maven2(3) project in Eclipse Indigo

When I try to create a Maven project with these parameters:
Archetype Group Id - org.codehaus.mojo;
Archetype Artifact Id - gwt-maven-plugin;
Archetype Version - 2.3.0-1.
I get some strange errors:
Plugin execution not covered by lifecycle configuration: org.codehaus.mojo:gwt-maven-plugin:2.3.0-1:generateAsync (execution: default, phase: generate-sources)
Plugin execution not covered by lifecycle configuration: org.codehaus.mojo:gwt-maven-plugin:2.3.0-1:i18n (execution: default, phase: generate-sources)
Plugin execution not covered by lifecycle configuration: org.apache.maven.plugins:maven-war-plugin:2.1.1:exploded (execution: default, phase: compile)
And some warnings such as:
Implementation of project facet jst.web could not be found. Functionality will be limited.
Implementation of project facet wst.jsdt.web could not be found. Functionality will be limited.
This is my pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project
    xmlns="http://maven.apache.org/POM/4.0.0"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <!-- POM file generated with GWT webAppCreator -->
  <modelVersion>4.0.0</modelVersion>
  <groupId>net.test1</groupId>
  <artifactId>TestWebApp</artifactId>
  <packaging>war</packaging>
  <version>0.0.1-SNAPSHOT</version>
  <name>GWT Maven Archetype</name>
  <properties>
    <!-- Convenience property to set the GWT version -->
    <gwtVersion>2.3.0</gwtVersion>
    <!-- GWT needs at least java 1.5 -->
    <webappDirectory>${project.build.directory}/${project.build.finalName}</webappDirectory>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  </properties>
  <dependencies>
    <dependency>
      <groupId>com.google.gwt</groupId>
      <artifactId>gwt-servlet</artifactId>
      <version>${gwtVersion}</version>
      <scope>runtime</scope>
    </dependency>
    <dependency>
      <groupId>com.google.gwt</groupId>
      <artifactId>gwt-user</artifactId>
      <version>${gwtVersion}</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>4.7</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>javax.validation</groupId>
      <artifactId>validation-api</artifactId>
      <version>1.0.0.GA</version>
      <scope>test</scope>
    </dependency>
    <dependency>
      <groupId>javax.validation</groupId>
      <artifactId>validation-api</artifactId>
      <version>1.0.0.GA</version>
      <classifier>sources</classifier>
      <scope>test</scope>
    </dependency>
  </dependencies>
  <build>
    <!-- Generate compiled stuff in the folder used for developing mode -->
    <outputDirectory>${webappDirectory}/WEB-INF/classes</outputDirectory>
    <plugins>
      <!-- GWT Maven Plugin -->
      <plugin>
        <groupId>org.codehaus.mojo</groupId>
        <artifactId>gwt-maven-plugin</artifactId>
        <version>2.3.0-1</version>
        <executions>
          <execution>
            <goals>
              <goal>compile</goal>
              <goal>test</goal>
              <goal>i18n</goal>
              <goal>generateAsync</goal>
            </goals>
          </execution>
        </executions>
        <!-- Plugin configuration. There are many available options, see
             gwt-maven-plugin documentation at codehaus.org -->
        <configuration>
          <runTarget>TestWebApp.html</runTarget>
          <hostedWebapp>${webappDirectory}</hostedWebapp>
          <i18nMessagesBundle>net.test1.TestWebApp.client.Messages</i18nMessagesBundle>
        </configuration>
      </plugin>
      <!-- Copy static web files before executing gwt:run -->
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-war-plugin</artifactId>
        <version>2.1.1</version>
        <executions>
          <execution>
            <phase>compile</phase>
            <goals>
              <goal>exploded</goal>
            </goals>
          </execution>
        </executions>
        <configuration>
          <webappDirectory>${webappDirectory}</webappDirectory>
        </configuration>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>2.3.2</version>
        <configuration>
          <source>1.5</source>
          <target>1.5</target>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>
And so on. What is this? I have tried all the guides I could find on the Internet, and everywhere it is the same. I also tried creating the project manually, without Eclipse, with the same result. I think the problem is that the guides on the Internet were written for older versions of Eclipse, Maven, and GWT. How can I get past this? How can I just create a simple project with GWT 2.3, the Maven2 plugin, and Eclipse Indigo without errors and warnings?
This is known behavior, discussed on the eclipse wiki. See here: http://wiki.eclipse.org/M2E_plugin_execution_not_covered.
Don't just comment out the problematic sections of your pom; you really do need them. For instance, the generated comment for that maven-war-plugin in the pom is "Copy static web files before executing gwt:run". This turns out to be true: if you comment out that plugin and run "mvn clean gwt:run", static files will not be copied to the target folder and will be unavailable to hosted mode.
Fortunately the workaround is easy. If you open up the pom in Eclipse, look in the Overview section, and click the error message at the top, it will give you some quick-fix options, such as "Permanently mark goal exploded in pom.xml as ignored." This will add some m2e configuration to your pom so it is no longer flagged as an error, and everything will work as before. The newly generated section in your pom is what is described in the link above as the "ignore" option.
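For reference, the quick fix generates something along these lines (a sketch of the m2e "ignore" mapping for the exploded goal; Eclipse produces one entry per flagged plugin execution):
<pluginManagement>
  <plugins>
    <!-- This plugin only affects m2e inside Eclipse; command-line Maven ignores it -->
    <plugin>
      <groupId>org.eclipse.m2e</groupId>
      <artifactId>lifecycle-mapping</artifactId>
      <version>1.0.0</version>
      <configuration>
        <lifecycleMappingMetadata>
          <pluginExecutions>
            <pluginExecution>
              <pluginExecutionFilter>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-war-plugin</artifactId>
                <versionRange>[2.1.1,)</versionRange>
                <goals>
                  <goal>exploded</goal>
                </goals>
              </pluginExecutionFilter>
              <action>
                <ignore />
              </action>
            </pluginExecution>
          </pluginExecutions>
        </lifecycleMappingMetadata>
      </configuration>
    </plugin>
  </plugins>
</pluginManagement>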
Hope this helps.