I am trying to compile a simple Scala Spark program with Maven. I use IntelliJ and can run my code successfully from the IDE, but when I try to build a jar with Maven, although everything appears fine, Maven says:
no source to compile
My directories and files are in proper order:
src -> main -> scala -> Main.scala
$ mvn compile
[INFO] Scanning for projects...
[INFO]
[INFO] -----------------< veribilimiokulu:maven-scala-deneme >-----------------
[INFO] Building maven-scala-deneme 1.0-SNAPSHOT
[INFO] --------------------------------[ jar ]---------------------------------
[WARNING] The POM for com.fasterxml.jackson.core:jackson-core:jar:2.6.7 is invalid, transitive dependencies (if any) will not be available, enable debug logging for more details
[WARNING] The POM for com.fasterxml.jackson.core:jackson-databind:jar:2.6.7.1 is invalid, transitive dependencies (if any) will not be available, enable debug logging for more details
[WARNING] The POM for org.apache.commons:commons-crypto:jar:1.0.0 is invalid, transitive dependencies (if any) will not be available, enable debug logging for more details
[WARNING] The POM for com.fasterxml.jackson.core:jackson-databind:jar:2.6.7 is invalid, transitive dependencies (if any) will not be available, enable debug logging for more details
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) # maven-scala-deneme ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) # maven-scala-deneme ---
[INFO] No sources to compile
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 6.941 s
[INFO] Finished at: 2018-08-04T23:00:14+03:00
[INFO] ------------------------------------------------------------------------
My pom.xml is below; I have included only the dependencies element:
<dependencies>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>2.11.8</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.3.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>2.3.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-mllib_2.11</artifactId>
<version>2.3.1</version>
</dependency>
<dependency>
<groupId>org.scalatest</groupId>
<artifactId>scalatest_2.11</artifactId>
<version>3.2.0-SNAP10</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_2.11</artifactId>
<version>2.3.1</version>
</dependency>
</dependencies>
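The log above shows only maven-compiler-plugin running, and that plugin compiles Java sources from src/main/java, so Scala files under src/main/scala are never picked up. Below is a minimal sketch of the build section that is usually needed, using the scala-maven-plugin; the version and layout here are illustrative assumptions, not taken from the original pom:

<build>
  <!-- point Maven at the Scala sources -->
  <sourceDirectory>src/main/scala</sourceDirectory>
  <plugins>
    <plugin>
      <groupId>net.alchim31.maven</groupId>
      <artifactId>scala-maven-plugin</artifactId>
      <version>3.2.2</version>
      <executions>
        <execution>
          <goals>
            <!-- bind Scala compilation into the compile and test-compile phases -->
            <goal>compile</goal>
            <goal>testCompile</goal>
          </goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>

With something like this in place, mvn compile should run scala-maven-plugin:compile and pick up Main.scala, and mvn package should produce a non-empty jar.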
I have a strong feeling that I'm missing something.
I'm trying to compile a Java & Scala project at work. Luckily, I was able to reproduce my problem on a simple dummy project.
I'm using the scala-maven-plugin (https://github.com/davidB/scala-maven-plugin) and basing my example on this question.
The aim is a Scala & Java multi-module build. I have a root project and two modules, m1 and m2.
m1 pom:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>samples.scala-maven-plugin</groupId>
<artifactId>prj_multi_modules</artifactId>
<version>0.0.1-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>m1</artifactId>
<packaging>jar</packaging>
<dependencies>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
</dependency>
<dependency>
<groupId>samples.scala-maven-plugin</groupId>
<artifactId>m2</artifactId>
<version>0.0.1-SNAPSHOT</version>
</dependency>
</dependencies>
</project>
m2 pom:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>samples.scala-maven-plugin</groupId>
<artifactId>prj_multi_modules</artifactId>
<version>0.0.1-SNAPSHOT</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>m2</artifactId>
<packaging>jar</packaging>
<properties>
<cdh.version>cdh5.7.6</cdh.version>
<build.prop.dir>..</build.prop.dir>
<scala.version>2.11.8</scala.version>
<scala.version.major>2.11</scala.version.major>
<spark.version>2.3.0.cloudera4</spark.version>
<hadoop.version>2.6.0-${cdh.version}</hadoop.version>
</properties>
<dependencies>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_${scala.version.major}</artifactId>
<version>${spark.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_${scala.version.major}</artifactId>
<version>${spark.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-mllib_${scala.version.major}</artifactId>
<version>${spark.version}</version>
<scope>provided</scope>
</dependency>
<dependency> <!-- remove this for spark 2.3? -->
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_${scala.version.major}</artifactId>
<version>${spark.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.github.scopt</groupId>
<artifactId>scopt_${scala.version.major}</artifactId>
<version>3.3.0</version>
</dependency>
<dependency>
<groupId>com.databricks</groupId>
<artifactId>spark-csv_${scala.version.major}</artifactId>
<version>1.5.0</version>
</dependency>
<dependency>
<groupId>org.apache.commons</groupId>
<artifactId>commons-csv</artifactId>
<version>1.1</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>${hadoop.version}</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-client</artifactId>
<version>${hadoop.version}</version>
</dependency>
<dependency>
<groupId>com.holdenkarau</groupId>
<artifactId>spark-testing-base_${scala.version.major}</artifactId>
<version>1.6.0_0.7.4</version>
<scope>test</scope>
</dependency>
</dependencies>
</project>
root pom:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<repositories>
<repository>
<id>cloudera</id>
<url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
</repository>
</repositories>
<groupId>samples.scala-maven-plugin</groupId>
<artifactId>prj_multi_modules</artifactId>
<version>0.0.1-SNAPSHOT</version>
<packaging>pom</packaging>
<properties>
<cdh.version>cdh5.7.6</cdh.version>
<build.prop.dir>..</build.prop.dir>
<scala.version>2.11.8</scala.version>
<scala.version.major>2.11</scala.version.major>
<spark.version>2.3.0.cloudera4</spark.version>
<hadoop.version>2.6.0-${cdh.version}</hadoop.version>
</properties>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>${scala.version}</version>
</dependency>
</dependencies>
</dependencyManagement>
<build>
<pluginManagement>
<plugins>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>3.2.2</version>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>testCompile</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</pluginManagement>
<plugins>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
</plugin>
</plugins>
</build>
<modules>
<module>m1</module>
<module>m2</module>
</modules>
</project>
My goal is very simple: I want the m1 module to get all the Spark dependencies from the m2 module, so that Spark code in m1 compiles.
This is the spark code in m1:
package p1
import org.apache.spark.sql.SparkSession
class MyClass(spark: SparkSession) {
import spark.implicits._
def run(input: String) = {
val df = spark.read.parquet(input)
df.show()
}
}
object MyClass {
case class Params(input: String = "")
def main(args: Array[String]): Unit = {
val parser = new scopt.OptionParser[Params](usageHeader) {
opt[String]('i', "input") required() action { (s, params) => params.copy(input = s) }
}
parser.parse(args, Params()) match {
case Some(p) =>
val spark = SparkSession.builder
.appName("spark2 example")
.getOrCreate()
val job = new Spark2Exp(spark)
job.run(p.input)
spark.stop()
case None =>
System.exit(1)
}
}
val usageHeader: String = "n/a"
}
m2 compiles smoothly. While compiling m1, the Java compilation is fine, but the Scala compilation fails because the Spark dependencies are not found, as if they were not carried transitively between the dependent modules.
In addition, if I copy all of m2's Spark dependencies into the m1 pom, everything works.
I can't see what I'm doing wrong. I'm sure it's a Scala/Maven issue, since this scenario is straightforward in Java.
Adding the compilation error output:
"C:\Program Files\Java\jdk1.8.0_151\bin\java" -Dmaven.multiModuleProjectDirectory=C:\dev\scala-maven-plugin "-Dmaven.home=C:\Program Files\JetBrains\IntelliJ IDEA 2017.3\plugins\maven\lib\maven3" "-Dclassworlds.conf=C:\Program Files\JetBrains\IntelliJ IDEA 2017.3\plugins\maven\lib\maven3\bin\m2.conf" "-javaagent:C:\Program Files\JetBrains\IntelliJ IDEA 2017.3\lib\idea_rt.jar=55662:C:\Program Files\JetBrains\IntelliJ IDEA 2017.3\bin" -Dfile.encoding=UTF-8 -classpath "C:\Program Files\JetBrains\IntelliJ IDEA 2017.3\plugins\maven\lib\maven3\boot\plexus-classworlds-2.5.2.jar" org.codehaus.classworlds.Launcher -Didea.version=2017.3 clean install
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO]
[INFO] prj_multi_modules
[INFO] m2
[INFO] m1
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building prj_multi_modules 0.0.1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) # prj_multi_modules ---
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:compile (default) # prj_multi_modules ---
[INFO] No sources to compile
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:testCompile (default) # prj_multi_modules ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) # prj_multi_modules ---
[INFO] Installing C:\dev\scala-maven-plugin\samples\prj_multi_modules\pom.xml to C:\Users\raphael.peretz\.m2\repository\samples\scala-maven-plugin\prj_multi_modules\0.0.1-SNAPSHOT\prj_multi_modules-0.0.1-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building m2 0.0.1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) # m2 ---
[INFO] Deleting C:\dev\scala-maven-plugin\samples\prj_multi_modules\m2\target
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) # m2 ---
[WARNING] Using platform encoding (UTF-8 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] skip non existing resourceDirectory C:\dev\scala-maven-plugin\samples\prj_multi_modules\m2\src\main\resources
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) # m2 ---
[INFO] No sources to compile
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:compile (default) # m2 ---
[WARNING] Expected all dependencies to require Scala version: 2.11.8
[WARNING] samples.scala-maven-plugin:m2:0.0.1-SNAPSHOT requires scala version: 2.11.8
[WARNING] com.twitter:chill_2.11:0.8.0 requires scala version: 2.11.8
[WARNING] org.apache.spark:spark-core_2.11:2.3.0.cloudera4 requires scala version: 2.11.8
[WARNING] org.json4s:json4s-jackson_2.11:3.2.11 requires scala version: 2.11.8
[WARNING] org.json4s:json4s-core_2.11:3.2.11 requires scala version: 2.11.8
[WARNING] org.json4s:json4s-ast_2.11:3.2.11 requires scala version: 2.11.8
[WARNING] org.json4s:json4s-core_2.11:3.2.11 requires scala version: 2.11.0
[WARNING] Multiple versions of scala libraries detected!
[INFO] No sources to compile
[INFO]
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) # m2 ---
[WARNING] Using platform encoding (UTF-8 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] skip non existing resourceDirectory C:\dev\scala-maven-plugin\samples\prj_multi_modules\m2\src\test\resources
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) # m2 ---
[INFO] No sources to compile
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:testCompile (default) # m2 ---
[WARNING] Expected all dependencies to require Scala version: 2.11.8
[WARNING] samples.scala-maven-plugin:m2:0.0.1-SNAPSHOT requires scala version: 2.11.8
[WARNING] com.twitter:chill_2.11:0.8.0 requires scala version: 2.11.8
[WARNING] org.apache.spark:spark-core_2.11:2.3.0.cloudera4 requires scala version: 2.11.8
[WARNING] org.json4s:json4s-jackson_2.11:3.2.11 requires scala version: 2.11.8
[WARNING] org.json4s:json4s-core_2.11:3.2.11 requires scala version: 2.11.8
[WARNING] org.json4s:json4s-ast_2.11:3.2.11 requires scala version: 2.11.8
[WARNING] org.json4s:json4s-core_2.11:3.2.11 requires scala version: 2.11.0
[WARNING] Multiple versions of scala libraries detected!
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.12.4:test (default-test) # m2 ---
[INFO] No tests to run.
[INFO]
[INFO] --- maven-jar-plugin:2.4:jar (default-jar) # m2 ---
[WARNING] JAR will be empty - no content was marked for inclusion!
[INFO] Building jar: C:\dev\scala-maven-plugin\samples\prj_multi_modules\m2\target\m2-0.0.1-SNAPSHOT.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) # m2 ---
[INFO] Installing C:\dev\scala-maven-plugin\samples\prj_multi_modules\m2\target\m2-0.0.1-SNAPSHOT.jar to C:\Users\raphael.peretz\.m2\repository\samples\scala-maven-plugin\m2\0.0.1-SNAPSHOT\m2-0.0.1-SNAPSHOT.jar
[INFO] Installing C:\dev\scala-maven-plugin\samples\prj_multi_modules\m2\pom.xml to C:\Users\raphael.peretz\.m2\repository\samples\scala-maven-plugin\m2\0.0.1-SNAPSHOT\m2-0.0.1-SNAPSHOT.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building m1 0.0.1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) # m1 ---
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) # m1 ---
[WARNING] Using platform encoding (UTF-8 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] skip non existing resourceDirectory C:\dev\scala-maven-plugin\samples\prj_multi_modules\m1\src\main\resources
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) # m1 ---
[INFO] No sources to compile
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:compile (default) # m1 ---
[INFO] C:\dev\scala-maven-plugin\samples\prj_multi_modules\m1\src\main\scala:-1: info: compiling
[INFO] Compiling 1 source files to C:\dev\scala-maven-plugin\samples\prj_multi_modules\m1\target\classes at 1558957820236
[ERROR] C:\dev\scala-maven-plugin\samples\prj_multi_modules\m1\src\main\scala\p1\MyClass.scala:3: error: object spark is not a member of package org.apache
[ERROR] import org.apache.spark.sql.SparkSession
[ERROR] ^
[ERROR] C:\dev\scala-maven-plugin\samples\prj_multi_modules\m1\src\main\scala\p1\MyClass.scala:5: error: not found: type SparkSession
[ERROR] class MyClass(spark: SparkSession) {
[ERROR] ^
[ERROR] C:\dev\scala-maven-plugin\samples\prj_multi_modules\m1\src\main\scala\p1\MyClass.scala:29: error: not found: value SparkSession
[ERROR] val spark = SparkSession.builder
[ERROR] ^
[ERROR] C:\dev\scala-maven-plugin\samples\prj_multi_modules\m1\src\main\scala\p1\MyClass.scala:33: error: not found: type Spark2Exp
[ERROR] val job = new Spark2Exp(spark)
[ERROR] ^
[ERROR] four errors found
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] prj_multi_modules .................................. SUCCESS [ 1.435 s]
[INFO] m2 ................................................. SUCCESS [ 22.342 s]
[INFO] m1 ................................................. FAILURE [ 3.497 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 27.379 s
[INFO] Finished at: 2019-05-27T14:50:23+03:00
[INFO] Final Memory: 52M/927M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (default) on project m1: wrap: org.apache.commons.exec.ExecuteException: Process exited with an error: 1 (Exit value: 1) -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :m1
Process finished with exit code 1
Since you need Spark in both m1 and m2, why not declare the Spark dependencies in the root pom file?
Of course, you can remove the duplicates in m2 afterwards.
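This works because provided-scope dependencies are not carried transitively: m1 only inherits m2's compile-scope dependencies, so the Spark jars never reach its classpath. A minimal sketch of what the root pom could declare, reusing the property names already defined there (only two Spark artifacts shown; the others follow the same pattern):

<dependencies>
  <dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-library</artifactId>
    <version>${scala.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_${scala.version.major}</artifactId>
    <version>${spark.version}</version>
    <scope>provided</scope>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_${scala.version.major}</artifactId>
    <version>${spark.version}</version>
    <scope>provided</scope>
  </dependency>
</dependencies>

Dependencies declared in the parent's <dependencies> section (unlike <dependencyManagement>) are inherited by every child module, so both m1 and m2 would compile against Spark without repeating the entries.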
I am working on a Spark/Scala application in IntelliJ with Maven. I write a file and, after the writing is complete, read it back line by line, incrementing a variable named count depending on a condition. Here is my code:
import java.io.{BufferedWriter, FileWriter}
import scala.io.Source

val writer: BufferedWriter = new BufferedWriter(new FileWriter("test.csv"))
/* Logic
for
writing
the file
*/
writer.close()
var count = 0
val bufferedSource = Source.fromFile("test.csv")
for(line <- bufferedSource.getLines()){
count = count+1
line.split(",").foreach(x => {if(x.toString == "1") count = count+1})
}
bufferedSource.close()
When I run the above code, it gives me the following error:
Exception in thread "main" java.lang.NoSuchMethodError: scala.runtime.IntRef.create(I)Lscala/runtime/IntRef;
It is throwing the error at the variable declaration statement: var count = 0
I searched for this error online and found the reason to be scala version mismatch, as pointed out here: NoSuchMethodError when declaring a variable
Initially, I had my project's Scala SDK set to 2.10.6; that's the default version I use for my projects. I didn't declare any Scala language dependency in my pom.xml. When I run mvn -U package to build the jar, I get the following messages:
[INFO] Scanning for projects...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building project
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) # IPscan ---
[WARNING] Using platform encoding (Cp1252 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] skip non existing resourceDirectory C:\Users\zkh9xcl\workspace\IPscan\src\main\resources
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) # IPscan ---
[WARNING] Expected all dependencies to require Scala version: 2.11.7
[WARNING] com.databricks:spark-csv_2.11:1.4.0 requires scala version: 2.11.7
[WARNING] com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING] Multiple versions of scala libraries detected!
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) # IPscan ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) # IPscan ---
[WARNING] Using platform encoding (Cp1252 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] skip non existing resourceDirectory C:\Users\zkh9xcl\workspace\IPscan\src\test\resources
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:testCompile (scala-test-compile) # IPscan ---
[WARNING] Expected all dependencies to require Scala version: 2.11.7
[WARNING] com.databricks:spark-csv_2.11:1.4.0 requires scala version: 2.11.7
[WARNING] com.twitter:chill_2.10:0.5.0 requires scala version: 2.10.4
[WARNING] Multiple versions of scala libraries detected!
[INFO] No sources to compile
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) # IPscan ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.12.4:test (default-test) # IPscan ---
[INFO] No tests to run.
[INFO]
[INFO] --- maven-jar-plugin:2.4:jar (default-jar) # IPscan ---
[INFO]
[INFO] --- maven-shade-plugin:2.4.1:shade (default) # IPscan ---
[INFO] Including com.databricks:spark-csv_2.11:jar:1.4.0 in the shaded jar.
[INFO] Including org.scala-lang:scala-library:jar:2.11.7 in the shaded jar.
[INFO] Including org.apache.commons:commons-csv:jar:1.1 in the shaded jar.
[INFO] Including com.univocity:univocity-parsers:jar:1.5.1 in the shaded jar.
[INFO] Skipping pom dependency org.jodd:jodd:pom:3.4.0 in the shaded jar.
[INFO] Including org.threeten:threetenbp:jar:1.3.3 in the shaded jar.
[INFO] Attaching shaded artifact.
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 14.054 s
[INFO] Finished at: 2017-10-26T11:18:39-04:00
[INFO] Final Memory: 38M/663M
[INFO] ------------------------------------------------------------------------
It is throwing a warning saying: [WARNING] Multiple versions of scala libraries detected!
Here's my pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>project</groupId>
<artifactId></artifactId>
<version>1.0-SNAPSHOT</version>
<name>IPscan</name>
<repositories>
<repository>
<id>scala-tools.org</id>
<name>Scala-tools Maven2 Repository</name>
<url>http://scala-tools.org/repo-releases</url>
</repository>
</repositories>
<pluginRepositories>
<pluginRepository>
<id>scala-tools.org</id>
<name>Scala-tools Maven2 Repository</name>
<url>http://scala-tools.org/repo-releases</url>
</pluginRepository>
</pluginRepositories>
<build>
<sourceDirectory>src</sourceDirectory>
<plugins>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>3.2.2</version>
<executions>
<execution>
<id>scala-compile-first</id>
<phase>process-resources</phase>
<goals>
<goal>compile</goal>
</goals>
</execution>
<execution>
<id>scala-test-compile</id>
<phase>process-test-resources</phase>
<goals>
<goal>testCompile</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>2.4.1</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<relocations>
<relocation>
<pattern>org.apache.http</pattern>
<shadedPattern>org.shaded.apache.http</shadedPattern>
</relocation>
</relocations>
<filters>
<filter>
<artifact>*:*</artifact>
</filter>
</filters>
<shadedArtifactAttached>true</shadedArtifactAttached>
<shadedClassifierName>shaded</shadedClassifierName>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>com.databricks</groupId>
<artifactId>spark-csv_2.11</artifactId>
<version>1.4.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.jodd/jodd -->
<dependency>
<groupId>org.jodd</groupId>
<artifactId>jodd</artifactId>
<version>3.4.0</version>
<type>pom</type>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core_2.10 -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.10</artifactId>
<version>1.6.0</version>
<scope>provided</scope>
<exclusions>
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
</exclusion>
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>jcl-over-slf4j</artifactId>
</exclusion>
</exclusions>
</dependency>
<!-- https://mvnrepository.com/artifact/org.threeten/threetenbp -->
<dependency>
<groupId>org.threeten</groupId>
<artifactId>threetenbp</artifactId>
<version>1.3.3</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.spark/spark-hive_2.10 -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_2.10</artifactId>
<version>1.6.0</version>
<scope>provided</scope>
<exclusions>
<exclusion>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
</exclusion>
</exclusions>
</dependency>
</dependencies>
</project>
I am new to Maven, so please excuse me if there is a problem with my pom.xml.
I am not sure why multiple versions of the Scala libraries end up in the jar. I changed the Scala SDK version to 2.11.7 and then to 2.10.4 in the project's global libraries in IntelliJ, but it complains that "Library scala-sdk-2.10.4 is not used [Fix]". I am pretty sure I added 2.10.4 to the list of dependencies in IntelliJ. No matter what I do, I still get the multiple Scala versions warning when I build the jar, and the NoSuchMethodError persists. It looks like I am missing something.
What is causing the error? And what is the best way to overcome this and avoid it in the future? Any help would be appreciated. Thank you!
P.S: I am using Spark 1.6 and I can't upgrade to a newer version.
It's a simple case of a dependency mismatch. My guess is that it comes from the spark-csv package by com.databricks, which is built for Scala 2.11 while your Spark artifacts are built for 2.10. Can you change it to the _2.10 artifact and try again?
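For example, swapping the dependency for the Scala 2.10 build (a _2.10 artifact of spark-csv 1.4.0 exists on Maven Central, but verify the exact version against your repository) would look like:

<dependency>
  <groupId>com.databricks</groupId>
  <artifactId>spark-csv_2.10</artifactId>
  <version>1.4.0</version>
</dependency>

That keeps every artifact on the Scala 2.10 line, matching spark-sql_2.10 and spark-hive_2.10, so only one version of scala-library ends up on the classpath and the NoSuchMethodError should go away.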
Let me know if this helped. Cheers.
I am able to run my Cucumber runner file (as JUnit) without any issues; the tests are picked up and run fine.
But when I run through Maven, even though Maven points to the runner file, it does not execute the tests.
Please find my Maven logs and pom.xml below. Can someone tell me what is missing in the pom.xml or in the Eclipse configuration?
[INFO] Scanning for projects...
[INFO]
[INFO] Using the builder org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder with a thread count of 1
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building LinenHousePOC 0.0.1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) # LinenHousePOC ---
[WARNING] Using platform encoding (Cp1252 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] skip non existing resourceDirectory E:\Programming\Cucumber\LinenHousePOC\src\main\resources
[INFO]
[INFO] --- maven-compiler-plugin:2.5.1:compile (default-compile) # LinenHousePOC ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) # LinenHousePOC ---
[WARNING] Using platform encoding (Cp1252 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] Copying 0 resource
[INFO]
[INFO] --- maven-compiler-plugin:2.5.1:testCompile (default-testCompile) # LinenHousePOC ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-surefire-plugin:2.12.4:test (default-test) # LinenHousePOC ---
[INFO] Surefire report directory: E:\Programming\Cucumber\LinenHousePOC\target\surefire-reports
-------------------------------------------------------
T E S T S
-------------------------------------------------------
Running runners.RunnerTest
Configuring TestNG with: org.apache.maven.surefire.testng.conf.TestNG652Configurator#7f7052
Tests run: 0, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.041 sec
Results :
Tests run: 0, Failures: 0, Errors: 0, Skipped: 0
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13.463 s
[INFO] Finished at: 2016-07-03T10:45:17+05:30
[INFO] Final Memory: 8M/19M
[INFO] ------------------------------------------------------------------------
Following is my pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>LinenHousePOC</groupId>
<artifactId>LinenHousePOC</artifactId>
<version>0.0.1-SNAPSHOT</version>
<build>
<plugins>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<version>1.2.1</version>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.11</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>info.cukes</groupId>
<artifactId>cucumber-java</artifactId>
<version>1.2.4</version>
</dependency>
<dependency>
<groupId>info.cukes</groupId>
<artifactId>cucumber-picocontainer</artifactId>
<version>1.2.4</version>
</dependency>
<dependency>
<groupId>info.cukes</groupId>
<artifactId>cucumber-junit</artifactId>
<version>1.2.4</version>
</dependency>
<dependency>
<groupId>info.cukes</groupId>
<artifactId>cucumber-testng</artifactId>
<version>1.1.5</version>
</dependency>
<dependency>
<groupId>org.seleniumhq.selenium</groupId>
<artifactId>selenium-java</artifactId>
<version>2.47.1</version>
</dependency>
<dependency>
<groupId>org.testng</groupId>
<artifactId>testng</artifactId>
<version>6.9.10</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.relevantcodes</groupId>
<artifactId>extentreports</artifactId>
<version>2.41.0</version>
</dependency>
</dependencies>
</project>
Update your POM's build section.
Use the configuration below and run mvn clean verify. If you don't want to run tests in parallel, remove the parallel, perCoreThreadCount and threadCountClasses tags. Make sure the include pattern matches your test naming convention, e.g. **/Run*.java.
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.16</version>
<executions>
<execution>
<id>acceptance-test</id>
<phase>integration-test</phase>
<goals>
<goal>test</goal>
</goals>
<configuration>
<outputEncoding>UTF-8</outputEncoding>
<parallel>classes</parallel>
<perCoreThreadCount>true</perCoreThreadCount>
<threadCountClasses>10</threadCountClasses>
<argLine>-Xmx1024m -XX:MaxPermSize=256m</argLine>
<includes>
<include>**/Run*.java</include>
</includes>
</configuration>
</execution>
</executions>
</plugin>
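Note that the execution above is bound to the integration-test phase, so running mvn test alone will not trigger it; mvn clean verify (or install) runs through integration-test and executes the runner classes matched by the include pattern.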
Executing the tests from the IDE is different from executing them from the command line (Maven).
In the IDE, the configuration and context variables are initialized or picked up automatically when the tests are invoked, whereas with a Maven execution a few parameters need to be passed explicitly.
You can invoke the tests in either of two ways:
Executing tests with Maven (command line)
mvn clean test -Dcucumber.options="--format json-pretty --glue classpath:src/test/resources"
The above command sets the glue path that Maven passes to Cucumber, i.e. where the step definitions are picked up from when the tests are invoked.
This glue information is the same as what is passed to the CucumberOptions in CucumberRunner.java.
Executing tests with Maven (using Maven Profile)
Create a profile and pass the cucumber.options as input parameters as shown below:
<profiles>
<profile>
<id>cucumber-tests</id>
<properties>
<cucumber.options>--glue src/test/resources</cucumber.options>
</properties>
. . .
</profile>
</profiles>
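Assuming the profile id above, you can then activate it explicitly from the command line, for example:
mvn clean test -Pcucumber-tests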
I configured Maven with JUnit, mainly following this tutorial:
http://maven.apache.org/surefire/maven-surefire-plugin/examples/junit.html
I have one test class, and when I execute "mvn test", the class is picked up by Maven but Maven does not actually run it (the result should be FAIL).
C:\Users\User\git\example\example>mvn test
[INFO] Scanning for projects...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building passport 0.0.1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) # passport ---
[WARNING] Using platform encoding (Cp1252 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] Copying 1 resource
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) # passport ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) # passport ---
[WARNING] Using platform encoding (Cp1252 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] Copying 0 resource
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) # passport ---
[INFO] No sources to compile
[INFO]
[INFO] --- maven-surefire-plugin:2.18.1:test (default-test) # passport ---
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1.472 s
[INFO] Finished at: 2015-04-22T12:07:27+02:00
[INFO] Final Memory: 9M/155M
[INFO] ------------------------------------------------------------------------
My Test Class is in src/main/resources (the console shows "Copying 1 resource"):
package main.resources;
import static org.junit.Assert.*;
import org.junit.Test;
public class exampleTest {
@Test
public void test() {
int result = 5;
assertTrue(result == 2);
}
}
My POM.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>example</groupId>
<artifactId>example</artifactId>
<version>0.0.1-SNAPSHOT</version>
<packaging>war</packaging>
<name>passport</name>
<build>
<sourceDirectory>src</sourceDirectory>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
<plugin>
<artifactId>maven-war-plugin</artifactId>
<version>2.4</version>
<configuration>
<warSourceDirectory>WebContent</warSourceDirectory>
<failOnMissingWebXml>false</failOnMissingWebXml>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.18.1</version>
<configuration>
<parallel>methods</parallel>
<threadCount>10</threadCount>
</configuration>
<dependencies>
<dependency>
<groupId>org.apache.maven.surefire</groupId>
<artifactId>surefire-junit47</artifactId>
<version>2.18.1</version>
</dependency>
</dependencies>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
<version>2.1</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.11</version>
<scope>test</scope>
</dependency>
</dependencies>
</project>
Maven version:
Apache Maven 3.3.1 (cab6659f9874fa96462afef40fcf6bc033d58c1c; 2015-03-13T21:10:27+01:00)
Maven home: C:\apache-maven-3.3.1\bin\..
Java version: 1.8.0_40, vendor: Oracle Corporation
Java home: C:\Program Files\Java\jdk1.8.0_40\jre
Default locale: es_ES, platform encoding: Cp1252
OS name: "windows 8.1", version: "6.3", arch: "amd64", family: "dos"
Any idea why the test case is not run?
Another question: I don't know why it shows a warning about the Cp1252 encoding when I have configured my project to use UTF-8.
Your test class must reside in src/test/java and NOT in src/main/resources. Take a look at the default folder layout.
Based on your question, the following should also be made part of your pom:
<project>
...
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
...
</project>
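For example, moving exampleTest.java under src/test/java (with its package declaration adjusted to match the new location) lets Surefire's default includes (such as **/*Test.java) discover the class and run it during mvn test; files under src/main/resources are only copied as resources and are never compiled or executed as tests.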
I am trying to use a Maven project in Eclipse. This is the first time I am using a Maven repository.
I am using Maven 3.2.3.
When I run
mvn clean install -U
it shows these errors
[INFO] Scanning for projects...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building TreetaggerV2 0.0.1-SNAPSHOT
[INFO] ------------------------------------------------------------------------
Downloading: http://hlt-services4.fbk.eu:8080/artifactory/repo/de/tudarmstadt/ukp/dkpro/core/de.tudarmstadt.ukp.dkpro.core.treetagger-bin/20131118.0/de.tudarmstadt.ukp.dkpro.core.treetagger-bin-20131118.0.pom
Downloading: https://repo.maven.apache.org/maven2/de/tudarmstadt/ukp/dkpro/core/de.tudarmstadt.ukp.dkpro.core.treetagger-bin/20131118.0/de.tudarmstadt.ukp.dkpro.core.treetagger-bin-20131118.0.pom
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2.016 s
[INFO] Finished at: 2014-11-11T01:28:05-08:00
[INFO] Final Memory: 12M/333M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project TreetaggerV2: Could not resolve dependencies for project eu.excitementproject:TreetaggerV2:jar:0.0.1-SNAPSHOT: Failed to collect dependencies at de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.treetagger-bin:jar:20131118.0: Failed to read artifact descriptor for de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.treetagger-bin:jar:20131118.0: Could not transfer artifact de.tudarmstadt.ukp.dkpro.core:de.tudarmstadt.ukp.dkpro.core.treetagger-bin:pom:20131118.0 from/to FBK (http://hlt-services4.fbk.eu:8080/artifactory/repo): Not authorized , ReasonPhrase:Unauthorized. -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/DependencyResolutionException
My pom file is:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>eu.excitementproject</groupId>
<artifactId>TreetaggerV2</artifactId>
<version>0.0.1-SNAPSHOT</version>
<dependencies>
<dependency>
<groupId>eu.excitementproject</groupId>
<artifactId>core</artifactId>
<version>1.1.4</version>
</dependency>
<dependency>
<groupId>eu.excitementproject</groupId>
<artifactId>lap</artifactId>
<version>1.1.4</version>
</dependency>
<dependency>
<groupId>com.googlecode.json-simple</groupId>
<artifactId>json-simple</artifactId>
<version>1.1</version>
</dependency>
<!-- TreeTagger related dependencies -->
<dependency>
<groupId>de.tudarmstadt.ukp.dkpro.core</groupId>
<artifactId>de.tudarmstadt.ukp.dkpro.core.treetagger-bin</artifactId>
<version>20131118.0</version>
</dependency>
<dependency>
<groupId>de.tudarmstadt.ukp.dkpro.core</groupId>
<artifactId>de.tudarmstadt.ukp.dkpro.core.treetagger-model-de</artifactId>
<version>20121207.0</version>
</dependency>
<dependency>
<groupId>de.tudarmstadt.ukp.dkpro.core</groupId>
<artifactId>de.tudarmstadt.ukp.dkpro.core.treetagger-model-en</artifactId>
<version>20111109.0</version>
</dependency>
<dependency>
<groupId>de.tudarmstadt.ukp.dkpro.core</groupId>
<artifactId>de.tudarmstadt.ukp.dkpro.core.treetagger-model-it</artifactId>
<version>20101115.0</version>
</dependency>
<!-- end of TreeTagger related dependencies -->
</dependencies>
<repositories>
<repository>
<id>FBK</id>
<url>http://hlt-services4.fbk.eu:8080/artifactory/repo</url>
<snapshots>
<enabled>false</enabled>
</snapshots>
</repository>
</repositories>
</project>
I want to add TreeTagger from DKPro to the pom, but it gives this error. What is going wrong?
I believe you are trying to use Excitement. They have some documentation to help you get TreeTagger set up: https://github.com/hltfbk/Excitement-Open-Platform/wiki/Step-by-Step,-TreeTagger-Installation
Unfortunately, the TreeTagger license does not permit redistribution of the binaries (or models), which is why you have to package the models and binaries yourself for use with Excitement (or DKPro Core, for that matter).
Disclosure: I'm a developer of DKPro Core.
That said, the articles mentioned by MariuszS are in general also great pointers.
Read these articles:
TreeTagger configuration with DKPro and Maven
How to package resources (e.g. models) as Maven Artifacts
There is a problem accessing the file de.tudarmstadt.ukp.dkpro.core.treetagger-bin-20131118.0.pom:
HTTP Status 401 - Download request for repo:path 'tl-repository:de/tudarmstadt/ukp/dkpro/core/de.tudarmstadt.ukp.dkpro.core.treetagger-bin/20131118.0/de.tudarmstadt.ukp.dkpro.core.treetagger-bin-20131118.0.pom' is forbidden for user 'anonymous'.
In general, you need to add a server definition with credentials to settings.xml.
<settings>
<servers>
<server>
<id>FBK</id> <!-- must match the repository id declared in the POM -->
<username>user</username>
<password>password</password>
</server>
</servers>
</settings>
but in your particular case you need to install these jars locally using the Ant scripts (link above):
(..) we offer Ant scripts to automatically download the resources and package them as DKPro-compatible JARs. When the license permits, we upload these to our public Maven repository.
You can try building your project in offline mode (-o) if all artifacts are installed locally.
More: Adding Credentials to Your Maven Settings