Compilation error during Gatling load test - Scala

I'm trying to write a Gatling simulation and I want to be able to run it.
I get an error while trying to run mvn gatling:execute.
My pom has the following dependencies:
<dependency>
<groupId>io.gatling</groupId>
<artifactId>gatling-charts</artifactId>
<version>2.2.5</version>
</dependency>
<dependency>
<groupId>io.gatling</groupId>
<artifactId>gatling-core</artifactId>
<version>2.2.5</version>
</dependency>
and the following plugins:
<build>
<plugins>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>3.3.1</version>
<configuration>
<scalaVersion>2.12.3</scalaVersion>
</configuration>
</plugin>
<plugin>
<groupId>io.gatling</groupId>
<artifactId>gatling-maven-plugin</artifactId>
<version>2.2.4</version>
<executions>
<execution>
<id>performanceTests</id>
<goals>
<goal>execute</goal>
</goals>
<configuration>
<simulationClass>simulations.SimulationClass</simulationClass>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
This is the error that I came across:
12:50:36.313 [main][WARN ][ZincCompiler.scala:141] i.g.c.ZincCompiler$ -
Pruning sources from previous analysis, due to incompatible CompileSetup.
java.lang.ClassNotFoundException: io.gatling.app.Gatling
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at io.gatling.mojo.MainWithArgsInFile.runMain(MainWithArgsInFile.java:42)
at io.gatling.mojo.MainWithArgsInFile.main(MainWithArgsInFile.java:33)
I then included this dependency:
<dependency>
<groupId>io.gatling</groupId>
<artifactId>gatling-app</artifactId>
<version>2.2.5</version>
</dependency>
But now I get a new error:
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at io.gatling.mojo.MainWithArgsInFile.runMain(MainWithArgsInFile.java:50)
at io.gatling.mojo.MainWithArgsInFile.main(MainWithArgsInFile.java:33)
Caused by: java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)V
at com.github.mnogu.gatling.mqtt.protocol.MqttProtocolOptionPart.<init>(MqttProtocol.scala:124)
at com.github.mnogu.gatling.mqtt.protocol.MqttProtocol$.apply(MqttProtocol.scala:24)
at com.github.mnogu.gatling.mqtt.protocol.MqttProtocolBuilder$.apply(MqttProtocolBuilder.scala:13)
at com.github.mnogu.gatling.mqtt.Predef$.mqtt(Predef.scala:9)
at simulations.SimulationClass.<init>(SimulationClass.scala:10)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at java.lang.Class.newInstance(Class.java:442)
at io.gatling.app.Runner.run0(Runner.scala:79)
at io.gatling.app.Runner.run(Runner.scala:64)
at io.gatling.app.Gatling$.start(Gatling.scala:59)
at io.gatling.app.Gatling$.fromArgs(Gatling.scala:43)
at io.gatling.app.Gatling$.main(Gatling.scala:35)
at io.gatling.app.Gatling.main(Gatling.scala)
...
Any leads regarding this problem would be of great help.
Thank you

Your error says that you are missing the following dependency:
<!-- https://mvnrepository.com/artifact/io.gatling/gatling-app -->
<dependency>
<groupId>io.gatling</groupId>
<artifactId>gatling-app</artifactId>
<version>2.2.5</version>
</dependency>
Update:
Your second error is mainly because the jar that your code refers to at runtime is not the same jar that was used at compile time, so make sure you compile and run against the same jars.
To do that, you can delete the dependent jar files, run a Maven clean on your project, and then recompile and run the program.
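As a small illustrative sketch (the gatling.version property name is my own choice, not something from the pom above), one way to keep the compile-time and runtime Gatling artifacts on the same version is to drive them all from a single property:

<properties>
    <!-- single place to bump the Gatling version so all artifacts stay in sync -->
    <gatling.version>2.2.5</gatling.version>
</properties>

<dependencies>
    <dependency>
        <groupId>io.gatling</groupId>
        <artifactId>gatling-app</artifactId>
        <version>${gatling.version}</version>
    </dependency>
    <dependency>
        <groupId>io.gatling</groupId>
        <artifactId>gatling-core</artifactId>
        <version>${gatling.version}</version>
    </dependency>
    <dependency>
        <groupId>io.gatling</groupId>
        <artifactId>gatling-charts</artifactId>
        <version>${gatling.version}</version>
    </dependency>
</dependencies>

With that in place, a clean rebuild (mvn clean followed by the usual goals) compiles and runs against one consistent set of Gatling jars.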

It looks like the external jar I was using (the Gatling MQTT extension) was compatible only with gatling-core 2.2.3. Reverting the Gatling version to 2.2.3 and the Scala version to 2.11.8 solved the problem.
Here is how my pom now looks:
<dependency>
<groupId>io.gatling.highcharts</groupId>
<artifactId>gatling-charts-highcharts</artifactId>
<version>2.2.3</version>
</dependency>
<dependency>
<groupId>io.gatling</groupId>
<artifactId>gatling-core</artifactId>
<version>2.2.3</version>
</dependency>
<dependency>
<groupId>io.gatling</groupId>
<artifactId>gatling-app</artifactId>
<version>2.2.3</version>
</dependency>
<plugins>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>3.3.1</version>
<configuration>
<scalaVersion>2.11.8</scalaVersion>
</configuration>
</plugin>
<plugin>
<groupId>io.gatling</groupId>
<artifactId>gatling-maven-plugin</artifactId>
<version>2.2.3</version>
<executions>
<execution>
<id>performanceTests</id>
<goals>
<goal>execute</goal>
</goals>
<configuration>
<simulationClass>simulations.SimulationClass</simulationClass>
</configuration>
</execution>
</executions>
</plugin>
</plugins>


error: java.lang.NoSuchMethodError: 'scala.tools.nsc.reporters.Reporter scala.tools.nsc.Global.reporter()' in scoverage-maven-plugin

I am trying to generate reports with code coverage information for my project. I found that scoverage-maven-plugin produces XML files reporting the percentage of each module that the unit tests have covered so far.
So I tried to add that plugin to my pom, but I am getting the following error:
[ERROR] error: java.lang.NoSuchMethodError: 'scala.tools.nsc.reporters.Reporter scala.tools.nsc.Global.reporter()'
[INFO] at scoverage.ScoverageInstrumentationComponent$$anon$1.run(plugin.scala:115)
[INFO] at scala.tools.nsc.Global$Run.compileUnitsInternal(Global.scala:1514)
[INFO] at scala.tools.nsc.Global$Run.compileUnits(Global.scala:1498)
[INFO] at scala.tools.nsc.Global$Run.compileSources(Global.scala:1491)
[INFO] at scala.tools.nsc.Global$Run.compile(Global.scala:1620)
[INFO] at scala.tools.nsc.Driver.doCompile(Driver.scala:47)
[INFO] at scala.tools.nsc.MainClass.doCompile(Main.scala:32)
[INFO] at scala.tools.nsc.Driver.process(Driver.scala:67)
[INFO] at scala.tools.nsc.Driver.main(Driver.scala:80)
[INFO] at scala.tools.nsc.Main.main(Main.scala)
[INFO] at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:104)
[INFO] at java.base/java.lang.reflect.Method.invoke(Method.java:577)
[INFO] at scala_maven_executions.MainHelper.runMain(MainHelper.java:164)
[INFO] at scala_maven_executions.MainWithArgsInFile.main(MainWithArgsInFile.java:26)
[INFO] java.lang.reflect.InvocationTargetException
[INFO] at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:119)
[INFO] at java.base/java.lang.reflect.Method.invoke(Method.java:577)
[INFO] at scala_maven_executions.MainHelper.runMain(MainHelper.java:164)
[INFO] at scala_maven_executions.MainWithArgsInFile.main(MainWithArgsInFile.java:26)
[ERROR] Caused by: java.lang.NoSuchMethodError: 'scala.tools.nsc.reporters.Reporter scala.tools.nsc.Global.reporter()'
[INFO] at scoverage.ScoverageInstrumentationComponent$$anon$1.run(plugin.scala:115)
[INFO] at scala.tools.nsc.Global$Run.compileUnitsInternal(Global.scala:1514)
[INFO] at scala.tools.nsc.Global$Run.compileUnits(Global.scala:1498)
[INFO] at scala.tools.nsc.Global$Run.compileSources(Global.scala:1491)
[INFO] at scala.tools.nsc.Global$Run.compile(Global.scala:1620)
[INFO] at scala.tools.nsc.Driver.doCompile(Driver.scala:47)
[INFO] at scala.tools.nsc.MainClass.doCompile(Main.scala:32)
[INFO] at scala.tools.nsc.Driver.process(Driver.scala:67)
[INFO] at scala.tools.nsc.Driver.main(Driver.scala:80)
[INFO] at scala.tools.nsc.Main.main(Main.scala)
[INFO] at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:104)
[INFO] ... 3 more
I am not able to figure out where the method scala.tools.nsc.Global.reporter() returning scala.tools.nsc.reporters.Reporter should come from.
The full pom is:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>mygroupid</groupId>
<artifactId>myartifact</artifactId>
<version>1.0.0</version>
<name>myproject</name>
<properties>
<app.build.version>1.0.0</app.build.version>
<encoding>UTF-8</encoding>
<scala.version>2.12.14</scala.version>
<scala.compat.version>2.12</scala.compat.version>
<scala.binary.version>2.12</scala.binary.version>
<spark.version>3.2.1</spark.version>
<scope.value>provided</scope.value>
<scoverage.plugin.version>1.3.0</scoverage.plugin.version>
</properties>
<dependencies>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_${scala.binary.version}</artifactId>
<version>${spark.version}</version>
<scope>${scope.value}</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_${scala.binary.version}</artifactId>
<version>${spark.version}</version>
<scope>${scope.value}</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_${scala.binary.version}</artifactId>
<version>${spark.version}</version>
<scope>${scope.value}</scope>
</dependency>
<dependency>
<groupId>io.delta</groupId>
<artifactId>delta-core_2.12</artifactId>
<version>2.0.0</version>
<scope>${scope.value}</scope>
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-streams</artifactId>
<version>3.1.2</version>
<scope>${scope.value}</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql-kafka-0-10_${scala.binary.version}</artifactId>
<version>${spark.version}</version>
<scope>${scope.value}</scope>
</dependency>
<dependency>
<groupId>org.scalatest</groupId>
<artifactId>scalatest_2.12</artifactId>
<version>3.2.14</version>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<finalName>myproject-${app.build.version}</finalName>
<sourceDirectory>src/main/scala</sourceDirectory>
<testSourceDirectory>src/test/scala</testSourceDirectory>
<plugins>
<plugin>
<groupId>org.scoverage</groupId>
<artifactId>scoverage-maven-plugin</artifactId>
<version>${scoverage.plugin.version}</version>
<configuration>
<scalaVersion>${scala.version}</scalaVersion>
<highlighting>true</highlighting>
<aggregate>true</aggregate>
</configuration>
<executions>
<execution>
<goals>
<goal>report</goal>
</goals>
<phase>test</phase>
</execution>
</executions>
</plugin>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>3.2.2</version>
<executions>
<execution>
<id>Scaladoc</id>
<goals>
<goal>doc</goal>
</goals>
<phase>prepare-package</phase>
<configuration>
<args>
<arg>-no-link-warnings</arg>
</args>
</configuration>
</execution>
<execution>
<id>attach-javadocs</id>
<goals>
<goal>doc-jar</goal>
</goals>
</execution>
<execution>
<goals>
<goal>compile</goal>
<goal>testCompile</goal>
</goals>
<configuration>
<args>
<arg>-dependencyfile</arg>
<arg>${project.build.directory}/.scala_dependencies</arg>
</args>
<jvmArgs>
<jvmArg>-Xss4m</jvmArg>
<jvmArg>-Xms512m</jvmArg>
<jvmArg>-Xmx4096m</jvmArg>
</jvmArgs>
<scalaVersion>${scala.version}</scalaVersion>
</configuration>
</execution>
</executions>
<configuration>
<scalaVersion>${scala.version}</scalaVersion>
</configuration>
</plugin>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<version>2.4</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
<configuration>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
<appendAssemblyId>false</appendAssemblyId>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>8</source>
<target>8</target>
</configuration>
</plugin>
<plugin>
<groupId>org.scalatest</groupId>
<artifactId>scalatest-maven-plugin</artifactId>
<version>1.0</version>
<configuration>
<reportsDirectory>${project.build.directory}/surefire-reports</reportsDirectory>
<junitxml>.</junitxml>
<filereports>WDF TestSuite.txt</filereports>
<htmlreporters>${project.build.directory}/html/scalatest</htmlreporters>
</configuration>
<executions>
<execution>
<goals>
<goal>test</goal>
</goals>
<phase>test</phase>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
Edits after @Dmytro's feedback:
I added the following dependencies to the pom:
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-compiler</artifactId>
<version>${scala.version}</version>
</dependency>
<dependency>
<groupId>org.scoverage</groupId>
<artifactId>scalac-scoverage-plugin_2.12</artifactId>
<version>1.4.1</version>
</dependency>
And changed the scope to
<scope.value>compile</scope.value>
But I am still getting the error: java.lang.NoSuchMethodError: 'scala.tools.nsc.reporters.Reporter scala.tools.nsc.Global.reporter()'
Could you try to add
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-compiler</artifactId>
<!--<version>2.12.14</version>-->
<version>${scala.version}</version>
</dependency>
?
https://mvnrepository.com/artifact/org.scala-lang/scala-compiler
Does this change anything for you?
scoverage-maven-plugin 1.3.0 depends on scalac-scoverage-plugin 1.3.0
https://github.com/scoverage/scoverage-maven-plugin/blob/scoverage-maven-plugin-1.3.0/pom.xml#L93-L101
scalac-scoverage-plugin 1.3.0 depends on scala-compiler 2.12.0
https://index.scala-lang.org/scoverage/scalac-scoverage-plugin/artifacts/scalac-scoverage-plugin/1.3.0
https://github.com/scoverage/scalac-scoverage-plugin/blob/v1.3.0/build.sbt#L89
https://github.com/scoverage/scalac-scoverage-plugin/blob/v1.3.0/build.sbt#L15
scala-compiler is marked as provided in the dependencies of scalac-scoverage-plugin, so it should be added manually when needed.
The method scala.tools.nsc.Global.reporter() returns Reporter in scala-compiler 2.12.0:
https://github.com/scala/scala/blob/v2.12.0/src/compiler/scala/tools/nsc/Global.scala#L1581
(it seems that in 2.12.0 there is actually no scala.tools.nsc.Global.reporter(); it appears in 2.12.5). In 2.12.14 the signature is different: the method returns FilteringReporter:
https://github.com/scala/scala/blob/v2.12.14/src/compiler/scala/tools/nsc/Global.scala#L1748
https://github.com/scala/scala/blob/v2.12.14/src/compiler/scala/tools/nsc/Global.scala#L90
So the issue seems to be the combination of scalac-scoverage-plugin 1.3.0 + scala-compiler 2.12.14.
Try to either upgrade the scoverage plugin or downgrade the Scala version so that the signature of the method is the same (returning either Reporter both times or FilteringReporter both times). The signature changed between 2.12.12 and 2.12.13 (Reporter -> FilteringReporter):
https://github.com/scala/scala/blob/v2.12.12/src/compiler/scala/tools/nsc/Global.scala#L90
https://github.com/scala/scala/blob/v2.12.13/src/compiler/scala/tools/nsc/Global.scala#L90
If you are on scoverage plugin 1.4.1, can you change Scala to 2.12.12?
To downgrade the Scala version, try:
<properties>
<scala.version>2.12.12</scala.version>
...
</properties>
in combination with
<dependency>
<groupId>org.scoverage</groupId>
<artifactId>scalac-scoverage-plugin_2.12</artifactId>
<version>1.4.1</version>
</dependency>
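The other route mentioned above, upgrading the scoverage plugin instead of downgrading Scala, would roughly amount to bumping the plugin property. Note that 1.4.11 below is only an example of a later release built against the newer FilteringReporter API, so check the plugin's compatibility notes against your exact Scala version:

<properties>
    <!-- keep the newer Scala and move scoverage forward instead (illustrative versions) -->
    <scala.version>2.12.14</scala.version>
    <scoverage.plugin.version>1.4.11</scoverage.plugin.version>
</properties>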

Maven build with Jenkins for Scala Spark program: "No primary artifact to install, installing attached artifacts instead"

I have a project with multiple Scala Spark programs. When I run mvn install through Eclipse I get the correct jar generated, which I then use with the spark-submit command.
After pushing the code to Git we are trying to build it with Jenkins, as we want to automatically push the jar file to our Hadoop cluster using Ansible.
We have a Jenkinsfile with the build goals "compile package install -X".
The logs show:
[DEBUG] (f) artifact = com.esi.rxhome:PROJECT1:jar:0.0.1-SNAPSHOT
[DEBUG] (f) attachedArtifacts = [com.esi.rxhome:PROJECT1:jar:jar-with-dependencies:0.0.1-SNAPSHOT, com.esi.rxhome:PROJECT1:jar:jar-with-dependencies:0.0.1-SNAPSHOT]
[DEBUG] (f) createChecksum = false
[DEBUG] (f) localRepository = id: local
url: file:///home/jenkins/.m2/repository/
layout: default
snapshots: [enabled => true, update => always]
releases: [enabled => true, update => always]
[DEBUG] (f) packaging = jar
[DEBUG] (f) pomFile = /opt/jenkins-data/workspace/ng_datahub-pipeline_develop-JYTJLDEXV65VZWDCZAXG5Y7SHBG2534GFEF3OF2WC4543G6ANZYA/pom.xml
[DEBUG] (s) skip = false
[DEBUG] (f) updateReleaseInfo = false
[DEBUG] -- end configuration --
[INFO] No primary artifact to install, installing attached artifacts instead
I saw this error in a similar post:
Maven: No primary artifact to install, installing attached artifacts instead
But there the answer says to remove auto clean, and I am not sure how to do that while Jenkins is building the jar file.
Below is the pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.esi.rxhome</groupId>
<artifactId>PROJECT1</artifactId>
<version>0.0.1-SNAPSHOT</version>
<packaging>jar</packaging>
<name>${project.artifactId}</name>
<description>RxHomePreprocessing</description>
<inceptionYear>2015</inceptionYear>
<licenses>
<license>
<name>My License</name>
<url>http://....</url>
<distribution>repo</distribution>
</license>
</licenses>
<properties>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
<encoding>UTF-8</encoding>
<scala.version>2.10.6</scala.version>
<scala.compat.version>2.10</scala.compat.version>
</properties>
<dependencies>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>${scala.version}</version>
</dependency>
<!-- Test -->
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.11</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.specs2</groupId>
<artifactId>specs2-core_${scala.compat.version}</artifactId>
<version>2.4.16</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.scalatest</groupId>
<artifactId>scalatest_${scala.compat.version}</artifactId>
<version>2.2.4</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.hive</groupId>
<artifactId>hive-jdbc</artifactId>
<version>1.2.1000.2.6.0.3-8</version>
</dependency>
<!-- <dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>2.1.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.10</artifactId>
<version>2.1.0</version>
</dependency> -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>1.6.3</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.10</artifactId>
<version>1.6.3</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_2.10</artifactId>
<version>1.6.3</version>
</dependency>
<dependency>
<groupId>com.databricks</groupId>
<artifactId>spark-csv_2.10</artifactId>
<version>1.5.0</version>
</dependency>
</dependencies>
<build>
<sourceDirectory>src/main/scala</sourceDirectory>
<testSourceDirectory>src/test/scala</testSourceDirectory>
<plugins>
<plugin>
<!-- see http://davidb.github.com/scala-maven-plugin -->
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>3.2.0</version>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>testCompile</goal>
</goals>
<configuration>
<args>
<arg>-make:transitive</arg>
<arg>-dependencyfile</arg>
<arg>${project.build.directory}/.scala_dependencies</arg>
</args>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.18.1</version>
<configuration>
<useFile>false</useFile>
<disableXmlReport>true</disableXmlReport>
<!-- If you have classpath issue like NoDefClassError,... -->
<!-- useManifestOnlyJar>false</useManifestOnlyJar -->
<includes>
<include>**/*Test.*</include>
<include>**/*Suite.*</include>
</includes>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>2.4</version>
<configuration>
<skipIfEmpty>true</skipIfEmpty>
</configuration>
<executions>
<execution>
<goals>
<goal>jar</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
<version>3.0.0</version>
<configuration>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
<archive>
<manifest>
<mainClass>com.esi.spark.storedprocedure.Test_jdbc_nospark</mainClass>
</manifest>
</archive>
</configuration>
<executions>
<execution>
<id>make-assembly</id>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<artifactId>maven-clean-plugin</artifactId>
<version>3.0.0</version>
<configuration>
<skip>true</skip>
</configuration>
</plugin>
</plugins>
</build>
</project>
I tried specifying:
1. "jar" for packaging in pom.xml.
2. Changing the Maven goals to:
"install"
"clean install"
"compile package install"
But the above attempts did not help get rid of the message, and the jar created was of no use.
When I try to execute the spark-submit command:
spark-submit --driver-java-options -Djava.io.tmpdir=/home/EH2524/tmp --conf spark.local.dir=/home/EH2524/tmp --driver-memory 2G --executor-memory 2G --total-executor-cores 1 --num-executors 10 --executor-cores 10 --class com.esi.spark.storedprocedure.Test_jdbc_nospark --master yarn /home/EH2524/PROJECT1-0.0.1-20171124.213717-1-jar-with-dependencies.jar
Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default
java.lang.ClassNotFoundException: com.esi.spark.storedprocedure.Test_jdbc_nospark
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.util.Utils$.classForName(Utils.scala:175)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:703)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Here, Test_jdbc_nospark is a Scala object.
I'm not sure, but your maven-jar-plugin configuration looks suspicious. Normally, the execution would specify a phase, as in
<execution>
<phase>package</phase>
<goals>
<goal>jar</goal>
</goals>
</execution>
(from this example). Perhaps omitting that is causing your default jar not to be built? Certainly the error message sounds like your default jar is not being built, but you didn't actually say so.
This message was because the maven-jar-plugin had <skipIfEmpty> set to true. Once I removed this, the build no longer gives the message "No primary artifact to install, installing attached artifacts instead".
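For reference, a rough sketch of the corrected plugin section, i.e. the same maven-jar-plugin block from the pom above with the skipIfEmpty flag dropped:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-jar-plugin</artifactId>
    <version>2.4</version>
    <!-- no <skipIfEmpty> flag, so the primary (default) jar is built and installed -->
    <executions>
        <execution>
            <goals>
                <goal>jar</goal>
            </goals>
        </execution>
    </executions>
</plugin>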
The empty jar was getting created because of an incorrect path in pom.xml.
Initially:
<build>
<sourceDirectory>src/main/scala</sourceDirectory>
Since Jenkins was building from the code in Git and the pom was inside the project folder, the source directory had to point at the folder inside the repository:
<build>
<sourceDirectory>folder_name_in_git/src/main/scala</sourceDirectory>

java.lang.NoClassDefFoundError: com/typesafe/config/ConfigFactory when packaging Scala project with Maven

This question is a continuation of this thread.
The problem I have concerns packaging a Scala Spark project with Maven.
When I run this command:
spark-submit --name 28 --master local[2] --class org.test.consumer.TestRunner \
/usr/tests/test1/target/test_service-1.0-SNAPSHOT.jar \
$arg1 $arg2 $arg3 $arg4 $arg5
..., I get the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: com/typesafe/config/ConfigFactory
at org.test.consumer.kafka.KafkaConsumer.<init>(KafkaConsumer.scala:38)
at org.test.consumer.TestRunner$.main(TestRunner.scala:19)
at org.test.consumer.TestRunner.main(TestRunner.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:729)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: com.typesafe.config.ConfigFactory
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 12 more
This is my current pom.xml, which takes into account the recommendations from the thread mentioned above (I checked that the generated jar contains the Scala classes):
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.test.consumer</groupId>
<artifactId>test_service</artifactId>
<version>1.0-SNAPSHOT</version>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
<java.version>1.8</java.version>
<scala.version>2.11.8</scala.version>
</properties>
<dependencies>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>${scala.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.11</artifactId>
<version>1.6.2</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-kafka_2.11</artifactId>
<version>1.6.2</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.module</groupId>
<artifactId>jackson-module-scala_2.11</artifactId>
<version>2.7.5</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-core</artifactId>
<version>2.7.5</version>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
<version>2.7.5</version>
</dependency>
<dependency>
<groupId>org.sedis</groupId>
<artifactId>sedis_2.11</artifactId>
<version>1.2.2</version>
</dependency>
<dependency>
<groupId>com.lambdaworks</groupId>
<artifactId>jacks_2.11</artifactId>
<version>2.3.3</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>1.6.2</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-mllib-local_2.11</artifactId>
<version>2.0.0</version>
</dependency>
<dependency>
<groupId>com.github.nscala-time</groupId>
<artifactId>nscala-time_2.11</artifactId>
<version>2.12.0</version>
</dependency>
</dependencies>
<build>
<plugins>
<!-- Configure maven-compiler-plugin to use the desired Java version -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>${java.version}</source>
<target>${java.version}</target>
</configuration>
</plugin>
<!-- Use build-helper-maven-plugin to add Scala source and test source directories -->
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>build-helper-maven-plugin</artifactId>
<version>1.10</version>
<executions>
<execution>
<id>add-source</id>
<phase>generate-sources</phase>
<goals>
<goal>add-source</goal>
</goals>
<configuration>
<sources>
<source>src/main/scala</source>
</sources>
</configuration>
</execution>
<execution>
<id>add-test-source</id>
<phase>generate-test-sources</phase>
<goals>
<goal>add-test-source</goal>
</goals>
<configuration>
<sources>
<source>src/test/scala</source>
</sources>
</configuration>
</execution>
</executions>
</plugin>
<!-- Use scala-maven-plugin for Scala support -->
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>3.2.2</version>
<executions>
<execution>
<goals>
<!-- Need to specify this explicitly, otherwise plugin won't be called when doing e.g. mvn compile -->
<goal>compile</goal>
<goal>testCompile</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
Add this to your pom.xml.
<dependency>
<groupId>com.typesafe</groupId>
<artifactId>config</artifactId>
<version>1.3.1</version>
</dependency>
Note that you don't have to add this jar if it is already present.
Looks like the jar should be an executable jar with all your dependencies packaged within the jar file.
You will have to use the maven assembly plugin to do that.
<build>
<plugins>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
<configuration>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
<archive>
<manifest>
<mainClass>org.sample.App</mainClass> <!-- put your main class here -->
</manifest>
</archive>
</configuration>
</plugin>
</plugins>
</build>
This plugin will create an additional jar (apart from the regular jar) with -jar-with-dependencies appended to its name. Use this jar instead of the original one when running spark-submit.

Spark - Exception in thread "main" java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror

I am using Scala 2.11.8 and used Maven to build the jar file in IntelliJ.
I ran my program in MobaXterm:
/opt/spark-1.6.1-bin-hadoop2.6/bin/spark-submit --class CDR.SQL cdr-maven-1.0-SNAPSHOT.jar
This is the error message:
Exception in thread "main" java.lang.NoSuchMethodError: scala.reflect.api.JavaUniverse.runtimeMirror(Ljava/lang/ClassLoader;)Lscala/reflect/api/JavaUniverse$JavaMirror;
at CDR.SQL$.<init>(SQL.scala:24)
at CDR.SQL$.<clinit>(SQL.scala)
at CDR.SQL.main(SQL.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
This is my pom.xml file:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>cdr-maven</groupId>
<artifactId>cdr-maven</artifactId>
<version>1.0-SNAPSHOT</version>
<repositories>
<repository>
<id>apache-repo</id>
<name>Apache Repository</name>
<url>https://repository.apache.org/content/repositories/releases</url>
<releases>
<enabled>true</enabled>
</releases>
<snapshots>
<enabled>false</enabled>
</snapshots>
</repository>
</repositories>
<build>
<plugins>
<plugin>
<groupId>org.scala-tools</groupId>
<artifactId>maven-scala-plugin</artifactId>
<version>2.15.2</version>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>testCompile</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>2.11.8</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>1.4.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-mllib_2.11</artifactId>
<version>1.4.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>1.4.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_2.11</artifactId>
<version>1.4.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.11</artifactId>
<version>1.4.1</version>
<scope>provided</scope>
</dependency>
</dependencies>
</project>
How do I resolve this?
Thanks for the help!
Have you downloaded the Spark distribution or built it yourself?
Spark 1.6 is built by default with Scala 2.10. Please change your Scala version to 2.10, or update Spark to 2.0, which is built by default with Scala 2.11.
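For example, if the cluster stays on the prebuilt spark-1.6.1-bin-hadoop2.6 distribution (built against Scala 2.10), the pom could be aligned roughly as follows; the 2.10.6 and 1.6.1 numbers are illustrative rather than mandatory:

<properties>
    <scala.version>2.10.6</scala.version>
</properties>

<dependencies>
    <!-- match the Scala binary version the Spark 1.6.x distribution ships with -->
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.6.1</version>
    </dependency>
</dependencies>

The other Spark artifacts (spark-sql, spark-mllib, spark-hive, spark-streaming) would need the same _2.10 suffix and matching version.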
Check the version.
<build>
<plugins>
<plugin>
<groupId>org.scala-tools</groupId>
<artifactId>maven-scala-plugin</artifactId>
<version>2.15.2</version>
<executions>
<execution>
It looks like you are using the 2.15.x version of the old maven-scala-plugin.
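If the plugin version turns out to matter, one possible change (my suggestion, not part of the original answer) is to move from the old org.scala-tools maven-scala-plugin to the scala-maven-plugin already used in the other poms on this page, for example:

<plugin>
    <groupId>net.alchim31.maven</groupId>
    <artifactId>scala-maven-plugin</artifactId>
    <version>3.2.2</version>
    <executions>
        <execution>
            <goals>
                <goal>compile</goal>
                <goal>testCompile</goal>
            </goals>
        </execution>
    </executions>
</plugin>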

Spark Streaming on EC2: Exception in thread "main" java.lang.ExceptionInInitializerError

I am trying to run spark-submit on a jar file that I created. When I run it locally on my machine it works correctly, but when deployed onto Amazon EC2 it returns the following error:
[root@ip-172-31-47-217 bin]$ ./spark-submit --master local[2] --class main.java.Streamer ~/streaming-project-1.0-jar-with-dependencies.jar
Exception in thread "main" java.lang.ExceptionInInitializerError
at org.apache.spark.streaming.StreamingContext$.<init>(StreamingContext.scala:728)
at org.apache.spark.streaming.StreamingContext$.<clinit>(StreamingContext.scala)
at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:81)
at main.java.Streamer$.main(Streamer.scala:24)
at main.java.Streamer.main(Streamer.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:672)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.NoSuchFieldException: SHUTDOWN_HOOK_PRIORITY
at java.lang.Class.getField(Class.java:1592)
at org.apache.spark.util.SparkShutdownHookManager.install(ShutdownHookManager.scala:220)
at org.apache.spark.util.ShutdownHookManager$.shutdownHooks$lzycompute(ShutdownHookManager.scala:50)
at org.apache.spark.util.ShutdownHookManager$.shutdownHooks(ShutdownHookManager.scala:48)
at org.apache.spark.util.ShutdownHookManager$.addShutdownHook(ShutdownHookManager.scala:189)
at org.apache.spark.util.ShutdownHookManager$.<init>(ShutdownHookManager.scala:58)
at org.apache.spark.util.ShutdownHookManager$.<clinit>(ShutdownHookManager.scala)
... 14 more
Below is my pom.xml file:
<?xml version="1.0" encoding="UTF-8"?>
<project>
<groupId>astiefel</groupId>
<artifactId>streaming-project</artifactId>
<modelVersion>4.0.0</modelVersion>
<name>Streamer Project</name>
<packaging>jar</packaging>
<version>1.0</version>
<properties>
<maven.compiler.source>1.6</maven.compiler.source>
<maven.compiler.target>1.6</maven.compiler.target>
<encoding>UTF-8</encoding>
<scala.tools.version>2.10</scala.tools.version>
<!-- Put the Scala version of the cluster -->
<scala.version>2.10.4</scala.version>
</properties>
<dependencies>
<dependency> <!-- Spark dependency -->
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>1.5.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_2.10</artifactId>
<version>1.5.1</version>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-compiler</artifactId>
<version>${scala.version}</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-mllib_2.10</artifactId>
<version>1.0.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-mllib_2.10</artifactId>
<version>1.3.1</version>
</dependency>
<dependency>
<groupId>org.scalanlp</groupId>
<artifactId>breeze_2.10</artifactId>
<version>0.10</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.10</artifactId>
<version>1.4.0</version>
</dependency>
</dependencies>
<repositories>
<repository>
<id>cloudera-repo-releases</id>
<url>https://repository.cloudera.com/artifactory/repo/</url>
</repository>
</repositories>
<build>
<sourceDirectory>src/main/java</sourceDirectory>
<plugins>
<plugin>
<!-- see http://davidb.github.com/scala-maven-plugin -->
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<!--<version>3.1.3</version>-->
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>testCompile</goal>
</goals>
<configuration>
<args>
<arg>-make:transitive</arg>
<arg>-dependencyfile</arg>
<arg>${project.build.directory}/.scala_dependencies</arg>
</args>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<!--<version>2.13</version>-->
<configuration>
<useFile>false</useFile>
<disableXmlReport>true</disableXmlReport>
<!-- If you have classpath issue like NoDefClassError,... -->
<useManifestOnlyJar>false</useManifestOnlyJar>
<includes>
<include>**/*Test.*</include>
<include>**/*Suite.*</include>
</includes>
</configuration>
</plugin>
<!-- "package" command plugin -->
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<!--<version>2.4.1</version>-->
<configuration>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
<executions>
<execution>
<id>make-assembly</id>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
When you launch spark-ec2, the default Hadoop version is 1.2.1. However, recent Spark versions (at least 1.5.1) require the SHUTDOWN_HOOK_PRIORITY field in the hadoop.fs.FileSystem class, which was introduced in Hadoop 2+.
One fix for this problem is to start your Spark cluster with Hadoop version 2+. See spark-ec2 --help for the available options. For example, --hadoop-major-version=yarn will install version 2.4 of Hadoop.