Getting java.lang.ClassNotFoundException while running Spark Application - scala

I am new to Spark (Scala) and I am trying to run a Spark application through spark-submit. Unfortunately, I am getting a java.lang.ClassNotFoundException.
Here is my spark submit command:
./spark-submit --class "spark.phoenix.a" --master local --deploy-mode client /home/ec2-user/phoenix-0.0.1-SNAPSHOT.jar
Here is the exception:
java.lang.ClassNotFoundException: spark.phoenix.a
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.spark.util.Utils$.classForName(Utils.scala:175)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:689)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Here is my sample code:
package spark.phoenix

import org.apache.spark.SparkConf
import org.apache.spark.sql.SQLContext

object a {
  def main(args: Array[String]) {
    import org.apache.phoenix.spark._
    import org.apache.spark.SparkContext

    val zkQuorum = "localhost:2181"
    val master = "local[*]"

    val sparkConf = new SparkConf()
    sparkConf.setAppName("phoenix-spark-save")
      .setMaster(s"${master}")

    val sc = new SparkContext(sparkConf)
    val sqlContext = new SQLContext(sc)

    // read from the TABLE1 Phoenix table
    val df = sqlContext.phoenixTableAsDataFrame(
      "TABLE1",
      Seq("ID", "COL1"),
      zkUrl = Some(s"${zkQuorum}"))
  }
}

I think the problem is with your pom.xml file (if you are using Maven): you haven't added the maven-shade-plugin, and the scala-maven-plugin below is what compiles your Scala sources into the jar in the first place. Add the build section below to your pom.xml and try again.
<build>
<resources>
<resource>
<directory>src/main/resources</directory>
</resource>
</resources>
<plugins>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>3.2.2</version>
<executions>
<execution>
<id>eclipse-add-source</id>
<goals>
<goal>add-source</goal>
</goals>
</execution>
<execution>
<id>scala-compile-first</id>
<phase>process-resources</phase>
<goals>
<goal>compile</goal>
</goals>
</execution>
<execution>
<id>scala-test-compile-first</id>
<phase>process-test-resources</phase>
<goals>
<goal>testCompile</goal>
</goals>
</execution>
<execution>
<id>attach-scaladocs</id>
<phase>verify</phase>
<goals>
<goal>doc-jar</goal>
</goals>
</execution>
</executions>
<configuration>
<scalaVersion>${scala.version}</scalaVersion>
<recompileMode>incremental</recompileMode>
<useZincServer>true</useZincServer>
<args>
<arg>-unchecked</arg>
<arg>-deprecation</arg>
<arg>-feature</arg>
</args>
<jvmArgs>
<jvmArg>-Xms1024m</jvmArg>
<jvmArg>-Xmx1024m</jvmArg>
<jvmArg>-XX:ReservedCodeCacheSize=${CodeCacheSize}</jvmArg>
</jvmArgs>
<javacArgs>
<javacArg>-source</javacArg>
<javacArg>${java.version}</javacArg>
<javacArg>-target</javacArg>
<javacArg>${java.version}</javacArg>
<javacArg>-Xlint:all,-serial,-path</javacArg>
</javacArgs>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<artifactSet>
<excludes>
<exclude>org.xerial.snappy</exclude>
<exclude>org.scala-lang.modules</exclude>
<exclude>org.scala-lang</exclude>
</excludes>
</artifactSet>
<filters>
<filter>
<artifact>*:*</artifact>
<excludes>
<exclude>META-INF/*.SF</exclude>
<exclude>META-INF/*.DSA</exclude>
<exclude>META-INF/*.RSA</exclude>
</excludes>
</filter>
</filters>
<relocations>
<relocation>
<pattern>com.google.common</pattern>
<shadedPattern>shaded.com.google.common</shadedPattern>
</relocation>
</relocations>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
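Then rebuild and submit the freshly built jar from target/ (the exact jar name depends on your artifactId and version):
mvn clean package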

Please remove the quotes around the class name in your command. Try something like this:
./spark-submit --class spark.phoenix.a --master local --deploy-mode client /home/ec2-user/phoenix-0.0.1-SNAPSHOT.jar
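It is also worth confirming that the class actually made it into the jar before resubmitting; for the package and object in the sample code, this should list spark/phoenix/a.class and spark/phoenix/a$.class:
jar tf /home/ec2-user/phoenix-0.0.1-SNAPSHOT.jar | grep spark/phoenix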

Related

Unable to load data to Elasticsearch using Kafka structured streaming

I am able to consume the messages from Kafka using structured streaming and can print them in console mode, but I am unable to write them to Elasticsearch. I have tried all the available options and nothing works. Unfortunately, I am not getting any error either. Any suggestions? Am I missing anything?
val kafkaDF = spark
  .readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", config.KafkaServer) //"x-sin-edp-d-kfk-8-s01.dfs:6667,x-sin-edp-d-kfk-8-s02.dfs:6667"
  .option("subscribe", config.topic) //"uat.push.mulesoft"
  .option("startingOffsets", "earliest")
  .option("failOnDataLoss", "false")
  //.option("auto.offset.reset", "earliest")
  .load()

val invoiceWriterQuery = outputDF.writeStream
  //.queryName("Flattened Invoice Writer")
  .outputMode("append")
  //.format("console")
  .format("org.elasticsearch.spark.sql")
  //.format("es")
  .option("es.nodes.wan.only", "true")
  .option("es.nodes.discovery", "false")
  .option("es.nodes.client.only", "false")
  .option("es.net.ssl", "true")
  .option("es.mapping.id", outputDF.col("key").toString())
  .option("es.write.operation", "upsert")
  .option("es.nodes", config.esnodes) //"https://vpc-dfs-datahub-elastic-dev-j4qxkivimyqieuuzed2ntomviu.ap-southeast-1.es.amazonaws.com/"
  .option("es.port", config.esport)
  .option("es.resource", "loyalty_vw_eod_sls_dtl_test1/doc")
  //.option("path", "output")
  .option("checkpointLocation", "chk-point-dir")
  .trigger(Trigger.ProcessingTime("1 minute"))
  .start()

//sf_df.unpersist()
invoiceWriterQuery.awaitTermination()
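Since the job reports no error, a quick sanity check is to inspect the StreamingQuery handle itself before blocking on awaitTermination; a minimal sketch:
// Sketch: check whether the query is actually making progress.
println(invoiceWriterQuery.status)        // is a trigger active, or is the query waiting for data?
println(invoiceWriterQuery.lastProgress)  // rows read from Kafka / written to the sink in the last batch
invoiceWriterQuery.exception.foreach(_.printStackTrace()) // surfaces errors swallowed by the async query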
My pom.xml:
<modelVersion>4.0.0</modelVersion>
<groupId>org.example</groupId>
<artifactId>LoyaltyESWrite</artifactId>
<version>0.0.1-SNAPSHOT</version>
<packaging>jar</packaging>
<name>Loyalty_Write_To_ElasticSearch</name>
<url>http://maven.apache.org</url>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
<encoding>UTF-8</encoding>
<scala.version>2.11.8</scala.version>
<scala.compat.version>2.11</scala.compat.version>
<java.version>1.8</java.version>
<spark.core.version>2.3.1</spark.core.version>
<hadoop.version>2.7.3</hadoop.version>
</properties>
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_${scala.compat.version}</artifactId>
<version>${spark.core.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_${scala.compat.version}</artifactId>
<version>${spark.core.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_${scala.compat.version}</artifactId>
<version>${spark.core.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
<!--<version>${spark.core.version}</version> -->
<version>2.3.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql-kafka-0-10_2.11</artifactId>
<version>2.3.1</version>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-common</artifactId>
<version>${hadoop.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-client</artifactId>
<version>${hadoop.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>${scala.version}</version>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-compiler</artifactId>
<version>${scala.version}</version>
</dependency>
<dependency>
<groupId>net.ceedubs</groupId>
<artifactId>ficus_2.11</artifactId>
<version>1.1.1</version>
</dependency>
<dependency>
<groupId>com.typesafe</groupId>
<artifactId>config</artifactId>
<version>1.0.0</version>
</dependency>
<dependency>
<groupId>net.liftweb</groupId>
<artifactId>lift-json_2.11</artifactId>
<version>3.2.0-M2</version>
</dependency>
<dependency>
<groupId>org.scalatest</groupId>
<artifactId>scalatest_${scala.compat.version}</artifactId>
<version>2.2.6</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>jdk.tools</groupId>
<artifactId>jdk.tools</artifactId>
<version>1.8</version>
<scope>system</scope>
<systemPath>${JAVA_HOME}/lib/tools.jar</systemPath>
</dependency>
<dependency>
<groupId>org.elasticsearch</groupId>
<artifactId>elasticsearch-spark-20_2.11</artifactId>
<version>7.1.1</version>
</dependency>
<!-- https://mvnrepository.com/artifact/net.snowflake/spark-snowflake -->
<dependency>
<groupId>net.snowflake</groupId>
<artifactId>spark-snowflake_2.11</artifactId>
<version>2.8.5-spark_2.3</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>3.2.2</version>
<executions>
<execution>
<id>scala-compile-first</id>
<phase>process-resources</phase>
<goals>
<goal>add-source</goal>
<goal>compile</goal>
</goals>
<configuration>
<sources>
<source>src/main/scala</source>
</sources>
</configuration>
</execution>
<execution>
<id>scala-test-compile</id>
<phase>process-test-resources</phase>
<goals>
<goal>testCompile</goal>
</goals>
</execution>
</executions>
<configuration>
<recompileMode>incremental</recompileMode>
<args>
<arg>-deprecation</arg>
<arg>-explaintypes</arg>
<arg>-target:jvm-1.8</arg>
<arg>${project.build.directory}/.scala_dependencies</arg>
</args>
</configuration>
</plugin>
<plugin>
<groupId>org.scalatest</groupId>
<artifactId>scalatest-maven-plugin</artifactId>
<version>1.0</version>
<configuration>
<reportsDirectory>${project.build.directory}/surefire-reports</reportsDirectory>
<stdout>W</stdout>
</configuration>
<executions>
<execution>
<id>scala-test</id>
<goals>
<goal>test</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.5.1</version>
<configuration>
<source>${java.version}</source>
<target>${java.version}</target>
<encoding>UTF-8</encoding>
<maxmem>1024m</maxmem>
<fork>true</fork>
<compilerArgs>
<arg>-Xlint:all,-serial,-path</arg>
</compilerArgs>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.12.4</version>
<configuration>
<argLine>-Xmx512m</argLine>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
<version>2.6</version>
<configuration>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
<archive>
<manifest>
<mainClass>org.example.WriteToES</mainClass>
</manifest>
</archive>
</configuration>
<executions>
<execution>
<id>create-my-bundle</id>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>

API incompatibility - com.google.code.play2-maven-plugin:play2-maven-plugin:1.0.0-rc5:template-compile: java.lang.ExceptionInInitializerError: null

I am seeing this error when trying to package a Maven project:
[ERROR] Failed to execute goal com.google.code.play2-maven-plugin:play2-maven-plugin:1.0.0-rc5:template-compile (default-template-compile) on project api_2.11: Execution default-template-compile of goal com.google.code.play2-maven-plugin:play2-maven-plugin:1.0.0-rc5:template-compile failed: An API incompatibility was encountered while executing com.google.code.play2-maven-plugin:play2-maven-plugin:1.0.0-rc5:template-compile: java.lang.ExceptionInInitializerError: null
[ERROR] -----------------------------------------------------
[ERROR] realm = extension>com.google.code.play2-maven-plugin:play2-maven-plugin:1.0.0-rc5
[ERROR] strategy = org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy
[ERROR] urls[0] = file:/Users/k28497/.m2/repository/com/google/code/play2-maven-plugin/play2-maven-plugin/1.0.0-rc5/play2-maven-plugin-1.0.0-rc5.jar
[ERROR] urls[1] = file:/Users/k28497/.m2/repository/com/google/code/play2-maven-plugin/play2-provider-api/1.0.0-rc5/play2-provider-api-1.0.0-rc5.jar
[ERROR] urls[2] = file:/Users/k28497/.m2/repository/com/google/code/play2-maven-plugin/play2-source-position-mappers/1.0.0-rc5/play2-source-position-mappers-1.0.0-rc5.jar
[ERROR] urls[3] = file:/Users/k28497/.m2/repository/com/google/code/play2-maven-plugin/play2-source-watcher-api/1.0.0-rc5/play2-source-watcher-api-1.0.0-rc5.jar
[ERROR] urls[4] = file:/Users/k28497/.m2/repository/com/google/code/sbt-compiler-maven-plugin/sbt-compiler-api/1.0.0/sbt-compiler-api-1.0.0.jar
[ERROR] urls[5] = file:/Users/k28497/.m2/repository/org/apache/ant/ant/1.9.4/ant-1.9.4.jar
[ERROR] urls[6] = file:/Users/k28497/.m2/repository/org/apache/ant/ant-launcher/1.9.4/ant-launcher-1.9.4.jar
[ERROR] urls[7] = file:/Users/k28497/.m2/repository/org/apache/maven/shared/maven-common-artifact-filters/1.4/maven-common-artifact-filters-1.4.jar
[ERROR] urls[8] = file:/Users/k28497/.m2/repository/org/codehaus/plexus/plexus-archiver/3.6.0/plexus-archiver-3.6.0.jar
[ERROR] urls[9] = file:/Users/k28497/.m2/repository/org/codehaus/plexus/plexus-io/3.0.1/plexus-io-3.0.1.jar
[ERROR] urls[10] = file:/Users/k28497/.m2/repository/commons-io/commons-io/2.6/commons-io-2.6.jar
[ERROR] urls[11] = file:/Users/k28497/.m2/repository/org/apache/commons/commons-compress/1.16.1/commons-compress-1.16.1.jar
[ERROR] urls[12] = file:/Users/k28497/.m2/repository/org/objenesis/objenesis/2.6/objenesis-2.6.jar
[ERROR] urls[13] = file:/Users/k28497/.m2/repository/org/iq80/snappy/snappy/0.4/snappy-0.4.jar
[ERROR] urls[14] = file:/Users/k28497/.m2/repository/org/tukaani/xz/1.8/xz-1.8.jar
[ERROR] urls[15] = file:/Users/k28497/.m2/repository/org/codehaus/plexus/plexus-utils/3.1.0/plexus-utils-3.1.0.jar
[ERROR] urls[16] = file:/Users/k28497/.m2/repository/org/sonatype/plexus/plexus-build-api/0.0.7/plexus-build-api-0.0.7.jar
[ERROR] Number of foreign imports: 1
[ERROR] import: Entry[import from realm ClassRealm[maven.api, parent: null]]
[ERROR]
[ERROR] -----------------------------------------------------
[ERROR] : object java.lang.Object in compiler mirror not found.
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginContainerException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
The plugin in question in my pom.xml:
<?xml version='1.0' encoding='UTF-8'?>
<project xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://maven.apache.org/POM/4.0.0">
<modules>
<module>api</module>
<module>common</module>
</modules>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
<play2.version>2.5.19</play2.version>
<scala.version>2.11.12</scala.version>
<scala.short.version>2.11</scala.short.version>
</properties>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>${scala.version}</version>
</dependency>
<dependency>
<groupId>org.scalatest</groupId>
<artifactId>scalatest_${scala.short.version}</artifactId>
<version>2.2.6</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scalap</artifactId>
<version>${scala.version}</version>
</dependency>
<dependency>
<groupId>net.liftweb</groupId>
<artifactId>lift-json_${scala.short.version}</artifactId>
<version>2.6.3</version>
</dependency>
</dependencies>
</dependencyManagement>
<build>
<pluginManagement>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-site-plugin</artifactId>
<version>3.7.1</version>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-project-info-reports-plugin</artifactId>
<version>3.0.0</version>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.7</version>
<configuration>
<skipTests>true</skipTests>
</configuration>
</plugin>
<plugin>
<groupId>org.scalatest</groupId>
<artifactId>scalatest-maven-plugin</artifactId>
<version>1.0</version>
<configuration>
<testFailureIgnore>true</testFailureIgnore>
<reportsDirectory>${project.build.directory}/surefire-reports</reportsDirectory>
<junitxml>.</junitxml>
<filereports>WDF TestSuite.txt</filereports>
</configuration>
<executions>
<execution>
<id>test</id>
<goals>
<goal>test</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
<version>3.3.0</version>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>3.2.0</version>
<configuration>
<archive>
<manifest>
<addDefaultImplementationEntries>true</addDefaultImplementationEntries>
<addDefaultSpecificationEntries>true</addDefaultSpecificationEntries>
</manifest>
</archive>
</configuration>
</plugin>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>4.3.0</version>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>testCompile</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>com.google.code.play2-maven-plugin</groupId>
<artifactId>play2-maven-plugin</artifactId>
<version>1.0.0-rc5</version>
<extensions>true</extensions>
<configuration>
<routesGenerator>injected</routesGenerator>
<distFormats>tar.gz</distFormats>
<distTarLongFileMode>posix</distTarLongFileMode>
<distTopLevelDirectory>${project.name}-${project.version}</distTopLevelDirectory>
</configuration>
<dependencies>
<dependency>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
<version>3.1.1</version>
</dependency>
</dependencies>
</plugin>
<plugin>
<groupId>com.google.code.sbt-compiler-maven-plugin</groupId>
<artifactId>sbt-compiler-maven-plugin</artifactId>
<version>1.0.0</version>
</plugin>
</plugins>
</pluginManagement>
</build>
</project>
And in the plugins section of the sub-module pom:
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
</plugin>
<plugin>
<groupId>com.google.code.play2-maven-plugin</groupId>
<artifactId>play2-maven-plugin</artifactId>
</plugin>
<plugin>
<groupId>com.google.code.sbt-compiler-maven-plugin</groupId>
<artifactId>sbt-compiler-maven-plugin</artifactId>
</plugin>
</plugins>
Any ideas?

IntelliJ with Gatling doesn't find simulations to run

I'm having an issue with IntelliJ and Gatling.
When I run the test, Gatling returns a message saying "No simulations to run":
[INFO]
[INFO] --- gatling-maven-plugin:3.0.5:test (default) # com.ffTests ---
[ERROR] No simulations to run
No simulations to run
Caused by: org.apache.maven.plugin.MojoFailureException: No simulations to run
at io.gatling.mojo.GatlingMojo.simulations (GatlingMojo.java:372)
at io.gatling.mojo.GatlingMojo.execute (GatlingMojo.java:213)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:137)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:210)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:156)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:148)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:56)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
at org.apache.maven.cli.MavenCli.execute (MavenCli.java:957)
at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:289)
at org.apache.maven.cli.MavenCli.main (MavenCli.java:193)
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke (Method.java:564)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:282)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:
The code I'm running is the following:
package Performance

import com.intuit.karate.gatling.PreDef._
import io.gatling.core.Predef._
import scala.concurrent.duration._

class performanceGo extends Simulation {

  val getReason = scenario("login")
    .exec(karateFeature("classpath:performance/TestPerformance.feature"))

  setUp(
    getReason.inject(rampUsers(10).during(10 seconds))
  )
}
The class above is supposed to execute a Cucumber feature called TestPerformance.feature.
The pom dependencies are:
<dependency>
<groupId>com.intuit.karate</groupId>
<artifactId>karate-apache</artifactId>
<version>0.9.5</version>
</dependency>
<dependency>
<groupId>com.intuit.karate</groupId>
<artifactId>karate-junit5</artifactId>
<version>0.9.5</version>
</dependency>
<dependency>
<groupId>com.intuit.karate</groupId>
<artifactId>karate-gatling</artifactId>
<version>${karate.version}</version>
<scope>test</scope>
</dependency>
The plugins I'm using are:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>${maven.compiler.version}</version>
<configuration>
<encoding>UTF-8</encoding>
<source>${java.version}</source>
<target>${java.version}</target>
<compilerArgument>-Werror</compilerArgument>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.22.2</version>
</plugin>
<plugin>
<groupId>io.gatling</groupId>
<artifactId>gatling-maven-plugin</artifactId>
<version>3.0.5</version>
<configuration>
<simulationsFolder>src/test/java</simulationsFolder>
<includes>
<include>mock.CatsKarateSimulation</include>
</includes>
<disableCompiler>true</disableCompiler>
</configuration>
<executions>
<execution>
<phase>test</phase>
<goals>
<goal>test</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>io.gatling.frontline</groupId>
<artifactId>frontline-maven-plugin</artifactId>
<version>1.0.3</version>
<executions>
<execution>
<goals>
<goal>package</goal>
</goals>
</execution>
</executions>
</plugin>
The Scala plugin version is 2019.3.17 and I'm using jre1.8.0_241.
Do you have any idea of what may be happening?
Maybe it's simply because you haven't pointed the plugin at the right Scala class. Your simulation is in a package called Performance, but your pom.xml refers to mock.CatsKarateSimulation.
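If that is the cause, pointing the include filter at the actual class (assuming it stays in the Performance package) would look like:
<includes>
    <include>Performance.performanceGo</include>
</includes>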
This issue was related to the pom.xml and the JAVA_HOME variable.
Solution:
JAVA_HOME = "C:\Program Files\Java\jdk-14"
pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.example</groupId>
<artifactId>com.ffTests</artifactId>
<version>1.0-SNAPSHOT</version>
<packaging>jar</packaging>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<java.version>1.8</java.version>
<maven.compiler.version>3.6.0</maven.compiler.version>
<karate.version>0.9.5</karate.version>
<gatling.plugin.version>3.0.2</gatling.plugin.version>
</properties>
<dependencies>
<dependency>
<groupId>com.intuit.karate</groupId>
<artifactId>karate-apache</artifactId>
<version>${karate.version}</version>
</dependency>
<dependency>
<groupId>com.intuit.karate</groupId>
<artifactId>karate-gatling</artifactId>
<version>${karate.version}</version>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<testResources>
<testResource>
<directory>src/test/java</directory>
<excludes>
<exclude>**/*.java</exclude>
</excludes>
</testResource>
</testResources>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>${maven.compiler.version}</version>
<configuration>
<encoding>UTF-8</encoding>
<source>${java.version}</source>
<target>${java.version}</target>
<compilerArgument>-Werror</compilerArgument>
</configuration>
</plugin>
<plugin>
<groupId>io.gatling</groupId>
<artifactId>gatling-maven-plugin</artifactId>
<version>${gatling.plugin.version}</version>
<configuration>
<simulationsFolder>src/test/java</simulationsFolder>
</configuration>
<executions>
<execution>
<phase>test</phase>
<goals>
<goal>test</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
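With this setup the simulations can also be launched from the command line via the plugin's test goal:
mvn gatling:test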

JethroJDBC DriverManager.getConnection throwing NullPointerException while running Maven Snapshot with Dependencies.jar

JethroJDBC DriverManager.getConnection works fine when I run the program locally in the Eclipse IDE, but the same code throws a NullPointerException when running the Maven snapshot-with-dependencies jar. In the executable jar, I can see that the classes exist in the com.jethrodata package.
Please help me.
HOST="jdbc:JethroData://UATSERVERHOST:9111/UATINSTANCE"
USER= "USERNAME"
PASS= "PassWord"
Source code:
public Connection getConnection(String HOST, String USER, String PASS) throws IOException {
    Connection connection = null;
    try {
        Class.forName("com.jethrodata.JethroDriver");
        // STEP 3: Open a connection
        connection = DriverManager.getConnection(HOST, USER, PASS);
        return connection;
    } catch (Exception e) {
        TestAutomationOutput.outputLog("There is an exception in connecting to the jethro database \r\n" + e);
        e.printStackTrace();
    }
    return null;
}
pom.xml:
<dependency>
<groupId>com.jethro</groupId>
<artifactId>jethrojdbc</artifactId>
<version>3.8</version>
</dependency>
<build>
<sourceDirectory>src</sourceDirectory>
<resources>
<resource>
<directory>src</directory>
<excludes>
<exclude>**/*.java</exclude>
</excludes>
</resource>
</resources>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.7.0</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<executions>
<execution>
<id>copy</id>
<phase>package</phase>
<goals>
<goal>copy-dependencies</goal>
</goals>
<configuration>
<outputDirectory>
${project.build.directory}/lib
</outputDirectory>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>3.0.2</version>
<configuration>
<archive>
<manifest>
<addClasspath>true</addClasspath>
<classpathPrefix>lib/</classpathPrefix>
<mainClass>rdltestmodule.RDLAutomationFrame</mainClass>
</manifest>
</archive>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
<version>2.2-beta-5</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
<configuration>
<archive>
<manifest>
<mainClass>rdltestmodule.RDLAutomationFrame</mainClass>
</manifest>
</archive>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
Exception:
java.sql.SQLException: java.lang.NullPointerException
at com.jethrodata.JethroDriver.connect(JethroDriver.java:115)
at com.jethrodata.JethroDriver.connect(JethroDriver.java:16)
at java.sql.DriverManager.getConnection(Unknown Source)
at java.sql.DriverManager.getConnection(Unknown Source)
at rdltestmodule.JethroConnection.getConnection(JethroConnection.java:36)
at rdltestmodule.TestManager.MasterTest(TestManager.java:39)
at rdltestmodule.RDLAutomationFrame$2.actionPerformed(RDLAutomationFrame.java:253)
at javax.swing.AbstractButton.fireActionPerformed(Unknown Source)
at javax.swing.AbstractButton$Handler.actionPerformed(Unknown Source)
at javax.swing.DefaultButtonModel.fireActionPerformed(Unknown Source)
at javax.swing.DefaultButtonModel.setPressed(Unknown Source)
at javax.swing.plaf.basic.BasicButtonListener.mouseReleased(Unknown Source)
at java.awt.Component.processMouseEvent(Unknown Source)
at javax.swing.JComponent.processMouseEvent(Unknown Source)
at java.awt.Component.processEvent(Unknown Source)
at java.awt.Container.processEvent(Unknown Source)
at java.awt.Component.dispatchEventImpl(Unknown Source)
at java.awt.Container.dispatchEventImpl(Unknown Source)
at java.awt.Component.dispatchEvent(Unknown Source)
at java.awt.LightweightDispatcher.retargetMouseEvent(Unknown Source)
at java.awt.LightweightDispatcher.processMouseEvent(Unknown Source)
at java.awt.LightweightDispatcher.dispatchEvent(Unknown Source)
at java.awt.Container.dispatchEventImpl(Unknown Source)
at java.awt.Window.dispatchEventImpl(Unknown Source)
at java.awt.Component.dispatchEvent(Unknown Source)
at java.awt.EventQueue.dispatchEventImpl(Unknown Source)
at java.awt.EventQueue.access$500(Unknown Source)
at java.awt.EventQueue$3.run(Unknown Source)
at java.awt.EventQueue$3.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(Unknown Source)
at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(Unknown Source)
at java.awt.EventQueue$4.run(Unknown Source)
at java.awt.EventQueue$4.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(Unknown Source)
at java.awt.EventQueue.dispatchEvent(Unknown Source)
at java.awt.EventDispatchThread.pumpOneEventForFilters(Unknown Source)
at java.awt.EventDispatchThread.pumpEventsForFilter(Unknown Source)
at java.awt.EventDispatchThread.pumpEventsForHierarchy(Unknown Source)
at java.awt.EventDispatchThread.pumpEvents(Unknown Source)
at java.awt.EventDispatchThread.pumpEvents(Unknown Source)
at java.awt.EventDispatchThread.run(Unknown Source)
You can download the Jethro jar from https://jethro.io/driver-downloads, add it to your project, and declare the dependency like below:
<dependency>
<groupId>com.jethrodata</groupId>
<artifactId>jethro-jdbc</artifactId>
<version>3.8</version>
<systemPath>${basedir}\src\main\resources\lib\jethro-jdbc-3.8.jar</systemPath>
<scope>system</scope>
</dependency>
And you can use the JethroConnection class, for example, like below:
JethroConnection connection = null;
try {
    Class.forName("com.jethrodata.JethroDriver");
} catch (ClassNotFoundException e) {
    e.printStackTrace();
}
try {
    connection = (JethroConnection) DriverManager.getConnection(url, username, password);
} catch (SQLException e) {
    e.printStackTrace();
}
JethroStatement stmt = null;
JethroResultSet rs = null;
stmt = (JethroStatement) connection.createStatement();
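For illustration, the statement can then be used like any other JDBC statement (the table name below is hypothetical, and SQLException handling is omitted as in the snippet above):
rs = (JethroResultSet) stmt.executeQuery("SELECT * FROM my_table LIMIT 10");
while (rs.next()) {
    System.out.println(rs.getString(1));
}
rs.close();
stmt.close();
connection.close();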

Exception in thread "streaming-start" java.lang.NoSuchMethodError: scala.Predef$.ArrowAssoc(Ljava/lang/Object;)Ljava/lang/Object;

I have seen all the threads related to this issue, and they are all very clear that the posters were cross-compiling with two versions of Scala. In my case I made sure I only have one version, 2.11, but I still get the same error. Any help is appreciated, thanks.
My Spark env (from the spark-shell banner):
Spark version 2.0.0.2.5.3.0-37
Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_67)
My pom.xml:
<properties>
<spark.version>2.2.1</spark.version>
<scala.version>2.11.8</scala.version>
<scala.library.version>2.11.8</scala.library.version>
<scala.binary.version>2.11</scala.binary.version>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
<java.source.version>1.7</java.source.version>
<java.compile.version>1.7</java.compile.version>
<kafka.version>0-10</kafka.version>
</properties>
<dependencies>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_${scala.binary.version}</artifactId>
<version>${spark.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_${scala.binary.version}</artifactId>
<version>${spark.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_${scala.binary.version}</artifactId>
<version>${spark.version}</version>
</dependency>
<dependency>
<groupId>com.typesafe.scala-logging</groupId>
<artifactId>scala-logging-slf4j_${scala.binary.version}</artifactId>
<version>2.1.2</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-kafka-${kafka.version}_${scala.binary.version}</artifactId>
<version>${spark.version}</version>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>${scala.library.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_${scala.binary.version}</artifactId>
<version>${spark.version}</version>
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-clients</artifactId>
<version>0.11.0.2</version>
</dependency>
</dependencies>
This is the exception:
at org.apache.spark.streaming.kafka010.DirectKafkaInputDStream$$anonfun$start$1.apply(DirectKafkaInputDStream.scala:246)
at org.apache.spark.streaming.kafka010.DirectKafkaInputDStream$$anonfun$start$1.apply(DirectKafkaInputDStream.scala:245)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:244)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
at scala.collection.mutable.AbstractSet.scala$collection$SetLike$$super$map(Set.scala:45)
at scala.collection.SetLike$class.map(SetLike.scala:93)
at scala.collection.mutable.AbstractSet.map(Set.scala:45)
at org.apache.spark.streaming.kafka010.DirectKafkaInputDStream.start(DirectKafkaInputDStream.scala:245)
at org.apache.spark.streaming.DStreamGraph$$anonfun$start$5.apply(DStreamGraph.scala:47)
at org.apache.spark.streaming.DStreamGraph$$anonfun$start$5.apply(DStreamGraph.scala:47)
at scala.collection.parallel.mutable.ParArray$ParArrayIterator.foreach_quick(ParArray.scala:145)
at scala.collection.parallel.mutable.ParArray$ParArrayIterator.foreach(ParArray.scala:138)
at scala.collection.parallel.ParIterableLike$Foreach.leaf(ParIterableLike.scala:975)
at scala.collection.parallel.Task$$anonfun$tryLeaf$1.apply$mcV$sp(Tasks.scala:54)
at scala.collection.parallel.Task$$anonfun$tryLeaf$1.apply(Tasks.scala:53)
at scala.collection.parallel.Task$$anonfun$tryLeaf$1.apply(Tasks.scala:53)
at scala.collection.parallel.Task$class.tryLeaf(Tasks.scala:56)
at scala.collection.parallel.ParIterableLike$Foreach.tryLeaf(ParIterableLike.scala:972)
at scala.collection.parallel.AdaptiveWorkStealingTasks$WrappedTask$class.compute(Tasks.scala:165)
at scala.collection.parallel.AdaptiveWorkStealingForkJoinTasks$WrappedTask.compute(Tasks.scala:514)
at scala.concurrent.forkjoin.RecursiveAction.exec(RecursiveAction.java:160)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
When I grep for "_2.1" in the output of mvn dependency:tree -Dverbose, I don't see any references to 2.10.
[INFO] +- org.apache.spark:spark-core_2.11:jar:2.2.1:compile
[INFO] | +- com.twitter:chill_2.11:jar:0.8.0:compile
[INFO] | +- org.apache.spark:spark-launcher_2.11:jar:2.2.1:compile
[INFO] | | +- (org.apache.spark:spark-tags_2.11:jar:2.2.1:compile - omitted for duplicate)
[INFO] | +- org.apache.spark:spark-network-common_2.11:jar:2.2.1:compile
[INFO] | +- org.apache.spark:spark-network-shuffle_2.11:jar:2.2.1:compile
[INFO] | | +- (org.apache.spark:spark-network-common_2.11:jar:2.2.1:compile - omitted for duplicate)
[INFO] | +- org.apache.spark:spark-unsafe_2.11:jar:2.2.1:compile
[INFO] | | +- (org.apache.spark:spark-tags_2.11:jar:2.2.1:compile - omitted for duplicate)
[INFO] | | +- (com.twitter:chill_2.11:jar:0.8.0:compile - omitted for duplicate)
[INFO] | +- org.json4s:json4s-jackson_2.11:jar:3.2.11:compile
[INFO] | | +- org.json4s:json4s-core_2.11:jar:3.2.11:compile
[INFO] | | | +- org.json4s:json4s-ast_2.11:jar:3.2.11:compile
[INFO] | | | +- org.scala-lang.modules:scala-xml_2.11:jar:1.0.1:compile
[INFO] | | | \- (org.scala-lang.modules:scala-parser-combinators_2.11:jar:1.0.1:compile - omitted for conflict with 1.0.4)
[INFO] | +- com.fasterxml.jackson.module:jackson-module-scala_2.11:jar:2.6.5:compile
[INFO] | +- org.apache.spark:spark-tags_2.11:jar:2.2.1:compile
[INFO] +- org.apache.spark:spark-sql_2.11:jar:2.2.1:compile
[INFO] | +- org.apache.spark:spark-sketch_2.11:jar:2.2.1:compile
[INFO] | | +- (org.apache.spark:spark-tags_2.11:jar:2.2.1:compile - omitted for duplicate)
[INFO] | +- (org.apache.spark:spark-core_2.11:jar:2.2.1:compile - omitted for duplicate)
[INFO] | +- org.apache.spark:spark-catalyst_2.11:jar:2.2.1:compile
[INFO] | | +- (org.apache.spark:spark-core_2.11:jar:2.2.1:compile - omitted for duplicate)
[INFO] | | +- (org.apache.spark:spark-tags_2.11:jar:2.2.1:compile - omitted for duplicate)
[INFO] | | +- (org.apache.spark:spark-unsafe_2.11:jar:2.2.1:compile - omitted for duplicate)
[INFO] | | +- (org.apache.spark:spark-sketch_2.11:jar:2.2.1:compile - omitted for duplicate)
[INFO] | +- (org.apache.spark:spark-tags_2.11:jar:2.2.1:compile - omitted for duplicate)
[INFO] +- org.apache.spark:spark-hive_2.11:jar:2.2.1:compile
[INFO] | +- (org.apache.spark:spark-core_2.11:jar:2.2.1:compile - omitted for duplicate)
[INFO] | +- (org.apache.spark:spark-sql_2.11:jar:2.2.1:compile - omitted for duplicate)
[INFO] +- com.typesafe.scala-logging:scala-logging-slf4j_2.11:jar:2.1.2:compile
[INFO] | +- com.typesafe.scala-logging:scala-logging-api_2.11:jar:2.1.2:compile
[INFO] +- org.apache.spark:spark-streaming-kafka-0-10_2.11:jar:2.2.1:compile
[INFO] | +- org.apache.kafka:kafka_2.11:jar:0.10.0.1:compile
[INFO] | | +- org.scala-lang.modules:scala-parser-combinators_2.11:jar:1.0.4:compile
[INFO] | +- (org.apache.spark:spark-tags_2.11:jar:2.2.1:compile - omitted for duplicate)
[INFO] +- org.apache.spark:spark-streaming_2.11:jar:2.2.1:compile
[INFO] | +- (org.apache.spark:spark-core_2.11:jar:2.2.1:compile - omitted for duplicate)
[INFO] | +- (org.apache.spark:spark-tags_2.11:jar:2.2.1:compile - omitted for duplicate)
Also, I should state that I am running an uber jar on the Spark server via spark-submit. The uber jar includes the artifacts below. I added the Scala jars as a last resort to solve the problem, but it makes no difference whether I include them or not.
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>3.1.0</version>
<configuration>
<shadedArtifactAttached>false</shadedArtifactAttached>
<keepDependenciesWithProvidedScope>false</keepDependenciesWithProvidedScope>
<artifactSet>
<includes>
<include>org.apache.kafka:spark*</include>
<include>org.apache.spark:spark-streaming-kafka-${kafka.version}_${scala.binary.version}
</include>
<include>org.apache.kafka:kafka_${scala.binary.version}</include>
<include>org.apache.kafka:kafka-clients</include>
<include>org.apache.spark:*</include>
<include>org.scala-lang:scala-library</include>
</includes>
<excludes>
<exclude>org.apache.hadoop:*</exclude>
<exclude>com.fasterxml:*</exclude>
</excludes>
</artifactSet>
<transformers>
<transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
<resource>META-INF/services/javax.ws.rs.ext.Providers</resource>
</transformer>
</transformers>
</configuration>
<executions>
<execution>
<goals>
<goal>shade</goal>
</goals>
</execution>
</executions>
</plugin>
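As a sanity check, you can also list which scala-library classes actually ended up inside the uber jar (the jar path here is a placeholder):
jar tf target/your-app.jar | grep scala/Predef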
Scala 2.11 does not work with Java 7: https://scala-lang.org/download/2.11.8.html. Please update Java to 8
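A quick way to confirm which Scala and Java versions the application actually runs with is to print them at startup; a minimal sketch:
// Prints e.g. "version 2.11.8" and the version of the JVM the driver is running on.
println(scala.util.Properties.versionString)
println(scala.util.Properties.javaVersion)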
Finally, I got it to work for my environment. The changes I made were: Scala 2.10.6, Java 1.7, and Spark 2.0.0.
For completeness, here is my pom.xml:
<properties>
<spark.version>2.0.0</spark.version>
<scala.version>2.10.6</scala.version>
<scala.library.version>2.10.6</scala.library.version>
<scala.binary.version>2.10</scala.binary.version>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
<java.source.version>1.7</java.source.version>
<java.compile.version>1.7</java.compile.version>
<kafka.version>0-10</kafka.version>
</properties>
<dependencies>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_${scala.binary.version}</artifactId>
<version>${spark.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_${scala.binary.version}</artifactId>
<version>${spark.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_${scala.binary.version}</artifactId>
<version>${spark.version}</version>
</dependency>
<dependency>
<groupId>com.typesafe.scala-logging</groupId>
<artifactId>scala-logging-slf4j_${scala.binary.version}</artifactId>
<version>2.1.2</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming-kafka-${kafka.version}_${scala.binary.version}</artifactId>
<version>${spark.version}</version>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>${scala.library.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-streaming_${scala.binary.version}</artifactId>
<version>${spark.version}</version>
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-clients</artifactId>
<version>0.11.0.2</version>
</dependency>
</dependencies>
<build>
<sourceDirectory>src/main/java</sourceDirectory>
<testSourceDirectory>src/test/java</testSourceDirectory>
<resources>
<resource>
<directory>src/main/resources</directory>
</resource>
</resources>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>3.1.0</version>
<configuration>
<shadedArtifactAttached>false</shadedArtifactAttached>
<keepDependenciesWithProvidedScope>false</keepDependenciesWithProvidedScope>
<artifactSet>
<includes>
<include>org.apache.kafka:spark*</include>
<include>org.apache.spark:spark-streaming-kafka-${kafka.version}_${scala.binary.version}
</include>
<include>org.apache.kafka:kafka_${scala.binary.version}</include>
<include>org.apache.kafka:kafka-clients</include>
<include>org.apache.spark:*</include>
<include>org.scala-lang:scala-library</include>
</includes>
<excludes>
<exclude>org.apache.hadoop:*</exclude>
<exclude>com.fasterxml:*</exclude>
</excludes>
</artifactSet>
<transformers>
<transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
<resource>META-INF/services/javax.ws.rs.ext.Providers</resource>
</transformer>
</transformers>
</configuration>
<executions>
<execution>
<goals>
<goal>shade</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.1</version>
<configuration>
<source>${java.source.version}</source>
<target>${java.compile.version}</target>
</configuration>
</plugin>
<plugin>
<groupId>org.scala-tools</groupId>
<artifactId>maven-scala-plugin</artifactId>
<version>2.15.2</version>
<executions>
<execution>
<id>compile</id>
<goals>
<goal>compile</goal>
</goals>
<phase>compile</phase>
</execution>
<execution>
<id>test-compile</id>
<goals>
<goal>testCompile</goal>
</goals>
<phase>test-compile</phase>
</execution>
<execution>
<phase>process-resources</phase>
<goals>
<goal>compile</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-eclipse-plugin</artifactId>
<version>2.9</version>
<configuration>
<sourceIncludes>
<sourceInclude>**/*.scala</sourceInclude>
</sourceIncludes>
<projectNameTemplate>[artifactId]</projectNameTemplate>
<projectnatures>
<projectnature>org.scala-ide.sdt.core.scalanature</projectnature>
<projectnature>org.eclipse.m2e.core.maven2Nature</projectnature>
<projectnature>org.eclipse.jdt.core.javanature</projectnature>
</projectnatures>
<buildcommands>
<buildcommand>org.eclipse.m2e.core.maven2Builder</buildcommand>
<buildcommand>org.scala-ide.sdt.core.scalabuilder</buildcommand>
</buildcommands>
<classpathContainers>
<classpathContainer>org.scala-ide.sdt.launching.SCALA_CONTAINER</classpathContainer>
</classpathContainers>
<excludes>
<exclude>org.scala-lang:scala-library</exclude>
<exclude>org.scala-lang:scala-compiler</exclude>
</excludes>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>2.6</version>
<configuration>
<archive>
<manifestEntries>
<Implementation-Version>${project.version}</Implementation-Version>
<SCM-Revision>1.0</SCM-Revision>
</manifestEntries>
</archive>
</configuration>
</plugin>
</plugins>
</build>