Cucumber-Scala BDD test with Maven - scala

I am trying to write Cucumber BDD tests for my Scala Maven project. I added the dependencies and the JARs show up in IntelliJ, but for some unknown reason I am not able to import the Cucumber packages via import statements.
My pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns="http://maven.apache.org/POM/4.0.0"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0
http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.cuke</groupId>
<artifactId>xdq</artifactId>
<version>0.1.0</version>
</parent>
<artifactId>cuke-test</artifactId>
<dependencies>
<dependency>
<groupId>org.json4s</groupId>
<artifactId>json4s-native_2.11</artifactId>
<version>3.5.2</version>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>2.11.8</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.0.2</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>2.0.2</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_2.11</artifactId>
<version>2.0.2</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.databricks</groupId>
<artifactId>spark-csv_2.11</artifactId>
<version>1.4.0</version>
</dependency>
<dependency>
<groupId>org.json</groupId>
<artifactId>json</artifactId>
<version>20160810</version>
</dependency>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-core</artifactId>
<version>2.8.2</version>
</dependency>
<dependency>
<groupId>org.exadatum</groupId>
<artifactId>json-generator</artifactId>
<version>0.1.0</version>
</dependency>
<dependency>
<groupId>info.cukes</groupId>
<artifactId>cucumber-junit</artifactId>
<version>1.2.4</version>
</dependency>
<dependency>
<groupId>info.cukes</groupId>
<artifactId>cucumber-core</artifactId>
<version>1.2.4</version>
</dependency>
<dependency>
<groupId>org.scalatest</groupId>
<artifactId>scalatest_2.11</artifactId>
<version>3.0.3</version>
</dependency>
<dependency>
<groupId>info.cukes</groupId>
<artifactId>cucumber-scala_2.11</artifactId>
<version>1.2.4</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<executions>
<!-- Run shade goal on package phase -->
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
</execution>
</executions>
<configuration>
<filters>
<filter>
<artifact>*:*</artifact>
<excludes>
<exclude>META-INF/*.SF</exclude>
<exclude>META-INF/*.DSA</exclude>
<exclude>META-INF/*.RSA</exclude>
</excludes>
</filter>
</filters>
</configuration>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>build-helper-maven-plugin</artifactId>
<extensions>true</extensions>
<executions>
<execution>
<id>attach-azkaban-job-artifact</id>
<phase>package</phase>
<goals>
<goal>attach-artifact</goal>
</goals>
<configuration>
<artifacts>
<artifact>
<file>${project.build.directory}/${project.artifactId}.job</file>
<type>job</type>
</artifact>
</artifacts>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>com.exadatum.sensorium.maven</groupId>
<artifactId>template-maven-plugin</artifactId>
<version>1.0.1</version>
<executions>
<execution>
<id>generate-azkaban-job-file</id>
<phase>generate-resources</phase>
<goals>
<goal>generate</goal>
</goals>
<configuration>
<outputFileName>${project.artifactId}.job</outputFileName>
<template>azkaban-job.vm</template>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<executions>
<execution>
<id>module-assembly</id>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
<configuration>
<appendAssemblyId>false</appendAssemblyId>
<descriptors>
<descriptor>
${multi.module.project.root.dir}/libraries/common/src/main/resources/assemblies/module-tar-assembly.xml
</descriptor>
</descriptors>
<!--<tarLongFileMode>posix</tarLongFileMode>-->
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>3.2.0</version>
<configuration>
<checkMultipleScalaVersions>false</checkMultipleScalaVersions>
<recompileMode>modified-only</recompileMode>
<useZincServer>false</useZincServer>
<scalaVersion>2.11.8</scalaVersion>
</configuration>
<executions>
<execution>
<id>scala-compile-first</id>
<phase>process-resources</phase>
<goals>
<goal>add-source</goal>
<goal>compile</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.2</version>
<configuration>
<source>1.8</source>
<target>1.8</target>
</configuration>
</plugin>
</plugins>
</build>
</project>
Let me know where I am going wrong.
I tried creating a Scala class extending ScalaDsl, but the import won't resolve.

I created an empty Scala project and added Maven. I copied your pom.xml and removed the <parent></parent> block as well as <artifactId>cuke-test</artifactId>. There are other dependency problems, but then again, I don't know the structure of your project.
After this I created a class:
import cucumber.api.scala.ScalaDsl
class TestClass extends ScalaDsl {
}
and the import works.
Have a look here for an example project / pom.xml, and here for more info about <parent> and POM inheritance.
Hope you get it working.
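For reference, a minimal sketch of the Cucumber dependencies for a Scala 2.11 module, with versions taken from the pom above. The test scope (and junit 4.12) is an assumption on my part, since step definitions normally live under src/test/scala; cucumber-scala_2.11 and cucumber-junit already pull in cucumber-core transitively, so it does not need to be declared separately:
<dependency>
  <groupId>info.cukes</groupId>
  <artifactId>cucumber-scala_2.11</artifactId>
  <version>1.2.4</version>
  <scope>test</scope>
</dependency>
<dependency>
  <groupId>info.cukes</groupId>
  <artifactId>cucumber-junit</artifactId>
  <version>1.2.4</version>
  <scope>test</scope>
</dependency>
<!-- junit version is assumed; use whatever your parent pom manages -->
<dependency>
  <groupId>junit</groupId>
  <artifactId>junit</artifactId>
  <version>4.12</version>
  <scope>test</scope>
</dependency>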

Related

Unable to see libraries downloaded from mvn in packaged JAR

This is my pom.xml:
<?xml version='1.0' encoding='UTF-8'?>
<project xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://maven.apache.org/POM/4.0.0">
<modelVersion>4.0.0</modelVersion>
<groupId>default</groupId>
<artifactId>factory-servicing</artifactId>
<packaging>jar</packaging>
<description>factory-servicing</description>
<version>0.1</version>
<name>factory-servicing</name>
<organization>
<name>default</name>
</organization>
<dependencies>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.12</artifactId>
<version>3.1.2</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.12</artifactId>
<version>3.1.2</version>
</dependency>
<dependency>
<groupId>com.ibm.db2.jcc</groupId>
<artifactId>db2jcc</artifactId>
<version>db2jcc4</version>
</dependency>
<dependency>
<groupId>org.scalikejdbc</groupId>
<artifactId>scalikejdbc_2.12</artifactId>
<version>3.1.0</version>
</dependency>
<dependency>
<groupId>com.typesafe</groupId>
<artifactId>config</artifactId>
<version>1.4.2</version>
</dependency>
<dependency>
<groupId>org.scalatest</groupId>
<artifactId>scalatest_2.12</artifactId>
<version>3.0.8</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.specs2</groupId>
<artifactId>specs2-core_${scala.compat.version}</artifactId>
<version>${spec2.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.specs2</groupId>
<artifactId>specs2-junit_${scala.compat.version}</artifactId>
<version>${spec2.version}</version>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<sourceDirectory>src/main/scala</sourceDirectory>
<testSourceDirectory>src/test/scala</testSourceDirectory>
<plugins>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>3.3.2</version>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>testCompile</goal>
</goals>
<configuration>
<args>
<arg>-dependencyfile</arg>
<arg>${project.build.directory}/.scala_dependencies</arg>
</args>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.21.0</version>
<configuration>
<skipTests>true</skipTests>
</configuration>
</plugin>
<plugin>
<groupId>org.scalatest</groupId>
<artifactId>scalatest-maven-plugin</artifactId>
<version>2.0.0</version>
<configuration>
<reportsDirectory>${project.build.directory}/surefire-reports</reportsDirectory>
<junitxml>.</junitxml>
<filereports>TestSuiteReport.txt</filereports>
</configuration>
<executions>
<execution>
<id>test</id>
<goals>
<goal>test</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
<profiles>
<profile>
<id>uber-jar</id>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>3.2.0</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<filters>
<filter>
<artifact>*:*</artifact>
<excludes>
<exclude>META-INF/*.SF</exclude>
<exclude>META-INF/*.DSA</exclude>
<exclude>META-INF/*.RSA</exclude>
</excludes>
</filter>
</filters>
<createDependencyReducedPom>false</createDependencyReducedPom>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_${scala.dep.version}</artifactId>
<version>${spark.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_${scala.dep.version}</artifactId>
<version>${spark.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>com.ibm.db2.jcc</groupId>
<artifactId>db2jcc</artifactId>
<version>db2jcc4</version>
</dependency>
<dependency>
<groupId>org.scalikejdbc</groupId>
<artifactId>scalikejdbc_2.12</artifactId>
<version>3.1.0</version>
</dependency>
<dependency>
<groupId>com.typesafe</groupId>
<artifactId>config</artifactId>
<version>1.4.2</version>
</dependency>
</dependencies>
</profile>
</profiles>
</project>
As you can see, in the outer <dependencies> section I don't have anything scoped as provided, because I need all the libraries to run my test cases.
However, in the uber-jar profile's dependencies, which is where the maven-shade-plugin is used, I mark the Spark dependencies as provided, while the rest of the libraries from the outer section stay unchanged. This is because the Spark JARs will be supplied at runtime, but the rest still need to end up in the packaged JAR.
When I run mvn package, it generates a JAR file, which I then extract. However, the extracted JAR contains only the class files I wrote in my project; none of the other libraries that are not scoped as provided are there.
How do I ensure that the non-provided libraries are also included as part of the mvn package command execution?
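One note on the pom as posted: the maven-shade-plugin sits only inside the uber-jar profile, so a plain mvn package never runs the shade goal and the default JAR contains just your own classes. Either activate the profile when packaging (mvn package -Puber-jar) or move the shade plugin into the main <build> section. A rough sketch of the latter, copied from the profile:
<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>3.2.0</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals>
            <goal>shade</goal>
          </goals>
          <configuration>
            <!-- strip signature files so the uber JAR is not rejected -->
            <filters>
              <filter>
                <artifact>*:*</artifact>
                <excludes>
                  <exclude>META-INF/*.SF</exclude>
                  <exclude>META-INF/*.DSA</exclude>
                  <exclude>META-INF/*.RSA</exclude>
                </excludes>
              </filter>
            </filters>
            <createDependencyReducedPom>false</createDependencyReducedPom>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>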

scala log4j vulnerability

I am using the Scala dependencies below in my pom.xml, but the build fails with a log4j vulnerability error.
<properties>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<scala.major.version>2.11</scala.major.version>
<scala.minor.version>2.11.12</scala.minor.version>
<gridgain.version>8.7.8</gridgain.version>
<ignite.version>2.7.0</ignite.version>
<spark.version>2.3.0</spark.version>
<spring.boot.version>2.4.0</spring.boot.version>
<maven-release-plugin-version>2.5.3</maven-release-plugin-version>
<maven-assembly-plugin-version>3.1.1</maven-assembly-plugin-version>
</properties>
<dependencyManagement>
<dependencies>
<dependency>
<groupId>org.gridgain</groupId>
<artifactId>gridgain-core</artifactId>
<version>${gridgain.version}</version>
</dependency>
<dependency>
<groupId>org.gridgain</groupId>
<artifactId>ignite-spring</artifactId>
<version>${gridgain.version}</version>
</dependency>
<dependency>
<groupId>org.gridgain</groupId>
<artifactId>ignite-indexing</artifactId>
<version>${gridgain.version}</version>
</dependency>
<dependency>
<groupId>org.gridgain</groupId>
<artifactId>ignite-log4j2</artifactId>
<version>${gridgain.version}</version>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-dependencies</artifactId>
<version>2.1.3.RELEASE</version>
<type>pom</type>
<scope>import</scope>
</dependency>
<dependency>
<groupId>com.typesafe.scala-logging</groupId>
<artifactId>scala-logging_${scala.major.version}</artifactId>
<version>3.9.0</version>
</dependency>
<dependency>
<groupId>org.scalatest</groupId>
<artifactId>scalatest_${scala.major.version}</artifactId>
<version>3.0.4</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>${scala.minor.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-reflect</artifactId>
<version>${scala.minor.version}</version>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>org.apache.maven.surefire</groupId>
<artifactId>surefire-junit4</artifactId>
<version>2.22.1</version>
</dependency>
</dependencies>
</dependencyManagement>
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.12</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.scalatest</groupId>
<artifactId>scalatest_${scala.major.version}</artifactId>
</dependency>
</dependencies>
<build>
<pluginManagement>
<plugins>
<plugin>
<artifactId>maven-release-plugin</artifactId>
<version>2.5.3</version>
</plugin>
<plugin>
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>4.0.1</version>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>testCompile</goal>
</goals>
</execution>
</executions>
<configuration>
<args>
<!-- work-around for https://issues.scala-lang.org/browse/SI-8358 -->
<arg>-nobootcp</arg>
</args>
</configuration>
</plugin>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<version>${spring.boot.version}</version>
<executions>
<execution>
<goals>
<goal>repackage</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
<version>${maven-assembly-plugin-version}</version>
<configuration>
<finalName>${project.build.finalName}</finalName>
<appendAssemblyId>false</appendAssemblyId>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.22.1</version>
<configuration>
<skipTests>true</skipTests>
</configuration>
</plugin>
<plugin>
<groupId>org.scalatest</groupId>
<artifactId>scalatest-maven-plugin</artifactId>
<version>2.0.0</version>
<configuration>
<reportsDirectory>${project.build.directory}/surefire-reports</reportsDirectory>
<junitxml>.</junitxml>
</configuration>
<executions>
<execution>
<id>test</id>
<goals>
<goal>test</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>com.soebes.maven.plugins</groupId>
<artifactId>iterator-maven-plugin</artifactId>
<version>0.5.0</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>iterator</goal>
</goals>
<configuration>
<folder>../deployment/config</folder>
<pluginExecutors>
<pluginExecutor>
<goal>single</goal>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
<version>${maven-assembly-plugin-version}</version>
</plugin>
<configuration>
<finalName>${project.artifactId}</finalName>
</configuration>
</pluginExecutor>
</pluginExecutors>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
It fails with the error below.
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:4.0.1:compile (default) on project Execution default of goal net.alchim31.maven:scala-maven-plugin:4.0.1:compile failed: Plugin net.alchim31.maven:scala-maven-plugin:4.0.1 or one of its dependencies could not be resolved: Could not transfer artifact org.apache.logging.log4j:log4j-core:jar:2.8.1 /org/apache/logging/log4j/log4j-core/2.8.1/log4j-core-2.8.1.jar. Error code 403, Requested item is quarantined -> [Help 1]
Which dependency is internally using log4j-core-2.8.1.jar, and how do I update log4j to the latest 2.17.0?
Try using https://mvnrepository.com/artifact/net.alchim31.maven/scala-maven-plugin/4.5.6.
This is the latest version of the plugin, and it uses the latest log4j 2.17 internally.
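A sketch of the corresponding change in the pluginManagement section, keeping the executions and compiler args from the question and only bumping the version (the log4j 2.17 claim is per the suggestion above):
<plugin>
  <groupId>net.alchim31.maven</groupId>
  <artifactId>scala-maven-plugin</artifactId>
  <!-- 4.5.6 is said to pull in log4j 2.17 instead of the quarantined 2.8.1 -->
  <version>4.5.6</version>
  <executions>
    <execution>
      <goals>
        <goal>compile</goal>
        <goal>testCompile</goal>
      </goals>
    </execution>
  </executions>
  <configuration>
    <args>
      <!-- work-around for https://issues.scala-lang.org/browse/SI-8358 -->
      <arg>-nobootcp</arg>
    </args>
  </configuration>
</plugin>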

Unable to run a Spark Scala JAR on GCP Dataproc

I have created a simple Spark application in Scala which runs fine locally. I am using Maven as the build tool and packaging the JAR file with the shade plugin. The directory structure follows the standard src/main/scala and src/test/scala layout.
I am using the following pom.xml:
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>cl.aman.fund</groupId>
<artifactId>sym_data_decryptor</artifactId>
<version>0.0.1-SNAPSHOT</version>
<packaging>jar</packaging>
<properties>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
<encoding>UTF-8</encoding>
<scala.version>2.11.12</scala.version>
</properties>
<build>
<sourceDirectory>src/main/scala</sourceDirectory>
<testSourceDirectory>src/test/scala</testSourceDirectory>
<resources>
<resource>
<directory>/src/resources/</directory>
<filtering>false</filtering>
<includes>
<include>hbase-site.xml</include>
</includes>
</resource>
</resources>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.7</version>
<configuration>
<skipTests>true</skipTests>
</configuration>
</plugin>
<plugin>
<!-- see http://davidb.github.com/scala-maven-plugin -->
<groupId>net.alchim31.maven</groupId>
<artifactId>scala-maven-plugin</artifactId>
<version>4.3.0</version>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>testCompile</goal>
</goals>
<configuration>
<args>
<!-- <arg>-make:transitive</arg> -->
<arg>-dependencyfile</arg>
<arg>${project.build.directory}/.scala_dependencies</arg>
</args>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<version>2.4.1</version>
<configuration>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
<executions>
<execution>
<id>make-assembly</id>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.scalatest</groupId>
<artifactId>scalatest-maven-plugin</artifactId>
<version>1.0</version>
<configuration>
<reportsDirectory>${project.build.directory}/surefire-reports</reportsDirectory>
<junitxml>.</junitxml>
<filereports>WDF TestSuite.txt</filereports>
</configuration>
<executions>
<execution>
<id>test</id>
<goals>
<goal>test</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.jacoco</groupId>
<artifactId>jacoco-maven-plugin</artifactId>
<version>0.8.5</version>
<executions>
<execution>
<goals>
<goal>prepare-agent</goal>
</goals>
</execution>
<execution>
<id>report</id>
<phase>test</phase>
<goals>
<goal>report</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<artifactId>maven-shade-plugin</artifactId>
<version>3.2.1</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<transformers>
<transformer
implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer"/>
</transformers>
<filters>
<filter>
<artifact>*:*</artifact>
<excludes>
<exclude>META-INF/*.SF</exclude>
<exclude>META-INF/*.DSA</exclude>
<exclude>META-INF/*.RSA</exclude>
</excludes>
</filter>
</filters>
<shadedArtifactAttached>true</shadedArtifactAttached>
</configuration>
</execution>
</executions>
</plugin>
</plugins>
</build>
<dependencies>
<!-- Scala and Spark dependencies -->
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>${scala.version}</version>
</dependency>
<dependency>
<groupId>com.typesafe.scala-logging</groupId>
<artifactId>scala-logging_2.11</artifactId>
<version>3.9.2</version>
</dependency>
<!-- Spark avro -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-avro_2.11</artifactId>
<version>2.4.4</version>
</dependency>
<dependency>
<groupId>com.google.cloud.bigdataoss</groupId>
<artifactId>gcs-connector</artifactId>
<version>hadoop2-1.9.17</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>2.4.4</version>
</dependency>
<dependency>
<groupId>com.google.cloud.bigdataoss</groupId>
<artifactId>bigquery-connector</artifactId>
<version>hadoop2-0.13.9</version>
</dependency>
<dependency>
<groupId>org.springframework.security</groupId>
<artifactId>spring-security-core</artifactId>
<version>5.2.1.RELEASE</version>
</dependency>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-xml</artifactId>
<version>2.11.0-M4</version>
</dependency>
<dependency>
<groupId>org.scalactic</groupId>
<artifactId>scalactic_2.11</artifactId>
<version>3.1.0</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.scalamock</groupId>
<artifactId>scalamock-scalatest-support_2.11</artifactId>
<version>3.6.0</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.json4s</groupId>
<artifactId>json4s-native_2.11</artifactId>
<version>3.6.6</version>
</dependency>
<dependency>
<groupId>org.scalatest</groupId>
<artifactId>scalatest_2.11</artifactId>
<version>3.1.0</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>2.4.4</version>
<type>test-jar</type>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>2.4.4</version>
<type>test-jar</type>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-core</artifactId>
<version>2.13.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-catalyst_2.11</artifactId>
<version>2.4.4</version>
<type>test-jar</type>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.logging.log4j</groupId>
<artifactId>log4j-api-scala_2.12</artifactId>
<version>11.0</version>
</dependency>
<dependency>
<groupId>com.typesafe</groupId>
<artifactId>config</artifactId>
<version>1.2.0</version>
</dependency>
<dependency>
<groupId>org.scalacheck</groupId>
<artifactId>scalacheck_2.11</artifactId>
<version>1.14.2</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-avro_2.11</artifactId>
<version>2.4.4</version>
<type>test-jar</type>
<scope>test</scope>
</dependency>
</dependencies>
</project>
Command used to submit job on Dataproc cluster:
gcloud dataproc jobs submit spark \
--cluster <cluster_name> \
--region <region> \
--class cl.aman.symphony.commons.DecryptorApplication \
--jars gs://<bucket_name>/sym_data_decryptor-0.0.1-SNAPSHOT-jar-with-dependencies.jar \
-- \
"my_Input"
Error:
Exception in thread "main" java.lang.NoSuchMethodError: org.json4s.Serialization$class.read(Lorg/json4s/Serialization;Ljava/lang/String;Lorg/json4s/Formats;Lscala/reflect/Manifest;)Ljava/lang/Object;
at org.json4s.native.Serialization$.read(Serialization.scala:32)
at cl.falabella.symphony.commons.DecryptorApplication$.makeRequest(DecryptorApplication.scala:44)
at cl.falabella.symphony.commons.DecryptorApplication$.main(DecryptorApplication.scala:65)
at cl.falabella.symphony.commons.DecryptorApplication.main(DecryptorApplication.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:890)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:192)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:217)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Can someone help me with the Scala JAR packaging?
You are using json4s-native_2.11 3.6.6, but spark-core 2.4.4, the core of the Spark framework, uses json4s 3.5.3, which makes the two json4s versions binary-incompatible. I recommend switching to json4s 3.5.3.
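A minimal sketch of that change in the pom above, leaving everything else as posted and only aligning json4s with the version Spark 2.4.4 brings in:
<dependency>
  <groupId>org.json4s</groupId>
  <artifactId>json4s-native_2.11</artifactId>
  <!-- matches the json4s version used by spark-core 2.4.4 -->
  <version>3.5.3</version>
</dependency>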
Your plugin setup for bundling dependencies might be configured incorrectly.
Mine has some differences and works perfectly:
<build>
<plugins>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
<configuration>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
</plugin>
</plugins>
</build>
Remember to submit the JAR whose name ends in "-jar-with-dependencies" in your spark-submit / gcloud command.

spring boot mongo querydsl with kotlin not generating meta data classes

pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.2.1.RELEASE</version>
<relativePath/> <!-- lookup parent from repository -->
</parent>
<groupId>com.t1</groupId>
<artifactId>t1example</artifactId>
<version>0.0.1-SNAPSHOT</version>
<name>t1</name>
<properties>
<java.version>1.8</java.version>
<kotlin.version>1.3.21</kotlin.version>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-jpa</artifactId>
</dependency>
<dependency>
<groupId>org.jetbrains.kotlin</groupId>
<artifactId>kotlin-reflect</artifactId>
</dependency>
<dependency>
<groupId>org.jetbrains.kotlin</groupId>
<artifactId>kotlin-stdlib-jdk8</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-mongodb</artifactId>
</dependency>
<dependency>
<groupId>com.querydsl</groupId>
<artifactId>querydsl-mongodb</artifactId>
<version>4.2.1</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
<plugin>
<groupId>org.jetbrains.kotlin</groupId>
<artifactId>kotlin-maven-plugin</artifactId>
<configuration>
<jvmTarget>1.8</jvmTarget>
<compilerPlugins>
<plugin>jpa</plugin>
<plugin>spring</plugin>
</compilerPlugins>
<args>
<arg>-Xjsr305=strict</arg>
</args>
</configuration>
<executions>
<execution>
<id>compile</id>
<phase>process-sources</phase>
<goals>
<goal>compile</goal>
</goals>
</execution>
<execution>
<id>kapt</id>
<phase>generate-sources</phase>
<goals>
<goal>kapt</goal>
</goals>
<configuration>
<sourceDirs>
<sourceDir>src/main/kotlin</sourceDir>
</sourceDirs>
<annotationProcessorPaths>
<annotationProcessorPath>
<groupId>com.querydsl</groupId>
<artifactId>querydsl-apt</artifactId>
<version>${querydsl.version}</version>
<classifier>jpa</classifier>
</annotationProcessorPath>
</annotationProcessorPaths>
</configuration>
</execution>
</executions>
<dependencies>
<dependency>
<groupId>org.jetbrains.kotlin</groupId>
<artifactId>kotlin-maven-noarg</artifactId>
<version>${kotlin.version}</version>
</dependency>
<dependency>
<groupId>org.jetbrains.kotlin</groupId>
<artifactId>kotlin-maven-allopen</artifactId>
<version>${kotlin.version}</version>
</dependency>
</dependencies>
</plugin>
</plugins>
</build>
</project>
Record.kt
package com.t1.mmkqe.entity
import org.springframework.data.annotation.Id
import org.springframework.data.mongodb.core.mapping.Document
@Document(collection = "Record")
data class Record(
    @Id val name: String = ""
)
It does not generate the Q classes.
But if I use the JPA @Entity annotation, it does.
I have tried many variants of the pom (and Gradle), different versions, and many examples from the internet, but without success.
Perhaps it is impossible.
I found out how to configure it, and now it works:
<build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
</plugin>
<plugin>
<groupId>org.jetbrains.kotlin</groupId>
<artifactId>kotlin-maven-plugin</artifactId>
<configuration>
<jvmTarget>1.8</jvmTarget>
<compilerPlugins>
<plugin>jpa</plugin>
<plugin>spring</plugin>
</compilerPlugins>
<args>
<arg>-Xjsr305=strict</arg>
</args>
</configuration>
<executions>
<execution>
<id>compile</id>
<phase>process-sources</phase>
<goals>
<goal>compile</goal>
</goals>
</execution>
<execution>
<id>kapt</id>
<phase>generate-sources</phase>
<goals>
<goal>kapt</goal>
</goals>
<configuration>
<sourceDirs>
<sourceDir>${project.basedir}/src/main/kotlin</sourceDir>
</sourceDirs>
<annotationProcessorPaths>
<annotationProcessorPath>
<groupId>com.querydsl</groupId>
<artifactId>querydsl-apt</artifactId>
<version>${querydsl.version}</version>
<!--<classifier>jpa</classifier>-->
</annotationProcessorPath>
</annotationProcessorPaths>
<annotationProcessors>
<processor>org.springframework.data.mongodb.repository.support.MongoAnnotationProcessor</processor>
</annotationProcessors>
</configuration>
</execution>
</executions>
<dependencies>
<dependency>
<groupId>org.jetbrains.kotlin</groupId>
<artifactId>kotlin-maven-noarg</artifactId>
<version>${kotlin.version}</version>
</dependency>
<dependency>
<groupId>org.jetbrains.kotlin</groupId>
<artifactId>kotlin-maven-allopen</artifactId>
<version>${kotlin.version}</version>
</dependency>
</dependencies>
</plugin>
<plugin>
<artifactId>maven-antrun-plugin</artifactId>
<version>1.8</version>
<executions>
<execution>
<phase>process-sources</phase>
<configuration>
<target>
<move todir="src/generated-sources/annotations"
overwrite="true">
<fileset dir="target/generated-sources/kapt/compile"/>
</move>
</target>
</configuration>
<goals>
<goal>run</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
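For readers comparing the two configurations: the decisive difference is in the kapt execution, where the jpa classifier on querydsl-apt is dropped and the Spring Data MongoDB annotation processor is named explicitly:
<annotationProcessorPaths>
  <annotationProcessorPath>
    <groupId>com.querydsl</groupId>
    <artifactId>querydsl-apt</artifactId>
    <version>${querydsl.version}</version>
    <!-- no jpa classifier when using the Mongo processor -->
  </annotationProcessorPath>
</annotationProcessorPaths>
<annotationProcessors>
  <processor>org.springframework.data.mongodb.repository.support.MongoAnnotationProcessor</processor>
</annotationProcessors>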

Artifact has not been packaged yet MDEP-98

When compiling one of the Patient Matching algorithm projects (ChoiceMaker), I ran into this error.
I have tried a few of the solutions from "Artifact has not been packaged yet", but I still haven't found one that works.
Kindly help.
The error is as follows:
Artifact has not been packaged yet. When used on reactor artifact,
unpack should be executed after packaging: see MDEP-98.
(org.apache.maven.plugins:maven-dependency-plugin:2.10:unpack-dependencies:
e2-standard-platform:pre-integration- ....
The error is reported next to the <execution> tags marked below.
This is the POM file
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"
>
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>com.choicemaker.cm</groupId>
<artifactId>choicemaker-base</artifactId>
<version>2.7.1-SNAPSHOT</version>
<relativePath>../../choicemaker-base/pom.xml</relativePath>
</parent>
<groupId>com.choicemaker.cmit</groupId>
<artifactId>cm-analyzer-ce-it</artifactId>
<name>Integration Test: ChoiceMaker Analyzer, Part 1</name>
<description><![CDATA[
Integration test for the standard-E2 version of CM Analyzer, Community Edition.
]]></description>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
<eclipse.application.dir>${project.build.testOutputDirectory}/eclipse.application.dir</eclipse.application.dir>
<eclipse.application.examples.dir>${eclipse.application.dir}/examples</eclipse.application.examples.dir>
<eclipse.application.plugins.dir>${eclipse.application.dir}/plugins</eclipse.application.plugins.dir>
</properties>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<compilerVersion>1.7</compilerVersion>
<source>1.7</source>
<target>1.7</target>
<compilerArgument>-g</compilerArgument>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<executions>
<execution> ---------> ERROR
<id>plugins</id>
<phase>pre-integration-test</phase>
<goals>
<goal>unpack-dependencies</goal>
</goals>
<configuration>
<includeArtifactIds>cm-analyzer-ce</includeArtifactIds>
<excludeTransitive>true</excludeTransitive>
<outputDirectory>${eclipse.application.dir}</outputDirectory>
</configuration>
</execution>
<execution> ---------> ERROR
<id>e2-standard-platform</id>
<phase>pre-integration-test</phase>
<goals>
<goal>unpack-dependencies</goal>
</goals>
<configuration>
<classifier>eclipse2prj</classifier>
<includeArtifactIds>com.choicemaker.e2.std,com.choicemaker.cm.modelmaker.std</includeArtifactIds>
<excludeTransitive>true</excludeTransitive>
<outputDirectory>${eclipse.application.plugins.dir}</outputDirectory>
</configuration>
</execution>
<execution> ----------> ERROR
<id>examples</id>
<phase>pre-integration-test</phase>
<goals>
<goal>unpack-dependencies</goal>
</goals>
<configuration>
<includeArtifactIds>simple_person_matching</includeArtifactIds>
<excludeTransitive>true</excludeTransitive>
<outputDirectory>${eclipse.application.examples.dir}</outputDirectory>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-failsafe-plugin</artifactId>
<executions>
<execution>
<goals>
<goal>integration-test</goal>
<goal>verify</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
<dependencies>
<dependency>
<groupId>com.choicemaker.e2</groupId>
<artifactId>com.choicemaker.e2</artifactId>
</dependency>
<dependency>
<groupId>com.choicemaker.e2it</groupId>
<artifactId>com.choicemaker.e2it</artifactId>
</dependency>
<dependency>
<groupId>com.choicemaker.e2</groupId>
<artifactId>com.choicemaker.e2.std</artifactId>
</dependency>
<dependency>
<groupId>com.choicemaker.e2</groupId>
<artifactId>com.choicemaker.e2.std</artifactId>
<classifier>eclipse2prj</classifier>
<type>zip</type>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.choicemaker.cm</groupId>
<artifactId>cm-analyzer-ce</artifactId>
<classifier>distrib</classifier>
<type>zip</type>
</dependency>
<dependency>
<groupId>com.choicemaker.cm</groupId>
<artifactId>simple_person_matching</artifactId>
</dependency>
<dependency>
<groupId>com.choicemaker.cm</groupId>
<artifactId>com.choicemaker.cm.modelmaker</artifactId>
</dependency>
<dependency>
<groupId>com.choicemaker.cm</groupId>
<artifactId>com.choicemaker.cm.modelmaker.std</artifactId>
<classifier>eclipse2prj</classifier>
<type>zip</type>
</dependency>
<dependency>
<groupId>com.choicemaker.cm</groupId>
<artifactId>org.eclipse.core.boot</artifactId>
</dependency>
<dependency>
<groupId>com.choicemaker.cm</groupId>
<artifactId>org.eclipse.core.launcher</artifactId>
</dependency>
</dependencies>
</project>