Scala - Build SBT - Caused by: com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'akka.version' [duplicate]

I am learning akka-remoting. This is how my project is structured:
project/pom.xml
project/mymodule/pom.xml
project/mymodule/src/main/resources/application.conf
project/mymodule/src/main/scala/com.harit.akkaio.remote.RemoteApp.scala
project/mymodule/src/main/scala/com.harit.akkaio.remote.ProcessingActor.scala
When I run my project on command-line, I see
$ java -jar akkaio-remote/target/akka-remote-jar-with-dependencies.jar com.harit.akkaio.remote.RemoteApp
Hello:com.harit.akkaio.remote.RemoteApp
Exception in thread "main" com.typesafe.config.ConfigException$Missing: No configuration setting found for key 'akka.version'
at com.typesafe.config.impl.SimpleConfig.findKey(SimpleConfig.java:124)
at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:145)
at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:151)
at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:159)
at com.typesafe.config.impl.SimpleConfig.find(SimpleConfig.java:164)
at com.typesafe.config.impl.SimpleConfig.getString(SimpleConfig.java:206)
at akka.actor.ActorSystem$Settings.<init>(ActorSystem.scala:169)
at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:505)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:142)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:119)
at com.harit.akkaio.remote.RemoteApp$.startProcessingActorSystem(RemoteApp.scala:16)
at com.harit.akkaio.remote.RemoteApp$.main(RemoteApp.scala:12)
at com.harit.akkaio.remote.RemoteApp.main(RemoteApp.scala)
RemoteApp.scala
package com.harit.akkaio.remote
import akka.actor.{ActorRef, ActorSystem, Props}
import com.typesafe.config.ConfigFactory
import scala.concurrent.duration._
object RemoteApp {
def main(args: Array[String]): Unit = {
println("Hello:" + args.head)
startProcessingActorSystem()
}
def startProcessingActorSystem() = {
val system = ActorSystem("ProcessingSystem", ConfigFactory.load())
println("ProcessingActorSystem Started")
}
}
ProcessingActor.scala
package com.harit.akkaio.remote
import akka.actor.{Actor, ActorLogging}
case object Process
case object Crash
class ProcessingActor extends Actor with ActorLogging {
def receive = {
case Process => log.info("processing big things")
case Crash => log.info("crashing the system")
context.stop(self)
}
}
application.conf
akka {
remote.netty.tcp.port = 2552
}
mymodule/pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<parent>
<artifactId>akkaio</artifactId>
<groupId>com.harit</groupId>
<version>1.0-SNAPSHOT</version>
</parent>
<modelVersion>4.0.0</modelVersion>
<artifactId>akkaio-remote</artifactId>
<properties>
<akka-remote_2.11.version>2.3.11</akka-remote_2.11.version>
</properties>
<dependencies>
<dependency>
<groupId>com.typesafe.akka</groupId>
<artifactId>akka-remote_2.11</artifactId>
<version>${akka-remote_2.11.version}</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<configuration>
<finalName>akka-remote</finalName>
<archive>
<manifest>
<mainClass>com.harit.akkaio.remote.RemoteApp</mainClass>
</manifest>
</archive>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
<executions>
<execution>
<id>make-assembly</id> <!-- this is used for inheritance merges -->
<phase>package</phase> <!-- bind to the packaging phase -->
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>
pom.xml
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.harit</groupId>
<artifactId>akkaio</artifactId>
<version>1.0-SNAPSHOT</version>
<modules>
<module>akkaio-remote</module>
</modules>
<packaging>pom</packaging>
<inceptionYear>2015</inceptionYear>
<properties>
<scala.version>2.11.6</scala.version>
<junit.version>4.12</junit.version>
<scalatest_2.11.version>2.2.5</scalatest_2.11.version>
<akka-actor_2.11.version>2.3.11</akka-actor_2.11.version>
<akka-slf4j_2.11.version>2.3.11</akka-slf4j_2.11.version>
<akka-testkit_2.11.version>2.3.11</akka-testkit_2.11.version>
<mockito-all.version>1.10.19</mockito-all.version>
<maven-scala-plugin.scalaCompatVersion>2.11.6</maven-scala-plugin.scalaCompatVersion>
<scalatest-maven-plugin.version>1.0</scalatest-maven-plugin.version>
</properties>
<repositories>
<repository>
<id>scala-tools.org</id>
<name>Scala-Tools Maven2 Repository</name>
<url>http://scala-tools.org/repo-releases</url>
</repository>
</repositories>
<pluginRepositories>
<pluginRepository>
<id>scala-tools.org</id>
<name>Scala-Tools Maven2 Repository</name>
<url>http://scala-tools.org/repo-releases</url>
</pluginRepository>
</pluginRepositories>
<dependencies>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>${scala.version}</version>
</dependency>
<dependency>
<groupId>com.typesafe.akka</groupId>
<artifactId>akka-actor_2.11</artifactId>
<version>${akka-actor_2.11.version}</version>
</dependency>
<dependency>
<groupId>com.typesafe.akka</groupId>
<artifactId>akka-slf4j_2.11</artifactId>
<version>${akka-slf4j_2.11.version}</version>
</dependency>
<dependency>
<groupId>org.scalatest</groupId>
<artifactId>scalatest-maven-plugin</artifactId>
<version>${scalatest-maven-plugin.version}</version>
<scope>compile</scope>
</dependency>
<dependency>
<groupId>com.typesafe.akka</groupId>
<artifactId>akka-testkit_2.11</artifactId>
<version>${akka-testkit_2.11.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.scalatest</groupId>
<artifactId>scalatest_2.11</artifactId>
<version>${scalatest_2.11.version}</version>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<defaultGoal>clean install</defaultGoal>
<plugins>
<plugin>
<groupId>org.scala-tools</groupId>
<artifactId>maven-scala-plugin</artifactId>
<executions>
<execution>
<goals>
<goal>compile</goal>
<goal>testCompile</goal>
</goals>
</execution>
</executions>
<configuration>
<scalaCompatVersion>${maven-scala-plugin.scalaCompatVersion}</scalaCompatVersion>
<scalaVersion>${scala.version}</scalaVersion>
<args>
<arg>-target:jvm-1.8</arg>
</args>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>2.7</version>
<configuration>
<skipTests>true</skipTests>
</configuration>
</plugin>
<plugin>
<groupId>org.scalatest</groupId>
<artifactId>scalatest-maven-plugin</artifactId>
<version>1.0</version>
<configuration>
<reportsDirectory>${project.build.directory}/surefire-reports</reportsDirectory>
<junitxml>.</junitxml>
<filereports>WDF TestSuite.txt</filereports>
</configuration>
<executions>
<execution>
<id>test</id>
<goals>
<goal>test</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
<reporting>
<plugins>
<plugin>
<groupId>org.scala-tools</groupId>
<artifactId>maven-scala-plugin</artifactId>
<configuration>
<scalaVersion>${scala.version}</scalaVersion>
</configuration>
</plugin>
</plugins>
</reporting>
</project>
What am I missing?
Thanks

It seems that your problem is bundling into a jar-with-dependencies, which causes problems with Akka, as described in the documentation:
Warning
Akka's configuration approach relies heavily on the notion of every module/jar having its own reference.conf file, all of these will be discovered by the configuration and loaded. Unfortunately this also means that if you put/merge multiple jars into the same jar, you need to merge all the reference.confs as well. Otherwise all defaults will be lost and Akka will not function.
As suggested on the same page, you can use maven-shade-plugin to merge all the reference configurations:
If you are using Maven to package your application, you can also make use of the Apache Maven Shade Plugin support for Resource Transformers to merge all the reference.confs on the build classpath into one.
See also: Akka: missing akka.version

Had a similar issue:
com.typesafe.config.ConfigException$Missing:
No configuration setting found for key 'akka.persistence.journal-plugin-fallback'
Solved it by adding an appending transformer:
<plugin>
<artifactId>maven-shade-plugin</artifactId>
<version>2.4.1</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<transformers>
<transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
<resource>reference.conf</resource>
</transformer>
</transformers>
</configuration>
</execution>
</executions>
</plugin>
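A usage sketch for the configuration above (the jar name is inferred from the POMs in the question; adjust for your module): by default the Shade plugin replaces the module's main artifact, and since this snippet does not set a Main-Class, either run with an explicit classpath or add a ManifestResourceTransformer as shown in a later answer:
mvn clean package
java -cp akkaio-remote/target/akkaio-remote-1.0-SNAPSHOT.jar com.harit.akkaio.remote.RemoteApp com.harit.akkaio.remote.RemoteApp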

So the problem arises when building a fat jar without handling reference.conf the right way.
The explanation follows from @Zoltan's answer:
It seems that your problem is bundling into a jar-with-dependencies, which causes problems with Akka, as described in the documentation:
Warning
Akka's configuration approach relies heavily on the notion of every
module/jar having its own reference.conf file, all of these will be
discovered by the configuration and loaded. Unfortunately this also
means that if you put/merge multiple jars into the same jar, you need
to merge all the reference.confs as well. Otherwise all defaults will
be lost and Akka will not function.
I have a solution for sbt users which doesn't require any extra plugin beyond sbt-assembly.
In build.sbt, add case "reference.conf" => MergeStrategy.concat to your module assembly configuration.
lazy val module_name = (project in file("module_path"))
.settings(
name := "module_name",
commonSettings,
assemblyJarName in assembly := "module_name.jar",
test in assembly := {},
assemblyMergeStrategy in assembly := {
case PathList("META-INF", xs # _*) => MergeStrategy.discard
#################### The line which needs to be added ###################
case "reference.conf" => MergeStrategy.concat
case _ => MergeStrategy.first
}
)
.dependsOn(other_modules, other_modules2)
MergeStrategy.concat does exactly what the name says: while assembling, whenever a reference.conf is encountered, its contents are concatenated into a single file, instead of only the first copy surviving (which is what the catch-all MergeStrategy.first above would otherwise do for each Akka module's file).
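For completeness, a minimal sketch of wiring in sbt-assembly, which provides the assembly task and the assemblyMergeStrategy setting used above (the 0.14.x version is an assumption matching the older `key in assembly` syntax shown here); in project/plugins.sbt:
// project/plugins.sbt -- brings in the assembly task and assemblyMergeStrategy
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")
Running sbt module_name/assembly then produces module_name.jar with the concatenated reference.conf.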
Someone experienced with Maven (pom.xml), please extend this answer.

A lot of the submitted answers suggest a generic solution: set up a merge strategy that concatenates the default Typesafe Config file (reference.conf).
That will not necessarily work for Akka libraries, because Akka pulls some of its configuration into reference.conf from other files. In particular, the akka.version property lives in a version.conf file.
####################################
# Akka Actor Reference Config File #
####################################
# This is the reference config file that contains all the default settings.
# Make your edits/overrides in your application.conf.
# Akka version, checked against the runtime version of Akka. Loaded from generated conf file.
include "version"
So for an Akka library you may need to include that file with a concat merge strategy in addition to the reference.conf strategy, as other third-party libraries may do something similar:
assemblyMergeStrategy in assembly := {
case PathList("META-INF", "MANIFEST.MF") =>
val log = sLog.value
log.info("discarding MANIFEST.MF")
MergeStrategy.discard
case PathList("reference.conf") =>
val log = sLog.value
log.info("concatinating reference.conf")
MergeStrategy.concat
case PathList("version.conf") =>
val log = sLog.value
log.info("concatinating version.conf")
MergeStrategy.concat
case default =>
val log = sLog.value
log.debug(s"keeping last $default")
MergeStrategy.last
}
This is a common problem with Alpakka, as the version.conf in your uber jar needs to be concatenated so that it contains the values from both libraries. So the version.conf in your uber jar would need to look something like this:
akka.version = "2.6.19"
akka.kafka.version = "2.0.7"
(My solution uses Scala/sbt and its assembly plugin, but you can do something similar for Java/Maven.)
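For the Maven side, a hedged sketch of the same idea with the Shade plugin: one AppendingTransformer per file that must be concatenated (the plugin version and the exact set of resources are assumptions; add further transformers if other libraries ship similar includes):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <transformers>
          <!-- concatenate rather than keep only one copy -->
          <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
            <resource>reference.conf</resource>
          </transformer>
          <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
            <resource>version.conf</resource>
          </transformer>
        </transformers>
      </configuration>
    </execution>
  </executions>
</plugin>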

Adding AppendingTransformer alone didn't resolve the issue for me.
If you are trying to deploy your spark application on EMR and are still facing this issue then please take a look at my solution here. Hope it helps!

Add the following plugin:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-shade-plugin</artifactId>
<version>1.5</version>
<executions>
<execution>
<phase>package</phase>
<goals>
<goal>shade</goal>
</goals>
<configuration>
<shadedArtifactAttached>true</shadedArtifactAttached>
<shadedClassifierName>allinone</shadedClassifierName>
<artifactSet>
<includes>
<include>*:*</include>
</includes>
</artifactSet>
<transformers>
<transformer
implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
<resource>reference.conf</resource>
</transformer>
<transformer
implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
<manifestEntries>
<Main-Class>akka.Main</Main-Class>
</manifestEntries>
</transformer>
</transformers>
</configuration>
</execution>
</executions>
</plugin>
Reference: the Akka documentation.
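A usage sketch for this configuration (jar name assumed from shadedArtifactAttached plus the allinone classifier; substitute your own artifactId and version): the shaded jar is attached next to the normal jar rather than replacing it, and the ManifestResourceTransformer sets the Main-Class, so it can be run directly:
mvn clean package
java -jar target/<artifactId>-<version>-allinone.jar
Note that akka.Main as the entry point expects the fully qualified class name of your root actor as an argument; you would normally point Main-Class at your own application class instead.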

I tried this plugin; it is great, but it leads to another error due to some signed jars. Here is the error:
Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.SecurityException: Invalid signature file digest for Manifest main attributes
What I would suggest you use is the Spring Boot Maven plugin. Just add it to your build and enjoy seamless runnable jars. This is one good reason why I love the Spring framework.
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<executions>
<execution>
<goals>
<goal>repackage</goal>
</goals>
<configuration>
<classifier>final</classifier>
<mainClass>
com.main.PopularHashTags
</mainClass>
</configuration>
</execution>
</executions>
</plugin>
Note: you don't need a Spring Boot application to use this plugin. Just use it in any application and it works like a charm.
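A usage sketch for the repackage goal above (the jar name follows from the classifier setting; substitute your own artifactId and version): the plugin runs after the normal jar is built and produces a second, self-contained jar:
mvn clean package
java -jar target/<artifactId>-<version>-final.jar
This sidesteps the resource-merging problem entirely, because Spring Boot's repackaged jar keeps each dependency as a nested jar, so every library's reference.conf stays intact inside its own jar instead of being merged.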

Related

java.lang.NoClassDefFoundError using Wildfly, Maven, Rest?

I have 2 Maven projects, and let's say I'm using Project2 as a dependency in Project1.
Project2 uses Java Security and has some encryption methods.
I ran the command mvn install in Project2 and then this is how I am adding the dependency in Project1:
<dependency>
<groupId> myProject2groupId </groupId>
<artifactId> myProject2artifactId </artifactId>
<version>0.0.1-SNAPSHOT</version>
<scope>compile</scope>
<type>jar</type>
</dependency>
And I added this in Project2 pom.xml:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>3.2.0</version>
<executions>
<execution>
<goals>
<goal>test-jar</goal>
</goals>
</execution>
</executions>
<configuration>
<finalName>ProjectJARs/project</finalName>
</configuration>
</plugin>
The imports of Project2's classes and methods in Project1 work perfectly, giving me no errors when I compile and run it as a Java application and print to the console.
But when I run it on the WildFly server and call GET/POST methods on my REST endpoints, it gives me this error:
org.jboss.resteasy.spi.UnhandledException: java.lang.NoClassDefFoundError: simmetricClasses/SimmetricCriptography
(In Project1 Maven Dependencies I can see Project1's folder with this [without test code])
Can someone help me with this error?
Thanks a lot
This is the pom.xml of the project I want to use as a module:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>mygroupId</groupId>
<artifactId>myartifactId</artifactId>
<version>0.0.1-SNAPSHOT</version>
<build>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-surefire-plugin</artifactId>
<version>3.0.0-M5</version>
<configuration>
<includes>
<include>**/*.java</include>
</includes>
</configuration>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>3.2.0</version>
<executions>
<execution>
<goals>
<goal>test-jar</goal>
</goals>
</execution>
</executions>
<configuration>
<finalName>ProjectJARs/project</finalName>
</configuration>
</plugin>
</plugins>
</build>
<properties>
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
</properties>
</project>
Is this correct?

Updating play framework project to Java 11 and cannot find correct dependency versions

I'm updating an existing Maven project from Java 8 to 11, and I'm struggling to find the right combination of dependency and plugin versions.
One of the modules uses the Play framework with its associated dependencies; here's what they are set to now:
java: 11.0.5
scala: 2.12.8
sbt: 0.13.17
sbt-compiler-maven-plugin: 1.0.0
sbtrun-maven-plugin: 1.0.1
play: 2.6.21
play2-maven-plugin: 1.0.0-rc5
play2-provider-play26: 1.0.0-rc5
akka: 2.5.27
Which is giving me the following build error:
[ERROR] Failed to execute goal com.google.code.play2-maven-plugin:play2-maven-plugin:1.0.0-rc5:enhance (default-play2-enhance) on project services: Execution default-play2-enhance of goal com.google.code.play2-maven-plugin:play2-maven-plugin:1.0.0-rc5:enhance failed: An API incompatibility was encountered while executing com.google.code.play2-maven-plugin:play2-maven-plugin:1.0.0-rc5:enhance: java.lang.NoSuchMethodError: 'scala.collection.mutable.ArrayOps scala.Predef$.byteArrayOps(byte[])'
[ERROR] -----------------------------------------------------
[ERROR] realm = extension>com.google.code.play2-maven-plugin:play2-maven-plugin:1.0.0-rc5
[ERROR] strategy = org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy
...
...
...
[ERROR] Number of foreign imports: 1
[ERROR] import: Entry[import from realm ClassRealm[maven.api, parent: null]]
Note: I am hoping to keep Akka at version 2.5.x, as 2.6.x requires a lot of code changes, but I will go to a higher version if necessary.
<?xml version="1.0"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>groupId</groupId>
<artifactId>artifactId</artifactId>
<version>version</version>
</parent>
<artifactId>artifactId</artifactId>
<packaging>play2</packaging>
<name>name</name>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>
<repositories>
<repository>
<id>typesafe</id>
<url>http://repo.typesafe.com/typesafe/releases/</url>
</repository>
</repositories>
<dependencies>
<dependency>
<groupId>com.typesafe.play</groupId>
<artifactId>play_2.12</artifactId>
</dependency>
<dependency>
<groupId>com.typesafe.play</groupId>
<artifactId>play-test_2.12</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.typesafe.play</groupId>
<artifactId>play-java_2.12</artifactId>
</dependency>
<dependency>
<groupId>com.fasterxml.jackson.datatype</groupId>
<artifactId>jackson-datatype-joda</artifactId>
</dependency>
<dependency>
<groupId>com.typesafe.akka</groupId>
<artifactId>akka-stream_2.12</artifactId>
</dependency>
<dependency>
<groupId>org.easytesting</groupId>
<artifactId>fest-assert-core</artifactId>
</dependency>
</dependencies>
<build>
<!-- Play source directory -->
<sourceDirectory>${basedir}/app</sourceDirectory>
<testSourceDirectory>${basedir}/test</testSourceDirectory>
<resources>
<resource>
<directory>${basedir}/conf</directory>
</resource>
<resource>
<directory>${basedir}/public</directory>
<targetPath>public</targetPath>
</resource>
<resource>
<directory>${basedir}/target/sbt/web/public/main</directory>
<targetPath>public</targetPath>
</resource>
</resources>
<plugins>
<plugin>
<groupId>com.google.code.play2-maven-plugin</groupId>
<artifactId>play2-maven-plugin</artifactId>
<version>1.0.0-rc5</version>
<extensions>true</extensions>
<configuration>
<mainLang>java</mainLang>
</configuration>
<dependencies>
<dependency>
<groupId>com.google.code.play2-maven-plugin</groupId>
<artifactId>play2-provider-play27</artifactId>
<version>1.0.0-rc5</version>
</dependency>
</dependencies>
<executions>
<execution>
<id>default-play2-enhance</id>
<goals>
<goal>enhance</goal>
</goals>
</execution>
<execution>
<phase>package</phase>
<goals>
<goal>dist</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>com.google.code.sbt-compiler-maven-plugin</groupId>
<artifactId>sbt-compiler-maven-plugin</artifactId>
<version>1.0.0</version>
</plugin>
<plugin>
<groupId>com.google.code.sbtrun-maven-plugin</groupId>
<artifactId>sbtrun-maven-plugin</artifactId>
<version>1.0.1</version>
<executions>
<execution>
<id>compile-assets</id>
<phase>generate-resources</phase>
<goals>
<goal>run</goal>
</goals>
<configuration>
<args>web-assets:assets</args>
<jvmArgs>-Dscala.version=2.12.8</jvmArgs>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<version>2.6</version>
<executions>
<execution>
<id>default-jar</id>
<configuration>
<archive>
<manifest>
<addDefaultImplementationEntries>true</addDefaultImplementationEntries>
<addDefaultSpecificationEntries>true</addDefaultSpecificationEntries>
</manifest>
</archive>
</configuration>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>build-helper-maven-plugin</artifactId>
<version>1.12</version>
<executions>
<execution>
<id>attach-artifacts</id>
<phase>package</phase>
<goals>
<goal>attach-artifact</goal>
</goals>
</execution>
</executions>
<configuration>
<!-- These empty tags keep IntelliJ quiet -->
<name />
<regex />
<source />
<value />
<fileSet />
<artifacts>
<artifact>
<file>${project.build.directory}/service-${project.version}-dist.zip</file>
<type>zip</type>
<classifier>dist</classifier>
</artifact>
</artifacts>
</configuration>
</plugin>
</plugins>
</build>
<profiles>
<profile>
<id>eclipse</id> <!-- for M2Eclipse only -->
<build>
<directory>${project.basedir}/target-eclipse</directory>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<skipMain>true</skipMain>
<skip>true</skip>
<source>11</source>
<target>11</target>
</configuration>
<executions>
<execution>
<id>default-compile</id>
<goals><goal>compile</goal></goals>
</execution>
<execution>
<id>default-testCompile</id>
<goals><goal>testCompile</goal></goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
</profile>
</profiles>
</project>
Edit:
Updated SBT and Scala version, added redacted pom
As per the Scala documentation on JDK compatibility
https://docs.scala-lang.org/overviews/jdk-compatibility/overview.html
your sbt version is not compatible with Java 11. I would recommend upgrading to sbt.version=1.2.3.
The Akka version you are using should be fine if you don't want to upgrade to a higher one.
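For reference, the sbt version is pinned in project/build.properties; a minimal sketch:
# project/build.properties
sbt.version=1.2.3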
Update:
I can see you updated the sbt version to 0.13.17 and tried higher ones, but you should forget about sbt 0.13 and stick with sbt 1.x, since sbt 0.13 uses Scala 2.10 while sbt 1.x uses Scala 2.12.
I've also noticed that you don't have a version tag for maven-compiler-plugin. With Maven 3 the build should fail because of this missing tag, and since it doesn't, I guess you are using Maven 2, which should be upgraded. You should also add a 3.8.0 version tag to maven-compiler-plugin.
Check this guide to see how to properly configure this plugin for Java 11. You should also check the other instructions to make sure everything is properly configured.
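A minimal sketch of that configuration (version 3.8.0 as suggested above, using the <release> flag recommended for JDK 9+):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>3.8.0</version>
  <configuration>
    <release>11</release>
  </configuration>
</plugin>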

Error: Could not find or load main class com.sparkarma.spark.WordCount

I tried to run the program below; I am new to Spark programming. I am getting the following error in Scala IDE. I checked the main class, and there are no version problems in Scala or Spark. I left the output folder at the default. Please let me know what's wrong in my code.
package com.sparkarma.spark
import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
object WordCount{
def main(args:Array[String]) {
val conf = new SparkConf().setAppName("Word Count").setMaster("local[2]")
val sc = new SparkContext(conf)
val bigFile = sc.textFile("hdfs://localhost:9000/home/chaitanya/files/bigText")
val words = bigFile.flatMap(line => line.split(" ")).count()
println(words)
}
}
(screenshot of the error in Scala IDE omitted)
Here is My pom.xml
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>com.sparkarma.samp</groupId>
<artifactId>SparkMaro</artifactId>
<version>0.0.1-SNAPSHOT</version>
<pluginRepositories>
<pluginRepository>
<id>scala-tools.org</id>
<name>Scala-tools Maven2 Repository</name>
<url>http://scala-tools.org/repo-releases</url>
</pluginRepository>
</pluginRepositories>
<dependencies>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>1.4.0</version>
</dependency>
<dependency>
<artifactId>scala-library</artifactId>
<groupId>org.scala-lang</groupId>
<version>2.10.6</version>
</dependency>
</dependencies>
<build>
<plugins>
<!-- mixed scala/java compile -->
<plugin>
<groupId>org.scala-tools</groupId>
<artifactId>maven-scala-plugin</artifactId>
<executions>
<execution>
<id>compile</id>
<goals>
<goal>compile</goal>
</goals>
<phase>compile</phase>
</execution>
<execution>
<id>test-compile</id>
<goals>
<goal>testCompile</goal>
</goals>
<phase>test-compile</phase>
</execution>
<execution>
<phase>process-resources</phase>
<goals>
<goal>compile</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<source>1.7</source>
<target>1.7</target>
</configuration>
</plugin>
<!-- for fatjar -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-assembly-plugin</artifactId>
<version>2.4</version>
<configuration>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
</configuration>
<executions>
<execution>
<id>assemble-all</id>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-jar-plugin</artifactId>
<configuration>
<archive>
<manifest>
<addClasspath>true</addClasspath>
<mainClass>fully.qualified.MainClass</mainClass>
</manifest>
</archive>
</configuration>
</plugin>
</plugins>
<pluginManagement>
<plugins>
<!--This plugin's configuration is used to store Eclipse m2e settings
only. It has no influence on the Maven build itself. -->
<plugin>
<groupId>org.eclipse.m2e</groupId>
<artifactId>lifecycle-mapping</artifactId>
<version>1.0.0</version>
<configuration>
<lifecycleMappingMetadata>
<pluginExecutions>
<pluginExecution>
<pluginExecutionFilter>
<groupId>org.scala-tools</groupId>
<artifactId> maven-scala-plugin </artifactId>
<versionRange> [2.15.2,) </versionRange>
<goals>
<goal>compile</goal>
<goal>testCompile</goal>
</goals>
</pluginExecutionFilter>
<action>
<execute />
</action>
</pluginExecution>
</pluginExecutions>
</lifecycleMappingMetadata>
</configuration>
</plugin>
</plugins>
</pluginManagement>
</build>
</project>
Your project folder doesn't have a Scala library container. It seems you are missing the Scala libraries or Scala nature in your project. Review your pom.xml and add the appropriate Scala library to your project. Check if the dependency below is added.
<dependency>
<artifactId>scala-library</artifactId>
<groupId>org.scala-lang</groupId>
<version>${scala.version}</version>
</dependency>
If yes then follow below steps:
Right Click on project pom.xml ->Run As ->Maven Install
Right Click on Project ->Configure ->Add Scala Nature
Right Click on Project -> Scala -> Add scala library to build Path
Right Click on Scala Library container -> Properties -> Classpath container -> Select containers as per the version defined in pom.xml
Your code looks good. If you still see the issue please provide your pom.xml
Please upgrade your spark-core_2.10 version from 1.4.0 to 1.6.2. Use the Maven dependency below and follow the steps above.
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>1.6.2</version>
</dependency>
You are overriding the main method. Make sure that you don't do it
object WordCount{
def main(args:Array[String]) {
val conf = new SparkConf().setAppName("Word Count").setMaster("local[2]")
val sc = new SparkContext(conf)
val bigFile = sc.textFile("hdfs://localhost:9000/home/chaitanya/files/bigText")
val words = bigFile.flatMap(line => line.split(" ")).count()
println(words)
}
}
After looking into your project explorer: you need to right-click the scala folder and select Build Path -> Use as Source Folder.

Is there any way in Kotlin to weave in code before/after/around functions like there is with AspectJ in Java?

I have tried to use AspectJ to weave in aspects around Kotlin functions, but with no success.
Maybe I'm just configuring something incorrectly, or maybe AspectJ does not support this.
Does anyone know if this is possible using e.g. Maven and Eclipse (or IntelliJ)?
Or care to explain why it is not possible?
In addition to the other comments/answers I think it is worth pointing out that you can "weave" code before/after/around function code by using inline functions. e.g.:
import kotlin.system.measureNanoTime

fun main(vararg args: String) = nanoTimeAppendedTo(System.out, name = "main") {
/* do something, e.g.: */
Thread.sleep(0)
}
inline fun nanoTimeAppendedTo(appendable: Appendable, name: String, block: () -> Unit) {
val nanoTime = measureNanoTime(block)
appendable.appendln("`$name` took $nanoTime ns")
}
You won't have access to all of the information that AspectJ gives you, but for simple cases where you simply want to run some code before/after/around some other code, this works just fine.
This is what I did to weave in the aspects the "binary" way, after the Java and Kotlin code has been compiled.
I couldn't get the aspectj-maven-plugin to weave the aspects the "binary" way correctly, so I used the jcabi-maven-plugin instead.
See http://plugin.jcabi.com/example-ajc.html
The pom that worked for me:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>my.group.id</groupId>
<artifactId>my.artifact.id</artifactId>
<version>0.0.1-SNAPSHOT</version>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<maven.compiler.source>1.6</maven.compiler.source>
<maven.compiler.target>1.6</maven.compiler.target>
<complianceLevel>1.6</complianceLevel>
</properties>
<dependencies>
<dependency>
<groupId>org.jetbrains.kotlin</groupId>
<artifactId>kotlin-stdlib</artifactId>
<version>1.0.3</version>
</dependency>
<dependency>
<groupId>org.aspectj</groupId>
<artifactId>aspectjrt</artifactId>
<version>1.8.9</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<artifactId>maven-compiler-plugin</artifactId>
<groupId>org.apache.maven.plugins</groupId>
<version>3.3</version>
<configuration>
<source>${maven.compiler.source}</source>
<target>${maven.compiler.target}</target>
</configuration>
</plugin>
<plugin>
<artifactId>kotlin-maven-plugin</artifactId>
<groupId>org.jetbrains.kotlin</groupId>
<version>1.0.3</version>
<configuration>
<sourceDirs>
<sourceDir>src/main/kotlin</sourceDir>
<sourceDir>src/test/kotlin</sourceDir>
</sourceDirs>
</configuration>
<executions>
<execution>
<id>compile</id>
<phase>compile</phase>
<goals>
<goal>compile</goal>
</goals>
</execution>
<execution>
<id>test-compile</id>
<phase>test-compile</phase>
<goals>
<goal>test-compile</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>com.jcabi</groupId>
<artifactId>jcabi-maven-plugin</artifactId>
<version>0.14.1</version>
<executions>
<execution>
<goals>
<goal>ajc</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<version>1.5.0</version>
<configuration>
<mainClass>my.package.MyMainClassKt</mainClass>
</configuration>
</plugin>
</plugins>
</build>
So this works with the aspects and a few annotations defined in Java; these annotations are then used on Kotlin methods and classes, and the aspects are successfully woven into the Kotlin code.
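To illustrate (all names here are hypothetical, not part of the original answer): the Java side could define a marker annotation plus an annotation-style aspect, which the jcabi ajc goal then weaves into the already-compiled Kotlin classes:
// Logged.java -- marker annotation, kept in the Java sources
package my.aspects;

import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

@Retention(RetentionPolicy.RUNTIME)
public @interface Logged {}

// LoggingAspect.java -- runs around every method annotated with @Logged
package my.aspects;

import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;

@Aspect
public class LoggingAspect {
    @Around("execution(* *(..)) && @annotation(my.aspects.Logged)")
    public Object around(ProceedingJoinPoint pjp) throws Throwable {
        System.out.println("before " + pjp.getSignature());
        try {
            return pjp.proceed();
        } finally {
            System.out.println("after " + pjp.getSignature());
        }
    }
}
On the Kotlin side you then just annotate a function or class with @Logged.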
Note that if the Kotlin file has both a main method and a class defined in the same file, the Kotlin compiler produces two class files. One class with the name of the class and one class with "Kt" added to its name.
Good to know if you try to use the exec-maven-plugin to run the Kotlin code.
However, this did not play very well with eclipse. Maybe IntelliJ will do a better job here.
