I'm trying to run an SQS server using elasticmq-rest-sqs:
"org.elasticmq" %% "elasticmq-rest-sqs" % "0.14.7"
and my Akka dependencies are:
val akka = Seq(
"com.lightbend.akka" %% "akka-stream-alpakka-sns" % "0.15",
"com.typesafe.akka" %% "akka-testkit" % "2.5.8" % Test withSources(),
"com.typesafe.akka" %% "akka-stream-testkit" % "2.5.8" % Test withSources(),
"com.lightbend.akka" %% "akka-stream-alpakka-sqs" % "0.15"
)
Now in my test I'm writing:
// sqs server
val sqsServer: SQSRestServer = SQSRestServerBuilder.withPort(4576).withInterface("localhost").start()
sqsServer.waitUntilStarted()
and I get the following error:
A needed class was not found. This could be due to an error in your
runpath. Missing class: akka/http/impl/util/SettingsCompanion
java.lang.NoClassDefFoundError: akka/http/impl/util/SettingsCompanion
It's surely a versioning issue, since I'm using it the same way in another project, but with Play 2.6 (not sure if that has anything to do with it).
If I downgrade the ElasticMQ version, I get this error:
An exception or error caused a run to abort:
java.lang.NoClassDefFoundError: akka/http/scaladsl/settings/RoutingSettings
java.lang.BootstrapMethodError: java.lang.NoClassDefFoundError: akka/http/scaladsl/settings/RoutingSettings
please help :/
For me, the following versions are working (I hit the same problem and fixed it by changing versions):
"org.elasticmq" %% "elasticmq-rest-sqs" % "0.15.2"
"com.typesafe.akka" %% "akka-testkit" % "2.5.8" % Test withSources(),
"com.typesafe.akka" %% "akka-stream-testkit" % "2.5.8" % Test withSources(),
"com.lightbend.akka" %% "akka-stream-alpakka-sqs" % alpakkaVersion
val alpakkaVersion = "1.0-M2"
and:
"com.typesafe.play" % "sbt-plugin" % "2.7.2"
I'm trying to use an AvroParquetWriter to convert a file in Avro format to a Parquet file. I load the schema:
val schema: org.apache.avro.Schema = ... getSchema(...)
val parquetFile = new Path("Location/for/parquetFile.txt")
val writer = new AvroParquetWriter[GenericRecord](parquetFile,schema)
My code runs fine up until it gets to initializing the AvroParquetWriter. Then it throws this error:
java.lang.RuntimeException: java.io.FileNotFoundException: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset. -see https://wiki.apache.org/hadoop/WindowsProblems
  at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:722)
  at org.apache.hadoop.util.Shell.getSetPermissionCommand(Shell.java:256)
  at org.apache.hadoop.util.Shell.getSetPermissionCommand(Shell.java:273)
  at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:767)
  at org.apache.hadoop.fs.RawLocalFileSystem$LocalFSFileOutputStream.<init>(RawLocalFileSystem.java:235)
  ...etc
The advice it gives, and the advice I'm finding, is about how to fix this when you are running a Hadoop cluster on your machine. However, I am not running a Hadoop cluster, nor am I aiming to. I have imported some Hadoop libraries in my SBT file to use with other pieces of my program, but this does not spin up a local cluster.
This just started happening. Of my two colleagues, one can run this without issue, and the other just started getting the same error as me. Here are the relevant parts of my build.sbt:
lazy val root = (project in file("."))
.settings(
commonSettings,
name := "My project",
version := "0.1",
libraryDependencies ++= Seq(
"org.apache.hadoop" % "hadoop-common" % "2.9.0",
"com.typesafe.akka" %% "akka-actor" % "2.5.2",
"com.lightbend.akka" %% "akka-stream-alpakka-s3" % "0.9",
"com.enragedginger" % "akka-quartz-scheduler_2.12" % "1.6.0-akka-2.4.x",
"com.typesafe.akka" % "akka-agent_2.12" % "2.5.2",
"com.typesafe.akka" % "akka-remote_2.12" % "2.5.2",
"com.typesafe.akka" % "akka-stream_2.12" % "2.5.2",
"org.apache.kafka" % "kafka-clients" % "0.10.2.1",
"com.typesafe.akka" %% "akka-stream-kafka" % "0.16",
"com.typesafe.akka" %% "akka-persistence" % "2.5.2",
"org.iq80.leveldb" % "leveldb" % "0.7",
"org.fusesource.leveldbjni" % "leveldbjni-all" % "1.8",
"javax.mail" % "javax.mail-api" % "1.5.6",
"com.sun.mail" % "javax.mail" % "1.5.6",
"commons-io" % "commons-io" % "2.5",
"org.apache.avro" % "avro" % "1.8.1",
"net.liftweb" % "lift-json_2.12" % "3.1.0-M1",
"com.google.code.gson" % "gson" % "2.8.1",
"org.json4s" %% "json4s-jackson" % "3.5.2",
"com.amazonaws" % "aws-java-sdk-s3" % "1.11.149",
//"com.amazonaws" % "aws-java-sdk" % "1.11.286",
"org.scalikejdbc" %% "scalikejdbc" % "3.0.0",
"org.scalikejdbc" %% "scalikejdbc-config" % "3.0.0",
"org.scalikejdbc" % "scalikejdbc-interpolation_2.12" % "3.0.2",
"com.microsoft.sqlserver" % "mssql-jdbc" % "6.1.0.jre8",
"org.apache.commons" % "commons-pool2" % "2.4.2",
"commons-pool" % "commons-pool" % "1.6",
"com.jcraft" % "jsch" % "0.1.54",
"ch.qos.logback" % "logback-classic" % "1.2.3",
"com.typesafe.scala-logging" %% "scala-logging" % "3.7.2",
"org.scalactic" %% "scalactic" % "3.0.4",
"mysql" % "mysql-connector-java" % "8.0.8-dmr",
"org.scalatest" %% "scalatest" % "3.0.4" % "test"
)
)
Any ideas as to why it cannot run the Hadoop-related dependencies?
The answer was to follow the error message's suggestion:
I downloaded the latest version of the winutils.exe from
https://github.com/steveloughran/winutils/tree/master/hadoop-3.0.0/bin
Then I manually created the directory structure C:/Users/MyName/Hadoop/bin - note, the bin MUST be there. You can call the Hadoop/ directory whatever you want, but the bin/ must be one level inside it.
I placed the winutils.exe in the bin.
In my code, I had to put this line before initializing the Parquet writer (I'd imagine it can be anywhere before it is initialized) to set the Hadoop home:
System.setProperty("hadoop.home.dir", "C:/Users/nhanak/Hadoop/")
val writer = new AvroParquetWriter[GenericRecord](parquetFile,iInfo.schema)
Optional - if you want to keep this within your project rather than on your local machine (for example, if others will pull this repo, or you want to pack it into a jar to ship elsewhere), create a directory structure within your project and store the winutils.exe inside it.
So, say you create the directory structure src/main/resources/HadoopResources/bin in your project and place the winutils.exe in the bin. Then, to make use of the winutils.exe, set the Hadoop home like this:
val file = new File("src/main/resources/HadoopResources")
System.setProperty("hadoop.home.dir", file.getAbsolutePath)
I am using SBT to set up Akka Persistence, but it's failing with this error:
Detected java.lang.NoSuchMethodError error, which MAY be caused by incompatible Akka versions on the classpath. Please note that a given Akka version MUST be the same across all modules of Akka that you are using, e.g. if you use akka-actor [2.5.6 (resolved from current classpath)] all other core Akka modules MUST be of the same version. External projects like Alpakka, Persistence plugins or Akka HTTP etc. have their own version numbers - please make sure you're using a compatible set of libraries.
Uncaught error from thread [TopsActivitiesSystem-akka.actor.default-dispatcher-16]: akka.persistence.Eventsourced.persist$(Lakka/persistence/Eventsourced;Ljava/lang/Object;Lscala/Function1;)V, shutting down JVM since 'akka.jvm-exit-on-fatal-error' is enabled for for ActorSystem[TopsActivitiesSystem]
java.lang.NoSuchMethodError: akka.persistence.Eventsourced.persist$(Lakka/persistence/Eventsourced;Ljava/lang/Object;Lscala/Function1;)V
My SBT build configuration is:
scalaVersion := "2.12.4"
lazy val root = (project in file("."))
.configs(IntegrationTest)
def akkaVersion = "2.5.6"
def akkaHttpVersion = "10.0.10"
def logbackVersion = "1.2.3"
def ItAndTest = "it, test"
libraryDependencies ++= Seq(
"com.typesafe.akka" %% "akka-actor" % akkaVersion,
"com.typesafe.akka" %% "akka-stream" % akkaVersion,
"com.typesafe.akka" %% "akka-cluster" % akkaVersion,
"com.typesafe.akka" %% "akka-cluster-tools" % akkaVersion,
"com.typesafe.akka" % "akka-cluster-metrics_2.12" % akkaVersion,
"com.redmart.fc" %% "fc-capacity-model" % "1.7.0",
"com.redmart.akka" %% "akka-downing" % "1.1.0",
"com.jsuereth" %% "scala-arm" % "2.0",
"com.typesafe" % "config" % "1.3.1",
"com.typesafe.scala-logging" %% "scala-logging" % "3.7.2",
"com.typesafe.akka" %% "akka-slf4j" % akkaVersion,
"ch.qos.logback" % "logback-classic" % logbackVersion,
"ch.qos.logback" % "logback-access" % logbackVersion,
"net.logstash.logback" % "logstash-logback-encoder" % "4.11",
"joda-time" % "joda-time" % "2.9.3",
"com.typesafe.akka" %% "akka-http" % akkaHttpVersion,
"com.typesafe.akka" %% "akka-http-spray-json" % akkaHttpVersion,
"com.typesafe.akka" %% "akka-persistence-cassandra" % "0.58",
// "com.typesafe.akka" %% "akka-cluster" % akkaVersion,
// "com.typesafe.akka" %% "akka-cluster-tools" % akkaVersion,
"com.lightbend.akka" %% "akka-management-cluster-http" % "0.4",
// "org.iq80.leveldb" % "leveldb" % "0.9",
// "org.fusesource.leveldbjni" % "leveldbjni-all" % "1.8",
"io.spray" %% "spray-json" % "1.3.3",
"com.enragedginger" %% "akka-quartz-scheduler" % "1.6.1-akka-2.5.x",
// "com.softwaremill.macwire" %% "macros" % "2.2.2" % "provided",
// "com.softwaremill.macwire" %% "util" % "2.2.2",
"com.esotericsoftware" % "kryo" % "4.0.1",
"com.github.romix.akka" %% "akka-kryo-serialization" % "0.5.2",
"com.typesafe.akka" %% "akka-http-testkit" % akkaHttpVersion % ItAndTest,
"org.scalatest" %% "scalatest" % "3.0.4" % ItAndTest,
"org.mockito" % "mockito-core" % "2.10.0" % ItAndTest,
"com.github.dnvriend" %% "akka-persistence-inmemory" % "2.5.1.1" % ItAndTest
)
dependencyOverrides += "com.typesafe.akka" %% "akka-actor" % akkaVersion
dependencyOverrides += "com.typesafe.akka" %% "akka-stream" % akkaVersion
dependencyOverrides += "com.typesafe.akka" %% "akka-cluster" % akkaVersion
dependencyOverrides += "com.typesafe.akka" %% "akka-cluster-tools" % akkaVersion
dependencyOverrides += "com.typesafe.akka" %% "akka-persistence" % akkaVersion
dependencyOverrides += "com.typesafe.akka" %% "akka-slf4j" % akkaVersion
dependencyOverrides += "com.typesafe.akka" % "akka-cluster-metrics_2.12" % akkaVersion
Can anyone explain why it's failing? I am sure I am overriding to the latest version, which is 2.5.6, using dependencyOverrides. I am using SBT 0.13.
Use the current version (0.5) of akka-management-cluster-http:
"com.lightbend.akka" %% "akka-management-cluster-http" % "0.5"
Version 0.4 depends on an older version of Akka.
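To confirm which Akka versions actually end up on the classpath after sbt's conflict resolution, you can run the evicted task from the sbt shell (available in sbt 0.13.8 and later):

> evicted

It reports dependencies whose requested version was replaced by another one, so any module still dragging in an older akka-actor or akka-persistence shows up there.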
I am currently migrating my Play 2 Scala API project and get 10 warnings during compilation indicating:
[warn] Class play.core.enhancers.PropertiesEnhancer$GeneratedAccessor not found - continuing with a stub.
All of them are the same, and I don't have any other information. I've searched a bit for similar cases; it's often because of the JDK version, but I'm already on 1.8.
Here's what I have in plugins.sbt:
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.5.3")
addSbtPlugin("org.scalastyle" %% "scalastyle-sbt-plugin" % "0.8.0")
addSbtPlugin("com.sksamuel.scapegoat" %% "sbt-scapegoat" % "1.0.4")
and in build.sbt:
libraryDependencies ++= Seq(
cache,
ws,
"org.reactivemongo" %% "play2-reactivemongo" % "0.10.5.0.akka23",
"org.reactivemongo" %% "reactivemongo" % "0.10.5.0.akka23",
"org.mockito" % "mockito-core" % "1.10.5" % "test",
"org.scalatestplus" %% "play" % "1.2.0" % "test",
"com.amazonaws" % "aws-java-sdk" % "1.8.3",
"org.cvogt" %% "play-json-extensions" % "0.8.0",
javaCore,
"com.clever-age" % "play2-elasticsearch" % "1.1.0" excludeAll(
ExclusionRule(organization = "org.scala-lang"),
ExclusionRule(organization = "com.typesafe.play"),
ExclusionRule(organization = "org.apache.commons", artifact = "commons-lang3")
)
)
Don't hesitate to ask if you need anything else :)
It's not something that blocks me, but I'd prefer to avoid these 10 warnings every time I recompile my application.
Thank you! :)
It seems something in your code is trying to use the Play enhancer and is failing to find it. Are you using Ebean or something that may require the enhancer?
You can try adding the plugin to your plugins.sbt:
addSbtPlugin("com.typesafe.sbt" % "sbt-play-enhancer" % "1.1.0")
This should make the warning go away. You can then disable it if you like:
// In build.sbt
playEnhancerEnabled := false
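If your build has several subprojects, here is a small sketch of switching it off for the whole build at once (plain sbt scoping; this assumes every project should skip the enhancer):

// In build.sbt, applies to every project in this build
playEnhancerEnabled in ThisBuild := false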
I am new to Scala and SBT. After a lot of investigation I have come up with the build.sbt below. I am not able to find a solution for this; any help to get me unblocked would be appreciated.
scalaVersion := "2.11.7"
scalaVersion in ThisBuild := "2.11.7"
val sparkVersion = "1.3.0"
val akkaVersion = "2.3.6"
libraryDependencies ++= Seq(
jdbc,
cache,
"com.typesafe.play" % "play_2.11" % "2.5.0-M2",
"org.slf4j" % "jcl-over-slf4j" % "1.7.18",
"org.slf4j" % "slf4j-simple" % "1.6.2",
"com.typesafe.akka" % "akka-actor_2.11" % "2.4.2",
"com.typesafe.akka" % "akka-slf4j_2.11" % "2.4.2",
"org.webjars" % "webjars-play_2.11" % "2.4.0-2",
"com.typesafe.play" % "play-ws_2.11" % "2.5.0-M2",
"org.webjars" % "bootstrap" % "3.2.0",
"org.webjars" % "html5shiv" % "3.7.0",
"org.webjars" % "respond" % "1.4.2",
"com.twitter" %% "algebird-core" % "0.9.0", // Twitter Algebird
"net.databinder.dispatch" % "dispatch-core_2.11" % "0.11.3",
"org.reactivemongo" % "play2-reactivemongo_2.11" % "0.11.10.play23",
"org.specs2" %% "specs2-core" % "3.6.5" % "test",
"org.specs2" %% "specs2-junit" % "3.6.5" % "test",
"org.specs2" %% "specs2-mock" % "3.6.5" % "test",
"org.mongodb" %% "casbah" % "2.8.1", // MongoDB Casbah
"com.sksamuel.elastic4s" %% "elastic4s" % "1.4.14"
)
Compilation is fine, but I am getting a runtime error and am not sure where it comes from.
Error:
java.lang.NoSuchMethodError: play.api.Logger$.init(Ljava/io/File;Lscala/Enumeration$Value;)V
at play.core.server.DevServerStart$$anonfun$mainDev$1.apply(DevServerStart.scala:88)
at play.core.server.DevServerStart$$anonfun$mainDev$1.apply(DevServerStart.scala:61)
at play.utils.Threads$.withContextClassLoader(Threads.scala:21)
at play.core.server.DevServerStart$.mainDev(DevServerStart.scala:60)
at play.core.server.DevServerStart$.mainDevHttpMode(DevServerStart.scala:50)
at play.core.server.DevServerStart.mainDevHttpMode(DevServerStart.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at play.runsupport.Reloader$.startDevMode(Reloader.scala:223)
at play.sbt.run.PlayRun$$anonfun$playRunTask$1$$anonfun$apply$2$$anonfun$apply$3.devModeServer$lzycompute$1(PlayRun.scala:74)
at play.sbt.run.PlayRun$$anonfun$playRunTask$1$$anonfun$apply$2$$anonfun$apply$3.play$sbt$run$PlayRun$$anonfun$$anonfun$$anonfun$$devModeServer$1(PlayRun.scala:74)
at play.sbt.run.PlayRun$$anonfun$playRunTask$1$$anonfun$apply$2$$anonfun$apply$3.apply(PlayRun.scala:100)
at play.sbt.run.PlayRun$$anonfun$playRunTask$1$$anonfun$apply$2$$anonfun$apply$3.apply(PlayRun.scala:53)
at scala.Function1$$anonfun$compose$1.apply(Function1.scala:47)
[trace] Stack trace suppressed: run last compile:run for the full output.
[error] (compile:run) java.lang.reflect.InvocationTargetException
[error] Total time: 1 s, completed 7 Mar, 2016 8:05:59 PM
You cannot mix and match different Play versions.
ReactiveMongo "0.11.10.play23" actually needs Play 2.3. You tried to use the milestone 2 Play 2.5 development version.
The latest official Play version is currently 2.4.6, so don't use a development milestone (M2) of Play 2.5, which is not yet finalized.
There's a matching ReactiveMongo version for Play 2.4. Use that.
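For illustration only, this is the shape of the fix; the play2-reactivemongo version string below is an assumption, so check the ReactiveMongo documentation for the exact release published for Play 2.4:

libraryDependencies ++= Seq(
  // Keep Play and the ReactiveMongo Play module on the same Play series
  "com.typesafe.play" % "play_2.11" % "2.4.6",
  "org.reactivemongo" % "play2-reactivemongo_2.11" % "0.11.10.play24" // assumed version string - verify it exists before use
)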
Furthermore, when you want to develop a Play application, you should just use the Typesafe/Lightbend "Activator", which generates a proper project template using the Play SBT plugin (with proper template and routes compilation support, et cetera).
I am running unit tests on Spark 1.3.1 with sbt test, and besides the unit tests being incredibly slow, I keep running into java.lang.ClassNotFoundException: org.apache.spark.storage.RDDBlockId issues. Usually this means a dependency issue, but I don't know where it comes from. I tried installing everything on a new machine, including a fresh Hadoop and a fresh ivy2 cache, but I still run into the same issue.
Any help is greatly appreciated.
Exception:
Exception in thread "Driver Heartbeater" java.lang.ClassNotFoundException:
org.apache.spark.storage.RDDBlockId
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:270)
My build.sbt:
libraryDependencies ++= Seq(
"org.scalaz" %% "scalaz-core" % "7.1.2" excludeAll ExclusionRule(organization = "org.slf4j"),
"com.typesafe.play" %% "play-json" % "2.3.4" excludeAll ExclusionRule(organization = "org.slf4j"),
"org.apache.spark" %% "spark-core" % "1.3.1" % "provided" withSources() excludeAll (ExclusionRule(organization = "org.slf4j"), ExclusionRule("org.spark-project.akka", "akka-actor_2.10")),
"org.apache.spark" %% "spark-graphx" % "1.3.1" % "provided" withSources() excludeAll (ExclusionRule(organization = "org.slf4j"), ExclusionRule("org.spark-project.akka", "akka-actor_2.10")),
"org.apache.cassandra" % "cassandra-all" % "2.1.6",
"org.apache.cassandra" % "cassandra-thrift" % "2.1.6",
"com.typesafe.akka" %% "akka-actor" % "2.3.11",
"com.datastax.cassandra" % "cassandra-driver-core" % "2.1.6" withSources() withJavadoc() excludeAll (ExclusionRule(organization = "org.slf4j"),ExclusionRule(organization = "org.apache.spark"),ExclusionRule(organization = "com.twitter",name = "parquet-hadoop-bundle")),
"com.github.nscala-time" %% "nscala-time" % "1.2.0" excludeAll ExclusionRule(organization = "org.slf4j") withSources(),
"com.datastax.spark" %% "spark-cassandra-connector-embedded" % "1.3.0-M2" excludeAll (ExclusionRule(organization = "org.slf4j"),ExclusionRule(organization = "org.apache.spark"),ExclusionRule(organization = "com.twitter",name = "parquet-hadoop-bundle")),
"com.datastax.spark" %% "spark-cassandra-connector" % "1.3.0-M2" excludeAll (ExclusionRule(organization = "org.slf4j"),ExclusionRule(organization = "org.apache.spark"),ExclusionRule(organization = "com.twitter",name = "parquet-hadoop-bundle")),
"org.slf4j" % "slf4j-api" % "1.6.1",
"com.twitter" % "jsr166e" % "1.1.0",
"org.slf4j" % "slf4j-nop" % "1.6.1" % "test",
"org.scalatest" %% "scalatest" % "2.2.1" % "test" excludeAll ExclusionRule(organization = "org.slf4j")
)
and my Spark test settings (all of which I have tried disabling while testing this):
(spark.kryo.registrator,com.my.spark.MyRegistrator)
(spark.eventLog.dir,)
(spark.driver.memory,16G)
(spark.kryoserializer.buffer.mb,512)
(spark.akka.frameSize,5)
(spark.shuffle.spill,false)
(spark.default.parallelism,8)
(spark.shuffle.consolidateFiles,false)
(spark.serializer,org.apache.spark.serializer.KryoSerializer)
(spark.shuffle.spill.compress,false)
(spark.driver.host,10.10.68.66)
(spark.akka.timeout,300)
(spark.driver.port,55328)
(spark.eventLog.enabled,false)
(spark.cassandra.connection.host,127.0.0.1)
(spark.cassandra.connection.ssl.enabled,false)
(spark.master,local[8])
(spark.cassandra.connection.ssl.trustStore.password,password)
(spark.fileserver.uri,http://10.10.68.66:55329)
(spark.cassandra.auth.username,username)
(spark.local.dir,/tmp/spark)
(spark.app.id,local-1436229075894)
(spark.storage.blockManagerHeartBeatMs,300000)
(spark.executor.id,<driver>)
(spark.storage.memoryFraction,0.5)
(spark.app.name,Count all entries 217885402)
(spark.shuffle.compress,false)
An assembled or packaged jar submitted to standalone or Mesos works fine! Suggestions?
We ran into the same issue in Spark 1.6.0 (there is already a bug report for it).
We fixed it by switching to the Kryo serializer (which you should be using anyway).
So it appears to be a bug in the default JavaSerializer.
Simply do the following to get rid of it:
new SparkConf().setAppName("Simple Application").set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
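If you also want the Kryo registrator that your test settings already reference, here is a sketch extending the same conf (nothing new beyond combining the two settings already shown above):

new SparkConf()
  .setAppName("Simple Application")
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  // Reuse the registrator already listed in the test settings dump above
  .set("spark.kryo.registrator", "com.my.spark.MyRegistrator")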
The cause was a large broadcast variable. Unsure why (as it fit in memory), but removing it from the test cases made it work.