Scala runtime issue with Skinny ORM

I am a newbie to Skinny ORM and I am having a strange problem.
I am trying to get a small snippet working: SBT compiles my code, but at
runtime I get a "class not found" error. (I get the message while
running via IntelliJ.)
My build.sbt:
name := "skinny_jdbc"
version := "1.0"
scalaVersion := "2.12.1"
libraryDependencies ++= Seq(
  "org.skinny-framework" %% "skinny-orm" % "2.3.2",
  "com.h2database" % "h2" % "1.4.+",
  "ch.qos.logback" % "logback-classic" % "1.1.+"
)
My test application:
import scalikejdbc._
import skinny.orm._, feature._
import org.joda.time._
object sk_test extends App {
  println("In SK_test object")
  skinny.DBSettings.initialize()
  implicit val session = AutoSession
}
It compiles fine, but I get a runtime error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/pool/ObjectPool
The detailed message is below. Thanks in advance.
"C:\Program Files\Java\jdk1.8.0_66\bin\java" -Didea.launcher.port=7534 "-Didea.launcher.bin.path=C:\Program Files (x86)\JetBrains\IntelliJ IDEA Community Edition 2016.3\bin" -Dfile.encoding=UTF-8
-classpath "C:\Program Files\Java\jdk1.8.0_66\jre\lib\charsets.jar;C:\Program Files\Java\jdk1.8.0_66\jre\lib\deploy.jar;C:\Program Files\Java\jdk1.8.0_66\jre\lib\ext\access-bridge-64.jar;
C:\Program Files\Java\jdk1.8.0_66\jre\lib\ext\cldrdata.jar;
C:\Program Files\Java\jdk1.8.0_66\jre\lib\ext\dnsns.jar;
C:\Program Files\Java\jdk1.8.0_66\jre\lib\ext\jaccess.jar;
C:\Program Files\Java\jdk1.8.0_66\jre\lib\ext\jfxrt.jar;
C:\Program Files\Java\jdk1.8.0_66\jre\lib\ext\localedata.jar;
C:\Program Files\Java\jdk1.8.0_66\jre\lib\ext\nashorn.jar;C:\Program Files\Java\jdk1.8.0_66\jre\lib\ext\sunec.jar;
C:\Program Files\Java\jdk1.8.0_66\jre\lib\ext\sunjce_provider.jar;
C:\Program Files\Java\jdk1.8.0_66\jre\lib\ext\sunmscapi.jar;
C:\Program Files\Java\jdk1.8.0_66\jre\lib\ext\sunpkcs11.jar;C:\Program Files\Java\jdk1.8.0_66\jre\lib\ext\zipfs.jar;C:\Program Files\Java\jdk1.8.0_66\jre\lib\javaws.jar;
C:\Program Files\Java\jdk1.8.0_66\jre\lib\jce.jar;
C:\Program Files\Java\jdk1.8.0_66\jre\lib\jfr.jar;
C:\Program Files\Java\jdk1.8.0_66\jre\lib\jfxswt.jar;C:\Program Files\Java\jdk1.8.0_66\jre\lib\jsse.jar;C:\Program Files\Java\jdk1.8.0_66\jre\lib\management-agent.jar;
C:\Program Files\Java\jdk1.8.0_66\jre\lib\plugin.jar;
C:\Program Files\Java\jdk1.8.0_66\jre\lib\resources.jar;
C:\Program Files\Java\jdk1.8.0_66\jre\lib\rt.jar;C:\Users\alpha\Desktop\Coursera\skinny_jdbc\target\scala-2.12\classes;C:\Users\alpha\.ivy2\cache\org.apache.commons\commons-pool2\jars\commons-pool2-2.4.2.jar;C:\Users\alpha\.ivy2\cache\org.skinny-framework\skinny-orm_2.12\jars\skinny-orm_2.12-2.3.2.jar;C:\Users\alpha\.ivy2\cache\org.skinny-framework\skinny-micro-common_2.12\jars\skinny-micro-common_2.12-1.2.1.jar;C:\Users\alpha\.ivy2\cache\org.skinny-framework\skinny-common_2.12\jars\skinny-common_2.12-2.3.2.jar;C:\Users\alpha\.ivy2\cache\org.scalikejdbc\scalikejdbc_2.12\jars\scalikejdbc_2.12-2.5.0.jar;C:\Users\alpha\.ivy2\cache\org.scalikejdbc\scalikejdbc-syntax-support-macro_2.12\jars\scalikejdbc-syntax-support-macro_2.12-2.5.0.jar;C:\Users\alpha\.ivy2\cache\org.scalikejdbc\scalikejdbc-interpolation_2.12\jars\scalikejdbc-interpolation_2.12-2.5.0.jar;C:\Users\alpha\.ivy2\cache\org.scalikejdbc\scalikejdbc-interpolation-macro_2.12\jars\scalikejdbc-interpolation-macro_2.12-2.5.0.jar;C:\Users\alpha\.ivy2\cache\org.scalikejdbc\scalikejdbc-core_2.12\jars\scalikejdbc-core_2.12-2.5.0.jar;C:\Users\alpha\.ivy2\cache\org.scalikejdbc\scalikejdbc-config_2.12\jars\scalikejdbc-config_2.12-2.5.0.jar;C:\Users\alpha\.ivy2\cache\org.scala-lang.modules\scala-parser-combinators_2.12\bundles\scala-parser-combinators_2.12-1.0.4.jar;C:\Users\alpha\.ivy2\cache\org.scala-lang\scala-reflect\jars\scala-reflect-2.12.1.jar;C:\Users\alpha\.ivy2\cache\org.scala-lang\scala-library\jars\scala-library-2.12.1.jar;C:\Users\alpha\.ivy2\cache\org.flywaydb\flyway-core\jars\flyway-core-4.0.3.jar;C:\Users\alpha\Desktop\Coursera\skinny_jdbc\lib\sqljdbc42.jar;C:\Users\alpha\.ivy2\cache\org.slf4j\slf4j-api\jars\slf4j-api-1.7.22.jar;C:\Users\alpha\.ivy2\cache\org.joda\joda-convert\jars\joda-convert-1.8.1.jar;C:\Users\alpha\.ivy2\cache\org.apache.commons\commons-dbcp2\jars\commons-dbcp2-2.1.1.jar;C:\Users\alpha\.ivy2\cache\joda-time\joda-time\jars\joda-time-2.9.6.jar;C:\Users\alpha\.ivy2\cache\commons-logging\commons-logging\jars\commons-logging-1.2.jar;C:\Users\alpha\.ivy2\cache\com.typesafe\config\bundles\config-1.3.1.jar;C:\Users\alpha\.ivy2\cache\com.h2database\h2\jars\h2-1.4.193.jar;C:\Users\alpha\.ivy2\cache\ch.qos.logback\logback-core\jars\logback-core-1.1.8.jar;C:\Users\alpha\.ivy2\cache\ch.qos.logback\logback-classic\jars\logback-classic-1.1.8.jar;C:\Program Files (x86)\JetBrains\IntelliJ IDEA Community Edition 2016.3\lib\idea_rt.jar" com.intellij.rt.execution.application.AppMain sk_test
In SK_test object
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/pool/ObjectPool
at scalikejdbc.CommonsConnectionPoolFactory$.apply(CommonsConnectionPoolFactory.scala:16)
at scalikejdbc.CommonsConnectionPoolFactory$.apply(CommonsConnectionPoolFactory.scala:8)
at scalikejdbc.ConnectionPool$.add(ConnectionPool.scala:116)
at scalikejdbc.config.DBs.setup(DBs.scala:16)
at scalikejdbc.config.DBs.setup$(DBs.scala:10)
at skinny.SkinnyDBsWithEnv.setup(SkinnyDBsWithEnv.scala:9)
at scalikejdbc.config.DBs.$anonfun$setupAll$1(DBs.scala:21)
at scalikejdbc.config.DBs.$anonfun$setupAll$1$adapted(DBs.scala:21)
at scala.collection.immutable.List.foreach(List.scala:378)
at scalikejdbc.config.DBs.setupAll(DBs.scala:21)
at scalikejdbc.config.DBs.setupAll$(DBs.scala:19)
at skinny.SkinnyDBsWithEnv.setupAll(SkinnyDBsWithEnv.scala:9)
at skinny.DBSettingsInitializer.initialize(DBSettingsInitializer.scala:25)
at skinny.DBSettingsInitializer.initialize$(DBSettingsInitializer.scala:21)
at skinny.DBSettings$.initialize(DBSettings.scala:8)
at sk_test$.delayedEndpoint$sk_test$1(sk_test.scala:13)
at sk_test$delayedInit$body.apply(sk_test.scala:10)
at scala.Function0.apply$mcV$sp(Function0.scala:34)
at scala.Function0.apply$mcV$sp$(Function0.scala:34)
at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
at scala.App.$anonfun$main$1$adapted(App.scala:76)
at scala.collection.immutable.List.foreach(List.scala:378)
at scala.App.main(App.scala:76)
at scala.App.main$(App.scala:74)
at sk_test$.main(sk_test.scala:10)
at sk_test.main(sk_test.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.pool.ObjectPool
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 31 more

I believe you used the application.conf suggested on the website. This should work fine:
development {
  db {
    default {
      driver="org.h2.Driver"
      url="jdbc:h2:file:./db/development;MODE=PostgreSQL;AUTO_SERVER=TRUE"
      user="sa"
      password="sa"
      poolInitialSize=2
      poolMaxSize=10
    }
  }
}
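With that file on the classpath (src/main/resources/application.conf), and once the dependency issue below is sorted out, a one-line query is a handy way to verify the pool is actually wired up. A minimal smoke test, assuming the H2 settings above (the SQL is illustrative only):
import scalikejdbc._

skinny.DBSettings.initialize()
implicit val session = AutoSession
// simple round-trip through the configured connection pool
println(sql"select 1".map(_.int(1)).single.apply())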
Otherwise, you have to include commons-dbcp in your build.sbt to pull the library in as a dependency.
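Note that the classpath dump above already contains commons-pool2 and commons-dbcp2, while scalikejdbc's CommonsConnectionPoolFactory references org.apache.commons.pool.ObjectPool, which lives in the older commons-pool 1.x line. A minimal sketch of the addition, assuming the classic commons-dbcp 1.4 coordinates (which pull in commons-pool 1.x transitively):
// assumed fix: classic DBCP transitively provides org.apache.commons.pool.ObjectPool
libraryDependencies += "commons-dbcp" % "commons-dbcp" % "1.4"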

Related

Azure Databricks, could not initialize class org.apache.spark.eventhubs.EventHubsConf

I'm pretty new to Scala and I'm trying to create a notebook to process data written to an Azure Event Hub. This is my code:
import org.apache.spark.eventhubs._

val connectionString = ConnectionStringBuilder("MY-CONNECTION-STRING")
  .setEventHubName("EVENT-HUB-NAME")
  .build
val eventHubsConf = EventHubsConf(connectionString)
  .setStartingPosition(EventPosition.fromEndOfStream)
val eventhubs = spark.readStream
  .format("eventhubs")
  .options(eventHubsConf.toMap)
  .load()
And I get the following error: java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.eventhubs.EventHubsConf$
Cluster Configuration:
Databricks Runtime Version: 7.0 (includes Apache Spark 3.0.0, Scala 2.12)
Driver & Worker Type: 14.0 GB Memory, 4 Cores, 0.75 DBU
Standard_DS3_v2
I have installed the following library:
com.microsoft.azure:azure-eventhubs-spark_2.11:2.3.17
(screenshot of the cluster libraries omitted)
The other JAR installed is there to resolve a problem with logging.
The code crashes as soon as I try to create eventHubsConf.
Complete traceback:
java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.eventhubs.EventHubsConf$
at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-2632683088190841:7)
at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-2632683088190841:70)
at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-2632683088190841:72)
at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-2632683088190841:74)
at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-2632683088190841:76)
at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$$iw$$iw$$iw$$iw$$iw.<init>(command-2632683088190841:78)
at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$$iw$$iw$$iw$$iw.<init>(command-2632683088190841:80)
at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$$iw$$iw$$iw.<init>(command-2632683088190841:82)
at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$$iw$$iw.<init>(command-2632683088190841:84)
at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$$iw.<init>(command-2632683088190841:86)
at line14a6ae940dd14957b7172a4cf8f6cdd348.$read.<init>(command-2632683088190841:88)
at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$.<init>(command-2632683088190841:92)
at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$.<clinit>(command-2632683088190841)
at line14a6ae940dd14957b7172a4cf8f6cdd348.$eval$.$print$lzycompute(<notebook>:7)
at line14a6ae940dd14957b7172a4cf8f6cdd348.$eval$.$print(<notebook>:6)
at line14a6ae940dd14957b7172a4cf8f6cdd348.$eval.$print(<notebook>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:745)
at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1021)
at scala.tools.nsc.interpreter.IMain.$anonfun$interpret$1(IMain.scala:574)
at scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:41)
at scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:37)
at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:41)
at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:573)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:600)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:570)
at com.databricks.backend.daemon.driver.DriverILoop.execute(DriverILoop.scala:215)
at com.databricks.backend.daemon.driver.ScalaDriverLocal.$anonfun$repl$1(ScalaDriverLocal.scala:202)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at com.databricks.backend.daemon.driver.DriverLocal$TrapExitInternal$.trapExit(DriverLocal.scala:714)
at com.databricks.backend.daemon.driver.DriverLocal$TrapExit$.apply(DriverLocal.scala:667)
at com.databricks.backend.daemon.driver.ScalaDriverLocal.repl(ScalaDriverLocal.scala:202)
at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$10(DriverLocal.scala:396)
at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:238)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:233)
at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:230)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:49)
at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:275)
at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:268)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:49)
at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:373)
at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$1(DriverWrapper.scala:653)
at scala.util.Try$.apply(Try.scala:213)
at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:645)
at com.databricks.backend.daemon.driver.DriverWrapper.getCommandOutputAndError(DriverWrapper.scala:486)
at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:598)
at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:391)
at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:337)
at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:219)
at java.lang.Thread.run(Thread.java:748)
It seems that your runtime includes Scala 2.12, but your package is built for Scala 2.11.
Try installing this one instead:
com.microsoft.azure:azure-eventhubs-spark_2.12:2.3.17
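As a general sbt-side note (not part of the original answer): the _2.12 suffix is the Scala binary version baked into the artifact name, and sbt's %% operator appends it automatically so it always matches the build's scalaVersion, which prevents exactly this kind of mismatch:
// with scalaVersion := "2.12.x", these two declarations resolve the same artifact
libraryDependencies += "com.microsoft.azure" %% "azure-eventhubs-spark" % "2.3.17"
libraryDependencies += "com.microsoft.azure" % "azure-eventhubs-spark_2.12" % "2.3.17"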

Executable Jar created by Gradle bootRepackage missing jars

What I'd like to do:
Create an executable JAR using Gradle, with Eclipse as the IDE.
The JAR should include all libs required to run, so it can be used inside a Docker container.
Issues so far:
org.springframework.boot is not included in the JAR.
Code:
src/main/java/SampleController.java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.ResponseBody;
@Controller
@EnableAutoConfiguration
public class SampleController {
    @RequestMapping("/")
    @ResponseBody
    String home() {
        return "Hello World!";
    }

    public static void main(String[] args) throws Exception {
        SpringApplication.run(SampleController.class, args);
    }
}
build.gradle
buildscript {
    repositories {
        jcenter()
        mavenCentral()
    }
    dependencies {
        classpath("org.springframework.boot:spring-boot-gradle-plugin:1.5.10.RELEASE")
    }
}
apply plugin: 'java-library'
apply plugin: 'eclipse'
apply plugin: 'org.springframework.boot'
// In this section you declare where to find the dependencies of your project
repositories {
    // Use jcenter for resolving your dependencies.
    // You can declare any Maven/Ivy/file repository here.
    jcenter()
    mavenCentral()
}
dependencies {
    // This dependency is exported to consumers, that is to say found on their compile classpath.
    api 'org.apache.commons:commons-math3:3.6.1'
    // This dependency is used internally, and not exposed to consumers on their own compile classpath.
    implementation 'com.google.guava:guava:21.0'
    api 'org.springframework.boot:spring-boot-starter-web'
    api 'org.springframework.boot:spring-boot-starter'
    // Use JUnit test framework
    testImplementation 'junit:junit:4.12'
}
springBoot {
    mainClass = "SampleController"
}
sourceCompatibility = 1.8
targetCompatibility = 1.8
Command used:
In workspace/project/build/libs
java -jar project.jar
Error:
java : Exception in thread "main" java.lang.reflect.InvocationTargetException
At line:1 char:1
+ java -jar RSSFeedAggregator.jar
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (Exception in th...TargetException:String) [], RemoteException
+ FullyQualifiedErrorId : NativeCommandError
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:48)
at org.springframework.boot.loader.Launcher.launch(Launcher.java:87)
at org.springframework.boot.loader.Launcher.launch(Launcher.java:50)
at org.springframework.boot.loader.JarLauncher.main(JarLauncher.java:51)
Caused by: java.lang.NoClassDefFoundError: org/springframework/boot/SpringApplication
at SampleController.main(SampleController.java:18)
... 8 more
Caused by: java.lang.ClassNotFoundException: org.springframework.boot.SpringApplication
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at org.springframework.boot.loader.LaunchedURLClassLoader.loadClass(LaunchedURLClassLoader.java:94)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 9 more
I'm kind of a newbie when it comes to this. Does anyone have any ideas on how to resolve my problem?
I guess I found the solution.
From my understanding, when running directly in Eclipse, the Gradle api / implementation configurations automatically add the JARs at runtime.
But when creating the fat jar with :bootRepackage, runtime libs need to be explicitly added (the program ran fine when launched from Eclipse directly):
runtime 'org.springframework.boot:spring-boot-starter'
runtime 'org.springframework.boot:spring-boot-starter-web'
This resolved my problem.
I won't close the topic yet in case anyone has something to add on this issue.

NoSuchMethodError in spark submit

I submitted my jar with dependencies to Spark using spark-submit. In the main method of my jar I want to create an HttpAsyncClient instance and execute some requests (Apache HTTP async client library):
val httpClient = HttpAsyncClients.custom.setMaxConnTotal(10).setMaxConnPerRoute(10).build
httpClient.start()
httpClient.execute(new HttpGet("https://google.com"), new FutureCallback[HttpResponse] {
/* callbacks */
})
It throws exceptions:
Exception in thread "pool-1-thread-1" java.lang.NoSuchMethodError:
org.apache.http.util.Asserts.check(ZLjava/lang/String;Ljava/lang/Object;)V
at org.apache.http.impl.nio.reactor.AbstractMultiworkerIOReactor.execute(AbstractMultiworkerIOReactor.java:315)
at org.apache.http.impl.nio.conn.PoolingNHttpClientConnectionManager.execute(PoolingNHttpClientConnectionManager.java:191)
at org.apache.http.impl.nio.client.CloseableHttpAsyncClientBase$1.run(CloseableHttpAsyncClientBase.java:64)
at java.lang.Thread.run(Thread.java:745)
Exception in thread "main" java.lang.NoSuchMethodError:
org.apache.http.util.Asserts.check(ZLjava/lang/String;Ljava/lang/Object;)V
at org.apache.http.impl.nio.client.CloseableHttpAsyncClientBase.ensureRunning(CloseableHttpAsyncClientBase.java:90)
at org.apache.http.impl.nio.client.InternalHttpAsyncClient.execute(InternalHttpAsyncClient.java:123)
at org.apache.http.impl.nio.client.CloseableHttpAsyncClient.execute(CloseableHttpAsyncClient.java:74)
at org.apache.http.impl.nio.client.CloseableHttpAsyncClient.execute(CloseableHttpAsyncClient.java:107)
at org.apache.http.impl.nio.client.CloseableHttpAsyncClient.execute(CloseableHttpAsyncClient.java:91)
at spark.Application$.main(Application.scala:37)
at spark.Application.main(Application.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
It seems like there is no httpcore dependency in my jar, but I can call this method (org.apache.http.util.Asserts.check(ZLjava/lang/String;Ljava/lang/Object;)V) in code before or after the HTTP client creation and request:
org.apache.http.util.Asserts.check(true, "test", "test2") // it produces no exception
val httpClient = HttpAsyncClients.custom.setMaxConnTotal(10).setMaxConnPerRoute(10).build
httpClient.start()
httpClient.execute(new HttpGet("https://google.com"), new FutureCallback[HttpResponse] {
/* callbacks */
}) // it produces exception
Why do I get a NoSuchMethodError if I can call this method from the same classpath in code?
Apache HttpAsyncClient v4.1:
The solution here is to sync up with the version of httpcomponents httpcore that is on the immediate classpath of Spark. For Spark 1.6 that version is 4.0.1.
You will have to unzip the Spark jar to view the META-INF, etc. After some detective work things will be alright!
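One way to confirm a mismatch like this is to print which jar the conflicting class was actually loaded from. A small diagnostic sketch (standard JVM reflection, not from the original answer):
// Prints the jar that supplied the Asserts class at runtime; if it points
// at a Spark-bundled httpcore rather than your own, the NoSuchMethodError
// is explained. (getCodeSource can be null for bootstrap classes.)
println(Class.forName("org.apache.http.util.Asserts")
  .getProtectionDomain.getCodeSource.getLocation)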

ScalaTest DeferredAbortedSuite error when running simple tests

My original code had a lot more going on, which distracted me from the true cause of the problem. This captures the essential problem:
import org.scalatest.AsyncFlatSpec
import scala.concurrent.Future

class AsyncFlatSpecSpec extends AsyncFlatSpec {
  it should "parse an XML file" in {
    // ... Parsing ...
    Future.successful(succeed)
  }
  it should "parse an XML file" in {
    // ... Serializing ...
    Future.successful(succeed)
  }
}
This produced these errors:
[info] DeferredAbortedSuite:
[error] Uncaught exception when running AsyncFlatSpecSpec: java.lang.ArrayIndexOutOfBoundsException: 17
[trace] Stack trace suppressed: run last test:testOnly for the full output.
There is no array access happening anywhere in my code. What's going on?
Running "last test:testOnly" wasn't much help:
[info] DeferredAbortedSuite:
[error] Uncaught exception when running AsyncFlatSpecSpec: java.lang.ArrayIndexOutOfBoundsException: 17
sbt.ForkMain$ForkError: java.lang.ArrayIndexOutOfBoundsException: 17
at org.scalatest.exceptions.StackDepth$class.stackTraceElement(StackDepth.scala:63)
at org.scalatest.exceptions.StackDepth$class.failedCodeFileName(StackDepth.scala:77)
at org.scalatest.exceptions.StackDepthException.failedCodeFileName(StackDepthException.scala:36)
at org.scalatest.exceptions.StackDepth$class.failedCodeFileNameAndLineNumberString(StackDepth.scala:59)
at org.scalatest.exceptions.StackDepthException.failedCodeFileNameAndLineNumberString(StackDepthException.scala:36)
at org.scalatest.tools.StringReporter$.withPossibleLineNumber(StringReporter.scala:442)
at org.scalatest.tools.StringReporter$.stringsToPrintOnError(StringReporter.scala:916)
at org.scalatest.tools.StringReporter$.fragmentsForEvent(StringReporter.scala:747)
at org.scalatest.tools.Framework$SbtLogInfoReporter.apply(Framework.scala:622)
at org.scalatest.tools.FilterReporter.apply(FilterReporter.scala:41)
at org.scalatest.tools.SbtDispatchReporter$$anonfun$apply$1.apply(SbtDispatchReporter.scala:23)
at org.scalatest.tools.SbtDispatchReporter$$anonfun$apply$1.apply(SbtDispatchReporter.scala:23)
at scala.collection.Iterator$class.foreach(Iterator.scala:893)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at org.scalatest.tools.SbtDispatchReporter.apply(SbtDispatchReporter.scala:23)
at org.scalatest.tools.Framework$SbtReporter.apply(Framework.scala:1119)
at org.scalatest.tools.Framework.org$scalatest$tools$Framework$$runSuite(Framework.scala:387)
at org.scalatest.tools.Framework$ScalaTestTask.execute(Framework.scala:506)
at sbt.ForkMain$Run$2.call(ForkMain.java:296)
at sbt.ForkMain$Run$2.call(ForkMain.java:286)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Confused, I retreated to the non-Async version to see if that fared any better:
import org.scalatest.FlatSpec

class FlatSpecSpec extends FlatSpec {
  it should "parse an XML file" in {
    // ... Parsing ...
    succeed
  }
  it should "parse an XML file" in {
    // ... Serializing ...
    succeed
  }
}
It produced this different, but still cryptic error message:
[info] DeferredAbortedSuite:
[info] Exception encountered when attempting to run a suite with class name: org.scalatest.DeferredAbortedSuite *** ABORTED *** (20 milliseconds)
[info] Exception encountered when attempting to run a suite with class name: org.scalatest.DeferredAbortedSuite (AsyncFlatSpecSpec.scala:32)
[info] ScalaTest
For completeness, here are the related portions of my build.sbt:
scalaVersion := "2.11.8"
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.0-M15" % "test"
libraryDependencies += "org.scalactic" %% "scalactic" % "3.0.0-M15"
This was ultimately a trivial mistake on my part, but I wanted to post this for the sake of anyone else Googling these errors.
As many readers probably noticed while going through the examples, the problem was that I had copy/pasted the same test description.
That still compiles, but it fails at runtime with errors that don't identify the duplicate description as the culprit.
Stupid error on my part, but it would be nice if the compiler reported it in a more helpful way.
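For completeness, a corrected version of the async spec from above, with the second description changed so the two tests no longer collide (the new wording is mine, not from the original post):
import org.scalatest.AsyncFlatSpec
import scala.concurrent.Future

class AsyncFlatSpecSpec extends AsyncFlatSpec {
  it should "parse an XML file" in {
    Future.successful(succeed)
  }
  // a distinct description: duplicate descriptions abort the whole suite at runtime
  it should "serialize an XML file" in {
    Future.successful(succeed)
  }
}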

Running Scala tests in Intellij

I am trying to run Scala tests (specs2) in IntelliJ Community Edition 13.1.3. I am getting the following error:
Connected to the target VM, address: '127.0.0.1:57980', transport: 'socket'
'Start' method is not found in MyNotifierRunner null
Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
Disconnected from the target VM, address: '127.0.0.1:57980', transport: 'socket'
at org.jetbrains.plugins.scala.testingSupport.specs2.JavaSpecs2Runner.runSingleTest(JavaSpecs2Runner.java:123)
at org.jetbrains.plugins.scala.testingSupport.specs2.JavaSpecs2Runner.main(JavaSpecs2Runner.java:69)
Caused by: java.lang.NoSuchMethodError: org.specs2.matcher.MatchResult$.matchResultAsResult()Lorg/specs2/execute/AsResult;
at components.reports.ReportsDemographicsComponentTest$$anonfun$1.apply$mcV$sp(ReportsDemographicsComponentTest.scala:14)
at components.reports.ReportsDemographicsComponentTest$$anonfun$1.apply(ReportsDemographicsComponentTest.scala:13)
at components.reports.ReportsDemographicsComponentTest$$anonfun$1.apply(ReportsDemographicsComponentTest.scala:13)
at org.specs2.mutable.SideEffectingCreationPaths$$anonfun$executeBlock$1.apply$mcV$sp(FragmentsBuilder.scala:292)
at org.specs2.mutable.SideEffectingCreationPaths$class.replay(FragmentsBuilder.scala:264)
at org.specs2.mutable.Specification.replay(Specification.scala:12)
at org.specs2.mutable.FragmentsBuilder$class.fragments(FragmentsBuilder.scala:27)
at org.specs2.mutable.Specification.fragments(Specification.scala:12)
at org.specs2.mutable.SpecificationLike$class.is(Specification.scala:14)
at org.specs2.mutable.Specification.is(Specification.scala:12)
at org.specs2.specification.SpecificationStructure$$anonfun$content$1.apply(BaseSpecification.scala:56)
at org.specs2.specification.SpecificationStructure$$anonfun$content$1.apply(BaseSpecification.scala:56)
at org.specs2.specification.SpecificationStructure$class.map(BaseSpecification.scala:44)
at org.specs2.mutable.Specification.map(Specification.scala:12)
at org.specs2.specification.SpecificationStructure$class.content(BaseSpecification.scala:56)
at org.specs2.mutable.Specification.content$lzycompute(Specification.scala:12)
at org.specs2.mutable.Specification.content(Specification.scala:12)
at org.specs2.runner.ClassRunner$$anonfun$apply$1$$anonfun$apply$2.apply(ClassRunner.scala:54)
at org.specs2.runner.ClassRunner$$anonfun$apply$1$$anonfun$apply$2.apply(ClassRunner.scala:54)
at org.specs2.control.Exceptions$class.tryo(Exceptions.scala:32)
at org.specs2.control.Exceptions$.tryo(Exceptions.scala:109)
at org.specs2.runner.ClassRunner$$anonfun$apply$1.apply(ClassRunner.scala:54)
at org.specs2.runner.ClassRunner$$anonfun$apply$1.apply(ClassRunner.scala:53)
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:34)
at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:251)
at scala.collection.AbstractTraversable.flatMap(Traversable.scala:105)
at org.specs2.runner.ClassRunner.apply(ClassRunner.scala:53)
at org.specs2.runner.ClassRunner.start(ClassRunner.scala:31)
at org.specs2.runner.ClassRunner.main(ClassRunner.scala:24)
at org.specs2.runner.NotifierRunner.main(NotifierRunner.scala:24)
... 6 more
Process finished with exit code 1
Here is a piece of code that runs in sbt but fails in IntelliJ:
class ReportsDemographicsComponentTest extends Specification with ReportsComponents {
  "ReportsDemographicsComponent" should {
    s"return empty list of $DeviceStatistics for an inexistent deliveryId" in DBUnitTestsUtils(2) {
      accountId => implicit session =>
        val service = new ReportsDemographicsService(accountId)
        val res = service.deviceStatistics(-1)
        res.size mustEqual 0
    }
  }
}
I have tried restarting IntelliJ and sbt and cleaning the project, but with no success. When running the tests from the sbt command line, everything is OK.
In my case, closing IDEA and regenerating the project setup by issuing
sbt gen-idea
fixed it.
Note: you have to put this line into project/plugins.sbt to have the command available:
addSbtPlugin("com.github.mpeltonen" % "sbt-idea" % "1.6.0")