Azure Databricks, could not initialize class org.apache.spark.eventhubs.EventHubsConf - scala

I'm pretty new to Scala and I'm trying to create a notebook to process data written to an Azure Event Hub. This is my code:
import org.apache.spark.eventhubs._

val connectionString = ConnectionStringBuilder("MY-CONNECTION-STRING")
  .setEventHubName("EVENT-HUB-NAME")
  .build
val eventHubsConf = EventHubsConf(connectionString)
  .setStartingPosition(EventPosition.fromEndOfStream)
val eventhubs = spark.readStream
  .format("eventhubs")
  .options(eventHubsConf.toMap)
  .load()
And I get the following error: java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.eventhubs.EventHubsConf$
Cluster Configuration:
Databricks Runtime Version: 7.0 (includes Apache Spark 3.0.0, Scala 2.12)
Driver & Worker Type: 14.0 GB Memory, 4 Cores, 0.75 DBU
Standard_DS3_v2
I have installed the following library:
com.microsoft.azure:azure-eventhubs-spark_2.11:2.3.17
[screenshot: cluster libraries]
The other JAR installed is to resolve a problem with Logging
The code crashes as soon as I try to create eventHubsConf.
Complete traceback:
java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.eventhubs.EventHubsConf$
at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-2632683088190841:7)
at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-2632683088190841:70)
at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-2632683088190841:72)
at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$$iw$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-2632683088190841:74)
at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-2632683088190841:76)
at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$$iw$$iw$$iw$$iw$$iw.<init>(command-2632683088190841:78)
at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$$iw$$iw$$iw$$iw.<init>(command-2632683088190841:80)
at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$$iw$$iw$$iw.<init>(command-2632683088190841:82)
at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$$iw$$iw.<init>(command-2632683088190841:84)
at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$$iw.<init>(command-2632683088190841:86)
at line14a6ae940dd14957b7172a4cf8f6cdd348.$read.<init>(command-2632683088190841:88)
at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$.<init>(command-2632683088190841:92)
at line14a6ae940dd14957b7172a4cf8f6cdd348.$read$.<clinit>(command-2632683088190841)
at line14a6ae940dd14957b7172a4cf8f6cdd348.$eval$.$print$lzycompute(<notebook>:7)
at line14a6ae940dd14957b7172a4cf8f6cdd348.$eval$.$print(<notebook>:6)
at line14a6ae940dd14957b7172a4cf8f6cdd348.$eval.$print(<notebook>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:745)
at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1021)
at scala.tools.nsc.interpreter.IMain.$anonfun$interpret$1(IMain.scala:574)
at scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:41)
at scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:37)
at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:41)
at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:573)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:600)
at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:570)
at com.databricks.backend.daemon.driver.DriverILoop.execute(DriverILoop.scala:215)
at com.databricks.backend.daemon.driver.ScalaDriverLocal.$anonfun$repl$1(ScalaDriverLocal.scala:202)
at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
at com.databricks.backend.daemon.driver.DriverLocal$TrapExitInternal$.trapExit(DriverLocal.scala:714)
at com.databricks.backend.daemon.driver.DriverLocal$TrapExit$.apply(DriverLocal.scala:667)
at com.databricks.backend.daemon.driver.ScalaDriverLocal.repl(ScalaDriverLocal.scala:202)
at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$10(DriverLocal.scala:396)
at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:238)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:233)
at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:230)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:49)
at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:275)
at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:268)
at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:49)
at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:373)
at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$1(DriverWrapper.scala:653)
at scala.util.Try$.apply(Try.scala:213)
at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:645)
at com.databricks.backend.daemon.driver.DriverWrapper.getCommandOutputAndError(DriverWrapper.scala:486)
at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:598)
at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:391)
at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:337)
at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:219)
at java.lang.Thread.run(Thread.java:748)

It seems that your runtime includes Scala 2.12, but the package you installed was built for Scala 2.11. Try installing this one instead:
com.microsoft.azure:azure-eventhubs-spark_2.12:2.3.17
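If you want to confirm which Scala and Spark versions the cluster is actually running before swapping the library, a quick sanity check you can run in a notebook cell (a minimal sketch; spark is the SparkSession that Databricks notebooks provide):

// Sanity check: print the Scala and Spark versions of the running cluster.
println(scala.util.Properties.versionString) // e.g. "version 2.12.10"
println(spark.version)                       // e.g. "3.0.0"

The suffix of the library coordinate (_2.11 vs _2.12) must match the first output.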

Related

Intermittent org.apache.spark.SparkException: Could not find spark-version-info.properties in K8s

My application uses a K8s CronJob to schedule runs, creating a pod for each occurrence.
In most cases the application runs fine, but sometimes it fails with the following error:
java.lang.ExceptionInInitializerError
2022-12-23T10:45:32.899555393Z at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
2022-12-23T10:45:32.899572625Z at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
2022-12-23T10:45:32.899576309Z at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
2022-12-23T10:45:32.899590250Z at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
2022-12-23T10:45:32.899601166Z at java.base/java.util.concurrent.ForkJoinTask.getThrowableException(ForkJoinTask.java:600)
2022-12-23T10:45:32.899604720Z at java.base/java.util.concurrent.ForkJoinTask.reportException(ForkJoinTask.java:678)
2022-12-23T10:45:32.899608169Z at java.base/java.util.concurrent.ForkJoinTask.invoke(ForkJoinTask.java:737)
2022-12-23T10:45:32.899610934Z at java.base/java.util.stream.ForEachOps$ForEachOp.evaluateParallel(ForEachOps.java:159)
2022-12-23T10:45:32.899613557Z at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateParallel(ForEachOps.java:173)
2022-12-23T10:45:32.899629628Z at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:233)
2022-12-23T10:45:32.899633296Z at java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:497)
2022-12-23T10:45:32.899637988Z at java.base/java.util.stream.ReferencePipeline$Head.forEach(ReferencePipeline.java:661)
[...]
2022-12-23T10:45:32.899959490Z Caused by: java.lang.ExceptionInInitializerError
2022-12-23T10:45:32.899972035Z at org.apache.spark.package$.<init>(package.scala:93)
2022-12-23T10:45:32.899985422Z at org.apache.spark.package$.<clinit>(package.scala)
2022-12-23T10:45:32.899990812Z at org.apache.spark.SparkContext.$anonfun$new$1(SparkContext.scala:183)
2022-12-23T10:45:32.899996770Z at org.apache.spark.internal.Logging.logInfo(Logging.scala:54)
2022-12-23T10:45:32.899998834Z at org.apache.spark.internal.Logging.logInfo$(Logging.scala:53)
2022-12-23T10:45:32.900029297Z at org.apache.spark.SparkContext.logInfo(SparkContext.scala:73)
2022-12-23T10:45:32.900031985Z at org.apache.spark.SparkContext.<init>(SparkContext.scala:183)
2022-12-23T10:45:32.900033999Z at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2526)
2022-12-23T10:45:32.900049879Z at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$1(SparkSession.scala:930)
2022-12-23T10:45:32.900055477Z at scala.Option.getOrElse(Option.scala:189)
2022-12-23T10:45:32.900061375Z at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:921)
[...]
2022-12-23T10:45:32.900111374Z at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:183)
2022-12-23T10:45:32.900134359Z at java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1655)
2022-12-23T10:45:32.900137775Z at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484)
2022-12-23T10:45:32.900145849Z at java.base/java.util.stream.ForEachOps$ForEachTask.compute(ForEachOps.java:290)
2022-12-23T10:45:32.900159046Z at java.base/java.util.concurrent.CountedCompleter.exec(CountedCompleter.java:746)
2022-12-23T10:45:32.900178955Z at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
2022-12-23T10:45:32.900183168Z at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
2022-12-23T10:45:32.900206355Z at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
2022-12-23T10:45:32.900214981Z at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
2022-12-23T10:45:32.900227137Z at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
2022-12-23T10:45:32.900377715Z Caused by: org.apache.spark.SparkException: Could not find spark-version-info.properties
2022-12-23T10:45:32.900381131Z at org.apache.spark.package$SparkBuildInfo$.<init>(package.scala:62)
2022-12-23T10:45:32.900396317Z at org.apache.spark.package$SparkBuildInfo$.<clinit>(package.scala)
2022-12-23T10:45:32.900399396Z ... 25 more
How the Spark session is being built:
lazy val spark: SparkSession = SparkSession.builder
  .appName("My application")
  .master("local") // Error occurs here
  .getOrCreate()
Spark version: 2.4.8
Scala version: 2.12
Has anyone had a similar problem?
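One observation from the trace: SparkContext is being initialized from a ForkJoinPool worker thread driving a parallel stream, and such common-pool threads may not carry a context classloader that can see spark-version-info.properties inside the application jar, which would explain why it only fails intermittently. A hedged workaround, assuming that diagnosis (a sketch, not a confirmed fix), is to force the lazy session to initialize on the main thread before any parallel work starts:

import org.apache.spark.sql.SparkSession

object Main {
  // Same lazy val as in the question: the first access triggers SparkContext creation.
  lazy val spark: SparkSession = SparkSession.builder
    .appName("My application")
    .master("local")
    .getOrCreate()

  def main(args: Array[String]): Unit = {
    // Touch the lazy val on the main thread, before any parallel stream runs,
    // so initialization happens with the application classloader in context.
    val session = spark
    println(session.version)
    // ... kick off the parallel-stream work only after this point
  }
}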

NoSuchMethodError in spark submit

I submitted my jar with dependencies to Spark using spark-submit. In the main method of my jar I want to create an HttpAsyncClients instance and execute a request (Apache HTTP Async Client library):
val httpClient = HttpAsyncClients.custom.setMaxConnTotal(10).setMaxConnPerRoute(10).build
httpClient.start()
httpClient.execute(new HttpGet("https://google.com"), new FutureCallback[HttpResponse] {
  /* callbacks */
})
It throws exceptions:
Exception in thread "pool-1-thread-1" java.lang.NoSuchMethodError:
org.apache.http.util.Asserts.check(ZLjava/lang/String;Ljava/lang/Object;)V
at org.apache.http.impl.nio.reactor.AbstractMultiworkerIOReactor.execute(AbstractMultiworkerIOReactor.java:315)
at org.apache.http.impl.nio.conn.PoolingNHttpClientConnectionManager.execute(PoolingNHttpClientConnectionManager.java:191)
at org.apache.http.impl.nio.client.CloseableHttpAsyncClientBase$1.run(CloseableHttpAsyncClientBase.java:64)
at java.lang.Thread.run(Thread.java:745) Exception in thread "main" java.lang.NoSuchMethodError:
org.apache.http.util.Asserts.check(ZLjava/lang/String;Ljava/lang/Object;)V
at org.apache.http.impl.nio.client.CloseableHttpAsyncClientBase.ensureRunning(CloseableHttpAsyncClientBase.java:90)
at org.apache.http.impl.nio.client.InternalHttpAsyncClient.execute(InternalHttpAsyncClient.java:123)
at org.apache.http.impl.nio.client.CloseableHttpAsyncClient.execute(CloseableHttpAsyncClient.java:74)
at org.apache.http.impl.nio.client.CloseableHttpAsyncClient.execute(CloseableHttpAsyncClient.java:107)
at org.apache.http.impl.nio.client.CloseableHttpAsyncClient.execute(CloseableHttpAsyncClient.java:91)
at spark.Application$.main(Application.scala:37)
at spark.Application.main(Application.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
It seems like the http-core dependency is missing from my jar, but I can call this method (org.apache.http.util.Asserts.check(ZLjava/lang/String;Ljava/lang/Object;)V) in code before or after the HTTP client creation and request:
org.apache.http.util.Asserts.check(true, "test", "test2") // produces no exception
val httpClient = HttpAsyncClients.custom.setMaxConnTotal(10).setMaxConnPerRoute(10).build
httpClient.start()
httpClient.execute(new HttpGet("https://google.com"), new FutureCallback[HttpResponse] {
  /* callbacks */
}) // produces the exception
Why do I get a NoSuchMethodError if I can call this method from the same classpath?
Apache httpasyncclient v4.1
The solution here is to sync up with the version of httpcomponents httpcore that is on Spark's immediate classpath. For Spark 1.6 that version is 4.0.1.
You will have to unzip the Spark jar to inspect its META-INF and see which versions it bundles. After some detective work things will be alright!
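If the project builds with sbt, a minimal sketch of that fix is to pin httpcore to the version the answer names (4.0.1 for Spark 1.6); the coordinates below are the standard httpcomponents ones, but double-check the version against what your Spark distribution actually bundles:

// build.sbt -- pin httpcore to the version Spark puts on the classpath,
// so the classes loaded at runtime match what the async client expects
dependencyOverrides += "org.apache.httpcomponents" % "httpcore" % "4.0.1"

The alternative, if you genuinely need the newer httpcore, is to shade the org.apache.http packages into your fat jar (e.g. with sbt-assembly's ShadeRule) so your copy cannot clash with Spark's.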

Spark-Shell error: "spark.dynamicAllocation.{min/max}Executors must be set"

I am trying to start spark-shell after setting up Spark 1.2.1 on the Cloudera QuickStart VM, and I am getting the error below. I would appreciate any help in resolving this issue. The error log follows:
16/03/03 09:40:37 INFO EventLoggingListener: Logging events to hdfs://quickstart.cloudera:8020/user/spark/applicationHistory/local-1457026830824
org.apache.spark.SparkException: spark.dynamicAllocation.{min/max}Executors must be set!
at org.apache.spark.ExecutorAllocationManager.validateSettings(ExecutorAllocationManager.scala:135)
at org.apache.spark.ExecutorAllocationManager.<init>(ExecutorAllocationManager.scala:98)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:377)
at org.apache.spark.repl.SparkILoop.createSparkContext(SparkILoop.scala:986)
at $iwC$$iwC.<init>(<console>:9)
at $iwC.<init>(<console>:18)
at <init>(<console>:20)
at .<init>(<console>:24)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:852)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1125)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:674)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:705)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:669)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:828)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:873)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:785)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:123)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:122)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:270)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:122)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:60)
at org.apache.spark.repl.SparkILoop$$anonfun$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:147)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:60)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:106)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:60)
at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply$mcZ$sp(SparkILoop.scala:962)
at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
at org.apache.spark.repl.SparkILoop$$anonfun$process$1.apply(SparkILoop.scala:916)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:916)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1011)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
scala>
The exception is pretty clear. It seems that you've set the spark.dynamicAllocation.enabled property to true, but failed to set spark.dynamicAllocation.minExecutors and spark.dynamicAllocation.maxExecutors. The Spark 1.2.1 documentation clearly states this (from the spark.dynamicAllocation.enabled description, emphasis mine):
This requires the following configurations to be set:
spark.dynamicAllocation.minExecutors,
spark.dynamicAllocation.maxExecutors, and
spark.shuffle.service.enabled
If you look at the 1.2 branch of Spark, you'll see that if you don't specify those values, they default to -1:
// Lower and upper bounds on the number of executors. These are required.
private val minNumExecutors = conf.getInt("spark.dynamicAllocation.minExecutors", -1)
private val maxNumExecutors = conf.getInt("spark.dynamicAllocation.maxExecutors", -1)
This behavior has changed. If you look at the updated 1.6 branch of Spark, you'll see that they default to 0 and Integer.MAX_VALUE, respectively:
// Lower and upper bounds on the number of executors.
private val minNumExecutors = conf.getInt("spark.dynamicAllocation.minExecutors", 0)
private val maxNumExecutors = conf.getInt("spark.dynamicAllocation.maxExecutors",
Integer.MAX_VALUE)
This simply means you need to add these settings either to the SparkConf, or to any other configuration you provide to spark-shell:
val sparkConf = new SparkConf()
  .set("spark.dynamicAllocation.minExecutors", minExecutors) // e.g. "1"
  .set("spark.dynamicAllocation.maxExecutors", maxExecutors) // e.g. "10"

Running Scala tests in Intellij

I am trying to run Scala tests (specs2) in IntelliJ Community Edition 13.1.3. I am getting the following error:
Connected to the target VM, address: '127.0.0.1:57980', transport: 'socket'
'Start' method is not found in MyNotifierRunner null
Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
Disconnected from the target VM, address: '127.0.0.1:57980', transport: 'socket'
at org.jetbrains.plugins.scala.testingSupport.specs2.JavaSpecs2Runner.runSingleTest(JavaSpecs2Runner.java:123)
at org.jetbrains.plugins.scala.testingSupport.specs2.JavaSpecs2Runner.main(JavaSpecs2Runner.java:69)
Caused by: java.lang.NoSuchMethodError: org.specs2.matcher.MatchResult$.matchResultAsResult()Lorg/specs2/execute/AsResult;
at components.reports.ReportsDemographicsComponentTest$$anonfun$1.apply$mcV$sp(ReportsDemographicsComponentTest.scala:14)
at components.reports.ReportsDemographicsComponentTest$$anonfun$1.apply(ReportsDemographicsComponentTest.scala:13)
at components.reports.ReportsDemographicsComponentTest$$anonfun$1.apply(ReportsDemographicsComponentTest.scala:13)
at org.specs2.mutable.SideEffectingCreationPaths$$anonfun$executeBlock$1.apply$mcV$sp(FragmentsBuilder.scala:292)
at org.specs2.mutable.SideEffectingCreationPaths$class.replay(FragmentsBuilder.scala:264)
at org.specs2.mutable.Specification.replay(Specification.scala:12)
at org.specs2.mutable.FragmentsBuilder$class.fragments(FragmentsBuilder.scala:27)
at org.specs2.mutable.Specification.fragments(Specification.scala:12)
at org.specs2.mutable.SpecificationLike$class.is(Specification.scala:14)
at org.specs2.mutable.Specification.is(Specification.scala:12)
at org.specs2.specification.SpecificationStructure$$anonfun$content$1.apply(BaseSpecification.scala:56)
at org.specs2.specification.SpecificationStructure$$anonfun$content$1.apply(BaseSpecification.scala:56)
at org.specs2.specification.SpecificationStructure$class.map(BaseSpecification.scala:44)
at org.specs2.mutable.Specification.map(Specification.scala:12)
at org.specs2.specification.SpecificationStructure$class.content(BaseSpecification.scala:56)
at org.specs2.mutable.Specification.content$lzycompute(Specification.scala:12)
at org.specs2.mutable.Specification.content(Specification.scala:12)
at org.specs2.runner.ClassRunner$$anonfun$apply$1$$anonfun$apply$2.apply(ClassRunner.scala:54)
at org.specs2.runner.ClassRunner$$anonfun$apply$1$$anonfun$apply$2.apply(ClassRunner.scala:54)
at org.specs2.control.Exceptions$class.tryo(Exceptions.scala:32)
at org.specs2.control.Exceptions$.tryo(Exceptions.scala:109)
at org.specs2.runner.ClassRunner$$anonfun$apply$1.apply(ClassRunner.scala:54)
at org.specs2.runner.ClassRunner$$anonfun$apply$1.apply(ClassRunner.scala:53)
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:251)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.WrappedArray.foreach(WrappedArray.scala:34)
at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:251)
at scala.collection.AbstractTraversable.flatMap(Traversable.scala:105)
at org.specs2.runner.ClassRunner.apply(ClassRunner.scala:53)
at org.specs2.runner.ClassRunner.start(ClassRunner.scala:31)
at org.specs2.runner.ClassRunner.main(ClassRunner.scala:24)
at org.specs2.runner.NotifierRunner.main(NotifierRunner.scala:24)
... 6 more
Process finished with exit code 1
Here is a piece of code that runs in sbt but fails in IntelliJ:
class ReportsDemographicsComponentTest extends Specification with ReportsComponents {
  "ReportsDemographicsComponent" should {
    s"return empty list of $DeviceStatistics for an inexistent deliveryId" in DBUnitTestsUtils(2) {
      accountId => implicit session =>
        val service = new ReportsDemographicsService(accountId)
        val res = service.deviceStatistics(-1)
        res.size mustEqual 0
    }
  }
}
I have tried restarting IntelliJ and sbt and cleaning the project, but to no success. When running the tests from the sbt command line, everything is OK.
In my case, closing IDEA and regenerating the project setup fixed it:
sbt gen-idea
Note: you have to put this line into project/plugins.sbt for the command to be available:
addSbtPlugin("com.github.mpeltonen" % "sbt-idea" % "1.6.0")

Using Google Caliper with Scala

I am trying to use Caliper with Scala (2.10) in Eclipse Juno (4.2). To start, I've set up a benchmark that simply iterates a foreach loop over an array.
import com.google.caliper.Param
import com.google.caliper.SimpleBenchmark

class Benchmark extends SimpleBenchmark {
  @Param(Array("10", "100", "1000", "10000"))
  val length: Int = 0

  var array: Array[Int] = _

  override def setUp() {
    array = new Array(length)
  }

  def timeForeach(reps: Int) = {
    var result = 0
    array.foreach {
      result += _
    }
    result
  }
}
When I start the benchmark with:
import com.google.caliper.Runner

object myRunner {
  def main(args: Array[String]) {
    Runner.main(classOf[Benchmark], args)
  }
}
I get these exceptions that I don't understand:
0% Scenario{vm=java, trial=0, benchmark=Foreach, length=10} Failed to execute java -cp C:\Users\bob\workspace\myBenchmark\bin;C:\Users\bob\workspace\caliper\caliper\target\classes;C:\Users\bob\workspace\caliper\caliper\target\test-classes;C:\Users\bob\.m2\repository\com\google\code\findbugs\jsr305\1.3.9\jsr305-1.3.9.jar;C:\Users\bob\.m2\repository\com\google\code\gson\gson\1.7.1\gson-1.7.1.jar;C:\Users\bob\.m2\repository\com\google\guava\guava\11.0.1\guava-11.0.1.jar;C:\Users\bob\.m2\repository\com\google\code\java-allocation-instrumenter\java-allocation-instrumenter\2.0\java-allocation-instrumenter-2.0.jar;C:\Users\bob\.m2\repository\asm\asm\3.3.1\asm-3.3.1.jar;C:\Users\bob\.m2\repository\asm\asm-analysis\3.3.1\asm-analysis-3.3.1.jar;C:\Users\bob\.m2\repository\asm\asm-commons\3.3.1\asm-commons-3.3.1.jar;C:\Users\bob\.m2\repository\asm\asm-tree\3.3.1\asm-tree-3.3.1.jar;C:\Users\bob\.m2\repository\asm\asm-util\3.3.1\asm-util-3.3.1.jar;C:\Users\bob\.m2\repository\asm\asm-xml\3.3.1\asm-xml-3.3.1.jar;C:\Users\bob\.m2\repository\junit\junit\3.8.2\junit-3.8.2.jar com.google.caliper.InProcessRunner --warmupMillis 3000 --runMillis 1000 --measurementType TIME --marker //ZxJ/ -Dbenchmark=Foreach -Dlength=10 org.example.Benchmark
Exception in thread "main" java.lang.NoClassDefFoundError: scala/Function1
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Unknown Source)
at com.google.caliper.ScenarioSelection.getClassByName(ScenarioSelection.java:154)
at com.google.caliper.ScenarioSelection.prepareSuite(ScenarioSelection.java:123)
at com.google.caliper.ScenarioSelection.select(ScenarioSelection.java:83)
at com.google.caliper.InProcessRunner.run(InProcessRunner.java:38)
at com.google.caliper.InProcessRunner.main(InProcessRunner.java:103)
Caused by: java.lang.ClassNotFoundException: scala.Function1
at java.net.URLClassLoader$1.run(Unknown Source)
at java.net.URLClassLoader$1.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
... 7 more
An exception was thrown from the benchmark code.
com.google.caliper.ConfigurationException: Failed to execute java -cp C:\Users\bob\workspace\myBenchmark\bin;C:\Users\bob\workspace\caliper\caliper\target\classes;C:\Users\bob\workspace\caliper\caliper\target\test-classes;C:\Users\bob\.m2\repository\com\google\code\findbugs\jsr305\1.3.9\jsr305-1.3.9.jar;C:\Users\bob\.m2\repository\com\google\code\gson\gson\1.7.1\gson-1.7.1.jar;C:\Users\bob\.m2\repository\com\google\guava\guava\11.0.1\guava-11.0.1.jar;C:\Users\bob\.m2\repository\com\google\code\java-allocation-instrumenter\java-allocation-instrumenter\2.0\java-allocation-instrumenter-2.0.jar;C:\Users\bob\.m2\repository\asm\asm\3.3.1\asm-3.3.1.jar;C:\Users\bob\.m2\repository\asm\asm-analysis\3.3.1\asm-analysis-3.3.1.jar;C:\Users\bob\.m2\repository\asm\asm-commons\3.3.1\asm-commons-3.3.1.jar;C:\Users\bob\.m2\repository\asm\asm-tree\3.3.1\asm-tree-3.3.1.jar;C:\Users\bob\.m2\repository\asm\asm-util\3.3.1\asm-util-3.3.1.jar;C:\Users\bob\.m2\repository\asm\asm-xml\3.3.1\asm-xml-3.3.1.jar;C:\Users\bob\.m2\repository\junit\junit\3.8.2\junit-3.8.2.jar com.google.caliper.InProcessRunner --warmupMillis 3000 --runMillis 1000 --measurementType TIME --marker //ZxJ/ -Dbenchmark=Foreach -Dlength=10 org.example.Benchmark
at com.google.caliper.Runner.measure(Runner.java:309)
at com.google.caliper.Runner.runScenario(Runner.java:229)
at com.google.caliper.Runner.runOutOfProcess(Runner.java:378)
at com.google.caliper.Runner.run(Runner.java:97)
at com.google.caliper.Runner.main(Runner.java:423)
at com.google.caliper.Runner.main(Runner.java:436)
at org.example.myRunner$.main(myRunner.scala:7)
at org.example.myRunner.main(myRunner.scala)
I think I have some issues with the classpath, but I am not sure.
I hope someone can help me :)
Thanks in advance,
Davram Bashere
It looks very much as though Caliper is running a new JVM and doesn't know that it needs to include the Scala libraries on the classpath.
This question describes how to run a Scala app with the java command on the command line. It should be a good starting point for solving this problem.
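For reference, running a Scala program with the plain java launcher just means putting scala-library.jar on the classpath yourself; the paths below are illustrative, not taken from the question:

java -cp "C:\scala-2.10\lib\scala-library.jar;C:\Users\bob\workspace\myBenchmark\bin" org.example.myRunner

The same idea applies to the child JVM Caliper spawns: whatever classpath it launches with must include the Scala runtime.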
Even if you are editing Scala sources in Eclipse, you may still use sbt to run your code. sbt is a great tool for managing your project's classpath, and its plugin system supports features like running Caliper benchmarks. I recently worked on a project where I needed just that and factored the result out into a published sbt plugin, which may be of some help to you as well.
Thanks for your help. The problem was, as expected, the stupid classpath. I downloaded the Caliper project from Google Code and added the Scala dependency. Now I've built my own jar-with-dependencies and it works quite fine.
Thanks for the answers anyway :-)