Test in Eclipse works but sbt throws MissingRequirementError: object scala.runtime in compiler mirror not found - scala

I am messing around with parsing and scala.tools.nsc.interactive.Global in Scala, and I ran into a problem while executing tests under sbt. The tests run fine from Eclipse, both with JUnitRunner and the ScalaTest plugin. After a long time spent on Google, I can't figure out how to fix this.
When I execute sbt test the following error is thrown:
Exception encountered when attempting to run a suite with class name: compileutils.CompileTest *** ABORTED ***
[info] java.lang.ExceptionInInitializerError:
[info] at compileutils.CompileTest$$anonfun$3.apply$mcV$sp(CompileTest.scala:18)
[info] at compileutils.CompileTest$$anonfun$3.apply(CompileTest.scala:16)
[info] at compileutils.CompileTest$$anonfun$3.apply(CompileTest.scala:16)
[info] at org.scalatest.Transformer$$anonfun$apply$1.apply(Transformer.scala:22)
[info] at org.scalatest.Transformer$$anonfun$apply$1.apply(Transformer.scala:22)
[info] at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
[info] at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
[info] at org.scalatest.Transformer.apply(Transformer.scala:22)
[info] at org.scalatest.Transformer.apply(Transformer.scala:20)
[info] at org.scalatest.FunSuiteLike$$anon$1.apply(FunSuiteLike.scala:158)
[info] ...
[info] Cause: scala.reflect.internal.MissingRequirementError: object scala.runtime in compiler mirror not found.
[info] at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
[info] at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
[info] at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
[info] at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:40)
[info] at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:61)
[info] at scala.reflect.internal.Mirrors$RootsBase.getPackage(Mirrors.scala:172)
[info] at scala.reflect.internal.Mirrors$RootsBase.getRequiredPackage(Mirrors.scala:175)
[info] at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage$lzycompute(Definitions.scala:183)
[info] at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage(Definitions.scala:183)
[info] at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass$lzycompute(Definitions.scala:184)
[info] ...
The class under test
package compileutils
import scala.tools.nsc.Settings
import scala.tools.nsc.interactive.Global
import scala.tools.nsc.reporters.ConsoleReporter
import scala.tools.nsc.interactive.Response
import scala.io.Source
import scala.reflect.internal.util.SourceFile
import scala.reflect.internal.util.BatchSourceFile
import scala.reflect.io.AbstractFile
import java.io.File
object Compile {
  val settings = new Settings
  val reporter = new ConsoleReporter(settings)
  val global = new Global(settings, reporter, "Study compile")

  def parse(source: String): Compile.this.global.Tree = {
    val sourceFile = new BatchSourceFile(".", source)
    global.askReload(List(sourceFile), new Response[Unit])
    global.parseTree(sourceFile)
  }

  def loadTypes(source: String): Either[Compile.this.global.Tree, Throwable] = {
    val sourceFile = new BatchSourceFile(".", source)
    val tResponse = new Response[global.Tree]
    global.askReload(List(sourceFile), new Response[Unit])
    global.askLoadedTyped(sourceFile, tResponse)
    tResponse.get
  }
}
The test
package compileutils
import org.scalatest.BeforeAndAfter
import org.junit.runner.RunWith
import org.scalatest.junit.JUnitRunner
import org.scalatest.FunSuite
import org.scalatest.Matchers._
@RunWith(classOf[JUnitRunner])
class CompileTest extends FunSuite with BeforeAndAfter {
  val testSource = "class FromString {val s = \"dsasdsad \"}"
  before {}
  after {}
  test("parse") {
    //when
    val tree = Compile.parse(testSource)
    //then
    tree should not be null
  }
  test("typer") {
    //when
    val typ = Compile.loadTypes(testSource)
    //then
    typ should be('left)
  }
}
build.sbt
name := "Compiler study"
version := "0.1"
val scalaBuildVersion = "2.10.3"
scalaVersion := scalaBuildVersion
libraryDependencies += "org.scala-lang" % "scala-compiler" % scalaBuildVersion
libraryDependencies += "org.scala-lang" % "scala-library" % scalaBuildVersion
libraryDependencies += "org.scala-lang" % "scala-reflect" % scalaBuildVersion
libraryDependencies += "org.scalatest" %% "scalatest" % "2.1.0" % "test"
libraryDependencies += "junit" % "junit" % "4.11" % "test"
Environment:
sbt launcher version 0.13.0
Scala compiler version 2.10.3 -- Copyright 2002-2013, LAMP/EPFL
javac 1.6.0_45
DISTRIB_ID=Ubuntu
DISTRIB_RELEASE=13.10
DISTRIB_CODENAME=saucy
DISTRIB_DESCRIPTION="Ubuntu 13.10"

It looks like the Scala library jar isn't on your classpath when running sbt - make sure to add scala-library.jar to your classpath before you run sbt.
Based on one of your comments, it looks like you're running on Windows. You might also be running into runtime jar access errors there if the classpath contains strange characters or spaces, or permission errors (e.g., if Eclipse is running under an admin account while sbt isn't).
Try reordering your dependency list to put scala-library ahead of scala-compiler. If that doesn't work, try the troubleshooting advice here.

The scala-library.jar was not missing from sbt's classpath but from Global's classpath. It had to be set in code.
After modifying the source to
val settings = new Settings
val scalaLibraryPath = "/home/csajka/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.10.3.jar"
settings.bootclasspath.append(scalaLibraryPath)
settings.classpath.append(scalaLibraryPath)
val reporter = new ConsoleReporter(settings)
val global = new Global(settings, reporter, "Study compile")
the problem disappeared.
Thanks for the tip @blueberryfields!
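For reference, hardcoding an ivy cache path is brittle. A hedged alternative sketch: scan the running JVM's own classpath for the scala-library jar (the ScalaLibLocator helper and the "scala-library" substring match are my assumptions, not part of the original answer); newer nsc versions also accept settings.usejavacp.value = true to reuse the JVM classpath wholesale.

```scala
// Sketch: locate the scala-library jar on the running JVM's classpath
// instead of hardcoding an ivy cache path. Helper name is assumed.
object ScalaLibLocator {
  def findScalaLibrary(classPath: String, sep: String): Option[String] =
    classPath.split(sep).find(_.contains("scala-library"))
}

// Usage against the current JVM; a found path can then be passed to
// settings.bootclasspath.append(...) / settings.classpath.append(...).
val libJar = ScalaLibLocator.findScalaLibrary(
  System.getProperty("java.class.path"), java.io.File.pathSeparator)
```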

Related

Issue in testing using scalatest

I just set up a new scala project with sbt in IntelliJ, and wrote the following basic class:
Person.scala:
package learning.functional
case class Person(
  name: String
)
Main.scala:
package learning.functional
import learning.functional.Person
object Main {
  val p = Person("John")
}
PersonTest.scala:
import learning.functional.Person
import org.scalatest.FunSuite
class PersonTest extends FunSuite {
  test("test person") {
    val p = Person("John")
    assert(p.name == "John")
  }
}
When I try to run sbt test, it throws the following error:
## Exception when compiling 1 sources to /Users/johndooley/Desktop/Scala/scala-learning/target/scala-2.13/test-classes
[error] java.lang.AssertionError: assertion failed:
[error] unexpected value engine in trait FunSuite final <expandedname> private[this]
[error] while compiling: /Users/johndooley/Desktop/Scala/scala-learning/src/test/scala/PersonTest.scala
What could be the reason for this? My build.sbt file:
ThisBuild / scalaVersion := "2.13.10"
libraryDependencies += "org.scalatest" % "scalatest_2.10" % "1.9.1"
lazy val root = (project in file("."))
  .settings(
    name := "scala-learning"
  )
I have tried invalidating caches and restarting, and doing clean, update, compile, but nothing works.
I solved this by modifying my dependency to the following:
libraryDependencies += "org.scalatest" %% "scalatest" % "3.2.14"
and completely reloading the sbt shell
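The root cause is the cross-version suffix: %% appends the project's Scala binary version to the artifact name, while the original % "scalatest_2.10" % "1.9.1" pinned an artifact built for Scala 2.10 onto a 2.13 project. A rough sketch of what %% effectively does (the crossArtifact helper is hypothetical, purely for illustration):

```scala
// Hypothetical helper mirroring sbt's %% naming behaviour: on a 2.13
// project, "org.scalatest" %% "scalatest" resolves artifact scalatest_2.13.
def crossArtifact(name: String, scalaBinaryVersion: String): String =
  s"${name}_$scalaBinaryVersion"

val resolved = crossArtifact("scalatest", "2.13") // "scalatest_2.13"
```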

How do I import a package defined in src/test directory from inside src/it [duplicate]

I would like to share a helper trait between my "test" and "it" configurations in SBT, but I have not figured out how.
Here is a minimal example:
project/Build.scala
import sbt._
import Keys._
object MyBuild extends Build {
  val scalaTest = "org.scalatest" %% "scalatest" % "2.0" % "test,it"
  lazy val myProject =
    Project(id = "my-project", base = file("."))
      .configs(IntegrationTest)
      .settings(Defaults.itSettings: _*)
      .settings(
        scalaVersion := "2.10.3",
        libraryDependencies ++= Seq(
          scalaTest
        )
      )
}
src/test/scala/Helpers.scala
trait Helper {
  def help() { println("helping.") }
}
src/test/scala/TestSuite.scala
import org.scalatest._
class TestSuite extends FlatSpec with Matchers with Helper {
  "My code" should "work" in {
    help()
    true should be(true)
  }
}
src/it/scala/ItSuite.scala
import org.scalatest._
class ItSuite extends FlatSpec with Matchers with Helper {
  "My code" should "work" in {
    help()
    true should be(true)
  }
}
then, in sbt, "test" works:
sbt> test
helping.
[info] TestSuite:
[info] My code
[info] - should work
[info] Run completed in 223 milliseconds.
[info] Total number of tests run: 1
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 1, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[success] Total time: 0 s, completed Dec 17, 2013 1:54:56 AM
but "it:test" doesn't compile:
sbt> it:test
[info] Compiling 1 Scala source to ./target/scala-2.10/it-classes...
[error] ./src/it/scala/ItSuite.scala:3: not found: type Helper
[error] class ItSuite extends FlatSpec with Matchers with Helper {
[error] ^
[error] ./src/it/scala/ItSuite.scala:5: not found: value help
[error] help()
[error] ^
[error] two errors found
[error] (it:compile) Compilation failed
[error] Total time: 1 s, completed Dec 17, 2013 1:55:00 AM
If you want to share code from the Test configuration, it's probably better to create a custom test configuration that extends Test. See the sbt documentation on custom test configurations.
Your project/Build.scala becomes:
import sbt._
import Keys._
object MyBuild extends Build {
  lazy val FunTest = config("fun") extend (Test)
  val scalaTest = "org.scalatest" %% "scalatest" % "2.0" % "test"
  lazy val myProject =
    Project(id = "my-project", base = file("."))
      .configs(FunTest)
      .settings(inConfig(FunTest)(Defaults.testSettings): _*)
      .settings(
        scalaVersion := "2.10.3",
        libraryDependencies ++= Seq(
          scalaTest
        )
      )
}
Also rename src/it/ to src/fun/. Now fun:test works:
> fun:test
helping.
[info] ItSuite:
[info] My code
[info] - should work
[info] Run completed in 245 milliseconds.
[info] Total number of tests run: 1
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 1, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[success] Total time: 1 s, completed Dec 17, 2013 8:43:17 AM
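For readers on newer sbt versions, where Build.scala project definitions are gone, the same idea can be sketched directly in build.sbt (a sketch assuming sbt 1.x syntax, not tested against the original project):

```scala
// build.sbt sketch of the same custom "fun" test configuration (assumed sbt 1.x)
lazy val FunTest = config("fun") extend Test

lazy val root = (project in file("."))
  .configs(FunTest)
  .settings(inConfig(FunTest)(Defaults.testSettings): _*)
  .settings(
    scalaVersion := "2.10.3",
    libraryDependencies += "org.scalatest" %% "scalatest" % "2.0" % "test"
  )
```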
You can redefine the IntegrationTest Configuration in your project to extend the Test configuration instead of the Runtime Configuration (the default). This will make everything in your test configuration available to your IntegrationTest configuration.
import sbt._
import Keys._
object MyBuild extends Build {
  val scalaTest = "org.scalatest" %% "scalatest" % "2.0" % "test,it"
  lazy val IntegrationTest = config("it") extend (Test)
  lazy val myProject =
    Project(id = "my-project", base = file("."))
      .configs(IntegrationTest)
      .settings(Defaults.itSettings: _*)
      .settings(
        scalaVersion := "2.10.3",
        libraryDependencies ++= Seq(
          scalaTest
        )
      )
}

Spark-cassandra-connector: toArray does not work

I am using the spark-cassandra-connector with Scala, and I want to read data from Cassandra and display it via the toArray method.
However, I get an error message that it is not a member of the class, even though it is indicated in the API. Could somebody help me find my error?
Here are my files:
build.sbt:
name := "Simple_Project"
version := "1.0"
scalaVersion := "2.11.8"
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case x => MergeStrategy.first
}
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0-preview"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.0.0-preview"
resolvers += "Spark Packages Repo" at "https://dl.bintray.com/spark-packages/maven"
libraryDependencies += "datastax" % "spark-cassandra-connector" % "2.0.0-M2-s_2.11"
SimpleScala.scala:
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.sql._
import org.apache.spark.sql.functions._
import com.datastax.spark.connector._
import com.datastax.spark.connector.rdd._
import org.apache.spark.sql.cassandra._
import org.apache.spark.sql.SQLContext
import com.datastax.spark.connector.cql.CassandraConnector._
object SimpleApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("Simple Application")
    conf.set("spark.cassandra.connection.host", "127.0.0.1")
    val sc = new SparkContext(conf)
    val rdd_2 = sc.cassandraTable("test_2", "words")
    rdd_2.toArray.foreach(println)
  }
}
CQL statements for cqlsh:
CREATE KEYSPACE test_2 WITH REPLICATION = {'class': 'SimpleStrategy', 'replication_factor': 1 };
CREATE TABLE test_2.words (word text PRIMARY KEY, count int);
INSERT INTO test_2.words (word, count) VALUES ('foo', 20);
INSERT INTO test_2.words (word, count) VALUES ('bar', 20);
Error message:
[info] Loading global plugins from /home/andi/.sbt/0.13/plugins
[info] Resolving org.scala-sbt.ivy#ivy;2.3.0-sbt-2cc8d2761242b072cedb0a04cb39435
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Loading project definition from /home/andi/test_spark/project
[info] Updating {file:/home/andi/test_spark/project/}test_spark-build...
[info] Resolving org.scala-sbt.ivy#ivy;2.3.0-sbt-2cc8d2761242b072cedb0a04cb39435
[info] Resolving org.fusesource.jansi#jansi;1.4 ...
[info] Done updating.
[info] Set current project to Simple_Project (in build file:/home/andi/test_spark/)
[info] Compiling 1 Scala source to /home/andi/test_spark/target/scala-2.11/classes...
[error] /home/andi/test_spark/src/main/scala/SimpleApp.scala:50: value toArray is not a member of com.datastax.spark.connector.rdd.CassandraTableScanRDD[com.datastax.spark.connector.CassandraRow]
[error] rdd_2.toArray.foreach(println)
[error] ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
Many thanks in advance,
Andi
The CassandraTableScanRDD.toArray method was deprecated and then removed in the 2.0.0 release of the Spark Cassandra Connector; it was last available in the 1.6.0 release. You can use the collect method instead.
Unfortunately, the Spark Cassandra Connector documentation still uses toArray. In any case, here is what will work:
rdd_2.collect.foreach(println)
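Note that collect, like the removed toArray, pulls the entire table onto the driver; for a quick look at a large table, RDD.take(n) fetches only the first n rows. The distinction, sketched with a plain Scala collection standing in for the RDD (the sample rows are invented):

```scala
// Stand-in for the Cassandra rows (hypothetical sample data).
val rows = Seq(("foo", 20), ("bar", 20))

val everything = rows         // analogous to rdd_2.collect: all rows on the driver
val preview    = rows.take(1) // analogous to rdd_2.take(1): only the first row
```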

SBT/Play2 multi-project setup does not include dependent projects in classpath in run/test

I have following SBT/Play2 multi-project setup:
import sbt._
import Keys._
import PlayProject._
object ApplicationBuild extends Build {
  val appName = "traveltime-api"
  val appVersion = "1.0"
  val appDependencies = Seq(
    // Google geocoding library
    "com.google.code.geocoder-java" % "geocoder-java" % "0.9",
    // Emailer
    "org.apache.commons" % "commons-email" % "1.2",
    // CSV generator
    "net.sf.opencsv" % "opencsv" % "2.0",
    "org.scalatest" %% "scalatest" % "1.7.2" % "test",
    "org.scalacheck" %% "scalacheck" % "1.10.0" % "test",
    "org.mockito" % "mockito-core" % "1.9.0" % "test"
  )
  val lib = RootProject(file("../lib"))
  val chiShape = RootProject(file("../chi-shape"))
  lazy val main = PlayProject(
    appName, appVersion, appDependencies, mainLang = SCALA
  ).settings(
    // Add your own project settings here
    resolvers ++= Seq(
      "Sonatype Snapshots" at
        "http://oss.sonatype.org/content/repositories/snapshots",
      "Sonatype Releases" at
        "http://oss.sonatype.org/content/repositories/releases"
    ),
    // Scalatest compatibility
    testOptions in Test := Nil
  ).aggregate(lib, chiShape).dependsOn(lib, chiShape)
}
As you can see, this project depends on two independent subprojects: lib and chiShape.
Now compile works fine - all sources are correctly compiled. However, if I try run or test, neither task has the subprojects' classes on its runtime classpath, and things go haywire with ClassNotFoundException.
For example, my application has to load serialized data from a file, and it goes like this: the test starts a FakeApplication, it tries to load the data, and boom:
[info] CsvGeneratorsTest:
[info] #markerFilterCsv
[info] - should fail on bad json *** FAILED ***
[info] java.lang.ClassNotFoundException: com.library.Node
[info] at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
[info] at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
[info] at java.security.AccessController.doPrivileged(Native Method)
[info] at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
[info] at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
[info] at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
[info] at java.lang.Class.forName0(Native Method)
[info] at java.lang.Class.forName(Class.java:264)
[info] at java.io.ObjectInputStream.resolveClass(ObjectInputStream.java:622)
[info] at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1593)
[info] ...
Strangely enough, stage creates a directory structure with chi-shapes_2.9.1-1.0.jar and lib_2.9.1-1.0.jar in staged/.
How can I get my run/test configurations to include the subprojects on the classpath?
Update:
I've added following code to Global#onStart:
override def onStart(app: Application) {
  println(app)
  ClassLoader.getSystemClassLoader.asInstanceOf[URLClassLoader].getURLs.
    foreach(println)
  throw new RuntimeException("foo!")
}
When I launch tests, the classpath is very, very ill-populated, to say the least :)
FakeApplication(.,sbt.classpath.ClasspathUtilities$$anon$1#182253a,List(),List(),Map(application.load-data -> test, mailer.smtp.test-mode -> true))
file:/home/arturas/Software/sdks/play-2.0.3/framework/sbt/sbt-launch.jar
[info] CsvGeneratorsTest:
When launching the staged app, there's a lot of stuff, as it's supposed to be :)
$ target/start
Play server process ID is 29045
play.api.Application#1c2862b
file:/home/arturas/work/traveltime-api/api/target/staged/jul-to-slf4j.jar
That's strange, because there should at least be the test jars on the classpath, I suppose?
It seems I've solved it.
The culprit was that ObjectInputStream ignores thread local class loaders by default and only uses the system class loader.
So I changed from:
def unserialize[T](file: File): T = {
  val in = new ObjectInputStream(new FileInputStream(file))
  try {
    in.readObject().asInstanceOf[T]
  } finally {
    in.close()
  }
}
To:
/**
 * Object input stream which respects the thread local class loader.
 *
 * The TL class loader is used by SBT to avoid polluting the system class
 * loader when running different tasks.
 */
class TLObjectInputStream(in: InputStream) extends ObjectInputStream(in) {
  override protected def resolveClass(desc: ObjectStreamClass): Class[_] = {
    Option(Thread.currentThread().getContextClassLoader).map { cl =>
      try { return cl.loadClass(desc.getName) }
      catch { case (e: java.lang.ClassNotFoundException) => () }
    }
    super.resolveClass(desc)
  }
}

def unserialize[T](file: File): T = {
  val in = new TLObjectInputStream(new FileInputStream(file))
  try {
    in.readObject().asInstanceOf[T]
  } finally {
    in.close()
  }
}
And my class not found problems went away!
Thanks to How to put custom ClassLoader to use? and http://tech-tauk.blogspot.com/2010/05/thread-context-classlaoder-in.html for useful insight into deserialization and thread local class loaders.
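As a quick sanity check, the answer's approach can be exercised round-trip in isolation. This is a sketch with a renamed class (CtxObjectInputStream, my name, not the answer's) and an in-memory byte array standing in for the file:

```scala
import java.io._

// Resolver that consults the thread's context class loader first,
// mirroring the TLObjectInputStream idea from the answer above.
class CtxObjectInputStream(in: InputStream) extends ObjectInputStream(in) {
  override protected def resolveClass(desc: ObjectStreamClass): Class[_] = {
    val ctx = Thread.currentThread().getContextClassLoader
    if (ctx != null) {
      try return ctx.loadClass(desc.getName)
      catch { case _: ClassNotFoundException => () }
    }
    super.resolveClass(desc)
  }
}

// Round trip: serialize a value to bytes, then read it back through the stream.
val bos = new ByteArrayOutputStream()
val out = new ObjectOutputStream(bos)
out.writeObject("hello")
out.close()

val read = new CtxObjectInputStream(new ByteArrayInputStream(bos.toByteArray))
  .readObject().asInstanceOf[String]
```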
This sounds similar to this bug https://play.lighthouseapp.com/projects/82401/tickets/659-play-dist-broken-with-sub-projects, though that bug is about dist and not test. I think the fix has not made it into the latest stable release, so try building Play from source (and don't forget to use aggregate and dependsOn as demonstrated in that link).
Alternatively, as a workaround, inside sbt, you can navigate to the sub-project with project lib and then type test. It's a bit manual, but you can script that if you'd like.

sbt compile yields "object casbah is not a member of package com.mongodb"

My directory structure:
-build.sbt
-src
--main
---scala
----MongoConnect.scala
-lib
My build.sbt:
name := "mongodb-experiments"
version := "0.1"
libraryDependencies ++= Seq(
  "com.mongodb.casbah" %% "casbah" % "3.0.0-SNAPSHOT"
)
resolvers += "Sonatype OSS Snapshots" at "https://oss.sonatype.org/content/repositories/snapshots"
My MongoConnect.scala program:
import com.mongodb.casbah.Imports._
object MongoConnect {
  def main(args: Array[String]) {
    println("Hello Mongo")
  }
}
Why does sbt compile result in "object casbah is not a member of package com.mongodb"?
sbt compile
[info] Set current project to mongodb-experiments (in build file:/Users/hrishikeshparanjape/git-public/mongodb-experiments/)
[info] Updating {file:/Users/hrishikeshparanjape/git-public/mongodb-experiments/}default-fc358e...
[info] Resolving org.scala-lang#scala-library;2.9.1 ...
[info] Resolving com.mongodb.casbah#casbah_2.9.1;3.0.0-SNAPSHOT ...
[info] Resolving com.mongodb.casbah#casbah-util_2.9.1;3.0.0-SNAPSHOT ...
[info] Resolving org.slf4j#slf4j-api;1.6.0 ...
[info] Resolving org.mongodb#mongo-java-driver;2.7.2 ...
[info] Resolving org.scalaj#scalaj-collection_2.9.1;1.2 ...
[info] Resolving org.scala-tools.time#time_2.8.0;0.2 ...
[info] Resolving joda-time#joda-time;1.6 ...
[info] Resolving com.mongodb.casbah#casbah-commons_2.9.1;3.0.0-SNAPSHOT ...
[info] Resolving com.mongodb.casbah#casbah-core_2.9.1;3.0.0-SNAPSHOT ...
[info] Resolving com.mongodb.casbah#casbah-query_2.9.1;3.0.0-SNAPSHOT ...
[info] Resolving com.mongodb.casbah#casbah-gridfs_2.9.1;3.0.0-SNAPSHOT ...
[info] Done updating.
[info] Compiling 1 Scala source to /Users/hrishikeshparanjape/git-public/mongodb-experiments/target/scala-2.9.1/classes...
[error] /Users/hrishikeshparanjape/git-public/mongodb-experiments/src/main/scala/MongoConnect.scala:1: object casbah is not a member of package com.mongodb
[error] import com.mongodb.casbah.Imports._
[error] ^
[error] one error found
[error] {file:/Users/hrishikeshparanjape/git-public/mongodb-experiments/}default-fc358e/compile:compile: Compilation failed
[error] Total time: 7 s, completed Jul 26, 2012 11:53:35 PM
Why do you use a snapshot repository with an old version of casbah?
libraryDependencies ++= Seq(
  "org.mongodb" %% "casbah" % "2.4.1"
)
resolvers += "typesafe" at "http://repo.typesafe.com/typesafe/releases/"
The %% sign in a dependency appends the Scala version configured in sbt to the artifact name.
For the 3.x version there is a milestone:
libraryDependencies ++= Seq(
  "org.mongodb" %% "casbah" % "3.0.0-M2"
)
And as I remember, in 3.x the import should be changed to:
import com.mongodb.casbah._
Modify your build.sbt file as:
name := "mongodb-experiments"
version := "0.1"
libraryDependencies ++= Seq(
  "com.mongodb.casbah" % "casbah_2.9.0" % "2.2.0-SNAPSHOT"
)
resolvers += "Sonatype OSS Snapshots" at "https://oss.sonatype.org/content/repositories/snapshots"
For some reason, 3.0.0 does not work.