I have a scala project which uses sbt for building/testing/etc. I have a file with unit tests in:
$SBT_PROJECT_ROOT/src/test/scala/foo/bar/SomeSpec.scala
I also have a test resource in:
$SBT_PROJECT_ROOT/src/test/resources/some_test_resource.txt
I attempt to access this file from the unit tests with:
import org.scalatest._
import scala.io.Source

class TestFiddleParser extends FlatSpec with Matchers {

  "This unit test" should "find the test resource" in {
    val source = Source.fromURL(getClass.getResource("/some_test_resource.txt"))
    val content = source.mkString
    println(content.take(1000))
  }
}
When I am on the command line in the $SBT_PROJECT_ROOT folder and run the command:
sbt test
I can see the first 1000 characters of the test file being printed. Success!
Now I am using Eclipse (Scala-IDE) for development. I have Eclipse support through sbteclipse (https://github.com/typesafehub/sbteclipse) and I run unit tests from within Eclipse using the ScalaTest Eclipse plugin (http://www.scalatest.org/user_guide/using_scalatest_with_eclipse).
When I run this unit test from within Eclipse, the val source is null, which I believe means the resource was not found.
What could be the problem?
OK, it was actually quite obvious: just run
sbt eclipse
again AFTER adding the test resource to the src/test/resources folder.
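If the resource directories still do not end up on the Eclipse classpath after regenerating the project files, sbteclipse can be told to emit classpath entries for them explicitly. A minimal sketch, assuming a sbteclipse version that supports EclipseCreateSrc.Resource, added to build.sbt:

// ask sbteclipse to generate Eclipse classpath entries for resource
// directories (src/main/resources and src/test/resources) as well
EclipseKeys.createSrc := EclipseCreateSrc.Default + EclipseCreateSrc.Resource

Then run sbt eclipse once more and refresh the project in Eclipse.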
Related
I have configured a Scala project for integration tests as described in https://www.scala-sbt.org/1.x/docs/Testing.html.
I can get the integration tests executed from sbt, but not from IntelliJ. IntelliJ does not seem to recognize that my test class is a test class. The same class works fine under test (src/test/scala), but is not recognized as a test when it is under it (src/it/scala).
import org.scalatest.flatspec.AnyFlatSpec
import org.scalatest.matchers.should.Matchers

class BlackSpec extends AnyFlatSpec with Matchers {

  behavior of "Black.adder"

  it should "return 42 for any inputs" in {
    val six = Black.adder(1, 2)
    six shouldBe 42
  }
}
When I put the above test under it (src/it/scala) and execute it from IntelliJ, I get: "Passed: Total 0, Failed 0, Errors 0, Passed 0".
This problem happens in a large (proprietary) project. A minimal example project works fine everywhere, including in IntelliJ: https://gist.github.com/radumanolescu/601b0c743c73b42396b4f6ca9d5fc1db
In the large project, we probably have a setting that interferes with recognizing my class as an integration test that should be run with ScalaTest. Any idea what that setting might be?
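For reference, the integration-test configuration described in the linked sbt documentation amounts to roughly the following build.sbt fragment (a sketch; the ScalaTest version is illustrative):

lazy val root = (project in file("."))
  .configs(IntegrationTest)
  .settings(
    Defaults.itSettings,
    // tests under src/it/scala are compiled and run by the "it" configuration
    libraryDependencies += "org.scalatest" %% "scalatest" % "3.2.15" % "it,test"
  )

With this in place, sbt runs the integration tests via IntegrationTest / test (it:test in older sbt shells).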
I'm trying to run a Scala application packaged as a JAR (including dependencies), but it fails unless the Scala library is added by using the -Xbootclasspath/p option.
Failing invocation:
java -jar /path/to/target/scala-2.10/application-assembly-1.0.jar
After the application has produced some of its intended output, the console shows:
Exception in thread "main"
scala.reflect.internal.MissingRequirementError: object scala.runtime
in compiler mirror not found.
at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:40)
at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:61)
at scala.reflect.internal.Mirrors$RootsBase.getPackage(Mirrors.scala:172)
at scala.reflect.internal.Mirrors$RootsBase.getRequiredPackage(Mirrors.scala:175)
at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage$lzycompute(Definitions.scala:181)
at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackage(Definitions.scala:181)
at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass$lzycompute(Definitions.scala:182)
at scala.reflect.internal.Definitions$DefinitionsClass.RuntimePackageClass(Definitions.scala:182)
at scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr$lzycompute(Definitions.scala:1015)
at scala.reflect.internal.Definitions$DefinitionsClass.AnnotationDefaultAttr(Definitions.scala:1014)
at scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses$lzycompute(Definitions.scala:1144)
at scala.reflect.internal.Definitions$DefinitionsClass.syntheticCoreClasses(Definitions.scala:1143)
at scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode$lzycompute(Definitions.scala:1187)
at scala.reflect.internal.Definitions$DefinitionsClass.symbolsNotPresentInBytecode(Definitions.scala:1187)
at scala.reflect.internal.Definitions$DefinitionsClass.init(Definitions.scala:1252)
at scala.tools.nsc.Global$Run.<init>(Global.scala:1290)
at extract.ScalaExtractor$Compiler$2$.<init>(ScalaExtractor.scala:24)
Working invocation:
java -Xbootclasspath/p:/path/to/home/.sbt/boot/scala-2.10.2/lib/scala-library.jar -jar /path/to/target/scala-2.10/application-assembly-1.0.jar
The strange thing about it is that the application-assembly-1.0.jar was built so that it includes all dependencies including the Scala library. When one extracts the JAR file it can be verified that the class files in the scala.runtime package have been included.
Creation of the JAR file
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.9.1") was added to project/plugins.sbt and the assembly target was invoked. A JAR file of about 25MB results.
Building the JAR with proguard shows the same runtime behavior as seen with assembly's JAR file.
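For context, the sbt-assembly setup of that era (0.9.x) looked roughly like this; a sketch based on the plugin's README at the time, not taken from the original post:

// project/plugins.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.9.1")

// build.sbt
import AssemblyKeys._ // put this at the top of the file

assemblySettings

Running the assembly task then produces the fat JAR under target/scala-2.10/.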
Application code that triggers the MissingRequirementError
Some application code works fine; the exception described above is triggered as soon as the new Run in the following fragment executes.
import scala.reflect.internal.util.BatchSourceFile
import scala.reflect.io.AbstractFile
import scala.reflect.io.Path.jfile2path
import scala.tools.nsc.Global
import scala.tools.nsc.Settings
…
import scala.tools.nsc._

object Compiler extends Global(new Settings()) {
  new Run // This is line 24 from the stack trace!

  def parse(path: File) = {
    val code = AbstractFile.getFile(path)
    val bfs = new BatchSourceFile(code, code.toCharArray)
    val parser = new syntaxAnalyzer.UnitParser(new CompilationUnit(bfs))
    parser.smartParse()
  }
}

val ast = Compiler.parse(file)
Among others, scala-library, scala-compiler and scala-reflect are defined as dependencies in build.sbt.
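For reference, declaring those dependencies in build.sbt looks roughly like this (a sketch; the versions are illustrative and should match scalaVersion):

scalaVersion := "2.10.2"

// scala-library is added automatically by sbt for the configured scalaVersion
libraryDependencies ++= Seq(
  "org.scala-lang" % "scala-compiler" % "2.10.2",
  "org.scala-lang" % "scala-reflect"  % "2.10.2"
)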
For the curious / background information
The aim of the application is to aid in the localization of Java and Scala programs. The task of the code fragment above is to get an AST from a Scala file in order to find method calls in it.
The questions
Given that the Scala library is included in the JAR file, why is it necessary to call the JAR using -Xbootclasspath/p:scala-library.jar?
Why do other parts of the application run just fine even though scala.runtime is reported as missing later?
The easy way to configure the settings with familiar keystrokes:
import scala.tools.nsc.Global
import scala.tools.nsc.Settings

def main(args: Array[String]) {
  val s = new Settings
  s processArgumentString "-usejavacp"
  val g = new Global(s)
  val r = new g.Run
}
That works for your scenario.
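Equivalently, the flag can be set directly on the Settings object instead of parsing an argument string; usejavacp is a boolean setting on scala.tools.nsc.Settings in 2.10:

val s = new Settings
s.usejavacp.value = true // same effect as processArgumentString("-usejavacp")
val g = new Global(s)
val r = new g.Run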
Even easier:
java -Dscala.usejavacp=true -jar ./scall.jar
Bonus info, I happened to come across the enabling commit message:
Went ahead and implemented classpaths as described in email to
scala-internals on the theory that at this point I must know what I'm
doing.
** PUBLIC SERVICE ANNOUNCEMENT **
If your code of whatever kind stopped working with this commit (most
likely the error is something like "object scala not found") you can
get it working again with either of:
passing -usejavacp on the command line
set system property "scala.usejavacp" to "true"
Either of these will alert scala that you want the java application
classpath to be utilized by scala as well.
How does one add an external dependency to an sbt plugin and make it available on both the project and plugin classpaths?
Specifically, I have a simple plugin that should run our TestNG test suites and do some post-processing. Here is a simplified version:
import sbt._
import java.util.ArrayList
import Keys._
import org.testng._

object RunTestSuitesPlugin extends Plugin {

  lazy val runTestSuites = TaskKey[Unit]("run-test-suites", "runs TestNG test suites")
  lazy val testSuites = SettingKey[Seq[String]]("test-suites", "list of test suites to run")

  class JavaListWrapper[T](val seq: Seq[T]) {
    def toJavaList = seq.foldLeft(new java.util.ArrayList[T](seq.size)) { (al, e) => al.add(e); al }
  }

  implicit def listToJavaList[T](l: Seq[T]) = new JavaListWrapper(l)

  def runTestSuitesTask = runTestSuites <<= (target, streams, testSuites) map {
    (targetDirectory, taskStream, suites) =>
      import taskStream.log
      log.info("running test suites: " + suites)
      runSuites(suites)
  }

  private def runSuites(testSuites: Seq[String]) = {
    val tester = new TestNG
    tester.setTestSuites(testSuites.toJavaList)
    tester.run()
  }

  def testSuiteSettings = {
    inConfig(Compile)(Seq(
      runTestSuitesTask,
      testSuites := Seq("testsuites/mysuite.xml"),
      libraryDependencies += "org.testng" % "testng" % "5.14"))
  }
}
The problem is that when I add this plugin to a project and run it with run-test-suites, it fails with java.lang.NoClassDefFoundError: org/testng/TestNG, even though show full-classpath shows that testng.jar is on the classpath.
So somehow the classpath used when executing the plugin differs from the one in my project. How do I make a plugin dependency appear in both places?
I'll try an answer, but I'm not very familiar with the inner details of sbt.
Normally, the classpath for the build system (as opposed to your program) is configured under project, as explained here. That would typically be project/plugins.sbt. This sounds right, as there is no reason the application you develop should be concerned with which libraries your build system uses, nor the other way around.
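As a sketch of that separation (reusing the TestNG coordinates from the question): a dependency declared in project/plugins.sbt ends up on the build/plugin classpath only, while one declared in build.sbt ends up on the application classpath only.

// project/plugins.sbt — configures the build itself, so this puts TestNG
// on the plugin classpath, not on the application's classpath
libraryDependencies += "org.testng" % "testng" % "5.14"

// build.sbt — configures the application, so this puts TestNG on the
// project's compile/test classpath, not on the plugin's
libraryDependencies += "org.testng" % "testng" % "5.14"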
When your plugin runs the application code, things may not be so simple, and there could well be classpath/classloader issues; I'm not sure that it will work. Normally, your plugin should implement a testing framework rather than define its own task. Documentation of testing in sbt is limited.
A testing framework should implement org.scalatools.testing.Framework, in test-interface. Your build will take it into account after you add
testFrameworks += new TestFramework("full.class.name")
When you run the normal test command, it lets every framework recognize the test classes it deals with (two criteria are available: extending some base class or having some annotation) and run them. The framework runs in the build; it is given a class loader to access the application code.
You may have a look at the framework implementation for JUnit (shipped with sbt). There is also a TestNG implementation. I don't know it; according to its documentation it is a little unorthodox, but hopefully it will work for you.
The error was fixed by adding TestNG directly to unmanagedJars in Compile in the project that uses the plugin.
I have not found any resources explaining the structure of the sbt classpath during plugin execution, so any attempt at explaining why this step is necessary would be greatly appreciated.
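For reference, the workaround described above amounts to something like the following in the build of the project that uses the plugin (a sketch; the jar path is illustrative):

// build.sbt — put the TestNG jar on the unmanaged compile classpath so the
// class files are visible where the plugin task expects them
unmanagedJars in Compile += Attributed.blank(file("lib/testng-5.14.jar"))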
When doing a serious refactor in a Java Eclipse project I will often break the build, but focus on getting one test to pass at a time. When running the tests Eclipse warns that the project cannot be compiled, but it will still run the tests it can compile.
Now I'm using sbt and would like to achieve the same thing with 'test-only', but it tries to compile the whole project, fails, and doesn't run the tests. How can I tell it to just compile the bits it can and run the tests?
You should add the following task to your project definition:
import sbt._

class Project(info: ProjectInfo) extends DefaultProject(info) {
  lazy val justTest = testTask(testFrameworks, testClasspath, testCompileConditional.analysis, testOptions)
}
This is the same as the ordinary test task, but has no dependencies attached at the end. If you'd like it to have dependencies, call dependsOn on the testTask(...) expression and provide the tasks you want it to depend on.
testTask(testFrameworks, testClasspath, testCompileConditional.analysis, testOptions).dependsOn(testCompile, copyResources, copyTestResources)