I have an sbt project for which I am writing integration tests. The integration test deploys the API defined in the project. I need the version of the project from sbt in order to deploy the corresponding version of the API which has been published to a remote repository (x.x.x-SNAPSHOT). After it is deployed, I'll run integration tests against it.
Is there a way to pass keys from sbt to a unit test class? I'm using ScalaTest and sbt 1.2.7.
If you run your unit/integration tests in a forked JVM, you can pass the version through a system property to that JVM:
Test / fork := true
Test / javaOptions += s"-Dproject.version=${version.value}"
Change the scope according to how you set up your unit tests (you might need to use a different configuration or even a specific task).
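For example, if your integration tests live in sbt's IntegrationTest configuration (this sketch assumes you have already enabled it on the project with .configs(IntegrationTest) and Defaults.itSettings), the same two settings scoped to it would be:

// hypothetical scoping, assuming the IntegrationTest configuration is enabled on the project
IntegrationTest / fork := true
IntegrationTest / javaOptions += s"-Dproject.version=${version.value}"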
If you don't want to run your tests in a forked JVM, you could use the following setting to set up the system property before running your tests:
Test / testOptions += Tests.Setup(() => sys.props += "project.version" -> version.value)
In either case, you can then read the project.version system property in your tests to get the version number:
val version = sys.props("project.version")
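For instance, a ScalaTest suite could read it like this (a minimal sketch assuming ScalaTest 3.1+ imports; DeploySpec is an invented name):

import org.scalatest.flatspec.AnyFlatSpec

class DeploySpec extends AnyFlatSpec {
  "the deployed API" should "match the published project version" in {
    // fail fast if the build did not pass the property through
    val version = sys.props.getOrElse("project.version", fail("project.version not set"))
    assert(version.nonEmpty)
    // use `version` to deploy the x.x.x-SNAPSHOT artifact, then run checks against it
  }
}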
Alternatively, you can generate a file under your managed resources directory and load the version number from it:
// build.sbt
Test / resourceGenerators += Def.task {
  val versionFile = (Test / resourceManaged).value / "version.txt"
  IO.write(versionFile, version.value)
  Vector(versionFile)
}.taskValue
// your test
val is = getClass.getClassLoader.getResourceAsStream("version.txt")
val version = try {
  scala.io.Source.fromInputStream(is).mkString
} finally {
  is.close()
}
Related
I have an interesting problem where I basically need to create a .jar (plus all of the classpath dependencies) that contains all of the tests of an SBT project (plus any of its subprojects). The idea is that I can just run the jar using java -jar and all of the tests will execute.
I heard that this is possible with sbt-assembly, but you would have to manually run assembly for each sbt sub-project that you have (each with its own .jar), whereas ideally I would just want to run one command that generates a giant .jar for every test in every sbt root + sub-project you happen to have (in the same way that running test in an sbt project with sub-projects runs the tests for everything).
The current testing framework that we are using is specs2 although I am not sure if this makes a difference.
Does anyone know if this is possible?
Exporting test runner is not supported
sbt 1.3.x does not have this feature. Defined tests are executed in tandem by the runner provided by the test framework (Specs2 in this case) and sbt's build, which reflectively discovers your defined tests (e.g. which classes extend Specs2's test traits). In theory, we already have a good chunk of what you'd need, because Test / fork := true creates a program called ForkMain and runs your tests in another JVM. What's missing from that is the dispatching of your defined tests.
Using specs2.run runner
Thankfully Specs2 provides a runner out of the box called specs2.run (See In the shell):
scala -cp ... specs2.run com.company.SpecName [argument1 argument2 ...]
So basically all you need to know is:
your classpath
the list of fully qualified names of your defined tests
Here's how to get them using sbt:
> print Test/fullClasspath
* Attributed(/private/tmp/specs-runner/target/scala-2.13/test-classes)
* Attributed(/private/tmp/specs-runner/target/scala-2.13/classes)
* Attributed(/Users/eed3si9n/.coursier/cache/v1/https/repo1.maven.org/maven2/org/scala-lang/modules/scala-xml_2.13/1.2.0/scala-xml_2.13-1.2.0.jar)
...
> print Test/definedTests
* Test foo.HelloWorldSpec : subclass(false, org.specs2.specification.core.SpecificationStructure)
We can exercise the specs2.run runner from the sbt shell as follows:
> Test/runMain specs2.run foo.HelloWorldSpec
Aggregating across subprojects
Aggregating tests across subprojects requires some thinking. Instead of creating a giant ball of assembly, I would recommend the following: create a dummy subproject testAgg, and then collect all of the Test/externalDependencyClasspath and Test/packageBin into its target/dist. You can then grab all the JARs and run java -jar ... as you wanted.
How would one go about that programmatically? See Getting values from multiple scopes.
lazy val collectJars = taskKey[Seq[File]]("")
lazy val collectDefinedTests = taskKey[Seq[String]]("")
lazy val testFilter = ScopeFilter(inAnyProject, inConfigurations(Test))

lazy val testAgg = (project in file("testAgg"))
  .settings(
    name := "testAgg",
    publish / skip := true,
    collectJars := {
      val cps = externalDependencyClasspath.all(testFilter).value.flatten.distinct
      val pkgs = packageBin.all(testFilter).value
      cps.map(_.data) ++ pkgs
    },
    collectDefinedTests := {
      val dts = definedTests.all(testFilter).value.flatten
      dts.map(_.name)
    },
    Test / test := {
      val jars = collectJars.value
      val tests = collectDefinedTests.value
      // note: ":" as the classpath separator assumes a Unix-like OS
      sys.process.Process(s"""java -cp ${jars.mkString(":")} specs2.run ${tests.mkString(" ")}""").!
    }
  )
This runs like this:
> testAgg/test
[info] HelloWorldSpec
[info]
[info] The 'Hello world' string should
[info] + contain 11 characters
[info] + start with 'Hello'
[info] + end with 'world'
[info]
[info]
[info] Total for specification HelloWorldSpec
[info] Finished in 124 ms
3 examples, 0 failure, 0 error
[info] testAgg / Test / test 1s
If you really want to, you could probably generate sources from collectDefinedTests, make testAgg depend on the Test configurations of all the subprojects, and try to make a giant ball of assembly, but I'll leave that as an exercise to the reader :)
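For what it's worth, here is a rough, untested sketch of the source-generation half of that exercise. It would go into testAgg's settings, assumes the collectDefinedTests task from above, requires specs2 on testAgg's compile classpath, and RunAllTests is an invented name:

// Hypothetical sketch: generate a main class that hard-codes the collected suite names
// and delegates to the specs2.run command-line runner.
Compile / sourceGenerators += Def.task {
  val tests = collectDefinedTests.value
  val file = (Compile / sourceManaged).value / "RunAllTests.scala"
  val names = tests.map(t => "\"" + t + "\"").mkString(", ")
  IO.write(file,
    s"""object RunAllTests {
       |  def main(args: Array[String]): Unit =
       |    specs2.run.main(Array($names) ++ args)
       |}""".stripMargin)
  Seq(file)
}.taskValue

Packaging testAgg with sbt-assembly and running java -jar with RunAllTests as the main class would then give the single runnable JAR the question asks for.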
I have a Play Framework application with a bunch of tests (run with ScalaTest), and I am trying to organize them into:
Unit test
Integration Test Read
Integration Test Write
I have left all of my unit tests untagged, and have created the following tags:
/* TestTags.scala */
import org.scalatest.Tag

object IntegrationReadTest extends Tag("IntegrationReadTest")
object IntegrationWriteTest extends Tag("IntegrationWriteTest")
so that I can tag my integration tests like this:
/* SomeSpecs.scala */
"foo" taggedAs IntegrationReadTest in {
// External APIs are read from
}
"bar" taggedAs IntegrationWriteTest in {
// External APIs are written to
}
Most of the time while I am developing and running tests, I do not want to run the integration tests, so I modified my build.sbt to ignore them when I run sbt test:
/* build.sbt */
testOptions in Test += Tests.Argument("-l", "IntegrationReadTest")
testOptions in Test += Tests.Argument("-l", "IntegrationWriteTest")
This all works well, but I cannot figure out how to run all of the tests (including the integration tests). I have tried many combinations of sbt test and sbt "test:testOnly" but cannot figure out how to un-ignore the integration tests.
By default, your tests run in the Test configuration. This means that sbt test is really doing sbt test:test, and since you are setting testOptions in Test, those arguments apply by default.
From that, it follows that you can create a new configuration, All, which puts those tests back in.
/* build.sbt */
lazy val All = config("all") extend(Test) describedAs("run all the tests")
configs(All)
inConfig(All)(Defaults.testSettings)
testOptions in All -= Tests.Argument("-l", "IntegrationReadTest")
testOptions in All -= Tests.Argument("-l", "IntegrationWriteTest")
Then you can run sbt all:test to run them all.
I have a multi-module project and currently run tests during packaging with a task that reads:
val testALL = taskKey[Unit]("Test ALL Modules")
testALL := {
  (test in Test in module_A).value
  (test in Test in module_B).value
  (test in Test in module_C).value
}
Now, I have consolidated all the tests in each module into a single top-level ScalaTest suite. So for each module I want to run only this single top-level suite (named, say, "blah.moduleA.TestSuite" and so on). I have been trying to use testOnly and testFilter in my build.sbt to run just this single suite in each module but can't get the syntax right. Can someone please tell me how to do this?
testOnly is an InputKey[Unit]. You want to turn it into a Task[Unit] to be able to run it directly for a given test suite.
You can achieve this as follows:
lazy val foo = taskKey[Unit]("...")
foo := (testOnly in Test).toTask(" blah.moduleA.TestSuite").value
Note the leading space in the argument string; the input parser expects it. In sbt's documentation, see Input Tasks (Preapplying input / Get a Task from an InputTask).
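Applied to the modules and suite names from the question (the suite names for modules B and C are guesses following the same pattern), that might look like this:

lazy val testTopLevel = taskKey[Unit]("Run only the top-level suite of each module")
testTopLevel := {
  (testOnly in Test in module_A).toTask(" blah.moduleA.TestSuite").value
  (testOnly in Test in module_B).toTask(" blah.moduleB.TestSuite").value
  (testOnly in Test in module_C).toTask(" blah.moduleC.TestSuite").value
}

You could then invoke testTopLevel from your packaging step in place of testALL.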
I have some long-running tests in my project. These sit alongside my integration and unit tests in
/test/manual/*
Is there a way in Play 2.4 for Scala to disable/mark these test classes so that they are not run automatically when I run
$ activator test
but only when using the test-only command?
The problem is that I do not want to run these longer tests on my CI server.
Having had similar problems with long-running integration tests, I created an It configuration derived from the standard test config (in <projectHome>/build.sbt):
lazy val It = config("it").extend(Test)
Then I add the test sources to this config:
scalaSource in It := (scalaSource in Test).value
and you need to enable the config and make the corresponding tasks available in the current project:
lazy val root = (project in file(".")).configs(It)
.settings(inConfig(It)(Defaults.testTasks): _*)
I then disable long-running tests in the Test config:
testOptions in Test := Seq(Tests.Argument("exclude", "LongRunning"))
And include only these long running tests in the It config:
testOptions in It := Seq(Tests.Argument("include", "LongRunning"))
These last two settings depend somewhat on the test framework you use (specs2 in my case; ScalaTest would probably use -n and -l along with tags to achieve the same thing).
Then sbt test will exclude all LongRunning tests and you can run it:test or it:testOnly your.long.running.TestCaseHere in an interactive sbt session if need be.
I can't get the embedded sbt plugin (with auto-import enabled) in IntelliJ (13.1) to recognize custom sbt configurations. I have the following setup in my sbt build file:
lazy val EndToEndTest = config("e2e") extend (Test)
private lazy val e2eSettings =
inConfig(EndToEndTest)(Defaults.testSettings)
lazy val root: Project = Project(
  id = "root",
  base = file(".")
)
  .configs(EndToEndTest)
  .settings(e2eSettings)
The code works as expected in the sbt console, e.g. I can run:
sbt e2e:test (and it will execute the tests located in /src/e2e/scala)
The issue is that the directory /src/e2e/scala won't get registered as a source directory in IntelliJ, which makes it hard to use IntelliJ to manage the tests. I can manually mark the directory as a source root, but it gets reverted every time I:
update my sbt files (auto-import)
do a manual update through the sbt tool window
Related: using the predefined IntegrationTest configuration works as expected, but custom ones don't.
According to the sbt-idea documentation, this can be done in your case by adding
ideaExtraTestConfigurations := Seq(EndToEndTest)
to your project settings.
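For example (a sketch assuming the sbt-idea plugin is installed and its ideaExtraTestConfigurations key is in scope), the project definition from the question would become:

lazy val root: Project = Project(
  id = "root",
  base = file(".")
)
  .configs(EndToEndTest)
  .settings(e2eSettings)
  .settings(ideaExtraTestConfigurations := Seq(EndToEndTest)) // key provided by sbt-idea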