How do I configure sbt 0.10 to use the junitxml option with specs2?
The specs2 documentation says this is the way to do it using sbt 0.7.x:
override def testOptions = super.testOptions ++ Seq(TestArgument("junitxml"))
How do I say the same thing in sbt 0.10?
FYI, I found that when running specs2 tests with junitxml, sbt fails to fail the build when the tests fail. Adding "console" as another argument restores the build failure you'd expect. I suspect this is some interaction between the console reporter and sbt's test driver.
testOptions in Test += Tests.Argument(TestFrameworks.Specs2, "junitxml", "console")
This is described here in the SBT documentation:
testOptions in Test += Tests.Argument("junitxml")
And if you want to specify this option specifically for specs2:
testOptions in Test += Tests.Argument(TestFrameworks.Specs2, "junitxml")
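Putting it together, a minimal build.sbt sketch (the project name and version numbers are illustrative, not from the question):

```scala
// build.sbt (sbt 0.13-style syntax); names and versions are hypothetical
name := "my-service"
scalaVersion := "2.11.8"
libraryDependencies += "org.specs2" %% "specs2-core" % "3.8.9" % Test

// "junitxml" emits JUnit-style XML reports under target/test-reports;
// keeping "console" as well means failures are still reported to sbt
testOptions in Test += Tests.Argument(TestFrameworks.Specs2, "junitxml", "console")
```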
Related
I split my unit and integration tests with a filter:
lazy val FunTest = config("it") extend Test
def funTestFilter(name: String): Boolean = name endsWith "Spec"
def unitTestFilter(name: String): Boolean = name endsWith "Test"
...
testOptions in Test := Seq(Tests.Filter(unitTestFilter)),
testOptions in FunTest := Seq(Tests.Filter(funTestFilter)),
...
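For context, a sketch of the wiring such a custom configuration typically needs in an sbt 0.13 build (this part is assumed, not shown in my build):

```scala
// register the custom "it" config on the project and give it
// the standard test tasks, so it:test becomes available
lazy val root = (project in file("."))
  .configs(FunTest)
  .settings(inConfig(FunTest)(Defaults.testSettings): _*)
  .settings(
    // unit tests (names ending in "Test") run with plain `test`,
    // functional/integration specs (ending in "Spec") with `it:test`
    testOptions in Test := Seq(Tests.Filter(unitTestFilter)),
    testOptions in FunTest := Seq(Tests.Filter(funTestFilter))
  )
```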
So I can run something like this:
sbt clean coverage test dockerComposeUp it:test dockerComposeStop coverageReport
Sadly, that kills all my coverage: only the generated BuildInfo has any coverage.
Using only sbt clean coverage test coverageReport or sbt clean coverage it:test coverageReport works as expected.
The whole project can be found here: https://github.com/pme123/play-binding-form
scoverage Version: 1.5.1
SBT supports incremental compilation, but Scoverage does not. Scoverage clears its instrumentation information before compilation starts and restarts the instrumentation process from scratch every time. Compiling only a subset of classes with Scoverage enabled therefore produces wrong coverage reports.
In this case the sbt-buildinfo plugin is enabled in the server module. It registers a source generator, which is executed before every compilation and generates the server/target/scala_2.12/src_managed/main/sbt-buildinfo/BuildInfo.scala file.
The sbt-buildinfo plugin is smart enough to regenerate this file only when its content changes, but since BuildInfoOption.BuildTime is included in the buildInfoOptions setting, this file is regenerated before every compilation.
When compilation runs, the compiler therefore finds one modified file (BuildInfo.scala) every time and starts an incremental compilation of just that file. Scoverage clears its previous instrumentation information and saves only the information about BuildInfo.scala.
In an execution like sbt clean coverage test dockerComposeUp it:test dockerComposeStop coverageReport, the first compilation happens as part of the test task and the second as part of the it:test task. That's why there is no problem when the tasks are used separately.
Docker has nothing to do with our problem.
To fix the problem you have to prevent BuildInfo.scala from being regenerated on every compilation, at least when coverage is enabled.
I did it by modifying the project/Settings.scala file in this way:
private lazy val buildInfoSettings = Seq(
  buildInfoKeys := Seq[BuildInfoKey](name, version, scalaVersion, sbtVersion),
  buildInfoOptions ++= { if (coverageEnabled.value) Seq() else Seq(BuildInfoOption.BuildTime) }, // <-- this line was changed
  buildInfoOptions += BuildInfoOption.ToJson,
  buildInfoPackage := "pme123.adapters.version"
)
buildInfoOptions does not include the BuildTime option when coverage is turned on.
It doesn't look elegant, but it works; you can probably find a better way.
Instead of having different BuildInfo objects depending on the phase, which could lead to compilation errors, you can use your own build time:
import java.time.{ZoneOffset, ZonedDateTime}

lazy val buildTime: SettingKey[String] = SettingKey[String]("buildTime", "time of build")
ThisBuild / buildTime := ZonedDateTime.now(ZoneOffset.UTC).toString
buildInfoKeys := Seq[BuildInfoKey](
  name,
  version,
  scalaVersion,
  sbtVersion,
  buildTime
)
This should resolve the issue.
I have this configuration in a project of mine because I wanted better control over the way the date is formatted, and I don't have the same issue.
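If the goal is control over the date format, something like this works (a sketch; the pattern is illustrative):

```scala
import java.time.{ZoneOffset, ZonedDateTime}
import java.time.format.DateTimeFormatter

// format the UTC build time explicitly instead of relying on toString
val formatter = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss 'UTC'")
val buildTime = ZonedDateTime.now(ZoneOffset.UTC).format(formatter)
println(buildTime)
```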
I am currently running an Akka application with the command below, after I run
sbt assembly
java -Dconfig.resource=/application.test.conf -cp /path/to/folder:./target/scala-2.11/app-name.jar ca.path.to.main
Is there a way I can pass this information using sbt and some flags, so I don't have to run the sbt assembly task every time just to run the application?
sbt run config=/application.test.conf cp=/path/to/folder:
(something like the above)
Options passed to the JVM are read by sbt from the javaOptions setting. You can configure this setting with the options you want and then tell sbt to fork a new JVM process every time you run your app, so these options are applied. You can do this from the sbt console:
set javaOptions += "-Dconfig.resource=/application.test.conf"
set fork := true
run
Or in your build.sbt file:
javaOptions += "-Dconfig.resource=/application.test.conf"
fork := true
However, this might not be the most idiomatic way to reach your underlying goal.
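If you want to keep choosing the config file on the command line, one option is to forward a system property given to sbt itself into the forked JVM. A sketch, assuming a standard build.sbt (the property name matches the question; everything else is illustrative):

```scala
// build.sbt sketch: run `sbt -Dconfig.resource=/application.test.conf run`
// and forward the property into the forked application JVM
fork := true
javaOptions ++= sys.props.get("config.resource")
  .map(v => s"-Dconfig.resource=$v")
  .toSeq
```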
I configured build.sbt (Play 2.3.9 for Scala, SBT 0.13.5) so that the unit tests use a different configuration, via:
javaOptions in Test ++= Seq("-Dconfig.file=/home/kitty/acme/test/resources/test-application.conf")
Play did not pick up test-application.conf and used conf/application.conf instead. AFAIK there is no scalaOptions counterpart for this. However, if I include -Dconfig.file on the command line, it works fine:
sbt test -Dconfig.file=/home/kitty/acme/test/resources/test-application.conf
How do I fix this? Thanks.
javaOptions in Test ++= Seq("-Dconfig.file=/home/kitty/acme/test/resources/test-application.conf") didn't work because my fork in Test was false. Set fork to true and it will work. -Dconfig.resource works the same way as -Dconfig.file. SBT will not pick these options up if the JVM is not forked; strictly, javaOptions only takes effect when fork is true, as mentioned here.
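Putting the two settings together in build.sbt (the path is illustrative):

```scala
// javaOptions only takes effect when the test JVM is forked
fork in Test := true
javaOptions in Test += "-Dconfig.file=/home/kitty/acme/test/resources/test-application.conf"
```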
You're almost there; you can force JVM options like this:
javaOptions in Test ++= Seq("-Dconfig.file=/home/kitty/acme/test/resources/test-application.conf")
config.file also takes a relative path, e.g. conf/test-application.conf.
I'm running integration tests in Scala - these are found in the src/it/scala directory, and I've added the following to my build.sbt:
seq(Defaults.itSettings: _*)
However, when I run SCCT to calculate code coverage, the integration tests are not run. How can I make them be run?
I am using scct 0.3-SNAPSHOT / sbt 0.13
For merging test + it:test, try the following setting:
ScctPlugin.instrumentSettings ++ Defaults.itSettings ++ Seq(
  resourceDirectory in ScctPlugin.ScctTest := (resourceDirectory in Test).value,
  sources in ScctPlugin.ScctTest ++= (sources in IntegrationTest).value
)
This might get tricky if you have different resources.
I would like to specify JVM options when running my app or the tests for the app through SBT. Specifically, I need to be able to give the JVM the -Djava.security.policy parameter so that my policy is loaded and used for the test.
How can I do this with SBT?
With xsbt, you can run your tests in a forked JVM (for one of the reasons mentioned in "Running Project Code").
If you are using a forked JVM, specify the configuration so it affects only the main run or test tasks:
javaOptions in (Test, run) += "-Xmx8G"
You should be able to specify any other options to that JVM through javaOptions.
The OP David Eagen reports that the following configuration didn't work at first, not because of the sbt options, but because of the path:
lazy val escacheServer =
  Project("escache-server",
    file("server"),
    settings = buildSettings ++ Seq(
      resolvers ++= Seq(scala_tools_snapshots, typesafe_repo),
      libraryDependencies ++= escacheServerDeps,
      javaOptions in run += "-Djava.security.policy=jini.policy",
      fork in run := true
    )
  ).dependsOn(escache)
It looks like my problem was that jini.policy wasn't found in the current directory.
I set the full path and now it runs.