scoverage: Combine Coverage from test and it:test

I split my unit and integration tests with a filter:
lazy val FunTest = config("it") extend Test
def funTestFilter(name: String): Boolean = name endsWith "Spec"
def unitTestFilter(name: String): Boolean = name endsWith "Test"
...
testOptions in Test := Seq(Tests.Filter(unitTestFilter)),
testOptions in FunTest := Seq(Tests.Filter(funTestFilter)),
...
So I can do something like this:
sbt clean coverage test dockerComposeUp it:test dockerComposeStop coverageReport
Sadly that kills all my coverage; only the generated BuildInfo has any coverage.
Using only sbt clean coverage test coverageReport or sbt clean coverage it:test coverageReport works as expected.
The whole project can be found here: https://github.com/pme123/play-binding-form
scoverage Version: 1.5.1

SBT supports incremental compilation, but Scoverage does not. Scoverage clears its instrumentation information before compilation starts and restarts the instrumentation process from scratch every time. Compiling only a subset of all classes with Scoverage enabled therefore results in wrong coverage reports.
In this case the sbt-buildinfo plugin is enabled in the server module. It registers a source generator, which is executed before every compilation and generates the server/target/scala_2.12/src_managed/main/sbt-buildinfo/BuildInfo.scala file.
The sbt-buildinfo plugin is smart enough to regenerate this file only when its content changes, but since BuildInfoOption.BuildTime is included in the buildInfoOptions setting, this file is regenerated before every compilation.
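For context, the trigger is a setting along these lines (inferred from the fix below, where exactly this line is the one being changed):
buildInfoOptions += BuildInfoOption.BuildTime // stamps a fresh build time into BuildInfo.scala, so its content changes on every compile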
When it comes to the compilation process, the compiler finds one modified file (BuildInfo.scala) every time and starts an incremental compilation of this one file. Scoverage clears its previous instrumentation information and saves only the information about the BuildInfo.scala file.
In an execution like sbt clean coverage test dockerComposeUp it:test dockerComposeStop coverageReport, the first compilation is part of the test task and the second one of the it:test task. That's why there is no problem when they are used separately.
Docker has nothing to do with the problem.
To fix the problem you have to prevent BuildInfo.scala from being regenerated on every compilation, at least when coverage is enabled.
I did it by modifying project/Settings.scala file in this way:
private lazy val buildInfoSettings = Seq(
  buildInfoKeys := Seq[BuildInfoKey](name, version, scalaVersion, sbtVersion),
  buildInfoOptions ++= { if (coverageEnabled.value) Seq() else Seq(BuildInfoOption.BuildTime) }, // <-- this line was changed
  buildInfoOptions += BuildInfoOption.ToJson,
  buildInfoPackage := "pme123.adapters.version"
)
This way buildInfoOptions does not include the BuildTime option when coverage is turned on.
It doesn't look elegant, but it works. You can probably find a better way.

Instead of having different BuildInfo objects depending on the phase, which could lead to compilation errors, you can use your own build time.
import java.time.{ZoneOffset, ZonedDateTime}

lazy val buildTime: SettingKey[String] = SettingKey[String]("buildTime", "time of build")
ThisBuild / buildTime := ZonedDateTime.now(ZoneOffset.UTC).toString
buildInfoKeys :=
  Seq[BuildInfoKey](
    name,
    version,
    scalaVersion,
    sbtVersion,
    buildTime
  )
This should resolve this issue.
I have this configuration in a project of mine because I wanted better control over the way the date is formatted, and I don't have the same issue.
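Since better control over the date format was the motivation, the same setting can apply an explicit formatter. A small sketch (the formatter choice is just an example):
import java.time.format.DateTimeFormatter
import java.time.{ZoneOffset, ZonedDateTime}

// format the build time explicitly instead of relying on toString
ThisBuild / buildTime := ZonedDateTime
  .now(ZoneOffset.UTC)
  .format(DateTimeFormatter.ISO_OFFSET_DATE_TIME)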

Related

Play / sbt: What are these 2 Scala sources that are always compiled when restarting on code change?

Whenever I change code and Play does a restart, it always compiles 2 Scala sources, like:
[info] Compiling 2 Scala sources to /Users/mpa/dev/myplayproject/server/target/scala-2.13/classes ...
Only after that are the sources I changed compiled.
What are these 2 sources?
Is there a way this can be avoided?
With the tip from @cbley I found the problematic class:
BuildInfo.scala from the sbt-buildinfo plugin. By default it creates a new timestamp on every compile, which then causes a recompile of BuildInfo.scala.
I actually had a similar problem with scoverage - see here: scoverage: Combine Coverage from test and it:test
Adding the accepted answer from there fixed the problem:
import java.time.{ZoneOffset, ZonedDateTime}

lazy val buildTime: SettingKey[String] = SettingKey[String]("buildTime", "time of build")
ThisBuild / buildTime := ZonedDateTime.now(ZoneOffset.UTC).toString
buildInfoKeys :=
  Seq[BuildInfoKey](
    name,
    version,
    scalaVersion,
    sbtVersion,
    buildTime
  )

How do you impose scala code coverage specifically for integration tests?

I am running the integration tests using the following sbt command:
sbt clean coverage it:test coverageReport
This command runs the integration tests, instruments them and generates the report as well.
build.sbt has the following:
coverageMinimum in IntegrationTest := 21.0
coverageFailOnMinimum in IntegrationTest := true
Output looks like:
[info] Statement coverage.: 20.16%
[info] Branch coverage....: 12.00%
[info] Coverage reports completed
[info] All done. Coverage was [20.16%]
The output shows 20.16% code coverage, but the limits in build.sbt are not being enforced.
If I change build.sbt to the following, it works:
coverageMinimum := 21.0
coverageFailOnMinimum := true
I wanted to know what I am missing for specifying limits specifically for integration tests.
Version Information:
sbt : 0.13.17
sbt-scoverage : 1.5.1
The following two workarounds seem to work on my machine (sbt-scoverage 1.5.1, sbt 1.1.1, Scala 2.12.5).
Workaround 1 - Use inConfig to scope to a configuration:
inConfig(IntegrationTest)(ScoverageSbtPlugin.projectSettings),
inConfig(IntegrationTest)(Seq(coverageMinimum := 21, coverageFailOnMinimum := true))
Now executing sbt clean coverage it:test it:coverageReport fails with Coverage minimum was not reached.
Workaround 2 - Modify coverageMinimum setting within a custom command:
def itTestWithMinCoverage = Command.command("itTestWithMinCoverage") { state =>
  val extracted = Project extract state
  val stateWithCoverage = extracted.append(Seq(
    coverageEnabled := true,
    coverageMinimum := 21.0,
    coverageFailOnMinimum := true
  ), state)
  val (s1, _) = Project.extract(stateWithCoverage).runTask(test in IntegrationTest, stateWithCoverage)
  val (s2, _) = Project.extract(s1).runTask(coverageReport in IntegrationTest, s1)
  s2
}

commands ++= Seq(itTestWithMinCoverage)
Now executing sbt itTestWithMinCoverage fails with Coverage minimum was not reached. Note that after executing itTestWithMinCoverage the state is discarded, so coverageMinimum is back to its default value and thus does not affect the unit tests.
It seems the issue is (besides my lack of understanding of how scopes exactly work) that checkCoverage picks up the default value of coverageMinimum even after setting coverageMinimum in IntegrationTest.
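To see which value a scoped key actually resolves to, sbt's inspect command can help (sbt 0.13 syntax, matching the versions above):
inspect it:coverageMinimum
inspect coverageMinimum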

How to execute Main-Class as part of compile in SBT

I wanted to execute a task as part of SBT's compile. I tried runMain in compile, but it is not executing the main class that I am providing. Below is how the task looks in build.sbt:
lazy val scalaGeneratorPlugin = Project("scala-generator", file("scala-generator"))
  .settings(
    libraryDependencies += "org.freemarker" % "freemarker" % "2.3.23",
    runMain in compile := Some("com.my.MyMainClass")
  )
I am running the following command:
sbt scala-generator/compile
Although it gives me a success message, it does not execute my main class.
I am copying laughedelic's answer from the comments here:
I think you should use source generation in sbt for that, i.e. there should be different compilation stages.
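A sketch of that suggestion, assuming the generator logic is reachable from the build definition (for example because it lives under project/ or ships as a plugin); MyMainClass.generate is a hypothetical entry point standing in for whatever com.my.MyMainClass does:
sourceGenerators in Compile += Def.task {
  val outDir = (sourceManaged in Compile).value / "generated"
  // hypothetical entry point: write the generated files into outDir and return them
  val generated: Seq[File] = com.my.MyMainClass.generate(outDir)
  generated
}.taskValue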

SBT plugin: how to make a source generator dependent on the project's sources?

I'm trying to create a source generator in an SBT plugin that generates code based on the project's sources.
I tried something like this:
sourceGenerators in Compile += (sources in Compile) map { sources => doSomethingWithSources(sources) }
Unfortunately, SBT does not want to load this plugin because there is a circular dependency.
Due to this fact I've created another task like this:
lazy val myTask = TaskKey[Unit]("myTask", "Do stuff")
This task actually depends on the sources value and generates the files.
Later I override the projectSettings value and add this:
myTask in Compile := {
  val sourcesValue = (sources in Compile).value
  doSomethingWithSources(sourcesValue)
},
sourceGenerators in Compile += Def.task(Seq(new File("path/to/myGeneratedSource.scala"))).taskValue
I add this task as a dependency of the compile task in the build.sbt of the project where I want my plugin to do its work, like this:
compile in Compile <<= (compile in Compile) dependsOn (myTask in Compile)
While it works (the file is generated), when I launch sbt run, it creates the file but does not compile it.
What is more, when I run sbt compile run, the first (compile) task compiles only the project and generates my source, and then the run part compiles the generated source. So, in a manner of speaking, it does somehow work, but it needs two compilations.
I'd like to ask if there is a simpler way to do this and, if not, how to make it work with only one compilation.
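One direction that might avoid both the cycle and the second compilation (a sketch, not from the thread): the cycle exists because sources in Compile includes the output of sourceGenerators itself, so the generator can depend on unmanagedSources (the hand-written files only) and return the files it generates, letting the same compile pass pick them up:
sourceGenerators in Compile += Def.task {
  val handWritten = (unmanagedSources in Compile).value // generated (managed) sources are excluded here, so no cycle
  doSomethingWithSources(handWritten) // assumed to return the Seq[File] it generates
}.taskValue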

How can I pass JVM options to SBT to use when running the app or test cases?

I would like to specify JVM options when running my app or the tests for the app through SBT. Specifically, I need to be able to give the JVM the -Djava.security.policy parameter so that my policy is loaded and used for the test.
How can I do this with SBT?
With xsbt, you can run your tests in a forked JVM (for one of the reasons mentioned in "Running Project Code").
If you are using a forked JVM, specify the configuration so it affects only the main run or test tasks:
javaOptions in (Test, run) += "-Xmx8G"
You should be able to specify any other options to that JVM through javaOptions.
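For the question's -Djava.security.policy case, that would look something like this (a sketch; jini.policy is the policy file from the question, and forking is required because javaOptions only apply to forked JVMs):
fork in Test := true
javaOptions in Test += "-Djava.security.policy=jini.policy"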
The OP David Eagen reports that the following configuration didn't work at first, not because of the sbt options, but because of the path:
lazy val escacheServer =
  Project(
    "escache-server",
    file("server"),
    settings = buildSettings ++ Seq(
      resolvers ++= Seq(scala_tools_snapshots, typesafe_repo),
      libraryDependencies ++= escacheServerDeps,
      javaOptions in run += "-Djava.security.policy=jini.policy",
      fork in run := true
    )
  ).dependsOn(escache)
It looks like my problem was that jini.policy wasn't found in the current directory.
I set the full path and now it runs.
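To avoid depending on the working directory, the policy file can be anchored to the build instead, for example (a sketch using the same setting as above):
javaOptions in run += "-Djava.security.policy=" + (baseDirectory.value / "jini.policy")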