How do you impose Scala code coverage specifically for integration tests?

I am running the integration tests using the following sbt command:
sbt clean coverage it:test coverageReport
This command instruments the code, runs the integration tests, and generates the coverage report.
build.sbt has the following:
coverageMinimum in IntegrationTest := 21.0
coverageFailOnMinimum in IntegrationTest := true
The output looks like:
[info] Statement coverage.: 20.16%
[info] Branch coverage....: 12.00%
[info] Coverage reports completed
[info] All done. Coverage was [20.16%]
The output shows 20.16% code coverage, but the limits set in build.sbt are not being enforced.
If I change build.sbt to the following, it works:
coverageMinimum := 21.0
coverageFailOnMinimum := true
I wanted to know what I am missing when specifying the limits specifically for integration tests.
Version Information:
sbt : 0.13.17
sbt-scoverage : 1.5.1

The following two workarounds seem to work on my machine (sbt-scoverage 1.5.1, sbt 1.1.1, Scala 2.12.5).
Workaround 1 - Use inConfig to scope to a configuration:
inConfig(IntegrationTest)(ScoverageSbtPlugin.projectSettings),
inConfig(IntegrationTest)(Seq(coverageMinimum := 21, coverageFailOnMinimum := true))
Now executing sbt clean coverage it:test it:coverageReport fails with Coverage minimum was not reached.
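For context, here is a minimal sketch of how these two settings might be wired into a full project definition (the project name myProject is hypothetical, not part of the workaround):
lazy val myProject = (project in file("."))
  .configs(IntegrationTest)
  .settings(Defaults.itSettings)
  .settings(
    // Re-register the scoverage settings inside the IntegrationTest scope
    // so that it:coverageReport resolves the IntegrationTest-scoped values.
    inConfig(IntegrationTest)(ScoverageSbtPlugin.projectSettings),
    inConfig(IntegrationTest)(Seq(coverageMinimum := 21, coverageFailOnMinimum := true))
  )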
Workaround 2 - Modify coverageMinimum setting within a custom command:
def itTestWithMinCoverage = Command.command("itTestWithMinCoverage") { state =>
  val extracted = Project.extract(state)
  val stateWithCoverage = extracted.append(
    Seq(coverageEnabled := true, coverageMinimum := 21.0, coverageFailOnMinimum := true),
    state
  )
  val (s1, _) = Project.extract(stateWithCoverage).runTask(test in IntegrationTest, stateWithCoverage)
  val (s2, _) = Project.extract(s1).runTask(coverageReport in IntegrationTest, s1)
  s2
}

commands ++= Seq(itTestWithMinCoverage)
Now executing sbt itTestWithMinCoverage fails with Coverage minimum was not reached. Note that after itTestWithMinCoverage finishes, the modified state is discarded, so coverageMinimum is back to its default value and does not affect unit tests.
It seems the issue is (besides my incomplete understanding of how scopes work) that checkCoverage picks up the default value of coverageMinimum even after coverageMinimum in IntegrationTest has been set.
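To see which value each scope actually resolves, sbt's inspection commands can help (sbt 0.13 shell syntax):
show coverageMinimum
show it:coverageMinimum
inspect it:coverageMinimum
The first prints the value in the default scope (the one checkCoverage appears to read), the second prints the IntegrationTest-scoped value, and inspect shows where the value comes from via the delegation chain.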

Related

Play / sbt: What are these 2 Scala sources that are always compiled when restarting on code change?

Whenever I change code and Play does a restart, it always compiles 2 Scala sources first, like:
[info] Compiling 2 Scala sources to /Users/mpa/dev/myplayproject/server/target/scala-2.13/classes ...
Only after that are the sources I actually changed compiled.
What are these 2 sources?
Is there a way this can be avoided?
With the tip from @cbley I found the problematic class:
BuildInfo.scala from the sbt-buildinfo plugin. By default it creates a new timestamp on every compile, which then causes a recompile of BuildInfo.scala.
I actually had a similar problem with scoverage - see here: scoverage: Combine Coverage from test and it:test
Adding the accepted answer from there fixed the problem:
import java.time.{ZoneOffset, ZonedDateTime}

lazy val buildTime: SettingKey[String] =
  SettingKey[String]("buildTime", "time of build")

ThisBuild / buildTime := ZonedDateTime.now(ZoneOffset.UTC).toString

buildInfoKeys :=
  Seq[BuildInfoKey](
    name,
    version,
    scalaVersion,
    sbtVersion,
    buildTime
  )

scoverage: Combine Coverage from test and it:test

I split my unit and integration tests with a filter:
lazy val FunTest = config("it") extend Test
def funTestFilter(name: String): Boolean = name endsWith "Spec"
def unitTestFilter(name: String): Boolean = name endsWith "Test"
...
testOptions in Test := Seq(Tests.Filter(unitTestFilter)),
testOptions in FunTest := Seq(Tests.Filter(funTestFilter)),
...
So I can do something like this:
sbt clean coverage test dockerComposeUp it:test dockerComposeStop coverageReport
Sadly, that kills all my coverage; only the generated BuildInfo has any coverage.
Using only sbt clean coverage test coverageReport or sbt clean coverage it:test coverageReport works as expected.
The whole project can be found here: https://github.com/pme123/play-binding-form
scoverage Version: 1.5.1
SBT supports incremental compilation, but Scoverage does not. Scoverage clears its instrumentation information before compilation starts and runs the instrumentation process from scratch every time. Compiling only a subset of all classes with Scoverage enabled will therefore result in wrong coverage reports.
In this case the sbt-buildinfo plugin is enabled in the server module. It registers a source generator, which is executed before every compilation and generates the server/target/scala-2.12/src_managed/main/sbt-buildinfo/BuildInfo.scala file.
The sbt-buildinfo plugin is smart enough to regenerate this file only when its content changes, but since BuildInfoOption.BuildTime is included in the buildInfoOptions setting, the file is regenerated before every compilation.
When it comes to the compilation process, the compiler therefore finds one modified file (BuildInfo.scala) every time and starts an incremental compilation of just this one file. Scoverage clears its previous instrumentation information and keeps only the information about BuildInfo.scala.
In an execution like sbt clean coverage test dockerComposeUp it:test dockerComposeStop coverageReport, the first compilation happens as part of the test task and the second as part of the it:test task; that second, incremental compilation is what wipes the coverage collected so far. That's why there is no problem when the two are used separately.
Docker has nothing to do with the problem.
To fix it, you have to prevent BuildInfo.scala from being regenerated on every compilation, at least when coverage is enabled.
I did it by modifying the project/Settings.scala file like this:
private lazy val buildInfoSettings = Seq(
  buildInfoKeys := Seq[BuildInfoKey](name, version, scalaVersion, sbtVersion),
  // this line was changed: drop the BuildTime option when coverage is enabled
  buildInfoOptions ++= { if (coverageEnabled.value) Seq() else Seq(BuildInfoOption.BuildTime) },
  buildInfoOptions += BuildInfoOption.ToJson,
  buildInfoPackage := "pme123.adapters.version"
)
buildInfoOptions does not include the BuildTime option when coverage is turned on.
It doesn't look elegant, but it works. You can probably find a better way.
Instead of having different BuildInfo objects depending on the phase, which could lead to compilation errors, you can use your own build time:
import java.time.{ZoneOffset, ZonedDateTime}

lazy val buildTime: SettingKey[String] =
  SettingKey[String]("buildTime", "time of build")

ThisBuild / buildTime := ZonedDateTime.now(ZoneOffset.UTC).toString

buildInfoKeys :=
  Seq[BuildInfoKey](
    name,
    version,
    scalaVersion,
    sbtVersion,
    buildTime
  )
This should resolve the issue.
I have this configuration in a project of mine because I wanted better control over the way the date is formatted, and I don't have the same issue.

How to execute Main-Class as part of compile in SBT

I wanted to execute a task as part of SBT's compile. I tried runMain in compile, but it is not executing the main class that I am providing. Below is how the task looks in build.sbt:
lazy val scalaGeneratorPlugin = Project("scala-generator", file("scala-generator"))
  .settings(
    libraryDependencies += "org.freemarker" % "freemarker" % "2.3.23",
    runMain in compile := Some("com.my.MyMainClass")
  )
I am running the following command:
sbt scala-generator/compile
Although it gives me a success message, it does not execute my main class.
I am copying laughedelic's answer from the comments here:
I think you should use source generation in sbt for that, i.e. there should be different compilation stages.
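As a rough illustration of that suggestion (a sketch only: the generator body and the file name Generated.scala are assumptions, standing in for whatever com.my.MyMainClass would emit), a source generator runs before compilation and its output files are compiled together with the regular sources:
lazy val scalaGeneratorPlugin = Project("scala-generator", file("scala-generator"))
  .settings(
    libraryDependencies += "org.freemarker" % "freemarker" % "2.3.23",
    // Instead of invoking a main class from compile, register a source
    // generator: this task runs before compilation, and the files it
    // returns are picked up by the compile task automatically.
    sourceGenerators in Compile += Def.task {
      val file = (sourceManaged in Compile).value / "Generated.scala"
      // Hypothetical stand-in for the real generation logic.
      IO.write(file, "object Generated { val answer = 42 }")
      Seq(file)
    }.taskValue
  )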

Aggregate different modules based on scala binary version

I'm trying to cross-build a project for (2.11, 2.12) where some of the subprojects should not be built for 2.12 because their transitive dependencies are not yet released for 2.12, specifically Spark for Scala 2.12. The root project's aggregate looks like:
lazy val root = (project in file("."))
  .aggregate(vegas, spark, flink, macros)
  .settings(commonSettings: _*)
  .settings(noPublishSettings: _*)
Is there some way to detect the scalaBinaryVersion in #aggregate and aggregate a different set of projects when the cross-build is trying to produce a 2.12 artifact?
There appears to be no direct way to do it. As a workaround, you may want to get a similar effect by making spark's libraryDependencies empty and skipping compile and publish when scalaBinaryVersion is 2.12:
// tested on sbt 1.1.0
lazy val spark = (project in file("spark"))
  .settings(
    // ... other settings ...

    // Empty out libraryDependencies when scalaBinaryVersion is 2.12.
    libraryDependencies :=
      (if (scalaBinaryVersion.value == "2.12") Seq.empty else libraryDependencies.value),

    // Skip compilation and publishing when scalaBinaryVersion is 2.12.
    skip in compile := scalaBinaryVersion.value == "2.12",
    skip in publish := scalaBinaryVersion.value == "2.12"
  )
The skip task key allows us to skip certain tasks. From inspect skip:
Task: Boolean
For tasks that support it (currently only compile, update, and publish), setting skip to true will force the task to not do its work. The exact semantics may vary by task.
However, in contrast to compile and publish, skip in update := scalaBinaryVersion.value == "2.12" does not work here. From the sbt Reference Manual:
Overriding all of the above, skip in update := true will tell sbt to never perform resolution. ... Also, (note that) update itself will immediately fail if resolution has not been allowed to run since the last clean.

Code coverage for Scala integration tests with SCCT

I'm running integration tests in Scala; these are found in the src/it/scala directory, and I've added the following to my build.sbt:
seq(Defaults.itSettings: _*)
However, when I run SCCT to calculate code coverage, the integration tests are not run. How can I make them run?
I am using scct 0.3-SNAPSHOT / sbt 0.13
For merging test + it:test, try the following setting:
ScctPlugin.instrumentSettings ++ Defaults.itSettings ++ Seq(
  resourceDirectory in ScctPlugin.ScctTest <<= (resourceDirectory in Test),
  sources in ScctPlugin.ScctTest ++= (sources in IntegrationTest).value
)
This might get tricky if you have different resources; see the sketch below.
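For instance, one might also pull in the integration-test resources, mirroring how the sources are merged above (a hedged sketch only; whether ScctPlugin.ScctTest can be scoped this way for resourceDirectories is an assumption):
ScctPlugin.instrumentSettings ++ Defaults.itSettings ++ Seq(
  resourceDirectory in ScctPlugin.ScctTest <<= (resourceDirectory in Test),
  // Assumption: also append the integration-test resource directories.
  resourceDirectories in ScctPlugin.ScctTest ++= (resourceDirectories in IntegrationTest).value,
  sources in ScctPlugin.ScctTest ++= (sources in IntegrationTest).value
)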