Override default compile task in sbt - scala

In sbt, the compile task compiles the project's main code and test:compile compiles its tests. I want a single compile task which does both. I want to override the default compile task and don't want a task with a new name (because I want to enforce compilation success of all tests with every change to the project's main code). I am using Build.scala (not build.sbt) and tried the method described in this SO answer. My attempt is pasted below and does not work because the return type of the compile task is TaskKey[Analysis]. How should I change this?
val compileInTest = TaskKey[Analysis]("compile the tests")

compileInTest := {
  (compile in Compile in <module-name>).value
  (compile in Test in <module-name>).value
}
lazy val projectA = Project(
  "a",
  file("a"),
  settings = hwsettings ++ Seq(
    compile := compileInTest
  ))

You can define an alias in the .sbtrc file:
alias compile=test:compile
which will run both tasks, because test:compile depends on the main compile.
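If you would rather keep the alias in the build itself (so every checkout gets it) instead of a per-user .sbtrc, the same thing can be expressed with addCommandAlias; a sketch:
// Same effect as the .sbtrc alias: typing `compile` at the sbt shell now
// runs test:compile, which itself depends on the main compile, so both run.
addCommandAlias("compile", "test:compile")
Directly redefining compile in Compile to depend on compile in Test is likely to trip a cyclic reference, since test:compile already depends on the main compile's products; that is why the alias route is the usual answer here.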

How do I make sbt include non-Java sources in the published artifact?

I'm using the Kotlin plugin and can't figure out how to force sbt to include .kt files in the published source jar. It only includes .java files.
A lot of people online suggest adding the following code to the sbt script, but it doesn't help:
mappings in (Compile, packageSrc) ++= {
  val base = (sourceManaged in Compile).value
  val files = (managedSources in Compile).value
  files.map { f => (f, f.relativeTo(base).get.getPath) }
},
I also tried
includeFilter in (Compile, packageSrc) := "*.scala" || "*.java" || "*.kt",
Here is the output of some variables in the sbt console:
sbt:collections> show unmanagedSourceDirectories
[info] * /home/expert/work/sideprojects/unoexperto/extensions-collections/src/main/scala
[info] * /home/expert/work/sideprojects/unoexperto/extensions-collections/src/main/java
[info] * /home/expert/work/sideprojects/unoexperto/extensions-collections/src/main/kotlin
sbt:collections> show unmanagedSources
[info] * /home/expert/work/sideprojects/unoexperto/extensions-collections/src/main/java/com/walkmind/extensions/collections/TestSomething.java
Which plugin do you use for Kotlin?
https://github.com/pfn/kotlin-plugin has the option kotlinSource to configure where the source directory is located.
sbt packageBin compiles the Kotlin files and includes them in the output jar.
build.sbt
// define kotlin source directory
kotlinSource in Compile := baseDirectory.value / "src/main/kotlin",
src/main/kotlin/org/test
package org.test

fun main(args: Array<String>) {
  println("Hello World!")
}
console
sbt compile
sbt packageBin
target/scala-2.13
The jar includes MainKt.class, and the folder org/test contains MainKt.class too.
Would this solve your problem?
I found a workaround for this in my project https://github.com/makiftutuncu/e. I did the following: https://github.com/makiftutuncu/e/blob/master/project/Settings.scala#L105
Basically, I added the following setting in sbt to properly generate the sources artifact:
// Include Kotlin files in the sources artifact
packageConfiguration in (Compile, packageSrc) := {
  val old        = (packageConfiguration in (Compile, packageSrc)).value
  val newSources = (sourceDirectories in Compile).value.flatMap(dir => (dir ** "*.kt").get)

  new Package.Configuration(
    old.sources ++ newSources.map(f => f -> f.getName),
    old.jar,
    old.options
  )
}
For the documentation artifact, I added a Gradle build to my Kotlin module. I set it up as shown here: https://github.com/makiftutuncu/e/blob/master/e-kotlin/build.gradle.kts. This way, the Gradle build generates the Dokka documentation. Finally, I added the following setting in sbt to run Gradle while building docs:
// Delegate doc generation to Gradle and Dokka
doc in Compile := {
  import sys.process._
  Process(Seq("./gradlew", "dokkaJavadoc"), baseDirectory.value).!
  target.value / "api"
}
I admit, this is a lot of work just to get 2 artifacts, but it did the trick for me. 🤷🏻 Hope this helps.
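If you only need the source jar fixed, a simpler variant of the same idea may also work (a sketch, untested, assuming the standard unmanagedSourceDirectories layout shown in the question): add the .kt files to the packageSrc mappings relative to their source directories, so package paths inside the jar are preserved:
// Sketch: add Kotlin sources to the source jar, keeping package directories.
// relativeTo(dir) maps each file to a jar path like com/walkmind/.../Foo.kt.
mappings in (Compile, packageSrc) ++= {
  (unmanagedSourceDirectories in Compile).value.flatMap { dir =>
    (dir ** "*.kt") pair relativeTo(dir)
  }
}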

scoverage: Combine Coverage from test and it:test

I split my unit and integration tests with a filter:
lazy val FunTest = config("it") extend Test
def funTestFilter(name: String): Boolean = name endsWith "Spec"
def unitTestFilter(name: String): Boolean = name endsWith "Test"
...
testOptions in Test := Seq(Tests.Filter(unitTestFilter)),
testOptions in FunTest := Seq(Tests.Filter(funTestFilter)),
...
So I can do something like this:
sbt clean coverage test dockerComposeUp it:test dockerComposeStop coverageReport
Sadly that kills all my coverage; only the generated BuildInfo has coverage.
Using only sbt clean coverage test coverageReport or sbt clean coverage it:test coverageReport works as expected.
The whole project can be found here: https://github.com/pme123/play-binding-form
scoverage Version: 1.5.1
SBT supports incremental compilation, but Scoverage does not. Scoverage clears its instrumentation information before compilation starts and redoes the instrumentation from scratch every time. Compiling only a subset of all classes with Scoverage enabled will therefore result in wrong coverage reports.
In this case the sbt-buildinfo plugin is enabled in the server module. It registers a source generator, which is executed before every compilation and generates the server/target/scala_2.12/src_managed/main/sbt-buildinfo/BuildInfo.scala file.
The sbt-buildinfo plugin is smart enough to regenerate this file only when its content changes, but since BuildInfoOption.BuildTime is included in the buildInfoOptions setting, this file is regenerated before every compilation.
When it comes to the compilation process, the compiler therefore finds one modified file (BuildInfo.scala) every time and starts an incremental compilation of this one file. Scoverage clears its previous instrumentation information and saves only the information about the BuildInfo.scala file.
In an execution like sbt clean coverage test dockerComposeUp it:test dockerComposeStop coverageReport, the first compilation is part of the test task and the second one part of the it:test task. That's why there is no problem when they are used separately.
Docker has nothing to do with our problem.
To fix the problem you have to prevent the BuildInfo.scala file from being regenerated on every compilation, at least when coverage is enabled.
I did it by modifying the project/Settings.scala file in this way:
private lazy val buildInfoSettings = Seq(
  buildInfoKeys := Seq[BuildInfoKey](name, version, scalaVersion, sbtVersion),
  buildInfoOptions ++= { if (coverageEnabled.value) Seq() else Seq(BuildInfoOption.BuildTime) }, // <-- this line was changed
  buildInfoOptions += BuildInfoOption.ToJson,
  buildInfoPackage := "pme123.adapters.version"
)
buildInfoOptions does not include the BuildTime option when coverage is turned on.
It doesn't look elegant, but it works. You can probably find a better way.
Instead of having different BuildInfo objects depending on the phase, which could lead to compilation errors, you can use your own build time:
import java.time.{ZoneOffset, ZonedDateTime}

lazy val buildTime: SettingKey[String] = SettingKey[String]("buildTime", "time of build")

ThisBuild / buildTime := ZonedDateTime.now(ZoneOffset.UTC).toString

buildInfoKeys :=
  Seq[BuildInfoKey](
    name,
    version,
    scalaVersion,
    sbtVersion,
    buildTime
  )
This should resolve this issue.
I have this configuration in a project of mine because I wanted better control over the way the date is formatted, and I don't have the same issue.

How to use SBT to run ScalaTest tests against a fat jar?

I have a simple SBT project, consisting of some Scala code in src/main/scala and some test code in src/test/scala. I use the sbt-assembly plugin to create a fat jar for deployment onto remote systems. The fat jar includes all the dependencies of the Scala project, including the Scala runtime itself. This all works great.
Now I'm trying to figure out a way to run the Scala tests against the fat jar. I tried the obvious thing: creating a new config extending the Test config and modifying the dependencyClasspath to be the fat jar instead of the default value. However this fails, I assume because the Scala runtime is included in the fat jar and somehow collides with the already-loaded Scala runtime.
My solution right now works, but it has serious drawbacks. I just use Fork.java to invoke Java on the org.scalatest.tools.Runner runner with a classpath set to include the test code, the fat jar, and all of the test dependencies. The downside is that none of the SBT test richness works: there's no testQuick, there's no testOnly, and the test failure reporting is on stdout.
My question boils down to this: how does one use SBT's test commands to run tests when those tests are dependent not on their corresponding SBT compile output, but on a fat JAR file which itself includes all the Scala runtimes?
This is what I landed on (for specs2, but it can be adapted). This is basically the Fork solution you described, but I figured I'd leave it here in case someone wanted to know what that might look like. Unfortunately I don't think you can run this "officially" as an SBT test runner. I should also add that you still want Fork.java even though this is Scala, because Fork.scala depends on a runner class that I don't seem to have.
test.sbt (or build.sbt, if you want to put a bunch of stuff there - SBT reads all .sbt files in the root if you want to organize):
// Set up configuration for building a test assembly
Test / assembly / assemblyJarName := s"${name.value}-test-${version.value}.jar"
Test / assembly / assemblyMergeStrategy := (assembly / assemblyMergeStrategy).value
Test / assembly / assemblyOption := (assembly / assemblyOption).value
Test / assembly / assemblyShadeRules := (assembly / assemblyShadeRules).value
Test / assembly / mainClass := Some("org.specs2.runner.files")

Test / test := {
  (Test / assembly).value
  val assembledFile: String = (Test / assembly / assemblyOutputPath).value.getAbsolutePath
  // Pick the Scala runtime and logging jars from the full classpath for the fork
  val minimalClasspath: Seq[String] = (Test / assembly / fullClasspath).value
    .filter(_.metadata.get(moduleID.key).get.organization.matches("^(org\\.(scala-lang|slf4j)|log4j).*"))
    .map(_.data.getAbsolutePath)
  val runClass: String = (Test / assembly / mainClass).value.get
  val classPath: Seq[String] = Seq(assembledFile) ++ minimalClasspath
  val args: Seq[String] = Seq("-cp", classPath.mkString(":"), runClass)
  val exitCode = Fork.java((Test / assembly / forkOptions).value, args)
  if (exitCode != 0) {
    throw new TestsFailedException()
  }
}

// Don't run tests as part of building the test assembly itself
Test / assembly / test := {}
Change in build.sbt:
lazy val root = (project in file("."))
.settings(/* your original settings are here */)
.settings(inConfig(Test)(baseAssemblySettings): _*) // enable assembling in test
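Since the question itself is about ScalaTest, the same skeleton can presumably be adapted by swapping the runner; a sketch (untested, with the classpath handling simplified to just the fat jar; org.scalatest.tools.Runner takes its runpath via -R):
// Hypothetical ScalaTest variant of the specs2 setup above.
// org.scalatest.tools.Runner discovers suites on the runpath passed with -R.
Test / assembly / mainClass := Some("org.scalatest.tools.Runner")

Test / test := {
  (Test / assembly).value
  val assembledFile = (Test / assembly / assemblyOutputPath).value.getAbsolutePath
  val runClass      = (Test / assembly / mainClass).value.get
  val args          = Seq("-cp", assembledFile, runClass, "-R", assembledFile)
  if (Fork.java((Test / assembly / forkOptions).value, args) != 0)
    throw new TestsFailedException()
}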

SBT plugin: how to make a source generator dependent on the project's sources?

I'm trying to create a source generator in an SBT plugin that generates code based on the project's sources.
I tried something like this:
sourceGenerators in Compile += (sources in Compile) map { sources => doSomethingWithSources(sources) }
Unfortunately, SBT does not want to load this plugin because there is a circular dependency.
So instead I've created another task like this:
lazy val myTask = TaskKey[Unit]("myTask", "Do stuff")
This task actually depends on the sources value and generates the files.
Later I override the projectSettings value and add this:
myTask in Compile := {
  val sourcesValue = (sources in Compile).value
  doSomethingWithSources(sourcesValue)
},
sourceGenerators in Compile += Def.task(Seq(new File("path/to/myGeneratedSource.scala"))).taskValue
I add this task as a dependency of the compile task in the build.sbt of the project that uses my plugin, like this:
compile in Compile <<= (compile in Compile) dependsOn (myTask in Compile)
While it works (the file is generated), when I launch sbt run, it creates the file but does not compile it.
What is more, when I run sbt compile run, the compile task compiles only the project and generates my source, and then the run part compiles the generated source. So, in a manner of speaking, it does somehow work, but it needs two compilations.
I'd like to ask if there is a simpler way to do this and, if not, how to make it work with only one compilation.
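One way to break the cycle (a sketch, untested): sources includes managedSources, i.e. the output of sourceGenerators, which is what makes the original attempt circular. Depending only on unmanagedSources (the hand-written files) avoids the cycle, and returning the generated files from the generator itself means they are compiled in the same pass. Here doSomethingWithSources is the question's hypothetical generator, assumed to return the Seq[File] it wrote:
sourceGenerators in Compile += Def.task {
  // unmanagedSources excludes generated sources, so there is no circular dependency
  val handWritten = (unmanagedSources in Compile).value
  doSomethingWithSources(handWritten)
}.taskValue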

How to change value of setting for a custom configuration under play/sbt?

I have a play project, and I want to add an sbt task that runs the application with a given folder available as a resource. However, I don't want that folder to be on the classpath during "normal" runs.
I created a configuration and added the resources to that configuration, but when I run in that configuration, the files aren't being picked up.
For example, I have:
val Mock = config("mock") extend Compile
val mock = inputKey[Unit]("run in mock mode")

val project = Project("my-project", file("src/"))
  .configs(Mock)
  .settings(
    unmanagedResourceDirectories in Mock ++= Seq(baseDirectory.value / "mock-resources"),
    mock <<= run in Mock
  )
I want it so that when I type mock, mock-resources is on the classpath, and when I type run it isn't.
I'm using play 2.2.0 with sbt 0.13.1
You need to set the settings and tasks that live under the Compile configuration in the newly-defined Mock configuration. The reason is this:
lazy val Mock = config("mock") extend Compile
When there's no setting or task under Mock, sbt keeps searching and ends up in Compile, where run is indeed defined, but then it uses the Compile values.
Do the following and it's going to work - note Classpaths.configSettings and run in Seq:
lazy val Mock = config("mock") extend Compile
lazy val mock = inputKey[Unit]("run in mock mode")

lazy val mockSettings = inConfig(Mock) {
  Classpaths.configSettings ++
    Seq(
      unmanagedClasspath += baseDirectory.value / "mock-resources",
      mock <<= run in Mock,
      run <<= Defaults.runTask(fullClasspath in Mock, mainClass in Mock, runner in Mock)
    )
}

lazy val p = (project in file("src/")).configs(Mock).settings(
  mockSettings: _*
)
NOTE I'm unsure why I needed the following line:
run <<= Defaults.runTask(fullClasspath in Mock, mainClass in Mock, runner in Mock)
My guess is that because run uses fullClasspath, which defaults to the Compile scope, it doesn't see the value in Mock. sbt keeps amazing me!
I've asked about it in Why does the default run task not pick settings in custom configuration?
Sample
I've been running the build with the following hello.scala under the src directory:
object Hello extends App {
  val r = getClass.getResource("/a.properties")
  println(s"resource: $r")
}
Upon p/mock:mock it gave me:
> p/mock:mock
[info] Running Hello
resource: file:/Users/jacek/sandbox/mock-config/src/mock-resources/a.properties
Same for p/mock:run:
> p/mock:run
[info] Running Hello
resource: file:/Users/jacek/sandbox/mock-config/src/mock-resources/a.properties
And mock was no different:
> mock
[info] Running Hello
resource: file:/Users/jacek/sandbox/mock-config/src/mock-resources/a.properties