In my project I use a plugin which exposes a task genExport. I can run the genExport task from the console with:
sbt genExport
My problem is that I cannot figure out how to configure my sbt project to run genExport after compilation:
lazy val sample: Project = project
  .in(file("sample"))
  .settings(
    MyPluginKeys.someKey := "someKeyValue",
    compile in Compile <<= (compile in Compile) map { x =>
      println("----------")
      // ???
      x
    }
  )
  .enablePlugins(MyPlugin)
From the sbt documentation I could not figure out how to invoke a plugin's task by name. I've experimented with:
taskKey[Unit]("genExport").taskValue
as well as:
val genExport = TaskKey[Unit]("genExport")
and
genExport <<= genExport triggeredBy (compile in Compile)
without any success. What am I missing?
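For reference, in sbt 1.x (where `<<=` is removed) the attempts above can be combined into a single setting. This is a sketch, not a verified fix: it assumes the plugin's key can be re-declared by name (sbt matches task keys by name), and MyPlugin / MyPluginKeys are the names from the question:

```scala
// Sketch (sbt 1.x syntax): re-declare the key by name so it resolves to the
// plugin's genExport task, then trigger it whenever Compile / compile runs.
lazy val genExport = taskKey[Unit]("genExport")

lazy val sample: Project = project
  .in(file("sample"))
  .enablePlugins(MyPlugin)
  .settings(
    MyPluginKeys.someKey := "someKeyValue",
    // triggeredBy re-runs genExport after every successful compile
    genExport := genExport.triggeredBy(Compile / compile).value
  )
```

With this in place, sbt compile should run genExport automatically; if the plugin already exports the key, using MyPluginKeys.genExport directly avoids the re-declaration.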
Related
When I run sbt assembly the tests are not run. How can I make the tests run before the assembly task?
From the documentation at https://github.com/sbt/sbt-assembly#assembly-task:
To run the tests during assembly:
lazy val app = (project in file("app"))
  .settings(
    assembly / test := (Test / test).value,
    // more settings here ...
  )
How do I make sbt include non-Java sources in the published artifact?
I'm using the Kotlin plugin and can't figure out how to force sbt to include .kt files in the published sources jar. It only includes .java files.
A lot of people online suggest adding the following code to the sbt build, but it doesn't help:
mappings in (Compile, packageSrc) ++= {
  val base = (sourceManaged in Compile).value
  val files = (managedSources in Compile).value
  files.map { f => (f, f.relativeTo(base).get.getPath) }
},
I also tried:
includeFilter in (Compile, packageSrc) := "*.scala" || "*.java" || "*.kt",
Here is the output of some variables in the sbt console:
sbt:collections> show unmanagedSourceDirectories
[info] * /home/expert/work/sideprojects/unoexperto/extensions-collections/src/main/scala
[info] * /home/expert/work/sideprojects/unoexperto/extensions-collections/src/main/java
[info] * /home/expert/work/sideprojects/unoexperto/extensions-collections/src/main/kotlin
sbt:collections> show unmanagedSources
[info] * /home/expert/work/sideprojects/unoexperto/extensions-collections/src/main/java/com/walkmind/extensions/collections/TestSomething.java
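Given that show unmanagedSources lists only the .java file, one thing worth checking (a guess rather than a verified fix) is the scoping of the filter: includeFilter scoped to (Compile, packageSrc) only affects packaging, while source discovery reads the filter scoped to the unmanagedSources task:

```scala
// Sketch: scope the filter to source discovery itself so that .kt files
// appear in Compile / unmanagedSources (and from there in packageSrc).
includeFilter in (Compile, unmanagedSources) := "*.scala" || "*.java" || "*.kt"
```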
Which plugin do you use for Kotlin?
https://github.com/pfn/kotlin-plugin has the option kotlinSource to configure where the source directory is located.
sbt packageBin compiles the Kotlin files and includes them in the output jar.
build.sbt
// define kotlin source directory
kotlinSource in Compile := baseDirectory.value / "src/main/kotlin",
src/main/kotlin/org.test
package org.test
fun main(args: Array<String>) {
    println("Hello World!")
}
console
sbt compile
sbt packageBin
In target/scala-2.13, the jar includes MainKt.class, and the folder org/test inside it contains MainKt.class too.
Would this solve your problem?
I found a workaround for this in my project https://github.com/makiftutuncu/e. I did the following: https://github.com/makiftutuncu/e/blob/master/project/Settings.scala#L105
Basically, I added the following setting in sbt to properly generate the sources artifact:
// Include Kotlin files in sources
packageConfiguration in (Compile, packageSrc) := {
  val old = (packageConfiguration in (Compile, packageSrc)).value
  val newSources = (sourceDirectories in Compile).value.flatMap(dir => (dir ** "*.kt").get)

  new Package.Configuration(
    old.sources ++ newSources.map(f => f -> f.getName),
    old.jar,
    old.options
  )
}
For the documentation artifact, I added a Gradle build to my Kotlin module, set up as shown here: https://github.com/makiftutuncu/e/blob/master/e-kotlin/build.gradle.kts. This way, the Gradle build generates the Dokka documentation. Finally, I added the following setting in sbt to run Gradle while building docs:
// Delegate doc generation to Gradle and Dokka
doc in Compile := {
  import sys.process._
  Process(Seq("./gradlew", "dokkaJavadoc"), baseDirectory.value).!
  target.value / "api"
}
I admit, this is a lot of work just to get 2 artifacts but it did the trick for me. 🤷🏻 Hope this helps.
I have a simple SBT project, consisting of some Scala code in src/main/scala and some test code in src/test/scala. I use the sbt-assembly plugin to create a fat jar for deployment onto remote systems. The fat jar includes all the dependencies of the Scala project, including the Scala runtime itself. This all works great.
Now I'm trying to figure out a way to run the Scala tests against the fat jar. I tried the obvious thing: creating a new config extending the Test config and modifying the dependencyClasspath to be the fat jar instead of the default value. However, this fails, I assume because the Scala runtime included in the fat jar collides somehow with the already-loaded Scala runtime.
My solution right now works, but it has serious drawbacks. I just use Fork.java to invoke Java on the org.scalatest.tools.Runner runner with a classpath that includes the test code, the fat jar, and all of the test dependencies. The downside is that none of the sbt test richness works: there's no testQuick, no testOnly, and test failure reporting is on stdout.
My question boils down to this: how does one use SBT's test commands to run tests when those tests are dependent not on their corresponding SBT compile output, but on a fat JAR file which itself includes all the Scala runtimes?
This is what I landed on (for specs2, but it can be adapted). This is basically what you said was your Fork solution, but I figured I'd leave this here in case someone wanted to know what that might look like. Unfortunately I don't think you can run this "officially" as an SBT test runner. I should also add that you still want Fork.java even though this is Scala, because Fork.scala depends on a runner class that I don't seem to have.
test.sbt (or build.sbt, if you want to put everything in one place; SBT reads all .sbt files in the project root if you want to organize):
// Set up configuration for building a test assembly
Test / assembly / assemblyJarName := s"${name.value}-test-${version.value}.jar"
Test / assembly / assemblyMergeStrategy := (assembly / assemblyMergeStrategy).value
Test / assembly / assemblyOption := (assembly / assemblyOption).value
Test / assembly / assemblyShadeRules := (assembly / assemblyShadeRules).value
Test / assembly / mainClass := Some("org.specs2.runner.files")
Test / test := {
  (Test / assembly).value

  val assembledFile: String = (Test / assembly / assemblyOutputPath).value.getAbsolutePath
  val minimalClasspath: Seq[String] = (Test / assembly / fullClasspath).value
    .filter(_.metadata.get(moduleID.key).get.organization.matches("^(org\\.(scala-lang|slf4j)|log4j).*"))
    .map(_.data.getAbsolutePath)
  val runClass: String = (Test / assembly / mainClass).value.get
  val classPath: Seq[String] = Seq(assembledFile) ++ minimalClasspath
  val args: Seq[String] = Seq("-cp", classPath.mkString(":"), runClass)

  val exitCode = Fork.java((Test / assembly / forkOptions).value, args)
  if (exitCode != 0) {
    throw new TestsFailedException()
  }
}
Test / assembly / test := {}
Change in build.sbt:
lazy val root = (project in file("."))
  .settings(/* your original settings are here */)
  .settings(inConfig(Test)(baseAssemblySettings): _*) // enable assembling in test
In SBT, the compile task compiles the project code and test:compile compiles the project's tests. I want a single compile task which does both. I want to override the default compile task and don't want a task with a new name (because I want to enforce that all tests compile successfully with every change to the project's main code). I am using Build.scala (not build.sbt) and tried the method described in this SO answer. My attempt is pasted below and does not work because the return type of the compile task is TaskKey[Analysis]. How should I change this?
val compileInTest = TaskKey[Analysis]("compile the tests")

compileInTest := {
  (compile in Compile in <module-name>).value
  (compile in Test in <module-name>).value
}

lazy val projectA = Project(
  "a",
  file("a"),
  settings = hwsettings ++ Seq(
    compile := compileInTest
  ))
You can define an alias in the .sbtrc file:
alias compile=test:compile
which will do both tasks.
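If a single overridden compile task is still preferred over the alias, here is a dependsOn-based sketch (untested against the asker's Build.scala; the Analysis return type is preserved because the right-hand side still returns the original Compile result):

```scala
// Sketch: make compile also force test compilation; the inner
// (compile in Compile) refers to the previous definition of the task,
// so it still returns the Compile Analysis.
compile in Compile := ((compile in Compile) dependsOn (compile in Test)).value
```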
I have a Play project, and I want to add an sbt task that runs the application with a given folder available as a resource. However, I don't want that folder to be on the classpath during "normal" runs.
I created a configuration and added the resources to that configuration, but when I run in that configuration, the files aren't being picked up.
For example, I have:
val Mock = config("mock") extend Compile
val mock = inputKey[Unit]("run in mock mode")

val project = Project("my project", file("src/"))
  .configs(Mock)
  .settings(
    unmanagedResourceDirectories in Mock ++= Seq(baseDirectory.value / "mock-resources"),
    mock <<= run in Mock
  )
I want it so that when I type mock, mock-resources is on the classpath, and when I type run it isn't.
I'm using Play 2.2.0 with sbt 0.13.1.
You need to set the appropriate settings and tasks that are under the Compile configuration in the newly-defined Mock configuration. The reason is this:
lazy val Mock = config("mock") extend Compile
When there's no setting or task under Mock, sbt keeps searching in Compile, where run is indeed defined, but it uses the Compile values.
Do the following and it's going to work; note Classpaths.configSettings and run in the Seq:
lazy val Mock = config("mock") extend Compile
lazy val mock = inputKey[Unit]("run in mock mode")
lazy val mockSettings = inConfig(Mock) {
  Classpaths.configSettings ++
    Seq(
      unmanagedClasspath += baseDirectory.value / "mock-resources",
      mock <<= run in Mock,
      run <<= Defaults.runTask(fullClasspath in Mock, mainClass in Mock, runner in Mock)
    )
}

lazy val p = (project in file("src/")).configs(Mock).settings(
  mockSettings: _*
)
NOTE I'm unsure why I needed the following line:
run <<= Defaults.runTask(fullClasspath in Mock, mainClass in Mock, runner in Mock)
My guess is that because run uses fullClasspath, which defaults to the Compile scope, it doesn't see the value in Mock. sbt keeps amazing me!
I've asked about it in Why does the default run task not pick settings in custom configuration?
Sample
I've been running the build with the following hello.scala under the src directory:
object Hello extends App {
  val r = getClass.getResource("/a.properties")
  println(s"resource: $r")
}
Upon p/mock:mock it gave me:
> p/mock:mock
[info] Running Hello
resource: file:/Users/jacek/sandbox/mock-config/src/mock-resources/a.properties
Same for p/mock:run:
> p/mock:run
[info] Running Hello
resource: file:/Users/jacek/sandbox/mock-config/src/mock-resources/a.properties
And mock was no different:
> mock
[info] Running Hello
resource: file:/Users/jacek/sandbox/mock-config/src/mock-resources/a.properties