I have written an sbt plugin that generates some sources and resources. It is hard-coded to work in the Compile scope.
How can I make it work in the Test scope too, so I can use the plugin when running tests and it will look in and output to the correct folder?
For example, at various points in the code I refer to resourceManaged in Compile, which relates to src/main/resources, but when test is run, I would like it to be resourceManaged in Test, which relates to src/test/resources.
How do I abstract away the scope?
This topic is discussed in Plugins Best Practices, specifically in the Configuration advice section.
Provide raw settings and configured settings
If your plugin is ObfuscatePlugin, provide baseObfuscateSettings that's not scoped in any configuration:
lazy val baseObfuscateSettings: Seq[Def.Setting[_]] = Seq(
obfuscate := Obfuscate((sources in obfuscate).value),
sources in obfuscate := sources.value
)
As you can see above, it's accessing the sources key, but it's not specified which configuration's sources.
inConfig
override lazy val projectSettings = inConfig(Compile)(baseObfuscateSettings)
inConfig scopes the passed in sequence of settings into a particular configuration. If you want to support both Compile and Test out of the box, you can say:
override lazy val projectSettings =
inConfig(Compile)(baseObfuscateSettings) ++
inConfig(Test)(baseObfuscateSettings)
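Applied to the original question, a minimal sketch might look like the following (the plugin and task names are hypothetical). The key point is that resourceManaged.value is left unscoped in the settings sequence, so it resolves to whichever configuration the settings are later placed in via inConfig:

```scala
import sbt._
import Keys._

object GenPlugin extends AutoPlugin {
  object autoImport {
    val generateResources = taskKey[Seq[File]]("Generates resources into resourceManaged")
  }
  import autoImport._

  // Unscoped settings: resourceManaged here resolves to the configuration
  // these settings are placed into via inConfig below.
  lazy val baseGenSettings: Seq[Def.Setting[_]] = Seq(
    generateResources := {
      val dir  = resourceManaged.value // .../resource_managed/main or /test
      val file = dir / "generated.properties"
      IO.write(file, "generated=true")
      Seq(file)
    },
    resourceGenerators += generateResources.taskValue
  )

  // Scope the same settings into both Compile and Test.
  override lazy val projectSettings =
    inConfig(Compile)(baseGenSettings) ++ inConfig(Test)(baseGenSettings)
}
```

With this, `compile` generates into the Compile-scoped resourceManaged directory and `test` into the Test-scoped one, without the plugin hard-coding either.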
Related
When running tasks (e.g., test, jmh:run), I often want to specify javaOptions which are tedious to type by hand (e.g., to dump program data).
This is my current approach:
// build.sbt
lazy val myProject = project
...
.settings(
...
Test / javaOptions ++= (if (sys.props.get("dump").nonEmpty) Seq("-X...", ...) else Nil)
)
I can set system properties on sbt launch (e.g., sbt -Ddump) and then check them with sys.props, but changing these properties requires me to reload sbt. I would like to parse some arguments when the task is invoked, such that I can write test -dump and modify the Test / javaOptions setting accordingly.
Is this possible? Someone recommended I override the default task but I'm having trouble figuring out what that would look like. I have a suspicion I need an InputTask for this, but also don't know what that'd look like.
You don't need to reload sbt: if you are running sbt in an interactive shell, you can change system properties programmatically, e.g. sys.props += ("dump" -> "true").
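If you do want to parse something at invocation time instead, one sketch (assuming sbt 1.x; the command name dumpTest and the specific JVM flag are made up) is a custom command that appends to Test / javaOptions for the current session and then runs test, with no reload needed:

```scala
// In build.sbt: `dumpTest` behaves like `test` but with extra JVM options.
commands += Command.command("dumpTest") { state =>
  val extracted = Project.extract(state)
  // Append the options to the current session only; they disappear on reload.
  val withOpts = extracted.appendWithSession(
    Seq(Test / javaOptions ++= Seq("-XX:+HeapDumpOnOutOfMemoryError")),
    state
  )
  // Queue up `test` to run against the modified state.
  "test" :: withOpts
}
```

An input task with a spaceDelimited parser would work similarly if you want to pass arbitrary flags, but a command is the simpler way to temporarily modify a setting and run an existing task.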
Background
I've created template-scala-project which, among other amenities, defines configurations FunctionalTest and AcceptanceTest (in addition to IntegrationTest, which is provided by SBT).
This way, the relevant bits of the directory structure are:
src/main/scala - compile sources - Compile
src/test/scala - test sources - Test
src/it/scala - it sources - IntegrationTest
src/ft/scala - ft sources - FunctionalTest
src/at/scala - at sources - AcceptanceTest
Behavior in SBT
I can run only functional tests, for example, like this:
ft:test
Everything works as planned. I can even share test sources with it sources, or ft sources, or at sources... which is a common practical requirement.
Behavior in IntelliJ
IntelliJ recognizes that src/test/scala and src/it/scala are test sources. IntelliJ does not make any distinction between them, I mean: no distinction between test sources and integration test sources... but that's OK. All I need is for src/it/scala to be recognized as test sources, and it is.
However, IntelliJ does not recognize src/ft/scala as test sources; IntelliJ does not recognize src/at/scala as test sources.
I have inspected the XML file produced by sbt-structure, but I was unable to work out the pattern or logic behind it. Apparently, though, src/ft/scala and src/at/scala should appear under <configuration id="test"> in order to be eligible for being considered test sources.
Question
In order to test my hypothesis above, I would like to force src/ft/scala to appear under <configuration id="test">, employing "something" in the build.sbt file. How could I accomplish that?
If you want a directory in IntelliJ to be considered a test source directory, you can configure it as such by simply right-clicking it and selecting Mark Directory As > Test Sources Root.
That changes your project structure configuration to let IntelliJ know that tests reside there. The same is true of your resources if those need to be marked appropriately.
That said, I'm not 100% sure whether IntelliJ's use of sbt will recognize that and run them appropriately, but I would expect it to.
After some experimentation, I apparently found something which works well enough. I cannot claim that I've found a solution, but at least it seems to work as I would expect: IntelliJ recognizes FunctionalTest (test) sources and AcceptanceTest (test) sources, the same way it recognizes Test sources and IntegrationTest sources.
Answer:
Create a configuration which extends Runtime and Test. Done this way, src/ft/scala appears under <configuration id="test"> in the XML file produced by sbt-structure. See the example below for the FunctionalTest configuration:
lazy val FunctionalTest = Configuration.of("FunctionalTest", "ft") extend (Runtime, Test)
As a bonus, I show below some other settings which I find useful to associate with the FunctionalTest configuration:
lazy val forkSettings: Seq[Setting[_]] =
  Seq(
    fork in ThisScope := false,
    parallelExecution in ThisScope := false
  )
lazy val ftSettings: Seq[Setting[_]] =
inConfig(FunctionalTest)(Defaults.testSettings ++ forkSettings ++
Seq(
unmanagedSourceDirectories in FunctionalTest ++= (sourceDirectories in Test).value,
unmanagedResourceDirectories in FunctionalTest ++= (resourceDirectories in Test).value,
))
I also did something similar for configuration AcceptanceTest.
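For completeness, the configurations also need to be registered on the project itself for ft:test and at:test to be available. A sketch of how that wiring might look (atSettings would mirror ftSettings, built via inConfig(AcceptanceTest)(...)):

```scala
// Same pattern as FunctionalTest: extend Runtime and Test so that
// sbt-structure files the sources under <configuration id="test">.
lazy val AcceptanceTest = Configuration.of("AcceptanceTest", "at") extend (Runtime, Test)

lazy val root = (project in file("."))
  .configs(FunctionalTest, AcceptanceTest) // register the custom configurations
  .settings(ftSettings: _*)
  // .settings(atSettings: _*) — defined analogously to ftSettings
```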
You can find a project which exercises such kind of configurations at:
http://github.com/frgomes/template-scala-project
I am trying to extend my build with a task that will generate a source file.
I am defining my task in project/Build.scala like this (non-relevant pieces omitted):
object ProjectBuild extends Build {
lazy val generateConfiguration = TaskKey[Seq[File]]("generateConfiguration")
lazy val unscopedSettings = Seq(
generateConfiguration <<=
(libraryDependencies, sourceManaged).map { (dependencies, generatedRoot) =>
// here goes implementation
},
sourceGenerators += generateConfiguration.taskValue
)
override lazy val settings = super.settings ++ inConfig(Compile)(unscopedSettings)
}
When I try to import the project in sbt I get the following error:
[info] Loading project definition from ...web/project
References to undefined settings:
{.}/compile:sourceManaged from {.}/compile:generateConfiguration
(...web/project/Build.scala:19)
Did you mean compile:sourceManaged ?
{.}/compile:sourceGenerators from {.}/compile:sourceGenerators
(...web/project/Build.scala:33)
Did you mean compile:sourceGenerators ?
I understand that the problem is probably that I reference the setting with the wrong scope. I suppose the issue is the 'this build' ({.}) axis, which for some reason is prepended here (as far as I understand, the setting exists in the Global scope for this axis).
How should I correctly express dependency to sourceManaged setting in Compile configuration within Scala code (not .sbt)?
P.S.:
sbt 0.13.8
scala 2.11.7
I seem to have found the issue myself.
A possible reason this did not work was the way I put my custom settings into the build: I tried to override the lazy val settings of Build.
Since I described my tasks in Build.scala, whose settings come before those of build.sbt in the eventual build definition, the settings they depend on were not yet defined! They are set later by the default imports of build.sbt.
Once I moved the addition of the custom settings into build.sbt, while leaving their definition in Build.scala, everything worked as expected.
Although there is information about the override order between *.scala files and build.sbt, and some simple examples of compound build definitions, it was not that obvious.
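In other words, the split that worked looks roughly like this (sbt 0.13-era syntax, matching the original code; the implementation body is still elided):

```scala
// project/Build.scala — only defines the key and the settings sequence
object ProjectBuild extends Build {
  lazy val generateConfiguration = TaskKey[Seq[File]]("generateConfiguration")

  lazy val unscopedSettings = Seq(
    generateConfiguration <<=
      (libraryDependencies, sourceManaged).map { (dependencies, generatedRoot) =>
        Seq.empty[File] // implementation goes here
      },
    sourceGenerators += generateConfiguration.taskValue
  )
}
```

```scala
// build.sbt — applies the settings after sbt's defaults are in place
// (vals defined in Build.scala are auto-imported into .sbt files in 0.13)
inConfig(Compile)(ProjectBuild.unscopedSettings)
```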
I'm using a Scalariform AutoPlugin and would like to disable it when running tests on the CI server. Is there a sbt option to do so?
One way to achieve this is via a system property. Please note that in my example code below I use the sbt-release plugin, but it should be easily adaptable to Scalariform.
lazy val isJenkins = sys.props.get("JENKINS").isDefined
lazy val disPlugins = if(isJenkins) Seq(ReleasePlugin) else Seq.empty
lazy val root = (project in file(".")).disablePlugins(disPlugins:_*)
The first val checks whether the system property JENKINS is set. Depending on this value, we add the ReleasePlugin to the sequence of plugins that need to be disabled. And finally, in our project definition, we actually disable those.
If you start sbt with the JENKINS property set (sbt -DJENKINS=true), the ReleasePlugin is disabled.
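Since CI servers typically expose environment variables rather than JVM system properties, a variant that checks both may be more robust (a sketch; JENKINS is just an example key, and most CI systems set some well-known variable such as CI):

```scala
// Check both a system property (sbt -DJENKINS=true) and an
// environment variable (JENKINS=true, as a CI server would set).
lazy val isCI = sys.props.contains("JENKINS") || sys.env.contains("JENKINS")

lazy val disabledPlugins = if (isCI) Seq(ReleasePlugin) else Seq.empty

lazy val root = (project in file(".")).disablePlugins(disabledPlugins: _*)
```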
I am new to Scala, so I hope this question is not too naive.
Suppose I have a multi-module sbt-project and there is a dependence between projects.
lazy val core = (project in file("core")).
settings( ... )
lazy val utils = (project in file("utils")).
settings( ... ).dependsOn(core)
The question: does .dependsOn(core) mean that if I do project utils; compile it is going to compile core beforehand (and use its latest version)?
I am asking this, since in practice I don't see this behavior (and I want it).
You are looking for the aggregate method. Like this:
lazy val utils = (project in file("utils")).
settings( ... ).dependsOn(core).aggregate(core)
The aggregate method here causes all tasks run on utils to also be run on core (update, etc.). If you want to disable a task from running on an aggregated project, you can check out the documentation here.
Yes, you should see this behavior (and I do see it in practice). As the linked documentation says (note that the roles of util and core are opposite there: core depends on util):
This also creates an ordering between the projects when compiling them; util must be updated and compiled before core can be compiled
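A minimal build to see both behaviors side by side (project names from the question; the aggregate line shows the variant from the other answer):

```scala
// build.sbt — utils depends on core's compiled classes, so running
// `utils/compile` will compile core first if it is out of date.
lazy val core = (project in file("core"))
  .settings(name := "core")

lazy val utils = (project in file("utils"))
  .settings(name := "utils")
  .dependsOn(core)

// With aggregation added, every task invoked on utils (test, clean, ...)
// is also run on core, not only the ones utils actually depends on:
//   lazy val utils = (project in file("utils")).dependsOn(core).aggregate(core)
```

dependsOn gives the compile ordering; aggregate additionally fans tasks out, which is usually what a root project does over its modules.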