I need to define a custom test configuration in sbt which runs test, but with some extra settings. I've been looking around trying to figure out how to do this, but I can't seem to get it right.
What I would like to do is something like this: > test, which would run the normal test task, and > pipelinetest, which would be exactly the same as test, only with javaOptions += "-Dpipeline.run=run".
I've figured out how to set the javaOptions for test, like this:
javaOptions in test += "-Dpipeline.run=run"
So what I would like to be able to do is this:
javaOptions in pipelinetest += "-Dpipeline.run=run"
How would I define pipelinetest to achieve this goal? Does this need to be a new task, or would this be a setting within test? I'm very new to sbt and quite confused about this at the moment, and reading the documentation didn't help, so any help would be greatly appreciated.
I have only a partial answer, but I thought this might be useful info. I was just trying to do something similar for the sbt build in Spark -- I wanted to have a way to run tests with a debugger. Mark Harrah's comment pointed me in the right direction. The change I made was:
lazy val TestDebug = config("testDebug") extend(Test)
...
baseProject
  .configs(TestDebug)
  .settings(inConfig(TestDebug)(Defaults.testTasks): _*)
  .settings(Seq(
    javaOptions in TestDebug ++=
      "-agentlib:jdwp=transport=dt_socket,server=y,suspend=y,address=5005"
        .split(" ").toSeq))
This left my usual invocations of test, testOnly, etc. alone, but now I could also run testDebug:testOnly ..., which would use the extra options defined above. (It probably also created testDebug:test, etc. with those extra options, which aren't useful, but oh well.)
I didn't really understand why, but one important part of getting this to work was to use inConfig(TestDebug)(Defaults.testTasks) instead of inConfig(TestDebug)(Defaults.testSettings).
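Applied to the original pipelinetest question, a rough, untested sketch along the same lines might look like this (the project name is illustrative; note that javaOptions only take effect when tests run in a forked JVM, hence the fork setting):

lazy val PipelineTest = config("pipelinetest") extend(Test)

lazy val root = (project in file("."))
  .configs(PipelineTest)
  .settings(inConfig(PipelineTest)(Defaults.testTasks): _*)
  .settings(
    // Only forked test JVMs pick up javaOptions.
    fork in PipelineTest := true,
    javaOptions in PipelineTest += "-Dpipeline.run=run"
  )

With that, > test should behave as before, while > pipelinetest:test should run the same tests with the extra system property.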
In my case, I ran into trouble figuring out how to (a) get it to work for a multi-project build and (b) adapt it to our build, which is even weirder because it's based on a POM file, making the project definitions different from every example.
As usual, my issue with sbt is that I find info which seems related, but my build has some unusual aspects that keep me from completely cargo-culting the answer; and though it seems like I only need trivial modifications, without a thorough understanding it's hard to adapt the examples.
Related
SBT lets you define autoplugins specific to your project by putting them in ./project.
I'm trying to add resources to one such autoplugin - by which I mean something that it could access through a call to getClass.getResourceAsStream.
I have, however, not been able to work out how to do that, or even if it was possible. There's no documentation that I could find on the subject, and the obvious (simply putting resources in ./project with the plugin) fails.
Is what I'm trying to achieve possible?
Yes, you need to place your resource in ./project/src/main/resources/
For a quick demonstration that this works, assume the file name is test.txt and put the following in your build.sbt:
lazy val hello = taskKey[Unit]("prints the content of test.txt")
hello := println(IO.readStream(getClass.getResourceAsStream("test.txt")))
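If you want to read the resource from inside an actual autoplugin under ./project rather than from build.sbt, a minimal, untested sketch might look like this (the plugin and task names are illustrative):

// project/MyPlugin.scala
import sbt._

object MyPlugin extends AutoPlugin {
  override def trigger = allRequirements

  object autoImport {
    val showResource = taskKey[Unit]("prints the content of test.txt")
  }
  import autoImport._

  override def projectSettings: Seq[Setting[_]] = Seq(
    showResource := {
      // test.txt sits in project/src/main/resources/, so it ends up on the
      // plugin's classpath and is visible to getResourceAsStream.
      val stream = getClass.getResourceAsStream("/test.txt")
      println(IO.readStream(stream))
    }
  )
}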
It often comes up during testing and debugging a Scala project built with sbt that I need to pass some extra compiler flags for a particular file. For example -Xlog-implicits to debug implicit resolution problems. However, changing scalacOptions either in build.sbt or the console invalidates the cache and causes the whole project / test suite to be recompiled. In addition to being annoying to wait so long, this also means that a lot of noise from irrelevant files is printed. Instead it would be better if I could compile a specific file with some extra flags from the sbt console, but I did not find a way to do this.
Problem
The reason why changing the scalac options triggers recompilation is that Zinc, Scala's incremental compiler, cannot possibly know which compiler flags affect the semantics of incremental compilation, so it is pessimistic about it. I believe this can be improved, and some whitelisted flags could be supported, so that next time people like you don't have to ask about it.
Nevertheless, there is a solution to this problem, and it is more generic than it looks at first sight.
Solution
You can create a subproject in your sbt build which is a copy of the project you want to "log implicits" in, but with -Xlog-implicits enabled by default.
// Let's say foo is your project definition
lazy val foo = project.settings(???)
// You define the copy of your project like this
// Hyphens aren't legal in Scala identifiers, so the val is fooImplicits
// while the project ID stays "foo-implicits".
lazy val fooImplicits = foo
  .copy(id = "foo-implicits")
  .settings(
    target := baseDirectory.value / "another-target",
    scalacOptions += "-Xlog-implicits"
  )
Note the following properties of this code snippet:
We redefine the project ID because when we reuse a project definition, the ID stays the same as the original one (foo in this case), and sbt fails when two projects share the same ID.
We redefine the target directory because we want to avoid clobbering compilation products. If we kept the same target as before, compiling foo-implicits would delete the compilation products of the previous foo compilation (and vice versa), and that forced recompilation is exactly what we want to avoid.
We add -Xlog-implicits to the scalac options, as requested in this question. In a more generic solution, this is the piece you would swap out.
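With that in place, you run the usual tasks against the copy whenever you need the extra flag, for example (the test name here is hypothetical):

> foo-implicits/compile
> foo-implicits/test:testOnly com.example.MySpec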
When is this useful?
This is useful not only for your use case, but also when you want to build modules of the same project against different Scala versions. That has two benefits:
You don't need to use sbt's ++ cross-building, which is known to have some memory issues.
You can add new library dependencies that only exist for a concrete Scala version (see the sketch below).
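For instance, a hypothetical copy of foo pinned to another Scala version, with a dependency that only exists for that version, might look like this (all names and versions are illustrative):

lazy val fooOld = foo
  .copy(id = "foo-2.10")
  .settings(
    target := baseDirectory.value / "target-2.10",
    scalaVersion := "2.10.6",
    // A library that is only published for this Scala version.
    libraryDependencies += "org.example" %% "legacy-only-lib" % "1.0.0"
  )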
It has more applications, but I hope this addresses your question.
I'd like to be able to turn off a particular compiler warning, but only for a single file in my project. Is this possible?
The context is that I have a single source file that makes calls to an external macro library that produces adapted-arg warnings. I found that I can eliminate these warnings by changing my build.sbt file:
scalacOptions ++= Seq("-Xlint:-adapted-args,_" /*, ... */)
However, this turns off the warning globally, and I only want it off for the single file that raises them.
I haven't had any luck searching for any of the following possible solutions I thought might exist:
Specifying separate compilation options for different files in my project in my build.sbt file
Providing some pragma-like comment in my source file to change the warnings generated by the compiler, similar to the special scalastyle:on/off comments recognized by Scalastyle
Some annotation for smaller regions of code, like @unchecked
So, is there any way to have different linting options in effect for different files, or even for limited regions of code?
The expectation was that folks would write a custom Reporter that would filter out undesirable warnings.
It's easy to write one that filters by file name, perhaps given a whitelist.
The error/warn API supplies the textual Position. It would also be easy to parse the position.lineContent for a magic comment token like IGNORE or SUPPRESS, which is not as convenient as a SuppressWarnings annotation but is easy to implement.
The compiler asks the reporter if there were errors.
The custom reporter is specified with -Xreporter myclass, or it wouldn't surprise me if someone has written an sbt plugin.
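As a rough sketch of the file-name filtering idea, something like the following could work (written against the Scala 2.13 compiler API; the class and file names are illustrative, and the Reporter API differs between compiler versions):

import scala.reflect.internal.util.Position
import scala.tools.nsc.Settings
import scala.tools.nsc.reporters.ConsoleReporter

class FileFilteringReporter(settings: Settings) extends ConsoleReporter(settings) {
  // Warnings coming from these files are silently dropped; everything else passes through.
  private val ignoredFiles = Set("MacroHeavy.scala")

  override def doReport(pos: Position, msg: String, severity: Severity): Unit = {
    val ignorable =
      severity == WARNING &&
        pos.isDefined &&
        ignoredFiles.contains(pos.source.file.name)
    if (!ignorable) super.doReport(pos, msg, severity)
  }
}

The compiled class then has to be on the compiler's classpath so it can be selected with -Xreporter as described above.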
I'm trying to grok how tasks depend on one another in sbt. Using 0.13.7. 'inspect' and 'inspect tree' have been lifesavers; however, I still find examples that I can't explain.
For example, I know that 'publishLocal' ends up calling 'copyResources' somehow, but if you run 'inspect tree publishLocal', you don't see the copyResources task in the tree. I can see the 'Copying Resources' output when running with debug logging on and I know that log statement comes from inside the copyResourcesTask function. Is there some other way this is getting invoked? Some other way to see these dependencies?
Some dependencies are dynamic, in the sense that they are computed while a task is running. These dependencies cannot be shown by inspect tree, because identifying them would require executing the task; and then again, the dependency could change from one run to the next.
See the documentation about taskDyn.
I don't know of any way to show the actual dependencies, though.
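To see why such dependencies cannot be shown statically, here is a minimal hypothetical build.sbt fragment using Def.taskDyn; which task dynTask depends on is only decided while it runs, so inspect tree cannot list it:

val flag = settingKey[Boolean]("decides which task dynTask runs")
val taskA = taskKey[Unit]("task A")
val taskB = taskKey[Unit]("task B")
val dynTask = taskKey[Unit]("depends on taskA or taskB, chosen at runtime")

flag := true
taskA := println("running A")
taskB := println("running B")

dynTask := Def.taskDyn {
  // The dependency is selected here, during execution,
  // which is why inspect tree cannot display it.
  if (flag.value) Def.task(taskA.value) else Def.task(taskB.value)
}.value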
I'm having major problems getting Undercover to work using Maven.
I'm using ScalaTest for unit tests, and this is working perfectly.
When I run Undercover, though, it simply creates empty files.
I think it's probably a problem with the configuration in my pom.xml (but the documentation for Undercover is a little sketchy).
Help :)
Thanks
T
At one point I inquired about Emma on a Scala mailing list, and I was told by some that they had more success with Cobertura. You might want to try that instead.
Up to what stage is this "working correctly", given that empty files are being produced?
Do you have a sample project/POM that demonstrates the problem?