I'm trying to use SBT to build a project that depends on bytecode enhancement. Basically, I need to run some code after compile using the classpath in the current scope (so the command can find the classes to modify), and then make sure that compile doesn't run again afterwards and undo the enhancement.
I'm using SBT 0.13.12, if that matters.
I believe you would want to make a new sbt task and have it depend on compile. Then use that rather than compile.
lazy val bytecodeEnhancedCompile = taskKey[Unit]("bytecode enhance")

bytecodeEnhancedCompile := {
  // ... run the enhancer here ...
}

// wire the dependency after the body is defined; a later := would otherwise
// replace the task and drop the dependency on compile
bytecodeEnhancedCompile <<= bytecodeEnhancedCompile dependsOn (compile in Compile)
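If the enhancer also needs to know where the compiled classes and the classpath are, the task body can read them from the Compile scope. A rough sketch (enhance here is a stand-in for whatever your enhancer's entry point is; note that referencing compile inside the body also makes the explicit dependsOn wiring above redundant):
bytecodeEnhancedCompile := {
  val analysis = (compile in Compile).value           // ensures the classes exist and adds the compile dependency
  val classes  = (classDirectory in Compile).value    // directory containing the .class files to rewrite
  val cp       = (fullClasspath in Compile).value.map(_.data)
  enhance(classes, cp)                                // hypothetical call into your enhancer
}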
Related
When running tasks (e.g., test, jmh:run), I often want to specify javaOptions which are tedious to type by hand (e.g., to dump program data).
This is my current approach:
// build.sbt
lazy val myProject = project
...
.settings(
...
Test / javaOptions ++= (if (sys.props.get("dump").nonEmpty) Seq("-X...", ...) else Nil)
)
I can set system properties on sbt launch (e.g., sbt -Ddump) and then check them with sys.props, but changing these properties requires me to reload sbt. I would like to parse some arguments when the task is invoked, such that I can write test -dump and modify the Test / javaOptions setting accordingly.
Is this possible? Someone recommended I override the default task but I'm having trouble figuring out what that would look like. I have a suspicion I need an InputTask for this, but also don't know what that'd look like.
You don't need to reload sbt: if you are running the sbt shell, you can set or clear the property programmatically through sys.props instead.
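If you would rather pass the flag per invocation, one way is an input task that parses it, flips the property, and then runs test. A sketch, assuming a made-up task name testWith and flag -dump, and that Test / fork is enabled so Test / javaOptions actually applies:
import complete.DefaultParsers._

lazy val testWith = inputKey[Unit]("run Test / test, optionally with the dump options")

testWith := Def.inputTaskDyn {
  val args = spaceDelimited("<flag>").parsed
  // Test / javaOptions is a task, so it is re-evaluated on every run;
  // flipping the property here is enough for the ++= check above to see it
  if (args.contains("-dump")) sys.props("dump") = "true" else sys.props -= "dump"
  Def.task { (Test / test).value }
}.evaluated
Then testWith -dump runs the tests with the extra options, and plain testWith runs them without.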
I need a piece of code using SBT, possibly internals, which acquires the full classpath of an SBT project without invoking a compilation. Normally I would use "Runtime / fullClasspath", but that needs the project to be compiled first. Is there any way to get the fullClasspath without triggering a compile? I thought the build.sbt alone determined the classpath, and compilation (in theory) isn't necessary.
I asked on the SBT gitter channel, and there it was also suggested to use dependencyClasspath. dependencyClasspath doesn't require compilation of the root project, but it does require compilation of all of its dependencies, so that doesn't solve it for me yet. I'm looking for the complete classpath required to run the root project, without compiling its constituents.
I'm interested in any way to work around this, so if there are any farfetched workarounds, those are welcome too.
An example of what I have now:
lazy val printMainClasspath = taskKey[Unit]("print the root project's full classpath")

Global / printMainClasspath := {
  val paths = (rootProject / Runtime / fullClasspath).value
  val joinedPaths = paths
    .map(_.data)
    .mkString(java.io.File.pathSeparator)
  println(joinedPaths)
}
This works partially: it creates a task "printMainClasspath" which prints the full classpath. But if I call it in the sbt shell, it compiles the code first. I'd like the classpath to be printed without triggering a full compile of the project; ideally, only the build.sbt files of all constituent projects are compiled. Is there a way?
One way is to assemble the classpath yourself from settings plus the external dependency classpath, which resolves library dependencies (runs update) but never compiles project sources:
val dryClasspath = taskKey[Seq[File]]("dryClasspath")

dryClasspath := {
  val data = settingsData.value
  val thisProj = thisProjectRef.value
  // this project plus everything it depends on, transitively
  val allProjects = thisProj +: buildDependencies.value.classpathTransitiveRefs(thisProj)
  // classDirectory is a plain setting, so reading it does not trigger compile
  val classDirs = allProjects.flatMap(p => (p / Runtime / classDirectory).get(data))
  // resolves the library dependencies, but compiles nothing
  val externalJars = (Runtime / externalDependencyClasspath).value.map(_.data)
  classDirs ++ externalJars
}
Then show dryClasspath prints it without compiling anything.
I'm developing a Scala compiler plugin, and right now I have to go to the plugin project, run sbt publishLocal, come back to my project, and run sbt clean compile.
This is because I'm using addCompilerPlugin(...) in my build.sbt
I wonder if there's a way to refer to the compiler plugin's local path, so that I can simply run sbt compile.
Thank you.
Here's how we can achieve it:
scalacOptions in Compile ++= {
val jar = (Keys.`package` in (plugin, Compile)).value
System.setProperty("sbt.paths.plugin.jar", jar.getAbsolutePath)
val addPlugin = "-Xplugin:" + jar.getAbsolutePath
// Thanks Jason for this cool idea (taken from https://github.com/retronym/boxer)
// add plugin timestamp to compiler options to trigger recompile of
// main after editing the plugin. (Otherwise a 'clean' is needed in the current project)
val dummy = "-Jdummy=" + jar.lastModified
Seq(addPlugin, dummy)
}
Here's an example: https://github.com/GIVESocialMovement/scala-named-argument-compiler-plugin/blob/master/test-project/build.sbt#L26
The snippet above runs package on the plugin project, gets its jar, and adds the plugin to the current project's scalacOptions.
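For context, plugin in the snippet is a reference to the compiler plugin's project; how you declare it depends on your layout. A sketch (the path and project id below are made up):
// plugin built in a separate, sibling build
lazy val plugin = ProjectRef(file("../my-compiler-plugin"), "plugin")

// or, if both are sub-projects of the same build
// lazy val plugin = project.in(file("plugin"))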
Thanks to this redditor for answering my question: https://www.reddit.com/r/scala/comments/aq2bt6/just_made_a_compiler_plugin_to_enforce_named/
In my project I have a special SBT plugin that generates some configuration-specific settings (like sbt-buildinfo). A special task generates a Scala class and stores it in the 'src_managed' folder.
The problem is that after this file is successfully generated, the subsequent 'compile' can't find the class and I get a compilation error.
I have several configurations defined with:
compile in conf <<= (compile in conf).dependsOn(mytask)
I call this plugin like so:
;clean;proj/myconf:compile
You should set up a dedicated setting for the code generator:
sourceGenerators in Compile <+= (myCodeGeneratorTask in Compile)
SBT will then run the project-defined generator as part of compile and include its output in the compilation, so the explicit dependsOn wiring is not needed.
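For reference, in current sbt syntax the same wiring would look roughly like this (a sketch; it assumes myCodeGeneratorTask returns the Seq[File] it writes under Compile / sourceManaged, which is what sourceGenerators expects):
// register the generator so its output is compiled automatically
Compile / sourceGenerators += (Compile / myCodeGeneratorTask).taskValue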
I have some test cases that need to look at the classpath to extract the paths of some files/directories in there. This works fine in the IDE.
The problem is that, when running SBT test, Properties.javaClassPath gives me /usr/share/sbt-launcher-packaging/bin/sbt-launch.jar.
The classpath is fine when I run show test:dependency-classpath. Is there a way to obtain that information from inside the running Scala/Java program? Or is there a way to toss it into a system property or environment variable?
By default the tests are run inside the SBT process, so the classpath will look like it did when you started sbt (sbt does some trickery to dynamically load the classes for the tests). One way to do what you want is to run your tests in a forked JVM; that way sbt starts a new JVM to run the test suite, and that JVM has the expected classpath:
fork in Test := true
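If you also want the test code itself to see that classpath, another option is to hand it to the forked JVM explicitly. A sketch, where the property name test.classpath is just an example:
javaOptions in Test += {
  // readable from the suite via sys.props("test.classpath")
  val cp = (fullClasspath in Test).value.map(_.data).mkString(java.io.File.pathSeparator)
  "-Dtest.classpath=" + cp
}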
I have been working on understanding how EmbeddedCassandra works in the spark-cassandra-connector project, which uses the classpath to start up and control a Cassandra instance. Here is a line from their configuration that gets the correct classpath.
(compile in IntegrationTest) <<= (compile in Test, compile in IntegrationTest) map { (_, c) => c }
The entire source can be found here: https://github.com/datastax/spark-cassandra-connector/blob/master/project/Settings.scala
Information on the <<= operator can be found here: http://www.scala-sbt.org/0.12.2/docs/Getting-Started/More-About-Settings.html#computing-a-value-based-on-other-keys-values. I'm aware that this is not the current version of sbt, but the definition still holds.
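In current sbt versions, where <<= no longer exists, the equivalent wiring can be expressed with := and dependsOn. A sketch:
// run Test / compile before IntegrationTest / compile and keep the latter's result
IntegrationTest / compile := ((IntegrationTest / compile) dependsOn (Test / compile)).value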