Get full classpath without compilation in SBT - scala

I need a piece of code using SBT, possibly its internals, that acquires the full classpath of an SBT project without invoking a compilation. Normally I would use "Runtime / fullClasspath", but that requires the project to be compiled first. Is there any way to get the fullClasspath without triggering a compile? I thought build.sbt alone determined the classpath, so compilation (in theory) shouldn't be necessary.
I asked on the SBT Gitter channel, where dependencyClasspath was also suggested. dependencyClasspath doesn't require compilation of the root project itself, but it does require compilation of all the projects it depends on. So that doesn't solve it yet for me. I'm looking for the complete classpath required to run the root project, without compiling any of its constituents.
I'm interested in any way to work around this, so if there are any far-fetched workarounds, those are welcome too.
An example of what I have now:
lazy val printMainClasspath = taskKey[Unit]("Prints the main classpath")

Global / printMainClasspath := {
  import java.io.File.pathSeparator
  // rootProject is this build's root project; fullClasspath depends on compile
  val paths = (rootProject / Runtime / fullClasspath).value
  val joinedPaths = paths
    .map(_.data)
    .mkString(pathSeparator)
  println(joinedPaths)
}
This partially works: it creates a task "printMainClasspath" which prints the full classpath. But if I call it in the sbt shell, it compiles the code first. I'd like the classpath to be printed without invoking a full compile of the project. Ideally, only the build.sbt files of the constituent projects would be evaluated. Is there a way?

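One workaround, sketched below, sidesteps fullClasspath entirely: read each constituent project's classDirectory out of the build's settings data, and append the external dependency jars. classDirectory is a setting, so reading it never triggers compilation: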
val dryClasspath = taskKey[Seq[File]]("dryClasspath")

dryClasspath := {
  val data = settingsData.value
  val thisProj = thisProjectRef.value
  // The current project plus all projects it depends on, transitively
  val allProjects = thisProj +: buildDependencies.value.classpathTransitiveRefs(thisProj)
  // Look up each project's class directory as a plain setting (no compile involved)
  val classDirs = allProjects.flatMap(p => (p / Runtime / classDirectory).get(data))
  // External jars come from dependency resolution (update), not from compile
  val externalJars = (Runtime / externalDependencyClasspath).value.map(_.data)
  classDirs ++ externalJars
}

Related

How to include a Scala compiler plugin from a local path?

I'm developing a Scala compiler plugin, and right now I have to go to the plugin project, run sbt publishLocal, come back to my project, and run sbt clean compile.
This is because I'm using addCompilerPlugin(...) in my build.sbt.
I wonder if there's a way to refer to the compiler plugin by its local path, so that I can simply run sbt compile.
Thank you.
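For context, the setup being described presumably looks something like this in the consuming project's build.sbt (the coordinates here are made up for illustration):

// Requires the plugin to be re-published (publishLocal) after every plugin change
addCompilerPlugin("com.example" %% "my-compiler-plugin" % "0.1.0-SNAPSHOT")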
Here's how we can achieve it:
scalacOptions in Compile ++= {
  // `plugin` is the sub-project that builds the compiler plugin
  val jar = (Keys.`package` in (plugin, Compile)).value
  System.setProperty("sbt.paths.plugin.jar", jar.getAbsolutePath)
  val addPlugin = "-Xplugin:" + jar.getAbsolutePath
  // Thanks Jason for this cool idea (taken from https://github.com/retronym/boxer)
  // add plugin timestamp to compiler options to trigger recompile of
  // main after editing the plugin. (Otherwise a 'clean' is needed in the current project)
  val dummy = "-Jdummy=" + jar.lastModified
  Seq(addPlugin, dummy)
}
Here's an example: https://github.com/GIVESocialMovement/scala-named-argument-compiler-plugin/blob/master/test-project/build.sbt#L26
The above runs package on the plugin project, gets its jar, and adds the plugin to the current project through scalacOptions. The -Jdummy=<timestamp> option changes whenever the plugin jar is rebuilt, so the incremental compiler sees changed compiler options and recompiles the main project without needing a clean.
Thanks to this redditor for answering my question: https://www.reddit.com/r/scala/comments/aq2bt6/just_made_a_compiler_plugin_to_enforce_named/

In SBT, how do you override compile to run arbitrary code afterwards?

I'm trying to use SBT to build a project that depends on bytecode enhancement. Basically, I need to run some code after compile using the classpath in the current scope (so the command can find the classes to modify), and then make sure that compile doesn't run again afterwards to undo the enhancement.
I'm using SBT 0.13.12, if that matters.
I believe you would want to make a new sbt task and have it depend on compile. Then use that rather than compile.
lazy val bytecodeEnhancedCompile = taskKey[Unit]("bytecode enhance")

bytecodeEnhancedCompile := {
  // ... run the bytecode enhancement here ...
}

// Note: this wiring must come after the := definition, or it would be overwritten
bytecodeEnhancedCompile <<= bytecodeEnhancedCompile dependsOn (compile in Compile)
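On sbt 0.13.13+, where <<= is deprecated (and sbt 1.x, where it is removed), the same wiring can be expressed with dependsOn directly; a minimal sketch:

bytecodeEnhancedCompile := {
  // ... run the bytecode enhancement here ...
}

// Equivalent of the <<= line above: run Compile / compile first, then this task
bytecodeEnhancedCompile := bytecodeEnhancedCompile.dependsOn(Compile / compile).value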

How to reference sourceManaged setting from Build.scala

I am trying to extend my build with a task that will generate a source file.
I am defining my task in project/Build.scala like this (non-relevant pieces omitted):
object ProjectBuild extends Build {
  lazy val generateConfiguration = TaskKey[Seq[File]]("generateConfiguration")

  lazy val unscopedSettings = Seq(
    generateConfiguration <<=
      (libraryDependencies, sourceManaged).map { (dependencies, generatedRoot) =>
        // here goes implementation
      },
    sourceGenerators += generateConfiguration.taskValue
  )

  override lazy val settings = super.settings ++ inConfig(Compile)(unscopedSettings)
}
When I try to import the project in sbt I get the following error:
[info] Loading project definition from ...web/project
References to undefined settings:
{.}/compile:sourceManaged from {.}/compile:generateConfiguration
(...web/project/Build.scala:19)
Did you mean compile:sourceManaged ?
{.}/compile:sourceGenerators from {.}/compile:sourceGenerators
(...web/project/Build.scala:33)
Did you mean compile:sourceGenerators ?
I understand that the problem is that I'm referencing the setting in the wrong scope. I suppose the issue is the 'this build' ({.}) scope axis, which for some reason is prepended here (as far as I understand, the setting exists in the Global scope for this axis).
How should I correctly express a dependency on the sourceManaged setting in the Compile configuration from Scala code (not .sbt)?
P.S.:
sbt 0.13.8
scala 2.11.7
I seem to have found the issue myself.
The likely reason this did not work was the way I added my custom settings to the build: I tried to override lazy val settings of Build.
Since I declared my settings in Build.scala, which is evaluated before build.sbt in the eventual build definition, the settings they depend on were not yet defined! They are added later by the default imports of build.sbt.
Once I moved the application of the custom settings into build.sbt, while leaving their definition in Build.scala, everything worked as expected.
Although there is documentation about the override order between *.scala files and build.sbt, and some simple examples of compound build definitions, this was not at all obvious.
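A minimal sketch of that arrangement, keeping the definitions in Build.scala but applying them from build.sbt, where the defaults are already in place:

// project/Build.scala: define the key and settings, but don't add them here
import sbt._
import Keys._

object ProjectBuild extends Build {
  lazy val generateConfiguration = TaskKey[Seq[File]]("generateConfiguration")

  lazy val unscopedSettings = Seq(
    generateConfiguration <<= (libraryDependencies, sourceManaged).map {
      (dependencies, generatedRoot) => Seq.empty[File] // implementation goes here
    },
    sourceGenerators += generateConfiguration.taskValue
  )
}

// build.sbt: apply the settings once the defaults are loaded
inConfig(Compile)(ProjectBuild.unscopedSettings)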

How to prevent SBT from recompiling modified .class files?

In our project, we have an enhance post-processing step for the .class files generated by compile. This enhance step modifies the generated .class files and then overwrites them.
enhance <<= enhance triggeredBy (compile in Compile)
The problem is that sbt has a mechanism called incremental recompilation. It monitors the generated .class files. Every time the enhancer overwrites a generated .class file, sbt notices the modification and recompiles the related sources on the next compile command.
To us, recompilation is very time-consuming. We want to stop sbt from recompiling modified .class files. That may mean making sbt monitor only source changes, not output changes.
I did some searching on this, but found very little. So far I know that a trait called Analysis is likely responsible for the mapping from sources to output .class files. So I'm asking for your help.
P.S.: we could solve this problem by sending the output of enhance to another folder, but that is not our preference.
sbt strongly discourages mutating files in place. You should generate different files instead. By doing so, you will solve your problem, since sbt's incremental compiler will still be looking at the unaltered .class files. You will have some rewiring to do:
Send the outputs of the compile task somewhere else:
classDirectory in Compile := crossTarget.value / "classes-orig"
Process these .class files with your tool, and send them to crossTarget.value / "classes" (the original classDirectory):
enhance := {
  val fromDir = (classDirectory in Compile).value // the "classes-orig" directory
  val toDir = crossTarget.value / "classes"
  // ... enhance the .class files from fromDir into toDir ...
}

// This wiring must come after the := definition above
enhance <<= enhance triggeredBy (compile in Compile)
Rewire productDirectories to use crossTarget.value / "classes" anyway (otherwise it will go looking in your modified classDirectory):
productDirectories in Compile := Seq(crossTarget.value / "classes")
Make sure that products depends on your enhance task:
products in Compile <<= (products in Compile) dependsOn enhance
You may need some more rewiring if you have resources (see copyResources). But basically you should be able to get there.
As I said, sbt monitors the output .class files; when a .class file is modified, it recompiles that file's sources.
After some research, we found that sbt detects a file's modification by its last-modified time. That is to say, we can fool sbt by rolling back the last-modified time after the modification, so that sbt won't notice any changes.
So, our solution is simple but effective:
find all .class files
note down their last-modified times
do the enhancement
put back the former last-modified times
This is a small trick. We still expect more robust solutions.
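A minimal sketch of that trick, assuming a hypothetical enhance(file) hook for the actual bytecode rewriting (the ** glob comes from sbt's PathFinder API):

import sbt._
import java.io.File

// Hypothetical hook: rewrites one .class file in place
def enhance(classFile: File): Unit = ???

def enhanceKeepingTimestamps(classesDir: File): Unit = {
  // 1. Find all .class files under the compiler's output directory
  val classFiles: Seq[File] = (classesDir ** "*.class").get
  // 2. Note down their last-modified times
  val timestamps: Map[File, Long] = classFiles.map(f => f -> f.lastModified).toMap
  // 3. Do the enhancement (this bumps the files' timestamps)
  classFiles.foreach(enhance)
  // 4. Put back the former times so sbt sees no modification
  timestamps.foreach { case (f, t) => f.setLastModified(t) }
}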
Description:
A little like Chenyu, I had to write a plugin for SBT 1.x that would enhance compiled classes, and later I wanted to make sure those enhanced classes were used for building the jar.
I did not want a hacky solution, so Chenyu's answer was not acceptable to me, and sjrd's answer was very helpful but tailored to SBT 0.13.
So here is my working solution, with short comments:
Code:
object autoImport {
  val enhancedDest = settingKey[File]("Output dir for enhanced sources")
}

def enhanceTask: Def.Initialize[Task[Unit]] = Def.task {
  val inputDir = (classDirectory in Compile).value
  val outputDir = enhancedDest.value
  enhance(inputDir, outputDir)
  // ...
}

override def projectSettings: Seq[Def.Setting[_]] = Seq(
  enhancedDest := crossTarget.value / "classes-enhanced",
  products in Compile := Seq(enhancedDest.value), // mark enhanced folder to use for packaging

  // https://www.scala-sbt.org/1.0/docs/Howto-Dynamic-Task.html#build.sbt+v2
  compile in Compile := Def.taskDyn {
    val c = (compile in Compile).value // compile 1st
    Def.task {
      (copyResources in Compile).value // copy resources before enhance
      enhanceTask.value // enhance
      c
    }
  }.value
)
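With this wiring, products in Compile points at the enhanced output folder, so packaging (and anything else that consumes products) picks up the enhanced classes, while the incremental compiler keeps watching its own, untouched classDirectory.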

What is the issue with this sbt file?

When I import an SBT project into IntelliJ, the build.sbt file shows a lot of errors, as in the following screenshot. Wondering what the issue might be.
IDEA Version 13.1.4
I also see the following:
The following source roots are outside of the corresponding base directories:
C:\Users\p0c\Downloads\recfun\src\main\resources
C:\Users\p0c\Downloads\recfun\src\test\java
C:\Users\p0c\Downloads\recfun\src\test\scala
C:\Users\p0c\Downloads\recfun\src\test\resources
These source roots cannot be included in the IDEA project model. Please consider using shared SBT projects instead of shared source roots.
The question perhaps does not provide all the information required to answer conclusively, but I'll give it a spin anyway.
Since sbt runs correctly when invoked from the shell, we know the sbt file is fine. I use IDEA for my Scala and sbt projects and the IDEA sbt support works very well, but IDEA is far more restrictive than sbt when it comes to project structure. It is really easy to create a valid sbt project structure that IDEA can't support very well.
Given that the source-roots error indicates that the recfun/src folder is not in the project folder, it seems clear that this error is not emitted during the processing of recfun/build.sbt. The screenshot shows at least three different sbt files: progfun-recfun, submission and scala-recfun. Since IDEA will also create projects like submission-build, and you have an assignment-build project there, something is probably broken in the project structure, not from the sbt viewpoint (there you're fine, you can build) but from the IDEA viewpoint, which is more restrictive.
My suggestion to resolve this would be to change your project structure as follows. First, create a top-level project with a build.sbt, and create a sub-project for each src folder you want. Do not put a src folder in the top-level project. Each sub-project needs a build.sbt as well.
Second, make sure the sub-projects build correctly with sbt when run from the shell. Arrange the sub-project build.sbt files with the proper dependencies, using this syntax:
lazy val a = ProjectRef(file("../a"), "a")
lazy val b = ProjectRef(file("../b"), "b")
lazy val root = Project(id = "c", base = file(".")) dependsOn (a, b)
(This example has three sister projects a, b and c, where c depends on a and b. The three projects are placed in directories with the same name, within the root directory. The code snippet is from the build file for c.)
Third, arrange your top level build.sbt to aggregate the sub-projects, using this syntax in the top level build.sbt:
lazy val a = ProjectRef(file("a"), "a")
lazy val b = ProjectRef(file("b"), "b")
lazy val c = ProjectRef(file("c"), "c")
lazy val root = (project in file(".")).
aggregate(a, b, c)
Building this top level project will build each of the sub-projects a, b and c, and the dependencies established in the sub-project build files will ensure they are built in the right order.
Fourth, import the top-level project into IDEA, and all should be well.
You can get fancy with the file structure if you want, because the project references use relative paths, but it's usually nice to start simple.
I had a lot of frustration with sbt and IDEA at the start; I hope this helps :)