Is it possible to add a Scala compiler plugin only when compiling test sources?
When a compiler plugin is added by calling SBT's addCompilerPlugin, a library dependency is added. The relevant methods are:
/** Transforms `dependency` to be in the auto-compiler plugin configuration. */
def compilerPlugin(dependency: ModuleID): ModuleID =
  dependency.withConfigurations(Some("plugin->default(compile)"))

/** Adds `dependency` to `libraryDependencies` in the auto-compiler plugin configuration. */
def addCompilerPlugin(dependency: ModuleID): Setting[Seq[ModuleID]] =
  libraryDependencies += compilerPlugin(dependency)
The question boils down to whether there is a withConfigurations value that causes the plugin to be on the classpath only when compiling test sources. I tried Some("plugin->default(testCompile)") without success.
To answer my own question: It is possible by setting autoCompilerPlugins to false and adding the necessary -Xplugin=... Scalac option manually in the test configuration. This can be done by using the Classpaths.autoPlugins utility method.
In my case I wanted to activate the SemanticDB compiler plugin during tests only. This can be accomplished by the following SBT settings:
autoCompilerPlugins := false,
ivyConfigurations += Configurations.CompilerPlugin,
scalacOptions in Test ++= Classpaths.autoPlugins(update.value, Seq(), ScalaInstance.isDotty(scalaVersion.value)),
scalacOptions in Test += "-Yrangepos",
scalacOptions in Test += "-P:semanticdb:text:on",
Related
I am in the process of writing a library that does monitoring/OpenTracing, and I am attempting to use sbt-aspectj so that users of the library don't need to manually instrument their code. However, I am currently running into an issue with creating an sbt project representing such a library.
The idea is that I want an external library, as shown in this sample https://github.com/sbt/sbt-aspectj/tree/master/src/sbt-test/weave/external, but that external library depends on an external dependency (i.e. akka-actor). Basically I am trying to combine both https://github.com/sbt/sbt-aspectj/tree/master/src/sbt-test/weave/external and https://github.com/sbt/sbt-aspectj/tree/master/src/sbt-test/weave/jar. I have created a sample project at https://github.com/mdedetrich/sbt-aspectj-issue to demonstrate the problem; the relevant part is below:
lazy val root = (project in file("."))
  .enablePlugins(SbtAspectj)
  .settings(
    name := RootName,
    version := Version,
    // add akka-actor as an aspectj input (find it in the update report)
    // aspectjInputs in Aspectj ++= update.value.matching(
    //   moduleFilter(organization = "com.typesafe.akka", name = "akka-actor*")),
    // replace the original akka-actor jar with the instrumented classes in runtime
    // fullClasspath in Runtime := aspectjUseInstrumentedClasses(Runtime).value,
    // only compile the aspects (no weaving)
    aspectjCompileOnly in Aspectj := true,
    // ignore warnings (we don't have the target classes at this point)
    aspectjLintProperties in Aspectj += "invalidAbsoluteTypeName = ignore",
    // replace regular products with compiled aspects
    products in Compile ++= (products in Aspectj).value,
    libraryDependencies ++= Seq(
      "com.typesafe.akka" %% "akka-actor" % akkaVersion
    )
  )
lazy val test = (project in file("test"))
  .enablePlugins(SbtAspectj)
  .settings(
    aspectjBinaries in Aspectj ++= update.value.matching(
      moduleFilter(organization = Organization, name = s"$RootName*")),
    aspectjInputs in Aspectj ++= update.value.matching(
      moduleFilter(organization = "com.typesafe.akka", name = "akka-actor*")),
    fullClasspath in Runtime := aspectjUseInstrumentedClasses(Runtime).value,
    // weave this project's classes
    aspectjInputs in Aspectj += (aspectjCompiledClasses in Aspectj).value,
    products in Compile := (products in Aspectj).value,
    products in Runtime := (products in Compile).value,
    libraryDependencies ++= Seq(
      Organization %% RootName % Version
    )
  )
The idea is that we publish the root project using root/publishLocal, and the test project is designed simply to include root as a library dependency so we can check whether the AspectJ weaving is working properly.
The problem is simply that I am unable to get it working. The current code at https://github.com/mdedetrich/sbt-aspectj-issue publishes with root/publishLocal (not sure if that part is correct though), but when I then run test/run I get this:
[info] Weaving 2 inputs with 1 AspectJ binary to /home/mdedetrich/github/sbt-aspectj-issue/test/target/scala-2.13/aspectj/classes...
[error] stack trace is suppressed; run last test / Compile / packageBin for the full output
[error] (test / Compile / packageBin) java.util.zip.ZipException: duplicate entry: META-INF/MANIFEST.MF
[error] Total time: 1 s, completed Dec 29, 2019 4:31:27 PM
sbt:sbt-aspectj-issue>
This seems to be an issue with duplicate akka-actor entries. I tried toggling various settings in build.sbt but didn't manage to get it working.
EDIT: This was also posted as a github issue here https://github.com/sbt/sbt-aspectj/issues/44
Generally, you can exclude the META-INF directories from the woven external libraries:
mappings in (Compile, packageBin) := {
  (mappings in (Compile, packageBin)).value
    .filterNot(_._2.startsWith("META-INF/"))
}
But for the akka libraries there is another problem. Each akka library contains a reference.conf, which holds the fallback configuration for the features it provides. These files lead to the same kind of conflicts as META-INF did, but they cannot simply be excluded like META-INF, because they are essential for akka to work properly.
If you exclude them, you will have to provide all the required akka configuration in your application.conf, or ship a merged (not simply concatenated) reference.conf in your project. That is not trivial, and it is subject to change between akka versions.
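For illustration, the mappings filter shown above could be extended to drop reference.conf as well, with the caveats just described (a sketch, not a recommendation):
// sketch: additionally exclude akka's reference.conf from the packaged jar;
// the fallback configuration must then be provided by other means
mappings in (Compile, packageBin) := {
  (mappings in (Compile, packageBin)).value.filterNot { case (_, path) =>
    path.startsWith("META-INF/") || path == "reference.conf"
  }
}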
Another solution would be to weave and repackage the akka libraries individually, so that the reference.conf can be kept in each repackaged library. The project layout and build script will be a bit more complicated, but this is also easier to maintain if you plan to upgrade to newer versions of akka in the future.
I have a CLI tool written in Java which can modify sources according to the given parameters. For example, it can rename an enum value across a whole project.
I want to write an sbt task that runs this tool from my project directory with the given parameters, like sbt 'enums -rename A B'. My tool can be injected into the project through the sbt dependencies.
I skimmed through the book sbt in Action looking for an answer, but those examples are not this specific.
My build.sbt (far from working):
name := """toolTestWithActivator"""
version := "1.0-SNAPSHOT"
resolvers += "Local Repository" at "file://C:/Users/torcsi/.ivy2/local"
lazy val root = (project in file(".")).enablePlugins(PlayJava)
scalaVersion := "2.11.6"
libraryDependencies ++= Seq(
  "tool" % "tool_2.11" % "1.0",
  javaJdbc,
  javaEbean,
  cache,
  javaWs
)

val mytool = taskKey[String]("mytool")

mytool := {
  com.my.tool.Main
}
Can sbt handle this type of task/dependency structure, or do I need to do this another way?
SBT is recursive: it compiles the .sbt files and the .scala files under the project folder and uses those to execute your build (in fact, you can see sbt as a library that helps you produce builds).
So, since you need your library to define a task, it is a dependency of your build.sbt file (and not a dependency of your project).
To declare that the build.sbt file depends on your library, just create a ".sbt" file in the project folder; example:
project/dependencies.sbt
libraryDependencies += "tool" %% "tool" % "1.0"
and in build.sbt add:
val mytool = taskKey[Unit]("mytool")

mytool := {
  com.my.tool.Main.main(Array())
}
Some comments:
Be careful with the Scala version used: sbt 0.13 is compiled with Scala 2.10, so your library should also be compiled for Scala 2.10 (the artifact should be tool_2.10). The newer sbt 1.x is compiled with Scala 2.12.
I used the %% notation so that sbt appends the expected Scala version by itself.
I assumed your CLI tool defines a classic Java main method (or the Scala equivalent). So the argument should be an Array of String (here an empty one), and it returns Unit (void in Java).
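Since the question asks for invocations like sbt 'enums -rename A B', an input task may be a better fit than a plain task, because it can parse arguments from the command line. A minimal sketch, assuming the tool's entry point is the com.my.tool.Main class from the question:
import sbt.complete.DefaultParsers._

// sketch: an input task that forwards whitespace-separated arguments to the tool
val enums = inputKey[Unit]("Runs the enum tool with the given arguments")

enums := {
  val args: Seq[String] = spaceDelimited("<args>").parsed
  com.my.tool.Main.main(args.toArray)
}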
Some reference to understand the solution:
http://www.scala-sbt.org/0.13/docs/Organizing-Build.html
I am using spray Revolver to test my application while writing code.
I would like to make Revolver's run compile additional sources (e.g. /src/dev.scala or whatever) with additional dependencies. This is so that when testing locally I can start up some external services we are using (e.g. cassandra) in the same VM, without needing a proper environment set up.
Initially I tried settings like this:
unmanagedSources in Revolver.reStart <<= (unmanagedSources in Compile) map { ss => ss :+ new File("/path/to/my/dev.scala") }
libraryDependencies in Revolver.reStart += ("org.cassandraunit" % "cassandra-unit" % "2.0.2.1")
mainClass in Revolver.reStart := Some("my.main.class.in.dev")
But when running the task, I just get an error saying that the main class doesn't exist.
Is there any way to make this work? The idea is to keep cassandra-unit and the code in dev.scala out of the compilation used for tests and packaging.
That cannot work, because Revolver.reStart still uses compile in Compile, and compile in Compile uses libraryDependencies, not libraryDependencies in Revolver.reStart.
To do this, you need to define an entirely different, custom configuration that extends your Compile configuration. In that configuration, say "Localcompile", you can define your dependency with:
unmanagedSources in Localcompile += new File("/path/to/my/dev.scala")
libraryDependencies += "org.cassandraunit" % "cassandra-unit" % "2.0.2.1" % "localcompile"
See http://www.scala-sbt.org/0.13/docs/Advanced-Configurations-Example.html for examples.
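For reference, a minimal sketch of how such a configuration could be wired up in sbt 0.13 (the Localcompile name is illustrative, as above):
// sketch: a custom configuration that extends Compile
lazy val Localcompile = config("localcompile") extend Compile

lazy val root = (project in file("."))
  .configs(Localcompile)
  .settings(inConfig(Localcompile)(Defaults.compileSettings): _*)
  .settings(
    unmanagedSources in Localcompile += file("/path/to/my/dev.scala"),
    libraryDependencies += "org.cassandraunit" % "cassandra-unit" % "2.0.2.1" % "localcompile"
  )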
I am attempting to use sbt with the following plugin: https://github.com/siasia/xsbt-proguard-plugin. So far I haven't had any issues with the plugin, apart from the fact that proguard puts all of the unmanaged jars into the final min.jar file (causing problems with multiple jars that conflict). Proguard has the proguardLibraryJars flag, which allows you to specify jars for proguard to exclude.
Essentially I want to add all of the jars from the TaskKey unmanagedJars to proguardLibraryJars using the plugin, i.e. do something like this:
lazy val proguard = proguardSettings ++ Seq(
  proguardOptions := Seq(
    keepMain("com.test.FacebookPostScheduler"),
    keepMain("org.postgresql.Driver")
  ),
  proguardLibraryJars <++= unmanagedClasspath
)
The problem is that the above obviously doesn't compile at this line
proguardLibraryJars <++= unmanagedClasspath
with the
No implicit for Append.Values[Seq[java.io.File], sbt.Keys.Classpath] found
error.
How would you code what I am attempting to do using the latest SBT (0.11.3-2) with a Build.scala (not a build.sbt)?
I have a public repository of an SBT plugin that manages to pass jars to proguard. It doesn't use the proguard plugin, but the code might help you figure out how to gather dependencies.
https://github.com/tlazaro/xsbt-plugin-deployer/blob/master/src/main/scala/Deployer.scala
Look for:
private def getDepsJars(project: ProjectRef, bs: BuildStructure) = forAllProjects(project, bs) { p =>
  artifactPath in Compile in packageBin in p get bs.data
}
That might give you a way to get started. It gathers every needed jar, which is what you usually want, not just the unmanaged ones.
Alternatively you can just use this plugin and maybe collaborate. The code is a bit sloppy; it is not intended to be released yet. The plugin does some other neat stuff, like compressing everything with pack200 into a jar and providing a custom ClassLoader that loads the compressed classes from there at runtime.
adamw/xsbt-proguard-plugin, which is the successor of siasia/xsbt-proguard-plugin, seems to have the very option:
By default Proguard will be instructed to include everything except classes from the Java runtime. To treat additional libraries as external (i.e. to add them to the list of -libraryjars passed to Proguard), do the following. Here is an example of how to select a module named "httpclient" from the library dependencies:
proguardLibraryJars <++= (update) map (_.select(module = moduleFilter(name = "httpclient")))
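As for the compile error quoted in the question: it occurs because unmanagedClasspath is a Classpath (a Seq[Attributed[File]]) while proguardLibraryJars expects plain files, which is what the missing Append.Values instance complains about. A sketch of a direct conversion, assuming the plugin's key types are as the error message suggests:
// sketch: unwrap the Attributed[File] entries to plain Files before appending
proguardLibraryJars <++= (unmanagedClasspath in Compile) map (cp => Attributed.data(cp))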
I'm writing my own Scala compiler plugin and using sbt to build the project. Is it possible to put the source of that plugin in the same project that needs to be compiled with that plugin?
All the documentation on sbt seems to be concerned with using a plugin that is external to the project. It just seems much easier to test the plugin if they are in the same project; otherwise I have to continuously build the plugin, copy the jar over to the main project, and then compile that.
The documentation I read is at http://code.google.com/p/simple-build-tool/wiki/CompilerPlugins.
Here is an example using SBT 0.13:
object PluginBuild extends Build {

  def buildSettings = Seq(
    name := "test-compiler-plugin",
    scalaVersion := "2.10.3"
  )

  override def settings = super.settings ++ buildSettings

  lazy val codeToBeChecked = project.in(file("code-to-be-checked"))
    .settings(
      scalacOptions += "-Xplugin:" + packageBin.in(Compile).in(thePlugin).value
    )

  lazy val thePlugin = project.in(file("the-plugin"))
    .settings(
      libraryDependencies += "org.scala-lang" % "scala-compiler" % scalaVersion.value
    )
}
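One practical note on this setup: compiling codeToBeChecked forces thePlugin to be packaged first, because scalacOptions depends on its packageBin task. However, since the jar path passed via -Xplugin stays the same when the plugin's contents change, incremental compilation may not pick up plugin changes; cleaning code-to-be-checked is a simple, if blunt, workaround.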
I am not sure what you are doing, but maybe the project/plugins/src_managed/ directory is what you are looking for. If the user of the plugin needs some code from the plugin, it can be found there.