SBT AutoPlugin dependencies - scala

I've created an SBT plugin placed in the project folder. This plugin extends sbt.AutoPlugin and adds a custom task.
Something like this:
object MyCustomTask extends AutoPlugin {
  ...
  lazy val myCustomTask = Def.task {
    runner.value.run("my.support.project.classpath.Utility")
  }
}
and I have this build.sbt
lazy val support = (project in file("support"))
  .settings(libraryDependencies ++= Seq(
    "com.h2database" % "h2" % "1.4.197"
  ))

lazy val root = (project in file("root"))
  .settings(...)
  .dependsOn(support) // <- how can I remove this?
  .enablePlugins(MyCustomTask)
I don't want to create a dependency between the root project and the support project, because that way root inherits all the dependencies from support that it doesn't need (like the h2database dependency). But if I remove dependsOn(support), the task defined in MyCustomTask can't find my.support.project.classpath.Utility.
How can I move that dependency into the MyCustomTask plugin definition?

Dependencies can be added to the plugin by overriding the projectSettings field, like the following:
object MyCustomTask extends AutoPlugin {
  ...
  lazy val myCustomTask = Def.task {
    runner.value.run("my.support.project.classpath.Utility")
  }

  override val projectSettings: Seq[Def.Setting[_]] = Seq(
    libraryDependencies += "com.h2database" % "h2" % "1.4.197"
  )
}
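With the dependency declared in the plugin's projectSettings, the dependsOn(support) line from the question's build.sbt can be dropped. A minimal sketch of the resulting build.sbt, using the same project names as in the question, might look like this:

lazy val support = (project in file("support"))
  .settings(libraryDependencies ++= Seq(
    "com.h2database" % "h2" % "1.4.197"
  ))

lazy val root = (project in file("root"))
  .enablePlugins(MyCustomTask) // the plugin's projectSettings now add the h2 dependency here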

Related

How to exclude logging (like logback-classic) from jar published by sbt

My Scala project has a libraryDependency on slf4j because I use the API for logging. I also want to see the logging output while running from sbt or IntelliJ, both for the apps run with runMain and the unit tests run with testOnly from sbt. Therefore there is also a libraryDependency on logback-classic. However, I do not want that second dependency published, because of the convention stated below. When someone uses my published library, the transitive dependency should not be automatically brought in. How should that be done? I don't want to explain to the user how to manually exclude the transitive dependency, because they might be using any number of different tools. However, logback-classic should still be included in an assembled jar, if at all possible. It doesn't seem like exclude() is the answer.
"Embedded components such as libraries or frameworks should not declare a dependency on any SLF4J binding/provider [like logback-classic] but only depend on slf4j-api. When a library declares a transitive dependency on a specific binding, that binding is imposed on the end-user negating the purpose of SLF4J. Note that declaring a non-transitive dependency on a binding, for example for testing, does not affect the end-user."
Publish the jar with slf4j-api but use the sbt Test configuration for logback. Unit tests will then have a concrete implementation but it won't be packaged in your artifact.
libraryDependencies ++= Seq(
  "org.slf4j" % "slf4j-api" % "1.7.36",
  "ch.qos.logback" % "logback-classic" % "1.2.11" % Test
)
This would be a project with sub-projects. Your sample app uses a concrete implementation, but the library itself does not. Anyone using the library would provide their own.
lazy val root = (project in file("."))
  .settings(
    publish / skip := true,
  )
  .aggregate(sampleApp, theLibrary)

lazy val sampleApp = project
  .settings(
    publish / skip := true,
    libraryDependencies ++= Seq(
      "ch.qos.logback" % "logback-classic" % "1.2.11"
    )
  )
  .dependsOn(theLibrary % "test->test;compile->compile")

lazy val theLibrary = project
  .settings(
    libraryDependencies ++= Seq(
      "org.slf4j" % "slf4j-api" % "1.7.36",
      "ch.qos.logback" % "logback-classic" % "1.2.11" % Test
    )
  )
My tentative solution is to add this code to an sbt file:
import scala.xml.Node
import scala.xml.transform.RuleTransformer
import org.clulab.sbt.{DependencyFilter, DependencyId}

ThisBuild / pomPostProcess := {
  val logback = DependencyId("ch.qos.logback", "logback-classic")
  val rule = DependencyFilter { dependencyId =>
    dependencyId != logback
  }
  (node: Node) => new RuleTransformer(rule).transform(node).head
}
and back it up with this Scala code in the project directory
package org.clulab.sbt

import scala.xml.Node
import scala.xml.NodeSeq
import scala.xml.transform.RewriteRule

case class DependencyId(groupId: String, artifactId: String)

abstract class DependencyTransformer extends RewriteRule {
  override def transform(node: Node): NodeSeq = {
    val name = node.nameToString(new StringBuilder()).toString()

    name match {
      case "dependency" =>
        val groupId = (node \ "groupId").text.trim
        val artifactId = (node \ "artifactId").text.trim
        transform(node, DependencyId(groupId, artifactId))
      case _ => node
    }
  }

  def transform(node: Node, dependencyId: DependencyId): NodeSeq
}

class DependencyFilter(filter: DependencyId => Boolean) extends DependencyTransformer {
  def transform(node: Node, dependencyId: DependencyId): NodeSeq =
    if (filter(dependencyId)) node
    else Nil
}

object DependencyFilter {
  def apply(filter: DependencyId => Boolean): DependencyFilter = new DependencyFilter(filter)
}
I'm still hoping to find a similar solution for editing ivy.xml.

Need to provide a SettingKey from a plugin I use in my sbt plugin

I am using the s3 resolver plugin and would like to override its s3CredentialsProvider setting in my AutoPlugin.
I have tried adding the value to projectSettings and globalSettings.
Error
not found: value s3CredentialsProvider
[error] s3CredentialsProvider := s3CredentialsProviderChain
Code
lazy val s3CredentialsProviderChain = { bucket: String =>
  new AWSCredentialsProviderChain(
    new EnvironmentVariableCredentialsProvider(),
    CustomProvider.create(bucket)
  )
}

override lazy val projectSettings = Seq(
  publishTo := {
    if (Keys.isSnapshot.value) {
      Some("my-snapshots" at "s3://rest-of-stuff")
    } else {
      Some("my-releases" at "s3://rest-of-stuff")
    }
  },
  s3CredentialsProvider := s3CredentialsProviderChain
)
The plugin code I'm working on does not define any custom settings of its own, and thus has no autoImport of its own.
Update
I have been unable to resolve the fm.sbt.S3ResolverPlugin in MyPlugin, and the code won't compile.
I have tried adding it to enablePlugins in MyPlugin's build.sbt, as well as adding it to the dependencies like this:
libraryDependencies ++= Seq(
  "com.amazonaws" % "aws-java-sdk-sts" % amazonSDKVersion,
  "com.frugalmechanic" % "fm-sbt-s3-resolver" % "0.17.0"
)
I get an error from sbt, which I've asked about in a separate question:
sbt fails to resolve a plugin as dependency
If you create an AutoPlugin in the project directory, you need to add this to plugins.sbt:
addSbtPlugin("com.frugalmechanic" % "fm-sbt-s3-resolver" % "0.16.0")
If you create an independent plugin, add this to the plugin's build.sbt:
sbtPlugin := true
addSbtPlugin("com.frugalmechanic" % "fm-sbt-s3-resolver" % "0.16.0")
autoImport does not work in Scala files that are compiled for sbt, i.e. plugins. You have to specify import statements as in a regular Scala program. Something like this:
import fm.sbt.S3ResolverPlugin
import sbt._

object TestPlugin extends AutoPlugin {
  override def requires = S3ResolverPlugin
  override def trigger = allRequirements

  override def projectSettings: Seq[Def.Setting[_]] = Seq(
    S3ResolverPlugin.autoImport.s3CredentialsProvider := ???
  )
}
Note that to enable TestPlugin, you have to call enablePlugins(S3ResolverPlugin) in build.sbt
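For example, a minimal build.sbt sketch for a consuming project (the project name here is hypothetical):

lazy val myService = (project in file("."))
  .enablePlugins(S3ResolverPlugin) // TestPlugin is then enabled automatically via allRequirements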

How do you share a custom task in a SBT multi-project

I have a project set up as an SBT multi-build that looks like this:
- project
    Dependencies.scala
- core
    build.sbt
- server
    build.sbt
build.sbt
I want to use Dependencies.scala as a container for version numbers of libraries that are shared between the sub-projects.
sealed trait Dependencies {
  val commonsIo = "2.4"
}

object DependencyVersions extends Dependencies
In the root build.sbt I added a Setting that is given to each sub-project.
lazy val dependencies = settingKey[Dependencies]("versions")

val defaultSettings = Defaults.coreDefaultSettings ++ Seq(
  dependencies := DependencyVersions)

def projectFolder(name: String, theSettings: Seq[Def.Setting[_]] = Nil) =
  Project(name, file(name), settings = theSettings)

lazy val core = projectFolder("core", defaultSettings)
I can't access the dependencies setting in core/build.sbt.
"commons-io" % "commons-io" % dependencies.value.commonsIo, <-- doesn't work
How can I get this to work?
You can define common settings (dependencies) in an object Common extends AutoPlugin (in project/Common.scala), and then use .enablePlugins(Common) on the sub-project descriptor (see it in Anorm).
Thanks @cchantep, got it working now using the AutoPlugin below:
import sbt._

sealed trait Dependencies {
  val commonsIo = "2.4"
}

object DependencyVersions extends Dependencies

object DependencyVersionsPlugin extends AutoPlugin {
  override def trigger = allRequirements

  object autoImport {
    lazy val dependencies = settingKey[Dependencies]("Bundles dependency versions")
  }
  import autoImport._

  override def projectSettings = Seq(
    dependencies := DependencyVersions
  )
}
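Because the plugin's trigger is allRequirements, it is enabled automatically and its autoImport members are in scope in every build.sbt, so the setting from the question can be used directly. A minimal sketch for core/build.sbt:

libraryDependencies += "commons-io" % "commons-io" % dependencies.value.commonsIo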

How to use SBT IntegrationTest configuration from Scala objects

To make our multi-project build more manageable, we split up our Build.scala file into several files; e.g. Dependencies.scala contains all the dependencies:
import sbt._

object Dependencies {
  val slf4j_api = "org.slf4j" % "slf4j-api" % "1.7.7"
  ...
}
We want to add integration tests to our build. Following the SBT documentation we added
object Build extends sbt.Build {
  import Dependencies._
  import BuildSettings._
  import Version._
  import MergeStrategies.custom

  lazy val root = Project(
    id = "root",
    base = file("."),
    settings = buildSettings ++ Seq(Git.checkNoLocalChanges, TestReport.testReport)
  ).configs(IntegrationTest).settings(Defaults.itSettings: _*)
  ...
}
where Dependencies, BuildSettings, Version and MergeStrategies are custom Scala objects defined in their own files.
Following the documentation we want to add some dependencies for the IntegrationTest configuration in Dependencies.scala:
import sbt._

object Dependencies {
  val slf4j_api = "org.slf4j" % "slf4j-api" % "1.7.7"
  val junit = "junit" % "junit" % "4.11" % "test,it"
  ...
}
Unfortunately this breaks the build:
java.lang.IllegalArgumentException: Cannot add dependency 'junit#junit;4.11' to configuration 'it' of module ... because this configuration doesn't exist!
I guess I need to import the IntegrationTest configuration. I tried importing the IntegrationTest configuration in Dependencies.scala:
import sbt.Configurations.IntegrationTest
IntegrationTest is a lazy val defined in the Configurations object:
object Configurations {
...
lazy val IntegrationTest = config("it") extend (Runtime)
...
}
But that did not solve the problem.
Does someone have an idea how to solve this?
You need to add the config to the Project object before you add the dependency to it.
Your quoted code shows you doing the former, but it doesn't show where you are doing the latter.
Could you post the full config, or try swapping the order of those two steps?
Here is where the config is added to the Project object in the SBT docs you linked to:
lazy val root = (project in file(".")).
  configs(IntegrationTest).
Your quoted code above, which declares a lazy val but does not use it, is not sufficient to bring the "it" config into use:
lazy val IntegrationTest = config("it") extend (Runtime)
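Putting the two together, a minimal sketch (reusing the junit coordinates from the question) adds the configuration to the project and only then the it-scoped dependency:

lazy val root = (project in file("."))
  .configs(IntegrationTest)              // make the "it" configuration exist on this project
  .settings(Defaults.itSettings: _*)     // wire in the standard integration-test tasks
  .settings(libraryDependencies += "junit" % "junit" % "4.11" % "test,it")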

How to use custom plugin in multi-module project?

In a multi-module project, with one module implementing a custom SBT plugin with a custom TaskKey, how can this plugin be imported into the project settings of another submodule?
If the submodule using the plugin is defined in submodule1/build.sbt, then submodule1/project/plugins.sbt is not loaded.
If the plugin is registered in project/plugins.sbt, it will fail when loading the top/aggregate project, as the plugin is not necessarily built yet.
Is there any other way to define a custom task requiring a custom dependency so that it can be used by a submodule?
Here is how I finally made it work:
import sbt._
import Keys._

object MyBuild extends Build {
  private lazy val myGenerator =
    // Private project with generator code and its specific dependencies
    // (e.g. Javassist)
    Project(id = "my-generator",
      base = file("project") / "my-generator").settings(
      name := "my-generator",
      javacOptions in Test ++= Seq("-Xlint:unchecked", "-Xlint:deprecation"),
      autoScalaLibrary := false,
      scalacOptions += "-feature",
      resolvers +=
        "Typesafe Snapshots" at "http://repo.typesafe.com/typesafe/releases/",
      libraryDependencies ++= Seq( // Dependencies required to generate classes
        "org.javassist" % "javassist" % "3.18.2-GA")
    )

  // Some custom setting & task
  lazy val generatedClassDirectory = settingKey[File](
    "Directory where classes get generated")

  lazy val generatedClasses = taskKey[Seq[(File, String)]]("Generated classes")

  lazy val myProject =
    Project(id = "my-project", base = file("my-project")).settings(
      name := "my-project",
      javacOptions in Test ++= Seq("-Xlint:unchecked", "-Xlint:deprecation"),
      autoScalaLibrary := false,
      scalacOptions += "-feature",
      libraryDependencies ++= Seq(/* ... */),
      generatedClassDirectory := {
        // Defines setting for path to generated classes
        val dir = target.value / "generated_classes"
        if (!dir.exists) dir.mkdirs()
        dir
      },
      generatedClasses <<= Def.task { // Define task generating .class files
        // first get a classloader including the generator and its dependencies
        val cp = (fullClasspath in (myGenerator, Compile)).value
        val cl = classpath.ClasspathUtilities.toLoader(cp.files)

        // then load the generator class and instantiate it via a structural type
        val genClass = cl loadClass "my.custom.GeneratorClass"
        val generator = genClass.newInstance.
          asInstanceOf[{ def writeTo(out: File): File }]

        // finally we can call the generator
        val outdir = generatedClassDirectory.value
        val generated = generator writeTo outdir
        val path = generated.getAbsolutePath

        // Mappings describing generated classes
        Seq[(File, String)](generated -> path.
          drop(outdir.getAbsolutePath.length + 1))
      } dependsOn (compile in (myGenerator, Compile)) /* awkward? */,
      managedClasspath in Compile := {
        // Add generated classes to the compilation classpath,
        // so they can be used in my-project sources
        val cp = (managedClasspath in Compile).value
        cp :+ Attributed.blank(generatedClassDirectory.value)
      },
      // Make sure custom class generation is done before compile
      compile in Compile <<= (compile in Compile) dependsOn generatedClasses,
      mappings in (Compile, packageBin) := {
        val ms = mappings.in(Compile, packageBin).value
        ms ++ generatedClasses.value // add generated classes to package
      }
    ).dependsOn(myGenerator /* required even with the dependsOn(compile in (myGenerator, Compile)) above */)
}
I'm not sure there is a better solution, especially regarding the redundant dependsOn(compile in (myGenerator, Compile)) and .dependsOn(myGenerator).