How to exclude logging (like logback-classic) from jar published by sbt - scala

My Scala project has a libraryDependency on slf4j because I use the API for logging. I also want to see the logging output while running from sbt or IntelliJ, both for the Apps that runMain and the unit tests that testOnly from sbt. Therefore there is also a libraryDependency on logback-classic. However, I do not want that second dependency published because of the convention stated below. When someone uses my published library, the transitive dependency should not be automatically brought in. How should that be done? I don't want to explain to the user how to manually exclude the transitive dependency, because they might be using any number of different tools. The logback-classic should continue to be included in an assembled jar, however, if at all possible. It doesn't seem like exclude() is the answer.
"Embedded components such as libraries or frameworks should not declare a dependency on any SLF4J binding/provider [like logback-classic] but only depend on slf4j-api. When a library declares a transitive dependency on a specific binding, that binding is imposed on the end-user negating the purpose of SLF4J. Note that declaring a non-transitive dependency on a binding, for example for testing, does not affect the end-user."

Publish the jar with slf4j-api but use the sbt Test configuration for logback. Unit tests will then have a concrete implementation but it won't be packaged in your artifact.
libraryDependencies ++= Seq(
  "org.slf4j" % "slf4j-api" % "1.7.36",
  "ch.qos.logback" % "logback-classic" % "1.2.11" % Test
)
This would be a build with sub-projects. Your sample app uses a concrete implementation, but the library itself does not. Anyone using the library would provide their own.
lazy val root = (project in file("."))
.settings(
publish / skip := true,
)
.aggregate(sampleApp, theLibrary)
lazy val sampleApp = project
.settings(
publish / skip := true,
libraryDependencies ++= Seq(
"ch.qos.logback" % "logback-classic" % "1.2.11"
)
)
.dependsOn(theLibrary % "test->test;compile->compile")
lazy val theLibrary = project
.settings(
libraryDependencies ++= Seq(
"org.slf4j" % "slf4j-api" % "1.7.36",
"ch.qos.logback" % "logback-classic" % "1.2.11" % Test
)
)

My tentative solution is to add this code to an sbt file
import scala.xml.Node
import scala.xml.transform.RuleTransformer
import org.clulab.sbt.{DependencyFilter, DependencyId}

ThisBuild / pomPostProcess := {
  val logback = DependencyId("ch.qos.logback", "logback-classic")
  val rule = DependencyFilter { dependencyId =>
    dependencyId != logback
  }

  (node: Node) => new RuleTransformer(rule).transform(node).head
}
and back it up with this Scala code in the project directory
package org.clulab.sbt

import scala.xml.Node
import scala.xml.NodeSeq
import scala.xml.transform.RewriteRule

case class DependencyId(groupId: String, artifactId: String)

// Visits each node of the pom and delegates <dependency> elements to the abstract transform().
abstract class DependencyTransformer extends RewriteRule {
  override def transform(node: Node): NodeSeq = {
    val name = node.nameToString(new StringBuilder()).toString()

    name match {
      case "dependency" =>
        val groupId = (node \ "groupId").text.trim
        val artifactId = (node \ "artifactId").text.trim

        transform(node, DependencyId(groupId, artifactId))
      case _ => node
    }
  }

  def transform(node: Node, dependencyId: DependencyId): NodeSeq
}

// Keeps a <dependency> only if the filter accepts it; otherwise it is dropped from the pom.
class DependencyFilter(filter: DependencyId => Boolean) extends DependencyTransformer {
  def transform(node: Node, dependencyId: DependencyId): NodeSeq =
    if (filter(dependencyId)) node
    else Nil
}

object DependencyFilter {
  def apply(filter: DependencyId => Boolean): DependencyFilter = new DependencyFilter(filter)
}
I'm still hoping to find a similar solution for editing ivy.xml.
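One direction that might work for the ivy.xml side, though I have not verified it: post-process the file produced by sbt's deliverLocal task (and do the same for deliver, which generates the ivy.xml used for remote publishing). Note that Ivy lists dependencies as org/name attributes rather than groupId/artifactId child elements, so the rewrite rule has to read attributes instead of child nodes. Treat the following as an untested sketch, not a confirmed solution:
import scala.xml.{Node, XML}
import scala.xml.transform.{RewriteRule, RuleTransformer}

deliverLocal := {
  val ivyFile = deliverLocal.value // the ivy.xml generated for publishLocal
  // Drop the <dependency org="ch.qos.logback" name="logback-classic" .../> element.
  val dropLogback = new RewriteRule {
    override def transform(node: Node): Seq[Node] =
      if (node.label == "dependency" &&
          (node \ "@org").text == "ch.qos.logback" &&
          (node \ "@name").text == "logback-classic") Nil
      else node
  }
  val rewritten = new RuleTransformer(dropLogback).transform(XML.loadFile(ivyFile)).head

  XML.save(ivyFile.getAbsolutePath, rewritten, "UTF-8", xmlDecl = true)
  ivyFile
}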

Related

Need to provide a SettingKey from a plugin I use in my sbt plugin

I am using the s3 resolver plugin and would like to override one of its settings in my AutoPlugin.
I have tried adding the value to projectSettings and globalSettings.
Error
not found: value s3CredentialsProvider
[error] s3CredentialsProvider := s3CredentialsProviderChain
Code
lazy val s3CredentialsProviderChain = { bucket: String =>
  new AWSCredentialsProviderChain(
    new EnvironmentVariableCredentialsProvider(),
    CustomProvider.create(bucket)
  )
}

override lazy val projectSettings = Seq(
  publishTo := {
    if (Keys.isSnapshot.value) {
      Some("my-snapshots" at "s3://rest-of-stuff")
    } else {
      Some("my-releases" at "s3://rest-of-stuff")
    }
  },
  s3CredentialsProvider := s3CredentialsProviderChain
)
The plugin code I'm working on does not define any custom settings of its own, and thus has no autoImport of its own.
Update
I have been unable to resolve the fm.sbt.S3ResolverPlugin in MyPlugin and the code won't compile.
I have tried adding it to enablePlugins on MyPlugin's build.sbt as well as adding it to the dependencies like this:
libraryDependencies ++= Seq(
  "com.amazonaws" % "aws-java-sdk-sts" % amazonSDKVersion,
  "com.frugalmechanic" % "fm-sbt-s3-resolver" % "0.17.0"
)
I get an error from sbt, which I've asked about in a separate question:
sbt fails to resolve a plugin as dependency
If you create an AutoPlugin in the project directory, you need to add this to plugins.sbt:
addSbtPlugin("com.frugalmechanic" % "fm-sbt-s3-resolver" % "0.16.0")
If you create an independent plugin, add this to the plugin's build.sbt:
sbtPlugin := true
addSbtPlugin("com.frugalmechanic" % "fm-sbt-s3-resolver" % "0.16.0")
autoImport does not work in Scala files that are compiled for sbt, i.e. in plugins. You have to specify import statements as in a regular Scala program. Something like this:
import fm.sbt.S3ResolverPlugin
import sbt._

object TestPlugin extends AutoPlugin {
  override def requires = S3ResolverPlugin
  override def trigger = allRequirements

  override def projectSettings: Seq[Def.Setting[_]] = Seq(
    S3ResolverPlugin.autoImport.s3CredentialsProvider := ???
  )
}
Note that to enable TestPlugin, you have to call enablePlugins(S3ResolverPlugin) in build.sbt
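For completeness, a minimal sketch of what the consuming build might look like. The test-plugin coordinates are hypothetical, and the explicit fm-sbt-s3-resolver line may be redundant if it already comes in transitively through your plugin:
// project/plugins.sbt of the build that uses the plugins
addSbtPlugin("com.example" % "test-plugin" % "0.1.0") // hypothetical coordinates
addSbtPlugin("com.frugalmechanic" % "fm-sbt-s3-resolver" % "0.16.0")

// build.sbt
enablePlugins(S3ResolverPlugin) // TestPlugin then activates itself via trigger = allRequirements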

How to use SBT IntegrationTest configuration from Scala objects

To make our multi-project build more manageable we split up our Build.scala file into several files, e.g. Dependencies.scala contains all dependencies:
import sbt._

object Dependencies {
  val slf4j_api = "org.slf4j" % "slf4j-api" % "1.7.7"
  ...
}
We want to add integration tests to our build. Following the SBT documentation we added
object Build extends sbt.Build {
  import Dependencies._
  import BuildSettings._
  import Version._
  import MergeStrategies.custom

  lazy val root = Project(
    id = "root",
    base = file("."),
    settings = buildSettings ++ Seq(Git.checkNoLocalChanges, TestReport.testReport)
  ).configs(IntegrationTest).settings(Defaults.itSettings: _*)
  ...
}
where Dependencies, BuildSettings, Version and MergeStrategies are custom Scala objects defined in their own files.
Following the documentation we want to add some dependencies for the IntegrationTest configuration in Dependencies.scala:
import sbt._

object Dependencies {
  val slf4j_api = "org.slf4j" % "slf4j-api" % "1.7.7"
  val junit = "junit" % "junit" % "4.11" % "test,it"
  ...
}
Unfortunately this breaks the build:
java.lang.IllegalArgumentException: Cannot add dependency
'junit#junit;4.11' to configuration 'it' of module ... because this configuration doesn't exist!
I guess I need to import the IntegrationTest configuration, so I tried importing it in Dependencies.scala:
import sbt.Configurations.IntegrationTest
IntegrationTest is a lazy val defined in the Configurations object:
object Configurations {
  ...
  lazy val IntegrationTest = config("it") extend (Runtime)
  ...
}
But that did not solve the problem.
Does someone have an idea how to solve this?
You need to add the config to the Project object before you add the dependency to it.
Your quoted code shows you doing the former, but it doesn't show where you are doing the latter.
Could you post the full config, or try swapping the order of those two?
Here is where the config is added to the Project object in the SBT docs you linked to:
lazy val root = (project in file(".")).
configs(IntegrationTest).
The quoted code above, which declares a lazy val but does not use it, is not sufficient to bring the "it" config into use:
lazy val IntegrationTest = config("it") extend (Runtime)

Compile with different settings in different commands

I have a project defined as follows:
lazy val tests = Project(
  id = "tests",
  base = file("tests")
) settings(
  commands += testScalalib
) settings (
  sharedSettings ++ useShowRawPluginSettings ++ usePluginSettings: _*
) settings (
  libraryDependencies <+= (scalaVersion)("org.scala-lang" % "scala-reflect" % _),
  libraryDependencies <+= (scalaVersion)("org.scala-lang" % "scala-compiler" % _),
  libraryDependencies += "org.tukaani" % "xz" % "1.5",
  scalacOptions ++= Seq()
)
I would like to have three different commands, each of which compiles only some of the files inside this project. The testScalalib command added above, for instance, is supposed to compile only some specific files.
My best attempt so far is:
lazy val testScalalib: Command = Command.command("testScalalib") { state =>
  val extracted = Project extract state
  import extracted._

  val newState = append(Seq(
    (sources in Compile) <<= (sources in Compile).map(_ filter (f =>
      !f.getAbsolutePath.contains("scalalibrary/") && f.name != "Typers.scala"))),
    state)

  runTask(compile in Compile, newState)
  state
}
Unfortunately when I use the command, it still compiles the whole project, not just the specified files...
Do you have any idea how I should do that?
I think your best bet would be to create different configurations like compile and test, and have the appropriate settings values that would suit your needs. Read Scopes in the official sbt documentation and/or How to define another compilation scope in SBT?
I would not create additional commands; I would create an extra configuration, as @JacekLaskowski suggested, based on the answer he cited.
This is how you can do it using sbt 0.13.2 and Build.scala (you could of course do the same in build.sbt, or in an older sbt version with different syntax):
import sbt._
import Keys._

object MyBuild extends Build {
  lazy val Api = config("api")

  val root = Project(id = "root", base = file(".")).configs(Api).settings(custom: _*)

  lazy val custom: Seq[Setting[_]] = inConfig(Api)(Defaults.configSettings ++ Seq(
    unmanagedSourceDirectories := (unmanagedSourceDirectories in Compile).value,
    classDirectory := (classDirectory in Compile).value,
    dependencyClasspath := (dependencyClasspath in Compile).value,
    unmanagedSources := {
      unmanagedSources.value.filter(f => !f.getAbsolutePath.contains("scalalibrary/") && f.name != "Typers.scala")
    }
  ))
}
Now when you call compile everything will get compiled, but when you call api:compile only the classes matching the filter predicate will be.
By the way, you may also want to look into defining different unmanagedSourceDirectories and/or an includeFilter, as sketched below.
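For instance, a rough, untested sketch that expresses the same predicate as a filter placed inside the inConfig(Api)(...) block above, instead of filtering the unmanagedSources task (SimpleFileFilter just wraps a File => Boolean predicate):
// keep only .scala sources, excluding the scalalibrary/ directory and Typers.scala
includeFilter in unmanagedSources := new SimpleFileFilter(f =>
  f.getName.endsWith(".scala") &&
  !f.getAbsolutePath.contains("scalalibrary/") &&
  f.getName != "Typers.scala"
)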

Shapeless example with map won't compile (scala)

I'm trying to map over an HList in shapeless. The following example is derived from here:
import shapeless._
import poly._

object Main {
  def main(args: Array[String]) = {
    object choose extends (Set ~> Option) {
      def apply[T](s: Set[T]) = s.headOption
    }

    val sets = Set(1) :: Set("foo") :: HNil
    val opts = sets map choose // map selects cases of choose for each HList element
  }
}
Unfortunately I am unable to compile the example. The compiler says "value map is not a member of HCons[scala.collection.immutable.Set[Int],HCons[scala.collection.immutable.Set[String],HNil]]". I suspect there is a missing import of an implicit that defines the map operation on HLists, but I don't know what that import should be. I'm using sbt with the following build.sbt file:
name := "scala-polymorphism-experiments"
version := "0.1.0"
scalaVersion := "2.10.3"
resolvers ++= Seq(
"Sonatype OSS Releases" at "http://oss.sonatype.org/content/repositories/releases/",
"Sonatype OSS Snapshots" at "http://oss.sonatype.org/content/repositories/snapshots/"
)
libraryDependencies ++= Seq("org.scalatest" % "scalatest_2.10" % "2.0" % "test",
"com.chuusai" % "shapeless" % "2.0.0-SNAPSHOT" cross CrossVersion.full changing())
I also have this problem if I use the M1 release of 2.0.0. What should I change to make this example compile and run?
The problem was never determined. The solution was to comment out all code in all other Scala files in the project, recompile, then uncomment and compile again. No doubt an sbt clean would have done just as well.

libraryDependencies on sbt Build.scala Full Configuration with sub-projects

I have a project foo with two children, foo-core and foo-cli, where foo-cli depends on foo-core
(I come from Java/Maven and tried to transpose the parent-module-with-two-submodules architecture).
Following https://github.com/harrah/xsbt/wiki/Full-Configuration, I wrote my project/Build.scala this way:
import sbt._
import Keys._

object MyBuild extends Build {

  // Dependencies
  val slf4s = "com.weiglewilczek.slf4s" %% "slf4s" % "1.0.6"
  val slf4j = "org.slf4j" %% "slf4j-simple" % "1.5.6"
  val grizzled = "org.clapper" %% "grizzled-slf4j" % "0.5"
  val junit = "junit" % "junit" % "4.8" % "test"
  // End dependencies

  lazy val root: Project = Project("root", file(".")) aggregate(cli) settings(
    mainClass := Some("Main")
  )

  lazy val core: Project = Project("core", file("core"), delegates = root :: Nil) settings(
    name := "foo-core",
    libraryDependencies ++= Seq(grizzled)
  )

  lazy val cli: Project = Project("cli", file("cli")) dependsOn(core) settings(
    name := "foo-cli",
    libraryDependencies ++= Seq(grizzled)
  )
}
This configuration does not work: the grizzled library is not downloaded when I run sbt reload;sbt +update (as indicated in http://software.clapper.org/grizzled-slf4j/) and thus the "import grizzli._" fails in my core and cli projects when I run sbt compile.
Since I'm new to Scala/sbt, I imagine I'm doing something awful but can't figure out why, since I'm confused by all the conflicting sbt 0.7/sbt 0.10 configurations that have been suggested
(like Subproject dependencies in SBT).
Any idea? Hint that could help me?
Thanks in advance
The dependency you are using is grizzled, not grizzli. The import is:
import grizzled._
This works here from the console on both project cli and project core, with nothing more than the configuration file above.
Are you using sbt 0.10?