I have this line of code in my build.sbt file:
libraryDependencies ++= Seq("com.foo" %% "lib" % "1.2.3")
Imagine that this library depends on a "com.bar.lib" library. Now in my code I can import com.bar.lib._ and it will work. But I don't want this to compile. Is there maybe an SBT plugin out there just for this purpose?
One of the libraries I'm using depends on an old cats version. I spent a really long time trying to understand why the mapN method doesn't work... I had just never imported a newer version of cats in the subproject.
SBT offers the intransitive and exclude features to deal with issues like this, as @earldouglas points out. See: https://www.scala-sbt.org/1.x/docs/Library-Management.html
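For illustration, the two look roughly like this (a sketch; the "com.bar" organization/name for the transitive dependency is assumed from the question):
// Don't pull in any transitive dependencies of "lib" at all:
libraryDependencies += "com.foo" %% "lib" % "1.2.3" intransitive()
// Or exclude just one specific transitive dependency (coordinates assumed):
libraryDependencies += "com.foo" %% "lib" % "1.2.3" exclude("com.bar", "lib")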
You replied:
I tried to do so, but intransitive() doesn't pull in transitive dependencies at all (so I need to add all of them by hand to make the project compile).
Yes, that is what it is for.
What I want is something that will warn me about using libraries that are not directly declared in the SBT file.
So you want transitive dependencies on your classpath, but you want the compiler to reject uses of transitive classes in your project code while allowing them in library code?
That is not a sensible approach: at runtime, these transitive dependencies will be on the classpath. The JVM classpath does not distinguish between different kinds of dependencies; such distinction only exists in SBT at build time.
You would be much better served by either
including a newer version of the cats library directly, overriding the transitive dependency, or
excluding the transitively included cats library, if it is broken. (A sketch of both options follows below.)
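For example (untested; the cats coordinates and versions are placeholders, adjust to what your build actually uses):
// Option 1: force a newer cats version, overriding whatever the transitive dependency brings in
dependencyOverrides += "org.typelevel" %% "cats-core" % "2.0.0"
// Option 2: exclude the transitive cats and declare the version you want yourself
libraryDependencies += "com.foo" %% "lib" % "1.2.3" exclude("org.typelevel", "cats-core")
libraryDependencies += "org.typelevel" %% "cats-core" % "2.0.0"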
However, I think you probably could achieve what you want by setting different dependencies at different build stages:
in the Compile configuration, include the dependency with intransitive(). Your code should compile against your direct dependencies, but fail if you reference any transitive dependencies
in the Runtime configuration, include the dependency together with its transitive deps
The SBT code might look like this (untested):
(libraryDependencies in Compile) ++= Seq("com.foo" %% "lib" % "1.2.3" intransitive())
(libraryDependencies in Runtime) ++= Seq("com.foo" %% "lib" % "1.2.3")
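On newer sbt versions (1.x), the same idea written with slash syntax would look roughly like this (equally untested):
Compile / libraryDependencies += "com.foo" %% "lib" % "1.2.3" intransitive()
Runtime / libraryDependencies += "com.foo" %% "lib" % "1.2.3"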
Related
When I build an artefact with
sbt universal:packageZipTarball
the scala-compiler.jar is included as a lib in my artefact.
Why is the scala-compiler needed at runtime? In my understanding it shouldn't be needed.
How can I exclude this jar?
It is probably pulled in as a transitive dependency by one of your libraryDependencies. You can use this sbt plugin to find out which one:
https://github.com/sbt/sbt-dependency-graph
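A sketch of how that might look (the plugin version is illustrative; check the plugin README for a current one):
// project/plugins.sbt
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.9.2")
// then, in the sbt shell, something like:
//   dependencyTree
//   whatDependsOn org.scala-lang scala-compiler <revision>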
Once you have found out which one, you can block it by appending exclude("org.scala-lang", "scala-compiler") to the relevant dependency.
For instance, older versions of the pureconfig library used to erroneously pull this dependency in. It can be fixed like so:
libraryDependencies +=
"com.github.pureconfig" %% "pureconfig" % "0.10.0" exclude("org.scala-lang", "scala-compiler")
I'm writing an SBT plugin. I would like to use the Circe JSON library, but it requires the Macro Paradise compiler plugin on Scala 2.10.
Normally you add compiler plugins to build.sbt and SBT plugins to project/plugins.sbt.
Now, when you're building an SBT plugin, the other plugins become dependencies, so you put them in build.sbt and they are propagated to the projects where your SBT plugin is used.
When I put the following snippet in build.sbt of my SBT plugin:
addCompilerPlugin("org.scalamacros" % "paradise" % "2.1.0" cross CrossVersion.full)
Does the Paradise compiler plugin propagate to downstream projects?
Compiler plugins are not propagated by default, but downstream users will in fact need the compiler plugin as a dependency of their own, and there is no way for you to bypass this requirement.
The reason is simple: downstream code is compiled in a separate compilation unit, so if features that depend on the compiler plugin end up in the end user's codebase (for example, macro annotations that need to expand there), that user has to add the compiler plugin explicitly, and you should document this requirement for your plugin.
Hope this helps. Take the really popular Monocle library as an example: its macro annotations won't expand without Paradise, so it's all a question of what the end user will need.
Quote
If you want to use macro annotations such as @Lenses, you will also need to include:
addCompilerPlugin("org.scalamacros" %% "paradise" % "2.1.0" cross CrossVersion.full)
This is for Scala 2.11.1 and sbt 0.13.5.
Say I have a Scala/sbt project with the following directory structure:
root/
    build.sbt
    src/ ..
    project/
        plugins.sbt
        build.properties
        LolUtils.scala
and I want to use some external library in LolUtils.scala. How is this generally accomplished in sbt?
If I simply add the libs I need into build.sbt via libraryDependencies += .. then it doesn't find them and fails on the import line with not found: object ...
If I add a separate project/build.sbt, for some reason it starts failing to resolve my plugins, plus I need to manually specify the Scala version in the nested project/build.sbt, which is unnecessary duplication.
What's the best way to accomplish this?
sbt is recursive, which means that it uses itself to compile the build definition, i.e. the *.sbt files and the *.scala files under the project directory. To add extra dependencies for use in the build definition, you have to declare them in project/build.sbt.
There is one caveat to that. You can set any scalaVersion for your project, that is, in build.sbt, but you should not modify scalaVersion in project/build.sbt, as it might conflict with the version sbt itself uses (which may or may not lead to binary incompatibility for plugins).
sbt 0.13.5 uses Scala 2.10.4, and the library you're going to use must be compatible with that particular version of Scala.
> about
[info] This is sbt 0.13.5
...
[info] sbt, sbt plugins, and build definitions are using Scala 2.10.4
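For example, to make a library available to LolUtils.scala you could add something like this to project/build.sbt (a sketch; commons-io is just a placeholder dependency):
// project/build.sbt
// Note: no scalaVersion here; sbt uses its own Scala version for the build definition.
libraryDependencies += "commons-io" % "commons-io" % "2.4"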
I am cross-building a Scala project with sbt 0.12.1.
crossScalaVersions := Seq("2.9.2", "2.10.0")
However, it can't find the dependencies because they are named _2.10, not _2.10.0. It seems to be conventional to suffix your library with _2.10 instead of _2.10.0, with the exception of scala-library and scala-compiler. For example, scalaz is not found at http://repo1.maven.org/maven2/org/scalaz/scalaz-core_2.10.0/6.0.4/scalaz-core_2.10.0-6.0.4.pom but at http://repo1.maven.org/maven2/org/scalaz/scalaz-core_2.10/6.0.4/scalaz-core_2.10-6.0.4.pom.
Is there an easy way to handle this without writing custom rules for all of my dependencies?
The actual build.sbt is available online.
Since 2.10.x releases are binary compatible with each other, libraries need to be built against only one version of the Scala library, and they can (and must) drop the .0 part of the suffix (if you publish with sbt, this is done automatically). When the maintainer of a library releases it with a _2.10.0 suffix, it's a mistake and you should consider filing a bug.
By the way, I looked at your build.sbt; running +compile on it works for me (sbt 0.12.1). Are you seeing any errors?
To get the Scala version incorporated into the artifact name in the Scala way, you specify the dependency with the %% operator:
libraryDependencies += "io.backchat.jerkson" %% "jerkson" % "0.7.0"
When the exact match is not available, you can specify the Scala version explicitly (but remember compatibility only exists across patch releases of a given major/minor release of Scala):
libraryDependencies += "io.backchat.jerkson" % "jerkson_2.9.2" % "0.7.0"
I am attempting to use sbt with the following plugin: https://github.com/siasia/xsbt-proguard-plugin. So far I haven't had any issues with the plugin, apart from the fact that Proguard puts all of the unmanaged jars into the final min.jar file (causing problems with multiple jars that conflict). The plugin has the proguardLibraryJars setting, which allows you to specify jars for Proguard to exclude.
Essentially I want to add all of the jars from the unmanagedJars task key to proguardLibraryJars using the plugin, i.e. do something like this:
lazy val proguard = proguardSettings ++ Seq(
  proguardOptions := Seq(
    keepMain("com.test.FacebookPostScheduler"),
    keepMain("org.postgresql.Driver")
  ),
  proguardLibraryJars <++= unmanagedClasspath
)
The problem is that the above doesn't compile; it fails at this line
proguardLibraryJars <++= unmanagedClasspath
with the
No implicit for Append.Values[Seq[java.io.File], sbt.Keys.Classpath] found
error.
How would you code what I am attempting to do using the latest SBT (0.11.3-2) in a Build.scala (not a build.sbt)?
I have a public repository of an SBT plugin that manages to pass jars to Proguard. It doesn't use the Proguard plugin, but the code might help you figure out how to gather dependencies.
https://github.com/tlazaro/xsbt-plugin-deployer/blob/master/src/main/scala/Deployer.scala
Look for:
private def getDepsJars(project: ProjectRef, bs: BuildStructure) = forAllProjects(project, bs) { p =>
  artifactPath in Compile in packageBin in p get bs.data
}
That might give you a way to get started. It gathers every needed jar, which is what you usually want, not just the unmanaged ones.
Alternatively, you can just use this plugin and maybe collaborate. The code is a bit sloppy; it's not intended to be released yet. The plugin does some other neat stuff, like compressing everything with pack200 into a jar and providing a custom ClassLoader that loads the compressed classes from there at runtime.
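If you do specifically want just the unmanaged jars on the -libraryjars list, as in the snippet from the question, one way around the Append error (an untested sketch in the old sbt 0.11.x <++= syntax) is to map the Classpath down to plain Files before appending:
// unmanagedClasspath is a Seq[Attributed[File]]; extract the Files with .data
proguardLibraryJars <++= (unmanagedClasspath in Compile) map { cp => cp.map(_.data) }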
adamw/xsbt-proguard-plugin, which is the successor of siasia/xsbt-proguard-plugin, seems to have the very option:
By default Proguard will be instructed to include everything except classes from the Java runtime. To treat additional libraries as external (i.e. to add them to the list of -libraryjars passed to Proguard), do the following. Here is an example of how to select a module named "httpclient" from the library dependencies:
proguardLibraryJars <++= (update) map (_.select(module = moduleFilter(name = "httpclient")))