How to access library dependencies at build-time in SBT/Scala build?

For example, suppose I have a file project/CodeGeneration.scala that generates "managed" source code, and suppose that object (CodeGeneration) needs to leverage a third-party library -- say, jsoup:
import org.jsoup._
object CodeGeneration {
  def generateCode = /* Generate code using jsoup... */
}
Simply adding a line for jsoup to your libraryDependencies in build.sbt doesn't do the trick; it leads to a compilation error complaining about the missing jsoup object/namespace.
So, (how) can one access this dependency from "meta" code -- code that generates other code?

It seems the solution is to leverage sbt's "recursive" nature, and put an additional build.sbt file in the project directory. So, for example, project/build.sbt might look like this:
libraryDependencies += "org.jsoup" % "jsoup" % "1.11.2"
There's more detail in sbt's official documentation.
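For completeness, here is a minimal sketch of how the two pieces fit together (this assumes generateCode returns the generated source as a String; the output file name is only illustrative):
// project/build.sbt -- makes jsoup available to the build definition itself
libraryDependencies += "org.jsoup" % "jsoup" % "1.11.2"

// build.sbt -- wire the generator into the main build's managed sources
sourceGenerators in Compile += Def.task {
  val out = (sourceManaged in Compile).value / "Generated.scala"
  IO.write(out, CodeGeneration.generateCode)
  Seq(out)
}.taskValue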

Related

Set file-level option to scalapb project

I'm using ScalaPB (version 0.11.1) and the sbt-protoc plugin (version 1.0.3) to try to compile an old project that uses Protocol Buffers with Scala 2.12. Reading the documentation, I want to set the file-level property preserve_unknown_fields to false. But my question is: where? Where do I need to set this flag? In the .proto file?
I've also tried to include the flag as a package-scoped option by creating a package.proto file next to my other .proto file, with the following content (as it is specified here):
import "scalapb/scalapb.proto";
package eur.astrata.eu.bigdata.tpms.protobuf;
option (scalapb.options) = {
  preserve_unknown_fields: false
};
But when trying to compile, I get the following error:
[libprotobuf WARNING T:\src\github\protobuf\src\google\protobuf\compiler\parser.cc:648] No syntax specified for the proto file: package.proto. Please use 'syntax = "proto2";' or 'syntax = "proto3";' to specify a syntax version. (Defaulted to proto2 syntax.)
scalapb/scalapb.proto: File not found.
package.proto:1:1: Import "scalapb/scalapb.proto" was not found or had errors.
I've also tried with syntax = "proto3"; at the beginning but it doesn't work.
Any help would be greatly appreciated.
From the docs:
If you are using sbt-protoc and importing protos like scalapb/scalapb.proto, or common protocol buffers like google/protobuf/wrappers.proto:
Add the following to your build.sbt:
libraryDependencies += "com.thesamet.scalapb" %% "scalapb-runtime" % scalapb.compiler.Version.scalapbVersion % "protobuf"
This tells sbt-protoc to extract protos from this jar (and all its dependencies, which includes Google's common protos), and make them available in the include path that is passed to protoc.
It is important to add that, by setting preserve_unknown_fields to false, you are turning off a protobuf feature that can prevent data loss when different parts of a distributed system are not running the same version of the schema.
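For context, that dependency line usually sits next to the standard sbt-protoc/ScalaPB wiring in build.sbt. A minimal sketch (it assumes the ScalaPB compilerplugin is already on the plugin classpath as in the standard setup; the output path is the default from the ScalaPB docs):
// build.sbt -- scoping the runtime jar to "protobuf" makes sbt-protoc extract its .proto files
// (including scalapb/scalapb.proto) into the include path passed to protoc
Compile / PB.targets := Seq(
  scalapb.gen() -> (Compile / sourceManaged).value / "scalapb"
)
libraryDependencies += "com.thesamet.scalapb" %% "scalapb-runtime" % scalapb.compiler.Version.scalapbVersion % "protobuf"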

How to output slf4j log messages from sbt?

I'm currently writing an sbt plugin. The tasks that I define use functionality from libraries that are otherwise unrelated to sbt and that I don't want to change. These libraries use slf4j for logging. I would like the logging output to show up in the sbt console as if I had used streams.value.log, so that I can, e.g., turn debug logging on and off in the usual sbt ways like set someTask / logLevel := Level.Debug. Is there a way to do this?
It seems like sbt-slf4j is supposed to solve my problem:
https://index.scala-lang.org/eltimn/sbt-slf4j/sbt-slf4j/1.0.4?target=_2.12
But I wasn't able to get it to work. My project/build.sbt looks like this:
libraryDependencies += "com.eltimn" %% "sbt-slf4j" % "1.0.4"
resolvers += Resolver.bintrayRepo("eltimn", "maven")
and build.sbt like so:
import org.slf4j.LoggerFactory
import org.slf4j.impl.SbtStaticLoggerBinder
val foo = taskKey[Unit]("foobar")
foo := {
  SbtStaticLoggerBinder.sbtLogger = streams.value.log
  // symbolic implementation, the actual implementation lives in a library
  val logger = LoggerFactory.getLogger("foobar")
  logger.warn("Hello!")
}
But running foo does not result in a warning being printed. A warning is printed if I change it to LoggerFactory.getLogger(streams.value.log.name), but this is not an option because again, the code lives in a library.
Is there any good way to solve this?

Use external libraries inside the build.sbt file

Is it somehow possible to use an external library inside the build.sbt file?
E.g. I want to write something like this:
import scala.io.Source
import io.circe._ // not possible
version := myTask
lazy val myTask: String = {
  val filename = "version.txt"
  Source.fromFile(filename).getLines.mkString(", ")
  // do some json parsing using the circe library
  // ...
}
One of the things I actually like about sbt is that the build project is (in most ways) just another project (which is also potentially configured by a meta-build project configured by a meta-meta-build project, etc.). This means you can just drop the following line into a project/build.sbt file:
libraryDependencies += "io.circe" %% "circe-jawn" % "0.11.1"
You could also add this to plugins.sbt if you wanted, or any other .sbt file in the project directory, since the filenames (excluding the extension) have no meaning beyond human convention, but I'd suggest following convention and going with build.sbt.
Note though that sbt implicitly imports sbt.io in .sbt files, so the circe import in your build.sbt (at the root level—i.e. the build config, not the build build config) will need to look like this:
import _root_.io.circe.jawn.decode
scalaVersion := decode[String]("\"2.12.8\"").right.get
(For anyone who hasn't seen it before, the _root_ here just means "start the package hierarchy here instead of assuming io is the imported one".)
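Putting it together with the version.txt idea from the question, the setting might look like the following sketch (the JSON shape {"version": "1.2.3"} is hypothetical; adjust the cursor path to whatever the file actually contains):
// build.sbt -- assumes project/build.sbt adds circe-jawn as shown above
import _root_.io.circe.jawn.parse

version := {
  val raw = IO.read(baseDirectory.value / "version.txt")
  parse(raw)
    .flatMap(_.hcursor.get[String]("version"))
    .fold(err => sys.error(s"could not read version.txt: ${err.getMessage}"), identity)
}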

How can I access the Scala Play plugin from Build.scala

I am not a very sophisticated SBT user, although I have been using it casually for several years. I have made multi-project builds before, but (as always) this particular 'split one project into subprojects' effort has hit a snag.
The problem
I have a build.sbt file in the root directory of a project that had the following line in it:
lazy val commonPlay = Project(id = "commonPlay", base = file("modules/commonPlay")).
  dependsOn(core).enablePlugins(PlayScala)
The important line in it is 'enablePlugins(PlayScala)'. As well as this build.sbt file, I have a plugins.sbt file in the project directory that declares a number of plugins, including "com.typesafe.play" % "sbt-plugin" % "2.2.1".
I am now migrating the project to use a Build.scala file, and in the Build.scala (which is in the project subdirectory) I have the following code
def playModule(dir: String) =
  Project(id = dir, base = file(dir), settings = defaultSettings).
    enablePlugins(PlayScala)
lazy val core = module("core")
This gives me a 'not found' error at PlayScala.
To Date
I've tried messing around with plugins.sbt: adding it under project/project, but decided that I didn't know what I was doing, and I have something of a phobia about not understanding this sort of detail. I had a good read of http://www.scala-sbt.org/0.13.1/docs/Getting-Started/Using-Plugins.html which was very helpful to my understanding, but didn't actually answer the problem.
My expectation is that all I have to do is specify an import, but I'm not sure how to work out what that import would be.
Any help would be appreciated.
As you can see in this test of the Play framework itself, you need to use play.PlayScala.
Project(id = dir, base = file(dir)).enablePlugins(play.PlayScala)
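In other words, either fully qualify it as above or import it once at the top of Build.scala. A minimal sketch based on the code in the question (the object name is illustrative and the defaultSettings argument is omitted here; depending on your Play version the class lives at play.PlayScala for Play 2.3.x or play.sbt.PlayScala for Play 2.4+):
// project/Build.scala
import sbt._
import Keys._
import play.PlayScala

object ApplicationBuild extends Build {
  def playModule(dir: String) =
    Project(id = dir, base = file(dir)).enablePlugins(PlayScala)

  lazy val core = playModule("core")
}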

How to create a custom package task to jar a subset of classes in SBT

I am trying to define a separate package task without modifying the original task in the compile configuration. This new task will package only the subset of classes that form an API, which we need to share with other teams so they can write plugins for our application. So the end result will be two jars: one with the full application and a second one with the subset of classes.
I approached this problem by creating a different configuration, which I called pluginApi, and redefining the packageBin task within this new configuration so that the original definition of packageBin is left unchanged. This idea was taken from here:
How to create custom "package" task to jar up only specific package in SBT?
In my build.sbt I have:
lazy val PluginApi = config("pluginApi") extend(Compile) describedAs("Custom plugin api configuration")
lazy val root = project in file(".") overrideConfigs (PluginApi)
This effectively creates my new configuration and I can call
sbt pluginApi:packageBin
Which generates the complete jar in the same way as compile:packageBin would do. I then try to modify the mappings in the new packageBin task with:
mappings in (PluginApi, packageBin) ~= { (ms: Seq[(File, String)]) =>
  ms filter { case (file, toPath) =>
    toPath.startsWith("some/path/defining/api")
  }
}
but this has no effect. I think the reason is that the call to pluginApi:packageBin is delegated to compile:packageBin rather than being a cloned task.
I can redefine a new packageBin within the new scope like:
packageBin in PluginApi := {
}
However, I would have to rewrite all of the packageBin functionality instead of reusing the existing code. Also, if rewriting is unavoidable, I am not sure what that implementation would look like.
Could somebody provide an example about how to achieve this?
You can do it as follows:
lazy val PluginApi = config("pluginApi").extend(Compile)

inConfig(PluginApi)(Defaults.compileSettings) // you have to have the standard compile settings in the new configuration

mappings in (PluginApi, packageBin) := {
  val original = (mappings in (PluginApi, packageBin)).value
  original.filter { case (file, toPath) => toPath.startsWith("some/path/defining/api") }
}

unmanagedSourceDirectories in PluginApi := (unmanagedSourceDirectories in Compile).value
Note that if you keep your sources in src/main/scala, you'll have to override unmanagedSourceDirectories in the newly created configuration. Normally unmanagedSourceDirectories contains the configuration name, e.g. src/pluginApi/scala or src/pluginApi/java.
I have had similar problems (with more than one jar per project). Our project uses Ant; there you can do it, but you will repeat yourself a lot.
However, I have come to the conclusion that this scenario (two JARs for one project) can actually be simplified by splitting the project, i.e. making two modules out of it.
This way, I don't have to "fight" tools that assume project == artifact (like sbt, maybe Maven, IDEA's default settings, ...).
As a bonus, the compiler helps me verify that my dependencies are correct, i.e. that I did not accidentally make my API package depend on the implementation package. When everything is compiled together and the classes are only split apart at the JAR step, you run the risk of an invalid dependency in your setup which you would only catch when testing, because at compile time everything is compiled together.
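For reference, the split suggested here is just an ordinary sbt multi-project build. A minimal sketch (module and artifact names are illustrative):
// build.sbt -- two modules instead of two jars from one project
lazy val api = (project in file("api"))
  .settings(name := "myapp-api")

lazy val app = (project in file("app"))
  .dependsOn(api) // the implementation may depend on the API, never the reverse
  .settings(name := "myapp")

lazy val root = (project in file("."))
  .aggregate(api, app)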