I'm currently writing an sbt plugin. The tasks that I define use functionality from libraries that are otherwise unrelated to sbt and that I don't want to change. These libraries use slf4j for logging. I would like the logging output to show up in the sbt console as if I had used streams.value.log, so that I can e.g. turn debug logging on and off in the usual sbt ways, like set someTask / logLevel := Level.Debug. Is there a way to do this?
It seems like sbt-slf4j is supposed to solve my problem:
https://index.scala-lang.org/eltimn/sbt-slf4j/sbt-slf4j/1.0.4?target=_2.12
But I wasn't able to get it to work. My project/build.sbt looks like this:
libraryDependencies += "com.eltimn" %% "sbt-slf4j" % "1.0.4"
resolvers += Resolver.bintrayRepo("eltimn", "maven")
and build.sbt like so:
import org.slf4j.LoggerFactory
import org.slf4j.impl.SbtStaticLoggerBinder

val foo = taskKey[Unit]("foobar")

foo := {
  SbtStaticLoggerBinder.sbtLogger = streams.value.log
  // symbolic implementation, the actual implementation lives in a library
  val logger = LoggerFactory.getLogger("foobar")
  logger.warn("Hello!")
}
But running foo does not result in a warning being printed. A warning is printed if I change it to LoggerFactory.getLogger(streams.value.log.name), but this is not an option because again, the code lives in a library.
Is there any good way to solve this?
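For what it's worth, a quick way to see which slf4j binding sbt actually resolved (a diagnostic sketch; the slf4jCheck key is made up) is a task like this; if it reports the NOP factory or anything other than the sbt-slf4j binder, the binding is not being picked up:
val slf4jCheck = taskKey[Unit]("Print the slf4j logger factory that was bound")

slf4jCheck := {
  // getILoggerFactory returns whichever slf4j binding won on the classpath.
  val factory = org.slf4j.LoggerFactory.getILoggerFactory
  streams.value.log.info(s"slf4j logger factory in use: ${factory.getClass.getName}")
}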
For a project based on Play with sbt, I'd like to have multiple flavors for test runs, using different configuration files. The motivation is being able to run tests either against a local or a remote database.
There's already a custom config file specified for general test runs (in build.sbt):
javaOptions in Test += "-Dconfig.file=conf/application.test.conf"
Now I would like to have another command where the same tests run against some configuration file conf/application.test-ci.conf.
Approaches tried so far
Adding a command alias
addCommandAlias("test-ci", ";test -Dconfig.file=conf/application.test-ci.conf")
This fails with an error message about a missing semicolon (;), indicating that sbt interprets the resulting command line as multiple commands; presumably the test task does not accept arguments, so sbt tries to parse -Dconfig.file=... as a separate command.
Extend Test
lazy val CITest = config("ci") extend Test

lazy val root = (project in file(".")).enablePlugins(PlayScala)
  .configs(CITest)
  .settings(inConfig(CITest)(Defaults.testTasks): _*)
  .settings(
    javaOptions in CITest += "-Dconfig.file=conf/application.test-ci.conf"
  )

javaOptions in CITest -= "-Dconfig.file=conf/application.test.conf"
I don't fully understand what this is doing, but it always seems to pick up the other test configuration file.
How can I specify multiple test setups picking up different configuration files?
Try first applying the setting via the set command and then following up with the test task, like so:
addCommandAlias(
  "test-ci",
  """;set Test/javaOptions ++= Seq("-Dconfig.file=conf/application.test-ci.conf"); test"""
)
Notice how ; separates the set command from the test task. Keep in mind that set modifies the current session, so the changed javaOptions stay in effect for subsequent commands until you reload.
Another approach is to modify the setting based on the environment. Usually there is some environment variable set on CI like CI or BUILD, so you can modify javaOptions conditionally (without any custom configurations):
Test/javaOptions ++= {
  if (sys.env.get("CI").isEmpty) Seq.empty
  else Seq("-Dconfig.file=conf/application.test-ci.conf")
}
Note: Test/javaOptions is the new syntax for javaOptions in Test (since sbt 1)
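To double-check which options are actually in effect, the setting can be inspected from the sbt shell:
show Test/javaOptions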
Is it somehow possible to use an external library inside the build.sbt file?
E.g. I want to write something like this:
import scala.io.Source
import io.circe._ // not possible

version := myTask

lazy val myTask: String = {
  val filename = "version.txt"
  Source.fromFile(filename).getLines.mkString(", ")
  // do some json parsing using the circe library
  // ...
}
One of the things I actually like about sbt is that the build project is (in most ways) just another project (which is also potentially configured by a meta-build project configured by a meta-meta-build project, etc.). This means you can just drop the following line into a project/build.sbt file:
libraryDependencies += "io.circe" %% "circe-jawn" % "0.11.1"
You could also add this to plugins.sbt if you wanted, or any other .sbt file in the project directory, since the filenames (excluding the extension) have no meaning beyond human convention, but I'd suggest following convention and going with build.sbt.
Note though that sbt implicitly imports sbt.io in .sbt files, so the circe import in your build.sbt (at the root level—i.e. the build config, not the build build config) will need to look like this:
import _root_.io.circe.jawn.decode
scalaVersion := decode[String]("\"2.12.8\"").right.get
(For anyone who hasn't seen it before, the _root_ here just means "start the package hierarchy here instead of assuming io is the imported one".)
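To connect this back to the version-file example, here is a minimal sketch of reading a version out of a JSON file with circe in build.sbt (the version.json file name and its {"version": "..."} shape are assumptions):
import _root_.io.circe.jawn.parse

version := {
  // Read the file and pull out the "version" field; fail the build on bad JSON.
  val json = IO.read(file("version.json"))
  parse(json)
    .flatMap(_.hcursor.get[String]("version"))
    .fold(err => sys.error(err.getMessage), identity)
}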
For example, suppose I have a file project/CodeGeneration.scala that generates "managed" source-code, and suppose that object (CodeGeneration) needs to leverage a third-party library -- say jsoup...
import org.jsoup._

object CodeGeneration {
  // Generate code using jsoup...
  def generateCode: String = ???
}
Simply adding a line for jsoup to your libraryDependencies in build.sbt doesn't do the trick; it leads to a compilation error complaining about the missing jsoup object/namespace.
So, (how) can one access this dependency from "meta" code -- code that generates other code?
It seems the solution is to leverage sbt's "recursive" nature, and put an additional build.sbt file in the project directory. So, for example, project/build.sbt might look like this:
libraryDependencies += "org.jsoup" % "jsoup" % "1.11.2"
There's more detail in sbt's official documentation.
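Once the dependency compiles, the generator still has to be wired into the build. Here is a minimal sketch (assuming generateCode returns the generated file's contents as a String, and using sbt 1 slash syntax) of sbt's managed-sources hook in build.sbt:
Compile / sourceGenerators += Def.task {
  // Write the generated code into sourceManaged so sbt compiles and cleans it.
  val out = (Compile / sourceManaged).value / "Generated.scala"
  IO.write(out, CodeGeneration.generateCode)
  Seq(out)
}.taskValue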
I have a Play! 2 for Scala application, and I am using Specs2 for tests. I can run all tests with the test command, or a particular specification with test-only MyParticularSpec.
What I would like to do is mark some particular specifications, or even single methods inside a specification, in order to do things like:
running all tests that are not integration (that is, that do not access external resources)
running all tests that do not access external resources in write mode (but still running the reading tests)
running all tests but a given one
and so on.
I guess something like that should be doable, perhaps by adding some annotations, but I am not sure how to go about it.
Is there a mechanism to selectively run some tests and not others?
EDIT: I have answered my own question below for the test-only case. Still, the command-line option does not work for the test task. Following the sbt guide, I have tried to create an additional sbt configuration, like
object ApplicationBuild extends Build {
  // more settings

  lazy val UnitTest = config("unit") extend (Test)
  lazy val specs = "org.scala-tools.testing" %% "specs" % "1.6.9" % "unit"

  val main = PlayProject(appName, appVersion, appDependencies, mainLang = SCALA)
    .configs(UnitTest)
    .settings(inConfig(UnitTest)(Defaults.testTasks): _*)
    .settings(
      testOptions in UnitTest += Tests.Argument("exclude integration"),
      libraryDependencies += specs
    )
}
This works when I pass arguments without options, for instance when I put Tests.Argument("plan"). But I was not able to find out how to pass a more complex argument. I have tried
Tests.Argument("exclude integration")
Tests.Argument("exclude=integration")
Tests.Argument("-exclude integration")
Tests.Argument("-exclude=integration")
Tests.Argument("exclude", "integration")
Tests.Argument("exclude \"integration\"")
and probably more, but I still have no clue what the correct syntax is.
Does anyone know how to pass arguments with options to specs2 from sbt?
First, following the specs2 guide, one must add tags to the specifications, like this:
class MySpec extends Specification with Tags {
  "My spec" should {
    "exclude this test" in {
      true
    } tag ("foo")

    "include this one" in {
      true
    }
  }
}
The command-line arguments for including and excluding tags are documented in the specs2 user guide.
Then one can selectively include or exclude tests with
test-only MySpec -- exclude foo
test-only MySpec -- include foo
You can also use without any change to your build
test-only * -- exclude integration
Tested in Play 2.1-RC3
If you want to pass several arguments, you can add several strings to one Tests.Argument
testOptions in Test += Tests.Argument("include", "unit")
There are examples of this in the specs2 User Guide and in the Play documentation.
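Connecting this to the original goal, the default test task can exclude the integration-tagged specs while a custom configuration includes them (a sketch reusing the UnitTest configuration from the question):
testOptions in Test += Tests.Argument("exclude", "integration")

// Override rather than append, so the exclude inherited from Test does not apply here.
testOptions in UnitTest := Seq(Tests.Argument("include", "integration"))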
I'm using Play 2.2, and there are two ways to do this depending on whether or not you are in the Play console.
From the console type: test-only full.namespace.to.TestSpec
From the terminal type: play "test-only full.namespace.to.TestSpec" (the quotes make the whole command a single argument to the launcher)
I came across this question while trying to figure out how to do something similar for ScalaTest with Play. SBT has detailed documentation on how to configure additional test configurations but these could use a bit of tweaking for Play.
Apart from the subtly different Project configuration, I found that I wanted to crib a bunch of the test settings from PlaySettings. The following runs and generates an IntelliJ project with integration test sources in the "/it" directory. I may still be missing reporting and lifecycle hooks.
object BuildSettings {
  // required for ScalaTest. See http://stackoverflow.com/questions/10362388/using-scalatest-in-a-playframework-project
  def testSettings = Seq(
    testOptions in Test := Nil
  )

  // variously cribbed from https://github.com/playframework/Play20/blob/master/framework/src/sbt-plugin/src/main/scala/PlaySettings.scala
  def itSettings = Seq(
    sourceDirectory in IntegrationTest <<= baseDirectory / "it",
    scalaSource in IntegrationTest <<= baseDirectory / "it",
    libraryDependencies += "play" %% "play-test" % play.core.PlayVersion.current % "it"
  )
}
object ApplicationBuild extends Build {
  val main = play.Project(
      appName,
      appVersion,
      Dependencies.dependencies)
    .configs(IntegrationTest)
    .settings(Dependencies.resolutionRepos: _*)
    .settings(BuildSettings.testSettings: _*)
    .settings(Defaults.itSettings: _*)
    .settings(BuildSettings.itSettings: _*)
}
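With the IntegrationTest configuration wired in like this, the two suites can be run separately from the sbt console:
test
it:test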
The ways I know about so far are:
Create an ant build.xml file, define compile and run tasks, and include the appropriate jars in a classpath attribute
Make an sbt project and include dependencies with version numbers in build.sbt
Make a maven project and include dependencies in the xml file
Run from the command line, setting -classpath explicitly
None of these are bad, but it feels like extra work after being babied with
import json
json.loads('[1, 2]')
and having that work right off the bat, provided I have json installed. In particular tracking down appropriate versions on Mavenhub gets a little tiresome.
Though maybe I'm just being too picky ;-)
What you want is xsbtscript: https://github.com/paulp/xsbtscript
It allows you to create a single script file that includes both the sbt config your code requires and the Scala code itself.
I think scalas from SBT is better. Either install conscript and run this command:
cs harrah/xsbt --branch v0.10.1
Or create it by hand by putting the following in an executable file named scalas on your PATH:
#!/bin/sh
java -Dsbt.main.class=sbt.ScriptMain -Dsbt.boot.directory=/home/user/.sbt/boot -jar sbt-launch.jar "$@"
And then use it like this:
#!/usr/bin/env scalas
!#

/***
scalaVersion := "2.9.0-1"

libraryDependencies ++= Seq(
  "net.databinder" %% "dispatch-twitter" % "0.8.3",
  "net.databinder" %% "dispatch-http" % "0.8.3"
)
*/

import dispatch.{ json, Http, Request }
import dispatch.twitter.Search
import json.{ Js, JsObject }

def process(param: JsObject) = {
  val Search.text(txt) = param
  val Search.from_user(usr) = param
  val Search.created_at(time) = param
  "(" + time + ")" + usr + ": " + txt
}

Http.x((Search("#scala") lang "en") ~> (_ map process foreach println))
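Assuming the script is saved in an executable file (the tweets.scala name here is arbitrary), it runs like any other script:
chmod +x tweets.scala
./tweets.scala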
Paul's xsbtscript is basically a shell script that downloads and installs all the necessary components to do the same thing. It usually works well, but has some limitations (it won't go through authenticated proxies, for instance).