I'm using the following code to hook into SBT's logging system to send the logging messages to another process accessible via a server setting:
extraLoggers := {
val clientLogger = FullLogger {
new Logger {
def log(level: Level.Value, message: => String): Unit =
if(level >= Level.Info) server.value.send(Json.arr("print", level.toString(), message))
def success(message: => String): Unit = server.value.send(Json.arr("print", "info", message))
def trace(t: => Throwable): Unit = server.value.send(Json.arr("print", "error", t.toString))
}
}
val currentFunction = extraLoggers.value
(key: ScopedKey[_]) => clientLogger +: currentFunction(key)
}
When I look at the output being spewed out by the other server process, I don't see the messages with green [success] tags appearing. Everything else (i.e. all the [info] messages and the red [error] messages) appears just fine.
Printing out clientLogger.successEnabled gives me true.
What am I doing wrong?
DISCLAIMER: Please use with care as the answer might be incomplete or even totally wrong.
After consulting the sbt sources, my understanding is that extraLoggers is a setting that is only "A function that provides additional loggers for a given setting.", and these loggers are additional to StandardMain.console.
If it were possible, you would therefore have to set logManager to reference extraLoggers and your own custom sbt.ConsoleOut in build.sbt, e.g.
logManager := LogManager.defaults(extraLoggers.value, new ConsoleOut {
val lockObject = System.out
def print(s: String): Unit = synchronized { System.out.print(s) }
def println(s: String): Unit = synchronized { System.out.println(s) }
def println(): Unit = synchronized {
System.out.println()
}
})
It won't work, however, since sbt.ConsoleOut is a sealed trait, and hence there is no way to extend it outside the file where it is defined.
Having said that, I believe that in sbt 0.13.1 it's not possible to "intercept" the [success] message that's printed out when showSuccess is true, as it comes from a ConsoleOut that's outside your control.
What you can do with extraLoggers is add your own custom logging for tasks, and streams.value.log.success("Succezz") should work.
Here is a sample build.sbt with extraLoggers and a t task to demo the custom logger.
extraLoggers := {
val clientLogger = FullLogger {
new Logger {
def log(level: Level.Value, message: => String): Unit =
if(level >= Level.Info) println(s"+++ $message at $level")
def success(message: => String): Unit = println(s"+++ success: $message")
def trace(t: => Throwable): Unit = println(s"+++ trace: throwable: $t")
}
}
val currentFunction = extraLoggers.value
(key: ScopedKey[_]) => clientLogger +: currentFunction(key)
}
val t = taskKey[Unit]("Show extraLoggers")
t := {
println(s"Using extraLoggers")
val s: TaskStreams = streams.value
val log = s.log
log.debug("Saying hi...")
log.info("Hello!")
log.error("Error")
log.success("Succezz")
}
With this file, executing the t task gives the following output:
$ sbt
[info] Loading global plugins from /Users/jacek/.sbt/0.13/plugins
[info] Loading project definition from /Users/jacek/sandbox/sbt-0.13.1-extra-loggers/project
[info] Set current project to sbt-0-13-1-extra-loggers (in build file:/Users/jacek/sandbox/sbt-0.13.1-extra-loggers/)
[sbt-0-13-1-extra-loggers]> about
[info] This is sbt 0.13.1
[info] The current project is {file:/Users/jacek/sandbox/sbt-0.13.1-extra-loggers/}sbt-0-13-1-extra-loggers 0.1-SNAPSHOT
[info] The current project is built against Scala 2.10.3
[info] Available Plugins: com.typesafe.sbt.SbtGit, com.typesafe.sbt.SbtProguard, growl.GrowlingTests, org.sbtidea.SbtIdeaPlugin, np.Plugin, com.timushev.sbt.updates.UpdatesPlugin
[info] sbt, sbt plugins, and build definitions are using Scala 2.10.3
[sbt-0-13-1-extra-loggers]> t
Using extraLoggers
[info] Hello!
+++ Hello! at info
[error] Error
+++ Error at error
[success] Succezz
+++ success: Succezz
[success] Total time: 0 s, completed Dec 16, 2013 10:30:48 PM
Related
In an SBT multi-project build, when you run a task on an aggregator project, it runs the task in every aggregated subproject, and all the logs from each subproject get output together in one big stream.
This makes it hard to debug build issues in a multi-project build as all the logs get mixed together. Is there a way to output the projectID on each log line so that you can quickly identify which subproject the log came from?
Here is an example project:
name := "my-multiproject-build"
lazy val ProjectOne = project
lazy val ProjectTwo = project
lazy val root = project.in( file(".") ).aggregate(ProjectOne, ProjectTwo)
(what happens by default)
sbt package
[info] Packaging ProjectOne.jar ...
[info] Done packaging.
[info] Packaging ProjectTwo.jar ...
[info] Done packaging.
(what I want)
sbt package
[info] [ProjectOne] Packaging ProjectOne.jar ...
[info] [ProjectOne] Done packaging.
[info] [ProjectTwo] Packaging ProjectTwo.jar ...
[info] [ProjectTwo] Done packaging.
I tried looking into SBT custom loggers, but unfortunately the documentation is a bit sparse and I'm by no means an SBT expert.
Like Rich said, there is currently no extension point to customize sbt's logging format. But if you don't mind relying on internal APIs you can get close to what you want, depending on which version of sbt you are using.
Basically you would have to replace the default logManager rather than adding extraLoggers (the API is similar though).
sbt 0.13.x
Our job here looks simpler. We can reuse BufferedLogger to avoid the boilerplate involved in delegating everything to a ConsoleLogger:
logManager := LogManager.withScreenLogger { (_, state) =>
val console = ConsoleLogger(state.globalLogging.console)
new BufferedLogger(console) {
val project = projectID.value.name
override def log(level: Level.Value, message: => String): Unit =
console.log(level, s"[$project] $message")
}
}
sbt 1.0.x
The logging API was changed here to provide event logging. We now have to provide a log4j Appender, which is more flexible but makes our job more difficult. We can't reuse the classes from sbt.internal, where the logging implementation has moved, because they are all private, sealed, final, etc. The only thing I could think of, short of duplicating the functionality of ConsoleAppender, was to hack the output stream:
logManager := LogManager.defaultManager(
ConsoleOut.printStreamOut(new PrintStream(System.out) {
val project = projectID.value.name
override def println(str: String): Unit = {
val (lvl, msg) = str.span(_ != ']')
super.println(s"$lvl] [$project$msg")
}
}))
Note that there is no guarantee println will be called instead of some other print method.
I don't know if it's possible to use a log4j configuration file to customize the format.
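If you are worried about hitting a different print method, one variation (this leans on an assumption about typical JDK behavior, namely that PrintStream.println(String) funnels through print(String) plus a newline, which is not a guaranteed contract) is to intercept print(String) instead:
logManager := LogManager.defaultManager(
  ConsoleOut.printStreamOut(new java.io.PrintStream(System.out) {
    val project = projectID.value.name
    // Prefix each chunk that looks like "[level] message" with the project name.
    // println(String) on typical JDKs delegates to print(String), so this
    // catches both, at the cost of the same fragility as above.
    override def print(str: String): Unit = {
      val (lvl, msg) = str.span(_ != ']')
      super.print(s"$lvl] [$project$msg")
    }
  }))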
Looking through the SBT code, I don't think this is possible nicely.
Here's a build.sbt which does most of what you want.
import sbt.Level
name := "my-multiproject-build"
lazy val ProjectOne = project
lazy val ProjectTwo = project
lazy val root = project.in( file(".") ).aggregate(ProjectOne, ProjectTwo)
val wrapLogger = (project: Project, inner: AbstractLogger) => {
new AbstractLogger {
override def log(level: Level.Value, message: => String): Unit = {
inner.log(
level,
"[" + project.id + "] " + message
)
}
override def setTrace(flag: Int): Unit = inner.setTrace(flag)
override def setLevel(newLevel: Level.Value): Unit = {
// MultiLogger keeps setting this to debug
inner.setLevel(Level.Info)
}
override def setSuccessEnabled(flag: Boolean): Unit = inner.setSuccessEnabled(flag)
override def logAll(events: Seq[LogEvent]): Unit = {
events.foreach(log)
}
override def control(event: _root_.sbt.ControlEvent.Value, message: => String): Unit = inner.control(event, message)
override def successEnabled: Boolean = inner.successEnabled
override def getLevel = inner.getLevel
override def getTrace: Int = inner.getTrace
override def trace(t: => Throwable): Unit = inner.trace(t)
override def success(message: => String): Unit = inner.success(message)
}
}
extraLoggers in ProjectOne := {
val currentFunction = extraLoggers.value
(key: ScopedKey[_]) => {
val logger = wrapLogger(ProjectOne, ConsoleLogger())
logger.setLevel(Level.Info)
logger +: currentFunction(key)
}
}
extraLoggers in ProjectTwo := {
val currentFunction = extraLoggers.value
(key: ScopedKey[_]) => {
val logger = wrapLogger(ProjectTwo, ConsoleLogger())
logger.setLevel(Level.Info)
logger +: currentFunction(key)
}
}
The output is now duplicated for project-specific logs: once with the project name prepended and once without it.
The output looks like:
[info] Done packaging.
[info] [ProjectTwo] Done packaging.
[info] Done updating.
[info] [ProjectOne] Done updating.
The ConsoleLogger is constructed in MainLogging.defaultScreen, and there are no extension points that I can see which let you manipulate the log messages.
If SBT had used a logging library like logback or log4j2, rather than reinventing the wheel with its own logging framework, this would have been possible. :-(
some of my scala annotation macro do not seem to get expanded, is there a way to inspect/log which expression gets passed to my annotation macro at compile time, because right now the code doesn't even compile...
def virtualize(tree: Tree): Tree = atPos(tree.pos) {
tree match {
case x =>
println("LOG: "+tree) //will only be printed during runtime
c.warning(tree.pos, "LOG: "+tree) //will only generate code for a warning
super.transform(tree)
}
}
Is there a way to issue compiler warnings in annotation macros?
Thanks a lot!
If you use IDEA, open the terminal and enter sbt ~compile to trigger continuous compilation.
Then you can see the compilation info log in the terminal, like the following:
D:\git\scala-macro-example>sbt ~compile
[info] Loading project definition from D:\git\scala-macro-example\project
[info] Set current project to scala-macro-example (in build file:/D:/git/scala-macro-example/)
[info] Updating {file:/D:/git/scala-macro-example/}root...
[info] Resolving jline#jline;2.12.1 ...
[info] Done updating.
[info] Compiling 2 Scala sources to D:\git\scala-macro-example\module\macros\target\scala-2.11\classes...
[error] D:\git\scala-macro-example\module\macros\src\main\scala\macross\teach\WithHello.scala:16: type ClassWithFunc is not a member of package macross.annotation
[error] class WithHelloImpl(val c: Context) extends macross.annotation.ClassWithFunc{
[error]
Then, define your macro annotation to report info at compile time using c.info:
object ShowInfo {
class Show extends StaticAnnotation {
def macroTransform(annottees: Any*): Any = macro ShowImpl.apply
}
class ShowImpl(val c: Context) {
import c.universe._
def showInfo(s: String) =
c.info(c.enclosingPosition, s.split("\n").mkString("\n |---macro info---\n |", "\n |", ""), true)
def apply(annottees: c.Expr[Any]*): c.Expr[Any] = {
val a: Seq[c.universe.Tree] = annottees.map(_.tree)
showInfo(show(a.head))
c.Expr[Any](Block(a.toList, Literal(Constant(()))))
}
}
}
Use it like the following:
object ShowInfoUsing {
trait SuperTrait
class SuperClass
@ShowInfo.Show
class ShowInfoUsing(val i: Int = 1) extends SuperClass with SuperTrait {
def f = 1
val a = 1
}
}
You can see the info log in the terminal:
[info] |---macro info---
[info] |class ShowInfoUsing extends SuperClass with SuperTrait {
[info] | <paramaccessor> val i: Int = _;
[info] | def <init>(i: Int = 1) = {
[info] | super.<init>();
[info] | ()
[info] | };
[info] | def f = 1;
[info] | val a = 1
[info] |}
[info] @ShowInfo.Show
[info]
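To answer the warning part of the question directly: the same macro Context also exposes c.warning and c.abort next to c.info. Here is a minimal sketch (the ReportingImpl bundle is just an illustrative name; method names are as in the Scala 2.11 macro API, so adapt to your version):
import scala.reflect.macros.whitebox.Context

class ReportingImpl(val c: Context) {
  import c.universe._

  def apply(annottees: c.Expr[Any]*): c.Expr[Any] = {
    val tree = annottees.head.tree
    // Printed by the compiler at compile time, so it shows up in the sbt log:
    c.info(tree.pos, "expanding: " + show(tree), force = true)
    // Emits a compile-time warning at the annottee's position:
    c.warning(tree.pos, "this annottee was inspected by the macro")
    // c.abort(tree.pos, "stop here") would fail compilation with an error instead.
    c.Expr[Any](Block(annottees.map(_.tree).toList, Literal(Constant(()))))
  }
}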
I have code like
test("mockito test") {
class ToTest {
def run(maybe: Option[Int], q: Option[Int] = None): Int = 42
}
val mockTest = mock[ToTest]
when(mockTest.run(None, None)).thenReturn(98)
mockTest.run(None)
verify(mockTest, times(1)).run(None, None)
}
Which fails with
[info] - mockito test *** FAILED ***
[info] org.mockito.exceptions.verification.junit.ArgumentsAreDifferent: Argument(s) are different! Wanted:
[info] toTest$1.run(None, None);
[info] -> at xxx$$anonfun$3.apply$mcV$sp(xxx.scala:55)
[info] Actual invocation has different arguments:
[info] toTest$1.run(None, null);
Or another scenario
test("mockito test") {
class ToTest {
def run(maybe: Option[Int], q: Int = 5): Int = 42
}
val mockTest = mock[ToTest]
when(mockTest.run(None, 5)).thenReturn(101)
mockTest.run(None)
verify(mockTest, times(1)).run(None, 5)
}
which fails with
[info] - mockito test *** FAILED ***
[info] org.mockito.exceptions.verification.junit.ArgumentsAreDifferent: Argument(s) are different! Wanted:
[info] toTest$1.run(None, 5);
[info] -> at xxx$$anonfun$3.apply$mcV$sp(xxx.scala:55)
[info] Actual invocation has different arguments:
[info] toTest$1.run(None, 0);
I guess it's because there are no default parameters in Java. Is there any workaround for it?
Thank you.
I guess that's because CGLIB (or Byte Buddy in the case of the 2.0 beta) generates the code in that case, not the Scala compiler, hence the default parameters will always be null (or 0 for primitives, as in your second example).
A workaround could be (at least in some cases) to use spy instead:
val mockTest = spy(classOf[ToTest])
Sorry, no sugar syntax for it in ScalaTest.
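For illustration, here is a minimal sketch of the spy approach with Mockito's plain Java API. The reason it helps: mockTest.run(None) is compiled to mockTest.run(None, mockTest.run$default$2()), and on a spy the generated default getter executes the real code (returning None or 5) instead of Mockito's null/0 default:
import org.mockito.Mockito.{spy, when, verify, times}

class ToTest {
  def run(maybe: Option[Int], q: Option[Int] = None): Int = 42
}

val spyTest = spy(new ToTest)
// Note: when(...) on a spy calls the real run() once while stubbing, which is
// harmless here; doReturn(...).when(...) avoids that if the real call is costly.
when(spyTest.run(None, None)).thenReturn(98)

spyTest.run(None)   // desugars to run(None, run$default$2()) == run(None, None)
verify(spyTest, times(1)).run(None, None)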
The following test should pass, but it doesn't
class EngineTest extends FunSuite {
test("engine should not be null") {
val manager: ScriptEngineManager = new ScriptEngineManager
val engine: ScriptEngine = manager.getEngineByName("nashorn")
assert(engine != null)
}
}
manager.getEngineFactories() seems to be empty. Why? How do I initialize the context?
You need to explicitly pass in a ClassLoader to the ScriptEngineManager constructor. If you don't then it uses Thread.currentThread().getContextClassLoader() which is set to something weird when running under SBT. We just pass in null in our code to get it to work. You can also pass in getClass.getClassLoader:
class EngineTest extends FunSuite {
test("engine should not be null - null classloader") {
val manager: ScriptEngineManager = new ScriptEngineManager(null)
val engine: ScriptEngine = manager.getEngineByName("nashorn")
assert(engine != null)
}
test("engine should not be null - getClass.getClassLoader classloader") {
val manager: ScriptEngineManager = new ScriptEngineManager(getClass.getClassLoader)
val engine: ScriptEngine = manager.getEngineByName("nashorn")
assert(engine != null)
}
}
Both of those tests pass for me:
[info] EngineTest:
[info] - engine should not be null - null classloader
[info] - engine should not be null - getClass.getClassLoader classloader
[info] Run completed in 186 milliseconds.
[info] Total number of tests run: 2
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 2, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
What versions are you using? This is sbt 0.13.
> console
[info] Starting scala interpreter...
[info]
Welcome to Scala version 2.11.0 (OpenJDK 64-Bit Server VM, Java 1.7.0_25).
Type in expressions to have them evaluated.
Type :help for more information.
scala> import javax.script._
import javax.script._
scala> new ScriptEngineManager().getEngineByName("scala")
res0: javax.script.ScriptEngine = scala.tools.nsc.interpreter.IMain#7078c799
scala> new ScriptEngineManager().getEngineByName("rhino")
res1: javax.script.ScriptEngine = com.sun.script.javascript.RhinoScriptEngine#5c854934
scala> new ScriptEngineManager().getEngineFactories
res2: java.util.List[javax.script.ScriptEngineFactory] = [com.sun.script.javascript.RhinoScriptEngineFactory#454ee4c0, scala.tools.nsc.interpreter.IMain$Factory#354e3bce]
Wait, you asked about test context --
Well, before I lost interest in decoding more sbt, adding to libraryDependencies:
"org.scala-lang" % "scala-compiler" % scalaVersion.value % "test",
enables locating the Scala script engine:
@Test def engines: Unit = {
import javax.script._
val all = new ScriptEngineManager().getEngineFactories
Console println s"Found ${all.size}: $all"
assert(all.size > 0)
}
No doubt there's a simple way to add runtime:full-classpath to test:full-classpath directly. Because it's the simple build tool, right?
For Nashorn on Java 8, note the location:
> set fullClasspath in Test += Attributed.blank(file(s"${util.Properties.javaHome}/lib/ext/nashorn.jar"))
[info] Defining test:fullClasspath
[info] The new value will be used by test:console, test:executeTests and 5 others.
[info] Run `last` for details.
[info] Reapplying settings...
[info] Set current project to goofy (in build file:/home/apm/goofy/)
> test
Found 1: [jdk.nashorn.api.scripting.NashornScriptEngineFactory#7fa2239d]
[info] Passed: Total 10, Failed 0, Errors 0, Passed 10
Update: https://github.com/sbt/sbt/issues/1214
Also I guess it's still considered black art:
// Somehow required to get a js engine in tests (https://github.com/sbt/sbt/issues/1214)
fork in Test := true
The only thing I had to change was to use:
fork in Test := true
as mentioned above - from here: https://github.com/sbt/sbt/issues/1214
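Putting the suggestions in this thread together, a minimal build.sbt sketch (sbt 0.13 syntax) looks like this:
// Make the Scala script engine resolvable from tests.
libraryDependencies += "org.scala-lang" % "scala-compiler" % scalaVersion.value % "test"

// Work around https://github.com/sbt/sbt/issues/1214: forked tests get a normal
// classpath and context classloader, so ScriptEngineManager can find engines.
fork in Test := true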
I've built a little object which can interpret Scala code on the fly and extract a value from it.
object Interpreter {
import scala.tools.nsc._
import scala.tools.nsc.interpreter._
class Dummy
val settings = new Settings
settings.usejavacp.value = false
settings.embeddedDefaults[Dummy] // to make imain useable with sbt.
val imain = new IMain(settings)
def run(code: String, returnId: String) = {
this.imain.beQuietDuring{
this.imain.interpret(code)
}
val ret = this.imain.valueOfTerm(returnId)
this.imain.reset()
ret
}
}
object Main {
def main(args: Array[String]) {
println(Interpreter.run("val x = 1", "x"))
}
}
In a pure sbt environment, or when called from the Scala interpreter, this code works fine! But if I run it in a simple Play (version 2.2.2) application, it throws a NullPointerException at val ret = this.imain.valueOfTerm(returnId).
Play also uses a modified sbt, therefore it should probably work. What does Play do differently that breaks this code? Any ideas how to get this code to work in Play?
Note
This is the build.sbt being used:
name := "Test"
version := "1.0"
scalaVersion := "2.10.3"
libraryDependencies += "org.scala-lang" % "scala-compiler" % scalaVersion.value
Alternatively, I tried this implementation, but it doesn't solve the problem either:
object Interpreter2 {
import scala.tools.nsc._
import scala.tools.nsc.interpreter._
import play.api._
import play.api.Play.current
val settings: Settings = {
lazy val urls = java.lang.Thread.currentThread.getContextClassLoader match {
case cl: java.net.URLClassLoader => cl.getURLs.toList
case _ => sys.error("classloader is not a URLClassLoader")
}
lazy val classpath = urls map {_.toString}
val tmp = new Settings()
tmp.bootclasspath.value = classpath.distinct mkString java.io.File.pathSeparator
tmp
}
val imain = new IMain(settings)
def run(code: String, returnId: String) = {
this.imain.beQuietDuring {
this.imain.interpret(code)
}
val ret = this.imain.valueOfTerm(returnId)
this.imain.reset()
ret
}
}
Useful links I found while putting together this second implementation:
scala.tools.nsc.IMain within Play 2.1
How to set up classpath for the Scala interpreter in a managed environment?
https://groups.google.com/forum/#!topic/scala-user/wV86VwnKaVk
https://github.com/gourlaysama/play-repl-example/blob/master/app/REPL.scala#L18
https://gist.github.com/mslinn/7205854
After spending a few hours on this issue myself, here is a solution that I came up with. It works both inside SBT and outside. It is also expected to work in a variety of managed environments (like OSGi):
import java.net.URL
import scala.tools.nsc.Settings
import scala.tools.nsc.interpreter.IMain

// LOGGER below is whatever logging facility you have in scope.
private def getClasspathUrls(classLoader: ClassLoader, acc: List[URL]): List[URL] = {
classLoader match {
case null => acc
case cl: java.net.URLClassLoader => getClasspathUrls(classLoader.getParent, acc ++ cl.getURLs.toList)
case c => LOGGER.error("classloader is not a URLClassLoader and will be skipped. ClassLoader type that was skipped is " + c.getClass)
getClasspathUrls(classLoader.getParent, acc)
}
}
val classpathUrls = getClasspathUrls(this.getClass.getClassLoader, List())
val classpathElements = classpathUrls map {url => url.toURI.getPath}
val classpath = classpathElements mkString java.io.File.pathSeparator
val settings = new Settings
settings.bootclasspath.value = classpath
val imain = new IMain(settings)
// use imain to interpret code. It should be able to access all your application classes as well as dependent libraries.
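For example, a hypothetical usage of the imain built above (valueOfTerm returns an Option in these Scala versions):
// Interpret a snippet and read a value back out, as in the question's run().
imain.beQuietDuring {
  imain.interpret("val answer = 21 * 2")
}
println(imain.valueOfTerm("answer")) // expected: Some(42)
imain.reset()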
It's because Play uses the "fork in run" feature of sbt. This feature starts a new JVM, and that is why this failure appears:
[info] Failed to initialize compiler: object scala.runtime in compiler mirror not found.
[info] ** Note that as of 2.8 scala does not assume use of the java classpath.
[info] ** For the old behavior pass -usejavacp to scala, or if using a Settings
[info] ** object programatically, settings.usejavacp.value = true.
See: http://www.scala-sbt.org/release/docs/Detailed-Topics/Forking
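A hedged workaround (an assumption based on the compiler message above, not something confirmed by the sbt docs): since the forked JVM is started with the application classpath on its java command line, you can point the interpreter at that classpath instead of relying on embeddedDefaults:
import scala.tools.nsc.Settings
import scala.tools.nsc.interpreter.IMain

val settings = new Settings
// In a forked JVM the real classpath is on java.class.path, so either of these
// should let the embedded compiler find scala-library and the rest.
settings.usejavacp.value = true
settings.classpath.value = sys.props("java.class.path")

val imain = new IMain(settings)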