I've built a little object that interprets Scala code on the fly and extracts a value from it.
object Interpreter {
  import scala.tools.nsc._
  import scala.tools.nsc.interpreter._

  class Dummy

  val settings = new Settings
  settings.usejavacp.value = false
  settings.embeddedDefaults[Dummy] // to make imain useable with sbt.

  val imain = new IMain(settings)

  def run(code: String, returnId: String) = {
    this.imain.beQuietDuring {
      this.imain.interpret(code)
    }
    val ret = this.imain.valueOfTerm(returnId)
    this.imain.reset()
    ret
  }
}

object Main {
  def main(args: Array[String]) {
    println(Interpreter.run("val x = 1", "x"))
  }
}
In a pure sbt environment, or when run from the Scala interpreter, this code works fine. But if I run it in a simple Play (version 2.2.2) application, I get a NullPointerException at val ret = this.imain.valueOfTerm(returnId).
Play also builds on a modified sbt, so it should probably work. What does Play do differently that breaks this code? Any ideas how to get it to work in Play?
Note
This is the build.sbt I'm using:
name := "Test"
version := "1.0"
scalaVersion := "2.10.3"
libraryDependencies += "org.scala-lang" % "scala-compiler" % scalaVersion.value
Alternatively, I tried the following implementation, but it doesn't solve the problem either:
object Interpreter2 {
  import scala.tools.nsc._
  import scala.tools.nsc.interpreter._
  import play.api._
  import play.api.Play.current

  val settings: Settings = {
    lazy val urls = java.lang.Thread.currentThread.getContextClassLoader match {
      case cl: java.net.URLClassLoader => cl.getURLs.toList
      case _ => sys.error("classloader is not a URLClassLoader")
    }
    lazy val classpath = urls map { _.toString }
    val tmp = new Settings()
    tmp.bootclasspath.value = classpath.distinct mkString java.io.File.pathSeparator
    tmp
  }

  val imain = new IMain(settings)

  def run(code: String, returnId: String) = {
    this.imain.beQuietDuring {
      this.imain.interpret(code)
    }
    val ret = this.imain.valueOfTerm(returnId)
    this.imain.reset()
    ret
  }
}
Useful links I found while building this second implementation:
scala.tools.nsc.IMain within Play 2.1
How to set up classpath for the Scala interpreter in a managed environment?
https://groups.google.com/forum/#!topic/scala-user/wV86VwnKaVk
https://github.com/gourlaysama/play-repl-example/blob/master/app/REPL.scala#L18
https://gist.github.com/mslinn/7205854
After spending a few hours on this issue myself, here is the solution I came up with. It works both inside and outside SBT, and it is also expected to work in a variety of managed environments (like OSGi):
import java.net.URL
import scala.tools.nsc.Settings
import scala.tools.nsc.interpreter.IMain

// Walk up the classloader hierarchy and collect every URL on the way.
// LOGGER stands in for whatever logging facility you use.
private def getClasspathUrls(classLoader: ClassLoader, acc: List[URL]): List[URL] = {
  classLoader match {
    case null => acc
    case cl: java.net.URLClassLoader => getClasspathUrls(cl.getParent, acc ++ cl.getURLs.toList)
    case c =>
      LOGGER.error("classloader is not a URLClassLoader and will be skipped. ClassLoader type that was skipped is " + c.getClass)
      getClasspathUrls(c.getParent, acc)
  }
}

val classpathUrls = getClasspathUrls(this.getClass.getClassLoader, List())
val classpathElements = classpathUrls map { url => url.toURI.getPath }
val classpath = classpathElements mkString java.io.File.pathSeparator

val settings = new Settings
settings.bootclasspath.value = classpath

val imain = new IMain(settings)
// use imain to interpret code. It should be able to access all your application classes
// as well as dependent libraries.
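For completeness, a quick (hypothetical) smoke test of the interpreter configured above; the evaluated snippet is arbitrary:

// Hypothetical usage of the imain instance built above.
imain.beQuietDuring {
  imain.interpret("val answer = 6 * 7")
}
println(imain.valueOfTerm("answer")) // expected to print Some(42)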
It's because Play uses sbt's "fork in run" feature. This starts a new JVM, which is why the following failure appears:
[info] Failed to initialize compiler: object scala.runtime in compiler mirror not found.
[info] ** Note that as of 2.8 scala does not assume use of the java classpath.
[info] ** For the old behavior pass -usejavacp to scala, or if using a Settings
[info] ** object programatically, settings.usejavacp.value = true.
See: http://www.scala-sbt.org/release/docs/Detailed-Topics/Forking
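One thing to try, sketched here without having verified it against Play's own run task, is to keep run in the sbt JVM so the interpreter can see sbt's classloaders:

// build.sbt sketch: don't fork the run JVM
fork in run := false

Alternatively, following the hint in the error message itself, setting settings.usejavacp.value = true in the Interpreter object may let the forked JVM's java classpath be picked up.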
Related
In an SBT multi-project build, when you run a task on an aggregator project, it runs the task in every aggregated subproject and all the logs from each subproject are output together in one big stream.
This makes it hard to debug build issues in a multi-project build, because all the logs get mixed together. Is there a way to output the projectID on each log line, so that you can quickly identify which subproject a log line came from?
Here is an example project:
name := "my-multiproject-build"
lazy val ProjectOne = project
lazy val ProjectTwo = project
lazy val root = project.in( file(".") ).aggregate(ProjectOne, ProjectTwo)
(what happens by default)
sbt package
[info] Packaging ProjectOne.jar ...
[info] Done packaging.
[info] Packaging ProjectTwo.jar ...
[info] Done packaging.
(what I want)
sbt package
[info] [ProjectOne] Packaging ProjectOne.jar ...
[info] [ProjectOne] Done packaging.
[info] [ProjectTwo] Packaging ProjectTwo.jar ...
[info] [ProjectTwo] Done packaging.
I tried looking into SBT custom loggers, but unfortunately the documentation is a bit sparse and I'm by no means an SBT expert.
Like Rich said, there is currently no extension point for customizing sbt's logging format. But if you don't mind relying on internal APIs, you can get close to what you want, depending on which version of sbt you are using.
Basically you would have to replace the default logManager rather than adding extraLoggers (the API is similar though).
sbt 0.13.x
Our job here looks simpler. We can reuse BufferedLogger to avoid the boilerplate involved in delegating everything to a ConsoleLogger:
logManager := LogManager.withScreenLogger { (_, state) =>
  val console = ConsoleLogger(state.globalLogging.console)
  new BufferedLogger(console) {
    val project = projectID.value.name
    override def log(level: Level.Value, message: => String): Unit =
      console.log(level, s"[$project] $message")
  }
}
sbt 1.0.x
The logging API was changed here to provide event logging. We now have to provide a log4j Appender, which is more flexible but makes our job more difficult. We can't reuse the classes from sbt.internal, where the logging implementation has moved, because they are all private, sealed, final, etc. The only thing I could think of, short of duplicating the functionality of ConsoleAppender, was to hack the output stream:
logManager := LogManager.defaultManager(
  ConsoleOut.printStreamOut(new PrintStream(System.out) {
    val project = projectID.value.name
    override def println(str: String): Unit = {
      val (lvl, msg) = str.span(_ != ']')
      super.println(s"$lvl] [$project$msg")
    }
  }))
Note that there is no guarantee println will be called instead of some other print method.
I don't know if it's possible to use a log4j configuration file to customize the format.
Looking through the SBT code, I don't think this is possible to do cleanly.
Here's a build.sbt which does most of what you want.
import sbt.Level
name := "my-multiproject-build"
lazy val ProjectOne = project
lazy val ProjectTwo = project
lazy val root = project.in( file(".") ).aggregate(ProjectOne, ProjectTwo)
val wrapLogger = (project: Project, inner: AbstractLogger) => {
  new AbstractLogger {
    override def log(level: Level.Value, message: => String): Unit = {
      inner.log(
        level,
        "[" + project.id + "] " + message
      )
    }
    override def setTrace(flag: Int): Unit = inner.setTrace(flag)
    override def setLevel(newLevel: Level.Value): Unit = {
      // MultiLogger keeps setting this to debug
      inner.setLevel(Level.Info)
    }
    override def setSuccessEnabled(flag: Boolean): Unit = inner.setSuccessEnabled(flag)
    override def logAll(events: Seq[LogEvent]): Unit = {
      events.foreach(log)
    }
    override def control(event: _root_.sbt.ControlEvent.Value, message: => String): Unit =
      inner.control(event, message)
    override def successEnabled: Boolean = inner.successEnabled
    override def getLevel = inner.getLevel
    override def getTrace: Int = inner.getTrace
    override def trace(t: => Throwable): Unit = inner.trace(t)
    override def success(message: => String): Unit = inner.success(message)
  }
}

extraLoggers in ProjectOne := {
  val currentFunction = extraLoggers.value
  (key: ScopedKey[_]) => {
    val logger = wrapLogger(ProjectOne, ConsoleLogger())
    logger.setLevel(Level.Info)
    logger +: currentFunction(key)
  }
}

extraLoggers in ProjectTwo := {
  val currentFunction = extraLoggers.value
  (key: ScopedKey[_]) => {
    val logger = wrapLogger(ProjectTwo, ConsoleLogger())
    logger.setLevel(Level.Info)
    logger +: currentFunction(key)
  }
}
The output is now duplicated for project-specific logs: once with the project name prepended and once without it.
The output looks like:
[info] Done packaging.
[info] [ProjectTwo] Done packaging.
[info] Done updating.
[info] [ProjectOne] Done updating.
The ConsoleLogger is constructed in MainLogging.defaultScreen, and as far as I can see there are no extension points that let you manipulate the log messages.
If SBT had used a logging library like logback or log4j2, rather than reinventing the wheel with its own logging framework, this would have been possible. :-(
I hit a MissingRequirementError when I try to invoke scaladoc from within an sbt task.
Using any version of sbt 0.13.x, start with this build.sbt:
val scaladoc = taskKey[Unit]("run scaladoc")

scaladoc := {
  import scala.tools.nsc._
  val settings = new doc.Settings(error => print(error))
  settings.usejavacp.value = true
  val docFactory = new doc.DocFactory(new reporters.ConsoleReporter(settings), settings)
  val universe = docFactory.makeUniverse(Left((sources in Compile).value.map(_.absolutePath).toList))
}
Then run sbt scaladoc, and behold (during makeUniverse):
[info] Set current project to test (in build file:...)
scala.reflect.internal.MissingRequirementError: object scala.annotation.Annotation in compiler mirror not found.
at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
What is wrong here? I've already tried fork := true and different combinations of sbt/scala versions to no avail.
It seems you need to provide scala-library (and indeed, any other dependencies) directly to the DocFactory.
scaladoc := {
  import scala.tools.nsc._
  val settings = new doc.Settings(error => print(error))
  val dependencyPaths = (update in Compile).value
    .select().map(_.absolutePath).mkString(java.io.File.pathSeparator)
  settings.classpath.append(dependencyPaths)
  settings.bootclasspath.append(dependencyPaths)
  val docFactory = new doc.DocFactory(new reporters.ConsoleReporter(settings), settings)
  val universe = docFactory.makeUniverse(Left((sources in Compile).value.map(_.absolutePath).toList))
}
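If the goal is to actually emit the HTML site rather than just build the universe, the same DocFactory can drive the whole pipeline. This is only a sketch: it assumes DocFactory.document and the outdir setting behave as they do in the stock scaladoc tool, and target.value / "custom-api" is just a hypothetical output location:

scaladoc := {
  import scala.tools.nsc._
  val settings = new doc.Settings(error => print(error))
  val dependencyPaths = (update in Compile).value
    .select().map(_.absolutePath).mkString(java.io.File.pathSeparator)
  settings.classpath.append(dependencyPaths)
  settings.bootclasspath.append(dependencyPaths)
  settings.outdir.value = (target.value / "custom-api").getAbsolutePath // hypothetical output directory
  val docFactory = new doc.DocFactory(new reporters.ConsoleReporter(settings), settings)
  // document() builds the universe and writes the HTML output in one pass
  docFactory.document((sources in Compile).value.map(_.absolutePath).toList)
}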
I am new to SBT and I have been trying to build a custom task for this build.
I have a simple build project:
import sbt._
import Keys._

object JsonBuild extends Build {
  lazy val barTask = taskKey[Unit]("some simple task")

  val afterTestTask1 = barTask := { println("tests ran!") }
  val afterTestTask2 = barTask <<= barTask.dependsOn(test in Test)

  lazy val myBarTask = taskKey[Unit]("some simple task")

  //val afterMyBarTask1 = myBarTask := { println("tests ran!") }
  lazy val afterMyBarTask2 = myBarTask <<= (myBarTask).dependsOn(test in Test) map { _ => println("tests ran!") }

  //settings ++ Seq(afterMyBarTask2)
  override lazy val settings = super.settings ++ Seq(afterMyBarTask2)
}
I keep getting the error:
References to undefined settings:
{.}/*:myBarTask from {.}/*:myBarTask (C:\Users\haques\Documents\workspace\SBT\jsonParser\project\Build.scala:13)
{.}/test:test from {.}/*:myBarTask (C:\Users\haques\Documents\workspace\SBT\jsonParser\project\Build.scala:13)
Did you mean test:test ?
I have googled around and I cannot find a solution.
Can you explain why it is not working?
lazy val myBarTask = taskKey[Unit]("some simple task")
override lazy val settings = super.settings ++ Seq(myBarTask := { (test in Test).value; println("tests ran!") } )
myBarTask is undefined at the point where you call dependsOn; you should define it before using dependsOn. Also, calling .value on a key (task or setting) is now the preferred way to depend on other keys. You can still use your version, but define myBarTask first.
This had been bothering me, so I did a bit more reading.
I think I know why the code above does not work.
lazy val afterMyBarTask2 = myBarTask <<= (myBarTask).dependsOn(test in Test) map { _ => println("tests ran!") }
When I write (myBarTask).dependsOn(test in Test), SBT resolves the project scope for test to ThisBuild.
{.}/test:test from {.}/*:myBarTask (C:\Users\haques\Documents\workspace\SBT\jsonParser\project\Build.scala:13)
The ThisBuild project scope does not have the setting test in the Test configuration; only projects have the setting test.
I think that setting is added to each project's settings by one of SBT's default plugins.
You can check which scopes a setting exists in by using the inspect command.
If you type the following into the SBT REPL:
inspect {.}/test:test
The output is:
[info] No entry for key.
SBT correctly suggests:
test:test which is:
{file:/C:/Users/haques/Documents/workspace/SBT/jsonParser/}jsonparser/test:test
If the project is not specified on the project axis of a scope, SBT chooses the current project by default.
And every SBT project, unless configured otherwise, has its own project-scoped settings.
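Putting that together, here is a sketch that defines the task in the project's own settings, where test:test actually exists. The project name jsonParser is an assumption based on the paths in the error messages:

import sbt._
import Keys._

object JsonBuild extends Build {
  lazy val myBarTask = taskKey[Unit]("some simple task")

  // Define the task on the project itself, not in the build-level settings,
  // so that test in Test resolves in the project's scope.
  lazy val jsonParser = Project("jsonParser", file("."))
    .settings(
      myBarTask := {
        (test in Test).value   // run the tests first
        println("tests ran!")
      }
    )
}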
I have googled a lot and am totally stuck now. I know there are similar questions, but please read to the end: I have tried all of the proposed solutions and none of them worked.
I am trying to use the IMain class from scala.tools.nsc within a Play 2.1 project (Using Scala 2.10.0).
Controller Code
This is the code where I try to use IMain in a WebSocket. It is only for testing.
import play.api.mvc._
import play.api.libs.iteratee.{ Concurrent, Iteratee }
import scala.tools.nsc.interpreter.{ IMain, Results }

object Scala extends Controller {
  def session = WebSocket.using[String] { request =>
    val interpreter = new IMain()
    val (out, channel) = Concurrent.broadcast[String]
    val in = Iteratee.foreach[String] { code =>
      interpreter.interpret(code) match {
        case Results.Error      => channel.push("error")
        case Results.Incomplete => channel.push("incomplete")
        case Results.Success    => channel.push("success")
      }
    }
    (in, out)
  }
}
As soon as something is sent over the WebSocket, Play logs the following error:
Failed to initialize compiler: object scala.runtime in compiler mirror not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programatically, settings.usejavacp.value = true.
Build.scala
object ApplicationBuild extends Build {

  val appName = "escalator"
  val appVersion = "1.0-SNAPSHOT"

  val appDependencies = Seq(
    "org.scala-lang" % "scala-compiler" % "2.10.0"
  )

  val main = play.Project(appName, appVersion, appDependencies).settings(
  )
}
What I have tried so far
None of this worked:
I have included fork := true in the Build.scala
A Settings object with:
embeddedDefaults[MyType]
usejavacp.value = true
The solution proposed as an answer to the question Embedded Scala REPL inherits parent classpath
I don't know what to do now.
The problem here is that sbt doesn't add scala-library to the classpath.
The following workaround works.
First create a folder lib in the top project directory (the parent of app, conf, etc.) and copy scala-library.jar into it.
Then you can use the following code to host an interpreter:
import java.io.File
import scala.tools.nsc.Settings
import scala.tools.nsc.interpreter.IMain

val settings = new Settings
settings.bootclasspath.value += scala.tools.util.PathResolver.Environment.javaBootClassPath + File.pathSeparator + "lib/scala-library.jar"

val in = new IMain(settings) {
  override protected def parentClassLoader = settings.getClass.getClassLoader()
}
val res = in.interpret("val x = 1")
The above builds the boot classpath by adding the Scala library to the Java boot classpath. It's not a problem with the Play framework; it comes from sbt. The same problem occurs for any Scala project run with sbt (tested with a simple project); when run from Eclipse it works fine.
EDIT: Link to sample project demonstrating the above.
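As a variation, one could try deriving the location of scala-library from a class it contains instead of copying the jar into lib/ by hand. This is an untested sketch; getProtectionDomain/getCodeSource can return null in some environments:

// Untested: locate scala-library.jar via a class that lives in it.
val scalaLibJar = classOf[List[_]].getProtectionDomain.getCodeSource.getLocation.getPath
settings.bootclasspath.value += java.io.File.pathSeparator + scalaLibJar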
I wonder if the reflect jar is missing. Try adding this to appDependencies as well:
"org.scala-lang" % "scala-reflect" % "2.10.0"
I'm trying to get an interactive shell into my Scala application. I'm using the following setup:
Scala 2.10.0
sbt 0.12.2
Akka 2.1.0
sbt-lwjgl-plugin 3.1.4
and the following non-working code:
import akka.actor.{ Actor, ActorSystem }
import scala.tools.nsc.Settings
import scala.tools.nsc.interpreter.IMain

class TestActor extends Actor {
  def receive = {
    case _ => {
      val settings = new Settings
      settings.usejavacp.value = true
      settings embeddedDefaults ActorSystem.getClass.getClassLoader
      val repl = new IMain(settings)
      repl.interpret("import java._")  // working
      repl.interpret("import scala._") // working
      repl.interpret("import akka._")  // not working
      repl.interpret("import other.java.class.Bar") // not working
    }
  }
}
sbt is set to fork := true. I've tried several settings and classpath configurations, but couldn't find a working combination. Can someone give me a hint or a solution for this problem?
Have you tried re-adding the whole classpath with absolute paths?
import java.io.File
import java.net.URLClassLoader
import scala.tools.nsc.Settings

val settings = new Settings
settings.usejavacp.value = true

val classLoader = Thread.currentThread.getContextClassLoader
classLoader.asInstanceOf[URLClassLoader].getURLs.map(url => new File(url.toURI).getAbsolutePath).foreach { jarPath =>
  println(s"adding into Scala SDK classpath : ${jarPath}")
  settings.bootclasspath.append(jarPath)
  settings.classpath.append(jarPath)
}
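If that works, imports that previously failed should now resolve; a quick check might look like this:

val repl = new IMain(settings)
repl.interpret("import akka.actor.ActorSystem") // should succeed once akka is on the appended classpath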