Scala - Initialize REPL environment

Hi. I'd like to embed a Scala REPL with an initialized environment into my app. I've looked at the IMain class, and it seems I could do it via an instance of it. The instance is created and then stored in the public var intp in ILoop's process().
How can I bind some names and/or add some imports before process() is called (i.e. before the REPL starts)?
The following code fails on the bind call (line 3) because intp has not been created yet (=> NPE):
val x = 3
val interp = new ILoop
interp.bind("x", x) // -> interp.intp.bind("x", x)
val settings = new Settings
settings.usejavacp.value = true
interp.process(settings)
Thank you.
UPDATE: Overriding createInterpreter() unfortunately doesn't work:
val x = 3
val interp = new ILoop {
  override def createInterpreter() {
    super.createInterpreter()
    intp.bind("x", x) // -> interp.intp.bind("x", x)
  }
}
val settings = new Settings
settings.usejavacp.value = true
interp.process(settings)
The interpreter is stuck on input (it looks like a deadlock and happens only with the code above):
x: Int = 3
Failed to created JLineReader: java.lang.NoClassDefFoundError: scala/tools/jline/console/completer/Completer
Falling back to SimpleReader.
Welcome to Scala version 2.9.2 (OpenJDK 64-Bit Server VM, Java 1.7.0_06-icedtea).
Type in expressions to have them evaluated.
Type :help for more information.
scala> println
<infinite_sleep>
Thanks to dvigal for the suggestion.

There is a github project called scala-ssh-shell which may do what you want, or at least get you closer.

Hi, sorry, I'm not a Scala REPL hacker, but I think you can do something like:
class YourILoop(in0: Option[BufferedReader], protected override val out: JPrintWriter)
    extends ILoop(in0, out) {
  override def createInterpreter() {
    if (addedClasspath != "")
      settings.classpath append addedClasspath
    intp = new ILoopInterpreter
    val x = 3
    intp.bind("x", x)
  }
}

object Run {
  def errorFn(str: String): Boolean = {
    Console.err println str
    false
  }
  def process(args: Array[String]): Boolean = {
    val command = new GenericRunnerCommand(args.toList, (x: String) => errorFn(x))
    import command.{ settings, howToRun, thingToRun }
    new YourILoop process settings
  }
  def main(args: Array[String]) {
    process(args)
  }
}

Related

What is the Spark execution order with function calls in Scala?

I have a Spark program as follows:
object A {
  var id_set: Set[String] = _

  def init(argv: Array[String]) = {
    val args = new AArgs(argv)
    id_set = args.ids.split(",").toSet
  }

  def main(argv: Array[String]) {
    init(argv)
    val conf = new SparkConf().setAppName("some.name")
    val rdd1 = getRDD(paras)
    val rdd2 = getRDD(paras)
    //......
  }

  def getRDD(paras) = {
    //function details
    getRDDDtails(paras)
  }

  def getRDDDtails(paras) = {
    //val id_given = id_set
    id_set.foreach(println) // works fine here, not empty
    someRDD.filter { x =>
      val someSet = x.getOrElse(...)
      //id_set.foreach(println) ------ wrong, id_set is just an empty set here
      (someSet & id_set).size > 0
    }
  }
}

class AArgs(args: Array[String]) extends Serializable {
  //parse args
}
I have a global variable id_set. At first, it is just an empty set. In the main function, I call init, which sets id_set to a non-empty set from args. After that, I call the getRDD function, which calls getRDDDtails. In getRDDDtails, I filter an RDD based on the contents of id_set. However, the result seems to be empty. I tried to print id_set in the executor, and it is just an empty line. So, the problem seems to be that id_set is not properly initialized (in the init function). However, when I try to print id_set in the driver (in the first lines of getRDDDtails), it works normally and is not empty.
So, I tried to add val id_given = id_set in getRDDDtails and use id_given later. This seems to fix the problem. But I'm totally confused about why this happens. What is the execution order of Spark programs? Why does my workaround work?
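For what it's worth, here is a minimal, hypothetical sketch (names are illustrative, not from the program above) of the difference between referencing the object's field inside an RDD closure and copying it to a local val first. A field of a top-level object is not shipped with the closure; the object is loaded freshly on each executor, where init was never called, so the field is still empty there. A local val, by contrast, is captured by value and serialized with the closure:
import org.apache.spark.rdd.RDD

object ClosureCaptureSketch {
  var idSet: Set[String] = Set.empty            // assigned on the driver only, e.g. in init()

  def filterWithField(rdd: RDD[String]): RDD[String] =
    rdd.filter(x => idSet.contains(x))          // executors see a freshly loaded object: idSet is empty

  def filterWithLocalCopy(rdd: RDD[String]): RDD[String] = {
    val localIds = idSet                        // snapshot taken on the driver
    rdd.filter(x => localIds.contains(x))       // localIds is serialized with the closure and non-empty
  }
}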

ILoop Tab Completion

I am creating a very simple extension of scala.tools.nsc.interpreter.ILoop with the intent of adding some additional bindings, but even in the most basic use case tab-completion does not seem to work. If I type in code, it interprets and works as expected, but I get no tab-completion. Is there something specific that needs to be defined in order for tab-completion to be enabled in the interactive interpreter (REPL)?
My use-case is as simple as the following:
val repl = new ILoop
repl.process(new Settings {
  usejavacp.value = true
  deprecation.value = true
})
Is there something other than ILoop I should be using?
It kind of works for me, modulo version.
$ scalacm myintp.scala && scalam myintp.Test
Welcome to Scala 2.12.0-RC2 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_101).
Type in expressions for evaluation. Or try :help.
scala> 42
res0: Int = 42
scala> 42.
!= < >>> doubleValue isNaN isValidShort shortValue toDouble toShort
% << ^ floatValue isNegInfinity isWhole signum toFloat unary_+
& <= abs floor isPosInfinity longValue to toHexString unary_-
* == byteValue getClass isValidByte max toBinaryString toInt unary_~
+ > ceil intValue isValidChar min toByte toLong underlying
- >= compare isInfinite isValidInt round toChar toOctalString until
/ >> compareTo isInfinity isValidLong self toDegrees toRadians |
scala> 42.s
self shortValue signum synchronized
scala> 42.self
res1: Int = 42
scala> :quit
Source:
$ cat myintp.scala
package myintp
import scala.tools.nsc._
import scala.tools.nsc.interpreter._
/* 2.12 */
object Test extends App {
  val ss = new Settings {
    usejavacp.value = true
    deprecation.value = true
  }
  def repl = new ILoop {
    override def createInterpreter(): Unit = {
      super.createInterpreter()
    }
  }
  repl process ss
}
/* 2.11
object Test extends App {
  def repl = new ILoop {
    override def createInterpreter(): Unit = {
      def binder: Unit = intp beQuietDuring {
        intp directBind ("foo", "bar")
        intp bind ("baz", "boo")
      }
      super.createInterpreter()
      intp initialize binder
    }
  }
  repl process new Settings
}
*/
/* 2.9
object Test extends App {
  def repl = new ILoop {
    def binder: Unit = intp beQuietDuring {
      intp bind ("baz", "boo")
    }
    override def loop(): Unit = {
      binder
      super.loop()
    }
  }
  repl process new Settings
}
*/

Play 2.3 FakeApplication mode not setting in test?

I'm using play 2.3.8 and have some configuration in my GlobalSettings that change based on the mode of the application. So I have something like this:
object Global extends GlobalSettings {
  override def onLoadConfig(config: Configuration, path: java.io.File, classloader: ClassLoader, mode: Mode.Mode) = {
    println(mode)
    val customConfig = //Based on mode.*
    config ++ configuration ++ Configuration(ConfigFactory.parseMap(customConfig))
  }
}
And then am trying to write tests to ensure that this behavior works:
class MyTest extends PlaySpec {
val testApp = FakeApplication(
additionalConfiguration = Map(
//SomeSettings And Stuff
"logger.application" -> "WARN",
"logger.root" -> "WARN"
)
)
val devApp = new FakeApplication(
additionalConfiguration = Map(
//SomeSettings And Stuff
"logger.application" -> "WARN",
"logger.root" -> "WARN"
)
) {
override val mode = Mode.Dev
}
val prodApp = new FakeApplication(
additionalConfiguration = Map(
//SomeSettings And Stuff
"logger.application" -> "WARN",
"logger.root" -> "WARN"
)
) {
override val mode = Mode.Prod
}
"ThisNonWorkingTestOfMine" must {
"when running application in test mode have config.thing = false" in running(testApp) {
assertResult(Mode.Test)(testApp.mode)
assertResult(false)(testApp.configuration.getBoolean("config.thing").get)
}
"when running application in dev mode have config.thing = false" in running(devApp) {
assertResult(Mode.Dev)(devApp.mode)
assertResult(false)(devApp.configuration.getBoolean("config.thing").get)
}
"when running application in prod mode have config.thing = true" in running(prodApp) {
assertResult(Mode.Prod)(prodApp.mode)
assertResult(true)(prodApp.configuration.getBoolean("config.thing").get)
}
}
}
And when I run these tests I see something a bit odd from my handy println:
Test
null
null
[info] MyTest:
[info] ThisNonWorkingTestOfMine
[info] play - Starting application default Akka system.
[info] play - Shutdown application default Akka system.
[info] - must when running application in test mode have config.thing = false
[info] play - Application started (Dev)
[info] - must when running application in dev mode have config.thing = false
[info] play - Application started (Prod)
[info] - must when running application in prod mode have config.thing = true *** FAILED ***
[info] Expected true, but got false (MyTest.scala:64)
[info] ScalaTest
How do I properly set the mode of the FakeApplication in Play 2.3? The way I have it now is based on a page from Mastering Play, but clearly that isn't the way to go when using onLoadConfig, it seems.
Edit:
I'm also experimenting with OneAppPerTest and creating the FakeApplication in the newAppForTest method, but it's still behaving oddly, with nulls like the approach above. This is really strange, because if I set a random property like "foo" -> "bar" in the additionalConfiguration map when making my FakeApplication and then try to read it via config.getString in my Global object, it gets logged as None, even though app.configuration.getString in the test itself shows bar. It feels like there is some kind of disconnect here. And I don't get null for the mode if I use the FakeApplication.apply method rather than new FakeApplication.
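For reference, here is a hypothetical sketch of the OneAppPerTest experiment described above (assuming scalatestplus-play's newAppForTest hook; the settings are illustrative only, not the real configuration):
import org.scalatest.TestData
import org.scalatestplus.play.{OneAppPerTest, PlaySpec}
import play.api.Mode
import play.api.test.FakeApplication

class DevModeSpec extends PlaySpec with OneAppPerTest {
  // each test gets its own FakeApplication built here
  override def newAppForTest(testData: TestData): FakeApplication =
    new FakeApplication(additionalConfiguration = Map("logger.root" -> "WARN")) {
      override val mode = Mode.Dev // same override that shows up as null in onLoadConfig
    }

  "the app" must {
    "run in dev mode" in {
      app.mode mustBe Mode.Dev
    }
  }
}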
So I think this has something to do with the way that FakeApplication sets the mode to Mode.Test via an override, because if I copy the FakeApplication class, remove that line, and create my own version of the class that lets me set the mode, I have no issues. In other words, in my tests package I declare the following class:
package play.api.test
import play.api.mvc._
import play.api.libs.json.JsValue
import scala.concurrent.Future
import xml.NodeSeq
import play.core.Router
import scala.runtime.AbstractPartialFunction
import play.api.libs.Files.TemporaryFile
import play.api.{ Application, WithDefaultConfiguration, WithDefaultGlobal, WithDefaultPlugins }
case class FakeModeApplication(
override val path: java.io.File = new java.io.File("."),
override val classloader: ClassLoader = classOf[FakeModeApplication].getClassLoader,
val additionalPlugins: Seq[String] = Nil,
val withoutPlugins: Seq[String] = Nil,
val additionalConfiguration: Map[String, _ <: Any] = Map.empty,
val withGlobal: Option[play.api.GlobalSettings] = None,
val withRoutes: PartialFunction[(String, String), Handler] = PartialFunction.empty,
val mode: play.api.Mode.Value
) extends {
override val sources = None
} with Application with WithDefaultConfiguration with WithDefaultGlobal with WithDefaultPlugins {
override def pluginClasses = {
additionalPlugins ++ super.pluginClasses.diff(withoutPlugins)
}
override def configuration = {
super.configuration ++ play.api.Configuration.from(additionalConfiguration)
}
override lazy val global = withGlobal.getOrElse(super.global)
override lazy val routes: Option[Router.Routes] = {
val parentRoutes = loadRoutes
Some(new Router.Routes() {
def documentation = parentRoutes.map(_.documentation).getOrElse(Nil)
// Use withRoutes first, then delegate to the parentRoutes if no route is defined
val routes = new AbstractPartialFunction[RequestHeader, Handler] {
override def applyOrElse[A <: RequestHeader, B >: Handler](rh: A, default: A => B) =
withRoutes.applyOrElse((rh.method, rh.path), (_: (String, String)) => default(rh))
def isDefinedAt(rh: RequestHeader) = withRoutes.isDefinedAt((rh.method, rh.path))
} orElse new AbstractPartialFunction[RequestHeader, Handler] {
override def applyOrElse[A <: RequestHeader, B >: Handler](rh: A, default: A => B) =
parentRoutes.map(_.routes.applyOrElse(rh, default)).getOrElse(default(rh))
def isDefinedAt(x: RequestHeader) = parentRoutes.map(_.routes.isDefinedAt(x)).getOrElse(false)
}
def setPrefix(prefix: String) {
parentRoutes.foreach(_.setPrefix(prefix))
}
def prefix = parentRoutes.map(_.prefix).getOrElse("")
})
}
}
And then in my test I can use it like so:
val devApp = new FakeModeApplication(
additionalConfiguration = Map(
//SomeSettings And Stuff
"logger.application" -> "WARN",
"logger.root" -> "WARN"
), mode = Mode.Dev
)
And then the mode value comes through as what I set it to and not as null.
I'm posting this as an answer because it does solve the issue I'm facing, but I don't understand why using the new keyword when making a FakeApplication like so: new FakeApplication() { override val mode = ... } causes the mode to come through as null in the onLoadConfig method on GlobalSettings. This feels like a hack rather than a solution, and I'd appreciate it if anyone with enough knowledge around this could post a solution that doesn't involve copying the full FakeApplication class and changing one line.

Creating serializable objects from Scala source code at runtime

To embed Scala as a "scripting language", I need to be able to compile text fragments to simple objects, such as Function0[Unit] that can be serialised to and deserialised from disk and which can be loaded into the current runtime and executed.
How would I go about this?
Say for example, my text fragment is (purely hypothetical):
Document.current.elements.headOption.foreach(_.open())
This might be wrapped into the following complete text:
package myapp.userscripts
import myapp.DSL._
object UserFunction1234 extends Function0[Unit] {
  def apply(): Unit = {
    Document.current.elements.headOption.foreach(_.open())
  }
}
What comes next? Should I use IMain to compile this code? I don't want to use the normal interpreter mode, because the compilation should be "context-free" and not accumulate requests.
What I need to get hold of from the compilation is, I guess, the binary class file? In that case, serialisation is straightforward (a byte array). How would I then load that class into the runtime and invoke the apply method?
What happens if the code compiles to multiple auxiliary classes? The example above contains a closure _.open(). How do I make sure I "package" all those auxiliary things into one object to serialize and class-load?
Note: Given that Scala 2.11 is imminent and the compiler API probably changed, I am happy to receive hints as how to approach this problem on Scala 2.11
Here is one idea: use a regular Scala compiler instance. Unfortunately it seems to require the use of hard disk files both for input and output. So we use temporary files for that. The output will be zipped up in a JAR which will be stored as a byte array (that would go into the hypothetical serialization process). We need a special class loader to retrieve the class again from the extracted JAR.
The following assumes Scala 2.10.3 with the scala-compiler library on the class path:
import scala.tools.nsc
import java.io._
import scala.annotation.tailrec
Wrap the user-provided code in a function class with a synthetic name that is incremented for each new fragment:
val packageName = "myapp"
var userCount = 0
def mkFunName(): String = {
val c = userCount
userCount += 1
s"Fun$c"
}
def wrapSource(source: String): (String, String) = {
val fun = mkFunName()
val code = s"""package $packageName
|
|class $fun extends Function0[Unit] {
| def apply(): Unit = {
| $source
| }
|}
|""".stripMargin
(fun, code)
}
A function to compile a source fragment and return the byte array of the resulting jar:
/** Compiles a source code consisting of a body which is wrapped in a `Function0`
* apply method, and returns the function's class name (without package) and the
* raw jar file produced in the compilation.
*/
def compile(source: String): (String, Array[Byte]) = {
val set = new nsc.Settings
val d = File.createTempFile("temp", ".out")
d.delete(); d.mkdir()
set.d.value = d.getPath
set.usejavacp.value = true
val compiler = new nsc.Global(set)
val f = File.createTempFile("temp", ".scala")
val out = new BufferedOutputStream(new FileOutputStream(f))
val (fun, code) = wrapSource(source)
out.write(code.getBytes("UTF-8"))
out.flush(); out.close()
val run = new compiler.Run()
run.compile(List(f.getPath))
f.delete()
val bytes = packJar(d)
deleteDir(d)
(fun, bytes)
}
def deleteDir(base: File): Unit = {
base.listFiles().foreach { f =>
if (f.isFile) f.delete()
else deleteDir(f)
}
base.delete()
}
Note: Doesn't handle compiler errors yet!
The packJar method uses the compiler output directory and produces an in-memory jar file from it:
// cf. http://stackoverflow.com/questions/1281229
def packJar(base: File): Array[Byte] = {
import java.util.jar._
val mf = new Manifest
mf.getMainAttributes.put(Attributes.Name.MANIFEST_VERSION, "1.0")
val bs = new java.io.ByteArrayOutputStream
val out = new JarOutputStream(bs, mf)
def add(prefix: String, f: File): Unit = {
val name0 = prefix + f.getName
val name = if (f.isDirectory) name0 + "/" else name0
val entry = new JarEntry(name)
entry.setTime(f.lastModified())
out.putNextEntry(entry)
if (f.isFile) {
val in = new BufferedInputStream(new FileInputStream(f))
try {
val buf = new Array[Byte](1024)
@tailrec def loop(): Unit = {
val count = in.read(buf)
if (count >= 0) {
out.write(buf, 0, count)
loop()
}
}
loop()
} finally {
in.close()
}
}
out.closeEntry()
if (f.isDirectory) f.listFiles.foreach(add(name, _))
}
base.listFiles().foreach(add("", _))
out.close()
bs.toByteArray
}
A utility function that takes the byte array found in deserialization and creates a map from class names to class byte code:
def unpackJar(bytes: Array[Byte]): Map[String, Array[Byte]] = {
import java.util.jar._
import scala.annotation.tailrec
val in = new JarInputStream(new ByteArrayInputStream(bytes))
val b = Map.newBuilder[String, Array[Byte]]
@tailrec def loop(): Unit = {
val entry = in.getNextJarEntry
if (entry != null) {
if (!entry.isDirectory) {
val name = entry.getName
// cf. http://stackoverflow.com/questions/8909743
val bs = new ByteArrayOutputStream
var i = 0
while (i >= 0) {
i = in.read()
if (i >= 0) bs.write(i)
}
val bytes = bs.toByteArray
b += mkClassName(name) -> bytes
}
loop()
}
}
loop()
in.close()
b.result()
}
def mkClassName(path: String): String = {
require(path.endsWith(".class"))
path.substring(0, path.length - 6).replace("/", ".")
}
A suitable class loader:
class MemoryClassLoader(map: Map[String, Array[Byte]]) extends ClassLoader {
override protected def findClass(name: String): Class[_] =
map.get(name).map { bytes =>
println(s"defineClass($name, ...)")
defineClass(name, bytes, 0, bytes.length)
} .getOrElse(super.findClass(name)) // throws exception
}
And a test case which contains additional classes (closures):
val exampleSource =
"""val xs = List("hello", "world")
|println(xs.map(_.capitalize).mkString(" "))
|""".stripMargin
def test(fun: String, cl: ClassLoader): Unit = {
val clName = s"$packageName.$fun"
println(s"Resolving class '$clName'...")
val clazz = Class.forName(clName, true, cl)
println("Instantiating...")
val x = clazz.newInstance().asInstanceOf[() => Unit]
println("Invoking 'apply':")
x()
}
locally {
println("Compiling...")
val (fun, bytes) = compile(exampleSource)
val map = unpackJar(bytes)
println("Classes found:")
map.keys.foreach(k => println(s" '$k'"))
val cl = new MemoryClassLoader(map)
test(fun, cl) // should call `defineClass`
test(fun, cl) // should find cached class
}
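The question also asked about the serialisation step itself. As a hedged sketch (not part of the answer above; names are illustrative), the (class name, jar bytes) pair returned by compile could be written to and read back from disk with plain data streams before being handed to unpackJar and MemoryClassLoader:
import java.io._

def saveFragment(file: File, fun: String, jarBytes: Array[Byte]): Unit = {
  val out = new DataOutputStream(new FileOutputStream(file))
  try {
    out.writeUTF(fun)             // class name without package
    out.writeInt(jarBytes.length)
    out.write(jarBytes)           // raw jar produced by compile()
  } finally out.close()
}

def loadFragment(file: File): (String, Array[Byte]) = {
  val in = new DataInputStream(new FileInputStream(file))
  try {
    val fun = in.readUTF()
    val bytes = new Array[Byte](in.readInt())
    in.readFully(bytes)
    (fun, bytes)
  } finally in.close()
}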

How do I provide basic configuration for a Scala application?

I am working on a small GUI application written in Scala. There are a few settings that the user will set in the GUI, and I want them to persist between program executions. Basically I want a scala.collection.mutable.Map that automatically persists to a file when modified.
This seems like it must be a common problem, but I have been unable to find a lightweight solution. How is this problem typically solved?
I do a lot of this, and I use .properties files (it's idiomatic in Java-land). I keep my config pretty straightforward by design, though. If you have nested config constructs you might want a different format like YAML (if humans are the main authors) or JSON or XML (if machines are the authors).
Here's some example code for loading props, manipulating as Scala Map, then saving as .properties again:
import java.io._
import java.util._
import scala.collection.JavaConverters._
val f = new File("test.properties")
// test.properties:
// foo=bar
// baz=123
val props = new Properties
// Note: in real code make sure all these streams are
// closed carefully in try/finally
val fis = new InputStreamReader(new FileInputStream(f), "UTF-8")
props.load(fis)
fis.close()
println(props) // {baz=123, foo=bar}
val map = props.asScala // Get to Scala Map via JavaConverters
map("foo") = "42"
map("quux") = "newvalue"
println(map) // Map(baz -> 123, quux -> newvalue, foo -> 42)
println(props) // {baz=123, quux=newvalue, foo=42}
val fos = new OutputStreamWriter(new FileOutputStream(f), "UTF-8")
props.store(fos, "")
fos.close()
Here's an example of using XML and a case class for reading a config. A real class can be nicer than a map. (You could also do what sbt and at least one project do, take the config as Scala source and compile it in; saving it is less automatic. Or as a repl script. I haven't googled, but someone must have done that.)
Here's the simpler code.
This version uses a case class:
case class PluginDescription(name: String, classname: String) {
def toXML: Node = {
<plugin>
<name>{name}</name>
<classname>{classname}</classname>
</plugin>
}
}
object PluginDescription {
def fromXML(xml: Node): PluginDescription = {
// extract one field
def getField(field: String): Option[String] = {
val text = (xml \\ field).text.trim
if (text == "") None else Some(text)
}
def extracted = {
val name = "name"
val claas = "classname"
val vs = Map(name -> getField(name), claas -> getField(claas))
if (vs.values exists (_.isEmpty)) fail()
else PluginDescription(name = vs(name).get, classname = vs(claas).get)
}
def fail() = throw new RuntimeException("Bad plugin descriptor.")
// check the top-level tag
xml match {
case <plugin>{_*}</plugin> => extracted
case _ => fail()
}
}
}
This code reflectively calls the apply of a case class. The use case is that fields missing from config can be supplied by default args. No type conversions here. E.g., case class Config(foo: String = "bar").
// isn't it easier to write a quick loop to reflect the field names?
import scala.reflect.runtime.{currentMirror => cm, universe => ru}
import ru._
def fromXML(xml: Node): Option[PluginDescription] = {
def extract[A]()(implicit tt: TypeTag[A]): Option[A] = {
// extract one field
def getField(field: String): Option[String] = {
val text = (xml \\ field).text.trim
if (text == "") None else Some(text)
}
val apply = ru.newTermName("apply")
val module = ru.typeOf[A].typeSymbol.companionSymbol.asModule
val ts = module.moduleClass.typeSignature
val m = (ts member apply).asMethod
val im = cm reflect (cm reflectModule module).instance
val mm = im reflectMethod m
def getDefault(i: Int): Option[Any] = {
val n = ru.newTermName("apply$default$" + (i+1))
val m = ts member n
if (m == NoSymbol) None
else Some((im reflectMethod m.asMethod)())
}
def extractArgs(pss: List[List[Symbol]]): List[Option[Any]] =
pss.flatten.zipWithIndex map (p => getField(p._1.name.encoded) orElse getDefault(p._2))
val args = extractArgs(m.paramss)
if (args exists (!_.isDefined)) None
else Some(mm(args.flatten: _*).asInstanceOf[A])
}
// check the top-level tag
xml match {
case <plugin>{_*}</plugin> => extract[PluginDescription]()
case _ => None
}
}
XML has loadFile and save, it's too bad there seems to be no one-liner for Properties.
$ scala
Welcome to Scala version 2.10.0-RC5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_06).
Type in expressions to have them evaluated.
Type :help for more information.
scala> import reflect.io._
import reflect.io._
scala> import java.util._
import java.util._
scala> import java.io.{StringReader, File=>JFile}
import java.io.{StringReader, File=>JFile}
scala> import scala.collection.JavaConverters._
import scala.collection.JavaConverters._
scala> val p = new Properties
p: java.util.Properties = {}
scala> p load new StringReader(
| (new File(new JFile("t.properties"))).slurp)
scala> p.asScala
res2: scala.collection.mutable.Map[String,String] = Map(foo -> bar)
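For comparison, the XML one-liners mentioned above look roughly like this (a small sketch using the scala.xml API; the file name is illustrative):
import scala.xml.XML

val settings = <settings><foo>bar</foo></settings>
XML.save("settings.xml", settings, "UTF-8")   // write the config to disk
val loaded = XML.loadFile("settings.xml")     // read it back
println((loaded \ "foo").text)                // bar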
As it all boils down to serializing a map / object to a file, your choices are:
classic serialization to Bytecode
serialization to XML
serialization to JSON (easy using Jackson, or Lift-JSON)
use of a properties file (ugly, no utf-8 support)
serialization to a proprietary format (ugly, why reinvent the wheel)
I suggest converting the Map to Properties and vice versa. "*.properties" files are the standard way of storing configuration in the Java world, so why not use them for Scala too?
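A minimal sketch of that round trip (file name and keys are just examples), using JavaConverters as in the answer above:
import java.io.{FileInputStream, FileOutputStream}
import java.util.Properties
import scala.collection.JavaConverters._

def loadConfig(path: String): collection.mutable.Map[String, String] = {
  val props = new Properties
  val in = new FileInputStream(path)
  try props.load(in) finally in.close()
  props.asScala // mutable Scala view backed by the Properties object
}

def saveConfig(path: String, config: collection.Map[String, String]): Unit = {
  val props = new Properties
  config.foreach { case (k, v) => props.setProperty(k, v) }
  val out = new FileOutputStream(path)
  try props.store(out, "app config") finally out.close()
}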
The common formats are *.properties and *.xml. Since Scala supports XML natively, it is easier to use an XML config in Scala than in Java.