Scala initialization order

In the Programming in Scala (Ch. 10 on "Composition & Inheritance") book there is an example which causes some misunderstandings. This is the extracted part:
abstract class Element {
  def contents: Array[String]
  val someProperty: String = {
    println("=== Element")
    contents(0)
  }
}

class UniformElement(
  str: String
) extends Element {
  val s = str
  println("=== UniformElement.s " + s)
  def contents = Array(s) // error
  //def contents = Array(str) // ok
}
val e = new UniformElement("str")
println(e.someProperty)
For some reason the initialization of the superclass occurs before the initialization of s:
scala example.scala
=== Element
=== UniformElement.s str
null
Why does the alternative work without s (see commented line in code)?

The issue is that field values are null until the constructor completes, and the super constructor indirectly references the value s, which is initialised by the child constructor; at that point the child constructor has not yet run. The situation looks something like this:
class UniformElement {
  def <init>(str: String) = {
    super.<init>()
    s = str
  }
}
where we can see that if we replace super.<init>() with what it actually runs, namely
val someProperty: String = {
  println("=== Element")
  contents(0)
}
it executes before
s = str
Initialisation order issues can often be addressed by turning the eager val s into a lazy val, like so:
class UniformElement(str: String) extends Element {
  lazy val s = str
  println("=== UniformElement.s " + s)
  def contents = Array(s)
}
which now outputs
=== Element
=== UniformElement.s str
str
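Another option, sketched here on the assumption that Element itself can be modified, is to make someProperty a lazy val in the parent, so it is not evaluated during construction at all:
abstract class Element {
  def contents: Array[String]
  lazy val someProperty: String = { // evaluated on first access, not during construction
    println("=== Element")
    contents(0)
  }
}
With this change the original UniformElement (with a plain val s = str) also prints str instead of null, because contents(0) is only evaluated when println(e.someProperty) runs, after s has been initialised.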

Thanks for the interesting question! My guess (after spending some time on Scastie) would be this order of initialization:
Arguments: in your case, str is the first value to be defined
Parent: in your case, Element
Child: in your case, UniformElement
So, if I try to put it all into a single class, the order goes like this:
class UniformElement {
  // Argument init
  val str = "str"
  // Super constructor
  def contents: Array[String]
  val someProperty: String = {
    println("=== Element")
    contents(0)
  }
  // Child constructor
  val s = str
  println("=== UniformElement.s " + s)
  def contents = Array(s) // error
  //def contents = Array(str) // ok
}
The trick is that, to initialize someProperty, Scala needs to evaluate contents(0) and therefore look up the definition of contents. But at that point s is not yet defined (while str is).
So the final 'runtime' process looks like this:
class UniformElement {
  // Argument init
  val str = "str"
  // Super constructor, with contents replaced by its definition
  val someProperty: String = {
    println("=== Element")
    Array(s)(0) // error: s doesn't exist yet!
    // Array(str)(0) // ok: str exists
  }
  // Child constructor
  val s = str
  println("=== UniformElement.s " + s)
  def contents = Array(s) // error
  //def contents = Array(str) // ok
}
To convince yourself, you can try:
println(e.someProperty) // null => s wasn't defined
println(e.contents(0)) // str => s is now defined
Feel free to ask for clarification if needed.
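One more way to convince yourself, assuming a Scala 2 compiler: the -Xcheckinit flag wraps field accessors so that reading a not-yet-initialised val throws scala.UninitializedFieldError instead of silently yielding null:
scala -Xcheckinit example.scala   # should now fail when contents(0) reads s, instead of printing null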

Related

Scala reflect string to singleton object

I'm looking for a way to convert a Scala singleton object given as a string (for example: package1.Main) to the actual instance of Main, so that I can invoke methods on it.
Example of the problem:
package x {
object Main extends App {
val objectPath: String = io.StdIn.readLine("Give an object: ") // user enters: x.B
// how to convert the objectPath (String) to a variable that references singleton B?
val b1: A = magicallyConvert1(objectPath)
b1.hi()
val b2: B.type = magicallyConvert2(objectPath)
b2.extra()
}
trait A {
def hi() = {}
}
object B extends A {
def extra() = {}
}
}
How can the magicallyConvert1 and magicallyConvert2 functions be implemented?
For a normal class, this can be done using something like:
val b: A = Class.forName("x.B").newInstance().asInstanceOf[A]
But I found a solution for singletons, using Java reflection:
A singleton object is accessible from Java under the name
package.SingletonName$.MODULE$
i.e. you append "$" to the object's name to get its class, and the instance itself lives in a static field called MODULE$ on that class, which standard Java reflection can read.
So the solution is:
def magicallyConvert1(objectPath: String) = {
val clz = Class.forName(objectPath + "$")
val field = clz.getField("MODULE$")
val b: A = field.get(null).asInstanceOf[A]
b
}
def magicallyConvert2(objectPath: String) = {
val clz = Class.forName(objectPath + "$")
val field = clz.getField("MODULE$")
val b: B.type = field.get(null).asInstanceOf[B.type]
b
}
But it would still be interesting to see a solution with Scala-Reflect and Scala-Meta.
Take a look at Scalameta (http://scalameta.org); it does what you want and more.
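For the Scala-Reflect part, a sketch along these lines should also work; staticModule/reflectModule are the runtime-reflection counterparts of the MODULE$ trick (x.B and trait A are the names from the question):
import scala.reflect.runtime.{universe => ru}

def magicallyConvertReflect(objectPath: String): A = {
  val mirror = ru.runtimeMirror(getClass.getClassLoader)
  val module = mirror.staticModule(objectPath)          // symbol of the singleton, e.g. "x.B"
  mirror.reflectModule(module).instance.asInstanceOf[A] // the singleton instance itself
}

val b: A = magicallyConvertReflect("x.B")
b.hi()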

Singletons only available once in Scala

Often we need objects that can be reused but take some time to create:
def foo() = {
// long lines of code...
val pattern = Pattern.compile("pattern") // suppose this takes a long time
// use pattern
}
Then it can be moved to the outer scope:
private[this] lazy val pattern = Pattern.compile("pattern") // suppose this takes a long time
def foo() = {
// long lines of code...
// pattern needs to be available only here!
// use pattern
}
But this complicates the source, because it leaks the variable pattern into a wider scope even though it is used only in one specific place inside foo. I am curious whether this could be simplified with a macro function:
def foo() = {
// long lines of code...
val pattern = singleton { Pattern.compile("pattern") }
// use pattern
}
If this is possible, we can extend it to a more interesting case, a ThreadLocal singleton:
def foo() = {
// long lines of code...
val obj = threadLocal { new NotThreadSafeObject() }
// use obj
}
which would be expanded to something like:
private[this] lazy val foo_obj_gen_by_macro = {
val tl = new ThreadLocal[NotThreadSafeObject]()
tl.set(new NotThreadSafeObject()) // note: set() only initializes the thread that creates the ThreadLocal
tl
}
def foo() = {
// long lines of code...
val obj = foo_obj_gen_by_macro.get
// use obj
}
If this were C++, it could be achieved very easily with a static variable inside the function scope:
void foo() {
// long lines of code...
static Pattern pattern = Pattern.Compile("pattern");
// use pattern
}
Why not just scope it?
lazy val foo: () => String = {
val pattern = Pattern.compile("pattern")
def result(): String = ???
result _
}
Or, as thoredge mentioned, even simpler:
lazy val foo: () => String = {
val pattern = Pattern.compile("pattern")
() => ???
}

Scala Reflection Conundrum: Can you explain these weird results?

I wrote some Scala code, using reflection, that returns all vals in an object that are of a certain type. Below are three versions of this code. One of them works but is ugly. Two attempts to improve it don't work, in very different ways. Can you explain why?
First, the code:
import scala.reflect.runtime._
import scala.util.Try
trait ScopeBase[T] {
// this version tries to generalize the type. The only difference
// from the working version is [T] instead of [String]
def enumerateBase[S: universe.TypeTag]: Seq[T] = {
val mirror = currentMirror.reflect(this)
universe.typeOf[S].decls.map {
decl => Try(mirror.reflectField(decl.asMethod).get.asInstanceOf[T])
}.filter(_.isSuccess).map(_.get).filter(_ != null).toSeq
}
}
trait ScopeString extends ScopeBase[String] {
// This version works but requires passing the val type
// (String, in this example) explicitly. I don't want to
// duplicate the code for different val types.
def enumerate[S: universe.TypeTag]: Seq[String] = {
val mirror = currentMirror.reflect(this)
universe.typeOf[S].decls.map {
decl => Try(mirror.reflectField(decl.asMethod).get.asInstanceOf[String])
}.filter(_.isSuccess).map(_.get).filter(_ != null).toSeq
}
// This version tries to avoid passing the object's type
// as the [S] type parameter. After all, the method is called
// on the object itself; so why pass the type?
def enumerateThis: Seq[String] = {
val mirror = currentMirror.reflect(this)
universe.typeOf[this.type].decls.map {
decl => Try(mirror.reflectField(decl.asMethod).get.asInstanceOf[String])
}.filter(_.isSuccess).map(_.get).filter(_ != null).toSeq
}
}
// The working example
object Test1 extends ScopeString {
val IntField: Int = 13
val StringField: String = "test"
lazy val fields = enumerate[Test1.type]
}
// This shows how the attempt to generalize the type doesn't work
object Test2 extends ScopeString {
val IntField: Int = 13
val StringField: String = "test"
lazy val fields = enumerateBase[Test2.type]
}
// This shows how the attempt to drop the object's type doesn't work
object Test3 extends ScopeString {
val IntField: Int = 13
val StringField: String = "test"
lazy val fields = enumerateThis
}
val test1 = Test1.fields // List(test)
val test2 = Test2.fields // List(13, test)
val test3 = Test3.fields // List()
The "enumerate" method does work. However, as you can see from the Test1 example, it requires passing the object's own type (Test1.type) as a parameter, which should not have been necessary. The "enumerateThis" method tries to avoid that but fails, producing an empty list. The "enumerateBase" method attempts to generalize the "enumerate" code by passing the val type as a parameter. But it fails, too, producing the list of all vals, not just those of a certain type.
Any idea what's going on?
The problem in your generic implementation is the loss of the type information for T. Also, don't use exceptions as your primary means of flow control (it's very slow!). Here's a working version of your base class.
abstract class ScopeBase[T : universe.TypeTag, S <: ScopeBase[T, S] : universe.TypeTag : scala.reflect.ClassTag] {
self: S =>
def enumerateBase: Seq[T] = {
val mirror = currentMirror.reflect(this)
universe.typeOf[S].baseClasses.map(_.asType.toType).flatMap(
_.decls
.filter(_.typeSignature.resultType <:< universe.typeOf[T])
.filter(_.isMethod)
.map(_.asMethod)
.filter(_.isAccessor)
.map(decl => mirror.reflectMethod(decl).apply().asInstanceOf[T])
.filter(_ != null)
).toSeq
}
}
trait Inherit {
val StringField2: String = "test2"
}
class Test1 extends ScopeBase[String, Test1] with Inherit {
val IntField: Int = 13
val StringField: String = "test"
lazy val fields = enumerateBase
}
object Test extends App {
println(new Test1().fields)
}
Instead of getting the type from universe.typeOf, you can use the runtime class via currentMirror.classSymbol(getClass).toType. Below is an example that works:
def enumerateThis: Seq[String] = {
val mirror = currentMirror.reflect(this)
currentMirror.classSymbol(getClass).toType.decls.map {
decl => Try(mirror.reflectField(decl.asMethod).get.asInstanceOf[String])
}.filter(_.isSuccess).map(_.get).filter(_ != null).toSeq
}
//prints List(test)
With everyone's help, here's the final version that works:
import scala.reflect.runtime.{currentMirror, universe}
abstract class ScopeBase[T: universe.TypeTag] {
lazy val enumerate: Seq[T] = {
val mirror = currentMirror.reflect(this)
currentMirror.classSymbol(getClass).baseClasses.map(_.asType.toType).flatMap {
_.decls
.filter(_.typeSignature.resultType <:< universe.typeOf[T])
.filter(_.isMethod)
.map(_.asMethod)
.filterNot(_.isConstructor)
.filter(_.paramLists.size == 0)
.map(decl => mirror.reflectField(decl.asMethod).get.asInstanceOf[T])
.filter(_ != null).toSeq
}
}
}
trait FieldScope extends ScopeBase[Field[_]]
trait DbFieldScope extends ScopeBase[DbField[_, _]] {
// etc....
}
As you see from the last few lines, my use cases are limited to scope objects for specific field types. This is why I want to parameterize the scope container. If I wanted to enumerate the fields of multiple types in a single scope container, then I would have parameterized the enumerate method.

Class A cannot be cast to Class A after dynamic loading

Let's say I have:
object GLOBAL_OBJECT{
var str = ""
}
class A(_str: String){
GLOBAL_OBJECT.str = _str
}
and I would like to create 2 copies of GLOBAL_OBJECT (for tests), so I am using a different classloader to create obj2:
val obj1 = new A("1")
val class_loader = new CustomClassLoader()
val clazz = class_loader.loadClass("my.packagename.A")
val obj2 = clazz.getDeclaredConstructor(classOf[String]).newInstance("2")
println("obj1.getSecret() == " + obj1.getSecret()) // Expected: 1
println("obj2.getSecret() == " + obj2.asInstanceOf[A].getSecret()) // Expected: 2
which results in the following error:
my.packagename.A cannot be cast to my.packagename.A
IntelliJ IDEA seems to handle this correctly: I can evaluate obj2.asInstanceOf[A].getSecret() in the "expression" window during a debug session without errors.
PS. I have seen similar questions, but they all concerned loading classes from a .jar file.
You're not going to be able to get around Java's class casting, which requires the exact same class: a class with the same name loaded by a different ClassLoader is a distinct class as far as the JVM is concerned. The same goes for traits/interfaces.
However, Scala comes to the rescue with structural typing (a.k.a. duck typing, as in "it quacks like a duck"). Instead of casting the object to type A, cast it to a type that merely declares the method you want.
Here's an example of a function which uses structural typing:
def printSecret(name : String, secretive : { def getSecret : String } ) {
println(name+".getSecret = "+secretive.getSecret)
}
And here's sample usage:
printSecret("obj1", obj1) // Expected: 1
printSecret("obj2", obj2.asInstanceOf[ {def getSecret : String} ]) // Expected: 2
You could, of course, just call
println("secret: " + obj2.asInstanceOf[ {def getSecret : String} ].getSecret)
Here's full sample code that I wrote and tested.
Main code:
object TestBootstrap {
def createClassLoader() = new URLClassLoader(Array(new URL("file:///tmp/theTestCode.jar")))
}
trait TestRunner {
def runTest()
}
object RunTest extends App {
val testRunner = TestBootstrap.createClassLoader()
.loadClass("my.sample.TestCodeNotInMainClassLoader")
.newInstance()
.asInstanceOf[TestRunner]
testRunner.runTest()
}
In the separate JAR file:
object GLOBAL_OBJECT {
var str = ""
}
class A(_str: String) {
println("A classloader: "+getClass.getClassLoader)
println("GLOBAL classloader: "+GLOBAL_OBJECT.getClass.getClassLoader)
GLOBAL_OBJECT.str = _str
def getSecret : String = GLOBAL_OBJECT.str
}
class TestCodeNotInMainClassLoader extends TestRunner {
def runTest() {
println("Classloader for runTest: " + this.getClass.getClassLoader)
val obj1 = new A("1")
val classLoader1 = TestBootstrap.createClassLoader()
val clazz = classLoader1.loadClass("com.vocalabs.A")
val obj2 = clazz.getDeclaredConstructor(classOf[String]).newInstance("2")
def printSecret(name : String, secretive : { def getSecret : String } ) {
println(name+".getSecret = "+secretive.getSecret)
}
printSecret("obj1", obj1) // Expected: 1
printSecret("obj2", obj2.asInstanceOf[ {def getSecret : String} ]) // Expected: 2
}
}
Structural typing can be used for more than one method; the methods are separated by semicolons. So essentially you create an interface for A with all the methods you intend to test. For example:
type UnderTest = { def getSecret : String ; def myOtherMethod() : Unit }
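For example, a sketch of how such an alias might be used (the alias is repeated here for self-containment; on recent Scala 2 versions structural member access also wants import scala.language.reflectiveCalls, since it is implemented with reflection under the hood):
import scala.language.reflectiveCalls

type UnderTest = { def getSecret : String ; def myOtherMethod() : Unit }

def exercise(obj: UnderTest): Unit = {
  println(obj.getSecret) // dispatched reflectively, so the concrete class may come from any ClassLoader
  obj.myOtherMethod()
}

// exercise(obj2.asInstanceOf[UnderTest])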
One workaround that lets you actually call a method on the dynamically loaded object, instead of casting it, is to use reflection to look up the method on the newly loaded class and invoke it on the new instance:
val m2: java.lang.reflect.Method = obj2.getClass.getMethod("getSecret")
m2.invoke(obj2)
The class that contains the call obj2.asInstanceOf[A].getSecret() would have to be loaded by the CustomClassLoader, too.
More generally, you must not use any class that references A unless that class is loaded by the same class loader that reloaded A.
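To see why, a small check (a sketch using the names from the question) shows that the two classes only share a name:
val c1 = obj1.getClass
val c2 = obj2.getClass
println(c1.getName == c2.getName)               // true: same fully qualified name
println(c1 == c2)                               // false: two distinct Class objects...
println(c1.getClassLoader eq c2.getClassLoader) // ...because they come from different class loaders, hence the ClassCastException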

Compile String to AST inside CompilerPlugin?

I would like to create a templating plugin and, as a first step, convert an arbitrary string to its "compiled" AST representation (as the Scala interpreter does, I guess). So a compiler plugin could, e.g., assign someString to "HELLO WORLD":
#StringAnnotation("""("hello world").toString.toUpperCase""")
var someString = ""
In short, my current first-shot plugin does the following:
run after the parser phase
create a new presentation-only compiler and a VirtualFile with the annotation content
compile it and print unit.body
see: http://paste.pocoo.org/show/326025/
a)
Right now, "object o{val x = 0}" returns an AST, but e.g. "var x = 1 + 2" doesn't, because it wouldn't be a valid .scala file on its own. How can I fix this?
b)
Is onlyPresentation a good choice? Should I instead override computeInternalPhases with the appropriate phases, or use -Ystop:phase?
c)
Is it possible to bind the environment of the outer compiler to the inner one, so that e.g.
var x = _
(...)
#StringAnnotation("x += 3")
would work?
I found the following code [1], which uses an interpreter and a bound variable to do something similar:
Interpreter interpreter = new Interpreter(settings);
String[] context = { "FOO" };
interpreter.bind("context", "Array[String]", context);
interpreter
.interpret("de.tutorials.scala2.Test.main(context)");
context[0] = "BAR";
interpreter
.interpret("de.tutorials.scala2.Test.main(context)");
[1] http://www.tutorials.de/java/320639-beispiel-zur-einbindung-des-scala-interpreters-kompilierte-scala-anwendungen.html#post1653884
thanks
Complete Code:
class AnnotationsPI(val global: Global) extends Plugin {
import global._
val name = "a_plugins::AnnotationsPI" //a_ to run before namer
val description = "AST Trans PI"
val components = List[PluginComponent](Component)
private object Component extends PluginComponent with Transform with TypingTransformers with TreeDSL {
val global: AnnotationsPI.this.global.type = AnnotationsPI.this.global
val runsAfter = List[String]("parser");
val phaseName = AnnotationsPI.this.name
def newTransformer(unit: CompilationUnit) = {
new AnnotationsTransformer(unit)
}
val SaTpe = "StringAnnotation".toTypeName
class AnnotationsTransformer(unit: CompilationUnit) extends TypingTransformer(unit) {
/** When using <code>preTransform</code>, each node is
* visited before its children.
*/
def preTransform(tree: Tree): Tree = tree match {
case anno @ ValDef(Modifiers(_, _, List(Apply(Select(New(Ident(SaTpe)), _), List(Literal(Constant(a))))), _), b, c, d) => //Apply(Select(New(Ident(SaTpe)), /*nme.CONSTRUCTOR*/_), /*List(x)*/x)
val str = a.toString
val strArr = str.getBytes("UTF-8")
import scala.tools.nsc.{ Global, Settings, SubComponent }
import scala.tools.nsc.reporters.{ ConsoleReporter, Reporter }
val settings = new Settings()
val compiler = new Global(settings, new ConsoleReporter(settings)) {
override def onlyPresentation = true
}
val run = new compiler.Run
val vfName = "Script.scala"
var vfile = new scala.tools.nsc.io.VirtualFile(vfName)
val os = vfile.output
os.write(strArr, 0, str.size) // void write(byte[] b, int off, int len)
os.close
new scala.tools.nsc.util.BatchSourceFile(vfName, str)
run.compileFiles(vfile :: Nil)
for (unit <- run.units) {
println("Unit: " + unit)
println("Body:\n" + unit.body)
}
tree
case _ =>
tree
}
override def transform(tree: Tree): Tree = {
super.transform(preTransform(tree))
}
}
}
I don't know if this helps you much, but instead of fiddling with the Interpreter, you can use treeFrom(aString), which is part of the scala-refactoring project (http://scala-refactoring.org/). It doesn't answer your question about cross-bindings, though...
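On later Scala 2 versions (2.10 and up), the runtime ToolBox can also parse arbitrary statements such as var x = 1 + 2 directly, without wrapping them in a valid .scala file, and can even evaluate expressions; a sketch (requires scala-compiler/scala-reflect on the classpath):
import scala.reflect.runtime.currentMirror
import scala.tools.reflect.ToolBox

val tb = currentMirror.mkToolBox()
println(tb.parse("var x = 1 + 2"))                                   // AST of the bare statement, no enclosing object needed
println(tb.eval(tb.parse("(\"hello world\").toString.toUpperCase"))) // HELLO WORLD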