Scala: replace method at runtime

Let's say I have a class
class Original {
  def originalMethod = 1
}
Now, let's say I have an instance of it
val instance = new Original
Is it now possible to do something to instance at runtime to replace originalMethod with a different method (granted that the signature stays the same)?
For example, when calling instance.originalMethod afterwards, the following code would be executed instead: println("test"); 1
EDIT
I can't call new Original. I just have an existing instance of it, which I have to modify.
EDIT 2
(@Aleksey Izmailov's answer) This is a nice solution, but it isn't exactly what I'm looking for. I'm thinking more in terms of testing - writing a class "normally", with methods rather than functions stored in variables.
PS. I stumbled on this question when trying to mimic a Mockito spy

Seems like you have to resort to mutation, since you can't create new instances and the behavior has to change. Here is perhaps the simplest option (REPL):
Store a function as a mutable field:
scala> class Original {
| var originalMethod = () => 1
| }
defined class Original
scala> val obj = new Original
obj: Original = Original@66ac5762
It's a function that takes nothing, and you need to call apply or () on it to get the result.
scala> obj.originalMethod
res0: () => Int = <function0>
scala> obj.originalMethod()
res1: Int = 1
Replace it with a new function:
scala> obj.originalMethod = () => 2
obj.originalMethod: () => Int = <function0>
Now you are getting new behaviour:
scala> obj.originalMethod()
res2: Int = 2
Interestingly enough, you can't have a default implementation in a generic version of this class, because there is no default value you could use, unless you change the result type to Unit or use a partial function.
Here is a generic version of it:
scala> class Original[T] {
| var originalMethod: () => T = null
| }
defined class Original
scala> val genImpl = new Original[Int] { originalMethod = () => 111 }
genImpl: Original[Int] = $anon$1@6b04acb2
scala> genImpl.originalMethod()
res8: Int = 111
scala> genImpl.originalMethod = () => 222
genImpl.originalMethod: () => Int = <function0>
scala> genImpl.originalMethod()
res9: Int = 222
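A minimal sketch of the Unit alternative mentioned above (the class name is made up): a do-nothing default works because Unit has an obvious "nothing to return" value:

class OriginalUnit {
  var originalMethod: () => Unit = () => ()   // harmless default: do nothing
}

val u = new OriginalUnit
u.originalMethod()                            // no effect
u.originalMethod = () => println("test")      // swap in new behaviour at runtime
u.originalMethod()                            // prints "test"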

This isn't possible on the JVM, in either Java or Scala, at least not in the way that you're asking for.
See e.g. In Java, given an object, is it possible to override one of the methods?
Being in Scala instead of Java doesn't gain you additional leverage, since the fundamentals of how classes and methods work in Scala are the same as Java's. (This was done for both performance and interop reasons.)
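For illustration, a small sketch of why this is so (this is not a workaround for the existing-instance case in the question): behaviour can only be chosen when the object is constructed, for example via an anonymous subclass, and an instance that already exists keeps whatever its class defines:

val patched = new Original {
  override def originalMethod = { println("test"); 1 }
}
patched.originalMethod   // prints "test", then returns 1

val instance = new Original
instance.originalMethod  // still plain 1; there is no way to rebind the method on this instance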

Related

Eta-expansion on non-methods works for fields but not for local variables

The following code is pretty much self-explanatory:
class EtaExpansionOnNonMethods { // or object
  val zero = 0
  val zeroEta = zero _ // compiles: () => Int
  def f {
    val one = 1
    val oneEta = one _ // compilation error
  }
}
Error:(7, 18) _ must follow method; cannot follow Int
val oneEta = one _
^
Why is eta-expansion on an e.g. Int field allowed (resulting in () => Int) but not on an Int local variable (resulting in a compilation error)? I'm using version 2.11.7.
That's because val members are actually compiled down to getter-like methods; for example, running javap on the EtaExpansionOnNonMethods.class that you get from scalac gives you:
E:\EtaExp>"C:\Program Files\Java\jdk1.8.0_51\bin\javap" EtaExpansionOnNonMethods.class
Compiled from "EtaExp.scala"
public class EtaExpansionOnNonMethods {
public int zero();
public EtaExpansionOnNonMethods();
}
Notice that if you were to declare the member as private[this] val zero = 0, which is compiled down to a final field, you'd get the exact same error you get when trying to eta-expand a local variable or value.
In the end, the general premise still holds: you can only use eta-expansion on methods, even when those methods are not explicitly declared as such. :)
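A minimal sketch of the private[this] case described above (the class name is made up); the failing line is left commented out so the snippet compiles:

class EtaExpansionPrivateThis {
  private[this] val zero = 0   // compiled down to a plain field, no accessor method
  // val zeroEta = zero _      // fails: "_ must follow method; cannot follow Int"
}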

Nulls in Scala... why is this possible?

I was coding in Scala and doing some quick refactoring in Intellij, when I stumbled upon the following piece of weirdness...
package misc

/**
 * Created by abimbola on 05/10/15.
 */
object WTF extends App {
  val name: String = name
  println(s"Value is: $name")
}
I then noticed that the compiler didn't complain, so I decided to attempt to run this and I got a very interesting output
Value is: null
Process finished with exit code 0
Can anyone tell me why this works?
EDIT:
First problem: the value name is assigned a reference to itself even though it does not exist yet; why exactly does the Scala compiler not explode with errors?
Why is the value of the assignment null?
1.) Why does the compiler not explode
Here is a reduced example. This compiles because, with the type given, a default value can be used:
class Example { val x: Int = x }
scalac Example.scala
Example.scala:1: warning: value x in class Example does nothing other than call itself recursively
class Example { val x: Int = x }
This does not compile because the result type, and hence a default value, cannot be inferred:
class ExampleDoesNotCompile { def x = x }
scalac ExampleDoesNotCompile.scala
ExampleDoesNotCompile.scala:1: error: recursive method x needs result type
class ExampleDoesNotCompile { def x = x }
1.1 What happens here
My interpretation (so beware): the uniform access principle kicks in.
The assignment to the val x calls the accessor x(), which returns the uninitialized value of x.
So x is set to the default value.
class Example { val x: Int = x }

[[syntax trees at end of cleanup]] // Example.scala
package <empty> {
  class Example extends Object {
    private[this] val x: Int = _;
    <stable> <accessor> def x(): Int = Example.this.x;
    def <init>(): Example = {
      Example.super.<init>();
      Example.this.x = Example.this.x();
      ()
    }
  }
}
2.) Why the value is null
The default values are determined by the platform Scala compiles to.
In your example it looks like you are running on the JVM, where the default value for Object is null.
So when you do not provide a value, the default value is used as a fallback.
Default values JVM:
byte 0
short 0
int 0
long 0L
float 0.0f
double 0.0d
char '\u0000'
boolean false
Object null // Strings are objects.
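The = _ in the compiler output above is exactly this default initialization; as a small sketch (the class name is made up), you can also request it explicitly for vars:

class Defaults {
  var i: Int = _       // 0
  var l: Long = _      // 0L
  var b: Boolean = _   // false
  var c: Char = _      // the zero character
  var s: String = _    // null (String is an object)
}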
Also, the default value is a valid value for the given type:
Here is an example in the REPL:
scala> val x : Int = 0
x: Int = 0
scala> val x : Int = null
<console>:10: error: an expression of type Null is ineligible for implicit conversion
val x : Int = null
^
scala> val x : String = null
x: String = null
why exactly does the Scala compiler not explode with errors?
Because this problem can't be solved in the general case. Do you know the halting problem? The halting problem says that it is not possible to write an algorithm that decides whether a program ever halts. Since the problem of finding out whether a recursive definition would result in a null assignment can be reduced to the halting problem, it is not solvable either.
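For a flavour of why this is hard in general, here is a hedged sketch (the class is made up) where whether the self-reference is ever evaluated depends on a runtime value, so no static check could decide it:

class Tricky(flag: Boolean) {
  // If flag is true, x never reads itself; if it is false, x reads its own
  // still-uninitialized value (null) and ends up as "null!".
  val x: String = if (flag) "ok" else x + "!"
}

new Tricky(true).x    // "ok"
new Tricky(false).x   // "null!"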
Now, it would be quite easy to forbid recursive definitions entirely; this is, for example, done for values that are not class members:
scala> def f = { val k: String = k+"abc" }
<console>:11: error: forward reference extends over definition of value k
def f = { val k: String = k+"abc" }
^
For class values this is not forbidden, for a few reasons:
Their scope is not limited
The JVM initializes them with a default value (which is null for reference types).
Recursive values are useful
Your use case is trivial, as is this:
scala> val k: String = k+"abc"
k: String = nullabc
But what about this:
scala> object X { val x: Int = Y.y+1 }; object Y { val y: Int = X.x+1 }
defined object X
defined object Y
scala> X.x
res2: Int = 2
scala> Y.y
res3: Int = 1
scala> object X { val x: Int = Y.y+1 }; object Y { val y: Int = X.x+1 }
defined object X
defined object Y
scala> Y.y
res4: Int = 2
scala> X.x
res5: Int = 1
Or this:
scala> val f: Stream[BigInt] = 1 #:: 1 #:: f.zip(f.tail).map { case (a,b) => a+b }
f: Stream[BigInt] = Stream(1, ?)
scala> f.take(10).toList
res7: List[BigInt] = List(1, 1, 2, 3, 5, 8, 13, 21, 34, 55)
As you can see, it is quite easy to write programs where it is no longer obvious what value they will produce. And since the halting problem is not solvable, we cannot let the compiler do this work for us in non-trivial cases.
This also means that trivial cases, like the one shown in your question, could be hardcoded in the compiler. But since there can't exist an algorithm that detects all possible trivial cases, every case that is ever found would need to be hardcoded in the compiler (not to mention that a definition of a "trivial case" does not exist). Therefore it wouldn't be wise to even start hardcoding some of these cases. It would ultimately result in a slower compiler that is more difficult to maintain.
One could argue that for a use case that burns every other user it would be wise to at least hardcode such an extreme scenario. On the other hand, some people just need to be burned in order to learn something new. ;)
I think @Andreas' answer already has the necessary info. I'll just try to provide additional explanation:
When you write val name: String = name at the class level, this does a few different things at the same time:
create the field name
create the getter name()
create code for the assignment name = name, which becomes part of the primary constructor
This is what's made explicit in Andreas' section 1.1:
package <empty> {
  class Example extends Object {
    private[this] val x: Int = _;
    <stable> <accessor> def x(): Int = Example.this.x;
    def <init>(): Example = {
      Example.super.<init>();
      Example.this.x = Example.this.x();
      ()
    }
  }
}
The syntax is not Scala, it is (as suggested by [[syntax trees at end of cleanup]]) a textual representation of what the compiler will later convert into bytecode. Some unfamiliar syntax aside, we can interpret this, like the JVM would:
the JVM creates an object. At this point, all fields have default values. val x: Int = _; is like int x; in Java, i.e. the JVM's default value is used, which is 0 for I (i.e. int in Java, or Int in Scala)
the constructor is called for the object
(the super constructor is called)
the constructor calls x()
x() returns x, which is 0
x is assigned 0
the constructor returns
As you can see, after the initial parsing step there is nothing in the syntax tree that seems immediately wrong, even though the original source code looks wrong. I wouldn't say that this is the behavior I expect, so I would imagine one of three things:
Either, the Scala devs saw it as too intricate to recognize and forbid
or, it's a regression and simply wasn't found as a bug
or, it's a "feature" and there is legitimate need for this behavior
(ordering reflects my opinion of likeliness, in decreasing order)

Getting a null with a val depending on abstract def in a trait [duplicate]

I'm seeing some initialization weirdness when mixing vals and defs in my trait. The situation can be summarized with the following example.
I have a trait which provides an abstract field, let's call it fruit, which should be implemented in child classes. It also uses that field in a val:
scala> class FruitTreeDescriptor(fruit: String) {
| def describe = s"This tree has loads of ${fruit}s"
| }
defined class FruitTreeDescriptor
scala> trait FruitTree {
| def fruit: String
| val descriptor = new FruitTreeDescriptor(fruit)
| }
defined trait FruitTree
When overriding fruit with a def, things work as expected:
scala> object AppleTree extends FruitTree {
| def fruit = "apple"
| }
defined object AppleTree
scala> AppleTree.descriptor.describe
res1: String = This tree has loads of apples
However, if I override fruit using a val...
scala> object BananaTree extends FruitTree {
| val fruit = "banana"
| }
defined object BananaTree
scala> BananaTree.descriptor.describe
res2: String = This tree has loads of nulls
What's going on here?
In simple terms, at the point you're calling:
val descriptor = new FruitTreeDescriptor(fruit)
the constructor for BananaTree has not been given the chance to run yet. This means the value of fruit is still null, even though it's a val.
This is a special case of the well-known quirk of val initialization order, which can be illustrated with a simpler example:
class A {
  val x = a
  val a = "String"
}
scala> new A().x
res1: String = null
(Although thankfully, in this particular case, the compiler will detect something being afoot and will present a warning.)
To avoid the problem, declare fruit as a lazy val, so it is only evaluated on first access.
The problem is the initialization order. val fruit = ... is being initialized after val descriptor = ..., so at the point when descriptor is being initialized, fruit is still null. You can fix this by making fruit a lazy val, because then it will be initialized on first access.
Your descriptor field is initialized earlier than the fruit field, because the trait is initialized before the class that extends it. null is a field's value before initialization - that's why you get it. In the def case it's just a method call instead of an access to a field, so everything is fine (a method's body may be executed several times; there is no initialization involved). See http://docs.scala-lang.org/tutorials/FAQ/initialization-order.html
Why is def so different? Because a def may be called several times, while a val is evaluated only once (so its first and only evaluation is in fact the initialization of the field).
The typical solution to such a problem is to use a lazy val instead; it will be initialized when you actually need it. Another solution is early initializers.
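A minimal sketch of the two fixes just mentioned, applied to the example from the question (the object names are made up; note that early initializers are deprecated in newer Scala versions):

// Fix 1: make the trait's val lazy, so it is evaluated only on first access,
// i.e. after the constructor of the object has assigned fruit.
trait FruitTreeLazy {
  def fruit: String
  lazy val descriptor = new FruitTreeDescriptor(fruit)
}
object BananaTreeLazy extends FruitTreeLazy {
  val fruit = "banana"
}
// BananaTreeLazy.descriptor.describe   // "This tree has loads of bananas"

// Fix 2: an early initializer, so fruit is assigned before the trait's body runs.
object BananaTreeEarly extends { val fruit = "banana" } with FruitTree
// BananaTreeEarly.descriptor.describe  // "This tree has loads of bananas"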
Another, simpler example of what's going on:
scala> class A {val a = b; val b = 5}
<console>:7: warning: Reference to uninitialized value b
class A {val a = b; val b = 5}
^
defined class A
scala> (new A).a
res2: Int = 0 //null
Speaking more generally, Scala could theoretically analyze the dependency graph between fields (which field needs which other field) and start initialization from the leaf nodes. But in practice every module is compiled separately, and the compiler might not even know those dependencies (it might even be Java calling Scala calling Java), so it just does sequential initialization.
So, because of that, it can't even detect simple loops:
scala> class A {val a: Int = b; val b: Int = a}
<console>:7: warning: Reference to uninitialized value b
class A {val a: Int = b; val b: Int = a}
^
defined class A
scala> (new A).a
res4: Int = 0
scala> class A {lazy val a: Int = b; lazy val b: Int = a}
defined class A
scala> (new A).a
java.lang.StackOverflowError
Actually, such a loop (inside one module) could theoretically be detected in a separate build step, but it wouldn't help much, as such cases are usually pretty obvious.

Make a Scala interpreter oblivious between interpret calls

Is it possible to configure a Scala interpreter (tools.nsc.IMain) so that it "forgets" the previously executed code, whenever I run the next interpret() call?
Normally when it compiles the sources, it wraps them in nested objects, so all the previously defined variables, functions and bindings are available.
It would suffice to not generate the nested objects (or to throw them away), although I would prefer a solution which would even remove the previously compiled classes from the class loader again.
Is there a setting, or a method, or something I can overwrite, or an alternative to IMain that would accomplish this? I need to be able to still access the resulting objects / classes from the host VM.
Basically I want to isolate subsequent interpret() calls without something as heavyweight as creating a new IMain for each iteration.
Here is one possible answer. Basically there is a method reset() which calls the following things (mostly private, so you either buy the whole package or not):
clearExecutionWrapper()
resetClassLoader()
resetAllCreators()
prevRequests.clear()
referencedNameMap.clear()
definedNameMap.clear()
virtualDirectory.clear()
In my case, I am using a custom execution wrapper, so that needs to be set up again; imports are also handled through a regular interpret cycle, so either add them again or, better, just prepend them via the execution wrapper.
I would like to keep my bindings, but they are also gone:
import tools.nsc._
import interpreter.IMain

object Test {
  private final class Intp(cset: nsc.Settings)
    extends IMain(cset, new NewLinePrintWriter(new ConsoleWriter, autoFlush = true)) {
    override protected def parentClassLoader = Test.getClass.getClassLoader
  }

  object Foo {
    def bar() { println("BAR") }
  }

  def run() {
    val cset = new nsc.Settings()
    cset.classpath.value += java.io.File.pathSeparator + sys.props("java.class.path")
    val i = new Intp(cset)
    i.initializeSynchronous()
    i.bind[Foo.type]("foo", Foo)
    val res0 = i.interpret("foo.bar(); val x = 33")
    println(s"res0: $res0")
    i.reset()
    val res1 = i.interpret("println(x)")
    println(s"res1: $res1")
    i.reset()
    val res2 = i.interpret("foo.bar()")
    println(s"res2: $res2")
  }
}
This will find Foo in the first iteration, correctly forget x in the second iteration, but then in the third iteration, it can be seen that the foo binding is also lost:
foo: Test.Foo.type = Test$Foo$@8bf223
BAR
x: Int = 33
res0: Success
<console>:8: error: not found: value x
println(x)
^
res1: Error
<console>:8: error: not found: value foo
foo.bar()
^
res2: Error
The following seems to be fine:
for (j <- 0 until 3) {
  val user = "foo.bar()"
  val synth = """import Test.{Foo => foo}
""".stripMargin + user
  val res = i.interpret(synth)
  println(s"res$j: $res")
  i.reset()
}
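If the goal is just to keep the foo binding across resets, a hedged variation of the same idea is to re-establish the binding after each reset instead of prepending an import (using only the bind/interpret/reset calls that already appear in the example above):

for (j <- 0 until 3) {
  i.bind[Foo.type]("foo", Foo)   // re-establish the binding that reset() discarded
  val res = i.interpret("foo.bar()")
  println(s"res$j: $res")
  i.reset()
}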

JUnit Theories and Scala

I'm looking for a way to test my Scala code with multiple inputs. Coming from Java/JUnit, I immediately thought of @RunWith(Theories.class).
Where I'm stuck is the usage of @DataPoints and the absence of static members/methods in Scala. So is there a way to write the following code in Scala?
@RunWith(classOf[Theories])
class ScalaTheory {
  @DataPoints
  val numbers = Array(1, 2, 3)
  @Theory
  def shouldMultiplyByTwo(number : Int) = {
    // Given
    val testObject = ObjectUnderTest
    // When
    val result = testObject.run(number)
    // Then
    assertTrue(result == number * 2)
  }
}
I'm neither fixed on JUnit nor Theories so if there is something Scala-specific for this use case, I'm happy to use that.
To make this work, you have to do two things: firstly, use a method [see edit below], not a value, and secondly, define your @DataPoints in a companion object. The following should work:
object ScalaTheory {
  @DataPoints
  def numbers() = Array(1, 2, 3) // note def, not val
}
@RunWith(classOf[Theories])
class ScalaTheory {
  @Theory
  def shouldMultiplyByTwo(number : Int) = {
    // Given
    val testObject = ObjectUnderTest
    // When
    val result = testObject.run(number)
    // Then
    assertTrue(result == number * 2)
  }
}
When you define methods or fields in a companion object in Scala, you get a static forwarder in the class. Decompiling using JAD:
@DataPoints
public static final int[] numbers()
{
    return ScalaTheory$.MODULE$.numbers();
}
So this takes care of the static problem. However, when we use a val numbers = ..., the annotation ends up on the underlying field and isn't carried over to the generated methods, whereas for a def it is. So using def works.
As the others have said, if you're developing from scratch, it may be worth starting with a Scala framework like scalatest. The tool integration with scalatest is improving (e.g. Maven, Eclipse, IntelliJ), but it's not at the level of JUnit's, so evaluate it for your project before you start.
EDIT: In fact, after this discussion on scala-user, you can use a val, but you need to tell the Scala compiler to apply the @DataPoints annotation to the static forwarder:
object ScalaTheory {
  @(DataPoints @scala.annotation.target.getter)
  val numbers = Array(1, 2, 3)
}
The getter annotation says that the @DataPoints annotation should be applied to the accessor method for the numbers field, that is, the numbers() method created by the compiler. See the package scala.annotation.target.
I expect that what you want is possible with every Scala testing framework. I'm only familiar with Specs2, so here's my shot:
class DataPoints extends Specification {
  val objectUnderTest: Int => Int = _ + 2
  val testCases = 1 :: 2 :: 3 :: 4 :: Nil
  def is: Fragments =
    (objectUnderTest must multiplyByTwo((_: Int))).foreach(testCases)
  def multiplyByTwo(i: Int): Matcher[(Int) => Int] =
    (=== (i * 2)) ^^
      ((f: Int => Int) => f(i) aka "result of applying %s to %d".format(f, i))
}
And the output is:
result of applying <function1> to 1 '3' is not equal to '2'; result of applying <function1> to 3 '5' is not equal to '6'; result of applying <function1> to 4 '6' is not equal to '8'
Disclaimer
I do not state that this is very readable. Also I'm not an expert Specs2 user.
Like ziggystar, I can't really help you with your direct question, but I strongly recommend switching to a Scala testing framework. My personal favorite is scalatest.
In the last example here: http://blog.schauderhaft.de/2011/01/16/more-on-testing-with-scalatest/ I demonstrate how simple and straightforward it is to write a test that runs with multiple inputs. It is just a simple loop!
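For completeness, a minimal sketch of that loop style in scalatest (the suite name is made up, the import is the scalatest 3.x location, and ObjectUnderTest is taken from the question):

import org.scalatest.funsuite.AnyFunSuite   // older scalatest versions: org.scalatest.FunSuite

class MultiplyByTwoSuite extends AnyFunSuite {
  val testObject = ObjectUnderTest          // assumed to exist, as in the question
  for (number <- Seq(1, 2, 3)) {            // registers one test per input
    test(s"should multiply $number by two") {
      assert(testObject.run(number) == number * 2)
    }
  }
}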