Proxy class support - scala

Is there a Scala way to generate proxy classes at runtime? Not DynamicProxy, but runtime types that extend a provided class/interface and pass all calls through a provided callback.
The Java world uses cglib/Javassist for that, but what is the best way to proxy in Scala?
def makeProxy[T](interceptor: Interceptor)(implicit baseClass: Manifest[T]): T
Google says Scala macros can be used for this but I am unsure how.

Here is an example of how (more or less) to do something like that with a macro:
import scala.language.experimental.macros
import scala.reflect.macros.whitebox.Context

def testImpl[T: c.WeakTypeTag](c: Context): c.Expr[Any] = {
  import c.universe._
  val className = newTypeName(weakTypeTag[T].tpe.typeSymbol.name.toString) // not the best way
  val m = weakTypeOf[T].declarations.iterator.toList.map(_.asMethod) // `declarations` takes only the current class's members; `members` also takes inherited ones
    .filter(m => !m.isConstructor && !m.isFinal).map { m => // m carries all reflection info about the method
      q"""override def ${m.name} = 9""" // generate the overriding method
    }
  c.Expr { q"""new $className { ..$m }""" }
}

def t[T] = macro testImpl[T]

class Aaa { def a = 7; def b = 8 }
scala> t[Aaa].a
res39: Int = 9
scala> t[Aaa].b
res40: Int = 9
Such macros work only if the overridden methods are not final, since a macro can't change existing types (it works at compile time); it can only create new types that inherit from them. This example doesn't handle classes with non-empty constructors and many other cases. m here is an instance of MethodSymbol and gives you full Scala-style reflection information about the input class's method. You only need to generate the correct AST in response.
To read more about that:
macros: http://docs.scala-lang.org/overviews/macros/overview.html
Scala's runtime reflection: http://docs.scala-lang.org/overviews/reflection/environment-universes-mirrors.html
quasiquotes: http://docs.scala-lang.org/overviews/quasiquotes/expression-details.html
Another solution would be:
scala> def getClasss[T: ClassTag] = classTag[T].runtimeClass
getClasss: [T](implicit evidence$1: scala.reflect.ClassTag[T])Class[_]
Using this Class instance you can apply any ASM/cglib/Javassist tooling, or even DynamicProxy, to it.
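For the interface case, here is a minimal sketch of that last route using plain java.lang.reflect.Proxy (the makeProxy name and the callback signature are just illustrative assumptions, not an existing API):
import java.lang.reflect.{InvocationHandler, Method, Proxy}
import scala.reflect.{ClassTag, classTag}

// Minimal sketch: only works when T is a (Java) interface. Every call on the
// returned instance is routed through the provided callback.
def makeProxy[T: ClassTag](callback: (Method, Seq[Any]) => Any): T = {
  val clazz = classTag[T].runtimeClass
  val handler = new InvocationHandler {
    def invoke(proxy: AnyRef, method: Method, args: Array[AnyRef]): AnyRef = {
      val arguments = if (args == null) Seq.empty[Any] else args.toSeq
      callback(method, arguments).asInstanceOf[AnyRef]
    }
  }
  Proxy.newProxyInstance(clazz.getClassLoader, Array(clazz), handler).asInstanceOf[T]
}
For proxying concrete classes rather than interfaces, cglib's Enhancer or Javassist's ProxyFactory would play the analogous role on the Class instance obtained above.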

Related

Is it possible to call a scala macro from generic scala code?

I'm trying to use Scala macros to convert untyped, Map[String, Any]-like expressions to their corresponding typed case class expressions.
The following scala macro (almost) gets the job done:
trait ToTyped[+T] {
  def apply(term: Any): T
}

import scala.language.experimental.macros
import scala.reflect.macros.blackbox.Context

object TypeConversions {
  // At compile time, "type-check" an untyped expression and convert it to
  // its appropriate typed value.
  def toTyped[T]: ToTyped[T] = macro toTypedImpl[T]

  def toTypedImpl[T: c.WeakTypeTag](c: Context): c.Expr[ToTyped[T]] = {
    import c.universe._
    val tpe = weakTypeOf[T]
    if (tpe <:< typeOf[Int] || tpe <:< typeOf[String]) {
      c.Expr[ToTyped[T]](
        q"""new ToTyped[$tpe] {
              def apply(term: Any): $tpe = term.asInstanceOf[$tpe]
            }""")
    } else {
      val companion = tpe.typeSymbol.companion
      val maybeConstructor = tpe.decls.collectFirst {
        case m: MethodSymbol if m.isPrimaryConstructor => m
      }
      val constructorFields = maybeConstructor.get.paramLists.head
      val subASTs = constructorFields.map { field =>
        val fieldName = field.asTerm.name
        val fieldDecodedName = fieldName.toString
        val fieldType = tpe.decl(fieldName).typeSignature
        q"""
          val subTerm = term.asInstanceOf[Map[String, Any]]($fieldDecodedName)
          TypeConversions.toTyped[$fieldType](subTerm)
        """
      }
      c.Expr[ToTyped[T]](
        q"""new ToTyped[$tpe] {
              def apply(term: Any): $tpe = $companion(..$subASTs)
            }""")
    }
  }
}
Using the above toTyped function, I can convert for example an untyped person value to its corresponding typed Person case class:
object TypeConversionTests {
  case class Person(name: String, age: Int, address: Address)
  case class Address(street: String, num: Int, zip: Int)

  val untypedPerson = Map(
    "name" -> "Max",
    "age" -> 27,
    "address" -> Map("street" -> "Palm Street", "num" -> 7, "zip" -> 12345))

  val typedPerson = TypeConversions.toTyped[Person](untypedPerson)
  typedPerson shouldEqual Person("Max", 27, Address("Palm Street", 7, 12345))
}
However, my problem arises when trying to use the toTyped macro from above in generic scala code. Suppose I have a generic function indirection that uses the toTyped macro:
object CanIUseScalaMacrosAndGenerics {
  def indirection[T](value: Any): T = TypeConversions.toTyped[T](value)

  import TypeConversionTests._
  val indirectlyTyped = indirection[Person](untypedPerson)
  indirectlyTyped shouldEqual Person("Max", 27, Address("Palm Street", 7, 12345))
}
Here, I get a compile-time error from the toTyped macro complaining that the type T is not yet instantiated with a concrete type. I think the reason for the error is that from the perspective of toTyped inside indirection, the type T is still generic and not inferred to be Person just yet. And therefore the macro cannot build the corresponding Person case class when called via indirection. However, from the perspective of the call-site indirection[Person](untypedPerson), we have T == Person, so I wonder if there is a way to obtain the instantiated type of T (i.e., Person) inside the macro toTyped.
Put differently: Can I combine the Scala macro toTyped with the generic function indirection and yet be able to figure out the instantiated type for type parameter T inside the toTyped macro? Or am I on a hopeless track here and there is no way to combine Scala macros and generics like this? In the latter case I would like to know if the only solution here is to push the macro usage so far "out" that I can call it instantiated as toTyped[Person] rather than as toTyped[T].
Any insights are very much appreciated. Thank you! :-)
Macros need to be expanded. Every time you use a function whose body is a macro, Scala has to generate the code and put it there. As you suspect, this is very specific and contradicts the idea of parametric polymorphism, where you write code independent of specific knowledge about your type.
Type classes are one solution to the general problem where you want a single generic (parametric) definition and multiple per-type implementations of certain parts of your algorithm. You basically define something you could consider an interface which (most likely) needs to follow some contract (speaking in OOP terminology) and pass this interface around as an argument:
// example
trait SpecificPerType[T] {
  def doSomethingSpecific(t: T): String
}

val specificForString: SpecificPerType[String] = new SpecificPerType[String] {
  def doSomethingSpecific(t: String): String = s"MyString: $t"
}

val specificForInt: SpecificPerType[Int] = new SpecificPerType[Int] {
  def doSomethingSpecific(t: Int): String = s"MyInt: $t"
}

def genericAlgorithm[T](values: List[T])(specific: SpecificPerType[T]): String =
  values.map(specific.doSomethingSpecific).mkString("\n")

genericAlgorithm(List(1,2,3))(specificForInt)
genericAlgorithm(List("a","b","c"))(specificForString)
As you can see, it would be pretty annoying to pass this specific part around, which is one of the reasons implicits were introduced.
So you could write it using implicits like this:
implicit val specificForString: SpecificPerType[String] = new SpecificPerType[String] {
  def doSomethingSpecific(t: String): String = s"MyString: $t"
}

implicit val specificForInt: SpecificPerType[Int] = new SpecificPerType[Int] {
  def doSomethingSpecific(t: Int): String = s"MyInt: $t"
}

def genericAlgorithm[T](values: List[T])(implicit specific: SpecificPerType[T]): String =
  values.map(specific.doSomethingSpecific).mkString("\n")

/* For implicits with one type parameter there exists a special syntax
   allowing you to express them as if they were type constraints, e.g.:

   def genericAlgorithm[T: SpecificPerType](values: List[T]): String =
     values.map(implicitly[SpecificPerType[T]].doSomethingSpecific).mkString("\n")

   implicitly[SpecificPerType[T]] is a summoning that lets you access an implicit
   by its type, rather than by its variable's name.
*/

genericAlgorithm(List(1,2,3))       // finds specificForInt using its type
genericAlgorithm(List("a","b","c")) // finds specificForString using its type
If you generate that trait implementation using a macro, you will be able to have a generic algorithm, e.g.:
implicit def generate[T]: SpecificPerType[T] =
  macro SpecificPerTypeMacros.impl // assuming that you defined this macro there
As far as I can tell, this pattern (extracting macros into type classes) is one of the common ways of generating some code with macros while still building the logic on top of it using normal, parametric code.
(Just to be clear: I do not claim that the role of type classes is limited to being carriers of macro-generated code.)
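Applied to the question, a sketch of this pattern might look like the following (assuming the ToTyped trait and toTypedImpl macro from the question; the materialize name and its placement in a ToTyped companion object are my own choices):
import scala.language.experimental.macros

object ToTyped {
  // Implicit macro materializer: found via the implicit scope of ToTyped[T]
  // and expanded at the call site, where T is already a concrete type.
  implicit def materialize[T]: ToTyped[T] = macro TypeConversions.toTypedImpl[T]
}

// `indirection` no longer calls the macro itself; it just demands the type class.
def indirection[T](value: Any)(implicit toTyped: ToTyped[T]): T = toTyped(value)

// indirection[Person](untypedPerson) now expands toTypedImpl with T = Person.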

Way to enhance a class with function delegation

I have the following classes in Scala:
class A {
  def doSomething() = ???
  def doOtherThing() = ???
}

class B {
  val a: A
  // need to enhance the class with both functions doSomething() and doOtherThing(), delegating to A:
  // def doSomething() = a.doSomething()
  // def doOtherThing() = a.doOtherThing()
}
I need a way to enhance class B at compile time with the same function signatures as A, simply delegating to A when they are invoked on B.
Is there a nice way to do this in Scala?
Thank you.
In Dotty (and in the future Scala 3), it's now available simply as
class B {
  val a: A
  export a._
}
Or export a.{doSomething, doOtherThing}.
For Scala 2, there is unfortunately no built-in solution. As Tim says, you can make one, but you need to decide how much effort you are willing to spend and what exactly to support.
You can avoid repeating the function signatures by making an alias for each function:
val doSomething = a.doSomething _
val doOtherThing = a.doOtherThing _
However these are now function values rather than methods, which may or may not be relevant depending on usage.
It might be possible to use a trait or a macro-based solution, but that depends on the details of why delegation is being used.
Implicit conversion could be used for delegation, like so:
import scala.language.implicitConversions

object Hello extends App {
  class A {
    def doSomething() = "A.doSomething"
    def doOtherThing() = "A.doOtherThing"
  }

  class B {
    val a: A = new A
  }

  implicit def delegateToA(b: B): A = b.a

  val b = new B
  b.doSomething() // A.doSomething
}
There is this macro, delegate-macro, which might be just what you are looking for. Its objective is to automatically implement the delegate/proxy pattern, so in your example your class B must extend class A.
It is cross-compiled against 2.11, 2.12, and 2.13. For 2.11 and 2.12 you have to use the macro paradise compiler plugin to make it work. For 2.13, you need to use the flag -Ymacro-annotations instead.
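A rough build.sbt sketch of that setup (the delegate-macro dependency coordinates themselves are omitted here; check the library's README for them):
// For Scala 2.11 / 2.12 only: enable the macro paradise compiler plugin
addCompilerPlugin("org.scalamacros" % "paradise" % "2.1.1" cross CrossVersion.full)

// For Scala 2.13 only: macro annotations are built into the compiler
scalacOptions += "-Ymacro-annotations"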
Use it like this:
trait Connection {
  def method1(a: String): String
  def method2(a: String): String
  // 96 other abstract methods
  def method100(a: String): String
}

@Delegate
class MyConnection(delegatee: Connection) extends Connection {
  def method10(a: String): String = "Only method I want to implement manually"
}

// The source code above would be equivalent, after the macro expansion, to the code below
class MyConnection(delegatee: Connection) extends Connection {
  def method1(a: String): String = delegatee.method1(a)
  def method2(a: String): String = delegatee.method2(a)
  def method10(a: String): String = "Only method I want to implement manually"
  // 96 other methods that are proxied to the dependency delegatee
  def method100(a: String): String = delegatee.method100(a)
}
It should work in most scenarios, including when type parameters and multiple argument lists are involved.
Disclaimer: I am the creator of the macro.

How can I make a function generic on an MLReader

I am working in Spark 1.6.3. Here are two functions that do the same thing:
def modelFromBytesCV(modelArray: Array[Byte]): CountVectorizerModel = {
  val tempPath: Path = KAZOO_TEMP_DIR.resolve(s"model_${System.currentTimeMillis()}")
  Files.write(tempPath, modelArray)
  CountVectorizerModel.read.load(tempPath.toString)
}

def modelFromBytesIDF(modelArray: Array[Byte]): IDFModel = {
  val tempPath: Path = KAZOO_TEMP_DIR.resolve(s"model_${System.currentTimeMillis()}")
  Files.write(tempPath, modelArray)
  IDFModel.read.load(tempPath.toString)
}
I would like to make these functions generic. What I am hung up on is that the common trait between the CountVectorizerModel and IDFModel companion objects is MLReadable[T], which itself must take as its type parameter either CountVectorizerModel or IDFModel. This is sort of a recursive parent-class loop that I can't figure out a solution to.
By comparison, the generic model writer is easy, because MLWritable is a common trait extended by all the models I am interested in:
def modelToBytes[M <: MLWritable](model: M): Array[Byte] = {
  val tempPath: Path = KAZOO_TEMP_DIR.resolve(s"model_${System.currentTimeMillis()}")
  model.write.overwrite().save(tempPath.toString)
  Files.readAllBytes(tempPath)
}
How can I make a generic reader that will turn a byte array back into a spark-ml model?
To make it work you'll need access to a specific MLReadable object.
import org.apache.spark.ml.util.MLReadable

def modelFromBytes[M](obj: MLReadable[M], modelArray: Array[Byte]): M = {
  val tempPath: Path = ???
  ...
  obj.read.load(tempPath.toString)
}
which could be later used as:
val bytes: Array[Byte] = ???
modelFromBytes(CountVectorizerModel, bytes)
Note that, despite first appearances, there is nothing recursive here - MLReadable[M] refers to the companion object, not the class as such. So for example the CountVectorizerModel object is MLReadable, while the CountVectorizerModel class isn't.
Internally, Spark MLReader handles this in a different way - it creates an instance of the class using reflection, and then sets its Params. However this path won't be very useful for you here*.
If compatibility with the current API is required, you can try making the readable object implicit:
def modelFromBytes[M](modelArray: Array[Byte])(implicit obj: MLReadable[M]): M = {
...
}
and then
implicit val readable: MLReadable[CountVectorizerModel] = CountVectorizerModel
modelFromBytes[CountVectorizerModel](bytes)
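Putting the pieces together with the temp-file logic from the question, the implicit-based version might look roughly like this (KAZOO_TEMP_DIR is the directory from the question):
import java.nio.file.{Files, Path}
import org.apache.spark.ml.util.MLReadable

def modelFromBytes[M](modelArray: Array[Byte])(implicit obj: MLReadable[M]): M = {
  val tempPath: Path = KAZOO_TEMP_DIR.resolve(s"model_${System.currentTimeMillis()}")
  Files.write(tempPath, modelArray)
  obj.read.load(tempPath.toString)
}

// implicit val readable: MLReadable[CountVectorizerModel] = CountVectorizerModel
// val model = modelFromBytes[CountVectorizerModel](bytes)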
* Technically speaking, it is possible to get the companion object via reflection:
def modelFromBytesCV[M <: MLWritable](
    modelArray: Array[Byte])(implicit ct: ClassTag[M]): M = {
  val tempPath: Path = ???
  ...
  val cls = Class.forName(ct.runtimeClass.getName + "$")
  cls.getField("MODULE$").get(cls).asInstanceOf[MLReadable[M]]
    .read.load(tempPath.toString)
}
but I don't think that is a path worth exploring here. In particular, we cannot really provide strict type bounds here: using MLWritable is a hack to limit human errors, but it is rather useless for the compiler.

Calling method via reflection in Scala

I want to call an arbitrary public method of an arbitrary object via reflection. I.e., let's say I want to write a method extractMethod to be used like:
class User { def setAvatar(avatar: Avatar): Unit = …; … }
val m = extractMethod(someUser, "setAvatar")
m(someAvatar)
From the Reflection Overview document in the Scala docs, I see the following direct way to do that:
import scala.reflect.ClassTag
import scala.reflect.runtime.universe._

def extractMethod[Stuff: ClassTag: TypeTag](
    stuff: Stuff,
    methodName: String): MethodMirror = {
  val stuffTypeTag = typeTag[Stuff]
  val mirror = stuffTypeTag.mirror
  val stuffType = stuffTypeTag.tpe
  val methodSymbol = stuffType
    .member(TermName(methodName)).asMethod
  mirror.reflect(stuff)
    .reflectMethod(methodSymbol)
}
However, what bothers me about this solution is that I need to pass implicit ClassTag[Stuff] and TypeTag[Stuff] parameters (the first one is needed for calling reflect, the second one for getting stuffType). This may be quite cumbersome, especially if extractMethod is called from generics that are called from generics and so on. I'd accept this as a necessity for languages that strongly lack runtime type information, but Scala is based on the JRE, which allows one to do the following:
def extractMethod[Stuff](
    stuff: Stuff,
    methodName: String,
    parameterTypes: Array[Class[_]]): Seq[Object] => Object = {
  val unboundMethod = stuff.getClass()
    .getMethod(methodName, parameterTypes: _*)
  arguments => unboundMethod.invoke(stuff, arguments: _*)
}
I understand that Scala reflection allows getting more information than basic Java reflection. Still, here I just need to call a method. Is there a way to somehow reduce the requirements (e.g. these ClassTag, TypeTag) of the Scala-reflection-based extractMethod version (without falling back to pure Java reflection), assuming that performance doesn't matter to me?
Yes, there is.
First, according to this answer, TypeTag[Stuff] is a strictly stronger requirement than ClassTag[Stuff]. Although we don't automatically get an implicit ClassTag[Stuff] from an implicit TypeTag[Stuff], we can construct it manually as ClassTag[Stuff](stuffTypeTag.mirror.runtimeClass(stuffTypeTag.tpe)) and then implicitly or explicitly pass it to reflect, which needs it:
import scala.reflect.ClassTag
import scala.reflect.runtime.universe._

def extractMethod[Stuff: TypeTag](
    stuff: Stuff,
    methodName: String): MethodMirror = {
  val stuffTypeTag = typeTag[Stuff]
  val mirror = stuffTypeTag.mirror
  val stuffType = stuffTypeTag.tpe
  val stuffClassTag = ClassTag[Stuff](mirror.runtimeClass(stuffType))
  val methodSymbol = stuffType
    .member(TermName(methodName)).asMethod
  mirror.reflect(stuff)(stuffClassTag)
    .reflectMethod(methodSymbol)
}
Second, mirror and stuffType can be obtained from stuff.getClass():
import scala.reflect.ClassTag
import scala.reflect.runtime.universe._

def extractMethod[Stuff](stuff: Stuff, methodName: String): MethodMirror = {
  val stuffClass = stuff.getClass()
  val mirror = runtimeMirror(stuffClass.getClassLoader)
  val stuffType = mirror.classSymbol(stuffClass).toType
  val stuffClassTag = ClassTag[Stuff](mirror.runtimeClass(stuffType))
  val methodSymbol = stuffType
    .member(TermName(methodName)).asMethod
  mirror.reflect(stuff)(stuffClassTag)
    .reflectMethod(methodSymbol)
}
Thus we obtain the Scala-style reflection entities (ultimately a MethodMirror) without requiring a ClassTag and/or TypeTag to be passed, explicitly or implicitly, from the caller. I'm not sure, however, how this compares with the approaches described in the question (i.e. passing tags from outside, or pure Java reflection) in terms of performance.
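As a quick sanity check of the final version, a hypothetical class not taken from the question:
class Greeter {
  def greet(name: String): String = s"Hello, $name"
}

val greet = extractMethod(new Greeter, "greet")
println(greet("world")) // MethodMirror.apply invokes the reflected method: "Hello, world"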

Trouble using Implicit Ordered with PriorityQueue (Scala)

I'm trying to create a data structure that has a PriorityQueue in it. I've succeeded in making a non-generic version of it. I can tell it works because it solves the A.I. problem I have.
Here is a snippet of it:
class ProntoPriorityQueue { // TODO: make generic
  implicit def orderedNode(node: Node): Ordered[Node] = new Ordered[Node] {
    def compare(other: Node) = node.compare(other)
  }

  val hashSet = new HashSet[Node]
  val priorityQueue = new PriorityQueue[Node]()
...
I'm trying to make it generic, but if I use this version it stops solving the problem:
class PQ[T <% Ordered[T]] {
  //[T]()(implicit val ord: T => Ordered[T]) {
  //[T]()(implicit val ord: Ordering[T]) {
  val hashSet = new HashSet[T]
  val priorityQueue = new PriorityQueue[T]
...
I've also tried what's commented out instead of using [T <% Ordered[T]]
Here is the code that calls PQ:
// the following def is commented out while using ProntoPriorityQueue
implicit def orderedNode(node: Node): Ordered[Node] = new Ordered[Node] {
  def compare(other: Node) = node.compare(other)
} // I've also tried making this return an Ordering[Node]

val frontier = new PQ[Node] // new ProntoPriorityQueue

// have also tried (not together):
val frontier = new PQ[Node]()(orderedNode)
I've also tried moving the implicit def into the Node object (and importing it), but essentially the same problem.
What am I doing wrong in the generic version? Where should I put the implicit?
Solution
The problem was not with my implicit definition. The problem was that the implicit ordering was being picked up by a Set that was automatically generated in a for(...) yield(...) statement. This caused a problem where the yielded set contained only one state.
What's wrong with simply defining an Ordering on your Node (Ordering[Node]) and using the already-generic Scala PriorityQueue?
As a general rule, it's better to work with Ordering[T] than T <: Ordered[T] or T <% Ordered[T]. Conceptually, Ordered[T] is an intrinsic (inherited or implemented) property of the type itself. Notably, a type can have only one intrinsic ordering relationship defined this way. Ordering[T] is an external specification of the ordering relationship. There can be any number of different Ordering[T]s.
Also, if you're not already aware, you should know that the difference between T <: U and T <% U is that while the former includes only nominal subtype relations (actual inheritance), the latter also includes the application of implicit conversions that yield a value conforming to the type bound.
So if you want to use Node <% Ordered[Node] and you don't have a compare method defined in the class, an implicit conversion will be applied every time a comparison needs to be made. Additionally, if your type has its own compare, the implicit conversion will never be applied and you'll be stuck with that "built-in" ordering.
Addendum
I'll give a few examples based on a class, call it CIString, that simply encapsulates a String and implements case-insensitive ordering.
import scala.collection.immutable.TreeSet

/* Here's how it would be with direct implementation of `Ordered` */
class CIString1(val s: String)
  extends Ordered[CIString1]
{
  private val lowerS = s.toLowerCase
  def compare(other: CIString1) = lowerS.compareTo(other.lowerS)
}

/* An uninteresting, empty ordered set of CIString1
   (fails without the `extends` clause) */
val os1 = TreeSet[CIString1]()

/* Here's how it would look with ordering external to `CIString2`
   using an implicit conversion to `Ordered` */
class CIString2(val s: String) {
  val lowerS = s.toLowerCase
}

class CIString2O(ciS: CIString2)
  extends Ordered[CIString2]
{
  def compare(other: CIString2) = ciS.lowerS.compareTo(other.lowerS)
}

implicit def cis2ciso(ciS: CIString2) = new CIString2O(ciS)

/* An uninteresting, empty ordered set of CIString2
   (fails without the implicit conversion) */
val os2 = TreeSet[CIString2]()

/* Here's how it would look with ordering external to `CIString3`
   using an `Ordering` */
class CIString3(val s: String) {
  val lowerS = s.toLowerCase
}

/* The implicit object could be replaced by
   a class and an implicit val of that class */
implicit object CIString3Ordering
  extends Ordering[CIString3]
{
  def compare(a: CIString3, b: CIString3): Int = a.lowerS.compareTo(b.lowerS)
}

/* An uninteresting, empty ordered set of CIString3
   (fails without the implicit object) */
val os3 = TreeSet[CIString3]()
Well, one possible problem is that your Ordered[Node] is not a Node:
implicit def orderedNode(node: Node): Ordered[Node] = new Ordered[Node] {
  def compare(other: Node) = node.compare(other)
}
I'd try with an Ordering[Node] instead, which you say you tried, but there isn't much more information about that attempt. PQ would be declared as PQ[T : Ordering].
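A minimal sketch of that suggestion (Node and its compare method are assumed from the question):
import scala.collection.mutable.{HashSet, PriorityQueue}

// Generic version backed by an Ordering context bound instead of a view bound.
class PQ[T: Ordering] {
  val hashSet = new HashSet[T]
  val priorityQueue = new PriorityQueue[T]
}

// An Ordering[Node] placed in the Node companion object is found automatically:
// object Node {
//   implicit val nodeOrdering: Ordering[Node] =
//     Ordering.fromLessThan((a: Node, b: Node) => a.compare(b) < 0)
// }
// val frontier = new PQ[Node]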