How can I override a class' method in a Scala 3 compiler plugin?

I want to auto-generate an overridden method in a compiler plugin (Scala 3) for a class like:
trait SpecialSerialize {
  def toJson(sb: StringBuilder, c: SJConfig): Unit = { println("wrong") }
}
case class Person(name: String, age: Int) extends SpecialSerialize
The plugin would generate:
case class Person(name: String, age: Int) extends SpecialSerialize {
  override def toJson(sb: StringBuilder, c: SJConfig): Unit = ... // code here
}
I have a phase:
class ReflectionWorkerPhase extends PluginPhase {
  import tpd._

  val phaseName = "reflectionWorker"
  override val runsAfter = Set(Pickler.name)

  override def transformTypeDef(tree: TypeDef)(implicit ctx: Context): Tree =
    if tree.isClassDef && !tree.rhs.symbol.isStatic then // only look at classes
      // 0. Get a FreshContext so we can set the tree to this tree. (for '{} later)
      implicit val fresh = ctx.fresh
      fresh.setTree(tree)
      QuotesCache.init(fresh)
      implicit val quotes: Quotes = QuotesImpl.apply() // picks up fresh
      import quotes.reflect.*

      // 1. Set up method symbol, define parameters and return type
      val toJsonSymbol = Symbol.newMethod(
        Symbol.spliceOwner,
        "toJson",
        MethodType(
          List("sb", "config"))( // parameter list
          _ => List( // types of the parameters
            TypeRepr.of[StringBuilder],
            TypeRepr.of[SJConfig],
          ),
          _ => TypeRepr.typeConstructorOf(classOf[Unit]) // return type
        ),
        Flags.Override, // Note override here
        Symbol.noSymbol
      )

      // 2. Get our class' Symbol for ownership reassignment
      val classDef = tree.asInstanceOf[ClassDef]
      val classSymbol = classDef.symbol

      // 3. Define our method definition (DefDef) using our method symbol defined above
      val toJsonMethodDef = DefDef(
        toJsonSymbol,
        {
          case List(List(sb: Term, config: Term)) =>
            given Quotes = toJsonSymbol.asQuotes
            Some({
              // Multiple quotes here intentional...
              // Real code will generate a list of quoted statements
              quoted.Expr.ofList(List(
                '{ println("Hello") },
                '{ println("World") }
              ))
            }.asTerm.changeOwner(toJsonSymbol))
        }
      ).changeOwner(classSymbol)

      // 4. Add toJsonMethodDef to tree and return
      val cd = ClassDef.copy(classDef)(
        name = classDef.name,
        constr = classDef.constructor,
        parents = classDef.parents,
        selfOpt = classDef.self,
        body = toJsonMethodDef +: classDef.body
      )
      cd.asInstanceOf[dotty.tools.dotc.ast.tpd.Tree]
    else
      tree
}
When I use the plugin on a sample Person class, I get this error on compile:
Exception in thread "sbt-bg-threads-1" java.lang.ClassFormatError: Duplicate method name "toJson" with signature "(Lscala.collection.mutable.StringBuilder;Lco.blocke.scala_reflection.SJConfig;)V" in class file com/foo/Person
at java.base/java.lang.ClassLoader.defineClass1(Native Method)
at java.base/java.lang.ClassLoader.defineClass(ClassLoader.java:1013)
at java.base/java.security.SecureClassLoader.defineClass(SecureClassLoader.java:150)
at java.base/java.net.URLClassLoader.defineClass(URLClassLoader.java:524)
at java.base/java.net.URLClassLoader$1.run(URLClassLoader.java:427)
at java.base/java.net.URLClassLoader$1.run(URLClassLoader.java:421)
at java.base/java.security.AccessController.doPrivileged(AccessController.java:712)
...
So something in a later compile phase failed to recognize my generated method as an override and copied in the "master" method from the trait as well, and of course the JVM lost its mind when loading the class, finding 2 copies of toJson with the same signature.
How can I fix this so my generated method is recognized as a valid override such that a 2nd toJson method won't be generated in a later phase?
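For context, here is a rough, unverified sketch (dotc internals, not the quotes API) of what "recognized as an override" means at the symbol level: the method symbol is owned by the class itself, carries Method | Override, and is entered into the class's declarations so later phases see the class as already providing toJson. The names used below (newSymbol, requiredClass, entered / enteredAfter, tpd.DefDef, tpd.unitLiteral) are my assumptions about the compiler API and should be checked against the dotty version in use; the resulting DefDef still has to be added to the class's Template body, as in the code above.
import dotty.tools.dotc.ast.tpd
import dotty.tools.dotc.core.Contexts.Context
import dotty.tools.dotc.core.Flags
import dotty.tools.dotc.core.Names.termName
import dotty.tools.dotc.core.Symbols.*
import dotty.tools.dotc.core.Types.MethodType

// Build toJson as a member of `cls` itself and enter it into the class's scope.
def addToJsonOverride(cls: ClassSymbol)(using ctx: Context): tpd.DefDef =
  val info = MethodType(List(termName("sb"), termName("c")))(
    _ => List(
      requiredClass("scala.collection.mutable.StringBuilder").typeRef,
      requiredClass("co.blocke.scala_reflection.SJConfig").typeRef
    ),
    _ => ctx.definitions.UnitType
  )
  // Owner is the class, not Symbol.spliceOwner, and the symbol is entered into
  // the class's declarations. (After typer/pickler, enteredAfter with a
  // DenotTransformer phase may be required instead of entered.)
  val toJsonSym = newSymbol(cls, termName("toJson"), Flags.Method | Flags.Override, info).entered
  tpd.DefDef(toJsonSym.asTerm, _ => tpd.unitLiteral) // placeholder body; splice the real code here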

Related

Instantiate a class with a specified name

I'm writing a Scala library to operate upon Spark DataFrames. I have a bunch of classes, each of which contain a function that operates upon the supplied DataFrame:
class Foo(){val func = SomeFunction(,,,)}
class Bar(){val func = SomeFunction(,,,)}
class Baz(){val func = SomeFunction(,,,)}
The user of my library passes a parameter operation: String to indicate which class to instantiate; the value passed has to be the name of one of those classes, hence I have code that looks something like this:
operation match {
  case "Foo" => new Foo().func
  case "Bar" => new Bar().func
  case "Baz" => new Baz().func
}
I'm a novice Scala developer but this seems rather like a clunky way of achieving this. I'm hoping there is a simpler way to instantiate the desired class based on the value of operation given that it will be the same as the name of the desired class.
The reason I want to do this is that I want external contributors to contribute their own classes, and I want to make it as easy as possible for them to do that; I don't want them to have to know they also need to go and change a pattern match.
For
case class SomeFunction(s: String)
class Foo(){val func = SomeFunction("Foo#func")}
class Bar(){val func = SomeFunction("Bar#func")}
class Baz(){val func = SomeFunction("Baz#func")}
//...
reflection-based version of
def foo(operation: String) = operation match {
  case "Foo" => new Foo().func
  case "Bar" => new Bar().func
  case "Baz" => new Baz().func
  // ...
}
is
import scala.reflect.runtime.universe
import scala.reflect.runtime.universe._

def foo(operation: String): SomeFunction = {
  val runtimeMirror = universe.runtimeMirror(getClass.getClassLoader)
  val classSymbol = runtimeMirror.staticClass(operation)
  val constructorSymbol = classSymbol.primaryConstructor.asMethod
  val classMirror = runtimeMirror.reflectClass(classSymbol)
  val classType = classSymbol.toType
  val constructorMirror = classMirror.reflectConstructor(constructorSymbol)
  val instance = constructorMirror()
  val fieldSymbol = classType.decl(TermName("func")).asTerm
  val instanceMirror = runtimeMirror.reflect(instance)
  val fieldMirror = instanceMirror.reflectField(fieldSymbol)
  fieldMirror.get.asInstanceOf[SomeFunction]
}
Testing:
foo("Foo") //SomeFunction(Foo#func)
foo("Bar") //SomeFunction(Bar#func)
foo("Baz") //SomeFunction(Baz#func)

Determine if type is Option[_] in scala compiler plugin

How can I determine whether a Tree object is a subtype of Option[_] in a scalac (2.11) plugin?
Example:
class OptionIfPhase(prev: Phase) extends StdPhase(prev) {
  override def name = OptionIf.this.name
  override def apply(unit: CompilationUnit) {
    for (tree @ Apply(TypeApply(Select(rcvr, TermName("map")), typetree), List(param)) <- unit.body;
         if rcvr.tpe <:< definitions.OptionClass.tpe)
    {
      reporter.echo(tree.pos, s"found a map on ${rcvr} with type ${rcvr.tpe} and typelist ${typetree} and arg $param")
    }
  }
}
This if-guard is always false, unfortunately. If I remove it completely, it matches all "typed" maps (as expected). Here's a few examples of expressions that I would like to match:
// All of these maps
val x = Some(4).map(_ * 2)
val xo: Option[Int] = x
xo.map(_ * 3)
None.map(i => "foo")
// Not this one
List(1, 3, 5).map(_ + 1) // removing the if guard completely also matches this
// The inner one
List(Some(1), None, Some(3)).flatMap(_.map(_ * -1))
// And from methods:
def foo: Option[Int] = Some(3)
foo.map(_ + 1)
I tried this:
if rcvr.tpe <:< typeOf[Option[_]]
Which works in the REPL (after import scala.reflect.runtime.universe._), but in the plugin it throws an exception: "scala.ScalaReflectionException: class net.obrg.optionif.OptionIf in compiler mirror not found."
What is an appropriate if condition in the for loop to filter out any "map" on an expression that has been determined to be of type Option[_]?
Background: I'm trying to build a scala plugin that converts all x.map(f) expressions, where x is of type Option, to (if (x.isEmpty) None else Some(f(x.get))).
The surrounding plugin boilerplate, so far:
package net.obrg.optionif

import scala.tools.nsc.{Global, Phase}
import scala.tools.nsc.plugins.{Plugin, PluginComponent}

class OptionIf(val global: Global) extends Plugin {
  import global._

  override val name = "optionif"
  override val description = "flattens map operations on options to if statements"
  override val components = List[PluginComponent](Component)

  private object Component extends PluginComponent {
    override val global: OptionIf.this.global.type = OptionIf.this.global

    // I'm not 100% on these constraints, but afaik these are appropriate outer bounds.
    // The typer phase is, obviously, necessary to determine which objects are of type Option, to begin with.
    override val runsAfter = List("typer")
    // The refchecks phase seems to do some if-related optimizations ("Eliminate branches in a conditional if the
    // condition is a constant"). So put it before this, at least.
    override val runsBefore = List("refchecks")
    override val phaseName = OptionIf.this.name

    override def newPhase(_prev: Phase) = new OptionIfPhase(_prev)

    class OptionIfPhase(prev: Phase) extends StdPhase(prev) {
      override def name = OptionIf.this.name
      override def apply(unit: CompilationUnit) {
        for (tree @ Apply(TypeApply(Select(rcvr, TermName("map")), typetree), List(param)) <- unit.body;
             if rcvr.tpe <:< definitions.OptionClass.tpe)
        {
          reporter.echo(tree.pos, s"found a map on ${rcvr} with type ${rcvr.tpe} and typelist ${typetree} and arg $param")
        }
      }
    }
  }
}
Adapted from http://www.scala-lang.org/old/node/140
EDIT: I just added the test for None.map(x => "foo") and the if guard actually does catch this! Still not the other ones, though.
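One untested possibility is to avoid runtime reflection's typeOf entirely and compare class symbols through the compiler's own definitions, which also sidesteps the "compiler mirror not found" error:
// Same loop as above, only the guard changes. isNonBottomSubClass is
// reflexive, so Option itself, Some and None all pass, while List does not.
for (tree @ Apply(TypeApply(Select(rcvr, TermName("map")), typetree), List(param)) <- unit.body;
     if rcvr.tpe.typeSymbol.isNonBottomSubClass(definitions.OptionClass))
{
  reporter.echo(tree.pos, s"found a map on ${rcvr} with type ${rcvr.tpe}")
}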

Scala Reflection Conundrum: Can you explain these weird results?

I wrote some Scala code, using reflection, that returns all vals in an object that are of a certain type. Below are three versions of this code. One of them works but is ugly. Two attempts to improve it don't work, in very different ways. Can you explain why?
First, the code:
import scala.reflect.runtime._
import scala.util.Try

trait ScopeBase[T] {
  // this version tries to generalize the type. The only difference
  // from the working version is [T] instead of [String]
  def enumerateBase[S: universe.TypeTag]: Seq[T] = {
    val mirror = currentMirror.reflect(this)
    universe.typeOf[S].decls.map {
      decl => Try(mirror.reflectField(decl.asMethod).get.asInstanceOf[T])
    }.filter(_.isSuccess).map(_.get).filter(_ != null).toSeq
  }
}

trait ScopeString extends ScopeBase[String] {
  // This version works but requires passing the val type
  // (String, in this example) explicitly. I don't want to
  // duplicate the code for different val types.
  def enumerate[S: universe.TypeTag]: Seq[String] = {
    val mirror = currentMirror.reflect(this)
    universe.typeOf[S].decls.map {
      decl => Try(mirror.reflectField(decl.asMethod).get.asInstanceOf[String])
    }.filter(_.isSuccess).map(_.get).filter(_ != null).toSeq
  }

  // This version tries to avoid passing the object's type
  // as the [S] type parameter. After all, the method is called
  // on the object itself; so why pass the type?
  def enumerateThis: Seq[String] = {
    val mirror = currentMirror.reflect(this)
    universe.typeOf[this.type].decls.map {
      decl => Try(mirror.reflectField(decl.asMethod).get.asInstanceOf[String])
    }.filter(_.isSuccess).map(_.get).filter(_ != null).toSeq
  }
}

// The working example
object Test1 extends ScopeString {
  val IntField: Int = 13
  val StringField: String = "test"
  lazy val fields = enumerate[Test1.type]
}

// This shows how the attempt to generalize the type doesn't work
object Test2 extends ScopeString {
  val IntField: Int = 13
  val StringField: String = "test"
  lazy val fields = enumerateBase[Test2.type]
}

// This shows how the attempt to drop the object's type doesn't work
object Test3 extends ScopeString {
  val IntField: Int = 13
  val StringField: String = "test"
  lazy val fields = enumerateThis
}

val test1 = Test1.fields // List(test)
val test2 = Test2.fields // List(13, test)
val test3 = Test3.fields // List()
The "enumerate" method does work. However, as you can see from the Test1 example, it requires passing the object's own type (Test1.type) as a parameter, which should not have been necessary. The "enumerateThis" method tries to avoid that but fails, producing an empty list. The "enumerateBase" method attempts to generalize the "enumerate" code by passing the val type as a parameter. But it fails, too, producing the list of all vals, not just those of a certain type.
Any idea what's going on?
The problem in your generic implementation is the loss of the type information of T. Also, don't use exceptions as your primary control-flow mechanism (it's very slow!). Here's a working version of your base class.
abstract class ScopeBase[T : universe.TypeTag, S <: ScopeBase[T, S] : universe.TypeTag : scala.reflect.ClassTag] {
  self: S =>

  def enumerateBase: Seq[T] = {
    val mirror = currentMirror.reflect(this)
    universe.typeOf[S].baseClasses.map(_.asType.toType).flatMap(
      _.decls
        .filter(_.typeSignature.resultType <:< universe.typeOf[T])
        .filter(_.isMethod)
        .map(_.asMethod)
        .filter(_.isAccessor)
        .map(decl => mirror.reflectMethod(decl).apply().asInstanceOf[T])
        .filter(_ != null)
    ).toSeq
  }
}

trait Inherit {
  val StringField2: String = "test2"
}

class Test1 extends ScopeBase[String, Test1] with Inherit {
  val IntField: Int = 13
  val StringField: String = "test"
  lazy val fields = enumerateBase
}

object Test extends App {
  println(new Test1().fields)
}
Instead of getting the type from universe.typeOf, you can use the runtime class via currentMirror.classSymbol(getClass).toType; below is an example that works:
def enumerateThis: Seq[String] = {
  val mirror = currentMirror.reflect(this)
  currentMirror.classSymbol(getClass).toType.decls.map {
    decl => Try(mirror.reflectField(decl.asMethod).get.asInstanceOf[String])
  }.filter(_.isSuccess).map(_.get).filter(_ != null).toSeq
}
// prints List(test)
With everyone's help, here's the final version that works:
import scala.reflect.runtime.{currentMirror, universe}

abstract class ScopeBase[T: universe.TypeTag] {
  lazy val enumerate: Seq[T] = {
    val mirror = currentMirror.reflect(this)
    currentMirror.classSymbol(getClass).baseClasses.map(_.asType.toType).flatMap {
      _.decls
        .filter(_.typeSignature.resultType <:< universe.typeOf[T])
        .filter(_.isMethod)
        .map(_.asMethod)
        .filterNot(_.isConstructor)
        .filter(_.paramLists.size == 0)
        .map(decl => mirror.reflectField(decl.asMethod).get.asInstanceOf[T])
        .filter(_ != null).toSeq
    }
  }
}

trait FieldScope extends ScopeBase[Field[_]]

trait DbFieldScope extends ScopeBase[DbField[_, _]] {
  // etc....
}
As you see from the last few lines, my use cases are limited to scope objects for specific field types. This is why I want to parameterize the scope container. If I wanted to enumerate the fields of multiple types in a single scope container, then I would have parameterized the enumerate method.
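For illustration, a minimal hypothetical scope might look like this (Field is just a stand-in case class here, not the real type):
case class Field[T](name: String)

object UserFields extends ScopeBase[Field[_]] {
  val name: Field[String] = Field("name")
  val age: Field[Int] = Field("age")
  val notAField: String = "ignored" // filtered out: its type is not Field[_]
}

// UserFields.enumerate should yield something like Seq(Field(name), Field(age))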

Scala Pickling: Writing a custom pickler / unpickler for nested structures

I'm trying to write a custom SPickler / Unpickler pair to work around some of the current limitations of scala-pickling.
The data type I'm trying to pickle is a case class, where some of the fields already have their own SPickler and Unpickler instances.
I'd like to use these instances in my custom pickler, but I don't know how.
Here's an example of what I mean:
// Here's a class for which I want a custom SPickler / Unpickler.
// One of its fields can already be pickled, so I'd like to reuse that logic.
case class MyClass[A: SPickler: Unpickler: FastTypeTag](myString: String, a: A)

// Here's my custom pickler.
class MyClassPickler[A: SPickler: Unpickler: FastTypeTag](
    implicit val format: PickleFormat) extends SPickler[MyClass[A]] with Unpickler[MyClass[A]] {

  override def pickle(
      picklee: MyClass[A],
      builder: PBuilder) {
    builder.beginEntry(picklee)
    // Here we save `myString` in some custom way.
    builder.putField(
      "mySpecialPickler",
      b => b.hintTag(FastTypeTag.ScalaString).beginEntry(
        picklee.myString).endEntry())
    // Now we need to save `a`, which has an implicit SPickler.
    // But how do we use it?
    builder.endEntry()
  }

  override def unpickle(
      tag: => FastTypeTag[_],
      reader: PReader): MyClass[A] = {
    reader.beginEntry()
    // First we read the string.
    val myString = reader.readField("mySpecialPickler").unpickle[String]
    // Now we need to read `a`, which has an implicit Unpickler.
    // But how do we use it?
    val a: A = ???
    reader.endEntry()
    MyClass(myString, a)
  }
}
I would really appreciate a working example.
Thanks!
Here is a working example:
case class MyClass[A](myString: String, a: A)
Note that the type parameter of MyClass does not need context bounds. Only the custom pickler class needs the corresponding implicits:
class MyClassPickler[A](implicit val format: PickleFormat, aTypeTag: FastTypeTag[A],
                        aPickler: SPickler[A], aUnpickler: Unpickler[A])
    extends SPickler[MyClass[A]] with Unpickler[MyClass[A]] {

  private val stringUnpickler = implicitly[Unpickler[String]]

  override def pickle(picklee: MyClass[A], builder: PBuilder) = {
    builder.beginEntry(picklee)
    builder.putField("myString",
      b => b.hintTag(FastTypeTag.ScalaString).beginEntry(picklee.myString).endEntry()
    )
    builder.putField("a",
      b => {
        b.hintTag(aTypeTag)
        aPickler.pickle(picklee.a, b)
      }
    )
    builder.endEntry()
  }

  override def unpickle(tag: => FastTypeTag[_], reader: PReader): MyClass[A] = {
    reader.hintTag(FastTypeTag.ScalaString)
    val tag = reader.beginEntry()
    val myStringUnpickled = stringUnpickler.unpickle(tag, reader).asInstanceOf[String]
    reader.endEntry()
    reader.hintTag(aTypeTag)
    val aTag = reader.beginEntry()
    val aUnpickled = aUnpickler.unpickle(aTag, reader).asInstanceOf[A]
    reader.endEntry()
    MyClass(myStringUnpickled, aUnpickled)
  }
}
In addition to the custom pickler class, we also need an implicit def which returns a pickler instance specialized for concrete type arguments:
implicit def myClassPickler[A: SPickler: Unpickler: FastTypeTag](implicit pf: PickleFormat) =
new MyClassPickler

Specs2: Use a Hamcrest matcher

I have a wide array of Hamcrest matchers for my domain objects written in Java.
I'm now moving to Scala and would like to reuse these existing matchers in the context of specs2 tests.
Given a Hamcrest matcher for class Foo:
public class FooMatcher extends TypeSafeMatcher<Foo> {
    ...
}
I'd like to be able to use it thus:
val myFooMatcher = new FooMatcher(...)
foo must match (myFooMatcher)
foos must contain (myFooMatcher1, myFooMatcher2)
And so on.
Specs2 seems to have the opposite, an adapter of its Matcher[T] trait to org.hamcrest.Matcher, but I'm looking for the other way around.
Any ideas?
You need to add one implicit conversion for this to work:
import org.hamcrest._
import org.specs2.matcher.MustMatchers._
implicit def asSpecs2Matcher[T](hamcrest: org.hamcrest.TypeSafeMatcher[T]): org.specs2.matcher.Matcher[T] = {
  def koMessage(a: Any) = {
    val description = new StringDescription
    description.appendValue(a)
    hamcrest.describeTo(description)
    description.toString
  }
  (t: T) => (hamcrest.matches(t), koMessage(t))
}
Let's see it in action:
case class Foo(isOk: Boolean = true)

// a Hamcrest matcher for Foo elements
class FooMatcher extends TypeSafeMatcher[Foo] {
  def matchesSafely(item: Foo): Boolean = item.isOk
  def describeTo(description: Description) = description.appendText(" is ko")
}

// an instance of that matcher
def beMatchingFoo = new FooMatcher

// this returns a success
Foo() must beMatchingFoo

// this returns a failure
Foo(isOk = false) must beMatchingFoo

// this is a way to test that some elements in a collection have
// the desired property
Seq(Foo()) must have oneElementLike { case i => i must beMatchingFoo }

// if you have several matchers you want to try out
Seq(Foo()) must have oneElementLike { case i =>
  i must beMatchingFoo and beMatchingBar
}
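The last snippet refers to a beMatchingBar that isn't defined above; a hypothetical stub, just so the combined example compiles, could be:
// hypothetical second matcher, not part of the original answer
class BarMatcher extends TypeSafeMatcher[Foo] {
  def matchesSafely(item: Foo): Boolean = !item.isOk
  def describeTo(description: Description) = description.appendText(" is not bar-ish")
}
def beMatchingBar = new BarMatcher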