How to list annotations (custom Java ones and others) on field members of a Scala case class

So I'm trying to list fields with a specific annotation in a Scala case class and I'm not able to get it working... Let's see some code right away.
The case class (a simplified version; mine extends another class and is also nested in my test class, where I use it for unit testing only):
case class Foo(@Unique var str: String) {}
The custom Java annotation:
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.FIELD, ElementType.PARAMETER})
public @interface Unique {}
And my class (simplified again) where I'm trying to do some stuff with fields marked as unique:
class SomeClass[T](implicit typeTag: TypeTag[T]) {
  val fields: Iterable[universe.TermSymbol] = typeOf[T].members.collect { case s: TermSymbol => s }.
    filter(s => s.isVal || s.isVar)
  val list = fields.flatMap(f => f.annotations.find(_.tpe =:= typeOf[Unique]).map((f, _))).toList
}
But the val list in the last piece of code is always empty... fields has str listed, but without the annotation.
What am I missing?
The code listing the annotations is from the following answer:
How to list all fields with a custom annotation using Scala's reflection at runtime?

It seems the referenced post targets Scala 2.10, which is old and not compatible with the newest Scala versions.
Here is an example of how to get the specific annotation by type:
def listProperties[T: TypeTag]: List[universe.Annotation] = {
  typeOf[T].typeSymbol
    .asClass
    .primaryConstructor
    .typeSignature
    .paramLists.flatten.flatMap(_.annotations)
}
val annotations = listProperties[Foo].filter(_.tree.tpe =:= typeOf[Unique])
println(annotations)
And there is a way to get the annotation's field value:
case class Foo(@Unique(field = "bar") val str: String) {}
import scala.reflect.runtime.currentMirror
import scala.tools.reflect.ToolBox
val tb = currentMirror.mkToolBox()
val result = tb.eval(tb.untypecheck(annotations.head.tree)).asInstanceOf[Unique]
Also, note that your annotation class is implemented Java-style. In Scala you may want to use StaticAnnotation for creating annotations, like:
class Unique extends StaticAnnotation
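To round this out, here is a minimal, self-contained sketch of that Scala-style variant (the annotation carrying its value as a constructor parameter, the class names, and the extraction of the literal from the annotation tree are illustrative assumptions, not part of the original answer):

import scala.annotation.StaticAnnotation
import scala.reflect.runtime.universe._

// Scala-style annotation carrying a single value
class Unique(val field: String) extends StaticAnnotation

case class Foo(@Unique("bar") str: String)

object UniqueDemo extends App {
  // the annotations live on the primary constructor's parameters
  val annotations = typeOf[Foo].typeSymbol.asClass
    .primaryConstructor
    .typeSignature
    .paramLists.flatten
    .flatMap(_.annotations)
    .filter(_.tree.tpe =:= typeOf[Unique])

  // pull the literal argument straight out of the annotation tree,
  // avoiding the ToolBox round trip shown above
  val value = annotations.head.tree.children.tail.collectFirst {
    case Literal(Constant(s: String)) => s
  }
  println(value) // Some(bar)
}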

Related

Get all the classes that implement a trait in Scala using reflection

I want to list out all the case classes which implement a particular trait. I am currently using Clapper ClassUtil for doing that. I am able to get the case classes that directly implement a trait. However, I am not able to get the other classes which are not directly implementing the trait. How can I get all classes which directly or indirectly implement a trait?
val finder = ClassFinder()
finder.getClasses().filter(_.isConcrete).filter(_.implements("com.myapp.MyTrait"))
Scala Version : 2.11
Clapper Class Util Version : 1.0.6
Is there any other way I can get this information? Can someone point me in the right direction?
I tried using scala.reflect but could not understand how to get the info.
EDIT:
Sample traits and usages:
trait BaseEntity
trait NamedEntity{ val name:String}
trait MasterDataEntity extends NamedEntity
case class Department(id:Long, override val name:String) extends MasterDataEntity
case class Employee(id:Long, name:String) extends BaseEntity
case class User(id:Long, override val name:String) extends NamedEntity
Now, if I give the trait as NamedEntity, I should be able to get both Department and User, since they both directly or indirectly implement NamedEntity. With the implements method, it will give only User. I also tried the interfaces method, which also provides only the direct superclasses.
Looking at the source code, the problem seems to be that it doesn't follow the interfaces hierarchy. If you do that, you find all instances:
package foo
import java.io.File
import org.clapper.classutil.{ClassFinder, ClassInfo}
object Main extends App {
  val jar     = new File("target/scala-2.11/class_test_2.11-0.1.0.jar")
  val finder  = ClassFinder(jar :: Nil)
  val classes = ClassFinder.classInfoMap(finder.getClasses().iterator)
  val impl    = find("foo.NamedEntity", classes)
  impl.foreach(println)

  def find(ancestor: String, classes: Map[String, ClassInfo]): List[ClassInfo] =
    classes.get(ancestor).fold(List.empty[ClassInfo]) { ancestorInfo =>
      val ancestorName = ancestorInfo.name

      def compare(info: ClassInfo): Boolean =
        info.name == ancestorName ||
          (info.superClassName :: info.interfaces).exists {
            n => classes.get(n).exists(compare)
          }

      val it = classes.valuesIterator
      it.filter { info => info.isConcrete && compare(info) }.toList
    }
}
ClassUtil now contains this functionality (v1.4.0, maybe also in earlier versions):
val finder = ClassFinder()
val impl = ClassFinder.concreteSubclasses("foo.NamedEntity", finder.getClasses())

Scala Quasiquotes Destructuring a Type

Context:
I'm working on a library for working with JMX in Scala. One of the objectives is to have a strongly typed interface to Managed Beans, I guess akin to the Spring framework's JMX support.
Objective: Macro to Deserialise TabularData to a case class:
// interface for which I'd like to generate an implementation using a macro
trait JMXTabularAssembler[T <: Product] {
def assemble(data: TabularData): T
}
object JMXAnnotations {
case class Attribute(name: String) extends StaticAnnotation
}
case class example(
  @Attribute("name") name: String,
  @Attribute("age") age: Int,
  unmarked: String
)
Problem: There are plenty of examples of composing trees using the q"" interpolator. But I can't figure out how to use the tq"" interpolator to extract the fields of a case class from a type context.
private def mkAssembler[T <: Product : c.WeakTypeTag](c: Context): c.universe.Tree = {
  import c.universe._
  val tt = weakTypeOf[T]
}
Question: How do I use the quasiquote machinery to destructure the fields of my case class so that I can loop over them and filter out fields with my annotation (my Attribute annotation is not available from the approach I am currently taking). An implementation of the following, returning the fields with annotations in declaration order, is what I am after.
private def harvestFieldsWithAnnotations[T<: Product: c.WeakTypeTag](c: Context):
List[(c.universe.Name, String, c.universe.Type, List[c.universe.Annotation])] = ???
Bonus: The objective is to get the attribute fields, generate trees for each field that extract the field from the TabularData and use these trees to create the JMXTabularAssembler Functor. If you could show me how to do this for the example above it would bootstrap my efforts :D.
What I have tried: I started solving the problem by using reflection. This does not seem like the right way to do it. Snippets:
...
val dec = tt.decls.sorted
def getFields = dec.withFilter(t => t.isTerm && !t.isMethod)
def getCaseAccessors = dec.withFilter(t => t.isMethod && t.asMethod.isCaseAccessor)

dec.foreach { d =>
  println(d.name, d.annotations)
}

getFields.foreach { f =>
  println(f.annotations)
}

val types = getCaseAccessors.map { d =>
  println(d.annotations)
  (d.name, tt.member(d.name).asMethod.returnType)
}
...
The following method does the trick; it does not use quasiquotes. The key is to access the backing field of the symbol representing the field accessor of a case class (the accessed call).
private def harvestFieldsWithAnnotations[T <: Product : c.WeakTypeTag](c: Context) = {
  import c.universe._
  val tt = weakTypeOf[T]
  tt.decls.sorted.filter(t => t.isMethod && t.asMethod.isCaseAccessor).map { ca =>
    val asMethod = tt.member(ca.name).asMethod
    (ca.name, asMethod.returnType, asMethod.accessed.annotations)
  }
}
}
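For illustration, hypothetical wiring inside the question's mkAssembler could look like this (the logging and the q"()" placeholder are my own sketch, not part of the original answer; note the @field caveat below):

private def mkAssembler[T <: Product : c.WeakTypeTag](c: Context): c.universe.Tree = {
  import c.universe._
  // inspect at compile time what was harvested for T
  harvestFieldsWithAnnotations[T](c).foreach { case (name, tpe, annotations) =>
    c.info(c.enclosingPosition, s"$name: $tpe annotated with $annotations", force = true)
  }
  q"()" // placeholder; the real implementation would build the JMXTabularAssembler here
}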
Field annotations won't get retained unless they are explicitly annotated with scala.annotation.meta.field.
So the Attribute annotation should be:
import scala.annotation.meta.field

@field
case class Attribute(name: String) extends StaticAnnotation
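Alternatively (a sketch, not part of the original answer), the meta-annotation can also be applied at the use site instead of on the annotation definition:

import scala.annotation.meta.field

case class example(
  @(Attribute @field)("name") name: String,
  @(Attribute @field)("age") age: Int,
  unmarked: String
)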

How to write class and table class mapping for Slick 2 instead of using a case class?

I used a case class to map the class object to data for Slick 2 before, but currently I use another Play plugin; the plugin object uses a case class, and my class inherits from this case class. So I cannot use a case class, as the Scala language forbids case-class-to-case-class inheritance.
before:
case class User()
class UserTable(tag: Tag) extends Table[User](tag, "User") {
...
def * = (...)<>(User.tupled,User.unapply)
}
it works.
But now I need to change above to below:
case class BasicProfile()
class User(...) extends BasicProfile(...){
...
def unapply(i:User):Tuple12[...]= Tuple12(...)
}
class UserTable(tag: Tag) extends Table[User](tag, "User") {
...
def * = (...)<>(User.tupled,User.unapply)
}
I do not know how to write the tupled and unapply methods (I am not sure whether my version is correct) the way the case class template auto-generates them. Or you can show me another way to map the class to a table with Slick 2.
Can anyone give me an example of this?
First of all, this case class is a bad idea:
case class BasicProfile()
Case classes compare by their member values, and this one doesn't have any. Also, the name is not great, because we have a class of the same name in Slick; it may cause confusion.
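To make the pitfall concrete, here is a sketch with a hypothetical two-field User (not the poster's actual class):

case class BasicProfile()
class User(val id: Long, val name: String) extends BasicProfile()

object EqualityPitfall extends App {
  // the inherited case-class equals compares zero fields, so any two users are "equal"
  println(new User(1, "alice") == new User(2, "bob")) // true
}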
Regarding your class
class User(...) extends BasicProfile(...){
...
def unapply(i:User):Tuple12[...]= Tuple12(...)
}
It is possible to emulate case classes yourself. Are you doing that because of the 22 field limit? FYI: Scala 2.11 supports larger case classes. We are doing what you are trying at Sport195, but there are several aspects to take care of.
apply and unapply need to be members of object User (the companion object of class User). .tupled is not a real method, but is generated automatically by the Scala compiler. It turns a method like .apply that takes a list of arguments into a function that takes a single tuple of those arguments. As tuples are limited to 22 fields, so is .tupled. But you could of course auto-generate one yourself; you may have to give it another name.
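A minimal sketch of such a hand-written companion (the field list is made up; your real User has more columns and extends BasicProfile):

class User(val id: Long, val name: String)

object User {
  // hand-written equivalents of what the compiler generates for a case class
  def apply(id: Long, name: String): User = new User(id, name)
  def unapply(u: User): Option[(Long, String)] = Some((u.id, u.name))
  // .tupled only exists on function values, so expose one explicitly
  val tupled: ((Long, String)) => User = (apply _).tupled
}

With that in place, def * = (...) <> (User.tupled, User.unapply) works the same way as with a case class, as long as you stay at or below 22 columns.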
We are using the Slick code generator in combination with the Twirl template engine (it uses # to insert expressions; the $ parts are inserted as-is into the generated Scala code and evaluated when the generated code is compiled/run). Here are a few snippets that may help you:
Generate apply method
/** Factory for #{name} objects
#{indentN(2,entityColumns.map(c => "* #param "+c.name+" "+c.doc).mkString("\n"))}
*/
final def apply(
#{indentN(2,
entityColumns.map(c =>
colWithTypeAndDefault(c)
).mkString(",\n")
)}
) = new #{name}(#{columnsCSV})
Generate unapply method:
#{if(entityColumns.size <= 22)
s"""
/** Extractor for ${name} objects */
final def unapply(o: ${name}) = Some((${entityColumns.map(c => "o."+c.name).mkString(", ")}))
""".trim
else
""}
Trait that can be mixed into User to make it a Scala Product:
trait UserBase extends Product {
// Product interface
def canEqual(that: Any): Boolean = that.isInstanceOf[#name]
def productArity: Int = #{entityColumns.size}
def productElement(n: Int): Any = Seq(#{columnsCSV})(n)
override def toString = #{name}+s"(${productIterator.toSeq.mkString(",")})"
...
case-class like .copy method
final def copy(
#{indentN(2,columnsCopy)}
): #{name} = #{name}(#{columnsCSV})
To use those classes with Slick you have several options. All are somewhat newer and not (well) documented. The normal <> operator in Slick goes via tuples, but that's not an option for > 22 columns. One option is the new fastpath converters. Another option is mapping via a Slick HList. No examples exist for either. Another option is going via a custom Shape, which is what we do. This will require you to define a custom shape for your User class and another class, defined using Column types, that mirrors User within queries. Like this: http://slick.typesafe.com/doc/2.1.0/api/#scala.slick.lifted.ProductClassShape Too verbose to write by hand. We use the following template code for this:
/** class for holding the columns corresponding to #{name}
* used to identify this entity in a Slick query and map
*/
class #{name}Columns(
#{indent(
entityColumns
.map(c => s"val ${c.name}: Column[${c.exposedType}]")
.mkString(", ")
)}
) extends Product{
def canEqual(that: Any): Boolean = that.isInstanceOf[#name]
def productArity: Int = #{entityColumns.size}
def productElement(n: Int): Any = Seq(#{columnsCSV})(n)
}
/** shape for mapping #{name}Columns to #{name} */
object #{name}Implicits{
implicit object #{name}Shape extends ClassShape(
Seq(#{
entityColumns
.map(_.exposedType)
.map(t => s"implicitly[Shape[ShapeLevel.Flat, Column[$t], $t, Column[$t]]]")
.mkString(", ")
}),
vs => #{name}(#{
entityColumns
.map(_.exposedType)
.zipWithIndex
.map{ case (t,i) => s"vs($i).asInstanceOf[$t]" }
.mkString(", ")
}),
vs => new #{name}Columns(#{
entityColumns
.map(_.exposedType)
.zipWithIndex
.map{ case (t,i) => s"vs($i).asInstanceOf[Column[$t]]" }
.mkString(", ")
})
)
}
import #{name}Implicits.#{name}Shape
A few helpers we put into the Slick code generator:
val columnsCSV = entityColumns.map(_.name).mkString(", ")
val columnsCopy = entityColumns.map(c => colWithType(c)+" = "+c.name).mkString(", ")
val columnNames = entityColumns.map(_.name.toString)
def colWithType(c: Column) = s"${c.name}: ${c.exposedType}"
def colWithTypeAndDefault(c: Column) =
colWithType(c) + colDefault(c).map(" = "+_).getOrElse("")
def indentN(n:Int,code: String): String = code.split("\n").mkString("\n"+List.fill(n)(" ").mkString(""))
I know this may be a bit troublesome to replicate, especially if you are new to Scala. I hope to find the time to get it into the official Slick code generator at some point.

Matching against subclasses in macro

I need to convert a string value into an actual type, so I decided to try a macro-based way to do this. I have a bunch of data types:
sealed abstract class Tag(val name: String)
case object Case1 extends Tag("case1")
case object Case2 extends Tag("case2")
case object Case3 extends Tag("case3")
etc...
I want to write a simple resolver:
val tag: Tag = TagResolver.fromString("case2")
This line should return Case2. I managed to do the following:
def typeFromString(c: Context)(name: c.Expr[String]): c.Expr[Tag] = {
import c.universe._
val tag = typeTag[Tag]
val accSymb = tag.tpe.typeSymbol.asClass
val subclasses = accSymb.knownDirectSubclasses // all my cases
subclasses.map { sub =>
val name = sub.typeSignature.member(newTermName("name")).asMethod // name field
???
}
}
But how can I match name: c.Expr[String] against the value of the name field and, if matched, return the appropriate tag?
I don't think there's a reliable way of doing this, because knownDirectSubclasses can refer to classes that haven't been compiled yet, so we can't evaluate them.
If you can put these values as annotations on the classes, then these annotations can be read even when classes are being compiled in the current compilation run (via the Symbol.annotations API). Please note, however, that knownDirectSubclasses has known issues: https://issues.scala-lang.org/browse/SI-7046.
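As a rough sketch of that annotation-based approach (the annotation name is made up, and the tree pattern may vary slightly across Scala versions):

import scala.annotation.StaticAnnotation

// carry the string in an annotation so the macro can read it at compile time
class tagName(value: String) extends StaticAnnotation

sealed abstract class Tag(val name: String)
@tagName("case2") case object Case2 extends Tag("case2")

// inside the macro, for each `sub` in knownDirectSubclasses (c.universe._ in scope):
//   sub.annotations
//      .find(_.tree.tpe =:= typeOf[tagName])
//      .map(_.tree.children.tail)                           // the argument trees
//      .collect { case List(Literal(Constant(s: String))) => s }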

Custom Scala enum, most elegant version sought

For a project of mine I have implemented an Enum based upon
trait Enum[A] {
trait Value { self: A =>
_values :+= this
}
private var _values = List.empty[A]
def values = _values
}
sealed trait Currency extends Currency.Value
object Currency extends Enum[Currency] {
case object EUR extends Currency
case object GBP extends Currency
}
from Case objects vs Enumerations in Scala. It worked quite nicely, till I ran into the following problem: case objects seem to be initialized lazily, and if I use Currency.values I might actually get an empty List. It would have been possible to reference all enum values on startup so that the values list would be populated, but that would kind of defeat the point.
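For illustration, the symptom looks roughly like this (a sketch; the exact output depends on which objects have already been touched):

println(Currency.values) // List() - no case object has been initialized yet
Currency.EUR             // referencing EUR forces its initialization
println(Currency.values) // List(EUR) - GBP is still missing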
So I ventured into the dark and unknown places of Scala reflection and came up with this solution, based upon the following SO answers: Can I get a compile-time list of all of the case objects which derive from a sealed parent in Scala?
and How can I get the actual object referred to by Scala 2.10 reflection?
import scala.reflect.runtime.universe._

abstract class Enum[A: TypeTag] {
  trait Value

  private def sealedDescendants: Option[Set[Symbol]] = {
    val symbol = typeOf[A].typeSymbol
    val internal = symbol.asInstanceOf[scala.reflect.internal.Symbols#Symbol]
    if (internal.isSealed)
      Some(internal.sealedDescendants.map(_.asInstanceOf[Symbol]) - symbol)
    else None
  }

  def values = (sealedDescendants getOrElse Set.empty).map(
    symbol => symbol.owner.typeSignature.member(symbol.name.toTermName)).map(
    module => reflect.runtime.currentMirror.reflectModule(module.asModule).instance).map(
    obj => obj.asInstanceOf[A]
  )
}
The amazing part of this is that it actually works, but it is ugly as hell, and I would be interested whether it is possible to make this simpler and more elegant, and to get rid of the asInstanceOf calls.
Here is a simple macro based implementation:
import scala.language.experimental.macros
import scala.reflect.macros.blackbox
abstract class Enum[E] {
  def values: Seq[E] = macro Enum.caseObjectsSeqImpl[E]
}

object Enum {
  def caseObjectsSeqImpl[A: c.WeakTypeTag](c: blackbox.Context) = {
    import c.universe._
    val typeSymbol = weakTypeOf[A].typeSymbol.asClass
    require(typeSymbol.isSealed)
    val subclasses = typeSymbol.knownDirectSubclasses
      .filter(_.asClass.isCaseClass)
      .map(s => Ident(s.companion))
      .toList
    val seqTSymbol = weakTypeOf[Seq[A]].typeSymbol.companion
    c.Expr(Apply(Ident(seqTSymbol), subclasses))
  }
}
}
With this you could then write:
sealed trait Currency
object Currency extends Enum[Currency] {
case object USD extends Currency
case object EUR extends Currency
}
so then
Currency.values == Seq(Currency.USD, Currency.EUR)
Since it's a macro, the Seq(Currency.USD, Currency.EUR) is generated at compile time, rather than runtime. Note, though, that since it's a macro, the definition of the class Enum must be in a separate project from where it is used (i.e. the concrete subclasses of Enum like Currency). This is a relatively simple implementation; you could do more complicated things like traverse multilevel class hierarchies to find more case objects at the cost of greater complexity, but hopefully this will get you started.
A late answer, but anyways...
As wallnuss said, knownDirectSubclasses is unreliable as of writing and has been for quite some time.
I created a small lib called Enumeratum (https://github.com/lloydmeta/enumeratum) that allows you to use case objects as enums in a similar way, but doesn't use knownDirectSubclasses and instead looks at the body that encloses the method call to find subclasses. It has proved to be reliable thus far.
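Basic usage looks roughly like this (check the project README for the current API):

import enumeratum._

sealed trait Currency extends EnumEntry

object Currency extends Enum[Currency] {
  // findValues is a macro that collects the case objects declared below
  val values = findValues

  case object EUR extends Currency
  case object GBP extends Currency
}

// Currency.values          == IndexedSeq(EUR, GBP)
// Currency.withName("EUR") == EUR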
The article "'You don't need a macro' Except when you do" by Max Afonov (maxaf) describes a nice way to use a macro for defining enums.
The end result of that implementation is visible at github.com/maxaf/numerato
Simply create a plain class, annotate it with @enum, and use the familiar val ... = Value declaration to define a few enum values.
The @enum annotation invokes a macro, which will:
Replace your Status class with a sealed Status class suitable for acting as a base type for enum values. Specifically, it'll grow a (val index: Int, val name: String) constructor. These parameters will be supplied by the macro, so you don't have to worry about it.
Generate a Status companion object, which will contain most of the pieces that now make Status an enumeration. This includes a values: List[Status], plus lookup methods.
Given the above Status enum, here's what the generated code looks like:
scala> @enum(debug = true) class Status {
| val Enabled, Disabled = Value
| }
{
sealed abstract class Status(val index: Int, val name: String)(implicit sealant: Status.Sealant);
object Status {
@scala.annotation.implicitNotFound(msg = "Enum types annotated with ".+("@enum can not be extended directly. To add another value to the enum, ").+("please adjust your `def ... = Value` declaration.")) sealed abstract protected class Sealant;
implicit protected object Sealant extends Sealant;
case object Enabled extends Status(0, "Enabled") with scala.Product with scala.Serializable;
case object Disabled extends Status(1, "Disabled") with scala.Product with scala.Serializable;
val values: List[Status] = List(Enabled, Disabled);
val fromIndex: _root_.scala.Function1[Int, Status] = Map(Enabled.index.->(Enabled), Disabled.index.->(Disabled));
val fromName: _root_.scala.Function1[String, Status] = Map(Enabled.name.->(Enabled), Disabled.name.->(Disabled));
def switch[A](pf: PartialFunction[Status, A]): _root_.scala.Function1[Status, A] = macro numerato.SwitchMacros.switch_impl[Status, A]
};
()
}
defined class Status
defined object Status