Access to annotation value in Scala 3.0

I created an annotation in Scala and used it as follows:
import scala.annotation.StaticAnnotation

final class TestAnnotation extends StaticAnnotation

@TestAnnotation
class Annotated

object Main extends App {
  println(classOf[Annotated].getAnnotations.length) // prints 0

  // the Scala 2 approach, no longer available in Scala 3:
  // import scala.reflect.runtime.universe._
  // val mirror = runtimeMirror(classOf[Annotated].getClassLoader)
}
As it's a Scala annotation, it cannot be read using getAnnotations. On the other hand, the scala-reflect dependency isn't available anymore for Scala 3.0, so we have no access to runtimeMirror either.
Is there any alternative way to read an annotation value in Scala 3?

You don't need runtime reflection (Java or Scala) since information about annotations exists at compile time (even in Scala 2).
In Scala 3 you can write a macro and use TASTy reflection:

import scala.quoted.*

inline def getAnnotations[A]: List[String] = ${ getAnnotationsImpl[A] }

def getAnnotationsImpl[A: Type](using Quotes): Expr[List[String]] = {
  import quotes.reflect.*
  val annotations = TypeRepr.of[A].typeSymbol.annotations.map(_.tpe.show)
  Expr.ofList(annotations.map(Expr(_)))
}
Usage:
@main def test = println(getAnnotations[Annotated]) // List(TestAnnotation)
Tested in 3.0.0-RC2-bin-20210217-83cb8ff-NIGHTLY
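If you need the annotation's arguments and not just its type, note that Symbol#annotations gives you each annotation as a Term (its constructor call), so showing the whole tree exposes the arguments too. A minimal sketch along the same lines (getAnnotationTrees is a made-up name):

import scala.quoted.*

inline def getAnnotationTrees[A]: List[String] = ${ getAnnotationTreesImpl[A] }

def getAnnotationTreesImpl[A: Type](using Quotes): Expr[List[String]] = {
  import quotes.reflect.*
  // each annotation is a Term, so .show renders the constructor call including its arguments
  val trees = TypeRepr.of[A].typeSymbol.annotations.map(_.show)
  Expr.ofList(trees.map(Expr(_)))
}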

Related

Could not find implicit value for parameter writer error, yet I defined the handler using the macro

I have the following:
Account.scala
package modules.accounts

import java.time.Instant
import reactivemongo.api.bson._

case class Account(id: String, name: String)

object Account {
  type ID = String

  implicit val accountHandler: BSONDocumentHandler[Account] = Macros.handler[Account]
  // implicit def accountWriter: BSONDocumentWriter[Account] = Macros.writer[Account]
  // implicit def accountReader: BSONDocumentReader[Account] = Macros.reader[Account]
}
AccountRepo.scala
package modules.accounts

import java.time.Instant
import reactivemongo.api.collections.bson.BSONCollection
import scala.concurrent.ExecutionContext

final class AccountRepo(
    val coll: BSONCollection
)(implicit ec: ExecutionContext) {
  import Account.{ accountHandler, ID }

  def insertTest() = {
    val doc = Account(s"account123", "accountName") //, Instant.now)
    coll.insert.one(doc)
  }
}
The error I am getting is:
could not find implicit value for parameter writer: AccountRepo.this.coll.pack.Writer[modules.accounts.Account]
[error] coll.insert.one(doc)
From what I understand, the implicit handler generated by the macro should be enough to provide the Writer. What am I doing wrong?
Reference: http://reactivemongo.org/releases/1.0/documentation/bson/typeclasses.html
The code is mixing different driver versions.
The macro-generated handler uses the new BSON API, as can be seen from the import reactivemongo.api.bson._, whereas the collection uses the old driver, as shown by the import reactivemongo.api.collections.bson.BSONCollection instead of reactivemongo.api.bson.collection.BSONCollection.
It's recommended to have a look at the documentation, and not to mix incompatible versions of related libraries.
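With ReactiveMongo 1.0 the new-API collection type lives under reactivemongo.api.bson.collection. A minimal sketch of the repository with just the import fixed (everything else as in the question):

package modules.accounts

import scala.concurrent.ExecutionContext
import reactivemongo.api.bson.collection.BSONCollection

final class AccountRepo(
    val coll: BSONCollection
)(implicit ec: ExecutionContext) {
  import Account.accountHandler // now resolves as coll.pack.Writer[Account]

  def insertTest() =
    coll.insert.one(Account("account123", "accountName"))
}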

How to infer StructType schema for Spark Scala at run time given a Fully Qualified Name of a case class

For a few days I've been wondering whether it is possible to infer a Spark schema in Scala for a given case class that is unknown at compile time.
The only input is a string containing the FQN of the class (which could be used, for example, to create an instance of the case class at runtime via reflection).
I was thinking of doing something like:
package com.my.namespace

case class MyCaseClass(name: String, num: Int)

// Somewhere else in the codebase:
// fqn comes from an external configuration file, so it is unknown at compile time
val fqn = "com.my.namespace.MyCaseClass"
val schema = Encoders.product[getXYZ(fqn)].schema // pseudocode: getXYZ is the missing piece
Of course, any other technique that does not use Encoders is fine (e.g. building the StructType by analysing an instance of the case class, if that is even possible).
What is the best approach? Is it feasible at all?
You can use the reflective toolbox:
package com.my.namespace

import org.apache.spark.sql.types.StructType
import scala.reflect.runtime
import scala.tools.reflect.ToolBox

case class MyCaseClass(name: String, num: Int)

object Main extends App {
  val fqn = "com.my.namespace.MyCaseClass"
  val runtimeMirror = runtime.currentMirror
  val toolbox = runtimeMirror.mkToolBox()
  val res = toolbox.eval(toolbox.parse(s"""
    import org.apache.spark.sql.Encoders
    Encoders.product[$fqn].schema
  """)).asInstanceOf[StructType]
  println(res) // StructType(StructField(name,StringType,true),StructField(num,IntegerType,false))
}
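Note that the toolbox lives in the scala-compiler artifact, so it has to be on the runtime classpath; a build.sbt sketch (the Spark version is only illustrative):

// build.sbt
libraryDependencies ++= Seq(
  "org.scala-lang" % "scala-compiler" % scalaVersion.value, // provides scala.tools.reflect.ToolBox
  "org.apache.spark" %% "spark-sql" % "3.1.1"
)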

How to add symbols into a parsed AST?

(This is loosely related to Scala script in 2.11 and Generating a class from string and instantiating it in Scala 2.10).
In the code below I have code parsed at runtime using a Toolbox into the corresponding AST. How can I add symbol definitions (prefix in the code below) to the AST so that those symbols can be used in the expression tree?
import scala.reflect.runtime.universe._
import scala.tools.reflect.ToolBox

object Main extends App {
  val cm = runtimeMirror(getClass.getClassLoader)
  val tb = cm.mkToolBox()
  val expr = tb.parse("println(i)")
  val build = internal.reificationSupport
  val prefix = build.setInfo(build.newFreeTerm("i", 2), typeOf[Int])
  // TODO: add prefix before expr by some AST manipulation
  tb.eval(expr)
}
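One way to make i visible to the parsed tree, sidestepping the free-term symbol machinery entirely, is to splice the expression into a block that defines the value. A minimal sketch using quasiquotes (this replaces the free term rather than attaching it to the AST):

import scala.reflect.runtime.universe._
import scala.tools.reflect.ToolBox

object Main extends App {
  val cm = runtimeMirror(getClass.getClassLoader)
  val tb = cm.mkToolBox()
  val expr = tb.parse("println(i)")
  // wrap the parsed tree in a block that defines `i`,
  // so the toolbox typechecker can resolve the reference
  val wrapped = q"val i = 2; $expr"
  tb.eval(wrapped) // prints 2
}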

In Scala I can have reference to a private type via implicit conversion

I've found this interesting behaviour in the nscala_time package (the Scala wrapper for Joda-Time):
import com.github.nscala_time.time.Imports._
import com.github.nscala_time.time.DurationBuilder

object tests {
  val x = 3 seconds
  //> x: com.github.nscala_time.time.DurationBuilder

  val xx: DurationBuilder = 3 seconds
  //> fails to compile:
  //  class DurationBuilder in package time cannot be accessed in package com.github.nscala_time.time
}
What I'm trying to achieve is an implicit conversion from the nscala_time Duration to scala.concurrent.duration.Duration.
I need this because I'm using RxScala and nscala_time in one application.
// e.g. the following should be implicitly converted
// to the nscala_time Duration first,
// then to scala.concurrent.duration.Duration
3 seconds
nscala_time offers a rich time & date API for my application, while I'm using RxScala in the same class for GUI responsiveness.
You can download a simple project to play around: https://dl.dropboxusercontent.com/u/9958045/implicit_vs_private.zip
From the scala-user group: it's a known issue, https://issues.scala-lang.org/browse/SI-1800
Perhaps you can use an implicit conversion? (By the way, Duration in nscala_time is essentially org.joda.time.Duration.)
scala> import com.github.nscala_time.time.Imports._
import com.github.nscala_time.time.Imports._

scala> implicit class DurationHelper(d: org.joda.time.Duration) {
     |   def toScalaDuration = scala.concurrent.duration.Duration.apply(d.getMillis, scala.concurrent.duration.MILLISECONDS)
     | }
defined class DurationHelper

scala> val d = RichInt(3).seconds.toDuration // toDuration is defined on com.github.nscala_time.time.DurationBuilder
d: org.joda.time.Duration = PT3S

scala> def exfun(d: scala.concurrent.duration.Duration) = d.toString
exfun: (d: scala.concurrent.duration.Duration)String

scala> exfun(d.toScalaDuration)
res41: String = 3000 milliseconds
(not using import scala.concurrent.duration._ here, to avoid name clashes with the joda/nscala_time types)
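Putting it together for the original goal: since implicit conversions don't chain, the DurationBuilder still needs an explicit toDuration call, but a single implicit view can take the resulting Joda Duration the rest of the way. A minimal sketch (the object and method names are made up here):

import com.github.nscala_time.time.Imports._

object JodaConversions {
  // implicit view: org.joda.time.Duration => scala.concurrent.duration.Duration
  implicit def jodaToScalaDuration(d: org.joda.time.Duration): scala.concurrent.duration.FiniteDuration =
    scala.concurrent.duration.Duration(d.getMillis, scala.concurrent.duration.MILLISECONDS)
}

import JodaConversions._
// (3 seconds) is a DurationBuilder; toDuration yields org.joda.time.Duration,
// which the implicit view then converts to the scala.concurrent type
val timeout: scala.concurrent.duration.Duration = (3 seconds).toDuration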

Splitting scalac plugin into multiple files

I'd like to split my scalac plugin into multiple files. This sounds easy but I haven't managed to pull it off due to path-dependent type issues stemming from the import global._ line.
Here's Lex Spoon's sample plugin:
package localhost

import scala.tools.nsc
import nsc.Global
import nsc.Phase
import nsc.plugins.Plugin
import nsc.plugins.PluginComponent

class DivByZero(val global: Global) extends Plugin {
  import global._

  val name = "divbyzero"
  val description = "checks for division by zero"
  val components = List[PluginComponent](Component)

  private object Component extends PluginComponent {
    val global: DivByZero.this.global.type = DivByZero.this.global
    val runsAfter = "refchecks"
    // Using the Scala compiler 2.8.x, runsAfter should be written as below:
    // val runsAfter = List[String]("refchecks")
    val phaseName = DivByZero.this.name
    def newPhase(_prev: Phase) = new DivByZeroPhase(_prev)

    class DivByZeroPhase(prev: Phase) extends StdPhase(prev) {
      override def name = DivByZero.this.name
      def apply(unit: CompilationUnit) {
        for (tree @ Apply(Select(rcvr, nme.DIV), List(Literal(Constant(0)))) <- unit.body
             if rcvr.tpe <:< definitions.IntClass.tpe) {
          unit.error(tree.pos, "definitely division by zero")
        }
      }
    }
  }
}
How can I put Component and DivByZeroPhase in their own files without having the import global._ in scope?
Here's a really old project where I've done the same thing:
https://github.com/jsuereth/osgi-scalac-plugin/blob/master/src/main/scala/scala/osgi/compiler/OsgiPlugin.scala
If you don't need to pass path-dependent types from the global around, don't worry about keeping the this.global singleton-type annotations in play.
In the Scala Refactoring library, I solved it by having a trait CompilerAccess:
trait CompilerAccess {
  val global: tools.nsc.Global
}
Now all the other traits that need to access global just declare CompilerAccess as a dependency:
trait TreeTraverser {
  this: CompilerAccess =>
  import global._
  ...
}
and then there's a class that mixes in all these traits and provides an instance of global:
trait SomeRefactoring extends TreeTraverser with OtherTrait with MoreTraits {
  val global = //wherever you get your global from
}
This scheme worked quite well for me.
You can create a separate class for your component and pass global in:
import scala.tools.nsc.{ Global, Phase }
import scala.tools.nsc.plugins.PluginComponent

class TemplateComponent(val global: Global) extends PluginComponent {
  import global._

  val runsAfter = List[String]("refchecks")
  val phaseName = "plugintemplate"

  def newPhase(prev: Phase) = new StdPhase(prev) {
    override def name = phaseName
    def apply(unit: CompilationUnit) = {
      // phase logic goes here
    }
  }
}
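For completeness, here is a sketch of how the plugin class in its own file could wire that component in (TemplatePlugin is a made-up name; the component from the previous snippet is reused):

import scala.tools.nsc.Global
import scala.tools.nsc.plugins.{ Plugin, PluginComponent }

class TemplatePlugin(val global: Global) extends Plugin {
  val name = "plugintemplate"
  val description = "example plugin with its component split into another file"
  // pass the same global instance down, so the component's path-dependent types line up
  val components = List[PluginComponent](new TemplateComponent(global))
}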