I have defined the following macros to get the file, line and object/class from the current location:
http://pastebin.com/UsNLemnK
Using SBT, I have defined two projects, in order to compile the macros first, then the actual project using these macros.
The purpose of these macros is to be used in a log method:
def log( msg: Any, srcFile: String = "", srcLine: String = "", srcClass:String = "")
I am then using this log method as follows:
log(msg, s"$F_",s"$L_",s"$C_")
where F_, L_ and C_ are defined in the macro.
Now, I would like to create a shortcut to avoid this boilerplate and just call:
log(msg)
which should automatically be replaced by
log(msg, s"$F_",s"$L_",s"$C_")
I could define a macro to do this:
def log_(msg: String): Unit = macro logImpl

def logImpl( c: Context )(msg: c.Expr[String]): c.Expr[Unit] = {
  import c.universe._
  reify( log(msg.splice, srcFile = s"$F_", srcLine = s"$L_", srcClass = s"$C_") )
}
but again, this macro needs to be compiled before the project where the log function itself is defined... so I don't see how to break the circular compilation dependency.
Any suggestion about how to do this?
Thanks
Barring the use of macro annotations (which would necessarily and significantly alter your API's syntax), the problem you have to face is that you need the type-checked identifier of your log function.
Since you can't import the entire log implementation, a solution would be to:
wrap the method into a trait,
define this trait in the "macro" project,
add an implicit parameter to the log_ method,
in your "main" project, create an implementation of this trait, and instantiate this implementation in an implicit val visible everywhere you'd like to use the log_ macro (in the package object for example).
Of course, you could also use a simple FunctionN here and avoid the trait definition and implementation, but this way you'll avoid potential conflicts with other same-typed implicits.
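For illustration, a minimal sketch of that FunctionN alternative could look like the following (the type alias, object and value names are mine, not part of the original code):
// Hypothetical sketch: the log function is passed around as a plain function value
// instead of a dedicated trait.
object FunctionAlternativeSketch {
  type LogFn = (Any, String, Int, String) => Unit

  implicit val defaultLog: LogFn =
    (msg, file, line, clazz) => println(List(msg, file, line, clazz).mkString("|"))

  // Drawback: any other implicit (Any, String, Int, String) => Unit in scope
  // would now be ambiguous, which a dedicated trait avoids.
}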
In general, your code would resemble the following:
//"macro" project
trait EncapsulatingTrait {
  def yourMethod(...)
}

object Macros {
  def myMacro(...)(implicit param: EncapsulatingTrait) = macro myMacroImpl
  def myMacroImpl( c: Context )(...)
                 (param: c.Expr[EncapsulatingTrait]): c.Expr[...] = {
    import c.universe._
    reify(param.splice.yourMethod(...))
  }
}
//--------------------------
//"main" project
class Impl extends EncapsulatingTrait {
  def yourMethod(...)
}
...
implicit val defaultParam = new Impl
import Macros.myMacro
myMacro(...)
In your specific case, here's what an implementation could look like:
//"macro" project
package yourpackage
import java.io.File
import language.experimental.macros
import scala.reflect.macros.Context
trait LogFunction {
  def log( msg: Any, srcFile: String = "", srcLine: Int = -1, srcClass: String = "")
}

object Macros {
  // get current line in source code
  def L_ : Int = macro lineImpl
  def lineImpl( c: Context ): c.Expr[Int] = {
    import c.universe._
    val line = Literal( Constant( c.enclosingPosition.line ) )
    c.Expr[Int]( line )
  }

  // get current file from source code (relative path)
  def F_ : String = macro fileImpl
  def fileImpl( c: Context ): c.Expr[String] = {
    import c.universe._
    // `absolute` and `base` are not used below; they would only be needed to build
    // a path relative to the working directory (e.g. base.relativize(absolute))
    val absolute = c.enclosingPosition.source.file.file.toURI
    val base = new File( "." ).toURI
    val path = Literal( Constant( c.enclosingPosition.source.file.file.getName() ) )
    c.Expr[String]( path )
  }

  // get current class/object (a bit sketchy)
  def C_ : String = macro classImpl
  def classImpl( c: Context ): c.Expr[String] = {
    import c.universe._
    val class_ = Literal( Constant( c.enclosingClass.toString.split(" ")( 1 ) ) )
    c.Expr[String]( class_ )
  }

  def log_(msg: String)(implicit logFunc: LogFunction): Unit = macro logImpl
  def logImpl( c: Context )(msg: c.Expr[String])(logFunc: c.Expr[LogFunction]): c.Expr[Unit] = {
    import c.universe._
    reify( logFunc.splice.log(msg.splice, srcFile = fileImpl(c).splice, srcLine = lineImpl(c).splice, srcClass = classImpl(c).splice) )
  }
}
//--------------------------
//"main" project
import yourpackage.LogFunction
class LogImpl extends LogFunction {
def log( msg: Any, srcFile: String = "", srcLine: Int = -1, srcClass:String = "") {
println(List(msg,srcFile,srcLine,srcClass).mkString("|"))
}
}
object testLog {
def main(args: Array[String]): Unit = {
implicit val defaultLog = new LogImpl
import yourpackage.Macros.log_
log_("blah")
}
}
(note that I had to correct the signature of log_ and tweak the macro call a bit)
Related
I want to auto-generate an overridden method in a compiler plugin (Scala 3) for a class like:
trait SpecialSerialize {
  def toJson(sb: StringBuilder, c: SJConfig): Unit = { println("wrong") }
}
case class Person(name:String, age:Int) extends SpecialSerialize
The plugin would generate:
case class Person(name:String, age:Int) extends SpecialSerialize {
  override def toJson(sb: StringBuilder, c: SJConfig): Unit = ... // code here
}
I have a phase:
class ReflectionWorkerPhase extends PluginPhase {
  import tpd._

  val phaseName = "reflectionWorker"
  override val runsAfter = Set(Pickler.name)

  override def transformTypeDef(tree: TypeDef)(implicit ctx: Context): Tree =
    if tree.isClassDef && !tree.rhs.symbol.isStatic then // only look at classes
      // 0. Get a FreshContext so we can set the tree to this tree. (for '{} later)
      implicit val fresh = ctx.fresh
      fresh.setTree(tree)
      QuotesCache.init(fresh)
      implicit val quotes: Quotes = QuotesImpl.apply() // picks up fresh
      import quotes.reflect.*

      // 1. Set up method symbol, define parameters and return type
      val toJsonSymbol = Symbol.newMethod(
        Symbol.spliceOwner,
        "toJson",
        MethodType(
          List("sb", "config"))( // parameter list
          _ => List( // types of the parameters
            TypeRepr.of[StringBuilder],
            TypeRepr.of[SJConfig],
          ),
          _ => TypeRepr.typeConstructorOf(classOf[Unit]) // return type
        ),
        Flags.Override, // Note override here
        Symbol.noSymbol
      )

      // 2. Get our class' Symbol for ownership reassignment
      val classDef = tree.asInstanceOf[ClassDef]
      val classSymbol = classDef.symbol

      // 3. Define our method definition (DefDef) using our method symbol defined above
      val toJsonMethodDef = DefDef(
        toJsonSymbol,
        {
          case List(List(sb: Term, config: Term)) =>
            given Quotes = toJsonSymbol.asQuotes
            Some({
              // Multiple quotes here intentional...
              // Real code will generate a list of quoted statements
              quoted.Expr.ofList(List(
                '{ println("Hello") },
                '{ println("World") }
              ))
            }.asTerm.changeOwner(toJsonSymbol))
        }
      ).changeOwner(classSymbol)

      // 4. Add toJsonMethodDef to tree and return
      val cd = ClassDef.copy(classDef)(
        name = classDef.name,
        constr = classDef.constructor,
        parents = classDef.parents,
        selfOpt = classDef.self,
        body = toJsonMethodDef +: classDef.body
      )
      cd.asInstanceOf[dotty.tools.dotc.ast.tpd.Tree]
    else
      tree
}
When I use the plugin on a sample Person class, I get this error on compile:
Exception in thread "sbt-bg-threads-1" java.lang.ClassFormatError: Duplicate method name "toJson" with signature "(Lscala.collection.mutable.StringBuilder;Lco.blocke.scala_reflection.SJConfig;)V" in class file com/foo/Person
at java.base/java.lang.ClassLoader.defineClass1(Native Method)
at java.base/java.lang.ClassLoader.defineClass(ClassLoader.java:1013)
at java.base/java.security.SecureClassLoader.defineClass(SecureClassLoader.java:150)
at java.base/java.net.URLClassLoader.defineClass(URLClassLoader.java:524)
at java.base/java.net.URLClassLoader$1.run(URLClassLoader.java:427)
at java.base/java.net.URLClassLoader$1.run(URLClassLoader.java:421)
at java.base/java.security.AccessController.doPrivileged(AccessController.java:712)
...
So something in the later compile phases failed to recognize my generated method as an override and copied in the "master" method from the trait as well, and of course the JVM lost its mind at runtime when it found two copies of toJson with the same signature.
How can I fix this so my generated method is recognized as a valid override such that a 2nd toJson method won't be generated in a later phase?
I want to execute splice() for each of my varargs arguments:
import scala.reflect.macros.blackbox
object LoggerMacro {
  def log(context: blackbox.Context)
         (message: context.Expr[String], arguments: context.Expr[Any]*)
         : context.universe.Expr[Unit] = context.universe.reify {
    println(message.splice) // Works :)
    for (argument <- arguments) {
      println(argument.splice) // Fails :(
    }
  }
}
However, I receive the following error message:
LoggerMacro.scala:9:24
the splice cannot be resolved statically, which means there is a cross-stage evaluation involved.
cross-stage evaluations need to be invoked explicitly, so we're showing you this error.
if you're sure this is not an oversight, add scala-compiler.jar to the classpath,
import `scala.tools.reflect.Eval` and call `<your expr>.eval` instead.
println(argument.splice)
Unfortunately, when I add scala-compiler as a dependency and import scala.tools.reflect.Eval, there is still no callable eval method on my expr argument.
How can I access the arguments I receive as varargs?
In Scala 2 it's easier to work with Trees (q"..." and splicing with $, ..$, ...$) than with Exprs (and .splice). So try
import scala.language.experimental.macros
import scala.reflect.macros.blackbox
object Macro {
  def logMacro(message: String, arguments: Any*): Unit = macro log

  def log(c: blackbox.Context)(message: c.Tree, arguments: c.Tree*): c.Tree = {
    import c.universe._
    q"""
      _root_.scala.Predef.println($message)
      for (argument <- _root_.scala.collection.immutable.Seq.apply(..$arguments)) {
        _root_.scala.Predef.println(argument)
      }
    """
  }
}
Macro.logMacro("a", 1, 2, 3)
// scalacOptions += "-Ymacro-debug-lite"
//{
// _root_.scala.Predef.println("a");
// _root_.scala.collection.immutable.Seq.apply(1, 2, 3).foreach(((argument) => _root_.scala.Predef.println(argument)))
//}
//a
//1
//2
//3
But if you prefer Exprs/.splice then you can write yourself a helper transforming Seq[Expr[A]] into Expr[Seq[A]].
import scala.language.experimental.macros
import scala.reflect.macros.blackbox
object Macro {
  def logMacro(message: String, arguments: Any*): Unit = macro log

  def log(c: blackbox.Context)(message: c.Expr[String], arguments: c.Expr[Any]*): c.Expr[Unit] = {
    import c.universe._

    def exprsToExpr[A: WeakTypeTag](exprs: Seq[Expr[A]]): Expr[Seq[A]] =
      exprs.foldRight(reify { Seq.empty[A] }) { (expr, acc) => reify {
        expr.splice +: acc.splice
      }}

    reify {
      println(message.splice)
      for (argument <- exprsToExpr(arguments).splice) {
        println(argument)
      }
    }
  }
}
Macro.logMacro("a", 1, 2, 3)
// scalacOptions += "-Ymacro-debug-lite"
//{
// Predef.println("a");
// {
// final <synthetic> <artifact> val rassoc$1 = 1;
// {
// final <synthetic> <artifact> val rassoc$1 = 2;
// {
// final <synthetic> <artifact> val rassoc$1 = 3;
// `package`.Seq.empty[Any].$plus$colon(rassoc$1)
//}.$plus$colon(rassoc$1)
//}.$plus$colon(rassoc$1)
//}.foreach(((argument) => Predef.println(argument)))
//}
//a
//1
//2
//3
The tree is slightly different but runtime result is the same.
You couldn't do argument.splice because argument is not an Expr; it's a local variable defined inside the scope quoted with reify. Trying to splice it is "cross-stage evaluation": you were trying to splice not a value from the current stage (an Expr[A]) but one from the next stage (an A).
For the same reason .eval (evaluating an Expr[A] into an A) didn't work.
The difference between .splice and .eval is that the former just inserts a tree (into another tree) while the latter evaluates it (if possible).
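A minimal sketch of the distinction, assuming a macro implementation of your own (the object and method names are illustrative):
import scala.reflect.macros.blackbox

object StageSketch {
  def impl(c: blackbox.Context)(arg: c.Expr[Int]): c.Expr[Unit] = {
    import c.universe._
    reify {
      val x: Int = arg.splice // fine: `arg` is an Expr that exists in the current (macro) stage
      // `x.splice` would not compile: `x` is a plain Int that only exists in the generated
      // code (the next stage) -- exactly the cross-stage evaluation error quoted above.
      println(x)
    }
  }
}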
I wrote some code which acquires some implicit values from the companion object, like this:
package example.implicits
class Test {
  import Test.GetValue
  import Test.Implicits._

  val intV = getV[Int]
  val stringV = getV[String]

  private def getV[T](implicit getV: GetValue[T]): T = getV.value
}

object Test {
  trait GetValue[T] {
    def value: T
  }

  object Implicits {
    implicit val intValue = new GetValue[Int] {
      def value = 10
    }
    implicit val stringValue = new GetValue[String] {
      def value = "ten"
    }
  }
}
This piece of code does not compile; the compiler complains that it couldn't find the required implicit values. Note that my environment is
scala 2.11.8 on Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_66
However, if I use these values explicitly, nothing goes wrong:
class Test {
  import Test.GetValue
  import Test.Implicits._

  val intV = getV[Int](intValue)
  val stringV = getV[String](stringValue)

  private def getV[T](implicit getV: GetValue[T]): T = getV.value
}
Furthermore, if I declare new implicit values as follows:
class Test {
  import Test.GetValue
  import Test.Implicits._

  implicit val intValue1 = intValue
  implicit val stringValue1 = stringValue

  val intV = getV[Int]
  val stringV = getV[String]

  private def getV[T](implicit getV: GetValue[T]): T = getV.value
}
errors will be raised because of ambiguous implicit values.
When I swap the positions of class Test and object Test, everything works:
object Test {
  trait GetValue[T] {
    def value: T
  }

  object Implicits {
    implicit val intValue = new GetValue[Int] {
      def value = 10
    }
    implicit val stringValue = new GetValue[String] {
      def value = "ten"
    }
  }
}

class Test {
  import Test.GetValue
  import Test.Implicits._

  val intV = getV[Int]
  val stringV = getV[String]

  private def getV[T](implicit getV: GetValue[T]): T = getV.value
}
So why can't Scala find the implicit values after I've already imported them in the first case?
And why can it do so when I swap their positions?
That's because the compiler hasn't inferred the type of Test.Implicits.intValue yet at the time it resolves the implicits in getV[Int].
Just give Test.Implicits.intValue and Test.Implicits.stringValue their types explicitly, and your problem will be solved.
I read somewhere (sorry, can't remember where exactly) that implicit definitions should always be given an explicit type, notably to avoid this kind of behaviour.
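Concretely, annotating the two implicits should be enough; a minimal sketch of the fix:
object Implicits {
  // explicit result types let these implicits be used before their bodies have been type-checked
  implicit val intValue: GetValue[Int] = new GetValue[Int] {
    def value = 10
  }
  implicit val stringValue: GetValue[String] = new GetValue[String] {
    def value = "ten"
  }
}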
I still get the same error. I have defined the marshaller (and imported it); it appears that the case class is not in scope when the function is polymorphic, and this throws a "Cannot find JsonWriter or JsonFormat type class" error for the case class. Is there a reason why spray-json cannot find the implicit marshaller for the case class, even when it is defined? Is the case class in scope? Link to marshaller
import spray.json._
import queue.MedusaJsonProtocol._
object MysqlDb {
  ...
}

case class UserDbEntry(
  id: Int,
  username: String,
  countryId: Int,
  created: LocalDateTime
)

trait MysqlDb {
  implicit lazy val pool = MysqlDb.pool
}

trait HydraMapperT extends MysqlDb {
  val FetchAllSql: String
  def fetchAll(currentDate: String): Future[List[HydraDbRow]]
  def getJson[T](row: T): String
}

object UserHydraDbMapper extends HydraMapperT {
  override val FetchAllSql = "SELECT * FROM user WHERE created >= ?"

  override def fetchAll(currentDate: String): Future[List[UserDbEntry]] = {
    pool.sendPreparedStatement(FetchAllSql, Array(currentDate)).map { queryResult =>
      queryResult.rows match {
        case Some(rows) =>
          rows.toList map (x => rowToModel(x))
        case None => List()
      }
    }
  }

  override def getJson[UserDbEntry](row: UserDbEntry): String = {
    HydraQueueMessage(
      tableType = HydraTableName.UserTable,
      payload = row.toJson.toString()
    ).toJson.toString()
  }

  private def rowToModel(row: RowData): UserDbEntry = {
    UserDbEntry(
      id = row("id").asInstanceOf[Int],
      username = row("username").asInstanceOf[String],
      countryId = row("country_id").asInstanceOf[Int],
      created = row("created").asInstanceOf[LocalDateTime]
    )
  }
}
The error occurs at payload = row.toJson.toString(): "Can't find marshaller for UserDbEntry".
You have defined UserDbEntry locally and there is no JSON marshaller for that type. Add the following:
implicit val userDbEntryFormat = Json.format[UserDbEntry]
I'm not sure how you can call row.toJson given UserDbEntry is a local case class. There must be a macro in there somewhere, but it's fairly clear that it's not in scope for the local UserDbEntry.
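For reference, if the project is on spray-json (as the question's imports suggest), the corresponding definitions would look roughly like the sketch below; the object name is mine, java.time.LocalDateTime and its string encoding are assumptions, and in the real code this would live in MedusaJsonProtocol:
import java.time.LocalDateTime
import spray.json._

object UserDbEntryJsonProtocol extends DefaultJsonProtocol {

  // spray-json has no built-in format for LocalDateTime, so one has to be supplied
  implicit val localDateTimeFormat: JsonFormat[LocalDateTime] = new JsonFormat[LocalDateTime] {
    def write(dt: LocalDateTime): JsValue = JsString(dt.toString)
    def read(json: JsValue): LocalDateTime = json match {
      case JsString(s) => LocalDateTime.parse(s)
      case other       => deserializationError("Expected ISO date-time string, got: " + other)
    }
  }

  implicit val userDbEntryFormat: RootJsonFormat[UserDbEntry] = jsonFormat4(UserDbEntry)
}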
Edit
Now that I see your Gist, it looks like you have a package dependency problem. As designed, it'll be circular. You have defined the JSON marshaller in package com.at.medusa.core.queue, which imports UserDbEntry, which depends on package com.at.medusa.core.queue for marshalling.
Having a trait
trait Persisted {
  def id: Long
}
how do I implement a method that accepts an instance of any case class and returns its copy with the trait mixed in?
The signature of the method looks like:
def toPersisted[T](instance: T, id: Long): T with Persisted
This can be done with macros (which have officially been part of Scala since 2.10.0-M3). Here's a gist example of what you are looking for.
1) My macro generates a local class that inherits from the provided case class and Persisted, much like new T with Persisted would do. Then it caches its argument (to prevent multiple evaluations) and creates an instance of the created class (a hand-written approximation is sketched below).
2) How did I know what trees to generate? I have a simple app, parse.exe, that prints the AST that results from parsing input code. So I just invoked parse class Person$Persisted1(first: String, last: String) extends Person(first, last) with Persisted, noted the output and reproduced it in my macro. parse.exe is a wrapper for scalac -Xprint:parser -Yshow-trees -Ystop-after:parser. There are different ways to explore ASTs; read more in "Metaprogramming in Scala 2.10".
3) Macro expansions can be sanity-checked if you provide -Ymacro-debug-lite as an argument to scalac. In that case all expansions will be printed out, and you'll be able to detect codegen errors faster.
edit. Updated the example for 2.10.0-M7
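To make point 1 concrete, here is a hand-written approximation of what such an expansion does; the class, method and parameter names are illustrative only, since the real macro derives them from the inspected case class:
trait Persisted { def id: Long }
case class Person(first: String, last: String)

object ExpansionSketch {
  // roughly what toPersisted(somePerson, 42L) would expand into
  def toPersistedByHand(person: Person, id: Long): Person with Persisted = {
    val cached = person // cache the argument so the expression is evaluated only once
    class PersonPersisted(first: String, last: String, val id: Long)
      extends Person(first, last) with Persisted
    new PersonPersisted(cached.first, cached.last, id)
  }
}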
It is not possible to achieve what you want using vanilla Scala. The problem is that mixins like the following:
scala> class Foo
defined class Foo
scala> trait Bar
defined trait Bar
scala> val fooWithBar = new Foo with Bar
fooWithBar: Foo with Bar = $anon$1@10ef717
create a Foo with Bar mixed in, but the mixing is not done at runtime. The compiler simply generates a new anonymous class:
scala> fooWithBar.getClass
res3: java.lang.Class[_ <: Foo] = class $anon$1
See Dynamic mixin in Scala - is it possible? for more info.
What you are trying to do is known as record concatenation, something that Scala's type system does not support. (Fwiw, there exist type systems - such as this and this - that provide this feature.)
I think type classes might fit your use case, but I cannot tell for sure as the question doesn't provide sufficient information on what problem you are trying to solve.
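For what it's worth, a minimal sketch of what the type-class route could look like (every name here is illustrative, since the question doesn't pin the use case down):
object TypeClassSketch {
  trait Persister[T] {
    def idOf(value: T): Long
  }

  case class Person(name: String)
  object Person {
    // how a Person gets its id is application-specific; a constant is used purely for illustration
    implicit val personPersister: Persister[Person] = new Persister[Person] {
      def idOf(value: Person): Long = 42L
    }
  }

  // instead of producing a T with Persisted, callers supply evidence that T can be persisted
  def persist[T](value: T)(implicit p: Persister[T]): Long = p.idOf(value)
}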
Update
You can find an up-to-date working solution, which utilizes the Toolbox API of Scala 2.10.0-RC1, as part of the SORM project.
The following solution is based on the Scala 2.10.0-M3 reflection API and Scala Interpreter. It dynamically creates and caches classes inheriting from the original case classes with the trait mixed in. Thanks to caching at maximum this solution should dynamically create only one class for each original case class and reuse it later.
Since the new reflection API isn't well documented or stable yet, and there are no tutorials on it, this solution may involve some clumsy, repetitive actions and quirks.
The following code was tested with Scala 2.10.0-M3.
1. Persisted.scala
The trait to be mixed in. Please note that I've changed it a bit due to updates in my program.
trait Persisted {
  def key: String
}
2. PersistedEnabler.scala
The actual worker object
import tools.nsc.interpreter.IMain
import tools.nsc._
import reflect.mirror._

object PersistedEnabler {
  def toPersisted[T <: AnyRef](instance: T, key: String)
                              (implicit instanceTag: TypeTag[T]): T with Persisted = {
    val args = {
      val valuesMap = propertyValuesMap(instance)
      key ::
        methodParams(constructors(instanceTag.tpe).head.typeSignature)
          .map(_.name.decoded.trim)
          .map(valuesMap(_))
    }
    persistedClass(instanceTag)
      .getConstructors.head
      .newInstance(args.asInstanceOf[List[Object]]: _*)
      .asInstanceOf[T with Persisted]
  }

  private val persistedClassCache =
    collection.mutable.Map[TypeTag[_], Class[_]]()

  private def persistedClass[T](tag: TypeTag[T]): Class[T with Persisted] = {
    if (persistedClassCache.contains(tag))
      persistedClassCache(tag).asInstanceOf[Class[T with Persisted]]
    else {
      val name = generateName()
      val code = {
        val sourceParams =
          methodParams(constructors(tag.tpe).head.typeSignature)
        val newParamsList = {
          def paramDeclaration(s: Symbol): String =
            s.name.decoded + ": " + s.typeSignature.toString
          "val key: String" :: sourceParams.map(paramDeclaration) mkString ", "
        }
        val sourceParamsList =
          sourceParams.map(_.name.decoded).mkString(", ")
        val copyMethodParamsList =
          sourceParams.map(s => s.name.decoded + ": " + s.typeSignature.toString + " = " + s.name.decoded).mkString(", ")
        val copyInstantiationParamsList =
          "key" :: sourceParams.map(_.name.decoded) mkString ", "
        """
        class """ + name + """(""" + newParamsList + """)
          extends """ + tag.sym.fullName + """(""" + sourceParamsList + """)
          with """ + typeTag[Persisted].sym.fullName + """ {
          override def copy(""" + copyMethodParamsList + """) =
            new """ + name + """(""" + copyInstantiationParamsList + """)
        }
        """
      }
      interpreter.compileString(code)
      val c =
        interpreter.classLoader.findClass(name)
          .asInstanceOf[Class[T with Persisted]]
      interpreter.reset()
      persistedClassCache(tag) = c
      c
    }
  }

  private lazy val interpreter = {
    val settings = new Settings()
    settings.usejavacp.value = true
    new IMain(settings, new NewLinePrintWriter(new ConsoleWriter, true))
  }

  private var generateNameCounter = 0L
  private def generateName() = synchronized {
    generateNameCounter += 1
    "PersistedAnonymous" + generateNameCounter.toString
  }

  // REFLECTION HELPERS
  private def propertyNames(t: Type) =
    t.members.filter(m => !m.isMethod && m.isTerm).map(_.name.decoded.trim)

  private def propertyValuesMap[T <: AnyRef](instance: T) = {
    val t = typeOfInstance(instance)
    propertyNames(t)
      .map(n => n -> invoke(instance, t.member(newTermName(n)))())
      .toMap
  }

  private type MethodType = {def params: List[Symbol]; def resultType: Type}

  private def methodParams(t: Type): List[Symbol] =
    t.asInstanceOf[MethodType].params

  private def methodResultType(t: Type): Type =
    t.asInstanceOf[MethodType].resultType

  private def constructors(t: Type): Iterable[Symbol] =
    t.members.filter(_.kind == "constructor")

  private def fullyQualifiedName(s: Symbol): String = {
    def symbolsTree(s: Symbol): List[Symbol] =
      if (s.enclosingTopLevelClass != s)
        s :: symbolsTree(s.enclosingTopLevelClass)
      else if (s.enclosingPackageClass != s)
        s :: symbolsTree(s.enclosingPackageClass)
      else
        Nil
    symbolsTree(s)
      .reverseMap(_.name.decoded)
      .drop(1)
      .mkString(".")
  }
}
3. Sandbox.scala
The test app
import PersistedEnabler._
object Sandbox extends App {
  case class Artist(name: String, genres: Set[Genre])
  case class Genre(name: String)

  val artist = Artist("Nirvana", Set(Genre("rock"), Genre("grunge")))
  val persisted = toPersisted(artist, "some-key")
  assert(persisted.isInstanceOf[Persisted])
  assert(persisted.isInstanceOf[Artist])
  assert(persisted.key == "some-key")
  assert(persisted.name == "Nirvana")
  assert(persisted == artist) // an interesting and useful effect

  val copy = persisted.copy(name = "Puddle of Mudd")
  assert(copy.isInstanceOf[Persisted])
  assert(copy.isInstanceOf[Artist])
  // the only problem: compiler thinks that `copy` does not implement `Persisted`,
  // so to access `key` we have to specify it manually:
  assert(copy.asInstanceOf[Artist with Persisted].key == "some-key")
  assert(copy.name == "Puddle of Mudd")
  assert(copy != persisted)
}
While it's not possible to compose an object after its creation, you can get very broad checks of whether an object has a specific shape by using type aliases and structural types:
type Persisted = { def id: Long } // a structural type: any object with `def id: Long` conforms

class Person {
  def id: Long = 5
  def name = "dude"
}

// calls through a structural type use reflection at runtime
// (import scala.language.reflectiveCalls to silence the feature warning)
def persist(obj: Persisted) = {
  obj.id
}

persist(new Person)
Any object with a def id:Long will qualify as Persisted.
Achieving what I THINK you are trying to do is possible with implicit conversions:
object Persistable {
  type Compatible = { def id: Long }

  // implicit conversions need `import scala.language.implicitConversions` to avoid a feature warning
  implicit def obj2persistable(obj: Compatible) = new Persistable(obj)
}

class Persistable(val obj: Persistable.Compatible) {
  def persist() = println("Persisting: " + obj.id)
}

import Persistable.obj2persistable
new Person().persist()