I use Scala macros to extract all objects from a package, and then I would like to get some values from those objects:
package example
trait A {}
object B extends A { val test = "test" }
//macro
import scala.language.experimental.macros
import scala.reflect.macros.blackbox.Context

object Macro {
def getVals(packageName: String) = macro getValsImpl
def getValsImpl(c: Context)(packageName: c.Expr[String]): c.Expr[Unit] = {
import c.universe._
val pkg = packageName.tree match {
case Literal(Constant(name: String)) => c.mirror.staticPackage(name)
}
val objects = pkg.typeSignature.members.collect {
//get all objects which are a subtype of `A`
case x if (x.isModule && x.typeSignature <:< typeOf[A]) => x
}.toList
val o = objects(0)
println(o.test)
reify {}
}
}
But I got an error:
value test is not a member of c.universe.ModuleSymbol
You are mistaking compile time artifacts for actual runtime values.
A macro implementation is invoked at compile time. The actual objects you are trying to access don't exist yet (only the Symbols that represent them do). That is why you are getting a ModuleSymbol where you expect the B object.
In other words, you simply can't access B in macro implementation.
Macros are meant to analyze, transform and generate code (represented as Exprs and Trees). So what you can do is, given a ModuleSymbol that represents an object, generate code that, when compiled and finally executed at runtime, will evaluate to that object. But I don't know if this is what you want here.
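For illustration, here is a minimal sketch of that last option (not from the question; it assumes you change the macro to return a String and that quasiquotes are available): instead of reading o.test inside the macro, splice the ModuleSymbol into the generated tree so the field is read at runtime.
def getVals(packageName: String): String = macro getValsImpl
def getValsImpl(c: Context)(packageName: c.Expr[String]): c.Expr[String] = {
  import c.universe._
  val pkg = packageName.tree match {
    case Literal(Constant(name: String)) => c.mirror.staticPackage(name)
  }
  val objects = pkg.typeSignature.members.collect {
    case x if x.isModule && x.typeSignature <:< typeOf[A] => x
  }.toList
  val o = objects.head
  // q"$o.test" builds roughly Select(Ident(B), TermName("test")); that tree is
  // compiled into the call site and evaluated at runtime, not inside the macro
  c.Expr[String](q"$o.test")
}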
In a Scala 3 macro that takes a type parameter T, you can use TypeRepr.of[T] and the new Scala 3 reflection API to explore the companionClass of T, and find the Symbol for an arbitrary method on that companion class (eg companionClass.declarations.find(_.name == "list") to find a list() method).
Given the Symbol for a companion object method, how would you then invoke that method within a quoted code block?
I'm guessing I would need to convert that Symbol to a Expr[T], but I don't know how to do that!
In a Scala 2 macro, the invocation of a listMethod of type c.universe.Symbol in a q"..." quasiquote seems pretty simple - just say $listMethod, and then you can start mapping on the resulting list, eg:
q"""
$listMethod.map(_.toString)
"""
Trying to do a similar thing in a Scala 3 macro gets an error like this:
[error] 27 | ${listMethod}.map(_.toString)
[error] | ^^^^^^^^^^
[error] | Found: (listMethod : x$1.reflect.Symbol)
[error] | Required: quoted.Expr[Any]
What is the correct code to get this working in Scala 3?
You can see more code context in the AvroSerialisableMacro classes (Scala 2 compiles, Scala 3 currently nowhere near!) here: https://github.com/guardian/marley/pull/77/files
First, let's talk about how to call a method via its symbol in general.
You might need Select. You can obtain it in a few different ways, e.g.:
New(TypeTree.of[YourType]).select(primaryConstructor) // when you want to create something
expression.asTerm.select(method) // when you want to call it on something
Once you selected method you can provide arguments:
select.appliedToArgs(args) // if there is only 1 argument list
select.appliedToArgss(args) // if there is more than one argument list
// (type parameter list is listed in paramSymss
// but shouldn't be used here, so filter it out!)
select.appliedToNone // if this is a method like "def method(): T"
// (single, but empty, parameter list)
select.appliedToArgss(Nil) // if this is a method like "def method: T"
// (with no parameter list at all)
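For instance, here is a sketch of calling a method with two value-parameter lists, filtering the type-parameter clause out of paramSymss as noted above (the names receiver and method and the target signature are hypothetical; this sits inside a macro implementation with import quotes.reflect.* in scope):
// hypothetical target method: def m(a: Int)(b: String): Unit
val receiver: Term       // e.g. someExpr.asTerm
val method: Symbol       // e.g. found via .declarations on the receiver's type
// paramSymss also lists a type-parameter clause for generic methods; such a
// clause contains type symbols, so it can be filtered out:
val valueParamLists = method.paramSymss.filterNot(_.exists(_.isType))
val argss: List[List[Term]] = List(List(Expr(1).asTerm), List(Expr("x").asTerm))
assert(argss.map(_.size) == valueParamLists.map(_.size)) // argument lists must line up
receiver.select(method).appliedToArgss(argss)            // builds receiver.m(1)("x")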
There are also other methods like appliedToType, appliedToTypeTrees, but if you have a method name as a Symbol and want to use it to call something this should be a good starting point.
And remember that source code of Quotes is your friend, so even when your IDE doesn't give you any suggestions, it can point you towards some solution.
In theory these methods are defined on Term rather than Select (<: Term), but your use case will most likely be picking an expression and calling a method on it with some parameters. So a full example could be e.g.
val expression: Expr[Input]
val method: Symbol
val args: List[Term]
// (input: Input).method(args) : Output
expression // Expr[Input]
.asTerm // Term
.select(method) // Select
.appliedToArgs(args) // Term
.asExpr // Expr[?]
.asExprOf[Output] // Expr[Output]
Obviously, proving that the expression can call the method, and making sure that the types of the Terms in args match the parameter types of the method, is on you. It is a bit more hassle than it was in Scala 2, since quotes only let you work with Type[T] and Expr[T], so anything that doesn't fall under that category has to be implemented with macros/the Tasty ADT until you get to the point where you can return an Expr inside ${}.
That said, the example you linked shows that these calls are rather hardcoded, so you don't have to look up Symbols and call them yourself. Your code will most likely make do with:
// T <: ThriftEnum
// Creating companion's Expr can be done with asExprOf called on
// Ref from Dmytro Mitin's answer
def findCompanionOfThisOrParent(): Expr[ThriftEnumObject[T]] = ...
// _Expr_ with the List value you constructed instead of Symbol!
val listOfValues: Expr[List[T]] = '{
${ findCompanionOfThisOrParent() }.list
}
// once you have an Expr you don't have to do any magic
// to call a method on it, Quotes works nice
'{
...
val valueMap = Map(${ listOfValues }.map(x => x ->
org.apache.avro.generic.GenericData.get.createEnum(
com.gu.marley.enumsymbols.SnakesOnACamel.toSnake(x.name), schemaInstance)
): _*)
...
}
Difference between Scala 2 quasiquotes and Scala 3 quotations is that the former must compile during compile time of the main code using macros (i.e. during macro expansion, macro runtime) while the latter must compile earlier, at macro compile time. So Scala 3 quotations '{...}/${...} are more like Scala 2 reify{...}/.splice than Scala 2 quasiquotes q"..."/${...}.
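As a rough illustration of that difference (twiceImpl2/twiceImpl3 are made-up names, not from the linked PR, and the two halves obviously live in different compiler versions):
// Scala 2: reify/splice - the body is typechecked when the macro itself is compiled
def twiceImpl2(c: scala.reflect.macros.blackbox.Context)(x: c.Expr[Int]): c.Expr[Int] = {
  import c.universe._
  reify { x.splice + x.splice }
  // a quasiquote like c.Expr[Int](q"${x.tree} + ${x.tree}") instead builds an untyped
  // tree that is only typechecked later, when the macro expands at a call site
}
// Scala 3: '{...}/${...} - likewise typechecked when the macro itself is compiled
import scala.quoted.*
def twiceImpl3(x: Expr[Int])(using Quotes): Expr[Int] =
  '{ $x + $x }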
You have to re-create the AST. Let's see what shape the AST should have:
object B:
def fff(): Unit = ()
import scala.quoted.*
inline def foo(): Unit = ${fooImpl}
def fooImpl(using Quotes): Expr[Unit] =
import quotes.reflect.*
println('{B.fff()}.asTerm.show(using Printer.TreeStructure))
'{()}
foo() // ... Apply(Select(Ident("B"), "fff"), Nil)
So in order to re-create the AST, try to use Apply(...) and Select.unique(..., "list"):
import scala.quoted.*
inline def foo[T](): Unit = ${fooImpl[T]}
def fooImpl[T: Type](using Quotes): Expr[Unit] =
import quotes.reflect.*
val sym = TypeRepr.of[T].typeSymbol
'{
println("aaa")
${
Apply(
Select.unique(
Ref(sym.companionModule),
"list"
),
Nil
).asExprOf[Unit]
}
}
Testing (in a different file):
class A
object A {
def list(): Unit = println("list")
}
foo[A]()
//scalac: {
// scala.Predef.println("aaa")
// A.list()
//}
// prints at runtime:
// aaa
// list
Using method symbol rather than its name and using convenience methods rather than AST nodes directly, you can rewrite fooImpl as
def fooImpl[T: Type](using Quotes): Expr[Unit] =
import quotes.reflect.*
val sym = TypeRepr.of[T].typeSymbol
val listMethod = sym.companionClass.declarations.find(_.name == "list").get
'{
println("aaa")
${
Ref(sym.companionModule)
.select(listMethod)
.appliedToArgs(Nil)
.asExprOf[Unit]
}
}
This is just an example of how to create the AST. You should use the actual return type of def list() instead of Unit in .asExprOf[Unit].
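For instance, if the companion's list() actually returned List[T], a sketch of the splice under that assumption (listExpr is a made-up name; sym and listMethod as above) could be:
// assumes T: Type, Quotes and import quotes.reflect.* as in fooImpl above
val listExpr: Expr[List[T]] =
  Ref(sym.companionModule)
    .select(listMethod)
    .appliedToNone            // list() takes a single, empty parameter list
    .asExprOf[List[T]]
'{ println(${ listExpr }.map(_.toString)) }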
Context: I'm trying to write a macro that is statically aware of a non-fixed number of types. I'm trying to pass these types as a single type parameter using an HList. It would be called as m[ConcreteType1 :: ConcreteType2 :: ... :: HNil](). The macro then builds a match statement which requires some implicits to be found at compile time, a bit like how a json serialiser might demand implicit encoders. I've got a working implementation of the macro when used on a fixed number of type parameters, as follows:
def m[T1, T2](): Int = macro mImpl[T1, T2]
def mImpl[T1: c.WeakTypeTag, T2: c.WeakTypeTag](c: Context)(): c.Expr[Int] = {
import c.universe._
val t = Seq(
weakTypeOf[T1],
weakTypeOf[T2]
).map(c => cq"a: $c => externalGenericCallRequiringImplicitsAndReturningInt(a)")
val cases = q"input match { case ..$t }"
c.Expr[Int](cases)
}
Question: If I have a WeakTypeTag[T] for some T <: HList, is there any way to turn that into a Seq[Type]?
def hlistToSeq[T <: HList](hlistType: WeakTypeTag[T]): Seq[Type] = ???
My instinct is to write a recursive match which turns each T <: HList into either H :: T or HNil, but I don't think that kind of matching exists in scala.
I'd like to hear of any other way to get a list of arbitrary size of types into a macro, bearing in mind that I would need a Seq[Type], not Expr[Seq[Type]], as I need to map over them in macro code.
A way of writing a similar 'macro' in Dotty would be interesting too - I'm hoping it'll be simpler there, but haven't fully investigated yet.
Edit (clarification): The reason I'm using a macro is that I want a user of the library I'm writing to provide a collection of types (perhaps in the form of an HList), which the library can iterate over and expect implicits relating to. I say library, but it will be compiled together with the code that uses it, in order for the macros to run; in any case it should be reusable with different collections of types. It's a bit confusing, but I think I've worked this bit out - I just need to be able to build macros that can operate on lists of types.
Currently you seem not to need macros. It seems type classes or shapeless.Poly can be enough.
def externalGenericCallRequiringImplicitsAndReturningInt[C](a: C)(implicit
mtc: MyTypeclass[C]): Int = mtc.anInt
trait MyTypeclass[C] {
def anInt: Int
}
object MyTypeclass {
implicit val mtc1: MyTypeclass[ConcreteType1] = new MyTypeclass[ConcreteType1] {
override val anInt: Int = 1
}
implicit val mtc2: MyTypeclass[ConcreteType2] = new MyTypeclass[ConcreteType2] {
override val anInt: Int = 2
}
//...
}
val a1: ConcreteType1 = null
val a2: ConcreteType2 = null
externalGenericCallRequiringImplicitsAndReturningInt(a1) //1
externalGenericCallRequiringImplicitsAndReturningInt(a2) //2
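If you do want to stay with the macro route from the question (turning a WeakTypeTag[T <: HList] into a Seq[Type]), one way is to recurse on the Type itself rather than on values. A rough sketch, not compiler-checked, assuming shapeless's :: and HNil and a helper named hlistToSeq as in the question:
import scala.reflect.macros.blackbox.Context
def hlistToSeq(c: Context)(hlistTpe: c.Type): Seq[c.Type] = {
  import c.universe._
  val consSym = typeOf[shapeless.::[_, shapeless.HList]].typeConstructor.typeSymbol
  hlistTpe.dealias match {
    // H :: T: keep the head type and recurse on the tail
    case TypeRef(_, sym, List(head, tail)) if sym == consSym =>
      head +: hlistToSeq(c)(tail)
    // HNil (or anything unexpected) ends the recursion
    case _ => Seq.empty
  }
}
// inside mImpl[T: c.WeakTypeTag] you would then call hlistToSeq(c)(weakTypeOf[T])
// and map over the resulting Seq[Type] to build the cq"..." cases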
I have quite a big structure of case classes, and somewhere deep inside this structure I have fields which I want to refine, for example, to make lists non-empty. Is it possible to tell ScalaCheck to make those lists non-empty using automatic derivation from the scalacheck-magnolia project (without providing each field specifically)?
Example:
import com.mrdziuban.ScalacheckMagnolia.deriveArbitrary
import org.scalacheck.Arbitrary
import org.scalacheck.Gen
case class A(b: B, c: C)
case class B(list: List[Long])
case class C(list: List[Long])
// I've tried:
def genNEL[T: Gen]: Gen[List[T]] = Gen.nonEmptyListOf(implicitly[Gen[T]])
implicit val deriveNEL = Arbitrary(genNEL)
implicit val deriveA = implicitly[Arbitrary[A]](deriveArbitrary)
But it didn't work out.
I'm not sure how to be generic, since I'm not familiar with getting automatic derivation for Arbitrary with scalacheck-magnolia. It seems like scalacheck-magnolia is good for deriving an Arbitrary for case classes, but maybe not for containers (lists, vectors, arrays, etc.).
If you want to use plain ScalaCheck, you could just define the implicit Arbitrary for A yourself. Doing it by hand is some extra boilerplate, but it has the benefit that you have more control if you want to use different generators for different parts of your data structure.
Here's an example where an Arbitrary list of longs is non-empty by default, but is empty for B.
implicit val listOfLong =
Arbitrary(Gen.nonEmptyListOf(Arbitrary.arbitrary[Long]))
implicit val arbC = Arbitrary {
Gen.resultOf(C)
}
implicit val arbB = Arbitrary {
implicit val listOfLong =
Arbitrary(Gen.listOf(Arbitrary.arbitrary[Long]))
Gen.resultOf(B)
}
implicit val arbA = Arbitrary {
Gen.resultOf(A)
}
property("arbitrary[A]") = {
Prop.forAll { a: A =>
a.b.list.size >= 0 && a.c.list.size > 0
}
}
I am struggling with how to create an instance of Functor[Dataset]... the problem is that when you map from A to B, the Encoder[B] must be in implicit scope, but I am not sure how to do it.
implicit val datasetFunctor: Functor[Dataset] = new Functor[Dataset] {
override def map[A, B](fa: Dataset[A])(f: A => B): Dataset[B] = fa.map(f)
}
Of course this code throws a compilation error, since Encoder[B] is not available, but I can't add Encoder[B] as an implicit parameter because it would change the map method signature. How can I solve this?
You cannot apply f right away, because you are missing the Encoder. The only obvious direct solution would be: take cats and re-implement all the interfaces, adding an implicit Encoder argument. I don't see any way to implement a Functor for Dataset directly.
However maybe the following substitute solution is good enough.
What you could do is create a wrapper for the dataset, which has a map method without the implicit Encoder, but additionally has a method toDataset, which needs the Encoder at the very end.
For this wrapper, you could apply a construction which is very similar to the so-called Coyoneda-construction (or Coyo? What do they call it today? I don't know...). It essentially is a way to implement a "free functor" for an arbitrary type constructor.
Here is a sketch (it compiles with cats 1.0.1; the Spark traits are replaced by dummies):
import scala.language.higherKinds
import cats.Functor
/** Dummy for spark-Encoder */
trait Encoder[X]
/** Dummy for spark-Dataset */
trait Dataset[X] {
def map[Y](f: X => Y)(implicit enc: Encoder[Y]): Dataset[Y]
}
/** Coyoneda-esque wrapper for `Dataset`
* that simply stashes all arguments to `map` away
* until a concrete `Encoder` is supplied during the
* application of `toDataset`.
*
* Essentially: the wrapped original dataset + concatenated
* list of functions which have been passed to `map`.
*/
abstract class MappedDataset[X] private () { self =>
type B
val base: Dataset[B]
val path: B => X
def toDataset(implicit enc: Encoder[X]): Dataset[X] = base map path
def map[Y](f: X => Y): MappedDataset[Y] = new MappedDataset[Y] {
type B = self.B
val base = self.base
val path: B => Y = f compose self.path
}
}
object MappedDataset {
/** Constructor for MappedDatasets.
*
* Wraps a `Dataset` into a `MappedDataset`
*/
def apply[X](ds: Dataset[X]): MappedDataset[X] = new MappedDataset[X] {
type B = X
val base = ds
val path = identity
}
}
object MappedDatasetFunctor extends Functor[MappedDataset] {
/** Functorial `map` */
def map[A, B](da: MappedDataset[A])(f: A => B): MappedDataset[B] = da map f
}
Now you can wrap a dataset ds into a MappedDataset(ds), then map it using the implicit MappedDatasetFunctor as long as you want, and then call toDataset at the very end, where you can supply a concrete Encoder for the final result.
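A usage sketch with the dummy traits above (ds and the Encoder[String] instance are assumed; mapping here uses the wrapper's own map method):
val ds: Dataset[Int] = ???                        // some concrete dataset
implicit val stringEnc: Encoder[String] = new Encoder[String] {}
val result: Dataset[String] =
  MappedDataset(ds)      // wrap: no Encoder needed here
    .map(_ + 1)          // functions are only stashed away...
    .map(_.toString)
    .toDataset           // ...and the Encoder[String] is required only here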
Note that this will combine all functions inside map into a single spark stage: it won't be able to save the intermediate results, because the Encoders for all intermediate steps are missing.
I'm not quite there yet with studying cats, so I cannot guarantee that this is the most idiomatic solution. Probably there is something Coyoneda-esque already in the library.
EDIT: There is Coyoneda in the cats library, but it requires a natural transformation F ~> G to a functor G. Unfortunately, we don't have a Functor for Dataset (that was the problem in the first place). What my implementation above does is: instead of a Functor[G], it requires a single morphism of the (non-existent) natural transformation at a fixed X (this is what the Encoder[X] is).
I'm trying to implement something like a clever parameter-converter function in Scala.
Basically, in my program I need to read parameters from a properties file, so obviously they are all strings, and I would then like to convert each parameter to a specific type that I pass as a parameter.
This is the implementation that I start coding:
def getParam[T](key : String , value : String, paramClass : T): Any = {
value match {
paramClass match {
case i if i == Int => value.trim.toInt
case b if b == Boolean => value.trim.toBoolean
case _ => value.trim
}
}
/* Exception handling is missing at the moment */
}
Usage:
val convertedInt = getParam("some.int.property.key", "10", Int)
val convertedBoolean = getParam("some.boolean.property.key", "true", Boolean)
val plainString = getParam("some.string.property.key", "value",String)
Points to note:
For my program I currently need just three main types: String, Int and Boolean;
if possible I would like to extend this to more types.
This is not clever, because I have to spell out the matching against every possible target type; I would like a more reflection-like approach.
This code doesn't work; it gives me the compile error "object java.lang.String is not a value" when I try to convert (actually no conversion happens, because the property values come in as Strings).
Can anyone help me? I'm quite a newbie in Scala and maybe I'm missing something.
The Scala approach to the problem you are trying to solve is context bounds. Given a type T, you can require an object like ParamMeta[T], which will do all the conversions for you. So you can rewrite your code to something like this:
trait ParamMeta[T] {
def apply(v: String): T
}
def getParam[T](key: String, value: String)(implicit meta: ParamMeta[T]): T =
meta(value.trim)
implicit case object IntMeta extends ParamMeta[Int] {
def apply(v: String): Int = v.toInt
}
// and so on
getParam[Int](/* ... */, "127") // = 127
There is no need to throw exceptions at all! If you supply an unsupported type as getParam's type argument, the code will not even compile. You can also rewrite the signature of getParam using the syntax sugar for context bounds, T: Bound, which requires an implicit value Bound[T]; you will then need to use implicitly[Bound[T]] to access that value (because there is no parameter name for it).
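For reference, the context-bound form would look like this (same ParamMeta as above):
// T: ParamMeta is sugar for an extra implicit parameter list (implicit ev: ParamMeta[T]);
// since that parameter has no name, it is summoned with implicitly
def getParam[T: ParamMeta](key: String, value: String): T =
  implicitly[ParamMeta[T]].apply(value.trim)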
Also, this code does not use reflection at all, because the compiler searches for an implicit value of type ParamMeta[Int], finds it in object IntMeta, and rewrites the function call as getParam[Int](..., "127")(IntMeta), so it gets all the required values at compile time.
If you feel that writing those case objects is too much boilerplate, and you are sure that you will not need another method in these objects in the future (for example, to convert T back to String), you can simplify the declarations like this:
case class ParamMeta[T](f: String => T) {
def apply(s: String): T = f(s)
}
implicit val stringMeta = ParamMeta(identity)
implicit val intMeta = ParamMeta(_.toInt)
To avoid importing them every time you use getParam, you can declare these implicits in the companion object of the ParamMeta trait/case class, and Scala will pick them up automatically.
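A sketch of that companion-object placement (the case class is repeated for completeness, a Boolean instance is added by analogy with the question, and the explicit result types are just for clarity):
case class ParamMeta[T](f: String => T) {
  def apply(s: String): T = f(s)
}
object ParamMeta {
  // found automatically during implicit search, no import needed at call sites
  implicit val stringMeta: ParamMeta[String] = ParamMeta(identity)
  implicit val intMeta: ParamMeta[Int] = ParamMeta(_.trim.toInt)
  implicit val booleanMeta: ParamMeta[Boolean] = ParamMeta(_.trim.toBoolean)
}
def getParam[T](key: String, value: String)(implicit meta: ParamMeta[T]): T =
  meta(value.trim)
getParam[Boolean]("some.boolean.property.key", "true") // = true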
As for the original match approach, you can pass an implicit ClassTag[T] to your function, so you will be able to match on classes. You do not need to create any ClassTag values yourself, as the compiler will pass them automatically. Here is a simple example of how to do class matching:
import scala.reflect.ClassTag
import scala.reflect._
def test[T: ClassTag] = classTag[T].runtimeClass match {
case x if x == classOf[Int] => "I'm an int!"
case x if x == classOf[String] => "I'm a string!"
}
println(test[Int])
println(test[String])
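Applied back to the original getParam, the ClassTag version could look like this sketch (note the Any result type, which is one reason the ParamMeta approach is nicer):
import scala.reflect.{ClassTag, classTag}
def getParam[T: ClassTag](key: String, value: String): Any =
  classTag[T].runtimeClass match {
    case c if c == classOf[Int]     => value.trim.toInt
    case c if c == classOf[Boolean] => value.trim.toBoolean
    case _                          => value.trim
  }
getParam[Int]("some.int.property.key", "10")           // 10
getParam[Boolean]("some.boolean.property.key", "true") // true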
However, this approach is less flexible than the ParamMeta one, and ParamMeta should be preferred.