Is it possible to get the name of a scala variable at runtime?
E.g. is it possible to write a function getIntVarName(variable: Int): String behaving as follows?
val myInt = 3
assert("myInt" === getIntVarName(myInt))
For what you need to do, it seems to me that runtime reflection is not required, since you already have your myInt variable defined at compile time. If that is the case, you just need a bit of AST manipulation via a macro.
Try
package com.natalinobusa.macros

import scala.language.experimental.macros
import scala.reflect.macros.blackbox.Context

object Macros {
  def getName(x: Any): String = macro impl

  def impl(c: Context)(x: c.Tree): c.Tree = {
    import c.universe._
    val p = x match {
      case Select(_, TermName(s)) => s  // e.g. a val wrapped in an enclosing object (as in the REPL)
      case Ident(TermName(s))     => s  // a plain local val
      case _                      => ""
    }
    q"$p"
  }
}
Be aware that macros must be compiled in a separate subproject, and cannot be part of the same project where the macro expansion is applied. Check this template on how to define such a macro subproject: https://github.com/echojc/scala-macro-template
scala> import Macros._
import Macros._
scala> val myInt = 3
myInt: Int = 3
scala> "myInt" == getName(myInt)
res6: Boolean = true
You can use scala-nameof to get a variable name, function name, class member name, or type name. It happens at compile-time so there's no reflection involved and no runtime dependency needed.
val myInt = 3
assert("myInt" === nameOf(myInt))
will compile to:
val myInt = 3
assert("myInt" === "myInt")
Basically, it can't be done.
The JVM offers nothing in the way of a method handle for this (remember, Scala properties are encoded as methods in bytecode to support the uniform access principle). The closest you can get is to use reflection to find the list of methods defined on a particular class - which I appreciate doesn't help with your particular need.
It is possible to implement this as a Scala feature, but it would require a compiler plugin to grab the relevant symbol name from the AST and push it into code as a string literal, so not something I could demonstrate in a short code snippet :)
The other naming problem that often comes up in reflection is method parameters. That one at least I can help with. I have a work-in-progress reflection library here that's based on the compiler-generated scala signature as used by scalap. It's nowhere near being ready for serious use, but it is under active development.
Scala doesn't yet have much more than Java in terms of metadata like this. Keep an eye on the Scala Reflection project, but I doubt that will offer access to local variables anytime soon. In the meantime, consider a bytecode inspector library like ASM. Another big caveat: local variable names are lost during compilation, so you'd need to compile in "debug" mode to preserve them.
I don't think it's possible to get the name of a variable, but you can try it with objects:
object Test1 {
  def main(args: Array[String]): Unit = {
    object MyVar {
      def value = 1
    }
    println(MyVar.getClass)
  }
}
This prints: class Test1$MyVar$2$. So you can get 'MyVar' out of it.
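Building on that, the runtime class name can be stripped back to the object's name. Here is a rough sketch (the object and method names are illustrative; the `$`-separated segments and numeric suffixes are JVM encoding details, so this parsing is best-effort):

```scala
object NameDemo {
  object MyVar {
    def value = 1
  }

  // getClass.getName yields something like "NameDemo$MyVar$" (with an
  // extra numeric segment for objects nested inside methods); split on
  // '$' and drop purely numeric segments to recover the object's name.
  def objectName(o: AnyRef): String =
    o.getClass.getName.split('$').filterNot(_.forall(_.isDigit)).last
}
```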
This can be achieved with Scala 3 macros (it happens at compile time).
Create a Macro object (this must be in a separate file):
import scala.quoted.{Expr, Quotes}

object NameFromVariable:
  def inspectCode(x: Expr[Any])(using Quotes): Expr[String] =
    val name = x.show.split("""\.""").last
    Expr(name)
Then you need an inline method in your class.
inline def getIntVarName(inline x: Any): String = ${ NameFromVariable.inspectCode('x) }
And use this method, like:
val myInt = 3
assert("myInt" === getIntVarName(myInt))
See the official documentation: https://docs.scala-lang.org/scala3/guides/macros/macros.html
Related
There is a library X I'm working on, which depends on another library Y. To support multiple versions of Y, X publishes multiple artifacts named X_Y1.0, X_Y1.1, etc. This is done using multiple subprojects in SBT with version-specific source directories like src/main/scala-Y1.0 and src/main/scala-Y1.1.
So far, it has worked well. One minor problem is that version-specific source directories are sometimes too coarse-grained: they can require a lot of code duplication because it's syntactically impossible to extract just the tiny differences into separate files, and doing so sometimes introduces performance overhead or makes the code unreadable.
Trying to solve the issue, I've added macro annotations to selectively delete a part of the code. It works like this:
class MyClass {
  @UntilB1_0
  def f: Int = 1

  @SinceB1_1
  def f: Int = 2
}
However, it seems it only works for methods. When I try to use the macro on fields, compilation fails with an error saying "f is already defined as value f". Also, it doesn't work for classes and objects.
My suspicion is that macros are applied during compilation before resolving method overloads, but after basic checks like checking duplicate names.
Is there a way to make the macros work for fields, classes, and objects too?
Here's an example macro to demonstrate the issue.
import scala.annotation.{compileTimeOnly, StaticAnnotation}
import scala.language.experimental.macros
import scala.reflect.macros.blackbox

@compileTimeOnly("enable macro paradise to expand macro annotations")
class Delete extends StaticAnnotation {
  def macroTransform(annottees: Any*): Any = macro DeleteMacro.impl
}

object DeleteMacro {
  def impl(c: blackbox.Context)(annottees: c.Expr[Any]*): c.Expr[Any] = {
    import c.universe._
    c.Expr[Nothing](EmptyTree)
  }
}
When the annotation @Delete is used on methods, it works.
class MyClass {
  @Delete
  def f: Int = 1

  def f: Int = 2
}
// new MyClass().f == 2
However, it doesn't work for fields.
class MyClass {
  @Delete
  val f: Int = 1

  val f: Int = 2
}
// error: f is already defined as value f
First of all, good idea :)
It is a strange (and quite uncontrollable) behaviour, and I think that what you want to do is difficult to achieve with macros.
To understand why your expansion doesn't work, I tried printing the trees after each scalac phase.
Your expansion itself does work. Given this code:
class Foo {
  @Delete
  lazy val x: Int = 12
  val x: Int = 10

  @Delete
  def a: Int = 10
  def a: Int = 12
}
the code printed after typer is:
package it.unibo {
class Foo extends scala.AnyRef {
def <init>(): it.unibo.Foo = {
Foo.super.<init>();
()
};
<empty>; //val removed
private[this] val x: Int = 10;
<stable> <accessor> def x: Int = Foo.this.x;
<empty>; //def removed
def a: Int = 12
};
...
}
But, unfortunately, the error is thrown anyway. Let me explain why this happens.
In scalac, macros are expanded (at least in Scala 2.13) during the typer phase, so after the parser, namer, and packageobjects phases.
Here, different things happen, such as (as said here):
infers types,
checks whether types match,
searches for implicit arguments and adds them to trees,
does implicit conversions,
checks whether all type operations are allowed (for example type cannot be a subtype of itself),
resolves overloading,
type-checks parent references,
checks type violations,
searches for implicits,
expands macros,
and creates additional methods for case classes (like apply or copy).
The essential problem here is that we cannot change this order: duplicate val definitions are rejected before macro expansion happens, while overload resolution happens after it. For this reason @Delete works with methods but doesn't work with vals.
To solve your problem, I think it is necessary to use a compiler plugin: there you can add a phase before the namer, so no error will be thrown. Building a compiler plugin is more difficult than writing macros, but I think it is the best option for your case.
We have been banging our heads for a while on this but we cannot find a solution.
In our project we would like to write some DSL to migrate some old code in our codebase.
We would like to make a macro that given an instance of a case class gives us the possibility to extract the value in a typesafe manner. In this case it should be possible to declare x of type Int.
case class MyPersonalCaseClass(token: Int, str: String)
val someVariable = MyPersonalCaseClass(123, "SOMESTRING")
val x = Macros.->(someVariable, "token")
Here “token” is a compile-time constant, referring to the field name.
The macro can be declared with something like
def ->[T](value:T,key: String): Any = macro MacrosImpl.arrow[T]
As far as we understand, the only way is with whitebox macros; feel free to change the signatures.
def arrow[T: c.WeakTypeTag](c: whitebox.Context)(value: c.Expr[T], key: c.Expr[String]): c.Expr[Any] = {
  import c.universe._
  val caseClassType: c.universe.Type = weakTypeOf[T]
  ???
}
Scala version is “2.12.8”.
The reason we need something like this is that we are porting a lot of code from Perl, and we would like to give the programmers a vague sense that they are still writing it.
thanks in advance!
Try
import shapeless.LabelledGeneric
import shapeless.record._
LabelledGeneric[MyPersonalCaseClass].to(someVariable).get(Symbol("token")) // 123
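If pulling in shapeless is not an option, a plain-Scala (2.13+) fallback is the Product API. Note this is a sketch with illustrative names (FieldLookup, fieldByName), and the result is typed Any rather than the field's precise type, so it gives up the type safety the question asks for:

```scala
object FieldLookup {
  case class MyPersonalCaseClass(token: Int, str: String)

  // Runtime lookup via the Product API: zip the field names with the
  // field values and pick the matching pair. The result is typed Any,
  // not the field's precise type.
  def fieldByName(p: Product, key: String): Any =
    p.productElementNames
      .zip(p.productIterator)
      .collectFirst { case (name, value) if name == key => value }
      .getOrElse(throw new NoSuchElementException(key))
}
```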
I'm developing a library that depends on another. The dependency has a package object that I'd like to alias into my own package domain, to 'hide' the underlying library from the users of the one I'm developing, for potential later reimplementation of that library. I've tried a couple things, including
object functions {
  def identity(a: Any): Any = a
  def toUpper(s: String): String = s.toUpperCase
}

object renamedfunctions {
  import functions._
}
This compiles but import renamedfunctions._ brings nothing into scope. I've also tried extending the backing object, but scala objects are un-extendable. Does anyone know of a way to accomplish what I'm trying to do without forking the underlying library?
It is not possible to do this with Scala packages, in general. Usually, you would only alias a package locally within a file:
import scala.{ math => physics }
scala> physics.min(1, 2)
res6: Int = 1
But this doesn't do what you ask. Packages themselves aren't values or types, so you cannot assign them as such. These will fail:
type physics = scala.math
val physics = scala.math
With a package object, you can grab hold of its concrete members, but not the classes within. For example:
scala> val physics = scala.math.`package`
physics: math.type = scala.math.package$#42fcc7e6
scala> physics.min(1, 2)
res0: Int = 1
But using objects or types that belong to the traditional package won't work:
scala> scala.math.BigDecimal(1)
res1: scala.math.BigDecimal = 1
scala> physics.BigDecimal(1)
<console>:13: error: value BigDecimal is not a member of object scala.math.package
physics.BigDecimal(1)
^
Ok, so what should you do?
The reason you're even considering this is that you want to hide the implementation of which library you're using so that it can easily be replaced later. If that's the case, what you should do is hide the library behind another interface or object (a facade). That doesn't mean you need to forward every single method and value contained within the library, only the ones you're actually using. This way, when it comes time to migrate to another library, you only need to change one class, because the rest of the code references only the facade.
For example, if we wanted to use min and max from scala.math, but later wanted to replace it with another library that provided a more efficient solution (if such a thing exists), we could create a facade like this:
object Math {
def min(x: Int, y: Int): Int = scala.math.min(x, y)
def max(x: Int, y: Int): Int = scala.math.max(x, y)
}
All other classes would use Math.min and Math.max, so that when scala.math was replaced, they could remain the same. You could also make Math a trait (sans implementations) and provide the implementations in a sub-class or object (say ScalaMath), so that classes could inject different implementations.
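The trait variant mentioned above might look like this (the names ScalaMath and Calculator are illustrative, not from the original post):

```scala
// A facade trait with the implementation injected; swapping math
// libraries later only means providing another MathFacade instance.
trait MathFacade {
  def min(x: Int, y: Int): Int
  def max(x: Int, y: Int): Int
}

object ScalaMath extends MathFacade {
  def min(x: Int, y: Int): Int = scala.math.min(x, y)
  def max(x: Int, y: Int): Int = scala.math.max(x, y)
}

// Client code depends only on the facade:
class Calculator(m: MathFacade) {
  def clamp(v: Int, lo: Int, hi: Int): Int = m.min(m.max(v, lo), hi)
}
```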
Unfortunately, the commented-out code crashes the compiler:
package object p { def f = 42 }

package q {
  object `package` { def f = p.f }
}

/*
package object q {
  val `package` = p.`package`
}
*/

package client {
  import q.`package`._

  object Test extends App {
    println(f)
  }
}
That would make clients not break when you migrated to implementations in a package object.
Simply:
val renamedfunctions = functions
import renamedfunctions._
You can see it being done in the scala library itself: https://github.com/scala/scala/blob/2.12.x/src/library/scala/Predef.scala#L150
val Map = immutable.Map
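A self-contained sketch of this val-alias trick (the object names mylib and demo are illustrative):

```scala
object functions {
  def identity(a: Any): Any = a
  def toUpper(s: String): String = s.toUpperCase
}

object mylib {
  // A stable value alias: members resolve through the val, and
  // `import mylib.renamedfunctions._` brings them into scope.
  val renamedfunctions = functions
}

object demo {
  import mylib.renamedfunctions._
  val shouted: String = toUpper("it works")
}
```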
New to Scala and looking for pointers to an idiomatic solution, if there is one.
I'd like to have arbitrary user-supplied Scala functions (which are allowed to reference functions/classes I have defined in my code) applied to some data.
For example: I have foo(s: String): String and bar(s: String): String functions defined in my myprog.scala. The user runs my program like this:
$ scala myprog data.txt --func='(s: String) => foo(bar(s)).reverse'
This would run line by line through the data file and emit the result of applying the user-specified function to that line.
For extra points, can I ensure that there are no side-effects in the user-defined function? If not, can I restrict the function to use only a restricted subset of functions (which I can assure to be safe)?
@kenjiyoshida has a nice gist that shows how to eval Scala code. Note that when using Eval from that gist, not specifying a return type will result in a runtime failure when Scala defaults to inferring Nothing.
scala> Eval("println(\"Hello\")")
Hello
java.lang.ClassCastException: scala.runtime.BoxedUnit cannot be cast to scala.runtime.Nothing$
... 42 elided
vs
scala> Eval[Unit]("println(\"Hello\")")
Hello
It nicely handles whatever's in scope as well.
import java.io.File
import scala.reflect.runtime.currentMirror
import scala.tools.reflect.ToolBox

object Thing {
  val thing: Int = 5
}

object Eval {
  def apply[A](string: String): A = {
    val toolbox = currentMirror.mkToolBox()
    val tree = toolbox.parse(string)
    toolbox.eval(tree).asInstanceOf[A]
  }

  def fromFile[A](file: File): A =
    apply(scala.io.Source.fromFile(file).mkString(""))

  def fromFileName[A](file: String): A =
    fromFile(new File(file))
}

object Thing2 {
  val thing2 = Eval[Int]("Thing.thing") // 5
}
object Thing2 {
val thing2 = Eval[Int]("Thing.thing") // 5
}
Twitter's util package used to have util-eval, but that seems to have been deprecated now (and also triggers a compiler bug when compiled).
As for the second part of your question, the answer seems to be no. Even if you disable default Predef and imports yourself, a user can always get to those functions with the fully qualified package name. You could perhaps use Scala's scala.tools.reflect.ToolBox to first parse your string and then compare against a whitelist, before passing to eval, but at that point things could get pretty hairy since you'll be manually writing code to sanitize the Scala AST (or at the very least reject dangerous input). It definitely doesn't seem to be an "idiomatic solution."
This should be possible by using the standard Java JSR 223 Scripting Engine
see https://issues.scala-lang.org/browse/SI-874
(also mentions using scala.tools.nsc.Interpreter but not sure this is still available)
import javax.script.*;

ScriptEngine e = new ScriptEngineManager().getEngineByName("scala");
e.getContext().setAttribute("label", new Integer(4), ScriptContext.ENGINE_SCOPE);
try {
    e.eval("println(2 + label)");
} catch (ScriptException ex) {
    ex.printStackTrace();
}
I'm trying to generalize setting up Squeryl (Slick poses the same problems AFAIK). I want to avoid having to name every case class explicitly for a number of general methods.
table[Person]
table[Bookmark]
etc.
This also goes for generating indexes, and creating wrapper methods around the CRUD methods for every case class.
So ideally what I want to do is have a list of classes and make them into tables, add indexes and add a wrapper method:
val listOfClasses = List(classOf[Person], classOf[Bookmark])

listOfClasses.foreach(clazz => {
  val tbl = table[clazz]
  tbl.id is indexed
  // etc.
})
I thought Scala Macros would be the thing to apply here, since I don't think you can have values as type parameters. Also I need to generate methods for every type of the form:
def insert(model: Person): Person = persons.insert(model)
I've got my mitts on an example of macros, but I don't know how to generate a generic data structure.
I got this simple example to illustrate what I want:
import scala.language.experimental.macros
import scala.reflect.macros.Context

def makeList_impl(c: Context)(clazz: c.Expr[Class[_]]): c.Expr[Unit] = {
  import c.universe._
  reify {
    println(List[clazz.splice]()) // ERROR: type splice is not a member of c.Expr[Class[_]]
  }
}

def makeList(clazz: Class[_]): Unit = macro makeList_impl
How do I do this? Or is Scala Macros the wrong tool?
Unfortunately, reify is not flexible enough for your use case, but there's good news. In macro paradise (and most likely in 2.11.0) we have a better tool to construct trees, called quasiquotes: http://docs.scala-lang.org/overviews/macros/quasiquotes.html.
scala> def makeList_impl(c: Context)(clazz: c.Expr[Class[_]]): c.Expr[Any] = {
| import c.universe._
| val ConstantType(Constant(tpe: Type)) = clazz.tree.tpe
| c.Expr[Any](q"List[$tpe]()")
| }
makeList_impl: (c: scala.reflect.macros.Context)(clazz: c.Expr[Class[_]])c.Expr[Any]
scala> def makeList(clazz: Class[_]): Any = macro makeList_impl
defined term macro makeList: (clazz: Class[_])Any
scala> makeList(classOf[Int])
res2: List[Int] = List()
scala> makeList(classOf[String])
res3: List[String] = List()
Quasiquotes are even available in 2.10.x with a minor tweak to the build process (http://docs.scala-lang.org/overviews/macros/paradise.html#macro_paradise_for_210x), so you might want to give them a try.
This will probably not fill all your needs here, but it may help a bit:
The signature of table method looks like this:
protected def table[T]()(implicit manifestT: Manifest[T]): Table[T]
As you can see, it takes implicit Manifest object. That object is passed automatically by the compiler and contains information about type T. This is actually what Squeryl uses to inspect database entity type.
You can just pass these manifests explicitly like this:
val listOfManifests = List(manifest[Person], manifest[Bookmark])

listOfManifests.foreach(manifest => {
  val tbl = table()(manifest)
  tbl.id is indexed
  // etc.
})
Unfortunately tbl in this code will have type similar to Table[_ <: CommonSupertypeOfAllGivenEntities] which means that all operations on it must be agnostic of concrete type of database entity.
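As a side note, the same compiler-supplied evidence can be captured with ClassTag, Manifest's non-deprecated successor. A minimal sketch (the names TypeEvidence and runtimeName are illustrative) of how the implicit carries the erased runtime class:

```scala
import scala.reflect.ClassTag

object TypeEvidence {
  // The compiler supplies the ClassTag implicitly, carrying the erased
  // runtime class of T, much as Squeryl's table[T] receives a Manifest.
  def runtimeName[T](implicit ct: ClassTag[T]): String =
    ct.runtimeClass.getSimpleName

  case class Person(name: String)
  case class Bookmark(url: String)
}
```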