Preserve method parameter names in scala macro - scala

I have an interface:
trait MyInterface {
  def doSomething(usefulName: Int): Unit
}
I have a macro that iterates over the methods of the interface and does stuff with the method names and parameters. I access the method names by doing something like this:
val tpe = typeOf[MyInterface]
// Get lists of parameter names for each method
val listOfParamLists = tpe.decls
  .filter(_.isMethod)
  .map(_.asMethod.paramLists.head.map(sym => sym.asTerm.name))
If I print out the names for doSomething's parameters, usefulName has become x$1. Why is this happening and is there a way to preserve the original parameter names?
I am using scala version 2.11.8, macros paradise version 2.1.0, and the blackbox context.
The interface is actually Java source in a separate sbt project that I control. I have tried compiling with:
javacOptions in (Compile, compile) ++= Seq("-target", "1.8", "-source", "1.8", "-parameters")
The -parameters flag is supposed to preserve the names, but I still get the same result as before.

This has nothing to do with macros and everything to do with Scala's runtime reflection system. In a nutshell, Java 8 and Scala 2.11 both wanted to be able to look up parameter names, and each implemented its own reflection support to do it.
This works just fine if everything is Scala and you compile it together (duh!). Problems arise when you have a Java class that has to be compiled separately.
Observations and Problem
The first thing to notice is that the -parameters flag only exists since Java 8, which is about as old as Scala 2.11 itself. So Scala 2.11 is probably not using this feature to look up parameter names... Consider the following:
MyInterface.java compiled with javac -parameters MyInterface.java
public interface MyInterface {
    public int doSomething(int bar);
}
MyTrait.scala compiled with scalac MyTrait.scala
trait MyTrait {
  def doSomething(bar: Int): Int
}
Then, we can use MethodParameterSpy to inspect the parameter name information that the Java 8 -parameters flag is supposed to give us. Running it on the Java compiled interface, we get (and here I abbreviated some of the output)
public abstract int MyInterface.doSomething(int)
Parameter name: bar
but in the Scala compiled class, we only get
public abstract int MyTrait.doSomething(int)
Parameter name: arg0
Yet, Scala has no problem looking up its own parameter names. That tells us that Scala is actually not relying on this Java 8 feature at all - it constructs its own runtime system for keeping track of parameter names. Then, it comes as no surprise that this doesn't work for classes from Java sources. It generates the names x$1, x$2, ... as placeholders, the same way that Java 8 reflection generates the names arg0, arg1, ... as placeholders when we inspected a compiled Scala trait. (And if we had not passed -parameters, it would have generated those names even for MyInterface.java.)
Solution
The best solution (that works in 2.11) I can come up with to get the parameter names of a Java class is to use Java reflection from Scala. Something like
$ javac -parameters MyInterface.java
$ jar -cf MyInterface.jar MyInterface.class
$ scala -cp MyInterface.jar
scala> :pa
// Entering paste mode (ctrl-D to finish)
import java.lang.reflect._
Class.forName("MyInterface")
.getDeclaredMethods
.map(_.getParameters.map(_.getName))
// Exiting paste mode, now interpreting.
res: Array[Array[String]] = Array(Array(bar))
Of course, this will only work if you compiled with the -parameters flag (otherwise you get arg0).
I should probably also mention that if you don't know whether your method was compiled from Java or from Scala, you can always check .isJava (for example: typeOf[MyInterface].decls.filter(_.isMethod).head.isJava) and then branch on that to either your initial solution or what I propose above.
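For concreteness, here is a rough sketch of that branching. It is written against the runtime universe (inside a macro you would use c.universe instead), the helper name paramNamesOf is mine, and the Java branch assumes the interface's class file is on the classpath and was compiled with -parameters:

import scala.reflect.runtime.universe._

// Hypothetical helper: choose a lookup strategy per method symbol.
def paramNamesOf(owner: Type, m: MethodSymbol): List[String] =
  if (m.isJava) {
    // Java-compiled: fall back to Java reflection (needs javac -parameters for real names).
    Class.forName(owner.typeSymbol.fullName)
      .getDeclaredMethods
      .find(_.getName == m.name.toString)
      .map(_.getParameters.toList.map(_.getName))
      .getOrElse(Nil)
  } else {
    // Scala-compiled: the Scala symbol table already carries the real names.
    m.paramLists.headOption.getOrElse(Nil).map(_.name.toString)
  }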
Future
Mercifully, this is all a thing of the past in Scala 2.12. If I am reading this ticket correctly, in 2.12 your code will work for Java classes compiled with -parameters, and my Java reflection hack will also work for Scala classes.
All's well that ends well?

Related

Generate a case class with Binding.scala Vars using Scala.meta throws an exception

I have a Scala.js project, where I see strange behavior with Scala Meta and Binding.scala.
I want to create a case class from a case class:
case class SimpleCaseClass(i: Int, s: String, list: Seq[String])
should generate to:
SimpleCaseClassFormData(Var[Int], Var[String], Vars[String])
As soon as I have a Vars, I get the following error:
A method defined in a JavaScript raw type of a Scala.js library has been called. This is most likely because you tried to run Scala.js binaries on the JVM. Make sure you are using the JVM version of the libraries.
java.lang.Error: A method defined in a JavaScript raw type of a Scala.js library has been called. This is most likely because you tried to run Scala.js binaries on the JVM. Make sure you are using the JVM version of the libraries.
at scala.scalajs.js.package$.native(package.scala:134)
at scala.scalajs.js.Array.push(Array.scala:106)
at scala.scalajs.js.JSConverters$JSRichGenTraversableOnce$.$anonfun$toJSArray$1(JSConverters.scala:60)
Without a Vars (for example SimpleCaseClassFormData(Var[Int], Var[String])) it works.
Here you find the whole project: scala-adapters-form
Macro annotations in Scala Meta are deprecated (see https://github.com/scalameta/scalameta/issues/1182).
You can create an sbt plugin based on Scala Meta instead. See https://github.com/ThoughtWorksInc/sbt-example/ as an example to implement such a plugin.
To avoid incompatible versions, you should use sbt's built-in Scala Meta, which is version 1.7.0.

How to apply class exclusions to Scalac warnings options?

We have problems when using Scalac -Xfatal-warnings in the following cases:
Implicit vals used by macros internally
Internal vals that macros auto-generate
In both cases, scalac fails to compile because it reports that some values are unused, while we know they are used (when we remove them, the code no longer compiles).
Although the two might be symptoms of the same underlying problem in scalac, they boil down to the same issue for us: we need to disable -Ywarn-unused in Scala 2.11.12.
Is there a way to exclude specific class files so they won't be affected by the compiler flag?
As far as I know there is no way of disabling a scalac flag for just one file (if you compile your whole project at once with e.g. sbt). You can, however, extract the affected classes into a separate module with different compiler flags.
For implicit vals used internally by macros, I personally use the -Ywarn-macros:after flag, which makes implicits that are only used inside macro expansions count as used (this is with Scala 2.12.4).
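To illustrate the first suggestion, here is a rough build.sbt sketch (module names are made up) that keeps -Xfatal-warnings everywhere but only enables the unused-value warnings in the module that can tolerate them:

// Hypothetical layout: the macro-heavy code lives in its own module with relaxed flags.
lazy val core = (project in file("core"))
  .settings(
    scalacOptions ++= Seq("-Xfatal-warnings", "-Ywarn-unused")
  )

lazy val macros = (project in file("macros"))
  .settings(
    scalacOptions += "-Xfatal-warnings"
    // -Ywarn-unused deliberately omitted here;
    // on 2.12+ you could instead add "-Ywarn-macros:after"
  )

lazy val root = (project in file(".")).aggregate(core, macros)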

Does Scala have a global object or class?

I know programmers are supposed to wrap their code in an application object:
object Hello extends App {
  println("Hello, World")
}
It is required in Eclipse if I ever want to get any output. However, when I tried to write some code (very casually) in Emacs, I wrote it like this:
class Pair[+T](val first: T, val second: T)

trait Friend[-T] {
  def befriend(someone: T)
}

def makeFriendWith(s: Student, f: Friend[Student]) {
  f.befriend(s)
}
It seems like there is no universal object or class that wraps the function makeFriendWith. Is Scala like JavaScript, where everything is attached to a global object? If not, what is this function attached to?
Also, why does this work in the console (I compiled it with the scala command and it worked) but not in Eclipse? What's the use of the application object?
Scala doesn't have top-level defs, but your script can be run by either the REPL or the scala script runner.
The precise behavior of your script depends on which way you run it.
The REPL can run scripts line-by-line or whole hog. (Compare :paste and :paste -raw versus :load or -i init.script and the future option -I init.script.)
There is an issue about sensitive scripting. The script runner should realize if you're trying to run an App.
There is another effort to make scripting a compiler phase that is easily customized. Scroll to Scripter.scala for code comments about its current heuristics.
In short, your defs must be wrapped in a top-level entity, but exactly how that happens is context-dependent.
There was a recent effort to make an alternative baked-in wrapping scheme available for the REPL.
None of this is mandated by the language spec, any more than special rules pertaining to sbt build files are defined by the language.
You can define methods like this only in the console, which (behind the scenes) automatically wraps them in an anonymous class for you.
Outside of the console, there's no such luxury.
As a JVM language, Scala cannot truly create any top-level entities other than classes and interfaces.
It does, however, have the notion of a "package object", which creates the illusion of value entities (val, var and def) not enclosed in a class or trait.
See http://www.scala-lang.org/docu/files/packageobjects/packageobjects.html for information on package objects.
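As a small illustration (the package name mylib is arbitrary), a package object lets defs and vals appear top-level to everything in that package, even though the compiler still stores them in a synthetic class:

// Conventionally placed in mylib/package.scala
package object mylib {
  val greeting: String = "Hello, World"
  def makePair[T](first: T, second: T): (T, T) = (first, second)
}

// Elsewhere in package mylib these can be used unqualified:
// println(greeting); makePair(1, 2)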
You can run code like this directly in Eclipse: use a Scala worksheet. The IntelliJ IDEA Scala plugin supports worksheets as well.

Why does Array.fill take an implicit scala.reflect.ClassManifest?

So I'm playing with writing a battlecode player in Scala. In battlecode certain classes are disallowed and there is a runtime exception if you ever try to access them. When I use the Array.fill function I get a message from the battlecode server saying [java] Illegal class: scala/reflect/Manifest$. This is the offending line:
val g_score = Array.fill[Int](rc.getMapWidth(), rc.getMapHeight())(0)
The method takes an implicit ClassManifest argument which has the following documentation:
A ClassManifest[T] is an opaque descriptor for type T. It is used by the compiler to preserve information necessary for instantiating Arrays in those cases where the element type is unknown at compile time.
But I do know the type of the array elements at compile time; as shown above, I explicitly state that they will be Int. Is there a way to avoid this? As a workaround I've written my own version of Array.fill, but this seems like a hack. As an aside, does Scala have real 2D arrays? Array.fill seems to return an Array[Array[T]], which is the only way I found to write my own. This also seems inelegant.
Edit: Using Scala 2.9.1
For background information, see this related question: What is a Manifest in Scala and when do you need it?. In this answer, you will find an explanation why manifests are needed for arrays.
In short: Although the JVM uses type erasure, arrays are an exception and need a manifest. Since you could compile your code, that manifest was found (manifests are always available for proper types). Your error occurs at runtime.
I don't know the details of the battlecode server, but there are two possibilities: Either you are running your compiled classes with a binary incompatible version of Scala (difference in major version, e.g. compiled with Scala 2.9 and server uses 2.10). Or the server doesn't even have the scala-library.jar on its class path.
As said in the comment, manifests are deprecated in Scala 2.10 and replaced by ClassTag.
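For reference, assuming Scala 2.10 or later, the implicit that Array.fill needs is a ClassTag, which you can thread through generic code with a context bound; for a concrete element type like Int the compiler supplies it automatically:

import scala.reflect.ClassTag

// Generic wrapper: the ClassTag context bound is what lets Array.fill build an Array[T].
def grid[T: ClassTag](width: Int, height: Int)(elem: T): Array[Array[T]] =
  Array.fill(width, height)(elem)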
EDIT: So it seems the class loader is artificially restricting the allowed classes. My suggestion would be: Add a helper Java class. You can easily mix Java and Scala code. If it's just about the Int-Array instantiation, you could provide something like:
public class Helper {
    public static int[][] makeArray(int d1, int d2) { return new int[d1][d2]; }
}
(hope that's valid java code, a bit rusty)
Also, have you tried to create the outer array with new Array[Array[Int]](d1), and then iterate to create the inner arrays?
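A sketch of that last suggestion: for a statically known element type, new Array[...] compiles to a plain JVM array allocation, so no manifest (and hence no scala.reflect.Manifest$) is touched at runtime:

// Build the 2D structure by hand; Int arrays are zero-initialised, matching Array.fill(...)(0).
def make2D(width: Int, height: Int): Array[Array[Int]] = {
  val outer = new Array[Array[Int]](width)
  var i = 0
  while (i < width) {
    outer(i) = new Array[Int](height)
    i += 1
  }
  outer
}

// Usage in the question's context (rc comes from the battlecode API):
// val g_score = make2D(rc.getMapWidth(), rc.getMapHeight())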

Embedded Scala REPL inherits parent classpath

As asked in this thread on the Scala mailing list, how can I create an embedded Scala REPL that inherits the classpath of the parent program? Suppose the parent Scala program is launched using scala -cp <classpath> ...; can <classpath> be accessed as a string and used to initialize the embedded REPL? (The Java classpath, available via System.getProperty("java.class.path"), appears to differ from the Scala classpath.)
Alternatively, perhaps the embedded Scala REPL can inherit or construct its ClassLoader from the parent process (Michael Dürig's ScalaDays 2010 talk might be relevant). Is this the recommended approach?
I'm trying to do the same thing, and I just found a way out by Googling:
lazy val urls = java.lang.Thread.currentThread.getContextClassLoader match {
  case cl: java.net.URLClassLoader => cl.getURLs.toList
  case _ => sys.error("classloader is not a URLClassLoader")
}
lazy val classpath = urls map { _.toString }
The above code gets you the classpath of the current context classloader.
settings.classpath.value = classpath.distinct.mkString(java.io.File.pathSeparator)
Put that into your settings.classpath and you should be able to fire up dispatch or whatever library you need.
Set the usejavacp property to true:
val settings = new scala.tools.nsc.Settings
settings.usejavacp.value = true
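A minimal sketch of wiring either approach into an embedded REPL, using the 2.11-era compiler API (the object name is mine, and scala-compiler must be on the classpath):

import scala.tools.nsc.Settings
import scala.tools.nsc.interpreter.ILoop

object EmbeddedRepl {
  def main(args: Array[String]): Unit = {
    val settings = new Settings
    settings.usejavacp.value = true      // simplest: reuse the Java classpath
    // settings.classpath.value = ...    // or supply an explicit classpath string instead
    new ILoop().process(settings)        // starts an interactive session on stdin/stdout
  }
}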
There does not seem to be an easy way to access the "Scala classpath" from within a running Scala program (in contrast, the "Java classpath" is available through the java.class.path system property). One would like to access, e.g., the field Calculated.userClasspath in the instance of scala.tools.PathResolver, but the latter does not seem accessible. Perhaps the easiest work-around is to modify the scala launch script to store the -classpath parameter string in an environment variable.
Assuming the desired Scala classpath can be determined, it can be passed to the embedded Scala interpreter via:
settings.classpath.value = ...
Update: although the Scala classpath string may not be directly attainable from the Scala runtime, @Eugene points out that it can be extracted from the context classloader. Thanks.