Manual AST generation in Scala 2.10

In a response to:
Generating a class from string and instantiating it in Scala 2.10
@EugeneBurmako states:
"The example uses manual AST assembly, but it's possible to write a function that parses a string, finds out unbound identifiers, looks up values for them in some mapping and then creates corresponding free terms. There's no such function in Scala 2.10.0 though."
What API calls might help one find unbound identifiers?
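One rough sketch of the idea in that quote, against the 2.10 reflection API: parse the string with a toolbox (without typechecking), then collect the term identifiers that the tree does not bind itself. This is only an illustration, not a function the standard library provides, and it needs scala-compiler.jar on the classpath for the toolbox.
import scala.reflect.runtime.{currentMirror => cm}
import scala.reflect.runtime.universe._
import scala.tools.reflect.ToolBox

val tb = cm.mkToolBox()

// Parse but do not typecheck: the tree is untyped, so identifiers carry no symbols yet.
val tree = tb.parse("x + y * 2")

// Names the expression binds itself (vals, defs, function parameters, ...).
val boundNames = tree.collect { case d: DefTree => d.name }.toSet

// Term identifiers not bound inside the expression are the candidates for
// free terms whose values a caller would have to supply.
val unbound = tree.collect {
  case Ident(name) if name.isTermName && !boundNames(name) => name
}.distinct

println(unbound) // List(x, y)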

Related

Special grammar in scala

I am very new to Scala and Spark, and I found a strange piece of grammar in the Scala code inside the Apache Beam project that I can't understand.
Here is the strange place:
JavaDStream<Metadata> metadataDStream = mapWithStateDStream.map(new Tuple2MetadataFunction());
// register ReadReportDStream to report information related to this read.
new ReadReportDStream(metadataDStream.dstream(), id, getSourceName(source, id), stepName)
.register();
From the above code, you can see that inside the constructor of ReadReportDStream, the first parameter is
metadataDStream.dstream()
If you go inside the dstream() method, you will see the following code:
class JavaDStream[T](val dstream: DStream[T])(implicit val classTag: ClassTag[T])
extends AbstractJavaDStreamLike[T, JavaDStream[T], JavaRDD[T]] {
I am wondering why it uses "metadataDStream.dstream()" in the constructor instead of "metadataDStream.dstream"? What does the "()" do?
It's mostly a question of convention. Methods with empty parameter lists are evaluated for their side effects, while parameterless methods are assumed to be purely functional and free of side effects. You can read more about that in the Scala style guide (Arity-0 section): https://docs.scala-lang.org/style/method-invocation.html
So in this case, metadataDStream.dstream() probably has some side effects. However, writing it as metadataDStream.dstream would not be a syntax error.
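A toy illustration of that convention (not Spark code; the class and method names here are made up):
class Counter {
  private var n = 0

  // Parentheses signal a side effect: calling this mutates state.
  def increment(): Unit = { n += 1 }

  // No parentheses signals a pure accessor.
  def current: Int = n
}

val c = new Counter
c.increment()       // side-effecting call, written with ()
println(c.current)  // pure read, written without ()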

How to generate top-level class/object with scala macro

As we know, it is easy to create an inner class inside a method with a Scala macro.
But I'd like to know: is it possible to generate a top-level class/object?
If the answer is yes, how do I avoid generating the same class twice?
My Scala version is 2.11.
Top-level expansions must retain the number of annottees, their flavors and their names, with the only exception that a class might expand into a same-named class plus a same-named module, in which case they automatically become companions as per previous rule.
https://docs.scala-lang.org/overviews/macros/annotations.html
So you can transform a top-level
@annot
class A
into
class A
object A
or
@annot
object A
into
class A
object A
There also used to be c.introduceTopLevel, but it was removed.
Context.introduceTopLevel. The Context.introduceTopLevel API, which used to be available in early milestone builds of Scala 2.11.0 as a stepping stone towards type macros, was removed from the final release, because type macros were rejected for including in Scala and discontinued in macro paradise.
https://docs.scala-lang.org/overviews/macros/changelog211.html
Scala Macro: Define Top Level Object
introduceTopLevel has provided a long-requested functionality of generating definitions that can be used outside macro expansions. However, metaprogrammers have
quickly discovered that introduceTopLevel is dangerous. Top-level scope is a resource shared between the typechecker and user metaprograms, so mutating it with
introduceTopLevel can lead to compilation order problems. For example, if one file
in a compilation run relies on definitions created by a macro expansion performed in
another file, compiling the former before the latter may lead to unexpected compilation
errors.
https://infoscience.epfl.ch/record/226166/files/EPFL_TH7159.pdf (section 5.2.3 Conclusion)
If the companion you want to generate already exists, then the companion you return from the macro annotation's macroTransform will replace the original. You don't need to worry that there will end up being two "companions"; the compiler takes care of that. But normally you do match on which case you are in (whether there is only the annottee, or the annottee plus its existing companion), as in the sketch below.
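A minimal sketch of such an annotation (Scala 2.11 with the macro paradise compiler plugin; the annotation name and helper object are illustrative, not taken from the question):
import scala.annotation.StaticAnnotation
import scala.language.experimental.macros
import scala.reflect.macros.whitebox

// Expands an annotated class into the class plus a same-named companion object.
class withCompanion extends StaticAnnotation {
  def macroTransform(annottees: Any*): Any = macro WithCompanionImpl.impl
}

object WithCompanionImpl {
  def impl(c: whitebox.Context)(annottees: c.Expr[Any]*): c.Expr[Any] = {
    import c.universe._
    annottees.map(_.tree) match {
      case (cls @ ClassDef(_, name, _, _)) :: rest =>
        val companion = rest match {
          // A companion already exists: whatever we return here replaces it.
          case (obj: ModuleDef) :: Nil => obj
          // No companion yet: generate a same-named one.
          case Nil => q"object ${name.toTermName}"
        }
        c.Expr[Any](q"$cls; $companion")
      case _ =>
        c.abort(c.enclosingPosition, "@withCompanion must annotate a class")
    }
  }
}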

How to decode Scala compiler generated names

Our code is killing the Scala compiler with this message:
[error] how can getCommonSuperclass() do its job if different class symbols
get the same bytecode-level internal name: scala/Tuple2$mcJD$sp
To figure it out I'm trying to understand what Tuple2$mcJD$sp is supposed to be. Is it the class generated for (Long, Double)? Is this documented somewhere? Thanks!
Some clues I have found so far:
I think the type abbreviations match those documented in Class.getName.
The name is generated in SpecializeTypes.specializedName. According to the code, the format is m<abbreviated method specialization types>c<abbreviated class specialization types>$sp.
Maybe it's considered a compiler internal thing and not documented anywhere.
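One way to check that reading (assuming a Scala 2.x where Tuple2 is specialized on these primitives) is to look at the runtime class of a (Long, Double) pair:
val p = (1L, 2.0)            // statically a (Long, Double)
println(p.getClass.getName)  // scala.Tuple2$mcJD$sp: "mc" = class specialization,
                             // J = long, D = double (the Class.getName abbreviations),
                             // and "$sp" marks the specialized subclass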

Why does Array.fill take an implicit scala.reflect.ClassManifest?

So I'm playing with writing a Battlecode player in Scala. In Battlecode, certain classes are disallowed, and there is a runtime exception if you ever try to access them. When I use the Array.fill function, I get a message from the Battlecode server saying [java] Illegal class: scala/reflect/Manifest$. This is the offending line:
val g_score = Array.fill[Int](rc.getMapWidth(), rc.getMapHeight())(0)
The method takes an implicit ClassManifest argument which has the following documentation:
A ClassManifest[T] is an opaque descriptor for type T. It is used by the compiler
to preserve information necessary for instantiating Arrays in those cases where
the element type is unknown at compile time.
But I do know the type of the array elements at compile time; as shown above, I explicitly state that they will be Int. Is there a way to avoid this? As a workaround I've written my own version of Array.fill, but this seems like a hack. As an aside, does Scala have real 2D arrays? Array.fill seems to return an Array[Array[T]], which is the only way I found to write my own. This also seems inelegant.
Edit: Using Scala 2.9.1
For background information, see this related question: What is a Manifest in Scala and when do you need it?. There you will find an explanation of why manifests are needed for arrays.
In short: Although the JVM uses type erasure, arrays are an exception and need a manifest. Since you could compile your code, that manifest was found (manifests are always available for proper types). Your error occurs at runtime.
I don't know the details of the Battlecode server, but there are two possibilities: either you are running your compiled classes with a binary-incompatible version of Scala (a difference in major version, e.g. compiled with Scala 2.9 while the server uses 2.10), or the server doesn't even have scala-library.jar on its class path.
As said in the comment, manifests are deprecated in Scala 2.10 and replaced by ClassTag.
EDIT: So it seems the class loader is artificially restricting the allowed classes. My suggestion would be: Add a helper Java class. You can easily mix Java and Scala code. If it's just about the Int-Array instantiation, you could provide something like:
public class Helper {
    public static int[][] makeArray(int d1, int d2) { return new int[d1][d2]; }
}
(hope that's valid java code, a bit rusty)
Also, have you tried creating the outer array with new Array[Array[Int]](d1) and then iterating to create the inner arrays?
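For example, a sketch of that last suggestion; makeGrid is a hypothetical helper you would call as makeGrid(rc.getMapWidth(), rc.getMapHeight()). Neither allocation needs a Manifest, because the element types are concrete:
def makeGrid(width: Int, height: Int): Array[Array[Int]] = {
  // Outer array of int[] references: a plain anewarray, no Manifest lookup.
  val grid = new Array[Array[Int]](width)
  var i = 0
  while (i < width) {
    grid(i) = new Array[Int](height) // primitive int[] row, zero-filled by the JVM
    i += 1
  }
  grid
}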

Scala Case Class Map Expansion

In groovy one can do:
class Foo {
Integer a,b
}
Map map = [a:1,b:2]
def foo = new Foo(map) // map expanded, object created
I understand that Scala is not, in any sense of the word, Groovy, but I am wondering whether map expansion in this context is supported.
Simplistically, I tried and failed with:
case class Foo(a:Int, b:Int)
val map = Map("a"-> 1, "b"-> 2)
Foo(map: _*) // no dice, always applied to first property
Here is a related thread that shows possible solutions to the problem.
Now, from what I've been able to dig up, as of Scala 2.9.1 at least, reflection in regard to case classes is basically a no-op. The net effect then appears to be that one is forced into some form of manual object creation, which, given the power of Scala, is somewhat ironic.
I should mention that the use case involves the servlet request parameters map. Specifically, using Lift, Play, Spray, Scalatra, etc., I would like to take the sanitized params map (filtered via routing layer) and bind it to a target case class instance without needing to manually create the object, nor specify its types. This would require "reliable" reflection and implicits like "str2Date" to handle type conversion errors.
Perhaps in 2.10, with the new reflection library, implementing the above will be a piece of cake. I'm only two months into Scala, so I'm just scratching the surface; I do not see any straightforward way to pull this off right now (for seasoned Scala developers, it may be doable).
Well, the good news is that Scala's Product interface, implemented by all case classes, actually doesn't make this very hard to do. I'm the author of a Scala serialization library called Salat that supplies some utilities for using pickled Scala signatures to get typed field information:
https://github.com/novus/salat - check out some of the utilities in the salat-util package.
Actually, I think this is something that Salat should do - what a good idea.
Re: D.C. Sobral's point about the impossibility of verifying params at compile time - point taken, but in practice this should work at runtime just like deserializing anything else with no guarantees about structure, like JSON or a Mongo DBObject. Also, Salat has utilities to leverage default args where supplied.
This is not possible, because it is impossible to verify at compile time that all parameters were passed in that map.
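For illustration, here is a rough runtime-reflection sketch of the kind of binding described above (Scala 2.10 reflection; fromMap is a hypothetical helper, not Salat's API, and per the caveat above it simply fails at runtime if a parameter is missing):
import scala.reflect.runtime.{universe => ru}

// Instantiate a case class from a Map by matching the map's keys
// against the primary constructor's parameter names.
def fromMap[T: ru.TypeTag](map: Map[String, Any]): T = {
  val mirror      = ru.runtimeMirror(getClass.getClassLoader)
  val tpe         = ru.typeOf[T]
  val classMirror = mirror.reflectClass(tpe.typeSymbol.asClass)
  val ctor        = tpe.declaration(ru.nme.CONSTRUCTOR).asMethod
  // Order the values by parameter name; throws NoSuchElementException
  // at runtime if the map lacks one of them.
  val args = ctor.paramss.flatten.map(p => map(p.name.toString))
  classMirror.reflectConstructor(ctor)(args: _*).asInstanceOf[T]
}

case class Foo(a: Int, b: Int)
val foo = fromMap[Foo](Map("a" -> 1, "b" -> 2)) // Foo(1,2)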