Scala has dynamic compilation?

I have found many pages saying Scala doesn't have dynamic compilation, yet I have achieved exactly that using the Twitter util library. I wrote Scala code that calls the Eval function from that library, and it creates a Scala class at runtime.
Example: val obj = Eval[MyScalaTrait](new File("flat file containing Scala code"))
Can you please tell me which is better for dynamic compilation, Scala or Groovy? I want to keep the scripting part in a flat file and create the class at runtime. Both appear to solve my problem, but I want to know which would be best.

I can't tell you about Groovy, but I'm using dynamic compilation of Scala code in several projects. It works mostly flawlessly, but the compiler can take a bit of warm-up time, so don't expect this to give you fantastically low latencies in "real-time" situations.
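For reference, a minimal sketch of the approach the question describes, using Twitter's util-eval library (the file name and trait are illustrative, and the script file is assumed to contain Scala source whose last expression evaluates to a MyScalaTrait instance):

import java.io.File
import com.twitter.util.Eval

trait MyScalaTrait {
  def greet(name: String): String
}

object DynamicCompilationDemo {
  def main(args: Array[String]): Unit = {
    val eval = new Eval()
    // script.scala might contain, for example:
    //   new MyScalaTrait { def greet(name: String) = "hello, " + name }
    val obj = eval[MyScalaTrait](new File("script.scala"))
    println(obj.greet("world"))
  }
}

The first evaluation pays the compiler warm-up cost mentioned above; later evaluations tend to be faster once the compiler classes are loaded.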

Related

Scala source code definition of "def" and other built-ins

I was doing some research on how to solve this question. However, I am wondering whether I can learn how these functions work, and how they pass arguments into the local scope, by reading the Scala source code.
I know the Scala source code is hosted on GitHub; my question is how to locate the definition of def.
Or, more generally, how do I locate the source code of certain built-in functions and operators?
The source code for everything in the Scala standard library is under https://github.com/scala/scala/tree/2.11.x/src/library/scala.
Also, the Scaladoc for the standard library includes links to the source code. So e.g. if you're interested in scala.Option and you're looking at http://www.scala-lang.org/api/2.11.7/#scala.Option, notice that page has "Source: Option.scala" where "Option.scala" is hyperlinked to the source code.
For something like def, which is not part of the standard library, but part of the language, well... there is no single place where def itself is defined. The compiler has 25 phases (you can list them by running scalac -Xshow-phases) and basically every phase participates in the job of making def mean what it means.
If you want to understand def, you'd probably be better off reading the Scala Language Specification; it's highly technical, but still much more approachable than the source code for the compiler.
The part of the spec that addresses your question about named and default arguments is SLS 6.6.1.
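For reference, the named and default arguments that SLS 6.6.1 covers look like this in Scala code:

def connect(host: String, port: Int = 8080, secure: Boolean = false): String =
  s"$host:$port secure=$secure"

connect("example.com")                // both defaults apply
connect("example.com", secure = true) // a named argument skips over port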

Scala Metaprogramming at Runtime

I'm building a tool that will receive unpredictable data structures, and I want to generate case classes that match the structure of the received data.
I'm trying to figure out whether it's possible to generate a case class at runtime, since the structure will be known only at runtime.
It's something similar to what macros do, but at runtime.
I've found this project on the internet:
mars
which is very close to what I want to do, but I couldn't find whether it was successful or not.
Another way of doing it is to generate the code, compile it, and put the result on the classpath, the way IScala does so that the code can be used interactively. But I don't think that will scale.
Has anybody already done something like runtime code generation?
This question was also posted in scala-user mailing list
UPDATE (as per the comments):
If all you want is throw-away code generated at runtime to be fed into a library that cannot work with just lists and maps, and not code to be stored and used later, it would make sense to look for solutions to this problem for Java or the JVM in general. That is, unless the library requires some Scala-specific features not available in vanilla JVM bytecode (Scala adds some extras to the bytecode that Java code doesn't need/have).
What is the benefit of generating statically typed code dynamically, as opposed to using a dynamic data structure?
I would not attempt that at all. Just use a structure such as nested lists and maps.
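A minimal sketch of that kind of dynamic structure (the field names are made up for illustration):

// Structure known only at runtime, held in plain collections instead of a generated case class.
val record: Map[String, Any] = Map(
  "name"   -> "sensor-1",
  "values" -> List(1.0, 2.5, 3.7),
  "meta"   -> Map("unit" -> "celsius")
)
// Reading it back requires a cast or a pattern match, e.g.
val values = record("values").asInstanceOf[List[Double]]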
Runtime code generation is one of the purposes of the Mars project. Mars is under development; at the moment there is no release version. Mars requires its own toolchain to expand macros at runtime and uses several features unique to scala.meta (http://scalameta.org/), for example AST interpretation and AST persistence. Currently we are working on AST typechecking in scala-reflect, which is required for runtime macro expansion.
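For completeness, one way to generate and compile a case class at runtime today is scala-reflect's ToolBox. A minimal sketch, assuming scala-compiler is on the runtime classpath and using an illustrative class shape:

import scala.reflect.runtime.currentMirror
import scala.tools.reflect.ToolBox

object RuntimeCaseClassDemo {
  def main(args: Array[String]): Unit = {
    val toolbox = currentMirror.mkToolBox()
    // Build the source text from the structure discovered at runtime,
    // then parse and evaluate it in one go.
    val source =
      """case class Record(name: String, value: Int)
        |Record("answer", 42)""".stripMargin
    val result = toolbox.eval(toolbox.parse(source))
    println(result) // Record(answer,42)
  }
}

The catch, as the discussion above suggests, is that any statically typed code calling Record would itself have to be generated or use reflection, since the class does not exist at compile time.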

Matlab: calling a Scala function

I would like to write some pieces of code in Scala. It is important to me that this code can be called from Matlab. AFAIK Java code can easily be integrated with Matlab: http://www.mathworks.co.uk/help/matlab/ref/javamethod.html
Is this also valid for Scala code?
It is also valid for Scala code. Just pretend that you're doing Java interop (e.g. a method called + would actually be $plus) and make sure scala-library.jar is in Matlab's classpath (e.g. using javaaddpath).
Incidentally, I've done Java/Matlab interop before, and it's not as "easily integrated" as one might hope. Passing data is kind of awkward and you tend to trip on classloader and/or classpath issues every now and then.
I'd be wary of planning a big project with lots of tightly-connected interop. In my experience it usually works better using the Unix mindset: make independent tools that do their own thing well, and chain them together to get what you want. For instance, I routinely have Scala write Matlab code and call Matlab to run it, or have Matlab use system to fire up a Scala program to process a subset of the data.
So: calling Scala from Matlab is completely possible, and for simple interfaces it looks just like Java. Just try to keep the interface simple.
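As an illustration, the Scala side of such a simple interface might be no more than a plain object (the names here are made up); once it's packaged into a jar, Matlab can call it through the usual Java mechanisms:

// Scala side: a top-level object compiles to static forwarder methods,
// so Matlab can call it like a static Java method.
object MatlabBridge {
  def mean(values: Array[Double]): Double =
    if (values.isEmpty) 0.0 else values.sum / values.length
}

// Matlab side, for illustration:
//   javaaddpath('matlab-bridge.jar'); javaaddpath('scala-library.jar');
//   javaMethod('mean', 'MatlabBridge', [1.0 2.0 3.0])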
You could encapsulate your Scala code in another language that's easily called by Matlab, such as Java or C. Since you seem to know about Java, you could use these examples to learn to wrap your Scala methods inside Java methods that you can call from Matlab. A number of other options are available to you with different languages (MEX-files, loadlibrary, etc.).

Scala: Lazy baking and runtime compilation of cake pattern

One of the great limitations of the cake pattern is that it is static. I would like to be able to mix in traits potentially written by different coders completely independently. However, the traits would not need to be mixed in frequently: the user would have an initialisation screen where they would choose the traits / assemblies before the main application was run.
So the thought occurred to me: why not mix in and compile the chosen traits from within the user's selection module? If the compilation failed, no problem; the user would just get back a message, incompatible assemblies or whatever. If the compilation succeeded, the top UI module would load the newly compiled classes along with the pre-compiled parts of the assemblies and run the main application. Note that there might only need to be one or two classes compiled during run-time initialisation; all the rest of the code could have been compiled normally.
I'm pretty new to Scala. Is this a recognised pattern? Is there any support for it? It seems mad to have to use Guice for a relatively simple dependency situation. Can I run the Scala compiler easily from within an application? Can I run it in memory, and use its output from memory, without unnecessary file creation?
Note: Although appearing to be dynamic, this methodology would remain 100% static.
Edit: it occurs to me that one of the drivers of Microsoft's Roslyn project was to enable just this sort of thing for C# and Visual Basic. But that seems to have been a pretty big project, even for a high-powered Microsoft team.
Calling the compiler directly from within Scala is doable, but not for the timid. Luckily, the good people at Twitter have automated the process for you. (140-character celebrity micro-blogging, and some cool Scala utilities! Thanks, Twitter.) You can use the com.twitter.util.Eval class to compile and evaluate Scala strings. In your example, you would do something like:
import com.twitter.util.Eval

val eval = new Eval()
val myObj = eval[BaseClass]("new BaseClass with " + traitNameList.mkString(" with "))
This will create a new object with all of the traits you want mixed in. The question then arises as to whether this is a good idea. Downsides:
Calling out to the Scala compiler is not quick
If you do this enough, you will overload the PermGen space, as the classes you create will never be garbage collected
This really is more the sort of thing you want a dynamic language for, rather than Scala. You're likely to find places where this sort of works but clashes with the rest of your architecture (yes, that's vague).

How do I generate new source code in text form in a Scala compiler plugin?

I have just finished the first version of a Java 6 compiler plugin that automatically generates wrappers (proxy, adapter, delegate, call it what you like) based on an annotation.
Since I am doing mixed Java/Scala projects, I would like to be able to use the same annotation inside my Scala code, and get the same generated code (except of course in Scala). That basically means starting from scratch.
What I would like to do, and for which I haven't found an example yet, is to generate the code inside a Scala compiler plugin in the same way as in the Java compiler plugin. That is, I match/find where my annotation is used, get the AST for the annotated interface, and then ask the API to give me a Stream/Writer into which I output the generated Scala source code, using string manipulation.
That last part is what I could not find. So how do I tell the API to create a new Scala source file and give me a Stream/Writer/File/Handle, so that I can just write into it, and when I'm done, the Scala compiler compiles it within the same run in which the plugin was invoked?
Why would I want to do that? Firstly, because then both plugins have the same structure, so maintenance is easy. Secondly, I want to open-source it, and there is just no way to support every option that anyone would want, so I expect potential users to want to extend the generation with their own code. This will be a lot easier for them if they just have to do some printf(), instead of learning the AST API (this also applies to me).
Short answer:
It can't be done
Long answer:
You could conceivably generate your source file and push that through a parser instance within your plugin. But not in any way that's likely to be of any use to you, because you'd now have a bigger problem to contend with:
In order to grab all the type/name information for generating the delegate/proxy, you'll have to pick up the annotated type's AST after it has run through both the namer and typer phases (which are inseparable). The catch is that any attempt to call your generated code will already have failed typechecking, the compiler will have thrown an error, and any further bets are off.
Method synthesis is possible in limited cases, so long as you can somehow fool the typechecker for just long enough to get your code generated, which is the trick I pulled with my Autoproxy 'lite' plugin. Even then, you're far better off working with TreeDSL to generate code instead of pumping out raw source.
Kevin is entirely correct, but just for completeness it's worth mentioning that there is another alternative - write a compiler plugin that generates source. This is the approach that I've adopted in Borachio. It's not a very satisfactory solution, but it can be made to work.
Edit - I just reread your question and realised that you're actually asking about generating source anyway.
So there is no support for this directly, but it's basically just a question of opening a file and writing the relevant "print" statements. There's no way to invoke the compiler "inside" a plugin AFAIK, but I've written an sbt plugin which hides most of the complexity of invoking the compiler twice.
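For a concrete starting point, here is a rough sketch of the shape such a source-writing component can take in a Scala 2.x compiler plugin (the plugin name, annotation name, and output path are all illustrative; the emitted file then has to be fed to a second compiler run, e.g. via the sbt plugin mentioned above):

import java.io.{File, PrintWriter}
import scala.tools.nsc.{Global, Phase}
import scala.tools.nsc.plugins.{Plugin, PluginComponent}

class WrapperGenPlugin(val global: Global) extends Plugin {
  val name = "wrappergen"
  val description = "writes wrapper source for annotated types"
  val components = List[PluginComponent](WrapperGenComponent)

  object WrapperGenComponent extends PluginComponent {
    val global: WrapperGenPlugin.this.global.type = WrapperGenPlugin.this.global
    import global._

    val runsAfter = List("typer")   // run after namer/typer so symbols carry type information
    val phaseName = "wrappergen"

    def newPhase(prev: Phase): Phase = new StdPhase(prev) {
      def apply(unit: CompilationUnit): Unit = {
        // Walk the typed AST, pick out the annotated types, and emit plain text.
        unit.body.foreach {
          case cd: ClassDef if cd.symbol.annotations.exists(_.toString.contains("GenerateWrapper")) =>
            val out = new File("target/generated", s"${cd.name}Wrapper.scala")
            out.getParentFile.mkdirs()
            val writer = new PrintWriter(out)
            try writer.println(s"// TODO: wrapper for ${cd.name}, built via string manipulation")
            finally writer.close()
          case _ => ()
        }
      }
    }
  }
}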