What's the Scala alternative to runtime-preserved annotations?

I just realized that I cannot have annotations in Scala that are preserved and analyzed at runtime. I also checked this question, but I didn't quite get what the alternatives are.
DI - an answer mentions that there is no need for a DI framework in Scala. While that might be the case at a basic level (although I didn't quite like that example; what's the idiomatic way of handling DI?), Java DI frameworks like Spring are pretty advanced and handle many things like scheduled jobs, caching, managed persistence, etc., all through annotations, and sometimes custom ones.
ORM - I'll admit I haven't tried any native Scala ORM, but from what I see in Squeryl, it also makes some use of annotations - does that mean they are unavoidable?
Any serialization tool - how do you idiomatically customize serialization output to JSON/XML/...?
Web service frameworks - how do you define (in code) the mappings, headers, etc. for RESTful or SOAP services?
Do Scala users need a hybrid Scala/Java project (the Java part for the annotations) in order to use these facilities coming from Java?
And are the native Scala alternatives for metadata nicer than annotations? I'm not yet fully into the Scala mode of thinking, and therefore most of the examples look ugly to me compared to using annotations, so please try to be extra convincing :)

Actually, Scala does have runtime-retained annotations. The difference is that they are not stored as Java annotations but are instead encoded inside the contents of the binary ScalaSignature annotation (which is itself a runtime-retained Java annotation).
So, Scala annotations can be retrieved at runtime, but instead of using Java reflection, one must use Scala reflection:
    import scala.annotation.StaticAnnotation
    import scala.reflect.runtime.universe._

    class Awesome extends StaticAnnotation

    @Awesome
    class AwesomeClass

    val clazz  = classOf[AwesomeClass]
    val mirror = runtimeMirror(clazz.getClassLoader)
    val symbol = mirror.classSymbol(clazz)
    println(symbol.annotations) // prints 'List(Awesome)'
Unfortunately, Scala reflection is still marked as experimental and is actually unstable at this point (SI-6240 and SI-6826 are examples of quite serious issues). Nevertheless, it seems like the most straightforward replacement for Java reflection and annotations in the long term.
For now, one has to use Java annotations, which I think is still a nice solution.
Regarding frameworks and libraries for DI/ORM/web services/serialization - Scala still doesn't seem to be as mature in this area as Java is. There are plenty of ongoing projects targeting these problems; some of them are already really nice, others are still in development. To name a few that come to mind: Squeryl, Slick, Spray, Pickling.
Also, Scala has some advanced features that often make annotations unnecessary. Typeclasses (implemented with implicit parameters) are probably a good example of that.
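For instance, where a Java library might use an annotation to customize how a class is serialized, the typeclass approach puts that customization in an ordinary implicit value that the compiler resolves and checks. A minimal sketch, assuming made-up names (JsonWriter, Json, User) rather than any particular library:

    trait JsonWriter[A] {
      def write(value: A): String
    }

    case class User(name: String, age: Int)

    object JsonWriters {
      // instead of annotating User with serialization hints, we provide an
      // implicit instance that the compiler picks up at the call site
      implicit val userWriter: JsonWriter[User] = new JsonWriter[User] {
        def write(u: User): String = s"""{"name":"${u.name}","age":${u.age}}"""
      }
    }

    object Json {
      def toJson[A](value: A)(implicit writer: JsonWriter[A]): String =
        writer.write(value)
    }

    // usage:
    //   import JsonWriters._
    //   Json.toJson(User("Alice", 30)) // {"name":"Alice","age":30}

The customization lives in statically checked code rather than in metadata inspected at runtime, which is the main argument for this style over annotations.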

Related

Access scala function in PySpark

I have a Scala library which contains some utility code and UDFs for the Scala Spark API.
However, I would now love to start using this Scala library with PySpark. Using Java-based classes seems to work pretty OK, as outlined in Running custom Java class in PySpark; however, since I use a library written in Scala, the names of some classes might not be straightforward and may contain characters like $.
How is interoperability still possible?
How can I use Java/Scala code which is offering a function requiring a generic type parameter?
In general you don't. While access in such cases is sometimes possible, using __getattribute__ / getattr, Py4j is simply not designed with Scala in mind (that's really not Python-specific - while Scala is technically speaking interoperable with Java, it is a much richer language, and many of its features are not easily accessible from other JVM languages).
In practice you should do the same thing that Spark does internally - instead of exposing the Scala API directly, you create a lean* Java or Scala API which is specifically designed for interoperability with guest languages. Since Py4j provides translation only between basic Python and Java types, and doesn't handle commonly used Scala interfaces, you will need such an intermediate layer anyway, unless the Scala library was specifically designed for Java interoperability.
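As an illustration of what such an intermediate layer might look like on the Scala side, here is a minimal sketch (the object and method names are invented; the point is restricting the signature to plain Java types):

    import scala.collection.JavaConverters._

    // A top-level object compiles to a class with static forwarder methods
    // (as long as there is no companion class), so it can typically be reached
    // from Py4j without worrying about $-suffixed names or module instances.
    object PyFacade {
      // only java.util collections and boxed primitives cross the boundary;
      // any Scala-specific machinery stays hidden behind this method
      def normalize(values: java.util.List[java.lang.Double]): java.util.List[java.lang.Double] = {
        val xs  = values.asScala.map(_.doubleValue)
        val max = if (xs.isEmpty) 1.0 else xs.max
        xs.map(x => java.lang.Double.valueOf(x / max)).asJava
      }
    }

From PySpark such a facade would then be reached through the Py4j gateway (e.g. via the SparkContext's _jvm view), without the Python side needing to know anything about Scala collections or implicits.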
As for your last concern:
How can I use Java/Scala code which is offering a function requiring a generic type parameter?
Py4j can handle Java generics just fine without any special treatment. Advanced Scala features (manifests, class tags, type tags) are typically a no-go, but once again, these are not designed with Java interoperability in mind (though it is possible).
* As a rule of thumb, if something is Java-friendly (doesn't require any crazy hacks, extensive type conversions, or filling in the blanks normally handled by the Scala compiler), it should be a good fit for PySpark as well.

What reflection capabilities can we expect from Scala 2.10?

Scala 2.10 brings reflection capabilities beyond those provided by the JVM (or, I guess, the CLR).
What in particular do we have to look forward to, and how will it improve on the platform?
For example, will there be a class that reflects the language's convertibility between fields and accessor methods, so that I can iterate over the properties of an object?
Update 2012-07-04:
Daniel Sobral (also on SO) details in his blog post "JSON serialization with reflection in Scala! Part 1 - So you want to do reflection?" some of the features coming with reflection:
To recapitulate, Scala 2.10 will come with a Scala reflection library.
That library is used by the compiler itself, but is divided into layers through the cake pattern, so different users see different levels of detail, keeping jar sizes adequate to each one's use, and hopefully hiding unwanted detail.
The reflection library also integrates with the upcoming macro facilities, enabling enterprising coders to manipulate code at compile time.
Update 2012-06-14 (from Eugene Burmako):
In Scala 2.10.0-M4, we have released the new reflection API that will most likely make it into 2.10.0-final without significant changes.
More details about the API can be found in:
SO answer Get companion object instance with new Scala reflection API
Scala Reflection SIP, June 2012 by Martin Odersky (SIP, actually "Scala Improvement Process")
summary and migration route from M3
Extracts:
Universes and mirrors are now separate entities:
universes host reflection artifacts (trees, symbols, types, etc.),
mirrors abstract the loading of those artifacts (e.g. JavaMirror loads stuff using a classloader and annotation unpickler, while GlobalMirror uses the internal compiler classreader to achieve the same goal).
The public reflection API is split into scala.reflect.base and scala.reflect.api.
The former represents a minimalistic snapshot that is exactly enough to build reified trees and types.
To build, but not to analyze - everything smart (for example, getting a type signature) is implemented in scala.reflect.api.
Both reflection domains have their own universe: scala.reflect.basis and scala.reflect.runtime.universe.
The former is super lightweight and doesn't involve any classloaders, while the latter represents a stripped-down compiler.
Initial answer, Sept. 2011:
You can see the evolution of the reflect package in the Scala GitHub repo, with these two very recent commits:
Changes to Liftcode to use new reflection semantics, where a compiler uses type checking.
Started work on compiler toolbox that can compile reflect trees at runtime.
(Liftcode, according to this thread, aims at simplifying "writing code that writes code".)
The class scala/reflect/internal/Importers.scala (created yesterday!) is a good example of using those latest reflection features.
Two links which should be of interest:
The scala-internals mailing list discussion on the reflection api.
The nightly build api doc for 2.10-SNAPSHOT.
Personally I am hoping to use this for doing runtime discovery of extensions (i.e. a type that extends a known trait), and generating UI forms and a few other things from those.
With the current 2.10.0-M4 you can already iterate over the members of a class:
reflect.runtime.universe.typeOf[MyClass].members.filter(!_.isMethod)
The above code lists the Symbol objects representing the members of class MyClass that are not methods. There are tons of ways you can fine-tune this.
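For the "iterate over the properties of an object" use case from the question, here is a slightly longer sketch against the 2.10 reflection API (MyClass is just a stand-in case class for illustration):

    import scala.reflect.runtime.universe._

    case class MyClass(name: String, count: Int)

    // the underlying fields, i.e. members that are not methods
    val fields = typeOf[MyClass].members.filter(!_.isMethod)

    // the generated accessor methods ("properties")
    val getters = typeOf[MyClass].members.collect {
      case m: MethodSymbol if m.isGetter => m
    }

    fields.foreach(f => println(s"${f.name}: ${f.typeSignature}"))
    getters.foreach(g => println(s"${g.name}: ${g.returnType}"))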

Why do web development frameworks tend to work around the static features of languages?

I was a little surprised when I started using Lift by how heavily it uses reflection (or appears to); it was a little unexpected in a statically-typed functional language. My experience with JSP was similar.
I'm pretty new to web development, so I don't really know how these tools work, but I'm wondering,
What aspects of web development encourage using reflection?
Are there any tools (in statically typed languages) that handle (1) referring to code from a template page (2) object-relational mapping, in a way that does not use reflection?
Please see the Lift source. It doesn't use reflection for most of the code that I have studied; almost everything is statically typed. If you are referring to Lift views, they are processed as XML nodes - that, too, is not reflection.
Specifically referring to the <lift:Foo.bar/> issue:
When <lift:Foo.bar/> is encountered in the code, Lift makes a few guesses as to what the original name might have been (different naming conventions) and then calls java.lang.Class.forName to get the class. (Relevant code is in LiftSession.scala and ClassHelpers.scala.) It will only find classes registered with addToPackages during boot.
Note that it is also possible (and common) to register classes and methods manually. The convention is still that all transformations must be of the form NodeSeq => NodeSeq, because that is the only thing which makes sense for untyped HTML/XHTML output.
So, what you have is Lift's internal registry of node transformations on one side, and on the other side the implicit registry of the module. Both approaches use a simple string lookup to execute a method. I guess it is arguable whether one is more reflection-based than the other.
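To make the "simple string lookup" point concrete, here is a stripped-down sketch of the idea. This is not Lift's actual API, just an illustration of a name-keyed registry of NodeSeq => NodeSeq transformations:

    import scala.xml.NodeSeq

    object SnippetRegistry {
      private var snippets = Map.empty[String, NodeSeq => NodeSeq]

      // manual registration, as opposed to Class.forName-based discovery
      def register(name: String)(f: NodeSeq => NodeSeq): Unit =
        snippets += (name -> f)

      // look the transformation up by name and apply it, falling back to the input
      def render(name: String, in: NodeSeq): NodeSeq =
        snippets.get(name).map(_(in)).getOrElse(in)
    }

    // usage:
    //   SnippetRegistry.register("Foo.bar")(xhtml => <b>{xhtml}</b>)
    //   SnippetRegistry.render("Foo.bar", <span>hello</span>) // <b><span>hello</span></b>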

IoC container that supports constructor injection with Scala named/default arguments?

I would prefer using constructor injection over JavaBean property injection if I could utilize the named and default arguments feature of Scala 2.8. Do any IoC containers exist that support this, or that could easily be extended to do so? (The required information is there at runtime, in the scala.reflect.ScalaSignature annotation of the class.)
I also have some basic(?) expectations of the IoC container:
Auto-wiring (by target class/trait or annotation, both one-to-one and one-to-many)
Explicit injection (explicit wiring) without much hassle (Guice, for example, is weak there), e.g. user being injected that way in new ConnectionPool(user = "test").
Life-cycle callback for cleanup on shutdown (in the proper order)
Spring can do these, obviously, but it doesn't support named parameters. I have considered using FactoryBeans to bridge Scala and Spring, but that would mean too much hassle (boilerplate or code generation), as far as I can see.
PART A
I have a work-in-progress reflection library that parses the Scala signature and is currently able to resolve named parameters: https://github.com/scalaj/scalaj-reflect
Unfortunately, I haven't yet tied it back into Java reflection to be able to invoke methods, nor have I added the logic to resolve default values (though this should be trivial). Both features are very high on my to-do list :)
This isn't an IoC container per se, but it's a prerequisite for another project of mine: https://github.com/scalaj/scalaj-spring. Work on scalaj-spring stopped when it became blindingly obvious that I wouldn't be able to make any worthwhile further progress until I had signature-based reflection in place.
PART B
All of that stuff is intended for big enterprisey people anyway - those with no choice but to integrate their shiny new Scala code into some hulking legacy system... If that's not your use case, then you can just do Scala DI directly inside Scala.
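For that route, a minimal sketch of doing the wiring by hand, using exactly the named and default arguments the question asks about (the class names are made up):

    class ConnectionPool(val user: String = "app", val url: String = "jdbc:h2:mem:test")

    class UserRepository(val pool: ConnectionPool)

    object Wiring {
      // explicit wiring; only the value that differs from the default is named
      lazy val pool           = new ConnectionPool(user = "test")
      lazy val userRepository = new UserRepository(pool)

      // a plain method (or a JVM shutdown hook) can play the role of the
      // container's life-cycle callback
      def shutdown(): Unit = () // e.g. close the pool here
    }

This gives you explicit, compile-time-checked wiring, but of course not the auto-wiring or managed life-cycle from the question's list; that is what the frameworks mentioned below aim at.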
There's DI support provided under the Lift banner: http://www.assembla.com/wiki/show/liftweb/Dependency_Injection
You should also hunt around for references to the cake pattern.
Another dependency injection framework in Scala is subcut.
I have considered using FactoryBean-s to bridge Scala and Spring, but that would mean too much hassle
I am not sure I understand the complexity. It's actually quite simple to implement Spring FactoryBeans in Scala. Check this little write-up: http://olegzk.blogspot.com/2011/07/implementing-springs-factorybean-in.html
I've just released Sindi, an IoC container for the Scala programming language.
http://aloiscochard.github.com/sindi/

What parts of the Java ecosystem and language should a developer learn to get the most out of Scala?

Many of the available resources for learning Scala assume some background in Java. This can prove challenging for someone who is trying to learn Scala with no Java background.
What are some Java-isms a new Scala developer should know about as they learn the language?
For example, it's useful to know what a CLASSPATH is, what the java command line options are, etc...
That's a really great question! I've never thought about people learning Java just so they have it easier to learn Scala...
Apart from all the basics like for loops and such, learning Java generics can be really helpful. The Scala equivalent is much more powerful (and much harder to understand) than Java generics. You might want to try to figure out where the limits of Java generics are, and in which cases Scala's type constructors can be used to overcome those limitations. At a more basic level, it is important to know why generics are necessary at all, and in what sense Java is a strongly typed language.
Java allows you to have multiple constructors for one class. This knowledge will be of little use when you learn Scala, because Scala has other mechanisms for offering several ways to create instances of a class. So you don't need to take a deep look at this Java concept.
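What is meant here, presumably, are default/named arguments and factory methods on a companion object (Scala does also support auxiliary constructors via def this(...)). A small sketch with an invented Rectangle example:

    class Rectangle(val width: Int, val height: Int = 10)

    object Rectangle {
      // a named factory method instead of yet another constructor overload
      def square(side: Int): Rectangle = new Rectangle(side, side)
    }

    // usage:
    //   new Rectangle(3)       // uses the default height
    //   new Rectangle(3, 4)
    //   Rectangle.square(5)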
Here are some concepts that differ greatly between Java and Scala. If you learn the Java concept first and later want to learn the Scala equivalent, be aware that the Scala version differs so much from the Java one that a typical Java developer will have some difficulty adapting to the Scala way of thinking. Still, it usually helps to get used to the Java way first, because it is usually simpler and easier to learn. I personally like to think of Java as the introductory course, and Scala as the pro version.
Java mutable collection concept vs. Scala mutable/immutable differentiation
static methods (Java) vs. singleton objects (Scala)
for loops
Java return statement vs. Scala functional style ("every expression returns a value")
Java's use of null for "no value" vs. Scala's more explicit Option type
imports
Java's switch vs. Scala's match (this point and the Option point above are illustrated in the small sketch after this list)
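A tiny sketch of those two contrasts, with a made-up findUser function:

    // no null in sight: absence of a value is modelled with Option
    def findUser(id: Int): Option[String] =
      if (id == 1) Some("Alice") else None

    // match takes the place of switch, and understands Option directly
    val greeting = findUser(1) match {
      case Some(name) => s"Hello, $name"
      case None       => "Unknown user"
    }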
And here is a list of stuff that you will probably use from the Java standard library, even if you develop in Scala:
IO
GUI (Scala has a wrapper for Swing, but hey)
URLs, URIs, files
date
timers
And finally, some of Scala's features that have no direct equivalent in Java or the Java standard library:
operator overloading
implicits and implicit conversions
multiple argument lists / currying
anonymous functions / functions as values
actors
streams
Scala pattern matching (which rocks)
traits
type inference
for comprehensions
awesome collection operations like fold or map
Of course, all the lists are incomplete. That's just my view on what is important. I hope it helps.
And, by the way: You should definitely know about the class path and other JVM basics.
The standard library, above all else, because that's what Scala has most in common with Java.
You should also get a basic idea of Java's syntax, because a lot of books end up comparing something in Scala to something in Java. But other than the platform and some of the library, they're totally distinct languages.
There are a few trivial conventions passed from one to the other (like command line options), but as you read books and tutorials on Scala you should pick those up as you go regardless of previous Java experience.
The series "Scala for Java Refugees" can give some indications of the typical Java topics you are supposed to know and how they translate into Scala.
For instance, the very basic Java main() function translates into the Application trait, once considered harmful and now improved (for Scala 2.9 anyway).