Scala 2.10: How to get a new instance of a class (object) starting from the class name

I have tens of JSON fragments to parse, and for each one I need to get an instance of the right parser. My idea was to create a config file in which to write the name of the class to instantiate for each parser (a kind of map url -> parser). Getting back to your solution, I cannot call the method I implemented in each parser if I have a pointer to Any. I suppose this is a very common problem with a well-established solution, but I have no idea what the best practices could be.
I really have no experience with Java, reflection, class loading and all that stuff. So, can anyone write the body of the method below for me? I need to get an instance of a class passed as a String (no arguments needed for the constructor, at least so far...)
def createInstance(clazzName: String) = {
  // get the Class for the given clazzName, e.g. "net.my.BeautifulClazz"
  // instantiate an object and return it
}
Thanks, as usual...

There is a very simple answer:
scala> def createInstance(clazzName: String) = Class.forName(clazzName).newInstance
createInstance: (clazzName: String)Any
scala> createInstance("java.lang.String")
res0: Any = ""
If it works for you, everything is fine. If it doesn't, we have to look into your class loader. This is usually the point where things get dirty.
Depending on what you want to do, look into:
The cake pattern, if you want to combine your classes during compile time
OSGi when you want to build a plugin infrastructure (look here for a very simple example)
Google guice, if you really need dependency injection (e.g. when mixing Scala and Java code) and the cake pattern does not work for you
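On the asker's follow-up (not being able to call methods through a pointer to Any): the usual fix is to give all parsers a common trait and cast the reflectively created instance to it. A minimal sketch, with hypothetical trait and class names:

```scala
// Hypothetical common trait for all parsers
trait Parser {
  def parse(json: String): String
}

// One concrete parser, just for illustration
class UpperParser extends Parser {
  def parse(json: String): String = json.toUpperCase
}

object ParserFactory {
  def createInstance(clazzName: String): Parser =
    Class.forName(clazzName)
      .getDeclaredConstructor()   // no-arg constructor, as in the question
      .newInstance()
      .asInstanceOf[Parser]
}
```

Calling `parse` then works through the shared trait. An unknown class name throws ClassNotFoundException, and a class that doesn't implement Parser fails the cast with ClassCastException, so you may want to wrap the call in a Try. (Class#newInstance as used in the answer also works; newer JDKs just deprecate it in favor of getDeclaredConstructor().newInstance().)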

Scala code generation with annotations + macros or external script?

I want to know:
Can Scala annotations/transforms implement the code generation below? (objective)
What are the trade-offs as opposed to source code generation with an external tool? (objective)
Is there a better way / how would you do it? (subjective)
Background:
I'm designing an embedded database for Scala + Java as a side project. Although I want it to be usable from Java, I'm writing it in Scala for the extra flexibility/power and because writing Java code kills my soul.
I'm working out what I want model definitions to look like. My initial idea was to parse some model definition files and generate Scala classes. But I think now I'd like them to be definable in Scala code so that no parsing is required and people can bring the full power of Scala to bear in defining and customizing the models (e.g. custom column types.) That's a valuable feature from my Python/Django experience.
So far I have something like:
@model
class Person {
  val Name = StringColumn(length=32)
  val BirthDate = DateColumn(optional=true)
}

@model
class Student extends Person {
  val GPA = FloatColumn(propertyName="gpa")
}

@model
class Teacher extends Person {
  val salary = NumericColumn()
}
Which would generate:
class Person {
  val Name = StringColumn(name="name", length=32)
  val BirthDate = DateColumn(name="birthDate", optional=true)

  // generated accessor methods
  def name = Person.Name.get(...)
  def name_= (name : String) : Unit = Person.Name.set(..., name)
  // etc ...
}

// static access to model metadata, e.g. Person.Name is an immutable StringColumn instance
object Person extends Person

class Student extends Person {
  val GPA = DoubleColumn(name = "GPA")
  def gpa = ...
  def gpa_= (value : Float) = ...
}

object Student extends Student

class Teacher extends Person {
  // You get the idea
}

object Teacher extends Teacher
Looking at some examples online and doing some research, it seems like AST transforms using a special @model annotation could actually generate the needed code, maybe with a little bit of help, e.g. having the user define the object as well with the model definition. Am I right that this can be done?
Some problems that occur to me with this idea:
The object will be cluttered with properties that are not useful; all it needs are the Column objects. This could be fixed by splitting the class in two: PersonMeta and Person extends PersonMeta, with the Person object extending only PersonMeta.
IDEs will probably not pick up on the generated properties, causing them to underline them with wavy lines (eww...) and making it so auto-complete for property names won't work. The code would still be compile-time checked, so it's really just an IDE gotcha (Dynamic, no doubt, has the same problem.)
Code generation using a script is more IDE friendly, but it's hacky, probably more work, especially since you have to leave custom methods and things intact. It also requires a custom build step that you have to run whenever you change a model (which means you can forget to do it.) While the IDE might not help you with macro code generation (yet, anyway) the compiler will shout at you if you get things wrong. That makes me lean towards doing it with macros + annotation.
What do you think? I'm new to Scala, I kind of doubt I've hit on the best way to define models and generate implementations for them. How would you do it better?
It's possible yeah. Macros can be unpleasant to write and debug, but they do work.
Seems like you already have your solution there
Scala IDEs tend to handle macros correctly-ish (I mean, they have to, they're part of the language and used in some pretty fundamental libraries), so I wouldn't worry about that; if anything a macro is more ide-friendly than an external codegen step because a macro will stay in sync with a user's changes.
I'd see whether you can achieve what you want in vanilla scala before resorting to a macro. Remember that your class-like things don't necessarily have to be actual case classes; take a look at Shapeless generic records for an idea of how you can represent a well-typed "row" of named values from somewhere external. I think a system that could map a structure like those records to and from SQL might end up being more principled (that is, easier to reason about) than one based on "special" case classes.
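For contrast, here is roughly what the macro-free "vanilla Scala" route the answer suggests could look like. All names here are made up, and a Map stands in for a real database row; you write the accessors by hand, trading boilerplate for IDE friendliness:

```scala
// Hypothetical column type: knows its name and how to read/write a row
class StringColumn(val name: String, val length: Int = 255) {
  def get(row: Map[String, Any]): String = row(name).asInstanceOf[String]
  def set(row: Map[String, Any], v: String): Map[String, Any] = row + (name -> v)
}

// Static model metadata lives on the companion, as in the question
object Person {
  val Name = new StringColumn("name", length = 32)
}

// Hand-written accessors -- exactly the part a @model macro would generate
class Person(private var row: Map[String, Any]) {
  def name: String = Person.Name.get(row)
  def name_=(v: String): Unit = { row = Person.Name.set(row, v) }
}
```

The accessor pair is mechanical, which is what makes it a plausible target for a macro annotation; the question is whether the generated-property IDE trade-off is worth removing those few lines per column.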

Use `@annotation.varargs` on constructors

I want to declare a class like this:
class StringSetCreate(val s: String*) {
// ...
}
and call that in Java. The problem is that the constructor is of type
public StringSetCreate(scala.collection.Seq)
So in java, you need to fiddle around with the scala sequences which is ugly.
I know that there is the @annotation.varargs annotation which, if used on a method, generates a second method that takes Java varargs.
This annotation does not work on constructors; at least I don't know where to put it. I found Scala issue SI-8383, which reports this problem. As far as I understand there is currently no solution. Is this right? Are there any workarounds? Can I somehow define that second constructor by hand?
The bug is already filed as https://issues.scala-lang.org/browse/SI-8383 .
For a workaround I'd recommend using a factory method (perhaps on the companion object), where @varargs should work:
import scala.annotation.varargs

object StringSetCreate {
  @varargs def build(s: String*) = new StringSetCreate(s: _*)
}
Then in Java you call StringSetCreate.build("a", "b") rather than using new.
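Putting the workaround together as a self-contained sketch (the size method is only there for illustration):

```scala
import scala.annotation.varargs

class StringSetCreate(val s: String*) {
  def size: Int = s.length
}

object StringSetCreate {
  // @varargs makes scalac emit an extra build(String...) overload for Java callers
  @varargs def build(s: String*): StringSetCreate = new StringSetCreate(s: _*)
}

// Java side: StringSetCreate x = StringSetCreate.build("a", "b");
```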

Is it possible in Play 2.1 to convert a JsValue to a case class instance if you only know the name of the case class at the runtime?

I'm building a JSON RPC in Play 2.1. In order to call the proper methods, the RPC dispatcher uses reflection to create a class instance and call its methods by name.
Right now a RPC method looks like this:
def create(obj: JsValue) = {
  val menu: Menu = Json.fromJson[Menu](obj).get
  collection.insert(menu).map(r => toDirectResult(r))
}
def createCustom(obj: JsValue) = {
  val menu: Menu = Json.fromJson(obj)(Menu.customFormat).get
  collection.insert(menu).map(r => toDirectResult(r))
}
What I would like to do is to be able to define the RPC methods like this:
def create(menu: Menu) = {
  collection.insert(menu).map(r => toDirectResult(r))
}
The problem is that the RPC dispatcher only knows at runtime that it has to call the method named "create" on the class named "Menus", and it has the value of the argument to pass as a JsValue. Through reflection I can find out the number of arguments and their types for the RPC method. When the argument type is a case class, how do I transform the JsValue into a case class instance using the implicit Format (or Reads) defined in the companion object of the case class?
For the createCustom method I realize that there is no "magic" solution, but since I started learning Scala I have discovered that few things are truly impossible in this language. Would it be possible to use an annotation or something similar to specify a Format that is not implicit?
You need to implement a PathBindable... this should help: http://www.richard-foy.fr/blog/2012/04/09/how-to-implement-a-custom-pathbindable-with-play-2/
After further careful consideration I've decided that reflection is really not the right solution for my problem. It lacks type safety and proper error reporting at compile time, it is harder to debug, and it has an impact on performance as well. And I actually have all the information I need to generate the code at build time.
Unfortunately I cannot use the Play router because for the JSON RPC dispatcher the routing depends on the request body, which is not available during the Play routing. But in essence the RPC dispatcher is doing the same thing as the Play router. So for the moment I'm just going to manually code my RPC routes and then the problem in the question is solved. In the future I'm planning to write a SBT plugin that will automatically generate the dispatcher code, similar to the Play router.
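For reference, the companion-object lookup the original question asks about can be sketched as below. A toy Format trait stands in for Play's play.api.libs.json.Format, and the assumption is that the companion exposes its Format through a no-argument method, which is what an `implicit val` compiles to:

```scala
// Toy stand-in for Play's Format; reads a "JSON" string instead of a JsValue
trait Format[A] {
  def reads(json: String): A
}

case class Menu(name: String)

object Menu {
  implicit val menuFormat: Format[Menu] =
    new Format[Menu] { def reads(json: String): Menu = Menu(json) }
}

object RuntimeFormat {
  // Scala compiles `object Menu` to a class "Menu$" with a static MODULE$ field,
  // so the companion is reachable from the class name alone.
  def formatFor(className: String): Format[Any] = {
    val companionClass = Class.forName(className + "$")
    val companion = companionClass.getField("MODULE$").get(null)
    companionClass.getMethods
      .find(m => classOf[Format[_]].isAssignableFrom(m.getReturnType) &&
                 m.getParameterCount == 0)
      .map(_.invoke(companion).asInstanceOf[Format[Any]])
      .getOrElse(sys.error(s"no Format found on companion of $className"))
  }
}
```

In real Play code you would feed the recovered Format a JsValue rather than a String; the lookup mechanics are the same.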

what is the input type of classOf

I am wondering what type do I put in place of XXX
def registerClass(cl:XXX) = kryo.register(classOf[cl])
EDIT: For why I want to do this.
I have to register many classes using the above code. I wanted to remove the duplication of calling kyro.register several times, hoping to write code like below:
Seq(com.mypackage.class1,com.mypackage.class2,com.mypackage.class3).foreach(registerClass)
Another question, can I pass String instead? and convert it somehow to a class in registerClass?
Seq("com.mypackage.class1","com.mypackage.class2").foreach(registerClass)
EDIT 2:
When I write com.mypackage.class1, it means any class defined in my source. So if I create a class
package com.mypackage.model
class Dummy(val ids:Seq[Int],val name:String)
I would provide com.mypackage.model.Dummy as input
So,
kryo.register(classOf[com.mypackage.model.Dummy])
Kryo is a Java serialization library. The signature of the register method is:
register(Class type)
You could do it like this:
def registerClass(cl:Class[_]) = kryo.register(cl)
And then call it like this:
registerClass(classOf[Int])
The type parameter to classOf needs to be known at compile time. Without knowing more about what you're trying to do, is there any reason you can't use:
def registerClass(cl:XXX) = kryo.register(cl.getClass)
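To address both parts of the question in one place, here is a sketch where a stand-in register function replaces kryo.register (it has the same Class-typed signature): pass `classOf[...]` values directly, and resolve Strings with Class.forName.

```scala
// Stand-in for the Kryo instance: records which classes were registered
object Registry {
  private var classes = List.empty[Class[_]]
  def register(c: Class[_]): Unit = classes ::= c
  def registered: List[Class[_]] = classes
}

object KryoSetup {
  // kryo.register takes a Class, so the parameter type is Class[_]
  def registerClass(cl: Class[_]): Unit = Registry.register(cl)

  // String version: resolve the fully qualified name at runtime
  def registerByName(name: String): Unit = registerClass(Class.forName(name))

  def registerAll(): Unit = {
    Seq(classOf[String], classOf[Integer]).foreach(registerClass)
    Seq("java.lang.Boolean", "java.lang.Long").foreach(registerByName)
  }
}
```

Note the String version trades compile-time safety for flexibility: a typo in the name only fails at runtime with ClassNotFoundException, whereas classOf[com.mypackage.model.Dummy] is checked by the compiler.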

Why people define class, trait, object inside another object in Scala?

OK, I'll explain why I'm asking this question. I've begun to read the Lift 2.2 source code recently.
It helps if you happen to have read the Lift source code before.
In Lift, I found that inner classes and inner traits are used very heavily.
object Menu has 2 inner traits and 4 inner classes. object Loc has 18 inner classes, 5 inner traits, 7 inner objects.
There are tons of code written like this. I want to know why the authors wrote it this way.
Is it because it's the authors' personal taste, or a powerful use of a language feature?
Is there any trade-off for this kind of usage?
Before 2.8, you had to choose between packages and objects. The problem with packages is that they cannot contain methods or vals on their own. So you have to put all those inside another object, which can get awkward. Observe:
object Encrypt {
  private val magicConstant = 0x12345678
  def encryptInt(i: Int) = i ^ magicConstant

  class EncryptIterator(ii: Iterator[Int]) extends Iterator[Int] {
    def hasNext = ii.hasNext
    def next = encryptInt(ii.next)
  }
}
Now you can import Encrypt._ and gain access to the method encryptInt as well as the class EncryptIterator. Handy!
In contrast,
package encrypt {
  object Encrypt {
    private[encrypt] val magicConstant = 0x12345678
    def encryptInt(i: Int) = i ^ magicConstant
  }

  class EncryptIterator(ii: Iterator[Int]) extends Iterator[Int] {
    def hasNext = ii.hasNext
    def next = Encrypt.encryptInt(ii.next)
  }
}
It's not a huge difference, but it makes the user import both encrypt._ and encrypt.Encrypt._ or have to keep writing Encrypt.encryptInt over and over. Why not just use an object instead, as in the first pattern? (There's really no performance penalty, since nested classes aren't actually Java inner classes under the hood; they're just regular classes as far as the JVM knows, but with fancy names that tell you that they're nested.)
In 2.8, you can have your cake and eat it too: call the thing a package object, and the compiler will rewrite the code for you so it actually looks like the second example under the hood (except the object Encrypt is actually called package internally), but behaves like the first example in terms of namespace--the vals and defs are right there without needing an extra import.
Thus, projects that were started pre-2.8 often use objects to enclose lots of stuff as if they were a package. Post-2.8, one of the main motivations has been removed. (But just to be clear, using an object still doesn't hurt; it's more that it's conceptually misleading than that it has a negative impact on performance or whatnot.)
(P.S. Please, please don't try to actually encrypt anything that way except as an example or a joke!)
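The 2.8 package object described above would look like this; callers then just import encrypt._ (or write encrypt.encryptInt) with no extra Encrypt. prefix:

```scala
// Package object: the val and def live directly in the `encrypt` package namespace
package object encrypt {
  private[encrypt] val magicConstant = 0x12345678
  def encryptInt(i: Int): Int = i ^ magicConstant
}
```

Since XOR is its own inverse, encryptInt also decrypts, which is part of why it makes a nice toy example and a terrible cipher.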
Putting classes, traits and objects in an object is sometimes required when you want to use abstract type variables, see e.g. http://programming-scala.labs.oreilly.com/ch12.html#_parameterized_types_vs_abstract_types
It can be both. Among other things, an instance of an inner class/trait has access to the variables of its parent. Inner classes have to be created with a parent instance, which is an instance of the outer type.
In other cases, it's probably just a way of grouping closely related things, as in your object example. Note that the trait LocParam is sealed, which means that all subclasses have to be in the same compilation unit/file.
sblundy has a decent answer. One thing to add is that only with Scala 2.8 do you have package objects which let you group similar things in a package namespace without making a completely separate object. For that reason I will be updating my Lift Modules proposal to use a package object instead of a simple object.