Overriding & inherited secondary constructors in Scala

Suppose I wanted to represent a Book in Scala, and it is generated directly from XML.
I want a wrapping parent class XMLObject to encompass classes that can be directly mapped to and from XML.
Below is a working example of this. What I want to know is why constructors cannot be abstract and cannot take the override keyword, yet you can still redefine a constructor in a subclass with the same signature as its parent's, and have it work the way you would expect.
Is this considered "bad" coding practice, and if so, what would be a better way to get similar functionality?
import java.text.SimpleDateFormat
import java.util.Date
import scala.xml.Node

abstract class XMLObject {
  def toXML: Node
  def this(xml: Node) = this()
}

class Book(
  val author: String = "",
  val title: String = "",
  val genre: String = "",
  val price: Double = 0,
  val publishDate: Date = null,
  val description: String = "",
  val id: Int = 0
) extends XMLObject {
  override def toXML: Node =
    <book id={id.toString}>
    ...
    </book>

  def this(xml: Node) = {
    this(
      author = (xml \ "author").text,
      title = (xml \ "title").text,
      genre = (xml \ "genre").text,
      price = (xml \ "price").text.toDouble,
      publishDate = (new SimpleDateFormat("yyyy-MM-dd")).parse((xml \ "publish_date").text),
      description = (xml \ "description").text
    )
  }
}
Example use:
val book = new Book(someXMLNode)

A constructor can only be called in the form:
new X(...)
That means you know the runtime type of the object you are going to create, so overriding makes no sense here. You can still define constructors in abstract classes, but this is for chaining: the subclass constructor calls the superclass constructor.
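A minimal sketch of that chaining (Base and Derived are hypothetical names):

abstract class Base(val id: Int)

// Derived does not override Base's constructor; its own constructor chains to it
class Derived(id: Int, val name: String) extends Base(id)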
What you seem to be looking for is rather a factory pattern:
Remove the constructor from XMLObject
If you want, add a function to XMLObject's companion object that decides, based on the XML you pass in, which subclass to create.
For example:
object XMLObject {
  def apply(xml: Node): XMLObject = xml match {
    case <book>{_*}</book> => new Book(xml)
    // ...
    case _ => sys.error("malformed element")
  }
}
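Example use, assuming someXMLNode is a parsed scala.xml.Node as in the question:

val obj: XMLObject = XMLObject(someXMLNode) // dispatches to new Book(...) for <book> elements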

I would use type classes for this.
The fact that you want to be able to map a Book (and other things) to and from XML is orthogonal to what those entities are. You don't want to choose a class hierarchy just based on the fact that you want these objects to have some XML mapping functionality in common. A proper superclass for Book might be PublishedEntity or something similar, but not XMLObject.
And what happens if next week you want to add JSON parsing/rendering? You've already used the superclass for XML; what would you do?
A better solution would be to make a trait for the XML interface and mix it in at the root. That way you could mix in as many such things as you want, and still be free to choose a sensible class hierarchy.
But an even better solution would be type classes, because they allow you to add support for someone else's class, that you can't add methods to.
Here is a slide presentation that Erik Osheim prepared on type classes.
Many of the JSON parser/formatter packages around (e.g. Spray's) use type classes. I haven't used XML much in Scala, but I would guess there are type class implementations for XML as well. A quick search turned up this.
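To make the idea concrete, here is a minimal sketch of a type-class-based XML renderer; the XmlWriter name is invented for illustration, and it reuses the Book class from the question above:

import scala.xml.Node

// the type class: evidence that values of type A can be rendered to XML
trait XmlWriter[A] {
  def toXml(a: A): Node
}

object XmlWriter {
  // an instance for Book; instances for other types live alongside it
  implicit val bookWriter: XmlWriter[Book] = new XmlWriter[Book] {
    def toXml(b: Book): Node = <book id={b.id.toString}>{b.title}</book>
  }
}

// works for any A that has an XmlWriter instance, including classes you can't modify
def render[A](a: A)(implicit w: XmlWriter[A]): Node = w.toXml(a)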

Related

Why does this class need to be abstract in Scala?

In Java and C#, I can write this:
class Tree {
  Tree left;
  Tree right;
}
but in Scala:

class Tree {
  val left: Tree
  val right: Tree
}
I need to mark the class abstract, or write:

val left: Tree = new Tree()
I can write this:
trait Tree {
  val left: Tree
  val right: Tree
}
But why, if I use a class, do I "have to" add abstract? I don't think it's a good design.
Thanks!
The reason you can write
class Tree {
  Tree left;
  Tree right;
}
in Java and C# is that the fields are initialized to null by default (or to 0, etc., depending on the type). Scala's designers decided this is a bad idea and require you to initialize them. So the approximate Scala equivalent is
class Tree {
  // note, not val
  var left: Tree = null
  var right: Tree = null
}
which is legal (but probably not something you actually want to use).
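If you do want a tree in idiomatic Scala, a common sketch is to model the missing children with Option instead of null (hypothetical example, not from the question):

// None replaces Java's null for a missing child
case class Tree(left: Option[Tree], right: Option[Tree])

val leaf = Tree(None, None)
val root = Tree(Some(leaf), None)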
But why, if I use a class, do I "have to" add abstract?
You have to mark your class with the abstract keyword because your class is abstract. It cannot possibly be instantiated.
I don't think it's a good design
Good design is subjective. The designers of Scala thought that being explicit in this case, rather than making the class automatically abstract was good design. I would guess that the majority of the Scala community agrees with them.
You disagree, and that is perfectly okay.
This happens because Scala treats val and def as similar, method-like member declarations. Let me elaborate. For instance, consider the following Scala example:
class Example {
  val a: String = "example"
}

val example = new Example()
println(example.a)
You can see that a is declared as a field. Unlike in Java, we can change this declaration to def and everything still compiles:
class Example {
  def a: String = "example"
}

val example = new Example()
println(example.a)
In Java (I'm not sure about C#; I've never worked with it), if you later wanted to access the field through a getter method, you would need to change every call site from field access to a method call.
Now, you can think of val as, roughly, an eagerly cached version of def. Declaring a val without actually assigning a value is therefore treated by the compiler like declaring a method without an implementation, and that is why the compiler says Tree is abstract: left and right have no values, hence they are abstract. To make the class non-abstract, you need to assign values to the fields, or use var if you want a mutable structure, e.g.:
class Example {
  val a: String = "example"
}

val example = new Example()
println(example.a)
Scastie: https://scastie.scala-lang.org/bQIcVNk9SN6qJhbL32SMUQ

Playframework, where to put Json Writes and Reads for reuse?

I have two controllers that write and read the same AccountModel case class. This class is an adapter for my "domain" object Account; it flattens some collections and turns object references (Map[Role, Auth]) into explicit key references (Set[AuthModel(rolekey: String, level: Int)]).
I would like to reuse this AccountModel and its implicit Writes and Reads, but I don't know how to achieve that 'the Scala way'.
I could put my case classes as inner classes of an object Models, together with all the related implicits, but I think that would become unreadable soon.
What do you usually do? Where do you put your reusable JSON classes? Do you have any advice?
Thanks a lot
There are two main approaches.
Approach 1: Put them on a companion object of your serializable object:
// in file AccountModel.scala
class AccountModel(...) {
  ...
}

object AccountModel {
  implicit val format: Format[AccountModel] = {...}
}
This way everywhere you import AccountModel, the formatters will be also available, so everything will work seamlessly.
Approach 2: Prepare a trait with JSON formatters:
// in a separate file AccountModelJSONSupport.scala
import my.cool.package.AccountModel

trait AccountModelJsonSupport {
  implicit val format: Format[AccountModel] = {...}
}
With this approach whenever you need serialization, you have to mix the trait in, like this:
object FirstController extends Controller with AccountModelJsonSupport {
  // Format[AccountModel] is available now:
  def create = Action(parse.json[AccountModel]) { ... }
}
EDIT: I forgot to add a comparison of the two approaches. I usually stick to approach 1, as it is more straightforward. The JSONSupport mixin strategy is however required when you need two different formatters for the same class or when the model class is not your own and you can't modify it. Thanks for pointing it out in the comments.
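For approach 1, a minimal runnable sketch using Play's Json.format macro could look like this (the two fields are invented for illustration):

import play.api.libs.json._

case class AccountModel(rolekey: String, level: Int)

object AccountModel {
  // the macro derives both Reads and Writes from the case class fields
  implicit val format: Format[AccountModel] = Json.format[AccountModel]
}

// anywhere AccountModel is imported, serialization just works:
val json = Json.toJson(AccountModel("admin", 3))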

Using value classes in scala to implement trait methods?

I have a trait that defines a function--I don't want to specify how it will work until later. This trait is mixed in with several case classes, like so:
trait AnItem

trait DataFormatable {
  def render(): String = "" // dummy implementation
}

case class Person(name: String, age: Int) extends DataFormatable with AnItem
case class Building(numFloors: Int) extends DataFormatable with AnItem
Ok, so now I want includable modules that pimp specific implementations of this render behavior. Trying to use value classes here:
object JSON {
  implicit class PersonRender(val p: Person) extends AnyVal {
    def render(): String = {
      // render json
    }
  }
  // others
}

object XML {
  implicit class PersonRender(val p: Person) extends AnyVal {
    def render(): String = {
      // render xml
    }
  }
  // others
}
The ideal use would look like this (presuming JSON output desired):
import JSON._

val p: AnItem = Person("John", 24)
println(p.render())
All cool--but it doesn't work. Is there a way I can make this loadable-implementation thing work? Am I close?
The DataFormatable trait is doing nothing here but holding you back. You should just get rid of it. Since you want to swap out render implementations based on the implicits in scope, Person can't have its own render method. The compiler will only look for an implicit conversion to PersonRender if Person doesn't already have a method named render, but because Person inherits (or is forced to implement) render from DataFormatable, there is never a reason to look for the implicit conversion.
Based on your edit: if you have a List[AnItem], it is also not possible to implicitly convert the elements to gain render. While each of the subclasses may have an implicit conversion that gives it render, the compiler doesn't know that once they are all piled into a list of a more abstract type, particularly an empty trait such as AnItem.
How can you make this work? You have two simple options.
One, if you want to stick with the implicit conversions, you need to remove DataFormatable as the super-type of your case classes, so that they do not have their own render method. Then you can swap out XML._ and JSON._, and the conversions should work. However, you won't be allowed mixed collections.
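A sketch of option one (the JSON body is invented for illustration):

case class Person(name: String, age: Int) // no DataFormatable, so no built-in render

object JSON {
  implicit class PersonRender(val p: Person) extends AnyVal {
    def render(): String = s"""{"name":"${p.name}","age":${p.age}}"""
  }
}

import JSON._
println(Person("John", 24).render()) // the implicit conversion now kicks in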
Two, drop the implicits altogether and have your trait look like this:
trait DataFormatable {
  def toXML: String
  def toJSON: String
}
This way, you force every class that mixes in DataFormatable to contain its serialization logic (which is the way it should be, rather than hiding it in implicits). Now, when you have a List[DataFormatable], you know every element can be converted to both JSON and XML, so you can convert a mixed list. I think this would be much better overall, as the code is more straightforward. The imports in scope shouldn't really define the behavior of what follows. Imagine the confusion that can arise because XML._ has been imported at the top of the file instead of JSON._.
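A sketch of how a mixed collection works under option two (the bodies are invented for illustration):

case class Person(name: String, age: Int) extends DataFormatable with AnItem {
  def toXML: String = s"<person>$name</person>"
  def toJSON: String = s"""{"name":"$name","age":$age}"""
}

case class Building(numFloors: Int) extends DataFormatable with AnItem {
  def toXML: String = s"<building>$numFloors</building>"
  def toJSON: String = s"""{"numFloors":$numFloors}"""
}

val items: List[DataFormatable] = List(Person("John", 24), Building(3))
items.map(_.toJSON) // every element provably has toJSON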

How does the object in class pattern work, as used in the Lift Framework?

I'm new to Scala and can't get my head around how the Lift guys implemented the Record API. The question is less about this API and more about Scala in general: I'm interested in how the object-in-class pattern, as used in Lift, works.
class MainDoc private() extends MongoRecord[MainDoc] with ObjectIdPk[MainDoc] {
  def meta = MainDoc

  object name extends StringField(this, 12)
  object cnt extends IntField(this)
}

object MainDoc extends MainDoc with MongoMetaRecord[MainDoc]
In the snippet above you can see how a record is defined in Lift. The interesting part is that the fields are defined as objects. The API allows you to create instances like this:
val md1 = MainDoc.createRecord
  .name("md1")
  .cnt(5)
  .save
This is probably done by using the apply method? But at the same time you are able to get the values by doing something like this:
val name = md1.name
How does this all work? Are objects not static when they are in the scope of a class? Or are they just constructor classes for some internal representation? And how is it possible to iterate over all fields: do you use reflection?
Thanks,
Otto
Otto,
You are more or less on the right track. You actually don't need to define your fields as objects; you could have written your example as
class MainDoc private() extends MongoRecord[MainDoc] with ObjectIdPk[MainDoc] {
  def meta = MainDoc

  val name = new StringField(this, 12)
  val cnt = new IntField(this)
}

object MainDoc extends MainDoc with MongoMetaRecord[MainDoc]
The net.liftweb.record.Field trait does contain an apply method that is the equivalent to set. That's why you can assign the fields by name after instantiating the object.
The field reference you mentioned:
val name = md1.name
Would type name as a StringField. If what you were thinking was
val name: String = md1.name
that would fail to compile (unless there was an implicit in scope to convert Field[T] => T). The proper way to retrieve the String value of the field would be
val name = md1.name.get
Record does use reflection to gather the fields. When you define an object within a class, the compiler will create a field to hold the object instance. From the standpoint of reflection, the object appears very similar to the alternate way to define a field that I mentioned before. Each of the definitions probably creates a subclass of the field type, but that's no different than
val name = new StringField(this, 12) {
  override def label: NodeSeq = <span>My String Field</span>
}
You're right about it being the apply method. Record's Field base class defines a few apply methods.
def apply(in: Box[MyType]): OwnerType
def apply(in: MyType): OwnerType
By returning the OwnerType, you can chain invocations together.
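A stripped-down sketch of the idea (this illustrates the pattern only; it is not Lift's actual Field implementation):

class Field[MyType, OwnerType](owner: OwnerType) {
  private var data: Option[MyType] = None
  def apply(in: MyType): OwnerType = { data = Some(in); owner } // set, then return the owner
  def get: MyType = data.get
}

class Rec {
  val name = new Field[String, Rec](this)
  val cnt = new Field[Int, Rec](this)
}

val r = (new Rec).name("md1").cnt(5) // chaining works because apply returns the owner
println(r.name.get) // md1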
Regarding the use of object to define fields: that confused me at first, too. An object definition creates an object within a particular scope. Even though it's convenient to think of object as a shortcut for the singleton pattern, it's more flexible than that. According to the Scala Language Specification (section 5.4):
It is roughly equivalent to the following definition of a lazy value:
lazy val m = new sc with mt1 with ... with mtn { this: m.type => stats }
<snip/>
The expansion given above is not accurate for top-level objects. It cannot be because variable and method definition cannot appear on the top-level outside of a package object (§9.3). Instead, top-level objects are translated to static fields.
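So inside a class, each instance gets its own copy of the object, created lazily on first access. A quick sketch (hypothetical names):

class Outer {
  object inner {
    val x = 42
  }
}

val a = new Outer
val b = new Outer
println(a.inner eq b.inner) // false: each Outer instance has its own inner object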
Regarding iterating over all the fields, Record objects define an allFields method which returns a List[net.liftweb.record.Field[_, MyType]].

New constructors for classes with Scala rich wrapping (implicit)

In Scala, you can "add new methods" to existing classes by creating a wrapper class and using an implicit def to convert from the original class to the rich wrapper class.
I have a Java library for graphics that uses plenty of constructors with looong lists of floats. I would love to add new constructors to these classes with rich wrapping, but this doesn't seem to work for constructors the way it does for methods. In other words, I would like simpler constructors while keeping the original class names, not some wrapper class names, but currently I see no other option.
Ideas?
Sounds like you want to use Scala's apply(...) factory methods, which you build into the companion object for your class.
For example, if you have:
class Foo(val bar: Int, val baz: Int) {
  // ... class definition ...
}
You could add (in the same file):
object Foo {
  def apply(bar: Int) = new Foo(bar, 0)
}
With this in hand, creating a new Foo instance, just providing the bar parameter, is as easy as
val myBar = 42
val myFoo = Foo(myBar) // Note the lack of the 'new' keyword.
This will result in myFoo being assigned an instance of Foo where bar = 42, and baz = 0.
Yes, you need a combination of the "Pimp my Library" approach and an apply factory method.
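Since you can't add a companion object to a Java class you don't own, a common sketch combines a standalone factory object with an enrichment class; JavaShape and its constructor are hypothetical stand-ins for the library's classes:

// hypothetical stand-in for the Java library class with a long float constructor
class JavaShape(val x: Float, val y: Float, val w: Float, val h: Float)

// factory providing simpler "constructors" under a short name
object Shape {
  def apply(x: Float, y: Float): JavaShape = new JavaShape(x, y, 1f, 1f)
}

// "Pimp my Library": add convenience methods to the existing class
object ShapeSyntax {
  implicit class RichShape(val s: JavaShape) extends AnyVal {
    def area: Float = s.w * s.h
  }
}

import ShapeSyntax._
val shape = Shape(2f, 3f)
println(shape.area) // 1.0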