I have a trait that defines a function--I don't want to specify how it will work until later. This trait is mixed in with several case classes, like so:
trait AnItem

trait DataFormatable {
  def render(): String = "" // dummy implementation
}

case class Person(name: String, age: Int) extends DataFormatable with AnItem
case class Building(numFloors: Int) extends DataFormatable with AnItem
Ok, so now I want includable modules that pimp specific implementations of this render behavior. Trying to use value classes here:
object JSON {
  implicit class PersonRender(val p: Person) extends AnyVal {
    def render(): String = {
      // render JSON
      s"""{"name":"${p.name}","age":${p.age}}"""
    }
  }
  // others
}
object XML {
  implicit class PersonRender(val p: Person) extends AnyVal {
    def render(): String = {
      // render XML
      s"<person><name>${p.name}</name><age>${p.age}</age></person>"
    }
  }
  // others
}
The ideal use would look like this (presuming JSON output desired):
import JSON._
val p:AnItem = Person("John",24)
println(p.render())
All cool--but it doesn't work. Is there a way I can make this loadable-implementation thing work? Am I close?
The DataFormatable trait is doing nothing here but holding you back. You should just get rid of it. Since you want to swap out render implementations based on the implicits in scope, Person can't have its own render method. The compiler will only look for an implicit conversion to PersonRender if Person doesn't already have a method named render. But because Person inherits (or is forced to implement) render from DataFormatable, the compiler never needs to look for the implicit conversion.
Based on your edit, if you have a List[AnItem], it is also not possible to implicitly convert the elements to gain render. While each of the sub-classes may have an implicit conversion that gives it render, the compiler doesn't know that once they are all piled into a list of a more abstract type, especially an empty trait such as AnItem.
How can you make this work? You have two simple options.
One, if you want to stick with the implicit conversions, you need to remove DataFormatable as the super-type of your case classes, so that they do not have their own render method. Then you can swap XML._ and JSON._ in and out, and the conversions will work. However, mixed collections won't be possible.
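For instance, a quick REPL-style sketch of option one, reusing the JSON.PersonRender implicit class from the question, with Person no longer extending DataFormatable:
trait AnItem
case class Person(name: String, age: Int) extends AnItem // no render method of its own

import JSON._                // swap in XML._ instead to change the output format
val p = Person("John", 24)   // note: typed as Person, not AnItem
println(p.render())          // the implicit conversion to PersonRender supplies render()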
Two, drop the implicits altogether and have your trait look like this:
trait DataFormatable {
def toXML: String
def toJSON: String
}
This way, you force every class that mixes in DataFormatable to carry its serialization logic (which is the way it should be, rather than hiding it in implicits). Now, when you have a List[DataFormatable], you can prove that every element can be converted to both JSON and XML, so you can convert a mixed list. I think this is much better overall, as the code is more straightforward. The imports you happen to have in scope shouldn't really define the behavior of what follows. Imagine the confusion that can arise because XML._ has been imported at the top of the file instead of JSON._.
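For instance, here is a sketch of how the two case classes might implement the trait; the exact output formats below are just made-up examples:
case class Person(name: String, age: Int) extends DataFormatable {
  def toJSON = s"""{"name":"$name","age":$age}"""
  def toXML  = s"<person><name>$name</name><age>$age</age></person>"
}

case class Building(numFloors: Int) extends DataFormatable {
  def toJSON = s"""{"numFloors":$numFloors}"""
  def toXML  = s"<building><numFloors>$numFloors</numFloors></building>"
}

val items: List[DataFormatable] = List(Person("John", 24), Building(3))
items.map(_.toJSON) // every element knows how to render itself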
Related
I'm writing a set of implicit Scala wrapper classes for an existing Java library (so that I can decorate that library to make it more convenient for Scala developers).
As a trivial example, let's say that the Java library (which I can't modify) has a class such as the following:
public class Value<T> {
    // Etc.
    public void setValue(T newValue) {...}
    public T getValue() {...}
}
Now let's say I want to decorate this class with Scala-style getters and setters. I can do this with the following implicit class:
final implicit class RichValue[T](private val v: Value[T])
    extends AnyVal {
  // Etc.
  def value: T = v.getValue
  def value_=(newValue: T): Unit = v.setValue(newValue)
}
The implicit keyword tells the Scala compiler that it can convert instances of Value to be instances of RichValue implicitly (provided that the latter is in scope). So now I can apply methods defined within RichValue to instances of Value. For example:
def increment(v: Value[Int]): Unit = {
  v.value = v.value + 1
}
(Agreed, this isn't very nice code, and is not exactly functional. I'm just trying to demonstrate a simple use case.)
Unfortunately, Scala does not allow implicit classes to be top-level, so they must be defined within a package object, object, class or trait and not just in a package. (I have no idea why this restriction is necessary, but I assume it's for compatibility with implicit conversion functions.)
However, I'm also extending RichValue from AnyVal to make this a value class. If you're not familiar with them, they allow the Scala compiler to make allocation optimizations. Specifically, the compiler does not always need to create instances of RichValue, and can operate directly on the value class's constructor argument.
In other words, there's very little performance overhead from using a Scala implicit value class as a wrapper, which is nice. :-)
However, a major restriction of value classes is that they cannot be defined within a class or a trait; they can only be members of packages, package objects or objects. (This is so that they do not need to maintain a pointer to the outer class instance.)
An implicit value class must honor both sets of constraints, so it can only be defined within a package object or an object.
And therein lies the problem. The library I'm wrapping contains a deep hierarchy of packages with a huge number of classes and interfaces. Ideally, I want to be able to import my wrapper classes with a single import statement, such as:
import mylib.implicits._
to make using them as simple as possible.
The only way I can currently see of achieving this is to put all of my implicit value class definitions inside a single package object (or object) within a single source file:
package mylib

package object implicits {
  implicit final class RichValue[T](private val v: Value[T])
      extends AnyVal {
    // ...
  }
  // Etc. with hundreds of other such classes.
}
However, that's far from ideal, and I would prefer to mirror the package structure of the target library, yet still bring everything into scope via a single import statement.
Is there a straightforward way of achieving this that doesn't sacrifice any of the benefits of this approach?
(For example, I know that if I forego making these wrappers value classes, then I can define them within a number of different traits - one for each component package - and have my root package object extend all of them, bringing everything into scope through a single import, but I don't want to sacrifice performance for convenience.)
implicit final class RichValue[T](private val v: Value[T]) extends AnyVal
is essentially syntactic sugar for the following two definitions:
import scala.language.implicitConversions // or use a compiler flag
final class RichValue[T](private val v: Value[T]) extends AnyVal
@inline implicit def RichValue[T](v: Value[T]): RichValue[T] = new RichValue(v)
(which, as you might see, is why implicit classes have to be inside traits, objects or classes: they also have a matching def)
There is nothing that requires those two definitions to live together. You can put them into separate objects:
object wrappedLibValues {
  final class RichValue[T](private val v: Value[T]) extends AnyVal {
    // lots of implementation code here
  }
}

object implicits {
  @inline implicit def RichValue[T](v: Value[T]): wrappedLibValues.RichValue[T] =
    new wrappedLibValues.RichValue(v)
}
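Usage of the split version is unchanged; a quick sketch, assuming RichValue defines the value/value_= accessors shown in the question:
import implicits._

def increment(v: Value[Int]): Unit = {
  v.value = v.value + 1 // resolved through the implicit def, then inlined away
}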
Or into traits:
object wrappedLibValues {
  final class RichValue[T](private val v: Value[T]) extends AnyVal {
    // implementation here
  }

  trait Conversions {
    @inline implicit def RichValue[T](v: Value[T]): RichValue[T] = new RichValue(v)
  }
}

object implicits extends wrappedLibValues.Conversions
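To mirror the library's package structure, the same idea scales by giving each sub-package its own Conversions trait and letting the root package object mix them all in. The package names below (values, streams) are hypothetical:
package mylib

package values {
  object Wrappers {
    // the value classes for this part of the library, e.g. RichValue as above
  }
  trait Conversions {
    // the matching @inline implicit defs, as above
  }
}

// a single `import mylib.implicits._` now brings every conversion into scope
package object implicits extends values.Conversions // with streams.Conversions, etc.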
Trying to get the hang of Scala classes and traits, here's a simple example. I want to define a class which specifies a variety of operations that could be implemented in lots of ways. I might start with:
sealed trait Operations {
  def add
  def multiply
}
So for example, I might instantiate this class with an object that does add and multiply very sensibly:
case object CorrectOperations extends Operations {
  def add(a: Double, b: Double) = a + b
  def multiply(a: Double, b: Double) = a * b
}
And also, there could be other ways of defining those operations, such as this obviously wrong way,
case object SillyOperations extends Operations {
  def add(a: Double, b: Double) = a + b - 10
  def multiply(a: Double, b: Double) = a / b
}
I would like to pass such an instance into a function that will execute the operations in a particular way.
def doOperations(a: Double, b: Double, op: Operations) = {
  op.multiply(a, b) - op.add(a, b)
}
I would like doOperations to take any object of type Operations so that I can make use of add and multiply, whatever they may be.
What do I need to change about doOperations, and what am I misunderstanding here? Thanks
I haven't run your code, but I suppose you got a compilation error.
If you don't spell out the signatures of the methods in the Operations trait, they are interpreted by default as parameterless methods returning Unit.
This means that the methods in the inheriting objects do not actually override the methods of the trait; they define overloads instead. You can verify this by writing override in front of the method definitions in the objects: the compiler will then complain that they override nothing from the ancestor trait, which acts as a "safety net" against this kind of mistake.
To fix the bug, spell out the method signatures in the trait:
sealed trait Operations {
  def add(a: Double, b: Double): Double
  def multiply(a: Double, b: Double): Double
}
In fact, the return types are not even necessary in the methods of the inheriting objects (but note the added override modifiers):
case object CorrectOperations extends Operations {
  override def add(a: Double, b: Double) = a + b
  override def multiply(a: Double, b: Double) = a * b
}
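With the signatures fixed, doOperations from the question compiles and accepts either implementation; a quick sketch of the results:
doOperations(2.0, 3.0, CorrectOperations) // 2.0 * 3.0 - (2.0 + 3.0) = 1.0
doOperations(2.0, 3.0, SillyOperations)   // 2.0 / 3.0 - (2.0 + 3.0 - 10.0) ≈ 5.67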
In my specific case I have a (growing) library of case classes with a base trait (TKModel)
Then I have an abstract class (TKModelFactory[T <: TKModel]) which is extended by all companion objects.
So my companion objects all inherently know the type ('T') of "answers" they need to provide, as well as the type of objects they "normally" accept for commonly implemented methods. (If I get lazy and cut and paste chunks of code, this saves my bacon a lot when I search and destroy!) I do see warnings on the Internet at large, however, that any form of CompanionObject.method(caseClassInstance: CaseClass) is rife with "code smell". Not sure whether they actually apply to Scala or not.
There does not, however, seem to be any way to declare anything in the base trait (TKModel) that refers, at runtime, to the proper companion object for a particular case class instance. This results in my having to write (and edit) a few methods that I want to be standard in each and every case class.
case class Track(id: Long, name: String, statusID: Long) extends TKModel
object Track extends TKModelFactory[Track]
How would I write something in TKModel such that new Track(1, "x", 1).someMethod() could actually call Track.objectMethod()
Yes, I can write val CO = MyCompanionObject in each case class, along with something like implicit val CO: ??? in the TKModel base trait, and make all the calls hang off that value. Finding any incantation that makes the compiler happy with that, however, seems to be mission impossible. And since I can't declare it, I can't reference it in any placeholder methods in the base trait either.
Is there a more elegant way to simply get a reference to a case classes companion object?
My specific question, as the above has been asked before (but not yet answered, it seems): is there a way to handle the inheritance of both the companion objects and the case classes, and find the reference, such that I can code common method calls in the base trait?
Or is there a completely different and better model?
If you change TKModel a bit, you can do
abstract class TKModel[T <: TKModel[T]] {
  ...
  def companion: TKModelFactory[T]
  def someMethod() = companion.objectMethod() // requires objectMethod() to be declared on TKModelFactory
}
case class Track(id: Long, name: String, statusID: Long) extends TKModel[Track] {
  def companion = Track
}
object Track extends TKModelFactory[Track] {
  def objectMethod() = ...
}
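Usage then looks like this (a sketch, assuming objectMethod returns something printable):
val t = Track(1, "x", 1)
println(t.someMethod()) // delegates to Track.objectMethod() through companion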
This way you do need to implement companion in each class. You can avoid this by implementing companion using reflection, something like (untested)
lazy val companion: TKModelFactory[T] =
  Class.forName(getClass.getName + "$") // a case class's companion lives in a class named "<ClassName>$"
    .getField("MODULE$")                // the static field holding the singleton instance
    .get(null)
    .asInstanceOf[TKModelFactory[T]]
The lazy val is there to avoid repeated reflection calls.
A companion object does not have access to the instance, but there is no reason the case class can't have a method that calls the companion object.
case class Data(value: Int) {
  def add(data: Data) = Data.add(this, data)
}

object Data {
  def add(d1: Data, d2: Data): Data = Data(d1.value + d2.value)
}
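The instance-level method just forwards to the companion, so for example:
Data(1).add(Data(2)) // Data(3)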
It's difficult, but you can create an implicit method in the companion object. Whenever you want to invoke the logic from an instance, the implicit conversion kicks in and instantiates another class, which then invokes whatever logic you need.
I believe it's also possible to do this generically.
You can implement this syntax as an extension method by defining an implicit class in the top-level abstract class that the companion objects extend:
abstract class TKModelFactory[T <: TKModel] {
  def objectMethod(t: T)
  implicit class Syntax(t: T) {
    def someMethod() = objectMethod(t)
  }
}
A call to new Track(1, "x", 1).someMethod() will then be equivalent to Track.objectMethod(new Track(1, "x", 1)).
I'm having trouble finding an elegant way of designing some simple classes to represent HTTP messages in Scala.
Say I have something like this:
abstract class HttpMessage(headers: List[String]) {
  def addHeader(header: String) = ???
}

class HttpRequest(path: String, headers: List[String])
  extends HttpMessage(headers)
new HttpRequest("/", List("foo")).addHeader("bar")
How can I make the addHeader method return a copy of itself with the new header added? (and keep the current value of path as well)
Thanks,
Rob.
It is annoying, but implementing the pattern you want is not trivial.
The first point to notice is that if you want to preserve the subclass type, you need to add an abstract type member. Without it, you cannot give addHeader a return type in HttpMessage that varies with the concrete subclass:
abstract class HttpMessage(headers: List[String]) {
  type X <: HttpMessage
  def addHeader(header: String): X
}
Then you can implement the method in your concrete subclasses where you will have to specify the value of X:
class HttpRequest(path: String, headers: List[String])
  extends HttpMessage(headers) {
  type X = HttpRequest
  def addHeader(header: String): HttpRequest = new HttpRequest(path, headers :+ header)
}
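With that in place, the call from the question keeps its concrete type; a quick sketch:
val req: HttpRequest = new HttpRequest("/", List("foo")).addHeader("bar")
// addHeader now returns HttpRequest, so path is preserved and further HttpRequest-specific calls can chain off the result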
A better, more scalable solution is to use an implicit type class for the purpose:
trait HeaderAdder[T <: HttpMessage] {
  def addHeader(httpMessage: T, header: String): T
}
and now you can define your method on the HttpMessage class like the following:
abstract class HttpMessage(headers: List[String]) {
  type X <: HttpMessage
  def addHeader(header: String)(implicit headerAdder: HeaderAdder[X]): X =
    headerAdder.addHeader(this.asInstanceOf[X], header) // safe as long as each subclass fixes X to its own type
}
This latter approach is based on the type class concept and scales much better than inheritance. The idea is that you are not forced to have a valid HeaderAdder[T] for every T in your hierarchy, and if you try to call the method on a class for which no implicit is available in scope, you get a compile-time error.
This is great, because it saves you from having to implement addHeader = sys.error("This is not supported") for certain classes in the hierarchy when it becomes "dirty", or from refactoring to avoid it becoming "dirty".
The best way to manage the implicits is to put them in a trait like the following:
trait HeaderAdders {
  implicit val httpRequestHeaderAdder: HeaderAdder[HttpRequest] = new HeaderAdder[HttpRequest] { ... }
  implicit val httpWhatHeaderAdder: HeaderAdder[HttpWhat] = new HeaderAdder[HttpWhat] { ... }
}
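For reference, one of the elided bodies might be filled in like this; it is a sketch that assumes HttpRequest exposes path and headers (e.g. by declaring them as vals):
implicit val httpRequestHeaderAdder: HeaderAdder[HttpRequest] =
  new HeaderAdder[HttpRequest] {
    def addHeader(httpMessage: HttpRequest, header: String): HttpRequest =
      new HttpRequest(httpMessage.path, httpMessage.headers :+ header) // assumes path and headers are vals
  }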
You also provide an object, in case the user can't mix the trait in (for example, if a framework inspects your object's properties through reflection, you don't want extra members added to the instance): http://www.artima.com/scalazine/articles/selfless_trait_pattern.html
object HeaderAdders extends HeaderAdders
So for example you can write things such as
// mixing example
class MyTest extends HeaderAdders // who cares about having two extra values in the object

// import example
import HeaderAdders._
class MyDomainClass // implicits are in scope but not mixed into MyDomainClass, so reflection from Hibernate will still work correctly
By the way, this design problem is the same one the Scala collections face, with HttpMessage playing the role of TraversableLike. Have a look at this question: Calling map on a parallel collection via a reference to an ancestor type.
Scala 2.10 introduces value classes, which you specify by making your class extend AnyVal. There are many restrictions on value classes, but one of their huge advantages is that they allow extension methods without the penalty of creating a new class: unless boxing is required e.g. to put the value class in an array, it is simply the old class plus a set of methods that take the class as the first parameter. Thus,
implicit class Foo(val i: Int) extends AnyVal {
  def +*(j: Int) = i + j * j
}
unwraps to something that can be no more expensive than writing i + j*j yourself (once the JVM inlines the method call).
Unfortunately, one of the restrictions in SIP-15 which describes value classes is
The underlying type of C may not be a value class.
If you have a value class that you can get your hands on, say, as a way to provide type-safe units without the overhead of boxing (unless you really need it):
class Meter(val meters: Double) extends AnyVal {
  def centimeters = meters * 100.0               // No longer type-safe
  def +(m: Meter) = new Meter(meters + m.meters) // Only works with Meter!
}
then is there a way to enrich Meter without object-creation overhead? The restriction in SIP-15 prevents the obvious
implicit class RichMeter(m: Meter) extends AnyVal { ... }
approach.
In order to extend value classes, you need to recapture the underlying type. Since value classes are required to have their wrapped type accessible (val i not just i above), you can always do this. You can't use the handy implicit class shortcut, but you can still add the implicit conversion longhand. So, if you want to add a - method to Meter you must do something like
class RichMeter(val meters: Double) extends AnyVal {
  def -(m: Meter) = new Meter(meters - m.meters)
}
implicit def EnrichMeters(m: Meter) = new RichMeter(m.meters)
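Usage is then as you would hope; a quick sketch (the conversion only kicks in because - is not defined on Meter itself):
val difference = new Meter(3.0) - new Meter(1.0) // goes through EnrichMeters, returns Meter(2.0)
println(difference.centimeters)                  // 200.0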
Note also that you are allowed to (freely) rewrap any parameters with the original value class, so if it has functionality that you rely on (e.g. it wraps a Long but performs complicated bit-mixing), you can just rewrap the underlying class in the value class you're trying to extend wherever you need it.
(Note also that you'll get a warning unless you import language.implicitConversions.)
Addendum: in Scala 2.11+, you may make the val private; for cases where this was done, you will not be able to use this trick.