I have a project written in MVC style. Views look like this:
trait BaseView {
def asComponent(): Component // each view can be displayed on screen
}
class ConcreteView extends Panel with BaseView {
def asComponent(): Component = this // ConcreteView is itself a Component because it extends Panel
}
Is it possible to change this code to use an implicit conversion from ConcreteView to Component? Then I could use a ConcreteView as a Component (thanks to the implicit conversion) without calling ConcreteView#asComponent.
Yes, it is possible. Just define an implicit conversion from BaseView to Component that calls the asComponent method.
object BaseView {
implicit def viewIsComponent(x:BaseView) : Component = x.asComponent
}
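With that conversion in place, any BaseView can be used where a Component is expected; because the conversion lives in BaseView's companion object, it is found via the implicit scope without an import. A minimal sketch (addToScreen is a hypothetical method that consumes a Component):
def addToScreen(c: Component): Unit = ??? // hypothetical consumer of a Component

val view = new ConcreteView
addToScreen(view) // compiles: the compiler inserts viewIsComponent(view) for you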
But that does not mean that it is a good idea. An implicit conversion in Scala is a very powerful feature. If a BaseView (and by inheritance every XXXView) is implicitly a Component, then all methods of Component become available whenever you call a method on a val myView: SomeView. That clutters the namespace, and it can also be dangerous because you can no longer be sure whether you are calling a method of your View or of the Component it is implicitly converted to.
In the Scala library there has been a move away from implicit conversions towards a more explicit and slightly more verbose style. Take for example JavaConversions: it provides implicit conversions from Scala collections to Java collections and back. This sounds like a good idea, but it has caused a lot of trouble in practice:
conversions happening when you don't expect them to
the namespace of the scala collections cluttered with a lot of additional methods from the java equivalent
hard-to-find issues when new methods added to the target of the implicit conversion collide with methods in the source of the conversion
The currently recommended way to deal with Java/Scala collection interop is to use the more explicit JavaConverters, which adds a single asScala method to Java collections and an asJava method to Scala collections.
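For illustration, a minimal example of the explicit style (the import path shown is the pre-2.13 one; Scala 2.13 moved the converters to scala.jdk.CollectionConverters):
import scala.collection.JavaConverters._

val javaList = new java.util.ArrayList[Int]()
javaList.add(1)
val buffer = javaList.asScala // explicit: a scala.collection.mutable.Buffer[Int]
val back   = buffer.asJava    // explicit: back to a java.util.List[Int]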
So just leave the method as is. Maybe change the name to just .component, since you don't really convert the view to the component, but only allow somebody to access the component that each view must have.
I created a class that extends scala.Immutable:
class SomeThing(var string: String) extends Immutable {
override def toString: String = string
}
As I expected, the Scala compiler should help me prevent changing the state of class SomeThing. But when I run this test
"Test change state of immutable interface" should "not allow" in {
val someThing = new SomeThing("hello")
someThing.string = "hello 1"
println(someThing)
}
The result is hello 1, and the Scala compiler doesn't throw any warning or error.
Why does Scala provide the Immutable trait if it doesn't help us prevent objects from being mutated?
There are several aspects to this question.
1. A simple one is that the Scala compiler can't really ensure immutability, for various reasons. For example, the main target platform, the JVM, allows modifying even final fields using reflection. Another reason this is not enforceable is code like this:
/////////////////////////////////////////
//// library v1
package library
class LibraryData(val value:Int)
/////////////////////////////////////////
//// code that uses the library
package app
class UserData(val data:LibraryData) extends Immutable
/////////////////////////////////////////
//// library v2
package library
class LibraryData(var value:Int) //now change it to var!
Since the "library" is compiled independently of the "app" and doesn't even know about existence of the "app" there is no point in time where compiler can catch the broken contract.
2. A more fundamental misunderstanding you seem to have is about what a trait does. In this context a trait (or "interface" in some other languages) represents a contract between the implementation and the user code about how the implementation can and should behave. However, not every kind of contract can be represented as a trait (at least without making the code super complicated). For example, for a mutable collection there is a contract that size should return the number of times add (or +=) has been called, but there is no way to represent such a contract as a trait besides declaring that there are methods size and += with corresponding signatures. On the other hand, for most contracts there is no way to force an implementation to follow them. For example, an implementation of size that always returns 0 technically matches the signature but clearly breaks the contract (see the sketch below).
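To make that concrete, here is a small made-up sketch: the broken implementation type-checks, yet it obviously violates the informal contract.
trait Sized[A] {
  def +=(a: A): this.type // informal contract: size counts the += calls
  def size: Int
}

class BrokenSized[A] extends Sized[A] {
  def +=(a: A): this.type = this // silently drops the element
  def size: Int = 0              // always 0, yet the compiler is satisfied
}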
Similarly, the Immutable documentation says:
A marker trait for all immutable data structures such as immutable collections.
So it is just a marker trait, which is one of the ways to work around contracts that can't really be represented as types. And it says that whoever implements that trait claims to be an immutable object. Your code makes that claim but clearly breaks the contract, so technically it is your fault for not respecting the contract.
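If you actually want the compiler to reject mutation, declare the field as a val instead of a var; then the reassignment in your test no longer compiles:
class SomeThing(val string: String) extends Immutable {
  override def toString: String = string
}

val someThing = new SomeThing("hello")
// someThing.string = "hello 1" // error: reassignment to val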
I am wondering if there is a way to get the quick documentation in IntelliJ to work for the class construction pattern many scala developers use below.
SomeClass(param1, param2)
instead of
new SomeClass(param1, param2)
The direct constructor call made with new obviously works, but many Scala devs use apply to construct objects. When that pattern is used, the IntelliJ documentation lookup fails to find any information on the class.
I don't know whether there is documentation for this in IntelliJ per se. However, the pattern is fairly easy to explain.
There's a pattern in Java code for having static factory methods (this is a specialization of the Gang of Four Factory Method Pattern), often along the lines of (translated to Scala-ish):
object Foo {
def barInstance(args...): Bar = ???
}
The main benefit of doing this is that the factory controls object instantiation, in particular:
the particular runtime class to instantiate, possibly based on the arguments to the factory. For example, the generic immutable collections in Scala have factory methods which may create optimized small collections when given a sufficiently small number of elements. A sequence of length 1, for instance, can be implemented with basically no overhead: a single field referring to the element and a lookup that returns that sole field if the offset is 0 and throws otherwise.
whether an instance is created at all. One can cache arguments to the factory and memoize or "hashcons" the created objects, or precreate the most common instances and hand them out repeatedly (see the sketch below).
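As a sketch of that caching point (Bar and instance are illustrative names, not from the original answer):
class Bar private (val n: Int) // the constructor is private, only the factory can instantiate

object Bar {
  private val cache = scala.collection.mutable.Map.empty[Int, Bar]
  // Hand out a shared instance for equal arguments instead of always allocating.
  def instance(n: Int): Bar = cache.getOrElseUpdate(n, new Bar(n))
}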
A further benefit is that the factory is a function, while new is an operator, which allows the factory to be passed around:
class Foo(x: Int)
object Foo {
def instance(x: Int) = new Foo(x)
}
Seq(1, 2, 3).map(Foo.instance) // the factory itself is passed to map; conceptually Seq(Foo(1), Foo(2), Foo(3))
In Scala, this is combined with two language features: any object that defines an apply method can be used syntactically as a function (even if it doesn't extend Function, which would additionally allow the object to be passed around as if it were a function), and every class can have a "companion object" (which holds the things that in Java would be static members of the class). Together this gives something like:
class Foo(constructor_args...)
object Foo {
def apply(args...): Foo = ???
}
Which can be used like:
Foo(...)
For a case class, the Scala compiler automatically generates a companion object with certain behaviors, one of which is an apply with the same arguments as the constructor (other behaviors include contract-obeying hashCode and equals as well as an unapply method to allow for pattern matching).
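For instance, with a plain case class all of that comes for free:
case class Point(x: Int, y: Int)

val p = Point(1, 2)            // companion apply: no `new` needed
Point(1, 2) == Point(1, 2)     // true, thanks to the generated equals/hashCode
p match {
  case Point(a, b) => a + b    // the generated unapply enables pattern matching
}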
I am currently trying to implement my own UnsignedInt. I would like to implement this correctly so that it fits into the whole Scala type & class system. However, I am really confused by all the classes that fit into Number.
With which class(es) do I need to work: Numeric, Integral or ScalaNumber? Or something completely different? What classes and/or traits should my own class implement?
The short answer is: don't implement your own, use the Spire one :) Otherwise, you should implement Integral (which includes Numeric). Note that your type shouldn't extend it; you need implicit values in the companion object, i.e.
class UnsignedInt { ... }
object UnsignedInt {
implicit val UIntIntegral: Integral[UnsignedInt] = ...
}
You should also consider making your class a value class.
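A rough sketch of how the two pieces fit together, assuming the unsigned value is stored in the 32 bits of an Int and following the skeleton above (illustrative, not a finished implementation):
class UnsignedInt(val bits: Int) extends AnyVal { // value class: avoids boxing in most call sites
  def toLong: Long = bits & 0xFFFFFFFFL           // the unsigned value, widened to a Long
  override def toString: String = toLong.toString
}

object UnsignedInt {
  implicit val UIntIntegral: Integral[UnsignedInt] = new Integral[UnsignedInt] {
    private def u(l: Long) = new UnsignedInt(l.toInt) // wrap around, like unsigned arithmetic
    def plus(x: UnsignedInt, y: UnsignedInt)  = u(x.toLong + y.toLong)
    def minus(x: UnsignedInt, y: UnsignedInt) = u(x.toLong - y.toLong)
    def times(x: UnsignedInt, y: UnsignedInt) = u(x.toLong * y.toLong)
    def quot(x: UnsignedInt, y: UnsignedInt)  = u(x.toLong / y.toLong)
    def rem(x: UnsignedInt, y: UnsignedInt)   = u(x.toLong % y.toLong)
    def negate(x: UnsignedInt)                = u(-x.toLong)
    def fromInt(x: Int)                       = new UnsignedInt(x)
    def toInt(x: UnsignedInt): Int       = x.bits
    def toLong(x: UnsignedInt): Long     = x.toLong
    def toFloat(x: UnsignedInt): Float   = x.toLong.toFloat
    def toDouble(x: UnsignedInt): Double = x.toLong.toDouble
    def compare(x: UnsignedInt, y: UnsignedInt) = x.toLong.compare(y.toLong)
    // Required on Scala 2.13+ (Numeric gained parseString); a harmless extra method on older versions.
    def parseString(str: String): Option[UnsignedInt] =
      scala.util.Try(java.lang.Integer.parseUnsignedInt(str)).toOption.map(new UnsignedInt(_))
  }
}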
I'm building a JSON RPC in Play 2.1. In order to call the proper methods the RPC dispatcher is using reflection to create and call a class method instance by name.
Right now a RPC method looks like this:
def create(obj: JsValue) = {
val menu: Menu = Json.fromJson[Menu](obj).get
collection.insert(menu).map( r => toDirectResult(r))
}
def createCustom(obj: JsValue) = {
val menu: Menu = Json.fromJson(obj)(Menu.customFormat).get
collection.insert(menu).map( r => toDirectResult(r))
}
What I would like to do is to be able to define the RPC methods like this:
def create(menu: Menu) = {
collection.insert(menu).map( r => toDirectResult(r))
}
The problem is that the RPC dispatcher only knows at runtime that it has to call the method named "create" on the class named "Menus", and it has the value of the argument to pass to the method as a JsValue. Through reflection I can find out the number of arguments and their types for the RPC method. When the argument type is a case class, how do I transform the JsValue into a case class instance using the implicit Format (or Reads) defined in the companion object of the case class?
For the createCustom method I realize that there is no "magic" solution, but since I started learning Scala I have discovered that few things are truly impossible with this programming language. Would it be possible to use an annotation or something similar to specify a Format that is not implicit?
You need to implement a PathBindable... this should help out: http://www.richard-foy.fr/blog/2012/04/09/how-to-implement-a-custom-pathbindable-with-play-2/
After further careful consideration I've decided that reflection is really not the right solution for my problem. It lacks type safety and proper error reporting at compile time, it is harder to debug, and it has an impact on performance as well. And I actually have all the information I need to generate the code at build time.
Unfortunately I cannot use the Play router, because for the JSON RPC dispatcher the routing depends on the request body, which is not available during Play's routing. But in essence the RPC dispatcher is doing the same thing as the Play router. So for the moment I'm just going to code my RPC routes by hand, which solves the problem in the question. In the future I'm planning to write an SBT plugin that will automatically generate the dispatcher code, similar to the Play router.
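For reference, the hand-coded dispatch amounts to something like this hypothetical sketch, using the create(menu: Menu) signature from the question; each case picks the explicit Reads/Format instead of discovering it through reflection:
// assumes play.api.libs.json._ and the Menu model from the question are in scope
def dispatch(method: String, obj: JsValue) = method match {
  case "create"       => create(Json.fromJson[Menu](obj).get)
  case "createCustom" => create(Json.fromJson(obj)(Menu.customFormat).get)
  case other          => sys.error(s"unknown RPC method: $other")
}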
I get the coding, in that you basically provide an "object SomeClass" and a "class SomeClass": the companion class is the class declaration and the object is a singleton, of which you cannot create further instances. So... my question is mostly about the purpose of a singleton object in this particular instance.
Is this basically just a way to provide class methods in Scala? Like + based methods in Objective-C?
I'm reading the Programming in Scala book and Chapter 4 just talked about singleton objects, but it doesn't get into a lot of detail on why this matters.
I realize I may be getting ahead of myself here and that it might be explained in greater detail later. If so, let me know. This book is reasonably good so far, but it has a lot of "in Java, you do this", and I have so little Java experience that I fear I miss a bit of the point.
I don't recall reading anywhere on the Programming in Scala website that Java was a prerequisite for reading this book...
Yes, companion singletons provide an equivalent to Java's (and C++'s, C#'s, etc.) static methods.
(indeed, companion object methods are exposed via "static forwarders" for the sake of Java interop)
However, singletons go a fair way beyond this.
A singleton can inherit methods from other classes/traits, which can't be done with statics.
A singleton can be passed as a parameter (perhaps via an inherited interface; see the sketch below)
A singleton can exist within the scope of a surrounding class or method, just as Java can have inner classes
It's also worth noting that a singleton doesn't have to be a companion, it's perfectly valid to define a singleton without also defining a companion class.
Which helps make Scala a far more object-oriented language than Java (static methods don't belong to an object). Ironic, given that it's largely discussed in terms of its functional credentials.
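To illustrate the first two points (hypothetical names):
trait Greeter {
  def greet(name: String): String
}

object EnglishGreeter extends Greeter { // a singleton inheriting from a trait
  def greet(name: String) = s"Hello, $name"
}

def welcome(greeter: Greeter, name: String) = greeter.greet(name)

welcome(EnglishGreeter, "World") // the singleton itself is passed as a parameter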
In many cases we need a singleton to stand for a unique object in our software system.
Think about the solar system. We may have the following classes:
class Planet
object Earth extends Planet
object Sun extends Planet
object is a simple way to create a singleton. Of course, it is also commonly used to hold class-level methods, like static methods in Java.
In addition to the given answers (and going in the same general direction as jilen), objects play an important role in Scala's implicit mechanism, e.g. allowing type-class-like behavior (as known from Haskell):
trait Monoid[T] {
def zero:T
def sum(t1:T, t2:T):T
}
def fold[T](ts:T*)(implicit m:Monoid[T]) = ts.foldLeft(m.zero)(m.sum(_,_))
Now we have a fold function, which "collapses" a number of Ts together, as long as there is an appropriate Monoid for T (something that has a neutral element and can somehow be "added" together). In order to use this, we need only one instance of a Monoid for some type T, which is the perfect job for an object:
implicit object StringMonoid extends Monoid[String] {
def zero = ""
def sum(s1:String, s2:String) = s1 + s2
}
Now this works:
println(fold("a","bc","def")) //--> abcdef
So objects are very useful in their own right.
But wait, there is more! Companion objects can also serve as a kind of "default configuration" when extending their companion class:
trait Config {
def databaseName:String
def userName:String
def password:String
}
object Config extends Config {
def databaseName = "testDB"
def userName = "scott"
def password = "tiger"
}
So on the one hand you have the trait Config, which can be implemented by the user however she wants, and on the other hand there is a ready-made object Config for when you want to go with the default settings.
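A hypothetical use site makes the pattern clearer:
// Take any Config, falling back to the companion object's defaults.
def connect(config: Config = Config): Unit =
  println(s"connecting to ${config.databaseName} as ${config.userName}")

connect()            // uses the ready-made defaults
connect(new Config { // or any custom implementation
  def databaseName = "prodDB"
  def userName     = "admin"
  def password     = "secret"
})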
Yes, it is basically a way of providing class methods when used as a companion object.