hashCode in case classes in Scala

I've read that Scala's case class construct automatically generates fitting equals and hashCode implementations. What exactly does the generated code look like?

As my professor used to say, only the code tells the truth! So just take a look at the code that is generated for:
case class A(i: Int, s: String)
We can instruct the Scala compiler to show us the generated code after the different phases, here after the typechecker:
% scalac -Xprint:typer test.scala
[[syntax trees at end of typer]]// Scala source: test.scala
package <empty> {
@serializable case class A extends java.lang.Object with ScalaObject with Product {
..
override def hashCode(): Int = ScalaRunTime.this._hashCode(A.this);
...
override def equals(x$1: Any): Boolean = A.this.eq(x$1).||(x$1 match {
case (i: Int,s: String)A((i$1 @ _), (s$1 @ _)) if i$1.==(i).&&(s$1.==(s)) => x$1.asInstanceOf[A].canEqual(A.this)
case _ => false
});
override def canEqual(x$1: Any): Boolean = x$1.$isInstanceOf[A]()
};
}
So you can see that the calculation of the hash code is delegated to ScalaRunTime._hashCode and the equality depends on the equality of the case class' members.

The generated hashCode just calls scala.runtime.ScalaRunTime._hashCode, which is defined as:
def _hashCode(x: Product): Int = {
val arr = x.productArity
var code = arr
var i = 0
while (i < arr) {
val elem = x.productElement(i)
code = code * 41 + (if (elem == null) 0 else elem.hashCode())
i += 1
}
code
}
So what you get is arity * 41**n + hash(elem1) * 41**(n-1) + hash(elem2) * 41**(n-2) + ... + hash(elemn), where n is the arity of your case class and elemi are its members.
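As a quick sanity check, here is a small sketch (not from the original answer) that reproduces the same polynomial by folding over productIterator, which is exactly what the loop above does:
case class A(i: Int, s: String)

val a = A(1, "x")
// Same computation as ScalaRunTime._hashCode above: start from the arity,
// then fold each member's hash in with a factor of 41.
val hash = a.productIterator.foldLeft(a.productArity) { (code, elem) =>
  code * 41 + (if (elem == null) 0 else elem.hashCode)
}
// On old Scala versions (pre-2.9) hash equals a.hashCode; newer versions use MurmurHash, see below.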

Please be aware that the previous answers to this question are a bit outdated on the hashCode part.
As of Scala 2.9, hashCode for case classes uses MurmurHash.
MurmurHash produces a good avalanche effect and good distribution, and is CPU-friendly.

Looks like things have changed; using Mirko's example case class A(i: Int, s: String) I get:
override <synthetic> def hashCode(): Int = {
<synthetic> var acc: Int = -889275714;
acc = scala.runtime.Statics.mix(acc, i);
acc = scala.runtime.Statics.mix(acc, scala.runtime.Statics.anyHash(s));
scala.runtime.Statics.finalizeHash(acc, 2)
};
and
override <synthetic> def equals(x$1: Any): Boolean = A.this.eq(x$1.asInstanceOf[Object]).||(x$1 match {
case (_: A) => true
case _ => false
}.&&({
<synthetic> val A$1: A = x$1.asInstanceOf[A];
A.this.i.==(A$1.i).&&(A.this.s.==(A$1.s)).&&(A$1.canEqual(A.this))
}))
};
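For intuition, here is a minimal sketch (not part of the answer above) that recomputes the same hash by hand via scala.runtime.Statics; the comparison should hold for compilers that emit exactly the code shown, while newer compilers may also mix in the type name:
import scala.runtime.Statics

case class A(i: Int, s: String)

val a = A(1, "x")
var acc = -889275714                         // MurmurHash3 product seed (0xcafebabe), as in the generated code
acc = Statics.mix(acc, 1)                    // the Int field is mixed in directly
acc = Statics.mix(acc, Statics.anyHash("x")) // other fields go through anyHash
println(Statics.finalizeHash(acc, 2) == a.hashCode)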


Why does Scala PartialFunction work without defining isDefinedAt?

It looks like First and Second are the same, but why?
First
val iter = List(1, 2, 3, 4, 5).iterator
val first = iter.collect(new PartialFunction[Int, Int]{
def apply(i: Int) = i
def isDefinedAt(i: Int) = i > 0 && i < 3
})
first.foreach((println(_)))
Second
val iter2 = List(1, 2, 3, 4, 5).iterator
val second = iter2.collect {
case i:Int if i > 0 && i < 3 => i
}
second.foreach((println(_)))
Is it because the Scala compiler automatically converts
{ case i:Int if i > 0 && i < 3 => i } into the implementation form of First, generating isDefinedAt from the if i > 0 && i < 3 part?
Also, case i:Int if i > 0 && i < 3 => i is case class pattern matching, if I am correct. However, in scala/src/library/scala/PartialFunction.scala, there is no case class definition for PartialFunction.
trait PartialFunction[-A, +B] extends (A => B)
Then why this case class pattern match works?
I suppose the Scala compiler does a lot of implicit work intelligently, but it confuses me when trying to understand what is happening and how to write Scala code.
If there are good references, other than the language or compiler specifications, for understanding
Scala syntax and the Scala way of writing code, please suggest them.
Is it because the Scala compiler automatically converts { case i:Int if i > 0 && i < 3 => i } into the implementation form of First, generating isDefinedAt from the if i > 0 && i < 3 part?
Yes, the exact translation is given in Pattern Matching Anonymous Functions. Here it'll be
new PartialFunction[Int, Int]{
def apply(x: Int) = x match {
case i:Int if i > 0 && i < 3 => i
}
def isDefinedAt(x: Int) = x match {
case i:Int if i > 0 && i < 3 => true
case _ => false
}
}
Note the difference from your first example in apply! In your first example you can still call apply even when isDefinedAt is false (it just returns the value); in the translated version apply throws a MatchError instead.
Also, case i:Int if i > 0 && i < 3 => i is case class pattern matching, if I am correct
If anything, it's the other way around: case classes are called that way because they can be pattern-matched, and pattern matching uses the case keyword in Scala.
Yes, the compiler converts the second version into a PartialFunction[Int,Int] (because that is what collect takes).
There is no case class matching here, and it is not even matching on type because the value must be Int (and therefore the type declaration in the second version is not required).
The style guide gives lots on tips on how Scala is typically written.
For your example
object Main {
def f = (1 to 5).collect { case i if i > 0 && i < 3 => i }
}
The compiler-generated partial function defines applyOrElse because it is more efficient than the naive idiom:
if (pf.isDefinedAt(x)) pf.apply(x) else ???
Showing that implementation, which is similar to what is described in the spec:
$ scalac -Vprint:typer pf.scala
[[syntax trees at end of typer]] // pf.scala
package <empty> {
object Main extends scala.AnyRef {
def <init>(): Main.type = {
Main.super.<init>();
()
};
def f: IndexedSeq[Int] = scala.Predef.intWrapper(1).to(5).collect[Int](({
@SerialVersionUID(value = 0) final <synthetic> class $anonfun extends scala.runtime.AbstractPartialFunction[Int,Int] with java.io.Serializable {
def <init>(): <$anon: Int => Int> = {
$anonfun.super.<init>();
()
};
final override def applyOrElse[A1 <: Int, B1 >: Int](x1: A1, default: A1 => B1): B1 = ((x1.asInstanceOf[Int]: Int): Int @unchecked) match {
case (i @ _) if i.>(0).&&(i.<(3)) => i
case (defaultCase$ @ _) => default.apply(x1)
};
final def isDefinedAt(x1: Int): Boolean = ((x1.asInstanceOf[Int]: Int): Int @unchecked) match {
case (i @ _) if i.>(0).&&(i.<(3)) => true
case (defaultCase$ @ _) => false
}
};
new $anonfun()
}: PartialFunction[Int,Int]))
}
}
where AbstractPartialFunction defines
def apply(x: T1): R = applyOrElse(x, PartialFunction.empty)
Here is an external link to a change to use applyOrElse. The improved PartialFunction dates back to 2012. Probably the feature is under-documented or under-advertised. Some information is available by expanding the Scaladoc for PartialFunction. For some reason, that link shows orElse, so you'd actually have to scroll back for applyOrElse. It seems documentation is hard.
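To make the benefit concrete, here is a small sketch (not from the answer) that calls applyOrElse directly; the guard runs once per element, instead of once in isDefinedAt and again in apply:
val pf: PartialFunction[Int, Int] = { case i if i > 0 && i < 3 => i }

// Naive idiom: the guard is evaluated twice per element.
val viaIsDefined = (1 to 5).collect { case x if pf.isDefinedAt(x) => pf(x) }

// applyOrElse: the guard is evaluated once; -1 is just a sentinel for "undefined" here.
val viaApplyOrElse = (1 to 5).map(x => pf.applyOrElse(x, (_: Int) => -1)).filter(_ != -1)

println(viaIsDefined)   // Vector(1, 2)
println(viaApplyOrElse) // Vector(1, 2)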

Understanding Free monad in scalaz

I'm experimenting with the Free monad in Scalaz and trying to build a simple interpreter to parse and evaluate expressions like:
dec(inc(dec(dec(10))))
where dec means decrement, inc means increment. Here is what I got:
import java.util.regex.Pattern
import scalaz.{Free, ~>}
import scalaz.Id._ // Id type alias and its Monad instance (needed by foldMap below); imports assumed for scalaz 7.x

trait Interpreter[A]
case class V[A](a: A) extends Interpreter[A]
object Inc {
private[this] final val pattern = Pattern.compile("^inc\\((.*)\\)$")
def unapply(arg: String): Option[String] = {
val m = pattern.matcher(arg)
if(m.find()){
Some(m.group(1))
} else None
}
}
object Dec {
private[this] final val pattern = Pattern.compile("^dec\\((.*)\\)$")
def unapply(arg: String): Option[String] = {
val m = pattern.matcher(arg)
if(m.find()){
Some(m.group(1))
} else None
}
}
object Val {
def unapply(arg: String): Option[Int] =
if(arg.matches("^[0-9]+$")) Some(Integer.valueOf(arg))
else None
}
Now this is all I need to build the AST. It currently looks as follows:
def buildAst(expression: String): Free[Interpreter, Int] =
expression match {
case Inc(arg) => inc(buildAst(arg))
case Dec(arg) => dec(buildAst(arg))
case Val(arg) => value(arg)
}
private def inc(i: Free[Interpreter, Int]) = i.map(_ + 1)
private def dec(d: Free[Interpreter, Int]) = d.map(_ - 1)
private def value(v: Int): Free[Interpreter, Int] = Free.liftF(V(v))
Now when testing the application:
object Test extends App{
val expression = "inc(dec(inc(inc(inc(dec(10))))))"
val naturalTransform = new (Interpreter ~> Id) {
override def apply[A](fa: Interpreter[A]): Id[A] = fa match {
case V(a) => a
}
}
println(buildAst(expression).foldMap(naturalTransform)) //prints 12
}
And it works pretty much fine (though I'm not sure whether it is idiomatic scalaz style).
THE PROBLEM is that the extractor objects Inc, Dec and Val feel like boilerplate code. Is there a way to reduce such code duplication?
This will definitely become a problem if the number of functions supported gets larger.
Free monads do create some boilerplate, that is a fact. However, if you are willing to stick to some conventions, you could rewrite the interpreter with Freasy Monad:
@free trait Interpreter {
type InterpreterF[A] = Free[InterpreterADT, A]
sealed trait InterpreterADT[A]
def inc(arg: InterpreterF[Int]): InterpreterF[Int]
def dec(arg: InterpreterF[Int]): InterpreterF[Int]
def value(arg: Int): InterpreterF[Int]
}
and that would generate all of the case classes and the matching on them. The interpreter becomes just a trait to implement.
However, you already have some logic within unapply, so you would have to split the parsing and the executing logic:
import Interpreter.ops._
val incP = """^inc\((.*)\)$""".r
val decP = """^dec\((.*)\)$""".r
val valP = """^(\d+)$""".r // bare integers, matching the question's input format
def buildAst(expression: String): InterpreterF[Int] = expression match {
case incP(arg) => inc(buildAst(arg))
case decP(arg) => dec(buildAst(arg))
case valP(arg) => value(arg.toInt)
}
Then you could implement an actual interpreter:
val impureInterpreter = new Interpreter.Interp[Id] {
def inc(arg: Int): Int = arg+1
def dec(arg: Int): Int = arg-1
def value(arg: Int): Int = arg
}
and run it:
impureInterpreter.run(buildAst(expression))
I admit that this is more pseudocode than a tested working solution, but it should give the general idea. Another library that uses a similar idea is Freestyle, but it uses its own free monad implementation instead of relying on cats/scalaz.
So, I would say it is possible to remove some boilerplate as long as you have no issue with splitting parsing and interpretation. Of course not all of it can be removed: you have to declare the possible operations of your Interpreter algebra, and you have to implement the interpreter yourself.
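If you prefer to stay library-free, the extractor duplication itself can also be factored out with a small helper; a minimal sketch (the PrefixExtractor name is made up here, it is not part of the answer):
import java.util.regex.Pattern

// One place for the prefix-parsing logic.
class PrefixExtractor(prefix: String) {
  private val pattern = Pattern.compile(s"^$prefix\\((.*)\\)$$")
  def unapply(arg: String): Option[String] = {
    val m = pattern.matcher(arg)
    if (m.find()) Some(m.group(1)) else None
  }
}

// The extractor objects from the question collapse to one line each.
object Inc extends PrefixExtractor("inc")
object Dec extends PrefixExtractor("dec")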

Scala - not a case class nor does it have method .unapply

I am quite new to Scala and got a few unresolved problems with the following code:
object exprs{
println("Welcome to the Scala worksheet")
def show(e: Expr): String = e match {
case Number(x) => x.toString
case Sum(l, r) => show(l) + " + " + show(r)
}
show(Sum(Number(1), Number(44)))
}
trait Expr {
def isNumber: Boolean
def isSum: Boolean
def numValue: Int
def leftOp: Expr
def rightOp: Expr
def eval: Int = this match {
case Number(n) => n
case Sum(e1, e2) => e1.eval + e2.eval
}
}
class Number(n: Int) extends Expr {
override def isNumber: Boolean = true
override def isSum: Boolean = false
override def numValue: Int = n
override def leftOp: Expr = throw new Error("Number.leftOp")
override def rightOp: Expr = throw new Error("Number.rightOp")
}
class Sum(e1: Expr, e2: Expr) extends Expr {
override def isNumber: Boolean = false
override def isSum: Boolean = true
override def numValue: Int = e1.eval + e2.eval
override def leftOp: Expr = e1
override def rightOp: Expr = e2
}
I get the following errors:
Error: object Number is not a case class, nor does it have an unapply/unapplySeq member
Error: not found: value Sum
How to resolve them? Thanks in advance
In Scala, case classes are like classes with extra goodies plus some other properties.
For a normal class,
class A(i: Int, s: String)
You cannot create an instance of it like this:
val a = A(5, "five") // this will not work
You will have to use new to create a new instance:
val a = new A(5, "five")
Now let's say we have a case class:
case class B(i: Int, s: String)
We can create a new instance of B like this,
val b = B(5, "five")
The reason this works with a case class is that case classes get an auto-created companion object, which provides several utilities including an apply and an unapply method.
So, this usage val b = B(5, "five") is actually val b = B.apply(5, "five"). And here B is not the class B but the companion object B, which is what actually provides the apply method.
Similarly, Scala pattern matching uses the unapply (unapplySeq for SeqLike patterns) methods provided by the companion object. Hence normal class instances do not work with pattern matching.
Let's say you want to define a class and not a case class for some specific reason but still want to use it with pattern matching etc.; you can provide its companion object with the required methods yourself:
class C(val i: Int, val s: String) {
}
object C {
def apply(i: Int, s: String) = new C(i, s)
def unapply(c: C) = Some((c.i, c.s))
}
// now you can use any of the following to create instances,
val c1 = new C(5, "five")
val c2 = C.apply(5, "five")
val c3 = C(5, "five")
// you can also use pattern matching,
c1 match {
case C(i, s) => println(s"C with i = $i and s = $s")
}
c2 match {
case C(i, s) => println(s"C with i = $i and s = $s")
}
Also, as you are new to learning Scala you should read http://danielwestheide.com/scala/neophytes.html which is probably the best resource for any Scala beginner.
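For completeness, here is a minimal sketch (not part of the original answer) of the same fix applied to the question's Expr hierarchy: making Number and Sum case classes gives you apply and unapply for free.
trait Expr {
  def eval: Int = this match {
    case Number(n) => n
    case Sum(e1, e2) => e1.eval + e2.eval
  }
}
case class Number(n: Int) extends Expr
case class Sum(e1: Expr, e2: Expr) extends Expr

def show(e: Expr): String = e match {
  case Number(x) => x.toString
  case Sum(l, r) => show(l) + " + " + show(r)
}

println(show(Sum(Number(1), Number(44)))) // prints "1 + 44"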

How to return optional information from methods?

The general question is how to return additional information from methods, besides the actual result of the computation. But I want this information to be silently ignorable.
Take for example the method dropWhile on Iterator. The returned result is the mutated iterator. But sometimes I might be interested in the number of elements dropped.
In the case of dropWhile, this information could be generated externally by adding an index to the iterator and calculating the number of dropped steps afterwards. But in general this is not possible.
A simple solution is to return a tuple with the actual result and the optional information. But then I need to handle the tuple whenever I call the method, even if I'm not interested in the optional information.
So the question is, whether there is some clever way of gathering such optional information?
Maybe through Option[X => Unit] parameters with call-back functions that default to None? Is there something more clever?
Just my two cents here…
You could declare this:
case class RichResult[+A, +B](val result: A, val info: B)
with an implicit conversion to A:
implicit def unwrapRichResult[A, B](richResult: RichResult[A, B]): A = richResult.result
Then:
def someMethod: RichResult[Int, String] = /* ... */
val richRes = someMethod
val res: Int = someMethod
It's definitely not more clever, but you could just create a method that drops the additional information.
def removeCharWithCount(str: String, x: Char): (String, Int) =
(str.replace(x.toString, ""), str.count(x ==))
// alias that drops the additional return information
def removeChar(str: String, x: Char): String =
removeCharWithCount(str, x)._1
Here is my take (edited to use a more realistic example):
package info {
trait Info[T] { var data: Option[T] }
object Info {
implicit def makeInfo[T]: Info[T] = new Info[T] {
var data: Option[T] = None
}
}
}
Then suppose your original method (and use case) is implemented like this:
object Test extends App {
def dropCounterIterator[A](iter: Iterator[A]) = new Iterator[A] {
def hasNext = iter.hasNext
def next() = iter.next()
override def dropWhile(p: (A) => Boolean): Iterator[A] = {
var count = 0
var current: Option[A] = None
while (hasNext && p({current = Some(next()); current.get})) { count += 1 }
current match {
case Some(a) => Iterator.single(a) ++ this
case None => Iterator.empty
}
}
}
val i = dropCounterIterator(Iterator.from(1))
val ii = i.dropWhile(_ < 10)
println(ii.next())
}
To provide and get access to the info, the code would be modified only slightly:
import info.Info // line added
object Test extends App {
def dropCounterIterator[A](iter: Iterator[A]) = new Iterator[A] {
def hasNext = iter.hasNext
def next() = iter.next()
// note: an overloaded variant (because of the extra parameter list), not an override
def dropWhile(p: (A) => Boolean)(implicit info: Info[Int]): Iterator[A] = {
var count = 0
var current: Option[A] = None
while (hasNext && p({current = Some(next()); current.get})) { count += 1 }
info.data = Some(count) // line added here
current match {
case Some(a) => Iterator.single(a) ++ this
case None => Iterator.empty
}
}
}
val i = dropCounterIterator(Iterator.from(1))
val info = implicitly[Info[Int]] // line added here
val ii = i.dropWhile((x: Int) => x < 10)(info) // line modified
println(ii.next())
println(info.data.get) // line added here
}
Note that for some reason the type inference is affected and I had to annotate the type of the function passed to dropWhile.
You want dropWhileM with the State monad threading a counter through the computation.
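To illustrate that last suggestion, here is a minimal sketch with a hand-rolled State (no library assumed, and dropWhileM is written out rather than taken from scalaz): the predicate both tests the element and threads a drop counter through the computation.
final case class State[S, A](run: S => (S, A)) {
  def flatMap[B](f: A => State[S, B]): State[S, B] =
    State { s => val (s1, a) = run(s); f(a).run(s1) }
  def map[B](f: A => B): State[S, B] = flatMap(a => State(s => (s, f(a))))
}

// Drop elements while the monadic predicate holds.
def dropWhileM[S, A](xs: List[A])(p: A => State[S, Boolean]): State[S, List[A]] = xs match {
  case h :: t => p(h).flatMap(drop => if (drop) dropWhileM(t)(p) else State(s => (s, xs)))
  case Nil => State(s => (s, Nil))
}

// The predicate increments the counter for every element it drops.
val countingPred: Int => State[Int, Boolean] =
  x => State(n => if (x < 3) (n + 1, true) else (n, false))

val (dropped, rest) = dropWhileM(List(1, 2, 3, 4, 5))(countingPred).run(0)
// dropped == 2, rest == List(3, 4, 5); callers that don't care simply ignore dropped.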

Hidden features of Scala

What are the hidden features of Scala that every Scala developer should be aware of?
One hidden feature per answer, please.
Okay, I had to add one more. Every Regex object in Scala has an extractor (see answer from oxbox_lakes above) that gives you access to the match groups. So you can do something like:
// Regex to split a date in the format Y/M/D.
val regex = "(\\d+)/(\\d+)/(\\d+)".r
val regex(year, month, day) = "2010/1/13"
The second line looks confusing if you're not used to using pattern matching and extractors. Whenever you define a val or var, what comes after the keyword is not simply an identifier but rather a pattern. That's why this works:
val (a, b, c) = (1, 3.14159, "Hello, world")
The right hand expression creates a Tuple3[Int, Double, String] which can match the pattern (a, b, c).
Most of the time your patterns use extractors that are members of singleton objects. For example, if you write a pattern like
Some(value)
then you're implicitly calling the extractor Some.unapply.
But you can also use class instances in patterns, and that is what's happening here. The val regex is an instance of Regex, and when you use it in a pattern, you're implicitly calling regex.unapplySeq (unapply versus unapplySeq is beyond the scope of this answer), which extracts the match groups into a Seq[String], the elements of which are assigned in order to the variables year, month, and day.
Structural type definitions - i.e. a type described by what methods it supports. For example:
object Closer {
def using(closeable: { def close(): Unit }, f: => Unit) {
try {
f
} finally { closeable.close }
}
}
Notice that the type of the parameter closeable is not defined other than that it has a close method.
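A small usage sketch (not from the answer): two unrelated types are both accepted because each happens to have a close() method.
class Resource { def close(): Unit = println("Resource closed") }

Closer.using(new Resource, println("working with a Resource"))
Closer.using(new java.io.StringWriter, println("working with a StringWriter"))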
Type-Constructor Polymorphism (a.k.a. higher-kinded types)
Without this feature you can, for example, express the idea of mapping a function over a list to return another list, or mapping a function over a tree to return another tree. But you can't express this idea generally without higher kinds.
With higher kinds, you can capture the idea of any type that's parameterised with another type. A type constructor that takes one parameter is said to be of kind (*->*). For example, List. A type constructor that returns another type constructor is said to be of kind (*->*->*). For example, Function1. But in Scala, we have higher kinds, so we can have type constructors that are parameterised with other type constructors. So they're of kinds like ((*->*)->*).
For example:
trait Functor[F[_]] {
def fmap[A, B](f: A => B, fa: F[A]): F[B]
}
Now, if you have a Functor[List], you can map over lists. If you have a Functor[Tree], you can map over trees. But more importantly, if you have Functor[A] for any A of kind (*->*), you can map a function over A.
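A minimal sketch of what that looks like in practice (the instance and helper below are illustrative, not from the answer):
val listFunctor = new Functor[List] {
  def fmap[A, B](f: A => B, fa: List[A]): List[B] = fa.map(f)
}

// Works for any F that has a Functor instance, whatever F is.
def addOneEverywhere[F[_]](fa: F[Int])(functor: Functor[F]): F[Int] =
  functor.fmap((x: Int) => x + 1, fa)

println(addOneEverywhere(List(1, 2, 3))(listFunctor)) // List(2, 3, 4)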
Extractors which allow you to replace messy if-elseif-else style code with patterns. I know that these are not exactly hidden but I've been using Scala for a few months without really understanding the power of them. For (a long) example I can replace:
val code: String = ...
val ps: ProductService = ...
var p: Product = null
if (code.endsWith("=")) {
p = ps.findCash(code.substring(0, 3)) //e.g. USD=, GBP= etc
}
else if (code.endsWith(".FWD")) {
//e.g. GBP20090625.FWD
p = ps.findForward(code.substring(0,3), code.substring(3, 9))
}
else {
p = ps.lookupProductByRic(code)
}
With this, which is much clearer in my opinion
implicit val ps: ProductService = ...
val p = code match {
case SyntheticCodes.Cash(c) => c
case SyntheticCodes.Forward(f) => f
case _ => ps.lookupProductByRic(code)
}
I have to do a bit of legwork in the background...
object SyntheticCodes {
// Synthetic Code for a CashProduct
object Cash extends (CashProduct => String) {
def apply(p: CashProduct) = p.currency.name + "="
//EXTRACTOR
def unapply(s: String)(implicit ps: ProductService): Option[CashProduct] = {
if (s.endsWith("="))
Some(ps.findCash(s.substring(0,3)))
else None
}
}
//Synthetic Code for a ForwardProduct
object Forward extends (ForwardProduct => String) {
def apply(p: ForwardProduct) = p.currency.name + p.date.toString + ".FWD"
//EXTRACTOR
def unapply(s: String)(implicit ps: ProductService): Option[ForwardProduct] = {
if (s.endsWith(".FWD"))
Some(ps.findForward(s.substring(0,3), s.substring(3, 9)))
else None
}
}
But the legwork is worth it for the fact that it separates a piece of business logic into a sensible place. I can implement my Product.getCode methods as follows..
class CashProduct {
def getCode = SyntheticCodes.Cash(this)
}
class ForwardProduct {
def getCode = SyntheticCodes.Forward(this)
}
Manifests, which are a sort of way of getting type information at runtime, as if Scala had reified types.
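A minimal sketch of using one (Scala 2.x; ClassTag/TypeTag later superseded Manifests):
def typeName[T](x: T)(implicit m: Manifest[T]): String = m.toString

println(typeName(List(1, 2, 3))) // prints something like scala.collection.immutable.List[Int]
println(typeName("hello"))       // prints something like java.lang.String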
In Scala 2.8 you can have tail-recursive methods (including mutually recursive ones) by using the package scala.util.control.TailCalls (in fact it's trampolining).
An example:
import scala.util.control.TailCalls._

def u(n:Int):TailRec[Int] = {
if (n==0) done(1)
else tailcall(v(n/2))
}
def v(n:Int):TailRec[Int] = {
if (n==0) done(5)
else tailcall(u(n-1))
}
val l=for(n<-0 to 5) yield (n,u(n).result,v(n).result)
println(l)
Case classes automatically mixin the Product trait, providing untyped, indexed access to the fields without any reflection:
case class Person(name: String, age: Int)
val p = Person("Aaron", 28)
val name = p.productElement(0) // name = "Aaron": Any
val age = p.productElement(1) // age = 28: Any
val fields = p.productIterator.toList // fields = List[Any]("Aaron", 28)
This feature also provides a simplified way to alter the output of the toString method:
case class Person(name: String, age: Int) {
override def productPrefix = "person: "
}
// prints "person: (Aaron,28)" instead of "Person(Aaron, 28)"
println(Person("Aaron", 28))
It's not exactly hidden, but certainly an under-advertised feature: scalac -Xprint.
As an illustration of its use, consider the following source:
class A { "xx".r }
Compiling this with scalac -Xprint:typer outputs:
package <empty> {
class A extends java.lang.Object with ScalaObject {
def this(): A = {
A.super.this();
()
};
scala.this.Predef.augmentString("xx").r
}
}
Notice scala.this.Predef.augmentString("xx").r, which is the application of the implicit def augmentString present in Predef.scala.
scalac -Xprint:<phase> will print the syntax tree after some compiler phase. To see the available phases use scalac -Xshow-phases.
This is a great way to learn what is going on behind the scenes.
Try with
case class X(a:Int,b:String)
using the typer phase to really feel how useful it is.
You can define your own control structures. It's really just functions and objects and some syntactic sugar, but they look and behave like the real thing.
For example, the following code defines dont {...} unless (cond) and dont {...} until (cond):
def dont(code: => Unit) = new DontCommand(code)
class DontCommand(code: => Unit) {
def unless(condition: => Boolean) =
if (condition) code
def until(condition: => Boolean) = {
while (!condition) {}
code
}
}
Now you can do the following:
/* This will only get executed if the condition is true */
dont {
println("Yep, 2 really is greater than 1.")
} unless (2 > 1)
/* Just a helper function */
var number = 0;
def nextNumber() = {
number += 1
println(number)
number
}
/* This will not be printed until the condition is met. */
dont {
println("Done counting to 5!")
} until (nextNumber() == 5)
@switch annotation in Scala 2.8:
An annotation to be applied to a match expression. If present, the compiler will verify that the match has been compiled to a tableswitch or lookupswitch, and issue an error if it instead compiles into a series of conditional expressions.
Example:
scala> val n = 3
n: Int = 3
scala> import annotation.switch
import annotation.switch
scala> val s = (n: @switch) match {
| case 3 => "Three"
| case _ => "NoThree"
| }
<console>:6: error: could not emit switch for @switch annotated match
val s = (n: @switch) match {
Dunno if this is really hidden, but I find it quite nice.
Type constructors that take two type parameters can be written in infix notation
object Main {
class FooBar[A, B]
def main(args: Array[String]): Unit = {
var x: FooBar[Int, BigInt] = null
var y: Int FooBar BigInt = null
}
}
Scala 2.8 introduced default and named arguments, which made possible the new "copy" method that Scala adds to case classes. If you define this:
case class Foo(a: Int, b: Int, c: Int, ... z:Int)
and you want to create a new Foo that's like an existing Foo, only with a different "n" value, then you can just say:
foo.copy(n = 3)
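A tiny concrete sketch of what that buys you (the Person class here is just illustrative):
case class Person(name: String, age: Int)

val p = Person("Aaron", 28)
val p2 = p.copy(age = 29) // Person("Aaron", 29): only the named field changes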
In Scala 2.8 you can add @specialized to your generic classes/methods. This will create special versions of the class for primitive types (those extending AnyVal) and save the cost of unnecessary boxing/unboxing:
class Foo[@specialized T]...
You can select a subset of AnyVals:
class Foo[@specialized(Int,Boolean) T]...
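A minimal sketch of the effect (the Box class is illustrative): the Int instantiation below uses a compiler-generated specialized subclass, so the 42 is stored as a primitive rather than a boxed java.lang.Integer.
class Box[@specialized(Int, Boolean) T](val value: T) {
  def get: T = value
}

val b = new Box(42) // picks the Int-specialized variant, no boxing
println(b.get)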
Extending the language. I always wanted to do something like this in Java (couldn't). But in Scala I can have:
def timed[T](thunk: => T) = {
val t1 = System.nanoTime
val ret = thunk
val time = System.nanoTime - t1
println("Executed in: " + time/1000000.0 + " millisec")
ret
}
and then write:
val numbers = List(12, 42, 3, 11, 6, 3, 77, 44)
val sorted = timed { // "timed" is a new "keyword"!
numbers.sortWith(_<_)
}
println(sorted)
and get
Executed in: 6.410311 millisec
List(3, 3, 6, 11, 12, 42, 44, 77)
You can designate a call-by-name parameter (EDITED: this is different from a lazy parameter!) to a function and it will not be evaluated until used by the function (EDIT: in fact, it will be re-evaluated every time it is used). See this FAQ for details.
class Bar(i:Int) {
println("constructing bar " + i)
override def toString():String = {
"bar with value: " + i
}
}
// NOTE the => in the method declaration. It indicates a by-name parameter
def foo(x: => Bar) = {
println("foo called")
println("bar: " + x)
}
foo(new Bar(22))
/*
prints the following:
foo called
constructing bar 22
bar with value: 22
*/
You can use locally to introduce a local block without causing semicolon inference issues.
Usage:
scala> case class Dog(name: String) {
| def bark() {
| println("Bow Vow")
| }
| }
defined class Dog
scala> val d = Dog("Barnie")
d: Dog = Dog(Barnie)
scala> locally {
| import d._
| bark()
| bark()
| }
Bow Vow
Bow Vow
locally is defined in "Predef.scala" as:
@inline def locally[T](x: T): T = x
Being inline, it does not impose any additional overhead.
Early Initialization:
trait AbstractT2 {
println("In AbstractT2:")
val value: Int
val inverse = 1.0/value
println("AbstractT2: value = "+value+", inverse = "+inverse)
}
val c2c = new {
// Only initializations are allowed in pre-init. blocks.
// println("In c2c:")
val value = 10
} with AbstractT2
println("c2c.value = "+c2c.value+", inverse = "+c2c.inverse)
Output:
In AbstractT2:
AbstractT2: value = 10, inverse = 0.1
c2c.value = 10, inverse = 0.1
We instantiate an anonymous inner class, initializing the value field in the block, before the with AbstractT2 clause. This guarantees that value is initialized before the body of AbstractT2 is executed, as shown when you run the script.
You can compose structural types with the 'with' keyword
object Main {
type A = {def foo: Unit}
type B = {def bar: Unit}
type C = A with B
class myA {
def foo: Unit = println("myA.foo")
}
class myB {
def bar: Unit = println("myB.bar")
}
class myC extends myB {
def foo: Unit = println("myC.foo")
}
def main(args: Array[String]): Unit = {
val a: A = new myA
a.foo
val b: C = new myC
b.bar
b.foo
}
}
placeholder syntax for anonymous functions
From The Scala Language Specification:
SimpleExpr1 ::= '_'
An expression (of syntactic category Expr) may contain embedded underscore symbols _ at places where identifiers are legal. Such an expression represents an anonymous function where subsequent occurrences of underscores denote successive parameters.
From Scala Language Changes:
_ + 1              expands to   x => x + 1
_ * _              expands to   (x1, x2) => x1 * x2
(_: Int) * 2       expands to   (x: Int) => x * 2
if (_) x else y    expands to   z => if (z) x else y
_.map(f)           expands to   x => x.map(f)
_.map(_ + 1)       expands to   x => x.map(y => y + 1)
Using this you could do something like:
def filesEnding(query: String) =
filesMatching(_.endsWith(query))
Implicit definitions, particularly conversions.
For example, assume a function which will format an input string to fit to a size, by replacing the middle of it with "...":
def sizeBoundedString(s: String, n: Int): String = {
if (n < 5 && n < s.length) throw new IllegalArgumentException
if (s.length > n) {
val trailLength = ((n - 3) / 2) min 3
val headLength = n - 3 - trailLength
s.substring(0, headLength)+"..."+s.substring(s.length - trailLength, s.length)
} else s
}
You can use that with any String, and, of course, use the toString method to convert anything. But you could also write it like this:
def sizeBoundedString[T](s: T, n: Int)(implicit toStr: T => String): String = {
if (n < 5 && n < s.length) throw new IllegalArgumentException
if (s.length > n) {
val trailLength = ((n - 3) / 2) min 3
val headLength = n - 3 - trailLength
s.substring(0, headLength)+"..."+s.substring(s.length - trailLength, s.length)
} else s
}
And then, you could pass values of other types by doing this:
implicit def double2String(d: Double) = d.toString
Now you can call that function passing a double:
sizeBoundedString(12345.12345D, 8)
The last argument is implicit, and is being passed automatically because of the implicit def declaration. Furthermore, "s" is being treated like a String inside sizeBoundedString because there is an implicit conversion from it to String.
Implicits of this type are better defined for uncommon types to avoid unexpected conversions. You can also explicitly pass a conversion, and it will still be used implicitly inside sizeBoundedString:
sizeBoundedString(1234567890L, 8)((l : Long) => l.toString)
You can also have multiple implicit arguments, but then you must either pass all of them, or not pass any of them. There is also a shortcut syntax for implicit conversions:
def sizeBoundedString[T <% String](s: T, n: Int): String = {
if (n < 5 && n < s.length) throw new IllegalArgumentException
if (s.length > n) {
val trailLength = ((n - 3) / 2) min 3
val headLength = n - 3 - trailLength
s.substring(0, headLength)+"..."+s.substring(s.length - trailLength, s.length)
} else s
}
This is used exactly the same way.
Implicit values can be of any type. They can be used, for instance, to hide library information. Take the following example:
case class Daemon(name: String) {
def log(msg: String) = println(name+": "+msg)
}
object DefaultDaemon extends Daemon("Default")
trait Logger {
private var logd: Option[Daemon] = None
implicit def daemon: Daemon = logd getOrElse DefaultDaemon
def logTo(daemon: Daemon) =
if (logd == None) logd = Some(daemon)
else throw new IllegalArgumentException
def log(msg: String)(implicit daemon: Daemon) = daemon.log(msg)
}
class X extends Logger {
logTo(Daemon("X Daemon"))
def f = {
log("f called")
println("Stuff")
}
def g = {
log("g called")(DefaultDaemon)
}
}
class Y extends Logger {
def f = {
log("f called")
println("Stuff")
}
}
In this example, calling "f" on a Y instance will send the log to the default daemon, while on an instance of X it goes to the "X Daemon". But calling g on an instance of X will send the log to the explicitly given DefaultDaemon.
While this simple example can be rewritten with overloading and private state, implicits do not require private state, and they can be brought into context with imports.
Maybe not too hidden, but I think this is useful:
@scala.reflect.BeanProperty
var firstName:String = _
This will automatically generate a getter and a setter for the field that match the Java bean convention.
Further description at developerworks
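A small usage sketch (note that in later Scala versions the annotation lives at scala.beans.BeanProperty):
import scala.reflect.BeanProperty

class Customer {
  @BeanProperty var firstName: String = _
}

val c = new Customer
c.setFirstName("Ada")   // generated bean setter
println(c.getFirstName) // generated bean getter, prints "Ada"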
Implicit arguments in closures.
A function argument can be marked as implicit just as with methods. Within the scope of the body of the function the implicit parameter is visible and eligible for implicit resolution:
trait Foo { def bar }
trait Base {
def callBar(implicit foo: Foo) = foo.bar
}
object Test extends Base {
val f: Foo => Unit = { implicit foo =>
callBar
}
def test = f(new Foo {
def bar = println("Hello")
})
}
Build infinite data structures with Scala's Streams :
http://www.codecommit.com/blog/scala/infinite-lists-for-the-finitely-patient
Result types are dependent on implicit resolution. This can give you a form of multiple dispatch:
scala> trait PerformFunc[A,B] { def perform(a : A) : B }
defined trait PerformFunc
scala> implicit val stringToInt = new PerformFunc[String,Int] {
def perform(a : String) = 5
}
stringToInt: java.lang.Object with PerformFunc[String,Int] = $anon$1@13ccf137
scala> implicit val intToDouble = new PerformFunc[Int,Double] {
def perform(a : Int) = 1.0
}
intToDouble: java.lang.Object with PerformFunc[Int,Double] = $anon$1@74e551a4
scala> def foo[A, B](x : A)(implicit z : PerformFunc[A,B]) : B = z.perform(x)
foo: [A,B](x: A)(implicit z: PerformFunc[A,B])B
scala> foo("HAI")
res16: Int = 5
scala> foo(1)
res17: Double = 1.0
Scala's equivalent of Java double brace initializer.
Scala allows you to create an anonymous subclass with the body of the class (the constructor) containing statements to initialize the instance of that class.
This pattern is very useful when building component-based user interfaces (for example Swing, Vaadin) as it allows you to create UI components and declare their properties more concisely.
See http://spot.colorado.edu/~reids/papers/how-scala-experience-improved-our-java-development-reid-2011.pdf for more information.
Here is an example of creating a Vaadin button:
val button = new Button("Click me"){
setWidth("20px")
setDescription("Click on this")
setIcon(new ThemeResource("icons/ok.png"))
}
Excluding members from import statements
Suppose you want to use a Logger that contains a println and a printerr method, but you only want to use the one for error messages, and keep the good old Predef.println for standard output. You could do this:
val logger = new Logger(...)
import logger.printerr
but if logger also contains another twelve methods that you would like to import and use, it becomes inconvenient to list them. You could instead try:
import logger.{println => donotuseprintlnt, _}
but this still "pollutes" the list of imported members. Enter the über-powerful wildcard:
import logger.{println => _, _}
and that will do just the right thing™.
The require method (defined in Predef) lets you define additional function constraints that are checked at run-time. Imagine that you are developing yet another Twitter client and you need to limit tweet length to 140 characters. Moreover, you can't post an empty tweet.
def post(tweet: String) = {
require(tweet.length < 140 && tweet.length > 0)
println(tweet)
}
Now calling post with an argument of inappropriate length will cause an exception:
scala> post("that's ok")
that's ok
scala> post("")
java.lang.IllegalArgumentException: requirement failed
at scala.Predef$.require(Predef.scala:145)
at .post(<console>:8)
scala> post("way to looooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooooong tweet")
java.lang.IllegalArgumentException: requirement failed
at scala.Predef$.require(Predef.scala:145)
at .post(<console>:8)
You can write multiple requirements or even add description to each:
def post(tweet: String) = {
require(tweet.length > 0, "too short message")
require(tweet.length < 140, "too long message")
println(tweet)
}
Now exceptions are verbose:
scala> post("")
java.lang.IllegalArgumentException: requirement failed: too short message
at scala.Predef$.require(Predef.scala:157)
at .post(<console>:8)
One more example is here.
Bonus
You can perform an action every time a requirement fails:
scala> var errorcount = 0
errorcount: Int = 0
def post(tweet: String) = {
require(tweet.length > 0, {errorcount+=1})
println(tweet)
}
scala> errorcount
res14: Int = 0
scala> post("")
java.lang.IllegalArgumentException: requirement failed: ()
at scala.Predef$.require(Predef.scala:157)
at .post(<console>:9)
...
scala> errorcount
res16: Int = 1
Traits with abstract override methods are a feature in Scala that is not as widely advertised as many others. The intent of methods with the abstract override modifier is to do some operations and then delegate the call to super. These traits then have to be mixed in with concrete implementations of their abstract override methods.
trait A {
def a(s : String) : String
}
trait TimingA extends A {
abstract override def a(s : String) = {
val start = System.currentTimeMillis
val result = super.a(s)
val dur = System.currentTimeMillis-start
println("Executed a in %s ms".format(dur))
result
}
}
trait ParameterPrintingA extends A {
abstract override def a(s : String) = {
println("Called a with s=%s".format(s))
super.a(s)
}
}
trait ImplementingA extends A {
def a(s: String) = s.reverse
}
scala> val a = new ImplementingA with TimingA with ParameterPrintingA
scala> a.a("a lotta as")
Called a with s=a lotta as
Executed a in 0 ms
res4: String = sa attol a
While my example is really not much more than a poor man's AOP, I used these Stackable Traits much to my liking to build Scala interpreter instances with predefined imports, custom bindings and classpaths. The Stackable Traits made it possible to create my factory along the lines of new InterpreterFactory with JsonLibs with LuceneLibs and then have useful imports and scope variables for the users' scripts.