Suppose I have the following class hierarchy:
trait A; class A1 extends A; class A2 extends A
Now I need to filter A1 instances in List[A]. I use either pattern matching or isInstanceOf.
as.filter(cond(_){case _: A1 => true}) // use pattern matching
as.filter(_.isInstanceOf[A1]) // use isInstanceOf
Does it work the same? Which would you prefer?
Why don't you use collect? That has the added benefit that the returned list will be of the right type (List[A1] instead of List[A])
val a1s = as.collect { case x:A1 => x }
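For example, a quick sketch using the hierarchy from the question (the sample values here are made up) shows that the result is already narrowed, with no cast needed:
trait A; class A1 extends A; class A2 extends A

val as: List[A] = List(new A1, new A2, new A1)
val a1s: List[A1] = as.collect { case x: A1 => x } // statically typed as List[A1]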
While the accepted answer gives good advice, please note that a type case in Scala is no different from using isInstanceOf combined with asInstanceOf. These two examples are roughly equivalent:
def foo(x: Any) = x match {
  case s: String => println(s"$s is a String")
  case _ => println("something else")
}
def foo(x: Any) = x match {
  case _ if x.isInstanceOf[String] => println(s"${x.asInstanceOf[String]} is a String")
  case _ => println("something else")
}
So in your specific example it doesn't really matter which of the two you use: you'll always end up doing some kind of downcasting, which is something to generally avoid.
See how the second version is considerably uglier, hence more appropriate, since you're doing an "ugly" thing in a functional language.
So, I'd go with
val a1s = as.collect{case x if x.isInstanceOf[A1] => x.asInstanceOf[A1]}
Ugly things should look ugly.
Does it work the same?
The same answer will be generated, but different code will be emitted in each case, as you might expect.
You can examine the IL which is generated in each case, as follows. Create a "test.scala" file with the following contents:
import PartialFunction.cond
trait A; class A1 extends A; class A2 extends A
class Filterer {
  def filter1(as: List[A]) =
    as.filter(cond(_){case _: A1 => true}) // use pattern matching
  def filter2(as: List[A]) =
    as.filter(_.isInstanceOf[A1]) // use isInstanceOf
}
Then run:
scalac test.scala
To examine the IL for the as.filter(cond(_){case _: A1 => true}) version, do
javap -c 'Filterer$$anonfun$filter1$1'
javap -c 'Filterer$$anonfun$filter1$1$$anonfun$apply$1'
Then to examine the IL for the as.filter(_.isInstanceOf[A1]) version, you can do
javap -c 'Filterer$$anonfun$filter2$1'
The "cond" version uses more intermediate values and instantiates more objects representing the extra anonymous functions involved.
In Scala, I am thinking of a simple monad Result that contains either a Good value, or alternatively an Error message. Here is my implementation.
I'd like to ask: Did I do something in an excessively complicated manner? Or did I even make mistakes?
Could this be simplified (but maintaining readability, so no Perl golf)? For example, do I need to use the abstract class and the companion object, or could it be simpler to put everything in a normal class?
abstract class Result[+T] {
  def flatMap[U](f: T => Result[U]): Result[U] = this match {
    case Good(x) => f(x)
    case e: Error => e
  }
  def map[U](f: T => U): Result[U] = flatMap { (x: T) => Result(f(x)) }
}
case class Good[T](x: T) extends Result[T]
case class Error(e: String) extends Result[Nothing]
object Result { def apply[T](x: T): Result[T] = Good(x) }
Now if I define, for example,
val x = Good(5)
def f1(v: Int): Result[Int] = Good(v + 1)
def fE(v: Int): Result[Int] = Error("foo")
then I can chain in the usual manner:
x flatMap f1 flatMap f1 // => Good(7)
x flatMap fE flatMap f1 // => Error(foo)
And the for-comprehension:
for (
  a <- x;
  b <- f1(a);
  c <- f1(b)
) yield c // => Good(7)
P.S: I am aware of the \/ monad in Scalaz, but this is for simple cases when installing and importing Scalaz feels a bit heavy.
Looks good to me. I would change the abstract class into a sealed trait. And I think you could leave off the return types for flatMap and map without losing any readability.
I like the companion object because it calls out your unit function for what it is.
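For reference, here is a sketch of how those two suggestions might look combined (same behaviour as the original, just a sealed trait and inferred return types):
sealed trait Result[+T] {
  def flatMap[U](f: T => Result[U]) = this match {
    case Good(x) => f(x)
    case e: Error => e
  }
  def map[U](f: T => U) = flatMap { x => Result(f(x)) }
}
case class Good[T](x: T) extends Result[T]
case class Error(e: String) extends Result[Nothing]
object Result { def apply[T](x: T): Result[T] = Good(x) }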
In researching how to do Memoization in Scala, I've found some code I didn't grok. I've tried to look this particular "thing" up, but don't know by what to call it; i.e. the term by which to refer to it. Additionally, it's not easy searching using a symbol, ugh!
I saw the following code to do memoization in Scala here:
import scala.collection.mutable

case class Memo[A,B](f: A => B) extends (A => B) {
  private val cache = mutable.Map.empty[A, B]
  def apply(x: A) = cache getOrElseUpdate (x, f(x))
}
And it's what the case class is extending that is confusing me, the extends (A => B) part. First, what is happening? Secondly, why is it even needed? And finally, what do you call this kind of inheritance; i.e. is there some specific name or term I can use to refer to it?
Next, I am seeing Memo used in this way to calculate a Fibonacci number here:
val fibonacci: Memo[Int, BigInt] = Memo {
  case 0 => 0
  case 1 => 1
  case n => fibonacci(n-1) + fibonacci(n-2)
}
It's probably my not seeing all of the "simplifications" that are being applied. But, I am not able to figure out the end of the val line, = Memo {. So, if this was typed out more verbosely, perhaps I would understand the "leap" being made as to how the Memo is being constructed.
Any assistance on this is greatly appreciated. Thank you.
A => B is short for Function1[A, B], so your Memo extends a function from A to B, most prominently through the abstract method apply(x: A): B, which must be implemented.
Because of the "infix" notation, you need to put parentheses around the type, i.e. (A => B). You could also write
case class Memo[A, B](f: A => B) extends Function1[A, B] ...
or
case class Memo[A, B](f: Function1[A, B]) extends Function1[A, B] ...
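Concretely, extending Function1 is what lets a Memo instance be used like any other function. A tiny sketch, assuming the Memo definition from the question (the values are made up):
val double = Memo((i: Int) => i * 2)
double(21)                 // 42, computed once and then cached
List(1, 2, 3).map(double)  // List(2, 4, 6): accepted wherever an Int => Int is expected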
To complete 0__'s answer, fibonacci is being instantiated through the apply method of Memo's companion object, generated automatically by the compiler since Memo is a case class.
This means that the following code is generated for you:
object Memo {
  def apply[A, B](f: A => B): Memo[A, B] = new Memo(f)
}
Scala has special handling for the apply method: its name need not be typed when calling it. The two following calls are strictly equivalent:
Memo((a: Int) => a * 2)
Memo.apply((a: Int) => a * 2)
The case block is known as pattern matching. Under the hood, it generates a partial function - that is, a function that is defined for some of its input parameters, but not necessarily all of them. I won't go into the details of partial functions as it's beside the point (this is a memo I wrote to myself on that topic, if you're keen), but what it essentially means here is that the case block is in fact an instance of PartialFunction.
If you follow that link, you'll see that PartialFunction extends Function1 - which is the expected argument of Memo.apply.
So what that bit of code actually means, once desugared (if that's a word), is:
lazy val fibonacci: Memo[Int, BigInt] = Memo.apply(new PartialFunction[Int, BigInt] {
  override def apply(v: Int): BigInt =
    if (v == 0) 0
    else if (v == 1) 1
    else fibonacci(v - 1) + fibonacci(v - 2)

  override def isDefinedAt(v: Int) = true
})
Note that I've vastly simplified the way the pattern matching is handled, but I thought that starting a discussion about unapply and unapplySeq would be off topic and confusing.
I am the original author of doing memoization this way. You can see some sample usages in that same file. It also works really well when you want to memoize on multiple arguments, because of the way Scala unrolls tuples:
/**
 * @return memoized function to calculate C(n,r)
 * see http://mathworld.wolfram.com/BinomialCoefficient.html
 */
val c: Memo[(Int, Int), BigInt] = Memo {
  case (_, 0) => 1
  case (n, r) if r > n/2 => c(n, n-r)
  case (n, r) => c(n-1, r-1) + c(n-1, r)
}
// note how I can invoke a memoized function on multiple args too
val x = c(10, 3)
This answer is a synthesis of the partial answers provided by both 0__ and Nicolas Rinaudo.
Summary:
There are many convenient (but also highly intertwined) assumptions being made by the Scala compiler.
1. Scala treats extends (A => B) as synonymous with extends Function1[A, B] (ScalaDoc for Function1[+T1, -R])
2. A concrete implementation of Function1's inherited abstract method apply(x: A): B must be provided; def apply(x: A): B = cache.getOrElseUpdate(x, f(x))
3. Scala assumes an implied match for the code block starting with = Memo {
4. Scala passes the content between {} started in item 3 as a parameter to the Memo case class constructor
5. Scala infers the type of the {} block started in item 3 to be PartialFunction[Int, BigInt]; the compiler uses the match cases as the body of the generated PartialFunction's apply() method and additionally provides an override of its isDefinedAt() method.
Details:
The first code block defining the case class Memo can be written more verbosely as follows:
case class Memo[A,B](f: A => B) extends Function1[A, B] { // replaced (A => B) with what the Scala compiler translates it to
  private val cache = mutable.Map.empty[A, B]
  def apply(x: A): B = cache.getOrElseUpdate(x, f(x)) // concrete implementation of the abstract method declared in the parent trait, Function1
}
The second code block defining the val fibonacci can be written more verbosely as follows:
lazy val fibonacci: Memo[Int, BigInt] = {
  Memo.apply(
    new PartialFunction[Int, BigInt] {
      override def apply(x: Int): BigInt = {
        x match {
          case 0 => 0
          case 1 => 1
          case n => fibonacci(n-1) + fibonacci(n-2)
        }
      }
      override def isDefinedAt(x: Int): Boolean = true
    }
  )
}
I had to add lazy to the second code block's val in order to deal with the self-reference in the line case n => fibonacci(n-1) + fibonacci(n-2).
And finally, an example usage of fibonacci is:
val x:BigInt = fibonacci(20) //returns 6765 (almost instantly)
One more word about this extends (A => B): the extends clause here is not strictly required, but it is necessary if instances of Memo are to be used in higher-order functions or similar situations.
Without this extends (A => B), it's perfectly fine if you use the Memo instance fibonacci in plain method calls.
case class Memo[A,B](f: A => B) {
  private val cache = scala.collection.mutable.Map.empty[A, B]
  def apply(x: A): B = cache getOrElseUpdate (x, f(x))
}
val fibonacci: Memo[Int, BigInt] = Memo {
  case 0 => 0
  case 1 => 1
  case n => fibonacci(n-1) + fibonacci(n-2)
}
For example:
scala> fibonacci(30)
res1: BigInt = 832040
But when you want to use it in higher order functions, you'd have a type mismatch error.
scala> Range(1, 10).map(fibonacci)
<console>:11: error: type mismatch;
found : Memo[Int,BigInt]
required: Int => ?
Range(1, 10).map(fibonacci)
^
So the extends here only serves to identify the instance fibonacci to other code as a Function1, i.e. something with an apply method that can be passed around like any other function.
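With the extends (A => B) clause back in place, the same call compiles, because Memo[Int, BigInt] is then itself an Int => BigInt (a quick sketch):
val firstNine = Range(1, 10).map(fibonacci) // Vector(1, 1, 2, 3, 5, 8, 13, 21, 34)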
sealed class A
class B1 extends A
class B2 extends A
Assuming we have a List of objects of class A:
val l: List[A] = List(new B1, new B2, new B1, new B1)
And we want to select the elements of type B1.
Then we need a predicate and could use the following two alternatives:
l.filter(_.isInstanceOf[B1])
Or
l.filter(_ match {case b: B1 => true; case _ => false})
Personally, I like the first approach more, but I have often read that one should prefer the match-case statement (for reasons I do not know).
Therefore, the question is: Are there drawbacks to using isInstanceOf instead of the match-case statement? When should one use which approach (and which approach should be used here, and why)?
You can filter like that:
l.collect{ case x: B1 => x }
That is much more readable, IMO.
There's no problem using isInstanceOf, as long as you don't use asInstanceOf.
Code that uses both is brittle, because checking and casting are separate actions, whereas using matching you have a single action doing both.
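A hedged illustration, using the classes from the question: the check and the cast are two separate actions that can drift apart during refactoring, whereas the match binds a correctly typed value in one step.
val x: A = new B1

// isInstanceOf/asInstanceOf: the cast still compiles even if the test above changes
if (x.isInstanceOf[B1]) {
  val b = x.asInstanceOf[B1]
  println(b)
}

// pattern match: the test and the binding happen together
x match {
  case b: B1 => println(b)
  case _ => ()
}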
There is no difference, as the compiler output shows:
$ cat t.scala
class A {
  def x(o: AnyRef) = o.isInstanceOf[A]
  def y(o: AnyRef) = o match {
    case s: A => true
    case _ => false
  }
}
$ scalac -print t.scala
[[syntax trees at end of cleanup]]// Scala source: t.scala
package <empty> {
  class A extends java.lang.Object with ScalaObject {
    def x(o: java.lang.Object): Boolean = o.$isInstanceOf[A]();
    def y(o: java.lang.Object): Boolean = {
      <synthetic> val temp1: java.lang.Object = o;
      temp1.$isInstanceOf[A]()
    };
    def this(): A = {
      A.super.this();
      ()
    }
  }
}
The advantage of match-case is that you don't have to cast the object in case you want to perform operations on it that depend on its narrower type.
In the following snippet, using isInstanceOf seems to be fine since you don't perform such an operation:
if (obj.isInstanceOf[A]) println(obj)
However, if you do the following:
if (obj.isInstanceOf[A]) {
  val a = obj.asInstanceOf[A]
  println(a.someField) // someField is declared by A
}
then I'd be in favour of using match-case:
obj match {
  case a: A => println(a.someField)
  case _ =>
}
It is slightly annoying that you have to include the "otherwise" case, but using collect (as hinted at by om-nom-nom) can help, at least if you work with collections that inherit from Seq:
collectionOfObj.collect { case a: A => a }.foreach(a => println(a.someField))
Looking through the Scala code, the convenient array creation syntax is achieved by adding an apply method to object Array. At first, I thought this was achieved somehow through case classes because you can run the following, but this does not seem to be the case:
Array(1,2,3) match { case Array(a, b, c) => a + b + c }
I know that I also need to look at WrappedArray and all the superclasses, but I can't figure out how Scala achieves this matching on Arrays (and I need to become more familiar with the Scala collections class hierarchy). It certainly doesn't work with a run-of-the-mill class.
scala> class A(val x: Int)
scala> new A(4) match { case A(x) => x }
<console>:9: error: not found: value A
new A(4) match { case A(x) => x }
^
<console>:9: error: not found: value x
new A(4) match { case A(x) => x }
How do they get this to work with Array?
You can pattern match with this syntax on any class so long as you have an object with an unapply or unapplySeq (in the case of varargs) method that returns an Option or Boolean. These are known as extractors. The lines in question from object Array are
def unapplySeq[T](x: Array[T]): Option[IndexedSeq[T]] =
  if (x == null) None else Some(x.toIndexedSeq)
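As an aside (not part of the original answer), because this is unapplySeq rather than unapply, patterns with a variable number of elements also work:
Array(1, 2, 3, 4) match {
  case Array(head, rest @ _*) => println(s"head = $head, rest = $rest")
}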
In your example you can get it to match using
class A(val x: Int)
object A {
  def unapply(a: A) = Some(a.x)
}
so now
scala> new A(4) match { case A(x) => x }
res1: Int = 4
The Programming In Scala chapter on extractors may be useful.
For case classes, an unapply method is just one of the methods that is included for free, along with toString, equals, etc.
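For instance (a sketch with a made-up case class P), the compiler-generated extractor means this just works:
case class P(x: Int)
P(4) match { case P(x) => x } // 4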
Note that the extractor doesn't have to have the same name as the class in question, and it doesn't have to be defined within the class's companion object. For example, in your case you could equally write
val xyz = new { def unapply(a: A) = Some(a.x) } //extending java.lang.Object
new A(4) match { case xyz(x) => x } //Int = 4