Scala Try[Unit] confusion

I have this piece of code
import scala.util.Try
val t: Try[Unit] = Try(Try(1))
and 2 questions:
What is happening here? How can the type Try[Try[Int]] match with
Try[Unit]? Is it because Scala chooses the return type for block
Try(1) to be Unit to match with the desired type?
Is there any way to detect a nested Try? Say I have a Try[A], how do I know whether A is another Try[_]?

You are basically forcing the compiler to treat the Try as Unit.
For example, the doSomething method below would return Int, since 1 + 1 is its last expression, but the declared return type Unit forces it to return ().
scala> def doSomething: Unit = 1 + 1
doSomething: Unit
In your example val t: Try[Unit] = Try(Try(1 / 0)), you are asking the compiler to treat the inner Try(1 / 0) as Unit; which means
scala> val innerTry: Unit = Try(1 / 0)
innerTry: Unit = ()
Which means that even if the inner Try fails, its value is discarded to the Unit value (), which the outer Try always wraps in a Success.
scala> val t: Try[Unit] = Try(someOperation)
t: scala.util.Try[Unit] = Success(())
Better to remove the specific type you are providing and let the compiler figure it out:
scala> val t = Try(Try(1 / 0))
t: scala.util.Try[scala.util.Try[Int]] = Success(Failure(java.lang.ArithmeticException: / by zero))
Also read: Scala: Why can I convert Int to Unit?
final abstract class Unit private extends AnyVal {
// Provide a more specific return type for Scaladoc
override def getClass(): Class[Unit] = ???
}
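As for the second question, a possible sketch (my suggestion, assuming Scala 2 with scala-reflect on the classpath; isNestedTry is a hypothetical helper, not a standard method) uses a TypeTag to inspect A at the call site:

```scala
import scala.reflect.runtime.universe._
import scala.util.Try

// A TypeTag carries the full, unerased type of A, so we can check
// whether A is itself some Try[_].
def isNestedTry[A: TypeTag](t: Try[A]): Boolean =
  typeOf[A] <:< typeOf[Try[_]]

isNestedTry(Try(Try(1)))  // true: A is Try[Int]
isNestedTry(Try(1))       // false: A is Int
```

Note that this only works where a TypeTag is available at compile time; at runtime the type argument of an arbitrary Try[A] value is erased.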

Try(1) returns Success(1).
Try(Try(1)) returns Success(()) because of its assignment to the type Try[Unit].
Similar to the following commands in the Scala REPL, where x is forced to take the Unit type:
scala> val x = 5
x: Int = 5
scala> val x:Unit = 5
<console>:23: warning: a pure expression does nothing in statement position; you may be omitting necessary parentheses
val x:Unit = 5
^
x: Unit = ()
scala>
So the compiler takes the return value and is forced to discard it to (), which is Unit. Since the declared type is Unit, this interpretation makes sense, and the compiler warns about it rather than rejecting it.

Related

Scala 2 type inference

import scala.reflect.{ClassTag, classTag}

def test[T: ClassTag]: T = {
  println(classTag[T])
  null.asInstanceOf[T]
}
val x1: Int = test
val x2: Int = test[Int]
prints
Nothing
Int
I would expect the compiler to guess the Int type without the need to provide it explicitly (EDIT: on the right hand side, i.e. make val x1: Int = test work).
Is there perhaps a workaround to get rid of the explicit type annotation?
I suspect that the compiler can't infer Int in val x1: Int = test[???] because both options:
val x1: Int = test[Int] // returns Int
and
val x1: Int = test[Nothing] // returns Nothing <: Int
are valid, so the compiler just has to guess which option you meant. When the compiler has a choice it often selects the minimal type among the options, and currently that is Nothing.
Why Scala Infer the Bottom Type when the type parameter is not specified?
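A widely used workaround (my addition, not from the answers here: the "NotNothing" evidence trick) does not make the compiler pick Int, but it does turn the silently inferred Nothing into a compile-time error:

```scala
import scala.reflect.{ClassTag, classTag}

// Evidence that resolves for every type except Nothing: the two
// conflicting instances for Nothing make its implicit resolution
// ambiguous, so an accidental T = Nothing fails to compile.
sealed trait NotNothing[T]
object NotNothing {
  implicit def anyType[T]: NotNothing[T] = null
  implicit def nothing1: NotNothing[Nothing] = null
  implicit def nothing2: NotNothing[Nothing] = null
}

def test[T: ClassTag: NotNothing]: T = {
  println(classTag[T])
  null.asInstanceOf[T]
}

val x2: Int = test[Int]  // still fine, prints Int
// val x1: Int = test    // now an ambiguous-implicit compile error instead of Nothing
```

This makes the mistake visible at compile time, at the cost of still requiring the explicit type argument.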
In principle, if you'd like to explore the type of the left-hand side, you can make test a macro. Then it does not even need to be generic. Making it whitebox means that it can return a type more precise than the declared one (Any), e.g. Int.
import scala.reflect.macros.whitebox // libraryDependencies += scalaOrganization.value % "scala-reflect" % scalaVersion.value
import scala.language.experimental.macros
def test: Any = macro testImpl
def testImpl(c: whitebox.Context): c.Tree = {
  import c.universe._
  val T = c.internal.enclosingOwner.typeSignature
  println(s"T=$T")
  q"""
    _root_.scala.Predef.println(_root_.scala.reflect.runtime.universe.typeOf[$T])
    null.asInstanceOf[$T]
  """
}
// in a different subproject
val x1: Int = test // prints at runtime: Int
// scalacOptions += "-Ymacro-debug-lite"
//scalac: T=Int
//scalac: {
// _root_.scala.Predef.println(_root_.scala.reflect.runtime.universe.typeOf[Int]);
// null.asInstanceOf[Int]
//}
Actually there's nothing wrong with type inference here. Type inference means that the compiler should figure out that the type parameter T is Int and that the value returned from your method expression is an integer, and that works properly:
x1: Int = 0 // this is the result of your first line
You can also try printing this:
println(x1.getClass) // int
What's not working as expected is implicit resolution for the generic type ClassTag[_]. The reason it prints Nothing is that the compiler found the object scala.reflect.ClassTag.Nothing: ClassTag[Nothing] suitable for your case. There is plenty of material on Stack Overflow and elsewhere about why this happens and how to deal with it in different cases.
Here's another piece of code to differentiate type inference from type erasure in your case:
def nullAs[T]: T = null.asInstanceOf[T]
val int: Int = nullAs // 0: Int
// type erasure:
case class Container[T](value: T)
implicit val charContainer: Container[Char] = Container('c')
def nullWithContainer[T](implicit container: Container[T]): T = {
  println(container)
  null.asInstanceOf[T]
}
val long: Long = nullWithContainer
// prints Container(c)
// 0L
Which means type inference is done correctly, but type erasure has happened, because at runtime the type of charContainer is Container, not Container[Char].
val x1: Int = test
In this line, the Int you have given is the type of the value x1 that holds the result of test. For the ClassTag of T the compiler finds no information, which is why it infers Nothing; in effect it is compiling val x1: Int = test[Nothing]. So you always have to mention a type for test, or else it prints Nothing.

Scala expression evaluation

I'm intrigued by the following Scala compiler behavior. When I declare a function of type Unit but nevertheless provide a body that evaluates to an Int, the Scala compiler is OK with it.
def add(x:Int, y:Int) = x + y
def main(args: Array[String]): Unit = {add(args(0).toInt, args(0).toInt)}
While the same is not true with other type such in
def afunc: String = {1} //type mismatch; found : Int(1) required: String
Also if i write
def afunc: Unit = {1}
or
def main(args: Array[String]): Unit = {2 + 2} // Which is just like the first addition above
In both cases I get the following warning:
a pure expression does nothing in statement position; you may be omitting necessary parentheses
In a sense there are two questions here. What is the difference between a function that evaluates to an Int and the expression 2 + 2? And why on earth does the compiler not complain when a function that is supposed to evaluate to Unit gets a body that evaluates to another type, as it would complain between other types?
Many thanks in advance,
Maatari
What is the difference between a function that evaluates to an Int and the expression 2 + 2.
A function that evaluates to an Int might have side effects
var a = 0
def fn: Int = {
  a = a + 1
  1
}
Every call to fn changes the value of a.
Then why on earth does the compiler not complain that a function that is supposed to evaluate to Unit gets a body that evaluates to another type, as it would happen between other types.
When you specify Unit as the return type, the compiler applies value discarding: whatever value the body produces is thrown away and replaced by (), the single value of type Unit (you can think of Unit as the void type of Scala).
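A minimal sketch of this value-discarding behavior (bump and counter are illustrative names of my own):

```scala
var counter = 0

// The body evaluates to Int, but the declared type is Unit, so the
// compiler discards the result: the side effect still runs, only the
// final value is replaced by ().
def bump(): Unit = { counter += 1; counter }

bump()
assert(counter == 1)  // the side effect happened; the Int result was dropped
```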
Unit is inferred as a return type for convenience. For example,
scala> val h = new collection.mutable.HashMap[String,String]
h: scala.collection.mutable.HashMap[String,String] = Map()
scala> h += "fish" -> "salmon"
res1: h.type = Map(fish -> salmon)
we see that the += method on mutable HashMaps returns the map.
But if the result were not discarded when Unit is expected, then
def add(a: String, b: String): Unit = h += a -> b
would not compile, because h += a -> b returns the map rather than ().
Which thing is worse--accidentally expecting a return value vs. demanding you explicitly return the most boring possible return value--depends on how functional your code is. If you have more than a few side-effects, not inferring Unit is rather painful.
A function declared with just the parentheses is mainly called for its side effects. It may return a result, as was common in the days of C or Java, but in reality we don't care, because we will not use it.
A procedure body in {} without an = sign is equivalent to declaring : Unit =, and the return type annotation can be seen as a cast, or an implicit conversion, to the type the function expects.
Here is another example where this behavior occurs.
def add(a: Int, b:Int) : Double = a + b
add: (a: Int, b: Int)Double
scala> add(4,5)
res17: Double = 9.0
Thus it seems that every value can be transformed to Unit.
scala> val a: Unit = 1:Unit
<console>:28: warning: a pure expression does nothing in statement position; you may be omitting necessary parentheses
val a: Unit = 1:Unit
^
a: Unit = ()

unwanted implicit argument resolution in higher order function map

I'm having issues trying to map some methods defined with implicit arguments over an Option type.
Let's say I define a custom class and a trait with the aforementioned methods operating on said class
class MyClass
trait Processor {
  def stringy(implicit arg: MyClass) = arg.toString
  def lengthy(implicit arg: MyClass) = arg.toString.length
  def inty(implicit arg: MyClass) = arg.toString.map(_.toInt).sum
}
Then I create an implementation with some tests
object TestProcessing extends Processor {
  // Here everything works fine, the argument is passed explicitly
  def test() {
    val my = new MyClass
    val res = List(stringy(my), lengthy(my), inty(my))
    println(res.mkString("\n"))
  }

  // Still everything ok, the argument is passed implicitly
  def testImplicit() {
    implicit val my = new MyClass
    val res = List(stringy, lengthy, inty)
    println(res.mkString("\n"))
  }

  object Mapper {
    // class wrapped in an Option
    val optional = Some(new MyClass)

    // trying to factor out common code
    def optionally[T](processFunction: MyClass => T): Option[T] = optional map processFunction

    // now the specific processing methods that should work on the optional value
    def s: Option[String] = optionally(stringy)
    def l: Option[Int] = optionally(lengthy)
    def i: Option[Int] = optionally(inty)
    /*
     * Here the compiler complains that
     *
     * <console>:40: error: could not find implicit value for parameter arg: MyClass
     *        def s: Option[String] = optionally(stringy)
     *                                           ^
     * <console>:41: error: could not find implicit value for parameter arg: MyClass
     *        def l: Option[Int] = optionally(lengthy)
     *                                        ^
     * <console>:42: error: could not find implicit value for parameter arg: MyClass
     *        def i: Option[Int] = optionally(inty)
     *                                        ^
     */
  }
}
While optionally is designed to pass the optional value explicitly to its argument function, the compiler demands an implicit definition when I use it with the actual functions.
I have two possible solutions, neither of which is satisfactory:
passing an implicit argument to optionally as in
optionally(implicit my => stringy)
avoid defining the argument to the specific functions as implicit, as in
def stringy(arg: MyClass)
Each solution defeats the goal of achieving both readability and usability.
Is there a third way to go?
If I understand correctly, the problem is that the compiler doesn't seem to recognize that you want to partially apply / lift the method to a function here (instead it "thinks" you want to leave out the implicit parameter), so doing that explicitly seems to work:
def s: Option[String] = optionally(stringy(_))
def l: Option[Int] = optionally(lengthy(_))
def i: Option[Int] = optionally(inty(_))
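A self-contained sketch of that fix (using a pared-down version of the classes above, with a toString added so the result is checkable) shows the explicit placeholder compiling and running:

```scala
class MyClass { override def toString = "my" }

def stringy(implicit arg: MyClass): String = arg.toString

val optional: Option[MyClass] = Some(new MyClass)
def optionally[T](f: MyClass => T): Option[T] = optional map f

// stringy(_) explicitly lifts the method into a MyClass => String
// function, so no implicit MyClass is searched for at this call site.
val s: Option[String] = optionally(stringy(_))
assert(s == Some("my"))
```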

Scala asInstanceOf with parameterized types

I would like to write a function that casts to type A, where A can be e.g. List[Int], or a more complicated parameterized type like Map[Int, List[Int]].
def castToType[A](x: Any): A = {
// throws if A is not the right type
x.asInstanceOf[A]
}
Right now, due to type erasure (I believe), the code merrily works even when the type is not correct. The error only manifests on access, with a ClassCastException.
val x = List(1, 2, 3)
val y = castToType[List[String]](x)
y(0) --> throws java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.String
Is there a way I can use manifests to make this work properly? Thanks!
Unfortunately, this is an inherent limitation of asInstanceOf. I'm actually surprised to see the scaladoc mention it in detail:
Note that the success of a cast at runtime is modulo Scala's erasure semantics. Therefore the expression 1.asInstanceOf[String] will throw a ClassCastException at runtime, while the expression List(1).asInstanceOf[List[String]] will not. In the latter example, because the type argument is erased as part of compilation it is not possible to check whether the contents of the list are of the requested type.
If you're mainly concerned about failing fast on a wrong cast for traversables (likely the main issue when getting things back from your DB/memcached interface), you can play around with forcing a cast of the head for traversable objects:
def failFastCast[A: Manifest, T[A] <: Traversable[A]](as: T[A], any: Any) = {
  val res = any.asInstanceOf[T[A]]
  if (res.isEmpty) res
  else {
    manifest[A].newArray(1).update(0, res.head) // force exception on wrong type
    res
  }
}
On a simple example it works:
scala> val x = List(1, 2, 3): Any
x: Any = List(1, 2, 3)
scala> failFastCast(List[String](), x)
java.lang.ArrayStoreException: java.lang.Integer
[...]
scala> failFastCast(List[Int](), x)
res22: List[Int] = List(1, 2, 3)
But not on a more complex one:
val x = Map(1 -> ("s" -> 1L)): Any
failFastCast(Map[Int, (String, String)](), x) // no throw
I wonder if there is a way to recursively drill down into A to keep casting until there is no more type parameters...
You are indeed correct: type erasure means that you cannot "cast" in such a way as to distinguish between List[Int] and List[String], for example. However, you can improve on the cast you are performing, in which A is erased to the point that you cannot even distinguish between an Int and a String:
def cast[A](a : Any) = a.asInstanceOf[A]
//... is erased to
def erasedCast(a : Any) = a.asInstanceOf[Any]
What you need are reified generics, using manifests
def cast[A <: AnyRef : Manifest](a : Any) : A
= manifest[A].erasure.cast(a).asInstanceOf[A]
Whilst the final cast is erased to AnyRef, at least you should have the correct Class[_] instance (manifest.erasure) to get the top level type correct. In action:
scala> cast[String]("Hey")
res0: String = Hey
scala> cast[java.lang.Integer]("Hey")
java.lang.ClassCastException
at java.lang.Class.cast(Class.java:2990)
at .cast(<console>:7)
at .<init>(<console>:9)
scala> cast[List[String]](List("Hey"))
res2: List[String] = List(Hey)
scala> cast[List[Int]](List("Hey"))
res3: List[Int] = List(Hey)
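Manifest has since been deprecated; the same top-level check can be sketched with ClassTag (my adaptation, not part of the original answer):

```scala
import scala.reflect.ClassTag

// Same idea as the Manifest version: verify the runtime class of the
// top-level type, while type arguments remain unchecked due to erasure.
def cast[A <: AnyRef : ClassTag](a: Any): A =
  implicitly[ClassTag[A]].runtimeClass.cast(a).asInstanceOf[A]

cast[String]("Hey")               // ok
// cast[java.lang.Integer]("Hey") // throws ClassCastException
cast[List[String]](List(1, 2))    // still succeeds: the element type is erased
```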
My advice is not to use nested reflection to decide whether the target was really a List[Int]: this is not generally feasible. For what should the following return?
cast[List[Int]](List[AnyVal](1, 2))
You could use shapeless's Typeable from Miles Sabin:
Type casting using type parameter
It handles erasure in many cases, though only specific ones:
scala> import shapeless._; import syntax.typeable._
import shapeless._
import syntax.typeable._
scala> val x = List(1, 2, 3)
x: List[Int] = List(1, 2, 3)
scala> val y = x.cast[List[String]]
y: Option[List[String]] = None
To see the set of cases that it handles you can refer to its source:
https://github.com/milessabin/shapeless/blob/master/core/src/main/scala/shapeless/typeable.scala
Yes, the problem occurs due to type erasure. If you try
val x = List(1,2,3)
val y = castToType[Int](x)
The exception is thrown right away, as expected. The same occurs when trying to cast to Array[String] or even Array[Int].
I don't think you can create a generic type converter that works with types inside collections and other objects. You will need to create a converter for each object type. For example:
def castToType[A](x: List[A]) = x.map(i => i.asInstanceOf[A])
Consider this solution:
trait -->[A, B] {
  def ->(a: A): B
}

implicit val StringToInt = new -->[String, Int] {
  def ->(a: String): Int = a.toInt
}

implicit val DateToLong = new -->[java.util.Date, Long] {
  def ->(a: java.util.Date): Long = a.getTime
}

def cast[A, B](t: A)(implicit ev: A --> B): B = ev.->(t)
The advantage is that:
It is type safe - the compiler will tell you if the type cannot be cast
You can define casting rules by providing proper implicits
Now you can use it so:
scala> cast(new java.util.Date())
res9: Long = 1361195427192
scala> cast("123")
res10: Int = 123
EDIT
I've spent some time and wrote this advanced code. First let me show how to use it:
scala> "2012-01-24".as[java.util.Date]
res8: java.util.Date = Tue Jan 24 00:00:00 CET 2012
scala> "2012".as[Int]
res9: Int = 2012
scala> "2012.123".as[Double]
res12: Double = 2012.123
scala> "2012".as[Object] // this is not working, because I did not provide a caster to Object
<console>:17: error: could not find implicit value for parameter $greater: -->[String,Object]
"2012".as[Object]
^
Pretty nice? See the Scala magic:
trait -->[A, B] {
  def ->(a: A): B
}

implicit val StringToInt = new -->[String, Int] {
  def ->(a: String): Int = a.toInt
}

implicit val StringToDate = new -->[String, java.util.Date] {
  def ->(a: String): java.util.Date = (new java.text.SimpleDateFormat("yyyy-MM-dd")).parse(a)
}

implicit val StringToDouble = new -->[String, Double] {
  def ->(a: String): Double = a.toDouble
}

trait AsOps[A] {
  def as[B](implicit > : A --> B): B
}

implicit def asOps[A](a: A) = new AsOps[A] {
  def as[B](implicit > : A --> B) = >.->(a)
}
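Extending the pattern is just a matter of supplying another implicit instance; for example (a hypothetical extra rule of my own, shown as a self-contained snippet):

```scala
trait -->[A, B] {
  def ->(a: A): B
}

// A new casting rule is simply another implicit instance of the typeclass.
implicit val IntToString: Int --> String = new (Int --> String) {
  def ->(a: Int): String = a.toString
}

def cast[A, B](t: A)(implicit ev: A --> B): B = ev.->(t)

cast(42)  // B is resolved to String from the single available instance
```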

Is it possible for an optional argument value to depend on another argument in Scala

Does anyone know if something like this is possible in Scala:
case class Thing(property:String)
def f(thing:Thing, prop:String = thing.property) = println(prop)
The above code doesn't compile, giving the error: not found: value thing at thing.property
The following shows the expected behaviour:
f(Thing("abc"), "123") // prints "123"
f(Thing("abc")) // prints "abc"
I realise I could make the prop argument an Option[String] and do the check in the function definition, but I was wondering if there was a way around it with the new named/default argument support in 2.8.0.
Yes, it's possible in Scala 2.8. Here's a quote from the "Named and Default Arguments in Scala 2.8" design document:
Since the scope of a parameter extends over all subsequent parameter lists (and the method body), default expressions can depend on parameters of preceding parameter lists (but not on other parameters in the same parameter list). Note that when using a default value which depends on earlier parameters, the actual arguments are used, not the default arguments.
def f(a: Int = 0)(b: Int = a + 1) = b // OK
And another example:
def f[T](a: Int = 1)(b: T = a + 1)(c: T = b)
// generates:
// def f$default$1[T]: Int = 1
// def f$default$2[T](a: Int): Int = a + 1
// def f$default$3[T](a: Int)(b: T): T = b
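The quoted rule about actual arguments can be checked directly:

```scala
// b's default expression refers to a; it always sees the value actually
// passed for a, not a's own default.
def f(a: Int = 0)(b: Int = a + 1): Int = b

assert(f()() == 1)     // a = 0 (default), so b defaults to 0 + 1
assert(f(10)() == 11)  // a = 10 (actual), so b defaults to 10 + 1
```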
According to this, your code may look as follows:
scala> case class Thing(property:String)
defined class Thing
scala> def f(thing:Thing)(prop:String = thing.property) = println(prop)
f: (thing: Thing)(prop: String)Unit
scala> f(Thing("abc"))("123")
123
scala> f(Thing("abc"))()
abc
Another simple solution is just to overload the method:
case class Thing (property: String)
def f(thing: Thing, prop: String) = println(prop)
def f(thing: Thing) = f(thing, thing.property)
This is exactly what Option is for. You can use the getOrElse method before the method call or inside the method f.
scala> val abc = Some("abc")
abc: Some[java.lang.String] = Some(abc)
scala> val none: Option[String] = None
none: Option[String] = None
scala> println(abc getOrElse "123")
abc
scala> println(none getOrElse "123")
123
scala> def f(o: Option[String]) = println(o getOrElse "123")
f: (o: Option[String])Unit
scala> f(abc)
abc
scala> f(none)
123
Oh here is something you could do via default arguments:
scala> case class Thing(property: String = "123")
defined class Thing
scala> def f(t: Thing) = println(t.property)
f: (t: Thing)Unit
scala> f(Thing("abc"))
abc
scala> f(Thing())
123
The expected behaviour can be achieved with simple overloading. I needed to put the methods in an object because the REPL does not seem to allow overloaded functions to be declared directly:
scala> object O {
def overloaded(t:Thing) = println(t.property)
def overloaded(t:Thing,s:String) = println(s)
}
defined module O
scala> O.overloaded(Thing("abc"), "123")
123
scala> O.overloaded(Thing("abc"))
abc