How do I implement a generic mathematical function in Scala?

I'm just getting started with Scala and something which I think should be easy is hard to figure out. I am trying to implement the following function:
def square(x:Int):Int = { x * x }
This works just fine, but if I want to make this function work for any kind of number, I would like to be able to do the following:
def square[T <: Number](x : T):T = { x * x }
This complains and says: error: value * is not a member of type parameter T
Do I need to implement a trait for this?

That was one of my first questions on Stack Overflow, and one of my first about Scala. The problem is that Scala maintains compatibility with Java, and that means its basic numeric types are equivalent to Java's primitives.
The problem arises in that Java primitives are not classes, and, therefore, do not have a class hierarchy which would allow a "numeric" supertype.
To put it more plainly, Java, and therefore Scala, does not see any common ground between a Double's + and an Int's +.
The way Scala finally got around this restriction was by using Numeric, and its subclasses Fractional and Integral, in the so-called typeclass pattern. Basically, you use it like this:
def square[T](x: T)(implicit num: Numeric[T]): T = {
  import num._
  x * x
}
Or, if you do not need any of the numeric operations but the methods you call do, you can use the context bound syntax for type declaration:
def numberAndSquare[T : Numeric](x: T) = x -> square(x)
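As a quick usage sketch (assuming the definitions above and the standard Numeric instances that ship with Scala), the compiler supplies the right instance at each call site:
square(3)            // 9, using Numeric[Int]
square(2.5)          // 6.25, using Numeric[Double]
square(BigInt(12))   // BigInt(144), using Numeric[BigInt]
numberAndSquare(4)   // (4, 16)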
For more information, see the answers to my own question.

You can define square as:
def square[T: Numeric](x: T): T = implicitly[Numeric[T]].times(x,x)
This approach has the advantage that it will work for any type T for which an implicit Numeric[T] instance is available (i.e. Int, Float, Double, Char, BigInt, ..., or any type for which you supply your own instance).
Edit:
Unfortunately, you'll run into trouble if you try something like List(1,2,3).map(square) (specifically, you'll get a compile error like "could not find implicit value for evidence parameter of type Numeric[T]"). To avoid this issue, you can overload square to return a function:
object MyMath {
  def square[T: Numeric](x: T) = implicitly[Numeric[T]].times(x, x)
  def square[T: Numeric]: T => T = square(_)
}
Hopefully someone with a better understanding of the type inferencer will explain why that is.
Alternatively, one can call List(1,2,3).map(square(_)), as Derek Williams pointed out in the scala-user mailing list thread.
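To make that concrete, here is a hedged usage sketch of the MyMath object above (results shown in comments; exact inference behaviour can vary across Scala 2 versions):
List(1, 2, 3).map(MyMath.square)           // List(1, 4, 9): the parameterless overload supplies an Int => Int
List(1, 2, 3).map(x => MyMath.square(x))   // List(1, 4, 9): an explicit lambda works with or without the overload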

Related

Shapeless Witness and how it can give the actual singleton type

I'm trying to understand singleton types in shapeless and I'm confused about the compile-time type of a singleton. Here is an example:
val x: Witness.`120`.T = 120.narrow
It works fine, but this construction looks very unusual. What is Witness.`120`? In the IDE it points to some macro function selectDynamic:
def selectDynamic(tpeSelector: String): Any = macro SingletonTypeMacros.witnessTypeImpl
which has the compile-time type Any and, judging by the construction Witness.`120`.T, a type member T. This looks like magic... Can anyone explain what is actually going on when one writes something like:
val x: Witness.`120`.T = //...
Witness creates a so-called literal-based singleton type. A literal type is a type that has exactly one value.
So if you create a function like this:
def f(x: Witness.`120`.T) = x
it would accept only integer 120, but not 121.
Since Scala 2.13, literal types are integrated into the language, so you can write it simply as:
def f(x: 120) = x
The narrow method narrows the type of the value 120 from the general Int to the literal type 120.
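Here is a hedged sketch putting both flavours side by side (the shapeless import shown is the usual one for narrow; details may vary by shapeless version):
import shapeless.Witness
import shapeless.syntax.singleton._   // provides the narrow method

val a: Witness.`120`.T = 120.narrow   // a has the literal type 120, not plain Int

// With Scala 2.13+ literal types, no shapeless is needed:
def g(x: 120) = x
g(120)     // compiles
// g(121)  // does not compile: 121 does not conform to the literal type 120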

How to have generic numeric type in scala for addition and multiplication?

I'm trying to implement generic numbers in Scala that support addition and multiplication, can pattern match to any number, and aren't restricted to any one type like Int, Double, etc. I looked up the docs and found that java.lang.Number fits the last criterion, i.e., any number pattern matched against java.lang.Number passes. So I quickly wrote up this implementation:
case class Number(num: java.lang.Number) {
  def +(that: Number) = Number(this.num + that.num)
  def *(that: Number) = Number(this.num * that.num)
}
However, as it turns out, java.lang.Number does not have the methods + and *. So I'm not sure how to implement this now. A professor suggested looking into typeclasses and the spire library, but I'm still having trouble.
Ideally, what I would like to have would be something like this
Number[A] + Number[A] returns Number[A]
Number[A] + Number[B] returns Number[finest type containing both A and B]
I'd be much obliged if someone could help me out with this. Thanks. :)
Scala has a typeclass for this called Numeric with implementations for the usual JVM number types.
Note: it doesn't satisfy the Number[A] + Number[B] scenario.
Examples below.
If you want to define your own number type, e.g.
case class RationalNumber(numerator: Int, denominator: Int)
Then you would also implement a Numeric instance for it
object RationalNumeric extends Numeric[RationalNumber] {
  // implement the abstract methods
}
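Filling in that stub, a hedged sketch of such an instance might look like this (fractions are not simplified, the object is declared implicit so the examples below can find it, and on Scala 2.13+ parseString is also abstract):
implicit object RationalNumeric extends Numeric[RationalNumber] {
  def plus(x: RationalNumber, y: RationalNumber) =
    RationalNumber(x.numerator * y.denominator + y.numerator * x.denominator,
                   x.denominator * y.denominator)
  def minus(x: RationalNumber, y: RationalNumber) = plus(x, negate(y))
  def times(x: RationalNumber, y: RationalNumber) =
    RationalNumber(x.numerator * y.numerator, x.denominator * y.denominator)
  def negate(x: RationalNumber) = RationalNumber(-x.numerator, x.denominator)
  def fromInt(x: Int) = RationalNumber(x, 1)
  def toInt(x: RationalNumber) = x.numerator / x.denominator
  def toLong(x: RationalNumber) = (x.numerator / x.denominator).toLong
  def toFloat(x: RationalNumber) = x.numerator.toFloat / x.denominator
  def toDouble(x: RationalNumber) = x.numerator.toDouble / x.denominator
  def compare(x: RationalNumber, y: RationalNumber) =
    (x.numerator.toLong * y.denominator) compare (y.numerator.toLong * x.denominator)
  // Required from Scala 2.13 on (a harmless extra method on earlier versions):
  def parseString(str: String): Option[RationalNumber] = None
}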
Some Scala library methods (e.g. List's sum: def sum[B >: A](implicit num: Numeric[B]): B) take an implicit Numeric instance, so if you bring an implicit RationalNumeric into scope, you could do something like:
List(RationalNumber(1,2), RationalNumber(2,3)).sum
Also, Numeric defines an implicit conversion (mkNumericOps) that adds the usual operators to the type, so you could do this:
import RationalNumeric._
val sum = RationalNumber(1,2) + RationalNumber(2,3)

Implicit conversion paradox

If I try to define an implicit conversion for a primitive type, then it doesn't seem to work. E.g.:
implicit def globalIntToString(a: Int) : String = { a.toString() + "globalhi" }
1.toInt + "hi"
The above will still return simply "1hi" as the result.
However, if I parameterize a class or a def and then pass in the implicit for the parameterized case, it seems to work. Does anyone know what the reason is? E.g. does this have something to do with boxing/unboxing of primitives (e.g., parameterized primitives are boxed)? Does implicit only work with reference types and not primitive types?
class typeConv[T] { implicit def tToStr(a: T) : String = { a.toString() + "hi" } }
class t[K](a: K)(tc : typeConv[K]) { import tc._; println(a + "cool"); println(1.toInt + "cool" ) }
new t(1)(new typeConv[Int])
There are a few subtle things going on here. #yan has already explained the core issue - I'll try to add some more specific information.
As noted, 1.toInt + "hi" will never use any implicit conversion because the Scala Int class actually has a + method that takes a String parameter. The compiler will look for an implicit view only when it can't find a matching member on the original type.
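A tiny hedged illustration of that rule (greet is a made-up method, not from any library):
implicit class IntGreeting(a: Int) {
  def greet: String = a.toString + " says hi"
}

1.greet          // "1 says hi": Int has no greet member, so the implicit view is applied
1.toInt + "hi"   // "1hi": Int already has a +(String) method, so no view is even searched for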
A little more complicated stuff is happening inside your t class. Scalac will look for an implicit conversion from the generic type K to any type that has a + method taking a String parameter. There will be two candidates for such a conversion: your own tc.tToStr and Scala's built-in scala.Predef.any2stringadd.
Normally, any2stringadd would be used, but in your example, your own conversion is used. Why does it have precedence over any2stringadd?
During implicit search, tc.tToStr is seen as a function of type K => String, while any2stringadd is seen as a function of type Any => StringAdd. My guess is that K => String is seen by the compiler as a more specific conversion than Any => StringAdd, but someone would have to confirm it with a proper reference to Scala Language Specification.
As you can see, defining such conversions can cause a lot of strange behaviour. I'd definitely say that introducing an implicit conversion to String is asking for trouble.
This happens because Scala defines a + operator on the Int type that takes a String, so it does not need to resolve an implicit conversion. Also, converting to String is usually a bad idea; you'd generally define a custom, one-off type with the methods you're trying to add.

Spurious ambiguous reference error in Scala 2.7.7 compiler/interpreter?

Can anyone explain the compile error below? Interestingly, if I change the return type of the get() method to String, the code compiles just fine. Note that the thenReturn method has two overloads: a unary method and a varargs method that takes at least one argument. It seems to me that if the invocation is ambiguous here, then it would always be ambiguous.
More importantly, is there any way to resolve the ambiguity?
import org.scalatest.mock.MockitoSugar
import org.mockito.Mockito._

trait Thing {
  def get(): java.lang.Object
}

new MockitoSugar {
  val t = mock[Thing]
  when(t.get()).thenReturn("a")
}
error: ambiguous reference to overloaded definition,
both method thenReturn in trait OngoingStubbing of type
(java.lang.Object,java.lang.Object*)org.mockito.stubbing.OngoingStubbing[java.lang.Object]
and method thenReturn in trait OngoingStubbing of type
(java.lang.Object)org.mockito.stubbing.OngoingStubbing[java.lang.Object]
match argument types (java.lang.String)
when(t.get()).thenReturn("a")
Well, it is ambiguous. I suppose Java semantics allow for it, and it might merit a ticket asking for Java semantics to be applied in Scala.
The source of the ambiguity is this: a vararg parameter may receive any number of arguments, including 0. So, when you write thenReturn("a"), do you mean to call the thenReturn which receives a single argument, or the thenReturn that receives one object plus a vararg, passing 0 arguments to the vararg?
Now, when this kind of thing happens, Scala tries to find which method is "more specific". Anyone interested in the details should look them up in Scala's specification, but here is the explanation of what happens in this particular case:
object t {
  def f(x: AnyRef) = 1               // A
  def f(x: AnyRef, xs: AnyRef*) = 2  // B
}
if you call f("foo"), both A and B
are applicable. Which one is more
specific?
it is possible to call B with parameters of type (AnyRef), so A is
as specific as B.
it is possible to call A with parameters of type (AnyRef,
Seq[AnyRef]) thanks to tuple
conversion, Tuple2[AnyRef,
Seq[AnyRef]] conforms to AnyRef. So
B is as specific as A. Since both are
as specific as the other, the
reference to f is ambiguous.
As to the "tuple conversion" thing, it is one of the most obscure syntactic sugars of Scala. If you make a call f(a, b), where a and b have types A and B, and there is no f accepting (A, B) but there is an f which accepts (Tuple2(A, B)), then the parameters (a, b) will be converted into a tuple.
For example:
scala> def f(t: Tuple2[Int, Int]) = t._1 + t._2
f: (t: (Int, Int))Int
scala> f(1,2)
res0: Int = 3
Now, there is no tuple conversion going on when thenReturn("a") is called. That is not the problem. The problem is that, given that tuple conversion is possible, neither version of thenReturn is more specific, because any parameter passed to one could be passed to the other as well.
In the specific case of Mockito, it's possible to use the alternate API methods designed for use with void methods:
doReturn("a").when(t).get()
Clunky, but it'll have to do, as Martin et al don't seem likely to compromise Scala in order to support Java's varargs.
Well, I figured out how to resolve the ambiguity (seems kind of obvious in retrospect):
when(t.get()).thenReturn("a", Array[Object](): _*)
As Andreas noted, if the ambiguous method requires a null reference rather than an empty array, you can use something like
v.overloadedMethod(arg0, null.asInstanceOf[Array[Object]]: _*)
to resolve the ambiguity.
If you look at the standard library APIs you'll see this issue handled like this:
def meth(t1: Thing): OtherThing = { ... }
def meth(t1: Thing, t2: Thing, ts: Thing*): OtherThing = { ... }
By doing this, no call (with at least one Thing parameter) is ambiguous, and you avoid extra fluff like Array[Thing](): _*.
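A hedged sketch of how calls resolve with that shape (Thing and OtherThing stand in for whatever types the real API uses):
trait Thing
trait OtherThing

object Api {
  def meth(t1: Thing): OtherThing = ???
  def meth(t1: Thing, t2: Thing, ts: Thing*): OtherThing = ???
}

// Api.meth(a)        resolves to the single-argument overload
// Api.meth(a, b)     resolves to the vararg overload (with zero vararg arguments)
// Api.meth(a, b, c)  resolves to the vararg overload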
I had a similar problem using Oval (oval.sf.net) trying to call its validate() method.
Oval defines two validate() methods:
public List<ConstraintViolation> validate(final Object validatedObject)
public List<ConstraintViolation> validate(final Object validatedObject, final String... profiles)
Trying this from Scala:
validator.validate(value)
produces the following compiler error:
both method validate in class Validator of type (x$1: Any,x$2: <repeated...>[java.lang.String])java.util.List[net.sf.oval.ConstraintViolation]
and method validate in class Validator of type (x$1: Any)java.util.List[net.sf.oval.ConstraintViolation]
match argument types (T)
var violations = validator.validate(entity);
Oval needs the varargs parameter to be null, not an empty array, so I finally got it to work with this:
validator.validate(value, null.asInstanceOf[Array[String]]: _*)

Could/should an implicit conversion from T to Option[T] be added/created in Scala?

Is this an opportunity to make things a bit more efficient (for the programmer)? I find it gets a bit tiresome having to wrap things in Some, e.g. Some(5). What about something like this:
implicit def T2OptionT[T](x: T): Option[T] = if (x == null) None else Some(x)
You would lose some type safety and possibly cause confusion.
For example:
val iThinkThisIsAList = 2
for (i <- iThinkThisIsAList) yield { i + 1 }
I (for whatever reason) thought I had a list, and it didn't get caught by the compiler when I iterated over it because it was auto-converted to an Option[Int].
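Concretely (a sketch, assuming the T2OptionT conversion from the question is in scope), the compiler silently rewrites the loop instead of rejecting it:
for (i <- T2OptionT(iThinkThisIsAList)) yield { i + 1 }   // evaluates to Some(3), no compile error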
I should add that I think this is a great implicit to have explicitly imported, just probably not a global default.
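A hedged sketch of what "explicitly imported" could look like; every name here is made up for illustration:
import scala.language.implicitConversions

object OptionConversions {
  implicit def anyRefToOption[T <: AnyRef](x: T): Option[T] =
    if (x == null) None else Some(x)
}

def describe(o: Option[String]) = o.getOrElse("<none>")

locally {
  import OptionConversions._    // the conversion is only active inside this scope
  describe("hello")             // converted to Some("hello"), returns "hello"
  describe(null: String)        // converted to None, returns "<none>"
}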
Note that you could use the explicit implicit pattern which would avoid confusion and keep code terse at the same time.
What I mean by explicit implicit is rather than have a direct conversion from T to Option[T] you could have a conversion to a wrapper object which provides the means to do the conversion from T to Option[T].
class Optionable[T <: AnyRef](value: T) {
  def toOption: Option[T] = if (value == null) None else Some(value)
}

implicit def anyRefToOptionable[T <: AnyRef](value: T) = new Optionable(value)
... I might find a better name for it than Optionable, but now you can write code like:
val x: String = "foo"
x.toOption // Some("foo")
val y: String = null
y.toOption // None
I believe this way is fully transparent and aids the understanding of the written code, eliminating all null checks in a nice way.
Note the T <: AnyRef - you should only do this implicit conversion for types that allow null values, which by definition are reference types.
The general guidelines for implicit conversions are as follows:
1. When you need to add members to a type (a la "open classes"; aka the "pimp my library" pattern), convert to a new type which extends AnyRef and which only defines the members you need.
2. When you need to "correct" an inheritance hierarchy. Thus, you have some type A which should have subclassed B, but didn't for some reason. In that case, you can define an implicit conversion from A to B.
These are the only cases where it is appropriate to define an implicit conversion. Any other conversion runs into type safety and correctness issues in a hurry.
It really doesn't make any sense for T to extend Option[T], and obviously the purpose of the conversion is not simply the addition of members. Thus, such a conversion would be inadvisable.
It would seem that this could be confusing to other developers, as they read your code.
Generally, implicit conversions help cast from one object to another and cut out confusing casting code that can clutter things up, but if I have some variable and it somehow becomes a Some, that would seem to be bothersome.
You may want to put some code showing it being used, to see how confusing it would be.
You could also try to overload the method:
def having(key: String) = having(key, None)
def having(key: String, default: String) = having(key, Some(default))
def having(key: String, default: Option[String] = Option.empty): Create = {
  keys += ((key, default))
  this
}
That looks good to me, except it may not work for a primitive T (which can't be null). I guess a non-specialized generic always gets boxed primitives, so probably it's fine.