I'm trying to build a DenseMatrix of Vectors in breeze. However I keep getting the error message:
could not find implicit value for evidence parameter of type breeze.storage.Zero[breeze.linalg.DenseVector[Double]]
for the line:
val som: DenseMatrix[DenseVector[Double]] = DenseMatrix.tabulate(5, 5){ (i, j) => DenseVector.rand(20)}
Even though doing something similar with a Scala Array works fine:
val som = Array.tabulate(5, 5)((i, j) => DenseVector.rand(20))
I'm not sure what I'm doing wrong or what I'm missing. To be honest, I don't understand what the error message is telling me; I don't do enough Scala programming to parse it. What even is an evidence parameter, and can I specify it explicitly, or do I need an implicit?
This is because DenseMatrix.tabulate[V] first fills the matrix with zeros, so there has to be an instance of the type class Zero for V, i.e. in our case for DenseVector[Double]. You can define it yourself, e.g.:
implicit def denseVectorZero[V: Zero : ClassTag]: Zero[DenseVector[V]] =
  new Zero(DenseVector.zeros[V](0))
i.e. if we know Zero for V, then we know Zero for DenseVector[V].
Or, even simpler:
implicit def ev[V: ClassTag]: Zero[DenseVector[V]] = new Zero(DenseVector(Array.empty[V]))
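To see why the evidence is needed, here is a stdlib-only sketch of the mechanism (all names hypothetical, not Breeze's actual API): this tabulate conceptually pre-fills its result with zeros, so it demands Zero evidence for the element type, and a derived instance makes nested element types work.

```scala
// Hypothetical analogue of Breeze's Zero type class.
case class Zero[T](zero: T)

// Like DenseMatrix.tabulate: the result is pre-filled with zeros before f
// runs, which is why the Zero[V] evidence parameter is required.
def tabulate[V](rows: Int, cols: Int)(f: (Int, Int) => V)(implicit z: Zero[V]): Vector[Vector[V]] = {
  val zeroed = Vector.fill(rows, cols)(z.zero)
  zeroed.zipWithIndex.map { case (row, i) =>
    row.zipWithIndex.map { case (_, j) => f(i, j) }
  }
}

implicit val doubleZero: Zero[Double] = Zero(0.0)
// If we know Zero for V, we know Zero for Vector[V].
implicit def vectorZero[V](implicit z: Zero[V]): Zero[Vector[V]] = Zero(Vector.empty[V])

// Without vectorZero in scope, this call fails with the same
// "could not find implicit value for evidence parameter" error.
val m = tabulate(2, 2)((_, _) => Vector.fill(3)(1.0))
```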
So here's the problem I keep running into in various situations with Scala: it seemingly ignores the implied type, even when the situation is clear. Granted, this could be my understanding, I admit, but when it comes to underscore placeholders I keep running into trouble. For example, in the code below (which is fictional, just to prove the point), the second type parameter of trait X has to be some kind of <: X[_, _]. There's no ambiguity here, so anywhere Scala sees this position, regardless of how weakly it's typed, the contract is X, and I should have access to methods like doesX. Isn't that indisputable? No matter how poorly I deal with that position in the code, I must at least get an X. Why does Scala constantly ignore this fact when you get deep into the type system? Any pointers would be appreciated, thank you!
object TestRun extends App {
  trait X[T, Y <: X[_, _]] {
    def doesX: Unit
    def providesY: Y
  }

  class Test extends X[Int, Test] {
    override def doesX: Unit = println("etc..")
    def providesY: Test = new Test
  }

  // yes, I know I could give `a` a better type here, it's just a demo;
  // I shouldn't have to explicitly re-annotate the 2nd _ as _ <: X[_, _ <: X[... etc.
  val a: X[_, _] = new Test
  val b = a.providesY // clearly this has to be at least a (something) with X, but Scala claims it is "Any"
  b.doesX // error: won't compile!!
}
When you write:
val a: X[_, _] = new Test
//        ^ this wildcard is treated as if the type parameter were Any, for the most part
You are telling the compiler that a is an X, where you don't care what its type parameters are. That is, the unbounded wildcard _ is assumed to have an upper-bound of Any, and that's it.
providesY uses the second type parameter of X to determine its return type, but for a the compiler was told to discard it. So b is just an Any. This is easier to see using the REPL:
scala> val a: X[_, _] = new Test
a: X[_, _] = Test@27abe2cd

scala> val b = a.providesY
b: Any = Test@f5f2bb7
Therefore, b.doesX fails to compile because the compiler now thinks b is Any. The simple solution is not to use wildcards for types (or existential types in general; most of the time you do not want them):
scala> val a: X[Int, Test] = new Test
a: X[Int,Test] = Test@1134affc

scala> val b = a.providesY
b: Test = Test@6fc6f14e

scala> b.doesX
etc..
Or you could simply leave off the type annotation, and let the compiler infer the correct type.
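One middle ground, if you do want to keep a partially abstract, is to bound the second wildcard so the compiler retains the X contract for providesY. A sketch (repeating the trait and class from the question):

```scala
trait X[T, Y <: X[_, _]] {
  def doesX: Unit
  def providesY: Y
}

class Test extends X[Int, Test] {
  override def doesX: Unit = println("etc..")
  def providesY: Test = new Test
}

// The second wildcard is bounded, so providesY is known to return some X.
val a: X[_, _ <: X[_, _]] = new Test
val b = a.providesY
b.doesX // compiles: b's type is an unknown subtype of X[_, _]
```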
HMap seems to be the perfect data structure for my use case; however, I can't get it working:
case class Node[N](node: N)
class ImplVal[K, V]
implicit val iv1 = new ImplVal[Int, Node[Int]]
implicit val iv2 = new ImplVal[Int, Node[String]]
implicit val iv3 = new ImplVal[String, Node[Int]]
val hm = HMap[ImplVal](1 -> Node(1), 2 -> Node("two"), "three" -> Node(3))
My first question is whether it is possible to create those implicit vals automatically. For typical combinations I could of course create them manually, but I'm wondering whether there is a more generic, less boilerplate-heavy way.
Next question is, how to get values out of the map:
val res1 = hm.get(1) // (1) ambiguous implicit values: both value iv2 [...] and value iv1 [...] match expected type ImplVal[Int,V]
To me, Node[Int] (iv1) and Node[String] (iv2) look pretty different :) I thought that, despite the JVM's type-erasure limitations, Scala could differentiate here. What am I missing? Do I have to use other implicit values to make the difference clear?
The explicit version works:
val res2 = hm.get[Int, Node[Int]](1) // (2) works
Of course, in this simple case, I could add the type information to the get call. But in the following case, where only the keys are known in advance, I don't know how to do it:
def get[T <: HList](keys: T): HList = // return associated values for keys
Is there any simple solution to this problem?
BTW, what documentation on Scala's type system (or Shapeless, or functional programming in general) would you recommend for understanding this whole topic better? I have to admit I'm lacking some background here.
The type of the key determines the type of the value. You have Int keys corresponding to both Node[Int] and Node[String] values, hence the ambiguity. You might find this article helpful in explaining the general mechanism underlying this.
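The underlying issue can be reproduced without Shapeless. In this stdlib-only sketch (hypothetical names), the value type V must be determined from the key type K alone via implicit lookup, and two instances for the same key type make that lookup ambiguous:

```scala
// A hypothetical relation type class, mimicking HMap's key-to-value constraint.
class Rel[K, V]

// Like HMap.get: V is found purely from K via the implicit Rel[K, V].
def lookup[K, V](key: K)(implicit ev: Rel[K, V]): String = s"found $key"

implicit val r1: Rel[Int, String] = new Rel
implicit val r2: Rel[Int, Double] = new Rel // a second value type for Int keys

// lookup(1) // ambiguous implicit values: both r1 and r2 match Rel[Int, V]
val ok = lookup[Int, String](1) // compiles once V is pinned down explicitly
```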
Consider this:
val oddOrEven = (odd, even)
oddOrEven._1 would give "odd", while oddOrEven._2 would give "even".
We basically get a tuple with "unnamed" members, if you will.
But let's assume I wanted to get either odd or even, depending on some external data, like so:
val witness: Int = {numberOfPrevious % 2}
now, let's do this:
val one = oddOrEven._witness
This won't compile.
Is there some special syntax involved or is this simply not possible?
I got curious and wondered whether it was the compiler that could not deduce that the only possible values of witness are 0 and 1 (I suspected this was silly on my part, yet I had to try), so I tried this:
val oddOrEven = (odd, even)
val witness: Int = {numberOfPrevs % 2}
val v = witness match {
  case 0 => oddOrEven._1
  case 1 => oddOrEven._2
}
Yet again val one = oddOrEven._witness would not work
Then I dug deeper and found that the compiler indeed does not check such matches for exhaustiveness. For example:
val v = witness match {
  case 1 => oddOrEven._1
  case 2 => oddOrEven._2
}
would still compile, although 2 is not a possible value and 0 is missing!
So, I know I am mixing things up here. I am aware that there are matches that are not "exhaustive", so the possible values are deduced not at compile time but at runtime (and indeed I would get a
scala.MatchError: 0 (of class java.lang.Integer)
at runtime.
But, what I'm really interested in: Can I get "unnamed" tuples by an "indirect index" like I mean to?
What about keeping it simple, like this:
val oddOrEven = (odd, even)
val witness: Int = {numberOfPrevious % 2} // Or any other calculation of index
oddOrEven.productElement(witness)
You lose type safety because productElement returns Any, but if you know the type of the members you can cast:
oddOrEven.productElement(witness).asInstanceOf[YourType]
I'm assuming here that your odd and even values are of the same type, e.g.:
sealed trait OddOrEven
case object odd extends OddOrEven
case object even extends OddOrEven
Then:
oddOrEven.productElement(witness).asInstanceOf[OddOrEven]
will give you correct type OddOrEven.
Btw., take a peek at the ScalaDoc for Tuple2.
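For completeness, here is the approach above as a runnable sketch; the odd/even values and numberOfPrevious are placeholders of my own:

```scala
sealed trait OddOrEven
case object Odd extends OddOrEven
case object Even extends OddOrEven

// Placeholder stand-ins for the question's values.
val oddOrEven: (OddOrEven, OddOrEven) = (Odd, Even)

val numberOfPrevious = 7                // hypothetical external data
val witness: Int = numberOfPrevious % 2 // 1

// productElement is 0-based and returns Any, hence the cast.
val picked = oddOrEven.productElement(witness).asInstanceOf[OddOrEven]
// picked == Even
```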
You should probably not do that. If you need an index, think about a List or a Vector.
But if you really want this and all of your tuple items are of the same type (for example Int) you could do:
(even, odd).productIterator.collect { case x: Int => 2 * x }.toList(0)
Well, you can do something like tuple.productIterator.toList(index - 1), but if you want a list, it's probably a better idea to just use a list rather than converting a tuple to one.
And no, the compiler doesn't try to figure out all the possible ways your code could execute in order to tell what values a variable could take.
I've looked around and found several other examples of this, but I don't really understand from those answers what's actually going on.
I'd like to understand why the following code fails to compile:
val df = readFiles(sqlContext).
withColumn("timestamp", udf(UDFs.parseDate _)($"timestamp"))
Giving the error:
Error:(29, 58) not enough arguments for method udf: (implicit evidence$2: reflect.runtime.universe.TypeTag[java.sql.Date], implicit evidence$3: reflect.runtime.universe.TypeTag[String])org.apache.spark.sql.UserDefinedFunction.
Unspecified value parameter evidence$3.
withColumn("timestamp", udf(UDFs.parseDate _)($"timestamp")).
^
Whereas this code does compile:
val parseDate = udf(UDFs.parseDate _)
val df = readFiles(sqlContext).
withColumn("timestamp", parseDate($"timestamp"))
Obviously I've found a "workaround", but I'd really like to understand:
What this error really means. The info I have found on TypeTags and ClassTags has been really difficult to understand; I don't come from a Java background, which perhaps doesn't help, but I think I should be able to grasp it…
Whether I can achieve what I want without a separate function definition.
The error message is indeed a bit misleading; the reason for it is that the function udf takes an implicit parameter list, but you are passing an actual parameter to it. Since I don't know much about Spark, and since the udf signature is a bit convoluted, I'll explain what is going on with a simplified example.
In practice, udf is a function that, given some explicit parameters and an implicit parameter list, gives you another function. Let's define the following function: given a pivot of type T for which we have an implicit Ordering, it returns a function that splits a sequence in two, one part containing the elements smaller than the pivot and the other containing the bigger ones:
def buildFn[T](pivot: T)(implicit ev: Ordering[T]): Seq[T] => (Seq[T], Seq[T]) = ???
Let's leave out the implementation as it's not important. Now, if I do the following:
val elements: Seq[Int] = ???
val (small, big) = buildFn(10)(elements)
I make the same kind of mistake as in your code: the compiler thinks I am explicitly passing elements as the implicit parameter list, and this won't compile. The error message in my example differs from yours because here the number of parameters I mistakenly pass matches the expected arity of the implicit list, so the error is about the types not lining up.
Instead, if I write it as:
val elements: Seq[Int] = ???
val fn = buildFn(10)
val (small, big) = fn(elements)
In this case the compiler correctly fills in the implicit parameter. I don't know of a way to circumvent this problem, unless you want to pass the implicit parameters explicitly, which I find ugly and not always practical; for reference, this is what I mean:
val elements: Seq[Int] = ???
val (small, big) = buildFn(10)(implicitly[Ordering[Int]])(elements)
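For reference, here is the simplified example as a runnable sketch; the partition-based body of buildFn is my own assumption, since the explanation above leaves the implementation out:

```scala
// A possible implementation of buildFn (the original leaves the body out).
def buildFn[T](pivot: T)(implicit ev: Ordering[T]): Seq[T] => (Seq[T], Seq[T]) =
  seq => seq.partition(ev.lt(_, pivot))

val elements: Seq[Int] = Seq(3, 14, 7, 10, 25)

// val bad = buildFn(10)(elements) // won't compile: elements lands in the implicit list

val fn = buildFn(10)            // the implicit Ordering[Int] is resolved here
val (small, big) = fn(elements) // (Seq(3, 7), Seq(14, 10, 25))

// Passing the implicit explicitly also works, though it's ugly:
val (small2, big2) = buildFn(10)(implicitly[Ordering[Int]])(elements)
```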
Is there a way to prohibit a parameterized type being parameterized by a specific type?
E.g. suppose I want to create my own specialized List[T] type where I do not want List[Nothing] to be legal, i.e. it should cause a compile error.
I'm looking for a way to make the following error more easy to catch (yes, I understand this is not very functional or great Scala):
import scala.collection.mutable.ListBuffer

val x = ListBuffer()
x += 2
x has type ListBuffer[Nothing].
This sort of works:
class C[A](x: A*)(implicit ev: A =:= A) { }
There will be a type error if A = Nothing is inferred,
val c1 = new C[Int]() // Ok
val c2 = new C(1) // Ok, type `A = Int` inferred
val c3 = new C() // Type error, found (Nothing =:= Nothing) required: (A =:= A)
But it's still possible to explicitly set the type parameter A to Nothing,
val c4 = new C[Nothing]() // Ok
More generally, it's pretty tricky to ensure that two types are unequal in Scala. See previous questions here and here. One approach is to set up a situation where equal types would lead to ambiguous implicits.
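The ambiguous-implicits approach can be sketched like this (all names are mine, not from any library): two equally specific instances for Nothing make the evidence unresolvable exactly when A = Nothing, whether inferred or written explicitly:

```scala
sealed trait NotNothing[A]
object NotNothing {
  implicit def any[A]: NotNothing[A] = new NotNothing[A] {}
  // Two competing instances make NotNothing[Nothing] ambiguous, hence unusable.
  implicit def nothing1: NotNothing[Nothing] = new NotNothing[Nothing] {}
  implicit def nothing2: NotNothing[Nothing] = new NotNothing[Nothing] {}
}

class C[A](xs: A*)(implicit ev: NotNothing[A]) {
  val size: Int = xs.length
}

val ok = new C(1, 2, 3) // A = Int inferred, any[Int] found
// new C()              // ambiguous implicits: A = Nothing is inferred
// new C[Nothing]()     // ambiguous even with A written explicitly
```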
You can give A the lower bound Null (A >: Null) if you specifically want to rule out Nothing, since Null sits just above Nothing at the bottom of the type hierarchy. Note that this also rules out the value types (Int, Double, etc.), because Null is only a subtype of reference types.
Not sure how useful this is, as the bound still includes Null itself.
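A minimal sketch of that bound (my own example):

```scala
// The lower bound Null rules out Nothing (and, as a side effect, value types).
class D[A >: Null](xs: A*) {
  val size: Int = xs.length
}

val ok = new D("a", "b") // A = String, a reference type
// new D[Nothing]()      // won't compile: Nothing is below Null
// new D[Int](1)         // won't compile either: Int is a value type
```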