I am trying to build a map from an input map, but the compiler is unable to prove that a 2-element tuple is a 2-element tuple.
Code
class Element[T] extends AnyRef { }

class Sample {
  def makeList(x: Int): Element[_] = {
    x match {
      case 1 => new Element[Boolean]
      case 2 => new Element[(Boolean, Boolean)]
    }
  }

  val input = Map(1 -> "one", 2 -> "two")
  val output = input.map(e => e._1 -> makeList(e._1)).toMap
}
sbt compile
sbt:root> ~compile
[info] Compiling 1 Scala source to /Users/tda0106/test/scala/target/scala-2.12/classes ...
[error] /Users/tda0106/test/scala/src/main/scala/Test.scala:14:57: Cannot prove that (Int, Element[_$1]) forSome { type _$1 } <:< (T, U).
[error] val output = input.map(e => e._1 -> makeList(e._1)).toMap
[error] ^
[error] one error found
[error] (Compile / compileIncremental) Compilation failed
[error] Total time: 1 s, completed Jun 27, 2019, 2:38:14 PM
It appears that the problem is related to the forSome { type _$1 }, as otherwise the types should match. When I first tried to reproduce this, I used List instead of Element and it compiled. The difference seems to be that List is declared as List[+T], and the + (covariance) is important here.
Element is from a third party library, so changing it is difficult.
What is the problem that I am running into here, and is there a simple way to fix it?
Scala version: 2.12.8
Scala gets fickle about type inference sometimes when you're doing things with existentials (which is what Element[_] is). A quick explicit type signature will fix that right up.
val output = input.map(e => e._1 -> makeList(e._1)).toMap[Int, Element[_]]
All you're doing is telling the compiler what types you want for the keys and values. The reasons why it can't infer this are long and complicated, but as a general rule once you start throwing underscores in your types, you're going to lose some inference capabilities.
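Put together, a minimal compiling sketch of the fix (keeping the question's non-exhaustive match as-is):

class Element[T] extends AnyRef { }

class Sample {
  def makeList(x: Int): Element[_] = x match {
    case 1 => new Element[Boolean]
    case 2 => new Element[(Boolean, Boolean)]
  }

  val input = Map(1 -> "one", 2 -> "two")

  // Spelling out the key and value types lets toMap's evidence
  // parameter (A <:< (K, V)) resolve despite the existential.
  val output = input.map(e => e._1 -> makeList(e._1)).toMap[Int, Element[_]]
}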
Related
Suppose that I have this Spark code written in Scala 2.12
val dataset = spark.emptyDataset[String]
dataset.foreachPartition(partition => partition.foreach {
  entry: String => println(entry)
})
When I compile the code, the compiler gives this error:
[info] Compiling 1 Scala source to <path>/scala-2.12/classes ...
[error] Code.scala:11:52: value foreach is not a member of Object
[error] empty.foreachPartition( partition => partition.foreach{
[error] ^
[error] one error found
[error] (Compile / compileIncremental) Compilation failed
[error] Total time: 1 s, completed Jul 11, 2020 1:43:41 AM
Why did the compiler infer the type of partition as Object instead of Iterator[String]?
I have to add the partition type manually in order for the code to work.
val dataset = spark.emptyDataset[String]
dataset.foreachPartition((partition: Iterator[String]) => partition.foreach {
  entry: String => println(entry)
})
This is because of two overloaded versions of foreachPartition and Java-Scala interop.
If the code were only in Scala (this is minimal code and independent of Spark)
val dataset: Dataset[String] = ???
dataset.foreachPartition(partition => ???)

class Dataset[T] {
  def foreachPartition(f: Iterator[T] => Unit): Unit = ???
  def foreachPartition(func: ForeachPartitionFunction[T]): Unit = ???
}

trait ForeachPartitionFunction[T] extends Serializable {
  def call(t: Iterator[T]): Unit
}
then the type of partition would be inferred (as scala.collection.Iterator[String]).
But in the actual Spark code, ForeachPartitionFunction is a Java interface whose method call accepts a java.util.Iterator[String].
So both options
dataset.foreachPartition((
  (partition: scala.collection.Iterator[String]) => ???
): Iterator[String] => Unit)

dataset.foreachPartition((
  (partition: java.util.Iterator[String]) => ???
): ForeachPartitionFunction[String])
are eligible, so the compiler can't infer the type of partition.
Moreover, type inference in Scala is local: once the compiler has seen partition => partition.foreach... (and java.util.Iterator[String] doesn't have a foreach method), it's too late to go back and re-type partition.
As @Dmytro said, the Scala compiler can't infer which overloaded function it should apply. However, there is a simple workaround: use this helper function, which pins down the argument's type:
def helper[I](f: I => Unit): I => Unit = f
Now all you need to do is:
dataset.foreachPartition(helper[Iterator[String]] { partition =>
  partition.foreach(entry => println(entry))
})
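Because helper fixes I to Iterator[String], the argument already has the fully determined type Iterator[String] => Unit before overload resolution runs, so only the Scala overload of foreachPartition applies and partition needs no annotation.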
I am not sure how to get past this "No matching Shape found" error, apart from writing lots of boilerplate.
The basic idea illustrated in the Gist is that I have a very basic version of a method (works, but is very specific), then a version that takes the mapper parameter and is more generic (works too, but is specific to one particular type), and then a third version which takes a type parameter and would be very useful, but doesn't compile because of this error.
Basic method:
def updatePD_FirstNames(id: ids.PersonalDetailsId, firstNames: StringLtd30): Future[Int] = {
Better method:
def updatePD_SL(id: ids.PersonalDetailsId, mapper: tables.PersonalDetails => tables.profile.api.Rep[StringLtd30], sl: StringLtd30): Future[Int] = {
Ideal method (but doesn't compile):
def updatePD_X[X](id: ids.PersonalDetailsId, mapper: tables.PersonalDetails => tables.profile.api.Rep[X], sl: X): Future[Int] = {
```
[server] $ compile
[info] Compiling 1 Scala source to ... target\scala-2.12\classes...
[error] ...schema\DbProxy.scala:688: No matching Shape found.
[error] Slick does not know how to map the given types.
[error] Possible causes: T in Table[T] does not match your * projection,
[error] you use an unsupported type in a Query (e.g. scala List),
[error] or you forgot to import a driver api into scope.
[error] Required level: slick.lifted.FlatShapeLevel
[error] Source type: slick.lifted.Rep[X]
[error] Unpacked type: T
[error] Packed type: G
[error] val q2: Query[tables.profile.api.Rep[X], X, Seq] = q1.map(mapper)
[error] ^
[error] one error found
[error] (server/compile:compileIncremental) Compilation failed
[error] Total time: 4 s, completed 23-Mar-2017 11:15:47
```
Full code at https://gist.github.com/aholland/0845bf29d836d672d006ab58f5f1c73c
The only obvious problem I can see in the code you've posted is that X is unconstrained. It could be any type, including ones that Slick doesn't know how to process.
What you can do is add a context bound on X. The bound you probably want is BaseTypedType, which is a "typed type" Slick uses to identify types it can work with. It's described from 11:30 in https://www.youtube.com/watch?v=tS6N5AaZTLA
You'd use it like this:
import slick.ast.BaseTypedType

def updatePD[X: BaseTypedType](
  id: Long,
  selector: PersonTable => Rep[X],
  newValue: X
): DBIO[Int] =
  people.filter(_.id === id).map(selector).update(newValue)
What that means is that when you use the method...
updatePD(anId, _.name, "Alice")
...the compiler has to prove to itself that whatever X you use, there is an appropriate type representation in Slick.
This answer is also from Richard, but the exchange took place on Gitter.
The only trouble with the first answer is that by demanding an implicit of type BaseTypedType[X] the context bound forces client code for optional columns to provide an implicit of type BaseTypedType[Option[X]] even when BaseTypedType[X] is already available.
This is unnecessary. Slick handles optional columns for you and if you provide an implicit for BaseTypedType[X] you are providing enough for it to handle columns of type Option[X].
So the context bound, while it works, is more demanding than necessary and results in having to write implicits in the client-code that involve directly referencing null and replicating logic already built into Slick. Not good.
The answer is to declare the implicit parameter as a named implicit parameter (called shape below) in its own parameter list, i.e. in long-form, not using the context bound short-hand :BaseTypedType. Then you can specify the more complicated but less demanding constraint used below.
So the solution is:
def updatePD[X](id: Long, selector: PersonTable => Rep[X], newValue: X)
               (implicit shape: Shape[_ <: FlatShapeLevel, Rep[X], X, _]): DBIO[Int] = {
  people.filter(_.id === id).map(selector).update(newValue)
}
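For example, assuming PersonTable has a String column name and, hypothetically, an optional column nickname of type Rep[Option[String]], both of these calls compile without any hand-written implicits:

updatePD(anId, _.name, "Alice")
updatePD(anId, _.nickname, Option("Ally"))  // nickname is illustrative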
Understanding why shape has the exact type Shape[_ <: FlatShapeLevel, Rep[X], X, _] depends on an intimate understanding of Slick's types and implicit mechanisms. Richard may yet write a blog post on that!
Haskell offers typed holes.
Example:
f :: Int -> String -> Bool
f x y = if (x > 10) then True else (g y)
g :: String -> Bool
g s = _
Compilation:
Prelude> :l HoleEx.hs
[1 of 1] Compiling Main ( HoleEx.hs, interpreted )
HoleEx.hs:6:7:
Found hole `_' with type: Bool
Relevant bindings include
s :: String (bound at HoleEx.hs:6:3)
g :: String -> Bool (bound at HoleEx.hs:6:1)
In the expression: _
In an equation for `g': g s = _
Failed, modules loaded: none.
When programming in Scala, I typically use ??? as placeholders in my Scala code that I've not yet written.
However, using typed holes appears to be more powerful to me. Note that if I replaced the above else (g y) with else (g x), I'd get a compile-time error:
Prelude> :l HoleEx.hs
[1 of 1] Compiling Main ( HoleEx.hs, interpreted )
HoleEx.hs:3:39:
Couldn't match type `Int' with `[Char]'
Expected type: String
Actual type: Int
In the first argument of `g', namely `x'
In the expression: (g x)
Failed, modules loaded: none.
Although I can use ??? in Scala to make placeholder implementations compile, unlike holes, ??? only fails at run time:
scala> def g(x: Int): Int = ???
g: (x: Int)Int
scala> g(5)
scala.NotImplementedError: an implementation is missing
Does Scala have typed holes?
For me a hole in a program helps me progress the implementation. It does that by telling me what type I need next. You can get something a little like that by triggering a regular Scala type error.
The trick is to define a type you know will be wrong, and then use it. For example:
object hole
You can then use that in code and you'll get an appropriate type error:
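For example, a sketch along these lines (the traits and the helpers f: B => B and g: A => B are assumed for illustration):

trait A
trait B
def f(b: B): B = ???
def g(a: A): B = ???

val example: A => B = a => hole
// error: type mismatch;
//  found   : hole.type
//  required: B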
The error here is telling me the "hole" needs to be a B, suggesting I can progress by using g in some way. And if I progress the code to a => f(g(hole)) the compiler will tell me the hole needs to be an A.
That's a kind of hole, but unlike other languages (Idris etc)...
my program is just not compiling, rather than having a recognized hole as such;
there's no help to tell me the types in scope (although IDE code completion may give some suggestions);
I can't nicely name a hole;
I can't automatically lift a hole to a function definition;
I can't automatically solve the hole (as Idris can, sometimes);
I must write the expected return type, otherwise Scala will infer that hole.type is an acceptable type!
...and probably many other features.
As you say ??? is a useful placeholder, but it has type Nothing -- which means the program compiles, but it doesn't help you fill in the implementation.
At least object hole, although kind of trivial, does give you some type information to progress.
Scala's ??? is just a shorthand for throwing an exception, and is equivalent to Haskell's undefined or just error "not implemented" or similar.
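Indeed, in scala.Predef it is defined as:

def ??? : Nothing = throw new NotImplementedError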
The Scala compiler does not have support for typed holes. You can however just use ??? and inspect the type of that in an IDE or Emacs+Ensime to see its inferred type.
Scala's type inference is relatively minimal compared to the full Hindley-Milner type inference of languages like Haskell, which means that having something like typed holes in the Scala compiler would not be feasible.
There's a Scala compiler plugin by Chris Birchall called scala-typed-holes that implements typed holes:
package example

object Example {
  def foo(x: Int, y: String): Boolean = {
    if (y.length == x) {
      ??? // TODO implement!
    } else {
      true
    }
  }

  def bar(x: Int): String = x match {
    case 0 => "zero"
    case 1 => "one"
    case _ => ???
  }
}
Generates:
[warn] /Users/chris/code/scala-typed-holes/src/test/scala/example/Example.scala:7:7:
[warn] Found hole with type: Boolean
[warn] Relevant bindings include
[warn] x: Int (bound at Example.scala:5:11)
[warn] y: String (bound at Example.scala:5:19)
[warn]
[warn] ??? // TODO implement!
[warn] ^
[warn] /Users/chris/code/scala-typed-holes/src/test/scala/example/Example.scala:16:15:
[warn] Found hole with type: String
[warn] Relevant bindings include
[warn] x: Int (bound at Example.scala:13:11)
[warn]
[warn] case _ => ???
[warn] ^
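To enable the plugin, add it to build.sbt; something like the following (the version here is illustrative; check the project's README for current coordinates):

addCompilerPlugin("com.github.cb372" % "scala-typed-holes" % "0.1.1" cross CrossVersion.full)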
In order to try to understand Scala's type system, I'm attempting to implement a custom version of the List.foreach method:
package com

object customForEach extends App {
  class customForEach[B, A] extends Iterable[A] with collection.Seq[A] {
    def foreach[B](f: A ⇒ B) {
      var these = this
      while (!these.isEmpty) {
        f(these.head)
        these = these.tail
      }
    }
    def tail = this match {
      case h :: t ⇒ t
    }
  }
}
When I compile this code I receive these errors:
[error] \Desktop\Scala\src\main\scala\typeparam.scala:16: constructor cannot be instantiated to expected type;
[error] found : scala.collection.immutable.::[B(in class ::)]
[error] required: com.customForEach.customForEach[B(in class customForEach),A]
[error] case h :: t ⇒ t
[error] ^
[error] \Desktop\Scala\src\main\scala\typeparam.scala:16: not found: value t
[error] case h :: t ⇒ t
[error] ^
[error] \Desktop\Scala\src\main\scala\typeparam.scala:11: type mismatch;
[error] found : Seq[A]
[error] required: com.customForEach.customForEach[B,A]
[error] these = these.tail
[error] ^
[error] three errors found
[error] (compile:compile) Compilation failed
[error] Total time: 0 s, completed 31-Jan-2015 11:53:40
In particular, I find it interesting how println can be composed with List in this fashion: List(1,2,3).foreach(println)
Do I need to extend another trait in order to access the .tail function?
For this error :
not found: value t
[error] case h :: t ⇒ t
Shouldn't t be found since it is created using pattern match operator :: ?
There are many reasons why this code won't work. In order to understand the first compiler error not found: value t, you must look at the error immediately before it. :: exists solely for List, but here you do not have a List, only Iterable with Seq. That pattern match can't work, which causes t to become "not found".
There are even larger problems than that, though. Even if you remove your definition of tail (which is unnecessary), you'll then find that you're missing abstract method definitions for apply, iterator, and length from the Seq trait. I imagine you're doing this because you can't extend List, which is sealed. You can copy the implementations of apply and length from LinearSeqOptimized, then easily implement an iterator method, but there's still another problem: your class does not have a constructor.
Okay, well we'll look at what List does again. List is abstract and has two sub-types, :: and Nil. Nil is just a case object, and :: has a constructor that accepts the head and tail of the List. This isn't going to help you very much, unless you also want to duplicate the code for :: and Nil as well.
Scala collections are very large complicated beasts, and extending them to override one method is not a simple process.
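If the goal is simply to understand how foreach walks a list, a far easier route is a standalone function that pattern matches on List directly, using the :: and Nil structure described above. A minimal sketch:

def customForeach[A, B](xs: List[A])(f: A => B): Unit = xs match {
  case h :: t =>
    f(h)                 // apply f to the head...
    customForeach(t)(f)  // ...then recurse on the tail
  case Nil => ()
}

customForeach(List(1, 2, 3))(println)  // prints 1, 2, 3, like List(1,2,3).foreach(println)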
import scala.slick.driver.MySQLDriver.simple._

class RichTable[T](tag: Tag, name: String) extends Table[T](tag, name) {
  case class QueryExt[B](q: Query[RichTable.this.type, B]) {
    def whereEq[C](col: RichTable.this.type => Column[C], c: C) = {
      q.filter { fields =>
        col(fields) === c
      }
    }
  }
}
Then the compiler complains:
[error] /home/jilen/workspace/play-slick/src/main/scala/play/slick/SlickQueryExtension.scala:10: value === is not a member of slick.driver.MySQLDriver.simple.Column[C]
[error] col(fields) === c
[error] ^
[error] /home/jilen/workspace/play-slick/src/main/scala/play/slick/SlickQueryExtension.scala:9: ambiguous implicit values:
[error] both value BooleanColumnCanBeQueryCondition in object CanBeQueryCondition of type => scala.slick.lifted.CanBeQueryCondition[scala.slick.lifted.Column[Boolean]]
[error] and value BooleanOptionColumnCanBeQueryCondition in object CanBeQueryCondition of type => scala.slick.lifted.CanBeQueryCondition[scala.slick.lifted.Column[Option[Boolean]]]
[error] match expected type scala.slick.lifted.CanBeQueryCondition[Nothing]
[error] q.filter { fields =>
[error] ^
[error] two errors found
[error] (compile:compile) Compilation failed
[error] Total time: 0 s, completed Mar 6, 2014 1:21:48 AM
There have been questions about this, but the answers did not work for 2.0
How to parametrize Scala Slick queries by WHERE clause conditions?
Slick doesn't have any information about C, so it doesn't know whether (or how) it can map it to a database value, or whether it can use === on it. So you get a type error. You will have to use Scala's type system to restrict the type to one that Slick knows how to map. You can do this by providing a so-called Context Bound, in this case :BaseColumnType.
def whereEq[C: BaseColumnType](col: RichTable.this.type => Column[C], c: C) = {
  q.filter { fields =>
    col(fields) === c
  }
}
BaseColumnType is provided by Slick and using it in this way basically tells the Scala compiler to look for an implicit value of type BaseColumnType[C] in scope, where you call whereEq. Because then it is usually known what C will actually be. Slick comes with BaseColumnType[Int], BaseColumnType[String], etc. so at the call site, the Scala compiler can find one when your C is really an Int or String in that particular call and this way pass the info further to Slick.
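For example, at a hypothetical call site (the query and column names are illustrative):

// C is inferred as String here, and Slick's built-in
// BaseColumnType[String] satisfies the context bound.
val byName = QueryExt(userQuery).whereEq(_.name, "Alice")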
The same goes for LiuTiger's question: abstract class Crud[..., PK: BaseColumnType] should do the trick; a trait doesn't work with context bounds. When implementing an abstract DAO, be prepared to face a lot of challenges, reach the edges of your Scala type-system skills, and learn quite a bit about type inference order, implicit parameters, etc.