Where does Scala's 'NotInferedT' come from?

In Scala, we can get a compile error with a message containing 'NotInferedT'. For example:
expected: (NotInferedT, NotInferedT) => Boolean, actual: (Nothing, Nothing)
(as seen here).
This message is coming from the Scala compiler, and it appears to mean that Scala cannot infer a type. But is 'NotInferedT' itself a type? And is it described in the Scala documentation somewhere?
I can't find 'NotInferedT' in the Scala API docs.

It's the way the Scala plugin for IntelliJ IDEA (which is basically a Scala compiler in its own right) names an undefined type it can't resolve:
case UndefinedType(tpt, _) => "NotInfered" + tpt.name
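As a hypothetical illustration (the object and method names below are invented, not taken from the question), a call that gives the plugin nothing to pin a type parameter down with can lead to this placeholder appearing in the error message:

object NotInferedDemo {
  // A generic method whose type parameter T has to be inferred at the call site.
  def allAdjacent[T](values: List[T])(p: (T, T) => Boolean): Boolean =
    values.zip(values.drop(1)).forall(p.tupled)

  // Called on an empty list literal there is nothing to infer T from: scalac falls
  // back to Nothing, while the IntelliJ plugin may show its placeholder, e.g.
  // "expected: (NotInferedT, NotInferedT) => Boolean, actual: (Nothing, Nothing)".
  // allAdjacent(List())((a, b) => a == b)
}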

Related

Scala Type Inference Not Working for GADTs in IntelliJ

The following code does not appear to compile in IntelliJ. It complains that the type returned from eval should be A, whereas I would expect it to be inferred at the use site. The code compiles fine with sbt but is flagged in IntelliJ with both Scala 2 and Scala 3 (it runs fine but shows an error marker in the editor). Is there a special configuration (e.g. sbt params, JVM args, etc.) required for the code below to appear as compiling in IntelliJ?
object Solution1 {
  trait Expr[A]
  case class B(bool: Boolean) extends Expr[Boolean]
  case class I(i: Int) extends Expr[Int]

  def eval[A](expr: Expr[A]): A = expr match {
    case B(b) => b
    case I(i) => i
  }
}
Generalized algebraic data types (GADTs) are known to be extremely difficult to implement correctly in Scala. AFAIK both the scalac and dotty compilers currently have type soundness holes, particularly using covariant GADTs.
Your eval should return a value of type A, and since i has type Int and b has type Boolean, the compiler is apparently able to see that, because I is a subtype of Expr[Int], A must be equal to Int in that branch, and in the same way that A must be equal to Boolean in the B branch.
I am surprised this type-checks. It's odd because one would expect A to keep its meaning. It seems Scala 2's GADT implementation is still faulty.
I'm not surprised IntelliJ's Scala plugin flags it red. You can report this, but I'm not sure whether they can do anything about it, as it seems Scala itself is flawed here for the time being; GADTs need special treatment on their side.
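If the editor highlighting is the main annoyance, one possible workaround (a sketch of a common pattern, not an official fix; the object name below is invented) is to cast each branch back to A explicitly. The casts are safe because each constructor fixes A to a concrete type:

object Solution1Workaround {
  trait Expr[A]
  case class B(bool: Boolean) extends Expr[Boolean]
  case class I(i: Int) extends Expr[Int]

  // Matching B proves A =:= Boolean and matching I proves A =:= Int,
  // so the casts cannot fail at runtime.
  def eval[A](expr: Expr[A]): A = expr match {
    case B(b) => b.asInstanceOf[A]
    case I(i) => i.asInstanceOf[A]
  }
}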

Scala Type matching function inside a trait body

I'm new to Scala and working through the book Functional Programming in Scala. One of the exercises involves replicating the Option trait and its functions. However, I'm having a problem compiling my solution in the REPL.
sealed trait Nullable[+A] {
  def get[B >: A](default: => B): B = this match {
    case Value(v) => v
    case Null => default
  }
}
case class Value[+A](value: A) extends Nullable[A]
case object Null extends Nullable[Nothing]
REPL error details:
error: constructor cannot be instantiated to expected type;
found : Value[A(in class Value)]
required: Nullable[A(in trait Nullable)]
case Value(v) => v
error: pattern type is incompatible with expected type;
found : Null.type
required: Nullable[A]
case Null => default
Based on those errors I have a nagging feeling that the compiler can't infer that the type of this (being pattern matched on) is a Nullable.
I've tried this block of code on an online Scala utility and it seems to compile and run. The only difference I can see is that the online tool is using Scala 2.10.3 and I'm running 2.11.7.
So I'm not sure if this is environmental or if I need to help the Scala compiler here. I have also tried to compile the answer from the book's authors and I get the same errors.
Any help would be much appreciated.
Posting an answer in case someone else has a similar issue.
Use the REPL's :paste command to load the .scala file instead of the :load command; :load interprets the file one statement at a time (so the sealed trait and its subtypes most likely end up in separate compilation units), while :paste compiles the whole block as a single unit.
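For example, assuming the definitions above are saved in a file called Nullable.scala (the file name is just illustrative), the whole file can be compiled as one unit from the REPL with:

scala> :paste Nullable.scala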
Thank you to @noah and @jwvh for the help.

How to use flink fold function in scala

This is a non-working attempt at using Flink's fold with a Scala anonymous function:
val myFoldFunction = (x: Double, t: (Double, String, String)) => x + t._1

env.readFileStream(...)
  ...
  .groupBy(1)
  .fold(0.0, myFoldFunction: Function2[Double, (Double, String, String), Double])
It compiles fine, but at execution time I get a "type erasure issue" (see below). Doing the same in Java works, but is of course more verbose. I like the concise and clear lambdas. How can I do this in Scala?
Caused by: org.apache.flink.api.common.functions.InvalidTypesException:
Type of TypeVariable 'R' in 'public org.apache.flink.streaming.api.scala.DataStream org.apache.flink.streaming.api.scala.DataStream.fold(java.lang.Object,scala.Function2,org.apache.flink.api.common.typeinfo.TypeInformation,scala.reflect.ClassTag)' could not be determined.
This is most likely a type erasure problem.
The type extraction currently supports types with generic variables only in cases where all variables in the return type can be deduced from the input type(s).
The problem you encountered is a bug in Flink [1]. The problem originates from Flink's TypeExtractor and the way the Scala DataStream API is implemented on top of the Java implementation. The TypeExtractor cannot generate a TypeInformation for the Scala type and thus returns a MissingTypeInformation. This missing type information is manually set after creating the StreamFold operator. However, the StreamFold operator is implemented in a way that it does not accept a MissingTypeInformation and, consequently, fails before setting the right type information.
I've opened a pull request [2] to fix this problem. It should be merged within the next two days. Once it is in, using the latest 0.10 snapshot version should fix your problem.
[1] https://issues.apache.org/jira/browse/FLINK-2631
[2] https://github.com/apache/flink/pull/1101

Is ??? a valid symbol or operator in Scala?

I have seen the symbol
???
used in Scala code. However, I don't know if it's meant to be pseudocode or actual Scala code, but my Eclipse IDE for Scala doesn't flag it and the Eclipse worksheet actually evaluates it.
I haven't been able to find anything via a Google search.
Any help will be appreciated.
Thanks
Yes, this is a valid identifier.
Since Scala 2.10, there is a ??? method in Predef which simply throws a NotImplementedError.
def ??? : Nothing = throw new NotImplementedError
This is intended to be used for quickly sketching the skeleton of some code, leaving the implementations of methods for later, for example:
class Foo[A](a: A) {
  def flatMap[B](f: A => Foo[B]): Foo[B] = ???
}
Because it has a type of Nothing (which is a subtype of every other type), it will type-check in place of any value, allowing you to compile the incomplete code without errors. It's often seen in exercises, where the solution needs to be written in place of ???.
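A minimal usage sketch building on the Foo class above: the call compiles, but invoking the unimplemented method throws scala.NotImplementedError at runtime.

val foo = new Foo(42)
try foo.flatMap(x => new Foo(x.toString))
catch { case _: NotImplementedError => println("flatMap is not implemented yet") }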
To search for method names made up of ASCII or Unicode symbols:
SO search: https://stackoverflow.com/search?q=[scala]+%22%3F%3F%3F%22
which finds this thread: Scala and Python's pass
Scalex covers Scala 2.9.1 and Scalaz 6.0: http://scalex.org/?q=%3C%3A%3C

Scala Future mapTo fails to compile because of missing ClassTag

Simple question: using mapTo on the result of ask gives me a compiler error along the lines of:
not found: value ClassTag
For example:
(job ? "Run").mapTo[Result]
^
I don't understand why it needs a ClassTag to do the cast. If I substitute a standard class from Predef such as String, as in (job ? "Run").mapTo[String], it compiles OK.
Even when I define the class right above the line in question, as in:
class Result {}
(job ? "Run").mapTo[Result]
I still get the same problem.
Thanks, Jason.
I should also state that I'm using Scala 2.10.0 and Akka 2.1.0 (if that makes a difference).
This seems to be a particular problem with the Scala 2.10.0 version.
After adding
import reflect.ClassTag
the implicit ClassTag parameter that mapTo requires should be resolved.
Either that, or update to a newer version of Akka/Scala (which should be preferred if possible).
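For reference, a minimal sketch of a working setup (the actor, message, and timeout below are invented for illustration, not taken from the question):

import scala.concurrent.duration._
import scala.reflect.ClassTag // works around the 2.10.0 resolution problem
import akka.actor.{Actor, ActorSystem, Props}
import akka.pattern.ask
import akka.util.Timeout

class Result

class Job extends Actor {
  def receive = { case "Run" => sender ! new Result }
}

object MapToDemo extends App {
  implicit val timeout: Timeout = Timeout(5.seconds)
  val system = ActorSystem("demo")
  val job = system.actorOf(Props[Job], "job")

  // mapTo[Result] needs an implicit ClassTag[Result] to check the cast at runtime;
  // with the import above the call compiles and yields a Future[Result].
  val result = (job ? "Run").mapTo[Result]
}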