Scala: type hint for lambda

I'm refreshing my Scala. This looks very simple to me, but I can't get it to run:
import java.nio.file.{FileSystems, Files}
object ScalaHello extends App {
  val dir = FileSystems.getDefault.getPath("/home/intelli/workspace")
  Files.walk(dir).map(_.toFile).forEach(println)
}
It throws an error at the mapping lambda:
argument expression's type is not compatible with formal parameter type;
found : java.util.function.Function[java.nio.file.Path,java.io.File]
required: java.util.function.Function[_ >: java.nio.file.Path, _ <: ?R]
I suspect it has something to do with providing type hints for the lambda, but I can't find anything about it on Google. Any help is much appreciated.

Note that Files.walk returns a Java Stream, so map and forEach come from Java.
Assuming you are using Scala 2.12, your code will work if you either:
Update Scala version to 2.13 (no need to make any other changes in this case)
Specify the return type of map (with java.io.File imported):
import java.io.File
Files.walk(dir).map[File](_.toFile).forEach(println)
Convert to Scala collections before calling map:
import scala.collection.JavaConverters._
Files.walk(dir).iterator().asScala.map(_.toFile).foreach(println)
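For reference, a minimal self-contained version of the third option (Scala 2.12 with JavaConverters); the directory path is simply the one from the question:
import java.nio.file.{FileSystems, Files}
import scala.collection.JavaConverters._

object ScalaHello extends App {
  val dir = FileSystems.getDefault.getPath("/home/intelli/workspace")
  // iterator().asScala turns the Java Stream into a Scala Iterator,
  // so map and foreach now take ordinary Scala functions
  Files.walk(dir).iterator().asScala.map(_.toFile).foreach(println)
}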

Related

forEach in scala shows expected: Consumer[_ >:Path] actual: (Path) => Boolean

Wrong syntax problem in recursively deleting scala files
Files.walk(path, FileVisitOption.FOLLOW_LINKS)
.sorted(Comparator.reverseOrder())
.forEach(Files.deleteIfExists)
The issue is that you're trying to pass a Scala-style function to a method expecting a Java 8-style function. There are a couple of libraries out there that can do the conversion, or you could write it yourself (it's not complicated; see the sketch after this snippet), but probably the simplest option is to convert the Java collection to a Scala collection whose foreach expects a Scala-style function as its argument:
import scala.collection.JavaConverters._
Files.walk(path, FileVisitOption.FOLLOW_LINKS)
.sorted(Comparator.reverseOrder())
.iterator().asScala
.foreach(Files.deleteIfExists)
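The "write it yourself" option mentioned above could look roughly like this (a sketch, not part of the original answer; path, FileVisitOption and Comparator are as in the snippet above): an implicit conversion from a Scala function to java.util.function.Consumer, so the Java-style forEach accepts a Scala lambda directly.
import java.util.function.Consumer
import scala.language.implicitConversions

// Converts any Scala function A => Any into a Java Consumer[A], discarding the result
implicit def toConsumer[A](f: A => Any): Consumer[A] = new Consumer[A] {
  override def accept(a: A): Unit = f(a)
}

Files.walk(path, FileVisitOption.FOLLOW_LINKS)
  .sorted(Comparator.reverseOrder())
  .forEach((p: java.nio.file.Path) => Files.deleteIfExists(p))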
In Scala 2.12 I expect this should work:
...forEach(Files.deleteIfExists(_: Path))
The reason you need to specify the argument type is that the expected type is Consumer[_ >: Path], not Consumer[Path] as it would be in Scala.
If it doesn't work (I can't test at the moment), try
import java.util.function.Consumer
val deleteIfExists: Consumer[Path] = Files.deleteIfExists(_)
...forEach(deleteIfExists)
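Putting that fallback together as a compilable sketch, assuming Scala 2.12 as above (the path value here is illustrative, not from the question):
import java.nio.file.{Files, FileVisitOption, Path, Paths}
import java.util.Comparator
import java.util.function.Consumer

val path: Path = Paths.get("/tmp/some-dir")  // illustrative path
val deleteIfExists: Consumer[Path] = Files.deleteIfExists(_)

Files.walk(path, FileVisitOption.FOLLOW_LINKS)
  .sorted(Comparator.reverseOrder())
  .forEach(deleteIfExists)  // Consumer[Path] conforms to the expected Consumer[_ >: Path]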
Before Scala 2.12, Joe K's answer is the correct one.

IntelliJ IDEA: default parameter values in Scala

In the Scala REPL I can use Seq[String]() as a default value for a parameter of type Seq[T].
Welcome to Scala version 2.11.7 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_101).
Type in expressions to have them evaluated.
Type :help for more information.
scala> def d[T](foo: Seq[T] = Seq[String]()) = 12
d: [T](foo: Seq[T])Int
scala> d()
res0: Int = 12
When I try the same in IDEA, it complains “Seq[String] doesn't conform to expected type Seq[T]”. Why?
IntelliJ IDEA 2016.2.4
Scala Plugin 2016.2.1
Scala 2.11.7
Note 1: Sorry, I know that my example function does not make much sense. However, my real (and useful) function is too complex to post here.
Note 2: At first, instead of T, the type parameter in my example was named Any, which was not a good idea (because it shadowed scala.Any) and caused some confusion. Thus I fixed that.
When you write def d[Any], the Any here is a generic type parameter, i.e. a placeholder. It does not refer to the class Any in Scala; it shadows the Any class defined globally in Scala. So when you assign Seq[String] to Seq[Any], the compiler does not know of any relation between String and the type parameter Any. Note that Any could be replaced with any other identifier as the placeholder name; the result would be the same.
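To see the shadowing in isolation, a minimal sketch (the parameter names are only placeholders; as discussed below, scalac accepts the default while IntelliJ flags it):
def d[Any](foo: Seq[Any] = Seq[String]()) = 12        // this Any is a type parameter, not scala.Any
def e[Banana](foo: Seq[Banana] = Seq[String]()) = 12  // exactly the same situation under another name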
Now, coming to why this works in the REPL: I am not exactly sure why the REPL accepts Seq[String] when it is given as a default value, but I was able to reproduce the error in the REPL when I do the same assignment inside the method body.
The following code throws an error in the REPL:
def d[Any](foo: Seq[Any]) = {
  val a: Seq[Any] = Seq[String]()
}
<console>:12: error: type mismatch;
found : Seq[String]
required: Seq[Any]
val a: Seq[Any] = Seq[String]()
^
I am not sure why the REPL does not catch the error when the expression is given as a default argument.
One alternative theory is that, in general, when you use generics, the concrete type is determined by the caller. For example,
def d[A](a:A) = {}
d(1) // Type of A is Int
d("a") // Type of A is String
So when you give a default value, String is assigned to Any, hence the compilation success. IntelliJ's type checker works based on the first theory and shows an error. But strangely, as someone pointed out earlier, the compilation succeeds.
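A minimal sketch of that second reading, with the type parameter fixed at each call site (illustrative only):
def d[T](foo: Seq[T] = Seq[String]()) = foo.size
d()            // compiles in the REPL: the default Seq[String]() is used
d(Seq(1, 2))   // T is inferred as Int from the explicit argument, as in the calls above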

could not find implicit value for evidence parameter of type org.apache.flink.api.common.typeinfo.TypeInformation[...]

I am trying to write some use cases for Apache Flink. One error I run into pretty often is
could not find implicit value for evidence parameter of type org.apache.flink.api.common.typeinfo.TypeInformation[SomeType]
My problem is that I can't really nail down when they happen and when they don't.
The most recent example of this would be the following
...
val largeJoinDataGen = new LargeJoinDataGen(dataSetSize, dataGen, hitRatio)
val see = StreamExecutionEnvironment.getExecutionEnvironment
val newStreamInput = see.addSource(largeJoinDataGen)
...
where LargeJoinDataGen extends GeneratorSource[(Int, String)] and GeneratorSource[T] extends SourceFunction[T], both defined in separate files.
When trying to build this I get
Error:(22, 39) could not find implicit value for evidence parameter of type org.apache.flink.api.common.typeinfo.TypeInformation[(Int, String)]
val newStreamInput = see.addSource(largeJoinDataGen)
1. Why is there an error in the given example?
2. What would be a general guideline for when these errors happen, and how can I avoid them in the future?
P.S.: This is my first Scala project and my first Flink project, so please be patient.
You may add an import instead of declaring the implicits yourself:
import org.apache.flink.streaming.api.scala._
That will also help.
This mostly happens when you have user code, i.e. a source or a map function or something of that nature that has a generic parameter. In most cases you can fix that by adding something like
implicit val typeInfo = TypeInformation.of(classOf[(Int, String)])
If your code is inside another method that has a generic parameter you can also try adding a context bound to the generic parameter of the method, as in
def myMethod[T: TypeInformation](input: DataStream[Int]): DataStream[T] = ...
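For the snippet from the question, the first suggestion (declaring the implicit value explicitly) might look roughly like this; the imports are assumed and the generator construction is taken from the question:
import org.apache.flink.api.common.typeinfo.TypeInformation
import org.apache.flink.streaming.api.scala.StreamExecutionEnvironment

// Explicit evidence for the tuple type produced by the source
implicit val typeInfo: TypeInformation[(Int, String)] = TypeInformation.of(classOf[(Int, String)])

val largeJoinDataGen = new LargeJoinDataGen(dataSetSize, dataGen, hitRatio)
val see = StreamExecutionEnvironment.getExecutionEnvironment
val newStreamInput = see.addSource(largeJoinDataGen)  // the evidence parameter is now found in scope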
My problem is that I can't really nail down when they happen and when they don't.
They happen when an implicit parameter is required. If we look at the method definition we see:
def addSource[T: TypeInformation](function: SourceFunction[T]): DataStream[T]
But we don't see any implicit parameter defined, where is it?
When you see a polymorphic method where the type parameter is of the form
def foo[T : M](param: T)
where T is the type parameter and M is a context bound, it means that the author of the method is requesting an implicit parameter of type M[T]. It is equivalent to:
def foo[T](param: T)(implicit ev: M[T])
In the case of your method, it is actually expanded to:
def addSource[T](function: SourceFunction[T])(implicit evidence: TypeInformation[T]): DataStream[T]
This is why you see the compiler complaining, as it can't find the implicit parameter the method is requiring.
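The same desugaring with a familiar type class, just to make the mechanics visible (a sketch using Ordering in place of TypeInformation):
// Context-bound form: the compiler demands an implicit Ordering[T]
def maxOf[T: Ordering](a: T, b: T): T = implicitly[Ordering[T]].max(a, b)

// Fully explicit equivalent, which is what the compiler expands the above to
def maxOfExplicit[T](a: T, b: T)(implicit ev: Ordering[T]): T = ev.max(a, b)

maxOf(1, 2)      // Ordering[Int] is found implicitly
maxOf("a", "b")  // Ordering[String] is found implicitly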
If we go to the Apache Flink wiki, under Type Information, we can see why this happens:
No Implicit Value for Evidence Parameter Error
In the case where TypeInformation could not be created, programs fail to compile with an error stating “could not find implicit value for evidence parameter of type TypeInformation”.
A frequent reason is that the code that generates the TypeInformation has not been imported. Make sure to import the entire flink.api.scala package.
import org.apache.flink.api.scala._
For generic methods, you'll need the TypeInformation to be generated at the call site as well:
For generic methods, the data types of the function parameters and return type may not be the same for every call and are not known at the site where the method is defined. The code above will result in an error that not enough implicit evidence is available.
In such cases, the type information has to be generated at the invocation site and passed to the method. Scala offers implicit parameters for that.
Note that import org.apache.flink.streaming.api.scala._ may also be necessary.
For your types this means that if the invoking method is generic, it also needs to request the context bound for its type parameter.
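Applying both points to the question, a sketch (the wrapper method attachSource is illustrative, not from the question):
import org.apache.flink.api.common.typeinfo.TypeInformation
import org.apache.flink.streaming.api.functions.source.SourceFunction
import org.apache.flink.streaming.api.scala._  // brings the implicit TypeInformation derivation into scope

// A hypothetical generic wrapper: it must itself demand TypeInformation for T,
// otherwise the addSource call inside it hits the same missing-evidence error
def attachSource[T: TypeInformation](see: StreamExecutionEnvironment,
                                     src: SourceFunction[T]): DataStream[T] =
  see.addSource(src)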
Also note that Scala minor versions (2.11, 2.12, etc.) are not binary compatible with each other.
The following is a wrong configuration, even if you use import org.apache.flink.api.scala._:
<properties>
  <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  <scala.version>2.12.8</scala.version>
  <scala.binary.version>2.11</scala.binary.version>
</properties>
Correct configuration in Maven:
<properties>
  <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  <scala.version>2.12.8</scala.version>
  <scala.binary.version>2.12</scala.binary.version>
</properties>

Scala Type matching function inside a trait body

I'm new to Scala and working through this book (Functional Programming in Scala). One of the exercises involves replicating the Option trait and its functions. However, I'm having a problem compiling my solution in the REPL.
sealed trait Nullable[+A] {
  def get[B >: A](default: => B): B = this match {
    case Value(v) => v
    case Null => default
  }
}
case class Value[+A](value: A) extends Nullable[A]
case object Null extends Nullable[Nothing]
REPL error details:
error: constructor cannot be instantiated to expected type;
found : Value[A(in class Value)]
required: Nullable[A(in trait Nullable)]
case Value(v) => v
error: pattern type is incompatible with expected type;
found : Null.type
required: Nullable[A]
case Null => default
Based on those errors I have a nagging feeling that the compiler can't infer that the type of this (being pattern matched on) is a Nullable.
I've tried this block of code on this online Scala utility and it seems to compile and run. The only difference I can see is that the online tool is using Scala version 2.10.3 and I'm running 2.11.7
So I'm not sure if this is environmental or if I need to help the Scala compiler here. I have also tried to compile the answer from the book's authors and I get the same errors.
Any help would be much appreciated.
Posting an answer in case someone else has a similar issue.
Use the REPL :paste command to load the .scala file, instead of the :load command.
Thank you to #noah and #jwvh for the help.
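For anyone following along, the fix looks like this in the REPL (the paste-mode banner lines are what the REPL itself prints):
scala> :paste
// Entering paste mode (ctrl-D to finish)

sealed trait Nullable[+A] {
  def get[B >: A](default: => B): B = this match {
    case Value(v) => v
    case Null => default
  }
}
case class Value[+A](value: A) extends Nullable[A]
case object Null extends Nullable[Nothing]

// Exiting paste mode, now interpreting.

scala> Value(42).get(0)
res0: Int = 42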

Scala implicit parameter used in the shapeless library

I'm new to Scala and I'm using the shapeless library to manipulate tuples. (Though this isn't specific to shapeless)
I've run into a problem related to implicit parameters, which I don't quite understand. Here are their signatures:
def reverse(implicit reverse: Reverse[T]): reverse.Out = reverse(t)
def drop[N <: Nat](implicit drop: Drop[T, N]): drop.Out = drop(t)
And here it is in use:
val foo = ("foo","bar","other").reverse
val bar = ("foo","bar","other").drop(1)
However on Scala 2.10 I get this error:
Error:(25, 37) could not find implicit value for parameter reverse:
shapeless.ops.tuple.Reverse[(String, String, String)]
val zzy = ("foo","bar","other").reverse
Error:(29, 37) not enough arguments for method reverse: (implicit
reverse: shapeless.ops.tuple.Reverse[(String, String,
String)])reverse.Out. Unspecified value parameter reverse.
val foo = ("foo","bar","other").reverse
^
I'm not sure what the implicit parameter is that it is trying to reference, or why I need it. Also, this seems to work on 2.11 (but IntelliJ flags it).
On Scala 2.10.x you need to have the macro-paradise plugin included in your build: see the shapeless documentation here.
IntelliJ often reports spurious errors against code which uses aspects of the Scala type system which it doesn't have adequate support for. This is especially true of code which uses dependent method types as reverse and drop above do (notice that the result type of the methods depend on the argument values). In all cases the command line compiler is authoritative.
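For reference, on 2.10.x the plugin is typically wired in through sbt roughly like this (a sketch; the version numbers are illustrative, so check the shapeless documentation for the exact coordinates):
// build.sbt — shapeless on Scala 2.10.x needs the macro-paradise compiler plugin
scalaVersion := "2.10.6"

libraryDependencies ++= Seq(
  "com.chuusai" %% "shapeless" % "2.3.2",
  compilerPlugin("org.scalamacros" % "paradise" % "2.1.0" cross CrossVersion.full)
)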