forEach in Scala shows expected: Consumer[_ >: Path], actual: (Path) => Boolean

I have a syntax problem when recursively deleting files in Scala:
Files.walk(path, FileVisitOption.FOLLOW_LINKS)
  .sorted(Comparator.reverseOrder())
  .forEach(Files.deleteIfExists)

The issue is that you're trying to pass a Scala-style function to a method expecting a Java 8-style functional interface. There are a couple of libraries out there that can do the conversion, or you could write it yourself (it's not complicated; see the sketch after the snippet below), or probably the simplest option is to convert the Java stream to a Scala collection that has a foreach method expecting a Scala-style function as an argument:
import scala.collection.JavaConverters._
Files.walk(path, FileVisitOption.FOLLOW_LINKS)
  .sorted(Comparator.reverseOrder())
  .iterator().asScala
  .foreach(Files.deleteIfExists)
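If you'd rather write the conversion yourself, a minimal sketch could look like the following. The object and method names (JavaInterop, toConsumer, DeleteExample, deleteRecursively) are purely illustrative, not part of any library:

import java.nio.file.{Files, FileVisitOption, Path}
import java.util.Comparator
import java.util.function.Consumer
import scala.language.implicitConversions

object JavaInterop {
  // Wrap a Scala function as a java.util.function.Consumer so it can be
  // passed to Java APIs such as Stream.forEach (useful before Scala 2.12).
  implicit def toConsumer[A](f: A => Any): Consumer[A] =
    new Consumer[A] {
      override def accept(a: A): Unit = { f(a); () }
    }
}

object DeleteExample {
  import JavaInterop._

  def deleteRecursively(path: Path): Unit =
    Files.walk(path, FileVisitOption.FOLLOW_LINKS)
      .sorted(Comparator.reverseOrder())
      .forEach((p: Path) => Files.deleteIfExists(p))
}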

In Scala 2.12 I expect this should work:
...forEach(Files.deleteIfExists(_: Path))
The reason you need to specify the argument type is that the expected type is Consumer[_ >: Path], not Consumer[Path] as it would be for a Scala function, so the compiler can't infer the parameter type on its own.
If that doesn't work (I can't test at the moment), try
val deleteIfExists: Consumer[Path] = Files.deleteIfExists(_)
...forEach(deleteIfExists)
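If the Scala 2.12 SAM conversion works as described, putting the whole pipeline together with the required imports might look like this (DeleteTree and deleteTree are illustrative names):

import java.nio.file.{Files, FileVisitOption, Path}
import java.util.Comparator

object DeleteTree {
  // On Scala 2.12+ the ascribed function literal should be SAM-converted
  // to a java.util.function.Consumer for Stream.forEach.
  def deleteTree(root: Path): Unit =
    Files.walk(root, FileVisitOption.FOLLOW_LINKS)
      .sorted(Comparator.reverseOrder())
      .forEach(Files.deleteIfExists(_: Path))
}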
Before Scala 2.12, Joe K's answer is the correct one.

Related

Scala: type hint for lambda

I'm refreshing my Scala. This looks very simple to me, but I can't get it to run:
import java.nio.file.{FileSystems, Files}

object ScalaHello extends App {
  val dir = FileSystems.getDefault.getPath("/home/intelli/workspace")
  Files.walk(dir).map(_.toFile).forEach(println)
}
It throws an error at the mapping lambda:
argument expression's type is not compatible with formal parameter type;
found : java.util.function.Function[java.nio.file.Path,java.io.File]
required: java.util.function.Function[_ >: java.nio.file.Path, _ <: ?R]
I suspect it has something to do with providing type hints for the lambda, but I can't find anything by searching Google. Much appreciated.
Note that Files.walk returns a Java Stream, so map and forEach come from Java.
Assuming you are using Scala 2.12, your code will work if you either:
Update Scala version to 2.13 (no need to make any other changes in this case)
Specify the return type of map explicitly (this also needs import java.io.File in scope):
Files.walk(dir).map[File](_.toFile).forEach(println)
Convert to Scala collections before calling map:
import scala.collection.JavaConverters._
Files.walk(dir).iterator().asScala.map(_.toFile).foreach(println)
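For reference, the complete program from the question with the JavaConverters approach applied might look like this:

import java.nio.file.{FileSystems, Files}
import scala.collection.JavaConverters._

object ScalaHello extends App {
  val dir = FileSystems.getDefault.getPath("/home/intelli/workspace")
  // iterator().asScala yields a Scala Iterator, so map and foreach are the
  // Scala methods and accept ordinary Scala functions.
  Files.walk(dir).iterator().asScala.map(_.toFile).foreach(println)
}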

No implicits found for parameter evidence

I have a line of code in a Scala app that takes a DataFrame with one column and two rows and assigns the values to the variables start and end:
val Array(start, end) = datesInt.map(_.getInt(0)).collect()
This code works fine when run in a REPL, but when I put the same line in a Scala object in IntelliJ, it inserts a grey (?: Encoder[Int]) before the .collect() statement and shows an inline error: No implicits found for parameter evidence$6: Encoder[Int]
I'm pretty new to Scala and I'm not sure how to resolve this.
Spark needs to know how to serialize JVM types in order to send them between the workers and the driver. In some cases encoders can be generated automatically, and for many common types there are explicit implementations written by the Spark developers; these are passed implicitly. If your SparkSession is named spark, then you are missing the following line:
import spark.implicits._
Since you are new to Scala: implicit parameters are parameters that you don't have to pass explicitly. In your example, the map function requires an Encoder[Int]. By adding this import, the encoder is brought into scope and thus passed to the map function automatically.
Check the Scala documentation to learn more.
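A minimal sketch of where the import goes (the object name StartEnd, the builder settings, and the sample values are illustrative; datesInt and spark come from the question):

import org.apache.spark.sql.{DataFrame, SparkSession}

object StartEnd {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[*]").appName("start-end").getOrCreate()

    // Brings the implicit Encoder instances (including Encoder[Int]) into scope.
    import spark.implicits._

    // Illustrative one-column, two-row DataFrame standing in for the real datesInt.
    val datesInt: DataFrame = Seq(20200101, 20201231).toDF("date")

    // map needs an implicit Encoder[Int]; with the import above it is found automatically.
    val Array(start, end) = datesInt.map(_.getInt(0)).collect()

    println(s"start=$start end=$end")
    spark.stop()
  }
}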

Imports for Datastax QueryBuilder to avoid QueryBuilder.eq in Scala

When looking at examples for the Datastax Cassandra Driver, where clauses usually end up like this:
val select = QueryBuilder.select()
  .all()
  .from("addressbook", "contact")
  .where(eq("type", "Friend"))
but when I attempt this in Scala I get this error:
Error:(25, 75) type mismatch;
found : Boolean
required: com.datastax.driver.core.querybuilder.Clause
To get it to work I always have to prefix eq with QueryBuilder:
val select = QueryBuilder.select()
  .all()
  .from("addressbook", "contact")
  .where(QueryBuilder.eq("type", "Friend"))
I tried importing QueryBuilder._, since eq is a static method on it, but that didn't help. What setup am I missing to use the more concise form from the example?
eq is a standard Scala method defined on AnyRef (it is reference equality, roughly what == is in Java). The unprefixed version is most likely resolved to a call to eq on the object enclosing your statement definition, not to the static method on QueryBuilder.
One thing you can do is rename your import:
import QueryBuilder.{eq => equ} // TODO find a better name :-)
We could even provide a built-in alias with the driver, as I imagine others will run into this.
If there's a cleaner solution I'd be interested to know it.
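With the renamed import in place, the query from the question would read roughly as follows (equ is just the alias chosen above; the package path assumes the classic Datastax Java driver):

import com.datastax.driver.core.querybuilder.QueryBuilder
import com.datastax.driver.core.querybuilder.QueryBuilder.{eq => equ}

// The alias sidesteps AnyRef.eq, so the short form resolves to the
// QueryBuilder static method and yields a Clause as expected.
val select = QueryBuilder.select()
  .all()
  .from("addressbook", "contact")
  .where(equ("type", "Friend"))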

Is ??? a valid symbol or operator in Scala?

I have seen the symbol
???
used in Scala code. However, I don't know whether it's meant to be pseudocode or actual Scala code, but my Eclipse IDE for Scala doesn't flag it, and the Eclipse worksheet actually evaluates it.
I haven't been able to find anything via a Google search.
Any help will be appreciated.
Thanks
Yes, this is a valid identifier.
Since Scala 2.10, there is a ??? method in Predef which simply throws a NotImplementedError.
def ??? : Nothing = throw new NotImplementedError
This is intended to be used for quickly sketching the skeleton of some code, leaving the implementations of methods for later, for example:
class Foo[A](a: A) {
  def flatMap[B](f: A => Foo[B]): Foo[B] = ???
}
Because it has a type of Nothing (which is a subtype of every other type), it will type-check in place of any value, allowing you to compile the incomplete code without errors. It's often seen in exercises, where the solution needs to be written in place of ???.
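A tiny illustration (Sketch and parse are just example names): the definition compiles because Nothing conforms to Int, but calling it fails at runtime.

object Sketch {
  def parse(s: String): Int = ???   // compiles: ??? has type Nothing, which conforms to Int

  def main(args: Array[String]): Unit =
    parse("42")                     // throws scala.NotImplementedError at runtime
}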
To search for method names that are ASCII or Unicode symbol strings:
SO search: https://stackoverflow.com/search?q=[scala]+%22%3F%3F%3F%22
finds this thread, as well as Scala and Python's pass.
Scalex covers Scala 2.9.1 and Scalaz 6.0: http://scalex.org/?q=%3C%3A%3C

Scala Future mapTo fails to compile because of missing ClassTag

Simple question: using mapTo on the result of ask produces a compiler error along the lines of:
not found: value ClassTag
For example:
(job ? "Run").mapTo[Result]
^
I don't understand why it needs a ClassTag to do the cast. If I substitute a standard class from Predef, like String, as in (job ? "Run").mapTo[String], it compiles OK.
This happens even when I define the class right above the line in question, as in:
class Result {}
(job ? "Run").mapTo[Result]
I still get the same problem.
Thanks, Jason.
I should also state that I'm using Scala 2.10.0 and Akka 2.1.0 (if that makes a difference).
This seems to be a problem specific to the Scala 2.10.0 release.
After adding
import reflect.ClassTag
the implicit ClassTag parameter used by mapTo should be resolved.
Either that, or update to a newer version of Akka/Scala (which is preferable if possible).
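For context, a minimal sketch of the ask/mapTo pattern with the import in place (JobClient and runJob are illustrative names; the actor reference job, the message "Run", and the Result class come from the question; the timeout is an assumption):

import scala.concurrent.Future
import scala.concurrent.duration._
import scala.reflect.ClassTag   // the answer above suggests this import helps on Scala 2.10.0

import akka.actor.ActorRef
import akka.pattern.ask
import akka.util.Timeout

class Result

object JobClient {
  def runJob(job: ActorRef): Future[Result] = {
    implicit val timeout: Timeout = Timeout(5.seconds)
    // mapTo takes an implicit ClassTag[Result] so the cast can be checked at runtime.
    (job ? "Run").mapTo[Result]
  }
}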