Scala: `ambiguous implicit values` but the right value is not even found

I am writing a small Scala Program which should:
Read a file (line by line) from a local FS
Parse from each line three double values
Make instances of a case class based on those three values
Pass those instances to a Binary Heap
To be able to parse Strings into both Doubles and CoordinatePoints, I came up with this trait:
trait Parseable[T] {
  def parse(input: String): Either[String, T]
}
and I have a number of typeclass instances of it:
object Parseable {
  implicit val parseDouble: Parseable[Double] = new Parseable[Double] {
    override def parse(input: String): Either[String, Double] = {
      val simplifiedInput = input.replaceAll("[ \\n]", "").toLowerCase
      try Right(simplifiedInput.toDouble) catch {
        case _: NumberFormatException => Left(input)
      }
    }
  }

  implicit val parseInt: Parseable[Int] = new Parseable[Int] {
    override def parse(input: String): Either[String, Int] = {
      val simplifiedInput = input.replaceAll("[ \\n]", "").toLowerCase
      try Right(simplifiedInput.toInt) catch {
        case _: NumberFormatException => Left(input)
      }
    }
  }

  implicit val parseCoordinatePoint: Parseable[CoordinatePoint] = new Parseable[CoordinatePoint] {
    override def parse(input: String): Either[String, CoordinatePoint] = {
      val simplifiedInput = input.replaceAll("[ \\n]", "").toLowerCase
      val unparsedPoints: List[String] = simplifiedInput.split(",").toList
      val eithers: List[Either[String, Double]] = unparsedPoints.map(parseDouble.parse)
      // .sequence needs a traverse syntax import, e.g. from cats: import cats.implicits._
      val sequence: Either[String, List[Double]] = eithers.sequence
      sequence match {
        case Left(value) => Left(value)
        case Right(doublePoints) => Right(CoordinatePoint(doublePoints.head, doublePoints(1), doublePoints(2)))
      }
    }
  }
}
I have a common object that delegates the call to a corresponding implicit Parseable (in the same file):
object InputParser {
  def parse[T](input: String)(implicit p: Parseable[T]): Either[String, T] = p.parse(input)
}
and just for reference - this is the CoordinatePoint case class:
case class CoordinatePoint(x: Double, y: Double, z: Double)
In my main program (after having validated that the file is there, is not empty, etc.) I want to transform each line into an instance of CoordinatePoint as follows:
import Parseable._
import CoordinatePoint._
...
private val bufferedReader = new BufferedReader(new FileReader(fileName))
private val streamOfMaybeCoordinatePoints: Stream[Either[String, CoordinatePoint]] = Stream
  .continually(bufferedReader.readLine())
  .takeWhile(_ != null)
  .map(InputParser.parse(_))
and the error I get is this:
[error] /home/vgorcinschi/data/eclipseProjects/Algorithms/Chapter 2 Sorting/algorithms2_1/src/main/scala/ca/vgorcinschi/algorithms2_4/selectionfilter/SelectionFilter.scala:42:27: ambiguous implicit values:
[error] both value parseDouble in object Parseable of type => ca.vgorcinschi.algorithms2_4.selectionfilter.Parseable[Double]
[error] and value parseInt in object Parseable of type => ca.vgorcinschi.algorithms2_4.selectionfilter.Parseable[Int]
[error] match expected type ca.vgorcinschi.algorithms2_4.selectionfilter.Parseable[T]
[error] .map(InputParser.parse(_))
[error] ^
[error] one error found
[error] (Compile / compileIncremental) Compilation failed
[error] Total time: 1 s, completed Sep 1, 2020 10:38:18 PM
I don't understand, nor know where to look to find out, why the compiler finds Parseable[Int] and Parseable[Double] but not the only right one, Parseable[CoordinatePoint].
So I thought, OK, let me give the compiler a hand by specifying the transformation function beforehand:
private val bufferedReader = new BufferedReader(new FileReader(fileName))
val stringTransformer: String => Either[String, CoordinatePoint] = s => InputParser.parse(s)
private val streamOfMaybeCoordinatePoints: Stream[Either[String, CoordinatePoint]] = Stream
  .continually(bufferedReader.readLine())
  .takeWhile(_ != null)
  .map(stringTransformer)
Alas, this yields the same error, just a bit higher up, at the function declaration.
I would love to learn what it is that causes this behavior, both to rectify the code and for personal knowledge. At this point I am very curious.

One fix is to specify the type parameter explicitly:
InputParser.parse[CoordinatePoint](_)
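Applied to the original stream, that looks like:
private val streamOfMaybeCoordinatePoints: Stream[Either[String, CoordinatePoint]] = Stream
  .continually(bufferedReader.readLine())
  .takeWhile(_ != null)
  .map(InputParser.parse[CoordinatePoint](_)) // T is now fixed, so the implicit search is unambiguous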
Another is to prioritize the implicits. For example:
trait LowPriorityParseable1 {
  implicit val parseInt: Parseable[Int] = ...
}
trait LowPriorityParseable extends LowPriorityParseable1 {
  implicit val parseDouble: Parseable[Double] = ...
}
object Parseable extends LowPriorityParseable {
  implicit val parseCoordinatePoint: Parseable[CoordinatePoint] = ...
}
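This works because implicits declared in object Parseable itself take precedence over those inherited from the low-priority traits, so with T unconstrained the implicit search now has a single best candidate. A minimal sketch, assuming the hierarchy above:
InputParser.parse("1.0,2.0,3.0") // resolves parseCoordinatePoint; T is inferred as CoordinatePoint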
By the way, since you put the implicits into the companion object, there is not much point in importing them now.
At the call site of
object InputParser {
  def parse[T](input: String)(implicit p: Parseable[T]): Either[String, T] = p.parse(input)
}
the type parameter T is inferred (if not specified explicitly) only after the implicit is resolved (type inference and implicit resolution affect each other). Otherwise the following code wouldn't compile:
trait TC[A]
object TC {
  implicit val theOnlyImplicit: TC[Int] = null
}
def materializeTC[A]()(implicit tc: TC[A]): TC[A] = tc

materializeTC() // compiles, A is inferred as Int
So during implicit resolution the compiler tries not to infer types too early (otherwise, in the TC example, type A would be inferred as Nothing and the implicit wouldn't be found). By the way, an exception is implicit conversions, where the compiler tries to infer types eagerly (sometimes this can cause trouble too):
// try to infer implicit parameters immediately in order to:
// 1) guide type inference for implicit views
// 2) discard ineligible views right away instead of risking spurious ambiguous implicits
https://github.com/scala/scala/blob/2.13.x/src/compiler/scala/tools/nsc/typechecker/Implicits.scala#L842-L854

The problem is that the compiler does not infer and fix the type parameter T in .map(InputParser.parse(_)) before trying to find the implicit for the second parameter list.
There is a concrete algorithm in the compiler that infers types, with its own logic, constraints, and tradeoffs. In the compiler version you are using, it first walks the parameter lists, inferring and checking types list by list, and only at the end does it infer the type parameter from the result type. (I do not claim that other versions differ; I only point out that this is implementation behavior, not a fundamental constraint.)
More precisely, what is going on is that the type parameter T has not been inferred or fixed in any way at the point where the second parameter list is typechecked. At that point T is existential: it could be any type, and there are three different implicit instances suitable for such a type.
It is just how the compiler and its type inference work for now.
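A minimal sketch of the difference, using the same InputParser as above:

// still ambiguous: T is not fixed from the expected function type before the implicit search
val f: String => Either[String, CoordinatePoint] = s => InputParser.parse(s)

// compiles: T is pinned before the implicit is resolved
val g: String => Either[String, CoordinatePoint] = s => InputParser.parse[CoordinatePoint](s)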

Related

How can I dynamically construct a Scala class without knowing types in advance?

I want to build a simple library in which a developer can define a Scala class that represents command line arguments (to keep it simple, just a single set of required arguments -- no flags or optional arguments). I'd like the library to parse the command line arguments and return an instance of the class. The user of the library would just do something like this:
case class FooArgs(fluxType: String, capacitorCount: Int)

def main(args: Array[String]) {
  val argsObject: FooArgs = ArgParser.parse(args).as[FooArgs]
  // do real stuff
}
The parser should throw runtime errors if the provided arguments do not match the expected types (e.g. if someone passes the string "bar" at a position where an Int is expected).
How can I dynamically build a FooArgs without knowing its shape in advance? Since FooArgs can have any arity or types, I don't know how to iterate over the command line arguments, cast or convert them to the expected types, and then use the result to construct a FooArgs. Basically, I want to do something along these lines:
// ** notional code - does not compile **
def parse[T](args: Seq[String], klass: Class[T]): T = {
  val expectedTypes = klass.getDeclaredFields.map(_.getGenericType)
  val typedArgs = args.zip(expectedTypes).map({
    case (arg, String) => arg
    case (arg, Int) => arg.toInt
    case (arg, unknownType) =>
      throw new RuntimeException(s"Unsupported type $unknownType")
  })
  (klass.getConstructor(typedArgs).newInstance _).tupled(typedArgs)
}
Any suggestions on how I can achieve something like this?
When you want to abstract over the shape of a case class (or tuple), the standard approach is to get the HList representation of the case class with the help of the shapeless library. An HList keeps track, in its type signature, of the types of its elements and how many there are. You can then implement the algorithm you want recursively on the HList. Shapeless also provides a number of helpful transformations of HLists in shapeless.ops.hlist.
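For instance (a quick illustration, using FooArgs from the question), the HList representation of FooArgs is String :: Int :: HNil:

import shapeless._

case class FooArgs(fluxType: String, capacitorCount: Int)

val gen = Generic[FooArgs]
val repr = gen.to(FooArgs("flux", 10)) // "flux" :: 10 :: HNil
val back = gen.from(repr)              // FooArgs("flux", 10)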
For this problem, first we need to define an auxiliary typeclass to parse an argument of some type from String:
trait Read[T] {
  def apply(str: String): T
}

object Read {
  def make[T](f: String => T): Read[T] = new Read[T] {
    def apply(str: String) = f(str)
  }
  implicit val string: Read[String] = make(identity)
  implicit val int: Read[Int] = make(_.toInt)
}
You can define more instances of this typeclass if you need to support argument types other than String or Int.
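For example, hypothetical Double and Boolean instances would look like this:

implicit val double: Read[Double] = make(_.toDouble)
implicit val boolean: Read[Boolean] = make(_.toBoolean)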
Then we can define the actual typeclass that parses a sequence of arguments into some type:
// This is needed, because there seems to be a conflict between
// HList's :: and the standard Scala's ::
import shapeless.{:: => :::, _}

trait ParseArgs[T] {
  def apply(args: List[String]): T
}

object ParseArgs {
  // Base of the recursion on HList
  implicit val hnil: ParseArgs[HNil] = new ParseArgs[HNil] {
    def apply(args: List[String]) =
      if (args.nonEmpty) sys.error("too many args")
      else HNil
  }

  // A single recursion step on HList
  implicit def hlist[T, H <: HList](
    implicit read: Read[T], parseRest: ParseArgs[H]
  ): ParseArgs[T ::: H] = new ParseArgs[T ::: H] {
    def apply(args: List[String]) = args match {
      case first :: rest => read(first) :: parseRest(rest)
      case Nil => sys.error("too few args")
    }
  }

  // The implementation for any case class, based on its HList representation
  implicit def caseClass[C, H <: HList](
    implicit gen: Generic.Aux[C, H], parse: ParseArgs[H]
  ): ParseArgs[C] = new ParseArgs[C] {
    def apply(args: List[String]) = gen.from(parse(args))
  }
}
And lastly we can define some API that uses this typeclass. For example:
case class ArgParser(args: List[String]) {
  def to[C](implicit parseArgs: ParseArgs[C]): C = parseArgs(args)
}

object ArgParser {
  def parse(args: Array[String]): ArgParser = ArgParser(args.toList)
}
And a simple test:
scala> ArgParser.parse(Array("flux", "10")).to[FooArgs]
res0: FooArgs = FooArgs(flux,10)
There is a great guide on using shapeless for solving similar problems, which you may find helpful: The Type Astronaut’s Guide to Shapeless

How to filter records with arithmetic operation on columns of different type in Slick 3

For example, I have this simplified model, where timestamp and duration both represent seconds.

case class Item(id: Int, timestamp: Long, duration: Int)

val max_timestamp: Long = ???
val stmt = items.filter(x => (x.timestamp + x.duration) <= max_timestamp)
db.run(stmt.result)
The above does not compile, with the following error that I have a hard time understanding:
ambiguous implicit values:
[error] both value BooleanOptionColumnCanBeQueryCondition in object CanBeQueryCondition of type => slick.lifted.CanBeQueryCondition[slick.lifted.Rep[Option[Boolean]]]
[error] and value BooleanCanBeQueryCondition in object CanBeQueryCondition of type => slick.lifted.CanBeQueryCondition[Boolean]
[error] match expected type slick.lifted.CanBeQueryCondition[Nothing]
[error] filter(x => (x.timestamp + x.duration) <= max_timestamp)
Update 1:
It seems the issue stems from the fact that timestamp is Long and duration is Int. When both are of the same type, it compiles.
Update 2:
I found this solution to work: cast duration to Long using x.duration.asInstanceOf[Rep[Long]], or, probably more appropriate, x.duration.asColumnOf[Long].
According to the Slick documentation Coming from SQL to Slick:
Arithmetic operations on different types require explicit casts using .asColumnOf[T].
As you have already discovered, you have to explicitly cast duration to Long: x.duration.asColumnOf[Long].
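Applied to the query from the question, that becomes:

val stmt = items.filter(x => (x.timestamp + x.duration.asColumnOf[Long]) <= max_timestamp)
db.run(stmt.result)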
Though not very helpful in telling the exact problem, the error message makes sense if one looks at method filter and trait/object CanBeQueryCondition in Query.scala:
sealed abstract class Query[+E, U, C[_]] extends QueryBase[C[U]] { self =>
  ...
  def filter[T <: Rep[_]](f: E => T)(implicit wt: CanBeQueryCondition[T]): Query[E, U, C] =
    withFilter(f)
  ...
}
...
trait CanBeQueryCondition[-T] extends (T => Rep[_])

object CanBeQueryCondition {
  implicit val BooleanColumnCanBeQueryCondition: CanBeQueryCondition[Rep[Boolean]] =
    new CanBeQueryCondition[Rep[Boolean]] {
      def apply(value: Rep[Boolean]) = value
    }
  implicit val BooleanOptionColumnCanBeQueryCondition: CanBeQueryCondition[Rep[Option[Boolean]]] =
    new CanBeQueryCondition[Rep[Option[Boolean]]] {
      def apply(value: Rep[Option[Boolean]]) = value
    }
  implicit val BooleanCanBeQueryCondition: CanBeQueryCondition[Boolean] =
    new CanBeQueryCondition[Boolean] {
      def apply(value: Boolean) = new LiteralColumn(value)
    }
}
As pointed out by @Federico Pellegatta, the missing explicit cast leads to T = Nothing (T <: Rep[_]) for the implicit parameter CanBeQueryCondition[T], thus the reported error.

Why is Scala unable to infer the wildcard type from 2 functions?

I have the following functions defined:

import org.apache.spark.sql.catalyst.ScalaReflection
import ScalaReflection.universe
import universe.TypeTag
import org.apache.spark.sql.types.DataType // assumed import for DataType

def scalaTypesFor(dataType: DataType): Set[TypeTag[_]] = ...
def scalaTypeOpt: Option[TypeTag[_]] = ...

val catalystType = ...
scalaTypeOpt.map(v => Set(v))
  .getOrElse {
    val default = scalaTypesFor(catalystType)
    default
  }
In this case, both scalaTypesFor and scalaTypeOpt are expected to yield a TypeTag with a wildcard parameter, so they should be of the same type. However, the compiler gave me the following error:
Error:(29, 51) inferred type arguments [scala.collection.immutable.Set[_117] forSome { type _$2; type _117 >: org.apache.spark.sql.catalyst.ScalaReflection.universe.TypeTag[_$2] <: org.apache.spark.sql.catalyst.ScalaReflection.universe.TypeTag[_] }] do not conform to method getOrElse's type parameter bounds [B >: scala.collection.immutable.Set[org.apache.spark.sql.catalyst.ScalaReflection.universe.TypeTag[_$2]] forSome { type _$2 }]
val effective = scalaTypeOpt.map(v => Set(v)).getOrElse{
^
What's wrong with the type inference, and how do I fix it?
I think the issue is that you have two unknown types _ and there's no guarantee that they are compatible. Also, immutable Sets in Scala are invariant, arguably incorrectly (before people start commenting: there have been many discussions, and the final word on this is that, indeed, there is no truly fundamental reason for them NOT to be covariant), so that tends to produce annoying typing issues as well.
I don't have a compiler around to test, but you could try a few things:
(unlikely to work) ascribe the element type of the Set, as in Set[TypeTag[_]](v)
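Spelled out (untested, as noted above), that ascription would look like:

scalaTypeOpt.map(v => Set[TypeTag[_]](v))
  .getOrElse(scalaTypesFor(catalystType))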
change your code so that the compiler can capture the unknown type and prove to itself that it's the same one, like so:
def scalaTypesFor[T](dataType: DataType): Set[TypeTag[T]] = ???
def scalaTypeOpt[T]: Option[TypeTag[T]] = ???

def xform[T] = {
  val catalystType = ???
  scalaTypeOpt[T].map(v => Set(v))
    .getOrElse {
      val default = scalaTypesFor[T](catalystType)
      default
    }
}
or if you can make them local defs
def xform[T] = {
  def scalaTypesFor(dataType: DataType): Set[TypeTag[T]] = ???
  def scalaTypeOpt: Option[TypeTag[T]] = ???

  val catalystType = ???
  scalaTypeOpt.map(v => Set(v))
    .getOrElse {
      val default = scalaTypesFor(catalystType)
      default
    }
}

Could not find implicit parameter in companion class

I have a number wrapper like this:
class NumWrapper[A <: AnyVal](var v: A)(implicit n: Numeric[A]) {
  def +(other: A): NumWrapper[A] = {
    new NumWrapper(n.plus(v, other))
  }
  def -(other: A): NumWrapper[A] = {
    new NumWrapper(n.minus(v, other))
  }
}
All works fine. But when I want to have an implicit conversion, I created a companion object as follows:
object NumWrapper {
  implicit def toNumWrapper[A <: AnyVal](v: A) = new NumWrapper[A](v)
}
But I get this error at compilation:
could not find implicit value for parameter n: Numeric[A]
What is wrong here? Why is it trying to find an implicit match for type A at compile time?
Thank you very much for your help.
Implicit checks in Scala are performed at compile time (it is a statically typed language). If the compiler can't identify a single, unambiguous, matching implicit value that will be available at the calling location, it complains.
One way you can fix this here is to add the implicit requirement to the toNumWrapper method:
object NumWrapper {
  implicit def toNumWrapper[A <: AnyVal](v: A)(implicit n: Numeric[A]) = new NumWrapper[A](v)
}
This pushes the requirement for an implicit Numeric out to the location where the implicit conversion is required. For example, in the console I could then write:
scala> val chk1 = 3L
chk1: Long = 3

scala> val chk2 = NumWrapper.toNumWrapper(chk1)
chk2: NumWrapper[Long] = NumWrapper@1e89017a
And the compiler is happy because (I think - not entirely sure of this) the Long value carries its own implicit Numeric[Long] with it.
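With that change, the conversion can also kick in implicitly at the use site; a minimal sketch:

val w: NumWrapper[Long] = 3L // toNumWrapper(3L) is inserted; Numeric[Long] is resolved at this call site
val w2 = w + 4L              // NumWrapper[Long]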
According to your code,
class NumWrapper[A <: AnyVal](var v: A)(implicit n: Numeric[A])
to call new NumWrapper[MyType](v), a Numeric[MyType] must be in scope for implicit resolution.
So when you have
object NumWrapper {
  implicit def toNumWrapper[A <: AnyVal](v: A) = new NumWrapper[A](v)
}
which calls this NumWrapper constructor, the Numeric[A] must be resolved. That's not the case, so the compiler raises the mentioned error.

Having trouble with implicit conversion in Scala

I have this code:
case class Workspace(ident: Long, name: String)
case class Project(ident: Long, name: String)

implicit def workspaceJSON: JSONR[Workspace] = new JSONR[Workspace] {
  def read(json: JValue) =
    Workspace.applyJSON(field[Long]("id"), field[String]("name"))(json)
}

implicit def projectJSON: JSONR[Project] = new JSONR[Project] {
  def read(json: JValue) =
    Project.applyJSON(field[Long]("id"), field[String]("name"))(json)
}

def parseEnt[T: JSONR](json: JValue): Either[String, T] =
  fromJSON[T](json).toEither.left.map { _.toString }

def fetchProjects(ws: Workspace): Either[String, Project] = {
  parseEnt(parse("some text"))
}
This fails to compile at parseEnt(parse("some text")) with:
ambiguous implicit values: both method taskJSON in class Fetcher of type =>
Fetcher.this.JSONR[types.Task] and method workspaceJSON in class Fetcher of type =>
Fetcher.this.JSONR[Fetcher.this.Workspace] match expected type Fetcher.this.JSONR[T]
Is there a way to assure Scala that in this case I want the type variable T to be Project, and have it choose the projectJSON instance to parse it? Or, if I'm doing it wrong, what is the right way to do it?
The compiler is trying to automatically infer the type T, which must be something that can be produced from a String (more or less; the details are unimportant here).
Unfortunately it can't succeed, since you're providing multiple implicits going from String to Project and to Workspace. The simple solution is to explicitly indicate the type you're trying to produce:
parseEnt[Project](parse("some text"))
This pattern is fairly common with serialization type classes, where you are mapping multiple specific types to a generic one (String in this case).
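With the explicit type parameter, fetchProjects from the question becomes:

def fetchProjects(ws: Workspace): Either[String, Project] = {
  parseEnt[Project](parse("some text"))
}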