I have the following piece of Scala with ambiguous implicits, which I would have thought should work, given the lower priority of inherited implicits. But it does not - it fails with an "ambiguous implicit values" error. Can someone explain to me why the priorities do not work here?
trait Printer[-T] {
  def prettify(instance: T): String
}
trait LowPriorityPrinter {
  implicit val anyPrinter: Printer[Any] = new Printer[Any] { def prettify(instance: Any) = instance.toString() }
}
object Printer extends LowPriorityPrinter {
  implicit val intPrinter = new Printer[Int] { def prettify(instance: Int) = instance.toString() }
}
object MyApp extends App {
  def prettyprint[T](i: T)(implicit p: Printer[T]) = println(p.prettify(i))
  prettyprint(234)
}
The issue is simple, but nasty. The LowPriorityPrinter catch-all instance of your type class needs to be generic, not fixed to Any:
object Printer {
  implicit val intPrinter: Printer[Int] =
    new Printer[Int] { def prettify(x: Int) = x.toString() + " (int)" }
  implicit def anyPrinter[T]: Printer[T] =
    new Printer[T] { def prettify(x: T) = x.toString() + " (general)" }
}
Basically, the literal 234 is just as much an Int as it is an Any, and neither of those types is more specific than the other (so the priority trick is useless).
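For reference, a quick check (a sketch reusing the prettyprint method from the question) shows resolution now succeeding and picking the more specific instance:
object MyApp extends App {
  def prettyprint[T](i: T)(implicit p: Printer[T]) = println(p.prettify(i))
  prettyprint(234)      // "234 (int)" -- the non-generic Printer[Int] is more specific
  prettyprint("hello")  // "hello (general)" -- falls back to anyPrinter[String]
}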
I want to have various "flavors" of a component, each that handles a different "wire" format (e.g. String, Byte array, etc.). Example below. The innards of the read() function aren't important.
Note that at the use site I need to cast the parameter "Heavy" to thing.WIRE for this to work. Since this is my top-level API I don't want the users to have to cast. They've chosen the flavor when they call FantasticThing.apply (or accept the default). After that I'd rather a cast not be needed.
How can I avoid the cast and have Scala realize that read() argument is a String based on StringFlavor being chosen?
import scala.reflect.runtime.universe._

trait Flavor {
  type WIRE
  def read[T](wire: WIRE)(implicit tt: TypeTag[T]): T
}
trait Maker {
  def make(): Flavor
}
object StringFlavor extends Maker {
  def make(): Flavor { type WIRE = String } = StringFlavor()
}
case class StringFlavor() extends Flavor {
  type WIRE = String
  def read[T](wire: String)(implicit tt: TypeTag[T]): T = {
    println(tt.tpe)
    if(tt.tpe =:= typeOf[Int]) {
      5.asInstanceOf[T]
    } else
      throw new Exception("Boom")
  }
}
object FantasticThing {
  def apply[WIRE](maker: Maker = StringFlavor): Flavor = maker.make()
}
object RunMe extends App {
  val thing: Flavor = FantasticThing(StringFlavor)
  println(thing.read[Int]("Heavy".asInstanceOf[thing.WIRE])) // <-- How can I avoid this cast?
}
Edit based on Luis Miguel's note: I can't really add that type to FantasticThing.apply() or I'd lose pluggability. I want users to easily select the Flavor they want. I've refactored a bit to show this with a factory pattern, which does add the type info you suggested, but unfortunately still leaves me needing a cast at the top level.
If I provide a bunch of Flavors then users should be able to do something like:
val foo = FantasticThing(ByteArrayFlavor)
You can make WIRE a type parameter and propagate it through a type member on your Maker type, i.e.:
import scala.reflect.runtime.universe._

trait Flavor[WIRE] {
  def read[T](wire: WIRE)(implicit tt: TypeTag[T]): T
}
trait Maker {
  type O
  def make(): Flavor[O]
}
object StringMaker extends Maker {
  type O = String
  def make(): Flavor[O] = StringFlavor()
}
case class StringFlavor() extends Flavor[String] {
  def read[T](wire: String)(implicit tt: TypeTag[T]): T = {
    if(tt.tpe =:= typeOf[Int]) {
      5.asInstanceOf[T]
    } else
      throw new Exception("Boom")
  }
}
object FantasticThing {
  def apply(): Flavor[String] = StringMaker.make()
  def apply(maker: Maker): Flavor[maker.O] = maker.make() // Path-dependent type.
}
object RunMe extends App {
  val thing: Flavor[String] = FantasticThing(StringMaker)
  thing.read[Int]("Heavy") // res0: Int = 5
}
Edit: Added a no-arg apply() to this answer. If a default value for maker is used (e.g. StringMaker), you get a compile error because the argument "Heavy" is then expected to be of type maker.O (an abstract type). Adding the no-arg apply solves this problem while providing the same experience to the caller.
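For completeness, a usage sketch of the no-arg overload (building on the definitions above):
val defaultThing: Flavor[String] = FantasticThing() // the no-arg apply fixes WIRE to String
defaultThing.read[Int]("Heavy")                     // res1: Int = 5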
I took the liberty of modifying your code to show how (what I understand of) your problem can be solved using typeclasses and type parameters instead of type members.
import scala.reflect.runtime.universe.{TypeTag, typeOf}

implicit class Json(val underlying: String) extends AnyVal
implicit class Csv(val underlying: String) extends AnyVal

trait Flavor[W] {
  def read[T](wire: W)(implicit tt: TypeTag[T]): T
}
trait Maker[W] {
  def make(): Flavor[W]
}
object Maker {
  implicit val StringFlavorMaker: Maker[String] = new Maker[String] {
    override def make(): Flavor[String] = StringFlavor
  }
  implicit val JsonFlavorMaker: Maker[Json] = new Maker[Json] {
    override def make(): Flavor[Json] = JsonFlavor
  }
  implicit val CsvFlavorMaker: Maker[Csv] = new Maker[Csv] {
    override def make(): Flavor[Csv] = CsvFlavor
  }
}
case object StringFlavor extends Flavor[String] {
  override final def read[T](wire: String)(implicit tt: TypeTag[T]): T = {
    if(tt.tpe =:= typeOf[Int])
      0.asInstanceOf[T]
    else
      throw new Exception("Boom 1")
  }
}
case object JsonFlavor extends Flavor[Json] {
  override final def read[T](wire: Json)(implicit tt: TypeTag[T]): T = {
    if(tt.tpe =:= typeOf[Int])
      3.asInstanceOf[T]
    else
      throw new Exception("Boom 2")
  }
}
case object CsvFlavor extends Flavor[Csv] {
  override final def read[T](wire: Csv)(implicit tt: TypeTag[T]): T = {
    if(tt.tpe =:= typeOf[Int])
      5.asInstanceOf[T]
    else
      throw new Exception("Boom 3")
  }
}
object FantasticThing {
  def apply[W](implicit maker: Maker[W]): Flavor[W] = maker.make()
}
Then you can create and use any flavour (given there is an implicit maker in scope) this way:
val stringFlavor = FantasticThing[String]
// stringFlavor: Flavor[String] = StringFlavor
stringFlavor.read[Int]("Heavy")
// res0: Int = 0
val jsonFlavor = FantasticThing[Json]
// jsonFlavor: Flavor[Json] = JsonFlavor
jsonFlavor.read[Int]("{'heavy':'true'}")
// res1: Int = 3
val csvFlavor = FantasticThing[Csv]
// csvFlavor: Flavor[Csv] = CsvFlavor
csvFlavor.read[Int]("Hea,vy")
// res2: Int = 5
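If no implicit Maker exists for a wire type, the call simply doesn't compile (a sketch; the Xml wrapper is made up and the error wording is approximate):
implicit class Xml(val underlying: String) extends AnyVal
FantasticThing[Xml]
// error: could not find implicit value for parameter maker: Maker[Xml]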
In general, it's better to stay away from type members, since they are more complex and intended for more advanced stuff like path-dependent types.
Let me know in the comments if you have any doubts.
DISCLAIMER: I am bad with type members (still learning about them), which may motivate me to prefer different alternatives. In any case, I hope you can apply something similar to your real problem.
I have written automatic type class derivation in order to automatically generate Elasticsearch JSON mappings for case classes.
For that I am using the TypeClass type class in shapeless.
The problem I have is that many fields in the case classes we use are Scala enumerations.
For example
object ConnectionState extends Enumeration {
  type ConnectionState = Value
  val ordering, requested, pending, available, down, deleting, deleted, rejected = Value
}
Or
object ProductCodeType extends Enumeration {
  type ProductCodeType = Value
  val devpay, marketplace = Value
}
It seems I have to define a specific implicit instance for every Enumeration that is defined so that the automatic derivation will pick it up (e.g. for both ConnectionState and ProductCodeType).
I cannot have one implicit def for Enumeration such as
implicit def enumerationMapping: MappingEncoder[Enumeration] = new MappingEncoder[Enumeration] {
  def toMapping = jSingleObject("type", jString("text"))
}
that will work for all enumeration types.
I tried making the type class covariant, and a bunch of other things but nothing helped.
Any ideas?
Here is the derivation code:
import argonaut._, Argonaut._
import shapeless._

object Mapping {
  trait MappingEncoder[T] {
    def toMapping: Json
  }
  object MappingEncoder extends LabelledProductTypeClassCompanion[MappingEncoder] {
    implicit val stringMapping: MappingEncoder[String] = new MappingEncoder[String] {
      def toMapping = jSingleObject("type", jString("text"))
    }
    implicit val intMapping: MappingEncoder[Int] = new MappingEncoder[Int] {
      def toMapping = jSingleObject("type", jString("integer"))
    }
    implicit def seqMapping[T: MappingEncoder]: MappingEncoder[Seq[T]] = new MappingEncoder[Seq[T]] {
      def toMapping = implicitly[MappingEncoder[T]].toMapping
    }
    implicit def optionMapping[T: MappingEncoder]: MappingEncoder[Option[T]] = new MappingEncoder[Option[T]] {
      def toMapping = implicitly[MappingEncoder[T]].toMapping
    }
    object typeClass extends LabelledProductTypeClass[MappingEncoder] {
      def emptyProduct = new MappingEncoder[HNil] {
        def toMapping = jEmptyObject
      }
      def product[F, T <: HList](name: String, sh: MappingEncoder[F], st: MappingEncoder[T]) = new MappingEncoder[F :: T] {
        def toMapping = {
          val head = sh.toMapping
          val tail = st.toMapping
          (name := head) ->: tail
        }
      }
      def project[F, G](instance: => MappingEncoder[G], to: F => G, from: G => F) = new MappingEncoder[F] {
        def toMapping = jSingleObject("properties", instance.toMapping)
      }
    }
  }
}
I was able to fix the problem by adding additional implicit defs to scope:
implicit def enumerationMapping[T <: Enumeration#Value]: MappingEncoder[T] = new MappingEncoder[T] {
  def toMapping = jSingleObject("type", jString("text"))
}
implicit def enumerationSeqMapping[T <: Enumeration#Value]: MappingEncoder[Seq[T]] = new MappingEncoder[Seq[T]] {
  def toMapping = jSingleObject("type", jString("text"))
}
The second implicit was needed as some of the case classes had members of type Seq[T] where T is some enumeration type.
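For example (a sketch; Connection is a made-up case class name), with those two implicits in scope the derivation should also go through for classes that mix plain enumeration fields with sequences of them:
case class Connection(
  id: Int,
  state: ConnectionState.ConnectionState,
  history: Seq[ConnectionState.ConnectionState])
// implicitly[MappingEncoder[Connection]].toMapping should now derive without a
// per-enumeration instance, mapping both enumeration fields to {"type": "text"}.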
I'm trying to implement a simple type class pattern. It's supposed to work similarly to scalaz's type classes. Unfortunately I can't get it to work. I have a trait Str:
trait Str[T] {
  def str(t: T): String
}
object Str {
  def apply[T](implicit instance: Str[T]): Str[T] = instance
}
And an implicit instance of it in my app object:
object Temp extends App {
  implicit val intStr = new Str[Int] {
    def str(i: Int) = i.toString
  }
  1.str // error: value str is not a member of Int
}
I would appreciate any insight.
All you can do right now is
Str[Int].str(1)
To use 1.str you need to introduce an implicit conversion.
You can for example use this approach:
implicit class StrOps[A](val self: A) extends AnyVal {
  def str(implicit S: Str[A]) = S.str(self)
}
Which gives:
scala> 1.str
res2: String = 1
Just when I thought I understood the basics of Scala's type system... :/
I'm trying to implement a class that reads the contents of a file and outputs a set of records. A record might be a single line, but it could also be a block of bytes, or anything. So what I'm after is a structure that allows the type of Reader to imply the type of the Record, which in turn will imply the correct Parser to use.
This structure works as long as MainApp.records(f) only returns one type of Reader. As soon as it can return more, I get this error:
could not find implicit value for parameter parser
I think the problem lies with the typed trait definitions at the top, but I cannot figure out how to fix the issue...
import java.io.File
import scala.io.Source

// Core traits
trait Record[T]
trait Reader[T] extends Iterable[Record[T]]
trait Parser[T] {
  def parse(r: Record[T]): Option[Int]
}
// Concrete implementations
class LineRecord[T] extends Record[T]
class FileReader[T](f: File) extends Reader[T] {
  val lines = Source.fromFile(f).getLines()
  def iterator: Iterator[LineRecord[T]] =
    new Iterator[LineRecord[T]] {
      def next() = new LineRecord[T]
      def hasNext = lines.hasNext
    }
}
trait TypeA
object TypeA {
  implicit object TypeAParser extends Parser[TypeA] {
    def parse(r: Record[TypeA]): Option[Int] = ???
  }
}
trait TypeB
object TypeB {
  implicit object TypeBParser extends Parser[TypeB] {
    def parse(r: Record[TypeB]): Option[Int] = ???
  }
}
// The "app"
object MainApp {
def process(f: File) =
records(f) foreach { r => parse(r) }
def records(f: File) = {
if(true)
new FileReader[TypeA](f)
else
new FileReader[TypeB](f)
}
def parse[T](r: Record[T])(implicit parser: Parser[T]): Option[Int] =
parser.parse(r)
}
First off you must import the implicit objects in order to use them:
import TypeA._
import TypeB._
That's not enough though. It seems like you're trying to apply implicits dynamically. That's not possible; they have to be found at compile time.
If you import the objects as above and change records so that the compiler can infer the concrete type parameter, it will run fine:
def records(f: File) = new FileReader[TypeA](f)
But then it may not be what you were looking for ;)
The problem is that the return type of your records method is basically FileReader[_] (since it can return either FileReader[TypeA] or FileReader[TypeB]), and you don't provide an implicit argument of type Parser[Any]. If you remove the if-expression the return type is inferred as FileReader[TypeA], which works fine. I'm not sure what you're trying to do, but obviously the compiler can't select an implicit argument based upon a type that is only known at runtime.
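If the record type is actually fixed by the caller, one way to keep everything at compile time (a sketch, not from the answers above; file names are placeholders) is to carry T as a type parameter all the way up, so the parser is resolved at the call site:
def process[T](f: File)(implicit parser: Parser[T]): Unit =
  new FileReader[T](f) foreach { r => parser.parse(r) }

process[TypeA](new File("a.dat")) // compiles; TypeA.TypeAParser is found via TypeA's companion
process[TypeB](new File("b.dat")) // compiles; TypeB.TypeBParser is found (parse itself is still ???)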
1) Using a type with an implicit inside as a type parameter doesn't bind that implicit to the host type. To do this, change the objects to traits and mix them in instead of generalizing (type-parametrizing):
def records(f: File) = {
  if(true)
    new FileReader(f) with TypeA
  else
    new FileReader(f) with TypeB
}
2) The parser should be in scope of the function that calls parse. So you may try something like this:
def process(f: File) = {
  val reader = records(f)
  import reader._
  reader foreach { r => parse(r) }
}
Plan B) A simpler alternative is to define type-parameter-specific implicit classes inside MainApp (or some mixed-in trait), but it will only work if TypeA/TypeB is known at compile time, so that the records method can return a concrete type:
implicit class TypeAParser(r: Record[TypeA]) {
  def parse: Option[Int] = ???
}
implicit class TypeBParser(r: Record[TypeB]) {
  def parse: Option[Int] = ???
}
def process[T <: TypeAorB](f: File) =
  records[T](f).foreach(_.parse)
def records[T <: TypeAorB](f: File) = new FileReader[T](f)
Here is, I think, the full set of modifications you need to do to get where I think you want to go.
import scala.io.Source
import java.io.File
import reflect.runtime.universe._

// Core traits
trait Record[+T]
trait Reader[+T] extends Iterable[Record[T]]
trait Parser[-T] {
  def parse(r: Record[T]): Option[Int]
}
// Concrete implementations [unmodified]
class LineRecord[T] extends Record[T]
class FileReader[T](f: File) extends Reader[T] {
  val lines = Source.fromFile(f).getLines()
  def iterator: Iterator[LineRecord[T]] =
    new Iterator[LineRecord[T]] {
      def next() = new LineRecord[T]
      def hasNext = lines.hasNext
    }
}
sealed trait Alternatives
case class TypeA() extends Alternatives
object TypeA {
  implicit object TypeAParser extends Parser[TypeA] {
    def parse(r: Record[TypeA]): Option[Int] = ???
  }
}
case class TypeB() extends Alternatives
object TypeB {
  implicit object TypeBParser extends Parser[TypeB] {
    def parse(r: Record[TypeB]): Option[Int] = ???
  }
}
class ParseAlternator(parserA: Parser[TypeA], parserB: Parser[TypeB]) extends Parser[Alternatives] {
  def parse(r: Record[Alternatives]): Option[Int] = r match {
    case x: Record[TypeA @unchecked] if typeOf[Alternatives] =:= typeOf[TypeA] => parserA.parse(x)
    case x: Record[TypeB @unchecked] if typeOf[Alternatives] =:= typeOf[TypeB] => parserB.parse(x)
  }
}
object ParseAlternator {
  implicit def parseAlternator(implicit parserA: Parser[TypeA], parserB: Parser[TypeB]): Parser[Alternatives] = new ParseAlternator(parserA, parserB)
}
// The "app"
object MainApp {
import ParseAlternator._
def process(f: File) =
records(f) foreach { r => parse(r) }
def records(f: File): Reader[Alternatives] = {
if(true)
new FileReader[TypeA](f)
else
new FileReader[TypeB](f)
}
def parse[T](r: Record[T])(implicit parser: Parser[T]): Option[Int] =
parser.parse(r)
}
The gist of it is: all of this would be completely classical if only your parse instance did not have to pattern-match on a generic type but dealt directly with an Alternatives instead.
It's this limitation (inherited from the JVM), namely that Scala can't properly pattern-match on an object of a parametric type, that requires the reflection and typeOf usage. Without it, you would just have type alternatives for your content (TypeA, TypeB), which you would add to a sealed trait, and on which you would dispatch in an implicit that produces a Parser for their supertype.
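To see the limitation in isolation (a sketch, separate from the solution above): the two type patterns below cannot be told apart at runtime, which is exactly why the unchecked annotations and typeOf guards appear in ParseAlternator.
def whichRecord(r: Record[_]): String = r match {
  case _: Record[TypeA] => "A" // compiler warns: the type argument is unchecked (erased)
  case _: Record[TypeB] => "B" // never reached at runtime; the first pattern matches any Record
}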
Of course this isn't the only solution, it's just what I think is the meeting point of what's closest to what you're trying to do, with what's most idiomatic.
I've found extremely weird behaviour (Scala 2.9.1) when using and defining implicit values. Can anyone explain it, or is it a Scala bug?
I've created a self contained example:
object AnnoyingObjectForNoPurpose {
  trait Printer[T] {
    def doPrint(v: T): Unit
  }
  def print[T : Printer](v: T) = implicitly[Printer[T]].doPrint(v)
  trait DelayedRunner extends DelayedInit {
    def delayedInit(x: => Unit) { x }
  }
  // this works, as it should
  object Normal extends DelayedRunner {
    implicit val imp = new Printer[Int] {
      def doPrint(v: Int) = println(v + " should work")
    }
    print(343)
  }
  // this compiles, but it shouldn't
  // and won't run, cause the implicit is still null
  object FalsePositive extends DelayedRunner {
    print(123)
    implicit val imp = new Printer[Int] {
      def doPrint(v: Int) = println(v + " should not compile")
    }
  }
  def main(args: Array[String]) {
    implicit val imp = new Printer[Int] {
      def doPrint(v: Int) = println(v + " should work")
    }
    print(44)
    // print(33.0) // correctly doesn't work
    Normal        // force it to run
    FalsePositive // force this to run too
  }
}
Suppose you changed your definition of delayedInit to be a no-op, i.e.
def delayedInit(x: => Unit) { }
Then in your main method do something like
println("FP.imp: " + FalsePositive.imp)
As expected that will print FP.imp: null, but the real point of the exercise is to illustrate that the block that defines the body of FalsePositive is acting like a regular class body, not a function body. It's defining public members when it sees val, not local variables.
If you added a method to AnnoyingObjectForNoPurpose like the following, it wouldn't compile because print's implicit requirement isn't satisfied.
def fails {
  print(321)
  implicit val cantSeeIt = new Printer[Int] {
    def doPrint(v: Int) = println(v + " doesn't compile")
  }
}
However if you defined a class along the same principle, it would compile, but fail at runtime when initialized, just like your FalsePositive example.
class Fine {
  print(321)
  implicit val willBeNull = new Printer[Int] {
    def doPrint(v: Int) = println(v + " compiles, but fails")
  }
}
To be clear, the compile behavior of Fine has nothing to do with the presence of the implicit. Class/object initializers are very happy to compile with val initializers that forward-reference vals which have not yet been initialized.
object Boring {
  val b = a
  val a = 1
  println("a=%s b=%s".format(a, b))
}
Boring compiles just fine and when it is referenced, it prints a=1 b=0
It seems like your question boils down to "Should the body of a class/object deriving from DelayedInit be compiled as if it's a class body or a function block?"
It looks like Odersky picked the former, but you're hoping for the latter.
That's the same bug as if I write this:
object Foo {
  println(x)
  val x = 5
}
It ain't a bug. Your constructor flows down the object body and happens in order; DelayedInit isn't the cause. This is why you need to be careful when using vals/vars and ensure they are initialized first. This is also why people use lazy vals to resolve initialization-order issues.
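For example (a sketch based on the Boring object above), making the forward-referenced val lazy fixes the ordering:
object LessBoring {
  val b = a       // forces the lazy val, so a is initialized before b is assigned
  lazy val a = 1
  println("a=%s b=%s".format(a, b)) // prints a=1 b=1
}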