I want to save a collection of FieldMapping classes as a JSON string:
abstract class Field {
def clazz: Class[_]
def name: String
}
case class StringField(name: String) extends Field {
override def clazz: Class[_] = classOf[String]
}
case class DateField(name: String) extends Field {
override def clazz: Class[_] = classOf[Date]
}
... etc - full code here:
https://github.com/alexeyOnGitHub/scala-typesafe/blob/master/src/main/scala/com/example/model/Field.scala
Circe code:
import com.example.model.{DateField, Field, FieldMapping, StringField}
import io.circe.generic.semiauto.{deriveDecoder, deriveEncoder}
import io.circe.{Decoder, Encoder}
object CirceBoilerplateForConfigs {
implicit val fieldDecoder: Decoder[StringField] = deriveDecoder[StringField]
implicit val fieldEncoder: Encoder[StringField] = deriveEncoder[StringField]
implicit val dateDecoder: Decoder[DateField] = deriveDecoder[DateField]
implicit val dateEncoder: Encoder[DateField] = deriveEncoder[DateField]
implicit val fooDecoder: Decoder[FieldMapping] = deriveDecoder[FieldMapping]
implicit val fooEncoder: Encoder[FieldMapping] = deriveEncoder[FieldMapping]
}
Error:(14, 65) could not find Lazy implicit value of type io.circe.generic.decoding.DerivedDecoder[com.example.model.FieldMapping]
implicit val fooDecoder: Decoder[FieldMapping] = deriveDecoder[FieldMapping]
Error:(14, 65) not enough arguments for method deriveDecoder: (implicit decode: shapeless.Lazy[io.circe.generic.decoding.DerivedDecoder[com.example.model.FieldMapping]])io.circe.Decoder[com.example.model.FieldMapping]. Unspecified value parameter decode.
implicit val fooDecoder: Decoder[FieldMapping] = deriveDecoder[FieldMapping]
Error:(15, 65) could not find Lazy implicit value of type io.circe.generic.encoding.DerivedObjectEncoder[com.example.model.FieldMapping]
implicit val fooEncoder: Encoder[FieldMapping] = deriveEncoder[FieldMapping]
Error:(15, 65) not enough arguments for method deriveEncoder: (implicit encode: shapeless.Lazy[io.circe.generic.encoding.DerivedObjectEncoder[com.example.model.FieldMapping]])io.circe.ObjectEncoder[com.example.model.FieldMapping]. Unspecified value parameter encode.
implicit val fooEncoder: Encoder[FieldMapping] = deriveEncoder[FieldMapping]
Field should be a sealed trait (this won't work with an abstract class or a non-sealed trait).
The following code compiles:
import java.util.Date
sealed trait Field {
def clazz: Class[_]
def name: String
}
case class StringField(name: String) extends Field {
override def clazz: Class[_] = classOf[String]
}
case class DateField(name: String) extends Field {
override def clazz: Class[_] = classOf[Date]
}
case class FieldMapping(fieldInConnector1: Option[Field],
fieldInConnector2: Option[Field],
selected: Boolean,
defaultValue: String)
import io.circe.generic.semiauto.{deriveDecoder, deriveEncoder}
import io.circe.{Decoder, Encoder}
object CirceBoilerplateForConfigs {
implicit val stringDecoder: Decoder[StringField] = deriveDecoder[StringField]
implicit val stringEncoder: Encoder[StringField] = deriveEncoder[StringField]
implicit val dateDecoder: Decoder[DateField] = deriveDecoder[DateField]
implicit val dateEncoder: Encoder[DateField] = deriveEncoder[DateField]
implicit val fieldDecoder: Decoder[Field] = deriveDecoder[Field]
implicit val fieldEncoder: Encoder[Field] = deriveEncoder[Field]
implicit val fooDecoder: Decoder[FieldMapping] = deriveDecoder[FieldMapping]
implicit val fooEncoder: Encoder[FieldMapping] = deriveEncoder[FieldMapping]
}
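For completeness, here is a round-trip sketch (it assumes circe-parser is also on the classpath; note that with semiauto generic derivation, subtypes of the sealed trait are encoded with the constructor name as a wrapper key):
import io.circe.parser.decode
import io.circe.syntax._
import CirceBoilerplateForConfigs._
val mapping = FieldMapping(Some(StringField("first")), Some(DateField("created")), selected = true, defaultValue = "")
val json: String = mapping.asJson.noSpaces
// e.g. {"fieldInConnector1":{"StringField":{"name":"first"}}, ...}
val decoded = decode[FieldMapping](json) // Right(mapping)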
Consider the following setup:
trait Foo[A]
object Foo extends Priority2
trait Priority0 {
implicit def foo1: Foo[Int] = new Foo[Int] {}
}
trait Priority1 extends Priority0 {
implicit def foo2: Foo[Boolean] = new Foo[Boolean] {}
}
trait Priority2 extends Priority1 {
implicit def foo3: Foo[Double] = new Foo[Double] {}
}
Now, in a REPL (having loaded the above code up), I can do the following:
scala> def implicitlyFoo[A](implicit foo: Foo[A]) = foo
implicitlyFoo: [A](implicit foo: Foo[A])Foo[A]
scala> implicitlyFoo
res1: Foo[Double] = Priority2$$anon$3@79703b86
Is there a way to encode with some typelevel magic that I want to skip over the instances with A =:= Double, but still let type inference figure out what A is?
I do not want to shadow foo3. This is an MVCE: in my real case, foo3 is a def with other implicit arguments (and may play an indirect role in deriving other Foo's).
I've tried =:!= from shapeless but to no avail:
scala> import shapeless._
import shapeless._
scala> def implicitlyFoo2[A](implicit foo: Foo[A], ev: A =:!= Double) = foo
implicitlyFoo2: [A](implicit foo: Foo[A], implicit ev: A =:!= Double)Foo[A]
scala> implicitlyFoo2
<console>:16: error: ambiguous implicit values:
both method neqAmbig1 in package shapeless of type [A]=> A =:!= A
and method neqAmbig2 in package shapeless of type [A]=> A =:!= A
match expected type Double =:!= Double
implicitlyFoo2
^
A dirty hack is to downcast the macro context to its implementation and use compiler internals.
import scala.language.experimental.macros
import scala.reflect.macros.whitebox
trait Foo[A] {
def say: String
}
trait Priority0 {
implicit def foo1: Foo[Int] = new Foo[Int] {
override def say: String = "int"
}
}
trait Priority1 extends Priority0 {
implicit def foo2: Foo[Boolean] = new Foo[Boolean] {
override def say: String = "bool"
}
}
trait Priority2 extends Priority1 {
implicit def foo3: Foo[Double] = new Foo[Double] {
override def say: String = "double"
}
}
object Foo extends Priority2
def materializeSecondFoo[A]: Foo[A] = macro impl
def impl(c: whitebox.Context): c.Tree = {
import c.universe._
val context = c.asInstanceOf[reflect.macros.runtime.Context]
val global: context.universe.type = context.universe
val analyzer: global.analyzer.type = global.analyzer
var infos = List[analyzer.ImplicitInfo]()
new analyzer.ImplicitSearch(
tree = EmptyTree.asInstanceOf[global.Tree],
pt = typeOf[Foo[_]].asInstanceOf[global.Type],
isView = false,
context0 = global.typer.context.makeImplicit(reportAmbiguousErrors = false),
pos0 = c.enclosingPosition.asInstanceOf[global.Position]
) {
override def searchImplicit(
implicitInfoss: List[List[analyzer.ImplicitInfo]],
isLocalToCallsite: Boolean
): analyzer.SearchResult = {
val implicitInfos = implicitInfoss.flatten
if (implicitInfos.nonEmpty) {
infos = implicitInfos
}
super.searchImplicit(implicitInfoss, isLocalToCallsite)
}
}.bestImplicit
val secondBest = infos.tail.head
global.gen.mkAttributedRef(secondBest.pre, secondBest.sym).asInstanceOf[Tree]
}
materializeSecondFoo.say // bool
Tested in 2.12.8. Motivated by shapeless.Cached.
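Since the macro reaches into reflect.macros.runtime and the analyzer, the compiler itself must be on the classpath of the project that defines the macro; in sbt that is roughly the following (a sketch, adjust to your build):
// build.sbt: compiler internals are used, not just scala-reflect
libraryDependencies += "org.scala-lang" % "scala-reflect" % scalaVersion.value
libraryDependencies += "org.scala-lang" % "scala-compiler" % scalaVersion.value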
In 2.13.0 materializeSecondFoo.say should be replaced with
val m = materializeSecondFoo
m.say
The latter still works in 2.13.10.
Scala 3 implementation:
import scala.quoted.{Quotes, Type, Expr, quotes}
import dotty.tools.dotc.typer.{Implicits => dottyImplicits}
transparent inline def materializeSecondFoo: Foo[_] = ${impl}
def impl(using Quotes): Expr[Foo[_]] = {
import quotes.reflect.*
given c: dotty.tools.dotc.core.Contexts.Context =
quotes.asInstanceOf[scala.quoted.runtime.impl.QuotesImpl].ctx
val typer = c.typer
val search = new typer.ImplicitSearch(
TypeRepr.of[Foo[_]].asInstanceOf[dotty.tools.dotc.core.Types.Type],
dotty.tools.dotc.ast.tpd.EmptyTree,
Position.ofMacroExpansion.asInstanceOf[dotty.tools.dotc.util.SourcePosition].span
)
def eligible(contextual: Boolean): List[dottyImplicits.Candidate] =
if contextual then
if c.gadt.isNarrowing then
dotty.tools.dotc.core.Contexts.withoutMode(dotty.tools.dotc.core.Mode.ImplicitsEnabled) {
c.implicits.uncachedEligible(search.wildProto)
}
else c.implicits.eligible(search.wildProto)
else search.implicitScope(search.wildProto).eligible
def implicits(contextual: Boolean): List[dottyImplicits.SearchResult] =
eligible(contextual).map(search.tryImplicit(_, contextual))
val contextualImplicits = implicits(true)
val nonContextualImplicits = implicits(false)
val contextualSymbols = contextualImplicits.map(_.tree.symbol)
val filteredNonContextual = nonContextualImplicits.filterNot(sr => contextualSymbols.contains(sr.tree.symbol))
val successes = (contextualImplicits ++ filteredNonContextual).collect {
case success: dottyImplicits.SearchSuccess => success.tree.asInstanceOf[ImplicitSearchSuccess].tree
}
successes.tail.head.asExprOf[Foo[_]]
}
materializeSecondFoo.say // bool
val foo = materializeSecondFoo
foo: Foo[Boolean] // compiles
Scala 3.2.0.
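As in Scala 2, this sketch uses compiler internals (dotty.tools.dotc), so the Scala 3 compiler has to be a library dependency of the macro project; in sbt, roughly:
// build.sbt for the Scala 3 macro project
libraryDependencies += "org.scala-lang" %% "scala3-compiler" % scalaVersion.value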
My objective is to create a MyDataFrame class that will know how to fetch data at a given path, but I want to provide type safety. I'm having some trouble using a frameless.TypedDataset with type bounds on remote data. For example:
sealed trait Schema
final case class TableA(id: String) extends Schema
final case class TableB(id: String) extends Schema
class MyDataFrame[T <: Schema](path: String, implicit val spark: SparkSession) {
def read = TypedDataset.create(spark.read.parquet(path)).as[T]
}
But I keep getting could not find implicit value for evidence parameter of type frameless.TypedEncoder[org.apache.spark.sql.Row]. I know that TypedDataset.create needs an Injection for this to work, but I'm not sure how I would write one for a generic T. I thought the compiler would be able to deduce that it would work, since all subtypes of Schema are case classes.
Anybody ever run into this?
All implicit parameters should be in the last parameter list and this parameter list should be separate from non-implicit ones.
If you try to compile
class MyDataFrame[T <: Schema](path: String)(implicit spark: SparkSession) {
def read = TypedDataset.create(spark.read.parquet(path)).as[T]
}
you'll see the error:
Error:(11, 35) could not find implicit value for evidence parameter of type frameless.TypedEncoder[org.apache.spark.sql.Row]
def read = TypedDataset.create(spark.read.parquet(path)).as[T]
So let's just add the corresponding implicit parameter:
class MyDataFrame[T <: Schema](path: String)(implicit spark: SparkSession, te: TypedEncoder[Row]) {
def read = TypedDataset.create(spark.read.parquet(path)).as[T]
}
we'll get the error:
Error:(11, 64) could not find implicit value for parameter as: frameless.ops.As[org.apache.spark.sql.Row,T]
def read = TypedDataset.create(spark.read.parquet(path)).as[T]
So let's add one more implicit parameter:
import frameless.ops.As
import frameless.{TypedDataset, TypedEncoder}
import org.apache.spark.sql.{Row, SparkSession}
class MyDataFrame[T <: Schema](path: String)(implicit spark: SparkSession, te: TypedEncoder[Row], as: As[Row, T]) {
def read = TypedDataset.create(spark.read.parquet(path)).as[T]
}
or with kind-projector
class MyDataFrame[T <: Schema : As[Row, ?]](path: String)(implicit spark: SparkSession, te: TypedEncoder[Row]) {
def read = TypedDataset.create(spark.read.parquet(path)).as[T]
}
You can create a custom type class:
trait Helper[T] {
implicit def te: TypedEncoder[Row]
implicit def as: As[Row, T]
}
object Helper {
implicit def mkHelper[T](implicit te0: TypedEncoder[Row], as0: As[Row, T]): Helper[T] = new Helper[T] {
override implicit def te: TypedEncoder[Row] = te0
override implicit def as: As[Row, T] = as0
}
}
class MyDataFrame[T <: Schema : Helper](path: String)(implicit spark: SparkSession) {
val h = implicitly[Helper[T]]
import h._
def read = TypedDataset.create(spark.read.parquet(path)).as[T]
}
or
class MyDataFrame[T <: Schema](path: String)(implicit spark: SparkSession, h: Helper[T]) {
import h._
def read = TypedDataset.create(spark.read.parquet(path)).as[T]
}
or
import org.apache.spark.sql.DataFrame
trait Helper[T] {
def create(dataFrame: DataFrame): TypedDataset[T]
}
object Helper {
implicit def mkHelper[T](implicit te: TypedEncoder[Row], as: As[Row, T]): Helper[T] =
(dataFrame: DataFrame) => TypedDataset.create(dataFrame).as[T]
}
class MyDataFrame[T <: Schema : Helper](path: String)(implicit spark: SparkSession) {
def read = implicitly[Helper[T]].create(spark.read.parquet(path))
}
or
class MyDataFrame[T <: Schema](path: String)(implicit spark: SparkSession, h: Helper[T]) {
def read = h.create(spark.read.parquet(path))
}
Corrected version:
import org.apache.spark.sql.Encoder
import frameless.{TypedDataset, TypedEncoder}
class MyDataFrame[T <: Schema](path: String)(implicit
spark: SparkSession,
e: Encoder[T],
te: TypedEncoder[T]
) {
def read: TypedDataset[T] = TypedDataset.create[T](spark.read.parquet(path).as[T])
}
or using context bounds
class MyDataFrame[T <: Schema : Encoder : TypedEncoder](path: String)(implicit
spark: SparkSession
) {
def read: TypedDataset[T] = TypedDataset.create[T](spark.read.parquet(path).as[T])
}
Testing:
I converted a JSON file {"id": "xyz"} into a parquet file and then:
sealed trait Schema
final case class TableA(id: String) extends Schema
final case class TableB(id: String) extends Schema
import org.apache.spark.sql.SparkSession
implicit val spark: SparkSession = SparkSession.builder
.master("local")
.appName("Spark SQL basic example")
.getOrCreate()
import spark.implicits._
import frameless.syntax._
val res: TypedDataset[TableA] = new MyDataFrame[TableA]("path/to/parquet/file").read
println(res) // [id: string]
res.foreach(println).run() // TableA(xyz)
I am trying to abstract out the json parsing logic that gets triggered for a specific type.
I started out creating a Parser trait as follows:
trait Parser {
def parse[T](payload : String) : Try[T]
}
I have an implementation of this trait called JsonParser which is:
class JsonParser extends Parser {
override def parse[T](payload: String): Try[T] = parseInternal(payload)
private def parseInternal[T:JsonParserLike](payload:String):Try[T] = {
implicitly[JsonParserLike[T]].parse(payload)
}
}
The JsonParserLike is defined as follows:
trait JsonParserLike[T] {
def parse(payload: String): Try[T]
}
object JsonParserLike {
implicit val type1Parser:JsonParserLike[Type1] = new JsonParserLike[Type1]
{
//json parsing logic for Type1
}
implicit val type2Parser:JsonParserLike[Type2] = new JsonParserLike[Type2]
{
//json parsing logic for Type2
}
}
When I try compiling the above, the compilation fails with:
ambiguous implicit values:
[error] both value type1Parser in object JsonParserLike of type => parser.jsonutil.JsonParserLike[parser.models.Type1]
[error] and value type2Parser in object JsonParserLike of type => parser.jsonutil.JsonParserLike[parser.models.Type2]
[error] match expected type parser.jsonutil.JsonParserLike[T]
[error] override def parse[T](payload: String): Try[T] = parseInternal(payload)
Not sure why the implicit resolution is failing here. Is it because the parse method in the Parser trait doesn't have an argument of type parameter T?
I tried another approach as follows:
trait Parser {
def parse[T](payload : String) : Try[T]
}
class JsonParser extends Parser {
override def parse[T](payload: String): Try[T] = {
import workflow.parser.JsonParserLike._
parseInternal[T](payload)
}
private def parseInternal[U](payload:String)(implicit c:JsonParserLike[U]):Try[U] = {
c.parse(payload)
}
}
The above gives me the following error:
could not find implicit value for parameter c: parser.JsonParserLike[T]
[error]     parseInternal[T](payload)
[error]                     ^
Edit: Adding the session from the REPL
scala> case class Type1(name: String)
defined class Type1
scala> case class Type2(name:String)
defined class Type2
scala> :paste
// Entering paste mode (ctrl-D to finish)
import scala.util.{Failure, Success, Try}
trait JsonParserLike[+T] {
def parse(payload: String): Try[T]
}
object JsonParserLike {
implicit val type1Parser:JsonParserLike[Type1] = new JsonParserLike[Type1] {
override def parse(payload: String): Try[Type1] = Success(Type1("type1"))
}
implicit val type2Parser:JsonParserLike[Type2] = new JsonParserLike[Type2] {
override def parse(payload: String): Try[Type2] = Success(Type2("type2"))
}
}
// Exiting paste mode, now interpreting.
import scala.util.{Failure, Success, Try}
defined trait JsonParserLike
defined object JsonParserLike
scala> :paste
// Entering paste mode (ctrl-D to finish)
trait Parser {
def parse[T](payload : String) : Try[T]
}
class JsonParser extends Parser {
override def parse[T](payload: String): Try[T] = parseInternal(payload)
private def parseInternal[T:JsonParserLike](payload:String):Try[T] = {
implicitly[JsonParserLike[T]].parse(payload)
}
}
// Exiting paste mode, now interpreting.
<pastie>:24: error: ambiguous implicit values:
both value type1Parser in object JsonParserLike of type => JsonParserLike[Type1]
and value type2Parser in object JsonParserLike of type => JsonParserLike[Type2]
match expected type JsonParserLike[T]
override def parse[T](payload: String): Try[T] = parseInternal(payload)
As I've already tried to explain in the comments, the problem is that the method
override def parse[T](payload: String): Try[T] = parseInternal(payload)
does not accept any JsonParserLike[T] instances. Therefore, the compiler has no way to insert the right instance of JsonParserLike[T] at the call site (where the type T is known).
To make it work, one would have to add some kind of token that uniquely identifies type T to the argument list of parse. One crude way would be to add a JsonParserLike[T] itself:
import util.Try
trait Parser {
def parse[T: JsonParserLike](payload : String) : Try[T]
}
class JsonParser extends Parser {
override def parse[T: JsonParserLike](payload: String): Try[T] =
parseInternal(payload)
private def parseInternal[T:JsonParserLike](payload:String):Try[T] = {
implicitly[JsonParserLike[T]].parse(payload)
}
}
trait JsonParserLike[T] {
def parse(payload: String): Try[T]
}
object JsonParserLike {
implicit val type1Parser: JsonParserLike[String] = ???
implicit val type2Parser: JsonParserLike[Int] = ???
}
Now it compiles, because the JsonParserLike[T] required by parseInternal is inserted automatically as an implicit parameter to parse.
This might not be exactly what you want, because it creates a hard dependency between the Parser interface and the JsonParserLike type class. You might want to get some inspiration from something like shapeless.Typeable to get rid of the JsonParserLike in the Parser interface, or just rely on circe right away.
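For illustration, a call site then looks like this (a minimal sketch; it assumes a real JsonParserLike[String] instance in scope instead of the ??? stub above):
import scala.util.Try
val parser = new JsonParser
// T is concrete at the call site, so the compiler resolves the instance
// here and threads it down to parseInternal:
val result: Try[String] = parser.parse[String]("\"hello\"")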
It seems like there is extra complexity from mixing different kinds of polymorphism in both examples. Here is a minimal example of just a type class:
import scala.util.Try
// type class itself
trait JsonParser[T] {
def parse(payload: String): Try[T]
}
// type class instances
object JsonParserInstances {
implicit val type1Parser: JsonParser[Type1] = new JsonParser[Type1] {
def parse(payload: String): Try[Type1] = ???
}
implicit val type2Parser: JsonParser[Type2] = new JsonParser[Type2] {
def parse(payload: String): Try[Type2] = ???
}
}
// type class interface
object JsonInterface {
def parse[T](payload: String)(implicit T: JsonParser[T]): Try[T] = {
T.parse(payload)
}
}
def main(args: Array[String]): Unit = {
import JsonParserInstances._
JsonInterface.parse[Type1]("3")
JsonInterface.parse[Type2]("3")
}
More info:
the Typeclasses chapter in the free Scala with Cats book
I have a class as below:
trait RiskCheckStatusCode {
def code: String
def isSuccess: Boolean
}
object RiskCheckStatusCode {
val SUCCESS = SuccessRiskCheckStatusCode("1.1.1")
val FAIL = FailRiskCheckStatusCode("2.2.2")
case class SuccessRiskCheckStatusCode(code: String) extends RiskCheckStatusCode {
override def isSuccess = true
}
object SuccessRiskCheckStatusCode {
import spray.json.DefaultJsonProtocol._
implicit val formatter = jsonFormat1(SuccessRiskCheckStatusCode.apply)
}
case class FailRiskCheckStatusCode(code: String) extends RiskCheckStatusCode {
override def isSuccess = false
}
object FailRiskCheckStatusCode {
import spray.json.DefaultJsonProtocol._
implicit val formatter = jsonFormat1(FailRiskCheckStatusCode.apply)
}
}
and now I would like to convert the list of RiskCheckStatusCode to JSON:
object Main extends App{
import spray.json._
import spray.json.DefaultJsonProtocol._
val l = List(RiskCheckStatusCode.SUCCESS, RiskCheckStatusCode.FAIL)
implicit object RiskCheckStatusCodeJsonFormat extends JsonWriter[RiskCheckStatusCode] {
override def write(obj: RiskCheckStatusCode): JsValue = obj match {
case obj: SuccessRiskCheckStatusCode => obj.toJson
case obj: FailRiskCheckStatusCode => obj.toJson
}
}
def json[T](list: T)(implicit formatter: JsonWriter[T]) = {
print(list.toJson)
}
json(l)
}
but the json method cannot find a JsonWriter[RiskCheckStatusCode].
Can you explain why? Should I do this differently for a trait type?
Edit:
It works for
val l: RiskCheckStatusCode = RiskCheckStatusCode.SUCCESS
so the problem is with List[RiskCheckStatusCode], because I have a formatter for RiskCheckStatusCode, not for List[RiskCheckStatusCode]. I tried importing DefaultJsonProtocol, but it still does not work.
import spray.json.DefaultJsonProtocol._
Do I have to change the definition? From
implicit object RiskCheckStatusCodeJsonFormat extends JsonWriter[RiskCheckStatusCode]
to
implicit object RiskCheckStatusCodeJsonFormat extends JsonWriter[List[RiskCheckStatusCode]]
error:
Error:(28, 7) Cannot find JsonWriter or JsonFormat type class for List[com.example.status.RiskCheckStatusCode]
json(l)
Error:(28, 7) not enough arguments for method json: (implicit formatter: spray.json.JsonWriter[List[com.example.status.RiskCheckStatusCode]])Unit.
Unspecified value parameter formatter.
json(l)
Your code is fine; you just don't have toJson in scope (it is located in the package object of spray.json).
Add it and your code should compile:
object Main extends App with DefaultJsonProtocol {
import spray.json._
// ...
}
Furthermore, spray-json has some issues lifting JsonWriter through derived formats (see this for details).
You can switch to JsonFormat instead:
implicit object RiskCheckStatusCodeJsonFormat extends JsonFormat[RiskCheckStatusCode] {
override def write(obj: RiskCheckStatusCode): JsValue = obj match {
case obj: SuccessRiskCheckStatusCode => obj.toJson
case obj: FailRiskCheckStatusCode => obj.toJson
}
override def read(json: JsValue): RiskCheckStatusCode = ???
}
In addition, to clean up the type of your List, change the definition of RiskCheckStatusCode to (this explains more details):
sealed trait RiskCheckStatusCode extends Serializable with Product
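Putting it together, a minimal sketch of the call site (with the JsonFormat above in implicit scope; DefaultJsonProtocol's listFormat lifts the element format to List):
import spray.json._
import spray.json.DefaultJsonProtocol._
val codes: List[RiskCheckStatusCode] = List(RiskCheckStatusCode.SUCCESS, RiskCheckStatusCode.FAIL)
println(codes.toJson) // expected: [{"code":"1.1.1"},{"code":"2.2.2"}]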
Suppose I have a set of converters to String, as a type class:
import scala.reflect.runtime.universe._
abstract class ToStringConverter[T] {
def convert(value: T): String
}
implicit object IntToStringConverter extends ToStringConverter[Int] {
def convert(value: Int) = value.toString
}
implicit object DoubleStringConverter extends ToStringConverter[Double] {
def convert(value: Double) = value.toString
}
and a convert method that uses the type information to pick the right converter:
def convert[T](v: T)(implicit ev: ToStringConverter[T]): String = ev.convert(v)
This works fine if I have the concrete type in advance, for example:
scala> convert[Double](12.2)
res0: String = 12.2
scala> convert[Int](12)
res1: String = 12
Is it possible to use the convert method above with a runtime type, for example, with a type 't' below?
scala> val t = typeOf[Double]
t: reflect.runtime.universe.Type = Double
If you want to do the resolution at runtime, reflection is needed, as implicits are resolved at compile time. Code like this should do the job:
import scala.reflect.runtime.universe._
abstract class ToStringConverterAny {
def convertAny(value: Any): String
}
abstract class ToStringConverter[T] extends ToStringConverterAny {
def convertAny(value: Any): String = convert(value.asInstanceOf[T])
def convert(value: T): String
}
implicit object IntToStringConverter extends ToStringConverter[Int] {
def convert(value: Int) = value.toString
}
implicit object DoubleStringConverter extends ToStringConverter[Double] {
def convert(value: Double) = value.toString
}
val converters: Map[Type, ToStringConverterAny] = Map(
typeOf[Int] -> IntToStringConverter,
typeOf[Double] -> DoubleStringConverter
)
def convert(t: Type, v: Any) = {
converters(t).convertAny(v)
}
def convert[T](v: T)(implicit ev: ToStringConverter[T]): String = ev.convert(v)
convert[Double](12.2)
convert[Int](12)
val t = typeOf[Double]
val v: Any = 1.23
convert(t, v)
If you want to build the converters map automatically, you could also use reflection for this, but enumerating derived classes requires surprisingly non-trivial code (including class loaders, which is understandable when you think about it).
If you can make ToStringConverterAny sealed, enumerating its subclasses in a macro should be a bit easier; a rough sketch follows.
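An untested sketch of that direction (it assumes ToStringConverter itself is sealed, that every converter is an object, and ConverterRegistry is a hypothetical name; the macro must live in a separate compilation unit from its call sites):
import scala.language.experimental.macros
import scala.reflect.macros.blackbox
object ConverterRegistry {
// Builds Map(typeOf[Int] -> IntToStringConverter, ...) at compile time.
def all: Map[reflect.runtime.universe.Type, ToStringConverterAny] = macro impl
def impl(c: blackbox.Context): c.Tree = {
import c.universe._
val base = symbolOf[ToStringConverter[_]].asClass
// Caveat: knownDirectSubclasses may be empty if the subclasses have not
// been typechecked yet at the point where the macro expands.
val entries = base.knownDirectSubclasses.toList.map { sub =>
// Recover T from `object X extends ToStringConverter[T]`.
val t = sub.asClass.toType.baseType(base).typeArgs.head
q"_root_.scala.reflect.runtime.universe.typeOf[$t] -> ${sub.asClass.module}"
}
q"_root_.scala.collection.immutable.Map(..$entries)"
}
}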