Scala compiler infers Nothing for generic arguments

I'm trying something that I've seen in different shapes in different contexts before:
extending scala's query extensions with filterById(id: Id)
This is what I've tried:
trait TableWithId { self: Profile =>
import profile.simple._
trait HasId[Id] { self: Table[_] =>
def id: Column[Id]
}
implicit class HasIdQueryExt[Id: BaseColumnType, U]
(query: Query[Table[U] with HasId[Id], U]) {
def filterById(id: Id)(implicit s: Session) = query.filter(_.id === id)
def insertReturnId(m: U)(implicit s: Session): Id = query.returning(query.map(_.id)) += m
}
}
This works fine; no real magic there. But because there is no type constraint on the Table type, any query to which I apply filterById loses its specific type (it is now just a generic Table with HasId[Id]), and I can no longer reach its columns (except for _.id, of course).
I don't know how to type this implicit conversion so that this is prevented. Is it possible? The following "naive" solution does not work, because Scala now infers Nothing for the Id type:
implicit class HasIdQueryExt[Id: BaseColumnType, U, T <: Table[U] with HasId[Id]]
(query: Query[T, U]) { ... }
I find it kind of strange that suddenly the Id type is inferred as Nothing. How do I hint the compiler where to look for that Id type?
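For what it's worth, the inference problem can be reproduced without Slick. Below is a minimal, Slick-free sketch (hypothetical names, not necessarily the idiomatic Slick fix) of one workaround: keep the extension class parameterized only on T, and recover Id per method through a generalized type constraint, the same trick Option.flatten uses:

trait HasId[Id] { def id: Id }

// Fails: with [Id, T <: HasId[Id]], the compiler fixes T from the argument
// but has nothing left to pin Id down, so it falls back to Nothing.
// class Ext[Id, T <: HasId[Id]](val t: T) { def theId: Id = t.id }

// Workaround sketch: take only T, and recover Id per method via evidence.
class Ext[T](val t: T) {
  def theId[Id](implicit ev: T <:< HasId[Id]): Id = ev(t).id
}

object Demo extends App {
  case class Row(id: Int, name: String) extends HasId[Int]
  println(new Ext(Row(1, "a")).theId) // Id is inferred as Int
}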

This is my solution for a similar problem. I did use a specific type for the id, though:
trait GenericComponent { this: Profile =>
import profile.simple._
abstract class TableWithId[A](tag: Tag, name: String) extends Table[A](tag, name) {
def id = column[Option[UUID]]("id", O.PrimaryKey)
}
abstract class genericTable[T <: Table[A] , A] {
val table: TableQuery[T]
/**
* generic methods
*/
def insert(entry: A)(implicit session:Session): A = session.withTransaction {
table += entry
entry
}
def insertAll(entries: List[A])(implicit session:Session) = session.withTransaction { table.insertAll(entries:_*) }
def all: List[A] = database.withSession { implicit session =>
table.list.map(_.asInstanceOf[A])
}
}
/**
* generic queries for any table which has id:Option[UUID]
*/
abstract class genericTableWithId[T <: TableWithId[A], A <:ObjectWithId ] extends genericTable[T, A] {
def forIds(ids:List[UUID]): List[A] = database.withSession { implicit session =>
ids match {
case Nil => Nil
case x :: xs => table.filter(_.id inSet ids).list.map(_.asInstanceOf[A])
}
}
def forId(id: UUID): Option[A] = database.withSession { implicit session => table.filter(_.id === id).firstOption }
}
}
and then for your concrete component:
case class SomeObjectRecord(
override val id:Option[UUID] = None,
name:String) extends ObjectWithId(id){
// your function definitions there
}
trait SomeObjectComponent extends GenericComponent { this: Profile =>
import profile.simple._
class SomeObjects(tag: Tag) extends TableWithId[SomeObjectRecord](tag, "some_objects") {
def name = column[String]("name", O.NotNull)
def * = (id, name) <> (SomeObjectRecord.tupled, SomeObjectRecord.unapply)
def nameIdx = index("u_name", (name), unique = true)
}
object someobjects extends genericTableWithId[SomeObjects, SomeObjectRecord] {
val table = TableQuery[SomeObjects]
// your additional methods there; you get insert and insertAll from the parent
}
}
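A hypothetical usage sketch of the generic methods, written as if inside something that mixes in this component and supplies a concrete Profile and the database value used above:

import java.util.UUID

val newId = UUID.randomUUID()
database.withSession { implicit session =>
  // insert comes from genericTable
  someobjects.insert(SomeObjectRecord(id = Some(newId), name = "widget"))
}
// forId and forIds come from genericTableWithId and manage their own sessions
val one: Option[SomeObjectRecord] = someobjects.forId(newId)
val matching: List[SomeObjectRecord] = someobjects.forIds(List(newId))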


Import does not bring implicits in scope

I am facing an error about an implicit value that cannot be found in scope:
Error:(38, 68) could not find implicit value for parameter strategy: XXX.NeoStrategy[T]
(summoner: Summoner, v: String) => summoner.summonEvaluation[T](v)
I implemented Tim's answer to that question: https://stackoverflow.com/a/56668734/3896166
I tried to import the implicit objects of NeoStrategies within the TypeTable scope with:
import XXX.NeoStrategies._
but with no success.
The following are the files with the base logic I want to use:
object TypeLib {
sealed trait Type_top
trait Type_A extends Type_top
trait Type_B extends Type_top
}
trait NeoStrategy[T <: Type_top] {
def evaluate(v: String, helper: Helper): Int
}
object NeoStrategies {
implicit object NeoStrategy_A extends NeoStrategy[Type_A] {
def evaluate(v: String, helper: Helper): Int = 1
}
implicit object NeoStrategy_B extends NeoStrategy[Type_B] {
def evaluate(v: String, helper: Helper): Int = 2
}
}
case class Helper(name: String) {
def summonEvaluation[T <: Type_top](v: String)(implicit strategy: NeoStrategy[T]): Int = {
strategy.evaluate(v, this)
}
}
trait TypeOMap {
protected def computeStuff[T <: Type_top]: (Helper, String) => Int
protected val computeMap: Map[String, (Helper, String) => Int]
}
import XXX.NeoStrategies._
trait TypeTable extends TypeOMap {
override protected def computeStuff[T <: Type_top]: (Helper, String) => Int = {
(helper: Helper, v: String) => helper.summonEvaluation[T](v)
}
override protected val computeMap = Map(
"a" -> computeStuff[Type_A],
"b" -> computeStuff[Type_B]
)
}
class Summoner extends TypeTable {
def callsMapAndEvaluates(typeIdentifier: String, helper: Helper, param: String): Double = {
computeMap(typeIdentifier)(helper, param)
}
}
object StackO {
def main(args: Array[String]): Unit = {
val mySummoner = new Summoner
// mySummoner allows the selecting of a given type with
// its "typeIdentifier" input in combination with the "TypeTable" it extends
val r = mySummoner.callsMapAndEvaluates("a", Helper("make it right"), "I, parameter")
}
}
This is not the first time I have used implicits, but it is the first time with something like the computeMap above. I understand the logic of it, but fail at making it work.
How can I have summoner.summonEvaluation[T](v) find the required implicit?
Just add a context bound:
override protected def computeStuff[T <: Type_top : NeoStrategy] ...
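Spelled out, the fix looks roughly like the sketch below. Two assumptions: the abstract declaration in TypeOMap gets the same bound (otherwise the override no longer matches its signature), and NeoStrategies._ is imported where computeMap is built, so instances are available at the call sites computeStuff[Type_A] and computeStuff[Type_B]:

import XXX.NeoStrategies._

trait TypeOMap {
  // the abstract member needs the bound too, or the override below will not match
  protected def computeStuff[T <: Type_top : NeoStrategy]: (Helper, String) => Int
  protected val computeMap: Map[String, (Helper, String) => Int]
}

trait TypeTable extends TypeOMap {
  // the context bound defers the implicit lookup to the call sites below,
  // where T is a concrete type with a NeoStrategy instance in scope
  override protected def computeStuff[T <: Type_top : NeoStrategy]: (Helper, String) => Int =
    (helper: Helper, v: String) => helper.summonEvaluation[T](v)

  override protected val computeMap = Map(
    "a" -> computeStuff[Type_A], // picks up NeoStrategy_A
    "b" -> computeStuff[Type_B]  // picks up NeoStrategy_B
  )
}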
It seems you want to work with singleton types. In Scala 2.12 + Shapeless
import shapeless.Witness
object TypeLib {
sealed trait Type_top
trait Type_A extends Type_top
trait Type_B extends Type_top
}
import TypeLib._
trait NeoStrategy[S <: String] {
type T <: Type_top
def evaluate(v: S, summoner: Summoner): Int
}
object NeoStrategy {
type Aux[S <: String, T0 <: Type_top] = NeoStrategy[S] { type T = T0 }
def mkStrategy[S <: String, T0 <: Type_top](f: (S, Summoner) => Int): Aux[S, T0] = new NeoStrategy[S] {
override type T = T0
override def evaluate(v: S, summoner: Summoner): Int = f(v, summoner)
}
implicit val NeoStrategy_A: NeoStrategy.Aux[Witness.`"a"`.T, Type_A] = mkStrategy((_, _) => 1)
implicit val NeoStrategy_B: NeoStrategy.Aux[Witness.`"b"`.T, Type_B] = mkStrategy((_, _) => 2)
}
case class Summoner(name: String) {
def summonEvaluation[S <: String](s: Witness.Aux[S])(implicit
strategy: NeoStrategy[S]): Int = {
strategy.evaluate(s.value, this)
}
}
def main(args: Array[String]): Unit = {
val mySummoner = Summoner("stack question")
val r = mySummoner.summonEvaluation("a")
val r1 = mySummoner.summonEvaluation("b")
println(r) // 1
println(r1) // 2
}
In Scala 2.13
object TypeLib {
sealed trait Type_top
trait Type_A extends Type_top
trait Type_B extends Type_top
}
import TypeLib._
trait NeoStrategy[S <: String with Singleton] {
type T <: Type_top
def evaluate(v: S, summoner: Summoner): Int
}
object NeoStrategy {
type Aux[S <: String with Singleton, T0 <: Type_top] = NeoStrategy[S] { type T = T0 }
def mkStrategy[S <: String with Singleton, T0 <: Type_top](f: (S, Summoner) => Int): Aux[S, T0] = new NeoStrategy[S] {
override type T = T0
override def evaluate(v: S, summoner: Summoner): Int = f(v, summoner)
}
implicit val NeoStrategy_A: NeoStrategy.Aux["a", Type_A] = mkStrategy((_, _) => 1)
implicit val NeoStrategy_B: NeoStrategy.Aux["b", Type_B] = mkStrategy((_, _) => 2)
}
case class Summoner(name: String) {
def summonEvaluation[S <: String with Singleton](s: S)(implicit
value: ValueOf[S],
strategy: NeoStrategy[S]): Int = {
strategy.evaluate(s, this)
}
}
def main(args: Array[String]): Unit = {
val mySummoner = Summoner("stack question")
val r = mySummoner.summonEvaluation("a")
val r1 = mySummoner.summonEvaluation("b")
println(r) // 1
println(r1) // 2
}
The underlying problem is this:
override protected def computeStuff[T <: Type_top]: (Helper, String) => Int = {
(helper: Helper, v: String) => helper.summonEvaluation[T](v) // implicit for NeoStrategy[T]...?
}
Since summonEvaluation[T] requires an implicit argument of type NeoStrategy[T], this means you must have one in scope for any T that's a subclass of Type_top. However, NeoStrategies only provides two instances: one for Type_A and one for Type_B. This is not enough for the compiler. Understandably so - for instance, you haven't provided any NeoStrategy for
Type_top itself
subclasses of Type_A and Type_B (perfectly legal to create)
There are two basic ways you can handle this:
Delaying the implicit resolution
As per the other answer, instead of trying to resolve the implicit inside computeStuff, add a context bound there too. If the point where you have to supply the implicit is only reached when you know what T is, you won't have to provide instances for any possible subtype.
Providing implicits for all possible subtypes
If you absolutely want to keep the implicit resolution inside computeStuff, you're going to have to offer a method
implicit def getNeoStrategy[T <: Type_top] : NeoStrategy[T] = ???
Unfortunately, doing this is probably going to involve a bunch of reflection and potentially runtime errors for edge cases, so I'd recommend the context bound on computeStuff.

Automatic type class derivation for case classes with Scala Enumeration fields

I have written automatic type class derivation in order to automatically generate Elasticsearch JSON mappings for case classes.
For that I am using the TypeClass type class in shapeless.
The problem I have is that many fields in the case classes we use are Scala enumerations.
For example
object ConnectionState extends Enumeration {
type ConnectionState = Value
val ordering, requested, pending, available, down, deleting, deleted, rejected = Value
}
Or
object ProductCodeType extends Enumeration {
type ProductCodeType = Value
val devpay, marketplace = Value
}
It seems I have to define a specific implicit instance for every Enumeration that is defined so that the automatic derivation will pick it up (e.g. one each for ConnectionState and ProductCodeType).
I cannot have one implicit def for Enumeration such as
implicit def enumerationMapping: MappingEncoder[Enumeration] = new MappingEncoder[Enumeration] {
def toMapping = jSingleObject("type", jString("text"))
}
that will work for all enumeration types.
I tried making the type class covariant, and a bunch of other things but nothing helped.
Any ideas?
Here is the derivation code:
object Mapping {
trait MappingEncoder[T] {
def toMapping: Json
}
object MappingEncoder extends LabelledProductTypeClassCompanion[MappingEncoder] {
implicit val stringMapping: MappingEncoder[String] = new MappingEncoder[String] {
def toMapping = jSingleObject("type", jString("text"))
}
implicit val intMapping: MappingEncoder[Int] = new MappingEncoder[Int] {
def toMapping = jSingleObject("type", jString("integer"))
}
implicit def seqMapping[T: MappingEncoder]: MappingEncoder[Seq[T]] = new MappingEncoder[Seq[T]] {
def toMapping = implicitly[MappingEncoder[T]].toMapping
}
implicit def optionMapping[T: MappingEncoder]: MappingEncoder[Option[T]] = new MappingEncoder[Option[T]] {
def toMapping = implicitly[MappingEncoder[T]].toMapping
}
object typeClass extends LabelledProductTypeClass[MappingEncoder] {
def emptyProduct = new MappingEncoder[HNil] {
def toMapping = jEmptyObject
}
def product[F, T <: HList](name: String, sh: MappingEncoder[F], st: MappingEncoder[T]) = new MappingEncoder[F :: T] {
def toMapping = {
val head = sh.toMapping
val tail = st.toMapping
(name := head) ->: tail
}
}
def project[F, G](instance: => MappingEncoder[G], to: F => G, from: G => F) = new MappingEncoder[F] {
def toMapping = jSingleObject("properties", instance.toMapping)
}
}
}
}
I was able to fix the problem by adding additional implicit defs to scope:
implicit def enumerationMapping[T <: Enumeration#Value]: MappingEncoder[T] = new MappingEncoder[T] {
def toMapping = jSingleObject("type", jString("text"))
}
implicit def enumerationSeqMapping[T <: Enumeration#Value]: MappingEncoder[Seq[T]] = new MappingEncoder[Seq[T]] {
def toMapping = jSingleObject("type", jString("text"))
}
The second implicit was needed as some of the case classes had members of type Seq[T] where T is some enumeration type.
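For illustration, a minimal sketch of why the bounded def works where a MappingEncoder[Enumeration] cannot, assuming the two defs above are in implicit scope (e.g. inside the MappingEncoder companion): case class fields are typed with the enumeration's Value member, which is a subtype of Enumeration#Value but not of Enumeration.

// the field type in a case class is the enumeration's Value type
case class Connection(state: ConnectionState.Value)

// ConnectionState.Value <: Enumeration#Value, so during derivation the compiler
// can instantiate enumerationMapping[ConnectionState.Value]; an instance keyed
// on the unrelated type Enumeration would never be selected for this field
implicitly[MappingEncoder[ConnectionState.Value]]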

Scala type inference working with Slick Table

I have models like these (simplified):
case class User(id:Int,name:String)
case class Address(id:Int,name:String)
...
Slick (2.1.0 version) table mapping:
class Users(_tableTag: Tag) extends Table[User](_tableTag, "users") with WithId[Users, User] {
val id: Column[Int] = column[Int]("id", O.AutoInc, O.PrimaryKey)
...
}
trait WithId[T, R] {
this: Table[R] =>
def id: Column[Int]
}
By mixing in the trait WithId, I want to implement generic DAO methods for different tables that have a column id: Column[Int] (I want the findById method to work with both the User and Address table mappings):
trait GenericSlickDAO[T <: WithId[T, R], R] {
def db: Database
def findById(id: Int)(implicit stk: SlickTableQuery[T]): Option[R] = db.withSession { implicit session =>
stk.tableQuery.filter(_.id === id).list.headOption
}
}
trait SlickTableQuery[T] {
def tableQuery: TableQuery[T]
}
object SlickTableQuery {
implicit val usersQ = new SlickTableQuery[Users] {
val tableQuery: TableQuery[Users] = Users
}
}
The problem is that findById doesn't compile:
Error:(13, 45) type mismatch;
found : Option[T#TableElementType] required: Option[R]
stk.tableQuery.filter(_.id === id).list.headOption
As I see it, T is of type WithId[T, R] and at the same time of type Table[R]. Slick implements the Table type such that if X = Table[Y] then X#TableElementType = Y.
So in my case T#TableElementType = R, and Option[T#TableElementType] should be inferred as Option[R], but it isn't. Where am I wrong?
Your assumption about WithId[T, R] being of type Table[R] is wrong. The self-type annotation in WithId[T, R] just requires a Table[R] to be mixed in, but that doesn't mean that WithId[T, R] is a Table[R].
I think you confuse the declaration of WithId with instances of WithId which eventually need to be an instance of a Table.
Your upper type bound constraint in the GenericSlickDAO trait also doesn't guarantee that WithId is a Table, since any type is a subtype of itself.
See this question for a more elaborate explanation about the differences between self-types and subtypes.
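A minimal sketch of that difference, with hypothetical types unrelated to Slick:

trait Logger { def log(msg: String): Unit }

// subtyping: every ServiceA IS a Logger
trait ServiceA extends Logger

// self-type: a concrete ServiceB must be mixed together with a Logger,
// but ServiceB itself is NOT a subtype of Logger
trait ServiceB { self: Logger =>
  def run(): Unit = log("running")
}

object SelfTypeDemo {
  def useLogger(l: Logger): Unit = ()
  def ok(a: ServiceA): Unit = useLogger(a)    // compiles: ServiceA <: Logger
  // def ko(b: ServiceB): Unit = useLogger(b) // does not compile: ServiceB is not a Logger
  val both = new ServiceB with Logger {       // the self-type is satisfied by mixing in Logger
    def log(msg: String): Unit = println(msg)
  }
}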
I'm using play-slick, and I tried to do exactly what you did, with a trait and a self-type, without success.
But I succeeded with the following:
import modelsunscanned.TableWithId
import scala.slick.jdbc.JdbcBackend
import scala.slick.lifted.TableQuery
import play.api.db.slick.Config.driver.simple._
/**
* @author Sebastien Lorber (lorber.sebastien@gmail.com)
*/
package object models {
private[models] val Users = TableQuery(new UserTable(_))
private[models] val Profiles = TableQuery(new ProfileTable(_))
private[models] val Companies = TableQuery(new CompanyTable(_))
private[models] val Contacts = TableQuery(new ContactTable(_))
trait ModelWithId {
val id: String
}
trait BaseRepository[T <: ModelWithId] {
def tableQuery: TableQuery[TableWithId[T]]
private val FindByIdQuery = Compiled { id: Column[String] =>
tableQuery.filter(_.id === id)
}
def insert(t: T)(implicit session: JdbcBackend#Session) = {
tableQuery.insert(t)
}
def getById(id: String)(implicit session: JdbcBackend#Session): T = FindByIdQuery(id).run.headOption
.getOrElse(throw new RuntimeException(s"Could not find entity with id=$id"))
def findById(id: String)(implicit session: JdbcBackend#Session): Option[T] = FindByIdQuery(id).run.headOption
def update(t: T)(implicit session: JdbcBackend#Session): Unit = {
val nbUpdated = tableQuery.filter(_.id === t.id).update(t)
require(nbUpdated == 1,s"Exactly one should have been updated, not $nbUpdated")
}
def delete(t: T)(implicit session: JdbcBackend#Session) = {
val nbDeleted = tableQuery.filter(_.id === t.id).delete
require(nbDeleted == 1,s"Exactly one should have been deleted, not $nbDeleted")
}
def getAll(implicit session: JdbcBackend#Session): List[T] = tableQuery.list
}
}
// play-slick bug, see https://github.com/playframework/play-slick/issues/227
package modelsunscanned {
abstract class TableWithId[T](tableTag: Tag,tableName: String) extends Table[T](tableTag,tableName) {
def id: Column[String]
}
}
Here is an example usage:
object CompanyRepository extends BaseRepository[Company] {
// Don't know yet how to avoid that cast :(
def tableQuery = Companies.asInstanceOf[TableQuery[TableWithId[Company]]]
// Other methods here
...
}
case class Company(
id: String = java.util.UUID.randomUUID().toString,
name: String,
mainContactId: String,
logoUrl: Option[String],
activityDescription: Option[String],
context: Option[String],
employeesCount: Option[Int]
) extends ModelWithId
class CompanyTable(tag: Tag) extends TableWithId[Company](tag,"COMPANY") {
override def id = column[String]("id", O.PrimaryKey)
def name = column[String]("name", O.NotNull)
def mainContactId = column[String]("main_contact_id", O.NotNull)
def logoUrl = column[Option[String]]("logo_url", O.Nullable)
def activityDescription = column[Option[String]]("description", O.Nullable)
def context = column[Option[String]]("context", O.Nullable)
def employeesCount = column[Option[Int]]("employees_count", O.Nullable)
//
def * = (id, name, mainContactId,logoUrl, activityDescription, context, employeesCount) <> (Company.tupled,Company.unapply)
//
def name_index = index("idx_name", name, unique = true)
}
Note that active-slick is also using something similar
This helped me out a lot. It's a pretty simple example of a generic DAO: https://gist.github.com/lshoo/9785645
package slicks.docs.dao
import scala.slick.driver.PostgresDriver.simple._
import scala.slick.driver._
trait Profile {
val profile: JdbcProfile
}
trait CrudComponent {
this: Profile =>
abstract class Crud[T <: Table[E] with IdentifiableTable[PK], E <: Entity[PK], PK: BaseColumnType](implicit session: Session) {
val query: TableQuery[T]
def count: Int = {
query.length.run
}
def findAll: List[E] = {
query.list()
}
def queryById(id: PK) = query.filter(_.id === id)
def findOne(id: PK): Option[E] = queryById(id).firstOption
def add(m: E): PK = (query returning query.map(_.id)) += m
def withId(model: E, id: PK): E
def extractId(m: E): Option[PK] = m.id
def save(m: E): E = extractId(m) match {
case Some(id) => {
queryById(id).update(m)
m
}
case None => withId(m, add(m))
}
def saveAll(ms: E*): Option[Int] = query ++= ms
def deleteById(id: PK): Int = queryById(id).delete
def delete(m: E): Int = extractId(m) match {
case Some(id) => deleteById(id)
case None => 0
}
}
}
trait Entity[PK] {
def id: Option[PK]
}
trait IdentifiableTable[I] {
def id: Column[I]
}
package slicks.docs
import slicks.docs.dao.{Entity, IdentifiableTable, CrudComponent, Profile}
case class User(id: Option[Long], first: String, last: String) extends Entity[Long]
trait UserComponent extends CrudComponent {
this: Profile =>
import profile.simple._
class UsersTable(tag: Tag) extends Table[User](tag, "users") with IdentifiableTable[Long] {
override def id = column[Long]("id", O.PrimaryKey, O.AutoInc)
def first = column[String]("first")
def last = column[String]("last")
def * = (id.?, first, last) <> (User.tupled, User.unapply)
}
class UserRepository(implicit session: Session) extends Crud[UsersTable, User, Long] {
override val query = TableQuery[UsersTable]
override def withId(user: User, id: Long): User = user.copy(id = Option(id))
}
}
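A hypothetical usage sketch of that repository (the connection URL, driver class and user data are made-up placeholders):

import scala.slick.driver.PostgresDriver.simple._
import slicks.docs._
import slicks.docs.dao.Profile

object Example extends App with UserComponent with Profile {
  val profile = scala.slick.driver.PostgresDriver

  Database.forURL("jdbc:postgresql://localhost/example", driver = "org.postgresql.Driver")
    .withSession { implicit session =>
      val repo = new UserRepository
      val saved = repo.save(User(None, "Ada", "Lovelace")) // no id => insert, id filled in
      println(repo.findOne(saved.id.get))                  // Some(User(...)) if the insert succeeded
    }
}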

How do I provide the proper type information to get my generic filter function working for slick

I'm trying to implement some generic filters in a base class for my slick tables.
What I'm trying to accomplish is the ability to translate url query strings into database filters. I've come up with a working solution, but the code isn't very DRY.
This is my modified code.
package db.utils.repos
import common.utils.request.filters.{StringMatchesFilter, RequestFilter}
import scala.slick.lifted.{Column => LiftedColumn}
import db.environments.slick.interfaces.SlickEnv
import org.joda.time.DateTime
import scala.slick.profile.RelationalProfile
trait SlickPageable {
val slickEnv: SlickEnv
import slickEnv.profile.simple._
trait FieldMapped {
def longField(f: String): Option[LiftedColumn[Long]]
def optLongField(f: String): Option[LiftedColumn[Option[Long]]]
def stringField(f: String): Option[LiftedColumn[String]]
def optStringField(f: String): Option[LiftedColumn[Option[String]]]
}
trait FieldSet{
def stringField(table: FieldMapped, f: String): Option[LiftedColumn[Any]]
}
object FSet extends FieldSet {
override def stringField(table: FieldMapped, f: String): Option[LiftedColumn[String]] = {
table.stringField(f)
}
}
object OptFSet extends FieldSet {
override def stringField(table: FieldMapped, f: String): Option[LiftedColumn[Option[String]]] = {
table.optStringField(f)
}
}
def filter(table: FieldMapped, f: RequestFilter) = {
queryFilter(table, f, QSet)
}
def optFilter(table: FieldMapped, f: RequestFilter) ={
queryFilter(table, f, OptQSet)
}
def queryFilter[T, R](table: FieldMapped, f: RequestFilter, querySet: QuerySet[T, R]) = {
def checkField[A](field: Option[A], op: (A, String) => R) = {
field match {
case Some(fi) =>
op(fi, f.value)
case None =>
throw new IllegalArgumentException(s"could not retrieve field ${f.field}")
}
}
f match {
case sf: StringMatchesFilter => {
checkField(querySet.fSet.stringField(table, f.field), querySet.sEq)
}
case _ => throw new IllegalArgumentException("No such filter")
}
}
trait QuerySet[T, R] {
val fSet: FieldSet
def sEq(col: T, value: String): R
}
object QSet extends QuerySet[LiftedColumn[String], LiftedColumn[Boolean]] {
override val fSet = FSet
override def sEq(col: LiftedColumn[String], value: String): LiftedColumn[Boolean] = {
col === value
}
}
object OptQSet extends QuerySet[LiftedColumn[Option[String]], LiftedColumn[Option[Boolean]]] {
override val fSet = OptFSet
override def sEq(col: LiftedColumn[Option[String]], value: String): LiftedColumn[Option[Boolean]] = {
col === value
}
}
}
And the error message
type mismatch;
[error] found : (T, String) => R
[error] required: (scala.slick.lifted.Column[_], String) => R
[error] checkField(fieldSet.stringField(table, f.field), querySet.sEq)
If I remove the FieldSet from the queryFilter parameter list I can get it working. But then I need two separate functions: one to call the field mapping methods, and the other to call the optional field mapping methods. It works though, with the type inference and everything working properly.
So how can I get this working with FieldSet? From what I can tell it seems there isn't enough type information at compile time. But I don't know how to provide it.

Covariance and type inference in Scala

Given the code
object Renderer {
sealed abstract class BasicRender
case class RenderImages(img: Array[File]) extends BasicRender
case class RenderVideo(video: File) extends BasicRender
def rendererFor[T <: BasicRender : Manifest, Z <: Render.RenderingContext](ctx: Z): Option[Render[T]] = {
val z = manifest[T].erasure
if (z == classOf[RenderImages]) {
Some(new ImagesRenderer(ctx.asInstanceOf[ImagesContext])) // .asInstanceOf[Render[T]])
} else
if (z == classOf[RenderVideo]) {
Some(new VideoRenderer(ctx.asInstanceOf[VideoContext])) // .asInstanceOf[Render[T]])
} else {
None
}
}
private class ImagesRenderer(ctx: ImagesContext) extends Render[RenderImages] {
override def renderJSON(json: String)(implicit jsCtx: PhantomJsContext) = {
None
}
}
private class VideoRenderer(ctx: VideoContext) extends Render[RenderVideo] {
override def renderJSON(json: String)(implicit jsCtx: PhantomJsContext) = {
None
}
}
}
trait Render[+Out] {
def renderJSON(json: String)(implicit jsCtx: PhantomJsContext): Option[Out]
}
I made the Render trait covariant in its type parameter, so if
RenderImages <: BasicRender
then
ImagesRenderer <: Render[RenderImages]
But it looks like the compiler is not able to infer the type of the renderer in rendererFor, so I need to add an explicit cast like
Some(new ImagesRenderer(ctx.asInstanceOf[ImagesContext]).asInstanceOf[Render[T]])
what is wrong with my reasoning here?
As explained by Daniel C. Sobral, your problem here is that you are instantiating the different renderers dynamically, in a way that does not capture in the type system the relation between ctx and the result type of rendererFor.
One common way to solve this kind of issue is to use a type class:
import java.io.File
class PhantomJsContext
trait Renderer[+Out] {
def renderJSON(json: String)(implicit jsCtx: PhantomJsContext): Option[Out]
}
trait RendererFactory[ContextType, ResultType] {
def buildRenderer( ctx: ContextType ): Renderer[ResultType]
}
object Renderer {
case class RenderImages(img: Array[File])
case class RenderVideo(video: File)
trait ImagesContext
trait VideoContext
def rendererFor[ContextType, ResultType](ctx: ContextType)( implicit factory: RendererFactory[ContextType, ResultType] ): Renderer[ResultType] = {
factory.buildRenderer( ctx )
}
class ImagesRenderer(ctx: ImagesContext) extends Renderer[RenderImages] {
def renderJSON(json: String)(implicit jsCtx: PhantomJsContext) = ???
}
implicit object ImagesRendererFactory extends RendererFactory[ImagesContext, RenderImages] {
def buildRenderer( ctx: ImagesContext ) = new ImagesRenderer( ctx )
}
class VideoRenderer(ctx: VideoContext) extends Renderer[RenderVideo] {
def renderJSON(json: String)(implicit jsCtx: PhantomJsContext) = ???
}
implicit object VideoRendererFactory extends RendererFactory[VideoContext, RenderVideo] {
def buildRenderer( ctx: VideoContext ) = new VideoRenderer( ctx )
}
}
You can easily check in the REPL that the correct types are returned:
scala> lazy val r1 = Renderer.rendererFor( new Renderer.ImagesContext {} )
r1: Renderer[Renderer.RenderImages] = <lazy>
scala> :type r1
Renderer[Renderer.RenderImages]
scala> lazy val r2 = Renderer.rendererFor( new Renderer.VideoContext {} )
r2: Renderer[Renderer.RenderVideo] = <lazy>
scala> :type r2
Renderer[Renderer.RenderVideo]
T is not to be inferred: it is a parameter that is passed to the method, and there isn't really any guarantee that what is being returned is an Option[Render[T]]. For instance, if you pass RenderImages and it returns a VideoRenderer, that would obviously be wrong.
Now, the if conditionals you put in there might prevent that from happening, but none of that is used by the compiler to figure out whether you are returning the right type or not.
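To make that concrete, here is a hypothetical reduction of rendererFor (imagine it inside the Renderer object): even when the context type already pins the renderer down, the runtime manifest check does not refine T, so the branch still has to typecheck against an arbitrary T.

// hypothetical reduction: the runtime check does not narrow T, so the compiler
// must accept the branch for ANY T <: BasicRender, and it rightly refuses
def videoRendererFor[T <: BasicRender : Manifest](ctx: VideoContext): Option[Render[T]] =
  if (manifest[T].erasure == classOf[RenderVideo])
    Some(new VideoRenderer(ctx)) // error: found Render[RenderVideo], required Render[T]
  else
    None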