Getting a type and class tag / binding a parameterized type in the interpreter - scala

Given a parameterized type:
trait Document[S]
I want to bind an instance of this for an embedded interpreter, e.g.
def test[S](doc: Document[S]) = tools.nsc.interpreter.NamedParam("document", doc)
This wants both a TypeTag and a ClassTag of Document[S]. Note that I need to be able to bind the full type, i.e. Document[S] and not just Document[_].
How would I go about this? I imagine I would add something to document, e.g.
trait Document[S] {
def tt: reflect.runtime.universe.TypeTag[Document[S]] = ???
def ct: reflect.runtime.universe.ClassTag[Document[S]] = ???
}
(why the heck do I need two different tags to get the named parameter?)
EDIT: The following makes it compile
trait Document[S] {
implicit def systemType: reflect.runtime.universe.TypeTag[S]
}
def test[S](doc: Document[S]) = {
import doc.systemType
tools.nsc.interpreter.NamedParam("document", doc)
}
But I still end up with a binding to Document[_] in the interpreter, so my type parameter to Document is lost?!
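A sketch of spelling out both tags from a single TypeTag[S] (assuming NamedParam picks them up implicitly, as the error suggests; whether the interpreter then reports Document[S] rather than Document[_] is unverified):
import scala.reflect.ClassTag
import scala.reflect.runtime.{universe => ru}

trait Document[S] {
  implicit def systemType: ru.TypeTag[S]
}

def test[S](doc: Document[S]) = {
  import doc.systemType
  // With TypeTag[S] in scope, TypeTag[Document[S]] can be materialized;
  // the ClassTag only carries the erased runtime class anyway.
  implicit val docTypeTag: ru.TypeTag[Document[S]] = ru.typeTag[Document[S]]
  implicit val docClassTag: ClassTag[Document[S]] = ClassTag(doc.getClass)
  tools.nsc.interpreter.NamedParam("document", doc)
}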

Related

When doing implicit resolution with type parameters, why does val placement matter?

In one file, I have:
trait JsonSchema[T] {
val propertyType: String
override def toString: String = propertyType
}
object JsonSchema {
implicit def stringSchema: JsonSchema[String] = new JsonSchema[String] {
override val propertyType: String = "string"
}
implicit def intSchema: JsonSchema[Int] = new JsonSchema[Int] {
override val propertyType: String = "integer"
}
implicit def booleanSchema: JsonSchema[Boolean] = new JsonSchema[Boolean] {
override val propertyType: String = "boolean"
}
}
In my main file:
case class MetaHolder[T](v: T)(implicit val meta: JsonSchema[T])
object JsonSchemaExample extends App {
println(MetaHolder(3).meta.toString)
println(MetaHolder("wow").meta.toString)
}
That works hunky-dory. Now suppose I do this instead:
case class MetaHolder[T](v: T) {
val meta: JsonSchema[T] = implicitly[JsonSchema[T]]
}
It no longer compiles. Why?
My goal is to modify the anonymous Endpoint classes in the Scala Finch library by adding a val meta to everything. I've been able to do this without any fancy business so far, but now I want to do some fancy implicit resolution with shapeless to provide a JsonSchema definition for arbitrary case classes. My question is how to do this while maintaining backward compatibility: provide the JsonSchema meta feature for people who want to opt in, without changing the compilation burden for anyone who does not want to use meta.
If instead I go the first route, with an added implicit parameter, wouldn't that require a special import to be added by everyone? Or am I missing something and would backward compatibility still be maintained?
There is a big difference between an implicit x: X among the parameters and implicitly[X] inside the body.
When you say implicitly[X] this means "check now whether in the current scope there is an implicit X".
When you say def foo(...)(implicit x: X) = ... this means "check later when foo is called that in the scope of the call site there will be an implicit X (and for now inside foo just assume without checking that there is)".
class Foo(...)(implicit x: X) is similar to the latter, "check when constructor is called that there will be an implicit X".
Regarding whether users have to import or not: if you put implicits for a type X into the companion object of X, they will be found automatically (implicits for a type X[Y] should go into the companion object of either X or Y). If you put them somewhere else, they have to be imported into the current scope.
In order for implicitly[JsonSchema[T]] to compile, there must be a JsonSchema[T] in the implicit scope, which means that there must be a JsonSchema[T] (or something implicitly convertible to a JsonSchema[T]) passed through as an implicit argument, as you had with:
case class MetaHolder[T](v: T)(implicit val meta: JsonSchema[T])
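Concretely, with the JsonSchema companion from the question in scope, the two placements behave like this (a small sketch; MetaHolder2 is just a name for the non-compiling variant):
// Deferred check: each call site must supply (or resolve) a JsonSchema[T].
case class MetaHolder[T](v: T)(implicit val meta: JsonSchema[T])

MetaHolder(3).meta.toString      // finds JsonSchema.intSchema    -> "integer"
MetaHolder("wow").meta.toString  // finds JsonSchema.stringSchema -> "string"

// Checked immediately, where T is still abstract: no JsonSchema[T] exists here,
// so this variant does not compile.
// case class MetaHolder2[T](v: T) {
//   val meta: JsonSchema[T] = implicitly[JsonSchema[T]]
// }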

TypeTag for case classes

I would like to make a case class Bla that takes a type parameter A and knows the type of A at runtime (it stores it in its info field).
My attempt is shown in the example below. The problem is that this example does not compile.
case class Bla[A] (){
val info=Run.paramInfo(this) // this does not compile
}
import scala.reflect.runtime.universe._
object Run extends App{
val x=Bla[Int]
def paramInfo[T](x:T)(implicit tag: TypeTag[T]): String = {
val targs = tag.tpe match { case TypeRef(_, _, args) => args }
val tinfo=s"type of $x has type arguments $targs"
println(tinfo)
tinfo
}
paramInfo(x)
}
However, when I comment out val info=Run.paramInfo(this), the program runs fine and prints:
type of Bla() has type arguments List(Int)
Is there a way to make this example compile? (Or in some other way achieve the same goal, i.e. that a case class is self-aware of the type of its type parameter?)
There's little point in using reflection-based APIs for this; shapeless has a typeclass that exposes compile-time information at runtime using an implicit macro.
import shapeless.Typeable
class Test[T : Typeable] {
def info: String = implicitly[Typeable[T]].describe
}
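Usage would then look something like this (relying on shapeless's built-in Typeable instances):
new Test[Int].info        // "Int"
new Test[List[Int]].info  // something like "List[Int]"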
It's also relatively easy to roll your own thing here, with the added inconvenience of having to compile the implicit macro in a different compilation unit than whatever is using it.
You just need to pass the implicit type tag parameter to the case class constructor (otherwise the type information is lost before calling paramInfo, which requires it):
case class Bla[A : TypeTag]() { ... }
Which is shorthand for:
case class Bla[A](implicit tag: TypeTag[A]) { ... }
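Putting that together with the question's code, a sketch (the tag now travels with each instance, so info can be computed inside the class):
import scala.reflect.runtime.universe._

case class Bla[A]()(implicit tag: TypeTag[A]) {
  // TypeTag[A] is in scope here, so a TypeTag[Bla[A]] can be materialized for paramInfo
  val info: String = Run.paramInfo(this)
}

object Run extends App {
  def paramInfo[T](x: T)(implicit tag: TypeTag[T]): String = {
    val targs = tag.tpe match { case TypeRef(_, _, args) => args }
    val tinfo = s"type of $x has type arguments $targs"
    println(tinfo)
    tinfo
  }
  val x = Bla[Int]()  // prints: type of Bla() has type arguments List(Int)
}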

Scala: Multiple type parameters for implicit class

I'm trying to port parts of a Haskell library for datatype-generic programming to Scala. Here's the problem I've run into:
I've defined a trait, Generic, with some container-type parameter:
trait Generic[G[_]] {
// Some function declarations go here
}
Now I have an abstract class, Collect, with three type parameters, and a function declaration (it signifies a type that can collect all subvalues of type B into a container of type F[_] from some structure of type A):
abstract class Collect[F[_],B,A] {
def collect_ : A => F[B]
}
In order to make it extend Generic, the first two type parameters F[_] and B are given, and A is curried (this effect is simulated using type lambdas):
class CollectC[F[_],B] extends Generic[({type C[A] = Collect[F,B,A]})#C] {
// Function definitions go here
}
The problem is that I need the last class definition to be implicit, because later on in my code I will need to be able to write functions like
class GUnit[G[_]](implicit gg: Generic[G]) {
// Some definitions
}
When I simply prepend implicit to the class definition, I get an error saying implicit classes must accept exactly one primary constructor parameter. Has anyone encountered a similar problem? Is there a known way to work around it? I don't currently see how I could refactor my code while keeping the same functionality, so any advice is welcome. Thanks in advance!
Implicit classes don't work that way. They are shorthand for implicit conversions. For instance, implicit class Foo(i: Int) is equivalent to class Foo(i: Int); implicit def Foo(i: Int) = new Foo(i). So it only works with classes that have exactly one parameter in their constructor. It would not make sense for most zero-parameter (type-)classes.
The title of your question also seems to suggest that you think the compilation error is talking about type parameters of the type constructor, but I hope the above paragraph also makes clear that it is actually talking about value parameters of the value constructor.
For what (I think) you are trying to do, you will have to provide an implicit instance of CollectC yourself. I suggest putting it in the companion object of Collect. But you can choose an alternative solution if that fits your needs better.
scala> :paste
// Entering paste mode (ctrl-D to finish)
trait Generic[G[_]] {
// Some function declarations go here
}
abstract class Collect[F[_],B,A] {
def collect_ : A => F[B]
}
object Collect {
implicit def mkCollectC[F[_],B]: CollectC[F,B] = new CollectC[F,B]
}
class CollectC[F[_],B] extends Generic[({type C[A] = Collect[F,B,A]})#C] {
// Function definitions go here
}
// Exiting paste mode, now interpreting.
warning: there were four feature warnings; for details, enable `:setting -feature' or `:replay -feature'
defined trait Generic
defined class Collect
defined object Collect
defined class CollectC
scala> implicitly[Generic[({type C[X] = Collect[List,Int,X]})#C]]
res0: Generic[[X]Collect[[+A]List[A],Int,X]] = CollectC@12e8fb82
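With that in place, the GUnit class from the question should get its Generic instance resolved automatically, since mkCollectC lives in Collect's companion (a sketch, not from the original session):
class GUnit[G[_]](implicit gg: Generic[G]) {
  // Some definitions
}

// The implicit CollectC[List, Int] is found via Collect's companion object.
new GUnit[({ type C[X] = Collect[List, Int, X] })#C]()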

Scala data structure with type safe accessors

Please note I am learning Scala, so what I suggest may not be the best (idiomatic) way of achieving this; therefore I'll describe the problem I'm trying to solve first, then my current implementation!
Problem: Given some input document, e.g. XML or JSON, create an object Doc with its raw contents as a variable, apply a sequence of FieldExtractors which extract a number of values, i.e. Fields, which are stored on the Doc object and can be accessed in a type-safe manner later, e.g. val username: String = doc.getField(UsernameField)
N.B. everything must be Serializable so it can be passed over the wire via a particular framework
So, my current attempt:
abstract class Field[+T <: Serializable](val name: String, val valueType: Class[T])
trait Fields {
var fields: mutable.HashMap[Class[Field[Serializable]], Field[Serializable]] = mutable.HashMap()
def hasField[T <: Serializable](field: Field[T]): Boolean = false
def getField[T <: Serializable](field: Field[T]): T = fields.get(field).asInstanceOf[T]
def setField[T <: Serializable](field: Field[T], value: T): Unit = fields.put(field, value)
}
class Doc(val rawData: String) extends Fields
abstract class FieldExtractor[+TYPE <: Serializable](val field: Field[TYPE]) {
def extractField(input: Doc): Option[TYPE]
}
But I get all sorts of errors such as:
Error:(14, 39) inferred type arguments [String] do not conform to method hasField's type parameter bounds [T <: Serializable]
val result:Boolean = fieldsObject.hasField(field)
^
I'm wondering if maybe I should use Value types instead? Or the https://github.com/mikaelv/strucs project (which seems rather young)? Or whether there's a better approach?
Eventually I'd write something like
extractors.foldLeft(doc)((doc, extractor) => { doc.setField(extractor.field, extractor.extractField(doc)); doc })
This appears to be a symptom of Scala having its own Serializable class.
String (which Scala uses as a synonym for java.lang.String) implements Java's java.io.Serializable whereas your Field class has declared the generic bounds using Scala's Serializable class.
If you change the declaration to this:
abstract class Field[+T <: java.io.Serializable](val name: String, val valueType: Class[T])
it compiles* and does the type safe get/set checks correctly for me (* using Scala SDK 2.11.7, I had to correct some minor typos).
Note, there's also an error in the Fields class. I think you meant to declare the hash map as:
fields: mutable.HashMap[Field[Serializable], Serializable]
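For reference, a corrected sketch along those lines (java.io.Serializable as the bound, the map keyed by the Field instance itself, and hasField/getField filled in as I would guess they were intended):
import scala.collection.mutable

abstract class Field[+T <: java.io.Serializable](val name: String, val valueType: Class[T])

trait Fields {
  // keyed by the Field descriptor itself, storing the raw (erased) value
  private val fields = mutable.HashMap[Field[java.io.Serializable], java.io.Serializable]()

  def hasField[T <: java.io.Serializable](field: Field[T]): Boolean = fields.contains(field)
  def getField[T <: java.io.Serializable](field: Field[T]): T = fields(field).asInstanceOf[T]
  def setField[T <: java.io.Serializable](field: Field[T], value: T): Unit = fields.put(field, value)
}

class Doc(val rawData: String) extends Fields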

getting "incompatibe type" in returning an object instace

I'm writing a Play! 2.1 application using ReactiveMongo. Each persistable case class has a companion object that holds two implicit objects implementing BSONReader[...] and BSONWriter[...], and each case class has methods to return these:
trait Persistable {
implicit def getReader: BSONReader[Persistable]
implicit def getWriter: BSONWriter[Persistable]
val collectionName: String
}
case class MyObj () extends Persistable {
override val collectionName: String = MyObj.collectionName
override def getReader: BSONReader[MyObj] = MyObj.MyObjBSONReader
override def getWriter: BSONWriter[MyObj] = MyObj.MyObjBSONWriter
}
object MyObj{
val collectionName: String = "MyObj"
implicit object MyObjBSONReader extends BSONReader[MyObj] {
def fromBSON(document: BSONDocument): MyObj = {
val doc = document.toTraversable
new MyObj(
)
}
}
implicit object MyObjBSONWriter extends BSONWriter[MyObj] {
def toBSON(myObj: MyObj) = {
BSONDocument(
)
}
}
}
For some reason, getReader seems to work fine, but getWriter errors:
overriding method getWriter in trait Persistable of type => reactivemongo.bson.handlers.BSONWriter[models.persistable.Persistable];
method getWriter has incompatible type
What am I doing wrong? Both seem to have similar signatures.
Another hint: if I remove the return type from getWriter, I get a compile-time error in Eclipse:
type mismatch; found : models.persistable.MyObj.MyObjBSONWriter.type required:
reactivemongo.bson.handlers.BSONWriter[models.persistable.Persistable]
UPDATE:
I did as @AndrzejDoyle said below, but then the implementation of Persister, which was the heart of this exercise, complains:
def insert(persistable: Persistable) = {
val collection = db(persistable.collectionName)
import play.api.libs.concurrent.Execution.Implicits._
implicit val reader = persistable.getReader
implicit val writer = persistable.getWriter
collection.insert(persistable)
}
error:
trait Persistable takes type parameters
It is due to covariance and contravariance.
The MongoDB reader is defined as BSONReader[+DocumentType]. The + on the generic parameter means that this class is covariant in that parameter. Or more fully:
If B is a subclass of A, then BSONReader[B] is a subclass of BSONReader[A].
Therefore you can use a BSONReader[MyObj] everywhere that a BSONReader[Persistable] is required.
On the other hand, the writer is contravariant: BSONWriter[-DocumentType]. This means that
If B is a subclass of A, then BSONWriter[B] is a superclass of BSONWriter[A].
Therefore your BSONWriter[MyObj] is not a subclass of BSONWriter[Persistable], and so cannot be used in its place.
This might seem confusing initially (i.e. "why does contravariance make sense when it's 'backwards'?"). However if you think about what the classes are doing, it becomes clearer. The reader probably produces some instance of its generic parameter. A caller then might expect it to produce a Persistable - if you have a version that specifically produces MyObjs instead then this is fine.
The writer on the other hand, is probably given an object of its generic parameter. A caller with a BSONWriter[Persistable] will call the write() method, passing in an instance of Persistable to be written. Your implementation can only write instances of MyObj, and so it doesn't actually match the interface. On the other hand, a BSONWriter[Object] would be a subclass of any BSONWriter, since it can (from a type perspective) accept any type as an argument.
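To make the directions concrete, a toy sketch (Animal, Dog, Producer and Consumer are made-up stand-ins for the ReactiveMongo types):
class Animal
class Dog extends Animal

trait Producer[+A] { def produce(): A }          // shaped like BSONReader[+DocumentType]
trait Consumer[-A] { def consume(a: A): Unit }   // shaped like BSONWriter[-DocumentType]

val dogProducer: Producer[Dog] = new Producer[Dog] { def produce() = new Dog }
val animalProducer: Producer[Animal] = dogProducer   // fine: covariance

val animalConsumer: Consumer[Animal] = new Consumer[Animal] { def consume(a: Animal) = () }
val dogConsumer: Consumer[Dog] = animalConsumer      // fine: contravariance
// val broken: Consumer[Animal] = dogConsumer        // does not compile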
The fundamental problem seems to be that your Persistable trait is looser than you intended. You probably want each implementation to return a reader and writer parameterized on itself, rather than on Persistable in general. You can achieve this via self-bounded generics:
trait Persistable[T <: Persistable[T]] {
implicit def getReader: BSONReader[T]
implicit def getWriter: BSONWriter[T]
val collectionName: String
}
and then declare the class as class MyObj() extends Persistable[MyObj]. Now the reader and writer are expected to be parameterised on MyObj, and your existing implementations will compile.
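Applied to the classes from the question, that looks roughly like this (a sketch; the reader/writer objects in the MyObj companion stay as they are):
case class MyObj() extends Persistable[MyObj] {
  override val collectionName: String = MyObj.collectionName
  override def getReader: BSONReader[MyObj] = MyObj.MyObjBSONReader
  override def getWriter: BSONWriter[MyObj] = MyObj.MyObjBSONWriter
}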