As an overview: I am trying to dynamically instantiate a case class from a Cassandra Java Row, using reflection to find the case class's primary constructor and then extracting the corresponding values from the Row.
Specifically, I want to support an Option in a case class as being an optional field in the Row, such that
case class Person(name: String, age: Option[Int])
will successfully populate if the Row has a name and an age, or just the name (and fill in a None for age).
To this end, I followed this very helpful blog post that achieves a similar objective between Case Classes and Maps.
However, I seem to be stuck trying to reconcile the runtime nature of reflectively extracting types from the case class with the compile-time nature of quasiquotes. As an example:
I have a type fieldType which could be a native type or an Option of a native type. If it is an Option, I want to pass fieldType.typeArgs.head to my quasiquote construction so that it can extract the parameterized type from the Row; if it is not an Option, I will just pass fieldType.
if (fieldType <:< typeOf[Option[_]])
  q"r.getAs[${fieldType.typeArgs.head}]($fieldName)"
else
  q"r.as[$fieldType]($fieldName)"
(assuming r is a Cassandra Row and as and getAs exist for this Row)
When I try to compile this, I get an error saying that it does not know how to deal with r.as[Option[String]]. This makes conceptual sense to me: the compiler cannot know which way the runtime comparison will resolve, so it needs to typecheck both branches.
So how might I go about making this type check? If I could maybe compare the types fieldType and typeOf[Option[_]] within the quasiquote, it might stop complaining, but I can't figure out how to compare types in a quasiquote, and I'm not sure it's even possible. If I could extract the parameterized type of the Option within the quasiquote, it might stop complaining, but I could not figure that out either.
Sorry, I am very new to Scala and this stuff is, at the moment, very confusing and esoteric to me. If you want to look more closely at what I am doing, I have a repo: https://github.com/thurstonsand/scala-cass/blob/master/src/main/scala/com/weather/scalacass/ScalaCass.scala
where the interesting part is ScalaCass.CaseClassRealizer, and I am testing it in CaseClassUnitTests.
I found help from @liff on the scala/scala Gitter channel.
Apparently, I was finding my fieldType incorrectly.
I was doing val fieldType = tpe.decl(encodedName).typeSignature, when I should have been doing val fieldType = field.infoIn(tpe). I will update this once I know what the difference means.
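For reference, here is a minimal sketch of the difference as I understand it (my own illustration, written against Scala 2.11+ runtime reflection; the macro context uses the same Symbol API):

import scala.reflect.runtime.universe._

case class Person(name: String, age: Option[Int])

val tpe = typeOf[Person]
val ctor = tpe.decl(termNames.CONSTRUCTOR).asMethod

ctor.paramLists.head.foreach { field =>
  // field.infoIn(tpe) gives the parameter's type "as seen from" tpe,
  // substituting any type parameters of the enclosing type; a bare
  // tpe.decl(name).typeSignature can leave those parameters abstract.
  val fieldType = field.infoIn(tpe)
  println(s"${field.name}: $fieldType, isOption = ${fieldType <:< typeOf[Option[_]]}")
}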
I have a fairly involved ADT representing a small query language (MongoDB, to be specific). A simplified version looks a bit like this:
sealed abstract class Query extends Product with Serializable
final case class Eq[A](field: String, value: A) extends Query
final case class And(queries: Seq[Query]) extends Query
case object None extends Query
I've declared Query without a type parameter since not all values actually have one - None, for example, is parameterless.
I also have a type class, DocumentEncoder[A], that lets me turn any A into a BsonDocument.
The problem I'm running into is that Query needs a DocumentEncoder. Declaring one for each alternative is fairly trivial:
Eq[A] writes itself, provided A: DocumentEncoder.
And is very similar, if we assume that Query does have a DocumentEncoder instance.
None simply encodes as the empty BSON document.
What I'm struggling with is with writing a global DocumentEncoder[Query]. What I'd usually do is pattern match on each alternative, but in this case I'm stuck with Eq[A]: I'd need to express something like case Eq[A: DocumentEncoder](field, value) => ..., but this is, as far as I know, not possible - pattern matching happens at runtime, implicit resolution at compile time.
The solution I have, which I find very unsatisfactory, is storing a DocumentEncoder[A] as a field of Eq[A]. This allows me to write something like:
implicit val queryEncoder: DocumentEncoder[Query] = DocumentEncoder.from {
  case e @ Eq(field, value) => [...] e.encoder.encode(value) [...]
  [...]
}
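For illustration, here is a self-contained sketch of that approach, with the BsonDocument output replaced by plain Strings and the None case renamed to Empty so it doesn't shadow scala.None (both changes are assumptions made for the sketch):

trait DocumentEncoder[A] { def encode(a: A): String }

object DocumentEncoder {
  def from[A](f: A => String): DocumentEncoder[A] =
    new DocumentEncoder[A] { def encode(a: A): String = f(a) }
}

sealed abstract class Query extends Product with Serializable
final case class Eq[A](field: String, value: A)(implicit val encoder: DocumentEncoder[A]) extends Query
final case class And(queries: Seq[Query]) extends Query
case object Empty extends Query

implicit val queryEncoder: DocumentEncoder[Query] = DocumentEncoder.from {
  case e @ Eq(field, _) =>
    // e.encoder and e.value share e's existential type, so this typechecks
    s"""{"$field": ${e.encoder.encode(e.value)}}"""
  case And(qs) => qs.map(queryEncoder.encode).mkString("""{"$and": [""", ", ", "]}")
  case Empty   => "{}"
}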
I can't help but find this horrible, but can't find a more elegant solution. The only other thing I can think of is that my premise (Query should not have a type parameter) is flawed, but:
if Query had a type parameter, how would I go about writing And's type declaration?
is it OK to declare None as a Query[Unit]?
maybe in my case I could get away with always having a type parameter, but what about a theoretical more generic case where it's not possible?
Alright, OK, so I can think of another solution, but it feels rather like overkill: making Query's type a type member rather than a type parameter, and declaring a Query.Aux type alias that lifts the type member to a parameter (for implicit resolution). This sort of feels like a "big boys" solution, though - I've seen it used in libraries like shapeless, and I somehow feel like my code or problems aren't yet at a level that requires this kind of expert concept.
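For what it's worth, a minimal sketch of that type-member encoding (the member name Value and the object layout are assumptions for illustration):

sealed abstract class Query extends Product with Serializable {
  type Value // the value type this query node carries, if any
}

object Query {
  // Lifts the type member back into a type parameter, so implicit
  // resolution can target it: e.g. an implicit DocumentEncoder for
  // Query.Aux[A] can demand a DocumentEncoder[A].
  type Aux[A] = Query { type Value = A }
}

final case class Eq[A](field: String, value: A) extends Query { type Value = A }
final case class And(queries: Seq[Query]) extends Query { type Value = Unit }
case object Empty extends Query { type Value = Unit } // "None" in the original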
Following on from this excellent set of answers on how to define union types in Scala: I've been using the Miles Sabin definition of union types, but one question remains.
How do you work with these if the type isn't known until runtime? For example:
trait inv[-A] {}
type Or[A, B] = {
  type check[X] = (inv[A] with inv[B]) <:< inv[X]
}
case class Foo[A : (Int Or String)#check](a: A)
Foo(1) // Foo[Int] = Foo(1)
Foo("hi") // Foo[String] = Foo(hi)
Foo(2.0) // Error!
This example works since the parameter A is known at compile time, and calling Foo(1) is really calling Foo[Int](1). However, what do you do if parameter A isn't known until runtime? Maybe you're parsing a file that contains the data for Foos, in which case the type parameter of Foo isn't known until you read the data. There's no easy way to set parameter A in this case.
The best solutions I've been able to come up with are:
Pattern match on the data you've read and then create different Foos based on that type. In my case this isn't feasible because my case class actually contains dozens of union types, so there'd be hundreds of combinations of types to pattern match.
Cast the value you've just read to Int with String, so you have a single type to pass around that satisfies the type-class constraint when you create Foo with it, and then return Foo[_]. This puts the onus back on the user of Foo to work out the type of each field (since they'll appear to be of type Any), but at least it defers having to know the type until the field is actually used, at which point a pattern match seems more tractable.
The second solution looks like this:
def parseLine: Any // parses a data point; it can be either a String or an Int, so it returns Any

def mkFoo: Foo[_] = {
  val a = parseLine.asInstanceOf[Int with String]
  Foo(a) // passes the type constraint now
}
In practice I've ended up using the second solution, but I'm wondering if there's something better I can do?
Another way to state the problem is: What does it mean to return a Union Type? Functions can only return a single type, and the trickery we use with Miles Sabin union types is only useful for the types you pass in, not for the types you return.
PS. For context: the reason this is a problem in my case is that I'm generating a set of case classes from a JSON schema file. JSON naturally supports union types, so I would like my case classes to reflect that too. This works great in one direction: users creating case classes to be serialized out to JSON. But it gets sticky in the other direction: users parsing JSON files to get a set of populated case classes returned to them.
The "standard" Scala solution to this problem is to use an ordinary discriminated-union type (ie, to forego true union types altogether):
sealed trait Foo
case class IntFoo(x: Int) extends Foo
case class StringFoo(x: String) extends Foo
This reflects the fact that, as you observe, the particular type of the member is a runtime value; the JVM type-tag of the Foo instance provides this runtime value.
Miles Sabin's implementation of union types is very clever, but I'm not sure it provides any practical benefit. It only restricts the type of thing that can go into a Foo, but provides the user of a Foo with no computable version of that restriction, the way a match provides you with a computable version of a sealed trait. In general, for a restriction to be useful, it needs two sides: a check that only the right things are put in, and an extractor (a.k.a. an eliminator) that allows the same right things to come out the other end.
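For example, with the sealed trait above, an ordinary match is that eliminator, and the compiler checks it for exhaustiveness:

def describe(foo: Foo): String = foo match {
  case IntFoo(x)    => s"an Int: ${x + 1}"                // x is statically an Int here
  case StringFoo(x) => s"a String of length ${x.length}"  // and a String here
}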
Perhaps if you gave some explanation of why you're looking for a purer union type it would illuminate whether regular discriminated unions are sufficient or if you really need something more.
There's a reason every JSON parser for Scala requires well defined types into which the JSON will be converted, even if some fields have to be dropped: you cannot work with something you don't know the type of.
To give an example, say you have a, and maybe a is a String, maybe it's an Int, but you don't know what it is. What computation could you possibly make with a, not knowing its type? How would your code compute the sum of all a's, for instance, if you didn't know in advance that they were numbers?
Generally, the answer to that is that you want to perform user-provided data manipulation at runtime over data with unknown characteristics: the user sees that it's a number and decides they want to know the sum of that field. Fine, but if so, you are going about it the wrong way.
There is a well-defined way to represent JSON data in Scala (and, for that matter, any data that has the same characteristics as JSON): a hierarchy of classes. A JSON value may be a JSON object, an array, or one of a number of primitives. A JSON object contains a list of key/value pairs, whose keys are JSON strings and whose values are JSON values. And so on. This is easy to represent, and there are many libraries doing so already. In fact, there are so many that there's a project called Json4s which presents a unified API implemented by many of the aforementioned libraries.
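As a sketch, such a hierarchy looks roughly like this (the real Json4s AST is similar, under different names):

sealed trait JsonValue
case object JsonNull extends JsonValue
case class JsonBoolean(value: Boolean) extends JsonValue
case class JsonNumber(value: BigDecimal) extends JsonValue
case class JsonString(value: String) extends JsonValue
case class JsonArray(items: List[JsonValue]) extends JsonValue
case class JsonObject(fields: List[(String, JsonValue)]) extends JsonValue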
Things like the records which Miles Sabin's Shapeless library provide are intended to be used when the input doesn't have a well defined schema, but the program knows what it needs from that input. And, yes, the program might know what to do with a if it is an Int or a String, but not every possible value.
The upcoming Scala 3 (due mid-2020), based on Dotty, will implement the union types proposal from September 2018.
You can see it in "A Tour of Scala 3" (June 2019):
Union types provide ad-hoc combinations of types: subsetting = subtyping, with no boxing overhead.
case class UserName(name: String)
case class Password(hash: Hash)
def help(id: UserName | Password) = {
  val user = id match {
    case UserName(name) => lookupName(name)
    case Password(hash) => lookupPassword(hash)
  }
  ...
}
Union types also work with singleton types, which is great for JS interop:
type Command = "Click" | "Drag" | "KeyPressed"
def handleEvent(kind: Command) = kind match {
  case "Click"      => MouseClick()
  case "Drag"       => MoveTo()
  case "KeyPressed" => KeyPressed()
}
I am working on an abstract CRUD DAO for my Play 2/Slick 2 project. To have convenient type-safe primary IDs I am using Unicorn as an additional abstraction and convenience layer on top of Slick's MappedTo & ColumnBaseType.
Unicorn provides a basic CRUD-DAO class BaseIdRepository which I want to further extend for project specific needs. The signature of the class is
class BaseIdRepository[I <: BaseId, A <: WithId[I], T <: IdTable[I, A]]
(tableName: String, val query: TableQuery[T])
(implicit val mapping: BaseColumnType[I])
extends BaseIdQueries[I, A, T]
This leads to DAO implementations looking something like
class UserDao extends
BaseIdRepository[UserId, User, Users]("USERS", TableQuery[Users])
This seems awfully redundant to me. I was able to supply tableName and query from T, giving me the following signature on my own abstract DAO:
abstract class AbstractIdDao[I <: BaseId, A <: WithId[I], T <: IdTable[I, A]]
  extends BaseIdRepository[I, A, T](TableQuery[T].baseTableRow.tableName, TableQuery[T])
Is it possible in Scala to somehow infer the types I and A to make a signature like the following possible? (Users is a class extending IdTable)
class UserDao extends AbstractIdDao[Users]
Is this possible without runtime reflection? If it is only possible with runtime reflection: how do I use a Manifest in a class definition, and how big is the performance impact in a reactive application?
Also, since I am fairly new to the language and working on my own: is this good practice in Scala at all?
Thank you for your help. Feel free to criticize my question and my English. Improvements will of course be submitted to the Unicorn git repo.
EDIT:
Actually, TableQuery[T].baseTableRow.tableName, TableQuery[T] does not work, due to the error "class type required but T found"; IDEA was superficially fine with it, but scalac wasn't.
As for your first question: I've encountered this when working with Slick too, but if you think about it, you'll see you cannot do this at compile time. This type information is necessary to specify the relations between your type parameters. If you did not supply it, you would be able to construct instances of BaseIdRepository where the types don't make sense, such as IdTables where the table doesn't represent the projection. Since you need names for each of these relations, you need three named type parameters. If you omit the first one, it is possible to construct an IdRepository without a projection containing an ID; if you omit the second one, it is possible to have a table without an ID column; and if you omit the third one, it is possible to query tables that do not have this combination of a table and a projection with an ID. You might not currently have types defined in your application that would break any of these rules, but the compiler doesn't know that. Supplying the proper type information is unavoidable.
As for your second question, it is very inadvisable to employ reflection just because you think the syntax is verbose. If you can make guarantees about type safety simply by providing type parameters, I would advise you to do so. It is in very bad taste and style to write Scala in such a way. It would be ironic to employ typesafe IDs with Unicorn and later hack around their type safety with reflection.
Furthermore, a Manifest is not what you want: a Manifest doesn't allow you to provide less type information to the compiler; it only gives you more flexibility in where you do so. It allows you to leverage the compiler's knowledge of types at compile time to circumvent some issues that type erasure introduces. The problem you face here has nothing to do with type erasure, so Manifests won't work. Lastly, runtime reflection won't help you much here, because Slick's internal functions won't compile if you don't already supply the type information.
So yeah, what you want is impossible. Scala (and Slick) need complete information at compile time and no trick is going to be effective in circumventing that.
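That said, if the goal is only to avoid repeating the table name and query at each use site (while still naming all three type parameters), one hedged sketch is to take the query once as a constructor argument, relying on baseTableRow just as the question's own attempt does; this sidesteps the "class type required" error because TableQuery[Users] is written where the concrete class is known:

abstract class AbstractIdDao[I <: BaseId, A <: WithId[I], T <: IdTable[I, A]](query: TableQuery[T])(implicit mapping: BaseColumnType[I])
  extends BaseIdRepository[I, A, T](query.baseTableRow.tableName, query)

// Use site: the concrete table class is known here, so TableQuery[Users] compiles
class UserDao extends AbstractIdDao[UserId, User, Users](TableQuery[Users])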
In Groovy one can do:
class Foo {
  Integer a, b
}
Map map = [a:1,b:2]
def foo = new Foo(map) // map expanded, object created
I understand that Scala is not, in any sense of the word, Groovy, but I am wondering whether map expansion in this context is supported.
Simplistically, I tried and failed with:
case class Foo(a: Int, b: Int)
val map = Map("a" -> 1, "b" -> 2)
Foo(map: _*) // no dice, always applied to first property
A related thread that shows possible solutions to the problem.
Now, from what I've been able to dig up, as of Scala 2.9.1 at least, reflection with regard to case classes is basically a no-op. The net effect, then, appears to be that one is forced into some form of manual object creation, which, given the power of Scala, is somewhat ironic.
I should mention that the use case involves the servlet request parameter map. Specifically, using Lift, Play, Spray, Scalatra, etc., I would like to take the sanitized params map (filtered via the routing layer) and bind it to a target case class instance, without needing to manually create the object or specify its types. This would require "reliable" reflection and implicits like "str2Date" to handle type conversion errors.
Perhaps in 2.10, with the new reflection library, implementing the above will be cake. I am only 2 months into Scala, so I'm just scratching the surface; I do not see any straightforward way to pull this off right now (for seasoned Scala developers it may be doable).
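For what it's worth, here is a hedged sketch of the kind of binding the new runtime reflection enables (written with Scala 2.11 API names such as decl and paramLists, which 2.10 spells declaration and paramss; no type conversion or error handling, purely illustrative):

import scala.reflect.runtime.{universe => ru}

def fromMap[T: ru.TypeTag](map: Map[String, Any]): T = {
  val mirror = ru.runtimeMirror(getClass.getClassLoader)
  val tpe = ru.typeOf[T]
  val classMirror = mirror.reflectClass(tpe.typeSymbol.asClass)
  val ctor = tpe.decl(ru.termNames.CONSTRUCTOR).asMethod
  // Look up each primary-constructor parameter by name in the map
  val args = ctor.paramLists.head.map(p => map(p.name.toString))
  classMirror.reflectConstructor(ctor)(args: _*).asInstanceOf[T]
}

case class Foo(a: Int, b: Int)
fromMap[Foo](Map("a" -> 1, "b" -> 2)) // Foo(1,2)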
Well, the good news is that Scala's Product interface, implemented by all case classes, actually doesn't make this very hard to do. I'm the author of a Scala serialization library called Salat that supplies some utilities for using pickled Scala signatures to get typed field information:
https://github.com/novus/salat - check out some of the utilities in the salat-util package.
Actually, I think this is something that Salat should do - what a good idea.
Re: D.C. Sobral's point about the impossibility of verifying params at compile time - point taken, but in practice this should work at runtime just like deserializing anything else with no guarantees about structure, like JSON or a Mongo DBObject. Also, Salat has utilities to leverage default args where supplied.
This is not possible, because it is impossible to verify at compile time that all parameters were passed in that map.
I've got a List[Any] of values and a list of corresponding ClassManifest[_]s storing the values' original types. How do I cast a value from the list back to its original type?
def cast[T](x: Any, mf: ClassManifest[T]): T = x.asInstanceOf[T] doesn't work.
Thank you for your answers.
That can't ever possibly work, as the return type of cast will always be taken as the highest common supertype of whatever T is restricted to. There's no way it can be made any more specific at compile time.
If you're trying to build a strongly-typed collection of disparate types, then what you really want is an HList:
http://jnordenberg.blogspot.com/2008/09/hlist-in-scala-revisited-or-scala.html
The way to use a Class instance in Java/Scala to cast an object is to use the Class.cast method. So you may think that you could do:
mf.erasure.cast(x) //T
But this will not work, because mf.erasure is a Class[_] (or a Class<?> in Java), so the cast is meaningless (i.e. it offers no extra information). This is (of course) one of the drawbacks of using non-reified generics.
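One workaround sketch (illustrative only; ClassManifest was later superseded by ClassTag) is to pair each value with its manifest under a shared type parameter, so a pattern match can rebind the type locally instead of juggling a parallel List[Any] and List[ClassManifest[_]]:

case class Typed[T](value: T)(implicit val mf: ClassManifest[T])

val xs: List[Typed[_]] = List(Typed(1), Typed("one"))

xs.foreach {
  // the type pattern binds a local type variable t0, giving a typed
  // view of the element within this branch
  case t: Typed[t0] =>
    val v: t0 = t.value
    println(s"${t.mf}: $v")
}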