Slick "===" compiling only in for comprehension - scala

I am experiencing a strange behaviour with Slick and I would like some help figuring out why it happens. The issue is that I have a query that reads as follows:
db.withSession { implicit session =>
  tables.users.where(_.id === UserId(1)).firstOption
}
This does not compile, producing the following error:
inferred type arguments [Boolean] do not conform to method where's type parameter bounds [T <: scala.slick.lifted.Column[_]]
But if I rewrite the code as:
db.withSession { implicit session =>
  (for {
    u <- tables.users if u.id === UserId(1)
  } yield u).firstOption
}
It compiles and works fine.
The table is defined as follows:
class Users(tag: Tag) extends Table[User](tag, "users") {
  def id = column[UserId]("id", O.PrimaryKey, O.AutoInc, O.NotNull)
}
And I have an implicit conversion to map the UserId type:
implicit lazy val userIdColumnType = MappedColumnType.base[UserId, Int](_.value, UserId(_))
It looks like a type inference problem, but I can't really understand why it should happen.
Does anyone have an idea why this behaves differently in the two scenarios I reported?
EDIT: After some investigation I found that when using where, the implicit userIdColumnType has to be in scope, while with the for comprehension it is not needed. Is there a good explanation for this?

You are using === from ScalaTest. It returns a Boolean. Slick's === returns a Column[Boolean]. The methods filter and where reject Boolean (at least in the latest version of Slick) to protect you from accidentally using ==, or, as in your case, ScalaTest's ===, both of which do a local comparison of the underlying values instead of the equality comparison in the database that you actually want. For comprehensions are desugared to withFilter and can sometimes legitimately produce a Boolean, so unfortunately we cannot disallow Boolean in for comprehensions.
To fix this you need to make sure that Slick's === is picked in queries. Maybe you can affect this with the import order or scope. Or, if you are unlucky, you can't and they are incompatible.
I am not sure how the userIdColumnType interacts here at the moment.
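One way to apply the scoping advice is to keep query code in a scope that never imports ScalaTest's === (no Assertions mixin or Matchers import there), so only Slick's === can be resolved. A minimal sketch, not from the question itself, assuming Slick 2.x, the tables.users query and the userIdColumnType implicit from the question; the driver import is an assumption you would adjust to your own setup:
// Sketch only: no ScalaTest imports in this scope, so _.id === id resolves to
// Slick's Column#===, which returns a Column[Boolean] and satisfies where's bound.
import scala.slick.driver.PostgresDriver.simple._ // assumption: use your actual driver

def findUser(id: UserId)(implicit session: Session): Option[User] =
  tables.users.where(_.id === id).firstOption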

I fixed this by importing my driver's api and making sure it was in scope, i.e. as I'm using a PostgreSQL driver:
import PostgresProfile.api._

Related

Type bounds on methods not enforced

I can't comprehend why the compiler accepts the current code.
I use phantom types to protect access to methods. Only under specific "states" should methods be allowed to be called.
In most scenarios, this invariant is indeed verified by the compilation. Sometimes however, the compiler just ignores the constraint imposed by the phantom type.
This feels like a major bug. What am I not understanding?
I tried to simplify the problem as much as possible. My use case is more complex:
class Door[State <: DoorState] private {
  def doorKey: Int = 1
  def open[Phantom >: State <: Closed.type]: Door[Open.type] = new Door[Open.type]
  def close[Phantom >: State <: Open.type]: Door[Closed.type] = new Door[Closed.type]
}
object Door {
  def applyOpenDoor: Door[Open.type] = new Door[Open.type]
  def applyClosedDoor: Door[Closed.type] = new Door[Closed.type]
}
sealed trait DoorState
case object Closed extends DoorState
case object Open extends DoorState
Then
val aClosedDoor = Door.applyClosedDoor
val res1 = aClosedDoor.close // does not compile. Good!
val res2 = aClosedDoor.open.close.close // does not compile. Good!
println(aClosedDoor.open.close.close) // does not compile. Good!
println(aClosedDoor.open.close.close.doorKey) // does not compile. Good!
aClosedDoor.open.close.close.doorKey == 1 // does not compile. Good!
println(aClosedDoor.open.close.close.doorKey == 1) // compiles! WTF?
As you can see above, the client can close a closed door. In my library, the corresponding behaviour is throwing a runtime exception. (I was confident the exception was well protected and impossible to reach.)
I have only been able to replicate this problem for expressions returning Boolean, and only when that expression is the argument to a function (in the example, println).
I am not looking for alternative solutions so much as an explanation of how this can happen. Don't you agree this is a considerable flaw? Or maybe I am missing something?
Scala version: 2.13.5
Edit
After a discussion on Gitter, I opened bug report # https://github.com/scala/bug
The problem does not seem to occur in Scala 3.
It seems to be a bug related to the use of a nullary method
def doorKey: Int = 1
If instead it is defined as a nilary method
def doorKey(): Int = 1
then it works as expected
println(aClosedDoor.open.close.close.doorKey() == 1) // compiler error

Null as parameter default value in scala produces type mismatch error

In order to make overloaded calls like
val myPage: DocumentType;
func()
func(myPage)
I wrote a function:
def func(page: DocumentType = null): Unit = {...}
but I receive the following error:
type mismatch; found : Null(null) required: DocumentType
When I change DocumentType to String, the error disappears. First question: why?
DocumentType is a type from the library which I cannot change, with the following definition:
type DocumentType <: Document
trait Document
I do not want to wrap the actual parameter in an Option (like Option(myPage)) on every client call, but are there any other ways to achieve something similar?
You can just overload functions like
def func(): Unit = { } // do what you would do with null
def func(page: DocumentType): Unit = { } // do what you would do with a DocumentType
You can abstract the implementation by having both overloads call some other private function, to keep it DRY. You can then call func() or func(new DocumentType()).
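A minimal sketch of that DRY arrangement (PageApi and doRender are made-up names for illustration; Document and DocumentType mirror the question):
// Sketch only: both public overloads delegate to one private implementation.
trait PageApi {
  trait Document
  type DocumentType <: Document

  def func(): Unit = doRender(None)
  def func(page: DocumentType): Unit = doRender(Some(page))

  // the single place where the real work happens
  private def doRender(page: Option[DocumentType]): Unit =
    page match {
      case Some(p) => println(s"rendering $p")
      case None    => println("rendering default page")
    }
}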
ORIGINAL ANSWER (not so good)
def func(page: DocumentType): Unit = func(Some(page))
def func(page: Option[DocumentType] = None): Unit = ???
This means you don't need to resort to null, but you lose the clean API, since callers can write any of:
val d = new DocumentType()
func()
func(d)
func(Some(d))
func(None)
Something like this should work:
trait Document
trait DocumentFunc {
  // The trick is to tell the compiler that your type can be nullable.
  type DocumentType >: Null <: Document
  def fun(page: DocumentType = None.orNull): Unit = {
    println(page)
  }
}
Apparently, the problem is that since you only set the upper bound to Document, the compiler will reject null, because DocumentType could be overridden to be Nothing.
And "obviously", null could not be used in a place where a Nothing is expected.
First disclaimer: I agree with Joel Berkeley that you should avoid null, and I would prefer his solution.
I just wanted to answer the real question: "Why it does not work".
Second disclaimer: I used None.orNull just to not have an explicit null - that is just because the linters I use disallow the use of null.
You may change it if you want.
Third disclaimer: Type members can almost always be replaced by type parameters, which are (usually) easier to use and more "typesafe".
Type members, IMHO, should only be used when you really need them, e.g. for path-dependent types.
Fourth disclaimer: The use of null and Unit (together with vars, if you had any) is a symptom of using Scala as Java, which is (usually) a bad use of the language. However, that is just my opinion.

Can Scala infer the actual type from the return type actually expected by the caller?

I have the following question. Our project has a lot of code that runs tests in Scala, and there is a lot of code that fills in fields like this:
production.setProduct(new Product)
production.getProduct.setUuid("b1253a77-0585-291f-57a4-53319e897866")
production.setSubProduct(new SubProduct)
production.getSubProduct.setUuid("89a877fa-ddb3-3009-bb24-735ba9f7281c")
Eventually, I grew tired of this code, since the types of all those fields are subclasses of a base class that has the uuid field, so, after thinking a while, I wrote an auxiliary function like this:
def createUuid[T <: GenericEntity](uuid: String)(implicit m: Manifest[T]): T = {
  val constructor = m.runtimeClass.getConstructors()(0)
  val instance = constructor.newInstance().asInstanceOf[T]
  instance.setUuid(uuid)
  instance
}
Now my code is half as long, since I can write something like this:
production.setProduct(createUuid[Product]("b1253a77-0585-291f-57a4-53319e897866"))
production.setSubProduct(createUuid[SubProduct]("89a877fa-ddb3-3009-bb24-735ba9f7281c"))
That's good, but I am wondering if I could somehow implement createUuid so that the last bit would look like this:
// Is that really possible?
production.setProduct(createUuid("b1253a77-0585-291f-57a4-53319e897866"))
production.setSubProduct(createUuid("89a877fa-ddb3-3009-bb24-735ba9f7281c"))
Can the Scala compiler guess that setProduct expects not just a generic entity, but actually something like Product (or a subclass of it)? Or is there no way in Scala to make this even shorter?
The Scala compiler won't infer/propagate the type outside-in. You could however create implicit conversions like:
implicit def stringToSubProduct(uuid: String): SubProduct = {
  val n = new SubProduct
  n.setUuid(uuid)
  n
}
and then just call
production.setSubProduct("89a877fa-ddb3-3009-bb24-735ba9f7281c")
and the compiler will automatically use the stringToSubProduct because it has applicable types on the input and output.
Update: To keep the code better organized I suggest wrapping the implicit defs in a companion object, like:
case class EntityUUID(uuid: String) {
  // possible uuid format check
  require(uuid.matches("[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}"))
}
object EntityUUID {
  implicit def toProduct(e: EntityUUID): Product = {
    val p = new Product
    p.setUuid(e.uuid)
    p
  }
  implicit def toSubProduct(e: EntityUUID): SubProduct = {
    val p = new SubProduct
    p.setUuid(e.uuid)
    p
  }
}
and then you'd do
production.setProduct(EntityUUID("b1253a77-0585-291f-57a4-53319e897866"))
so that anyone reading this has an idea of where to find the conversion implementation.
Regarding your comment about some generic approach (for the 30 types you have), I won't say it's not possible, but I just do not see how to do it. The reflection you used bypasses the type system. If all 30 cases are the same piece of code, maybe you should reconsider your object design. You can still implement the 30 implicit defs by calling some method that uses reflection, similar to what you have provided; you will then have the option to change it later in just this one place (or these 30 places).
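For instance, each of the 30 implicit defs could be a one-liner that delegates to the reflection-based createUuid from the question (sketch only; EntityConversions is a made-up name, EntityUUID and createUuid are as defined above, and Product/SubProduct are assumed to extend GenericEntity):
object EntityConversions {
  // Each conversion stays a one-liner; the reflective construction lives in
  // createUuid, so it can be changed later in a single place.
  implicit def toProduct(e: EntityUUID): Product = createUuid[Product](e.uuid)
  implicit def toSubProduct(e: EntityUUID): SubProduct = createUuid[SubProduct](e.uuid)
  // ...one line per entity type
}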

Trouble with ReactiveMongo's BSON macros and generics

The following code fails for me:
object Message {
  def parse[T](bsonDoc: BSONDocument): Try[T] = {
    implicit val bsonHandler = Macros.handler[T]
    bsonDoc.seeAsTry[T]
  }
}
Message.parse[messages.ClientHello](data)
The error is:
No apply function found for T
implicit val bsonHandler = Macros.handler[T]
^
However, if I hardcode a type (one of my case classes), it's fine:
object Message {
  def parse(bsonDoc: BSONDocument): Try[ClientHello] = {
    implicit val bsonHandler = Macros.handler[ClientHello]
    bsonDoc.seeAsTry[ClientHello]
  }
}
Message.parse(data)
So I presume this is a problem with using generics. Incidentally, I have to import messages.ClientHello; if I just use the fully qualified messages.ClientHello I get:
not found: value ClientHello
implicit val bsonHandler = Macros.handler[messages.ClientHello]
^
How can I achieve what I'm trying to do, which is to have a single method that will take a BSON document and return an instance of the appropriate case class?
1) Macro applications get expanded immediately when encountered (well, modulo some fine details of type inference that are irrelevant here). This means that when you write handler[T], handler will try to expand with T as a type parameter. This won't lead to anything good, hence the error. To make this work, you need to turn Message.parse into a macro itself.
2) This happens because ReactiveMongo macros are unhygienic. Specifically, https://github.com/ReactiveMongo/ReactiveMongo/blob/v0.10.0/macros/src/main/scala/macros.scala#L142 isn't going to work correctly in situations like yours, because it uses simple name of the class, not a fully qualified name. I think the best way to make the macro work correctly would be using Ident(companion), not Ident(companion.name) - that would ensure that this identifier binds to the companion, not to something in scope having the same name.
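If turning Message.parse into a macro is more than you want to take on, a common non-macro alternative (my own sketch, not part of the answer above) is to require the handler as an implicit parameter and materialize it with Macros.handler once per concrete case class at the call site:
// Sketch only: reuses the question's seeAsTry call and assumes the usual
// reactivemongo.bson imports; the handler is now resolved at the call site,
// where the type argument is a concrete case class.
import messages.ClientHello // imported because of the hygiene issue described in 2)

object Message {
  def parse[T](bsonDoc: BSONDocument)(implicit handler: BSONDocumentReader[T]): Try[T] =
    bsonDoc.seeAsTry[T]
}

implicit val clientHelloHandler = Macros.handler[ClientHello]
Message.parse[ClientHello](data)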

Scala Macros, generating type parameter calls

I'm trying to generalize setting up Squeryl (Slick poses the same problems AFAIK). I want to avoid having to name every case class explicitly for a number of general methods.
table[Person]
table[Bookmark]
etc.
This also goes for generating indexes, and creating wrapper methods around the CRUD methods for every case class.
So ideally what I want to do is have a list of classes and make them into tables, add indexes and add a wrapper method:
val listOfClasses = List(classOf[Person], classOf[Bookmark])
listOfClasses.foreach(clazz => {
  val tbl = table[clazz]
  tbl.id is indexed
  etc.
})
I thought Scala Macros would be the thing to apply here, since I don't think you can have values as type parameters. Also I need to generate methods for every type of the form:
def insert(model: Person): Person = persons.insert(model)
I've got my mitts on an example of macros, but I don't know how to generate a generic data structure.
I got this simple example to illustrate what I want:
def makeList_impl(c: Context)(clazz: c.Expr[Class[_]]): c.Expr[Unit] = {
  import c.universe._
  reify {
    println(List[clazz.splice]()) // error: type splice is not a member of c.Expr[Class[_]]
  }
}
def makeList(clazz: Class[_]): Unit = macro makeList_impl
How do I do this? Or is Scala Macros the wrong tool?
Unfortunately, reify is not flexible enough for your use case, but there's good news. In macro paradise (and most likely in 2.11.0) we have a better tool to construct trees, called quasiquotes: http://docs.scala-lang.org/overviews/macros/quasiquotes.html.
scala> def makeList_impl(c: Context)(clazz: c.Expr[Class[_]]): c.Expr[Any] = {
| import c.universe._
| val ConstantType(Constant(tpe: Type)) = clazz.tree.tpe
| c.Expr[Any](q"List[$tpe]()")
| }
makeList_impl: (c: scala.reflect.macros.Context)(clazz: c.Expr[Class[_]])c.Expr[Any]
scala> def makeList(clazz: Class[_]): Any = macro makeList_impl
defined term macro makeList: (clazz: Class[_])Any
scala> makeList(classOf[Int])
res2: List[Int] = List()
scala> makeList(classOf[String])
res3: List[String] = List()
Quasiquotes are even available in 2.10.x with a minor tweak to the build process (http://docs.scala-lang.org/overviews/macros/paradise.html#macro_paradise_for_210x), so you might want to give them a try.
This will probably not fill all your needs here, but it may help a bit:
The signature of table method looks like this:
protected def table[T]()(implicit manifestT: Manifest[T]): Table[T]
As you can see, it takes an implicit Manifest object. That object is passed automatically by the compiler and contains information about the type T. This is actually what Squeryl uses to inspect the database entity type.
You can just pass these manifests explicitly like this:
val listOfManifests = List(manifest[Person], manifest[Bookmark])
listOfManifests.foreach(manifest => {
  val tbl = table()(manifest)
  tbl.id is indexed
  etc.
})
Unfortunately, tbl in this code will have a type similar to Table[_ <: CommonSupertypeOfAllGivenEntities], which means that all operations on it must be agnostic of the concrete type of the database entity.