I'm trying to use Couchbase as a cache layer for a relational database that is accessed using Slick. The skeleton of my code that's relevant to the question is as follows:
class RdbTable[T <: Table[_]](implicit val bucket: CouchbaseBucket) {
  type ElementType = T#TableElementType

  private val table = TableQuery[T].baseTableRow

  private def cacheAll(implicit session: Session) =
    TableQuery[T].list foreach (elem => cache(elem))

  private def cache(elem: ElementType) =
    table.primaryKeys foreach (pk => bucket.set[ElementType](key(pk, elem), elem))

  private def key(pk: PrimaryKey, elem: ElementType) = ???

  .......
}
As you can see, I want to cache each element by all of its primary keys. For this purpose, I need to obtain the value of that key for the given element. But I don't see an obvious way to compute the value of a primary key (the column value, if single-column key; the tuple value, if multi-column).
Any suggestions on what to do? Note that the code MUST NOT know what the actual tables and their columns are. It must be completely general.
We're doing something similar, using Redis as the cache. Each of our records only has one primary key, but in some cases we need to include additional data with the cache key to avoid ambiguity (for example, we have a ManyToMany record that represents an association between two records; when we return a ManyToMany record we'll embed one (but not both) of the associated records, and so in the cache key we need to include the type of the associated record that we're returning).
trait Record {
  val cacheKey: CacheKey
}

trait ManyToManyRecord extends Record {
  override val cacheKey: ManyToManyCacheKey
}

class CacheKey(recordType: String, key: Int) {
  def getKey: String = recordType + ":" + key.toString
}

class ManyToManyCacheKey(recordType: String, key: Int, assocType: String)
    extends CacheKey(recordType, key) {
  override def getKey: String = recordType + ":" + key.toString + ":" + assocType
}
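For illustration, here's how a record might wire in its key. UserRecord and the serialize call are made up for this sketch, not part of our actual codebase:

case class UserRecord(id: Int, name: String) extends Record {
  override val cacheKey = new CacheKey("user", id)
}

// storing it: redis.set(record.cacheKey.getKey, serialize(record))
// would use the key "user:42" for a record with id 42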
All of our tables use an integer primary key called "id", so it's easy for us to figure out what the value of "key" is. If you're working with a more complicated schema and don't want to manually write out the "def key: String" (or whatever) definitions for all of your record / table types, then you could try using Slick code generation to automatically generate record / table classes / objects with "def key" derived directly from the schema. However, the learning curve for Slick code generation (or any other code generation tool) is steep, so if this is your only use for it then you'd probably be better off writing "def key" by hand. (We generate somewhere between 20-30% of our code with the code generation tool, so the initial investment in learning it has paid off.)
Slick doesn't come with a built-in primary key extractor for entities. What you can do is use interfaces, type classes, or reflection, e.g. variants of the following:
Either make your entities implement a trait
trait HasPrimaryKey {
  def primaryKey: Any
}

class RdbTable[T <: Table[_ <: HasPrimaryKey]](implicit val bucket: CouchbaseBucket) {
  ...
  private def key(elem: ElementType) = elem.primaryKey
}

// and for each entity:
case class Person( ... ) extends HasPrimaryKey {
  def primaryKey = ...
}
or a type class
trait KeyTypeClass[E, T <: Table[E]] {
  def key(e: E): Any
}

class RdbTable[E, T <: Table[E]](implicit val bucket: CouchbaseBucket, keyTC: KeyTypeClass[E, T]) {
  ...
  private def key(elem: E) = keyTC.key(elem)
}

// and for each entity:
implicit val personKey = new KeyTypeClass[Person, PersonTable] {
  def key(p: Person) = ...
}
or using reflection to iterate over the primary keys and pull the values out of corresponding fields of the entity.
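For the reflection variant, a rough sketch using plain Java reflection. It assumes the entity's field names match the primary-key column names, which is an assumption about your schema, not something Slick enforces:

// pull the value of a named field off an entity
private def fieldValue(entity: AnyRef, fieldName: String): Any = {
  val field = entity.getClass.getDeclaredField(fieldName)
  field.setAccessible(true) // case class fields are private under the hood
  field.get(entity)
}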
Code generation, as mentioned by Zim-Zam, can help with the repetitive elements.
Related
I'm trying to write a generic insert function that returns the inserted object when inserting data using Slick.
I've tried the following code:
// This works without any issues but is limited to a specific entity and table
def insertEmployee(employee: Employee): IO[Employee] = DB.run(db, {
  def returnObj =
    (EmployeeTable.instance returning EmployeeTable.instance.map(_.id))
      .into((obj, id) => obj.copy(id = id))
  returnObj += employee
})

// This doesn't work; it reports an error on "copy". Error: Cannot resolve symbol copy
def withReturningObj[E <: BaseEntity, T <: BaseTable[E]](query: TableQuery[T]) =
  (query returning query.map(_.id)).into((obj, id) => obj.copy(id = id))
Can anyone suggest a possible solution?
I'm guessing you haven't defined your own copy method and are trying to use the one synthesized for case classes. The problem is that obj isn't known to be of a type that has a copy method; all the compiler knows is that it's a BaseEntity. I've run into this same issue, and unfortunately there isn't a way to constrain a type parameter to be a case class (that I know of; I would LOVE to be wrong about this). I ended up doing something equivalent to defining a withId() method in your BaseEntity for exactly this purpose.
trait BaseEntity {
  type IdType // whatever your schema uses for ids, e.g. Long
  val id: IdType
  def withId(id: IdType): BaseEntity
}

// ...

def withReturningObj[E <: BaseEntity, T <: BaseTable[E]](query: TableQuery[T]) =
  (query returning query.map(_.id)).into((obj, id) => obj.withId(id)) // <-- use withId here instead of copy
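A hypothetical concrete entity, assuming Long ids: the case class keeps its synthesized copy, and withId just delegates to it.

case class Employee(id: Long, name: String) extends BaseEntity {
  type IdType = Long
  def withId(id: Long): Employee = copy(id = id)
}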
I am attempting to use scala-cass in order to read from cassandra and convert the resultset to a case class using resultSet.as[CaseClass]. This works great when running the following.
import com.weather.scalacass.syntax._
case class TestTable(id: String, data1: Int, data2: Long)
val resultSet = session.execute(s"select * from test.testTable limit 10")
resultSet.one.as[TestTable]
Now I am attempting to make this more generic and I am unable to find the proper type constraint for the generic class.
import com.weather.scalacass.syntax._
case class TestTable(id: String, data1: Int, data2: Long)
abstract class GenericReader[T] {
  val table: String
  val keyspace: String

  def getRows(session: Session): T = {
    val resultSet = session.execute(s"select * from $keyspace.$table limit 10")
    resultSet.one.as[T]
  }
}
I implement this class with the desired case class and attempt to call getRows on the resulting object.
object TestTable extends GenericReader[TestTable] {
  val keyspace = "test"
  val table = "TestTable"
}

TestTable.getRows(session)
This fails to compile with the error: could not find implicit value for parameter ccd: com.weather.scalacass.CCCassFormatDecoder[T].
I am trying to add a type constraint to GenericReader in order to ensure the implicit conversion will work. However, I am unable to find the proper type. I am attempting to read through scala-cass in order to find the proper constraint but I have had no luck so far.
I would also be happy to use any other library that can achieve this.
Looks like as[T] requires an implicit value that you don't have in scope, so you'll need to require that implicit parameter in the getRows method as well.
def getRows(session: Session)(implicit cfd: CCCassFormatDecoder[T]): T
You could express this as a type constraint (what you were looking for in the original question) using context bounds:
abstract class GenericReader[T:CCCassFormatDecoder]
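For reference, the context-bound form spelled out in full (it desugars to the same implicit constructor parameter shown below):

abstract class GenericReader[T: CCCassFormatDecoder] {
  val table: String
  val keyspace: String

  def getRows(session: Session): T = {
    val resultSet = session.execute(s"select * from $keyspace.$table limit 10")
    resultSet.one.as[T]
  }
}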
Rather than try to bound your generic T type, it might be easier to just pass through the missing implicit parameter:
abstract class GenericReader[T](implicit ccd: CCCassFormatDecoder[T]) {
  val table: String
  val keyspace: String

  def getRows(session: Session): T = {
    val resultSet = session.execute(s"select * from $keyspace.$table limit 10")
    resultSet.one.as[T]
  }
}
Finding a concrete value for that implicit can then be deferred until you narrow T to a specific class (like object TestTable extends GenericReader[TestTable]).
I have these case classes:
case class PolicyHolder(id : String, firstName : String, lastName : String)
case class Policy(address : Future[Address], policyHolder : Future[PolicyHolder], created : RichDateTime, duration : RichDuration )
I then have a slick schema defined for Policy
class PolicyDAO(tag: Tag) extends Table[Policy](tag, "POLICIES") with DbConfig {
  def address = column[String]("ADDRESS", O.PrimaryKey)
  def policyHolder = foreignKey("POLICY_HOLDER_FK", address, TableQuery[PolicyHolderDAO])(_.id)
  def created = column[RichDateTime]("CREATED")
  def duration = column[String]("DURATION")

  def * = (address, policyHolder, created, duration) <> (Policy.apply, Policy.unapply)
}
What is the best way for me to define this projection correctly to map the policyHolder field inside of my Policy case class from the foreign key value to an actual instance of the PolicyHolder case class.
Our solution to this problem is to place the foreign key id in the case class, and then use a lazy val or a def (the latter possibly backed by a cache) to retrieve the record using the key. This assumes that your PolicyHolders are stored in a separate table; if they're denormalized but you want to treat them as separate case classes, the lazy val / def in Policy can construct a new case class instead of retrieving the record using the foreign key.
class PolicyDAO(tag: Tag) extends Table[Policy](tag, "POLICIES") with DbConfig {
  def address = column[String]("ADDRESS", O.PrimaryKey)
  def policyHolderId = column[String]("POLICY_HOLDER_ID")
  def created = column[RichDateTime]("CREATED")
  def duration = column[String]("DURATION")

  def * = (address, policyHolderId, created, duration) <> (Policy.tupled, Policy.unapply)
}

case class Policy(address: Future[Address], policyHolderId: Future[String], created: RichDateTime, duration: RichDuration) {
  lazy val policyHolder = policyHolderId.map(id => PolicyHolderDAO.get(id))
}
We also used a common set of create/update/delete methods to account for the nesting, so that when a Policy is committed, its inner PolicyHolder is also committed. We used a CommonDAO class that extended Table and declared the prototypes for the create/update/delete methods; all DAOs then extended CommonDAO instead of Table and overrode create/update/delete as necessary.
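A minimal sketch of that CommonDAO idea (the names and signatures here are assumptions; the real version depends on your session handling and schema):

abstract class CommonDAO[E](tag: Tag, tableName: String) extends Table[E](tag, tableName) {
  // prototypes; concrete DAOs override these to also commit nested records
  def create(e: E)(implicit session: Session): Unit
  def update(e: E)(implicit session: Session): Unit
  def delete(e: E)(implicit session: Session): Unit
}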
Edit: To cut down on errors and reduce the amount of boilerplate we had to write, we used Slick's code generation tool - this way the CRUD operations could be automatically generated from the schema
Previously I used a case class to map my class to a table with Slick 2. But now I'm using another Play plugin whose objects are case classes, and my class inherits from one of those case classes. Since Scala forbids a case class from inheriting from another case class, I can no longer make my class a case class.
before:
case class User()

class UserTable(tag: Tag) extends Table[User](tag, "User") {
  ...
  def * = (...) <> (User.tupled, User.unapply)
}
It works.
But now I need to change the above to something like this:
case class BasicProfile()

class User(...) extends BasicProfile(...) {
  ...
  def unapply(i: User): Tuple12[...] = Tuple12(...)
}

class UserTable(tag: Tag) extends Table[User](tag, "User") {
  ...
  def * = (...) <> (User.tupled, User.unapply)
}
I do not know how to write the tupled and unapply methods (I'm not sure whether my attempt is correct) the way they are auto-generated for a case class. Alternatively, you could show me another way to map the class to a table with Slick 2.
Can anyone give me an example?
First of all, this case class is a bad idea:
case class BasicProfile()
Case classes compare by their member values, and this one doesn't have any. The name is also not great, because Slick has a class of the same name, which may cause confusion.
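To see the comparison problem:

case class BasicProfile()
BasicProfile() == BasicProfile() // true: with no members, every instance is equal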
Regarding your class
class User(...) extends BasicProfile(...) {
  ...
  def unapply(i: User): Tuple12[...] = Tuple12(...)
}
It is possible to emulate case classes yourself. Are you doing this because of the 22-field limit? FYI: Scala 2.11 supports larger case classes. We are doing what you are trying at Sport195, but there are several aspects to take care of.
apply and unapply need to be members of object User (the companion object of class User). .tupled is not a real method, but is generated automatically by the Scala compiler: it turns a method like .apply that takes a list of arguments into a function that takes a single tuple of those arguments. As tuples are limited to 22 columns, so is .tupled, but you could of course auto-generate one yourself (you may have to give it another name).
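A minimal sketch of such a hand-written companion, with made-up fields for brevity (your real User will have many more):

case class BasicProfile(id: Long, name: String) // stands in for the plugin's case class

class User(id: Long, name: String, val email: String) extends BasicProfile(id, name)

object User {
  def apply(id: Long, name: String, email: String): User = new User(id, name, email)
  def unapply(u: User): Option[(Long, String, String)] = Some((u.id, u.name, u.email))
  // the compiler only synthesizes .tupled for case class companions, so write it out:
  val tupled: ((Long, String, String)) => User = (apply _).tupled
}

With that in place, a mapping like def * = (id, name, email) <> (User.tupled, User.unapply) works as it did with a case class.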
We are using the Slick code generator in combination with the twirl template engine (which uses # to insert expressions; the $ parts are passed through as-is into the generated Scala code and evaluated when the generated code is compiled/run). Here are a few snippets that may help you:
Generate apply method:
/** Factory for #{name} objects
  #{indentN(2,entityColumns.map(c => "* @param "+c.name+" "+c.doc).mkString("\n"))}
  */
final def apply(
  #{indentN(2,
    entityColumns.map(c =>
      colWithTypeAndDefault(c)
    ).mkString(",\n")
  )}
) = new #{name}(#{columnsCSV})
Generate unapply method:
#{if(entityColumns.size <= 22)
  s"""
/** Extractor for ${name} objects */
final def unapply(o: ${name}) = Some((${entityColumns.map(c => "o."+c.name).mkString(", ")}))
""".trim
else
  ""}
Trait that can be mixed into User to make it a Scala Product:
trait UserBase extends Product {
  // Product interface
  def canEqual(that: Any): Boolean = that.isInstanceOf[#name]
  def productArity: Int = #{entityColumns.size}
  def productElement(n: Int): Any = Seq(#{columnsCSV})(n)
  override def toString = "#{name}" + s"(${productIterator.toSeq.mkString(",")})"
  ...
Case-class-like .copy method:
final def copy(
  #{indentN(2,columnsCopy)}
): #{name} = #{name}(#{columnsCSV})
To use those classes with Slick you have several options, all somewhat newer and not (well) documented. The normal <> operator in Slick goes via tuples, but that's not an option for more than 22 columns. One option is the new fastpath converters. Another is mapping via a Slick HList. No examples exist for either. Another option is going via a custom Shape, which is what we do. This requires you to define a custom shape for your User class, plus another class defined using Column types that mirrors User within queries. Like this: http://slick.typesafe.com/doc/2.1.0/api/#scala.slick.lifted.ProductClassShape This is too verbose to write by hand, so we use the following template code for it:
/** class for holding the columns corresponding to #{name}
 * used to identify this entity in a Slick query and map
 */
class #{name}Columns(
  #{indent(
    entityColumns
      .map(c => s"val ${c.name}: Column[${c.exposedType}]")
      .mkString(", ")
  )}
) extends Product {
  def canEqual(that: Any): Boolean = that.isInstanceOf[#name]
  def productArity: Int = #{entityColumns.size}
  def productElement(n: Int): Any = Seq(#{columnsCSV})(n)
}

/** shape for mapping #{name}Columns to #{name} */
object #{name}Implicits {
  implicit object #{name}Shape extends ClassShape(
    Seq(#{
      entityColumns
        .map(_.exposedType)
        .map(t => s"implicitly[Shape[ShapeLevel.Flat, Column[$t], $t, Column[$t]]]")
        .mkString(", ")
    }),
    vs => #{name}(#{
      entityColumns
        .map(_.exposedType)
        .zipWithIndex
        .map{ case (t,i) => s"vs($i).asInstanceOf[$t]" }
        .mkString(", ")
    }),
    vs => new #{name}Columns(#{
      entityColumns
        .map(_.exposedType)
        .zipWithIndex
        .map{ case (t,i) => s"vs($i).asInstanceOf[Column[$t]]" }
        .mkString(", ")
    })
  )
}

import #{name}Implicits.#{name}Shape
A few helpers we put into the Slick code generator:
val columnsCSV = entityColumns.map(_.name).mkString(", ")
val columnsCopy = entityColumns.map(c => colWithType(c)+" = "+c.name).mkString(", ")
val columnNames = entityColumns.map(_.name.toString)

def colWithType(c: Column) = s"${c.name}: ${c.exposedType}"

def colWithTypeAndDefault(c: Column) =
  colWithType(c) + colDefault(c).map(" = "+_).getOrElse("")

def indentN(n: Int, code: String): String =
  code.split("\n").mkString("\n" + List.fill(n)(" ").mkString(""))
I know this may be a bit troublesome to replicate, especially if you are new to Scala. I hope to find the time to get it into the official Slick code generator at some point.
Schema.org is a markup vocabulary (for the web) that defines a number of types in terms of properties (no methods). I am currently trying to model parts of that schema in Scala as internal model classes to be used in conjunction with a document-oriented database (MongoDB) and a web framework.
As can be seen in the definition of LocalBusiness, schema.org uses multiple inheritance to also include properties from the "Place" type. So my question is: How would you model such a schema in Scala?
I have come up with two solutions so far. The first one uses regular classes to model a single-inheritance tree and uses traits to mix in the additional properties.
trait ThingA {
  var name: String = ""
  var url: String = ""
}

trait OrganizationA {
  var email: String = ""
}

trait PlaceA {
  var x: String = ""
  var y: String = ""
}

trait LocalBusinessA {
  var priceRange: String = ""
}

class OrganizationClassA extends ThingA with OrganizationA {}

class LocalBusinessClassA extends OrganizationClassA with PlaceA with LocalBusinessA {}
The second version tries to use case classes. However, since case class inheritance is deprecated, I cannot model the main hierarchy so easily.
trait ThingB {
  val name: String
}

trait OrganizationB {
  val email: String
}

trait PlaceB {
  val x: String
  val y: String
}

trait LocalBusinessB {
  val priceRange: String
}

case class OrganizationClassB(name: String, email: String) extends ThingB with OrganizationB

case class LocalBusinessClassB(name: String, email: String, x: String, y: String, priceRange: String)
  extends ThingB with OrganizationB with PlaceB with LocalBusinessB
Is there a better way to model this? I could use composition similar to
case class LocalBusinessClassC(val thing:ThingClass, val place: PlaceClass, ...)
but then of course, LocalBusiness cannot be used when a "Place" is expected, for example when I try to render something on Google Maps.
What works best for you depends greatly on how you want to map your objects to the underlying datastore.
Given the need for multiple inheritance, an approach that might be worth considering would be to just use traits. This gives you multiple inheritance with the least amount of code duplication or boilerplate.
trait Thing {
  val name: String               // required
  val url: Option[String] = None // reasonable default
}

trait Organization extends Thing {
  val email: Option[String] = None
}

trait Place extends Thing {
  val x: String
  val y: String
}

trait LocalBusiness extends Organization with Place {
  val priceRange: String
}
Note that Organization extends Thing, as does Place, just as in schema.org.
To instantiate them, you create anonymous inner classes that specify the values of all attributes.
object UseIt extends App {
  val home = new Place {
    val name = "Home"
    val x = "-86.586104"
    val y = "34.730369"
  }

  val oz = new Place {
    val name = "Oz"
    val x = "151.206890"
    val y = "-33.873651"
  }

  val paulis = new LocalBusiness {
    val name = "Pauli's"
    override val url = Some("http://www.paulisbarandgrill.com/")
    val x = "-86.713660"
    val y = "34.755092"
    val priceRange = "$$$"
  }
}
If any fields have a reasonable default value, you can specify the default value in the trait.
I could have left fields without a value as empty strings, but it makes more sense to make optional fields of type Option[String], to better indicate that their value is not set. You liked using Option, so I'm using Option.
The downside of this approach is that the compiler generates an anonymous inner class every place you instantiate one of the traits. This could give you an explosion of .class files. More importantly, though, it means that different instances of the same trait will have different types.
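For instance, with the values above:

home.getClass == oz.getClass // false: each instantiation creates its own anonymous subclass of Place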
Edit:
As for how you would use this to load objects from the database, that depends greatly on how you access your database. If you use an object mapper, you'll want to structure your model objects in the way the mapper expects them to be structured. If this sort of trick works with your object mapper, I'll be surprised.
If you're writing your own data access layer, then you can simply use a DAO or repository pattern for data access, putting the logic to build the anonymous inner classes in there.
This is just one way to structure these objects. It's not even the best way, but it demonstrates the point.
trait Database {
  // treats objects as simple key/value pairs
  def findObject(id: String): Option[Map[String, String]]
}

class ThingRepo(db: Database) {
  def findThing(id: String): Option[Thing] = {
    // Note that in this way, malformed objects (i.e. missing name) simply
    // return None. Logging or other responses for malformed objects is left
    // as an exercise :-)
    for {
      fields <- db.findObject(id)     // load object from database
      thingName <- fields.get("name") // extract required field; renamed so it
                                      // doesn't shadow the trait's `name` member
    } yield {
      new Thing {
        val name = thingName
        override val url = fields.get("url")
      }
    }
  }
}
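Hypothetical usage, given some Database implementation bound to db:

val repo = new ThingRepo(db)
repo.findThing("thing-42").foreach(t => println(s"${t.name} ${t.url}"))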
There's a bit more to it than that (how you identify objects, how you store them in the database, how you wire up repository, how you'll handle polymorphic queries, etc.). But this should be a good start.