Slick generic AND driver agnostic - scala

Basically what I want to achieve is a combination of:
Slick 3.0.0 database agnosticism
and
Slick 3 reusable generic repository
I tried a lot, actually, but I can't get this to work at all.
abstract class BaseModel[T <: slick.lifted.AbstractTable[_]](query: TableQuery[T], val driver: JdbcProfile, val dbTableName: String)
{
lazy val all: TableQuery[T] = TableQuery[T]
import driver.api._
def createTable = all.schema.create
def dropTable = all.schema.drop
abstract class BaseTable[B](val tag: Tag) extends Table[B](tag, dbTableName)
{
def id = column[Long]("id", O.PrimaryKey, O.AutoInc)
}
}
Now here we have a problem already:
def createTable = all.schema.create (and the same for dropTable) -> schema cannot be resolved here, even though I import the driver's API just above.
But an even bigger problem comes up when I subclass this:
Here is the code
class NodeModel(driver: JdbcProfile, dbTableName: String) extends BaseModel[NodeTable](TableQuery[NodeTable], driver, dbTableName) {
val dbDriver = driver
import dbDriver.api._
class NodeTable(tag: Tag) extends BaseTable[Node](tag)
{
override def * = id.? <> (Node, Node.unapply)
}
//lazy val all: TableQuery[NodeTable] = TableQuery[NodeTable]
def createTable: DBIO[Unit] = all.schema.create
def dropTable: DBIO[Unit] = all.schema.drop
def insert(node: Node) = all += node
}
This obviously won't compile because I cannot pass NodeTable as T, but it gives an idea of what I want to achieve.
Do you have any idea how to solve this? I also tried with companion objects, moving the BaseTable out of the BaseModel and trying to load a simpleDriver... but it looks like that functionality was removed from Slick in a recent version :(

Database agnostic and Code is highly reusable
I am using Slick with the Play framework, and this is how I achieved a database-agnostic, generic repository.
Note that this work is inspired by Active Slick.
I want basic CRUD operations like the ones below to be defined on my case class: I should be able to count, create, update and delete. I want to write the CRUD code just once and reuse it forever.
Here is a snippet that demonstrates this:
case class Dog(name: String, id: Option[Long] = None)
Dog("some_dog").save()
Dog("some_dog").insert()
Dog("some_dog", Some(1)).delete()
CrudActions.scala
import slick.backend.DatabaseConfig
import slick.driver.JdbcProfile
import scala.concurrent.ExecutionContext
trait CrudActions {
val dbConfig: DatabaseConfig[JdbcProfile]
import dbConfig.driver.api._
type Model
def count: DBIO[Int]
def save(model: Model)(implicit ec: ExecutionContext): DBIO[Model]
def update(model: Model)(implicit ec: ExecutionContext): DBIO[Model]
def delete(model: Model)(implicit ec: ExecutionContext): DBIO[Int]
def fetchAll(fetchSize: Int = 100)(implicit ec: ExecutionContext): StreamingDBIO[Seq[Model], Model]
}
Now let's bring our Entity into the picture. Note that an Entity is nothing but our case class.
An Entity is the case class on which we perform CRUD operations. To locate an entity, let's also have an Id in place. The Id is important for locating and operating on an entity or record in the database; it also uniquely identifies the entity.
EntityActionsLike.scala
import slick.backend.DatabaseConfig
import slick.driver.JdbcProfile
import scala.concurrent.ExecutionContext
trait EntityActionsLike extends CrudActions {
val dbConfig: DatabaseConfig[JdbcProfile]
import dbConfig.driver.api._
type Entity
type Id
type Model = Entity
def insert(entity: Entity)(implicit ec: ExecutionContext): DBIO[Id]
def deleteById(id: Id)(implicit ec: ExecutionContext): DBIO[Int]
def findById(id: Id)(implicit ec: ExecutionContext): DBIO[Entity]
def findOptionById(id: Id)(implicit ec: ExecutionContext): DBIO[Option[Entity]]
}
Now let's implement these methods. To perform the operations we need a Table and a TableQuery; let's say we have table and tableQuery. The good thing about traits is that we can declare a contract and leave the implementation details to subclasses or subtypes.
EntityActions.scala
import slick.ast.BaseTypedType
import slick.backend.DatabaseConfig
import slick.driver.JdbcProfile
import scala.concurrent.ExecutionContext
trait EntityActions extends EntityActionsLike {
val dbConfig: DatabaseConfig[JdbcProfile]
import dbConfig.driver.api._
type EntityTable <: Table[Entity]
def tableQuery: TableQuery[EntityTable]
def $id(table: EntityTable): Rep[Id]
def modelIdContract: ModelIdContract[Entity,Id]
override def count: DBIO[Int] = tableQuery.size.result
override def insert(entity: Entity)(implicit ec: ExecutionContext): DBIO[Id] = {
tableQuery.returning(tableQuery.map($id(_))) += entity
}
override def deleteById(id: Id)(implicit ec: ExecutionContext): DBIO[Int] = {
filterById(id).delete
}
override def findById(id: Id)(implicit ec: ExecutionContext): DBIO[Entity] = {
filterById(id).result.head
}
override def findOptionById(id: Id)(implicit ec: ExecutionContext): DBIO[Option[Entity]] = {
filterById(id).result.headOption
}
override def save(model: Entity)(implicit ec: ExecutionContext): DBIO[Entity] = {
insert(model).flatMap { id =>
filterById(id).result.head
}.transactionally
}
override def update(model: Entity)(implicit ec: ExecutionContext): DBIO[Entity] = {
filterById(modelIdContract.get(model)).update(model).map { _ => model }.transactionally
}
override def delete(model: Entity)(implicit ec: ExecutionContext): DBIO[Int] = {
filterById(modelIdContract.get(model)).delete
}
override def fetchAll(fetchSize: Int)(implicit ec: ExecutionContext): StreamingDBIO[Seq[Entity], Entity] = {
tableQuery.result.transactionally.withStatementParameters(fetchSize = fetchSize)
}
def filterById(id: Id) = tableQuery.filter($id(_) === id)
def baseTypedType: BaseTypedType[Id]
protected implicit lazy val btt: BaseTypedType[Id] = baseTypedType
}
ActiveRecord.scala
import slick.dbio.DBIO
import scala.concurrent.ExecutionContext
abstract class ActiveRecord[R <: CrudActions](val repo: R) {
def model: repo.Model
def save()(implicit ec: ExecutionContext): DBIO[repo.Model] = repo.save(model)
def update()(implicit ec: ExecutionContext): DBIO[repo.Model] = repo.update(model)
def delete()(implicit ec: ExecutionContext): DBIO[Int] = repo.delete(model)
}
ModelContract.scala
case class ModelIdContract[A, B](get: A => B, set: (A, B) => A)
How to Use
Sample.scala
import com.google.inject.{Inject, Singleton}
import play.api.db.slick.DatabaseConfigProvider
import slick.ast.BaseTypedType
import slick.backend.DatabaseConfig
import slick.driver.JdbcProfile
import slick.{ActiveRecord, EntityActions, ModelIdContract}
case class Dog(name: String, id: Option[Long] = None)
@Singleton
class DogActiveRecord @Inject() (databaseConfigProvider: DatabaseConfigProvider) extends EntityActions {
override val dbConfig: DatabaseConfig[JdbcProfile] = databaseConfigProvider.get[JdbcProfile]
import dbConfig.driver.api._
override def tableQuery = TableQuery(new Dogs(_))
override def $id(table: Dogs): Rep[Id] = table.id
override def modelIdContract: ModelIdContract[Dog, Id] = ModelIdContract(dog => dog.id.get, (dog, id) => dog.copy(id = Some(id)))
override def baseTypedType: BaseTypedType[Id] = implicitly[BaseTypedType[Id]]
override type Entity = Dog
override type Id = Long
override type EntityTable = Dogs
class Dogs(tag: Tag) extends Table[Dog](tag, "DogsTable") {
def name = column[String]("name")
def id = column[Long]("id", O.PrimaryKey)
def * = (name, id.?) <> (Dog.tupled, Dog.unapply)
}
implicit class ActiveRecordImplicit(val model: Entity) extends ActiveRecord(this)
import scala.concurrent.ExecutionContext.Implicits.global
val result = Dog("some_dog").save()
val res2 = Dog("some_other_dog", Some(1)).delete()
val res3 = Dog("some_crazy_dog", Some(1)).update()
}
Now we can do operations on Dog directly like this
Dog("some_dog").save()
This implicit does the magic for us
implicit class ActiveRecordImplicit(val model: Entity) extends ActiveRecord(this)
You can also add schema creation and dropping logic in EntityActions:
tableQuery.schema.create
tableQuery.schema.drop
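For example, a minimal sketch (assuming the same dbConfig.driver.api._ import that EntityActions already has in scope) could expose them as actions:
// Sketch: possible additions inside EntityActions, not part of the original answer.
def createSchema: DBIO[Unit] = tableQuery.schema.create // CREATE TABLE for EntityTable
def dropSchema: DBIO[Unit] = tableQuery.schema.drop     // DROP TABLE for EntityTable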

Related

Scala Generic Repository Class For Reactive Mongo Repository (alpakka) - Needed Class Found T

I'm trying to create a generic class in Scala so I can create a repository for different collections without repeating myself.
The problem is that if I do it as a generic class (as in this example) I get a problem on this line:
val codecRegistry = fromRegistries(fromProviders(classOf[T]), DEFAULT_CODEC_REGISTRY)
Expected Class but Found [T]
But if I replace T with any concrete class (let's say User) throughout the code, it works.
This is my class:
package persistence.repository.impl
import akka.stream.Materializer
import akka.stream.alpakka.mongodb.scaladsl.{MongoSink, MongoSource}
import akka.stream.scaladsl.{Sink, Source}
import akka.{Done, NotUsed}
import com.mongodb.reactivestreams.client.MongoClients
import constants.MongoConstants._
import org.bson.codecs.configuration.CodecRegistries.{fromProviders, fromRegistries}
import org.mongodb.scala.MongoClient.DEFAULT_CODEC_REGISTRY
import org.mongodb.scala.bson.codecs.Macros._
import org.mongodb.scala.model.Filters
import persistence.entity.{ProductItem}
import persistence.repository.Repository
import scala.concurrent.{ExecutionContext, Future}
class UserMongoDatabase[T](implicit materializer: Materializer,
executionContext: ExecutionContext)
extends Repository[T] {
val codecRegistry = fromRegistries(fromProviders(classOf[T]), DEFAULT_CODEC_REGISTRY)
val client = MongoClients.create(HOST)
val db = client.getDatabase(DATABASE)
val requestedCollection = db
.getCollection(USER_COLLECTION, classOf[T])
.withCodecRegistry(codecRegistry)
val source: Source[T, NotUsed] =
MongoSource(requestedCollection.find(classOf[T]))
val rows: Future[Seq[T]] = source.runWith(Sink.seq)
override def getAll: Future[Seq[T]] = rows
override def getById(id: AnyVal): Future[Option[T]] = rows.map {
list =>
list.filter {
user => user.asInstanceOf[ {def _id: AnyVal}]._id == id
}.headOption
}
override def getByEmail(email: String): Future[Option[T]] = rows.map {
list =>
list.filter {
user => user.asInstanceOf[ {def email: AnyVal}].email == email
}.headOption
}
override def save(obj: T): Future[T] = {
val source = Source.single(obj)
source.runWith(MongoSink.insertOne(requestedCollection)).map(_ => obj)
}
override def delete(id: AnyVal): Future[Done] = {
val source = Source.single(id).map(i => Filters.eq("_id", id))
source.runWith(MongoSink.deleteOne(requestedCollection))
}
}
This is my repository trait:
package persistence.repository
import akka.Done
import scala.concurrent.Future
trait Repository[T]{
def getAll: Future[Seq[T]]
def getById(id: AnyVal): Future[Option[T]]
def save(user: T): Future[T]
def delete(id: AnyVal): Future[Done]
def getByEmail(email:String): Future[Option[T]]
}
As said in the comments, this is the perfect example of a use case for ClassTag in Scala. It allows you to retain the actual class of a generic/parameterized type at runtime.
class DefaultMongoDatabase[T](implicit ..., ct: ClassTag[T])
extends Repository[T] {
val codecRegistry = fromRegistries(fromProviders(ct.runtimeClass), ...)
(You can move the ClassTag logic into the trait if you want.)
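As a standalone illustration of the mechanism (hypothetical names, not taken from the question's code), a ClassTag lets a generic class recover the erased type's Class object at runtime:
import scala.reflect.ClassTag

// The compiler fills in ClassTag[T] at the call site, so the runtime Class survives erasure.
class RuntimeClassHolder[T](implicit ct: ClassTag[T]) {
  val runtimeClass: Class[_] = ct.runtimeClass
}

// new RuntimeClassHolder[String]().runtimeClass == classOf[String]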

How to extract slick entities from play framework dao singleton

I've been creating a new project using the Play framework, Slick and PostgreSQL, and I don't understand how I can extract entities from a DAO. Most tutorials show examples where the entity (a class extending Table) is a class within the DAO singleton.
I've actually done the same:
import javax.inject.{Inject, Singleton}
import org.joda.time.DateTime
import play.api.db.slick.{DatabaseConfigProvider, HasDatabaseConfigProvider}
import slick.jdbc.JdbcProfile
import scala.concurrent.{ExecutionContext, Future}
@Singleton
class CardDao @Inject()(protected val dbConfigProvider: DatabaseConfigProvider)
(implicit executionContext: ExecutionContext) extends HasDatabaseConfigProvider[JdbcProfile] {
import dbConfig.profile.api._
val Cards = TableQuery[CardsTable]
private[dao] class CardsTable(tag: Tag) extends Table[Card](tag, "card") {
def id = column[Int]("id", O.PrimaryKey, O.AutoInc)
def name = column[String]("name", O.Unique)
def value = column[BigDecimal]("value")
def currencyId = column[Int]("currency_id")
def created = column[DateTime]("created")
def active = column[Boolean]("active")
override def * = (id, name, value, currencyId, created) <> (Card.tupled, Card.unapply)
}
def all(): Future[Seq[Card]] = db.run(Cards.result)
}
That's OK. I don't mind having an entity here, but when I create another DAO I cannot access my TableQuery (for joining tables or creating foreign keys), because it's just a field in the singleton.
I was trying to extract CardsTable into a separate class with a companion object containing the TableQuery, but it turns out that the column method, O, and foreignKey all come from the HasDatabaseConfigProvider trait (via importing dbConfig.profile.api._), so I'm not sure, but I guess I have to pass dbConfig implicitly to the class.
How would you do that? It's just the beginning of the project, so I really don't want to make rookie mistakes at this point.
Thanks to Łukasz I've found a way to do this:
trait Tables {
this: HasDatabaseConfigProvider[JdbcProfile] =>
import dbConfig.profile.api._
val Cards = TableQuery[CardsTable]
val FaceValues = TableQuery[FaceValuesTable]
val Currencies = TableQuery[CurrenciesTable]
val Cryptocurrencies = TableQuery[CryptocurrenciesTable]
val Wallets = TableQuery[WalletsTable]
val Transactions = TableQuery[TransactionsTable]
class CryptocurrenciesTable(tag: Tag) extends Table[Cryptocurrency](tag, "cryptocurrency") with ActiveAndCreated[Cryptocurrency] {
def id = column[Int]("id", O.PrimaryKey, O.AutoInc)
def name = column[String]("name", O.Unique)
def cryptocurrencyCode = column[String]("cryptocurrency_code", O.Unique)
def blockchainExplorerUrl = column[String]("blockchain_explorer_url")
def iconSvg = column[String]("icon_svg")
override def * = (id, name, cryptocurrencyCode, blockchainExplorerUrl, iconSvg, created) <>
(Cryptocurrency.tupled, Cryptocurrency.unapply)
}
class FaceValuesTable(tag: Tag) extends Table[FaceValue](tag, "face_value") with ActiveAndCreated[FaceValue] {
def id = column[Int]("id", O.PrimaryKey, O.AutoInc)
def value = column[BigDecimal]("value")
def currencyId = column[Int]("currency_id")
def cardId = column[Int]("card_id")
def card = foreignKey("card_fk", cardId, Cards)(_.id)
def currency = foreignKey("currency_fk", currencyId, Currencies)(_.id)
override def * = (id, value, currencyId, cardId) <> (FaceValue.tupled, FaceValue.unapply)
}
...
}
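The ActiveAndCreated mixin referenced above is not shown; a minimal sketch of what it might look like, declared inside the same Tables trait (assuming it only contributes the shared active and created columns, and that an implicit column mapping for DateTime is in scope):
// Hypothetical sketch of the ActiveAndCreated mixin used by the tables above.
trait ActiveAndCreated[T] { this: Table[T] =>
  def active = column[Boolean]("active")
  def created = column[DateTime]("created") // assumes a joda DateTime column mapper is available
}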
Simple Dao trait:
trait Dao extends HasDatabaseConfigProvider[JdbcProfile] with Tables
And now all DAOs are very simple:
#Singleton
class WalletDao #Inject()(protected val dbConfigProvider: DatabaseConfigProvider)
(implicit executionContext: ExecutionContext) extends Dao {
import dbConfig.profile.api._
def find(walletAddress: WalletAddress): Future[Option[Wallet]] = {
val query = for {
wallet <- Wallets
cryptocurrency <- Cryptocurrencies if cryptocurrency.id === wallet.id &&
cryptocurrency.cryptocurrencyCode === walletAddress.cryptocurrency
} yield wallet
db.run(query.result.headOption)
}
def find(walletId: Int): Future[Option[Wallet]] = db.run(Wallets.filter(_.id === walletId).result.headOption)
def save(wallet: Wallet): Future[Int] = db.run(Wallets += wallet)
}
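Because every DAO mixes in the same Tables trait, the table queries are shared, so joins across tables now work from any DAO. A small sketch (assumed imports and result types, mirroring WalletDao above) that navigates the foreign key declared in FaceValuesTable:
@Singleton
class FaceValueDao @Inject()(protected val dbConfigProvider: DatabaseConfigProvider)
                            (implicit executionContext: ExecutionContext) extends Dao {
  import dbConfig.profile.api._

  // fv.card is the ForeignKeyQuery defined in FaceValuesTable, usable directly in a join.
  def valuesWithCards(): Future[Seq[(FaceValue, Card)]] = db.run(
    (for {
      fv   <- FaceValues
      card <- fv.card
    } yield (fv, card)).result
  )
}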

Play ReactiveMongo JSON Serialization Generics

I have a problem using Play 2.5.x and ReactiveMongo Play. I am trying to create a generic repository, and I have serious problems when serializing and deserializing objects to the database. It always gives me the following error: No Json deserializer found for type E. Try to implement an implicit Reads or Format for this type.
Here is my generic code:
package repositories.mongo
import javax.inject.Inject
import core.Entity
import play.modules.reactivemongo.ReactiveMongoApi
import reactivemongo.api.QueryOpts
import repositories.Repository
import scala.collection.Seq
import scala.concurrent.{ExecutionContext, Future}
import reactivemongo.play.json._
import play.api.libs.json._
import reactivemongo.play.json.collection.JSONCollection
class MongoRepository[K, E <: Entity[K]] @Inject()(reactiveMongo: ReactiveMongoApi) extends Repository[K, E] {
protected def collection(implicit ec: ExecutionContext) = reactiveMongo.database.map(_.collection[JSONCollection](this.getCollectionName))
protected def getCollectionName: String = {
"users"
}
def getAll(count: Int, skip: Int)(implicit ec: ExecutionContext): Future[Seq[E]] = {
this.collection.flatMap(_.find(Json.obj())
.options(QueryOpts(skipN = skip))
.cursor[E]().collect[Seq[E]](count))
}
def getFilter(count: Int, skip: Int, f: E => Boolean)(implicit ec: ExecutionContext): Future[Seq[E]] = {
this.collection.flatMap(_.find(f)
.options(QueryOpts(skipN = skip))
.cursor[E]().collect[Seq[E]](count))
}
def getById(id: K)(implicit ec: ExecutionContext): Future[Option[E]] = {
this.collection.flatMap(_.find(Json.obj("_id" -> id.toString)).one[E])
}
def create(entity: E)(implicit ec: ExecutionContext): Future[Option[E]] = {
this.collection.flatMap(_.insert(entity)).flatMap(_ => Future.successful(Option(entity)))
}
def updateById(id: K, entity: E)(implicit ec: ExecutionContext): Future[Option[E]] = {
this.collection.flatMap(_.findAndUpdate(Json.obj("_id" -> id.toString), entity)
.map(_.result[E]))
}
def deleteById(id: K)(implicit ec: ExecutionContext): Future[Option[E]] = {
this.collection.flatMap(_.findAndRemove(Json.obj("_id" -> id.toString))
.map(_.result[E]))
}
}
Here is my concrete class that includes the json format serializer.
package core
import play.api.libs.json.Json
trait Entity[K] {
val id: K
}
case class User(
id: String,
name: String,
email: String
) extends Entity[String] {
}
object User {
implicit val jsonFormat = Json.format[User]
}
When you create your MongoRepository you need to say that E needs a JSON Format. You can do it like this:
class MongoRepository[K, E <: Entity[K] : Format]
// this is the same as
class MongoRepository[K, E <: Entity[K]](implicit formatter: Format[E])
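With the context bound in place, the compiler looks up the implicit Format at the point where the repository is instantiated, so User.jsonFormat is picked up automatically. A minimal sketch (assuming a ReactiveMongoApi instance named reactiveMongoApi is available):
// The context bound desugars to an extra implicit parameter list, so this only
// compiles because an implicit Format[User] (User.jsonFormat) is in scope.
val userRepo = new MongoRepository[String, User](reactiveMongoApi)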

Scala/Slick: using inheritance and mixins to reduce boilerplate

I'm a Scala/Play/Slick newbie, so please don't be too mad if I ask a dumb question.
Here goes the question.
I have several Slick table definitions; here is one of them:
import javax.inject.Inject
import play.api.db.slick.{DatabaseConfigProvider, HasDatabaseConfigProvider}
import play.api.libs.concurrent.Execution.Implicits.defaultContext
import play.db.NamedDatabase
import slick.driver.JdbcProfile
import scala.concurrent.Future
case class User(id: Int, login: String, password: String) extends Identifiable
class UserDAO @Inject()(@NamedDatabase protected val dbConfigProvider: DatabaseConfigProvider) extends HasDatabaseConfigProvider[JdbcProfile] {
import driver.api._
private val users = TableQuery[UsersTable]
def all(): Future[Seq[User]] = db.run(users.result)
def insert(dog: User): Future[Unit] = db.run(users += dog).map { _ => () }
def delete(id: Int): Future[Int] = db.run(users.filter(_.id === id).delete)
private class UsersTable(tag: Tag) extends Table[User](tag, "USER") {
def id = column[Int]("id", O.PrimaryKey, O.AutoInc)
def email = column[String]("email")
def password = column[String]("password")
def * = (id, email, password) <> (User.tupled, User.unapply)
}
}
Imagine I have many more tables that each define def id = column[Int]("id", O.PrimaryKey, O.AutoInc). To eliminate this duplication I would need to write something like:
trait Identifiable {
this: Table[_] =>
def id = column[String]("id", O.PrimaryKey)
}
But how do I import Table here in a database-agnostic manner? Moreover, there is room for further enhancement: all DAO objects providing access to Identifiable tables could inherit from a common abstract class containing the all, insert, find and delete methods. Something like this (I was unable to get it to compile):
abstract class BaseDAO[E <: Identifiable] extends DAO[E] with HasDatabaseConfigProvider[JdbcProfile] {
import driver.api._
private val entities = TableQuery[BaseTable]
def all(): Future[Seq[E]] = db.run(entities.result)
def insert(entity: E): Future[Unit] = db.run(entities += entity).map { _ => () }
def delete(entity: E): Future[Int] = db.run(entities.filter(_.id === entity.id).delete)
def find(id: Int): Future[E] = db.run(entities.filter(_.id === entities.id))
trait BaseTable { this: Table[_] =>
def id = column[String]("id", O.PrimaryKey, O.AutoInc)
}
}
Could somebody please point me to my mistakes? Thanks.
Database agnostic and Code is highly reusable
See the answer with the same title above: the same CrudActions / EntityActionsLike / EntityActions / ActiveRecord / ModelIdContract code applies here unchanged.

Scala type inference working with Slick Table

I have models like these (simplified):
case class User(id:Int,name:String)
case class Address(id:Int,name:String)
...
Slick (2.1.0 version) table mapping:
class Users(_tableTag: Tag) extends Table[User](_tableTag, "users") with WithId[Users, User] {
val id: Column[Int] = column[Int]("id", O.AutoInc, O.PrimaryKey)
...
}
trait WithId[T, R] {
this: Table[R] =>
def id: Column[Int]
}
By mixing in the WithId trait I want to implement generic DAO methods for different tables that have a column id: Column[Int] (I want the findById method to work with both the User and Address table mappings):
trait GenericSlickDAO[T <: WithId[T, R], R] {
def db: Database
def findById(id: Int)(implicit stk: SlickTableQuery[T]): Option[R] = db.withSession { implicit session =>
stk.tableQuery.filter(_.id === id).list.headOption
}
}
trait SlickTableQuery[T] {
def tableQuery: TableQuery[T]
}
object SlickTableQuery {
implicit val usersQ = new SlickTableQuery[Users] {
val tableQuery: TableQuery[Users] = Users
}
}
The problem is that findById doesn't compile:
Error:(13, 45) type mismatch;
found : Option[T#TableElementType] required: Option[R]
stk.tableQuery.filter(_.id === id).list.headOption
As I see it T is of type WithId[T, R] and at the same time is of type Table[R]. Slick implements the Table type such that if X=Table[Y] then X#TableElementType=Y.
So in my case T#TableElementType=R and Option[T#TableElementType] should be inferred as Option[R] but it isn't. Where am I wrong?
Your assumption about WithId[T, R] being of type Table[R] is wrong. The self-type annotation in WithId[T, R] just requires a Table[R] to be mixed in, but that doesn't mean that WithId[T, R] is a Table[R].
I think you are confusing the declaration of WithId with instances of WithId, which eventually do need to be instances of a Table.
The upper type bound constraint in the GenericSlickDAO trait also doesn't guarantee that WithId is a Table, since any type is a subtype of itself.
See this question for a more elaborate explanation about the differences between self-types and subtypes.
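A compact illustration of the difference, using hypothetical stand-in types rather than Slick's own: a self-type only demands that the trait be mixed into such a type; it does not make the trait itself a subtype of it.
trait Table2[R]                       // stand-in for Slick's Table[R]
trait WithId2[R] { this: Table2[R] => // self-type: may only be mixed into a Table2[R]
  def id: Int
}

// val bad: Table2[Int] = new WithId2[Int] { def id = 1 }  // does not compile: WithId2 is not a Table2
val ok: Table2[Int] = new Table2[Int] with WithId2[Int] { def id = 1 } // an instance must mix in both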
I'm using play-slick and I tried to do exactly what you did, with a trait and a self-type, without success.
But I succeeded with the following:
import modelsunscanned.TableWithId
import scala.slick.jdbc.JdbcBackend
import scala.slick.lifted.TableQuery
import play.api.db.slick.Config.driver.simple._
/**
* @author Sebastien Lorber (lorber.sebastien@gmail.com)
*/
package object models {
private[models] val Users = TableQuery(new UserTable(_))
private[models] val Profiles = TableQuery(new ProfileTable(_))
private[models] val Companies = TableQuery(new CompanyTable(_))
private[models] val Contacts = TableQuery(new ContactTable(_))
trait ModelWithId {
val id: String
}
trait BaseRepository[T <: ModelWithId] {
def tableQuery: TableQuery[TableWithId[T]]
private val FindByIdQuery = Compiled { id: Column[String] =>
tableQuery.filter(_.id === id)
}
def insert(t: T)(implicit session: JdbcBackend#Session) = {
tableQuery.insert(t)
}
def getById(id: String)(implicit session: JdbcBackend#Session): T = FindByIdQuery(id).run.headOption
.getOrElse(throw new RuntimeException(s"Could not find entity with id=$id"))
def findById(id: String)(implicit session: JdbcBackend#Session): Option[T] = FindByIdQuery(id).run.headOption
def update(t: T)(implicit session: JdbcBackend#Session): Unit = {
val nbUpdated = tableQuery.filter(_.id === t.id).update(t)
require(nbUpdated == 1,s"Exactly one should have been updated, not $nbUpdated")
}
def delete(t: T)(implicit session: JdbcBackend#Session) = {
val nbDeleted = tableQuery.filter(_.id === t.id).delete
require(nbDeleted == 1,s"Exactly one should have been deleted, not $nbDeleted")
}
def getAll(implicit session: JdbcBackend#Session): List[T] = tableQuery.list
}
}
// play-slick bug, see https://github.com/playframework/play-slick/issues/227
package modelsunscanned {
abstract class TableWithId[T](tableTag: Tag,tableName: String) extends Table[T](tableTag,tableName) {
def id: Column[String]
}
}
Here is an example usage:
object CompanyRepository extends BaseRepository[Company] {
// Don't know yet how to avoid that cast :(
def tableQuery = Companies.asInstanceOf[TableQuery[TableWithId[Company]]]
// Other methods here
...
}
case class Company(
id: String = java.util.UUID.randomUUID().toString,
name: String,
mainContactId: String,
logoUrl: Option[String],
activityDescription: Option[String],
context: Option[String],
employeesCount: Option[Int]
) extends ModelWithId
class CompanyTable(tag: Tag) extends TableWithId[Company](tag,"COMPANY") {
override def id = column[String]("id", O.PrimaryKey)
def name = column[String]("name", O.NotNull)
def mainContactId = column[String]("main_contact_id", O.NotNull)
def logoUrl = column[Option[String]]("logo_url", O.Nullable)
def activityDescription = column[Option[String]]("description", O.Nullable)
def context = column[Option[String]]("context", O.Nullable)
def employeesCount = column[Option[Int]]("employees_count", O.Nullable)
//
def * = (id, name, mainContactId,logoUrl, activityDescription, context, employeesCount) <> (Company.tupled,Company.unapply)
//
def name_index = index("idx_name", name, unique = true)
}
Note that active-slick also uses something similar.
This helped me out a lot. It's a pretty simple example of a generic DAO: https://gist.github.com/lshoo/9785645
package slicks.docs.dao
import scala.slick.driver.PostgresDriver.simple._
import scala.slick.driver._
trait Profile {
val profile: JdbcProfile
}
trait CrudComponent {
this: Profile =>
abstract class Crud[T <: Table[E] with IdentifiableTable[PK], E <: Entity[PK], PK: BaseColumnType](implicit session: Session) {
val query: TableQuery[T]
def count: Int = {
query.length.run
}
def findAll: List[E] = {
query.list()
}
def queryById(id: PK) = query.filter(_.id === id)
def findOne(id: PK): Option[E] = queryById(id).firstOption
def add(m: E): PK = (query returning query.map(_.id)) += m
def withId(model: E, id: PK): E
def extractId(m: E): Option[PK] = m.id
def save(m: E): E = extractId(m) match {
case Some(id) => {
queryById(id).update(m)
m
}
case None => withId(m, add(m))
}
def saveAll(ms: E*): Option[Int] = query ++= ms
def deleteById(id: PK): Int = queryById(id).delete
def delete(m: E): Int = extractId(m) match {
case Some(id) => deleteById(id)
case None => 0
}
}
}
trait Entity[PK] {
def id: Option[PK]
}
trait IdentifiableTable[I] {
def id: Column[I]
}
package slicks.docs
import slicks.docs.dao.{Entity, IdentifiableTable, CrudComponent, Profile}
case class User(id: Option[Long], first: String, last: String) extends Entity[Long]
trait UserComponent extends CrudComponent {
this: Profile =>
import profile.simple._
class UsersTable(tag: Tag) extends Table[User](tag, "users") with IdentifiableTable[Long] {
override def id = column[Long]("id", O.PrimaryKey, O.AutoInc)
def first = column[String]("first")
def last = column[String]("last")
def * = (id.?, first, last) <> (User.tupled, User.unapply)
}
class UserRepository(implicit session: Session) extends Crud[UsersTable, User, Long] {
override def query = TableQuery[UsersTable]
override def withId(user: User, id: Long): User = user.copy(id = Option(id))
}
}
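To actually use the component, the Profile cake still has to be wired to a concrete driver. A minimal sketch (assumed object name and H2 driver, Slick 2.x style like the gist above):
// Hypothetical wiring of the cake above to a concrete Slick 2.x profile.
object UserDAL extends UserComponent with Profile {
  override val profile: JdbcProfile = scala.slick.driver.H2Driver
}

// Usage, given an implicit Session in scope:
// implicit val session: Session = scala.slick.driver.H2Driver.simple.Database
//   .forURL("jdbc:h2:mem:test", driver = "org.h2.Driver").createSession()
// val repo = new UserDAL.UserRepository()
// val saved = repo.save(User(None, "Ada", "Lovelace"))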