I'm trying to wrap my head around data access with Slick 3.0. After consulting various GitHub examples, I've come up with the following design.
A singleton Slick class where the DataSource and driver instances are injected:
class Slick(dataSource: DataSource, val driver: JdbcDriver) {
  val db = driver.api.Database.forDataSource(dataSource)
}
A trait per DB table where the mappings are defined. The trait is mixed into the upper layer where the queries are constructed:
trait RecipeTable {
  protected val slick: Slick

  // the ugly import that has to be added wherever the Slick API is used
  import slick.driver.api._

  type RecipeRow = (Option[Long], String)

  class RecipeTable(tag: Tag) extends Table[RecipeRow](tag, "recipe") {
    def id = column[Option[Long]]("id", O.PrimaryKey, O.AutoInc)
    def name = column[String]("name")
    def * = (id, name)
  }

  protected val recipes = TableQuery[RecipeTable]
}
Now there's an obvious drawback: in every *Table trait, and also everywhere such a trait is mixed in, I need to duplicate import slick.driver.api._ to bring Slick's types into scope.
This is something I'd like to avoid. Ideally the import would be defined only once and reused in downstream components.
Could you please suggest a design that addresses this duplication?
I was mainly inspired by this example; however, the imports are duplicated there as well.
That "ugly" import is actually a good thing about slick's design. But your way of slick usage can be improved as following,
Create a trait which will provide JdbcDriver
package demo.slick.dbl

import slick.driver.JdbcDriver

trait SlickDriverComponent {
  val driver: JdbcDriver
}
trait SlickDBComponent extends SlickDriverComponent {
  val db: driver.api.Database
}
Now define your DAO traits as traits that depend on this component via a self-type:
package demo.slick.dao

import demo.slick.dbl.SlickDBComponent

trait RecipeDAO { self: SlickDBComponent =>
  import driver.api._

  type RecipeRow = (Option[Long], String)

  class RecipeTable(tag: Tag) extends Table[RecipeRow](tag, "recipe") {
    def id = column[Option[Long]]("id", O.PrimaryKey, O.AutoInc)
    def name = column[String]("name")
    def * = (id, name)
  }

  val recipes = TableQuery[RecipeTable]

  def get5Future = db.run(recipes.take(5).result)
}
When it comes to actually connecting to the DB and doing things:
package demo.slick.dbl

trait MySqlDriverProvider extends SlickDriverComponent {
  val driver = slick.driver.MySQLDriver
}

object MySqlDBConnection extends MySqlDriverProvider {
  val connection = driver.api.Database.forConfig("mysql")
}

trait MySqlDBProvider extends SlickDBComponent {
  val driver = slick.driver.MySQLDriver
  val db: driver.api.Database = MySqlDBConnection.connection
}

trait PostgresDriverProvider extends SlickDriverComponent {
  val driver = slick.driver.PostgresDriver
}

object PostgresDBConnection extends PostgresDriverProvider {
  val connection = driver.api.Database.forConfig("postgres")
}

trait PostgresDBProvider extends SlickDBComponent {
  val driver = slick.driver.PostgresDriver
  val db: driver.api.Database = PostgresDBConnection.connection
}
Now, finally, define your DAO objects as follows:
package demo.slick.dao

import demo.slick.dbl.{MySqlDBProvider, PostgresDBProvider}

object MySqlRecipeDAO extends RecipeDAO with MySqlDBProvider
object PostgresRecipeDAO extends RecipeDAO with PostgresDBProvider
Now you can use these as follows:
package demo.slick

import scala.util.{Failure, Success}
import scala.concurrent.ExecutionContext.Implicits.global

import demo.slick.dao.MySqlRecipeDAO

object DemoApp extends App {
  val recipesFuture = MySqlRecipeDAO.get5Future

  recipesFuture.onComplete {
    case Success(seq) => println("Success :: found :: " + seq)
    case Failure(ex)  => println("Failure :: failed :: " + ex.getMessage)
  }
}
Now, as we all know, different databases provide different sets of functionality, so the "things" available to you will depend on the driver being used.
The need for that "ugly" import in every trait is exactly what lets you write your DAO traits once and then use them with whatever database-specific driver implementation you want.
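For illustration, the same RecipeDAO could be backed by an in-memory H2 database in tests. The following is a hypothetical sketch (not part of the setup above; the "h2test" config key is made up) that mirrors the MySQL/Postgres providers:

package demo.slick.dbl

trait H2DriverProvider extends SlickDriverComponent {
  val driver = slick.driver.H2Driver
}

object H2DBConnection extends H2DriverProvider {
  // "h2test" is an assumed Typesafe config entry, analogous to "mysql" and "postgres"
  val connection = driver.api.Database.forConfig("h2test")
}

trait H2DBProvider extends SlickDBComponent {
  val driver = slick.driver.H2Driver
  val db: driver.api.Database = H2DBConnection.connection
}

// The DAO trait is reused unchanged:
// object H2RecipeDAO extends RecipeDAO with H2DBProvider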
I have a codebase of several applications with a shared database that use Slick for database access. Parts of the code, like common table mappings, live in a shared library. I want to be able to run two kinds of tests on those projects:
tests that use the real database
other tests that use an in-memory H2 database
I have tried several ways to organize my code to support this, but have always hit a wall at some point. My latest attempt uses the DatabaseConfig approach and passes a database definition (including database, profile and table definitions) around to the objects that perform database operations (let's call them services). This way I can, in theory, easily test a service by passing in the database definition I want to test it with. In practice, I get problems in the interaction between services because the types don't fit together properly.
I have created a simplified example that shows the kind of problems I get:
import slick.basic.DatabaseConfig
import slick.jdbc.JdbcProfile
import slick.lifted.Rep

import scala.concurrent.Future

trait HasProfile {
  val profile: JdbcProfile
}

trait HasId {
  val id: Rep[Long]
}

trait Table_Person {
  this: HasProfile =>

  case class PersonRow(personId: Long, lastName: String)

  import profile.api._

  class Person(tableTag: Tag) extends Table[PersonRow](tableTag, "person") with HasId {
    val personId = column[Long]("person_id", O.PrimaryKey, O.AutoInc)
    val lastName = column[String]("last_name")

    override val id = personId

    val * = (personId, lastName).mapTo[PersonRow]
  }

  lazy val Person = TableQuery[Person]
}

class DatabaseWrapper {
  private val databaseConfig: DatabaseConfig[JdbcProfile] = DatabaseConfig.forConfig("PersonDatabase")

  def db = databaseConfig.db

  object Schema extends HasProfile with Table_Person {
    override val profile = databaseConfig.profile
  }
}

class PersonService(val databaseWrapper: DatabaseWrapper) {
  import databaseWrapper.Schema._

  val loadService = new LoadService(databaseWrapper)

  def load(id: Long): Future[Option[PersonRow]] = loadService.load[PersonRow](Person, id)
}

class LoadService(val databaseWrapper: DatabaseWrapper) {
  import databaseWrapper.Schema.profile.api._

  def load[R](tableQuery: TableQuery[_ <: Table[R] with HasId], id: Long): Future[Option[R]] =
    databaseWrapper.db.run(tableQuery.filter(_.id === id).result.headOption)
}
This code gives me the following compile error:
Error:(51, 79) type mismatch;
found : slick.lifted.TableQuery[PersonService.this.databaseWrapper.Schema.Person]
required: PersonService.this.loadService.databaseWrapper.Schema.profile.api.TableQuery[_ <: PersonService.this.loadService.databaseWrapper.Schema.profile.api.Table[PersonService.this.databaseWrapper.Schema.PersonRow] with HasId]
(which expands to) slick.lifted.TableQuery[_ <: PersonService.this.loadService.databaseWrapper.Schema.profile.Table[PersonService.this.databaseWrapper.Schema.PersonRow] with HasId]
def load(id: Long): Future[Option[PersonRow]] = loadService.load[PersonRow](Person, id)
It seems the type checker does not realize that PersonService.this.loadService.databaseWrapper is the same as PersonService.this.databaseWrapper.
Is there a way around this? Does this approach even make sense, or are there any better approaches to structure the code?
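One direction that may sidestep the mismatch (a sketch only, reusing HasProfile and HasId from the question; untested against the real codebase) is to keep the generic loader inside the component that owns the profile, so every path-dependent type is rooted in one stable reference:

trait LoadSupport { this: HasProfile =>
  import profile.api._

  // db is passed in explicitly; its type comes from the same profile.api path
  // as TableQuery and Table, so no cross-path comparison is needed.
  def load[R](db: Database, tableQuery: TableQuery[_ <: Table[R] with HasId], id: Long): Future[Option[R]] =
    db.run(tableQuery.filter(_.id === id).result.headOption)
}

DatabaseWrapper's Schema would then mix in LoadSupport (object Schema extends HasProfile with Table_Person with LoadSupport), and PersonService would call databaseWrapper.Schema.load(databaseWrapper.db, Person, id). Since the query, the table and the API types all come from the same Schema path, the compiler no longer has to prove that two different wrapper paths are equal.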
I have the following class:
case class AucLog(timestamp: UUID, modelname: String, good: Int, list: List[Double])

class AucDatabase(override val connector: CassandraConnection)
  extends Database[AucDatabase](connector) {
  object users extends CMetrics with Connector
}

object AucDatabase extends AucDatabase(AucConnector.connector)

abstract class AucMetrics extends Table[AucMetrics, AucLog] {
  object id extends UUIDColumn with PartitionKey
  object name extends StringColumn
  object ud extends IntColumn
  object zob extends ListColumn[Double]
}
abstract class CMetrics extends AucMetrics with RootConnector {
  def store(metric: AucLog): Future[ResultSet] = {
    insert.value(_.id, metric.timestamp)
      .value(_.name, metric.modelname)
      .value(_.ud, metric.good)
      .value(_.zob, metric.list)
      .consistencyLevel_=(ConsistencyLevel.ONE)
      .future()
  }
}

DmpDatabase.create()
AucDatabase.create()

val pd = DmpDatabase.users.myselect()
val timeout = new Timeout(500000)
val result = Await.result(pd, timeout.duration)

// <--- this attempt to read from my database is working - no problemo --->
val todf = result.records.map { elem => elem.idcat }
val rdd = spark.sparkContext.parallelize(todf)
import spark.implicits._
rdd.toDF().show(100)

// I'm storing one line in my database to be sure that it is not empty when I'm reading it.
AucDatabase.users.store(new AucLog(UUIDs.timeBased(), "tyron", 0, List(0.1)))

val second = AucDatabase.users.myselect()
val resultmetric = Await.result(second, timeout.duration)

// ---> this line causes the exception
val r = spark.sparkContext.parallelize(resultmetric.records).toDF().show()
What I do not understand is that I'm doing basically the same thing with both databases, yet one of them throws the following error: UnsupportedOperationException: No encoder found for com.outworkers.phantom.dsl.UUID.
Thank you.
First of all, the store method is macro-generated, so you don't need to write one yourself. The problem you are having is likely not related to phantom at all, but to some kind of Spark construct.
The phantom UUID is nothing more than a type alias for java.util.UUID, so I'm quite surprised there is no straight-up encoder for such a default type. If you help me out with the full name of the Encoder class, including the package, I can figure out explicitly what is broken.
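For what it's worth, one common workaround (a sketch of mine, not part of this answer; Spark's reflection-derived encoders simply don't cover java.util.UUID) is to map each record to encodable primitives before calling toDF:

// Convert the UUID to a String so the derived schema only contains types
// Spark knows how to encode; resultmetric.records is the collection fetched
// in the question above, and spark.implicits._ is already imported there.
val encodable = resultmetric.records.map { m =>
  (m.timestamp.toString, m.modelname, m.good, m.list)
}
spark.sparkContext
  .parallelize(encodable)
  .toDF("id", "name", "good", "list")
  .show()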
I have a Language model, table and repository. So far this works:
package repositories

import javax.inject.Inject

import Helper
import model.{Language, LanguageModel}
import play.api.Logger
import play.api.cache.SyncCacheApi
import slick.jdbc.JdbcProfile

import scala.concurrent.{ExecutionContext, Future}
import scala.util.{Failure, Success}

class LanguageRepository @Inject()(cache: SyncCacheApi, jdbcProfile: JdbcProfile, implicit val executionContext: ExecutionContext)
{
  private val model = new LanguageModel(jdbcProfile)

  import jdbcProfile.api._

  def all(userName: String): Future[Seq[Language]] =
  {
    cache.get[Future[Seq[Language]]](buildCacheKey(userName)) match
    {
      case Some(x) => { Logger.info("[LanguageRepository](all) Found something in cache"); x }
      case None => {
        Logger.info("[LanguageRepository](all) Nothing useful to be found in cache, calling database now")
        val result = retrieve(userName)
        result.onComplete {
          case Success(value) => if (value.nonEmpty) cache.set(buildCacheKey(userName), result)
          case Failure(e) => ()
        }
        result
      }
    }
  }

  private def retrieve(userName: String): Future[Seq[Language]] =
  {
    // TODO extract this to a repository trait and implement fallbacks etc.
    val db = Database.forURL(Helper.getDbUrl(), driver = Helper.getDbDriver())
    db.run(model.all.result)
  }

  private def buildCacheKey(userName: String): String = s"$userName.languages"
}
Now I am struggling with what past me left for current me.
So I created this trait, which I wanted LanguageRepository to extend in order to pull out that generic retrieve method, which should be the same for all repositories/models. But sadly, no luck so far:
trait Repository
{
  type Entity

  val model: Base
  val profile: JdbcProfile

  import profile.api._

  protected def retrieve(userName: String): Future[Seq[Entity]] =
  {
    val db = Database.forURL(Helper.getDbUrl(), driver = Helper.getDbDriver())
    db.run(model.all.result)
  }
}
This is Base:
trait Base
{
  val dbProfile: JdbcProfile

  import dbProfile.api._

  type Entity
  type EntityTable <: Table[Entity]

  lazy val all = TableQuery[EntityTable]
}
Here I get the error: class type required but Base.this.EntityTable found
class LanguageModel(databaseProfile: JdbcProfile) extends Base
{
  override val dbProfile: JdbcProfile = databaseProfile

  import dbProfile.api._

  ...

  override type EntityTable = LanguageTable
}
Repository itself is not compiling either, because the types don't match. There are multiple problems and I am not sure where to start to solve them.
Your base table definition won't work like that. You need class types; maybe you should use generics instead. Also, instead of creating multiple abstractions, start with only one abstraction and evolve from there. Try something along these lines:
class Repository[A, B <: Table[A]](t: TableQuery[B]) {
  val model = t

  // protected def retrieve ...
}

class LanguageModel(databaseProfile: JdbcProfile)
  extends Repository[Language, LanguageTable](TableQuery[LanguageTable]) {
  // ...
}
Get everything to compile first, then start adding the abstraction one class at a time.
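For reference, here is a minimal, self-contained sketch of that single abstraction (my elaboration, assuming Slick 3.2+ for mapTo; the Language case class and the table/column names mirror the question's model but are illustrative):

import scala.concurrent.Future
import slick.jdbc.JdbcProfile

case class Language(code: String, name: String)

trait Profile { val profile: JdbcProfile }

// Everything that needs the profile's API sits behind a single import.
trait LanguageModule { self: Profile =>
  import profile.api._

  class LanguageTable(tag: Tag) extends Table[Language](tag, "language") {
    def code = column[String]("code", O.PrimaryKey)
    def name = column[String]("name")
    def * = (code, name).mapTo[Language]
  }

  lazy val languages = TableQuery[LanguageTable]

  // The database is passed in, so tests can hand over an in-memory H2 instance.
  def retrieve(db: Database): Future[Seq[Language]] = db.run(languages.result)
}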
I'm running into problems when trying to populate a DB with tables (using Slick 2.0.0):
import scala.slick.driver.JdbcDriver
import scala.slick.lifted.TableQuery

case class DemoInit(slickDriver: JdbcDriver, tables: Seq[TableQuery[_]]) {
  lazy val db = slickDriver.simple.Database.forURL("jdbc_url", "user", "pass", driver = "com.example.Driver")

  import slickDriver.simple.{tableQueryToTableQueryExtensionMethods, ddlToDDLInvoker}

  def init() {
    db withSession { implicit session =>
      tables.map(_.ddl).reduce(_ ++ _).create
    }
  }
}
Attempting to compile the code above leads to the following two errors:
value ddl is not a member of scala.slick.lifted.TableQuery[_$1]
value create is not a member of scala.slick.lifted.TableQuery[_$1]
I'm guessing that the type parameters aren't being inferred correctly. What am I doing wrong?
You need to add the right constraint for the table type:
tables: Seq[TableQuery[_ <: Table[_]]]
In such cases where an implicit conversion that you expect is not taken, it helps to add an explicit call to this conversion. This will give you a better error message that explains why the conversion is not applicable.
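For example (a sketch; tableQueryToTableQueryExtensionMethods is the conversion already imported in the snippet above), you can temporarily spell the call out, and the compiler will point at the unsatisfied bound:

// Explicitly invoking the conversion behind `.ddl` surfaces the real error:
tables.map(t => tableQueryToTableQueryExtensionMethods(t).ddl)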
Then you'll run into the next issue: you're using reduce wrong. You'll want to use something like tables.map(_.ddl).reduce(_ ++ _) instead.
Since Table is path-dependent, you cannot use it with a setup that passes everything to a class as parameters. While you are allowed to refer to path-dependent types from a previous argument list in a later argument list of a def, this is not possible in a class. You'll have to structure your code differently to get the path-dependent types right, e.g.:
import scala.slick.driver.JdbcProfile

class DemoDAO(val slickDriver: JdbcProfile) {
  import slickDriver.simple._

  lazy val db = Database.forURL("jdbc_url", "user", "pass", driver = "com.example.Driver")

  def init(tables: Seq[TableQuery[_ <: Table[_]]]) {
    db withSession { implicit session =>
      tables.map(_.ddl).reduce(_ ++ _).create
    }
  }
}
What I ended up doing was creating a new trait:
import scala.slick.driver.JdbcProfile

trait TablesSupplier {
  def tables(profile: JdbcProfile): Seq[JdbcProfile#SimpleQL#TableQuery[_ <: JdbcProfile#SimpleQL#Table[_]]]
}
I'm using Slick to generate the source code for table objects, so I have a Tables trait, which I use in the implementation of TablesSupplier:
import scala.slick.driver.JdbcProfile

object DemoTablesSupplier extends TablesSupplier {
  def tables(profile: JdbcProfile) = {
    val _profile = profile
    val demoTables = new Tables { val profile = _profile }
    import demoTables._
    Seq(Table1, Table2, Table3)
  }
}
So, my DemoInit now looks like this:
import scala.slick.driver.JdbcDriver

case class DemoInit(slickDriver: JdbcDriver, tablesCreator: TablesSupplier) {
  lazy val db = slickDriver.simple.Database.forURL("jdbc_url", "user", "pass", driver = "com.example.Driver")

  def init() {
    db withSession { implicit session =>
      val profile = slickDriver.profile
      // import the expected Table and TableQuery types, as well as the two implicit
      // methods that are necessary in order to use the 'ddl' and 'create' methods
      import profile.simple.{Table, TableQuery, tableQueryToTableQueryExtensionMethods, ddlToDDLInvoker}
      // using asInstanceOf is very ugly but I couldn't figure out a better way
      val tables = tablesCreator.tables(profile).asInstanceOf[Seq[TableQuery[_ <: Table[_]]]]
      tables.map(_.ddl).reduce(_ ++ _).create
    }
  }
}
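For completeness, a hypothetical wiring of these pieces (H2Driver stands in for whichever driver matches the placeholder JDBC URL above, so this is illustrative rather than runnable as-is):

import scala.slick.driver.H2Driver

// Hypothetical entry point tying DemoInit and DemoTablesSupplier together.
object CreateTablesApp extends App {
  DemoInit(H2Driver, DemoTablesSupplier).init()
}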
I have a model
package models

import scala.slick.driver.SQLServerDriver.simple._
import play.api.libs.json._
import play.api.db.DB
import play.api.Play.current
import Database.threadLocalSession

case class College(collegeCode: String, collegeName: String)

object College {
  lazy val database = Database.forDataSource(DB.getDataSource())

  val CollegeTable = new Table[College]("College") {
    def collegeCode = column[String]("CollegeCode", O.PrimaryKey)
    def collegeName = column[String]("CollegeName")

    def * = collegeCode ~ collegeName <> (College.apply _, College.unapply _)

    implicit val CollegeReads = Json.reads[College]
    implicit val CollegeWrites = Json.writes[College]
  }

  def getAll: Seq[College] = {
    database withSession {
      val q = Query(CollegeTable)
      q.list
    }
  }
}
In my controller I'm attempting to render the data as JSON:
Ok(Json.toJson(College.getAll))
When viewing the page I receive this error:
No Json deserializer found for type Seq[models.College]. Try to implement an implicit Writes or Format for this type.
I thought that defining the implicit read/write in the model would take care of this. It's not until I do something like this:
Ok(Json.toJson(College.getAll.map { c =>
  (c.collegeCode, c.collegeName)
} toMap))
in the controller that the JSON is actually rendered. What am I doing wrong in the implicit read/write implementation?
The problem is that the controller cannot see the implicits: they are hidden inside the block that defines CollegeTable.
Is there any particular reason you need the implicits in this location? If not then you could put the implicits in your controller.
If you want your model to have the implicits beside it, then move the implicits up a level and import them in the controller.
object College {
  implicit val CollegeReads = Json.reads[College]
  implicit val CollegeWrites = Json.writes[College]
}

class CollegeController extends Controller {
  import College.CollegeReads
  import College.CollegeWrites

  ...
}
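With the writes imported into the controller, the original call from the question should then render correctly:

// Writes[College] is now in scope, so a Seq[College] serializes directly.
Ok(Json.toJson(College.getAll))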