Slick Write a Simple Table Creation Function - scala

Emm... I'm trying out Slick with Play 2. The table creation process has become very frustrating because, unlike other ORMs (such as Ebean), Slick doesn't detect whether the database has already been set up: if a table already exists, it throws an exception. I simply don't want to drop and recreate everything every time I restart the server, so I decided to write a small function to help me:
def databaseCreate(tables: Table*) = {
  for (table <- tables) {
    if (MTable.getTables(table.getClass.getName).list.isEmpty) table.ddl.create
  }
}
What this does is to take in some objects like this one:
object Tag extends Table[(Option[Int], String)]("Tags") {
  def id = column[Int]("TAG_ID", O.PrimaryKey, O.AutoInc)
  def tag_name = column[String]("TAG_NAME")
  def * = id.? ~ tag_name
}
And it uses MTable from scala.slick.jdbc.meta.MTable to check whether the table exists. Then I ran into a simple Java reflection problem. If databaseCreate took Strings, I couldn't invoke .ddl.create on them, so I decided to pass in objects and use reflection: table.getClass.getName. The only problem is a type mismatch (from my IDE):
Expected: MySQLDriver.simple.type#Table, actual: BlogData.Tag.type
BlogData is the big object I use to store all the smaller table objects. How do I solve this mismatch problem? With a whole bunch of asInstanceOf? That would make the code unbearably long and ugly...
Corrected:
The type mismatch is a false alarm: it comes from the IDE, not from the Typesafe console compiler. The real problem is:
type Table takes type parameters
def databaseCreate(tables: Table*) = {
                           ^
one error found
Then I followed the advice and changed the code:
def databaseCreate(tables: Table[_]*)(implicit session: Session) = {
  for (table <- tables) {
    if (MTable.getTables(table.tableName).list.isEmpty) table.ddl.create
  }
}
Then I got this error:
ambiguous implicit values: both value session of type slick.driver.MySQLDriver.simple.Session and method threadLocalSession in object Database of type => scala.slick.session.Session match expected type scala.slick.session.Session
if (MTable.getTables(table.tableName).list.isEmpty) table.ddl.create
^
one error found
My imports are here:
import play.api.GlobalSettings
import play.api.Application
import models.BlogData._
import scala.slick.driver.MySQLDriver.simple._
import Database.threadLocalSession
import play.api.Play.current
import scala.slick.jdbc.meta.MTable
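For what it's worth, this error is the generic two-matching-implicits situation in Scala: both the explicit `session` parameter and the imported `threadLocalSession` satisfy the implicit search, so removing one candidate (e.g. the `Database.threadLocalSession` import) resolves it. A minimal, Slick-free sketch of the same situation (all names hypothetical):

```scala
object A { implicit val greeting: String = "from A" }
object B { implicit val greeting: String = "from B" }

// Needs exactly one implicit String in scope at the call site.
def needsImplicit(implicit s: String): String = s

// Importing both candidates makes the call ambiguous, just like having
// both `session` and `threadLocalSession` in scope:
//   import A._; import B._; needsImplicit   // does not compile

// Importing only one candidate resolves the ambiguity.
import A._
println(needsImplicit) // prints "from A"
```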

In Slick 2: For the sake of completeness, see this and that related thread.
I use the following method:
def createIfNotExists(tables: TableQuery[_ <: Table[_]]*)(implicit session: Session) {
  tables foreach { table =>
    if (MTable.getTables(table.baseTableRow.tableName).list.isEmpty) table.ddl.create
  }
}
Then you can just create your tables with the implicit session:
db withSession { implicit session =>
  createIfNotExists(table1, table2, ..., tablen)
}

You are using Slick v1, I assume, since you have Table objects. I have working code for v1. This version will drop any tables that are present before recreating them (but won't drop any tables you don't wish to recreate). It also merges the DDL into one before creation in order to get the creation sequence correct:
def create() = database withSession {
  import scala.slick.jdbc.{StaticQuery => Q}

  val tables = Seq(TableA, TableB, TableC)
  // a def, so the list of existing tables is re-queried after the drops below
  def existingTables = MTable.getTables.list().map(_.name.name)

  tables.filter(t => existingTables.contains(t.tableName)).foreach { t =>
    Q.updateNA(s"drop table ${t.tableName} cascade").execute
  }

  val tablesToCreate = tables.filterNot(t => existingTables.contains(t.tableName))
  val ddl: Option[DDL] = tablesToCreate.foldLeft(None: Option[DDL]) { (ddl, table) =>
    ddl match {
      case Some(d) => Some(d ++ table.ddl)
      case _       => Some(table.ddl)
    }
  }
  ddl.foreach(_.create)
}
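As an aside, the Option-accumulating foldLeft above is a general Scala pattern equivalent to the standard library's reduceLeftOption; a small stdlib-only illustration with strings standing in for the DDL values:

```scala
val parts = Seq("create table a;", "create table b;", "create table c;")

// foldLeft over an Option accumulator, mirroring the shape used above
val merged1: Option[String] = parts.foldLeft(None: Option[String]) {
  case (Some(acc), p) => Some(acc + " " + p)
  case (None, p)      => Some(p)
}

// the same result via the stdlib helper; both are safe when the Seq is empty
val merged2: Option[String] = parts.reduceLeftOption(_ + " " + _)

println(merged1 == merged2) // prints "true"
```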

Not sure why you get this error message; I would need to see your imports and the place where you call databaseCreate. But what is wrong is that def databaseCreate(tables: Table*) should be def databaseCreate(tables: Table[_]*), and it should probably take a second argument list as well: def databaseCreate(tables: Table[_]*)(implicit session: Session) = ....
Also, instead of table.getClass.getName, you can use table.tableName.

You can use asTry with the table-creation DBIOAction. asTry wraps each action's outcome in a Try, so if the table already exists the failed create is captured instead of aborting the whole sequence; if it does not exist, the action creates the table.
In HomePageDAO:
val createHomePageTableAction: DBIOAction[Int, NoStream, Effect.Schema with Effect.Write] =
  homePageTable.schema.create >> (homePageTable += Homepage("No Data", "No Data", 0L))
In ExplorePageDAO:
val SoftwareTableCreateAction: FixedSqlAction[Unit, NoStream, Effect.Schema] = softwareTable.schema.create
And in CreateTablesController:
package controllers

import javax.inject.{Inject, Singleton}
import play.api.mvc.{AbstractController, Action, AnyContent, ControllerComponents}
import services.dbServices.{ExplorePageDAO, HomePageDAO}

import scala.concurrent.ExecutionContext

@Singleton
class CreateTablesController @Inject()(cc: ControllerComponents)(implicit assetsFinder: AssetsFinder, executionContext: ExecutionContext)
  extends AbstractController(cc) {

  import services.dbServices.dbSandBox._

  def createTable: Action[AnyContent] = Action.async {
    dbAccess.run(HomePageDAO.createHomePageTableAction.asTry >> ExplorePageDAO.SoftwareTableCreateAction.asTry).map(_ => Ok("Table Created"))
  }
}
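The effect of asTry can be mimicked with plain scala.util.Try: a failing step is captured as a value instead of short-circuiting the steps after it. A Slick-free analogue (the names and the already-exists check are made up for illustration):

```scala
import scala.util.{Try, Success, Failure}

// Stands in for running `schema.create` for one table.
def createTable(name: String, existing: Set[String]): Try[String] =
  if (existing.contains(name)) Failure(new Exception(s"table $name already exists"))
  else Success(s"created $name")

val existing = Set("home_page")

// Like chaining DBIO actions with .asTry >> : the first failure
// does not prevent the second action from running.
val results = Seq("home_page", "software").map(n => createTable(n, existing))

results.foreach {
  case Success(msg) => println(msg)
  case Failure(e)   => println(s"skipped: ${e.getMessage}")
}
```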

Related

Using Phantom 2 with an existing Cassandra session

I am trying to migrate our current implementation from Phantom 1.28.16 to 2.16.4 but I'm running into problems with the setup.
Our framework is providing us with the Cassandra session object during startup which doesn't seem to fit with Phantom. I am trying to get Phantom to accept that already instantiated session instead of going through the regular CassandraConnection object.
I'm assuming that we can't use the Phantom Database class because of this but I am hoping that there still is some way to set up and use the Tables without using that class.
Is this doable?
I ended up doing the following to be able to use Phantom with an existing connection:
Defined a new trait PhantomTable to be used instead of Phantom's Table trait. They are identical except for the removal of RootConnector:
trait PhantomTable[T <: PhantomTable[T, R], R] extends CassandraTable[T, R] with TableAliases[T, R]
Defined my tables by extending the PhantomTable trait and also made each one an object. Here I had to import all of the TableHelper macro to get it to compile:
...
import com.outworkers.phantom.macros.TableHelper._
final case class Foo(id: String, name: Option[String])

sealed class FooTable extends PhantomTable[FooTable, Foo] {
  override val tableName = "foo"

  object id extends StringColumn with PartitionKey
  object name extends OptionalStringColumn
}

object FooTable extends FooTable
After that it is possible to use all the desired methods on the FooTable object, as long as an implicit Keyspace and Session exist in scope.
This is a simple main program that shows how the tables can be used
object Main extends App {
  val ks = "foo_keyspace"
  val cluster = Cluster.builder().addContactPoints("127.0.0.1").build()

  implicit val keyspace: KeySpace = KeySpace(ks)
  implicit val session: Session = cluster.connect(ks)

  val res = for {
    _   <- FooTable.create.ifNotExists.future
    _   <- FooTable.insert.value(_.id, "1").value(_.name, Some("data")).future
    row <- FooTable.select.where(_.id eqs "1").one
  } yield row

  val r = Await.result(res, 10.seconds)
  println(s"Row: $r")
}

How to organize Slick code into separately testable units with a configurable database?

I have a codebase of several applications with a shared database that use Slick for database access. Parts of the code like common table mappings are in a common library. I want to be able to run two kinds of tests on those projects:
tests that use the real database
other tests that use an in-memory H2 database
I have tried several ways to organize my code to support this, but have always hit a wall at some point. My latest attempt is using the DatabaseConfig approach and passes around a database definition (including database, profile and table definitions) to the objects that do database operations (let's call them Services). This way, I can - in theory - easily test a service by passing in the database definition I want to test it with. In practice, I get problems in the interaction between services because types don't fit properly.
I have created a simplified example that shows the kind of problems I get:
import slick.basic.DatabaseConfig
import slick.jdbc.JdbcProfile
import slick.lifted.Rep

import scala.concurrent.Future

trait HasProfile {
  val profile: JdbcProfile
}

trait HasId {
  val id: Rep[Long]
}

trait Table_Person {
  this: HasProfile =>

  case class PersonRow(personId: Long, lastName: String)

  import profile.api._

  class Person(tableTag: Tag) extends Table[PersonRow](tableTag, "person") with HasId {
    val personId = column[Long]("person_id", O.PrimaryKey, O.AutoInc)
    val lastName = column[String]("last_name")

    override val id = personId

    val * = (personId, lastName).mapTo[PersonRow]
  }

  lazy val Person = TableQuery[Person]
}

class DatabaseWrapper {
  private val databaseConfig: DatabaseConfig[JdbcProfile] = DatabaseConfig.forConfig("PersonDatabase")

  def db = databaseConfig.db

  object Schema extends HasProfile with Table_Person {
    override val profile = databaseConfig.profile
  }
}

class PersonService(val databaseWrapper: DatabaseWrapper) {
  import databaseWrapper.Schema._

  val loadService = new LoadService(databaseWrapper)

  def load(id: Long): Future[Option[PersonRow]] = loadService.load[PersonRow](Person, id)
}

class LoadService(val databaseWrapper: DatabaseWrapper) {
  import databaseWrapper.Schema.profile.api._

  def load[R](tableQuery: TableQuery[_ <: Table[R] with HasId], id: Long): Future[Option[R]] =
    databaseWrapper.db.run(tableQuery.filter(_.id === id).result.headOption)
}
This code gives me the following compile error:
Error:(51, 79) type mismatch;
found : slick.lifted.TableQuery[PersonService.this.databaseWrapper.Schema.Person]
required: PersonService.this.loadService.databaseWrapper.Schema.profile.api.TableQuery[_ <: PersonService.this.loadService.databaseWrapper.Schema.profile.api.Table[PersonService.this.databaseWrapper.Schema.PersonRow] with HasId]
(which expands to) slick.lifted.TableQuery[_ <: PersonService.this.loadService.databaseWrapper.Schema.profile.Table[PersonService.this.databaseWrapper.Schema.PersonRow] with HasId]
def load(id: Long): Future[Option[PersonRow]] = loadService.load[PersonRow](Person, id)
It seems the type checker does not realize that PersonService.this.loadService.databaseWrapper is the same as PersonService.this.databaseWrapper.
Is there a way around this? Does this approach even make sense, or are there any better approaches to structure the code?
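One common way around this is to key the dependent service on the wrapper's singleton type, so the compiler can prove that both paths name the same instance. A minimal, Slick-free sketch of the problem and the singleton-type fix (all names hypothetical):

```scala
class Wrapper {
  class Inner
  def make: Inner = new Inner
}

// Parameterizing on W and instantiating with w.type keeps the
// path-dependent type `wrapper.Inner` equal to `w.Inner`.
class Service[W <: Wrapper](val wrapper: W) {
  def use(i: wrapper.Inner): String = "ok"
}

val w = new Wrapper

// With plain `new Service(w)` the type parameter may widen to Wrapper,
// and `w.make` would then no longer match `wrapper.Inner`.
val service = new Service[w.type](w)
println(service.use(w.make)) // prints "ok"
```

Applied to the example above, that would mean something like class LoadService[W <: DatabaseWrapper](val databaseWrapper: W) combined with new LoadService[databaseWrapper.type](databaseWrapper) inside PersonService, so that loadService.databaseWrapper and databaseWrapper share a type.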

Slick implicit parameter 'tables' with generated tables

The simple version
What's the preferred way to import and use generated Slick tables?
The detailed version and what I've tried
I used Slick 3.1.1 codegen to generate a Tables.scala from a MySQL (MariaDB) schema.
Tables.scala begins with this:
// AUTO-GENERATED Slick data model
/** Stand-alone Slick data model for immediate use */
object Tables extends {
  val profile = slick.driver.MySQLDriver
} with Tables
What's the best way to use these classes? As per the Slick documentation:
The file contains an object Tables from which the code can be imported for use right away. ... The file also contains a trait Tables which can be used in the cake pattern.
... I've tried variations on this example
import Tables._
import Tables.profile.api._
import slick.jdbc.JdbcBackend

class Test(s: String)(implicit db: Database) {
  def exec[T](action: DBIO[T])(implicit db: Database): T =
    Await.result(db run action)

  def run: Unit = exec((ATable filter (_.id)).result)
}

object Test {
  implicit val db = Database.forURL(url, user, password)
  new Test("")
}
I get a compile error wherever I reference the class ATable:
could not find implicit value for parameter tables: Tables
I don't even see tables in Tables.scala. How do I get everything I need in scope to use my generated Slick classes?
I got the example to work: Tables._ and Tables.profile.api._ just need to be imported inside the class with an implicit Database available.
import slick.jdbc.JdbcBackend

class Test(s: String)(implicit db: Database) {
  import Tables._
  import Tables.profile.api._

  def exec[T](action: DBIO[T])(implicit db: Database): T =
    Await.result(db run action)

  def run: Unit = exec((ATable filter (_.id)).result)
}

object Test {
  implicit val db = Database.forURL(url, user, password)
  new Test("")
}

How to set an in memory test database for my Scala Play2 CRUD application?

I'm continuing my exploration of the Play framework and its related components. I used the template for a CRUD application with a connection to a PostgreSQL database to begin with. It splits the application into models, repositories, controllers and views. This works well.
Now, I'm trying to create some tests for this application with Specs2. More precisely, I'm trying to test the repository. It is defined as follow:
package dal

import javax.inject.{Inject, Singleton}
import play.api.db.slick.DatabaseConfigProvider
import slick.driver.JdbcProfile
import models.Cat

import scala.concurrent.{Future, ExecutionContext}

@Singleton
class CatRepository @Inject() (dbConfigProvider: DatabaseConfigProvider)(implicit ec: ExecutionContext) {
  ...
}
I would like to set an in memory database which would be created (schema, evolutions) before all tests, destroyed after all tests, populated (data, probably with direct SQL) and flushed around each test. I would like to pass it on to my repository instance which I would then use to perform my test. Like:
val repo = new CatRepository(inMem_DB)
So how do I go about:
1) creating this db and applying the evolutions?
maybe:
trait TestDB extends BeforeAfterAll {
  var database: Option[Database] = None

  def before = {
    database = Some(Databases.inMemory(name = "test_db"))
    database match {
      case Some(con) => Evolutions.applyEvolutions(con)
      case _ => None
    }
    println("DB READY")
  }

  def after = {
    database match {
      case Some(con) => con.shutdown()
      case _ => None
    }
  }
}
Using a var and always pattern matching whenever I need the db isn't convenient. I guess there is a much better way to do this...
2) Populate and flush around each test?
Shall I create a trait extending Around, just as with BeforeAfterAll?
3) Create one of these play.api.db.slick.DatabaseConfigProvider from the database?
Any link showing how to do this?
I found a few examples which covered this by running a FakeApplication, but I assume there is a way to somehow pass the db to such a repository object outside of a running application..?
Thank you for helping.
Cheers!
You can use a lazy val and AfterAll for the database setup/teardown, then BeforeAfterEach for each example:
trait TestDB extends AfterAll with BeforeAfterEach {
  lazy val database: Database =
    Databases.inMemory(name = "test_db")

  def afterAll =
    database.shutdown

  def before =
    database.populate

  def after =
    database.clean
}

How to create tables given a Seq[TableQuery] in Typesafe's Slick?

I'm running into problems when trying to populate a DB with tables (using Slick 2.0.0):
import scala.slick.driver.JdbcDriver
import scala.slick.lifted.TableQuery

case class DemoInit(slickDriver: JdbcDriver, tables: Seq[TableQuery[_]]) {
  lazy val db = slickDriver.simple.Database.forURL("jdbc_url", "user", "pass", driver = "com.example.Driver")

  import slickDriver.simple.{tableQueryToTableQueryExtensionMethods, ddlToDDLInvoker}

  def init() {
    db withSession { implicit session =>
      tables.map(_.ddl).reduce(_ ++ _).create
    }
  }
}
Attempting to compile the code above leads to the following two errors:
value ddl is not a member of scala.slick.lifted.TableQuery[_$1]
value create is not a member of scala.slick.lifted.TableQuery[_$1]
I'm guessing that the type parameters aren't being inferred correctly. What am I doing wrong?
You need to add the right constraint for the table type:
tables: Seq[TableQuery[_ <: Table[_]]]
In such cases where an implicit conversion that you expect is not taken, it helps to add an explicit call to this conversion. This will give you a better error message that explains why the conversion is not applicable.
Then you'll run into the next issue that you're using reduce wrong. You'll want to use something like tables.map(_.ddl).reduce(_ ++ _) instead.
Since Table is path-dependent you cannot use it with your setup that tries to pass everything to a class as parameters. While you are allowed to refer to path-dependent types from a previous argument list in a later argument list of a def, this is not possible in a class. You'll have to structure your code differently to get the path-dependent types right, e.g.:
import scala.slick.driver.JdbcProfile

class DemoDAO(val slickDriver: JdbcProfile) {
  import slickDriver.simple._

  lazy val db = slickDriver.simple.Database.forURL("jdbc_url", "user", "pass", driver = "com.example.Driver")

  def init(tables: Seq[TableQuery[_ <: Table[_]]]) {
    db withSession { implicit session =>
      tables.map(_.ddl).reduce(_ ++ _).create
    }
  }
}
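The rule mentioned above, that a later parameter list of a def may depend on an earlier one while class constructor parameters may not, can be seen in miniature with plain Scala (hypothetical names):

```scala
class Holder {
  class Member
  def mk: Member = new Member
}

// A def may refer to a path-dependent type from a previous argument list...
def use(h: Holder)(m: h.Member): String = "ok"

val h = new Holder
println(use(h)(h.mk)) // prints "ok"

// ...but a class cannot do the same across its own constructor parameters:
// class Bad(h: Holder, m: h.Member)   // does not compile
```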
What I ended up doing was creating a new trait:
import scala.slick.driver.JdbcProfile

trait TablesSupplier {
  def tables(profile: JdbcProfile): Seq[JdbcProfile#SimpleQL#TableQuery[_ <: JdbcProfile#SimpleQL#Table[_]]]
}
I'm using Slick to generate the source code for table objects, so I have a Tables trait, which I use in the implementation of TablesSupplier:
import scala.slick.driver.JdbcProfile

object DemoTablesSupplier extends TablesSupplier {
  def tables(profile: JdbcProfile) = {
    val _profile = profile
    val demoTables = new Tables { val profile = _profile }
    import demoTables._
    Seq(Table1, Table2, Table3)
  }
}
So, my DemoInit now looks like this:
import scala.slick.driver.JdbcDriver

case class DemoInit(slickDriver: JdbcDriver, tablesCreator: TablesSupplier) {
  lazy val db = slickDriver.simple.Database.forURL("jdbc_url", "user", "pass", driver = "com.example.Driver")

  import slickDriver.simple.{tableQueryToTableQueryExtensionMethods, ddlToDDLInvoker}

  def init() {
    db withSession { implicit session =>
      val profile = slickDriver.profile
      // import the expected Table and TableQuery types as well as the two implicit
      // conversions that are necessary in order to use the 'ddl' and 'create' methods
      import profile.simple.{Table, TableQuery, tableQueryToTableQueryExtensionMethods, ddlToDDLInvoker}
      // using asInstanceOf is very ugly but I couldn't figure out a better way
      val tables = tablesCreator.tables(profile).asInstanceOf[Seq[TableQuery[_ <: Table[_]]]]
      tables.map(_.ddl).reduce(_ ++ _).create
    }
  }
}