Slick 3: insertOrUpdate not working - postgresql

I'm using Slick with PostgreSQL 9.6.1, Play! 2.5 and play-slick 2.0.2.
(I also use slick-pg 0.14.3, but I don't think it changes anything here.)
I'm using insertOrUpdate in a very straightforward way, but I still get a unique-constraint exception.
I have a very simple test using insertOrUpdate:
If I run it several times, I always get an SQL exception:
    ERROR: duplicate key value violates unique constraint "ga_client_id_pkey"
    Detail: Key (client_id)=(1885746393.1464005051) already exists
However, my table is defined with client_id as primary key:
    def clientId = column[String]("client_id", O.PrimaryKey)
and defined in SQL as follows:
    client_id TEXT NOT NULL UNIQUE PRIMARY KEY
The function tested simply does:
    db.run(gaClientIds.insertOrUpdate(gaClientId))
and the controller simply calls this method and does nothing else.
A strange thing is that calling the method by itself several times never triggers the error, but calling it through the controller does, even though the controller does nothing but invoke the method.
Is Slick's insertOrUpdate not yet reliable, or am I missing something?

Native insertOrUpdate is only supported by the MySQL driver; see the supported-databases matrix:
http://slick.lightbend.com/doc/3.2.1/supported-databases.html
You can try this library, which gives you an implementation of insertOrUpdate/upsert:
https://github.com/tminglei/slick-pg
This is how we use it in our current project:
    import com.github.tminglei.slickpg._
    import com.typesafe.config.ConfigFactory
    import slick.basic.Capability

    object DBComponent {
      private val config = ConfigFactory.load()

      val driver = config.getString("rdbms.properties.driver") match {
        case "org.h2.Driver" => slick.jdbc.H2Profile
        case _               => MyPostgresProfile
      }

      import driver.api._
      val db: Database = Database.forConfig("rdbms.properties")
    }

    trait MyPostgresProfile extends ExPostgresProfile {
      // Add back `capabilities.insertOrUpdate` to enable native `upsert` support on PostgreSQL 9.5+
      override protected def computeCapabilities: Set[Capability] =
        super.computeCapabilities + slick.jdbc.JdbcCapabilities.insertOrUpdate

      override val api: MyAPI.type = MyAPI
      object MyAPI extends API
    }

    object MyPostgresProfile extends MyPostgresProfile
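With the custom profile in scope, the failing call from the question should compile to a native upsert. A minimal sketch (the table and value are reconstructed from the question; only the profile import is new):

    import MyPostgresProfile.api._

    // The question's table, reduced to its primary-key column
    class GaClientIds(tag: Tag) extends Table[String](tag, "ga_client_ids") {
      def clientId = column[String]("client_id", O.PrimaryKey)
      def * = clientId
    }
    val gaClientIds = TableQuery[GaClientIds]

    // With JdbcCapabilities.insertOrUpdate enabled, Slick emits a native
    // INSERT ... ON CONFLICT ... DO UPDATE (PostgreSQL 9.5+) instead of an
    // emulated select-then-insert, which is what races and throws the
    // duplicate-key error under concurrent controller calls.
    DBComponent.db.run(gaClientIds.insertOrUpdate("1885746393.1464005051"))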

Slick implicit parameter 'tables' with generated tables

The simple version
What's the preferred way to import and use generated Slick tables?
The detailed version and what I've tried
I used Slick 3.1.1 codegen to generate a Tables.scala from a MySQL (MariaDB) schema.
Tables.scala begins with this:
// AUTO-GENERATED Slick data model
/** Stand-alone Slick data model for immediate use */
    object Tables extends {
      val profile = slick.driver.MySQLDriver
    } with Tables
What's the best way to use these classes? As per the Slick documentation:
The file contains an object Tables from which the code can be imported for use right away. ... The file also contains a trait Tables which can be used in the cake pattern.
... I've tried variations on this example
    import Tables._
    import Tables.profile.api._
    import slick.jdbc.JdbcBackend

    class Test(s: String)(implicit db: Database) {
      def exec[T](action: DBIO[T])(implicit db: Database): T =
        Await.result(db run action, Duration.Inf)
      def run: Unit = exec((ATable filter (_.id)).result)
    }

    object Test {
      implicit val db = Database.forURL(url, user, password)
      new Test("")
    }
I get a compile error wherever I reference the class ATable:
    could not find implicit value for parameter tables: Tables
I don't even see tables in Tables.scala. How do I get everything I need in scope to use my generated Slick classes?
I got the example to work: Tables._ and Tables.profile.api._ just need to be imported inside the class with an implicit Database available.
    import slick.jdbc.JdbcBackend

    class Test(s: String)(implicit db: Database) {
      import Tables._
      import Tables.profile.api._

      def exec[T](action: DBIO[T])(implicit db: Database): T =
        Await.result(db run action, Duration.Inf)
      def run: Unit = exec((ATable filter (_.id)).result)
    }

    object Test {
      implicit val db = Database.forURL(url, user, password)
      new Test("")
    }

How to set an in memory test database for my Scala Play2 CRUD application?

I'm continuing my exploration of the Play framework and its related components. I used the template for CRUD application with a connection to a PostgreSQL database to begin with. It splits the application in models, repositories, controllers and views. This works well.
Now, I'm trying to create some tests for this application with Specs2. More precisely, I'm trying to test the repository, which is defined as follows:
    package dal

    import javax.inject.{ Inject, Singleton }
    import play.api.db.slick.DatabaseConfigProvider
    import slick.driver.JdbcProfile
    import models.Cat
    import scala.concurrent.{ Future, ExecutionContext }

    @Singleton
    class CatRepository @Inject() (dbConfigProvider: DatabaseConfigProvider)(implicit ec: ExecutionContext) {
      ...
    }
I would like to set up an in-memory database which would be created (schema, evolutions) before all tests, destroyed after all tests, and populated (data, probably with direct SQL) and flushed around each test. I would like to pass it to my repository instance, which I would then use to perform my tests. Like:
    val repo = new CatRepository(inMem_DB)
So how do I go about:
1) creating this db and applying the evolutions?
maybe:
    trait TestDB extends BeforeAfterAll {
      var database: Option[Database] = None

      def beforeAll = {
        database = Some(Databases.inMemory(name = "test_db"))
        database match {
          case Some(con) => Evolutions.applyEvolutions(con)
          case _         => None
        }
        println("DB READY")
      }

      def afterAll = {
        database match {
          case Some(con) => con.shutdown()
          case _         => None
        }
      }
    }
Using a var and pattern matching on the Option every time I need the db isn't convenient. I guess there is a much better way to do this...
2) Populate and flush around each test?
Shall I create a trait extending Around, just as with BeforeAfterAll?
3) Create one of these play.api.db.slick.DatabaseConfigProvider instances from the database?
Any link showing how to do this?
I found a few examples covering this with a running FakeApplication, but I assume there is a way to pass the db to such a repository object outside of a running application..?
Thank you for helping.
Cheers!
You can use a lazy val and AfterAll for the database setup/teardown, then BeforeAfterEach for each example:
    trait TestDB extends AfterAll with BeforeAfterEach {
      lazy val database: Database =
        Databases.inMemory(name = "test_db")

      def afterAll =
        database.shutdown

      def before =
        database.populate

      def after =
        database.clean
    }
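Note that populate and clean are not methods on Play's Database; they stand in for user-defined fixture helpers. One way to sketch them (hypothetical cat table, plain JDBC through the test database's connection):

    trait TestData { self: TestDB =>
      // Insert fixture rows before each example (hypothetical `cat` table)
      def populate(): Unit = database.withConnection { conn =>
        conn.createStatement().execute(
          "insert into cat (name) values ('Felix'), ('Garfield')")
      }

      // Wipe the rows afterwards so examples stay independent
      def clean(): Unit = database.withConnection { conn =>
        conn.createStatement().execute("delete from cat")
      }
    }

With trait methods like these, the calls become plain populate() / clean() rather than methods on the database value itself.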

Where to put my database access methods when using a DAO with Slick 2.0?

(This question is based on a very similar previous request for help. With the introduction of a DAO and multiple database drivers, the same problem requires a different approach, and I hope warrants a new SO question.)
I have a class and Slick Table defined like this:
    import play.api.db.slick.Profile

    case class Foo(title: String, description: String, id: Int = 0)

    trait FooComponent extends Profile { this: Profile =>
      import profile.simple._

      class FooTable(tag: Tag) extends Table[Foo](tag, "FOO") {
        def id = column[Int]("ID", O.PrimaryKey, O.AutoInc)
        def title = column[String]("TITLE", O.NotNull)
        def description = column[String]("DESCRIPTION")
        def * = (title, description, id) <> (Foo.tupled, Foo.unapply)
      }
    }
And a data access object:
    class DAO(override val profile: JdbcProfile) extends FooComponent with Profile {
      val foos = TableQuery[FooTable]
    }

    object current {
      val dao = new DAO(DB(play.api.Play.current).driver)
    }
This is pretty awesome, because now I can add something like the following to my application.conf:
    db.default.driver=org.h2.Driver
    db.default.url="jdbc:h2:mem:play"
    db.test.driver=org.postgresql.Driver
    db.test.user="testuser"
    db.test.password=""
    db.test.url="jdbc:postgresql:testdb"
... and if I do the following in a Controller:
    import models.current.dao._
    import models.current.dao.profile.simple._
I have access to my foos TableQuery, and it automagically gets the driver and database url given for db.default in application.conf.
In a similar, but not-quite-as-nice way, I can do the following in my test Specification:
    "test Foos" in new WithApplication() {
      val dao = new DAO(play.api.db.slick.DB("test").driver)
      import dao._                // import all our database Tables
      import dao.profile.simple._ // import specific database methods

      play.api.db.slick.DB("test").withSession { implicit s: Session =>
        println(s.conn.getMetaData.getURL)
        println(foos.list)
      }
    }
However, what if I want to define a method which acts on a TableQuery[Foo]? Something like this:
    def findByTitle(title: String) = foos.filter(_.title === title).list
Problem
What's the correct way of writing the findByTitle method, and where should I put it so that I can:
Call it in a way such that it won't collide with a method of the same name which acts on TableQuery[Bar]. Coming from OO, I feel like I want to do something like foos.findByTitle("someFoo"), but if there's a better way of doing this functional-style, I'm open to suggestions.
Call it from an application Controller such that the query will work with my db.default h2 driver, and from my test Specification so that it will work with my db.test postgres driver.
As an aside, if I can put this in my DAO:
    object current {
      val dao = new DAO(DB(play.api.Play.current).driver)
    }
and then import models.dao.current._ anywhere I want to use this DAO, how can I extend the same form to the following:
    object test {
      val dao = new DAO(play.api.db.slick.DB("test").driver)
    }
If I try to do this, the compiler complains about not having an implicit Application in scope.
I think you need to read up on implicit conversions and implicit parameters in Scala. There are free online Scala books available.
When you get an error message about a missing implicit, it means one of two things. Either you ran into a failing type-check provided by a library, preventing you from doing something wrong, but that's not the case here. Or you simply forgot to make the implicit available. There are two ways to make an implicit available: import it into the scope where you get the error message, or defer the lookup to the call site of your method. I'm not sure which one is right for Play. You either need to import the implicit Application from Play, or you need to turn val dao into a method and request an implicit Application in an implicit argument list: def dao(implicit app: Application) = .... Alternatively, you can turn test into a class and request the Application there.
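A sketch of the second option, deferring the lookup to the call site (DAO and the "test" config name are from the question):

    import play.api.Application

    object test {
      // No Application is needed until someone actually calls dao;
      // at that point the caller must have an implicit one in scope.
      def dao(implicit app: Application): DAO =
        new DAO(play.api.db.slick.DB("test").driver)
    }

    // e.g. inside `new WithApplication { ... }` an implicit Application
    // is available, so `test.dao` compiles and picks up the db.test config.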
If you use the Play Slick plugin, it needs a started Play application before you can call code that uses DB access from the plugin. You can make sure a Play app is started in your tests by using WithApplication, as described in the docs: http://www.playframework.com/documentation/2.3.x/ScalaFunctionalTestingWithSpecs2

How to map postgresql custom enum column with Slick2.0.1?

I just can't figure it out. What I am using right now is:
    abstract class DBEnumString extends Enumeration {
      implicit val enumMapper = MappedJdbcType.base[Value, String](
        _.toString(),
        s => this.withName(s)
      )
    }
And then:
    object SomeEnum extends DBEnumString {
      type T = Value
      val A1 = Value("A1")
      val A2 = Value("A2")
    }
The problem is, during insert/update the JDBC driver for PostgreSQL complains about the parameter type being "character varying" when the column type is "some_enum", which is reasonable, as I am converting SomeEnum to String.
How do I tell Slick to treat the String as the DB-defined "enum_type"? Or how do I define some other Scala type that will map to "enum_type"?
I had similar confusion when trying to get my PostgreSQL enums to work with Slick. slick-pg allows you to use Scala enums with your database's enums, and its test suite shows how.
Below is an example.
Say we have this enumerated type in our database.
    CREATE TYPE Dog AS ENUM ('Poodle', 'Labrador');
We want to be able to map these to Scala enums, so we can use them happily with Slick. We can do this with slick-pg, an extension for slick.
First off, we make a Scala version of the above enum.
    object Dogs extends Enumeration {
      type Dog = Value
      val Poodle, Labrador = Value
    }
To get the extra functionality from slick-pg we extend the normal PostgresDriver and say we want to map our Scala enum to the PostgreSQL one (remember to change the slick driver in application.conf to the one you've created).
    object MyPostgresDriver extends PostgresDriver with PgEnumSupport {
      override val api = new API with MyEnumImplicits {}

      trait MyEnumImplicits {
        implicit val dogTypeMapper = createEnumJdbcType("Dog", Dogs)
        implicit val dogListTypeMapper = createEnumListJdbcType("Dog", Dogs)
        implicit val dogColumnExtensionMethodsBuilder = createEnumColumnExtensionMethodsBuilder(Dogs)
        implicit val dogOptionColumnExtensionMethodsBuilder = createEnumOptionColumnExtensionMethodsBuilder(Dogs)
      }
    }
Now when you want to make a new model case class, simply use the corresponding Scala enum.
    case class User(favouriteDog: Dog)
And when you do the whole DAO table shenanigans, again you can just use it.
    class Users(tag: Tag) extends Table[User](tag, "User") {
      def favouriteDog = column[Dog]("favouriteDog")
      // Single-column projection: map to/from the User case class
      def * = favouriteDog <> (User.apply, User.unapply)
    }
Obviously you need the Scala Dog enum in scope wherever you use it.
Due to a bug in Slick, you currently can't dynamically link to a custom Slick driver in application.conf (it should work, but doesn't). This means you either need to run Play with start and forgo dynamic recompilation, or create a standalone sbt project containing just the custom Slick driver and depend on it locally.
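A sketch of the standalone-project workaround in sbt (project names and paths are hypothetical):

    // build.sbt of the Play app: depend on a sibling sbt project that
    // contains nothing but the custom Slick driver, so the driver class
    // is compiled once and not touched by Play's dynamic recompilation.
    lazy val customDriver = RootProject(file("../custom-slick-driver"))

    lazy val root = (project in file("."))
      .enablePlugins(PlayScala)
      .dependsOn(customDriver)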

Slick Write a Simple Table Creation Function

Emm... I'm trying out Slick with Play 2. The table creation process has become very frustrating because, unlike other ORMs (like Ebean), Slick doesn't detect whether the tables have already been created; if a table already exists, it reports an exception. I simply don't want to drop and re-create everything every time I restart the server, so I decided to write a small function to help me:
    def databaseCreate(tables: Table*) = {
      for (table <- tables) {
        if (MTable.getTables(table.getClass.getName).list.isEmpty) table.ddl.create
      }
    }
What this does is to take in some objects like this one:
    object Tag extends Table[(Option[Int], String)]("Tags") {
      def id = column[Int]("TAG_ID", O.PrimaryKey, O.AutoInc)
      def tag_name = column[String]("TAG_NAME")
      def * = id.? ~ tag_name
    }
And it uses the MTable method from scala.slick.jdbc.meta.MTable to find out whether the table exists. Then I ran into a simple Java reflection problem: if databaseCreate took Strings, I couldn't invoke .ddl.create, so I decided to pass in objects and use reflection: table.getClass.getName. The only problem is a type mismatch (from my IDE):
    Expected: MySQLDriver.simple.type#Table, actual: BlogData.Tag.type
BlogData is the big object I use to store all the smaller table objects. How do I solve this mismatch? Use a whole bunch of asInstanceOf? That would make the code unbearably long and ugly...
Corrected:
The type mismatch is a false alarm, which comes from the IDE, not from the Typesafe console compiler. The real problem is:
    type Table takes type parameters
    def databaseCreate(tables: Table*) = {
        ^
    one error found
Then I followed the advice and changed the code:
    def databaseCreate(tables: Table[_]*)(implicit session: Session) = {
      for (table <- tables) {
        if (MTable.getTables(table.tableName).list.isEmpty) table.ddl.create
      }
    }
Then I got this error:
    ambiguous implicit values: both value session of type slick.driver.MySQLDriver.simple.Session and method threadLocalSession in object Database of type => scala.slick.session.Session match expected type scala.slick.session.Session
    if (MTable.getTables(table.tableName).list.isEmpty) table.ddl.create
    ^
    one error found
My imports are here:
    import play.api.GlobalSettings
    import play.api.Application
    import models.BlogData._
    import scala.slick.driver.MySQLDriver.simple._
    import Database.threadLocalSession
    import play.api.Play.current
    import scala.slick.jdbc.meta.MTable
In Slick 2: For the sake of completeness, see this and that related thread.
I use the following method:
def createIfNotExists(tables: TableQuery[_ <: Table[_]]*)(implicit session: Session) {
tables foreach {table => if(MTable.getTables(table.baseTableRow.tableName).list.isEmpty) table.ddl.create}
}
Then you can just create your tables with the implicit session:
    db withSession { implicit session =>
      createIfNotExists(table1, table2, ..., tablen)
    }
You are using Slick v1, I assume, as you have Table objects. I have working code for v1. This version will drop any tables that are present before recreating them (but won't drop any tables you don't wish to recreate). It also merges the DDL into one before creation, in order to get the creation sequence correct:
    def create() = database withSession {
      import scala.slick.jdbc.{StaticQuery => Q}

      val tables = Seq(TableA, TableB, TableC)

      def existingTables = MTable.getTables.list().map(_.name.name)

      // Drop the requested tables that already exist
      tables.filter(t => existingTables.contains(t.tableName)).foreach { t =>
        Q.updateNA(s"drop table ${t.tableName} cascade").execute
      }

      // Merge the remaining DDL into one so creation order is correct
      val tablesToCreate = tables.filterNot(t => existingTables.contains(t.tableName))
      val ddl: Option[DDL] = tablesToCreate.foldLeft(None: Option[DDL]) { (ddl, table) =>
        ddl match {
          case Some(d) => Some(d ++ table.ddl)
          case _       => Some(table.ddl)
        }
      }

      ddl.foreach(_.create)
    }
Not sure why you get this error message; I would need to see your imports and the place where you call databaseCreate. What is wrong is that def databaseCreate(tables: Table*) should be def databaseCreate(tables: Table[_]*), and it should probably take a second argument list as well: def databaseCreate(tables: Table[_]*)(implicit session: Session) = ....
Also, instead of table.getClass.getName, you can use table.tableName.
You can use asTry with the table-creation DBIOAction. If the table already exists, the failed create is captured as a Try failure instead of failing the whole run; if it doesn't exist, the DBIOAction creates it.
In HomePageDAO:
    val createHomePageTableAction: DBIOAction[Int, NoStream, Effect.Schema with Effect.Write] =
      homePageTable.schema.create >> (homePageTable += Homepage("No Data", "No Data", 0l))
In ExplorePageDAO:
    val SoftwareTableCreateAction: FixedSqlAction[Unit, NoStream, Effect.Schema] =
      softwareTable.schema.create
And in CreateTablesController:
    package controllers

    import javax.inject.{Inject, Singleton}
    import play.api.mvc.{AbstractController, Action, AnyContent, ControllerComponents}
    import services.dbServices.{ExplorePageDAO, HomePageDAO}

    import scala.concurrent.ExecutionContext

    @Singleton
    class CreateTablesController @Inject()(cc: ControllerComponents)(implicit assetsFinder: AssetsFinder, executionContext: ExecutionContext)
      extends AbstractController(cc) {

      import services.dbServices.dbSandBox._

      def createTable: Action[AnyContent] = Action.async {
        dbAccess.run(HomePageDAO.createHomePageTableAction.asTry >> ExplorePageDAO.SoftwareTableCreateAction.asTry).map(_ => Ok("Table Created"))
      }
    }