Assert unique key constraint on h2 database with slick and scalatest - scala

Imagine the following scenario: You have a book that consists of ordered chapters.
First the test:
"Chapters" should "have a unique order" in
{
// val exception = intercept
db.run(
DBIO.seq
(
Chapters.add(0, 0, "Chapter #0"),
Chapters.add(0, 0, "Chapter #1")
)
)
}
Now the implementation:
case class Chapter(id: Option[Long] = None, bookId: Long, order: Long, title: String) extends Model

class Chapters(tag: Tag) extends Table[Chapter](tag, "chapters") {
  def id     = column[Option[Long]]("id", O.PrimaryKey, O.AutoInc)
  def bookId = column[Long]("book_id")
  def order  = column[Long]("order")
  def title  = column[String]("title")

  def * = (id, bookId, order, title) <> (Chapter.tupled, Chapter.unapply)

  def uniqueOrder = index("order_chapters", (bookId, order), unique = true)
  def bookFK = foreignKey("book_fk", bookId, Books.all)(_.id.get, onUpdate = ForeignKeyAction.Cascade, onDelete = ForeignKeyAction.Restrict)
}
Maybe such a unique constraint on two columns isn't even possible in H2?
Anyway:
Expectation:
An exception to be thrown that I can then intercept/expect in my test, hence a failing test for now, for violating a unique-constraint.
Actual result:
A successful test :(
Edit: Also, I use this:
implicit val defaultPatience =
  PatienceConfig(timeout = Span(30, Seconds), interval = Span(100, Millis))

db.run returns a Future.
You have to Await on it to get the result of the execution; otherwise the test body returns before the inserts have even run, and any failure is silently lost.
Try this:
import scala.concurrent.Await
import scala.concurrent.duration._

val future = db.run(...)
Await.result(future, 5.seconds)
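
Applied to the test above, a minimal sketch (reusing the Chapters.add helper from the question; the concrete exception class depends on the H2/JDBC driver version, so java.sql.SQLException is used here as a safe supertype):
import java.sql.SQLException
import scala.concurrent.Await
import scala.concurrent.duration._

"Chapters" should "have a unique order" in {
  val future = db.run(
    DBIO.seq(
      Chapters.add(0, 0, "Chapter #0"),
      Chapters.add(0, 0, "Chapter #1")
    )
  )
  // Awaiting the Future rethrows its failure, so the unique-index
  // violation surfaces and can be intercepted by ScalaTest.
  intercept[SQLException] {
    Await.result(future, 5.seconds)
  }
}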

Related

Slick - Inserting a row into two tables linked with an auto-incrementing key?

I'm new to Slick and struggling to find a good canonical example for the following.
I'd like to insert a row into two tables. The first table has a primary key which auto-increments. The second table is related to the first via its primary key.
So I'd like to:
Start a transaction
Insert a row into table 1, which generates a key
Insert a row into table 2, with a foreign key generated in the previous step
End transaction (rollback steps 2 & 3 if either fail)
Would appreciate a canonical example for the above logic, and any related suggestions on my definitions below (I'm very new to Slick!). Thanks!
Insert logic for table 1
private def insertAndReturn(entry: Entry) =
  entries returning entries.map(_.id) into ((_, newId) => entry.copy(id = newId))

def insert(entry: Entry): Future[Entry] =
  db.run(insertAndReturn(entry) += entry)
(similar for table 2)
Table 1
class EntryTable(tag: Tag) extends Table[Entry](tag, "tblEntry") {
  def id = column[EntryId]("entryID", O.PrimaryKey, O.AutoInc)
  ...
  def * = (id, ...).shaped <> (Entry.tupled, Entry.unapply)
}
Table 2
class UsernameChangeTable(tag: Tag) extends Table[UserNameChange](tag, "tblUserNameChange") {
  def entryId = column[EntryId]("entryID")
  ...
  def entry = foreignKey("ENTRY_FK", entryId, entryDao.entries)(
    _.id, onUpdate = Restrict, onDelete = Cascade
  )
}
I'm using a MySQL database and Slick 3.1.0.
All that you have to do is:
val tx =
  insertAndReturn(entry).flatMap { id =>
    insertUserNameChange(UserNameChange(id, ...))
  }.transactionally

db.run(tx)
Note that insertUserNameChange is the function which inserts the UserNameChange instance into the database. It needs the EntryId which you get back from the previous insertion action.
Compose actions using flatMap and use transactionally to run the whole query in a transaction.
Your Slick tables look fine.
Here is a canonical example implementing this functionality
package models
import scala.concurrent.{Future, Await}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration.Duration
import slick.backend.DatabasePublisher
import slick.driver.H2Driver.api._
case class Supplier1(id: Int, name: String)

class Suppliers1(tag: Tag) extends Table[Supplier1](tag, "SUPPLIERS") {
  def id: Rep[Int] = column[Int]("SUP_ID", O.PrimaryKey, O.AutoInc)
  def name: Rep[String] = column[String]("NAME")
  def * = (id, name) <> (Supplier1.tupled, Supplier1.unapply)
}

case class Coffee1(id: Int, name: String, suppId: Int)

class Coffees1(tag: Tag) extends Table[Coffee1](tag, "COFFEES") {
  def id: Rep[Int] = column[Int]("C_ID", O.PrimaryKey, O.AutoInc)
  def name: Rep[String] = column[String]("COFFEE_NAME")
  def suppId: Rep[Int] = column[Int]("SUP_ID")
  def * = (id, name, suppId) <> (Coffee1.tupled, Coffee1.unapply)
  def supplier = foreignKey("supp_fk", suppId, TableQuery[Suppliers1])(_.id)
}

object HelloSlick1 extends App {
  val db = Database.forConfig("h2mem1")

  val suppliers = TableQuery[Suppliers1]
  val coffees = TableQuery[Coffees1]

  val setUpF = (suppliers.schema ++ coffees.schema).create

  val insertSupplier = suppliers returning suppliers.map(_.id)

  //val tx = (insertSupplier += Supplier1(0, "SUPP 1")).flatMap(id => coffees += Coffee1(0, "COF", id)).transactionally
  val tx = (for {
    supId <- insertSupplier += Supplier1(0, "SUPP 1")
    _     <- coffees += Coffee1(0, "COF", supId)
  } yield ()).transactionally

  def exec[T](action: DBIO[T]): T =
    Await.result(db.run(action), Duration.Inf)

  exec(setUpF)
  exec(tx)
  exec(suppliers.result.map(println))
  exec(coffees.result.map(println))
}

How to omit column values when doing a bulk-insert slick 3.x?

I have a JOURNAL table where the INSERT_DATE column should be filled by the DB with the current date and time when the record is inserted. I did not use the TIMESTAMP type on purpose, because of its limited range.
class Journal(tag: Tag) extends Table[JournalEntry](tag, "JOURNAL") {
  def id = column[Int]("ID", O.PrimaryKey, O.AutoInc)
  def insertDate = column[OffsetDateTime]("INSERT_DATE", SqlType("DateTime default CURRENT_TIMESTAMP"))(localDateTimeColumnType)
  def valueDate = column[OffsetDateTime]("VALUE_DATE", SqlType("DateTime"))(localDateTimeColumnType)
  def amount = column[Int]("AMOUNT")
  def note = column[String]("NOTE", O.Length(100))
  def * : ProvenShape[JournalEntry] =
    (id.?, insertDate.?, valueDate, amount, note) <> ((JournalEntry.apply _).tupled, JournalEntry.unapply)
}
I also implement a case class:
case class JournalEntry(id: Option[Int], insertDate: Option[LocalDateTime],
valueDate: LocalDateTime, amount: Int, note: String)
When my app starts up, I populate the DB with random test data:
TableQuery[Journal] ++= Seq.fill(1000)(JournalEntry(None, Some(LocalDateTime.now()),
LocalDateTime.of(2006 + Random.nextInt(10), 1 + Random.nextInt(11),
1 + Random.nextInt(27),Random.nextInt(24), Random.nextInt(60)), Random.nextInt(),
TestDatabase.randomString(100)))
This works, but the INSERT_DATE is set by the JVM, not by the database. The Slick docs say that columns should be omitted if one wants the default value to be inserted. But I just don't get how to omit columns if I have a case class.
I also found this SO post but could not figure out how to use it in my context.
Any ideas?
The Slick docs give an example of such omission right in the first code snippet here. Follow the steps or cvogt's answer and you will arrive at the solution: map the insert to only the columns without database defaults (id and INSERT_DATE are left out, so the database fills them in):
TableQuery[Journal].map(je => (je.valueDate, je.amount, je.note)) ++=
  Seq.fill(1000)((
    LocalDateTime.of(2006 + Random.nextInt(10), 1 + Random.nextInt(11),
      1 + Random.nextInt(27), Random.nextInt(24), Random.nextInt(60)),
    Random.nextInt(),
    TestDatabase.randomString(100)))
I do it in the following way:
import java.sql.Timestamp
import java.time.{ZonedDateTime, ZoneOffset}
import slick.profile.SqlProfile.ColumnOption.SqlType
import scala.concurrent.duration.Duration
import scala.concurrent.Await

implicit val zonedDateTimeType = MappedColumnType.base[ZonedDateTime, Timestamp](
  { dt => Timestamp.from(dt.toInstant) },
  { ts => ZonedDateTime.ofInstant(ts.toInstant, ZoneOffset.UTC) }
)

class Users(tag: Tag) extends Table[(String, ZonedDateTime)](tag, "users") {
  def name = column[String]("name")
  def createAt = column[ZonedDateTime]("create_at", SqlType("timestamp not null default CURRENT_TIMESTAMP"))
  def * = (name, createAt)
}

val users = TableQuery[Users]

val setup = DBIO.seq(
  users.schema.create,
  users.map(u => u.name) ++= Seq("Amy", "Bob", "Chris", "Dave")
)

Await.result(db.run(setup), Duration.Inf)
I am not using a case class here, just a tuple.
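
To check that the database (and not the JVM) filled in the default, a small follow-up sketch on the users table above, assuming db is in scope:
// Read the rows back: create_at should carry the database-side
// CURRENT_TIMESTAMP rather than a value generated in the JVM.
val readBack = Await.result(db.run(users.result), Duration.Inf)
readBack.foreach { case (name, createdAt) => println(s"$name -> $createdAt") }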

Slick 3.0 bulk insert returning object's order

I want to do a bulk insert using Slick 3.0 ++= function and also using returning to return the inserted objects.
I am wondering whether the returned objects (Future[Seq[Something]]) have the same order as my argument Seq[Something] (without ids).
More specifically,
val personList: Seq[Person] = Seq(Person("name1"), Person("name2"), Person("name3"))
persons returning persons ++= personList
Will the result definitely be Future(Seq(Person(1, "name1"), Person(2, "name2"), Person(3, "name3"))), or could it come back in a different order?
Thanks.
Yes, I believe you are using an auto-incremented primary key.
I am also doing the same as you have mentioned:
case class Person(name: String, id: Option[Int] = None)

class PersonTable(tag: Tag) extends Table[Person](tag, "person") {
  val id = column[Int]("id", O.PrimaryKey, O.AutoInc)
  val name = column[String]("name")
  def * = (name, id.?) <> (Person.tupled, Person.unapply)
}

val personTableQuery = TableQuery[PersonTable]

def personTableAutoIncWithObject =
  (personTableQuery returning personTableQuery.map(_.id)).into((person, id) => person.copy(id = Some(id)))

// insert all persons without ids and return all persons with their ids
def insertAll(persons: List[Person]): Future[Seq[Person]] =
  db.run { personTableAutoIncWithObject ++= persons }

// unit test for insertion order:
test("Add new persons") {
  val response = insertAll(List(Person("A1"), Person("A2"), Person("A3"), Person("A4"), Person("A5")))
  whenReady(response) { persons =>
    assert(persons === List(Person("A1", Some(1)), Person("A2", Some(2)), Person("A3", Some(3)),
      Person("A4", Some(4)), Person("A5", Some(5))))
  }
}
As far as I know, the results of a batch insert come back in the same order as the rows you send to the database, and the plain "++=" function (without returning) gives you back the count of records inserted into the table.
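To illustrate the distinction, a small sketch reusing personTableQuery and personTableAutoIncWithObject from the answer above (types written out for clarity):
// Plain bulk insert: yields the insert count (an Option, since some
// JDBC drivers do not report row counts for batch inserts)
val countAction: DBIO[Option[Int]] =
  personTableQuery ++= Seq(Person("B1"), Person("B2"))

// Bulk insert with `returning ... into`: yields the inserted objects,
// ids filled in, in the same order as the input sequence
val rowsAction: DBIO[Seq[Person]] =
  personTableAutoIncWithObject ++= Seq(Person("C1"), Person("C2"))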

Creating table view using slick

How can I create queries for a PostgreSQL view using Slick 3?
I didn't find an answer in the Slick documentation.
The question relates to another question of mine. I got the right answer there, but I don't know how to implement it using Slick.
There is only rudimentary support for views in Slick 3; it doesn't guarantee full compile-time safety or compositionality, and the latter especially matters considering that most views strongly depend on data in other tables.
You can describe a view as a Table together with separate schema-manipulation statements, which you must use instead of the standard table schema extension methods like create and drop. Here is an example for your registries-and-rows case, provided the REGISTRY and ROWS tables are already present in the database:
case class RegRn(id: Int, name: String, count: Long)

trait View {
  val viewName = "REG_RN"
  val registryTableName = "REGISTRY"
  val rowsTableName = "ROWS"

  val profile: JdbcProfile
  import profile.api._

  class RegRns(tag: Tag) extends Table[RegRn](tag, viewName) {
    def id    = column[Int]("REGISTRY_ID")
    def name  = column[String]("NAME", O.SqlType("VARCHAR"))
    def count = column[Long]("CT", O.SqlType("VARCHAR"))
    override def * = (id, name, count) <> (RegRn.tupled, RegRn.unapply)
    ...
  }

  val regRns = TableQuery[RegRns]

  val createViewSchema = sqlu"""CREATE VIEW #$viewName AS
    SELECT R.*, COALESCE(N.ct, 0) AS CT
    FROM #$registryTableName R
    LEFT JOIN (
      SELECT REGISTRY_ID, count(*) AS CT
      FROM #$rowsTableName
      GROUP BY REGISTRY_ID
    ) N ON R.REGISTRY_ID = N.REGISTRY_ID"""

  val dropViewSchema = sqlu"DROP VIEW #$viewName"

  ...
}
You can now create the view with db.run(createViewSchema), drop it with db.run(dropViewSchema), and MTable.getTables("REG_RN") will, as expected, report its tableType as "VIEW". Queries are the same as for other tables, e.g.
db.run(regRns.result.head). You can even insert values into a view as you would for a normal Slick table, if the rules allow it (not in your case, due to COALESCE and the subquery).
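Put together, a small sketch (assuming a configured db and an implicit ExecutionContext in scope, with the View trait mixed in):
import scala.concurrent.Await
import scala.concurrent.duration.Duration

// Create the view, read one row back through the lifted Table, then drop it again.
val roundTrip = for {
  _    <- db.run(createViewSchema)
  head <- db.run(regRns.result.head)
  _    <- db.run(dropViewSchema)
} yield head

println(Await.result(roundTrip, Duration.Inf))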
As I mentioned, everything becomes a mess when you want to compose existing Tables to create a view. You will have to keep their names and definitions in sync by hand, as it is currently not possible to write anything that would guarantee, for example, that the shape of the view conforms to the combined shape of the underlying tables. Well, there is no way apart from ugly ones like this:
trait View {
  val profile: JdbcProfile
  import profile.api._

  val registryTableName = "REGISTRY"
  val registryId = "REGISTRY_ID"
  val registryName = "NAME"

  class Registries(tag: Tag) extends Table[Registry](tag, registryTableName) {
    def id   = column[Int](registryId)
    def name = column[String](registryName, O.SqlType("VARCHAR"))
    override def * = (id, name) <> (Registry.tupled, Registry.unapply)
    ...
  }

  val rowsTableName = "ROWS"
  val rowsId = "ROW_ID"
  val rowsRow = "ROW"

  class Rows(tag: Tag) extends Table[Row](tag, rowsTableName) {
    def id  = column[String](rowsId, O.SqlType("VARCHAR"))
    def rid = column[Int](registryId)
    def r   = column[String](rowsRow, O.SqlType("VARCHAR"))
    override def * = (id, rid, r) <> (Row.tupled, Row.unapply)
    ...
  }

  val viewName = "REG_RN"

  class RegRns(tag: Tag) extends Table[RegRn](tag, viewName) {
    def id    = column[Int]("REGISTRY_ID")
    def name  = column[String]("NAME", O.SqlType("VARCHAR"))
    def count = column[Long]("CT", O.SqlType("VARCHAR"))
    override def * = (id, name, count) <> (RegRn.tupled, RegRn.unapply)
    ...
  }

  val registries = TableQuery[Registries]
  val rows = TableQuery[Rows]
  val regRns = TableQuery[RegRns]

  val createViewSchema = sqlu"""CREATE VIEW #$viewName AS
    SELECT R.*, COALESCE(N.ct, 0) AS CT
    FROM #$registryTableName R
    LEFT JOIN (
      SELECT #$registryId, count(*) AS CT
      FROM #$rowsTableName
      GROUP BY #$registryId
    ) N ON R.#$registryId = N.#$registryId"""

  val dropViewSchema = sqlu"DROP VIEW #$viewName"

  ...
}
What about appending the query text after the view preamble?
val yourAwesomeQryComposition: TableQuery = ...
val qryText = yourAwesomeQryComposition.map(reg => (reg.id, ...)).result.statements.head
val createViewSchema = sqlu"""CREATE VIEW #$viewName AS #$qryText"""
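
For instance, with the registries query and viewName defined in the previous snippet (a sketch; the generated SQL is profile-specific, so inspect it before relying on it, and substitute your own join or aggregation for the trivial projection used here):
// Derive the view body from the lifted query, so the column set stays
// in sync with the Table definition instead of hand-written SQL.
val viewQuery = registries.map(r => (r.id, r.name))
val qryText = viewQuery.result.statements.head
val createViewFromQuery = sqlu"""CREATE VIEW #$viewName AS #$qryText"""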

Slick 2: Delete row in slick with play framework

This requirement should be really easy, but I don't know why it is not working. I want to delete a row based on its id, using Slick with Play Framework.
I'm following this example from the play-slick module, but the compiler complains that value delete is not a member of scala.slick.lifted.Query[models.Tables.MyEntity,models.Tables.MyEntity#TableElementType].
My controller looks like:
def delete(id: Int) = DBAction { implicit rs =>
  val q = MyEntity.where(_.id === id)
  q.delete
  Ok("Entity deleted")
}
I've imported the play.api.db.slick.Config.driver.simple._
What am I doing wrong?
Edit:
My schema definition looks like:
class Cities(tag: Tag) extends Table[CityRow](tag, "cities") {
  def * = (cod, name, state, lat, long, id) <> (CityRow.tupled, CityRow.unapply)
  /** Maps whole row to an option. Useful for outer joins. */
  def ? = (cod.?, name.?, state.?, lat, long, id.?).shaped.<>({ r => import r._; _1.map(_ => CityRow.tupled((_1.get, _2.get, _3.get, _4, _5, _6.get))) }, (_: Any) => throw new Exception("Inserting into ? projection not supported."))
  val cod: Column[String] = column[String]("cod")
  val name: Column[String] = column[String]("name")
  val state: Column[Int] = column[Int]("state")
  val lat: Column[Option[String]] = column[Option[String]]("lat")
  val long: Column[Option[String]] = column[Option[String]]("long")
  val id: Column[Int] = column[Int]("id", O.AutoInc, O.PrimaryKey)
  /** Foreign key referencing Departamentos (database name fk_ciudades_departamentos1) */
  lazy val stateFk = foreignKey("fk_cities_states", state, States)(r => r.idstate, onUpdate = ForeignKeyAction.NoAction, onDelete = ForeignKeyAction.NoAction)
}
I also had a look at that example some time ago and it looked wrong to me too; I'm not sure whether I was doing something wrong myself or not. The delete function was always a bit tricky to get right, especially using the lifted.Query (like you are doing). Later in the process I made it work by importing the right driver, in my case scala.slick.driver.PostgresDriver.simple._.
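For illustration, a hedged sketch of the controller with the driver import from this answer (it assumes play-slick's DBAction exposes the session as rs.dbSession, which is passed explicitly rather than relying on implicit scope; adapt the driver to your database):
import scala.slick.driver.PostgresDriver.simple._

def delete(id: Int) = DBAction { implicit rs =>
  // `delete` is added to Query by the driver's simple._ import (queryToDeleteInvoker)
  MyEntity.where(_.id === id).delete(rs.dbSession)
  Ok("Entity deleted")
}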
Edit after comment:
Probably you have an error in the shape function, hard to say without looking at your schema declaration. This is an example:
case class CityRow(id: Long, name: String)

class City(tag: Tag) extends Table[CityRow](tag, "city") {
  def * = (id, name) <> (CityRow.tupled, CityRow.unapply)   // <- this is the shape function

  def ? = (id.?, name).shaped.<>({ r => import r._
    _1.map(_ => CityRow.tupled((_1.get, _2)))
  }, (_: Any) => throw new Exception("Inserting into ? projection not supported."))

  val id: Column[Long] = column[Long]("id", O.AutoInc, O.PrimaryKey)
  val name: Column[String] = column[String]("name")
}