I have the following code, which inserts a row into a table named luczekInfo, plus a function to get data from the database. My question is how to write a function that updates columns of luczekInfo on the rows returned by the get(id) function. What is the best way to update column values in Slick?
def create(profil: luczekInfo): Either[Failure, luczekInfo] = {
  try {
    val id = db.withSession {
      LuczekInfo returning LuczekInfo.id insert profil
    }
    Right(profil.copy(id = Some(id)))
  } catch {
    case e: SQLException =>
      Left(databaseError(e))
  }
}
def get(id: Int): Either[Failure, luczekInfo] = {
  try {
    db.withSession {
      LuczekInfo.findById(id).firstOption match {
        case Some(profil: luczekInfo) =>
          Right(profil)
        case _ =>
          Left(notFoundError(id))
      }
    }
  } catch {
    case e: SQLException =>
      Left(databaseError(e))
  }
}
Thanks in advance for answers.
Slick 2.X
You can update a row in two ways (as far as I know). The first is to create a row object of type luczekInfo#TableElementType and use it to update the full row:
// update returns the affected row count, hence the > 0 to get a Boolean
def updateById(id: Long, row: luczekInfo#TableElementType)(implicit s: Session): Boolean =
  luczekInfo.filter(_.id === id).update(row) > 0
Or you can update single fields using:
def updateNameById(mId: Long, mName: String)(implicit s: Session) = {
  val q = for { l <- luczekInfo if l.id === mId } yield l.name
  q.update(mName)
}
Here I assumed your table has a field called name.
You can also find this in the Slick documentation, in the section on updating.
Slick 3.1.X
there is additional support for the insertOrUpdate (upsert) operation:
luczekInfo.insertOrUpdate(row)
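For Slick 3 the single-field update looks much the same but produces an action that you run against the database. A minimal sketch, assuming a db: Database, the lifted luczekInfo table with id and name columns as above, and the profile's api._ import:

import scala.concurrent.Future

// Sketch only: sets the name column of the row with the given id and
// returns the number of affected rows; insertOrUpdate is run the same way,
// e.g. db.run(luczekInfo.insertOrUpdate(row)).
def updateNameById(mId: Long, mName: String): Future[Int] =
  db.run(luczekInfo.filter(_.id === mId).map(_.name).update(mName))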
Related
I'm using Scanamo to query a DynamoDB table.
I want to create a healthcheck method that returns true as long as DynamoDB can be scanned (I don't care whether there are records in the table, I just want to know that the table is there and can be scanned), and returns false if the table is not present (I fudge the table name as a test).
This is the Scala code I have right now:
trait DynamoTestTrait extends AbstractDynamoConfig {
  def test(): Future[List[Either[DynamoReadError, T]]] =
    ScanamoAsync.exec(client)(table.consistently.limit(1).scan())
}
and it works when the table is present:
val r = reg.test()
  .map(
    _.headOption.forall(_.isRight)
  )
val e = Await.result(r, scala.concurrent.duration.Duration(5, "seconds"))
but when the table name is incorrect I get an unexpected error:
Unexpected server error: 'Cannot do operations on a non-existent table (Service: AmazonDynamoDBv2; Status Code: 400; Error Code: ResourceNotFoundException;
I would have thought that I was trapping that error in my Left.
How can I trap this error? Does Scala have a try/catch construct I could use?
I also tried the following, but the error still gets thrown rather than caught by my catch, it seems:
trait DynamoTestTrait extends AbstractDynamoConfig {
  def test(): Future[Boolean] =
    try {
      ScanamoAsync
        .exec(client)(table.consistently.limit(1).scan())
        .map(
          _.headOption.forall(_.isRight)
        )
    } catch {
      case e: ResourceNotFoundException => Future.successful(false)
    }
}
This seems more idiomatic Scala, but it always returns true:
trait DynamoTestTrait extends AbstractDynamoConfig {
  def test: Future[Boolean] = {
    val result = Try(ScanamoAsync
      .exec(client)(table.consistently.limit(1).scan())
      .map(
        _.headOption.forall(_.isRight)
      ))
    result match {
      case Success(v) => Future.successful(true)
      case Failure(e) => Future.successful(false)
    }
  }
}
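The ResourceNotFoundException is not thrown while the Future is being constructed; it surfaces later as a failed Future, which is why neither the try/catch nor the Try around the call ever sees it. A minimal sketch of handling it with Future.recover (assuming an implicit ExecutionContext in scope, as the existing .map already requires, and the AWS SDK v1 exception class):

import scala.concurrent.Future
import com.amazonaws.services.dynamodbv2.model.ResourceNotFoundException

trait DynamoTestTrait extends AbstractDynamoConfig {
  // true if the table can be scanned, false if DynamoDB reports it missing
  def test(): Future[Boolean] =
    ScanamoAsync
      .exec(client)(table.consistently.limit(1).scan())
      .map(_.headOption.forall(_.isRight))
      .recover { case _: ResourceNotFoundException => false }
}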
I need to insert into a table and, based on the returned id, insert into other tables. I am unable to run a for-comprehension over both a Scala collection and a database action.
def savebook(book: Book, bookReaders: Seq[BookReaders]) = {
  val transformedBookReaders = Seq[BookReaders]()
  val action1 = bookDAO.insertRow(book).map { id =>
    transformedBookReaders :+ bookReaders.map { bookReader =>
      new BookAssociation(None, id, bookReader.moduleId, bookReader.userId, bookReader.roleId)
    }
  }
  val action2 = bookAssocDao.insertRows(transformedBookReaders)
  db.run(action1.andThen(action2).transactionally)
}
My second action is not inserting any row.
If action1 has not run at the time action2 is constructed, then transformedBookReaders will be empty and nothing will happen. If you change action1 to produce the collection of BookAssociation values instead of depending on side effects, then action2 can be constructed from that result:
def savebook(book: Book, bookReaders: Seq[BookReaders]) = {
  val action1 = bookDAO.insertRow(book).map { id =>
    bookReaders.map { bookReader =>
      new BookAssociation(None, id, bookReader.moduleId, bookReader.userId, bookReader.roleId)
    }
  }
  val action2 = (readers: Seq[BookAssociation]) => bookAssocDao.insertRows(readers)
  db.run(action1.flatMap(action2).transactionally)
}
I may have misunderstood what insertRow() and insertRows() do.
My requirement was to run non-database code inside a for-comprehension over database actions: I needed to iterate over a Scala collection while performing the DB transaction code in the for-comprehension.
I found a link (http://virtuslab.slides.com/pdolega/slick-101#/94) which explains very nicely how to run non-database code in such for-comprehensions by using DBIO.successful:
def saveBook(book: Book, bookModules: Seq[BookModule]) = {
  def createBookModuleMapping(bookId: Int) = {
    bookModules.map { bookModule =>
      new BookModule(None, bookId, bookModule.moduleId, bookModule.isActive)
    }
  }
  val action1 = for {
    bookId <- bookDao.insert(book)
    bookModule <- DBIO.successful(createBookModuleMapping(bookId))
    ids <- bookModuleDao.insertRows(bookModule)
  } yield ()
  db.run(action1.transactionally)
}
I want to update a certain column in a row only if the row has valid data. Being specific: I have a table Event which stores start, stop and an isActive flag.
I would like to activate some Events by setting isActive to true; however, I need to check that the start and stop dates are valid.
model:
case class Event(start: DateTime, stop: DateTime, isActive: Boolean)
my validation method signature:
validateEvent(ev: Event): Boolean
My first approach:
def activateEv() = Action.async(parse.json) {
  request => {
    ...
    val ev = db.run(dao.findEvById(poid, uid))
    val ret = ev.flatMap {
      case st: Option[Event] => if (validateEvent(st.get)) {
        db.run(dao.updateActivity(poid, true).map {
          case 0 => false
          case other => true
        })
      } else Future(false)
    }
    ...
  }
}
I believe that this is not the way this problem should be addressed.
Could you advise?
Maybe only one db.run would be sufficient?
This can be achieved in a single db.run using combinators (e.g. flatMap) on DBIOAction objects. Assuming your dao methods look like this:
case object dao {
  def findEvById(poid: Int, uid: Int): DBIOAction[Option[Event], NoStream, Effect.Read] = ???

  // In your case `updateActivity` returns an `Int` and you're mapping it to a `Boolean`.
  // That mapping could be placed here, so `updateActivity` would return `Boolean` now.
  def updateActivity(poid: Int, bool: Boolean): DBIOAction[Boolean, NoStream, Effect.Write] = ???
}
This is how we can achieve the thing you're looking for:
...
val action = dao.findEvById(poid, uid).flatMap {
  case Some(event) if validateEvent(event) => dao.updateActivity(poid, true)
  case _ => DBIO.successful(false)
}.transactionally
db.run(action)
...
As you can see, we have a transaction here which performs a select followed by an update (only if the event is valid). Moreover, that whole select-then-update action could be a separate method in your dao.
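For example, a sketch of such a dao method, assuming validateEvent is reachable from the dao and the usual profile api._ import is in place for transactionally:

import scala.concurrent.ExecutionContext
import slick.dbio.DBIO

// Selects the event, validates it and updates isActive in one transaction;
// returns false when the event is missing or invalid.
def activateIfValid(poid: Int, uid: Int)(implicit ec: ExecutionContext): DBIO[Boolean] =
  findEvById(poid, uid).flatMap {
    case Some(event) if validateEvent(event) => updateActivity(poid, true)
    case _                                   => DBIO.successful(false)
  }.transactionally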
I am looking to use the Slick 3 framework for a Scala application to manage database interactions. I have been able to automatically generate the necessary table objects using Slick, but I also would like an integration test that verifies that the schemas in the database match the schemas in the objects. This is because sometimes tables get altered without my team being alerted, and so we would prefer to catch the change in an integration test instead of a production application.
One way to do this is to simply run a select query on every single table in a test runner. However, I feel like there should be a more direct way. Furthermore, it is not clear to me how to systematically run through all the tables defined in the file, except to manually append the table object to some sequence the test runner moves through. I notice that there is a schema field, but it only has the ability to generate create and drop statements.
Any help would be greatly appreciated. Thank you!
EDIT:
Here is my solution, but I was hoping for a better one:
class TablesIT extends FunSuite with BeforeAndAfter with ScalaFutures {
  var db: Database = _
  before { db = Database.forURL( /* personal details */ ) }

  // Object borrowed from http://stackoverflow.com/questions/20262036/slick-query-multiple-tables-databases-with-getting-column-names
  object ResultMap extends GetResult[Map[String, Any]] {
    def apply(pr: PositionedResult) = {
      val rs = pr.rs // <- jdbc result set
      val md = rs.getMetaData
      val res = (1 to pr.numColumns).map { i => md.getColumnName(i) -> rs.getObject(i) }.toMap
      pr.nextRow // <- use Slick's advance method to avoid endless loop
      res
    }
  }

  def testTableHasCols[A <: Table[_]](table: slick.lifted.TableQuery[A]): Unit = {
    whenReady(db.run(table.take(1).result.headOption.asTry)) {
      case Success(t) => t match {
        case Some(r) => logTrace(r.toString)
        case None => logTrace("Empty table")
      }
      case Failure(ex) => fail("Query exception: " + ex.toString)
    }
  }

  def plainSqlSelect[A](query: String)(implicit gr: GetResult[A]): Future[Seq[A]] = {
    val stmt = sql"""#$query""".as[A]
    db.run(stmt)
  }

  def compareNumOfCols[A <: Table[_]](table: slick.lifted.TableQuery[A]) = {
    val tableName = table.baseTableRow.tableName
    val selectStar = whenReady(db.run(sql"""select * from #$tableName limit 1""".as(ResultMap).headOption)) {
      case Some(m) => m.size
      case None => 0
    }
    val model = whenReady(db.run(sql"""#${table.take(1).result.statements.head}""".as(ResultMap).headOption)) {
      case Some(m) => m.size
      case None => 0
    }
    assert(selectStar === model, "The number of columns does not match")
  }

  test("Test table1") {
    testTableHasCols(Table1)
    compareNumOfCols(Table1)
  }
  // And on for each table
}
I ended up devising a better solution that uses the following idea. It is more or less the same, and unfortunately I still have to manually create a test for each table, but the method is cleaner, I think. Note, however, that this only works for PostgreSQL because of the information schema, but other database systems have other methods.
class TablesIT extends FunSuite with BeforeAndAfter with ScalaFutures {
  var db: Database = _
  before { db = Database.forURL( /* personal details */ ) }

  def testTableHasCols[A <: Table[_]](table: slick.lifted.TableQuery[A]): Unit = {
    whenReady(db.run(table.take(1).result.headOption.asTry)) {
      case Success(t) => t match {
        case Some(r) => logTrace(r.toString)
        case None => logTrace("Empty table")
      }
      case Failure(ex) => fail("Query exception: " + ex.toString)
    }
  }

  def compareNumOfCols[A <: Table[_]](table: slick.lifted.TableQuery[A]) = {
    val tableName = table.baseTableRow.tableName
    val selectStar = whenReady(db.run(sql"""select column_name from information_schema.columns where table_name='#$tableName'""".as[String])) {
      case m: Seq[String] => m.size
      case _ => 0
    }
    val model = table.baseTableRow.create_*.map(_.name).toSeq.size
    assert(selectStar === model, "The number of columns does not match")
  }

  test("Test table1") {
    testTableHasCols(Table1)
    compareNumOfCols(Table1)
  }
  // And on for each table
}
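If the one-test-per-table boilerplate bothers you, one possible (untested) variation is to register the tests from a sequence of TableQuery values inside the suite; Table1 and Table2 below stand in for your generated tables:

// Hypothetical: generate one test per table from a single list.
val tablesToCheck: Seq[slick.lifted.TableQuery[_ <: Table[_]]] = Seq(Table1, Table2)

tablesToCheck.foreach { t =>
  test(s"Schema check for ${t.baseTableRow.tableName}") {
    testTableHasCols(t)
    compareNumOfCols(t)
  }
}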
I'm trying ScalaQuery, and it is really amazing. I can define a database table using a Scala class and query it easily.
But I would like to know, in the following code, how I can check whether a table exists, so that I won't call 'Table.ddl.create' twice and get an exception when I run this program a second time.
object Users extends Table[(Int, String, String)]("Users") {
  def id = column[Int]("id")
  def first = column[String]("first")
  def last = column[String]("last")
  def * = id ~ first ~ last
}

object Main {
  val database = Database.forURL("jdbc:sqlite:sample.db", driver = "org.sqlite.JDBC")

  def main(args: Array[String]) {
    database withSession {
      // How could I know that table Users is already in the DB?
      if ( ??? ) {
        Users.ddl.create
      }
    }
  }
}
ScalaQuery version 0.9.4 includes a number of helpful SQL metadata wrapper classes in the org.scalaquery.meta package, such as MTable:
http://scalaquery.org/doc/api/scalaquery-0.9.4/#org.scalaquery.meta.MTable
In the test code for ScalaQuery, we can see examples of these classes being used. In particular, see org.scalaquery.test.MetaTest.
I wrote this little function to give me a map of all the known tables, keyed by table name.
import org.scalaquery.meta.MTable

def makeTableMap(dbsess: Session): Map[String, MTable] = {
  val tableList = MTable.getTables.list()(dbsess)
  val tableMap = tableList.map { t => (t.name.name, t) }.toMap
  tableMap
}
So now, before I create an SQL table, I can check "if (!tableMap.contains(tableName))".
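Applied to the Users table from the question, the check could look roughly like this (a sketch using the explicit-Session form of withSession):

database withSession { session: Session =>
  // create the table only if the metadata does not list it yet
  if (!makeTableMap(session).contains("Users")) {
    Users.ddl.create(session)
  }
}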
This thread is a bit old, but maybe someone will find this useful. All my DAOs include this:
def create = db withSession {
  if (!MTable.getTables.list.exists(_.name.name == MyTable.tableName))
    MyTable.ddl.create
}
Here's a full solution for Play Framework that checks on application start, using a PostgreSQL DB:
import globals.DBGlobal
import models.UsersTable
import org.scalaquery.meta.MTable
import org.scalaquery.session.Session
import play.api.GlobalSettings
import play.api.Application

object Global extends GlobalSettings {

  override def onStart(app: Application) {
    DBGlobal.db.withSession { session: Session =>
      import org.scalaquery.session.Database.threadLocalSession
      import org.scalaquery.ql.extended.PostgresDriver.Implicit._
      if (!makeTableMap(session).contains("tableName")) {
        UsersTable.ddl.create(session)
      }
    }
  }

  def makeTableMap(dbsess: Session): Map[String, MTable] = {
    val tableList = MTable.getTables.list()(dbsess)
    val tableMap = tableList.map { t => (t.name.name, t) }.toMap
    tableMap
  }
}
You can also do this with java.sql.DatabaseMetaData (an interface). Depending on your database, more or fewer of its methods may be implemented.
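A plain-JDBC sketch of that approach, assuming you can get hold of the underlying java.sql.Connection (for example from the current session):

import java.sql.Connection

// Returns true if the database metadata lists a table with the given name.
// Note that some databases report unquoted identifiers upper- or lower-cased,
// so the name may need adjusting.
def tableExists(conn: Connection, name: String): Boolean = {
  val rs = conn.getMetaData.getTables(null, null, name, Array("TABLE"))
  try rs.next() finally rs.close()
}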
See also the related discussion here. I personally prefer hezamu's suggestion and extend it as follows to keep it DRY:
def createIfNotExists(tables: TableQuery[_ <: Table[_]]*)(implicit session: Session) {
  tables foreach { table =>
    if (MTable.getTables(table.baseTableRow.tableName).list.isEmpty) table.ddl.create
  }
}
Then you can just create your tables with the implicit session:
db withSession { implicit session =>
  createIfNotExists(table1, table2, ..., tablen)
}
In your DAO implementation you can define the following method (taken from Slick MTable.getTables always fails with Unexpected exception[JdbcSQLException: Invalid value 7 for parameter columnIndex [90008-60]]), which gives you true or false depending on whether a given table is defined in your db:
def checkTable(): Boolean = {
  val action = MTable.getTables
  val future = db.run(action)
  val tables = Await.result(future, Duration.Inf)
  tables.nonEmpty
}
Or you can check whether some "GIVENTABLENAME" exists by printing the MTable list:
def printTable() = {
  val q = db.run(MTable.getTables)
  println(Await.result(q, Duration.Inf).toList(0)) // prints the first MTable element
  println(Await.result(q, Duration.Inf).toList(1)) // prints the second MTable element
  println(Await.result(q, Duration.Inf).toList.toString.contains("MTable(MQName(public.GIVENTABLENAME_pkey),INDEX,null,None,None,None)"))
}
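If you only care about one specific table, a sketch that checks by name instead of string-matching the printed output (same db, imports and Await as above; the name must match how your database reports it):

def tableExists(name: String): Boolean = {
  val tables = Await.result(db.run(MTable.getTables), Duration.Inf)
  tables.exists(_.name.name == name)
}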
Don't forget to add
import slick.jdbc.meta._
Then call the methods from anywhere with the usual @Inject(). I am using Play 2.4 and play-slick 1.0.0.
Cheers,