I'm using Scala 2.11 and Slick 2.1.0, and I have code that compiles:
trait TSegmentClient { this: Profile =>
  import profile.simple._

  class SegmentClients(tag: Tag) extends Table[(Int, Long)](tag, "seg") {
    def segmentId = column[Int]("segment_id")
    def clientId = column[Long]("client_id")
    def * = (segmentId, clientId)
  }
}
segmentClients.insert(clientBehaviors.map(c => (1, c.clientId)))
It works.
But I need a case class like this:
case class SegmentClient(segmentId: Int, clientId: Long)
trait TSegmentClient { this: Profile =>
  import profile.simple._

  class SegmentClients(tag: Tag) extends Table[SegmentClient](tag, "seg") {
    def segmentId = column[Int]("segment_id")
    def clientId = column[Long]("client_id")
    def * = (segmentId, clientId) <> (SegmentClient.tupled, SegmentClient.unapply)
  }
}
segmentClients.insert(clientBehaviors.map(c => (1, c.clientId)))
But it doesn't compile.
(value: models.coper.datamining.SegmentClient)(implicit session:
scala.slick.jdbc.JdbcBackend#SessionDef)Int cannot be applied to
(scala.slick.lifted.Query[(scala.slick.lifted.Column[Int],
scala.slick.lifted.Column[Long]),(Int, Long),Seq])
segmentClients.insert(clientBehaviors.map(c => (segmentId, c.clientId)))
What is wrong with my code?
You can do this using another projection to a tuple that is not mapped to a case class.
case class SegmentClient(segmentId: Int, clientId: Long)
trait TSegmentClient { this: Profile =>
  import profile.simple._

  class SegmentClients(tag: Tag) extends Table[SegmentClient](tag, "seg") {
    def segmentId = column[Int]("segment_id")
    def clientId = column[Long]("client_id")
    def tuple = (segmentId, clientId)
    def * = tuple <> (SegmentClient.tupled, SegmentClient.unapply)
  }
}
segmentClients.map(_.tuple).insert(clientBehaviors.map(c => (1, c.clientId)))
The insert method on segmentClients in your second example expects a SegmentClient instance, since SegmentClients is now a mapped table; that's essentially what the compiler error message is saying. I don't know whether there's a more idiomatic approach, since I don't know Slick too well, but as a workaround you could also load the rows and insert them from the application side:
val behaviours = clientBehaviours.list.map(c => SegmentClient(1, c.clientId))
segmentClients.insertAll(behaviours: _*)
Related
I'm having trouble using Slick's MappedColumnType; the code snippet is as follows:
private trait UserTable {
  self: HasDatabaseConfigProvider[JdbcProfile] =>

  import driver.api._

  lazy val userTable = TableQuery[UserTable]

  class UserTable(tag: Tag) extends Table[User](tag, "user") {
    implicit def mapper = MappedColumnType.base[JsObject, String](
      { scope: JsObject => scope.toString }, { s: String => Json.parse(s).as[JsObject] }
    )
    val id = column[Long]("id", O.PrimaryKey, O.AutoInc)
    val name = column[String]("name")
    val hobby = column[JsObject]("hobby")
    def * = (id, name, hobby) <> (User.tupled, User.unapply)
  }
}
My User case class is defined as follows:
case class User(id: Long, name: String, hobby: JsObject)
I have corresponding insert and update statements for my database. However, the following update statement is not working for me:
def updateQuery(id: Long, newUser: User) = {
  userTable.filter(x => x.id === id)
    .map(x => x.hobby)
    .update(newUser.hobby)
}
It will throw a compile error:
No matching Shape found.
Slick does not know how to map the given types.
Possible causes: T in Table[T] does not match your * projection. Or you use an unsupported type in a Query (e.g. scala List).
I think it's pretty straightforward. Is there something I did wrong?
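(A hedged guess, not a confirmed answer: the implicit JsObject mapper is declared inside the table class, so it is visible when the * projection is type-checked but not where the query's .map/.update is compiled, which is exactly where Slick needs a mapping for the JsObject column. A minimal sketch of lifting it into the enclosing trait, reusing the names from the snippet above:)

private trait UserTable {
  self: HasDatabaseConfigProvider[JdbcProfile] =>

  import driver.api._

  // now in implicit scope both for the * projection and for queries built outside the table class
  implicit val jsObjectColumnType = MappedColumnType.base[JsObject, String](
    _.toString,
    s => Json.parse(s).as[JsObject]
  )

  class UserTable(tag: Tag) extends Table[User](tag, "user") {
    val id = column[Long]("id", O.PrimaryKey, O.AutoInc)
    val name = column[String]("name")
    val hobby = column[JsObject]("hobby")
    def * = (id, name, hobby) <> (User.tupled, User.unapply)
  }

  lazy val userTable = TableQuery[UserTable]

  def updateQuery(id: Long, newUser: User) = {
    userTable.filter(x => x.id === id)
      .map(x => x.hobby)
      .update(newUser.hobby)
  }
}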
I need to use Slick 3.1.1 for a Postgres-based project, but I'm having a hard time writing clean code for the following super simple use case:
Assume I have a Task model:
case class Task(id: Option[UUID], foo: Int, bar: String)
The id: UUID is the primary key, so I should NOT provide it (id = None) when doing a database INSERT. However, I do need it when doing a GET, which maps a database row to a Task object.
Therefore, the Slick Table class becomes very ugly:
class Tasks(tag: Tag) extends Table[Task](tag, "tasks") {
  def id = column[UUID]("id", O.SqlType("UUID"), O.PrimaryKey)
  def foo = column[Int]("foo")
  def bar = column[String]("bar")

  def insert: MappedProjection[Task, (Int, String)] =
    (foo, bar).shaped.<>(
      { tuple =>
        Task(None, tuple._1, tuple._2)
      }, { (task: Task) =>
        Task.unapply(task).map { tuple =>
          (tuple._2, tuple._3)
        }
      }
    )

  override def * : ProvenShape[Task] =
    (id.?, foo, bar).shaped.<>(Task.tupled, Task.unapply)
}
If case class Task has 10 fields, I then have to write (tuple._1, tuple._2, tuple._3, ...). My co-workers will slap my face if I submit a PR like the above. Please suggest!
If you let the database generate your IDs, that Table definition can be shortened significantly:
import slick.driver.PostgresDriver.api._
import java.util.UUID

case class Task(id: Option[UUID], foo: Int, bar: String)

class Tasks(tag: Tag) extends Table[Task](tag, "tasks") {
  def id = column[Option[UUID]]("id", O.SqlType("UUID"), O.PrimaryKey, O.AutoInc)
  def foo = column[Int]("foo")
  def bar = column[String]("bar")
  def * = (id, foo, bar) <> (Task.tupled, Task.unapply)
}
This can be improved further by moving the id field to the end of the case class and giving it a default value of None. That way you don't have to provide it every time you instantiate a Task:
case class Task(foo: Int, bar: String, id: Option[UUID] = None)
val firstTask = Task(123, "John")
val secondTask = Task(456, "Paul")
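One thing to note (my addition, not part of the original answer): if you reorder the case class fields like this, the * projection has to be reordered to match, i.e. def * = (foo, bar, id) <> (Task.tupled, Task.unapply). A short usage sketch, assuming the database really does fill in the UUID (e.g. via a gen_random_uuid() column default); the tasks value is my own name:

val tasks = TableQuery[Tasks]

// insert without an id and read the generated UUID back
val insertAndReturnId: DBIO[Option[UUID]] =
  (tasks returning tasks.map(_.id)) += Task(123, "John")

// plain insert; the Option/AutoInc id column is omitted and filled by the database
val plainInsert: DBIO[Int] = tasks += Task(456, "Paul")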
I'm having some difficulties querying/filtering in Slick 2.1.0 when using a custom column type.
A simplified version of my problem:
import scala.slick.driver.MySQLDriver.simple._

sealed class Status(val intValue: Int)
case object Active extends Status(1)
case object Disabled extends Status(2)
case object Deleted extends Status(3)

case class TableMapping(id: Long, status: Status)

class MyTableDefinition(tag: Tag) extends Table[TableMapping](tag, "sometable") {
  implicit val statusColumnType = MappedColumnType.base[Status, Int](statusToInt, intToStatus)

  def id = column[Long]("ID", O.PrimaryKey, O.AutoInc)
  def status = column[Status]("STATUS", O.NotNull, O.Default(Active))
  def * = (id, status) <> (TableMapping.tupled, TableMapping.unapply)

  private def statusToInt(s: Status): Int = s.intValue

  private def intToStatus(i: Int): Status = i match {
    case 1 => Active
    case 2 => Disabled
    case _ => Deleted
  }
}
class MyTableDao {
  val Items = TableQuery[MyTableDefinition]

  def byId(id: Long)(implicit session: Session): Option[TableMapping] = {
    Items.filter(_.status =!= Deleted).firstOption
  }
}
I get a compile error on this:
Items.filter(_.status =!= Deleted).firstOption
The error states:
value =!= is not a member of scala.slick.lifted.Column[Status]
[error] def byId(id: Long)(implicit session: Session): Option[TableMapping] =
Items.filter(_.status =!= Deleted).firstOption
Any ideas of what I'm doing wrong? Maybe there is a much better way of doing this that I'm not aware of?
The thing is that the Scala compiler will look for an implicit conversion for Deleted.type instead of Status.
Since Deleted is declared as an object, its type is the singleton type Deleted.type, so you just have to help the compiler understand that it is actually a Status. How? You can try:
class MyTableDao {
  val Items = TableQuery[MyTableDefinition]

  def byId(id: Long)(implicit session: Session): Option[TableMapping] =
    Items.filter(_.status =!= Deleted.asInstanceOf[Status]).firstOption
}
That'll do it.
Let me know if it works; I was facing a similar problem and was able to get rid of it this way.
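(A small aside that is not part of the original answer: a type ascription achieves the same widening without a cast, and it still relies on the Status column type mapper being visible at the call site, as the next answer explains.)

// upcast the singleton object to Status via a type ascription instead of asInstanceOf
Items.filter(_.status =!= (Deleted: Status)).firstOption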
Your custom type mapper needs to be in scope for the DAO; I'd do something like:
trait MyTypeMapper {
  protected implicit val statusColumnType =
    MappedColumnType.base[Status, Int](_.intValue, intToStatus)

  private def intToStatus(i: Int): Status = i match {
    case 1 => Active
    case 2 => Disabled
    case _ => Deleted
  }
}
and then mix the trait into your table mapper and dao:
class MyTableDefinition(tag: Tag)
  extends Table[TableMapping](tag, "sometable")
  with MyTypeMapper { ... }

class MyTableDao extends MyTypeMapper { ... }
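With the trait mixed in, the implicit Status column type is in scope for the DAO, so the filter compiles once the object is widened to Status as in the other answer. A quick sketch (the id comparison is my own addition so that byId actually uses its parameter):

class MyTableDao extends MyTypeMapper {
  val Items = TableQuery[MyTableDefinition]

  // statusColumnType from MyTypeMapper is now in implicit scope here
  def byId(id: Long)(implicit session: Session): Option[TableMapping] =
    Items.filter(r => r.id === id && r.status =!= (Deleted: Status)).firstOption
}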
I'm using Slick 3.0 and trying to create a trait to offer basic operations. Here is my trait:
object DAO {
  var db: Database = null
}

trait CommonAPI[T <: Table[_]] {
  private val db = DAO.db
  private val objects = TableQuery[T]

  def count: Future[Int] = db.run(objects.length.result)
  def insert(obj: T#TableElementType): Future[Int] = db.run(objects += obj)
  def all: Future[Seq[T]] = db.run(objects.result)
}
DAO.db is initialized in Play's onStart method. However, I get the compilation error class type required but T found on the line private val objects = TableQuery[T].
What am I supposed to do? Thanks!
Here is one solution:
First, define this to avoid the class type issue:
class Service[T <: Table[_]](path: String, cons: Tag => T) {
  lazy val db = Database.forConfig(path)
  def query = TableQuery[T](cons)
}
Then use it this way (Post is a subclass of Table):
object Abcd {

  object Def extends Service[Post]("mydb", abc) {
    def test = {
      val q = query.drop(1).take(20)
      val r = db.run(q.result)
      println(q.result.statements.head)
      println(r)
      r
    }
  }

  private def abc(tag: Tag) = new Post(tag)
}
This solution tested ok.
Hi, I have done something similar with a slightly different approach: one generic DAO and a few (per resource/table) classes that simply inherit from it.
This is the generic DAO:
class BaseDbEntity[T <: BaseEntity, R <: BaseT[T]](val tableName: String, tableQuery: TableQuery[R]) extends DatabaseAccess {

  val createReturningId = tableQuery returning tableQuery.map { item => item.id }

  def create(entity: T): Int = {
    connectionPool withSession { implicit session =>
      val resId = createReturningId += entity
      resId
    }
  }

  def getAll = {
    connectionPool withSession { implicit session =>
      tableQuery.list
    }
  }
}
The code of the full class:
(https://github.com/vixxx123/scalasprayslickexample/blob/master/src/main/scala/com/vixxx123/scalasprayslickexample/database/BaseDbEntity.scala)
And a specific DAO with its corresponding Table class:
class PersonDao extends BaseDbEntity[Person, PersonT]("person", TableQuery[PersonT])

class PersonT(tag: Tag) extends BaseT[Person](tag, "person") {
  def name: Column[String] = column[String]("name")
  def lastname: Column[String] = column[String]("lastname")

  override def * = (id.?, name, lastname) <> ((Person.apply _).tupled, Person.unapply)
}
you can find that class here: https://github.com/vixxx123/scalasprayslickexample/blob/master/src/main/scala/com/vixxx123/scalasprayslickexample/exampleapi/person/PersonDao.scala
Maybe it will help you out.
You can pass a tag instead and create the table query from it. If your table class is defined as:
case class Sample(...)

class SampleTable(tag: Tag) extends Table[Sample](tag, "sample_table") {
  ...
}
then you can implement your generic trait as below:
import slick.driver.MySQLDriver.api._ // MySQL is used here; import the driver specific to your db
import scala.concurrent.Future

object DAO {
  var db: Database = null
}

trait CommonAPI[T, A <: Table[T]] {
  private val db = DAO.db
  private var tableTag: Tag => A = _

  def setTag(tag: Tag => A) = { tableTag = tag }

  private lazy val objects = TableQuery(tableTag)

  def count: Future[Int] = db.run(objects.length.result)
  def insert(obj: T): Future[Int] = db.run(objects += obj)
  def all: Future[Seq[T]] = db.run(objects.result)
}
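A hypothetical wiring sketch (my own illustration, assuming the SampleTable class above, and assuming setTag is called before count/insert/all force the lazy TableQuery):

object SampleDao extends CommonAPI[Sample, SampleTable] {
  setTag(tag => new SampleTable(tag)) // runs in the constructor, before the lazy query is built
}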
I'm trying something that I've seen in different shapes in different contexts before:
extending Slick's query extensions with filterById(id: Id)
This is what I've tried:
trait TableWithId { self: Profile =>
  import profile.simple._

  trait HasId[Id] { self: Table[_] =>
    def id: Column[Id]
  }

  implicit class HasIdQueryExt[Id: BaseColumnType, U](
      query: Query[Table[U] with HasId[Id], U]) {
    def filterById(id: Id)(implicit s: Session) = query.filter(_.id === id)
    def insertReturnId(m: U)(implicit s: Session): Id = query.returning(query.map(_.id)) += m
  }
}
This works fine, no real magic there. But because there is no type constraint on the Table type, any query to which I apply filterById loses its specific type (it is now a generic Table with HasId[Id]), and I can no longer reach its columns (except for _.id, of course).
I don't know how to type this implicit conversion so that this is prevented. Is it possible? The following "naive" solution does not work, because Scala now infers Nothing for the Id type:
implicit class HasIdQueryExt[Id: BaseColumnType, U, T <: Table[U] with HasId[Id]](
    query: Query[T, U]) { ... }
I find it kind of strange that suddenly the Id type is inferred as Nothing. How do I hint the compiler where to look for that Id type?
This is my solution to a similar problem; I did use a specific type for the id, though:
trait GenericComponent { this: Profile =>
  import profile.simple._

  abstract class TableWithId[A](tag: Tag, name: String) extends Table[A](tag, name) {
    def id = column[Option[UUID]]("id", O.PrimaryKey)
  }

  abstract class genericTable[T <: Table[A], A] {
    val table: TableQuery[T]

    /**
     * generic methods
     */
    def insert(entry: A)(implicit session: Session): A = session.withTransaction {
      table += entry
      entry
    }

    def insertAll(entries: List[A])(implicit session: Session) = session.withTransaction {
      table.insertAll(entries: _*)
    }

    def all: List[A] = database.withSession { implicit session =>
      table.list.map(_.asInstanceOf[A])
    }
  }

  /**
   * generic queries for any table which has id: Option[UUID]
   */
  abstract class genericTableWithId[T <: TableWithId[A], A <: ObjectWithId] extends genericTable[T, A] {
    def forIds(ids: List[UUID]): List[A] = database.withSession { implicit session =>
      ids match {
        case Nil => Nil
        case x :: xs => table.filter(_.id inSet ids).list.map(_.asInstanceOf[A])
      }
    }

    def forId(id: UUID): Option[A] = database.withSession { implicit session =>
      table.filter(_.id === id).firstOption
    }
  }
}
and then for your concrete component:
case class SomeObjectRecord(
    override val id: Option[UUID] = None,
    name: String) extends ObjectWithId(id) {
  // your function definitions there
}

trait SomeObjectComponent extends GenericComponent { this: Profile =>
  import profile.simple._

  class SomeObjects(tag: Tag) extends TableWithId[SomeObjectRecord](tag, "some_objects") {
    def name = column[String]("name", O.NotNull)
    def * = (id, name) <> (SomeObjectRecord.tupled, SomeObjectRecord.unapply)
    def nameIdx = index("u_name", (name), unique = true)
  }

  object someobjects extends genericTableWithId[SomeObjects, SomeObjectRecord] {
    val table = TableQuery[SomeObjects]
    // your additional methods there; you get insert and insertAll from the parent
  }
}