I am adding two additional fields to the Person table: a Date and a String. I built the Person table and mapped it with Play Slick by following olivebh's tutorial.
However, I get the following errors from the Slick data model trait Tables:
dao/Tables.scala:85: ambiguous implicit values:
[error] both value e3 of type slick.jdbc.GetResult[String]
[error] and value e1 of type slick.jdbc.GetResult[String]
[error] match expected type slick.jdbc.GetResult[String]
[error] ProjectRow.tupled((<<[Int], <<[String], <<[Date], <<[String]))
which refers to the following line:
implicit def GetResultPersonRow(implicit e0: GR[Int], e1: GR[String], e2: GR[Date], e3: GR[String]): GR[ProjectRow] = GR {
  prs =>
    import prs._
    PersonRow.tupled((<<[Int], <<[String], <<[Date], <<[String]))
}
where the "int, string, date, string" represent the "id, name, birthdate, language" fields respectively. Everything worked fine by following the tutorial that covers "id, name" as an example. But as soon as I added birthdate and language, I got the error quoted above.
Also, when creating the prototypes for the table rows:
class Person(_tableTag: Tag) extends Table[PersonRow](_tableTag, "person") {
  def * = (personId, name, birthdate, language) <> (PersonRow.tupled, PersonRow.unapply)
  def ? = (Rep.Some(personId), Rep.Some(name), Rep.Some(birthdate), Rep.Some(language)).shaped.<>({ r => import r._; _1.map(_ => PersonRow.tupled((_1.get, _2.get, _3.get, _4.get))) }, (_: Any) => throw new Exception("Inserting into ? projection not supported."))

  val personId: Rep[Int] = column[Int]("person_id", O.AutoInc, O.PrimaryKey)
  val name: Rep[String] = column[String]("name", O.Length(50, varying = true))
  val birthdate: Rep[Date] = column[Date]("birthdate", O.Length(50, varying = true))
  val language: Rep[String] = column[String]("language", O.Length(50, varying = true))
}
I get the following errors:
No matching Shape found.
[error] Slick does not know how to map the given types.
[error] Possible causes: T in Table[T] does not match your * projection. Or you use an unsupported type in a Query (e.g. scala List).
[error] Required level: slick.lifted.FlatShapeLevel
[error] Source type: (slick.lifted.Rep[Int], slick.lifted.Rep[String], slick.lifted.Rep[java.util.Date], slick.lifted.Rep[String])
[error] Unpacked type: (Int, String, java.util.Date, String)
[error] Packed type: Any
[error] def * = (personId, name, birthdate, language) <>(PersonRow.tupled, PersonRow.unapply)
and also:
dao/Tables.scala:94: could not find implicit value for parameter od: slick.lifted.OptionLift[Tables.this.driver.api.Rep[java.util.Date],O]
[error] def ? = (Rep.Some(personId), Rep.Some(name), Rep.Some(birthdate), Rep.Some(language)).shaped.<>({ r => import r._; _1.map(_ => PersonRow.tupled((_1.get, _2.get, _3.get, _4.get))) }, (_: Any) => throw new Exception("Inserting into ? projection not supported."))
dao/Tables.scala:94: not found: value _1
[error] def ? = (Rep.Some(personId), Rep.Some(name), Rep.Some(birthdate), Rep.Some(language)).shaped.<>({ r => import r._; _1.map(_ => PersonRow.tupled((_1.get, _2.get, _3.get, _4.get))) }, (_: Any) => throw new Exception("Inserting into ? projection not supported."))
Any help in understanding these errors, and therefore how I could change the Slick data model trait so that it properly handles the two additional Date and String fields, would be greatly appreciated. Thank you so much!
Slick can't handle java.util.Date out of the box, because JDBC drivers only understand java.sql.Date (and friends such as java.sql.Timestamp). You can write your own mapper so that Slick knows how to read/write java.util.Date via java.sql.Date. But Java 8 now has a better API for handling dates, times, and calendars (java.time), so you can map java.time.LocalDateTime to java.sql.Timestamp instead:
import java.sql.Timestamp
import java.time.LocalDateTime

// requires the profile's api._ in scope, e.g. import slick.jdbc.PostgresProfile.api._
implicit val localDateTimeColumnType = MappedColumnType.base[LocalDateTime, Timestamp](
  ldt => Timestamp.valueOf(ldt),
  ts => ts.toLocalDateTime
)
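As a rough usage sketch (assuming the mapper above is in scope, the profile's api._ is imported, and a PersonRow case class with the question's four fields), the birthdate column can then be typed directly with java.time:
import java.time.LocalDateTime

case class PersonRow(personId: Int, name: String, birthdate: LocalDateTime, language: String)

class Person(_tableTag: Tag) extends Table[PersonRow](_tableTag, "person") {
  val personId: Rep[Int] = column[Int]("person_id", O.AutoInc, O.PrimaryKey)
  val name: Rep[String] = column[String]("name", O.Length(50, varying = true))
  // resolved through the implicit localDateTimeColumnType defined above
  val birthdate: Rep[LocalDateTime] = column[LocalDateTime]("birthdate")
  val language: Rep[String] = column[String]("language", O.Length(50, varying = true))

  def * = (personId, name, birthdate, language) <> (PersonRow.tupled, PersonRow.unapply)
}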
See this question also. Btw, thanks for reading my article, hope it was useful! :)
Related
I'm not sure what is wrong here.
The following code block is throwing an error:
(for {
  (e, r) <- tblDetail.joinLeft(tblMaster).on((e, r) => r.col1 === e.col3)
} yield e.id)
Error
No matching Shape found.
[error] Slick does not know how to map the given types.
[error] Possible causes: T in Table[T] does not match your * projection,
[error] you use an unsupported type in a Query (e.g. scala List),
[error] or you forgot to import a driver api into scope.
[error] Required level: slick.lifted.FlatShapeLevel
[error] Source type: (slick.lifted.Rep[Int], slick.lifted.Rep[String],...)
[error] Unpacked type: T
[error] Packed type: G
[error] (e,r) <- tblDetail.joinLeft(tblMaster).on((e,r) => r.col1 === e.col3)
I checked the Slick Tables for tblDetail and tblMaster, and they seemed to be fine.
tblMaster
class TblMaster(tag: Tag)
  extends Table[(Int, String, ...)](tag, "tbl_master") {
  def id = column[Int]("id")
  def col3 = column[String]("col3")
  def * = (id, col3)
}
tblDetail
class TblDetail(tag: Tag)
  extends Table[Entity](tag, "tbl_detail") {
  def id = column[Int]("id")
  def col1 = column[String]("col1")
  def * : ProvenShape[Entity] = (id, col1) <>
    ((Entity.apply _).tupled, Entity.unapply)
}
Any help would be appreciated.
I'm using Slick code-gen to output a very normal Tables.scala file, which maps the tables/columns of the structure of my database.
However, I want to EXTEND the functionality of those tables in my DAOs, and it's proving to be impossible for me (I'm fairly new to Scala and the Play framework).
Inside Tables.scala, inside the class Meeting,
you can write functions that have access to the columns, e.g.:
class Meeting(_tableTag: Tag) extends profile.api.Table[MeetingRow](_tableTag, "meeting") {
  def * = (id, dateandtime, endtime, organisationid, details, adminid, datecreated, title, agenda, meetingroom) <> (MeetingRow.tupled, MeetingRow.unapply)
  def ? = (Rep.Some(id), Rep.Some(dateandtime), Rep.Some(endtime), Rep.Some(organisationid), Rep.Some(details), Rep.Some(adminid), Rep.Some(datecreated), Rep.Some(title), Rep.Some(agenda), Rep.Some(meetingroom)).shaped.<>({ r => import r._; _1.map(_ => MeetingRow.tupled((_1.get, _2.get, _3.get, _4.get, _5.get, _6.get, _7.get, _8.get, _9.get, _10.get))) }, (_: Any) => throw new Exception("Inserting into ? projection not supported."))

  val id: Rep[Int] = column[Int]("id", O.AutoInc, O.PrimaryKey)
  val dateandtime: Rep[java.sql.Timestamp] = column[java.sql.Timestamp]("dateandtime")
  val endtime: Rep[java.sql.Timestamp] = column[java.sql.Timestamp]("endtime")
  val organisationid: Rep[Int] = column[Int]("organisationid")
  val details: Rep[String] = column[String]("details", O.Default(""))
  val adminid: Rep[Int] = column[Int]("adminid")
  val datecreated: Rep[java.sql.Timestamp] = column[java.sql.Timestamp]("datecreated")
  val title: Rep[String] = column[String]("title", O.Default(""))
  val agenda: Rep[String] = column[String]("agenda", O.Default(""))
  val meetingroom: Rep[Int] = column[Int]("meetingroom")

  def getAttendees = Tables.Meeting2uzer.filter(_.meetingid === id)
}
where "ID" in the above function is a column in Meeting.
now the problem arises when i want to write that same function "getAttendees" in my DAO which doesn't have access to the columns in scope.
something along the lines of....
@Singleton
class SlickMeetingDAO @Inject()(db: Database)(implicit ec: ExecutionContext) extends MeetingDAO with Tables {
  override val profile: JdbcProfile = _root_.slick.jdbc.PostgresProfile
  import profile.api._

  private val queryById = Compiled((id: Rep[Int]) => Meeting.filter(_.id === id))

  def getAttendees = Meeting2uzer.filter(_.meetingid === "NEED ID OF COLUMN IN SCOPE")
}
How do I get id, which is a column in Tables.Meeting, into scope in my DAO to finish this getAttendees function?
If I understand correctly, you're trying to join two tables?
Meeting2uzer.join(Meeting).on(_.meetingid === _.id)
What you've done inside Tables.scala would be more in line with creating a foreign key in Slick. You could define a Slick foreign key instead of using an explicit join; see the Slick documentation.
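A minimal sketch of that foreign-key alternative, with column and table names assumed from the question (the constraint name fk_meeting is made up):
// Inside the generated Meeting2uzer table class in Tables.scala:
lazy val meetingFk = foreignKey("fk_meeting", meetingid, Meeting)(_.id,
  onUpdate = ForeignKeyAction.NoAction, onDelete = ForeignKeyAction.NoAction)

// In the DAO, the attendees of a given meeting can then be fetched with a join:
def getAttendees(meetingId: Int) =
  Meeting2uzer.join(Meeting).on(_.meetingid === _.id)
    .filter { case (_, meeting) => meeting.id === meetingId }
    .map { case (attendee, _) => attendee }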
Is there an easy way to use a datetime/timestamp in Scala? What's best practice? I currently use "date" to persist data, but I'd also like to persist the current time.
I'm struggling to set the date. This is my code:
val now = new java.sql.Timestamp(new java.util.Date().getTime)
I also tried to do this:
val now = new java.sql.Date(new java.util.Date().getTime)
When changing the datatype in my evolutions to "timestamp", I got an error:
case class MyObjectModel(
  id: Option[Int],
  title: String,
  createdat: Timestamp,
  updatedat: Timestamp,
  ...)

object MyObjectModel {
  implicit val myObjectFormat = Json.format[MyObjectModel]
}
Console:
app\models\MyObjectModel.scala:31: No implicit format for
java.sql.Timestamp available.
[error] implicit val myObjectFormat = Json.format[MyObjectModel]
[error] ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed
Update:
object ProcessStepTemplatesModel {
  implicit lazy val timestampFormat: Format[Timestamp] = new Format[Timestamp] {
    override def reads(json: JsValue): JsResult[Timestamp] = json.validate[Long].map(l => Timestamp.from(Instant.ofEpochMilli(l)))
    override def writes(o: Timestamp): JsValue = JsNumber(o.getTime)
  }

  implicit val processStepFormat = Json.format[ProcessStepTemplatesModel]
}
Try using this in your code:
implicit object timestampFormat extends Format[Timestamp] {
  val format = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SS'Z'")
  def reads(json: JsValue) = {
    val str = json.as[String]
    JsSuccess(new Timestamp(format.parse(str).getTime))
  }
  def writes(ts: Timestamp) = JsString(format.format(ts))
}
It is (de)serialized in a JS-compatible format, like the following: "2018-01-06T18:31:29.436Z".
Please note: the implicit object must be declared in the code before it is used.
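For clarity, a minimal sketch of that ordering, reusing the model from the question (the point is simply that the Timestamp format precedes the Json.format macro that needs it):
import java.sql.Timestamp
import java.text.SimpleDateFormat
import play.api.libs.json._

case class MyObjectModel(id: Option[Int], title: String, createdat: Timestamp, updatedat: Timestamp)

object MyObjectModel {
  // must come first: Json.format below needs a Format[Timestamp] in scope
  implicit object timestampFormat extends Format[Timestamp] {
    val format = new SimpleDateFormat("yyyy-MM-dd'T'HH:mm:ss.SS'Z'")
    def reads(json: JsValue) = JsSuccess(new Timestamp(format.parse(json.as[String]).getTime))
    def writes(ts: Timestamp) = JsString(format.format(ts))
  }

  implicit val myObjectFormat: OFormat[MyObjectModel] = Json.format[MyObjectModel]
}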
I guess your question is covered in "What's the standard way to work with dates and times in Scala? Should I use Java types or are there native Scala alternatives?".
Go with Java 8's java.time.
In the subject you mention Slick (the Scala database library), but the error you got comes from a JSON library: it says that you don't have a converter from java.sql.Timestamp to JSON. Without knowing which JSON library you are using, it's hard to help you with a working example.
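For illustration only (assuming play-json, since the question uses Json.format, and a reasonably recent play-json version, which ships Reads/Writes for java.time types): a model based on java.time.Instant needs no hand-written Format, and converting to the java.sql.Timestamp the database layer expects is a one-liner.
import java.time.Instant
import play.api.libs.json._

case class MyObjectModel(id: Option[Int], title: String, createdat: Instant, updatedat: Instant)

object MyObjectModel {
  implicit val myObjectFormat: OFormat[MyObjectModel] = Json.format[MyObjectModel]
}

// converting between java.time and the JDBC types when needed:
val now: Instant = Instant.now()
val asTimestamp: java.sql.Timestamp = java.sql.Timestamp.from(now)
val backToInstant: Instant = asTimestamp.toInstant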
Whenever I get an update request for a given id, I am trying to update the masterId and updatedDtTm columns in a DB table (I don't want to update createdDtTm). The following is my code:
case class Master(id: Option[Long] = None, masterId: String, createdDtTm: Option[java.util.Date],
                  updatedDtTm: Option[java.util.Date])

/**
 * This is my Slick mapping table
 * with the default projection
 */
class MappingMaster(tag: Tag) extends Table[Master](tag, "master") {

  implicit val DateTimeColumnType = MappedColumnType.base[java.util.Date, java.sql.Timestamp](
    ud => new Timestamp(ud.getTime),
    sd => new java.util.Date(sd.getTime)
  )

  def id = column[Long]("id", O.PrimaryKey, O.AutoInc)
  def masterId = column[String]("master_id")
  def createdDtTm = column[java.util.Date]("created_dttm")
  def updatedDtTm = column[java.util.Date]("updated_dttm")

  def * = (id.?, masterId, createdDtTm.?, updatedDtTm.?) <>
    ((Master.apply _).tupled, Master.unapply _)
}

/**
 * Somewhere in the DAO update call
 */
db.run(masterRecords.filter(_.id === id).map(rw => (rw.masterId, rw.updatedDtTm)).
  update(("new_master_id", new Date())))
// I also tried the following
db.run(masterRecords.filter(_id === id).map(rw => (rw.masterId,rw.updatedDtTm).shaped[(String,java.util.Date)]).update(("new_master_id",new Date()))
The Slick documentation states that in order to update multiple columns, one needs to use map to select the corresponding columns and then update them.
The problem here is the following: the update method seems to be accepting a value of Nothing.
I also tried the following, which does the same thing as above:
val t = for {
  ms <- masterRecords if (ms.id === "1234")
} yield (ms.masterId, ms.updatedDtTm)

db.run(t.update(("new_master_id", new Date())))
When I compile the code, it gives me the following compilation error:
Slick does not know how to map the given types.
[error] Possible causes: T in Table[T] does not match your * projection. Or you use an unsupported type in a Query (e.g. scala List).
[error] Required level: slick.lifted.FlatShapeLevel
[error] Source type: (slick.lifted.Rep[String], slick.lifted.Rep[java.util.Date])
[error] Unpacked type: (String, java.util.Date)
[error] Packed type: Any
[error] db.run(masterRecords.filter(_id === id).map(rw => (rw.masterId,rw.updatedDtTm).shaped[(String,java.util.Date)]).update(("new_master_id",new Date()))
I am using Scala 2.11 with Slick 3.0.1 and IntelliJ as the IDE. I'd really appreciate it if you could throw some light on this.
Cheers,
Sathish
(Replaces the original answer.) It seems the implicit has to be in scope for the queries; this compiles:
case class Master(id: Option[Long] = None, masterId: String, createdDtTm: Option[java.util.Date],
                  updatedDtTm: Option[java.util.Date])

implicit val DateTimeColumnType = MappedColumnType.base[java.util.Date, java.sql.Timestamp](
  ud => new Timestamp(ud.getTime),
  sd => new java.util.Date(sd.getTime)
)

class MappingMaster(tag: Tag) extends Table[Master](tag, "master") {
  def id = column[Long]("id", O.PrimaryKey, O.AutoInc)
  def masterId = column[String]("master_id")
  def createdDtTm = column[java.util.Date]("created_dttm")
  def updatedDtTm = column[java.util.Date]("updated_dttm")

  def * = (id.?, masterId, createdDtTm.?, updatedDtTm.?) <> ((Master.apply _).tupled, Master.unapply _)
}

private val masterRecords = TableQuery[MappingMaster]

val id: Long = 123

db.run(masterRecords.filter(_.id === id).map(rw => (rw.masterId, rw.updatedDtTm)).update(("new_master_id", new Date())))

val t = for {
  ms <- masterRecords if (ms.id === id)
} yield (ms.masterId, ms.updatedDtTm)

db.run(t.update(("new_master_id", new Date())))
This requirement should be really easy, but I don't know why it's not working. I want to delete a row based on its id using Slick with the Play framework.
I'm following this example from the play-slick module, but the compiler complains that value delete is not a member of scala.slick.lifted.Query[models.Tables.MyEntity,models.Tables.MyEntity#TableElementType].
My controller looks like:
def delete(id: Int) = DBAction { implicit rs =>
  val q = MyEntity.where(_.id === id)
  q.delete
  Ok("Entity deleted")
}
I've imported the play.api.db.slick.Config.driver.simple._
What am I doing wrong?
Edit:
My schema definition looks like:
class Cities(tag: Tag) extends Table[CityRow](tag, "cities") {
  def * = (cod, name, state, lat, long, id) <> (CityRow.tupled, CityRow.unapply)

  /** Maps whole row to an option. Useful for outer joins. */
  def ? = (cod.?, name.?, state.?, lat, long, id.?).shaped.<>({ r => import r._; _1.map(_ => CityRow.tupled((_1.get, _2.get, _3.get, _4, _5, _6.get))) }, (_: Any) => throw new Exception("Inserting into ? projection not supported."))

  val cod: Column[String] = column[String]("cod")
  val name: Column[String] = column[String]("name")
  val state: Column[Int] = column[Int]("state")
  val lat: Column[Option[String]] = column[Option[String]]("lat")
  val long: Column[Option[String]] = column[Option[String]]("long")
  val id: Column[Int] = column[Int]("id", O.AutoInc, O.PrimaryKey)

  /** Foreign key referencing Departamentos (database name fk_ciudades_departamentos1) */
  lazy val stateFk = foreignKey("fk_cities_states", state, States)(r => r.idstate, onUpdate = ForeignKeyAction.NoAction, onDelete = ForeignKeyAction.NoAction)
}
I also had a look at that example some time ago and it looked wrong to me too; I'm not sure whether I was doing something wrong myself or not. The delete function was always a bit tricky to get right, especially using lifted.Query (like you are doing). Later on I made it work by importing the right driver, in my case scala.slick.driver.PostgresDriver.simple._.
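For reference, a minimal sketch of the controller with that import in scope (Slick 2.x-style API; MyEntity and the where syntax are taken from the question, and the driver import must match the database you actually configured):
// the query extension methods, including delete, come from the driver's simple API
import scala.slick.driver.PostgresDriver.simple._

def delete(id: Int) = DBAction { implicit rs =>
  MyEntity.where(_.id === id).delete
  Ok("Entity deleted")
}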
Edit after comment:
Probably you have an error in the shape function; it's hard to say without looking at your schema declaration. This is an example:
case class CityRow(id: Long, name: String)

class City(tag: Tag) extends Table[CityRow](tag, "city") {
  def * = (id, name) <> (CityRow.tupled, CityRow.unapply)   // <- this is the shape function

  def ? = (id.?, name).shaped.<>({
    r => import r._
      _1.map(_ => CityRow.tupled((_1.get, _2)))
  }, (_: Any) => throw new Exception("Inserting into ? projection not supported."))

  val id: Column[Long] = column[Long]("id", O.AutoInc, O.PrimaryKey)
  val name: Column[String] = column[String]("name")
}