Play framework - Using anorm with Option[LocalDate] \ Option[LocalDateTime] - scala

I am trying to define a nullable date field in Postgres while using Anorm as the connection to the database.
I am trying to update an entry:
def update(id: Long, startDate: Option[LocalDate]) {
  SQL("""UPDATE my_table
        |SET start_date = {start_date}
        |WHERE id = {id}
      """.stripMargin)
    .on(
      'id -> id,
      'start_date -> startDate
    ).executeUpdate()
}
But I get a compilation error; it looks like Anorm can't handle Option[DateTime], although when I configured a parser it works for me:
val parser: RowParser[Info] = {
  get[Long]("id") ~
  get[Option[DateTime]]("start_date") map {
    case id ~ startDate => Info(id, startDate)
  }
}
What am I missing here?
Thanks!

I added my own implicit definitions:
// `fmt` is assumed to be a Joda-Time date formatter, e.g. DateTimeFormat.forPattern("yyyy-MM-dd")
implicit def rowToLocalDate: Column[LocalDate] = Column.nonNull { (value, meta) =>
  val MetaDataItem(qualified, nullable, clazz) = meta
  value match {
    case ts: java.sql.Timestamp => Right(new LocalDate(ts.getTime))
    case d: java.sql.Date       => Right(new LocalDate(d.getTime))
    case str: java.lang.String  => Right(fmt.parseLocalDate(str))
    case _ => Left(TypeDoesNotMatch("Cannot convert " + value + ":" + value.asInstanceOf[AnyRef].getClass))
  }
}
implicit val localDateToStatement = new ToStatement[LocalDate] {
  def set(s: java.sql.PreparedStatement, index: Int, aValue: LocalDate): Unit = {
    s.setTimestamp(index, new java.sql.Timestamp(aValue.toDateTimeAtStartOfDay().getMillis()))
  }
}
And the relevant ParameterMetaData:
implicit object LocalDateClassMetaData extends ParameterMetaData[LocalDate] {
  val sqlType = ParameterMetaData.DateParameterMetaData.sqlType
  val jdbcType = ParameterMetaData.DateParameterMetaData.jdbcType
}
That did the trick.
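With those implicits in scope, Anorm can bind Option[LocalDate] parameters as well, since it derives the Option support from the ToStatement and ParameterMetaData above. A minimal sketch of the call site (the id value is hypothetical):

update(42L, Some(LocalDate.now())) // binds the date
update(42L, None)                  // binds SQL NULL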

Related question: Anorm compare/search by java.time LocalDateTime. What worked for me there was simply updating to a newer (not-yet-released) version.

Related

how to insert user defined type in cassandra by using lagom scala framework

I am using the Lagom (Scala) framework and I could not find any way to save a Scala case class object in Cassandra when it has a complex type. So how do I insert a Cassandra UDT in Lagom Scala? And can anyone explain how to use the BoundStatement.setUDTValue() method?
I have tried using com.datastax.driver.mapping.annotations.UDT, but it does not work for me. I have also tried the com.datastax.driver.core Session interface, but again it does not.
case class LeadProperties(
  name: String,
  label: String,
  description: String,
  groupName: String,
  fieldDataType: String,
  options: Seq[OptionalData]
)

object LeadProperties {
  implicit val format: Format[LeadProperties] = Json.format[LeadProperties]
}

@UDT(keyspace = "leadpropertieskeyspace", name = "optiontabletype")
case class OptionalData(label: String)

object OptionalData {
  implicit val format: Format[OptionalData] = Json.format[OptionalData]
}
My query:
val optiontabletype = """
  |CREATE TYPE IF NOT EXISTS optiontabletype(
  |value text
  |);
""".stripMargin

val createLeadPropertiesTable: String = """
  |CREATE TABLE IF NOT EXISTS leadpropertiestable(
  |name text Primary Key,
  |label text,
  |description text,
  |groupname text,
  |fielddatatype text,
  |options list<frozen<optiontabletype>>
  |);
""".stripMargin
def createLeadProperties(obj: LeadProperties): Future[List[BoundStatement]] = {
  val bindCreateLeadProperties: BoundStatement = createLeadProperties.bind()
  bindCreateLeadProperties.setString("name", obj.name)
  bindCreateLeadProperties.setString("label", obj.label)
  bindCreateLeadProperties.setString("description", obj.description)
  bindCreateLeadProperties.setString("groupname", obj.groupName)
  bindCreateLeadProperties.setString("fielddatatype", obj.fieldDataType)
  // Here is the problem: I am not finding any method for the Cassandra UDT.
  Future.successful(List(bindCreateLeadProperties))
}
override def buildHandler(): ReadSideProcessor.ReadSideHandler[PropertiesEvent] = {
  readSide.builder[PropertiesEvent]("PropertiesOffset")
    .setGlobalPrepare(() => PropertiesRepository.createTable)
    .setPrepare(_ => PropertiesRepository.prepareStatements)
    .setEventHandler[PropertiesCreated](ese ⇒
      PropertiesRepository.createLeadProperties(ese.event.obj))
    .build()
}
I faced the same issue and solved it in the following way.
Define type and table:
def createTable(): Future[Done] = {
  session.executeCreateTable("CREATE TYPE IF NOT EXISTS optiontabletype(field1 text, field2 text)")
    .flatMap(_ => session.executeCreateTable(
      "CREATE TABLE IF NOT EXISTS leadpropertiestable ( " +
        "id TEXT, options list<frozen <optiontabletype>>, PRIMARY KEY (id))"
    ))
}
Call this method in buildHandler() like this:
override def buildHandler(): ReadSideProcessor.ReadSideHandler[PropertiesEvent] =
  readSide.builder[PropertiesEvent]("PropertiesOffset")
    .setPrepare(_ => prepare())
    .setGlobalPrepare(() => createTable())
    .setEventHandler[PropertiesCreated](processPropertiesCreated)
    .build()
Then in processPropertiesCreated() I used it like this:
private val writePromise = Promise[PreparedStatement] // completed in prepare()
private def writeF: Future[PreparedStatement] = writePromise.future

private def processPropertiesCreated(eventElement: EventStreamElement[PropertiesCreated]): Future[List[BoundStatement]] = {
  writeF.map { ps =>
    // Look up the UDT definition from the prepared statement's "options" column (a list<frozen<optiontabletype>>)
    val userType = ps.getVariables.getType("options").getTypeArguments.get(0).asInstanceOf[UserType]
    val newValue = userType.newValue().setString("field1", "1").setString("field2", "2")
    val bindWriteTitle = ps.bind()
    bindWriteTitle.setString("id", eventElement.event.id)
    bindWriteTitle.setList("options", eventElement.event.keys.map(_ => newValue).toList.asJava) // todo: convert properly, for now just a stub
    List(bindWriteTitle)
  }
}
And read it like this:
def toFacility(r: Row): LeadPropertiesTable = {
  LeadPropertiesTable(
    id = r.getString(fId),
    options = r.getList("options", classOf[UDTValue]).asScala
      .map(udt => OptiontableType(field1 = udt.getString("field1"), field2 = udt.getString("field2")))
  )
}
My prepare() function:
private def prepare(): Future[Done] = {
  val f = session.prepare("INSERT INTO leadpropertiestable (id, options) VALUES (?, ?)")
  writePromise.completeWith(f)
  f.map(_ => Done)
}
This is not very well written code, but I think it will help you move forward.
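As a side note, since the question asks specifically about setUDTValue(): if the column were a single frozen UDT rather than a list, the same UserType lookup would let you bind it directly. A rough sketch under the same assumptions (the column name "option" is hypothetical):

writeF.map { ps =>
  val userType = ps.getVariables.getType("option").asInstanceOf[UserType]
  val udtValue = userType.newValue()
    .setString("field1", "1")
    .setString("field2", "2")
  val bound = ps.bind()
  bound.setString("id", "some-id")
  bound.setUDTValue("option", udtValue) // binds the UDT value directly
  List(bound)
}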

Sort by date in String Format in Scala Object

I have the following object structure in Scala (case classes):
{
  "accounts": [{
    "accountManagement": {
      "accountStatus": "submitted",
      "accountManagementId": "1513684862218",
      "submittedDate": "19/12/2017"
    }
  }]
}
Look at the "accounts" list. I want to sort this list by the field "submittedDate" from "accountManagement". Note that the submitted date is in String format.
I tried this, but it's not working:
for (accountManagement: AccountManagement <- accountManagementList) {
  try {
    if (accountManagement.submittedDate != null && accountManagement.submittedDate.nonEmpty) {
      accountManagement.submittedDate = dateFormatter.parse(accountManagement.submittedDate)
    }
  } catch {
    case e: Exception =>
  }
  accountManagementsNew = accountManagementsNew ::: List(accountManagement)
}
accountManagementsNew.sortBy(_.updatedDate.getTime)
Let's say your case class looked like this:
// Note that in Scala it's preferred to use Option to indicate nullable fields
case class AccountManagement(accountStatus: String,
                             accountManagementId: Long,
                             submittedDate: Option[String])

val accounts = List(
  AccountManagement("submitted", 1L, Some("21/12/2017")),
  AccountManagement("submitted", 2L, Some("19/12/2017")),
  AccountManagement("submitted", 3L, None),
  AccountManagement("submitted", 4L, Some("20/12/2017"))
)

val dtf = DateTimeFormatter.ofPattern("dd/MM/yyyy")
You can either define an implicit ordering that you want to use in this context:
implicit val localDateOrdering: Ordering[LocalDate] = Ordering.by(_.toEpochDay)

accounts.filterNot(_.submittedDate.isEmpty) sortBy {
  case AccountManagement(_, _, Some(submittedDateString)) => LocalDate.parse(submittedDateString, dtf)
}
or you can directly specify that you would like to use the epoch-day representation of said date to sort your data set:
accounts.filterNot(_.submittedDate.isEmpty) sortBy {
  case AccountManagement(_, _, Some(submittedDateString)) => LocalDate.parse(submittedDateString, dtf).toEpochDay
}
If I understand your requirement correctly, you can first assemble a list of AccountManagementNew from AccountManagement by converting the String-type date to LocalDate, then perform the sorting by date as shown below. Note that Try is used to handle Success/Failure cases.
import java.time.LocalDate
import java.time.format.DateTimeFormatter
import scala.util.{Try, Success, Failure}

case class AccountManagement(
  accountStatus: String, accountManagementId: String, submittedDate: String
)

case class AccountManagementNew(
  accountStatus: String, accountManagementId: String, updatedDate: LocalDate
)

val accountManagementList = List[AccountManagement](
  AccountManagement("submitted", "1513684862218", "19/12/2017"),
  AccountManagement("submitted", "1513684862219", "09/01/2018"),
  AccountManagement("submitted", "1513684862220", "29/11/2017")
)

val datePattern = DateTimeFormatter.ofPattern("dd/MM/yyyy")

// Assemble a list of the AccountManagementNew case class
val amNewList =
  for (am <- accountManagementList) yield {
    Try( LocalDate.parse(am.submittedDate, datePattern) ) match {
      case Success(d) =>
        AccountManagementNew(am.accountStatus, am.accountManagementId, d)
      case Failure(_) =>
        AccountManagementNew(am.accountStatus, am.accountManagementId, LocalDate.MIN)
    }
  }

// Use `LocalDate.toEpochDay` for date ordering
implicit val dateOrdering = Ordering.by{ d: LocalDate => d.toEpochDay }

amNewList.sortBy(_.updatedDate)
// res1: List[AccountManagementNew] = List(
//   AccountManagementNew(submitted,1513684862220,2017-11-29),
//   AccountManagementNew(submitted,1513684862218,2017-12-19),
//   AccountManagementNew(submitted,1513684862219,2018-01-09)
// )

formatters for List[DateTime] play scala

I am working on a project using Play, Scala, and MongoDB. I want to store a List[DateTime] in a collection, so I need formatters for it. To store a DateTime I used this formatter:
implicit def dateFormat = {
  val dateStandardFormat = "yyyy-MM-dd'T'HH:mm:ss.SSS"

  val dateReads: Reads[DateTime] = Reads[DateTime](js =>
    js.validate[JsObject].map(_.value.toSeq).flatMap {
      case Seq(("$date", JsNumber(ts))) if ts.isValidLong =>
        JsSuccess(new DateTime(ts.toLong))
      case _ =>
        JsError(__, "validation.error.expected.$date")
    }
  )

  val dateWrites: Writes[DateTime] = new Writes[DateTime] {
    def writes(dateTime: DateTime): JsValue = Json.obj("$date" -> dateTime.getMillis())
  }

  Format(dateReads, dateWrites)
}
But for storing a list of datetimes it is not working. Thanks in advance for any help.
You need to create implicit JSON Reads and Writes for List[DateTime]. In your example, you are only defining how to serialize and deserialize the DateTime type. Adding this below the formatter should make the framework know how to JSONify DateTime lists.
See working example below:
val dateStandardFormat = "yyyy-MM-dd'T'HH:mm:ss.SSS"

val dateReads: Reads[DateTime] = Reads[DateTime](js =>
  js.validate[JsObject].map(_.value.toSeq).flatMap {
    case Seq(("$date", JsNumber(ts))) if ts.isValidLong =>
      JsSuccess(new DateTime(ts.toLong))
    case _ =>
      JsError(__, "validation.error.expected.$date")
  }
)

val dateWrites: Writes[DateTime] = new Writes[DateTime] {
  def writes(dateTime: DateTime): JsValue = Json.obj("$date" -> dateTime.getMillis())
}

implicit def dateFormat = Format(dateReads, dateWrites)
implicit val listDateTimeFormat = Format(Reads.list[DateTime](dateReads), Writes.list[DateTime](dateWrites))

val m = List(DateTime.now(), DateTime.now(), DateTime.now(), DateTime.now(), DateTime.now())
println(Json.toJson(m).toString())
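The printed result should be a JSON array of Mongo-style date objects, roughly of the shape [{"$date":1513684862218},{"$date":1513684862218},...] (the actual millisecond values will differ).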
You could also use the MongoDateFormats from this project: simple-reactivemongo.

BulkLoading to Phoenix using Spark

I was trying to write some utilities to bulk load data through HFiles from Spark RDDs.
I was following the pattern of CSVBulkLoadTool from Phoenix. I managed to generate some HFiles and load them into HBase, but I can't see the rows using sqlline (while with the hbase shell it is possible). I would be more than grateful for any suggestions.
BulkPhoenixLoader.scala:
class BulkPhoenixLoader[A <: ImmutableBytesWritable : ClassTag, T <: KeyValue : ClassTag](rdd: RDD[(A, T)]) {

  def createConf(tableName: String, inConf: Option[Configuration] = None): Configuration = {
    val conf = inConf.map(HBaseConfiguration.create).getOrElse(HBaseConfiguration.create())
    val job: Job = Job.getInstance(conf, "Phoenix bulk load")

    job.setMapOutputKeyClass(classOf[ImmutableBytesWritable])
    job.setMapOutputValueClass(classOf[KeyValue])

    // initialize credentials to possibly run in a secure env
    TableMapReduceUtil.initCredentials(job)

    val htable: HTable = new HTable(conf, tableName)

    // Auto configure partitioner and reducer according to the Main Data table
    HFileOutputFormat2.configureIncrementalLoad(job, htable)
    conf
  }

  def bulkSave(tableName: String, outputPath: String, conf: Option[Configuration]) = {
    val configuration: Configuration = createConf(tableName, conf)

    rdd.saveAsNewAPIHadoopFile(
      outputPath,
      classOf[ImmutableBytesWritable],
      classOf[Put],
      classOf[HFileOutputFormat2],
      configuration)
  }
}
ExtendedProductRDDFunctions.scala:
class ExtendedProductRDDFunctions[A <: scala.Product](data: org.apache.spark.rdd.RDD[A]) extends
  ProductRDDFunctions[A](data) with Serializable {

  def toHFile(tableName: String,
              columns: Seq[String],
              conf: Configuration = new Configuration,
              zkUrl: Option[String] = None): RDD[(ImmutableBytesWritable, KeyValue)] = {

    val config = ConfigurationUtil.getOutputConfiguration(tableName, columns, zkUrl, Some(conf))
    val tableBytes = Bytes.toBytes(tableName)
    val encodedColumns = ConfigurationUtil.encodeColumns(config)
    val jdbcUrl = zkUrl.map(getJdbcUrl).getOrElse(getJdbcUrl(config))
    val conn = DriverManager.getConnection(jdbcUrl)

    val query = QueryUtil.constructUpsertStatement(tableName,
      columns.toList.asJava,
      null)
    data.flatMap(x => mapRow(x, jdbcUrl, encodedColumns, tableBytes, query))
  }

  def mapRow(product: Product,
             jdbcUrl: String,
             encodedColumns: String,
             tableBytes: Array[Byte],
             query: String): List[(ImmutableBytesWritable, KeyValue)] = {

    val conn = DriverManager.getConnection(jdbcUrl)
    val preparedStatement = conn.prepareStatement(query)

    val columnsInfo = ConfigurationUtil.decodeColumns(encodedColumns)
    columnsInfo.zip(product.productIterator.toList).zipWithIndex.foreach(setInStatement(preparedStatement))
    preparedStatement.execute()

    val uncommittedDataIterator = PhoenixRuntime.getUncommittedDataIterator(conn, true)
    val hRows = uncommittedDataIterator.asScala.filter(kvPair =>
      Bytes.compareTo(tableBytes, kvPair.getFirst) == 0
    ).flatMap(kvPair => kvPair.getSecond.asScala.map(
      kv => {
        val byteArray = kv.getRowArray.slice(kv.getRowOffset, kv.getRowOffset + kv.getRowLength - 1) :+ 1.toByte
        (new ImmutableBytesWritable(byteArray, 0, kv.getRowLength), kv)
      }))

    conn.rollback()
    conn.close()
    hRows.toList
  }

  def setInStatement(statement: PreparedStatement): (((ColumnInfo, Any), Int)) => Unit = {
    case ((c, v), i) =>
      if (v != null) {
        // Both Java and Joda dates used to work in 4.2.3, but now they must be java.sql.Date
        val (finalObj, finalType) = v match {
          case dt: DateTime => (new Date(dt.getMillis), PDate.INSTANCE.getSqlType)
          case d: util.Date => (new Date(d.getTime), PDate.INSTANCE.getSqlType)
          case _ => (v, c.getSqlType)
        }
        statement.setObject(i + 1, finalObj, finalType)
      } else {
        statement.setNull(i + 1, c.getSqlType)
      }
  }

  private def getIndexTables(conn: Connection, qualifiedTableName: String): List[(String, String)] = {
    val table: PTable = PhoenixRuntime.getTable(conn, qualifiedTableName)
    val tables = table.getIndexes.asScala.map(x => x.getIndexType match {
      case IndexType.LOCAL => (x.getTableName.getString, MetaDataUtil.getLocalIndexTableName(qualifiedTableName))
      case _ => (x.getTableName.getString, x.getTableName.getString)
    }).toList
    tables
  }
}
I load the generated HFiles with the utility tool from HBase as follows:
hbase org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles path/to/hfile tableName
You could just convert your CSV file to an RDD of Product and use the .saveToPhoenix method. This is generally how I load CSV data into Phoenix.
Please see: https://phoenix.apache.org/phoenix_spark.html
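For reference, a rough sketch of what that looks like with the phoenix-spark plugin (table name, columns and ZooKeeper URL below are hypothetical; check the linked page for the exact API of your Phoenix version):

import org.apache.spark.SparkContext
import org.apache.phoenix.spark._  // adds saveToPhoenix to RDDs of tuples/Products

// Assumed target table: OUTPUT_TABLE(ID BIGINT NOT NULL PRIMARY KEY, COL1 VARCHAR, COL2 INTEGER)
val sc = new SparkContext("local", "phoenix-load")
val dataSet = List((1L, "1", 1), (2L, "2", 2), (3L, "3", 3))

sc.parallelize(dataSet)
  .saveToPhoenix(
    "OUTPUT_TABLE",
    Seq("ID", "COL1", "COL2"),
    zkUrl = Some("phoenix-server:2181")
  )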

Strange result when using squeryl and scala

I'm trying to select the coupled user by getting the correct LinkedAccount.
The query that is created is correct, but when trying to use a property on dbuser, e.g. dbuser.lastName, I get a compile error, since dbuser is not of type User but Query1 size=?.
It's probably something really simple, but I can't figure it out since I'm a Scala and Squeryl noob!
Why doesn't it return the correct value, and what have I done wrong in my query?
Also, saving works without any issues.
User:
class User(
  @Column("id") val id: Long,
  @Column("first_name") val firstName: String,
  @Column("last_name") val lastName: String,
  @Column("email") val email: String,
  @Column("email_validated") val emailValidated: Boolean = false,
  @Column("last_login") val lastLogin: Timestamp = null,
  val created: Timestamp,
  val modified: Timestamp,
  val active: Boolean = false
) extends KeyedEntity[Long] {
  lazy val linkedAccounts: OneToMany[LinkedAccount] = AppDB.usersToLinkedAccounts.left(this)
}
LinkedAccount:
class LinkedAccount(
  @Column("id") val id: Long,
  @Column("user_id") val userId: Long,
  @Column("provider_user_id") val providerUserId: String,
  @Column("salt") val salt: String,
  @Column("provider_key") val providerKey: String) extends KeyedEntity[Long] {
  lazy val user: ManyToOne[User] = AppDB.usersToLinkedAccounts.right(this)
}
AppDB:
object AppDB extends Schema {
  val users = table[User]("users")
  val linkedAccounts = table[LinkedAccount]("linked_account")
  val usersToLinkedAccounts = oneToManyRelation(users, linkedAccounts).via((u, l) => u.id === l.userId)

  def userByLinkedAccount(providerKey: String, providerUserId: String) = {
    from(AppDB.users)(u =>
      where(u.id in
        from(AppDB.linkedAccounts)(la =>
          where(la.userId === u.id and la.providerKey === providerKey and la.providerUserId === providerUserId)
          select (la.userId)
        )
      )
      select (u)
    )
  }
}
The call:
val dbuser = inTransaction {
  val u2 = AppDB.userByLinkedAccount(id.providerId, id.id)
  println(u2.statement)
}
println(dbuser.lastName)
The SQL generated:
Select
users10.last_login as users10_last_login,
users10.email as users10_email,
users10.modified as users10_modified,
users10.last_name as users10_last_name,
users10.first_name as users10_first_name,
users10.id as users10_id,
users10.created as users10_created,
users10.email_validated as users10_email_validated,
users10.active as users10_active
From
users users10
Where
(users10.id in ((Select
linked_account13.user_id as linked_account13_user_id
From
linked_account linked_account13
Where
(((linked_account13.user_id = users10.id) and (linked_account13.provider_key = 'facebook')) and (linked_account13.provider_user_id = 'XXXXXXXXXX'))
) ))
BTW, in the documentation for @Column and @ColumnBase it is said:
The preferred way to define column metadata is to not define them (!)
So, you can define columns just as
val id: Long,
instead of
@Column("id") val id: Long
OK, figured it out. I need to materialize the query, in this case with:
.headOption
I also fixed the query after some tips from Per:
def userByLinkedAccount(providerKey: String, providerUserId: String) = {
  inTransaction {
    from(AppDB.users, AppDB.linkedAccounts)((u, la) =>
      where(u.id === la.userId and la.providerKey === providerKey and la.providerUserId === providerUserId)
      select (u)
    ).headOption
  }
}
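With .headOption the call site gets back an Option[User] instead of a lazy Query, so the property access works once the Option is handled. A minimal sketch, assuming the same id values as in the question:

val dbuser = AppDB.userByLinkedAccount(id.providerId, id.id)

dbuser match {
  case Some(user) => println(user.lastName)
  case None       => println("no user found for that linked account")
}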