formatters for List[DateTime] play scala

I am working on a project using Play, Scala, and MongoDB. I want to store a List[DateTime] in a collection, so I need formatters for it. To store a DateTime I used this formatter:
implicit def dateFormat = {
  val dateStandardFormat = "yyyy-MM-dd'T'HH:mm:ss.SSS"
  val dateReads: Reads[DateTime] = Reads[DateTime](js =>
    js.validate[JsObject].map(_.value.toSeq).flatMap {
      case Seq(("$date", JsNumber(ts))) if ts.isValidLong =>
        JsSuccess(new DateTime(ts.toLong))
      case _ =>
        JsError(__, "validation.error.expected.$date")
    }
  )
  val dateWrites: Writes[DateTime] = new Writes[DateTime] {
    def writes(dateTime: DateTime): JsValue = Json.obj("$date" -> dateTime.getMillis())
  }
  Format(dateReads, dateWrites)
}
but it does not work for storing a list of DateTimes. Thanks in advance for any help.

You need to create an implicit JSON Reads and Writes for List[DateTime]. In your example you only define how to serialize and deserialize a single DateTime. Adding the list format below your DateTime formatter lets the framework know how to convert DateTime lists to and from JSON.
See the working example below:
import org.joda.time.DateTime
import play.api.libs.json._

val dateStandardFormat = "yyyy-MM-dd'T'HH:mm:ss.SSS"

// Reads a MongoDB extended-JSON date, e.g. {"$date": 1513684862218}
val dateReads: Reads[DateTime] = Reads[DateTime](js =>
  js.validate[JsObject].map(_.value.toSeq).flatMap {
    case Seq(("$date", JsNumber(ts))) if ts.isValidLong =>
      JsSuccess(new DateTime(ts.toLong))
    case _ =>
      JsError(__, "validation.error.expected.$date")
  }
)

// Writes a DateTime back out as {"$date": <millis>}
val dateWrites: Writes[DateTime] = new Writes[DateTime] {
  def writes(dateTime: DateTime): JsValue = Json.obj("$date" -> dateTime.getMillis())
}

implicit def dateFormat = Format(dateReads, dateWrites)

// Format for List[DateTime], built from the element Reads/Writes above
implicit val listDateTimeFormat = Format(Reads.list[DateTime](dateReads), Writes.list[DateTime](dateWrites))

val m = List(DateTime.now(), DateTime.now(), DateTime.now(), DateTime.now(), DateTime.now())
println(Json.toJson(m).toString())
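With both formats in implicit scope, Play's JSON macros can also derive a format for a case class that carries a List[DateTime] field. A minimal sketch, assuming a hypothetical Event case class that is not part of the original question:
// Hypothetical case class, used only to illustrate the derived format
case class Event(name: String, timestamps: List[DateTime])

// Relies on the implicit DateTime / List[DateTime] formats defined above
implicit val eventFormat: Format[Event] = Json.format[Event]

val event = Event("backup", List(DateTime.now(), DateTime.now()))
println(Json.toJson(event))                  // {"name":"backup","timestamps":[{"$date":...},{"$date":...}]}
println(Json.toJson(event).validate[Event])  // round-trips back to an Event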

You could also use the MongoDateFormats from the simple-reactivemongo project.

Related

Date validation function scala

I have an RDD[(String, String)]. Each String contains a datetime stamp in the format "yyyy-MM-dd HH:mm:ss". I am converting it to epoch time using the function below, where dateFormats is a SimpleDateFormat("yyyy-MM-dd HH:mm:ss"):
def epochTime (stringOfTime: String): Long = dateFormats.parse(stringOfTime).getTime
I want to modify the function so that it drops a row containing a null/empty/incorrectly formatted date, and to apply it to the RDD[(String, String)] so the string values get converted to epoch time as below.
Input
(2020-10-10 05:17:12,2015-04-10 09:18:20)
(2020-10-12 06:15:58,2015-04-10 09:17:42)
(2020-10-11 07:16:40,2015-04-10 09:17:49)
Output
(1602303432,1428653900)
(1602479758,1428653862)
(1602397000,1428653869)
You can use a filter to keep only the values that are not None. To do this, change the epochTime method so that it returns Option[Long]: def epochTime(stringOfTime: String): Option[Long]. Inside the method, check that the string is not empty with .nonEmpty, then use Try to see whether dateFormats can parse it.
After these changes, filter the RDD to remove the None values, and then unwrap each remaining value from Option[Long] to Long.
The code itself:
import java.text.SimpleDateFormat
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.SparkSession
import scala.util.{Success, Try}

val sparkSession = SparkSession.builder()
  .appName("Data Validation")
  .master("local[*]")
  .getOrCreate()

val data = Seq(("2020-10-10 05:17:12", "2015-04-10 09:18:20"), ("2020-10-12 06:15:58", "2015-04-10 09:17:42"),
  ("2020-10-11 07:16:40", "2015-04-10 09:17:49"), ("t", "t"))
val rdd: RDD[(String, String)] = sparkSession.sparkContext.parallelize(data)

// Returns None for empty or unparseable dates, Some(epoch time) otherwise
def epochTime(stringOfTime: String): Option[Long] = {
  val dateFormats = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss")
  if (stringOfTime.nonEmpty) {
    val parseDate = Try(dateFormats.parse(stringOfTime).getTime)
    parseDate match {
      case Success(value) => Some(value)
      case _ => None
    }
  } else {
    None
  }
}

rdd.map(pair => (epochTime(pair._1), epochTime(pair._2)))
  .filter(pair => pair._1.isDefined && pair._2.isDefined)
  .map(pair => (pair._1.get, pair._2.get))
  .foreach(pair => println(s"Results: (${pair._1}, ${pair._2})"))
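As a variation, RDD.collect with a partial function keeps only the rows where both dates parsed and unwraps the Options in one step; a sketch building on the rdd and epochTime defined above:
rdd.map { case (start, end) => (epochTime(start), epochTime(end)) }
  .collect { case (Some(startEpoch), Some(endEpoch)) => (startEpoch, endEpoch) }
  .foreach { case (startEpoch, endEpoch) => println(s"Results: ($startEpoch, $endEpoch)") }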

Reading multiple files with akka streams in scala

I'm trying to read multiple files with Akka Streams and put the results in a list.
I can read one file with no problem; the return type is Future[Seq[String]]. The problem is that processing the sequence inside the Future must go inside an onComplete {}.
I'm trying the following code, but obviously it will not work: the list acc outside of the onComplete is empty, even though it holds values inside the onComplete. I understand the problem, but I don't know how to approach it.
// works fine
def readStream(path: String, date: String): Future[Seq[String]] = {
  implicit val system = ActorSystem("Sys")
  val settings = ActorMaterializerSettings(system)
  implicit val materializer = ActorMaterializer(settings)

  val result: Future[Seq[String]] =
    FileIO.fromPath(Paths.get(path + "transactions_" + date + ".data"))
      .via(Framing.delimiter(ByteString("\n"), 256, true))
      .map(_.utf8String)
      .toMat(Sink.seq)(Keep.right)
      .run()

  var aa: List[scala.Array[String]] = Nil
  result.onComplete(x => {
    aa = x.get.map(line => line.split('|')).toList
  })
  result
}
// this won't work
def concatFiles(path: String, date: String, numberOfDays: Int): List[scala.Array[String]] = {
  val formatter = DateTimeFormatter.ofPattern("yyyyMMdd")
  val formattedDate = LocalDate.parse(date, formatter)
  var acc = List[scala.Array[String]]()
  for (a <- 0 to numberOfDays) {
    val date = formattedDate.minusDays(a).toString().replace("-", "")
    val transactions = readStream(path, date)
    var result: List[scala.Array[String]] = Nil
    transactions.onComplete(x => {
      result = x.get.map(line => line.split('|')).toList
      acc = acc ++ result
    })
  }
  acc
}
General Solution
Given an Iterator of Path values, a Source of the file lines can be created by combining FileIO and flatMapConcat:
val lineSourceFromPaths: (() => Iterator[Path]) => Source[String, _] = pathsIterator =>
  Source
    .fromIterator(pathsIterator)
    .flatMapConcat { path =>
      FileIO
        .fromPath(path)
        .via(Framing.delimiter(ByteString("\n"), 256, true))
        .map(_.utf8String)
    }
Application to Question
The reason your List is empty is that the Future values have not completed, and therefore your mutable list is not updated before the function returns it.
Critique of Code in Question
The organization and style of the code within the question suggest several misunderstandings related to akka & Future. I think you are attempting a rather complex workflow without understanding the fundamentals of the tools you are trying to use.
1. You should not create an ActorSystem each time the function is called. There is usually one ActorSystem per application, and it is created only once:
implicit val system = ActorSystem("Sys")
val settings = ActorMaterializerSettings(system)
implicit val materializer = ActorMaterializer(settings)
def readStream(...
2. You should try to avoid mutable collections and instead use Iterator with the corresponding functionality:
def concatFiles(path: String, date: String, numberOfDays: Int): Source[String, _] = {
  val formattedDate = LocalDate.parse(date, DateTimeFormatter.ofPattern("yyyyMMdd"))
  val pathsIterator: () => Iterator[Path] = () =>
    Iterator
      .range(0, numberOfDays + 1)
      .map(offset => formattedDate.minusDays(offset).toString.replace("-", ""))
      .map(dateStr => Paths.get(path + "transactions_" + dateStr + ".data"))
  // returns a Source of lines now; see point 3 below for turning it into a Future
  lineSourceFromPaths(pathsIterator)
}
3. Since you are dealing with Futures, you should not wait for them to complete; instead change the return type of concatFiles to Future[List[Array[String]]].
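As a sketch of that last point (assuming the Source-returning concatFiles from point 2 above, plus an implicit Materializer and ExecutionContext in scope; concatFileLines is a hypothetical name):
def concatFileLines(path: String, date: String, numberOfDays: Int): Future[List[Array[String]]] =
  concatFiles(path, date, numberOfDays)  // the Source[String, _] built in point 2
    .map(_.split('|'))                   // split each line into its fields
    .runWith(Sink.seq)                   // materialize: Future[Seq[Array[String]]]
    .map(_.toList)                       // convert to Future[List[Array[String]]]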

Sort by date in String Format in Scala Object

I have the following object structure in Scala (case classes):
{
  "accounts": [{
    "accountManagement": {
      "accountStatus": "submitted",
      "accountManagementId": "1513684862218",
      "submittedDate": "19/12/2017"
    }
  }]
}
Look at the list "accounts". I want to sort this list by the field "submittedDate" from "accountManagement". Note that submittedDate is a String.
I tried this, but it is not working:
for (accountManagement: AccountManagement <- accountManagementList) {
  try {
    if (accountManagement.submittedDate != null && accountManagement.submittedDate.nonEmpty) {
      accountManagement.submittedDate = dateFormatter.parse(accountManagement.submittedDate)
    }
  } catch {
    case e: Exception =>
  }
  accountManagementsNew = accountManagementsNew ::: List(accountManagement)
}
accountManagementsNew.sortBy(_.updatedDate.getTime)
Let's say your case class looks like this:
// Note that in Scala it is preferred to use Option to indicate nullable fields
case class AccountManagement(accountStatus: String,
                             accountManagementId: Long,
                             submittedDate: Option[String])

val accounts = List(
  AccountManagement("submitted", 1L, Some("21/12/2017")),
  AccountManagement("submitted", 2L, Some("19/12/2017")),
  AccountManagement("submitted", 3L, None),
  AccountManagement("submitted", 4L, Some("20/12/2017"))
)

val dtf = DateTimeFormatter.ofPattern("dd/MM/yyyy")
You can either define an implicit ordering that you want to use in this context
implicit val localDateOrdering: Ordering[LocalDate] = Ordering.by(_.toEpochDay)
accounts.filterNot(_.submittedDate.isEmpty) sortBy {
case AccountManagement(_, _, Some(submittedDateString)) => LocalDate.parse(submittedDateString, dtf)
}
or you can directly specify that you would like to use the epoch-day representation of the date to sort your data set:
accounts.filterNot(_.submittedDate.isEmpty) sortBy {
case AccountManagement(_, _, Some(submittedDateString)) => LocalDate.parse(submittedDateString, dtf).toEpochDay
}
If I understand your requirement correctly, you can first assemble a list of AccountManagementNew from AccountManagement by converting the String-type date to LocalDate, then perform the sorting by date as shown below. Note that Try is used to handle Success/Failure cases.
import java.time.LocalDate
import java.time.format.DateTimeFormatter
import scala.util.{Try, Success, Failure}

case class AccountManagement(
  accountStatus: String, accountManagementId: String, submittedDate: String
)

case class AccountManagementNew(
  accountStatus: String, accountManagementId: String, updatedDate: LocalDate
)

val accountManagementList = List[AccountManagement](
  AccountManagement("submitted", "1513684862218", "19/12/2017"),
  AccountManagement("submitted", "1513684862219", "09/01/2018"),
  AccountManagement("submitted", "1513684862220", "29/11/2017")
)

val datePattern = DateTimeFormatter.ofPattern("dd/MM/yyyy")

// Assemble a list of the AccountManagementNew case class
val amNewList =
  for (am <- accountManagementList) yield {
    Try(LocalDate.parse(am.submittedDate, datePattern)) match {
      case Success(d) =>
        AccountManagementNew(am.accountStatus, am.accountManagementId, d)
      case Failure(_) =>
        AccountManagementNew(am.accountStatus, am.accountManagementId, LocalDate.MIN)
    }
  }

// Use `LocalDate.toEpochDay` for date ordering
implicit val dateOrdering = Ordering.by { d: LocalDate => d.toEpochDay }

amNewList.sortBy(_.updatedDate)
// res1: List[AccountManagementNew] = List(
//   AccountManagementNew(submitted,1513684862220,2017-11-29),
//   AccountManagementNew(submitted,1513684862218,2017-12-19),
//   AccountManagementNew(submitted,1513684862219,2018-01-09)
// )

Mongo-Scala-Driver: CodecConfigurationException: can't find a codec for class immutable.Document

Error message:
org.bson.codecs.configuration.CodecConfigurationException: Can't find a codec for class org.mongodb.scala.bson.collection.immutable.Document
Code:
def queueWrite(collection: String, filter: Map[String, () => String], data: Map[String, () => String]) {
  val col = collections.get(collection).get

  val filterBson = Document()
  filter.foreach(f => { filterBson.append(f._1, f._2.apply) })

  val dataBson = Document()
  data.foreach(f => { dataBson.append(f._1, f._2.apply) })

  val options = new FindOneAndUpdateOptions
  options.returnDocument(ReturnDocument.AFTER)
  options.upsert(true)

  val observer = new Observer[Document] {
    override def onNext(doc: Document) = println(doc.toJson)
    override def onError(e: Throwable) = e.printStackTrace
    override def onComplete = println("onComplete")
  }

  val observable: Observable[Document] = col.findOneAndUpdate(filterBson, dataBson, options)
  observable.subscribe(observer)
}
Called with:
val filter = Map[String, () => String]("uuid" -> (() => p.getUniqueId.toString))
var dataMap = Map[String, () => String]()
dataMap = dataMap + ("uuid" -> (() => p.getUniqueId.toString))
dataMap = dataMap + ("nickname" -> (() => p.getDisplayName))
queueWrite("players", filter, dataMap)
I've tried using mutable documents, but then realized that findOneAndUpdate returns an immutable Document. I also tried using a BsonDocument for the filter with equal, but that of course had no effect. I'm not really sure where to go from here; any help would be greatly appreciated :)
My MongoClientSettings looked like this before:
private val settings = MongoClientSettings.builder
  .clusterSettings(clusterSettings)
  .build
I needed to change it to this:
private val settings = MongoClientSettings.builder
  .clusterSettings(clusterSettings)
  .codecRegistry(MongoClient.DEFAULT_CODEC_REGISTRY)
  .build
It seems the Mongo client does not assume the default codec registry on its own.
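For completeness, a minimal sketch of how those settings are then used to obtain a Document collection (the database and collection names here are placeholders):
val client = MongoClient(settings)
val database = client.getDatabase("mydb")  // placeholder database name
val collection: MongoCollection[Document] = database.getCollection("players")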
Thanks to @Ross for the help!

Play framework - Using anorm with Option[LocalDate] \ Option[LocalDateTime]

I am trying to define a nullable date field in Postgres, while using anorm to connect to the database.
I am trying to update an entry:
def update(id: Long, startDate: Option[LocalDate]) {
  SQL("""UPDATE my_table
        |SET start_date = {start_date}
        |WHERE id = {id}
      """.stripMargin)
    .on(
      'id -> id,
      'start_date -> startDate
    ).executeUpdate()
}
But I get a compilation error; it looks like anorm can't handle Option[DateTime], although when I configured a parser it works for me:
val parser: RowParser[Info] = {
  get[Long]("id") ~
  get[Option[DateTime]]("start_date") map {
    case id ~ startDate => Info(id, startDate)
  }
}
What am I missing here?
Thanks!
I added my own implicit definitions:
// fmt here is assumed to be a Joda-Time DateTimeFormatter used to parse date strings
implicit def rowToLocalDate: Column[LocalDate] = Column.nonNull { (value, meta) =>
  val MetaDataItem(qualified, nullable, clazz) = meta
  value match {
    case ts: java.sql.Timestamp => Right(new LocalDate(ts.getTime))
    case d: java.sql.Date => Right(new LocalDate(d.getTime))
    case str: java.lang.String => Right(fmt.parseLocalDate(str))
    case _ => Left(TypeDoesNotMatch("Cannot convert " + value + ":" + value.asInstanceOf[AnyRef].getClass))
  }
}

implicit val localDateToStatement = new ToStatement[LocalDate] {
  def set(s: java.sql.PreparedStatement, index: Int, aValue: LocalDate): Unit = {
    s.setTimestamp(index, new java.sql.Timestamp(aValue.toDateTimeAtStartOfDay().getMillis()))
  }
}
And the relevant ParameterMetaData
implicit object LocalDateClassMetaData extends ParameterMetaData[LocalDate] {
  val sqlType = ParameterMetaData.DateParameterMetaData.sqlType
  val jdbcType = ParameterMetaData.DateParameterMetaData.jdbcType
}
That did the trick.
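With these implicits in scope, anorm can also bind Option[LocalDate] parameters (the ParameterMetaData instance is what enables the optional binding), so the update above compiles; a short usage sketch with placeholder values:
update(42L, Some(LocalDate.now()))  // binds start_date to the given date
update(42L, None)                   // binds start_date to NULL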
Related question: Anorm compare/search by java.time LocalDateTime. What worked for me was simply updating to a newer (not-yet-released) version.