Parsing a simple array with Spray-json - scala

I'm trying (and failing) to get my head around how spray-json converts JSON feeds into objects. If I have a simple key -> value JSON feed then it seems to work OK, but the data I want to read comes in a list like this:
[{
  "name": "John",
  "age": "30"
},
{
  "name": "Tom",
  "age": "25"
}]
And my code looks like this:
package jsontest
import spray.json._
import DefaultJsonProtocol._
object JsonFun {
  case class Person(name: String, age: String)
  case class FriendList(items: List[Person])
  object FriendsProtocol extends DefaultJsonProtocol {
    implicit val personFormat = jsonFormat2(Person)
    implicit val friendListFormat = jsonFormat1(FriendList)
  }
  def main(args: Array[String]): Unit = {
    import FriendsProtocol._
    val input = scala.io.Source.fromFile("test.json")("UTF-8").mkString.parseJson
    val friendList = input.convertTo[FriendList]
    println(friendList)
  }
}
If I change my test file so it just has a single person (not in an array) and run val friendList = input.convertTo[Person], then it works and everything parses. But as soon as I try to parse an array, it fails with the error Object expected in field 'items'.
Can anyone point me in the direction of what I'm doing wrong?

Well, as is often the way, immediately after posting to Stack Overflow (and after spending hours trying to get this working), I've managed to get it to work.
The correct implementation of FriendsProtocol was:
object FriendsProtocol extends DefaultJsonProtocol {
  implicit val personFormat = jsonFormat2(Person)
  implicit object friendListJsonFormat extends RootJsonFormat[FriendList] {
    def read(value: JsValue) = FriendList(value.convertTo[List[Person]])
    def write(f: FriendList) = ???
  }
}
Telling Spray how to read / write (just read in my case) the list object is enough to get it working.
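A minimal usage sketch, assuming the same test.json as in the question and the protocol above in scope:
import spray.json._
import FriendsProtocol._
val friends = scala.io.Source.fromFile("test.json")("UTF-8").mkString
  .parseJson
  .convertTo[FriendList]
friends.items.foreach(p => println(p.name)) // prints John, then Tom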
Hope that helps someone else!

To make the friend array easier to use, extend the IndexedSeq[Person] trait by implementing the appropriate apply and length methods. This lets you use standard Scala collections API methods like map, filter and sortBy directly on the FriendArray instance itself, without having to access the underlying Array[Person] value that it wraps.
case class Person(name: String, age: String)
// this case class extends IndexedSeq[Person], which
// will allow you to use .map, .filter etc. on FriendArray directly
case class FriendArray(items: Array[Person]) extends IndexedSeq[Person] {
  def apply(index: Int) = items(index)
  def length = items.length
}
object FriendsProtocol extends DefaultJsonProtocol {
  implicit val personFormat = jsonFormat2(Person)
  implicit object friendListJsonFormat extends RootJsonFormat[FriendArray] {
    def read(value: JsValue) = FriendArray(value.convertTo[Array[Person]])
    def write(f: FriendArray) = ???
  }
}
import FriendsProtocol._
val input = jsonString.parseJson
val friends = input.convertTo[FriendArray]
friends.map(x => println(x.name))
println(friends.length)
This will then print:
John
Tom
2

Related

Handling nulls with json Play

I am trying to parse JSON with null values for some fields using the Play JSON library. There is a case class which represents the data:
case class Id(value: Int) extends AnyVal
case class Name(value: String) extends AnyVal
case class Number(value: Int) extends AnyVal
case class Data(id: Option[Id], name: Option[Name], number: Option[Number])
Here is how parsing currently works:
def parse(jsValue: JsValue): Try[Seq[Data]] = Try {
  jsValue.as[JsArray].value
    .flatMap { record =>
      val id = Id((record \ "id").as[Int])
      val name = Name((record \ "name").as[String])
      val number = Number((record \ "number").as[Int])
      Some(Data(Some(id), Some(name), Some(number)))
    }
}
Parsing with specific data types doesn't handle null cases, so this implementation returns:
Failure(play.api.libs.json.JsResultException: JsResultException(errors:List((,List(JsonValidationError(List(error.expected.jsstring),WrappedArray()))))))
For the input data like this:
{
  "id": 1248,
  "default": false,
  "name": null,
  "number": 2
}
I would like to have something like this: Seq(Data(Some(Id(1248)), None, Some(Number(2))))
I am going to write the data into the database so I do not mind writing some null values for these fields.
How can I handle null values for fields in parsed json?
You can simply let the play-json library generate the Reads for your case classes instead of writing them manually:
import play.api.libs.json._
object Data {
  implicit val reads: Reads[Data] = {
    // move these into the corresponding companion objects if used elsewhere...
    implicit val idReads = Json.reads[Id]
    implicit val numberReads = Json.reads[Number]
    implicit val nameReads = Json.reads[Name]
    Json.reads[Data]
  }
}
def parse(jsValue: JsValue): Try[Seq[Data]] = Json.fromJson[Seq[Data]](jsValue).toTry
That way, your code will keep working even if you change the arguments of your case classes.
If you still want to code it manually, you can use the readNullable parser:
val name: Option[Name] = record.as((__ \ "name").readNullable[String]).map(Name(_))
Note, however, that using Try is somewhat frowned upon in FP and directly using JsResult would be more idiomatic.
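As a sketch of that more idiomatic shape (assuming the Reads defined above), the parse method can simply return the JsResult and let callers handle both outcomes explicitly:
import play.api.libs.json._
def parse(jsValue: JsValue): JsResult[Seq[Data]] = Json.fromJson[Seq[Data]](jsValue)
// a caller keeps the validation errors instead of a generic Throwable:
def handle(json: JsValue): Unit = parse(json) match {
  case JsSuccess(data, _) => println(data)
  case JsError(errors)    => println(errors)
}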
If you are not locked into play-json, then let me show how it can be done easily with jsoniter-scala:
import com.github.plokhotnyuk.jsoniter_scala.core._
import com.github.plokhotnyuk.jsoniter_scala.macros._
implicit val codec: JsonValueCodec[Seq[Data]] = JsonCodecMaker.make(CodecMakerConfig)
val json: Array[Byte] = """[
  {
    "id": 1248,
    "default": false,
    "name": null,
    "number": 2
  }
]""".getBytes("UTF-8")
val data: Seq[Data] = readFromArray(json)
println(data)
That will produce the following output:
List(Data(Some(Id(1248)),None,Some(Number(2))))
Here you can see an example of how to integrate it with the Play framework.

DSL Like Syntax in Scala

I'm trying to come up with a CSV Parser that can be called like this:
parser parse "/path/to/csv/file" using parserConfiguration
Where the parser will be a class that contains the target case class into which the CSV file will be parsed:
class CSVParser[A] {
  def parse(path: String) = Source.fromFile(path).getLines().mkString("\n")
  def using(cfg: ParserConfig) = ??? // How do I chain this optionally???
}
val parser = CSVParser[SomeCaseClass]
I managed to get up to the point where I can call:
parser parse "/the/path/to/the/csv/file/"
But I do not want to run the parse method yet, as I want to apply the configuration with the using-style DSL mentioned above! So there are two rules here: if the caller does not supply a parserConfig, I should be able to run with the default, but if the user supplies a parserConfig, I want to apply the config and then run the parse method. I tried it with a combination of implicits, but could not get them to work properly!
Any suggestions?
EDIT: So the solution looks like this as per comments from "Cyrille Corpet":
class CSVReader[A] {
  def parse(path: String) = ReaderWithFile[A](path)
  case class ReaderWithFile[A](path: String) {
    def using(cfg: CSVParserConfig): Seq[A] = {
      val lines = Source.fromFile(path).getLines().mkString("\n")
      println(lines)
      println(cfg)
      null
    }
  }
  object ReaderWithFile {
    implicit def parser2parsed[A](parser: ReaderWithFile[A]): Seq[A] = parser.using(defaultParserCfg)
  }
}
object CSVReader extends App {
  def parser[A] = new CSVReader[A]
  // assign this to a val with an explicit type so that the implicit conversion gets applied!! Very important to note!
  val sss: Seq[SomeCaseClass] = parser[SomeCaseClass] parse "/csv-parser/test.csv"
}
I guess I need to get the implicit in scope at the location where I call the parser parse, but at the same time I do not want to mess up the structure that I have above!
If you replace using with an operator with a higher precedence than parse you can get it to work without needing extra type annotations. Take for instance <<:
object parsedsl {
  class ParserConfig
  object ParserConfig {
    val default = new ParserConfig
  }
  case class ParseUnit(path: String, config: ParserConfig)
  object ParseUnit {
    implicit def path2PU(path: String) = ParseUnit(path, ParserConfig.default)
  }
  implicit class ConfigSyntax(path: String) {
    def <<(config: ParserConfig) = ParseUnit(path, config)
  }
  class CSVParser {
    def parse(pu: ParseUnit) = "parsing"
  }
}
import parsedsl._
val parser = new CSVParser
parser parse "path" << ParserConfig.default
parser parse "path"
Your parse method should just give a partial result, without doing anything at all. To deal with the default implementation, you can use an implicit conversion to the output type:
class CSVParser[A] {
  def parse(path: String) = ParserWithFile[A](path)
}
case class ParserWithFile[A](path: String) {
  def using(cfg: ParserConfig): A = ???
}
object ParserWithFile {
  implicit def parser2parsed[A](parser: ParserWithFile[A]): A = parser.using(ParserConfig.default)
}
val parser = new CSVParser[SomeCaseClass]
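A usage sketch (SomeCaseClass and someConfig are placeholders standing in for the question's types):
val withConfig: SomeCaseClass = parser parse "/path/to/file.csv" using someConfig
// with no explicit config, the expected type on the val triggers the implicit
// conversion, which applies ParserConfig.default
val withDefault: SomeCaseClass = parser parse "/path/to/file.csv"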

Adding functionality before calling constructor in extra constructor

Is it possible to add functionality before calling the primary constructor from an auxiliary constructor in Scala?
Let's say I have a class User and want to take one string, split it into attributes, and send them to the constructor:
class User(val name: String, val age: Int){
  def this(line: String) = {
    val attrs = line.split(",") // This line leads to an error - what can I do instead?
    this(attrs(0), attrs(1).toInt)
  }
}
So I know I'm not able to add a line before the call to this, because every auxiliary constructor needs to call another constructor as its first statement.
Then what can I do instead?
Edit:
I have a long list of attributes, so I don't want to repeat line.split(",")
I think this is a place where a companion object and an apply() method come nicely into play:
object User {
  def apply(line: String): User = {
    val attrs = line.split(",")
    new User(attrs(0), attrs(1).toInt)
  }
}
class User(val name: String, val age: Int)
Then you just create your object the following way:
val u1 = User("Zorro,33")
Also, since you're exposing name and age anyway, you might consider using a case class instead of a standard class and have a consistent way of constructing User objects (without the new keyword):
object User {
  def apply(line: String): User = {
    val attrs = line.split(",")
    new User(attrs(0), attrs(1).toInt)
  }
}
case class User(name: String, age: Int)
val u1 = User("Zorro,33")
val u2 = User("Zorro", 33)
Ugly, but working solution#1:
class User(val name: String, val age: Int){
  def this(line: String) = {
    this(line.split(",")(0), line.split(",")(1).toInt)
  }
}
Ugly, but working solution#2:
class User(val name: String, val age: Int)
object User {
  def fromString(line: String) = {
    val attrs = line.split(",")
    new User(attrs(0), attrs(1).toInt)
  }
}
Which can be used as:
val johny = User.fromString("johny,35")
You could use apply in place of fromString, but this would lead to confusion (in one case you have to use new, in the other you have to drop it), so I prefer to use a different name.
Another ugly solution:
class User(line: String) {
  def this(name: String, age: Int) = this(s"$name,$age")
  val (name, age) = {
    val Array(nameStr, ageStr) = line.split(",")
    (nameStr, ageStr.toInt)
  }
}
But using a method of the companion object is probably better.

Scala Reflection to update a case class val

I'm using Scala and Slick here, and I have a BaseRepository which is responsible for doing the basic CRUD for my classes.
As a design decision, we have updatedTime and createdTime columns all handled by the application, and not by triggers in the database. Both of these fields are Joda DateTime instances.
Those fields are defined in two traits called HasUpdatedAt and HasCreatedAt, for the tables:
trait HasCreatedAt {
  val createdAt: Option[DateTime]
}
case class User(name: String, createdAt: Option[DateTime] = None) extends HasCreatedAt
I would like to know how I can use reflection to call the User copy method, to update the createdAt value during the database insert method.
Edit after #vptron and #kevin-wright comments
I have a repo like this
trait BaseRepo[ID, R] {
  def insert(r: R)(implicit session: Session): ID
}
I want to implement the insert just once, and there I want createdAt to be updated; that's why I'm not calling the copy method directly, otherwise I would need to implement it everywhere I use the createdAt column.
I'm answering this question here to help others with this kind of problem.
I ended up using this code to execute the copy method of my case classes using Scala reflection.
import reflect._
import scala.reflect.runtime.universe._
import scala.reflect.runtime._
class Empty
val mirror = universe.runtimeMirror(getClass.getClassLoader)
// paramName is the parameter whose value I want to replace
// paramValue is the new parameter value
def updateParam[R : ClassTag](r: R, paramName: String, paramValue: Any): R = {
  val instanceMirror = mirror.reflect(r)
  val decl = instanceMirror.symbol.asType.toType
  val members = decl.members.map(method => transformMethod(method, paramName, paramValue, instanceMirror)).filter {
    case _: Empty => false
    case _ => true
  }.toArray.reverse
  val copyMethod = decl.declaration(newTermName("copy")).asMethod
  val copyMethodInstance = instanceMirror.reflectMethod(copyMethod)
  copyMethodInstance(members: _*).asInstanceOf[R]
}
def transformMethod(method: Symbol, paramName: String, paramValue: Any, instanceMirror: InstanceMirror) = {
  val term = method.asTerm
  if (term.isAccessor) {
    if (term.name.toString == paramName) {
      paramValue
    } else instanceMirror.reflectField(term).get
  } else new Empty
}
With this I can execute the copy method of my case classes, replacing the value of a given field.
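A usage sketch, assuming the User case class from the question and Joda's DateTime on the classpath:
import org.joda.time.DateTime
val user = User("John")
val saved = updateParam(user, "createdAt", Some(DateTime.now))
println(saved) // a new User with createdAt set; user itself is unchanged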
As the comments have said, don't change a val using reflection. Would you do that with a Java final variable? It makes your code do really unexpected things. If you need to change the value of a val, don't use a val, use a var.
trait HasCreatedAt {
  var createdAt: Option[DateTime] = None
}
case class User(name: String) extends HasCreatedAt
Although having a var in a case class may bring some unexpected behavior, e.g. copy would not work as expected (see the sketch below). This may lead to preferring not to use a case class for this.
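A quick sketch of that pitfall, assuming the var-based trait above and Joda's DateTime:
val u = User("John")
u.createdAt = Some(DateTime.now)
val v = u.copy(name = "Tom")
// v.createdAt is None again: the field lives in the trait, not in the case
// class constructor, so copy() does not carry it over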
Another approach would be to make the insert method return an updated copy of the case class, e.g.:
trait HasCreatedAt[Self <: HasCreatedAt[Self]] {
  val createdAt: Option[DateTime]
  def withCreatedAt(dt: DateTime): Self
}
case class User(name: String, createdAt: Option[DateTime] = None) extends HasCreatedAt[User] {
  def withCreatedAt(dt: DateTime) = this.copy(createdAt = Some(dt))
}
trait BaseRepo[ID, R <: HasCreatedAt[R]] {
  def insert(r: R)(implicit session: Session): (ID, R) = {
    val id = ??? // insert into db
    (id, r.withCreatedAt(??? /*now*/))
  }
}
EDIT:
Since I didn't answer your original question, and you may know what you are doing, I am adding a way to do this.
import scala.reflect.runtime.universe._
val user = User("aaa", None)
val m = runtimeMirror(getClass.getClassLoader)
val im = m.reflect(user)
val decl = im.symbol.asType.toType.declaration("createdAt":TermName).asTerm
val fm = im.reflectField(decl)
fm.set(??? /*now*/)
But again, please don't do this. Read this Stack Overflow answer to get some insight into what it can cause (vals map to final fields).

How to represent optional fields in spray-json?

I have an optional field on my requests:
case class SearchRequest(url: String, nextAt: Option[Date])
My protocol is:
object SearchRequestJsonProtocol extends DefaultJsonProtocol {
  implicit val searchRequestFormat = jsonFormat(SearchRequest, "url", "nextAt")
}
How do I mark the nextAt field optional, such that the following JSON objects will be correctly read and accepted:
{"url":"..."}
{"url":"...", "nextAt":null}
{"url":"...", "nextAt":"2012-05-30T15:23Z"}
I actually don't really care about the null case, but if you have details, it would be nice. I'm using spray-json, and was under the impression that using an Option would skip the field if it was absent on the original JSON object.
Works for me (spray-json 1.1.1, Scala 2.9.1 build):
import cc.spray.json._
import cc.spray.json.DefaultJsonProtocol._
// string instead of date for simplicity
case class SearchRequest(url: String, nextAt: Option[String])
// btw, you could use jsonFormat2 method here
implicit val searchRequestFormat = jsonFormat(SearchRequest, "url", "nextAt")
assert {
  List(
    """{"url":"..."}""",
    """{"url":"...", "nextAt":null}""",
    """{"url":"...", "nextAt":"2012-05-30T15:23Z"}""")
    .map(_.asJson.convertTo[SearchRequest]) == List(
      SearchRequest("...", None),
      SearchRequest("...", None),
      SearchRequest("...", Some("2012-05-30T15:23Z")))
}
You might have to create an explicit format (warning: pseudocode-ish):
object SearchRequestJsonProtocol extends DefaultJsonProtocol {
  implicit object SearchRequestJsonFormat extends JsonFormat[SearchRequest] {
    def read(value: JsValue) = value match {
      case JsObject(List(
        JsField("url", JsString(url)),
        JsField("nextAt", JsString(nextAt)))) =>
        SearchRequest(url, Some(new Instant(nextAt)))
      case JsObject(List(JsField("url", JsString(url)))) =>
        SearchRequest(url, None)
      case _ =>
        throw new DeserializationException("SearchRequest expected")
    }
    def write(obj: SearchRequest) = obj.nextAt match {
      case Some(nextAt) =>
        JsObject(JsField("url", JsString(obj.url)),
          JsField("nextAt", JsString(nextAt.toString)))
      case None => JsObject(JsField("url", JsString(obj.url)))
    }
  }
}
Use the NullOptions trait to disable skipping nulls:
https://github.com/spray/spray-json#nulloptions
Example:
https://github.com/spray/spray-json/blob/master/src/test/scala/spray/json/ProductFormatsSpec.scala
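A minimal sketch of what that looks like, assuming the Option[String] variant of SearchRequest from the first answer (with NullOptions mixed in, None fields are written as null instead of being omitted):
import spray.json._
object SearchRequestJsonProtocol extends DefaultJsonProtocol with NullOptions {
  implicit val searchRequestFormat = jsonFormat2(SearchRequest)
}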
Don't know if this will help you, but you can give that field a default value in the case class definition, so if the field is not in the JSON, the default value will be assigned to it.
Easy.
import cc.spray.json._
import org.scalatest.FunSuite
trait MyJsonProtocol extends DefaultJsonProtocol {
  implicit val searchFormat = new JsonWriter[SearchRequest] {
    def write(r: SearchRequest): JsValue = {
      JsObject(
        "url" -> JsString(r.url),
        "next_at" -> r.nextAt.toJson
      )
    }
  }
}
class JsonTest extends FunSuite with MyJsonProtocol {
  test("JSON") {
    val search = new SearchRequest("www.site.ru", None)
    val marshalled = search.toJson
    println(marshalled)
  }
}
For anyone who is chancing upon this post and wants an update to François Beausoleil's answer for newer versions of Spray (circa 2015+?), JsField is deprecated as a public member of JsValue; you should simply supply a list of tuples instead of JsFields. Their answer is spot-on, though.
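A rough sketch of the write side of that explicit format using the tuple-based JsObject constructor (same SearchRequest as above), for illustration:
def write(obj: SearchRequest): JsValue = obj.nextAt match {
  case Some(nextAt) =>
    JsObject("url" -> JsString(obj.url), "nextAt" -> JsString(nextAt.toString))
  case None =>
    JsObject("url" -> JsString(obj.url))
}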