I have a case class that looks something like this:
case class Response(
@JsonDeserialize(contentAs = classOf[java.lang.Long])
longList: List[Long] = null)
I have a custom ObjectMapper that, among other things, registers DefaultScalaModule. According to https://github.com/FasterXML/jackson-module-scala/wiki/FAQ, adding @JsonDeserialize should solve the issue, but it doesn't.
The issue is in my tests, and I get the following error message
java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.Long
at scala.runtime.BoxesRunTime.unboxToLong(BoxesRunTime.java:105)
Test class; list.head is what triggers the error:
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
class ControllerTest {
@Autowired
var testRestTemplate: TestRestTemplate = _
@Autowired
var objectMapper: ObjectMapper = _
@Test
def test(): Unit = {
val response = testRestTemplate.exchange("url", HttpMethod.GET, null, classOf[Response])
val list = response.getBody.longList
val a = list.head
}
}
Debugging tells me that my list is actually of type $colon$colon (i.e. scala.collection.immutable.::) containing Integers
I have an Enumeration in Scala
object Status extends Enumeration {
type Status = Value
val Success = Value
val Error = Value
}
This is used in the case class below:
case class Response(
status: Status,
errorMessage: String
)
I want to store Response in a file, so I am using Jackson's object mapper (com.fasterxml.jackson.databind.ObjectMapper) to serialize it.
writeOutputToFile(filePath: Path , objectMapper.writeValueAsString(response))
However, the object mapper writes an empty JSON object to the file. I know the object mapper requires a getter method to serialize a field. Is that why this is failing? Would I need a custom object mapper?
You can define a generic serializer and deserializer for all enumerations and then register a corresponding pair of instances for each of them:
class EnumSerializer[T <: scala.Enumeration](e: T) extends JsonSerializer[T#Value] {
override def serialize(x: T#Value, jg: JsonGenerator, spro: SerializerProvider): Unit =
jg.writeString(x.toString)
}
class EnumDeserializer[T <: scala.Enumeration](e: T) extends JsonDeserializer[T#Value] {
private[this] val ec = new ConcurrentHashMap[String, T#Value]
override def deserialize(jp: JsonParser, ctxt: DeserializationContext): T#Value = Try {
val s = jp.getValueAsString
var x = ec.get(s)
if (x eq null) {
x = e.values.iterator.find(_.toString == s).get
ec.put(s, x)
}
x
}.getOrElse(ctxt.handleUnexpectedToken(classOf[T#Value], jp).asInstanceOf[T#Value])
}
val objectMapper: ObjectMapper with ScalaObjectMapper = {
val jsonFactory = new JsonFactoryBuilder()
.configure(...)
.build()
new ObjectMapper(jsonFactory) with ScalaObjectMapper {
registerModule(DefaultScalaModule)
registerModule(new SimpleModule()
.addSerializer(classOf[Status], new EnumSerializer(Status))
.addDeserializer(classOf[Status], new EnumDeserializer(Status))
...
)
}
}
The proposed solution is much safer and more efficient at runtime than simply calling Status.withName(s) on every deserialization.
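For comparison, here is a minimal sketch of that simpler withName-based alternative (the class name SimpleEnumDeserializer is illustrative and not part of the original answer; it assumes the same Jackson types used above):

```scala
import com.fasterxml.jackson.core.JsonParser
import com.fasterxml.jackson.databind.{DeserializationContext, JsonDeserializer}

// Simpler but uncached: withName performs a linear scan over the enum's values
// on every read and throws NoSuchElementException for unknown names.
class SimpleEnumDeserializer[T <: scala.Enumeration](e: T) extends JsonDeserializer[T#Value] {
  override def deserialize(jp: JsonParser, ctxt: DeserializationContext): T#Value =
    e.withName(jp.getValueAsString)
}
```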
The cached approach above will work even for dynamically defined enum values, like:
object Status extends Enumeration {
type Status = Value
val Success = Value
val Error = Value
def extra(name: String): Status = Value(nextId, name)
}
Status.extra("Unknown")
Full sources are here.
I wrote some Scala code, using reflection, that returns all vals in an object that are of a certain type. Below are three versions of this code. One of them works but is ugly. Two attempts to improve it don't work, in very different ways. Can you explain why?
First, the code:
import scala.reflect.runtime._
import scala.util.Try
trait ScopeBase[T] {
// this version tries to generalize the type. The only difference
// from the working version is [T] instead of [String]
def enumerateBase[S: universe.TypeTag]: Seq[T] = {
val mirror = currentMirror.reflect(this)
universe.typeOf[S].decls.map {
decl => Try(mirror.reflectField(decl.asMethod).get.asInstanceOf[T])
}.filter(_.isSuccess).map(_.get).filter(_ != null).toSeq
}
}
trait ScopeString extends ScopeBase[String] {
// This version works but requires passing the val type
// (String, in this example) explicitly. I don't want to
// duplicate the code for different val types.
def enumerate[S: universe.TypeTag]: Seq[String] = {
val mirror = currentMirror.reflect(this)
universe.typeOf[S].decls.map {
decl => Try(mirror.reflectField(decl.asMethod).get.asInstanceOf[String])
}.filter(_.isSuccess).map(_.get).filter(_ != null).toSeq
}
// This version tries to avoid passing the object's type
// as the [S] type parameter. After all, the method is called
// on the object itself; so why pass the type?
def enumerateThis: Seq[String] = {
val mirror = currentMirror.reflect(this)
universe.typeOf[this.type].decls.map {
decl => Try(mirror.reflectField(decl.asMethod).get.asInstanceOf[String])
}.filter(_.isSuccess).map(_.get).filter(_ != null).toSeq
}
}
// The working example
object Test1 extends ScopeString {
val IntField: Int = 13
val StringField: String = "test"
lazy val fields = enumerate[Test1.type]
}
// This shows how the attempt to generalize the type doesn't work
object Test2 extends ScopeString {
val IntField: Int = 13
val StringField: String = "test"
lazy val fields = enumerateBase[Test2.type]
}
// This shows how the attempt to drop the object's type doesn't work
object Test3 extends ScopeString {
val IntField: Int = 13
val StringField: String = "test"
lazy val fields = enumerateThis
}
val test1 = Test1.fields // List(test)
val test2 = Test2.fields // List(13, test)
val test3 = Test3.fields // List()
The "enumerate" method does work. However, as you can see from the Test1 example, it requires passing the object's own type (Test1.type) as a parameter, which should not have been necessary. The "enumerateThis" method tries to avoid that but fails, producing an empty list. The "enumerateBase" method attempts to generalize the "enumerate" code by passing the val type as a parameter. But it fails, too, producing the list of all vals, not just those of a certain type.
Any idea what's going on?
The problem in your generic implementation is the loss of type information for T. Also, don't use exceptions as your primary control-flow mechanism (they're very slow!). Here's a working version of your base class.
abstract class ScopeBase[T : universe.TypeTag, S <: ScopeBase[T, S] : universe.TypeTag : scala.reflect.ClassTag] {
self: S =>
def enumerateBase: Seq[T] = {
val mirror = currentMirror.reflect(this)
universe.typeOf[S].baseClasses.map(_.asType.toType).flatMap(
_.decls
.filter(_.typeSignature.resultType <:< universe.typeOf[T])
.filter(_.isMethod)
.map(_.asMethod)
.filter(_.isAccessor)
.map(decl => mirror.reflectMethod(decl).apply().asInstanceOf[T])
.filter(_ != null)
).toSeq
}
}
trait Inherit {
val StringField2: String = "test2"
}
class Test1 extends ScopeBase[String, Test1] with Inherit {
val IntField: Int = 13
val StringField: String = "test"
lazy val fields = enumerateBase
}
object Test extends App {
println(new Test1().fields)
}
Instead of getting the type from universe.typeOf, you can use the runtime class via currentMirror.classSymbol(getClass).toType. Below is an example that works:
def enumerateThis: Seq[String] = {
val mirror = currentMirror.reflect(this)
currentMirror.classSymbol(getClass).toType.decls.map {
decl => Try(mirror.reflectField(decl.asMethod).get.asInstanceOf[String])
}.filter(_.isSuccess).map(_.get).filter(_ != null).toSeq
}
//prints List(test)
With everyone's help, here's the final version that works:
import scala.reflect.runtime.{currentMirror, universe}
abstract class ScopeBase[T: universe.TypeTag] {
lazy val enumerate: Seq[T] = {
val mirror = currentMirror.reflect(this)
currentMirror.classSymbol(getClass).baseClasses.map(_.asType.toType).flatMap {
_.decls
.filter(_.typeSignature.resultType <:< universe.typeOf[T])
.filter(_.isMethod)
.map(_.asMethod)
.filterNot(_.isConstructor)
.filter(_.paramLists.size == 0)
.map(decl => mirror.reflectField(decl.asMethod).get.asInstanceOf[T])
.filter(_ != null).toSeq
}
}
}
trait FieldScope extends ScopeBase[Field[_]]
trait DbFieldScope extends ScopeBase[DbField[_, _]] {
// etc....
}
As you see from the last few lines, my use cases are limited to scope objects for specific field types. This is why I want to parameterize the scope container. If I wanted to enumerate the fields of multiple types in a single scope container, then I would have parameterized the enumerate method.
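To make that intended usage concrete, here is a hypothetical sketch (the Field case class and the UserFields object below are illustrative placeholders, not taken from the real code) of a scope object whose Field[_] vals are collected by enumerate while vals of other types are ignored:

```scala
// Illustrative placeholder for the real Field type.
case class Field[A](name: String)

// A scope object: enumerate should pick up Name and Age but skip notAField.
object UserFields extends ScopeBase[Field[_]] {
  val Name: Field[String] = Field("name")
  val Age: Field[Int] = Field("age")
  val notAField: String = "ignored"
}

// UserFields.enumerate should yield Field("name") and Field("age") (order not guaranteed).
```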
I have some working Jackson Scala module code for round-tripping Scala case classes. Jackson worked great for flat case classes, but when I made one that contains a list of other case classes, the amount of code I seemed to need was a lot. Consider:
abstract class Message
case class CardDrawn(player: Long, card: Int, mType: String = "CardDrawn") extends Message
case class CardSet(cards: List[CardDrawn], mType: String = "CardSet") extends Message
To get the CardSet to round-trip to/from JSON with the Jackson Scala module, I used a custom serializer/deserializer written in Java:
import java.lang.reflect.{ParameterizedType, Type}
import com.fasterxml.jackson.core.`type`.TypeReference
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.databind.module.SimpleModule
import com.fasterxml.jackson.module.scala.DefaultScalaModule
object ScrumGameMashaller {
val mapper = new ObjectMapper()
val module = new SimpleModule("CustomSerializer")
module.addSerializer(classOf[CardSet], new CardSetSerializer)
module.addDeserializer(classOf[CardSet], new CardSetDeserializer)
val scalaModule = DefaultScalaModule
mapper.registerModule(scalaModule)
mapper.registerModule(module)
def jsonFrom(value: Any): String = {
import java.io.StringWriter
val writer = new StringWriter()
mapper.writeValue(writer, value)
writer.toString
}
private[this] def objectFrom[T: Manifest](value: String): T =
mapper.readValue(value, typeReference[T])
private[this] def typeReference[T: Manifest] = new TypeReference[T] {
override def getType = typeFromManifest(manifest[T])
}
private[this] def typeFromManifest(m: Manifest[_]): Type = {
if (m.typeArguments.isEmpty) { m.runtimeClass }
else new ParameterizedType {
def getRawType = m.runtimeClass
def getActualTypeArguments = m.typeArguments.map(typeFromManifest).toArray
def getOwnerType = null
}
}
}
with the serializer:
public class CardSetSerializer extends JsonSerializer<CardSet> {
@Override
public void serialize(CardSet cardSet, JsonGenerator jgen, SerializerProvider provider) throws IOException, JsonProcessingException {
jgen.writeStartObject();
jgen.writeArrayFieldStart("cards");
List<CardDrawn> cardsDrawn = cardSet.cards();
scala.collection.Iterator<CardDrawn> iter = cardsDrawn.iterator();
while(iter.hasNext()){
CardDrawn cd = iter.next();
cdSerialize(jgen,cd);
}
jgen.writeEndArray();
jgen.writeStringField("mType", "CardSet");
jgen.writeEndObject();
}
private void cdSerialize(JsonGenerator jgen, CardDrawn cd) throws IOException, JsonProcessingException {
jgen.writeStartObject();
jgen.writeNumberField("player", cd.player());
jgen.writeNumberField("card", cd.card());
jgen.writeEndObject();
}
}
and the matching deserializer:
public class CardSetDeserializer extends JsonDeserializer<CardSet> {
private static class CardDrawnTuple {
Long player;
Integer card;
}
@Override
public CardSet deserialize(JsonParser jsonParser, DeserializationContext cxt) throws IOException, JsonProcessingException {
ObjectCodec oc = jsonParser.getCodec();
JsonNode root = oc.readTree(jsonParser);
JsonNode cards = root.get("cards");
Iterator<JsonNode> i = cards.elements();
List<CardDrawn> cardObjects = new ArrayList<>();
while( i.hasNext() ){
CardDrawnTuple t = new CardDrawnTuple();
ObjectNode c = (ObjectNode) i.next();
Iterator<Entry<String, JsonNode>> fields = c.fields();
while( fields.hasNext() ){
Entry<String,JsonNode> f = fields.next();
if( f.getKey().equals("player")) {
t.player = f.getValue().asLong();
} else if( f.getKey().equals("card")){
t.card = f.getValue().asInt();
} else {
System.err.println(CardSetDeserializer.class.getCanonicalName()+ " : unknown field " + f.getKey());
}
}
CardDrawn cd = new CardDrawn(t.player, t.card, "CardDrawn");
cardObjects.add(cd);
}
return new CardSet(JavaConversions.asScalaBuffer(cardObjects).toList(), "CardSet");
}
}
This seems like a lot of code to deal with something fairly vanilla in Scala. Can this code be improved (what did I miss that Jackson has to make this easy)? Otherwise, is there a library which will handle structured case classes automatically? The Jerkson examples looked easy, but that project seems to have been abandoned.
Argonaut does a great job. Mark Hibbard helped me out with getting the example below working. All that is needed is to create a codec for the types, and it will implicitly add an asJson method to your objects to turn them into strings, and a decodeOption[YourClass] method to strings to extract an object. The following:
package argonaut.example
import argonaut._, Argonaut._
abstract class Message
case class CardDrawn(player: Long, card: Int, mType: String = "CardDrawn") extends Message
case class CardSet(cards: List[CardDrawn], mType: String = "CardSet") extends Message
object CardSetExample {
implicit lazy val CodecCardSet: CodecJson[CardSet] = casecodec2(CardSet.apply, CardSet.unapply)("cards","mType")
implicit lazy val CodecCardDrawn: CodecJson[CardDrawn] = casecodec3(CardDrawn.apply, CardDrawn.unapply)("player", "card", "mType")
def main(args: Array[String]): Unit = {
val value = CardSet(List(CardDrawn(1L,2),CardDrawn(3L,4)))
println(s"Got some good json ${value.asJson}")
val jstring =
"""{
| "cards":[
| {"player":"1","card":2,"mType":"CardDrawn"},
| {"player":"3","card":4,"mType":"CardDrawn"}
| ],
| "mType":"CardSet"
| }""".stripMargin
val parsed: Option[CardSet] =
jstring.decodeOption[CardSet]
println(s"Got a good object ${parsed.get}")
}
}
outputs:
Got some good json {"cards":[{"player":"1","card":2,"mType":"CardDrawn"},{"player":"3","card":4,"mType":"CardDrawn"}],"mType":"CardSet"}
Got a good object CardSet(List(CardDrawn(1,2,CardDrawn), CardDrawn(3,4,CardDrawn)),CardSet)
The question is old, but maybe someone could still find it helpful. Apart from Argonaut, Scala has several JSON libraries. Here you can find a list of them, updated to the beginning of 2016 (it still gives you a good overall picture).
Most of them (probably all) should allow you to come up with a drier version of your custom serializer/deserializer. My preference goes to json4s, which aims to provide a single AST across multiple libraries, including Jackson (a bit like slf4j does for logging libraries). In this post you can find a working example of a JSON custom serializer/deserializer using Json4s and Akka HTTP.
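For illustration (this is not the example from the linked post, and the object name Json4sExample is just a placeholder), here is a minimal sketch of round-tripping the question's case classes with json4s' Jackson backend and its default formats, which handle nested case classes without any custom serializer:

```scala
import org.json4s.{DefaultFormats, Formats}
import org.json4s.jackson.Serialization.{read, write}

abstract class Message
case class CardDrawn(player: Long, card: Int, mType: String = "CardDrawn") extends Message
case class CardSet(cards: List[CardDrawn], mType: String = "CardSet") extends Message

object Json4sExample extends App {
  implicit val formats: Formats = DefaultFormats

  val set = CardSet(List(CardDrawn(1L, 2), CardDrawn(3L, 4)))
  val json = write(set)           // serialize the nested structure to a JSON string
  val back = read[CardSet](json)  // deserialize it back into the case class
  println(json)
  println(back)
}
```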
I am quite new to the Scala programming language, and I currently need to do the following. I have a singleton object like this:
object MyObject extends Serializable {
val map: HashMap[String, Int] = null
val x: Int = -1
val foo: String = ""
}
Now I want to avoid having to serialize each field of this object separately, so I was considering writing the whole object to a file and then, on the next execution of the program, reading the file and initializing the singleton object from it. Is there any way to do this?
Basically, what I want is: when the serialization file doesn't exist, the variables should be initialized to new structures, and when it does exist, the fields should be initialized from the ones in the file. But I want to avoid having to serialize/deserialize every field manually...
UPDATE:
I had to use a custom deserializer as presented here: https://issues.scala-lang.org/browse/SI-2403, since I had issues with a custom class I use as values inside the HashMap.
UPDATE2:
Here is the code I use to serialize:
val store = new ObjectOutputStream(new FileOutputStream(new File("foo")))
store.writeObject(MyData)
store.close
And the code to deserialize (in a different file):
@transient private lazy val loadedData: MyTrait = {
if(new File("foo").exists()) {
val in = new ObjectInputStream(new FileInputStream("foo")) {
override def resolveClass(desc: java.io.ObjectStreamClass): Class[_] = {
try { Class.forName(desc.getName, false, getClass.getClassLoader) }
catch { case ex: ClassNotFoundException => super.resolveClass(desc) }
}
}
val obj = in.readObject().asInstanceOf[MyTrait]
in.close
obj
}
else null
}
There is no need to serialize an object with only immutable fields (the compiler will do it for you). I will assume that the object provides default values. Here is a way to do this:
Start by writing a trait with all the required fields:
trait MyTrait {
def map: HashMap[String, Int]
def x: Int
def foo: String
}
Then write an object with the defaults:
object MyDefaults extends MyTrait {
val map = HashMap.empty[String, Int]
val x = -1
val foo = ""
}
Finally, write an implementation that deserializes the data if it exists:
object MyData extends MyTrait {
private lazy val loadedData: Option[MyTrait] = {
if( /* filename exists */ ) Some( /*unserialize filename as MyTrait*/)
else None
}
lazy val map = loadedData.getOrElse( MyDefaults ).map
lazy val x = loadedData.getOrElse( MyDefaults ).x
lazy val foo = loadedData.getOrElse( MyDefaults ).foo
}
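For completeness, here is a hedged sketch of how the two placeholders could be filled in with plain Java serialization, along the lines of the asker's UPDATE2 code (the file name "foo" is only an example):

```scala
import java.io.{File, FileInputStream, ObjectInputStream}

object MyData extends MyTrait {
  private lazy val loadedData: Option[MyTrait] = {
    val file = new File("foo") // example path
    if (file.exists()) {
      val in = new ObjectInputStream(new FileInputStream(file))
      try Some(in.readObject().asInstanceOf[MyTrait])
      finally in.close()
    } else None
  }

  lazy val map = loadedData.getOrElse(MyDefaults).map
  lazy val x = loadedData.getOrElse(MyDefaults).x
  lazy val foo = loadedData.getOrElse(MyDefaults).foo
}
```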
Working with JAXB, the standard way of dealing with a list of "nested" resource representations (e.g. <products><product>X</product><product>Y</product></products>) is to create a wrapper object, which in Java might look like this (borrowed from Jhopify):
@XmlType(name = "")
@XmlRootElement(name = "products")
public class ProductList {
List<Product> products = new ArrayList<Product>();
@XmlElement(name = "product", required = true)
public List<Product> getProducts() { return products; }
public void setProducts(List<Product> products) { this.products = products; }
}
However I'm struggling to determine exactly which collection objects to use when translating this to Scala. There's a good introductory post to doing this on the Mostly Blather blog, which uses a Scala Iterable implicitly converted (using JavaConversions) to and from a JCollection.
This works great for marshalling a JAXB class to XML, but unfortunately, when unmarshalling, it throws UnsupportedOperationException on each add attempt. Based on the last paragraph of this Scala documentation page, it looks like this happens because Java does not distinguish between mutable and immutable collections in their type.
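As a minimal sketch of that failure mode (assuming Scala 2.12 or earlier, where JavaConversions is still available; the object name ImmutableViewDemo is just illustrative): the Java view of an immutable Scala collection satisfies java.util.Collection, but its add method throws at runtime, which is exactly what JAXB hits while unmarshalling.

```scala
import scala.collection.JavaConversions._

object ImmutableViewDemo extends App {
  // The implicit conversion produces a java.util.Collection view of an immutable List.
  val view: java.util.Collection[String] = List("X", "Y")
  view.add("Z") // throws UnsupportedOperationException: the wrapper is read-only
}
```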
To deal with the unmarshalling, I've tried an alternative approach specifically using mutable objects:
import java.util.{List => JList}
import javax.xml.bind.annotation._
import scala.collection.JavaConversions._
import scala.collection.mutable.{ArrayBuffer, Buffer}
@XmlType(name = "")
@XmlRootElement(name = "products")
class ProductList {
private var products: Buffer[Product] = new ArrayBuffer[Product]
@XmlElement(name = "product", required = true)
def getProducts: JList[Product] = products
def setProducts(products: JList[Product]) {
this.products = products
}
}
But unfortunately with this approach, unmarshalling gives me an exception:
java.lang.NoSuchMethodError: ProductList.getProducts()Ljava/util/Collection;
Edit: as per Travis' request, here is my unmarshalling code:
val jaxbContext = JAXBContext.newInstance(ProductList.getClass())
val unmarshaller = jaxbContext.createUnmarshaller()
val root = unmarshaller.unmarshal(new StreamSource(new StringReader(responseString)), ProductList.getClass())
val r = root.getValue().asInstanceOf[ProductList]
val representations = r.getProducts.asScala.toList // Uses scalaj
So I'm a bit stumped... I've looked at scalaj's available conversions too but nothing obvious jumps out. Any help much appreciated!
Could you post your unmarshalling code? I've done something similar with JAXB from Scala, and what you have looks like it should work. Here's a complete working example:
import javax.xml.bind.annotation._
class Thing {
@scala.reflect.BeanProperty var name: String = _
}
@XmlRootElement(name = "things")
class Things {
import scala.collection.JavaConversions._
import scala.collection.mutable.Buffer
private var things: Buffer[Thing] = Buffer[Thing]()
@XmlElement(name = "thing", required = true)
def getThings: java.util.List[Thing] = this.things
def setThings(things: java.util.List[Thing]) {
this.things = things
}
}
I'll write the test code in Scala as well, but it would work identically in Java.
object Things {
import java.io.StringReader
import java.io.StringWriter
import javax.xml.bind.JAXBContext
def main(args: Array[String]) {
val thing1 = new Thing
val thing2 = new Thing
thing1.setName("Thing 1")
thing2.setName("Thing 2")
val list: java.util.List[Thing] = new java.util.ArrayList[Thing]
list.add(thing1)
list.add(thing2)
val things = new Things
things.setThings(list)
val context = JAXBContext.newInstance(classOf[Things])
val writer = new StringWriter
context.createMarshaller.marshal(things, writer)
println(writer.toString)
val readThings = context.createUnmarshaller().unmarshal(
new StringReader(writer.toString)
).asInstanceOf[Things]
println("Size: " + readThings.getThings.size)
println("Name of first: " + readThings.getThings.get(0).getName)
}
}
This compiles and produces the output you'd expect.