I have a scenario in which I serialized objects to JSON with GSON and saved them into a document store. I have since added an additional attribute to the class that was serialized, so now when I try to deserialize the stored documents I get an exception:
com.google.gson.JsonSyntaxException: java.lang.IllegalStateException: Expected a string but was BEGIN_OBJECT at line 1 column 420
com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter.read(ReflectiveTypeAdapterFactory.java:176)
com.google.gson.internal.bind.TypeAdapterRuntimeTypeWrapper.read(TypeAdapterRuntimeTypeWrapper.java:40)
com.google.gson.internal.bind.CollectionTypeAdapterFactory$Adapter.read(CollectionTypeAdapterFactory.java:81)
com.google.gson.internal.bind.CollectionTypeAdapterFactory$Adapter.read(CollectionTypeAdapterFactory.java:60)
com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$1.read(ReflectiveTypeAdapterFactory.java:93)
com.google.gson.internal.bind.ReflectiveTypeAdapterFactory$Adapter.read(ReflectiveTypeAdapterFactory.java:172)
com.google.gson.Gson.fromJson(Gson.java:795)
com.google.gson.Gson.fromJson(Gson.java:761)
com.google.gson.Gson.fromJson(Gson.java:710)
com.google.gson.Gson.fromJson(Gson.java:682)
This seems to be due to the addition of the new class attribute. Is there any way of specifying that the new attribute is OPTIONAL so that GSON deserialization would work in this case?
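A minimal sketch of the round trip described above (Payload and its fields are hypothetical stand-ins): by default Gson leaves fields that are absent from the JSON at their default values, so a newly added attribute is effectively optional already, whereas a mismatch between a field's declared type and the stored JSON is what produces the "Expected a string but was BEGIN_OBJECT" message.

import com.google.gson.Gson;
import com.google.gson.JsonSyntaxException;

// Hypothetical stand-in for the persisted class.
class Payload {
    String name;        // existed when the document was stored
    String description; // attribute added later
}

public class GsonRoundTrip {
    public static void main(String[] args) {
        Gson gson = new Gson();

        // A key missing from the old JSON is left at its default,
        // so the newly added attribute simply comes back as null:
        Payload ok = gson.fromJson("{\"name\":\"a\"}", Payload.class);
        System.out.println(ok.description); // prints "null"

        // A type mismatch, by contrast, throws JsonSyntaxException:
        try {
            gson.fromJson("{\"name\":{\"nested\":1}}", Payload.class);
        } catch (JsonSyntaxException e) {
            System.out.println(e.getMessage()); // Expected a string but was BEGIN_OBJECT ...
        }
    }
}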
Related
I have an Akka cluster of microservices running on nodes. I'm trying to use the Jackson serializer for application-specific messages exchanged between the nodes (microservices).
My code is written in Scala, not Java.
I'm using Scala case classes as the Akka messages exchanged between actors, and each message (case class) has a val LOG: org.slf4j.Logger built in to log business-logic-specific processing information.
When I send messages between nodes I get the following exception:
WARN akka.remote.artery.Deserializer - Failed to deserialize message from [akka://MyCluster#127.0.0.1:25251] with serializer id [33] and manifest [com...MyCaseClassMessage]. com.fasterxml.jackson.databind.exc.InvalidDefinitionException: Cannot construct instance of org.slf4j.Logger (no Creators, like default constructor, exist): abstract types either need to be mapped to concrete types, have custom deserializer, or contain additional type information
at [Source: ..."[truncated 'n' bytes]; line: 'n', column: 'n'] (through reference chain: com...MyCaseClassMessage["LOG"])
My case class is essentially something like:
case class MyCaseClassMessage() extends CborSerializable {
  override val LOG: Logger = LoggerFactory.getLogger(classOf[MyCaseClassMessage])
  val businessLogic: Array[Byte] = ...
  def apply(): Array[Byte] = ...
}
I have no idea how to tell Jackson how to serialize and/or deserialize a "val LOG: Logger" in my case class. I just know that if I remove my Logger, substituting it with (for example) println("message"), I don't have any problem with serialization and deserialization.
Any help?
Because Jackson relies on reflection and does not understand the Scala case-class convention that only the constructor parameters define the message, it attempts to serialize every field of the object.
The LOG field can be ignored by Jackson by annotating it with the @JsonIgnore annotation (com.fasterxml.jackson.annotation.JsonIgnore):
@JsonIgnore
override val LOG: Logger = LoggerFactory.getLogger(classOf[MyCaseClassMessage])
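Alternatively, since the logger carries no per-message state, it can be moved out of the case class entirely, for example into the class's companion object; it is then no longer an instance field, so Jackson never attempts to serialize it.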
How can I serialize a field under a different name from the one defined in the Document object?
from elasticsearch_dsl import Document, Text

class MyDocument(Document):
    context = Text()
When saving this document to ES, I want to write the 'context' key as '#context'.
Since #context is not a valid Python identifier, I am afraid this is not easily possible. You can always drop down to the raw JSON (by overriding the to_dict method), but I would definitely not recommend it.
Using MapStruct, we want to use ReportingPolicy.ERROR, and have code like the following:
@Mapping(source = "nestedSource.doublyNestedSourceField", target = "nestedTarget.doublyNestedTargetField")
Target mapSourceToTarget(Source source);
Where nestedSource is not the same type as nestedTarget, and both doublyNested*Field types are String.
There is no mapper declared for NestedSource -> NestedTarget. The String properties declared in the Mapping above are the only ones in those types.
The above causes an unmapped source error:
Unmapped source property: "doublyNestedSourceField".
That seems more-or-less reasonable, as we didn't declare a mapper for NestedSource -> NestedTarget.
However, here's the issue: if we change the ReportingPolicy for unmapped sources to WARN/IGNORE, MapStruct figures out how to correctly map doublyNestedSourceField in the generated mapper implementation, even though it claims there is no source mapping present. Just wondering what is going on here, and whether I'm missing something.
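For completeness, the mapper looks roughly like this (Source, Target and the nested types are renamed stand-ins for our real classes; the policy is applied via the @Mapper annotation):

import org.mapstruct.Mapper;
import org.mapstruct.Mapping;
import org.mapstruct.ReportingPolicy;

// Sketch: report unmapped SOURCE properties as errors.
@Mapper(unmappedSourcePolicy = ReportingPolicy.ERROR)
public interface SourceTargetMapper {

    @Mapping(source = "nestedSource.doublyNestedSourceField",
             target = "nestedTarget.doublyNestedTargetField")
    Target mapSourceToTarget(Source source);
}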
----Into the weeds a bit more (in the MapStruct code itself)----
I could be doing something wrong, but I did notice that in BeanMethodMapping.java MapStruct attempts to remove "nestedSource.doublyNestedSourceField" from unprocessedSourceProperties, even though the key for the relevant property in unprocessedSourceProperties is just "nestedSource". Thus "nestedSource" is left as an unprocessed source property and an error is thrown.
I have created some extensions in SAML metadata. I'm trying to unmarshall the XML using OpenSAML 2. I have created the interface, implementation class, builder, marshaller and unmarshaller of the extension. Then I registered the object providers using Configuration.registerObjectProvider:
Configuration.registerObjectProvider(RequestedAudiences.TYPE_NAME, new RequestedAudiencesBuilder(), new RequestedAudiencesMarshaller(), new RequestedAudiencesUnmarshaller());
When I try to get the extensions using the below code segment
List<XMLObject> extensions = spssoDescriptor.getExtensions().getUnknownXMLObjects();
It returns objects of the type
org.opensaml.xml.schema.impl.XSAnyImpl
So I can't read any values from the object. I want to get an object of the actual extension implementation class that I have created.
Can anyone suggest what I am doing wrong?
The problem was that I had registered the object providers after creating the metadata object. At the time the metadata object was created, OpenSAML did not know how to create the required extension object (a RequestedAudiences object). Registering the object providers before creating the metadata object resolved the problem.
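In code, the working order looks roughly like this (a sketch using OpenSAML 2's stock parser pool and unmarshaller factory; the metadata input stream and checked-exception handling are elided):

import org.opensaml.saml2.metadata.EntityDescriptor;
import org.opensaml.xml.Configuration;
import org.opensaml.xml.io.Unmarshaller;
import org.opensaml.xml.parse.BasicParserPool;
import org.w3c.dom.Document;
import org.w3c.dom.Element;

// 1. Register the custom object provider BEFORE parsing any metadata.
Configuration.registerObjectProvider(RequestedAudiences.TYPE_NAME,
        new RequestedAudiencesBuilder(),
        new RequestedAudiencesMarshaller(),
        new RequestedAudiencesUnmarshaller());

// 2. Only then parse and unmarshall the metadata document.
BasicParserPool parserPool = new BasicParserPool();
parserPool.setNamespaceAware(true);
Document doc = parserPool.parse(metadataInputStream); // stream of the metadata XML
Element root = doc.getDocumentElement();
Unmarshaller unmarshaller = Configuration.getUnmarshallerFactory().getUnmarshaller(root);
EntityDescriptor entityDescriptor = (EntityDescriptor) unmarshaller.unmarshall(root);

// The extension elements now come back as instances of the registered
// type (RequestedAudiences) instead of XSAnyImpl.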
You can use the Scala code below to extract the information:
val dato = descriptor.getExtensions().getUnknownXMLObjects.get(0).asInstanceOf[XSAny]
println(dato.getTextContent)
I am able to load some entities into Elasticsearch with out-of-the-box Spring Data Elasticsearch. The thing is that my model classes have many properties, and for some of them I don't want my representation (typing) to be reflected in ES. What I'd like is something along the lines of:
@Field(serializer = MyCustomSerializer, deserializer = MyCustomDeserializer)
private SomeClass someObject;
I'd like, for example, for SomeClass to be serialized as a String, so I can query it as such. Also, when reading data from ES, I want to be able to write a custom deserializer (MyCustomDeserializer) to convert this String into my own model.
Is there any way I can accomplish that?
Thanks
Spring Data Elasticsearch uses Jackson to serialize the fields, so you can achieve custom serialization logic by defining:
@JsonSerialize(using = MyCustomSerializer.class)
@JsonDeserialize(using = MyCustomDeserializer.class)
private SomeClass someObject;
Or configure your mapping globally on a Jackson ObjectMapper, replacing the default EntityMapper from spring-data-elasticsearch.
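For illustration, a sketch of such a custom EntityMapper, assuming one of the older spring-data-elasticsearch versions that expose the EntityMapper hook (the exact interface and wiring vary by version; SomeClass and the serializer classes are the ones from the question):

import java.io.IOException;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.module.SimpleModule;
import org.springframework.data.elasticsearch.core.EntityMapper;

// Delegates all entity (de)serialization to a Jackson ObjectMapper
// that has the custom (de)serializers registered.
public class CustomEntityMapper implements EntityMapper {

    private final ObjectMapper objectMapper;

    public CustomEntityMapper() {
        objectMapper = new ObjectMapper();
        SimpleModule module = new SimpleModule();
        module.addSerializer(SomeClass.class, new MyCustomSerializer());
        module.addDeserializer(SomeClass.class, new MyCustomDeserializer());
        objectMapper.registerModule(module);
    }

    @Override
    public String mapToString(Object object) throws IOException {
        return objectMapper.writeValueAsString(object);
    }

    @Override
    public <T> T mapToObject(String source, Class<T> clazz) throws IOException {
        return objectMapper.readValue(source, clazz);
    }
}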