I am relatively new to Scala and am using Jackson. I am getting the error below for the following case class:
case class Result(label: String, resultDate: Option[Date] = None)
Resolved[org.springframework.http.converter.HttpMessageNotReadableException: JSON parse error: No deserializer for document type 'result' found; nested exception is com.fasterxml.jackson.databind.JsonMappingException: No deserializer for document type 'result' found
at [Source: (PushbackInputStream); line: 1, column: 209] (through reference chain: com.project["document"])]
You need to give Jackson a way to deserialize this type: Jackson (core) knows nothing about Scala, and in particular about case classes, since it is originally a Java library.
You can add the Jackson Scala Module for automatic support of Scala types.
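As a minimal sketch (assuming the jackson-module-scala dependency is on the classpath), registering DefaultScalaModule on the ObjectMapper is usually enough:

```scala
import java.util.Date

import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule

case class Result(label: String, resultDate: Option[Date] = None)

object JacksonScalaExample {
  def main(args: Array[String]): Unit = {
    val mapper = new ObjectMapper()
    // DefaultScalaModule teaches Jackson about case classes, Option, Seq, etc.
    mapper.registerModule(DefaultScalaModule)

    // A missing Option field deserializes to None rather than failing.
    val result = mapper.readValue("""{"label":"ok"}""", classOf[Result])
    println(result.label)      // ok
    println(result.resultDate) // None
  }
}
```

In a Spring application you would register the module on (or replace) the ObjectMapper that the HTTP message converter uses.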
I have the following case classes defined in my Flink application (Flink 1.10.1):
case class FilterDefinition(filterDefId: String, filter: TileFilter)
case class TileFilter(tiles: Seq[Long], zoomLevel: Int)
At runtime, I noticed this log message:
FilterDefinition cannot be used as a POJO type because not all fields are valid POJO fields, and must be processed as GenericType. Please read the Flink documentation on "Data Types & Serialization" for details of the effect on performance.
If I interpreted the Flink documentation correctly, Flink should be able to serialize Scala case classes without needing Kryo. However, it looks like the above case class falls back on the Kryo serializer for me.
Did I misinterpret how case classes are handled by Flink?
Excerpting here from the documentation:

Java and Scala classes are treated by Flink as a special POJO data type if they fulfill the following requirements:

- The class must be public.
- It must have a public constructor without arguments (default constructor).
- All fields are either public or must be accessible through getter and setter functions. For a field called foo, the getter and setter methods must be named getFoo() and setFoo().
- The type of a field must be supported by a registered serializer.
In this case, it appears that Flink doesn't know how to serialize TileFilter (or, more specifically, Seq[Long]).
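One thing worth checking (an assumption, since the build setup isn't shown) is whether the Flink Scala API is in scope; its implicit `createTypeInformation` macro derives dedicated case-class serializers, and printing the derived type information shows whether Flink chose a case-class type or fell back to a GenericType (Kryo). A sketch:

```scala
// Hedged sketch: with the Flink Scala API imported, createTypeInformation
// derives TypeInformation for case classes at compile time. Printing it
// reveals whether Flink treats the type natively or as a GenericType.
import org.apache.flink.api.scala._

case class TileFilter(tiles: Seq[Long], zoomLevel: Int)
case class FilterDefinition(filterDefId: String, filter: TileFilter)

object TypeInfoCheck {
  def main(args: Array[String]): Unit = {
    val info = createTypeInformation[FilterDefinition]
    println(info)
  }
}
```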
I have defined a Scala object (say, MyObject) which extends the following trait:
trait GeneratedMessageCompanion[A <: GeneratedMessage with Message[A]]
And when I call the parseFrom method on the object, I get the following error:
Caused by: java.lang.NoSuchMethodError:....MyObject$.parseFrom([B)Lscalapb/GeneratedMessage;
I tried both scalapb-runtime_2.11 and scalapb-runtime_2.12.
Edit: the issue is solved. It was a case of dependency mismatches.
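For reference, a hedged sbt sketch of the usual way to keep the ScalaPB runtime aligned with the code generator, using the version constant exposed by the compiler plugin so the two cannot drift apart (this is a sketch of the standard recipe, not the asker's actual build file):

```scala
// build.sbt -- pin scalapb-runtime to the same version as the generator,
// so generated code and runtime always match.
libraryDependencies += "com.thesamet.scalapb" %% "scalapb-runtime" %
  scalapb.compiler.Version.scalapbVersion % "protobuf"
```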
Given a simple case class with a type annotation @Bar:
case class Foo(
field: Option[String] @Bar
)
converting an RDD[Foo] to a Dataset[Foo] fails at runtime with the following stack trace:
User class threw exception: scala.MatchError: scala.Option[String] @Bar (of class scala.reflect.internal.Types$AnnotatedType)
at org.apache.spark.sql.catalyst.ScalaReflection$$anonfun$org$apache$spark$sql$catalyst$ScalaReflection$$serializerFor$1.apply(ScalaReflection.scala:483)
at ...
A ticket is open for this issue (SPARK-27625). However, is there a workaround?
I am using Spark 2.3.2.
The frameless library supports type annotations.
I refactored my code to work with Kryo serialization.
Everything works fine except deserializing a geometry property from a certain class.
No exception is thrown (I set "spark.kryo.registrationRequired" to true).
While debugging, I try to collect the data and I see that the data in the geometry is just empty, so I conclude that the deserialization failed.
geometry is of type Any (Scala), perhaps because it is a complex property.
My question is: why is the data empty, and is there a connection to the property's type being Any?
Update:
class code:
class Entity(val id: String) extends Serializable {
  var index: Any = null
  var geometry: Any = null
}
geometry contains a centroid, shape and coordinates (a complex object)
You should not use plain Kryo with Scala, since the behavior of many Scala classes differs from Java classes, and Kryo was originally written to work with Java. You will probably encounter many weird issues like this one if you use Kryo with Scala. You should instead use chill-scala, which is an extension of Kryo that handles all of Scala's special cases.
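A minimal sketch (assuming the chill library is on the classpath) of a round trip through the preconfigured Kryo instance that chill-scala provides:

```scala
import java.io.{ByteArrayInputStream, ByteArrayOutputStream}

import com.esotericsoftware.kryo.io.{Input, Output}
import com.twitter.chill.ScalaKryoInstantiator

object ChillRoundTrip {
  def main(args: Array[String]): Unit = {
    // ScalaKryoInstantiator pre-registers serializers for Scala collections,
    // case classes, Option, tuples, etc.
    val kryo = new ScalaKryoInstantiator().newKryo()

    // A nested Scala value typed as Any, similar to the geometry field above.
    val original: Any = Map("centroid" -> Seq(1.0, 2.0))

    val bytes = new ByteArrayOutputStream()
    val output = new Output(bytes)
    kryo.writeClassAndObject(output, original)
    output.close()

    val restored =
      kryo.readClassAndObject(new Input(new ByteArrayInputStream(bytes.toByteArray)))
    println(restored == original) // true if the round trip preserved the data
  }
}
```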
Does anyone know why, when I try to deserialize JSON into a Map[String, MyCustomType], I always get back an object of type Map[String, HashMap[String, String]]?
Thanks in advance.