circe type field not showing - scala

When encoding to JSON with circe we really want the type field to show, e.g.
scala> val fooJson = foo.asJson
fooJson: io.circe.Json =
{
"this_is_a_string" : "abc",
"another_field" : 123,
"type" : "Foo"
}
This is taken from the release notes, which earlier mention that you can configure the encoding like this:
implicit val customConfig: Configuration =
Configuration.default.withSnakeCaseKeys.withDefaults.withDiscriminator("type")
Other information about circe here also suggests that, without any configuration, you should get some class type information in the encoded JSON.
Am I missing something? How do you get the class type to show?

UPDATE 30/03/2017: Follow-up to OP's comment
I was able to make this work, as shown in the linked release notes.
Preparation step 1: add additional dependency to build.sbt
libraryDependencies += "io.circe" %% "circe-generic-extras" % "0.7.0"
Preparation step 2: setup dummy sealed trait hierarchy
import io.circe.{ Decoder, Encoder }
import io.circe.parser._, io.circe.syntax._
import io.circe.generic.extras.Configuration
import io.circe.generic.extras.auto._
import io.circe.generic.{ semiauto => boring } // <- This is the default generic derivation behaviour
import io.circe.generic.extras.{ semiauto => fancy } // <- This is the new generic derivation behaviour
implicit val customConfig: Configuration = Configuration.default.withDefaults.withDiscriminator("type")
sealed trait Stuff
case class Foo(thisIsAString: String, anotherField: Int = 13) extends Stuff
case class Bar(thisIsAString: String, anotherField: Int = 13) extends Stuff
object Foo {
implicit val decodeFoo: Decoder[Foo] = fancy.deriveDecoder
implicit val encodeFoo: Encoder[Foo] = fancy.deriveEncoder
}
object Bar {
implicit val decodeBar: Decoder[Bar] = boring.deriveDecoder
implicit val encodeBar: Encoder[Bar] = boring.deriveEncoder
}
Actual code using this:
val foo: Stuff = Foo("abc", 123)
val bar: Stuff = Bar("xyz", 987)
val fooString = foo.asJson.noSpaces
// fooString: String = {"thisIsAString":"abc","anotherField":123,"type":"Foo"}
val barString = bar.asJson.noSpaces
// barString: String = {"thisIsAString":"xyz","anotherField":987,"type":"Bar"}
val bar2 = for{
json <- parse(barString)
bar2 <- json.as[Stuff]
} yield bar2
// bar2: scala.util.Either[io.circe.Error,Stuff] = Right(Bar(xyz,987))
val foo2 = for{
json <- parse(fooString)
foo2 <- json.as[Stuff]
} yield foo2
// foo2: scala.util.Either[io.circe.Error,Stuff] = Right(Foo(abc,123))
So, provided you import the extra dependency (which is where Configuration comes from), it looks like it works.
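One caveat worth noting (my observation, not from the release notes): the discriminator comes from the encoder derived for the sealed trait, so encoding a value statically typed as the leaf class will, as far as I can tell, omit the "type" field:
val fooAsLeaf: Foo = Foo("abc", 123)
fooAsLeaf.asJson.noSpaces
// {"thisIsAString":"abc","anotherField":123} <- no "type" field
(fooAsLeaf: Stuff).asJson.noSpaces
// {"thisIsAString":"abc","anotherField":123,"type":"Foo"}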
Finally, as a sidenote, there does seem to be some disconnect between circe's DESIGN.md and practice, which in this case I am actually happy about.
Original answer:
I am not sure this is supposed to be supported, by design.
Taken from Circe's DESIGN.md:
Implicit scope should not be used for configuration. Lots of people have asked for a way to configure generic codec derivation to use e.g. a type field as the discriminator for sealed trait hierarchies, or to use snake case for member names. argonaut-shapeless supports this quite straightforwardly with a JsonCoproductCodec type that the user can provide implicitly.
I don't want to criticize this approach—it's entirely idiomatic Scala, and it often works well in practice—but I personally don't like using implicit values for configuration, and I'd like to avoid it in circe until I am 100% convinced that there's no alternative way to provide this functionality.
What this means concretely: You'll probably never see an implicit argument that isn't a type class instance—i.e. that isn't a type constructor applied to a type in your model—in circe, and configuration of generic codec derivation is going to be relatively limited (compared to e.g. argonaut-shapeless) until we find a nice way to do this kind of thing with type tags or something similar.
In particular, customConfig: Configuration seems to be exactly the kind of argument that the last paragraph refers to (i.e. an implicit argument that isn't a type class instance).
I am sure that #travis-brown or any of Circe's other main contributors could shed some more light on this, in case there is in fact a way of doing this - and I would be very happy to know it! :)

Related

In Scala 3, is it possible to use declared type of an object in runtime?

In Scala 2, most generic type information of an object is erased at runtime. At the moment, all three binary execution environments (JVM, JavaScript, and LLVM) abide by this behaviour; they differ only in minor details of their metadata formats.
In the rare case where this incurs critical data loss, or triggers a binary error, a mechanism can be used to preserve the declared type information in an adjoint data structure. The following code gives a short example of such a data structure in Scala 2:
import scala.reflect.runtime.universe
import scala.collection.concurrent.TrieMap
import scala.language.implicitConversions
case class Unerase[T](self: T)(
implicit
ev: universe.TypeTag[T]
) {
import Unerase._
// record the TypeTag at construction time, keyed by this wrapper's identity
cache += {
val inMemoryId = System.identityHashCode(this)
inMemoryId -> ev
}
}
object Unerase {
lazy val cache = TrieMap.empty[Int, universe.TypeTag[_]]
// look up the TypeTag recorded for a value, if any
def get[T](v: T): Option[universe.TypeTag[T]] = {
val inMemoryId = System.identityHashCode(v)
cache.get(inMemoryId).map { tt =>
tt.asInstanceOf[universe.TypeTag[T]]
}
}
implicit def unbox[T](v: Unerase[T]): T = v.self
implicit def box[T](v: T)(
implicit
ev: universe.TypeTag[T]
): Unerase[T] = Unerase(v)
}
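A quick usage sketch (assuming the definitions above are in scope); note that the cache is keyed by the wrapper's identity hash code, so the lookup goes through the wrapper value rather than the unwrapped one:
val xs: Unerase[List[Int]] = List(1, 2, 3)      // boxed by the implicit `box`
// look up via the wrapper: the stored tag describes the wrapped type
Unerase.get(xs).foreach(tt => println(tt.tpe))  // prints List[Int]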
Any variable declared as type Unerase[T] instead of T will be guaranteed to have its full declared type visible at runtime. Unfortunately, this example no longer works in Scala 3:
implicitly[TypeTag[Int]] // works in Scala 2
summon[Type[Int]] // doesn't work in Scala 3: No given instance of type quoted.Quotes was found for parameter x$1 ...
Is there a mechanism I can use in Scala 3 to fully mitigate type erasure in the same way?
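One direction I'm exploring (a sketch only, assuming the third-party izumi-reflect library, whose Tag[T] plays a TypeTag-like role in Scala 3 and survives erasure) is to port the adjoint cache like this:
import izumi.reflect.Tag
import scala.collection.concurrent.TrieMap

case class Unerase[T](self: T)(using ev: Tag[T]):
  Unerase.cache += System.identityHashCode(this) -> ev

object Unerase:
  val cache = TrieMap.empty[Int, Tag[?]]
  def get[T](v: Unerase[T]): Option[Tag[T]] =
    cache.get(System.identityHashCode(v)).map(_.asInstanceOf[Tag[T]])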

Scala resolving Class/Type at runtime + type class constraint

I have a generic function that requires a HasMoveCapability implicit instance for the type T (type class pattern):
import scala.reflect.runtime.universe._
trait HasMoveCapability[T]
def doLogic[T: TypeTag: HasMoveCapability](): Unit = println(typeTag[T].tpe)
Then I have these two classes which have implicit instances for HasMoveCapability[T]
case class Bird()
object Bird {
implicit val hasMoveCapability = new HasMoveCapability[Bird]{}
}
case class Lion()
object Lion {
implicit val hasMoveCapability = new HasMoveCapability[Lion]{}
}
My question is the following:
I need to resolve the type (Lion or Bird) at runtime depending on an argument and call the function doLogic with the good type.
I tried
val input: String = "bird" // known at runtime
val resolvedType: TypeTag[_] = input match {
case "bird" => typeTag[Bird]
case "lion" => typeTag[Lion]
}
doLogic()(resolvedType) // doesn't compile
// `Unspecified value parameters: hasMoveCapability$T$1: HasMoveCapability[NotInferredT]`
What I would like to do is something like:
val resolvedType: TypeTag[_: HasMoveCapability] = input match{...}
The workaround that I am using so far is to call the function in the pattern match:
input match {
case "bird" => doLogic[Bird]
case "lion" => doLogic[Lion]
}
But with many functions, the pattern match gets duplicated and becomes hard to maintain.
I am open to change the design if you have any suggestions :D
You should describe your problem better. Currently your type class HasMoveCapability doesn't seem to do anything useful; what you do amounts to a hard way of transforming the string "bird" into "Bird" and "lion" into "Lion".
If you control the code of doLogic you don't seem to need TypeTag. TypeTag / ClassTag is a way to persist information from compile time to runtime; you seem to be doing something in the reverse direction.
Type classes / implicits are resolved at compile time. You can't resolve something at compile time based on runtime information (there is no time machine taking you from the future, i.e. runtime, to the past, i.e. compile time). Most probably you need ordinary pattern matching rather than type classes (TypeTag, HasMoveCapability).
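If the concern is just the duplication of the pattern match across many functions, one option (a sketch, not the only possible design) is to write the match once and bundle the compile-time evidence behind an existential-style wrapper:
import scala.reflect.runtime.universe.{TypeTag, typeTag}

trait HasMoveCapability[T]
case class Bird()
object Bird { implicit val hasMoveCapability: HasMoveCapability[Bird] = new HasMoveCapability[Bird] {} }
case class Lion()
object Lion { implicit val hasMoveCapability: HasMoveCapability[Lion] = new HasMoveCapability[Lion] {} }

// one value carrying "some T together with its TypeTag and HasMoveCapability"
trait Resolved {
  type T
  implicit val tag: TypeTag[T]
  implicit val move: HasMoveCapability[T]
}
object Resolved {
  def apply[A](implicit t: TypeTag[A], m: HasMoveCapability[A]): Resolved =
    new Resolved { type T = A; implicit val tag: TypeTag[A] = t; implicit val move: HasMoveCapability[A] = m }
}

// the string -> type mapping is now written exactly once
def resolve(input: String): Resolved = input match {
  case "bird" => Resolved[Bird]
  case "lion" => Resolved[Lion]
}

def doLogic[T: TypeTag: HasMoveCapability](): Unit = println(typeTag[T].tpe)

val r = resolve("bird")
import r._
doLogic[r.T]() // prints Bird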
In principle you can run compiler at runtime, then you'll have new compile time inside runtime, and you'll be able to infer types, resolve implicits etc.
import scala.tools.reflect.ToolBox
import scala.reflect.runtime.currentMirror
import scala.reflect.runtime.universe.{TypeTag, typeTag}
object App {
trait HasMoveCapability[T]
def doLogic[T: TypeTag: HasMoveCapability](): Unit = println(typeTag[T].tpe)
case class Bird()
object Bird {
implicit val hasMoveCapability = new HasMoveCapability[Bird]{}
}
case class Lion()
object Lion {
implicit val hasMoveCapability = new HasMoveCapability[Lion]{}
}
val input: String = "bird" // known at runtime
val tb = currentMirror.mkToolBox()
tb.eval(tb.parse(s"import App._; doLogic[${input.capitalize}]")) //App.Bird
def main(args: Array[String]): Unit = ()
}

Fail-fast json4s serialisation of sealed trait and object enum when missing serializer

Setup
I'm using json4s 3.2.11 and Scala 2.11.
I have an enumeration defined using sealed trait, and a custom serializer for it:
import org.json4s.CustomSerializer
import org.json4s.JsonAST.JString
import org.json4s.DefaultFormats
import org.json4s.jackson.Serialization
sealed trait Foo
case object X extends Foo
case object Y extends Foo
object FooSerializer
extends CustomSerializer[Foo](
_ =>
({
case JString("x") => X
case JString("y") => Y
}, {
case X => JString("x")
case Y => JString("y")
})
)
This works well when added to the formats:
{
implicit val formats = DefaultFormats + FooSerializer
Serialization.write(X) // "x"
}
This is great!
Problem
If the serializer is not added to the formats, json4s will use reflection to create a default representation of the fields, which is extremely unhelpful for these objects that don't have fields. It does this silently, seemingly without a way to control it.
{
implicit val formats = DefaultFormats
Serialization.write(X) // {}
}
This is problematic, as there's no indication of what's gone wrong until much later. This invalid/useless data might be sent around the network or written to databases if tests don't happen to catch it. And this may be exposed publicly from a library, meaning downstream users have to remember it as well.
NB. this is different to read, which throws an exception on failure, since the Foo trait doesn't have any useful constructors:
{
implicit val formats = DefaultFormats
Serialization.read[Foo]("\"x\"")
}
org.json4s.package$MappingException: No constructor for type Foo, JString(x)
at org.json4s.Extraction$ClassInstanceBuilder.org$json4s$Extraction$ClassInstanceBuilder$$constructor(Extraction.scala:417)
at org.json4s.Extraction$ClassInstanceBuilder.org$json4s$Extraction$ClassInstanceBuilder$$instantiate(Extraction.scala:468)
at org.json4s.Extraction$ClassInstanceBuilder$$anonfun$result$6.apply(Extraction.scala:515)
...
Question
Is there a way to either disable the default {} formatting for these objects, or to "bake" the formatting into the object itself?
For instance, having write throw an exception like read would be fine, as it would flag the problem to the caller immediately.
There is an old open issue which seems to ask a similar question, where one of the contributors suggests that
you need to create a custom deserializer or serializer
which makes it sound like there is no out-of-the-box way to alter the default behaviour.
Method 1: Disallow default formats via Scalastyle
Try disallowing import of org.json4s.DefaultFormats using Scalastyle IllegalImportsChecker
<check level="error" class="org.scalastyle.scalariform.IllegalImportsChecker" enabled="true">
<parameters>
<customMessage>Import from illegal package: Please use example.DefaultFormats instead of org.json4s.DefaultFormats</customMessage>
<parameter name="illegalImports"><![CDATA[org.json4s.DefaultFormats]]></parameter>
</parameters>
</check>
and provide custom DefaultFormats like so
package object example {
val DefaultFormats = Serialization.formats(NoTypeHints) + FooSerializer
}
which would allow us to serialise ADTs like so
import example.DefaultFormats
implicit val formats = DefaultFormats
case class Bar(foo: Foo)
println(Serialization.write(Bar(X)))
println(Serialization.write(X))
println(Serialization.write(Y))
which should output
{"foo":"x"}
"x"
"y"
If we try to import org.json4s.DefaultFormats, then Scalastyle should raise the following error:
Import from illegal package: Please use example.DefaultFormats instead of org.json4s.DefaultFormats
Method 2: Bake in serialisation for non-nested values
Perhaps we could "bake" the formatting into the objects themselves by defining a write method on Foo that delegates to Serialization.write, like so:
sealed trait Foo {
object FooSerializer extends CustomSerializer[Foo](_ =>
({
case JString("x") => X
case JString("y") => Y
}, {
case X => JString("x")
case Y => JString("y")
})
)
def write: String =
Serialization.write(this)(DefaultFormats + FooSerializer)
}
case object X extends Foo
case object Y extends Foo
Note how we hardcoded the FooSerializer format passed to write. Now we can serialise with
println(X.write)
println(Y.write)
which should output
"x"
"y"
Method 3: Provide custom DefaultFormats alongside org.json4s.DefaultFormats
We could also try defining a custom DefaultFormats in our own package, like so
package example
object DefaultFormats extends org.json4s.DefaultFormats {
override val customSerializers: List[Serializer[_]] = List(FooSerializer)
}
which would allow us to serialise ADTs like so
import example.DefaultFormats
implicit val formats = DefaultFormats
case class Bar(foo: Foo)
println(Serialization.write(Bar(X)))
println(Serialization.write(X))
println(Serialization.write(Y))
which should output
{"foo":"x"}
"x"
"y"
Having two default formats, org.json4s.DefaultFormats and example.DefaultFormats, would at least make the user choose between the two if, say, they use an IDE to auto-import them.
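And if both happen to be imported at once, the compiler forces the choice (a sketch; the exact error wording varies by Scala version):
import org.json4s.DefaultFormats
import example.DefaultFormats

// error: reference to DefaultFormats is ambiguous;
// it is imported twice in the same scope
implicit val formats = DefaultFormats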

Get a class from a type scala

In Scala, I want to be able to say
val user = Node.create[User](...) // return User object
So here's what I have so far:
def create[T : TypeTag](map: Map[String, Any]) {
val tpe = typeOf[T] // note: `type` is a reserved word, so it can't be used as a name
// create class from type here???
}
I've been digging into how to create classes from generic types and found out that using ClassManifest seems to be deprecated. Instead, type tags are here, so I'm able to do something like typeOf[T] and actually get the type... but then I'm lost. If I could get the class, then I could use something like clazz.newInstance and manually set the fields from there.
Question is: given a type, can I get a class instance of the given type?
The easiest way in fact is to use ClassTag:
import scala.reflect.{ClassTag, classTag}
def create[T : ClassTag](map: Map[String, Any]): T = {
val clazz: Class[_] = classTag[T].runtimeClass
clazz.getConstructors.head.newInstance(<constructor arguments here>).asInstanceOf[T]
}
(Class#newInstance itself takes no arguments, so constructor arguments have to go through a java.lang.reflect.Constructor, obtained here via getConstructors.)
ClassTag is a thin wrapper around a Java Class, primarily used for array instantiation.
The TypeTag facility is more powerful. First, you can use it to invoke Java reflection:
import scala.reflect.runtime.universe._
def create[T: TypeTag](map: Map[String, Any]): T = {
val mirror = runtimeMirror(getClass.getClassLoader) // current class classloader
val clazz: Class[_] = mirror.runtimeClass(typeOf[T].typeSymbol.asClass)
clazz.getConstructors.head.newInstance(<constructor arguments here>).asInstanceOf[T]
}
However, Scala reflection allows you to instantiate classes without dropping back to Java reflection:
def create[T: TypeTag](map: Map[String, Any]): T = {
// obtain type symbol for the class, it is like Class but for Scala types
val typeSym = typeOf[T].typeSymbol.asClass
// obtain class mirror using runtime mirror for the given classloader
val mirror = runtimeMirror(getClass.getClassLoader) // current class classloader
val cm = mirror.reflectClass(typeSym)
// resolve the class constructor using the class mirror and
// a constructor declaration on the type
val ctor = typeOf[T].decl(termNames.CONSTRUCTOR).asMethod
val ctorm = cm.reflectConstructor(ctor)
// invoke the constructor
ctorm(<constructor arguments here>).asInstanceOf[T]
}
If you want to create a class with overloaded constructors, it may require more work though - you'll have to select the correct constructor from the declarations list, but the basic idea is the same. You can read more on Scala reflection here
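For the overloaded case, the constructor selection might look roughly like this (a sketch: it picks naively by arity, with no argument type checks):
import scala.reflect.runtime.universe._

def createWithArgs[T: TypeTag](args: Any*): T = {
  val mirror = runtimeMirror(getClass.getClassLoader)
  val cm = mirror.reflectClass(typeOf[T].typeSymbol.asClass)
  // all constructor alternatives declared on the type
  val ctors = typeOf[T].decl(termNames.CONSTRUCTOR).asTerm.alternatives.map(_.asMethod)
  // naive selection: the first constructor whose arity matches the arguments
  val ctor = ctors.find(_.paramLists.flatten.size == args.size)
    .getOrElse(sys.error(s"no ${args.size}-arg constructor on ${typeOf[T]}"))
  cm.reflectConstructor(ctor)(args: _*).asInstanceOf[T]
}

case class User(name: String, age: Int)
createWithArgs[User]("alice", 33) // User(alice,33)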
There is a way to do it with reflection: either runtime reflection, or in a macro. Regarding the runtime reflection way, you can have a look at my blog post where I tried to do something like what you are trying to do now. Using compile-time reflection with macros might be a better option, depending on your needs.

Scala pickling: how?

I'm trying to use "pickling" serialization in Scala, and I keep seeing the same example demonstrating it:
import scala.pickling._
import json._
val pckl = List(1, 2, 3, 4).pickle
Unpickling is just as easy as pickling:
val lst = pckl.unpickle[List[Int]]
This example raises some questions. First of all, it skips the conversion of the object to a string; apparently you need to call pckl.value to get the JSON string representation.
Unpickling is even more confusing. Deserialization is the act of turning a string (or bytes) into an object. How can this "example" demonstrate deserialization if there is no string/binary representation of the object?
So, how do I deserialize simple object with pickling library?
Use the type system and case classes to achieve your goals. You can unpickle to some supertype in your hierarchy (up to and including AnyRef). Here is an example:
trait Zero
case class One(a:Int) extends Zero
case class Two(s:String) extends Zero
object Test extends App {
import scala.pickling._
import json._
// String that can be sent down a wire
val wire: String = Two("abc").pickle.value
// On the other side, just use a case class
wire.unpickle[Zero] match {
case One(a) => println(a)
case Two(s) => println(s)
case unknown => println(unknown.getClass.getCanonicalName)
}
}
Ok, I think I understood it.
import scala.pickling._
import json._
val str = Array(1,2,3).pickle.value // this is a JSON string
println(str)
val x = str.unpickle[Array[Int]] // unpickle from string
will produce the JSON string:
{
"tpe": "scala.Array[scala.Int]",
"value": [
1,
2,
3
]
}
So, the same way we pickle any type, we can unpickle a string. The serialization format is controlled by the implicit pickle format brought in by import json._, and can be switched by importing binary._ instead.
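For example, switching to the binary format looks like this (a sketch by analogy with the JSON usage above and the BinaryPickle class mentioned below; value is then an Array[Byte] rather than a String):
import scala.pickling._
import binary._

val bytes: Array[Byte] = Array(1, 2, 3).pickle.value
val back = BinaryPickle(bytes).unpickle[Array[Int]] // Array(1, 2, 3)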
It does look like you will be starting with a pickle to unpickle to a case class. But the JSON string can be fed to the JSONPickle class to get the starting pickle.
Here's an example based on their array-json test
package so
import scala.pickling._
import json._
case class C(arr: Array[Int]) { override def toString = s"""C(${arr.mkString("[", ",", "]")})""" }
object PickleTester extends App {
val json = """{"arr":[ 1, 2, 3 ]}"""
val cPickle = JSONPickle( json )
val unpickledC: C = cPickle.unpickle[C]
println( s"$unpickledC, arr.sum = ${unpickledC.arr.sum}" )
}
The output printed is:
C([1,2,3]), arr.sum = 6
I was able to drop the "tpe" from the test, as well as the .stripMargin.trim on the input JSON from the test. It works all in one line, but I thought it might be more apparent split up. It's unclear to me if that "tpe" from the test is supposed to provide a measure of type safety for the incoming JSON.
Looks like the only other class they support for pickling is a BinaryPickle unless you want to roll your own. The latest scala-pickling snapshot jar requires quasiquotes to compile the code in this answer.
I tried something more complicated this morning and discovered that the "tpe" is required for non-primitives in the incoming JSON - which points out that the serialized string really must be compatible with the pickler (which I mixed into the above code):
case class J(a: Option[Boolean], b: Option[String], c: Option[Int]) { override def toString = s"J($a, $b, $c)" }
...
val jJson = """{"a": {"tpe": "scala.None.type"},
| "b":{"tpe": "scala.Some[java.lang.String]","x":"donut"},
| "c":{"tpe": "scala.Some[scala.Int]","x":47}}"""
val jPickle = JSONPickle( jJson.stripMargin.trim )
val unpickledJ: J = jPickle.unpickle[J]
println( s"$unpickledJ" )
...
where naturally, I had to use .value on a J(None, Some("donut"), Some(47)) to figure out how to create the jJson input value, to prevent the unpickling from throwing an exception.
The output for J looks like:
J(None, Some(donut), Some(47))
Looking at this test, it appears that if the incoming JSON is all primitives or case classes (or combinations thereof), the JSONPickle magic works, but some other classes like Option require extra "tpe" type information to unpickle correctly.