Scala - How to call json4s extract with a dynamic case class created with ToolBox

I define a case class dynamically using ToolBox.
When I call json4s's extract on it, I get the exception below:
import org.json4s._
import scala.reflect.runtime._
import scala.tools.reflect.ToolBox
implicit val formats = DefaultFormats
val cm = universe.runtimeMirror(getClass.getClassLoader)
val toolBox = cm.mkToolBox()
val parse =
toolBox.parse(
s"""
| case class Person( name:String, age:String)
| scala.reflect.classTag[ Person].runtimeClass
""".stripMargin)
val person = toolBox.compile( parse)().asInstanceOf[Class[_]]
val js = JsonMethods.parse("""{ "name":"Tom","age" : "28"}""")
val jv = js.extract[person.type ] //How do I pass the class type?
**"Exception in thread "main" org.json4s.MappingException: No constructor for type Class, JObject(List((name,JString(Tom)), (age,JString(28))))"**
However, if I first create a dummy instance of the dynamically created class and pass that instance's type, the JSON is parsed successfully.
I don't understand why.
How can I parse it without creating a dummy instance?
import org.json4s._
import scala.reflect.runtime._
import scala.tools.reflect.ToolBox
implicit val formats = DefaultFormats
val cm = universe.runtimeMirror(getClass.getClassLoader)
val toolBox = cm.mkToolBox()
val parse =
toolBox.parse(
s"""
| case class Person( name:String, age:String)
| scala.reflect.classTag[ Person].runtimeClass
""".stripMargin)
val person = toolBox.compile( parse)().asInstanceOf[Class[_]]
val dummy = person.getConstructors.head.newInstance( "a", "b") //make dummy instance
val js = JsonMethods.parse("""{ "name":"Tom","age" : "28"}""")
println( js.extract[ dummy.type ] ) // Result: Person(Tom,28)

x.type is a singleton type, so person.type can't be right: it is the singleton type of that specific variable val person: Class[_], not the type of the generated class.
dummy.type, on the other hand, does work, thanks to runtime reflection. This works even for an ordinary case class:
import org.json4s._
import org.json4s.jackson.JsonMethods
implicit val formats = DefaultFormats
case class Person(name: String, age: String)
val js = JsonMethods.parse("""{ "name":"Tom","age" : "28"}""")
val dummy0: AnyRef = Person("a", "b")
val dummy: AnyRef = dummy0
js.extract[dummy.type] // Person(Tom,28)
Actually, after resolving implicits, js.extract[Person] is
js.extract[Person](formats, ManifestFactory.classType(classOf[Person]))
and js.extract[dummy.type] is
js.extract[dummy.type](formats, ManifestFactory.singleType(dummy))
So for a toolbox-generated case class we could try
import org.json4s._
import org.json4s.jackson.JsonMethods
import scala.reflect.ManifestFactory
import scala.reflect.runtime.universe
import scala.reflect.runtime.universe._
import scala.tools.reflect.ToolBox
val cm = universe.runtimeMirror(getClass.getClassLoader)
val toolBox = cm.mkToolBox()
implicit val formats = DefaultFormats
val person = toolBox.eval(q"""
case class Person(name:String, age:String)
scala.reflect.classTag[Person].runtimeClass
""").asInstanceOf[Class[_]]
val js = JsonMethods.parse("""{ "name":"Tom","age" : "28"}""")
js.extract(formats, ManifestFactory.classType(person))
// java.lang.ClassCastException: __wrapper$1$6246735221dc4d64a9e372a9d0891e5e.__wrapper$1$6246735221dc4d64a9e372a9d0891e5e$Person$1 cannot be cast to scala.runtime.Nothing$
(Here toolBox.eval(tree) is used instead of toolBox.compile(toolBox.parse(string))().)
But this doesn't work.
The Manifest should instead be captured at toolbox compile time:
import org.json4s._
import org.json4s.jackson.JsonMethods
import scala.reflect.runtime.universe
import scala.reflect.runtime.universe._
import scala.tools.reflect.ToolBox
val cm = universe.runtimeMirror(getClass.getClassLoader)
val toolBox = cm.mkToolBox()
implicit val formats = DefaultFormats
val person = toolBox.eval(q"""
case class Person(name:String, age:String)
val clazz = scala.reflect.classTag[Person].runtimeClass
scala.reflect.ManifestFactory.classType(clazz)
""").asInstanceOf[Manifest[_]]
val js = JsonMethods.parse("""{ "name":"Tom","age" : "28"}""")
js.extract(formats, person) // Person(Tom,28)
Alternatively, you don't need a Java-reflection Class at all. You can do:
import scala.reflect.runtime
import scala.reflect.runtime.universe._
import scala.tools.reflect.ToolBox
val cm = runtime.currentMirror
val toolBox = cm.mkToolBox()
toolBox.eval(q"""
import org.json4s._
import org.json4s.jackson.JsonMethods
implicit val formats = DefaultFormats
case class Person(name: String, age: String)
val js = JsonMethods.parse(${"""{"name":"Tom","age" : "28"}"""})
js.extract[Person]
""") // Person(Tom,28)
or
import org.json4s._
import org.json4s.jackson.JsonMethods
import scala.reflect.runtime
import scala.reflect.runtime.universe._
import scala.tools.reflect.ToolBox
object Main extends App {
  val cm = runtime.currentMirror
  val toolBox = cm.mkToolBox()
  implicit val formats = DefaultFormats

  val person: ClassSymbol = toolBox.define(q"case class Person(name: String, age: String)")

  val js = JsonMethods.parse("""{"name":"Tom","age" : "28"}""")
  val jv = toolBox.eval(q"""
    import Main._
    js.extract[$person]
  """)
  println(jv) // Person(Tom,28)
}

Related

Spark/Scala create empty dataset using generics in a trait

I have a trait that takes a type parameter, and one of its methods needs to be able to create an empty typed dataset.
trait MyTrait[T] {
  val sparkSession: SparkSession
  val spark = sparkSession.session
  val sparkContext = spark.sparkContext

  def createEmptyDataset(): Dataset[T] = {
    import spark.implicits._ // to access the .toDS() function
    // DOESN'T WORK.
    val emptyRDD = sparkContext.parallelize(Seq[T]())
    val accumulator = emptyRDD.toDS()
    ...
  }
}
So far I have not gotten it to work. The compiler complains that there is no ClassTag available for T, and that value toDS is not a member of org.apache.spark.rdd.RDD[T].
Any help would be appreciated. Thanks!
You have to provide both ClassTag[T] and Encoder[T] in the same scope. For example:
import org.apache.spark.sql.{SparkSession, Dataset, Encoder}
import scala.reflect.ClassTag
trait MyTrait[T] {
  val ct: ClassTag[T]
  val enc: Encoder[T]

  val sparkSession: SparkSession
  val sparkContext = sparkSession.sparkContext

  def createEmptyDataset(): Dataset[T] = {
    val emptyRDD = sparkContext.emptyRDD[T](ct)
    sparkSession.createDataset(emptyRDD)(enc)
  }
}
with a concrete implementation:
class Foo extends MyTrait[Int] {
  val sparkSession = SparkSession.builder.getOrCreate()
  import sparkSession.implicits._

  val ct = implicitly[ClassTag[Int]]
  val enc = implicitly[Encoder[Int]]
}
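A quick usage sketch (assuming spark.master is configured, e.g. local[*], so that getOrCreate() can build a session):
val foo = new Foo
val emptyInts = foo.createEmptyDataset() // Dataset[Int] with no rows
println(emptyInts.count())               // 0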
It is also possible to skip the RDD:
import org.apache.spark.sql.{SparkSession, Dataset, Encoder}
trait MyTrait[T] {
  val enc: Encoder[T]

  val sparkSession: SparkSession
  val sparkContext = sparkSession.sparkContext

  def createEmptyDataset(): Dataset[T] = {
    sparkSession.emptyDataset[T](enc)
  }
}
Check How to declare traits as taking implicit "constructor parameters"?, specifically the answer by Blaisorblade and another one by Alexey Romanov.
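One option sketched from that discussion (assuming turning the trait into an abstract class is acceptable for your design; MyBase is a name I made up here) is to take the evidence as implicit constructor parameters instead of abstract vals:
import org.apache.spark.sql.{Dataset, Encoder, SparkSession}
import scala.reflect.ClassTag

abstract class MyBase[T](implicit ct: ClassTag[T], enc: Encoder[T]) {
  val sparkSession: SparkSession

  // both implicits are in scope inside the body, so nothing has to be passed explicitly
  def createEmptyDataset(): Dataset[T] =
    sparkSession.createDataset(sparkSession.sparkContext.emptyRDD[T])
}
The concrete subclass then only needs a ClassTag[T] and an Encoder[T] in scope at the point where it extends MyBase.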

Accessing a case class annotation

case class FieldName(field: String) extends scala.annotation.StaticAnnotation
#FieldName("foo") trait Foo
import scala.reflect.runtime.universe._
symbolOf[Foo].annotations.head
// ann: Annotation = FieldName("foo")
How do I access the annotation as a FieldName object? The doc mentions tree.children.tail, but there are no types to be had.
If you really want the instance of FieldName the best I can think of is using a ToolBox:
scala> case class FieldName(field: String) extends scala.annotation.StaticAnnotation
defined class FieldName
scala> #FieldName("foo") trait Foo
defined trait Foo
scala> import scala.reflect.runtime.universe._
import scala.reflect.runtime.universe._
scala> val annotation = symbolOf[Foo].annotations.head
annotation: reflect.runtime.universe.Annotation = FieldName("foo")
scala> import scala.tools.reflect.ToolBox
import scala.tools.reflect.ToolBox
scala> val tb = runtimeMirror(getClass.getClassLoader).mkToolBox()
tb: scala.tools.reflect.ToolBox[reflect.runtime.universe.type] = scala.tools.reflect.ToolBoxFactory$ToolBoxImpl#1b26a499
scala> tb.eval(tb.untypecheck(annotation.tree)).asInstanceOf[FieldName]
res10: FieldName = FieldName(foo)
With .tree.children.tail you can access the arguments passed to FieldName without creating the actual instance.
scala> annotation.tree.children.tail.map{ case Literal(Constant(field)) => field }
res11: List[Any] = List(foo)
If you just want all the FieldName annotations and to extract their values, you can do this:
scala> val fields = symbolOf[Foo].annotations.withFilter(
| a => a.tree.tpe <:< typeOf[FieldName]
| ).flatMap(
| a => a.tree.children.tail.map{ case Literal(Constant(field)) => field }
| )
fields: List[Any] = List(foo)

How do I supply an implicit value for an akka.stream.Materializer when sending a FakeRequest?

I'm trying to make sense of the error(s) I'm seeing below, and to learn how to fix it.
could not find implicit value for parameter materializer: akka.stream.Materializer
val fut: Future[Result] = action.apply(fakeRequest).run
^
not enough arguments for method run (implicit materializer: akka.stream.Materializer)scala.concurrent.Future[play.api.mvc.Result].
Unspecified value parameter materializer.
val fut: Future[Result] = action.apply(fakeRequest).run
^
Here is the test code that produced the error(s):
package com.foo.test
import com.foo.{Api, BoundingBox}
import org.scalatest.{FlatSpec, Matchers}
import play.api.libs.json._
import play.api.mvc._
import play.api.test.{FakeHeaders, FakeRequest}
import scala.concurrent.duration._
import scala.concurrent.{Await, Future}
class TestJmlPlay extends FlatSpec with Matchers {

  val bbox = new BoundingBox(-76.778154438007732F, 39.239828198015971F, -76.501003519894326F, 39.354663763993926F)

  "latitudes" should "be between swLat and neLat" in {
    val action: Action[AnyContent] = (new Api).getForPlay(bbox)
    val jsonStr = getStringFromAction(action)
    areLatitudesOk(jsonStr, bbox) shouldBe true
  }

  private def getStringFromAction(action: Action[AnyContent]): String = {
    val fakeRequest: Request[String] = new FakeRequest("fakeMethod", "fakeUrl", new FakeHeaders, "fakeBody")
    val fut: Future[Result] = action.apply(fakeRequest).run // <== ERROR!
    val result = Await.result(fut, 5000 milliseconds)
    result.body.toString
  }

  private def areLatitudesOk(jsonStr: String, bbox: BoundingBox): Boolean = ...
}
You can create an implicit ActorMaterializer within your test class which will use testkit's ActorSystem:
import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.testkit.TestKit

class TestJmlPlay(_system: ActorSystem) extends TestKit(_system) ... {

  implicit val materializer: ActorMaterializer = ActorMaterializer()

  val bbox = ...
You don't need a Materializer.
I believe you are not calling the right action.apply method.
You want def apply(request: Request[A]): Future[Result].
To call it, you need a FakeRequest[AnyContent], the same parametrized type as action: Action[AnyContent]. This type is forced by the PlayBodyParser I believe you set for your action.
After that you don't need the .run call.
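A minimal sketch of that approach (reusing the action and bbox from the question; FakeRequest(method, path) from play.api.test builds a request whose body type conforms to AnyContent):
import play.api.mvc._
import play.api.test.FakeRequest
import scala.concurrent.Future

val action: Action[AnyContent] = (new Api).getForPlay(bbox)
val fakeRequest: Request[AnyContent] = FakeRequest("GET", "/fake")
val fut: Future[Result] = action.apply(fakeRequest) // no .run, so no implicit Materializer is required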

Not enough arguments for method unmarshal: (implicit evidence$1: spray.httpx.unmarshalling.FromResponseUnmarshaller

I am switching from SprayJsonSupport to argonaut, based on this example.
After some code modifications:
object ElevationJsonProtocol extends DefaultJsonProtocol {
  implicit val locationCodec: CodecJson[Elevation] = casecodec2(Elevation, Elevation.unapply)("location", "elevation")
  implicit val elevationCodec: CodecJson[Location] = casecodec2(Location, Location.unapply)("lat", "lng")
  implicit def googleApiResultCodec: CodecJson[GoogleApiResult] = casecodec2(GoogleApiResult, GoogleApiResult.unapply)("status", "results")
}
I get this error:
Error:(41, 42) not enough arguments for method unmarshal: (implicit evidence$1: spray.httpx.unmarshalling.FromResponseUnmarshaller[GoogleApiResult])spray.http.HttpResponse => GoogleApiResult.
Unspecified value parameter evidence$1.
val pipeline = sendReceive ~> unmarshal[GoogleApiResult]
^
I took a look at the unmarshal method:
def unmarshal[T](implicit evidence$1 : spray.httpx.unmarshalling.FromResponseUnmarshaller[T]) : scala.Function1[spray.http.HttpResponse, T]
How can I provide the implicit parameter? And why did I not get such an error with SprayJsonSupport?
The whole code:
import spray.httpx.unmarshalling.FromResponseUnmarshaller
import scala.util.{Success, Failure}
import scala.concurrent.duration._
import akka.actor.ActorSystem
import akka.pattern.ask
import akka.event.Logging
import akka.io.IO
import spray.json.{JsonFormat, DefaultJsonProtocol}
import spray.can.Http
import spray.httpx.SprayJsonSupport
import spray.client.pipelining._
import spray.util._
import argonaut._, Argonaut._
case class Elevation(location: Location, elevation: Double)
case class Location(lat: Double, lng: Double)
case class GoogleApiResult(status: String, results: List[Elevation])
object ElevationJsonProtocol extends DefaultJsonProtocol {
  implicit val locationCodec: CodecJson[Elevation] = casecodec2(Elevation, Elevation.unapply)("location", "elevation")
  implicit val elevationCodec: CodecJson[Location] = casecodec2(Location, Location.unapply)("lat", "lng")
  implicit def googleApiResultCodec: CodecJson[GoogleApiResult] = casecodec2(GoogleApiResult, GoogleApiResult.unapply)("status", "results")
}

object Main extends App {
  // we need an ActorSystem to host our application in
  implicit val system = ActorSystem("simple-spray-client")
  import system.dispatcher // execution context for futures below
  val log = Logging(system, getClass)

  log.info("Requesting the elevation of Mt. Everest from Googles Elevation API...")

  import ElevationJsonProtocol._
  val pipeline = sendReceive ~> unmarshal[GoogleApiResult]

  val responseFuture = pipeline (
    Get("http://maps.googleapis.com/maps/api/elevation/json?locations=27.988056,86.925278&sensor=false")
  )

  responseFuture onComplete {
    case Success(GoogleApiResult(_, Elevation(_, elevation) :: _)) =>
      log.info("The elevation of Mt. Everest is: {} m", elevation)
      shutdown()
    case Success(somethingUnexpected) =>
      log.warning("The Google API call was successful but returned something unexpected: '{}'.", somethingUnexpected)
      shutdown()
    case Failure(error) =>
      log.error(error, "Couldn't get elevation")
      shutdown()
  }

  def shutdown(): Unit = {
    IO(Http).ask(Http.CloseAll)(1.second).await
    system.shutdown()
  }
}
I don't really use argonaut; I use play-json with spray. But at a glance it seems like there needs to be an argonaut support trait/import pulled in so that your implicit codecs can be converted to spray's unmarshaller (a similar thing is required for play-json).
https://github.com/dwhjames/argonaut-spray
This library seems to be what you want. Your implicits and imports look fine; pulling in the library should solve your problem.
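A rough sketch of what pulling it in might look like; the artifact coordinates and import path below are hypothetical guesses that should be verified against the argonaut-spray README:
// build.sbt -- hypothetical coordinates, verify against the project README
libraryDependencies += "com.github.dwhjames" %% "argonaut-spray" % "<version>"

// in the client code -- hypothetical import, verify against the project README
import com.github.dwhjames.argonaut.ArgonautSupport._

// with an implicit CodecJson[GoogleApiResult] in scope, the support import is expected
// to supply the FromResponseUnmarshaller that unmarshal[GoogleApiResult] requires:
val pipeline = sendReceive ~> unmarshal[GoogleApiResult]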

No implicit format for List for BSONObjectID

One of my models includes a list of BSONObjectIDs:
case class User(
  _id: BSONObjectID = BSONObjectID.generate,
  email: String,
  favorite_ids: List[BSONObjectID] = List(),
  home_folder_id: Option[BSONObjectID] = None
)
Unfortunately, the compiler complains with the following message:
No implicit format for List[reactivemongo.bson.BSONObjectID]
available.
It complains about the last line of the following snippet:
import play.api.libs.json._
import reactivemongo.bson._
import play.modules.reactivemongo.json.BSONFormats._
import play.modules.reactivemongo.json._, ImplicitBSONHandlers._
import play.modules.reactivemongo.json.collection._
implicit val userFormat = Json.format[User]
Funny observation: the Option[BSONObjectID] works when I comment the List[...] line out.
Does anyone know how to include a format for lists? I figured that should be available implicitly.
Thanks.
You can try with the snapshot "org.reactivemongo" %% "play2-reactivemongo" % "0.11.2.play24-SNAPSHOT":
scala> import play.modules.reactivemongo.json._
import play.modules.reactivemongo.json._
scala> import reactivemongo.bson._
import reactivemongo.bson._
scala> import play.api.libs.json._
import play.api.libs.json._
scala> implicitly[Reads[BSONObjectID]]
res0: play.api.libs.json.Reads[reactivemongo.bson.BSONObjectID] = play.modules.reactivemongo.json.BSONFormats$BSONObjectIDFormat$#4d27019c
scala> implicitly[Writes[BSONObjectID]]
res1: play.api.libs.json.Writes[reactivemongo.bson.BSONObjectID] = play.modules.reactivemongo.json.BSONFormats$BSONObjectIDFormat$#4d27019c
scala> implicitly[Format[BSONObjectID]]
res2: play.api.libs.json.Format[reactivemongo.bson.BSONObjectID] = play.modules.reactivemongo.json.BSONFormats$BSONObjectIDFormat$#4d27019c
scala> implicitly[Format[List[BSONObjectID]]]
res3: play.api.libs.json.Format[List[reactivemongo.bson.BSONObjectID]] = play.api.libs.json.DefaultFormat$$anon$4#43b5fbbd
scala> implicitly[Reads[JsObject]]
res4: play.api.libs.json.Reads[play.api.libs.json.JsObject] = play.api.libs.json.DefaultReads$JsObjectReads$#78a1f869
scala> implicitly[OWrites[BSONDocument]]
res5: play.api.libs.json.OWrites[reactivemongo.bson.BSONDocument] = play.modules.reactivemongo.json.ImplicitBSONHandlers$BSONDocumentWrites$#1763c4c3
The implicits are all provided by the unified import play.modules.reactivemongo.json._.
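So with that snapshot on the classpath, a minimal sketch of the model from the question should compile with just the unified import (User is the case class from the question):
import play.api.libs.json._
import play.modules.reactivemongo.json._
import reactivemongo.bson._

case class User(
  _id: BSONObjectID = BSONObjectID.generate,
  email: String,
  favorite_ids: List[BSONObjectID] = List(),
  home_folder_id: Option[BSONObjectID] = None
)

// Format[BSONObjectID] from the unified import gives the List and Option formats for free,
// so the format macro now resolves:
implicit val userFormat = Json.format[User]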