How to convert JSON to a shapeless HList in Scala?

I have JSON like {"name":"susan","age":25} and a hint describing its keys, like "name:String,age:Int". How can I create an HList from that JSON?

Based on the code you added and then deleted, it seems that, given a runtime-string hint "name:String,age:Int" and runtime-string JSON {"name":"susan","age":25}, you want to build an HList using runtime reflection. You can do this as follows:
import shapeless.HList
import scala.reflect.runtime
import scala.reflect.runtime.universe._
import scala.tools.reflect.ToolBox

val tb = runtime.currentMirror.mkToolBox()

val jsonStr = """{"name":"susan","age":25}"""
val hint = "name:String,age:Int"

// define a case class matching the hint, at runtime, inside the toolbox
val classType = tb.define(tb.parse(s"case class Test($hint)").asInstanceOf[ImplDef]).asClass.toType

// decode the JSON into an instance of that class and convert it to an HList
val hlist = tb.eval(q"""
  import io.circe.generic.auto._
  import io.circe.parser.decode
  val classInstance = decode[$classType]($jsonStr)
  import shapeless.Generic
  Generic[$classType].to(classInstance.toOption.get)
""").asInstanceOf[HList]

println(hlist) // susan :: 25 :: HNil
Please notice that everything here happens at runtime, so you have no access to the type String :: Int :: HNil at compile time: hlist has the static type HList (not String :: Int :: HNil), and a bare HList is really no better than a List[Any].
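For contrast, if the schema were known at compile time, no toolbox would be needed and the compiler itself would track the HList type. A minimal static sketch (same dependencies; the Test class here is the compile-time equivalent of the one generated above):
import shapeless.{::, Generic, HNil}
import io.circe.generic.auto._
import io.circe.parser.decode

case class Test(name: String, age: Int)

val decoded: Test = decode[Test]("""{"name":"susan","age":25}""").toOption.get
// the compiler statically knows the representation type here
val hlist: String :: Int :: HNil = Generic[Test].to(decoded) // susan :: 25 :: HNil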
build.sbt
libraryDependencies ++= Seq(
  scalaOrganization.value % "scala-reflect" % scalaVersion.value,
  scalaOrganization.value % "scala-compiler" % scalaVersion.value,
  "com.chuusai" %% "shapeless" % "2.4.0-M1",
  "io.circe" %% "circe-core" % "0.13.0",
  "io.circe" %% "circe-parser" % "0.13.0",
  "io.circe" %% "circe-generic" % "0.13.0"
)
Actually, what we are doing here is a bit strange. We take highly type-level libraries (shapeless, circe) aimed at static type safety, and then run them at runtime via reflection, ignoring type safety and effectively getting a List[Any] (the HList).
If a List[Any] (a list of field values) is enough for you, then a more runtime-oriented library will do. For example, with json4s:
import org.json4s.{JInt, JObject, JString, JValue}
import org.json4s.jackson.JsonMethods._

val jsonStr: String = """{"name":"susan","age":25}"""
val json: JValue = parse(jsonStr) // JObject(List((name,JString(susan)), (age,JInt(25))))
val l: List[JValue] = json.asInstanceOf[JObject].obj.map(_._2) // List(JString(susan), JInt(25))
val res: List[Any] = l.map {
  case JString(s) => s
  case JInt(n) => n
} // List(susan, 25)
build.sbt
libraryDependencies += "org.json4s" %% "json4s-jackson" % "3.6.9"
The same can be done with Circe, using parse instead of decode[A]:
import io.circe.{Json, JsonNumber}
import io.circe.parser.parse

val jsonStr: String = """{"name":"susan","age":25}"""
val json: Json = parse(jsonStr).toOption.get // {"name":"susan","age":25}
val l: List[Json] = json.asObject.get.values.toList // List("susan", 25)
// fold's arguments handle null, booleans, numbers, strings, arrays, objects, in that order;
// here we only expect numbers and strings
val res: List[Any] = l.map(_.fold[Any](null, null, (_: JsonNumber).toInt.get, identity[String], null, null)) // List(susan, 25)
If you need an instance of a case class or a tuple rather than an HList, replace
tb.eval(q"""
import io.circe.generic.auto._
import io.circe.parser.decode
val classInstance = decode[$classType]($jsonStr)
import shapeless.Generic
Generic[$classType].to(classInstance.toOption.get)
""").asInstanceOf[HList] // susan :: 25 :: HNil
with
tb.eval(q"""
import io.circe.generic.auto._
import io.circe.parser.decode
decode[$classType]($json).toOption.get
""").asInstanceOf[Product] //Test(susan,25)
or
tb.eval(q"""
import io.circe.generic.auto._
import io.circe.parser.decode
val classInstance = decode[$classType]($json)
import shapeless.Generic
Generic[$classType].to(classInstance.toOption.get).tupled
""").asInstanceOf[Product] //(susan,25)
correspondingly.

Related

Mapping a string to case classes using Scala play

Using the Scala Play JSON library, I'm attempting to parse the string:
var str = "{\"payload\": \"[{\\\"test\\\":\\\"123\\\",\\\"tester\\\":\\\"456\\\"}," +
"{\\\"test1\\\":\\\"1234\\\",\\\"tester2\\\":\\\"4567\\\"}]\"}";
into a list of Payload instances using the code below:
import play.api.libs.json._

object TestParse extends App {
  case class Payload(test: String, tester: String)
  object Payload {
    implicit val jsonFormat: Format[Payload] = Json.format[Payload]
  }
  var str = "{\"payload\": \"[{\\\"test\\\":\\\"123\\\",\\\"tester\\\":\\\"456\\\"}," +
    "{\\\"test1\\\":\\\"1234\\\",\\\"tester2\\\":\\\"4567\\\"}]\"}"
  println((Json.parse(str) \ "payload").as[List[Payload]])
}
build.sbt:
name := "akka-streams"
version := "0.1"
scalaVersion := "2.12.8"

lazy val akkaVersion = "2.5.19"
lazy val scalaTestVersion = "3.0.5"

libraryDependencies ++= Seq(
  "com.typesafe.akka" %% "akka-stream" % akkaVersion,
  "com.typesafe.akka" %% "akka-stream-testkit" % akkaVersion,
  "com.typesafe.akka" %% "akka-testkit" % akkaVersion,
  "org.scalatest" %% "scalatest" % scalaTestVersion
)
// https://mvnrepository.com/artifact/com.typesafe.play/play-json
libraryDependencies += "com.typesafe.play" %% "play-json" % "2.10.0-RC6"
It fails with the exception:
Exception in thread "main" play.api.libs.json.JsResultException: JsResultException(errors:List((,List(JsonValidationError(List("" is not an object),WrappedArray())))))
Is the case class structure incorrect?
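Note that the payload value in this string is itself a JSON-encoded string (hence the escaped quotes), so it would need to be parsed twice. A minimal sketch, assuming the Option-based Payload from the update below (since the second object uses different key names):
val payloadStr: String = (Json.parse(str) \ "payload").as[String] // payload is a string, not an array
val payloads: List[Payload] = Json.parse(payloadStr).as[List[Payload]] // second parse yields the actual array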
I've updated the code to:
import play.api.libs.json._

object TestParse extends App {
  import TestParse.Payload.jsonFormat

  object Payload {
    implicit val jsonFormat: Format[RootInterface] = Json.format[RootInterface]
  }

  case class Payload(
    test: Option[String],
    tester: Option[String]
  )

  case class RootInterface(
    payload: List[Payload]
  )

  val str = """{"payload": [{"test":"123","tester":"456"},{"test1":"1234","tester2":"4567"}]}"""
  println(Json.parse(str).as[RootInterface])
}
which returns the error:
No instance of play.api.libs.json.Format is available for scala.collection.immutable.List[TestParse.Payload] in the implicit scope (Hint: if declared in the same file, make sure it's declared before)
implicit val jsonFormat: Format[RootInterface] = Json.format[RootInterface]
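For reference, a minimal sketch of what that hint suggests, declaring the Format[Payload] before the Format[RootInterface] that depends on it (val names here are illustrative):
import play.api.libs.json._

object TestParse extends App {
  case class Payload(test: Option[String], tester: Option[String])
  case class RootInterface(payload: List[Payload])

  // Format[RootInterface] needs an implicit Format[Payload] already declared above it
  implicit val payloadFormat: Format[Payload] = Json.format[Payload]
  implicit val rootInterfaceFormat: Format[RootInterface] = Json.format[RootInterface]

  val str = """{"payload": [{"test":"123","tester":"456"},{"test1":"1234","tester2":"4567"}]}"""
  println(Json.parse(str).as[RootInterface]) // RootInterface(List(Payload(Some(123),Some(456)), Payload(None,None)))
}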
This performs the task, but there are cleaner solutions:
import akka.actor.ActorSystem
import akka.stream.scaladsl.{Flow, Sink, Source}
import org.scalatest.Assertions._
import spray.json.{JsObject, JsonParser}

import scala.concurrent.Await
import scala.concurrent.duration.DurationInt

object TestStream extends App {
  implicit val actorSystem = ActorSystem()

  val mapperFlow = Flow[JsObject].map(x => {
    x.fields.get("payload").get.toString()
      .replace("{", "")
      .replace("}", "")
      .replace("[", "")
      .replace("]", "")
      .replace("\"", "")
      .replace("\\", "")
      .split(":").map(m => m.split(","))
      .toList
      .flatten
      .grouped(4)
      .map(m => Test(m(1), m(3).toDouble))
      .toList
  })

  val str = """{"payload": [{"test":"123","tester":"456"},{"test":"1234","tester":"4567"}]}"""

  case class Test(test: String, tester: Double)

  val graph = Source.repeat(JsonParser(str).asJsObject())
    .take(3)
    .via(mapperFlow)
    .mapConcat(identity)
    .runWith(Sink.seq)

  val result = Await.result(graph, 3.seconds)
  println(result)

  assert(result.length == 6)
  assert(result(0).test == "123")
  assert(result(0).tester == 456)
  assert(result(1).test == "1234")
  assert(result(1).tester == 4567)
  assert(result(2).test == "123")
  assert(result(2).tester == 456)
  assert(result(3).test == "1234")
  assert(result(3).tester == 4567)
}
Alternative, idiomatic Scala answers are welcome.

ambiguous implicit values: match expected type cats.derived.MkShow[A]: show cats:kittens

I'm trying to create a Show instance for my custom Config class.
The build.sbt file is:
name := "circe-demo"
version := "0.1"
scalaVersion := "2.11.12"

resolvers += Resolver.bintrayRepo("ovotech", "maven")

libraryDependencies += "io.circe" %% "circe-core" % "0.11.0"
libraryDependencies += "io.circe" %% "circe-parser" % "0.11.0"
libraryDependencies += "io.circe" %% "circe-generic" % "0.11.0"
libraryDependencies += "org.typelevel" %% "kittens" % "1.2.0"
libraryDependencies ++= Seq(
  "is.cir" %% "ciris-cats",
  "is.cir" %% "ciris-cats-effect",
  "is.cir" %% "ciris-core",
  "is.cir" %% "ciris-enumeratum",
  "is.cir" %% "ciris-refined"
).map(_ % "0.12.1")
The complete code is:
import enumeratum.{Enum, EnumEntry}

sealed abstract class AppEnvironment extends EnumEntry

object AppEnvironment extends Enum[AppEnvironment] {
  case object Local extends AppEnvironment
  case object Testing extends AppEnvironment
  case object Production extends AppEnvironment

  override val values: Vector[AppEnvironment] =
    findValues.toVector
}

import java.net.InetAddress
import scala.concurrent.duration.Duration

final case class ApiConfig(host: InetAddress, port: Int, apiKey: String, timeout: Duration)

import java.net.InetAddress
import cats.Show
import cats.derived.semi
import ciris.config.loader.AppEnvironment.{Local, Production, Testing}
import enumeratum.EnumEntry
import eu.timepit.refined.auto._
import eu.timepit.refined.types.string.NonEmptyString

import scala.concurrent.duration._

final case class Config(appName: NonEmptyString, environment: AppEnvironment, api: ApiConfig)

object Config {
  implicit val showConfig: Show[Config] = {
    implicit val showDuration: Show[Duration] =
      Show.fromToString
    implicit val showInetAddress: Show[InetAddress] =
      Show.fromToString
    implicit def showEnumEntry[E <: EnumEntry]: Show[E] =
      Show.show(_.entryName)
    // Show.show[Config](x => s"api = ${x.api} appName = ${x.appName} environment ${x.environment}")
    semi.show
  }
}
semi.show in the above code throws the exception below:
[error] /Users/rajkumar.natarajan/Documents/Coding/kafka_demo/circe-demo/src/main/scala/ciris/config/loader/Config.scala:32:5: ambiguous implicit values:
[error] both value emptyProductDerivedShow in trait MkShowDerivation of type => cats.derived.MkShow[shapeless.HNil]
[error] and method emptyCoproductDerivedShow in trait MkShowDerivation of type => cats.derived.MkShow[shapeless.CNil]
[error] match expected type cats.derived.MkShow[A]
[error] show
[error] ^
[error] one error found
[error] (Compile / compileIncremental) Compilation failed
[error]
I'm new to functional programming using cats.
How can I resolve this exception?
Unfortunately, error reporting when such complicated implicits and macros are involved is far from perfect. The message you see actually means that some implicits required by the real generator (MkShow.genericDerivedShowProduct in this case) have not been found, and the search fell back to some basic stuff where there is an ambiguity. The missing pieces are mostly very basic, such as an implicit Show[Int] or Show[String]. The simplest way to get them all is to import cats.implicits._, but that will also bring in catsStdShowForDuration, which is a Show[Duration]. Since its implementation is really the same as your custom one, it is easier to remove your custom one. One more thing that is missing is a Show[NonEmptyString], and it is easy to create one:
implicit def showNonEmptyString: Show[NonEmptyString] = Show.show(nes => nes)
To sum up, when I define your showConfig as
implicit val showConfig: Show[Config] = {
  import cats.implicits._
  // Show[Duration] is already provided by cats.implicits._
  //implicit val showDuration: Show[Duration] = Show.fromToString
  implicit val showInetAddress: Show[InetAddress] = Show.fromToString
  implicit def showEnumEntry[E <: EnumEntry]: Show[E] = Show.show(_.entryName)
  implicit def showNonEmptyString: Show[NonEmptyString] = Show.show(nes => nes)
  // Show.show[Config](x => s"api = ${x.api} appName = ${x.appName} environment ${x.environment}")
  semi.show
}
it compiles for me.
P.S. Is there any good reason why you put your AppEnvironment under the ciris.* package? Generally, putting your custom code into the packages of a third-party library is an easy way to mess things up.

Circe Encoders and Decoders with Http4s

I am trying to use http4s, circe and http4s-circe.
Below I am trying to use the auto derivation feature of circe.
import org.http4s.client.blaze.SimpleHttp1Client
import org.http4s.Status.ResponseClass.Successful
import io.circe.syntax._
import org.http4s._
import org.http4s.headers._
import org.http4s.circe._
import scalaz.concurrent.Task
import io.circe._

final case class Login(username: String, password: String)
final case class Token(token: String)

object JsonHelpers {
  import io.circe.generic.auto._
  implicit val loginEntityEncoder: EntityEncoder[Login] = jsonEncoderOf[Login]
  implicit val loginEntityDecoder: EntityDecoder[Login] = jsonOf[Login]
  implicit val tokenEntityEncoder: EntityEncoder[Token] = jsonEncoderOf[Token]
  implicit val tokenEntityDecoder: EntityDecoder[Token] = jsonOf[Token]
}

object Http4sTest2 extends App {
  import JsonHelpers._

  val url = "http://"
  val uri = Uri.fromString(url).valueOr(throw _)
  val list = List[Header](`Content-Type`(MediaType.`application/json`), `Accept`(MediaType.`application/json`))

  val request = Request(uri = uri, method = Method.POST)
    .withBody(Login("foo", "bar").asJson)
    .map { r => r.replaceAllHeaders(list: _*) }.run

  val client = SimpleHttp1Client()
  val result = client.fetch[Option[Token]](request) {
    case Successful(response) => response.as[Token].map(Some(_))
    case _ => Task(Option.empty[Token])
  }.run

  println(result)
}
I get multiple instances of these two compiler errors:
Error:scalac: missing or invalid dependency detected while loading class file 'GenericInstances.class'.
Could not access type Secondary in object io.circe.Encoder,
because it (or its dependencies) are missing. Check your build definition for
missing or conflicting dependencies. (Re-run with `-Ylog-classpath` to see the problematic classpath.)
A full rebuild may help if 'GenericInstances.class' was compiled against an incompatible version of io.circe.Encoder.
Error:(25, 74) could not find implicit value for parameter encoder: io.circe.Encoder[Login]
implicit val loginEntityEncoder : EntityEncoder[Login] = jsonEncoderOf[Login]
I was able to solve this. I had searched Google for the sbt circe dependency and copy-pasted the first search result, which was circe 0.1; that is why things were not working for me. I changed my dependencies to:
libraryDependencies ++= Seq(
  "org.http4s" %% "http4s-core" % http4sVersion,
  "org.http4s" %% "http4s-dsl" % http4sVersion,
  "org.http4s" %% "http4s-blaze-client" % http4sVersion,
  "org.http4s" %% "http4s-circe" % http4sVersion,
  "io.circe" %% "circe-core" % "0.7.0",
  "io.circe" %% "circe-generic" % "0.7.0"
)
and now automatic derivation works fine, and I am able to compile the code below:
import org.http4s.client.blaze.SimpleHttp1Client
import org.http4s._
import org.http4s.headers._
import org.http4s.circe._
import scalaz.concurrent.Task
import io.circe.syntax._
import io.circe.generic.auto._
import org.http4s.Status.ResponseClass.Successful

case class Login(username: String, password: String)
case class Token(token: String)

object JsonHelpers {
  implicit val loginEntityEncoder: EntityEncoder[Login] = jsonEncoderOf[Login]
  implicit val loginEntityDecoder: EntityDecoder[Login] = jsonOf[Login]
  implicit val tokenEntityEncoder: EntityEncoder[Token] = jsonEncoderOf[Token]
  implicit val tokenEntityDecoder: EntityDecoder[Token] = jsonOf[Token]
}

object Http4sTest2 extends App {
  import JsonHelpers._

  val url = "http://"
  val uri = Uri.fromString(url).valueOr(throw _)
  val list = List[Header](`Content-Type`(MediaType.`application/json`), `Accept`(MediaType.`application/json`))

  val request = Request(uri = uri, method = Method.POST)
    .withBody(Login("foo", "bar").asJson)
    .map { r => r.replaceAllHeaders(list: _*) }.run

  val client = SimpleHttp1Client()
  val result = client.fetch[Option[Token]](request) {
    case Successful(response) => response.as[Token].map(Some(_))
    case _ => Task(Option.empty[Token])
  }.run

  println(result)
}

Why does Scala compiler fail with "object SparkConf in package spark cannot be accessed in package org.apache.spark"?

I cannot access SparkConf in the package, even though I have already imported org.apache.spark.SparkConf. My code is:
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import org.apache.spark.rdd.RDD
import org.apache.spark._
import org.apache.spark.streaming._
import org.apache.spark.streaming.StreamingContext._

object SparkStreaming {
  def main(arg: Array[String]) = {
    val conf = new SparkConf.setMaster("local[2]").setAppName("NetworkWordCount")
    val ssc = new StreamingContext(conf, Seconds(1))
    val lines = ssc.socketTextStream("localhost", 9999)
    val words = lines.flatMap(_.split(" "))
    val pairs_new = words.map(w => (w, 1))
    val wordsCount = pairs_new.reduceByKey(_ + _)
    wordsCount.print()
    ssc.start() // Start the computation
    ssc.awaitTermination() // Wait for the computation to terminate
  }
}
The sbt dependencies are:
name := "Spark Streaming"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.5.2" % "provided",
  "org.apache.spark" %% "spark-mllib" % "1.5.2",
  "org.apache.spark" %% "spark-streaming" % "1.5.2"
)
But the error says that SparkConf cannot be accessed:
[error] /home/cliu/Documents/github/Spark-Streaming/src/main/scala/Spark-Streaming.scala:31: object SparkConf in package spark cannot be accessed in package org.apache.spark
[error] val conf = new SparkConf.setMaster("local[2]").setAppName("NetworkWordCount")
[error] ^
It compiles if you add parentheses after SparkConf:
val conf = new SparkConf().setMaster("local[2]").setAppName("NetworkWordCount")
The point is that SparkConf is a class, not a function, so its name can also be used for scoping: without parentheses, new SparkConf.setMaster(...) is parsed as instantiating a type named setMaster found inside a value named SparkConf, rather than as a constructor call followed by a method call. When you add parentheses after the class name, you make sure you are calling the class constructor and not using the name for scoping. Here is an example from the Scala shell illustrating the difference:
scala> class C1 { var age = 0; def setAge(a: Int) = { age = a } }
defined class C1

scala> new C1
res18: C1 = $iwC$$iwC$C1@2d33c200

scala> new C1()
res19: C1 = $iwC$$iwC$C1@30822879

scala> new C1.setAge(30) // this doesn't work
<console>:23: error: not found: value C1
       new C1.setAge(30)
           ^

scala> new C1().setAge(30) // this works

scala>
In this case you cannot omit the parentheses, so it should be:
val conf = new SparkConf().setMaster("local[2]").setAppName("NetworkWordCount")

scala-pickling in POJO in scala 2.11 - Is it really simple?

I'm trying to use scala-pickling because its GitHub page makes it look so easy and clean. But I'm failing to use it in this simple REPL session:
scala> import scala.pickling._
import scala.pickling._

scala> import scala.pickling.Defaults._
import scala.pickling.Defaults._

scala> import binary._
import binary._

scala> class Xpto { var a = 0D; var b = 0 }
defined class Xpto

scala> val v = new Xpto { a = 1.23; b = 5 }
v: Xpto = $anon$1@636d2b03

scala> v.pickle
<console>:19: error: type mismatch;
 found   : v.type (with underlying type Xpto)
 required: ?{def pickle: ?}
Note that implicit conversions are not applicable because they are ambiguous:
 both method PickleOps in package pickling of type [T](picklee: T)pickling.PickleOps[T]
 and method pickleOps in trait Ops of type [T](picklee: T)scala.pickling.PickleOps[T]
 are possible conversion functions from v.type to ?{def pickle: ?}
       v.pickle
         ^
<console>:19: error: value pickle is not a member of Xpto
       v.pickle
         ^
What is wrong?
I looked at other StackOverflow questions of this same type, for example:
Scala pickling: Simple custom pickler for my own class?
Note: I'm using this dependency in build.sbt:
"org.scala-lang.modules" %% "scala-pickling" % "0.10.1"
Are you sure those are the only imports you are using in the REPL? The error above is, as it says:
Note that implicit conversions are not applicable because they are ambiguous:
both method PickleOps in package pickling of type [T](picklee: T)pickling.PickleOps[T]
and method pickleOps in trait Ops of type [T](picklee: T)scala.pickling.PickleOps[T]
are possible conversion functions from v.type to ?{def pickle: ?}
So you have at least two implicit conversions in scope, PickleOps in package pickling and pickleOps in trait Ops, both producing a scala.pickling.PickleOps[T]. This is strange, because PickleOps is not an implicit class.
For me it works (Scala version 2.11.7, Java 1.7.0_79) in a fresh REPL:

scala> import scala.pickling._

scala> import scala.pickling.Defaults._

scala> import binary._

scala> class Xpto { var a = 0D; var b = 0 }
defined class Xpto

scala> val v = new Xpto { a = 1.23; b = 5 }
v: Xpto = cmd5$$anonfun$1$$anon$1@244da0ed

scala> v.pickle
res6: pickleFormat.PickleType = BinaryPickle([0,0,0,23,99,109,100,53,36,36,97,110,111,110,102,117,110,36,49,36,36,97,110,111,110,36,49,63,-13,-82,20,122,-31,71,-82,0,0,0,5])
Great, it runs! I started a fresh Scala console.
I was using this reference to scala.pickling in build.sbt:
"org.scala-lang" %% "scala-pickling" % "0.10.1"
and now I'm using
"org.scala-lang.modules" %% "scala-pickling" % "0.10.1"
I'm also using Scala 2.11.6.
Now it works perfectly, and really it's so simple.
scala> import scala.pickling._
import scala.pickling._

scala> import scala.pickling.binary._
import scala.pickling.binary._

scala> import scala.pickling.Defaults._
import scala.pickling.Defaults._

scala> class Xpto { var a = 0D; var b = 0 }
defined class Xpto

scala> val v = new Xpto { a = 1.23; b = 4 }
v: Xpto = $anon$1@1e7bd4df

scala> v.pickle
res0: pickling.binary.pickleFormat.PickleType = BinaryPickle([0,0,0,52,46,108,105,110,101,55,46,46,114,101,97,100,46,46,105,119,46,46,105,119,46,46,105,119,46,46,105,119,46,46,105,119,46,46,105,119,46,46,105,119,46,46,105,119,46,46,97,110,111,110,46,49,63,-13,-82,20,122,-31,71,-82,0,0,0,4])
I don't know if my other library references were generating that ambiguous reference. My dependencies in build.sbt are:
libraryDependencies ++= Seq(
  "log4j" % "log4j" % "1.2.17",
  "javax.transaction" % "jta" % "1.1",
  "com.typesafe.akka" %% "akka-actor" % "2.3.10",
  "com.typesafe.akka" %% "akka-testkit" % "2.3.10",
  "org.scalatest" %% "scalatest" % "3.0.0-SNAP4" % "test",
  "org.apache.commons" % "commons-io" % "1.3.2",
  "com.typesafe.akka" %% "akka-slf4j" % "2.3.11",
  "ch.qos.logback" % "logback-classic" % "1.0.9",
  "org.scala-lang.modules" %% "scala-pickling" % "0.10.1"
)
Thanks to Markus.