I am experimenting with Scala and Play. I want to return a tuple, in this case a Tuple3, but it could be a tuple of any size. I want to serialize the tuple as JSON, but Play doesn't seem to know how to serialize a Tuple.
I'm just trying to do something very simple, like the following:
def getClient(clientId: Int) = Action {
  val result = ("I", "AM", "TUPLE")
  Ok(Json.toJson(result))
}
No Json serializer found for type (String, String, String). Try to implement an implicit Writes or Format for this type.
I tried something like this, but it only seems to work for Tuple2:
val seq = Seq[(String, String)](("attr1" -> "val1"), ("attr2" -> "val2"))
val s = Json.toJson(seq.map(e => Json.obj(e._1 -> e._2)))
Ok(s).as(JSON)
You can create a case class like this:
case class MyCaseClass(string1: String, string2: String, string3: String)
Then add an implicit Format for it:
implicit val myCaseClassFormat = Json.format[MyCaseClass]
Then you can call Json.toJson(MyCaseClass("I", "AM", "TUPLE")).
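If you'd rather keep the tuple itself, you can also write a Writes for Tuple3 by hand. A minimal sketch, assuming play-json, that serializes the tuple as a JSON array (you'd need a similar instance per arity):
import play.api.libs.json._

// Serializes any Tuple3 as a JSON array, provided each element type
// has its own Writes in scope.
implicit def tuple3Writes[A: Writes, B: Writes, C: Writes]: Writes[(A, B, C)] =
  Writes { case (a, b, c) => Json.arr(Json.toJson(a), Json.toJson(b), Json.toJson(c)) }
With this in scope, Json.toJson(("I", "AM", "TUPLE")) yields ["I","AM","TUPLE"].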
From the following code:
type Structure = Map[String, Any]

def getStructureSourceNames(structuresDesc: Structure): Iterable[String] = {
  val subColsDesc: Map[String, String] =
    structuresDesc.filter(_._2.isInstanceOf[String]).asInstanceOf[Map[String, String]]
  val subStructuresDesc: Map[String, Structure] = structuresDesc
    .filter(_._2.isInstanceOf[Map[String, Structure]])
    .asInstanceOf[Map[String, Structure]]
  subColsDesc.values ++ subStructuresDesc.values.flatMap(getStructureSourceNames(_))
}
I want to pass a recursive Map of (String -> String); i.e. an example of a Structure is:
Map("test" -> Map(
"newid" -> "id",
"newstring" -> "string",
"toto" -> Map("newdouble" -> "double")
),
"otherid" -> "id")
The method getStructureSourceNames should return the list of "final" values, i.e. browse the whole tree and, for each leaf, collect the String value.
When I run this code, I get:
Warning:(78, 32) non-variable type argument String in type scala.collection.immutable.Map[String,Structure] (the underlying of Map[String,Structure]) is unchecked since it is eliminated by erasure
.filter(_._2.isInstanceOf[Map[String, Structure]])
Moreover, I don't like using isInstanceOf / asInstanceOf. From googling, I found that I could use pattern matching to check the type and get the Map with the expected typing, but I can't find how to do it.
Would you have an example of such code?
There are two kinds of pattern matching:
1) pattern matching on a sealed trait (good)
2) pattern matching where patterns involve arbitrary classes and equality checks (no better than isInstanceOf checks)
To avoid 2) you need to make the type you want to match a sealed trait:
sealed trait ConfigValue
case class StringValue(v: String) extends ConfigValue
case class MapValue(map: Map[String, ConfigValue]) extends ConfigValue

val struct: ConfigValue = MapValue(Map(
  "key1" -> StringValue("v1"),
  "key2" -> MapValue(Map("sub" -> StringValue("val")))))

def allValues(s: ConfigValue): Iterable[String] = s match {
  case StringValue(v) => Seq(v)
  case MapValue(map)  => map.values.flatMap(allValues)
}
println(allValues(struct))
By the way, your structure looks similar to JSON. Maybe you could reuse a JSON library, whose AST is already a sealed hierarchy you can match on.
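For example, the same traversal written against play-json's AST (a sketch; JsString and JsObject play the roles of StringValue and MapValue here):
import play.api.libs.json._

// Collect every string leaf in a JSON tree, analogous to allValues above.
def allJsonValues(js: JsValue): Iterable[String] = js match {
  case JsString(v)   => Seq(v)
  case obj: JsObject => obj.value.values.flatMap(allJsonValues)
  case _             => Seq.empty
}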
I'm trying to create an encoder and decoder for a case class I have:
case class Road(id: String, light: RoadLight, names: Map[String, String])
RoadLight is a Java enum:
public enum RoadLight {
    red, yellow, green
}
I have tried semi-automatic encoding and decoding, by making implicit encoders and decoders.
I started with the Map[String, String] type:
implicit val namesDecoder: Decoder[Map[String, String]] = deriveDecoder[Map[String, String]]
implicit val namesEncoder: Encoder[Map[String, String]] = deriveEncoder[Map[String, String]]
But I got an error for both of them:
1: could not find Lazy implicit value of type io.circe.generic.decoding.DerivedDecoder[A]
2: Error: not enough arguments for method deriveDecoder: (implicit decode: shapeless.Lazy[io.circe.generic.decoding.DerivedDecoder[A]])io.circe.Decoder[A]. Unspecified value parameter decode.
implicit val namesDecoder: Decoder[Map[String, String]] = deriveDecoder
I've done everything by the book but can't understand what's wrong. I'm not even trying to parse the case class, only the map, and even that doesn't work.
Any ideas? Thanks!
The scaladoc says:
/**
 * Semi-automatic codec derivation.
 *
 * This object provides helpers for creating [[io.circe.Decoder]] and [[io.circe.ObjectEncoder]]
 * instances for case classes, "incomplete" case classes, sealed trait hierarchies, etc.
 */
A Map is neither a case class nor part of a sealed trait hierarchy, so semi-automatic derivation does not apply to it. See https://github.com/circe/circe/issues/216
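In fact, no derivation is needed here at all: circe already provides codecs for Map[String, String] out of the box. A quick sketch:
import io.circe.parser.decode
import io.circe.syntax._

// Map[String, String] codecs ship with circe; no deriveDecoder/deriveEncoder needed.
val json = Map("a" -> "1", "b" -> "2").asJson.noSpaces // {"a":"1","b":"2"}
val back = decode[Map[String, String]](json)           // Right(Map(a -> 1, b -> 2))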
circe-generic does not create codecs for Java enums, only for Scala product and sum types. But rolling your own for RoadLight is not hard, and once you have that, you get the map for free.
The code below works:
import io.circe.{Decoder, Encoder}

object RoadLightCodecs {
  // Matches the enum constants exactly as declared in the Java RoadLight enum.
  implicit val decRl: Decoder[RoadLight] = Decoder.decodeString.emap {
    case "red"    => Right(RoadLight.red)
    case "yellow" => Right(RoadLight.yellow)
    case "green"  => Right(RoadLight.green)
    case s        => Left(s"Unrecognised traffic light $s")
  }

  implicit val encRl: Encoder[RoadLight] = Encoder.encodeString.contramap(_.toString)

  implicit val decodeMap: Decoder[Map[String, RoadLight]] = Decoder.decodeMap[String, RoadLight]
  implicit val encodeMap: Encoder[Map[String, RoadLight]] = Encoder.encodeMap[String, RoadLight]
}
So what we have done is write codecs for the basic types and then use them to build the bigger map codec.
Now, as far as I am aware, there aren't any libraries that do this automatically for Java enums, although it should theoretically be possible to write one. But using combinators on basic codecs to build up more complex ones works great and scales well.
EDIT: I had a play at auto-deriving Java enum codecs, and you can almost do it:
def decodeEnum[E <: Enum[E]](values: Array[E]): Decoder[E] =
  Decoder.decodeString.emap { str =>
    values.find(_.toString.toLowerCase == str)
      .fold[Either[String, E]](Left(s"Value $str does not map correctly"))(Right(_))
  }

def encodeEnum[E <: Enum[E]]: Encoder[E] =
  Encoder.encodeString.contramap(_.toString.toLowerCase)

implicit val roadLightDecoder: Decoder[RoadLight] = decodeEnum[RoadLight](RoadLight.values())
implicit val roadLightEncoder: Encoder[RoadLight] = encodeEnum[RoadLight]
So encodeEnum could be automatic (you could make it an implicit def instead of going through the val at the end), but the decoder needs to be given the values (which I see no way of getting automatically from the type), so you need to pass those when creating the codec.
I have a Scala 2.11 function which creates a case class from a Map, based on the provided class type.
import scala.reflect.runtime.universe._

def createCaseClass[T: TypeTag, A](someMap: Map[String, A]): T = {
  val rMirror = runtimeMirror(getClass.getClassLoader)
  val myClass = typeOf[T].typeSymbol.asClass
  val cMirror = rMirror.reflectClass(myClass)

  // The primary constructor is the first one
  val ctor = typeOf[T].decl(termNames.CONSTRUCTOR).asTerm.alternatives.head.asMethod
  val argList = ctor.paramLists.flatten.map(param => someMap(param.name.toString))

  cMirror.reflectConstructor(ctor)(argList: _*).asInstanceOf[T]
}
I'm trying to use this in the context of a Spark DataFrame, as a UDF. However, I'm not sure of the best way to pass the case class. The approach below doesn't seem to work.
def myUDF[T: TypeTag] = udf { (inMap: Map[String, Long]) =>
  createCaseClass[T](inMap)
}
I'm looking for something like this:
case class MyType(c1: String, c2: Long)

val myUDF = udf { (inMap: Map[String, Long]) => createCaseClass[MyType](inMap) }
Thoughts and suggestions on how to resolve this are appreciated.
However, I'm not sure of the best way to pass the case class
It is not possible to use case classes as arguments for user-defined functions. SQL StructTypes are mapped to dynamically typed (for lack of a better word) Row objects.
If you want to operate on statically typed objects please use statically typed Dataset.
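For illustration, a sketch of the statically typed route, assuming a DataFrame df with a Map[String, Long] column named inMap (as in the question) and a hypothetical MyType with Long fields to match the map's values:
import org.apache.spark.sql.SparkSession

case class MyType(c1: Long, c2: Long)

val spark = SparkSession.builder.getOrCreate()
import spark.implicits._

// Map over a typed Dataset instead of calling a UDF on Rows.
val typed = df.select($"inMap").as[Map[String, Long]]
  .map(m => MyType(m("c1"), m("c2")))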
From trial and error I learned that whatever data structure is stored in a DataFrame or Dataset is represented using org.apache.spark.sql.types.
You can see this with:
df.schema.toString
Basic types like Int and Double are stored as:
StructField(fieldname,IntegerType,true), StructField(fieldname,DoubleType,true)
Complex types like case classes are transformed into a combination of nested types:
StructType(StructField(..), StructField(..), StructType(..))
Sample code:
case class range(min: Double, max: Double)
org.apache.spark.sql.Encoders.product[range].schema

// Output:
// org.apache.spark.sql.types.StructType = StructType(StructField(min,DoubleType,false), StructField(max,DoubleType,false))
The UDF parameter type in these cases is Row, or Seq[Row] when you store an array of case classes.
A basic debugging technique is to print the schema as a string:
val myUdf = udf((r: Row) => r.schema.toString)
Then, to see what happens:
df.take(1).foreach(println)
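For instance, assuming Spark 2.x (where a struct column arrives in a Scala UDF as a Row) and a hypothetical struct column named structCol, the debug UDF can be applied like this:
import org.apache.spark.sql.Row
import org.apache.spark.sql.functions.{col, udf}

// Print the runtime schema of a struct column; "structCol" is an assumed name.
val myUdf = udf((r: Row) => r.schema.toString)
df.select(myUdf(col("structCol"))).take(1).foreach(println)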
I ran into a problem today when trying to convert a Seq of case classes into a Spark Dataset. I thought I'd share the solution here, as it was tough to pin down.
I have a case class I am trying to convert to a Dataset:
case class Foo(name: String, names: Option[List[String]])
val myData: Seq[Foo] = Seq(Foo("A", Some(List("T","U"))),
Foo("B", Some(List("V","W"))))
val myFooDataset = sparkSession.createDataset(myData)
This errors out and complains that there is no encoder. How can I get this to work?
The answer in this case is to convert your embedded Lists to Seq. In fact, just having a column that is a List (without being wrapped in an Option) will work, but as soon as you wrap it in an Option it needs to be a Seq instead.
case class Foo(name: String, names: Option[Seq[String]])
val myData: Seq[Foo] = Seq(Foo("A", Some(Seq("T","U"))),
Foo("B", Some(Seq("V","W"))))
val myFooDataset = sparkSession.createDataset(myData)
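One detail the snippets above gloss over: createDataset needs an implicit Encoder[Foo] in scope, which normally comes from the session's implicits:
// Brings the implicit Encoder[Foo] (and friends) into scope for createDataset.
import sparkSession.implicits._
Without that import, both versions fail to compile with an "Unable to find encoder" error.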
I have two case classes:
case class OutlierPortal(portal: String, timeData: Seq[OutlierPortalTimeSeriesData])
and
case class OutlierPortalTimeSeriesData(period: Timestamp, totalAmount: Double, isOutlier: Int)
and, correspondingly, a Seq[OutlierPortal].
What I want to perform is similar to Scala Macros: Making a Map out of fields of a class in Scala, but I want to map a sequence of (nested) case classes to Seq[Map[String, Any]].
However, being new to Scala, I'm a bit wary of the proposed idea of macros. Is there a "simpler" way to map this Seq[OutlierPortal] to Seq[Map[String, Any]]?
Or would you recommend starting to use macros even as a Scala beginner? For me a one-way conversion (case class -> map) is enough.
If you're looking to avoid fancy tricks, and you don't have too many classes to write this for, you can just write the methods to create the maps yourself. I'd suggest adding a method named something like toMap to your case classes. OutlierPortalTimeSeriesData's is simple if you use the Map() constructor:
case class OutlierPortalTimeSeriesData(period: Timestamp, totalAmount: Double, isOutlier: Int) {
  def toMap: Map[String, Any] = Map(
    "period" -> period,
    "totalAmount" -> totalAmount,
    "isOutlier" -> isOutlier)
}
I suppose there's some duplication there, but at least if you ever have a reason to change the string values but not the variable names, you have the flexibility to do that.
To take a sequence of something you can call toMap on, and turn it into a Seq[Map[String, Any]], just use map:
mySeq.map { _.toMap }
We can use this both to write OutlierPortal's toMap:
case class OutlierPortal(portal: String, timeData: Seq[OutlierPortalTimeSeriesData]) {
  def toMap: Map[String, Any] = Map(
    "portal" -> portal,
    "timeData" -> timeData.map { _.toMap })
}
and then again to convert a Seq[OutlierPortal] to a Seq[Map[String, Any]].
Depending on how you're using these objects and methods, you might want to define a trait that distinguishes classes with this method, and have your case classes extend it:
trait HasToMap { def toMap: Map[String, Any] }

case class Blah( /* ... */ ) extends HasToMap {
  def toMap: Map[String, Any] = /* ... */
}
This would let you take a value that you know you can convert to Map[String, Any] (or a sequence of them, etc.) in a method that otherwise doesn't care which particular type it is.
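As an aside: on Scala 2.13+ (an assumption; this API doesn't exist in earlier versions) there's a middle ground between hand-written toMap methods and macros, using Product#productElementNames. Note it is shallow: nested case classes stay as objects rather than becoming maps themselves.
// One-way, non-recursive conversion for any case class (Scala 2.13+ only).
def caseClassToMap(p: Product): Map[String, Any] =
  p.productElementNames.zip(p.productIterator).toMap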