Entity decoder for JSON that returns a list/seq of any value - Scala

I am building the back-end of an app using http4s. In the app I receive a JSON response from an external API (not the one I am working on). The API response follows the pattern below.
JSON response:
`{
  "datatable" : {
    "data" : [["AAPl", "MRT", "2020-03-20", 123, 123, 12.4, 233, 3234],
              ["AAPl", "MRT", "2020-03-20", 123, 123, 12.4, 233, 3234]],
    "meta": {
      "next_date": null
    }
  }
}`
My question is: can someone show me how to create an entity decoder and entity encoder that would decode this pattern? I can't seem to get it to work.
Currently I have:
object Stocks {
  case class tickerInfo(ticker: String, dim: String, date: String, a: Int, b: Int, c: Float, d: Int, e: Int)
  case class data(data: Seq[tickerInfo])
  case class meta(next_date: Option[String])
  case class table(data: data, meta: meta)
  case class stockInfo(datatable: table)

  object stockInfo {
    implicit def stockInfoEntityDecoder[F[_]: Sync]: EntityDecoder[F, stockInfo] = jsonOf
    implicit def stockInfoEntityEncoder[F[_]: Applicative]: EntityEncoder[F, stockInfo] = jsonEncoderOf
  }

  val decodedJson = C.expect[stockInfo](GET(Uri.uri("www.externalApi.com")))
}
But this doesn't work. Please can someone tell me where I am going wrong?
I am getting a runtime error (not a compile error), and it's an http4s error that says InvalidMessageBodyFailure.
Thanks

You have a couple of errors in your model, but the main problem is that circe won't be able to automatically decode a JSON array of values into a case class.
If you cannot modify the source of the data, you will need to create your own custom codec.
import cats.implicits._ // for the tupled mapN syntax
import io.circe.{Decoder, Json}
import io.circe.generic.semiauto.deriveDecoder

object Stocks {
  final case class TickerInfo(ticker: String, dim: String, date: String, a: Int, b: Int, c: Float, d: Int, e: Int)
  final case class Meta(next_date: Option[String])
  final case class Table(data: List[TickerInfo], meta: Meta)
  final case class StockInfo(datatable: Table)

  object StockInfo {
    implicit final val TickerInfoDecoder: Decoder[TickerInfo] = Decoder[List[Json]].emap {
      case ticker :: dim :: date :: a :: b :: c :: d :: e :: Nil =>
        (
          ticker.as[String],
          dim.as[String],
          date.as[String],
          a.as[Int],
          b.as[Int],
          c.as[Float],
          d.as[Int],
          e.as[Int]
        ).mapN(TickerInfo.apply).left.map(_.toString)

      case list =>
        Left(s"Bad number of fields in: ${list.mkString("[", ", ", "]")}")
    }

    implicit final val MetaDecoder: Decoder[Meta] = deriveDecoder
    implicit final val TableDecoder: Decoder[Table] = deriveDecoder
    implicit final val StockInfoDecoder: Decoder[StockInfo] = deriveDecoder
  }
}
(You can see it working here; I left out the http4s part as it would be tricky to mock, but it shouldn't matter.)
The error reporting of that custom decoder could be improved by providing more useful messages for each field. That is left as an exercise for the reader.
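For the http4s part that was left out, the wiring would look much like the original attempt, now picking up the custom Decoder from the companion object. A minimal sketch, assuming http4s-circe is on the classpath (the Sync constraint matches the question; newer http4s versions ask for Concurrent instead):
import cats.effect.Sync
import org.http4s.EntityDecoder
import org.http4s.circe.jsonOf
import Stocks.StockInfo

object StockInfoHttp4s {
  // jsonOf picks up the circe Decoder[StockInfo] defined in the companion object above.
  implicit def stockInfoEntityDecoder[F[_]: Sync]: EntityDecoder[F, StockInfo] =
    jsonOf[F, StockInfo]
}
With that implicit in scope, the C.expect[StockInfo](...) call from the question should no longer fail with InvalidMessageBodyFailure, provided the JSON matches the model.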

You need to:
1. Create a data model, presumably consisting of some case classes and possibly sealed traits. What this data model should look like depends on the external API that you're talking to.
2. Create JSON decoders for this data model. These should go in the case classes' companion objects, because that will allow the compiler to find them when needed without having to import anything.
3. Use the http4s-circe library in order to integrate these decoders with http4s: https://http4s.org/v0.19/json/
If you did everything correctly, you should then be able to use the http4s client to retrieve the data, e.g. httpClient.expect[YourModelClass](Uri.uri("http://somewhere.com/something/")). A minimal sketch of steps 2 and 3 follows.
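Here is that sketch (YourModelClass, its field, and the fetch helper are placeholders for illustration; it assumes circe's semi-automatic derivation and the CirceEntityDecoder helpers from http4s-circe):
import cats.effect.IO
import io.circe.Decoder
import io.circe.generic.semiauto.deriveDecoder
import org.http4s.Uri
import org.http4s.circe.CirceEntityDecoder._ // derives EntityDecoder[F, A] from Decoder[A]
import org.http4s.client.Client

final case class YourModelClass(name: String) // placeholder model
object YourModelClass {
  // Lives in the companion object so the compiler finds it without imports.
  implicit val decoder: Decoder[YourModelClass] = deriveDecoder
}

def fetch(httpClient: Client[IO]): IO[YourModelClass] =
  httpClient.expect[YourModelClass](Uri.unsafeFromString("http://somewhere.com/something/"))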

Welcome! I'd probably start from the inside and work outwards.
So for elements in the data array, perhaps something like this (I'm guessing on the domain):
case class Stock(
  name: String,
  mart: String,
  datePosted: LocalDate,
  a: Int,
  b: Int,
  price: Double,
  c: Int,
  d: Int
)
Using Circe's automatic derivation should handle this pretty well.
Data seems to be an array of arrays of exploded elements, so you may have to write some extraction logic to convert to the inner object manually. Something like this in the controller may help:
elem match {
  case name :: mart :: date :: a :: b :: price :: c :: d :: Nil =>
    Right(Stock(name, mart, date, a, b, price, c, d))
  case invalid =>
    log.warn(s"Invalid record: $invalid")
    Left(s"Invalid record: $invalid")
}
Hopefully the above snippet returns something useful like an Either[E, A], etc.
Finally, you'd need objects for the outer JSON. Something simple to capture it, such as case class ExternalApiRequest(dataTable: T), where T is a type appropriate for the above case (List[String] in the worst case?).
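For instance, a rough shape for those outer layers might be the following (class and field names are guesses based on the JSON in the question; io.circe.Json keeps the mixed-type rows intact until the extraction step above):
import io.circe.Json

final case class Meta(next_date: Option[String])
final case class DataTable(data: List[List[Json]], meta: Meta)
final case class ExternalApiResponse(datatable: DataTable)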
Hope this helped! Let me know if you have any specific errors you were running into.

Related

How can I encode None to missing json field using zio-json instead of null?

Let's say I have a case class with the optional field nickName and a codec like this:
final case class Person(name: String, nickName: Option[String])

object Person {
  implicit val personCodec: JsonCodec[Person] = DeriveJsonCodec.gen
}
I want to encode it using zio-json (v1.5.0) and get this as the result:
{"name":"SomeName"}
And this is my test for it:
encoder.encodeJson(Person("SomeName", None), None).toString shouldBe """{"name":"SomeName"}""".stripMargin
It looks like zio-json encodes None as null, and I get this test error:
Expected :"{"name":"SomeName"[]}"
Actual :"{"name":"SomeName"[,"nickName":null]}"
I checked the code and found the encoder for Option https://github.com/zio/zio-json/blob/52d007ee22f214d12e1706b016f149c3243c632c/zio-json/shared/src/main/scala/zio/json/encoder.scala#L188-L202
Any idea how I can encode it as a missing JSON field?
implicit val OptionStringCodec: JsonCodec[Option[String]] = JsonCodec.string.xmap(
  s => s match {
    case null | "" => None
    case s         => Some(s)
  },
  _.getOrElse("")
)
Note: I'm not familiar with zio-json; there might be another way, such as a configuration option, to achieve the same thing.
Given the sample of code you linked, you can easily write an encoder that doesn't write anything in the case of None by copying most of the code but modifying:
def unsafeEncode(oa: Option[A], indent: Option[Int], out: Write): Unit = oa match {
  case None    => () // instead of out.write("null")
  case Some(a) => A.unsafeEncode(a, indent, out)
}

Zipping lists with an optional list to construct a list of objects in Scala

I have a case class like this:
case class Metric(name: String, value: Double, timeStamp: Int)
I receive individual components to build metrics in separate lists and zip them to create a list of Metric objects.
def buildMetric(names: Seq[String], values: Seq[Double], ts: Seq[Int]): Seq[Metric] = {
  (names, values, ts).zipped.toList map {
    case (name, value, time) => Metric(name, value, time)
  }
}
Now I need to add an optional parameter to both the buildMetric function and the Metric class.
case class Metric(name: String, value: Double, timeStamp: Int, `type`: Option[Type])
&
def buildMetric(names: Seq[String], values: Seq[Double], ts: Seq[Int], types: Option[Seq[Type]]): Seq[Metric]
The idea is that we sometimes receive a sequence of types which, if present, matches the length of the names and values lists. I am not sure how to modify the body of the buildMetric function to create the Metric objects with type information idiomatically. I can think of a couple of approaches.
Do an if-else on types.isDefined: in one branch zip types.get with the other lists, and in the other leave the code as above. This makes me write the same code twice.
The other option is to simply use a while loop and create each Metric object with types.map(_(i)) passed as the last parameter.
So far I am using the second option, but I wonder if there is a more functional way of handling this problem.
The first option can't be done because zipped only works with tuples of 3 or fewer elements.
The second version might look like this:
def buildMetric(names: Seq[String], values: Seq[Double], ts: Seq[Int], types: Option[Seq[Type]]): Seq[Metric] =
  for {
    (name, i) <- names.zipWithIndex
    value     <- values.lift(i)
    time      <- ts.lift(i)
    optType    = types.flatMap(_.lift(i))
  } yield {
    Metric(name, value, time, optType)
  }
One more option, if you would like to keep the zipped approach: convert types from Option[Seq[Type]] to a Seq[Option[Type]] of the same length as names, filled with None values when types is None:
val optionTypes: Seq[Option[Type]] = types.fold(Seq.fill(names.length)(None: Option[Type]))(_.map(Some(_)))

// Sorry, did not find `zipped` for the Tuple4 case
names.zip(values).zip(ts).zip(optionTypes).toList.map {
  case (((name, value), time), optionType) => Metric(name, value, time, optionType)
}
Hope this helps!
You could just use pattern matching on types:
def buildMetric(names: Seq[String], values: Seq[Double], ts: Seq[Int], types: Option[Seq[Type]]): Seq[Metric] = {
  types match {
    case Some(types) => names.zip(values).zip(ts).zip(types).map {
      case (((name, value), ts), t) => Metric(name, value, ts, Some(t))
    }
    case None => (names, values, ts).zipped.map(Metric(_, _, _, None))
  }
}
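For reference, a quick usage sketch that works with any of the buildMetric variants above (Gauge is a made-up Type instance for illustration):
sealed trait Type
case object Gauge extends Type

val metrics = buildMetric(
  names  = Seq("cpu", "mem"),
  values = Seq(0.5, 0.9),
  ts     = Seq(1000, 1001),
  types  = Some(Seq(Gauge, Gauge))
)
// metrics == Seq(Metric("cpu", 0.5, 1000, Some(Gauge)), Metric("mem", 0.9, 1001, Some(Gauge)))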

How do I find a single attribute value in a list of objects in Scala

I need to get a single uuid attribute, not a Seq[UUID], from the class below:
case class Country(uuid: UUID, name: String, code: String)

val countries = Seq(
  Country(UUID.fromString("20354d7a-e4fe-47af-8ff6-187bca92f3f9"), "Afghanistan", "AFG"),
  Country(UUID.fromString("caa8b54a-eb5e-4134-8ae2-a3946a428ec7"), "Albania", "ALB"),
  Country(UUID.fromString("bd2cbad1-6ccf-48e3-bb92-bc9961bc011e"), "Algeria", "DZA")
)

val xyz: UUID = Country_uuid_from_countries
I tried val UUIDs = countries.map(_.uuid) but it returns a Seq[UUID]:
UUIDs: Seq[UUID] = List(20354d7a-e4fe-47af-8ff6-187bca92f3f9,
caa8b54a-eb5e-4134-8ae2-a3946a428ec7,
bd2cbad1-6ccf-48e3-bb92-bc9961bc011e
)
How do I just get UUID?
So you have a List of Countries, and a function (logic) for transforming one Country into AnotherCountry. And what you really want at the end is another List of AnotherCountries.
That is a well-known problem: every time you have a value A inside a context F[_] (a List is a context of multiplicity) and a function A => B, you want to apply that transformation while preserving the context, to get an F[B] as a result.
Then you can use def map[F[_], A, B](fa: F[A])(f: A => B): F[B].
In Scala, it is common for the contexts themselves to provide these functions as methods.
So the only thing you need to do is this:
final case class Country(uuid: UUID, name: String, code: String)
final case class AnotherCountry(uuid: UUID)

val countries = List(
  Country(UUID.fromString("20354d7a-e4fe-47af-8ff6-187bca92f3f9"), "Afghanistan", "AFG"),
  Country(UUID.fromString("caa8b54a-eb5e-4134-8ae2-a3946a428ec7"), "Albania", "ALB"),
  Country(UUID.fromString("bd2cbad1-6ccf-48e3-bb92-bc9961bc011e"), "Algeria", "DZA")
)

val anotherCountries = countries.map { country =>
  AnotherCountry(uuid = country.uuid)
}

Nested Scala case classes to/from CSV

There are many nice libraries for writing/reading Scala case classes to/from CSV files. I'm looking for something that goes beyond that and can handle nested case classes. For example, here a Match has two Players:
case class Player(name: String, ranking: Int)
case class Match(place: String, winner: Player, loser: Player)

val matches = List(
  Match("London", Player("Jane", 7), Player("Fred", 23)),
  Match("Rome", Player("Marco", 19), Player("Giulia", 3)),
  Match("Paris", Player("Isabelle", 2), Player("Julien", 5))
)
I'd like to effortlessly (no boilerplate!) write/read matches to/from this CSV:
place,winner.name,winner.ranking,loser.name,loser.ranking
London,Jane,7,Fred,23
Rome,Marco,19,Giulia,3
Paris,Isabelle,2,Julien,5
Note the automated header line using the dot "." to form the column name for a nested field, e.g. winner.ranking. I'd be delighted if someone could demonstrate a simple way to do this (say, using reflection or Shapeless).
[Motivation. During data analysis it's convenient to have a flat CSV to play around with, for sorting, filtering, etc., even when case classes are nested. And it would be nice if you could load nested case classes back from such files.]
Since a case-class is a Product, getting the values of the various fields is relatively easy. Getting the names of the fields/columns does require using Java reflection.
The following function takes a list of case-class instances and returns a list of rows, each a list of strings. It uses recursion to get the values and headers of child case-class instances.
def toCsv(p: List[Product]): List[List[String]] = {
  def header(c: Class[_], prefix: String = ""): List[String] = {
    c.getDeclaredFields.toList.flatMap { field =>
      val name = prefix + field.getName
      if (classOf[Product].isAssignableFrom(field.getType)) header(field.getType, name + ".")
      else List(name)
    }
  }

  def flatten(p: Product): List[String] =
    p.productIterator.flatMap {
      case p: Product => flatten(p)
      case v: Any     => List(v.toString)
    }.toList

  header(classOf[Match]) :: p.map(flatten)
}
However, constructing case-classes from CSV is far more involved: it requires reflection for getting the types of the various fields, for creating the values from the CSV strings, and for constructing the case-class instances.
For simplicity (not saying the code is simple, just so it won't be further complicated), I assume that the order of columns in the CSV is the same as if the file was produced by the toCsv(...) function above.
The following function starts by creating a list of "instructions for how to process a single CSV row" (the instructions are also used to verify that the column headers in the CSV match the case-class properties). The instructions are then used to recursively process one CSV row at a time.
def fromCsv[T <: Product](csv: List[List[String]])(implicit tag: ClassTag[T]): List[T] = {
  trait Instruction {
    val name: String
    val header = true
  }
  case class BeginCaseClassField(name: String, clazz: Class[_]) extends Instruction {
    override val header = false
  }
  case class EndCaseClassField(name: String) extends Instruction {
    override val header = false
  }
  case class IntField(name: String) extends Instruction
  case class StringField(name: String) extends Instruction
  case class DoubleField(name: String) extends Instruction

  def scan(c: Class[_], prefix: String = ""): List[Instruction] = {
    c.getDeclaredFields.toList.flatMap { field =>
      val name = prefix + field.getName
      val fType = field.getType
      if (fType == classOf[Int]) List(IntField(name))
      else if (fType == classOf[Double]) List(DoubleField(name))
      else if (fType == classOf[String]) List(StringField(name))
      else if (classOf[Product].isAssignableFrom(fType)) BeginCaseClassField(name, fType) :: scan(fType, name + ".")
      else throw new IllegalArgumentException(s"Unsupported field type: $fType")
    } :+ EndCaseClassField(prefix)
  }

  def produce(instructions: List[Instruction], row: List[String], argAccumulator: List[Any]): (List[Instruction], List[String], List[Any]) = instructions match {
    case IntField(_) :: tail    => produce(tail, row.drop(1), argAccumulator :+ row.head.toString.toInt)
    case StringField(_) :: tail => produce(tail, row.drop(1), argAccumulator :+ row.head.toString)
    case DoubleField(_) :: tail => produce(tail, row.drop(1), argAccumulator :+ row.head.toString.toDouble)
    case BeginCaseClassField(_, clazz) :: tail =>
      val (instructionRemaining, rowRemaining, constructorArgs) = produce(tail, row, List.empty)
      val newCaseClass = clazz.getConstructors.head.newInstance(constructorArgs.map(_.asInstanceOf[AnyRef]): _*)
      produce(instructionRemaining, rowRemaining, argAccumulator :+ newCaseClass)
    case EndCaseClassField(_) :: tail => (tail, row, argAccumulator)
    case Nil if row.isEmpty => (Nil, Nil, argAccumulator)
    case Nil => throw new IllegalArgumentException("Not all values from CSV row were used")
  }

  val instructions = BeginCaseClassField(".", tag.runtimeClass) :: scan(tag.runtimeClass)
  assert(csv.head == instructions.filter(_.header).map(_.name), "CSV header doesn't match target case-class fields")
  csv.drop(1).map(row => produce(instructions, row, List.empty)._3.head.asInstanceOf[T])
}
I've tested this using:
case class Player(name: String, ranking: Int, price: Double)
case class Match(place: String, winner: Player, loser: Player)

val matches = List(
  Match("London", Player("Jane", 7, 12.5), Player("Fred", 23, 11.1)),
  Match("Rome", Player("Marco", 19, 13.54), Player("Giulia", 3, 41.8)),
  Match("Paris", Player("Isabelle", 2, 31.7), Player("Julien", 5, 16.8))
)

val csv = toCsv(matches)
val matchesFromCsv = fromCsv[Match](csv)
assert(matches == matchesFromCsv)
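As a small aside (not part of the answer above), toCsv returns a List[List[String]], so rendering it as actual CSV text is a one-liner; the expected output shown below assumes getDeclaredFields returns fields in declaration order, which typically holds but is not guaranteed by the JVM spec:
// Join each row with "," to get the flat CSV from the question.
val csvText: String = csv.map(_.mkString(",")).mkString("\n")
// With the test data this begins with:
// place,winner.name,winner.ranking,winner.price,loser.name,loser.ranking,loser.price
// London,Jane,7,12.5,Fred,23,11.1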
Obviously this should be optimized and hardened if you ever want to use this for production...

scala extractor pattern for complex validation but with nice error output

I am struggling with using the extractor pattern in a certain use case where it seems that it could be very powerful.
I start with an input of Map[String, String] coming from a web request. This is either a searchRequest or a countRequest to our API.
searchRequest has keys
query(required)
fromDate(optional-defaulted)
toDate(optional-defaulted)
nextToken(optional)
maxResults(optional-defaulted)
countRequest has keys
query(required)
fromDate(optional-defaulted)
toDate(optional-defaulted)
bucket(optional-defaulted)
Then I want to convert both of these to a composed type structure like so:
protected case class CommonQueryRequest(
  originalQuery: String,
  fromDate: DateTime,
  toDate: DateTime
)

case class SearchQueryRequest(
  commonRequest: CommonQueryRequest,
  maxResults: Int,
  nextToken: Option[Long])

case class CountRequest(commonRequest: CommonQueryRequest, bucket: String)
As you can see, I am essentially converting Strings to DateTimes, Ints, Longs, etc. My issue is that I really need separate errors for an invalid fromDate vs. an invalid toDate format vs. an invalid maxResults vs. an invalid nextToken, if available.
At the same time, I need to stick in defaults (which vary depending on whether it is a search or a count request).
Naturally, you need to be able to tell search vs. count from the Map being passed in, so in my first go at this I added a key "type" with a value of search or count so that I could match at least on that.
Am I even going down the correct path? I thought perhaps using matching could be cleaner than our existing implementation, but the further I go down this path, the uglier it seems to get.
thanks,
Dean
I would suggest you take a look at scalaz.Validation and ValidationNel. It's a super nice way to collect validation errors and a perfect fit for input request validation.
You can learn more about Validation here: http://eed3si9n.com/learning-scalaz/Validation.html. However, in my example I use scalaz 7.1, which can differ a little from what is described in that article; the main idea remains the same.
Here's a small example for your use case:
import java.util.NoSuchElementException

import org.joda.time.DateTime
import org.joda.time.format.DateTimeFormat

import scala.util.Try
import scalaz.ValidationNel
import scalaz.syntax.applicative._
import scalaz.syntax.validation._

type Input = Map[String, String]
type Error = String

case class CommonQueryRequest(originalQuery: String,
                              fromDate: DateTime,
                              toDate: DateTime)

case class SearchQueryRequest(commonRequest: CommonQueryRequest,
                              maxResults: Int,
                              nextToken: Option[Long])

case class CountRequest(commonRequest: CommonQueryRequest, bucket: String)

def stringField(field: String)(input: Input): ValidationNel[Error, String] =
  input.get(field) match {
    case None        => s"Field $field is not defined".failureNel
    case Some(value) => value.successNel
  }

val dateTimeFormat = DateTimeFormat.fullTime()

def dateTimeField(field: String)(input: Input): ValidationNel[Error, DateTime] =
  Try(dateTimeFormat.parseDateTime(input(field))) recover {
    case _: NoSuchElementException => DateTime.now()
  } match {
    case scala.util.Success(dt)  => dt.successNel
    case scala.util.Failure(err) => err.toString.failureNel
  }

def intField(field: String)(input: Input): ValidationNel[Error, Int] =
  Try(input(field).toInt) match {
    case scala.util.Success(i)   => i.successNel
    case scala.util.Failure(err) => err.toString.failureNel
  }

def countRequest(input: Input): ValidationNel[Error, CountRequest] =
  (
    stringField  ("query")   (input) |@|
    dateTimeField("fromDate")(input) |@|
    dateTimeField("toDate")  (input) |@|
    stringField  ("bucket")  (input)
  ) { (query, from, to, bucket) =>
    CountRequest(CommonQueryRequest(query, from, to), bucket)
  }

val validCountReq = Map("query" -> "a", "bucket" -> "c")
val badCountReq = Map("fromDate" -> "invalid format", "bucket" -> "c")

println(countRequest(validCountReq))
println(countRequest(badCountReq))
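To round this out, the searchRequest case can be assembled from the same helpers (a sketch: longField is a hypothetical helper analogous to intField, and the defaulting of the optional fields is left out for brevity):
def longField(field: String)(input: Input): ValidationNel[Error, Long] =
  Try(input(field).toLong) match {
    case scala.util.Success(l)   => l.successNel
    case scala.util.Failure(err) => err.toString.failureNel
  }

def searchRequest(input: Input): ValidationNel[Error, SearchQueryRequest] =
  (
    stringField  ("query")     (input) |@|
    dateTimeField("fromDate")  (input) |@|
    dateTimeField("toDate")    (input) |@|
    intField     ("maxResults")(input)
  ) { (query, from, to, maxResults) =>
    // nextToken is optional; parse it only when present (parse errors here are dropped, not accumulated)
    SearchQueryRequest(CommonQueryRequest(query, from, to), maxResults, input.get("nextToken").flatMap(t => Try(t.toLong).toOption))
  }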
Scalactic looks pretty cool as well and I may go that route (though I'm not sure if we can use that lib, but I think I will just proceed forward until someone says no).