Scala Circe cannot decode model with list as member

I have a model, containing a list as a member variable, that I am trying to serialize using Circe in Scala.
The model in question -
import io.circe.{Decoder, Encoder}
import io.circe.generic.semiauto.{deriveDecoder, deriveEncoder}

case class Order(id: Long, tableId: Long, items: List[Item])

object Order {
  implicit val encoder: Encoder[Order] = deriveEncoder[Order]
  implicit val decoder: Decoder[Order] = deriveDecoder[Order]
}
Also, the Item class -
case class Item(id: Long, name: String, serving: String)

object Item {
  implicit val encoder: Encoder[Item] = deriveEncoder[Item]
  implicit val decoder: Decoder[Item] = deriveDecoder[Item]
}
I am using Circe's semi-auto derivation. However, when trying to read data from the database using Quill, I am encountering this exception -
[error] /Users/in-rmoitra/Projects/PetProjects/Restrofit-Backend/src/main/scala/models/repository/OrderRepository.scala:17:69: exception during macro expansion:
[error] scala.reflect.macros.TypecheckException: Can't find implicit `Decoder[List[models.Item]]`. Please, do one of the following things:
[error] 1. ensure that implicit `Decoder[List[models.Item]]` is provided and there are no other conflicting implicits;
[error] 2. make `List[models.Item]` `Embedded` case class or `AnyVal`.
[error]
[error] at scala.reflect.macros.contexts.Typers.$anonfun$typecheck$3(Typers.scala:32)
[error] at scala.reflect.macros.contexts.Typers.$anonfun$typecheck$2(Typers.scala:26)
[error] at scala.reflect.macros.contexts.Typers.doTypecheck$1(Typers.scala:25)
[error] at scala.reflect.macros.contexts.Typers.$anonfun$typecheck$7(Typers.scala:38)
[error] at scala.reflect.internal.Trees.wrappingIntoTerm(Trees.scala:1731)
[error] at scala.reflect.internal.Trees.wrappingIntoTerm$(Trees.scala:1728)
[error] at scala.reflect.internal.SymbolTable.wrappingIntoTerm(SymbolTable.scala:18)
[error] at scala.reflect.macros.contexts.Typers.typecheck(Typers.scala:38)
[error] at scala.reflect.macros.contexts.Typers.typecheck$(Typers.scala:20)
[error] at scala.reflect.macros.contexts.Context.typecheck(Context.scala:6)
[error] at scala.reflect.macros.contexts.Context.typecheck(Context.scala:6)
[error] at io.getquill.context.QueryMacro.expandQueryWithMeta(QueryMacro.scala:41)
[error] at io.getquill.context.QueryMacro.expandQuery(QueryMacro.scala:20)
[error] at io.getquill.context.QueryMacro.runQuery(QueryMacro.scala:12)
[error] val ordersFuture: Future[List[(Order, (OrderItem, Item))]] = run(query)
From my limited knowledge of Circe and what I have already looked up, the docs say that you do not need to create a decoder for List[A] if you already have a decoder for A.
It would be great if someone could throw light on what seems to be happening here.

Your Circe code is fine. If you execute
import io.circe.parser.parse

println(
  parse("""
    |{ "id" : 1,
    |  "tableId" : 2,
    |  "items" : [
    |    { "id": 3,
    |      "name" : "a",
    |      "serving" : "b"
    |    },
    |    { "id": 4,
    |      "name" : "c",
    |      "serving" : "d"
    |    }
    |  ]
    |}
  """.stripMargin)
    .flatMap(json => json.as[Order])
)
you'll get
Right(Order(1,2,List(Item(3,a,b), Item(4,c,d))))
So the trouble is in your Quill code.
And don't confuse io.circe.Decoder with io.getquill.context.jdbc.Decoders#Decoder: the missing implicit in the error is a Quill decoder, not a Circe one.
https://getquill.io/#extending-quill-custom-encoding
See also: `exception during macro expansion: scala.reflect.macros.TypecheckException` when using quill
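If the items column is stored in the database as a JSON string, one way to give Quill the decoder it is asking for is a MappedEncoding pair that goes through Circe. This is only a sketch under that assumption (a text column, and a decode failure failing the read); the implicit names itemsDecode and itemsEncode are made up for illustration:

import io.circe.parser.decode
import io.circe.syntax._
import io.getquill.MappedEncoding

// assumed: the "items" column holds JSON text such as [{"id":3,"name":"a","serving":"b"}]
implicit val itemsDecode: MappedEncoding[String, List[Item]] =
  MappedEncoding(raw => decode[List[Item]](raw).fold(e => throw e, identity))

implicit val itemsEncode: MappedEncoding[List[Item], String] =
  MappedEncoding(items => items.asJson.noSpaces)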

Related

scala spark type mismatching

I need to group my RDD by two columns and aggregate the count. I have a function:
def constructDiagnosticFeatureTuple(diagnostic: RDD[Diagnostic]): RDD[FeatureTuple] = {
  val grouped_patients = diagnostic
    .groupBy(x => (x.patientID, x.code))
    .map(_._2)
    .map { events =>
      val p_id = events.map(_.patientID).take(1).mkString
      val f_code = events.map(_.code).take(1).mkString
      val count = events.size.toDouble
      ((p_id, f_code), count)
    }
  //should be in form:
  //diagnostic.sparkContext.parallelize(List((("patient", "diagnostics"), 1.0)))
}
At compile time, I am getting an error:
/FeatureConstruction.scala:38:3: type mismatch;
[error] found : Unit
[error] required: org.apache.spark.rdd.RDD[edu.gatech.cse6250.features.FeatureConstruction.FeatureTuple]
[error] (which expands to) org.apache.spark.rdd.RDD[((String, String), Double)]
[error] }
[error] ^
How can I fix it?
I read this post: Scala Spark type missmatch found Unit, required rdd.RDD, but I do not use collect(), so it does not help me.
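For what it's worth, the mismatch comes from the method body ending with a val definition, which evaluates to Unit rather than the declared RDD[FeatureTuple]. A minimal sketch of the likely fix, assuming FeatureTuple expands to ((String, String), Double) as the error message says, is to return the expression instead of binding it:

import org.apache.spark.rdd.RDD

def constructDiagnosticFeatureTuple(diagnostic: RDD[Diagnostic]): RDD[FeatureTuple] =
  diagnostic
    .groupBy(x => (x.patientID, x.code))
    .map { case ((patientId, code), events) =>
      // the grouping key already carries both fields,
      // so there is no need to re-extract them from the events
      ((patientId, code), events.size.toDouble)
    }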

GeoTrellis/Scala: Locate missing implicit evidence for Json Parsing

What imports are needed to locate the implicit evidence to compile a call to GeoJson.parse from GeoTrellis?
geotrellis.vector.io.json.Geometry uses spray.json to parse, and must be able to locate a JsonReader or JsonFormat instance parameterized by the WithCrs and Geometry classes.
The evidence is defined within FeatureFormats; but how can the snippet below use it?
The following does not resolve the evidence:
- importing everything in the geotrellis.vector.io.json package
- importing the Implicits specifically: import geotrellis.vector.io.json.Implicits
- importing FeatureFormats directly: import geotrellis.vector.io.json.FeatureFormats
- ensuring the correct imports, especially no import of com.vividsolutions.jts.Geometry, which would mask the target object
Here's the code in question
import geotrellis.vector.Geometry
import geotrellis.proj4.CRS
import geotrellis.vector.io.json._
import geotrellis.vector.io.json.{GeoJson, WithCrs}
import org.json4s.{DefaultFormats, Formats}
import scala.util.{Failure, Success, Try}
val exampleQueryJson =
  """
    |{
    |  "type": "Polygon",
    |  "crs": {
    |    "type": "name",
    |    "properties": {
    |      "name": "EPSG:4326"
    |    }
    |  },
    |  "coordinates": [
    |    [
    |      [....]
    |    ]
    |  ]
    |}
  """.stripMargin
class GeometryReader extends FeatureFormats {
  implicit val jsonFormats: Formats = DefaultFormats
}

object GeometryReader {
  def parseGeometry(request: String): Geometry = {
    GeoJson.parse[Geometry](request)
  }
}
val g = GeometryReader.parseGeometry(exampleQueryJson)
The compile error shows the inability to find the right evidence given what's currently in scope:
[error] /path/redacted/GeometryReader.scala:19: Cannot find JsonReader or JsonFormat type class for geotrellis.vector.io.json.WithCrs[geotrellis.vector.Geometry]
[error] val geometryWithCrs: WithCrs[Geometry] = GeoJson.parse[WithCrs[Geometry]](request)
[error] ^
[error] /path/redacted/GeometryReader.scala:25: Cannot find JsonReader or JsonFormat type class for geotrellis.vector.Geometry
[error] Try(GeoJson.parse[Geometry](request)) match {
[error] ^
[error] two errors found
[error] (compile:compileIncremental) Compilation failed
Short answer: Add
import geotrellis.vector.io._
The creators of this library used package objects to publish these implicits. The package object (source linked below) extends geotrellis.vector.io.json.Implicits, and that brings them into scope.
https://github.com/locationtech/geotrellis/blob/master/vector/src/main/scala/geotrellis/vector/io/package.scala
More about package objects:
https://www.scala-lang.org/docu/files/packageobjects/packageobjects.html
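A minimal sketch of the fixed snippet, assuming a GeoTrellis version where GeoJson lives at geotrellis.vector.io.json:

import geotrellis.vector.Geometry
import geotrellis.vector.io._          // package object pulls in json.Implicits
import geotrellis.vector.io.json.GeoJson

object GeometryReader {
  def parseGeometry(request: String): Geometry =
    GeoJson.parse[Geometry](request)   // JsonReader[Geometry] now resolves
}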

Slick3.2 Error: No matching Shape found

I'm not sure what is wrong here.
The following code block is throwing an error:
(for {
  (e, r) <- tblDetail.joinLeft(tblMaster).on((e, r) => r.col1 === e.col3)
} yield e.id)
Error
No matching Shape found.
[error] Slick does not know how to map the given types.
[error] Possible causes: T in Table[T] does not match your * projection,
[error] you use an unsupported type in a Query (e.g. scala List),
[error] or you forgot to import a driver api into scope.
[error] Required level: slick.lifted.FlatShapeLevel
[error] Source type: (slick.lifted.Rep[Int], slick.lifted.Rep[String],...)
[error] Unpacked type: T
[error] Packed type: G
[error] (e,r) <- tblDetail.joinLeft(tblMaster).on((e,r) => r.col1 === e.col3)
I checked the Slick Tables for tblDetail and tblMaster, and they seemed to be fine.
tblMaster
class TblMaster(tag: Tag)
  extends Table[(Int, String, ...)](tag, "tbl_master") {
  def id = column[Int]("id")
  def col3 = column[String]("col3")
  def * = (id, col3)
}
tblDetail
class TblDetail(tag: Tag)
  extends Table[Entity](tag, "tbl_detail") {
  def id = column[Int]("id")
  def col1 = column[String]("col1")
  def * : ProvenShape[Entity] = (id, col1) <>
    ((Entity.apply _).tupled, Entity.unapply)
}
Any help would be appreciated.
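For reference, the error's first suggested cause ("T in Table[T] does not match your * projection") matches TblMaster: it is declared as Table[(Int, String, ...)] but its * projection is only (id, col3). A sketch of the likely fix, assuming the table really has just the two columns shown:

class TblMaster(tag: Tag) extends Table[(Int, String)](tag, "tbl_master") {
  def id   = column[Int]("id")
  def col3 = column[String]("col3")
  def *    = (id, col3) // row type now matches Table[(Int, String)]
}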

Extract fields from JsArray in scala on play framework

I have a JsArray like the one below from a server (I cannot change it, as it belongs to others):
[ {"name": "US", "id": 0, "translations" : {"name: {"es": "Estados unidos", "fr": "Etats-unis"}}},
{"name": "UK", "id": 1, "translations" : {"name: {"es": "Estados Kdda", "fr": "dsfjas"}}},
...
]
I need to extract all the names like US and UK (but not the name fields under translations), and also the ids.
I tried several ways, and each had a problem. Below is what I tried.
I first tried
case class Country(name: String, id: String)

implicit object CountryReads extends Reads[Country] {
  def reads(json: JsValue) = Country(
    (json \ "name"),
    (json \ "id")
  )
}

val countries = Json.parse(result) match { // here result is a JSON String
  case JsArray(Seq(t)) => Some(t.as[Seq[Country]])
  case _ => None
}
But I get compiling error as below:
[error] C:\git9\hss\app\models\LivingSocial.scala:80: type mismatch;
[error] found : play.api.libs.json.JsValue
[error] required: play.api.libs.json.JsResult[MyCountry]
[error] def reads(json: JsValue) = (json \ "currencyCode")
[error] ^
[error] one error found
[error] (compile:compile) Compilation failed
then I tried:
val jResult = Json.parse(result)
(jResult \\ "name").foreach { name =>
  println(name.as[String])
}
I get an error in println() because \\ recursively pulls in the names under translations as well.
Any good way to do it?
case class Country(name: String, id: Int, translations: Map[String, Map[String, String]])

object Country {
  implicit val format = Json.format[Country]
}

val js = Json.parse(yourString)
val names: JsResult[Seq[String]] = js.validate[Seq[Country]].map { countries => countries.map(_.name) }
At that point you can deal with the JsResult since you'll need error handling in case the JSON doesn't conform to your expectations.
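For example, a minimal way to handle both outcomes (names here is the JsResult from above):

names match {
  case JsSuccess(ns, _) => println(ns)     // e.g. List(US, UK)
  case JsError(errors)  => println(errors) // the JSON didn't match Country
}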
You can change your Reads to look like this (note the second field should be read from the "id" path, and as an Int, since the ids are numbers in the JSON):

import play.api.libs.json._
import play.api.libs.functional.syntax._

implicit val CountryReads: Reads[Country] = (
  (JsPath \ "name").read[String] and
  (JsPath \ "id").read[Int]
)(Country.apply _)
This is the correct way to create a Reads according to the Play documentation.

Akka -- type mismatch; [error] found : Unit [error] required: scala.sys.process.ProcessLogger

I am trying to write example code combining Akka actors with scala.sys.process, but I got an error message when compiling the code.
The code is really simple, as shown below.
So, what have I got wrong?
[error] /home/qos/workspaces/actors/actors.scala:20: type mismatch;
[error] found : Unit
[error] required: scala.sys.process.ProcessLogger
[error] execute(cmd)
[error] ^
[error] one error found
[error] (compile:compile) Compilation failed
The code is
import scala.sys.process._
import akka.actor._

object TryActor {
  def main(args: Array[String]) {
    val akkaSystem = ActorSystem("akkaSystem")
    val worker = akkaSystem.actorOf(Props[Worker], name = "work0")
    worker ! Command("ls")
  }

  case class Command(cmd: String)

  class Worker extends Actor {
    def receive = {
      case Command(cmd) => {
        println(cmd)
        "echo recieve message from someone" !
        execute(cmd.toString)
      }
    }

    def execute(cmd: String) {
      val process = Process(cmd.toString)
      process ! ProcessLogger(_ => {})
    }
  }
}
It's interpreting execute(cmd.toString) as the argument to !, because newlines don't necessarily end statements. To fix this, don't use postfix syntax, which is deprecated for a reason:
def receive = {
  case Command(cmd) => {
    println(cmd)
    "echo recieve message from someone".!
    execute(cmd.toString)
  }
}
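To see why the compiler asks for a ProcessLogger: the two lines parse as a single call, roughly

("echo recieve message from someone").!(execute(cmd.toString))

and the overload of ! that takes an argument expects a scala.sys.process.ProcessLogger, while execute(cmd.toString) returns Unit, which is exactly the mismatch reported in the error.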