Convert JSON to object in stream - Scala

I'm attempting to convert a JSON string:
"{"payload": "[{\"test\":\"123\",\"tester\":\"456\"}," +
"{\"test1\":\"1234\",\"tester2\":\"4567\"}]"}"
to its case class equivalent, which I have defined as:
case class PayloadElements(@JsonProperty("test") test: String, @JsonProperty("tester") tester: String)
case class Payload(@JsonProperty("payload") payload: Array[PayloadElements])
but receive this error:
2022-10-06 09:41:37,248 [default-akka.actor.default-dispatcher-4] INFO akka.event.slf4j.Slf4jLogger - Slf4jLogger started
2022-10-06 09:41:38,220 [default-akka.actor.default-dispatcher-4] ERROR akka.stream.Materializer - [Error] Upstream failed.
com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot deserialize instance of `[LTestMapper$PayloadElements;` out of VALUE_STRING token
at [Source: (String)"{"payload": "[{\"test\":\"123\",\"tester\":\"456\"},{\"test1\":\"1234\",\"tester2\":\"4567\"}]"}"; line: 1, column: 13] (through reference chain: TestMapper$Payload["payload"])
at com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:59)
Here is my code:
import akka.actor.ActorSystem
import akka.stream.scaladsl.{Flow, Sink, Source}
import com.fasterxml.jackson.annotation.JsonProperty
import com.fasterxml.jackson.databind.ObjectMapper
object TestMapper extends App {
implicit val system = ActorSystem()
val om = new ObjectMapper
val flow = Flow[String].map(x => om.readValue(x, classOf[Payload]))
val sink = Sink.foreach[Payload](x => println(x))
var str = "{\"payload\": \"[{\\\"test\\\":\\\"123\\\",\\\"tester\\\":\\\"456\\\"}," +
"{\\\"test1\\\":\\\"1234\\\",\\\"tester2\\\":\\\"4567\\\"}]\"}";
case class PayloadElements(@JsonProperty("test") test: String, @JsonProperty("tester") tester: String)
case class Payload(@JsonProperty("payload") payload: Array[PayloadElements])
val graph = Source.repeat(str).take(3).via(flow).log("Error").to(sink)
graph.run()
}
Have I defined the case class structure incorrectly?
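The error message points away from the case classes: Jackson hit a VALUE_STRING token where it expected an array, because the value of "payload" is a string that happens to contain JSON, not a nested JSON array. One way to deal with that shape is to deserialize in two passes. A rough sketch, assuming jackson-module-scala is on the classpath and using a hypothetical RawPayload helper:

import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule

// Mirrors the outer object: payload arrives as a raw JSON string.
case class RawPayload(payload: String)

val om = new ObjectMapper().registerModule(DefaultScalaModule)

val flow = Flow[String].map { x =>
  val raw = om.readValue(x, classOf[RawPayload])
  // Second pass: parse the embedded string as the actual array.
  Payload(om.readValue(raw.payload, classOf[Array[PayloadElements]]))
}

If you control the producer, the cleaner fix is to emit payload as a real JSON array rather than a string-encoded one, after which the original Payload binding should work. Note also that the second element in the sample uses keys test1/tester2, which will not bind to the test/tester fields.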

Related

Scala Akka HTTP type mismatch

I created a project with OpenAPI, then added an endpoint GET /product which is supposed to return a product (I am testing it, so I just want it to return any product).
When I run the project I get the following error:
[error] found : org.openapitools.server.model.Product.type
[error] required: (?, ?, ?) => ?
[error] def toEntityMarshallerProduct: ToEntityMarshaller[Product] = jsonFormat3(Product)
The logs point at Product inside jsonFormat3.
I noticed it's caused by the number of properties of the Product model: if I reduce them to 3, it works! This is weird! Does anyone know how to resolve this?
This is the product model file:
package org.openapitools.server.model
final case class Product (
id: Int,
name: String,
isAvailable: Boolean,
description: String,
category: String
)
This is the ProductApi file:
package org.openapitools.server.api
import akka.http.scaladsl.server.Directives._
import akka.http.scaladsl.server.Route
import akka.http.scaladsl.model.StatusCodes
import akka.http.scaladsl.marshalling.ToEntityMarshaller
import akka.http.scaladsl.unmarshalling.FromEntityUnmarshaller
import akka.http.scaladsl.unmarshalling.FromStringUnmarshaller
import org.openapitools.server.AkkaHttpHelper._
import org.openapitools.server.model.Product
class ProductApi(
productService: ProductApiService,
productMarshaller: ProductApiMarshaller
) {
import productMarshaller._
lazy val route: Route =
path("product" / "all") {
get {
productService.productAllGet()
}
}
}
trait ProductApiService {
def productAllGet200(responseProduct: Product)(implicit toEntityMarshallerProduct: ToEntityMarshaller[Product]): Route =
complete((200, responseProduct))
def productAllGet()(implicit toEntityMarshallerProduct: ToEntityMarshaller[Product]): Route
}
trait ProductApiMarshaller {
implicit def toEntityMarshallerProduct: ToEntityMarshaller[Product]
}
And this is the main file:
import akka.actor.typed.{ActorSystem, ActorRef}
import akka.actor.typed.scaladsl.Behaviors
import akka.http.scaladsl.Http
import akka.http.scaladsl.server.Route
import akka.http.scaladsl.server.Directives._
import akka.http.scaladsl.model.StatusCodes
// for JSON serialization/deserialization following dependency is required:
// "com.typesafe.akka" %% "akka-http-spray-json" % AkkaHttpVersion
import akka.http.scaladsl.marshalling.ToEntityMarshaller
import akka.http.scaladsl.marshallers.sprayjson.SprayJsonSupport._
import spray.json.DefaultJsonProtocol._
import scala.io.StdIn
import akka.util.Timeout
import scala.concurrent.duration._
import scala.concurrent.{ ExecutionContext, Future }
import org.openapitools.server.api._
import org.openapitools.server.model._
object Main extends App {
// needed to run the route
implicit val system = ActorSystem(Behaviors.empty, "product")
// implicit val materializer = ActorMaterializer()
// needed for the future map/flatmap in the end and future in fetchItem and saveOrder
implicit val executionContext = system.executionContext
object DefaultMarshaller extends ProductApiMarshaller {
def toEntityMarshallerProduct: ToEntityMarshaller[Product] = jsonFormat3(Product)
}
object DefaultService extends ProductApiService {
def productAllGet() (implicit toEntityMarshallerProduct: ToEntityMarshaller[Product]) : Route = {
val response = Future {
Product(1,"product",false,"desc","pizza")
}
requestcontext => {
response.flatMap {
(product: Product) =>
productAllGet200(product)(toEntityMarshallerProduct)(requestcontext)
}
}
}
}
val api = new ProductApi(DefaultService, DefaultMarshaller)
val host = "localhost"
val port = 3005
val bindingFuture = Http().newServerAt(host, port).bind(pathPrefix("api"){api.route})
println(s"Server online at http://${host}:${port}/\nPress RETURN to stop...")
bindingFuture.failed.foreach { ex =>
println(s"${ex} Failed to bind to ${host}:${port}!")
}
StdIn.readLine() // let it run until user presses return
bindingFuture
.flatMap(_.unbind()) // trigger unbinding from the port
.onComplete(_ => system.terminate()) // and shutdown when done
}
Try using a Data Transfer Object (DTO), which your API exposes, alongside a domain model object that is handled internally.
That way you use jsonFormat3 for your DTO and have no need for jsonFormat5.
A simple mapper or helper method inside the case classes provides a smooth integration.
For example:
final case class Product (
  id: Int,
  name: String,
  isAvailable: Boolean,
  description: String,
  category: String
) {
  def toDto: ProductDTO = ProductDTO(id, name, isAvailable)
}
final case class ProductDTO (
  id: Int,
  name: String,
  isAvailable: Boolean
) {
  def toDomain(description: String, category: String): Product =
    Product(id, name, isAvailable, description, category)
}
object ProductSupport extends SprayJsonSupport with DefaultJsonProtocol {
  implicit val productFormat = jsonFormat3(ProductDTO)
}
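For completeness: the compiler error itself (required: (?, ?, ?) => ?) says that jsonFormat3 expects a three-argument constructor function, while the five-field Product companion is a Function5, which is why dropping to 3 properties made it compile. If you would rather keep all five fields on the wire, matching the arity should also work; a minimal sketch:

import akka.http.scaladsl.marshalling.ToEntityMarshaller
import akka.http.scaladsl.marshallers.sprayjson.SprayJsonSupport._
import spray.json.DefaultJsonProtocol._

object DefaultMarshaller extends ProductApiMarshaller {
  // Product has five fields, so the arity-matching formatter is jsonFormat5.
  def toEntityMarshallerProduct: ToEntityMarshaller[Product] = jsonFormat5(Product)
}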

Can't find a codec for class java.lang.Object while using Mongo with Scala

I am using Mongo with Scala.
object myApp extends App{
case class myClass(field: String, value: Option[Any])
}
Above is my case class, and below is the DB code:
object DB{
import org.bson.codecs.configuration.CodecRegistries
import org.bson.codecs.configuration.CodecRegistries._
import org.mongodb.scala.MongoClient.DEFAULT_CODEC_REGISTRY
import org.mongodb.scala.bson.codecs.Macros._
import org.mongodb.scala.{MongoClient, MongoCollection, MongoDatabase}
private val codecs = fromProviders(classOf[myClass])
private val codecReg = fromRegistries(codecs,
DEFAULT_CODEC_REGISTRY)
private val dataB: MongoDatabase = MongoClient().getDatabase("database").withCodecRegistry(codecReg)
val myClass: MongoCollection[myClass] = dataB.getCollection("myClass")
}
Now, "value" can hold a String, an Int, or None. If I define a myClass with value as None, i.e.
val c = myClass("abc", None)
and put it in the database, it runs without any error, but if I set value to Some(String_value) or Some(int_value), it shows the error:
val d = myClass("abc", Some("xyz"))
val f = myClass("abc", Some(90))
Error:
Can't find a codec for class java.lang.Object
Can somebody guide me on how to do this, given that the "value" field can hold String/Int/None values?
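The macro codecs can't handle Option[Any] because Any erases to java.lang.Object, and no codec is registered for that. One workaround, sketched below but untested here, is to store the value as a BsonValue, which the default registry already knows how to encode:

import org.mongodb.scala.bson.{BsonInt32, BsonString, BsonValue}

case class myClass(field: String, value: Option[BsonValue])

// None still works, and the concrete BSON wrappers replace Some(String)/Some(Int):
val c = myClass("abc", None)
val d = myClass("abc", Some(BsonString("xyz")))
val f = myClass("abc", Some(BsonInt32(90)))

A more typesafe alternative is a sealed trait with one case class per allowed type (string and int), registered with fromProviders in the same way.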

How to fix NullPointerException error in Scala

So when I receive a message from RabbitMQ I want to send it to an actor, but when I try to match the message I get a NullPointerException. It appears the trouble occurs when I match "msg" and try to send it to the actor. Without this part:
msg match {
case _ => drawer ! Drawer.Data(msg)
}
all works. How do I fix it?
import java.util.UUID
import Client.{GameBullet, GameTank}
import akka.actor.{Actor, ActorLogging, ActorRef}
import com.rabbitmq.client.AMQP.BasicProperties
import com.rabbitmq.client.{AMQP, ConnectionFactory, DefaultConsumer, Envelope}
import org.json4s._
import org.json4s.jackson.JsonMethods._
import org.json4s.jackson.Serialization
import org.json4s.native.Serialization.{read, write}
object MessageSender {
case object Left
case object Right
case object Up
case object Down
case object StartGame
case object MakeShot
case object Start
}
case class Message(id:String,content:String)
class MessageSender(drawer:ActorRef) extends Actor with ActorLogging{
import MessageSender._
val factory = new ConnectionFactory()
factory.setHost("localhost")
val connection = factory.newConnection()
val channel = connection.createChannel()
val replyQueueName: String = channel.queueDeclare().getQueue
val corrId = UUID.randomUUID().toString
channel.queueBind(replyQueueName, "myDirect", replyQueueName)
val props = new BasicProperties.Builder().correlationId(corrId).replyTo(replyQueueName).build()
var currentId = 0
var message = ""
implicit val formats = Serialization.formats(ShortTypeHints(List(classOf[Message])))
override def receive: Receive = {
case StartGame =>
message="startGame"
var response: String = null
var msg:String =""
val code = pretty(render(Extraction.decompose(Message(currentId.toString,message))))
println(code)
channel.basicPublish("myDirect", "service", props, code.getBytes("UTF-8"))
println(replyQueueName)
println(corrId)
while(response == null){
val consumer = new DefaultConsumer(channel) {
override def handleDelivery(consumerTag: String,
envelope: Envelope,
properties: AMQP.BasicProperties,
body: Array[Byte]) {
msg = new String(body, "UTF-8")
println(properties.getCorrelationId)
println(s"message is $msg")
if(properties.getCorrelationId == corrId){
response = new String(body, "UTF-8")
}
}
}
channel.basicConsume(replyQueueName, true, consumer)
}
currentId=response.toInt
log.info(s"Session started [$currentId]")
msg match {
case _ => drawer ! Drawer.Data(msg)
}
self ! Start
}
}
Error:
This is what I get when I try to match msg.
According to the attached screenshot, the exception occurs on line 83 of this class.
The problem is therefore that the drawer member is null, so drawer ! Drawer.Data(msg) throws the NPE.
I would guess that the initialization of the actor (using Props) is done with a variable that is defined later in the same scope, resulting in a "forward reference".
See my example in your other related question (Scala Akka NullPointerException error).
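To illustrate that guess with a contrived sketch (hypothetical names, not the asker's actual wiring): inside a class or object body, Scala allows a val to be referenced before its definition, and the reference is simply null at that point.

object ForwardRefDemo {
  class Holder(val target: String)

  // drawer is defined below, so it is still null when captured here.
  val holder = new Holder(drawer)
  val drawer: String = "drawer-actor"

  def main(args: Array[String]): Unit =
    println(holder.target) // prints null; in the actor version, drawer ! Drawer.Data(msg) then throws the NPE
}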

Scala Reflection exception during creation of DataSet in Spark

I want to run a Spark job on Spark Jobserver.
During execution, I got an exception.
Stack:
java.lang.RuntimeException: scala.ScalaReflectionException: class com.some.example.instrument.data.SQLMapping in JavaMirror with org.apache.spark.util.MutableURLClassLoader@55b699ef of type class org.apache.spark.util.MutableURLClassLoader with classpath [file:/app/spark-job-server.jar] and parent being sun.misc.Launcher$AppClassLoader@2e817b38 of type class sun.misc.Launcher$AppClassLoader with classpath [.../classpath jars/] not found.
at scala.reflect.internal.Mirrors$RootsBase.staticClass(Mirrors.scala:123)
at scala.reflect.internal.Mirrors$RootsBase.staticClass(Mirrors.scala:22)
at com.some.example.instrument.DataRetriever$$anonfun$combineMappings$1$$typecreator15$1.apply(DataRetriever.scala:136)
at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe$lzycompute(TypeTags.scala:232)
at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe(TypeTags.scala:232)
at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder$.apply(ExpressionEncoder.scala:49)
at org.apache.spark.sql.Encoders$.product(Encoders.scala:275)
at org.apache.spark.sql.LowPrioritySQLImplicits$class.newProductEncoder(SQLImplicits.scala:233)
at org.apache.spark.sql.SQLImplicits.newProductEncoder(SQLImplicits.scala:33)
at com.some.example.instrument.DataRetriever$$anonfun$combineMappings$1.apply(DataRetriever.scala:136)
at com.some.example.instrument.DataRetriever$$anonfun$combineMappings$1.apply(DataRetriever.scala:135)
at scala.util.Success$$anonfun$map$1.apply(Try.scala:237)
at scala.util.Try$.apply(Try.scala:192)
at scala.util.Success.map(Try.scala:237)
at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:237)
at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:237)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
at scala.concurrent.impl.ExecutionContextImpl$AdaptedForkJoinTask.exec(ExecutionContextImpl.scala:121)
at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
In DataRetriever I convert a simple case class to a Dataset.
Case class definitions:
case class SQLMapping(id: String,
it: InstrumentPrivateKey,
cc: Option[String],
ri: Option[SourceInstrumentId],
p: Option[SourceInstrumentId],
m: Option[SourceInstrumentId])
case class SourceInstrumentId(instrumentId: Long,
providerId: String)
case class InstrumentPrivateKey(instrumentId: Long,
providerId: String,
clientId: String)
Code that causes the problem:
import session.implicits._
def someFunc(future: Future[ID]): Future[Dataset[SQLMapping]] = {
future.map { f =>
val seq: Seq[SQLMapping] = getFromEndpoint(f)
val ds: Dataset[SQLMapping] = seq.toDS()
...
}
}
The job sometimes works, but if I re-run the job, it will throw an exception.
Update 28.03.2018
I forgot to mention one detail that turns out to be important.
The Dataset was constructed inside a Future.
Calling toDS() inside the future causes the ScalaReflectionException.
I decided to construct the Dataset outside future.map.
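For reference, here is a sketch of that restructuring (reusing ID and getFromEndpoint from the snippet above, so it is illustrative rather than complete): resolve the Future first, then call toDS() on the calling thread.

import scala.concurrent.Await
import scala.concurrent.duration._
import session.implicits._

def someFunc(future: Future[ID]): Dataset[SQLMapping] = {
  // Block until the Future completes, on the caller's thread...
  val f = Await.result(future, 10.seconds)
  val seq: Seq[SQLMapping] = getFromEndpoint(f)
  // ...so that toDS(), and the TypeTag lookup it triggers, no longer run on a ForkJoin worker.
  seq.toDS()
}

Blocking with Await is blunt, but it keeps the reflection lookup off the ForkJoin worker thread, whose classloader apparently could not see SQLMapping.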
You can verify that the Dataset can't be constructed in future.map with this example job:
package com.example.sparkapplications
import com.typesafe.config.Config
import org.apache.spark.SparkContext
import org.apache.spark.sql.SparkSession
import scala.concurrent.Await
import scala.concurrent.Future
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global
import spark.jobserver.SparkJob
import spark.jobserver.SparkJobValid
import spark.jobserver.SparkJobValidation
object FutureJob extends SparkJob{
override def runJob(sc: SparkContext,
jobConfig: Config): Any = {
val session = SparkSession.builder().config(sc.getConf).getOrCreate()
import session.implicits._
val f = Future{
val seq = Seq(
Dummy("1", 1),
Dummy("2", 2),
Dummy("3", 3),
Dummy("4", 4),
Dummy("5", 5)
)
val ds = seq.toDS
ds.collect()
}
Await.result(f, 10 seconds)
}
case class Dummy(id: String, value: Long)
override def validate(sc: SparkContext,
config: Config): SparkJobValidation = SparkJobValid
}
Later I will provide information on whether the problem persists with Spark 2.3.0, and when you pass the jar via spark-submit directly.

Op-Rabbit with Spray-Json in Akka Http

I am trying to use the Op-Rabbit library to consume a RabbitMQ queue in an Akka HTTP project.
I want to use Spray-Json for the marshalling/unmarshalling.
import com.spingo.op_rabbit.SprayJsonSupport._
import com.spingo.op_rabbit.stream.RabbitSource
import com.spingo.op_rabbit.{Directives, RabbitControl}
object Boot extends App with Config with BootedCore with ApiService {
this: ApiService with Core =>
implicit val materializer = ActorMaterializer()
Http().bindAndHandle(routes, httpInterface, httpPort)
log.info("Http Server started")
implicit val rabbitControl = system.actorOf(Props[RabbitControl])
import Directives._
RabbitSource(
rabbitControl,
channel(qos = 3),
consume(queue(
"such-queue",
durable = true,
exclusive = false,
autoDelete = false)),
body(as[User])).
runForeach { user =>
log.info(user.toString)
} // after each successful iteration the message is acknowledged.
}
In a separate file:
case class User(id: Long,name: String)
object JsonFormat extends DefaultJsonProtocol {
implicit val format = jsonFormat2(User)
}
The error I am getting is:
could not find implicit value for parameter um: akka.http.scaladsl.unmarshalling.FromRequestUnmarshaller[*.*.models.User]
[error] body(as[User])). // marshalling is automatically hooked up using implicits
[error] ^
[error]could not find implicit value for parameter um: com.spingo.op_rabbit.RabbitUnmarshaller[*.*.models.User]
[error] body(as[User])
[error] ^
[error] two errors found
I'm not sure how to get the op-rabbit spray-json support working properly.
Thanks for any help.
Try to provide an implicit marshaller for your User class, like they do for Int (in RabbitTestHelpers.scala):
implicit val simpleIntMarshaller = new RabbitMarshaller[Int] with RabbitUnmarshaller[Int] {
val contentType = "text/plain"
val contentEncoding = Some("UTF-8")
def marshall(value: Int) =
value.toString.getBytes
def unmarshall(value: Array[Byte], contentType: Option[String], charset: Option[String]) = {
new String(value).toInt
}
}
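Adapting that pattern to User, here is a rough sketch that delegates to the spray-json format defined earlier (JsonFormat.format); it assumes RabbitMarshaller/RabbitUnmarshaller have the same members as in the Int example above:

import com.spingo.op_rabbit.{RabbitMarshaller, RabbitUnmarshaller}
import spray.json._
import JsonFormat.format // the jsonFormat2(User) defined above

implicit val userMarshaller = new RabbitMarshaller[User] with RabbitUnmarshaller[User] {
  val contentType = "application/json"
  val contentEncoding = Some("UTF-8")
  def marshall(value: User) =
    value.toJson.compactPrint.getBytes("UTF-8")
  def unmarshall(value: Array[Byte], contentType: Option[String], charset: Option[String]) =
    new String(value, "UTF-8").parseJson.convertTo[User]
}

Alternatively, since com.spingo.op_rabbit.SprayJsonSupport._ is already imported, it may be enough to bring the implicit format into scope at the call site (import JsonFormat._) so that support trait can derive the RabbitUnmarshaller for you.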