How to serialize inner case objects with akka custom serialization? - scala

I have a trait and case objects implementing it:

trait Cowboy {
  def code: String
}

object Cowboy {
  case object Good extends Cowboy {
    val code = "G"
  }
  case object Bad extends Cowboy {
    val code = "B"
  }
  case object Ugly extends Cowboy {
    val code = "U"
  }

  def fromString(code: String) = code match {
    case Good.code => Good
    case Bad.code  => Bad
    case Ugly.code => Ugly
  }
}
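(The post never shows the custom serializer itself; purely for context, a hypothetical CowboySerializer based on Akka's SerializerWithStringManifest could look like the sketch below. The identifier value and the one-letter byte encoding are assumptions of mine, not the asker's code.)

package mypackage

import java.nio.charset.StandardCharsets
import akka.serialization.SerializerWithStringManifest

class CowboySerializer extends SerializerWithStringManifest {
  // Any application-unique serializer id works here.
  override def identifier: Int = 9001

  override def manifest(o: AnyRef): String = "Cowboy"

  // Encode a Cowboy as its one-letter code.
  override def toBinary(o: AnyRef): Array[Byte] = o match {
    case c: Cowboy => c.code.getBytes(StandardCharsets.UTF_8)
    case other     => throw new IllegalArgumentException(s"Cannot serialize $other")
  }

  // Turn the one-letter code back into the corresponding case object.
  override def fromBinary(bytes: Array[Byte], manifest: String): AnyRef =
    Cowboy.fromString(new String(bytes, StandardCharsets.UTF_8))
}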
I want to be able to serialize them. With the following serialization configs
serializers {
  cowboySerializer = "mypackage.CowboySerializer"
}
serialization-bindings {
  "mypackage.Cowboy" = cowboySerializer
}
I get this warning:
Multiple serializers found for class mypackage.Cowboy$Ugly$, choosing first:
Vector((interface java.io.Serializable,akka.serialization.JavaSerializer@3fee36d8),
(interface mypackage.Cowboy,brigadier.scraper.ScrapeStatusSerializer@10442350))
which means that the Java serializer is picked instead of mine.
If I disable the Java serializer via

akka.actor.serialization-bindings {
  "java.io.Serializable" = none
}
the following error occurs:
Rejected to persist event type [mypackage.Cowboy$Ugly$] with sequence
number 1 for persistenceId [XXX] due to [mypackage.Cowboy$Ugly$
cannot be cast to mypackage.Cowboy]
This seems strange to me, because the cast
Cowboy.Ugly.asInstanceOf[Cowboy]
obviously works.
I also tried to map every case object to the serializer, as the Akka docs say:

If your messages are contained inside of a Scala object, then in order to reference those messages, you will need to use the fully qualified Java class name. For a message named Message contained inside the object named Wrapper you would need to reference it as Wrapper$Message instead of Wrapper.Message.
serialization-bindings {
  "mypackage.Cowboy$Good" = cowboySerializer
  "mypackage.Cowboy$Bad" = cowboySerializer
  "mypackage.Cowboy$Ugly" = cowboySerializer
}
and got an ActorInitializationException caused by an InvocationTargetException caused by a ClassNotFoundException (it can't find the mypackage.Cowboy$Ugly class).
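(For reference, the runtime class name of a case object nested inside an object carries a trailing $, which is easy to check; a quick sketch, not part of the original post:)

println(Cowboy.Good.getClass.getName) // prints mypackage.Cowboy$Good$
println(Cowboy.Ugly.getClass.getName) // prints mypackage.Cowboy$Ugly$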
What am I doing wrong?

Related

Scala/Akka/Protobuf: Failed to serialize and deserialize message

We are trying to use protobuf with Akka and serialize all Akka messages via protobuf. For Scala, we have a library called ScalaPB which helps us generate classes that include methods like parseFrom, toByteArray, etc. for serializing and deserializing our data. But while we try to run the program, we get the below exception:
akka.actor.dungeon.SerializationCheckFailedException: Failed to serialize and deserialize message of type com.knoldus.akkaserialization.msg.example.Bang$ for testing. To avoid this error, either disable 'akka.actor.serialize-messages', mark the message with 'akka.actor.NoSerializationVerificationNeeded', or configure serialization to support this message
The application.conf file contains the below configuration:

akka {
  actor {
    allow-java-serialization = off
    serialize-messages = on
    serializers {
      proto = "akka.remote.serialization.ProtobufSerializer"
    }
    serialization-bindings {
      "com.knoldus.akkaserialization.msg.example.Bang" = proto
      "com.knoldus.akkaserialization.msg.example.Message" = proto
    }
  }
}
The classes com.knoldus.akkaserialization.msg.example.Bang and com.knoldus.akkaserialization.msg.example.Message are generated via ScalaPB and contain all the required methods.
The source code of akka.remote.serialization.ProtobufSerializer states:

This Serializer serializes `akka.protobuf.Message` and `com.google.protobuf.Message`. It is using reflection to find the `parseFrom` and `toByteArray` methods to avoid dependency to `com.google.protobuf`.
So we expected this to automatically handle our case classes Bang and Message and perform the serialization, but unfortunately we get the serialization exception above.
Could you help figure out the exact problem with ScalaPB and protobuf?
The serializer you are using was designed to work with Java protobufs, not with ScalaPB. You need to include your own serializer. Here is one you can use:
package com.example.protoser

import java.util.concurrent.atomic.AtomicReference

import akka.actor.ExtendedActorSystem
import akka.serialization.BaseSerializer
import scalapb.GeneratedMessageCompanion

class ScalaPbSerializer(val system: ExtendedActorSystem) extends BaseSerializer {
  private val classToCompanionMapRef =
    new AtomicReference[Map[Class[_], GeneratedMessageCompanion[_]]](Map.empty)

  override def toBinary(o: AnyRef): Array[Byte] = o match {
    case e: scalapb.GeneratedMessage => e.toByteArray
    case _ => throw new IllegalArgumentException("Need a subclass of scalapb.GeneratedMessage")
  }

  override def includeManifest: Boolean = true

  override def fromBinary(bytes: Array[Byte], manifest: Option[Class[_]]): AnyRef = {
    manifest match {
      case Some(clazz) =>
        @scala.annotation.tailrec
        def messageCompanion(companion: GeneratedMessageCompanion[_] = null): GeneratedMessageCompanion[_] = {
          val classToCompanion = classToCompanionMapRef.get()
          classToCompanion.get(clazz) match {
            case Some(cachedCompanion) => cachedCompanion
            case None =>
              val uncachedCompanion =
                if (companion eq null)
                  Class.forName(clazz.getName + "$", true, clazz.getClassLoader)
                    .getField("MODULE$").get(null).asInstanceOf[GeneratedMessageCompanion[_]]
                else companion
              if (classToCompanionMapRef.compareAndSet(classToCompanion, classToCompanion.updated(clazz, uncachedCompanion)))
                uncachedCompanion
              else
                messageCompanion(uncachedCompanion)
          }
        }
        messageCompanion().parseFrom(bytes).asInstanceOf[AnyRef]
      case _ => throw new IllegalArgumentException("Need a ScalaPB companion class to be able to deserialize.")
    }
  }
}
The configuration should be something like this:
akka {
  actor {
    serializers {
      scalapb = "com.example.protoser.ScalaPbSerializer"
    }
    serialization-bindings {
      "scalapb.GeneratedMessage" = scalapb
    }
    serialization-identifiers {
      "com.example.protoser.ScalaPbSerializer" = 10000
    }
  }
}
The above was adapted from old code, so edits and suggestions are welcome!
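To verify which serializer Akka actually resolves for a generated message class, one option (a sketch assuming the serializer and configuration above are on the classpath) is to ask the SerializationExtension directly:

import akka.actor.ActorSystem
import akka.serialization.SerializationExtension

object SerializerCheck extends App {
  val system = ActorSystem("check")
  val serialization = SerializationExtension(system)

  // Bang is the ScalaPB-generated message class from the question.
  val chosen = serialization.serializerFor(classOf[com.knoldus.akkaserialization.msg.example.Bang])
  println(chosen.getClass.getName) // expected: com.example.protoser.ScalaPbSerializer

  system.terminate()
}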
Here is an easy way to do it: just add the following lines to your configuration.
https://doc.akka.io/docs/akka/current/serialization.html
Akka provides serializers for several primitive types and protobuf com.google.protobuf.GeneratedMessage (protobuf2) and com.google.protobuf.GeneratedMessageV3 (protobuf3) by default (the latter only if depending on the akka-remote module), so normally you don’t need to add configuration for that if you send raw protobuf messages as actor messages.
ScalaPB-generated code can also be serialized as protobuf, so we just need to bind the ScalaPB-generated base traits to the built-in proto serializer.
akka {
  actor {
    serialization-bindings {
      "com.google.protobuf.Message" = proto
      "scalapb.GeneratedMessage" = proto
      "scalapb.GeneratedEnum" = proto
    }
  }
}
This works for me. My environment is:
akka-grpc: 2.1.4
akka: 2.6.19
Scala: 2.13.8

force pureconfig to generate ConfigReader for case classes

I have an implicit class that wraps a Typesafe Config instance to parse information out of it.
The class uses pureconfig to do the parsing; I use it because I prefer config.as[String]("foo") over loadConfig[String](config, "foo"). But since I use this HoconConfigUtil as an adapter,
parsing any of my case classes fails, because loadConfig is not called directly on those
case classes. Because of this I am getting the error shown below. What would be the best way to handle this issue?
Error Message:
could not find implicit value for evidence parameter of type pureconfig.ConfigReader[com.example.config.Parallelism]
The implicit adapter class:

implicit class HoconConfigUtil(config: Config) {
  def as[T](path: String)(implicit derivation: Derivation[ConfigReader[T]]): T = {
    pureconfig.loadConfig[T](config, path) match {
      case Right(x: T @unchecked) => x
      case Left(th: ConfigReaderFailures) => throw makeException(th)
    }
  }
}
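The error usually just means that no ConfigReader[Parallelism] is available in implicit scope at the call site. A sketch of one way to satisfy it, assuming a pureconfig version where automatic derivation comes from pureconfig.generic.auto._, the HoconConfigUtil adapter above being in scope, and a hypothetical Parallelism case class matching the error message:

import com.typesafe.config.ConfigFactory
import pureconfig.generic.auto._ // brings automatic ConfigReader derivation into scope

// Hypothetical shape of the case class named in the error message.
case class Parallelism(min: Int, max: Int)

object Example extends App {
  // Assumes the HoconConfigUtil implicit class from the question is in scope.
  val config = ConfigFactory.parseString("parallelism { min = 1, max = 8 }")
  // With the auto import in scope, Derivation[ConfigReader[Parallelism]] resolves.
  println(config.as[Parallelism]("parallelism"))
}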

How can I return a scala "object" as if it's a class instance?

I have an abstract superclass with a variety of stateless implementations. It seems like the thing to do is make the abstract superclass a class, and make each implementation an object because I only ever need one of each.
This is a toy example, but it shows the compile error I’m getting:
// Transformer.scala
abstract class Transformer {
  def transform(value: String): String
}

object Transformer {
  def getTransformer(name: String): Transformer = {
    name match {
      case "upper" => UpperTransformer
      // I'm getting "Cannot Resolve Symbol" on UpperTransformer,
      // even though they're in the same package.
      case _ => throw new IllegalArgumentException("...")
    }
  }
}

// ---
// UpperTransformer.scala is in the same package
object UpperTransformer extends Transformer {
  override def transform(value: String) = value.toUpperCase()
}
I’m really shooting for some sort of dynamic dispatch on (dataProvider, desiredAction) here.
Then some class can call Transformer.getTransformer("upper").transform("stack-overflow") without making any unnecessary objects.
Any idea what I should be doing differently? Should I stop worrying about premature optimization and learn to love the instance?
The problem isn't visibility, it's that you simply do not define an object named UpperTransformer anywhere - only a class. If you replace class UpperTransformer with object UpperTransformer, your code should work fine.
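With UpperTransformer defined as an object, the dispatch the question is after works without any extra allocations; a quick usage sketch (the demo object name is mine):

object TransformerDemo extends App {
  val upper = Transformer.getTransformer("upper")
  println(upper.transform("stack-overflow")) // STACK-OVERFLOW
  // Every lookup returns the same singleton instance, so no unnecessary objects are created.
  println(upper eq Transformer.getTransformer("upper")) // true
}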

Specs2 strange behavior in custom matcher when combining beAnInstanceOf and "aka"

I'm experiencing strange behavior and I'm wondering if this is a bug or if I'm missing something.
The following code:
class Foo extends SpecificationWithJUnit {
  "This test should pass" in new ctx {
    Bar(Zoo("id")) must haveInstanceZoo
  }

  trait ctx extends Scope {
    def haveInstanceZoo: Matcher[Bar] =
      beAnInstanceOf[Zoo] ^^ { (_: Bar).z aka "instanceOfZoo" }
  }
}

case class Bar(z: Zoo)
case class Zoo(id: String)
fails with the following Exception:

'org.specs2.matcher.ThrownExpectationsCreation$$anon$1@48072f8c: org.specs2.matcher.ThrownExpectationsCreation$$anon$1' is not an instance of 'com.test.Zoo'
If I remove the "aka" from the custom matcher everything works.
Thoughts?
Thanks
Netta
You cannot use aka like this, because you are effectively trying to assert that an Expectation (the object you create with aka) is an instance of Zoo.
If you want to specify a different failure message on a matcher you can write this:
def haveInstanceZoo: Matcher[Bar] = (bar: Bar) =>
  (bar.z.isInstanceOf[Zoo], "bar.z is not an instance of Zoo")

spray-json and spray-routing: how to invoke JsonFormat write in complete

I am trying to figure out how to get a custom JsonFormat write method to be invoked when using the routing directive complete. JsonFormats created with the jsonFormat set of helper functions work fine, but the write method of a fully hand-written JsonFormat never gets called.
sealed trait Error
sealed trait ErrorWithReason extends Error {
  def reason: String
}
case class ValidationError(reason: String) extends ErrorWithReason
case object EntityNotFound extends Error
case class DatabaseError(reason: String) extends ErrorWithReason

case class Record(a: String, b: String, error: Error)

object MyJsonProtocol extends DefaultJsonProtocol {
  implicit object ErrorJsonFormat extends JsonFormat[Error] {
    def write(err: Error) = err match {
      case e: ErrorWithReason => JsString(e.reason)
      case x => JsString(x.toString())
    }
    def read(value: JsValue) = {
      value match {
        // Really only intended to serialize to JSON for API responses, not implementing read
        case _ => throw new DeserializationException("Can't reliably deserialize Error")
      }
    }
  }
  implicit val record2Json = jsonFormat3(Record)
}
And then a route like:

import MyJsonProtocol._

trait TestRoute extends HttpService with Json4sSupport {
  path("testRoute") {
    val response: Record = getErrorRecord()
    complete(response)
  }
}
If I add logging, I can see that the ErrorJsonFormat.write method never gets called.
The ramifications are as follows, showing what output I'm trying to get versus what I actually get. Let's say the Record instance was Record("something", "somethingelse", EntityNotFound).
actual:

{
  "a": "something",
  "b": "somethingelse",
  "error": {}
}

intended:

{
  "a": "something",
  "b": "somethingelse",
  "error": "EntityNotFound"
}
I was expecting that complete(record) would use the implicit JsonFormat for Record, which in turn relies on the implicit object ErrorJsonFormat whose write method creates the appropriate JsString field. Instead it seems to recognize the provided ErrorJsonFormat while ignoring its instructions for serializing.
I feel like there should be a solution that does not involve needing to replace implicit val record2Json = jsonFormat3(Record) with an explicit implicit object RecordJsonFormat extends JsonFormat[Record] { ... }
So to summarize what I am asking:
1. Why does the serialization of Record fail to call the ErrorJsonFormat write method (what does it even do instead)? (answered below)
2. Is there a way to match my expectation while still using complete(record)?
Edit
Digging through the spray-json source code, there is an sbt-boilerplate template that seems to define the jsonFormat series of methods: https://github.com/spray/spray-json/blob/master/src/main/boilerplate/spray/json/ProductFormatsInstances.scala.template
and the relevant expansion for jsonFormat3 from that seems to be:
def jsonFormat3[P1 :JF, P2 :JF, P3 :JF, T <: Product :ClassManifest](construct: (P1, P2, P3) => T): RootJsonFormat[T] = {
  val Array(p1, p2, p3) = extractFieldNames(classManifest[T])
  jsonFormat(construct, p1, p2, p3)
}

def jsonFormat[P1 :JF, P2 :JF, P3 :JF, T <: Product](construct: (P1, P2, P3) => T,
    fieldName1: String, fieldName2: String, fieldName3: String): RootJsonFormat[T] = new RootJsonFormat[T] {
  def write(p: T) = {
    val fields = new collection.mutable.ListBuffer[(String, JsValue)]
    fields.sizeHint(3 * 4)
    fields ++= productElement2Field[P1](fieldName1, p, 0)
    fields ++= productElement2Field[P2](fieldName2, p, 1)
    fields ++= productElement2Field[P3](fieldName3, p, 2)
    JsObject(fields: _*)
  }
  def read(value: JsValue) = {
    val p1V = fromField[P1](value, fieldName1)
    val p2V = fromField[P2](value, fieldName2)
    val p3V = fromField[P3](value, fieldName3)
    construct(p1V, p2V, p3V)
  }
}
From this it would seem that jsonFormat3 itself is perfectly fine (if you trace into productElement2Field, it grabs the writer and directly calls write). The problem must then be that complete(record) doesn't involve JsonFormat at all and marshals the object by some other means.
So this seems to answer part 1: Why does the serialization of Record fail to call the ErrorJsonFormat write method (what does it even do instead?). No JsonFormat is called because complete marshals via some other means.
It seems the remaining question is whether it is possible to provide a marshaller for the complete directive that will use the JsonFormat if it exists and otherwise fall back to its normal behavior. I realize that I can generally rely on the default marshaller for basic case class serialization. But when I have a complicated trait/case class setup like in this example, I need to use JsonFormat to get the proper response. Ideally, someone writing routes shouldn't have to know in which situations the default marshaller suffices and in which JsonFormat must be invoked. In other words, needing to distinguish whether a given type should be written as complete(someType) or complete(someType.toJson) feels wrong.
After digging further, it seems the root of the problem has been a confusion of the Json4s and Spray-Json libraries in the code. In trying to track down examples of various elements of JSON handling, I didn't recognize the separation between the two libraries readily and ended up with code that mixed some of each, explaining the unexpected behavior.
In this question, the offending piece is pulling Json4sSupport into the router. The proper definition should use SprayJsonSupport:
import MyJsonProtocol._

trait TestRoute extends HttpService with SprayJsonSupport {
  path("testRoute") {
    val response: Record = getErrorRecord()
    complete(response)
  }
}
With this all considered, the answers are more apparent.
1: Why does the serialization of Record fail to call the ErrorJsonFormat write method (what does it even do instead)?
No JsonFormat is called because complete marshals via some other means. That other means is the marshaling provided implicitly by Json4s with Json4sSupport. You can use record.toJson to force spray-json serialization of the object, but the output will not be clean (it will include nested JS objects and "fields" keys).
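For illustration, spray-json on its own does apply the custom format when invoked directly; the unclean output described above presumably comes from Json4sSupport then re-serializing the resulting spray-json AST as a plain object. A quick sketch:

import spray.json._
import MyJsonProtocol._

val record = Record("something", "somethingelse", EntityNotFound)
// Direct spray-json serialization goes through ErrorJsonFormat:
println(record.toJson.compactPrint)
// {"a":"something","b":"somethingelse","error":"EntityNotFound"} (field order may vary)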
2: Is there a way to match my expectation while still using complete(record)?
Yes, using SprayJsonSupport will use implicit RootJsonReader and/or RootJsonWriter where needed to automatically create a relevant Unmarshaller and/or Marshaller. Documentation reference
So with SprayJsonSupport it will see the RootJsonWriter defined by the jsonFormat3(Record) and complete(record) will serialize as expected.