I'm using Akka HTTP 2.4.7. For JSON (via json4s), deserialisation looks like this:
entity(as[CaseClass]) { caseClass => ... }
and for serialising, I just return a case class instance.
I wonder, is there a similar way for protobuf?
No, there isn't anything available out of the box (though I agree it would be nice to have). As of akka-http 2.4.11, you need to build your own custom unmarshaller.
You can build it on top of an existing protobuf definition. Here's a stub showing how to do it:
import akka.http.scaladsl.unmarshalling.{FromEntityUnmarshaller, PredefinedFromEntityUnmarshallers}

case class Person(...)

implicit val personUnmarshaller: FromEntityUnmarshaller[Person] =
  PredefinedFromEntityUnmarshallers.byteArrayUnmarshaller map { bytes =>
    // bytes is an Array[Byte]: hand it to your generated protobuf parser here,
    // e.g. Person.parseFrom(bytes) if Person is a ScalaPB-generated class
  }
and then import this definition into scope where your route is defined.
Related
I'm starting to learn Http4s; at work we may need to migrate REST APIs implemented with Akka HTTP to Http4s.
As an example, I can define a custom directive like this:
import java.util.UUID

import scala.util.{Failure, Success, Try}

import akka.http.scaladsl.model.StatusCodes
import akka.http.scaladsl.server.Directive1
import akka.http.scaladsl.server.Directives._

trait CustomDirectives {
  def extractUUID: Directive1[UUID] =
    path(Segment).flatMap { maybeuuid =>
      Try(UUID.fromString(maybeuuid)) match {
        case Success(result) => provide(result)
        case Failure(_) =>
          complete(StatusCodes.BadRequest, s"Invalid uuid: $maybeuuid")
      }
    }
}
So every time I want to extract and validate a UUID, I can use this directive.
We have other custom directives that do some processing on headers, and many more.
Is there anything similar to Akka's custom directives in Http4s?
This is described in the Handling Path Parameters section of the documentation:
import java.util.UUID

import scala.util.Try

import cats.effect.IO
import org.http4s.HttpRoutes
import org.http4s.dsl.io._

// standard Scala extractor, matching a String against UUID
object UUIDVar {
  def unapply(maybeuuid: String): Option[UUID] =
    Try(UUID.fromString(maybeuuid)).toOption
}

val usersService = HttpRoutes.of[IO] {
  // just use the extractor in pattern matching
  case GET -> Root / "users" / UUIDVar(uuid) =>
    Ok(useUUIDForSth(uuid))
}
However, I personally find it easier to describe endpoints with libraries like Tapir or endpoints4s, since their DSLs seem more intuitive to me and they don't couple my code to a particular server implementation.
No instance of play.api.libs.json.Format is available for akka.actor.typed.ActorRef[org.knoldus.eventSourcing.UserState.Confirmation] in the implicit scope (Hint: if declared in the same file, make sure it's declared before)
[error] implicit val userCommand: Format[AddUserCommand] = Json.format
I am getting this error even though I have defined an implicit Format instance for AddUserCommand.
Here is my code:
trait UserCommand extends CommandSerializable

object AddUserCommand {
  implicit val format: Format[AddUserCommand] = Json.format[AddUserCommand]
}

final case class AddUserCommand(user: User, reply: ActorRef[Confirmation]) extends UserCommand
Can anyone please help me with this error and how to solve it?
As Gael noted, you need to provide a Format for ActorRef[Confirmation]. The complication is that the natural serialization, using the ActorRefResolver, requires an ExtendedActorSystem to be present, which means that the usual approach of defining a Format in a companion object won't quite work.
Note that because of the way Lagom does dependency injection, this approach doesn't really work in Lagom: commands in Lagom basically can't use Play JSON.
import akka.actor.ExtendedActorSystem
import akka.actor.typed.{ActorRef, ActorRefResolver}
import akka.actor.typed.scaladsl.adapter.ClassicActorSystemOps
import play.api.libs.json._

class PlayJsonActorRefFormat(system: ExtendedActorSystem) {
  def reads[A] = new Reads[ActorRef[A]] {
    def reads(jsv: JsValue): JsResult[ActorRef[A]] =
      jsv match {
        case JsString(s) => JsSuccess(ActorRefResolver(system.toTyped).resolveActorRef(s))
        case _ => JsError(Seq(JsPath() -> Seq(JsonValidationError(Seq("ActorRefs are strings")))))
      }
  }

  def writes[A] = new Writes[ActorRef[A]] {
    def writes(a: ActorRef[A]): JsValue =
      JsString(ActorRefResolver(system.toTyped).toSerializationFormat(a))
  }

  def format[A] = Format[ActorRef[A]](reads, writes)
}
You can then define a format for AddUserCommand as
object AddUserCommand {
  def format(arf: PlayJsonActorRefFormat): Format[AddUserCommand] = {
    implicit def arfmt[A]: Format[ActorRef[A]] = arf.format
    Json.format[AddUserCommand]
  }
}
Since you're presumably using JSON to serialize the messages sent around a cluster (otherwise, the ActorRef shouldn't be leaking out like this), you would then construct an instance of the format in your Akka Serializer implementation.
(NB: I've only done this with Circe, not Play JSON, but the basic approach is common)
The error says that it cannot construct a Format for AddUserCommand because there's no Format for ActorRef[Confirmation].
When using Json.format[X], all the members of the case class X must have a Format defined.
In your case, you probably don't want to define a formatter for this case class (serializing an ActorRef doesn't make much sense) but rather build another case class with data only.
Edit: See Levi's answer on how to provide a formatter for ActorRef if you really want to send out there the actor reference.
I have a persistent actor which can receive one type of command, Persist(event), where event is an implementation of the trait Event (there are numerous implementations of it). On success, it responds with Persisted(event) to the sender.
The event itself is serializable, since this is the data we store in the persistence storage. The serialization is implemented with a custom serializer which internally uses classes generated from Google protobuf .proto files, and this custom serializer is configured in application.conf and bound to the base trait Event. This is already working fine.
Note: the implementations of Event are not classes generated by protobuf. They are normal Scala classes, and they have protobuf equivalents too, but the mapping goes through the custom serializer that's bound to the base Event type. This was done by my predecessors for versioning (which probably isn't required, since it could also be handled with plain protobuf classes plus custom serialization, but that's a different matter) and I don't wish to change that at the moment.
We're now trying to implement cluster sharding for this actor which also means that my commands (viz. Persist and Persisted) also need to be serializable since they may be forwarded to other nodes.
This is the domain model:

sealed trait PersistenceCommand {
  def event: Event
}

final case class Persisted(event: Event) extends PersistenceCommand
final case class Persist(event: Event) extends PersistenceCommand
The problem is, I do not see an elegant way to make them serializable. Following are the options I have considered.
Approach 1. Define a new proto file for Persist and Persisted, but what do I use as the datatype for event? I didn't find a way to define something like this :
message Persist {
  "com.example.Event" event = 1 // this doesn't work
}
Such that I could use the existing Scala trait Event as a data type. If this worked, I guess (it's far-fetched, though) I could bind the generated code (after compiling this proto file) to Akka's built-in serializer for Google protobuf and it might work. The note above explains why I cannot use the oneof construct in my proto file.
Approach 2. This is what I've implemented, and it works (but I don't like it).
Basically, I wrote a new serializer for the commands and delegated serialization and deserialization of the event part of the command to the existing serializer.
class PersistenceCommandSerializer extends SerializerWithStringManifest {
  val eventSerializer: ManifestAwareEventSerializer = new ManifestAwareEventSerializer()
  val PersistManifest = Persist.getClass.getName
  val PersistedManifest = Persisted.getClass.getName
  val Separator = "~"

  override def identifier: Int = 808653986

  override def manifest(o: AnyRef): String = o match {
    case Persist(event)   => s"$PersistManifest$Separator${eventSerializer.manifest(event)}"
    case Persisted(event) => s"$PersistedManifest$Separator${eventSerializer.manifest(event)}"
  }

  override def toBinary(o: AnyRef): Array[Byte] = o match {
    case command: PersistenceCommand => eventSerializer.toBinary(command.event)
  }

  override def fromBinary(bytes: Array[Byte], manifest: String): AnyRef = {
    val (commandManifest, dataManifest) = splitIntoCommandAndDataManifests(manifest)
    val event = eventSerializer.fromBinary(bytes, dataManifest).asInstanceOf[Event]
    commandManifest match {
      case PersistManifest   => Persist(event)
      case PersistedManifest => Persisted(event)
    }
  }

  private def splitIntoCommandAndDataManifests(manifest: String) = {
    val commandAndDataManifests = manifest.split(Separator)
    (commandAndDataManifests(0), commandAndDataManifests(1))
  }
}
The problem with this approach is what I'm doing in def manifest and def fromBinary: I have to carry both the command's manifest and the event's manifest through serialization and deserialization, hence the ~ separator, which is effectively my own custom encoding scheme for the manifest information.
Is there a better or perhaps, a right way, to implement this?
For context: I'm using ScalaPB for generating scala classes from .proto files and classic akka actors.
Any kind of guidance is hugely appreciated!
If you delegate serialization of the nested object to whichever serializer you have configured, the nested field should contain not only the bytes of the serialized data but also an int32 with the id of the serializer used and bytes for the message manifest. This ensures that you will be able to version/replace the nested serializer later, which is important for data that will be stored over a long time period.
You can see how this is done internally in Akka for our own wire formats here: https://github.com/akka/akka/blob/6bf20f4117a8c27f8bd412228424caafe76a89eb/akka-remote/src/main/protobuf/WireFormats.proto#L48 and here https://github.com/akka/akka/blob/6bf20f4117a8c27f8bd412228424caafe76a89eb/akka-remote/src/main/scala/akka/remote/MessageSerializer.scala#L45
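Following that pattern, a sketch of what the commands' .proto could look like; the message and field names here are illustrative (loosely modeled on Akka's own wire format), not an actual Akka schema:

```protobuf
syntax = "proto2";

// The nested event is stored together with the id of the serializer that
// wrote it and that serializer's manifest, so the event serializer can be
// replaced or versioned independently of the command envelope.
message EventPayload {
  required bytes enclosedMessage = 1;
  required int32 serializerId = 2;
  optional bytes messageManifest = 3;
}

message Persist {
  required EventPayload event = 1;
}

message Persisted {
  required EventPayload event = 1;
}
```

With this shape, the command serializer only needs its own manifest ("Persist" vs "Persisted"); the event's serializer id and manifest travel inside the payload instead of being packed into the manifest string with a separator.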
I am trying to write a class that implements a Java interface returning the base route of an akka-http application, but internally I want to implement the class in Scala.
I was wondering if it is possible to convert an instance of javadsl.server.Route to an instance of scaladsl.server.Route (or vice versa). If so, how would I do it? I haven't found any mention of this in the akka-http docs.
javadsl.server.Route is actually implemented by an adapter wrapping a scaladsl.server.Route (the concrete class is called RouteAdapter). You can move around between the two by doing
val scalaRoute = get { complete("OK") } // akka.http.scaladsl.server.Route
val javaRoute = RouteAdapter(scalaRoute) // extends akka.http.javadsl.server.Route
val backToScalaRoute = RouteAdapter(scalaRoute).delegate // akka.http.scaladsl.server.Route
Is there anything standardized within the Scala library to support the disposable resource pattern?
I mean something similar to what C# and .NET support, just to mention one example.
For example does official Scala library provide something like this:
trait Disposable { def dispose(): Unit }

class Resource extends Disposable {
  def dispose(): Unit = ()
}

using(new Resource) { r =>
  // work with r
}
Note: I'm aware of the article «Scala finally block closing/flushing resource», but that approach is not integrated into the standard library.
Starting Scala 2.13, the standard library provides a dedicated resource management utility: Using.
You will just have to provide an implicit definition of how to release the resource, using the Releasable trait:
import scala.util.Using
import scala.util.Using.Releasable
case class Resource(field: String)
implicit val releasable: Releasable[Resource] = resource => println(s"closing $resource")
Using(Resource("hello world")) { resource => resource.field.toInt }
// closing Resource(hello world)
// res0: scala.util.Try[Int] = Failure(java.lang.NumberFormatException: For input string: "hello world")
Note that you can place the implicit releasable in the companion object of Resource for clarity.
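For instance, that companion-object placement looks like this (Resource and its field are illustrative names):

```scala
import scala.util.{Try, Using}
import scala.util.Using.Releasable

case class Resource(field: String)

object Resource {
  // found automatically by Using via implicit search in the companion object
  implicit val releasable: Releasable[Resource] =
    resource => println(s"closing $resource")
}

val result: Try[Int] = Using(Resource("42"))(_.field.toInt)
// prints "closing Resource(42)" and yields Success(42)
```

Callers then never need to import the releasable explicitly; any `Using(Resource(...))` picks it up.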
Note that you can also use Java's AutoCloseable instead of Using.Releasable, which means any Java or Scala object implementing AutoCloseable (such as scala.io.Source or java.io.PrintWriter) can be used directly with Using:
import scala.util.Using
case class Resource(field: String) extends AutoCloseable {
def close(): Unit = println(s"closing $this")
}
Using(Resource("hello world")) { resource => resource.field.toInt }
// closing Resource(hello world)
// res0: scala.util.Try[Int] = Failure(java.lang.NumberFormatException: For input string: "hello world")
At this time you will need to look to scala-arm for a common implementation; though, as you mentioned, it is a separate library.
For more information:
This answer at «functional try & catch w/ Scala» links to the Loan Pattern page on the Scala wiki, which has code samples. (I am not re-posting the link because it is subject to change.)
Using a variable in finally block has a few answers showing ways you could write your own.
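If you do end up rolling your own (pre-2.13, or to avoid a dependency), a minimal loan-pattern helper can be sketched as below. Note that, unlike Using, it rethrows exceptions directly rather than returning a Try, and it doesn't handle suppression when close() itself throws:

```scala
// minimal loan pattern: run f with the resource, always closing it afterwards
def using[A <: AutoCloseable, B](resource: A)(f: A => B): B =
  try f(resource)
  finally resource.close()

// example: a resource that records whether it was closed
class Probe extends AutoCloseable {
  var closed = false
  def close(): Unit = closed = true
}

val probe = new Probe
val answer = using(probe)(_ => 21 * 2)
// answer == 42, and probe.closed is now true
```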