How to use an actor inside a spray route in a REST service?

I'm trying to build an event-sourced service with a REST interface using Scala. I'm somewhat new to Scala, although I'm familiar with functional programming (Haskell, at a beginner level).
So far I've built a persistent actor and a view without major problems. The idea behind actors is quite simple, I think.
object Main extends App {
  val system = ActorSystem("HelloSystem")
  val systemActor = system.actorOf(Props[SystemActor], name = "systemactor")
  val trajectoryView = system.actorOf(Props[TrajectoryView], name = "trajectoryView")

  var datas = List()
  val processData = ProcessData(0, List(1,2,3), Coordinates(50, 50))

  implicit val timeout = Timeout(5 seconds)

  def intialDatas(): List[ProcessData] =
    (for (i <- 1 to 3) yield ProcessData(i, List(1,2,3), Coordinates(50 + i, 50 + i)))(collection.breakOut)

  val command = RegisterProcessCommand(3, this.intialDatas())
  val id = Await.result(systemActor ? command, timeout.duration).asInstanceOf[String]
  println(id)

  systemActor ! MoveProcessCommand(4, ProcessData(4, List(3,4,5), Coordinates(54, 54)), id)
  val processes = Await.result(systemActor ? "get", timeout.duration).asInstanceOf[Set[Process]]
  println(processes)

  implicit val json4sFormats = DefaultFormats
  println(write(processes))
  println("*****************")

  systemActor ! "print"

  val getTrajectoryCommand = GetTrajectoryCommand(id)
  Thread.sleep(10000)
  trajectoryView ! "print"
  // val trajectory = Await.result(trajectoryView ? getTrajectoryCommand, timeout.duration).asInstanceOf[ListBuffer[Coordinates]]
  println("******* TRAJECTORY *********")
  trajectoryView ! "print"
  // println(trajectory)

  system.shutdown()
}
I've been able to create a script for playing with the actors I've created.
I've read the tutorials for spray routing, but I've been unable to grasp what exactly I should do to provide a REST interface for those actors.
object Boot extends App {
  implicit val system = ActorSystem("example")
  val systemActor = system.actorOf(Props[SystemActor], name = "systemactor")
  val trajectoryView = system.actorOf(Props[TrajectoryView], name = "trajectoryView")
  val service = system.actorOf(Props[ProcessesService], "processes-rest-service")
  implicit val timeout = Timeout(5 seconds)
  IO(Http) ? Http.Bind(service, interface = "localhost", port = 8080)
}
And a service
class ProcessesService(systemActor: ActorRef) extends Actor with HttpService {
  def actorRefFactory = context
  def receive = runRoute(route)

  val json4sFormats = DefaultFormats
  implicit val timeout = Timeout(5 seconds)

  val route = path("processes") {
    get {
      respondWithMediaType(`application/json`) {
        complete {
          write(Await.result(systemActor ? "get", timeout.duration).asInstanceOf[Set[Process]])
        }
      }
    }
  }
}
I think I need to somehow pass the ActorRef for SystemActor to this ProcessesService, but I'm not sure how. I'm also not sure how I should return a response to the request. I understand that I need to pass the "get" message to SystemActor through the ActorRef and then serialize the answer to JSON, but I don't know how to do that.
I would appreciate any help!

In spray you can complete routes with a Future.
You should be able to do something like
complete { systemActor ? "get" }
JSON serialization is a separate issue.
Your question is a bit vague, but yes, you need to be able to reference the actor within your routes. You could just import the val from Boot where you define it. They're just Scala variables, so where you put them is up to you.
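Putting the two suggestions together, here is a minimal sketch of how this could look with spray (it assumes the SystemActor, Process and json4s setup from the question; exact import paths may differ depending on your spray version): pass the ActorRef to the route actor through its Props arguments and complete the route with a mapped Future instead of blocking with Await.
import akka.actor.{Actor, ActorRef, ActorSystem, Props}
import akka.io.IO
import akka.pattern.ask
import akka.util.Timeout
import org.json4s.DefaultFormats
import org.json4s.native.Serialization.write
import spray.can.Http
import spray.http.MediaTypes.`application/json`
import spray.routing.HttpService
import scala.concurrent.duration._

object Boot extends App {
  implicit val system = ActorSystem("example")
  implicit val timeout = Timeout(5.seconds)

  val systemActor = system.actorOf(Props[SystemActor], name = "systemactor")
  // Pass the already-created ActorRef to the service actor via its Props arguments
  val service = system.actorOf(Props(classOf[ProcessesService], systemActor), "processes-rest-service")

  IO(Http) ? Http.Bind(service, interface = "localhost", port = 8080)
}

class ProcessesService(systemActor: ActorRef) extends Actor with HttpService {
  import context.dispatcher                    // ExecutionContext for mapping the Future
  implicit val timeout = Timeout(5.seconds)
  implicit val json4sFormats = DefaultFormats  // needed by json4s' write

  def actorRefFactory = context
  def receive = runRoute(route)

  val route = path("processes") {
    get {
      respondWithMediaType(`application/json`) {
        complete {
          // Ask instead of Await: spray can complete a route with a Future,
          // so map the reply to its JSON representation and hand it over.
          (systemActor ? "get").mapTo[Set[Process]].map(write(_))
        }
      }
    }
  }
}
Completing with the Future keeps the route non-blocking; the Await calls are only really appropriate in a throwaway script like the Main above.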

Related

Send element to element from a Source in Akka Http

I'm developing a client-server application using Akka HTTP and Akka Streams.
The main idea is that the server must feed the HTTP response from an Akka Streams Source.
The problem is that the server accumulates some elements before sending the first message to the client, whereas I need it to send each element as soon as it is produced by the source.
Code example:
case class Test(id: Long, txt: String, number: Double)

object MyJsonProtocol extends SprayJsonSupport with DefaultJsonProtocol {
  implicit val exampleFormat = jsonFormat3(Test)
}

class BatchIterator(batchSize: Int, numberOfBatches: Int, pause: FiniteDuration) extends Iterator[Array[Test]] {
  val range = Range(0, batchSize * numberOfBatches).toIterator
  val numberOfBatchesIter = Range(0, numberOfBatches).toIterator

  override def hasNext: Boolean = range.hasNext

  override def next(): Array[Test] = {
    println(s"Sleeping for ${pause.toMillis} ms")
    Thread.sleep(pause.toMillis)
    println(s"Taking $batchSize elements")
    Range(0, batchSize).map { _ =>
      val count = range.next()
      Test(count, s"Text$count", count * 0.5)
    }.toArray
  }
}

object Server extends App {
  import MyJsonProtocol._

  implicit val jsonStreamingSupport: JsonEntityStreamingSupport = EntityStreamingSupport.json()
    .withFramingRenderer(
      Flow[ByteString].intersperse(ByteString(System.lineSeparator))
    )

  implicit val system = ActorSystem("api")
  implicit val materializer = ActorMaterializer()
  implicit val executionContext = system.dispatcher

  def fetchExamples(): Source[Array[Test], NotUsed] = Source.fromIterator(() => new BatchIterator(5, 5, 2 seconds))

  val route =
    path("example") {
      complete(fetchExamples)
    }

  val bindingFuture = Http().bindAndHandle(route, "localhost", 9090)
  println("Server started at localhost:9090")

  StdIn.readLine()
  bindingFuture.flatMap(_.unbind()).onComplete(_ => system.terminate())
}
Then, if I execute:
curl --no-buffer localhost:9090/example
I get all the elements at the same time instead of receiving one every 2 seconds.
Any idea how I can "force" the server to send each element as soon as it comes out of the source?
Finally, I've found the solution. The problem was that the source is synchronous, so the fix is simply to mark it as asynchronous:
complete(fetchExamples.async)
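For context, here is a minimal sketch of where the fix sits in the question's route (fetchExamples is the method defined above); the comment is my reading of why .async helps rather than something stated in the original answer.
// .async inserts an asynchronous boundary after the source, so the blocking
// BatchIterator runs on its own actor and each batch can be pushed downstream,
// rendered and flushed to the client as soon as it is produced.
val route =
  path("example") {
    complete(fetchExamples.async)
  }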

File Upload and processing using akka-http websockets

I'm using some sample Scala code to build a server that receives a file over a websocket, stores it temporarily, runs a bash script on it, and then returns stdout as a TextMessage.
The sample code was taken from this github project.
I edited the code slightly within echoService so that it runs another function that processes the temporary file.
object WebServer {
  def main(args: Array[String]) {
    implicit val actorSystem = ActorSystem("akka-system")
    implicit val flowMaterializer = ActorMaterializer()

    val interface = "localhost"
    val port = 3000

    import Directives._

    val route = get {
      pathEndOrSingleSlash {
        complete("Welcome to websocket server")
      }
    } ~
      path("upload") {
        handleWebSocketMessages(echoService)
      }

    val binding = Http().bindAndHandle(route, interface, port)
    println(s"Server is now online at http://$interface:$port\nPress RETURN to stop...")
    StdIn.readLine()

    binding.flatMap(_.unbind()).onComplete(_ => actorSystem.shutdown())
    println("Server is down...")
  }

  implicit val actorSystem = ActorSystem("akka-system")
  implicit val flowMaterializer = ActorMaterializer()

  val echoService: Flow[Message, Message, _] = Flow[Message].mapConcat {
    case BinaryMessage.Strict(msg) => {
      val decoded: Array[Byte] = msg.toArray
      val imgOutFile = new File("/tmp/" + "filename")
      val fileOuputStream = new FileOutputStream(imgOutFile)
      fileOuputStream.write(decoded)
      fileOuputStream.close()
      TextMessage(analyze(imgOutFile))
    }
    case BinaryMessage.Streamed(stream) => {
      stream
        .limit(Int.MaxValue)               // Max frames we are willing to wait for
        .completionTimeout(50 seconds)     // Max time until last frame
        .runFold(ByteString(""))(_ ++ _)   // Merges the frames
        .flatMap { (msg: ByteString) =>
          val decoded: Array[Byte] = msg.toArray
          val imgOutFile = new File("/tmp/" + "filename")
          val fileOuputStream = new FileOutputStream(imgOutFile)
          fileOuputStream.write(decoded)
          fileOuputStream.close()
          Future(Source.single(""))
        }
      TextMessage(analyze(imgOutFile))
    }
  }

  private def analyze(imgfile: File): String = {
    val p = Runtime.getRuntime.exec(Array("./run-vision.sh", imgfile.toString))
    val br = new BufferedReader(new InputStreamReader(p.getInputStream, StandardCharsets.UTF_8))
    try {
      val result = Stream
        .continually(br.readLine())
        .takeWhile(_ ne null)
        .mkString
      result
    } finally {
      br.close()
    }
  }
}
During testing with Dark WebSocket Terminal, the BinaryMessage.Strict case works fine.
Problem: however, the BinaryMessage.Streamed case doesn't finish writing the file before running the analyze function, resulting in a blank response from the server.
I'm trying to wrap my head around how Futures are being used here with the Flows in akka-http, but I'm not having much luck beyond working through the official documentation.
Currently, .mapAsync seems promising, or basically finding a way to chain Futures.
I'd really appreciate some insight.
Yes, mapAsync will help you on this occasion. It is a combinator that executes Futures (potentially in parallel) in your stream and presents their results on the output side.
In your case, to make things homogeneous and keep the type checker happy, you'll need to wrap the result of the Strict case in a Future.successful.
A quick fix for your code could be:
val echoService: Flow[Message, Message, _] = Flow[Message].mapAsync(parallelism = 5) {
  case BinaryMessage.Strict(msg) => {
    val decoded: Array[Byte] = msg.toArray
    val imgOutFile = new File("/tmp/" + "filename")
    val fileOuputStream = new FileOutputStream(imgOutFile)
    fileOuputStream.write(decoded)
    fileOuputStream.close()
    Future.successful(TextMessage(analyze(imgOutFile)))
  }
  case BinaryMessage.Streamed(stream) =>
    stream
      .limit(Int.MaxValue)               // Max frames we are willing to wait for
      .completionTimeout(50 seconds)     // Max time until last frame
      .runFold(ByteString(""))(_ ++ _)   // Merges the frames
      .flatMap { (msg: ByteString) =>
        val decoded: Array[Byte] = msg.toArray
        val imgOutFile = new File("/tmp/" + "filename")
        val fileOuputStream = new FileOutputStream(imgOutFile)
        fileOuputStream.write(decoded)
        fileOuputStream.close()
        Future.successful(TextMessage(analyze(imgOutFile)))
      }
}

500 Internal Server Error in Akka Scala server

This is my code for the server, written using the Akka framework:
case class Sentence(data: String)
case class RawTriples(triples: List[String])

trait Protocols extends DefaultJsonProtocol {
  implicit val sentenceRequestFormat = jsonFormat1(Sentence)
  implicit val rawTriplesFormat = jsonFormat1(RawTriples)
}

trait Service extends Protocols {
  implicit val system: ActorSystem
  implicit def executor: ExecutionContextExecutor
  implicit val materializer: Materializer

  val openie = new OpenIE

  def config: Config
  val logger: LoggingAdapter

  lazy val ipApiConnectionFlow: Flow[HttpRequest, HttpResponse, Any] =
    Http().outgoingConnection(config.getString("services.ip-api.host"), config.getInt("services.ip-api.port"))

  def ipApiRequest(request: HttpRequest): Future[HttpResponse] =
    Source.single(request).via(ipApiConnectionFlow).runWith(Sink.head)

  val routes = {
    logRequestResult("akka-http-microservice") {
      pathPrefix("openie") {
        post {
          decodeRequest {
            entity(as[Sentence]) { sentence =>
              complete {
                var rawTriples = openie.extract(sentence.data)
                val resp: MutableList[String] = MutableList()
                for (rtrip <- rawTriples) {
                  resp += (rtrip.toString())
                }
                val response: List[String] = resp.toList
                println(response)
                response
              }
            }
          }
        }
      }
    }
  }
}

object AkkaHttpMicroservice extends App with Service {
  override implicit val system = ActorSystem()
  override implicit val executor = system.dispatcher
  override implicit val materializer = ActorMaterializer()

  override val config = ConfigFactory.load()
  override val logger = Logging(system, getClass)

  Http().bindAndHandle(routes, config.getString("http.interface"), config.getInt("http.port"))
}
The server accepts a POST request containing a sentence and returns a JSON array in response. It works fine, but if I make requests too frequently from parallelized code, it returns a 500 Internal Server Error. I wanted to know whether there is any parameter I can set on the server to avoid that (the number of threads ready to accept requests, etc.).
In log files, the error is logged as:
[ERROR] [05/31/2017 11:48:38.110] [default-akka.actor.default-dispatcher-6] [akka.actor.ActorSystemImpl(default)]
Error during processing of request: 'null'. Completing with 500 Internal Server Error response.
The doc on the bindAndHandle method shows what you want:
/**
* Convenience method which starts a new HTTP server at the given endpoint and uses the given `handler`
* [[akka.stream.scaladsl.Flow]] for processing all incoming connections.
*
* The number of concurrently accepted connections can be configured by overriding
* the `akka.http.server.max-connections` setting. Please see the documentation in the reference.conf for more
* information about what kind of guarantees to expect.
*
* To configure additional settings for a server started using this method,
* use the `akka.http.server` config section or pass in a [[akka.http.scaladsl.settings.ServerSettings]] explicitly.
*/
akka.http.server.max-connections is probably what you want. As the doc suggests, you can also dig deeper into the akka.http.server config section.
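If you prefer to keep the tuning next to the binding code rather than in application.conf, here is a sketch of passing an explicit ServerSettings to bindAndHandle, as the quoted doc suggests; the value 2048 is purely illustrative, and routes/config are the names from the question.
import akka.http.scaladsl.Http
import akka.http.scaladsl.settings.ServerSettings
import com.typesafe.config.ConfigFactory

// Raise the connection limit (akka.http.server.max-connections defaults to 1024)
// and hand the resulting settings to the server binding explicitly.
val tunedConfig = ConfigFactory
  .parseString("akka.http.server.max-connections = 2048")
  .withFallback(ConfigFactory.load())

Http().bindAndHandle(
  routes,
  config.getString("http.interface"),
  config.getInt("http.port"),
  settings = ServerSettings(tunedConfig)
)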
Add the following in your application.conf file:
akka.http {
  server {
    server-header = akka-http/${akka.http.version}
    idle-timeout = infinite
    request-timeout = infinite
  }
}

How to send an actor a tell message when a URL is accessed?

I am trying to connect Akka HTTP with actors. I have a simple actor which receives "hello" and sends back "Hello World".
class TestActor extends Actor {
  def receive = {
    case "hello" => {
      sender ! "Hello World"
    }
  }
}
I have defined the following route:
object Main extends App {
  val route1: Route =
    get {
      path("hello") {
        complete {
          "This is hello"
        }
      }
    }

  implicit val system = ActorSystem("h")
  implicit val materializer = ActorMaterializer()

  Http().bindAndHandle(route1, "localhost", 8185)
}
I want to send a tell message to the TestActor when /hello is accessed in the URL and display the message "Hello World" as a response. How can I do this?
You have two options. Option 1 is to use the "ask" pattern. You can "ask" the actor as shown below. "Ask" returns a Future, which you can map over and chain further operations on, and you can also complete the request with that Future. The caveat is that it requires a timeout: you have to configure one for this to work, which can become tedious to maintain in a larger project.
implicit val timeout: Timeout = 2 seconds
val responseFuture = (actor ? message).mapTo[String] // This will return a Future[String]
complete(responseFuture)
Option 2 is to use the "tell" pattern. This is much preferred over the "ask" pattern. You can read about this here. You need to pass the request context to a new actor and complete the request from that actor. You would do something like below.
val testActor = actorSystem.actorOf(Props(classOf[TestActor], reqContext), "test-actor")
And in the testActor, you will complete the request. You can see here and here to get more information on this pattern.
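To make the "tell" option concrete, here is a minimal sketch for akka-http. Because an akka-http Route has to return a Future[RouteResult], the sketch also hands the per-request actor a Promise to fulfil; that Promise wiring is my addition rather than part of the answer above, and RequestHandlerActor/tellRoute are illustrative names.
import akka.actor.{Actor, ActorSystem, Props}
import akka.http.scaladsl.server.Directives._
import akka.http.scaladsl.server.{RequestContext, Route, RouteResult}
import scala.concurrent.Promise

// Per-request actor: it owns the RequestContext of exactly one request and
// fulfils the Promise with the result of completing that request.
class RequestHandlerActor(ctx: RequestContext, result: Promise[RouteResult]) extends Actor {
  def receive = {
    case "hello" =>
      result.completeWith(ctx.complete("Hello World"))
      context.stop(self) // this actor's single job is done
  }
}

// In the route: spawn a fresh handler per request, tell it the message,
// and return the Future that the handler will eventually fulfil.
def tellRoute(implicit system: ActorSystem): Route =
  path("hello") {
    get { ctx =>
      val result = Promise[RouteResult]()
      val handler = system.actorOf(Props(new RequestHandlerActor(ctx, result)))
      handler ! "hello"
      result.future
    }
  }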
Step 1 - Create the actor instance.
Step 2 - Get a reference to it.
Step 3 - Send it the message.
class TestActor extends Actor {
  def receive = {
    case "hello" => {
      sender() ! "Hello World"
    }
  }
}

object TestActor {
  def props: Props = Props(classOf[TestActor])
}
Now...
import akka.pattern.ask
import akka.util.Timeout
import scala.concurrent.duration._
import scala.util.{Failure, Success}

object Main extends App {
  val actorSystem = ActorSystem("actor-system")
  implicit val implicitActorSystem = actorSystem
  implicit val materializer = ActorMaterializer()
  implicit val timeout: Timeout = 2 seconds

  // actually create the actor
  val testActor = actorSystem.actorOf(TestActor.props, "test-actor")

  val route1: Route =
    get {
      path("hello") {
        // get the actor's reference using selection
        val testActorSelection = actorSystem.actorSelection("/user/test-actor")
        // send using the selection ...
        // val responseFuture = testActorSelection ? "hello"
        // ... or send using the "val testActor" reference we already have
        val responseFuture = (testActor ? "hello").mapTo[String]

        onComplete(responseFuture) {
          case Success(message) => complete(message)
          case Failure(ex)      => complete(ex.getMessage)
        }
      }
    }

  Http().bindAndHandle(route1, "localhost", 8185)
}

How to make HTTP requests with akka-stream for 10K requests

I built a server with akka-http and akka-stream, but it loses some requests once there are 2K+ requests. Am I missing something, or is my understanding of Akka wrong?
This is my code:
implicit val actorRef = ActorSystem("system", testConf)
implicit val materializer = ActorMaterializer(ActorMaterializerSettings(actorRef).withInputBuffer(initialSize = 4, maxSize = 16))
implicit val requestTimeout = Timeout(8 seconds)

def response(req: HttpRequest): Future[Response] = {
  val connectionFlow: Flow[HttpRequest, HttpResponse, Future[Http.OutgoingConnection]] =
    Http().outgoingConnection(host = "url").async
  for {
    res  <- Source.single(req.copy(uri = s"${req.uri.path}?${req.uri.rawQueryString.get}")).via(connectionFlow).runWith(Sink.head)
    data <- res.entity.toStrict(5 second)
  } yield (data.getData().decodeString("UTF-8"), res.status.intValue())
}
Thank you.
Most likely there were either timeouts or server-side errors in the first part of your for-comprehension, so you only got the successful responses in res.
I recommend creating a flow/graph that processes your requests with a withSupervisionStrategy on your materializer, so you can see what exactly went wrong. The details of the implementation depend on your business logic.
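For illustration, here is a minimal sketch of attaching a supervision strategy to the materializer from the question, so that failures inside the stream are at least logged instead of silently turning into missing responses; logging plus Resume is just one possible policy, not a recommendation for your business logic.
import akka.actor.ActorSystem
import akka.stream.{ActorMaterializer, ActorMaterializerSettings, Supervision}

implicit val actorRef = ActorSystem("system")

// Log every element that fails inside the stream, then decide what to do:
// Resume drops the failing element and keeps the stream alive, Stop would fail it.
val decider: Supervision.Decider = { e =>
  actorRef.log.error(e, "Request failed inside the stream")
  Supervision.Resume
}

implicit val materializer = ActorMaterializer(
  ActorMaterializerSettings(actorRef)
    .withInputBuffer(initialSize = 4, maxSize = 16)
    .withSupervisionStrategy(decider)
)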