I am working with the spray api. I have the following code:
import akka.actor.ActorSystem
import spray.routing.SimpleRoutingApp
import spray.json.DefaultJsonProtocol._
object Server1 extends App with SimpleRoutingApp {
  implicit val actorSystem = ActorSystem()

  startServer(interface = "localhost", port = 8080) {
    println("Listening...")
    get {
      println("incoming..")
      path("state") {
        complete {
          "in the complete block"
        }
      }
    }
  }
}
It gives a single response on the API: it prints "in the complete block" when I call it from a web browser. Can I make it iterative, meaning that I use a variable and send its value in the complete block, then change the value of that variable and send its new value in the complete block on later requests?
You mean something like this:
var state = 0

get {
  println("incoming..")
  path("state") {
    complete {
      state = state + 1
      s"in the complete block ${state}"
    }
  }
}
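Note that the complete block can be evaluated concurrently for different requests, so a plain var is not thread-safe. A minimal sketch of the same idea using AtomicInteger (illustrative, not part of the original answer):

import java.util.concurrent.atomic.AtomicInteger

val state = new AtomicInteger(0)

get {
  path("state") {
    complete {
      // incrementAndGet is atomic, so concurrent requests each observe a distinct value
      s"in the complete block ${state.incrementAndGet()}"
    }
  }
}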
I am building an application using Tumblr's new Colossus framework (http://tumblr.github.io/colossus/). There is still limited documentation on it (and the fact that I'm still very new to Akka doesn't help), so I was wondering if someone could chime in on whether my approach is correct.
The application is simple and consists of two key components:
A thin web service layer that will queue tasks into Redis
A background worker which will poll the same Redis instance for available tasks and process them as they become available
I made a simple example to demonstrate that my concurrency model will work (and it does), which I posted below. However, I would like to make sure that there is not a more idiomatic way to do this.
import colossus.IOSystem
import colossus.protocols.http.Http
import colossus.protocols.http.HttpMethod.Get
import colossus.protocols.http.UrlParsing._
import colossus.service.{Callback, Service}
import colossus.task.Task

object QueueProcessor {
  implicit val io = IOSystem() // Create separate IOSystem for worker

  Task { ctx =>
    while (true) {
      // Below code is for testing purposes only. This is where the Redis loop will live,
      // and will use a blocking call to get the next available task
      Thread.sleep(5000)
      println("task iteration")
    }
  }

  def ping = println("starting") // Method to launch this processor
}
object Main extends App {
  implicit val io = IOSystem() // Primary IOSystem for the web service

  QueueProcessor.ping // Launch worker

  Service.serve[Http]("app", 8080) { ctx =>
    ctx.handle { conn =>
      conn.become {
        case req @ Get on Root => Callback.successful(req.ok("Here"))
        // The methods to add tasks to the queue will live here
      }
    }
  }
}
I tested the above model and it works. The background loop keeps running while the service happily accepts requests. But I think there might be a better way to do this with workers (I found nothing in the documentation), or perhaps with Akka Streams?
I got it working with something that seems semi-idiomatic to me. However, new answers and feedback are still welcome!
import akka.actor.Actor
import scala.concurrent.{Future, blocking}

class Processor extends Actor {
  import scala.concurrent.ExecutionContext.Implicits.global

  override def receive = {
    case "start" => self ! "next"
    case "next" =>
      Future {
        blocking {
          // Blocking call here to wait on Redis (BRPOP/BLPOP)
          self ! "next"
        }
      }
  }
}
object Main extends App {
  implicit val io = IOSystem()

  val processor = io.actorSystem.actorOf(Props[Processor])
  processor ! "start"

  Service.serve[Http]("app", 8080) { ctx =>
    ctx.handle { conn =>
      conn.become {
        // Queue here
        case req @ Get on Root => Callback.successful(req.ok("Here\n"))
      }
    }
  }
}
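For what it's worth, the blocking wrapper hints to the default ExecutionContext that the enclosed call will block, so it can grow its thread pool instead of starving other futures. Below is a minimal, self-contained sketch of the same polling pattern; PollingActor and fetchNextTask are illustrative names, with the function standing in for the blocking Redis BRPOP/BLPOP call:

import akka.actor.{Actor, ActorSystem, Props}
import scala.concurrent.{Future, blocking}
import scala.concurrent.ExecutionContext.Implicits.global

// fetchNextTask is a placeholder for the blocking Redis call (e.g. BRPOP)
class PollingActor(fetchNextTask: () => Option[String]) extends Actor {
  override def receive = {
    case "start" | "next" =>
      Future {
        blocking {
          fetchNextTask().foreach(task => println(s"processing $task"))
        }
        self ! "next" // schedule the next poll once the blocking call returns
      }
  }
}

// Usage: poll a dummy source that "blocks" for two seconds per task
object PollingDemo extends App {
  val system = ActorSystem("polling-demo")
  val poller = system.actorOf(Props(new PollingActor(() => {
    Thread.sleep(2000)
    Some("dummy-task")
  })))
  poller ! "start"
}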
I am making a Play web-socket app. When a client connects, I want to send a welcome message.
The code I use is below:
package controllers

import play.api._
import play.api.mvc._
import play.api.libs.iteratee.Concurrent
import play.api.libs.iteratee.Iteratee
import play.api.libs.concurrent.Execution.Implicits.defaultContext

object Test extends Controller {
  def index = WebSocket.using[String] { _ =>
    val (out, channel) = Concurrent.broadcast[String]
    channel.push("Welcome to MyWebSocket")
    val in = Iteratee.foreach[String] {
      case any => channel.push(any)
    }
    (in, out)
  }
}
The code works fine when a client sends a message and the server has to respond to it. However, the initial welcome message Welcome to MyWebSocket is not sent. How can I fix this code?
[EDIT]
I kind of figured out the problem, but not a solution yet. The problem probably occurs because the websocket is not yet initialized when the welcome message is being pushed. I modified the code and replaced:
channel.push("Welcome to MyWebSocket")
with
val a = scala.concurrent.Future {
  Thread.sleep(1000)
  channel.push("Welcome to MyWebSocket")
}
After this I get the expected result (the welcome message is received by the client). I think using the above approach (Thread.sleep and Future) is not the right way to solve this problem, so other solutions are welcome. It could also be a problem with the client-side code, which takes a while to initialize the socket. I used Firefox and an echo web-socket test for the client.
You can use the WebSocket.acceptWithActor helper method (have a look at this) and in the actor body do something like:
out ! "Welcome to MyWebSocket"
It works nicely.
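For reference, here is a minimal sketch of that approach; it assumes Play 2.3+, where acceptWithActor is available, and ClientActor is just an illustrative name:

import akka.actor._
import play.api.mvc._
import play.api.Play.current

object Test extends Controller {
  def index = WebSocket.acceptWithActor[String, String] { request => out =>
    Props(new ClientActor(out))
  }
}

class ClientActor(out: ActorRef) extends Actor {
  // By the time the actor starts, the socket is established,
  // so the welcome message reaches the client immediately.
  override def preStart(): Unit = out ! "Welcome to MyWebSocket"

  def receive = {
    case msg: String => out ! msg // echo back, like the Iteratee version
  }
}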
I need to make some time-consuming calculations on the server side (such as DB querying and data analysis), and the results need to be printed in the browser. For this purpose I send a Future result from the server to the client (to load the web page immediately and gradually print future results from the server). For example, on the server side:
import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global

def futureResult = Future {
  val cc = ConsumingCalculations()
  "some result"
}
on the client side
@import scala.concurrent.ExecutionContext.Implicits.global

@main {
  @futureResult.onSuccess { case res =>
    @println("This line is printed in console: " + res)
    <div>Any html code is NOT printed in browser</div>
  }
  Future result is NOT posted
}
In the server console we have: "This line is printed in console: some result"
But in the browser we have only: "Future result is NOT posted"
Play 2.1 and Scala 2.10 are currently used. What may be wrong? Are there any ideas?
A future cannot be sent to the client side; it must be resolved on the server side before being displayed to the client.
The classic example is to map the result of your future in your controller:
def myAction = Action {
  Async {
    futureResult.map(result =>
      Ok(views.html.myView(result))
    )
  }
}
And in your template, use the result, not the future.
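For completeness, the template (myView.scala.html is an assumed name) then takes the already-resolved value as an ordinary parameter:

@(result: String)

@main {
  <div>This is printed in the browser: @result</div>
}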
I'd like to be able to send back a response to the client before I do my logging/cleanup for a request.
In Play 1.x this was possible with the @Finally annotation. I've read through some posts that say those annotations were replaced by action composition, but I'm unclear how to emulate the @Finally annotation using it.
It seems to me that the response will only be returned after all the logic in my custom actions has completed.
Have I missed something, or is there no way to do this in Play 2.0?
[EDIT FOR CLARITY]
In other words, I want to be able to run logic after I receive a request and send a response. So I'd like to be able to construct a timeline of the form:
Client sends a request to my server
Server sends back a 200 response, which the client receives
The server does additional processing, logging, etc
In Play 1.x I believe I could annotate my additional processing logic with @Finally and have it work like I want it to.
Action composition is not sufficient to do the job, but Action composition + Future, or Action composition + Actors are good ways to achieve this.
Action Composition + Future
Generate your Response, launch your logging/processing in an async context and, in parallel, send the result.
import scala.concurrent.ExecutionContext.Implicits.global // needed by concurrent.future

def LoggedAction(f: Request[AnyContent] => Result) = Action { request =>
  val result = f(request)
  concurrent.future(myLogAction(request, result))
  result
}
Action composition + Actors
It's a cleaner way to achieve this. As in the previous case, generate your response, send a logging/processing event to your actor(s) and, in parallel, send the result.
import play.api._
import play.api.mvc._
import play.libs._
import akka.actor._

object Application extends Controller with Finally {
  def index = LoggedAction { r =>
    Ok(views.html.index("Your new application is ready."))
  }
}

trait Finally {
  self: Controller =>

  lazy val logActor = Akka.system.actorOf(Props(new LogActor), name = "logActor")

  def LoggedAction(f: Request[AnyContent] => Result) = Action { request =>
    val result = f(request) // Generate response
    logActor ! LogRequest(request) // Send log event to LogActor
    println("-> now send page to client")
    result
  }

  case class LogRequest(request: Request[AnyContent])

  class LogActor extends Actor {
    def receive = {
      case LogRequest(req) => {
        println(req.host)
        // ....
      }
    }
  }
}
// Console
-> now send page to client
127.0.0.1:9000
So I want to write some network code that appears to be blocking, without actually blocking a thread. I'm going to send some data out on the wire, and have a 'queue' of responses that will come back over the network. I wrote up a very simple proof of concept, inspired by the producer/consumer example on the actor tutorial found here: http://www.scala-lang.org/node/242
The thing is, using receive appears to take up a thread, so I'm wondering if there's any way to not take up a thread and still get the 'blocking feel'. Here's my code sample:
import scala.actors.Actor
import scala.actors.Actor._

case class Request(val s: String)
case class Message(val s: String)

class Connection {
  private val act: Actor = actor {
    loop {
      react {
        case m: Message => receive { case r: Request => reply { m } }
      }
    }
  }

  def getNextResponse(): Message = {
    (act !? new Request("get")).asInstanceOf[Message]
  }

  // this would call the network layer and send something over the wire
  def doSomething() {
    generateResponse()
  }

  // this is simulating the network layer getting some data back
  // and sending it to the appropriate Connection object
  private def generateResponse() {
    act ! new Message("someData")
    act ! new Message("moreData")
    act ! new Message("even more data")
  }
}

object runner extends Application {
  val conn = new Connection()
  conn.doSomething()
  println(conn.getNextResponse())
  println(conn.getNextResponse())
  println(conn.getNextResponse())
}
Is there a way to do this without using the receive, and thereby making it threadless?
You could directly rely on react which should block and release the thread:
class Connection {
  private val act: Actor = actor {
    loop {
      react {
        case r: Request => reply { r }
      }
    }
  }
  [...]
I expect that you can use react rather than receive and not have actors take up threads the way receive does. There is a thread on this issue at receive vs react.
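To sketch how the original semantics (buffering Messages and answering Requests) could be kept without the nested receive, here is one possible react-only version; it is not from the original answers, and it assumes the old (now deprecated) scala.actors library:

import scala.actors.Actor
import scala.actors.Actor._
import scala.actors.OutputChannel
import scala.collection.mutable.Queue

class Connection {
  private val act: Actor = actor {
    val pendingMessages = Queue[Message]()
    val pendingRequests = Queue[OutputChannel[Any]]()
    loop {
      react {
        case m: Message =>
          // Hand the message to a waiting requester, or buffer it
          if (pendingRequests.nonEmpty) pendingRequests.dequeue() ! m
          else pendingMessages.enqueue(m)
        case r: Request =>
          // Answer immediately if a message is buffered, otherwise remember the caller
          if (pendingMessages.nonEmpty) sender ! pendingMessages.dequeue()
          else pendingRequests.enqueue(sender)
      }
    }
  }

  def getNextResponse(): Message =
    (act !? new Request("get")).asInstanceOf[Message]
}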