ZIO, Release resources after execution - scala

I am playing with ZIO and built a simple application that gets content via HTTP:
for {
  options <- Options.parse(args)
  http     = HttpClient(args)
  content <- Download.execute(args.resource).provide(http)
} yield ()
It does the job but the client is backed by Play StandaloneWsClient and I would like to close it and terminate the actor system as described in the documentation: https://github.com/playframework/play-ws#scala-1
So I created a finaliser method, but it seems that it has no effect:
// ...
content <- Download.execute(args.resource).ensuring(http.disconnect()).provide(http)
// ...

class HttpClient {
  // ...
  def disconnect(): UIO[Unit] = ZIO.effectTotal {
    client.close()
    system.terminate()
  }
}
How can I instruct ZIO to call a finaliser method to free up my resources?
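For reference, a common way to guarantee that such a finaliser runs in ZIO 1.x (where ZIO.effectTotal lives) is to model the client as a managed resource. This is only a minimal sketch reusing the HttpClient and Download names from the question, not an authoritative answer:

import zio._

// Sketch only: acquire the client, guarantee disconnect() runs afterwards,
// and provide the client to the download effect while it is in use.
val managedHttp: ZManaged[Any, Throwable, HttpClient] =
  ZManaged.make(ZIO.effect(HttpClient(args)))(_.disconnect())

val program =
  managedHttp.use { http =>
    Download.execute(args.resource).provide(http)
  }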

Related

Colossus Background Task

I am building an application using Tumblr's new Colossus framework (http://tumblr.github.io/colossus/). There is still limited documentation on it (and the fact that I'm still very new to Akka doesn't help), so I was wondering if someone could chime in on whether my approach is correct.
The application is simple and consists of two key components:
A thin web service layer that will queue tasks into Redis
A background worker which will poll the same Redis instance for available tasks and process them as they become available
I made a simple example to demonstrate that my concurrency model will work (and it does), which I posted below. However, I would like to make sure that there is not a more idiomatic way to do this.
import colossus.IOSystem
import colossus.protocols.http.Http
import colossus.protocols.http.HttpMethod.Get
import colossus.protocols.http.UrlParsing._
import colossus.service.{Callback, Service}
import colossus.task.Task

object QueueProcessor {
  implicit val io = IOSystem() // Create separate IOSystem for worker

  Task { ctx =>
    while (true) {
      // Below code is for testing purposes only. This is where the Redis loop will live,
      // and will use a blocking call to get the next available task
      Thread.sleep(5000)
      println("task iteration")
    }
  }

  def ping = println("starting") // Method to launch this processor
}
object Main extends App {
  implicit val io = IOSystem() // Primary IOSystem for the web service

  QueueProcessor.ping // Launch worker

  Service.serve[Http]("app", 8080) { ctx =>
    ctx.handle { conn =>
      conn.become {
        case req @ Get on Root => Callback.successful(req.ok("Here"))
        // The methods to add tasks to the queue will live here
      }
    }
  }
}
I tested the above model and it works. The background loop continues running while the service happily accepts requests. But I think that there might be a better way to do this with workers (nothing found in the documentation), or perhaps Akka Streams?
I got it working with something that seems semi-idiomatic to me. However, new answers & feedback are still welcome!
import scala.concurrent.{Future, blocking}
import scala.concurrent.ExecutionContext.Implicits.global
import akka.actor.Actor

class Processor extends Actor {
  override def receive = {
    case "start" => self ! "next"
    case "next" =>
      Future {
        blocking {
          // Blocking call here to wait on Redis (BRPOP/BLPOP)
          self ! "next"
        }
      }
  }
}
import akka.actor.Props

object Main extends App {
  implicit val io = IOSystem()

  val processor = io.actorSystem.actorOf(Props[Processor])
  processor ! "start"

  Service.serve[Http]("app", 8080) { ctx =>
    ctx.handle { conn =>
      conn.become {
        // Queue here
        case req @ Get on Root => Callback.successful(req.ok("Here\n"))
      }
    }
  }
}
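If the blocking Redis call can throw, a small variation of the same pattern (an illustrative sketch, not part of the original answer; ResilientProcessor and fetchAndProcess are hypothetical names) reschedules the next poll in onComplete, so the loop survives a failed iteration:

import scala.concurrent.{Future, blocking}
import scala.concurrent.ExecutionContext.Implicits.global
import akka.actor.Actor

class ResilientProcessor extends Actor {
  // Hypothetical placeholder for the blocking Redis call (BRPOP/BLPOP) plus task handling
  def fetchAndProcess(): Unit = ()

  override def receive = {
    case "start" => self ! "next"
    case "next" =>
      Future {
        blocking(fetchAndProcess())
      }.onComplete(_ => self ! "next") // reschedule the poll even if this iteration failed
  }
}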

Execute some logic asynchronously in spray routing

Here is my simple routing application:
object Main extends App with SimpleRoutingApp {
  implicit val system = ActorSystem("my-system")

  startServer(interface = "0.0.0.0", port = System.getenv("PORT").toInt) {
    import format.UsageJsonFormat._
    import spray.httpx.SprayJsonSupport._

    path("") {
      get {
        complete("OK")
      }
    } ~
    path("meter" / JavaUUID) { meterUUID =>
      pathEnd {
        post {
          entity(as[Usage]) { usage =>
            // execute some logic asynchronously
            // do not wait for the result
            complete("OK")
          }
        }
      }
    }
  }
}
What I want to achieve is to execute some logic asynchronously in my path directive, not wait for the result, and return HTTP 200 OK immediately.
I am quite new to Scala and spray and am wondering if there is a spray way to solve this specific problem. Otherwise I would go in the direction of creating an Actor for every request and letting it do the job. Please advise.
There's no special way of handling this in spray: simply fire your async action (a method returning a Future, a message sent to an actor, whatever) and call complete right after.
def doStuffAsync() = Future {
  // literally anything
}

path("meter" / JavaUUID) { meterUUID =>
  pathEnd {
    post {
      entity(as[Usage]) { usage =>
        doStuffAsync()
        complete("OK")
      }
    }
  }
}
Conversely, if you need to wait for an async action to complete before sending the response, you can use spray-specific directives for working with Futures or Actors.
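For example, a minimal sketch using the onComplete directive (assuming the same route context as above, with the spray-routing directives and an implicit ExecutionContext in scope, and the doStuffAsync() method returning a Future):

import scala.util.{Failure, Success}

path("meter" / JavaUUID) { meterUUID =>
  pathEnd {
    post {
      entity(as[Usage]) { usage =>
        onComplete(doStuffAsync()) {
          case Success(_)  => complete("OK")
          case Failure(ex) => failWith(ex) // let the exception handler turn this into a 500
        }
      }
    }
  }
}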

concurrent requests limit of Twitter-Finagle

I create a Thrift server using Finagle like this:
val server = Thrift.serveIface(bindAddr(), new MyService[Future] {
  def myRPCFuction() {}
})
But I found that the maximum number of concurrent requests is five (why 5? when there are more than 5, the server just ignores the excess ones). I looked through the Finagle docs really hard (http://twitter.github.io/finagle/guide/Protocols.html#thrift-and-scrooge) but found no hint about configuring the max request limit.
How do I configure the maximum number of concurrent requests in Finagle? Thanks.
I've solved this problem by myself and I share it here to help others who may run into the same case. I was a Thrift user before, and in plain Thrift you return values to the calling client simply by returning from the RPC function. In Finagle, you only return a value to the client when you complete the Future (e.g. with Future.value()). And when you use Finagle you should work fully asynchronously; that is to say, you had better not sleep or call another RPC synchronously inside the RPC function.
/* THIS is BAD */
val server = Thrift.serveIface(bindAddr(), new MyService[Future] {
  def myRPCFuction() {
    val rpcFuture = rpcClient.callOtherRpc() // call other rpc which return a future
    val result = Await.result(rpcFuture, TwitterDuration(rpcTimeoutSec() * 1000, MILLISECONDS))
    Future.value(result)
  }
})
/* This is GOOD */
val server = Thrift.serveIface(bindAddr(), new MyService[Future] {
  def myRPCFuction() {
    val rpcFuture = rpcClient.callOtherRpc() // call other rpc which return a future
    rpcFuture onSuccess { result =>
      // do your job when it succeeds (you can return to the client using Future.value)
    }
    rpcFuture onFailure { ex =>
      // do your job when it fails
    }
  }
})
Then you can get satisfactory concurrency. Hope it helps others who have the same issue.
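To make the contrast concrete, here is an illustrative sketch (my addition, not part of the original answer) of returning the composed future directly instead of registering callbacks; rpcClient.callOtherRpc() comes from the code above, and the String result type is an assumption:

import com.twitter.util.Future

// Sketch: compose the downstream call and hand the resulting Future back to Finagle,
// so no Finagle worker thread is ever blocked waiting for it.
def myRPCFuction(): Future[String] =
  rpcClient.callOtherRpc().map { result =>
    // post-process the downstream result here
    result.toString
  }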

spray and actor non-deterministic tests

Hello,
first of all I would like to apologize for my English :)
akka=2.3.6
spray=1.3.2
scalatest=2.2.1
I encountered strange behavior when testing routes that ask actors in a handleWith directive.
I have a route with a handleWith directive:
pathPrefix("firstPath") {
pathEnd {
get(complete("Hello from this api")) ~
post(handleWith { (data: Data) =>{ println("receiving data")
(dataCalculator ? data).collect {
case Success(_) =>
Right(Created -> "")
case throwable: MyInternalValidatationException =>
Left(BadRequest -> s"""{"${throwable.subject}" : "${throwable.cause}"}""")
}
}})
}
}
and a simple actor which always responds when it receives a Data object; its receive block is wrapped in LoggingReceive, so I should see logs when a message is received by the actor.
And I test it using (I think) simple code:
class SampleStarngeTest extends WordSpec with ThisAppTestBase with OneInstancePerTest
  with routeTestingSugar {

  val url = "/firstPath/"
  implicit val routeTestTimeout = RouteTestTimeout(5 seconds)

  def postTest(data: String) = Post(url).withJson(data) ~> routes

  "posting" should {
    "pass" when {
      "data is valid and comes from the identified user" in {
        postTest(correctData.copy(createdAt = System.currentTimeMillis()).asJson) ~> check {
          print(entity)
          status shouldBe Created
        }
      }
      "report is valid and comes from the anonymous" in {
        postTest(correctData.copy(createdAt = System.currentTimeMillis(), adid = "anonymous").asJson) ~> check {
          status shouldBe Created
        }
      }
    }
  }
}
And the behavior:
When I run all tests in the package (using IntelliJ IDEA 14 Ultimate) or sbt test, I encounter the same results:
one execution -> all tests pass
and the next one -> not all pass; for the ones that fail I can see:
1. a failure because Request was neither completed nor rejected within X seconds (X up to 60)
2. console output from the route line post(handleWith { (data: Data) => println("receiving data") ... }, so the code in handleWith was executed
3. an ask timeout exception from the route code, but not always (among the failed tests)
4. no logs from the actor's LoggingReceive, so the actor never had a chance to respond
5. when I rerun the tests, the results differ from the previous run
Is there a problem with threading? Or with the test modules, or thread blocking inside libraries? Or something else? I have no idea why it isn't working :(

@Finally equivalent in play 2.0.x?

I'd like to be able to send back a response to the client before I do my logging/cleanup for a request.
In Play 1.x this was possible with the @Finally annotation. I've read through some posts that say that those annotations were replaced by action composition, but I'm unclear how to emulate the @Finally annotation using it.
It seems to me that the response will only be returned after all the logic in my custom actions has completed.
Have I missed something, or is there no way to do this in Play 2.0?
[EDIT FOR CLARITY]
In other words, I want to be able to run logic after I receive a request and send a response. So I'd like to be able to construct a timeline of the form:
Client sends a request to my server
Server sends back a 200 response, which the client receives
The server does additional processing, logging, etc
In Play 1.x I believe I could annotate my additional processing logic with @Finally and have it work like I want it to.
Action composition alone is not sufficient to do the job, but Action composition + Futures or Action composition + Actors are good ways to achieve this.
Action Composition + Future
Generate your Response, launch your logging/processing in an async context and, in parallel, send the result.
def LoggedAction(f: Request[AnyContent] => Result) = Action { request =>
  val result = f(request)
  concurrent.future(myLogAction(request, result))
  result
}
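One note (my addition, not part of the original answer): scala.concurrent.future needs an implicit ExecutionContext in scope, for example Play's default one; the exact import varies by Play version. Roughly:

// Sketch only: Play 2.1-era import for the default execution context.
import play.api.libs.concurrent.Execution.Implicits.defaultContext

def LoggedAction(f: Request[AnyContent] => Result) = Action { request =>
  val result = f(request)
  concurrent.future(myLogAction(request, result)) // fire-and-forget on the default execution context
  result
}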
Action composition + Actors
It's a cleaner way to achieve this. As in the previous case, generate your response, send a logging/processing event to your actor(s), and, in parallel, send the result.
import play.api._
import play.api.mvc._
import play.libs._
import akka.actor._

object Application extends Controller with Finally {
  def index = LoggedAction { r =>
    Ok(views.html.index("Your new application is ready."))
  }
}

trait Finally {
  self: Controller =>

  lazy val logActor = Akka.system.actorOf(Props(new LogActor), name = "logActor")

  def LoggedAction(f: Request[AnyContent] => Result) = Action { request =>
    val result = f(request) // Generate response
    logActor ! LogRequest(request) // Send log event to LogActor
    println("-> now send page to client")
    result
  }

  case class LogRequest(request: Request[AnyContent])

  class LogActor extends Actor {
    def receive = {
      case LogRequest(req) => {
        println(req.host)
        // ....
      }
    }
  }
}
// Console
-> now send page to client
127.0.0.1:9000