Add global custom values to Play Framework logger - scala

I have a cluster of different Akka actors, all using Logback as the logger. In the pure Akka actors' startup, I can do this during app initialization:
MDC.put("role", role)
role being a string representing the process's main role (like "worker"); all the logs then carry this additional context value, which helps investigation.
One of the roles is a frontend that uses the Play Framework to publish a REST API. In that case I do not define an object extending App, and I do not know how or where to set global values like that, so that all logs emitted in the Play application are marked with the role (and other additional things I want to add).

Play is a multi-threaded application, so using MDC here is not going to work reliably. The best thing you can do is use the SLF4J Marker API, which can be passed between threads.
Play 2.6.x will support the Marker API directly, but in the meantime you should use SLF4J directly and leverage the Logstash Logback Encoder to create a rich Marker that contains your role and other information.
import net.logstash.logback.marker.Markers.append

private val logger = org.slf4j.LoggerFactory.getLogger(this.getClass)
val logstashMarker = append("name", "value")
logger.debug(logstashMarker, "My message")
Then you can pass logstashMarker as an implicit parameter to your methods, without worrying about thread local information.
Note that Play handles requests and so any "global" information you have in Akka that you want in Play will have to be extracted and added -- for maximum convenience you can put that information in a WrappedRequest using action composition or by adding a filter.
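The implicit-parameter idea can be sketched without any Play or Logback machinery; `LogCtx` below is a hypothetical stand-in for the SLF4J Marker (a real app would call `logger.debug(marker, msg)` instead of rendering a string):

```scala
// Minimal sketch: thread logging context through as an implicit parameter
// instead of a thread-local MDC. LogCtx is a hypothetical stand-in for an
// SLF4J Marker carrying the role.
object MarkerSketch {
  final case class LogCtx(role: String)

  // A real app would call logger.debug(marker, msg); here we just render
  // the line so the flow is visible.
  def log(msg: String)(implicit ctx: LogCtx): String =
    s"[role=${ctx.role}] $msg"

  def handleRequest()(implicit ctx: LogCtx): String =
    log("handling request") // ctx is passed along explicitly, no thread-locals

  def main(args: Array[String]): Unit = {
    implicit val ctx: LogCtx = LogCtx("frontend")
    println(handleRequest())
  }
}
```

Because the context travels as a value rather than thread-local state, it survives hops between threads exactly like any other method argument.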

Related

Creating registrations dependent on current ComponentRegistry - Autofac

Currently stuck on a problem with Autofac registrations. To summarise: I have an Autofac module that registers many instances of IHandle<T>. There could be many implementations of IHandle<A> and IHandle<B>, and each typeof(A) or typeof(B) has a corresponding configuration class that is passed into another module with the same builder.
My question is: DURING the build process, how can I get the current registrations that implement IHandle<T> and match them to the correct message configuration, remembering that there could be many implementations of IHandle<T>?
I want to use builder.Register(ctx => {}), but how can I loop within this Register call and register multiple processors for each handler in the component registry?
I can get the types of IHandle<T> within the registry, but I don't know how to register the new processor matching the configuration.
Hope that makes sense....
Thanks in advance
Richard

Is there com.twitter.util.Local for Scala Future and ExecutionContext?

A ThreadLocal-like class that keeps its value through Future.map/Future.flatMap is extremely useful for tracing a request, for example for logging.
Is there an existing abstraction in the Scala library to serve as such a Local?
Is there a way to attach such Local to ExecutionContext.global?
Here is a blog entry where someone describes using scala.util.DynamicVariable and a custom scala.concurrent.ExecutionContext to capture and manage it: http://stevenskelton.ca/threadlocal-variables-scala-futures/
And here's another blog entry describing how Hootsuite does something similar: http://code.hootsuite.com/logging-contextual-info-in-an-asynchronous-scala-application/
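The approach described in the first post can be sketched as an ExecutionContext wrapper that captures a scala.util.DynamicVariable on the submitting thread and restores it inside the task, so every Future stage sees the same "local" (names here are illustrative, not taken from the posts):

```scala
import scala.concurrent.{Await, ExecutionContext, Future}
import scala.concurrent.duration._
import scala.util.DynamicVariable

// The "local": a request id with a default for unattributed work.
object RequestId {
  val current = new DynamicVariable[String]("none")
}

// Wrapper EC: capture the value where the task is submitted, restore it
// on the worker thread that actually runs the task.
final class PropagatingEC(underlying: ExecutionContext) extends ExecutionContext {
  def execute(task: Runnable): Unit = {
    val captured = RequestId.current.value // capture on the calling thread
    underlying.execute(new Runnable {
      def run(): Unit = RequestId.current.withValue(captured)(task.run())
    })
  }
  def reportFailure(t: Throwable): Unit = underlying.reportFailure(t)
}

object LocalDemo {
  def run(): String = {
    implicit val ec: ExecutionContext = new PropagatingEC(ExecutionContext.global)
    val f = RequestId.current.withValue("req-42") {
      Future(RequestId.current.value).map(v => v + "/" + RequestId.current.value)
    }
    Await.result(f, 5.seconds) // both stages observed req-42
  }
}
```

Both the Future body and the map continuation run on pool threads, yet each sees "req-42" because the wrapper re-captures the value at every submission.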

What is a good strategy for keeping global application state in Scala?

As a simplest example, say I'm starting my application in a certain mode (e.g. test); then I want to be able to check in other parts of the application what mode I'm running in. This should be extremely simple, but I'm looking for the right Scala replacement for global variables. Please give me a bit more than: "Scala objects are like global variables".
The ideal solution is that at start-up, the application will create an object, and at creation time, that object's 'mode' is set. After that, other parts of the application will just be able to read the state of 'mode'. How can I do this without passing a reference to an object all over the application?
My real scenario actually includes things such as selecting the database name, or singleton database object at start-up, and not allowing anything else to change that object afterwards. The one problem is that I'm trying to achieve this without passing around that reference to the database.
UPDATE:
Here is a simple example of what I would like to do, and my current solution:
object DB {
  class PDB extends ProductionDB
  class TDB extends TestComplianceDB

  lazy val pdb = new PDB
  lazy val tdb = new TDB

  def db = tdb // (or pdb) How can I set this once at initialisation?
}
So, I've created different database configurations as traits. Depending on whether I'm running in Test or Production mode, I would like to use the correct configuration where configurations look something like:
trait TestDB extends DBConfig {
  val m = new Model("H2", new DAL(H2Driver),
    Database.forURL("jdbc:h2:mem:testdb", driver = "org.h2.Driver"))
  // This is an in-memory database, so it will not yet exist.
  dblogger.info("Using TestDB")
  m.createDB
}
So now, whenever I use the database, I could use it like this:
val m = DB.db.m
m.getEmployees(departmentId)
My question really is: is this style bad, good or OK (using a singleton to hold a handle to the database)? I'm using Slick, and I think this relates to having just one instance of Slick running. Could this lead to scalability issues?
Is there a better way to solve the problem?
You can use the Typesafe Config library; it is also used in projects like Play and Akka. Both the Play and Akka documentation explain the basics of its usage. From the Play documentation (Additional configuration):
Specifying alternative configuration file
The default is to load the application.conf file from the classpath. You can specify an alternative configuration file if needed:
Using -Dconfig.resource
-Dconfig.resource=prod.conf
Using -Dconfig.file
-Dconfig.file=/opt/conf/prod.conf
Using -Dconfig.url
-Dconfig.url=http://conf.mycompany.com/conf/prod.conf
Note that you can always reference the original configuration file in a new prod.conf file using the include directive, such as:
include "application.conf"
key.to.override=blah
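If the sticking point is the "set once at initialisation, read-only afterwards" part of the question rather than where the value comes from, that can be sketched with a write-once holder (a sketch with illustrative names, not part of Typesafe Config):

```scala
// Sketch: a write-once holder for the application mode. It is set exactly
// once at startup; later writes fail fast, and readers need no reference
// passed around.
object AppMode {
  @volatile private var value: Option[String] = None

  def init(mode: String): Unit = synchronized {
    require(value.isEmpty, "mode already set")
    value = Some(mode)
  }

  def mode: String =
    value.getOrElse(sys.error("AppMode.init was not called at startup"))
}
```

At startup you call AppMode.init("test") once (e.g. from the value Typesafe Config loaded); any later init throws, which makes accidental reconfiguration visible immediately.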

Are PlayPlugin instances shared between different threads (Play 1.2.5)

I'm trying to find out how PlayPlugin objects are used within Play Framework (1.2.5).
Are same PlayPlugin instances shared between different Play threads?
With some source lookup I suppose yes, but since Play does meta-programming in many places and I'm not so familiar with all of it, I'm not 100% sure.
Call stack for PlayPlugin.beforeInvocation:
PlayPlugin.beforeInvocation
PluginCollection.beforeInvocation (the list of enabled plugins is a field within PluginCollection)
Invocation.before (uses the static field Play.pluginCollection)
Thread.currentThread().setContextClassLoader(Play.classloader) is one thing that could possibly affect Play.pluginCollection, for example.
The single-instance-for-all-threads behaviour would also be confirmed by the article Play Framework: Introduction to Writing Modules:
beforeActionInvocation(): This code is executed before controller
invocation. Useful for validation, where it is used by Play as well.
You could also possibly put additional objects into the render
arguments here. Several plugins also set up some variables inside
thread locals to make sure they are thread safe.
So, I suppose the answer is that yes, the instances are shared, but I would like to confirm that.
You are right. Each PlayPlugin instance (of a subclass, of course) is shared throughout the entire JVM. You get that instance via the Play.plugin(Class<T> clazz) method call.

How to achieve true application modularity using Akka in OSGi bundles?

When using Akka actors, every actor created gets registered in an ActorRegistry. The ActorRegistry is a singleton, and allows for easy lookup and management (start, stop, ...) of all actors.
In an OSGi environment however, a number of application bundles can be installed each using Akka actors internally (and Akka is installed as a bundle itself). Some of the actors of an application bundle should be available to other bundles and as such act as exported services. Others are strictly internal to the bundle. The ActorRegistry however contains all the actors of all bundles (since it's a singleton), so both the exported as well as the internal ones. This means even the actors used internally in a bundle are available to any other bundle.
But I'd like to gain more control over which actors are available outside the scope of a bundle. Ideally every bundle would have its own ActorRegistry, and decide which of its actors get published as an OSGi service.
So what would be the best way to use Akka for a modular application in an OSGi environment, to achieve true modularity?
(Background about this on http://blog.xume.com/2011/02/actorregistry-scope-using-akka-in-osgi.html)
From what I recall, ActorRegistry was a singleton in earlier versions of Akka, and, from what I can see in the code now, it no longer is. Now ActorRegistry is a final class, with an instance created in the Actor companion object:
object Actor extends Logging {
  ...
  val registry = new ActorRegistry
  ...
}

class LocalActorRef {
  ...
  def initializeActorInstance = {
    ...
    Actor.registry.register(this)
    ...
  }
  ...
  def stop = {
    ...
    Actor.registry.unregister(this)
    ...
  }
  ...
}
So you can obviously create multiple instances of the registry.
Secondly, as you know, actors register/unregister themselves in the ActorRegistry in their start/stop methods; thus, in your case I would end up subclassing/mixing Actor/LocalActorRef (overriding start/stop, which are responsible for registration, and adding the functionality you're looking for there) and/or adding your own ActorRegistry.
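Under that suggestion, each bundle would hold its own registry and have its actors register there on start and unregister on stop. A minimal sketch with illustrative types (not Akka's real API):

```scala
import scala.collection.mutable

// Sketch: a per-bundle registry. Actors would call register in start and
// unregister in stop, so each OSGi bundle only sees its own actors and
// chooses which to export as services.
final class BundleActorRegistry {
  private val ids = mutable.Set.empty[String]
  def register(id: String): Unit = synchronized { ids += id }
  def unregister(id: String): Unit = synchronized { ids -= id }
  def registered: Set[String] = synchronized { ids.toSet }
}

object Bundles {
  def demo(): (Set[String], Set[String]) = {
    val bundleA = new BundleActorRegistry
    val bundleB = new BundleActorRegistry
    bundleA.register("worker-1")
    bundleA.register("worker-2")
    bundleB.register("frontend-1")
    bundleA.unregister("worker-2") // stop() of an internal actor in bundle A
    (bundleA.registered, bundleB.registered) // the registries stay isolated
  }
}
```

The point is only the scoping: lookups against bundle A's registry never see bundle B's internal actors, which is exactly what the JVM-wide singleton prevented.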