How do I create thread pools in Play 2.5.x? - scala

I am currently on Play 2.4.2 and have successfully created thread pools using the following:
package threads
import scala.concurrent.ExecutionContext
import play.api.libs.concurrent.Akka
import play.api.Play.current
object Contexts {
implicit val db: ExecutionContext = Akka.system.dispatchers.lookup("contexts.db-context")
implicit val pdf: ExecutionContext = Akka.system.dispatchers.lookup("contexts.pdf-context")
implicit val email: ExecutionContext = Akka.system.dispatchers.lookup("contexts.email-context")
}
and then in the code with...
Future{....}(threads.Contexts.db)
We are ready to upgrade to Play 2.5 and are having trouble understanding the documentation. The documentation for 2.4.2 uses Akka.system.dispatchers.lookup, which we use without issue. The documentation for 2.5.x uses app.actorSystem.dispatchers.lookup. As far as I know, I have to inject the app into a class, not an object. Yet the documentation clearly uses an object in the example!
Has anyone successfully created thread pools in Play 2.5.x who can help out? Is it as simple as changing Contexts to a class and then injecting it wherever I would like to use this threading? It seems odd, since to use the default ExecutionContext I just have to add an implicit import.
Also, we are using Play with Scala.

If you simply change your Contexts to a class, then you will have to deal with how to get an instance of that class.
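For reference, this is roughly what the class-based variant would look like (a sketch only; the config paths are the ones from your Contexts object, and the ActorSystem is injected instead of looked up through the Akka helper):

import javax.inject.{Inject, Singleton}
import akka.actor.ActorSystem
import scala.concurrent.ExecutionContext

// Sketch of the "just make Contexts a class" route: the ActorSystem is injected,
// and Contexts itself must then be injected wherever the pools are needed.
@Singleton
class Contexts @Inject() (actorSystem: ActorSystem) {
  implicit val db: ExecutionContext = actorSystem.dispatchers.lookup("contexts.db-context")
  implicit val pdf: ExecutionContext = actorSystem.dispatchers.lookup("contexts.pdf-context")
  implicit val email: ExecutionContext = actorSystem.dispatchers.lookup("contexts.email-context")
}

Every class that needs one of these pools then has to take Contexts (or a single ExecutionContext, as shown below) as a constructor parameter.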
In my opinion, if you have a number of thread pools that you want to make use of, named bindings are the way to go. In the example below, I will show you how you could accomplish this with Guice.
Note that Guice injects dependencies at runtime, but it is also possible to inject dependencies at compile time.
I'm going to show it with the db context as an example. First, this is how you will be using it:
class MyService @Inject() (@Named("db") dbCtx: ExecutionContext) {
// make db access here
}
And here's how you could define the binding:
bind[ExecutionContext].qualifiedWith("db").toProvider[DbExecutionContextProvider]
And somewhere define the provider:
class DbExecutionContextProvider @Inject() (actorSystem: ActorSystem) extends Provider[ExecutionContext] {
override def get(): ExecutionContext = actorSystem.dispatchers.lookup("contexts.db-context")
}
You will have to do this for each of your contexts. I understand this may be a little cumbersome, and there may actually be more elegant ways to define the bindings in Guice.
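For illustration, a module wiring up all three contexts could look roughly like this (a sketch using plain Guice's AbstractModule and Names.named rather than the shorthand binding syntax above; the Pdf and Email provider classes are hypothetical and simply mirror the Db one):

import javax.inject.{Inject, Provider}
import akka.actor.ActorSystem
import com.google.inject.AbstractModule
import com.google.inject.name.Names
import scala.concurrent.ExecutionContext

class ContextsModule extends AbstractModule {
  override def configure(): Unit = {
    bind(classOf[ExecutionContext]).annotatedWith(Names.named("db")).toProvider(classOf[DbExecutionContextProvider])
    bind(classOf[ExecutionContext]).annotatedWith(Names.named("pdf")).toProvider(classOf[PdfExecutionContextProvider])
    bind(classOf[ExecutionContext]).annotatedWith(Names.named("email")).toProvider(classOf[EmailExecutionContextProvider])
  }
}

// Each provider looks up one dispatcher from the configuration, for example:
class PdfExecutionContextProvider @Inject() (actorSystem: ActorSystem) extends Provider[ExecutionContext] {
  override def get(): ExecutionContext = actorSystem.dispatchers.lookup("contexts.pdf-context")
}

class EmailExecutionContextProvider @Inject() (actorSystem: ActorSystem) extends Provider[ExecutionContext] {
  override def get(): ExecutionContext = actorSystem.dispatchers.lookup("contexts.email-context")
}

Don't forget to enable the module, e.g. with play.modules.enabled += "modules.ContextsModule" in application.conf (adjust the package name to wherever you put the module).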
Note that I have not tried this out. One issue you might stumble upon is that you end up with conflicts, because Play already defines a binding for ExecutionContext in its BuiltinModule. You may need to override that binding to work around it.

Related

Is there a way to "trickle down" implicits from top level applications to other imported modules?

I am trying to refactor some code for a program which uses an ActorSystem as the backbone for Http calls.
My specific goal is to make my code more modular so I can write libraries of functions which make http calls using an ActorSystem where the ActorSystem is later expected to be provided by the application.
This is a general question though as I tend to run into this problem a reasonable amount.
I have two goals:
Minimize the number of ActorSystems I create to simplify tracking of them (one per top level application is the goal)
Avoid explicitly passing around the ActorSystem and context everywhere it's needed.
Conceptually - the code below illustrates how I'm thinking about it (of course this code would not compile).
import akka.actor.ActorSystem
import intermediateModule._
import scala.concurrent.ExecutionContextExecutor
object MyApp extends App {
// Create the actorsystem and place into scope
implicit val system = ActorSystem()
implicit val context = system.dispatcher
intermediateFunc1(300)
}
// Elsewhere in the intermediate module
object intermediateModule {
import expectsActorSystemModule._
def intermediateFunc1(x: Int) = {
// Relies on ActorSystem and Execution context,
// but won't compile because, of course the application ActorSystem and
// ec is not in scope
usesActorSystem(x)
}
}
// In this module, usesActorSystem needs an ActorSystem
object expectsActorSystemModule {
def usesActorSystem(x: Int)
(implicit system: ActorSystem, context: ExecutionContextExecutor) = ???
//... does some stuff like sending http requests with ActorSystem
}
Is there a way to "trickle down" implicits through the sub-modules to achieve the goal of the top level application providing the needed implicits?
Can this be done in a way such that the "depth" of module imports doesn't matter (e.g. if I added a few more intermediate libraries in between the top level app and the module which requires the ActorSystem)?
The answer here is dependency injection. Every object that has dependencies on other objects should receive them as constructor parameters. The important thing here is that higher layers only get their own dependencies, and not their dependencies' dependencies.
In your example, IntermediateModule doesn't use the ActorSystem itself; it only needs it to pass it on to ExpectsActorSystemModule. This is bad, because if the latter changes and requires another dependency, you will need to change the former as well – that is too much coupling. You can refactor it like so:
import akka.actor.ActorSystem
import scala.concurrent.ExecutionContextExecutor
object MyApp extends App {
// Create the actorsystem and place into scope
// wire everything together
implicit val system = ActorSystem()
implicit val context = system.dispatcher
val expectsActorSystemModule = new ExpectsActorSystemModule
val intermediateModule = new IntermediateModule(expectsActorSystemModule)
// run stuff
intermediateModule.intermediateFunc1(300)
}
// Elsewhere in the intermediate module
class IntermediateModule(expectsActorSystemModule: ExpectsActorSystemModule) {
def intermediateFunc1(x: Int) = {
// Note: no ActorSystem or ExecutionContext is needed, because they were
// injected into expectsActorSystemModule
expectsActorSystemModule.usesActorSystem(x)
}
}
// In this module, usesActorSystem needs an ActorSystem
class ExpectsActorSystemModule(
implicit system: ActorSystem,
context: ExecutionContextExecutor) {
def usesActorSystem(x: Int) = ???
//... does some stuff like sending http requests with ActorSystem
}
Note that IntermediateModule no longer needs an ActorSystem or an ExecutionContext, because those were provided directly to ExpectsActorSystemModule.
The slightly annoying part is that at some point you have to instantiate all these objects in your application and wire them all together. In the above example it's only 4 lines in MyApp, but it will get significantly longer for more substantial programs.
There are libraries like MacWire or Guice to help with this, but I would recommend against using them. They make it much less transparent what is going on, and they don't save all that much code either – in my opinion, it's a bad tradeoff. And these two specifically have more downsides. Guice comes from the Java world and gives you basically no compile-time guarantees, meaning that your code might compile just fine and then fail at startup because Guice could not wire the dependencies. MacWire is better in that regard (everything is done at compile time), but it's not future-proof because it's implemented as a Scala 2 macro – it will not work on Scala 3 in its current form.
Another approach that is popular among the purely functional programming community is to use ZIO's ZLayer. But since you're working on an existing codebase that is based on the Lightbend tech stack, this is unlikely to be the means of choice in this particular case.

Migrating Play Framework 2.5 - moving from Global.onStart to Dependency Injection

So I am trying to migrate a Play Framework application from version 2.4.3 to 2.5.6. I am using Squeryl and akka-quartz-scheduler: Squeryl requires setting up a session manually, and akka-quartz-scheduler runs as its own entity, as none of the other modules really depend on it, though it depends on others. So previously there has been a Global object to handle them on startup:
import org.squeryl.{Session, SessionFactory}
object Global extends GlobalSettings {
private lazy val injector = Guice.createInjector(CustomModule)
override def onStart(app: Application) {
SessionFactory.concreteFactory = // Squeryl initialization http://squeryl.org/sessions-and-tx.html
injector.getInstance(classOf[CustomScheduler]).initialize()
}
}
This has worked before. However, on 2.5.6 I'm trying to shift away from Global.scala altogether. I'm not sure if this is the best way to do it, but from the documentation it seems like it. So I'm trying to create singleton classes and load them eagerly before the application loads, as instructed here, as a replacement for onStart. So, as instructed on the eager bindings page, I have:
import com.google.inject._
class CustomModule extends AbstractModule {
override def configure() = { // or without override
println("configure called")
bind(classOf[SquerylInitialization]).asEagerSingleton()
bind(classOf[CustomScheduler]).asEagerSingleton()
}
}
import play.api.{Configuration, Application}
import play.api.db.{DB, DBApi}
import org.squeryl.{SessionFactory, Session}
@Singleton
class SquerylInitialization @Inject()(conf: Configuration, dbApi: DBApi) extends Logging {
SessionFactory.concreteFactory = // Squeryl initialization http://squeryl.org/sessions-and-tx.html
}
import akka.actor.{ActorSystem, ActorRef}
import com.typesafe.akka.extension.quartz.QuartzSchedulerExtension
@Singleton
class CustomScheduler @Inject()(system: ActorSystem) extends Logging {
val scheduler: QuartzSchedulerExtension = QuartzSchedulerExtension(system)
// other initialize code here
}
CustomModule extends AbstractModule, but its configure() method is never called. The Play documentation on Guice says that "Alternatively, play will scan the classpath for classes that implement AbstractModule". The documentation might not be the most recent, but that seems to be how it is supposed to work.
If, for instance, I use dependency injection to load SquerylInitialization in every class that uses Squeryl, it works, but I'm not sure that's a good way to do it, as it would have to be required by tons of classes, and there is hardly any class that depends on the CustomScheduler.
So basically the questions are:
Why isn't CustomModule's configure() method being called and the classes eagerly loaded, and how should that be fixed?
Is this the standard way to load this kind of functionality, or should some other way be used?
So basically the comments were correct and the documentation was just out of date, so including the following in application.conf helped:
play.modules.enabled += "module.CustomModule"
I thought I had tried that as well, but it turns out I hadn't. The answer was just a comment, so I can't accept it.

Scala and Slick: DatabaseConfigProvider in standalone application

I have a Play 2.5.3 application which uses Slick for reading an object from the DB.
The service classes are built in the following way:
class SomeModelRepo @Inject()(protected val dbConfigProvider: DatabaseConfigProvider) {
val dbConfig = dbConfigProvider.get[JdbcProfile]
import dbConfig.driver.api._
val db = dbConfig.db
...
Now I need some standalone Scala scripts to perform some operations in the background. I need to connect to the DB within them and I would like to reuse my existing service classes to read objects from DB.
To instantiate a SomeModelRepo class' object I need to pass some DatabaseConfigProvider as a parameter. I tried to run:
object SomeParser extends App {
object testDbProvider extends DatabaseConfigProvider {
def get[P <: BasicProfile]: DatabaseConfig[P] = {
DatabaseConfigProvider.get("default")(Play.current)
}
}
...
val someRepo = new SomeModelRepo(testDbProvider)
however, I get the error "There is no started application" on the line with "(Play.current)". Moreover, the method current in object Play is deprecated and should be replaced with DI.
Is there any way to initialize my SomeModelRepo class' object within the standalone object SomeParser?
Best regards
When you start your Play application, the PlaySlick module handles the Slick configurations for you. With it you have two choices:
inject DatabaseConfigProvider and get the driver from there, or
do a global lookup via DatabaseConfigProvider.get[JdbcProfile](Play.current), which is not preferred.
Either way, you must have your Play app running! Since this is not the case with your standalone scripts you get the error: "There is no started application".
So, you will have to use Slick's default approach, by instantiating db directly from config:
val db = Database.forConfig("default")
You have lots of examples in Lightbend's templates.
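For example, a minimal standalone script could look roughly like this (a sketch; the H2 profile, the "default" config path and the query are assumptions, not taken from your application):

import scala.concurrent.Await
import scala.concurrent.duration.Duration
import slick.driver.H2Driver.api._ // pick the Slick profile matching your database

object SomeParser extends App {
  // "default" must point at a Slick database config block in application.conf,
  // e.g. default { driver = "org.h2.Driver", url = "jdbc:h2:mem:test" }
  val db = Database.forConfig("default")
  try {
    val result = Await.result(db.run(sql"select 1".as[Int]), Duration.Inf)
    println(result)
  } finally db.close()
}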
EDIT: Sorry, I didn't read the whole question. Do you really need to have it as a separate application? You can run your background operations when your app starts, like here. In this example, the InitialData class is instantiated as an eager singleton, so its insert() method runs immediately when the app starts.
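A rough sketch of that eager-singleton pattern (class and module names here are placeholders, not taken from the linked template):

import javax.inject.{Inject, Singleton}
import com.google.inject.AbstractModule

@Singleton
class InitialData @Inject() (someModelRepo: SomeModelRepo) {
  // Runs once, as soon as the eager singleton is constructed on startup.
  def insert(): Unit = {
    // seed data / run background operations through someModelRepo here
  }
  insert()
}

class OnStartModule extends AbstractModule {
  override def configure(): Unit = {
    bind(classOf[InitialData]).asEagerSingleton()
  }
}

The module still has to be enabled, e.g. via play.modules.enabled += "modules.OnStartModule" in application.conf.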

Play Framework PathBindable with Dependency Injection

I'm migrating a Scala Play application to 2.5 and am currently moving my components to dependency injection. There's one place left where I'm at a loss how to do it though. I have a PathBindable implicit conversion defined in the companion object:
object Task {
implicit def pathBindable(implicit stringBinder: PathBindable[String]) =
new PathBindable[Task] {
...
}
}
The implementation of the PathBindable needs to look up the object from a repository, but I haven't found a way to dependency-inject the repository here. As a workaround I'm using the now deprecated Play object:
val tasks = Play.application(Play.current).injector.instanceOf[TasksRepository]
Any ideas how to solve this properly?
According to Lightbend Engineer Greg Methvin, PathBindables should only depend on the state in the path. The reason is that the code runs on the IO thread and should therefore be fast and not block.
I think this is the only way you can access stuff like this in objects.
A better idea is to create the transformer like this:
class TaskPathBinder @Inject() (tasks: TaskRepository) extends PathBindable[Task] {
// implementation
}
and then inject it into services like this:
class NeedsTaskPathBinder @Inject() (service: SomeService)(implicit taskPathBinder: TaskPathBinder) {
...
}
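For what it's worth, a rough sketch of what the binder's body might look like (TaskRepository.findById and Task.id are hypothetical names here, and blocking like this is exactly what the advice above warns against):

import javax.inject.Inject
import play.api.mvc.PathBindable
import scala.concurrent.Await
import scala.concurrent.duration._

class TaskPathBinder @Inject() (tasks: TaskRepository) extends PathBindable[Task] {
  override def bind(key: String, value: String): Either[String, Task] =
    // Blocking on the request path is discouraged; shown only to illustrate the shape.
    Await.result(tasks.findById(value), 5.seconds)
      .toRight(s"No task found for id '$value'")

  override def unbind(key: String, value: Task): String =
    value.id.toString
}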
Hope you get the idea.

Unit Testing AKKA actors

I am building a web application with Scala and Akka actors and I'm having some trouble with the tests.
In my case I need to test an actor that talks to the database. For the unit tests I would like to use a fake database, but I can't replace the new Database() call with my desired fake object.
Let's see some code:
class MyActor extends Actor {
val database = new Database()
def receive = { ... }
}
And in the tests I would like to inject a FakeDatabase object instead of Database. I've been looking on the Internet, but the best I found is:
Add a parameter to the constructor.
Convert the val database to a var so that in the test I can access the attribute through the underlying actor and replace it.
Both solutions solve the problem but are very dirty.
Isn't there a better way to solve the problem?
Thanks!
The two primary options for this scenario are:
Dependency Injection: use a DI framework to inject a real or mock service as needed. In Akka: http://letitcrash.com/post/55958814293/akka-dependency-injection
Cake Pattern: a Scala-specific way of achieving something akin to dependency injection without actually relying on injection. See: Akka and cake pattern
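For completeness, a very rough cake-pattern sketch (not taken from the linked answer; IDatabase, Database and FakeDatabase follow the naming used elsewhere in this thread):

import akka.actor.Actor

trait DatabaseComponent {
  def database: IDatabase
}

trait ProductionDatabaseComponent extends DatabaseComponent {
  lazy val database: IDatabase = new Database()
}

// The actor declares what it needs via a self-type instead of constructing it itself.
class MyActor extends Actor { this: DatabaseComponent =>
  def receive: Receive = {
    case msg => // use database here
  }
}

// Production wiring: system.actorOf(Props(new MyActor with ProductionDatabaseComponent))
// Test wiring:       system.actorOf(Props(new MyActor with DatabaseComponent { val database = new FakeDatabase }))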
Echoing the advice here, I wouldn't call injecting the database in the constructor dirty. It might have plenty of benefits, including decoupling actor behaviour from the particular database instance.
However, if you know there is only ONE database you will always be using in your production code, then consider defining a package-private constructor and a companion object returning a Props object without parameters by default.
Example below:
import akka.actor.{Actor, Props}
object MyActor {
def props(): Props = Props(new MyActor(new Database()))
}
// "mypackage" below stands for the enclosing package name
class MyActor private[mypackage] (database: IDatabase) extends Actor {
def receive = { ... }
}
In this case you will still be able to inject the test database in your test cases (given the same package structure), but prevent users of your code from instantiating MyActor with an unexpected database instance.
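And a sketch of how a test could then supply the fake (assuming ScalaTest and akka-testkit on the test classpath; FakeDatabase stands for your fake IDatabase implementation, and the spec must live in the same package as MyActor because of the package-private constructor):

import akka.actor.{ActorSystem, Props}
import akka.testkit.{ImplicitSender, TestKit}
import org.scalatest.{BeforeAndAfterAll, WordSpecLike}

class MyActorSpec extends TestKit(ActorSystem("MyActorSpec"))
  with ImplicitSender with WordSpecLike with BeforeAndAfterAll {

  override def afterAll(): Unit = TestKit.shutdownActorSystem(system)

  "MyActor" should {
    "talk to the injected (fake) database" in {
      // The package-private constructor lets the test pass a fake instead of the real Database.
      val actor = system.actorOf(Props(new MyActor(new FakeDatabase())))
      actor ! "some message"
      // expectMsg(...) assertions go here, depending on MyActor's protocol
    }
  }
}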