With Play Framework, what am I doing wrong in setting up my router? - scala

I'm a newbie to Play and Scala (version 2.6) and I can't figure out how to get routing to work in a simple fashion. Cobbling together examples from the 2.6 documentation, I've managed to create a custom application loader, which I understand is required to perform Evolutions migrations. The example I found included a val router = Router.empty. BuiltInComponentsFromContext appears to require a router to be defined, but the way I've done it, my routes are now broken and all I get are "Action Not Found" messages.
Here is my application.conf:
play.application.loader=MyApplicationLoader
router = my.application.Router
Here is the Application Loader
import play.api.ApplicationLoader
import play.api.ApplicationLoader.Context
import play.api.BuiltInComponentsFromContext
import play.api.db.{Database, DBComponents, HikariCPComponents}
import play.api.db.evolutions.EvolutionsComponents
import play.api.routing.Router
import play.filters.HttpFiltersComponents
//import com.softwaremill.macwire._
class MyApplicationLoader extends ApplicationLoader {
  def load(context: Context) = {
    new MyComponents(context).application
  }
}

class MyComponents(cntx: Context)
  extends BuiltInComponentsFromContext(cntx)
  with DBComponents
  with EvolutionsComponents
  with HikariCPComponents
  with HttpFiltersComponents
{
  //lazy val router = Router.empty
  val router = Router.empty

  // this will actually run the database migrations on startup
  applicationEvolutions
}
It looks to me that by declaring:
val router = Router.empty
I'm essentially invalidating any of the routes I've declared in my conf/routes file. It occurs to me to use the Router.load method, but I can't find an example of how to pass the required environment and configuration values to it. Assuming I don't want to use static routes, how do I do this?

Assuming that you use compile-time dependency injection only for the sake of Evolutions (because otherwise you'd have faced the same problems earlier), the answer is that you don't have to do that. Evolutions work with the default runtime dependency injection as well. The part of the documentation you are probably basing your assumptions on actually says that if you are already using compile-time dependency injection, here is how to modify it to make Evolutions work. If you look at the source code of EvolutionsModule you can see that ApplicationEvolutions is bound eagerly. This means that an instance of ApplicationEvolutions is created during application initialization. And in the source code of ApplicationEvolutions itself you can see that start() is called from the constructor. So as long as you provide the configuration, the rest should work on its own.
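If, on the other hand, you do want to keep compile-time dependency injection, the usual approach is to instantiate the router class that Play's routes compiler generates from conf/routes (it lives in the router package and is called Routes) instead of Router.empty. A minimal sketch, assuming conf/routes references a single controllers.HomeController taking ControllerComponents; the controller and its constructor arguments are placeholders for whatever your routes file actually declares:

import play.api.ApplicationLoader.Context
import play.api.BuiltInComponentsFromContext
import play.api.db.{DBComponents, HikariCPComponents}
import play.api.db.evolutions.EvolutionsComponents
import play.api.routing.Router
import play.filters.HttpFiltersComponents
import router.Routes // generated by the routes compiler from conf/routes

class MyComponents(cntx: Context)
  extends BuiltInComponentsFromContext(cntx)
  with DBComponents
  with EvolutionsComponents
  with HikariCPComponents
  with HttpFiltersComponents
{
  // runs the database migrations on startup
  applicationEvolutions

  // placeholder controller; wire up whatever controllers conf/routes refers to
  lazy val homeController = new controllers.HomeController(controllerComponents)

  // the generated Routes class takes the error handler plus one argument per routed controller
  lazy val router: Router = new Routes(httpErrorHandler, homeController)
}

With this in place, the router = my.application.Router line in application.conf should not be needed, since the loader now provides the router directly.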

Related

How do I make a `CustomExecutionContext` available for dependency injection in a Play 2.6 controller?

I'm following along with Play 2.6's Scala documentation and sample code for creating non-blocking actions, and am running into some runtime issues. I have created a new Play application using the Scala template (sbt new playframework/play-scala-seed.g8).
The code that the Play documentation suggests should work in a new controller is (this code is taken verbatim from the Play documentation page, with some extra imports from me):
// some imports added by me to get the code to compile
import javax.inject.Inject
import scala.concurrent.ExecutionContext
import scala.concurrent.Future
import akka.actor.ActorSystem
import play.api.libs.concurrent.CustomExecutionContext
import play.api.mvc._
import play.api.mvc.ControllerComponents
// end imports added by me
import play.api.libs.concurrent.CustomExecutionContext
trait MyExecutionContext extends ExecutionContext

class MyExecutionContextImpl @Inject()(system: ActorSystem)
  extends CustomExecutionContext(system, "my.executor") with MyExecutionContext

class HomeController @Inject()(myExecutionContext: MyExecutionContext, val controllerComponents: ControllerComponents) extends BaseController {
  def index = Action.async {
    Future {
      // Call some blocking API
      Ok("result of blocking call")
    }(myExecutionContext)
  }
}
Then, according to the documentation for using other thread pools, I've defined the my.executor thread pool in the application.conf file of my application:
my.executor {
  fork-join-executor {
    parallelism-factor = 20.0
    parallelism-max = 200
  }
}
I should note that I do not want to use the default execution context as I want to prepare for running futures in a separate context that may be used for a limited resource like a database connection pool.
All of this compiles just fine with sbt compile. However, when I run this with sbt run and access my app in a web browser, I get this error:
CreationException: Unable to create injector, see the following errors:
1) No implementation for controllers.MyExecutionContext was bound.
while locating controllers.MyExecutionContext
for the 1st parameter of controllers.NewController.(NewController.scala:17)
while locating controllers.NewController
for the 2nd parameter of router.Routes.(Routes.scala:29)
at play.api.inject.RoutesProvider$.bindingsFromConfiguration(BuiltinModule.scala:121):
Binding(class router.Routes to self) (via modules: com.google.inject.util.Modules$OverrideModule -> play.api.inject.guice.GuiceableModuleConversions$$anon$1)
I've used Play 2.3 in the past, and know that dependency injection works when you define an instance of an object (via @Singleton or in a module); however, Play 2.6's documentation on DI indicates that "Guice is able to automatically instantiate any class with an @Inject on its constructor without having to explicitly bind it. This feature is called just in time bindings and is described in more detail in the Guice documentation."
My question is: what specific lines of code or configuration do I need to add to Play's own sample to make this work, and why?
I found one possible solution when reading further in the Binding Annotations section of the Scala Dependency Injection documentation page. In particular, it states:
The simplest way to bind an implementation to an interface is to use the Guice #ImplementedBy annotation.
So, by adding that to my MyExecutionContext trait, like so:
import com.google.inject.ImplementedBy

@ImplementedBy(classOf[MyExecutionContextImpl])
trait MyExecutionContext extends ExecutionContext
an instance of MyExecutionContextImpl is created and properly injected into the controller.
Too bad that this @ImplementedBy annotation isn't listed in the sample code for the non-blocking action documentation!
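An alternative, if you would rather not reference Guice from the trait itself, is to bind the implementation in a small module and enable it. This is only a sketch, not taken from the Play sample; the module name is mine, and it assumes the module sits in the same package as MyExecutionContext (otherwise import the trait and its implementation):

import play.api.{Configuration, Environment}
import play.api.inject.{Binding, Module}

class ExecutionContextModule extends Module {
  // bind the trait to its implementation so Guice can inject MyExecutionContext
  def bindings(environment: Environment, configuration: Configuration): Seq[Binding[_]] = Seq(
    bind[MyExecutionContext].to[MyExecutionContextImpl]
  )
}

and in application.conf (adjusted for your package):

play.modules.enabled += "ExecutionContextModule"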

Testing Scala Play + Slick: why is it picking the wrong database?

I'm working on this GitHub project that uses Play 2.5.10 and Slick 3.1.1 (the issue can be reproduced there by running sbt test, but it can also be seen directly in Travis CI). I use a Postgres database configuration named default for development and production. I then use an H2 in-memory database called test for testing. The default database is configured in conf/application.conf whereas the test database is configured in conf/application.test.conf.
The problem is that for testing I initialize the database with name test but the Application built with GuiceApplicationBuilder is still picking up the default one.
This line is in my build.sbt to pick up the test configuration:
javaOptions in Test += "-Dconfig.file=conf/application.test.conf"
and this is the content of that file:
include "application.conf"
slick.dbs {
test {
driver="slick.driver.H2Driver$"
db.driver="org.h2.Driver"
db.url="jdbc:h2:mem:test;MODE=PostgreSQL"
db.username="sa"
db.password=""
}
}
My DaoFunSpec base class looks like this:
package dao

import org.scalatest.{BeforeAndAfterAll, FunSpec}
import org.scalatestplus.play.OneAppPerSuite
import play.api.Application
import play.api.db.evolutions.Evolutions
import play.api.db.DBApi

abstract class DaoFunSpec extends FunSpec with OneAppPerSuite with BeforeAndAfterAll {
  lazy implicit val db = app.injector.instanceOf[DBApi].database("test")

  override def beforeAll() {
    Evolutions.applyEvolutions(db)
  }

  override def afterAll() {
    Evolutions.cleanupEvolutions(db)
  }

  def userDao(implicit app: Application) = {
    Application.instanceCache[UserDao].apply(app)
  }
}
Note the line app.injector.instanceOf[DBApi].database("test"), yet Play still tries to connect to the default database.
OK, your problem is a bit different (or perhaps a little unexpected). This is the line that causes your headache:
dbApi.databases().foreach(runEvolutions)
It's in ApplicationEvolutions.scala:42.
It's probably self-explanatory :)
Still, the problem is more involved. You actually have two databases in your test environment (default and test). This leads to several problems, one of which you see above (evolutions are attempted on each of them). Another is that if you want to use a differently named db, you can't just inject something like this:
class UserDao @Inject()(protected val dbConfigProvider: DatabaseConfigProvider)
instead you would need to use (AFAIR):
class UserDao @Inject()(@NamedDatabase("test") protected val dbConfigProvider: DatabaseConfigProvider)
But in that case your testing becomes more complicated.
It would all be much simpler if:
1) you extracted the common configuration to common.conf,
2) you changed your application.conf to something like this:
include "common.conf"
slick.dbs {
default {
driver="slick.driver.PostgresDriver$"
db.driver="org.postgresql.Driver"
db.url="jdbc:postgresql://localhost:5432/exampledb?searchpath=public"
db.user="postgres"
db.password="postgres"
}
}
3) you changed your application.test.conf to something like this:
include "common.conf"
slick.dbs {
default {
driver="slick.driver.H2Driver$"
db.driver="org.h2.Driver"
db.url="jdbc:h2:mem:test;MODE=PostgreSQL"
db.username="sa"
db.password=""
}
}
Now the only catch is that you will have just one set of evolutions (default), which is actually not that bad, as it makes sure your test db stays in sync with your production db (at least in terms of structure).
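With that setup, the spec no longer looks up a database named test; the only adjustment needed in the question's DaoFunSpec is the lookup name. A sketch (everything else stays as in the question):

import org.scalatest.{BeforeAndAfterAll, FunSpec}
import org.scalatestplus.play.OneAppPerSuite
import play.api.db.DBApi

abstract class DaoFunSpec extends FunSpec with OneAppPerSuite with BeforeAndAfterAll {
  // the in-memory H2 database from application.test.conf is now registered under "default"
  lazy implicit val db = app.injector.instanceOf[DBApi].database("default")
}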
It's not that the above is the only solution. You could still have two differently named db configurations; in that scenario you would need to do some kind of remapping in your Guice module (you would then have one module for prod and one for test; they could inherit from one another and only override certain things, e.g. attaching the test db in place of the default one). It's basically a matter of taste.

Migrating Play Framework 2.5 - moving from Global.onStart to Dependency Injection

So I am trying to migrate a Play Framework application from version 2.4.3 to 2.5.6. I am using Squeryl and akka-quartz-scheduler; Squeryl requires setting up a session manually, and akka-quartz-scheduler runs as its own entity, as none of the other modules really depend on it, though it depends on others. So previously there has been a Global object to handle them on startup:
import org.squeryl.{Session, SessionFactory}

object Global extends GlobalSettings {
  private lazy val injector = Guice.createInjector(CustomModule)

  override def onStart(app: Application) {
    SessionFactory.concreteFactory = // Squeryl initialization http://squeryl.org/sessions-and-tx.html
    injector.getInstance(classOf[CustomScheduler]).initialize()
  }
}
This has worked before. However, on 2.5.6 I'm trying to shift away from Global.scala altogether. I'm not sure this is the best way to do it, but from the documentation it seems like it. So I'm trying to create singleton classes and load them eagerly before the application loads, as instructed here, as a replacement for onStart. So, as instructed on the eager bindings page, I have:
import com.google.inject._

class CustomModule extends AbstractModule {
  override def configure() = { // or without override
    println("configure called")
    bind(classOf[SquerylInitialization]).to(classOf[SquerylInitialization]).asEagerSingleton()
    bind(classOf[CustomScheduler]).to(classOf[CustomScheduler]).asEagerSingleton()
  }
}

import play.api.{Configuration, Application}
import play.api.db.{DB, DBApi}
import org.squeryl.{SessionFactory, Session}

@Singleton
class SquerylInitialization @Inject()(conf: Configuration, dbApi: DBApi) extends Logging {
  SessionFactory.concreteFactory = // Squeryl initialization http://squeryl.org/sessions-and-tx.html
}

import akka.actor.{ActorSystem, ActorRef}

@Singleton
class CustomScheduler @Inject()(system: ActorSystem) extends Logging {
  val scheduler: QuartzSchedulerExtension = QuartzSchedulerExtension(system)
  // other initialize code here
}
The CustomModule inherits from AbstractModule, but its configure() method is never called. The Guice documentation says that "Alternatively, play will scan the classpath for classes that implement AbstractModule". The documentation might not be the most recent, but that seems to be the way it should work.
If, for instance, I use dependency injection to load SquerylInitialization in every class that uses Squeryl, it works, but I'm not sure that's a good way to do it, as it would have to be required by tons of classes, and there is hardly any class depending on CustomScheduler.
So basically the questions are:
Why isn't CustomModule's configure() method being called to eagerly load the classes, and how should that be fixed?
Is this the standard way to load this kind of functionality, or should some other way be used?
So basically the comments are correct and the documentation was just out of date, so including
play.modules.enabled += "module.CustomModule"
helped. I thought I had tried that as well, but it turns out I hadn't. The answer was just a comment, so I can't accept it.

How do I create thread pools in Play 2.5.x?

I am currently on Play 2.4.2 and have successfully created thread pools using the following:
package threads

import scala.concurrent.ExecutionContext
import play.api.libs.concurrent.Akka
import play.api.Play.current

object Contexts {
  implicit val db: ExecutionContext = Akka.system.dispatchers.lookup("contexts.db-context")
  implicit val pdf: ExecutionContext = Akka.system.dispatchers.lookup("contexts.pdf-context")
  implicit val email: ExecutionContext = Akka.system.dispatchers.lookup("contexts.email-context")
}
and then in the code with...
Future{....}(threads.Contexts.db)
We are ready to upgrade to Play 2.5 and are having trouble understanding the documentation. The documentation for 2.4.2 uses Akka.system.dispatchers.lookup, which we use without issue. The documentation for 2.5.x uses app.actorSystem.dispatchers.lookup. As far as I know, I have to inject the app into a class, not an object. Yet the documentation clearly uses an object for the example!
Has anyone successfully created thread pools in Play 2.5.x that can help out? Is it as simple as changing Contexts to a class, then injecting it wherever I would like to use this threading? Seems odd since to use the default ExecutionContext I just have to make an implicit import.
Also, we are using Play with Scala.
If you simply change your Contexts to a class, then you will have to deal with how to get an instance of that class.
In my opinion, if you have a number of thread pools that you want to make use of, named bindings are the way to go. In the example below, I will show you how you could accomplish this with Guice.
Note that Guice injects dependencies at runtime, but it is also possible to inject dependencies at compile time.
I'm going to show it with the db context as an example. First, this is how you will be using it:
class MyService @Inject() (@Named("db") dbCtx: ExecutionContext) {
  // make db access here
}
And here's how you could define the binding:
bind[ExecutionContext].qualifiedWith("db").toProvider[DbExecutionContextProvider]
And somewhere define the provider:
class DbExecutionContextProvider @Inject() (actorSystem: ActorSystem) extends Provider[ExecutionContext] {
  override def get(): ExecutionContext = actorSystem.dispatchers.lookup("contexts.db-context")
}
You will have to do this for each of your contexts. I understand this may be a little cumbersome, and there may actually be more elegant ways to define the bindings in Guice.
Note that I have not tried this out. One issue you might stumble upon is that you could end up with conflicts, because Play already defines a binding for ExecutionContext in its BuiltinModule. You may need to override that binding to work around it.
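To tie it together, the named bindings could live in a small module that you then enable via play.modules.enabled. A rough sketch (the module name is mine, and DbExecutionContextProvider is the provider shown above):

import play.api.{Configuration, Environment}
import play.api.inject.{Binding, Module}
import scala.concurrent.ExecutionContext

class ExecutionContextsModule extends Module {
  def bindings(environment: Environment, configuration: Configuration): Seq[Binding[_]] = Seq(
    // one qualified binding per thread pool; repeat for pdf-context and email-context with their own providers
    bind[ExecutionContext].qualifiedWith("db").toProvider[DbExecutionContextProvider]
  )
}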

Play Framework: Inject multiple dependencies using constructor injection in the controller

What is the right way to inject multiple dependencies using constructor injection for a controller in Play Framework (2.4.x, which provides guice based DI out of the box) in Scala?
For example,
class ExampleController @Inject() (serviceOne: ServiceOne, serviceTwo: ServiceTwo) extends Controller {
}
The above won't compile, saying that only one or no constructor arg can be injected.
I was not able to find any good references regarding how to get this working. Any help is appreciated. Thanks.
You may need to prepend val to the parameters:
class ExampleController @Inject() (val serviceOne: ServiceOne, val serviceTwo: ServiceTwo) extends Controller {
And also check that you have the correct import:
import javax.inject.Inject
Here you can also find an example with multiple dependencies; maybe it helps.
Your code is correct. In addition to that, you need to configure a module in your application.conf:
play.modules.enabled += "com.example.HelloModule"
And then in this module you describe your dependency-injected classes:
import play.api.inject._

class HelloModule extends Module {
  def bindings(environment: Environment,
               configuration: Configuration) = Seq(
    bind[Hello].qualifiedWith("en").to[EnglishHello],
    bind[Hello].qualifiedWith("de").to[GermanHello]
  )
}
For the official documentation, see this link.
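For completeness, a binding qualified with qualifiedWith("en") as above is injected with the matching @Named annotation. A hedged sketch (Hello, EnglishHello and GermanHello are the hypothetical types from the example above):

import javax.inject.{Inject, Named}
import play.api.mvc._

class HelloController @Inject() (@Named("en") hello: Hello, @Named("de") germanHello: Hello) extends Controller {
  // use the injected Hello instances in your actions as needed
}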