Testing Scala Play + Slick: why is it picking the wrong database? - scala

I'm working on this GitHub project that uses Play 2.5.10 and Slick 3.1.1 (the issue can be reproduced there by running sbt test, and can also be seen directly in Travis CI). I use a Postgres database configuration named default for development and production, and an H2 in-memory database named test for testing. The default database is configured in conf/application.conf whereas the test database is configured in conf/application.test.conf.
The problem is that for testing I initialize the database with name test but the Application built with GuiceApplicationBuilder is still picking up the default one.
This line is in my build.sbt to pick up the test configuration:
javaOptions in Test += "-Dconfig.file=conf/application.test.conf"
and this is the content of that file:
include "application.conf"
slick.dbs {
  test {
    driver = "slick.driver.H2Driver$"
    db.driver = "org.h2.Driver"
    db.url = "jdbc:h2:mem:test;MODE=PostgreSQL"
    db.username = "sa"
    db.password = ""
  }
}
My DaoFunSpec base class looks like this:
package dao
import org.scalatest.{BeforeAndAfterAll, FunSpec}
import org.scalatestplus.play.OneAppPerSuite
import play.api.Application
import play.api.db.evolutions.Evolutions
import play.api.db.DBApi
abstract class DaoFunSpec extends FunSpec with OneAppPerSuite with BeforeAndAfterAll {
  lazy implicit val db = app.injector.instanceOf[DBApi].database("test")

  override def beforeAll() {
    Evolutions.applyEvolutions(db)
  }

  override def afterAll() {
    Evolutions.cleanupEvolutions(db)
  }

  def userDao(implicit app: Application) = {
    Application.instanceCache[UserDao].apply(app)
  }
}
Note the line app.injector.instanceOf[DBApi].database("test"), but Play still tries to connect to the default database.

Ok, your problem is a bit different (or perhaps a little unexpected). This is the line that causes your headache:
dbApi.databases().foreach(runEvolutions)
It's in ApplicationEvolutions.scala:42.
It's probably self-explanatory :)
Still, the problem is more involved. You actually have two databases in your test environment (default and test). This leads to several problems, one of which you see above (evolutions are attempted on each of them). Another is that if you want to use a differently named db, you can't just inject something like this:
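As an aside, if you did want to keep both entries, Play lets you switch evolutions off per database, so the unused one would at least be skipped. A hedged sketch using the standard Play 2.5 evolutions keys:

```hocon
# in application.test.conf — skip evolutions for the unused default db
play.evolutions.db.default.enabled = false
```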
class UserDao @Inject()(protected val dbConfigProvider: DatabaseConfigProvider)
instead you would need to use (AFAIR):
class UserDao @Inject()(@NamedDatabase("test") protected val dbConfigProvider: DatabaseConfigProvider)
But in that case your testing becomes more complicated.
It would all be much simpler if:
1) You extracted the common configuration to common.conf
2) You changed your application.conf to something like this:
include "common.conf"
slick.dbs {
  default {
    driver = "slick.driver.PostgresDriver$"
    db.driver = "org.postgresql.Driver"
    db.url = "jdbc:postgresql://localhost:5432/exampledb?searchpath=public"
    db.user = "postgres"
    db.password = "postgres"
  }
}
3) You changed your application.test.conf to something like this:
include "common.conf"
slick.dbs {
  default {
    driver = "slick.driver.H2Driver$"
    db.driver = "org.h2.Driver"
    db.url = "jdbc:h2:mem:test;MODE=PostgreSQL"
    db.username = "sa"
    db.password = ""
  }
}
Now the only catch is that you should rather have one set of evolutions (default), which is actually not that bad, as it makes sure your test db stays in sync with your production db (at least in terms of structure).
That said, the above is not the only solution. You could still have two differently named db configurations; in that scenario you would need to do some kind of remapping in your Guice module (you would then have one module for prod and one for test; they could inherit from one another and only override certain things, e.g. attaching the test db in place of the default one). It's basically a matter of taste.
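A simpler variant of that remapping idea can be done entirely from the test suite: override the default db keys via GuiceApplicationBuilder instead of a separate config file. This is a hedged sketch (class name is illustrative, assuming scalatestplus-play's OneAppPerSuite as in the question):

```scala
import org.scalatest.FunSpec
import org.scalatestplus.play.OneAppPerSuite
import play.api.Application
import play.api.inject.guice.GuiceApplicationBuilder

// Sketch: point the *default* Slick database at in-memory H2 from within
// the suite, so evolutions and injection all target the same single db.
abstract class H2DaoFunSpec extends FunSpec with OneAppPerSuite {
  implicit override lazy val app: Application =
    new GuiceApplicationBuilder()
      .configure(
        "slick.dbs.default.driver"    -> "slick.driver.H2Driver$",
        "slick.dbs.default.db.driver" -> "org.h2.Driver",
        "slick.dbs.default.db.url"    -> "jdbc:h2:mem:test;MODE=PostgreSQL"
      )
      .build()
}
```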

Related

With Play framework what am I doing wrong in setting up my routers

I'm a newbie to Play and Scala (version 2.6) and I can't figure out how to get routing to work in a simple fashion. Cobbling together examples from the 2.6 documentation, I've managed to create a custom application loader, which I understood was required to perform Evolutions migrations. The example I found included a var router = Router.empty. BuiltInComponentsFromContext appears to require a router, but the way I've done it my routes are now broken and all I get are "Action Not Found" messages.
Here is my application.conf:
play.application.loader=MyApplicationLoader
router = my.application.Router
Here is the Application Loader
import play.api.ApplicationLoader
import play.api.ApplicationLoader.Context
import play.api.BuiltInComponentsFromContext
import play.api.db.{Database, DBComponents, HikariCPComponents}
import play.api.db.evolutions.EvolutionsComponents
import play.api.routing.Router
import play.filters.HttpFiltersComponents
//import com.softwaremill.macwire._
class MyApplicationLoader extends ApplicationLoader {
  def load(context: Context) = {
    new MyComponents(context).application
  }
}

class MyComponents(cntx: Context)
  extends BuiltInComponentsFromContext(cntx)
  with DBComponents
  with EvolutionsComponents
  with HikariCPComponents
  with HttpFiltersComponents {

  //lazy val router = Router.empty
  val router = Router.empty

  // this will actually run the database migrations on startup
  applicationEvolutions
}
It looks to me that by declaring:
val router = Router.empty
I'm essentially invalidating any of the routes declared in my conf/routes file. It occurs to me to use the Router.load method, but I can't find an example of how to pass the required environment and configuration values to it. Assuming I don't want to use static routes, how do I do this?
Assuming you adopted compile-time dependency injection just for the sake of Evolutions (because otherwise you'd have faced the same problems earlier), the answer is that you don't have to do that. Evolutions work with the default runtime dependency injection as well. The part of the documentation you are probably basing your assumptions on actually says that if you are already using compile-time dependency injection, here is how to modify it to make evolutions work. If you look at the source code of EvolutionsModule you can see that ApplicationEvolutions is bound eagerly, meaning an instance of ApplicationEvolutions is created at the start of the app, during application initialization. And in the source code of ApplicationEvolutions itself you can see that start() is called from the constructor. So as long as you provide the configuration, the rest should work on its own.
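Concretely, under runtime DI the custom loader can simply be deleted; a sketch of the configuration that remains (assuming the standard jdbc + evolutions dependencies are on the classpath; the autoApply key and the H2 values are illustrative):

```hocon
# application.conf — no play.application.loader line needed; the
# EvolutionsModule binds ApplicationEvolutions eagerly, so migrations
# run during application startup.
db.default.driver = org.h2.Driver
db.default.url = "jdbc:h2:mem:play"

# optional: apply evolutions without the browser prompt (use with care)
play.evolutions.db.default.autoApply = true
```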

Migrating Play Framework 2.5 - moving from Global.onStart to Dependency Injection

So I am trying to migrate a Play Framework application from version 2.4.3 to 2.5.6. I am using Squeryl and akka-quartz-scheduler; Squeryl requires setting up a session manually, and akka-quartz-scheduler runs as its own entity, since none of the other modules really depend on it (though it depends on others). So previously there was a Global object to handle them on start-up:
import org.squeryl.{Session, SessionFactory}

object Global extends GlobalSettings {
  private lazy val injector = Guice.createInjector(CustomModule)

  override def onStart(app: Application) {
    SessionFactory.concreteFactory = // Squeryl initialization http://squeryl.org/sessions-and-tx.html
    injector.getInstance(classOf[CustomScheduler]).initialize()
  }
}
This has worked before. However, on 2.5.6 I'm trying to shift away from Global.scala altogether. I'm not sure this is the best way to do it, but from the documentation it seems to be. So I'm trying to create singleton classes and load them eagerly before the application loads, as instructed here, as a replacement for onStart. So, as instructed on the eager bindings page, I have:
import com.google.inject._

class CustomModule extends AbstractModule {
  override def configure() = { // or without override
    println("configure called")
    bind(classOf[SquerylInitialization]).to(classOf[SquerylInitialization]).asEagerSingleton()
    bind(classOf[CustomScheduler]).to(classOf[CustomScheduler]).asEagerSingleton()
  }
}
import play.api.{Configuration, Application}
import play.api.db.{DB, DBApi}
import org.squeryl.{SessionFactory, Session}

@Singleton
class SquerylInitialization @Inject()(conf: Configuration, dbApi: DBApi) extends Logging {
  SessionFactory.concreteFactory = // Squeryl initialization http://squeryl.org/sessions-and-tx.html
}

import akka.actor.{ActorSystem, ActorRef}

@Singleton
class CustomScheduler @Inject()(system: ActorSystem) extends Logging {
  val scheduler: QuartzSchedulerExtension = QuartzSchedulerExtension(system)
  // other initialize code here
}
The configure() method of CustomModule (which inherits AbstractModule) is never called. The Guice documentation says that "Alternatively, play will scan the classpath for classes that implement AbstractModule". The documentation might not be the most recent, but that seems to be the way it should work.
If, for instance, I use dependency injection to load SquerylInitialization in every class that uses Squeryl, it works, but I'm not sure that's a good way to do it, as it would have to be required by tons of classes, and there is hardly any class that depends on CustomScheduler.
So basically the questions are:
1) Why isn't the CustomModule calling the configure() method and eagerly loading the classes, and how should that be fixed?
2) Is this the standard way to load this kind of functionality, or should some other way be used?
So basically the comments are correct and the documentation was just out of date, so including
play.modules.enabled += "module.CustomModule"
helped. I thought I had tried that as well, but it turns out I hadn't. The answer is just a comment, so I can't accept it.

Scala and Slick: DatabaseConfigProvider in standalone application

I have a Play 2.5.3 application which uses Slick for reading objects from the DB.
The service classes are built in the following way:
class SomeModelRepo @Inject()(protected val dbConfigProvider: DatabaseConfigProvider) {
  val dbConfig = dbConfigProvider.get[JdbcProfile]
  import dbConfig.driver.api._
  val db = dbConfig.db
  ...
Now I need some standalone Scala scripts to perform some operations in the background. I need to connect to the DB within them and I would like to reuse my existing service classes to read objects from DB.
To instantiate a SomeModelRepo object I need to pass some DatabaseConfigProvider as a parameter. I tried to run:
object SomeParser extends App {
  object testDbProvider extends DatabaseConfigProvider {
    def get[P <: BasicProfile]: DatabaseConfig[P] = {
      DatabaseConfigProvider.get("default")(Play.current)
    }
  }
  ...
  val someRepo = new SomeModelRepo(testDbProvider)
however I get the error "There is no started application" on the line with (Play.current). Moreover, the method current in object Play is deprecated and should be replaced with DI.
Is there any way to initialize my SomeModelRepo class' object within the standalone object SomeParser?
Best regards
When you start your Play application, the PlaySlick module handles the Slick configurations for you. With it you have two choices:
inject DatabaseConfigProvider and get the driver from there, or
do a global lookup via DatabaseConfigProvider.get[JdbcProfile](Play.current), which is not preferred.
Either way, you must have your Play app running! Since that is not the case with your standalone scripts, you get the error: "There is no started application".
So, you will have to use Slick's default approach, by instantiating db directly from config:
val db = Database.forConfig("default")
You have lots of examples in Lightbend's templates.
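A hedged sketch of that standalone approach (assuming the script ships its own application.conf with a top-level default block in plain Slick format, which differs from Play-Slick's slick.dbs.default.db layout; the H2 driver is illustrative):

```scala
import scala.concurrent.Await
import scala.concurrent.duration._
import slick.driver.H2Driver.api._ // or PostgresDriver.api._, to match your db

object SomeParser extends App {
  // expects e.g.:  default { driver = org.h2.Driver, url = "jdbc:h2:mem:test" }
  val db = Database.forConfig("default")
  try {
    // run whatever background queries you need; a trivial sanity check:
    val one = Await.result(db.run(sql"select 1".as[Int].head), 10.seconds)
    println(one)
  } finally db.close()
}
```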
EDIT: Sorry, I didn't read the whole question. Do you really need to have it as a separate application? You can run your background operations when your app starts, like here. In that example, the InitialData class is instantiated as an eager singleton, so its insert() method runs immediately when the app starts.

Complete unit testing example in play framework + DI

I am looking for a complete example of a unit test for a Play 2.4 application + DI.
The idea is very simple:
We have application.test.conf, from which I want to read the configuration data, run evolutions, etc.
Then I want to inject an instance of a class which uses DI, for example:
class UserBean #Inject()(dbConfigProvider: DatabaseConfigProvider, implicit val configuration: Configuration, cacheApi: CacheApi) {
}
Then call methods of the injected object and test it.
The problem that I faced described here: https://stackoverflow.com/questions/37192401/inject-my-bean-like-class-to-test-play-2-4 but nobody answers my question.
Appreciate any help
Java system properties have the highest precedence when loading a conf file with Typesafe Config. You can tell sbt to use a different config file when running the tests:
javaOptions in Test += "-Dconfig.resource=" + System.getProperty("config.resource", "application.test.conf")
You can create your test Application with GuiceBuilder, see here.
Note that you must have a running app in your test, like:
val myTestApp = new GuiceApplicationBuilder()
  .overrides(bind[Component].to[MockComponent])
  .build

"my test" in running(myTestApp) { ... }
And then you can use injector, like this:
val app2MyDao = play.api.Application.instanceCache[MyDAO]
val myDAO: MyDAO = app2MyDao(myTestApp)
You can also use ScalaTest's traits like OneAppPerSuite and override its fake app.
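Putting those pieces together, a minimal complete test might look roughly like this (MyDAO and the assertion body are placeholders for your own DAO and checks):

```scala
import org.scalatestplus.play.PlaySpec
import play.api.Application
import play.api.inject.guice.GuiceApplicationBuilder
import play.api.test.Helpers.running

class MyDaoSpec extends PlaySpec {
  // build a test Application (override bindings/config here as needed)
  val myTestApp: Application = new GuiceApplicationBuilder().build

  "MyDAO" should {
    "answer a simple query" in running(myTestApp) {
      // pull the DAO out of the running app's injector
      val myDAO = Application.instanceCache[MyDAO].apply(myTestApp)
      // call myDAO methods and assert on the results here
    }
  }
}
```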
EDIT: I've made a simple project to demonstrate what I wanted to show.

How to add 'init' and 'destroy' like methods in scala play webapp?

I am very new to Play and I am writing a webapp using Scala (Play version 2.3.9), in which I wish to implement the following functionality:
When the application runs for the first time, five separate variables are to be read from a file (or a DB - this is yet to be decided).
These variables can be globally interfered with and updated while the app runs.
When the app shuts down, the file (or DB) is to be saved with the latest values for these variables.
I need to define Java servlet-like init and destroy functionality to achieve this task. Could somebody guide me on how to do it?
Based on the Play 2.3.9 documentation, you should define your hooks via GlobalSettings:
import play.api._

object Global extends GlobalSettings {
  override def onStart(app: Application) {
    Logger.info("Application has started")
  }

  override def onStop(app: Application) {
    Logger.info("Application shutdown...")
  }
}
But I recommend moving to a newer version (>= 2.4), where GlobalSettings was deprecated and so the way to add start and stop hooks changed.
To define a "start hook" you can add your own Guice module to the application configuration; there you can make whatever you need happen when the application starts.
And to add a "stop hook" you should use ApplicationLifecycle; see more here.
import scala.concurrent.Future
import javax.inject._
import play.api.inject.ApplicationLifecycle

@Singleton
class MessageQueueConnection @Inject() (lifecycle: ApplicationLifecycle) {
  val connection = connectToMessageQueue()
  lifecycle.addStopHook { () =>
    Future.successful(connection.stop())
  }
  //...
}
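The "start hook" half can be sketched as an eager binding in a module of your own (StartupModule and the modules package name are illustrative; any work in the bound class's constructor then runs at startup):

```scala
import com.google.inject.AbstractModule

// enabled via: play.modules.enabled += "modules.StartupModule"
class StartupModule extends AbstractModule {
  override def configure(): Unit =
    // eager singleton: the constructor (your init code) runs at startup
    bind(classOf[MessageQueueConnection]).asEagerSingleton()
}
```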