How to apply play-evolutions when running tests in play-framework? - scala

I have problems with evolutions when running tests in the Play framework, using
playframework v2.6.6 for scala
play-slick v3.0.2
play-slick-evolutions v3.0.2
The test looks like this:
class TestFooController extends PlaySpec with GuiceOneServerPerSuite {

  "foo endpoint should store some data" in {
    val wsClient = app.injector.instanceOf[WSClient]
    val url = s"http://localhost:$port/foo"
    val requestData = Json.obj("foo" -> "bar")
    val response = await(wsClient.url(url).post(requestData))

    response.status mustBe OK
  }
}
The database configuration looks like this:
slick.dbs.default.driver="slick.driver.H2Driver$"
slick.dbs.default.db.driver="org.h2.Driver"
slick.dbs.default.db.url="jdbc:h2:mem:play"
Assume there is an evolution script which creates the table foos, and that this script works fine in dev mode.
When running the test the following error is thrown:
play.api.http.HttpErrorHandlerExceptions$$anon$1: Execution exception[[JdbcSQLException: Table "foos" not found;
The table foos could not be found, so I assume the database evolutions have not been applied.
I then changed the database configuration to PostgreSQL, which is the one used in dev mode.
slick.dbs.default.driver = "slick.driver.PostgresDriver$"
slick.dbs.default.db.driver = "org.postgresql.Driver"
slick.dbs.default.db.url = "jdbc:postgresql://localhost:5432/foo-test"
slick.dbs.default.db.user = "user"
slick.dbs.default.db.password = "password"
With this configuration the tests work fine and data is stored in the database, so the database evolutions ran just fine.
Now the problem is that the database is not cleaned up after the tests. I'd like to run each test suite against a clean database.
To sum up: with the H2 database the evolutions are not applied; with PostgreSQL the evolutions are applied but not cleaned up.
This happens even though the following is explicitly defined in application.test.conf:
play.evolutions.autoApply=true
play.evolutions.autoApplyDowns=true
I also tried
play.evolutions.db.default.autoApply=true
play.evolutions.db.default.autoApplyDowns=true
with no effect.
Then I tried to do this manually via:
def withManagedDatabase[T](block: Database => T): Unit = {
  val dbapi = app.injector.instanceOf[DBApi]
  val database = dbapi.database("default")

  Evolutions.applyEvolutions(database)
  block(database)
  Evolutions.cleanupEvolutions(database)
}
and then changing the test to:
"foo endpoint should store some data" in withManagedDatabase { _ =>
...
}
For the H2 database configuration it has no effect; the same error that the table foos cannot be found is thrown. For the PostgreSQL database configuration an evolutions exception is thrown:
play.api.db.evolutions.InconsistentDatabase: Database 'default' is in an inconsistent state![An evolution has not been applied properly. Please check the problem and resolve it manually before marking it as resolved.]
I want the evolution ups to run before and the evolution downs to run after each test suite. How can this be achieved?

You can use the following to apply evolutions before each suite and clean up afterwards:
import org.scalatest.{BeforeAndAfterAll, Suite}
import org.scalatestplus.play.ServerProvider
import play.api.db.{DBApi, Database}
import play.api.db.evolutions.Evolutions

trait DatabaseSupport extends BeforeAndAfterAll {
  this: Suite with ServerProvider =>

  private lazy val db = app.injector.instanceOf[DBApi]

  override protected def beforeAll(): Unit = {
    super.beforeAll()
    initializeEvolutions(db.database("default"))
  }

  override protected def afterAll(): Unit = {
    cleanupEvolutions(db.database("default"))
    super.afterAll()
  }

  private def initializeEvolutions(database: Database): Unit = {
    Evolutions.cleanupEvolutions(database)
    Evolutions.applyEvolutions(database)
  }

  private def cleanupEvolutions(database: Database): Unit = {
    Evolutions.cleanupEvolutions(database)
  }
}
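Used from the suite in the question it would look roughly like this (just a sketch; the endpoint test body is the one from the question):

class TestFooController extends PlaySpec with GuiceOneServerPerSuite with DatabaseSupport {

  "foo endpoint should store some data" in {
    // evolutions were applied in beforeAll and will be cleaned up again in afterAll
    val wsClient = app.injector.instanceOf[WSClient]
    val url = s"http://localhost:$port/foo"
    val response = await(wsClient.url(url).post(Json.obj("foo" -> "bar")))

    response.status mustBe OK
  }
}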

This is working for me:
class DAOSpec extends PlaySpec with GuiceOneAppPerSuite {
  val dbUrl = sys.env.getOrElse("DATABASE_URL", "postgres://foo:password@localhost:5432/foo")

  val testConfig = Map("db.default.url" -> dbUrl)

  implicit override def fakeApplication() = new GuiceApplicationBuilder().configure(testConfig).build()

  lazy val database = app.injector.instanceOf[Database]
  lazy val dao = app.injector.instanceOf[DAO]

  "create" must {
    "work" in Evolutions.withEvolutions(database) {
      val foo = await(dao.create("foo"))
      foo.id must not be null
    }
  }
}
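As an aside on the H2 setup from the question: one possible reason the evolutions appear not to be applied with jdbc:h2:mem:play is that an in-memory H2 database is discarded as soon as the last connection to it is closed (see the H2 documentation quoted in the related question below), so the schema created by the evolutions can already be gone by the time the test runs its query. If that is the cause, keeping the database alive for the lifetime of the JVM should help, e.g.:

slick.dbs.default.driver = "slick.driver.H2Driver$"
slick.dbs.default.db.driver = "org.h2.Driver"
slick.dbs.default.db.url = "jdbc:h2:mem:play;DB_CLOSE_DELAY=-1"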

Related

Slick: Updates not available when fetched just after

I was trying out this Slick example, and when I create an entry and then try to fetch it right after, I don't get the record. I modified the test case, which is here, as below.
val response = create(BankProduct("car loan", 1)).flatMap(getById)

whenReady(response) { p =>
  assert(p.get === BankProduct("car loan", 1))
}
The above fails because the created BankProduct cannot be fetched immediately.
It uses an H2 database for this, and below is the configuration.
trait H2DBComponent extends DBComponent {
  val logger = LoggerFactory.getLogger(this.getClass)

  val driver = slick.driver.H2Driver
  import driver.api._

  val randomDB = "jdbc:h2:mem:test" + UUID.randomUUID().toString() + ";"
  val h2Url = randomDB + "MODE=MySql;DATABASE_TO_UPPER=false;INIT=runscript from 'src/test/resources/schema.sql'\\;runscript from 'src/test/resources/schemadata.sql'"

  val db: Database = {
    logger.info("Creating test connection")
    Database.forURL(url = h2Url, driver = "org.h2.Driver")
  }
}
private[repo] trait BankProductTable extends BankTable { this: DBComponent =>
  import driver.api._

  private[BankProductTable] class BankProductTable(tag: Tag) extends Table[BankProduct](tag, "bankproduct") {
    val id = column[Int]("id", O.PrimaryKey, O.AutoInc)
    val name = column[String]("name")
    val bankId = column[Int]("bank_id")

    def bank = foreignKey("bank_product_fk", bankId, bankTableQuery)(_.id)

    def * = (name, bankId, id.?) <> (BankProduct.tupled, BankProduct.unapply)
  }

  protected val bankProductTableQuery = TableQuery[BankProductTable]

  protected def bankProductTableAutoInc = bankProductTableQuery returning bankProductTableQuery.map(_.id)
}
I don't understand why this is happening or how to avoid it.
I also tried adding the autoCommit property, but that didn't work either.
I'd appreciate any help clarifying this.
This might be due to the in-memory database content being lost after the create call closes its connection. According to the docs:
By default, closing the last connection to a database closes the
database. For an in-memory database, this means the content is lost.
To keep the database open, add ;DB_CLOSE_DELAY=-1 to the database URL.
To keep the content of an in-memory database as long as the virtual
machine is alive, use jdbc:h2:mem:test;DB_CLOSE_DELAY=-1.
However, after adding DB_CLOSE_DELAY=-1 there will be errors due to
runscript from 'src/test/resources/schemadata.sql'
which is executed on each connection, so some refactoring is necessary such that the database is populated only once, on initialization.
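A minimal sketch of such a refactoring, reusing the DBComponent trait from the question (same script paths as above; using H2's RunScript here is just one way to run the scripts a single time):

trait H2DBComponent extends DBComponent {
  val driver = slick.driver.H2Driver
  import driver.api._

  // one uniquely named in-memory database, kept alive for the whole JVM
  private val h2Url =
    "jdbc:h2:mem:test" + java.util.UUID.randomUUID().toString +
      ";MODE=MySql;DATABASE_TO_UPPER=false;DB_CLOSE_DELAY=-1"

  val db: Database = Database.forURL(url = h2Url, driver = "org.h2.Driver")

  // populate schema and data exactly once, instead of via INIT on every connection
  locally {
    val connection = java.sql.DriverManager.getConnection(h2Url)
    try {
      org.h2.tools.RunScript.execute(connection, new java.io.FileReader("src/test/resources/schema.sql"))
      org.h2.tools.RunScript.execute(connection, new java.io.FileReader("src/test/resources/schemadata.sql"))
    } finally connection.close()
  }
}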

Slick Connection Pool on per requests

How can I use the Slick connection pool?
For example, with this config:
database {
  dataSourceClass = org.postgresql.ds.PGSimpleDataSource
  driver = org.postgresql.Driver
  properties = {
    url = "jdbc:postgresql://172.17.0.2/sampleDB"
    user = "user"
    password = "userpass"
  }
  minConnections = 10
  maxConnections = 20
  numThreads = 10
}
I have only one client, and this client requests all persons from the API via a web browser.
Slick now creates 10 connections to the database.
In a second step the client refreshes the browser and Slick creates 10 new connections to the database without reusing the previous ones.
After another refresh Slick creates yet another 10 connections. (Now I have about 30 connections on the DB with only one client!)
Why? Is this normal?
Why does maxConnections not work?
Must I close the connections after each request?
Or am I missing some configuration?
Update
This is my sample API:
trait PersonsApi extends DatabaseConfig with JsonMapper {

  val getAllPersons = (path("persons") & get) {
    complete(db.run(PersonDao.findAll))
  }

  val getPersonById = (path("persons" / IntNumber) & get) {
    num => complete(db.run(PersonDao.findById(num)))
  }

  val personsApi =
    getAllPersons ~
      getPersonById
}
This is my example entity class (DAO pattern):
class PersonTable(tag: Tag) extends Table[Person](tag, "persons") {
  def id = column[Long]("id", O.AutoInc, O.PrimaryKey)
  def name = column[String]("name")
  def family = column[String]("family")

  override def * : ProvenShape[Person] = (id.?, name, family) <> (Person.tupled, Person.unapply)
}

object PersonDao extends BaseDao {
  def findAll = personTable.result
  def findById(id: Long) = personTable.filter(_.id === id).result
}
This is the DatabaseConfig trait:
trait DatabaseConfig extends Config {
  val driver = slick.driver.PostgresDriver
  import driver.api._

  def db = Database.forConfig("database")
}
Note: I don't use the Play framework.
Your configuration seems to be fine. It's impossible to say without further code samples from your application, but my bet is that you are creating your db on each and every request.
Just make sure that this code:
Database.forConfig("database")
is executed only once, for example by:
putting it behind a singleton injected dependency, or
using play-slick and its way of dealing with the Slick configuration (if you are using Play, which is, again, not possible to tell from your question, though I assumed it since you mentioned web requests).
EDIT (after the question update):
And we have an answer: each time you call the db method, a new Database object (together with its connection pool) is created. Just move it as I suggested above so that it is created once per application lifecycle. The easiest way (not necessarily the best one) would be to change this line:
def db = Database.forConfig("database")
to this:
lazy val db = Database.forConfig("database")
The above would immediately solve your problem (assuming that there is only one instance of PersonsApi created in your application).
Another (perhaps better) solution would be to create something like this:
object DatabaseConfig extends Config {
  val driver = slick.driver.PostgresDriver
  import driver.api._

  lazy val db = Database.forConfig("database")
}
and then change your API to this:
trait PersonsApi extends JsonMapper {

  val getAllPersons = (path("persons") & get) {
    complete(DatabaseConfig.db.run(PersonDao.findAll))
  }

  val getPersonById = (path("persons" / IntNumber) & get) {
    num => complete(DatabaseConfig.db.run(PersonDao.findById(num)))
  }

  val personsApi =
    getAllPersons ~
      getPersonById
}
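One more point, not covered in the answer above and offered only as a hedged suggestion: since Database.forConfig sets up a connection pool and a thread pool, it is also worth closing it when the application shuts down, otherwise those resources live on until the JVM exits. For example:

// release the pool when the process terminates
sys.addShutdownHook {
  DatabaseConfig.db.close()
}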

Close or shutdown of H2 database after tests is not working

I am facing a problem with database clean-up after each test when using ScalaTest with Slick.
Here is the code of the test:
class H2DatabaseSpec extends WordSpec with Matchers with ScalaFutures with BeforeAndAfter {
  implicit override val patienceConfig = PatienceConfig(timeout = Span(5, Seconds))

  val h2DB: H2DatabaseService = new H2DatabaseService
  var db: Database = _

  before {
    db = Database.forConfig("h2mem1")
    h2DB.createSchema.futureValue
  }

  after {
    db.shutdown.futureValue
  }

  "H2 database" should {
    "query a question" in {
      val newQuestion: QuestionEntity = QuestionEntity(Some(1L), "First question")
      h2DB.insertQuestion(newQuestion).futureValue

      val question = h2DB.getQuestionById(1L)
      question.futureValue.get shouldBe newQuestion
    }
  }

  it should {
    "query all questions" in {
      val newQuestion: QuestionEntity = QuestionEntity(Some(2L), "Second question")
      h2DB.insertQuestion(newQuestion).futureValue

      val questions = h2DB.getQuestions
      questions.futureValue.size shouldBe 1
    }
  }
}
The database service just invokes the run method on the defined database:
class H2DatabaseService {
  val db = Database.forConfig("h2mem1")
  val questions = TableQuery[Question]

  def createSchema =
    db.run(questions.schema.create)

  def getQuestionById(id: Long): Future[Option[QuestionEntity]] =
    db.run(questions.filter(_.id === id).result.headOption)

  def getQuestions: Future[Seq[QuestionEntity]] =
    db.run(questions.result)

  def insertQuestion(question: QuestionEntity): Future[Int] =
    db.run(questions += question)
}

class Question(tag: Tag) extends Table[QuestionEntity](tag, "QUESTION") {
  def id = column[Option[Long]]("QUESTION_ID", O.PrimaryKey, O.AutoInc)
  def title = column[String]("TITLE")

  def * = (id, title) <> ((QuestionEntity.apply _).tupled, QuestionEntity.unapply)
}

case class QuestionEntity(id: Option[Long] = None, title: String) {
  require(!title.isEmpty, "title.empty")
}
And the database is defined as follows:
h2mem1 = {
  url = "jdbc:h2:mem:test1"
  driver = org.h2.Driver
  connectionPool = disabled
  keepAliveConnection = true
}
I am using Scala 2.11.8, Slick 3.1.1, H2 database 1.4.192 and scalatest 2.2.6.
The error that appears when the tests are executed is Table "QUESTION" already exists. So it looks like the shutdown() method has no effect at all (but it is invoked; I checked in the debugger).
Does anybody know how to handle such a scenario? How can the database be cleaned up properly after each test?
The database was not cleaned up correctly because the method was being invoked on a different object.
H2DatabaseService has its own db object and the test has its own. The issue was fixed after refactoring the H2 database service and invoking:
after {
  h2DB.db.shutdown.futureValue
}
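One way such a refactoring can look (a sketch only; the exact change is not shown above) is to let the service receive the Database instead of creating its own, so the test controls the single instance it later shuts down:

import slick.driver.H2Driver.api._

// the service no longer builds its own Database
class H2DatabaseService(db: Database) {
  val questions = TableQuery[Question]

  def createSchema = db.run(questions.schema.create)
  def insertQuestion(question: QuestionEntity) = db.run(questions += question)
  // ...remaining queries unchanged, all going through the passed-in db
}

// in the spec: one Database per test, shared with the service
var db: Database = _
var h2DB: H2DatabaseService = _

before {
  db = Database.forConfig("h2mem1")
  h2DB = new H2DatabaseService(db)
  h2DB.createSchema.futureValue
}

after {
  db.shutdown.futureValue
}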

Testing Play + Slick app

I have a simple CRUD app built with Scala Play 2.4.3 and play-slick 1.1.0 (Slick 3.1.0) that uses a MySQL database for persistent storage.
I was trying to create the tests for my app and I saw two main options:
mocking database access, which, as far as I've seen, requires some code changes
making the tests use an alternative database (probably an in-memory H2)
What's the best approach (advantages and disadvantages)?
I prefer the second approach, but I'm finding some difficulties in setting up the tests.
What do I need to do? First, I think I need to make the tests run with a FakeApplication, right? Do I need any sbt dependency to be able to do that?
After that, how do I specify that the H2 database should be used?
I had the same struggle and I came up with a solution like this (using the second approach):
Create a context for the DAOs to use:
trait BaseContext {
  def dbName: String

  val dbConfig = DatabaseConfigProvider.get[JdbcProfile](dbName)
  val db = dbConfig.db
  val profile = dbConfig.driver

  val tables = new Tables { // this is generated by the Schema Code Generator
    override val profile: JdbcProfile = dbConfig.driver
  }
}

@Singleton
class AppContext extends BaseContext {
  def dbName = "mysql" // name in your conf right after "slick.dbs"
}

@Singleton
class TestingContext extends BaseContext {
  def dbName = "h2"
}
Then create a module to bind the injection, and don't forget to enable it in conf using play.modules.enabled += "your.Module":
class ContextModule(environment: Environment, configuration: Configuration) extends AbstractModule {
  override def configure(): Unit = {
    if (configuration.getString("app.mode").contains("test")) {
      bind(classOf[BaseContext])
        .to(classOf[TestingContext])
    } else {
      bind(classOf[BaseContext])
        .to(classOf[AppContext])
    }
  }
}
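In the conf used for tests that means having both the environment marker and the module enabled, for example (the package name is just a placeholder):

app.mode = "test"
play.modules.enabled += "your.package.ContextModule"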
And inject it to every DAO you've created:
class SomeDAO @Inject()(context: BaseContext) {
  val dbConfig = context.dbConfig
  val db = context.db
  val tables = context.tables
  import tables.profile.api._

  def otherStuff....
  // you can call db.run(...), tables.WhateverYourTableIs, tables.TableRowCaseClass, ...
}
And the final step: your configuration file. In my case I used app.mode to mark the environment, and I use a separate .conf file for each environment. Of course, these conf files must contain the correct DB configuration. Here's a sample:
app.mode = "test"
# Database configuration
slick.dbs = {
# for unit test
h2 {
driver = "slick.driver.H2Driver$"
db = {
url = "jdbc:h2:mem:test;MODE=MYSQL"
driver = "org.h2.Driver"
keepAliveConnection = true
}
}
}
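With that in place, a test can simply pull the DAO out of the injector and work against H2, for instance (the spec name and assertion are illustrative, and this assumes the test run loads the conf that sets app.mode = "test"):

class SomeDAOSpec extends PlaySpec with OneAppPerSuite {
  // app.mode = "test" makes ContextModule bind BaseContext to TestingContext (H2)
  lazy val someDAO: SomeDAO = app.injector.instanceOf[SomeDAO]

  "SomeDAO" must {
    "run against the in-memory H2 database" in {
      // ...call someDAO methods here and assert on the results
    }
  }
}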
I'm pretty sure my solution is not an elegant one, but it delivers the goods. :)
Any better solution is welcomed!
My solution was to add step(Play.start(fakeApp)) at the beginning of each spec and step(Play.stop(fakeApp)) at the end of each spec.
Also:
def fakeApp: FakeApplication = {
  FakeApplication(additionalConfiguration =
    Map(
      "slick.dbs.default.driver" -> "slick.driver.H2Driver$",
      "slick.dbs.default.db.driver" -> "org.h2.Driver",
      "slick.dbs.default.db.url" -> "jdbc:h2:mem:play"
    ))
}
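Put together, a spec using this approach might look like the following sketch (the spec name and example body are illustrative; fakeApp is the helper defined above, built once and reused for start and stop):

import org.specs2.mutable.Specification
import play.api.Play
import play.api.test.FakeApplication

class PersonsSpec extends Specification {

  // build one FakeApplication instance and reuse it for start and stop
  val application: FakeApplication = fakeApp

  step(Play.start(application))

  "the persons endpoint" should {
    "store and read data using the in-memory H2 database" in {
      // ...exercise the application here
      ok
    }
  }

  step(Play.stop(application))
}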
This was needed because I'm using play-slick, which requires configurations like:
slick.dbs.default.driver = "slick.driver.MySQLDriver$"
slick.dbs.default.db.driver = "com.mysql.jdbc.Driver"
slick.dbs.default.db.url = "jdbc:mysql://localhost/database"
slick.dbs.default.db.user = "user"
slick.dbs.default.db.password = "password"
More info in the docs.

How to manually apply evolutions in tests with Slick and Play! 2.4

I would like to manually run my evolution script at the beginning of each test file. I'm working with Play! 2.4 and Slick 3.
According to the documentation, the way to go seems to be:
Evolutions.applyEvolutions(database)
but I don't manage to get an instance of my database. In the documentation, play.api.db.Databases is imported in order to get a database instance, but if I try to import it, I get this error: object Databases is not a member of package play.api.db
How can I get an instance of my database in order to run the evolution script?
Edit: as asked in the comments, here is the entire source code giving the error:
import models._
import org.scalatest.concurrent.ScalaFutures._
import org.scalatest.time.{Seconds, Span}
import org.scalatestplus.play._
import play.api.db.evolutions.Evolutions
import play.api.db.Databases

class TestAddressModel extends PlaySpec with OneAppPerSuite {

  lazy val appBuilder = new GuiceApplicationBuilder()
  lazy val injector = appBuilder.injector()
  lazy val dbConfProvider = injector.instanceOf[DatabaseConfigProvider]

  def beforeAll() = {
    //val database: Database = ???
    //Evolutions.applyEvolutions(database)
  }

  "test" must {
    "test" in { }
  }
}
I finally found this solution. I inject with Guice:
lazy val appBuilder = new GuiceApplicationBuilder()
lazy val injector = appBuilder.injector()
lazy val databaseApi = injector.instanceOf[DBApi] //here is the important line
(You have to import play.api.db.DBApi.)
And in my tests, I simply do the following (actually I use another database for my tests):
override def beforeAll() = {
  Evolutions.applyEvolutions(databaseApi.database("default"))
}

override def afterAll() = {
  Evolutions.cleanupEvolutions(databaseApi.database("default"))
}
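Putting the pieces together (note that beforeAll/afterAll only run if the suite mixes in BeforeAndAfterAll), the spec from the question would look roughly like this sketch:

import org.scalatest.BeforeAndAfterAll
import org.scalatestplus.play._
import play.api.db.DBApi
import play.api.db.evolutions.Evolutions
import play.api.inject.guice.GuiceApplicationBuilder

class TestAddressModel extends PlaySpec with OneAppPerSuite with BeforeAndAfterAll {

  lazy val appBuilder = new GuiceApplicationBuilder()
  lazy val injector = appBuilder.injector()
  lazy val databaseApi = injector.instanceOf[DBApi]

  override def beforeAll() = {
    Evolutions.applyEvolutions(databaseApi.database("default"))
  }

  override def afterAll() = {
    Evolutions.cleanupEvolutions(databaseApi.database("default"))
  }

  "test" must {
    "test" in { }
  }
}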
Considering that you are using Play 2.4, where evolutions were moved into a separate module, you have to add evolutions to your project dependencies.
libraryDependencies += evolutions
Source: Evolutions
Relevant commit: Split play-jdbc into three different modules
To have access to play.api.db.Databases, you must add jdbc to your dependencies:
libraryDependencies += jdbc
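So in build.sbt both can be pulled in together, for example:

libraryDependencies ++= Seq(evolutions, jdbc)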
Hope it helps some people passing here.
EDIT: the code would then look like this:
import play.api.db.Databases
val database = Databases(
  driver = "com.mysql.jdbc.Driver",
  url = "jdbc:mysql://localhost/test",
  name = "mydatabase",
  config = Map(
    "user" -> "test",
    "password" -> "secret"
  )
)
You now have an instance of the DB and can execute queries on it:
val statement = database.getConnection().createStatement()
val resultSet = statement.executeQuery("some_sql_query")
You can see more in the docs.
I find the easiest way to run tests with evolutions applied is to use FakeApplication, and input the connection info for the DB manually.
def withDB[T](code: => T): T =
  // Create application to run database evolutions
  running(FakeApplication(additionalConfiguration = Map(
    "db.default.driver" -> "<my-driver-class>",
    "db.default.url" -> "<my-db-url>",
    "db.default.user" -> "<my-db>",
    "db.default.password" -> "<my-password>",
    "evolutionplugin" -> "enabled"
  ))) {
    // Start a db session
    withSession(code)
  }
Use it like this:
"test" in withDB { }
This allows you, for example, to use an in-memory database for speeding up your unit tests.
You can access the DB instance as play.api.db.DB if you need it. You'll also need to import play.api.Play.current.
Use FakeApplication to read your DB configuration and provide a DB instance.
def withDB[T](code: => T): T =
  // Create application to run database evolutions
  running(FakeApplication(additionalConfiguration = Map(
    "evolutionplugin" -> "disabled"))) {
    import play.api.Play.current
    val database = play.api.db.DB

    Evolutions.applyEvolutions(database)
    withSession(code)
    Evolutions.cleanupEvolutions(database)
  }
Use it like this:
"test" in withDB { }