Play2 and Scala, How should I configure my integration tests to run with proper DB - scala

I'm trying to figure out how to write my db integration tests in my Play2 app.
In my conf file I have specified two databases, xxx_test for regular use and h2 db for testing:
db.xxx_test.driver=com.mysql.jdbc.Driver
db.xxx_test.url="jdbc:mysql://localhost/xxx_test?characterEncoding=UTF-8"
db.xxx_test.user="root"
db.xxx_test.password=""
db.h2.driver=org.h2.Driver
db.h2.url="jdbc:h2:mem:play"
db.h2.user=sa
db.h2.password=""
In my User object I have specified xxx_test to be used when I run the application.
def createUser(user: User): Option[User] =
  DB.withConnection("xxx_test") { implicit connection =>
    SQL("insert into users(first_name, last_name, email, email_validated, last_login, created, modified, active) values({first_name}, {last_name}, {email}, {email_validated}, {last_login}, {created}, {modified}, {active})").on(
      'first_name -> user.firstName,
      'last_name -> user.lastName,
      'email -> user.email,
      'email_validated -> user.emailValidated,
      'last_login -> user.lastLogin,
      'created -> user.created,
      'modified -> user.modified,
      'active -> true
    ).executeInsert().map { id =>
      User(new Id[Long](id), user.firstName, user.lastName, user.email,
        user.emailValidated, user.lastLogin, user.created, user.modified, true)
    }
  }
In my test I create a new in-memory database and then use the User object to create and fetch my object for testing.
class DBEvolutionsTest extends Specification {
  "The Database" should {
    "persist data properly" in {
      running(FakeApplication(additionalConfiguration = inMemoryDatabase())) {
        User.create(User(Id[Long](1L), "jakob", "aa", "aaa", true,
          DateTime.now(), DateTime.now(), DateTime.now(), true))
        val newUser = User.findBy(Id[Long](1L))
        newUser.get.firstName must beEqualTo("jakob")
      }
    }
  }
}
This, of course, is not the correct way: the User object uses xxx_test and not the h2 db, so the test creates a User in the real database rather than the in-memory one, because I have hard-coded the db name in the User object (DB.withConnection("xxx_test")).
I guess there is some smart way of doing this; I do not want to pass the db name around
the application like User.create(User(...), "xxx_test").
How have you solved this problem?

You might want to check out how to do dependency injection in Scala. A good solution is to abstract the database out of your User model and then pass it in as a dependency.
A simple way to do that would be to change the configuration file for testing: Play lets you specify which config file is used on the command line.
This is not the most practical approach, though.
Another solution is to use implicits, define your database connection as an implicit parameter of your function:
def createUser(user: User)(implicit dbName: String): Option[User] =
  DB.withConnection(dbName) { ... }
You will still have to propagate the parameter upwards in all your calls, but you can hide it:
def importUsers(csvFile: File)(implicit dbName: String): Seq[User] = {
  ...
  User.createUser(u)
  ...
}
and when you call it from the top:
implicit val dbName: String = "test"
importUsers(...)
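A self-contained sketch of this propagation, with a stand-in for DB.withConnection (the real one needs a running Play application; all names here are illustrative):

```scala
object ImplicitDbExample {
  // Stand-in for Play's DB.withConnection: just hands the db name to the block.
  def withConnection[A](dbName: String)(f: String => A): A = f(dbName)

  def createUser(name: String)(implicit dbName: String): String =
    withConnection(dbName)(conn => s"$name@$conn")

  // Callers propagate the implicit without ever writing it explicitly.
  def importUsers(names: Seq[String])(implicit dbName: String): Seq[String] =
    names.map(n => createUser(n))
}
```

At the top level, `implicit val dbName: String = "h2"` followed by `ImplicitDbExample.importUsers(Seq("a", "b"))` yields `Seq("a@h2", "b@h2")`. Note that an implicit `String` is fragile in real code; a dedicated wrapper type is safer.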
This is built into Scala, so it's easy to set up and doesn't need a lot of boilerplate to support it. Personally, I think implicits make the code unclear, and I prefer the solution presented in the talk Dead-Simple Dependency Injection.
The gist of it is that you make createUser, and every other method that depends on a database connection, return a function of the connection rather than just the result. Here is how it would work with your example.
1- you create a ConnectionConfig trait that configures a connection. A simple form would be:
trait ConnectionConfig {
def dbName: String
}
2- your method depending on that config returns a function:
def createUser(user: User): ConnectionConfig => Option[User] = { conn =>
DB.withConnection(conn.dbName) { ... }
}
3- when you use createUser in another method, that method becomes dependent on the connection too, so you mark it as such by returning the dependency on ConnectionConfig with a function return type, for example:
def importUsers(csvFile: File): ConnectionConfig => Seq[User] = { conn =>
...
User.createUser(u)(conn)
...
}
This is a good habit to have, as it will be clear in your code which methods depend on a connection to the database, and you can easily swap connections. So, in your main app, you would create the real connection:
class RealConnectionConfig extends ConnectionConfig {
  val dbName = "xxx_test"
}
but in your test file, you create a test DB config:
class DBEvolutionsTest extends Specification {
  class TestDBConfig extends ConnectionConfig {
    val dbName = "h2"
  }
  val testDB = new TestDBConfig()

  "The Database" should {
    "persist data properly" in {
      running(FakeApplication(additionalConfiguration = inMemoryDatabase())) {
        User.create(User(Id[Long](1L), "jakob", "aa", "aaa", true,
          DateTime.now(), DateTime.now(), DateTime.now(), true))(testDB)
        val newUser = User.findBy(Id[Long](1L))
        newUser.get.firstName must beEqualTo("jakob")
      }
    }
  }
}
This is the gist of it. Check out the presentation and slides I mentioned; there is a nice way to abstract all of that so that you can lose the (conn) argument that makes this code ugly.
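A rough sketch of that abstraction (the names DbAction, createUser and audit here are illustrative, not taken from the talk): wrapping `ConnectionConfig => A` in a small type with map/flatMap lets connection-dependent steps compose in a for-comprehension, so the config is supplied only once, at the edge:

```scala
// Minimal Reader-style wrapper around "a computation needing a ConnectionConfig".
trait ConnectionConfig { def dbName: String }

case class DbAction[A](run: ConnectionConfig => A) {
  def map[B](f: A => B): DbAction[B] = DbAction(conn => f(run(conn)))
  def flatMap[B](f: A => DbAction[B]): DbAction[B] =
    DbAction(conn => f(run(conn)).run(conn))
}

object Example {
  // Pretend these hit the database via conn.dbName.
  def createUser(name: String): DbAction[String] =
    DbAction(conn => s"$name@${conn.dbName}")
  def audit(entry: String): DbAction[String] =
    DbAction(conn => s"audited:$entry")

  // No (conn) threading: composition happens in the for-comprehension.
  val program: DbAction[String] = for {
    u   <- createUser("jakob")
    log <- audit(u)
  } yield log
}
```

Only the outermost caller supplies the config: `Example.program.run(new ConnectionConfig { val dbName = "h2" })`.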
As a side comment, if I were you, I would abstract the type of DB as well. Instead of having the SQL in the User model object, keep it in a separate implementation; that way you can easily switch the type of database (MongoDB, DynamoDB, ...).
It would be something like this, extending the previous code:
trait ConnectionConfig {
  def createUser(user: User): Option[User]
}
and in the User model object:
def createUser(user: User): ConnectionConfig => Option[User] = { conn =>
  conn.createUser(user)
}
This way, when testing parts of your code that depend on the User model, you can make a mock DB where createUser always succeeds and returns the expected result (or always fails), without even using the in-memory database (you would still need tests for the real SQL connection, but you could test other parts of your app):
object AlwaysSucceedingDB extends ConnectionConfig {
  def createUser(user: User): Option[User] = Some(user)
}

The inMemoryDatabase method is defined like this:
def inMemoryDatabase(
  name: String = "default",
  options: Map[String, String] = Map.empty[String, String]): Map[String, String]
My guess is that you should pass the xxx_test as the name parameter.

You must define a name other than your default (xxx_test) name for the in-memory database. I think the following snippet should work.
FakeApplication(additionalConfiguration = inMemoryDatabase("h2"))
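To see why the name matters, here is an approximation of what the helper produces (an assumption about its behavior, not Play's literal source): it just builds configuration entries for a named H2 in-memory database, so the name must match the db.* prefix your code reads:

```scala
object PlayTestHelpers {
  // Approximation of Play's inMemoryDatabase helper: it returns config
  // entries keyed under db.<name>, which FakeApplication merges into the
  // configuration, overriding conf-file settings for that database.
  def inMemoryDatabase(
      name: String = "default",
      options: Map[String, String] = Map.empty): Map[String, String] =
    Map(
      s"db.$name.driver" -> "org.h2.Driver",
      s"db.$name.url" -> ("jdbc:h2:mem:play" +
        options.map { case (k, v) => s";$k=$v" }.mkString)
    )
}
```

With `inMemoryDatabase("h2")`, the `db.h2.*` keys from the conf file are overridden, so `DB.withConnection("h2")` hits the in-memory database.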
Please see also: https://stackoverflow.com/a/11029324/2153190

explicitly close db connection in slick

Here is the code I am trying to optimise:
object UserRepo {
  val users = TableQuery[Users]
  val dbName = "db"
  lazy val queryAllUsers = for (user <- users) yield user
  type UserRow = (Int, String, String, String)

  def getAll(): Future[Seq[UserRow]] = {
    val db = Database.forConfig(dbName)
    val f: Future[Seq[UserRow]] = db.run(queryAllUsers.result)
    f.onComplete {
      case Success(_) => db.close()
      case Failure(_) => db.close()
    }
    f
  }
}
I am going to have a number of queries to the DB, and I am trying to get rid of the line where I create the DB connection. Is there an execution context I can use to close the connection explicitly, so the code will look more concise?
Is there an option to get the used db connection within the Future.onComplete scope?
Thanks
As for your question about explicitly closing the db connection in Slick: normally you create a connection at application startup (or lazily on first use) and close it when the application ends.
This obviously all depends on what kind of application you are running:
if you have a DI container, you would probably manage some of this in your DI mechanisms (like Modules in Guice)
if you have a web application, specifically e.g. Play, you would probably use play-slick, which does this initialization / shutting down for you (kind of).
General way (no DI)
The easiest general way (assuming you are not using DI or play-slick) of doing this would be perhaps something like this:
object DbManager {
  lazy val db = createDb

  private def createDb = Database.forConfig("db")

  def close(): Unit = db.close()
}
Then your code would be:
object UserRepo {
  val users = TableQuery[Users]
  lazy val queryAllUsers = for (user <- users) yield user
  type UserRow = (Int, String, String, String)

  def getAll(): Future[Seq[UserRow]] =
    DbManager.db.run(queryAllUsers.result)
}
The above code doesn't do any cleaning up; that would need to be added to some kind of hook when the application is closing (in the case of e.g. a web application), or you would need to call DbManager.close manually at a specified time (when you are closing the application).
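One possible hook is the standard sys.addShutdownHook (whether a JVM shutdown hook is the right place depends on your deployment; Play, for example, has its own stop hooks). The DbManager below is a stand-in with only the close() behavior, so the sketch is self-contained:

```scala
object DbManager {
  @volatile private var open = true
  def isOpen: Boolean = open
  // Stand-in for db.close() on the real Slick Database.
  def close(): Unit = { open = false }
}

object AppLifecycle {
  // Register the cleanup once at startup; the JVM runs it on normal shutdown.
  def install(): Unit = {
    sys.addShutdownHook(DbManager.close())
    ()
  }
}
```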
Play slick
You would probably need to start from here: https://github.com/playframework/play-slick/tree/master/samples/basic (most basic sample showing play-slick configuration).
Updating your code with this would be:
class UserRepo @Inject() (dbConfigProvider: DatabaseConfigProvider)
    extends HasDatabaseConfigProvider[JdbcProfile] {

  import driver.api._

  val users = TableQuery[Users]
  lazy val queryAllUsers = for (user <- users) yield user
  type UserRow = (Int, String, String, String)

  def getAll(): Future[Seq[UserRow]] =
    db.run(queryAllUsers.result)
}
In this scenario you wouldn't call:
UserRepo.getAll
but you would rather need to inject it:
class MyClientCode @Inject() (userRepo: UserRepo) {
  ...
  userRepo.getAll
  ...
}
You would obviously need to configure it, but this should be very straightforward with the sample provided above.
So, in short, your Play application will hold the database connection configuration and do all the initialization / cleaning up. Your external modules (like the one you described in your comment) would simply pull in DatabaseConfigProvider as a Guice-managed dependency (as shown above).

Always execute code at beginning and end of method

I'm connecting to MongoDb using following code :
def insert() = {
  val mc = new com.mongodb.MongoClient("localhost", 27017)
  val db = mc.getDatabase("MyDb")
  // my insert code
  mc.close()
}
I have various methods which open and close the connection.
Can the lines :
val mc = new com.mongodb.MongoClient("localhost", 27017);
val db = mc.getDatabase("MyDb");
mc.close();
be extracted so that they are implicitly called at the beginning and end of each method?
Do Scala implicits cater for this scenario, or is reflection required?
A common pattern would be to use a call-by-name method where you can pass a function that accepts a DB and does something with it. The call-by-name method can facilitate the creation of the client, etc, and execute the code within.
def withDB[A](block: DB => A): A = {
  val mc = new com.mongodb.MongoClient("localhost", 27017)
  val db = mc.getDatabase("MyDb")
  try block(db) finally mc.close()
}
And use it:
def insert() = withDB { db =>
// do something with `db`
}
However, a look at the documentation says:
A MongoDB client with internal connection pooling. For most applications, you should have one MongoClient instance for the entire JVM.
Which makes the above approach look like a bad idea, assuming that is the version you are using. I can definitely see some concurrency issues trying to do this and having too many connections open.
But, you can follow the same pattern, stuffing the connection creating into a singleton object. You would need to manage the closing of the client when your application is shutdown, however.
object Mongo {
  lazy val mc = new com.mongodb.MongoClient("localhost", 27017)
  lazy val db = mc.getDatabase("MyDb")

  def withDB[A](block: DB => A): A = block(db)
  def close(): Unit = mc.close()
}
You can define a method that executes some 'work' function e.g.
def withMongoDb[T](work: DB => T): T = {
  val mc = new com.mongodb.MongoClient("localhost", 27017)
  // I don't actually know what type `db` is, so I'm calling it `DB`
  val db: DB = mc.getDatabase("MyDb")
  try work(db)
  finally mc.close()
}
Then you can use it like:
withMongoDb { db =>
db.insert(...)
db.query(...)
}
This is a similar approach to the one used in Slick, pre-3.0, i.e. withSession and withTransaction.
Now, if you implemented some convenience methods e.g.
def insertStuff(values: Seq[Int])(implicit db: DB) =
  db.insert(values)
Then you could mark the db as implicit in your withMongoDb call, effectively making sure you never accidentally call insertStuff outside of that block.
withMongoDb { implicit db =>
insertStuff(Seq(1,2,3,4))
}
insertStuff(Seq(1,2,3,4)) // compile error
Instead of implicits you could do something like this:
def mongoConn(ip: String, port: Int, dbName: String): (Database => Unit) => Unit = { f =>
  val mc = new com.mongodb.MongoClient(ip, port)
  val db = mc.getDatabase(dbName)
  try f(db) finally mc.close() // close even if the block throws
}

val conn = mongoConn("localhost", 27017, "MyDb")
conn { db =>
  // insert code
}

Database transactions in Play framework scala applications (anorm)

I am developing an application using the Play framework and Scala, with anorm as the data-access layer, and I've got a problem I could not solve.
Brief: I want methods in my data-access objects (DAOs) to work inside transactions as well as when called alone.
Details:
My data-access layer consists of classes with methods that only execute particular SQL against the database. Traditionally they look like:
def list() = DB.withConnection { implicit cn =>
...
}
Now I want some methods to be executed in a transaction scope, like traditional select-then-update service methods, but still be able to run them alone. So, what I have in mind is something like this:
class Service {
  def fooTransacted() = {
    inTransaction {
      val old = dao.select(id = 2)
      val newObj = old.copy(isActive = true)
      dao.update(newObj)
    }
  }

  def fooSingle() = {
    dao.select(id = 2)
  }
}
I tried around several ways, but could not come up with any solution.
What about
class Dao {
  def foo(id: Long)(implicit connection: Connection) =
    SQL("select * from foo where id={id}").on('id -> id).as(...)
}

class Service {
  def withConnection = {
    DB.withConnection { implicit connection =>
      Dao.foo(1)
      Dao.foo(2)
    }
  }

  def withTransaction = {
    DB.withTransaction { implicit connection =>
      Dao.foo(1)
      Dao.foo(2)
    }
  }
}
The solution I've seen used elsewhere (principally in Squeryl), is roughly the following:
import java.sql.Connection

object Helper {
  private val conn: ThreadLocal[Connection] = new ThreadLocal

  def inTransaction[X](f: Connection => X) = {
    conn.get() match {
      case null =>
        DB.withConnection { newConn =>
          conn.set(newConn)
          try f(newConn)
          finally conn.set(null)
        }
      case c => f(c)
    }
  }
}
This way, the inTransaction method is re-entrant, so there's no harm in calling it redundantly inside dao.select.
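To see the re-entrancy concretely, here is a self-contained toy version (a String stands in for java.sql.Connection, and a counter stands in for DB.withConnection; all names are illustrative):

```scala
object TxHelper {
  type Conn = String // illustrative stand-in for java.sql.Connection
  private val current: ThreadLocal[Conn] = new ThreadLocal
  private var created = 0 // how many "real" connections were opened

  // Stand-in for DB.withConnection: opens a fresh connection each time.
  private def withConnection[X](f: Conn => X): X = {
    created += 1
    f(s"conn-$created")
  }

  def inTransaction[X](f: Conn => X): X =
    current.get() match {
      case null =>
        withConnection { newConn =>
          current.set(newConn)
          try f(newConn) finally current.set(null)
        }
      case c => f(c) // nested call: reuse the outer connection
    }

  def createdCount: Int = created
}
```

A nested `inTransaction { ... inTransaction { ... } ... }` opens only one connection: the inner call finds the ThreadLocal populated and reuses it.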
If you prefer, you can expose conn via a public method, and change the signature of f to => X - you lose some compile-time safety, but the API is a little cleaner.
One pitfall with this approach is that connections are tied to threads, which may cause problems if you're using futures or actors, and a process can resume on a different thread (this is a tricky area anyway, but one you should be aware of).
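That pitfall is easy to demonstrate in miniature: a value set in a ThreadLocal on the calling thread is simply absent when a Future's body runs on a pool thread:

```scala
import java.util.concurrent.Executors
import scala.concurrent.duration._
import scala.concurrent.{Await, ExecutionContext, Future}

object ThreadLocalPitfall {
  private val conn: ThreadLocal[String] = new ThreadLocal

  // Returns (value on the calling thread, value seen inside the Future).
  def demo(): (String, String) = {
    conn.set("main-connection")
    val pool = Executors.newSingleThreadExecutor()
    implicit val ec: ExecutionContext = ExecutionContext.fromExecutor(pool)
    // The Future's body runs on the pool thread, where the ThreadLocal was never set.
    val inFuture = Future(Option(conn.get()).getOrElse("<not set>"))
    val seen = Await.result(inFuture, 2.seconds)
    pool.shutdown()
    (conn.get(), seen)
  }
}
```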
You might want to look into Squeryl too - it may already do what you need.

Can a scala self-type be satisfied via delegation?

Suppose I have (and this is rather contrived)
trait DbConnection {
  val dbName: String
  val dbHost: String
}

class Query {
  self: DbConnection =>

  def doQuery(sql: String) {
    // connect using dbName, dbHost
    // perform query
  }
}

class HasADbConnection(override val dbName: String,
                       override val dbHost: String) extends DbConnection {
  self =>

  def doSomething() {
    doSomethingElseFirst()
  }

  def doSomethingElseFirst() = {
    val query = new Query() with DbConnection {
      override val dbName = self.dbName
      override val dbHost = self.dbHost
    }
    query.doQuery("")
  }
}
Is there a way to avoid the redundant "override val dbName = self.dbName, override val dbHost = self.dbHost" in the new Query() creation, and instead indicate that the new Query object should inherit from / delegate to the HasADbConnection instance for these fields?
I realize it may be more appropriate for Query to take a DbConnection as a constructor argument. I'm interested though in other ways of satisfying the Query self-type. Perhaps there is no way of propagating the HasADbconnection fields onto the new Query instance, which is a completely valid answer.
Not sure exactly what you are trying to do, but this seems like a possible match for your intent:
trait C extends B {
  def myA = new A() with C { // could also be "with B" rather than "with C" and it would still work
    val name: String = C.this.name // explicit type not required; String can be inferred
  }
}
then, for example:
scala> val myC = new C() { val name = "myC-name" }
myC: C = $anon$1#7f830771
scala> val anA = myC.myA
anA: A with C = C$$anon$1#249c38d5
scala> val aName = anA.name
aName: String = myC-name
Hopefully that can at least help guide your eventual solution. I might be able to give further help if you clarify what you want to do (or why you want to do it).
EDIT - after update to question:
I suggest you may be thinking about this the wrong way. I would not tie the Query class down to knowing how to forge a connection. Rather, either pass a ready-made connection as a parameter to the call that uses it, or (as shown below) set up the Query class to supply a function from a connection to a result, then call that function with a connection established elsewhere.
Here is a how I would think about solving this (note that this example doesn't create an actual DB connection per se, I just take your DbConnection type at face value - really you have actually defined a 'DbConnConfig' type):
trait DbConnection {
  val dbName: String
  val dbHost: String
}

class Query(sql: String) {
  def doQuery: DbConnection => String = { conn: DbConnection =>
    // No need here to know how to connect using dbName, dbHost.
    // Perform the query using the provided connection:
    s"${conn.dbName}-${sql}-${conn.dbHost}" // dummy implementation so you can try it out in the REPL
  }
}

class HasADbConnection(override val dbName: String,
                       override val dbHost: String) extends DbConnection {
  // You can create your actual connection here...
  val conn = makeConnection

  def doSomething(query: Query) = {
    // ... or here, according to your program's needs.
    val conn = makeConnection
    query.doQuery(conn)
  }

  def makeConnection = this // not a real implementation, just a quick cheat for this example
}
In reality, doQuery (which could be named better) should have a type of DbConnection => ResultSet or similar. Example usage:
scala> val hasConn = new HasADbConnection("myDb", "myHost")
hasConn: HasADbConnection = HasADbConnection#6c1e5d2f
scala> hasConn.doSomething(new Query("#"))
res2: String = myDb-#-myHost

Higher order functions with Scala Slick for DRY goodness

I have an idea how my data access layer with Scala Slick should look like, but I'm not sure if it's really possible.
Let's assume I have a User table which has the usual fields like id, email, password, etc.
object Users extends Table[(String, String, Option[String], Boolean)]("User") {
  def id = column[String]("id", O.PrimaryKey)
  def email = column[String]("email")
  def password = column[String]("password")
  def active = column[Boolean]("active")
  def * = id ~ email ~ password.? ~ active
}
And I wish to query them in different ways; currently the ugly way is to open a new database session, do the for-comprehension, and then use different if statements to achieve what I want.
e.g.
def getUser(email: String, password: String): Option[User] = {
  database withSession { implicit session: Session =>
    val queryUser = (for {
      user <- Users
      if user.email === email &&
         user.password === password &&
         user.active === true
    } // yield and map to user class, etc.
  }
}

def getUser(identifier: String): Option[User] = {
  database withSession { implicit session: Session =>
    val queryUser = (for {
      user <- Users
      if user.id === identifier &&
         user.active === true
    } // yield and map to user class, etc.
  }
}
What I would prefer is to have a private method for the query and then public methods which define queries along the lines of
type UserQuery = User => Boolean

private def getUserByQuery(whereQuery: UserQuery): Option[User] = {
  database withSession { implicit session: Session =>
    val queryUser = (for {
      user <- Users
      // somehow run whereQuery here to filter
    } // yield and boring stuff
  }
}

def getUserByEmailAndPassword(email, pass) = { ... define the query and call getUserByQuery ... }
def getUserById(id) = { ... }
def getUserByFoo = { ... }
That way, the query logic is encapsulated in the relevant public functions, and the actual querying and mapping to the user object lives in a reusable function that other people don't need to be concerned with.
The problem I have is trying to refactor the "where" bits into functions that I can pass around. Trying to select them in IntelliJ and using the refactoring tools results in some pretty crazy typing.
Does anyone have any examples of doing something close to what I am trying to achieve?
1) wrapping queries in a def means the query statement is re-generated on every single request, and, since query params are not bound, no prepared statement is passed to the underlying DBMS.
2) you're not taking advantage of composition
Instead, if you define parameterized query vals that your def wrappers call, you can get the best of both worlds.
val uBase = for {
  u <- Users
  ur <- UserRoles if u.id is ur.UserID
} yield (u, ur)

// composition: generates the prepared statement one time, on startup
val byRole = for {
  roleGroup <- Parameters[String]
  (u, ur) <- uBase
  r <- Roles if (r.roleGroup is roleGroup) && (r.id is ur.roleID)
} yield u

def findByRole(roleGroup: RoleGroup): List[User] = {
  db withSession { implicit ss: Session =>
    byRole(roleGroup.toString).list
  }
}
If you need one-off finders for a single property, use:
val byBar = Foo.createFinderBy(_.bar)
val byBaz = Foo.createFinderBy(_.baz)
I can't remember where, maybe on SO or the Slick user group, but I did see a very creative solution that allowed for multiple bound params, basically a createFinderBy on steroids. It was not so useful to me, though, as the solution was limited to a single mapper/table object.
At any rate composing for comprehensions seems to do what you're trying to do.
I have recently done something similar. One way to do this could be the following: write a general select method which takes a predicate
def select(where: Users.type => Column[Boolean]): Option[User] = {
  database withSession { implicit session: Session =>
    val queryUser = (for {
      user <- Users if where(user)
    } // yield and map to user class, etc.
  }
}
and then write the methods that pass the actual predicate as a higher-order function:
def getUserByEmail(email: String): Option[User] =
  select((u: Users.type) => u.*._2 === email)
and similarly:
def getActiveUserByEmail(email: String): Option[User] =
  select((u: Users.type) => u.*._2 === email && u.*._4 === true)