Play framework with MongoDB

I am using Mongo with the Play framework through "reactivemongo", which provides an async bridge between the Mongo connection and the program. For standalone projects I always use the Casbah library - its syntax feels more natural to me (wrapping every request in a Future is not always needed, and my religion does not allow me to use Async.await to block on each request), there is no actor overhead, and I also don't like the JSON/BSON conversion overhead.
But using Casbah in Play the direct way (just creating a Mongo connection in the controller) produces connection leaks - meaning you have to create a connection pool and manage it yourself, in other words rewrite ReactiveMongo.
Has anybody used Casbah with Mongo in production? What is the best and most canonical way of creating and controlling connections in the Play ecosystem?

First you should check Connecting to MongoDB. Then go to this tutorial to create a Scala project (if you use another editor, follow its Scala project creation steps).
Now follow these steps:
1> In projectName/conf/application.conf, add the Mongo DB name, collection name, port number, URL, etc. (see play reactive mongo). For example, I added the following to my application.conf:
mongodb.default.host = "localhost"
mongodb.default.db = "Demo"
mongodb.default.port = "27017"
CI.default.uri = "mongodb://localhost:27017/"
2> Create a .scala file in the controllers folder, with any name; for example, I named mine ScalaMongoFactory. Add the following code to this file:
import com.mongodb.casbah.{MongoClient, MongoClientURI}
import com.typesafe.config.ConfigFactory

object ScalaMongoFactory {
  private val config = ConfigFactory.load()
  private val DATABASE = config.getString("mongodb.default.db")
  private val server = MongoClientURI(config.getString("CI.default.uri"))
  private val client = MongoClient(server)

  val database = client(DATABASE)
}
3> Now create a new .scala file in the controllers folder where you want to use the Mongo connection. For example, I created a checkConnection.scala file containing:
import com.cloudinsights.scala.controllers.ScalaMongoFactory

object checkConnection {
  val collection = ScalaMongoFactory.database("your collectionName")
}
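From there you can query the collection from a controller action; a rough sketch using the Casbah API (controller and action names are hypothetical):
import com.mongodb.casbah.Imports._
import play.api.mvc._

object Users extends Controller {
  // Counts the documents in the configured collection.
  def count = Action {
    val total = checkConnection.collection.count()
    Ok(s"documents: $total")
  }
}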

There's no need to use Async.await with ReactiveMongo (and you shouldn't anyway).

I guess you could use a utility object to manage your connection.
import com.mongodb.casbah.{MongoClient, MongoDB}
import play.api.Play

object MongoManager {
  private val server = Play.current.configuration.getString("db.host").get
  private val port = Play.current.configuration.getInt("db.port").get

  // Opens a fresh client per call and closes it once the block has run.
  // Note: MongoClient(...)(name) selects a database, so the parameter is a database name.
  object using {
    def apply[A](db: String)(block: MongoDB => A): A = {
      val con = MongoClient(server, port)
      try block(con(db))
      finally con.close()
    }

    def apply[A](block: MongoClient => A): A = {
      val con = MongoClient(server, port)
      try block(con)
      finally con.close()
    }
  }

  // Keeps a single lazily created client for the lifetime of the application.
  object stashed {
    private lazy val con = MongoClient(server, port)

    def apply(db: String): MongoDB = con(db)
    def apply: MongoClient = con
  }
}
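Used like this, for example (database and collection names are made up):
// One-off work: the client is opened for the block and closed afterwards.
val names = MongoManager.using("mydb") { db =>
  db("users").find().map(_.get("name")).toList
}

// Long-lived client shared across requests.
val users = MongoManager.stashed("mydb")("users")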
I didn't find a Play plugin for this driver.
Personally I'd recommend using the ReactiveMongo driver instead, since it can also use Play's JSON library. If you're taking data from the database and feeding it through a REST API, it's a nicer option.
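For illustration, a minimal sketch of that combination with the play2-reactivemongo module (a sketch only: the "users" collection is made up and package names differ between module/Play versions; this follows the Play 2.4-era layout):
import javax.inject.Inject
import scala.concurrent.ExecutionContext.Implicits.global
import play.api.libs.json.JsObject
import play.api.mvc.{Action, Controller}
import play.modules.reactivemongo.{MongoController, ReactiveMongoApi, ReactiveMongoComponents}
import play.modules.reactivemongo.json._
import play.modules.reactivemongo.json.collection._

class Users @Inject() (val reactiveMongoApi: ReactiveMongoApi)
  extends Controller with MongoController with ReactiveMongoComponents {

  private def users: JSONCollection = db.collection[JSONCollection]("users")

  // The request body is parsed as JSON and stored as-is, no BSON mapping code needed.
  def create = Action.async(parse.json) { request =>
    users.insert(request.body.as[JsObject]).map(_ => Created)
  }
}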

Related

ReactiveMongo connect with MongoConnectionOptions

I was connecting to my MongoDB using MongoConnectionOptions in reactivemongo version 0.15.x. Now I updated to 0.20.11.
The apply() function of the MongoConnectionOptions companion object is deprecated now. The deprecation warning says to use the constructor of MongoConnectionOptions. Unfortunately this constructor is package-private.
I do not want to use the deprecated apply function mainly because my scala compiler options do not allow warnings. (And I really don't want to change this).
The official documentation only explains the deprecated version: http://reactivemongo.org/releases/0.1x/documentation/tutorial/connect-database.html
How can I still connect to the database using connection options?
The factory MongoConnectionOptions(..) will be refactored in the coming major release 1.0.x.
In the meantime, you can use .default + .copy(..):
import reactivemongo.api.MongoConnectionOptions
MongoConnectionOptions.default.copy(appName = "Foo")
Moreover, the options can be prepared (from a config file or programmatically) as a URI string:
val host = "localhost"
val port = 27017
reactivemongo.api.MongoConnection.connect(s"mongodb://${host}:${port}")
I use the following syntax and it works fine for me:
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.Future
import reactivemongo.api.{AsyncDriver, MongoConnection}
// for 0.20+ with the new BSON API; older versions use reactivemongo.api.collections.bson.BSONCollection
import reactivemongo.api.bson.collection.BSONCollection

val host = "localhost"
val port = "27017"
val db_name = "test"
val collectionName = "yourCollection" // placeholder
val mongoUri = "mongodb://" + host + ":" + port + "/" + db_name

val driver = new AsyncDriver

val database = for {
  uri <- MongoConnection.fromString(mongoUri)
  con <- driver.connect(uri)
  dn  <- Future(uri.db.get) // the database name parsed from the URI
  db  <- con.database(dn)
} yield db

val bsonCollection: Future[BSONCollection] = database.map(_.collection(collectionName))
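Once the collection future resolves you can run operations on it; for example, a small insert sketch (document content made up, using the imports above):
import reactivemongo.api.bson.BSONDocument

val inserted = bsonCollection.flatMap { coll =>
  coll.insert.one(BSONDocument("name" -> "test"))
}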

Testing Play + Slick app

I have a simple CRUD app built with Scala Play 2.4.3 and Play-slick 1.1.0 (Slick 3.1.0) that uses a MySQL database for persistent storage.
I was trying to create the tests for my app and I saw 2 main options:
mocking database access, which, as far as I've seen, requires some code changes
making the tests use an alternative database (probably an in-memory H2).
What's the best approach (advantages and disadvantages)?
I prefer the second approach, but I'm finding some difficulties in setting up the tests.
What do I need to do? First, I think I need to make the tests run with a FakeApplication, right? Do I need any sbt dependency to be able to do that?
After that, how do I specify that the H2 database should be used?
I had the same struggle and I came up with a solution like this (using the second approach):
Create a context for the DAOs to use:
import javax.inject.Singleton
import play.api.Play.current
import play.api.db.slick.DatabaseConfigProvider
import slick.driver.JdbcProfile

trait BaseContext {
  def dbName: String

  val dbConfig = DatabaseConfigProvider.get[JdbcProfile](dbName)
  val db = dbConfig.db
  val profile = dbConfig.driver

  val tables = new Tables { // this is generated by the Schema Code Generator
    override val profile: JdbcProfile = dbConfig.driver
  }
}

@Singleton
class AppContext extends BaseContext {
  def dbName = "mysql" // name in your conf right after "slick.dbs"
}

@Singleton
class TestingContext extends BaseContext {
  def dbName = "h2"
}
Then create a module to bind the injection, and don't forget to enable it in conf using play.modules.enabled += "your.Module":
import com.google.inject.AbstractModule
import play.api.{Configuration, Environment}

class ContextModule(environment: Environment, configuration: Configuration) extends AbstractModule {
  override def configure(): Unit = {
    if (configuration.getString("app.mode").contains("test")) {
      bind(classOf[BaseContext]).to(classOf[TestingContext])
    } else {
      bind(classOf[BaseContext]).to(classOf[AppContext])
    }
  }
}
And inject it into every DAO you've created:
import javax.inject.Inject

class SomeDAO @Inject() (context: BaseContext) {
  val dbConfig = context.dbConfig
  val db = context.db
  val tables = context.tables

  import tables.profile.api._

  // def otherStuff....
  // you can call db.run(...), tables.WhateverYourTableIs, tables.TableRowCaseClass, ...
}
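For instance, a concrete query method inside such a DAO might look like this (the Users table and UsersRow type are hypothetical names from the generated Tables trait; scala.concurrent.Future must be imported):
def findUser(id: Long): Future[Option[tables.UsersRow]] =
  db.run(tables.Users.filter(_.id === id).result.headOption)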
And the final step: your configuration file. In my case I used app.mode to mark the environment, and I use separate .conf files for different environments. Of course, these conf files must contain the correct DB configuration. Here's a sample:
app.mode = "test"

# Database configuration
slick.dbs = {
  # for unit test
  h2 {
    driver = "slick.driver.H2Driver$"
    db = {
      url = "jdbc:h2:mem:test;MODE=MYSQL"
      driver = "org.h2.Driver"
      keepAliveConnection = true
    }
  }
}
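For the production side (the "mysql" name that AppContext expects), the conf follows the same shape; a hypothetical example with placeholder values:
app.mode = "prod"

# Database configuration
slick.dbs = {
  # production database
  mysql {
    driver = "slick.driver.MySQLDriver$"
    db = {
      url = "jdbc:mysql://localhost/yourdb"
      driver = "com.mysql.jdbc.Driver"
      user = "user"
      password = "password"
    }
  }
}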
I'm pretty sure my solution is not an elegant one, but it delivers the goods. :)
Any better solution is welcome!
My solution was to add step(Play.start(fakeApp)) at the beginning of each spec, and step(Play.stop(fakeApp)) at the end of each spec.
Also:
def fakeApp: FakeApplication = {
  FakeApplication(additionalConfiguration = Map(
    "slick.dbs.default.driver" -> "slick.driver.H2Driver$",
    "slick.dbs.default.db.driver" -> "org.h2.Driver",
    "slick.dbs.default.db.url" -> "jdbc:h2:mem:play"
  ))
}
This was needed because I'm using play-slick, which requires configurations like:
slick.dbs.default.driver = "slick.driver.MySQLDriver$"
slick.dbs.default.db.driver = "com.mysql.jdbc.Driver"
slick.dbs.default.db.url = "jdbc:mysql://localhost/database"
slick.dbs.default.db.user = "user"
slick.dbs.default.db.password = "password"
More info in the docs.
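Put together, a spec using this helper might be laid out like so (a sketch: the spec name and example body are placeholders, fakeApp is the helper defined above):
import org.specs2.mutable.Specification
import play.api.Play

class UserDaoSpec extends Specification {

  step(Play.start(fakeApp)) // boot the fake app (and the in-memory H2) before the examples

  "UserDao" should {
    "insert and find a user" in {
      ok // real assertions against the H2-backed DAO go here
    }
  }

  step(Play.stop(fakeApp)) // shut the app down after all examples
}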

How to manually apply evolutions in tests with Slick and Play! 2.4

I would like to manually run my evolution script at the beginning of each test file. I'm working with Play! 2.4 and Slick 3.
According to the documentation, the way to go seems to be:
Evolutions.applyEvolutions(database)
but I don't manage to get an instance of my database. In the documentation, play.api.db.Databases is imported in order to get a database instance, but if I try to import it, I get this error: object Databases is not a member of package play.api.db
How can I get an instance of my database in order to run the evolution script?
Edit: as asked in the comments, here is the entire source code giving the error:
import models._
import org.scalatest.concurrent.ScalaFutures._
import org.scalatest.time.{Seconds, Span}
import org.scalatestplus.play._
import play.api.db.evolutions.Evolutions
import play.api.db.Databases
import play.api.db.slick.DatabaseConfigProvider
import play.api.inject.guice.GuiceApplicationBuilder
import slick.driver.JdbcProfile

class TestAddressModel extends PlaySpec with OneAppPerSuite {

  lazy val appBuilder = new GuiceApplicationBuilder()
  lazy val injector = appBuilder.injector()
  lazy val dbConfProvider = injector.instanceOf[DatabaseConfigProvider]

  def beforeAll() = {
    //val database: Database = ???
    //Evolutions.applyEvolutions(database)
  }

  "test" must {
    "test" in { }
  }
}
I finally found this solution. I inject with Guice:
lazy val appBuilder = new GuiceApplicationBuilder()
lazy val injector = appBuilder.injector()
lazy val databaseApi = injector.instanceOf[DBApi] //here is the important line
(You have to import play.api.db.DBApi.)
And in my tests, I simply do the following (actually I use another database for my tests):
override def beforeAll() = {
  Evolutions.applyEvolutions(databaseApi.database("default"))
}

override def afterAll() = {
  Evolutions.cleanupEvolutions(databaseApi.database("default"))
}
Considering that you are using Play 2.4, where evolutions were moved into a separate module, you have to add evolutions to your project dependencies.
libraryDependencies += evolutions
Source: Evolutions
Relevant commit: Split play-jdbc into three different modules
To have access to play.api.db.Databases, you must add jdbc to your dependencies:
libraryDependencies += jdbc
Hope it helps some people passing here.
EDIT: the code would then look like this:
import play.api.db.Databases

val database = Databases(
  driver = "com.mysql.jdbc.Driver",
  url = "jdbc:mysql://localhost/test",
  name = "mydatabase",
  config = Map(
    "user" -> "test",
    "password" -> "secret"
  )
)
You now have an instance of the DB, and can execute queries on it:
val statement = database.getConnection().createStatement()
val resultSet = statement.executeQuery("some_sql_query")
You can see more in the docs.
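Since the goal is to run the evolutions manually, that instance can then be passed to Evolutions (sketch):
import play.api.db.evolutions.Evolutions

Evolutions.applyEvolutions(database)   // run the "up" scripts
// ... run your tests ...
Evolutions.cleanupEvolutions(database) // revert with the "down" scripts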
I find the easiest way to run tests with evolutions applied is to use FakeApplication, and input the connection info for the DB manually.
def withDB[T](code: => T): T =
  // Create application to run database evolutions
  running(FakeApplication(additionalConfiguration = Map(
    "db.default.driver" -> "<my-driver-class>",
    "db.default.url" -> "<my-db-url>",
    "db.default.user" -> "<my-db-user>",
    "db.default.password" -> "<my-password>",
    "evolutionplugin" -> "enabled"
  ))) {
    // Start a db session
    withSession(code)
  }
Use it like this:
"test" in withDB { }
This allows you, for example, to use an in-memory database for speeding up your unit tests.
You can access the DB instance as play.api.db.DB if you need it. You'll also need to import play.api.Play.current.
Use FakeApplication to read your DB configuration and provide a DB instance.
def withDB[T](code: => T): T =
  // Create application to read the DB configuration, with automatic evolutions disabled
  running(FakeApplication(additionalConfiguration = Map(
    "evolutionplugin" -> "disabled"))) {
    import play.api.Play.current
    val database = play.api.db.DB
    Evolutions.applyEvolutions(database)
    val result = withSession(code)
    Evolutions.cleanupEvolutions(database)
    result
  }
Use it like this:
"test" in withDB { }

Best Practice of Using a Connection Pool in Slick 3.0.0 Together with Play Framework

I followed the documentation of Slick 3.0.0-RC1, using Typesafe Config as the database connection configuration. Here is my conf:
database = {
  driver = "org.postgresql.Driver"
  url = "jdbc:postgresql://localhost:5432/postgre"
  user = "postgre"
}
I created a file Locale.scala as:
package models

import slick.driver.PostgresDriver.api._
import scala.concurrent.Future

case class Locale(id: String, name: String)

class Locales(tag: Tag) extends Table[Locale](tag, "LOCALES") {
  def id = column[String]("ID", O.PrimaryKey)
  def name = column[String]("NAME")
  def * = (id, name) <> (Locale.tupled, Locale.unapply)
}

object Locales {
  private val locales = TableQuery[Locales]

  val db = Database.forConfig("database")

  def count: Future[Int] =
    try db.run(locales.length.result)
    finally db.close
}
Then I got confused about when and where the proper time is to create the Database object using
val db = Database.forConfig("database")
If I create db like this, there will be as many Database objects (each with its own connection pool) as I have models. So what is the best practice to make this work?
You can create an object DBLocator and initialise it with a lazy val, so that it is loaded only on demand.
You can then always invoke the method defined on DBLocator to get an instance of Session.
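A minimal sketch of that idea (DBLocator and its member name are made up; it just wraps Database.forConfig in a lazy val so a single pool is created on first use and shared):
import slick.driver.PostgresDriver.api._

object DBLocator {
  // One Database (and underlying connection pool), created on first access.
  lazy val db: Database = Database.forConfig("database")
}
Models then call DBLocator.db.run(...) instead of each creating their own Database via Database.forConfig, and db.close() is called once, on application shutdown.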

Why multiple MongoDB connections with Casbah?

I have to manage connections to multiple MongoDB databases, using the Casbah Scala client. I have an approximation that works, but it opens hundreds of connections.
I want to keep a Map[String, MongoDB] that stores a connection for each database (the database name being the key). I'm using this in Spark Streaming with a two-node cluster, so I think it is a serialization issue, but I don't know how to fix it.
Take a look at my class.
abstract class AbstractMongoDAO(@transient val config: Config) extends Closeable with Serializable {
  import AbstractMongoDAO._

  @transient private val mongoConfig = config.getConfig(CONFIG_KEY)
  private val host = mongoConfig.getString(CONFIG_KEY_HOST)

  @transient private var _mongoClient: MongoClient = MongoClient(host)
  private var _dbs: mutable.HashMap[String, MongoDB] = mutable.HashMap()

  protected def dbs(): mutable.HashMap[String, MongoDB] = {
    if (_dbs == null)
      _dbs = mutable.HashMap()
    _dbs
  }

  def mongoClient: MongoClient = {
    if (_mongoClient == null) {
      _mongoClient = MongoClient(host)
    }
    _mongoClient
  }

  def db(dbName: String): MongoDB = {
    if (dbs.get(dbName) == None) {
      _dbs += (dbName -> mongoClient.getDB(dbName))
    }
    _dbs.get(dbName).get
  }

  override def close() = {
    Option(_mongoClient).foreach(_.close())
  }
}

private object AbstractMongoDAO {
  val CONFIG_KEY = "mongo"
  val CONFIG_KEY_HOST = "host"
}
And then I have another class that extends AbstractMongoDAO:
class MongoDAO(override val config: Config)
  extends AbstractMongoDAO(config) with Serializable
And I get a db connection with this simple code, where appName is a variable holding the database name:
val _db = db(appName)
What am I doing wrong?
Casbah is built on top of the official Java driver. MongoClient represents an internal pool of connections to a MongoDB cluster. If you use the same cluster and only change the database name, not the host, you don't need to create multiple MongoClients; one is enough for the whole application.
To configure the MongoClient, check this documentation and the corresponding options. If you have multiple DB hosts, or still want to use multiple MongoClients, you can build your options and create a MongoClient like this:
val options = MongoClientOptions.builder()
  .connectionsPerHost(1)
  // add other options if needed
  .build()

val _mongoClient = MongoClient(host, options)
In your case, since only the db name needs to change and not the db host, I would change the method that gets the db to this:
def db(dbName: String): MongoDB =
  mongoClient.getDB(dbName) // the db will be created in Mongo on the fly if it does not exist
And you don't need the map anymore.
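With that change the single client's pool is reused for every database; a quick illustration (database names made up):
val dao = new MongoDAO(config)

// Both handles share the same MongoClient and its connection pool;
// getDB is a lightweight lookup and does not open new sockets.
val analyticsDb = dao.db("analytics")
val reportsDb = dao.db("reports")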