I am trying to get a connection from the default HikariCP config. The following is my application.conf:
modules {
  enabled += "play.api.db.DBModule"
  enabled += "play.api.db.HikariCPModule"
  enabled += "modules.AppModule"
}
db.default.hikaricp.dataSourceClassName=org.postgresql.ds.PGSimpleDataSource
db.default.hikaricp.dataSource.user=rp
#db.default.hikaricp.dataSource.url="postgres://rp:password#localhost/profile"
db.default.hikaricp.dataSource.password=password
db.default.hikaricp.dataSource.databaseName=profile
db.default.hikaricp.dataSource.serverName=localhost
db.default.hikaricp.connectionTestQuery = "SELECT 1"
Since Play maintains the connection pool, I am unable to find a way to get a connection from the pool created above. I have tried the following (1, 2, 3):
// 1. Injecting
import slick.jdbc.PostgresProfile.api._

class DBConnection @Inject()(db: Database) {
}

// 2. Mentioned here.
object DBConnection {
  implicit val db = DatabaseConfigProvider.get[JdbcProfile](Play.current).db
  //implicit val db = Database.forConfig("default") (3)
}
How can I get a connection from the default connection pool?
Additional Details:
Play Framework version: 2.16.9
Scala version is 2.12.6
Postgres dependency: "org.postgresql" % "postgresql" % "9.4-1206-jdbc42"
Logs for all three attempts are added here.
You can use play.api.db.DBApi, like this:
class DatabaseService @Inject()(dbApi: DBApi)
                               (implicit ec: DatabaseExecutionContext) {

  lazy val database = dbApi.database("default")
  ...
}
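Filling in the "..." with a concrete method, a minimal sketch (my own illustration, not part of the original answer) that borrows a raw java.sql.Connection from the "default" pool could look like this:

// Sketch of a method inside DatabaseService: borrows a connection from the
// "default" pool, runs the block, and hands the connection back when done.
def withDefaultConnection[A](block: java.sql.Connection => A): A =
  database.withConnection { conn =>
    block(conn)
  }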
I'm using Play Framework 2.6 with PostgreSQL and jOOQ as the DB access library.
When running tests, I'm getting:
org.postgresql.util.PSQLException: FATAL: sorry, too many clients already
Here is a helper class which provides jOOQ's DSLContext:
import java.sql.Connection

import akka.actor.ActorSystem
import javax.inject.{Inject, Singleton}
import org.jooq.{DSLContext, SQLDialect}
import org.jooq.impl.DSL
import play.api.db.Database

import scala.concurrent.{ExecutionContext, Future}

@Singleton
class Db @Inject()(val db: Database, system: ActorSystem) {

  val databaseContext: ExecutionContext = system.dispatchers.lookup("contexts.database")

  def query[A](block: DSLContext => A): Future[A] = Future {
    db.withConnection { connection: Connection =>
      val dsl = DSL.using(connection, SQLDialect.POSTGRES_9_4)
      block(dsl)
    }
  }(databaseContext)

  def withTransaction[A](block: DSLContext => A): Future[A] = Future {
    db.withTransaction { connection: Connection =>
      val dsl: DSLContext = DSL.using(connection, SQLDialect.POSTGRES_9_4)
      block(dsl)
    }
  }(databaseContext)
}
I use this helper class in repositories like this:
db.query { dsl =>
  val records = dsl
    .selectFrom(USERS)
    .where(...)
  ...
}
application.conf
db.default.driver="org.postgresql.Driver"
db.default.url="jdbc:postgresql://localhost/postgres?user=postgres"
...
contexts {
  database {
    executor = "thread-pool-executor"
    throughput = 1
    thread-pool-executor {
      fixed-pool-size = 9
    }
  }
}
...
build.sbt
...
libraryDependencies += jdbc
libraryDependencies += "org.jooq" % "jooq" % "3.10.5"
libraryDependencies += "org.jooq" % "jooq-codegen-maven" % "3.10.5"
libraryDependencies += "org.jooq" % "jooq-meta" % "3.10.5"
libraryDependencies += "org.postgresql" % "postgresql" % "42.1.4"
...
And here is the base class that all my test cases extend:
class BaseFeatureSpec extends FeatureSpec
  with GivenWhenThen
  with GuiceOneServerPerSuite
  with Matchers
  with WsScalaTestClient
  with BeforeAndAfterEach
  with MockitoSugar {

  override def fakeApplication(): Application =
    new GuiceApplicationBuilder()
      .overrides(bind[EmailService].to(classOf[EmailServiceStub]))
      .build()

  def config: Configuration = fakeApplication().configuration

  def actorSystem: ActorSystem = fakeApplication().actorSystem

  val db: Db = app.injector.instanceOf[Db]
  val wsClient: WSClient = app.injector.instanceOf[WSClient]
  val myPublicAddress = s"localhost:$port"

  private val injector = fakeApplication().injector

  def truncateDbOnEachRun = true

  override protected def beforeEach(): Unit = {
    if (truncateDbOnEachRun) {
      truncateDb
    }
  }

  protected def truncateDb = {
    await(db.withTransaction { dsl =>
      ... truncate all dbs...
    })
  }
}
The max_connections setting of my PostgreSQL instance is 100.
What I noticed is that when running tests, the pool is created multiple times, almost before each test:
[info] application - Creating Pool for datasource 'default'
[info] application - Creating Pool for datasource 'default'
[info] application - Creating Pool for datasource 'default'
[info] application - Creating Pool for datasource 'default'
And after that I get the "too many connections" error.
Please help.
It looks like you are using a dispatcher with the default type. Try setting type = PinnedDispatcher, which is intended for IO-bound work; in Akka this is configured at the dispatcher level, like so:
contexts {
  database {
    type = PinnedDispatcher
    executor = "thread-pool-executor"
    ...
  }
}
You can find more details at https://doc.akka.io/docs/akka/2.5/dispatchers.html
I've found the issue:
override def fakeApplication(): Application =
  new GuiceApplicationBuilder()
    .overrides(bind[EmailService].to(classOf[EmailServiceStub]))
    .build()
This creates a new application instance each time I access it (because it is a method), e.g. via app.injector.instanceOf[Db], which is why I see so many application - Creating Pool for datasource 'default' log lines.
I've replaced the def with a val (dropping the empty parameter list):
override val fakeApplication: Application =
  new GuiceApplicationBuilder()
    .overrides(bind[EmailService].to(classOf[EmailServiceStub]))
    .build()
And it works fine.
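An equivalent sketch (my own, not from the original answer) keeps the method signature but caches the application in a lazy val inside the same spec class, which avoids the repeated pool creation in the same way:

// Alternative sketch: build the Application once and return the cached instance.
private lazy val cachedApp: Application =
  new GuiceApplicationBuilder()
    .overrides(bind[EmailService].to(classOf[EmailServiceStub]))
    .build()

override def fakeApplication(): Application = cachedApp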
I've a simple CRUD app built with Scala Play 2.4.3 and Play-slick 1.1.0 (slick 3.1.0) that uses a MySQL database for persistent storage.
I was trying to create tests for my app and saw two main options:
mocking database access, which as far as I've seen requires some code changes
making the tests use an alternative database (probably an in-memory H2)
What's the best approach (advantages and disadvantages)?
I prefer the second approach, but I'm finding some difficulties in setting up the tests.
What do I need to do? First, I think I need to make the tests run with a FakeApplication, right? Do I need any sbt dependency to be able to do that?
After that, how do I specify that the H2 database should be used?
I had the same struggle and came up with a solution like this (using the second approach):
Create a context for the DAOs to use:
trait BaseContext {
  def dbName: String

  val dbConfig = DatabaseConfigProvider.get[JdbcProfile](dbName)
  val db = dbConfig.db
  val profile = dbConfig.driver

  val tables = new Tables { // this is generated by the Schema Code Generator
    override val profile: JdbcProfile = dbConfig.driver
  }
}

@Singleton
class AppContext extends BaseContext {
  def dbName = "mysql" // name in your conf right after "slick.dbs"
}

@Singleton
class TestingContext extends BaseContext {
  def dbName = "h2"
}
Then create a module to bind the injection, and don't forget to enable it in conf using play.modules.enabled += "your.Module":
class ContextModule(environment: Environment, configuration: Configuration) extends AbstractModule {
  override def configure(): Unit = {
    if (configuration.getString("app.mode").contains("test")) {
      bind(classOf[BaseContext]).to(classOf[TestingContext])
    } else {
      bind(classOf[BaseContext]).to(classOf[AppContext])
    }
  }
}
And inject it into every DAO you've created:
class SomeDAO @Inject()(context: BaseContext) {
  val dbConfig = context.dbConfig
  val db = context.db
  val tables = context.tables

  import tables.profile.api._

  def otherStuff....
  // you can call db.run(...), tables.WhateverYourTableIs, tables.TableRowCaseClass, ...
}
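For illustration, a query method inside SomeDAO might look like this (my own sketch: the Users table, its id column, and the UsersRow class are hypothetical names produced by the Schema Code Generator, and scala.concurrent.Future is assumed to be in scope):

// Hypothetical sketch, living inside SomeDAO next to the import above.
def findUserById(id: Long): Future[Option[tables.UsersRow]] =
  db.run(tables.Users.filter(_.id === id).result.headOption)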
And the final step: your configuration file. In my case I used app.mode to mark the environment, and I use a separate .conf file for each environment. Of course, these conf files must contain the correct DB configuration. Here's a sample:
app.mode = "test"
# Database configuration
slick.dbs = {
# for unit test
h2 {
driver = "slick.driver.H2Driver$"
db = {
url = "jdbc:h2:mem:test;MODE=MYSQL"
driver = "org.h2.Driver"
keepAliveConnection = true
}
}
}
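For the non-test environment, the matching "mysql" block referenced by AppContext would look something like this (a sketch with a placeholder URL and credentials):

slick.dbs = {
  mysql {
    driver = "slick.driver.MySQLDriver$"
    db = {
      url = "jdbc:mysql://localhost/mydb"
      driver = "com.mysql.jdbc.Driver"
      user = "user"
      password = "password"
    }
  }
}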
I'm pretty sure my solution is not an elegant one, but it delivers the goods. :)
Any better solution is welcome!
My solution was to add step(Play.start(fakeApp)) at the beginning of each spec and step(Play.stop(fakeApp)) at the end of each spec.
Also:
def fakeApp: FakeApplication = {
  FakeApplication(additionalConfiguration =
    Map(
      "slick.dbs.default.driver" -> "slick.driver.H2Driver$",
      "slick.dbs.default.db.driver" -> "org.h2.Driver",
      "slick.dbs.default.db.url" -> "jdbc:h2:mem:play"
    ))
}
This was needed because I'm using play-slick, which requires configurations like:
slick.dbs.default.driver = "slick.driver.MySQLDriver$"
slick.dbs.default.db.driver = "com.mysql.jdbc.Driver"
slick.dbs.default.db.url = "jdbc:mysql://localhost/database"
slick.dbs.default.db.user = "user"
slick.dbs.default.db.password = "password"
More info in the docs.
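Putting it together, a spec might look like this (a sketch of my own, assuming specs2 and that the fakeApp helper above is in scope; the repository under test is hypothetical):

import org.specs2.mutable.Specification
import play.api.Play
import play.api.test.FakeApplication

class UserRepositorySpec extends Specification {

  // Build one FakeApplication for this spec and start/stop it around the examples.
  val app: FakeApplication = fakeApp
  step(Play.start(app))

  "UserRepository" should {
    "store and retrieve a user" in {
      // ... exercise the repository against the in-memory H2 database ...
      ok
    }
  }

  step(Play.stop(app))
}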
I would like to manually run my evolution script at the beginning of each test file. I'm working with Play! 2.4 and Slick 3.
According to the documentation, the way to go seems to be:
Evolutions.applyEvolutions(database)
but I can't manage to get an instance of my database. In the documentation, play.api.db.Databases is imported in order to get a database instance, but if I try to import it, I get this error: object Databases is not a member of package play.api.db
How can I get an instance of my database in order to run the evolution script?
Edit: as asked in the comments, here is the entire source code giving the error:
import models._
import org.scalatest.concurrent.ScalaFutures._
import org.scalatest.time.{Seconds, Span}
import org.scalatestplus.play._
import play.api.db.evolutions.Evolutions
import play.api.db.Databases

class TestAddressModel extends PlaySpec with OneAppPerSuite {

  lazy val appBuilder = new GuiceApplicationBuilder()
  lazy val injector = appBuilder.injector()
  lazy val dbConfProvider = injector.instanceOf[DatabaseConfigProvider]

  def beforeAll() = {
    //val database: Database = ???
    //Evolutions.applyEvolutions(database)
  }

  "test" must {
    "test" in { }
  }
}
I finally found this solution. I inject with Guice:
lazy val appBuilder = new GuiceApplicationBuilder()
lazy val injector = appBuilder.injector()
lazy val databaseApi = injector.instanceOf[DBApi] //here is the important line
(You have to import play.api.db.DBApi.)
And in my tests, I simply do the following (actually I use another database for my tests):
override def beforeAll() = {
  Evolutions.applyEvolutions(databaseApi.database("default"))
}

override def afterAll() = {
  Evolutions.cleanupEvolutions(databaseApi.database("default"))
}
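Putting the pieces together, a complete test class might look like this (a sketch based on the snippets above, assuming ScalaTest's BeforeAndAfterAll and the "default" database):

import org.scalatest.BeforeAndAfterAll
import org.scalatestplus.play._
import play.api.db.DBApi
import play.api.db.evolutions.Evolutions
import play.api.inject.guice.GuiceApplicationBuilder

class TestAddressModel extends PlaySpec with OneAppPerSuite with BeforeAndAfterAll {

  lazy val appBuilder = new GuiceApplicationBuilder()
  lazy val injector = appBuilder.injector()
  lazy val databaseApi = injector.instanceOf[DBApi]

  // Apply evolutions before the suite runs and roll them back afterwards.
  override def beforeAll(): Unit = Evolutions.applyEvolutions(databaseApi.database("default"))
  override def afterAll(): Unit = Evolutions.cleanupEvolutions(databaseApi.database("default"))

  "test" must {
    "test" in { }
  }
}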
Considering that you are using Play 2.4, where evolutions were moved into a separate module, you have to add evolutions to your project dependencies.
libraryDependencies += evolutions
Source: Evolutions
Relevant commit: Split play-jdbc into three different modules
To have access to play.api.db.Databases, you must add jdbc to your dependencies:
libraryDependencies += jdbc
Hope it helps some people passing here.
EDIT: the code would then look like this:
import play.api.db.Databases

val database = Databases(
  driver = "com.mysql.jdbc.Driver",
  url = "jdbc:mysql://localhost/test",
  name = "mydatabase",
  config = Map(
    "user" -> "test",
    "password" -> "secret"
  )
)
You now have an instance of the DB and can execute queries on it:
val statement = database.getConnection().createStatement()
val resultSet = statement.executeQuery("some_sql_query")
You can see more in the docs.
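One note of my own (not from the original answer): getConnection() hands you a connection that you must close yourself, while Database.withConnection does the cleanup for you. A small sketch, assuming a hypothetical users table:

// The connection is closed/returned automatically when the block finishes.
val names: Seq[String] = database.withConnection { connection =>
  val rs = connection.createStatement().executeQuery("SELECT name FROM users")
  val buffer = scala.collection.mutable.Buffer.empty[String]
  while (rs.next()) buffer += rs.getString("name")
  buffer.toSeq
}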
I find the easiest way to run tests with evolutions applied is to use FakeApplication, and input the connection info for the DB manually.
def withDB[T](code: => T): T =
  // Create application to run database evolutions
  running(FakeApplication(additionalConfiguration = Map(
    "db.default.driver" -> "<my-driver-class>",
    "db.default.url" -> "<my-db-url>",
    "db.default.user" -> "<my-db>",
    "db.default.password" -> "<my-password>",
    "evolutionplugin" -> "enabled"
  ))) {
    // Start a db session
    withSession(code)
  }
Use it like this:
"test" in withDB { }
This allows you, for example, to use an in-memory database for speeding up your unit tests.
You can access the DB instance as play.api.db.DB if you need it. You'll also need to import play.api.Play.current.
Use FakeApplication to read your DB configuration and provide a DB instance.
def withDB[T](code: => T): T =
  // Create application to run database evolutions
  running(FakeApplication(additionalConfiguration = Map(
    "evolutionplugin" -> "disabled"))) {
    import play.api.Play.current
    val database = play.api.db.DB
    Evolutions.applyEvolutions(database)
    withSession(code)
    Evolutions.cleanupEvolutions(database)
  }
Use it like this:
"test" in withDB { }
I am using Mongo with the Play Framework via "reactivemongo", which provides an async bridge between the Mongo connection and the program. For standalone projects I always use the Casbah library: to me it has a more natural syntax (using a Future for every request isn't always needed, and my religion does not allow me to block each request with Async.await), there is no actor overhead, and I don't like the JSON/BSON conversion overhead either.
But using Casbah in the Play Framework the direct way (just creating a Mongo connection in the controller) produces connection leaks; this means you would have to create a connection pool and manage it yourself, in other words rewrite ReactiveMongo.
Has anybody used Casbah with Mongo in production? What is the best and most canonical way of creating and controlling connections in the Play ecosystem?
First you should check Connecting to MongoDB. Then go through this tutorial to create a Scala project (if you use another editor, follow its Scala project creation steps).
Now check the following steps:
1> In projectName/conf/application.conf, add the Mongo DB name, collection name, port number, URL, etc. For example, I added the following to my application.conf:
mongodb.default.host = "localhost"
mongodb.default.db = "Demo"
mongodb.default.port = "27017"
CI.default.uri = "mongodb://localhost:27017/"
2> Create a .scala file in the controllers folder with any name. For example, I named the file ScalaMongoFactory and added the following code to it:
import com.mongodb.casbah.{MongoClient, MongoClientURI}
import com.typesafe.config.ConfigFactory

object ScalaMongoFactory {
  private val config = ConfigFactory.load()
  private val DATABASE = config.getString("mongodb.default.db")
  private val server = MongoClientURI(config.getString("CI.default.uri"))
  private val client = MongoClient(server)

  val database = client(DATABASE)
}
3> Now create a new .scala file in controllers wherever you want to use the Mongo connection. For example, I created a checkConnection.scala file containing:
import com.cloudinsights.scala.controllers.ScalaMongoFactory

object checkConnection {
  val collection = ScalaMongoFactory.database("your collectionName")
}
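To sanity-check the connection you could then run something like this (my own sketch; the collection and field names are hypothetical):

import com.mongodb.casbah.Imports._

object CheckConnectionDemo extends App {
  // Insert one document into a hypothetical "users" collection and read it back.
  val users = ScalaMongoFactory.database("users")
  users.insert(MongoDBObject("name" -> "alice"))
  users.find(MongoDBObject("name" -> "alice")).foreach(println)
}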
There's no need to use Async.await with ReactiveMongo (and you shouldn't anyway).
I guess you could use a utility object to manage your connection:
import com.mongodb.casbah.{MongoClient, MongoDB}
import play.api.Play

object MongoManager {
  private val server = Play.current.configuration.getString("db.host").get
  private val port = Play.current.configuration.getInt("db.port").get

  object using {
    def apply[A](col: String)(block: MongoDB => A): A = {
      val con = MongoClient(server, port)
      val a = block(con.apply(col))
      con.close()
      a
    }

    def apply[A](block: MongoClient => A): A = {
      val con = MongoClient(server, port)
      val a = block(con)
      con.close()
      a
    }
  }

  object stashed {
    private lazy val con = MongoClient(server, port)

    def apply(col: String): MongoDB = con.apply(col)
    def apply: MongoClient = con
  }
}
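Usage could then look like this (a sketch of my own; the database and collection names are hypothetical):

import com.mongodb.casbah.Imports._

// Opens a client, runs the block against the "mydb" database, then closes the client.
val alice = MongoManager.using("mydb") { db =>
  db("users").findOne(MongoDBObject("name" -> "alice"))
}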
I didn't find a Play plugin for this driver.
Personally I'd recommend using the ReactiveMongo driver instead, since it can also use Play's JSON library. If you're taking data from the database and feeding it through a REST API, it's a nicer option.
How do you use Anorm outside of Play in Scala? The Anorm documentation for Play simply uses something like:
DB.withConnection { implicit c =>
  val result: Boolean = SQL("Select 1").execute()
}
The DB object is only for Play. How do you use Anorm alone without using Play?
There is no need for the DB object (it is part of Play JDBC, not Anorm). Anorm works as long as you provide it a connection as an implicit:
implicit val con: java.sql.Connection = ??? // whatever you want to resolve connection
SQL"SELECT * FROM Table".as(...)
You can resolve the JDBC connection in many ways: plain DriverManager.getConnection, JNDI, ...
As for the dependency, it's easy to add in SBT: How to declare dependency on Play's Anorm for a standalone application?
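For example, a minimal standalone sketch (my own; it assumes a local PostgreSQL database, placeholder credentials, and the PostgreSQL JDBC driver on the classpath):

import java.sql.{Connection, DriverManager}
import anorm._

object StandaloneAnorm extends App {
  // Placeholder JDBC URL and credentials; swap in your own.
  implicit val connection: Connection =
    DriverManager.getConnection("jdbc:postgresql://localhost/test", "user", "password")

  try {
    val ok: Boolean = SQL("SELECT 1").execute()
    println(s"query executed: $ok")
  } finally {
    connection.close()
  }
}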
You could also emulate the DB object as follows (I haven't tried this, though):
object DB {
  def withConnection[A](block: Connection => A): A = {
    val connection: Connection = ConnectionPool.borrow()
    try {
      block(connection)
    } finally {
      connection.close()
    }
  }
}
Taken from https://github.com/TimothyKlim/anorm-without-play/blob/master/src/main/scala/Main.scala
Documenting the code that works for me below:
Entry to include in dependencies in build.sbt:
// https://mvnrepository.com/artifact/org.playframework.anorm/anorm
libraryDependencies += "org.playframework.anorm" %% "anorm" % "2.6.7"
Write helper classes:
@Singleton
class DBUtils {
  val schema = AppConfig.defaultSchema

  def withDefaultConnection(sqlQuery: SqlQuery) = {
    // could replace with DBCP, not got a chance yet
    val conn = DriverManager.getConnection(AppConfig.dbUrl, AppConfig.dbUser, AppConfig.dbPassword)
    val result = Try(sqlQuery.execute()(conn))
    conn.close()
    result
  }
}

object DBUtils extends DBUtils
Next, any query can use the withDefaultConnection method to execute:
def saveReviews(listOfReviews: List[Review]): Try[Boolean] = {
  val query = SQL(
    s"""insert into aws.reviews
       |  ( reviewerId,
       |    asin,
       |    overall,
       |    summary,
       |    unixReviewTime,
       |    reviewTime
       |  )
       |values ${listOfReviews.mkString(",")}""".stripMargin)
  //println(query.toString())
  DBUtils.withDefaultConnection(query)
}