Play can't find database - scala

For three days now I have had various problems with Play; I am new to the framework and I don't get what is happening. After I was unable to use Slick (some futures never returned a value), I decided to switch to Anorm. It worked until I decided to add a second repository, after which I am unable to load my page because I keep getting this:
@769g71c3d - Internal server error, for (GET) [/] ->
play.api.UnexpectedException: Unexpected exception[ProvisionException: Unable to provision, see the following errors:
1) Error injecting constructor, java.lang.IllegalArgumentException: Could not find database for pmf
at dao.ActivityTypeRepository.<init>(ActivityTypeDAO.scala:13)
at dao.ActivityTypeRepository.class(ActivityTypeDAO.scala:13)
while locating dao.ActivityTypeRepository
for the 1st parameter of services.ActivityTypeServiceImpl.<init>(ActivityTypeService.scala:17)
while locating services.ActivityTypeServiceImpl
while locating services.ActivityTypeService
The database is configured correctly; I can connect to it via the terminal and via DataGrip... Has anyone ever had a similar issue?
As requested, here is my configuration:
slick.dbs.default.profile = "slick.jdbc.PostgresProfile$"
slick.dbs.default.db.driver = "org.postgresql.Driver"
slick.dbs.default.db.url = "jdbc:postgresql://localhost:5432/pmf"
These are my classes:
@Singleton
class ActivityTypeRepository @Inject()(dbapi: DBApi)(implicit ec: ExecutionContext) {
  private val db = dbapi.database(RepositorySettings.dbName)

  private[dao] val activityTypeMapping = {
    get[Int]("activityType.id") ~
      get[String]("activityType.name") map {
      case id ~ name => ActivityType(id, name)
    }
  }

  def listAll: Future[Seq[ActivityType]] = Future {
    db.withConnection { implicit conn =>
      SQL("SELECT * FROM ActivityType").as(activityTypeMapping *)
    }
  }
}
@Singleton
class UserRepository @Inject()(dbApi: DBApi)(implicit ec: ExecutionContext) {
  private val db = dbApi.database(RepositorySettings.dbName)

  private[dao] val userMapping = {
    get[Option[Long]]("users.id") ~
      get[String]("users.email") ~
      get[Option[String]]("users.name") ~
      get[Option[String]]("users.surname") map {
      case id ~ email ~ name ~ surname => User(id, email, name, surname)
    }
  }

  def add(user: User): Future[Option[Long]] = Future {
    db.withConnection { implicit conn =>
      SQL"INSERT INTO users(id, email, name, surname) VALUES (${user.id}, ${user.email}, ${user.name}, ${user.surname})".executeInsert()
    }
  }

  def find(id: Long): Future[Option[User]] = Future {
    db.withConnection { implicit conn =>
      SQL"SELECT * FROM User WHERE id = $id".as(userMapping *).headOption
    }
  }

  def findByEmail(email: String): Future[Option[User]] = Future {
    db.withConnection { implicit conn =>
      SQL"SELECT * FROM User WHERE email = $email".as(userMapping *).headOption
    }
  }

  def listAll: Future[Seq[User]] = Future {
    db.withConnection { implicit conn =>
      SQL("SELECT * FROM User").as(userMapping *)
    }
  }
}
EDIT:
I added this to application.conf:
db {
  default.driver = org.postgresql.Driver
  default.url = "jdbc:postgresql://localhost:5432/pmf_visualizations"
}
but there was no change.
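For reference, Play's DBApi looks databases up under the db.<name> configuration prefix, so dbapi.database(RepositorySettings.dbName) can only succeed if a block with that exact name exists. A minimal sketch of such a block, assuming RepositorySettings.dbName is "pmf" (inferred from the error message; the object itself is not shown here) and with placeholder credentials:
db {
  pmf {
    driver = org.postgresql.Driver
    url = "jdbc:postgresql://localhost:5432/pmf"
    username = "postgres"  # placeholder
    password = ""          # placeholder
  }
}
The slick.dbs.default.* keys configure play-slick, which is a separate mechanism from the db.* configuration read by the default DBApi.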

Related

How to bind Slick dependency with Lagom?

So, I have this dependency which is used to create tables and interact with Postgres. Here is a sample class:
class ConfigTable {
  this: DBFactory =>

  import driver.api._

  implicit val configKeyMapper = MappedColumnType.base[ConfigKey, String](e => e.toString, s => ConfigKey.withName(s))

  val configs = TableQuery[ConfigMapping]

  class ConfigMapping(tag: Tag) extends Table[Config](tag, "configs") {
    def key = column[ConfigKey]("key")
    def value = column[String]("value")

    def * = (key, value) <> (Config.tupled, Config.unapply _)
  }

  /**
    * add config
    *
    * @param config
    * @return
    */
  def add(config: Config): Try[Config] = try {
    sync(db.run(configs += config)) match {
      case 1 => Success(config)
      case _ => Failure(new Exception("Unable to add config"))
    }
  } catch {
    case ex: PSQLException =>
      if (ex.getMessage.contains("duplicate key value")) Failure(new Exception("alt id already exists."))
      else Failure(new Exception(ex.getMessage))
  }

  def get(key: ConfigKey): Option[Config] = sync(db.run(configs.filter(x => x.key === key).result)).headOption

  def getAll(): Seq[Config] = sync(db.run(configs.result))
}
object ConfigTable extends ConfigTable with PSQLComponent
PSQLComponent is the abstraction for the database meta-configuration:
import slick.jdbc.PostgresProfile

trait PSQLComponent extends DBFactory {
  val driver = PostgresProfile

  import driver.api.Database

  val db: Database = Database.forConfig("db.default")
}
DBFactory is again an abstraction:
import slick.jdbc.JdbcProfile

trait DBFactory {
  val driver: JdbcProfile

  import driver.api._

  val db: Database
}
application.conf:
db.default {
  driver = "org.postgresql.Driver"
  url = "jdbc:postgresql://localhost:5432/db"
  user = "user"
  password = "pass"
  hikaricp {
    minimumIdle = ${db.default.async-executor.minConnections}
    maximumPoolSize = ${db.default.async-executor.maxConnections}
  }
}
jdbc-defaults.slick.profile = "slick.jdbc.PostgresProfile$"
lagom.persistence.jdbc.create-tables.auto=false
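Note that the hikaricp block above references db.default.async-executor.*; those keys are not shown here and must be defined somewhere in the configuration for the substitutions to resolve. A minimal sketch with placeholder values:
db.default.async-executor {
  minConnections = 20
  maxConnections = 20
}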
I compile and publish this dependency to Nexus and am trying to use it in my Lagom microservice.
Here is the Loader Class:
class SlickExapleAppLoader extends LagomApplicationLoader {
  override def load(context: LagomApplicationContext): LagomApplication = new SlickExampleApp(context) {
    override def serviceLocator: ServiceLocator = NoServiceLocator
  }

  override def loadDevMode(context: LagomApplicationContext): LagomApplication = new SlickExampleApp(context) with LagomDevModeComponents {
  }

  override def describeService = Some(readDescriptor[SlickExampleLMSServiceImpl])
}
abstract class SlickExampleApp(context: LagomApplicationContext)
  extends LagomApplication(context)
    // No idea which of these to use, or how; the docs are not clear on this either.
    // with ReadSideJdbcPersistenceComponents
    // with ReadSideSlickPersistenceComponents
    // with SlickPersistenceComponents
    with AhcWSComponents {

  wire[SlickExampleScheduler]
}
I'm trying to implement it in this scheduler:
class SlickExampleScheduler @Inject()(lmsService: LMSService,
                                      configuration: Configuration)(implicit ec: ExecutionContext) {

  val brofile = `SomeDomainObject`

  val gson = new Gson()
  val concurrency = Runtime.getRuntime.availableProcessors() * 10

  implicit val timeout: Timeout = 3.minute
  implicit val system: ActorSystem = ActorSystem("LMSActorSystem")
  implicit val materializer: ActorMaterializer = ActorMaterializer()

  // Getting ExceptionInInitializerError here for ConfigTable ===> ExceptionLine
  val schedulerImplDao = new SchedulerImplDao(ConfigTable)

  def hitLMSAPI = {
    println("=============>1")
    schedulerImplDao.doSomething()
  }

  system.scheduler.schedule(2.seconds, 2.seconds) {
    println("=============>")
    hitLMSAPI
  }
}
Not sure if this is the correct way; if it's not, what is the correct way of doing this? It is a project requirement to keep the data models separate from the service, for the obvious reasons of re-usability.
Exception Stack:
17:50:38.666 [info] akka.cluster.Cluster(akka://lms-impl-application) [sourceThread=ForkJoinPool-1-worker-1, akkaTimestamp=12:20:38.665UTC, akkaSource=akka.cluster.Cluster(akka://lms-impl-application), sourceActorSystem=lms-impl-application] - Cluster Node [akka.tcp://lms-impl-application@127.0.0.1:45805] - Started up successfully
17:50:38.707 [info] akka.cluster.Cluster(akka://lms-impl-application) [sourceThread=lms-impl-application-akka.actor.default-dispatcher-6, akkaTimestamp=12:20:38.707UTC, akkaSource=akka.cluster.Cluster(akka://lms-impl-application), sourceActorSystem=lms-impl-application] - Cluster Node [akka.tcp://lms-impl-application@127.0.0.1:45805] - No seed-nodes configured, manual cluster join required
java.lang.ExceptionInInitializerError
at com.slick.init.impl.SlickExampleScheduler.<init>(SlickExampleScheduler.scala:29)
at com.slick.init.impl.SlickExampleApp.<init>(SlickExapleAppLoader.scala:42)
at com.slick.init.impl.SlickExapleAppLoader$$anon$2.<init>(SlickExapleAppLoader.scala:17)
at com.slick.init.impl.SlickExapleAppLoader.loadDevMode(SlickExapleAppLoader.scala:17)
at com.lightbend.lagom.scaladsl.server.LagomApplicationLoader.load(LagomApplicationLoader.scala:76)
at play.core.server.LagomReloadableDevServerStart$$anon$1.$anonfun$get$5(LagomReloadableDevServerStart.scala:176)
at play.utils.Threads$.withContextClassLoader(Threads.scala:21)
at play.core.server.LagomReloadableDevServerStart$$anon$1.$anonfun$get$3(LagomReloadableDevServerStart.scala:173)
at scala.Option.map(Option.scala:163)
at play.core.server.LagomReloadableDevServerStart$$anon$1.$anonfun$get$2(LagomReloadableDevServerStart.scala:149)
at scala.util.Success.flatMap(Try.scala:251)
at play.core.server.LagomReloadableDevServerStart$$anon$1.$anonfun$get$1(LagomReloadableDevServerStart.scala:147)
at scala.concurrent.Future$.$anonfun$apply$1(Future.scala:658)
at scala.util.Success.$anonfun$map$1(Try.scala:255)
at scala.util.Success.map(Try.scala:213)
at scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
at scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
at scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1402)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157)
Caused by: java.lang.NullPointerException
at com.example.db.models.LoginTable.<init>(LoginTable.scala:29)
at com.example.db.models.LoginTable$.<init>(LoginTable.scala:293)
at com.example.db.models.LoginTable$.<clinit>(LoginTable.scala)
... 24 more
This is how it is working:
abstract class SlickExampleApp(context: LagomApplicationContext) extends LagomApplication(context)
  with SlickPersistenceComponents with AhcWSComponents {

  override implicit lazy val actorSystem: ActorSystem = ActorSystem("LMSActorSystem")
  override lazy val materializer: ActorMaterializer = ActorMaterializer()

  override lazy val lagomServer = serverFor[SlickExampleLMSService](wire[SlickExampleLMSServiceImpl])
  lazy val externalService = serviceClient.implement[LMSService]

  override def connectionPool: ConnectionPool = new HikariCPConnectionPool(environment)

  override def jsonSerializerRegistry: JsonSerializerRegistry = new JsonSerializerRegistry {
    override def serializers: immutable.Seq[JsonSerializer[_]] = Vector.empty
  }

  val loginTable = wire[LoginTable]
  wire[SlickExampleScheduler]
}
> One thing I'd like to report: the Lagom docs about the application.conf configuration of Slick are not correct; they misled me for two days. Then I dug into the library code, and this is how it goes:
private val readSideConfig = system.settings.config.getConfig("lagom.persistence.read-side.jdbc")
private val jdbcConfig = system.settings.config.getConfig("lagom.persistence.jdbc")

private val createTables = jdbcConfig.getConfig("create-tables")
val autoCreateTables: Boolean = createTables.getBoolean("auto")

// users can disable the usage of jndiDbName for userland read-side operations by
// setting the jndiDbName to null. In which case we fallback to slick.db.
// slick.db must be defined otherwise the application will fail to start
val db = {
  if (readSideConfig.hasPath("slick.jndiDbName")) {
    new InitialContext()
      .lookup(readSideConfig.getString("slick.jndiDbName"))
      .asInstanceOf[Database]
  } else if (readSideConfig.hasPath("slick.db")) {
    Database.forConfig("slick.db", readSideConfig)
  } else {
    throw new RuntimeException("Cannot start because read-side database configuration is missing. " +
      "You must define either 'lagom.persistence.read-side.jdbc.slick.jndiDbName' or 'lagom.persistence.read-side.jdbc.slick.db' in your application.conf.")
  }
}

val profile = DatabaseConfig.forConfig[JdbcProfile]("slick", readSideConfig).profile
The configuration it requires is quite different from the one suggested in the docs.
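Putting that together, the configuration this code path actually reads would look roughly like the sketch below. This is inferred from the snippet above (the readSideConfig prefix, slick.db, and the slick profile lookup), not taken from the official docs, and the connection values are placeholders:
lagom.persistence.jdbc.create-tables.auto = false

lagom.persistence.read-side.jdbc {
  slick {
    profile = "slick.jdbc.PostgresProfile$"
    db {
      driver = "org.postgresql.Driver"
      url = "jdbc:postgresql://localhost:5432/db"
      user = "user"
      password = "pass"
    }
  }
}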

Close or shutdown of H2 database after tests is not working

I am facing a problem with database clean-up after each test when using ScalaTest with Slick.
Here is the code of the test:
class H2DatabaseSpec extends WordSpec with Matchers with ScalaFutures with BeforeAndAfter {
  implicit override val patienceConfig = PatienceConfig(timeout = Span(5, Seconds))

  val h2DB: H2DatabaseService = new H2DatabaseService
  var db: Database = _

  before {
    db = Database.forConfig("h2mem1")
    h2DB.createSchema.futureValue
  }

  after {
    db.shutdown.futureValue
  }

  "H2 database" should {
    "query a question" in {
      val newQuestion: QuestionEntity = QuestionEntity(Some(1L), "First question")
      h2DB.insertQuestion(newQuestion).futureValue

      val question = h2DB.getQuestionById(1L)
      question.futureValue.get shouldBe newQuestion
    }
  }

  it should {
    "query all questions" in {
      val newQuestion: QuestionEntity = QuestionEntity(Some(2L), "Second question")
      h2DB.insertQuestion(newQuestion).futureValue

      val questions = h2DB.getQuestions
      questions.futureValue.size shouldBe 1
    }
  }
}
The database service just invokes the run method on the defined database:
class H2DatabaseService {
  val db = Database.forConfig("h2mem1")
  val questions = TableQuery[Question]

  def createSchema =
    db.run(questions.schema.create)

  def getQuestionById(id: Long): Future[Option[QuestionEntity]] =
    db.run(questions.filter(_.id === id).result.headOption)

  def getQuestions: Future[Seq[QuestionEntity]] =
    db.run(questions.result)

  def insertQuestion(question: QuestionEntity): Future[Int] =
    db.run(questions += question)
}

class Question(tag: Tag) extends Table[QuestionEntity](tag, "QUESTION") {
  def id = column[Option[Long]]("QUESTION_ID", O.PrimaryKey, O.AutoInc)
  def title = column[String]("TITLE")

  def * = (id, title) <> ((QuestionEntity.apply _).tupled, QuestionEntity.unapply)
}

case class QuestionEntity(id: Option[Long] = None, title: String) {
  require(!title.isEmpty, "title.empty")
}
And the database is defined as follows:
h2mem1 = {
  url = "jdbc:h2:mem:test1"
  driver = org.h2.Driver
  connectionPool = disabled
  keepAliveConnection = true
}
I am using Scala 2.11.8, Slick 3.1.1, H2 database 1.4.192 and scalatest 2.2.6.
The error that appears when the tests are executed is Table "QUESTION" already exists. So it looks like the shutdown() method has no effect at all (though it is invoked; I checked in the debugger).
Anybody knows how to handle such scenario? How to clean-up database properly after each test?
The database has not been cleaned up correctly because the method was invoked on a different object: H2DatabaseService has its own db object and the test has its own. The issue was fixed after refactoring the H2 database service and invoking:
after {
  h2DB.db.shutdown.futureValue
}
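One possible shape of the refactored spec, consistent with the fix above, is to recreate the service (and therefore its Database) per test and always shut down the service's own db. This is only a sketch; the test cases and patience configuration stay as in the original spec:
import org.scalatest.concurrent.ScalaFutures
import org.scalatest.{BeforeAndAfter, Matchers, WordSpec}

class H2DatabaseSpec extends WordSpec with Matchers with ScalaFutures with BeforeAndAfter {
  // a fresh service per test means a fresh in-memory database per test
  var h2DB: H2DatabaseService = _

  before {
    h2DB = new H2DatabaseService
    h2DB.createSchema.futureValue
  }

  after {
    // shut down the same Database object the service ran its queries against
    h2DB.db.shutdown.futureValue
  }

  // ... test cases unchanged ...
}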

Slick code generation for only a single schema

Is there a way to have Slick's code generation generate code for only a single schema, say, public? I have extensions that create a whole ton of tables (e.g. postgis, pg_jobman) that make the code that Slick generates gigantic.
Use this code with appropriate values and schema names:
object CodeGenerator {
  def outputDir: String = ""
  def pkg: String = ""
  def schemaList: String = "schema1, schema2"
  def url: String = "dburl"
  def fileName: String = ""

  val user = "dbUsername"
  val password = "dbPassword"
  val slickDriver = "scala.slick.driver.PostgresDriver"
  val JdbcDriver = "org.postgresql.Driver"
  val container = "Tables"

  def generate() = {
    val driver: JdbcProfile = buildJdbcProfile
    val schemas = createSchemaList
    var model = createModel(driver, schemas)

    val codegen = new SourceCodeGenerator(model) {
      // customize Scala table name (table class, table values, ...)
      override def tableName = dbTableName => dbTableName match {
        case _ => dbTableName + "Table"
      }

      override def code = {
        // imports is copied right out of
        // scala.slick.model.codegen.AbstractSourceCodeGenerator
        val imports = {
          "import scala.slick.model.ForeignKeyAction\n" +
          (if (tables.exists(_.hlistEnabled)) {
            "import scala.slick.collection.heterogenous._\n" +
            "import scala.slick.collection.heterogenous.syntax._\n"
          } else ""
          ) +
          (if (tables.exists(_.PlainSqlMapper.enabled)) {
            "import scala.slick.jdbc.{GetResult => GR}\n" +
            "// NOTE: GetResult mappers for plain SQL are only generated for tables where Slick knows how to map the types of all columns.\n"
          } else ""
          ) + "\n\n" //+ tables.map(t => s"implicit val ${t.model.name.table}Format = Json.format[${t.model.name.table}]").mkString("\n")+"\n\n"
        }

        val bySchema = tables.groupBy(t => {
          t.model.name.schema
        })

        val schemaFor = (schema: Option[String]) => {
          bySchema(schema).sortBy(_.model.name.table).map(
            _.code.mkString("\n")
          ).mkString("\n\n")
        }
      }

      val joins = tables.flatMap( _.foreignKeys.map{ foreignKey =>
        import foreignKey._
        val fkt = referencingTable.TableClass.name
        val pkt = referencedTable.TableClass.name
        val columns = referencingColumns.map(_.name) zip
          referencedColumns.map(_.name)
        s"implicit def autojoin${fkt + name.toString} = (left:${fkt} ,right:${pkt}) => " +
          columns.map{
            case (lcol, rcol) =>
              "left." + lcol + " === " + "right." + rcol
          }.mkString(" && ")
      })

      override def entityName = dbTableName => dbTableName match {
        case _ => dbTableName
      }

      override def Table = new Table(_) {
        table =>

        // customize table value (TableQuery) name (uses tableName as a basis)
        override def TableValue = new TableValue {
          override def rawName = super.rawName.uncapitalize
        }

        // override generator responsible for columns
        override def Column = new Column(_) {
          // customize Scala column names
          override def rawName = (table.model.name.table, this.model.name) match {
            case _ => super.rawName
          }
        }
      }
    }

    println(outputDir + "\\" + fileName)
    (new File(outputDir)).mkdirs()
    val fw = new FileWriter(outputDir + File.separator + fileName)
    fw.write(codegen.packageCode(slickDriver, pkg, container))
    fw.close()
  }

  def createModel(driver: JdbcProfile, schemas: Set[Option[String]]): Model = {
    driver.simple.Database
      .forURL(url, user = user, password = password, driver = JdbcDriver)
      .withSession { implicit session =>
        val filteredTables = driver.defaultTables.filter(
          (t: MTable) => schemas.contains(t.name.schema)
        )
        PostgresDriver.createModel(Some(filteredTables))
      }
  }

  def createSchemaList: Set[Option[String]] = {
    schemaList.split(",").map({
      case "" => None
      case (name: String) => Some(name)
    }).toSet
  }

  def buildJdbcProfile: JdbcProfile = {
    val module = currentMirror.staticModule(slickDriver)
    val reflectedModule = currentMirror.reflectModule(module)
    val driver = reflectedModule.instance.asInstanceOf[JdbcProfile]
    driver
  }
}
I encountered the same problem and found this question. The answer by S.Karthik sent me in the right direction. However, the code in that answer is slightly outdated and, I think, a bit over-complicated. So I crafted my own solution:
import slick.codegen.SourceCodeGenerator
import slick.driver.JdbcProfile
import slick.model.Model
import scala.concurrent.duration.Duration
import scala.concurrent.{Await, ExecutionContext}
val slickDriver = "slick.driver.PostgresDriver"
val jdbcDriver = "org.postgresql.Driver"
val url = "jdbc:postgresql://localhost:5432/mydb"
val outputFolder = "/path/to/src/test/scala"
val pkg = "com.mycompany"
val user = "user"
val password = "password"
object MySourceCodeGenerator {
  def run(slickDriver: String, jdbcDriver: String, url: String, outputDir: String,
          pkg: String, user: Option[String], password: Option[String]): Unit = {
    val driver: JdbcProfile =
      Class.forName(slickDriver + "$").getField("MODULE$").get(null).asInstanceOf[JdbcProfile]
    val dbFactory = driver.api.Database
    val db = dbFactory.forURL(url, driver = jdbcDriver, user = user.orNull,
      password = password.orNull, keepAliveConnection = true)
    try {
      // **1**
      val allSchemas = Await.result(db.run(
        driver.createModel(None, ignoreInvalidDefaults = false)(ExecutionContext.global).withPinnedSession), Duration.Inf)
      // **2**
      val publicSchema = new Model(allSchemas.tables.filter(_.name.schema.isEmpty), allSchemas.options)
      // **3**
      new SourceCodeGenerator(publicSchema).writeToFile(slickDriver, outputDir, pkg)
    } finally db.close
  }
}
MySourceCodeGenerator.run(slickDriver, jdbcDriver, url, outputFolder, pkg, Some(user), Some(password))
I'll explain what's going on here:
I copied the run function from the SourceCodeGenerator class that's in the slick-codegen library. (I used version slick-codegen_2.10-3.1.1.)
// **1**: In the original code, the generated Model was referenced in a val called m. I renamed that to allSchemas.
// **2**: I created a new Model (publicSchema), using the options from the original model, and using a filtered version of the tables set from the original model. It turns out tables from the public schema don't get a schema name in the model, hence the isEmpty. Should you need tables from one or more other schemas, you can easily create a different filter expression (see the sketch after these notes).
// **3**: I create a SourceCodeGenerator with the created publicSchema model.
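For example, a variant of step **2** that keeps only the tables of one named schema could be written as below. The schema name "audit" is just a placeholder, and allSchemas is the model generated in step **1**:
// drop-in replacement for the publicSchema line above: keep only tables
// whose schema is "audit" (placeholder name)
val auditSchema = new Model(
  allSchemas.tables.filter(_.name.schema.contains("audit")),
  allSchemas.options)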
Of course, it would be even better if the Slick code generator incorporated an option to select one or more schemas.

Why does compilation fail with "not found: value Users"?

I want to retrieve a row from my default database, postgres. I already have the table "Users" defined.
conf/application.conf
db.default.driver=org.postgresql.Driver
db.default.url="jdbc:postgresql://localhost:5234/postgres"
db.default.user="postgres"
db.default.password=""
controllers/Application.scala
package controllers

import models.{UsersDatabase, Users}
import play.api.mvc._

object Application extends Controller {
  def index = Action {
    Ok(views.html.index(UsersDatabase.getAll))
  }
}
models/Users.scala
package models

import java.sql.Date

import play.api.Play.current
import play.api.db.DB
import slick.driver.PostgresDriver.simple._

case class User(
  id: Int,
  username: String,
  password: String,
  full_name: String,
  email: String,
  gender: String,
  dob: Date,
  joined_date: Date
)

class Users(tag: Tag) extends Table[User](tag, "Users") {
  def id = column[Int]("id")
  def username = column[String]("username", O.PrimaryKey)
  def password = column[String]("password")
  def full_name = column[String]("full_name")
  def email = column[String]("email")
  def gender = column[String]("gender")
  def dob = column[Date]("dob")
  def joined_date = column[Date]("joined_date")

  def * = (id, username, password, full_name, email, gender, dob, joined_date) <> (User.tupled, User.unapply)
}

object UsersDatabase {
  def getAll: List[User] = {
    Database.forDataSource(DB.getDataSource()) withSession {
      Query(Users).list
    }
  }
}
While accessing http://localhost:9000/ it gives a compilation error:
[error] .../app/models/Users.scala:36: not found: value Users
[error] Query(Users).list
[error] ^
[error] one error found
[error] (compile:compile) Compilation failed
How to resolve this error and access data properly?
The compilation error message says it all: there is no value Users in scope to use.
Change the object UsersDatabase to look as follows:
object UsersDatabase {
  val users = TableQuery[Users]

  def getAll: List[User] = {
    Database.forDataSource(DB.getDataSource()) withSession { implicit session =>
      users.list
    }
  }
}
And the error goes away since you're using the local val users to list users in the database.
As described in Querying in the official Slick documentation, session is an implicit parameter of list (declared as final def list(implicit session: SessionDef): List[R]), hence the implicit session in the block:
All methods that execute a query take an implicit Session value. Of
course, you can also pass a session explicitly if you prefer:
val l = q.list(session)

Slick: create database

Is there a way to get slick to create the database if it doesn't already exist?
Database.forURL("jdbc:mysql://127.0.0.1/database", driver = "com.mysql.jdbc.Driver", user = "root") withSession {
// create tables, insert data
}
"database" doesn't exist, so I want slick to create it for me. Any ideas? Thanks.
The other answer (below) is relevant to Slick 2.x; withSession is deprecated in Slick 3, so this is how it is done with the Slick 3.0.0 API:
import scala.concurrent.Await
import scala.concurrent.duration._
import scala.util.Try

import org.postgresql.util.PSQLException
import slick.driver.PostgresDriver
import slick.driver.PostgresDriver.api._
object SlickPGUtils {
  private val actionTimeout = 10 second
  private val driver = "org.postgresql.Driver"

  def createDb(host: String, port: Int, dbName: String, user: String, pwd: String) = {
    val onlyHostNoDbUrl = s"jdbc:postgresql://$host:$port/"
    using(Database.forURL(onlyHostNoDbUrl, user = user, password = pwd, driver = driver)) { conn =>
      Await.result(conn.run(sqlu"CREATE DATABASE #$dbName"), actionTimeout)
    }
  }

  def dropDb(host: String, port: Int, dbName: String, user: String, pwd: String) = {
    val onlyHostNoDbUrl = s"jdbc:postgresql://$host:$port/"
    try {
      using(Database.forURL(onlyHostNoDbUrl, user = user, password = pwd, driver = driver)) { conn =>
        Await.result(conn.run(sqlu"DROP DATABASE #$dbName"), actionTimeout)
      }
    } catch {
      // ignore failure due to db not exist
      case e: PSQLException => if (e.getMessage.equals(s""""database "$dbName" does not exist""")) {/* do nothing */}
      case e: Throwable => throw e // escalate other exceptions
    }
  }

  private def using[A <: {def close() : Unit}, B](resource: A)(f: A => B): B =
    try {
      f(resource)
    } finally {
      Try {
        resource.close()
      }.failed.foreach(err => throw new Exception(s"failed to close $resource", err))
    }
}
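Usage is then a one-liner; the host, port, and credentials below are placeholders:
// create the database "mydb" on a local PostgreSQL server
SlickPGUtils.createDb("localhost", 5432, "mydb", "postgres", "secret")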
You can connect to the database engine using only "jdbc:mysql://localhost/" as the JDBC URL and then issue a plain SQL CREATE DATABASE query:
import scala.slick.driver.MySQLDriver.simple._
import scala.slick.jdbc.{StaticQuery => Q}

object Main extends App {
  Database.forURL("jdbc:mysql://localhost/", driver = "com.mysql.jdbc.Driver") withSession {
    implicit session =>
      Q.updateNA("CREATE DATABASE `dataBaseName`").execute
      .
      .
      .
  }
}