I'm new to Scala and I'm trying to integrate a PostgreSQL database into a Lagom application written in Scala, using Lagom's persistence API. Lagom has built-in support for Slick.
My table has 3 fields: id of type int, name of type string, and data of type jsonb.
Since Slick doesn't support the json type, I'm trying to use slick-pg.
Below is my implementation.
My custom profile class
import com.github.tminglei.slickpg.{ExPostgresProfile, PgPlayJsonSupport}
import play.api.libs.json.JsValue
import slick.basic.Capability
import slick.jdbc.{JdbcCapabilities, PostgresProfile}

trait CustomPostgresProfile extends ExPostgresProfile with PgPlayJsonSupport {
  def pgjson = "jsonb"

  override protected def computeCapabilities: Set[Capability] =
    super.computeCapabilities + JdbcCapabilities.insertOrUpdate

  override val api = PostgresJsonSupportAPI

  object PostgresJsonSupportAPI extends API with JsonImplicits {}
}
object CustomPostgresProfile extends PostgresProfile
My table definition
import com.custom.persistence.profile.CustomPostgresProfile.api._
import play.api.libs.json._

case class CustomDataEntity(id: Int, name: String, data: JsValue)

object CustomDataTableDef {
  val data = TableQuery[CustomDataTableDef]
}

class CustomDataTableDef(tag: Tag) extends Table[CustomDataEntity](tag, "custom") {
  def id = column[Int]("id", O.PrimaryKey)
  def name = column[String]("name")
  def data = column[JsValue]("data")

  override def * =
    (id, name, data) <> (CustomDataEntity.tupled, CustomDataEntity.unapply)
}
When I try to compile the code, I get the 2 errors below:
could not find implicit value for parameter tt: slick.ast.TypedType[play.api.libs.json.JsValue]
[error] def data = column[JsValue]("data")
Cannot resolve symbol <>
Please help me resolve this.
Your object CustomPostgresProfile extends PostgresProfile instead of your CustomPostgresProfile trait, so the api you import comes from the plain profile and does not include the slick-pg JSON support (hence no column type for JsValue, and the cascading error on <>). If you fix that, it works.
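For reference, a minimal sketch of the corrected companion object, placed in the same file as the trait above:

// The object must extend the custom trait so that the api it exposes
// includes the JSON implicits (and therefore a column type for JsValue).
object CustomPostgresProfile extends CustomPostgresProfile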
I have the following:
Account.scala
package modules.accounts

import java.time.Instant

import reactivemongo.api.bson._

case class Account(id: String, name: String)

object Account {
  type ID = String

  implicit val accountHandler: BSONDocumentHandler[Account] = Macros.handler[Account]
  // implicit def accountWriter: BSONDocumentWriter[Account] = Macros.writer[Account]
  // implicit def accountReader: BSONDocumentReader[Account] = Macros.reader[Account]
}
AccountRepo.scala
package modules.accounts

import java.time.Instant

import reactivemongo.api.collections.bson.BSONCollection

import scala.concurrent.ExecutionContext

final class AccountRepo(
    val coll: BSONCollection
)(implicit ec: ExecutionContext) {
  import Account.{ accountHandler, ID }

  def insertTest() = {
    val doc = Account(s"account123", "accountName") //, Instant.now)
    coll.insert.one(doc)
  }
}
The error I am getting is:
could not find implicit value for parameter writer: AccountRepo.this.coll.pack.Writer[modules.accounts.Account]
[error] coll.insert.one(doc)
From what I understand, the implicit handler generated by the macro should be enough to provide the Writer. What am I doing wrong?
Reference: http://reactivemongo.org/releases/1.0/documentation/bson/typeclasses.html
The code is mixing incompatible versions.
The macro-generated handler uses the new BSON API, as can be seen from the import of reactivemongo.api.bson, whereas the collection comes from the old driver, as can be seen from the use of reactivemongo.api.collections.bson instead of reactivemongo.api.bson.collection.
It's recommended to have a look at the documentation, and not to mix incompatible versions of related libraries.
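A minimal sketch of the repository aligned on the new API (assuming ReactiveMongo 1.0+); only the BSONCollection import changes:

package modules.accounts

import scala.concurrent.ExecutionContext

// Collection type from the new BSON serialization pack, matching the macro handler.
import reactivemongo.api.bson.collection.BSONCollection

final class AccountRepo(
    val coll: BSONCollection
)(implicit ec: ExecutionContext) {
  import Account.accountHandler // now resolved as coll.pack.Writer[Account]

  def insertTest() = {
    val doc = Account("account123", "accountName")
    coll.insert.one(doc)
  }
}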
I'm not really sure where this error is coming from. I'm trying to migrate from an old .find() method where I could pass a query: BSONDocument to find and call .one[T] on it. That worked perfectly fine, but I'd like to follow the recommendations.
However, I have a feeling that my desire to create a generic database access object is causing some of this headache... take a look below.
The object that I'm trying to read from:
package Models

import reactivemongo.api.bson.{BSONDocumentHandler, Macros}

case class Person(name: String)

object Person {
  implicit val handler: BSONDocumentHandler[Person] = Macros.handler[Person]
}
The class containing the method I'm trying to execute (ignore the Await, etc.)
package Data

import reactivemongo.api.AsyncDriver
import reactivemongo.api.bson._
import reactivemongo.api.bson.collection.BSONCollection
import reactivemongo.api.commands.WriteResult

import scala.concurrent.{Await, Future}
import scala.concurrent.duration._
import scala.reflect.ClassTag

object DataService extends DataService

trait DataService {
  implicit val ec: scala.concurrent.ExecutionContext = scala.concurrent.ExecutionContext.global

  private val driver = new AsyncDriver
  private val connection = Await.result(driver.connect(List("localhost:32768")), 10.seconds)
  private val database = Await.result(connection.database("database"), 10.seconds)

  private def getCollection[T]()(implicit tag: ClassTag[T]): BSONCollection =
    database.collection(tag.runtimeClass.getSimpleName)

  def Get[T]()(implicit handler: BSONDocumentHandler[T], tag: ClassTag[T]): Future[Option[T]] = {
    val collection = getCollection[T]
    collection.find(BSONDocument("name" -> "hello"), Option.empty).one[T]
  }
}
How I'm trying to get it (ignore the Await, this is just for test purposes):
val personName = Await.result(dataService.Get[Person](), 10.seconds).map(_.name).getOrElse("Not found")
The error that I'm getting:
Error:(32, 19) ambiguous implicit values:
both object BSONDocumentIdentity in trait BSONIdentityHandlers of type reactivemongo.api.bson.package.BSONDocumentIdentity.type
and value handler of type reactivemongo.api.bson.BSONDocumentHandler[T]
match expected type collection.pack.Writer[J]
collection.find(BSONDocument("name" -> "hello"), Option.empty).cursor[T]
Thank you all for your help.
Try adding an explicit type to the Option you pass as the second argument of find. With a plain Option.empty the projection type stays undetermined, so the compiler finds more than one candidate Writer. For example:
def Get[T]()(implicit handler: BSONDocumentHandler[T], tag: ClassTag[T]): Future[Option[T]] = {
  val collection = getCollection[T]
  collection.find(BSONDocument("name" -> "hello"), Option.empty[BSONDocument]).one[T]
}
Slick 3.0.0
Play 2.6.2
I am experimenting with Slick and have run into an interesting issue. I hope the solution is trivial and I am just overthinking this.
I have implemented the following simple code.
case class Content(content: String)

class ContentTable(tag: Tag) extends Table[Content](tag, "content") {
  def id = column[Long]("id", O.PrimaryKey, O.AutoInc)
  def content = column[String]("content")

  override def * : ProvenShape[Content] = (content).mapTo[Content]
}

object ContentDb {
  val db = Database.forConfig("databaseConfiguration")

  lazy val contents = TableQuery[ContentTable]

  def all: Seq[Content] = Await.result(db.run(contents.result), 2 seconds)
}
So for this code to work, it requires the following import, among others of course:
import slick.jdbc.H2Profile.api._
or alternatively
import slick.jdbc.PostgresProfile.api._
Now, I think, and please correct me if I am wrong, that the database driver should be a configuration detail. That is, I'd choose to run an in-memory H2 database while developing, run against a test PostgreSQL instance when testing, and then run against another instance in production. I mean, the whole idea of abstracting over the driver is to have this flexibility... I think.
Now, I have done some research and found that I could do something like this:
trait DbComponent {
  val driver: JdbcProfile

  import driver.api._

  val db: Database
}

trait H2DbComponent extends DbComponent {
  val driver: JdbcProfile = slick.jdbc.H2Profile

  import driver.api._

  val db = Database.forConfig("databaseConfiguration")
}

trait Contents {
  def all: Seq[Content]
}

object Contents {
  def apply: Contents = new ContentsDb with H2DbComponent
}

trait ContentsDb extends Contents {
  this: DbComponent =>

  import driver.api._

  class ContentTable(tag: Tag) extends Table[Content](tag, "content") {
    def id = column[Long]("id", O.PrimaryKey, O.AutoInc)
    def content = column[String]("content")

    override def * : ProvenShape[Content] = content.mapTo[Content]
  }

  lazy val contents = TableQuery[ContentTable]

  def all: Seq[Content] = Await.result(db.run(contents.result), 2 seconds)
}
Then I can use dependency injection to inject the right instance for each entity that I have. Not ideal, but possible. So I started digging into how to do conditional dependency injection in Play Framework based on the environment the application is running in.
I was expecting something similar to the following:
@Component(env = ("prod", "test"))
class ProductionContentsDb extends ContentsDb with PostgresDbComponent

@Component(env = "dev")
class DevelopmentContentsDb extends ContentsDb with H2DbComponent
But, no luck...
EDIT
Just after I finished writing this and started reading it again, I am curious whether we could just have something similar to:
class DbComponent @Inject() (driver: JdbcProfile) {
  import driver.api._

  val db = Database.forConfig("databaseConfiguration")
}
You can create separate configuration files for each environment, for example:
application.conf -- local
application.prod.conf -- prod, with contents along these lines:
include "application.conf"
### override the Slick configuration here
Then, when running the application in the different stages, feed in the right file with -Dconfig.resource=application.prod.conf.
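A sketch of what application.prod.conf could contain, assuming the connection settings live under the databaseConfiguration key read by Database.forConfig in the snippets above (host, database, user and password handling are illustrative):

include "application.conf"

# Override the connection settings for production; the values below are examples.
databaseConfiguration {
  driver = "org.postgresql.Driver"
  url = "jdbc:postgresql://prod-db-host:5432/contentdb"
  user = "app"
  password = ${?DB_PASSWORD}
}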
I am playing around with Slick on top of a Twitter Finatra application. Finally I thought I had made it, but now, when I want to process a result, I always get a recursion error. I looked around but did not find anything helpful for this particular problem. The code I have is actually quite simple:
Map the Database class to a custom Type:
package com.configurationManagement.library

package object Types {
  type SlickDatabase = slick.driver.MySQLDriver.api.Database
}
Model:
package com.configurationManagement.app.domain

import slick.lifted.Tag
import slick.driver.MySQLDriver.api._
import slick.profile.SqlProfile.ColumnOption.NotNull

case class ConfigurationTemplate(id: Option[Int], name: String)

class ConfigurationTemplates(tag: Tag) extends Table[ConfigurationTemplate](tag, "configuration_template") {
  def id = column[Int]("id", O.AutoInc, O.PrimaryKey)
  def name = column[String]("name", NotNull)

  def uniqueNameIndex = index("unique_name", name, unique = true)

  def * = (id.?, name) <> (ConfigurationTemplate.tupled, ConfigurationTemplate.unapply)
}
Controller:
package com.configurationManagement.app.controller

import com.google.inject.{Inject, Singleton}

import com.configurationManagement.app.domain.ConfigurationTemplates
import com.configurationManagement.app.dto.request.RequestConfigurationTemplateDto
import com.configurationManagement.library.Types._
import com.twitter.finatra.http.Controller
import com.twitter.inject.Logging

import slick.driver.MySQLDriver.api._

@Singleton
class ConfigurationTemplateController @Inject()(database: SlickDatabase)
  extends Controller with Logging with FutureConverter {

  post("/configurations/templates") { dto: RequestConfigurationTemplateDto =>
    val templates = TableQuery[ConfigurationTemplates]
    val query = templates.filter(_.id === 1)
    response.ok(query.map(_.name))
  }
}
And here comes the error
Infinite recursion (StackOverflowError) (through reference chain: com.configurationManagement.app.domain.ConfigurationTemplates["table_node"]->slick.ast.TableNode["driver_table"]->com.configurationManagement.app.domain.ConfigurationTemplates["table_node"]->slick.ast.TableNode["driver_table"]->com.configurationManagement.app.domain.ConfigurationTemplates["table_node"]->slick.ast.TableNode["driver_table"]->com.configurationManagement.app.domain.ConfigurationTemplates["table_node"]->slick.ast.TableNode["driver_table"]->com.configurationManagement.app.domain.ConfigurationTemplates["table_node"]->slick.ast.TableNode["driver_table"]->com.configurationManagement.app.domain.ConfigurationTemplates["table_node"]->slick.ast.TableNode["driver_table"]->com.configurationManagement.app.domain.ConfigurationTemplates["table_node"]->slick.ast.TableNode["driver_table"]->com.configurationManagement.app.domain.ConfigurationTemplates["table_node"]->slick.ast
Obviously this line causes the error:
query.map(_.name)
Two things I see: first, you need to add .result to the query to turn it into a FixedSqlStreamingAction; second, you need a database to run that query on:
private[this] val database: slick.driver.MySQLDriver.backend.DatabaseDef = Database.forDataSource(...)
database.run(templates.filter(_.id === 1).map(_.name).result)
That returns a Future[Seq[String]], which is probably the type response.ok expects.
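Putting both points together inside the route, a rough sketch (assuming the injected database is the instance the query should run on and that FutureConverter bridges the Scala Future for Finatra):

post("/configurations/templates") { dto: RequestConfigurationTemplateDto =>
  val templates = TableQuery[ConfigurationTemplates]
  // .result turns the query into a runnable action; database.run executes it.
  val names = database.run(templates.filter(_.id === 1).map(_.name).result)
  names.map(response.ok(_)) // Future[Seq[String]] mapped into a response
}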
I followed the documentation of Slick 3.0.0-RC1, using Typesafe Config for the database connection configuration. Here is my conf:
database = {
  driver = "org.postgresql.Driver"
  url = "jdbc:postgresql://localhost:5432/postgre"
  user = "postgre"
}
I created a file Locale.scala as follows:
package models

import slick.driver.PostgresDriver.api._

import scala.concurrent.Future

case class Locale(id: String, name: String)

class Locales(tag: Tag) extends Table[Locale](tag, "LOCALES") {
  def id = column[String]("ID", O.PrimaryKey)
  def name = column[String]("NAME")

  def * = (id, name) <> (Locale.tupled, Locale.unapply)
}

object Locales {
  private val locales = TableQuery[Locales]

  val db = Database.forConfig("database")

  def count: Future[Int] =
    try db.run(locales.length.result)
    finally db.close
}
Then I got confused about when and where the proper place is to create the Database object using
val db = Database.forConfig("database")
If I create db like this, there will be as many Database objects as I have models. So what is the best practice to make this work?
You can create an object DBLocator and initialise the Database in it with a lazy val, so that it is only created on demand.
You can then invoke the method defined on DBLocator whenever you need a Session or want to run an action, instead of creating a Database per model.
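A minimal sketch of that idea (the object name and config key follow the snippets above; adjust the profile import to your database):

import slick.driver.PostgresDriver.api._

object DBLocator {
  // Created once, on first access, and shared by every model.
  lazy val db: Database = Database.forConfig("database")
}

Model objects would then call, for example, DBLocator.db.run(locales.length.result) instead of each holding its own db.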