Phantom DSL 2.24 tables are not created - Scala

I'm trying to migrate to the latest phantom version, 2.24.8. I created a dummy project but am running into a few issues that I can't figure out. Here's my code:
import com.outworkers.phantom.connectors.{CassandraConnection, ContactPoints}
import com.outworkers.phantom.database.Database
import scala.concurrent.Future
import com.outworkers.phantom.dsl._

case class Test(id: String, timestamp: String)

abstract class Tests extends Table[Tests, Test] {
  object id extends StringColumn with PartitionKey
  object timestamp extends StringColumn with ClusteringOrder
}
abstract class ConcreteTests extends Tests with RootConnector {
  def addTest(l: Test): Future[ResultSet] = {
    // store(l).consistencyLevel_=(ConsistencyLevel.LOCAL_ONE).future
    insert.value(_.id, l.id)
      .value(_.timestamp, l.timestamp)
      .consistencyLevel_=(ConsistencyLevel.QUORUM).future
  }
}
class MyDB(override val connector: CassandraConnection) extends Database[MyDB](connector) {
  object tests extends ConcreteTests with connector.Connector

  def init(): Unit = {
    tests.create
  }
}
object Test {
  def main(args: Array[String]): Unit = {
    val db = new MyDB(ContactPoints(Seq("127.0.0.1")).keySpace("tests"))
    db.init()
    db.tests.addTest(Test("1", "1323234234"))
    println("Done")
  }
}
It compiles and runs in IntelliJ and prints 'Done'. However, no table is ever created, and there are no exceptions or warnings; it simply does nothing. If I stop the local Cassandra database, the code throws a NoHostAvailableException, so it does try to connect to the local database. What is the problem?
Another weird thing is that "com.typesafe.play" %% "play-json" % "2.6.9" is in my build.sbt. If I remove that library, the same code throws the following exception:
Exception in thread "main" java.lang.NoClassDefFoundError: scala/reflect/runtime/package$
at com.outworkers.phantom.column.AbstractColumn.com$outworkers$phantom$column$AbstractColumn$$_name(AbstractColumn.scala:55)
at com.outworkers.phantom.column.AbstractColumn.com$outworkers$phantom$column$AbstractColumn$$_name$(AbstractColumn.scala:54)
at com.outworkers.phantom.column.Column.com$outworkers$phantom$column$AbstractColumn$$_name$lzycompute(Column.scala:22)
at com.outworkers.phantom.column.Column.com$outworkers$phantom$column$AbstractColumn$$_name(Column.scala:22)
at com.outworkers.phantom.column.AbstractColumn.name(AbstractColumn.scala:58)
at com.outworkers.phantom.column.AbstractColumn.name$(AbstractColumn.scala:58)
at com.outworkers.phantom.column.Column.name(Column.scala:22)
at com.outworkers.phantom.builder.query.InsertQuery.value(InsertQuery.scala:107)
I really can't figure out what's going on. Any help?
By the way, I'm using Scala 2.12.6 and JDK 1.8.0_181.

You're not using the correct DSL method for table creation; have a look at the official guide. All that table.create does is build an empty CreateQuery, and you're coercing the return type to Unit manually.
The automated blocking create method lives on Database, not on the table, so what you want is:
class MyDB(override val connector: CassandraConnection) extends Database[MyDB](connector) {
  object tests extends ConcreteTests with connector.Connector

  def init(): Unit = {
    this.create
  }
}
If you want to achieve the same thing through the table itself, you need:
class MyDB(override val connector: CassandraConnection) extends Database[MyDB](connector) {
  object tests extends ConcreteTests with connector.Connector

  def init(): Unit = {
    import scala.concurrent.duration._
    import scala.concurrent.Await
    Await.result(tests.create.future(), 10.seconds)
  }
}
It's only the call to the future() method that triggers any action against the database; otherwise you're just building a query without executing it. The method name can be confusing, and we will improve the docs in future releases to make this more obvious.
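To make the build-versus-execute distinction concrete, here is a minimal sketch reusing the db value from the question's main method (the ten-second timeout is an arbitrary choice):

// This only builds a CreateQuery; nothing is sent to Cassandra.
val query = db.tests.create

// Only future() executes the statement. Block on the result if the table
// must exist before the next step, e.g. in a simple main method.
import scala.concurrent.Await
import scala.concurrent.duration._
Await.result(db.tests.create.future(), 10.seconds)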
The conflict with play-json 2.6.9 looks very weird; the NoClassDefFoundError suggests scala-reflect ends up missing from the classpath, so it's entirely possible there's an incompatible transitive dependency behind the scenes to do with macro compilation. Raise that as a separate issue and we can definitely have a look at it.

Related

`exception during macro expansion: [error] scala.reflect.macros.TypecheckException` when using quill

I'm pretty new to Scala, Play, and Quill and I'm not sure what I'm doing wrong. I have my project split up into models, repositories, and services (and controllers, but that is not relevant for this question). Right now, I'm getting this error for the lines in my services that are making changes to the database:
exception during macro expansion: scala.reflect.macros.TypecheckException: Can't find implicit `Decoder[models.AgentId]`. Please, do one of the following things:
1. ensure that implicit `Decoder[models.AgentId]` is provided and there are no other conflicting implicits;
2. make `models.AgentId` `Embedded` case class or `AnyVal`.
And I'm getting this error for all the other lines in my services:
exception during macro expansion: [error] scala.reflect.macros.TypecheckException: not found: value quote
I found a similar ticket, but the same fix does not work for me (I am already requiring ctx as an implicit variable, so I can't import it as well). I'm totally at a loss, and if anyone has any suggestions, I would be happy to try anything. I'm using the following versions:
Scala 2.12.4
Quill 2.3.2
Play 2.6.6
The code:
db/package.scala
package db

import io.getquill.{PostgresJdbcContext, SnakeCase}

package object db {
  class DBContext(config: String) extends PostgresJdbcContext(SnakeCase, config)

  trait Repository {
    val ctx: DBContext
  }
}
repositories/AgentsRepository.scala
package repositories

import db.db.Repository
import models.{Agent, AgentId}

trait AgentsRepository extends Repository {
  import ctx._

  val agents = quote {
    query[Agent]
  }

  def agentById(id: AgentId) = quote { agents.filter(_.id == lift(id)) }

  def insertAgent(agent: Agent) = quote {
    query[Agent]
      .insert(_.identifier -> lift(agent.identifier))
      .returning(_.id)
  }
}
services/AgentsService.scala
package services

import db.db.DBContext
import models.{Agent, AgentId}
import repositories.AgentsRepository
import scala.concurrent.ExecutionContext

class AgentService(implicit val ex: ExecutionContext, val ctx: DBContext)
  extends AgentsRepository {

  def list: List[Agent] =
    ctx.run(agents)

  def find(id: AgentId): List[Agent] =
    ctx.run(agentById(id))

  def create(agent: Agent): AgentId = {
    ctx.run(insertAgent(agent))
  }
}
models/Agent.scala
package models

import java.time.LocalDateTime

case class AgentId(value: Long) extends AnyVal

case class Agent(
  id: AgentId,
  identifier: String
)
I am already requiring ctx as an implicit variable, so I can't import it as well
You don't need to import the context itself, but everything inside it, in order to make the macros work:
import ctx._
Make sure to place it before ctx.run is called, as in https://github.com/getquill/quill/issues/998#issuecomment-352189214
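As a minimal sketch with the question's own classes (the implicit ctx parameter and the import coexist without conflict):

class AgentService(implicit val ex: ExecutionContext, val ctx: DBContext)
  extends AgentsRepository {
  // Bring the injected context's members (quote, query, lift and the implicit
  // encoders/decoders) into scope before any ctx.run call.
  import ctx._

  def find(id: AgentId): List[Agent] =
    ctx.run(agentById(id))
}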

Unable to provision my slick setup using traits, NullPointerException

I rewrote my Slick database layer to use traits (I was using classes before), and I am now getting this error.
It looks like my DatabaseConfig is possibly null?
Unexpected exception ProvisionException: Unable to provision, see the following errors:

Error injecting constructor, java.lang.NullPointerException
  at play.api.DefaultApplication.class(Application.scala:221)
  while locating play.api.DefaultApplication
  while locating play.api.Application
Caused by: java.lang.NullPointerException
  at play.api.db.slick.HasDatabaseConfig$class.driver(DatabaseConfigProvider.scala:142)
Below is my controller that uses the DbService, along with the traits etc. that I am using to wire up my Slick code using play-slick (2.0.2):
@Singleton
class HomeController @Inject() (dbService: DbService) extends Controller {
}
Module:
bind(classOf[DbService]).to(classOf[DbServiceImpl])
My Slick db layer is set up as follows:
trait DbService extends UserTable with AccountTable {
  this: HasDatabaseConfigProvider[JdbcProfile] =>
  import driver.api._
  // ..
}
@Singleton
class DbServiceImpl @Inject() (protected val dbConfigProvider: DatabaseConfigProvider)
  extends DbService with HasDatabaseConfigProvider[JdbcProfile] {
  import driver.api._
}
trait AccountTable {
  this: HasDatabaseConfigProvider[JdbcProfile] =>
  import driver.api._

  lazy val accounts = TableQuery[AccountsTable]

  def getAccountById(id: Int): Future[Option[Account]] =
    db.run(accounts.filter(_.id === id).result.headOption)

  class AccountsTable(tag: Tag) extends Table[Account](tag, "accounts") {
    def id = column[Int]("id", O.PrimaryKey, O.AutoInc)
    def companyName = column[String]("company_name")
    def * = (id, companyName) <> (Account.tupled, Account.unapply _)
  }
}
What seems to be the problem with my slick setup? I can't figure it out so far.
Update
The full stack trace is here: https://pastebin.com/CXzUB0Kx
The crash comes from here: https://github.com/playframework/play-slick/blob/2.0.2/src/core/src/main/scala/play/api/db/slick/DatabaseConfigProvider.scala#L142, so you are right, your DatabaseConfig (dbConfig) is null.
This could be an initialization-order problem. As you can see in the code referenced above, driver (being a lazy val) is clearly meant to be accessed only after instantiation.
Did you post the full stack trace? The full stack trace leading to the NullPointerException would allow identifying where this access comes from.
Without a more precise stack trace, you should ensure that you do not access driver, or members imported through import driver.api._, too early. The most likely cause would be some val that you should turn into a lazy val.
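To illustrate the pitfall (a hypothetical sketch, not the asker's code): a strict val evaluated during construction can touch driver before Guice has injected dbConfigProvider, while a lazy val defers that access until first use:

trait AccountTable {
  this: HasDatabaseConfigProvider[JdbcProfile] =>
  import driver.api._

  // val accounts = TableQuery[AccountsTable]    // strict: evaluated while the
  //   object is constructed, when dbConfig can still be null -> NPE in driver
  lazy val accounts = TableQuery[AccountsTable]  // lazy: evaluated on first use

  class AccountsTable(tag: Tag) extends Table[Account](tag, "accounts") {
    def id = column[Int]("id", O.PrimaryKey, O.AutoInc)
    def companyName = column[String]("company_name")
    def * = (id, companyName) <> (Account.tupled, Account.unapply _)
  }
}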
After stack trace update
It seems that one of your lazy fields at ApiService.scala:80 gets initialized, probably from the constructor of WebsiteTable at Schema.scala:544, called from ApiService.scala:81. Please review these locations, or post the relevant code here if possible.

How to set an in memory test database for my Scala Play2 CRUD application?

I'm continuing my exploration of the Play framework and its related components. I used the template for a CRUD application with a connection to a PostgreSQL database to begin with. It splits the application into models, repositories, controllers and views. This works well.
Now, I'm trying to create some tests for this application with Specs2. More precisely, I'm trying to test the repository. It is defined as follows:
package dal

import javax.inject.{ Inject, Singleton }
import play.api.db.slick.DatabaseConfigProvider
import slick.driver.JdbcProfile
import models.Cat
import scala.concurrent.{ Future, ExecutionContext }

@Singleton
class CatRepository @Inject() (dbConfigProvider: DatabaseConfigProvider)(implicit ec: ExecutionContext) {
  ...
}
I would like to set up an in-memory database which would be created (schema, evolutions) before all tests, destroyed after all tests, and populated (data, probably with direct SQL) and flushed around each test. I would like to pass it to my repository instance, which I would then use to perform my tests. Like:
val repo = new CatRepository(inMem_DB)
So how do I go about:
1) creating this db and applying the evolutions?
maybe:
trait TestDB extends BeforeAfterAll {
  var database: Option[Database] = None

  def before = {
    database = Some(Databases.inMemory(name = "test_db"))
    database match {
      case Some(con) => Evolutions.applyEvolutions(con)
      case _ => None
    }
    println("DB READY")
  }

  def after = {
    database match {
      case Some(con) => con.shutdown()
      case _ => None
    }
  }
}
Using a var and always doing a match/case whenever I need to use the db isn't convenient. I guess there is a much better way to do this...
2) Populate and flush around each test?
Shall I create a trait extending Around, just as with BeforeAfterAll?
3) Create one of these play.api.db.slick.DatabaseConfigProvider from the database?
Any link showing how to do this?
I found a few examples which covered this by running a FakeApplication, but I assume there is a way to somehow pass the db to such a repository object outside of a running application..?
Thank you for helping.
Cheers!
You can use a lazy val and AfterAll for the database setup/teardown, then BeforeAfterEach for each example:
trait TestDB extends AfterAll with BeforeAfterEach {
  lazy val database: Database =
    Databases.inMemory(name = "test_db")

  def afterAll =
    database.shutdown

  def before =
    database.populate

  def after =
    database.clean
}
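Note that populate and clean above are placeholders for whatever setup/teardown SQL you run around each example (they are not methods on play.api.db.Database). A sketch of how the pieces could fit together, applying evolutions on first access and using plain SQL against a hypothetical cat table:

import org.specs2.specification.{AfterAll, BeforeAfterEach}
import play.api.db.{Database, Databases}
import play.api.db.evolutions.Evolutions

trait TestDB extends AfterAll with BeforeAfterEach {
  lazy val database: Database = {
    val db = Databases.inMemory(name = "test_db")
    Evolutions.applyEvolutions(db) // create the schema once, on first access
    db
  }

  def afterAll = database.shutdown()

  def before = database.withConnection { conn =>
    conn.createStatement.execute("insert into cat (name) values ('felix')")
  }

  def after = database.withConnection { conn =>
    conn.createStatement.execute("delete from cat")
  }
}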

How to mock an external call in an Akka Actor using ScalaTest

I am new to the entire ecosystem, including Scala, Akka, and ScalaTest.
I am working on a problem where my Actor makes a call to an external system.
case object LogProcessRequest

class LProcessor extends Actor {
  val log = Logging(context.system, this)

  def receive = {
    case LogProcessRequest =>
      log.debug("starting log processing")
      LogReaderDisruptor main(Array())
  }
}
LogReaderDisruptor is a Java class whose main(Array()) entry point does many other things.
The test I have currently looks like
class LProcessorSpec extends UnitTestSpec("testSystem") {
  "A mocked log processor" should {
    "be called" in {
      val logProcessorActor = system.actorOf(Props[LProcessor])
      logProcessorActor ! LogProcessRequest
    }
  }
}
where UnitTestSpec looks like (and inspired from here)
import akka.actor.ActorSystem
import akka.testkit.{ImplicitSender, TestKit}
import org.scalatest.matchers.MustMatchers
import org.scalatest.{BeforeAndAfterAll, WordSpecLike}

abstract class UnitTestSpec(name: String)
  extends TestKit(ActorSystem(name))
  with WordSpecLike
  with MustMatchers
  with BeforeAndAfterAll
  with ImplicitSender {

  override def afterAll() {
    system.shutdown()
  }
}
Question
How can I mock the call to LogReaderDisruptor main(Array()) and verify that it was called?
I am coming from Java/JUnit/Mockito land, and something like what I would have done here is:
doNothing().when(logReaderDisruptor).main(Matchers.<String>anyVararg())
verify(logReaderDisruptor, times(1)).main(Matchers.<String>anyVararg())
I am not sure how to translate that with ScalaTest here.
Also, this code may not be idiomatic, since I am very new and still learning.
There are a few ways to do this. The typical OO way is to wrap the log disrupter in an object and pass it in. I would set up a companion object to instantiate the actor, something like below. Then you can pass in an alternate implementation. You can also achieve a similar approach by using traits and mixing in an alternative logDisrupter only as needed.
object LProcessor {
  def props(logDisrupter: LogDisrupter) = Props(new LProcessor(logDisrupter))
}

class LProcessor(logDisrupter: LogDisrupter) extends Actor {
  val log = Logging(context.system, this)

  def receive = {
    case LogProcessRequest =>
      log.debug("starting log processing")
      logDisrupter.doSomething()
  }
}
Then instantiate it as:
val logProcessorActor = system.actorOf(LProcessor.props(logDisrupter))
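A minimal sketch of the wrapper and a hand-rolled test double (LogDisrupter and doSomething are hypothetical names introduced above; adapt them to your Java class):

trait LogDisrupter {
  def doSomething(): Unit
}

// Production implementation: delegates to the static Java entry point.
object RealLogDisrupter extends LogDisrupter {
  def doSomething(): Unit = LogReaderDisruptor.main(Array())
}

// Test double: counts invocations instead of touching the external system.
class RecordingLogDisrupter extends LogDisrupter {
  @volatile var calls = 0
  def doSomething(): Unit = calls += 1
}

// In the spec (message handling is asynchronous, so poll with awaitAssert):
val recorder = new RecordingLogDisrupter
val logProcessorActor = system.actorOf(LProcessor.props(recorder))
logProcessorActor ! LogProcessRequest
awaitAssert(recorder.calls must be(1))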

Where to put my database access methods when using a DAO with Slick 2.0?

(This question is based on a very similar previous request for help. With the introduction of a DAO and multiple database drivers, the same problem requires a different approach, and I hope warrants a new SO question.)
I have a class and Slick Table defined like this:
import play.api.db.slick.Profile

case class Foo(title: String, description: String, id: Int = 0)

trait FooComponent extends Profile { this: Profile =>
  import profile.simple._

  class FooTable(tag: Tag) extends Table[Foo](tag, "FOO") {
    def id = column[Int]("ID", O.PrimaryKey, O.AutoInc)
    def title = column[String]("TITLE", O.NotNull)
    def description = column[String]("DESCRIPTION")
    def * = (title, description, id) <> (Foo.tupled, Foo.unapply)
  }
}
And a data access object:
class DAO(override val profile: JdbcProfile) extends FooComponent with Profile {
  val foos = TableQuery[FooTable]
}

object current {
  val dao = new DAO(DB(play.api.Play.current).driver)
}
This is pretty awesome, because now I can add something like the following to my application.conf:
db.default.driver=org.h2.Driver
db.default.url="jdbc:h2:mem:play"
db.test.driver=org.postgresql.Driver
db.test.user="testuser"
db.test.password=""
db.test.url="jdbc:postgresql:testdb"
... and if I do the following in a Controller:
import models.current.dao._
import models.current.dao.profile.simple._
I have access to my foos TableQuery, and it automagically gets the driver and database url given for db.default in application.conf.
In a similar, but not-quite-as-nice way, I can do the following in my test Specification:
"test Foos" in new WithApplication() {
val dao = new DAO(play.api.db.slick.DB("test").driver)
import dao._ //import all our database Tables
import dao.profile.simple._ //import specific database methods
play.api.db.slick.DB("test").withSession { implicit s: Session =>
println(s.conn.getMetaData.getURL)
println(foos.list)
}
However, what if I want to define a method which can act on a TableQuery[Foo]? Something like this:
def findByTitle(title: String)(implicit s: Session) = foos.filter(_.title === title).list
Problem
What's the correct way of writing the findByTitle method, and where should I put it so that I can:
Call it in a way such that it won't collide with a method of the same name which acts on TableQuery[Bar]. Coming from OO, I feel like I want to do something like foos.findByTitle("someFoo"), but if there's a better way of doing this functional-style, I'm open to suggestions.
Call it from an application Controller such that the query will work with my db.default h2 driver, and from my test Specification so that it will work with my db.test postgres driver.
As an aside, if I can put this in my DAO:
object current {
  val dao = new DAO(DB(play.api.Play.current).driver)
}
and then import models.current.dao._ anywhere I want to use this DAO, how can I extend the same form to the following:
object test {
  val dao = new DAO(play.api.db.slick.DB("test").driver)
}
If I try to do this, the compiler complains about not having an implicit Application in scope.
I think you need to read up on implicit conversions and implicit parameters in Scala; there are online Scala books available.
When you get an error message about a missing implicit, it means one of two things. Either you ran into a failing type-check provided by a library, preventing you from doing something wrong, but that's not the case here. Or you simply forgot to make the implicit available. There are two ways to make an implicit available: import it into the scope where you get the error message, or defer the lookup to the call site of your method. I'm not sure which one is the right one for Play. You either need to import the implicit Application from Play, or you need to turn val dao into a method and request an implicit application in an implicit argument list: def dao(implicit app: Application) = .... You can alternatively turn test into a class and request it there.
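A minimal sketch of that deferred-lookup variant, keeping the DAO constructor from the question:

import play.api.Application

object test {
  // The implicit Application is resolved at the call site, so this compiles
  // everywhere but only works inside a running app (e.g. a WithApplication test).
  def dao(implicit app: Application): DAO =
    new DAO(play.api.db.slick.DB("test").driver)
}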
If you use the play-slick plugin, it needs a started Play application before you can call code that uses DB access from that plugin. You can make sure to start a Play app in your tests using WithApplication, as described in the docs: http://www.playframework.com/documentation/2.3.x/ScalaFunctionalTestingWithSpecs2