Can't create schema in Cassandra using phantom-dsl - Scala

Trying to create a schema in Cassandra using phantom-dsl for unit testing, following this tutorial:
http://outworkers.com/blog/post/phantom-tips-3-understanding-phantom-connectors
I ran into this issue when trying to auto-generate the schema:
[ERROR] /home/.../test/BaseCassandraSpec.scala:54: error: not enough arguments for method autocreate: (keySpace: com.websudos.phantom.connectors.KeySpace)
com.websudos.phantom.builder.query.CreateQuery.Default[com.neruti.db.models.ConcreteUserModel,com.neruti.User].
[ERROR] Unspecified value parameter keySpace.
[ERROR] Await.result(database.userModel.autocreate().future(),10.seconds)
Any advice?
Currently using version 1.29.6
BaseCassandraSpec
import com.neruti.User
import com.neruti.db.models._
import com.neruti.db.databases._
import com.neruti.db.services._
import com.neruti.db.Connector._
import org.scalatest._
import org.scalatest.{BeforeAndAfterAll,FlatSpec,Matchers,ShouldMatchers}
import org.scalatest.concurrent.ScalaFutures
import org.scalamock.scalatest.MockFactory
import scala.concurrent.duration._
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
override protected def beforeAll(): Unit = {
Await.result(database.userModel.autocreate().future(),10.seconds)
}
Database
class UserDatabase (val connector: KeySpaceDef){
object userModel extends ConcreteUserModel with connector.Connector
}
object ProductionDb extends UserDatabase(connector)
trait ProductionDatabaseProvider {
def database: UserDatabase
}
trait ProductionDatabase extends ProductionDatabaseProvider {
override val database = ProductionDb
}
object testDB extends UserDatabase(testConnector)
trait testDatabaseProvider {
def database: UserDatabase
}
trait testDatabase extends testDatabaseProvider{
override val database = testDB
}
Connector
package com.neruti.db
import com.neruti.db.models._
import com.websudos.phantom.database.Database
import com.websudos.phantom.connectors.ContactPoints
import com.websudos.phantom.dsl.KeySpaceDef
object Connector {
// TODO: these key-value pairs should come from a HOCON config file
val host = Seq("127.0.0.1")
val port = 9042
val keySpace: String = "nrt_entities"
// val inet = InetAddress.getByName
lazy val connector = ContactPoints(host,port).withClusterBuilder(
_.withCredentials("dev", "nrtDev1989")
).keySpace(keySpace)
// embedded cassandra is not supported anymore. Check phantom-sbt.
// lazy val testConnector: KeySpaceDef = ContactPoint.embedded.keySpace(keySpace)
lazy val testConnector: KeySpaceDef = ContactPoints(host,port).noHeartbeat().keySpace(keySpace)
}

As a side note, I would upgrade to phantom 2.0.0. Next, there are many things to improve in your code, starting with the capitalisation of the traits.
You should use database.create or database.createAsync, which no longer require you to re-pass the implicit keyspace or session. Bear in mind this API is from version 2.1.1 of phantom-dsl, available on Maven Central.
package com.neruti.db
import com.neruti.db.models._
import com.outworkers.phantom.connectors.ContactPoints
import com.outworkers.phantom.dsl._
object Connector {
// TODO: these key-value pairs should come from a HOCON config file
val host = Seq("127.0.0.1")
val port = 9042
val keySpace: String = "nrt_entities"
// val inet = InetAddress.getByName
lazy val connector = ContactPoints(host,port).withClusterBuilder(
_.withCredentials("dev", "nrtDev1989")
).keySpace(keySpace)
lazy val testConnector: KeySpaceDef = ContactPoints(host, port).noHeartbeat().keySpace(keySpace)
}
class MyDb(override val connector: CassandraConnection) extends Database(connector) {
... tables
}
object TestMyDb extends MyDb(Connector.testConnector)
import com.outworkers.phantom.dsl.context
// Now this will only require an execution context, nothing more
TestMyDb.create()
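The "only require an execution context" part is ordinary implicit resolution; here is a stripped-down, dependency-free plain-Scala analogue (the `createSchema` name is hypothetical, not a phantom API):

```scala
import scala.concurrent.{Await, ExecutionContext, Future}
import scala.concurrent.duration._

object CreateDemo extends App {
  // Analogue of create()/createAsync(): once an implicit
  // ExecutionContext is in scope, no further arguments are needed.
  def createSchema()(implicit ec: ExecutionContext): Future[String] =
    Future("schema created")

  // Bring an ExecutionContext into implicit scope, as phantom's
  // `import com.outworkers.phantom.dsl.context` would.
  import scala.concurrent.ExecutionContext.Implicits.global
  println(Await.result(createSchema(), 10.seconds))
}
```

Without the `global` import (or some other implicit `ExecutionContext`), the `createSchema()` call would not compile, which mirrors why phantom 2.x only asks for the context import.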

Related

Mongo codec for value classes with Macro

Is it possible to auto derive codecs for values classes in scala-mongo-driver?
Using the existing macros produces a StackOverflowError.
package test
import org.bson.codecs.configuration.CodecRegistries.{fromCodecs, fromProviders, fromRegistries}
import org.mongodb.scala.MongoClient.DEFAULT_CODEC_REGISTRY
import org.mongodb.scala.bson.codecs.Macros._
import org.mongodb.scala.{MongoClient, MongoCollection}
import scala.concurrent.duration._
import scala.concurrent.{Await, Future}
// Models
case class Name(value: String) extends AnyVal
case class Person(name: Name)
object TestValueClassCodecs extends App {
private[this] val codecRegistry =
fromRegistries(
fromProviders(
classOf[Person],
classOf[Name],
),
DEFAULT_CODEC_REGISTRY
)
protected val collection: MongoCollection[Person] =
MongoClient(s"mongodb://localhost:27017")
.getDatabase("TestDB")
.withCodecRegistry(codecRegistry)
.getCollection[Person]("test_repo_values_classes")
val res = Await.result(
collection.insertOne(Person(Name("Jesus"))).toFuture(),
10.seconds
)
}
Output:
Caused by: java.lang.StackOverflowError
at scala.collection.LinearSeqOptimized.length(LinearSeqOptimized.scala:54)
at scala.collection.LinearSeqOptimized.length$(LinearSeqOptimized.scala:51)
at scala.collection.immutable.List.length(List.scala:91)
at scala.collection.SeqLike.size(SeqLike.scala:108)
at scala.collection.SeqLike.size$(SeqLike.scala:108)
at scala.collection.AbstractSeq.size(Seq.scala:45)
at scala.collection.convert.Wrappers$IterableWrapperTrait.size(Wrappers.scala:25)
at scala.collection.convert.Wrappers$IterableWrapperTrait.size$(Wrappers.scala:25)
at scala.collection.convert.Wrappers$SeqWrapper.size(Wrappers.scala:66)
at java.util.AbstractCollection.toArray(AbstractCollection.java:136)
at java.util.ArrayList.<init>(ArrayList.java:178)
at org.bson.internal.ProvidersCodecRegistry.<init>(ProvidersCodecRegistry.java:34)
at org.bson.codecs.configuration.CodecRegistries.fromRegistries(CodecRegistries.java:126)
at org.mongodb.scala.bson.codecs.macrocodecs.MacroCodec.$init$(MacroCodec.scala:86)
at test.TestValueClassCodecs$$anon$1$$anon$3$NameMacroCodec$1.<init>(TestValueClassCodecs.scala:51)
at test.TestValueClassCodecs$$anon$1$$anon$3$NameMacroCodec$2$.apply(TestValueClassCodecs.scala:51)
at test.TestValueClassCodecs$$anon$1$$anon$3.get(TestValueClassCodecs.scala:51)
at org.bson.internal.ProvidersCodecRegistry.get(ProvidersCodecRegistry.java:45)
at org.bson.internal.ProvidersCodecRegistry.get(ProvidersCodecRegistry.java:45)
at org.bson.internal.ChildCodecRegistry.get(ChildCodecRegistry.java:58)
at org.bson.internal.ProvidersCodecRegistry.get(ProvidersCodecRegistry.java:45)
at org.bson.internal.ChildCodecRegistry.get(ChildCodecRegistry.java:58)
at org.bson.internal.ProvidersCodecRegistry.get(ProvidersCodecRegistry.java:45)
at org.bson.internal.ChildCodecRegistry.get(ChildCodecRegistry.java:58)
at org.bson.internal.ProvidersCodecRegistry.get(ProvidersCodecRegistry.java:45)
I am using version:
org.mongodb.scala:mongo-scala-bson_2.12:4.1.1
If Name is not a value class, everything works just fine.
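For context, a value class (a single-field case class extending AnyVal) is erased to its underlying type by the compiler wherever possible, which is why macro- and reflection-based codec machinery has to treat it differently from a regular case class. A minimal dependency-free illustration:

```scala
object ValueClassDemo extends App {
  // A value class: exactly one val, extends AnyVal. At runtime the
  // wrapper is usually erased and only the underlying String remains.
  case class Name(value: String) extends AnyVal
  case class Person(name: Name)

  val p = Person(Name("Jesus"))
  // The wrapper behaves like a normal case class at the source level.
  println(p.name.value)
}
```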

Can't find a codec for class java.lang.Object while using Mongo with Scala

I am using mongo with scala.
object myApp extends App{
case class myClass(field: String, value: Option[Any])
}
Above is my case class and below is the DB code
object DB{
import org.bson.codecs.configuration.CodecRegistries
import org.bson.codecs.configuration.CodecRegistries._
import org.mongodb.scala.MongoClient.DEFAULT_CODEC_REGISTRY
import org.mongodb.scala.bson.codecs.Macros._
import org.mongodb.scala.{MongoClient, MongoCollection, MongoDatabase}
private val codecs = fromProviders(classOf[myClass])
private val codecReg = fromRegistries(codecs,
DEFAULT_CODEC_REGISTRY)
private val dataB: MongoDatabase = MongoClient().getDatabase("database").withCodecRegistry(codecReg)
val myClass: MongoCollection[myClass] = dataB.getCollection("myClass")
}
Now, "value" can hold a String/Int/None value. If I create a myClass instance with value as None, i.e.
val c = myClass("abc", None)
and put it in the database, then it runs without any error. But if I set value to Some(string_value) or Some(int_value), then it shows the error.
val d = myClass("abc", Some("xyz"))
val f = myClass("abc", Some(90))
Error:
Can't find a codec for class java.lang.Object
Can somebody guide me on how to do this, as the "value" field can hold String/Int/None values?
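Independently of the driver, `Any` erases exactly the type information a codec needs, so a common workaround is to model the possible payloads as a sealed ADT, giving every case a concrete type. A plain-Scala sketch (type names hypothetical, no Mongo dependency):

```scala
object AdtDemo extends App {
  // Each possible payload gets its own concrete case, so a codec
  // can be derived per subtype instead of for java.lang.Object.
  sealed trait FieldValue
  final case class StringValue(s: String) extends FieldValue
  final case class IntValue(i: Int) extends FieldValue

  final case class MyClass(field: String, value: Option[FieldValue])

  val d = MyClass("abc", Some(StringValue("xyz")))
  val f = MyClass("abc", Some(IntValue(90)))
  val c = MyClass("abc", None)

  // Count the defined values; the ADT makes each one pattern-matchable.
  println(Seq(d, f, c).flatMap(_.value).size)
}
```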

Creating functional tests Scala Playframework 2.6 Macwire

I wrote some traits to use as a base for my functional tests.
This file creates an in-memory DB (H2 + Evolutions):
BlogApiDBTest.scala
package functional.common
import play.api.db.Databases
import play.api.db.evolutions.Evolutions
trait BlogApiDBTest {
implicit val testDatabase = Databases.inMemory(
name = "blog_db",
urlOptions = Map(
"MODE" -> "MYSQL"
),
config = Map(
"logStatements" -> true
)
)
org.h2.engine.Mode.getInstance("MYSQL").convertInsertNullToZero = false
Evolutions.applyEvolutions(testDatabase)
}
Here I am overriding some injected components for testing purposes
BlogApiComponentsTest.scala
package functional.common
import common.BlogApiComponents
import org.scalatestplus.play.components.WithApplicationComponents
import play.api.{BuiltInComponents, Configuration}
trait BlogApiComponentsTest extends WithApplicationComponents with BlogApiDBTest {
override def components: BuiltInComponents = new BlogApiComponents(context) {
override lazy val configuration: Configuration = context.initialConfiguration
override lazy val blogDatabase = testDatabase
}
}
This is the base class for my functional tests
BlogApiOneServerPerTestWithComponents.scala
package functional.common
import org.scalatestplus.play.PlaySpec
import org.scalatestplus.play.components.{OneServerPerTestWithComponents}
trait BlogApiOneServerPerTestWithComponents extends PlaySpec with OneServerPerTestWithComponents with BlogApiComponentsTest {
}
Finally the test I am trying to execute
PostControllerSpec.scala
package functional.controllers
import functional.common.BlogApiOneServerPerTestWithComponents
import org.scalatest.concurrent.{IntegrationPatience, ScalaFutures}
import play.api.mvc.{Results}
import play.api.test.{FakeRequest, Helpers}
import play.api.test.Helpers.{GET, route}
class PostControllerSpec extends BlogApiOneServerPerTestWithComponents
with Results
with ScalaFutures
with IntegrationPatience {
"Server query should" should {
"provide an Application" in {
val Some(result) = route(app, FakeRequest(GET, "/posts"))
Helpers.contentAsString(result) must be("success!")
}
}
}
Then I get
blog-api/test/functional/controllers/PostControllerSpec.scala:18:31: Cannot write an instance of play.api.mvc.AnyContentAsEmpty.type to HTTP response. Try to define a Writeable[play.api.mvc.AnyContentAsEmpty.type]
Here is the code
Adding the following import should make it work:
import play.api.test.Helpers._
Looking at the signature of route:
def route[T](app: Application, req: Request[T])(implicit w: Writeable[T]): Option[Future[Result]]
we see it expects an implicit w: Writeable[T]. The import above provides it via Writeables.
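The mechanism behind that fix is plain implicit resolution: a wildcard import brings the needed instance into scope at the call site. A driver-free analogue (the `Writeable`/`route` definitions here are hypothetical stand-ins, not Play's):

```scala
object ImplicitImportDemo extends App {
  trait Writeable[T] { def write(t: T): String }

  // Analogue of play.api.test.Writeables: the instances live in an
  // object and only become eligible once they are imported.
  object Writeables {
    implicit val stringWriteable: Writeable[String] =
      (t: String) => s"wrote: $t"
  }

  // Analogue of route: compiles only with a Writeable[T] in scope.
  def route[T](req: T)(implicit w: Writeable[T]): String = w.write(req)

  import Writeables._ // remove this import and route("ping") won't compile
  println(route("ping"))
}
```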

could not find implicit value for parameter env: com.mohiva.play.silhouette.api.Environment[utils.auth.DefaultEnv]

I'm using a Silhouette v4.0 library with play framework 2.5.
And I have been trying to write test code using play specs2.
But I get the following error with my test class, shown below.
Error Message
[error] could not find implicit value for parameter env: com.mohiva.play.silhouette.api.Environment[utils.auth.DefaultEnv]
.withAuthenticator[DefaultEnv](identity.loginInfo)
^
Here's the test class
package controllers
import com.google.inject.AbstractModule
import org.joda.time.DateTime
import org.specs2.specification.Scope
import org.specs2.matcher._
import org.specs2.mock._
import play.api.test._
import play.api.libs.json._
import play.api.libs.json.Json
import play.api.libs.json.Reads._
import play.api.libs.functional.syntax._
import play.api.libs.concurrent.Execution.Implicits._
import play.api.libs.mailer.{ MailerClient, Email }
import play.api.inject.guice.GuiceApplicationBuilder
import play.api.inject.bind
import com.mohiva.play.silhouette.test._
import com.mohiva.play.silhouette.api._
import com.mohiva.play.silhouette.api.repositories.AuthInfoRepository
import com.mohiva.play.silhouette.api.util._
import com.mohiva.play.silhouette.impl.providers._
import net.codingwell.scalaguice.ScalaModule
import utils.auth.DefaultEnv
class TestControllerSpec extends PlaySpecification with Mockito {
"case" in new Context {
new WithApplication(application) {
val request = FakeRequest(POST, "/api/test")
.withAuthenticator[DefaultEnv](identity.loginInfo) // <-
val result = route(app, request).get
status(result) must be equalTo OK
}
}
trait Context extends Scope {
val identity = User(
loginInfo = LoginInfo(..)
..
)
implicit val env = FakeEnvironment[DefaultEnv](Seq(identity.loginInfo -> identity))
class FakeModule extends AbstractModule with ScalaModule {
def configure() = {
bind[Environment[DefaultEnv]].toInstance(env)
}
}
lazy val application = new GuiceApplicationBuilder()
.overrides(new FakeModule)
.build
}
}
There are some other test classes, similar to this one, that compile and run properly.
It seems to be some kind of implicit scoping problem.
I tried to use all the same imports as another test class that compiles properly, but this one still fails to compile.
Am I missing some import?
As the compiler states, you're missing an implicit value. Use the following, which is modeled after one of Silhouette's specs:
class TestControllerSpec extends PlaySpecification with Mockito {
"the POST request" should {
"return an OK response" in new Context {
new WithApplication(application) {
val identity = User(LoginInfo(...))
implicit val env = FakeEnvironment[DefaultEnv](Seq(identity.loginInfo -> identity))
val request = FakeRequest(POST, "/api/test")
.withAuthenticator(identity.loginInfo)
val result = route(app, request).get
status(result) must be equalTo OK
}
}
}
trait Context extends Scope {
...
}
}

reactivemongo, could not find implicit value for parameter reader

I'm doing tests with reactivemongo
In my controller I have this:
package controllers
import models._
import models.JsonFormats._
import play.modules.reactivemongo.MongoController
import scala.concurrent.Future
import reactivemongo.api.Cursor
import org.slf4j.{LoggerFactory, Logger}
import javax.inject.Singleton
import play.api.mvc._
import reactivemongo.api.collections.default.BSONCollection
import reactivemongo.bson._
@Singleton
class Users extends Controller with MongoController {
private final val logger: Logger = LoggerFactory.getLogger(classOf[Users])
val collection = db[BSONCollection]("users")
// list all articles and sort them
def list = Action.async { implicit request =>
// get a sort document (see getSort method for more information)
val sort = getSort(request)
// build a selection document with an empty query and a sort subdocument ('$orderby')
val query = BSONDocument(
"$orderby" -> sort,
"$query" -> BSONDocument())
val activeSort = request.queryString.get("sort").flatMap(_.headOption).getOrElse("none")
// the cursor of documents
val found = collection.find(query).cursor[User]
// build (asynchronously) a list containing all the articles
found.collect[List]().map { users =>
Ok(views.html.admin.list(users, activeSort))
}.recover {
case e =>
e.printStackTrace()
BadRequest(e.getMessage())
}
}
...........
}
and in my model i have this:
package models
import reactivemongo.bson._
case class User(
nickName: String,
email: String,
password: String,
active: Boolean
)
object JsonFormats {
import play.api.libs.json.Json
// Generates Writes and Reads for Feed and User thanks to Json Macros
implicit val userFormat = Json.format[User]
}
When I compile the project, it returns the following error:
could not find implicit value for parameter reader: reactivemongo.bson.BSONDocumentReader[models.User]
The problem is in this line:
val found = collection.find(query).cursor[User]
Can anyone tell me where I'm wrong or what I'm missing please?
You have no implicit handler defined to map your model class to a BSONDocument. You can implement it yourself, or, just like you did for the JsonFormats, you could use the macros provided by ReactiveMongo.
object BsonFormats {
import reactivemongo.bson.Macros
implicit val userFormat = Macros.handler[User]
}
Alternatively, instead of the BSONCollection, you could use the JSONCollection provided by Play-ReactiveMongo to perform your mapping using the JSON format that you have already defined.
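What `Macros.handler` generates is just a typeclass instance; the compiler error means no such instance was in implicit scope. A stripped-down analogue of the reader side in plain Scala (no ReactiveMongo dependency; `Doc`, `DocReader`, and `readAs` are hypothetical stand-ins for BSONDocument, BSONDocumentReader, and cursor[T]):

```scala
object ReaderDemo extends App {
  // Stand-in for BSONDocument: a flat map of field names to strings.
  type Doc = Map[String, String]

  // Stand-in for BSONDocumentReader[T]: how to build a T from a Doc.
  trait DocReader[T] { def read(doc: Doc): T }

  case class User(nickName: String, email: String)

  // What the macro would derive for you, written by hand here.
  implicit val userReader: DocReader[User] =
    (doc: Doc) => User(doc("nickName"), doc("email"))

  // cursor[T]-style method: won't compile without a reader in scope.
  def readAs[T](doc: Doc)(implicit r: DocReader[T]): T = r.read(doc)

  val u = readAs[User](Map("nickName" -> "bob", "email" -> "b@x.com"))
  println(u.nickName)
}
```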
For me, I still got the error even after declaring the implicits for both the BSON and JSON formats. What I needed to do was add this import:
import reactivemongo.api.commands.bson.BSONCountCommandImplicits._