Mongo codec for value classes with Macro - mongodb

Is it possible to auto-derive codecs for value classes in the Scala MongoDB driver?
Using the existing macros produces a StackOverflowError:
package test

import org.bson.codecs.configuration.CodecRegistries.{fromCodecs, fromProviders, fromRegistries}
import org.mongodb.scala.MongoClient.DEFAULT_CODEC_REGISTRY
import org.mongodb.scala.bson.codecs.Macros._
import org.mongodb.scala.{MongoClient, MongoCollection}

import scala.concurrent.duration._
import scala.concurrent.{Await, Future}

// Models
case class Name(value: String) extends AnyVal
case class Person(name: Name)

object TestValueClassCodecs extends App {

  private[this] val codecRegistry =
    fromRegistries(
      fromProviders(
        classOf[Person],
        classOf[Name]
      ),
      DEFAULT_CODEC_REGISTRY
    )

  protected val collection: MongoCollection[Person] =
    MongoClient(s"mongodb://localhost:27017")
      .getDatabase("TestDB")
      .withCodecRegistry(codecRegistry)
      .getCollection[Person]("test_repo_values_classes")

  val res = Await.result(
    collection.insertOne(Person(Name("Jesus"))).toFuture(),
    10.seconds
  )
}
Output:
Caused by: java.lang.StackOverflowError
at scala.collection.LinearSeqOptimized.length(LinearSeqOptimized.scala:54)
at scala.collection.LinearSeqOptimized.length$(LinearSeqOptimized.scala:51)
at scala.collection.immutable.List.length(List.scala:91)
at scala.collection.SeqLike.size(SeqLike.scala:108)
at scala.collection.SeqLike.size$(SeqLike.scala:108)
at scala.collection.AbstractSeq.size(Seq.scala:45)
at scala.collection.convert.Wrappers$IterableWrapperTrait.size(Wrappers.scala:25)
at scala.collection.convert.Wrappers$IterableWrapperTrait.size$(Wrappers.scala:25)
at scala.collection.convert.Wrappers$SeqWrapper.size(Wrappers.scala:66)
at java.util.AbstractCollection.toArray(AbstractCollection.java:136)
at java.util.ArrayList.<init>(ArrayList.java:178)
at org.bson.internal.ProvidersCodecRegistry.<init>(ProvidersCodecRegistry.java:34)
at org.bson.codecs.configuration.CodecRegistries.fromRegistries(CodecRegistries.java:126)
at org.mongodb.scala.bson.codecs.macrocodecs.MacroCodec.$init$(MacroCodec.scala:86)
at test.TestValueClassCodecs$$anon$1$$anon$3$NameMacroCodec$1.<init>(TestValueClassCodecs.scala:51)
at test.TestValueClassCodecs$$anon$1$$anon$3$NameMacroCodec$2$.apply(TestValueClassCodecs.scala:51)
at test.TestValueClassCodecs$$anon$1$$anon$3.get(TestValueClassCodecs.scala:51)
at org.bson.internal.ProvidersCodecRegistry.get(ProvidersCodecRegistry.java:45)
at org.bson.internal.ProvidersCodecRegistry.get(ProvidersCodecRegistry.java:45)
at org.bson.internal.ChildCodecRegistry.get(ChildCodecRegistry.java:58)
at org.bson.internal.ProvidersCodecRegistry.get(ProvidersCodecRegistry.java:45)
at org.bson.internal.ChildCodecRegistry.get(ChildCodecRegistry.java:58)
at org.bson.internal.ProvidersCodecRegistry.get(ProvidersCodecRegistry.java:45)
at org.bson.internal.ChildCodecRegistry.get(ChildCodecRegistry.java:58)
at org.bson.internal.ProvidersCodecRegistry.get(ProvidersCodecRegistry.java:45)
I am using version:
org.mongodb.scala:mongo-scala-bson_2.12:4.1.1
If Name is not a value class, everything works just fine.
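A possible workaround, shown here only as a sketch (it is not from the original post): write the codec for the value class by hand and register it with fromCodecs, so the macro never has to derive a codec for the AnyVal wrapper. The NameCodec object and the decision to store Name as a plain BSON string are assumptions.

import org.bson.codecs.{Codec, DecoderContext, EncoderContext}
import org.bson.{BsonReader, BsonWriter}

// Hand-written codec for the value class (assumption: persist it as a plain string).
object NameCodec extends Codec[Name] {
  override def encode(writer: BsonWriter, value: Name, ctx: EncoderContext): Unit =
    writer.writeString(value.value)

  override def decode(reader: BsonReader, ctx: DecoderContext): Name =
    Name(reader.readString())

  override def getEncoderClass: Class[Name] = classOf[Name]
}

// Registry sketch: only Person goes through the macro; Name uses the manual codec.
private[this] val codecRegistry =
  fromRegistries(
    fromCodecs(NameCodec),
    fromProviders(classOf[Person]),
    DEFAULT_CODEC_REGISTRY
  )

Whether the macro-generated Person codec defers to the registry for the nested Name field depends on the driver version, so treat this as something to try rather than a confirmed fix.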

Related

Can't find a codec for class java.lang.Object while using mongo with scala

I am using MongoDB with Scala.
object myApp extends App {
  case class myClass(field: String, value: Option[Any])
}
Above is my case class, and below is the DB code:
object DB {
  import org.bson.codecs.configuration.CodecRegistries
  import org.bson.codecs.configuration.CodecRegistries._
  import org.mongodb.scala.MongoClient.DEFAULT_CODEC_REGISTRY
  import org.mongodb.scala.bson.codecs.Macros._
  import org.mongodb.scala.{MongoClient, MongoCollection, MongoDatabase}

  private val codecs = fromProviders(classOf[myClass])
  private val codecReg = fromRegistries(codecs, DEFAULT_CODEC_REGISTRY)
  private val dataB: MongoDatabase = MongoClient().getDatabase("database").withCodecRegistry(codecReg)

  val myClass: MongoCollection[Rule] = dataB.getCollection("myClass")
}
Now, value can be a String, an Int, or None. If I create a myClass with value = None, i.e.

val c = myClass("abc", None)

and put it in the database, it runs without any error. But if I set value to Some(String_value) or Some(int_value), it shows the error:

val d = myClass("abc", Some("xyz"))
val f = myClass("abc", Some(90))

error:

Can't find a codec for class java.lang.Object

Can somebody guide me on how to do this, given that the value field can hold String/Int/None values?
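A hedged sketch of one way around this (not from the question; the names FieldValue, StringValue, IntValue and the capitalised MyClass are made up): avoid Any entirely by modelling the payload as a sealed hierarchy, which recent versions of the driver macros can derive codecs for.

sealed trait FieldValue
case class StringValue(s: String) extends FieldValue
case class IntValue(i: Int) extends FieldValue

case class MyClass(field: String, value: Option[FieldValue])

// Registry sketch, mirroring the DB object above:
private val codecs = fromProviders(classOf[FieldValue], classOf[MyClass])
private val codecReg = fromRegistries(codecs, DEFAULT_CODEC_REGISTRY)

// Usage:
// MyClass("abc", Some(StringValue("xyz")))
// MyClass("abc", Some(IntValue(90)))
// MyClass("abc", None)

This only works if your driver version's macros support sealed traits; otherwise a hand-written Codec for the field type is the fallback.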

Scala Reflection exception during creation of DataSet in Spark

I want to run a Spark job on Spark Jobserver.
During execution, I get an exception.
Stack trace:
java.lang.RuntimeException: scala.ScalaReflectionException: class com.some.example.instrument.data.SQLMapping in JavaMirror with org.apache.spark.util.MutableURLClassLoader#55b699ef of type class org.apache.spark.util.MutableURLClassLoader with classpath [file:/app/spark-job-server.jar] and parent being sun.misc.Launcher$AppClassLoader#2e817b38 of type class sun.misc.Launcher$AppClassLoader with classpath [.../classpath jars/] not found.
  at scala.reflect.internal.Mirrors$RootsBase.staticClass(Mirrors.scala:123)
  at scala.reflect.internal.Mirrors$RootsBase.staticClass(Mirrors.scala:22)
  at com.some.example.instrument.DataRetriever$$anonfun$combineMappings$1$$typecreator15$1.apply(DataRetriever.scala:136)
  at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe$lzycompute(TypeTags.scala:232)
  at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe(TypeTags.scala:232)
  at org.apache.spark.sql.catalyst.encoders.ExpressionEncoder$.apply(ExpressionEncoder.scala:49)
  at org.apache.spark.sql.Encoders$.product(Encoders.scala:275)
  at org.apache.spark.sql.LowPrioritySQLImplicits$class.newProductEncoder(SQLImplicits.scala:233)
  at org.apache.spark.sql.SQLImplicits.newProductEncoder(SQLImplicits.scala:33)
  at com.some.example.instrument.DataRetriever$$anonfun$combineMappings$1.apply(DataRetriever.scala:136)
  at com.some.example.instrument.DataRetriever$$anonfun$combineMappings$1.apply(DataRetriever.scala:135)
  at scala.util.Success$$anonfun$map$1.apply(Try.scala:237)
  at scala.util.Try$.apply(Try.scala:192)
  at scala.util.Success.map(Try.scala:237)
  at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:237)
  at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:237)
  at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
  at scala.concurrent.impl.ExecutionContextImpl$AdaptedForkJoinTask.exec(ExecutionContextImpl.scala:121)
  at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
  at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
  at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
  at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
In DataRetriever I convert a simple case class to a Dataset.
Case class definitions:
case class SQLMapping(id: String,
                      it: InstrumentPrivateKey,
                      cc: Option[String],
                      ri: Option[SourceInstrumentId],
                      p: Option[SourceInstrumentId],
                      m: Option[SourceInstrumentId])

case class SourceInstrumentId(instrumentId: Long,
                              providerId: String)

case class InstrumentPrivateKey(instrumentId: Long,
                                providerId: String,
                                clientId: String)
Code that causes the problem:
import session.implicits._

def someFunc(future: Future[ID]): Dataset[SQLMapping] = {
  future.map { f =>
    val seq: Seq[SQLMapping] = getFromEndpoint(f)
    val ds: Dataset[SQLMapping] = seq.toDS()
    ...
  }
}
The job sometimes works, but if I re-run it, it throws the exception.

Update 28.03.2018

I forgot to mention one detail that turns out to be important: the Dataset was constructed inside a Future. Calling toDS() inside the future causes the ScalaReflectionException, so I decided to construct the Dataset outside of future.map. You can verify that a Dataset can't be constructed in future.map with this example job:
package com.example.sparkapplications

import com.typesafe.config.Config
import org.apache.spark.SparkContext
import org.apache.spark.sql.SparkSession
import scala.concurrent.Await
import scala.concurrent.Future
import scala.concurrent.duration._
import scala.concurrent.ExecutionContext.Implicits.global
import spark.jobserver.SparkJob
import spark.jobserver.SparkJobValid
import spark.jobserver.SparkJobValidation

object FutureJob extends SparkJob {

  override def runJob(sc: SparkContext,
                      jobConfig: Config): Any = {
    val session = SparkSession.builder().config(sc.getConf).getOrCreate()
    import session.implicits._
    val f = Future {
      val seq = Seq(
        Dummy("1", 1),
        Dummy("2", 2),
        Dummy("3", 3),
        Dummy("4", 4),
        Dummy("5", 5)
      )
      val ds = seq.toDS
      ds.collect()
    }
    Await.result(f, 10.seconds)
  }

  case class Dummy(id: String, value: Long)

  override def validate(sc: SparkContext,
                        config: Config): SparkJobValidation = SparkJobValid
}
Later I will report whether the problem persists with Spark 2.3.0 and when the jar is passed directly via spark-submit.
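For reference, a hedged sketch of the workaround the update describes: materialise the Encoder eagerly on the calling thread and pass it explicitly, so no TypeTag/classloader lookup happens on the future's pool thread. ID and getFromEndpoint are placeholders taken from the question; the signature here is an assumption.

import org.apache.spark.sql.{Dataset, Encoder, SparkSession}
import scala.concurrent.{ExecutionContext, Future}

def someFunc(future: Future[ID])(implicit session: SparkSession,
                                 ec: ExecutionContext): Future[Dataset[SQLMapping]] = {
  import session.implicits._
  // Resolve the encoder outside the Future, on the caller's classloader.
  val enc: Encoder[SQLMapping] = implicitly[Encoder[SQLMapping]]
  future.map { f =>
    val seq: Seq[SQLMapping] = getFromEndpoint(f)
    // Build the Dataset with the pre-built encoder; no reflection on the pool thread.
    session.createDataset(seq)(enc)
  }
}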

could not find implicit value for parameter env: com.mohiva.play.silhouette.api.Environment[utils.auth.DefaultEnv]

I'm using the Silhouette 4.0 library with Play Framework 2.5 and have been trying to write test code using Play specs2, but I get the following error with the test class below.
Error Message
[error] could not find implicit value for parameter env: com.mohiva.play.silhouette.api.Environment[utils.auth.DefaultEnv]
.withAuthenticator[DefaultEnv](identity.loginInfo)
^
Here's the test class
package controllers

import com.google.inject.AbstractModule
import org.joda.time.DateTime
import org.specs2.specification.Scope
import org.specs2.matcher._
import org.specs2.mock._
import play.api.test._
import play.api.libs.json._
import play.api.libs.json.Json
import play.api.libs.json.Reads._
import play.api.libs.functional.syntax._
import play.api.libs.concurrent.Execution.Implicits._
import play.api.libs.mailer.{ MailerClient, Email }
import play.api.inject.guice.GuiceApplicationBuilder
import play.api.inject.bind
import com.mohiva.play.silhouette.test._
import com.mohiva.play.silhouette.api._
import com.mohiva.play.silhouette.api.repositories.AuthInfoRepository
import com.mohiva.play.silhouette.api.util._
import com.mohiva.play.silhouette.impl.providers._
import net.codingwell.scalaguice.ScalaModule
import utils.auth.DefaultEnv

class TestControllerSpec extends PlaySpecification with Mockito {

  "case" in new Context {
    new WithApplication(application) {
      val request = FakeRequest(POST, "/api/test")
        .withAuthenticator[DefaultEnv](identity.loginInfo) // <-
      val result = route(app, request).get

      status(result) must be equalTo OK
    }
  }

  trait Context extends Scope {
    val identity = User(
      loginInfo = LoginInfo(..)
      ..
    )
    implicit val env = FakeEnvironment[DefaultEnv](Seq(identity.loginInfo -> identity))

    class FakeModule extends AbstractModule with ScalaModule {
      def configure() = {
        bind[Environment[DefaultEnv]].toInstance(env)
      }
    }

    lazy val application = new GuiceApplicationBuilder()
      .overrides(new FakeModule)
      .build
  }
}
There are other test classes, similar to this one, that compile and run properly, so it looks like some kind of implicit scoping problem. I tried importing exactly the same things as another test class that compiles fine, but this one still doesn't compile. Am I missing an import?
As the compiler states, you're missing an implicit value. Use the following, which is modeled after one of Silhouette's specs:
class TestControllerSpec extends PlaySpecification with Mockito {

  "the POST request" should {
    "return an OK response" in new Context {
      new WithApplication(application) {
        val identity = User(LoginInfo(...))
        implicit val env = FakeEnvironment[DefaultEnv](Seq(identity.loginInfo -> identity))
        val request = FakeRequest(POST, "/api/test")
          .withAuthenticator(identity.loginInfo)
        val result = route(app, request).get

        status(result) must be equalTo OK
      }
    }
  }

  trait Context extends Scope {
    ...
  }
}

Can't make schema into cassandra using phantom-dsl

I'm trying to create a schema in Cassandra using phantom-dsl for unit testing, following this tutorial:
http://outworkers.com/blog/post/phantom-tips-3-understanding-phantom-connectors
I ran into this issue when trying to auto-generate the schema:
[ERROR] /home/.../test/BaseCassandraSpec.scala:54: error: not enough arguments for method autocreate: (keySpace: com.websudos.phantom.connectors.KeySpace)com.websudos.phantom.builder.query.CreateQuery.Default[com.neruti.db.models.ConcreteUserModel,com.neruti.User].
[ERROR] Unspecified value parameter keySpace.
[ERROR] Await.result(database.userModel.autocreate().future(),10.seconds)
Any advice? I'm currently using version 1.29.6.
BaseCassandraSpec:
import com.neruti.User
import com.neruti.db.models._
import com.neruti.db.databases._
import com.neruti.db.services._
import com.neruti.db.Connector._

import org.scalatest._
import org.scalatest.{BeforeAndAfterAll, FlatSpec, Matchers, ShouldMatchers}
import org.scalatest.concurrent.ScalaFutures
import org.scalamock.scalatest.MockFactory

import scala.concurrent.duration._
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global

override protected def beforeAll(): Unit = {
  Await.result(database.userModel.autocreate().future(), 10.seconds)
}
Database
class UserDatabase(val connector: KeySpaceDef) {
  object userModel extends ConcreteUserModel with connector.Connector
}

object ProductionDb extends UserDatabase(connector)

trait ProductionDatabaseProvider {
  def database: UserDatabase
}

trait ProductionDatabase extends ProductionDatabaseProvider {
  override val database = ProductionDb
}

object testDB extends UserDatabase(testConnector)

trait testDatabaseProvider {
  def database: UserDatabase
}

trait testDatabase extends testDatabaseProvider {
  override val database = testDB
}
Connector
package com.neruti.db

import com.neruti.db.models._
import com.websudos.phantom.database.Database
import com.websudos.phantom.connectors.ContactPoints
import com.websudos.phantom.dsl.KeySpaceDef

object Connector {
  // TODO: these key value pairs shld get from HOCON config file
  val host = Seq("127.0.0.1")
  val port = 9042
  val keySpace: String = "nrt_entities"
  // val inet = InetAddress.getByName

  lazy val connector = ContactPoints(host, port).withClusterBuilder(
    _.withCredentials("dev", "nrtDev1989")
  ).keySpace(keySpace)

  // embedded cassandra is not supported anymore. Check phantom-sbt.
  // lazy val testConnector: KeySpaceDef = ContactPoint.embedded.keySpace(keySpace)
  lazy val testConnector: KeySpaceDef = ContactPoints(host, port).noHeartbeat().keySpace(keySpace)
}
As a side note, I would upgrade to phantom 2.0.0. There are also many things to improve in your code, starting with the capitalisation of the traits.
You should use database.create or database.createAsync, which no longer require you to re-pass the implicit keyspace or session. Bear in mind this API is in version 2.1.1 of phantom-dsl, available on Maven Central.
package com.neruti.db

import com.neruti.db.models._
import com.outworkers.phantom.connectors.ContactPoints
import com.outworkers.phantom.dsl._

object Connector {
  // TODO: these key value pairs shld get from HOCON config file
  val host = Seq("127.0.0.1")
  val port = 9042
  val keySpace: String = "nrt_entities"
  // val inet = InetAddress.getByName

  lazy val connector = ContactPoints(host, port).withClusterBuilder(
    _.withCredentials("dev", "nrtDev1989")
  ).keySpace(keySpace)

  lazy val testConnector: KeySpaceDef = ContactPoints(host, port).noHeartbeat().keySpace(keySpace)
}

class MyDb(override val connector: CassandraConnection) extends Database(connector) {
  ... tables
}

object TestMyDb extends MyDb(Connector.testConnector)

import com.outworkers.phantom.dsl.context

// Now this will only require an execution context, nothing more
TestMyDb.create()
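A hedged sketch of wiring this back into the original beforeAll, assuming the phantom 2.x Database API where createAsync() only needs an execution context in scope:

import com.outworkers.phantom.dsl.context
import scala.concurrent.Await
import scala.concurrent.duration._

override protected def beforeAll(): Unit = {
  // Creates all tables registered in TestMyDb; no implicit keyspace/session to re-pass.
  Await.result(TestMyDb.createAsync(), 10.seconds)
}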

Error testing DAL in slick scala

I'm pretty new to Scala and Play, and I have been assigned the task of testing someone else's app, which is running fine by the way. Please check whether my tests are right and what the error is.
This is the EmployeeEntry.scala file in models:
package models

import models.database.Employee
import play.api.db.slick.Config.driver.simple._
import play.api.db.slick._
import play.api.Play.current

case class EmployeeEntry(eid: Int, ename: String, eadd: String, emob: String)

object Employee {
  val DBemp = TableQuery[Employee]

  def savedat(value: EmployeeEntry): Long = {
    DB.withSession { implicit session =>
      DBemp += EmployeeEntry(eid = value.eid, ename = value.ename, eadd = value.eadd, emob = value.emob)
    }
  }

  /*val query = for (c <- Employee) yield c.ename
  val result = DB.withSession {
    implicit session =>
      query.list // <- takes session implicitly
  }*/
  //val query = for (c <- Employee) yield c.ename

  def getPersonList: List[EmployeeEntry] = DB.withSession { implicit session => DBemp.list }

  def Update: Int = DB.withSession { implicit session =>
    (DBemp filter (_.eid === 1) map (s => (s.ename, s.eadd))) update ("test", "khair")
  }

  def delet: Int = DB.withSession { implicit session =>
    (DBemp filter (_.eid === 1)).delete
  }
}
And this is the Employee.scala file in models/database:
package models.database

import models._
import models.EmployeeEntry
import play.api.db.slick.Config.driver.simple._
import scala.slick.lifted._

class Employee(tag: Tag) extends Table[EmployeeEntry](tag, "employee") {
  //val a = "hello"
  def eid = column[Int]("eid", O.PrimaryKey)
  def ename = column[String]("name", O.DBType("VARCHAR(50)"))
  def emob = column[String]("emob", O.DBType("VARCHAR(10)"))
  def eadd = column[String]("eadd", O.DBType("VARCHAR(100)"))

  def * = (eid, ename, emob, eadd) <> (EmployeeEntry.tupled, EmployeeEntry.unapply)
}
Finally, this is the test I am running, which is failing:
import play.api.db.slick.Config.driver.simple._
import play.api.db.slick._
import play.api.Play.current
import org.scalatest.FunSpec
import org.scalatest.matchers.ShouldMatchers
import models.database.Employee
import scala.slick.lifted._
import models._
import models.EmployeeEntry
//import scala.slick.driver.H2Driver.simple._

class databasetest extends FunSpec with ShouldMatchers {

  describe("this is to check database layer") {
    it("can save a row") {
      val a = EmployeeEntry(1006, "udit", "schd", "90909090")
      Employee.savedat(a) should be (1)
    }

    it("getpersonlist") {
      Employee.getPersonList.size should be (1)
    }
  }
}
The test fails and the error is:
java.lang.RuntimeException: There is no started application
at scala.sys.package$.error(package.scala:27)
at play.api.Play$$anonfun$current$1.apply(Play.scala:71)
at play.api.Play$$anonfun$current$1.apply(Play.scala:71)
at scala.Option.getOrElse(Option.scala:120)
at play.api.Play$.current(Play.scala:71)
at models.Employee$.getPersonList(EmployeeEntry.scala:27)
at databasetest$$anonfun$1$$anonfun$apply$mcV$sp$2.apply$mcV$sp(databasetest.scala:39)
at databasetest$$anonfun$1$$anonfun$apply$mcV$sp$2.apply(databasetest.scala:39)
at databasetest$$anonfun$1$$anonfun$apply$mcV$sp$2.apply(databasetest.scala:39)
at org.scalatest.Transformer$$anonfun$apply$1.apply$mcV$sp(Transformer.scala:22)
By default Play provides the specs2 testing framework, so there is no need to add the ScalaTest framework for unit testing. Testing the database access layer requires a connection to the database, so start a fake application. (This is not running code, just an idea of how to write the unit tests.)
For more detail take a look at the Play docs: https://www.playframework.com/documentation/2.3.x/ScalaFunctionalTestingWithSpecs2
import org.specs2.mutable.Specification
import models.database.Employee
import models._
import models.EmployeeEntry
import play.api.test.FakeApplication
import play.api.test.Helpers.running
import play.api.Play.current

class databasetest extends Specification {

  "database layer" should {

    "save a row" in {
      running(FakeApplication()) {
        val a = EmployeeEntry(1006, "udit", "schd", "90909090")
        Employee.savedat(a) must be equalTo (1)
      }
    }

    "get list" in {
      running(FakeApplication()) {
        Employee.getPersonList.size must be equalTo (1)
      }
    }
  }
}
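If you prefer to stay with ScalaTest rather than switching to specs2, here is a hedged sketch applying the same idea (wrap each database call in a running fake application); the spec name DatabaseLayerSpec is made up:

import org.scalatest.{FunSpec, Matchers}
import play.api.test.FakeApplication
import play.api.test.Helpers.running
import models._

class DatabaseLayerSpec extends FunSpec with Matchers {
  describe("the database layer") {
    it("can save a row when a fake application is running") {
      // running(FakeApplication()) starts a Play application, so play.api.Play.current resolves.
      running(FakeApplication()) {
        val a = EmployeeEntry(1006, "udit", "schd", "90909090")
        Employee.savedat(a) should be (1)
      }
    }
  }
}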