Unable to use Mongo Scala driver with case classes - mongodb

I'm trying to use the Scala mongo driver with case classes, as described at: http://mongodb.github.io/mongo-scala-driver/2.2/getting-started/quick-tour-case-classes/
However I'm getting the exception:
Can't find a codec for class com.foo.model.User$.
org.bson.codecs.configuration.CodecConfigurationException: Can't find a codec for class com.foo.model.User$.
when I try to insert an item.
The case class:
case class User(
    _id: ObjectId,
    foo: String = "",
    foo2: String = "",
    foo3: String = "",
    first: String,
    last: String,
    username: String,
    pwHash: String = "",
    gender: String,
    isFoo: Boolean = false)
  extends FooTrait
The code:
val providers = fromProviders( classOf[User])
val registry = fromRegistries(providers, DEFAULT_CODEC_REGISTRY)
val connStr = "mongodb://...."
val clusterSettings = ClusterSettings.builder().applyConnectionString(new ConnectionString(connStr)).build()
val clientSettings = MongoClientSettings.builder().codecRegistry(registry).clusterSettings(clusterSettings).build()
val client = MongoClient( clientSettings )
val database: MongoDatabase = client.getDatabase(dbName).withCodecRegistry(registry)
val modelCollection: MongoCollection[User] = database.getCollection("user")
val item = User(.....) //snipped
modelCollection.insertOne(item).toFuture()
Full stack trace:
Can't find a codec for class com.foo.model.User$.
org.bson.codecs.configuration.CodecConfigurationException: Can't find a codec for class com.foo.model.User$.
at org.bson.codecs.configuration.CodecCache.getOrThrow(CodecCache.java:46)
at org.bson.codecs.configuration.ProvidersCodecRegistry.get(ProvidersCodecRegistry.java:63)
at org.bson.codecs.configuration.ProvidersCodecRegistry.get(ProvidersCodecRegistry.java:37)
at com.mongodb.async.client.MongoCollectionImpl.getCodec(MongoCollectionImpl.java:1170)
at com.mongodb.async.client.MongoCollectionImpl.getCodec(MongoCollectionImpl.java:1166)
at com.mongodb.async.client.MongoCollectionImpl.executeInsertOne(MongoCollectionImpl.java:519)
at com.mongodb.async.client.MongoCollectionImpl.insertOne(MongoCollectionImpl.java:501)
at com.mongodb.async.client.MongoCollectionImpl.insertOne(MongoCollectionImpl.java:496)
at org.mongodb.scala.MongoCollection.$anonfun$insertOne$1(MongoCollection.scala:410)
at org.mongodb.scala.MongoCollection.$anonfun$insertOne$1$adapted(MongoCollection.scala:410)
at org.mongodb.scala.internal.ObservableHelper$$anon$2.apply(ObservableHelper.scala:42)
at org.mongodb.scala.internal.ObservableHelper$$anon$2.apply(ObservableHelper.scala:40)
at com.mongodb.async.client.SingleResultCallbackSubscription.requestInitialData(SingleResultCallbackSubscription.java:38)
at com.mongodb.async.client.AbstractSubscription.tryRequestInitialData(AbstractSubscription.java:151)
at com.mongodb.async.client.AbstractSubscription.request(AbstractSubscription.java:82)
at org.mongodb.scala.ObservableImplicits$BoxedSubscription.request(ObservableImplicits.scala:474)
at org.mongodb.scala.ObservableImplicits$ScalaObservable$$anon$2.onSubscribe(ObservableImplicits.scala:373)
at org.mongodb.scala.ObservableImplicits$ToSingleObservable$$anon$3.onSubscribe(ObservableImplicits.scala:440)
at org.mongodb.scala.Observer.onSubscribe(Observer.scala:85)
at org.mongodb.scala.Observer.onSubscribe$(Observer.scala:85)
at org.mongodb.scala.ObservableImplicits$ToSingleObservable$$anon$3.onSubscribe(ObservableImplicits.scala:432)
at com.mongodb.async.client.SingleResultCallbackSubscription.<init>(SingleResultCallbackSubscription.java:33)
at com.mongodb.async.client.Observables$2.subscribe(Observables.java:76)
at org.mongodb.scala.ObservableImplicits$BoxedObservable.subscribe(ObservableImplicits.scala:458)
at org.mongodb.scala.ObservableImplicits$ToSingleObservable.subscribe(ObservableImplicits.scala:432)
at org.mongodb.scala.ObservableImplicits$ScalaObservable.headOption(ObservableImplicits.scala:365)
at org.mongodb.scala.ObservableImplicits$ScalaObservable.head(ObservableImplicits.scala:351)
at org.mongodb.scala.ObservableImplicits$ScalaSingleObservable.toFuture(ObservableImplicits.scala:410)
I think I'm doing everything right - and unless this is a bug, the code should work. My mongo-scala-driver version is 2.2.0.
Any ideas?

Here is a sample that is working on my local box with Mongo 3.6.1.
// ammonite script mongo.sc
import $ivy.`org.mongodb.scala::mongo-scala-driver:2.2.0`
import org.mongodb.scala._
import org.mongodb.scala.connection._
import org.mongodb.scala.bson.ObjectId
import org.mongodb.scala.bson.codecs.Macros._
import org.mongodb.scala.bson.codecs.DEFAULT_CODEC_REGISTRY
import org.bson.codecs.configuration.CodecRegistries.{fromRegistries, fromProviders}
trait FooTrait
case class User(_id: ObjectId,
foo: String = "",
foo2: String = "",
foo3: String = "",
first: String,
last: String,
username: String,
pwHash: String = "",
gender: String,
isFoo: Boolean = false) extends FooTrait
val codecRegistry = fromRegistries(fromProviders(classOf[User]), DEFAULT_CODEC_REGISTRY )
import scala.collection.JavaConverters._
val clusterSettings: ClusterSettings = ClusterSettings.builder().hosts(List(new ServerAddress("localhost")).asJava).build()
val settings: MongoClientSettings = MongoClientSettings.builder().clusterSettings(clusterSettings).build()
val mongoClient: MongoClient = MongoClient(settings)
val database: MongoDatabase = mongoClient.getDatabase("mydb").withCodecRegistry(codecRegistry)
val userCollection: MongoCollection[User] = database.getCollection("user")
val user:User = User(new ObjectId(), "foo", "foo2", "foo3", "first", "last", "username", "pwHash", "gender", true)
import scala.concurrent.duration._
import scala.concurrent.Await
// wait for Mongo to complete insert operation
Await.result(userCollection.insertOne(user).toFuture(),3.seconds)
When you save this snippet into a file mongo.sc, you can run it with Ammonite using
amm mongo.sc
If MongoDB is running on the default port, the mydb database gets created automatically, including the new user collection.

Related

Retrieving list of objects from application.conf

I have the following entry in Play for Scala application.conf:
jobs = [
{number: 0, dir: "/dir1", name: "General" },
{number: 1, dir: "/dir2", name: "Customers" }
]
I want to retrieve this list of objects in a Scala program:
val conf = ConfigFactory.load
val jobs = conf.getAnyRefList("jobs").asScala
println(jobs)
this prints
Buffer({number=0, name=General, dir=/dir1}, {number=1, name=Customers, dir=/dir2})
But how to convert the result to actual Scala objects?
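For reference, a stdlib-only sketch of the conversion, using hypothetical raw maps standing in for the entries that getAnyRefList returns (after asScala), converted by hand:

```scala
case class Job(number: Int, dir: String, name: String)

// Hypothetical stand-in for the untyped entries printed above
val raw: Seq[Map[String, Any]] = Seq(
  Map("number" -> 0, "dir" -> "/dir1", "name" -> "General"),
  Map("number" -> 1, "dir" -> "/dir2", "name" -> "Customers")
)

// Convert each untyped map into a typed Job
val jobs: Seq[Job] = raw.map { m =>
  Job(m("number").toString.toInt, m("dir").toString, m("name").toString)
}
```

This is brittle (no error handling for missing keys), which is why the typed approaches below are preferable in real code.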
Try this one:
case class Job(number: Int, dir: String, name: String)
object Job {
implicit val configLoader: ConfigLoader[List[Job]] = ConfigLoader(_.getConfigList).map(
_.asScala.toList.map(config =>
Job(
config.getInt("number"),
config.getString("dir"),
config.getString("name")
)
)
)
}
Then, from the Configuration obtained via DI:
Configuration.get[List[Job]]("jobs")
Here is a Config object which will extract data from a config file into a type that you specify.
Usage:
case class Job(number: Int, dir: String, name: String)
val jobs = Config[List[Job]]("jobs")
Code:
import com.typesafe.config._
import org.json4s._
import org.json4s.jackson.JsonMethods._
object Config {
private val conf = ConfigFactory.load()
private val jData = parse(conf.root.render(ConfigRenderOptions.concise))
def apply[T](name: String)(implicit formats: Formats = DefaultFormats, mf: Manifest[T]): T =
Extraction.extract(jData \\ name)(formats, mf)
}
This will throw an exception if the particular config object does not exist or does not match the format of T.

connecting slick 3.1.1 to the database

I have the following code and I'm trying to connect to the MySQL database without success.
cat Database.scala
package com.github.odnanref.EmailFilter
import slick.driver.MySQLDriver._
import slick.driver.MySQLDriver.backend.Database
/**
* Created by andref on 12/05/16.
*/
class Database {
val url = "jdbc:mysql://localhost/playdb"
val db = Database.forURL(url, driver = "com.mysql.jdbc.Driver")
override def finalize() {
db.close()
super.finalize()
}
}
cat EmailMessageTable.scala
package com.github.odnanref.EmailFilter
import java.sql.Timestamp
import slick.driver.JdbcProfile
import slick.driver.MySQLDriver.api._
import scala.concurrent.Future
class EmailMessageTable(tag: Tag) extends Table[EmailMessage](tag, "email_message") {
def id = column[Option[Long]]("id", O.AutoInc, O.PrimaryKey)
def email = column[String]("email")
def subject = column[String]("subject")
def body = column[String]("body")
def datain = column[Timestamp]("datain")
def email_id= column[Long]("email_id")
def * = (id, email, subject, body, datain, email_id) <> ((EmailMessage.apply _).tupled, EmailMessage.unapply)
def ? = (Rep.Some(id), Rep.Some(email), Rep.Some(subject), Rep.Some(body), Rep.Some(datain), Rep.Some(email_id)).shaped.<>({ r => import r._; _1.map(_ =>
EmailMessage.tupled((_1.get, _2.get, _3.get, _4.get, _5.get, _6.get))) }, (_: Any) =>
throw new Exception("Inserting into ? projection not supported."))
}
I can't initialize the database or execute search queries or insert statements. Based on this code, I try to do
val db = new Database()
db.db.run(TableQuery[EmailMessageTable] += EmailMessage(...) )
And it says it doesn't know the method +=.
Also I get this error:
Database.scala:4: imported `Database' is permanently hidden by definition of class Database in package EmailFilter
[warn] import slick.driver.MySQLDriver.backend.Database
What am I doing wrong?
EDIT:
package com.github.odnanref.EmailFilter
import java.sql.Timestamp
case class EmailMessage(
id: Option[Long],
email: String,
subject:String,
body:String,
datain: Timestamp,
email_id: Long
)
You are importing a class named Database inside a file that defines another class with the same name. You can:
rename your Database class:
class MyDatabase {
val url = ...
val db = ...
...
}
rename imported class:
import slick.driver.MySQLDriver.backend.{Database => SlickDB}
...
val db = SlickDB.forURL(url, driver = "com.mysql.jdbc.Driver")
avoid importing Database explicitly:
import slick.driver.MySQLDriver.backend
...
val db = backend.Database.forURL(url, driver = "com.mysql.jdbc.Driver")
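The rename-on-import trick is plain Scala, not Slick-specific. A minimal self-contained sketch of the same idea, using scala.collection.mutable.Map in place of the Slick Database class:

```scala
// A local class whose name clashes with scala.collection.mutable.Map
class Map(val label: String)

// Rename on import so both names can be used side by side
import scala.collection.mutable.{Map => MutableMap}

val local = new Map("mine")         // the local class
val counters = MutableMap("a" -> 1) // the renamed collection class
counters("a") += 1
```

Without the rename, the plain import would be shadowed by the local definition, producing the same "permanently hidden" warning as above.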

NullPointerException on executing concurrent queries using Slick

I am working on a Scala application with Postgres 9.3 and Slick 3.1.1. I am getting a NullPointerException in the Slick driver when multiple queries execute at the same time.
Here is my simplified code. I am creating multiple actors which will call the same method to query from the database.
package com.app.repo
import java.sql.Timestamp
import akka.actor.{Actor, ActorSystem, Props}
import slick.driver.PostgresDriver
import slick.driver.PostgresDriver.api._
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration.FiniteDuration
import scala.util.{Failure, Success}
case class SampleData(id: Long, name: String, createDate: java.sql.Timestamp)
object Tables extends {
val profile = PostgresDriver
} with Tables
trait Tables {
val profile: PostgresDriver
import profile.api._
class SampleDataTable(_tableTag: Tag) extends Table[SampleData](_tableTag, Some("processing"), "SampleData") {
def * = (id, name, createDate) <>(SampleData.tupled, SampleData.unapply)
def ? = (Rep.Some(id), Rep.Some(name), Rep.Some(createDate)).shaped.<>({ r => import r._; _1.map(_ => SampleData.tupled((_1.get, _2.get, _3.get))) }, (_: Any) => throw new Exception("Inserting into ? projection not supported."))
val id: Rep[Long] = column[Long]("SampleId", O.AutoInc, O.PrimaryKey)
val name: Rep[String] = column[String]("Name")
val createDate: Rep[java.sql.Timestamp] = column[java.sql.Timestamp]("CreateDate")
}
lazy val sampleDataTable = new TableQuery(tag => new SampleDataTable(tag))
}
class SampleQueryingActor(delay: FiniteDuration, duration: FiniteDuration) extends Actor {
import scala.concurrent.duration._
override def preStart() = {
context.system.scheduler.schedule(0.second, duration, self, "tick")
}
override def receive: Receive = {
case "tick" => {
println("tick received.. ")
//val range = 1 until 1000
RepositoryImpl.reader.onComplete({
case Success(r) => println(s"got sum as ${r.getOrElse(0)}")
case Failure(ex) => ex.printStackTrace()
})
}
}
}
object DriverHelper {
val user = "postgres"
val url = "jdbc:postgresql://192.168.1.50:5432/MyDatabase"
val password = "password"
val jdbcDriver = "org.postgresql.Driver"
val db: PostgresDriver.backend.DatabaseDef = Database.forURL(url, user = user, password = password, driver = jdbcDriver)
}
object RepositoryImpl {
val db: PostgresDriver.backend.DatabaseDef = DriverHelper.db
val now = new Timestamp(System.currentTimeMillis())
def reader = {
db.run(Tables.sampleDataTable.filter(_.createDate > now).map(_.id).sum.result)
}
def insertBatchRecords(list: List[SampleData]) = {
db.run(Tables.sampleDataTable ++= list)
}
}
object PGConnectionTester extends App {
import scala.concurrent.duration._
val sys = ActorSystem("sys")
sys.actorOf(Props(classOf[SampleQueryingActor], 1.seconds, 10.seconds))
sys.actorOf(Props(classOf[SampleQueryingActor], 1.seconds, 10.seconds))
sys.actorOf(Props(classOf[SampleQueryingActor], 1.seconds, 10.seconds))
}
When I execute the above code, I get the error as below:
java.lang.NullPointerException
at slick.jdbc.DriverDataSource.getConnection(DriverDataSource.scala:98)
at slick.jdbc.DataSourceJdbcDataSource.createConnection(JdbcDataSource.scala:64)
at slick.jdbc.JdbcBackend$BaseSession.conn$lzycompute(JdbcBackend.scala:415)
at slick.jdbc.JdbcBackend$BaseSession.conn(JdbcBackend.scala:414)
at slick.jdbc.JdbcBackend$SessionDef$class.prepareStatement(JdbcBackend.scala:297)
at slick.jdbc.JdbcBackend$BaseSession.prepareStatement(JdbcBackend.scala:407)
at slick.jdbc.StatementInvoker.results(StatementInvoker.scala:33)
at slick.jdbc.StatementInvoker.iteratorTo(StatementInvoker.scala:22)
at slick.jdbc.Invoker$class.first(Invoker.scala:31)
at slick.jdbc.StatementInvoker.first(StatementInvoker.scala:16)
at slick.driver.JdbcActionComponent$QueryActionExtensionMethodsImpl$$anon$3.run(JdbcActionComponent.scala:228)
at slick.driver.JdbcActionComponent$SimpleJdbcDriverAction.run(JdbcActionComponent.scala:32)
at slick.driver.JdbcActionComponent$SimpleJdbcDriverAction.run(JdbcActionComponent.scala:29)
at slick.backend.DatabaseComponent$DatabaseDef$$anon$2.liftedTree1$1(DatabaseComponent.scala:237)
at slick.backend.DatabaseComponent$DatabaseDef$$anon$2.run(DatabaseComponent.scala:237)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
The actor will invoke the same method every 10 seconds. However, I am getting this error only the first time; after that the queries execute correctly. I am not able to understand why this is happening. In this sample case, there are only some simple read operations, but in my actual case, since the query is failing, some of the data is getting lost without being processed correctly.
Is this error something to do with connection pooling?
I think you have found this issue. Try using a lazy val for db so it is initialized only once:
object DriverHelper {
val user = "postgres"
val url = "jdbc:postgresql://192.168.1.50:5432/MyDatabase"
val password = "password"
val jdbcDriver = "org.postgresql.Driver"
lazy val db: PostgresDriver.backend.DatabaseDef = Database.forURL(url, user = user, password = password, driver = jdbcDriver)
}
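Why the lazy val helps: its body runs exactly once, on first access, and the result is cached for all later accesses. A sketch with a counter standing in for the expensive Database.forURL call:

```scala
object Helper {
  var initCount = 0

  // The lazy val body runs once, on first access; the result is then cached
  lazy val db: String = {
    initCount += 1
    "connected"
  }
}

// Multiple accesses trigger exactly one initialization
Helper.db
Helper.db
```

With a plain val the initializer runs at object construction time instead, which is what interacted badly with the Slick bug described below.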
Just sharing the information for anyone else facing this issue: there was a bug in Slick itself, reported here. GitHub user mustajavi fixed it, and the fix was merged into the latest Slick branch. After updating to the latest 3.1.1 release, the issue is resolved for me.
Related Links in GitHub:
https://github.com/slick/slick/pull/1401
https://github.com/slick/slick/pull/1445

@JsonIgnore serialising Scala case class property using Jackson and Json4s

I'm trying to prevent one of the properties of a Scala case class from being serialised. I've tried annotating the property in question with the usual @JsonIgnore, and I've also tried attaching @JsonIgnoreProperties(Array("property_name")) to the case class. Neither seems to achieve what I want.
Here's a small example:
import org.json4s._
import org.json4s.jackson._
import org.json4s.jackson.Serialization
import org.json4s.jackson.Serialization.{read, write}
import com.fasterxml.jackson.annotation._
object Example extends App {
@JsonIgnoreProperties(Array("b"))
case class Message(a: String, @JsonIgnore b: String)
implicit val formats = Serialization.formats(NoTypeHints)
val jsonInput = """{ "a": "Hello", "b":"World!" }"""
val message = read[Message](jsonInput)
println("Read " + message) // "Read Message(Hello,World!)"
val output = write(message)
println("Wrote " + output) // "Wrote {"a":"Hello","b":"World!"}"
}
Change your @JsonIgnore to @JsonProperty("b"). You have correctly told Jackson to ignore the property b, but b had not yet been annotated as a property.
@JsonIgnoreProperties(Array("b"))
case class Message(a: String, @JsonProperty("b") b: String)
With jackson-databind 2.8.6 and jackson-module-scala 2.8.4
"com.fasterxml.jackson.core" % "jackson-databind" % "2.8.6",
"com.fasterxml.jackson.module" % "jackson-module-scala_2.11" % "2.8.4"
only @JsonIgnoreProperties works fine.
Here is an example case class where I'm ignoring "eventOffset" and "hashValue":
import java.io.ByteArrayOutputStream
import java.util.Date
import com.fasterxml.jackson.annotation.{JsonIgnore, JsonIgnoreProperties}
import com.fasterxml.jackson.databind.ObjectMapper
import com.fasterxml.jackson.module.scala.DefaultScalaModule
import com.fasterxml.jackson.module.scala.experimental.ScalaObjectMapper
@JsonIgnoreProperties(Array("eventOffset", "hashValue"))
case class TestHappenedEvent(eventOffset: Long, hashValue: Long, eventType: String,
createdDate: Date, testField: String) {
def this() {
this(0, 0, "", new Date(), "")
}
def toJSON(): String = {
val objectMapper = new ObjectMapper() with ScalaObjectMapper
objectMapper.registerModule(DefaultScalaModule)
val data = this.copy()
val stream = new ByteArrayOutputStream()
objectMapper.writeValue(stream, data)
stream.toString
}
}
test
import java.util.Date
import org.scalatest.FunSuite
import spray.json._
class BaseEventSpecs extends FunSuite {
val abstractEvent = TestHappenedEvent(0, 1, "TestHappenedEvent", new Date(2017, 10, 28), "item is sold")
test("converts itself to JSON") {
assert(abstractEvent.toJSON().parseJson ==
"""
{
"eventType":"TestHappenedEvent",
"createdDate":61470000000000,
"testField":"item is sold"
}
""".stripMargin.parseJson)
}
}

Deserialize MongoDB Document with Scala and Jackson-Mapper leads to UnrecognizedProperty _id

I have the following class defined in Scala using Jackson as mapper.
package models
import play.api.Play.current
import org.codehaus.jackson.annotate.JsonProperty
import net.vz.mongodb.jackson.ObjectId
import play.modules.mongodb.jackson.MongoDB
import reflect.BeanProperty
import scala.collection.JavaConversions._
import net.vz.mongodb.jackson.Id
import org.codehaus.jackson.annotate.JsonIgnoreProperties
case class Team(
  @BeanProperty @JsonProperty("teamName") var teamName: String,
  @BeanProperty @JsonProperty("logo") var logo: String,
  @BeanProperty @JsonProperty("location") var location: String,
  @BeanProperty @JsonProperty("details") var details: String,
  @BeanProperty @JsonProperty("formOfSport") var formOfSport: String)
object Team {
private lazy val db = MongoDB.collection("teams", classOf[Team], classOf[String])
def save(team: Team) { db.save(team) }
def getAll(): Iterable[Team] = {
val teams: Iterable[Team] = db.find()
return teams
}
def findOneByTeamName(teamName: String): Team = {
val team: Team = db.find().is("teamName", teamName).first
return team
}
}
Inserting into mongodb works without problems and an _id is automatically inserted for every document.
But now I want to try to read (or deserialize) a document, e.g. by calling findOneByTeamName. This always causes an UnrecognizedPropertyException for _id. I create the instance with Team.apply and Team.unapply. Even with my own ObjectId this doesn't work, as _id and id are treated differently.
Can anyone help with how to get the instance, or how to deserialize it correctly? Thanks in advance.
I am using play-mongojack. Here is my class. Your object definition is fine.
import com.fasterxml.jackson.annotation.JsonProperty
import com.fasterxml.jackson.databind.ObjectMapper
import org.mongojack.{MongoCollection, JacksonDBCollection}
import org.mongojack.ObjectId
import org.mongojack.WriteResult
import com.mongodb.BasicDBObject
import scala.reflect.BeanProperty
import javax.persistence.Id
import javax.persistence.Transient
import java.util.Date
import java.util.List
import java.lang.{ Long => JLong }
import play.mongojack.MongoDBModule
import play.mongojack.MongoDBPlugin
import scala.collection.JavaConversions._
class Event (
  @BeanProperty @JsonProperty("clientMessageId") val clientMessageId: Option[String] = None,
  @BeanProperty @JsonProperty("conversationId") val conversationId: String
) {
  @ObjectId @Id @BeanProperty var messageId: String = _ // don't manually set messageId
  @BeanProperty @JsonProperty("uploadedFile") var uploadedFile: Option[(String, String, JLong)] = None // the uploaded file (url, name, size)
  @BeanProperty @JsonProperty("createdDate") var createdDate: Date = new Date()
  @BeanProperty @Transient var cmd: Option[(String, String)] = None // the cmd (cmd, param)
def createdDateStr() = {
val format = new java.text.SimpleDateFormat("yyyy-MM-dd HH:mm:ss")
format.format(createdDate)
}
}