Not persisting Scala Nones instead of persisting them as null values - mongodb

I noticed that the Scala driver (version 1.2.1) writes Option values of None as null to the corresponding field. I would prefer to omit the field completely in this case. Is this possible?
Example
case class Test(foo: Option[String])
persist(Test(None))
leads to
> db.test.find()
{ "_id": "...", "foo": null }
but I want to achieve
> db.test.find()
{ "_id": "..." }
When I used Casbah, I think my intended behaviour was the default.

Now you can use macros for this (see http://mongodb.github.io/mongo-scala-driver/2.4/bson/macros/):
val testCodec = Macros.createCodecProviderIgnoreNone[Test]()
and in the codec configuration:
lazy val codecRegistry: CodecRegistry = fromRegistries(fromProviders(testCodec))
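For completeness, here is a minimal sketch of wiring that registry into a collection; the database and collection names are assumptions, and DEFAULT_CODEC_REGISTRY is included so the standard types still resolve:
import org.mongodb.scala._
import org.mongodb.scala.bson.codecs.Macros
import org.mongodb.scala.MongoClient.DEFAULT_CODEC_REGISTRY
import org.bson.codecs.configuration.CodecRegistry
import org.bson.codecs.configuration.CodecRegistries.{fromProviders, fromRegistries}

case class Test(foo: Option[String])

// codec provider that omits fields whose value is None
val testCodec = Macros.createCodecProviderIgnoreNone[Test]()

lazy val codecRegistry: CodecRegistry =
  fromRegistries(fromProviders(testCodec), DEFAULT_CODEC_REGISTRY)

// attach the registry so Test round-trips without null fields
val client = MongoClient()
val db = client.getDatabase("mydb").withCodecRegistry(codecRegistry)
val collection: MongoCollection[Test] = db.getCollection[Test]("test")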

I opened a feature request in the MongoDB bug tracker (https://jira.mongodb.org/browse/SCALA-294), which was answered by Ross Lawley. He suggests changing the conversion code (from case class to document) from
def toDocument(t: Test) = Document("foo" -> t.foo)
to something like
def toDocument(t: Test) = {
  // Document is immutable, so build the document from the defined value
  // rather than discarding the result of an add
  t.foo.fold(Document())(value => Document("foo" -> value))
}

Related

Associations in Activeslick

I was trying Active-Slick and was able to run the active-slick example: https://github.com/reactivemaster/active-slick-example
But I am not sure how to manage associations using Active-Slick. Please provide an example.
I also tried to achieve it using the method below, but I am not sure whether it is a good way of doing it, or whether it still qualifies as the active record pattern.
BookService.scala
val book = Book(None, "Harry Potter")
val action = for {
  id <- bookDao.insert(book)
  y  <- authorDao.insert(Author(None, id, "J.K.Rowling"))
} yield y
db.run(action.transactionally)
We use UUIDs for the ID column and they are generated in the Scala code, not by the database. I don't know how this will work with your "active record pattern", but it is nice because you can associate objects all you want before having to talk to the database. I also prefer this typed Id[T] over individual types like BookId and AuthorId.
case class Id[+T](value: String) extends MappedTo[String]
object Id {
  def generate[T]: Id[T] = Id[T](java.util.UUID.randomUUID().toString)
}
case class Author(id: Id[Author], name: String)
case class Book(id: Id[Book], title: String, authorId: Id[Author])
val newAuthor = Author(Id.generate, "JK Rowling")
val newBook = Book(Id.generate, "Harry Potter", newAuthor.id)
// do other stuff?
val action = for {
  _ <- authorDao.insert(newAuthor)
  _ <- bookDao.insert(newBook)
} yield 1
db.run(action.transactionally)
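For context (not part of the original answer), a rough sketch of what the corresponding Slick table definitions could look like with this Id[T] approach; the profile, table names and column names here are assumptions:
import slick.jdbc.H2Profile.api._

class AuthorTable(tag: Tag) extends Table[Author](tag, "authors") {
  // Id[Author] maps to a String column automatically because Id extends MappedTo[String]
  def id   = column[Id[Author]]("id", O.PrimaryKey)
  def name = column[String]("name")
  def * = (id, name) <> ((Author.apply _).tupled, Author.unapply)
}

class BookTable(tag: Tag) extends Table[Book](tag, "books") {
  def id       = column[Id[Book]]("id", O.PrimaryKey)
  def title    = column[String]("title")
  def authorId = column[Id[Author]]("author_id")
  // foreign key from books.author_id to authors.id
  def author   = foreignKey("book_author_fk", authorId, TableQuery[AuthorTable])(_.id)
  def * = (id, title, authorId) <> ((Book.apply _).tupled, Book.unapply)
}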
Hope this helps.

Slick: how to implement find by example i.e. findByExample generically?

I'm exploring the different possibilities for implementing a generic DAO using the latest Slick 3.1.1 to boost productivity, and yes, there is a need for it, because basing the service layer of my Play web application on TableQuery alone leads to a lot of boilerplate code. One of the methods I'd like to feature in my generic DAO implementation is findByExample, which is possible in JPA with the help of the Criteria API. In my case, I'm using the Slick code generator to generate the model classes from a SQL script.
I need the following to be able to dynamically access the attribute names (taken from the Scala question "Get field names list from case class"):
import scala.reflect.runtime.universe._
def classAccessors[T: TypeTag]: List[MethodSymbol] = typeOf[T].members.collect {
  case m: MethodSymbol if m.isCaseAccessor => m
}.toList
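As a quick illustration of what classAccessors returns (PersonRow is a hypothetical generated row class; note that Type.members does not guarantee declaration order):
case class PersonRow(firstName: Option[String], streetName: Option[String], streetNumber: Option[Int])

// the case-class accessor symbols, from which field/column names can be read
val accessorNames = classAccessors[PersonRow].map(_.name.toString)
// e.g. List("streetNumber", "streetName", "firstName")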
A draft implementation for findByExample would be:
def findByExample[T, R](example: R): Future[Seq[R]] = {
  var qt = TableQuery[T].result
  val accessors = classAccessors[R]
  (0 until example.productArity).map { i =>
    example.productElement(i) match {
      case None => // ignore
      case 0 => // ignore
      // ... some more default values => // ignore
      // handle a populated case
      case Some(x) => {
        val columnName = accessors(i)
        qt = qt.filter(_.columnByName(columnName) == x)
      }
    }
  }
  qt.result
}
But this doesn't work because I need better Scala Kungfu. T is the entity table type and R is the row type that is generated as a case class and therefore a valid Scala Product type.
The first problem with that code is that it would be too inefficient, because instead of doing e.g.
qt.filter(p => p.firstName === "Juan" && p.streetName === "Rosedale Ave." && p.streetNumber === 5)
it does this:
// find all
var qt = TableQuery[T].result
// then filter by each column at the time
qt = qt.filter(_.firstName === "Juan")
qt = qt.filter(_.streetName === "Rosedale Ave.")
qt = qt.filter(_.streetNumber === 5)
Second, I can't see how to dynamically access the column name in the filter method, i.e.
qt.filter(_.firstName == "Juan")
I need to instead have
qt.filter(_.columnByName("firstName") == "Juan")
but apparently there is no such possibility while using the filter function?
Probably the best ways to implement filters and sorting by dynamically provided column names would be either plain SQL or extending the code generator to generate extension methods, something like this:
implicit class DynamicPersonQueries[C[_]](q: Query[PersonTable, PersonRow, C]) {
  def dynamicFilter(column: String, value: String) = column match {
    case "firstName" => q.filter(_.firstName === value)
    case "streetNumber" => q.filter(_.streetNumber === value.toInt)
    ...
  }
}
You might have to fiddle with the types a bit to get it to compile (and ideally update this post afterwards :)).
You can then filter by all the provided values like this:
val examples: Map[String, String] = ...
val t = TableQuery[PersonTable]
// widen to Query so the fold can keep the filtered result as its accumulator
val query = examples.foldLeft(t: Query[PersonTable, PersonRow, Seq]) {
  case (q, (column, value)) => q.dynamicFilter(column, value)
}
query.result
Extending the code generator is explained here: http://slick.lightbend.com/doc/3.1.1/code-generation.html#customization
After further research I found the following blog post: Repository Pattern / Generic DAO Implementation.
There they declare and implement a generic filter method that works for any model entity type, so in my view it is a valid functional replacement for JPA's findByExample.
i.e.
T <: Table[E] with IdentifyableTable[PK]
E <: Entity[PK]
PK: BaseColumnType
def filter[C <: Rep[_]](expr: T => C)(implicit wt: CanBeQueryCondition[C]): Query[T, E, Seq] =
  tableQuery.filter(expr)
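A rough usage sketch, assuming a personDao that mixes in such a repository trait for PersonTable/PersonRow (the names here are hypothetical):
// the generic filter returns a Query, so it composes like any other Slick query
val juans: Future[Seq[PersonRow]] =
  db.run(personDao.filter(_.firstName === "Juan").result)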

Building a document functionally and based on input value in a Play 2 controller in Scala and ReactiveMongo

I have a Play controller Action that edits a document in MongoDB using ReactiveMongo. The code is shown below. Both name and keywords are optional. I'm creating a temporary BSONDocument() and adding tuples to it based on whether name and keywords exist and are not empty. However, tmp is currently mutable (a var). I'm wondering how I can get rid of the var.
def editEntity(id: String, name: Option[String], keywords: Option[String]) = Action {
  val objectId = new BSONObjectID(id)
  //TODO get rid of var here
  var tmp = BSONDocument()
  if (name.exists(_.trim.nonEmpty)) {
    tmp = tmp.add("name" -> BSONString(name.get))
  }
  val typedKeywords: Option[List[String]] = Utils.getKeywords(keywords)
  if (typedKeywords.exists(_.size > 0)) {
    tmp = tmp.add("keywords" -> typedKeywords.get.map(x => BSONString(x)))
  }
  val modifier = BSONDocument("$set" -> tmp)
  val updateFuture = collection.update(BSONDocument("_id" -> objectId), modifier)
}
UPDATE: After looking at the solution from @Vikas, it occurred to me: what if there are more (say 10 or 15) input Options that I need to deal with? Maybe a fold or reduce based solution would scale better?
In your current code you're adding an empty BSONDocument() if none of those if conditions matched: val modifier = BSONDocument("$set" -> tmp) will have an empty tmp if name was None and typedKeywords was None. Assuming that's what you want, here is one approach to get rid of the transient var. Also note that having a var locally (in a method) isn't a bad thing (sure, I'll still make that code look a bit prettier):
val typedKeywords: Option[List[String]] = Utils.getKeywords(keywords)
val bsonDoc = (name, typedKeywords) match {
  case (Some(n), Some(kw)) => BSONDocument().add("name" -> BSONString(n)).add("keywords" -> kw.map(x => BSONString(x)))
  case (Some(n), None)     => BSONDocument().add("name" -> BSONString(n))
  case (None, Some(kw))    => BSONDocument().add("keywords" -> kw.map(x => BSONString(x)))
  case (None, None)        => BSONDocument()
}
val modifier = BSONDocument("$set" -> bsonDoc)
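To address the UPDATE about scaling to many optional inputs, here is one possible fold-based sketch; it reuses the question's field names and the Utils.getKeywords helper, and the exact BSONDocument API may differ between ReactiveMongo versions:
val typedKeywords: Option[List[String]] = Utils.getKeywords(keywords)

// build one small document per defined input; None entries vanish on flatten,
// so supporting another optional field is a single extra line
val parts: List[BSONDocument] = List(
  name.filter(_.trim.nonEmpty).map(n => BSONDocument("name" -> BSONString(n))),
  typedKeywords.filter(_.nonEmpty).map(kw => BSONDocument("keywords" -> kw.map(x => BSONString(x))))
).flatten

val tmp = parts.foldLeft(BSONDocument())(_ ++ _)
val modifier = BSONDocument("$set" -> tmp)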

How to query with '$in' over '_id' in reactive mongo and play

I have a project set up with playframework 2.2.0 and play2-reactivemongo 0.10.0-SNAPSHOT. I'd like to query for few documents by their ids, in a fashion similar to this:
def usersCollection = db.collection[JSONCollection]("users")
val ids: List[String] = /* fetched from somewhere else */
val query = ???
val users = usersCollection.find(query).cursor[User].collect[List]()
As a query I tried:
Json.obj("_id" -> Json.obj("$in" -> ids)) // 1
Json.obj("_id.$oid" -> Json.obj("$in" -> ids)) // 2
Json.obj("_id" -> Json.obj("$oid" -> Json.obj("$in" -> ids))) // 3
The first and second return empty lists, and the third fails with the error: assertion 10068 invalid operator: $oid.
NOTE: copy of my response on the ReactiveMongo mailing list.
First, sorry for the delay of my answer, I may have missed your question.
Play-ReactiveMongo cannot guess on its own that the values of a Json array are ObjectIds. That's why you have to make a Json object for each id that looks like this: {"$oid": "526fda0f9205b10c00c82e34"}. When the ReactiveMongo Play plugin sees an object whose first field is $oid, it treats it as an ObjectId so that the driver can send the right type for this value (a BSONObjectID in this case).
This is a more general problem actually: the JSON format does not match exactly the BSON one. That's the case for numeric types (BSONInteger, BSONLong, BSONDouble), BSONRegex, BSONDateTime, and BSONObjectID. You may find more detailed information in the MongoDB documentation: http://docs.mongodb.org/manual/reference/mongodb-extended-json/ .
I managed to solve it with:
val objectIds = ids.map(id => Json.obj("$oid" -> id))
val query = Json.obj("_id" -> Json.obj("$in" -> objectIds))
usersCollection.find(query).cursor[User].collect[List]()
since the play-reactivemongo format considers a value to be a BSONObjectID only when "$oid" is followed by a string:
implicit object BSONObjectIDFormat extends PartialFormat[BSONObjectID] {
  def partialReads: PartialFunction[JsValue, JsResult[BSONObjectID]] = {
    case JsObject(("$oid", JsString(v)) +: Nil) => JsSuccess(BSONObjectID(v))
  }
  val partialWrites: PartialFunction[BSONValue, JsValue] = {
    case oid: BSONObjectID => Json.obj("$oid" -> oid.stringify)
  }
}
Still, I hope there is a cleaner solution. If not, I guess it makes it a nice pull request.
I'm wondering if transforming the ids to BSONObjectID isn't safer, done this way:
val ids: List[String] = ???
val bsonObjectIds = ids.map(BSONObjectID.parse(_)).collect{case Success(t) => t}
This will only produce valid BSONObjectIDs (and discard invalid ones).
If you do it this way:
val objectIds = ids.map(id => Json.obj("$oid" -> id))
your objectIds may not be valid, depending on whether each string id really is the stringified form of a BSONObjectID or not.
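Putting the two points together, a possible sketch that parses the strings first and only then builds the $oid objects, so invalid ids never reach the query:
import play.api.libs.json.Json
import reactivemongo.bson.BSONObjectID
import scala.util.Success

val ids: List[String] = ???
// keep only the ids that parse to valid ObjectIds
val validOids = ids.map(BSONObjectID.parse(_)).collect { case Success(oid) => oid }
val query = Json.obj("_id" -> Json.obj("$in" -> validOids.map(oid => Json.obj("$oid" -> oid.stringify))))
usersCollection.find(query).cursor[User].collect[List]()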
If you import play.modules.reactivemongo.json._ it works without any $oid formatters.
import play.modules.reactivemongo.json._
...
val ids: Seq[BSONObjectID] = ???
val selector = Json.obj("_id" -> Json.obj("$in" -> ids))
usersCollection.find(selector).cursor[User].collect[Seq]()
I tried with the following and it worked for me:
val listOfItems = BSONArray(51, 61)
val query = BSONDocument("_id" -> BSONDocument("$in" -> listOfItems))
val ruleListFuture = bsonFutureColl.flatMap(
  _.find(query, Option.empty[BSONDocument])
    .cursor[ResponseAccDataBean]()
    .collect[List](-1, Cursor.FailOnError[List[ResponseAccDataBean]]())
)

How to manage multiple levels of Objects using Casbah and Subset?

I have three objects
case class Metric(val name: String, val tags: Map[String, String])
case class Threshold(val metric: Metric, val critical: Long, val warning: Long)
class Profile(val name: String, val thresholds: List[Threshold])
I plan to store only the Profile object in MongoDB, but in the Scala app they should be represented by their own types.
I am using Subset for this and have defined readers and writers of the following nature:
implicit val reader = ValueReader[Threshold]({
  case metric(metric) ~ critical(critical) ~ warning(warning) =>
    new Threshold(metric, critical, warning)
})
implicit val writer = {
  def f(threshold: Threshold): DBObject =
    (metric -> threshold.metric) ~ (critical -> threshold.critical) ~ (warning -> threshold.warning)
  ValueWriter(f _)
}
How can I query to and from Mongo now? Are there any examples around this?
The integration test is a good example of how to work with nested objects, queries, updates, etc. Parts of this test are scattered across the documentation as well.
If you plan to read from Mongo, you need readers for all the parts of your model. If you plan to query or update, you need writers as well. The Scala compiler will issue an error if it cannot find a necessary implicit.
Here is how you would query Profiles:
object Profile {
  val name = "name".fieldOf[String]
  val thresholds = "thresholds".subset(Threshold).of[List[Threshold]]

  // typical query:
  def alarmsFor(metric: String) =
    collection.find( thresholds elemMatch { t =>
      t.metric.where{_.name == metric} && t.critical > 10
    } ) map {
      case name(n) ~ thresholds(t) => new Profile(n, t)
    }
}
I've made a couple of assumptions in this snippet:
Threshold's fields are defined in object Threshold (t is where you get it)
Threshold.metric field is a subset itself, e.g. val metric = "metric".subset(Metric).of[Metric], so that you can query metric.where{_.name == metric}
Note that as of version 0.7.0 there is still no reader/writer for Map[String,T] (though I plan to have it eventually) -- you'll have to develop it (if you need this field) or work around this problem in Metric's reader/writer.