How to Convert MongoDBObject to JsonString - mongodb

My mongoDb collection looks like this:
> db.FakeCollection.find().pretty()
{
    "_id" : ObjectId("52b2d71c5c197846fd3a2737"),
    "categories" : [
        {
            "categoryname" : "entertainment",
            "categoryId" : "d3ffca550ae44904aedf77cdcbd31d7a",
            "displayname" : "Entertainment",
            "subcategories" : [
                {
                    "subcategoryname" : "games",
                    "subcategoryId" : "ff3d0cbeb0eb4960b11b47d7fc64991b",
                    "displayname" : "Games"
                }
            ]
        }
    ]
}
I want to write a test case for the above collection using Specs2 JsonMatchers in Scala with MongoDB Casbah.
How do I convert DBObjects to JSON strings?

I believe your approach is slightly wrong here. Your collection should look like:
class Category extends BsonRecord[Category] {
  def meta = Category
  object categoryname extends StringField(this, 200)
  object categoryId extends StringField(this, 64)
  object displayname extends StringField(this, 100)
  object subcategories extends BsonRecordListField(this, Category)
}
object Category extends Category with BsonMetaRecord[Category]
class FakeCollection extends MongoRecord[FakeCollection] with ObjectIdPk[FakeCollection] {
  def meta = FakeCollection
  object categories extends BsonRecordListField(this, Category)
}
object FakeCollection extends FakeCollection with MongoMetaRecord[FakeCollection] {
  override def collectionName = "fakecollection"
  def getEntryByName: List[FakeCollection] = {
    FakeCollection.findAll
  }
}
With that method you can do:
import net.liftweb.json.JsonAST.JValue
import net.liftweb.http.js.JsExp
import net.liftweb.http.js.JsExp._
import net.liftweb.json.JsonDSL.seq2jvalue
val json: JsExp = seq2jvalue(FakeCollection.findAll.map(_.asJValue))
val stringContent = json.toJsCmd // now it's here, you can match
Have a look HERE to see how you can add Foursquare Rogue to make your life easier.

Short answer:
val doc: com.mongodb.DBObject = ???
pretty(render(net.liftweb.mongodb.JObjectParser.serialize(doc)))
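As an aside (not part of the original answer), the MongoDB Java driver that Casbah wraps can also produce a JSON string directly from a DBObject; a minimal sketch using its legacy helper:
// com.mongodb.util.JSON is the legacy Java-driver helper; its output formatting
// differs slightly from lift-json's pretty/compact rendering
val jsonString: String = com.mongodb.util.JSON.serialize(doc)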
Long answer that explains what's going on. I included full type names for clarity:
import net.liftweb.mongodb.JObjectParser
import net.liftweb.json.DefaultFormats
// default JSON formats for `parse` and `serialize` below
implicit val formats = DefaultFormats
// Convert DBObject to JValue:
val doc: com.mongodb.DBObject = ??? // get it somehow
val jsonDoc: net.liftweb.json.JValue = JObjectParser.serialize(doc)
// Convert JValue to DBObject:
val doc2: net.liftweb.json.JObject = ???
val dbObj: com.mongodb.DBObject = JObjectParser.parse(doc2)
// Render JSON as String:
import net.liftweb.json._
pretty(render(jsonDoc))
// or use compactRender, compact(render(jsonDoc)), etc
To compare JSON docs there is Diff: val Diff(changed, added, deleted) = json1 diff json2.
More info here: https://github.com/lift/lift/tree/master/framework/lift-base/lift-json/.
For example, you can test with specs2 and lift-json's Diff this way:
json1 diff json2 mustEqual Diff(changedJson, addedJson, JNothing)
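Tying this back to the original question, a rough specs2 sketch (an illustration only, assuming the JsonMatchers trait is mixed in and the DBObject is fetched with Casbah somehow):
import org.specs2.mutable.Specification
import org.specs2.matcher.JsonMatchers
import net.liftweb.json._
import net.liftweb.mongodb.JObjectParser

class FakeCollectionSpec extends Specification with JsonMatchers {
  implicit val formats = DefaultFormats

  "a FakeCollection document" should {
    "contain the entertainment category" in {
      val doc: com.mongodb.DBObject = ??? // fetch it from the collection somehow
      val jsonString = compact(render(JObjectParser.serialize(doc)))
      jsonString must */("categoryname" -> "entertainment")
    }
  }
}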

Related

Insert document with ReactiveMongo does not use BSONWriter or Reader

I have the following class:
case class DeviceRegistration(deviceData: DeviceData,
                              pin: String,
                              created: DateTime) {}
The DeviceData type is defined simply as four string fields.
I've been trying unsuccessfully to insert a DeviceRegistration into a mongo collection. I want to make sure the date is stored as an ISODate and not a NumberLong, so I implemented custom readers and writers.
implicit object DeviceDataWriter extends BSONDocumentWriter[DeviceData] {
  def write(data: DeviceData): BSONDocument = BSONDocument(
    DEVICE_ID_KEY -> data.deviceId,
    MODEL_KEY -> data.model,
    BRAND_KEY -> data.brand,
    MANUFACTURER_KEY -> data.manufacturer)
}
implicit object DeviceRegistrationWriter extends BSONDocumentWriter[DeviceRegistration] {
  def write(registration: DeviceRegistration): BSONDocument = BSONDocument(
    DEVICE_DATA_KEY -> registration.deviceData,
    PIN_KEY -> registration.pin,
    CREATED_KEY -> BSONDateTime(registration.created.getMillis))
}
implicit object DeviceDataReader extends BSONDocumentReader[DeviceData] {
  def read(doc: BSONDocument): DeviceData = {
    val deviceId = doc.getAs[String](DEVICE_ID_KEY).get
    val model = doc.getAs[String](MODEL_KEY)
    val brand = doc.getAs[String](BRAND_KEY)
    val manufacturer = doc.getAs[String](MANUFACTURER_KEY)
    DeviceData(deviceId, model, brand, manufacturer)
  }
}
implicit object DeviceRegistrationReader extends BSONDocumentReader[DeviceRegistration] {
  def read(doc: BSONDocument): DeviceRegistration = {
    val deviceData = doc.getAs[DeviceData](DEVICE_DATA_KEY).get
    val pin = doc.getAs[String](PIN_KEY).get
    val created = doc.getAs[BSONDateTime](CREATED_KEY).map(dt => new DateTime(dt.value))
    DeviceRegistration(deviceData, pin, created.get)
  }
}
I'm trying to insert the document with the following code:
def save(deviceRegistration: DeviceRegistration): Future[DeviceRegistration] = {
  deviceRegistrations.insert(deviceRegistration).map(result => deviceRegistration)
}
To retrieve, I'm doing this:
def findDeviceRegistrationRequest(deviceConfirmationData: DeviceConfirmationData) = {
  deviceRegistrations.find(BSONDocument("pin" -> deviceConfirmationData.pin))
    .one[DeviceRegistration](ReadPreference.Primary)
}
The record is stored as this:
{ "_id" : ObjectId("56dea8d0d8cadd6ff70690d8"), "deviceData" : { "deviceId" : "kdsajkldjsalkda" }, "pin" : "9914", "created" : NumberLong("1457432783921") }
The created date is clearly not being serialized by my writer. It seems reactivemongo is using some default writer.
Likewise, when I read, I get an exception:
Caused by: java.lang.RuntimeException: (/created,List(ValidationError(List(error.expected.date),WrappedArray())))
So it's clearly also not using my reader.
I didn't have any luck googling around. What am I missing?
I also tried to find a way to specifically set the writer and reader I want to use (instead of relying on the implicit mechanism), but I was not able to figure it out.
Any pointers in the right direction would be most appreciated.
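One way to take implicit resolution out of the equation (a sketch only, assuming deviceRegistrations is a reactivemongo.api.collections.bson.BSONCollection; with a Play JSONCollection the BSON handlers above would never be consulted) is to pass the reader and writer by hand:
import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global
import reactivemongo.api.ReadPreference
import reactivemongo.bson.BSONDocument

def saveExplicitly(registration: DeviceRegistration): Future[DeviceRegistration] =
  deviceRegistrations
    .insert(registration)(DeviceRegistrationWriter, global) // writer passed explicitly
    .map(_ => registration)

def findExplicitly(pin: String): Future[Option[DeviceRegistration]] =
  deviceRegistrations
    .find(BSONDocument("pin" -> pin))
    .one[DeviceRegistration](ReadPreference.Primary)(DeviceRegistrationReader, global) // reader passed explicitly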

How to use lift-json's class extractor to construct a mongo BSON array?

I'm using lift-json to render a BSON string with a class extractor; after that, I use the mongo Document class to construct a document instance from that BSON string.
The problem is how to represent the $or BSON. It doesn't seem to be a classic JSON array.
{"$or": [
  {"username": "administrator"},
  {"phone": "110"},
  {"email": "123#xxx.com"},
  {"pen_name": "lorancechen"}
]}
How can I use a lift class extractor to represent this BSON array?
Besides, the reason for using strings between the app and mongo is that they communicate over a simple socket.
UPDATE: here is an example.
Extracting a normal array class works as follows:
import net.liftweb.json._
import net.liftweb.json.Extraction._
case class Name(name: String)
case class JsonArray(array: List[Name])
object JsonClient extends App {
  implicit val formats = DefaultFormats
  val names = Name("jone01") :: Name("jone02") :: Nil
  val array = JsonArray(names)
  val jsonString = prettyRender(decompose(array))
  println(jsonString)
}
OUTPUT:
{
  "array":[
    {
      "name":"jone01"
    },
    {
      "name":"jone02"
    }
  ]
}
How do I represent this?
{"$or": [
  {"username": "administrator"},
  {"phone": "110"},
  {"email": "123#xxx.com"},
  {"pen_name": "lorancechen"}
]}
Every field key (e.g. username, phone) of the elements inside "$or" has a different name, and I haven't found a way to represent that with a class template.
I don't get why your JSON structure is that way; maybe what you want is the following:
{
  "$or": [
    {
      "username": "administrator", "phone": "110",
      "email": "123#xxx.com", "pen_name": "lorancechen"
    },
    {
      "username": "xxx", "phone": "xxx",
      "email": "xxx", "pen_name": "xxx"
    }
    ...
  ]
}
Actually Lift provides tools for this; the downside is that the implementation is a little bit ugly.
import net.liftweb.mongodb.{JsonObjectMeta, JsonObject}
// the trait here is for unifying the type of the
// four case classes, i.e. Username, Phone, ...
sealed trait PersonField
object Username extends JsonObjectMeta[Username]
case class Username(username: String) extends JsonObject[Username] with PersonField {
  def meta = Username
}
case class Phone(phone: String) extends JsonObject[Phone] with PersonField {
  def meta = Phone
}
object Phone extends JsonObjectMeta[Phone]
case class Email(email: String) extends JsonObject[Email] with PersonField {
  def meta = Email
}
object Email extends JsonObjectMeta[Email]
case class PenName(pen_name: String) extends JsonObject[PenName] with PersonField {
  def meta = PenName
}
object PenName extends JsonObjectMeta[PenName]
case class Output(`$or`: List[PersonField]) extends JsonObject[Output] {
  def meta = Output
}
object Output extends JsonObjectMeta[Output]
object JsonClient extends App {
  val username = Username("administrator")
  val phone = Phone("110")
  val email = Email("123#xxx.com")
  val penName = PenName("lorancechen")
  val outPut = Output(username :: phone :: email :: penName :: Nil)
  import net.liftweb.json._
  implicit val formats = DefaultFormats
  import net.liftweb.json.{JsonAST, Printer}
  val result = Printer.pretty(JsonAST.render(outPut.asJObject))
  /*
  {
    "$or":[{
      "username":"administrator"
    },{
      "phone":"110"
    },{
      "email":"123#xxx.com"
    },{
      "pen_name":"lorancechen"
    }]
  }
  */
  println(result)
}
Anyway, hope it helps.
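A lighter-weight alternative sketch (not shown in the answer above) builds the same structure with lift-json's JsonDSL instead of JsonObject case classes; the implicit conversions turn each (key -> value) pair into a single-field JObject:
import net.liftweb.json._
import net.liftweb.json.JsonDSL._

object JsonDslClient extends App {
  // each element of the $or array becomes its own one-field JObject
  val or: JValue = "$or" -> List[JValue](
    ("username" -> "administrator"),
    ("phone" -> "110"),
    ("email" -> "123#xxx.com"),
    ("pen_name" -> "lorancechen")
  )
  println(prettyRender(or))
}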

Multiple slick `column`s for the same DB column break projection

I'm new to Slick, so I'm not sure whether the problem is caused by incorrect usage of implicits or whether Slick simply doesn't allow what I'm trying to do.
In short, I use the slick-pg extension for JSONB support in Postgres. I also use spray-json to deserialize JSONB fields into case classes.
In order to automagically convert columns into objects, I wrote the generic implicit JsonColumnType that you can see below. It allows any case class for which I defined a JSON formatter to be converted to a jsonb field.
On the other hand, I want to have a JsValue-typed alias for the same column so that I can use the JSONB operators.
import com.github.tminglei.slickpg._
import com.github.tminglei.slickpg.json.PgJsonExtensions
import org.bson.types.ObjectId
import slick.ast.BaseTypedType
import slick.jdbc.JdbcType
import spray.json.{JsValue, RootJsonWriter, RootJsonReader}
import scala.reflect.ClassTag
trait MyPostgresDriver extends ExPostgresDriver with PgArraySupport with PgDate2Support with PgRangeSupport with PgHStoreSupport with PgSprayJsonSupport with PgJsonExtensions with PgSearchSupport with PgNetSupport with PgLTreeSupport {
  override def pgjson = "jsonb" // jsonb support is in postgres 9.4.0 onward; for 9.3.x use "json"
  override val api = MyAPI
  private val plainAPI = new API with SprayJsonPlainImplicits
  object MyAPI extends API with DateTimeImplicits with JsonImplicits with NetImplicits with LTreeImplicits with RangeImplicits with HStoreImplicits with SearchImplicits with SearchAssistants { //with ArrayImplicits
    implicit val ObjectIdColumnType = MappedColumnType.base[ObjectId, Array[Byte]](
      { obj => obj.toByteArray }, { arr => new ObjectId(arr) }
    )
    implicit def JsonColumnType[T: ClassTag](implicit reader: RootJsonReader[T], writer: RootJsonWriter[T]) = {
      val columnType: JdbcType[T] with BaseTypedType[T] = MappedColumnType.base[T, JsValue]({ obj => writer.write(obj) }, { json => reader.read(json) })
      columnType
    }
  }
}
object MyPostgresDriver extends MyPostgresDriver
object MyPostgresDriver extends MyPostgresDriver
Here is how my table is defined (minimized version):
case class Article(id: ObjectId, ids: Ids)
case class Ids(doi: Option[String], pmid: Option[Long])
class ArticleRow(tag: Tag) extends Table[Article](tag, "articles") {
  def id = column[ObjectId]("id", O.PrimaryKey)
  def idsJson = column[JsValue]("ext_ids")
  def ids = column[Ids]("ext_ids")
  private val fromTuple: ((ObjectId, Ids)) => Article = {
    case (id, ids) => Article(id, ids)
  }
  private val toTuple = (v: Article) => Option((v.id, v.ids))
  def * = ProvenShape.proveShapeOf((id, ids) <> (fromTuple, toTuple))(MappedProjection.mappedProjectionShape)
}
private val articles = TableQuery[ArticleRow]
Finally, I have a function that looks up articles by the value of a JSON field:
def getArticleByDoi(doi: String): Future[Article] = {
  val query = (for (a <- articles if (a.idsJson +>> "doi").asColumnOf[String] === doi) yield a).take(1).result
  slickDb.run(query).map { items =>
    items.headOption.getOrElse(throw new RuntimeException(s"Article with doi $doi is not found"))
  }
}
Sadly, I get the following exception at runtime:
java.lang.ClassCastException: spray.json.JsObject cannot be cast to server.models.db.Ids
The problem is in SpecializedJdbcResultConverter.base, where ti.getValue is being called with the wrong ti. It should be slick.driver.JdbcTypesComponent$MappedJdbcType, but instead it's com.github.tminglei.slickpg.utils.PgCommonJdbcTypes$GenericJdbcType. As a result, the wrong type is passed into my tuple converter.
What makes Slick choose a different type for the column even though there is an explicit projection definition in the table row class?
A sample project that demonstrates the issue is here.
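One workaround sketch (an assumption on my part, requiring an implicit spray-json RootJsonFormat[Ids] such as jsonFormat2(Ids) in scope): keep a single JsValue column for "ext_ids" and do the JsValue/Ids conversion inside the default projection, so Slick only ever maps the column once:
import MyPostgresDriver.api._
import spray.json._

class ArticleRowAlt(tag: Tag) extends Table[Article](tag, "articles") {
  def id      = column[ObjectId]("id", O.PrimaryKey)
  def idsJson = column[JsValue]("ext_ids") // single column, still usable with +>> etc.

  // convert between JsValue and Ids in the projection instead of declaring a second column
  def * = (id, idsJson).shaped <> (
    { case (id, json) => Article(id, json.convertTo[Ids]) },
    { a: Article => Some((a.id, a.ids.toJson)) }
  )
}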

ReactiveMongo: How to convert a BSONArray to List[String]

Given the following BSONDocument...
val values = BSONDocument("values" -> BSONArray("one", "two", "three"))
How do I convert it to a List? I've tried this...
values.getAs[List[String]]("values").getOrElse(List.empty)
... but it doesn't work - I always get List.empty.
Am I missing something?
EDIT
OK... I think it is worth describing the real case. I ran the distinct command and this was the result:
values: {
  values: [
    0: BSONObjectID("55d0f641a100000401b7e454")
  ],
  stats: {
    n: BSONInteger(1),
    nscanned: BSONInteger(1),
    nscannedObjects: BSONInteger(1),
    timems: BSONInteger(0),
    cursor: BSONString(BtreeCursor projectId_1)
  },
  ok: BSONDouble(1.0)
}
I need to transform values to a Scala List[String] like this:
List("55d0f641a100000401b7e454")
Here is my solution. First, I've defined a BSONReader[BSONValue, String] like this...
package object bsonFormatters {
  implicit object BSONValueStringReader extends BSONReader[BSONValue, String] {
    def read(bson: BSONValue) = bson match {
      case oid: BSONObjectID => oid.stringify
    }
  }
}
... and then just imported it in my companion object like this:
import reactivemongo.bson.{BSONString, BSONDocument}
import reactivemongo.core.commands.{CommandError, BSONCommandResultMaker, Command}
case class Distinct(
  collectionName: String,
  field: String,
  query: Option[BSONDocument] = None
) extends Command[Seq[String]] {
  override def makeDocuments = BSONDocument(
    "distinct" -> BSONString(collectionName),
    "key" -> field,
    "query" -> query
  )
  val ResultMaker = Distinct
}
object Distinct extends BSONCommandResultMaker[Seq[String]] {
  import bsonFormatters._ // this does the trick
  def apply(document: BSONDocument) = CommandError.checkOk(
    document,
    Some("distinct")
  ).toLeft(
    document.getAs[List[String]]("values").getOrElse(List.empty)
  )
}
I hope it helps.
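A usage sketch for this command (the collection and field names here are made up, and db is assumed to be a connected DefaultDB using the same pre-0.11 command API):
import scala.concurrent.Future
import scala.concurrent.ExecutionContext.Implicits.global

// returns the distinct "projectId" values of the "projects" collection as strings
val distinctIds: Future[Seq[String]] = db.command(Distinct("projects", "projectId"))
distinctIds.foreach(ids => println(ids.mkString(", ")))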

Nested document with reactive mongo and Scala

I'm trying to store a nested document in MongoDB through Scala. The document looks like:
Project {
  "_id": ObjectId("528547370cf6e41449003512"),
  "highLevelCode": NumberLong(3),
  "description": [
    {
      "_id": ObjectId("528547370cf6e41449003521"),
      "lang": "en",
      "desc": "desc in English"
    },
    {
      "_id": ObjectId("528547370cf6e41449003522"),
      "lang": "fr",
      "desc": "desc en francais"
    }
  ],
  "budget": NumberLong(12345)
}
Basically I want to store nested descriptions, which could be in multiple languages, in the Project document.
The code I wrote is:
import reactivemongo.bson._
import reactivemongo.bson.handlers.{BSONWriter, BSONReader}
import reactivemongo.bson.BSONLong
import reactivemongo.bson.BSONString
case class LocaleText(
  id: Option[BSONObjectID],
  lang: String,
  textDesc: String
)
object LocaleText {
  implicit object LocaleTextBSONReader extends BSONReader[LocaleText] {
    def fromBSON(document: BSONDocument): LocaleText = {
      val doc = document.toTraversable
      LocaleText(
        doc.getAs[BSONObjectID]("_id"),
        doc.getAs[BSONString]("lang").map(_.value).get,
        doc.getAs[BSONString]("textDesc").map(_.value).get
      )
    }
  }
  implicit object LocaleTextBSONWriter extends BSONWriter[LocaleText] {
    def toBSON(localText: LocaleText) = {
      BSONDocument(
        "_id" -> localText.id.getOrElse(BSONObjectID.generate),
        "lang" -> BSONString(localText.lang),
        "textDesc" -> BSONString(localText.textDesc)
      )
    }
  }
}
case class Project(
  id: Option[BSONObjectID],
  description: List[LocaleText],
  budget: Option[Long]
)
object Project {
  implicit object ProjectReader extends BSONReader[Project] {
    def fromBSON(doc: BSONDocument): Project = {
      val document = doc.toTraversable
      Project(
        document.getAs[BSONObjectID]("_id"),
        document.getAs[BSONArray]("description").map { values =>
          values.values.toList.flatMap { case value =>
            value match {
              case v: LocaleText => Some(v.asInstanceOf[LocaleText])
              case _ => None
            }
          }
        }.getOrElse(List.empty),
        document.getAs[BSONLong]("budget").map(_.value)
      )
    }
  }
  implicit object ProjectWriter extends BSONWriter[Project] {
    def toBSON(project: Project): BSONDocument = {
      BSONDocument(
        "_id" -> project.id.getOrElse(BSONObjectID.generate),
        "description" -> BSONArray(project.description)
      ).append(Seq(
        project.budget.map(b => "budget" -> BSONLong(b))
      ).flatten: _*)
    }
  }
}
However, it gave me a compilation error like:
overloaded method value apply with alternatives: [error] (producer: reactivemongo.bson.Implicits.Producer[(String, reactivemongo.bson.BSONValue)],producers: reactivemongo.bson.Implicits.Producer[(String, reactivemongo.bson.BSONValue)])reactivemongo.bson.AppendableBSONDocument
[error] (els: (String, reactivemongo.bson.BSONValue))reactivemongo.bson.AppendableBSONDocument
[error] cannot be applied to ((String, reactivemongo.bson.BSONObjectID), List[LocaleText])...
Basically Scala doesn't like the line
"description" -> BSONArray(project.description)
However, the following alternative works, although I cannot use a List/Array to allow more than two languages:
case class LocaleText(
  enDesc: String,
  frDesc: String)
case class Project(
  id: Option[BSONObjectID],
  description: LocaleText)
object Project {
  implicit object LocaleTextBSONReader extends BSONReader[LocaleText] {
    def fromBSON(document: BSONDocument): LocaleText = {
      val doc = document.toTraversable
      LocaleText(
        doc.getAs[BSONString]("enDesc").map(_.value).get,
        doc.getAs[BSONString]("frDesc").map(_.value).get
      )
    }
  }
  implicit object LocaleTextBSONWriter extends BSONWriter[LocaleText] {
    def toBSON(localText: LocaleText) = {
      BSONDocument(
        "enDesc" -> BSONString(localText.enDesc),
        "frDesc" -> BSONString(localText.frDesc)
      )
    }
  }
  implicit object ProjectReader extends BSONReader[Project] {
    def fromBSON(doc: BSONDocument): Project = {
      val document = doc.toTraversable
      Project(
        document.getAs[BSONObjectID]("_id"),
        LocaleTextBSONReader.fromBSON(document.getAs[BSONDocument]("description").get)
      )
    }
  }
  implicit object ProjectWriter extends BSONWriter[Project] {
    def toBSON(project: Project): BSONDocument = {
      BSONDocument(
        "_id" -> project.id.getOrElse(BSONObjectID.generate),
        "description" -> LocaleTextBSONWriter.toBSON(project.description)
      )
    }
  }
}
How can I convert project.description, which is a List of LocaleText, to a BSONArray for Mongo? I'd appreciate it if you could shed some light on my problem. Thank you very much for your help.
Finally I found the solution to my own question; I hope this will help others who struggle with ReactiveMongo 0.8 as well:
case class LocaleText(
  lang: String,
  desc: String)
case class Project(
  id: Option[BSONObjectID],
  descriptions: List[LocaleText])
object Project {
  implicit object LocaleTextBSONReader extends BSONReader[LocaleText] {
    def fromBSON(document: BSONDocument): LocaleText = {
      val doc = document.toTraversable
      LocaleText(
        doc.getAs[BSONString]("lang").get.value,
        doc.getAs[BSONString]("desc").get.value
      )
    }
  }
  implicit object LocaleTextBSONWriter extends BSONWriter[LocaleText] {
    def toBSON(localText: LocaleText) = {
      BSONDocument(
        "lang" -> BSONString(localText.lang),
        "desc" -> BSONString(localText.desc)
      )
    }
  }
  implicit object ProjectReader extends BSONReader[Project] {
    def fromBSON(doc: BSONDocument): Project = {
      val document = doc.toTraversable
      Project(
        document.getAs[BSONObjectID]("_id"),
        document.getAs[BSONArray]("descriptions").get.toTraversable.toList.map { descText =>
          LocaleTextBSONReader.fromBSON(descText.asInstanceOf[TraversableBSONDocument])
        }
      )
    }
  }
  implicit object ProjectWriter extends BSONWriter[Project] {
    def toBSON(project: Project): BSONDocument = {
      BSONDocument(
        "_id" -> project.id.getOrElse(BSONObjectID.generate),
        "descriptions" -> BSONArray(project.descriptions.map {
          description => LocaleTextBSONWriter.toBSON(description)
        }: _*)
      )
    }
  }
}
It might be an issue in the library. I tested your code using the latest version of ReactiveMongo and it compiled just fine (I needed to adapt your code to fit the new syntax for BSONReaders and BSONWriters, but that shouldn't have any influence on the error).
Using ReactiveMongo 0.10.0 you can even use the newly provided macros:
import reactivemongo.bson._
case class LocaleText(id: Option[BSONObjectID], lang: String, textDesc: String)
object LocaleText {
  implicit val localeTextBSONHandler = Macros.handler[LocaleText]
}
case class Project(id: Option[BSONObjectID], description: List[LocaleText], budget: Option[Long])
object Project {
  implicit val projectBSONHandler = Macros.handler[Project]
}
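A quick usage sketch of the generated handlers (the values are made up):
val project = Project(
  id = Some(BSONObjectID.generate),
  description = List(LocaleText(None, "en", "desc in English"), LocaleText(None, "fr", "desc en francais")),
  budget = Some(12345L)
)

// the macro-generated handler is both a BSONDocumentWriter and a BSONDocumentReader
val doc: BSONDocument = Project.projectBSONHandler.write(project)
val back: Project = Project.projectBSONHandler.read(doc)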
