I think there should be an easy solution for this, but I wasn't able to find it.
I'm accessing data from MongoDB with the following in Scala:
val search = MongoDBObject("_id" -> new ObjectId("xxx"))
val fields = MongoDBObject("community.member.name" -> 1, "community.member.age" -> 1)
for (res <- mongoColl.find(search, fields)) {
var memberInfo = res.getAs[BasicDBObject]("community").get
println(memberInfo)
}
and get a BasicDBObject as result:
{
  "member" : [
    {
      "name" : "John Doe",
      "age" : "32"
    },
    {
      "name" : "Jane Doe",
      "age" : "29"
    },
    ...
  ]
}
I know that I can access values with getAs[String], though this is not working here.
Does anyone have an idea? I've been searching for a solution for several hours.
If you're working with complex MongoDB objects, you can use Salat, which provides simple case class serialization.
Sample with your data:
case class Community(members:Seq[Member], _id: ObjectId = new ObjectId)
case class Member(name:String, age:Int)
val mongoColl: MongoCollection = ??? // your collection
val dao = new SalatDAO[Community, ObjectId](mongoColl) {}
val community = Community(Seq(Member("John Doe", 32), Member("Jane Doe", 29)))
dao.save(community)
for {
c <- dao.findOneById(community._id)
m <- c.members
} println("%s (%s)" format (m.name, m.age))
I think you should try:
val member = memberInfo.as[MongoDBList]("member").as[BasicDBObject](0)
println(member("name"))
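Modeling memberInfo with plain Scala collections (a sketch, not casbah's actual types), the same two-step lookup, list field then index, looks like this:

```scala
// memberInfo modeled as a Map; casbah's .as performs the cast for you
val memberInfo: Map[String, Any] = Map(
  "member" -> List(
    Map("name" -> "John Doe", "age" -> "32"),
    Map("name" -> "Jane Doe", "age" -> "29")
  )
)

// Step 1: pull out the list field; step 2: index into it
val member = memberInfo("member").asInstanceOf[List[Map[String, String]]](0)
println(member("name")) // John Doe
```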
This problem doesn't really have to do with MongoDB, but rather with your data structure. Your JSON/BSON data structure includes:
an object community, which includes
an array of members, where
each member has the properties name and age.
Your problem is completely equivalent to the following:
case class Community(members: List[Member])
case class Member(name: String, age: Int)
val a = List(member1, member2) // two Member instances
// a.name does not compile: name is a property defined on a Member, not on the List
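Continuing the analogy, to reach name you traverse the list, e.g. with map:

```scala
case class Member(name: String, age: Int)
case class Community(members: List[Member])

val community = Community(List(Member("John Doe", 32), Member("Jane Doe", 29)))

// Traverse the list to reach each member's fields
val names = community.members.map(_.name)
println(names) // List(John Doe, Jane Doe)
```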
Yes, you can do this beautifully with comprehensions. You could do the following:
for {
  record    <- mongoColl.find(search, fields).toList
  community <- record.getAs[MongoDBObject]("community")
  members   <- community.getAs[MongoDBList]("member")
  member    <- members.collect { case m: BasicDBObject => m }
  name      <- member.getAs[String]("name")
} yield name
This works to get just the name. To get multiple values, I think you would do:
for {
  record    <- mongoColl.find(search, fields).toList
  community <- record.getAs[MongoDBObject]("community")
  members   <- community.getAs[MongoDBList]("member")
  member    <- members.collect { case m: BasicDBObject => m }
  field     <- List("name", "age")
} yield member.get(field).toString
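The flattening the comprehension performs can be modeled with plain Scala collections (using asInstanceOf in place of casbah's typed getAs):

```scala
// The BSON document modeled as nested plain Scala collections
val record: Map[String, Any] = Map(
  "community" -> Map(
    "member" -> List(
      Map("name" -> "John Doe", "age" -> "32"),
      Map("name" -> "Jane Doe", "age" -> "29")
    )
  )
)

// Each generator either narrows an Option or iterates a List;
// the comprehension flattens the nesting, just like getAs does
val names = for {
  community <- record.get("community").toList
  members   <- community.asInstanceOf[Map[String, Any]].get("member").toList
  member    <- members.asInstanceOf[List[Map[String, String]]]
  name      <- member.get("name")
} yield name

println(names) // List(John Doe, Jane Doe)
```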
I was trying Active-Slick and was able to run the example at https://github.com/reactivemaster/active-slick-example
But I am not sure how to manage associations using Active-Slick. Please provide an example.
I also tried to achieve it using the method below, but I'm not sure whether it is a good approach, and whether it still qualifies as the active record pattern.
BookService.scala
val book = Book(None, "Harry Potter")
val action = for {
  id <- bookDao.insert(book)
  y  <- authorDao.insert(Author(None, id, "J.K. Rowling"))
} yield y
db.run(action.transactionally)
We use UUIDs for the ID column, and they are generated in the Scala code, not by the database. I don't know how this will work with your "active record pattern", but it is nice because you can associate objects all you want before having to talk to the database. I also prefer this typed Id[T] over individual types like BookId and AuthorId.
case class Id[+T](value: String) extends MappedTo[String]
object Id {
  def generate[T]: Id[T] = Id[T](java.util.UUID.randomUUID().toString)
}
case class Author(id: Id[Author], name: String)
case class Book(id: Id[Book], title: String, authorId: Id[Author])
val newAuthor = Author(Id.generate, "JK Rowling")
val newBook = Book(Id.generate, "Harry Potter", newAuthor.id)
// do other stuff?
val action = for {
_ <- authorDao.insert(newAuthor)
_ <- bookDao.insert(newBook)
} yield 1
db.run(action.transactionally)
Hope this helps.
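A minimal, dependency-free sketch of the typed-Id idea (dropping Slick's MappedTo mixin so it runs standalone):

```scala
// Typed wrapper around a UUID string; the phantom type T keeps
// Id[Author] and Id[Book] from being mixed up at compile time
case class Id[+T](value: String)

object Id {
  def generate[T]: Id[T] = Id[T](java.util.UUID.randomUUID().toString)
}

case class Author(id: Id[Author], name: String)
case class Book(id: Id[Book], title: String, authorId: Id[Author])

val author = Author(Id.generate, "JK Rowling")
val book   = Book(Id.generate, "Harry Potter", author.id)

// Ids exist before any database round-trip
println(book.authorId == author.id) // true
```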
I noticed that the Scala driver (version 1.2.1) writes Option values of None as null to the corresponding field. I would prefer omitting the field completely in this case. Is this possible?
Example
case class Test(foo: Option[String])
persist(Test(None))
leads to
> db.test.find()
{ "_id": "...", "foo": null }
but I want to achieve
> db.test.find()
{ "_id": "..." }
When I used casbah, I think my intended behaviour was the default.
Now you can use macros for it (see http://mongodb.github.io/mongo-scala-driver/2.4/bson/macros/):
val testCodec = Macros.createCodecProviderIgnoreNone[Test]()
and in codec conf:
lazy val codecRegistry: CodecRegistry = fromRegistries(fromProviders(testCodec))
I opened a feature request in the MongoDB bug tracker (https://jira.mongodb.org/browse/SCALA-294), which was answered by Ross Lawley. He suggests changing the conversion code (from case class to document) from
def toDocument(t: Test) = Document("foo" -> t.foo)
to something like
// Document is immutable, so build the document with the field
// included only when the Option is defined
def toDocument(t: Test) =
  t.foo.fold(Document())(value => Document("foo" -> value))
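The general pattern, appending the optional field only when it is defined, can be modeled with a plain Map (the "kind" field here is just a hypothetical always-present field):

```scala
case class Test(foo: Option[String])

// Start from the always-present fields, then append the optional
// entry; None contributes nothing, so the key is omitted entirely
def toFields(t: Test): Map[String, String] =
  Map("kind" -> "test") ++ t.foo.map("foo" -> _)

println(toFields(Test(Some("bar")))) // Map(kind -> test, foo -> bar)
println(toFields(Test(None)))        // Map(kind -> test)
```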
Let's say I have a config file with the following:
someConfig: [
{"t1" :
[ {"t11" : "v11",
"t12" : "v12",
"t13" : "v13",
"t14" : "v14",
"t15" : "v15"},
{"t21" : "v21",
"t22" : "v22",
"t23" : "v13",
"t24" : "v14",
"t25" : "v15"}]
},
{"p1" :
[ {"p11" : "k11",
"p12" : "k12",
"p13" : "k13",
"p14" : "k14",
"p15" : "k15"},
{"p21" : "k21",
"p22" : "k22",
"p23" : "k13",
"p24" : "k14",
"p25" : "k15"}]
}
]
I would like to retrieve it as a Scala immutable collection Map[String, List[Map[String, String]]].
Using the following code I am only able to retrieve it as a List of HashMaps (more precisely a $colon$colon, i.e. ::, of HashMap), which fails when I try to iterate through it. Ideally, to complete my code, I need a way to convert the HashMaps to Scala maps.
def example: Map[String, List[Map[String,String]]] = {
val tmp = ConfigFactory.load("filename.conf")
val mylist : Iterable[ConfigObject] = tmp.getObjectList("someConfig")
.asScala
(for {
item : ConfigObject <- mylist
myEntry: Entry[String, ConfigValue] <- item.entrySet().asScala
name = myEntry.getKey
value = myEntry.getValue.unwrapped()
.asInstanceOf[util.ArrayList[Map[String,String]]]
.asScala.toList
} yield (name, value)).toMap
}
This code should give you what you are looking for.
It builds up lists and maps for your bespoke structure.
The final reduceLeft is needed because your JSON starts with a list, someConfig: [ ], so I've flattened that out. You could probably remove the [ ]'s, as they are probably not required to represent the data you have.
import scala.collection.JavaConverters._

// These methods convert from Java lists/maps to Scala ones, so it's easier to use
private def toMap(hashMap: AnyRef): Map[String, AnyRef] =
  hashMap.asInstanceOf[java.util.Map[String, AnyRef]].asScala.toMap
private def toList(list: AnyRef): List[AnyRef] =
  list.asInstanceOf[java.util.List[AnyRef]].asScala.toList

val someConfig: Map[String, List[Map[String, String]]] =
  config.getList("someConfig").unwrapped().asScala.map { someConfigItem =>
    toMap(someConfigItem) map {
      case (key, value) =>
        key -> toList(value).map {
          x => toMap(x).map { case (k, v) => k -> v.toString }
        }
    }
  }.reduceLeft(_ ++ _)
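The heavy lifting here is the Java-to-Scala conversion; a self-contained sketch using scala.jdk.CollectionConverters (the 2.13+ successor to JavaConverters):

```scala
import scala.jdk.CollectionConverters._

// Build the kind of nested Java structure Typesafe Config's unwrapped() returns
val javaInner = new java.util.HashMap[String, AnyRef]()
javaInner.put("t11", "v11")
javaInner.put("t12", "v12")

val javaList = new java.util.ArrayList[AnyRef]()
javaList.add(javaInner)

// Convert to immutable Scala collections, stringifying the values
val scalaList: List[Map[String, String]] =
  javaList.asScala.toList.map { item =>
    item.asInstanceOf[java.util.Map[String, AnyRef]].asScala.toMap
      .map { case (k, v) => k -> v.toString }
  }

println(scalaList) // List(Map(t11 -> v11, t12 -> v12))
```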
If you store your configs in application.conf like this:
someConfig{
list1{
value1 = "myvalue1"
value2 = "myvalue2"
.....
valueN = "myvalueN"
}
list2{
......
}
.....
listN{
......
}
}
you can do the following:
val myconfig = ConfigFactory.load().getObject("someConfig.list1").toConfig
and afterwards you can access the values like
myconfig.getString("value1")
myconfig.getString("value2")
etc.
which will return the strings "myvalue1", "myvalue2".
Not the most elegant way, but plain and easy.
Compilation error:
No Json serializer found for type Seq[(models.Account, models.Company)]. Try to implement an implicit Writes or Format for this type.
How can I define an implicit writes for the result of a join query?
Controller:
def someEndpoint = Action.async { implicit request =>
val query = for {
a <- accounts if a.id === 10
c <- companies if a.companyID === c.id
} yield (a, c)
db.run(query.result).map(rows => Ok(Json.toJson(rows))) // Causes compilation error
}
Each of my models (Account and Company) has its own implicit Writes (here's the Company one):
case class Company(id: Int, name: String)
object Company {
implicit val writes = new Writes[Company] {
def writes(company: Company): JsValue = {
Json.obj(
"id" -> company.id,
"name" -> company.name
)
}
}
}
Is it possible to dynamically handle serializations for joins? I have a lot of things I will be joining together... Do I need to explicitly define a writes for each combination?
Writes.seq will help you. First define a small writer:
import play.api.libs.json._
import play.api.libs.functional.syntax._

val w = (
  (__ \ "account").write[Account] and
  (__ \ "company").write[Company]
).tupled
which lets you transform a Seq[(models.Account, models.Company)] to a JsValue with
Writes.seq(w).writes(rows)
so the last command becomes
db.run(query.result).map(rows => Ok(Writes.seq(w).writes(rows)))
or a clearer variant:
db.run(query.result)
.map(
_.map{
case (a,c) => Json.obj("account" -> a, "company" -> c)
}
)
.map(rows =>
Ok(JsArray(rows))
)
It's the same thing, but you create the object for every row yourself.
I think you expect the response of your query in JSON to be something like
[
{
"account" : { "number": "123", "companyID" : 1 },
"company" : { "id" : 1, "name" : "My company"}
} , ...
]
The problem is that the result of the query is just a tuple, so the "account" and "company" keys are not readily available.
Instead of a tuple you could create a new case class with the joined data, but I understand you want to avoid that. In that case, instead of a tuple you can use a Map, which will convert to JSON automatically.
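A sketch of that row-by-row transformation (with the final Json.toJson step omitted, since it needs the play-json runtime):

```scala
case class Account(number: String, companyID: Int)
case class Company(id: Int, name: String)

// A joined result, as Slick would return it: a sequence of tuples
val rows: Seq[(Account, Company)] =
  Seq((Account("123", 1), Company(1, "My company")))

// Turn each anonymous tuple into a Map with named keys, which a
// JSON library can then serialize with the models' own writers
val keyed: Seq[Map[String, Any]] =
  rows.map { case (a, c) => Map("account" -> a, "company" -> c) }

println(keyed.head("company")) // Company(1,My company)
```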
Extra: Creating writers for case classes is very simple
import play.api.libs.json._
implicit val personWrites = Json.writes[Person]
Reference: https://www.playframework.com/documentation/2.4.x/ScalaJsonInception
I'm using Lift JSON's for-comprehensions to parse some JSON. The JSON is recursive, so e.g. the field id exists at each level. Here is an example:
val json = """
{
  "id": 1,
  "children": [
    {
      "id": 2
    },
    {
      "id": 3
    }
  ]
}
"""
The following code
val ids = for {
JObject(parent) <- parse(json)
JField("id", JInt(id)) <- parent
} yield id
println(ids)
produces List(1, 2, 3). I was expecting it to produce List(1).
In my program this results in quadratic computation, though I only need linear.
Is it possible to use for-comprehensions to match the top level id fields only?
I haven't delved deep enough to figure out why the default comprehension is recursive; however, you can solve this by simply qualifying your search root:
scala> for ( JField( "id", JInt( id ) ) <- parent.children ) yield id
res4: List[BigInt] = List(1)
Note the use of parent.children.
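The difference can be modeled with a plain recursive structure, where the default behaviour corresponds to visiting every node:

```scala
case class Node(id: BigInt, children: List[Node])

val tree = Node(1, List(Node(2, Nil), Node(3, Nil)))

// Recursive traversal collects every id, like the default comprehension
def allIds(n: Node): List[BigInt] = n.id :: n.children.flatMap(allIds)

// Restricting the search root to the top level yields only the root's id
val topLevel = List(tree.id)

println(allIds(tree)) // List(1, 2, 3)
println(topLevel)     // List(1)
```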