Lazy loading a collection in play-salat - mongodb

Is it possible to load a collection lazily with Salat?
For example, I have an object like this:
Example 1 (in this case, the whole user list is loaded when retrieving the object)
case class Test(
  @Key("_id") _id: ObjectId = new ObjectId,
  name: String,
  users: List[User]) {
}
or Example 2 (the object is loaded without the list, but I have no idea how to get the users list)
case class Test(
  @Key("_id") _id: ObjectId = new ObjectId,
  name: String) {
  @Persist val users: List[User] = List()
}
How can I load the object in the first example without the users list?
or: How can I load the users list in the second example?
Thanks in advance!

Salat author here.
Salat doesn't have anything like ORM lazy loading. The @Persist annotation is meant to persist fields outside of the constructor, but it suppresses deserialization because only fields in the constructor are deserialized.
But you can easily decide when making the query whether you want the list of users or not.
case class Test(@Key("_id") _id: ObjectId = new ObjectId, name: String, users: List[User] = Nil)
You can persist the users as embedded documents inside the test document, and then use the second argument to find, the field selection (projection), to exclude (0) or include (1) fields in the returned object.
TestDAO.find(/* query */, MongoDBObject("users" -> 0))
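For example, assuming the default users = Nil from the case class above (the query filter here is just a placeholder), every test can be fetched without its embedded users; the excluded field simply falls back to the constructor default:
// hedged sketch: the filter is arbitrary; excluding "users" means the
// deserialized Test gets the default value Nil for that field
val testsWithoutUsers: List[Test] =
  TestDAO.find(MongoDBObject("name" -> "test"), MongoDBObject("users" -> 0)).toList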
The other strategy is to break out user documents into a child collection - see https://github.com/novus/salat/wiki/ChildCollection for more information. In this example, Test is the "parent" and User is the "child".
The strategy there is that in the parent DAO, when saving, you override the save methods to save users using the child DAO, and then save the parent object with users set to Nil.
Then, by default, a Test instance is retrieved with users set to Nil.
If you want to retrieve Test with users, you will need to add a find method to your DAO that manually does the following (a rough sketch is given after the list):
find the test document
use the _id field of the test document to query for user documents by parent id - this will yield List[User]
deserialize the test document to an instance of Test using grater[Test] and copy it with the list of users
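A rough sketch of those steps, assuming a child SalatDAO named UserDAO keyed by a hypothetical parentId field on User, and with placeholder database/collection names (the real pattern, including the overridden save methods, is on the ChildCollection wiki page):
import com.mongodb.casbah.Imports._
import com.novus.salat._
import com.novus.salat.global._
import com.novus.salat.dao.SalatDAO

object TestDAO extends SalatDAO[Test, ObjectId](collection = MongoConnection()("mydb")("test")) {

  // 1. find the test document, 2. query the child collection by parent id,
  // 3. deserialize with grater[Test] and copy in the users
  def findByIdWithUsers(id: ObjectId): Option[Test] =
    collection.findOne(MongoDBObject("_id" -> id)).map { dbo =>
      val users = UserDAO.find(MongoDBObject("parentId" -> id)).toList
      grater[Test].asObject(dbo).copy(users = users)
    }
}
Saving goes the other way: persist each user through the child DAO with its parent id set, then save the Test itself with users = Nil, as described above.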

Related

BsonInvalidOperationException in ktor

Being new to MongoDB, I'm currently integrating the kMongo library into my Ktor project, and trying to create a database to read & write event models to.
Following the instructions for object mapping in the kMongo user manual, I've created a mongoId field which gets serialised as a String named _id.
My event model is a data class, nested in sealed classes, but it gets serialised correctly by KotlinX-Serialization. The model looks like this:
sealed class Event {
    @SerialName("_id") abstract val mongoId: String
    abstract val id: ID.Event
    abstract val dateTime: LocalDateTime

    fun asString() = id.toString()

    sealed class Hiring : Event() {
        @SerialName("_id") abstract override val mongoId: String
        abstract override val id: ID.Event
        abstract override val dateTime: LocalDateTime

        @Serializable
        data class Start(
            override val id: ID.Event,
            override val dateTime: LocalDateTime,
            val hiringDetailsId: ID.HiringDetails
        ) : Hiring() {
            @SerialName("_id") override val mongoId: String = id.asString()
        }
...
In a repository class, I initialise MongoDB and use the generic, parameter-less find() on a collection to retrieve all Event models from the database:
...
private val kmongo = KMongo.createClient().coroutine.client
private val db = kmongo.getDatabase("test")
private val eventCollection = db.getCollection<Event>().coroutine
...
override suspend fun getAllEvents() = eventCollection.find().toList()
Then inside of the Main class, I try to load the Event data on a click trigger:
...
val id = ID.Event(UUID())
...
it.on.click {
    runBlocking {
        val events = eventRepo.getAllEvents().toString()
        logger.debug { events }
    }
}
The strange part starts here: the server starts correctly and MongoDB is initialised correctly, but as soon as I try to do the read on the click trigger, I am presented with the following error:
org.bson.BsonInvalidOperationException: readString can only be called when CurrentBSONType is STRING, not when CurrentBSONType is DOCUMENT.
at org.bson.AbstractBsonReader.verifyBSONType(AbstractBsonReader.java:689)
at org.bson.AbstractBsonReader.checkPreconditions(AbstractBsonReader.java:721)
at org.bson.AbstractBsonReader.readString(AbstractBsonReader.java:456)
at com.github.jershell.kbson.FlexibleDecoder.decodeString(BsonFlexibleDecoder.kt:130)
at kotlinx.serialization.encoding.AbstractDecoder.decodeStringElement(AbstractDecoder.kt:58)
at kotlinx.serialization.internal.AbstractPolymorphicSerializer.deserialize(AbstractPolymorphicSerializer.kt:52)
at kotlinx.serialization.encoding.Decoder$DefaultImpls.decodeSerializableValue(Decoding.kt:257)
at kotlinx.serialization.encoding.AbstractDecoder.decodeSerializableValue(AbstractDecoder.kt:16)
at org.litote.kmongo.serialization.SerializationCodec.decode(SerializationCodec.kt:66)
at com.mongodb.internal.operation.CommandResultArrayCodec.decode(CommandResultArrayCodec.java:52)
at com.mongodb.internal.operation.CommandResultDocumentCodec.readValue(CommandResultDocumentCodec.java:60)
at org.bson.codecs.BsonDocumentCodec.decode(BsonDocumentCodec.java:87)
at org.bson.codecs.BsonDocumentCodec.decode(BsonDocumentCodec.java:42)
at org.bson.internal.LazyCodec.decode(LazyCodec.java:48)
at org.bson.codecs.BsonDocumentCodec.readValue(BsonDocumentCodec.java:104)
at com.mongodb.internal.operation.CommandResultDocumentCodec.readValue(CommandResultDocumentCodec.java:63)
at org.bson.codecs.BsonDocumentCodec.decode(BsonDocumentCodec.java:87)
at org.bson.codecs.BsonDocumentCodec.decode(BsonDocumentCodec.java:42)
at com.mongodb.internal.connection.ReplyMessage.<init>(ReplyMessage.java:51)
at com.mongodb.internal.connection.InternalStreamConnection.getCommandResult(InternalStreamConnection.java:535)
at com.mongodb.internal.connection.InternalStreamConnection.access$500(InternalStreamConnection.java:86)
at com.mongodb.internal.connection.InternalStreamConnection$2$1.onResult(InternalStreamConnection.java:520)
at com.mongodb.internal.connection.InternalStreamConnection$2$1.onResult(InternalStreamConnection.java:498)
at com.mongodb.internal.connection.InternalStreamConnection$MessageHeaderCallback$MessageCallback.onResult(InternalStreamConnection.java:821)
at com.mongodb.internal.connection.InternalStreamConnection$MessageHeaderCallback$MessageCallback.onResult(InternalStreamConnection.java:785)
at com.mongodb.internal.connection.InternalStreamConnection$5.completed(InternalStreamConnection.java:645)
at com.mongodb.internal.connection.InternalStreamConnection$5.completed(InternalStreamConnection.java:642)
at com.mongodb.internal.connection.AsynchronousChannelStream$BasicCompletionHandler.completed(AsynchronousChannelStream.java:250)
at com.mongodb.internal.connection.AsynchronousChannelStream$BasicCompletionHandler.completed(AsynchronousChannelStream.java:233)
at java.base/sun.nio.ch.Invoker.invokeUnchecked(Invoker.java:129)
at java.base/sun.nio.ch.Invoker.invokeDirect(Invoker.java:160)
at java.base/sun.nio.ch.UnixAsynchronousSocketChannelImpl.implRead(UnixAsynchronousSocketChannelImpl.java:573)
at java.base/sun.nio.ch.AsynchronousSocketChannelImpl.read(AsynchronousSocketChannelImpl.java:276)
at java.base/sun.nio.ch.AsynchronousSocketChannelImpl.read(AsynchronousSocketChannelImpl.java:297)
at com.mongodb.internal.connection.AsynchronousSocketChannelStream$AsynchronousSocketChannelAdapter.read(AsynchronousSocketChannelStream.java:144)
at com.mongodb.internal.connection.AsynchronousChannelStream.readAsync(AsynchronousChannelStream.java:118)
at com.mongodb.internal.connection.AsynchronousChannelStream.readAsync(AsynchronousChannelStream.java:107)
at com.mongodb.internal.connection.InternalStreamConnection.readAsync(InternalStreamConnection.java:642)
at com.mongodb.internal.connection.InternalStreamConnection.access$600(InternalStreamConnection.java:86)
at com.mongodb.internal.connection.InternalStreamConnection$MessageHeaderCallback.onResult(InternalStreamConnection.java:775)
at com.mongodb.internal.connection.InternalStreamConnection$MessageHeaderCallback.onResult(InternalStreamConnection.java:760)
at com.mongodb.internal.connection.InternalStreamConnection$5.completed(InternalStreamConnection.java:645)
at com.mongodb.internal.connection.InternalStreamConnection$5.completed(InternalStreamConnection.java:642)
at com.mongodb.internal.connection.AsynchronousChannelStream$BasicCompletionHandler.completed(AsynchronousChannelStream.java:250)
at com.mongodb.internal.connection.AsynchronousChannelStream$BasicCompletionHandler.completed(AsynchronousChannelStream.java:233)
at java.base/sun.nio.ch.Invoker.invokeUnchecked(Invoker.java:129)
at java.base/sun.nio.ch.UnixAsynchronousSocketChannelImpl.finishRead(UnixAsynchronousSocketChannelImpl.java:447)
at java.base/sun.nio.ch.UnixAsynchronousSocketChannelImpl.finish(UnixAsynchronousSocketChannelImpl.java:195)
at java.base/sun.nio.ch.UnixAsynchronousSocketChannelImpl.onEvent(UnixAsynchronousSocketChannelImpl.java:217)
at java.base/sun.nio.ch.KQueuePort$EventHandlerTask.run(KQueuePort.java:312)
at java.base/sun.nio.ch.AsynchronousChannelGroupImpl$1.run(AsynchronousChannelGroupImpl.java:113)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
at java.base/java.lang.Thread.run(Thread.java:833)
According to the stacktrace, something seems to go wrong in the BSON filtering part, despite there being no filtering at all. When I use MongoDB Compass to validate the object inside the database, I can see that everything is initialised and written perfectly fine:
The normal id field is used in my software internally as an ID.Event object type whilst the _id is used by Mongo internally.
Can someone point me to what the potential issue could be here?
I'm not familiar with Kotlin, but I'd like to dive into this a bit further:
If it weren't unwrapping the second time, MongoDB Compass would likely show the _id or id field containing a brace { whereas these are currently mapped as expected (a String for _id and an object for id).
To confirm, the current structure of your document is (e.g. here in the playground):
{
  _id: "7d51",
  id: {
    id: "7d51"
  },
  hiringDetailsId: {
    id: "8392"
  }
}
We can see that in your screenshot from Compass, where the _id field shows the value being the string directly, whereas the other two fields show that the values are Objects (each containing { id: "<string>" } values).
The error is specifically stating that the code is expecting a string but finding a document:
BsonInvalidOperationException: readString can only be called when CurrentBSONType is STRING, not when CurrentBSONType is DOCUMENT.
I can't speak to the internal unpacking, but it really feels to me like the nested id.id (and potentially also hiringDetailsId.id) is the problem here. Even if it isn't directly related, it would seem to be an opportunity to simplify the schema unless there is a compelling reason to introduce that extra level of nesting.
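Purely as an illustration of that simplification (reusing the values from the playground example above), a flattened document would look like:
{
  _id: "7d51",
  id: "7d51",
  hiringDetailsId: "8392"
}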

ReactiveMongoRepository / MongoRepository does not return _id field

I think this issue probably has to do with my Mongo document Kotlin data class, but for our business case we need to allow the user to add any JSON fields to describe their RF data set.
Extending BasicDBObject was the best way I have found.
The Mono being returned when I save a SigMfMetaDocument does not contain the _id field.
I cannot figure out why the save method does not return a Mono wrapping a SigMfMetaDocument with an _id.
If there is a better way to create a type for ReactiveMongoRepository that can dynamically accept any fields, I am all ears.
@Document(collection = "sigmfmeta")
class SigMfMetaDocument : BasicDBObject {
    @Id
    @JsonProperty("id")
    val id: String? = UUID.randomUUID().toString()

    constructor(map: Map<String, Any>) : super(map)
    constructor() : super()
    constructor(key: String, value: Object) : super()
}
@Repository
interface SigMfMetaRepository : ReactiveMongoRepository<SigMfMetaDocument, String>
So I found a way to solve this for my use case. I was originally assuming the description in the documentation for the save method would apply:
(Saves a given entity. Use the returned instance for further operations as the save operation might have changed the entity instance completely.)
My thought was that Mongo auto-inserting the _id value would fall under this description.
I changed my model to:
@Document(collection = "sigmfmeta")
class SigMfMetaDocument : BasicBSONObject {

    constructor(map: Map<String, Any>) : super(map) {
        val id = ObjectId()
        this.put("_id", id)
    }

    constructor() : super()
}
This way I have the _id value after saving for some business logic. Again, I defined my model this way because the metadata file we are accepting needs to allow a client to add any fields they wish to describe a binary file of RF measurement data.

F# insert on MongoDB using records

I've been trying for a while to insert on MongoDB using only records with no success.
My problem is that I want to create a simple insert function to which I can pass a generic type and have it inserted into the database.
Like so.
let insert (value: 'a) =
    let collection = MongoClient().GetDatabase("db").GetCollection<'a> "col"
    collection.InsertOne value
From this function, I tried inserting the following records.
// Error that it can't set the Id
type t1 = {
    Id: ObjectId
    Text: string
}

// Creates the record perfectly but doesn't generate a new Id
type t2 = {
    Id: string
    Text: string
}

// Creates the record and autogenerates the Id but doesn't insert the Text, and there are two Ids (_id, Id#)
type t3 = {
    mutable Id: ObjectId
    Text: string
}

// Creates the record and autogenerates the Id but for every property it generates two on MongoDB (_id, Id#, Text, Text#)
type t4 = {
    mutable Id: ObjectId
    mutable Text: string
}
So, can anyone think of a solution for this, or am I stuck having to use a class?
// Works!!!
type t5() =
    member val Id = ObjectId.Empty with get, set
    member val Name = "" with get, set
Also, does anyone have any idea why, when the C# MongoDB library translates the mutable fields, it generates properties with a # at the end?
I would be fine with having all my properties set as mutable, although this wouldn't be my first choice; having it create multiple properties in the DB is quite bad.
You could try annotating your records with the [<CLIMutable>] attribute (and no mutable fields).
The #s end up in the DB because MongoDB uses reflection, and F# implements mutable fields with backing fields named fieldName#.

Grails MongoDB Update object with Association

I can't seem to understand where I am going wrong when I attempt to update my User domain model, which has a hasOne association to a Profile object.
My domain models are as follows:
class User {
    static hasOne = [profile: Profile]
    static fetchMode = [profile: 'eager']

    ObjectId id
    String username
}

class Profile {
    static belongsTo = [user: User]

    ObjectId id
    String familyName
    String givenName
}
I am able to persist a User with a Profile initially, but when attempting to update the User object I get validation errors.
Validation error occurred during call to save():
- Field error in object 'co.suitable.User' on field 'profile.familyName': rejected value [null]
- Field error in object 'co.suitable.User' on field 'profile.givenName': rejected value [null]
I am able to print out the user.profile id and also the user.profile.familyName before saving the object, like the following:
println(user.profile.familyName)
println(user.profile.id.toString())
user.save(flush: true, failOnError: true)
But I still get the validation errors when saving. I'd imagine that the println(user.profile.familyName) call is fetching the profile object if it hasn't already been loaded, which I thought setting the fetchMode would have handled.
The object is able to successfully persist and save when I do:
user.profile = Profile.findById(user.profile.id)
println(user.profile.id.toString())
user.save(flush: true, failOnError: true)
I could wrap that in a service, but I was hoping for a solution that would be handled by Grails if possible. Any advice or thoughts are much appreciated.
You should not apply SQL-database logic to Mongo one-to-one. Mongo and other document-oriented DBs are not intended to store joins between collections. There are some workarounds, like DBRefs, but they are to be used with caution.
For your case - with hasOne - I would suggest using Mongo's subdocuments (mirrored as GORM's embedded objects) instead of referencing:
class User {
    ObjectId id
    String username

    Profile profile
    static embedded = ['profile']
}

class Profile {
    String familyName
    String givenName
}
Thus you use Mongo in accordance with its original purpose. Also, querying is simpler and faster.

How to implement Polymorphic objects in CouchDB / NoSQL?

I'd like to implement polymorphic objects in a NoSQL / document DB.
What is best practice?
Example:
Master Class
Item Object (All should have Item.Title, Item.Subtitle, Item.IconURL)
SubClasses: ItemPhoto, ItemPDF, ItemURL, ItemHTML
(Each subclass would have different properties)
I'd like to list all Items generically, then get specific data when I drill down.
Possible Options:
Save as two different documents, with master/child type & ID
Save all as subclass documents with an internal Item object
Other options??
Thanks
CouchDB stores documents (data), not classes (data with code). There is code in the map, validation, list, and show functions that handles documents, but those documents are plain objects that carry data only.
In your example, you can define a library function to check that a given document contains the data of an item, and then use this function to decide what to do. For example:
// in an "appTypes" library:
exports.isItem = function(doc) {
  return doc.Title && doc.Subtitle && doc.IconURL;
};

// in a map function
function(doc) {
  var appTypes = require('appTypes');
  if (appTypes.isItem(doc)) {
    // doc is an Item...
  }
}
Obviously you can put all code belonging to an Item in an Item class and create instances of that class initialized with the data in the doc. But that's your choice, and does not change how CouchDB will handle the document.