Is there a Deconstruct Mongo Response to DTO shortcut? - mongodb

If I have a table in MongoDB with five properties, and I only want to return four of them and none of the Mongo-added info such as __v, I can map the response to a DTO like so:
const product = await this.productModel.findById(productId).exec()
return { id: product.id, title: product.title }
Is there a deconstruct shortcut for the return, to extract every field of an interface (Product) from the product response, to save typing each property out? For example, if I'm returning 127 properties from a table of entries with 140.
interface Product {
  id: string
  title: string
  ...
}

Unfortunately no, TypeScript interfaces do not really exist once your program compiles.
An interface is a structure that defines a contract in your application. It defines the syntax for classes to follow: classes derived from an interface must follow the structure provided by that interface.
The TypeScript compiler does not convert interfaces to JavaScript; it uses interfaces for type checking. This is also known as "duck typing" or "structural subtyping".
So you can't really read interface fields at runtime and write logic around them (you could maybe achieve this through reflection, but it's bad practice).
An alternative is to explicitly define which fields to include in, or exclude from, your object.
Suppose that I have an object with this interface:
interface Foo {
  field1: string;
  field2: string;
  field3: string;
  .....
  field140: string;
}
What you can do here is define which properties you want to exclude (you take the exclude approach here since you are returning 127 fields out of 140):
// This isn't a real implementation with mongoose (if you are using it),
// it's just to give you the idea
const FIELDS_TO_EXCLUDE = ["field128", "field129", "field130", ..., "field140"];

productSchema.methods.toDTO = function () {
  // Work on a plain copy so the document itself isn't mutated.
  const documentData = this.toObject();
  FIELDS_TO_EXCLUDE.forEach(field => delete documentData[field]);
  return documentData;
};
This way, when you execute the toDTO function, you get back the document's data with the unwanted fields excluded (or, if you take the include approach instead, with only the fields you want).
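If the fields to drop are few and known ahead of time, object rest destructuring gives you a similar shortcut with no helper function at all. A minimal sketch, assuming a Mongoose document (hence toObject()) and the illustrative Foo field names from above:
// Pull the unwanted fields out explicitly; everything else lands in dto.
// TypeScript infers dto's type with those keys removed.
const { field128, field129, field140, ...dto } = product.toObject();
return dto;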


BsonInvalidOperationException in ktor

Being new to MongoDB, I'm currently integrating the KMongo library into my Ktor project and trying to create a database to read and write event models to.
Following the instructions for object mapping in the kMongo user manual, I've created a mongoId field which gets serialised as a String named _id.
My event model is a data class, nested in sealed classes, but it gets serialised correctly by KotlinX-Serialization. The model looks like this:
sealed class Event {
    @SerialName("_id") abstract val mongoId: String
    abstract val id: ID.Event
    abstract val dateTime: LocalDateTime

    fun asString() = id.toString()

    sealed class Hiring : Event() {
        @SerialName("_id") abstract override val mongoId: String
        abstract override val id: ID.Event
        abstract override val dateTime: LocalDateTime

        @Serializable
        data class Start(
            override val id: ID.Event,
            override val dateTime: LocalDateTime,
            val hiringDetailsId: ID.HiringDetails
        ) : Hiring() {
            @SerialName("_id") override val mongoId: String = id.asString()
        }
...
In a repository class, I initialise MongoDB and use the generic, parameter-less find() on a collection to retrieve all Event models from the database:
...
private val kmongo = KMongo.createClient().coroutine.client
private val db = kmongo.getDatabase("test")
private val eventCollection = db.getCollection<Event>().coroutine
...
override suspend fun getAllEvents() = eventCollection.find().toList()
Then inside of the Main class, I try to load the Event data on a click trigger:
...
val id = ID.Event(UUID())
...
it.on.click {
    runBlocking {
        val events = eventRepo.getAllEvents().toString()
        logger.debug { events }
    }
}
The strange part starts here: the server starts correctly and MongoDB is initialised correctly, but as soon as I try to do the read on the click trigger, I am presented with the following error:
org.bson.BsonInvalidOperationException: readString can only be called when CurrentBSONType is STRING, not when CurrentBSONType is DOCUMENT.
at org.bson.AbstractBsonReader.verifyBSONType(AbstractBsonReader.java:689)
at org.bson.AbstractBsonReader.checkPreconditions(AbstractBsonReader.java:721)
at org.bson.AbstractBsonReader.readString(AbstractBsonReader.java:456)
at com.github.jershell.kbson.FlexibleDecoder.decodeString(BsonFlexibleDecoder.kt:130)
at kotlinx.serialization.encoding.AbstractDecoder.decodeStringElement(AbstractDecoder.kt:58)
at kotlinx.serialization.internal.AbstractPolymorphicSerializer.deserialize(AbstractPolymorphicSerializer.kt:52)
at kotlinx.serialization.encoding.Decoder$DefaultImpls.decodeSerializableValue(Decoding.kt:257)
at kotlinx.serialization.encoding.AbstractDecoder.decodeSerializableValue(AbstractDecoder.kt:16)
at org.litote.kmongo.serialization.SerializationCodec.decode(SerializationCodec.kt:66)
at com.mongodb.internal.operation.CommandResultArrayCodec.decode(CommandResultArrayCodec.java:52)
at com.mongodb.internal.operation.CommandResultDocumentCodec.readValue(CommandResultDocumentCodec.java:60)
at org.bson.codecs.BsonDocumentCodec.decode(BsonDocumentCodec.java:87)
at org.bson.codecs.BsonDocumentCodec.decode(BsonDocumentCodec.java:42)
at org.bson.internal.LazyCodec.decode(LazyCodec.java:48)
at org.bson.codecs.BsonDocumentCodec.readValue(BsonDocumentCodec.java:104)
at com.mongodb.internal.operation.CommandResultDocumentCodec.readValue(CommandResultDocumentCodec.java:63)
at org.bson.codecs.BsonDocumentCodec.decode(BsonDocumentCodec.java:87)
at org.bson.codecs.BsonDocumentCodec.decode(BsonDocumentCodec.java:42)
at com.mongodb.internal.connection.ReplyMessage.<init>(ReplyMessage.java:51)
at com.mongodb.internal.connection.InternalStreamConnection.getCommandResult(InternalStreamConnection.java:535)
at com.mongodb.internal.connection.InternalStreamConnection.access$500(InternalStreamConnection.java:86)
at com.mongodb.internal.connection.InternalStreamConnection$2$1.onResult(InternalStreamConnection.java:520)
at com.mongodb.internal.connection.InternalStreamConnection$2$1.onResult(InternalStreamConnection.java:498)
at com.mongodb.internal.connection.InternalStreamConnection$MessageHeaderCallback$MessageCallback.onResult(InternalStreamConnection.java:821)
at com.mongodb.internal.connection.InternalStreamConnection$MessageHeaderCallback$MessageCallback.onResult(InternalStreamConnection.java:785)
at com.mongodb.internal.connection.InternalStreamConnection$5.completed(InternalStreamConnection.java:645)
at com.mongodb.internal.connection.InternalStreamConnection$5.completed(InternalStreamConnection.java:642)
at com.mongodb.internal.connection.AsynchronousChannelStream$BasicCompletionHandler.completed(AsynchronousChannelStream.java:250)
at com.mongodb.internal.connection.AsynchronousChannelStream$BasicCompletionHandler.completed(AsynchronousChannelStream.java:233)
at java.base/sun.nio.ch.Invoker.invokeUnchecked(Invoker.java:129)
at java.base/sun.nio.ch.Invoker.invokeDirect(Invoker.java:160)
at java.base/sun.nio.ch.UnixAsynchronousSocketChannelImpl.implRead(UnixAsynchronousSocketChannelImpl.java:573)
at java.base/sun.nio.ch.AsynchronousSocketChannelImpl.read(AsynchronousSocketChannelImpl.java:276)
at java.base/sun.nio.ch.AsynchronousSocketChannelImpl.read(AsynchronousSocketChannelImpl.java:297)
at com.mongodb.internal.connection.AsynchronousSocketChannelStream$AsynchronousSocketChannelAdapter.read(AsynchronousSocketChannelStream.java:144)
at com.mongodb.internal.connection.AsynchronousChannelStream.readAsync(AsynchronousChannelStream.java:118)
at com.mongodb.internal.connection.AsynchronousChannelStream.readAsync(AsynchronousChannelStream.java:107)
at com.mongodb.internal.connection.InternalStreamConnection.readAsync(InternalStreamConnection.java:642)
at com.mongodb.internal.connection.InternalStreamConnection.access$600(InternalStreamConnection.java:86)
at com.mongodb.internal.connection.InternalStreamConnection$MessageHeaderCallback.onResult(InternalStreamConnection.java:775)
at com.mongodb.internal.connection.InternalStreamConnection$MessageHeaderCallback.onResult(InternalStreamConnection.java:760)
at com.mongodb.internal.connection.InternalStreamConnection$5.completed(InternalStreamConnection.java:645)
at com.mongodb.internal.connection.InternalStreamConnection$5.completed(InternalStreamConnection.java:642)
at com.mongodb.internal.connection.AsynchronousChannelStream$BasicCompletionHandler.completed(AsynchronousChannelStream.java:250)
at com.mongodb.internal.connection.AsynchronousChannelStream$BasicCompletionHandler.completed(AsynchronousChannelStream.java:233)
at java.base/sun.nio.ch.Invoker.invokeUnchecked(Invoker.java:129)
at java.base/sun.nio.ch.UnixAsynchronousSocketChannelImpl.finishRead(UnixAsynchronousSocketChannelImpl.java:447)
at java.base/sun.nio.ch.UnixAsynchronousSocketChannelImpl.finish(UnixAsynchronousSocketChannelImpl.java:195)
at java.base/sun.nio.ch.UnixAsynchronousSocketChannelImpl.onEvent(UnixAsynchronousSocketChannelImpl.java:217)
at java.base/sun.nio.ch.KQueuePort$EventHandlerTask.run(KQueuePort.java:312)
at java.base/sun.nio.ch.AsynchronousChannelGroupImpl$1.run(AsynchronousChannelGroupImpl.java:113)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
at java.base/java.lang.Thread.run(Thread.java:833)
According to the stacktrace, something seems to go wrong in the BSON filtering part, despite there being no filtering in my query. When I use MongoDB Compass to validate the object inside the database, I can see that everything is initialised and written perfectly fine.
The normal id field is used in my software internally as an ID.Event object type whilst the _id is used by Mongo internally.
Can someone point me to what the potential issue could be here?
I'm not familiar with Kotlin, but I'd like to dive into this a bit further. If the value weren't being unwrapped the second time, MongoDB Compass would likely show the _id or id field containing a bracket {, whereas these are currently mapped as expected (a String for _id and an object for id).
To confirm, the current structure of your document is (e.g. here in the playground):
{
  _id: "7d51",
  id: {
    id: "7d51"
  },
  hiringDetailsId: {
    id: "8392"
  }
}
We can see that in your screenshot from Compass where the _id field shows the value being the string directly whereas the other two fields show that the values are Objects (that each contain { id: "<string>" } values).
The error is specifically stating that the code is expecting a string but finding a document:
BsonInvalidOperationException: readString can only be called when CurrentBSONType is STRING, not when CurrentBSONType is DOCUMENT.
I can't speak to the internal unpacking, but it really feels to me like the nested id.id (and potentially also hiringDetailsId.id) is the problem here. Even if it isn't directly related, it would seem to be an opportunity to simplify the schema unless there is a compelling reason to introduce that extra level of nesting.
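For illustration only, a flattened version of the same document, assuming the inner id strings carry all the information you need, might look like:
{
  _id: "7d51",
  id: "7d51",
  hiringDetailsId: "8392"
}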

Resolving auto-generated typescript-mongodb types for GraphQL output

I'm using the typescript-mongodb plugin for graphql-codegen to generate TypeScript types for pulling data from MongoDB and outputting it via GraphQL on Node.
My input GraphQL schema looks like this:
type User @entity {
  id: ID @id,
  firstName: String @column @map(path: "first_name"),
  ...
The generated output TypeScript types look correct:
export type User = {
  __typename?: 'User',
  id?: Maybe<Scalars['ID']>,
  firstName?: Maybe<Scalars['String']>,
  ...
And the corresponding DB object:
export type UserDbObject = {
  _id?: Maybe<String>,
  first_name: Maybe<string>,
  ...
The problem is that when actually sending back the Mongo document as a UserDbObject, I do not get the mapped fields in the output. I could write a custom resolver that re-maps the fields back to the User type, but that would mean I'm mapping the fields in two different places.
I.e. I do not get mapped fields from a resolver like this:
userById: async (_root: any, args: QueryUserByIdArgs, _context: any): Promise<UserDbObject> => {
  const result = await connectDb().then((db) => {
    return db.collection<UserDbObject>('users').findOne({ '_id': args.id }).then((doc) => {
      return doc;
    });
  });
  ...
  return result as UserDbObject;
}
};
Is there a way to use the typescript-mongodb plugin to only have to map these fields in the schema, then use the auto-generated code to resolve them?
You can use the mappers feature of codegen to map between your GraphQL types and your model types.
See:
https://graphql-code-generator.com/docs/plugins/typescript-resolvers#mappers---overwrite-parents-and-resolved-values
https://graphql-code-generator.com/docs/plugins/typescript-resolvers#mappers-object
Since all codegen plugins are independent and not linked together, you need to do this manually, with something like:
config:
  mappers:
    User: UserDbObject
This will make the typescript-resolvers plugin use UserDbObject everywhere (as the parent value, and as the return value).
If you wish to automate this, you can either use the codegen programmatically (https://graphql-code-generator.com/docs/getting-started/programmatic-usage), or you can create a .js file instead of a .yaml file that builds the config section according to your needs.
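For instance, a .js config along these lines could build the mappers section from a list of type names. This is only a sketch: the schema and output paths are placeholders, and the <Type>DbObject naming simply follows the plugin's convention shown above.
// codegen.js -- hypothetical programmatic config; adjust paths and type names.
const entityTypes = ['User'];

module.exports = {
  schema: './schema.graphql',
  generates: {
    './src/generated/types.ts': {
      plugins: ['typescript', 'typescript-resolvers', 'typescript-mongodb'],
      config: {
        // Map every GraphQL type to its generated DbObject counterpart.
        mappers: entityTypes.reduce(
          (acc, type) => ({ ...acc, [type]: `${type}DbObject` }),
          {}
        ),
      },
    },
  },
};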

Common fields on graphql interface type in react apollo with a graphene backend

I have a Python graphene-django backend with an interface and two types, let's say:
interface InterfaceType {
  id: ID!
  name: String!
}

type Type1 implements InterfaceType {
  aField: String
}

type Type2 implements InterfaceType {
  anotherField: String
}
I'm able to query this from my react-apollo frontend using inline fragments:
query {
  interfaceQuery {
    ... on Type1 {
      id
      name
    }
    ... on Type2 {
      id
      name
    }
  }
}
But from what I understand, it should also be possible to query the common fields simply as:
query {
  interfaceQuery {
    id
    name
  }
}
When I try this, however, I get the error Cannot query field "id" on type "InterfaceType". Did you mean to use an inline fragment on "Type1" or "Type2"?
I'm using an IntrospectionFragmentMatcher.
Am I misunderstanding and this kind of simple access of common fields is not possible, or is it just not implemented in either graphene or apollo?
If you're seeing that error, it's coming from the server, not Apollo. However, there's nothing specific to either Apollo or Graphene that should prevent you from querying the interface fields without a fragment.
The error is thrown because whatever schema you're using literally doesn't have that field on the provided type. Maybe you updated your schema without restarting the service, or were mistaken about what fields were actually part of the interface -- it's hard to know with only pseudocode provided.
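For reference, once the schema really does expose id and name on the interface, a query that selects the common fields once and only the type-specific fields through fragments would look like this. A sketch in react-apollo style; aField and anotherField are the per-type fields from the question:
import gql from 'graphql-tag';

// Common fields are selected directly on the interface type;
// only the per-type fields need inline fragments.
const INTERFACE_QUERY = gql`
  query {
    interfaceQuery {
      id
      name
      ... on Type1 {
        aField
      }
      ... on Type2 {
        anotherField
      }
    }
  }
`;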

graphql - Combining results from multiple resolvers into one

I have a set of functions on the server side, each of which returns a list of objects of the same type based on the parameters passed to the resolvers in the GraphQL query:
query {
  objListQuery {
    objResolver1(params) {
      obj-id
    }
    objResolver2(different params) {
      obj-id
    }
    ...
  }
}
Here, objResolver1 and objResolver2 send back a list of obj objects.
Server side:
function objResolver1(params) -> returns list of obj
function objResolver2(different params) -> returns list of obj
...
I want to perform a logical AND between the results of the resolvers, that is, find the common objects in the results of the different resolvers.
Instead of getting the individual lists, I only want the combined list.
One way is to aggregate the results at the client side but this will increase the amount of duplicated data sent by the server.
What is the best way to achieve this at the server side? What changes are required in the schema?
--------------------EDIT--------------------
The data source is a JSON array of obj objects, obtained from an external service at the server; it is not a database.
Each resolver takes one or many parameters, which are used for filtering the objects. For example, the data store will have the structure:
[
  {
    "dateCreated": "2011-08-12T20:17:46.384Z",
    "type": "customer",
    ....
  },
  {
    "dateCreated": "2011-08-14T20:17:46.384Z",
    "type": "test",
    ....
  }
]
The resolvers will be of the form:
dateResolver(String startDate, String endDate) -> returns list of obj whose dateCreated is within the range
typeResolver(String[] type) -> returns list of obj whose type is anyone of the values passed in the array.
Assuming you're using a database, you're essentially asking how to shift constraints from the database or repository layer up to the controller level.
While this perhaps has some weaknesses at the model level, whether you can easily change the objResolver depends on the class implementation; the idea is to build a single one that accepts more parameters, like this:
query {
  objListQuery {
    objResolver(params1, params2, constraint) {
      ...
    }
  }
}
This way you could create a database query that directly fetches the right result, or you could perform several queries and resolve them inside the objResolver. If the constraint is always AND, you could leave the parameter out, but perhaps you'd like to offer the possibility of OR, XOR, or others as well.
If the number of parameter sets is always 2, then it's as simple as my code above, optional constraint included. If the number of parameter sets is variable, i.e. 4 or 5, it gets complicated if you still want to offer the constraint parameter(s). Without constraint parameters it's simple: you can declare the function without parameters, check the number of arguments in the caller, and handle them accordingly, passing just as many parameters as required.
query {
  objListQuery {
    objResolver() {
      paramArray = getArguments();
    }
  }
}
As written above, it gets hard if you still want to offer constraint parameters here, but I'd suggest that's material for another question.
You can implement a Connection interface with a single resolver to allow a one-step querying mechanism. This technique reduces the number of query endpoints.
E.g., a query would look like:
allObjects(start: "01-01-2019", end: "04-29-2019", types: ["test", "sales"]) {
  nodes {
    id,
    dateCreated,
    type
  }
}
In the resolver, you can use these criteria to prepare and return the data.
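As a rough sketch of what that single resolver could do with the in-memory JSON array described in the edit (all names here are illustrative, not a fixed API):
interface Obj {
  dateCreated: string; // ISO-8601, e.g. "2011-08-12T20:17:46.384Z"
  type: string;
}

interface AllObjectsArgs {
  start?: string;
  end?: string;
  types?: string[];
}

// Applying every filter to the same array is the logical AND the question
// asks for: an object survives only if it matches all supplied criteria.
function allObjects(data: Obj[], args: AllObjectsArgs): Obj[] {
  return data
    .filter((o) =>
      args.start && args.end
        ? new Date(o.dateCreated) >= new Date(args.start) &&
          new Date(o.dateCreated) <= new Date(args.end)
        : true
    )
    .filter((o) => (args.types ? args.types.includes(o.type) : true));
}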
Benefits:
Less query endpoints.
Filtering and pagination.
Your filter interface can be quite fancy:
allObjects(
  dateCreated: {
    between: {
      start,
      end
    },
    skipWeekends: true
  },
  types: {
    include: [],
    exclude: []
  }
)
Add new criteria as your needs grow. Start with what you want and take it from there.
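If you're writing the resolver in TypeScript, the argument shape for that fancier filter might be typed roughly like this (illustrative only; mirror it to whatever criteria you actually expose):
interface DateRange {
  start: string;
  end: string;
}

interface DateCreatedFilter {
  between?: DateRange;
  skipWeekends?: boolean;
}

interface TypesFilter {
  include?: string[];
  exclude?: string[];
}

// Mirrors the query shape shown above.
interface AllObjectsFilter {
  dateCreated?: DateCreatedFilter;
  types?: TypesFilter;
}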

GraphQL: best way to manage mutations with interfaces?

I'm new to GraphQL but I really like it. Now that I'm playing with interfaces and unions, I'm facing a problem with mutations.
Suppose that I have this schema:
interface FoodType {
  id: String
  type: String
}

type Pizza implements FoodType {
  id: String
  type: String
  pizzaType: String
  toppings: [String]
  size: String
}

type Salad implements FoodType {
  id: String
  type: String
  vegetarian: Boolean
  dressing: Boolean
}

type BasicFood implements FoodType {
  id: String
  type: String
}
Now, I'd like to create new food items, so I started doing something like this:
type Mutation {
  addPizza(input: Pizza): FoodType
  addSalad(input: Salad): FoodType
  addBasic(input: BasicFood): FoodType
}
This did not work, for two reasons:
If I want to pass an object as a parameter, it must be an "input" type, but Pizza, Salad and BasicFood are plain "type"s.
An input type cannot implement an interface.
So, my question is: how do you work with mutations in this context of interfaces without duplicating types too much? I'd like to avoid having a Pizza type for queries and an InputPizza type for mutations.
Thank you for your help.
Input and output types are fundamentally different things; just because you may have ones that happen to represent a similar object (e.g. Pizza and PizzaInput) doesn't make them equivalent.
Let's say that in your schema Pizza has a mandatory id field (your IDs aren't mandatory, but probably should be). It probably wouldn't make sense for PizzaInput to have an id field -- it would be generated by the server. The point I'm making is that in anything but the simplest systems, there's going to be processing to turn user input into a fully-fledged object to return.
You just have to bite the bullet and do what feels like duplicated work, you'll see the benefits in the long run.
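To make that concrete, the usual pattern is a dedicated input type per concrete food type, carrying only the client-supplied fields. A sketch mirroring the question's schema, written here with graphql-tag though the SDL itself is what matters; note the id is deliberately absent from the input:
import gql from 'graphql-tag';

const typeDefs = gql`
  input PizzaInput {
    pizzaType: String
    toppings: [String]
    size: String
  }

  type Mutation {
    # The resolver generates the id and returns the new Pizza as a FoodType.
    addPizza(input: PizzaInput): FoodType
  }
`;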