JSON Schema one-to-many relation, in MongoDB Atlas?

I'm stumped trying to get a one-to-many relationship established in MongoDB Atlas, via JSON Schema (for use in a GraphQL API).
This is in the Atlas App Services > Data Access > Schema > "Collections" tab.
(It's possible I'm thinking about this wrong, any ideas appreciated.)
So, I have two collections: Node and URL. Each Node can have one or more URLs.
Each URL has three properties: _id (primary key), url, and node:
// URLs (JSON Schema)
{
  "title": "url",
  "properties": {
    "_id": {
      "bsonType": "objectId"
    },
    "node": {
      "bsonType": "string"
    },
    "url": {
      "bsonType": "string"
    }
  }
}
Each URL is matched to a Node (URL.node === Node._id):
// URLs relationships (JSON Schema)
{
  "node": {
    "ref": "#/relationship/db_cluster/db_name/nodes",
    "foreignKey": "_id",
    "isList": false
  }
}
Each Node has two properties of its own, _id (primary key) and name, plus urls, which should be an array of linked documents (from the URLs collection, of course).
// Nodes (JSON Schema)
{
  "title": "node",
  "properties": {
    "_id": {
      "bsonType": "string"
    },
    "name": {
      "bsonType": "string"
    },
    "urls": {
      "bsonType": "array",
      "items": {
        "bsonType": "object",
        "properties": {
          "_id": {
            "bsonType": "objectId"
          },
          "node": {
            "bsonType": "string"
          },
          "url": {
            "bsonType": "string"
          }
        }
      }
    }
  }
}
So far so good. The above seems to work OK.
The trouble begins when I try to link the two collections together. I am attempting to link a few URLs to each Node, with this relationship:
// Nodes relationships (JSON Schema)
{
  "urls": {
    "ref": "#/relationship/db_cluster/db_name/urls",
    "foreignKey": "_id",
    "isList": true
  }
}
However, I can't save the above relationship. I get the error:
relationship source property type must match foreign property type
Any ideas? Thanks.
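Update: re-reading the error, the source property's type apparently has to equal the foreign property's type (_id in the URLs collection, an objectId). If so, urls presumably needs to be an array of objectId foreign-key values rather than an array of embedded objects. An untested sketch of that:

```json
{
  "title": "node",
  "properties": {
    "_id": {
      "bsonType": "string"
    },
    "name": {
      "bsonType": "string"
    },
    "urls": {
      "bsonType": "array",
      "items": {
        "bsonType": "objectId"
      }
    }
  }
}
```

With the urls relationship defined, GraphQL should then resolve each stored objectId into the full linked URL document.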

Related

MongoDB Realm - Filter Queries

In the schema section, where I connect the Realm to the MongoDB collections, there is a tab to add a filter query to filter the results being synced. I added a filter, but every time I query in the app to load the data, I get all the documents instead of the filtered documents.
How can I filter the data the app receives?
Collection Schema:
{
  "title": "Group",
  "required": [
    "_id",
    "cDate",
    "name",
    "info",
    "isPublic",
    "joinWithRequest",
    "partition"
  ],
  "properties": {
    "_id": {
      "bsonType": "objectId"
    },
    "admins": {
      "bsonType": "array",
      "items": {
        "bsonType": "objectId"
      }
    },
    "members": {
      "bsonType": "array",
      "items": {
        "bsonType": "objectId"
      }
    },
    "photoItems": {
      "bsonType": "array",
      "items": {
        "bsonType": "object",
        "title": "Item",
        "properties": {
          "id": {
            "type": "string"
          },
          "cDate": {
            "bsonType": "date"
          }
        },
        "required": [
          "id",
          "cDate"
        ]
      }
    },
    "videoItems": {
      "bsonType": "array",
      "items": {
        "bsonType": "object",
        "title": "Item",
        "properties": {
          "id": {
            "type": "string"
          },
          "cDate": {
            "bsonType": "date"
          }
        },
        "required": [
          "id",
          "cDate"
        ]
      }
    },
    "cDate": {
      "bsonType": "date"
    },
    "partition": {
      "bsonType": "string"
    },
    "name": {
      "bsonType": "string"
    },
    "info": {
      "bsonType": "string"
    },
    "icon": {
      "bsonType": "string"
    },
    "isPublic": {
      "bsonType": "bool"
    },
    "joinWithRequest": {
      "bsonType": "bool"
    }
  }
}
Filter:
Client Query:
let groups = realm.objects(Group.self).sorted(byKeyPath: "cDate")
What you're asking about is (or was) called query-based sync. While it limited the amount of data synced, it was also constrained by the number of users, system resources, etc.
At this time, query-based sync is no longer supported in MongoDB Realm.
However, you can get much of the same functionality by leveraging partitions, which are required for syncing anyway. Let me give a high-level example.
Suppose you have a 'posts' app with users and then posts done by users of specific groups. For this example, this user belongs to Group_0 and Group_1. Your user object may look like this
class UserClass: Object {
    @objc dynamic var _id = "" // a user's uid
    @objc dynamic var user_name = ""
    let groupPartitionList = List<GroupClass>() // will populate with Group_0 and Group_1
}
and a post may look like this
class PostClass: Object {
    @objc dynamic var _id: ObjectId = ObjectId.generate()
    @objc dynamic var _partitionKey = "" // the group partition, e.g. "Group_0"
    @objc dynamic var title = ""
    @objc dynamic var post = ""

    override static func primaryKey() -> String? {
        return "_id"
    }
}
Suppose there are 50 different groups.
Once the user authenticates and the user object is populated, you know the user belongs to Group_0 and Group_1, which could be displayed in a popup menu, defaulting to the first group, Group_0.
Once the user logs in, get just the Group_0 posts
let whichPartition = myUserObject.groupPartitionList[0] // object at index 0 = Group_0
let config = user.configuration(partitionValue: whichPartition)
Realm.asyncOpen(configuration: config) { result in
    switch result {
    case .success(let realm):
        let group0Posts = realm.objects(PostClass.self) // loads just the Group_0 posts
    case .failure(let error):
        print("Failed to open realm: \(error)")
    }
}
As you can see, there is no query: the data is limited by the partition key. This will not read posts from any other group, only posts with a partition key of Group_0.
Likewise, if the user changes the popup to Group_1, only posts with a partition key of Group_1 will sync, and no others. So even though there are 50 groups and thousands of posts, you will only download and sync Group_0's and Group_1's posts, via the partition value.
You can augment this further by using a server-side filter (as mentioned in the comments to the question) to withhold certain data, so the amount of data being queried at the server level is smaller.
For example, you could add a filter to ignore any posts older than 5 years. That will significantly speed up queries by reducing the amount of data being scanned (and it effectively reduces the number of results, because the ignored documents never sync).
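As a rough sketch, the filter's query document for such a cutoff might look like the following (the literal date is hypothetical, and the exact filter configuration fields are in the Realm docs):

```json
{
  "cDate": { "$gte": { "$date": "2017-01-01T00:00:00Z" } }
}
```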
There's some excellent reading in the MongoDB Realm Docs Partition Atlas Data Into Realms.

MongoDB Stitch GraphQL Custom Mutation Resolver returning null

GraphQL is a newer feature for MongoDB Stitch, and I know it is in beta, so thank you for your help in advance. I am excited about using GraphQL directly in Stitch, so I am hoping that maybe I just overlooked something.
The documentation for the return payload shows the use of bsonType, but when actually entering the JSON Schema for the payload type, the UI asks you to use "type" instead of "bsonType". Oddly, it still works for me with "bsonType", as long as at least one of the properties uses "type".
Below is the function:
exports = function(input) {
  const mongodb = context.services.get("mongodb-atlas");
  const collection = mongodb.db("<database>").collection("<collection>");
  const query = { _id: BSON.ObjectId(input.id) };
  const update = {
    "$push": {
      "notes": {
        "createdBy": context.user.id,
        "createdAt": new Date(),
        "text": input.text
      }
    }
  };
  const options = { returnNewDocument: true };
  collection.findOneAndUpdate(query, update, options).then(updatedDocument => {
    if (updatedDocument) {
      console.log(`Successfully updated document: ${updatedDocument}.`);
    } else {
      console.log("No document matches the provided query.");
    }
    return {
      _id: updatedDocument._id,
      notes: updatedDocument.notes
    };
  })
  .catch(err => console.error(`Failed to find and update document: ${err}`));
};
Here is the Input Type in the customer resolver:
"type": "object",
"title": "AddNoteToLeadInput",
"required": [
"id",
"text"
],
"properties": {
"id": {
"type": "string"
},
"text": {
"type": "string"
}
}
}
Below is the Payload Type:
{
  "type": "object",
  "title": "AddNoteToLeadPayload",
  "properties": {
    "_id": {
      "type": "objectId"
    },
    "notes": {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "createdAt": {
            "type": "string"
          },
          "createdBy": {
            "type": "string"
          },
          "text": {
            "type": "string"
          }
        }
      }
    }
  }
}
When entering the wrong "type" the error states:
Expected valid values are:[array boolean integer number null object string]
When entering the wrong "bsonType" the error states:
Expected valid values are:[string object array objectId boolean bool null regex date timestamp int long decimal double number binData]
I've tried every combination I can think of, including changing all "bsonType" to "type". I also tried changing the _id to a string when using "type", or to objectId when using "bsonType". No matter what combination I try, the mutation does what it is supposed to and adds the note to the lead, but the return payload always displays null. I need it to return the _id and notes so that it will update the InMemoryCache in Apollo on the front end.
I noticed that you might be missing a return before your call to collection.findOneAndUpdate().
I tried this function (similar to yours) and got GraphiQL to return values (with String for all the input and payload types):
exports = function(input) {
  const mongodb = context.services.get("mongodb-atlas");
  const collection = mongodb.db("todo").collection("dreams");
  const query = { _id: input.id };
  const update = {
    "$push": {
      "notes": {
        "createdBy": context.user.id,
        "createdAt": "6/10/10/10",
        "text": input.text
      }
    }
  };
  const options = { returnNewDocument: true };
  return collection.findOneAndUpdate(query, update, options).then(updatedDocument => {
    if (updatedDocument) {
      console.log(`Successfully updated document: ${updatedDocument}.`);
    } else {
      console.log("No document matches the provided query.");
    }
    return {
      _id: updatedDocument._id,
      notes: updatedDocument.notes
    };
  })
  .catch(err => console.error(`Failed to find and update document: ${err}`));
};
Hi Bernard, there is an unfortunate bug in the custom resolver form UI at the moment that doesn't allow you to use only bsonType in the input/payload types; we are working on addressing this. In actuality, you should be able to use either type or bsonType, or a mix of the two, as long as they agree with your data. I think the payload type definition you want is likely:
{
  "type": "object",
  "title": "AddNoteToLeadPayload",
  "properties": {
    "_id": {
      "bsonType": "objectId"
    },
    "notes": {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "createdAt": {
            "bsonType": "date"
          },
          "createdBy": {
            "type": "string"
          },
          "text": {
            "type": "string"
          }
        }
      }
    }
  }
}
If that doesn't work, it might be helpful to give us a sample of the data that you would like returned.

How do I add custom queries in GraphQL using Strapi?

I'm using GraphQL to query a MongoDB database in React, using Strapi as my CMS and Apollo to handle the GraphQL queries. I'm able to get my objects by passing an ID argument, but I want to be able to pass different arguments, like a name.
This works:
{
  course(id: "5eb4821d20c80654609a2e0c") {
    name
    description
    modules {
      title
    }
  }
}
This doesn't work, giving the error "Unknown argument \"name\" on field \"course\" of type \"Query\"":
{
  course(name: "course1") {
    name
    description
    modules {
      title
    }
  }
}
From what I've read, I need to define a custom query, but I'm not sure how to do this.
The model for Course looks like this currently:
"kind": "collectionType",
"collectionName": "courses",
"info": {
"name": "Course"
},
"options": {
"increments": true,
"timestamps": true
},
"attributes": {
"name": {
"type": "string",
"unique": true
},
"description": {
"type": "richtext"
},
"banner": {
"collection": "file",
"via": "related",
"allowedTypes": [
"images",
"files",
"videos"
],
"plugin": "upload",
"required": false
},
"published": {
"type": "date"
},
"modules": {
"collection": "module"
},
"title": {
"type": "string"
}
}
}
Any help would be appreciated.
Referring to the Strapi GraphQL Query API:
You can use where with the courses query to filter on your fields. Note that you will get a list of courses instead of a single course.
This should work:
{
  courses(where: { name: "course1" }) {
    name
    description
    modules {
      title
    }
  }
}
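Since the courses query returns a list rather than a single course, the client has to unwrap it. A minimal sketch (the response shape below is assumed from the query above, not taken from a live Strapi server):

```javascript
// Hypothetical response for the courses(where: { name: "course1" }) query.
// Strapi wraps the matches in a list under data.courses.
const response = {
  data: {
    courses: [
      {
        name: "course1",
        description: "Intro course",
        modules: [{ title: "Module 1" }],
      },
    ],
  },
};

// Unwrap the list: take the first match (and, since "name" is declared
// unique in the model, the only one), or null if nothing matched.
const course = response.data.courses.length > 0 ? response.data.courses[0] : null;
console.log(course.name); // "course1"
```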

JSON Schema - can array / list validation be combined with anyOf?

I have a JSON document I'm trying to validate, of this form:
...
"products": [{
"prop1": "foo",
"prop2": "bar"
}, {
"prop3": "hello",
"prop4": "world"
},
...
There are multiple different forms an object may take. My schema looks like this:
...
"definitions": {
"products": {
"type": "array",
"items": { "$ref": "#/definitions/Product" },
"Product": {
"type": "object",
"oneOf": [
{ "$ref": "#/definitions/Product_Type1" },
{ "$ref": "#/definitions/Product_Type2" },
...
]
},
"Product_Type1": {
"type": "object",
"properties": {
"prop1": { "type": "string" },
"prop2": { "type": "string" }
},
"Product_Type2": {
"type": "object",
"properties": {
"prop3": { "type": "string" },
"prop4": { "type": "string" }
}
...
On top of this, certain properties of the individual product array objects may be indirected via further usage of anyOf or oneOf.
I'm running into issues in VS Code, where the built-in schema validation throws errors for every item in the products array that doesn't match Product_Type1.
So it seems the validator latches onto the first subschema of the oneOf it found and won't validate against any of the other types.
I didn't find any limitations to the oneOf mechanism on jsonschema.org, and there is no mention of it in the page specifically dealing with arrays: https://json-schema.org/understanding-json-schema/reference/array.html
Is what I'm attempting possible?
Your general approach is fine. Let's take a slightly simpler example to illustrate what's going wrong.
Given this schema
{
  "oneOf": [
    { "properties": { "foo": { "type": "integer" } } },
    { "properties": { "bar": { "type": "integer" } } }
  ]
}
And this instance
{ "foo": 42 }
At first glance, this looks like it matches /oneOf/0 and not /oneOf/1. It actually matches both schemas, which violates the one-and-only-one constraint imposed by oneOf, so the oneOf fails.
Remember that every keyword in JSON Schema is a constraint. Anything that is not explicitly excluded by the schema is allowed. There is nothing in the /oneOf/1 schema that says a "foo" property is not allowed. Nor does it say that "foo" is required. It only says that if the instance has a property "foo", then it must be an integer.
To fix this, you will need required, and maybe additionalProperties depending on the situation. I show here how you would use additionalProperties, but I recommend you don't use it unless you need to, because it does have some problematic properties.
{
  "oneOf": [
    {
      "properties": { "foo": { "type": "integer" } },
      "required": ["foo"],
      "additionalProperties": false
    },
    {
      "properties": { "bar": { "type": "integer" } },
      "required": ["bar"],
      "additionalProperties": false
    }
  ]
}
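The counting behavior can be illustrated with a tiny hand-rolled check. This is deliberately simplified and NOT a real JSON Schema validator; it only handles the two lax subschemas from the example above:

```javascript
// A subschema that only lists "properties" constrains a property's type
// *if the property is present*; absent properties always pass.
function matchesLax(subschema, instance) {
  return Object.keys(subschema.properties).every(
    (key) => !(key in instance) || Number.isInteger(instance[key])
  );
}

const lax = [
  { properties: { foo: { type: "integer" } } },
  { properties: { bar: { type: "integer" } } },
];

const instance = { foo: 42 };

// {"foo": 42} matches BOTH lax subschemas, so oneOf (exactly one) fails.
const laxMatches = lax.filter((s) => matchesLax(s, instance)).length;
console.log(laxMatches); // 2

// Adding "required" disambiguates: only the first subschema matches now.
function matchesStrict(subschema, instance) {
  return (
    matchesLax(subschema, instance) &&
    subschema.required.every((key) => key in instance)
  );
}

const strict = [
  { properties: { foo: { type: "integer" } }, required: ["foo"] },
  { properties: { bar: { type: "integer" } }, required: ["bar"] },
];

const strictMatches = strict.filter((s) => matchesStrict(s, instance)).length;
console.log(strictMatches); // 1
```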

Elasticsearch 6.2: How to specify child and parent fields within one mapping (_doc)

Since 6.2 no longer supports multiple mapping types, I have to migrate an existing multi-type index into a single _doc mapping. However, I am not sure how to map the current child properties in this single mapping.
"mappings": {
"_doc": {
"properties": {
"join_field": {
"type": "join",
"relations": {
"question": "answer"
}
},
"text": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
}
}
}
}
Now I wish to have more fields in the answer type as well as in the question type, but I have no clue how to do that.
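For context: with the join field, every property is declared flat in the single _doc mapping, and question and answer documents simply populate whichever fields apply to them. A sketch along those lines, where title/tags would only be set on question documents and answer_text/upvotes only on answer documents (these extra field names are hypothetical):

```json
{
  "mappings": {
    "_doc": {
      "properties": {
        "join_field": {
          "type": "join",
          "relations": {
            "question": "answer"
          }
        },
        "title": { "type": "text" },
        "tags": { "type": "keyword" },
        "answer_text": { "type": "text" },
        "upvotes": { "type": "integer" },
        "text": {
          "type": "text",
          "fields": {
            "keyword": {
              "type": "keyword",
              "ignore_above": 256
            }
          }
        }
      }
    }
  }
}
```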