I am building a game engine on Meteor JS and trying to create a way to link together a number of collections. The current 'schema' looks like this:
GameCollection = { <meta> } //This is a Collection (a Meteor MongoDB document)
Scene = {gameId: _id, <other resource ids and meta>} //This is a Collection
The issue is I need to create a map from one scene to another. These paths need to fork and merge easily. I am getting the feeling that I should be using a graph/triple database to represent this, but I want to stay within "Meteor's magic" and that means normal MongoDB Collections. If someone has a simple-to-use alternative I would still like to hear it, but I would prefer a Meteor-esque pattern. Pushes in the right direction would also be great!
I have three specific needs:
If I am at this scene, what scene or scenes do I lead to?
If I am at this scene, give me the ids of all scenes x number of steps into the future, where 'x' is a variable (so I can send the lot of them down to the client).
Count and give me all possible paths so I can give a visual representation of the game.
What I am specifically looking for is: is a graph database what I am looking for, and if not, what schema pattern should I use with MongoDB?
UPDATE:
I have confirmed that neo4j will do what I need from a logical standpoint. But I would lose the benefit of working with Meteor Collections. This means losing reactivity which in turn breaks my live collaborative model. I really need a MongoDB alternative.
UPDATE 2:
I ended up trying to stick the relationship inside of the GameCollection. It seems to be working but I would like a cleaner way if possible.
map: [ { //an array of objects (relations)
id: _id //key to a Scene
toKey: _id //leads to scene; toKey is either 'next' or some number [0..n] for multiple paths
}]
So I ended up going the denormalization route. I put an array of scenes into the GameCollection.
scenes: [{ id: random_id, next: 'next_id' || [next_ids], <other resource ids and meta> }]
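For illustration only, a single game document using this shape might look like the following; the collection name Games, the scene ids, and the field values are all made up:
Games.insert({
  title: 'My Adventure',
  scenes: [
    { id: 's1', type: 'dialogue', next: 's2' },                       // linear step
    { id: 's2', type: 'choice', next: [{ id: 's3' }, { id: 's4' }] }, // fork into two paths
    { id: 's3', type: 'dialogue', next: 's4' },                       // merges back
    { id: 's4', type: 'dialogue', next: null }                        // end of the map
  ]
})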
Then I built this monster:
getScene = function (scenes, id) {
  return _.find(scenes, function (scene) {
    return (scene.id == id)
  })
}

getNext = function (scene) {
  if (!scene) { return null }
  if (scene.type == 'dialogue') {
    return scene.next
  }
  if (scene.type == 'choice') {
    return _.pluck(scene.next, 'id')
  }
}

scenesDive = function (list, next, container, limit, depth) {
  if (!depth) {
    depth = 0
  }
  var myDepth = depth + 1
  var scene = getScene(list, next)
  if (container.indexOf(scene) != -1) { return } //This path has already been added. Go back up.
  container.push(scene)
  if (myDepth == limit) { return } //Don't dive deeper than this depth.
  var nextAoS = getNext(scene) //DIVE! (array or string)
  if (_.isArray(nextAoS)) {
    nextAoS.forEach(function (n) {
      scenesDive(list, n, container, limit, myDepth)
    });
  } else {
    scenesDive(list, nextAoS, container, limit, myDepth)
  }
}
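For what it's worth, against the hypothetical document sketched above, a call might look like this (the gameId lookup, the starting scene id, and the depth limit are all placeholders):
// Collect the current scene plus everything reachable within the next 3 steps.
var game = Games.findOne(gameId) // hypothetical lookup
var container = []
scenesDive(game.scenes, 's2', container, 3)
var idsAhead = _.pluck(container, 'id') // ids to publish/send down to the client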
I am sure there is a better way but this is what I am going with for now.
Related
I'm looking to build a tree structure for an app where I'll have to store multiple trees for each user.
Each tree will be composed of one document per node:
{
_id:1,
user_id:12345,
category_id:6789,
data:[]
}
So when I need to, I can access this data with a query on user_id and category_id.
I was looking at the MongoDB docs and found this:
https://docs.mongodb.com/manual/applications/data-models-tree-structures/
This is pretty interesting, but I have a few questions:
Considering that I'll only ever fetch full trees, which solution is better?
Are child references better, or would another structure do the job better?
And if I use child references, what is the best way to get the whole tree?
Can it be done with a single request, or do I have to search recursively for each child?
I know I could get all the docs in one query and build the tree from there; is that a good idea?
EDIT:
So I tried this:
[{
_id:1,
user_id:12345,
category_id:6789,
data:{name:"root"},
parent:null,
childs:[2,3]
},
{
_id:2,
user_id:12345,
category_id:6789,
data:{name:"child1"},
parent:1,
childs:[]
},
{
_id:3,
user_id:12345,
category_id:6789,
data:{name:"child2"},
parent:1,
childs:[4]
},
{
_id:4,
user_id:12345,
category_id:6789,
data:{name:"child2_1"},
parent:3,
childs:[]
}]
With both parent and children references, I can easily find the leaves and the root when rebuilding the tree. (I chose to build it in the client app and query the full tree at once.)
The thing is, I don't really use parent for now, so keeping a reference to the parent looks like overkill; the query is fast enough, it just takes some extra space. Maybe a simple "root" boolean would be better? I could really use some advice on that.
I'm still open to improvements. I'd like this to be really fast, because each user will have 0 to n trees with 0 to n nodes each, and I don't want to mess up the data structure for that.
This question shows a similar kind of solution: How to query tree structure recursively with MongoDB?
Otherwise, taking parent into consideration, you can convert the flat list into a tree like this:
function list_to_tree(list) {
  const map1 = new Map();
  var node,
    roots = [],
    i;
  for (i = 0; i < list.length; i += 1) {
    map1.set(list[i]._id.toString(), i); // map each _id to its index in the list
    list[i].childs = []; // reset the children array
  }
  for (i = 0; i < list.length; i += 1) {
    node = list[i];
    if (node.parent) {
      // attach this node to its parent's children array
      list[map1.get(node.parent.toString())].childs.push(node);
    } else {
      roots.push(node); // no parent means this node is a root
    }
  }
  return roots; // the roots already carry the nested children
}
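To tie it together, a sketch of the "one query, build client-side" approach could look like this (using the Node.js MongoDB driver; the collection name nodes and the db handle are assumptions):
// Inside an async function, with 'db' an open MongoDB database handle.
// Fetch every node of the tree in a single query, then rebuild the hierarchy in memory.
const nodes = await db.collection('nodes')
  .find({ user_id: 12345, category_id: 6789 })
  .toArray();
const tree = list_to_tree(nodes); // array of roots, children nested under 'childs'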
Given the data structure below in Firebase, I want to run a query to retrieve the blog 'efg'. I don't know the user id at this point.
{Users :
  "1234567": {
    name: 'Bob',
    blogs: {
      'abc':{..},
      'zyx':{..}
    }
  },
  "7654321": {
    name: 'Frank',
    blogs: {
      'efg':{..},
      'hij':{..}
    }
  }
}
The Firebase API only allows you to filter children one level deep (or with a known path) with its orderByChild and equalTo methods.
So without modifying/expanding your current data structure, that just leaves the option of retrieving all data and filtering it client-side:
var ref = firebase.database().ref('Users');
ref.once('value', function(snapshot) {
  snapshot.forEach(function(userSnapshot) {
    var blogs = userSnapshot.val().blogs;
    var daBlog = blogs['efg'];
  });
});
This is of course highly inefficient and won't scale when you have a non-trivial number of users/blogs.
So the common solution to that is to add a so-called index to your tree that maps the key that you are looking for to the path where it resides:
{Blogs:
  "abc": "1234567",
  "zyx": "1234567",
  "efg": "7654321",
  "hij": "7654321"
}
Then you can quickly access the blog using:
var ref = firebase.database().ref();
ref.child('Blogs/efg').once('value', function(snapshot) {
  var user = snapshot.val();
  ref.child('Users/'+user+'/blogs/efg').once('value', function(blogSnapshot) {
    var daBlog = blogSnapshot.val();
  });
});
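For completeness, one way to keep such an index in sync when a new blog is written is a multi-location update; the paths here mirror the structure above and the blog payload is made up:
// Write the blog under its user and the index entry in one atomic update.
var updates = {};
updates['Users/7654321/blogs/efg'] = { title: 'My blog post' };
updates['Blogs/efg'] = '7654321';
firebase.database().ref().update(updates);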
You might also want to reconsider if you can restructure your data to better fit your use-case and Firebase's limitations. They have some good documentation on structuring your data, but the most important one for people new to NoSQL/hierarchical databases seems to be "avoid building nests".
Also see my answer on Firebase query if child of child contains a value for a good example. I'd also recommend reading about many-to-many relationships in Firebase, and this article on general NoSQL data modeling.
Given your current data structure you can retrieve the User that contains the blog post you are looking for.
const db = firebase.database()
const usersRef = db.ref('Users')
const query = usersRef.orderByChild('blogs/efg').limitToLast(1)
query.once('value').then((ss) => {
  console.log(ss.val()) //=> { '7654321': { blogs: {...}}}
})
You need to use limitToLast since objects are sorted last when using orderByChild (see the docs).
It's actually super easy: just use a forward slash:
db.ref('Users').child("userid/name")
db.ref('Users').child("userid/blogs")
db.ref('Users').child("userid/blogs/abc")
No need for loops or anything more.
I have the following model in an Angular 6 cli/TS/AngularFire thing I'm trying to build. I'm new to all of those things.
export class Book {
constructor(
public id: string,
public title: string,
public genres: any[]
) {}
}
And I want to be able to find all books that match a genre stored in Firebase's Cloud Firestore using AngularFire2.
A standard query looks like this (documentation):
afs.collection('books', ref => ref.where('size', '==', 'large'))
Ideally, I want to make a call to Firebase that doesn't fetch all documents in the collection, so it's more efficient (tell me if that's wrong thinking). For example, something like this:
afs.collection('books', ref => ref.where(book.genres.containsAny(array of user defined genres)));
I have a limited understanding of NoSQL data modeling, but would happily change the model if there's something more effective that will stay fast with 1000 or 30,000 or even 100,000 documents.
Right now I am doing this.
filterArray = ["Genetic Engineering", "Science Fiction"];
filteredBooks: Book[] = [];
ngOnInit() {
  this.db.collection<Book>('books')
    .valueChanges().subscribe(books => {
      for (var i = 0; i < books.length; i++) {
        if (books[i].genres.some(v => this.filterArray.includes(v))) {
          this.filteredBooks.push(books[i]);
        }
      }
    });
}
This works to filter the documents, but is there a more efficient way both in terms of speed and scalability (get only the matching documents instead of all)?
You're right to limit the documents first. You don't want to pull 30K documents, THEN filter them. And you are on the right path, but your formatting wasn't quite right. You want to do something like this:
afs.collection<Book>('books', ref => ref.where('genres', 'array-contains', genre))
I believe that as of right now, you cannot pass in an array like:
afs.collection<Book>('books', ref => ref.where('genres', 'array-contains', genres)) // note that genres is plural, implying a list of genres
However, it may still be better to loop through the genres, pull the books list once for each genre, and concatenate the lists afterward.
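A rough sketch of that loop-and-merge idea, assuming the same this.db AngularFirestore handle from the question and that each book document stores its id, might look like this (RxJS combineLatest is used to merge the per-genre streams):
import { combineLatest } from 'rxjs';
import { map } from 'rxjs/operators';

const genres = ['Genetic Engineering', 'Science Fiction']; // user-defined genres

// One 'array-contains' query per genre...
const queries = genres.map(genre =>
  this.db.collection('books', ref =>
    ref.where('genres', 'array-contains', genre)
  ).valueChanges()
);

// ...then merge the result streams and de-duplicate books by id.
const filteredBooks$ = combineLatest(queries).pipe(
  map(results => {
    const byId = new Map();
    [].concat(...results).forEach(book => byId.set(book.id, book));
    return Array.from(byId.values());
  })
);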
Now, you mentioned that you would also like a suggestion to store the data differently. I would recommend that you do NOT use an array for genres. Instead make it a map (basically, an object), like this:
author: string;
title: string;
...
genres: map
Then you can do this:
author: 'Herman Melville'
title: 'Moby Dick'
...
genres: {
  'classics': true,
  'nautical': true
}
And then you can filter the collection like this:
afs.collection<Book>('books', ref => ref.where('genres.classics', '==', true).where('genres.nautical', '==', true))
I hope this helps
I'm using the request library to make calls from one Sails app to another one which exposes the default blueprint endpoints. It works fine when I query by non-id fields, but I need to run some queries by passing id arrays. The problem is that the moment you provide an id, only the first id is considered, effectively ruling out this kind of query.
Is there a way to get around this? I could switch over to another attribute if all else fails but I need to know if there is a proper way around this.
Here's how I'm querying:
var idArr = []; //array of ids
var queryParams = { id: idArr };
var options = {
  //headers, method and url here
  json: queryParams
};
request(options, function(err, response, body){
  if (err) return next(err);
  return next(null, body);
});
Thanks in advance.
Sails blueprint APIs allow you to use the same Waterline query language that you would otherwise use in code.
You can directly pass the array of ids in the GET call to retrieve the objects, as follows:
GET /city?where={"id":[1, 2]}
Refer here for more.
Have fun!
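If it helps, wiring that into the request call from the question might look roughly like this; the host, port, and model name are placeholders:
// Pass the Waterline criteria as a 'where' query-string parameter.
var idArr = [1, 2, 3]; // array of ids
var options = {
  method: 'GET',
  url: 'http://localhost:1337/city',
  qs: { where: JSON.stringify({ id: idArr }) },
  json: true
};
request(options, function (err, response, body) {
  if (err) return next(err);
  return next(null, body); // body is the array of matching records
});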
Alright, I switched to a hacky solution to get moving.
For all models that needed querying by id arrays, I added a secondary attribute to the model. Let's call it code. Then, in afterCreate(), I updated code and set it equal to the id. This incurs an additional database call, but it's fine since it's called just once - when the object is created.
Here's the code.
module.exports = {
  attributes: {
    code: {
      type: 'string' //the secondary attribute
    },
    // other attributes
  },
  afterCreate: function (newObj, next) {
    Model.update({ id: newObj.id }, { code: newObj.id }, next);
  }
}
Note that newObj isn't a model instance, contrary to what I was initially led to believe, so we cannot simply update its code and call newObj.save().
After this, in the queries having id arrays, substituting id with code makes them work as expected!
I'm trying to use as much of the out-of-the-box sync and RESTful functionality in Backbone as possible. I have a Web API set up for basic CRUD for my models. I have:
var SearchModel = Backbone.Model.extend({});
var SearchMappingModel = Backbone.Model.extend({});
var SearchComponentModel = Backbone.Model.extend({});
var SearchCollection = Backbone.Collection.extend({});
var SearchMappingCollection = Backbone.Collection.extend({});
var SearchComponentCollection = Backbone.Collection.extend({});
For every Search there is 1-to-many SearchMappings, and for every SearchMapping, there are 1-to-many SearchComponents. My URLs for sync would be something like, "/search" for the Search collection, "'/searchmapping/' + searchId" for the SearchMapping collection, and "'/searchcomponent/' + mappingId" for the SearchComponent collection.
My question is, since each collection is dependent on the previous one, is there a way I can make a cascading relationship in backbone to minimize my code and use as much of the basic sync functionality that's already there?
My initial thought is to create a collection within a collection and write my own .fetch() to first fetch the parent collection and on its success then fetch the child, which will then also get its child after its own success, like this:
var SearchCollection = Backbone.Collection.extend({
  model: SearchModel,
  initialize: function (data) {
    this.url = baseURL + "/search";
    this.data = data;
    this.SearchMappingCollection = new SearchMappingCollection();
  },
  fetchData: function () {
    this.fetch({
      success: _.bind(function (results) {
        this.fetchListSuccess(results);
      }, this)
    });
  },
  fetchListSuccess: function (results) {
    this.SearchMappingCollection.fetchData(results);
  }
});
The same would be done on a .save(). This may be a good way of doing it, but wanted to get feedback from anyone else that's done something similar.
I ended up not using a cascading format. It seemed to add more complexity while giving nothing in return. All 3 collections now sit at the controller level, and I just load the next collection after the previous one finishes loading, on each "reset" event.
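As a rough illustration of that final setup, the controller-level wiring might look something like this; how a particular search and mapping get picked is simplified here to "take the first one":
// Fetch the collections one after another: each child collection is fetched
// only once its parent has been reset with fresh data.
var searches = new SearchCollection();
var mappings = new SearchMappingCollection();
var components = new SearchComponentCollection();

searches.on('reset', function () {
  mappings.url = '/searchmapping/' + searches.first().id;
  mappings.fetch({ reset: true });
});

mappings.on('reset', function () {
  components.url = '/searchcomponent/' + mappings.first().id;
  components.fetch({ reset: true });
});

searches.fetch({ reset: true });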