How should I store multiple nested arrays, populated using Mongoose populate(), in the cache using React Query?

Apologies if this is basic but I'm struggling to get my head around how to set this up.
I'm using MongoDB/Mongoose for my backend which returns a user object with nested arrays:
{
  username: {
    type: String,
    unique: true
  },
  name: String,
  avatar: String,
  recommendations: [{ type: mongoose.Schema.Types.ObjectId, ref: 'Media' }],
  watchlist: [{
    media: { type: mongoose.Schema.Types.ObjectId, ref: 'Media' },
    date_added: Date,
  }],
}
If a user visits their watchlist or recommendations page, the nested array gets populated, using mongoose populate(), with the referenced recommendations/watchlist items that they've added.
On the frontend I'm using React Query to handle the data returned from the server. Currently, visiting either of the pages returns the whole user object. If I were to cache the entire object under the query key ['user'], whichever nested array is not populated would be stored as an array of reference ids. Instead I was thinking of updating the nested arrays using setQueryData, however this doesn't work if the page is refreshed:
function useWatchlist() {
  const { user } = useAuth()
  const queryClient = useQueryClient()
  const result = useQuery({
    queryKey: ['user'],
    queryFn: () =>
      axios.get(`${baseUrl}/${user.profile_id}/watchlist`).then(response => response.data),
    onSuccess: (watchlist) => {
      queryClient.setQueryData(['user'], oldUser => ({ ...oldUser, watchlist }))
    }
  })
  return { ...result, profile: result.data }
}
Should the recommendation/watchlist arrays instead be stored separately using different query keys - ['watchlist']/['recommendations'] or should I attempt to keep the user object structure being returned from the backend?

I would say that yes, you should store them separately, but using hierarchical keys (e.g. ['user', 'watchlist'] and ['user', 'recommendations']), as explained here under Structure:
Structure your Query Keys from most generic to most specific, with as many levels of granularity as you see fit in between
So, you can invalidate them both when the user is refetched.
When I store data such as the "watch list", which only changes when the user changes it, I set staleTime: Infinity and use setQueryData in the onSuccess of the relevant mutation (when the user updates their watch list).
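Here is a minimal sketch of what I mean, assuming React Query v4; baseUrl, profileId and the POST endpoint are illustrative placeholders based on your snippet:
import axios from 'axios'
import { useMutation, useQuery, useQueryClient } from '@tanstack/react-query'

// cache the watchlist under its own hierarchical key and never mark it stale
function useWatchlist(profileId) {
  return useQuery({
    queryKey: ['user', 'watchlist'],
    queryFn: () =>
      axios.get(`${baseUrl}/${profileId}/watchlist`).then(response => response.data),
    staleTime: Infinity,
  })
}

// after a successful mutation, write the server response straight into the cache
function useAddToWatchlist(profileId) {
  const queryClient = useQueryClient()
  return useMutation({
    mutationFn: (mediaId) =>
      axios.post(`${baseUrl}/${profileId}/watchlist`, { mediaId }).then(response => response.data),
    onSuccess: (updatedWatchlist) => {
      queryClient.setQueryData(['user', 'watchlist'], updatedWatchlist)
    },
  })
}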
For the "recommendation list", it's different story, as it would be constantly changing by some logic in the backend. So, I would use invalidateQuery whenever the 'user' key is fetched (or expire the cache, if you update the list each certain interval), and populate it again, on the onSuccess for that query.

Related

Query db without certain elements inside an array

I set up a small database using a model and 2 schemas.
The model goes as follows:
const userSchema = new mongoose.Schema({
  friendsRequests: [friendRequestSchema],
  // other credentials that are not important
});
And the friendRequestSchema:
const friendRequestSchema = new mongoose.Schema({
  from: { type: Schema.Types.ObjectId, ref: "User" },
  to: { type: Schema.Types.ObjectId, ref: "User" },
});
Basically friendsRequests is an array consisting of who requested to add the user to the friends list (which is the from property) and whom the user wants to add to their friends list (which is the to property).
For the query, I am trying to work out how to send a response that does not contain the users that are inside the user's friendsRequests array.
If I do this:
const recFriends = await User.findOne({ _id: req.user }).select(
  "friendsRequests"
);
I will get back the array with objects containing either sent or received requests. Now I want to query the User model again and have it not return the users from this array. How would I go about doing that?
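The only thing I can think of is pulling the ids out of that array first and then excluding them with $nin in a second query, something like this sketch (not sure if this is the right way):
// first query: get the ids already involved in friend requests
const { friendsRequests } = await User.findOne({ _id: req.user }).select("friendsRequests");
const excludedIds = friendsRequests.flatMap(r => [r.from, r.to]);

// second query: return every other user, excluding those ids (and myself)
const otherUsers = await User.find({ _id: { $nin: [...excludedIds, req.user] } });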

Using objects as options in Autoform

In my Stacks schema I have a dimensions property defined as such:
dimensions: {
  type: [String],
  autoform: {
    options: function() {
      return Dimensions.find().map(function(d) {
        return { label: d.name, value: d._id };
      });
    }
  }
}
This works really well, and using Mongol I'm able to see that an attempt to insert data through the form worked (in this case I chose two dimensions to insert).
However, what I really want is data that stores the actual dimension object rather than its key. Something like this:
[
To try to achieve this I changed type: [String] to type: [DimensionSchema] and value: d._id to value: d. The thinking here is that I'm telling the form that I am expecting an object and am now returning the object itself.
However when I run this I get the following error in my console.
Meteor does not currently support objects other than ObjectID as ids
Poking around a little bit and changing type: [DimensionSchema] to type: DimensionSchema, I see some new errors in the console (presumably they get buried when the type is an array).
So it appears that autoform is taking the value I want stored in the database and trying to use it as an id. Any thoughts on the best way to do this?
For reference here is my DimensionSchema
export const DimensionSchema = new SimpleSchema({
  name: {
    type: String,
    label: "Name"
  },
  value: {
    type: Number,
    decimal: true,
    label: "Value",
    min: 0
  },
  tol: {
    type: Number,
    decimal: true,
    label: "Tolerance"
  },
  author: {
    type: String,
    label: "Author",
    autoValue: function() {
      return this.userId
    },
    autoform: {
      type: "hidden"
    }
  },
  createdAt: {
    type: Date,
    label: "Created At",
    autoValue: function() {
      return new Date()
    },
    autoform: {
      type: "hidden"
    }
  }
})
In my experience, and according to aldeed himself in this issue, autoform is not very friendly to fields that are arrays of objects.
I would generally advise against embedding this data in such a way. It makes the data more difficult to maintain in case a dimension document is modified in the future.
Alternatives
You can use a package like publish-composite to create a reactive-join in a publication, while only embedding the _ids in the stack documents.
You can use something like the PeerDB package to do the de-normalization for you, which will also update nested documents for you. Take into account that it comes with a learning curve.
Manually code the specific forms that cannot be easily created with AutoForm. This gives you maximum control and sometimes it is easier than all of the tinkering.
If you insist on using AutoForm
While it may be possible to create a custom input type (via AutoForm.addInputType()), I would not recommend it. It would require you to create a template and modify the data in its valueOut method and it would not be very easy to generate edit forms.
Since this is a specific use case, I believe that the best approach is to use a slightly modified schema and handle the data in a Meteor method.
Define a schema with an array of strings:
export const StacksSchemaSubset = new SimpleSchema({
  desc: {
    type: String
  },
  ...
  dimensions: {
    type: [String],
    autoform: {
      options: function() {
        return Dimensions.find().map(function(d) {
          return { label: d.name, value: d._id };
        });
      }
    }
  }
});
Then, render a quickForm, specifying a schema and a method:
<template name="StacksForm">
{{> quickForm
schema=reducedSchema
id="createStack"
type="method"
meteormethod="createStack"
omitFields="createdAt"
}}
</template>
And define the appropriate helper to deliver the schema:
Template.StacksForm.helpers({
  reducedSchema() {
    return StacksSchemaSubset;
  }
});
And on the server, define the method and mutate the data before inserting.
Meteor.methods({
  createStack(data) {
    // validate data
    const dims = Dimensions.find({ _id: { $in: data.dimensions } }).fetch(); // specify fields if needed
    data.dimensions = dims;
    Stacks.insert(data);
  }
});
The only thing I can advise at the moment (if the field doesn't support an object type) is to convert the object into a string (i.e. a serialized string), set that as the value for the "dimensions" key (instead of the object), and save that into the DB.
When getting it back from the DB, just deserialize that value (string) into an object again.
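Something like this sketch is what I mean, using JSON as the serialization format (selectedDimensions and stack are just placeholder variables):
// before insert: store each chosen dimension object as a string
const serializedDims = selectedDimensions.map(function(d) {
  return JSON.stringify(d);
});
Stacks.insert({ desc: stack.desc, dimensions: serializedDims });

// after reading back from the DB: turn the strings into objects again
const dims = stack.dimensions.map(function(s) {
  return JSON.parse(s);
});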

Store contents with rest proxy giving incorrect count

ExtJS 5.1.x, with several stores using rest proxy.
Here is an example:
Ext.define('cardioCatalogQT.store.TestResults', {
    extend: 'Ext.data.Store',
    alias: 'store.TestResults',
    config: {
        fields: [
            {name: 'attribute', type: 'string'},
            {name: 'sid', type: 'string'},
            {name: 'value_s', type: 'string'},
            {name: 'value_d', type: 'string'}
        ],
        model: 'cardioCatalogQT.model.TestResult',
        storeId: 'TestResults',
        autoLoad: true,
        pageSize: undefined,
        proxy: {
            url: 'http://127.0.0.1:5000/remote_results_get',
            type: 'rest',
            reader: {
                type: 'json',
                rootProperty: 'results'
            }
        }
    }
});
This store gets populated when certain things happen in the API. After the store is populated, I need to do some basic things, like count the number of distinct instances of an attribute, say sid, which I do as follows:
test_store = Ext.getStore('TestResults');
n = test_store.collect('sid').length;
The problem is that I have to refresh the browser to get the correct value of n; otherwise, the count is not right. I am doing a test_store.load() and, indeed, the request is being sent to the server after the .load() is issued.
I am directly querying the backend database to see what data are in the table and to get a count to compare to the value given by test_store.collect('sid').length. The strange thing is that I am also printing out the store object in the debugger, and the expected records (when compared to the content in the database table) are displayed under the data.items array, but the value given by test_store.collect('sid').length is not right.
This is all done sequentially in a success callback. I am wondering if there is some sort of asynchronous behavior giving me the inconsistent results between what is in the store and the count of the store's contents?
I tested this with another store that uses the rest proxy and it has the same behavior. On the other hand, using the localStorage proxy gives the correct count consistent with the store records/model instances.
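If it is an async issue, I guess the count would need to move into the store's load callback, something like this (untested sketch):
test_store = Ext.getStore('TestResults');
test_store.load({
    callback: function(records, operation, success) {
        // only runs once the server response has actually been loaded into the store
        console.log(test_store.collect('sid').length);
    }
});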
Here is the relevant code in question. An Ajax request fires off, does its thing correctly, and hits this success callback. There really isn't very much interesting going on... the problem section is after console.log('TEST STORE HERE');, where I get the store, print its contents, load/sync, then print the store again (which works just fine), and finally print the number of items uniquely grouped by the sid attribute (which is what is not working):
success: function(response) {
    json = Ext.decode(response.responseText);
    if (json !== null && typeof (json) !== 'undefined') {
        for (i = 0, max = json.items.length; i < max; i += 1) {
            if (print_all) {
                records.push({
                    sid: json.items[i].sid,
                    attribute: json.items[i].attribute,
                    string: json.items[i].value_s,
                    number: json.items[i].value_d
                });
            }
            else {
                records.push({
                    sid: json.items[i].sid
                })
            }
        }
        // update store with data
        store.add(records);
        store.sync();
        // only add to store if adding to search grid
        if (!print_all) {
            source.add({
                key: payload.key,
                type: payload.type,
                description: payload.description,
                criteria: payload.criteria,
                atom: payload.atom,
                n: store.collect('sid').length // get length of array for unique sids
            });
            source.sync();
        }
        console.log('TEST STORE HERE');
        test_store = Ext.getStore('TestResults');
        test_store.load();
        test_store.sync();
        console.log(test_store);
        console.log(test_store.collect('sid').length)
    }
    // update grid store content
    Ext.StoreMgr.get('Payload').load();
    Ext.ComponentQuery.query('#searchGrid')[0].getStore().load();
}
For completeness, here is the data.items array output: items: Array[2886], which is the equivalent count of unique items grouped by the sid attribute, and finally the output of console.log(test_store.collect('sid').length), which gives the value from the PREVIOUS run of this: 3114...

Mongoose - update after populate (Cast Exception)

I am not able to update my mongoose schema because of a CastError, which makes sense, but I don't know how to solve it.
Trip Schema:
var TripSchema = new Schema({
  name: String,
  _users: [{ type: Schema.Types.ObjectId, ref: 'User' }]
});
User Schema:
var UserSchema = new Schema({
  name: String,
  email: String,
});
In my HTML page I render a trip with the possibility of adding new users to this trip. I retrieve the data by calling the findById method on the schema:
exports.readById = function (request, result) {
  Trip.findById(request.params.tripId).populate('_users').exec(function (error, trip) {
    if (error) {
      console.log('error getting trips');
    } else {
      console.log('found single trip: ' + trip);
      result.json(trip);
    }
  })
};
This works fine. In my UI I can add new users to the trip; here is the code:
var user = new UserService();
user.email = $scope.newMail;
user.$save(function(response) {
  trip._users.push(user._id);
  trip.$update(function (response) {
    console.log('OK - user ' + user.email + ' was linked to trip ' + trip.name);
    // call for the updated document in database
    this.readOne();
  })
});
The problem is that when I update, the existing users in the trip are populated, i.e. stored as objects rather than ids on the trip, while the new user is stored as an ObjectId.
How can I make sure the populated users go back to ObjectIds before I update? Otherwise the update will fail with a CastError.
see here for error
I've been searching around for a graceful way to handle this without finding a satisfactory solution, or at least one I feel confident is what the mongoosejs folks had in mind when using populate. Nonetheless, here's the route I took:
First, I tried to separate adding to the list from saving. So in your example, move trip._users.push(user._id); out of the $save function. I put actions like this on the client side of things, since I want the UI to show the changes before I persist them.
Second, when adding the user, I kept working with the populated model -- that is, I don't push(user._id) but instead add the full user: push(user). This keeps the _users list consistent, since the ids of other users have already been replaced with their corresponding objects during population.
So now you should be working with a consistent list of populated users. In the server code, just before calling $update, I replace trip._users with a list of ObjectIds. In other words, "un-populate" _users:
user_ids = []
for (var i in trip._users) {
  /* it might be a good idea to do more validation here if you like, to make
   * sure you don't have any naked userIds in this array already, as you would
   * in your original code. */
  user_ids.push(trip._users[i]._id);
}
trip._users = user_ids;
trip.$update(....
As I read through your example code again, it looks like the user you are adding to the trip might be a new user? I'm not sure if that's just a relic of your simplification for question purposes, but if not, you'll need to save the user first so mongo can assign an ObjectId before you can save the trip.
I have written a function which accepts an array and, in a callback, returns an array of ObjectIds. To do it asynchronously in Node.js, I am using async.js. The function looks like:
let converter = function(array, callback) {
  let idArray = [];
  async.each(array, function(item, itemCallback) {
    idArray.push(item._id);
    itemCallback();
  }, function(err) {
    callback(idArray);
  })
};
This works totally fine for me, and I hope it will work for you as well.
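For example, you could presumably call it right before the update, something like this (a sketch based on the trip object from the question):
converter(trip._users, function(userIds) {
  // replace the populated user objects with their plain ObjectIds, then save
  trip._users = userIds;
  trip.$update(function(response) {
    console.log('trip updated');
  });
});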

Node.js - Mongoose/MongoDB - Model Schema

I am creating a blog system in Node.js with mongodb as the db.
I have contents like this (blog articles):
// COMMENTS SCHEMA:
// ---------------------------------------
var Comments = new Schema({
  author: {
    type: String
  },
  content: {
    type: String
  },
  date_entered: {
    type: Date,
    default: Date.now
  }
});
exports.Comments = mongoose.model('Comments', Comments);

var Tags = new Schema({
  name: {
    type: String
  }
});
exports.Tags = mongoose.model('Tags', Tags);

// CONTENT SCHEMA:
// ---------------------------------------
exports.Contents = mongoose.model('Contents', new Schema({
  title: {
    type: String
  },
  author: {
    type: String
  },
  permalink: {
    type: String,
    unique: true,
    sparse: true
  },
  catagory: {
    type: String,
    default: ''
  },
  content: {
    type: String
  },
  date_entered: {
    type: Date,
    default: Date.now
  },
  status: {
    type: Number
  },
  comments: [Comments],
  tags: [Tags]
}));
I am a little new to this type of database; I'm used to MySQL on a LAMP stack.
Basically my question is as follows:
What's the best way to associate the Contents author to a User in the DB?
Also, what's the best way to do the tags and categories?
In MySQL we would have a tags table and a categories table and relate them by keys; I am not sure of the best and most optimal way of doing it in Mongo.
THANK YOU FOR YOUR TIME!!
Couple of ideas for Mongo:
The best way to associate a user is by e-mail address - as an attribute of the content/comment document - since e-mail is usually a reliable unique key. MongoDB doesn't have foreign keys or associated constraints, but that is fine.
If you have a registration policy, add the user name, e-mail address and other details to the users collection. Then de-normalize the content document with the user name and e-mail. If, for any reason, the user changes their name, you will have to update all the associated contents/comments. But as long as the e-mail address is there in the documents, this should be easy.
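For example, a name change could be propagated with a single multi-document update keyed on the stable e-mail address (just a sketch; it assumes the author is embedded as { name, email } rather than a plain string):
// update the de-normalized author name on every content document written by this user
db.contents.updateMany(
  { "author.email": "jane@example.com" },
  { $set: { "author.name": "Jane Smith" } }
);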
Tags and categories are best modelled as two lists in the content document, IMHO.
You can also create two indices on these attributes, if required. It depends on the access patterns and the UI features you want to provide.
You can also add a document which keeps a tag list and a categories list in the contents collection and use $addToSet to add new tags and categories to this document. Then, you can show a combo box with the current tags as a starting point.
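A sketch of that idea (the _id and the values are made up):
// one well-known document in the contents collection accumulates the master lists
db.contents.updateOne(
  { _id: "tag_and_category_lists" },
  { $addToSet: { tags: "nodejs", categories: "programming" } },
  { upsert: true }
);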
As a final point, think through the ways you plan to access the data and then design documents, collections & indices accordingly.
[Update 12/9/11] I was at MongoSV and Eliot (CTO of 10gen) presented a pattern relevant to this question: instead of one comment document per user (which could grow large), have a comment document per day for a user, with _id = -YYYYMMDD, or even one per month depending on the frequency of comments. This optimizes index creation/document growth vs. document proliferation (compared with the design where there is one comment document per user).
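A sketch of that bucketing pattern (field names are assumed; the _id here is built from a user id plus the date):
// append a comment to the bucket document for this user and day, creating it if needed
var bucketId = userId + "-20111209"; // YYYYMMDD
db.comments.updateOne(
  { _id: bucketId },
  { $push: { comments: { author: "someUser", content: "Nice post!", date_entered: new Date() } } },
  { upsert: true }
);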
The best way to associate the Content authors with a User in MongoDB is to keep an array in the Author collection which holds references to Users. It is an array because one Content/Book may have multiple Authors, i.e. you need to associate one Content with many Users.
The best way for categories is to create a separate collection in your DB and, similarly to the above, keep an array in Contents.
I hope it helps, at least a little.