Handling nested callbacks/promises with Mongoose - callback

I am a beginner with Node.js and Mongoose. I spent an entire day trying to resolve an issue by scouring through SO, but I just could not find the right solution. Basically, I am using the retrieved values from one collection to query another. In order to do this, I am iterating through a loop of the previously retrieved results.
With the iteration, I am able to populate the results that I need. Unfortunately, the area where I am having an issue is that the response is being sent back before the required information is gathered in the array. I understand that this can be handled by callbacks/promises. I tried numerous ways, but I just haven't been successful with my attempts. I am now trying to make use of the Q library to facilitate the callbacks. I'd really appreciate some insight. Here's a snippet of the portion where I'm currently stuck:
var length = Object.keys(purchasesArray).length;
var jsonArray = [];
var getProductDetails = function () {
var deferred = Q.defer();
for (var i = 0; i < length; i++) {
var property = Object.keys(purchasesArray)[i];
if (purchasesArray.hasOwnProperty(property)) {
var productID = property;
var productQuery = Product.find({asin: productID});
productQuery.exec(function (err, productList) {
jsonArray.push({"productName": productList[0].productName,
"quantity": purchasesArray[productID]});
});
}
}
return deferred.promise;
};
getProductDetails().then(function sendResponse() {
console.log(jsonArray);
response = {
"message": "The action was successful",
"products": jsonArray
};
res.send(response);
return;
}).fail(function (err) {
console.log(err);
})
});
At the moment I am only able to send one of the two objects in the jsonArray array, because the response is sent right after the first element has been pushed.
Update
Thanks to Roamer-1888's answer, I have been able to construct a valid JSON response without having to worry about the error of setting headers after sending a response.
Basically, in the getProductDetails() function, I am trying to retrieve product names from the Mongoose query while mapping the quantity for each of the items in purchasesArray. From the function, eventually, I would like to form the following response:
response = {
"message": "The action was successful",
"products": jsonArray
};
where jsonArray would be built in the following form inside getProductDetails:
jsonArray.push({
"productName": products[index].productName,
"quantity": purchasesArray[productID]
});

On the assumption that purchasesArray is the result of an earlier query, it would appear that you are trying to:
query your database once per purchasesArray item,
form an array of objects, each containing data derived from the query AND the original purchasesArray item.
If so, and with a few other guesses, then the following pattern should do the job:
var getProductDetails = function() {
// map purchasesArray to an array of promises
var promises = purchasesArray.map(function(item) {
return Product.findOne({
asin: item.productID // some property of the desired item
}).exec()
.then(function (product) {
// Here you can freely compose an object comprising data from :
// * the synchronously derived `item` (an element of `purchasesArray`)
// * the asynchronously derived `product` (from database).
// `item` is still available thanks to "closure".
// For example :
return {
'productName': product.name,
'quantity': item.quantity,
'unitPrice': product.unitPrice
};
})
// Here, by catching, no individual error will cause the whole response to fail.
.then(null, (err) => null);
});
return Promise.all(promises); // resolves once every item's promise has settled (failed items were converted to null above).
};
getProductDetails().then(results => {
console.log(results); // `results` is an array of the objects composed in getProductDetails(), with properties 'productName', 'quantity' etc.
res.json({
'message': "The action was successful",
'products': results
});
}).catch(err => {
console.log(err);
res.sendStatus(500); // or similar
});
Your final code will differ in detail, particularly in the composition of the returned object. Don't rely on my guesses.
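If your Node version supports async/await, the same pattern can be written a little more flatly. This is only a sketch under the same guesses about purchasesArray's shape (an array of items carrying productID and quantity); adapt the field names to your schema:
const getProductDetails = async function() {
    // One findOne() per purchase item, all started in parallel.
    const promises = purchasesArray.map(async function(item) {
        try {
            const product = await Product.findOne({ asin: item.productID }).exec();
            return {
                productName: product.productName,
                quantity: item.quantity
            };
        } catch (err) {
            return null; // an individual failure should not sink the whole response
        }
    });
    return Promise.all(promises);
};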

Why does express/mongodb updateOne post allow $set of some values, but will not update others?

I am baffled by this post method: it will update the fields 'x' and 'y', but any attempt to set an array of widgets fails.
It finds the correct item to update and passes all the required information through, but it will not allow insertion of, or an update to, the 'widgets' field.
Even if I remove the data intended for widgets and arbitrarily send through 'foo', it will not update with a 'widgets' field.
What am I doing wrong here???
Here is the API call to update widgets. The arbitrary x and y values do update in the database, but any attempt to update widgets makes no change:
const saveUpdatedWidgets = async (update, _id) => {
console.log("called to update widgets ",update.widgets," in pagecard saveUpdatedWidgets")
let widgetObject = []
for(let u=0;u<update.widgets.length;u++){
widgetObject.push({
id: update.widgets[u].id,
text: update.widgets[u].text
})
}
Api.withToken().post('/pagewidget/'+_id,
{widgets: widgetObject, x:250, y:250}
).then(function (response) {
console.log("?worked ",response.data)
}).catch(function (error) {
console.log("page save failed for some reason on pagecard: ",error.response);
});
};
The console output confirms the widget data is being passed through.
The code for the post method is:
//THIS ROUTER WILL NOT UPDATE ANY WIDGETS FOR SOME REASON
router.post('/pagewidget/:_id',auth, async(req,res)=>{
console.log("request to update ",req.body," for id ",req.params," in pagewidgetsave post")
const query = { "_id": req.params };
const addedWidgets = req.body;
const newValues = { $set: addedWidgets }
try {
const thePage = await Pages.updateOne( query, newValues);
res.status(201).send(thePage)
console.log("updated Page: ",thePage);
}
catch(e){
console.log(e);
res.status(400).send(e)
}
})
The console output from node shows that the values are going through, but only x and y actually update in the database.
Here is the axios api.js file if there are any issues here:
import axios from 'axios';
const baseURL = process.env.REACT_APP_BASE_URL || "http://localhost:3001"
export default {
noToken() {
return axios.create({
baseURL: baseURL
});
},
withToken() {
const tokenStr = window.sessionStorage.getItem("token")
return axios.create({
baseURL: baseURL,
headers: {"Authorization" : `Bearer ${tokenStr}`}
});
}
}
What is going on? It finds the page OK and updates the x and y values, but it can't update widgets, even if the value sent for widgets is just a string or a number...
I found the issue. The MongoDB documentation doesn't cover this very well, and in its examples for updateOne() it passes a plain object as the update argument.
BUT, if you are setting a new field, that argument must be wrapped inside an array to use $set; the array form is an update pipeline, which can contain both $set and $unset stages (see the MongoDB docs).
(i.e. updateOne({ query }, [{ $set: { field: "value" } }, { $unset: ["otherfield"] }]))
In the end the post method just had to change to const thePage = await Pages.updateOne( query, [newValues]); (with newValues stored as an object inside an array, so that a $unset stage could be added later if needed).
This is why it would update existing values OK, but would not set new values in the database.
What a journey....
Full code for post method here
router.post('/pagewidget/:_id',auth, async(req,res)=>{
const query = {"_id": req.params._id};
const addedWidgets = req.body;
const newValues = { $set: addedWidgets }
try {
const thePage = await Pages.updateOne( query, [newValues]);
res.status(201).send(thePage)
console.log("updated Page: ",thePage);
}
catch(e){
console.log(e);
res.status(400).send(e)
}
})

In mongo how to get the current position of the record in the table with the total records for pagination?

I'm trying to create a paginated list. I use GraphQL to query the data. With my query, I pass the number of records I need (in a variable named first) and the ID of the last fetched record (in a variable called after). I managed to write a query (note that I use Mongoose) to fetch the records. Now what I need is the relevant information to perform the pagination, such as hasNextPage, hasPreviousPage, currentPage and totalPages.
To get most of this information I need the total number of records in the database, which means sending another db request.
I also need to know the position of the record in the collection, and I have no idea how to get that.
Here's the query:
new Promise((resolve, reject) =>
Company.where('_id')
.gt(after)
.limit(first)
.lean()
.exec((error, doc) => {
if (error) {
reject(error);
}
resolve({
edges: doc,
pageInfo: {
hasNextPage: '...',
hasPreviousPage: '...',
currentPage: '...',
totalPages: '...'
}
});
}))
Any idea how to do this efficiently?
You can try the mongoose-paginate module.
Here is what I use for pagination:
var current = req.query.filter.current;
var limit = req.query.filter.limit;
console.log('params.query.filter.current',current);
var skip = Number(limit)*Number(current)-Number(limit);
console.log('skip::',skip);
Cours.find({'attributes.version.status': true}).skip(skip).limit(limit).sort({_id:'asc'}).exec(function (err, resulta) {
if (err) {
console.log('erreur trouverCours');
return res.json({
protecteds: err
});
}
console.log('cours ::', resulta);
res.json({
"data": resulta
});
});
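To fill in the pageInfo fields from the original question, another option is to run the page query and a collection count in parallel, and derive the record's position from a second, filtered count. This is only a sketch under a few assumptions: the Company model from the question, pages ordered by _id, and first/after already validated; getCompanyPage is a hypothetical helper name, and countDocuments() is the standard Mongoose call:
const getCompanyPage = async (after, first) => {
    const filter = after ? { _id: { $gt: after } } : {};
    // Run the page query, the total count and the "position" count in parallel.
    const [edges, totalCount, before] = await Promise.all([
        Company.find(filter).sort({ _id: 'asc' }).limit(first).lean().exec(),
        Company.countDocuments().exec(),
        // Number of records that come before this page.
        after ? Company.countDocuments({ _id: { $lte: after } }).exec() : 0
    ]);
    return {
        edges,
        pageInfo: {
            hasNextPage: before + edges.length < totalCount,
            hasPreviousPage: before > 0,
            currentPage: Math.floor(before / first) + 1,
            totalPages: Math.ceil(totalCount / first)
        }
    };
};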

Ionic 2 MEAN Application doesn't return updated data on get request

I've been having a weird issue with an application I'm building. When a function is invoked, I want to read in a user's current game statistics (wins, losses, draws, etc.). I do this using a service that creates an observable and consumes data from my REST API. On the first call of this method, the data read in is the most current, up-to-date version. After that point I update the user's document in the database, but when I execute the function again it reads in the original document from before the update. However, when I check the database, the document has in fact been updated.
Here is my provider function for consuming the data.
getUser(id) {
if (this.data) {
return Promise.resolve(this.data);
}
return new Promise(resolve => {
this.http.get('https://pitchlife-hearts.herokuapp.com/api/users/' + id)
.map(res => res.json())
.subscribe(data => {
this.data = data;
resolve(this.data);
});
});
}
Here is the call I make in my function.
play(challenger, opponent) {
this.userService.getUser(_id).then((data) => {
this.challenger_account = {
_id: data._id,
points: data.maroon_points,
wins: data.wins,
draws: data.draws,
losses: data.losses
};
Here is my update call.
this.userService.updateUser(this.challenger_account);
Here is my API endpoint as well, although this part does work every time I update the data.
app.post('/api/users/update', function (req, res) {
// Update a user
var options = {};
User.update({_id : req.body._id }, {
maroon_points: req.body.points,
wins: req.body.wins,
draws: req.body.draws,
losses: req.body.losses
}, options,
function (err, user) {
if (err)
res.send(err);
res.json(user);
});
});
Any help with this would be hugely appreciated as this is driving me crazy.
When are you updating the this.data property that the getUser(id) { ... } method uses?
The first time the getUser(id) {...} method is executed, this.data is null, so the HTTP request is made. After that, the value of this.data is always returned, and unless you update it manually it will always be the first value it was set to.
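One way to act on that is to let callers bypass or invalidate the cache. This is only a sketch: the refresh flag is hypothetical, and the body of updateUser() is a guess at what the existing service method might look like, written in the same style as getUser():
getUser(id, refresh = false) {
    // Serve the cached copy only when the caller has not asked for fresh data.
    if (this.data && !refresh) {
        return Promise.resolve(this.data);
    }
    return new Promise(resolve => {
        this.http.get('https://pitchlife-hearts.herokuapp.com/api/users/' + id)
            .map(res => res.json())
            .subscribe(data => {
                this.data = data;
                resolve(this.data);
            });
    });
}

updateUser(account) {
    // Drop the cached copy so the next getUser() call re-fetches from the API.
    this.data = null;
    return new Promise(resolve => {
        this.http.post('https://pitchlife-hearts.herokuapp.com/api/users/update', account)
            .map(res => res.json())
            .subscribe(data => resolve(data));
    });
}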

Meteor Publish Distinct Values of Field in Collection

I'm stuck on a pretty simple scenario in Meteor:
I have a huge collection of things with many fields, some of them containing quite a bit of text.
I want to create a page for searching that collection.
One of the fields that each item in the collection has is "category".
I'd like to give the user the ability to filter by that category.
For that, I need to publish just the distinct values of the category field in the collection.
I can't figure out a way to do that without publishing the whole collection which takes way too long. How can I publish just the distinct categories and use them to fill a dropdown?
Bonus question and somewhat related: How do I publish a count of all items in the collection without publishing the whole collection?
A good starting point to make this easier would be to normalize your categories into a separate database collection.
However assuming that is not possible or practical, the best (though imperfect) solution will be to publish two separate versions of your collection, one which returns only the categories field of the entire collection and another which returns all fields of the collection for the selected category only. That would look like the following:
// SERVER
Meteor.startup(function(){
Meteor.publish('allThings', function() {
// return only id and categories field for all your things
return Things.find({}, {fields: {categories: 1}});
});
Meteor.publish('thingsByCategory', function(category) {
// return all fields for things having the selected category
// you can then subscribe via something like a client-side Session variable
// e.g., Meteor.subscribe("thingsByCategory", Session.get("category"));
return Things.find({category: category});
});
});
Note that you will still need to assemble your array of categories client side from the Things cursor (for example, by using underscore's _.pluck and _.uniq methods to grab the categories and remove any dups). But the data set will be much smaller as you are only working with single-field documents now.
(Note that ideally, you would want to use Mongo's distinct() method in your publish function to publish only the distinct categories, but that is not possible directly as it returns an array which cannot be published).
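For example, the client-side assembly could look something like this (a minimal sketch: the template and helper names are made up, and the field is assumed to be categories as in the allThings publication above):
// CLIENT
Meteor.subscribe('allThings');

Template.categoryFilter.helpers({
    categories: function () {
        // Grab only the categories field, then flatten and de-duplicate.
        // If each document stores a single category string, _.flatten is a no-op.
        var docs = Things.find({}, {fields: {categories: 1}}).fetch();
        return _.uniq(_.flatten(_.pluck(docs, 'categories')));
    }
});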
You could use the internal this._documents.collectionName to only send new categories down to the client. Tracking which categories to remove becomes a bit ugly so you probably will still end up maintaining a separate 'categories' collection eventually.
Example:
Meteor.publish( 'categories', function(){
var self = this;
largeCollection.find({}, {fields: {category: 1}}).observeChanges({
added: function( id, doc ){
if( ! self._documents.categories[ doc.category ] )
self.added( 'categories', doc.category, {category: doc.category});
},
removed: function(){
_.keys( self._documents.categories ).forEach( function( category ){
if ( largeCollection.find({category: category},{limit: 1}).count() === 0 )
self.removed( 'categories', category );
});
}
});
self.ready();
});
Re: the bonus question, publishing counts: take a look at the meteorite package publish-counts. I think that does what you want.
These patterns might be helpful to you. Here is a publication that publishes counts:
/*****************************************************************************/
/* Counts Publish Function
/*****************************************************************************/
// server: publish the current size of a collection
Meteor.publish("countsByProject", function (arguments) {
var self = this;
if (this.userId) {
var roles = Meteor.users.findOne({_id : this.userId}).roles;
if ( _.contains(roles, arguments.projectId) ) {
//check(arguments.video_id, Integer);
// observeChanges only returns after the initial `added` callbacks
// have run. Until then, we don't want to send a lot of
// `self.changed()` messages - hence tracking the
// `initializing` state.
Videos.find({'projectId': arguments.projectId}).forEach(function (video) {
var count = 0;
var initializing = true;
var video_id = video.video_id;
var handle = Observations.find({video_id: video_id}).observeChanges({
added: function (id) {
//console.log(video._id);
count++;
if (!initializing)
self.changed("counts", video_id, {'video_id': video_id, 'observations': count});
},
removed: function (id) {
count--;
self.changed("counts", video_id, {'video_id': video_id, 'observations': count});
}
// don't care about changed
});
// Instead, we'll send one `self.added()` message right after
// observeChanges has returned, and mark the subscription as
// ready.
initializing = false;
self.added("counts", video_id, {'video_id': video_id, 'observations': count});
self.ready();
// Stop observing the cursor when client unsubs.
// Stopping a subscription automatically takes
// care of sending the client any removed messages.
self.onStop(function () {
handle.stop();
});
}); // Videos forEach
} //if _.contains
} // if userId
return this.ready();
});
And here is one that creates a new collection from a specific field:
/*****************************************************************************/
/* Tags Publish Functions
/*****************************************************************************/
// server: publish the current size of a collection
Meteor.publish("tags", function (arguments) {
var self = this;
if (this.userId) {
var roles = Meteor.users.findOne({_id : this.userId}).roles;
if ( _.contains(roles, arguments.projectId) ) {
var observations, tags, initializing, projectId;
initializing = true;
projectId = arguments.projectId;
observations = Observations.find({'projectId' : projectId}, {fields: {tags: 1}}).fetch();
tags = _.pluck(observations, 'tags');
tags = _.flatten(tags);
tags = _.uniq(tags);
var handle = Observations.find({'projectId': projectId}, {fields : {'tags' : 1}}).observeChanges({
added: function (id, fields) {
if (!initializing) {
tags = _.union(tags, fields.tags);
self.changed("tags", projectId, {'projectId': projectId, 'tags': tags});
}
},
removed: function (id) {
self.changed("tags", projectId, {'projectId': projectId, 'tags': tags});
}
});
initializing = false;
self.added("tags", projectId, {'projectId': projectId, 'tags': tags});
self.ready();
self.onStop(function () {
handle.stop();
});
} //if _.contains
} // if userId
return self.ready();
});
I have not tested this in Meteor, and from the other replies I'm skeptical that it will work directly, but MongoDB's distinct() would do the trick:
http://docs.mongodb.org/manual/reference/method/db.collection.distinct/
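If the list is only needed on demand (rather than reactively), one way to reach distinct() from Meteor is through the underlying driver collection. A rough sketch, assuming a Meteor version that exposes rawCollection() and resolves promises returned from server methods, and a collection named Things:
// SERVER
Meteor.methods({
    distinctCategories: function () {
        // rawCollection() exposes the node driver collection, whose distinct()
        // returns a plain array (wrapped in a Promise), not a cursor.
        return Things.rawCollection().distinct('category');
    }
});

// CLIENT
Meteor.call('distinctCategories', function (err, categories) {
    if (!err) {
        console.log(categories); // e.g. fill the dropdown with these values
    }
});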

Looking for help with reading from MongoDB in Node.JS

I have a number of records stored in MongoDB, and I'm trying to output them to the browser window by way of a Node.js http server. I think I'm a good portion of the way along, but I'm missing a few little things that are keeping it from actually working.
The code below uses node-mongo-native to connect to the database.
If there is anyone around who can help me make those last few connections to get this working in Node, I'd really appreciate it. To be fair, I'm sure this is just the start.
var sys = require("sys");
var test = require("assert");
var http = require('http');
var Db = require('../lib/mongodb').Db,
Connection = require('../lib/mongodb').Connection,
Server = require('../lib/mongodb').Server,
//BSON = require('../lib/mongodb').BSONPure;
BSON = require('../lib/mongodb').BSONNative;
var host = process.env['MONGO_NODE_DRIVER_HOST'] != null ? process.env['MONGO_NODE_DRIVER_HOST'] : 'localhost';
var port = process.env['MONGO_NODE_DRIVER_PORT'] != null ? process.env['MONGO_NODE_DRIVER_PORT'] : Connection.DEFAULT_PORT;
sys.puts("Connecting to " + host + ":" + port);
function PutItem(err, item){
var result = "";
if(item != null) {
for (key in item) {
result += key + '=' + item[key];
}
}
// sys.puts(sys.inspect(item)) // debug output
return result;
}
function ReadTest(){
var db = new Db('mydb', new Server(host, port, {}), {native_parser:true});
var result = "";
db.open(function (err, db) {
db.collection('test', function(err, collection) {
collection.find(function (err, cursor){
cursor.each( function (err, item) {
result += PutItem(err, item);
});
});
});
});
return result;
}
http.createServer(function (req, res) {
res.writeHead(200, {'Content-Type': 'text/plain'});
res.end("foo"+ReadTest());
}).listen(8124);
console.log('Server running on 8124');
Sources:
- mongo connectivity code: https://github.com/christkv/node-mongodb-native/blob/master/examples/simple.js
- node http code: nodejs.org
EDIT CORRECTED CODE
Thanks to Mic below who got me rolling in the right direction. For anyone interested, the corrected solution is here:
function ReadTest(res){
var db = new Db('mydb', new Server(host, port, {}), {native_parser:true});
var result = "";
res.write("in readtest\n");
db.open(function (err, db) {
res.write("now open\n");
db.collection('test', function(err, collection) {
res.write("in collection\n");
collection.find(function (err, cursor){
res.write("found\n");
cursor.each( function (err, item) {
res.write("now open\n");
var x = PutItem(err, item);
sys.puts(x);
res.write(x);
if (item == null) {
res.end('foo');
}
});
});
});
});
}
http.createServer(function (req, res) {
res.writeHead(200, {'Content-Type': 'text/plain'});
res.write("start\n");
ReadTest(res);
}).listen(8124);
console.log('Server running on 8124');
My guess is that you are returning result, writing the response, and closing the connection before anything is fetched from the db.
One solution would be to pass the response object to where you actually need it, something like:
function readTest(res) {
db.open(function (err, db) {
db.collection('test', function(err, collection) {
collection.find(function (err, cursor) {
res.writeHead(200, {'Content-type' : 'text/plain'});
cursor.each( function (err, item) { res.write(item); });
res.end();
...
Of course, you should also handle errors and try to avoid nesting too many levels, but that's a different discussion.
Instead of writing all the low-level Mongodb access code, you might want to try a simple library like mongous so that you can focus on your data, not on MongoDB quirks.
You might want to try mongoskin too.
Reading documents
To apply specific value filters, we can pass specific values to the find() command. Here is a SQL query:
SELECT * FROM Table1 WHERE name = 'ABC'
which is equivalent to the following in MongoDB (notice Collection1 for Table1):
db.Collection1.find({name: 'ABC'})
We can chain count() to get the number of results, or pretty() to get readable output. The results can be further narrowed by adding additional parameters:
db.Collection1.find({name: 'ABC', rollNo: 5})
It's important to notice that these filters are ANDed together by default. To apply an OR filter, we need to use $or. How a filter is written depends on the structure of the document. Ex: to match the name attribute of an embedded school object, we need to specify the filter as "school.name" = 'AUHS'.
Here we're using dot notation to reach the nested field name inside the field school. Also notice that the dotted field name is quoted; without the quotes we'll get a syntax error.
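In shell syntax, that filter looks like this (collection and field names taken from the example above):
db.Collection1.find({"school.name": 'AUHS'})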
Equality matches on arrays can be performed:
on the entire arrays
based on any element
based on a specific element
more complex matches using operators
In the below query:
db.Collection1.find({name: ['ABC','XYZ']})
MongoDB is going to identify documents by an exact match to an array of one or more values. For these types of queries, the order of elements matters: we will only match documents where name contains ABC followed by XYZ, and those are the only two elements of the array.
{name:["ABC","GHI","XYZ"]},
{name:["DEF","ABC","XYZ"]}
Given the above documents, let's say that we need to get all the documents where ABC is the first element of the array. We'll use the below filter:
db.Schools.find({'name.0': 'ABC' })
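For completeness, matching on any single element, and more complex matches with operators, look like this. A small sketch against the same hypothetical collection (the scores field with embedded documents is made up for illustration):
// Matches documents whose name array contains "ABC" anywhere in the array
db.Collection1.find({name: 'ABC'})

// Matches documents where at least one array element satisfies every condition
// inside $elemMatch (useful for arrays of embedded documents)
db.Collection1.find({scores: {$elemMatch: {type: 'exam', score: {$gt: 90}}}})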