I am a newbie in Sails.js and MongoDB. Thanks for reading my question.
I have run into real trouble.
Situation
I have a User model, a Report model, and a Post model.
I want to fetch Report data with its associated Post populated, and I also want the User data referenced by that Post.
In other words, I want doubly nested (deep) populated data.
Report
 |_ post
     |_ user
And this is my findReports action in ReportController.js
findReports: function (req, res) {
    Report.find().populateAll().sort({ createdAt: -1 }).exec((er, reports) => {
        if (er) return res.negotiate(er)
        const rs = reports
        console.log('rs : ', rs)
        rs.forEach((report) => {
            if (report.post) {
                // second-level lookup: fetch the user referenced by the post
                User.findOne({ id: report.post.user }).exec((er, user) => {
                    report.post.user = user
                    console.log("before rs : ", rs)
                })
                console.log("report.post.user : ", report.post.user)
                console.log("after rs : ", rs)
            } else {
                report.comment.user = User.findOne({ id: report.comment.user }).exec((er, user) => {
                    report.comment.user = user
                })
            }
        })
        console.log("final rs : ", rs)
        res.view('dashboard/reports/index', { reports: rs })
    })
},
When I run this code, the console output is printed in the following order:
1. "after rs : ..(blah).." - not what I want...
2. "final rs : ..(blah).." - not what I want...
3. "before rs : ..(blah).." - this is the rs I want! But my res.view("dashboard....", { reports: rs }) receives the rs from step 2.
I think the User.findOne call is deferred and runs independently of the rest of the logic.
I am really stuck. Any advice would be really helpful.
Waterline, the ORM from Sails.js, doesn't support deep population. So instead of that ORM you must use another one, such as Offshore.
Example:
Report.find().populate("post.user").exec((err, data) => {
    // your code here
})
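If you would rather stay on stock Waterline, another option is to do the second level of population manually and wait for all the nested lookups before rendering. Here is a rough sketch based on the question's models (Report, Post, User); it assumes your Waterline version returns thenable queries:
findReports: function (req, res) {
    Report.find().populateAll().sort({ createdAt: -1 }).exec((err, reports) => {
        if (err) return res.negotiate(err)

        // One promise per report that fills in the nested user
        const populated = reports.map((report) => {
            if (!report.post) return Promise.resolve(report)
            return User.findOne({ id: report.post.user }).then((user) => {
                report.post.user = user
                return report
            })
        })

        // Render only after every nested lookup has finished
        Promise.all(populated)
            .then((rs) => res.view('dashboard/reports/index', { reports: rs }))
            .catch((err) => res.negotiate(err))
    })
},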
Related
This is probably a simple concept to some but an understanding of promises is something that I continue to struggle with...
I have a simple web app that connects to a Mongo DB using Mongoose and an MVC pattern. I used this tutorial to get up and running, but I am not looking to build an API, just a back end for my application. I believe interfacing with the controller is where I need some help understanding...
My app structure is as follows:
db.config.js - DB connection managed here
movie.model.js - this is where my object schema is defined
movie.controller.js - this is where the DB operations are written
index.js - main file for my app
Here is the example controller
exports.create = (req, res) => {
    // Validate request
    if (!req.body.title) {
        res.status(400).send({ message: "Content can not be empty!" });
        return;
    }
    // Create a Tutorial
    const tutorial = new Tutorial({
        title: req.body.title,
        description: req.body.description,
        published: req.body.published ? req.body.published : false
    });
    // Save Tutorial in the database
    tutorial
        .save(tutorial)
        .then(data => {
            res.send(data);
        })
        .catch(err => {
            res.status(500).send({
                message:
                    err.message || "Some error occurred while creating the Tutorial."
            });
        });
};
I want to update this to accept an object and return a "response"; here begins my lack of understanding:
exports.create = (tutorialObject) => {
    // Save Tutorial in the database
    tutorial
        .save(tutorialObject)
        .then(data => {
            return data;
        })
        .catch(err => {
            return {
                message:
                    err.message || "Some error occurred while creating the Tutorial."
            };
        });
};
And finally, this is how I am calling it:
let dbResponse = tutorial.create({
    original_title: original_title,
    title: title,
    poster_path: poster_path
})
So my question... please help me understand the correct way to code/call this. Is the controller code correct, and am I calling it correctly? The code does work and writes records to the DB, but dbResponse is always undefined. I would expect the response to be the record that was inserted.
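For what it's worth, here is a minimal sketch of the usual fix, assuming the same Tutorial model as above: have create return the promise chain, and then await (or .then) it at the call site, so dbResponse becomes the saved record rather than undefined.
exports.create = (tutorialObject) => {
    // Build the document and return the promise so the caller can wait on it
    const tutorial = new Tutorial(tutorialObject);
    return tutorial.save(); // resolves with the saved document
};
And at the call site (inside an async function):
const dbResponse = await tutorial.create({
    original_title: original_title,
    title: title,
    poster_path: poster_path
});
console.log(dbResponse); // the inserted record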
Load huge JSON file using Pg-Promise helpers and fs stream.
I'm using pg-promise and I want to make massive inserts into a table using pgp.helpers. I've seen solutions like Multi-row insert with pg-promise and have also followed Data import for streams (Spex), but it still fails with the same error as in this post: https://github.com/vitaly-t/spex/issues/8
I tried adapting the example from the other post that uses a CSV stream (rs.csv()), but when I replaced it with a JSONStream parser I still got the same error.
Can you please share a working example?
db.tx(t => {
    return streamRead.call(t, stream.pipe(parser), receiver)
})
There might be a better way to do it, but the code below certainly works!
I set the chunk size (row.length) to 20,000 rows per insert statement; you can adjust it based on your needs.
stream.pipe(parser)
parser.on('data', data => {
    row.push(data)
    if (row.length === 20000) {
        parser.pause()
        //console.log(row.length)
        db.tx('inserting-products', t => {
            const insert = pgp.helpers.insert(row, cs)
            t.none(insert).then(() => {
                row = []
                parser.resume()
            })
        })
    }
})
parser.on('end', () => {
    //console.log(row.length)
    if (row.length != 0) {
        db.tx('inserting-products', t => {
            const insert = pgp.helpers.insert(row, cs)
            t.none(insert).then(() => {
                console.log('success')
                db.$pool.end()
            })
        })
    }
    if (row.length === 0) {
        console.log('success')
        db.$pool.end()
    }
})
Please let me know in the comments if this helps, or if there are other ways to improve the process.
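One possible improvement worth considering (an untested sketch using the same variables as above): return the query promise from inside db.tx so the transaction only commits once the insert has finished, and resume the parser from the transaction's .then:
parser.on('data', data => {
    row.push(data)
    if (row.length === 20000) {
        parser.pause()
        db.tx('inserting-products', t => {
            // returning the promise makes the transaction wait for the insert
            return t.none(pgp.helpers.insert(row, cs))
        })
        .then(() => {
            row = []
            parser.resume()
        })
        .catch(err => console.error(err))
    }
})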
I am a beginner with Node.js and Mongoose. I spent an entire day trying to resolve an issue by scouring through SO, but I just could not find the right solution. Basically, I am using the retrieved values from one collection to query another. In order to do this, I am iterating through a loop of the previously retrieved results.
With the iteration, I am able to populate the results that I need. Unfortunately, the area where I am having an issue is that the response is being sent back before the required information is gathered in the array. I understand that this can be handled by callbacks/promises. I tried numerous ways, but I just haven't been successful with my attempts. I am now trying to make use of the Q library to facilitate the callbacks. I'd really appreciate some insight. Here's a snippet of the portion where I'm currently stuck:
var length = Object.keys(purchasesArray).length;
var jsonArray = [];
var getProductDetails = function () {
    var deferred = Q.defer();
    for (var i = 0; i < length; i++) {
        var property = Object.keys(purchasesArray)[i];
        if (purchasesArray.hasOwnProperty(property)) {
            var productID = property;
            var productQuery = Product.find({asin: productID});
            productQuery.exec(function (err, productList) {
                jsonArray.push({"productName": productList[0].productName,
                    "quantity": purchasesArray[productID]});
            });
        }
    }
    return deferred.promise;
};
getProductDetails().then(function sendResponse() {
    console.log(jsonArray);
    response = {
        "message": "The action was successful",
        "products": jsonArray
    };
    res.send(response);
    return;
}).fail(function (err) {
    console.log(err);
})
});
In particular, I am only able to send one of the two objects in the jsonArray array, as the response is sent right after the first element is pushed.
Update
Thanks to Roamer-1888's answer, I have been able to construct a valid JSON response without having to worry about the error of setting headers after sending a response.
Basically, in the getProductDetails() function, I am trying to retrieve product names from the Mongoose query while mapping the quantity for each of the items in purchasesArray. From the function, eventually, I would like to form the following response:
response = {
"message": "The action was successful",
"products": jsonArray
};
where, jsonArray would be in the following form from getProductDetails :
jsonArray.push({
"productName": products[index].productName,
"quantity": purchasesArray[productID]
});
On the assumption that purchasesArray is the result of an earlier query, it would appear that you are trying to:
query your database once per purchasesArray item,
form an array of objects, each containing data derived from the query AND the original purchasesArray item.
If so, and with a few other guesses, the following pattern should do the job:
var getProductDetails = function() {
// map purchasesArray to an array of promises
var promises = purchasesArray.map(function(item) {
return Product.findOne({
asin: item.productID // some property of the desired item
}).exec()
.then(function (product) {
// Here you can freely compose an object comprising data from :
// * the synchronously derived `item` (an element of 'purchasesArray`)
// * the asynchronously derived `product` (from database).
// `item` is still available thanks to "closure".
// For example :
return {
'productName': product.name,
'quantity': item.quantity,
'unitPrice': product.unitPrice
};
})
// Here, by catching, no individual error will cause the whole response to fail.
.then(null, (err) => null);
});
return Promise.all(promises); // return a promise that settles when all `promises` are fulfilled or any one of them fails.
};
getProductDetails().then(results => {
console.log(results); // `results` is an array of the objects composed in getProductDetails(), with properties 'productName', 'quantity' etc.
res.json({
'message': "The action was successful",
'products': results
});
}).catch(err => {
console.log(err);
res.sendStatus(500); // or similar
});
Your final code will differ in detail, particularly in the composition of the returned object. Don't rely on my guesses.
router.get('/wiki/:topicname', function(req, res, next) {
    var topicname = req.params.topicname;
    console.log(topicname);
    summary.wikitext(topicname, function(err, result) {
        if (err) {
            return res.send(err);
        }
        if (!result) {
            return res.send('No article found');
        }
        $ = cheerio.load(result);
        var db = req.db;
        var collection = db.get('try1');
        collection.insert({ "topicname" : topicname, "content": result }, function (err, doc) {
            if (err) {
                // If it failed, return error
                res.send("There was a problem adding the information to the database.");
            }
            else {
                // And forward to success page
                res.send("Added succesfully");
            }
        });
    });
});
Using this code, I am trying to add the fetched content from Wikipedia into the collection try1. The message "Added succesfully" is displayed, but the collection seems to be empty; the data is not inserted into the database.
The data must be there: MongoDB uses the { w: 1, j: true } write concern options by default, so the callback only returns without an error if the document was truly inserted, assuming there was a document to insert.
Things you should consider:
- Do NOT use the insert function; it is deprecated. Use insertOne, insertMany, or bulkWrite instead. Ref.: http://mongodb.github.io/node-mongodb-native/2.1/api/Collection.html#insert
- The insert method's callback has two parameters: an error, if there was one, and a result. The result object has several properties that can be used to check the outcome of the insert; for example, result.insertedCount returns the number of inserted documents.
So, according to the above, your code only tests for an error, but you can insert zero documents without getting an error.
It is also not clear to me where you get your database name from. Is the following correct in your code? Are you sure you are connected to the database you want to use?
var db = req.db;
Also, you don't have to enclose your property names in quotes in your insert call. The insert should look something like this:
col.insertOne({topicname: topicname, content: result}, function(err, r) {
    if (err) {
        console.log(err);
    } else {
        console.log(r.insertedCount);
    }
});
Start your mongod server with the correct path, i.e., the same dbpath as the one you use when checking the contents of the collection.
sudo mongod --dbpath <actual-path>
I have a number of records stored in MongoDB that I'm trying to output to the browser window by way of a Node.js HTTP server. I think I'm a good portion of the way along, but I'm missing a few little things that are keeping it from actually working.
The code below uses node-mongo-native to connect to the database.
If there is anyone around who can help me make those last few connections to get this working in Node, I'd really appreciate it. To be fair, I'm sure this is just the start.
var sys = require("sys");
var test = require("assert");
var http = require('http');
var Db = require('../lib/mongodb').Db,
Connection = require('../lib/mongodb').Connection,
Server = require('../lib/mongodb').Server,
//BSON = require('../lib/mongodb').BSONPure;
BSON = require('../lib/mongodb').BSONNative;
var host = process.env['MONGO_NODE_DRIVER_HOST'] != null ? process.env['MONGO_NODE_DRIVER_HOST'] : 'localhost';
var port = process.env['MONGO_NODE_DRIVER_PORT'] != null ? process.env['MONGO_NODE_DRIVER_PORT'] : Connection.DEFAULT_PORT;
sys.puts("Connecting to " + host + ":" + port);
function PutItem(err, item){
var result = "";
if(item != null) {
for (key in item) {
result += key + '=' + item[key];
}
}
// sys.puts(sys.inspect(item)) // debug output
return result;
}
function ReadTest(){
var db = new Db('mydb', new Server(host, port, {}), {native_parser:true});
var result = "";
db.open(function (err, db) {
db.collection('test', function(err, collection) {
collection.find(function (err, cursor){
cursor.each( function (err, item) {
result += PutItem(err, item);
});
});
});
});
return result;
}
http.createServer(function (req, res) {
res.writeHead(200, {'Content-Type': 'text/plain'});
res.end("foo"+ReadTest());
}).listen(8124);
console.log('Server running on 8124');
Sources:
- Mongo connectivity code: https://github.com/christkv/node-mongodb-native/blob/master/examples/simple.js
- Node HTTP code: nodejs.org
EDIT: CORRECTED CODE
Thanks to Mic below who got me rolling in the right direction. For anyone interested, the corrected solution is here:
function ReadTest(res){
var db = new Db('mydb', new Server(host, port, {}), {native_parser:true});
var result = "";
res.write("in readtest\n");
db.open(function (err, db) {
res.write("now open\n");
db.collection('test', function(err, collection) {
res.write("in collection\n");
collection.find(function (err, cursor){
res.write("found\n");
cursor.each( function (err, item) {
res.write("now open\n");
var x = PutItem(err, item);
sys.puts(x);
res.write(x);
if (item == null) {
res.end('foo');
}
});
});
});
});
}
http.createServer(function (req, res) {
res.writeHead(200, {'Content-Type': 'text/plain'});
res.write("start\n");
ReadTest(res);
}).listen(8124);
console.log('Server running on 8124');
My guess is that you are returning result, writing the response, and closing the connection before anything is fetched from the db.
One solution would be to pass the response object to where you actually need it, something like:
function readTest(res) {
db.open(function (err, db) {
db.collection('test', function(err, collection) {
collection.find(function (err, cursor) {
res.writeHead(200, {'Content-type' : 'text/plain'});
cursor.each( function (err, item) { res.write(item); });
res.end();
...
Of course, you should also handle errors and try to avoid nesting too many levels, but that's a different discussion.
Instead of writing all the low-level Mongodb access code, you might want to try a simple library like mongous so that you can focus on your data, not on MongoDB quirks.
You might want to try mongoskin too.
Reading documents
To apply specific value filters, we can pass specific values to the find() command. Here is a SQL query:
SELECT * FROM Table1 WHERE name = 'ABC'
which is equivalent to the following in MongoDB (notice Collection1 for Table1):
db.Collection1.find({name: 'ABC'})
We can chain count() to get the number of results, or pretty() to get readable output. The results can be further narrowed by adding additional parameters:
db.Collection1.find({name: 'ABC', rollNo: 5})
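For example, the chaining mentioned above looks like this in the mongo shell (same collection and field names as before):
db.Collection1.find({name: 'ABC'}).count()   // number of matching documents
db.Collection1.find({name: 'ABC'}).pretty()  // formatted, readable output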
It's important to notice that these filters are ANDed together by default. To apply an OR filter, we need to use $or. These filters are specified according to the structure of the document; e.g., for the attribute name of an embedded object school, we need to specify the filter as "school.name": 'AUHS'.
Here we're using dot notation to access the nested field name of the field school. Also notice that such filters are quoted, without which we'll get syntax errors.
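For instance, an OR filter and a dot-notation filter on an embedded school object (the field names here are only illustrative) would look like:
db.Collection1.find({ $or: [ {name: 'ABC'}, {rollNo: 5} ] })
db.Collection1.find({ "school.name": 'AUHS' })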
Equality matches on arrays can be performed:
on the entire arrays
based on any element
based on a specific element
more complex matches using operators
In the below query:
db.Collection1.find({name: ['ABC','XYZ']})
MongoDB will identify documents by an exact match to an array of one or more values. For these types of queries the order of elements matters, meaning that we will only match documents that have ABC followed by XYZ, and where those are the only two elements of the array name. Now consider these two documents:
{name:["ABC","GHI","XYZ"]},
{name:["DEF","ABC","XYZ"]}
Given the documents above, let's say that we need to get all the documents where ABC is the first element. So we'll use the below filter:
db.Schools.find({'name.0': 'ABC' })
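By contrast, to match based on any single element, or on several values regardless of order, we can use a plain value or the $all operator (illustrative queries on the same Schools collection):
db.Schools.find({ name: 'ABC' })                     // matches if any array element equals 'ABC'
db.Schools.find({ name: { $all: ['ABC', 'XYZ'] } })  // both values present, in any order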