How to return the results of a MongoDB cursor back to Actix-web for display? - mongodb

I'm using Actix-web 4.0.0.rc3 along with MongoDB 2.1.0 (async mode). I've got the db integrated with Actix via app_data, but I'm trying to figure out how to display the results of a multi-document cursor.
Currently I'm using:
while let Some(doc) = cursor.try_next().await? {
    // process each doc and build a row of data
}
Spitting out the results is easy, but I can't figure out how to get those results to display in the browser. I've been searching for an example that explains how to do this, but haven't found a solution so far. Using handlebars it was easy to get a field from the DB and display the result, but a cursor of unknown size has me stumped. I've done this easily with RoR or Phoenix/Elixir. I'd appreciate any suggestions (or just a gentle nudge toward an example that might already exist).
Thanks

I have this example:
pub async fn get_aggregation_as_json_response(
    db: &Database,
    aggr_pipeline: Vec<Document>,
) -> HttpResponse {
    // Driver 2.x: `collection_with_type` is gone; use the generic `collection`.
    // `aggregate` always yields a cursor of plain BSON documents, so no typed
    // collection is needed here. `cursor.next()` needs `futures::stream::StreamExt`
    // in scope.
    let mut cursor = db
        .collection::<Document>("examplemodel")
        .aggregate(aggr_pipeline, None)
        .await
        .expect("Error performing aggregation on examplemodel collection.");
    let mut results: Vec<Document> = Vec::new();
    while let Some(result) = cursor.next().await {
        match result {
            Ok(document) => {
                results.push(document);
            }
            Err(_) => {
                // Any cursor error aborts the request with a 500.
                return HttpResponse::InternalServerError().finish();
            }
        }
    }
    HttpResponse::Ok().json(results)
}
Is that what you are looking for?

Related

dexiejs query get slower overtime

So I am using Svelte + Vite with Dexie.js as my offline DB and Routify for the routes. When I go to a page that queries Dexie.js, the response is pretty quick on the first request, but when I navigate to another page and back again, the response gets slower with each query.
I've used the index as this answer suggests, but it still happens. What did I miss? Can anyone recommend alternatives with better performance than Dexie.js or PouchDB for an offline DB? I am currently trying PouchDB as an alternative.
Here is my code:
let taskDone;
let taskOngoing;
let clustercount;
let tasks = [];

onMount(async function () {
    // @ts-ignore
    let clusterQuery = await db.cluster.reverse().sortBy("id");
    clustercount = clusterQuery.length;
    for (const clusters of clusterQuery) {
        // @ts-ignore
        taskDone = await db.task
            .where("[cluster_id+status]")
            .anyOf([clusters.id, 1], [clusters.id, 2])
            .toArray();
        // @ts-ignore
        taskOngoing = await db.task
            .where({ cluster_id: clusters.id })
            .toArray();
        tasks = [
            {
                cluster_id: clusters.id,
                count_done: taskDone.length,
                count_ongoing: taskOngoing.length,
                cluster_name: clusters.name,
            },
            ...tasks,
        ];
    }
    dispatch("showList", tasks);
});
It looks like the tasks array is filled a little more every time onMount is called. Is it a globally declared variable? Was it meant to be local?
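If tasks is indeed declared at module scope, it survives navigation and every mount keeps prepending to it. A minimal sketch of the fix (same Dexie queries as above, only the accumulator moved): declare tasks inside onMount so each mount starts from an empty list.

onMount(async function () {
    // Local accumulator: starts empty on every mount instead of
    // growing a module-scoped array across page visits.
    let tasks = [];
    const clusterQuery = await db.cluster.reverse().sortBy("id");
    for (const clusters of clusterQuery) {
        const taskDone = await db.task
            .where("[cluster_id+status]")
            .anyOf([clusters.id, 1], [clusters.id, 2])
            .toArray();
        const taskOngoing = await db.task
            .where({ cluster_id: clusters.id })
            .toArray();
        tasks = [
            {
                cluster_id: clusters.id,
                count_done: taskDone.length,
                count_ongoing: taskOngoing.length,
                cluster_name: clusters.name,
            },
            ...tasks,
        ];
    }
    dispatch("showList", tasks);
});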

MongoDB Converting Circular Structure to JSON error

I'm trying to query a collection of users using Mongoose in a Node API.
The handler looks like this:
exports.getUsers = async function(req, res, next) {
    try {
        let users = db.User.find();
        return res.status(200).json(users);
    } catch(e) {
        return next(e);
    }
};
This returns an error that reads "Converting circular structure to JSON". When I console.log() the result of db.User.find(), I get a Query object. I've checked everything else; all of my other routes are working normally.
Well... I figured it out. I'll post the answer I discovered in case anyone else is trying to figure this out. It turns out, after a little more careful reading of the documentation, that the Query object that is returned has to be executed. There are two ways to execute it: with a callback function or by returning a promise (but not both). I found the page on queries in the mongoose docs helpful. My final handler looked like this:
exports.getUsers = async function(req, res, next) {
    try {
        db.User.find()
            .then(users => {
                return res.status(200).json(users);
            });
    } catch(e) {
        // Note: since the promise is not awaited, a rejected find()
        // will not actually reach this catch block.
        return next(e);
    }
};
Next time I guess I'll dig around for a few more minutes before asking.
Edit to add:
Found a second solution. Since the handler is an async function, I was also able to use the following inside the try block:
let users = await db.User.find();
return res.status(200).json(users);
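For completeness, a third equivalent form (assuming the same db.User model) executes the query explicitly with Mongoose's Query#exec(), which also returns a promise:

exports.getUsers = async function(req, res, next) {
    try {
        // exec() runs the Query and returns a genuine promise.
        let users = await db.User.find().exec();
        return res.status(200).json(users);
    } catch(e) {
        return next(e);
    }
};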

Fetching json from Mongo with Meteor

I am trying to fetch a JSON object from MongoDB using Meteor, but I have no clue why I'm unable to do so.
I need it to be a JSON object only.
One of the entries of the collection looks like this:
[Image taken from Meteor Dev Tools]
Link: https://i.stack.imgur.com/BxRmS.png
I’m trying to fetch the value part by passing the name.
Code on front end:
export default withTracker(() => {
    let aSub = Meteor.subscribe('allEntries');
    return {
        aBoundaries: DataCollection.find({}).fetch()
    };
})(ComponentName);
The Meteor Call Statement on front-end:
dataFromDb = Meteor.call('functionToBeCalled', 'Sydney');
Server-side Code:
Meteor.publish('allEntries', function(){
    return DataCollection.find();
});

Meteor.methods({
    functionToBeCalled(aName){
        return DataCollection.find({name: aName});
    }
});
Another of my questions is:
Is there any way that we publish only all the names in the beginning and then publish the values on demand?
Thanks for your help in advance!
I have tried this as well, but it did not work:
functionToBeCalled(aName){
    var query = {};
    query['name'] = aName;
    return DataCollection.find(query).fetch();
}
The issue seems to be with the query.
Collection.find() returns a cursor, not the documents themselves.
To get an array of objects, use Collection.find().fetch(). The documents come back as an array of JSON objects, like [{json1}, {json2}].
If there is a single document, you can access it with Collection.find().fetch()[0]. Another alternative is findOne, e.g. Collection.findOne(), which returns a single JSON object.
Use Meteor.subscribe('allEntries') directly; do not assign it to a variable.
Meteor.subscribe is asynchronous, so it's best to ensure your subscriptions are ready before you fetch data.
Log DataCollection.find({}).fetch() to your console.
Check the official reference: https://docs.meteor.com/api/pubsub.html#Meteor-subscribe.
Your second question isn't that clear.
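If the second question means "publish only the names up front, then load each value on demand", a minimal sketch (with a hypothetical 'allNames' publication and 'getValueByName' method; the field projection is standard Meteor/Mongo API):

// Publish only _id and name via a field projection.
Meteor.publish('allNames', function () {
    return DataCollection.find({}, { fields: { name: 1 } });
});

// Fetch the full document on demand.
Meteor.methods({
    getValueByName(aName) {
        return DataCollection.findOne({ name: aName });
    }
});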
Just in case anyone comes here to look for the answer ~~~
So... I was able to make it work with this code on the server:
Meteor.methods({
    functionToBeCalled(aName){
        console.log(aName);
        return DataCollection.findOne({name: aName});
    }
});
And this on the client:
Meteor.call('functionToBeCalled', nameToBePassed, (error, response) => {
    console.log(error, "error");
    console.log(response, "response"); // response here
});
Thanks for the help!

socket.io for react native (sending query problems)

I'm using this library and I can connect without problems.
Usually when I have worked with sockets, the code I used is:
socket = io.connect(url, { query: 'token=' + token });
and I can see this info by reading socket.request._query.
Using socket.io for React Native, I'm trying to send params:
this.socket = new SocketIO('http://localhost:3000', { query: 'token=' + token });
but in socket.request._query I can only see this:
{ transport: 'polling', b64: '1' }
The library mentions some options like connectParams, but I don't know how I can see the info.
Related: link
It's not well documented in the repo, but connectParams is a key/value object, and the values you send in it will be appended to the URL, as shown here:
if connectParams != nil {
    for (key, value) in connectParams! {
        let keyEsc = key.urlEncode()!
        let valueEsc = "\(value)".urlEncode()!
        queryString += "&\(keyEsc)=\(valueEsc)"
    }
}
(Source)
So, you should try using connectParams like this (though I'm not sure how you tried it before):
this.socket = new SocketIO('http://localhost:3000', {
    connectParams: {
        myAwesomeQueryStringParam: "someRandomValue"
    }
});
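To actually see those values arrive, a minimal sketch for the Node side (standard socket.io server API; io is assumed to be your server instance): query parameters appended to the connection URL show up on the handshake object.

io.on('connection', function (socket) {
    // Includes your connectParams values along with transport details.
    console.log(socket.handshake.query);
});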
PS: Forgive me, my English is pretty bad.

How to stream MongoDB Query Results with nodejs?

I have been searching for an example of how I can stream the result of a MongoDB query to a Node.js client. All solutions I have found so far seem to read the entire query result at once and then send it back to the client in one piece.
Instead, I would (obviously) like to supply a callback to the query method and have MongoDB call that when the next chunk of the result set is available.
I have been looking at mongoose - should I probably use a different driver?
Jan
node-mongodb-native (the underlying driver that every MongoDB client for Node.js uses) has, besides the cursor API that others mentioned, a nice stream API (#458). Unfortunately I did not find it documented elsewhere.
Update: there are docs.
It can be used like this:
var stream = collection.find().stream()
stream.on('error', function (err) {
console.error(err)
})
stream.on('data', function (doc) {
console.log(doc)
})
It actually implements the ReadableStream interface, so it has all the goodies (pause/resume, etc.).
Streaming in Mongoose became available in version 2.4.0, which appeared three months after you posted this question:
Model.where('created').gte(twoWeeksAgo).stream().pipe(writeStream);
More elaborated examples can be found on their documentation page.
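Note for readers on newer Mongoose versions: Query#stream() was deprecated and later removed in favour of Query#cursor(). A rough equivalent of the line above (same hypothetical writeStream) would be:

// Query#cursor() returns an async-iterable cursor; run this inside an async function.
const cursor = Model.where('created').gte(twoWeeksAgo).cursor();
for await (const doc of cursor) {
    writeStream.write(JSON.stringify(doc) + '\n');
}
writeStream.end();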
mongoose is not really a "driver"; it's actually an ORM wrapper around the MongoDB driver (node-mongodb-native).
To do what you're doing, take a look at the driver's .find and .each methods. Here's some code from the examples:
// Find all records. find() returns a cursor
collection.find(function(err, cursor) {
    sys.puts("Printing docs from Cursor Each");
    cursor.each(function(err, doc) {
        if (doc != null) sys.puts("Doc from Each " + sys.inspect(doc));
    });
});
To stream the results, you basically replace that sys.puts with your "stream" function. I'm not sure how you plan to stream the results; I think you can do response.write() + response.flush(), but you may also want to check out socket.io.
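A minimal sketch of that response.write() idea (assuming an http server and a collection handle opened as in the other answers here), writing each document as the cursor yields it instead of buffering the whole result:

var http = require('http');

http.createServer(function (req, res) {
    res.writeHead(200, { "Content-Type": "application/json" });
    res.write('[');
    var first = true;
    collection.find(function (err, cursor) {
        cursor.each(function (err, doc) {
            if (doc != null) {
                // Stream each document as soon as it arrives.
                res.write((first ? '' : ',') + JSON.stringify(doc));
                first = false;
            } else {
                // A null doc means the cursor is exhausted.
                res.write(']');
                res.end();
            }
        });
    });
}).listen(8000);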
Here is the solution I found (please correct me if this is the wrong way to do it):
(Also excuse the bad coding - it's too late at night for me to prettify this.)
var sys = require('sys');
var http = require("http");
var Db = require('/usr/local/src/npm/node_modules/mongodb/lib/mongodb').Db,
    Connection = require('/usr/local/src/npm/node_modules/mongodb/lib/mongodb').Connection,
    Collection = require('/usr/local/src/npm/node_modules/mongodb/lib/mongodb').Collection,
    Server = require('/usr/local/src/npm/node_modules/mongodb/lib/mongodb').Server;

var db = new Db('test', new Server('localhost', Connection.DEFAULT_PORT, {}));
var products;

db.open(function (error, client) {
    if (error) throw error;
    products = new Collection(client, 'products');
});
function ProductReader(collection) {
    this.collection = collection;
}

ProductReader.prototype = new process.EventEmitter();

ProductReader.prototype.do = function() {
    var self = this;
    this.collection.find(function(err, cursor) {
        if (err) {
            self.emit('e1');
            return;
        }
        sys.puts("Printing docs from Cursor Each");
        self.emit('start');
        cursor.each(function(err, doc) {
            if (err) {
                // an error mid-iteration aborts the stream
                self.emit('e2');
                self.emit('end');
                return;
            }
            if (doc != null) {
                sys.puts("doc:" + doc.name);
                self.emit('doc', doc);
            } else {
                self.emit('end');
            }
        });
    });
};
http.createServer(function(req, res) {
    var pr = new ProductReader(products);
    pr.on('e1', function() {
        sys.puts("E1");
        res.writeHead(400, {"Content-Type": "text/plain"});
        res.write("e1 occurred\n");
        res.end();
    });
    pr.on('e2', function() {
        sys.puts("E2");
        res.write("ERROR\n");
    });
    pr.on('start', function() {
        sys.puts("START");
        res.writeHead(200, {"Content-Type": "text/plain"});
        res.write("<products>\n");
    });
    pr.on('doc', function(doc) {
        sys.puts("A DOCUMENT" + doc.name);
        res.write("<product><name>" + doc.name + "</name></product>\n");
    });
    pr.on('end', function() {
        sys.puts("END");
        res.write("</products>");
        res.end();
    });
    pr.do();
}).listen(8000);
I have been studying MongoDB streams myself; while I do not have the entire answer you are looking for, I do have part of it.
You can set up a socket.io stream.
This uses JavaScript socket.io and socket.io-stream, both available on NPM, plus MongoDB for the database, because using a 40-year-old database that has issues is incorrect and it's time to modernize; besides, the 40-year-old DB is SQL, and SQL doesn't do streams to my knowledge.
So although you only asked about data going from server to client, I also want to cover client to server in my answer, because I can never find that anywhere when I search, and I wanted to set up one place with both the send and receive elements via stream so everyone could get the hang of it quickly.
client side sending data to server via streaming:
stream = ss.createStream();
blobstream = ss.createBlobReadStream(data);
blobstream.pipe(stream);
// The event name here must match the server-side listener below.
ss(socket).emit('data.stream.out', stream, {}, function(err, successful_db_insert_id) {
    // if you get back the id, it went into the db and everything worked
});
server receiving the stream from the client side and then replying when done:
ss(socket).on('data.stream.out', function(stream, o, c) {
    var buffer = [];
    stream.on('data', function(chunk) { buffer.push(chunk); });
    stream.on('end', function() {
        buffer = Buffer.concat(buffer);
        db.insert(buffer, function(err, res) {
            // pass the insert result (or error) back to the client's callback
            c(err, res);
        });
    });
});
// The other half: fetching data and streaming it to the client
client side requesting and receiving stream data from server:
stream = ss.createStream();
binarystring = '';
stream.on('data', function(chunk) {
    for (var i = 0; i < chunk.length; i++) {
        binarystring += String.fromCharCode(chunk[i]);
    }
});
stream.on('end', function() {
    data = window.btoa(binarystring);
    c(null, data);
});
ss(socket).emit('data.stream.get', stream, o, c);
server side replying to the request for streaming data:
ss(socket).on('data.stream.get', function(stream, o, c) {
    stream.on('end', function() {
        c(null, true);
    });
    db.find().stream().pipe(stream);
});
The very last one there is the only one where I am kind of just pulling it out of my butt, because I have not yet tried it, but it should work. I actually do something similar, except I write the file to the hard drive and then use fs.createReadStream to stream it to the client. So I'm not 100% sure, but from what I've read it should be fine; I'll get back to you once I test it.
P.S. Anyone who wants to bug me about my colloquial way of talking: I'm Canadian, and I love saying "eh". Come at me with your hugs and hits, bros/sis' :D