How to check if a collection has changed? - mongodb

I've created a JSON API with Express.js, Mongoose and MongoDB. Currently, there's no way for the clients of the API to check if the data in a collection has changed - they would need to download the whole collection periodically.
How could I allow the clients of the API to check for changes to a collection (inserts, updates, deletions) without downloading the collection itself?
Is there a way of getting the version number of the collection, the last change timestamp or a hash of the collection with Mongoose? What is the best practice solution to this problem?

In MongoDB versions before 3.6, you have to do it on the application side, for example by polling.
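For instance, a minimal polling sketch, assuming the application maintains an updatedAt field on every document (the field, connection string and function name are illustrative, not something MongoDB provides out of the box):

const { MongoClient } = require('mongodb');

// Returns the most recent updatedAt in the collection, or null if empty.
// Deletions are not visible this way and would need separate tracking
// (e.g. a counter document or tombstones).
async function getLastChange(uri, dbName, collName) {
    const client = await MongoClient.connect(uri);
    try {
        const [latest] = await client.db(dbName)
            .collection(collName)
            .find({}, { projection: { updatedAt: 1 } })
            .sort({ updatedAt: -1 })
            .limit(1)
            .toArray();
        return latest ? latest.updatedAt : null;
    } finally {
        await client.close();
    }
}

API clients can compare this timestamp with the one they saw last and only re-download the collection when it has advanced.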
MongoDB 3.6 introduced a feature called change streams that allows you to listen to changes happening on your collections in real time.
Sample code that listens for changes on a collection is below:
var MongoClient = require('mongodb').MongoClient;

MongoClient.connect("mongodb://172.16.0.110:27017/myproject?readConcern=majority").then(function (client) {
    var db = client.db('myproject');
    var changeStreams = db.collection('documents').watch();
    changeStreams.on('change', function (change) {
        console.log(change);
    });
});
If you are using Node.js, you need the following dependency in package.json to get it working:

"dependencies": {
    "mongodb": "mongodb/node-mongodb-native#3.0.0"
}
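To surface this to clients of your own JSON API without shipping the whole collection, one possibility (purely illustrative; the route, counter and connection string below are assumptions, not part of the answer above) is to let a change stream maintain a per-collection version that a cheap Express route exposes:

const express = require('express');
const { MongoClient } = require('mongodb');

const app = express();
// In-memory version info, bumped on every change event (resets on restart).
const version = { counter: 0, lastChange: null };

MongoClient.connect('mongodb://localhost:27017').then(function (client) {
    const collection = client.db('myproject').collection('documents');
    collection.watch().on('change', function () {
        version.counter += 1;
        version.lastChange = new Date();
    });
});

// Clients poll this endpoint and re-download the collection only when the
// counter (or timestamp) has moved since their last check.
app.get('/documents/version', function (req, res) {
    res.json(version);
});

app.listen(3000);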

Related

Meteor - using synchronised non-persistent / in-memory MongoDB on the server

In a Meteor app, real-time reactive updates between all connected clients are achieved by writing to collections and publishing and subscribing to the right data. Normally this also means database writes.
But what if I would like to sync particular data which does not need to be persistent, and I would like to save the overhead of writing to the database? Is it possible to use minimongo or other in-memory caching on the server while still preserving DDP synchronisation to all clients?
Example
In my app I have multiple collapsed threads and I want to show which users have currently expanded a particular thread:
Viewed by: Mike, Johny, Steven ...
I can store the information in the threads collection or make a separate viewers collection and publish the information to the clients. But there is actually no point in making this information persistent and having the overhead of database writes.
I am confused by the collections documentation, which states:
OPTIONS
connection Object
The server connection that will manage this collection. Uses the default connection if not specified. Pass the return value of calling DDP.connect to specify a different server. Pass null to specify no connection.
and
... when you pass a name, here’s what happens:
...
On the client (and on the server if you specify a connection), a Minimongo instance is created.
But if I create a new collection and pass the options object with connection: null
// Creates a new Mongo collection and exports it
export const Presentations = new Mongo.Collection('presentations', { connection: null });

/**
 * Publications
 */
if (Meteor.isServer) {
    // This code only runs on the server
    Meteor.publish(PRESENTATION_BY_MAP_ID, (mapId) => {
        check(mapId, nonEmptyString);
        return Presentations.find({ matchingMapId: mapId });
    });
}
no data is being published to the clients.
TLDR: it's not possible.
There is no magic in Meteor that allows data to be synced between clients without the data passing through the MongoDB database. The whole sync process through publications and subscriptions is triggered by MongoDB writes. Hence, if you don't write to the database, you cannot sync data between clients (using the native pub/sub system available in Meteor).
After countless hours of trying everything possible I found a way to do what I wanted:
export const Presentations = new Mongo.Collection('presentations', Meteor.isServer ? {connection: null} : {});
I checked MongoDB and no presentations collection is being created. Also, on every server restart the collection is empty. There is a small downside on the client: even though collectionHandle.ready() is truthy, findOne() first returns undefined and the data is synced shortly afterwards.
I don't know if this is the right/preferable way, but it was the only one that worked for me so far. I tried to leave {connection: null} in the client code, but I wasn't able to achieve any sync even though I implemented the added/changed/removed methods.
Sadly, I wasn't able to get any further help even in the meteor forum here and here
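For completeness, here is a hedged sketch of the direction that generally works for this: keep the collection at {connection: null} on the server only, and publish it with the low-level publish API, driving this.added/changed/removed from observeChanges. Everything below is illustrative and reuses the names from the question; it is not a confirmed fix.

if (Meteor.isServer) {
    Meteor.publish(PRESENTATION_BY_MAP_ID, function (mapId) {
        check(mapId, nonEmptyString);
        const self = this;

        // Mirror the in-memory (connection: null) collection to the client by hand.
        const handle = Presentations.find({ matchingMapId: mapId }).observeChanges({
            added(id, fields) {
                self.added('presentations', id, fields);
            },
            changed(id, fields) {
                self.changed('presentations', id, fields);
            },
            removed(id) {
                self.removed('presentations', id);
            }
        });

        self.ready();
        self.onStop(() => handle.stop());
    });
}

On the client, the collection would then be created as new Mongo.Collection('presentations') without the connection: null option, which matches the Meteor.isServer ? {connection: null} : {} workaround above.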

Atomically query for all collection documents + watching for further changes

Our Java app saves its configurations in a MongoDB collection. When the app starts it reads all the configurations from MongoDB and caches them in Maps. We would like to use the change stream API to also be able to watch for updates to the configurations collection.
So, upon app startup, first we would like to get all configurations, and from now on - watch for any further change.
Is there an easy way to execute the following atomically:
1. A find() that retrieves all configurations (documents)
2. Start a watch() that will send all further updates
By atomically I mean: without potentially missing any update (between 1 and 2 someone could update the collection with a new configuration).
To make sure I lose no update notifications, I found that I can use watch().startAtOperationTime(serverTime) (for MongoDB 4.0 or later), as follows:
1. Query the MongoDB server for its current time, using a command such as: Document hostInfoDoc = mongoTemplate.executeCommand(new Document("hostInfo", 1))
2. Query for all interesting documents: List<C> configList = mongoTemplate.findAll(clazz);
3. Extract the server time from hostInfoDoc: BsonTimestamp serverTime = (BsonTimestamp) hostInfoDoc.get("operationTime");
4. Start the change stream configured with the saved server time: ChangeStreamIterable<Document> changes = eventCollection.watch().startAtOperationTime(serverTime);
Since 1 ends before 2 starts, we know that the documents returned by 2 are at least as fresh as that server time, and any updates that happened on or after this server time will be sent to us by the change stream. (I don't mind receiving redundant updates, because I use a map as a cache, so an extra add/remove makes no difference as long as the last action arrives.)
I think I could also use watch().resumeAfter(_idOfLastAddedDoc) (didn't try it). I did not use that approach because of the following scenario: the collection is empty, and the first document is added after getting all (zero) documents but before starting the watch(). In that scenario I have no previous document _id to use as a resume token.
Update
Instead of using "hostInfo" for getting the server time, which couldn't be used in our production environment, I ended up using "dbStats" like this:
Document dbStats = mongoOperations.executeCommand(new Document("dbStats", 1));
BsonTimestamp serverTime = (BsonTimestamp) dbStats.get("operationTime");
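For reference, the same get-all-then-watch pattern sketched in Node.js rather than Java (assuming MongoDB 4.0+ on a replica set; the database, collection and cache names are made up for illustration):

const { MongoClient } = require('mongodb');

async function loadAndWatch(uri) {
    const client = await MongoClient.connect(uri);
    const db = client.db('myapp');
    const configs = db.collection('configs');

    // 1. Ask the server for its current operation time (command replies on a
    //    replica set carry an operationTime field).
    const stats = await db.command({ dbStats: 1 });
    const serverTime = stats.operationTime;

    // 2. Load everything that exists right now into the in-memory cache.
    const cache = new Map();
    for (const doc of await configs.find().toArray()) {
        cache.set(String(doc._id), doc);
    }

    // 3. Replay every change from that server time onwards, so nothing that
    //    happened between the find() and the watch() is missed.
    const changeStream = configs.watch([], { startAtOperationTime: serverTime });
    changeStream.on('change', (change) => {
        // Apply inserts/updates/deletes to the cache here.
        console.log(change.operationType, change.documentKey);
    });
}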

Meteor reactive publish data from different collections

I am trying to build a home automation system with Meteor, and I would like to do the following.
I have a collection with all the different liveValues I'm reading from various sources. Each document is the current value of, for example, a sensor.
Now I want to create a second collection called Thing. In this collection I'd like to add all my "Things", for example "Room temperature living", with the data for that thing. One attribute should be a reference to one of the liveValues.
Now I want to publish and subscribe to the Thing collection with Meteor, because on the web interface it doesn't matter which liveValue is behind the Thing.
Here is where, in my opinion, the complicated part starts.
How can I publish the data to the client and get a reactive update when the liveValue for the thing has changed, given that it lives in a different collection than the Thing collection?
My idea is to do this via one subscription to one Thing document, with the subscription also returning the updated value from the liveValues collection.
Is this workable? Does somebody have an idea how I can handle this?
I've heard about meteor-reactive-publish, but I'm not sure if it is the solution. I've also heard that it needs a lot of server resources.
Thanks for your help.
So basically you want to merge the documents on the server side into one reactive collection on the client side.
You should use observeChanges, provided by Meteor collection cursors, as described in the docs.
With it you can observe the changes on your server-side collections and publish them to the aggregated client-side collection, like this:
// Inside a Meteor.publish(...) function on the server.
// Get the data from a kind of sensor
var cursor = SomeSensor.find({/* your query */});
var self = this;

// Observe the changes in the cursor and publish them to the
// 'things' collection on the client.
// Note: observeChanges callbacks receive (id, fields), not a full document.
var observer = cursor.observeChanges({
    added: function (id, fields) {
        self.added('things', id, fields);
    },
    changed: function (id, fields) {
        self.changed('things', id, fields);
    },
    removed: function (id) {
        self.removed('things', id);
    }
});

// Make the publication ready
self.ready();

// Stop the observer when the subscription stops
self.onStop(function () {
    observer.stop();
});
With this the things collection will have the data from all the sensors reactively.
Hope it helps you.
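On the client side, a matching sketch could look like this (the 'things' name comes from the code above; the publication name and everything else are illustrative assumptions):

// Client-only collection; there is no 'things' collection in MongoDB,
// the publication above fills it over DDP.
const Things = new Mongo.Collection('things');

Meteor.subscribe('things');

// Reactive computation: re-runs whenever a published sensor value changes.
Tracker.autorun(() => {
    Things.find().forEach((thing) => {
        console.log(thing._id, thing);
    });
});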

Import "normal" MongoDB collections into DerbyJS 0.6

Same situation like this question, but with current DerbyJS (version 0.6):
Using imported docs from MongoDB in DerbyJS
I have a MongoDB collection with data that was not saved through my
Derby app. I want to query against that and pull it into my Derby app.
Is this still possible?
The accepted answer there points to a dead link. The newest working link would be this: https://github.com/derbyjs/racer/blob/0.3/lib/descriptor/query/README.md
That refers to the 0.3 branch of Racer (the current master version is 0.6).
What I tried
Searching the internets
The naïve way:
var query = model.query('projects-legacy', { public: true });
model.fetch(query, function () {
    query.ref('_page.projects');
});
(doesn't work)
A utility was written for this purpose: https://github.com/share/igor
You may need to modify it to only run against a single collection instead of the whole database, but it essentially goes through every document in the database and modifies it with the necessary livedb metadata and creates a default operation for it as well.
In livedb every collection has a corresponding operations collection; for example, profiles will have a profiles_ops collection which holds all the operations for profiles.
You will have to convert the collection to use it with Racer/livedb because of the metadata livedb expects on the document itself.
An alternative, if you don't want to convert, is to use traditional AJAX/REST to get the data from your Mongo database and then just put it into your local model. This will not be real-time or synced to the server, but it will allow you to drive your templates from data that you don't want to convert for some reason.
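A rough sketch of that AJAX/REST alternative, under the assumption that you can reach the underlying Express app of your Derby server and that you use the native MongoDB driver (route, database and path names are made up for illustration):

// Server: expose the legacy collection over a plain REST endpoint.
const { MongoClient } = require('mongodb');

module.exports = function (expressApp) {
    MongoClient.connect('mongodb://localhost:27017').then((client) => {
        const legacy = client.db('mydb').collection('projects-legacy');
        expressApp.get('/api/projects-legacy', async (req, res) => {
            res.json(await legacy.find({ public: true }).toArray());
        });
    });
};

// Client-side Derby code: fetch the JSON and put it at a local,
// non-synced path of the model, e.g.
//
//   const data = await (await fetch('/api/projects-legacy')).json();
//   model.set('_page.projects', data);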

Node.js listen to MongoDB change

Is there a way for Node.js to listen to a change in a particular data in a collection of MongoDB, and fire an event if a change happens?
Well, this is an old question, but I was struggling with the same thing. I found a number of tidbits that helped me put together a solution, and I've published it as a library:
https://github.com/TorchlightSoftware/mongo-watch
The library is written in CoffeeScript. Here's an example in JavaScript, for those who prefer it.
var MongoWatch = require('mongo-watch'),
    watcher = new MongoWatch({ parser: 'pretty' });

watcher.watch('test.users', function (event) {
    return console.log('something changed:', event);
});
MongoDB apparently now supports triggers and watch (change streams), so this answer is outdated.
[Original] I believe you are looking for a database trigger.
Unfortunately, MongoDB has no support for them yet, so I don't think you can listen for changes directly from the database. You'll need to set up some sort of notification system (e.g. pub/sub) that alerts interested parties when a collection has changed.
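As a minimal illustration of such an application-level notification layer (every name below is made up; this is not a library API), you can funnel all writes through helpers that emit in-process events:

const { EventEmitter } = require('events');
const changes = new EventEmitter();

// Application code writes through helpers like this one, so every
// successful write also produces an in-process event.
async function insertUser(collection, user) {
    const result = await collection.insertOne(user);
    changes.emit('users:changed', { op: 'insert', _id: result.insertedId });
    return result;
}

// Anywhere else in the app:
changes.on('users:changed', (info) => {
    console.log('users collection changed:', info);
});

The obvious limitation is that only writes made by this process are seen, which is exactly the gap the change streams described in the next answer close.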
MongoDB 3.6 introduced change streams, which are designed to solve precisely this problem.
Here's an example: https://github.com/thakkaryash94/mongodb-change-streams-nodejs-example
const { MongoClient } = require("mongodb");

// Exclude the documentKey field from the change events
const pipeline = [
    {
        $project: { documentKey: false }
    }
];

// Change streams require a replica set (or sharded cluster);
// adjust the connection string to your deployment.
const client = new MongoClient("mongodb://localhost:27017");

client.connect().then(function () {
    const db = client.db("superheroesdb");
    const collection = db.collection("superheroes");
    const changeStream = collection.watch(pipeline);

    // start listening to changes
    changeStream.on("change", function (change) {
        console.log(change);
    });
});
Well, kind of late, but if someone is looking to get notified of MongoDB changes in Node.js, they can use the mubsub library.
It is an active library and very easy to integrate with Node.js.
It uses MongoDB's tailable cursors and capped collections.
It works in a pub/sub fashion to notify subscribers when a document is inserted into MongoDB.
Check it out on GitHub for details.
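A short usage sketch in the style of mubsub's documented pub/sub API (connection string and channel/event names are placeholders; check the project's README for the exact API of the version you install):

var mubsub = require('mubsub');

// Connects and creates a capped collection per channel under the hood.
var client = mubsub('mongodb://localhost:27017/example');
var channel = client.channel('test');

client.on('error', console.error);
channel.on('error', console.error);

// Fires for every message published to the 'bar' event on this channel.
channel.subscribe('bar', function (message) {
    console.log('received:', message);
});

channel.publish('bar', { foo: 'bar' });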