Is anyone using Redis in front of MongoDB?

I was wondering if anyone is using Redis in front of MongoDB for a more robust permanent persistence layer.
I realize Redis has VM and is adding new features all the time... but I want the flexibility of MongoDB for scaling horizontally.
Example: I want to use Redis for sessions and pub/sub with Node.js and WebSockets to the browser, but also store the data in a scalable, searchable archive maintained in MongoDB.
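A minimal sketch of that split, assuming the Node.js `redis` (v4) and `mongodb` drivers and a hypothetical `chat` channel: Redis fans messages out over pub/sub while MongoDB keeps the searchable archive.

```js
const { createClient } = require('redis');
const { MongoClient } = require('mongodb');

async function main() {
  const pub = createClient();
  const sub = pub.duplicate(); // pub/sub needs a dedicated connection
  const mongo = await MongoClient.connect('mongodb://localhost:27017');
  await Promise.all([pub.connect(), sub.connect()]);

  // Each subscriber (e.g. a process pushing to WebSocket clients)
  // receives the message, and we also archive it in MongoDB.
  await sub.subscribe('chat', async (message) => {
    await mongo.db('app').collection('messages')
      .insertOne({ text: message, at: new Date() });
  });

  await pub.publish('chat', 'hello from node');
}

main().catch(console.error);
```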

Yes, I often use Redis as an index of indexes in front of MongoDB. For example, I generate a key that I use to look up a set in Redis, then use the members of that set in a Mongo query with $in.
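A minimal sketch of that pattern, assuming the Node.js `redis` (v4) and `mongodb` drivers and hypothetical names (a `recent:user:<id>` set, a `posts` collection):

```js
const { createClient } = require('redis');
const { MongoClient, ObjectId } = require('mongodb');

async function findRecentPosts(userId) {
  const redis = createClient();
  await redis.connect();
  const mongo = await MongoClient.connect('mongodb://localhost:27017');

  // The Redis set holds precomputed document ids: the "index of indexes".
  const ids = await redis.sMembers(`recent:user:${userId}`);

  // Resolve the full documents from MongoDB in a single $in query.
  const posts = await mongo.db('app').collection('posts')
    .find({ _id: { $in: ids.map((id) => new ObjectId(id)) } })
    .toArray();

  await redis.quit();
  await mongo.close();
  return posts;
}
```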

Related

MongoDB: display entities as a graph-nodes UI for remote mongo host

I have lots of data in my MongoDB across collections and I'd like to visualize it as a graph of nodes, or in some relation-oriented representation.
I know that Mongo, by default, isn't 'that good' a solution for this case, and that I should use something like Neo4j or Elasticsearch with Kibana instead. I also know that MongoDB Compass has built-in charts.
And according to some articles on the Mongo site and the $graphLookup aggregation operator, Mongo can be used in that way.
So my question is about a tool which is capable of visualizing (or connecting by values) entities between each other.
Perhaps there is some way to integrate and connect Kibana to a remote MongoDB instance (not Atlas), or maybe Grafana can solve this problem (but I haven't heard much about it), so I'd welcome any advice about it.
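For reference, a minimal sketch of the $graphLookup traversal the question mentions, with an assumed self-referencing `entities` collection and a hypothetical `linkedIds` field; the output is the kind of node/edge data a graph UI would need:

```js
db.entities.aggregate([
  {
    $graphLookup: {
      from: 'entities',          // walk the same collection
      startWith: '$linkedIds',   // ids this entity points at
      connectFromField: 'linkedIds',
      connectToField: '_id',
      as: 'neighbors',           // connected entities collected here
      maxDepth: 2                // recursion depth (0-indexed), i.e. 3 hops
    }
  }
]);
```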

Which search approach is feasible: Elasticsearch + MongoDB, or MongoDB text indexes?

In my project I have to implement text search and have to choose the more feasible of two approaches:
1. Synchronizing the MongoDB database with Elasticsearch.
2. MongoDB's own text indexes, which offer Elasticsearch-like text searching capabilities.
I have gone through many articles that give the pros of each case, but haven't found any relevant document that compares the two approaches and explains which is better, or what the limitations of each approach are.
Note: I am using Node.js with Express.js.
MongoDB is a general-purpose database; Elasticsearch is a distributed text search engine backed by Lucene. You can store data in MongoDB and use Elasticsearch exclusively for its full-text search capabilities. Depending on your use case, you can send only a subset of the Mongo data fields to Elasticsearch.
Synchronization of MongoDB with Elasticsearch can be done with mongoosastic. You can address data-safety concerns by persisting in Mongo, and speed up your search by using Elasticsearch. You can also sync Mongo and Elasticsearch with a Python script using the elasticsearch-py package.
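A minimal sketch of the mongoosastic route, assuming Mongoose models and Elasticsearch on its default localhost:9200; the `Article` model and its fields are hypothetical:

```js
const mongoose = require('mongoose');
const mongoosastic = require('mongoosastic');

const ArticleSchema = new mongoose.Schema({
  title:  { type: String, es_indexed: true }, // mirrored into Elasticsearch
  body:   { type: String, es_indexed: true },
  secret: String                              // stays in MongoDB only
});
ArticleSchema.plugin(mongoosastic);           // defaults to localhost:9200

const Article = mongoose.model('Article', ArticleSchema);

// Writes go to MongoDB as usual; full-text queries hit Elasticsearch.
Article.search({ query_string: { query: 'redis cache' } }, (err, results) => {
  if (err) throw err;
  console.log(results.hits.hits);
});
```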
MongoDB's text search is slow compared to Elasticsearch, and building text indexes in MongoDB takes more time and resources.
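For comparison, MongoDB's built-in text search needs only an index and a $text query; a minimal sketch against an assumed `articles` collection:

```js
// Build a compound text index over the searchable fields once.
db.articles.createIndex({ title: 'text', body: 'text' });

// Query it, sorting by the relevance score MongoDB computes.
db.articles.find(
  { $text: { $search: 'redis cache' } },
  { score: { $meta: 'textScore' } }
).sort({ score: { $meta: 'textScore' } });
```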

How do I make the most of MongoDB and Redis caching for a highly scalable application?

I want to use the best features of Redis caching and the MongoDB database for my current product.
I have a very heavy database and would like to avoid multiple database calls.
Can I cache my documents in Redis and query those instead?
What level of caching would you suggest for the best performance?
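One common answer is document-level cache-aside: read Redis first, fall back to MongoDB on a miss, then populate the cache with a TTL. A minimal sketch, assuming the Node.js `redis` (v4) and `mongodb` drivers and a hypothetical `products` collection:

```js
const { createClient } = require('redis');
const { MongoClient, ObjectId } = require('mongodb');

async function getProduct(redis, mongo, id) {
  const key = `product:${id}`;

  const cached = await redis.get(key);
  if (cached) return JSON.parse(cached);   // cache hit: no Mongo call

  const doc = await mongo.db('shop').collection('products')
    .findOne({ _id: new ObjectId(id) });   // cache miss: one Mongo call

  if (doc) {
    // Expire after 5 minutes so stale documents age out of Redis.
    await redis.set(key, JSON.stringify(doc), { EX: 300 });
  }
  return doc;
}
```

On writes, delete or re-set the corresponding key so the cache doesn't keep serving stale documents.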

MongoDB integration with Solr

I am a beginner with MongoDB and its integration with Solr. From different posts I got an idea of the integration steps, but I need info on the points below.
I have the data in MongoDB; for faster retrieval we are integrating it with Solr.
1. Solr indexes all MongoDB entries. Is this indexing a one-time activity after integration, or do we need to periodically update Solr to index the entries that get inserted after the integration?
2. If we need to periodically update Solr, it becomes extra overhead to maintain the data in Solr as well as in MongoDB. What are the best approaches to overcoming this?
As far as I know there is no official (supported/complete) solution to integrate MongoDB and Solr, but let me give you some ideas/directions.
For me, the best approach, when it is possible to modify the application, is to build into the persistence layer that all write operations go to MongoDB and Solr at the "same" time. That way you control exactly what you send to the database and what you index for full-text search. But as I said, this means you have to change your application code. (You will have to change it anyway, to be able to query Solr when needed.) And yes, you have to index all the existing documents the first time.
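A minimal sketch of that dual-write idea, assuming Node 18+ (for global fetch), the `mongodb` driver, a hypothetical `articles` collection, and a Solr core named `articles` reachable through Solr's standard JSON update endpoint:

```js
const { MongoClient } = require('mongodb');

async function saveArticle(mongo, article) {
  // 1. MongoDB stays the system of record.
  const { insertedId } = await mongo.db('app').collection('articles')
    .insertOne(article);

  // 2. Index only the searchable fields in Solr, in the "same" operation.
  const res = await fetch('http://localhost:8983/solr/articles/update?commit=true', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify([{
      id: insertedId.toString(),   // reuse the Mongo id as the Solr id
      title: article.title,
      body: article.body
    }])
  });
  if (!res.ok) throw new Error(`Solr indexing failed: ${res.status}`);

  return insertedId;
}
```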
You can use a "connector" approach where MongoDB and Solr are kind of connected together, this could be done in various ways.
You can use for example the MongoDB Connector available here : https://github.com/10gen-labs/mongo-connector
LucidWorks, the company behind Solr has also a connector for MongoDB, documented here : http://docs.lucidworks.com/display/help/Create+a+New+MongoDB+Data+Source# (I have not used it so cannot comment, but it is also an approach)
Your point #2 is true: you have to manage two clusters, be sure the data stay in sync, and sometimes pay the price of inconsistency between the Solr index and a document just updated in MongoDB. So you need to see whether the best approach for your application is MongoDB alone or MongoDB with Solr (see the comment below).
Just a small comment in addition to this answer:
You are talking about "faster retrieval"; I'm not sure that should be the reason. If you write correct queries with correct indexes in MongoDB, you should be able to do it without Solr. If your requirement is really oriented towards the power of Solr, meaning a full-text index (with all its related features), then it makes sense.
How large is your data? MongoDB has a few good indexing mechanisms of its own.
There is a powerful geo API, and for full-text search there is http://docs.mongodb.org/manual/core/index-text/. So it would be ideal to identify whether your need fits into MongoDB or you need to spill over to Solr.
About the indexing part: how often is your data updated? If you can afford infrequent updates, then a batch job with once-a-day re-indexing may work for you. Ideally, Solr would work well for some form of master data.

Using kibana and mongodb together without elasticsearch

Is it possible to use the Kibana front-end with a MongoDB back-end, without using Elasticsearch?
I'm using Logstash to parse logs and store them in MongoDB, and I want to use Kibana to display the data.
If not, are there any alternatives for implementing Kibana + MongoDB?
I'm afraid that Kibana is specifically designed to use the Elasticsearch API.
While they both return JSON responses, they don't return compatible data structures, and even if they did, Mongo would not provide the same features (facets/filters) that Kibana makes heavy use of.
You could probably index your MongoDB data in Elasticsearch following instructions similar to https://coderwall.com/p/sy1qcw, but then you are unnecessarily duplicating your data across two systems.