Using Kibana and MongoDB together without Elasticsearch

Is it possible to use the Kibana front-end with a MongoDB back-end, without using Elasticsearch?
I'm using Logstash to parse logs and store them in MongoDB, and I want to use Kibana to display that data.
If not, are there any alternatives for implementing Kibana + MongoDB?

I'm afraid that Kibana is specifically designed to use the Elasticsearch API.
While they do both provide JSON responses, they don't return compatible data structures, and even if they did, Mongo would not provide the same features (facets/filters) that Kibana makes heavy use of.
You could probably index your MongoDB data in Elasticsearch following instructions similar to https://coderwall.com/p/sy1qcw, but then you are unnecessarily duplicating your data in two systems.
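If you do go the duplication route, a minimal sketch of a one-off copy using pymongo and the elasticsearch Python client could look like the following. The database, collection, and index names ("logs", "entries", "logstash-archive") are placeholders, not anything taken from the question, and the documents are assumed to be JSON-serializable.
```python
# One-off copy of a MongoDB collection into an Elasticsearch index.
# "logs"/"entries" and "logstash-archive" are placeholder names.
from pymongo import MongoClient
from elasticsearch import Elasticsearch, helpers

mongo = MongoClient("mongodb://localhost:27017")
es = Elasticsearch("http://localhost:9200")

def generate_actions():
    # Stream every document, reusing the Mongo _id as the ES document id.
    # Assumes the remaining fields are JSON-serializable.
    for doc in mongo["logs"]["entries"].find():
        doc_id = str(doc.pop("_id"))
        yield {"_index": "logstash-archive", "_id": doc_id, "_source": doc}

# Bulk-index everything in batches
helpers.bulk(es, generate_actions())
```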

Related

How to sync between MongoDB and Elasticsearch?

I have a Scala microservice that serves as a database API, and the database I am using is MongoDB.
I want to add Elasticsearch containing all the data that my MongoDB has, and I need to keep it in sync whenever MongoDB is updated. How can I achieve this?
What would be the best approach? Are there plugins or tools that can help with this task?
Look at the 5 different ways to synchronize data from MongoDB to Elasticsearch. Personally, I did it with Logstash, where I simply filtered one collection and dumped it to ES every 24 hours. The use case is key to determining which strategy/tool to use.
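If you need near-real-time sync rather than a periodic dump, one common approach is tailing MongoDB change streams yourself. A minimal sketch with pymongo and a recent elasticsearch-py client follows; the "mydb"/"items" names are made up, and change streams require MongoDB to run as a replica set.
```python
# Near-real-time sync: tail a MongoDB change stream and mirror
# inserts/updates/deletes into an Elasticsearch index.
from pymongo import MongoClient
from elasticsearch import Elasticsearch, NotFoundError

mongo = MongoClient("mongodb://localhost:27017")
es = Elasticsearch("http://localhost:9200")
collection = mongo["mydb"]["items"]

# full_document="updateLookup" makes update events carry the whole document
with collection.watch(full_document="updateLookup") as stream:
    for change in stream:
        doc_id = str(change["documentKey"]["_id"])
        op = change["operationType"]
        if op in ("insert", "update", "replace"):
            doc = change.get("fullDocument")
            if doc is None:       # document vanished before the lookup
                continue
            doc = dict(doc)
            doc.pop("_id", None)  # ES keeps the id in its own metadata
            es.index(index="items", id=doc_id, document=doc)
        elif op == "delete":
            try:
                es.delete(index="items", id=doc_id)
            except NotFoundError:
                pass              # never indexed; nothing to remove
```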

ELK stack - how to search through a Mongo database?

I'm using the ELK stack for the first time. I can import data with Logstash, but how do I link my MongoDB to Elasticsearch instead?
Also, what is the best way to import bulk data?
I'm using the MEAN stack and the newest version of ELK (5+). I am not using Beats such as Filebeat, but I am willing to use it if needed.
First, if you're using Logstash successfully, then you don't need Filebeat (although Filebeat is a much lighter-weight shipper than Logstash).
I think you're confused about the terms. You don't hook up MongoDB to Elasticsearch. When using the ELK stack, Logstash is intended to send your logs to Elasticsearch, and Kibana is the UI layer for viewing your data.
If you really want to use MongoDB (although I don't recommend it), then you're using MongoDB instead of Elasticsearch.
If you are after searching MongoDB data in Elasticsearch, you will need to import it (from Mongo to Elasticsearch).
There are a few ways; one of them is described here: https://stackoverflow.com/a/24119066/6079633 - but I don't think it supports Elasticsearch 5.
There is also https://github.com/ozlerhakan/mongolastic - which, according to the Elasticsearch website, is "a tool that clones data from ElasticSearch to MongoDB and vice versa".
I know this answer may be late, but it could help other people.
If you need a tool to transfer your data from MongoDB to Elasticsearch, have a look at the mongoose plugin mongoosastic (https://github.com/mongoosastic/mongoosastic/tree/master/docs); it's a great tool for automatically indexing MongoDB models into Elasticsearch.
You can also transfer your existing collection data through its support for indexing an existing collection in MongoDB.

What are the right practices for using Apache Solr with MongoDB?

I am using Apache Solr 6.3.0 and MongoDB 3.4 for advanced text search features. I have successfully synced MongoDB with Solr cores using mongo-connector 2.5 and the Solr doc manager.
I want to know the right way and practices to use Solr with Mongo, and I have some issues that I need help on:
1) Now that my data is available both in the Mongo database and also indexed and stored in Solr cores, should I query Solr all the time? Or should I query Solr for text search only and perform the rest of the queries on Mongo?
2) Is there some way I could perform powerful search directly on the Mongo database using the indexing done by Solr?
3) I have some collections that contain deeply nested JSON data, and MongoDB supports them well. Solr indexes and stores such data in flattened form, but I want to maintain the original nested JSON format in the query response. Is this something I can achieve with Solr?
Other suggestions about good practices for using Solr with MongoDB would be extremely helpful.
1) If it makes sense to just query Solr, do that; if it makes sense to query Solr only for certain data, do that. It depends on your use case, but if any query can be answered with the data in Solr, it's perfectly fine to use it for everything. That will probably allow a more efficient use of your caches.
2) No, not that I know of.
3) Not really. Solr isn't well suited for nested JSON (even if you use parent/child documents, it's something you'll have to handle manually in every situation, and it will require special-casing all over).
In those situations you can use Solr for querying, get the IDs back, and then retrieve the actual documents from Mongo with their JSON structure intact. In that case you can leave most fields as non-stored in Solr.
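A rough sketch of that pattern (ask Solr only for matching IDs, then hydrate the full nested documents from MongoDB) is below. It assumes the Mongo _id was indexed into Solr's id field, which depends on how mongo-connector was configured, and the "products"/"shop" names and the query are purely illustrative.
```python
# Search in Solr, fetch the full nested documents from MongoDB.
import requests
from bson import ObjectId
from pymongo import MongoClient

products = MongoClient("mongodb://localhost:27017")["shop"]["products"]

def search(text, rows=20):
    # Ask Solr only for the id field to keep the response small
    resp = requests.get(
        "http://localhost:8983/solr/products/select",
        params={"q": text, "fl": "id", "rows": rows, "wt": "json"},
    )
    resp.raise_for_status()
    ids = [doc["id"] for doc in resp.json()["response"]["docs"]]

    # Hydrate the original nested JSON from Mongo, preserving Solr's ranking order
    by_id = {str(d["_id"]): d
             for d in products.find({"_id": {"$in": [ObjectId(i) for i in ids]}})}
    return [by_id[i] for i in ids if i in by_id]

results = search("wireless headphones")
```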

Which search approach is feasible: Elasticsearch + MongoDB, or MongoDB text indexes?

In my project I have to implement text search and have to choose a feasible approach between the following two:
1. Synchronising the MongoDB database with Elasticsearch.
2. MongoDB's own text indexes, which have Elasticsearch-like text-searching capabilities.
I have gone through many articles that give the pros of each approach, but I haven't found any relevant document that compares the two, says which is better, or explains the limitations of each.
Note: I am using Node.js with Express.js.
MongoDB is a general-purpose database; Elasticsearch is a distributed text search engine backed by Lucene. You can store data in MongoDB and use Elasticsearch exclusively for its full-text search capabilities. Depending on your use case, you can send only a subset of the Mongo data fields to Elasticsearch.
Synchronization of MongoDB with Elasticsearch can be done with mongoosastic. You can address data-safety concerns by persisting in Mongo and speed up your search by using Elasticsearch. You can also use a Python script based on the elasticsearch-py package to sync Mongo and Elasticsearch.
MongoDB's text search is very slow compared to Elasticsearch, and indexing in MongoDB takes more time and resources.
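For reference, the second option from the question (MongoDB's built-in text search) boils down to a text index plus a $text query. A minimal pymongo sketch is shown below; the "blog"/"articles" names and the "title"/"body" fields are invented for the example.
```python
# Option 2: MongoDB's own text index and $text query.
from pymongo import MongoClient

articles = MongoClient("mongodb://localhost:27017")["blog"]["articles"]

# Only one text index is allowed per collection; weights bias relevance scoring
articles.create_index(
    [("title", "text"), ("body", "text")],
    weights={"title": 10, "body": 1},
    name="article_text_idx",
)

# Query and sort by MongoDB's relevance score
cursor = articles.find(
    {"$text": {"$search": "mongodb elasticsearch comparison"}},
    {"score": {"$meta": "textScore"}},
).sort([("score", {"$meta": "textScore"})]).limit(10)

for doc in cursor:
    print(doc["title"], doc["score"])
```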

Is anyone using Redis in front of MongoDB?

I was wondering if anyone is using Redis in front of MongoDB for a more robust permanent persistence layer.
I realize Redis has VM and is adding new features all the time... but I want the flexibility of MongoDB for scaling horizontally.
For example: I want to use Redis for sessions and pub/sub with Node.js and WebSockets to the browser, but also store the data in a scalable way in a searchable archive maintained in MongoDB.
Yes, I often use Redis as an index of indexes in front of MongoDB. For example, I generate a key that I use to look up a set in Redis, then use the members of that set for Mongo queries using $in.
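A stripped-down sketch of that "index of indexes" pattern with redis-py and pymongo might look like this; the key scheme "user:<id>:articles" and the "app"/"articles" collection are invented for the example.
```python
# Redis as an "index of indexes": a Redis set holds the ids of the Mongo
# documents relevant to some key, and Mongo's $in fetches the documents.
import redis
from bson import ObjectId
from pymongo import MongoClient

# decode_responses=True so set members come back as hex strings, not bytes
r = redis.Redis(host="localhost", port=6379, decode_responses=True)
articles = MongoClient("mongodb://localhost:27017")["app"]["articles"]

def add_to_index(user_id, article_id):
    # Maintain the Redis set whenever an article is written for a user
    r.sadd(f"user:{user_id}:articles", str(article_id))

def articles_for_user(user_id):
    # Read the set of ids from Redis, then pull the documents from Mongo
    ids = [ObjectId(i) for i in r.smembers(f"user:{user_id}:articles")]
    return list(articles.find({"_id": {"$in": ids}}))
```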