Can we use Neo4j and MongoDB at the same time?

I am new to Neo4j. The current project I'm working on uses MongoDB as its database, and I have a new requirement to show recommendations and visualize data using Neo4j. Replacing MongoDB with Neo4j is not an option, and if I use both databases I will have to dump data into Neo4j every time to pick up updates from MongoDB. Is there a better solution for this?

You could keep all your data in Neo4j and use MongoDB as a cache, but it is not a good idea to adopt Neo4j just for data visualization.
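If MongoDB has to stay the system of record, the usual pattern is a periodic (or change-driven) sync job that copies only the data the graph needs into Neo4j. A minimal sketch in Python, assuming a hypothetical "users" collection and local default connection settings:

```python
# Minimal sketch of a periodic MongoDB -> Neo4j sync.
# The "appdb.users" collection and its fields are hypothetical.
# Requires: pip install pymongo neo4j
from pymongo import MongoClient
from neo4j import GraphDatabase

mongo = MongoClient("mongodb://localhost:27017")
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

def sync_users():
    users = mongo["appdb"]["users"].find({}, {"_id": 1, "name": 1})
    with driver.session() as session:
        for user in users:
            # MERGE keeps the sync idempotent: re-running it updates
            # existing nodes instead of creating duplicates.
            session.run(
                "MERGE (u:User {mongoId: $id}) SET u.name = $name",
                id=str(user["_id"]),
                name=user.get("name"),
            )

if __name__ == "__main__":
    sync_users()
```

Run it from cron (or trigger it from your write path) so Neo4j stays close to the MongoDB state without replacing it.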

Related

How to use the appropriate index while querying mongo using mongospark connectors via withPipeline feature?

I am trying to load a huge amount of data from MongoDB; the data size is in the millions. So it makes sense to pull this data using the appropriate indexes and to query Mongo in parallel, which is why I am using Mongo Spark for batch reads. How do I use the appropriate index while querying Mongo through the connector's withPipeline feature?
Also, I was exploring com.mongodb.reactivestreams.client.MongoCollection. If possible, can someone throw some light on this?
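One way to let MongoDB use an index is to push the filter down as the first $match stage of an aggregation pipeline, so the server can satisfy it from the index before any data reaches Spark. The Scala/Java withPipeline call and the Python "pipeline" read option both push the stages down; a minimal PySpark sketch, assuming the mongo-spark connector 2.x/3.x Python API and a hypothetical "orders" collection with an index on "status":

```python
# Minimal PySpark sketch: push an indexed $match down to MongoDB via
# the connector's aggregation pipeline support (mongo-spark 2.x/3.x).
# The "ordersdb.orders" collection and "status" index are hypothetical.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("mongo-indexed-read")
    .config("spark.mongodb.input.uri", "mongodb://localhost:27017/ordersdb.orders")
    .getOrCreate()
)

# Because $match is the first pipeline stage, MongoDB can satisfy it
# from the index on "status" before shipping any documents to Spark.
pipeline = "[{'$match': {'status': 'ACTIVE'}}]"

df = (
    spark.read.format("mongo")
    .option("pipeline", pipeline)
    .load()
)
df.show()
```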

ELK stack - how to search through a Mongo database?

I'm using the ELK stack for the first time. I can import data with Logstash, but how do I link my MongoDB to Elasticsearch instead?
Also, what is the best way to import bulk data?
I'm using the MEAN stack and the newest version of ELK (5+). I am not using Beats such as Filebeat, but I am willing to use them if needed.
First, if you're using Logstash successfully, then you don't need Filebeat (although Filebeat is much better than Logstash).
I think you're confused about the terms. You don't hook MongoDB up to Elasticsearch. When using the ELK stack, Logstash sends your logs to Elasticsearch, and Kibana is the UI layer for viewing your data.
If you really want to use MongoDB (although I don't recommend it), then you're using MongoDB instead of Elasticsearch.
If you are after searching MongoDB data in Elasticsearch, you will need to import it (from Mongo to Elasticsearch).
There are a few ways; one of them is described here: https://stackoverflow.com/a/24119066/6079633 - but I don't think it supports Elasticsearch 5.
There is also https://github.com/ozlerhakan/mongolastic - which, according to the Elasticsearch website, is "a tool that clones data from ElasticSearch to MongoDB and vice versa".
I know this answer may be late, but it may help other people.
If you need a tool to transfer your data from MongoDB to Elasticsearch, have a look at this Mongoose plugin: https://github.com/mongoosastic/mongoosastic/tree/master/docs. It's a great tool for automatically indexing MongoDB models into Elasticsearch, and you can transfer your existing collection data by indexing the collection. A bulk-import sketch follows below.
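If you just need a one-off bulk import rather than a plugin, a minimal sketch using pymongo and the Elasticsearch client's bulk helper (the "appdb.articles" collection and "articles" index are hypothetical):

```python
# Minimal one-off bulk import from MongoDB into Elasticsearch.
# Requires: pip install pymongo elasticsearch
# The "appdb.articles" collection and "articles" index are hypothetical.
from pymongo import MongoClient
from elasticsearch import Elasticsearch
from elasticsearch.helpers import bulk

mongo = MongoClient("mongodb://localhost:27017")
es = Elasticsearch("http://localhost:9200")

def generate_actions():
    for doc in mongo["appdb"]["articles"].find():
        doc_id = str(doc.pop("_id"))  # ObjectId is not JSON-serializable
        yield {
            "_index": "articles",
            "_id": doc_id,
            "_source": doc,
        }

# helpers.bulk batches the documents into efficient _bulk requests.
success, errors = bulk(es, generate_actions(), raise_on_error=False)
print(f"indexed {success} documents, {len(errors)} errors")
```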

Data-modelling tools for MongoDB

Even though MongoDB is a schema-less DB, there is a requirement in my project to map my classes to database objects, and I would prefer to have data modelling for this. Please suggest some data-modelling tools for MongoDB that map the DB to classes and objects.
Moon Modeler is a data modeling tool for MongoDB and Mongoose (ODM).
I think you want to model your application's objects and persist the data in MongoDB, so it depends on what framework/language you are using for the application.
A quick web search turns up these ODM (Object-Document Mapper) options recommended for MongoDB in some popular languages (a short usage sketch follows the list).
Ruby:
https://docs.mongodb.com/mongoid/master/
http://mongomapper.com/
Java:
https://github.com/mongodb/morphia
http://hibernate.org/ogm/
Python:
http://mongoengine.org/
http://api.mongodb.com/python/1.6/index.html
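As a concrete illustration of the ODM approach, a minimal MongoEngine sketch (the User model, its fields, and the database name are hypothetical):

```python
# Minimal MongoEngine sketch: map a Python class to a MongoDB collection.
# Requires: pip install mongoengine
from mongoengine import Document, StringField, IntField, connect

connect("appdb", host="mongodb://localhost:27017")  # hypothetical database

class User(Document):
    # Each class attribute maps to a field in the "user" collection.
    name = StringField(required=True, max_length=100)
    email = StringField(required=True, unique=True)
    age = IntField(min_value=0)

# The class now behaves like a model: validation, queries, persistence.
User(name="Ada", email="ada@example.com", age=36).save()
adults = User.objects(age__gte=18)
print(adults.count())
```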

How to synchronize spring data elasticsearch with mongodb?

I need to keep data in Elasticsearch in sync with the data I have and maintain in MongoDB.
Currently I have a batch job that finds all the changed data and updates it in Elasticsearch using Spring Batch and Spring Data Elasticsearch.
This works, but I'm looking for a solution where every change is directly mirrored in Elasticsearch.
Give mongo-connector a go, and have a read through "5 ways to sync data" for an overview of the options; a change-stream sketch follows below.
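For the "every change is directly mirrored" requirement, MongoDB change streams (MongoDB 3.6+, replica set required) are one option. The question is about Spring, but the idea is language-agnostic; a minimal Python sketch, with hypothetical database, collection, and index names:

```python
# Minimal change-stream mirror: replays MongoDB writes into Elasticsearch
# as they happen. Requires MongoDB 3.6+ running as a replica set.
# Requires: pip install pymongo elasticsearch  (elasticsearch-py 8.x API)
from pymongo import MongoClient
from elasticsearch import Elasticsearch, NotFoundError

mongo = MongoClient("mongodb://localhost:27017")
es = Elasticsearch("http://localhost:9200")
collection = mongo["appdb"]["products"]  # hypothetical collection

# watch() blocks and yields one event per insert/update/replace/delete;
# only the common operation types are handled in this sketch.
with collection.watch(full_document="updateLookup") as stream:
    for change in stream:
        doc_id = str(change["documentKey"]["_id"])
        if change["operationType"] == "delete":
            try:
                es.delete(index="products", id=doc_id)
            except NotFoundError:
                pass  # already absent in Elasticsearch
        elif change.get("fullDocument"):
            doc = dict(change["fullDocument"])
            doc.pop("_id", None)  # ObjectId is not JSON-serializable
            es.index(index="products", id=doc_id, document=doc)
```

A Spring application can do the same thing with Spring Data MongoDB's change-stream support and an Elasticsearch repository; the flow is identical.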

Solr Data Import Handlers for MongoDB

I am working on a project where we have millions of entries stored in a MongoDB database, and I want to index all of this data using Solr.
After extensive searching, I learned that there are no proper Data Import Handlers for MongoDB.
Can anyone tell me the proper approaches for indexing data in MongoDB using Solr?
I want to use all the features of Solr and want it to be scalable in real time. I saw one or two approaches in different posts but am not sure how they would work in real time.
Many thanks.
10gen introduced the MongoDB Connector. You can integrate MongoDB with Solr using this tool.
Blog post: Introducing Mongo Connector
GitHub page: mongo-connector
I have created a plugin that allows you to load data from MongoDB using the Solr data import handler.
Check it out at:
https://github.com/james75/SolrMongoImporter
I wrote a response to a similar question, except it was about importing data from MySQL into Solr. The example code there is in PHP, but it should give you the general idea: set up an iterator to step through your MongoDB documents, map the data to Solr field types, and save it to your Solr index. A sketch of that approach follows below.
If you want it to be near real time, you could add custom code to the save mechanism (assuming this can be done with MongoDB) to write directly to the Solr index, then run a commit script every 15 minutes (via cron).
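A minimal sketch of that iterate-and-index approach using pymongo and pysolr (the "appdb.products" collection, the "products" Solr core, and the field names are all hypothetical):

```python
# Minimal iterate-and-index sketch: stream MongoDB documents into Solr.
# Requires: pip install pymongo pysolr
# The "appdb.products" collection and "products" Solr core are hypothetical.
from pymongo import MongoClient
import pysolr

mongo = MongoClient("mongodb://localhost:27017")
solr = pysolr.Solr("http://localhost:8983/solr/products", always_commit=False)

BATCH_SIZE = 1000
batch = []

for doc in mongo["appdb"]["products"].find():
    # Map each Mongo document onto the fields defined in the Solr schema.
    batch.append({
        "id": str(doc["_id"]),
        "name": doc.get("name"),
        "description": doc.get("description"),
    })
    if len(batch) >= BATCH_SIZE:
        solr.add(batch)  # send one batched update request
        batch = []

if batch:
    solr.add(batch)
solr.commit()  # make the newly added documents searchable
```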