MongoDB persistence patterns for a JSON client app: Jackson mapper or Morphia driver?

I've started a new job where they are using MongoDB in a Java environment.
They have implemented a pattern using DTOs and factories with the Morphia driver; this may be due to an earlier migration onto MongoDB from a key-value store. The client is a JSON client.
It seems to me that the jackson-mongo-mapper would be a better approach, because it just maps POJOs from JSON to BSON and back, so it seems it could do away with the whole DTO/factory facade.
Does anyone know the pros and cons of these different approaches?
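To make the idea in the question concrete, here is a minimal sketch of the "single POJO" approach: the same Jackson-annotated class is serialized to JSON for the client and converted into a BSON Document for MongoDB, with no separate DTO/factory layer. The User class, field names, and connection string are hypothetical, and the JSON-to-BSON step goes through a Map here only for illustration; a Jackson-based mapper (MongoJack and similar) is what would automate that wiring at the driver level.

    import com.fasterxml.jackson.databind.ObjectMapper;
    import com.mongodb.client.MongoClients;
    import com.mongodb.client.MongoCollection;
    import org.bson.Document;

    import java.util.Map;

    public class SinglePojoSketch {

        // One POJO serves both the JSON API and persistence.
        public static class User {
            public String name;
            public String email;

            public User() {}                      // needed by Jackson
            public User(String name, String email) {
                this.name = name;
                this.email = email;
            }
        }

        public static void main(String[] args) throws Exception {
            ObjectMapper mapper = new ObjectMapper();
            User user = new User("Ada", "ada@example.com");

            // JSON for the client.
            String json = mapper.writeValueAsString(user);
            System.out.println(json);

            // BSON for MongoDB: converted via a Map for illustration; a Jackson-based
            // mapper would do this translation for you.
            Map<String, Object> asMap = mapper.convertValue(user, Map.class);
            MongoCollection<Document> users = MongoClients.create("mongodb://localhost:27017")
                    .getDatabase("app")
                    .getCollection("users");
            users.insertOne(new Document(asMap));
        }
    }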

Spring Data for MongoDB is very nice, since you can even use another data store, or mix them, and the repository interfaces are very helpful (a sketch follows at the end of this answer).
Kundera is an option through JPA2:
http://agilemobiledeveloper.wordpress.com/2013/08/22/working-with-mongodb-using-kundera/
There are a lot of Java-to-MongoDB options:
http://www.agilemobiledeveloper.com/2013/01/31/hibernate-ogm-mongodb-vs-kundera-vs-jongo-vs-mongodb-api-vs-morphia-vs-spring-data-mongo-mongodb-drivers-for-java/
Adding your own data layer, making sure you use DI, and testing it fully is very helpful.
NoSQLUnit is awesome -> https://github.com/lordofthejars/nosql-unit
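As a rough sketch of the Spring Data repository style mentioned at the top of this answer (the entity and field names are made up for illustration): you declare an interface and Spring Data generates the implementation, including queries derived from method names.

    import org.springframework.data.annotation.Id;
    import org.springframework.data.mongodb.core.mapping.Document;
    import org.springframework.data.mongodb.repository.MongoRepository;

    import java.util.List;

    // Hypothetical entity mapped to the "users" collection.
    @Document(collection = "users")
    class User {
        @Id
        private String id;
        private String email;
        private String role;
        // getters/setters omitted for brevity
    }

    // No implementation needed: Spring Data derives the queries from the method names.
    interface UserRepository extends MongoRepository<User, String> {
        List<User> findByRole(String role);
        User findByEmail(String email);
    }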

DTOs are good for keeping a separation between implementation and design, so when they need or want to switch from Mongo to some other NoSQL or SQL database, it can be done cleanly.
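A minimal sketch of that separation, with hypothetical class names: the domain type stays free of persistence concerns, the Morphia-annotated DTO carries the MongoDB details, and a factory translates between the two, so swapping the store only touches the DTO side.

    import org.bson.types.ObjectId;
    import org.mongodb.morphia.annotations.Entity;
    import org.mongodb.morphia.annotations.Id;

    // Domain object: no knowledge of how or where it is stored.
    class User {
        final String email;
        final String displayName;

        User(String email, String displayName) {
            this.email = email;
            this.displayName = displayName;
        }
    }

    // Persistence DTO: MongoDB/Morphia specifics live only here.
    @Entity("users")
    class UserDocument {
        @Id
        ObjectId id;
        String email;
        String displayName;
    }

    // Factory keeping the two worlds apart; only this layer changes if the store changes.
    class UserDocumentFactory {
        static UserDocument fromDomain(User user) {
            UserDocument doc = new UserDocument();
            doc.email = user.email;
            doc.displayName = user.displayName;
            return doc;
        }

        static User toDomain(UserDocument doc) {
            return new User(doc.email, doc.displayName);
        }
    }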

Related

Mongoengine and Pymongo?

Can I use mongoengine or djongo for the ODM and pymongo for interaction with the db?
I've read these two posts about something related to my question:
Insert data by pymongo using mongoengine ORM in pyramid
Use MongoEngine and PyMongo together
But I couldn't find what I'm looking for (I guess).
So here's what I'm trying to find out:
Does this practice affect the performance of my application?
How well recommended is it?
And if it is recommended and everything is fine, do I need to put an extra layer of security or something on top? I want to build an API using the model serializers that django-rest-framework-mongoengine offers, and then do what I have to do in the view of the API endpoint.
It could be djongo or something like it; what I want is just an ODM for serialization, to define a structure for the API and so on, and to use pymongo for queries, because from what I've been reading, mongoengine could slow down interaction with the db.
The term "ORM" does not apply to MongoDB since MongoDB is non-relational. The proper term is "ODM" - object-document mapper.
Generally, a MongoDB ODM is built on top of a MongoDB driver. The functionalities of the ODM and the driver are complementary - the driver provides low-level database access and the ODM provides high-level features like schema, associations, callbacks.
If you want to use the high-level features, it makes sense to use an ODM. If you don't need any of those features and just want to perform basic CRUD operations, using a driver directly is more efficient. Some applications use both of these strategies depending on the operation that needs to be performed.

Data-modelling tools for MongoDB

Even though MongoDB is a schema-less DB, there is a requirement in my project to map my classes to the database objects, and I would prefer to have data modelling for this. Please suggest some data-modelling tools for MongoDB to map the DB to classes and objects.
Moon Modeler is a data modeling tool for MongoDB and Mongoose (ODM)
I think you want to model your application and persist your data in MongoDB, so it depends on what framework/language you are using for the application.
I did a quick web search and found these ODM (Object-Document Mapper) options recommended for working with MongoDB in some of the popular languages.
Ruby
https://docs.mongodb.com/mongoid/master/
http://mongomapper.com/
Java
https://github.com/mongodb/morphia
http://hibernate.org/ogm/
Python
http://mongoengine.org/
http://api.mongodb.com/python/1.6/index.html

Combining Neo4j and MongoDB: Consistency

I am experimenting a lot these days, and one of the things I wanted to do is combine two popular NoSQL databases, namely Neo4j and MongoDB, simply because I feel they complement each other perfectly. The first-class citizens in Neo4j, the relationships, are in my opinion exactly what's missing in MongoDB, whereas MongoDB allows me not to put large amounts of data in my node properties.
So I am trying to combine the two in a Java application, using the Neo4j Java REST binding and the MongoDB Java driver. All my domain entities have a unique identifier which I store in both databases. The other data is stored in MongoDB and the relationships between entities are stored in Neo4j. For instance, both databases contain a user ID, MongoDB contains the profile information, and Neo4j contains the friendship relations. With the custom data access layer I have written, this works exactly like I want it to. And it's fast.
BUT... when I want to create a user, I need to create both a node in Neo4j and a document in MongoDB. Not necessarily a problem, except that Neo4j is transactional and MongoDB is not. If both were transactional, I would just roll back both transactions when one of them fails. But since MongoDB isn't transactional, I cannot do this.
How do I ensure that whenever I create a user, either both a node and a document are created, or neither? I don't want to end up with a bunch of documents that have no matching node.
On top of that, not only do I want my combined database interaction to be ACID compliant, I also want it to be thread-safe. Both the GraphDatabaseService and the MongoClient / DB are provided from singletons.
I found something about creating "Transaction Documents" in MongoDB, but I really don't like that approach. I would like something nice and clean like the Neo4j beginTx, tx.success, tx.failure, tx.finish setup. Ideally, something I can implement in the same try/catch/finally block.
Should I perhaps switch to CouchDB, which does appear to be transactional?
Edit: After some more research, sparked by a comment, I came to realize that CouchDB is also not suitable for my specific needs. To clarify: the Neo4j part is set in stone; the document store is not, as long as it has a Java library.
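For illustration only, here is a sketch of the manual compensation a custom data access layer typically falls back on when one store (Neo4j) is transactional and the other (MongoDB) is not. It assumes the embedded GraphDatabaseService API of the Neo4j 2.x era (the REST binding mimics it) and hypothetical property names; note that it is best-effort rather than truly ACID, which is exactly the weakness the question is about.

    import com.mongodb.client.MongoCollection;
    import com.mongodb.client.model.Filters;
    import org.bson.Document;
    import org.bson.types.ObjectId;
    import org.neo4j.graphdb.GraphDatabaseService;
    import org.neo4j.graphdb.Node;
    import org.neo4j.graphdb.Transaction;

    class UserStore {
        private final GraphDatabaseService graphDb;
        private final MongoCollection<Document> users;

        UserStore(GraphDatabaseService graphDb, MongoCollection<Document> users) {
            this.graphDb = graphDb;
            this.users = users;
        }

        void createUser(String userId, String name) {
            // 1. Write the document first (no transaction available on this side).
            Document profile = new Document("userId", userId).append("name", name);
            users.insertOne(profile);
            ObjectId mongoId = profile.getObjectId("_id");

            // 2. Create the node transactionally; compensate if that fails.
            try (Transaction tx = graphDb.beginTx()) {
                Node node = graphDb.createNode();
                node.setProperty("userId", userId);
                tx.success();
            } catch (RuntimeException e) {
                // Best-effort rollback of the MongoDB write -- this delete can itself
                // fail, which is why this approach is not truly ACID.
                users.deleteOne(Filters.eq("_id", mongoId));
                throw e;
            }
        }
    }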
Pieter-Jan,
if you are able to use Neo4j 2.0 you can implement a Schema-Index-Provider (which is really easy) that creates your documents transactionally in MongoDB.
Since Neo4j has made its index providers transactional from the beginning, we did that with Lucene, and there is one for Redis too (it needs to be updated). But it is much easier with Neo4j 2.0; if you want, you can check out my implementation for MapDB (https://github.com/jexp/neo4j-mapdb-index).
Although I'm a huge fan of both technologies, I think a better option for you could be OrientDB. It's a graph (like Neo4j) and document (like MongoDB) database in one, and it supports ACID transactions. It sounds like a perfect match for your needs.
As posted here https://stackoverflow.com/questions/23465663/what-is-the-best-practice-to-combine-neo4j-and-mongodb?lq=1, you might have a look at Structr.
Its backend can be regarded as a document database around Neo4j. It's fully transactional and open source.

Silex and MongoDB: which Silex extension?

I would like to use Silex with MongoDB.
I guess the best way is to use a Silex extension that in turn uses the Doctrine MongoDB libs.
There are two Silex Extensions right now that seem to provide this functionality.
https://github.com/fate/Silex-Extensions
and
https://github.com/docteurklein/SilexExtensions
Apart from the fact that the first bundles several other extensions and the second uses submodules (which I would prefer) instead of vendors.sh, are there other things to take care of?
Can anybody recommend one or the other?
Update:
The extensions below are outdated; please use the newer providers in the other answer.
To answer it myself:
https://github.com/fate/Silex-Extensions uses the Doctrine MongoDB Abstraction Layer,
whereas
https://github.com/docteurklein/SilexExtensions uses the Doctrine MongoDB ODM (Object Document Mapper).
So with the first you can only query MongoDB through Doctrine, while with the second you can persist model objects to MongoDB, as known from e.g. Symfony models.
These are brand-new MongoDB providers with multi-connection support.
MongoDB:
https://github.com/saxulum/saxulum-doctrine-mongodb-provider
MongoDB ODM:
https://github.com/saxulum/saxulum-doctrine-mongodb-odm-provider

Spring Data - connecting to relational databases

Researching Spring Data - I understand why you would use it for NoSQL databases, but I am struggling to see why you would use Spring Data for relational databases over the standard Spring ORM capabilities (e.g. the standard JPA support).
Does anyone have clear use cases for using the Spring Data framework for relational queries?
Thanks,
James.
The JPA module of the Spring Data project is different from the NoSQL ones, as we don't need to provide a low-level store abstraction ourselves. So the main features are:
elimination of a large chunk of the implementation code needed for repositories (see this blog post for a showcase, and the sketch at the end of this answer)
abstractions for pagination and dynamic sorting
DDD specifications to allow defining domain predicates (see this blog post as example)
support for Querydsl predicates
transparent entity auditing
The JDBC module of Spring contains support for Querydsl as well.
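As a small sketch of the first two points (repository code elimination and pagination/dynamic sorting), with a hypothetical Customer entity: the interface below is all you write; Spring Data JPA generates the implementation, and paging/sorting come in via the Pageable parameter.

    import javax.persistence.Entity;
    import javax.persistence.GeneratedValue;
    import javax.persistence.Id;

    import org.springframework.data.domain.Page;
    import org.springframework.data.domain.Pageable;
    import org.springframework.data.jpa.repository.JpaRepository;

    @Entity
    class Customer {
        @Id
        @GeneratedValue
        Long id;
        String lastName;
    }

    // No implementation class required: the query is derived from the method name,
    // and pagination/sorting are handled by the Pageable argument.
    interface CustomerRepository extends JpaRepository<Customer, Long> {
        Page<Customer> findByLastName(String lastName, Pageable pageable);
    }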