How to export data from Oracle to MongoDB

I'm trying to move data from an Oracle database to MongoDB.
The problem is:
my friend gave me a "file.sql" that contains all the tables with their data,
and I want to take this data and add it to my Mongo database.
Please help, I don't know how to do it.

MongoDB is a NoSQL database, so the SQL file will not help you.
Ask for CSV files instead; then you can use the mongoimport tool to import them into your MongoDB. Please note that it is usually poor design to migrate from Oracle to MongoDB by moving tables to collections one-by-one. Typically in MongoDB you will have far fewer collections than you have tables in Oracle.
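
As a rough sketch of that workflow (the database, collection, and file names here are invented for illustration), the mongoimport call for a CSV file with a header line can be scripted from Python like this:

    import subprocess

    # --headerline tells mongoimport to take field names from the first CSV row.
    # Database, collection, and file names below are placeholders.
    subprocess.run(
        [
            "mongoimport",
            "--db", "mydb",
            "--collection", "customers",
            "--type", "csv",
            "--headerline",
            "--file", "Customers.csv",
        ],
        check=True,
    )

Run once per exported table; you would then reshape the resulting collections into a document model that fits your application.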

Related

Is there a faster way to clone a limited Mongo database

I want to clone a remote Mongo database to my local database with a limited amount of data. I know I can do it by exporting each collection and then importing it into my local database, but is there a faster or more efficient way to clone the Mongo database?
Thanks.
To take a database dump, you need to:
connect to the server
issue the query requesting the data you want
receive query results (as bson)
perform the minimum amount of transformations on that bson to get to the format you want to save in
mongodump with bson output and filter conditions should do these operations no less efficiently than any hand-built system you can come up with.
Same thing for importing.
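
As a sketch of that filtered-dump idea (the database, collection, and the filter itself are placeholders, not a definitive recipe), mongodump's --query flag keeps only the matching documents in the BSON output, and mongorestore brings the limited dump into the local database:

    import subprocess

    # Dump only the documents matching the filter (all names are placeholders).
    subprocess.run(
        [
            "mongodump",
            "--db", "remote_db",
            "--collection", "orders",
            "--query", '{"status": "active"}',
            "--out", "dump/",
        ],
        check=True,
    )

    # Restore the limited dump into the local database.
    subprocess.run(["mongorestore", "--db", "local_db", "dump/remote_db"], check=True)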

mongodb - strategy for importing a relational DB CSV dump into highly denormalised mongodb documents

We want to migrate data to MongoDB using CSV file dumps created from a Teradata database. The data needs to be refreshed in MongoDB every night from a fresh Teradata CSV dump.
The approach we are thinking of is:
Get the CSV files exported out of the relational db. They are going to be very similar to the table structure in the relational db.
Import the CSV files into MongoDB staging collections, which will mirror the relational db structure in terms of being normalised. This may be done using, say, mongoimport in overnight batches. It is going to result in many collections, as we are thinking of importing each 'type' of CSV into its own collection, e.g. Customers.csv and Accounts.csv will result in two collections of the same name.
Create denormalised collections out of the staging collections, ready to be exposed to the UI. Run a schema migration script which queries the staging collections and creates fewer, more denormalised collections ready for use in the application UI. E.g., after running the migration script, the Customers and Accounts collections should result in a third collection, say AccountCustomers, where each account doc has an embedded customers array (denormalised, so no joins are needed when the UI requests data).
Question: Is there a better strategy, given that all the above steps are expected to complete nightly, every night?
Question: Is mongoimport OK to use for importing the CSV files in nightly batches?
Question: What is the best way to migrate (denormalise) collections within the same Mongo instance? E.g. we have stagingdb with collections customers and accounts, and we want to reach a state where proddb has a collection accountcustomers (see the aggregation sketch below).
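
Not an authoritative answer, but one way to express the denormalisation step inside MongoDB itself is an aggregation with $lookup to embed customers into accounts and $merge to write the result into proddb ($merge can target another database in MongoDB 4.2+; the connection string, join keys, and names are assumptions):

    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017")  # assumed connection string
    staging = client["stagingdb"]

    # Join each account with its customers and write the denormalised result
    # into proddb.accountcustomers, replacing existing docs on each nightly run.
    staging["accounts"].aggregate([
        {"$lookup": {
            "from": "customers",          # staging collection to join against
            "localField": "accountId",    # assumed join key on both sides
            "foreignField": "accountId",
            "as": "customers",            # embedded array on each account doc
        }},
        {"$merge": {
            "into": {"db": "proddb", "coll": "accountcustomers"},
            "whenMatched": "replace",
            "whenNotMatched": "insert",
        }},
    ])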

extracting schema and models back from the dumped database

I have dumped my project's MongoDB database onto my machine and want to extract the schema and models from the dumped database.
Is there any method in MongoDB to get the schema back from the dumped database?
Any suggestions or feedback in this regard would be appreciated.
Is it not possible for you to restore the data into a MongoDB instance and look at the data again?
I'm not sure what you mean by schema, since MongoDB is very flexible about schema. From the nature of your question, it looks like the schema is defined in mongoose, not in MongoDB per se, and you are trying to reverse engineer that mongoose schema based on the data contained in the dump.
If you do not want to start a new instance and import the data, another option is the bsondump utility that comes as part of the MongoDB package; it converts the BSON dump files to JSON format so you can analyze them.
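
To make the bsondump route concrete, here is a hedged sketch: convert one dump file to line-delimited JSON, then tally which top-level fields appear and how often, as a rough picture of the implicit schema. The file paths are assumptions, and older tool versions may need stdout redirection instead of --outFile:

    import json
    import subprocess
    from collections import Counter

    # Convert a BSON dump file to line-delimited JSON (paths are placeholders).
    subprocess.run(
        ["bsondump", "--outFile", "customers.json", "dump/mydb/customers.bson"],
        check=True,
    )

    # Count how often each top-level field occurs across the documents.
    field_counts = Counter()
    with open("customers.json") as f:
        for line in f:
            field_counts.update(json.loads(line).keys())

    for field, count in field_counts.most_common():
        print(f"{field}: present in {count} documents")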

MongoDB and similar database technologies

What are comparable databases to MongoDB?
We are trying to evaluate MongoDB and find the best database for an enterprise-level application.
Is there any developer UI or admin UI available for MongoDB, like SQL*Plus/Toad for Oracle?
MongoDB is a document-oriented database, so instead of a row of data, you have a document. In MongoDB's case, it's a JSON document. Apache's CouchDB is another document database that stores data in JSON format, although there are subtle differences between the two.
Choosing between the two depends on your use case. Sometimes CouchDB is better than MongoDB.
Check out this comparison to see the differences.
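
To make the row-versus-document contrast concrete, data that a relational database would split across joined tables can live in MongoDB as a single document; the names here are invented for illustration:

    # One document embeds what a relational schema would keep in a join table.
    customer = {
        "_id": 1,
        "name": "Ada Lovelace",
        "accounts": [
            {"number": "A-100", "balance": 250.0},
            {"number": "A-101", "balance": 75.5},
        ],
    }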
MongoDB is what is known as a NoSQL database, which is I assume why you're interested in it. You can find a list of other NoSQL databases at the below links:
http://en.wikipedia.org/wiki/NoSQL
http://nosql-database.org/
MongoDB does not include a GUI-style administrative interface; however, there are numerous community projects that provide admin UIs for MongoDB:
http://www.mongodb.org/display/DOCS/Admin+UIs
I like document-oriented databases like MongoDB very much, because they are schema-less. You can just insert, find, and update your records without first having to define a schema; you can still define one in your own project logic. You have more freedom.
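
A small illustration of that schema-less workflow in Python (the connection string, database, and field names are assumptions):

    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017")  # assumed local server
    people = client["demo"]["people"]                  # no schema declared anywhere

    # Insert, find, and update without defining a schema up front.
    people.insert_one({"name": "Ada", "langs": ["Python"]})
    people.update_one({"name": "Ada"}, {"$push": {"langs": "C"}})
    print(people.find_one({"name": "Ada"}))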
It would be nice to have an embeddable NoSQL database. Like SQLite but document oriented.
Currently I am developing one in Java (you can also use it within an Android app):
https://github.com/neo-expert/thingdb
I am quite happy with MongoVue. I've made a couple of videos about this here.

solr Data Import Handlers for MongoDB

I am working on a project where we have millions of entries stored in a MongoDB database, and I want to index all this data using SOLR.
After extensive searching, I found that there are no proper "Data Import Handlers" for the MongoDB database.
Can anyone tell me what the proper approaches are for indexing data in MongoDB using SOLR?
I want to use all the features of SOLR and want it to be scalable in real time. I saw one or two approaches in different posts but am not sure how they will work in real time.
Many thanks.
10gen introduced the Mongo Connector. You can integrate MongoDB with Solr using this tool.
Blog post: Introducing Mongo Connector
GitHub page: mongo-connector
I have created a plugin that allows you to load data from MongoDB using the Solr data import handler.
Check it out at:
https://github.com/james75/SolrMongoImporter
I wrote a response to a similar question, except it was about how to import data from MySQL into SOLR. The example code is in PHP, but it should give you a general idea. All you would need to do is set up an iterator to step through your MongoDB assets, extract the data into SOLR datatypes, and then save it to your SOLR index.
If you want it to be real-time, you could add some custom code to the save mechanism (assuming this can be done with MongoDB) and save directly to the SOLR index, then run a commit script to commit the data every 15 minutes (via cron).
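
Translated to this question's stack, a minimal version of that iterator might look like the sketch below; pysolr is one common Solr client, and the URLs, field names, and batch size are all assumptions, not a definitive implementation:

    import pysolr
    from pymongo import MongoClient

    mongo = MongoClient("mongodb://localhost:27017")          # assumed server
    solr = pysolr.Solr("http://localhost:8983/solr/mycore")   # assumed Solr core

    batch, BATCH_SIZE = [], 1000

    # Step through every MongoDB document and map it to flat Solr fields.
    for doc in mongo["mydb"]["articles"].find():
        batch.append({
            "id": str(doc["_id"]),          # Solr expects a string unique key
            "title": doc.get("title", ""),  # assumed field names
            "body": doc.get("body", ""),
        })
        if len(batch) >= BATCH_SIZE:
            solr.add(batch)                 # buffered writes, committed below
            batch = []

    if batch:
        solr.add(batch)
    solr.commit()                           # make the indexed documents searchable

For the periodic-commit variant, the same solr.commit() call could run from a cron job instead of at the end of each batch.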