How to migrate SQL Server's data to MongoDB? - mongodb

I am trying to migrate my database from SQL Server to MongoDB. I've exported the tables from my database and wrote the corresponding mongoimport command for it:
mongoimport -d dashboard -c col --type csv --file "C:/Users/sesa356116/text.csv" --headerline -f bu_id,bu_name,bu_id,
This gets executed, but when I try to view the data in Robomongo, no records are displayed. What I want to know is: after this import command, are there any further steps that should be followed in order to see the corresponding records? Any leads would be helpful.
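One quick way to check whether the import actually wrote anything is to count the documents in the target collection. A minimal sketch with PyMongo, assuming the default local connection and the dashboard database / col collection names from the command above:

from pymongo import MongoClient

# Connect to the local MongoDB instance (adjust the URI if the server is remote).
client = MongoClient("mongodb://localhost:27017")
collection = client["dashboard"]["col"]

# If the import succeeded, the count should match the number of rows in the CSV.
print("documents imported:", collection.count_documents({}))
print("sample document:", collection.find_one())

If the count is zero, the problem is with the mongoimport run itself (its console output usually reports how many documents were imported), not with Robomongo.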

Related

Error importing collections into a mongodb database on an ec2 instance

I exported the collections from a local database and I want to import them into an ec2 instance.
I did the following:
1) I exported the collections to a folder called data. The files are in this format:
collection_test.bson
collection.metadata.json
2) I transferred the folder to an ec2 instance. The path to access the folder is like this:
/home/ec2-user/data
3) I went into mongo and did "use database_test" and created a collection like this: db.createCollection("data")
Finally, I tried to import the file this way:
mongoimport --db database_test --collection data --file /home/ec2-user/data/data.metadata.json -jsonArray
but I get this error:
2022-02-18T19:29:38.380+0000 E QUERY [js] SyntaxError: missing ; before statement #(shell):1:14
I would appreciate it if anyone could help me analyze this!
The problem is that you used mongodump, which created the xxx.bson and xxx.metadata.json files, so you need to use mongorestore to read those files. Use mongoimport for files created with mongoexport.
For a full explanation see https://docs.mongodb.com/database-tools/
In short, mongodump/mongorestore deal with BSON files, while mongoexport/mongoimport work with CSV/TSV/JSON files. One neat thing about these commands is that if you supply the optional -q parameter, e.g. {field: x}, only the records that the filter selects will be included in the dump.
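For the files described in the question, a mongorestore invocation along these lines should work (the exact .bson filename is whatever mongodump produced in /home/ec2-user/data; the one below is illustrative). Note that mongorestore, like mongoimport, is run from the operating-system shell rather than from inside mongo, and there is no need to run db.createCollection() first because the collection is created automatically:

mongorestore --db database_test --collection data /home/ec2-user/data/collection_test.bson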

How to migrate RethinkDB into MongoDB?

My application is using RethinkDB. Everything is running fine, but a new requirement means the db needs to be migrated into MongoDB.
Is this possible? How do I migrate the tables/collections, data, indexes, etc?
How about blob types, auto-increments, ids?
Thanks!
Is this possible? How do I migrate the tables/collections, data, indexes, etc?
One way to migrate data from RethinkDB to MongoDB is to export the data from RethinkDB using the rethinkdb dump command, and then use mongoimport to import it into MongoDB. For example:
rethinkdb dump -e dbname.tableName
This would generate an archive file:
rethinkdb_dump_<datetime>.tar.gz
After uncompressing the archive file, you can then use mongoimport as below:
mongoimport --jsonArray --db dbName --collection tableName ./rethinkdb_dump_<datetime>/dbName/tableName.json
Unfortunately, the index format is quite different between RethinkDB and MongoDB. The index definitions are stored within the same archive, in:
./rethinkdb_dump_<datetime>/dbName/tableName.info
You can, however, write a Python script that reads the info file and uses the MongoDB Python driver (PyMongo) to create the indexes in MongoDB; see the create_indexes() method for more information.
One of the reasons for suggesting Python is that RethinkDB also has an official Python client driver, so technically you can skip the export stage altogether and write a script that connects your RethinkDB to MongoDB, as sketched below.
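A minimal sketch of that direct approach, assuming a local RethinkDB and MongoDB, the dbName / tableName names used above, and the rethinkdb (2.4+ API) and pymongo packages; the index translation still has to be done by hand, so only a placeholder index is recreated here:

from rethinkdb import RethinkDB
from pymongo import MongoClient, IndexModel, ASCENDING

r = RethinkDB()
rethink_conn = r.connect(host="localhost", port=28015)

mongo = MongoClient("mongodb://localhost:27017")
target = mongo["dbName"]["tableName"]

# Copy documents over in batches; RethinkDB's primary key "id" stays as a normal
# field and MongoDB assigns its own _id.
batch = []
for doc in r.db("dbName").table("tableName").run(rethink_conn):
    batch.append(doc)
    if len(batch) >= 1000:
        target.insert_many(batch)
        batch = []
if batch:
    target.insert_many(batch)

# Recreate secondary indexes by hand; "some_field" is a placeholder for whatever
# indexes are listed in tableName.info.
target.create_indexes([IndexModel([("some_field", ASCENDING)])])

rethink_conn.close()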

Inserting data from json file to mongodb

I am learning MongoDB and, for practice, I downloaded the restaurants dataset from the MongoDB site. I am using Windows and mongo is installed properly.
Now I want to insert all the restaurant documents (i.e. the JSON data) into MongoDB. I am using cmd and tried this command:
mongoimport --db test --collection restaurants --drop --file ~/downloads/primer-dataset.json
but it failed with this message:
SyntaxError: missing ; before statement #(shell):1:4
How do I solve this error? Please help, because I couldn't find a satisfactory answer even after spending a lot of time on it.
mongoimport must be run from the Windows command prompt, not the mongo shell.

Mongo compare and import missing data

I have a Mongo server receiving data from servers behind an Amazon load balancer. Two days ago there was an error and some of the servers sent their data to an old Mongo server that had a db of the same name; we realized that and fixed it right away.
Now part of my data is stored on the wrong machine.
What I need now is a way to compare the data between the 2 dbs (which each have 2 relevant collections) and insert only the missing data to the correct collection.
I do not care about the unique id Mongo assigns, but I do need to compare by the field "uuid" that we create.
mongo version: 2.4.4
I am new to Mongo and any help would be greatly appreciated.
Yes, you can. Follow these steps...
1. Run mongoexport and then mongoimport on the basis of the fields you want to compare and import on.
mongoexport --db db_name --collection collection_name --type csv --fields field1,field2,field3 --out /var/www/export.csv
Once you have the exported CSV at the specified location, open it and remove any unwanted fields, then import it into the other database:
mongoimport --db db_name --collection collection_name --type csv --file /var/www/export.csv --fields field1,field2,field3 --upsertFields field1,field2,field3
NOTE:
If you are working in a production environment with a huge database, take the query load off your mongo instance before exporting, otherwise it might get stuck.
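Since the question specifically wants to compare by the custom uuid field, here is an alternative sketch using PyMongo instead of the CSV round trip. The host names, database name, and collection name are placeholders, it assumes both servers are reachable from one machine, and an older PyMongo release may be needed to talk to MongoDB 2.4:

from pymongo import MongoClient

# Placeholder connection strings; point these at the correct and the "wrong" server.
correct = MongoClient("mongodb://correct-host:27017")["db_name"]["collection_name"]
wrong = MongoClient("mongodb://wrong-host:27017")["db_name"]["collection_name"]

# Collect the uuids that already exist on the correct server.
existing = set(doc["uuid"] for doc in correct.find({}, {"uuid": 1}))

# Copy over only the documents whose uuid is missing, dropping the old _id so
# the correct server assigns a fresh one.
missing = []
for doc in wrong.find():
    if doc.get("uuid") not in existing:
        doc.pop("_id", None)
        missing.append(doc)

if missing:
    correct.insert_many(missing)
print("copied", len(missing), "documents")

Run it once for each of the two relevant collections.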

Importing large number of records in json file to mongodb

I just started learning to build a Node.js application. I am able to figure out how things work, so I decided to test my application with a large set of test data.
I created a JSON file with 1 million records in it.
I import the data using:
mongoimport --host 127.0.0.1 --port 27017 --collection customers --db Customer --file "path to json file mock.json" --jsonArray
A sample of the JSON file is:
[{"fname":"Theresia","lname":"Feest","email":"aileen#okeefe.name"},
{"fname":"Shannon","lname":"Bayer","email":"tristian.barrows#christiansenvandervort.com"},
{"fname":"Cora","lname":"Baumbach","email":"domenico.grimes#lesley.co.uk"},
{"fname":"Carolina","lname":"Hintz","email":"betty#romaguerasenger.us"},
{"fname":"Dovie","lname":"Bartell","email":"rogers_mayert#daniel.biz"}]
but it is taking too much time, approx. 14 hours.
Please suggest any other, more optimized way to do this.
Split your single JSON file into multiple files and then run parallel mongoimport commands, one for each file; a sketch of the splitting step follows.
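A minimal sketch of the splitting step, assuming the input is one big JSON array like the sample above and that mock.json and the chunk size are placeholders; each resulting part can then be fed to its own mongoimport --jsonArray process in a separate shell:

import json

# Load the single large JSON array (placeholder path).
with open("mock.json", "r") as f:
    records = json.load(f)

chunk_size = 100000  # records per output file
for i in range(0, len(records), chunk_size):
    chunk = records[i:i + chunk_size]
    with open("mock_part_{}.json".format(i // chunk_size), "w") as out:
        json.dump(chunk, out)

# Each part can then be imported in parallel, e.g.:
#   mongoimport --host 127.0.0.1 --port 27017 --collection customers --db Customer --file mock_part_0.json --jsonArray

On reasonably recent mongoimport versions, the --numInsertionWorkers option can also be used to parallelize inserts within a single run.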