I tried to import a single database collection, but found that this imports the whole database, not a single collection:
mongorestore -d db_name dump_folder_path
So here is the solution to import and export a single collection from the MongoDB command line.
For Import
mongoimport --db Mydatabase --collection mycollection --drop --file ~/var/www/html/collection/mycollection.json
For Export
mongoexport --collection=mycollection --db=Mydatabase --out=/var/www/html/collection/mycollection.json
Hope this works :)
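If you prefer to stay with mongodump/mongorestore (BSON dumps), both tools also accept --collection. A minimal sketch, assuming the dump is written under /var/www/html/dump (the path is just an example):
# Dump only one collection (BSON + metadata)
mongodump --db Mydatabase --collection mycollection --out /var/www/html/dump
# Restore only that collection, dropping the existing one first
mongorestore --db Mydatabase --collection mycollection --drop /var/www/html/dump/Mydatabase/mycollection.bson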
Related
I am using MongoDB 3.4.
I want to import a JSON file (containing a JSON array) into mongod using a bash script, but only if the documents don't already exist. I tried --upsert but it does not work.
Is there an easy way to do this? Thanks
mongoimport --db dbName --collection collectionName --file fileName.json --jsonArray --upsert
mongoimport -d dbName -c collectionName jsonFile.json -vvvvv
Even though the output of mongoimport says that n objects were imported, the existing documents with the same data have not been overwritten.
If you use --upsert, it will update the existing documents.
I found a similar discussion here
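If "don't exist" means the collection should only be loaded while it is still empty, a minimal bash sketch (assuming a local mongod and the placeholder names dbName / collectionName / fileName.json from above):
# Only import when the collection has no documents yet
count=$(mongo dbName --quiet --eval 'db.collectionName.count()')
if [ "$count" -eq 0 ]; then
  mongoimport --db dbName --collection collectionName --file fileName.json --jsonArray
fi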
How to take a mongodump of one collection from my database
../mongodump --db db_name --collection collection_name --out /home/dell/999/
I got this error:
bash: ../mongodump: No such file or directory
This works for a full database backup:
./mongodump --out /home/dell/777/ --db dbname
But backing up a single collection from a database is not working.
Use mongoexport to export collection data:
mongoexport --db test --collection mycollection --out myCollection.json
If it's a replica set and you want to use --uri, use it like this, because the documentation states that some options can't be specified together with --uri:
mongodump --uri "mongodb://user:password@mongo-en-1.example.io:27017,mongo-en-2.example.io:27017,mongo-en-3.example.io:27017/$Databasename?replicaSet=$replicasetname&authSource=admin" --collection $collectionname
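For comparison, a sketch of the same dump without --uri, passing the replica set and credentials as individual options; the hosts, user, and password are the placeholders from the command above, and admin as the auth database is an assumption:
mongodump --host "$replicasetname/mongo-en-1.example.io:27017,mongo-en-2.example.io:27017,mongo-en-3.example.io:27017" --username user --password password --authenticationDatabase admin --db "$Databasename" --collection "$collectionname"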
Mongod and pymongo are running correctly. I want to import a JSON file now.
import pymongo
mongoimport --db test --collection dots --file c:\created.json
It just throws a syntax error in Sublime Text with no further explanation. Does anyone see what is wrong with my code?
mongoimport is a command line utility and is independent of pymongo, which is the MongoDB driver for Python. So just open a command prompt and run it there:
mongoimport --db test --collection dots --file c:\created.json
I am trying to import a large JSON data set file into MongoDB using mongoimport.
mongoimport --db test --collection sam1 --file 1234.json --jsonArray
error:
2014-07-02T15:57:16.406+0530 error: object to insert too large
2014-07-02T15:57:16.406+0530 tried to import 1 objects
Please try adding this option: --batchSize 1
Like this:
mongoimport --db test --collection sam1 --file 1234.json --batchSize 1
The data will be parsed and stored in the database batch by batch.
I have a collection named tracks in a db named socialmedia in my MongoDB instance. How can I copy this collection to another MongoDB instance on my network?
Update:
There is only one MongoDB instance.
Export your collection to a file, copy the file to the other machine, and import it there.
Export from the command line to a file:
mongoexport -d socialmedia -c tracks -o filename.json
Import a file (in the same folder) from the command line:
mongoimport -d socialmedia -c tracks --file filename.json
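Put together, a minimal sketch of moving the collection to the other machine over SSH; user@otherhost and the /tmp path are placeholders:
mongoexport -d socialmedia -c tracks -o tracks.json
scp tracks.json user@otherhost:/tmp/tracks.json
ssh user@otherhost 'mongoimport -d socialmedia -c tracks --file /tmp/tracks.json'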
Use cloneCollection
http://docs.mongodb.org/manual/reference/command/cloneCollection/
On the target server, run
db.runCommand({ cloneCollection: "socialmedia.tracks", from: "mongodb.example.net:27017" })
If you wanted to do this on the same server:
db.tracks.copyTo("newNameOfTracks")
http://docs.mongodb.org/manual/reference/method/db.collection.copyTo/
Use mongoimport and mongoexport. You can find an explanation here
mongoimport --db project_test_db --collection users --file export/users.json
mongoexport --db project_test_db --collection users --sort '{fieldName: 1}' --limit 100 --skip 10 --out export/users.json