Could not export whole database using MongoDB in Ubuntu

I need to export and import an entire database (all collections) in MongoDB on Ubuntu. This is the command I am running:
sudo mongoexport --db FGDP --out /home/subrajyoti/Downloads/newdbexport.json;
It fails with the following error message:
2016-12-22T10:28:46.290+0530 error validating settings: must specify a collection
2016-12-22T10:28:46.290+0530 try 'mongoexport --help' for more information
I need to export all collections rather than a single one. How can I do this?

To export all collections of all databases, use mongodump with the following command:
mongodump -o <directory_backup>
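Note that mongodump writes BSON, not JSON. If per-collection JSON files are actually wanted (as the mongoexport attempt above suggests), one sketch is to loop mongoexport over the collection names; this assumes a local server, the mongo shell on the PATH, and the database name and output path from the question:

```shell
db="FGDP"
out_dir="/home/subrajyoti/Downloads"

# List the collection names via the mongo shell, one per line.
collections=$(mongo "$db" --quiet --eval 'db.getCollectionNames().join("\n")')

# Export each collection to its own JSON file.
for coll in $collections; do
  mongoexport --db "$db" --collection "$coll" --out "$out_dir/$coll.json"
done
```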


How to install mongoimport on Mac OS X

I have installed MongoDB following the instructions. However, only mongo is present in /usr/local/bin.
When I tried to import a JSON file into my database, it showed an error message:
> mongoimport --jsonArray --db hw4 --collection restaurants --file restaurants.json
2019-11-12T15:18:28.737-0600 E QUERY [js] uncaught exception: SyntaxError: unexpected token: identifier :
#(shell):1:14
It seems there's no mongoimport installed. How could I install it, or is there another way to import data?
The --jsonArray option could be causing the issue and is probably not required. What does restaurants.json look like?
mongoimport should be run from the shell command line, not from inside the mongo shell.
To ensure the correct installation of MongoDB and its associated tools, please follow the procedure outlined in Install MongoDB Community Edition on macOS.
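If Homebrew is available, the import/export tools (which ship separately from the server as the database-tools package) can typically be installed like this; package names are from MongoDB's official Homebrew tap:

```shell
# mongoimport is part of the database tools, not the server package.
brew tap mongodb/brew
brew install mongodb-database-tools

# Verify it is now on the PATH:
mongoimport --version
```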

Mongorestore not restoring data from URI

Recently I was asked to restore a MongoDB database, but all I was given was the following URI: mongodb://localhost:27017/testdb
I ran mongod and then the following command from within my /usr/local/mongodb/bin folder:
mongorestore --uri "mongodb://localhost:27017/testdb"
After running this command I get the following output:
2019-05-26T23:00:27.148-0400 using default 'dump' directory
2019-05-26T23:00:27.149-0400 preparing collections to restore from
2019-05-26T23:00:27.150-0400 done
However, nothing seems to have happened. This is the first time I have done this, and I don't know what is going on. I'd really appreciate it if someone could tell me what I am doing wrong. Thank you!
mongorestore --uri mongodb://localhost:27017/dbName --db YOUR_DB_NAME YOUR_TARGET_FOLDER/YOUR_DB_NAME
e.g.:
mongorestore --uri mongodb://localhost:27017/testdb --db testdb --drop ../path/mongodump/testdb
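For context, mongorestore restores from the BSON files produced by mongodump; with no path argument it looks in a default ./dump directory, so if that directory is missing or empty the "done" output above restores nothing. A minimal round trip might look like this (the backup path is a placeholder):

```shell
# Create a dump of testdb, then restore it; adjust the path as needed.
mongodump --uri "mongodb://localhost:27017/testdb" --out /path/to/backup
mongorestore --db testdb --drop /path/to/backup/testdb
```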

MongoImport Error: X509_STORE_add_cert:cert already in hash table

I am currently trying to import a group of JSON files containing data into my mongo database hosted on IBM Bluemix/Compose.
I have a script that iterates over the files, building and running a mongoimport command for each one. This works great against my local database (and occasionally even against the Compose database), but most of the time I get the following error:
2017-05-09T14:59:02.508+0100 Failed: error connecting to db server:
SSL errors: x509 certificate routines:X509_STORE_add_cert:cert
already in hash table x509 certificate
2017-05-09T14:59:02.508+0100 imported 0 documents
My mongoimport command looks like this:
mongoimport --batchSize 100 --ssl --sslAllowInvalidCertificates --host *censored* --collection Personnel --file data/TestData/Personnel_WICS.json -u admin -p *censored* -d MY_DB --authenticationDatabase admin
Is this a mongoimport error? Perhaps an issue with Compose? Or am I doing something incorrectly with the command?
I should note that the files I am importing range in size from 3mb-100mb, but even reducing the larger file sizes down by splitting them up does not seem to help.
My import script runs each import command immediately after the previous one completes; could running several back-to-back imports like this be the issue?
For anyone finding this in the future: it looks like this may have been caused by a mismatch in MongoDB versions between the machine running the mongoimport command and the database hosted in Compose.
Compose DB Version: 3.2
Build server machine (running mongoimport): 3.4
Downgrading the build server version has resolved the issue.
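A quick way to compare the two versions before downgrading; the host and credentials below are placeholders for the Compose connection details:

```shell
# Version of the local database tools:
mongoimport --version

# Version of the remote server (fill in your own host and credentials):
mongo --host your-compose-host --ssl -u admin -p yourpassword --eval 'db.version()'
```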

Export number of Documents from mongodb

I use MongoChef as a UI client for my MongoDB database. I have a collection consisting of 12,000 records, and I want to export them using MongoChef.
I have tried the built-in export option, which works fine for up to 3,000 documents, but as the number of records increases the system hangs.
Can you please let me know the best way to export all the documents using MongoChef?
Thanks.
Finally I came to the conclusion that using mongoexport from the terminal is the most efficient way. I read about primary and secondary databases and executed the following command:
mongoexport --username user --password pass --host host --db database --collection coll --out file_name.json
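If the point of reading about primaries and secondaries was to keep export load off the primary, mongoexport also accepts a --readPreference option. A hedged variant (the replica-set name and host names below are placeholders):

```shell
# Read from a secondary member of the replica set instead of the primary.
mongoexport --username user --password pass \
  --host rs0/host1:27017,host2:27017 \
  --readPreference secondary \
  --db database --collection coll --out file_name.json
```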

Inserting data from json file to mongodb

I am learning MongoDB, and for practice I downloaded the restaurant dataset from the MongoDB site. I am on Windows and MongoDB is installed properly.
Now I want to insert all the restaurant documents (i.e. the JSON data) into MongoDB. From cmd I tried this command:
mongoimport --db test --collection restaurants --drop --file ~/downloads/primer-dataset.json
but it failed with the message:
SyntaxError: missing ; before statement #(shell):1:4
How can I solve this error? I couldn't find a satisfactory answer even after spending a lot of time on it.
mongoimport must be run from the Windows command prompt, not the mongo shell.
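Concretely, the command should be typed at the C:\> prompt, not at the > prompt inside the mongo shell. Note also that cmd does not expand ~, so the file path must be spelled out in full (the path below is an example):

```shell
mongoimport --db test --collection restaurants --drop --file C:\Users\me\Downloads\primer-dataset.json
```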