I'm trying to dump a MongoDB database to an archive, using the following command as given in the documentation.
sudo mongodump --uri=mongodb://username:password@host:27017/dbname?authMechanism=SCRAM-SHA-1&authSource=authdb --archive=file.archive
But it doesn't dump as expected; instead it creates a dump folder with a .json file for each collection, when it should produce a single archive file as described.
It also shows the following error -
--archive=file.archive: command not found
Mongo version -
MongoDB shell version v3.6.3
I had this problem and figured out why it was happening.
The command I was running was
sudo /usr/bin/mongodump --uri=mongodb+srv://{username}:{password}@{atlasendpoint}.mongodb.net/?retryWrites=true&w=majority --archive={filename}.archive --gzip 2>&1
I was getting the same error as you. I fixed it by changing the shell command to wrap the URI in quotation marks: without quotes, the shell treats the & in the query string as a command separator, so everything after it (--archive=...) is run as a separate command, which is where the "command not found" comes from.
sudo /usr/bin/mongodump --uri="mongodb+srv://{username}:{password}@{atlasendpoint}.mongodb.net/?retryWrites=true&w=majority" --archive={filename}.archive --gzip 2>&1
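The same fix applied to the command from the original question (credentials, host, and database kept as the question's placeholders) would look like this:
sudo mongodump --uri="mongodb://username:password@host:27017/dbname?authMechanism=SCRAM-SHA-1&authSource=authdb" --archive=file.archive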
I'm trying to create a backup of my database using mongodump. The problem is that every time I execute the dump, I get the following error:
--collection: command not found
Here is the command:
mongodump --uri=MYURI --collection TEST-COL --gzip --out=/var/backups/testbackup
I'm using Linux, while on Windows the same command seems to work. Any advice?
The URI is a connection string and must be wrapped in double quotes ("").
--uri=<connectionString>
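For example, the command from the question with the URI wrapped in quotes (MYURI kept as the placeholder from the question):
mongodump --uri="MYURI" --collection TEST-COL --gzip --out=/var/backups/testbackup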
The issue seems straightforward. I have a database (test) and a collection called (users), so I run the command:
mongoexport -d test -c users -o output.json
However I get the below error:
From what I have figured out so far on the internet, this may have something to do with the file path, but I am unsure how to amend this as I never mess with the PATH variable due to a bad experience...
You don't run mongoexport from the mongo shell; you have to run it from the OS shell (the same one you run mongo from).
mongoexport is not a mongo shell command; it's an operating system command.
Just as you run mongo.exe from the OS prompt to start the shell, you run mongoexport from the OS prompt. Example:
c:\mongodb\bin>mongoexport --db ventfeed --collection users --out C:\temp\contacts.json
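With the database and collection from the question (assuming the MongoDB bin directory is on your PATH or you run it from there), the equivalent command would be:
mongoexport --db test --collection users --out output.json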
Trying to back up my database.
My MongoDB version is: 3.0.12
I am getting this error:
$ mongodump --out .
2017-05-19T09:45:29.536+0000 Failed: error creating bson file `city/address_/house_sensors:power_sensors.bson`: open city/address_/house_sensors:power_sensors.bson: no such file or directory
Is it because I used a slash character in my collection name?
How can I fix that?
Thanks!
As you pointed out, the problem is with your collection name. I'd recommend renaming it to something without slashes.
If you cannot rename it (e.g. it's used by other systems), use the output option with "-" so the dump is written to standard output, then redirect it to a file:
mongodump -d yourDB -c "your/colName" --out "-" --quiet > col.bson
Then you can restore it with:
mongorestore -d yourDB -c "your/colName" col.bson
I tried mongoimport like this
mongoimport -d test -c foo importfile.json
mongoimport --host localhost --db local --collection lecturer --type json --file temp.json --headerline --upsert
and I got the same error message "Syntax Error: missing ; before statement (shell):1".
What's wrong with my command, and how do I import if my data is stored in C:\Documents and Settings\User\Desktop? Please help, thanks in advance.
mongoimport is intended to be run in the command prompt and not in the mongo shell. Try exiting the shell and running the command there.
One solution is:
First, in cmd, change to the directory containing the mongoimport.exe file, then type your command.
C:\Program Files\MongoDB\Server\3.2\bin> .\mongoimport.exe -d test -c foo --file importfile.json
mongoimport is to be run in the terminal, not inside the mongo shell. To run mongoimport in the terminal, you will need to install it. On Ubuntu, you can do:
apt-get install mongo-tools
Hope this helps :)
I had the same problem and was able to figure it out after a brief struggle and some googling.
1. Navigate to the bin directory in the command prompt
(cd c:..\bin)
2. Run the mongoimport command, but specify the full path of your JSON file (see the example below).
That solves the problem.
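For instance, a command along these lines (a sketch using the path and file name from the question; the quotes are needed because the path contains spaces):
mongoimport -d test -c foo --file "C:\Documents and Settings\User\Desktop\importfile.json"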
Using CSV is also a good option.
mongoimport -d mydb -c things --type csv --file locations.csv --headerline --upsert
You can convert the file with MS Excel.
Open the "Mongo/Server/3.4/bin" folder of mongo db in another command window and try again.It Will work.
Open a new terminal or command prompt in the location of the file you want to import and it should work. It will not work in the MongoDB shell.
I've dumped a MongoDB database with the following mongodump command line:
mongodump -h www.myhost.com -u myusername -p mypassword -d mydb > dump.bson
And I'm trying to restore the dump on my local server:
mongorestore -h localhost -d mydb dump.bson
Unfortunately it fails with the following error:
assertion: 10264 invalid object size: 1096040772
Does anyone know what could cause this error?
On both servers the mongo version is 1.8.3.
Thanks
Because the first line that mongodump writes to standard output is "db level locking enabled: 0", your redirected dump.bson starts with that log text instead of BSON data.
You need to strip the first line and then restore:
tail -n+2 dump.bson > dump_fix.bson
mongorestore -h localhost -d mydb dump_fix.bson
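Alternatively, if you can re-run the dump, a sketch of the more conventional approach (paths are illustrative) is to skip the shell redirection and let mongodump write its own dump directory, then restore from that directory:
mongodump -h www.myhost.com -u myusername -p mypassword -d mydb --out dump/
mongorestore -h localhost -d mydb dump/mydb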
Excuse my English :P This happened to me when I did an export with mongoexport and tried to import it with mongorestore :D My mistake! I had to use mongoimport.
Remember: mongoexport pairs with mongoimport, and mongodump pairs with mongorestore.
I hope this is useful to someone :P
I also encountered this problem, and finally found that it was caused by using the mongodump command in the wrong way.
Well, use mongorestore instead of mongodump.
This isn't explained very well anywhere that I looked, but I found a solution that worked.
I downloaded a .tgz file from mongolab, which contained .bson and .json files in it.
I created a ~/dump folder on my mac.
I copied all those .bson and .json files into the ~/dump folder, so I had ~/dump/users.bson for example.
I ran this command in terminal:
mongorestore -h 127.0.0.1 --db <the_db_name_on_server_this_backup_is_from>
It imported in seconds (by default mongorestore reads from a dump/ directory under the current working directory, which is why the ~/dump folder is picked up when the command is run from the home directory). I'm sure there are other ways/options, but this is what worked for me.
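Equivalently, you can point mongorestore at the dump folder explicitly (host and placeholder kept from the answer above; the path assumes the .bson files sit directly in ~/dump):
mongorestore -h 127.0.0.1 --db <the_db_name_on_server_this_backup_is_from> ~/dump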