MongoDB dump and pipe to another db name

MongoDB version 3.2.12. I have two local databases, "base1" and "base2".
I want to copy all data (all collections) from base1 over to base2, replacing everything there (like when dumping production to a dev environment).
Any pipe command (or other simple way) to do this?
I tried
mongodump --archive --db base1 | mongorestore --db base2 --archive
It lists a lot of "writing base1.collectionname to archive on stdout", but nothing gets written to base2.
I also tried
mongodump --db base1 --gzip --archive=/path/to/file.gz
mongorestore --db base2 --gzip --archive=/path/to/file.gz
The dump works; the restore just says "creating intents for archive" and then "done".

I came across the same issue, and after some googling I found this post:
https://stackoverflow.com/a/43810346/3785901
I tried the command mentioned there:
mongodump --host HOST:PORT --db SOURCE_DB --username USERNAME --password PASSWORD --archive | mongorestore --host HOST:PORT --nsFrom 'SOURCE_DB.*' --nsTo 'TARGET_DB.*' --username USERNAME --password PASSWORD --archive --drop
and it works like a charm.
It should work in your case. Good luck.
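For the local base1/base2 case in the question (assuming no authentication, the default host/port, and tools new enough to support --nsFrom/--nsTo, i.e. 3.4+), the piped form would look roughly like this:
# rename base1 to base2 on the fly while restoring from the piped archive; --drop replaces existing collections in base2
mongodump --db base1 --archive | mongorestore --archive --nsFrom 'base1.*' --nsTo 'base2.*' --drop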

I use the following commands:
mongodump \
--host ${mongo.host} \
--port ${mongo.port} \
--username ${mongo.backup_restore_user} \
--password ${mongo.backup_restore_password} \
--db ${mongo.db} \
--gzip \
--dumpDbUsersAndRoles \
--archive=${archive}
and
mongorestore \
--keepIndexVersion \
--drop \
--gzip \
--restoreDbUsersAndRoles \
--db ${mongo.db} \
--host ${mongo.host} --port ${mongo.port} \
--username ${mongo.backup_restore_user} \
--password ${mongo.backup_restore_password} \
--archive=${archive}
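If the goal is to restore that same archive under a different database name (as in the original question), the restore side can use --nsFrom/--nsTo in place of --db; a minimal sketch, where ${mongo.target_db} is a hypothetical placeholder for the new name:
# ${mongo.target_db} is hypothetical; --nsFrom/--nsTo take the place of --db here
mongorestore \
--drop \
--gzip \
--host ${mongo.host} --port ${mongo.port} \
--username ${mongo.backup_restore_user} \
--password ${mongo.backup_restore_password} \
--nsFrom "${mongo.db}.*" \
--nsTo "${mongo.target_db}.*" \
--archive=${archive}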

Related

mongorestore out of memory

The mongo process was killed due to running out of memory after importing data from a gz file (~1.75 GB) into MongoDB (running in a Docker container).
It used about 10 GB of RAM. I tried --numParallelCollections and --numInsertionWorkersPerCollection, but it still doesn't work.
mongorestore -u superAdmin -p 'password123321!' \
--numParallelCollections 1 \
--numInsertionWorkersPerCollection=1 \
--authenticationDatabase=admin --drop --db "logs" --gzip logs.bson.gz

Mongorestore in docker failed: Failed: gzip: invalid header

I created a mongo dump with the following commands (as suggested in this answer):
docker exec -it mongodb bash
mongodump --host $cluster --ssl --username $username --authenticationDatabase admin --db $dbname --gzip --archive > dumpname.gz
Now when I'm trying to restore the dump with
docker exec mongodb bash -c 'mongorestore --gzip --archive=dumpname.gz'
I get
Failed: gzip: invalid header
It seems there is some issue with writing the archive through shell redirection (>). When I changed the first command not to use it, mongorestore started to work:
mongodump --host $cluster --ssl --username $username --authenticationDatabase admin --db $dbname --gzip --archive=dumpname.gz
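If the archive needs to end up on the host rather than inside the container, docker cp can fetch it afterwards (assuming the container is still named mongodb; adjust the source path to wherever dumpname.gz was written inside the container):
# container-side path is a guess; point it at the actual location of dumpname.gz
docker cp mongodb:/dumpname.gz ./dumpname.gz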
Some similar problems could be found here

mongodump 3.2.1 positional arguments not allowed

I am trying mongodump with the following options and get "positional arguments not allowed":
mongodump --host=hostname --port=27017 --db=db --out=/path --oplog --gzip
I tried mongodump -h hostname -d dbname and that works.
What does the message
positional arguments not allowed
mean?
You got the syntax wrong in the first command; in this version you need to remove the = signs. See the documentation.
mongodump --host hostname --port 27017 --db db --out /path --oplog --gzip
mongodump -d <dbname> -o <backUpPath>
like this:
mongodump -d projectdb -o /Users/zhangzhanqi/Desktop/backup_mongo/aaa
The syntax has changed so that the = between an argument name and its value is replaced with a space. To make the point clear, here are the two general forms, using long and short argument names.
Long parameter form:
mongodump --host hostname --port 27017 --db db --out /path --oplog --gzip
Short parameter form (note that --port has no single-letter alias; -p is the short form of --password):
mongodump -h hostname --port 27017 -d db -o /path --oplog --gzip
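Alternatively, the port can be folded into the host value:
mongodump -h hostname:27017 -d db -o /path --oplog --gzip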
You can find more explanation and examples in the following link:
https://docs.mongodb.com/manual/reference/program/mongodump/

Mongorestore to a different database

In MongoDB, is it possible to dump a database and restore the content to a different database? For example like this:
mongodump --db db1 --out dumpdir
mongorestore --db db2 --dir dumpdir
But it doesn't work. Here's the error message:
building a list of collections to restore from dumpdir dir
don't know what to do with subdirectory "dumpdir/db1", skipping...
done
You need to actually point at the "database name" container directory "within" the output directory from the previous dump:
mongorestore -d db2 dumpdir/db1
And usually just <path> is fine as a positional argument rather than using --dir, which would only be needed when the path is "out of position", i.e. "in the middle of the arguments list".
P.S. For an archive backup file (tested with mongorestore v3.4.10):
mongorestore --gzip --archive=${BACKUP_FILE_GZ} --nsFrom "${DB_NAME}.*" --nsTo "${DB_NAME_RESTORE}.*"
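The matching dump for that kind of archive would be along these lines (same variables as above):
mongodump --db "${DB_NAME}" --gzip --archive=${BACKUP_FILE_GZ}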
mongodump --db=DB_NAME --out=/path-to-dump
mongorestore --nsFrom "DB_NAME.*" --nsTo "NEW_DB_NAME.*" /path-to-dump
In addition to the answer from Blakes Seven: if your databases use authentication, I got this to work using the --uri option, which requires a fairly recent mongo version (> 3.4.6):
mongodump --uri="mongodb://$sourceUser:$sourcePwd#$sourceHost/$sourceDb" --gzip --archive | mongorestore --uri="mongodb://$targetUser:$targetPwd#$targetHost/$targetDb" --nsFrom="$sourceDb.*" --nsTo="$targetDb.*" --gzip --archive
Thank you, @Blakes Seven!
Adding Docker notes:
Container names are interchangeable with container IDs.
(Assumes authentication is enabled and that the containers are named my_db and new_db.)
dump:
docker exec -it my_db bash -c "mongodump --uri mongodb://db:password@localhost:27017/my_db --archive --gzip | cat > /tmp/backup.gz"
copy to workstation:
docker cp my_db:/tmp/backup.gz c:\backups\backup.gz
copy into the new container (from the backups folder):
docker cp .\backup.gz new_db:/tmp
restore from container tmp folder:
docker exec -it new_db bash -c "mongorestore --uri mongodb://db:password@localhost:27017/new_db --nsFrom 'my_db.*' --nsTo 'new_db.*' --gzip --archive=/tmp/backup.gz"
You can restore a DB under another name. The syntax is:
mongorestore --port 27017 -u "username" -p "password" \
--nsFrom "dbname.*" \
--nsTo "new_dbname.*" \
--authenticationDatabase admin /backup_path

Upload Data into MongoLab database from terminal

I'm having trouble figuring out how to upload CSV data to my MongoLab database. From my terminal I have used
sudo mongoimport --db heroku_hkr86p3z -u <dbusername> -p <dbpassword> --collection contributors --type csv --headerline --file /Users/tonywinglau/Desktop/independent-expenditure.csv
and
sudo mongoimport --host mongodb://<username>:<password>@ds035310.mlab.com:35310/heroku_hkr86p3z --db heroku_hkr86p3z -u <username> -p <password> --collection contributors --type csv --headerline --file /Users/tonywinglau/Desktop/independent-expenditure.csv
both of which respond with
Failed: error connecting to db server: no reachable servers
imported 0 documents
From what I have read, it might have something to do with my mongo config file (which I can't find, if it even exists) being set to only connect to localhost. How do I import data directly into my MongoLab-hosted database?
Your command line should look like this:
mongoimport -d <databasename> -c <collectionname> --type csv --file <filelocation/file.csv> --host <hostdir example:ds011291.mlab.com> --port <portnumber example:11111> -u <username> -p <password> --headerline
The host address and the port number are given by mLab when you create the database.
Example:
ds000000.mlab.com:000000/databaseName
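Plugging the values from the question into that form (credentials left as placeholders), it comes out to something like:
mongoimport -d heroku_hkr86p3z -c contributors --type csv --file /Users/tonywinglau/Desktop/independent-expenditure.csv --host ds035310.mlab.com --port 35310 -u <dbusername> -p <dbpassword> --headerline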