I'm trying to copy a collection from a MongoDB instance on my local machine to a collection hosted by mongoLabs.
I'm able to dump the collection into a dump directory, but when I try to restore it with the command below I get a No such file or directory: "/dump/my_db/my_coll.bson" error. This is the command I use:
mongorestore -h ds047057.mongolab.com:47057 -d main_db -c main_coll -u xxxx -p xxxx /dump/my_db/my_coll.bson
I still get the same error if I use the full pathname.
Thanks
I believe you want to point mongorestore at the directory containing your db rather than the file containing the specific collection you're targeting. So:
mongorestore -h ds047057.mongolab.com:47057 -d main_db -c main_coll -u xxxx -p xxxx /dump/my_db
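For reference, a default mongodump run produces a layout like the sketch below (the file names here are illustrative); mongorestore expects to be pointed at the database directory rather than at an individual .bson file:

dump/
  my_db/
    my_coll.bson
    my_coll.metadata.json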
Yes! Thanks jared!
The --directoryperdb option did not work for me with authentication, as below:
mongorestore -u xxx_production -p -h 127.0.0.1 --directoryperdb rongyoudao_production_mongodb
while using -d does work:
mongorestore -h 127.0.0.1:27017 -d xxx_production -u xxx -p /root/backups/2014-06-19/xxx_production_mongodb/xxx_production
Every day I make a mongodump to my local machine, but I only want to download a few collections. Is it possible to do that?
My command to download a collection is below:
mongodump -h $MONGODB_SERVICE_HOST -d countly -c fc3d4e90cfa6a1759ca8ca56021e7f18_rma -o /opt/app-root/src/hello -u 'admin' -p $MONGODB_ADMIN_PASSWORD
I am trying to dump my collection into a directory called hello on the server and then download it to my local machine.
You can use mongoexport to export a collection:
mongoexport -h <Remote_Host_address> -d <database_name> -c <collection> -u <user> -p <password> -o <outputfile.json>
And use mongoimport to import the json file into your local db:
mongoimport -h <Local_Host_address> -d <database_name> -c <collection> --file outputfile.json
This assumes that you can connect to the remote mongo database from your local machine. If not, you can run the export on the remote machine and then just scp the file to your box.
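If you do have to take that route, a minimal sketch of the workflow (hostnames and paths here are placeholders) could look like:

# on the remote machine: export the collection to JSON
mongoexport -d <database_name> -c <collection> -u <user> -p <password> -o /tmp/outputfile.json

# from your local machine: copy the file down, then import it
scp <user>@<remote_host>:/tmp/outputfile.json .
mongoimport -d <database_name> -c <collection> --file outputfile.json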
Note that it's not recommended to use mongoexport/import to do full backups of your db. Refer to the pages I linked for more information and parameters.
I've got a postgres dump file on my own machine, which I want to restore on a remote virtual machine on my network (e.g. IP 192.168.0.190, postgres port 5432). Is it possible to restore this dump using pg_restore without copying it to the remote machine? The dump is about 12 GB and the virtual machine only has 20 GB of disk space.
Thanks
You can run a restore over the network without copying the dump to the remote host.
Just invoke pg_restore with -h <hostname> and -p <port> (and probably -U <username> to authenticate as a different user) on the host where you have the dump file, for example:
pg_restore -h 192.168.0.190 -p 5432 -d databasename -U myuser mydump.dump
References:
pg_restore documentation
Alternatively, you can use psql:
psql -h 192.168.0.190 -p 5432 -d <dbname> -U <username> -W -f mydump.dump
An example for a remote RDS instance on AWS:
psql -h mydb.dsdreetr34.eu-west-1.rds.amazonaws.com -p 5432 -d mydbname -U mydbuser -W -f mydatabase-dump.sql
-f, --file=FILENAME execute commands from file, then exit
-W, --password force password prompt (should happen automatically)
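As a side note, psql -f expects a plain SQL dump, while pg_restore expects one of the non-plain formats (custom, tar, or directory). If you are producing the dump yourself, a sketch with placeholder names:

pg_dump -h localhost -U myuser databasename > mydump.sql         # plain SQL, restore with psql -f
pg_dump -Fc -h localhost -U myuser databasename > mydump.dump    # custom format, restore with pg_restore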
You can supply the password non-interactively by setting PGPASSWORD="your_database_password" in your script before running pg_restore.
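A minimal sketch of that, reusing the host and database names from the example above:

# PGPASSWORD is read by pg_restore (and psql), so no interactive prompt is needed
PGPASSWORD="your_database_password" pg_restore -h 192.168.0.190 -p 5432 -d databasename -U myuser mydump.dump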
I ran this and it works for me:
scp backup.dump user@remotemachine:~
ssh user@remotemachine "pg_restore -h localhost -p 5432 -U databaseuser -W -F c -d databasename -v backup.dump"
You can write a script to automate this.
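For example, a minimal sketch of such a script, with placeholder host, user, and database names, and PGPASSWORD in place of the interactive -W prompt:

#!/bin/sh
# Copy the dump to the remote machine, then restore it there.
# PGPASSWORD avoids an interactive password prompt over ssh.
DUMP=backup.dump
REMOTE=user@remotemachine

scp "$DUMP" "$REMOTE":~
ssh "$REMOTE" "PGPASSWORD='your_database_password' pg_restore -h localhost -p 5432 -U databaseuser -F c -d databasename -v ~/$DUMP"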
I have used some commands to import and export a database from mongolab.com to my local MongoDB server. Can anyone please tell me how to retrieve all the data from mongolab.com (cloud db) into a local MongoDB server?
I have been trying these commands on my local MongoDB server at the command prompt:
mongodump -h ds040032.mongolab.com:40032 -d mydb -u <myname> -p <mypass> -o <D:\2016\LearnMongoDB\NEWDB>
mongoexport -h ds040032.mongolab.com:40032 -d mydb -c <collectionname> -u <myname> -p <mypass> -o <D:\2016\LearnMongoDB\Testingf>
mongorestore -h ds040032.mongolab.com:40032 -d mydb -u <myname> -p <mypass> <input db directory>
After entering these commands I don't get any results at the command prompt; the cursor just keeps loading.
try db.copyDatabase
db.copyDatabase('from_mydb','to_mydb','ds040032.mongolab.com:40032',
'<myname>','<mypassword>')
Go to your local mongo shell and run the above command with the appropriate parameters.
In 2017, db.copyDatabase (from the shell) still works, but the format has changed a bit:
db.copyDatabase('mlab_database_name', 'local_database_name', 'ds000000.mlab.com:00000', 'database_user_name', 'database_user_password')
Dumped a MongoDB successfully:
$ mongodump -h ourhost.com:portnumber -d db_name01 -u username -p
I need to import or export it to a test server and am struggling with it; please help me figure it out.
Here is what I tried:
$ mongoimport -h host.com:port -c dbname -d dbname_test -u username -p
connected to host.
Password: ...
Gives this error:
assertion: 9997 auth failed: { errmsg: "auth fails", ok: 0.0 }
$ mongoimport -h host.com:port -d dbname_test -u username -p
Gives this error:
no collection specified!
How do I specify which collection to use? And what should I use for -d: the database I'd like to upload, or the test database on the target server? I would like to import the full DB, not just one collection of it.
The counterpart to mongodump is mongorestore (and the counterpart to mongoimport is mongoexport) -- the major difference is in the format of the files created and understood by the tools (dump and restore read and write BSON files; export and import deal with text file formats: JSON, CSV, TSV).
If you've already run mongodump, you should have a directory named dump, with a subdirectory for each database that was dumped, and a file in those directories for each collection. You can then restore this with a command like:
mongorestore -h host.com:port -d dbname_test -u username -p password dump/dbname/
Assuming that you want to put the contents of the database dbname into a new database called dbname_test.
You may have to specify the authentication database
mongoimport -h localhost:27017 --authenticationDatabase admin -u user -p -d database -c collection --type csv --headerline --file awesomedata.csv
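The same flag exists for mongorestore, so if you are restoring a BSON dump rather than importing a CSV, a sketch of the equivalent (placeholder names) would be:

mongorestore -h localhost:27017 --authenticationDatabase admin -u user -p -d database dump/database/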
For anyone else who might reach this question after all these years (like I did): if you are using
a dump which was created using mongodump,
and you are trying to restore from a dump directory,
and you are going to be using the default port 27017,
then all you have to do is:
mongorestore dump/
Refer to the mongorestore documentation for more info. Cheers!
When you do a mongodump, it dumps in a binary (BSON) format. You need to use mongorestore to "import" this data.
mongoimport is for importing data that was exported using mongoexport.
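To make the pairing concrete, a sketch using the placeholder host and database names from above ("mycollection" is also a placeholder):

# binary pair: whole databases as BSON files
mongodump -h host.com:port -d dbname -u username -p -o dump/
mongorestore -h host.com:port -d dbname_test -u username -p dump/dbname/

# text pair: one collection at a time as JSON/CSV/TSV
mongoexport -h host.com:port -d dbname -c mycollection -u username -p -o mycollection.json
mongoimport -h host.com:port -d dbname_test -c mycollection -u username -p --file mycollection.json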