How to export a collection to CSV in MongoDB?

How do you export all the records in a MongoDB collection to a .csv file?
mongoexport --host localhost --db dbname --collection name --type=csv > test.csv
This asks me to specify the names of the fields I need to export. Can I just export all the fields without specifying their names?

#karoly-horvath has it right: fields are required for CSV.
According to this issue in the MongoDB tracker, https://jira.mongodb.org/browse/SERVER-4224, you MUST provide the fields when exporting to CSV. The docs are not clear about this, which is the reason for the error.
Try this:
mongoexport --host localhost --db dbname --collection name --csv --out text.csv --fields firstName,middleName,lastName
UPDATE:
This commit: https://github.com/mongodb/mongo-tools/commit/586c00ef09c32c77907bd20d722049ed23065398 fixes the docs for 3.0.0-rc10 and later. It changes
Fields string `long:"fields" short:"f" description:"comma separated list of field names, e.g. -f name,age"`
to
Fields string `long:"fields" short:"f" description:"comma separated list of field names (required for exporting CSV) e.g. -f \"name,age\" "`
VERSION 3.0 AND ABOVE:
You should use --type=csv instead of --csv, since the --csv flag has been deprecated.
More details: https://docs.mongodb.com/manual/reference/program/mongoexport/#export-in-csv-format
Full command:
mongoexport --host localhost --db dbname --collection name --type=csv --out text.csv --fields firstName,middleName,lastName

Also, spaces are not allowed between comma-separated field names.
BAD:
-f firstname, lastname
GOOD:
-f firstname,lastname

mongoexport --help
....
-f [ --fields ] arg comma separated list of field names e.g. -f name,age
--fieldFile arg file with fields names - 1 per line
You have to manually specify it and if you think about it, it makes perfect sense. MongoDB is schemaless; CSV, on the other hand, has a fixed layout for columns. Without knowing what fields are used in different documents it's impossible to output the CSV dump.
If you have a fixed schema, you could retrieve one document, harvest the field names from it with a script, and pass them to mongoexport.
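For example, a minimal sketch of that idea, assuming a local unauthenticated server and that the first document carries every field (dbname and collname are placeholder names):
# grab the field names of one document, then feed them to mongoexport
fields=$(mongo dbname --quiet --eval 'print(Object.keys(db.collname.findOne()).join(","))')
mongoexport --db dbname --collection collname --type=csv --fields "$fields" --out dump.csv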

If you want, you can export all collections to CSV without listing the fields by hand (all fields will be exported).
From http://drzon.net/export-mongodb-collections-to-csv-without-specifying-fields/ run this bash script:
OIFS=$IFS;
IFS=",";
# fill in your details here
dbname=DBNAME
user=USERNAME
pass=PASSWORD
host=HOSTNAME:PORT
# first get all collections in the database
collections=`mongo "$host/$dbname" -u $user -p $pass --quiet --eval "rs.slaveOk();db.getCollectionNames();"`;
# or, without authentication:
# collections=`mongo $dbname --quiet --eval "rs.slaveOk();db.getCollectionNames();"`;
collectionArray=($collections);
# for each collection
for ((i=0; i<${#collectionArray[@]}; ++i));
do
    echo 'exporting collection' ${collectionArray[$i]}
    # get a comma separated list of keys by peeking into the first document in the collection
    keys=`mongo "$host/$dbname" -u $user -p $pass --eval "rs.slaveOk();var keys = []; for(var key in db.${collectionArray[$i]}.find().sort({_id: -1}).limit(1)[0]) { keys.push(key); }; keys;" --quiet`;
    # now use mongoexport with the set of keys to export the collection to csv
    mongoexport --host $host -u $user -p $pass -d $dbname -c ${collectionArray[$i]} --fields "$keys" --csv --out $dbname.${collectionArray[$i]}.csv;
done
IFS=$OIFS;

Works for me when remoting to a docker container running mongo:4.2.6:
mongoexport -h mongodb:27017 --authenticationDatabase=admin -u username -p password -d database -c collection -q '{"created_date": { "$gte": { "$date": "2020-08-03T00:00:00.000Z" }, "$lt": { "$date": "2020-08-09T23:59:59.999Z" } } }' --fields=somefield1,somefield2 --type=csv --out=/archive.csv

Easily export a CSV or JSON file with the MongoDB Compass tool.
As the GUI for MongoDB, MongoDB Compass allows you to make smarter decisions about document structure, querying, indexing, document validation, and more. Commercial subscriptions include technical support for MongoDB Compass.
https://www.mongodb.com/try/download/compass

I could not get mongoexport to do this for me. I found that, to get an exhaustive list of all the fields, you need to loop through the entire collection once. Use this to generate the headers, then loop through the collection again to populate those headers for each document.
I've written a script to do just this, converting MongoDB docs to CSV irrespective of schema differences between individual documents.
https://github.com/surya-shodan/mongoexportcsv
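For reference, a rough shell sketch of the same idea (this is not the linked script): collect the union of all field names in one pass, then let mongoexport do the second pass. It assumes a local unauthenticated server and a collection small enough to scan from the shell; dbname and collname are placeholders.
fields=$(mongo dbname --quiet --eval '
var keys = {};
db.collname.find().forEach(function (doc) {
    Object.keys(doc).forEach(function (k) { keys[k] = 1; });
});
print(Object.keys(keys).join(","));
')
mongoexport --db dbname --collection collname --type=csv --fields "$fields" --out full.csv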

Also, if you want to export inner JSON fields, use the dot (.) operator.
JSON record:
{
    "_id" : "00118685076F2C77",
    "value" : {
        "userIds" : [
            "u1"
        ],
        "deviceId" : "dev"
    }
}
mongoexport command with dot operator (using mongo version 3.4.7):
./mongoexport --host localhost --db myDB --collection myColl \
    --type=csv --out out.csv --fields value.deviceId,value.userIds
Output csv:
value.deviceId,value.userIds
d1,"[""u1""]"
d2,"[""u2""]"
Note: Make sure you do not export an array field; it corrupts the CSV layout, as the userIds field above shows.
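If you do need something from the array, one hedged workaround (not from the answer above) is to flatten it into a temporary collection first and export that instead. A sketch against the example collection, assuming mongo shell 3.2+ for $arrayElemAt; myColl_flat is a hypothetical collection name:
# write a flattened copy with only scalar fields, then export it
mongo myDB --quiet --eval 'db.myColl.aggregate([
    {$project: {_id: 0, deviceId: "$value.deviceId", firstUserId: {$arrayElemAt: ["$value.userIds", 0]}}},
    {$out: "myColl_flat"}
])'
mongoexport --db myDB --collection myColl_flat --type=csv --fields deviceId,firstUserId --out out.csv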

Solution for MongoDB Atlas users!
Add the --fields parameter as comma-separated field names enclosed in double quotes:
--fields "<FIELD 1>,<FIELD 2>..."
Here is a complete example:
mongoexport --host Cluster0-shard-0/shard1URL.mongodb.net:27017,shard2URL.mongodb.net:27017,shard3URL.mongodb.net:27017 --ssl --username <USERNAME> --password <PASSWORD> --authenticationDatabase admin --db <DB NAME> --collection <COLLECTION NAME> --type <OUTPUT FILE TYPE> --out <OUTPUT FILE NAME> --fields "<FIELD 1>,<FIELD 2>..."

This works for me; try it:
mongoexport --host cluster0-shard-dummy-link.mongodb.net:27017 --db yourdbname --forceTableScan --collection users --type json --out /var/www/html/user.json --authenticationDatabase admin --ssl --username Yourusername --password Yourpassword
The above command returns all documents in the users collection.
If you want to restrict the output to specific fields, add --fields=email,name.
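For instance, the same command restricted to those two fields might look like this (a sketch; the field names are just examples):
mongoexport --host cluster0-shard-dummy-link.mongodb.net:27017 --db yourdbname --forceTableScan --collection users --type json --fields=email,name --out /var/www/html/user.json --authenticationDatabase admin --ssl --username Yourusername --password Yourpassword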

For all those who are stuck with an error, let me give you a solution with a brief explanation.
Command:
mongoexport --host your_host --port your_port -u your_username -p your_password --db your_db --collection your_collection --type=csv --out file_name.csv --fields all_the_fields --authenticationDatabase admin
--host --> host of Mongo server
--port --> port of Mongo server
-u --> username
-p --> password
--db --> db from which you want to export
--collection --> collection you want to export
--type --> export format, CSV in my case
--out --> file name you want to export to
--fields --> all the fields you want to export (do not put spaces between the comma-separated field names)
--authenticationDatabase --> database where all your user information is stored
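As a concrete illustration, the same template with hypothetical example values filled in:
mongoexport --host 127.0.0.1 --port 27017 -u report_user -p 's3cret' --db shop --collection orders --type=csv --out orders.csv --fields orderId,total,createdAt --authenticationDatabase admin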

The command below exports a collection to CSV format.
Note: naag is the database, employee1_json is the collection.
mongoexport --db naag --collection employee1_json --type csv --out /home/orienit/work/mongodb/employee1_csv_op1

Related

export csv from MongoDB

I am new to MongoDB. I want to export some fields to a CSV file, and if a field is not present in a particular document, I want an empty value for that field. Currently I am trying this:
mongoexport --host hostname --collection collectionname -q '{}' -f "field1","field2" --db dbname --username user --password pass --out out.csv
But the problem is that the output does not keep the field if its value is not present in the database. Any suggestions on how I can perform the desired operation?
Try:
mongoexport --host hostname --username user --password pass --db dbname --collection collectionname --type=csv --fields field1,field2 --query '{field1: { $exists: true}, field2: { $exists: true}}' --out out.csv
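Note that the $exists filter above drops documents that lack the fields entirely. If you instead want every document exported with an empty value for missing fields, one hedged alternative (not part of the answer above) is to materialize empty strings with $ifNull into a temporary collection and export that; collectionname_csv is a hypothetical name:
# build a temporary collection where missing fields become empty strings
mongo hostname/dbname -u user -p pass --quiet --eval 'db.collectionname.aggregate([
    {$project: {_id: 0, field1: {$ifNull: ["$field1", ""]}, field2: {$ifNull: ["$field2", ""]}}},
    {$out: "collectionname_csv"}
])'
mongoexport --host hostname --username user --password pass --db dbname --collection collectionname_csv --type=csv --fields field1,field2 --out out.csv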

mongoexport fields from subdocuments to csv

I am trying to export a field from a subdocument with no luck.
Here is my syntax;
mongoexport -d test -c accounts -f account_number,situses.coordinates -o coordinates.csv --type=csv
The output includes the account_number but not the coordinates field from the subdocument.
According to the docs, this is supposed to work.
The following will export the entire situses subdocument, but I only want the one field.
mongoexport -d test -c accounts -f account_number,situses -o coordinates.csv --type=csv
Am I just referencing the subdocument field wrong or something?
I'm running Mongodb 3.0.4
ADDITIONAL INFO
The following syntax worked on an earlier version of Mongodb (2.6.x ?). Notice the subdoc.0.fieldname syntax.
mongoexport -d test -c accounts -f account_number,situses.0.coordinates -o coordinates.csv --csv
It appears support for directly referencing a subdocument has been removed.
There is a mistake in your syntax.
Since mongo version 3.0.0, mongoexport has deprecated the --csv option; use the --type=csv option to specify CSV format for the output.
You should use:
mongoexport --db tests --collection accounts --type=csv --fields account_number,situses --out coordinates.csv
For nested fields you should use:
mongoexport --db tests --collection accounts --csv --fields 'account_number,situses.0.coordinates' --out /home/vishwas/c1.csv
EDIT for mongo 3.0 with sub documents:
You need to create a separate collection with the required fields from the subdocuments, like:
db.test.aggregate({"$unwind":"$situses"},{"$project":{"_id":0,"account_number":1,"siteUsesCo":"$situses.coordinates"}},{"$out" : "forcsv"})
If you want only one field from the subdocument, then use aggregation like:
db.test.aggregate({"$unwind":"$situses"},{"$limit":1},{"$project":{"_id":0,"account_number":1,"siteUsesCo":"$situses.coordinates"}},{"$out" : "forcsv"})
And then export from the forcsv collection like:
mongoexport --db test --collection forcsv --csv --fields 'account_number,siteUsesCo' --out coordinates.csv
And after exporting, delete the forcsv collection.
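For completeness, dropping that temporary collection from the shell is a one-liner; a small sketch:
mongo test --quiet --eval 'db.forcsv.drop()'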
And one more solution, where you can configure the output in a flexible way:
mongo host:port/test --quiet query.js -u username -p passw0rd > accounts.csv
and query.js:
db = db.getSiblingDB('test');
db.getCollection('accounts').find({}, {account_number:1, situses:1, _id:0}).forEach(
function(item_data) { print(`${item_data.account_number},${item_data.situses[0].coordinates}`); });
It looks like this is a known bug, to be fixed in 3.0.5.
See this: https://jira.mongodb.org/browse/TOOLS-657

how to run mongoexport csv with a query

I'm trying to export a MongoDB query to a CSV file. Here's what I have:
mongoexport --db db_name --collection agents --query ‘{ $and: [ {clients_count: {$gt:2}}, {vendors_count:{$gt:10}} ] }’ --csv --fieldFile userFields.txt --out outputFilePathAndName.csv
I got the following error:
Error parsing command line: too many positional options
What am I doing wrong?
Got it. The correct query:
mongoexport --db db_name --collection collectionName --query '{$and:[{clients_count:{$gt:2}},{vendors_count:{$gt:10}}]}' --csv --fieldFile userFields.txt --out filepat/fileName.csv
The key is to use plain single quotes ' around the query (the curly quotes in the original command are what the shell choked on, causing the "too many positional options" error) and to leave no spaces in the query.
EDIT:
Still having an issue with the fields, but it seems to be grabbing the correct documents.

How to get mongo command results in to a flat file

How do I export the results of a MongoDB command to a flat file?
For example, if I want to get the output of db.collectionname.find() into a flat file.
I tried db.collectionname.find() >> "test.txt", but it doesn't seem to work.
You can try the following from the command line:
mongo 127.0.0.1/db --eval "var c = db.collection.find(); while(c.hasNext()) {printjson(c.next())}" >> test.txt
Assuming you have a database called 'db' running on localhost and a collection called 'collection', this will export all records into a file called test.txt.
If you have a longer script that you want to execute, you can also create a script.js file
and just use
mongo 127.0.0.1/db script.js >> test.txt
I hope this helps
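A minimal sketch of such a script.js, using the same hypothetical database and collection names as above:
cat > script.js <<'EOF'
// print every document in the collection, one JSON object per line
var c = db.collection.find();
while (c.hasNext()) { printjson(c.next()); }
EOF
mongo 127.0.0.1/db script.js >> test.txt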
I know of no way to do that from the mongo shell directly, but you can get mongoexport to execute queries and send the results to a file with the -q and -o options:
mongoexport -h mongo.dev.priv -d models -c profiles -q '{ $query : { _id : "MRD461000" } }' -o MRD_Series1.json
The above queries the profiles collection in the models database, grabbing the JSON document for _id = "MRD461000". Works for me.
Use this
mongo db_name --username user --password password < query1.js >> result.txt
Try this; it returns a JSON file with the data from the query. You can change .json to .txt or another extension.
mongoexport --db products --collection clicks --query '{"createdInt":{$gte:20190101}, "clientId":"123", "country":"ES"}' --out clicks-2019.json
Having missed that the db in Peshkira's answer needs to be the actual database name, here is a general syntax for a one-liner in the shell (assuming no password):
mongo <host>:<db name> --eval "var x = <db name>.<collection name>.<query>; while(x.hasNext()) { printjson( x.next() ) }" >> out.txt
I tested it both on my mac and Google cloud Ubuntu 15 with Mongo 3+.
Install MongoDB Compass; it has a tool to export query results to JSON/CSV files.
mongoexport --host 127.0.0.1 --port 27017 --username youruser -p yourpass \
-d yourDatabaseName -c collectionName --type csv \
--fields field1,field2 -q '{"field1" : 1495730914381}' \
--out report.csv
mongoexport --db db_name --collection collection_name --csv --out file_name.csv -f field1,field2,field3

How to use mongoimport to import CSV files?

CSV file with contact information:
Name,Address,City,State,ZIP
Jane Doe,123 Main St,Whereverville,CA,90210
John Doe,555 Broadway Ave,New York,NY,10010
Running this doesn't add documents to the database:
$ mongoimport -d mydb -c things --type csv --file locations.csv --headerline
Trace says imported 1 objects, but in the MongoDB shell running db.things.find() doesn't show any new documents.
What am I missing?
Your example worked for me with MongoDB 1.6.3 and 1.7.3. Example below was for 1.7.3. Are you using an older version of MongoDB?
$ cat > locations.csv
Name,Address,City,State,ZIP
Jane Doe,123 Main St,Whereverville,CA,90210
John Doe,555 Broadway Ave,New York,NY,10010
ctrl-d
$ mongoimport -d mydb -c things --type csv --file locations.csv --headerline
connected to: 127.0.0.1
imported 3 objects
$ mongo
MongoDB shell version: 1.7.3
connecting to: test
> use mydb
switched to db mydb
> db.things.find()
{ "_id" : ObjectId("4d32a36ed63d057130c08fca"), "Name" : "Jane Doe", "Address" : "123 Main St", "City" : "Whereverville", "State" : "CA", "ZIP" : 90210 }
{ "_id" : ObjectId("4d32a36ed63d057130c08fcb"), "Name" : "John Doe", "Address" : "555 Broadway Ave", "City" : "New York", "State" : "NY", "ZIP" : 10010 }
I was perplexed by a similar problem where mongoimport did not give me an error but would report importing 0 records. I had saved my file using Excel for Mac 2011 with the default "Save As..." "xls as csv" option, without specifically choosing the "Windows Comma Separated (.csv)" format. After researching this site and trying "Save As" again using the "Windows Comma Separated (.csv)" format, mongoimport worked fine. I think mongoimport expects a newline character at the end of each line, and the default Mac Excel 2011 CSV export didn't provide that character.
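If re-exporting from Excel isn't convenient, a small sketch of the same fix from the command line, assuming the file really does have old Mac CR line endings (file names are hypothetical):
# convert CR line endings to Unix newlines, then import the converted file
tr '\r' '\n' < locations_mac.csv > locations_unix.csv
mongoimport -d mydb -c things --type csv --file locations_unix.csv --headerline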
We need to execute the following command:
mongoimport --host=127.0.0.1 -d database_name -c collection_name --type csv --file csv_location --headerline
-d is database name
-c is collection name
--headerline If using --type csv or --type tsv, uses the first line as field names. Otherwise, mongoimport will import the first line as a distinct document.
For more information: mongoimport
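Conversely, if the CSV has no header row, you can name the columns yourself with --fields instead of --headerline; a sketch using hypothetical column names and the same placeholders as above:
mongoimport --host=127.0.0.1 -d database_name -c collection_name --type csv --file csv_location --fields Name,Address,City,State,ZIP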
You will most likely need to authenticate if you're working in a production environment. You can use something like this to authenticate against the correct database with the appropriate credentials:
mongoimport -d db_name -c collection_name --type csv --file filename.csv --headerline --host hostname:portnumber --authenticationDatabase admin --username 'iamauser' --password 'pwd123'
I use this for mongoimport from the shell:
mongoimport --db db_name --collection collection_name --type csv --file C:\Your_file_path\target_file.csv --headerline
--type can be csv/tsv/json, but only csv/tsv can use --headerline.
You can read more in the official doc.
Check that you have a blank line at the end of the file; otherwise the last line will be ignored by some versions of mongoimport.
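A small shell sketch (an assumption on my part, not from the answer) that appends a final newline only when the file lacks one:
# if the last byte is not a newline, tail prints a non-empty character
[ -n "$(tail -c 1 locations.csv)" ] && echo >> locations.csv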
When I was trying to import the CSV file, I was getting an error. What I did: first I changed the header line's column names to capital letters, removed "-", and added "_" where needed. Then I typed the command below for importing the CSV into mongo:
$ mongoimport --db=database_name --collection=collection_name --type=csv --file=file_name.csv --headerline
Robert Stewart has already answered how to import with mongoimport.
I am suggesting an easy way to import CSV elegantly with the 3T MongoChef tool (3.2+ version). It might help someone in the future.
You just need to select the collection and then select the file to import.
You can also deselect data that you don't want to import, and there are many other options.
The collection is then imported.
First you should come out of the mongo shell and then execute the mongoimport command like this:
Manojs-MacBook-Air:bin Aditya$ mongoimport -d marketdata -c minibars --type csv --headerline --file '/Users/Aditya/Downloads/mstf.csv'
2017-05-13T20:00:41.989+0800 connected to: localhost
2017-05-13T20:00:44.123+0800 imported 97609 documents
Manojs-MacBook-Air:bin Aditya$
Robert Stewart's answer is great.
I'd like to add that you can also type your fields with --columnsHaveTypes and --fields like this:
mongoimport -d myDb -c myCollection --type csv --file myCsv.csv --columnsHaveTypes --fields "label.string(),code.string(),aBoolean.boolean()"
(Be careful not to have any space after the comma between your fields.)
For other types, see doc here : https://docs.mongodb.com/manual/reference/program/mongoimport/#cmdoption-mongoimport-columnshavetypes
For the 3.4 version, please use the following syntax:
mongoimport -u "username" -p "password" -d "test" -c "collections" --type csv --file myCsv.csv --headerline
After 3 days, I finally made it on my own. Thanks to all the users who supported me.
My requirement was to import a .csv (with no header line) into a remote MongoDB instance. For mongoimport v3.0.7 the command below worked for me:
mongoimport -h <host>:<port> -u <db-user> -p <db-password> -d <database-name> -c <collection-name> --file <csv file location> --fields <name of the columns(comma separated) in csv> --type csv
For example:
mongoimport -h 1234.mlab.com:61486 -u arpitaggarwal -p password -d my-database -c employees --file employees.csv --fields name,email --type csv
where name and email are the columns in the .csv file.
Given a .csv file which has only one column with no header, the command below worked for me:
mongoimport -h <mongodb-host>:<mongodb-port> -u <username> -p <password> -d <mongodb-database-name> -c <collection-name> --file file.csv --fields <field-name> --type csv
where field-name refers to the header name of the column in the .csv file.
C:\wamp\mongodb\bin>mongoexport --db proj_mmm --collection offerings --csv --fieldFile offerings_fields.txt --out offerings.csv
Just use this after executing mongoimport; it will return the number of objects imported:
use db
db.collectionname.find().count()
Use:
mongoimport -d 'database_name' -c 'collection_name' --type csv --headerline --file filepath/file_name.csv
mongoimport -d test -c test --type csv --file SampleCSVFile_119kb.csv --headerline
Check the collection data:
var collections = db.getCollectionNames();
for (var i = 0; i < collections.length; i++) {
    // print the name of each collection
    print('Collection: ' + collections[i]);
    // and then print the json of each of its elements
    db.getCollection(collections[i]).find().forEach(printjson);
}
1] We can save the xls as a .csv file.
2] Go to the MongoDB bin path on cmd -> cd D:\Arkay\soft\MongoDB\bin
3] Run the command below:
> mongoimport.exe -d dbname -c collectionname --type csv --file "D:\Arkay\test.csv" --headerline
4] Verify on the Mongo side using the command below:
> db.collectionname.find().pretty().limit(1)
Strangely, no one mentioned the --uri flag:
mongoimport --uri connectionString -c questions --type csv --file questions.csv --headerline
Sharing for future readers:
In our case, we needed to add the host parameter to make it work:
mongoimport -h mongodb://someMongoDBhostUrl:somePORTrunningMongoDB/someDB -d someDB -c someCollection -u someUserName -p somePassword --file someCSVFile.csv --type csv --headerline
Make sure to copy the .csv file to /usr/local/bin or whatever folder your mongodb is in.
All the answers above are great, and the way to go for a full-featured application.
But if you want to prototype fast, want flexibility while the collection is still changing, and want to minimize your early code base, there is a much simpler way that is not much discussed.
You can basically forego mongoimport by now. I could have saved 3 hours if it had been mentioned here on this question, so let me share it for others:
MongoDB has a GUI called MongoDB Compass that has both CSV and JSON import features out of the box, in a matter of clicks. It is an official part of the Mongo ecosystem. At the time of writing it is free, and it works very well for my use case.
https://www.mongodb.com/products/compass
You simply get MongoDB Compass running on your machine by following the simple installation; DB connection and authentication are a couple of fields entered directly in the GUI.
Import the CSV/JSON file. It took less than a second for a 30KB file to be parsed before the user (me) validates it.
Validate the "type" of each property. Great feature: I could directly set property types such as booleans, integers, etc. In my experience, they all seem to default to string, and you can update them before importing. Dates were more finicky and needed special attention on the coding side.
One click further and the CSV is a collection in your Mongo DB, locally or in the cloud. Voila!
If you have multiple files and you want to import all of them using Python, you can do the following:
import os
import subprocess

# directory of files
dir_files = r'C:\data'
# create a list of all files in that directory
_, _, fns = next(os.walk(dir_files))
files = [os.path.join(dir_files, fn) for fn in fns]
# path to the mongoimport tool
mongotool = r'C:\Program Files\MongoDB\Server\4.4\bin\mongoimport.exe'
# name of mongodb database
mydatabase = 'mydatabase'
# name of mongodb collection
mycollection = 'mycollection'
# import all files to mongodb, one after another
for fl in files:
    commands = [mongotool, '--db', mydatabase,
                '--collection', mycollection,
                '--file', fl,
                '--type', 'tsv',  # use 'csv' for comma-separated files
                '--headerline']
    subprocess.run(commands, check=True)