Can I use mongoexport to export a computed column? I want to double the "Score" while exporting. I can use Excel to do this on my CSV, but I wanted to know if mongoexport natively supports this. I tried the following, but it didn't work; it returned Score itself:
mongoexport -d MyDB -c MyCollection -f _id, FirstName , Score*2 --csv --out f:\NewScores.csv
I found this similar question, but it's about find(), where I can achieve this using $project.
I really doubt that mongoexport can perform any calculation or manipulation of the data while exporting it. It just dumps/exports the data from the database to a file, straight and simple.
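As a workaround (a sketch only, since mongoexport has no computed-column support): materialize the doubled score into a helper collection with an aggregation and export that. The collection name MyCollectionScored is chosen here for illustration, and the --eval quoting is written for a Unix-like shell:
mongo MyDB --quiet --eval 'db.MyCollection.aggregate([{ $project: { FirstName: 1, Score: { $multiply: ["$Score", 2] } } }, { $out: "MyCollectionScored" }])'
mongoexport -d MyDB -c MyCollectionScored -f _id,FirstName,Score --csv --out f:\NewScores.csv
You can drop the helper collection afterwards with db.MyCollectionScored.drop().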
Let's say I want to insert an object that contains date objects using mongoimport from the commandline.
echo "{\"int_key\": 1, \"date_key\": new Date(\"2022-12-27\")}" | mongoimport --host "192.168.60.10" --db example_db --collection example_collection
will not work, because the object I am trying to insert is not valid JSON. The reason I want to use mongoimport is that there is a large array of objects that I want to persist in one go. If I try to use the mongo command, the argument for --eval is too long. For example,
mongo --host "192.168.60.10" --eval "db=db.getSiblingDB(\"example_db\");db.getCollection(\"example_collection\").insert([{\"int_key\": 1, \"date_key\": new Date(\"2022-12-27\")}])"
but the array inside insert() has a very large number of objects. Can you suggest a workaround? I was thinking I could use mongoimport to read all the objects, collected into an array, from stdin or a file. The option for importing a JSON array, however, would not accept the kind of array of objects I pass to insert() in mongo --eval.
You have to use this:
echo "{\"int_key\": 1, \"date_key\": {\"$date\": \"2022-12-27\"}}"
Depending on your shell and the precision you need, it may require escaping the $ and a full timestamp:
echo "{\"int_key\": 1, \"date_key\": {\"\$date\": \"2022-12-27T00:00:00Z\"}}"
For other data types see MongoDB Extended JSON (v2).
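With the dates written in Extended JSON, the large array itself can then go through mongoimport in one call (a minimal sketch, assuming the array has been saved to a file named data.json, a name chosen here for illustration):
mongoimport --host "192.168.60.10" --db example_db --collection example_collection --file data.json --jsonArray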
I use mongoimport in the same way to insert around 6 billion documents per day; it is very fast and reliable.
Depending on how you use it, the issue described in mongoimport does not import small amount of documents could be relevant for you.
I have a few fields in my collection in MongoDB.
I have tried exporting everything, which looks like this:
{"_id":{"$oid":"5a5ef05dbe83813f55141a51"},"comments_data":{"id":"211","comments":{"paging":{"cursors":{"after":"WzZANVFV4TlRVME5qUXpPUT09","before":"WTI5dEF4TlRVNE1USTVNemczTXpZAMk56YzZANVFV4TlRBMU9ERTFNQT09"}},"data":[{"created_time":"2018-01-04T09:29:09+0000","message":"Super","from":{"name":"M Mun","id":"1112"},"id":"1111"},{"created_time":"2018-01-07T22:25:08+0000","message":"Happy bday..Godbless you...","from":{"name":"L1","id":"111"},"id":"1111"},{"created_time":"2018-01-10T00:22:00+0000","message":"Nelson ","from":{"name":"Boon C","id":"1111"},"id":"10111"},{"created_time":"2018-01-10T01:07:19+0000","message":"Thank to SingTel I like to","from":{"name":"Sarkar WI","id":"411653482605703"},"id":"10155812413346677_10155825869201677"}]}},"post_id":"28011986676_10155812413346677","post_message":"\"Usher in the New Year with deals and rewards that will surely perk you up, exclusively for Singtel customers. Find out more at singtel.com/rewards\"",
but now I want to export just a single field, the 'message' inside 'comments_data', from the collection.
I tried using this:
mongoexport --db sDB --collection sTest --fields data.comments_data --out test88.json
but when I check my exported file, it just contains something like this:
{"_id":{"$oid":"5a5ef05dbe83813f55141a51"}}
which is not what I expected.
I just want something like "message":"Happy bday..Godbless you..."
But when I query in the mongo shell with db.sTest.find({}, {comments_data:1, _id:0}) I can roughly get what I want.
If this ...
db.sTest.find({}, {'comments_data.message':1, _id:0})
... selects the data you are interested in then the equivalent mongoexport command is:
mongoexport --db sDB --collection sTest --fields 'comments_data.message' --type csv --out test88.csv
Note: this uses --type csv because, according to the docs, use of the JSON output format causes mongoexport to export all fields in the selected sub-document ...
For csv output formats, mongoexport includes only the specified field(s), and the specified field(s) can be a field within a sub-document.
For JSON output formats, mongoexport includes only the specified field(s) and the _id field, and if the specified field(s) is a field within a sub-document, the mongoexport includes the sub-document with all its fields, not just the specified field within the document.
If you must have JSON format and limit your output to a single field then I think you'll need to write the reduced documents to a separate collection and export that collection, as per this answer.
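A sketch of that two-step approach (assuming an aggregation with $out is available; the collection name sTestMessages is just an illustrative choice, and the --eval quoting is written for a Unix-like shell):
mongo sDB --quiet --eval 'db.sTest.aggregate([{ $project: { _id: 0, "comments_data.message": 1 } }, { $out: "sTestMessages" }])'
mongoexport --db sDB --collection sTestMessages --out test88.json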
When I export my data from MongoDB I obtain the file below. Everything in MongoDB is a string except for the date, which is an ISODate:
123#123.com,sha1:64000:18:BTJnM903gIt5FNlSsZIRx1tLC9ErPJuB:9YVs800sgRPr1aaLj73qqnJ6,123,123,123#123.com,2017-04-28T09:20:07.480Z,cus_AYcVXIUf68nT52
If I import this file back into MongoDB, it imports each value as a string. I need the date parsed as a Date; the rest can stay as strings.
I've seen that there's a mongoimport argument, --columnsHaveTypes. I've tried it without any result:
mongoimport -u test -p test --authenticationDatabase test -h localhost:30158 --db test --collection users --type csv --file users.csv --upsert --upsertFields username --fields username.string\(\),password.string\(\),cname.string\(\),sname.string\(\),mail.string\(\),creation.date\(\),validation.auto\(\),clients.string\(\),customer.string\(\) --columnsHaveTypes
I get this error:
Failed: type coercion failure in document #0 for column 'creation', could not parse token '2017-04-28T09:20:07.480Z' to type date
What can I do?
Kind regards.
Summary: you need to provide the format in which the date will be presented.
From the mongoimport documentation on the columnsHaveTypes parameter, you can't just say creation.date\(\) - you need to provide an argument, which is a template for the way the date is represented in the CSV. Apparently the way you do this, in accordance with the Go language time.Parse function, is by rendering a particular reference date in the format of your choice.
I think the reference date should be formatted like this:
2006-01-02T15:04:05.000Z
So you need to change your mongoimport call to specify the date with the format template, like this:
creation.date\(2006-01-02T15:04:05.000Z\)
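Putting that into the original call gives something like this (a sketch based on the command in the question, with the field list quoted so the parentheses no longer need backslash escapes):
mongoimport -u test -p test --authenticationDatabase test -h localhost:30158 --db test --collection users --type csv --file users.csv --upsert --upsertFields username --columnsHaveTypes --fields 'username.string(),password.string(),cname.string(),sname.string(),mail.string(),creation.date(2006-01-02T15:04:05.000Z),validation.auto(),clients.string(),customer.string()'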
I am trying to export a collection to csv which has the following fields:
_id
number
name
price
pollingTime
I can see the polling time data when I open the collection in RoboMongo or try to access the collection through mongoshell, but when I export that into a CSV, the pollingTime field comes out blank.
Here's my mongoexport command:
mongoexport --db=itemDB --collection=itemprice1 --type=csv --fieldFile=fields.txt --out items.csv
I need to send this data to some non-tech business folks; any idea if I need to make any changes in fields.txt? It looks like this:
_id
number
name
totalPrice
pollingTimme
Apologies - I discovered immediately afterwards, while reviewing my own question, that I had misspelled the pollingTime field. I still want to keep the question and answer on StackOverflow so that others running into this problem will remember to check their spelling :-)
I have over 100 fields and I am looking for a way to export the entire collection in CSV format.
The command line asks me to provide all fields via:
-f [ --fields ] arg comma seperated list of field names e.g. -f
name,age
Is there a way to get the entire collection, like using a dump, but not in BSON format? I need CSV data.
Thank you
In bash you can create this "export-all-collections-to-csv.sh" and pass the database name as the only argument (feel free to reduce this to a single collection):
#!/bin/bash
# Export every collection in the given database to CSV, one file per collection.
OIFS=$IFS
IFS=","
dbname=$1  # put "database name" here if you don't want to pass it as an argument

# Comma-separated list of collection names, as printed by the legacy mongo shell.
collections=$(mongo $dbname --eval "rs.slaveOk();db.getCollectionNames();" --quiet)
collectionArray=($collections)

for ((i=0; i<${#collectionArray[@]}; ++i));
do
    # Use the keys of one sample document as the CSV column list.
    keys=$(mongo $dbname --eval "rs.slaveOk();var keys = []; for(var key in db.${collectionArray[$i]}.findOne()) { keys.push(key); }; keys;" --quiet)
    mongoexport --db $dbname --collection ${collectionArray[$i]} --fields "$keys" --csv --out $dbname.${collectionArray[$i]}.csv
done

IFS=$OIFS
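To run it (assuming it was saved as export-all-collections-to-csv.sh and the database is called mydb, which is a placeholder):
chmod +x export-all-collections-to-csv.sh
./export-all-collections-to-csv.sh mydb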
You could create a file with the field names (may be easier for you):
--fieldFile arg file with fields names - 1 per line
In your case they might all be the same, but the reason you have to specify the field names is that they could differ from document to document, whereas the field names in the CSV must be fixed.
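If you don't want to type 100+ names by hand, you could generate the field file from one sample document (a sketch, assuming the legacy mongo shell; mydb and mycollection are placeholders):
mongo mydb --quiet --eval 'Object.keys(db.mycollection.findOne()).forEach(function(k){ print(k); })' > fields.txt
mongoexport --db mydb --collection mycollection --fieldFile fields.txt --csv --out mycollection.csv
Note this only picks up the keys present in the first document, so fields that appear only in later documents would still need to be added to fields.txt by hand.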