How to use mongodb functions with mongoimport?

Let's say I want to insert an object that contains date objects using mongoimport from the commandline.
echo "{\"int_key\": 1, \"date_key\": new Date(\"2022-12-27\")}" | mongoimport --host "192.168.60.10" --db example_db --collection example_collection
will not work because the object I am trying to insert is not valid JSON. The reason I want to use mongoimport is that I have a large array of objects that I want to persist in one go. If I try to use the mongo command, the argument to --eval becomes too long. For example,
mongo --host "192.168.60.10" --eval "db=db.getSiblingDB(\"example_db\");db.getCollection(\"example_collection\").insert([{\"int_key\": 1, \"date_key\": new Date(\"2022-12-27\")}])"
but the array inside insert() has a very large number of objects. Can you suggest a workaround? I was thinking I could use mongoimport to read all the objects, collected into an array, through stdin or a file. However, the option for importing a JSON array does not accept the kind of array of objects I would pass to insert() in mongo --eval.

You have to use this
echo "{\"int_key\": 1, \"date_key\": {\"$date\": \"2022-12-27\"}}"
In a shell with double quotes, it may require escaping the dollar sign:
echo "{\"int_key\": 1, \"date_key\": {\"\$date\": \"2022-12-27T00:00:00Z\"}}"
For other data types see MongoDB Extended JSON (v2)
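For the large-array case, a minimal sketch (the file name and values here are only illustrative): write one Extended JSON document per line and point mongoimport at the file, since mongoimport reads newline-delimited documents by default.
# docs.json: one document per line, no enclosing array needed
cat > docs.json <<'EOF'
{"int_key": 1, "date_key": {"$date": "2022-12-27T00:00:00Z"}}
{"int_key": 2, "date_key": {"$date": "2022-12-28T00:00:00Z"}}
EOF
mongoimport --host "192.168.60.10" --db example_db --collection example_collection --file docs.json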
I use mongoimport in the same way to insert around 6 billion documents per day; it is very fast and reliable.
Depending on how you use it, the question "mongoimport does not import small amount of documents" could also be relevant for you.

Related

Export populated data from MongoDB to CSV file

I am using MongoDB at mLab. I have multiple collections: one main and others supporting. The main collection contains IDs pointing to the supporting collections. I would like to export the actual data from the main collection to a CSV file, so I need to populate the data first and then export the result.
I see I can export collections individually, but then the data are not populated. I suppose I should use a bash script to do this, but I do not know how.
Could you point me in the right direction or suggest a way to do this?
Thank you!
Using the mongo shell is the better idea in your case. As per the official documentation, below are the steps to read data from a mongo collection in a bash shell script.
A simple example that gets the count of documents from a collection whose update timestamp is more than 10 days old:
DATE2=$(date -d '10 days ago' "+%Y-%m-%dT%H:%M:%S.%3NZ")
counter=$(mongo --quiet dbName --eval 'db.dbCollection.find({"updatedAt":{"$gt":new ISODate("'$DATE2'")}}).count()')
echo "$counter"
Or you can get the list of data and iterate over it to populate it as per your requirements.
For more on mongo shell query click here
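As an illustration of the populate-and-export idea mentioned above, a rough sketch; the collection and field names (orders, customers, customerId, total) are only placeholders for your own schema, and quoting/escaping of CSV values is ignored here.
# Join each main document to its referenced document in the shell, print CSV rows
mongo dbName --quiet --eval '
  print("orderId,customerName,total");
  db.orders.find().forEach(function(o) {
    var c = db.customers.findOne({ _id: o.customerId });  // "populate" the reference
    print([o._id.valueOf(), c ? c.name : "", o.total].join(","));
  });
' > populated.csv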

How to mongoexport with one field

I have a few fields in my collection in MongoDB.
I have tried exporting everything,
which looks like this:
{"_id":{"$oid":"5a5ef05dbe83813f55141a51"},"comments_data":{"id":"211","comments":{"paging":{"cursors":{"after":"WzZANVFV4TlRVME5qUXpPUT09","before":"WTI5dEF4TlRVNE1USTVNemczTXpZAMk56YzZANVFV4TlRBMU9ERTFNQT09"}},"data":[{"created_time":"2018-01-04T09:29:09+0000","message":"Super","from":{"name":"M Mun","id":"1112"},"id":"1111"},{"created_time":"2018-01-07T22:25:08+0000","message":"Happy bday..Godbless you...","from":{"name":"L1","id":"111"},"id":"1111"},{"created_time":"2018-01-10T00:22:00+0000","message":"Nelson ","from":{"name":"Boon C","id":"1111"},"id":"10111"},{"created_time":"2018-01-10T01:07:19+0000","message":"Thank to SingTel I like to","from":{"name":"Sarkar WI","id":"411653482605703"},"id":"10155812413346677_10155825869201677"}]}},"post_id":"28011986676_10155812413346677","post_message":"\"Usher in the New Year with deals and rewards that will surely perk you up, exclusively for Singtel customers. Find out more at singtel.com/rewards\"",
But now I want to export just a single field, the 'message' inside 'comments_data', from the collection.
I tried using this: mongoexport --db sDB --collection sTest --fields data.comments_data --out test88.json
But when I check my exported file, it just contains something like this:
{"_id":{"$oid":"5a5ef05dbe83813f55141a51"}}
which is not what I expected.
I just want something like "message":"Happy bday..Godbless you..."
But when I query in the mongo shell with db.sTest.find({}, {comments_data:1, _id:0}) I can roughly get what I want.
If this ...
db.sTest.find({}, {'comments_data.message':1, _id:0})
... selects the data you are interested in then the equivalent mongoexport command is:
mongoexport --db sDB --collection sTest --fields 'comments_data.message' --type csv --out test88.csv
Note: this uses --type csv because, according to the docs, use of the JSON output format causes MongoDB to export all fields in the selected sub document ...
For csv output formats, mongoexport includes only the specified field(s), and the specified field(s) can be a field within a sub-document.
For JSON output formats, mongoexport includes only the specified field(s) and the _id field, and if the specified field(s) is a field within a sub-document, the mongoexport includes the sub-document with all its fields, not just the specified field within the document.
If you must have JSON format and limit your output to a single field then I think you'll need to write the reduced documents to a separate collection and export that collection, as per this answer.
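If it helps, a rough sketch of that approach, using the same field path as above; the temporary collection name sTestReduced is made up here.
# Write just the projected field to a throw-away collection, then export it as JSON
mongo sDB --quiet --eval '
  db.sTest.aggregate([
    { $project: { _id: 0, message: "$comments_data.message" } },
    { $out: "sTestReduced" }
  ]);
'
mongoexport --db sDB --collection sTestReduced --out test88.json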

extract date from object id and export it to csv in mongodb

I am pretty new to MongoDB. I am trying to export data from a collection to a CSV file, and that works fine. I have a question: is there a way to export just the date from the ObjectId to a new column? I understand we can get the date from an ObjectId using ObjectId.getTimestamp(). Is there a way to do the same with mongoexport? Below is the command I use to export data:
mongoexport --db MyDB --collection CollectionName --type=csv --fieldFile fieldsList.txt --out Data.csv
You cannot do this with mongoexport, but if the case is generally simple enough then you can really just use the mongo shell.
For instance, to export all fields from a collection with a flat structure and append the timestamp as the last field, you can do:
mongo localhost/MyDB --quiet --eval 'db.CollectionName.find().forEach(d => print(Object.keys(d).concat(["#time"]).map(k => (k === "#time") ? d["_id"].getTimestamp().valueOf() : d[k].valueOf() ).join(", ")))' > Data.csv
Showing the script part pretty-printed:
db.CollectionName.find().forEach(d =>
  print(Object.keys(d).concat(["#time"]).map(k =>
    (k === "#time") ? d["_id"].getTimestamp().valueOf() : d[k].valueOf()
  ).join(", "))
)
Which essentially says that when iterating all documents for the given collection we
Grab a list of all document fields
Append the "special" field of #time to the end of the list
Loop those fields and return an array of values - where the #time gets the timestamp from the ObjectId in _id
Join the result array with commas and print all of it out
If you had a list of fields then you could simply replace the Object.keys(d) part with an array of field names, such as:
db.CollectionName.find().forEach(d =>
  print(["_id","field1","field2"].concat(["#time"]).map(k =>
    (k === "#time") ? d["_id"].getTimestamp().valueOf() : d[k].valueOf()
  ).join(", "))
)
But really, as long as you provide the database to connect to and the --quiet and --eval options with the script line, you can simply redirect the output to your destination file from any scripting environment you want.
This does not take every CSV consideration into account, but it is a "quick and dirty" solution for most basic cases in a pinch, or at the very least a starting point for expansion without writing a full program listing.
If you really want more than this, then there are drivers for your language of choice as well as a plethora of CSV writing libraries for every single one of those languages. And it's really not that much harder than the listing here, especially with a library that takes all "quoting" considerations into account.

Mongoexport to return computed column

Can I use mongoexport to export a computed column? I want to double the "Score" while exporting. I can use Excel to do this on my CSV, but I wanted to know if mongoexport natively supports this. I tried the following but it didn't work; it returned Score itself:
mongoexport -d MyDB -c MyCollection -f _id, FirstName , Score*2 --csv --out f:\NewScores.csv
I found this similar question. But it's about find() where I can achieve this using $project.
I really doubt that mongoexport can perform any calculation or manipulation of data while exporting it. It just dumps/exports the data from the DB to a file. Straight and simple.
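If you do want the computed value without Excel, a possible workaround (not a mongoexport feature) is to materialise it with $project first and export the result; the temporary collection name MyCollectionScored is invented here.
# Compute the doubled score into a temporary collection, then export that collection
mongo MyDB --quiet --eval '
  db.MyCollection.aggregate([
    { $project: { FirstName: 1, DoubledScore: { $multiply: ["$Score", 2] } } },
    { $out: "MyCollectionScored" }
  ]);
'
mongoexport -d MyDB -c MyCollectionScored -f _id,FirstName,DoubledScore --type=csv --out NewScores.csv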

mongodump by date / find() in dumped data

How can I dump all collections by date if my records don't have a timestamp field?
Fields: _id, name, email, carnumber... etc.
And how can I look up / find() data in the archived/dumped database?
I need to create a search mechanism for searching in the archive.
You can pass a query to mongodump that will make it dump only a portion of your data. If you can't make a query that finds a required portion of data, then you're out of luck.
Result of mongodump is a collection of bson files. They are not directly queryable. But you can load them into another database and query that. Or you can use mongoexport utility that creates JSON documents. JSON is a little bit easier to work with.
Although what Sergio says is broadly true, let me expand a bit:
First, you mention using _id - if that is an ObjectID (the default), then it contains a timestamp - the first 4 bytes are a Unix-style timestamp:
http://www.mongodb.org/display/DOCS/Object+IDs#ObjectIDs-BSONObjectIDSpecification
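Building on that, a hedged sketch of how the ObjectId timestamp can drive a date-based dump; the database/collection names and date are placeholders, and recent versions of mongodump expect the --query value as Extended JSON.
# Dump only documents created on/after 2022-01-01 by comparing _id against an
# ObjectId whose leading 4 bytes encode that date (remaining bytes zeroed)
ts=$(date -u -d '2022-01-01' +%s)              # Unix timestamp of the cut-off
oid=$(printf '%08x0000000000000000' "$ts")     # 24-hex-char ObjectId boundary
mongodump --db MyDB --collection MyCollection \
  --query "{\"_id\": {\"\$gte\": {\"\$oid\": \"$oid\"}}}" \
  --out ./archive-dump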
Next, the problem with using mongoexport is that JSON does not preserve all BSON types (http://bsonspec.org/#/specification) - BSON has more types than JSON does and so storing as JSON can be problematic unless you have rules to re-import
If you keep the data in BSON format, there is the bsondump tool to inspect things as-is in the files:
http://www.mongodb.org/display/DOCS/Import+Export+Tools#ImportExportTools-bsondump
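For example (the paths are illustrative): bsondump prints each BSON document as JSON on stdout, so ordinary text tools can act as a crude search over the archive.
bsondump dump/MyDB/MyCollection.bson | grep '"carnumber"'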
Or, if you had an "archive" MongoDB instance, you could just use mongodump/mongorestore, which works directly with the BSON files and does not have the JSON issues seen with mongoexport etc.:
http://www.mongodb.org/display/DOCS/Import+Export+Tools#ImportExportTools-mongodumpandmongorestore
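A minimal sketch of that approach, assuming a separate archive instance reachable as archive-host (the host, database, and query values are illustrative):
# Load the dumped BSON files into the archive instance, then query it normally
mongorestore --host archive-host dump/
mongo --host archive-host MyDB --eval 'db.MyCollection.find({carnumber: "ABC123"}).pretty()'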