While trying to export a collection from MongoDB Compass, it is not exporting all the data; it only exports fields that are present in all documents. For example, if document 1 has
{
"Name": "Alex",
"__v": 0
}
and if document 2 has
{
"Name": "Joe",
"ID": 7,
"__v": 0
}
and when trying to export the collection, it only exports the Name field. I'm trying to export all fields through MongoDB Compass. Is there any other way to export all the data through code or a script?
EDIT: The solution is to update to a newer version of Compass. While exporting data, if a field name is not present in the list, there is an option to add a field, through which we can add the fields that Compass missed.
MongoDB Compass has had known issues with exporting and importing data for a long time, and it seems they are not willing to improve it!
When you try to export data using Compass, it uses some sample documents to select the fields, and if you are unlucky, you will miss some fields.
SOLUTION:
Use the MongoDB Compass Aggregations tab to find all the fields that exist across all documents:
[
  {
    $project: {
      arrayofkeyvalue: { $objectToArray: '$$ROOT' }
    }
  },
  { $unwind: '$arrayofkeyvalue' },
  {
    $group: {
      _id: null,
      allkeys: { $addToSet: '$arrayofkeyvalue.k' }
    }
  }
]
Add the fields found in the first step under Export Full Collection (Select Fields).
Export it!
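The pipeline above can be sketched in plain JavaScript to show what it computes: each document is flattened into its key/value pairs and the distinct keys are collected (the sample documents here are illustrative):

```javascript
// Plain-JS sketch of the aggregation above: collect every distinct
// field name that appears in any document of the collection.
const docs = [
  { Name: "Alex", __v: 0 },
  { Name: "Joe", ID: 7, __v: 0 },
];

// $objectToArray turns a document into key/value pairs, $unwind flattens
// them, and $group with $addToSet keeps each key only once:
const allkeys = [...new Set(docs.flatMap((doc) => Object.keys(doc)))];

console.log(allkeys); // [ 'Name', '__v', 'ID' ]
```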
Related
Is it possible to add a field to every document in a collection in MongoDB Compass? Or is this something that has to be done in the shell?
There is no option in Compass to update all documents with a new field; Compass's Documents tab has an option to modify a document's fields or add a new field, but only one document at a time.
This has to be done from the mongo shell or your favorite programming language.
From the shell, to update all documents in a collection with a new field, use the db.collection.updateMany() method. For example: db.test.updateMany( { }, { $set: { new_field: "initial value" } } ).
Once the documents are updated, they can be viewed in Compass; just do a refresh in the Documents tab.
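What updateMany with an empty filter does can be sketched in plain JavaScript: the empty filter matches every document, and $set adds (or overwrites) the named field on each one (the collection and field names here are illustrative):

```javascript
// Sketch of db.test.updateMany({ }, { $set: { new_field: "initial value" } }):
const collection = [
  { _id: 1, Name: "Alex" },
  { _id: 2, Name: "Joe" },
];

// The empty filter { } matches all documents; $set assigns the field on each:
for (const doc of collection) {
  doc.new_field = "initial value";
}

console.log(collection.every((doc) => doc.new_field === "initial value")); // true
```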
I am trying to query a binary field in MongoDB. The data looks like this:
{"_id":"WE8fSixi8EuWnUiThhZdlw=="}
I've tried a lot of things, for example:
{ '_id': new Binary( 'WE8fSixi8EuWnUiThhZdlw==', Binary.SUBTYPE_DEFAULT) }
{ '_id': Binary( 'WE8fSixi8EuWnUiThhZdlw==', 0) }
etc
Nothing seems to be working; I have exhausted Google and the MongoDB documentation. Any help would be amazing.
UPDATE:
You should now be able to query UUID and BinData fields in MongoDB Compass v1.20+ (COMPASS-1083). For example: {"field": BinData(0, "valid_base64")}.
PREVIOUS:
I see that you're using MongoDB Compass to query the field. Unfortunately, the current version of MongoDB Compass (v1.16.x) does not support querying binary data.
You can utilise mongo shell to query the data instead. For example:
db.collection.find({'_id':BinData(0, "WE8fSixi8EuWnUiThhZdlw==")});
Please note that the field name _id is reserved for use as a primary key; its value must be unique in the collection and is immutable. Depending on the value of the binary that you're storing in _id, I would suggest storing the binary in another field and keeping _id as an ObjectId.
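As a side note, the string inside BinData is base64; a small Node.js sketch shows that this particular value decodes to 16 raw bytes, which is consistent with a UUID:

```javascript
// BinData stores raw bytes; the string in the query is their base64 encoding.
const bytes = Buffer.from("WE8fSixi8EuWnUiThhZdlw==", "base64");

console.log(bytes.length); // 16 — the size of a UUID
```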
I have a collection that contains around 5000 documents. In each document there is a field BrandID that I would like to change from a string to a MongoDB ObjectId. I have tried the following shell command, but it only updates the first document. I don't get any error at all.
db.getCollection('SGProductRepository').find({ BrandID: {$ne : ""}}).forEach(function(obj) {
obj.BrandID = new ObjectId(obj.BrandID);
db.getCollection('SGProductRepository').save(obj);
})
Any idea why it is not working? I am using Robomongo as an editor.
Thanks
I am implementing search functionality using Elasticsearch.
I receive a "username" set returned by Elasticsearch, after which I need to query a collection in MongoDB for the latest comment of each user in the "username" set.
Question: Let's say I receive ~100 usernames every time I query Elasticsearch. What would be the fastest way to query MongoDB to get the latest comment of each user? Is querying MongoDB 100 times in a for loop using .findOne() the only option?
(Note: because the latest comment of a user changes very often, I don't want to store it in Elasticsearch, as that would trigger a retrieve-change-reindex process for the entire document far too frequently.)
This answer assumes the following schema for your comments collection in MongoDB:
{
"_id" : ObjectId("5788b71180036a1613ac0e34"),
"username": "abc",
"comment": "Best"
}
Assuming usernames is the list of users you get from Elasticsearch, you can perform the following aggregation:
a = [
  { $match: { username: { $in: usernames } } },
  { $sort: { _id: -1 } },
  {
    $group: {
      _id: "$username",
      latestcomment: { $first: "$comment" }
    }
  }
]
db.comments.aggregate(a)
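The pipeline can be sketched in plain JavaScript: filter to the requested usernames, sort newest first, then keep the first comment seen per user (the sample data here is illustrative; the numeric _id stands in for the ObjectId's time ordering):

```javascript
// Plain-JS sketch of the $match / $sort / $group-$first pipeline above.
const comments = [
  { _id: 1, username: "abc", comment: "old comment" },
  { _id: 2, username: "xyz", comment: "hi" },
  { _id: 3, username: "abc", comment: "Best" },
];
const usernames = ["abc", "xyz"];

const latestcomment = {};
const sorted = comments
  .filter((c) => usernames.includes(c.username)) // $match with $in
  .sort((a, b) => b._id - a._id);                // $sort { _id: -1 }
for (const c of sorted) {
  // $first: keep only the first (newest) comment seen per username
  if (!(c.username in latestcomment)) latestcomment[c.username] = c.comment;
}

console.log(latestcomment); // { abc: 'Best', xyz: 'hi' }
```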
You can try this:
db.foo.find().sort({_id:1}).limit(100);
The 1 will sort ascending (old to new) and -1 will sort descending (new to old.)
I want to get the maximum Id in MongoDB using PDI (Spoon).
I have these fields in my collection:
Id String
Genre String
Before I insert a new record, I need to get the maximum Id.
Can you help me with how to get the maximum Id?
You can use the MongoDB Input step.
Use the following query:
{
  $group: {
    _id: '',
    theMaxId: {
      $max: "$Id"
    }
  }
}
Note: This is an aggregation framework query and you must check the "Query is aggregation pipeline" box at the bottom of the query box on the query tab.
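What the $group/$max stage computes can be sketched in plain JavaScript (the field name and sample Ids here are illustrative; note that $max on strings compares lexicographically):

```javascript
// Sketch of grouping the whole collection and taking the maximum Id.
const docs = [
  { Id: "A003", Genre: "Rock" },
  { Id: "A010", Genre: "Jazz" },
  { Id: "A007", Genre: "Pop" },
];

// $group with _id: '' collapses everything into a single group; $max keeps
// the largest value seen (lexicographic comparison for strings):
const theMaxId = docs.reduce((max, d) => (d.Id > max ? d.Id : max), docs[0].Id);

console.log(theMaxId); // "A010"
```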
As a side note: if you are using MongoDB's default ObjectId values for _id, they are generated automatically for each new record (and are roughly time-ordered), so you don't need to find the max and increment it yourself.
Hope that helps,
Mark