I have two fields with separate arrays that have comparable data in them.
The first has a Name, and an ID. The second has a nickname.
I want to make sure that the counts of the two arrays are the same. If they are not, I want to know the Mongo _id of that document.
How would I do this?
With MapReduce it would be possible. If your document looks like:
document: { array1: [a, b], array2: [c] }
You could write map and reduce functions like:
map = function() {
    if (this.array1.length != this.array2.length)
        emit(this._id, 1);
}
reduce = function(key, values) { return key; }
For instance, to get the results inline:
db.foo.mapReduce(map,reduce,{out:{inline:1}}).results
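On newer servers (3.6+) an aggregation can return the mismatched _ids directly, without map-reduce. A minimal sketch, assuming the same array1/array2 fields and that both are always arrays:
db.foo.aggregate([
  { $match: { $expr: { $ne: [ { $size: "$array1" }, { $size: "$array2" } ] } } },
  { $project: { _id: 1 } }
])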
I'm not sure this is possible, but I'd like to create a single view, or at least a single query, that looks in different collections based on what's being queried.
For example, if the first character is an "A", look in the "Aresults" collection; if it's a "B", look in the "Bresults" collection, etc.
I could potentially create an "A-Z" collection with just those letters and do a $lookup from there based on a condition, but I'm not sure how to do that either.
I am aware that I could create a view with a $unionWith over all the "*results" collections, but that seems very inefficient.
Any other ideas? Is there some type of dynamic query structure within MongoDB, like in MySQL? (I couldn't find any.)
Thanks
Something like this?
const prefix = db.meta_data.findOne({ field: condition }).prefix;
db.createView('view_name', prefix + 'results', [<your aggregation pipeline>]);
or this?
const pipeline = [];
db.meta_data.find({ field: condition }).forEach(x => {
    pipeline.push({ $unionWith: { coll: x.prefix + 'results' } });
});
db.collection.aggregate(pipeline);
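For illustration, if the meta_data query matched documents with prefixes "A" and "B", the loop would end up building a pipeline like this (the base collection name is just a placeholder):
const pipeline = [
    { $unionWith: { coll: "Aresults" } },
    { $unionWith: { coll: "Bresults" } }
];
db.some_base_collection.aggregate(pipeline);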
In MongoDB, records are stored like this:
{_id:100,type:"section",ancestry:nil,.....}
{_id:300,type:"section",ancestry:100,.....}
{_id:400,type:"problem",ancestry:100,.....}
{_id:500,type:"section",ancestry:100,.....}
{_id:600,type:"problem",ancestry:500,.....}
{_id:700,type:"section",ancestry:500,.....}
{_id:800,type:"problem",ancestry:100,.....}
I want to fetch the records in this order:
first, the record whose ancestry is nil
then all records whose parent is that first record and whose type is 'problem'
then all records whose parent is that first record and whose type is 'section'
The expected output is:
{_id:100,type:"section",ancestry:nil,.....}
{_id:400,type:"problem",ancestry:100,.....}
{_id:800,type:"problem",ancestry:100,.....}
{_id:300,type:"section",ancestry:100,.....}
{_id:500,type:"section",ancestry:100,.....}
{_id:600,type:"problem",ancestry:500,.....}
{_id:700,type:"section",ancestry:500,.....}
Try this MongoDB shell command:
db.collection.find().sort({ancestry:1, type: 1})
In languages where ordered dictionaries aren't available, you can pass a list of 2-tuples as the sort argument. Something like this (Python):
collection.find({}).sort([('ancestry', pymongo.ASCENDING), ('type', pymongo.ASCENDING)])
@vinipsmaker's answer is good. However, it doesn't work properly if the _ids are random numbers or there are documents that aren't part of the tree structure. In that case, the following code works correctly:
function getSortedItems() {
    var sorted = [];
    var ids = [ null ];                  // start from the root (ancestry is null)
    while (ids.length > 0) {
        // fetch the children of the next parent, 'problem' before 'section'
        var cursor = db.Items.find({ ancestry: ids.shift() }).sort({ type: 1 });
        while (cursor.hasNext()) {
            var item = cursor.next();
            ids.push(item._id);          // queue this item as a future parent
            sorted.push(item);
        }
    }
    return sorted;
}
Note that this code is not fast, because db.Items.find() will be executed n times, where n is the number of documents in the tree structure.
If the tree structure is huge or you need to sort many times, you can optimize this by using the $in operator in the query and sorting the result on the client side, as sketched below.
In addition, creating an index on the ancestry field will make the code faster in either case.
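A rough (untested) sketch of that $in optimization, assuming the same Items collection: one query per level of the tree instead of one per document, with the sibling ordering done on the client.
function getSortedItemsWithIn() {
    var sorted = [];
    var parentIds = [ null ];
    while (parentIds.length > 0) {
        // one query fetches the whole next level of the tree
        var level = db.Items.find({ ancestry: { $in: parentIds } }).toArray();
        level.sort(function (a, b) {
            // keep children grouped under their parent, in the parents' order...
            var pa = parentIds.indexOf(a.ancestry), pb = parentIds.indexOf(b.ancestry);
            if (pa !== pb) return pa - pb;
            // ...and within one parent, 'problem' before 'section'
            return a.type < b.type ? -1 : a.type > b.type ? 1 : 0;
        });
        sorted = sorted.concat(level);
        parentIds = level.map(function (doc) { return doc._id; });
    }
    return sorted;
}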
I have two documents:
{ p1:"a", p2:"b" }
{ p1:"a", p2:"b", p3:"c" }
What should I do with the query { p1:"a", p2:"b" } to find only the first document? I want to find only documents with exactly the fields I specified. If a document has more fields than the query, it should not appear in the search results.
I must admit I know of no normal querying method by which to solve this problem. There is only one way I know of, and that is to use MongoDB's object comparison. To do this you would change your structure to something along the lines of:
{
ps: [a,b]
}
or:
{
ps: {p1:a,p2:b}
}
And then you would query like:
db.col.find({ ps: [a,b] })
or:
db.col.find({ ps: {p1:a, p2:b} })
There is one immediate problem with this, though. It is key-order dependent, which means that if your a and b are actually the other way around in another document, it won't match. So you will need to make sure you keep a consistent key order when saving if you do this.
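For example, with the embedded-document form these two queries are not equivalent:
db.col.find({ ps: { p1: "a", p2: "b" } })   // matches { ps: { p1: "a", p2: "b" } }
db.col.find({ ps: { p2: "b", p1: "a" } })   // does not match that document - the key order differs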
Hope it helps,
It's not easy to do with that structure; you need some other indexable cue as to what to take and what to leave.
If you know which fields you DON'T want, you can use $exists (http://www.mongodb.org/display/DOCS/Advanced+Queries#AdvancedQueries-%24exists), but it's not very efficient.
db.col.find({p3:{$exists:false}})
You could also just do this;
db.col.find({ p1:"a", p2:"b", p3:null }
Could you just throw away the fields you don't want in your own code? Or could you restructure into nested groups to make it easier to filter?
{ basicData:{p1:"a", p2:"b"}, extraData:{p3:"c",p4:"d"} }
I am using mongoose with Node.js. I am using mapReduce to fetch data grouped by a field, so all it gives me as a collection is the key (the grouping field) from every row of the database.
I need to fetch all the fields from the database, grouped by one field and sorted by another field. For example: I have a database with details of places, the fare for travelling to those places, and a few other fields. I need to fetch the data grouped by place and sorted by the fare. MapReduce helps me get that, but I cannot get the other fields.
Is there a way to get all the fields using map-reduce, rather than just the two fields as in the above example?
I must admit I'm not sure I understand completely what you're asking.
But maybe one of the following thoughts helps you:
either) When you iterate over your mapReduce results, you could fetch the complete document from MongoDB for each result. That would give you access to all fields in each document, at the cost of some network traffic.
or) The value that you send into emit(key, value) can be an object. So you could construct a value object that contains all your desired fields. Just be sure to use exactly the same object structure for your reduce method's return value.
I try to illustrate with an (untested) example.
map = function() {
    emit(this.place,
        {
            'field1': this.field1,
            'field2': this.field2,
            'count' : 1
        });
}
reduce = function(key, values) {
    var result = {
        'field1': values[0].field1,
        'field2': values[0].field2,
        'count' : 0 };
    for (var v in values) {
        result.count += values[v].count;
    }
    return result;
}
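A hypothetical way to run this from the mongo shell (the collection name places is just a placeholder); the fare-based sort from the question could then be done on the inline results client-side:
var res = db.places.mapReduce(map, reduce, { out: { inline: 1 } }).results;
// each entry has the form { _id: <place>, value: { field1: ..., field2: ..., count: ... } }
res.sort(function (a, b) { return a.value.field1 - b.value.field1; }); // e.g. if field1 were the fare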
First, the background. I used to have a collection logs and used map/reduce to generate various reports. Most of these reports were based on data from within a single day, so I always had a condition d: SOME_DATE. When the logs collection grew extremely big, inserting became extremely slow (slower than the rate at which the app we were monitoring generated logs), even after dropping lots of indexes. So we decided to have each day's data in a separate collection - logs_YYYY-mm-dd - that way indexes are smaller, and we don't even need an index on the date. This is cool since most reports (and thus map/reduce jobs) are on daily data. However, we have a report where we need to cover multiple days.
And now the question. Is there a way to run a map/reduce (or more precisely, the map) over multiple collections as if it were only one?
A reduce function may be called once, with a key and all corresponding values (but only if there are multiple values for the key - it won't be called at all if there's only 1 value for the key).
It may also be called multiple times, each time with a key and only a subset of the corresponding values, and the previous reduce results for that key. This scenario is called a re-reduce. In order to support re-reduces, your reduce function should be idempotent.
There are two key features of an idempotent reduce function:
The return value of the reduce function should be in the same format as the values it takes in. So, if your reduce function accepts an array of strings, the function should return a string. If it accepts objects with several properties, it should return an object containing those same properties. This ensures that the function doesn't break when it is called with the result of a previous reduce.
Don't make assumptions based on the number of values it takes in. It isn't guaranteed that the values parameter contains all the values for the given key. So using values.length in calculations is very risky and should be avoided.
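For example, a simple count-per-key map/reduce that follows both rules might look like this (the field name someKey is just illustrative):
map = function () {
    emit(this.someKey, { count: 1 });   // the value has the same shape that reduce returns
};
reduce = function (key, values) {
    var result = { count: 0 };
    values.forEach(function (v) {
        result.count += v.count;        // no reliance on values.length
    });
    return result;                      // same format as the incoming values
};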
Update: The two steps below aren't required (or even possible, I haven't checked) on the more recent MongoDB releases. It can now handle these steps for you, if you specify an output collection in the map-reduce options:
{ out: { reduce: "tempResult" } }
If your reduce function is idempotent, you shouldn't have any problems map-reducing multiple collections. Just re-reduce the results of each collection:
Step 1
Run the map-reduce on each required collection and save the results in a single, temporary collection. You can store the results using a finalize function:
finalize = function (key, value) {
db.tempResult.save({ _id: key, value: value });
}
db.someCollection.mapReduce(map, reduce, { finalize: finalize })
db.anotherCollection.mapReduce(map, reduce, { finalize: finalize })
Step 2
Run another map-reduce on the temporary collection, using the same reduce function. The map function is a simple function that selects the keys and values from the temporary collection:
map = function () {
emit(this._id, this.value);
}
db.tempResult.mapReduce(map, reduce)
This second map-reduce is basically a re-reduce and should give you the results you need.
I used the map-reduce method. Here is an example.
var mapemployee = function () {
    // emit the same object shape that reduceF returns
    emit(this.jobid, { Name: this.Name, Designation: null });
};
var mapdesignation = function () {
    emit(this.jobid, { Name: null, Designation: this.Designation });
};
var reduceF = function (key, values) {
    var outs = { Name: null, Designation: null };
    values.forEach(function (v) {
        if (outs.Name == null) { outs.Name = v.Name; }
        if (outs.Designation == null) { outs.Designation = v.Designation; }
    });
    return outs;
};
result = db.employee.mapReduce(mapemployee, reduceF, { out: { reduce: 'output' } });
result = db.designation.mapReduce(mapdesignation, reduceF, { out: { reduce: 'output' } });
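After both runs, the merged documents can then be read from the output collection:
db.output.find()
// each document looks like { _id: <jobid>, value: { Name: ..., Designation: ... } }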
Reference: http://www.itgo.me/a/x3559868501286872152/mongodb-join-two-collections