insert to specific index for mongo array - mongodb

Mongo supports arrays of documents inside documents. For example, something like
{_id: 10, "coll": [1, 2, 3] }
Now, imagine I wanted to insert an arbitrary value at an arbitrary index
{_id: 10, "coll": [1, {name: 'new val'}, 2, 3] }
I know you can update values in place with $ and $set, but there's nothing for insertion. It kind of sucks to have to replace the entire array just to insert at a specific index.

Starting with version 2.6 you can finally do this, using the $position operator. For your particular example:
db.students.update(
  { _id: 10 },
  { $push: {
      coll: {
        $each: [ { name: 'new val' } ],
        $position: 1
      }
  }}
)

The following will do the trick:
var insertPosition = 1;
var newItem = { name: 'new val' };
db.myCollection.find({ _id: 10 }).forEach(function(item) {
  item.coll = item.coll.slice(0, insertPosition).concat(newItem, item.coll.slice(insertPosition));
  db.myCollection.save(item);
});
If the insertPosition is variable (i.e., you don't know exactly where you want to insert it, but you know you want to insert it after the item with name = "foo"), just add a for() loop before the item.coll = assignment to find the insertPosition, and add 1 to it, since you want to insert AFTER the item with name = "foo".
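For instance, a rough sketch of that lookup, placed inside the forEach() above before the item.coll = assignment (the name "foo" is just an illustrative value):
var insertPosition = 0;
for (var i = 0; i < item.coll.length; i++) {
  if (item.coll[i].name === 'foo') {
    insertPosition = i + 1; // insert AFTER the element with name = 'foo'
    break;
  }
}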

A handy answer (not the accepted one, but the highest rated) from this similar post:
Can you have mongo $push prepend instead of append?
It uses $set to insert 3 at the first position in an array called "array". Sample from the related answer by Sergey Nikitin:
db.test.update({"_id" : ObjectId("513ad0f8afdfe1e6736e49eb")},
{'$set': {'array.-1': 3}})

Regarding your comment:
Well.. with concurrent users this is going to be problematic with any database...
What I would do is the following:
Add a last-modified timestamp to the document. Load the document, let the user modify it, then use the timestamp as a filter when you update the document and also update the timestamp in the same step. If the update modifies 0 documents, you know it was changed in the meantime and you can ask the user to reload it.
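A rough sketch of that pattern in the shell (the lastModified field and the nModified check are illustrative assumptions, not part of your schema):
var doc = db.myCollection.findOne({ _id: 10 });
var loadedAt = doc.lastModified; // remember the timestamp as of loading

// ... let the user modify doc.coll here ...

var result = db.myCollection.update(
  { _id: 10, lastModified: loadedAt },                  // only matches if nobody changed it meanwhile
  { $set: { coll: doc.coll, lastModified: new Date() } }
);
if (result.nModified === 0) {
  // the document was modified in the meantime; ask the user to reload it
}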

Using the $position operator, this can be done starting from version 2.5.3. It must be used together with the $each operator. From the documentation:
db.collection.update(
  <query>,
  { $push: {
      <field>: {
        $each: [ <value1>, <value2>, ... ],
        $position: <num>
      }
  }}
)

Related

MongoDB: Remove certain number of elements from start of an array

Is there a way to remove a certain number of elements from the start of an array in MongoDB?
Suppose I don't know the elements' details (like id or uuid), but I know that I want to remove the first N elements from the start of the array. Is there a way to do this in MongoDB? I know I can fetch the whole document and process it in my own programming language environment, but I thought it would be nicer if MongoDB already implemented a way to achieve this atomically with its own query language.
There is a $pop operator to remove a single element from the array, from either the first or the last position, and there is a closed Jira request, SERVER-4798, about popping multiple elements; in its comments they suggest using an update with an aggregation pipeline instead.
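For reference, $pop on its own looks like this (the array field name arr is only an assumption, matching the examples below):
db.collection.updateOne({}, { $pop: { arr: -1 } })  // -1 removes the first element
db.collection.updateOne({}, { $pop: { arr: 1 } })   //  1 removes the last element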
So, starting from MongoDB 4.2, you can try an update with an aggregation pipeline: pass a negative number to $slice and it keeps that many elements counted from the end of the array, dropping the leading ones.
let n = 2;
db.collection.updateOne(
  {}, // your query
  [
    { $set: { arr: { $slice: ["$arr", -n] } } }
  ]
)
What @turivishal mentioned is true, however it only works if the array's size is always 4 (keeping the last n elements only removes the first n when the array has exactly 2n elements). For it to work for all array sizes we should take the size of the array into account in the aggregation, so:
let n = 2;
db.collection.update(
  {}, // your query
  [
    {
      $set: {
        arr: {
          // keep the last (size - n) elements, i.e. drop the first n
          $slice: [ "$arr", { $subtract: [ n, { $size: "$arr" } ] } ]
        }
      }
    }
  ]
)

Meteor collection get last document of each selection

Currently I use the following find query to get the latest document of a certain ID
Conditions.find({
caveId: caveId
},
{
sort: {diveDate:-1},
limit: 1,
fields: {caveId: 1, "visibility.visibility":1, diveDate: 1}
});
How can I do the same for multiple ids, with $in for example?
I tried it with the following query. The problem is that it limits the result to one document across all the found caveIds, but the limit should apply to each caveId separately.
Conditions.find({
caveId: {$in: caveIds}
},
{
sort: {diveDate:-1},
limit: 1,
fields: {caveId: 1, "visibility.visibility":1, diveDate: 1}
});
One solution I came up with is using the aggregate functionality.
var conditionIds = Conditions.aggregate([
  { "$match": { caveId: { "$in": caveIds } } },
  {
    $group: {
      _id: "$caveId",
      conditionId: { $last: "$_id" },
      diveDate: { $last: "$diveDate" }
    }
  }
]).map(function(child) { return child.conditionId; });
var conditions = Conditions.find({
_id: {$in: conditionIds}
},
{
fields: {caveId: 1, "visibility.visibility":1, diveDate: 1}
});
You don't want to use $in here as noted. You could solve this problem by looping through the caveIds and running the query on each caveId individually.
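A minimal sketch of that loop (caveIds being the array of ids from your question):
var lastConditions = caveIds.map(function (caveId) {
  // one query per caveId, each returning only that cave's latest condition
  return Conditions.findOne(
    { caveId: caveId },
    {
      sort: { diveDate: -1 },
      fields: { caveId: 1, "visibility.visibility": 1, diveDate: 1 }
    }
  );
});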
You're basically looking at a join query here: you need all caveIds and then look up the last dive for each.
In my opinion this is a question of database schema/denormalization (but it is only an opinion!):
You could, as mentioned here, look up all caveIds and then run a single query for each, every time you need to look up the last dives.
However, I think you are much better off recording/updating the last dive inside your cave document, and then looking up all caveIds of interest while pulling only the lastDive field.
That will immediately give you what you need, rather than going through expensive search/sort queries. This comes at the expense of maintaining that field in the document, but it sounds like it should be fairly trivial, as you only need to update the one field when a new dive occurs.
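A sketch of that denormalization, assuming a Caves collection and a lastDive field (both names are illustrative, not part of your actual schema):
// whenever a new condition/dive document is inserted, also update the cave:
Caves.update(
  { _id: caveId },
  { $set: { lastDive: { conditionId: conditionId, diveDate: diveDate, visibility: visibility } } }
);

// later, fetch the last dive for all caves of interest in one query:
Caves.find({ _id: { $in: caveIds } }, { fields: { lastDive: 1 } });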

sorting documents in mongodb

Let's say I have four documents in my collection:
{u'a': {u'time': 3}}
{u'a': {u'time': 5}}
{u'b': {u'time': 4}}
{u'b': {u'time': 2}}
Is it possible to sort them by the field 'time' which is common in both 'a' and 'b' documents?
Thank you
No, you should put your data into a common format so you can sort it on a common field. It can still be nested if you want, but it needs to have the same path.
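For example (this restructuring is only a sketch, not your actual schema): if every document stores its timestamp under the same path, a plain sort works.
db.test.insert({ type: 'a', data: { time: 3 } })
db.test.insert({ type: 'b', data: { time: 4 } })

// 'data.time' is now a common path for every document:
db.test.find().sort({ "data.time": -1 })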
You can use aggregation, and the following code has been tested.
db.test.aggregate({
$project: {
time: {
"$cond": [{
"$gt": ["$a.time", null]
}, "$a.time", "$b.time"]
}
}
}, {
$sort: {
time: -1
}
});
Or, if you also want the original fields returned: gist
Alternatively, you can sort once you get the result back, using a customized compare function (not tested, for illustration purposes only):
db.eval(function() {
  return db.mycollection.find().toArray().sort(function(doc1, doc2) {
    var time1 = doc1.a ? doc1.a.time : doc1.b.time,
        time2 = doc2.a ? doc2.a.time : doc2.b.time;
    return time1 - time2;
  });
});
You can, using the aggregation framework.
The trick here is to $project a common field to all the documents so that the $sort stage can use the value in that field to sort the documents.
The $ifNull operator can be used to check whether a.time exists; if it does, the document will be sorted by that value, otherwise by b.time.
code:
db.t.aggregate([
  { $project: { "a": 1, "b": 1, "sortBy": { $ifNull: ["$a.time", "$b.time"] } } },
  { $sort: { "sortBy": -1 } },
  { $project: { "a": 1, "b": 1 } }
])
Consequences of this approach:
The aggregation pipeline won't be covered by any of the indexes you create.
The performance will be very poor for very large data sets.
What you could ideally do is to ask the source system that is sending you the data to standardize its format, something like:
{"a":1,"time":5}
{"b":1,"time":4}
That way your query can make use of the index if you create one on the time field.
db.t.ensureIndex({"time":-1});
code:
db.t.find({}).sort({"time":-1});

$unwind an object in aggregation framework

In the MongoDB aggregation framework, I was hoping to use the $unwind operator on an object (i.e. an embedded document). It doesn't look like this is possible; is there a workaround? Are there plans to implement this?
For example, take the article collection from the aggregation documentation. Suppose there is an additional field "ratings" that is a map from user -> rating. Could you calculate the average rating for each user?
Other than this, I'm quite pleased with the aggregation framework.
Update: here's a simplified version of my JSON collection per request. I'm storing genomic data. I can't really make genotypes an array, because the most common lookup is to get the genotype for a random person.
variants: [
{
name: 'variant1',
genotypes: {
person1: 2,
person2: 5,
person3: 7,
}
},
{
name: 'variant2',
genotypes: {
person1: 3,
person2: 3,
person3: 2,
}
}
]
It is not possible to do the type of computation you are describing with the aggregation framework - and it's not because there is no $unwind method for non-arrays. Even if the person:value objects were documents in an array, $unwind would not help.
The "group by" functionality (whether in MongoDB or in any relational database) is done on the value of a field or column. We group by value of field and sum/average/etc based on the value of another field.
A simple example is a variant of what you suggest: a ratings field added to the example article collection, not as a map from user to rating but as an array, like this:
{ title : "title of article", ...
ratings: [
{ voter: "user1", score: 5 },
{ voter: "user2", score: 8 },
{ voter: "user3", score: 7 }
]
}
Now you can aggregate this with:
[ {$unwind: "$ratings"},
{$group : {_id : "$ratings.voter", averageScore: {$avg:"$ratings.score"} } }
]
But this example, structured as you describe it, would look like this:
{ title : "title of article", ...
ratings: {
user1: 5,
user2: 8,
user3: 7
}
}
or even this:
{ title : "title of article", ...
ratings: [
{ user1: 5 },
{ user2: 8 },
{ user3: 7 }
]
}
Even if you could $unwind this, there is nothing to aggregate on here. Unless you know the complete list of all possible keys (users) you cannot do much with this. [*]
An analogous relational DB schema to what you have would be:
CREATE TABLE T (
  user1 INTEGER,
  user2 INTEGER,
  user3 INTEGER
  ...
);
That's not how it would be done; instead we would do this:
CREATE TABLE T (
  username VARCHAR(32),
  score INTEGER
);
and now we aggregate using SQL:
select username, avg(score) from T group by username;
There is an enhancement request for MongoDB that may allow you to do this in the aggregation framework in the future: the ability to project values to keys and vice versa. Meanwhile, there is always map/reduce.
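For completeness, a map/reduce sketch for the ratings-as-a-map shape from the question (the articles collection and ratings field are taken from the example above; treat this as an illustration, not a drop-in solution):
db.articles.mapReduce(
  function () {
    // emit one (user, score) pair per key of the ratings map
    for (var user in this.ratings) {
      emit(user, { sum: this.ratings[user], count: 1 });
    }
  },
  function (key, values) {
    // combine partial sums and counts for each user
    var out = { sum: 0, count: 0 };
    values.forEach(function (v) { out.sum += v.sum; out.count += v.count; });
    return out;
  },
  {
    out: { inline: 1 },
    finalize: function (key, value) { return value.sum / value.count; } // average per user
  }
);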
[*] There is a complicated way to do this if you know all the unique keys (you can find all unique keys with a method similar to this), but if you know all the keys you may as well just run a sequence of queries of the form db.articles.find({"ratings.user1":{$exists:true}},{_id:0,"ratings.user1":1}) for each userX, which will return all their ratings, and you can sum and average them simply enough, rather than doing the very complex projection the aggregation framework would require.
Since 3.4.4, you can transform an object into an array using $objectToArray.
See:
https://docs.mongodb.com/manual/reference/operator/aggregation/objectToArray/
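For example, a sketch for the ratings map from the question, averaging each user's score (the field names come from the example above):
db.articles.aggregate([
  { $project: { ratings: { $objectToArray: "$ratings" } } }, // {user1: 5} -> [{k: "user1", v: 5}]
  { $unwind: "$ratings" },
  { $group: { _id: "$ratings.k", averageScore: { $avg: "$ratings.v" } } }
])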
This is an old question, but I've run across a tidbit of information through trial and error that people may find useful.
It's actually possible to unwind on a dummy value by fooling the parser this way:
db.Opportunity.aggregate(
{ $project: {
Field1: 1, Field2: 1, Field3: 1,
DummyUnwindField: { $ifNull: [null, [1.0]] }
}
},
{ $unwind: "$DummyUnwindField" }
);
This will produce 1 row per document, regardless of whether or not the value exists. You may be able to tinker with this to generate the results you want. I had hoped to combine this with multiple $unwinds (sort of like emit() in map/reduce), but alas, the last $unwind wins, or they combine as an intersection rather than a union, which makes it impossible to achieve the results I was looking for. I am sadly disappointed with the aggregation framework's functionality, as it doesn't fit the one use case I was hoping to use it for (and which, strangely, a lot of the questions on StackOverflow in this area seem to be asking about): ordering results based on match rate. Improving the poor map/reduce performance would have made this entire feature unnecessary.
This is what I found & extended.
Let's create an experimental database in mongo:
db.copyDatabase('livedb' , 'experimentdb')
Now use experimentdb and convert the object into an array in your experimentcollection:
db.getCollection('experimentcollection').find({}).forEach(function(e){
  if(e.store){
    e.ratings = [e.ratings]; // wrap the object in an array, e.g. ratings
    db.experimentcollection.save(e);
  }
})
Some nerdy JS code to convert the JSON into flat objects:
var flatArray = [];
var data = db.experimentcollection.find().toArray();
for (var index = 0; index < data.length; index++) {
  var flatObject = {};
  for (var prop in data[index]) {
    var value = data[index][prop];
    if (Array.isArray(value) && prop === 'ratings') {
      for (var i = 0; i < value.length; i++) {
        for (var inProp in value[i]) {
          flatObject[inProp] = value[i][inProp];
        }
      }
    } else {
      flatObject[prop] = value;
    }
  }
  flatArray.push(flatObject);
}
printjson(flatArray);

How do I update/set/unset a key in embedded document/array in MongoDB?

I am trying to unset all values in a document that's embedded in an array. Let's say I have a collection coll with array things, containing a value myval. I want to unset myval. This looks like:
{ things: [{ myval: 1 }, { myval: 2 }] }
I've tried both
db.coll.update({}, {$unset: {'things.myval': 1}})
and
db.coll.things.update({}, {$unset: {'myval': 1}})
Neither of these work. I can't find any documentation online describing how to do this.
I came across this and was frustrated to see that the only way to do this was with a nested forEach. So after some searching I found another way! Since mongo version 3.6 there is an all-positional operator ($[]) (version 3.6 was introduced way after this question was asked, and even after it was last active, but in case anyone else comes across this).
To remove the field myval from every element of things in the first matching document, do this:
db.coll.update(
{things: {'$exists': true}},
{$unset: {'things.$[].myval': 1}}
)
or, if you want to remove it from all documents that match, use updateMany:
db.coll.updateMany(
{things: {'$exists': true}},
{$unset: {'things.$[].myval': 1}}
)
$unset will only edit the field if it exists, but it will not do safe navigation (i.e. it won't check that things exists first), so the $exists check is needed on the embedded document/array.
You can remove a value from an "array" using the $pull operator:
db.coll.update({}, {$pull: {'things': {'myval': 1}}});
Also have a look at the documentation of $pull:
http://www.mongodb.org/display/DOCS/Updating#Updating-%24pull
Your first approach should work according to the official documentation:
db.coll.update({}, {$unset: {'things.myval': 1}})
But that doesn't work for me either. So, the solution I found, applied to your example, was:
db.coll.find().forEach(function(o) {
  for(var i = 0; i < o.things.length; i++) {
    var unset = {};
    unset["things." + i + ".myval"] = "";
    db.coll.update( { "_id": o._id }, { "$unset": unset } );
  }
})
Or with a more functional style:
db.coll.find().forEach(function(o) {
  o.things.forEach(function(t, i) {
    var unset = {};
    unset["things." + i + ".myval"] = "";
    db.coll.update( { "_id": o._id }, { "$unset": unset } );
  })
})
It can't work because the empty {} selector matches more than one document, and update normally only updates a single document. In order to make it work, use the multi flag to tell mongo update that more than one document is coming:
multi - indicates if all documents matching criteria should be updated
rather than just one. Can be useful with the $ operators below.
db.coll.update({}, {$unset: {'things.myval': 1}},false,true)
or
db.coll.update({}, {$pull: {'things': {'myval': 1}}},false,true)
Try either one and see which one works, but you have to use the multi param in both.