I am trying to match the data in a subarray; for some reason it is grouped like this.
Data:
{
"_id": 1,
"addresDetails": [
[
{
"Name":"John",
"Place":"Berlin",
"Pincode":"10001"
},
{
"Name":"Sarah",
"Place":"Newyork",
"Pincode":"10002"
}
],
[
{
"Name":"Mark",
"Place":"Tokyo",
"Pincode":"10003"
},
{
"Name":"Michael",
"Place":"Newyork",
"Pincode":"10002"
}
]
]
}
I tried this $match query:
{
"$match":{
"attributes":{
"$elemMatch":{
"$in":["Mark"]
}
}
}
}
I am getting no data found. How do I match the elements in these subarrays?
Query
The aggregation way: in general, if you are stuck and the query or update operators don't seem to be enough, aggregation provides many more operators and is the alternative.
Two nested $filter stages over the two-level array find a Name in the array ["Mark"].
*Maybe there is a shorter, more declarative way with $elemMatch, and possibly a way to use an index. Also think about a schema change: maybe you don't really need an array with array members (the pipeline below doesn't use an index; see the sketch after it).
*I used addressDetails; remove one "s" (to match your field name addresDetails) or you will get empty results.
Playmongo
aggregate(
[{"$match":
{"$expr":
{"$ne":
[{"$filter":
{"input": "$addressDetails",
"as": "a",
"cond":
{"$ne":
[{"$filter":
{"input": "$$a",
"as": "d",
"cond": {"$in": ["$$d.Name", ["Mark"]]}}},
[]]}}},
[]]}}}])
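As a sketch of the schema-change idea mentioned above (assuming the outer grouping carries no meaning of its own), a single-level array would let a plain query do the job and could use an index on addressDetails.Name. The document shape and query below are hypothetical:
// hypothetical flattened document
{
"_id": 1,
"addressDetails": [
{"Name": "John", "Place": "Berlin", "Pincode": "10001"},
{"Name": "Mark", "Place": "Tokyo", "Pincode": "10003"}
]
}
// simple query, able to use an index on addressDetails.Name
db.collection.find({"addressDetails.Name": "Mark"})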
You can apparently nest $elemMatch as well, e.g.:
db.collection.find({
"addresDetails": {
$elemMatch: {
$elemMatch: {
"Name": "Mark"
}
}
}
})
This matches your document, as shown by this mongo playground link, but is probably not very efficient.
Alternatively you can use an aggregation. For example, $unwind may help to flatten out your nested arrays and allow for an easier $match afterwards.
db.collection.aggregate([
{
"$unwind": "$addresDetails"
},
{
"$match": {
"addresDetails.Name": "Mark"
}
}
])
You can find the mongo playground link for this here. But $unwind is usually not preferred as the first stage of the aggregation pipeline either, again for performance reasons.
Also please note that the results of these two options are different!
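To make the difference concrete for the sample document: the nested $elemMatch find() returns the document as stored, while the $unwind pipeline returns one document per matching inner array, with addresDetails replaced by that inner array. Roughly:
// find() with nested $elemMatch: the whole original document
{"_id": 1, "addresDetails": [[{"Name": "John", ...}, {"Name": "Sarah", ...}], [{"Name": "Mark", ...}, {"Name": "Michael", ...}]]}
// $unwind + $match: only the inner array that contains Mark
{"_id": 1, "addresDetails": [{"Name": "Mark", ...}, {"Name": "Michael", ...}]}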
Related
I have two collections as follows:
event
{
"_id" : ObjectId("61f272dd1fac703fec69105a"),
"eventActivity" : [
ObjectId("61f76703196ea94bd43fa92e"),
]
}
event-activity
{
"_id" : ObjectId("61f76703196ea94bd43fa92e"),
"activity" : ObjectId("61f2a69bfe99e07db083de50"),
}
Based on the collections above, event has an eventActivity field which refers to the event-activity collection. I'm trying to filter the events by the value of event-activity.activity.
So if, for example, my filter selection has activity ids in an array ['61d6b2060d6fe32d9853ad40', '61f2a69bfe99e07db083de50'], it will return the event. If the filter selection only has the activity id ['61d6b2060d6fe32d9853ad40'], it should not return any event, as there is no event with that activity id in event-activity.
I can't really understand how the aggregation $lookup works, but I tried this and it doesn't work.
event.aggregate([
{"$lookup":{
"from":"event-activity",
"localField":"activity",
"foreignField":"_id",
"as":"event-activity"
}},
{
"$match":{
"event-activity.activity":{
"$in":["61d6b2060d6fe32d9853ad40","61f2a69bfe99e07db083de50"]
}
}
}
])
I referred to the manual here.
Or can it be done with find() instead?
Query
You can use $lookup with a pipeline and put the $match inside it.
If the lookup result is empty you can remove or keep the document, depending on your needs, with something like this:
{"$match":{"$expr":{"$ne":["$activities", []]}}}
Test code here
event.aggregate(
[{"$lookup":
{"from":"event-activity",
"localField":"eventActivity",
"foreignField":"_id",
"pipeline":
[{"$match":
{"activity":
{"$in":
[ObjectId("61d6b2060d6fe32d9853ad40"),
ObjectId("61f2a69bfe99e07db083de50")]}}}],
"as":"activities"}}])
If I've understood correctly you can use this aggregation query:
This query uses a $lookup with a pipeline where the result is given by a $match with an $in. So the join will return the documents where event-activity.activity is in the array event.eventActivity.
db.event.aggregate([
{
"$lookup": {
"from": "event-activity",
"as": "activities",
"let": {
"ea": "$eventActivity"
},
"pipeline": [
{
"$match": {
"$expr": {
"$in": [
"$activity",
"$$ea"
]
}
}
}
]
}
}
])
Example here, where I've used integers as activity to make the join easier to see.
I am trying to unset certain fields based on the name of the field (the key).
Say I have something like this:
{
5: "cool",
93: "cool",
30: "cool",
56: "cool"
}
How would I unset all fields whose key is, say, 40 or above?
So the result should be:
{
5: "cool",
30: "cool"
}
I tried using the $lt operator on the whole field and using the positional operator, but both of those failed.
collection.update_one(
{"_id": id},
{"$unset": {"blacklist.$": {"$lt": 40}}}
)
I couldn't find anything online or in the docs, so I am hoping to find some help here.
Thanks!
You can't really use $unset like this, but we can still achieve it using pipelined updates, with slightly more complicated syntax.
Our approach will be to turn the root object into an array using $objectToArray, iterate over it and keep only the numeric keys below the threshold (together with any non-numeric keys), then finally convert back to an object and update our document, like so:
db.collection.update({},
[
{
$replaceRoot: {
newRoot: {
$arrayToObject: {
$filter: {
input: {
$objectToArray: "$$ROOT"
},
cond: {
$cond: [
{
$regexMatch: {
input: "$$this.k",
regex: "^[0-9]+$"
}
},
{
$lt: [
{
$toInt: "$$this.k"
},
40
]
},
true
]
}
}
}
}
}
}
])
Mongo Playground
Query
Turn the $$ROOT object into an array of key/value pairs,
filter the keys, keeping those that are either "_id" or < 40,
convert back to an object, and
replace the root.
*This removes all fields >= 40; you can change the last $lt if that's not what you need.
*Tom's answer works and is more general: if you have mixed keys where some are numbers and some are not, use Tom's way (this one is slightly simpler but assumes that all fields are numbers except "_id").
Test code here
aggregate(
[{"$replaceRoot":
{"newRoot":
{"$arrayToObject":
[{"$filter":
{"input":{"$objectToArray":"$$ROOT"},
"cond":
{"$or":
[{"$eq":["$$this.k", "_id"]},
{"$lt":[{"$toInt":"$$this.k"}, 40]}]}}}]}}}])
I have a document in the following form
{
"_id": "5c9a53b348a0ac000140b5f9",
"e": [
{
"_id": "d6c74cd5-5808-4b0c-b857-57ddbcc72ce5",
"isDeleted": true
},
{
"_id": "d6c74cd5-5808-4b0c-b857-57ddbcc72ce6",
"isDeleted": false
}
]
}
Every document has a list of elements on it, each of the elements which may or may not be deleted. By default, I don't want to return the deleted data. Right now, I filter them server-side but that still means a lot of data gets transmitted unnecessarily. Is it possible to exclude this data on the database?
I've looked at $elemMatch but that only returns a single value so it doesn't look like the right tool for the job.
Is there a way to project a document with an array of nested documents to only include those subdocuments that adhere to a certain condition?
You can use the $filter aggregation operator here:
db.collection.aggregate([
{ "$addFields": {
"e": {
"$filter": {
"input": "$e"
"cond": { "$eq": ["$$this.isDeleted", true] }
}
}
}}
])
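If you are on MongoDB 4.4 or newer, the same $filter expression can, as far as I know, also be used directly in a find() projection instead of an aggregation; a sketch:
db.collection.find({}, {
"e": {
"$filter": {
"input": "$e",
"cond": { "$eq": ["$$this.isDeleted", false] }
}
}
})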
I'm doing a rather complicated aggregation pipeline and have hit a rather strange phenomenon; I extracted a short example to visualize my problem here.
It seemed related to MongoDb $addFields and $match, but that doesn't contain any information that helps me fix the problem at hand.
Note: Please note that my problem is not with the specific example of using date fields or dealing with values; the problem is that I'm not able to $match using an expression, whether on a field that was added before with $addFields or not.
Given MongoDB: 3.6.3 (currently latest)
Let's insert some testdata:
db.testexample.insert({
"dateField": new ISODate("2016-05-18T16:00:00Z")
});
db.testexample.insert({
"dateField": new ISODate("2018-05-18T16:00:00Z")
});
Now let's make a simple pipeline that computes only the year of the date and $matches on that:
db.testexample.aggregate([
{
"$addFields": {
"dateFieldYear": {"$year": "$dateField"}
}
},
{
"$match": {
"dateFieldYear": {"$eq": "$dateFieldYear"}}
}
}
])
--> No matches
It should match, as it's the same field, right? Maybe with more trickery (using $add)?
db.testexample.aggregate([
{
"$addFields": {
"dateFieldYear": {"$year": "$dateField"}
}
},
{
"$match": {
"dateFieldYear": {"$eq": {"$add": ["$dateFieldYear", 0]}}
}
}
])
--> No matches
Still no dice. Next I thought that field references altogether were the problem, so let's use fixed values:
db.testexample.aggregate([
{
"$addFields": {
"dateFieldYear": {"$year": "$dateField"}
}
},
{
"$match": {
"dateFieldYear": {"$eq": {"$add": [2016, 0]}}
}
}
])
--> No matches
Wait, something is really wrong here. Let's try a static value:
db.testexample.aggregate([
{
"$addFields": {
"dateFieldYear": {"$year": "$dateField"}
}
},
{
"$match": {
"dateFieldYear": 2016
}
}
])
--> 1 record found!
So my conclusion seems to be that $match cannot take an expression on a field in an aggregation pipeline. But that can't be right, as the documentation states that $match follows the query syntax described here.
Can anybody help with how to $match using the simple example "dateFieldYear": {"$eq": "$dateFieldYear"}? Why doesn't this work as expected?
Thanks so much for any help!
You can use $expr (an operator added in MongoDB 3.6) to use aggregation expressions in a regular query.
Compare query operators vs. aggregation comparison operators: in query syntax, {"dateFieldYear": {"$eq": "$dateFieldYear"}} compares the field against the literal string "$dateFieldYear" (which is why you get no matches), while the aggregation comparison operators used inside $expr resolve "$dateFieldYear" as a field path.
In your case
db.testexample.find({$expr:{$eq:["$dateFieldYear", "$dateFieldYear"]}})
Regular Query:
db.testexample.find({$expr:{$eq:["$dateFieldYear", {"$year": "$dateField"}]}})
Aggregation Query:
db.testexample.aggregate([{$match:{$expr:{$eq:["$dateFieldYear", {"$year": "$dateField"}]}}}])
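Applied to the original pipeline, the $addFields stage can stay and the comparison simply moves inside $expr; for example, matching the computed year against a literal (a sketch of the earlier attempts rewritten):
db.testexample.aggregate([
{
"$addFields": {
"dateFieldYear": {"$year": "$dateField"}
}
},
{
"$match": {
"$expr": {"$eq": ["$dateFieldYear", 2016]}
}
}
])
--> 1 record found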
I want to get all matching values, using $elemMatch.
// create test data
db.foo.insert({values:[0,1,2,3,4,5,6,7,8,9]})
db.foo.find({},{
'values':{
'$elemMatch':{
'$gt':3
}
}
}) ;
My expected result is {values:[3,4,5,6,7,8,9]}, but the actual result is {values:[4]}.
I read the MongoDB documentation, and I understand this is the specified behavior.
How do I get all matching values?
Additionally, I am using skip and limit.
Any ideas?
Using Aggregation:
db.foo.aggregate([
{$unwind:"$values"},
{$match:{"values":{$gt:3}}},
{$group:{"_id":"$_id","values":{$push:"$values"}}}
])
You can add further filter conditions in the $match if you would like to.
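For instance (the extra $lt bound here is purely hypothetical, just to show how conditions combine):
db.foo.aggregate([
{$unwind:"$values"},
{$match:{"values":{$gt:3, $lt:8}}},
{$group:{"_id":"$_id","values":{$push:"$values"}}}
])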
You can't achieve this using the $elemMatch operator since, as the MongoDB docs say:
The $elemMatch projection operator limits the contents of an array
field that is included in the query results to contain only the array
element that matches the $elemMatch condition.
Note
The elements of the array are documents.
If you look carefully at the documentation on $elemMatch, or its query counterpart the positional $ operator, you will see that only the "first" matched element is returned by this type of "projection".
What you are looking for is actually "manipulation" of the document contents where you want to "filter" the content of the array in the document rather than return the original or "matched" element, as there can be only one match.
For true "filtering" you need the aggregation framework, as there is more support there for document manipulation:
db.foo.aggregate([
// No point selecting documents that do not match your condition
{ "$match": { "values": { "$gt": 3 } } },
// Unwind the array to de-normalize as documents
{ "$unwind": "$values },
// Match to "filter" the array
{ "$match": { "values": { "$gt": 3 } } },
// Group by to the array form
{ "$group": {
"_id": "$_id",
"values": { "$push": "$values" }
}}
])
Or with modern versions of MongoDB, from 2.6 onwards, where the array values are "unique", you could do this:
db.foo.aggregate([
{ "$project": {
"values": {
"$setDifference": [
{ "$map": {
"input": "$values",
"as": "el",
"in": {
"$cond": [
{ "$gt": [ "$$el", 3 ] },
"$$el",
false
]
}
}},
[false]
]
}
}}
])
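On MongoDB 3.2 and newer, the same array "filtering" can be expressed more directly with $filter, which also drops the uniqueness requirement of the $setDifference trick; a sketch:
db.foo.aggregate([
// Only select documents that contain at least one matching value
{ "$match": { "values": { "$gt": 3 } } },
// Keep only the matching array elements
{ "$project": {
"values": {
"$filter": {
"input": "$values",
"as": "el",
"cond": { "$gt": [ "$$el", 3 ] }
}
}
}}
])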