MongoDB $lookup if the local field exists

I have these entities:
// collectionA
{
  key: "value",
  ref: SOME-OBJECT-ID
}
// collectionB
{
  _id: SOME-OBJECT-ID,
  key1: "value1"
}
I want it so that if ref exists in the collectionA entity, the aggregation looks it up in collectionB and brings in its data.
If the ref key is missing, or it is present but the matching entity in collectionB is missing, I get an empty result from the whole aggregate query.
This is the aggregate query:
{ $match },
{
  $lookup: {
    from: "collectionB",
    let: {
      ref: "$ref"
    },
    pipeline: [
      {
        $match: {
          $expr: {
            $eq: ["$_id", "$$ref"]
          }
        }
      },
      {
        $project: {
          key1: 1
        }
      }
    ],
    as: "someData"
  }
}
How can I avoid this or add any conditional $lookup?

One way of doing that is to add another $match at the beginning, so that source documents without ref are skipped.
To skip documents that found no match in B, you can filter them out at the end.
{ $match: { ref: { $exists: true } } }
It will consider only documents where ref exists.
Playground
db.A.aggregate([
  {
    "$match": {
      ref: {
        $exists: true
      }
    }
  },
  {
    "$lookup": {
      "from": "B",
      "localField": "ref",
      "foreignField": "_id",
      "as": "output"
    }
  }
])
But you don't need to do this unless you have a specific use case, as it will not have much of an impact.
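For the second suggestion (dropping documents that found no match in B at the end), a minimal sketch could look like this, assuming the lookup result is stored in the output field as in the snippet above:
db.A.aggregate([
  { "$match": { ref: { $exists: true } } },
  {
    "$lookup": {
      "from": "B",
      "localField": "ref",
      "foreignField": "_id",
      "as": "output"
    }
  },
  // keep only documents whose lookup matched at least one document in B
  { "$match": { "output.0": { $exists: true } } }
])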

I have found it. The document was not selected because I used $unwind, which won't return the document when it runs on a missing field or an empty array. So this is the fix:
{
  $unwind: {
    path: "$ref",
    preserveNullAndEmptyArrays: true
  }
}
Instead of:
{
  $unwind: "$ref"
}
I found preserveNullAndEmptyArrays in this answer: How to get all results if unwind field does not exist in mongodb.
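For completeness, a minimal sketch of how the fixed pipeline could fit together, assuming the $unwind on "$ref" sat before the $lookup shown in the question:
{ $match },
{
  // keeps documents even when ref is missing
  $unwind: {
    path: "$ref",
    preserveNullAndEmptyArrays: true
  }
},
{
  $lookup: {
    from: "collectionB",
    let: { ref: "$ref" },
    pipeline: [
      { $match: { $expr: { $eq: ["$_id", "$$ref"] } } },
      { $project: { key1: 1 } }
    ],
    as: "someData"
  }
}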

Related

How do I use a wildcard in my lookup foreignField?

I'm trying to make a lookup, where the foreignField is dynamic:
{
  $match: {
    _id: ObjectId('61e56339b528bf009feca149')
  }
},
{
  $lookup: {
    from: 'computer',
    localField: '_id',
    foreignField: 'configs.?.refId',
    as: 'computers'
  }
}
I know that the foreignField always starts with configs and ends with refId, but the string between the two is dynamic.
Here is an example of what a document looks like:
{
  '_id': ObjectId('6319bd1540b41d1a35717a16'),
  'name': 'MyComputer',
  'configs': {
    'ybe': {
      'refId': ObjectId('61e56339b528bf009feca149'),
      'name': 'Ybe Config'
    },
    'test': {
      'refId': ObjectId('61f3d7ec47805d1443f14540'),
      'name': 'TestConfig'
    },
    ...
  }
}
As you can see the configs property contains different objects with different names ('ybe', 'test', etc...). I want to lookup based on the refId inside of all of those objects.
How do I achieve that?
Using a dynamic value as a field name is considered an anti-pattern and introduces unnecessary complexity to querying. However, you can achieve the behaviour you want with $objectToArray: convert the object into an array of k-v pairs and perform the $match in a sub-pipeline.
db.coll.aggregate([
  {
    "$lookup": {
      "from": "computer",
      "let": {
        id: "$_id"
      },
      "pipeline": [
        {
          $set: {
            configs: {
              "$objectToArray": "$configs"
            }
          }
        },
        {
          "$unwind": "$configs"
        },
        {
          $match: {
            $expr: {
              $eq: [
                "$$id",
                "$configs.v.refId"
              ]
            }
          }
        }
      ],
      "as": "computers"
    }
  }
])
MongoPlayground

MongoDB : How to loop on a field in order to lookup each value?

I need to loop over the products field in order to look up each value in the products collection and check the product category. If the category equals some value, I am adding a new field to my current document.
Here is a sample of my first document:
{
  _id: (objId),
  'products': [
    { productId: 987678 },
    { productId: 3456765 }
  ]
}
And my products documents:
{ _id: (objId), category: 1, name: 2 }
If the category is correct, I use $addFields to add this to my first document:
category: true
I can't figure out how to do this. Can anyone help me, please?
https://mongoplayground.net/p/_KeECXkJTfB - here we unwind products before the lookup, then compare the category found, and then set category to true if a match is found.
db.product.aggregate([
  {
    "$unwind": "$products"
  },
  {
    "$lookup": {
      "from": "category",
      "localField": "products.productId",
      "foreignField": "_id",
      "as": "inventory_docs"
    }
  },
  {
    "$unwind": {
      path: "$inventory_docs",
      preserveNullAndEmptyArrays: true
    }
  },
  {
    "$addFields": {
      "products.category": {
        $cond: {
          if: {
            "$gt": [
              {
                $strLenCP: {
                  "$toString": {
                    "$ifNull": ["$inventory_docs.category", ""]
                  }
                }
              },
              0
            ]
          },
          then: true,
          else: false
        }
      }
    }
  }
])
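Since the pipeline above starts by unwinding products, each product ends up in its own result document. If you need the original array shape back, a minimal sketch of a final $group stage (assuming _id is the document key) could be appended:
{
  $group: {
    _id: "$_id",
    // rebuild the products array, now carrying the added category flag
    products: { $push: "$products" }
  }
}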

Array is reordered when using $lookup

I have this aggregation:
db.getCollection("users").aggregate([
  {
    "$match": {
      "_id": "5a708a38e6a4078bd49f01d5"
    }
  },
  {
    "$lookup": {
      "from": "user-locations",
      "localField": "locations",
      "foreignField": "_id",
      "as": "locations"
    }
  }
])
It works well, but there is one small thing that I don't understand and can't fix.
In the query output, the locations array is reordered by ObjectId, and I really need to keep the original order of the data.
Here is what the locations array from the users collection looks like:
'locations' : [
  ObjectId("5b55e9820b720a1a7cd19633"),
  ObjectId("5a708a38e6a4078bd49ef13f")
],
And here is the result after the aggregation:
'locations' : [
  {
    '_id' : ObjectId("5a708a38e6a4078bd49ef13f"),
    'name' : 'Location 2'
  },
  {
    '_id' : ObjectId("5b55e9820b720a1a7cd19633"),
    'name' : 'Location 1'
  }
],
What am I missing here? I really have no idea how to proceed with this issue.
Could you give me a push?
$lookup does not guarantee the order of the result documents. You can try the following approach to preserve the natural order of the documents:
$unwind to deconstruct the locations array, adding an automatic index number that starts from 0
$lookup with locations
$set to select the first element from locations
$sort by the index field in ascending order
$group by _id and reconstruct the locations array
db.users.aggregate([
  { $match: { _id: "5a708a38e6a4078bd49f01d5" } },
  {
    $unwind: {
      path: "$locations",
      includeArrayIndex: "index"
    }
  },
  {
    $lookup: {
      from: "user-locations",
      localField: "locations",
      foreignField: "_id",
      as: "locations"
    }
  },
  { $set: { locations: { $arrayElemAt: ["$locations", 0] } } },
  { $sort: { index: 1 } },
  {
    $group: {
      _id: "$_id",
      locations: { $push: "$locations" }
    }
  }
])
Playground
From this closed bug report:
When using $lookup, the order of the documents returned is not guaranteed. The documents are returned in "natural order" - as they are encountered in the database. The only way to get a guaranteed consistent order is to add a $sort stage to the query.
Basically, the way any Mongo query/pipeline works is that it returns documents in the order they were matched, meaning the "right" order is not guaranteed, especially if there's index usage involved.
What you should do is add a $sort stage as suggested, like so:
db.collection.aggregate([
  {
    "$match": {
      "_id": "5a708a38e6a4078bd49f01d5"
    }
  },
  {
    "$lookup": {
      "from": "user-locations",
      "let": {
        "locations": "$locations"
      },
      "pipeline": [
        {
          "$match": {
            "$expr": {
              "$setIsSubset": [
                ["$_id"],
                "$$locations"
              ]
            }
          }
        },
        {
          $sort: {
            _id: 1 // or any other sort field you want
          }
        }
      ],
      "as": "locations"
    }
  }
])
You can also keep the original $lookup syntax you're using and just $unwind, $sort and then $group to restore the structure.
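A minimal sketch of that last alternative, assuming the same collections as above and using includeArrayIndex to remember each element's original position:
db.users.aggregate([
  { "$match": { "_id": "5a708a38e6a4078bd49f01d5" } },
  // remember the original position of each locations element
  { "$unwind": { path: "$locations", includeArrayIndex: "order" } },
  {
    "$lookup": {
      "from": "user-locations",
      "localField": "locations",
      "foreignField": "_id",
      "as": "locations"
    }
  },
  { "$unwind": "$locations" },
  // restore the original order, then rebuild the array
  { "$sort": { "order": 1 } },
  {
    "$group": {
      "_id": "$_id",
      "locations": { "$push": "$locations" }
    }
  }
])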

MongoDB aggregation - replace field in collection from value in another collection

So I've got a collection like so:
_id: "a6c67aad-e90c-4a13-aae0-74e5ca5c8632"
value : true
and one like this
_id: "a6c67aad-e90c-4a13-aae0-74e5ca5c8632"
otherValue : false
How can I use aggregation pipelines to update otherValue in the second collection with value from the first collection, based on _id?
I've tried using $lookup and then $unwind, like
{
  from: 'col1',
  localField: 'otherValue',
  foreignField: 'value',
  as: 'data'
}
and then unwind
{
  path: '$val'
}
But I'm not quite sure where to go from here; any help would be greatly appreciated.
Try this:
db.collection1.aggregate([
  {
    $lookup: {
      from: "collection2",
      let: { c1_id: "$_id", value: "$value" },
      pipeline: [
        {
          $match: {
            $expr: { $eq: ["$_id", "$$c1_id"] }
          }
        },
        {
          $addFields: { otherValue: "$$value" }
        }
      ],
      as: "data"
    }
  },
  {
    $unwind: "$data"
  }
])
Output:
{
  "_id" : "a6c67aad-e90c-4a13-aae0-74e5ca5c8632",
  "value" : true,
  "data" : {
    "_id" : "a6c67aad-e90c-4a13-aae0-74e5ca5c8632",
    "otherValue" : true
  }
}
Where collection1 is:
{
  "_id" : "a6c67aad-e90c-4a13-aae0-74e5ca5c8632",
  "value" : true
}
Where collection2 is:
{
  "_id" : "a6c67aad-e90c-4a13-aae0-74e5ca5c8632",
  "otherValue" : false
}
You might use the $merge aggregation stage.
match the documents from the source collection that you want to use to update the second collection
lookup the matching document from the second collection
unwind so it is a single document instead of an array (this stage also eliminates documents that don't match one from the second collection)
addFields to store the value from the first document in the looked-up document
replaceRoot to promote the modified looked-up document
merge the modified documents back into the second collection, matching on _id
db.collection.aggregate([
  { $match: { /* filter to pick the documents */ } },
  {
    $lookup: {
      from: "otherCollection",
      localField: "_id",
      foreignField: "_id",
      as: "otherDocument"
    }
  },
  { $unwind: "$otherDocument" },
  {
    $addFields: {
      "otherDocument.otherValue": "$value"
    }
  },
  { $replaceRoot: { newRoot: "$otherDocument" } },
  {
    $merge: {
      into: "otherCollection",
      on: "_id",
      whenMatched: "replace",
      whenNotMatched: "insert"
    }
  }
])
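One design note on the final stage: whenNotMatched: "insert" would create a new document in otherCollection if the pipeline ever produced an _id that isn't there. Since the $unwind above already drops source documents without a counterpart, you may prefer to state explicitly that nothing new should be created, e.g. with "discard":
{
  $merge: {
    into: "otherCollection",
    on: "_id",
    whenMatched: "replace",
    // drop pipeline output with no matching _id instead of inserting it
    whenNotMatched: "discard"
  }
}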

MongoDB how to find documents with undefined $lookups (aggregation)

I have a document in an origin collection that may or may not hold a reference to a document in a foreign collection - the key is not mandatory, so sometimes it's missing.
In such a situation the $lookup "fails" and the desired document does not get returned from the DB at all.
This is the pipeline:
{
  $lookup: {
    from: "tables",
    let: { "enginefuel_type": "$engine.fuel_type" },
    pipeline: [
      { $match: { $expr: { $eq: ["$_id", "$$enginefuel_type"] } } },
      { $project: { title: 1 } }
    ],
    as: "engine.fuel_type"
  }
},
{
  $unwind: "$engine.fuel_type"
},
{
  $lookup: {
    from: "tables",
    let: { "enginegear": "$engine.gear" },
    pipeline: [
      { $match: { $expr: { $eq: ["$_id", "$$enginegear"] } } },
      { $project: { title: 1 } }
    ],
    as: "engine.gear"
  }
},
{
  $unwind: "$engine.gear"
}
I need the document to be found anyway, whether it has the engine.fuel_type and/or engine.gear fields or not.
If the field is there, it should take the document from the foreign collection; otherwise it should just remain empty, but not ignore the whole document.
I thought about making some pre-check ("if" statement) on whether the field exists before building the aggregate query (which could also be more efficient, reducing requests to the DB).
Is there any good way to do that?
The $lookup stage works as you need it to: even though the field is not present in the origin collection, the document will not be ignored and will be part of the result, with the "engine.fuel_type" array containing 0 elements.
It's the $unwind stage that removes documents which have 0 array elements. Fortunately, the $unwind stage provides the preserveNullAndEmptyArrays option, which keeps all the results.
So you could try to do something like this:
{
  $lookup: {
    from: "tables",
    let: { "enginefuel_type": "$engine.fuel_type" },
    pipeline: [
      { $match: { $expr: { $eq: ["$_id", "$$enginefuel_type"] } } },
      { $project: { title: 1 } }
    ],
    as: "engine.fuel_type"
  }
},
{
  $unwind: {
    path: "$engine.fuel_type",
    preserveNullAndEmptyArrays: true
  }
}
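The same option would be needed on the second $unwind from the question as well; here is a sketch for the engine.gear part, simply mirroring the stages above:
{
  $lookup: {
    from: "tables",
    let: { "enginegear": "$engine.gear" },
    pipeline: [
      { $match: { $expr: { $eq: ["$_id", "$$enginegear"] } } },
      { $project: { title: 1 } }
    ],
    as: "engine.gear"
  }
},
{
  $unwind: {
    path: "$engine.gear",
    preserveNullAndEmptyArrays: true
  }
}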