MongoDB $lookup - conditional value for localField?

I have 2 collections I want to combine using lookup.
Collection1: _AddressSyncStatus
Fields I want to use from this collection: "address".
Collection2: EthTokenTransfers
Fields I want to use from this collection: "to_address", "from_address".
Now, when I use MongoDB Compass, the $lookup stage expects a local field (in my case a field of EthTokenTransfers) to join the collections. My problem is that I want to look up documents where address from _AddressSyncStatus equals either EthTokenTransfers.from_address OR EthTokenTransfers.to_address.
Is this possible using aggregations?
{
  from: '_AddressSyncStatus',
  localField: 'string', // from_address OR to_address
  foreignField: 'address',
  as: 'synced_address'
}

One way to do it is using the $lookup pipeline with $or:
db.EthTokenTransfers.aggregate([
  {
    $lookup: {
      from: "_AddressSyncStatus",
      let: {
        from_address: "$from_address",
        to_address: "$to_address"
      },
      pipeline: [
        {
          $match: {
            $expr: {
              $or: [
                { $eq: ["$_id", "$$from_address"] },
                { $eq: ["$_id", "$$to_address"] }
              ]
            }
          }
        }
      ],
      as: "synced_address"
    }
  }
])
As you can see here.
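If the address is actually stored in an address field (as the question's foreignField suggests) rather than in _id, a variant of the same idea using the $in aggregation operator could look like this (a sketch, not tested against the actual data):
db.EthTokenTransfers.aggregate([
  {
    $lookup: {
      from: "_AddressSyncStatus",
      let: { from_address: "$from_address", to_address: "$to_address" },
      pipeline: [
        {
          // match when the synced address equals either side of the transfer
          $match: {
            $expr: { $in: ["$address", ["$$from_address", "$$to_address"]] }
          }
        }
      ],
      as: "synced_address"
    }
  }
])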
But I think that for the rest of the work with this data, it will be more convenient like this:
db.EthTokenTransfers.aggregate([
  {
    $lookup: {
      from: "_AddressSyncStatus",
      localField: "from_address",
      foreignField: "_id",
      as: "synced_address_from"
    }
  },
  {
    $lookup: {
      from: "_AddressSyncStatus",
      localField: "to_address",
      foreignField: "_id",
      as: "synced_address_to"
    }
  }
])
As you can see here.
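If a single combined field is still preferred after the two lookups, one option (a sketch built on the pipeline above) is to merge the two arrays in an extra stage:
db.EthTokenTransfers.aggregate([
  { $lookup: { from: "_AddressSyncStatus", localField: "from_address", foreignField: "_id", as: "synced_address_from" } },
  { $lookup: { from: "_AddressSyncStatus", localField: "to_address", foreignField: "_id", as: "synced_address_to" } },
  {
    // combine both lookup results into one array, dropping duplicates
    $addFields: {
      synced_address: { $setUnion: ["$synced_address_from", "$synced_address_to"] }
    }
  }
])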

Related

Mongo $lookup aggregation using date timestamps

I have two collections: Car Drive Histories and Car Geolocations. For the purpose of analyzing drive patterns I have to aggregate driving histories and link them to car geolocations.
I've used $match and $project aggregation stages to get drive history documents with the following structure:
travelPurpose:<String>
carID:<ObjectId>
checkOutTime:<Date>
checkInTime:<Date>
The next step is to use a $lookup stage to get the car locations between the two timestamps (checkOutTime and checkInTime). Every car geolocation document has carID and geoLocationTimestamp fields. If I use static dates, for example:
{
  from: 'carGeoLocations',
  localField: 'carID',
  foreignField: 'carID',
  pipeline: [
    {
      $match: {
        geoLocationTimestamp: {
          $gte: ISODate('2022-01-01T00:00:00.000+0000'),
          $lte: ISODate('2023-01-01T00:00:00.000+0000')
        }
      }
    }
  ],
  as: 'coordinates'
}
I do get geolocations between 1 January 2022 and 1 January 2023. A Mongo Playground with an example of this behaviour can be accessed here.
However, if I try to use dynamic dates based on the values of checkOutTime and checkInTime, no documents are retrieved. A Mongo Playground with this example is available here. I've tried the following:
{
  from: 'carGeoLocations',
  localField: 'carID',
  foreignField: 'carID',
  pipeline: [
    {
      $match: {
        geoLocationTimestamp: {
          $gte: "$checkOutTime",
          $lte: "$checkInTime"
        }
      }
    }
  ],
  as: 'coordinates'
}
and
{
  from: 'carGeoLocations',
  localField: 'carID',
  foreignField: 'carID',
  let: { t1: '$checkOutTime', t2: '$checkInTime' },
  pipeline: [
    {
      $match: {
        geoLocationTimestamp: {
          $gte: '$$t1',
          $lte: '$$t2'
        }
      }
    }
  ],
  as: 'coordinates'
}
With the same results. Can anyone spot any issues with my approach?
First do the lookup and then use $match for geoLocationTimestamp.
Try the following code:
{
  $lookup: {
    from: 'carGeoLocations',
    localField: 'carID',
    foreignField: 'carID',
    as: 'coordinates'
  }
},
{
  $match: {
    geoLocationTimestamp: {
      $gte: '$coordinates.checkOutTime',
      $lte: '$coordinates.checkInTime'
    }
  }
}
Update
After further experimentation, it turns out you need to use $expr when you want to use variables declared with let in the $lookup stage of an aggregation.
My lookup stage now looks like this:
{
  "$lookup": {
    "from": "carGeoLocations",
    "localField": "carID",
    "foreignField": "carID",
    "let": {
      t1: "$checkOutTime",
      t2: "$checkInTime"
    },
    "pipeline": [
      {
        $match: {
          $and: [
            { $expr: { $gte: ["$geoLocationTimestamp", "$$t1"] } },
            { $expr: { $lte: ["$geoLocationTimestamp", "$$t2"] } }
          ]
        }
      }
    ],
    "as": "coordinates"
  }
}
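For what it's worth, the two $expr conditions can also be folded into a single $expr with $and (and note that combining localField/foreignField with pipeline in the same $lookup requires MongoDB 5.0 or newer). A sketch of that equivalent form:
{
  $lookup: {
    from: "carGeoLocations",
    localField: "carID",
    foreignField: "carID",
    let: { t1: "$checkOutTime", t2: "$checkInTime" },
    pipeline: [
      {
        // keep only locations recorded between check-out and check-in
        $match: {
          $expr: {
            $and: [
              { $gte: ["$geoLocationTimestamp", "$$t1"] },
              { $lte: ["$geoLocationTimestamp", "$$t2"] }
            ]
          }
        }
      }
    ],
    as: "coordinates"
  }
}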

MongoDB lookup with match not efficient in large amount of data

I have two collections, systems and systemsToUsers. The systemsToUsers collection references the systems _id as a foreign key, system_id. I need the system documents whose _id is not present in the systemsToUsers collection.
I am using the query below, but it takes a long time (5 seconds) for 100,000 documents. I need to optimize the query to reduce the execution time.
db.Systems.aggregate([
  {
    $lookup: {
      from: 'systemsToUsers',
      localField: '_id',
      foreignField: 'system_id',
      as: 'systemswithusers'
    }
  },
  { $match: { $expr: { $eq: [{ $size: '$systemswithusers' }, 0] } } }
]);
Try the $match below, and make sure every field used in the aggregation has an index.
db.Systems.aggregate([
  {
    $lookup: {
      from: "systemsToUsers",
      localField: "_id",
      foreignField: "system_id",
      as: "systemswithusers"
    }
  },
  {
    $match: {
      systemswithusers: []
    }
  }
])
mongoplayground
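Since only _id is indexed by default, the join key on the foreign collection usually needs an index of its own for the $lookup to stay fast. A minimal sketch, assuming the field names above:
// index the foreign-key field that the $lookup joins on
db.systemsToUsers.createIndex({ system_id: 1 })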

MongoDB $lookup creates array

So I am trying to join two collections together:
Collections are:
shows
episodes
I am using $lookup on the shows collection.
[
  {
    $lookup: {
      from: 'episode',
      localField: 'url',
      foreignField: 'show_b',
      as: 'match_docs'
    }
  }
]
However, I am getting all of the episodes from each show inside match_docs. In theory that is fine, but I need to limit it to the latest episode (limit: 1) for each show, ordered by pubDate.
If anyone knows how I could limit match_docs to a single document per lookup, that would be great.
I have also tried
{
  from: 'episode',
  localField: 'url',
  foreignField: 'show_b',
  pipeline: [
    { $sort: { id: 1 } },
    { $limit: 1 }
  ],
  as: 'match_docs'
}
With no success.
That would be easy with the second syntax of $lookup:
[
  {
    $lookup: {
      from: 'episodes', // make sure your collection name is correct
      let: { url: '$url' },
      pipeline: [
        {
          $match: {
            $expr: {
              $eq: ['$show_b', '$$url']
            }
          }
        },
        {
          $sort: {
            pubDate: -1
          }
        },
        {
          $limit: 1
        }
      ],
      as: 'match_docs'
    }
  }
]
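As a side note, on MongoDB 5.0+ the concise localField/foreignField form the question already tried can be combined with a pipeline, so that attempt should also work once the sort targets pubDate in descending order; a sketch:
[
  {
    $lookup: {
      from: 'episodes',             // assumed collection name, as above
      localField: 'url',
      foreignField: 'show_b',
      pipeline: [
        { $sort: { pubDate: -1 } }, // newest episode first
        { $limit: 1 }               // keep only the latest one
      ],
      as: 'match_docs'
    }
  }
]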

MongoDB Merge two properties from different collections into one and sort them

I'm facing an issue while trying to merge the results of two MongoDB lookups into one property, which I then want to unwind and sort.
My problem is with merging the results of the lookups.
Here is the actual code:
db.getCollection('collectionA').aggregate([
  {
    $lookup: {
      from: 'collectionB',
      localField: "_id",
      foreignField: "collectionAKey",
      as: "collectionBInfo"
    }
  },
  {
    $lookup: {
      from: 'collectionC',
      localField: "_id",
      foreignField: "collectionAKey",
      as: "collectionCInfo"
    }
  },
  // then I just want to create one property from both lookups, unwind it and sort by the timestamp
  {
    $unwind: "$mergedCollectionsAandB"
  },
  {
    $sort: {
      "mergedCollectionsAandB.timestamp": -1
    }
  }
])
Here are the models of the collections:
CollectionA
_id
name
CollectionB
_id
timestamp
collectionAKey
CollectionC
_id
timestamp
collectionAKey
I assume it's possible using the $mergeObjects MongoDB operator, but I'm a little stuck on how to do it the right way. Is that possible? Thanks in advance.
So the final version of my query looks like this; it's what I was looking for:
db.getCollection('collectionA').aggregate([
  {
    $lookup: {
      from: 'collectionB',
      localField: "_id",
      foreignField: "collectionAKey",
      as: "collectionBInfo"
    }
  },
  {
    $lookup: {
      from: 'collectionC',
      localField: "_id",
      foreignField: "collectionAKey",
      as: "collectionCInfo"
    }
  },
  {
    $project: {
      "mergedCollectionsAandB": { $concatArrays: ["$collectionBInfo", "$collectionCInfo"] }
    }
  },
  {
    $unwind: "$mergedCollectionsAandB"
  },
  {
    $sort: {
      "mergedCollectionsAandB.timestamp": -1
    }
  }
])
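One caveat: that $project keeps only the merged array and drops the other CollectionA fields such as name. If they should survive, a minimal sketch using $addFields in place of the $project (assuming the same lookups as above) would be:
{
  // keep all existing fields and just add the merged array
  $addFields: {
    mergedCollectionsAandB: { $concatArrays: ["$collectionBInfo", "$collectionCInfo"] }
  }
}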

MongoDB $lookup multiple localFields from one document

I've got an Addresses model:
{
  user1: ObjectId(),
  user2: ObjectId()
}
Each user field is a reference to the User model.
Is it possible to perform one lookup which will populate the data for both?
You can just chain multiple $lookup stages, one per reference, as follows:
collection.aggregate([
  {
    $lookup: {
      from: "whereyouwant",
      localField: "localfield",
      foreignField: "foreignfield",
      as: "name1"
    }
  },
  {
    $lookup: {
      from: "whereyouwant2",
      localField: "localfield2",
      foreignField: "foreignfield2",
      as: "name2"
    }
  },
  ... // rest of code
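Applied to the Addresses model above, a sketch (assuming the referenced collection is named users, that user1/user2 store its _id values, and that the Addresses collection is named addresses) could look like:
db.addresses.aggregate([
  {
    // resolve the first user reference
    $lookup: {
      from: "users",
      localField: "user1",
      foreignField: "_id",
      as: "user1_doc"
    }
  },
  {
    // resolve the second user reference
    $lookup: {
      from: "users",
      localField: "user2",
      foreignField: "_id",
      as: "user2_doc"
    }
  }
])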