I have two collections: Car Drive Histories and Car Geolocations. For the purpose of analyzing drive patterns I have to aggregate driving histories and link them to car geolocations.
I've used $match and $project aggregation stages to get drive history documents with the following structure:
travelPurpose:<String>
carID:<ObjectId>
checkOutTime:<Date>
checkInTime:<Date>
The next step is to use a $lookup stage to get car locations between the two timestamps (checkOutTime and checkInTime). Every car geolocation document has carID and geoLocationTimestamp fields. If I use static dates, for example:
{
from: 'carGeoLocations',
localField: 'carID',
foreignField: 'carID',
pipeline: [
{$match: {
geoLocationTimestamp: {
$gte: ISODate('2022-01-01T00:00:00.000+0000'),
$lte: ISODate('2023-01-01T00:00:00.000+0000')
}
}}
],
as: 'coordinates'
}
I do get geolocations between 1 January 2022 and 1 January 2023. A Mongo Playground example of this behaviour can be accessed here.
However, if I try to use dynamic dates based on the values of checkOutTime and checkInTime, no documents are retrieved. A Mongo Playground with this example is available here. I've tried the following:
{
from: 'carGeoLocations',
localField: 'carID',
foreignField: 'carID',
pipeline: [
{$match: {
geoLocationTimestamp: {
$gte: "$checkOutTime",
$lte: "$checkInTime"
}
}}
],
as: 'coordinates'
}
and
{
  from: 'carGeoLocations',
  localField: 'carID',
  foreignField: 'carID',
  let: {t1: '$checkOutTime', t2: '$checkInTime'},
  pipeline: [
    {$match: {
      geoLocationTimestamp: {
        $gte: '$$t1',
        $lte: '$$t2'
      }
    }}
  ],
  as: 'coordinates'
}
Both attempts produce the same (empty) result. Can anyone spot any issues with my approach?
First $lookup and then use $match for geoLocationTimestamp.
Try the following code:
{
$lookup:{
from: 'carGeoLocations',
localField: 'carID',
foreignField: 'carID',
as: 'coordinates'
}
},
{
$match:{
geoLocationTimestamp: {
$gte:'$coordinates.checkOutTime',
$lte:'$coordinates.checkInTime'
}
}
}
Update
After further experimentation, it turns out you need to use $expr when you want to reference variables declared with let in the $lookup stage of an aggregation.
My lookup stage now looks like this:
{
"$lookup": {
"from": "carGeoLocations",
"localField": "carID",
"foreignField": "carID",
"let": {
t1: "$checkOutTime",
t2: "$checkInTime"
},
"pipeline": [
{
$match: {
$and: [
{
$expr: {
$gte: [
"$geoLocationTimestamp",
"$$t1"
],
}
},
{
$expr: {
$lte: [
"$geoLocationTimestamp",
"$$t2"
],
}
},
]
}
}
],
"as": "coordinates"
}
}
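For what it's worth, the two $expr conditions can also be folded into a single $expr with $and; here is a minimal, equivalent sketch of the same stage:
{
  "$lookup": {
    "from": "carGeoLocations",
    "localField": "carID",
    "foreignField": "carID",
    "let": { t1: "$checkOutTime", t2: "$checkInTime" },
    "pipeline": [
      {
        $match: {
          // one $expr covering both bounds
          $expr: {
            $and: [
              { $gte: ["$geoLocationTimestamp", "$$t1"] },
              { $lte: ["$geoLocationTimestamp", "$$t2"] }
            ]
          }
        }
      }
    ],
    "as": "coordinates"
  }
}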
Related
//issues-collection
[
{
project: ObjectId("67"), // ref to a project
id: 101,
assignedTo: ObjectId("1212") // ref to a user
}
]
//project-collection
[
{
_id: ObjectId("67"),
title: "Proj 1",
members: [
{
_id: ObjectId("213213"),
user: ObjectId("1212"), // ref to a user
discipline: "Architect"
}
]
}
]
I need a query for the issues collection that filters on the discipline field from the project collection.
I think $lookup should be used. Something like this:
const getIssuesByDiscipline = (projectId, discipline: string) => {
  const res = mongoose.model("issues").aggregate([
    {
      $lookup: {
        from: "project",
        localField: "assignedTo",
        foreignField: "members.user",
        as: "project",
        // pipeline: ?
      }
    }
  ]);
  // ...
};
Not sure how to use $lookup when the foreign field is an array. Maybe something with pipeline?
Any suggestions?
I think you want something like:
MongoDB version 5.0 or higher:
Use the project field from the issues collection to find the documents in the project collection with the matching _id
Use the pipeline to $filter the relevant members
db.issues.aggregate([
{$lookup: {
from: "project",
localField: "project",
foreignField: "_id",
as: "project",
let: {assignedTo: "$assignedTo"},
pipeline: [
{$project: {
members: {$filter: {
input: "$members",
cond: {$eq: ["$$this.user", "$$assignedTo"]}
}}
}}
]
}}
])
See how it works on the playground example
MongoDB versions before 5.0:
You can't use localField and foreignField together with a pipeline. Instead, add another stage to the pipeline (as the first stage) to match the relevant documents:
db.issues.aggregate([
{$lookup: {
from: "project",
as: "project",
let: {assignedTo: "$assignedTo", project: "$project"},
pipeline: [
{$match: {$expr: {$eq: ["$$project", "$_id"]}}},
{$project: {
members: {$filter: {
input: "$members",
cond: {$eq: ["$$this.user", "$$assignedTo"]}
}}
}}
]
}}
])
See how it works on the playground example
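If you also need to restrict the result to a specific discipline (the question asks to query on the discipline field), you could extend the $filter condition with a second $eq; a sketch of the pre-5.0 variant, where "Architect" is just a hypothetical example value:
db.issues.aggregate([
  {$lookup: {
    from: "project",
    as: "project",
    let: {assignedTo: "$assignedTo", project: "$project"},
    pipeline: [
      {$match: {$expr: {$eq: ["$$project", "$_id"]}}},
      {$project: {
        members: {$filter: {
          input: "$members",
          cond: {$and: [
            {$eq: ["$$this.user", "$$assignedTo"]},
            {$eq: ["$$this.discipline", "Architect"]} // hypothetical example value
          ]}
        }}
      }}
    ]
  }}
])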
I have 2 collections I want to combine using lookup.
Collection1: _AddressSyncStatus
Fields I want to use from this collection: "address"
Collection2: EthTokenTransfers
Fields I want to use from this collection: "to_address", "from_address".
Now, when I use MongoDB Compass, the lookup stage expects a local field (in my case a field of EthTokenTransfers) to join the collections. My problem is that I want to look up documents where address from _AddressSyncStatus is either EthTokenTransfers.from_address OR EthTokenTransfers.to_address.
Is this possible using aggregations?
{
from: '_AddressSyncStatus',
localField: 'string', //from_address OR to_address
foreignField: 'address',
as: 'synced_address'
}
One way to do it is using the $lookup pipeline with $or:
db.EthTokenTransfers.aggregate([
{
$lookup: {
from: "_AddressSyncStatus",
let: {
from_address: "$from_address",
to_address: "$to_address"
},
pipeline: [
{
$match: {
$expr: {$or: [{$eq: ["$_id", "$$from_address"]},
{$eq: ["$_id", "$$to_address"]}
]
}
}
}
],
as: "synced_address"
}
}
])
As you can see here.
But I think that for the rest of the work with this data, it will be more convenient like this:
db.EthTokenTransfers.aggregate([
{
$lookup: {
from: "_AddressSyncStatus",
localField: "from_address",
foreignField: "_id",
as: "synced_address_from"
}
},
{
$lookup: {
from: "_AddressSyncStatus",
localField: "to_address",
foreignField: "_id",
as: "synced_address_to"
}
}
])
As you can see here
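If you later need both results in a single field, one option (a sketch, using $addFields, for which $set is an alias on 4.2+) is to concatenate the two arrays in a follow-up stage appended after the two $lookup stages above:
{
  $addFields: {
    // merge both lookup results into one array
    synced_address: { $concatArrays: ["$synced_address_from", "$synced_address_to"] }
  }
}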
I have two collections that I want to join with $lookup based on two id fields. Both fields are of type GUID and look like this in MongoDB Compass: Binary('cavTZa/U2kqfHtf08sI+Fg==', 3)
This syntax in the Compass aggregation pipeline builder gives the expected result:
{
from: 'clients',
localField: 'ClientId',
foreignField: '_id',
as: 'ClientData'
}
But I want to add some projection and tried to change it like this:
{
from: 'clients',
'let': {
id: '$_id.clients'
},
pipeline: [
{
$match: {
$expr: {
$eq: [
'$ClientId',
'$$id'
]
}
}
},
{
$project: {
Name: 1,
_id: 0
}
}
],
as: 'ClientData'
}
But the result here is that every client from the 'clients' collection is added to every document in the source collection. I have to use MongoDB 3.6, so the new $lookup syntax from >= 5.0 is not available.
Any ideas? Does $eq work for GUID data stored as binary?
In the first example, you say that the local field is ClientId and the foreign field is _id. But that's not what you used in your second example.
This should work better:
{
from: 'clients',
'let': {
ClientId: '$ClientId'
},
pipeline: [
{
$match: {
$expr: {
$eq: [
'$$ClientId',
'$_id'
]
}
}
},
{
$project: {
Name: 1,
_id: 0
}
}
],
as: 'ClientData'
}
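As far as I can tell, $eq does work for GUIDs stored as BinData, as long as both fields use the same binary subtype (binary values are compared by length, then subtype, then bytes). For completeness, the same stage run from the shell would look roughly like this; 'orders' is only a placeholder for whatever your starting collection is called:
// 'orders' is a hypothetical placeholder for the source collection
db.orders.aggregate([
  {
    $lookup: {
      from: 'clients',
      let: { ClientId: '$ClientId' },
      pipeline: [
        { $match: { $expr: { $eq: ['$$ClientId', '$_id'] } } },
        { $project: { Name: 1, _id: 0 } }
      ],
      as: 'ClientData'
    }
  }
])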
So I am trying to join two collections together:
Collections are:
shows
episodes
I am using $lookup in an aggregation on the shows collection.
[{$lookup: {
from: 'episode',
localField: 'url',
foreignField: 'show_b',
as: 'match_docs'
}}]
However, I am getting all of the episodes from each show inside match_docs. In theory that is fine, but I need to be able to limit it to the latest episode (limit: 1) for each show, ordered by pubDate.
If anyone knows how I could limit match_docs to that single latest episode, that would be great.
I have also tried
{
from: 'episode',
localField: 'url',
foreignField: 'show_b',
pipeline: [
{$sort:{id:1}},{$limit:1},
],
as: 'match_docs'
}
With no success.
That would be easy with the second syntax of $lookup:
[
{
$lookup: {
from: 'episodes', // make sure your collection name is correct
let: {url: '$url'},
pipeline: [
{
$match: {
$expr: {
$eq: ['$show_b', '$$url']
}
}
},
{
$sort: {
pubDate: -1
}
},
{
$limit: 1
}
],
as: 'match_docs'
}
}
]
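As a side note, if your server is MongoDB 5.0 or newer, the concise form you already tried (localField/foreignField combined with a pipeline) should also work; a sketch, sorting on pubDate instead of id:
{
  $lookup: {
    from: 'episodes',
    localField: 'url',
    foreignField: 'show_b',
    pipeline: [
      { $sort: { pubDate: -1 } }, // newest first
      { $limit: 1 }
    ],
    as: 'match_docs'
  }
}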
I have a collection matches like this. I'm using a players object {key: ObjectId, key: ObjectId} instead of a classic array [ObjectId, ObjectId] to reference the players collection.
{
"_id": ObjectId("5eb93f8efd259cd7fbf49d55"),
"date": "01/01/2020",
"players": {
"home": ObjectId("5eb93f8efd259cd7fbf49d59"),
"away": ObjectId("5eb93f8efd259cd7fbf49d60")
}
},
{...}
And players collection:
{
"_id": ObjectId("5eb93f8efd259cd7fbf49d59"),
"name": "Roger Federer"
"country": "Suiza"
},
{
"_id": ObjectId("5eb93f8efd259cd7fbf49d60"),
"name": "Rafa Nadal"
"country": "España"
},
{...}
What's the best way to do a MongoDB lookup here? Is something like this correct?
const rows = await db.collection('matches').aggregate([
  {
    $lookup: {
      from: "players",
      localField: "players.home",
      foreignField: "_id",
      as: "players.home"
    }
  },
  {
    $lookup: {
      from: "players",
      localField: "players.away",
      foreignField: "_id",
      as: "players.away"
    }
  },
  { $unwind: "$players.home" },
  { $unwind: "$players.away" }
]).toArray()
I want output like this:
{
_id: 5eb93f8efd259cd7fbf49d55,
date: "12/05/20",
players: {
home: {
_id: 5eb93f8efd259cd7fbf49d59,
name: "Roger Federer",
country: "Suiza"
},
away: {
_id: 5eb93f8efd259cd7fbf49d60,
name: "Rafa Nadal",
country: "España"
}
}
}
{...}
You can try the below aggregation query:
db.matches.aggregate([
{
$lookup: {
from: "players",
localField: "players.home",
foreignField: "_id",
as: "home"
}
},
{
$lookup: {
from: "players",
localField: "players.away",
foreignField: "_id",
as: "away"
}
},
/** If the lookup output is not an empty array [], take the first doc and write it to the respective field; otherwise keep the original value */
{
$project: {
date: 1,
"players.home": { $cond: [ { $eq: [ "$home", [] ] }, "$players.home", { $arrayElemAt: [ "$home", 0 ] } ] },
"players.away": { $cond: [ { $eq: [ "$away", [] ] }, "$players.away", { $arrayElemAt: [ "$away", 0 ] } ] }
}
}
])
Test: mongoplayground
Changes or issues with the current query:
1) You're using two $unwind stages one after the other. If either field (home or away) has no matching document in the players collection, the result won't even contain the original match document. Why? Because doing $unwind on [] (which is what the lookup stage returns when there's no match) removes the parent document from the result. To overcome this you need the preserveNullAndEmptyArrays option in the unwind stage (see the sketch after this list).
2) There is another way to do this without using $unwind at all: don't use as: "players.home" or as: "players.away", because you're writing back to the original field. If no matching document is found, an empty array [] gets written to "home" or "away" (and you lose the actual ObjectId() value that existed in that field of the matches doc). Instead, write the output of the lookup to a new field.
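A minimal sketch of the $unwind option mentioned in point 1:
{ $unwind: { path: "$players.home", preserveNullAndEmptyArrays: true } },
{ $unwind: { path: "$players.away", preserveNullAndEmptyArrays: true } }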
Or, as an even more efficient alternative to two $lookup stages (each lookup has to go through the players collection again and again), you can try one lookup with multiple-join-conditions-with-lookup:
db.matches.aggregate([
{
$lookup: {
from: "players",
let: { home: "$players.home", away: "$players.away" },
pipeline: [
{
$match: { $expr: { $or: [ { $eq: [ "$_id", "$$home" ] }, { $eq: [ "$_id", "$$away" ] } ] } }
}
],
as: "data"
}
}
])
Test: mongoplayground
Note: Here all the matching docs from players, regardless of whether they matched the away or home field, will be pushed to the data array. So, to keep the DB operation simple, you can get that array from the DB along with the actual matches document and offload some work to application code, i.e. map the respective objects from the data array onto the players.home and players.away fields.
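A minimal sketch of that client-side mapping, assuming the Node.js driver (where ObjectId has an .equals() method) and the single-$lookup pipeline above:
const rows = await db.collection('matches').aggregate([ /* pipeline above */ ]).toArray()
const result = rows.map(({ data, ...match }) => ({
  ...match,
  players: {
    // fall back to the original ObjectId if no player document was found
    home: data.find(p => p._id.equals(match.players.home)) || match.players.home,
    away: data.find(p => p._id.equals(match.players.away)) || match.players.away
  }
}))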