I'm filtering the data based on a Boolean savedBoolean; if that Boolean is not provided, I return all the data. The code below works for now, but how can I do this in a cleaner way, since I'm duplicating the code?
let filteredReviews: any | undefined;
if (savedBoolean === true || savedBoolean === false) {
  filteredReviews = await Interviewee.aggregate([
    {
      $project: {
        _id: 0,
        userId: 1,
        'interviews.review': 1,
      },
    },
    {
      $unwind: '$interviews',
    },
    {
      $match: {
        userId: '4',
        'interviews.review.saved': savedBoolean,
      },
    },
    {
      $group: {
        _id: '$interviews.review._id',
        review: {
          $first: '$interviews.review',
        },
      },
    },
  ]).skip((Number(page) - 1) * 3).limit(3);
}
if (savedBoolean === undefined) {
  filteredReviews = await Interviewee.aggregate([
    {
      $project: {
        _id: 0,
        userId: 1,
        'interviews.review': 1,
      },
    },
    {
      $match: {
        userId: '4',
      },
    },
    {
      $unwind: '$interviews',
    },
  ]).skip((Number(page) - 1) * 3).limit(3);
}
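One way to remove the duplication (a sketch, assuming the same Interviewee model and the page and savedBoolean variables from above) is to build the pipeline once and only add the saved filter and the $group stage when the Boolean is actually provided:
// Build the $match conditions once; add the saved filter only when provided.
const match: any = { userId: '4' };
if (typeof savedBoolean === 'boolean') {
  match['interviews.review.saved'] = savedBoolean;
}

const pipeline: any[] = [
  { $project: { _id: 0, userId: 1, 'interviews.review': 1 } },
  { $unwind: '$interviews' },
  { $match: match },
];

// The $group deduplication is only needed when filtering on saved.
if (typeof savedBoolean === 'boolean') {
  pipeline.push({
    $group: {
      _id: '$interviews.review._id',
      review: { $first: '$interviews.review' },
    },
  });
}

const filteredReviews = await Interviewee.aggregate(pipeline)
  .skip((Number(page) - 1) * 3)
  .limit(3);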
I am trying to conditionally execute two different Mongoose operators, but it just returns no error and doesn't update the document.
My Schema:
const CartSchema: Schema<Document<ICart>> = new Schema({
  clientID: { type: String, required: true },
  sourceID: { type: String, required: true },
  items: { type: Array },
  source: { type: String, required: true },
}, { collection: "carts", timestamps: true });
This is how I am trying to implement it:
await CartModel.findOneAndUpdate(
  { sourceID: clientID, 'items.id': Types.ObjectId(itemID) },
  {
    $cond: {
      if: {
        $eq: 1,
      },
      then: { $pull: { 'items.$.id': Types.ObjectId(itemID) } },
      else: { $inc: { 'items.$.amount': -1 } },
    }
  },
  { new: true }
).lean({ virtuals: true })
I also tried this kind of query: { sourceID: clientID }, but it didn't help. I thought maybe I could not find the element and it just silently passes through.
The main idea of what I want to do is a conditional Mongoose request where I either remove the object from the array if the current value of the amount field is equal to 1, or decrement the value by 1.
Apparently, the $cond operator works only in the aggregation framework. On my Atlas tier I cannot check whether the query works properly, but I suppose it should look something like this:
await CartModel.aggregate([
  {
    $match: {
      sourceID: clientID,
      'items.id': Types.ObjectId(itemID)
    }
  },
  {
    $cond: {
      if: {
        $eq: 1,
      },
      then: { $pull: { 'items.$.id': Types.ObjectId(itemID) } },
      else: { $inc: { 'items.$.amount': -1 } },
    }
  }
])
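A possible direction (a sketch, assuming MongoDB 4.2+, which accepts an aggregation pipeline as the update document, and the same CartModel, clientID, and itemID variables; I cannot verify it on this Atlas tier either) would be to compute the new items array inside a $set stage:
await CartModel.findOneAndUpdate(
  { sourceID: clientID, 'items.id': Types.ObjectId(itemID) },
  [
    {
      $set: {
        items: {
          $let: {
            // the matched item, so its amount can be inspected
            vars: {
              item: {
                $arrayElemAt: [
                  {
                    $filter: {
                      input: '$items',
                      as: 'i',
                      cond: { $eq: ['$$i.id', Types.ObjectId(itemID)] }
                    }
                  },
                  0
                ]
              }
            },
            in: {
              $cond: [
                { $eq: ['$$item.amount', 1] },
                // amount is 1: drop the item from the array
                {
                  $filter: {
                    input: '$items',
                    as: 'i',
                    cond: { $ne: ['$$i.id', Types.ObjectId(itemID)] }
                  }
                },
                // otherwise: decrement its amount by 1
                {
                  $map: {
                    input: '$items',
                    as: 'i',
                    in: {
                      $cond: [
                        { $eq: ['$$i.id', Types.ObjectId(itemID)] },
                        { $mergeObjects: ['$$i', { amount: { $subtract: ['$$i.amount', 1] } }] },
                        '$$i'
                      ]
                    }
                  }
                }
              ]
            }
          }
        }
      }
    }
  ],
  { new: true }
)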
I have an aggregate like this:
const files = await File.aggregate([
  {
    $match: { facilityId: { $in: facilities } }
  },
  {
    $sort: { createdAt: 1 }
  },
  {
    $project: {
      file: 0,
    }
  }
])
I would like each "facility" to return only 4 files. I used to do something like facilities.map(id => query(id)), but I'd like to speed things up in the production environment.
Using $limit limits the whole query, which is not what I want. I tried using $slice in the projection stage but got an error:
MongoError: Bad projection specification, cannot include fields or add computed fields during an exclusion projection
How can I achieve that in a single query?
The schema of the collection is:
const FileStorageSchema = new Schema({
  name: { type: String, required: true },
  userId: { type: String },
  facilityId: { type: String },
  patientId: { type: String },
  type: { type: String },
  accessed: { type: Boolean, default: false, required: true },
  file: {
    type: String,
    required: true,
    set: AES.encrypt,
    get: AES.decrypt
  },
  sent: { type: Boolean, default: false, required: true },
},
{
  timestamps: true,
  toObject: { getters: true },
  toJSON: { getters: true }
})
I would like to return all fields except the file field, which contains the encrypted blob encoded as base64.
Also: I have the feeling that my approach is not correct. What I really want is to query all facilityId values at once but limited to the 4 latest files created for each facility. I thought using an aggregate would help me achieve this, but I'm starting to think that's not how it's done.
From the question the schema is not clear, so I have two answers based on two possible schemas. Please use whichever works for you.
#1
db.collection.aggregate([
  {
    $match: {
      facilityId: { $in: [1, 2] }
    }
  },
  {
    $group: {
      _id: "$facilityId",
      files: { $push: "$file" }
    }
  },
  {
    $project: {
      files: { $slice: ["$files", 0, 4] }
    }
  }
])
#2
db.collection.aggregate([
  {
    $match: {
      facilityId: { $in: [1, 2] }
    }
  },
  {
    $project: {
      facilityId: 1,
      file: { $slice: ["$file", 4] }
    }
  }
])
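Since the goal is the 4 latest files per facility, a $sort on createdAt before the $group keeps the slices in the right order. A sketch of that variant (assuming MongoDB 4.2+ for the $unset stage, the FileStorageSchema from the question, and the same facilities array):
db.collection.aggregate([
  {
    $match: {
      facilityId: { $in: facilities }
    }
  },
  // newest first, so the first 4 pushed per group are the latest
  {
    $sort: { createdAt: -1 }
  },
  {
    $group: {
      _id: "$facilityId",
      files: { $push: "$$ROOT" }
    }
  },
  // keep only the 4 latest files per facility
  {
    $project: {
      files: { $slice: ["$files", 4] }
    }
  },
  // drop the encrypted blob from each embedded file (MongoDB 4.2+)
  {
    $unset: "files.file"
  }
])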
I am trying to update one document using findOneAndUpdate and $set, but I am clearly missing something crucial here because the new request is overwriting old values.
My Device schema looks like this:
{
  deviceId: {
    type: String,
    immutable: true,
    required: true,
  },
  version: {
    type: String,
    required: true,
  },
  deviceStatus: {
    sensors: [
      {
        sensorId: {
          type: String,
          enum: ['value1', 'value2', 'value3'],
        },
        status: { type: Number, min: -1, max: 2 },
      },
    ],
  },
}
And I am trying to update the document using this piece of code:
const deviceId = req.params.deviceId;
Device.findOneAndUpdate(
  { deviceId },
  { $set: req.body },
  {},
  (err, docs) => {
    if (err) {
      res.send(err);
    } else {
      res.send({ success: true });
    }
  }
);
And when I send a request from Postman with a body that contains one or multiple sensors, only the last request is saved in the database.
{
  "deviceStatus": {
    "sensors": [
      {
        "sensorId": "test",
        "status": 1
      }
    ]
  }
}
I would like to be able to update values that are already in the database based on req.body or add new ones if needed. Any help will be appreciated.
The documentation says:
The $set operator replaces the value of a field with the specified value.
You need the $push operator; it appends a specified value to an array.
Having these documents:
[
  {
    _id: 1,
    "array": [2, 4, 6]
  },
  {
    _id: 2,
    "array": [1, 3, 5]
  }
]
Using the $set operator:
db.collection.update(
  { _id: 1 },
  { $set: { array: 10 } }
)
Result:
{
  "_id": 1,
  "array": 10
}
Using the $push operator:
db.collection.update(
  { _id: 1 },
  { $push: { array: 10 } }
)
Result:
{
  "_id": 1,
  "array": [2, 4, 6, 10]
}
If you want to use $push and $set in one findOneAndUpdate, that's not possible. I prefer to use findById(), process the document, and then save(), so just try:
let result = await Device.findById(deviceId);
// implement the business logic on result here
await result.save();
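A sketch of what that could look like for this schema (the merge-by-sensorId logic and the findOne({ deviceId }) lookup are assumptions, since deviceId is a custom field rather than the _id):
// Look the device up by its custom deviceId field (assumption: not the _id).
const device = await Device.findOne({ deviceId });

// Hypothetical merge logic: update existing sensors by sensorId, append new ones.
for (const incoming of req.body.deviceStatus.sensors) {
  const existing = device.deviceStatus.sensors.find(
    (s) => s.sensorId === incoming.sensorId
  );
  if (existing) {
    existing.status = incoming.status;           // update an existing sensor
  } else {
    device.deviceStatus.sensors.push(incoming);  // add a new sensor
  }
}

await device.save();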
If you want to push new sensors every time you make a request, then update your code as shown below:
const deviceId = req.params.deviceId;
Device.findOneAndUpdate(
  { deviceId },
  {
    $push: {
      "deviceStatus.sensors": { $each: req.body.deviceStatus.sensors }
    }
  },
  {},
  (err, docs) => {
    if (err) {
      res.send(err);
    } else {
      res.send({ success: true });
    }
  }
);
Update to the old answer:
If you want to update the sensors every time you make a request, then update your code as shown below:
const deviceId = req.params.deviceId;
Device.findOneAndUpdate(
  { "deviceId": deviceId },
  { "deviceStatus": req.body.deviceStatus },
  { upsert: true },
  (err, docs) => {
    if (err) {
      res.send(err);
    } else {
      res.send({ success: true });
    }
  }
);
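For the "update existing values or add new ones" behaviour the question asks for, arrayFilters is another option. A sketch (assuming one sensor at a time, taken from the request body shown in the question):
const deviceId = req.params.deviceId;
const sensor = req.body.deviceStatus.sensors[0]; // assumption: handle one sensor per request

// Try to update a sensor that already exists in the array...
const updated = await Device.findOneAndUpdate(
  { deviceId, 'deviceStatus.sensors.sensorId': sensor.sensorId },
  { $set: { 'deviceStatus.sensors.$[s].status': sensor.status } },
  { arrayFilters: [{ 's.sensorId': sensor.sensorId }], new: true }
);

// ...and push it if it was not there yet.
if (!updated) {
  await Device.findOneAndUpdate(
    { deviceId },
    { $push: { 'deviceStatus.sensors': sensor } }
  );
}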
I would like to select all events of a certain type from an events collection and then return 2 different groupings using a single query.
For example, I currently have the following 2 selections:
const sessions = await Event.aggregate([
  {
    $match: {
      isAdmin: { $ne: true }
    }
  },
  {
    $group: {
      _id: '$sessionId'
    }
  }
]);

const users = await Event.aggregate([
  {
    $match: {
      isAdmin: { $ne: true }
    }
  },
  {
    $group: {
      _id: '$userId'
    }
  }
]);
I would like to achieve an end result of:
{
  numberOfSessions: sessions.length,
  numberOfUsers: users.length
}
By using a single query.
Thanks in advance!
You could use the $facet aggregation stage, which provides the capability to create multi-dimensional data within a single stage. For example:
const sessions = await Event.aggregate([
  {
    $match: {
      isAdmin: { $ne: true }
    }
  },
  {
    $facet: {
      sessions: [{
        $sortByCount: "$sessionId"
      }],
      users: [{
        $sortByCount: "$userId"
      }]
    }
  }
]);
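If you want the exact { numberOfSessions, numberOfUsers } shape from the question, a possible variant (a sketch, not part of the original answer) is to $group on the distinct ids inside each facet, $count them, and then reshape:
const [result] = await Event.aggregate([
  {
    $match: {
      isAdmin: { $ne: true }
    }
  },
  {
    $facet: {
      sessions: [{ $group: { _id: '$sessionId' } }, { $count: 'count' }],
      users: [{ $group: { _id: '$userId' } }, { $count: 'count' }]
    }
  },
  {
    $project: {
      // $count yields a single-element array (or an empty one when nothing matches)
      numberOfSessions: { $ifNull: [{ $arrayElemAt: ['$sessions.count', 0] }, 0] },
      numberOfUsers: { $ifNull: [{ $arrayElemAt: ['$users.count', 0] }, 0] }
    }
  }
]);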
I have an image schema that has a reference to a category schema and a nested array that contains objects with two fields (user, createdAt).
I am trying to query the schema by a category and add two custom fields to each image in my query.
Here is the solution with virtual fields:
totalLikes: Count of all nested attributes
schema.virtual("totalLikes").get(function() {
  return this.likes.length;
});
canLike: Check if the user with id "5c8f9e676ed4356b1de3eaa1" is included in the nested array. If the user is included it should return false, otherwise true.
schema.virtual("canLike").get(function() {
  return !this.likes.find(like => {
    return like.user === "5c8f9e676ed4356b1de3eaa1";
  });
});
In SQL this would be a simple subquery, but I can't get it working in Mongoose.
Schema:
import mongoose from "mongoose";

const model = new mongoose.Schema({
  category: {
    type: mongoose.Schema.Types.ObjectId,
    ref: "Category"
  },
  likes: [{
    user: {
      type: String,
      required: true
    },
    createdAt: {
      type: Date,
      required: true
    }
  }]
})
Here are two sample documents:
[
  {
    category: 5c90a0777952597cda9e9c8d,
    likes: [
      {
        _id: "5c90a4c79906507dac54e764",
        user: "5c8f9e676ed4356b1de3eaa1",
        createdAt: "2019-03-19T08:13:59.250+00:00"
      }
    ]
  },
  {
    category: 5c90a0777952597cda9e9c8d,
    likes: [
      {
        _id: "5c90a4c79906507dac54e764",
        user: "5c8f9e676ed4356b1dw223332",
        createdAt: "2019-03-19T08:13:59.250+00:00"
      },
      {
        _id: "5c90a4c79906507dac54e764",
        user: "5c8f9e676ed4356b1d8498933",
        createdAt: "2019-03-19T08:13:59.250+00:00"
      }
    ]
  }
]
Here is what it should look like:
[
  {
    category: 5c90a0777952597cda9e9c8d,
    likes: [
      {
        _id: "5c90a4c79906507dac54e764",
        user: "5c8f9e676ed4356b1de3eaa1",
        createdAt: "2019-03-19T08:13:59.250+00:00"
      }
    ],
    totalLikes: 1,
    canLike: false
  },
  {
    category: 5c90a0777952597cda9e9c8d,
    likes: [
      {
        _id: "5c90a4c79906507dac54e764",
        user: "5c8f9e676ed4356b1dw223332",
        createdAt: "2019-03-19T08:13:59.250+00:00"
      },
      {
        _id: "5c90a4c79906507dac54e764",
        user: "5c8f9e676ed4356b1d8498933",
        createdAt: "2019-03-19T08:13:59.250+00:00"
      }
    ],
    totalLikes: 2,
    canLike: true
  }
]
Here is what I tried:
Resolver:
1) Tried in Mongoose call - Fails
const resources = await model.aggregate([
  {
    $match: { category: "5c90a0777952597cda9e9c8d" },
    $addFields: {
      totalLikes: {
        $size: {
          $filter: {
            input: "$likes",
            as: "el",
            cond: "$$el.user"
          }
        }
      }
    },
    $addFields: {
      canLike: {
        $match: {
          'likes.user': "5c8f9e676ed4356b1de3eaa1"
        }
      }
    }
  }
])
2) Tried to change it after db call - works but not preferred solution
model.where({ competition: "5c90a0777952597cda9e9c8d" }).exec(function (err, records) {
  resources = records.map(resource => {
    resource.likes = resource.likes ? resource.likes : []
    const included = resource.likes.find(like => {
      return like.user === "5c8f9e676ed4356b1de3eaa1";
    });
    resource.set('totalLikes', resource.likes.length, { strict: false });
    resource.set('canLike', !included, { strict: false });
    return resource
  });
})
Does anyone know how I can do this at query time? Thanks!
You can achieve it using aggregate:
Model.aggregate()
  .addFields({ // map likes so that it results in an array of user ids
    likesMap: {
      $map: {
        input: "$likes",
        as: "like",
        in: "$$like.user"
      }
    }
  })
  .addFields({ // canLike is false when the id is already present in likesMap
    canLike: {
      $cond: [
        {
          $in: ["5c8f9e676ed4356b1de3eaa1", "$likesMap"]
        },
        false,
        true
      ]
    },
    totalLikes: {
      $size: "$likes"
    }
  })
  .project({ // remove likesMap
    likesMap: 0,
  })
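To scope this to the category from the question, a $match can be prepended. A sketch using the same model variable as in the question (currentUserId is a hypothetical variable standing in for the hard-coded id):
// Hypothetical variable for the hard-coded user id
const currentUserId = "5c8f9e676ed4356b1de3eaa1";

const resources = await model.aggregate([
  { $match: { category: new mongoose.Types.ObjectId("5c90a0777952597cda9e9c8d") } },
  {
    $addFields: {
      totalLikes: { $size: "$likes" },
      // canLike is true only when the user has not liked the image yet
      canLike: { $not: [{ $in: [currentUserId, "$likes.user"] }] }
    }
  }
]);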