I am trying to add a new document to a mongo array and I require one of the fields to be the current timestamp. This is for field level versioning but I can't figure out how to combine $push and $currentDate to get the result I would like.
Can someone point me in the right direction?
db.tmp.adviceReportingJourney.update(
{ _id : "5525f99be4b041151d51386e5525f99be4b041151d513870" },
{
$push: {
"$currentDate": {
"Conversation1MeetingCreated" : {
"vid" : 4,
"ts" : {"$type": "timestamp"},
"data" : 1428552213559
}
}
}
}
)
You can use your programming language's Date.now() to add the current time ;-)
For example (the mongo shell speaks JavaScript, so Date.now() works there directly; note that an update operator such as $currentDate cannot be nested inside $push):
db.tmp.adviceReportingJourney.update(
    { _id : "5525f99be4b041151d51386e5525f99be4b041151d513870" },
    {
        $push: {
            "Conversation1MeetingCreated" : {
                "vid" : 4,
                "ts" : Date.now(),   // client-side current time in milliseconds
                "data" : 1428552213559
            }
        }
    }
)
UPDATE: When running MongoDB 3.0 or later you can use $currentDate. The $currentDate documentation shows that it only works with db.collection.update() and db.collection.findAndModify().
To update the embedded document "Conversation1MeetingCreated" (setting its timestamp along the way), use:
{
$currentDate: {
"Conversation1MeetingCreated.ts": { $type: "timestamp" }
},
$set: {
"Conversation1MeetingCreated.vid" : 4,
"Conversation1MeetingCreated.data": 1428552213559
}
}
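For completeness, the full update call could look like this (a sketch reusing the collection name and _id from the question):
db.tmp.adviceReportingJourney.update(
    { _id : "5525f99be4b041151d51386e5525f99be4b041151d513870" },
    {
        $currentDate: {
            "Conversation1MeetingCreated.ts": { $type: "timestamp" }
        },
        $set: {
            "Conversation1MeetingCreated.vid": 4,
            "Conversation1MeetingCreated.data": 1428552213559
        }
    }
)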
Hope it helps.
I have the following data (Cars):
[
    {
        "make" : "Ferrari",
        "model" : "F40",
        "services" : [
            {
                "type" : "FULL",
                "date_time" : ISODate("2019-10-31T09:00:00.000Z"),
            },
            {
                "type" : "FULL",
                "scheduled_date_time" : ISODate("2019-11-04T09:00:00.000Z"),
            }
        ],
    },
    {
        "make" : "BMW",
        "model" : "M3",
        "services" : [
            {
                "type" : "FULL",
                "scheduled_date_time" : ISODate("2019-10-31T09:00:00.000Z"),
            },
            {
                "type" : "FULL",
                "scheduled_date_time" : ISODate("2019-11-04T09:00:00.000Z"),
            }
        ],
    }
]
Using Spring Data MongoDB, I would like a query to retrieve all the Cars where the scheduled_date_time of the last item in the services array falls within a certain date range.
A query I used previously, which matches on the first item in the services array, looks like this:
mongoTemplate.find(Query.query(
where("services.0.scheduled_date_time").gte(fromDate)
.andOperator(
where("services.0.scheduled_date_time").lt(toDate))),
Car.class);
Note the 0 index, since that query targets the first item as opposed to the last one (my current requirement).
I thought using an aggregate along with a projection and .arrayElementAt(-1) would do the trick but I haven't quite got it to work. My current effort is:
Aggregation agg = newAggregation(
project().and("services").arrayElementAt(-1).as("currentService"),
match(where("currentService.scheduled_date_time").gte(fromDate)
.andOperator(where("currentService.scheduled_date_time").lt(toDate)))
);
AggregationResults<Car> results = mongoTemplate.aggregate(agg, Car.class, Car.class);
return results.getMappedResults();
Any help or suggestions appreciated. Thanks.
This mongo aggregation retrieves all the Cars where the scheduled_date_time of the last item in the services array falls within a specific date range.
[{
$addFields: {
last: {
$arrayElemAt: [
'$services',
-1
]
}
}
}, {
$match: {
'last.scheduled_date_time': {
$gte: ISODate('2019-10-26T04:06:27.307Z'),
$lt: ISODate('2019-12-15T04:06:27.319Z')
}
}
}]
I was trying to write it in spring-data-mongodb without luck.
They do not support $addFields yet, see here.
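A workaround sometimes used before native $addFields support is to supply the stage as a raw Document through a custom AggregationOperation. A sketch, assuming a spring-data-mongodb version whose AggregationOperation exposes toDocument(AggregationOperationContext); the variable names are illustrative only:
// assumes org.bson.Document, java.util.Arrays, and the usual static imports of
// Aggregation.newAggregation / Aggregation.match and Criteria.where
AggregationOperation addLastService = new AggregationOperation() {
    @Override
    public Document toDocument(AggregationOperationContext context) {
        // builds { $addFields: { last: { $arrayElemAt: [ "$services", -1 ] } } }
        return new Document("$addFields",
                new Document("last",
                        new Document("$arrayElemAt", Arrays.asList("$services", -1))));
    }
};

Aggregation agg = newAggregation(
        addLastService,
        match(where("last.scheduled_date_time").gte(fromDate).lt(toDate))
);
List<Car> cars = mongoTemplate.aggregate(agg, Car.class, Car.class).getMappedResults();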
Since version 2.2.0.RELEASE, spring-data-mongodb includes aggregation repository methods.
The above query should be
interface CarRepository extends MongoRepository<Car, String> {
    @Aggregation(pipeline = {
        "{ $addFields : { last: { $arrayElemAt: [ '$services', -1 ] } } }",
        "{ $match: { 'last.scheduled_date_time' : { $gte : '$?0', $lt: '$?1' } } }"
    })
    List<Car> getCarsWithLastServiceDateBetween(LocalDateTime start, LocalDateTime end);
}
This method logs this query
[{ "$addFields" : { "last" : { "$arrayElemAt" : ["$services", -1]}}}, { "$match" : { "last.scheduled_date_time" : { "$gte" : "$2019-11-03T03:00:00Z", "$lt" : "$2019-11-05T03:00:00Z"}}}]
The date parameters are not parsing correctly. I didn't spend much time making it work.
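The likely culprit is the leading $ on the placeholders, which turns the bound values into plain strings; dropping it so that ?0 and ?1 bind directly should work (an untested sketch):
@Aggregation(pipeline = {
    "{ $addFields : { last: { $arrayElemAt: [ '$services', -1 ] } } }",
    "{ $match: { 'last.scheduled_date_time' : { $gte : ?0, $lt: ?1 } } }"
})
List<Car> getCarsWithLastServiceDateBetween(LocalDateTime start, LocalDateTime end);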
If you want just the Car IDs, this could work:
public List<String> getCarsIdWithServicesDateBetween(LocalDateTime start, LocalDateTime end) {
return template.aggregate(newAggregation(
unwind("services"),
group("id").last("services.date").as("date"),
match(where("date").gte(start).lt(end))
), Car.class, Car.class)
.getMappedResults().stream()
.map(Car::getId)
.collect(Collectors.toList());
}
Query Log
[{ "$unwind" : "$services"}, { "$group" : { "_id" : "$_id", "date" : { "$last" : "$services.scheduled_date_time"}}}, { "$match" : { "date" : { "$gte" : { "$date" : 1572750000000}, "$lt" : { "$date" : 1572922800000}}}}]
I have a Mongo collection whose documents each contain a nested object describing which collections the document is in and when it was added. I would like to remove key-value pairs from that nested object based on a condition, e.g. is the value (a date) before 1-1-2016.
Example:
{
"_id" : ObjectId("581214940911ad3de98002db"),
"collections" : {
"c01" : ISODate("2016-10-27T15:52:04.512Z"),
"c02" : ISODate("2015-11-21T16:06:06.546Z")
}
}
needs to become
{
"_id" : ObjectId("581214940911ad3de98002db"),
"collections" : {
"c01" : ISODate("2016-10-27T15:52:04.512Z"),
}
}
One alternative would be to change the schema to something like this:
{
"_id" : ObjectId("581214940911ad3de98002db"),
"collections" : [
{
"id": "c01",
"date": ISODate("2016-10-27T15:52:04.512Z")
},
{
"id": "c02",
"date" : ISODate("2015-11-21T16:06:06.546Z")
}
]
}
in which case removing an entry from the array would be easy. I am a bit reluctant to do that because it would complicate some of the other queries I would like to support. Thanks!
I prefer the second structure for your schema
{
"_id" : ObjectId("581214940911ad3de98002db"),
"collections" : [
{
"id": "c01",
"date": ISODate("2016-10-27T15:52:04.512Z")
},
{
"id": "c02",
"date" : ISODate("2015-11-21T16:06:06.546Z")
}
]
}
then you are able to remove entries from collections like this:
db.collectionName.update(
    { }, // if you want, you can add a query for a specific document: { "_id": requestId }
    { $pull: { collections: { date: { $lt: yourDate } } } }, // yourDate should be a Date, e.g. new Date("2016-01-01")
    { multi: true }
)
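With the 1-1-2016 cutoff from the question, that could look like this (a sketch; collectionName is a placeholder):
var cutoff = ISODate("2016-01-01T00:00:00Z");
db.collectionName.update(
    { },                                                      // all documents
    { $pull: { collections: { date: { $lt: cutoff } } } },    // drop entries older than the cutoff
    { multi: true }
)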
If I have a document with the following basic structure:
{
...
Monday: { a:1, b:2 },
Tuesday: { c:3, d:4 }
...
}
Am I able to 'push' an additional key:value pair to Monday's value? Result would be:
{
Monday: { a:1, b:2, z:8 },
Tuesday: { c:3, d:4 }
...
}
The $push operator seems to only work for arrays.
Just do something like this:
db.foo.update({ "_id": ObjectId("...") }, { $set: { "Monday.z": 8 } })
How to add a new key:value pair to all existing documents of a MongoDB collection
Old Key and Value Pairs
> db.students.find().pretty();
{ "_id" : ObjectId("601594f5a22527655335415c"), "name" : "Doddanna" }
{ "_id" : ObjectId("601594f5a22527655335415d"), "name" : "Chawan" }
Update New Key and Value Pairs Using updateMany() and $set
> db.students.updateMany({},{$set:{newKey1:"newValue1", newKey2:"newValue2", newKeyN:"newValueN"}});
{ "acknowledged" : true, "matchedCount" : 2, "modifiedCount" : 2 }
Have a look at the updated pretty result
> db.students.find().pretty();
{
"_id" : ObjectId("601594f5a22527655335415c"),
"name" : "Doddanna",
"newKey1" : "newValue1",
"newKey2" : "newValue2",
"newKeyN" : "newValueN"
}
{
"_id" : ObjectId("601594f5a22527655335415d"),
"name" : "Chawan",
"newKey1" : "newValue1",
"newKey2" : "newValue2",
"newKeyN" : "newValueN"
}
I know this might be slightly beside the question but, as a matter of fact, I opened this page because I was looking for the exact same query with mongoose, so here is my answer using mongoose.
If we have a mongoose model (schema) named week in our JavaScript application, then the code will be:
// javascript with mongoose
...
const key = "z";
const KeyValue = 8;
await week.updateOne({
_id, // mongoDb document id
},
{
$set:{
[`Monday.${key}`]: KeyValue,
},
},
{
upsert: true // options
},
);
...
var json = {
Monday: { a:1, b:2 },
Tuesday: { c:3, d:4 } }
json['Monday']['z'] = 8;
console.log(json);
I am trying to run a mongo query to update the value of one field with the value of another field. I have the following documents:
{ "_id" : ObjectId("56e0a3a2d59feaa43fba49d5"), "old" : 16, "new" : 17 }
{ "_id" : ObjectId("56e0a3a2d59feaa43fba49d3"), "old" : 11, "new" : 12 }
I would like to make it look like this after update:
{ "_id" : ObjectId("56e0a3a2d59feaa43fba49d5"), "old" : 16, "new" : 16 }
{ "_id" : ObjectId("56e0a3a2d59feaa43fba49d3"), "old" : 11, "new" : 11 }
I've tried the following with no luck
db.runCommand(
{
findAndModify: "testData",
query: { $where: "this.new != this.old" },
update: { old : this.new },
upsert: true
}
)
and
db.testData.update( { $where: "this.new != this.old" }, { $set: { old: this.new } } );
Is this even possible with mongoDb?
I would like to do it in a single query and not iterate through each document.
Any ideas would be greatly appreciated.
Thank you
You can try something like this, iterating over the documents and copying the value of old into new for each one:
db.testData.find().forEach(function(elem) {
    db.testData.update(
        { "_id": elem._id },           // update the document we are currently looking at
        { $set: { new: elem.old } }
    );
});
You can't do that in MongoDB yet (note to visitors from the future: I'm referring to V3.2).
You have to iterate on the documents.
NB: there's a trick in case you don't mind deleting the old field: use $rename to rename old to new.
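A minimal sketch of that trick (note that old disappears and any existing new value is overwritten):
db.testData.update(
    { },
    { $rename: { "old": "new" } },   // moves the value of old into new and removes old
    { multi: true }
)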
NB2 for some SQLike-fu actions (not this one) the aggregation framework can be useful.
I have a document that includes a field like this:
{
...
log: [
{
utc_timestamp: ISODate("2014-11-15T10:26:47.337Z"),
type: "clicked"
},
{
utc_timestamp: ISODate("2014-10-15T16:12:51.959Z"),
type: "emailed"
},
{
utc_timestamp: ISODate("2014-10-15T16:10:51.959Z"),
type: "clicked"
},
{
utc_timestamp: ISODate("2014-09-15T04:59:19.431Z"),
type: "emailed"
},
{
utc_timestamp: ISODate("2014-09-15T04:58:19.431Z"),
type: "clicked"
},
],
...
}
How do I get the count of log entries of type "clicked" from this month, only if there is not a log entry of type "emailed" this month?
In other words, I want to find out which clicks have not been sent a related email.
So, in this example, the count would be 1 since the most recent "clicked" entry doesn't have an "emailed" entry.
Note: For this use case, clicks don't have unique IDs - this is all the data that is logged.
Use the following aggregation pipeline:
db.click_log.aggregate([
{ "$match" : { "log.type" : { "$ne" : "emailed" } } }, // get rid of docs with an "emailed" value in log.type and docs not from this month
{ "$unwind" : "$log" }, // unwind to get log elements as separate docs
{ "$project" : { "_id" : 1, "log" : 1, "month" : { "$month" : "$log.utc_timestamp" } } },
{ "$match" : { "log" : "clicked", "month" : <# of month> } }, // get rid of log elements not from this month and that aren't type clicked
{ "$group" : { "_id" : "$_id", "count" : { "$sum" : 1 } } } // collect clicked elements from same original doc and count number
])
This will return, for each document not having "emailed" as a value of log.type, the count of elements of the array log that have log.type value clicked and with timestamp from the current month. If you want a sliding 30-day period for month, change the $match to be a range query with $gt and $lt covering the desired time period.
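For example, a sliding 30-day window can be matched directly on the timestamp instead of the extracted month (a sketch; field and collection names follow the pipeline above):
var cutoff = new Date(Date.now() - 30 * 24 * 60 * 60 * 1000);   // 30 days ago
db.click_log.aggregate([
    { "$match" : { "log.type" : { "$ne" : "emailed" } } },
    { "$unwind" : "$log" },
    { "$match" : { "log.type" : "clicked", "log.utc_timestamp" : { "$gte" : cutoff } } },
    { "$group" : { "_id" : "$_id", "count" : { "$sum" : 1 } } }
])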
You can use a query similar to the one below.
db.dbversitydotcom_col.aggregate([
    { $unwind: "$log" },
    { $match: { "log.type" : "clicked", "log.utc_timestamp" : "your required date" } },
    { $sort: { "log.utc_timestamp" : -1.0 } },
    { $limit: 5.0 }
]).itcount()
Please refer to http://dbversity.com/mongodb-importance-of-aggregation-framework/ for a more detailed explanation.