MongoDB: use an array field's element to $set a new field of the document

In the database, I have documents like the following
Ticket {
"eventHistory": [
{
"event": "CREATED",
"timestamp": "aa-bb-cccc"
},
{
"event": "ASSIGNED",
"timestamp": "ii-jj-kkkk"
},
...
{
"event": "CLOSED",
"timestamp": "xx-yy-zzzz"
}
]
}
I would like to add a closedAt field to the relevant Tickets, getting the value from the eventHistory array's last element. The resultant document would look like the following
Ticket {
"eventHistory": [
{
"event": "CREATED",
"timestamp": "aa-bb-cccc"
},
{
"event": "ASSIGNED",
"timestamp": "ii-jj-kkkk"
},
...
{
"event": "CLOSED",
"timestamp": "xx-yy-zzzz"
}
],
"closedAt": "xx-yy-zzzz"
}
The following pipeline allows me to use the entire object that's present as the eventHistory array's last element.
db.collection.updateMany(
  <query>,
  [
    {
      "$set": {
        "closedAt": {
          "$arrayElemAt": [
            "$eventHistory",
            -1
          ]
        }
      }
    }
  ],
  ...
)
But I want to use only the timestamp field, not the entire object.
Please help me adjust (and/or improve) the pipeline.

One option to fix your query is:
db.collection.updateMany(
  <query>,
  [
    {
      $set: {
        "closedAt": {
          $last: "$eventHistory.timestamp"
        }
      }
    }
  ])
See how it works on the playground example
But note that this assumes the last item is the CLOSED event. Is that necessarily the case? If not, you can validate it first.
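For example, a minimal sketch (assuming MongoDB 4.4+ for $last, and eventHistory at the top level as shown above) that only touches tickets whose last event really is CLOSED:
db.collection.updateMany(
  { $expr: { $eq: [ { $last: "$eventHistory.event" }, "CLOSED" ] } },   // validate that the last event is CLOSED
  [
    {
      $set: {
        "closedAt": { $last: "$eventHistory.timestamp" }
      }
    }
  ]
)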

MongoDB Rust Driver weird behavior

There is this weird thing:
I have installed MongoDB Compass and made an aggregation query that works in the Aggregation tab, but when I use the same query in my Rust web server it behaves very weirdly.
Original message:
{"_id":{"$oid":"61efd41c56ffe6b1b4a15c7a"},"time":{"$date":"2022-01-25T10:42:36.175Z"},"edited_time":{"$date":"2022-01-30T14:29:54.361Z"},"changes":[],"content":"LORA","author":{"$oid":"61df3cab3087579f8767a38d"}}
Message in MongoDB Compass after the query:
{
"_id": {
"$oid": "61efd41c56ffe6b1b4a15c7a"
},
"time": {
"$date": "2022-01-25T10:42:36.175Z"
},
"edited_time": {
"$date": "2021-12-17T09:55:45.856Z"
},
"changes": [{
"time": {
"$date": "2021-12-17T09:55:45.856Z"
},
"change": {
"ChangedContent": "LORA"
}
}],
"content": "LMAO",
"author": {
"$oid": "61df3cab3087579f8767a38d"
}
}
Message after the web server's query:
{
"_id": {
"$oid": "61efd41c56ffe6b1b4a15c7a"
},
"time": {
"$date": "2022-01-25T10:42:36.175Z"
},
"edited_time": {
"$date": "2022-01-30T14:40:57.152Z"
},
"changes": {
"$concatArrays": ["$changes", [{
"time": {
"$date": "2022-01-30T14:40:57.152Z"
},
"change": {
"ChangedContent": "$content"
}
}]]
},
"content": "LMAO",
"author": {
"$oid": "61df3cab3087579f8767a38d"
}
}
Pure query in MongoDB Compass:
$set stage
{
"changes": { $concatArrays: [ "$changes", [ { "time": ISODate('2021-12-17T09:55:45.856+00:00'), "change": { "ChangedContent": "$content" } } ] ] },
"edited_time": ISODate('2021-12-17T09:55:45.856+00:00'),
"content": "LMAO",
}
Pure query in the web server:
let update_doc = doc! {
"$set": {
"changes": {
"$concatArrays": [
"$changes", [
{
"time": now,
"change": {
"ChangedContent": "$content"
}
}
]
]
},
"edited_time": now,
"content": content
}
};
I am using the update_one method, like this:
messages.update_one(message_filter, update_doc, None).await?;
I don't really understand this, and it happens often; sometimes it fixes itself when I randomly add some extra scope in the doc, e.g. { }, but this time I couldn't figure it out.
I also had a version of the query with $push, but that didn't work either.
Is there some fault in the Rust driver, or am I doing something wrong? Are there some rules about formatting when using the Rust driver that I am missing?
The $set aggregation pipeline stage is different from the $set update operator. And the key difference, as far as I can tell, is that the pipeline stage evaluates aggregation expressions such as $concatArrays, while the update operator does not.
$set Aggregation Pipeline Stage
$set appends new fields to existing documents. You can include one or more $set stages in an aggregation operation.
To add a field or fields to embedded documents (including documents in arrays), use the dot notation.
To add an element to an existing array field with $set, use with $concatArrays.
$set Update Operator
Starting in MongoDB 5.0, update operators process document fields with
string-based names in lexicographic order. Fields with numeric names
are processed in numeric order.
If the field does not exist, $set will add a new field with the
specified value, provided that the new field does not violate a type
constraint. If you specify a dotted path for a non-existent field,
$set will create the embedded documents as needed to fulfill the
dotted path to the field.
If you specify multiple field-value pairs, $set will update or create
each field.
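To see the difference concretely, here is a minimal mongosh sketch (the note field is made up for illustration). The first call passes an update document, so $set is the update operator and the $concatArrays value is stored verbatim, which is exactly the "weird" result shown above; the second call wraps the stage in [ ], making it a pipeline update, so the expression is actually evaluated:
// Update document: the $concatArrays object is treated as a literal value to store
db.messages.updateOne(
  { _id: ObjectId("61efd41c56ffe6b1b4a15c7a") },
  { "$set": { "changes": { "$concatArrays": ["$changes", [{ "note": "x" }]] } } }
)
// Pipeline update (note the [ ]): $set is the aggregation stage,
// so $concatArrays and "$changes" are evaluated against the document
db.messages.updateOne(
  { _id: ObjectId("61efd41c56ffe6b1b4a15c7a") },
  [ { "$set": { "changes": { "$concatArrays": ["$changes", [{ "note": "x" }]] } } } ]
)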
So if you want to update an existing document by inserting elements into an array field, use the $push update operator (potentially with $each if you're inserting multiple elements):
let update_doc = doc! {
"$set": {
"edited_time": now,
"content": content
},
"$push": {
"changes": {
"time": now,
"change": {
"ChangedContent": "$content"
}
}
}
};
Edit: I missed that $content was supposed to be mapped from the existing field as well. That is not supported by an update document; however, MongoDB supports using an aggregation pipeline to update the document. See: Update MongoDB field using value of another field. So you can use the original $set, just in a different way:
let update_pipeline = vec![
doc! {
"$set": {
"changes": {
"$concatArrays": [
"$changes", [
{
"time": now,
"change": {
"ChangedContent": "$content"
}
}
]
]
},
"edited_time": now,
"content": content
}
}
];
messages.update_one(message_filter, update_pipeline, None).await?;

Push document as child and update timestamp in parent document

I have a document like the following:
{
"_id": ObjectId("507f191e810c19729de860ea"),
"updateTimestamp": Date(1234567),
"actions": [
{
"timestamp": Date(123456)
"action": "FIRE"
},
{
"timestamp": Date(1234567)
"action": "HIDE"
}
]
}
How can I add a child document to the actions array and update the updateTimestamp field of the parent document at the same time? The idea is that I can later sort such documents by their most recent action activity, as reflected by the updateTimestamp field.
As simple as:
db.collection.updateOne(
{ _id: '[id here]'},
{
$set: {
'updateTimestamp': '[timestampHere]'
},
$push: {
'actions': [new record here]
}
}
);
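For the sample document above, a concrete sketch might look like this (the new action values are made up for illustration; compute the timestamp once so both fields get the same value):
const now = new Date();
db.collection.updateOne(
  { _id: ObjectId("507f191e810c19729de860ea") },
  {
    $set: { updateTimestamp: now },                          // keep the parent timestamp current
    $push: { actions: { timestamp: now, action: "FIRE" } }   // append the new child document
  }
);
You can then sort on the parent field, e.g. db.collection.find().sort({ updateTimestamp: -1 }).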

Updating a nested Array using UpdateOne()

I'm having an issue updating a nested array in a document. Reading around the topic I've come across various methods, one of which I've tweaked below; however, nothing seems to work for me!
I'm trying to update the field systemUpdate_DT, which sits in a parent array called List and a child array called customData. I'm referring to the object in the child array using the _id key of the parent array element and the field_id key in the child array.
How do I update the systemUpdate_DT of the respective object?
Live Example: https://mongoplayground.net/p/453OFPOQqBp
A document in the collection looks like:
[
{
"_id": "6032a5ad80443334a35f2232",
"List": [
{
"_id": "6032a5af80443334a35f2234",
"customData": [
{
"_id": "6032a5bc80443334a35f223c",
"systemUpdate_DT": null,
"field_id": "6032a5bc80443334a35f223b"
},
{
"_id": "6032a5c280443334a35f223e",
"systemUpdate_DT": null,
"field_id": "6032a5c280443334a35f223d"
}
]
},
{
"_id": "6032a5b080443334a35f2236",
"customData": [
{
"_id": "6032a5bc80443334a35f223c",
"systemUpdate_DT": null,
"field_id": "6032a5bc80443334a35f223b"
},
{
"_id": "6032a5c280443334a35f223e",
"systemUpdate_DT": null,
"field_id": "6032a5c280443334a35f223d"
}
]
}
]
}
]
My Update Query looks like:
db.collection.updateOne({
  "List._id": mongodb.ObjectId("6032a5af80443334a35f2234"),
  "List.customData.field_id": mongodb.ObjectId("6032a5bc80443334a35f223b")
},
{
  $set: {
    "List.$.customData.systemUpdate_DT": 'updatedDTTM'
  }
})
As there are two nested arrays in your document, you can't set the field with the classic positional operator $.
Instead, you should use the arrayFilters option, like this:
db.collection.update({
"_id": ObjectId("6032a5ad80443334a35f2232")
},
{
$set: {
"List.$[list].customData.$[customData].systemUpdate_DT": "updatedDTTM"
}
},
{
"multi": false,
"upsert": false,
arrayFilters: [
{
"list._id": {
"$eq": ObjectId("6032a5af80443334a35f2234")
}
},
{
"customData._id": {
"$eq": ObjectId("6032a5bc80443334a35f223c")
}
}
]
})
try it online: mongoplayground.net/p/fb_86rNUKvt
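If you would rather match the child element by its field_id, as in the original attempt, here is a sketch using updateOne (same idea, only the second array filter changes):
db.collection.updateOne(
  { "_id": ObjectId("6032a5ad80443334a35f2232") },
  {
    $set: {
      "List.$[list].customData.$[customData].systemUpdate_DT": "updatedDTTM"
    }
  },
  {
    arrayFilters: [
      { "list._id": ObjectId("6032a5af80443334a35f2234") },
      { "customData.field_id": ObjectId("6032a5bc80443334a35f223b") }   // match by field_id instead of _id
    ]
  }
)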

Move data from inside nested array

I have inserted multiple documents in my Mongo database incorrectly. I have accidentally nested the data inside another data object:
{
"_id": "5cdfda8ddc5cf00031fd3949",
"payload": {
"timestamp": "2019-05-18T10:12:29.896Z",
"data": {
"data": {
"name": 10,
"age": 10,
}
}
},
"__v": 0
}
I would like the document to not have the extra data object. So I would like it to look like this:
{
"_id": "5cdfda8ddc5cf00031fd3949",
"payload": {
"timestamp": "2019-05-18T10:12:29.896Z",
"data": {
"name": 10,
"age": 10,
}
},
"__v": 0
}
Is there a way in Mongo for me to update all the documents that have two data objects so that they have just one, as shown above?
Alas, with the classic update operators you cannot do this in one database request; you have to loop over all documents programmatically, set the new data and update them in the database.
You could use the aggregation framework, which won't let you update in place, but you could use the $out operator to write the results to a new collection, if that's an option.
db.collection.aggregate([
{
$project: {
__v : 1,
"payload.timestamp" : 1,
"payload.data" : "$payload.data.data"
},
},
{
"$out": "newCollection"
}
])
Or if you have a mixture of docs with correct format and docs with incorrect format, you can use the $cond operator to determine the correct output:
db.collection.aggregate([
{
$project: {
__v : 1,
"payload.timestamp" : 1,
"payload.data" : {
$cond: [
{ $ne : [ "$payload.data.data", undefined]},
"$payload.data.data",
"$payload.data"
]}
}
},
{
"$out": "newCollection"
}
])
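Note that on MongoDB 4.2 or newer, updates can also take an aggregation pipeline, so the same reshaping can be done in place without $out (a sketch, assuming the extra nesting is the only thing that needs fixing):
db.collection.updateMany(
  { "payload.data.data": { $exists: true } },   // only the incorrectly nested documents
  [
    { $set: { "payload.data": "$payload.data.data" } }
  ]
)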

MongoDB aggregation project onto collection

I have a problem with a huge MongoDB aggregation pipeline. I have many constraints and I've simplified the problem a lot, so please don't question the goal of this query.
I have a Mongo aggregation that gives something similar to this:
[
{
"content": {
"processes": [
{
"id": "101a",
"title": "delivery"
},
{
"id": "101b",
"title": "feedback"
}
]
}
}
]
To this intermediate result I'm forced to apply a $project stage in order to obtain something similar to this:
[
{
"results":
{
"titles": [
{
"id": "101a",
"value": "delivery"
},
{
"id": "101b",
"value": "feedback"
}
]
}
}
]
But applying these projections:
"results.titles.id": "$content.processes.id",
"results.titles.value": "$content.processes.title"
I obtain this:
[
  {
    "results": {
      "titles": {
        "id": ["101a", "101b"],
        "value": ["delivery", "feedback"]
      }
    }
  }
]
The arrays are created, but not in the proper position.
Is it possible to exploit some operator inside the $project stage to tell Mongo to create the array at the parent level?
Something like this:
"results.titles.$[x].value" : "$content.processes.value"
You can use the dot notation to project the entire array:
db.col.aggregate([
{
$project: {
"results.titles": "$content.processes"
}
}
])
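With the sample input above, that projection keeps the original field names, so each element still has title rather than value (output sketch):
[
  {
    "results": {
      "titles": [
        { "id": "101a", "title": "delivery" },
        { "id": "101b", "title": "feedback" }
      ]
    }
  }
]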
And if you need to rename title to value, then you have to apply the $map operator:
db.col.aggregate([
{
$project: {
"results.titles": {
$map: {
input: "$content.processes",
as: "process",
in: {
id: "$$process.id",
value: "$$process.title"
}
}
}
}
}
])