Transform data on mongo query - mongodb

I have a collection that has an array field and a string date field. How do I transform my mongo data that looks like:
{"d" : [ 1, 2, 3, 4, 5, 6, 7 ], "date" : "21-10-2020" }
to
{"21-10-2020" : [ 1, 2, 3, 4, 5, 6, 7 ] }
using a query?
Is there a way to do this transform?

You can try:
$arrayToObject converts an array of { k: key, v: value } pairs into an object
$replaceWith replaces the document's root with that object
db.collection.aggregate([
  {
    $replaceWith: {
      $arrayToObject: [
        [{ k: "$date", v: "$d" }]
      ]
    }
  }
])
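Note that $replaceWith requires MongoDB 4.2. On 3.4.4+ (where $arrayToObject is also available) the same transform can be written with $replaceRoot; a rough equivalent of the pipeline above:
db.collection.aggregate([
  {
    $replaceRoot: {
      newRoot: {
        $arrayToObject: [
          [{ k: "$date", v: "$d" }]
        ]
      }
    }
  }
])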

Related

Get the set of all unique values in array field

Given the following documents:
{ "_id" : ObjectId("585901b7875bab86885cf54f"), "foo" : 24, "bar" : [ 1, 2, 5, 6 ] }
{ "_id" : ObjectId("585901be875bab86885cf550"), "foo" : 42, "bar" : [ 3, 4 ] }
I want to get all the unique values in the bar field, something like:
{"_id": "something", "bar": [1, 2, 3, 4, 5, 6]}
This is what I tried:
db.stuff.aggregate([{
  $group: {
    _id: null,
    bar: {
      $addToSet: { $each: "$bar" }
    }
  }
}])
But it complains that $each is not a recognized operator.
This does work:
db.stuff.aggregate([{
  $group: {
    _id: null,
    bar: {
      $addToSet: "$bar"
    }
  }
}])
But it obviously produces the wrong result:
{ "_id" : null, "bar" : [ [ 3, 4 ], [ 1, 2, 5, 6 ] ] }
EDIT
I managed to get the result I want by adding a $unwind stage first:
db.stuff.aggregate([
  { $unwind: "$bar" },
  {
    $group: {
      _id: null,
      bar: {
        $addToSet: "$bar"
      }
    }
  }
])
=> { "_id" : null, "bar" : [ 4, 3, 5, 2, 6, 1 ] }
Is it possible at all to make it in one single pipeline stage?
The distinct() method works with array fields as well, so it will handle this beautifully.
db.stuff.distinct('bar')
The aggregation framework is overkill for this and will not perform as well.
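For the two sample documents above, a sketch of what the shell call and its result would look like (the order of the returned values is not guaranteed):
db.stuff.distinct('bar')
// => [ 1, 2, 3, 4, 5, 6 ]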

Search Exact Array Values In Multiple Fields

I have a collection which has 3 documents like below:
Collection:
{
name: "A",
arr: [1, 2, 3],
arr1: [4, 5, 6]
},
{
name: "B",
arr: [3, 7, 11],
arr1: [5, 6, 9]
},
{
name: "C",
arr: [3, 4, 5],
arr1: [7, 9, 12]
}
I want to search for the array below in the collection, but every value in the array must be matched in the fields "arr" or "arr1". I mean the values can be in either field, but all of them must be somewhere in the document.
So when I search for the array in the collection, only the second document (name: "B") and the third document (name: "C") should be in the result.
In the second document, the first array value (3) is in the "arr" field and the second and third values (5 and 9) are in the "arr1" field. In the third document, the first and second values (3, 5) are in the "arr" field and the third value (9) is in the "arr1" field.
Array : [3, 5, 9]
Can you help me?
The best way to do this is using the $redact operator.
db.collection.aggregate([
  { "$redact": {
    "$cond": [
      { "$setIsSubset": [ [3, 5, 9], { "$setUnion": [ "$arr", "$arr1" ] } ] },
      "$$KEEP",
      "$$PRUNE"
    ]
  }}
])
You can also use $project with the $setUnion operator, followed by $match.
db.collection.aggregate([
{ "$project": { "name": 1, "arr": 1, "arr1": 1, "allvalues": { "$setUnion": [ "$arr", "$arr1" ]}}},
{ "$match": { "allvalues": { "$all": [3, 5, 9] }}}
])
Output:
{ "_id" : ObjectId("55d48fd2939d0f7d372d6dbe"), "name" : "B", "arr" : [ 3, 7, 11 ], "arr1" : [ 5, 6, 9 ] }
{ "_id" : ObjectId("55d48fd2939d0f7d372d6dbf"), "name" : "C", "arr" : [ 3, 4, 5 ], "arr1" : [ 7, 9, 12 ] }

MongoDB - Using find() to get values in an array

My collection contains documents of the following form:
{
'game_name': 'football',
'scores': [4,1,2,7,6,0,5]
}
How do I find the 'scores' of all such objects in the collection and sort them in ascending order?
If I understand your question:
db.yourcollection.aggregate([
  {
    $unwind: "$scores"
  },
  {
    $sort: {
      "scores": 1
    }
  },
  {
    $group: {
      _id: null,
      scores: { $addToSet: "$scores" }
    }
  },
  {
    $project: {
      _id: 0,
      scores: 1
    }
  }
])
Result is:
{
"result" : [
{
"scores" : [
7,
5,
4,
6,
2,
1,
0
]
}
],
"ok" : 1
}
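Note that $addToSet does not preserve the order of the documents coming out of the $sort stage, which is why the scores above are not ascending. Replacing $addToSet with $push keeps the sorted order (a sketch; $push also keeps duplicates, unlike $addToSet):
db.yourcollection.aggregate([
  { $unwind: "$scores" },
  { $sort: { "scores": 1 } },
  { $group: { _id: null, scores: { $push: "$scores" } } },
  { $project: { _id: 0, scores: 1 } }
])
// => { "scores" : [ 0, 1, 2, 4, 5, 6, 7 ] } for the single sample document above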

mongodb aggregation pipeline, fold an array property in `$project`ion

Given a collection of documents that each has an array property ks:
{
_id: ObjectId('...'),
ks: [4, 3, 2, 1, 3],
v: 45
},
{
_id: ObjectId('...'),
ks: [3, 3, 5],
v: 21
},
{
_id: ObjectId('...'),
ks: [1, 5, 2, 8, 9, 7],
v: 12
}
How can I aggregate this collection to a list using key = min ks or other fold functions?
[
{
_id: 1,
v: 28.5 // = mean [45, 12]
},
{
_id: 3,
v: 21 // = mean [21]
}
]
Grouping using the keyf function works:
keyf: function(d) { return d.ks.reduce(function(acc, a) { return acc < a ? acc : a; }); }
But is there a way to do this with aggregation pipeline?
It seems that you want the minimum ($min) value of ks as your aggregation key and the $avg of "v" for each minimum ks. You need to $unwind "ks" first.
You also need to $group your data twice: once to find the min of ks per document, and a second time to calculate the avg of v.
db.collection.aggregate([
// Unwind the array
{ "$unwind": "$ks" },
// Find the minimal key per document
{ "$group": {
"_id": "$_id",
"ks": { "$min": "$ks" },
"v": { "$first": "$v" }
}},
// Group with the average value
{ "$group": {
"_id": "$ks",
"v": { "$avg": "$v" }
}},
// Group does not sort results
{ "$sort": { "_id": 1 } }
])
Results in:
[
{
"_id" : 1,
"v" : 28.5
},
{
"_id" : 3,
"v" : 21
}
]
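Since MongoDB 3.2, $min can also take an array expression directly, so the $unwind and the first $group can be collapsed into a single $project. A sketch that should produce the same result:
db.collection.aggregate([
  // Minimal key per document, computed straight from the array
  { "$project": { "k": { "$min": "$ks" }, "v": 1 } },
  // Average v per minimal key
  { "$group": { "_id": "$k", "v": { "$avg": "$v" } } },
  { "$sort": { "_id": 1 } }
])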

Can I easily return all of the fields of a subdocument as fields in the top level document using the aggregation framework?

I have a document similar to the following, from which I want to return the sub-fields of the current top level field as the top level fields in every document of the results array:
{
field1: {
subfield1: {},
subfield2: [],
subfield3: 44,
subfield5: xyz
},
field2: {
othercontent: {}
}
}
I want the results of my aggregation query to return the following (the contents of field1 as the top level document):
{
subfield1: {},
subfield2: [],
subfield3: 44,
subfield5: xyz
}
Can this be done with $project and the aggregation framework without defining every sub field to return as a top level field?
You can use the $replaceRoot aggregation stage since MongoDB 3.4:
db.getCollection('sample').aggregate([
{
$replaceRoot: {newRoot: "$field1"}
}
])
Provides output as expected:
{
"subfield" : {},
"subfield2" : [],
"subfield3" : 44,
"subfield5" : "xyz"
}
It's generally hard to make MongoDB deal with ambiguous or parameterized json keys. I ran into a similar issue and the best solution was to modify the schema so that the members of the subdocument became elements in an array.
However, I think this will get you close to what you want (all code should run directly in the Mongo shell). Assuming you have documents like:
db.collection.insert({
"_id": "doc1",
"field1": {
"subfield1": {"key1": "value1"},
"subfield2": ["a", "b", "c"],
"subfield3": 1,
"subfield4": "a"
},
"field2": "other content"
})
db.collection.insert({
"_id": "doc2",
"field1": {
"subfield1": {"key2": "value2"},
"subfield2": [1, 2, 3],
"subfield3": 2,
"subfield4": "b"
},
"field2": "yet more content"
})
Then you can run an aggregation command that promotes the content of field1 while ignoring the rest of the document:
db.collection.aggregate({
"$group":{
"_id": "$_id",
"value": {"$push": "$field1"}
}})
This makes all the subfield* keys into top-level fields of an object, and that object is the only element in an array. It's clumsy, but workable:
"result" : [
{
"_id" : "doc2",
"value" : [
{
"subfield1" : {"key2" : "value2"},
"subfield2" : [1, 2, 3],
"subfield3" : 2,
"subfield4" : "b"
}
]
},
{
"_id" : "doc1",
"value" : [
{
"subfield1" : {"key1" : "value1"},
"subfield2" : ["a","b","c"],
"subfield3" : 1,
"subfield4" : "a"
}
]
}
],
"ok" : 1
Starting in Mongo 4.2, the $replaceWith aggregation stage can be used to replace a document with another (in our case with a sub-document), as syntactic sugar for $replaceRoot:
// { field1: { a: 1, b: 2, c: 3 }, field2: { d: 4, e: 5 } }
// { field1: { a: 6, b: 7 }, field2: { d: 8 } }
db.collection.aggregate({ $replaceWith: "$field1" })
// { a: 1, b: 2, c: 3 }
// { a: 6, b: 7 }
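If you also need to keep some of the original top-level fields (for example _id) next to the promoted sub-document, $replaceWith can be combined with $mergeObjects (available as an expression since MongoDB 3.6). A sketch, not taken from the answers above:
db.collection.aggregate({
  $replaceWith: { $mergeObjects: [ { _id: "$_id" }, "$field1" ] }
})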