I have a MongoDB collection of companies with the following structure:
"name": "CORESITE",
"isin": "us21870q1058",
"XDateInserted": "2020-09-16 14:19",
"XDateUpdated": "2020-10-09 14:38",
I wish to query all companies into an array, sorted by XDateUpdated. The following bit of code succeeds in sorting the result by name; however, sorting by XDateUpdated just returns a seemingly random order. Any solutions?
const allCompanies = await database
.get()
.db(Mongo.db)
.collection(Mongo.CC)
.find(
{
circulate: { $ne: false },
},
{
projection: {
_id: 0,
isin: 1,
name: 1,
XDateInserted: 1,
},
}
)
.sort({ name: -1 })
.toArray();
Here is a solution using aggregation.
You'll first have to convert the string date to a Date type with $dateFromString, then sort on it.
const allCompanies = await database
.get()
.db(Mongo.db)
.collection(Mongo.CC)
.aggregate([
{
$addFields: {
updatedDateFormatted: {
$dateFromString: {
dateString: "$XDateUpdated"
}
}
}
},
{
$sort: {
updatedDateFormatted: -1
}
},
{
$project: {
updatedDateFormatted: 0 // To hide the custom field from result
}
}
])
.toArray();
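As a side note, two things worth double-checking: field names in sort() are case-sensitive, so sorting on XdateUpdated instead of XDateUpdated leaves the order effectively arbitrary because no document has that field. And since the stored strings are zero-padded ("YYYY-MM-DD HH:mm"), a plain lexicographic sort on the string is already chronological, so a minimal sketch without aggregation (assuming the field is spelled exactly XDateUpdated) would be:
const allCompanies = await database
  .get()
  .db(Mongo.db)
  .collection(Mongo.CC)
  .find(
    { circulate: { $ne: false } },
    { projection: { _id: 0, isin: 1, name: 1, XDateUpdated: 1 } }
  )
  .sort({ XDateUpdated: -1 }) // string sort works here because the format is zero-padded
  .toArray();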
I'm trying to query specific fields in my document and sort them by one of the fields; however, the engine seems to completely ignore the sort.
I use the query:
db.symbols.find({_id:'AAPL'}, {'income_statement.annual.totalRevenue':1,'income_statement.annual.fiscalDateEnding':1}).sort({'income_statement.annual.totalRevenue': 1})
This is the output:
[
{
_id: 'AAPL',
income_statement: {
annual: [
{
fiscalDateEnding: '2021-09-30',
totalRevenue: '363172000000'
},
{
fiscalDateEnding: '2020-09-30',
totalRevenue: '271642000000'
},
{
fiscalDateEnding: '2019-09-30',
totalRevenue: '256598000000'
},
{
fiscalDateEnding: '2018-09-30',
totalRevenue: '265595000000'
},
{
fiscalDateEnding: '2017-09-30',
totalRevenue: '229234000000'
}
]
}
}
]
I would expect to have the entries sorted by fiscalDateEnding, starting with 2017-09-30 ascending.
However, the order is fixed, even if I use -1 for sorting.
Any ideas?
The sort you are using is for the ordering of documents in the result set. This is different from the ordering of array elements inside the document.
For your case, if you are using a newer version of MongoDB (5.2+), you can use the $sortArray operator.
db.symbols.aggregate([
{
$project: {
_id: 1,
annual: {
$sortArray: {
input: "$income_statement.annual",
sortBy: {
fiscalDateEnding: 1
}
}
}
}
}
])
If you are using an older version of MongoDB, you can do the following to perform the sorting.
db.collection.aggregate([
{
"$unwind": "$income_statement.annual"
},
{
$sort: {
"income_statement.annual.fiscalDateEnding": 1
}
},
{
$group: {
_id: "$_id",
annual: {
$push: "$income_statement.annual"
}
}
},
{
"$project": {
_id: 1,
income_statement: {
annual: "$annual"
}
}
}
])
Here is the Mongo Playground for your reference.
I was trying to migrate a large MongoDB collection of ~600k documents, like so:
for await (const doc of db.collection('collection').find({
legacyProp: { $exists: true },
})) {
// additional data fetching from separate collections here
const newPropValue = await fetchNewPropValue(doc._id)
await db.collection('collection').findOneAndUpdate({ _id: doc._id }, [{ $set: { newProp: newPropValue } }, { $unset: ['legacyProp'] }])
}
When the migration script finished, data was still being updated for about 30 minutes or so. I concluded this by computing the count of documents that still contain the legacyProp property:
db.collection.countDocuments({ legacyProp: { $exists: true } })
which kept decreasing on subsequent calls. After a while, the updates stopped and the final count of documents containing legacyProp was around 300k, so the update failed silently, resulting in data loss. I'm curious what exactly happened and, most importantly, how do you update large MongoDB collections without any data loss? Keep in mind, there is additional data fetching involved before every update operation.
My first attempt would be to express the fetchNewPropValue() logic directly in an aggregation pipeline.
Have a look at the Aggregation Pipeline Operators.
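For example, if fetchNewPropValue() only reads a value from some other collection by a shared key, the whole migration could run server-side with $lookup and $merge. This is only a hypothetical sketch: the otherCollection name and the refId/value fields are assumptions about what fetchNewPropValue() actually does, and merging back into the same collection needs MongoDB 4.4+.
db.getCollection('collection').aggregate([
  { $match: { legacyProp: { $exists: true } } },
  {
    $lookup: {
      from: "otherCollection",   // assumption: where fetchNewPropValue() reads from
      localField: "_id",         // assumption: the shared key
      foreignField: "refId",     // assumption
      as: "newPropDocs"
    }
  },
  {
    $set: {
      newProp: { $first: "$newPropDocs.value" },  // assumption: the looked-up field is called "value"
      legacyProp: "$$REMOVE",
      newPropDocs: "$$REMOVE"
    }
  },
  { $merge: { into: "collection", on: "_id", whenMatched: "merge" } }
])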
If this is not possible, then you can try to put all newPropValues into an array and use it like this. 600k values should fit easily into your RAM.
const newPropValues = await fetchNewPropValue() // getting all new properties as array [{_id: ..., val: ...}, {_id: ..., val: ...}, ...]
db.getCollection('collection').updateMany(
{ legacyProp: { $exists: true } },
[
{
$set: {
newProp: {
$first: {
$filter: { input: newPropValues, cond: { $eq: ["$_id", "$$this._id"] } }
}
}
}
},
{ $set: { legacyProp: "$$REMOVE", newProp: "$newProp.val" } } // keep only the looked-up value and drop legacyProp
]
)
Or you can try bulkWrite:
let bulkOperations = []
const cursor = db.getCollection('collection').find({ legacyProp: { $exists: true } })
while (cursor.hasNext()) { // a cursor loop, because forEach would not await an async callback
const doc = cursor.next()
const newPropValue = await fetchNewPropValue(doc._id);
bulkOperations.push({
updateOne: {
filter: { _id: doc._id },
update: {
$set: { newProp: newPropValue },
$unset: { legacyProp: "" }
}
}
});
if (bulkOperations.length > 10000) {
db.getCollection('collection').bulkWrite(bulkOperations, { ordered: false });
bulkOperations = [];
}
}
if (bulkOperations.length > 0)
db.getCollection('collection').bulkWrite(bulkOperations, { ordered: false })
I'm trying to analyse some data, and I thought my queries would ultimately be faster if I stored a relationship between my collections instead. So I wrote something to do the data normalisation, which is as follows:
var count = 0;
db.Interest.find({'PersonID':{$exists: false}, 'Data.DateOfBirth': {$ne: null}})
.toArray()
.forEach(function (x) {
if (null != x.Data.DateOfBirth) {
var peep = { 'Name': x.Data.Name, 'BirthMonth' :x.Data.DateOfBirth.Month, 'BirthYear' :x.Data.DateOfBirth.Year};
var person = db.People.findOne(peep);
if (null == person) {
peep._id = db.People.insertOne(peep).insertedId;
//print(peep._id);
}
db.Interest.updateOne({ '_id': x._id }, {$set: { 'PersonID':peep._id }})
++count;
if ((count % 1000) == 0) {
print(count + ' updated');
}
}
})
This script is just passed to mongo.exe.
Basically, I attempt to find an existing person and, if they don't exist, create them. In either case, I link the originating record to the individual person.
However this is very slow! There's about 10 million documents and at the current rate it will take about 5 days to complete.
Can I speed this up simply? I know I can multithread it to cut it down, but have I missed something?
To insert new persons into the People collection, use this pipeline:
db.Interest.aggregate([
{
$project: {
Name: "$Data.Name",
BirthMonth: "$Data.DateOfBirth.Month",
BirthYear: "$Data.DateOfBirth.Year",
_id: 0
}
},
{
$merge: {
into: "People",
// requires a unique index on {Name: 1, BirthMonth: 1, BirthYear: 1}
on: ["Name", "BirthMonth", "BirthYear"]
}
}
])
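The unique index referenced in the comment has to exist before the $merge runs (the on fields of $merge require it), e.g.:
db.People.createIndex({ Name: 1, BirthMonth: 1, BirthYear: 1 }, { unique: true })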
To update PersonID in the Interest collection, use this pipeline:
db.Interest.aggregate([
{
$lookup: {
from: "People",
let: {
name: "$Data.Name",
month: "$Data.DateOfBirth.Month",
year: "$Data.DateOfBirth.Year"
},
pipeline: [
{
$match: {
$expr: {
$and: [
{ $eq: ["$Name", "$$name"] },
{ $eq: ["$BirthMonth", "$$month"] },
{ $eq: ["$BirthYear", "$$year"] }
]
}
}
},
{ $project: { _id: 1 } }
],
as: "interests"
}
},
{
$set: {
PersonID: { $first: "$interests._id" },
interests: "$$REMOVE"
}
},
{ $merge: { into: "Interest" } }
])
Mongo Playground
I have a Mongoose collection called Track that has an array of fitnessPlan subdocuments, each of which currently has a month field that needs to be changed to week in production. I am using mongoose-migrate to migrate these values from the old month field to a new week field. Here's what I have got at the moment:
async function up () {
await Track.updateMany({},
{
$set: {
'fitnessPlans.$[elem].month': '$fitnessPlans.$[elem].week',
},
},
{ arrayFilters: [{ "elem.week": { $gte: 0 } }], strict: false, });
await Track.updateMany({},
{
$unset: {
'fitnessPlans.$[elem].week': '',
},
},
{ arrayFilters: [{ "elem.week": { $gte: 0 } }], strict: false, });
}
However, mongoose-migrate is throwing the following error:
Cast to number failed for value "$fitnessPlans.$[elem].week" at path "month"
I'm guessing this is because the string isn't evaluating correctly, but I'm not sure how else to reference that field's value in this setting.
Try an update with an aggregation pipeline, available starting from MongoDB 4.2:
$map to iterate over the fitnessPlans array, using $mergeObjects to merge each element with a newly created week field (copied from month)
$unset to remove the month field
async function up () {
await Track.updateMany({},
[
{
$set: {
fitnessPlans: {
$map: {
input: "$fitnessPlans",
in: {
$mergeObjects: ["$$this", { week: "$$this.month" }]
}
}
}
}
},
{ $unset: "fitnessPlans.month" }
],
{ strict: false });
}
Playground
I am getting in an array of data and then want to insert it into my MongoDB. I want to overwrite any duplicate values with the new ones (the ones I'm uploading), and if there are no duplicates, just add them onto the current array.
I currently have:
db.cases.updateOne(
{ companyID: 218 },
{
$addToSet: {
cases: [AN ARRAY OF CASES]
},
$currentDate: { lastModified: true }
})
The collection has multiple companies, and each has an array of cases (see the image below).
The other thing that doesn't seem to work is that lastModified ($currentDate) doesn't change whenever I update the cases; I'm not sure if that's down to the way I have written the query?
Thank you.
You can use $addToSet (https://docs.mongodb.com/manual/reference/operator/update/addToSet/) together with the filtered positional operator $[<identifier>] (https://docs.mongodb.com/manual/reference/operator/update/positional-filtered/) to accomplish this.
I scripted an example in Node.js to show you what I mean:
const { MongoClient } = require('mongodb');
async function main() {
/**
* Connection URI. Update <username>, <password>, and <your-cluster-url> to reflect your cluster.
*/
const uri = "mongodb+srv://<username>:<password>@<your-cluster-url>?retryWrites=true&w=majority";
/**
* The Mongo Client you will use to interact with your database
*/
const client = new MongoClient(uri, { useUnifiedTopology: true });
try {
// Connect to the MongoDB cluster
await client.connect();
// Make the appropriate DB calls
await updateArray(client, 'id1')
} finally {
// Close the connection to the MongoDB cluster
await client.close();
}
}
main().catch(console.error);
async function updateArray(client, id) {
const docId = "299";
const mynewdoc = {
id: docId,
priority: 5,
casenumber: 40,
new: "field"
}
const result = await client.db("NameOfYourDb").collection("NameOfYourCollection").updateOne(
{ _id: id },
{
$set: {
"cases.$[element]": mynewdoc
}
},
{
arrayFilters: [{ "element.id": docId }]
}
)
if (result) {
console.log(result);
} else {
console.log(`Not found '${id}'`);
}
const result2 = await client.db("NameOfYourDb").collection("NameOfYourCollection").updateOne(
{ _id: id },
{
$addToSet: {
"cases": mynewdoc
}
}
)
if (result2) {
console.log(result2);
} else {
console.log(`Not found '${id}'`);
}
}
See https://www.mongodb.com/blog/post/quick-start-nodejs-mongodb--how-to-get-connected-to-your-database for an explanation of how the Node.js code is structured.
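Regarding the lastModified part of the question: $currentDate can sit alongside the other operators in the same update document, so as a sketch the first updateOne above could become:
const result = await client.db("NameOfYourDb").collection("NameOfYourCollection").updateOne(
  { _id: id },
  {
    $set: { "cases.$[element]": mynewdoc },
    $currentDate: { lastModified: true } // bump lastModified on every update, as in the original query
  },
  {
    arrayFilters: [{ "element.id": docId }]
  }
)
The same $currentDate clause can be added to the $addToSet call.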
So I have figured out a workaround, but it is not exactly what I wanted - maybe someone can use this and expand upon it to make it work better:
db.cases.aggregate([
{ $match: { 'companyID': 218 }},
{ $unwind: '$cases' },
{ $group: { "_id": "$cases.casenumber", cases: {$push:'$cases'}, "count": { $sum:1 }}},
{ $match: { "count": { "$gt": 1 }}},
{ $project: { "cases": { $slice: [ "$cases", -1, { $subtract: [ { $size: "$cases" }, 1 ]}] }}},
{ $project: { "cases": { $arrayElemAt: ['$cases', 0] }}},
{ $group: { _id: 1, cases: {$push:'$cases' }}},
{ $out: 'cases' }
])
The only problem is that using $out overwrites the whole cases collection, so I need to figure out a way to write the result back into the cases array inside the company document (companyID 218).
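One direction that might get around this (an untested sketch): replace the final $group and $out stages with $merge, which updates only the matched company document instead of replacing the collection. It assumes a unique index on companyID and MongoDB 4.4+, which is needed to $merge back into the collection being aggregated:
// ...same stages as above up to the $arrayElemAt projection, then:
{ $group: { _id: null, cases: { $push: '$cases' } } },
{ $set: { companyID: 218 } },
{ $project: { _id: 0 } },
{ $merge: { into: 'cases', on: 'companyID', whenMatched: 'merge', whenNotMatched: 'discard' } }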
Hopefully this can help someone though.
Thanks.