MongoDB optimisation

Say I have a MongoDB collection with 5,000 records (it could be any number in the thousands). I want to update this collection with a new batch of records.
These new records may be identical to the existing ones, or a few of them may have changed.
To do this, I see two approaches:
1. Iterate over the new records and decide which existing ones to update (based on a condition). If the condition is truthy, update the record.
2. Delete all the existing records (deleteMany), then simply insert the new data.
Which one would be more performant and optimised, and why? Or is there another way?
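A third option worth knowing about is a single bulkWrite of upserts, which avoids both the per-record round trips of approach 1 and the full delete-and-reinsert of approach 2. A minimal sketch in the mongo shell, assuming the collection is called records and each incoming record carries a unique key field (both names are placeholders):

// newRecords is the incoming batch; "records" and "key" are assumed names.
const ops = newRecords.map(function (rec) {
    return {
        updateOne: {
            filter: { key: rec.key },   // match on the unique key
            update: { $set: rec },      // overwrite changed fields only
            upsert: true                // insert if no match exists yet
        }
    };
});
db.records.bulkWrite(ops);              // one round trip for the whole batch

Unchanged documents match their existing values, so only the modified ones produce real writes, and the collection's indexes stay intact (deleteMany followed by a reinsert rebuilds every index entry from scratch).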

Related

How to efficiently loop through a MongoDB collection in order to update a sequence column?

I am new to MongoDB/Mongoose and have a challenge I'm trying to solve in order to avoid a performance rabbit hole!
I have a MongoDB collection containing a numeric column called 'sequence'. After inserting a new document, I need to cycle through the collection starting at the position of the inserted document and increment the value of sequence by one. In this way I maintain a collection of documents numbered from 1 to n (where n is the number of documents in the collection), and can render the collection as a table in which newly inserted records appear in the right place.
Clearly one way to do this is to loop through the collection, doing a seq++ in each iteration, and then using Model.updateOne() to apply the value of seq to sequence for each document in turn.
My concern is that this involves calling updateOne() potentially hundreds of times, which might not be optimal for performance. Any suggestions on how I should approach this in a more efficient way?
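If the shift is always "every document at or after the insertion point moves up by one", a single updateMany with $inc does it server-side in one operation, regardless of how many documents are affected. A minimal sketch with Mongoose, where the model name Item and the variable insertedSeq (the sequence value the new document should take) are assumptions:

// Shift everything at or after the insertion point up by one,
// then insert the new document into the freed slot.
await Item.updateMany(
    { sequence: { $gte: insertedSeq } },  // documents from the insertion point onward
    { $inc: { sequence: 1 } }             // one server-side increment for all of them
);
await Item.create({ sequence: insertedSeq /* , ...other fields */ });

This replaces the potentially hundreds of updateOne() calls with a single command, since the server applies the increment to every matching document itself.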

MongoDB update multiple rows at a time

I have a collection of 1,000 documents, each already having a column old_col.
Now I want to add a new column to these existing documents, for multiple rows at once.
For instance, the existing situation would be:
{"old_col"=1},{"old_col"=2},{"old_col"=1000}
...to be changed to:
{"old_col"=1,"new_col"=11},{"old_col"=2,"new_col"=12},{"old_col"=1000,"new_col"=11000}.
How can I do this?
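Since MongoDB 4.2, updateMany accepts an aggregation pipeline, so new_col can be computed from old_col server-side in one statement. A minimal sketch; the collection name docs is an assumption, and the expression below (prepending the digit 1, which happens to match the 1 → 11, 2 → 12 and 1000 → 11000 examples) is only a guess at the intended derivation:

// MongoDB 4.2+ pipeline update: derive new_col from old_col in one command.
// Swap the $set expression for whatever relationship you actually need.
db.docs.updateMany({}, [
    { $set: {
        new_col: { $toInt: { $concat: ["1", { $toString: "$old_col" }] } }
    } }
]);

On older servers the same effect needs a loop of individual updates, or a bulkWrite that computes each value client-side.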

MongoDB, find last record: does the total count affect the performance?

I have a MongoDB collection against which only inserts and a find-last-record query will be issued.
The number of records in this collection is very big; will this affect the time taken to find the last record, or is the effect negligible?
The query used to find the last record:
db.col.find().sort({created: -1}).limit(1)
Try this one; it works:
db.collection.find().limit(1).sort({$natural:-1})
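Whether the total count matters depends on an index. With an index on created, the sort plus limit reads a single entry from the end of the index, so collection size is effectively irrelevant; without it, the server must examine every document to sort. A minimal sketch, using the col collection name from the question:

// Index the sort field so the query reads one index entry
// instead of sorting the whole collection.
db.col.createIndex({ created: -1 });
db.col.find().sort({ created: -1 }).limit(1);  // stays fast at any size

The $natural: -1 variant walks the collection in reverse storage order instead, which usually matches insertion order for an insert-only collection, but that ordering is not guaranteed the way an index is.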

Remove given number of records in MongoDB

I have too many records in my collection. Can I keep only a desired number of records and remove the others, without any condition?
I have a collection called Products with around 100,000 records and it is slowing down my local application. I am thinking of shrinking this huge number of records to something around 1,000. How can I do it?
OR
How to copy a collection with limited number of records?
If you want to copy a collection with a limited number of records and without any filter condition, a forEach loop can be used. The snippet below copies 1,000 documents from originalCollection to copyCollection:
db.originalCollection.find().limit(1000).forEach(function (doc) {
    db.copyCollection.insertOne(doc);  // copy each document one at a time
});
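A server-side alternative is an aggregation with $out, which writes the limited result straight into the target collection without pulling each document through the shell:

// Copies the first 1000 documents; $out replaces copyCollection
// if it already exists.
db.originalCollection.aggregate([
    { $limit: 1000 },
    { $out: "copyCollection" }
]);

Once the copy is verified, you can drop the original Products collection and rename the copy into its place, which achieves the "shrink to around 1,000 records" goal without deleting row by row.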

MongoDB update many documents with different timestamps in one update

I have 10,000 documents in one MongoDB collection. I'd like to update all the documents with datetime values that are 1 second apart for each document (so all the datetime values are unique and spaced 1 second apart). Is there any way to do this with a single update, instead of updating each document in turn, which results in 10,000 distinct update operations?
Thanks.
No, there is no way to do this with a single update statement: there are no server-side expressions that can produce a different value per document in this way. There is a feature request for it, but it has not been implemented, so it cannot be used yet.
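The usual workaround is to batch the per-document updates into one bulkWrite, so the 10,000 operations still exist but travel to the server in a single round trip. A minimal sketch in the mongo shell; the collection name docs and the timestamp field name created are assumptions:

// Assign timestamps 1 second apart with one bulk request instead of
// 10,000 separate update commands.
const start = new Date();
const ops = [];
let i = 0;
db.docs.find({}, { _id: 1 }).forEach(function (doc) {
    ops.push({
        updateOne: {
            filter: { _id: doc._id },
            update: { $set: { created: new Date(start.getTime() + i * 1000) } }
        }
    });
    i++;
});
db.docs.bulkWrite(ops);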