Mongoose findOneAndUpdate() only updates if document already exists, doesn't create new doc if not? - mongodb

I ran into a problem with my JavaScript bot: custom prefixes don't get saved when there isn't already a custom prefix stored for that server. If one already exists, it does get updated correctly.
await mongo().then(async (mongoose) => {
    try {
        let newprefix = content.replace(`${prefix}setprefix `, '')
        await prefixSchema.findOneAndUpdate({ _id: guild.id }, { _id: guild.id, prefix: newprefix })
            .then(async () => {
                console.log(`updated prefix for guild: ${guild.id}`)
                await channel.send(`Successfully updated prefix for this server to '${newprefix}'`)
                message.guild.me.setNickname(`[${newprefix}] - Helix`)
            })
            .catch(async (err) => {
                console.error(`failed to update prefix for guild: ${guild.id}\n${err}`)
                await channel.send(`Failed to update prefix.`)
            })
        console.log("saved to db")
    } catch {
        console.log("Something went wrong while saving new prefix for a server.")
    } finally {
        mongoose.connection.close()
    }
})
The bot does log and send that it successfully updated the prefix, but if there isn't already a document for that guild.id, nothing is saved. What did I do wrong and how can I solve it?
Thanks for reading!

The missing piece is the upsert option. From the Mongoose docs for Model.updateOne():
Parameters
[options.upsert=false] «Boolean» if true, and no documents found, insert a new document
MongoDB will update only the first document that matches filter regardless of the value of the multi option.
Use replaceOne() if you want to overwrite an entire document rather than using atomic operators like $set.
Example:
const res = await Person.updateOne({ name: 'Jean-Luc Picard' }, { ship: 'USS Enterprise' });
res.n; // Number of documents matched
res.nModified; // Number of documents modified
Please visit https://mongoosejs.com/docs/api.html#model_Model.updateOne for more information.
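findOneAndUpdate() accepts the same upsert option as its third argument, so you don't have to switch to updateOne() if you prefer not to. A minimal sketch against the code in the question (prefixSchema, guild and newprefix are assumed to be the model and variables used there):
// Update the prefix, or insert a new document if this guild has none yet
await prefixSchema.findOneAndUpdate(
    { _id: guild.id },           // filter; on insert the _id comes from here
    { prefix: newprefix },       // fields to set
    { upsert: true, new: true }  // upsert: create if missing, new: return the doc after the update
)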

Related

FireStore: How to merge timestamp data to existing document?

I am new to FireStore and am building an app where users can bookmark photo documents and show them on their personal feed. This works fine. Now, I want to be able to sort the bookmarks by bookmarked date when the user is viewing their personal feed (orderBy method). Thus, to make this happen, I figured I'd add a timestamp value at the moment the user bookmarks the document.
Here's my attempt. I wanted to verify with the community whether this is a good way to do it. I am concerned about redundancy and extra writes.
async addDocToFeed({ state }, doc) {
    try {
        const feedRef = this.$fireStore
            .collection(`users/${state.userProfile.uid}/feed`)
            .doc(doc.id)
        await feedRef.set(doc) // <-- copy record to user's feed collection (see JSON sample below)
        const bookmark = this.$fireStore
            .collection(`users/${state.userProfile.uid}/feed`)
            .doc(doc.id)
        await bookmark.update({
            bookmarked: this.$fireStoreObj.FieldValue.serverTimestamp()
        })
        console.log('doc bookmarked')
    } catch (error) {
        console.error('error updating doc', error)
    }
}
Example JSON of doc before adding the timestamp:
{"id":"1KecNCqYlcVRjq4BLCbZ","comments":"__vue_devtool_nan__","url":"https://firebasestorage.googleapis.com/v0/b/vue-photoapp-api.appspot.com/o/photos%2F0.jpg?alt=media&token=ee23b95b-b5d8-4abe-b1b9-e335d591b413","tags":["router","Texas"],"filename":"0.jpg","description":"test with new router setup","createdAt":{"seconds":1596020630,"nanoseconds":473000000},"title":"test with new router setup","status":"Unsolved","userId":"SvuTxDtHXJdBHImNQWByqnO3F2U2","displayName":"MrRouter"}
I tried to do:
await feedRef.set({doc, bookmarked: this.$fireStoreObj.FieldValue.serverTimestamp()}, {merge: true})
but that erased all the data and only added the bookmarked timestamp.
Thanks for any advice or assurances I'm on the right track (or not)
this.$fireStore
    .collection(`users/${state.userProfile.uid}/feed`)
    .doc(doc.id)
    .set({
        bookmarked: Date.now()
    }, { merge: true }) // merge: true writes only the bookmarked field and leaves the rest of the doc intact
    .then(() => {
        resolve(true);
    })
    .catch((error) => {
        reject(error)
    })
This should work.
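If you want to avoid the second write altogether, you can spread the document into the payload instead of nesting it under a doc key (which is what wiped the other fields in the set({doc, ...}) attempt above). A sketch under the same assumptions as the question's code ($fireStore and $fireStoreObj as injected there):
// One write: copy the document's fields and stamp the bookmark time in the same set()
await this.$fireStore
    .collection(`users/${state.userProfile.uid}/feed`)
    .doc(doc.id)
    .set({
        ...doc, // spread keeps id, url, tags, etc. as top-level fields
        bookmarked: this.$fireStoreObj.FieldValue.serverTimestamp()
    })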

Mongoose findOneAndUpdate returns the not-yet-updated model

I have one little issue. I'm trying to update a model with the findOneAndUpdate method, and it behaves unexpectedly: it updates the model in the DB but returns the old model (before the update).
try {
    const updatedLanding = await Landing.findOneAndUpdate(
        { key: req.body.key },
        {
            $set: {
                name: req.body.name,
            }
        },
    ).exec((err, result) => {
        if (err) {
            res.status(422).send({ error: err });
            return
        }
        res.send({ response: result })
    });
}
catch (e) {
    res.status(400).send(e)
}
In a Mongoose query, findOneAndUpdate returns the old record, not the updated one. The record has actually been updated, but you cannot see the updated result because the query returns the old document by default. If you want the updated record, you have to issue another query to find it and read its updated data.
If you update a document using the findOneAndUpdate() method, you'll get the old document back unless you specify
{ new: true }
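Applied to the query in the question, that looks roughly like this (using plain await rather than mixing it with an exec() callback):
const updatedLanding = await Landing.findOneAndUpdate(
    { key: req.body.key },
    { $set: { name: req.body.name } },
    { new: true } // return the document as it looks after the update
);
res.send({ response: updatedLanding });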

Data not being deleted from elasticsearch index via mongoosastic?

I have mongoosastic set up within a MEAN stack program. Everything works correctly except that when I delete a document from mongodb, it is not deleted in the elasticsearch index. So every time I do a search that includes deleted items, the deleted item is returned but is null when it is hydrated. Does mongoosastic handle deleting from the ES index? Do I have to program an index refresh?
var mongoose = require('mongoose');
var mongoosastic = require("mongoosastic");
var Schema = mongoose.Schema;

var quantumSchema = new mongoose.Schema({
    note: {
        type: String,
        required: true,
        es_indexed: true
    }
});

quantumSchema.plugin(mongoosastic);

var Quantum = mongoose.model('Quantum', quantumSchema);

Quantum.createMapping(function(err, mapping){
    if(err){
        console.log('error creating mapping (you can safely ignore this)');
        console.log(err);
    }else{
        console.log('mapping created!');
        console.log(mapping);
    }
});
I had the same error. If you look in the documentation, it states that you have to explicitly remove the document after deleting it.
This is the way I am doing a deletion now.
const deleteOne = Model => async (id) => {
    const document = await Model.findByIdAndDelete(id);
    if (!document) {
        return new Result()
            .setSuccess(false)
            .setError('Unable to delete Entity with ID: ' + id + '.')
    }
    // this ensures the deletion from the elasticsearch index
    document.remove();
    return new Result()
        .setSuccess(true)
        .setData(document)
}
I don't know what version of mongoosastic you're using, but I use mongoosastic#3.6.0 and my indexed docs get deleted whenever I remove them using either Model.findByIdAndRemove or Model.remove. Therefore, try to cross-check the way you delete your docs.
I solved the problem by changing the way I delete the data.
I was using:
Quantum.findByIdAndRemove(quantumid)
I switched it to:
Quantum.findById(quantumid, function(err, quantum) {
    quantum.remove(function(err, quantum) {
        if (err) {
            console.log(err);
            return;
        }
    });
});
I did not research the reason for this working, but it solved the problem and I moved on. (Presumably mongoosastic cleans up the index from document-level 'remove' middleware, which a query helper like findByIdAndRemove does not trigger, whereas calling remove() on a fetched document does.)

MongoDb/Mongoskin - CLEANLY Update entire document w/o specifying properties

All the examples I have seen for MongoDb & Mongoskin updates have individual properties being updated, like so:
// this works when I specify the properties
db.collection('User').update({ _id: mongoskin.helper.toObjectID(user._id) },
    { '$set': { displayName: user.displayName } }, function (err, result) {
        if (err) throw err;
        if (result) { res.send(result) }
    });
But what if I wanted the whole object/document to be updated instead:
// this does not appear to work
db.collection('User').update({ _id: mongoskin.helper.toObjectID(user._id) }, { '$set': user },
    function (err, result) {
        // do something
    });
It returns the error:
// It appears Mongo does not like the _id as part of the update
MongoError: After applying the update to the document {_id: ObjectId('.....
To overcome this issue, this is what I had to do to make things work:
function (req, res) {
var userId = req.body.user._id
var user = req.body.user;
delete user._id;
db.collection('User').update({_id: mongoskin.helper.toObjectID(userId)},
{'$set':user}, function(err, result) {
if (err) throw err;
console.log('result: ' + result)
if (result){ res.send(result)}
});
})
Is there a more elegant way of updating the whole document, instead of hacking it with:
delete user._id
If you want to update the whole object, you do not need a $set. I am not aware of mongoskin, but in shell you would do something like:
var user = {
    _id: <something>,
    ...
};
db.user.update({ _id: user._id }, user);
Which I think can be translated in your mongoskin in the following way.
db.collection('User').update({_id: user._id}, user, function(){...})
But here is the problem: you cannot update the _id of a document in Mongo, and this is what your error tells you. So remove the _id from your user object and keep it separately: search by this separate _id and update with a user object that has no _id.
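If you would rather not mutate the request body with delete, object rest syntax gives you a copy of the user without its _id in one line. A sketch along the lines of the question's code (assuming a Node version with object rest/spread support):
// Pull _id out and keep the remaining fields in a separate object
const { _id, ...userWithoutId } = req.body.user;

db.collection('User').update(
    { _id: mongoskin.helper.toObjectID(_id) },
    { '$set': userWithoutId },
    function (err, result) {
        if (err) throw err;
        if (result) { res.send(result) }
    }
);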

Is there a way to perform a "dry run" of an update operation?

I am in the process of changing the schema for one of my MongoDB collections. (I had been storing dates as strings, and now my application stores them as ISODates; I need to go back and change all of the old records to use ISODates as well.) I think I know how to do this using an update, but since this operation will affect tens of thousands of records I'm hesitant to issue an operation that I'm not 100% sure will work. Is there any way to do a "dry run" of an update that will show me, for a small number of records, the original record and how it would be changed?
Edit: I ended up using the approach of adding a new field to each record, and then (after verifying that the data was right) renaming that field to match the original. It looked like this:
db.events.find({ timestamp: { $type: 2 } })
    .forEach(function (e) {
        e.newTimestamp = new ISODate(e.timestamp);
        db.events.save(e);
    })

db.events.update({},
    { $rename: { 'newTimestamp': 'timestamp' } },
    { multi: true })
By the way, that method for converting the string times to ISODates was what ended up working. (I got the idea from this SO answer.)
My advice would be to add the ISODate as a new field. Once confirmed that all looks good, you could then unset the string date.
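For instance, if the converted value was written to newTimestamp as in the edit above, removing the old string field afterwards is a single update:
// Drop the old string field once the newTimestamp copies have been checked
db.events.update(
    { timestamp: { $type: 2 } },   // only documents that still hold the string form
    { $unset: { timestamp: "" } },
    { multi: true }
)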
Create a test environment with your database structure. Copy a handful of records to it. Problem solved. Not the solution you were looking for, I'm sure. But, I believe, these are exactly the circumstances that a 'test environment' should be used for.
Select the IDs of particular records that you would like to monitor and place them in the update filter: {_id: {$in: [<your monitored id>]}}
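For example, in the shell, you could run the update from the edit above against a couple of hand-picked documents first and inspect them before touching the whole collection (the ObjectIds here are placeholders):
// Dry-run scope: only the monitored documents are affected
db.events.update(
    { _id: { $in: [ObjectId("<monitored id 1>"), ObjectId("<monitored id 2>")] } },
    { $rename: { 'newTimestamp': 'timestamp' } },
    { multi: true }
)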
Another option, depending on how much overhead it will cause you:
You can consider writing a script that performs the find operation and adds printouts (or runs in the debugger) while the save operation is commented out. Once you've gained confidence, you can apply the save operation.
var changesLog = [];
var errorsLog = [];

events.find({ timestamp: { $type: 2 } }, function (err, events) {
    if (err) {
        debugger;
        throw err;
    } else {
        for (var i = 0; i < events.length; i++) {
            console.log('events ' + i + "/" + (events.length - 1));
            var currentEvent = events[i];

            currentEvent.timestamp = new ISODate(currentEvent.timestamp);

            var change = currentEvent._id;
            changesLog.push(change);

            // // ** Dry Run ** - uncomment to actually persist the change
            // currentEvent.save(function (err) {
            //     if (err) {
            //         debugger;
            //         errorsLog.push(currentEvent._id + ", " + currentEvent.timestamp + ', ' + err);
            //         throw err;
            //     }
            // });
        }
        console.log('Done');
        console.log('Changes:');
        console.log(changesLog);
        console.log('Errors:');
        console.log(errorsLog);
        return;
    }
});
db.collection.find({ "_manager": { $exists: true, $ne: null } }).forEach(
    function (doc) {
        doc['_managers'] = [doc._manager]; // String --> List
        delete doc['_manager'];            // Remove the old "_manager" key-value pair
        printjson(doc);                    // Debug by outputting the doc result
        //db.teams.save(doc);              // Uncomment to save the changes to the doc
    }
)
In my case the collection contains _manager and I would like to change it to a _managers list. I have tested it locally and it works as expected.
In recent versions of MongoDB (at least starting with 4.2), you can do this using a transaction; note that transactions require a replica set or sharded cluster.
const { MongoClient } = require('mongodb')

async function main({ dryRun }) {
    const client = new MongoClient('mongodb://127.0.0.1:27017', {
        maxPoolSize: 1
    })

    const pool = await client.connect()
    const db = pool.db('someDB')

    const session = pool.startSession()
    session.startTransaction()

    try {
        const filter = { id: 'some-id' }
        const update = { $rename: { 'newTimestamp': 'timestamp' } }

        // This is the important bit
        const options = { session: session }

        await db.collection('someCollection').updateMany(
            filter,
            update,
            options // using session
        )

        const afterUpdate = await db.collection('someCollection')
            .find(
                filter,
                options // using session
            )
            .toArray()

        console.debug('updated documents', afterUpdate)

        if (dryRun) {
            // This will roll back any changes made within the session
            await session.abortTransaction()
        } else {
            await session.commitTransaction()
        }
    } finally {
        await session.endSession()
        await pool.close()
    }
}

const _ = main({ dryRun: true })