mongoose model has no access to connection's in-progress transactions - mongodb

In my current Express application I want to use the new MongoDB multi-document transactions.
First of all, it is important to point out how I connect and handle the models.
My app.js (server) first connects to the db by using db.connect().
I require all models in my db index file. Since the models are initiated with the same mongoose reference, I assume that future requires of the models in different routes point to the same, connected instance. Please correct me if I'm wrong with any of these assumptions.
I save the connection reference inside a state object and return it when needed for my transaction later.
./db/index.ts
const fs = require('fs');
const path = require('path');
const mongoose = require('mongoose');
const state = {
  connection: null,
};

// require all models
const modelFiles = fs.readdirSync(path.join(__dirname, 'models'));
modelFiles
  .filter(fn => fn.endsWith('.js') && fn !== 'index.js')
  .forEach(fn => require(path.join(__dirname, 'models', fn)));

const connect = async () => {
  state.connection = await mongoose.connect(.....);
  return;
};

const get = () => state.connection;

module.exports = {
  connect,
  get,
};
My model files contain the required schemas.
./db/models/example.model.ts
const mongoose = require('mongoose');
const Schema = mongoose.Schema;
const ExampleSchema = new Schema({ ... });
const ExampleModel = mongoose.model('Example', ExampleSchema);
module.exports = ExampleModel;
Now the route where I try to do a basic transaction:
./routes/item.route.ts
const ExampleModel = require('../db/models/example.model');
router.post('/changeQty', async (req, res, next) => {
  const connection = db.get().connection;
  const session = await connection.startSession(); // works fine
  // start a transaction
  session.startTransaction(); // also fine
  const { someData } = req.body.data;
  try {
    // just looping over that data and preparing the promises
    let promiseArr = [];
    someData.forEach(data => {
      // !!! THIS THROWS THE ERROR !!!
      let p = ExampleModel.findOneAndUpdate(
        { _id: data.id },
        { $inc: { qty: data.qty } },
        { new: true, runValidators: true }
      ).session(session).exec();
      promiseArr.push(p);
    });
    // running the promises in parallel
    await Promise.all(promiseArr);
    await session.commitTransaction();
    return res.status(..)....;
  } catch (err) {
    await session.abortTransaction();
    // MongoError: Given transaction number 1 does not match any in-progress transactions.
    return res.status(500).json({ err: err });
  } finally {
    session.endSession();
  }
});
But I always get the following error, which probably has something to do with the connection reference of my models. I assume that they don't have access to the connection which started the session, so they are not aware of the session.
MongoError: Given transaction number 1 does not match any in-progress transactions.
Maybe I somehow need to initiate the models inside db.connect with the direct connection reference?
There is a big mistake somewhere and I hope you can lead me to the correct path. I appreciate any help, thanks in advance.

This is because you're running the operations in parallel, so you've got a bunch of race conditions. Just use async/await
and make your life easier:
for (const data of someData) {
  await ExampleModel.findOneAndUpdate(
    { _id: data.id },
    { $inc: { qty: data.qty } },
    { new: true, runValidators: true }
  ).session(session).exec();
}
Reference: https://github.com/Automattic/mongoose/issues/7311
If that does not work, try executing the promises one by one rather than with Promise.all().
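Alternatively, if your driver/mongoose version supports it, session.withTransaction() will start, commit, and retry the transaction for you; a minimal sketch under that assumption, reusing ExampleModel and someData from the question:
const session = await connection.startSession();
try {
  await session.withTransaction(async () => {
    // run the updates sequentially inside the transaction callback
    for (const data of someData) {
      await ExampleModel.findOneAndUpdate(
        { _id: data.id },
        { $inc: { qty: data.qty } },
        { new: true, runValidators: true }
      ).session(session).exec();
    }
  });
} finally {
  session.endSession();
}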

Related

Do Mongo transactions make a snapshot for the reading operations?

Imagine the following scenario:
Start a session
Start a transaction for that session
Run a read on Document A
A different session makes an update on Document A (during execution)
Write Document B based on the original read of Document A
Commit the transaction
End the session
Will the update on Document A be atomic between read and write, or is there a concurrency problem? I understand that a transaction takes a snapshot of all write operations, but I'm not sure what happens on the reading side.
await session.withTransaction(async () => {
  const coll1 = client.db('mydb1').collection('foo');
  const coll2 = client.db('mydb2').collection('bar');
  const docA = await coll1.findOne({ abc: 1 }, { session });
  // docA is deleted by another session at this point
  if (docA) {
    // Does this run on an outdated condition?
    await coll2.insertOne({ xyz: 999 }, { session });
  }
}, transactionOptions);

How to create a simple retrieve API after the connection has been made?

I am very new to MongoDB and Node.js, and I would like to know how to create a very simple API that finds all the records in a collection.
What I have done so far (it is listening on port 5000 according to the console log):
const express = require('express');
const BodyParser = require("body-parser");
const { MongoClient } = require('mongodb');
const port = 5000;
const app = express();
app.use(BodyParser.json());
app.use(BodyParser.urlencoded({ extended: true }));
var db;
const uri = "mongodb+srv://user1:mypassword@cluster0.2sgiu.mongodb.net/ecommerce?retryWrites=true&w=majority";
app.listen(port, () => {
  MongoClient.connect(uri, { useNewUrlParser: true, useUnifiedTopology: true }, (err, database) => {
    if (err) {
      console.log("Error occurred connecting to MongoDB...");
    }
    console.log(`Listening on port ${port}...`);
  });
});
How do I proceed from here to create an API that retrieves all records from a Books collection after the connection has been made? Thanks
You should first save the database object that is handed to you by the callback function of connect (let's call it db). Note that with driver 3.x (which the useUnifiedTopology option implies), the callback actually receives a MongoClient, so you get the database object via client.db('ecommerce').
Then you can use that object to access your collections, like Books in this example:
db.collection('Books')
To get all the documents inside a collection, you can use the find method with an empty object as the argument, plus toArray() to resolve the cursor:
const books = await db.collection('Books').find({}).toArray();
If you want to tie MongoDB queries to your Express app, you should create routes in which you respond to the request with the data you fetched, assuming this data is going to be publicly available.
app.get('/books', async (req, res) => {
  const books = await db.collection('Books').find({}).toArray();
  res.status(200).send({ books });
});
You can now make a GET request to localhost:5000/books and it should respond with the contents of the Books collection. Use either your browser or curl to make the request.
$ curl localhost:5000/books
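Putting it all together, a minimal sketch of the missing wiring (the database name ecommerce and the Books collection are taken from the question; error handling is deliberately simple):
const express = require('express');
const { MongoClient } = require('mongodb');

const port = 5000;
const app = express();
const uri = "mongodb+srv://user1:mypassword@cluster0.2sgiu.mongodb.net/ecommerce?retryWrites=true&w=majority";

let db; // shared database handle, set once the connection is established

app.get('/books', async (req, res) => {
  const books = await db.collection('Books').find({}).toArray();
  res.status(200).send({ books });
});

MongoClient.connect(uri, { useNewUrlParser: true, useUnifiedTopology: true }, (err, client) => {
  if (err) {
    console.error("Error occurred connecting to MongoDB...", err);
    return;
  }
  db = client.db('ecommerce'); // driver 3.x hands the callback a MongoClient
  // only start accepting requests once db is set
  app.listen(port, () => console.log(`Listening on port ${port}...`));
});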

Cloud Function that moves documents between collections

I have three collections: "past", "today", and "future".
The "today" collection is supposed to have only one document.
At midnight, I need to find in my "future" collection a document that has a "next" field or, if there is no such document, find the one whose "number" field value is the closest greater than the "number" field value of the document in my "today" collection. Then I need to move the "today" document into the "past" collection, and also to move that found document from the "future" collection to the "today" collection.
As far as I understand, there is no "move" method, so I have to use a combination of deletes and creates, which need to be done in one transaction.
I figured out how to do the "scheduler" part, but can't figure out how to code the rest (the actual moving of the documents).
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();
const firestore = admin.firestore();

exports.scheduledFunction = functions.pubsub.schedule('0 0 * * *')
  .onRun((context) => {
    // I need to move my documents...
  });
Could you help me with the code, please?
It might be that you are looking for the documentation in the wrong place. It's not in Firestore / Extend with Cloud Functions; it's in the basic Firestore documentation, but you have to switch the code samples to Node.js: https://firebase.google.com/docs/firestore/query-data/order-limit-data
You have to collect the data with two queries: on the today and future collections.
From these queries you get the docs and their data.
Then you just need to create a doc in past, create a new doc (or rewrite the existing one) in today, and delete the old docs in today and future.
Here is how I would do it in a simple scheduled function:
exports.scheduledFunction = functions.pubsub.schedule('0 0 * * *')
  .onRun(async (context) => {
    try {
      let queryToday = admin.firestore().collection('today'); // you can add .limit(1)
      const todaySnapshot = await queryToday.get();
      const todayDoc = todaySnapshot.docs[0];
      const todayData = todayDoc.data();
      const todayToPastRef = admin.firestore().doc(`past/${todayData.documentUid}`);
      /* or however the id is stored? you can just call
         const todayToPastRef = admin.firestore().collection('past').doc()
         and it will be generated automatically
      */
      const promises = [];
      promises.push(todayToPastRef.set(todayData));
      let queryFuture = admin.firestore().collection('future').orderBy('date').limit(1);
      /* or however the date is stored? Idk if Firebase allows querying by Timestamp;
         you just want to fetch the closest date after today, so the order is ascending
      */
      const futureSnapshot = await queryFuture.get();
      const futureDoc = futureSnapshot.docs[0];
      const futureData = futureDoc.data();
      const futureToTodayRef = admin.firestore().doc(`today/${futureData.documentUid}`);
      promises.push(futureToTodayRef.set(futureData)); // note: futureData, not todayData
      promises.push(futureDoc.ref.delete());
      promises.push(todayDoc.ref.delete());
      /* or you can try to change today's doc data, but the id will remain the same:
         promises.push(todayDoc.ref.update(futureData))
      */
      return Promise.all(promises); // the function finishes after all the promises are fulfilled or rejected
    } catch (err) {
      return Promise.reject(err);
    }
  });
Note that instead of .then() and .catch() I'm using async/await.
Use console.log() for debugging, and try VS Code, so you can inspect methods and properties on the objects, which is really helpful.
UPDATE:
Yes, you can do it with a batch. Here is another example:
exports.scheduledFunction = functions.pubsub.schedule('0 0 * * *').onRun((context) => {
  const db = admin.firestore();
  let queryToday = db.collection('today');
  let queryFuture = db.collection('future').orderBy('date').limit(1);
  const batch = db.batch();
  return queryToday
    .get()
    .then(todaySnapshot => {
      const todayDoc = todaySnapshot.docs[0];
      const todayData = todayDoc.data();
      const todayToPastRef = db.doc(`past/${todayData.docUid}`);
      batch.set(todayToPastRef, todayData);
      batch.delete(todayDoc.ref);
      return queryFuture.get();
    })
    .then(futureSnapshot => {
      const futureDoc = futureSnapshot.docs[0];
      const futureData = futureDoc.data();
      const futureToTodayRef = db.doc(`today/${futureData.docUid}`);
      batch.set(futureToTodayRef, futureData);
      batch.delete(futureDoc.ref);
      // now the two reads are complete, so you can commit the batch
      return batch.commit();
    })
    .catch(err => {
      // if todaySnapshot or futureSnapshot were not fetched, the batch won't be committed
      // (or, for example, if the snapshots were empty)
      return Promise.reject(err);
    });
});
You can also fetch the documents in parallel with .getAll() or something like that. You should test and experiment.
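For example, a minimal sketch that runs both reads in parallel with Promise.all before committing the batch (the docUid field is an assumption carried over from the examples above):
exports.scheduledFunction = functions.pubsub.schedule('0 0 * * *').onRun(async (context) => {
  const db = admin.firestore();
  const batch = db.batch();
  // run both reads in parallel; each get() returns a QuerySnapshot
  const [todaySnapshot, futureSnapshot] = await Promise.all([
    db.collection('today').get(),
    db.collection('future').orderBy('date').limit(1).get(),
  ]);
  const todayDoc = todaySnapshot.docs[0];
  const futureDoc = futureSnapshot.docs[0];
  // today -> past
  batch.set(db.doc(`past/${todayDoc.data().docUid}`), todayDoc.data());
  batch.delete(todayDoc.ref);
  // future -> today
  batch.set(db.doc(`today/${futureDoc.data().docUid}`), futureDoc.data());
  batch.delete(futureDoc.ref);
  // all four writes are committed atomically
  return batch.commit();
});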

insertMany brings down MongoDB service

I have an API from which I get data. I use mongoose to save it into my local MongoDB. Each time I save, I dynamically create a new model and use insertMany on it:
const data = result.Data;
const newCollectionName = `tab_${name.toLowerCase()}`;
const Schema = new mongoose.Schema({}, { strict: false });
const TAB = mongoose.model(newCollectionName, Schema);
return TAB.collection.drop(() => {
  // eslint-disable-next-line
  const clearJSONs = Object.values(data)
    .filter(x => typeof x === 'object');
  return TAB.insertMany(clearJSONs, (err, result2) => {
    // console.log(result2);
    res.json({ success: true });
  });
});
But later, when almost everything is complete, my MongoDB service goes down... and I don't even know what to do.
Update: mongod log:
2018-06-17T13:43:09.883+0300 E STORAGE [conn58] WiredTiger error (24) [1529232189:883394][4799:0x7f9fe1d30700], WT_SESSION.create: /var/lib/mongodb/: directory-sync: open: Too many open files
How can I solve this?
The problem was in the ulimits of the system: mongod was hitting the open-files limit, and raising that limit solved it.
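A minimal sketch of the usual fix on Linux (the value 64000 is the limit MongoDB's production notes recommend; paths and unit names may differ on your distribution):
# check the limit the mongod process actually runs with
cat /proc/$(pgrep mongod)/limits | grep "open files"
# for a systemd-managed mongod, raise it in the unit file
# (e.g. /lib/systemd/system/mongod.service):
#   [Service]
#   LimitNOFILE=64000
sudo systemctl daemon-reload
sudo systemctl restart mongod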

Mongoose - save after lean

I'm fetching an object from Mongo using mongoose, but I want to use lean() since there are attributes that aren't in the mongoose model (strict: false).
After modifying that object I can't save it back:
let order = await Order.findOne({_id: req.body.id}).populate({path: 'o_s', match: {removed: false}}).lean();
order.save(); // not working, as it's not a mongoose model any more
Order.update({_id: order._id}, {order}).exec(function(){
  return res.json({status: true, message: 'Order Updated', order: order});
}); // not giving an error, but not updating either
Any suggestions?
In order to update the doc you retrieved from the db, you have to remove the _id property.
Consider this contrived example:
#!/usr/bin/env node
'use strict';

const mongoose = require('mongoose');
mongoose.connect('mongodb://localhost/test');

const conn = mongoose.connection;
const Schema = mongoose.Schema;

const schema = new Schema({
  name: String
});
schema.set('strict', false);

const Test = mongoose.model('test', schema);

const test = new Test({
  name: 'Henry',
  nick: 'Billy'
});

async function run() {
  await conn.dropDatabase();
  await test.save();
  let doc = await Test.findOne({}).lean();
  const id = doc._id;
  delete doc._id;
  doc.name = 'Billy';
  doc.nick = 'The Kid';
  await Test.update({ _id: id }, doc);
  let updated = await Test.findOne({});
  console.log(updated);
  return conn.close();
}

run();
output:
{ _id: 5ad34679bc95f172c26f3382,
name: 'Billy',
nick: 'The Kid',
__v: 0 }
Note that you lose auto-versioning in this scenario; if this is important to you, you'll need to manually increment the version property.
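For example, a minimal sketch of bumping __v by hand (an assumption on my part; adjust if you use a custom version key):
let doc = await Test.findOne({}).lean();
// strip the fields we must not $set, then bump the version atomically
const { _id, __v, ...fields } = doc;
fields.name = 'Billy';
await Test.update({ _id }, { $set: fields, $inc: { __v: 1 } });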
I suggest not using lean(), because it has some performance issues that I noticed.
Instead you can use JSON.parse(JSON.stringify(order)), and for some reason it's a better approach, especially when you want to iterate over the result (in my case it was two times faster for iteration).
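A minimal sketch of that approach, applied to the order query from the question (someField is a hypothetical attribute; the fetched document stays a full mongoose model, so save() keeps working):
// fetch a full mongoose document (no .lean())
let orderDoc = await Order.findOne({_id: req.body.id}).populate({path: 'o_s', match: {removed: false}});
// take a plain deep copy for fast iteration or sending to the client
const order = JSON.parse(JSON.stringify(orderDoc));
// orderDoc is still a mongoose document, so changes can be persisted
orderDoc.set('someField', 'someValue');
await orderDoc.save();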