Transaction 1 has been committed in MongoDB

I am trying to use a transaction to update multiple documents:
one is a loading sheet document [await sheet.save({ session })]
and the other is an array of stock reservation records [await Stock.bulkWrite()].
const session = await mongoose.startSession();
session.startTransaction({
  readPreference: 'primary',
  readConcern: { level: 'local' },
  writeConcern: { w: 'majority' },
});

let sheetAfterSave: any = null;
try {
  sheetAfterSave = await sheet.save({ session });

  records.forEach(async (el: any) => {
    let updatedStockRecord = await Stock.bulkWrite(
      [
        {
          updateOne: {
            filter: {
              index: el.index,
              product: el.product,
              batchNo: el.batchNo,
              agency,
              totalQuantity: { $gte: el.loadingTotal },
            },
            update: {
              $push: {
                reservations: {
                  loadingSheetId: sheetAfterSave._id,
                  reservedCaseQuantity: el.loadingCaseCount,
                  reservedUnitQuantity: el.loadingUnitCount,
                  reservedTotalQuantity: el.loadingTotal,
                },
              },
            },
          },
        },
      ],
      {
        session: session,
      }
    );
  });

  await session.commitTransaction();
  session.endSession();
} catch (error) {
  console.log(error);
  await session.abortTransaction();
  session.endSession();
  throw new Error(
    `Error occured while trying to create a new loading sheet. ${error}`
  );
}
In this operation the loading sheet document gets saved in the database, but the stock reservation records do not.
It gives the error:
[distribution] MongoError: Transaction 1 has been committed.
[distribution] at MessageStream.messageHandler (/app/node_modules/mongoose/node_modules/mongodb/lib/cmap/connection.js:263:20)
[distribution] at MessageStream.emit (node:events:376:20)
[distribution] at processIncomingData (/app/node_modules/mongoose/node_modules/mongodb/lib/cmap/message_stream.js:144:12)
[distribution] at MessageStream._write (/app/node_modules/mongoose/node_modules/mongodb/lib/cmap/message_stream.js:42:5)
[distribution] at writeOrBuffer (node:internal/streams/writable:388:12)
[distribution] at MessageStream.Writable.write (node:internal/streams/writable:333:10)
[distribution] at TLSSocket.ondata (node:internal/streams/readable:716:22)
[distribution] at TLSSocket.emit (node:events:376:20)
[distribution] at addChunk (node:internal/streams/readable:305:12)
[distribution] at readableAddChunk (node:internal/streams/readable:280:9)
[distribution] at TLSSocket.Readable.push (node:internal/streams/readable:219:10)
[distribution] at TLSWrap.onStreamRead (node:internal/stream_base_commons:192:23)
[distribution] [ERROR] 14:56:43 MongoError: Transaction 1 has been committed.
Since I am using the session, a failure like this should roll back the saved documents, but that is not happening here.
Am I missing something? I appreciate your help.
Cheers
Do you see anything wrong with this code? I am throwing a dynamic error inside the loop to make sure that all the writes are rolled back if any error occurs inside the loop. A dynamic error thrown outside the loop rolls back all the writes perfectly, but one thrown inside the loop does not.
const session = await mongoose.startSession();
try {
  await session.withTransaction(
    async () => {
      sheetAfterSave = await sheet.save({ session });

      records.forEach(async (el: any) => {
        let updatedStockRecord = await Stock.bulkWrite(
          [
            {
              updateOne: {
                filter: {
                  index: el.index,
                  product: el.product,
                  batchNo: el.batchNo,
                  agency,
                  totalQuantity: { $gte: el.loadingTotal },
                },
                update: {
                  $push: {
                    reservations: {
                      loadingSheetId: sheetAfterSave._id,
                      reservedCaseQuantity: el.loadingCaseCount,
                      reservedUnitQuantity: el.loadingUnitCount,
                      reservedTotalQuantity: el.loadingTotal,
                    },
                  },
                },
              },
            },
          ],
          {
            session: session,
          }
        );

        console.log('******************');
        throw new Error('12/24/2020 ERROR INSIDE LOOP'); // MongoError: Transaction 1 has been committed.
      });

      throw new Error('12/24/2020 ERROR OUTSIDE LOOP'); // All transactions are rolled back perfectly
    },
    {
      readPreference: 'primary',
      readConcern: { level: 'local' },
      writeConcern: { w: 'majority' },
    }
  );
} catch (error) {
  console.log('ERROR BLOCK', error);
  throw new Error(
    `Error occured while trying to create a new loading sheet. ${error}`
  );
} finally {
  session.endSession();
  await mongoose.connection.close();
  console.log('************ FINALLY *****************');
}

I was able to solve the issue.
The problem was not with the code below
await session.commitTransaction(); (success)
session.endSession(); (failure)
} catch (error) { (entered)
await session.abortTransaction(); (invoked)
but it was with the records.forEach loop.
records.forEach(async (el: any) => {...});
When an error is thrown inside the forEach callback it is not caught by the outermost try/catch block, because the callback runs in a different function context than the code outside the loop.
Once I changed the loop from .forEach to
for (const el of records) {}
it works as expected.
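For reference, a minimal sketch of the reworked loop, reusing the same records, Stock, agency, sheetAfterSave and session variables from the question's code:

for (const el of records) {
  // awaiting inside for...of keeps every bulkWrite (and any thrown error)
  // inside the surrounding try/catch and inside the transaction
  await Stock.bulkWrite(
    [
      {
        updateOne: {
          filter: {
            index: el.index,
            product: el.product,
            batchNo: el.batchNo,
            agency,
            totalQuantity: { $gte: el.loadingTotal },
          },
          update: {
            $push: {
              reservations: {
                loadingSheetId: sheetAfterSave._id,
                reservedCaseQuantity: el.loadingCaseCount,
                reservedUnitQuantity: el.loadingUnitCount,
                reservedTotalQuantity: el.loadingTotal,
              },
            },
          },
        },
      },
    ],
    { session }
  );
}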
Posting the answer in case someone faces the same issue in the future. Thanks for the support, guys :)

This mostly happens when one or more update or db operations have not finished executing before the transaction commit is called. For example:
const session = await mongoose.startSession();
await session.withTransaction(async () => {
  await User.findOneAndUpdate({ _id: user._id }, { $set: { profile: profile._id } }, { session });
  profile.set(body);
  profile.user = user._id;
  profile.save({ session }); // THIS LINE WILL CAUSE SUCH ERROR BECAUSE IT RETURNS PROMISE
});
session.endSession();
return profile;
The right thing to do is to make sure all operations are awaited (fully executed) before the transaction is committed, as shown below.
const session = await mongoose.startSession();
await session.withTransaction(async () => {
  await User.findOneAndUpdate({ _id: user._id }, { $set: { profile: profile._id } }, { session });
  profile.set(body);
  profile.user = user._id;
  await profile.save({ session }); // AWAIT TO ENSURE COMPLETION OF EXECUTION
});
session.endSession();
return profile;
In the original question, the forEach() loop fires its async callbacks without waiting for them, so there is no guarantee that all operations are completed before the transaction is committed.
Substituting it with a for...of loop, which runs the awaited operations sequentially, guarantees a linear order of execution and errors out immediately if anything fails.
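To see the difference in isolation, here is a small standalone sketch (the delay helper and the logging are purely illustrative):

const delay = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

async function demo() {
  // forEach fires the async callbacks and returns immediately,
  // so 'after forEach' is logged before any item finishes
  [1, 2, 3].forEach(async (n) => {
    await delay(10);
    console.log('forEach item', n);
  });
  console.log('after forEach');

  // for...of actually awaits each iteration, so items log in order
  // and a thrown error would propagate to the caller
  for (const n of [1, 2, 3]) {
    await delay(10);
    console.log('for...of item', n);
  }
  console.log('after for...of');
}

demo();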

The following sequence of calls/events will attempt to abort a committed transaction:
await session.commitTransaction(); (success)
session.endSession(); (failure)
} catch (error) { (entered)
await session.abortTransaction(); (invoked)
See https://docs.mongodb.com/manual/core/transactions/ for correct usage.
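For reference, a minimal sketch of an error-handling shape along those lines, reusing the sheet and session variables from the question, in which a successful commit is never followed by an abort:

const session = await mongoose.startSession();
session.startTransaction();
try {
  await sheet.save({ session });
  // ...other writes, each passing { session }...
  await session.commitTransaction(); // once this resolves, abortTransaction() is never called
} catch (error) {
  // only reached if something failed before or during the commit
  await session.abortTransaction();
  throw error;
} finally {
  // end the session exactly once, whether the transaction committed or aborted
  session.endSession();
}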

Related

Can an outside query modify a document involved in an ongoing transaction?

const client = new MongoClient(uri);
await client.connect();
await client
  .db('mydb1')
  .collection('foo');

const session = client.startSession();
const transactionOptions = {
  readPreference: 'primary',
  readConcern: { level: 'local' },
  writeConcern: { w: 'majority' }
};

try {
  await session.withTransaction(async () => {
    const coll1 = client.db('mydb1').collection('foo');
    await coll1.updateOne({ user_id: 12344, paid: false }, { $set: { paid: true } }, { session });
    // long running computation after this line.
    // what if another query deletes the document updated above
    // before this transaction completes.
    await calls_third_party_payment_vendor_api_to_process_payment();
  }, transactionOptions);
} finally {
  await session.endSession();
  await client.close();
}
What if the document updated inside the transaction is simultaneously updated by an outside query before the transaction is committed?
What you have described is a transaction/operation conflict: the outside operation blocks behind the transaction commit and retries with backoff logic until its maxTimeMS is reached.
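As an illustration, an update issued outside the transaction against the same document waits behind the transaction's lock; a maxTimeMS on that outside operation (the 5000 ms value here is purely illustrative) bounds how long it waits before failing with a timeout:

// issued from outside the transaction (no session) while the transaction above is still open
await client
  .db('mydb1')
  .collection('foo')
  .updateOne(
    { user_id: 12344 },
    { $set: { paid: false } },
    { maxTimeMS: 5000 } // give up if the transaction holds the document longer than 5 seconds
  );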
I wrote a report on MongoDB transactions that contains some examples in Node.js too. If you are interested in the subject, I recommend the paragraphs "WiredTiger Cache" and "What happens when we create a multi-document transaction?"

Cloud Functions: Mongoose aggregate search causing Error: Collection method aggregate is synchronous error

I am calling a Cloud Function from my Flutter app when it starts. When I start the app from 'flutter run', my function call returns Error: Collection method aggregate is synchronous.
But when I refresh the app with command + r, it returns the correct value.
app.dart
var gameFunction = await functions.httpsCallable('gameS').call({
  'gamingId': 'afijeoaijosf'
});
Firebase Function index:
export const gameS = functions.https.onCall(async (data, context) => {
  try {
    return await gameFunctionF(data, context);
  } catch (err) {
    const aa = err as functions.https.HttpsError;
    throw new functions.https.HttpsError(aa.code, aa.message, aa.details);
  }
});
Firebase Function gameFunctionF:
export async function gameFunctionF(data: any, context: CallableContext) {
  console.log(data);
  try {
    return await db
      .aggregate([
        { $project: { game_profile: 1 } },
      ])
      .toArray();
  } catch (err) {
    console.log(`aaaa - err - ${err}`);
    throw 'nooooo';
  }
}
When I start the app, I get that Collection method aggregate is synchronous error. But when I refresh the app with command + r, it returns the data that I want. No variable has been changed, nothing. var gameFunction runs when app.dart is loaded.
I really don't get why this is happening, since it only causes the error when I start from flutter run but works fine when I just refresh the app.
I have also tried on my Android phone: it causes the error when I open the app after terminating it as well.
The link Mousumi suggested to look at solved the problem for me.
const mongoConn = createConnection();
export const mongooooooo = mongoConn.openUri(
  `uri`,
  {
    connectTimeoutMS: 30000,
  }
);
This is how I changed it from createConnection(uri). Then you have to adjust the db call to:
(await db)
  .aggregate([
    { $project: { game_profile: 1 } },
  ])
The error has been solved! Thank you to Mousumi!
UPDATE ---------
I had to add another await in front of (await db) in order to get the resolved result of the aggregation.
await (await db)
  .aggregate([
    { $project: { game_profile: 1 } },
  ])
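Putting the fragments together, a rough sketch of the whole pattern (the exported name, the collection name and the connection string are placeholder assumptions, not from the original answer):

import { createConnection } from 'mongoose';

// open the connection lazily and export the pending connection instead of querying right away
const mongoConn = createConnection();
export const connectionReady = mongoConn.openUri('mongodb://...', {
  connectTimeoutMS: 30000,
});

export async function gameFunctionF() {
  // wait for the connection before touching any collection
  const conn = await connectionReady;
  return conn
    .collection('games')
    .aggregate([{ $project: { game_profile: 1 } }])
    .toArray();
}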

Why are my tests occasionally passing and occasionally failing? (Jest, Mongoose, MongoDB)

I have a setup file for my tests that looks like this:
const mongoose = require('mongoose');

mongoose.set('useCreateIndex', true);
mongoose.promise = global.Promise;

async function removeAllCollections() {
  const collections = Object.keys(mongoose.connection.collections);
  for (const collectionName of collections) {
    const collection = mongoose.connection.collections[collectionName];
    await collection.deleteMany();
  }
}

async function dropAllCollections() {
  const collections = Object.keys(mongoose.connection.collections);
  for (const collectionName of collections) {
    const collection = mongoose.connection.collections[collectionName];
    try {
      await collection.drop();
    } catch (error) {
      // Sometimes this error happens, but you can safely ignore it
      if (error.message === 'ns not found') return;
      // This error occurs when you use it.todo. You can
      // safely ignore this error too
      if (error.message.includes('a background operation is currently running'))
        return;
      console.log(error.message);
    }
  }
}

export default function setupDB(databaseName) {
  // Connect to Mongoose
  beforeAll(async () => {
    const url = `mongodb://127.0.0.1/${databaseName}`;
    await mongoose.connect(
      url,
      {
        useNewUrlParser: true,
        useCreateIndex: true,
        useUnifiedTopology: true
      },
      err => {
        if (err) {
          console.error(err);
          process.exit(1);
        }
      }
    );
  });

  // Cleans up database between each test
  afterEach(async () => {
    await removeAllCollections();
  });

  // Disconnect Mongoose
  afterAll(async () => {
    await dropAllCollections();
    await mongoose.connection.close();
  });
}
I am then writing tests like this:
import User from 'db/models/User';
import setupDB from 'utils/setupDbForTesting';

setupDB('mongoose_bcrypt_test');

it('correctly hashes and salts passwords', async done => {
  // create a new user
  const newUser = new User({
    username: 'jmar777',
    password: 'Password123'
  });
  await newUser.save(function (err) {
    if (err) {
      console.log(err);
    }
  });

  const user = await User.findOne({ username: 'jmar777' });

  user.comparePassword('Password123', function (err, isMatch) {
    if (err) throw err;
    expect(isMatch).toBeTruthy();
  });
  user.comparePassword('123Password', function (err, isMatch) {
    if (err) throw err;
    expect(isMatch).toBeFalsy();
  });

  done();
});
However, the tests alternate between passing and failing: for every run T in which they pass, run T + 1 fails. My question is: why?
The tests fail because user (the result of User.findOne) is null, even though the user has been saved.
I think the issue lies in the tearing down of the database, but I really can't see any problems. Any help would be appreciated, thanks.

Jest mock mongoose.startSession() throws error

I implemented a transaction in the post method and it works fine. But now I have to update the unit test case for that method. I tried to mock startSession() and startTransaction() so I can check toHaveBeenCalled, but while running the test case I get MongooseError: Connection 0 was disconnected when calling startSession. I am new to this, so I don't know how to mock it.
Method:
static post = (funcCall: Promise<Document>) => async (response: Response, fields?: string[]) => {
  const session = await startSession();
  session.startTransaction();
  try {
    const dbResponse = await funcCall; // model.save(request.body)
    // commit the changes if everything was successful
    await session.commitTransaction();
    success(pick(dbResponse, fields ? fields : ['_id']), 201)(response);
  } catch (error) {
    // this will rollback any changes made in the database
    await session.abortTransaction();
    throwError(error);
  } finally {
    // ending the session
    session.endSession();
  }
};
My Test case:
it('should perform post when valid parameters is passed.', async () => {
  // Preparing
  const mockSaveReturn = {
    _id: objectID,
    test_name: 'sample',
  };
  jest.spyOn(mongoose, 'startSession');
  const spySave = jest.spyOn(modelPrototype.prototype, 'save').mockReturnValueOnce(mockSaveReturn);
  const document = new modelPrototype(mockSaveReturn);

  // Executing
  await post(document.save())(response as Response);
  expect(response.send).toHaveBeenCalledWith({ _id: '54759eb3c090d83494e2d804' });
  expect(spySave).toHaveBeenCalled();

  // Cleaning
  spySave.mockClear();
});

How to implement Redis Cache with MongoDB?

const redis = require('redis');
const { promisify } = require('util');

// the client must exist before its get method can be promisified
const client = redis.createClient();
const getAsync = promisify(client.get).bind(client);

router.get('/get_all', async (req, res) => {
  let language = req.query.language;
  let data = '';
  try {
    data = await getAsync(language);
  } catch (err) {
    console.log(err);
  }

  if (data) {
    console.log('Data is received from Redis Cache!');
    res.send(data);
  } else {
    try {
      const result = await Users.find({
        language: language
      });
      client.setex(language, 86400, JSON.stringify(result));
      res.send(result);
    } catch (err) {
      console.log('ERROR');
      console.log('MONGO DB RETURNED THE FOLLOWING ERROR');
      console.log(err);
      res.end();
    }
  }
});
This is how I have implemented a Redis cache with MongoDB.
* Is this the right way to do the implementation?
If it is wrong, please tell me how to implement it.
One problem which I am facing now is that if I add a try and catch block around
const data = await getAsync(language);
I am getting the error:
UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). (rejection id: 2)
I have edited the code to use two try and catch blocks.
Is this a good way to implement Redis with Mongo?