Using Promise.all() inside a MongoDB transaction is not working properly in NestJS - mongodb

Hi, I am working on a NestJS project using MongoDB and Mongoose.
I have to create transactions with a lot of promises inside, so I thought it would be a good idea to use Promise.all() inside my transaction for performance reasons.
Unfortunately, when I started working with my transactions I ran into a first issue. I was using
session.startTransaction(); and my code was throwing the following error:
Given transaction number 2 does not match any in-progress transactions. The active transaction number is 1. The error was thrown sometimes, not always, but it was a problem.
So I read the following question, Mongoose `Promise.all()` Transaction Error, and started to use withTransaction(). This solved that problem, but now my code does not work properly.
The code basically takes an array of bookings and creates them; it also needs to create combos of the bookings. What I need is that if the creation of a booking or a combo fails, nothing should be inserted. For performance I use Promise.all().
But when I execute the function it sometimes creates more bookings than expected: if bookingsArray has size 2, it sometimes creates 3 bookings and I just don't know why. This happens very rarely, but it is a big issue.
If I remove the Promise.all() from the transaction it works perfectly, but without Promise.all() the query is slow, so I wanted to know whether there is an error in my code, or whether you simply cannot use Promise.all() inside a MongoDB transaction in NestJS.
Main function with the transaction and Promise.all(); this one sometimes creates the wrong number of bookings:
async createMultipleBookings(
userId: string,
bookingsArray: CreateBookingDto[],
): Promise<void> {
const session = await this.connection.startSession();
await session.withTransaction(async () => {
const promiseArray = [];
for (let i = 0; i < bookingsArray.length; i++) {
promiseArray.push(
this.bookingRepository.createSingleBooking(
userId,
bookingsArray[i],
session,
),
);
}
promiseArray.push(
this.bookingRepository.createCombosBookings(bookingsArray, session),
);
await Promise.all(promiseArray);
});
session.endSession();
}
Main function with the transaction and without Promise.all(); this works fine but is slow:
async createMultipleBookings(
userId: string,
bookingsArray: CreateBookingDto[],
): Promise<void> {
const session = await this.connection.startSession();
await session.withTransaction(async () => {
for (let i = 0; i < bookingsArray.length; i++) {
await this.bookingRepository.createSingleBooking(
userId,
bookingsArray[i],
session,
);
}
await this.bookingRepository.createCombosBookings(bookingsArray, session);
});
session.endSession();
}
Functions called inside the main function
async createSingleBooking(
userId: string,
createBookingDto: CreateBookingDto,
session: mongoose.ClientSession | null = null,
) {
const product = await this.productsService.getProductById(
createBookingDto.productId,
session,
);
const user = await this.authService.getUserByIdcustomAttributes(
userId,
['profile', 'name'],
session,
);
const laboratory = await this.laboratoryService.getLaboratoryById(
product.laboratoryId,
session,
);
if (product.state !== State.published)
throw new BadRequestException(
`product ${createBookingDto.productId} is not published`,
);
const bookingTracking = this.createBookingTraking();
const value = product.prices.find(
(price) => price.user === user.profile.role,
);
const bookingPrice: Price = !value
? {
user: user.profile.role,
measure: Measure.valorACotizar,
price: null,
}
: value;
await new this.model({
...createBookingDto,
userId,
canceled: false,
productType: product.productType,
bookingTracking,
bookingPrice,
laboratoryId: product.laboratoryId,
userName: user.name,
productName: product.name,
laboratoryName: laboratory.name,
facultyName: laboratory.faculty,
createdAt: new Date(),
}).save({ session });
await this.productsService.updateProductOutstanding(
createBookingDto.productId,
session,
);
}
async createCombosBookings(
bookingsArray: CreateBookingDto[],
session: mongoose.ClientSession,
): Promise<void> {
const promiseArray = [];
for (let i = 1; i < bookingsArray.length; i++) {
promiseArray.push(
this.combosService.createCombo(
{
productId1: bookingsArray[0].productId,
productId2: bookingsArray[i].productId,
},
session,
),
);
}
await Promise.all(promiseArray);
}
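For reference, a fully sequential variant of createCombosBookings (matching the "without Promise.all()" approach above) is just a plain loop; a minimal sketch based on the same repository call:
async createCombosBookings(
  bookingsArray: CreateBookingDto[],
  session: mongoose.ClientSession,
): Promise<void> {
  // create the combos one by one so the session never has two operations in flight
  for (let i = 1; i < bookingsArray.length; i++) {
    await this.combosService.createCombo(
      {
        productId1: bookingsArray[0].productId,
        productId2: bookingsArray[i].productId,
      },
      session,
    );
  }
}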
Also, this is how I create the connection element:
export class BookingService {
  constructor(
    @InjectModel(Booking.name) private readonly model: Model<BookingDocument>,
    private readonly authService: AuthService,
    private readonly bookingRepository: BookingRepository,
    @InjectConnection()
    private readonly connection: mongoose.Connection,
  ) {}
}

Related

How to remove all the items from MongoDB except last N using Mongoose?

I am using NestJS to keep the logic and history of a Calculator. The point is that I want to keep only the last 10 cases in the DB, but I don't know how to do it. Let's take a look. Here's my history.service.ts
@Injectable()
export class HistoryService {
constructor(
@InjectModel(HistoryItem.name)
private historyModel: Model<HistoryItemDocument>,
) {}
async create(dto: CreateHistoryItemDto): Promise<HistoryItem> {
const historyItem = await this.historyModel.create({ ...dto });
return historyItem;
}
async getAll(): Promise<HistoryItem[]> {
const allHistoryItems = await this.historyModel
.find()
.sort({ _id: -1 }) //Here I sort the items to get the latest ones
.limit(maxNumberOfDBItemsToDisplay);
//Here I limit number of items to send it to Client
return allHistoryItems;
}
async getOne(id: ObjectId): Promise<HistoryItem> {
const historyItem = await this.historyModel.findById(id);
return historyItem;
}
async delete(id: ObjectId): Promise<ObjectId> {
const historyItem = await this.historyModel.findByIdAndDelete(id);
return historyItem.id;
}
}
As you can see, I can get the last 10 items from the DB. But how do I remove the rest so that the database keeps only the last 10 cases?
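One way to do this (a minimal sketch; trimHistory is a hypothetical method you could call after create(), reusing the maxNumberOfDBItemsToDisplay constant from the service above) is to fetch the _ids of the newest items and delete everything else:
async trimHistory(): Promise<void> {
  // the newest items, same sort/limit as getAll()
  const latest = await this.historyModel
    .find()
    .sort({ _id: -1 })
    .limit(maxNumberOfDBItemsToDisplay)
    .select('_id');
  // remove every document whose _id is not among the newest ones
  await this.historyModel.deleteMany({
    _id: { $nin: latest.map((item) => item._id) },
  });
}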

MongoDB Mutation Upsert - Can't Get Id of New Record on First Submit

I'm executing an upsert mutation on MongoDB to create a new document or update an existing document. If the document exists, the mutation returns the id as expected. If a new document is created, the mutation returns null (in both Apollo Sandbox and via console.log) on the initial return; in subsequent returns it will return the id. I need it to return the id of the newly created document immediately (on the first return) so I can use that id in subsequent actions.
Starting from the beginning, here's the setup:
TYPEDEF
updateHourByEmployeeIdByJobDate(
jobDate: String
startTime: String
endTime: String
hoursWorked: String
employee: String
): Hour
RESOLVER
updateHourByEmployeeIdByJobDate: async (
parent,
{ jobDate, startTime, endTime, hoursWorked, employee },
context
) => {
// if (context.user) {
console.log("resolver hours update = ");
return Hour.findOneAndUpdate(
{ employee, jobDate },
{
jobDate,
startTime,
endTime,
hoursWorked,
employee,
},
{
upsert: true,
}
);
// }
// throw new AuthenticationError("You need to be logged in!");
},
MUTATION
//UPDATE HOUR RECORD - CREATES DOCUMENT IF IT DOESN'T EXIST OR UPDATES IF IT DOES EXIST VIA THE UPSERT OPTION ON THE RESOLVER
export const UPDATE_HOURS_BYEMPLOYEEID_BYJOBDATE = gql`
mutation UpdateHourByEmployeeIdByJobDate(
$jobDate: String
$startTime: String
$endTime: String
$hoursWorked: String
$employee: String
) {
updateHourByEmployeeIdByJobDate(
jobDate: $jobDate
startTime: $startTime
endTime: $endTime
hoursWorked: $hoursWorked
employee: $employee
) {
_id
}
}
`;
FRONT-END EXECUTION
const [ mostRecentHourUpdateId, setMostRecentHoursUpdateId ] = useState();
const [updateHours] = useMutation(UPDATE_HOURS_BYEMPLOYEEID_BYJOBDATE, {
onCompleted: (data) => {
console.log('mutation result #1 = ', data)
setMostRecentHoursUpdateId(data?.updateHourByEmployeeIdByJobDate?._id);
console.log('mutation result #2 = ', mostRecentHourUpdateId)
},
});
//section update database - this mutation is an upsert...it either updates or creates a record
const handleUpdateDatabase = async (data) => {
console.log(data);
try {
// eslint-disable-next-line
const { data2 } = await updateHours({
variables: {
jobDate: moment(data.date).format("MMMM DD YYYY"), //"January 20 2023"
startTime: `${data.startTime}:00 (MST)`, //"12:00:00 (MST)"
endTime: `${data.endTime}:00 (MST)`, //"13:00:00 (MST)"
hoursWorked: data.hours.toString(), //"3.0"
employee: userId, //"6398fb54494aa98f85992da3"
},
});
console.log('handle update database function = data = ', data2); //fix
} catch (err) {
console.error(err);
}
singleHoursRefetch();
};
I've tried using onCompleted as part of the mutation request as well as useEffect, not to mention running the mutation in Apollo Sandbox. Same result in all scenarios. The alternative is to re-run the useQuery to get the most recent / last record created, but that seems fragile (if at some point the records are sorted differently), and it seems like I should be able to access the newly created record immediately as part of the mutation result.
Since Hour is a Mongoose model, you'll want to ask findOneAndUpdate for the document as it is after the update, i.e. { new: true } (or returnDocument: 'after'); returnNewDocument: true is the equivalent option in the MongoDB shell. Without it, findOneAndUpdate resolves with the pre-update document, which is null when the upsert has just created the record.
Like this:
const getData = async () => {
  const returnedRecord = await Hour.findOneAndUpdate(
    { employee, jobDate },
    { jobDate, startTime, endTime, hoursWorked, employee },
    { upsert: true, new: true }
  );
  // do something with returnedRecord
};
getData();
For more information:
https://www.mongodb.com/docs/manual/reference/method/db.collection.findOneAndUpdate/
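Applied to the resolver from the question, only the options object needs to change; a sketch:
updateHourByEmployeeIdByJobDate: async (
  parent,
  { jobDate, startTime, endTime, hoursWorked, employee },
  context
) => {
  // new: true makes Mongoose return the upserted document,
  // so _id is populated on the first submit as well
  return Hour.findOneAndUpdate(
    { employee, jobDate },
    { jobDate, startTime, endTime, hoursWorked, employee },
    { upsert: true, new: true }
  );
},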

When working with sessions, timestamps are not added

I'm working with the latest mongoose version of today (6.2.7) and I'm having a really weird bug.
This is my Schema:
const testSchema = new Schema<ITestSchema>({
age: Number
}, { timestamps: true });
const testModel = model<ITestSchema>("test", testSchema);
When I'm creating new documents from it everything works perfectly and I'm getting timestamps (updatedAt and createdAt) added to the document.
But when I'm working with sessions the timestamps are not added and I see only "age", "_id" and "__v".
This is the example code for the creation with the sessions:
const test = async () => {
const session: ClientSession = await mongoose.startSession();
try {
session.startTransaction();
const newTest = new testModel({
age: 30,
}, { session });
await newTest.save({ session });
await session.commitTransaction();
} catch (error) {
await session.abortTransaction();
throw error;
} finally {
await session.endSession();
}
};
I tried reading the docs several times and searched for similar issues online but could not find any.
Thanks <3
I know the question is old, but I just came across this and found a solution. Your mistake is that you are using the session in two different places: both when creating the model instance and when saving. The thing is, if you use the session only on testModel it also won't work, so the session must only be passed when calling the save function.
Sample code:
const modelObject = {
'_id' : new Types.ObjectId(),
...dto
}
const newItem = new this.model(modelObject)
return newItem.save({
session
});
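Applied to the transaction code from the question, the fix is simply to stop passing { session } to the model constructor; a minimal sketch:
const test = async () => {
  const session: ClientSession = await mongoose.startSession();
  try {
    session.startTransaction();
    // no { session } here; timestamps are applied when save() runs
    const newTest = new testModel({ age: 30 });
    await newTest.save({ session });
    await session.commitTransaction();
  } catch (error) {
    await session.abortTransaction();
    throw error;
  } finally {
    await session.endSession();
  }
};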

Google Calendar API (Saving events in MongoDB, Express JS)

I can't figure out how to save events fetched from the Calendar API. I was able to print the array of events to the console. I need to save multiple events at once and check whether they already exist in the database by their unique id.
Here's my event.js schema in Express JS.
const mongoose = require('mongoose');
const Schema = mongoose.Schema;
const EventSchema = new Schema({
id: {
type: String,
required: false,
unique:true
},
start: {
type: String
},
end: {
type: String
},
status: {
type: String
},
creator: {
type: Array
},
description: {
type: String
}
});
module.exports = Event = mongoose.model('events', EventSchema);
Here's my event.js router in Express JS.
router.post("/google/get", async (req, res, next) => {
const {
google
} = require('googleapis')
const {
addWeeks
} = require('date-fns')
const {
OAuth2
} = google.auth
const oAuth2Client = new OAuth2(
process.env.GOOGLE_CLIENT_ID,
process.env.GOOGLE_CLIENT_SECRET
)
oAuth2Client.setCredentials({
refresh_token: process.env.GOOGLE_REFRESH_TOKEN,
})
const calendar = google.calendar({
version: 'v3',
auth: oAuth2Client
})
calendar.events.list({
calendarId: 'MY CALENDAR ID',
timeMin: new Date().toISOString(),
timeMax: addWeeks(new Date(), 1).toISOString(),
singleEvents: true,
orderBy: 'startTime',
},
function (err, response) {
if (err) {
console.log("The API returned an error: " + err)
return
}
var events = response.data.items
events.forEach(function (event) {
var start = event.start.dateTime || event.start.date
console.log("%s - %s", start, event.summary)
})
}
)
})
In Mongoose, in order to save something to a database, all you need to do is to instantiate the model that you created. Your event schema exports Event as a model that you can then treat as a regular object. So you would do something along the lines of:
let currentEvent = new Event({id, start, end, status, creator, description});
currentEvent.save();
Once that is done, it should be stored in your MongoDB (I assume the connection is already set up and working, as that code isn't shown). You can just run the above inside your forEach loop, with some minor tweaks to grab each value correctly, and it should sort your issue out!
As for your unique ID and making sure an event doesn't already exist in your database, you can use the same model to check whether that id is already stored. Since the Google event id lives in your own id field (not in Mongo's _id), query it with findOne. As follows:
Event.findOne({ id }, (err, event) => {
  if (event == null) {
    let currentEvent = new Event({ id, start, end, status, creator, description });
    currentEvent.save();
  } else {
    console.log("Error, this event already exists");
  }
});
findOne returns null when no document matches, so the check above is enough; if in doubt, just console.log the value of event to see what it returns when there isn't an event with that ID.
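Putting both pieces together inside the calendar.events.list callback from the question, a minimal sketch (same callback style as above; the mapping of creator and description is an assumption and may need adjusting to your schema):
var events = response.data.items
events.forEach(function (event) {
  // look the event up by the Google event id stored in the custom `id` field
  Event.findOne({ id: event.id }, (err, existing) => {
    if (err) return console.log(err)
    if (existing == null) {
      var currentEvent = new Event({
        id: event.id,
        start: event.start.dateTime || event.start.date,
        end: event.end.dateTime || event.end.date,
        status: event.status,
        creator: [event.creator], // the schema declares creator as an Array
        description: event.description
      })
      currentEvent.save()
    }
  })
})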

Mongoose/Mongodb, update each doc query, very slow

I have this update query in Mongoose. It's about 1600 documents and it takes around 5 minutes to run.
What's the bottleneck? Am I using the wrong approach?
export const getAndStoreLatestKPI = async () => {
console.log("start kpi");
try {
const marketCaps = await getKPI();
const stocks = await mongoose.model("stock").find().exec();
for (const stock of stocks) {
const marketCap = marketCaps.find(
(marketCap) => marketCap.i === stock.insId
);
if (marketCap != null) {
const marketCapAdjustedVal =
stock.country === "Finland" ? marketCap.n * 10 : marketCap.n;
const update = {
marketCap: marketCapAdjustedVal,
};
console.log(marketCapAdjustedVal);
await mongoose
.model("stock")
.findOneAndUpdate({ insId: stock.insId }, { update });
}
}
console.log("done");
return Promise.resolve();
} catch (err) {
return Promise.reject(err);
}
};
export const getKPI = async (kpiId: number) => {
try {
const kpiFetch = await Axios.get(someurl);
return Promise.resolve(kpiFetch.data.values);
} catch (err) {
return Promise.reject(err);
}
};
So the main bottleneck is your for loop: for each stock item you perform an expensive action, a single findOneAndUpdate round trip to the database, and you are doing them one by one.
What I would recommend is processing several items at once, similar in spirit to multithreading.
There are several ways to do this in Node.js, for example Node.js worker threads.
However, I personally use and recommend bluebird, which gives you this ability (and many others) straight out of the box.
Some sample code:
import Bluebird = require('bluebird');
const stocks = await mongoose.model("stock").find().exec();
await Bluebird.map(stocks, async (stock) => {
const marketCap = marketCaps.find(
(marketCap) => marketCap.i === stock.insId
);
if (marketCap != null) {
const marketCapAdjustedVal =
stock.country === "Finland" ? marketCap.n * 10 : marketCap.n;
const update = {
marketCap: marketCapAdjustedVal,
};
console.log(marketCapAdjustedVal);
await mongoose
.model("stock")
.findOneAndUpdate({ insId: stock.insId }, { update });
}
}, {concurrency: 25})
// concurrency controls how many operations run in parallel; the heavier they are, the lower you want the concurrency, for obvious reasons.
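If you'd rather avoid the extra dependency, another option worth considering (a sketch, not the original answer's approach) is to build all the updates in memory and send them in a single Model.bulkWrite call, turning ~1600 round trips into one:
const stocks = await mongoose.model("stock").find().exec();
const operations = [];
for (const stock of stocks) {
  const marketCap = marketCaps.find((m) => m.i === stock.insId);
  if (marketCap != null) {
    const marketCapAdjustedVal =
      stock.country === "Finland" ? marketCap.n * 10 : marketCap.n;
    operations.push({
      updateOne: {
        filter: { insId: stock.insId },
        update: { $set: { marketCap: marketCapAdjustedVal } },
      },
    });
  }
}
// one round trip for all updates instead of one findOneAndUpdate per stock
await mongoose.model("stock").bulkWrite(operations);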