Multi-row insert into multiple tables in a transaction using pg-promise - postgresql

I am trying to perform a transaction with pg-promise in which I insert multiple rows into multiple tables. I am using the pg-promise helper methods to generate a multi-row insert for each table, but I am unable to concatenate all the queries together so that only one query is executed. I would also really appreciate it if someone could point out any way to further optimise the code below.
Currently I am getting the error "Parameter 'queries' must be an array." from pgp.helpers.concat:
db.tx(async t => {
    let customerDataArray = [];
    let campaignStatusArray = [];
    let messageDetailsArray = [];
    for (const customer of customerDetails) {
        const customerData = {
            columna1: customer.x,
            columnb1: customer.y
        };
        const campaignStatus = {
            columna2: customer.a,
            columnb2: customer.b
        };
        customerDataArray.push(customerData);
        campaignStatusArray.push(campaignStatus);
        for (const message of customer.messages) {
            const messageStatus = {
                columna: message.x,
                columnb: message.y
            };
            messageDetailsArray.push(messageStatus);
        }
    }
    const customerDataCs = new pgp.helpers.ColumnSet(['columna1', 'columnb1'], {table: 'customerdata'});
    const campaignStatusCs = new pgp.helpers.ColumnSet(['columna2', 'columnb2'], {table: 'campaignstatus'});
    const messageStatusCs = new pgp.helpers.ColumnSet(['columna', 'columnb'], {table: 'messageStatus'});
    const customerDataQuery = pgp.helpers.insert(customerDataArray, customerDataCs);
    const campaignStatusQuery = pgp.helpers.insert(campaignStatusArray, campaignStatusCs);
    const messageStatusQuery = pgp.helpers.insert(messageDetailsArray, messageStatusCs);
    const concatAllQueries = pgp.helpers.concat(customerDataQuery, campaignStatusQuery, messageStatusQuery);
    await t.none(concatAllQueries);
})
    .then(() => {
        // success
    })
    .catch(error => {
        // handle error
    });

You need to pass your queries to concat in a single array like so:
const concatAllQueries = pgp.helpers.concat([ customerDataQuery, campaignStatusQuery, messageStatusQuery ]);
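For context, pgp.helpers.concat takes a single array of queries and joins them into one semicolon-separated string, so all the inserts run in a single round trip. A minimal sketch of the corrected transaction, reusing the (hypothetical) data arrays and ColumnSets from the question:

db.tx(async t => {
    // same ColumnSet and data-array setup as in the question
    const customerDataQuery = pgp.helpers.insert(customerDataArray, customerDataCs);
    const campaignStatusQuery = pgp.helpers.insert(campaignStatusArray, campaignStatusCs);
    const messageStatusQuery = pgp.helpers.insert(messageDetailsArray, messageStatusCs);
    // concat expects ONE argument: an array of query strings/objects
    const concatAllQueries = pgp.helpers.concat([customerDataQuery, campaignStatusQuery, messageStatusQuery]);
    await t.none(concatAllQueries); // executes all three multi-row inserts in one query
});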

Related

How to update a collection if you already called it (MongoDB, Mongoose)

OK, so I have a problem: I use a collection to gather some ratings data and work with it. By the time I finish the rating update process, I have new ratings that I would like to update the collection with. However, I can't call update because I get the error "Cannot overwrite model once compiled." I understand that I already compiled the model once to work with the data, and that's why I get the error. Is there any way I can update the collection? Or will I just have to work around it by creating a new collection with the latest ratings, and then matching the latest-ratings collection against the one I use to work with the data?
This is my code
let calculateRating = async () => {
    const getData = await matchesCollection().find().lean();
    const playerCollection = await playersCollection();
    const getDataPlayer = await playerCollection.find().lean();
    let gamesCounting = [];
    getDataPlayer.forEach((player) => {
        player.makePlayer = ranking.makePlayer(1500);
    });
    for (let i = 0; i < getData.length; i++) {
        // findIndex expects a boolean predicate, not the index itself
        const resultA = getDataPlayer.findIndex(
            ({ userId }) => userId === getData[i].userA
        );
        const resultB = getDataPlayer.findIndex(
            ({ userId }) => userId === getData[i].userB
        );
        const winner = getData[i].winner;
        if (getDataPlayer[resultA] === undefined) {
            continue;
        } else if (getDataPlayer[resultB] === undefined) {
            continue;
        }
        gamesCounting.push([
            getDataPlayer[resultA].makePlayer,
            getDataPlayer[resultB].makePlayer,
            winner,
        ]);
    }
    ranking.updateRatings(gamesCounting);
    let ratingsUpdate = [];
    getDataPlayer.forEach((item) => {
        let newRating = item.makePlayer.getRating();
        let newDeviation = item.makePlayer.getRd();
        let newVolatility = item.makePlayer.getVol();
        item.rating = newRating;
        item.rd = newDeviation;
        item.vol = newVolatility;
        ratingsUpdate.push(item);
    });
};
I tried the workaround of creating a new collection.
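For what it's worth, "Cannot overwrite model once compiled" usually means mongoose.model('Name', schema) is being called a second time for the same model name; reusing the already-registered model avoids recompiling it. A minimal sketch under that assumption (the Player model, playerSchema, and field names here are hypothetical), including a bulkWrite to persist the recalculated ratings:

const mongoose = require("mongoose");

// Reuse the model if it is already registered instead of compiling it again
const Player = mongoose.models.Player || mongoose.model("Player", playerSchema);

// Persist the recalculated ratings back to the same collection in one round trip
async function saveRatings(ratingsUpdate) {
    await Player.bulkWrite(
        ratingsUpdate.map((p) => ({
            updateOne: {
                filter: { userId: p.userId },
                update: { $set: { rating: p.rating, rd: p.rd, vol: p.vol } },
            },
        }))
    );
}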

Batches with BulkWriter in Google Firestore

Does anyone know why this does not work and what I am doing wrong here? It gets stuck after the console.log "after read stream".
I am trying to read a bunch of files, convert them to JSON, and upload them with BulkWriter to Firestore.
After every 400 documents I call close() to write them to Firestore, and then I create a new BulkWriter.
I also tried awaiting bulkWriter.create(eventDoc, {}), but it does not work. It also gets stuck, and there is no error. Why is this? The create method returns a promise.
Why can't it be awaited?
https://googleapis.dev/nodejs/firestore/latest/BulkWriter.html#create
The idea is to process one file at a time, and a file can contain tens of thousands of rows which need to be uploaded to Firestore.
I am calling this method in a for...of loop and awaiting the processBatch method.
Any help highly appreciated.
async processBatch(document: string, file: string): Promise<void> {
    const db = admin.firestore();
    console.log('start: ', document);
    let bulkWriter;
    const writeBatchLimit = 400;
    let documentsInBatch = 0;
    let totalInDocument = 0;
    const eventsCollectionRef = db.collection('events');
    const eventDoc = eventsCollectionRef.doc(document);
    return new Promise((resolve, reject) => {
        console.log('promise');
        bulkWriter = db.bulkWriter();
        const csvStream = fs.createReadStream(file);
        console.log('after read stream');
        bulkWriter.create(eventDoc, {})
            .then(result => {
                console.log('Successfully: ', result);
                csvStream.pipe(csvParser())
                    .on('data', row => {
                        console.log('row');
                        bulkWriter.create(eventDoc.collection('event').doc(), row);
                        documentsInBatch++;
                        if (documentsInBatch > writeBatchLimit) {
                            bulkWriter.close();
                            totalInDocument = + documentsInBatch;
                            documentsInBatch = 0;
                            bulkWriter = db.bulkWriter();
                        }
                    })
                    .on('end', () => {
                        console.log('file: ', file + ', totalInDocument: ', totalInDocument);
                        resolve();
                    });
            })
            .catch(err => {
                console.log('Failed: ', err);
                reject();
            });
    });
}
This seems to work:
async processBatch(document: string, file: string): Promise<void> {
    const db = admin.firestore();
    console.log(`start: ${document}`);
    let bulkWriter;
    const writeBatchLimit = 400;
    let documentsInBatch = 0;
    let numOfBatches = 0;
    let totalInDocument = 0;
    const eventsCollectionRef = db.collection('events');
    const eventDoc = eventsCollectionRef.doc(document);
    bulkWriter = db.bulkWriter();
    const csvStream = fs.createReadStream(file);
    bulkWriter.create(eventDoc, {});
    csvStream.pipe(csvParser())
        .on('data', row => {
            bulkWriter.create(eventDoc.collection('event').doc(), row);
            documentsInBatch++;
            if (documentsInBatch > writeBatchLimit) {
                numOfBatches++;
                totalInDocument += documentsInBatch;
                documentsInBatch = 0;
                bulkWriter.close();
                console.log(`Committing batch ${numOfBatches}, cumulative: ${totalInDocument}`);
                bulkWriter = db.bulkWriter();
            }
        })
        .on('end', () => {
            console.log(`file: ${file}, totalInDocument: ${totalInDocument}`);
        });
}
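One caveat with the version above: BulkWriter.close() returns a promise that resolves once all pending writes have completed, but neither it nor the 'end' event is awaited here, so processBatch can resolve before the last partial batch is flushed. As far as I know, BulkWriter also batches and throttles writes internally, so a single writer per file may be enough. A minimal sketch under those assumptions (admin, fs, and csvParser as above):

// A sketch, not a definitive implementation: one BulkWriter per file,
// with the stream wrapped in a promise so the final flush can be awaited.
async function processFile(document, file) {
    const db = admin.firestore();
    const eventDoc = db.collection('events').doc(document);
    const bulkWriter = db.bulkWriter();
    bulkWriter.create(eventDoc, {});
    await new Promise((resolve, reject) => {
        fs.createReadStream(file)
            .pipe(csvParser())
            .on('data', row => bulkWriter.create(eventDoc.collection('event').doc(), row))
            .on('error', reject)
            .on('end', resolve);
    });
    await bulkWriter.close(); // resolves only after every queued write has completed
}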

How can I update a field in a Firestore database with a Google Cloud Function when another field changes?

I want to write a simple Google Cloud Function for a Firestore database which updates a field in a document when another field changes in the same document. The trigger field is named "copper" and the updates will be done to the field "coppervalue". I wrote a simple function for this; it doesn't give any error, but it doesn't update the field "coppervalue" either, so I would like to learn where I am going wrong.
Here is my cloud function code:
const functions = require('firebase-functions');
exports.copperupdate = functions.firestore
    .document("/kullanici/{uid}")
    .onUpdate((change, context) => {
        const newfieldvalue = change.after.data();
        const fieldname = newfieldvalue.name;
        if (fieldname === "copper") {
            const d = new Date();
            const currenttime = d.getTime();
            const coppervalue = snap.data()['coppervalue'];
            const copperdate = snap.data()['copperdate'];
            const copperdec = (currenttime - copperdate) / 1000;
            const copper_real = (copperdec * copper / 60) + coppervalue;
            const sonuccopper = Math.trunc(copper_real);
            return change.after.ref.set({
                coppervalue: sonuccopper
            }, {merge: true});
        } else {
            return false;
        }
    });
Thanks in advance.
Finally figured it out.
My mistake was that I thought newfieldvalue.name was accessing the name of the changed field; instead it was accessing a field called "name". So I made a few changes; here is the code:
const functions = require('firebase-functions');
exports.copperupdate = functions.firestore
    .document("/kullanici/{uid}")
    .onUpdate((change, context) => {
        const newfieldvalue = change.after.data();
        const previousfieldvalue = change.before.data();
        if (newfieldvalue.copper === previousfieldvalue.copper) {
            return false;
        } else {
            const d = new Date();
            const currenttime = d.getTime();
            const copper = newfieldvalue.copper;
            const coppervalue = newfieldvalue.coppervalue;
            const copperdate = newfieldvalue.copperdate;
            const copperdec = (currenttime - copperdate) / 1000;
            const copper_real = (copperdec * copper / 60) + coppervalue;
            const sonuccopper = Math.trunc(copper_real);
            return change.after.ref.set({
                coppervalue: sonuccopper
            }, {merge: true});
        }
    });
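Worth noting: comparing change.before against change.after like this also guards against an infinite trigger loop. The function's own write to coppervalue fires onUpdate again, but on that second invocation copper is unchanged, so the early return stops the recursion.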

Weird issue while processing events coming from Kinesis

I set up Amazon Connect on AWS, and if I make a test call, it puts that call into an AWS Kinesis stream. I am trying to write a Lambda that processes these records and saves them to a database.
If I make a simple call (call the number - answer - hang up) it works just fine. However, if I make a multipart call (call a number - answer - transfer to another number - hang up), this comes into Kinesis as two separate records (CTRs).
My Lambda processes the CTRs (Contact Trace Records) one by one. First it saves the CTR to a table called call_segments, and then it queries this table to see if the other part of this call is already there. If it is, it merges the data and saves it to a table called completed_calls; otherwise it skips it.
If a call has more than one segment (if it was transferred to another number), it arrives as two events.
My problem is that even though I am processing the events one after the other, it seems that when the second event is processed (and technically the call segment from the first event is already in the database), it cannot see the first segment of the call.
Here is my code:
const callRecordService = require("./call-records-service");

exports.handler = async (event) => {
    await Promise.all(
        event.Records.map(async (record) => {
            return processRecord(record);
        })
    );
};

const processRecord = async function(record) {
    try {
        // Buffer.from replaces the deprecated `new Buffer(...)` constructor
        const payloadStr = Buffer.from(record.kinesis.data, "base64").toString("ascii");
        let payload = JSON.parse(payloadStr);
        await callRecordService.processCTR(payload);
    }
    catch(err) {
        // console.error(err);
    }
};
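Note that event.Records.map(...) inside Promise.all starts every processRecord call concurrently; Promise.all only waits for them all to finish, it does not serialize them. That would explain why the second segment's lookup can run before the first segment's insert has committed. A minimal sketch of truly sequential processing:

exports.handler = async (event) => {
    // for...of awaits each record before starting the next one
    for (const record of event.Records) {
        await processRecord(record);
    }
};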
and here is the service file:
async function processCTR(ctr) {
    let userId = "12";
    let result = await saveCtrToContactDetails(ctr, userId);
    let paramsForCallSegments = [ctr.InstanceARN.split("instance/").pop(), ctr.ContactId];
    let currentCallSegements = await dbHelper.getAll(dbQueries.getAllCallSegmentsQuery, paramsForCallSegments);
    let completedCall = checkIfCallIsComplete(currentCallSegements);
    if (completedCall) {
        console.log('call is complete');
        let results = await saveCallToCompletedCalls(completedCall);
    }
}

//------------- Private functions --------------------

const saveCtrToContactDetails = async (ctr, userId) => {
    let params = [ctr.ContactId, userId, ctr.callDuration];
    let results = await dbHelper.executeQuery(dbQueries.getInsertCallDetailsRecordsQuery, params);
    return results;
}

const checkIfCallIsComplete = (currentCallSegements) => {
    //This function checks if all callSegments are already in call_segments table.
}

const saveCallToCompletedCalls = async (completedCall) => {
    let contact_id = completedCall[0].contact_id;
    let user_id = completedCall[0].user_id;
    // Start the sum at zero; the original initialized it by adding two segment objects
    let call_duration = 0;
    completedCall.forEach(callSegment => {
        call_duration += callSegment.call_duration;
    });
    let params = [contact_id, user_id, call_duration];
    let results = await dbHelper.executeQuery(dbQueries.getInsertToCompletedCallQuery, params);
};

Get the original query object in Mongoose

I have a loop to perform multiple queries through Mongoose
"use strict";
var Mongoose = require("mongoose");
var User = Mongoose.model("User");
var Cache = {};
for (var index=0; index<usernames.length; index++) {
var query = {
username:usernames[index]
};
User.find(query).
exec(function(error,users){
//THIS IS A CALLBACK FUNCTION,
//HOW TO GET THE 'query' VARIABLE ABOVE?
//I WANT TO PUT THE RESULT INTO CACHE:
var username = users[0].username;
Cache[username] = users[0];
});
}
I need to know which result belongs to which query in the callback function above.
It is for DB query caching purposes. I can extract 'username' from 'users[0]', but when the 'users' array is empty, there is no such thing.
Put an anonymous function (an IIFE) inside your loop, so each callback closes over its own query; and use .findOne() instead of .find() if you are only interested in the first user or if the username values are unique.
for (var index = 0; index < usernames.length; index++) {
    (function () {
        var query = {
            username: usernames[index]
        };
        User.findOne(query).
            exec(function (error, user) {
                //use your query here
                if (error || !user) return; // no match for this username
                var username = user.username;
                Cache[username] = user;
            });
    })();
}
However, consider the async library for this kind of operation.
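Alternatively, if you are on a Mongoose version with promise support, a plain async/await loop sidesteps the closure problem entirely; a minimal sketch (assuming the same User model and usernames array, with warmCache being a hypothetical helper name):

async function warmCache(usernames) {
    const Cache = {};
    for (const username of usernames) {
        // each iteration awaits its own query, so there is no shared-variable issue
        const user = await User.findOne({ username }).exec();
        if (user) {
            Cache[username] = user;
        }
    }
    return Cache;
}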