For the last few days, my Cloud Function was running perfectly: every minute, Firestore would be queried for old posts, which were then deleted, like so:
exports.hourly_job = functions.pubsub.topic('hourly-tick').onPublish((message, context) => {
    const currentTime = Date.now()
    const getPostsForDate = admin.firestore().collection('posts').where('timeOfDeletion', '<', currentTime)
    return getPostsForDate.get().then(snapshot => {
        const updates = {}
        const batch = admin.firestore().batch()
        // Collect the async RTDB reads so the conversation fan-out
        // deletes are all in `updates` before it is committed.
        const convoReads = []
        snapshot.forEach((doc) => {
            const key = doc.id
            console.log(key)
            convoReads.push(admin.database().ref('/convoID/' + key).once('value').then((convoSnap) => {
                if (convoSnap.exists()) {
                    convoSnap.forEach((child) => {
                        updates['conversations/' + child.key] = null
                        updates['messages/' + child.key] = null
                        updates['convoID/' + child.key] = null
                    })
                }
            }))
            updates['/convoID/' + key] = null
            updates['/reveals/' + key] = null
            updates['/postDetails/' + key] = null
            batch.delete(admin.firestore().collection('posts').doc(key))
            batch.delete(admin.firestore().collection('posters').doc(key))
        })
        return Promise.all(convoReads).then(() =>
            Promise.all([admin.database().ref().update(updates), batch.commit()])
        )
    })
})
However, I started receiving this message:
TypeError: Cannot read property 'seconds' of null
at Function.fromProto (/user_code/node_modules/firebase-admin/node_modules/@google-cloud/firestore/build/src/timestamp.js:91:46)
at _firestore.request.then.resp (/user_code/node_modules/firebase-admin/node_modules/@google-cloud/firestore/build/src/write-batch.js:472:42)
EDIT: I fixed the date error by updating my Firebase Cloud Functions dependencies, but the 'seconds' of null error still persists.
Basically, I just had to avoid returning the batch commit when there was nothing to update. That fixed the problem.
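For reference, the guard can look like this (a minimal sketch, reusing the snapshot, updates, and batch from the function above):

return getPostsForDate.get().then(snapshot => {
    // The poster's fix: don't commit when nothing matched, since the
    // empty commit is what raised the timestamp error.
    if (snapshot.empty) {
        return null
    }
    // ... build updates and batch as above ...
    return Promise.all([admin.database().ref().update(updates), batch.commit()])
})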
Ok, so I have a problem: I use a collection to gather some ratings data and work with it. By the time I finish the rating update process, I have new ratings that I would like to write back to the collection. However, I can't call update because I get the error "Cannot overwrite model once compiled." I understand that I already compiled the model once to work with the data, and that's why I get the error. Is there any way I can update the collection? Or will I have to work around it by creating a new collection with the latest ratings, and then matching that collection against the one I use to work with the data?
This is my code
let calculateRating = async () => {
  const getData = await matchesCollection().find().lean();
  const playerCollection = await playersCollection();
  const getDataPlayer = await playerCollection.find().lean();
  let gamesCounting = [];
  // Attach a rating-engine player object to each record.
  getDataPlayer.forEach((player) => {
    player.makePlayer = ranking.makePlayer(1500);
  });
  for (let i = 0; i < getData.length; i++) {
    // findIndex expects a boolean predicate; returning the index from
    // the callback silently fails for index 0.
    const resultA = getDataPlayer.findIndex(
      ({ userId }) => userId === getData[i].userA
    );
    const resultB = getDataPlayer.findIndex(
      ({ userId }) => userId === getData[i].userB
    );
    const winner = getData[i].winner;
    // Skip matches whose players are missing from the collection.
    if (getDataPlayer[resultA] === undefined || getDataPlayer[resultB] === undefined) {
      continue;
    }
    gamesCounting.push([
      getDataPlayer[resultA].makePlayer,
      getDataPlayer[resultB].makePlayer,
      winner,
    ]);
  }
  ranking.updateRatings(gamesCounting);
  let ratingsUpdate = [];
  getDataPlayer.forEach((item) => {
    item.rating = item.makePlayer.getRating();
    item.rd = item.makePlayer.getRd();
    item.vol = item.makePlayer.getVol();
    ratingsUpdate.push(item);
  });
};
I tried the workaround of creating a new collection.
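If the error is Mongoose's "Cannot overwrite model once compiled", the usual cause is calling mongoose.model() with the same name twice. A minimal sketch of the standard guard, plus a single bulk write of the recalculated ratings; the Player model name, schema, and field names here are assumptions:

const mongoose = require('mongoose');

// Reuse the already-compiled model instead of compiling it a second time.
const Player = mongoose.models.Player ||
  mongoose.model('Player', new mongoose.Schema({
    userId: String, rating: Number, rd: Number, vol: Number
  }));

// Persist the recalculated ratings in one round trip.
async function saveRatings(ratingsUpdate) {
  await Player.bulkWrite(ratingsUpdate.map((p) => ({
    updateOne: {
      filter: { userId: p.userId },
      update: { $set: { rating: p.rating, rd: p.rd, vol: p.vol } },
    },
  })));
}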
Does anyone have bad experience with saving floats via an $executeRaw batch operation, or at least an idea of how to solve my problem? $executeRaw is saving numbers like 7.76e-322: the input is 157, but the saved value is 7.76e-322.
const items = [{ uniqueId: 1, price: 100 }, { uniqueId: 2, price: 200 }];
const completeKeys: string[] = Object.keys(items[0]);
// Assumed: the conflict key should not be overwritten on update.
const ignoredKeys: string[] = ['uniqueId'];

const updateFieldsMapper = (item: any) => {
  return Prisma.sql`(${Prisma.join(
    completeKeys.map((key: string) => item[key])
  )})`;
};

// Quote mixed-case column names for PostgreSQL.
const insertKeys = completeKeys.map((key) =>
  key.toLocaleLowerCase() !== key ? `"${key}"` : `${key}`
);

// Note: this must map over the items, not over the keys.
let insertValues = items.map((item) => updateFieldsMapper(item));

const updateSet = completeKeys.reduce((updateSet: string[], key: string) => {
  if (!ignoredKeys.includes(key)) {
    updateSet.push(`"${key}" = EXCLUDED."${key}"`);
  }
  return updateSet;
}, []);

try {
  await prisma.$executeRaw`
    INSERT INTO "Product" (${Prisma.raw(insertKeys.join(","))})
    VALUES ${Prisma.join(insertValues)}
    ON CONFLICT ("uniqueId")
    DO UPDATE SET ${Prisma.raw(updateSet.join(","))};`;
} catch (error) {
  console.error(util.inspect(error, false, null, true));
  Sentry.captureException(error);
}
Thank you very much
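For what it's worth, 7.76e-322 is approximately 157 × 2^-1074, i.e. the integer 157's bit pattern read back as an IEEE-754 double, which suggests a type mismatch when the value is bound rather than a rounding issue. A minimal sketch that casts each bound value explicitly so the driver cannot bind an integer into a float8 column; the column names and types here are assumptions:

// Explicit casts on every placeholder, e.g. $1::int, $2::float8.
const insertValues = items.map(
  (item) => Prisma.sql`(${item.uniqueId}::int, ${item.price}::float8)`
);

await prisma.$executeRaw`
  INSERT INTO "Product" ("uniqueId", price)
  VALUES ${Prisma.join(insertValues)}
  ON CONFLICT ("uniqueId")
  DO UPDATE SET price = EXCLUDED.price;`;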
Does anyone know why this does not work? What am I doing wrong here? It gets stuck after the console.log "after read stream".
I am trying to read a bunch of files, convert them to JSON, and upload them with BulkWriter to Firestore.
After every 400 documents I call close() to write them to Firestore, and then I create a new BulkWriter.
I also tried awaiting bulkWriter.create(eventDoc, {}), but it does not work either: it also gets stuck and there is no error. Why is this? The create method returns a promise.
Why can't it be awaited?
https://googleapis.dev/nodejs/firestore/latest/BulkWriter.html#create
The idea is to process one file at a time; each can contain tens of thousands of rows that need to be uploaded to Firestore.
I am calling this method in a for...of loop and awaiting the processBatch method.
Any help highly appreciated.
async processBatch(document: string, file: string): Promise<void> {
  const db = admin.firestore();
  console.log('start: ', document);
  let bulkWriter;
  const writeBatchLimit = 400;
  let documentsInBatch = 0;
  let totalInDocument = 0;
  const eventsCollectionRef = db.collection('events');
  const eventDoc = eventsCollectionRef.doc(document);
  return new Promise((resolve, reject) => {
    console.log('promise');
    bulkWriter = db.bulkWriter();
    const csvStream = fs.createReadStream(file);
    console.log('after read stream');
    bulkWriter.create(eventDoc, {})
      .then(result => {
        console.log('Successfully: ', result);
        csvStream.pipe(csvParser())
          .on('data', row => {
            console.log('row');
            bulkWriter.create(eventDoc.collection('event').doc(), row);
            documentsInBatch++;
            if (documentsInBatch > writeBatchLimit) {
              bulkWriter.close();
              totalInDocument = + documentsInBatch;
              documentsInBatch = 0;
              bulkWriter = db.bulkWriter();
            }
          })
          .on('end', () => {
            console.log('file: ', file + ', totalInDocument: ', totalInDocument);
            resolve();
          });
      })
      .catch(err => {
        console.log('Failed: ', err);
        reject();
      });
  });
}
This seems to work:
async processBatch(document: string, file: string): Promise<void> {
  const db = admin.firestore();
  console.log(`start: ${document}`);
  const writeBatchLimit = 400;
  let documentsInBatch = 0;
  let numOfBatches = 0;
  let totalInDocument = 0;
  const eventsCollectionRef = db.collection('events');
  const eventDoc = eventsCollectionRef.doc(document);
  let bulkWriter = db.bulkWriter();
  bulkWriter.create(eventDoc, {});
  // Wrap the stream in a promise so the caller's await really waits
  // for the whole file to be processed.
  return new Promise((resolve, reject) => {
    fs.createReadStream(file)
      .pipe(csvParser())
      .on('data', row => {
        bulkWriter.create(eventDoc.collection('event').doc(), row);
        documentsInBatch++;
        if (documentsInBatch >= writeBatchLimit) {
          numOfBatches++;
          totalInDocument += documentsInBatch;
          documentsInBatch = 0;
          // close() flushes the buffered writes; a fresh writer takes
          // over for the next batch.
          bulkWriter.close();
          console.log(`Committing batch ${numOfBatches}, cumulative: ${totalInDocument}`);
          bulkWriter = db.bulkWriter();
        }
      })
      .on('end', () => {
        // Flush the final partial batch, otherwise the last rows are
        // never committed and the total is understated.
        totalInDocument += documentsInBatch;
        bulkWriter.close().then(() => {
          console.log(`file: ${file}, totalInDocument: ${totalInDocument}`);
          resolve();
        });
      })
      .on('error', reject);
  });
}
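For completeness, the call site described above might look like this (a sketch; the files array and the naming of each event document are assumptions):

import * as path from 'path';

// Process one file at a time; each processBatch resolves only after
// the whole CSV has been streamed and flushed.
for (const file of files) {
  await this.processBatch(path.basename(file, '.csv'), file);
}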
I have two actions.
This is the first one.
I insert a person here:
export const InsertOrUpdate = (person) => {
  return (dispatch) => {
    var instance = Axios.create({
      baseURL: 'url',
      // timeout: 1000,
      headers: {
        "Content-Type": "application/json"
      }
    });
    instance.get("/InsertWorker", {
      params: {
        Name: person.Name,
        Surname: person.Surname,
        Duty: person.Duty,
        DateOfDay: person.Date.substring(0, person.Date.length - 9),
        Shift: person.Shift,
        WorkDayCount: person.WorkDayCount,
        Equipment: person.Equipment,
        Sicil: person.Sicil,
        Ship: person.Ship
      }
    });
  }
}
This is the second one.
I fetch the workers here:
export const getSignedWorkers = (collection) => {
  return (dispatch) => {
    var instance = Axios.create({
      baseURL: 'url',
      // timeout: 1000,
      headers: {
        "Content-Type": "application/json"
      }
    });
    instance.get('/GetWorkers', {
      params: {
        DateOfDay: collection.Tarih,
        Shift: collection.Vardiya
      }
    })
      .then((response) => response.data)
      .then((workers) => {
        const Signed = workers.filter(x => x.Ship != "" && x.Shift != "H");
        console.warn('signed', Signed);
        // Note: `x.Ship == "" || null` never matched the null case.
        const UnSigned = workers.filter(x => x.Ship == "" || x.Ship == null);
        const RemoveOnHolidays = UnSigned.filter(x => x.Shift != "H");
        const OnHoliday = workers.filter(x => x.Shift == "H");
        const AllWorkers = {};
        AllWorkers.signed = Signed;
        AllWorkers.Unsigned = RemoveOnHolidays;
        AllWorkers.OnHoliday = OnHoliday;
        dispatch({ type: FETCH_SIGNED_WORKERS_SIGNED, payload: AllWorkers });
      })
      .catch((error) => {
        console.warn('error', error);
      })
  }
}
I call these actions here:
_Insert = (person) => {
  person.Ship = this.props.gemi_Sefer_No;
  this.props.InsertOrUpdate(person); // <-- here I call the first action
  var date = new Date().getDate(); // current day of month
  var month = new Date().getMonth() + 1; // current month
  var year = new Date().getFullYear(); // current year
  var tarih = year + "/" + month + "/" + date;
  let collection = {};
  collection.Tarih = tarih;
  collection.Vardiya = this.props.vardiya == 0 ? "S"
    : this.props.vardiya == 1 ? "A"
    : this.props.vardiya == 2 ? "G" : null;
  this.props.getSignedWorkers(collection); // <-- here I call the second action
}
I insert workers into the database, then fetch all workers back and use them in another component via Redux. However, it only works correctly some of the time.
I mean, when I insert into the database and fetch again, the insert works, but the fetched list sometimes comes back without the worker I just inserted. As I said, it sometimes works right. What should I do? Is it wrong to call two actions in the same method? If it is, what should I do?
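The two requests race: nothing waits for /InsertWorker to finish before /GetWorkers runs, so the fetch sometimes beats the insert. A minimal sketch of one fix, assuming redux-thunk (where dispatching a thunk returns the thunk's return value): return the Axios promise from the insert action and chain the second dispatch on it.

export const InsertOrUpdate = (person) => {
  return (dispatch) => {
    const instance = Axios.create({
      baseURL: 'url',
      headers: { "Content-Type": "application/json" }
    });
    // Returning the promise lets the caller wait for the insert.
    return instance.get("/InsertWorker", { params: { /* same params as above */ } });
  };
};

_Insert = (person) => {
  person.Ship = this.props.gemi_Sefer_No;
  // `collection` built exactly as in the original _Insert above.
  this.props.InsertOrUpdate(person)
    .then(() => this.props.getSignedWorkers(collection)); // fetch only after the insert completes
};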
I have created a POST API but I am not able to figure out why I am getting this error. Any suggestions for what I need to change in my query?
Query:
router.post('/bills', function(req, httpres, next) {
  console.log("Inside the bills api");
  const name = req.body.name;
  const designation = req.body.designation;
  const department = req.body.department;
  const address = req.body.address;
  const phone = req.body.phone;
  const mobile = req.body.mobile;
  const email = req.body.email;
  const organization = req.body.organization;
  const city = req.body.city;
  const state = req.body.state;
  const pincode = req.body.pincode;
  const fax = req.body.fax;
  console.log(name, designation, department, address, phone, mobile, email, organization, city, state, pincode, fax);
  pool.query("Insert into bill_to(name,designation,department,address,phone,mobile,email,organization,city,state,pincode,fax) VALUES ($1,$2,$3,$4,$5,$6,$7,$8,$9,$10,$11,$12)",
    [name, designation.department, address, phone, mobile, email, organization, city, state, pincode, fax])
    .subscribe(
      data => {
        console.log('success', data);
        /*
        if (data.rowCount > 0) {
          httpres.json({ status: true, message: ' ok', parameters: req.body });
        } else {
          httpres.send('error');
        }*/
        if (data.rows[0].exists) {
          httpres.json({ status: true, message: 'data inserted', parameters: req.body });
        } else {
          httpres.send('error');
        }
      }, err => {
        console.log('error', err);
        httpres.send('error');
      });
});
You have put a period (.) instead of a comma (,) between designation and department in the pool.query parameter array:
... [name,designation.department,address,phone,mobile,email,organization,city,state,pincode,fax])
Change it to a comma and it should work:
... [name,designation,department,address,phone,mobile,email,organization,city,state,pincode,fax])