How to update a collection if you already called its model (MongoDB, Mongoose)

OK, so I have a problem in which I use a collection to gather some ratings data and work with it. By the time I finish the rating update process, I have new ratings that I would like to write back to the collection. However, I can't call update, because I get the error "Cannot overwrite model once compiled." I understand that I already compiled the model once to work with the data, and that's why I get the error. Is there any way I can update the collection? Or will I just have to work around it by creating a new collection with the latest ratings, and then matching the latest-ratings collection against the one I use to work with the data?
This is my code
let calculateRating = async () => {
    const getData = await matchesCollection().find().lean();
    const playerCollection = await playersCollection();
    const getDataPlayer = await playerCollection.find().lean();
    let gamesCounting = [];
    getDataPlayer.forEach((player) => {
        player.makePlayer = ranking.makePlayer(1500);
    });
    for (let i = 0; i < getData.length; i++) {
        // findIndex expects a boolean, so compare directly
        // rather than returning the index from the callback
        const resultA = getDataPlayer.findIndex(
            ({ userId }) => userId === getData[i].userA
        );
        const resultB = getDataPlayer.findIndex(
            ({ userId }) => userId === getData[i].userB
        );
        const winner = getData[i].winner;
        if (getDataPlayer[resultA] === undefined || getDataPlayer[resultB] === undefined) {
            continue;
        }
        gamesCounting.push([
            getDataPlayer[resultA].makePlayer,
            getDataPlayer[resultB].makePlayer,
            winner,
        ]);
    }
    ranking.updateRatings(gamesCounting);
    let ratingsUpdate = [];
    getDataPlayer.forEach((item) => {
        item.rating = item.makePlayer.getRating();
        item.rd = item.makePlayer.getRd();
        item.vol = item.makePlayer.getVol();
        ratingsUpdate.push(item);
    });
};
I tried the workaround of creating a new collection.
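One way this is commonly handled (a sketch, with assumptions: Player and playerSchema are placeholder names for the actual model, and userId is assumed to be the players' key) is to reuse the already-compiled model instead of compiling it again, and to push the new ratings back in a single bulkWrite:

const mongoose = require('mongoose');

// reuse the compiled model if it exists; compiling it a second time
// is what raises "Cannot overwrite model once compiled"
const Player = mongoose.models.Player || mongoose.model('Player', playerSchema);

// inside calculateRating, after ratingsUpdate has been built:
// persist all new ratings in one round trip
await Player.bulkWrite(
    ratingsUpdate.map((p) => ({
        updateOne: {
            filter: { userId: p.userId },
            update: { $set: { rating: p.rating, rd: p.rd, vol: p.vol } },
        },
    }))
);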

Related

How to update multiple rows of a database table in ASP.NET Core

I want to update multiple rows in ASP.NET Core, but I get an error:
InvalidOperationException: The entity type 'EntityQueryable' was not found. Ensure that the entity type has been added to the model.
This is my code:
var data = _db.UserTable.Where(a => a.CityIDn == selectedid);
foreach (var items in data)
{
    await Task.Run(() =>
    {
        items.CityID = 2;
    });
    _db.Update(data);
    await _db.SaveChangesAsync();
}
Try this for multiple rows:
var data = _db.UserTable.Where(a => a.CityIDn == selectedid).ToList();
foreach (var item in data)
{
    item.CityID = 2;
    _db.UserTable.Update(item);
}
await _db.SaveChangesAsync();
And for one record, try like this:
var data = _db.UserTable.Where(a => a.CityIDn == selectedid).FirstOrDefault();
if (data != null)
{
    data.CityID = 2;
    _db.UserTable.Update(data);
    await _db.SaveChangesAsync();
}
The argument to Update should be items, not data. It worked for me; you can give it a try.
var data = _db.UserTable.Where(a => a.CityIDn == selectedid).ToList();
foreach (var items in data)
{
    await Task.Run(() =>
    {
        items.CityID = 2;
    });
    _db.Update(items);
    await _db.SaveChangesAsync();
}
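If you are on EF Core 7 or later, the same update can also be pushed down to the database as a single statement, without loading the entities at all. A sketch, reusing the UserTable and CityIDn names from the question:

// requires Microsoft.EntityFrameworkCore 7.0+
var rowsAffected = await _db.UserTable
    .Where(a => a.CityIDn == selectedid)
    .ExecuteUpdateAsync(s => s.SetProperty(u => u.CityID, 2));

This bypasses change tracking entirely, so no SaveChangesAsync call is needed for this update.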

Multi-row insert into multiple tables in a transaction using pg-promise

I am trying to perform a transaction using pg-promise, in which I intend to insert multiple rows into multiple tables. I am using the pg-promise helper methods to optimize the insert into each table, but I am unable to concatenate all the queries together so that only one query is executed. I would also really appreciate it if someone could point out any way to further optimize the code below.
Currently I am getting the error "Parameter 'queries' must be an array." on pgp.helpers.concat.
db.tx(async t => {
    let customerDataArray = [];
    let campaignStatusArray = [];
    let messageDetailsArray = [];
    for (const customer of customerDetails) {
        const customerData = {
            columna1: customer.x,
            columnb1: customer.y
        };
        const campaignStatus = {
            columna2: customer.a,
            columnb2: customer.b
        };
        customerDataArray.push(customerData);
        campaignStatusArray.push(campaignStatus);
        for (const message of customer.messages) {
            const messageStatus = {
                columna: message.x,
                columnb: message.y
            };
            messageDetailsArray.push(messageStatus);
        }
    }
    const customerDataCs = new pgp.helpers.ColumnSet(['columna1', 'columnb1'], {table: 'customerdata'});
    const campaignStatusCs = new pgp.helpers.ColumnSet(['columna2', 'columnb2'], {table: 'campaignstatus'});
    const messageStatusCs = new pgp.helpers.ColumnSet(['columna', 'columnb'], {table: 'messageStatus'});
    const customerDataQuery = pgp.helpers.insert(customerDataArray, customerDataCs);
    const campaignStatusQuery = pgp.helpers.insert(campaignStatusArray, campaignStatusCs);
    const messageStatusQuery = pgp.helpers.insert(messageDetailsArray, messageStatusCs);
    const concatAllQueries = pgp.helpers.concat(customerDataQuery, campaignStatusQuery, messageStatusQuery);
    await t.none(concatAllQueries);
})
.then(() => {
    // success;
})
.catch(error => {
    // error;
});
You need to pass your queries to concat in a single array like so:
const concatAllQueries = pgp.helpers.concat([ customerDataQuery, campaignStatusQuery, messageStatusQuery ]);
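Putting it together, the end of the transaction would look like this (a sketch, based on the variables defined inside the db.tx callback in the question):

const concatAllQueries = pgp.helpers.concat([
    customerDataQuery,
    campaignStatusQuery,
    messageStatusQuery
]);
await t.none(concatAllQueries);

concat takes a single array of queries (strings or {query, values} objects), which is why passing the three queries as separate arguments raises "Parameter 'queries' must be an array."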

Override object properties in createAsyncThunk method

I have a function like this
export const fetchChildrenNews = createAsyncThunk('news/fetch1', async ([item, news]) => {
    const res = await Promise.all(item.kids.map(id => {
        let url = `https://hacker-news.firebaseio.com/v0/item/${id}.json?print=pretty`;
        return fetch(url);
    }));
    const jsons = await Promise.all(res.map(r => r.json()));
    let users = {...item, kids: jsons};
    item.kids = []; // doesn't work
    item.id = 0; // doesn't work
    // I want to find a branch in the original tree and replace it
    const tree = (obj) => {
        for (let key in obj) {
            if (key === "id" && obj[key] === users.id) {
                obj = users;
            }
            if (key == "kids") {
                tree(obj);
            }
        }
    };
    tree(item);
});
Here item is a nested object, a record like {by: 'nullzzz', descendants: 47, id: 28808556, kids: Array(13), score: 117}. The kids property contains an array of ids, and in the users variable it becomes an array of records. My goal is to change record.kids = [0, 7, 14] into record.kids = users ([{by: '...', id: 4848, ...}], [{by: 'adasd'}], [{by: 'zzz'}]). The variable news is the whole tree, while item is one of its branches.
I just started working with the toolkit, so I don't fully understand this.
Since item is probably an object from your Redux store, that thunk would try to modify a reference to your store - and modifying the store is only allowed in reducers.
Generally, you should be doing logic like this in reducers, not in a thunk.
So, do
export const fetchChildrenNews = createAsyncThunk('news/fetch1', async ([item, news]) => {
    const res = await Promise.all(item.kids.map(id => {
        let url = `https://hacker-news.firebaseio.com/v0/item/${id}.json?print=pretty`;
        return fetch(url);
    }));
    const jsons = await Promise.all(res.map(r => r.json()));
    return jsons;
});
and then in your slice, add the logic:
builder.addCase(fetchChildrenNews.fulfilled, (state, action) => {
    const jsons = action.payload;
    // here, do not modify `items`, but modify `state` - I would assume `items` is somewhere in here anyways?
});
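A sketch of what that reducer logic could look like, assuming the slice keeps the news tree under state.items (the exact field name is not in the question) and using the fact that createAsyncThunk exposes the original thunk argument as action.meta.arg:

builder.addCase(fetchChildrenNews.fulfilled, (state, action) => {
    const [item] = action.meta.arg; // the [item, news] tuple the thunk was dispatched with
    const kids = action.payload;
    // walk the tree in state and replace the matching branch's kids;
    // Immer makes direct mutation safe inside a reducer
    const replaceBranch = (node) => {
        if (!node || typeof node !== 'object') return;
        if (node.id === item.id) {
            node.kids = kids;
            return;
        }
        (node.kids || []).forEach(replaceBranch);
    };
    replaceBranch(state.items);
});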

Batches with BulkWriter in google firestore

Does anyone know why this does not work, and what I am doing wrong here? It gets stuck after the console.log "after read stream".
I am trying to read a bunch of files, convert them to JSON, and upload them with BulkWriter to Firestore.
After every 400 documents I call close to write them to Firestore, and then I create a new BulkWriter.
I also tried awaiting bulkWriter.create(eventDoc, {}), but it does not work. It also gets stuck, and there is no error. Why is this? The create method returns a promise.
Why can't it be awaited?
https://googleapis.dev/nodejs/firestore/latest/BulkWriter.html#create
The idea is to process one file at a time; each can contain tens of thousands of rows which need to be uploaded to Firestore.
I am calling this method in a for...of loop and awaiting the processBatch method.
Any help is highly appreciated.
async processBatch(document: string, file: string): Promise<void> {
    const db = admin.firestore();
    console.log('start: ', document);
    let bulkWriter;
    const writeBatchLimit = 400;
    let documentsInBatch = 0;
    let totalInDocument = 0;
    const eventsCollectionRef = db.collection('events');
    const eventDoc = eventsCollectionRef.doc(document);
    return new Promise((resolve, reject) => {
        console.log('promise');
        bulkWriter = db.bulkWriter();
        const csvStream = fs.createReadStream(file);
        console.log('after read stream');
        bulkWriter.create(eventDoc, {})
            .then(result => {
                console.log('Successfully: ', result);
                csvStream.pipe(csvParser())
                    .on('data', row => {
                        console.log('row');
                        bulkWriter.create(eventDoc.collection('event').doc(), row);
                        documentsInBatch++;
                        if (documentsInBatch > writeBatchLimit) {
                            bulkWriter.close();
                            totalInDocument = + documentsInBatch;
                            documentsInBatch = 0;
                            bulkWriter = db.bulkWriter();
                        }
                    })
                    .on('end', () => {
                        console.log('file: ', file + ', totalInDocument: ', totalInDocument);
                        resolve();
                    });
            })
            .catch(err => {
                console.log('Failed: ', err);
                reject();
            });
    });
}
This seems to work:
async processBatch(document: string, file: string): Promise<void> {
    const db = admin.firestore();
    console.log(`start: ${document}`);
    let bulkWriter;
    const writeBatchLimit = 400;
    let documentsInBatch = 0;
    let numOfBatches = 0;
    let totalInDocument = 0;
    const eventsCollectionRef = db.collection('events');
    const eventDoc = eventsCollectionRef.doc(document);
    bulkWriter = db.bulkWriter();
    const csvStream = fs.createReadStream(file);
    bulkWriter.create(eventDoc, {});
    csvStream.pipe(csvParser())
        .on('data', row => {
            bulkWriter.create(eventDoc.collection('event').doc(), row);
            documentsInBatch++;
            if (documentsInBatch > writeBatchLimit) {
                numOfBatches++;
                totalInDocument += documentsInBatch;
                documentsInBatch = 0;
                bulkWriter.close();
                console.log(`Committing batch ${numOfBatches}, cumulative: ${totalInDocument}`);
                bulkWriter = db.bulkWriter();
            }
        })
        .on('end', () => {
            console.log(`file: ${file}, totalInDocument: ${totalInDocument}`);
        });
}
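As a possible simplification (a sketch, not tested against this exact pipeline): BulkWriter also has a flush() method, which commits everything buffered so far while keeping the writer usable, so the close-and-recreate dance per batch should not be necessary:

.on('data', row => {
    bulkWriter.create(eventDoc.collection('event').doc(), row);
    documentsInBatch++;
    if (documentsInBatch > writeBatchLimit) {
        totalInDocument += documentsInBatch;
        documentsInBatch = 0;
        bulkWriter.flush(); // commits pending writes; the writer stays usable
    }
})
.on('end', () => bulkWriter.close()); // close once, at the very end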

How to improve speed of query for Firestore + Flutter?

I have a Flutter + Firestore app with a performance problem: a large query takes about 5 seconds to execute. The database is still small, so I expect the execution time to grow even further as it increases. How can I improve the application's performance?
import 'package:carstat/models/entry.dart';
import 'package:carstat/models/operation.dart';
import 'package:carstat/services/data_service.dart';

class DashboardService {
  DataService dataService = DataService();

  getMarkers(List<Entry> entries, String carId) async {
    var _markers = [];
    for (int i = 0; i < entries.length; i++) {
      List<Operation> _operations = [];
      _operations =
          await dataService.getEntryOperations(entries[i].entryId, carId);
      _markers.add({'entry': entries[i], 'operations': _operations});
    }
    return _markers;
  }
}
My data structure for example:
.document(docId)
.collection('cars')
.document(carId)
.collection('entries')
.document(entryId)
.collection('operations')
.document();
DataService code:
getEntryOperations(String entryId, String carId) async {
  List<Operation> _operations = [];
  Future<QuerySnapshot> _userDoc =
      fs.where('userId', isEqualTo: _userId).getDocuments();
  await _userDoc.then((res) {
    docId = res.documents[0].documentID;
  });
  Future<QuerySnapshot> _entryOperations = fs
      .document(docId)
      .collection('cars')
      .document(carId)
      .collection('entries')
      .document(entryId)
      .collection('operations')
      .getDocuments();
  await _entryOperations.then((val) {
    for (int i = 0; i < val.documents.length; i++) {
      var _operation = Operation();
      _operation.entryId = entryId;
      _operation.operationNote = val.documents[i].data['operationNote'];
      _operation.operationDate =
          val.documents[i].data['operationDate'].toDate();
      _operation.operationMileage = val.documents[i].data['operationMileage'];
      _operation.operationPartName =
          val.documents[i].data['operationPartName'];
      _operation.operationPrice = val.documents[i].data['operationPrice'];
      _operation.partPrice = val.documents[i]['partPrice'];
      _operation.operationId = val.documents[i]['operationId'];
      _operations.add(_operation);
    }
  });
  return _operations;
}
The query you're showing is unconditionally getting all of the documents in a specific subcollection. Of course, that will take more time as the collection grows. There is no secret trick or special flag to pass to make this query happen any faster.
In fact, there is not much you can do about this at all, other than to limit the size of the collection, or limit the number of documents in the query. You might also want to reconsider your database structure to reduce the number of documents you're fetching.
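Apart from restructuring, one small thing stands out in getEntryOperations: it re-runs the userId lookup on every call, which adds one extra query per entry. A sketch of memoizing that lookup (assuming the fs reference and _userId field from the question; _cachedDocId and _getDocId are hypothetical names):

String _cachedDocId;

Future<String> _getDocId() async {
  // look the user document up once and cache its id for later calls
  if (_cachedDocId == null) {
    final res = await fs.where('userId', isEqualTo: _userId).getDocuments();
    _cachedDocId = res.documents[0].documentID;
  }
  return _cachedDocId;
}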
My answer, which is much faster: instead of awaiting each getEntryOperations call sequentially, Future.wait fires them all and waits for them in parallel.
class DashboardService {
  DataService dataService = DataService();

  getMarkers(List<Entry> entries, String carId) async {
    var _marker = []; // a list of operations for each maintenance entry
    final opsForEntries = await Future.wait(
      entries.map((value) {
        return dataService.getEntryOperations(value.entryId, carId);
      })
    );
    for (int i = 0; i < entries.length; i++) {
      _marker.add({
        'entry': entries[i],
        'operations': opsForEntries[i]
      });
    }
    return _marker;
  }
}
}