Cloud Functions: Transaction with FieldValue.increment() not running atomically - google-cloud-firestore

I have a Cloud Functions transaction that uses FieldValue.increment() to update a nested map, but it doesn't appear to run atomically, so the value updates aren't accurate (running the transaction in quick succession results in an incorrect count).
The function is fired via:
export const updateCategoryAndSendMessage = functions.firestore.document('events/{any}').onUpdate((event, context) => {
which includes the following transaction:
db.runTransaction(async tx => {
  const categoryCounterRef = db.collection("data").doc("categoryCount")
  const intToIncrement = event.after.data().published == true ? 1 : -1;
  const location = event.after.data().location;
  await tx.get(categoryCounterRef).then(doc => {
    for (const key in event.after.data().category) {
      event.after.data().category[key].forEach(async (subCategory) => {
        const map = { [key]: { [subCategory]: FieldValue.increment(intToIncrement) } };
        await tx.set(categoryCounterRef, { [location]: map }, { merge: true })
      })
    }
  }).then(result => {
    console.info('Transaction success!')
  }).catch(err => {
    console.error('Transaction failure:', err)
  })
}).catch((error) => console.log(error));
Example:
Value of field to increment: 0
Tap the button that triggers the function multiple times in quick succession (to switch between true and false for "published")
Expected value: 0 or 1 (depending on whether reference document value is true or false)
Actual value: -3, 5, -2 etc.
As far as I'm aware, transactions should be performed "first come, first served" to avoid inaccurate data. It seems like the function isn't "queuing up" correctly - for lack of a better word.
I'm a bit stumped, would greatly appreciate any guidance with this.

Oh goodness, I was missing return...
return db.runTransaction(tx => {
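For reference, a minimal sketch of the corrected handler might look like the following. The essential fix is returning the transaction promise so Cloud Functions waits for it to settle; the rest is just a lightly tidied version of the code from the question, reusing its db, functions and FieldValue imports:
export const updateCategoryAndSendMessage = functions.firestore
  .document('events/{any}')
  .onUpdate((event, context) => {
    const data = event.after.data();
    const intToIncrement = data.published === true ? 1 : -1;
    const categoryCounterRef = db.collection('data').doc('categoryCount');

    // Returning the promise lets Cloud Functions wait for the transaction to finish,
    // so concurrent invocations are retried and applied correctly.
    return db.runTransaction(async tx => {
      await tx.get(categoryCounterRef);
      for (const key in data.category) {
        for (const subCategory of data.category[key]) {
          tx.set(
            categoryCounterRef,
            { [data.location]: { [key]: { [subCategory]: FieldValue.increment(intToIncrement) } } },
            { merge: true }
          );
        }
      }
    });
  });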

Related

react-query: accessing the query with matching condition (by 1st index of queryKey array)

I have queries like:
useQuery(['myquery',{test:1}], fetchFn)
useQuery(['myquery',{test:2}], fetchFn)
useQuery(['myquery',{test:3}], fetchFn)
I would like to observe the data of all the queries keyed with myquery without knowing the rest of the queryKey items.
As I understand the documentation, it is possible to observe multiple queries, but my matching condition doesn't seem to be covered.
const observer = new QueriesObserver(queryClient, [
{ queryKey: ['post', 1], queryFn: fetchPost },
{ queryKey: ['post', 2], queryFn: fetchPost },
])
const unsubscribe = observer.subscribe(result => {
console.log(result)
unsubscribe()
})
I could only find a similar usage for useIsFetching, but it only gives the number of matching queries:
// How many queries matching the posts prefix are fetching?
const isFetchingPosts = useIsFetching(['posts'])
But I want to access the result of the queries, specifically the last updated one.
This is the best thing I could come up with using queryClient:
const Component = () => {
  // match all queries with:
  const keyPrefix = "courseSection_list";
  // since it is loading state, it will trigger twice for each returning result
  const matchingQueriesUpdated = useIsFetching([keyPrefix]);
  const data = useMemo(() => {
    const lastUpdatedMatchingQuery = queryClient.queryCache.queries
      .filter((q) => q.queryKey[0] === keyPrefix)
      // sorting puts the last updated one at index 0
      .sort((a, b) => b.state.dataUpdatedAt - a.state.dataUpdatedAt)[0];
    return lastUpdatedMatchingQuery.state.data;
  }, [matchingQueriesUpdated]);
  return <div> bla bla </div>
}
The extra render can be prevented by checking for a dataUpdatedAt value of 0 during the loading state, but I'd rather keep my code simpler for now.
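As an alternative sketch, react-query v3's public cache API also matches by key prefix, so you don't have to reach into queryCache.queries directly (behaviour assumed from the v3 API; verify against your version):
// All cached queries whose key starts with ['myquery'], whatever the rest of the key is.
const matching = queryClient.getQueryCache().findAll(['myquery']);
// Pick the most recently updated query and read its data.
const latest = matching
  .slice()
  .sort((a, b) => b.state.dataUpdatedAt - a.state.dataUpdatedAt)[0];
const latestData = latest ? latest.state.data : undefined;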

How to make a complex query to count nested objects that match with a query on firestore? [duplicate]

Is it possible to count how many items a collection has using the new Firebase database, Cloud Firestore?
If so, how do I do that?
2023 Update
Firestore now supports aggregation queries.
Node SDK
const collectionRef = db.collection('cities');
const snapshot = await collectionRef.count().get();
console.log(snapshot.data().count);
Web v9 SDK
const coll = collection(db, "cities");
const snapshot = await getCountFromServer(coll);
console.log('count: ', snapshot.data().count);
Notable Limitation - You cannot currently use count() queries with real-time listeners and offline queries. (See below for alternatives)
Pricing - Pricing depends on the number of matched index entries rather than the number of documents: you are charged one document read per batch of up to 1,000 index entries matched, which makes this far cheaper than reading the documents individually.
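The aggregation also works on a filtered query, so you only pay for the index entries that match. A small sketch with the Node SDK (the where clause and field name are just an assumed example):
const collectionRef = db.collection('cities');
// Count only the documents matching the filter, without downloading them.
const snapshot = await collectionRef.where('state', '==', 'CA').count().get();
console.log(snapshot.data().count);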
Old Answer
As with many questions, the answer is - It depends.
You should be very careful when handling large amounts of data on the front end. On top of making your front end feel sluggish, Firestore also charges you $0.60 per million reads you make.
Small collection (less than 100 documents)
Use with care - Frontend user experience may take a hit
Handling this on the front end should be fine as long as you are not doing too much logic with this returned array.
db.collection('...').get().then(snap => {
size = snap.size // will return the collection size
});
Medium collection (100 to 1000 documents)
Use with care - Firestore read invocations may cost a lot
Handling this on the front end is not feasible as it has too much potential to slow down the user's system. We should handle this logic server-side and only return the size.
The drawback to this method is you are still invoking Firestore reads (equal to the size of your collection), which in the long run may end up costing you more than expected.
Cloud Function:
db.collection('...').get().then(snap => {
res.status(200).send({length: snap.size});
});
Front End:
yourHttpClient.post(yourCloudFunctionUrl).toPromise().then(snap => {
size = snap.length // will return the collection size
})
Large collection (1000+ documents)
Most scalable solution
FieldValue.increment()
As of April 2019 Firestore now allows incrementing counters, completely atomically, and without reading the data prior. This ensures we have correct counter values even when updating from multiple sources simultaneously (previously solved using transactions), while also reducing the number of database reads we perform.
By listening to any document deletes or creates we can add to or remove from a count field that is sitting in the database.
See the firestore docs - Distributed Counters
Or have a look at Data Aggregation by Jeff Delaney. His guides are truly fantastic for anyone using AngularFire but his lessons should carry over to other frameworks as well.
Cloud Function:
export const documentWriteListener = functions.firestore
  .document('collection/{documentUid}')
  .onWrite((change, context) => {
    // docRef is the path of the document holding the counter (defined elsewhere)
    if (!change.before.exists) {
      // New document created: add one to count
      db.doc(docRef).update({ numberOfDocs: FieldValue.increment(1) });
    } else if (change.before.exists && change.after.exists) {
      // Updating existing document: do nothing
    } else if (!change.after.exists) {
      // Deleting document: subtract one from count
      db.doc(docRef).update({ numberOfDocs: FieldValue.increment(-1) });
    }
    return;
  });
Now on the frontend you can just query this numberOfDocs field to get the size of the collection.
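A minimal read of that counter might look like this (the counter document path is an assumption for illustration, since the snippet above leaves docRef undefined):
// One document read instead of reading the whole collection.
db.doc('counters/collection').get().then(snap => {
  console.log('collection size:', snap.get('numberOfDocs') || 0);
});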
Simplest way to do so is to read the size of a "querySnapshot".
db.collection("cities").get().then(function(querySnapshot) {
console.log(querySnapshot.size);
});
You can also read the length of the docs array inside "querySnapshot".
querySnapshot.docs.length;
Or if a "querySnapshot" is empty by reading the empty value, which will return a boolean value.
querySnapshot.empty;
As far as I know there is no built-in solution for this, and it is only possible in the Node SDK right now.
If you have a
db.collection('someCollection')
you can use
.select([fields])
to define which field you want to select. If you do an empty select() you will just get an array of document references.
example:
db.collection('someCollection').select().get().then(
(snapshot) => console.log(snapshot.docs.length)
);
This solution is only an optimization over the worst case of downloading all documents, and it does not scale to large collections!
Also have a look at this:
How to get a count of number of documents in a collection with Cloud Firestore
Aggregate count query just landed as a preview in Firestore.
Announced at the 2022 Firebase Summit: https://firebase.blog/posts/2022/10/whats-new-at-Firebase-Sumit-2022
Excerpt:
[Developer Preview] Count() function: With the new count function in
Firstore [sic], you can now get the count of the matching documents when you
run a query or read from a collection, without loading the actual
documents, which saves you a lot of time.
A code sample was shown at the summit (not reproduced here).
During the Q&A, someone asked about pricing for aggregated queries, and the answer the Firebase team provided was that it'll cost 1 / 1000th of the price of a read (rounded up to the nearest read, see comments below for more details), but will count all records that are part of the aggregate.
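If that pricing holds, counting 150,000 matching documents would be billed as roughly ceil(150,000 / 1,000) = 150 reads rather than 150,000.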
Be careful counting number of documents for large collections. It is a little bit complex with firestore database if you want to have a precalculated counter for every collection.
Code like this doesn't work in this case:
export const customerCounterListener =
functions.firestore.document('customers/{customerId}')
.onWrite((change, context) => {
// on create
if (!change.before.exists && change.after.exists) {
return firestore
.collection('metadatas')
.doc('customers')
.get()
.then(docSnap =>
docSnap.ref.set({
count: docSnap.data().count + 1
}))
// on delete
} else if (change.before.exists && !change.after.exists) {
return firestore
.collection('metadatas')
.doc('customers')
.get()
.then(docSnap =>
docSnap.ref.set({
count: docSnap.data().count - 1
}))
}
return null;
});
The reason is that every Cloud Firestore trigger has to be idempotent, as the Firestore documentation says: https://firebase.google.com/docs/functions/firestore-events#limitations_and_guarantees
Solution
So, to prevent multiple executions of your code, you need to manage events and transactions. This is my particular way of handling large-collection counters:
const executeOnce = (change, context, task) => {
const eventRef = firestore.collection('events').doc(context.eventId);
return firestore.runTransaction(t =>
t
.get(eventRef)
.then(docSnap => (docSnap.exists ? null : task(t)))
.then(() => t.set(eventRef, { processed: true }))
);
};
const documentCounter = collectionName => (change, context) =>
executeOnce(change, context, t => {
// on create
if (!change.before.exists && change.after.exists) {
return t
.get(firestore.collection('metadatas')
.doc(collectionName))
.then(docSnap =>
t.set(docSnap.ref, {
count: ((docSnap.data() && docSnap.data().count) || 0) + 1
}));
// on delete
} else if (change.before.exists && !change.after.exists) {
return t
.get(firestore.collection('metadatas')
.doc(collectionName))
.then(docSnap =>
t.set(docSnap.ref, {
count: docSnap.data().count - 1
}));
}
return null;
});
Use cases here:
/**
* Count documents in articles collection.
*/
exports.articlesCounter = functions.firestore
.document('articles/{id}')
.onWrite(documentCounter('articles'));
/**
* Count documents in customers collection.
*/
exports.customersCounter = functions.firestore
.document('customers/{id}')
.onWrite(documentCounter('customers'));
As you can see, the key to preventing multiple executions is the eventId property in the context object. If the function is invoked several times for the same event, the event id will be the same in every invocation. Unfortunately, you must keep an "events" collection in your database.
In 2020 this is still not available in the Firebase SDK; it is available in Firebase Extensions (Beta), but it's pretty complex to set up and use...
A reasonable approach
Helpers... (create/delete seems redundant but is cheaper than onUpdate)
export const onCreateCounter = () => async (
change,
context
) => {
const collectionPath = change.ref.parent.path;
const statsDoc = db.doc("counters/" + collectionPath);
const countDoc = {};
countDoc["count"] = admin.firestore.FieldValue.increment(1);
await statsDoc.set(countDoc, { merge: true });
};
export const onDeleteCounter = () => async (
change,
context
) => {
const collectionPath = change.ref.parent.path;
const statsDoc = db.doc("counters/" + collectionPath);
const countDoc = {};
countDoc["count"] = admin.firestore.FieldValue.increment(-1);
await statsDoc.set(countDoc, { merge: true });
};
export interface CounterPath {
watch: string;
name: string;
}
Exported Firestore hooks
export const Counters: CounterPath[] = [
{
name: "count_buildings",
watch: "buildings/{id2}"
},
{
name: "count_buildings_subcollections",
watch: "buildings/{id2}/{id3}/{id4}"
}
];
Counters.forEach(item => {
exports[item.name + '_create'] = functions.firestore
.document(item.watch)
.onCreate(onCreateCounter());
exports[item.name + '_delete'] = functions.firestore
.document(item.watch)
.onDelete(onDeleteCounter());
});
In action
The buildings root collection and all sub-collections will be tracked, here under the /counters/ root path.
Now collection counts will update automatically (and eventually consistently!). If you need a count, just use the collection path and prefix it with counters.
const collectionPath = 'buildings/138faicnjasjoa89/buildingContacts';
const collectionCount = await db
.doc('counters/' + collectionPath)
.get()
.then(snap => snap.get('count'));
Limitations
As this approach uses a single database and document, it is limited by the Firestore constraint of one update per second for each counter. It will be eventually consistent, but in cases where large numbers of documents are added or removed, the counter will lag behind the actual collection count. (A sharded variant is sketched below.)
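If a single counter document becomes the bottleneck, the distributed (sharded) counter pattern from the Firestore docs spreads writes across several shard documents. A rough sketch with the Admin SDK (the shard count and paths here are illustrative assumptions, not part of the answer above):
const NUM_SHARDS = 10; // assumed; size this to your expected write rate

// Increment a randomly chosen shard instead of one hot counter document.
async function incrementCounter(counterPath, delta = 1) {
  const shardId = Math.floor(Math.random() * NUM_SHARDS).toString();
  await db.doc(`${counterPath}/shards/${shardId}`).set(
    { count: admin.firestore.FieldValue.increment(delta) },
    { merge: true }
  );
}

// Reading the total means summing every shard.
async function getCount(counterPath) {
  const shards = await db.collection(`${counterPath}/shards`).get();
  return shards.docs.reduce((sum, doc) => sum + (doc.get('count') || 0), 0);
}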
I agree with @Matthew, it will cost a lot if you perform such a query.
[ADVICE FOR DEVELOPERS BEFORE STARTING THEIR PROJECTS]
If we foresee this situation at the beginning, we can create a collection named counters with a document that stores all the counters as fields of type number.
For example:
For each CRUD operation on the collection, update the counter document:
When you create a new collection/subcollection: (+1 in the counter) [1 write operation]
When you delete a collection/subcollection: (-1 in the counter) [1 write operation]
When you update an existing collection/subcollection, do nothing on the counter document: (0)
When you read an existing collection/subcollection, do nothing on the counter document: (0)
Next time, when you want to get the number of documents in the collection, you just need to read that document field. [1 read operation]
In addition, you can store the collection names in an array, but this is tricky; the way Firebase handles arrays is shown below:
// we send this
['a', 'b', 'c', 'd', 'e']
// Firebase stores this
{0: 'a', 1: 'b', 2: 'c', 3: 'd', 4: 'e'}
// since the keys are numeric and sequential,
// if we query the data, we get this
['a', 'b', 'c', 'd', 'e']
// however, if we then delete a, b, and d,
// they are no longer mostly sequential, so
// we do not get back an array
{2: 'c', 4: 'e'}
So, if you are not going to delete collections, you can use an array to store the list of collection names instead of querying all the collections every time.
Hope it helps!
As of October 2022, Firestore has introduced a count() method on the client SDKs. Now you can count for a query without downloads.
For 1000 documents, it will charge you for 1 document read.
Web (v9)
Introduced in Firebase 9.11.0:
const collectionRef = collection(db, "cities");
const snapshot = await getCountFromServer(collectionRef);
console.log('count: ', snapshot.data().count);
Web V8
Not Available.
Node (Admin)
const collectionRef = db.collection('cities');
const snapshot = await collectionRef.count().get();
console.log(snapshot.data().count);
Android (Kotlin)
Introduced in firestore v24.4.0 (BoM 31.0.0):
val query = db.collection("cities")
val countQuery = query.count()
countQuery.get(AggregateSource.SERVER).addOnCompleteListener { task ->
if (task.isSuccessful) {
val snapshot = task.result
Log.d(TAG, "Count: ${snapshot.count}")
} else {
Log.d(TAG, "Count failed: ", task.getException())
}
}
Apple Platforms (Swift)
Introduced in Firestore v10.0.0:
do {
let query = db.collection("cities")
let countQuery = query.countAggregateQuery
let snapshot = try await countQuery.aggregation(source: AggregateSource.server)
print(snapshot.count)
} catch {
print(error)
}
Increment a counter using admin.firestore.FieldValue.increment:
exports.onInstanceCreate = functions.firestore.document('projects/{projectId}/instances/{instanceId}')
.onCreate((snap, context) =>
db.collection('projects').doc(context.params.projectId).update({
instanceCount: admin.firestore.FieldValue.increment(1),
})
);
exports.onInstanceDelete = functions.firestore.document('projects/{projectId}/instances/{instanceId}')
.onDelete((snap, context) =>
db.collection('projects').doc(context.params.projectId).update({
instanceCount: admin.firestore.FieldValue.increment(-1),
})
);
In this example we increment an instanceCount field in the project each time a document is added to the instances sub collection. If the field doesn't exist yet it will be created and incremented to 1.
The incrementation is transactional internally but you should use a distributed counter if you need to increment more frequently than every 1 second.
It's often preferable to implement onCreate and onDelete rather than onWrite, since onWrite also fires for updates, which means you spend more money on unnecessary function invocations (if you update the docs in your collection).
No, there is no built-in support for aggregation queries right now. However there are a few things you could do.
The first is documented here. You can use transactions or cloud functions to maintain aggregate information:
This example shows how to use a function to keep track of the number of ratings in a subcollection, as well as the average rating.
exports.aggregateRatings = firestore
  .document('restaurants/{restId}/ratings/{ratingId}')
  .onWrite(event => {
    // Get value of the newly added rating
    var ratingVal = event.data.get('rating');
    // Get a reference to the restaurant
    var restRef = db.collection('restaurants').doc(event.params.restId);
    // Update aggregations in a transaction
    return db.runTransaction(transaction => {
      return transaction.get(restRef).then(restDoc => {
        // Compute new number of ratings
        var newNumRatings = restDoc.get('numRatings') + 1;
        // Compute new average rating
        var oldRatingTotal = restDoc.get('avgRating') * restDoc.get('numRatings');
        var newAvgRating = (oldRatingTotal + ratingVal) / newNumRatings;
        // Update restaurant info
        return transaction.update(restRef, {
          avgRating: newAvgRating,
          numRatings: newNumRatings
        });
      });
    });
  });
The solution that jbb mentioned is also useful if you only want to count documents infrequently. Make sure to use the select() statement to avoid downloading all of each document (that's a lot of bandwidth when you only need a count). select() is only available in the server SDKs for now so that solution won't work in a mobile app.
UPDATE 11/20
I created an npm package for easy access to a counter function: https://code.build/p/9DicAmrnRoK4uk62Hw1bEV/firestore-counters
I created a universal function using all these ideas to handle all counter situations (except queries).
The only exception would be when doing so many writes a second, it
slows you down. An example would be likes on a trending post. It is
overkill on a blog post, for example, and will cost you more. I
suggest creating a separate function in that case using shards:
https://firebase.google.com/docs/firestore/solutions/counters
// trigger collections
exports.myFunction = functions.firestore
.document('{colId}/{docId}')
.onWrite(async (change: any, context: any) => {
return runCounter(change, context);
});
// trigger sub-collections
exports.mySubFunction = functions.firestore
.document('{colId}/{docId}/{subColId}/{subDocId}')
.onWrite(async (change: any, context: any) => {
return runCounter(change, context);
});
// add change the count
const runCounter = async function (change: any, context: any) {
const col = context.params.colId;
const eventsDoc = '_events';
const countersDoc = '_counters';
// ignore helper collections
if (col.startsWith('_')) {
return null;
}
// simplify event types
const createDoc = change.after.exists && !change.before.exists;
const updateDoc = change.before.exists && change.after.exists;
if (updateDoc) {
return null;
}
// check for sub collection
const isSubCol = context.params.subDocId;
const parentDoc = `${countersDoc}/${context.params.colId}`;
const countDoc = isSubCol
? `${parentDoc}/${context.params.docId}/${context.params.subColId}`
: `${parentDoc}`;
// collection references
const countRef = db.doc(countDoc);
const countSnap = await countRef.get();
// increment size if doc exists
if (countSnap.exists) {
// createDoc or deleteDoc
const n = createDoc ? 1 : -1;
const i = admin.firestore.FieldValue.increment(n);
// create event for accurate increment
const eventRef = db.doc(`${eventsDoc}/${context.eventId}`);
return db.runTransaction(async (t: any): Promise<any> => {
const eventSnap = await t.get(eventRef);
// do nothing if event exists
if (eventSnap.exists) {
return null;
}
// add event and update size
await t.update(countRef, { count: i });
return t.set(eventRef, {
completed: admin.firestore.FieldValue.serverTimestamp()
});
}).catch((e: any) => {
console.log(e);
});
// otherwise count all docs in the collection and add size
} else {
const colRef = db.collection(change.after.ref.parent.path);
return db.runTransaction(async (t: any): Promise<any> => {
// update size
const colSnap = await t.get(colRef);
return t.set(countRef, { count: colSnap.size });
}).catch((e: any) => {
console.log(e);
});
}
}
This handles events, increments, and transactions. The beauty of this is that if you are not sure about the accuracy of a counter document (probably while still in beta), you can delete the counter and it will automatically be recounted on the next trigger. Yes, this costs reads, so don't delete it otherwise.
Same kind of thing to get the count:
const collectionPath = 'buildings/138faicnjasjoa89/buildingContacts';
const colSnap = await db.doc('_counters/' + collectionPath).get();
const count = colSnap.get('count');
Also, you may want to create a cron job (scheduled function) to remove old events to save money on database storage. You need at least the Blaze plan, and there may be some more configuration. For example, the schedule below runs daily at 11:05 in the America/New_York timezone.
https://firebase.google.com/docs/functions/schedule-functions
This is untested, but should work with a few tweaks:
exports.scheduledFunctionCrontab = functions.pubsub.schedule('5 11 * * *')
.timeZone('America/New_York')
.onRun(async (context) => {
// get yesterday
const yesterday = new Date();
yesterday.setDate(yesterday.getDate() - 1);
const eventFilter = db.collection('_events').where('completed', '<=', yesterday);
const eventFilterSnap = await eventFilter.get();
eventFilterSnap.forEach(async (doc: any) => {
await doc.ref.delete();
});
return null;
});
And last, don't forget to protect the collections in firestore.rules:
match /_counters/{document} {
allow read;
allow write: if false;
}
match /_events/{document} {
allow read, write: if false;
}
Update: Queries
Adding to my other answer if you want to automate query counts as well, you can use this modified code in your cloud function:
if (col === 'posts') {
// counter reference - user doc ref
const userRef = after ? after.userDoc : before.userDoc;
// query reference
const postsQuery = db.collection('posts').where('userDoc', "==", userRef);
// add the count - postsCount on userDoc
await addCount(change, context, postsQuery, userRef, 'postsCount');
}
return delEvents();
This will automatically update postsCount on the user document. You could easily add other one-to-many counts this way; it just gives you an idea of how you can automate things. I also gave you another way to delete the events. You have to read each event to delete it, so deleting them later won't really save you anything, it just makes the function slower.
/**
 * Adds a counter to a doc
 * @param change - change ref
 * @param context - context ref
 * @param queryRef - the query ref to count
 * @param countRef - the counter document ref
 * @param countName - the name of the counter on the counter document
 */
const addCount = async function (change: any, context: any,
queryRef: any, countRef: any, countName: string) {
// events collection
const eventsDoc = '_events';
// simplify event type
const createDoc = change.after.exists && !change.before.exists;
// doc references
const countSnap = await countRef.get();
// increment size if field exists
if (countSnap.get(countName)) {
// createDoc or deleteDoc
const n = createDoc ? 1 : -1;
const i = admin.firestore.FieldValue.increment(n);
// create event for accurate increment
const eventRef = db.doc(`${eventsDoc}/${context.eventId}`);
return db.runTransaction(async (t: any): Promise<any> => {
const eventSnap = await t.get(eventRef);
// do nothing if event exists
if (eventSnap.exists) {
return null;
}
// add event and update size
await t.set(countRef, { [countName]: i }, { merge: true });
return t.set(eventRef, {
completed: admin.firestore.FieldValue.serverTimestamp()
});
}).catch((e: any) => {
console.log(e);
});
// otherwise count all docs in the collection and add size
} else {
return db.runTransaction(async (t: any): Promise<any> => {
// update size
const colSnap = await t.get(queryRef);
return t.set(countRef, { [countName]: colSnap.size }, { merge: true });
}).catch((e: any) => {
console.log(e);
});
}
}
/**
* Deletes events over a day old
*/
const delEvents = async function () {
// get yesterday
const yesterday = new Date();
yesterday.setDate(yesterday.getDate() - 1);
const eventFilter = db.collection('_events').where('completed', '<=', yesterday);
const eventFilterSnap = await eventFilter.get();
eventFilterSnap.forEach(async (doc: any) => {
await doc.ref.delete();
});
return null;
}
I should also warn you that universal functions will run on every
onWrite call, period. It may be cheaper to only run the function on
onCreate and onDelete events for your specific collections. Like the
NoSQL database we are using, repeated code and data can save you
money.
There is no direct option available. You can't do db.collection("CollectionName").count().
Below are the two ways by which you can find the count of number of documents within a collection.
1: Get all the documents in the collection and then read its size. (Not the best solution)
db.collection("CollectionName").get().subscribe(doc=>{
console.log(doc.size)
})
By using the above code, your document reads will equal the number of documents in the collection, which is why you should avoid this solution.
2: Create a separate document within your collection that stores the count of documents in the collection. (Best solution)
db.collection("CollectionName").doc("counts").get().subscribe(doc=>{
console.log(doc.data().count)
})
Above we created a document named counts to store all the count information. You can update the count document in the following way:
Create a firestore triggers on the document counts
Increment the count property of counts document when a new document is created.
Decrement the count property of counts document when a document is deleted.
With respect to price (1 document read) and fast data retrieval, the above solution is good.
A workaround is to:
write a counter in a Firebase doc, which you increment within a transaction every time you create a new entry
You store the count in a field of your new entry (i.e: position: 4).
Then you create an index on that field (position DESC).
You can do a skip+limit with a query: .Where("position", "<", x).OrderBy("position", DESC) (see the sketch below)
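A small sketch of reading the count back from that position field in the Node SDK (collection and field names are assumed for illustration): the document with the highest position tells you how many entries exist.
const snap = await db.collection('entries')
  .orderBy('position', 'desc')
  .limit(1)
  .get();
// Highest position == number of entries written so far (assuming no deletions).
const count = snap.empty ? 0 : snap.docs[0].get('position');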
Hope this helps!
I have tried a lot of different approaches, and finally I improved one of the methods.
First, you need to create a separate collection and save all events there.
Second, you need to create a new lambda triggered on a schedule. This lambda counts the events in the events collection and clears the event documents.
Code details are in the article:
https://medium.com/@ihor.malaniuk/how-to-count-documents-in-google-cloud-firestore-b0e65863aeca
One fast and money-saving trick:
Make a doc and store a 'count' variable in Firestore; when a user adds a new doc to the collection, increase that variable, and when a user deletes a doc, decrease it. e.g.
updateDoc(doc(db, "Count_collection", "Count_Doc"), {count: increment(1)});
note: use (-1) for decreasing, (1) for increasing count
How it saves money and time:
You (and Firebase) don't need to loop through the collection, nor does the browser need to load the whole collection to count the number of docs.
All the counts are saved in a doc as a single variable named "count" (or whatever), so less than 1 KB of data is used, and it takes only 1 read in Firestore.
Solution using pagination with offset & limit:
public int collectionCount(String collection) {
Integer page = 0;
List<QueryDocumentSnapshot> snaps = new ArrayList<>();
findDocsByPage(collection, page, snaps);
return snaps.size();
}
public void findDocsByPage(String collection, Integer page,
List<QueryDocumentSnapshot> snaps) {
try {
Integer limit = 26000;
FieldPath[] selectedFields = new FieldPath[] { FieldPath.of("id") };
List<QueryDocumentSnapshot> snapshotPage;
snapshotPage = fireStore()
.collection(collection)
.select(selectedFields)
.offset(page * limit)
.limit(limit)
.get().get().getDocuments();
if (snapshotPage.size() > 0) {
snaps.addAll(snapshotPage);
page++;
findDocsByPage(collection, page, snaps);
}
} catch (InterruptedException | ExecutionException e) {
e.printStackTrace();
}
}
findDocsPage it's a recursive method to find all pages of collection
selectedFields for otimize query and get only id field instead full body of document
limit max size of each query page
page define inicial page for pagination
From the tests I did it worked well for collections with up to approximately 120k records!
Firestore is introducing a new Query.count() that fetches the count of a query without fetching the docs.
This would allow you to simply query all collection items and get the count of that query.
Ref:
Firebase 10 iOS SDK
JS SDK PR: https://github.com/firebase/firebase-js-sdk/pull/6608
There's a new built-in function since version 9.11.0 called getCountFromServer(), which fetches the number of documents in the result set without actually downloading the documents.
https://firebase.google.com/docs/reference/js/firestore_#getcountfromserver
Took me a while to get this working based on some of the answers above, so I thought I'd share it for others to use. I hope it's useful.
'use strict';
const functions = require('firebase-functions');
const admin = require('firebase-admin');
admin.initializeApp();
const db = admin.firestore();
exports.countDocumentsChange = functions.firestore.document('library/{categoryId}/documents/{documentId}').onWrite((change, context) => {
  const categoryId = context.params.categoryId;
  const categoryRef = db.collection('library').doc(categoryId)
  let FieldValue = require('firebase-admin').firestore.FieldValue;
  if (!change.before.exists) {
    // new document created : add one to count
    categoryRef.update({numberOfDocs: FieldValue.increment(1)});
    console.log("%s numberOfDocs incremented by 1", categoryId);
  } else if (change.before.exists && change.after.exists) {
    // updating existing document : Do nothing
  } else if (!change.after.exists) {
    // deleting document : subtract one from count
    categoryRef.update({numberOfDocs: FieldValue.increment(-1)});
    console.log("%s numberOfDocs decremented by 1", categoryId);
  }
  return 0;
});
This uses counting to create a numeric unique ID. In my use case, I never decrement, even when the document the ID was needed for is deleted.
Upon a collection creation that needs unique numeric value
Designate a collection appData with one document, set with .doc id only
Set uniqueNumericIDAmount to 0 in the firebase firestore console
Use doc.data().uniqueNumericIDAmount + 1 as the unique numeric id
Update appData collection uniqueNumericIDAmount with firebase.firestore.FieldValue.increment(1)
firebase
.firestore()
.collection("appData")
.doc("only")
.get()
.then(doc => {
var foo = doc.data();
foo.id = doc.id;
// your collection that needs a unique ID
firebase
.firestore()
.collection("uniqueNumericIDs")
.doc(user.uid)// user id in my case
.set({// I use this in login, so this document doesn't
// exist yet, otherwise use update instead of set
phone: this.state.phone,// whatever else you need
uniqueNumericID: foo.uniqueNumericIDAmount + 1
})
.then(() => {
// upon success of new ID, increment uniqueNumericIDAmount
firebase
.firestore()
.collection("appData")
.doc("only")
.update({
uniqueNumericIDAmount: firebase.firestore.FieldValue.increment(
1
)
})
.catch(err => {
console.log(err);
});
})
.catch(err => {
console.log(err);
});
});
var variable = 0
variable = variable + querySnapshot.count
Then, if you want to use it as a String:
let stringVariable = String(variable)
Along with my npm package adv-firestore-functions above, you can also just use firestore rules to force a good counter:
Firestore Rules
function counter() {
let docPath = /databases/$(database)/documents/_counters/$(request.path[3]);
let afterCount = getAfter(docPath).data.count;
let beforeCount = get(docPath).data.count;
let addCount = afterCount == beforeCount + 1;
let subCount = afterCount == beforeCount - 1;
let newId = getAfter(docPath).data.docId == request.path[4];
let deleteDoc = request.method == 'delete';
let createDoc = request.method == 'create';
return (newId && subCount && deleteDoc) || (newId && addCount && createDoc);
}
function counterDoc() {
let doc = request.path[4];
let docId = request.resource.data.docId;
let afterCount = request.resource.data.count;
let beforeCount = resource.data.count;
let docPath = /databases/$(database)/documents/$(doc)/$(docId);
let createIdDoc = existsAfter(docPath) && !exists(docPath);
let deleteIdDoc = !existsAfter(docPath) && exists(docPath);
let addCount = afterCount == beforeCount + 1;
let subCount = afterCount == beforeCount - 1;
return (createIdDoc && addCount) || (deleteIdDoc && subCount);
}
and use them like so:
match /posts/{document} {
allow read;
allow update;
allow create: if counter();
allow delete: if counter();
}
match /_counters/{document} {
allow read;
allow write: if counterDoc();
}
Frontend
Replace your set and delete functions with these:
set
async setDocWithCounter(
ref: DocumentReference<DocumentData>,
data: {
[x: string]: any;
},
options: SetOptions): Promise<void> {
// counter collection
const counterCol = '_counters';
const col = ref.path.split('/').slice(0, -1).join('/');
const countRef = doc(this.afs, counterCol, col);
const countSnap = await getDoc(countRef);
const refSnap = await getDoc(ref);
// don't increase count if edit
if (refSnap.exists()) {
await setDoc(ref, data, options);
// increase count
} else {
const batch = writeBatch(this.afs);
batch.set(ref, data, options);
// if count exists
if (countSnap.exists()) {
batch.update(countRef, {
count: increment(1),
docId: ref.id
});
// create count
} else {
// will only run once, should not use
// for mature apps
const colRef = collection(this.afs, col);
const colSnap = await getDocs(colRef);
batch.set(countRef, {
count: colSnap.size + 1,
docId: ref.id
});
}
batch.commit();
}
}
delete
async delWithCounter(
ref: DocumentReference<DocumentData>
): Promise<void> {
// counter collection
const counterCol = '_counters';
const col = ref.path.split('/').slice(0, -1).join('/');
const countRef = doc(this.afs, counterCol, col);
const countSnap = await getDoc(countRef);
const batch = writeBatch(this.afs);
// if count exists
batch.delete(ref);
if (countSnap.exists()) {
batch.update(countRef, {
count: increment(-1),
docId: ref.id
});
}
/*
if ((countSnap.data() as any).count == 1) {
batch.delete(countRef);
}*/
batch.commit();
}
see here for more info...
This feature is now supported in Firestore, albeit in beta.
Here are the official Firebase docs
With the new version of Firebase, you can now run aggregated queries!
Simply write
.count().get();
after your query.
As it stands, Firebase only allows a server-side count, like this:
const collectionRef = db.collection('cities');
const snapshot = await collectionRef.count().get();
console.log(snapshot.data().count);
Please note this is for Node.js.
A new feature available in Firebase/Firestore provides a count of the documents in a collection.
See this thread for how to achieve it, with an example:
How To Count Number of Documents in a Collection in Firebase Firestore With a WHERE query in react.js
According to this documentation Cloud Firestore supports the count() aggregation query and is available in preview.
The Flutter/Dart code was missing (at the time of writing this) so I played around with it and the following function seems to work:
Future<int> getCount(String path) async {
var collection = _fireStore.collection(path);
var countQuery = collection.count();
var snapShot = await countQuery.get(source: AggregateSource.server);
return snapShot.count;
}
firebaseFirestore.collection("...").addSnapshotListener(new EventListener<QuerySnapshot>() {
    @Override
    public void onEvent(QuerySnapshot documentSnapshots, FirebaseFirestoreException e) {
        int Counter = documentSnapshots.size();
    }
});
So my solution for this problem is a bit non-technical, not super precise, but good enough for me.
These are my documents (screenshot not reproduced here). As I have a lot of them (100k+), the 'law of large numbers' applies: I can assume there is a more-or-less equal number of items whose id starts with 0, 1, 2, etc.
So what I do is scroll my list until I get to ids starting with 1, or with 01, depending on how far you have to scroll.
Now, having scrolled that far, I open the inspector, see how much I scrolled, and divide it by the height of a single element.
I had to scroll 82,000 px to get to items with ids starting with 1. The height of a single element is 32 px.
That means I have about 2,500 documents with ids starting with 0, so I multiply that by the number of possible starting characters. In Firebase an id character can be A-Z, a-z or 0-9, which gives 26 + 26 + 10 = 62.
That means roughly 2,500 * 62, so about 155,000 items in my collection.
Summarizing: what is wrong with you, Firebase?

Upsert multiple documents with MongoDB/Mongoose

Say I have a list of models:
const documents = [{}, {}, {}];
And I want to insert these into the DB, or update them all, but only if a condition is met:
Model.update({isSubscribed: {$ne: false}}, documents, {upsert:true},(err, result) => {
});
The above signature is surely wrong - what I want to do is insert/update the documents, where the condition is met.
There is this Bulk API:
https://docs.mongodb.com/manual/reference/method/Bulk.find.upsert/
but I can't tell if it will work when inserting multiple documents.
Imagine this scenario: we have a list of employees and a form of some sort to give them all a penalty, at once, not one by one :)
On the backend side, you would have your addBulk function, for example. Something like this:
Penalty controller
module.exports = {
  addBulk: (req, res) => {
    const body = req.body;
    for (const item of body) {
      Penalty.create(item).exec((err, response) => {
        if (err) {
          res.serverError(err);
          return;
        }
      });
    }
    res.ok('Penalties added successfully');
  }
};
Then you'll probably have an API on your frontend that directs to that route and specific function (endpoint):
penaltyApi
import axios from 'axios';
import {baseApiUrl} from '../config';
const penaltyApi = baseApiUrl + 'penalty'
class PenaltyApi {
static addBulk(penalties) {
return axios({
method: 'post',
url: penaltyApi + '/addBulk',
data: penalties
})
}
}
export default PenaltyApi;
...and now let's make a form and some helper functions. I'll be using React for demonstration, but it's all JS by the end of the day, right :)
// Lets first add penalties to our local state:
addPenalty = (event) => {
event.preventDefault();
let penalty = {
amount: this.state.penaltyForm.amount,
unit: this.state.penaltyForm.unit,
date: new Date(),
description: this.state.penaltyForm.description,
employee: this.state.penaltyForm.employee.value
};
this.setState(prevState => ({
penalties: [...prevState.penalties, penalty]
}));
}
Here we iterate over our penalties, reshape each item, and pass the result to our saveBulkEmployees() function:
save = () => {
  let penaltiesData = Object.assign([], this.state.penalties);
  penaltiesData.forEach(penal => {
    penal.employeeId = penal.employee.id;
    delete penal.employee;
  });
  this.saveBulkEmployees(penaltiesData);
}
...and finally, let's save all of them at once to our database using the Bulk API
saveBulkEmployees = (data) => {
PenaltyApi.addBulk(data).then(response => {
this.success();
console.log(response.config.data)
this.resetFormAndPenaltiesList()
}).catch(error => {
console.log('error while adding multiple penalties', error);
throw(error);
})
}
So, the short answer is YES, you can absolutely do that. The longer answer is above :) I hope this was helpful to you. If any questions, please let me know, I'll try to answer them as soon as I can.
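For the bulk route the question actually asks about, Mongoose's Model.bulkWrite can upsert several documents in one round trip. A sketch under the assumption that each document carries a unique key such as email (note that combining the $ne condition with upsert needs care, since an existing non-matching document will trigger an insert attempt):
// One bulkWrite call that upserts each document, guarded by the condition.
const ops = documents.map(doc => ({
  updateOne: {
    filter: { email: doc.email, isSubscribed: { $ne: false } },
    update: { $set: doc },
    upsert: true
  }
}));

Model.bulkWrite(ops)
  .then(result => console.log('upserted:', result.upsertedCount, 'modified:', result.modifiedCount))
  .catch(err => console.error(err));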

GraphQL & MongoDB cursors

I'm confused about what should be the relation between GraphQL's cursors and MongoDB's cursors.
I'm currently working on a mutation that creates an object (a mongo document) and adds it to an existing connection (a mongo collection). When adding the object, the mutation returns the added edge, which should look like:
{
node,
cursor
}
While node is the actual added document, I'm confused on what should be returned as the cursor.
This is my Mutation:
const CreatePollMutation = mutationWithClientMutationId({
name: 'CreatePoll',
inputFields: {
title: {
type: new GraphQLNonNull(GraphQLString),
},
multi: {
type: GraphQLBoolean,
},
options: {
type: new GraphQLNonNull(new GraphQLList(GraphQLString)),
},
author: {
type: new GraphQLNonNull(GraphQLID),
},
},
outputFields: {
pollEdge: {
type: pollEdgeType,
resolve: (poll => (
{
// cursorForObjectInConnection was used when I've tested using mock JSON data,
// this doesn't work now since db.getPolls() is async
cursor: cursorForObjectInConnection(db.getPolls(), poll),
node: poll,
}
)),
},
},
mutateAndGetPayload: ({ title, multi, options, author }) => {
const { id: authorId } = fromGlobalId(author);
return db.createPoll(title, options, authorId, multi); //promise
},
});
Thanks!
Probably a bit late for you, but maybe it will help someone else stumbling across this problem.
Just return a promise from your resolve method. Then you can create your cursor using your actual polls array. Like so:
resolve: (poll =>
  db.getPolls().then(polls => {
    return { cursor: cursorForObjectInConnection(polls, poll), node: poll }
  })
)
But be careful: your poll and the objects in polls now have different origins and are not strictly equal. Since cursorForObjectInConnection uses JavaScript's indexOf, your cursor would probably end up being null. To prevent this, find the object's index yourself and use offsetToCursor to construct your cursor, as discussed here.
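A sketch of that index-based approach, assuming each poll has an id field and using offsetToCursor from graphql-relay:
const { offsetToCursor } = require('graphql-relay');

// Inside outputFields of the mutation, in place of cursorForObjectInConnection:
pollEdge: {
  type: pollEdgeType,
  resolve: (poll) =>
    db.getPolls().then(polls => {
      // Look the poll up by id instead of relying on strict object equality.
      const offset = polls.findIndex(p => String(p.id) === String(poll.id));
      return {
        cursor: offsetToCursor(offset),
        node: poll,
      };
    }),
},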

Is there a way to perform a "dry run" of an update operation?

I am in the process of changing the schema for one of my MongoDB collections. (I had been storing dates as strings, and now my application stores them as ISODates; I need to go back and change all of the old records to use ISODates as well.) I think I know how to do this using an update, but since this operation will affect tens of thousands of records I'm hesitant to issue an operation that I'm not 100% sure will work. Is there any way to do a "dry run" of an update that will show me, for a small number of records, the original record and how it would be changed?
Edit: I ended up using the approach of adding a new field to each record, and then (after verifying that the data was right) renaming that field to match the original. It looked like this:
db.events.find({timestamp: {$type: 2}})
.forEach( function (e) {
e.newTimestamp = new ISODate(e.timestamp);
db.events.save(e);
} )
db.events.update({},
{$rename: {'newTimestamp': 'timestamp'}},
{multi: true})
By the way, that method for converting the string times to ISODates was what ended up working. (I got the idea from this SO answer.)
My advice would be to add the ISODate as a new field. Once you've confirmed that all looks good, you can then unset the string date.
Create a test environment with your database structure. Copy a handful of records to it. Problem solved. Not the solution you were looking for, I'm sure. But, I believe, this is the exact circumstances that a 'test environment' should be used for.
Select the IDs of particular records you would like to monitor and place them in the update: {_id: {$in: [<your monitored ids>]}}
Another option, depending on how much overhead it causes you:
You can consider writing a script that performs the find operation and adds printouts (or runs in the debugger) while the save operation is commented out. Once you've gained confidence, you can enable the save operation.
var changesLog = [];
var errorsLog = [];
events.find({timestamp: {$type: 2}}, function (err, events) {
if (err) {
debugger;
throw err;
} else {
for (var i = 0; i < events.length; i++) {
console.log('events ' + i + "/" + (events.length - 1));
var currentEvent = events[i];
var shouldUpdateCandidateData = false;
currentEvent.timestamp = new ISODate(currentEvent.timestamp);
var change = currentEvent._id;
changesLog.push(change);
// // ** Dry Run **
// currentEvent.save(function (err) {
// if (err) {
// debugger;
// errorsLog.push(currentEvent._id + ", " + currentEvent.timeStamp + ', ' + err);
// throw err;
// }
// });
}
console.log('Done');
console.log('Changes:');
console.log(changesLog);
console.log('Errors:');
console.log(errorsLog);
return;
}
});
db.collection.find({"_manager": { $exists: true, $ne: null }}).forEach(
function(doc){
doc['_managers']=[doc._manager]; // String --> List
delete doc['_manager']; // Remove "_manager" key-value pair
printjson(doc); // Debug by output the doc result
//db.teams.save(doc); // Save all the changes into doc data
}
)
In my case the collection contains _manager and I would like to change it to a _managers list. I have tested it locally and it works as expected.
In the several latest versions of MongoDB (at least starting with 4.2), you could do that using a transaction.
const { MongoClient } = require('mongodb')
async function main({ dryRun }) {
const client = new MongoClient('mongodb://127.0.0.1:27017', {
maxPoolSize: 1
})
const pool = await client.connect()
const db = pool.db('someDB')
const session = pool.startSession()
session.startTransaction()
try {
const filter = { id: 'some-id' }
const update = { $rename: { 'newTimestamp': 'timestamp' } }
// This is the important bit
const options = { session: session }
await db.collection('someCollection').updateMany(
filter,
update,
options // using session
)
const afterUpdate = await db.collection('someCollection')
.find(
filter,
options // using session
)
.toArray()
console.debug('updated documents', afterUpdate)
if (dryRun) {
// This will roll back any changes made within the session
await session.abortTransaction()
} else {
await session.commitTransaction()
}
} finally {
await session.endSession()
await pool.close()
}
}
const _ = main({ dryRun: true })