Firestore unique index or unique constraint? - google-cloud-firestore

Is it possible in Firestore to define an index with a unique constraint? If not, how is it possible to enforce uniqueness on a document field (without using document ID)?

Yes, this is possible using a combination of two collections, Firestore rules and batched writes.
https://cloud.google.com/firestore/docs/manage-data/transactions#batched-writes
The simple idea is: using a batched write, you write your document to your "data" collection and at the same time write to a separate "index" collection where you index the value of the field that you want to be unique.
Using Firestore rules, you can then ensure that the "data" collection can only have a document written to it if the document field's value also exists in the index collection and, vice versa, that the index collection can only be written to if the value in the index matches what's in the data collection.
Example
Let's say that we have a User collection and we want to ensure that the username field is unique.
Our User collection will simply contain the username:
/User/{id}
{
  username: String
}
Our Index collection will contain the username in the path and a value property that contains the id of the User that is indexed.
/Index/User/username/{username}
{
  value: User.id
}
To create our User we use a batch write to create both the User document and the Index document at the same time.
const firebaseApp = ... // construct your firebase app

const createUser = async (username) => {
  const database = firebaseApp.firestore()
  const batch = database.batch()
  const Collection = database.collection('User')
  const ref = Collection.doc()
  batch.set(ref, {
    username
  })
  const Index = database.collection('Index')
  const indexRef = Index.doc(`User/username/${username}`)
  batch.set(indexRef, {
    value: ref.id
  })
  await batch.commit()
}
To update our User's username we use a batch write to update the User document, delete the previous Index document and create a new Index document all at the same time.
const firebaseApp = ... // construct your firebase app

const updateUser = async (id, username) => {
  const database = firebaseApp.firestore()
  const batch = database.batch()
  const Collection = database.collection('User')
  const ref = Collection.doc(id)
  const refDoc = await ref.get()
  const prevData = refDoc.data()
  batch.update(ref, {
    username
  })
  const Index = database.collection('Index')
  const prevIndexRef = Index.doc(`User/username/${prevData.username}`)
  const indexRef = Index.doc(`User/username/${username}`)
  batch.delete(prevIndexRef)
  batch.set(indexRef, {
    value: ref.id
  })
  await batch.commit()
}
To delete a User we use a batch write to delete both the User document and the Index document at the same time.
const firebaseApp = ... // construct your firebase app

const deleteUser = async (id) => {
  const database = firebaseApp.firestore()
  const batch = database.batch()
  const Collection = database.collection('User')
  const ref = Collection.doc(id)
  const refDoc = await ref.get()
  const prevData = refDoc.data()
  batch.delete(ref)
  const Index = database.collection('Index')
  const indexRef = Index.doc(`User/username/${prevData.username}`)
  batch.delete(indexRef)
  await batch.commit()
}
We then set up our Firestore rules so that they only allow a User to be created if the username is not already indexed for a different User. A User's username can only be updated if an Index does not already exist for the new username, and a User can only be deleted if its Index is deleted as well. Create and update will fail with a "Missing or insufficient permissions" error if a User with the same username already exists.
rules_version = '2';
service cloud.firestore {
  match /databases/{database}/documents {
    // Index collection helper methods
    function getIndexAfter(path) {
      return getAfter(/databases/$(database)/documents/Index/$(path))
    }
    function getIndexBefore(path) {
      return get(/databases/$(database)/documents/Index/$(path))
    }
    function indexExistsAfter(path) {
      return existsAfter(/databases/$(database)/documents/Index/$(path))
    }
    function indexExistsBefore(path) {
      return exists(/databases/$(database)/documents/Index/$(path))
    }
    // User collection helper methods
    function getUserAfter(id) {
      return getAfter(/databases/$(database)/documents/User/$(id))
    }
    function getUserBefore(id) {
      return get(/databases/$(database)/documents/User/$(id))
    }
    function userExistsAfter(id) {
      return existsAfter(/databases/$(database)/documents/User/$(id))
    }
    match /User/{id} {
      allow read: if true;
      allow create: if
        getIndexAfter(/User/username/$(getUserAfter(id).data.username)).data.value == id;
      allow update: if
        getIndexAfter(/User/username/$(getUserAfter(id).data.username)).data.value == id &&
        !indexExistsBefore(/User/username/$(getUserAfter(id).data.username));
      allow delete: if
        !indexExistsAfter(/User/username/$(getUserBefore(id).data.username));
    }
    match /Index/User/username/{username} {
      allow read: if true;
      allow create: if
        getUserAfter(getIndexAfter(/User/username/$(username)).data.value).data.username == username;
      allow delete: if
        !userExistsAfter(getIndexBefore(/User/username/$(username)).data.value) ||
        getUserAfter(getIndexBefore(/User/username/$(username)).data.value).data.username != username;
    }
  }
}

[It's not a perfect solution, but it works]
I enforced uniqueness by using the unique value as the document key.
I wanted my data to have a unique date value, so I made the date the ID of the document.
Either way, I am still able to get all documents:
db.collection('sensors').doc(sensorId).collection("data").doc(date).set(dataObj).then(() => {
  response.send(dataObj);
});

What about doing a transaction that first checks whether there are documents with the same value in the unique field, and only creates the document if the result is empty?
As an example, creating a User with username as the unique field:
type User = {
  id?: string
  username: string
  firstName: string
  lastName: string
}

async function createUser(user: User) {
  try {
    const newDocRef = db.collection('Users').doc()
    await db.runTransaction(async t => {
      const checkRef = db.collection('Users')
        .where('username', '==', user.username)
      const doc = await t.get(checkRef)
      if (!doc.empty) {
        throw new FirebaseError('firestore/unique-restriction',
          `There is already a user with the username: '${user.username}' in the database.`
        )
      }
      await t.create(newDocRef, user)
    })
    console.log('User Created')
  } catch (err) {
    if (err instanceof FirebaseError) {
      console.log('Some error in firebase')
      //Do something
    } else {
      console.log('Another error')
      //Do whatever
    }
  }
}
Is this code OK, or am I missing something?

This is possible using a transaction, where a read must be made to find out whether another document already uses the unique value.
IMPORTANT: the transaction has to be done using a Firestore server library to ensure blocking on concurrent operations (https://firebase.google.com/docs/firestore/transaction-data-contention#transactions_and_data_contention).
I ran several simultaneous tests using Cloud Functions with simulated delays and it worked great. See an example:
const result = await admin.firestore().runTransaction(async (t) => {
  const personsRef = admin.firestore().collection("persons").where('email', '==', data.email)
  const query = await t.get(personsRef);
  if (query.docs.length > 0) {
    throw new functions.https.HttpsError('permission-denied', `email ${data.email} already exists`);
  }
  const newPersonRef = admin.firestore().collection("persons").doc();
  t.set(newPersonRef, {name: data.name, email: data.email});
  return "update success";
});
In this example it is guaranteed that two people cannot use the same email on insertion (the same check should be done for email changes; see the sketch below).
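For the email-change case, a rough sketch along the same lines (updatePersonEmail and personId are illustrative names, not from the original answer):
const updatePersonEmail = async (personId, newEmail) => {
  await admin.firestore().runTransaction(async (t) => {
    // Re-check uniqueness inside the transaction before changing the email.
    const personsRef = admin.firestore().collection("persons").where('email', '==', newEmail);
    const query = await t.get(personsRef);
    if (query.docs.some((doc) => doc.id !== personId)) {
      throw new functions.https.HttpsError('permission-denied', `email ${newEmail} already exists`);
    }
    t.update(admin.firestore().collection("persons").doc(personId), { email: newEmail });
  });
};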

Based on the documentation from this section https://cloud.google.com/firestore/docs/manage-data/add-data#set_a_document
you can simply supply a custom identifier when adding a document to a collection, as shown below:
const data = {
name: 'Los Angeles',
state: 'CA',
country: 'USA'
};
// Add a new document in collection "cities" with ID 'LA'
const res = await db.collection('cities').doc('LA').set(data);
Using https://cloud.google.com/firestore/docs/manage-data/add-data#node.js_4 as a reference: when you use set() on a document reference you can specify the ID for that document, and when you need an auto-generated ID you simply use the add() method on the collection.
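For comparison, a small sketch of both variants (reusing the cities collection and data object from the snippet above):
// Explicit ID: the document is keyed by 'LA', so writing again with the same ID overwrites it.
await db.collection('cities').doc('LA').set(data);

// Auto-generated ID: add() creates a new document with a random ID every time.
const newCityRef = await db.collection('cities').add(data);
console.log('Generated document ID:', newCityRef.id);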

Related

How to populate an array of ObjectIds in mongoose?

I have a User model with a schema that I would like to use to validate an array of friends by their IDs. The portion of the schema that is supposed to do this is:
friends: {
  type: [mongoose.SchemaTypes.ObjectId],
},
Then, when I try to add a friend with an id value and populate it inside the API endpoint, it adds the id to the database, but does not populate it. Here is the code:
if (method === "POST") {
const userId = getIdFromCookie(req);
try {
const newFriend = {
friends: req.body.friend
};
const updatedUser = await User.findByIdAndUpdate(userId, newFriend, {new: true})
const popUser = await User.findById(userId).populate("friends")
res.status(200).json({success: true, data: updatedUser});
} catch (error) {
res.status(400).json({success: false});
}
} else {
res.status(400).json({error: "This endpoint only supports method 'POST'"})
}
I want to know how I can add a friend's id to the database, whilst simultaneously populating it in the same endpoint.
The user schema is missing the ref field.
Example from the docs:
stories: [{ type: Schema.Types.ObjectId, ref: 'Story' }]
Without the ref, Mongoose doesn't know where to look up the ObjectId; a minimal sketch of the fix follows.
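Assuming the friends are other User documents, the schema field and the endpoint could look roughly like this (the $push update and the chained populate() are one way to add and populate in the same request; they are not from the original post):
// Schema: tell Mongoose which model the ObjectIds refer to.
friends: [{ type: mongoose.SchemaTypes.ObjectId, ref: 'User' }],

// Endpoint: push the new friend and populate the result in one query chain.
const updatedUser = await User.findByIdAndUpdate(
  userId,
  { $push: { friends: req.body.friend } },
  { new: true }
).populate('friends');
res.status(200).json({ success: true, data: updatedUser });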

Making a welcome message an embed on discord.js

I have connected MongoDB to my discord.js code and made a setwelcome command backed by per-server data, so that each server can customize its own welcome message. Everything works great; I just want to know if there is any way I can make the message appear as an embed. Here's the code:
//importing all the needed files and languages
const mongo = require('./mongo')
const command = require('./command')
const welcomeSchema = require('./schemas/welcome-schema')
const mongoose = require('mongoose')
const Discord = require('discord.js')
mongoose.set('useFindAndModify', false);
//my code is inside this export
module.exports = (client) => {
//this next line is for later
const cache = {}
command(client, 'setwelcome', async (message) => {
const { member, channel, content, guild } = message
//checking to see that only admins can do this
if (!member.hasPermissions === 'ADMINISTRATOR') {
channel.send('You do not have the permission to run this command')
return
}
//simplifying commands
let text = content
//this is to store just the command and not the prefix in mongo compass
const split = text.split(' ')
if (split.length < 2) {
channel.send('Please provide a welcome message!')
return
}
split.shift()
text = split.join(' ')
//this is to not fetch from the database after code ran once
cache[guild.id] = [channel.id, text]
//this is to store the code inside mongo compass
await mongo().then(async (mongoose) => {
try {
await welcomeSchema.findOneAndUpdate({
_id: guild.id
}, {
_id: guild.id,
channelId: channel.id,
text,
}, {
upsert: true
})
} finally {
mongoose.connection.close()
}
})
})
  //this is to fetch from the database
  const onJoin = async (member) => {
    const { guild } = member
    let data = cache[guild.id]
    if (!data) {
      console.log('FETCHING FROM DATABASE')
      await mongo().then(async (mongoose) => {
        try {
          const result = await welcomeSchema.findOne({ _id: guild.id })
          cache[guild.id] = data = [result.channelId, result.text]
        } finally {
          mongoose.connection.close()
        }
      })
    }
    //this is to simplify into variables
    const channelId = data[0]
    const text = data[1]
    /*this is where the message sends on discord. the second of these 2 lines is what I want embedded,
    which is basically the welcome message itself*/
    const channel = guild.channels.cache.get(channelId)
    channel.send(text.replace(/<@>/g, `<@${member.id}>`))
  }
  //this is to test the command
  command(client, 'simjoin', message => {
    onJoin(message.member)
  })
  //this is so the command works when someone joins
  client.on('guildMemberAdd', member => {
    onJoin(member)
  })
}
I know how to make an embed in general, but I'm just confused at the moment about what to put in .setDescription() for the embed.
Please advise.
If you just want to have the message be sent as an embed, create a MessageEmbed and use setDescription() with the description as the only argument. Then send it with channel.send(embed).
const embed = new Discord.MessageEmbed();
embed.setDescription(text.replace(/<@>/g, `<@${member.id}>`));
channel.send(embed);
By the way, if you are confused about how to use a specific method you can always search for the method name on the official discord.js documentation so you don’t have to wait for an answer here. Good luck creating your bot!

Firestore function saving entire document data to Algolia

I have created a function which saves the entire document to Algolia when the onCreate trigger fires. I want to save only three fields, not the entire document.
Here is my current function code:
exports.addToIndex = functions.firestore.document('questions/{questionsId}')
  .onCreate((snapshot: { data: () => any; id: any; desc: any; }) => {
    const data = snapshot.data();
    // const descdata = snapshot.data().desc;
    const objectID = snapshot.id;
    console.log(objectID);
    // console.log(descdata);
    console.log(data);
    // return index.saveObject({ ...descdata, objectID });
    return index.saveObject({ ...data, objectID });
  });
I only want to save these three fields:
ObjectID
slatex
alatex
At present, it is saving all 18 fields of the document. How can I do this?
The data object carries the entire Firestore document.
To index only a few fields, I usually build the index record separately, like below:
exports.addToIndex = functions.firestore.document('questions/{questionsId}')
  .onCreate((snapshot) => {
    const data = snapshot.data();
    const objectID = snapshot.id;
    let _index_data = {
      'objectID': objectID,
      'slatex': data.slatex,
      'alatex': data.alatex
    }
    return index.saveObject(_index_data);
  });

The correct way to create collection during mongoose transaction

How can I auto-create a collection during a mongoose transaction if the collection has not been created yet?
I'm aware of the MongoDB limitation that prevents creating (or deleting) collections during an open transaction session.
Also, I was able to find 3 possible solutions on how to fix that:
1. autoCreate option
2. Model.init() method
3. Model.createCollection() method
Which one should I use, without losing indexes etc.?
app.models.ts
import { model, Schema } from 'mongoose';
const UserSchema = new Schema<UserDocument>({
  name: {
    type: Schema.Types.String,
    required: true,
  }
}); // { autoCreate: true } <-- ???
export const UserModel = model<UserDocument>('User', UserSchema);
app.ts
import { startSession } from 'mongoose';
import { UserModel } from './app.models.ts';
async function createUser() {
  // await UserModel.createCollection(); ??
  // or
  // await UserModel.init(); ??
  const session = await startSession();
  session.startTransaction();
  try {
    const [user] = await UserModel.create([{ name: 'John' }], { session });
    await session.commitTransaction();
    return user;
  } catch (error) {
    await session.abortTransaction();
  } finally {
    session.endSession()
  }
}

createUser();
If a collection does not exist, MongoDB creates the collection when you first store data for it. You can also explicitly create a collection with various options, such as setting the maximum size or the document validation rules.
Anyway, mongoose takes care of indexes, collections, etc.
You just need to define the collection name: https://mongoosejs.com/docs/guide.html#collection
const UserSchema = new Schema<UserDocument>({
  name: {
    type: Schema.Types.String,
    required: true,
  }
}, { collection: 'users' });
There is an answer about transactions and collection creation here: https://github.com/Automattic/mongoose/issues/6699
Actually, I use the https://www.npmjs.com/package/db-migrate package to create collections and indexes before starting the app (a sketch of the createCollection() route follows).
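A minimal sketch of the Model.createCollection() route, based on the code in the question (error handling trimmed; whether you also call init() depends on whether you need the schema's indexes built up front):
import { startSession } from 'mongoose';
import { UserModel } from './app.models';

async function createUser() {
  // Make sure the collection (and, via init, its indexes) exists before the transaction starts,
  // since collections cannot be created inside an open transaction.
  await UserModel.createCollection();
  await UserModel.init();

  const session = await startSession();
  session.startTransaction();
  try {
    const [user] = await UserModel.create([{ name: 'John' }], { session });
    await session.commitTransaction();
    return user;
  } catch (error) {
    await session.abortTransaction();
    throw error;
  } finally {
    session.endSession();
  }
}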

apollostack/graphql-server - how to get the fields requested in a query from resolver

I am trying to figure out a clean way to work with queries and MongoDB projections so I don't have to retrieve excessive information from the database.
So assuming I have:
// the query
type Query {
  getUserByEmail(email: String!): User
}
And I have a User with an email and a username, to keep things simple. If I send a query and I only want to retrieve the email, I can do the following:
query { getUserByEmail(email: "test@test.com") { email } }
But in the resolver, my DB query still retrieves both username and email, but only one of those is passed back by apollo server as the query result.
I only want the DB to retrieve what the query asks for:
// the resolver
getUserByEmail(root, args, context, info) {
  // check what fields the query requested
  // create a projection to only request those fields
  return db.collection('users').findOne({ email: args.email }, { /* projection */ });
}
Of course the problem is, getting information on what the client is requesting isn't so straightforward.
Assuming I pass in request as context, I considered using context.payload (hapi.js), which has the query string, and searching it through various .split()s, but that feels kind of dirty. As far as I can tell, info.fieldASTs[0].selectionSet.selections has the list of fields, and I could check for a field's existence in there. I'm not sure how reliable this is, especially when I start using more complex queries.
Is there a simpler way?
In case you don't use MongoDB: a projection is an additional argument you pass in, telling it explicitly what to retrieve:
// telling MongoDB to not retrieve _id
db.collection('users').findOne({ email: 'test@test.com' }, { _id: 0 })
As always, thanks to the amazing community.
2020-Jan answer
The current answer to getting the fields requested in a GraphQL query, is to use the graphql-parse-resolve-info library for parsing the info parameter.
The library is "a pretty complete solution and is actually used under the hood by postgraphile", and is recommended going forward by the author of the other top library for parsing the info field, graphql-fields.
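A rough sketch of turning graphql-parse-resolve-info's output into a MongoDB projection for the resolver from the question (the db handle is a placeholder; parse and simplify are the library's exported aliases, as also used in the TypeScript helpers further down):
const { parse, simplify } = require('graphql-parse-resolve-info');

async function getUserByEmail(root, args, context, info) {
  // Parse the resolve info and simplify it against the return type to get the requested fields.
  const parsedInfo = parse(info);
  const { fields } = simplify(parsedInfo, info.returnType);

  // Build an inclusion projection, e.g. { email: 1 } when only email was requested.
  const projection = {};
  for (const field of Object.values(fields)) {
    projection[field.name] = 1;
  }

  return db.collection('users').findOne({ email: args.email }, { projection });
}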
Use graphql-fields
Apollo server example
const graphqlFields = require('graphql-fields');
const { makeExecutableSchema } = require('graphql-tools');

const rootSchema = [`
  type Person {
    id: String!
    name: String!
    email: String!
    picture: String!
    type: Int!
    status: Int!
    createdAt: Float
    updatedAt: Float
  }
  schema {
    query: Query
    mutation: Mutation
  }
`];

const rootResolvers = {
  Query: {
    users(root, args, context, info) {
      const topLevelFields = Object.keys(graphqlFields(info));
      return fetch(`/api/user?fields=${topLevelFields.join(',')}`);
    }
  }
};

const schema = [...rootSchema];
const resolvers = Object.assign({}, rootResolvers);

// Create schema
const executableSchema = makeExecutableSchema({
  typeDefs: schema,
  resolvers,
});
Sure you can. This is actually the same functionality that is implemented in the join-monster package for SQL-based DBs. There's a talk by its creator: https://www.youtube.com/watch?v=Y7AdMIuXOgs
Take a look at their info-analysing code to get you started: https://github.com/stems/join-monster/blob/master/src/queryASTToSqlAST.js#L6-L30
Would love to see a projection-monster package for us mongo users :)
UPDATE:
There is a package that creates a projection object from info on npm: https://www.npmjs.com/package/graphql-mongodb-projection
You can generate a MongoDB projection from the info argument. Here is sample code that you can follow:
/**
 * @description - Gets a MongoDB projection from the graphql query
 *
 * @return { object }
 * @param { object } info
 * @param { model } model - MongoDB model for referencing
 */
function getDBProjection(info, model) {
  const {
    schema: { obj }
  } = model;
  const keys = Object.keys(obj);
  const projection = {};
  const { selections } = info.fieldNodes[0].selectionSet;
  for (let i = 0; i < keys.length; i++) {
    const key = keys[i];
    const isSelected = selections.some(
      selection => selection.name.value === key
    );
    // MongoDB does not allow mixing inclusion and exclusion in one projection,
    // so only add the fields that were actually selected.
    if (isSelected) projection[key] = 1;
  }
  return projection;
}
module.exports = getDBProjection;
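Hypothetical usage in a resolver (the User Mongoose model, the module path, and the resolver name are only illustrative):
const getDBProjection = require('./getDBProjection');

const resolvers = {
  Query: {
    getUserByEmail(root, args, context, info) {
      // Build the projection from the requested fields and hand it to Mongoose.
      const projection = getDBProjection(info, User);
      return User.findOne({ email: args.email }, projection);
    }
  }
};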
With a few helper functions you can use it like this (typescript version):
import { parceGqlInfo, query } from "#backend";
import { GraphQLResolveInfo } from "graphql";
export const user = async (parent: unknown, args: unknown, ctx: unknown, info: GraphQLResolveInfo): Promise<User | null> => {
  const { dbQueryStr } = parceGqlInfo(info, userFields, "id");
  const [user] = await query(`SELECT ${dbQueryStr} FROM users WHERE id=$1;`, [1]);
  return user;
};
Helper functions. A few points:
gql_uid is used as the ID! string type built from the primary key, so the DB types don't have to change
the required option is used for dataloaders (in case a field was not requested by the user)
allowedFields is used to filter out additional fields from info, like '__typename'
queryPrefix is used if you need to prefix the selected fields, like select u.id from users u
const userFields = [
  "gql_uid",
  "id",
  "email"
]

// merge arrays and delete duplicates
export const mergeDedupe = <T>(arr: any[][]): T => {
  // @ts-ignore
  return ([...new Set([].concat(...arr))] as unknown) as T;
};
import { parse, simplify, ResolveTree } from "graphql-parse-resolve-info";
import { GraphQLResolveInfo } from "graphql";
export const getQueryFieldsFromInfo = <Required = string>(info: GraphQLResolveInfo, options: { required?: Required[] } = {}): string[] => {
  const { fields } = simplify(parse(info) as ResolveTree, info.returnType) as { fields: { [key: string]: { name: string } } };
  let astFields = Object.entries(fields).map(([, v]) => v.name);
  if (options.required) {
    astFields = mergeDedupe([astFields, options.required]);
  }
  return astFields;
};
export const onlyAllowedFields = <T extends string | number>(raw: T[] | readonly T[], allowed: T[] | readonly T[]): T[] => {
  return allowed.filter((f) => raw.includes(f));
};
export const parceGqlInfo = (
  info: GraphQLResolveInfo,
  allowedFields: string[] | readonly string[],
  gqlUidDbAlliasField: string,
  options: { required?: string[]; queryPrefix?: string } = {}
): { pureDbFields: string[]; gqlUidRequested: boolean; dbQueryStr: string } => {
  const fieldsWithGqlUid = onlyAllowedFields(getQueryFieldsFromInfo(info, options), allowedFields);
  return {
    pureDbFields: fieldsWithGqlUid.filter((i) => i !== "gql_uid"),
    gqlUidRequested: fieldsWithGqlUid.includes("gql_uid"),
    dbQueryStr: fieldsWithGqlUid
      .map((f) => {
        const dbQueryStrField = f === "gql_uid" ? `${gqlUidDbAlliasField}::Text AS gql_uid` : f;
        return options.queryPrefix ? `${options.queryPrefix}.${dbQueryStrField}` : dbQueryStrField;
      })
      .join(),
  };
};