In Firestore, how to ignore unwanted inputs for creation?

As we all know, we can create a document like this:
const docRef = db.collection('forms').add({
  name: 'Some name',
  address: 'Some address',
  userId: 'some user ID'
});
However, some (authenticated) users could POST extra fields to your store:
const docRef = db.collection('forms').add({
  name: 'Some name',
  address: 'Some address',
  userId: 'some user ID',
  unwantedData1: 'xxx',
  unwantedData2: 'xxx',
  unwantedData3: 'xxx',
  unwantedData4: 'xxx',
  unwantedData5: 'xxx',
  unwantedData6: 'xxx',
  unwantedData7: 'xxx',
  ... // <- too many rubbish params
});
Although we have security rules, an attacker can still register a dummy account and send a payload like this. How can we prevent this in Firestore?
Many thanks.

You can do this with Firestore Security Rules using hasOnly.
You would do something like this:
allow create: if request.resource.data.keys().hasOnly([ <your whitelisted properties> ])
Note that for incoming writes you validate request.resource.data; resource.data refers to the document as it already exists, which is empty on creation.
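For example, with this question's fields, a create rule could look like this (a sketch; adjust the match path to your schema):
match /forms/{formId} {
  allow create: if request.resource.data.keys().hasOnly(['name', 'address', 'userId']);
}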
If for some reason you can't use the rules in this way, or you need to do more advanced checks, you could create a Cloud Function with a Firestore Trigger that validates the document and deletes it if something is wrong.
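If you go that route, here is a minimal sketch of such a trigger, assuming the firebase-functions SDK and this question's forms collection and field names:
const functions = require('firebase-functions');

// assumption: the whitelist from this question's example schema
const ALLOWED_KEYS = ['name', 'address', 'userId'];

exports.validateForm = functions.firestore
  .document('forms/{formId}')
  .onCreate((snap) => {
    const keys = Object.keys(snap.data());
    // delete the document if it carries any non-whitelisted field
    if (keys.some((key) => !ALLOWED_KEYS.includes(key))) {
      return snap.ref.delete();
    }
    return null;
  });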

Related

How can I retrieve an id from MongoDB create operation during a transaction?

I am trying to create an audit trail using Apollo Server and Mongoose. When a user initially registers, I create a document in the users collection and a document in the history collection for each piece of data they provided (username, password, email, etc.). For each history collection document, I include the id for the user document to create a relationship. Works perfectly.
However, when I add a transaction in (see below), the userId for the user document comes back as undefined, so I cannot add it to the history entry documents. I am assuming that the id for a document does not get created until the entire transaction has been completed?
Any ideas?
Mutation: {
  register: async (_, { data }) => {
    // Start a mongo session & transaction
    const session = await mongoose.startSession()
    session.startTransaction()
    try {
      // Hash password and create user
      const hashedPassword = await bcrypt.hash(data.password, 12)
      const user = await User.create(
        [{ ...data, password: hashedPassword }],
        { session }
      )
      // Add history entries
      HistoryEntry.create([
        {
          user: user.id,
          action: 'registered'
        },
        {
          user: user.id,
          action: 'set',
          object: 'profile',
          instance: user.id,
          property: 'firstName',
          value: firstName
        },
        {
          user: user.id,
          action: 'set',
          object: 'profile',
          instance: user.id,
          property: 'lastName',
          value: lastName
        },
        {
          user: user.id,
          action: 'set',
          object: 'profile',
          instance: user.id,
          property: 'password'
        }
      ])
      if (loginType === 'email') {
        HistoryEntry.create({
          user: user.id,
          action: 'set',
          object: 'profile',
          instance: user.id,
          property: 'email',
          value: login
        })
      }
      if (loginType === 'mobile') {
        HistoryEntry.create({
          user: user.id,
          action: 'set',
          object: 'profile',
          instance: user.id,
          property: 'mobile',
          value: login
        })
      }
      // commit the changes if everything was successful
      await session.commitTransaction()
      return {
        ok: true,
        user
      }
    } catch (err) {
      // if anything fails above, rollback the changes in the transaction
      await session.abortTransaction()
      return formatErrors(err)
    } finally {
      // end the session
      session.endSession()
    }
  }
}
If you think about it, how can you add a HistoryEntry if you haven't added the User yet? It's not a 'history' as you are currently doing it. I believe you have two options here. The first: set _id on User manually (new Schema({ _id: { type: Schema.ObjectId, auto: true }})), generate it within the transaction (var userId = ObjectId();), and use it for both the User and the HistoryEntry documents.
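A minimal sketch of that first option, reusing the question's variables:
// pre-generate the id so it can be shared by the User and HistoryEntry documents
const userId = new mongoose.Types.ObjectId();
const user = await User.create(
  [{ _id: userId, ...data, password: hashedPassword }],
  { session }
);
await HistoryEntry.create(
  [{ user: userId, action: 'registered' }],
  { session }
);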
The second option, which I believe is more semantically correct in this context, is to attach a post-save hook:
schema.post('save', function(doc) {
  console.log('%s has been saved', doc._id);
});
So, whenever a User is created, a post-save hook is fired to update History.
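A sketch of what that hook could do here, assuming the HistoryEntry model is in scope:
// record a history entry whenever a User document is saved
schema.post('save', async function (doc) {
  await HistoryEntry.create({ user: doc._id, action: 'registered' });
});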
I came across the same issue recently; hope you have figured it out already, but let me add this for future seekers.
The following create call returns an array of created documents:
const user = await User.create(
  [{ ...data, password: hashedPassword }],
  { session }
);
Therefore, access the user id as user[0]._id.
Pass the session to HistoryEntry.create() as well:
HistoryEntry.create([{...},{...}], {session})
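Putting both fixes together, a minimal sketch of the relevant lines:
const user = await User.create(
  [{ ...data, password: hashedPassword }],
  { session }
);
// create() was given an array, so it resolves to an array
const userId = user[0]._id;
await HistoryEntry.create(
  [{ user: userId, action: 'registered' }],
  { session }
);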
Note: In this use case, I personally prefer @marek's second option of using a post-save hook.

Create user with avatar

I want to add an avatar to the user registration, but I don't know how. Please can someone share with me a full example (form, JS front end, and JS backend)? I'm using SailsJS 1.0 (the stable version) with VueJS.
Thanks in advance.
I figured it out. Watch these platzi tutorials:
https://courses.platzi.com/classes/1273-sails-js/10757-uploading-backend-file/
https://courses.platzi.com/classes/1273-sails-js/10758-uploading-frontend-files/
https://courses.platzi.com/classes/1273-sails-js/10759-downloading-files/
Here is what the videos tell you to do:
npm i sails-hook-uploads.
In api/controllers/entrance/signup.js
Above the inputs key, add a new key/value of files: ['avatar'].
In the inputs add:
avatar: {
  type: 'ref',
  required: true
}
In the body of fn, find var newUserRecord and above it add the following (even if avatar is not required, make sure to include this line, otherwise you will get a "timeout of unconsumed file stream" error):
const avatarInfo = await sails.uploadOne(inputs.avatar);
Then, in the first argument object of var newUserRecord = await User.create(_.extend({, add:
avatarFd: avatarInfo.fd,
avatarMime: avatarInfo.type
In api/models/User.js, add these attributes to your User model:
avatarFd: {
  type: 'string',
  required: false,
  description: 'will either have "text" or "avatarFd"'
},
avatarMime: {
  type: 'string',
  required: false,
  description: 'required if "avatarFd" provided'
},
Then create a download endpoint; here is how the action would look for it:
const user = await User.findOne(id);
this.res.type(user.avatarMime);
const avatarStream = await sails.startDownload(user.avatarFd);
return exits.success(avatarStream);
Add to the routes a route for this download avatar endpoint.
Then you can display the avatar by pointing an <img src=""> at this download endpoint.
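For example, with a hypothetical route and action name (both are assumptions, not from the videos):
// config/routes.js
'GET /api/v1/users/:id/avatar': { action: 'users/download-avatar' },
and in the markup:
<img src="/api/v1/users/42/avatar">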
------APPENDIX-----
----signup.js-----
module.exports = {
  friendlyName: 'Signup',
  description: 'Sign up for a new user account.',
  extendedDescription:
`This creates a new user record in the database, signs in the requesting user agent
by modifying its [session](https://sailsjs.com/documentation/concepts/sessions), and
(if emailing with Mailgun is enabled) sends an account verification email.
If a verification email is sent, the new user's account is put in an "unconfirmed" state
until they confirm they are using a legitimate email address (by clicking the link in
the account verification message.)`,
  files: ['avatar'],
  inputs: {
    emailAddress: {
      required: true,
      type: 'string',
      isEmail: true,
      description: 'The email address for the new account, e.g. m#example.com.',
      extendedDescription: 'Must be a valid email address.',
    },
    password: {
      required: true,
      type: 'string',
      maxLength: 200,
      example: 'passwordlol',
      description: 'The unencrypted password to use for the new account.'
    },
    fullName: {
      required: true,
      type: 'string',
      example: 'Frida Kahlo de Rivera',
      description: 'The user\'s full name.',
    },
    avatar: {
      type: 'ref',
      required: true
    }
  },
  exits: {
    success: {
      description: 'New user account was created successfully.'
    },
    invalid: {
      responseType: 'badRequest',
      description: 'The provided fullName, password and/or email address are invalid.',
      extendedDescription: 'If this request was sent from a graphical user interface, the request ' +
        'parameters should have been validated/coerced _before_ they were sent.'
    },
    emailAlreadyInUse: {
      statusCode: 409,
      description: 'The provided email address is already in use.',
    },
  },
  fn: async function (inputs) {
    var newEmailAddress = inputs.emailAddress.toLowerCase();
    // must do this even if inputs.avatar is not required
    const avatarInfo = await sails.uploadOne(inputs.avatar);
    // Build up data for the new user record and save it to the database.
    // (Also use `fetch` to retrieve the new ID so that we can use it below.)
    var newUserRecord = await User.create(_.extend({
      emailAddress: newEmailAddress,
      password: await sails.helpers.passwords.hashPassword(inputs.password),
      fullName: inputs.fullName,
      tosAcceptedByIp: this.req.ip,
      avatarFd: avatarInfo.fd,
      avatarMime: avatarInfo.type
    }, sails.config.custom.verifyEmailAddresses ? {
      emailProofToken: await sails.helpers.strings.random('url-friendly'),
      emailProofTokenExpiresAt: Date.now() + sails.config.custom.emailProofTokenTTL,
      emailStatus: 'unconfirmed'
    } : {}))
      .intercept('E_UNIQUE', 'emailAlreadyInUse')
      .intercept({ name: 'UsageError' }, 'invalid')
      .fetch();
    // If billing features are enabled, save a new customer entry in the Stripe API.
    // Then persist the Stripe customer id in the database.
    if (sails.config.custom.enableBillingFeatures) {
      let stripeCustomerId = await sails.helpers.stripe.saveBillingInfo.with({
        emailAddress: newEmailAddress
      }).timeout(5000).retry();
      await User.updateOne(newUserRecord.id)
        .set({
          stripeCustomerId
        });
    }
    // Store the user's new id in their session.
    this.req.session.userId = newUserRecord.id;
    if (sails.config.custom.verifyEmailAddresses) {
      // Send "confirm account" email
      await sails.helpers.sendTemplateEmail.with({
        to: newEmailAddress,
        subject: 'Please confirm your account',
        template: 'email-verify-account',
        templateData: {
          fullName: inputs.fullName,
          token: newUserRecord.emailProofToken
        }
      });
    } else {
      sails.log.info('Skipping new account email verification... (since `verifyEmailAddresses` is disabled)');
    }
    // add to public group
    const publicGroup = await Group.fetchPublicGroup();
    await Group.addMember(publicGroup.id, newUserRecord.id);
  }
};

how to connect postgresql with graphql [duplicate]

GraphQL has mutations, Postgres has INSERTs; GraphQL has queries, Postgres has SELECTs; etc. I haven't found an example showing how you could use both in a project, for example passing all the queries from the front end (React, Relay) in GraphQL, but actually storing the data in Postgres.
Does anyone know what Facebook is using as DB and how it's connected with GraphQL?
Is the only option of storing data in Postgres right now to build custom "adapters" that take the GraphQL query and convert it into SQL?
GraphQL is database agnostic, so you can use whatever you normally use to interact with the database, and use the query or mutation's resolve method to call a function you've defined that will get/add something to the database.
Without Relay
Here is an example of a mutation using the promise-based Knex SQL query builder, first without Relay to get a feel for the concept. I'm going to assume that you have created a userType in your GraphQL schema that has three fields (id, username, and created, all required), and that you have a getUser function already defined which queries the database and returns a user object. In the database I also have a password column, but since I don't want that queried I leave it out of my userType.
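For reference, a minimal sketch of that assumed getUser, using the same knex instance (my assumption, not spelled out in the original answer):
// fetch a single user row; deliberately omit the password column
const getUser = (id) => (
  knex('users')
    .select('id', 'username', 'created')
    .where({ id })
    .first()
);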
// db.js
// take a user object and use knex to add it to the database, then return the newly
// created user from the db.
const addUser = (user) => (
  knex('users')
    .returning('id') // returns [id]
    .insert({
      username: user.username,
      password: yourPasswordHashFunction(user.password),
      created: Math.floor(Date.now() / 1000), // Unix time in seconds
    })
    .then((id) => (getUser(id[0])))
    .catch((error) => (
      console.log(error)
    ))
);
// schema.js
// the resolve function receives the query inputs as args, then you can call
// your addUser function using them
const mutationType = new GraphQLObjectType({
  name: 'Mutation',
  description: 'Functions to add things to the database.',
  fields: () => ({
    addUser: {
      type: userType,
      args: {
        username: {
          type: new GraphQLNonNull(GraphQLString),
        },
        password: {
          type: new GraphQLNonNull(GraphQLString),
        },
      },
      resolve: (_, args) => (
        addUser({
          username: args.username,
          password: args.password,
        })
      ),
    },
  }),
});
Since Postgres creates the id for me and I calculate the created timestamp, I don't need them in my mutation query.
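With that schema served, a client could call the mutation with a query along these lines:
mutation {
  addUser(username: "jane", password: "hunter2") {
    id
    username
    created
  }
}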
The Relay Way
Using the helpers in graphql-relay and sticking pretty close to the Relay Starter Kit helped me, because it was a lot to take in all at once. Relay requires you to set up your schema in a specific way so that it can work properly, but the idea is the same: use your functions to fetch from or add to the database in the resolve methods.
One important caveat is that the Relay way expects that the object returned from getUser is an instance of a class User, so you'll have to modify getUser to accommodate that.
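One way to do that, sketched here as an assumption rather than the answer's own code, is a thin wrapper class:
class User {}

// wrap the plain row from the database in a User instance
const getUser = (id) => (
  knex('users')
    .select('id', 'username', 'created')
    .where({ id })
    .first()
    .then((row) => (row ? Object.assign(new User(), row) : null))
);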
The final example using Relay (fromGlobalId, globalIdField, mutationWithClientMutationId, and nodeDefinitions are all from graphql-relay):
/**
 * We get the node interface and field from the Relay library.
 *
 * The first method defines the way we resolve an ID to its object.
 * The second defines the way we resolve an object to its GraphQL type.
 *
 * All your types will implement this nodeInterface
 */
const { nodeInterface, nodeField } = nodeDefinitions(
  (globalId) => {
    const { type, id } = fromGlobalId(globalId);
    if (type === 'User') {
      return getUser(id);
    }
    return null;
  },
  (obj) => {
    if (obj instanceof User) {
      return userType;
    }
    return null;
  }
);
// a globalId is just a base64 encoding of the database id and the type
const userType = new GraphQLObjectType({
  name: 'User',
  description: 'A user.',
  fields: () => ({
    id: globalIdField('User'),
    username: {
      type: new GraphQLNonNull(GraphQLString),
      description: 'The username the user has selected.',
    },
    created: {
      type: GraphQLInt,
      description: 'The Unix timestamp in seconds of when the user was created.',
    },
  }),
  interfaces: [nodeInterface],
});
// The "payload" is the data that will be returned from the mutation
const userMutation = mutationWithClientMutationId({
  name: 'AddUser',
  inputFields: {
    username: {
      type: GraphQLString,
    },
    password: {
      type: new GraphQLNonNull(GraphQLString),
    },
  },
  outputFields: {
    user: {
      type: userType,
      resolve: (payload) => getUser(payload.userId),
    },
  },
  mutateAndGetPayload: ({ username, password }) =>
    addUser(
      { username, password }
    ).then((user) => ({ userId: user.id })), // passed to resolve in outputFields
});
const mutationType = new GraphQLObjectType({
  name: 'Mutation',
  description: 'Functions to add things to the database.',
  fields: () => ({
    addUser: userMutation,
  }),
});
const queryType = new GraphQLObjectType({
  name: 'Query',
  fields: () => ({
    node: nodeField,
    user: {
      type: userType,
      args: {
        id: {
          description: 'ID number of the user.',
          type: new GraphQLNonNull(GraphQLID),
        },
      },
      resolve: (root, args) => getUser(args.id),
    },
  }),
});
We address this problem in Join Monster, a library we recently open-sourced to automatically translate GraphQL queries to SQL based on your schema definitions.
This GraphQL Starter Kit can be used for experimenting with GraphQL.js and PostgreSQL:
https://github.com/kriasoft/graphql-starter-kit - Node.js, GraphQL.js, PostgreSQL, Babel, Flow
(disclaimer: I'm the author)
Have a look at graphql-sequelize for how to work with Postgres.
For mutations (create/update/delete) you can look at the examples in the relay repo for instance.
PostGraphile (https://www.graphile.org/postgraphile/) is open source:
Rapidly build highly customisable, lightning-fast GraphQL APIs
PostGraphile is an open-source tool to help you rapidly design and
serve a high-performance, secure, client-facing GraphQL API backed
primarily by your PostgreSQL database. Delight your customers with
incredible performance whilst maintaining full control over your data
and your database. Use our powerful plugin system to customise every
facet of your GraphQL API to your liking.
You can use an ORM like Sequelize if you're using JavaScript, or TypeORM if you're using TypeScript.
FB is probably using MongoDB or another NoSQL store on the backend. I recently read a blog entry which explains how to connect to MongoDB. Basically, you need to build a graph model to match the data you already have in your DB, then write resolve/reject functions to tell GraphQL how to behave when it receives a query.
See https://www.compose.io/articles/using-graphql-with-mongodb/
Have a look at SequelizeJS, which is a promise-based ORM that can work with a number of dialects: PostgreSQL, MySQL, SQLite and MSSQL.
The code below is pulled right from its example:
const Sequelize = require('sequelize');
const sequelize = new Sequelize('database', 'username', 'password', {
  host: 'localhost',
  dialect: 'mysql'|'sqlite'|'postgres'|'mssql',
  pool: {
    max: 5,
    min: 0,
    acquire: 30000,
    idle: 10000
  },
  // SQLite only
  storage: 'path/to/database.sqlite',
  // http://docs.sequelizejs.com/manual/tutorial/querying.html#operators
  operatorsAliases: false
});
const User = sequelize.define('user', {
  username: Sequelize.STRING,
  birthday: Sequelize.DATE
});
sequelize.sync()
  .then(() => User.create({
    username: 'janedoe',
    birthday: new Date(1980, 6, 20)
  }))
  .then(jane => {
    console.log(jane.toJSON());
  });
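To connect this back to GraphQL, you would call the model from a resolver; a minimal sketch (findOne is standard Sequelize):
// e.g. in a GraphQL query field for a single user
resolve: (_, args) => User.findOne({ where: { id: args.id } })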

sails-permissions getting all permissions

I am trying to send all the permissions for an authenticated user via JSON from Sails.
My current code to find permissions for a single model type:
hasPermission: function hasPermission(req, res) {
  var permitted = PermissionService.isAllowedToPerformAction({
    method: req.param('method'),
    model: sails.models[req.param('model')],
    user: req.user
  });
  return res.json(200, { permitted: permitted });
}
This code doesn't work, as isAllowedToPerformAction wants a single instance of a model. Is there a way to return a single JSON response accounting for all permissions?
Try creating roles and giving them permissions, then assign roles to users. For example:
PermissionService.createRole({
  name: 'carsCategoryAdmin',
  permissions: [
    { action: 'update', model: 'review', criteria: [{ where: { category: 'cars' } }] },
    { action: 'delete', model: 'review', criteria: [{ where: { category: 'cars' } }] }
  ],
  users: ['venise']
})
You can examine the role and its related permissions and users:
Role.find({ name: 'carsCategoryAdmin' })
  .populate('users')
  .populate('permissions')
  .exec(console.log)
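To return everything in one response, here is a hedged sketch using the Role and Permission models that sails-permissions registers (the exact attribute names are an assumption, so verify them against your installed version):
getAllPermissions: async function (req, res) {
  // load every role the authenticated user belongs to
  const user = await User.findOne(req.user.id).populate('roles');
  const roleIds = user.roles.map((role) => role.id);
  // then fetch the permissions attached to those roles
  const permissions = await Permission.find({ role: roleIds });
  return res.json(200, { permissions: permissions });
}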
See more at sails-permissions-by-example.
See how to get user permissions in the code comment given by skrichten on May 10, 2014.

Is it possible to create indexes in Mongodb for dynamic fields?

My site has several "domains" which represent sub-sections of the site. Each "domain" has its own users with profiles, and a search page where you can search for users based on profile fields that the users fill out.
User data from the 'users' collection is structured like so:
{
  username: 'shawnmichaels',
  first_name: 'Shawn',
  last_name: 'Michaels',
  domains: [
    {
      name: 'domain1',
      user_fields: {
        'bio': 'Short bio related to domain 1',
        'skills': 'Pertinent skills for domain 1'
      }
    },
    {
      name: 'domain2',
      user_fields: {
        'bio': 'Short bio related to domain 2',
        'skills': 'Pertinent skills for domain 2'
      }
    }
  ]
}
So, users have field data across multiple domains. The domain names and field names are dynamic. There could potentially be hundreds of domains for a user and dozens of fields in a domain.
Is it possible to have some sort of dynamic index so that I can search for fields under 'domain1' without getting any matches in 'domain2'?
For example, if user1 has "skills": ["karate", "judo"] under 'domain1' and I search domain2 for "karate", I don't want a match for user1.
For anyone interested, I've answered my own question after a day of research.
I'm pretty new to MongoDB and didn't realize you can query nested array fields by treating them like any other key, like so:
db.users.find({ 'domains.title': 'domain1' })
// where 'domains' is an array of objects, each with a 'title' key
With this knowledge, I added a text index for each field I want to be searchable:
db.users.ensureIndex({
  'username': 'text',
  'first_name': 'text',
  'last_name': 'text',
  'domains.fields.bio': 'text',
  'domains.fields.skills': 'text',
  'domains.fields.title': 'text',
  'domains.fields.training': 'text'
}, { name: 'domain_search' })
And I structured search queries in each domain like so:
db.users.find({
  'domains.title': DOMAIN_NAME,
  '$text': { '$search': SEARCH_TERM }
})
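For instance, a search of domain1 for "karate" under this scheme would be:
db.users.find({
  'domains.title': 'domain1',
  '$text': { '$search': 'karate' }
})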