mongodb/mongoose: Save a unique value only if the data is not null, from NestJS

I am trying to save data in MongoDB. I want a field to be unique when it has a value, but I also want to allow multiple documents where that field is null.
My sample schema:
import { Prop, Schema } from '@nestjs/mongoose';
import { Document } from 'mongoose';

@Schema()
export class Contact extends Document {
  @Prop({ unique: true, sparse: true, required: true })
  email: string;

  @Prop({ default: '+1' })
  countryCode: string;

  @Prop({ unique: true, sparse: true })
  mobile: string;
}
In this case, a mobile number is not required. A user can add their contact information with or without providing a mobile number. If the user does send a mobile number, it should be unique. So I need to allow multiple null values in the mobile field, while still keeping it unique whenever a number is provided.
Entries without a mobile number seem to get the value null, so every subsequent entry without a mobile collides with the unique index.
Is there any way to solve this problem either from the database layer or the application layer?
I am using NestJS for developing my API.

A unique sparse index still does not allow multiple documents whose field is explicitly set to null; the index only skips documents where the field is absent altogether. You need to transform your data payload by dropping the null fields before you save your docs in MongoDB, so that the field is missing instead of null. A transform pipe will help you handle this. Here is a transform pipe that you can use for this purpose:
import {
  ArgumentMetadata,
  BadRequestException,
  Injectable,
  PipeTransform,
} from '@nestjs/common';

@Injectable()
export class NullValidationPipe implements PipeTransform {
  private isObj(obj: any): boolean {
    return typeof obj === 'object' && obj !== null;
  }

  private dropNull(values: any) {
    Object.keys(values).forEach((key) => {
      if (key === 'password' || key === '_id') return;
      if (Array.isArray(values[key])) {
        // recurse into array elements that are objects
        values[key] = values[key].map((value) =>
          this.isObj(value) ? this.dropNull(value) : value,
        );
      } else if (this.isObj(values[key])) {
        // recurse into nested objects
        values[key] = this.dropNull(values[key]);
      } else if (values[key] === null || values[key] === undefined) {
        // remove the field entirely so it is absent in MongoDB, not null
        delete values[key];
      }
    });
    return values;
  }

  transform(values: any, metadata: ArgumentMetadata) {
    const { type } = metadata;
    if (type === 'param' || type === 'custom') return values;
    if (this.isObj(values) && type === 'body') {
      return this.dropNull(values);
    }
    throw new BadRequestException('Validation failed');
  }
}
Use this pipe in the controller and it will drop every incoming null field that arrives with the request payload.
You can also check the NestJS pipes and validation docs: https://docs.nestjs.com/techniques/validation
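For completeness, here is a minimal sketch of how the pipe could be wired into a controller. The ContactsController, ContactsService, and import paths are assumptions for illustration, not part of the original setup:

import { Body, Controller, Post, UsePipes } from '@nestjs/common';
import { NullValidationPipe } from './null-validation.pipe';
import { ContactsService } from './contacts.service'; // hypothetical service

@Controller('contacts')
export class ContactsController {
  constructor(private readonly contactsService: ContactsService) {}

  // The pipe runs before the handler and strips null/undefined fields from the body,
  // so a `mobile: null` payload never reaches the sparse unique index.
  @Post()
  @UsePipes(new NullValidationPipe())
  async create(@Body() createContactDto: Record<string, any>) {
    return this.contactsService.create(createContactDto);
  }
}

With this in place, a POST body of { "email": "a@b.com", "mobile": null } is saved without a mobile field at all, which the sparse index then ignores.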

Related

apollostack/graphql-server - how to get the fields requested in a query from resolver

I am trying to figure out a clean way to work with queries and MongoDB projections so I don't have to retrieve excessive information from the database.
So assuming I have:
// the query
type Query {
  getUserByEmail(email: String!): User
}
And I have a User with an email and a username, to keep things simple. If I send a query and I only want to retrieve the email, I can do the following:
query { getUserByEmail(email: "test@test.com") { email } }
But in the resolver, my DB query still retrieves both username and email, but only one of those is passed back by apollo server as the query result.
I only want the DB to retrieve what the query asks for:
// the resolver
getUserByEmail(root, args, context, info) {
  // check what fields the query requested
  // create a projection to only request those fields
  return db.collection('users').findOne({ email: args.email }, { /* projection */ });
}
Of course the problem is, getting information on what the client is requesting isn't so straightforward.
Assuming I pass in the request as context: I considered using context.payload (hapi.js), which has the query string, and searching it with various .split()s, but that feels kind of dirty. As far as I can tell, info.fieldASTs[0].selectionSet.selections has the list of fields, and I could check for a field's existence in there. I'm not sure how reliable this is, especially once I start using more complex queries.
Is there a simpler way?
In case you don't use MongoDB: a projection is an additional argument you pass in, telling the database explicitly what to retrieve:
// telling mongoDB to not retrieve _id
db.collection('users').findOne({ email: 'test@test.com' }, { _id: 0 })
As always, thanks to the amazing community.
2020-Jan answer
The current answer for getting the fields requested in a GraphQL query is to use the graphql-parse-resolve-info library to parse the info parameter.
The library is "a pretty complete solution and is actually used under the hood by postgraphile", and is recommended going forward by the author of the other top library for parsing the info field, graphql-fields.
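As a rough sketch of what that looks like for the question's getUserByEmail resolver (the db handle on the context and the projection shape are assumptions; parse and simplify are used the same way as in the TypeScript helpers further down):

import { parse, simplify, ResolveTree } from 'graphql-parse-resolve-info';
import { GraphQLResolveInfo } from 'graphql';

const resolvers = {
  Query: {
    async getUserByEmail(
      root: unknown,
      args: { email: string },
      context: { db: any }, // assumed: a connected MongoDB Db instance on the context
      info: GraphQLResolveInfo
    ) {
      // Parse the resolve info and simplify it against this field's return type.
      const { fields } = simplify(parse(info) as ResolveTree, info.returnType) as {
        fields: { [key: string]: { name: string } };
      };
      // e.g. { email: 1, username: 1 } when the client asks for { email username }
      const projection: Record<string, 1> = {};
      for (const field of Object.values(fields)) {
        projection[field.name] = 1;
      }
      return context.db.collection('users').findOne({ email: args.email }, { projection });
    },
  },
};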
Use graphql-fields
Apollo server example
const graphqlFields = require('graphql-fields');
const { makeExecutableSchema } = require('graphql-tools');

const rootSchema = [`
  type Person {
    id: String!
    name: String!
    email: String!
    picture: String!
    type: Int!
    status: Int!
    createdAt: Float
    updatedAt: Float
  }
  type Query {
    users: [Person]
  }
  schema {
    query: Query
  }
`];

const rootResolvers = {
  Query: {
    users(root, args, context, info) {
      // graphql-fields turns `info` into an object keyed by the requested field names
      const topLevelFields = Object.keys(graphqlFields(info));
      // `fetch` is assumed to be available (e.g. node-fetch or the global fetch in Node 18+)
      return fetch(`/api/user?fields=${topLevelFields.join(',')}`);
    }
  }
};

const schema = [...rootSchema];
const resolvers = Object.assign({}, rootResolvers);

// Create schema
const executableSchema = makeExecutableSchema({
  typeDefs: schema,
  resolvers,
});
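Since the original question is about trimming MongoDB queries rather than a REST call, the same topLevelFields list can feed a projection instead of a query string. A hypothetical sketch, assuming a connected MongoDB database on the context:

import graphqlFields from 'graphql-fields'; // assumes esModuleInterop
import { GraphQLResolveInfo } from 'graphql';

// Hypothetical MongoDB variant of the users resolver above.
function usersResolver(root: unknown, args: unknown, context: { db: any }, info: GraphQLResolveInfo) {
  const topLevelFields = Object.keys(graphqlFields(info));
  // Turn ['email', 'name'] into { email: 1, name: 1 }.
  const projection = topLevelFields.reduce<Record<string, 1>>((acc, field) => {
    acc[field] = 1;
    return acc;
  }, {});
  return context.db.collection('users').find({}, { projection }).toArray();
}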
Sure you can. This is actually the same functionality that is implemented in the join-monster package for SQL-based databases. There's a talk by its creator: https://www.youtube.com/watch?v=Y7AdMIuXOgs
Take a look at their info-analysing code to get you started - https://github.com/stems/join-monster/blob/master/src/queryASTToSqlAST.js#L6-L30
Would love to see a projection-monster package for us mongo users :)
UPDATE:
There is a package that creates a projection object from info on npm: https://www.npmjs.com/package/graphql-mongodb-projection
You can generate a MongoDB projection from the info argument. Here is sample code that you can follow:
/**
 * @description - Builds a MongoDB projection from the GraphQL query
 *
 * @return { object }
 * @param { object } info
 * @param { object } model - Mongoose model used to read the schema keys
 */
function getDBProjection(info, model) {
  const {
    schema: { obj }
  } = model;
  const keys = Object.keys(obj);
  const projection = {};
  const { selections } = info.fieldNodes[0].selectionSet;
  for (let i = 0; i < keys.length; i++) {
    const key = keys[i];
    // true if the client asked for this schema field, false otherwise
    const isSelected = selections.some(
      selection => selection.name.value === key
    );
    projection[key] = isSelected;
  }
  return projection;
}

module.exports = getDBProjection;
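A usage sketch for this helper (the User Mongoose model, the resolver shape, and the import paths are assumptions for illustration):

import { GraphQLResolveInfo } from 'graphql';
import getDBProjection from './getDBProjection'; // the helper above
import User from './models/user'; // hypothetical Mongoose model

const resolvers = {
  Query: {
    getUserByEmail(root: unknown, args: { email: string }, ctx: unknown, info: GraphQLResolveInfo) {
      // Only fetch the columns the client actually asked for.
      const projection = getDBProjection(info, User);
      return User.findOne({ email: args.email }, projection);
    },
  },
};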
With a few helper functions you can use it like this (TypeScript version):
import { parceGqlInfo, query } from "@backend";
import { GraphQLResolveInfo } from "graphql";

export const user = async (
  parent: unknown,
  args: unknown,
  ctx: unknown,
  info: GraphQLResolveInfo
): Promise<User | null> => {
  const { dbQueryStr } = parceGqlInfo(info, userFields, "id");
  const [user] = await query(`SELECT ${dbQueryStr} FROM users WHERE id=$1;`, [1]);
  return user;
};
Helper functions, with a few points about them:
- gql_uid is used as the ID! string type derived from the primary key, so the DB column types don't have to change
- the required option forces fields needed by dataloaders to be included even if the user did not request them
- allowedFields is used to filter out extra fields from info, such as '__typename'
- queryPrefix is used if you need to prefix the selected fields, e.g. select u.id from users u
const userFields = [
  "gql_uid",
  "id",
  "email"
];

// merge arrays and delete duplicates
export const mergeDedupe = <T>(arr: any[][]): T => {
  // @ts-ignore
  return ([...new Set([].concat(...arr))] as unknown) as T;
};
import { parse, simplify, ResolveTree } from "graphql-parse-resolve-info";
import { GraphQLResolveInfo } from "graphql";

export const getQueryFieldsFromInfo = <Required = string>(
  info: GraphQLResolveInfo,
  options: { required?: Required[] } = {}
): string[] => {
  const { fields } = simplify(parse(info) as ResolveTree, info.returnType) as {
    fields: { [key: string]: { name: string } };
  };
  let astFields = Object.entries(fields).map(([, v]) => v.name);

  if (options.required) {
    astFields = mergeDedupe([astFields, options.required]);
  }

  return astFields;
};

export const onlyAllowedFields = <T extends string | number>(
  raw: T[] | readonly T[],
  allowed: T[] | readonly T[]
): T[] => {
  return allowed.filter((f) => raw.includes(f));
};

export const parceGqlInfo = (
  info: GraphQLResolveInfo,
  allowedFields: string[] | readonly string[],
  gqlUidDbAlliasField: string,
  options: { required?: string[]; queryPrefix?: string } = {}
): { pureDbFields: string[]; gqlUidRequested: boolean; dbQueryStr: string } => {
  const fieldsWithGqlUid = onlyAllowedFields(getQueryFieldsFromInfo(info, options), allowedFields);

  return {
    pureDbFields: fieldsWithGqlUid.filter((i) => i !== "gql_uid"),
    gqlUidRequested: fieldsWithGqlUid.includes("gql_uid"),
    dbQueryStr: fieldsWithGqlUid
      .map((f) => {
        const dbQueryStrField = f === "gql_uid" ? `${gqlUidDbAlliasField}::Text AS gql_uid` : f;
        return options.queryPrefix ? `${options.queryPrefix}.${dbQueryStrField}` : dbQueryStrField;
      })
      .join(),
  };
};

Angular2 interdependent form field validation

I have two form fields, where if the first field is filled in, the second field is mandatory. If I try to do this in Angular2, using a custom validator, the validator is only fired on initialization and when the specific field is changed.
Case:
- User fills in field 1
- Field 2 should become required, but isn't until the user actually changes field 2 (firing the custom validator).
private createForm(): void {
  this.testForm = this._formBuilder.group({
    'field1': [],
    'field2': ['', this.validateRequired()]
  });
}

private validateRequired() {
  console.log("something", this);
  let component = this;
  return (control: Control): { [s: string]: boolean } => {
    return component.testModel.field1 && !control.value ? { "required": true } : null;
  };
}
See this plunkr: http://plnkr.co/edit/PEY2QIegkqo8BW1UkQS5?p=preview
Edit:
For now I subscribed to field1's valueChanges observable and, when it changes, execute a manual check on field2, like this:
this.testForm.controls['field1'].valueChanges.subscribe(
  value => {
    this.testForm.controls['field2'].updateValueAndValidity();
  }
);
But I feel like there must be a better way to do this.
You could use a global validator for the group like this:
private createForm(): void {
  this.testForm = this._formBuilder.group({
    'field1': [],
    'field2': ['', this.validateRequired()]
  }, {
    validator: this.someGlobalValidator // <-----
  });
}

someGlobalValidator(group: ControlGroup) { // <-----
  var valid = false;
  for (let name in group.controls) {
    var val = group.controls[name].value;
    (...)
  }
  if (valid) {
    return null;
  }
  return {
    someValidationError: true
  };
}
I want to expand on Thierry's answer a bit in order to address Arne's comment. To handle validation of multiple fields, and possibly multiple validations, in your form-group-level validator, the solution is to have your validator factory return a function that in turn returns an object indicating the error type(s). Here is an example of a field-matching validator to which I added some extra errors in order to illustrate the point. Note that it returns an object that may have several properties, where each property name is an arbitrary string and each value is a boolean.
export function FieldMatchingValidator(field1: string, field2: string) {
  return (cg: FormGroup): { [s: string]: boolean } => {
    let retVal = null;
    let f1 = cg.controls[field1];
    let f2 = cg.controls[field2];
    retVal = f1.value === f2.value ? null : { fieldMismatch: true };
    if (somecondition) {
      retVal = retVal || {};
      retVal['someerror'] = true;
    }
    if (someothercondition) {
      retVal = retVal || {};
      retVal['someothererror'] = true;
    }
    return retVal;
  };
}
When this validator runs and an error condition is encountered, the form's errors property is populated with the returned object, with one or more properties indicating the different errors. Then all you have to do is put the appropriate Angular bindings in the template for the controls that have the validation errors.
<div *ngIf="myForm.hasError('fieldMismatch')">
Field Mismatch
</div>
<div *ngIf="myForm.hasError('someerror')">
Some Error
</div>
<div [class.Errors]="myForm.hasError('someothererror')">
Some Other Error
</div>
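As a usage sketch (the field names and the FormBuilder wiring here are assumptions for illustration, not part of the original answer), the factory is attached at the group level in the same way as in Thierry's example:

import { FormBuilder, FormGroup, Validators } from '@angular/forms';
import { FieldMatchingValidator } from './field-matching.validator'; // hypothetical path to the factory above

export class SignupComponent {
  myForm: FormGroup;

  constructor(private fb: FormBuilder) {
    this.myForm = this.fb.group(
      {
        password: ['', Validators.required],
        confirmPassword: ['', Validators.required],
      },
      // FieldMatchingValidator('password', 'confirmPassword') returns the group-level validator function.
      { validator: FieldMatchingValidator('password', 'confirmPassword') }
    );
  }
}

With this in place, myForm.hasError('fieldMismatch') in the template behaves exactly as shown above.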

MeteorJS conditional pub

I have a MeteorJS project and I want to publish a certain set of users based on whether an id param is defined or not. When the id param has a value it gets a list of users perfectly; however, when it is null it returns nothing. I am using the alanning:roles package for user roles, and the exact same query works fine in meteor mongo.
Note: I understand the implications of user pub and limiting fields. I just want to understand why the pub is returning nothing when the id is null.
// Server
Meteor.publish('userAccess', function(id) {
  console.log(id); // null or array
  if (!id || id == null) {
    return Meteor.users.find({
      'roles': {
        $in: ['admin', 'team']
      }
    });
  } else {
    return Meteor.users.find({
      _id: id
    });
  }
});
You should use the alanning:roles methods and publish the ids they return.
// Server
Meteor.publish('userAccess', function(id) {
  console.log(id); // null or array
  if (!id || id == null) {
    var Admins = Roles.getUsersInRole(['admin']);
    var Teams = Roles.getUsersInRole(['team']);
    // intersect both arrays
    var adminsAndTeam = Admins.filter(function(n) {
      return Teams.indexOf(n) != -1;
    });
    return Meteor.users.find({
      _id: {
        $in: adminsAndTeam
      }
    });
  } else {
    return Meteor.users.find({
      _id: id
    });
  }
});
It turns out the original code does work. The problem was in the front end, where a conditional was in the wrong place and therefore the users were not being returned.

sails js model validation against database

I am writing a custom validation rule to check if the "category_id" passed to my create function is valid or not.
types: {
  isValidCategoryId: function(id) {
    return Category.findOne({ id: id }).exec(function(err, user) {
      if (err || !user)
        return false;
      else {
        return true;
      }
    });
  }
},
attributes: {
  category_id: { isValidCategoryId: true, required: true, type: 'string' },
}
I understand that my custom validation function should return true, but in an asynchronous context such as checking a value in the DB, that doesn't work.
How should I write my custom validation function to make it behave correctly?
I tried particlebanana's solution. It didn't work, but at least it pointed me in the right direction.
According to the docs:
Validation rules may be defined as simple values or functions (both sync and async) that return the value to test against.
So, one easy way to do this would be:
attributes: {
  category_id: {
    required: true,
    type: 'string',
    'true': function(cb) {
      Category.findOne({ id: this.category_id }).exec(function(err, category) {
        return cb(!err && category);
      });
    }
  }
}
As you can see, I'm just using the "true" validator here, but you could of course write your own validator to work out some more advanced logic. The key here is that your custom validators aren't async, but your validation rules can be. Yes, the terminology is very confusing.
Here's another example with a custom validator:
attributes: {
  author: {
    required: true,
    type: 'string',
    canPost: function(cb) {
      Author.findOne(this.author).exec(function(err, author) {
        return cb(author);
      });
    }
  }
},
types: {
  canPost: function(id, author) {
    return author && author.canPost();
  }
}
Hopefully that makes sense. If not, see the docs.
You can pass in a callback and return the result through it. It's a bit weird because it doesn't follow the (err, result) convention but instead just uses (result). Give this a try:
types: {
  isValidCategoryId: function(id, cb) {
    return Category.findOne({ id: id }).exec(function(err, user) {
      if (err || !user)
        return cb(false);
      else {
        return cb(true);
      }
    });
  }
},

saving embedded Propel symfony form for one-to-one relationship

I have a pair of tables that have a one-to-one relationship.
I have a complaint form that needs to embed a person form inside it; the relevant schema is below:
complaint:
  id: ~
  created_at: ~
  updated_at: ~
  complainant_id: { type: integer, foreignTable: person_data, foreignReference: id, onDelete: setnull }
  status: { type: tinyint, default: 1 }
  complaint_title: { type: varchar(64) }
  complaint_number: { type: varchar(16) }
  recipient: { type: varchar(128) }

person_data:
  id: ~
  created_at: ~
  updated_at: ~
  company_name: { type: varchar(64) }
  first_name: { type: varchar(64) }
  last_name: { type: varchar(64) }
  email: { type: varchar(128) }
I am able to successfully save both objects to the database but the main complaint object is not being updated with the complainant_id of the person_data row.
Does anyone know why this isn't working correctly and how to force it to update the complaint object correctly?
I am using symfony 1.4.13, Propel 1.6.3.
UPDATE:
Here is the code for the embedded form:
<?php
public function configure()
{
  $use_fields = array();

  // ...other fields added...

  $sub_form = new PersonDataForm(array(), array());
  $this->embedForm('complainant', $sub_form);
  array_push($use_fields, 'complainant');

  $this->useFields($use_fields);
}
I've found a solution to this problem: override the saveEmbeddedForms() method in the form class.
The embedded forms are saved before the main object is updated, so their ids are then available to set on the main object.
public function saveEmbeddedForms($con = null, $forms = null)
{
  // save the embedded forms
  parent::saveEmbeddedForms($con, $forms);

  // loop through all embedded forms and update the main object with their ids
  foreach ($this->getEmbeddedForms() as $name => $embedded_form)
  {
    switch ($name)
    {
      case 'recipient':
        // criteria to determine if the sub-object should be saved or not
        if ($embedded_form->getObject()->getFirstName() == '' && $embedded_form->getObject()->getLastName() == '')
        {
          $embedded_form->getObject()->delete();
          $this->getObject()->setRecipientId(null);
          $this->getObject()->save();
        }
        else
          $this->getObject()->setRecipientId($embedded_form->getObject()->getId());
        break;

      case 'complainant':
        if ($embedded_form->getObject()->getFirstName() == '' && $embedded_form->getObject()->getLastName() == '')
        {
          $embedded_form->getObject()->delete();
          $this->getObject()->setComplainantId(null);
          $this->getObject()->save();
        }
        else
        {
          $this->getObject()->setComplainantId($embedded_form->getObject()->getId());
        }
        break;

      default:
        break;
    }
  }

  // save the main object with the new sub-object keys set
  $this->getObject()->save();
}
Unfortunately I couldn't find this explanation anywhere on the internet, so here it is for those who come after me.