saving embedded Propel symfony form for one-to-one relationship - forms

I have a pair of tables that have a one-to-one relationship.
I have a complaint form that needs to embed a person form inside it; the relevant schema is below:
complaint:
  id: ~
  created_at: ~
  updated_at: ~
  complainant_id: { type: integer, foreignTable: person_data, foreignReference: id, onDelete: setnull }
  status: { type: tinyint, default: 1 }
  complaint_title: { type: varchar(64) }
  complaint_number: { type: varchar(16) }
  recipient: { type: varchar(128) }

person_data:
  id: ~
  created_at: ~
  updated_at: ~
  company_name: { type: varchar(64) }
  first_name: { type: varchar(64) }
  last_name: { type: varchar(64) }
  email: { type: varchar(128) }
I am able to successfully save both objects to the database but the main complaint object is not being updated with the complainant_id of the person_data row.
Does anyone know why this isn't working correctly and how to force it to update the complaint object correctly?
I am using symfony 1.4.13, Propel 1.6.3.
UPDATE:
Here is the code for the embedded form:
<?php
public function configure()
{
    $use_fields = array();

    // ...other fields added...

    $sub_form = new PersonDataForm(array(), array());
    $this->embedForm('complainant', $sub_form);
    array_push($use_fields, 'complainant');

    $this->useFields($use_fields);
}

I've found a solution to this problem: override the saveEmbeddedForms() method in the form class.
The embedded forms are saved before the main object is updated, so their ids are available to set on the main object.
public function saveEmbeddedForms($con = null, $forms = null)
{
    // save the embedded forms first so their ids exist
    parent::saveEmbeddedForms($con, $forms);

    // loop through all embedded forms and update the main object with their ids
    foreach ($this->getEmbeddedForms() as $name => $embedded_form)
    {
        switch ($name)
        {
            case 'recipient':
                // criteria to determine if the sub-object should be saved or not
                if ($embedded_form->getObject()->getFirstName() == '' && $embedded_form->getObject()->getLastName() == '')
                {
                    $embedded_form->getObject()->delete();
                    $this->getObject()->setRecipientId(null);
                    $this->getObject()->save();
                }
                else
                {
                    $this->getObject()->setRecipientId($embedded_form->getObject()->getId());
                }
                break;

            case 'complainant':
                if ($embedded_form->getObject()->getFirstName() == '' && $embedded_form->getObject()->getLastName() == '')
                {
                    $embedded_form->getObject()->delete();
                    $this->getObject()->setComplainantId(null);
                    $this->getObject()->save();
                }
                else
                {
                    $this->getObject()->setComplainantId($embedded_form->getObject()->getId());
                }
                break;

            default:
                break;
        }
    }

    // save the main object with the new sub-object keys set
    $this->getObject()->save();
}
Unfortunately I couldn't find this explanation anywhere on the internet, so here it is for those who come after me.

Related

mongodb/mongoose: Save unique value if data is not null, from NestJS

I am trying to save data in MongoDB. I want to store unique data when data is not null. However, I want to allow multiple null values in the unique identifier.
My sample schema:
@Schema()
export class Contact extends Document {
  @Prop({ unique: true, sparse: true, required: true })
  email: string;

  @Prop({ default: '+1' })
  countryCode: string;

  @Prop({ unique: true, sparse: true })
  mobile: string;
}
In this case, a mobile number is not required. User can add their contact information with or without providing a mobile number. If the user sends their mobile number that should be unique. So, I need to allow multiple null values in the mobile field. However, that field should be unique when the user provides any mobile number.
Empty entries seem to get the value null, so every entry without a mobile collides with the unique index.
Is there any way to solve this problem either from the database layer or the application layer?
I am using NestJS for developing my API.
A unique index (even a sparse one) still does not allow multiple docs where the field is explicitly null -- sparse only skips documents that omit the field entirely. You need to transform your data payload by dropping null fields before you save your docs in MongoDB. A transform pipe will help you handle this issue. Here is a transform pipe that you can use for this purpose:
import {
  ArgumentMetadata,
  BadRequestException,
  Injectable,
  PipeTransform,
} from '@nestjs/common';

@Injectable()
export class NullValidationPipe implements PipeTransform {
  private isObj(obj: any): boolean {
    return typeof obj === 'object' && obj !== null;
  }

  private dropNull(values) {
    Object.keys(values).forEach((key) => {
      if (!(key === 'password' || key === '_id')) {
        // check arrays before plain objects, since arrays also pass the isObj test
        if (Array.isArray(values[key]) && values[key].length > 0) {
          values[key] = values[key].map((value) => {
            if (this.isObj(value)) {
              value = this.dropNull(value);
            }
            return value;
          });
        } else if (this.isObj(values[key])) {
          values[key] = this.dropNull(values[key]);
        } else if (values[key] === null || values[key] === undefined) {
          delete values[key];
        }
      }
    });
    return values;
  }

  transform(values: any, metadata: ArgumentMetadata) {
    const { type } = metadata;
    if (type === 'param' || type === 'custom') return values;
    else if (this.isObj(values) && type === 'body') {
      return this.dropNull(values);
    }
    throw new BadRequestException('Validation failed');
  }
}
Use this pipe in the controller and it will drop all null fields from the incoming request payload.
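For illustration, a minimal usage sketch (the ContactsController, route and import path are assumptions, not part of the original answer):

import { Body, Controller, Post, UsePipes } from '@nestjs/common';
import { NullValidationPipe } from './null-validation.pipe'; // assumed file location

@Controller('contacts')
export class ContactsController {
  @Post()
  @UsePipes(new NullValidationPipe())
  async create(@Body() contact: Record<string, any>) {
    // by the time we get here, null/undefined fields (e.g. mobile)
    // have already been stripped from the payload by the pipe
    return contact; // hand off to a service / model in real code
  }
}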
You can also check the NestJS pipes and validation docs: https://docs.nestjs.com/techniques/validation
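As an aside on the database-layer option the question asks about, MongoDB partial indexes can enforce uniqueness only for documents where the field is actually set. A hedged sketch, assuming you drop unique/sparse from the @Prop and declare the index on the schema instead (the import path is an assumption):

import { SchemaFactory } from '@nestjs/mongoose';
import { Contact } from './contact.schema'; // the class from the question above

export const ContactSchema = SchemaFactory.createForClass(Contact);

// uniqueness applies only where mobile exists as a string, so documents
// that omit the field (or carry null) never collide with each other
ContactSchema.index(
  { mobile: 1 },
  { unique: true, partialFilterExpression: { mobile: { $type: 'string' } } },
);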

How to implement a node query resolver with apollo / graphql

I am working on implementing a node interface for graphql -- a pretty standard design pattern.
Looking for guidance on the best way to implement a node query resolver for graphql
node(id: ID!): Node
The main thing that I am struggling with is how to encode/decode the typename into the ID so that we can find the right table/collection to query.
Currently I am using a PostgreSQL uuid strategy with pgcrypto to generate ids.
Where is the right seam in the application to do this?:
could be done in the primary key generation at the database
could be done at the graphql seam (using a visitor pattern maybe)
And once the best seam is picked:
how/where do you encode/decode?
Note my stack is:
ApolloClient/Server (from graphql-yoga)
node
TypeORM
PostgreSQL
The id exposed to the client (the global object id) is not persisted on the backend -- the encoding and decoding should be done by the GraphQL server itself. Here's a rough example based on how relay does it:
import Foo from '../../models/Foo'

function encode (id, __typename) {
  return Buffer.from(`${id}:${__typename}`, 'utf8').toString('base64');
}

function decode (objectId) {
  const decoded = Buffer.from(objectId, 'base64').toString('utf8')
  const parts = decoded.split(':')
  return {
    id: parts[0],
    __typename: parts[1],
  }
}

const typeDefs = `
  type Query {
    node(id: ID!): Node
  }

  type Foo implements Node {
    id: ID!
    foo: String
  }

  interface Node {
    id: ID!
  }
`;

// Just in case model name and typename do not always match
const modelsByTypename = {
  Foo,
}

const resolvers = {
  Query: {
    node: async (root, args, context) => {
      const { __typename, id } = decode(args.id)
      const Model = modelsByTypename[__typename]
      const node = await Model.getById(id)
      return {
        ...node,
        __typename,
      };
    },
  },
  Foo: {
    id: (obj) => encode(obj.id, 'Foo')
  }
};
Note: by returning the __typename, we're letting GraphQL's default resolveType behavior figure out which type the interface is returning, so there's no need to provide a resolver for __resolveType.
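If you prefer not to attach __typename to the returned object, an explicit __resolveType resolver on the interface achieves the same thing. A short sketch:

const nodeResolvers = {
  Node: {
    __resolveType(obj) {
      // return the concrete GraphQL type name for this object, e.g. from a
      // stored discriminator, or fall back to the attached __typename
      return obj.__typename || null;
    },
  },
};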
Edit: to apply the id logic to multiple types:
function addIDResolvers (resolvers, types) {
  for (const type of types) {
    if (!resolvers[type]) {
      resolvers[type] = {}
    }
    // wrap in a resolver function so obj is the parent object at resolve time
    resolvers[type].id = (obj) => encode(obj.id, type)
  }
}
addIDResolvers(resolvers, ['Foo', 'Bar', 'Qux'])
@Jonathan I can share an implementation that I have and you can see what you think. This is using graphql-js, MongoDB and Relay on the client.
/**
 * Given a function to map from an ID to an underlying object, and a function
 * to map from an underlying object to the concrete GraphQLObjectType it
 * corresponds to, constructs a `Node` interface that objects can implement,
 * and a field config for a `node` root field.
 *
 * If the typeResolver is omitted, object resolution on the interface will be
 * handled with the `isTypeOf` method on object types, as with any GraphQL
 * interface without a provided `resolveType` method.
 */
export function nodeDefinitions<TContext>(
  idFetcher: (id: string, context: TContext, info: GraphQLResolveInfo) => any,
  typeResolver?: ?GraphQLTypeResolver<*, TContext>,
): GraphQLNodeDefinitions<TContext> {
  const nodeInterface = new GraphQLInterfaceType({
    name: 'Node',
    description: 'An object with an ID',
    fields: () => ({
      id: {
        type: new GraphQLNonNull(GraphQLID),
        description: 'The id of the object.',
      },
    }),
    resolveType: typeResolver,
  });

  const nodeField = {
    name: 'node',
    description: 'Fetches an object given its ID',
    type: nodeInterface,
    args: {
      id: {
        type: GraphQLID,
        description: 'The ID of an object',
      },
    },
    resolve: (obj, { id }, context, info) => (id ? idFetcher(id, context, info) : null),
  };

  const nodesField = {
    name: 'nodes',
    description: 'Fetches objects given their IDs',
    type: new GraphQLNonNull(new GraphQLList(nodeInterface)),
    args: {
      ids: {
        type: new GraphQLNonNull(new GraphQLList(new GraphQLNonNull(GraphQLID))),
        description: 'The IDs of objects',
      },
    },
    resolve: (obj, { ids }, context, info) =>
      Promise.all(ids.map(id => Promise.resolve(idFetcher(id, context, info)))),
  };

  return { nodeInterface, nodeField, nodesField };
}
Then:
import { nodeDefinitions } from './node';
// fromGlobalId pairs with toGlobalId and comes from the graphql-relay package
import { fromGlobalId } from 'graphql-relay';

const { nodeField, nodesField, nodeInterface } = nodeDefinitions(
  // A method that maps from a global id to an object
  async (globalId, context) => {
    const { id, type } = fromGlobalId(globalId);

    if (type === 'User') {
      return UserLoader.load(context, id);
    }
    ....
    ...
    ...
    // it should not get here
    return null;
  },
  // A method that maps from an object to a type
  obj => {
    if (obj instanceof User) {
      return UserType;
    }
    ....
    ....
    // it should not get here
    return null;
  },
);
The load method resolves the actual object; this is the part where you would work more specifically with your DB, etc.
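Purely for illustration, here is one shape such a loader could take using the dataloader package (the User model, its query API and the UserLoader wrapper are assumptions, not the answerer's actual code):

import DataLoader from 'dataloader';
import User from '../../models/User'; // hypothetical Mongo-backed model

// batch every id requested in the same tick into a single query
const batchLoader = new DataLoader(async (ids) => {
  const users = await User.find({ _id: { $in: ids } });
  const byId = new Map(users.map((u) => [String(u._id), u]));
  // DataLoader expects results in the same order as the requested keys
  return ids.map((id) => byId.get(String(id)) || null);
});

export const UserLoader = {
  load: (context, id) => batchLoader.load(id),
};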
If it's not clear, you can ask! Hope it helps :)

How to replace a manual id with an ObjectID _id in mongoDB?

Let's say I have a database with two collections, kids and classes. Each kid belongs to one class.
Each class has a previously created integer id.
I want to replace the kid.class_id with the (ObjectID) _id of the class, not the (integer) id of the class.
However, when I run the script below, it doesn't reset the class_id with the class._id -- it remains the old integer id.
mongoose.connect(someMongodbUri, { useMongoClient: true }, (err, db) => {
  let kidsCount = 0;

  db.collection('kids').find({}).each((err, kid) => {
    kidsCount++;

    db.collection('classes')
      .findOne({ id: kid.class_id })
      // `class` is a reserved word, so use another name for the callback argument
      .then((classDoc, err) => {
        let newClassId = classDoc._id;

        db.collection('kids').updateOne(
          { _id: kid._id },
          { $set: { class_id: newClassId } }
        ).then(() => {
          console.info('Updated', kid.class_id);
          kidsCount--;
          if (kidsCount === 0) { db.close(); }
        });
      });
  });
});
Am I missing something? Thanks for any help you can offer!
We can convert the integer id to an ObjectId:
var ObjectId = require('mongodb').ObjectID;
let newClassId = ObjectId(classDoc._id);
There may be better or more elegant ways that I don't know of, but this works for me.
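For reference, the same migration can be written with async/await so each kid's update clearly finishes before moving on. A sketch assuming the modern MongoDB Node.js driver; collection and field names are taken from the question:

import { MongoClient, ObjectId } from 'mongodb';

async function migrateClassIds(uri) {
  const client = await MongoClient.connect(uri);
  const db = client.db();

  const kids = db.collection('kids');
  const classes = db.collection('classes');

  for await (const kid of kids.find({})) {
    // look the class up by its old integer id...
    const classDoc = await classes.findOne({ id: kid.class_id });
    if (!classDoc) continue;

    // ...and store its ObjectId _id on the kid
    await kids.updateOne(
      { _id: kid._id },
      { $set: { class_id: new ObjectId(classDoc._id) } },
    );
  }

  await client.close();
}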

Can I hook up a model to an existing database?

I have mongodb sitting behind an existing API and want to migrate the API to use sailsjs.
The data structure isn't anything crazy - just standard stuff using default mongodb ObjectIds as primary keys.
Will I be able to use this existing db with sails by just wiring up sails models? Do I need to specify the _id field? And, if so, what datatype should I use?
E.g. Existing mongodb with user collection with the following schema:
_id
name
fname
lname
age
Can I just wire up using something like the following for it to work?:
// User.js
var User = {
  attributes: {
    name: {
      fname: 'STRING',
      lname: 'STRING'
    },
    age: 'INTEGER'
  }
};

module.exports = User;
First: you don't have to define _id (Waterline does this for you).
Waterline aims to let you use the same functions and models for all types of databases. A "sub-field" is not supported in MySQL, for example, so the nested name attribute above won't work.
You can do this:
// User.js
var User = {
  attributes: {
    name: 'json',
    age: 'integer'
  }
};

module.exports = User;
If you want to validate "name" you can add your own validation:
// User.js
var User = {
  types: {
    myname: function (json) {
      if (typeof json.fname == "string" && typeof json.lname == "string") {
        return true;
      } else {
        return false;
      }
    }
  },
  attributes: {
    name: {
      type: "json",
      myname: true
    },
    age: 'integer'
  }
};

module.exports = User;
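As a quick usage sketch of that custom validation (classic Sails 0.x Waterline API; the sample values are made up):

User.create({
  name: { fname: 'Jane', lname: 'Doe' },
  age: 30
}).exec(function (err, user) {
  if (err) {
    // a name without string fname/lname fails the myname rule
    // and surfaces here as a validation error
    return console.error(err);
  }
  console.log('created user', user.id);
});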

Symfony - how to add embed Forms?

I am trying to create a Houses form and embed the Images forms into it. I have followed the tutorial http://www.symfony-project.org/more-with-symfony/1_4/en/06-Advanced-Forms.
I have the following schema:
houses:
  actAs: { Timestampable: ~ }
  columns:
    name: { type: string(255), notnull: true }
    description: { type: string(5000), notnull: true }

images:
  actAs: { Timestampable: ~ }
  columns:
    url: { type: string(255), notnull: true }
    id_house: { type: integer, notnull: true }
  relations:
    houses: { local: id_house, foreign: id, foreignAlias: HousesImg }
and the code:
//lib/form/doctrine/ImagesCollectionForm
class ImagesCollectionForm extends sfForm
{
    public function configure()
    {
        if (!$house = $this->getOption('house'))
        {
            throw new InvalidArgumentException('You must provide an house');
        }

        for ($i = 0; $i < $this->getOption('size', 2); $i++)
        {
            $images = new images();
            $images->house = $house;

            $form = new imagesForm($images);
            $this->embedForm($i, $form);
        }
    }
}

//lib/form/doctrine/housesForm.class.php
public function configure()
{
    $form = new ImagesCollectionForm(null, array('house' => $this->getObject(), 'size' => 2));
    $this->embedForm('images', $form);
}
The fields are displayed as expected, but when I press the save button I get a blank page and the data isn't saved in the database.
You have not specified an alias in the images relation, so by default Symfony looks it up by the relation name.
You need to change $images->house = $house; to $images->houses = $house;
or you can set an alias in the relation.
Hope this will help.