Is there a way to check for errors when inserting two related rows before saving to the database? (PostgreSQL)

I'm currently making a job/task list as a test project to learn web development (REST API, Express.js, PostgreSQL 13 via node-postgres).
The structure is that the user can assign employees to various tasks and jobs.
1 job can have multiple tasks
1 task can have multiple employees
Employees cannot have more than 1 task in the same job, but can be assigned another task in a different job.
The flow in the UI is a modal that lets you fill out the job details (name), add tasks, and then assign employees to each task.
What I currently have is two endpoints that are called when the "Create" button is clicked: one creates the job, and the second creates the tasks and assigns the employees to them, since I need to attach the job_id to each task and cannot do that until the job has been generated/created. The problem is that if inserting a task fails, the job has already been created and saved in the database while the task and employee assignments have not, which causes a conflict the next time the "Create" button is clicked.
What I want to do (if possible) is to create a single query that does both the creation of the job and the insertion of the initial tasks and employee assignments. The query should not save the job in the database if any error occurs during the transaction (e.g. a task fails to insert because of a wrong data type or a failed constraint).
DB Fiddle for the schema: https://www.db-fiddle.com/f/izPsVVxPZ8e9ZMPwbL9her/10
These are my 2 routes:
//Create Job
const { job_name } = req.body;
const job = await pool.query(
    `SELECT * FROM jobs WHERE job_name = $1`,
    [job_name]
);
if (job.rows.length !== 0) {
    return res.status(401).send("Job already exists.");
}
const newJob = await pool.query(
    `INSERT INTO jobs (job_name) VALUES ($1) RETURNING *`,
    [job_name]
);
res.json({ "message": "Job created successfully!" });
//Assign Task
const { employee_id } = req.body;
const checkTask = await pool.query(
    `SELECT * FROM tasks WHERE job_id = $1 AND employee_id = $2`,
    [req.params.id, employee_id]
);
if (checkTask.rows.length !== 0) {
    return res.status(401).send("Employee already assigned to a task in the same job.");
}
const newTaskAssignment = await pool.query(
    `INSERT INTO tasks (job_id, employee_id) VALUES ($1, $2) RETURNING *`,
    [req.params.id, employee_id]
);
res.json({ "message": "Task added to job successfully!" });
Also, if possible, how can I bulk insert tasks/employee assignments through the API POST route? I read that it involves passing an array, but I haven't delved into it yet. Any advice or resources to read would be great (I'm currently reading the documentation and Stack Overflow topics).
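From what I've read so far, the idea would be something like this (just a sketch I haven't tried yet, assuming a tasks table with job_id and employee_id columns and that the employee ids arrive as an array in the request body):
//Bulk assign employees to a job (sketch, not tested)
const { employee_ids } = req.body; // e.g. [3, 5, 8]
const bulkAssign = await pool.query(
    `INSERT INTO tasks (job_id, employee_id)
     SELECT $1, unnest($2::int[])
     RETURNING *`,
    [req.params.id, employee_ids]
);
res.json(bulkAssign.rows);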
Thank you in advance for your help!

UPDATE: I managed to do it by following the tutorial from kb.objectrocket.com
It involves using transactions (which I only learned about last night, and they are really awesome!). This is the code that solved my problem:
//2. Declare an asynchronous function for the PG transaction
async function execute() {
  // Get a client from the pg Pool
  const client = await pool
    .connect()
    .catch(err => {
      console.log("\nclient.connect():", err.name);
      // iterate over the error object attributes
      for (const item in err) {
        if (err[item] !== undefined) {
          process.stdout.write(item + " - " + err[item] + " ");
        }
      }
      // end the process since no client could be acquired
      console.log("\n");
      process.exit();
    });
  try {
    // Initiate the Postgres transaction
    await client.query("BEGIN");
    try {
      // Insert the job and its first task in a single statement:
      // the CTE returns the new job id so the task insert can reference it
      const sqlString = `WITH inserted AS (
        INSERT INTO jobs (job_name) VALUES ($1) RETURNING id)
        INSERT INTO tasks (employee_id, job_id) VALUES
        ($2, (SELECT id FROM inserted))`;
      const sqlValues = [job_name, employee_id];
      // Pass SQL string to the query() method
      const result = await client.query(sqlString, sqlValues);
      await client.query("COMMIT");
      console.log("client.query() COMMIT row count:", result.rowCount);
    } catch (er) {
      // Rollback so the job is not saved if any insert fails
      await client.query("ROLLBACK");
      console.log("client.query():", er);
      console.log("Transaction ROLLBACK called");
    }
  } finally {
    client.release();
    console.log("Client is released");
  }
}
await execute();
res.json({ "message": "Service job created successfully!" });
} catch (err) {
  console.error(err.message);
  res.status(500).send("Server Error");
}
});
Thank you!

Related

How to merge two leads using an apex Trigger

I'm new to Salesforce and I'm trying to learn more. Currently I'm stuck at a point where I don't know what to do next. Kindly point me in the right direction; any help is appreciated.
What I'm trying to do is compare last names to find duplicates when a record is being created. If a duplicate is found, then instead of creating it as a new record it should be merged with the existing record.
So to achieve the task I have wrote the following trigger handler:
public class LeadTriggerHandler {
public static void duplicateMerge(){
List<Lead> leadList = [SELECT Id,Name, Email, Phone, FirstName, LastName FROM Lead];
List<Lead> leadTrigger = Trigger.new;
for(Lead leadVarTrigger : leadTrigger){
for(Lead leadVar : leadList){
//System.debug(leadVar.LastName + '==' + leadVarTrigger.LastName);
if(leadVarTrigger.LastName == leadVar.LastName)
{
//System.debug(leadVar.LastName + '==' + leadVarTrigger.LastName);
//leadVarTrigger.addError('This is a duplicate record');
Database.merge(leadVar, leadVarTrigger);
System.debug('Trigger Successful');
}
}
}
}
}
the following is my trigger:
trigger LeadTrigger on Lead (after insert) {
if(Trigger.isafter && Trigger.isInsert)
{
LeadTriggerHandler.duplicateMerge();
}
}
And when I try with after insert I get the following error:
LeadTrigger: execution of AfterInsert caused by: System.DmlException: Merge failed. First exception on row 0 with id 00Q5j00000ENUGVEA5; first error: INVALID_FIELD_FOR_INSERT_UPDATE, Unable to create/update fields: Name. Please check the security settings of this field and verify that it is read/write for your profile or permission set.: [Name] Class.LeadTriggerHandler.duplicateMerge: line 18, column 1 Trigger.LeadTrigger: line 5, column 1
And if I try with a before trigger I get the following error for the same code:
LeadTrigger: execution of BeforeInsert caused by: System.StringException: Invalid id at index 0: null External entry point Trigger.LeadTrigger: line 5, column 1
Actually, according to your code, you are allowing the record to be created and saved to the database by using after insert. Your before insert failed because your handler class references an Id; with before logic the record isn't saved to the database yet, so it doesn't have an Id. With that being said, let's try the following. :)
The Trigger (Best practice is to have one trigger with all events):
trigger TestTrigger on Lead (before insert, before update, before delete, after insert, after update, after delete, after undelete) {
if(Trigger.isafter && Trigger.isInsert)
{
//Can't conduct DML operations with trigger.new or trigger.old
//So we will create a set and send this to our handler class
Set<Id> leadIds = Trigger.newMap.keySet();
LeadTriggerHandler.duplicateMerge(leadIds);
}
}
The Handler Class:
public class LeadTriggerHandler {
public static void duplicateMerge(Set<Id> idsFromTrigger){
//Querying the database for the records created during the trigger
List<Lead> leadTrigger = [SELECT Id, LastName FROM Lead WHERE Id IN: idsFromTrigger];
List<String> lastNames = new List<String>();
//This set is important as it prevents duplicates in our dml call later on
Set<Lead> deDupedLeads = new Set<Lead>();
List<Lead> leadsToDelete = new List<Lead>();
for (Lead l : leadTrigger){
//getting all of the Last Names of the records from the trigger
lastNames.add(l.lastName);
}
//We are querying the database for records that have the same last name as
//the records that were created during our trigger
List<Lead> leadList = [SELECT Id, Name, Email, Phone, FirstName, LastName FROM Lead WHERE LastName IN: lastNames];
for(Lead leadInTrigger : leadTrigger){
for(Lead leadInList : leadList){
if(leadInTrigger.LastName == leadInList.LastName){
//if the lead from the trigger has the same last name as a lead that
//already exists, add it to our set
deDupedLeads.add(leadInTrigger);
}
}
}
//add all duplicate leads from our set to our list and delete them
leadsToDelete.addAll(deDupedLeads);
delete leadsToDelete;
}
}
This handler has been bulkified in two ways: the DML operation was moved out of the loop, and the code can handle a scenario where someone mass-inserts thousands of leads at a time. Plus, rather than querying every lead record in your database, we only query records that have the same last name as the records created during the insert operation. We advise matching on something more unique than LastName, such as Email or Phone, since many people/leads can share the same last name. Hope this helps and have a blessed one.

pgpromise insert return message failed, but successfully inserted into table

pg-promise successfully inserted a row into the table, but the inserted record is not being returned.
If I don't use RETURNING then everything is fine; if I do use it, an error occurs.
const createEMP = async (req, res, next) => {
    try {
        let user = await db.none("insert into emp (name, salary, joindate) " +
            "values (${name}, ${salary}, ${joindate}) returning *", req.body)
        res.status(200).json({
            user,
            "message": "new user created"
        })
    }
    catch(error) {next(error)}
}
In Postman, it shows a very long error page. It says: QueryResultError: No return data was expected.
Changing from db.none to db.one works.
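For example (a minimal sketch based on the code above, assuming the same emp table and request body):
const createEMP = async (req, res, next) => {
    try {
        // db.one expects exactly one row back, which matches INSERT ... RETURNING *
        const user = await db.one("insert into emp (name, salary, joindate) " +
            "values (${name}, ${salary}, ${joindate}) returning *", req.body);
        res.status(200).json({ user, "message": "new user created" });
    } catch (error) {
        next(error);
    }
};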

How do I skip duplicate documents in a bulk insert and ignore duplicates with a specific field c#

I need to insert many documents and ignore the duplicated docs.
Doc format:
_id:5b84e2588aceda018a974450
Name:"Jeff M"
Email:"jeff.m#xtrastaff.com"
Type:"Client"
UserId:Binary('Rw+KMGpSAECQ3gwCtfoKUg==')
UserImage:null
I want to check for duplicates using the Email field when inserting, and insert a document only if it does not already exist.
To prevent the duplicates being inserted you need a unique index which can be created in C# code:
public void CreateIndex()
{
var options = new CreateIndexOptions() { Unique = true };
var field = new StringFieldDefinition<Model>(nameof(Model.Email));
var indexDefinition = new IndexKeysDefinitionBuilder<Model>().Ascending(field);
Collection.Indexes.CreateOne(indexDefinition, options);
}
Then you can insert multiple documents using a BulkWrite operation. The problem is that by default processing stops when the first insert operation fails (which happens when you try to insert a duplicate) and you'll get an exception in C#. You can change that by setting the ordered parameter to false, which means all the inserts will be processed "in parallel" and you'll get one exception that aggregates all the failed inserts. That exception is of type MongoBulkWriteException and you can catch it. So you can try the following method:
public void InsertData(List<Model> data)
{
var writeOps = data.Select(x => new InsertOneModel<Model>(x));
try
{
Collection.BulkWrite(writeOps, new BulkWriteOptions() { IsOrdered = false });
}
catch (MongoBulkWriteException ex)
{
// will be thrown when there were any duplicates
}
}

How to get auto Id after upsert on a persisted model in loopback?

I have some models generated from a PostgreSQL db using loopback-connector-postgresql. The Id column of these models is an auto-incremented integer column in the PostgreSQL db.
1) I have a remote method added on one of the persisted models, where I perform a simple update or insert (upsert):
Car.CreateOrUpdateCar = function (carobj, req) {
Car.upsert(carobj, function (err, Car) {
if (err)
console.log(err);
else {
req(err, Car);
}
});
};
2) have added a remote hook to execute after this remote method.
Car.afterRemote('CreateOrUpdateCar', function (context, remoteMethodOutput, next) {
//Remaining code goes here
next();
});
3) I want to use Id of newly inserted row in step (1), in the remote hook mentioned in step (2)
I don't know much about PostgreSQL, but try it like this:
var carObj;
Car.CreateOrUpdateCar = function (carobj, req) {
Car.upsert(carobj, function (err, Car) {
if (err)
console.log(err);
else {
req(err, Car); // Your Car object contains final result after upserting along with Id
carObj = Car;
}
});
};
Now you can get the id using carObj.id and you can use it wherever you want. I hope this helps.
You can access the generated id in the remote hook like this:
Car.afterRemote('CreateOrUpdateCar', function (context, remoteMethodOutput, next) {
var genId = remoteMethodOutput.id || context.result.id;
next();
});

Create an object from multiple database collections (SailsJS, MongoDB, WaterlineJS)

I'm very new to Sails and noSQL databases and I'm having trouble gathering information together from different collections. Basically I need to gather an object of items from one collection and then use a foreign key stored in that collection to add data from a separate collection so the whole thing can be sent as one object.
Currently I find all the items in a collection called Artwork, then use a for loop to iterate through the artworks. I need to use an id stored in each Artwork to query a collection called Contact, but having successfully found the contact, I am unable to pass it back out of the callback to add it to the Artwork object.
find: function ( req, res, next ) {
Artwork.find().done( function ( err, artwork ) {
// Error handling
if (err) {
return console.log(err);
} else {
for ( x in artwork ) {
var y = artwork[x]['artistID'];
// Get the artsists name
Contact.find(y).done( function( err, contact ) {
// Error handling
if ( err ) {
return console.log(err);
// The Artist was found successfully!
} else {
var artist = contact[0]['fullName'];
}
});
artwork[x]['artistsName'] = artist;
}
res.send(artwork);
}
});
}
The result of the above code is an error telling me that 'artist' is undefined. The variable is not being passed outside the callback?
Any advice gratefully received.
Sails is about to release an update that will include associations. In the meantime, here's an answer for how you can accomplish it using async. https://stackoverflow.com/a/20050821/1262998
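As a rough sketch of that approach (assuming the async library is required at the top of the controller and the Artwork/Contact models from the question):
// assumes: var async = require('async');
find: function (req, res, next) {
    Artwork.find().done(function (err, artworks) {
        if (err) return console.log(err);
        // look up each artist's name; async.map waits until every lookup has finished
        async.map(artworks, function (artwork, cb) {
            Contact.find(artwork.artistID).done(function (err, contacts) {
                if (err) return cb(err);
                artwork.artistsName = contacts[0] ? contacts[0].fullName : null;
                cb(null, artwork);
            });
        }, function (err, results) {
            if (err) return console.log(err);
            // only now is it safe to send everything as one object
            res.send(results);
        });
    });
}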