pgpromise insert return message failed, but successfully inserted into table - pg-promise

pg-promise successfully inserts a row into the table, but it does not return the inserted record.
If I don't use RETURNING, everything works fine. When I use it, an error occurs.
const createEMP = async (req, res, next) => {
  try {
    let user = await db.none(
      "insert into emp (name, salary, joindate) " +
      "values (${name}, ${salary}, ${joindate}) returning *",
      req.body
    );
    res.status(200).json({
      user,
      message: "new user created"
    });
  } catch (error) {
    next(error);
  }
};
In Postman, it shows a very long page of errors, starting with: QueryResultError: No return data was expected.

Changing db.none to db.one works.
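A sketch of why this works (the stub db object below stands in for a real pg-promise connection, so the snippet is self-contained): db.none rejects whenever the query returns rows, which INSERT ... RETURNING * always does, while db.one expects exactly one row back.

```javascript
// Stub standing in for a pg-promise database object (illustration only):
// one() resolves with the single row the INSERT ... RETURNING * would produce.
const db = {
  one: async (query, values) => ({ id: 1, ...values }),
};

const createEmp = async (body) => {
  // RETURNING * makes Postgres send the inserted row back, so the query
  // returns data; db.none would reject here with QueryResultError.
  return db.one(
    "insert into emp (name, salary, joindate) " +
      "values (${name}, ${salary}, ${joindate}) returning *",
    body
  );
};

createEmp({ name: "Ann", salary: 1000, joindate: "2020-01-01" })
  .then((user) => console.log(user)); // logs { id: 1, name: "Ann", ... }
```

The other pg-promise result methods follow the same rule: db.any accepts any number of rows, db.many requires one or more, and db.oneOrNone allows zero or one.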

Related

Add additional user information on signup with Supabase

I am trying to add additional data to a user record in Supabase. I created a trigger that runs after a record is inserted into auth.users and should add the user id and username to the profiles table. The profile row is created, but the username is still null in the profiles table. The data is supposed to go into the raw_user_meta_data column, but it never appears there.
Trigger function:
BEGIN
INSERT INTO public.profiles(id, username)
VALUES (
NEW.id,
NEW.raw_user_meta_data -> 'username'
);
RETURN NEW;
END;
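For context, here is a sketch of the full definition such a trigger body usually lives in (the function and trigger names are illustrative, not from the original post). Note the use of ->> rather than ->: ->> extracts the value as text, which is what a varchar/text username column needs, while -> yields jsonb.

```sql
-- Illustrative names; adjust to your project.
CREATE OR REPLACE FUNCTION public.handle_new_user()
RETURNS trigger
LANGUAGE plpgsql
SECURITY DEFINER
AS $$
BEGIN
  INSERT INTO public.profiles (id, username)
  VALUES (NEW.id, NEW.raw_user_meta_data ->> 'username');  -- ->> returns text
  RETURN NEW;
END;
$$;

CREATE TRIGGER on_auth_user_created
AFTER INSERT ON auth.users
FOR EACH ROW EXECUTE FUNCTION public.handle_new_user();
```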
Front:
const createNewUser = async () => {
  const { username, email, password } = credentials;
  await supabase.auth.signUp({
    email: email,
    password: password,
    data: {
      username: 'hello'
    }
  });
};
Just to follow up: maybe this is a change with Supabase v2 and the supabase-js update, but it now seems to work with one argument, slightly different from the first post. Here is the link:
https://supabase.com/docs/reference/javascript/auth-signup#sign-up-with-additional-user-metadata
and the code:
const { data, error } = await supabase.auth.signUp({
  email: userEmail.value,
  password: password,
  options: {
    data: {
      user_name: userName.value,
    },
  },
});
I found the solution reading over there, in case it helps anyone. My mistake was that I passed only one argument to the signUp function, which included the additional data used by the trigger. However, this signUp function must be passed two objects as arguments, and it is the second argument that carries the data object with the username from the front end, which is saved in the raw_user_meta_data column.
Some additional tips that can help in the search for the error: you can check what the authentication process logs print, and you can insert a record directly into auth.users to see why it does not add the additional data. Here is an example:
insert into auth.users(id, email, encrypted_password, raw_user_meta_data)
values (
  'eb23060d-71ea-4112-a1c7-203d6f87fa2d',
  'example@mail.com',
  '$2y$10$WpAA2remsZRnZivgRaM9L.1BcjvAtUa966AICxv1RGte68BICsZkS',
  '{"username": "user_example"}'
);
Final solution:
const { user, session, error } = await supabase.auth.signUp(
  {
    email: 'example@email.com',
    password: 'example-password',
  },
  {
    data: {
      username: 'John' // or a variable from your form
    }
  }
);

Is there a way to check for errors from inserting two related rows before saving to the database?

I'm currently making a job/task list as a test project to learn web development (REST API, Express.js, PostgreSQL 13 via node-postgres).
The structure is that the user can add employees in various tasks and jobs.
1 job can have multiple tasks
1 task can have multiple employees
Employees cannot have more than 1 task in the same job, but can be assigned another task in a different job.
The flow in the UI is a modal that allows you to fill out the job details (name), add tasks, then assign the employees on the said task.
What I currently have is two endpoints called when clicking a "Create" button: one creates the job, and the second creates and assigns the employees to the tasks, since I need to attach the job_id to the task, which I cannot do until it has been generated. The problem is that if inserting a task fails, the job has already been created and saved in the database while the task and employee assignments were not, causing a conflict the next time the "Create" button is clicked in the UI.
What I want to do (if possible) is to create a single query that will do both the creation and insertion of the job and initial tasks and employee assignments. The query will then not save the job in the database if there are errors that occurred during the entire transaction (e.g. a task failed to be inserted due to failing a condition such as wrong data type, etc.).
DB Fiddle for the schema: https://www.db-fiddle.com/f/izPsVVxPZ8e9ZMPwbL9her/10
These are my 2 routes:
//Create Job
const {
  job_name
} = req.body;
const job = await pool.query(
  `SELECT * FROM jobs WHERE job_name = $1`,
  [job_name]
);
if (job.rows.length !== 0) {
  return res.status(401).send("Job already exists.");
}
const newJob = await pool.query(
  `INSERT INTO jobs (job_name) VALUES ($1) RETURNING *`,
  [job_name]
);
res.json({ "message": "Job created successfully!" });
//Assign Task
const {
  job_id,
  employee_id
} = req.body;
const checkTask = await pool.query(
  `SELECT * FROM treatments WHERE job_id = $1 AND employee_id = $2`,
  [req.params.id, employee_id]
);
if (checkTask.rows.length !== 0) {
  return res.status(401).send("Technician already assigned in the same treatment and schedule.");
}
const newTaskAssignment = await pool.query(
  `INSERT INTO treatments (job_id, employee_id) VALUES ($1, $2) RETURNING *`,
  [req.params.id, employee_id]
);
res.json({ "message": "Task added to job successfully!" });
Also, if possible, how can I bulk-insert tasks/employee assignments through the API POST route? I read that it involves passing an array, but I haven't delved into it yet. Any advice or resources to read would be great (I'm currently reading documentation and Stack Overflow topics).
Thank you in advance for your help!
UPDATE: I managed to do it by following the tutorial from kb.objectrocket.com.
It involves using transactions (which I just learned last night, and they are really awesome!). This is the code that solved my problem:
//2. Declare an asynchronous function for the PG transaction
async function execute() {
// Promise chain for pg Pool client
const client = await pool
.connect()
.catch(err => {
console.log("\nclient.connect():", err.name);
      // iterate over the error object attributes
      for (const item in err) {
        if (err[item] !== undefined) {
          process.stdout.write(item + " - " + err[item] + " ");
        }
      }
//end the Pool instance
console.log("\n");
process.exit();
});
try {
//Initiate the Postgres transaction
await client.query("BEGIN");
try {
      const sqlString = `WITH inserted AS (
        INSERT INTO jobs (job_name) VALUES ($1)
        RETURNING id)
      INSERT INTO tasks (employee_id, job_id)
      VALUES ($2, (SELECT id FROM inserted))`;
      const sqlValues = [job_name, employee_id];
      // Pass the SQL string to query(); await the promise instead of
      // mixing a callback with await
      const result = await client.query(sqlString, sqlValues);
      await client.query("COMMIT");
      console.log("client.query() COMMIT row count:", result.rowCount);
    } catch (er) {
      // Rollback before executing another transaction
      await client.query("ROLLBACK");
      console.log("client.query():", er);
      console.log("Transaction ROLLBACK called");
    }
} finally {
client.release();
console.log("Client is released");
}
}
execute();
res.json({ "message": "Service job created successfully!" });
} catch (err) {
console.error(err.message);
res.status(500).send("Server Error");
}
});
Thank you!
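On the bulk-insert question above: one common pattern with node-postgres is to expand an array of rows into a single parameterized multi-row INSERT. The helper below is a hypothetical sketch (not part of pg itself); Postgres's unnest() over array parameters is another option.

```javascript
// Hypothetical helper (not part of pg): expands rows into one parameterized
// multi-row INSERT. Only values are parameterized, so the table and column
// names must come from trusted code, never from user input.
function buildBulkInsert(table, columns, rows) {
  const params = [];
  const tuples = rows.map((row) => {
    const placeholders = row.map((value) => {
      params.push(value);
      return "$" + params.length; // $1, $2, ... in insertion order
    });
    return "(" + placeholders.join(", ") + ")";
  });
  const text =
    `INSERT INTO ${table} (${columns.join(", ")}) ` +
    `VALUES ${tuples.join(", ")}`;
  return { text, params };
}

// Example: assign three employees to the same job in one statement
const { text, params } = buildBulkInsert(
  "tasks",
  ["job_id", "employee_id"],
  [[7, 101], [7, 102], [7, 103]]
);
// text   → INSERT INTO tasks (job_id, employee_id) VALUES ($1, $2), ($3, $4), ($5, $6)
// params → [7, 101, 7, 102, 7, 103]
```

You would then run `await client.query(text, params);` inside the same transaction as the job insert, so a failure in any row rolls everything back.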

Simple knex query returning table name only

I am trying to use knex to retrieve some database values, but no matter the configuration I use I either get a 500 error code or a pending network request. To keep it as simple as possible and get something at least working, I have written the following:
export default () => (async (req, res, knex) => {
const temp = knex('vouchers').select();
console.log(temp);
res.response(201).end();
});
Which should, by my understanding, go into my vouchers table and retrieve everything. I only end up getting a 500 error and a console log of 'vouchers', i.e. my table name...
Try awaiting the query (a knex query builder does not execute until it is awaited or .then() is called on it):
export default () => (async (req, res, knex) => {
const temp = await knex('vouchers');
console.log(temp);
res.send(JSON.stringify(temp,null,2));
});
My solution was basically restructuring the start of the query:
export default function ({ pool, queue, knex }) {
return async (req, res, next) => {
try {
const { eventId, voucher } = req.params;
with special attention to the async function parameters.

How to get the auto-generated id after an upsert on a persisted model in LoopBack?

I have some models generated from a PostgreSQL DB using loopback-connector-postgresql. The id column of these models is an auto-incremented integer column in PostgreSQL.
1) I have a remote method added on one of the persisted models, where I perform a simple update or insert (upsert).
Car.CreateOrUpdateCar = function (carobj, req) {
  Car.upsert(carobj, function (err, Car) {
    if (err)
      console.log(err);
    else {
      req(err, Car);
    }
  });
};
2) I have added a remote hook to execute after this remote method.
Car.afterRemote('CreateOrUpdateCar', function (context, remoteMethodOutput, next) {
//Remaining code goes here
next();
});
3) I want to use the id of the newly inserted row from step (1) in the remote hook mentioned in step (2).
I don't have much idea about PostgreSQL, but try it like this:
var carObj;
Car.CreateOrUpdateCar = function (carobj, req) {
  Car.upsert(carobj, function (err, Car) {
    if (err)
      console.log(err);
    else {
      req(err, Car); // Your Car object contains the final result after upserting, including the id
      carObj = Car;
    }
  });
};
Now you can get the id using carObj.id and use it wherever you want. I hope this helps.
You can access the generated id in the remote hook like this:
Car.afterRemote('CreateOrUpdateCar', function (context, remoteMethodOutput, next) {
var genId = remoteMethodOutput.id || context.result.id;
next();
});

Insert with id of type serial in pg-promise

In pg-promise, how can I insert data when the primary key is of type serial? Omitting the id field produces no response from the call.
The code below produces no error in the catch (and also does not execute the then branch).
function postSecao(req, res) {
  var data = req.body;
  var db = pgp(cn);
  db.none("insert into public.secoes(nome) values($1)", [data.nome])
    .then(function () {
      pgp.end();
      return res.status(201).end();
    })
    .catch(function (error) {
      console.log(error);
      pgp.end();
      return res.status(500).end();
    });
}
The table:
CREATE TABLE public.secoes
(
id bigint NOT NULL DEFAULT nextval('secoes_id_seq'::regclass),
nome character varying(100),
CONSTRAINT id PRIMARY KEY (id)
)
Manually providing the id works without problem.
function postSecao(req, res) {
  var data = req.body;
  var db = pgp(cn);
  db.none("insert into public.secoes(id, nome) values($1, $2)", [data.id, data.nome])
    .then(function () {
      pgp.end();
      return res.status(201).end();
    })
    .catch(function (error) {
      console.log(error);
      pgp.end();
      return res.status(500).end();
    });
}
And of course the SQL runs fine in PGAdmin.
insert into public.secoes(nome) values('test')
I just figured it out. The problem was in the user's privilege settings: the user accessing the DB must have privileges on the sequences used by the serial field.
So it is not a pg-promise problem.
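For reference, granting the missing sequence privileges looks like this (the role name app_user is hypothetical; use the role your connection string logs in as):

```sql
-- Let the application role read and advance the sequence behind the serial id
GRANT USAGE, SELECT ON SEQUENCE secoes_id_seq TO app_user;

-- Or cover every sequence in the schema at once
GRANT USAGE, SELECT ON ALL SEQUENCES IN SCHEMA public TO app_user;
```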