Bulk insert/update from a JSON param in PostgreSQL using ON CONFLICT (Id), where Id must be looked up inside the function because it is not included in the param

Table definitions:
CREATE TABLE public."FeatureToggles"
(
"Id" integer NOT NULL GENERATED BY DEFAULT AS IDENTITY ( INCREMENT 1 START 1 MINVALUE 1 MAXVALUE 2147483647 CACHE 1 ),
"IsDeleted" boolean NOT NULL,
"IsImported" boolean NOT NULL,
"TextProp" character varying(35),
CONSTRAINT "PK_FeatureToggles" PRIMARY KEY ("Id")
)
CREATE TABLE public."Additions"
(
"Id" integer NOT NULL GENERATED BY DEFAULT AS IDENTITY ( INCREMENT 1 START 1 MINVALUE 1 MAXVALUE 2147483647 CACHE 1 ),
"FeatureToggleId" int NOT NULL,
"IsDeleted" boolean NOT NULL,
"Url" character varying(35) NULL,
CONSTRAINT "PK_FeatureToggles" PRIMARY KEY ("Id")
CONSTRAINT "FK_Additions_FeatureToggles_FeatureToggleId" FOREIGN KEY ("FeatureToggleId")
REFERENCES public."FeatureToggles" ("Id") MATCH SIMPLE
ON UPDATE NO ACTION
ON DELETE CASCADE,
)
Insert one record into table:
INSERT INTO public."FeatureToggles" ("IsDeleted", "TextProp", "IsImported") VALUES(false, 'X', true);
Function:
CREATE OR REPLACE FUNCTION testfunctionname(jsonparam json)
RETURNS void AS
$BODY$
INSERT INTO "FeatureToggles" ("Id", "IsDeleted", "IsImported", "TextProp")
SELECT (COALESCE(SELECT "Id" FROM "FeatureToggles" WHERE "TextProp" = (prop->>'TextProp')::character varying(35)), 0),
(prop->>'IsDeleted')::boolean,
true,
(prop->>'TextProp')::character varying(35)
json_array_elements(jsonparam) prop
ON CONFLICT ("Id") DO
UPDATE SET
"IsDeleted" = EXCLUDED."IsDeleted"
INSERT INTO "Additions" ("FeatureToggleId", "IsDeleted", "Url")
SELECT (SELECT "Id" FROM "FeatureToggles" WHERE "TextProp" = (prop->>'TextProp')::character varying(35)),
(prop->>'IsDeleted')::boolean,
(prop->>'Additions')::character varying(35)
json_array_elements(jsonparam) prop
DELETE FROM "FeatureToggles" WHERE "IsImported" = true AND "TextProp" IS NOT IN (SELECT DISTINCT (prop->>'TextProp')::character varying(35)
json_array_elements(jsonparam) prop)
$BODY$
LANGUAGE sql
Sample JSON:
[
{
"IsDeleted": true,
"TextProp": "X",
"Additions":
[
"Test1",
"Test2"
]
},
{
"IsDeleted": false,
"TextProp": "Y",
"Additions":
[
"Test3",
"Test4"
]
}
]
Calling the function with this JSON param should update the one and only row in the FeatureToggles table to IsDeleted = true, and insert a new row into the FeatureToggles table with Id 2, IsDeleted false and TextProp 'Y'. It should also insert all Additions given in the JSON param into the corresponding table with the correct foreign keys.
I ran into problems with populating the Id properties from the existing table and also inserting Additions into the other table.
It would be great if the function also deleted any row in FeatureToggles (and its corresponding Additions rows) that already exists in the table with IsImported = true but is not present in the JSON param.
Example if we change the insert script to:
INSERT INTO public."FeatureToggles" ("IsDeleted", "TextProp", "IsImported") VALUES(false, 'X', true);
INSERT INTO public."FeatureToggles" ("IsDeleted", "TextProp", "IsImported") VALUES(false, 'X222', true);
After calling the function with the same JSON param, the row with X222 should be deleted because it is marked as imported, but has no matching item (matched by TextProp property) within the new param list.
Any help would be much appreciated, as this function needs to handle tens of thousands of records per call.

You have several errors in your function (and in your DDL).
Most importantly, json_array_elements() is a set-returning function, so you need a FROM clause in order to generate multiple rows.
You also need to terminate each SQL statement in the function with a semicolon, and IS NOT IN is invalid - you need NOT IN.
So the function should be something like this:
CREATE OR REPLACE FUNCTION testfunctionname(jsonparam json)
RETURNS void AS
$BODY$
INSERT INTO "FeatureToggles" ("Id", "IsDeleted", "IsImported", "TextProp")
SELECT coalesce(ft."Id", 0),
(prop->>'IsDeleted')::boolean,
true,
prop->>'TextProp'
FROM json_array_elements(jsonparam) prop
LEFT JOIN "FeatureToggles" ft on ft."TextProp" = (prop->>'TextProp')
ON CONFLICT ("Id") DO
UPDATE SET
"IsDeleted" = EXCLUDED."IsDeleted";
INSERT INTO "Additions" ("FeatureToggleId", "IsDeleted", "Url")
SELECT ft."Id",
(prop->>'IsDeleted')::boolean,
addition -- each element of the 'Additions' array becomes one row
FROM json_array_elements(jsonparam) prop
CROSS JOIN json_array_elements_text(prop->'Additions') addition
JOIN "FeatureToggles" ft on ft."TextProp" = (prop->>'TextProp');
DELETE FROM "FeatureToggles"
WHERE "IsImported" = true
AND "TextProp" NOT IN (SELECT DISTINCT prop->>'TextProp'
FROM json_array_elements(jsonparam) prop);
$BODY$
LANGUAGE sql;
Note that ->> returns a text value, so there is no need to cast the result of those expressions when the target column is text or varchar.
I also changed the scalar sub-queries to JOINs. The first insert uses an outer join, which is what your current code tries to do, although I think that is wrong: if the join finds no match, the INSERT will try to create a row with "Id" = 0, bypassing the sequence generation. Using ON CONFLICT with an auto-generated ID rarely makes sense. Maybe you want a unique index on "TextProp" instead?
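If a unique index on "TextProp" is acceptable, the upsert could conflict on that column instead of the generated "Id". A minimal sketch - the index name and the reduced column list are my assumptions, not part of the original function:

```sql
-- Hypothetical alternative: make "TextProp" unique and conflict on it,
-- letting the identity column generate "Id" on its own.
CREATE UNIQUE INDEX IF NOT EXISTS "UX_FeatureToggles_TextProp"
    ON "FeatureToggles" ("TextProp");

INSERT INTO "FeatureToggles" ("IsDeleted", "IsImported", "TextProp")
SELECT (prop->>'IsDeleted')::boolean,
       true,
       prop->>'TextProp'
FROM json_array_elements(jsonparam) prop
ON CONFLICT ("TextProp") DO UPDATE
    SET "IsDeleted" = EXCLUDED."IsDeleted";
```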
I would probably implement that as a procedure rather than a function though.
Online example

Related

HSQL Triggers : user lacks privilege or object not found: NEWROW.ID

I am trying to implement triggers in HSQLDB:
I have one table called component, and on an update to that table I want to log the change into another table using an after-insert trigger, for which I am doing
CREATE TABLE IF NOT EXISTS "component"(
"id" INTEGER IDENTITY,
"name" VARCHAR(100),
"configuration" LONGVARCHAR,
"owner_id" INTEGER );
CREATE TABLE IF NOT EXISTS "component_audit"(
"id" INTEGER IDENTITY,
"component_id" INTEGER ,
"action" VARCHAR(20),
"activity_time" BIGINT,
"user_id" INTEGER,
FOREIGN KEY ("component_id") REFERENCES "component"("id") ON UPDATE RESTRICT ON DELETE CASCADE
);
CREATE TRIGGER trig AFTER INSERT ON "component"
REFERENCING NEW ROW AS newrow
FOR EACH ROW
INSERT INTO "component_audit" ("id","component_id","action","activity_time","user_id")
VALUES (DEFAULT, 1, newrow.id, 123, 1);
On running HSQL throws error
Caused by: org.hsqldb.HsqlException: user lacks privilege or object
not found: NEWROW.ID
It's due to my id column being quoted as "id", because I needed it in lower case (by default HSQLDB folds identifiers to upper case).
How do I make the variable substitution work?
Just use the same naming as in your CREATE TABLE statement.
CREATE TRIGGER trig AFTER INSERT ON "component"
REFERENCING NEW ROW AS newrow
FOR EACH ROW
INSERT INTO "component_audit" ("id","component_id","action","activity_time","user_id")
VALUES (DEFAULT, 1, newrow."id", 123, 1);
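The underlying rule, sketched with hypothetical statements: HSQLDB folds unquoted identifiers to upper case before lookup, while quoted identifiers are matched exactly as written.

```sql
-- Unquoted identifiers are folded to upper case before lookup:
SELECT ID FROM COMPONENT;      -- resolves against "ID" / "COMPONENT"
-- Quoted identifiers are matched case-sensitively:
SELECT "id" FROM "component";  -- matches the lower-case names in the DDL above
```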

PostgreSQL - How to insert into main table and detail table in one stored procedure?

so I'm working on a project which requires my query to insert into one main table and its detail table (which will be sent to the DB as a list) in one transaction, so that it rolls back if one of the inserts fails.
Let's say I have these tables:
CREATE TABLE transaction(
id BIGSERIAL PRIMARY KEY NOT NULL,
user_id BIGINT FOREIGN KEY NOT NULL,
total_item INT NOT NULL DEFAULT 0,
total_purchase BIGINT NOT NULL DEFAULT 0
)
CREATE TABLE transaction_detail(
id BIGSERIAL PRIMARY KEY NOT NULL,
transaction_id BIGINT FOREIGN KEY NOT NULL,
product_id BIGINT FOREIGN KEY NOT NULL,
product_price INT NOT NULL DEFAULT 0,
purchase_amount INT NOT NULL DEFAULT 0
)
And I have this function:
CREATE OR REPLACE FUNCTION create_transaction(order JSONB, product_list JSONB)
Function Parameters:
order : An object which will be inserted into the transaction table
product_list : List of Product object which will be inserted into the transaction_detail table
My current query looks something like this:
CREATE OR REPLACE FUNCTION insert_order(tx JSONB, product_list JSONB)
RETURNS BIGINT
AS $$
WITH result AS (
INSERT INTO transaction(
user_id,
total_item,
total_purchase,
) VALUES (
(tx ->> 'user_id') :: BIGINT,
(tx ->> 'total_item') :: INT,
(tx ->> 'total_purchase') :: INT,
)
RETURNING id AS transaction_id
)
FOR row IN product_list LOOP
INSERT INTO transaction_detail(
transaction_id,
product_id,
product_price,
purchase_amount,
) VALUES (
transaction_id,
(row ->> 'product_id') :: BIGINT,
(row ->> 'product_price') :: INT,
(row ->> 'purchase_amount') :: INT,
)
END LOOP;
$$ LANGUAGE SQL SECURITY DEFINER;
JSON files:
tx.json
{
"user_id" : "1",
"total_item" : "2",
"total_purchase" : "2000"
}
product_list.json
[
{
"product_id" : "1",
"product_price" : "500",
"purchase_amount" : "2"
},
{
"product_id" : "2",
"product_price" : "1000",
"purchase_amount" : "1"
}
]
I know something is wrong with my query, although I can't put my finger on it.
Any pointer is much appreciated.
Assuming that the data passed as product_list is an array, you can do something like this:
CREATE OR REPLACE FUNCTION insert_order(p_order JSONB, p_product_list JSONB)
RETURNS BIGINT
AS $$
WITH result AS (
INSERT INTO "transaction"(
user_id,
total_item,
total_purchase
) VALUES (
(p_order ->> 'user_id') :: BIGINT,
(p_order ->> 'total_item') :: INT,
(p_order ->> 'total_purchase') :: INT
)
RETURNING id AS transaction_id
), details as (
INSERT INTO transaction_detail(
transaction_id,
product_id,
product_price,
purchase_amount
)
select r.transaction_id,
(pl.data ->> 'product_id')::bigint,
(pl.data ->> 'product_price')::int,
(pl.data ->> 'purchase_amount')::int
from result r,
jsonb_array_elements(p_product_list) as pl(data)
)
select transaction_id
from result;
$$
LANGUAGE SQL SECURITY DEFINER;
I renamed the parameters to avoid name clashes with reserved keywords; prefixing the parameter names also avoids clashes with column or table names. order is a reserved keyword and can only be used when quoted, e.g. "order". transaction is a keyword, but not a reserved one, so quoting it is optional - better nonetheless.
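The difference between the two kinds of keywords can be sketched like this (scratch statements, not part of the answer's schema):

```sql
-- ORDER is fully reserved: it must be quoted to serve as an identifier.
CREATE TABLE "order" (id BIGINT PRIMARY KEY);    -- OK
-- CREATE TABLE order (id BIGINT);               -- syntax error
-- TRANSACTION is a non-reserved keyword: both quoted and unquoted
-- forms parse, but quoting keeps the intent unambiguous.
CREATE TABLE "transaction" (id BIGINT PRIMARY KEY);
```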
The insert into the transaction details needs to be an INSERT...SELECT that reads from the result CTE to get the generated transaction id and unnests the array elements of the product list JSON value.
The final select of the CTE then returns the generated transaction id.
You can call the function like this:
select insert_order('{"user_id": 42, "total_item": 1, "total_purchase": 100}'::jsonb,
'[ {"product_id": 1, "product_price": 10, "purchase_amount": 1},
{"product_id": 2, "product_price": 20, "purchase_amount": 2},
{"product_id": 3, "product_price": 30, "purchase_amount": 3} ]'::jsonb);

Partial update on a Postgres upsert violates a constraint

I want to be able to do a partial upsert in Postgres (9.5), but it seems that a partial upsert fails when not all of the constraints are fulfilled (such as a NOT NULL constraint).
Here is an example of the scenario and error
CREATE TABLE jobs (
id integer PRIMARY KEY,
employee_name TEXT NOT NULL,
address TEXT NOT NULL,
phone_number TEXT
);
CREATE OR REPLACE FUNCTION upsert_job(job JSONB)
RETURNS VOID AS $$
BEGIN
INSERT INTO jobs AS origin VALUES(
(job->>'id')::INTEGER,
job->>'employee_name'::TEXT,
job->>'address'::TEXT,
job->>'phone_number'::TEXT
) ON CONFLICT (id) DO UPDATE SET
employee_name = COALESCE(EXCLUDED.employee_name, origin.employee_name),
address = COALESCE(EXCLUDED.address, origin.address),
phone_number = COALESCE(EXCLUDED.phone_number, origin.phone_number);
END;
$$ LANGUAGE PLPGSQL SECURITY DEFINER;
--Full insert (OK)
SELECT upsert_job('{"id" : 1, "employee_name" : "AAA", "address" : "City, x street no.y", "phone_number" : "123456789"}'::jsonb);
--Partial update that fulfills constraint (Ok)
SELECT upsert_job('{"id" : 1, "employee_name" : "BBB", "address" : "City, x street no.y"}'::jsonb);
--Partial update that doesn't fulfill constraint (FAILS)
SELECT upsert_job('{"id" : 1, "phone_number" : "12345"}'::jsonb);
--ERROR: null value in column "employee_name" violates not-null constraint
--DETAIL: Failing row contains (1, null, null, 12345).
How do I approach this?
To think of it another way: what if the id didn't already exist? You can't insert just a phone number, as the row would have no name/address, but that's exactly what you are telling it to do. An upsert tries to insert first and only updates if the insert conflicts, so your insert never gets past the NOT NULL constraint check to find out whether the row already existed.
What you can do instead, if you want partial updates, is tell the function how to handle partial data that would violate the constraints. Something like this (NOT complete - it doesn't handle every partial-data scenario):
CREATE OR REPLACE FUNCTION upsert_job(job JSONB)
RETURNS VOID AS $$
BEGIN
IF (job->>'phone_number' IS NOT NULL
AND job->>'employee_name' IS NOT NULL
AND job->>'address' IS NOT NULL) THEN
INSERT INTO jobs AS origin VALUES(
(job->>'id')::INTEGER,
job->>'employee_name'::TEXT,
job->>'address'::TEXT,
job->>'phone_number'::TEXT
) ON CONFLICT (id) DO UPDATE SET
employee_name = COALESCE(EXCLUDED.employee_name, origin.employee_name),
address = COALESCE(EXCLUDED.address, origin.address),
phone_number = COALESCE(EXCLUDED.phone_number, origin.phone_number);
ELSIF (job->>'phone_number' IS NOT NULL AND (job->>'employee_name' IS NULL AND job->>'address' IS NULL)) THEN
UPDATE jobs SET phone_number=job->>'phone_number'::TEXT
WHERE id=(job->>'id')::INTEGER;
END IF;
END;
$$ LANGUAGE PLPGSQL SECURITY DEFINER;
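With this version, the call that previously failed takes the ELSIF branch and runs a plain UPDATE instead of an INSERT (assuming the row with id 1 already exists):

```sql
SELECT upsert_job('{"id" : 1, "phone_number" : "12345"}'::jsonb);
-- only phone_number is present, so the function updates it in place
-- rather than attempting an INSERT that would violate NOT NULL
```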

How to create a before insert trigger on SQL Server 2012 to make sure the data i'm adding doesn't already exists in the table?

The table design is:
[id] [int] IDENTITY(1,1) NOT NULL,
[column_one] [varchar](50) NOT NULL,
[column_two] [varchar](50) NOT NULL,
[column_three] [varchar](50) NOT NULL
I need to create a method to add data to the table and make sure the "couple" column_one, column_two is unique and not duplicated.
example :
id : 1, column_one : 'Stack', column_two : 'OverFlow', column_three :
'is great'
id : 2, column_one : 'Hello', column_two : 'World',
column_three : 'you good?'
id : 3, column_one : 'Help', column_two :
'me', column_three : 'please'
I have to make sure no user can add 'Stack'+'OverFlow' or 'Help'+'me', but they can enter 'Stack'+'me' or 'Help'+'OverFlow' if they want to.
I thought about creating a trigger (Before insert or instead of insert) but I don't know what to set as a condition.
CREATE TRIGGER VerifySomething
ON my_table
INSTEAD OF INSERT
AS
BEGIN
SET NOCOUNT ON
INSERT INTO my_table
do something
WHERE something something
END
GO
EDIT: I tried @TheGameiswar's solution and got this error:
"An explicit value for the identity column in table 'my_table' can only be specified when a column list is used and IDENTITY_INSERT is ON".
After some brainstorming, I decided to create a constraint on both column instead of creating a trigger on insert.
the final result looks like :
ALTER TABLE my_table
ADD CONSTRAINT CheckUnicity UNIQUE (column_one, column_two)
CREATE TRIGGER VerifySomething
ON my_table
INSTEAD OF INSERT
AS
BEGIN
insert into my_table
select * from inserted i
where not exists
(select 1 from my_table t where t.someuniquefield=i.someuniquefield)
END
GO
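Regarding the identity error mentioned in the edit above: listing the columns explicitly (and leaving the IDENTITY column out) avoids it. A sketch using the column names from the table design - treat it as a starting point, not a drop-in fix:

```sql
CREATE TRIGGER VerifySomething
ON my_table
INSTEAD OF INSERT
AS
BEGIN
    SET NOCOUNT ON
    -- Skip the IDENTITY column so SQL Server generates id itself,
    -- and silently drop rows whose (column_one, column_two) pair exists.
    INSERT INTO my_table (column_one, column_two, column_three)
    SELECT i.column_one, i.column_two, i.column_three
    FROM inserted i
    WHERE NOT EXISTS
        (SELECT 1 FROM my_table t
         WHERE t.column_one = i.column_one
           AND t.column_two = i.column_two)
END
GO
```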

Default ID with Korma and PostgreSQL?

I have the following schema:
CREATE TABLE IF NOT EXISTS art_pieces
(
-- Art Data
ID SERIAL PRIMARY KEY,
title TEXT NOT NULL,
description TEXT,
price INT NULL,
-- Relations
artists_id INT NULL
);
--;;
CREATE TABLE IF NOT EXISTS artists
(
-- Art Data
ID SERIAL PRIMARY KEY,
name TEXT NOT NULL
);
This is the corresponding art-piece entity:
(defentity art-pieces
(table :art_pieces)
(entity-fields
:id
:title
:description
:price
:artists_id)
(belongs-to artists))
I'm wondering why the following returns PSQLException ERROR: null value in column "id" violates not-null constraint:
(create-piece {:title "The Silence of the Lambda"
:description "Something something java beans and a nice chianti"
:price 5000})
Shouldn't the ID SERIAL PRIMARY KEY field populate automatically? Is this something to do with Korma's interaction with PSQL?
INSERT INTO "art_pieces" ("description", "id", "price", "title") VALUES (?, NULL, ?, ?)
The problem here is that you try to insert a NULL value into the id column. The default value is inserted only if you omit the column or use the DEFAULT keyword (instead of NULL).
To insert the next value of the sequence into the serial column, specify that the serial column should be assigned its default value. This can be done either by excluding the column from the list of columns in the INSERT statement, or through the use of the DEFAULT key word
PostgreSQL Serial Types
So you have to change the query to:
INSERT INTO "art_pieces" ("description", "id", "price", "title") VALUES (?, DEFAULT, ?, ?)
-- or
INSERT INTO "art_pieces" ("description", "price", "title") VALUES (?, ?, ?)
Another workaround (in case you cannot change the generated query) would be to add a trigger function that replaces a NULL value in the id column automatically:
CREATE OR REPLACE FUNCTION tf_art_pieces_bi() RETURNS trigger AS
$BODY$
BEGIN
-- if insert NULL value into "id" column
IF TG_OP = 'INSERT' AND new.id IS NULL THEN
-- set "id" to the next sequence value
new.id := nextval('art_pieces_id_seq');
END IF;
RETURN new;
END;
$BODY$
LANGUAGE plpgsql;
CREATE TRIGGER art_pieces_bi
BEFORE INSERT
ON art_pieces
FOR EACH ROW EXECUTE PROCEDURE tf_art_pieces_bi();
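For illustration, with the trigger installed, even the NULL-passing INSERT that Korma generates should now receive a sequence value (a sketch against the schema above):

```sql
INSERT INTO art_pieces (title, description, id, price)
VALUES ('The Silence of the Lambda',
        'Something something java beans and a nice chianti',
        NULL, 5000);
-- the BEFORE INSERT trigger replaces the NULL with
-- nextval('art_pieces_id_seq') before the row is stored
```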