Merge command DB2 AIX (Update ok, insert not ok)

I'm currently struggling with the MERGE command on DB2 9.7 on AIX.
All I want is a single statement that performs an UPDATE when a row with the checked primary key values already exists, and an INSERT when it does not.
That is what I expect the MERGE command to be used for:
select something
if it exists, then update
if it does not exist, then insert
The UPDATE part works (I can see it in Toad and, of course, in the database, e.g. the timestamp), but the INSERT does not.
No syntax error is shown, yet no insert is made; Toad reports "executed successfully".
Any help is appreciated.
The WHERE clause references the primary key columns of the table.
MERGE INTO DSPTMCP.KENNZAHL_DEFINITION as KD
USING(Select NR_MANDANT, SL_GRUPPE_KENNZAHL, SL_KENNZAHL
FROM DSPTMCP.KENNZAHL_DEFINITION
WHERE NR_MANDANT = 5472 AND SL_GRUPPE_KENNZAHL = '_VBH' AND SL_KENNZAHL = 104) as KD1
ON(KD.NR_MANDANT = KD1.NR_MANDANT AND KD.SL_GRUPPE_KENNZAHL = KD1.SL_GRUPPE_KENNZAHL AND KD.SL_KENNZAHL = KD1.SL_KENNZAHL)
WHEN MATCHED Then
UPDATE SET
BEZ_KENNZAHL_ABS = 'MEFGA',
BEZ_KENNZAHL_REL = 'Aufgaben',
BEZ_EINHEIT_ABS = '%',
BEZ_EINHEIT_REL = '%',
SL_MODIFIZIERUNG = 3,
ANZ_NACHKOMMASTELLEN_ABS = 2,
ANZ_NACHKOMMASTELLEN_REL = 2,
KZ_QUALITAETSZIEL_ABS = 'H',
KZ_QUALITAETSZIEL_REL = 'H',
BEZ_ERMITTLUNGSFREQUENZ = 'Monatl.',
BEZ_ERMITTLUNGSART = 'Automat',
BEZ_DATENLIEFERANT = 'Geschäftsfelddaten',
TXT_QUELLINFORMATION = 'Geschäftsfelddaten',
TXT_KNZ_BESCHREIBUNG = 'Aufgaben',
FAK_REF_GEWICHT = 1,
KZ_HILFSGROESSE = 'N',
SL_GRUPPE_KENNZAHL_REL = 'ALLG',
SL_KENNZAHL_REL = 10,
FAK_ERGEBNIS_REL = 1,
BEZ_EINHEIT_QUELLE = '%',
FAK_UMRECHNUNG_QUELLE = 1,
KZ_REF_OHNE_VORZEICHEN = 'N'
WHEN Not MATCHED Then
INSERT (NR_MANDANT,
SL_GRUPPE_KENNZAHL,
SL_KENNZAHL,
SYS_DWH_TSP,
SL_MODIFIZIERUNG,
UID_ERFASSUNG,
TSP_ERFASSUNG,
UID_AENDERUNG,
TSP_AENDERUNG,
BEZ_EINHEIT_ABS,
ANZ_NACHKOMMASTELLEN_ABS,
SL_GRUPPE_KENNZAHL_REF,
SL_KENNZAHL_REF,
KZ_REF_OHNE_VORZEICHEN,
SL_REF_VERDICHTUNG,
BEZ_KENNZAHL_ABS,
BEZ_KENNZAHL_REL,
KZ_HIERARCHIESUMME,
KZ_QUALITAETSZIEL_ABS,
KZ_QUALITAETSZIEL_REL,
BEZ_ERMITTLUNGSFREQUENZ,
BEZ_ERMITTLUNGSART,
IHT_MINIMAL,
IHT_MAXIMAL,
IHT_MINIMAL_REL,
IHT_MAXIMAL_REL,
BEZ_DATENLIEFERANT,
TXT_QUELLINFORMATION,
DAT_ERFASSUNG_AB,
DAT_ERFASSUNG_BIS,
TXT_KNZ_BESCHREIBUNG,
FAK_REF_GEWICHT,
KZ_HILFSGROESSE,
SL_GRUPPE_KENNZAHL_REL,
SL_KENNZAHL_REL,
FAK_ERGEBNIS_REL,
BEZ_EINHEIT_QUELLE,
FAK_UMRECHNUNG_QUELLE,
BEZ_EINHEIT_REL,
ANZ_NACHKOMMASTELLEN_REL)
VALUES(5472,
'_VBH',
'104',
current timestamp,
3,
'AUTOMAT',
current timestamp,
'AUTOMAT',
current timestamp,
'%',
2,
null,
null,
'N',
null,
'Aufgaben',
'Aufgaben',
'N',
'H',
'H',
'Monatl.',
'Automat',
null,
null,
null,
null,
'Geschäftsfelddaten',
'Geschäftsfelddaten',
'01.01.2000',
'31.12.9999',
'Aufgaben',
1,
'N',
'ALLG',
'10',
1,
'%',
1,
'%',
2);

Problem solved. The source in the USING clause must not come from the target table itself: when the row does not exist yet, that subselect returns no rows, so there is nothing for WHEN NOT MATCHED to insert.
Switched the source to sysibm.sysdummy1 and now it works.
Something like this:
MERGE INTO DSPTMCP.KENNZAHL_DEFINITION as KD
USING(Select 5472 AS NR_MANDANT, '_KCR' AS SL_GRUPPE_KENNZAHL,
600 AS SL_KENNZAHL FROM sysibm.sysdummy1) as KD1
ON(KD.NR_MANDANT = KD1.NR_MANDANT AND KD.SL_GRUPPE_KENNZAHL = KD1.SL_GRUPPE_KENNZAHL AND KD.SL_KENNZAHL = KD1.SL_KENNZAHL)
WHEN MATCHED THEN
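For reference, the general shape of the working pattern looks roughly like this (a sketch with placeholder table and column names, not the exact statement above):
MERGE INTO target_table AS T
USING (SELECT 5472 AS key1, '_KCR' AS key2, 600 AS key3
       FROM sysibm.sysdummy1) AS S
ON (T.key1 = S.key1 AND T.key2 = S.key2 AND T.key3 = S.key3)
WHEN MATCHED THEN
    UPDATE SET some_col = 'new value'
WHEN NOT MATCHED THEN
    INSERT (key1, key2, key3, some_col)
    VALUES (S.key1, S.key2, S.key3, 'new value');
Because the source row is built from constants, it always exists, so the WHEN NOT MATCHED branch can fire when the target row is missing.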

Related

UPDATE SET with different value for each row

I have a Python dict that maps elements to their values. For example:
db_rows_values = {
<element_uuid_1>: 12,
<element_uuid_2>: "abc",
<element_uuid_3>: [123, 124, 125],
}
And I need to update all of them in one query. I did it in Python by generating the query in a loop with CASE:
sql_query_elements_values_part = " ".join([f"WHEN '{element_row['element_id']}' "
f"THEN '{ujson.dumps(element_row['value'])}'::JSONB "
for element_row in db_row_values])
query_part_elements_values_update = f"""
elements_value_update AS (
UPDATE m2m_entries_n_elements
SET value =
CASE element_id
{sql_query_elements_values_part}
ELSE NULL
END
WHERE element_id = ANY(%(elements_ids)s::UUID[])
AND entry_id = ANY(%(entries_ids)s::UUID[])
RETURNING element_id, entry_id, value
),
But now I need to rewrite it in PL/pgSQL. I can pass db_rows_values as an array of ROWTYPE or as JSON, but how can I build something like the WHEN ... THEN part?
OK, I can pass the dict as JSON, convert it to rows with json_to_recordset, and replace the WHEN ... THEN with SET value = (SELECT ... WHERE ...):
WITH input_rows AS (
SELECT *
FROM json_to_recordset(
'[
{"element_id": 2, "value":"new_value_1"},
{"element_id": 4, "value": "new_value_2"}
]'
) AS x("element_id" int, "value" text)
)
UPDATE table1
SET value = (SELECT value FROM input_rows WHERE input_rows.element_id = table1.element_id)
WHERE element_id IN (SELECT element_id FROM input_rows);
https://dbfiddle.uk/?rdbms=postgres_14&fiddle=f8b6cd8285ec7757e0d8f38a1becb960
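For what it's worth, the same update can also be written as a join with UPDATE ... FROM instead of a correlated subquery per row; a sketch against the same hypothetical table1:
WITH input_rows AS (
    SELECT *
    FROM json_to_recordset(
        '[{"element_id": 2, "value": "new_value_1"},
          {"element_id": 4, "value": "new_value_2"}]'
    ) AS x(element_id int, value text)
)
UPDATE table1
SET value = input_rows.value
FROM input_rows
WHERE table1.element_id = input_rows.element_id;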

DB2 merge upsert gets 'Row not found for MERGE' error

I am trying to do a basic upsert on an iSeries DB2 with the MERGE statement, similar to what is described in Does DB2 have an "insert or update" statement? and http://db2performance.blogspot.com/2011/12/merge-make-your-upserts-quick.html. When executed, it gives me Row not found for MERGE. SQLSTATE=02000 instead of inserting the row. Since I have when not matched then insert in the statement, why does it return an error instead of inserting? I looked all over SO and didn't see this particular issue.
Here is the statement I'm using:
merge into UFDFTRN as T using (
select * from UFDFTRN
where DFCNO = 354 and DFINV = 1179 and DFLC = 1 and DFDATE = '2017-01-31'
and DFSPLT = 0 and DFSEQ = 100
) as S on (
T.DFCNO = S.DFCNO and T.DFINV = S.DFINV and T.DFDATE = S.DFDATE and
T.DFSPLT = S.DFSPLT and T.DFSEQ = S.DFSEQ
) when matched then
update set DFSEQ = 1000, DFTRAN = 0, DFITEM = 'F224', DFRITM = '0',
DFDESC = 'DAIRY VTM PREMIX', DFQTY = 3, DFUM = '',DESIQU = 0, DFRTQU = 3,
DFUPR = 0, DFCTUP = 0, DFUCST = 0, DFOUCST = 0, DFAMT = 0, DFOAMT = 0, DFCODE = '',
DFURAT = '', DFCGCD = '0', DFCTNO = 0, DFADJITM = '', DFADJPCT = 0, DFMNFITM = '',
DFMNFRAT = '', DFMNFQTY = '0', DFMNFTQTY = '0'
when not matched then
insert (DFCNO, DFINV, DFLC, DFDATE, DFSPLT, DFSEQ, DFTRAN, DFITEM, DFRITM, DFDESC,
DFQTY, DFUM, DFSIQU, DFRTQU, DFUPR, DFCTUP, DFUCST, DFOUCST, DFAMT, DFOAMT, DFCODE,
DFURAT, DFCGCD, DFCTNO, DFADJITM, DFADJPCT, DFMNFITM, DFMNFRAT, DFMNFQTY, DFMNFTQTY
) values (
354, 1179, 1, '2017-01-31', 0, 1000, 0, 'F224', '0', 'DAIRY VTM PREMIX', 3, '', 0,
3, 0, 0, 0, 0, 0, 0, '', '', '0', 0, '', 0, '', '', '0', '0'
)
It probably should look more like this:
merge into UFDFTRN as T using (
select 354 DFCNO, 1179 DFINV, 1 DFLC, '2017-01-31' DFDATE, 0 DFSPLT, 100 DFSEQ
, 'DAIRY VTM PREMIX' f1 -- all other columns you might need
from sysibm.sysdummy1
) as S
on (
T.DFCNO = S.DFCNO and T.DFINV = S.DFINV and T.DFDATE = S.DFDATE and
T.DFSPLT = S.DFSPLT and T.DFSEQ = S.DFSEQ
)
when matched then
update set T.DFSEQ = S.DFSEQ, T.DFTRAN = S.DFTRAN, -- etc. etc.
when not matched then
insert (DFCNO, DFINV, ... -- etc. etc.
) values (
S.DFCNO, S.DFINV, ..., S.F1, ... -- etc. etc.
)
PS. Not tested.
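As an aside, at least on DB2 for LUW the single-row source can also be written with a VALUES row constructor instead of selecting from sysibm.sysdummy1; a sketch along the same lines (also untested):
merge into UFDFTRN as T using (
    values (354, 1179, 1, '2017-01-31', 0, 100, 'DAIRY VTM PREMIX')
) as S (DFCNO, DFINV, DFLC, DFDATE, DFSPLT, DFSEQ, F1)
on (
    T.DFCNO = S.DFCNO and T.DFINV = S.DFINV and T.DFDATE = S.DFDATE and
    T.DFSPLT = S.DFSPLT and T.DFSEQ = S.DFSEQ
)
when matched then
    update set T.DFDESC = S.F1 -- etc. etc.
when not matched then
    insert (DFCNO, DFINV, DFLC, DFDATE, DFSPLT, DFSEQ, DFDESC) -- etc. etc.
    values (S.DFCNO, S.DFINV, S.DFLC, S.DFDATE, S.DFSPLT, S.DFSEQ, S.F1)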
Mustaccio has the right format for the merge...
But as I commented, that's a really funny way to use merge.
Personally, for a one-time thing, I would have just run
update UFDFTRN
set (DFCNO, DFINV, DFLC, DFDATE, DFSPLT, DFSEQ, DFTRAN, DFITEM, DFRITM, DFDESC,
DFQTY, DFUM, DFSIQU, DFRTQU, DFUPR, DFCTUP, DFUCST, DFOUCST, DFAMT, DFOAMT, DFCODE,
DFURAT, DFCGCD, DFCTNO, DFADJITM, DFADJPCT, DFMNFITM, DFMNFRAT, DFMNFQTY, DFMNFTQTY
) = (
354, 1179, 1, '2017-01-31', 0, 1000, 0, 'F224', '0', 'DAIRY VTM PREMIX', 3, '', 0,
3, 0, 0, 0, 0, 0, 0, '', '', '0', 0, '', 0, '', '', '0', '0'
)
where DFCNO = 354 and DFINV = 1179 and DFLC = 1 and DFDATE = '2017-01-31'
and DFSPLT = 0 and DFSEQ = 100
And if that failed because the record was not found, I would simply change the update to an insert:
insert into UFDFTRN
(DFCNO, DFINV, DFLC, DFDATE, DFSPLT, DFSEQ, DFTRAN, DFITEM, DFRITM, DFDESC,
DFQTY, DFUM, DFSIQU, DFRTQU, DFUPR, DFCTUP, DFUCST, DFOUCST, DFAMT, DFOAMT, DFCODE,
DFURAT, DFCGCD, DFCTNO, DFADJITM, DFADJPCT, DFMNFITM, DFMNFRAT, DFMNFQTY, DFMNFTQTY
) values (
354, 1179, 1, '2017-01-31', 0, 1000, 0, 'F224', '0', 'DAIRY VTM PREMIX', 3, '', 0,
3, 0, 0, 0, 0, 0, 0, '', '', '0', 0, '', 0, '', '', '0', '0'
)
This use case is not unusual at all; it is the most basic use case for MERGE, in my opinion: you're trying to update or insert into one table. It really should be easier, but the solution is to put any WHERE criteria into the ON () portion of the statement, and then it starts working as expected (this worked on DB2 LUW 10.5 at least; I have the exact same situation). MERGE apparently can't handle a WHERE clause in the USING () portion; it only looks at what's in the ON () portion for the update/insert criteria. Don't follow the sysdummy1 answer, as that isn't determining whether the data actually exists in the target table; it's just text. I don't want to write an insert and check for failure, because it would cause the app to throw an error, and I really shouldn't have to trap the error and write that code if DB2 has this feature.

How can I filter RETURNING *?

I have the following scenario. I have a table with an IsDeleted flag that I set to do a 'soft delete' of records. I am doing an UPSERT in which I add records, modify records, and flag some as deleted, and I want to exclude the records flagged as deleted from the RETURNING results. I have attempted to just append WHERE tbltest_IsDeleted = 0 to the end of the following SQL, but it gives me the error: ERROR: syntax error at or near "WHERE"
How can I filter the results of the RETURNING * in the following statement?
INSERT INTO tbltest (
tbltest_ID,
tbltest_Name,
tbltest_Description,
tbltest_IsDeleted)
VALUES
(DEFAULT, 'new record','new record description', 0),
(4, 'modified record name','modified record description', 0),
(5, 'existing record name','existing record description', 1)
ON CONFLICT (tbltest_ID) DO UPDATE SET (
tbltest_Name,
tbltest_Description,
tbltest_IsDeleted) = (
excluded.tbltest_Name,
excluded.tbltest_Description,
excluded.tbltest_IsDeleted) RETURNING *;
Worked it out, here is how I was able to do it:
WITH rows AS (
INSERT INTO tbltest (
tbltest_ID,
tbltest_Name,
tbltest_Description,
tbltest_IsDeleted)
VALUES
(DEFAULT, 'new record','new record description', 0),
(4, 'modified record name','modified record description', 0),
(5, 'existing record name','existing record description', 1)
ON CONFLICT (tbltest_ID) DO UPDATE SET (
tbltest_Name,
tbltest_Description,
tbltest_IsDeleted) = (
excluded.tbltest_Name,
excluded.tbltest_Description,
excluded.tbltest_IsDeleted) RETURNING *
)
SELECT * FROM rows WHERE rows.tbltest_IsDeleted = 0
Hopefully this saves someone some time ;-)
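For what it's worth, the same CTE wrapping works for UPDATE ... RETURNING and DELETE ... RETURNING too, since RETURNING itself does not accept a WHERE clause; a sketch using the same hypothetical table:
WITH rows AS (
    UPDATE tbltest
    SET tbltest_Name = 'renamed record'
    WHERE tbltest_ID IN (4, 5)
    RETURNING *
)
SELECT * FROM rows WHERE rows.tbltest_IsDeleted = 0;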

clean way to detect if current_query() is a prepared statement?

Does anyone know a good way to detect whether the result from current_query() is a prepared statement or not?
It seems that I can't simply use a string function, because this would be an example of a prepared statement:
UPDATE table SET "x" = $1 WHERE "y" = $2 AND "z" = $3
But this would not:
UPDATE table SET "x" = '$1 + $2 = $3' WHERE "y"='$1' AND "z" = 1
Is there maybe another function I can use together with / instead of current_query() or do you have any other ideas?
You may be able to detect whether current_query() is a prepared statement by looking for \$[[:digit:]] after stripping the text of all string literals. The following query would do it, though it may fail in cases of intricate quote nesting:
with
queries(curr_query) as (
values ($$UPDATE table SET "x" = '$1||''a'' + $2 = $3' WHERE "y"='$1' AND "z" = 1$$),
($$UPDATE table SET "x" = $r1$a$r1$||$1 WHERE "y" = $2 AND "z" = $3||$r1$b$r1$ $$),
($$UPDATE table SET "x" = $1 WHERE "y" = $2 AND "z" = $3$$)
),
stripped as (
select *,
regexp_replace(
regexp_replace(
regexp_replace(curr_query, '(["'']).*?\1', '', 'g'),
'\$([[:alpha:]]*?)\$.*?\$\1\$', '', 'g'),
'\$([[:alpha:]][[:alnum:]]*?)\$.*?\$\1\$', '', 'g') as stripped_query
from queries
)
select *, stripped_query ~ '\$[[:digit:]]' AS is_prepared
from stripped
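Applied directly to current_query() (for example, inside a trigger or auditing function), the same check might look like this, with the stripping expressions unchanged:
select regexp_replace(
         regexp_replace(
           regexp_replace(current_query(), '(["'']).*?\1', '', 'g'),
           '\$([[:alpha:]]*?)\$.*?\$\1\$', '', 'g'),
         '\$([[:alpha:]][[:alnum:]]*?)\$.*?\$\1\$', '', 'g')
       ~ '\$[[:digit:]]' as is_prepared;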

How do I insert multiple values into a postgres table at once?

I have a table that I am trying to insert multiple values into at once. Here is the table schema:
    Column     |  Type   | Modifiers
---------------+---------+-----------
 user_id       | integer |
 subservice_id | integer |
I have the user_id and want to insert multiple subservice_id's at once. Is there a syntax in Postgres that will let me do something like this
insert into user_subservices(user_id, subservice_id) values(1, [1, 2, 3]);
How would I do this?
Multi-value insert syntax is:
insert into table values (1,1), (1,2), (1,3), (2,1);
But krokodilko's answer is much slicker.
Try:
INSERT INTO user_subservices(user_id, subservice_id)
SELECT 1 id, x
FROM unnest(ARRAY[1,2,3,4,5,6,7,8,22,33]) x
Demo: http://www.sqlfiddle.com/#!15/9a006/1
A shorter version of krokodilko's answer:
insert into user_subservices(user_id, subservice_id)
values(1, unnest(array[1, 2, 3]));
A slightly related answer because I keep finding this question every time I try to remember this solution. Insert multiple rows with multiple columns:
insert into user_subservices (user_id, subservice_id)
select *
from unnest(array[1, 2], array[3, 4]);
More robust example, for when you need to insert multiple rows into some table for every row in another table:
INSERT INTO user_subservices (user_id, subservice_id)
SELECT users.id AS user_id, subservice_id
FROM users
CROSS JOIN unnest(ARRAY[1,2,3]) subservice_id;
For building the multi-row VALUES list dynamically, a helper function like this might be useful.
It generates the list of value tuples as a string (note that interpolating values directly into SQL like this is open to SQL injection, so parameterized queries are preferable where possible):
const _multiInsert = arrOfValues => {
    // removes the last character
    const _remLastChar = str => str.slice(0, str.length - 1);
    let formattedQuery = '';
    arrOfValues.forEach(row => {
        let newRow = '';
        for (const val of Object.values(row)) {
            let newValue = '';
            if (typeof val === 'string') newValue = `'${val}',`;
            else newValue = `${val},`;
            newRow = newRow.concat(newValue);
        }
        formattedQuery = formattedQuery.concat(`(${_remLastChar(newRow)}),`);
    });
    return _remLastChar(formattedQuery);
};
const arr_Of_Values = [
    {
        id: 1,
        name: "SAMPLE_NAME_1",
    },
    {
        id: 2,
        name: "SAMPLE_NAME2",
    }
]
const query_template = `INSERT INTO TABLE_NAME VALUES ${_multiInsert(arr_Of_Values)}`
console.log(query_template)
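For the sample data above, the generated statement comes out roughly like this (assuming the hypothetical TABLE_NAME has matching columns):
INSERT INTO TABLE_NAME VALUES (1,'SAMPLE_NAME_1'),(2,'SAMPLE_NAME2')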