UPDATE SET with different value for each row - postgresql

I have a Python dict that maps elements to their values. For example:
db_rows_values = {
    <element_uuid_1>: 12,
    <element_uuid_2>: "abc",
    <element_uuid_3>: [123, 124, 125],
}
I need to update all of these in one query. I currently build the query in Python with a loop that generates a CASE expression:
sql_query_elements_values_part = " ".join([f"WHEN '{element_row['element_id']}' "
                                           f"THEN '{ujson.dumps(element_row['value'])}'::JSONB "
                                           for element_row in db_rows_values])
query_part_elements_values_update = f"""
    elements_value_update AS (
        UPDATE m2m_entries_n_elements
        SET value =
            CASE element_id
                {sql_query_elements_values_part}
                ELSE NULL
            END
        WHERE element_id = ANY(%(elements_ids)s::UUID[])
          AND entry_id = ANY(%(entries_ids)s::UUID[])
        RETURNING element_id, entry_id, value
    ),
But now I need to rewrite this in PL/pgSQL. I can pass db_rows_values as an array of a ROWTYPE or as JSON, but how can I build something like the WHEN ... THEN part?

OK, I can pass the dict as JSON, convert it to rows with json_to_recordset, and replace the WHEN ... THEN part with SET value = (SELECT ... WHERE ...):
WITH input_rows AS (
    SELECT *
    FROM json_to_recordset(
        '[
            {"element_id": 2, "value": "new_value_1"},
            {"element_id": 4, "value": "new_value_2"}
        ]'
    ) AS x("element_id" int, "value" text)
)
UPDATE table1
SET value = (SELECT value FROM input_rows WHERE input_rows.element_id = table1.element_id)
WHERE element_id IN (SELECT element_id FROM input_rows);
https://dbfiddle.uk/?rdbms=postgres_14&fiddle=f8b6cd8285ec7757e0d8f38a1becb960
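An equivalent form, if you prefer to avoid the correlated subquery, is UPDATE ... FROM, joining the decoded rows directly. A minimal sketch against the same hypothetical table1 as in the fiddle:

-- Join the unpacked JSON rows to the target table instead of running a subquery per row.
WITH input_rows AS (
    SELECT *
    FROM json_to_recordset(
        '[{"element_id": 2, "value": "new_value_1"},
          {"element_id": 4, "value": "new_value_2"}]'
    ) AS x(element_id int, value text)
)
UPDATE table1
SET value = input_rows.value
FROM input_rows
WHERE input_rows.element_id = table1.element_id;

Rows without a match in input_rows are simply not touched, so no extra WHERE ... IN filter is needed.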

How do I access the VALUE using CosmosQueryableExtensions

The EF Core Cosmos provider does not implement subqueries yet, so I have implemented the query using FromSqlRaw as per this post:
SqlParameter userMasterGuidParam = new("userMasterGuid", userMasterGuid);
SqlParameter statusNewParam = new("statusNew", CaseStatusGuids.New);
SqlParameter statusInProgressParam = new("statusInProgress", CaseStatusGuids.InProgress);
SqlParameter statusOnHoldParam = new("statusOnHold", CaseStatusGuids.OnHold);

const string TICKET_SQL =
    @"SELECT * FROM c " +
    "WHERE c.StatusGuid IN (@statusNewParam, @statusInProgress, @statusOnHold) " +
    "AND EXISTS ( " +
    "SELECT VALUE n FROM n IN c.caseservicepartner_collection " +
    "WHERE n.PartnerAssignedUserGuid = @userMasterGuid) ";

// Use CosmosQueryableExtensions instead of _context.Cases.FromSqlRaw to avoid an ambiguous namespace.
// https://github.com/dotnet/efcore/issues/26502
return CosmosQueryableExtensions
    .FromSqlRaw(_contextCosmos.Tickets, TICKET_SQL, statusNewParam, statusInProgressParam, statusOnHoldParam, userMasterGuidParam)
    .OrderByDescending(t => t.CreatedDateTime)
    .ToListAsync();
When I execute this query in the Cosmos Data Explorer I get a valid result - an array of items.
SELECT * FROM c WHERE c.StatusGuid IN ('63295b5e-de34-4555-b736-408dae18aaa0', '55d05dde-6b71-475f-8ee5-5549e2187423', 'e5267754-d416-4d1f-b42f-700dc5bb13d3') AND EXISTS ( SELECT VALUE n FROM n IN c.caseservicepartner_collection WHERE n.PartnerAssignedUserGuid = 'f3e9dd05-c580-4390-8998-61ce915d2da3')
[
    {
        "CreatedDateTime": "2022-08-17T08:22:54.017000+00:00",
        "CaseNumber": 111,
        "AssignedTeamGuid": null,
        "TicketTypeGuid": "18ba2bba-557f-4bbd-9b45-029194761980",
        ...
    },
    {
        ...
    }
]
However, when I execute this using EFCore, it returns no data. Looking at the EFCore log, it seems to wrap this query in an outer select, as follows:
-- EFCore adds this
SELECT c
FROM (
    -- My Query
    SELECT * FROM c WHERE c.StatusGuid IN (@statusNewParam, @statusInProgress, @statusOnHold) AND EXISTS ( SELECT VALUE n FROM n IN c.caseservicepartner_collection WHERE n.PartnerAssignedUserGuid = @userMasterGuid)
) c
...which when I plug into the Data Explorer, returns a nested structure like this:
[
    {
        "c": {
            "CreatedDateTime": "2022-08-17T08:22:54.017000+00:00",
            "CaseNumber": 111,
            "AssignedTeamGuid": null,
            "TicketTypeGuid": "18ba2bba-557f-4bbd-9b45-029194761980",
            ...
        }
    },
]
I suspect this is why the data is not being returned, perhaps due to a type mismatch.
Is there a way to fix this so the array is returned at the root, rather than nested within the c value?
Thanks
UPDATE
I removed the SqlParameters and instead used the string format-like option to pass parameters. That sorted out my issue and data is being returned now.
string TICKET_SQL =
    "SELECT * FROM c " +
    "WHERE c.StatusGuid IN ({0}, {1}, {2}) " +
    "AND EXISTS (SELECT VALUE n FROM n IN c.caseservicepartner_collection WHERE n.PartnerAssignedUserGuid = {3})";

return CosmosQueryableExtensions
    .FromSqlRaw(contextCosmos.Tickets, TICKET_SQL, CaseStatusGuids.New, CaseStatusGuids.InProgress, CaseStatusGuids.OnHold, userMasterGuid)
    .OrderByDescending(t => t.CreatedDateTime)
    .ToList();

Postgresql: Can the minus operator not be used with a parameter? Only hardcoded values?

The following query deletes an entry by its index:
const deleteGameQuery = `
    update users
    set games = games - 1
    where username = $1
`
If I pass the index as a parameter, nothing is deleted:
const gameIndex = rowsCopy[0].games.findIndex(obj => obj.game == gameID).toString();

const deleteGameQuery = `
    update users
    set games = games - $1
    where username = $2
`
const { rows } = await query(deleteGameQuery, [gameIndex, username]);
ctx.body = rows;
The gameIndex parameter is just a string, the same as if I typed it. So why doesn't it seem to read the value? Is this not allowed?
The column games is a jsonb data type with the following data:
[
    {
        "game": "cyberpunk-2077",
        "status": "Backlog",
        "platform": "Any"
    },
    {
        "game": "new-pokemon-snap",
        "status": "Backlog",
        "platform": "Any"
    }
]
The problem is that you're passing text instead of an integer. You need to pass an integer. I'm not sure exactly how your database interface passes integers; try removing toString() and ensure gameIndex is a Number:
const gameIndex = rowsCopy[0].games.findIndex(obj => obj.game == gameID);
array - integer and array - text mean two different things.
array - 1 removes the second element from the array.
select '[1,2,3]'::jsonb - 1;
[1, 3]
array - '1' searches for the entry '1' and removes it.
select '["1","2","3"]'::jsonb - '1';
["2", "3"]
-- Here, nothing is removed because 1 != '1'.
select '[1,2,3]'::jsonb - '1';
[1, 2, 3]
When you pass in a parameter, it is translated by your query function according to its type. If you pass a Number it will be translated as 1. If you pass a String it will be translated as '1'. (Or at least that's how it should work; I'm not totally familiar with JavaScript database libraries.)
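If the driver keeps binding the value as text anyway, another option is to cast the parameter explicitly on the SQL side. A sketch, assuming the same $1/$2 placeholders as above:

-- The explicit ::int cast forces the "delete element at index" form of the - operator.
update users
set games = games - $1::int
where username = $2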
As a side note, this sort of data is better handled as a join table.
create table games (
    id bigserial primary key,
    name text not null,
    status text not null,
    platform text not null
);

create table users (
    id bigserial primary key,
    username text not null
);

create table game_users (
    game_id bigint not null references games,
    user_id bigint not null references users,
    -- If a user can only have each game once.
    unique(game_id, user_id)
);
-- User 1 has games 1 and 2. User 2 has game 2.
insert into game_users (game_id, user_id) values (1, 1), (2, 1), (2,2);
-- User 1 no longer has game 1.
delete from game_users where game_id = 1 and user_id = 1;
You would also have a platforms table and a game_platforms join table.
Join tables are a little mind bending, but they're how SQL stores relationships. JSONB is very useful, but it is not a substitute for relationships.
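For completeness, reading a user's games back out is then a plain join. A sketch against the tables above:

-- List the games that user 1 currently has.
select g.name, g.status, g.platform
from game_users gu
join games g on g.id = gu.game_id
where gu.user_id = 1;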
You can avoid decomposing the objects outside of Postgres and manipulate the jsonb structure inside the query, like this:
create table gameplayers as (select 1 as id, '[
{
"game": "cyberpunk-2077",
"status": "Backlog",
"platform": "Any"
},
{
"game": "new-pokemon-snap",
"status": "Backlog",
"platform": "Any"
},
{
"game": "gameone",
"status": "Backlog",
"platform": "Any"
}
]'::jsonb games);
with
    ungrouped as (
        select * from gameplayers g, jsonb_to_recordset(g.games)
            as (game text, status text, platform text)
    ),
    filtered as (
        select id,
               jsonb_agg(
                   json_build_object('game', game,
                                     'status', status,
                                     'platform', platform)
               ) games
        from ungrouped
        where game not like 'cyberpunk-2077'
        group by id
    )
UPDATE gameplayers as g set games = f.games
from filtered f where f.id = g.id;

Update jsonb set new value select from the same table

I have a table foo:
id | items
---+------------------------------------
 1 | {"item_1": {"status": "status_1"}}
 2 | {"item_2": {"status": "status_2"}}
...
I need to update all rows in the items column (a jsonb) and add a new key ("new_value": "new_value") next to the existing {"status": ...} entry. After the update the result must look like this:
id | items
---+--------------------------------------------------------------
 1 | {"item_1": {"status": "status_1", "new_value": "new_value"}}
 2 | {"item_2": {"status": "status_2", "new_value": "new_value"}}
...
I've tried this:
WITH result AS (
    INSERT INTO foo (id, items)
    SELECT id, options || newvalue AS res
    FROM foo AS bar,
         jsonb_each(bar.items::jsonb) AS item,
         to_jsonb(item.value) AS options,
         jsonb_build_object('new_value', 'new_value') AS newvalue
    WHERE id IN ('1', '2'...)
    ON CONFLICT (id)
    DO UPDATE
    SET items = foo.items || Excluded.items::jsonb
    RETURNING *
)
SELECT item.key AS itemkey
FROM result AS res,
     jsonb_each(res.items) AS item,
     to_jsonb(item.value) AS options;
But when I run this script, Postgres shows this error message:
ON CONFLICT DO UPDATE command cannot affect row a second time
I don't understand, what am I doing wrong?
UPDATE #1
Postgres version 9.6
In table foo, id is TEXT UNIQUE NOT NULL.
As for why INSERT and not just UPDATE: that was my first mistake.
After some reading about Postgres functions, I finally figured it out:
UPDATE foo AS t
SET items = result.new_value
FROM (
    SELECT st.id, new_value
    FROM foo AS st
    CROSS JOIN LATERAL (
        SELECT jsonb_object_agg(the_key, the_value || '{"new_value": "some_new_value"}'::jsonb) AS new_value
        FROM jsonb_each(st.items) AS x(the_key, the_value)
        LIMIT 1
    ) AS n
) AS result
WHERE result.id = t.id;
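For reference, the same idea can be written a bit more compactly as a correlated subquery in SET, under the same assumption that every row holds a single top-level item_N key (a sketch, not tested against your exact schema):

-- Rebuild each items object, appending the new key to every top-level value.
UPDATE foo
SET items = (
    SELECT jsonb_object_agg(key, value || '{"new_value": "some_new_value"}'::jsonb)
    FROM jsonb_each(foo.items)
);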

INSERT INTO not working in IF block - T-SQL

I'm working on a procedure which should transfer a number of items (the value @p_count) from an old store to a new store:
SET @countOnOldStore = (SELECT "count" FROM ProductStore WHERE StoreId = @p_oldStoreId AND ProductId = @p_productID)
SET @countOnNewStore = (SELECT "count" FROM ProductStore WHERE StoreId = @p_newStoreID AND ProductId = @p_productID)
SET @ShiftedCount = @countOnOldStore - @p_count
SET @newStoreAfterShift = @countOnNewStore + @p_count

IF @ShiftedCount > 0
BEGIN
    DELETE FROM ProductStore WHERE storeId = @p_oldStoreId AND productID = @p_productID
    INSERT INTO ProductStore (storeId, productId, "count") VALUES (@p_oldStoreId, @p_productID, @ShiftedCount)
    DELETE FROM ProductStore WHERE storeId = @p_newStoreID AND productID = @p_productID
    INSERT INTO ProductStore (storeId, productId, "count") VALUES (@p_newStoreID, @p_productID, @newStoreAfterShift)
END
ELSE
    PRINT 'ERROR'
Well... the second insert is not working, and I can't figure out why. It says:
Cannot insert the value NULL into column 'count', table 'dbo.ProductStore'; column does not allow nulls. INSERT fails.
Can anyone see the problem and explain it to me? It's a school project.
It looks like your entire query should just be:
UPDATE ProductStore
SET [count] = [count] + CASE
                            WHEN storeId = @p_NewStoreID THEN @p_Count
                            ELSE -@p_Count
                        END
WHERE
    productID = @p_ProductID AND
    storeID IN (@p_NewStoreID, @p_OldStoreID)
If either value in the following is NULL, the total will be NULL:
SET @newStoreAfterShift = @countOnNewStore + @p_count
Check both values (@countOnNewStore, @p_count) for NULL.
Looks like you are not assigning any value to @p_count, so it is NULL, and so are @ShiftedCount and @newStoreAfterShift.
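If a missing row or an unset parameter should simply count as zero, one option (a sketch, reusing your variable names) is to guard the operands with ISNULL so the arithmetic can never produce NULL:

-- Treat NULL operands as 0 so the computed counts are never NULL.
SET @ShiftedCount = ISNULL(@countOnOldStore, 0) - ISNULL(@p_count, 0)
SET @newStoreAfterShift = ISNULL(@countOnNewStore, 0) + ISNULL(@p_count, 0)

That only masks the symptom, though; the real fix is to make sure @p_count is actually assigned before the IF block runs.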

How do I insert multiple values into a postgres table at once?

I have a table that I am trying to insert multiple values into at once. Here is the table schema:
    Column     |  Type   | Modifiers
---------------+---------+-----------
 user_id       | integer |
 subservice_id | integer |
I have the user_id and want to insert multiple subservice_id's at once. Is there a syntax in Postgres that will let me do something like this
insert into user_subservices(user_id, subservice_id) values(1, [1, 2, 3]);
How would I do this?
Multi-value insert syntax is:
insert into table values (1,1), (1,2), (1,3), (2,1);
But krokodilko's answer is much slicker.
Try:
INSERT INTO user_subservices(user_id, subservice_id)
SELECT 1 id, x
FROM unnest(ARRAY[1,2,3,4,5,6,7,8,22,33]) x
Demo: http://www.sqlfiddle.com/#!15/9a006/1
A shorter version of krokodilko's answer:
insert into user_subservices(user_id, subservice_id)
values(1, unnest(array[1, 2, 3]));
A slightly related answer because I keep finding this question every time I try to remember this solution. Insert multiple rows with multiple columns:
insert into user_subservices (user_id, subservice_id)
select *
from unnest(array[1, 2], array[3, 4]);
More robust example, for when you need to insert multiple rows into some table for every row in another table:
INSERT INTO user_subservices (user_id, subservice_id)
SELECT users.id AS user_id, subservice_id
FROM users
CROSS JOIN unnest(ARRAY[1,2,3]) subservice_id;
For multiple values, this helper function might be useful; it generates the VALUES list from an array of objects:
const _multiInsert = arrOfValues => {
    // removes the last character
    const _remLastChar = str => str.slice(0, str.length - 1);

    let formattedQuery = '';
    arrOfValues.forEach(row => {
        let newRow = '';
        for (const val of Object.values(row)) {
            // Note: values are interpolated directly into the string, not escaped or parameterized.
            const newValue = typeof val === 'string' ? `'${val}',` : `${val},`;
            newRow = newRow.concat(newValue);
        }
        formattedQuery = formattedQuery.concat(`(${_remLastChar(newRow)}),`);
    });
    return _remLastChar(formattedQuery);
};
const arr_Of_Values = [
    {
        id: 1,
        name: "SAMPLE_NAME_1",
    },
    {
        id: 2,
        name: "SAMPLE_NAME2",
    }
]

const query_template = `INSERT INTO TABLE_NAME VALUES ${_multiInsert(arr_Of_Values)}`
console.log(query_template)
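If you would rather keep the values parameterized instead of building the string yourself, the unnest form from the earlier answers also works with bound arrays. A sketch, assuming a driver that binds $1 and $2 as integer arrays:

-- Bind two parallel arrays and let Postgres expand them into rows.
insert into user_subservices (user_id, subservice_id)
select *
from unnest($1::int[], $2::int[]);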