Updating JSON column in Postgres? - postgresql

I've got a JSON column in Postgres, that contains data in the following structure
{
  "listings": [
    {
      "id": "KTyneMdrAhAEKyC9Aylf",
      "active": true
    },
    {
      "id": "ZcjK9M4tuwhWWdK8WcfX",
      "active": false
    }
  ]
}
I need to do a few things, all of which I am unsure how to do:
Add a new object to the listings array
Remove an object from the listings array based on its id
Update an object in the listings array based on its id
I am also using Sequelize, in case there are any built-in functions to do this (I can't see anything obvious).

demo:db<>fiddle
Insert (jsonb_insert()):
UPDATE mytable
SET mydata = jsonb_insert(mydata, '{listings, 0}', '{"id":"foo", "active":true}');
Update (expand array, change value with jsonb_set(), reaggregate):
UPDATE mytable
SET mydata = jsonb_set(mydata, '{listings}', s.a)
FROM (
    SELECT
        jsonb_agg(
            CASE WHEN elems ->> 'id' = 'foo' THEN jsonb_set(elems, '{active}', 'false')
            ELSE elems
            END
        ) AS a
    FROM
        mytable,
        jsonb_array_elements(mydata -> 'listings') AS elems
) s;
Delete (expand array, filter relevant elements, reaggregate):
UPDATE mytable
SET mydata = jsonb_set(mydata, '{listings}', s.a)
FROM (
    SELECT
        jsonb_agg(elems) AS a
    FROM
        mytable,
        jsonb_array_elements(mydata -> 'listings') AS elems
    WHERE elems ->> 'id' != 'foo'
) s;
Updating and deleting can be done in a single statement like the insert, but then you have to address the element by its array index instead of by a value inside it. If you want to change an element identified by a value, you need to locate it first, which is only possible by expanding the array as shown above.
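To see the difference, here is a plain-Python analogue of the two update styles (illustrative only, not Postgres; the sample document is the one from the question). Updating by index is a single path write, while updating by a value has to walk the array first, which is what the expand/reaggregate query above does.

```python
import json

doc = json.loads("""{"listings": [
    {"id": "KTyneMdrAhAEKyC9Aylf", "active": true},
    {"id": "ZcjK9M4tuwhWWdK8WcfX", "active": false}
]}""")

# By index: a direct path write, analogous to
# jsonb_set(mydata, '{listings, 1, active}', 'true')
doc["listings"][1]["active"] = True

# By value: the element must be found first; the CASE inside jsonb_agg
# plays the role of this conditional rebuild of the array
doc["listings"] = [
    {**e, "active": False} if e["id"] == "KTyneMdrAhAEKyC9Aylf" else e
    for e in doc["listings"]
]
print(doc["listings"])
```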

Related

How to remove field from each elements json array postgres?

I have table
CREATE TABLE items (
    id BIGINT PRIMARY KEY,
    data jsonb
);
Format of data
[
  {
    "id": "d20fe90c-137c-4713-bde1-2d4e75178ad3",
    "text": "text",
    "count": 1
  },
  {
    "id": "d20fe90c-137c-4713-bde1-2d4e75178ad4",
    "text": "text",
    "count": 1
  }
]
How can I remove the field count from all elements of the data JSON array?
I tried:
UPDATE items
SET data = data #- '{count}';
But this query requires the index of the array element before count, as in:
UPDATE items
SET data = data #- '{0, count}';
There is no operator or built-in function to do that. Unnest the array and aggregate the modified elements like this:
update items t
set data = (
    select jsonb_agg(elem - 'count')
    from items
    cross join lateral jsonb_array_elements(data) as arr(elem)
    where id = t.id
);
Test it in db<>fiddle.
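As a cross-check of what the query produces, here is the same unnest-and-reaggregate step as a plain-Python sketch: `elem - 'count'` drops the key from each element, and `jsonb_agg()` rebuilds the array.

```python
import json

data = json.loads("""[
    {"id": "d20fe90c-137c-4713-bde1-2d4e75178ad3", "text": "text", "count": 1},
    {"id": "d20fe90c-137c-4713-bde1-2d4e75178ad4", "text": "text", "count": 1}
]""")

# elem - 'count' removes the key from each object; the aggregate
# rebuilds the array from the modified elements
data = [{k: v for k, v in e.items() if k != "count"} for e in data]
print(data)
```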

PostgreSQL nested jsonb query for multiple key/value pairs

Starting out with JSONB data type and I'm hoping someone can help me out.
I have a table (properties) with two columns (id as primary key and data as jsonb).
The data structure is:
{
  "ProductType": "ABC",
  "ProductName": "XYZ",
  "attributes": [
    {
      "name": "Color",
      "type": "STRING",
      "value": "Silver"
    },
    {
      "name": "Case",
      "type": "STRING",
      "value": "Shells"
    },
    ...
  ]
}
I would like to get all rows where an attribute has a specific value i.e. return all rows where Case = 'Shells' and/or Color = 'Red'.
I have tried the following but I'm not able to apply two conditions like Case = 'Shells' and Color = 'Silver'.
I can get rows for when a single attributes' name and value matches conditions but I can't figure out how to get it to work for multiple attributes.
EDIT 1:
I'm able to achieve the results using the following query:
WITH properties AS (
    select *
    from (
        values
        (1, '{"ProductType": "ABC","ProductName": "XYZ","attributes": [{"name": "Color","type": "STRING","value": "Silver"},{"name": "Case","type": "STRING","value": "Shells"}]}'::jsonb),
        (2, '{"ProductType": "ABC","ProductName": "XYZ","attributes": [{"name": "Color","type": "STRING","value": "Red"},{"name": "Case","type": "STRING","value": "Shells"}]}'::jsonb)
    ) s(id, data)
)
select *
from (
    SELECT
        id,
        jsonb_object_agg(attr ->> 'name', attr -> 'value') as aggr
    FROM
        properties m,
        jsonb_array_elements(data -> 'attributes') as attr
    GROUP BY id
) a
where aggr ->> 'Color' = 'Red' and aggr ->> 'Case' LIKE 'Sh%'
I could potentially have millions of these records so I suppose my only concern now is if this is efficient and if not, is there a better way?
step-by-step demo:db<>fiddle
SELECT
id
FROM properties m,
jsonb_array_elements(data -> 'attributes') as attr
GROUP BY id
HAVING jsonb_object_agg(attr ->> 'name', attr -> 'value') @> '{"Color":"Silver", "Case":"Shells"}'::jsonb
The problem is that jsonb_array_elements() spreads the related values over separate records. However, this call is necessary to read the values at all. So you need to reaggregate the values after extracting them, which makes it possible to check them together.
This can be achieved with the jsonb_object_agg() aggregation function. The trick is to build an object whose entries have the shape "name":"value". With that, we can easily check whether all required attributes are present in the JSON object using the @> containment operator.
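A plain-Python analogue of that aggregation may make the trick clearer: fold the attributes array into one name→value object, then check that every required pair is contained in it (sample data taken from the question).

```python
attributes = [
    {"name": "Color", "type": "STRING", "value": "Silver"},
    {"name": "Case", "type": "STRING", "value": "Shells"},
]

# jsonb_object_agg(attr ->> 'name', attr -> 'value') builds this object
obj = {a["name"]: a["value"] for a in attributes}

# the @> containment check: all required pairs must be present
required = {"Color": "Silver", "Case": "Shells"}
matches = all(obj.get(k) == v for k, v in required.items())
print(obj, matches)
```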
Concerning "Edit 1"
demo:db<>fiddle
You can do this:
SELECT
*
FROM (
SELECT
id,
jsonb_object_agg(attr ->> 'name', attr -> 'value') as obj
FROM properties m,
jsonb_array_elements(data -> 'attributes') as attr
GROUP BY id
) s
WHERE obj ->> 'Color' = 'Silver'
AND obj ->> 'Case' LIKE 'Sh%'
Create the new JSON structure as described above for all JSONs
Filter this result afterward.
Alternatively you can use jsonb_object_agg() in the HAVING clause as often as you need it. I guess you need to check which way is more performant in your case:
SELECT
id
FROM properties m,
jsonb_array_elements(data -> 'attributes') as attr
GROUP BY id
HAVING
jsonb_object_agg(attr ->> 'name', attr -> 'value') ->> 'Color' = 'Silver'
AND
jsonb_object_agg(attr ->> 'name', attr -> 'value') ->> 'Case' LIKE 'Sh%'

How to lower-case all the elements of a JSONB array of strings of each row in a table

I have a table with a field called "data" which is of JSONB type. The content of "data" is an object with one of the fields called "associated_emails", which is an array of strings.
I need to update the existing table so that the content of "associated_emails" is all lower-case. How to achieve that? This is my attempt so far (it triggers error: ERROR: cannot extract elements from a scalar)
update mytable my
set
"data" = safe_jsonb_set(
my."data",
'{associated_emails}',
to_jsonb(
lower(
(
SELECT array_agg(x) FROM jsonb_array_elements_text(
coalesce(
my."data"->'associated_emails',
'{}'::jsonb
)
) t(x)
)::text[]::text
)::text[]
)
)
where
my.mytype = 'something';
You can use jsonb_set() and update the column with something like the following (this works because every element of the array is a plain string):
UPDATE jsonb_test
SET data = jsonb_set(data, '{associated_emails}',
    lower(data ->> 'associated_emails')::jsonb);
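What that statement does, sketched in plain Python (the e-mail addresses are made-up sample values): the array's text form is lower-cased as a whole and reparsed, which is safe here because every element is a string.

```python
import json

data = {"associated_emails": ["John.Doe@Example.COM", "Jane@Example.org"]}

# lower(data ->> 'associated_emails') lower-cases the array's text
# representation; the ::jsonb cast parses it back into an array
data["associated_emails"] = json.loads(
    json.dumps(data["associated_emails"]).lower()
)
print(data["associated_emails"])
```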

PostgreSQL - Add key to each objects of an JSONB array

My database contains a table that has a column of jsonb type, and I want to update part of this data using PostgreSQL functions/operators. Given we have this:
{
  "A": [
    {"index": "1"},
    {"index": "2"}
  ],
  "B": [
    {"index": "3"},
    {"index": "4"}
  ]
}
Let's say we want to add a key with an empty array to the objects of the "A" array, in order to have:
{
  "A": [
    {"index": "1", "myArray": []},
    {"index": "2", "myArray": []}
  ],
  "B": [
    {"index": "3"},
    {"index": "4"}
  ]
}
How can I proceed?
I've already tried these kinds of things, without success:
UPDATE myTable SET myColumn = (myColumn::jsonb)->>'A' || '{"myArray":[]}'
UPDATE myTable SET myColumn = (
SELECT jsonb_agg(jsonb_set(
element,
array['A'],
to_jsonb(((element ->> 'A')::jsonb || '{"myArray":[]}')::jsonb)
))
FROM jsonb_array_elements(myColumn::jsonb) element
)::json
UPDATE myTable SET myColumn = (
SELECT jsonb_each((element ->> 'A')::jsonb) || '{"myArray":[]}'::jsonb
FROM jsonb_array_elements(myColumn::jsonb) element
)::json
Obviously, all of these tests have been big failures. I have difficulty understanding how PostgreSQL's functions work.
Can somebody help?
The approach with jsonb_array_elements and jsonb_set was the right idea, but somehow you nested them the wrong way round:
UPDATE myTable SET myColumn = jsonb_set(myColumn, '{A}', (
SELECT jsonb_agg( element || '{"myArray":[]}' )
FROM jsonb_array_elements(myColumn -> 'A') element
));
(online demo)
Btw if your column already has jsonb data type, you shouldn't need any casts.
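For reference, a plain-Python sketch of what the nesting does: the inner aggregate rebuilds the "A" array with the extra key merged into each element, and the outer jsonb_set() writes that array back under "A", leaving "B" untouched.

```python
import json

doc = json.loads("""{
    "A": [{"index": "1"}, {"index": "2"}],
    "B": [{"index": "3"}, {"index": "4"}]
}""")

# inner: jsonb_agg(element || '{"myArray":[]}') merges the new key
# into every element of "A" and collects the results into an array
new_a = [{**e, "myArray": []} for e in doc["A"]]

# outer: jsonb_set(myColumn, '{A}', ...) replaces only the "A" key
doc["A"] = new_a
print(doc)
```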

How to filter data from postgresql which has jsonb nested field in the array field of jsonb?

I have a table with a jsonb column whose documents look like this (simplified):
{
  "a": 1,
  "rg": [
    {
      "rti": 2
    }
  ]
}
I want to filter all rows that have an 'rg' field containing at least one 'rti' field in the array.
My current solution is
log->>'rg' ilike '%rti%'
Is there another approach? Perhaps a faster solution exists.
Another approach would be to apply jsonb_each to the jsonb object and then jsonb_array_elements_text to the value extracted by jsonb_each:
select id, js_value2
from (
    select (js).value as js_value, jsonb_array_elements_text((js).value) as js_value2, id
    from (
        select jsonb_each(log) as js, id
        from tab
    ) q
    where (js).key = 'rg'
) q2
where js_value2 like '%rti%';
Demo
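The effect of that query, as a plain-Python analogue (the second row is a made-up non-matching example): expand each document's "rg" value and keep the rows where some element's text form contains 'rti'. Note that, like the original `ilike '%rti%'` idea, this is a substring test, so it would also match 'rti' appearing inside a value.

```python
import json

rows = [
    (1, json.loads('{"a": 1, "rg": [{"rti": 2}]}')),
    (2, json.loads('{"a": 2, "rg": [{"other": 3}]}')),
]

# jsonb_each picks the "rg" value; jsonb_array_elements_text yields each
# element as text; like '%rti%' is a substring match on that text
matching = [
    row_id for row_id, log in rows
    if any("rti" in json.dumps(e) for e in log.get("rg", []))
]
print(matching)
```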