Update selected values in a jsonb column containing an array - postgresql

Table faults contains a column recacc (jsonb), which holds an array of json objects. Each of them contains a field action. Wherever the value of action is abc, I want to change it to cba. The change should be applied to all rows.
[
  {
    "action": "abc",
    "created": 1128154425441
  },
  {
    "action": "lmn",
    "created": 1228154425441
  },
  {
    "action": "xyz",
    "created": 1328154425441
  }
]
The following doesn't work, probably because the data is an array:
update faults
set recacc = jsonb_set(recacc, '{action}', to_jsonb('cba'::TEXT), false)
where recacc ->> 'action' = 'abc'

I'm not sure if this is the best option, but you may first get the elements of the jsonb array using jsonb_array_elements, replace the value, and then reconstruct the json using array_agg and array_to_json.
UPDATE faults SET recacc = new_recacc::jsonb
FROM
  (SELECT array_to_json(array_agg(s)) AS new_recacc
   FROM
     (SELECT
        replace(c ->> 'action', 'abc', 'cba'),  -- this changes the value
        c ->> 'created'
      FROM faults f
      CROSS JOIN LATERAL jsonb_array_elements(f.recacc) AS c
     ) AS s (action, created)
  ) m;
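Note that the inner query is not correlated to the row being updated, so with more than one row in faults every row would end up with the same merged array, and c ->> 'created' turns the numeric timestamps into json strings. A correlated per-row variant avoids both problems (a sketch, not part of the original answer):
UPDATE faults f
SET recacc = (
  SELECT jsonb_agg(
           CASE WHEN c ->> 'action' = 'abc'
                THEN jsonb_set(c, '{action}', '"cba"'::jsonb)
                ELSE c
           END)
  FROM jsonb_array_elements(f.recacc) AS c
);
Wrapping the subquery in COALESCE(..., '[]'::jsonb) would guard rows holding an empty array, which jsonb_agg otherwise turns into NULL.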

Related

PostgreSQL update a jsonb column multiple times

Consider the following:
create table query(id integer, query_definition jsonb);
create table query_item(path text[], id integer);
insert into query (id, query_definition)
values
(100, '{"columns":[{"type":"integer","field":"id"},{"type":"str","field":"firstname"},{"type":"str","field":"lastname"}]}'::jsonb),
(101, '{"columns":[{"type":"integer","field":"id"},{"type":"str","field":"firstname"}]}'::jsonb);
insert into query_item(path, id) values
('{columns,0,type}'::text[], 100),
('{columns,1,type}'::text[], 100),
('{columns,2,type}'::text[], 100),
('{columns,0,type}'::text[], 101),
('{columns,1,type}'::text[], 101);
I have a query table which has a jsonb column named query_definition.
The jsonb value looks like the following:
{
  "columns": [
    {
      "type": "integer",
      "field": "id"
    },
    {
      "type": "str",
      "field": "firstname"
    },
    {
      "type": "str",
      "field": "lastname"
    }
  ]
}
In order to replace all "type": "..." with "type": "string", I've built the query_item table which contains the following data:
path |id |
----------------+---+
{columns,0,type}|100|
{columns,1,type}|100|
{columns,2,type}|100|
{columns,0,type}|101|
{columns,1,type}|101|
path matches each path from the json root to the "type" entry, id is the corresponding query's id.
I wrote the following SQL statement to do what I want:
update query q
set query_definition = jsonb_set(q.query_definition, query_item.path, ('"string"')::jsonb, false)
from query_item
where q.id = query_item.id
But it only partially works: it takes the first matching row per id and skips the others (only the 1st and 4th lines of the query_item table are applied).
I know I could write a FOR loop, but that requires a PL/pgSQL context and I'd rather avoid it.
Is there a way to do it with a single update statement?
I've read in this topic that it's possible with strings, but I couldn't work out how to adapt that mechanism to jsonb.
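One way to do it in a single update statement, without query_item or PL/pgSQL, is to expand and reaggregate: rebuild the whole columns array with every type replaced. This is only a sketch based on the schema shown above, and it sidesteps the query_item paths entirely:
UPDATE query q
SET query_definition = jsonb_set(
  q.query_definition,
  '{columns}',
  (SELECT jsonb_agg(jsonb_set(col, '{type}', '"string"'))
   FROM jsonb_array_elements(q.query_definition -> 'columns') AS col),
  false
);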

How to remove a field from each element of a json array in Postgres?

I have a table:
CREATE TABLE items (
  id BIGINT PRIMARY KEY,
  data jsonb
);
Format of data:
[
  {
    "id": "d20fe90c-137c-4713-bde1-2d4e75178ad3",
    "text": "text",
    "count": 1
  },
  {
    "id": "d20fe90c-137c-4713-bde1-2d4e75178ad4",
    "text": "text",
    "count": 1
  }
]
How can I remove the field count from all elements of the data json array?
I tried:
UPDATE items
SET data = data #- '{count}';
But this query requires the index of the array element before count, as in:
UPDATE items
SET data = data #- '{0, count}';
There is no operator or built-in function to do that. Unnest the array and aggregate the modified elements, like this:
update items t
set data = (
  select jsonb_agg(elem - 'count')
  from items
  cross join lateral jsonb_array_elements(data) as arr(elem)
  where id = t.id)
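In practice jsonb_array_elements yields the elements in array order, but jsonb_agg itself makes no ordering promise; a variant that pins the order down explicitly with WITH ORDINALITY, and also drops the self-join (a sketch):
update items t
set data = (
  select jsonb_agg(elem - 'count' order by ord)
  from jsonb_array_elements(t.data) with ordinality as arr(elem, ord)
);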

Updating JSON column in Postgres?

I've got a JSON column in Postgres that contains data in the following structure:
{
  "listings": [
    {
      "id": "KTyneMdrAhAEKyC9Aylf",
      "active": true
    },
    {
      "id": "ZcjK9M4tuwhWWdK8WcfX",
      "active": false
    }
  ]
}
I need to do a few things, all of which I am unsure how to do:
Add a new object to the listings array
Remove an object from the listings array based on its id
Update an object in the listings array based on its id
I am also using Sequelize, in case there are any built-in functions to do this (I can't see anything obvious).
Insert (jsonb_insert()):
UPDATE mytable
SET mydata = jsonb_insert(mydata, '{listings, 0}', '{"id":"foo", "active":true}');
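The path '{listings, 0}' prepends the new object. To append after the last element instead, jsonb_insert takes an optional insert_after flag that can be combined with the index -1 (a sketch with a made-up object):
UPDATE mytable
SET mydata = jsonb_insert(mydata, '{listings, -1}', '{"id":"bar", "active":true}', true);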
Update (expand array, change value with jsonb_set(), reaggregate):
UPDATE mytable
SET mydata = jsonb_set(mydata, '{listings}', s.a)
FROM
(
  SELECT
    jsonb_agg(
      CASE WHEN elems ->> 'id' = 'foo' THEN jsonb_set(elems, '{active}', 'false')
           ELSE elems
      END
    ) AS a
  FROM
    mytable,
    jsonb_array_elements(mydata -> 'listings') AS elems
) s;
Delete (expand array, filter relevant elements, reaggregate):
UPDATE mytable
SET mydata = jsonb_set(mydata, '{listings}', s.a)
FROM
(
  SELECT
    jsonb_agg(elems) AS a
  FROM
    mytable,
    jsonb_array_elements(mydata -> 'listings') AS elems
  WHERE elems ->> 'id' != 'foo'
) s;
Updating and deleting can be done in one line like the insert. But in that case you have to address the array element by its index instead of by a value. If you want to change an array element based on one of its values, you need to read it first; that is only possible by expanding the array as above.
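For example, deleting by element index in one line with the #- operator (here removing the first element of listings):
UPDATE mytable
SET mydata = mydata #- '{listings, 0}';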

Count objects in jsonb array with Postgres

Assume that every jsonb value in a table has this structure:
{
  "level1": [
    {
      "level2": [
        {
          "key": key,
          "value": value,
          "messages": []
        },
        {
          "key": key,
          "value": value,
          "messages": []
        },
        {
          "key": key,
          "value": value,
          "messages": []
        }
      ]
    }
  ]
}
The name of the key level1 is dynamic, so it can be anything (that's why I'm using jsonb_object_keys).
I need to check if any object inside level2.messages is empty per date.
That is: if all level2.messages in a date are empty, return false. Otherwise (at least one of the objects has a non-empty messages array), return true.
I thought I could use the json functions in a subquery, but their aliases are not known inside the subquery.
I have something like this:
SELECT t2.date,
(SELECT 1 FROM fields WHERE jsonb_array_length(fields ->> 'messages') = 1 LIMIT 1) AS hasMessages
FROM table1 t1
INNER JOIN table2 t2 ON t2.id = t1.id,
jsonb_object_keys(t1.result) AS rootNode,
jsonb_array_elements(t1.result -> rootNode) AS level2,
jsonb_array_elements(level2 -> 'level2') AS fields
GROUP BY t2.date
Based on the fragmentary info in the question, this would work:
SELECT date
, count(*) AS message_count
, count(*) FILTER (WHERE l2_val->'messages' = '[]') AS empty_message_count
FROM table1 t1
, jsonb_object_keys(result) AS key1
, jsonb_array_elements(result->key1->0->'level2') AS l2_val
GROUP BY 1
-- HAVING ?
This is assuming:
There is always only one key name at the outer level of the JSON object.
There is always only one array element in level1.
The key name of the nested array is 'level2'.
I guess you want to identify the dates that do have messages, but all of them empty ...
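Under those same assumptions, the true/false per date that the question asks for can come straight from bool_or, which is true as soon as at least one messages array is non-empty (a sketch reusing the FROM clause above):
SELECT date
, bool_or(l2_val -> 'messages' <> '[]') AS has_messages
FROM table1 t1
, jsonb_object_keys(result) AS key1
, jsonb_array_elements(result -> key1 -> 0 -> 'level2') AS l2_val
GROUP BY 1;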

Postgresql jsonb traversal

I am very new to the PG jsonb field.
I have, for example, a jsonb field containing the following:
{
  "RootModule": {
    "path": [1],
    "tags": {
      "ModuleBase1": {
        "value": 40640,
        "humanstring": "40640"
      },
      "ModuleBase2": {
        "value": 40200,
        "humanstring": "40200"
      }
    },
    "children": {
      "RtuInfoModule": {
        "path": [1, 0],
        "tags": {
          "in0": {
            "value": 11172,
            "humanstring": "11172"
          },
          "in1": {
            "value": 25913,
            "humanstring": "25913"
          }
etc....
Is there a way to query X levels deep and search the "tags" key for a certain key?
Say I want "ModuleBase2" and "in1", and I want to get their values.
Basically I am looking for a query that will traverse a jsonb field until it finds a key and returns the value without having to know the structure.
In Python or JS a simple loop or recursive function could easily traverse a json object (or dictionary) until it finds a key.
Does PG have a built-in function to do that?
Ultimately I want to do this in Django.
Edit:
I see I can do things like:
SELECT data.key AS key, data.value AS value
FROM trending_snapshot, jsonb_each(trending_snapshot.snapshot -> 'RootModule') AS data
WHERE key = 'tags';
But I must specify the levels.
You can use a recursive query to flatten a nested jsonb; see this answer. Modify the query to find values for specific keys (add a condition to the where clause):
with recursive flat (id, path, value) as (
  select id, key, value
  from my_table,
    jsonb_each(data)
  union
  select f.id, concat(f.path, '.', j.key), j.value
  from flat f,
    jsonb_each(f.value) j
  where jsonb_typeof(f.value) = 'object'
)
select id, path, value
from flat
where path like any(array['%ModuleBase2.value', '%in1.value']);
id | path | value
----+--------------------------------------------------+-------
1 | RootModule.tags.ModuleBase2.value | 40200
1 | RootModule.children.RtuInfoModule.tags.in1.value | 25913
(2 rows)
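If you need the matched values as plain text rather than jsonb, the final SELECT of the same query can unwrap the scalars with #>> and an empty path:
select id, path, value #>> '{}' as value_text
from flat
where path like any(array['%ModuleBase2.value', '%in1.value']);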