Get array of all jsonb field keys in jOOQ - jsonb_object_keys - postgresql

I have a jsonb value stored in a table:
{
"a": "x",
"b": "x",
"c": "x"
...
}
What I need is to return an array of all keys from this jsonb like this:
["a", "b", "c"]
I'm using an old version of jOOQ that has no json_object_keys method, so is there any way to extract these keys in jOOQ?

jOOQ 3.18 will have support for the MySQL-style JSON_KEYS() function, see #14046. As shown in the manual, this jOOQ API call:
jsonKeys(jsonObject(key("a").value(1), key("b").value(2)))
will translate to something like this in CockroachDB, PostgreSQL, and YugabyteDB:
-- AURORA_POSTGRES, POSTGRES, YUGABYTEDB
(
  SELECT coalesce(
    json_agg(j),
    json_build_array()
  )
  FROM json_object_keys(json_build_object(
    'a', CAST(1 AS int),
    'b', CAST(2 AS int)
  )) as j(j)
)
Alternatively, as per a_horse_with_no_name's comment, an improved emulation might be implemented via jsonb_path_query_array in PostgreSQL 12+.
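For example, on PostgreSQL 12+ a query along these lines should return the key array directly (a sketch only; my_table and data are placeholder names, not from the question):
-- Sketch, assuming a table "my_table" with a jsonb column "data".
-- keyvalue() expands the object into key/value items; .key keeps only the keys.
SELECT jsonb_path_query_array(data, '$.keyvalue().key') AS keys
FROM my_table;
-- For {"a": "x", "b": "x", "c": "x"} this returns ["a", "b", "c"]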
You can obviously do this already today using plain SQL templating, until the above is released.
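One possible query you could wrap in such a plain SQL template, which works on any supported PostgreSQL version (again, my_table and data are placeholder names):
-- Sketch: collect the set returned by jsonb_object_keys() into a text[] array
SELECT ARRAY(SELECT jsonb_object_keys(data)) AS keys
FROM my_table;
-- For {"a": "x", "b": "x", "c": "x"} this returns {a,b,c}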

Related

Writing a rather obtuse JSON query using Slick

I am looking to translate an SQL query (Postgres) into Scala Slick code for use in my Play application.
The data looks something like this:
 parent_id | json_column
-----------+-----------------------------------------
           | [ {"id": "abcde-12345", "data": "..."}
         2 | , {"id": "67890-fghij", "data": "..."}
           | , {"id": "klmno-00000", "data": "..."} ]
Here's my query in PostgreSQL:
SELECT * FROM table1
WHERE id IN (
    SELECT id
    FROM
        table1 t1,
        json_array_elements(t1.json_column) e,
        json_to_record(e.value) AS r("id" text, data text)
    WHERE
        "id" = 'abcde-12345'
        AND t1.parent_id = 2
);
This finds the results I need: any objects in t1 that include a "row" in the json_column array that has the id of "abcde-12345". The "parent_id" and "id" will be passed into this query via query parameters (both Strings).
How would I write this query in Scala using Slick?
The easiest (maybe laziest?) way is probably to just use plain SQL:
sql"""[query]""".as[ (type1,type2..) ]
using the $var notation for the variables.
Otherwise you can use SimpleFunction to map the JSON calls, but I'm not quite sure how that works when they generate multiple results per row. It seems that might get complicated.

PostgreSQL jsonb - omit multiple nested keys

The task is to remove multiple nested keys from a jsonb field.
Is there any way to shorten this expression without writing a custom function?
SELECT jsonb '{"a": {"b":1, "c": 2, "d": 3}}' #- '{a,b}' #- '{a,d}';
Suppose we need to delete more than two keys.
There is no way to shorten the expression. If your goal is to pass a single array of keys to be deleted to the query, you can use jsonb_set() with jsonb_each():
with my_table(json_col) as (
    values
        (jsonb '{"a": {"b":1, "c": 2, "d": 3}}')
)
select jsonb_set(json_col, '{a}', jsonb_object_agg(key, value))
from my_table
cross join jsonb_each(json_col->'a')
where key <> all('{b, d}')  -- input
group by json_col           -- use PK here if exists

    jsonb_set
-----------------
 {"a": {"c": 2}}
(1 row)
The solution is obviously more expensive but may be handy when dealing with many keys to be deleted.
NVM, figured it out.
For this particular case, we can re-assign the property with the keys removed (flat):
SELECT jsonb_build_object('a', ('{ "b":1, "c": 2, "d": 3 }' - ARRAY['b','d']));
A more general approach:
SELECT json_col || jsonb_build_object('<key>',
    ((json_col->'<key>') - ARRAY['key-1', 'key-2', 'key-n']));
Not very useful for deep paths, but works OK with one level of nesting.
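For illustration, here is the same expression with the question's sample value and concrete keys filled in:
-- Rebuild the "a" property with the nested keys "b" and "d" removed
SELECT json_col || jsonb_build_object('a', ((json_col->'a') - ARRAY['b','d']))
FROM (VALUES (jsonb '{"a": {"b":1, "c": 2, "d": 3}}')) AS t(json_col);
-- Result: {"a": {"c": 2}}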

Select greatest number from a json list of variable length with PostgreSQL

I have a column (let's call it jsn) in my database with a json object (actually stored as plain text, for reasons). This json object looks like this:
{"a":
{"b":[{"num":123, ...},
{"num":456, ...},
...,
{"num":789, ...}],
...
},
...
}
I'm interested in the biggest "num" inside that list of objects "b" inside the object "a".
If the list is of known length, I can do it like this:
SELECT
    GREATEST((jsn::json->'a'->'b'->0->>'num')::int,
             (jsn::json->'a'->'b'->1->>'num')::int,
             ... ,
             (jsn::json->'a'->'b'->N->>'num')::int)
FROM table
Note that I'm new to PostgreSQL (and database querying in general!) so that may be a rubbish way to do it. In any case it works. What I can't figure out is how to make this work when the list, 'b', is of arbitrary and unknown length.
In case it is relevant, I am using PostgreSQL 10 hosted on AWS RDS and running queries using pgAdmin 4.
You need to unnest the array; then you can apply max() on the result:
select max((n.x ->> 'num')::int)
from the_table t
cross join jsonb_array_elements(t.jsn::jsonb -> 'a' -> 'b') as n(x);
You probably want to add a group by, so that you can tell from which row the max value came. Assuming your table has a column id that is unique:
select id, max((n.x ->> 'num')::int)
from the_table t
cross join jsonb_array_elements(t.jsn::jsonb -> 'a' -> 'b') as n(x)
group by id;

Postgresql doesn't use GIN index for "?" JSON operator

For some reason the index is not used for the "?" operator.
Let's take this sample from https://schinckel.net/2014/05/25/querying-json-in-postgres/:
CREATE TABLE json_test (
    id serial primary key,
    data jsonb
);

INSERT INTO json_test (data) VALUES
    ('{}'),
    ('{"a": 1}'),
    ('{"a": 2, "b": ["c", "d"]}'),
    ('{"a": 1, "b": {"c": "d", "e": true}}'),
    ('{"b": 2}');
And create an index:
create index json_test_index on public.json_test using gin (data jsonb_path_ops) tablespace pg_default;
Then take a look at plan of the following query:
SELECT * FROM json_test WHERE data ? 'a';
There will be a Seq Scan while I would expect an index scan. Could somebody please advise what's wrong here?
From the docs: "The non-default GIN operator class jsonb_path_ops supports indexing the @> operator only." It doesn't support the ? operator.
So use the default operator class for jsonb instead (called "jsonb_ops", if you wish to spell it out explicitly).
But if your table only has 5 rows, it probably won't use the index anyway, unless you force it by setting enable_seqscan = off.
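For illustration, on the sample table above that could look as follows (the index name is reused from the question; enable_seqscan is only toggled to demonstrate the plan on such a tiny table):
-- The default jsonb_ops operator class supports ?, ?|, ?& and @>
DROP INDEX IF EXISTS json_test_index;
CREATE INDEX json_test_index ON json_test USING gin (data jsonb_ops);

SET enable_seqscan = off;  -- only to force the planner on a 5-row table
EXPLAIN SELECT * FROM json_test WHERE data ? 'a';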

postgres - syntax for updating a jsonb array

I'm struggling to find the right syntax for updating an array in a jsonb column in Postgres 9.6.6.
Given a column "comments", with this example:
[
  {
    "Comment": "A",
    "LastModified": "1527579949"
  },
  {
    "Comment": "B",
    "LastModified": "1528579949"
  },
  {
    "Comment": "C",
    "LastModified": "1529579949"
  }
]
If I wanted to append Z to each comment (giving AZ, BZ, CZ), I know I need to use something like jsonb_set(comments, '{"Comment"}',
Any hints on finishing this off?
Thanks.
Try:
UPDATE elbat
   SET comments = array_to_json(ARRAY(SELECT jsonb_set(x.original_comment,
                                                       '{Comment}',
                                                       concat('"',
                                                              x.original_comment->>'Comment',
                                                              'Z"')::jsonb)
                                        FROM (SELECT jsonb_array_elements(elbat.comments) original_comment) x))::jsonb;
It uses jsonb_array_elements() to get the array elements as a set, applies the changes to them using jsonb_set(), and transforms this into an array and back to JSON with array_to_json().
But that's an awful lot of work. OK, maybe there is a more elegant solution that I didn't find. But since your JSON seems to have a fixed schema anyway, I'd recommend a redesign to do it the relational way and have a simple table for the comments plus a linking table for the objects the comment is on. The change would have been very, very easy in such a model for sure.
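A minimal sketch of what such a relational model could look like (table and column names are made up here, and it assumes the elbat table has a primary key id):
-- Comments become rows instead of elements of a JSON array
CREATE TABLE comment (
    id            bigserial PRIMARY KEY,
    comment       text NOT NULL,
    last_modified timestamptz NOT NULL DEFAULT now()
);

-- Linking table between the commented objects and their comments
CREATE TABLE elbat_comment (
    elbat_id   bigint NOT NULL REFERENCES elbat (id),  -- assumes elbat(id) is the primary key
    comment_id bigint NOT NULL REFERENCES comment (id),
    PRIMARY KEY (elbat_id, comment_id)
);

-- Appending 'Z' to every comment is then a plain column update
UPDATE comment SET comment = comment || 'Z';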
First, find a query returning the expected result:
select jsonb_agg(value || jsonb_build_object('Comment', value->>'Comment' || 'Z'))
from my_table
cross join jsonb_array_elements(comments);
jsonb_agg
-----------------------------------------------------------------------------------------------------------------------------------------------------
[{"Comment": "AZ", "LastModified": "1527579949"}, {"Comment": "BZ", "LastModified": "1528579949"}, {"Comment": "CZ", "LastModified": "1529579949"}]
(1 row)
Create a simple SQL function based on the above query:
create or replace function update_comments(jsonb)
returns jsonb language sql as $$
select jsonb_agg(value || jsonb_build_object('Comment', value->>'Comment' || 'Z'))
from jsonb_array_elements($1)
$$;
Use the function:
update my_table
set comments = update_comments(comments);