Postgres: searching json array for int field value - postgresql

Following this documentation: https://www.postgresql.org/docs/9.5/functions-json.html I came across this syntax for searching a json array for a constant value using single quotes.
I'd like to do the same but search for the value of a field in a table I'm joining to. I've tried a number of variations of this:
SELECT tableA.id, tableB.json_array FROM tableA
LEFT JOIN tableB ON (tableB.json_array)::jsonb #> tableA.id;
But I always run into type-related issues. Does the #> operator only work with constants? How can I solve this problem?

If your data is a JSON array, you can use the Postgres jsonb_array_elements_text function, which extracts the values of the array elements. You can then use those values in the select list or in WHERE conditions.
Sample query for you:
-- sample format for json_array field: [{"id": 110}, {"id": 115}, {"id": 130}, {"id": 145}, {"id": 152}, {"id": 165}]
select b.* from tableB b
cross join jsonb_array_elements_text(b.json_array) b2(pvalue)
where
  (b2.pvalue::jsonb->>'id')::int4 > 100
  -- (b2.pvalue::jsonb->>'id')::int4 = 102
  -- (b2.pvalue::jsonb->>'id')::int4 in (50, 51, 55)
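To address the join in the original question directly: #> is a path-extraction operator that expects a text[] path on its right side, which is why comparing it against tableA.id never type-checks. A hedged sketch using the containment operator @> instead, assuming json_array holds objects shaped like the sample above and that the id values are stored as numbers in the JSON:
SELECT a.id, b.json_array
FROM tableA a
LEFT JOIN tableB b
  ON (b.json_array)::jsonb @> jsonb_build_array(jsonb_build_object('id', a.id));
If the table is large, a GIN expression index on ((json_array)::jsonb) can support the @> containment test.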

Search for string in jsonb values - PostgreSQL

For simplicity, a row of the table looks like this:
key: "z06khw1bwi886r18k1m7d66bi67yqlns",
reference_keys: {
"KEY": "1x6t4y",
"CODE": "IT137-521e9204-ABC-TESTE"
"NAME": "A"
},
I have a jsonb object like this one {"KEY": "1x6t4y", "CODE": "IT137-521e9204-ABC-TESTE", "NAME": "A"} and I want to search for a query in the values of any key. If my query is something like '521e9204', I want it to return the rows whose reference_keys contain '521e9204' in any value. Basically, the keys don't matter for this scenario.
Note: The column reference_keys, and thus the jsonb object, is always a one-dimensional array.
I have tried a query like this:
SELECT * FROM table
LEFT JOIN jsonb_each_text(table.reference_keys) AS j(k, value) ON true
WHERE j.value LIKE '%521e9204%'
The problem is that it duplicates rows, one for every key in the JSON, and it messes up the returned items.
I have also thought of doing something like this:
SELECT DISTINCT jsonb_object_keys(reference_keys) from table;
and then use a query like:
SELECT * FROM table
WHERE reference_keys->>'CODE' like '%521e9204%'
It seems like this would work but I really don't want to rely on this solution.
You can rewrite your JOIN to an EXISTS condition to avoid the duplicates:
SELECT t.*
FROM the_table t
WHERE EXISTS (SELECT *
              FROM jsonb_each_text(t.reference_keys) AS j(k, value)
              WHERE j.value LIKE '%521e9204%');
If you are using Postgres 12 or later, you can also use a JSON path query:
where jsonb_path_exists(reference_keys, 'strict $.** ? (@ like_regex "521e9204")')
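For completeness, a minimal sketch of the path version as a full statement, using the same table and column names as above; the $.** accessor visits every value recursively, so the match does not depend on which key holds it:
SELECT t.*
FROM the_table t
WHERE jsonb_path_exists(t.reference_keys,
                        'strict $.** ? (@ like_regex "521e9204")');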

JSONB Data Type Modification in Postgresql

I have a question about modifying a jsonb column in Postgres.
Basic setup:
array=> ["1", "2", "3"]
and now I have a PostgreSQL table with an id column and a jsonb column named, let's just say, cards.
 id |      cards
----+------------------
  1 | {"1": 3, "4": 2}
That's the data in the table, which is named test.
Question:
How do I convert the cards value for id 1 FROM {"1": 3, "4": 2} TO {"1": 4, "4": 2, "2": 1, "3": 1}?
How I expect the changes to occur:
From the array, increment by 1 every element that already exists as a key in the cards jsonb (changing {"1": 3} to {"1": 4}), and insert every element that does not yet exist as a key with a value of 1 (changing {"1": 4, "4": 2} to {"1": 4, "4": 2, "2": 1, "3": 1}), purely through Postgres.
Partial Solution
I asked a senior for support regarding my question and I was told this:
Roughly (names may differ): object keys to explode cards, array_elements to explode the array, left join them, do the calculation, re-aggregate the object. There may be a more direct way to do this but the above brute-force approach will work.
So I tried to follow through using the two functions json_each_text() and json_array_elements_text(), but I ended up stuck halfway, as I was unable to understand what they meant by left joining the two:
SELECT jsonb_each_text(tester_cards) AS each_text, jsonb_array_elements_text('[["1", 1], ["2", 1], ["3", 1]]') AS array_elements FROM tester WHERE id=1;
TL;DR:
I need an UPDATE statement that checks whether each key from an array exists in the jsonb data, and either increments its value by 1 or inserts the key with a value of 1.
Now it might look like I'm asking to be spoonfed, but I really haven't managed to find any way to solve this, so any assistance would be highly appreciated 🙇
The key insight is that with jsonb_each and jsonb_object_agg you can round-trip a JSON object in a subquery:
SELECT id, (
  SELECT jsonb_object_agg(key, value)
  FROM jsonb_each(cards)
) AS result
FROM test;
Now you can JOIN these key-value pairs against the jsonb_array_elements of your array input. Your colleague was close, but not quite right: it requires a full outer join, not just a left (or right) join to get all the desired object keys for your output, unless one of your inputs is a subset of the other.
SELECT id, (
  SELECT jsonb_object_agg(COALESCE(obj_key, arr_value), …)
  FROM jsonb_array_elements_text('["1", "2", "3"]') AS arr(arr_value)
  FULL OUTER JOIN jsonb_each(cards) AS obj(obj_key, obj_value) ON obj_key = arr_value
) AS result
FROM test;
Now what's left is only the actual calculation and the conversion to an UPDATE statement:
UPDATE test
SET cards = (
  SELECT jsonb_object_agg(
    COALESCE(key, arr_value),
    COALESCE(obj_value::int, 0) + (arr_value IS NOT NULL)::int
  )
  FROM jsonb_array_elements_text('["1", "2", "3"]') AS arr(arr_value)
  FULL OUTER JOIN jsonb_each_text(cards) AS obj(key, obj_value) ON key = arr_value
);
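Two small follow-ups: the UPDATE above rewrites every row, so add WHERE id = 1 if only that row should change, and the new value can be previewed first by running the same expression in a SELECT. A sketch assuming the sample row from the question:
SELECT id, (
  SELECT jsonb_object_agg(
    COALESCE(key, arr_value),
    COALESCE(obj_value::int, 0) + (arr_value IS NOT NULL)::int
  )
  FROM jsonb_array_elements_text('["1", "2", "3"]') AS arr(arr_value)
  FULL OUTER JOIN jsonb_each_text(cards) AS obj(key, obj_value) ON key = arr_value
) AS new_cards
FROM test
WHERE id = 1;
-- expected new_cards: {"1": 4, "2": 1, "3": 1, "4": 2}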

postgres remove specific element from jsonb array

I am using Postgres 10.
I have a JSON array in a jsonb column named boards.
I have a GIN index on the jsonb column.
The column values look like this:
[{"id": "7beacefa-9ac8-4fc6-9ee6-8ff6ab1a097f"},
{"id": "1bc91c1c-b023-4338-bc68-026d86b0a140"}]
I want to delete the element {"id": "7beacefa-9ac8-4fc6-9ee6-8ff6ab1a097f"} from every row of the column where it exists (i.e. update the column).
I saw that it is possible to delete an element by position with the #- operator (e.g. #- '{1}'), and I know you can get the position of an element using WITH ORDINALITY, but I can't manage to combine the two.
How can I update the JSON array?
One option is an UPDATE statement whose subquery selects all the sub-elements except {"id": "7beacefa-9ac8-4fc6-9ee6-8ff6ab1a097f"} through an inequality and then aggregates those sub-elements back with the jsonb_agg() function:
UPDATE user_boards
SET boards = (SELECT jsonb_agg(j.elm)
              FROM user_boards u
              CROSS JOIN jsonb_array_elements(boards) j(elm)
              WHERE j.elm->>'id' != '7beacefa-9ac8-4fc6-9ee6-8ff6ab1a097f'
                AND u.ID = user_boards.ID
              GROUP BY ID)
where ID is assumed to be a unique (identity) column of the table.
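One edge case worth guarding against: if a row's array contains only the element being removed, jsonb_agg() aggregates zero rows and the subquery yields NULL, so boards would be set to NULL rather than an empty array. A hedged variation that wraps the aggregate in COALESCE and correlates on the updated row directly, dropping the self-join:
UPDATE user_boards
SET boards = COALESCE(
  (SELECT jsonb_agg(j.elm)
   FROM jsonb_array_elements(user_boards.boards) j(elm)
   WHERE j.elm->>'id' != '7beacefa-9ac8-4fc6-9ee6-8ff6ab1a097f'),
  '[]'::jsonb);
Adding WHERE boards @> '[{"id": "7beacefa-9ac8-4fc6-9ee6-8ff6ab1a097f"}]' would also restrict the update to rows that actually contain the element, and that containment test can use the existing GIN index.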

Writing a rather obtuse JSON query using Slick

I am looking to translate an SQL query (Postgres) into Scala Slick code for use in my Play application.
The data looks something like this:
 parent_id | json_column
-----------+-----------------------------------------
           | [ {"id": "abcde-12345", "data": "..."}
         2 | , {"id": "67890-fghij", "data": "..."}
           | , {"id": "klmno-00000", "data": "..."} ]
Here's my query in PostgreSQL:
SELECT * FROM table1
WHERE id IN (
SELECT id
FROM
table1 t1,
json_array_elements(t1.json_column) e,
json_to_record(e.value) AS r("id" text, data text)
WHERE
"id" = 'abcde-12345'
AND t1.parent_id = 2
);
This finds the results I need: any rows in table1 whose json_column array includes a "row" with the id "abcde-12345". The "parent_id" and "id" will be passed to this query via query parameters (both Strings).
How would I write this query in Scala using Slick?
The easiest (maybe laziest?) way is probably to just use plain SQL:
sql"""[query]""".as[ (type1,type2..) ]
using the $var notation for the variables.
Otherwise you can use SimpleFunction to map the JSON calls, but I'm not quite sure how that works when they generate multiple results per row. It seems that might get complicated.

How can you do projection with array_agg(order by)?

I have a table with three columns: id, name and position. I want to create a JSON array as following:
[
{"id": 443, "name": "first"},
{"id": 645, "name": "second"}
]
The result should be ordered by the position column.
I started with the following query:
with output as
(
select id, name
from the_table
)
select array_to_json(array_agg(output))
from output
This works, great. Now I want to add the ordering. I started with this:
with output as
(
select id, name, position
from the_table
)
select array_to_json(array_agg(output order by output.position))
from output
Now the output is as following:
[
{"id": 443, "name": "first", "position": 1},
{"id": 645, "name": "second", "position": 2}
]
But I don't want the position field in the output.
I am facing a chicken-and-egg problem: I need the position column to be able to order by it, but I don't want the position column to appear in the result output.
How can I fix this?
I don't think the following query is correct, as table ordering is (theoretically) not preserved between queries:
with output as
(
select id, name
from the_table
order by position
)
select array_to_json(array_agg(output))
from output
There are two ways (at least):
Build JSON object:
with t(x,y) as (values(1,1),(2,2))
select json_agg(json_build_object('x',t.x) order by t.y) from t;
Or delete unnecessary key:
with t(x,y) as (values(1,1),(2,2))
select json_agg((to_jsonb(t)-'y')::json order by t.y) from t;
Note that in the second case you need some type casts, because the - operator is defined only for the jsonb type.
Also note that I used direct JSON aggregation with json_agg() instead of the array_to_json(array_agg()) pair.
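Applied to the table from the question (a sketch assuming the columns id, name and position on the_table), the first approach becomes:
SELECT json_agg(json_build_object('id', id, 'name', name) ORDER BY position)
FROM the_table;
which produces the desired [{"id": 443, "name": "first"}, {"id": 645, "name": "second"}] without exposing position.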