Postgres JSON array contains given array element - postgresql

I have been searching for this answer for a long time on Stack Overflow, but did not find anything useful.
I have data in my_tbl as:
folder_id | links
--------------+----------------------------------------------------------------
761 | [{"ids": "[82293,82292]", "index": "index_1"}, {"ids": "[82293,82292]", "index": "index_2"}]
769 | [{"ids": "[82323,82324]", "index": "index_3"}]
572 | [{"ids": "[80031,79674,78971]", "index": "index_4"}]
785 | [{"ids": "[82367,82369]", "index": "index_5"}, {"ids": "[82368,82371]", "index": "index_6"}]
768 | [{"ids": "[82292,82306]", "index": "index_7"}]
I want to get all the rows where links->>'ids' contains 82292. So in this case, it should return folder_ids 761 and 768.
What I have achieved so far is separating the ids arrays from links with:
jsonb_array_elements(links) ->> 'ids'
but I am not sure how to proceed further.

You can use jsonb_array_elements to convert a json array to a rowset. You can use this to get at the value of "ids". Using the @> operator you can check whether the array contains a value:
select distinct folder_id
from YourTable
cross join lateral
jsonb_array_elements(links) ids(ele)
where (ele->>'ids')::jsonb @> '[82292]'
A tricky part is that "[82293,82292]" is stored as a string, not an array. The ele->>'ids' expression retrieves "[82293,82292]" as a string using the ->> operator (the double > makes it return text instead of jsonb). The string is then parsed into a jsonb array by the ::jsonb cast.
Example at rextester.com.
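To see the conversion step in isolation, here is a minimal illustration using a hypothetical literal taken from one element of the links array (not part of the original answer):
select ('{"ids": "[82293,82292]", "index": "index_1"}'::jsonb ->> 'ids')::jsonb @> '[82292]';
-- ->> yields the text '[82293,82292]', ::jsonb parses it into an array,
-- and @> then checks containment, so this returns true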

Related

Postgres: searching json array for int field value

Following this documentation: https://www.postgresql.org/docs/9.5/functions-json.html I came across this syntax for searching a json array for a constant value using single quotes.
I'd like to do the same but search for the value of a field in a table I'm joining to. I've tried a number of variations of this:
SELECT tableA.id, tableB.json_array FROM tableA
LEFT JOIN tableB ON (tableB.json_array)::jsonb #> tableA.id;
But I am always running into type-related issues. Does the #> operator only work with constants? How can I solve this problem?
If your data is in JSON array format, you can use the Postgres jsonb_array_elements_text function, which extracts the values of the array elements as text. After doing this you can easily use the key values in a query or in WHERE conditions.
Sample query for you:
-- sample format for json_array field: [{"id": 110}, {"id": 115}, {"id": 130}, {"id": 145}, {"id": 152}, {"id": 165}]
select b.* from tableB b
cross join jsonb_array_elements_text(b.json_array) b2(pvalue)
where
(b2.pvalue::jsonb->>'id')::int4 > 100
-- (b2.pvalue::jsonb->>'id')::int4 = 102
-- (b2.pvalue::jsonb->>'id')::int4 in (50, 51, 55)
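The question itself asks about joining on a column rather than a constant; a hedged sketch of that variant, assuming tableA.id is an integer and json_array holds objects shaped like {"id": 110} as in the sample above:
select a.id, b.json_array
from tableA a
left join tableB b
  on (b.json_array)::jsonb @> jsonb_build_array(jsonb_build_object('id', a.id));
-- jsonb_build_object builds {"id": <a.id>} and jsonb_build_array wraps it in [...],
-- so @> checks whether the array contains an object with that id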

Writing a rather obtuse JSON query using Slick

I am looking to translate an SQL query (Postgres) into Scala Slick code for use in my Play application.
The data looks something like this:
parent_id | json_column
----------+-----------------------------------------
        2 | [ {"id": "abcde-12345", "data": "..."}
          | , {"id": "67890-fghij", "data": "..."}
          | , {"id": "klmno-00000", "data": "..."} ]
Here's my query in PostgreSQL:
SELECT * FROM table1
WHERE id IN (
    SELECT id
    FROM
        table1 t1,
        json_array_elements(t1.json_column) e,
        json_to_record(e.value) AS r("id" text, data text)
    WHERE
        "id" = 'abcde-12345'
        AND t1.parent_id = 2
);
This finds the results I need: any rows in t1 that include an object in the json_column array with the id "abcde-12345". The "parent_id" and "id" will be passed in to this query via query parameters (both Strings).
How would I write this query in Scala using Slick?
The easiest (maybe laziest?) way is probably to just use plain SQL:
sql"""[query]""".as[ (type1,type2..) ]
using the $var notation for the variables.
Otherwise you can use SimpleFunction to map the json calls, but I'm not quite sure how that works when they generate multiple results per row. Seems that might get complicated.

Postgresql jsonb traversal

I am very new to the PG jsonb field.
I have, for example, a jsonb field containing the following:
{
  "RootModule": {
    "path": [
      1
    ],
    "tags": {
      "ModuleBase1": {
        "value": 40640,
        "humanstring": "40640"
      },
      "ModuleBase2": {
        "value": 40200,
        "humanstring": "40200"
      }
    },
    "children": {
      "RtuInfoModule": {
        "path": [
          1,
          0
        ],
        "tags": {
          "in0": {
            "value": 11172,
            "humanstring": "11172"
          },
          "in1": {
            "value": 25913,
            "humanstring": "25913"
          }
          etc....
Is there a way to query X levels deep and search the "tags" key for a certain key?
Say I want "ModuleBase2" and "in1" and I want to get their values?
Basically I am looking for a query that will traverse a jsonb field until it finds a key and returns the value without having to know the structure.
In Python or JS a simple loop or recursive function could easily traverse a json object (or dictionary) until it finds a key.
Does PG have a built-in function to do that?
Ultimately I want to do this in Django.
Edit:
I see I can do stuff like
SELECT data.key AS key, data.value AS value
FROM trending_snapshot, jsonb_each(trending_snapshot.snapshot->'RootModule') AS data
WHERE key = 'tags';
But I must specify the levels.
You can use a recursive query to flatten a nested jsonb; see this answer. Modify the query to find values for specific keys (add a condition in the where clause):
with recursive flat (id, path, value) as (
    select id, key, value
    from my_table,
         jsonb_each(data)
    union
    select f.id, concat(f.path, '.', j.key), j.value
    from flat f,
         jsonb_each(f.value) j
    where jsonb_typeof(f.value) = 'object'
)
select id, path, value
from flat
where path like any(array['%ModuleBase2.value', '%in1.value']);
 id | path                                              | value
----+---------------------------------------------------+-------
  1 | RootModule.tags.ModuleBase2.value                 | 40200
  1 | RootModule.children.RtuInfoModule.tags.in1.value  | 25913
(2 rows)
Test it in SqlFiddle.
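If you need this from application code (the question mentions Django), one option is to wrap the same recursive flattening in a function. This is only a sketch; the helper name jsonb_find_key is illustrative, not a built-in:
create or replace function jsonb_find_key(doc jsonb, wanted text)
returns table(path text, value jsonb) language sql immutable as $$
    with recursive flat (path, value) as (
        select key, value from jsonb_each(doc)
        union
        select concat(f.path, '.', j.key), j.value
        from flat f, jsonb_each(f.value) j
        where jsonb_typeof(f.value) = 'object'
    )
    select path, value from flat where path like '%' || wanted;
$$;
-- usage sketch: select * from trending_snapshot, jsonb_find_key(snapshot, 'in1.value');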

how do I convert text to jsonB

What is the proper way to convert any text (or varchar) to the jsonb type in Postgres (version 9.6)?
For example, here I am using two methods and I am getting different results:
Method 1:
dev=# select '[{"field":15,"operator":0,"value":"1"},{"field":15,"operator":0,"value":"2"},55]'::jsonb;
jsonb
----------------------------------------------------------------------------------------------
[{"field": 15, "value": "1", "operator": 0}, {"field": 15, "value": "2", "operator": 0}, 55]
(1 row)
Method 2, which doesn't produce the desired result, by the way:
dev=# select to_jsonb('[{"field":15,"operator":0,"value":"1"},{"field":15,"operator":0,"value":"2"},55]'::text);
to_jsonb
----------------------------------------------------------------------------------------------------
"[{\"field\":15,\"operator\":0,\"value\":\"1\"},{\"field\":15,\"operator\":0,\"value\":\"2\"},55]"
(1 row)
dev=#
Here, it was converted to a string, not an array.
Why doesn't the second method create an array?
According to Postgres documentation:
to_jsonb(anyelement)
Returns the value as json or jsonb. Arrays and composites are
converted (recursively) to arrays and objects; otherwise, if there is
a cast from the type to json, the cast function will be used to
perform the conversion; otherwise, a scalar value is produced. For any
scalar type other than a number, a Boolean, or a null value, the text
representation will be used, in such a fashion that it is a valid json
or jsonb value.
IMHO, since you are providing a JSON-formatted string, you should use the first method.
to_json('Fred said "Hi."'::text) --> "Fred said \"Hi.\""
If you try to extract array elements from the result of to_jsonb(text), you'll get the following error:
select *
from jsonb_array_elements_text(to_jsonb('[{"field":15,"operator":0,"value":"1"},{"field":15,"operator":0,"value":"2"},55]'::text));
cannot extract elements from a scalar
But if you first cast it to json:
select *
from jsonb_array_elements_text(to_jsonb('[{"field":15,"operator":0,"value":"1"},{"field":15,"operator":0,"value":"2"},55]'::json));
+--------------------------------------------+
| value |
+--------------------------------------------+
| {"field": 15, "value": "1", "operator": 0} |
+--------------------------------------------+
| {"field": 15, "value": "2", "operator": 0} |
+--------------------------------------------+
| 55 |
+--------------------------------------------+
If your text is just JSON-formatted text, you can simply cast it to json/jsonb explicitly, like this:
select '{"a":"b"}'::jsonb
Atomic type conversion and CSV-to-JSONb
A typical parsing problem in open data applications is converting CSV (or CSV-like) text, line by line, into JSONB with correct (atomic) datatypes. Datatypes can be specified in SQL jargon ('int', 'text', 'float', etc.) or JSON jargon ('string', 'number'):
CREATE FUNCTION csv_to_jsonb(
  p_info text,               -- the CSV line
  coltypes_sql text[],       -- the datatype list
  rgx_sep text DEFAULT '\|'  -- CSV separator, by regular expression
) RETURNS jsonb AS $f$
  SELECT to_jsonb(a) FROM (
    SELECT array_agg(CASE
      WHEN tp IN ('int','integer','smallint','bigint') THEN to_jsonb(p::bigint)
      WHEN tp IN ('number','numeric','float','double') THEN to_jsonb(p::numeric)
      WHEN tp='boolean' THEN to_jsonb(p::boolean)
      WHEN tp IN ('json','jsonb','object','array') THEN p::jsonb
      ELSE to_jsonb(p)
    END) a
    FROM regexp_split_to_table(p_info, rgx_sep) WITH ORDINALITY t1(p,i)
    INNER JOIN unnest(coltypes_sql) WITH ORDINALITY t2(tp,j)
      ON i=j
  ) t
$f$ LANGUAGE SQL IMMUTABLE;
-- Example:
SELECT csv_to_jsonb(
'123|foo bar|1.2|true|99999|{"x":123,"y":"foo"}',
array['int','text','float','boolean','bigint','object']
);
-- results [123, "foo bar", 1.2, true, 99999, {"x": 123, "y": "foo"}]
-- that is: number, string, number, boolean, number, object
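A possible usage follow-up (not from the original answer): the same function can be applied to many CSV lines at once, for example with unnest:
SELECT csv_to_jsonb(line, array['int','text','float'])
FROM unnest(array[
  '1|alpha|0.5',
  '2|beta|1.5'
]) AS t(line);
-- each row becomes a jsonb array such as [1, "alpha", 0.5]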

Using LIKE operator for array of objects inside jsonb field in PostgreSQL

Is it possible to use the LIKE operator on a single key/value inside an array of objects in a jsonb field in PostgreSQL 9.4? For example, I have:
id | body
------------------------------------------------------------
1 | {"products": [{"name": "qwe", "description": "asd"}, {"name": "zxc", "description": "vbn"}]}
I know I can get a product with something like this:
select * from table where body->'products' @> '[{"name": "qwe"}]'::jsonb
The question is: can I get this product if I don't know its full name?
Try to get the key and value by using the jsonb_each() function:
WITH json_test(data) AS ( VALUES
  ('{"products": [{"name": "qwe", "description": "asd"}, {"name": "zxc", "description": "vbn"}]}'::JSONB)
)
SELECT doc.key, doc.value
FROM json_test jt,
     jsonb_array_elements(jt.data->'products') array_elements,
     jsonb_each(array_elements) doc
WHERE doc.key = 'name'
  AND doc.value::TEXT LIKE '%w%';
Output will be the following:
key | value
------+-------
name | "qwe"
(1 row)
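If the goal is to retrieve the matching product object itself rather than just the key/value pair, a hedged variant of the same idea (same assumptions as above) is:
WITH json_test(data) AS ( VALUES
  ('{"products": [{"name": "qwe", "description": "asd"}, {"name": "zxc", "description": "vbn"}]}'::JSONB)
)
SELECT elem
FROM json_test jt,
     jsonb_array_elements(jt.data->'products') elem
WHERE elem->>'name' LIKE '%w%';
-- returns {"name": "qwe", "description": "asd"}; in a real table you would
-- select the row's id alongside elem to get back to the parent record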