PostgreSQL: How to query for JSONB arrays containing some value(s)

I have this table:
CREATE TABLE lawyer (
id SERIAL PRIMARY KEY,
data jsonb
);
INSERT INTO lawyer (data) VALUES
('{"a": 1}'),
('{"tags":["derecho", "civil", "laboral", "penal"]}'),
('{"tags":["derecho", "penal"]}')
;
What I want is a JSONB query in Postgres for when I need to find, for example, any entry that contains "civil" OR "derecho".

Finally found a way to do this:
Store the JSON arrays directly at the top level:
INSERT INTO lawyer (data) VALUES
('["derecho","laboral","penal"]'),
('["derecho", "civil", "laboral", "penal"]'),
('["derecho", "penal"]')
;
SELECT *
FROM lawyer
WHERE data ? 'penal';
Result: all three rows are returned, since each of these top-level arrays contains 'penal'.

For those looking for an answer for the original data structure, here's the SQL:
SELECT * FROM lawyer WHERE lawyer.data->'tags' ? 'penal'
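If you need to match any of several tags (the "civil" OR "derecho" case from the question), the jsonb ?| operator takes a text array and returns true when any of the strings exists as a top-level key or array element. A minimal sketch against the original nested structure:
SELECT *
FROM lawyer
WHERE lawyer.data->'tags' ?| array['civil', 'derecho'];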

Related

How to convert a jsonb array and use stats moments

I needed to store an array of numbers as JSONB in PostgreSQL.
Now I'm trying to calculate stats moments from this JSON and I'm facing some issues.
A sample of my data was provided as an image (rows of an id plus a JSON array of numbers).
I was already able to convert the JSON into a float array, using this function to convert jsonb to float[]:
CREATE OR REPLACE FUNCTION jsonb_array_castdouble(jsonb) RETURNS float[] AS $f$
  -- expand the jsonb array into text elements and aggregate them into a float[]
  SELECT array_agg(x)::float[] || ARRAY[]::float[]
  FROM jsonb_array_elements_text($1) t(x);
$f$ LANGUAGE sql IMMUTABLE;
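For reference, a quick sanity check of that cast function (a sketch, assuming a plain jsonb array of numbers):
SELECT jsonb_array_castdouble('[1.5, 2, 3]'::jsonb);
-- returns {1.5,2,3}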
Using this SQL:
with data as (
  select
    s.id as id,
    jsonb_array_castdouble(s.snx_normalized) as serie
  FROM spectra s
)
select * from data;
I found a function that can do these calculations, and I need to pass an array to it: https://github.com/ellisonch/PostgreSQL-Stats-Aggregate/
But this function requires the array in another form: unnested.
I already tried to use unnest, but it gives me only one value per row, not the entire array :(.
My goal is to be able to apply stats moments (kurtosis, skewness) for each row, like:

index | skewness
1     | 21.2131
2     | 1.123
Bonus: is there a way to avoid this 'with data' CTE and do the transformation directly in the select statement?
snx_wavelengths is JSON, right? Also, you provided it as a picture and not as text :( The data looks like (id, snx_wavelengths) - I believe you meant id when you said index (using a keyword is not a good idea, it would require double-quoted identifiers):
1,[1,2,3,4]
2,[373,232,435,84]
If that is right:
select id, (stats_agg(v::float)).skewness
from myMeasures,
lateral json_array_elements_text(snx_wavelengths) v
group by id;
DBFiddle demo
BTW, you don't need "with data" in the original sample if you don't want to use it; you could replace it with a subquery, i.e.:
select (stats_agg(n)).* from (select unnest(array[16,22,33,24,15])) data(n)
union all
select (stats_agg(n)).* from (select unnest(array[416,622,833,224,215])) data(n);
EDIT: And if you needed other stats too:
select id, "count","min","max","mean","variance","skewness","kurtosis"
from myMeasures,
lateral (select (stats_agg(v::float)).* from json_array_elements_text(snx_wavelengths) v) foo
group by id,"count","min","max","mean","variance","skewness","kurtosis";
DBFiddle demo

Extract all the values in jsonb into a row

I'm using PostgreSQL 11. I have a jsonb value which represents a row of a table; it looks like
{"userid":"test","rolename":"Root","loginerror":0,"email":"superadmin#ae.com",...,"thirdpartyauthenticationkey":{}}
Is there any method to gather all the "values" of the jsonb into a string separated by ',' and without the keys?
The string I want to obtain with the jsonb above is like
(test, Root, 0, superadmin#ae.com, ..., {})
I need to keep the ORDER of those values the same as the order of their keys in the jsonb. Can I do that with PostgreSQL?
You can use the jsonb_populate_record function (assuming your json data does match the users table). This will force the text value to match the order of your users table:
Schema (PostgreSQL v13)
CREATE TABLE users (
userid text,
rolename text,
loginerror int,
email text,
thirdpartyauthenticationkey json
)
Query #1
WITH d(js) AS (
VALUES
('{"userid":"test", "rolename":"Root", "loginerror":0, "email":"superadmin#ae.com", "thirdpartyauthenticationkey":{}}'::jsonb),
('{"userid":"other", "rolename":"User", "loginerror":324, "email":"nope#ae.com", "thirdpartyauthenticationkey":{}}'::jsonb)
)
SELECT jsonb_populate_record(null::users, js),
jsonb_populate_record(null::users, js)::text AS record_as_text,
pg_typeof(jsonb_populate_record(null::users, js)::text)
FROM d
;
jsonb_populate_record | record_as_text | pg_typeof
(test,Root,0,superadmin#ae.com,{}) | (test,Root,0,superadmin#ae.com,{}) | text
(other,User,324,nope#ae.com,{}) | (other,User,324,nope#ae.com,{}) | text
Note that if you're building this string in order to insert it back into PostgreSQL, then you don't need to build it at all, since the result of jsonb_populate_record will match your table:
Query #2
WITH d(js) AS (
VALUES
('{"userid":"test", "rolename":"Root", "loginerror":0, "email":"superadmin#ae.com", "thirdpartyauthenticationkey":{}}'::jsonb),
('{"userid":"other", "rolename":"User", "loginerror":324, "email":"nope#ae.com", "thirdpartyauthenticationkey":{}}'::jsonb)
)
INSERT INTO users
SELECT (jsonb_populate_record(null::users, js)).*
FROM d;
There are no results to be displayed.
Query #3
SELECT * FROM users;
userid | rolename | loginerror | email | thirdpartyauthenticationkey
test | Root | 0 | superadmin#ae.com | {}
other | User | 324 | nope#ae.com | {}
View on DB Fiddle
You can use jsonb_each_text() to get a set of text representations of the elements, string_agg() to aggregate them into a comma-separated string, and concat() to put that in parentheses.
SELECT concat('(', string_agg(value, ', '), ')')
FROM jsonb_each_text('{"userid":"test","rolename":"Root","loginerror":0,"email":"superadmin#ae.com","thirdpartyauthenticationkey":{}}'::jsonb) jet (key, value);
db<>fiddle
You didn't provide the DDL and DML of the table the JSON may reside in (whether it does isn't clear from your question). The demonstration above therefore only uses the JSON you showed as a scalar. If you do have a table, you need to CROSS JOIN LATERAL and GROUP BY some key, as sketched below.
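A minimal sketch of that table variant, assuming a hypothetical table user_data (id int, js jsonb) holding one JSON object per row:
SELECT d.id,
       concat('(', string_agg(jet.value, ', '), ')') AS record_as_text
FROM user_data d
CROSS JOIN LATERAL jsonb_each_text(d.js) jet
GROUP BY d.id;
Keep in mind that jsonb stores object keys in its own sorted order, so the values come out in that order rather than the order they were written; see the edit below if you need a specific order.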
Edit:
If you need to be sure the order is retained and you don't have it defined in a table's structure as @Marth's answer assumes, then you can of course extract every value manually in the order you need them:
SELECT concat('(',
concat_ws(', ',
j->>'userid',
j->>'rolename',
j->>'loginerror',
j->>'email',
j->>'thirdpartyauthenticationkey'),
')')
FROM (VALUES ('{"userid":"test","rolename":"Root","loginerror":0,"email":"superadmin#ae.com","thirdpartyauthenticationkey":{}}'::jsonb)) v (j);
db<>fiddle

Postgres / hstore : how to get IDs of rows with certain key in their hstore field?

I'm querying a pgsql DB to find rows that have certain keys in an hstore field:
select DISTINCT
from (select id, exist(data, 'exercise_quiz') key_exists
from user_tracking) x
where key_exists = true;
It works fine, but I need to print the IDs of the corresponding rows it returns. Can I do this with this command?
Use the operator hstore ? text (does hstore contain key?):
select id
from user_tracking
where data ? 'exercise_quiz';
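If user_tracking is large, a GIN index on the hstore column can speed up key-existence checks like this one (a sketch, reusing the column name from the question; the default hstore GIN opclass supports the ?, ?|, ?& and @> operators):
create index user_tracking_data_gin on user_tracking using gin (data);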

How to get size of PostgreSQL jsonb field?

I have a table with a jsonb field:
CREATE TABLE data.items
(
id serial NOT NULL,
datab jsonb
)
How do I get the size of this field in a query like this:
select id, size(datab) from data.items
For the number of bytes used to store:
select id, pg_column_size(datab) from data.items;
For the number of elements, if the jsonb value is an array:
select id, jsonb_array_length(datab) from data.items;
If the column uses EXTENDED storage (TOAST compression), pg_column_size reports the compressed on-disk size; use octet_length(datab::text) instead to get the uncompressed text length.
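A quick sketch showing both side by side, based on the table above:
select id,
       pg_column_size(datab)     as stored_bytes,
       octet_length(datab::text) as text_bytes
from data.items;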

Test for item membership in Postgres json array

Let's say I have a table in Postgres that looks like this - note the zips field is json.
cities
name (text) | zips (json)
San Francisco | [94100, 94101, ...]
Washington DC | [20000, 20001, ...]
Now I want to do something like select * from cities where zip=94101, in other words, testing membership.
I tried using WHERE zips ? '94101' and got operator does not exist: json ? unknown.
I tried using WHERE zips->'94101' but was not sure what to put there, as Postgres complained argument of WHERE must be type boolean, not type json.
What do I want here? How would I solve this for 9.3 and 9.4?
Edit: Yes, I know I should be using the native array type... the database adapter we are using doesn't support this.
In PostgreSQL 9.4+, you can use the @> containment operator with the jsonb type:
create table test (city text, zips jsonb);
insert into test values ('A', '[1, 2, 3]'), ('B', '[4, 5, 6]');
select * from test where zips @> '[1]';
An additional advantage of this approach is 9.4's new GIN indexes, which speed up such queries on big tables.
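A minimal sketch of such an index on the test table above (the default jsonb_ops GIN opclass supports the @>, ?, ?| and ?& operators):
create index test_zips_gin on test using gin (zips);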
For PostgreSQL 9.4+, you should use json[b]_array_elements_text():
(the existence operator ? does something similar, but for a JSON array it can only find exact matches, which can only occur if your array contains strings, not numbers)
create table cities (
city text,
zips jsonb
);
insert into cities (city, zips) values
('Test1', '[123, 234]'),
('Test2', '[234, 345]'),
('Test3', '[345, 456]'),
('Test4', '[456, 123]'),
('Test5', '["123", "note the quotes!"]'),
('Test6', '"123"'), -- this is a string in json(b)
('Test7', '{"123": "this is an object, not an array!"}');
-- select * from cities where zips ? '123';
-- would yield 'Test5', 'Test6' & 'Test7', but none of these is what you want
-- this is a safe solution:
select cities.*
from cities
join jsonb_array_elements_text(
case jsonb_typeof(zips)
when 'array' then zips
else '[]'
end
) zip on zip = '123';
-- but you can use this simplified query if you are sure
-- your "zips" column only contains JSON arrays:
select cities.*
from cities
join jsonb_array_elements_text(zips) zip on zip = '123';
For 9.3, you can use json_array_elements() (& convert zips manually to text):
select cities.*
from cities
join json_array_elements(zips) zip on zip::text = '123';
Note: for 9.3 you can't make your query safe (at least not easily); you need to store only JSON arrays in the zips column. Also, the query above won't find any string matches; your array elements need to be numbers.
Note 2: for 9.4+ you can use the safe solution with json too (not just with jsonb, but you must call json_typeof(zips) instead of jsonb_typeof(zips)).
Edit: actually, the @> operator is better in PostgreSQL 9.4+, as @Ainar-G mentioned (because it's indexable). A little side note: it only finds rows if your column and query both use JSON numbers (or JSON strings, but not mixed).
For 9.3, you can use json_array_elements(). I can't test with jsonb in version 9.4 right now.
create table test (
city varchar(35) primary key,
zips json not null
);
insert into test values
('San Francisco', '[94101, 94102]');
select *
from (
select *, json_array_elements(zips)::text as zip from test
) x
where zip = '94101';