Processing record type from a jsonb_each query - postgresql

I store some data as JSON.
I want to flatten the data using jsonb_each.
The new column type is RECORD, but I don't know how to extract values from it.
SELECT T FROM (
  SELECT json_each_text(skills::json->'prizes') FROM users
) AS T;
The output is
jsonb_each
---------------------------------
(compliance,2)
(incentives,3)
(compliance,0)
(legal,3)
(legal,2)
(international-contributions,3)
The type is RECORD.
pg_typeof
-----------
record
I want to do an aggregate with GROUP BY, but I cannot figure out how to extract the first element (the key string) and the second element (the value).

Here is a workaround I have found: JSON -> ROW -> JSON -> (string, integer), and then aggregate. But I am wondering if there is a shortcut that skips the ROW -> JSON conversion.
SELECT U.key, AVG(U.value::int) FROM
  (SELECT row_to_json(T)->'s'->>'key' AS key,
          row_to_json(T)->'s'->>'value' AS value
   FROM
     (SELECT jsonb_each(skills::jsonb->'prizes') AS s
      FROM users) AS T
  ) AS U
GROUP BY key;
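For reference, a possible shortcut (a hedged sketch, assuming the same users table with a skills column castable to jsonb): calling jsonb_each_text() as a set-returning function in the FROM clause exposes key and value columns directly, so the ROW -> JSON round trip can be skipped entirely.

SELECT p.key, AVG(p.value::int) AS avg_value
FROM users,
     LATERAL jsonb_each_text(skills::jsonb -> 'prizes') AS p(key, value)
GROUP BY p.key;

Here jsonb_each_text() returns each value as text, which casts to int just as in the workaround above.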

Thanks a lot, @Arnaud; this seems like a not-very-common problem. I wasn't sure about the JSON data structure after using the row_to_json function, so I needed to validate it via:
SELECT row_to_json(T) FROM
(SELECT jsonb_each((data->'app_metadata'->>'results')::jsonb)
FROM temp) AS T;
And once I saw the key structure, I could replicate your approach:
SELECT row_to_json(T)->'jsonb_each'->>'key' as key,
       row_to_json(T)->'jsonb_each'->>'value' as value
FROM (select jsonb_each((data->'app_metadata'->>'results')::jsonb) FROM temp) AS T;

Related

How to convert a jsonb array and use stats moments

I needed to store an array of numbers as JSONB in PostgreSQL.
Now I'm trying to calculate stats moments from this JSON, and I'm facing some issues.
Sample of my data: (provided as a screenshot, not reproduced here)
I was already able to convert the JSON into a float array, using this function:
CREATE OR REPLACE FUNCTION jsonb_array_castdouble(jsonb) RETURNS float[] AS $f$
SELECT array_agg(x)::float[] || ARRAY[]::float[] FROM jsonb_array_elements_text($1) t(x);
$f$ LANGUAGE sql IMMUTABLE;
Using this SQL:
with data as (
  select
    s.id as id,
    jsonb_array_castdouble(s.snx_normalized) as serie
  FROM
    spectra s
)
select * from data;
I found an aggregate function that can do these calculations, and I need to pass the values to it: https://github.com/ellisonch/PostgreSQL-Stats-Aggregate/
But this function requires the array in another form: unnested.
I already tried to use unnest, but it gives me only a single value, not the entire array per row :(.
My goal is:
Be able to apply stats moments (kurtosis, skewness) to each row, like:
index | skewness
------+---------
1     | 21.2131
2     | 1.123
Bonus: Is there a way to avoid the 'with data' CTE and do the transformation directly in the select statement?
snx_wavelengths is JSON, right? Also, you provided the data as a picture and not text :(. It looks like (id, snx_wavelengths); I believe you meant id when you said index (using a keyword as a column name is not a good idea, since it would require double-quoted identifiers):
1,[1,2,3,4]
2,[373,232,435,84]
If that is right:
select id, (stats_agg(v::float)).skewness
from myMeasures,
lateral json_array_elements_text(snx_wavelengths) v
group by id;
DBFiddle demo
BTW, you don't need the "with data" CTE in the original sample if you don't want to use it; you could replace it with a subquery, i.e.:
select (stats_agg(n)).* from (select unnest(array[16,22,33,24,15])) data(n)
union all
select (stats_agg(n)).* from (select unnest(array[416,622,833,224,215])) data(n);
EDIT: And if you needed other stats too:
select id, "count","min","max","mean","variance","skewness","kurtosis"
from myMeasures,
lateral (select (stats_agg(v::float)).* from json_array_elements_text(snx_wavelengths) v) foo
group by id,"count","min","max","mean","variance","skewness","kurtosis";
DBFiddle demo
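For reference, a hedged sketch that combines the question's own jsonb_array_castdouble() helper with unnest() per row (it assumes the spectra table from the question and the stats_agg aggregate from the linked extension):

select s.id,
       (stats_agg(v)).skewness,
       (stats_agg(v)).kurtosis
from spectra s,
     lateral unnest(jsonb_array_castdouble(s.snx_normalized)) as v
group by s.id;

The lateral unnest() expands each row's array into one value per row, and group by s.id folds them back into per-row moments.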

Extract all the values in jsonb into a row

I'm using PostgreSQL 11. I have a jsonb value which represents a row of a table; it looks like
{"userid":"test","rolename":"Root","loginerror":0,"email":"superadmin#ae.com",...,"thirdpartyauthenticationkey":{}}
Is there any method by which I could gather all the "values" of the jsonb into a string separated by ',', without the keys?
The string I want to obtain from the jsonb above looks like
(test, Root, 0, superadmin#ae.com, ..., {})
I need to keep the ORDER of those values the same as the order of their keys in the jsonb. Can I do that with PostgreSQL?
You can use the jsonb_populate_record function (assuming your json data does match the users table). This will force the text value to match the column order of your users table:
Schema (PostgreSQL v13)
CREATE TABLE users (
userid text,
rolename text,
loginerror int,
email text,
thirdpartyauthenticationkey json
)
Query #1
WITH d(js) AS (
VALUES
('{"userid":"test", "rolename":"Root", "loginerror":0, "email":"superadmin#ae.com", "thirdpartyauthenticationkey":{}}'::jsonb),
('{"userid":"other", "rolename":"User", "loginerror":324, "email":"nope#ae.com", "thirdpartyauthenticationkey":{}}'::jsonb)
)
SELECT jsonb_populate_record(null::users, js),
jsonb_populate_record(null::users, js)::text AS record_as_text,
pg_typeof(jsonb_populate_record(null::users, js)::text)
FROM d
;
jsonb_populate_record              | record_as_text                     | pg_typeof
-----------------------------------+------------------------------------+----------
(test,Root,0,superadmin#ae.com,{}) | (test,Root,0,superadmin#ae.com,{}) | text
(other,User,324,nope#ae.com,{})    | (other,User,324,nope#ae.com,{})    | text
Note that if you're building this string in order to insert it back into PostgreSQL, you don't need to do that, since the result of jsonb_populate_record will already match your table:
Query #2
WITH d(js) AS (
VALUES
('{"userid":"test", "rolename":"Root", "loginerror":0, "email":"superadmin#ae.com", "thirdpartyauthenticationkey":{}}'::jsonb),
('{"userid":"other", "rolename":"User", "loginerror":324, "email":"nope#ae.com", "thirdpartyauthenticationkey":{}}'::jsonb)
)
INSERT INTO users
SELECT (jsonb_populate_record(null::users, js)).*
FROM d;
There are no results to be displayed.
Query #3
SELECT * FROM users;
userid | rolename | loginerror | email             | thirdpartyauthenticationkey
-------+----------+------------+-------------------+----------------------------
test   | Root     | 0          | superadmin#ae.com | {}
other  | User     | 324        | nope#ae.com       | {}
View on DB Fiddle
You can use jsonb_each_text() to get a set of text representations of the elements, string_agg() to aggregate them into a comma-separated string, and concat() to wrap that in parentheses.
SELECT concat('(', string_agg(value, ', '), ')')
FROM jsonb_each_text('{"userid":"test","rolename":"Root","loginerror":0,"email":"superadmin#ae.com","thirdpartyauthenticationkey":{}}'::jsonb) jet (key, value);
db<>fiddle
You didn't provide the DDL and DML of the table the JSON may reside in (if it does at all; that isn't clear from your question). The demonstration above therefore only uses the JSON you showed as a scalar. If you do have a table, you need to CROSS JOIN LATERAL and GROUP BY some key.
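For example, a hedged sketch, assuming a hypothetical table tbl with a primary key id and the JSON in a jsonb column js:

SELECT t.id, concat('(', string_agg(jet.value, ', '), ')') AS values_csv
FROM tbl t
CROSS JOIN LATERAL jsonb_each_text(t.js) AS jet(key, value)
GROUP BY t.id;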
Edit:
If you need to be sure the order is retained and you don't have it defined in a table's structure as @Marth's answer assumes, then you can of course extract every value manually, in the order you need them:
SELECT concat('(',
              concat_ws(', ',
                        j->>'userid',
                        j->>'rolename',
                        j->>'loginerror',
                        j->>'email',
                        j->>'thirdpartyauthenticationkey'),
              ')')
FROM (VALUES ('{"userid":"test","rolename":"Root","loginerror":0,"email":"superadmin#ae.com","thirdpartyauthenticationkey":{}}'::jsonb)) v (j);
db<>fiddle

Fetch rows from postgres table which contains a specific id in jsonb[] column

I have a details table with an adeet column defined as jsonb[].
A sample value stored in the adeet column is shown in the image below.
Sample data stored in the DB:
I want to return the rows which satisfy id=26088, i.e. rows 1 and 3.
I have tried array operations and JSON operations, but they don't work as required. Any pointers?
Apparently the column adeet is not of type JSON/JSONB but perhaps VARCHAR, so we should first fix the format in order to convert it into a JSONB value. I used the replace() and rtrim()/ltrim() functions for this conversion, and preferred to derive an array so as to use the jsonb_array_elements() function:
WITH t(jobid,adeet) AS
(
  SELECT jobid, replace(replace(replace(adeet,'\',''),'"{','{'),'}"','}')
  FROM tab
), t2 AS
(
  SELECT jobid, ('['||rtrim(ltrim(adeet,'{'), '}')||']')::jsonb as adeet
  FROM t
)
SELECT t.*
FROM t2 t
CROSS JOIN jsonb_array_elements(adeet) j
WHERE (j.value ->> 'id')::int = 26088
Demo
You want to combine JSONB's <@ ("is contained by") operator with the generic-array ANY construct.
select * from foobar where '{"id": 26088}'::jsonb <@ ANY (adeet);
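An equivalent but more verbose way to express the same check (a hedged sketch against the same foobar/adeet names used above) is to unnest the array and apply the containment operator directly:

select f.*
from foobar f
where exists (
    select 1
    from unnest(f.adeet) as elem
    where elem @> '{"id": 26088}'::jsonb
);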

Build jsonb array from jsonb field

I have a column options of type jsonb, in the format {"names": ["name1", "name2"]}, which was created with
UPDATE table1 t1 SET options = (SELECT jsonb_build_object('names', names) FROM table2 t2 WHERE t2.id= t1.id)
and where names has the type jsonb array:
SELECT jsonb_typeof(names) FROM table2 gives array.
Now I want to extract the value of names as a jsonb array. But the query
SELECT jsonb_build_array(options->>'names') FROM table
gave me ["[\"name1\", \"name2\"]"], while I expect ["name1", "name2"]
How can I get the value in the right format?
The ->> operator returns the value of the field (in your case, a JSON array) as properly escaped text. What you are looking for is the -> operator instead.
However, note that using jsonb_build_array on that will return an array containing your original array, which is probably not what you want either; simply using options->'names' should get you what you want.
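A quick way to see the difference between the two operators (a hedged, self-contained example using an inline value):

SELECT j ->> 'names' AS as_text,                -- text: '["name1", "name2"]'
       j -> 'names' AS as_jsonb,                -- jsonb: ["name1", "name2"]
       jsonb_typeof(j -> 'names') AS elem_type  -- 'array'
FROM (VALUES ('{"names": ["name1", "name2"]}'::jsonb)) v (j);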
Actually, you don't need the jsonb_build_array() function at all.
Use select options -> 'names' from table; this will fix your issue.
jsonb_build_array() is for building a jsonb array from a list of values. You are going about it the wrong way; that's why you are getting a string like ["[\"name1\", \"name2\"]"].
Try to execute this sample SQL script:
select j->'names'
from (
select '{"names": ["name1", "name2"]}'::JSONB as j
) as a;

Postgresql order by case when {someCase} then json type column

I need to order the result of a select in a few different ways.
It works when ordering by a regular column of the TenderItem table.
But it does NOT work when ordering by a key of the json type column TenderItem.ItemInfo, e.g.:
select * from "TenderItem" order by "ItemInfo" ->> 'Name'; -- working in simple select
with sortingParams (columnName, isAsc) AS (VALUES ('ItemId', true))
select *
FROM "TenderItem" i, sortingParams
WHERE i."TenderId" = 1
AND i."ItemInfo" ->> 'Name' like '%Transcend%'
ORDER BY
case
WHEN columnName like '%ItemId%' THEN i."ItemId" --*work
WHEN columnName like '%ABCSegment%' THEN i."ItemInfo" ->> 'ABCSegment' --**
end desc;
** on this line I get the message "ERROR: CASE types bigint and text cannot be matched"
It's not clear how you'd sort the ItemId against the ItemInfo segment (unless the segment points to an item id), since they are not all text values (and if they are all text, but some are strings like '12345', then you do not want a text sort, because '100' would then come before '99'). You probably want them to be separate sort conditions, to give more flexibility in ordering:
with sortingParams (columnName, isAsc) AS (VALUES ('ItemId', true))
select *
FROM "TenderItem" i, sortingParams
WHERE i."TenderId" = 1
AND i."ItemInfo" ->> 'Name' like '%Transcend%'
ORDER BY
case
WHEN columnName like '%ItemId%' THEN i."ItemId"::bigint end asc nulls last --puts things with an itemID ahead of those without, or could use nulls first
--if two items have same item id, then sort by segment
, case
WHEN columnName like '%ABCSegment%' THEN i."ItemInfo" ->> 'ABCSegment'
end desc;
Note that each sort condition must give the same datatype for every row being evaluated! This is what causes the error you described, where the case expression yields a bigint for ItemId and a text value for ItemInfo ->> 'ABCSegment'.
ItemId is BIGINT and i."ItemInfo" ->> 'ABCSegment' is text, which are incompatible types to combine in a single CASE expression for sorting.
Try casting the value explicitly to BIGINT, i.e.
..WHEN columnName like '%ABCSegment%' THEN (i."ItemInfo" ->> 'ABCSegment')::BIGINT
or make i."ItemId" text if the above fails due to invalid bigint values:
i."ItemId"::TEXT