I have a field extra of type jsonb in my table product, and I need to get the unique keys, together with the unique values for each key, across all product rows. Example data in the extra field:
{"SIZE": "110/116", "COLOUR": "Vit", "GENDER": "female", "AGE_GROUP": "Kids", "ALTERNATIVE_IMAGE": "some_path"}
For now I use a query like this:
select DISTINCT e.key, array_agg(DISTINCT e.value) as fields
from products AS p
join jsonb_each_text(p.extras) e on true
GROUP BY e.key
And I get a response like this (a small part with some keys; in the full response all keys are present):
[
{
"key": "AGE_GROUP",
"fields": "{Adult,children,Kids}"
},
{
"key": "GENDER",
"fields": "{female,male,man}"
}
]
How can I change the fields alias to be an array? Like this:
[
{
"AGE_GROUP": ["Adult","children","Kids"]
},
{
"GENDER": ["female","male","man"]
}
]
Or maybe it would be even better like this:
[
"some_alias": [{"AGE_GROUP": "Adult", "AGE_GROUP": "children", "AGE_GROUP": "Kids"}],
"some_alias": [{"GENDER": "female", "GENDER": "male", "GENDER": "man"}]
]
This will get you the former form I think:
select jsonb_agg(jsonb_build_object(k,v)) from (
select DISTINCT e.key, jsonb_agg(DISTINCT e.value) as fields
from products AS p
join jsonb_each_text(p.extras) e on true
GROUP BY e.key
) b(k,v);
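If a single object keyed by field name would be more convenient than an array of one-key objects, jsonb_object_agg can collapse the pairs into one document. Just a sketch, reusing the products table and extras column from the question:
-- Produces one object such as {"AGE_GROUP": ["Adult", "Kids", ...], "GENDER": ["female", "male", ...]}
select jsonb_object_agg(k, v)
from (
    select e.key, jsonb_agg(distinct e.value)
    from products as p
    cross join lateral jsonb_each_text(p.extras) as e
    group by e.key
) b(k, v);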
Best regards,
Bjarni
Related
I have a postgres table with a jsonb column like this:
create table if not exists doc
(
id uuid not null
constraint pkey_doc_id
primary key,
data jsonb not null
);
INSERT INTO doc (id, data) VALUES ('3cf40366-ea58-402d-b63b-c9d6fdf99ec8', '{"Id": "3cf40366-ea58-402d-b63b-c9d6fdf99ec8", "Tags": [{"Key": "inoivce", "Value": "70086"},{"Key": "customer", "Value": "100233"}] }' );
INSERT INTO doc (id, data) VALUES ('ae2d1119-adb9-41d2-96e9-53445eaf97ab', '{"Id": "ae2d1119-adb9-41d2-96e9-53445eaf97ab", "Tags": [{"Key": "project", "Value": "12345"},{"Key": "customer", "Value": "100233"}]}' );
Tags.Key in the first row contains a typo inoivce which I want to fix to invoice:
{
"Id": "3cf40366-ea58-402d-b63b-c9d6fdf99ec8",
"Tags": [{
"Key": "inoivce",
"Value": "70086"
},{
"Key": "customer",
"Value": "100233"
}]
}
I tried this:
update doc set data = jsonb_set(
data,
'{"Tags"}',
$${"Key":"invoice"}$$
) where data @> '{"Tags": [{ "Key":"inoivce"}]}';
The typo gets fixed but I'm losing the other Tags elements in the array:
{
"Id": "3cf40366-ea58-402d-b63b-c9d6fdf99ec8",
"Tags": [{"Key": "invoice"}]
}
How can I fix the typo without removing the other elements of the Tags array?
Dbfiddle for repro.
One possible solution, not so obvious: we need a CTE because the idea is to loop over the 'Tags' jsonb array elements and rebuild the array with the jsonb_agg aggregate function, but the SET clause of an UPDATE doesn't accept aggregate functions.
WITH list AS
( SELECT d.id,
         (d.data - 'Tags') || jsonb_build_object(
           'Tags',
           jsonb_agg(
             jsonb_set(
               e.content,
               '{Key}' :: text[],
               to_jsonb(replace(e.content->>'Key', 'inoivce', 'invoice'))
             ) ORDER BY e.id
           )
         ) AS data
  FROM doc AS d
  CROSS JOIN LATERAL jsonb_array_elements(d.data->'Tags') WITH ORDINALITY AS e(content, id)
  WHERE d.data @? '$.Tags[*] ? (exists(@ ? (@.Key == "inoivce")))'
  GROUP BY d.id, d.data
)
UPDATE doc AS d
SET data = l.data
FROM list AS l
WHERE d.id = l.id;
see the result in dbfiddle
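As a quick sanity check after running the update (a sketch, using the id from the first INSERT above), the first row should now contain "invoice" while keeping the "customer" element:
-- Pretty-print the corrected document for the row that had the typo
select jsonb_pretty(data)
from doc
where id = '3cf40366-ea58-402d-b63b-c9d6fdf99ec8';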
I have a jsonb object like this:
{
"members": [
[
"1966-07-31",
null,
{
"last_name": "ss",
"first_name": "ss"
}
],
[
"1968-12-17",
"spouse",
{
"last_name": "kk",
"first_name": "kk"
}
]
]
}
I want to convert it to:
{
"applicants": [
{
"last_name": "ss",
"first_name": "ss"
},
{
"last_name": "kk",
"first_name": "kk"
}
]
}
Essentially, taking the third element of each member and putting it as an object in a new array 'applicants'. I don't need the member data that is outside the object.
I can run a PHP script to loop through all the rows and update them, but I'm wondering if I can write this as a plain SQL query?
Thanks.
You should use jsonb_array_elements with two CROSS JOINs to extract the JSON array data, then aggregate the results.
Dynamic array:
Demo
select
jsonb_build_object('applicants', jsonb_agg(e2.value))
from
test t
cross join jsonb_array_elements(t.obj -> 'members') as e1
cross join jsonb_array_elements(e1.value) as e2
where
e2.value ? 'first_name'
and e2.value ? 'last_name'
Static array:
If your structure is fixed, you don't need to loop over the inner array and can use the query below:
Demo
select
jsonb_build_object('applicants', jsonb_agg(e1.value -> 2))
from
test t
cross join jsonb_array_elements(t.obj -> 'members') as e1
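If you want to rewrite the stored value rather than just select it, the same expression folds into an UPDATE. A sketch only, assuming the table and column are test.obj as in the demos and every row has a members array:
-- Rebuild each row's document as {"applicants": [...]} from the third
-- element of every members entry, as in the static query above.
update test t
set obj = (
    select jsonb_build_object('applicants', jsonb_agg(e.value -> 2))
    from jsonb_array_elements(t.obj -> 'members') as e
);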
Given a jsonb value and a set of keys, how can I get a new jsonb containing only the required keys?
I've tried extracting the key-value pairs into a text[] and then using jsonb_object(text[]). It works well, but the problem comes when a key holds an array of JSON objects.
create table my_jsonb_table
(
data_col jsonb
);
insert into my_jsonb_table (data_col) Values ('{
"schemaVersion": "1",
"Id": "20180601550002",
"Domains": [
{
"UID": "29aa2923",
"quantity": 1,
"item": "book",
"DepartmentDomain": {
"type": "paper",
"departId": "10"
},
"PriceDomain": {
"Price": 79.00,
"taxA": 6.500,
"discount": 0
}
},
{
"UID": "bbaa2923",
"quantity": 2,
"item": "pencil",
"DepartmentDomain": {
"type": "wood",
"departId": "11"
},
"PriceDomain": {
"Price": 7.00,
"taxA": 1.5175,
"discount": 1
}
}
],
"finalPrice": {
"totalTax": 13.50,
"total": 85.0
},
"MetaData": {
"shopId": "1405596346",
"locId": "95014",
"countryId": "USA",
"regId": "255",
"Date": "20180601"
}
}
')
This is what I am trying to achieve:
SELECT some_magic_fun(data_col,'Id,Domains.UID,Domains.DepartmentDomain.departId,finalPrice.total')::jsonb FROM my_jsonb_table;
I am trying to create that magic function which extracts the given keys as jsonb. As of now I am able to extract scalar items, put them in a text[], and use jsonb_object, but I don't know how I can extract all the elements of an array.
expected output:
{
"Id": "20180601550002",
"Domains": [
{
"UID": "29aa2923",
"DepartmentDomain": {
"departId": "10"
}
},
{
"UID": "bbaa2923",
"DepartmentDomain": {
"departId": "11"
}
}
],
"finalPrice": {
"total": 85.0
}
}
I don't know of any magic. You have to rebuild it yourself.
select jsonb_build_object(
-- Straightforward
'Id', data_col->'Id',
'Domains', (
-- Aggregate all the "rows" back together into an array.
select jsonb_agg(
-- Turn each array element into a new object
jsonb_build_object(
'UID', domain->'UID',
'DepartmentDomain', jsonb_build_object(
'departId', domain#>'{DepartmentDomain,departId}'
)
)
)
-- Turn each element of the Domains array into a row
from jsonb_array_elements( data_col->'Domains' ) d(domain)
),
-- Also pretty straightforward
'finalPrice', jsonb_build_object(
'total', data_col#>'{finalPrice,total}'
)
) from my_jsonb_table;
This probably is not a good use of a JSON column. Your data is relational and would better fit traditional relational tables.
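For comparison, a relational layout for the same document could look roughly like this. A sketch only, with made-up table and column names:
-- Hypothetical schema: one row per order plus one row per Domains entry,
-- instead of a single jsonb document.
create table orders (
    id        text primary key,
    total     numeric,
    total_tax numeric
);
create table order_domains (
    order_id  text references orders(id),
    uid       text,
    item      text,
    quantity  int,
    depart_id text,
    price     numeric
);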
The JSON below is the value of a JSONB column in my table 'logic'. I want to query how many rows there are with type: QUESTION (in any entry within conditions).
{
"name": null,
"conditions": [
{
"type": "QUESTION",
"question": {
}
},
{
"type": "QUESTION",
"question": {
}
},
{
"type": "FIELD",
"question": {
}
}
],
"expression": "A"
}
If you want to count the number of times a "type": "QUESTION" entry appears within conditions across the whole table:
select count(*) FROM logic
CROSS JOIN LATERAL jsonb_array_elements(jsonb_col->'conditions') AS j(typ)
WHERE j.typ->>'type' = 'QUESTION'
If you want to count, for each row, the number of times a "type": "QUESTION" entry appears within conditions:
select jsonb_col, count(*) FROM logic
CROSS JOIN LATERAL jsonb_array_elements(jsonb_col->'conditions') AS j(typ)
WHERE j.typ->>'type' = 'QUESTION'
GROUP BY jsonb_col
If you want to check how many rows have at least one entry within conditions with 'type' = 'QUESTION':
select count(*) FROM
(
    select DISTINCT jsonb_col FROM logic
    CROSS JOIN LATERAL jsonb_array_elements(jsonb_col->'conditions') AS j(typ)
    WHERE j.typ->>'type' = 'QUESTION'
) s;
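For the at-least-one-match case, the containment operator @> avoids unnesting the array entirely (a sketch, keeping the jsonb_col placeholder):
-- Counts rows whose conditions array contains at least one element
-- with "type": "QUESTION".
select count(*) FROM logic
WHERE jsonb_col @> '{"conditions": [{"type": "QUESTION"}]}';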
Use whichever query is appropriate for your case.
Demo
I have a table that stores records broken into fields, i.e. if I have this record as follows:
{
"name": "John Doe"
"gender": "male"
}
Then this record is stored in the table in 2 rows, as follows
[
{ "id": "1", "col": "name", "value": "John Doe" },
{ "id": "1", "col": "gender", "value": "male" }
]
If I want to write a PostgreSQL function that returns the record back to its original form (all attributes in one row, instead of the one-attribute-per-row form), how can I do it?
(The table design was done as an experiment for data warehousing purposes.)
If I understand you right, you could write something like this:
select a.value, b.value
from table1 a
inner join table1 b on a.id = b.id and a.orderNum < b.orderNum
To make it simple, you could introduce a field orderNum.
P.S. I do not have Postgres installed, so you may need to fix my query.
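For a fixed set of attributes, a conditional-aggregation pivot also works without the self-join. A sketch only, assuming the table is record_fields with columns (id, col, value) as in the example rows, since the question does not name it:
-- One output row per id, pulling each attribute out of its name/value row.
select id,
       max(value) filter (where col = 'name')   as name,
       max(value) filter (where col = 'gender') as gender
from record_fields
group by id;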