How to write a PostgreSQL query to fetch a field value inside an array of JSON data - postgresql

The following is sample JSON data in a Postgres DB:
{
  "RA1": {
    "RaItems": [
      { "a": 1, "b": 2 },
      { "a": 11, "b": 22 },
      { "a": 111, "b": 222 }
    ]
  }
}
I tried this query to fetch only the a field values in my select clause:
select data->'RA1'->'RaItems'->0->'a' from mytable;
but that only returns the first array element. I need a PostgreSQL query that fetches all the a values regardless of the array length, with output like below:
1,11,111

You need to make use of the jsonb_array_elements function and then wrap the whole thing in an outer query:
SELECT d->'a'
FROM (
  SELECT jsonb_array_elements(data->'RA1'->'RaItems') AS d
  FROM mytable
) AS subq;
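To get the comma-separated output the question asks for (1,11,111), the unnested elements can be aggregated back into a string with string_agg. A sketch, assuming the column is named data and the table mytable as above:

```sql
-- Hypothetical table/column names (mytable, data), matching the answer above.
SELECT string_agg(elem->>'a', ',') AS a_values
FROM mytable,
     jsonb_array_elements(data->'RA1'->'RaItems') AS elem;
```

Array order is usually preserved by jsonb_array_elements in this form, but to guarantee it you can add WITH ORDINALITY and an ORDER BY on the ordinality column inside string_agg.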

Related

Using SQLAlchemy to query the top x items in a Postgres JSONB field

I have a model with a JSONB field (Postgres).
from sqlalchemy.dialects.postgresql import JSONB

class Foo(Base):
    __tablename__ = 'foo'
    data = Column(JSONB, nullable=False)
    ...
where the data field looks like this:
[
  { "name": "a", "value": 0.0143 },
  { "name": "b", "value": 0.0039 },
  { "name": "c", "value": 0.1537 },
  ...
]
and, given a search_name I want to return the rows where name is in the top x ranked names.
I know I can access the fields with something like:
res = session.query(Foo).filter(Foo.data['name'] == search_name)
but how do I order the JSON and extract the top names from that?
A SQLAlchemy solution would be preferred, but also a plain SQL one I can use as raw is fine.
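Since raw SQL is acceptable here, one untested sketch is to rank each row's array elements by value and keep the rows where the searched name lands in the top x (a literal 'a' stands in for search_name, and x = 3 is an assumed example value):

```sql
SELECT f.*
FROM foo f
WHERE 'a' IN (                      -- 'a' stands in for search_name
    SELECT elem->>'name'
    FROM jsonb_array_elements(f.data) AS elem
    ORDER BY (elem->>'value')::numeric DESC
    LIMIT 3                         -- x = 3
);
```

In SQLAlchemy this could be wrapped with text() and bound parameters; the exact integration depends on the application.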

Calculate sum for items in nested jsonb array

So I have a Postgres DB table 'purchases' with columns 'id' and 'receipt'. 'id' is a primary-key int column; the value in column 'receipt' is jsonb and can look like this:
{
  "shop_name": "some_shop_name",
  "items": [
    { "name": "foo", "spent": 49, "quantity": 1 },
    { "name": "bar", "price": 99, "quantity": 2 },
    { "name": "abc", "price": 999, "quantity": 1 },
    ...
  ]
}
There can be a varying number of items in a receipt.
In the end I need to write a query so the resulting table contains the purchase id and the amount spent on all 'bar' items for every purchase, like this:
id | spent
---+------
 1 |   198
.. |  ...
My issue: I can't figure out how to work with jsonb alongside regular columns inside a select query. I guess the query should be structured like this:
SELECT p.id, %jsonb_parsing_result_here% AS spent
FROM purchases p
It's blocking me from moving further with iterating through the items in a FOR loop (or maybe using another approach).
You can unnest the items from the receipt column with a lateral join and the jsonb_array_elements function like:
SELECT p.id, SUM((item->>'price')::numeric * (item->>'quantity')::numeric) AS spent
FROM purchases p,
     LATERAL jsonb_array_elements(p.receipt->'items') r (item)
GROUP BY p.id
Note the multiplication by quantity, which is what yields 198 for two 'bar' items at 99 each. It's also possible to add a WHERE condition, for example:
WHERE item->>'name' = 'bar'
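Putting the aggregation and the filter together gives the requested result table. A sketch against the schema above (multiplying price by quantity so two 'bar' items at 99 sum to 198):

```sql
SELECT p.id,
       SUM((item->>'price')::numeric * (item->>'quantity')::numeric) AS spent
FROM purchases p,
     LATERAL jsonb_array_elements(p.receipt->'items') r (item)
WHERE item->>'name' = 'bar'
GROUP BY p.id;
```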

How to replace object in JSONB array with matching value(s)?

I have some nested JSONB objects/arrays that I want to update/insert into an array when a given set of values matches.
I've tried the command UPDATE asset SET detail = jsonb_set(detail, '{"date": "01/01/2019"}', '{"name": "ABC", "date": "01/01/2019"}');, but it doesn't work since the command expects an index path, and using || simply appends the new object to the array instead of replacing it.
Schemas
Table
CREATE TABLE asset (
  id uuid NOT NULL DEFAULT uuid_generate_v1(),
  detail jsonb NULL,
  events_connected jsonb NULL
);
JSONB Schemas
[{
"name": "Stackoverflow",
"date": "01/01/2019"
}]
A little more in depth
[{
"id": "12345",
"events": [{"type": 0, "date": "01/01/01", "ratio": 1.0}]
}]
EDIT:
I've played around with this code, which almost works, but it updates all rows with the new object instead of only the row whose event id and keys match. The subquery that finds the matching params seems to work.
UPDATE asset
SET events_connected = jsonb_set(
        events_connected,
        array[elem_index::text, 'events', nested_elem_index::text],
        '{"type": 2, "ratio": 5.0, "tax": 1, "date": "01/02/2019"}'::jsonb,
        true)
FROM (
    SELECT (pos - 1)  AS "elem_index",
           (pos2 - 1) AS "nested_elem_index",
           elem->>'id' AS "id"
    FROM asset,
         jsonb_array_elements(events_connected) WITH ORDINALITY arr(elem, pos),
         jsonb_array_elements(elem->'events') WITH ORDINALITY arr2(elem2, pos2)
    WHERE elem->>'id' = '12345'
      AND elem2->>'type' = '1'
      AND elem2->>'date' = '01/02/2019'
) sub
WHERE sub."elem_index" IS NOT NULL AND sub."nested_elem_index" IS NOT NULL;
Fiddle: https://dbfiddle.uk/?rdbms=postgres_11&fiddle=1b3f3b0c7a9f0adda50c54011880b61d
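The reason every row gets updated is that the subquery is not correlated with the row being updated, so the final WHERE clause is true for all rows once any match exists. A sketch of a fix (untested) carries the row's id out of the subquery and joins on it, using the uuid primary key column from the schema above:

```sql
UPDATE asset
SET events_connected = jsonb_set(
        asset.events_connected,
        ARRAY[sub.elem_index::text, 'events', sub.nested_elem_index::text],
        '{"type": 2, "ratio": 5.0, "tax": 1, "date": "01/02/2019"}'::jsonb,
        true)
FROM (
    SELECT a.id,                       -- carry the row's id out of the subquery
           (pos - 1)  AS elem_index,
           (pos2 - 1) AS nested_elem_index
    FROM asset a,
         jsonb_array_elements(a.events_connected) WITH ORDINALITY arr(elem, pos),
         jsonb_array_elements(elem->'events') WITH ORDINALITY arr2(elem2, pos2)
    WHERE elem->>'id' = '12345'
      AND elem2->>'type' = '1'
      AND elem2->>'date' = '01/02/2019'
) sub
WHERE asset.id = sub.id;               -- update only the matching row
```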

Postgres query to fetch records from table having a column of type Array of Objects

I have a table with two columns called id and categoryArray, where id is of a numeric type and categoryArray is a jsonb[] column. The schema is as mentioned below:
"id": 1,
"categoryArray": [{
"Field1": "A",
"Field2": "A1"
},
{
"Field1": "B",
"Field2": "A2"
},
{
"Field1": "C",
"Field2": "A3"
}
]
The table will have multiple records with distinct ids, each with objects inside categoryArray.
I want to query in PostgreSQL all the ids where Field1 = 'A' inside a categoryArray object.
I have tried using jsonb_array_elements but was not able to achieve the expected result.
Is there a way to query on the basis of Field1, which is inside a categoryArray object?
Since you are already using jsonb[], you can use the following query to get records having Field1 = 'A', using the subscript-generating function generate_subscripts() (the table name is written as mytable here, since table is a reserved word):
SELECT *
FROM mytable
WHERE id IN (
    SELECT id
    FROM (
        SELECT id,
               categoryArray,
               generate_subscripts(categoryArray, 1) AS s
        FROM mytable
    ) AS i
    WHERE categoryArray[s]->>'Field1' = 'A'
);
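A shorter alternative sketch for the same jsonb[] column uses unnest with an EXISTS test (again writing the table name as mytable, since the real name is not given in the question):

```sql
SELECT t.id
FROM mytable t
WHERE EXISTS (
    SELECT 1
    FROM unnest(t.categoryArray) AS elem   -- jsonb[] unnests to one jsonb per row
    WHERE elem->>'Field1' = 'A'
);
```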

Access a JSONB array of objects as an object of arrays

I have a JSONB column in my Postgres 9.6 database with roughly the following structure
[
  {
    "A": "some value",
    "B": "another value",
    "foo": "bar",
    "x": "y"
  },
  {
    "B": "abc",
    "C": "asdf"
  }
]
It is always an array of objects, the number of array elements varies. Some of the object keys are in each array element, but not all of them. The real objects have many more keys, a few hundred are common.
In certain situations, I need to get the value of a specific key for each array element. For example, if I want to access the key "B" the result should be
["another value", "abc"]
if I access "foo" the result should be
["bar", null]
Is there a reasonably efficient way to fetch all values for a specific key in a SQL query? It should work independently of the number of objects in the array, and ideally it should not get slower if the number of keys in the objects gets much larger.
You can use jsonb_array_elements to extract each object, aggregate the values you want in an array using ARRAY_AGG, and then convert that into a JSON array using array_to_json:
WITH j(json) AS (
  VALUES ('[
    {"A": "some value", "B": "another value", "foo": "bar", "x": "y"},
    {"B": "abc", "C": "asdf"}
  ]'::jsonb)
)
SELECT array_to_json(ARRAY_AGG(elem->'B'))
FROM j, jsonb_array_elements(json) elem;

     array_to_json
-------------------------
 ["another value","abc"]
(1 row)

WITH j(json) AS (
  VALUES ('[
    {"A": "some value", "B": "another value", "foo": "bar", "x": "y"},
    {"B": "abc", "C": "asdf"}
  ]'::jsonb)
)
SELECT array_to_json(ARRAY_AGG(elem->'foo'))
FROM j, jsonb_array_elements(json) elem;

 array_to_json
---------------
 ["bar",null]
(1 row)
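As a side note, on Postgres 9.5 and later the ARRAY_AGG/array_to_json pair can likely be collapsed into a single jsonb_agg call. A sketch, reusing the same CTE as above:

```sql
SELECT jsonb_agg(elem->'foo')
FROM j, jsonb_array_elements(json) AS elem;
```

jsonb_agg folds SQL NULLs into JSON null, so missing keys still show up as null entries in the result array, matching the behavior shown above.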