I have some rows that look like this:
"id","name","data","created_at"
110,"customerAdd","{\"age\": \"23\", \"name\": \"sally\", \"status\": \"valid\"}","2016-05-10 00:18:53.325752"
111,"customerUpdate","{\"age\": \"19\", \"name\": \"sally\"}","2016-05-10 00:18:53.331443"
112,"customerDelete","{\"name\": \"sally\", \"status\": \"deleted\"}","2016-05-10 00:18:53.338929"
Using a query like this:
SELECT * FROM events WHERE data->>'name' = 'sally';
I'm looking for a way to flatten / reduce / join all of these rows into one row.
Either in JSON, like this:
{"age": 19, "status": "deleted", "name": "sally"}
Or In actual columns, like this:
age, status, name
19, deleted, sally
The general idea is that I want to resolve each prop to the last value it was assigned, but I'd like this to happen deeply.
I'm just looking for any way to get started.
You can apply DISTINCT ON to the keys and values returned by jsonb_each_text, ordered by created_at desc, and then recombine the surviving values using jsonb_object:
with
events(id,name,data,created_at) as (
values (110,'customerAdd','{"age": "23", "name": "sally", "status": "valid"}'::jsonb,'2016-05-10 00:18:53.325752'::timestamp),
(111,'customerUpdate','{"age": "19", "name": "sally"}','2016-05-10 00:18:53.331443'),
(112,'customerDelete','{"name": "sally", "status": "deleted"}','2016-05-10 00:18:53.338929')
),
flattened as (
select distinct on (kv.key) *
from events e
join lateral jsonb_each_text(data) kv on true
order by kv.key, e.created_at desc
)
select jsonb_object(array(select key from flattened order by key),
array(select value from flattened order by key))
I have a column in postgresl 14 table that is a list of dictionary with 4 keys like this:
[{"id": 14771, "stock": "alfa-12", "name": "rooftop", "store": "MI"},
{"id": 14700, "stock": "beta-10", "name": "stove", "store": "UK"}]
This list can contain dozens of dicts but they all have id, stock, name and store as keys.
Is there a way to query this field and get it as regular columns like this:
id stock name store
14771 alfa-12 rooftop MI
14700 beta-10 stove UK
Use jsonb_array_elements() to get all elements of the json array and jsonb_to_record() to convert the elements to records. Read about the functions in the documentation.
select id, name, stock, store
from my_table
cross join jsonb_array_elements(json_col)
cross join jsonb_to_record(value) as t(id int, name text, stock text, store text)
Test it in db<>fiddle.
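A self-contained sketch of the same query, assuming the column is jsonb and using the hypothetical table name my_table with the sample data from the question:

```sql
-- Sample data from the question; table/column names are assumptions
with my_table(json_col) as (
  values ('[{"id": 14771, "stock": "alfa-12", "name": "rooftop", "store": "MI"},
            {"id": 14700, "stock": "beta-10", "name": "stove", "store": "UK"}]'::jsonb)
)
select t.id, t.stock, t.name, t.store
from my_table
cross join jsonb_array_elements(json_col)          -- one row per array element, column "value"
cross join jsonb_to_record(value) as t(id int, name text, stock text, store text);
-- id    | stock   | name    | store
-- 14771 | alfa-12 | rooftop | MI
-- 14700 | beta-10 | stove   | UK
```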
For simplicity, a row of the table looks like this:
key: "z06khw1bwi886r18k1m7d66bi67yqlns",
reference_keys: {
"KEY": "1x6t4y",
"CODE": "IT137-521e9204-ABC-TESTE",
"NAME": "A"
},
I have a jsonb object like this one {"KEY": "1x6t4y", "CODE": "IT137-521e9204-ABC-TESTE", "NAME": "A"} and I want to search for a query string in the values of any key. If my query is something like '521e9204', I want it to return the rows whose reference_keys contain '521e9204' in any value. Basically, the keys don't matter in this scenario.
Note: The column reference_keys, and so the jsonb object, is always one-dimensional (a flat object with no nesting).
I have tried a query like this:
SELECT * FROM table
LEFT JOIN jsonb_each_text(table.reference_keys) AS j(k, value) ON true
WHERE j.value LIKE '%521e9204%'
The problem is that it duplicates rows, one for every key in the JSON, which messes up the returned items.
I have also thought of doing something like this:
SELECT DISTINCT jsonb_object_keys(reference_keys) from table;
and then use a query like:
SELECT * FROM table
WHERE reference_keys->>'CODE' like '%521e9204%'
It seems like this would work but I really don't want to rely on this solution.
You can rewrite your JOIN to an EXISTS condition to avoid the duplicates:
SELECT t.*
FROM the_table t
WHERE EXISTS (select *
from jsonb_each_text(t.reference_keys) AS j(k, value)
WHERE j.value LIKE '%521e9204%');
If you are using Postgres 12 or later, you can also use a JSON path query:
where jsonb_path_exists(reference_keys, 'strict $.** ? (@ like_regex "521e9204")')
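A minimal sketch of the JSON path variant as a full query, assuming the table is named the_table as above:

```sql
-- @ refers to the current item visited by the recursive .** accessor;
-- like_regex matches the substring anywhere in each string value
SELECT t.*
FROM the_table t
WHERE jsonb_path_exists(t.reference_keys,
                        'strict $.** ? (@ like_regex "521e9204")');
```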
I have a PostgreSQL (V14) database containing info in JSONB format. The info of one cell could be something like this:
{
"Car23": {
"color": "blue",
"year": 1982,
"engine": [
12,
23.3
],
"broke": [
2,
8.5
]
},
"Banana": {
"color": "yellow",
"year": 2022,
"taste": "ok"
},
"asdf": {
"taste": "bad",
"year": [
1945,
6
],
"engine": [
24,
53.534
]
},
"Unique": {
"broke": [
342,
2.5
]
}
}
The outer key, e.g. "Car23" or "Banana", has a random name created by an outside program. I want to write queries that find the outer key(s) containing a certain key/value.
For instance:
Find the outer key(s) that broke. ("Car23" and "Unique")
Find the outer key(s) that have a year above 1988. ("Banana")
Find the outer key(s) that have engine info where the second array number is higher than 50. ("asdf")
In SQL this seems like pretty standard stuff, however I don't know how to do this in JSONB when the outer keys have random names...
I read that wildcard outer keys aren't possible, so I'm hoping there's another way of doing this within PostgreSQL.
You will need to unnest the JSON elements and then pick the ones you want. The fact that some values are sometimes stored in an array, and sometimes as a plain value makes things even more complicated.
I assume that "things that broke" just means those that have a key broke:
select j.key
from the_table t
cross join lateral (
select *
from jsonb_each(t.the_column) as j(key, item)
where j.item ? 'broke'
) j;
To find those with a year > 1988 is tricky because of the two different ways of storing the year:
select j.key
from the_table t
cross join lateral (
select *
from jsonb_each(t.the_column) as j(key, item)
where case
when jsonb_typeof(j.item -> 'year') = 'array' then (j.item -> 'year' -> 0)::int
else (j.item ->> 'year')::int
end > 1988
) j;
When checking for the "engine" array item, you probably should also check if it's really an array:
select j.key
from the_table t
cross join lateral (
select *
from jsonb_each(t.the_column) as j(key, item)
where jsonb_typeof(j.item -> 'engine') = 'array'
and (j.item -> 'engine' ->> 1)::numeric > 50
) j;
I made a similar post before, but deleted it as it had contextual errors.
One of the tables in my database includes a JSONB column which includes an array of JSON objects. It's not dissimilar to this example of a session table which I've mocked up below.
id | user_id | snapshot | inserted_at
1 | 37 | {"cart": [{"product_id": 1, "price_in_cents": 3000, "name": "product A"}, {"product_id": 2, "price_in_cents": 2500, "name": "product B"}]} | 2022-01-01 20:00:00.000000
2 | 24 | {"cart": [{"product_id": 1, "price_in_cents": 3000, "name": "product A"}, {"product_id": 3, "price_in_cents": 5500, "name": "product C"}]} | 2022-01-02 20:00:00.000000
3 | 88 | {"cart": [{"product_id": 4, "price_in_cents": 1500, "name": "product D"}, {"product_id": 2, "price_in_cents": 2500, "name": "product B"}]} | 2022-01-03 20:00:00.000000
The query I've worked with to retrieve records from this table is as follows.
SELECT sessions.*
FROM sessions
INNER JOIN LATERAL (
SELECT *
FROM jsonb_to_recordset(sessions.snapshot->'cart')
AS product(
"product_id" integer,
"name" varchar,
"price_in_cents" integer
)
) AS cart ON true;
I've been trying to update the query above to retrieve only the records in the sessions table for which ALL of the products in the cart have a price_in_cents value of greater than 2000.
To this point, I've not had any success on forming this query but I'd be grateful if anyone here can point me in the right direction.
You can use a JSON path expression:
select *
from sessions
...
where not sessions.snapshot @@ '$.cart[*].price_in_cents <= 2000'
There is no JSON path expression that checks that all array elements are greater than 2000. So this instead returns the rows where no element is 2000 or smaller - because that condition can be expressed with a JSON path expression.
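A self-contained sketch of this double negation, using the @@ jsonb path predicate operator on data shaped like the sample rows (sessions 1 and 2 qualify; session 3 contains a 1500-cent product):

```sql
with sessions(id, snapshot) as (values
  (1, '{"cart": [{"product_id": 1, "price_in_cents": 3000}, {"product_id": 2, "price_in_cents": 2500}]}'::jsonb),
  (2, '{"cart": [{"product_id": 1, "price_in_cents": 3000}, {"product_id": 3, "price_in_cents": 5500}]}'),
  (3, '{"cart": [{"product_id": 4, "price_in_cents": 1500}, {"product_id": 2, "price_in_cents": 2500}]}')
)
select id
from sessions
-- keep rows where NO cart item is priced at 2000 cents or less
where not snapshot @@ '$.cart[*].price_in_cents <= 2000';
-- returns ids 1 and 2
```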
Here is one possible solution based on the idea of your original query.
Each element of the cart JSON array is joined to its parent sessions row. You're left to add the WHERE clause conditions now that the wanted JSON array elements are exposed.
SELECT *
FROM (
SELECT
sess.id,
sess.user_id,
sess.inserted_at,
cart_items.cart_name,
cart_items.cart_product_id,
cart_items.cart_price_in_cents
FROM sessions sess,
LATERAL (SELECT (sess.snapshot -> 'cart') AS snapshot_cart) snap_arr,
LATERAL (SELECT
(value ->> 'name') AS cart_name,
(value ->> 'product_id')::int AS cart_product_id,
(value ->> 'price_in_cents')::int AS cart_price_in_cents
FROM jsonb_array_elements(snap_arr.snapshot_cart)) cart_items
) session_snapshot_cart_product;
Explanation:
From the sessions table, the cart array is extracted and joined to each sessions row
The items of the cart JSON array are then unnested by the second join using the jsonb_array_elements(jsonb) function
The following worked well for me and allowed me the flexibility to use different comparison operators other than just ones such as == or <=.
In one of the scenarios I needed to construct, I also needed to have my WHERE in the subquery also compare against an array of values using the IN comparison operator, which was not viable using some of the other solutions that were looked at.
Leaving this here in case others run into the same issue as I did, or if others find better solutions or want to propose suggestions to build upon this one.
SELECT *
FROM sessions
WHERE NOT EXISTS (
SELECT 1
FROM jsonb_to_recordset(sessions.snapshot->'cart')
AS product(
"product_id" integer,
"name" varchar,
"price_in_cents" integer
)
WHERE name ILIKE 'Product%'
);
I have a table with three columns: id, name and position. I want to create a JSON array as following:
[
{"id": 443, "name": "first"},
{"id": 645, "name": "second"}
]
This should be listed by the position column.
I started with the following query:
with output as
(
select id, name
from the_table
)
select array_to_json(array_agg(output))
from output
This works, great. Now I want to add the ordering. I started with this:
with output as
(
select id, name, position
from the_table
)
select array_to_json(array_agg(output order by output.position))
from output
Now the output is as following:
[
{"id": 443, "name": "first", "position": 1},
{"id": 645, "name": "second", "position": 2}
]
But I don't want the position field in the output.
I am facing a chicken-and-egg problem: I need the position column to be able to order on it, but I also don't want the position column to appear in the result output.
How can I fix this?
I don't think the following query is correct, as table ordering is (theoretically) not preserved between queries:
with output as
(
select id, name
from the_table
order by position
)
select array_to_json(array_agg(output))
from output
There are two ways (at least):
Build JSON object:
with t(x,y) as (values(1,1),(2,2))
select json_agg(json_build_object('x',t.x) order by t.y) from t;
Or delete unnecessary key:
with t(x,y) as (values(1,1),(2,2))
select json_agg((to_jsonb(t)-'y')::json order by t.y) from t;
Note that in the second case you need some type casts because the - operator is defined only for the jsonb type.
Also note that I used direct JSON aggregation with json_agg() instead of the array_to_json(array_agg()) pair.
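Applied to the original id/name/position table, the first approach might look like this (sample values assumed):

```sql
-- Hypothetical sample rows matching the question's desired output
with the_table(id, name, position) as (
  values (443, 'first', 1), (645, 'second', 2)
)
select json_agg(json_build_object('id', id, 'name', name) order by position)
from the_table;
-- yields [{"id": 443, "name": "first"}, {"id": 645, "name": "second"}]
-- (whitespace in the JSON output may differ)
```

The position column is used only inside the aggregate's ORDER BY, so it never appears in the result.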