I have a table orders which contains a jsonb column called stores, holding an array with many elements, like:
id status stores
1 in progress [{"id": 1, "lat": 19.41, "lng": -99.18, "name": "Domino´s pizza condesa", "products": "3 tacos de cabeza"},
{"id": 2, "lat": 19.03, "lng": -99.9, "name": "Papa guapa roma", "products": "una papa"}]
1 done [{"id": 3, "lat": 19.44, "lng": -99.28, "name": "ABC", "products": "3 tacos de cabeza"},
{"id": 4, "lat": 19.23, "lng": -99.29, "name": "Papa guapa roma", "products": "una papa"}]
I want to query the table orders and select only the element from the json that matches {"lat": 19.41, "lng": -99.18}
So I get something like
id status store_filtered
1 in progress [{"id": 1, "lat": 19.41, "lng": -99.18, "name": "Domino´s pizza condesa", "products": "3 tacos de cabeza"}]
I currently have
SELECT id, status, stores->0 AS stores_filtered FROM orders WHERE stores @> '[{"lat":19.41, "lng": -99.18}]'
But that 0 is what I need to make dynamic, so that it gives me the element that matches, not just the first one.
I was trying something like:
with r as (
select status, unnest(stores) as stores
from orders
)
select * from r where stores->>'lat' = '19.41' and stores->>'lng' = '-99.18'
Assuming that you're using PostgreSQL 9.5 (or newer), you can use the json_array_elements(json) (or jsonb_array_elements(jsonb)) function to "unnest" the array, and then choose the elements you need.
For instance, you can try:
WITH orders(id, status, stores) AS
(
VALUES
(1, 'in progress',
'[{"id": 1, "lat": 19.41, "lng": -99.18, "name": "Domino´s pizza condesa", "products": "3 tacos de cabeza"},
{"id": 2, "lat": 19.03, "lng": -99.9, "name": "Papa guapa roma", "products": "una papa"}]'::jsonb),
(1, 'done',
'[{"id": 3, "lat": 19.44, "lng": -99.28, "name": "ABC", "products": "3 tacos de cabeza"},
{"id": 4, "lat": 19.23, "lng": -99.29, "name": "Papa guapa roma", "products": "una papa"}]')
)
, expanded_stores AS
(
SELECT
id, status, jsonb_array_elements(stores) AS single_store
FROM
orders
)
SELECT
id, status, single_store AS filtered_store
FROM
expanded_stores
WHERE
single_store @> '{"lat":19.41, "lng": -99.18}'
Or, assuming that you have already declared your orders table, you could just use:
SELECT
id, status, single_store AS filtered_store
FROM
(
SELECT
id, status, jsonb_array_elements(stores) AS single_store
FROM
orders
) AS expanded_stores
WHERE
single_store @> '{"lat":19.41, "lng": -99.18}' ;
You can check it (running on version 9.5) at http://rextester.com/MQE36487.
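As a side note: on PostgreSQL 12 or newer, the same result can be expressed with a jsonpath filter instead of expanding the array. A minimal sketch, assuming the same orders table as above:
SELECT
    id, status,
    jsonb_path_query(stores, '$[*] ? (@.lat == 19.41 && @.lng == -99.18)') AS filtered_store
FROM
    orders
WHERE
    stores @> '[{"lat": 19.41, "lng": -99.18}]' ;
The WHERE clause filters out rows with no matching store, and jsonb_path_query then extracts just the matching element from the array.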
NOTE: Although this works, it looks a bit "unnatural" for a SQL database. It is normally easier to normalize the information, and reserve JSON for the cases where a lot of schema flexibility is really needed.
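For illustration, a minimal sketch of what that normalization could look like (the table and column names here are hypothetical):
-- one row per store instead of a jsonb array inside orders
CREATE TABLE order_stores (
    order_id  integer REFERENCES orders(id),
    store_id  integer,
    lat       numeric,
    lng       numeric,
    name      text,
    products  text
);
-- the original question then becomes a plain lookup,
-- which a regular b-tree index on (lat, lng) can serve
SELECT order_id, store_id, name, products
FROM order_stores
WHERE lat = 19.41 AND lng = -99.18 ;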
I have a PostgreSQL 12.x database. There is a column data in a table typename that contains JSON. The actual JSON data is not fixed to a particular structure; these are some examples:
{"emt": {"key": " ", "source": "INPUT"}, "id": 1, "fields": {}}
{"emt": {"key": "Stack Overflow", "source": "INPUT"}, "id": 2, "fields": {}}
{"emt": {"key": "https://www.domain.tld/index.html", "source": "INPUT"}, "description": {"key": "JSONB datatype", "source": "INPUT"}, "overlay": {"id": 5, "source": "bOv"}, "fields": {"id": 1, "description": "Themed", "recs ": "1"}}
Basically, what I'm trying to come up with is a (database migration) script that will find any object with the keys key and source, take the actual value of key and assign it to the key the object was originally bound to. For instance:
{"emt": " ", "id": 1, "fields": {}}
{"emt": "Stack Overflow", "id": 2, "fields": {}}
{"emt": "https://www.domain.tld/index.html", "description": "JSONB datatype", "overlay": {"id": 5, "source": "bOv"}, "fields": {"id": 1, "description": "Themed", "recs ": "1"}}
I started finding the rows that contained "source": "INPUT" by using:
select * from typename
where jsonb_path_exists(data, '$.** ? (@.type() == "string" && @ like_regex "INPUT")');
...but then I'm not sure how to update the returned subset or to loop through it :/
It took me a while, but here is the update statement:
update typename
set data = jsonb_set(data, '{emt}', jsonb_extract_path(data, 'emt', 'key')::jsonb, false)
where jsonb_typeof(data -> 'emt') = 'object'
and jsonb_path_exists(data, '$.emt.key ? (@.type() == "string")')
and jsonb_path_exists(data, '$.emt.source ? (@.type() == "string" && @ like_regex "INPUT")');
There are probably better ways to implement that where clause, but that one works ;)
One downside is that I had to figure out how many keys are involved in the update and write a matching number of update statements; e.g., in the original example there were two such keys, emt and description, so it should have been two update statements.
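For instance, following the same pattern, the second statement for the description key would presumably look like this (a sketch, not tested against the real data):
update typename
set data = jsonb_set(data, '{description}', jsonb_extract_path(data, 'description', 'key')::jsonb, false)
where jsonb_typeof(data -> 'description') = 'object'
and jsonb_path_exists(data, '$.description.key ? (@.type() == "string")')
and jsonb_path_exists(data, '$.description.source ? (@.type() == "string" && @ like_regex "INPUT")');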
I have a simple table of purchases consisting of an id and a jsonb column, like this:
CREATE TABLE purchase (id SERIAL, products JSONB)
Then an index:
CREATE INDEX idx_purchase_products ON purchase USING GIN (products);
The sample data are like this:
INSERT INTO purchase VALUES (
1, jsonb('[
{
"country": 1,
"type": 1,
"size": 10,
"color": 3
}
]')
),
(
2, jsonb('[
{
"country": 1,
"type": 1,
"size": 10,
"color": 3
},
{
"country": 1,
"type": 2,
"size": 12,
"color": 4
},
{
"country": 2,
"type": 1,
"size": 12,
"color": 3
}
]')
),
(
3, jsonb('[
{
"country": 1,
"type": 1,
"size": 10,
"color": 3
}
]')
),
(
4, jsonb('[
{
"country": 1,
"type": 1,
"size": 10,
"color": 3
},
{
"country": 1,
"type": 2,
"size": 12,
"color": 4
},
{
"country": 2,
"type": 1,
"size": 12,
"color": 3
}
]')
);
And here are some search scenarios:
SELECT *
FROM purchase
WHERE products @> '[{"country": 1}]'
SELECT *
FROM purchase
WHERE products @> '[{"country": 1, "type": 1}]'
SELECT *
FROM purchase
WHERE products @> '[{"size": 12}]'
SELECT *
FROM purchase
WHERE products @> '[{"size": 12, "color": 4}]'
It is expected that the customer could search for combinations:
country,
country + type
country + type + size
country + type + size + color
country + size
size + color
type + color
etc.
And there is a big chance that the list of 4 fields (country, type, size, color) will grow to 7-10 in the future.
And of course we also want to search for combinations like this:
.. WHERE products @> '[{"country": 1}]' OR products @> '[{"color": 4}]' OR products @> '[{"type": 1, "size": 10}]'
The estimated size of the purchase table is 9-12 million rows (depending on the season).
Any idea how to implement the indexes to get the query results as fast as possible?
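One idea worth benchmarking (a sketch, not a definitive answer): since every search shown above is a containment test with @>, a GIN index with the jsonb_path_ops operator class may be a better fit than the default jsonb_ops; it supports containment queries and is typically smaller and faster for them:
CREATE INDEX idx_purchase_products_path ON purchase
USING GIN (products jsonb_path_ops);
All the queries above, including the OR combinations (which PostgreSQL can serve by combining bitmap index scans), keep the same shape; only the index definition changes.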
I have a document that looks like this:
"userName": "sample name",
"values": [
{
"values": [
{
"brand": "SOLIGNUM CLEAR",
"name": "Solignum Colourless AZ",
"price": "569",
"qip": "30.00",
"sku": "1L",
"unit": "Piece"
}
]
},
{
"values": [
{
"brand": "FirePRO",
"name": "FirePRO",
"price": "419.75",
"qip": "30.00",
"sku": "1L",
"unit": "Cartons"
},
{
"brand": "SOLIGNUM AEROSOL",
"name": "Solignum Colourless AZ Aerosol",
"price": "397",
"qip": "30.00",
"sku": "500ML",
"unit": "Piece"
}
]
}
]
}
My query looks like this:
SELECT orders.unit, orders.sku, orders.name, orders.srp, TONUMBER(orders.price) AS price, orders.qip as quantity
FROM jdi stoCallLog
UNNEST stoCallLog.`values`[0].`values` AS orders
The query result contains only the orders from the first element of values.
I have tried changing the unnest block into this:
UNNEST stoCallLog.`values`[1].`values` AS orders
which selects only the 2nd array element.
Also like this:
UNNEST stoCallLog.`values`.`values` AS orders
which doesn't seem to be possible; it returns nothing.
I need a way to select all of the values at once. Is there any way to do it?
Solved by modifying the UNNEST block to:
UNNEST `values` AS rawOrders
UNNEST rawOrders.`values` AS orders
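Putting it together, the full query should then look something like this (same field list as before):
SELECT orders.unit, orders.sku, orders.name, orders.srp, TONUMBER(orders.price) AS price, orders.qip AS quantity
FROM jdi stoCallLog
UNNEST stoCallLog.`values` AS rawOrders
UNNEST rawOrders.`values` AS orders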
I have a jsonb column that stores an array of elements like the following:
[
{"id": "11", "name": "John", "age":"25", ..........},
{"id": "22", "name": "Mike", "age":"35", ..........},
{"id": "33", "name": "Tom", "age":"45", ..........},
.....
]
I want to replace the 2nd object (id=22) with a totally new object. I don't want to update each property one by one, because there are many properties and all of their values could have changed. I want to just identify the 2nd element and replace the whole object.
I know there is jsonb_set(). However, to update the 2nd element, I need to know that its array index is 1 so I can do the following:
jsonb_set(data, '{1}', '{"id": "22", "name": "Don", "age":"55"}',true)
But I couldn't find any way to search and get that index. Can someone help me out?
One way I can think of is to combine row_number and jsonb_array_elements:
-- test data
create table test (id integer, data jsonb);
insert into test values (1, '[{"id": "22", "name": "Don", "age":"55"}, {"id": "23", "name": "Don2", "age":"55"},{"id": "24", "name": "Don3", "age":"55"}]');
insert into test values (2, '[{"id": "32", "name": "Don", "age":"55"}, {"id": "33", "name": "Don2", "age":"55"},{"id": "34", "name": "Don3", "age":"55"}]');
select subrow, id, row_number() over (partition by id)
from (
select jsonb_array_elements(data) as subrow, id
from test
) as t;
subrow | id | row_number
------------------------------------------+----+------------
{"id": "22", "name": "Don", "age":"55"} | 1 | 1
{"id": "23", "name": "Don2", "age":"55"} | 1 | 2
{"id": "24", "name": "Don3", "age":"55"} | 1 | 3
{"id": "32", "name": "Don", "age":"55"} | 2 | 1
{"id": "33", "name": "Don2", "age":"55"} | 2 | 2
{"id": "34", "name": "Don3", "age":"55"} | 2 | 3
-- apparently you can filter what you want from here
select subrow, id, row_number() over (partition by id)
from (
select jsonb_array_elements(data) as subrow, id
from test
) as t
where subrow->>'id' = '23';
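If you want to go all the way to the update in a single statement, here is a sketch using WITH ORDINALITY to find the index; note that ordinality is 1-based while jsonb_set paths are 0-based, hence the - 1, and the replacement object below is just an example value:
update test
set data = jsonb_set(data, array[(found.pos - 1)::text], '{"id": "23", "name": "Don2-new", "age": "56"}'::jsonb)
from (
    select id, pos
    from test, jsonb_array_elements(data) with ordinality as e(elem, pos)
    where elem->>'id' = '23'
) as found
where test.id = found.id;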
In addition, think about your schema design. It may not be the best idea to store your data this way.
I'm new to Postgres and trying out some things before I take the leap over from MySQL.
What I'm trying to do is get an array of associative arrays into a single query.
It has to do with users that can select multiple contact types, like phone, email and Facebook, and I would like to retrieve those into the column 'contact'.
For a visualisation:
{
"first_name": "This",
"last_name": "is me",
"likes": [],
"city": null
}
And I would like to get something like this:
{
"first_name": "This",
"last_name": "Is me",
"likes": [],
"city": null,
"contact":
[
{"type":1, "value":"myemail#gmail.com", "privacy_rule":1},
{"type":4, "value":"myfacebook", "privacy_rule":1},
{"type":9, "value":"mylinkedin", "privacy_rule":1}
]
}
So the main query would be:
SELECT u.first_name, u.last_name, u.about, ARRAY(SELECT like_id FROM users_likes l WHERE l.user_id = u.user_id), u.city FROM users u WHERE user_id = {id}
The subquery would be:
SELECT c.type, c.value, c.privacy_rule FROM users_contact c WHERE c.user_id = u.user_id
But how do I integrate it in the main query to return the array of result rows?
Is it even possible?
Thanks in advance!
Ron
Ah, after some more fiddling about, here is the answer.
Use json_build_object:
SELECT u.first_name, u.last_name,
ARRAY(SELECT like_id FROM users_likes l WHERE l.user_id = u.user_id) as likes,
ARRAY(SELECT json_build_object('contact_id', c.contact_id,
'value', c.value, 'privacy', c.privacy)
FROM users_contact c WHERE c.user_id = u.user_id) as contact
FROM users_basic u WHERE user_id = {id}
This gives:
"first_name": "This",
"last_name": "Is Me",
"about": null,
"likes": [],
"city": null,
"contact": [
{
"contact_id": 1,
"value": "bbla",
"privacy": 2,
"type": "Phone"
},
{
"contact_id": 3,
"value": "blabla",
"privacy": 2,
"type": "Company Email"
},
{
"contact_id": 4,
"value": "blablabla",
"privacy": 2,
"type": "Telegram Id"
}
]
}
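One side note: ARRAY(SELECT json_build_object(...)) produces a PostgreSQL array of json values, not a single JSON array. If you want contact to come back as real JSON, the json_agg aggregate is the more idiomatic choice; a sketch along the same lines:
SELECT u.first_name, u.last_name,
       (SELECT json_agg(json_build_object('contact_id', c.contact_id,
                                          'value', c.value, 'privacy', c.privacy))
        FROM users_contact c WHERE c.user_id = u.user_id) as contact
FROM users_basic u WHERE user_id = {id}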
Hope it helps someone.