I am trying to get a jsonb result based on a matching key.
I have a DB table "listings" with a number column and a data (jsonb) column.
number | data
1 | {"name": "XYZ company", "city": "toronto", "province": "ON", "people" : [
{ "firstName": "tom", "lastName": "hanks",
"phonenumber": [{"type": "mobile", "Number": "111111"}],
"Email": [{"type": "business", "address": "tom#xyz.com"},{"type": "personal", "address": "tom#mailinator.com"}] },
{ "firstName": "sandra", "lastName": "petes",
"phonenumber": [{"type": "mobile", "Number": "333"}, {"type": "home", "Number": "444"}],
"Email": [{"type": "business", "address": "sandra#xyz.com"}]
}
]}
I need to pull all values from the data column for these keys:
people->firstName
people->lastName
people->phonenumber->Number
people->Email->address
What I achieved so far is:
SELECT number
,jsonb_array_length(jsonb_extract_path(data,'people')) as people_count
,jsonb_extract_path(data,'people','0','firstName') as FirstName
,jsonb_extract_path(data,'people','0','lastName') as LastName
,jsonb_extract_path(data,'people','0','Email','0','address') as personEmail
,jsonb_extract_path(data,'people','0','phonenumber','0','Number') as personPhone
FROM listings
WHERE number='1';
However, this only gives me 0th element of people, I need to find all elements. Is there any way to achieve this in single query.
Thanks for your time!
You need to use the jsonb_array_elements() function to get all of the elements of the array. Since that function returns a set of rows, you need to use it as a row source.
SELECT '1' AS number,
jsonb_array_length(data->'people') AS people_count,
people->>'firstName' AS FirstName,
people->>'lastName' AS LastName,
people->'Email'->0->>'address' AS personEmail,
people->'phonenumber'->0->>'Number' as personPhone
FROM listings, jsonb_array_elements(data->'people') p(people)
WHERE number = '1';
This will result in a row for every person where number = '1'. The Email and phonenumber objects are arrays too, and here I pick just the first value. If you want all of them, select the whole JSON arrays here and wrap this in an outer query that applies jsonb_array_elements() again, as sketched below.
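One way to unnest everything in a single statement (an untested sketch; the key names are taken from the sample data above, and unnesting both arrays at once yields one row per email/phone combination per person):
SELECT number,
       people ->> 'firstName' AS firstname,
       people ->> 'lastName'  AS lastname,
       e.email ->> 'address'  AS personemail,
       ph.phone ->> 'Number'  AS personphone
FROM listings,
     jsonb_array_elements(data -> 'people')        AS p(people),
     jsonb_array_elements(people -> 'Email')       AS e(email),
     jsonb_array_elements(people -> 'phonenumber') AS ph(phone)
WHERE number = '1';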
I'm trying to update a list of contacts by deleting whichever contact the user asks to delete. In other words, I am trying to remove an entire JSON object from a JSON array in my PostgreSQL database from a Node.js script, but I get this error:
error: null value in column "info" of relation "user_emails" violates
not-null constraint
I double-checked and the value is definitely there. When I try it here online it works, but on my server it returns the error. How can I fix this?
DROP table if exists user_emails;
CREATE table user_emails (
id serial not null PRIMARY KEY,
info jsonb NOT NULL
);
insert into user_emails(info) values('{
"userid": "4",
"mailbox": "johndoe#example.com",
"contacts": [
{
"id": "ghr3gk8dez4",
"email": "janedoe#gmail.com",
"last_name": "Doe",
"first_name": "Jane",
"date_created": "2022-05-08T20:52:47.967Z"
},
{
"id": "th2lypvoxpr1652045110763",
"email": "aldoe#gmail.com",
"last_name": "Doe",
"first_name": "Al",
"date_created": "2022-05-08T21:25:10.763Z"
},
{
"id": "ld123tqicmj1652045372671",
"email": "stdoe#gmail.com",
"last_name": "Doe",
"first_name": "Stella",
"date_created": "2022-05-08T21:29:32.671Z"
},
{
"id": "1ltbrpbj8xf1652045768004",
"email": "mdoe#mail.com",
"last_name": "Doe",
"first_name": "Marta",
"date_created": "2022-05-08T21:36:08.004Z"
},
{
"id": "1dgntfwvsmf1652045832589",
"email": "nala#mail.com",
"last_name": "La",
"first_name": "Na",
"date_created": "2022-05-08T21:37:12.589Z"
},
{
"id": "ll3z1n0jkhc1652045984538",
"email": "bdoe#mail.com",
"last_name": "doe",
"first_name": "bruno",
"date_created": "2022-05-08T21:39:44.538Z"
},
{
"id": "kzr996xxxt1652046050118",
"email": "pp#mail.com",
"last_name": "Perf",
"first_name": "Perf",
"date_created": "2022-05-08T21:40:50.118Z"
},
{
"id": "41bovnvsihq1652046121940",
"email": "mmd#mm.com",
"last_name": "Doe",
"first_name": "Melinda",
"date_created": "2022-05-08T21:42:01.940Z"
},
{
"id": "tnjlj4dcg2b1652046154937",
"email": "keke#j.com",
"last_name": "Kee",
"first_name": "Kee",
"date_created": "2022-05-08T21:42:34.937Z"
},
{
"id": "hor0wafkuj1652046684582",
"email": "jojo#mail.com",
"last_name": "Jo",
"first_name": "Jo",
"date_created": "2022-05-08T21:51:24.582Z"
}
],
"auto_reply": false,
"email_name": "johndoe",
"signatures": [],
"domain_name": "example.com",
"date_created": "2022-05-08T20:39:54.881Z",
"forward_email": [],
"auto_reply_messages": []
}');
This is my UPDATE:
UPDATE user_emails SET info = (SELECT jsonb_agg(j)
FROM jsonb_array_elements(user_emails.info->'contacts') as t(j)
WHERE j ->> 'id' not in ('ghr3gk8dez4'));
SELECT * FROM user_emails;
jsonb_agg, like so many other aggregate functions, returns NULL if there are no rows to aggregate. You might be looking to COALESCE it to an empty array instead:
UPDATE user_emails
SET info = jsonb_set(
user_emails.info,
'{contacts}',
COALESCE(
(SELECT jsonb_agg(j)
FROM jsonb_array_elements(user_emails.info->'contacts') as t(j)
WHERE j ->> 'id' not in ('ghr3gk8dez4')
),
'[]'::jsonb
)
);
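Note that, unlike the original statement, this wraps the filtered array in jsonb_set(), so only the contacts key is replaced and the rest of the info document (mailbox, signatures, and so on) is preserved. A quick sanity check against the sample row (a hedged example, not part of the original answer):
-- the deleted id is gone and the other top-level keys survive
SELECT jsonb_array_length(info -> 'contacts') AS remaining_contacts,
       info ? 'mailbox'                       AS mailbox_still_there
FROM user_emails;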
I don't think there is an efficient way to do that using only built-in functions.
There is an operator #- that removes an element by specifying the path, e.g. info #- '{contacts, 0}' would do what you want. However, it's not straightforward to build such a "path array" directly.
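For example, if you already knew the contact to remove was at index 0:
-- removes the first element of the contacts array
SELECT info #- '{contacts,0}' FROM user_emails;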
I would write a function that finds the index of the contact to be deleted and generates the path array:
create or replace function find_entry(p_info jsonb, p_id text)
returns text[]
as
$$
select array['contacts', (idx - 1)::text]
from jsonb_array_elements(p_info -> 'contacts') with ordinality as t(element, idx)
where t.element ->> 'id' = p_id
limit 1;
$$
language sql;
The -1 is necessary because SQL uses 1-based numbering, but JSON arrays start at zero.
With that function you can then do:
update user_emails
set info = info #- find_entry(info, 'ghr3gk8dez4')
where ...
I'm trying to search in a JSON using Postgres. The JSON looks like this:
"some key": {
"city": "Chicago",
"id": "",
"color": "",
"size": ""
},
"a different key": {
"city": "San Francisco",
"id": null,
"shape": "",
"height": ""
}
I don't know what the names of the first-level keys can be (that's why I called them "some key" and "a different key" in the example above). I do know that they can differ from one another.
I want to extract all the values of the "city" key, Chicago and San Francisco in the example above.
I guess it's something like that but this one didn't work:
(table_name.row_name-> * ->> 'city') as city_name
(I know that the city is always at the second level of the JSON, but it can occur multiple times.)
You can iterate over the keys:
select t.some_column,
x.item -> 'city' as city_name
from the_table t
cross join jsonb_each(t.the_column) as x(key, item)
This returns each city as a new row together with the other columns of that table.
The above assumes your column is defined as jsonb (which it should be). If it isn't, you need to use json_each() instead, as in the sketch below.
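A hedged sketch of the json variant (the same query, with the json function instead of the jsonb one):
select t.some_column,
       x.item -> 'city' as city_name
from the_table t
cross join json_each(t.the_column) as x(key, item)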
Starting out with JSONB data type and I'm hoping someone can help me out.
I have a table (properties) with two columns (id as primary key and data as jsonb).
The data structure is:
{
"ProductType": "ABC",
"ProductName": "XYZ",
"attributes": [
{
"name": "Color",
"type": "STRING",
"value": "Silver"
},
{
"name": "Case",
"type": "STRING",
"value": "Shells"
},
...
]
}
I would like to update the value of a specific attributes element, selected by name, for a row with a given id. For example, for the element with "name" = "Case", change the value to "Glass", so it ends up like this:
{
"ProductType": "ABC",
"ProductName": "XYZ",
"attributes": [
{
"name": "Color",
"type": "STRING",
"value": "Silver"
},
{
"name": "Case",
"type": "STRING",
"value": "Glass"
},
...
]
}
Is this possible with this structure using SQL?
I have created the table structure if any of you would like to give it a shot:
dbfiddle
Use the jsonb concatenation operator, ||, to replace keys on the fly:
WITH properties (id, data) AS (
values
(1, '{"ProductType": "ABC","ProductName": "XYZ","attributes": [{"name": "Color","type": "STRING","value": "Silver"},{"name": "Case","type": "STRING","value": "Shells"}]}'::jsonb),
(2, '{"ProductType": "ABC","ProductName": "XYZ","attributes": [{"name": "Color","type": "STRING","value": "Red"},{"name": "Case","type": "STRING","value": "Shells"}]}'::jsonb)
)
SELECT id,
data ||
jsonb_build_object(
'attributes',
jsonb_agg(
case
when attribs->>'name' = 'Case' then attribs||'{"value": "Glass"}'::jsonb
else attribs
end
)
) as data
FROM properties m
CROSS JOIN LATERAL JSONB_ARRAY_ELEMENTS(data->'attributes') as a(attribs)
GROUP BY id, data
Updated fiddle
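Since the question asks for an actual update, the same expression can be moved into an UPDATE statement (a sketch, not taken from the fiddle; it assumes the row to change has id = 1):
UPDATE properties p
SET data = p.data || jsonb_build_object(
        'attributes',
        (SELECT jsonb_agg(
                    CASE
                        WHEN attribs ->> 'name' = 'Case'
                            THEN attribs || '{"value": "Glass"}'::jsonb
                        ELSE attribs
                    END)
         -- note: jsonb_agg does not guarantee order; add WITH ORDINALITY and ORDER BY if the element order matters
         FROM jsonb_array_elements(p.data -> 'attributes') AS a(attribs))
    )
WHERE p.id = 1;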
I have a postgres 9.6 table which has a json field config. I want to fetch records from this table where the json has a particular key value pair.
My table is as follows
CREATE TABLE features(
id integer NOT NULL,
plan character,
config json NOT NULL
);
In the json field, I am storing a json in the form
[
{ "name": "A", "state": "active"},
{ "name": "B", "state": "inactive"},
{ "name": "C", "state": "active"}
]
Now, I am querying the database to fetch all the records for which the json field contains the key-value pair { "name": "B", "state": "inactive"}.
My query is as follows
select * from features where config @> '[{ "name": "B", "state": "inactive"}]';
However, I get an error
ERROR: operator does not exist: config @> unknown
Any idea where I am going wrong here? Pointers will be highly appreciated. TIA!
The @> operator is only available for the jsonb data type:
CREATE TABLE features(
id integer NOT NULL,
plan character,
config jsonb NOT NULL
);
CREATE
insert into features values(1,'a',' [ { "name": "A", "state": "active"}, { "name": "B", "state": "inactive"}, { "name": "C", "state": "active"} ]');
INSERT 0 1
select * from features where config @> '[{ "name": "B", "state": "inactive"}]';
id | plan | config
----+------+----------------------------------------------------------------------------------------------------------
1 | a | [{"name": "A", "state": "active"}, {"name": "B", "state": "inactive"}, {"name": "C", "state": "active"}]
(1 row)
With json data type in the table, you can use:
select * from
(select json_array_elements(config)::jsonb as item from features) as setofjsonb
where item = '{"name": "B", "state": "inactive"}'::jsonb;
item
------------------------------------
{"name": "B", "state": "inactive"}
(1 row)
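Alternatively, if you want to keep the column defined as json, you can cast it in the query and still use @> (a small sketch of the same containment test):
select * from features where config::jsonb @> '[{ "name": "B", "state": "inactive"}]';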
I have an Order collection with address fields and User collection with names. The Order collection contains a string called userId, which is a "foreign key" into the users collection.
I am using an aggregation pipeline to filter, join, sort, and paginate queries. The problem is that I need to provide full text search on the address and name fields.
Because the $text match must be the first stage in a pipeline, I am not sure how to accomplish the goal of finding text matching any address or name field.
User collection
[{
"_id": "5cb8caa069fc1a4351cc3705",
"firstName": "James",
"lastName": "Bond"
},{
"_id": "5c58b8de8596d52c248f34d5",
"firstName": "Jack",
"lastName": "Ryan"
}]
Order Collection
[{
"_id": "5ccc94602e67ca44fe69f160",
"address": {
"streetAddress1": "1112 main st",
"streetAddress2": null,
"unitNumber": "unit 1112",
"city": "Jackson Hole",
"state": "WY",
"postalCode": "83001"
},
"userId": "5cb8caa069fc1a4351cc3705"
}]
A search for "Jack" should match both the name "Jack" and the city "Jackson Hole".