Remove entire json object from JSON array - postgresql

I'm trying to update a list of contacts by deleting whichever contact the user asks to remove. In other words, I'm trying to remove an entire JSON object from a JSON array in my PostgreSQL database from a Node.js script, but I get this error:
error: null value in column "info" of relation "user_emails" violates not-null constraint
I double-checked and the value is definitely there. When I try it here online it works, but on my server it returns the error. How can I fix this?
DROP table if exists user_emails;
CREATE table user_emails (
  id serial not null PRIMARY KEY,
  info jsonb NOT NULL
);
insert into user_emails(info) values('{
  "userid": "4",
  "mailbox": "johndoe@example.com",
  "contacts": [
    {
      "id": "ghr3gk8dez4",
      "email": "janedoe@gmail.com",
      "last_name": "Doe",
      "first_name": "Jane",
      "date_created": "2022-05-08T20:52:47.967Z"
    },
    {
      "id": "th2lypvoxpr1652045110763",
      "email": "aldoe@gmail.com",
      "last_name": "Doe",
      "first_name": "Al",
      "date_created": "2022-05-08T21:25:10.763Z"
    },
    {
      "id": "ld123tqicmj1652045372671",
      "email": "stdoe@gmail.com",
      "last_name": "Doe",
      "first_name": "Stella",
      "date_created": "2022-05-08T21:29:32.671Z"
    },
    {
      "id": "1ltbrpbj8xf1652045768004",
      "email": "mdoe@mail.com",
      "last_name": "Doe",
      "first_name": "Marta",
      "date_created": "2022-05-08T21:36:08.004Z"
    },
    {
      "id": "1dgntfwvsmf1652045832589",
      "email": "nala@mail.com",
      "last_name": "La",
      "first_name": "Na",
      "date_created": "2022-05-08T21:37:12.589Z"
    },
    {
      "id": "ll3z1n0jkhc1652045984538",
      "email": "bdoe@mail.com",
      "last_name": "doe",
      "first_name": "bruno",
      "date_created": "2022-05-08T21:39:44.538Z"
    },
    {
      "id": "kzr996xxxt1652046050118",
      "email": "pp@mail.com",
      "last_name": "Perf",
      "first_name": "Perf",
      "date_created": "2022-05-08T21:40:50.118Z"
    },
    {
      "id": "41bovnvsihq1652046121940",
      "email": "mmd@mm.com",
      "last_name": "Doe",
      "first_name": "Melinda",
      "date_created": "2022-05-08T21:42:01.940Z"
    },
    {
      "id": "tnjlj4dcg2b1652046154937",
      "email": "keke@j.com",
      "last_name": "Kee",
      "first_name": "Kee",
      "date_created": "2022-05-08T21:42:34.937Z"
    },
    {
      "id": "hor0wafkuj1652046684582",
      "email": "jojo@mail.com",
      "last_name": "Jo",
      "first_name": "Jo",
      "date_created": "2022-05-08T21:51:24.582Z"
    }
  ],
  "auto_reply": false,
  "email_name": "johndoe",
  "signatures": [],
  "domain_name": "example.com",
  "date_created": "2022-05-08T20:39:54.881Z",
  "forward_email": [],
  "auto_reply_messages": []
}');
This is my UPDATE:
UPDATE user_emails SET info = (SELECT jsonb_agg(j)
FROM jsonb_array_elements(user_emails.info->'contacts') as t(j)
WHERE j ->> 'id' not in ('ghr3gk8dez4'));
SELECT * FROM user_emails;

jsonb_agg, like so many other aggregate functions, returns NULL if there are no rows to aggregate. You might be looking to COALESCE it to an empty array instead. Note also that your UPDATE overwrites the whole info value with just the aggregated contacts array; use jsonb_set to write the filtered array back under the contacts key:
UPDATE user_emails
SET info = jsonb_set(
  user_emails.info,
  '{contacts}',
  COALESCE(
    (SELECT jsonb_agg(j)
     FROM jsonb_array_elements(user_emails.info->'contacts') as t(j)
     WHERE j ->> 'id' not in ('ghr3gk8dez4')
    ),
    '[]'::jsonb
  )
);
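If the table holds more than one row you will usually want to scope the UPDATE as well. A minimal sketch, assuming you target the row by its id column and pass the values as bind parameters from the Node.js script (the WHERE clause and parameter numbering are illustrative, not part of the answer above):
UPDATE user_emails
SET info = jsonb_set(
  user_emails.info,
  '{contacts}',
  COALESCE(
    (SELECT jsonb_agg(j)
     FROM jsonb_array_elements(user_emails.info->'contacts') as t(j)
     WHERE j ->> 'id' <> $1   -- contact id to remove, e.g. 'ghr3gk8dez4'
    ),
    '[]'::jsonb
  )
)
WHERE id = $2;                -- id of the user_emails row to touch (assumed key)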

I don't think there is an efficient way to do that using only built-in functions.
There is an operator #- that removes an element by specifying the path, e.g. info #- '{contacts, 0}' would do what you want. However, it's not straightforward to build such a "path array" directly.
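For instance, on a small jsonb literal (not the table above), the operator behaves like this:
SELECT '{"contacts": [{"id": "a"}, {"id": "b"}]}'::jsonb #- '{contacts,0}';
-- returns {"contacts": [{"id": "b"}]}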
I would write a function that finds the index of the contact to be deleted and generates the path array:
create or replace function find_entry(p_info jsonb, p_id text)
returns text[]
as
$$
select array['contacts', (idx - 1)::text]
from jsonb_array_elements(p_info -> 'contacts') with ordinality as t(element, idx)
where t.element ->> 'id' = p_id
limit 1;
$$
language sql;
The -1 is necessary because SQL uses 1-based numbering, but JSON arrays start at zero.
With that function you can then do:
update user_emails
set info = info #- find_entry(info, 'ghr3gk8dez4')
where ...
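You can check what the helper produces for the sample row from the question before running the update:
SELECT find_entry(info, 'ghr3gk8dez4') FROM user_emails;
-- returns {contacts,0}, i.e. the path of the first contact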

Related

PSQL - Query to transpose JSONB element

I have a jsonb object like this:
{
  "applicants": [
    {
      "last_name": "ss",
      "first_name": "ss",
      "age": 31
    },
    {
      "last_name": "kk",
      "first_name": "kk",
      "age": 32
    }
  ]
}
I want to convert it to:
{
  "applicants": [
    {
      "last_name": "ss",
      "data": {
        "first_name": "ss",
        "age": 31
      }
    },
    {
      "last_name": "kk",
      "data": {
        "first_name": "kk",
        "age": 32
      }
    }
  ]
}
I have done a similar thing using jsonb_array_elements and jsonb_build_object before, but I can't figure out how I would create a new data object inside each object, and transpose the fields into it.
Is it possible to write this as a plain SQL query?
Thanks.
I have to point out that Postgres is not the best tool for modifying JSON data structures, and if you feel the need to do so, it probably means that your solution is in general not optimal. While Postgres has the necessary features to do this, I wouldn't want to maintain code that contains queries like the following.
update my_table set
  json_col = (
    select jsonb_build_object(
             'applicants',
             jsonb_agg(
               (elem - 'first_name' - 'age') || jsonb_build_object(
                 'data',
                 jsonb_build_object(
                   'first_name', elem->'first_name',
                   'age', elem->'age'
                 )
               )
             )
           )
    from jsonb_array_elements(json_col->'applicants') as arr(elem)
  )
Test it in db<>fiddle.
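If you want to preview the result before touching any rows, the same expression works as a plain SELECT; a sketch against the my_table/json_col names used above:
SELECT (
  SELECT jsonb_build_object(
           'applicants',
           jsonb_agg(
             (elem - 'first_name' - 'age') || jsonb_build_object(
               'data',
               jsonb_build_object('first_name', elem->'first_name', 'age', elem->'age')
             )
           )
         )
  FROM jsonb_array_elements(json_col->'applicants') as arr(elem)
) AS transposed
FROM my_table;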

PostgreSQL nested jsonb update value of complex key/value pairs

I'm starting out with the JSONB data type and hoping someone can help me out.
I have a table (properties) with two columns (id as primary key and data as jsonb).
The data structure is:
{
  "ProductType": "ABC",
  "ProductName": "XYZ",
  "attributes": [
    {
      "name": "Color",
      "type": "STRING",
      "value": "Silver"
    },
    {
      "name": "Case",
      "type": "STRING",
      "value": "Shells"
    },
    ...
  ]
}
I would like to update the value of a specific attributes element, selected by name, for a row with a given id. For example, for the element with "name" = "Case", change the value to "Glass", so that it ends up like:
{
  "ProductType": "ABC",
  "ProductName": "XYZ",
  "attributes": [
    {
      "name": "Color",
      "type": "STRING",
      "value": "Silver"
    },
    {
      "name": "Case",
      "type": "STRING",
      "value": "Glass"
    },
    ...
  ]
}
Is this possible with this structure using SQL?
I have created the table structure if any of you would like to give it a shot:
dbfiddle
Use the jsonb concatenation operator, ||, to replace keys on the fly:
WITH properties (id, data) AS (
  values
    (1, '{"ProductType": "ABC","ProductName": "XYZ","attributes": [{"name": "Color","type": "STRING","value": "Silver"},{"name": "Case","type": "STRING","value": "Shells"}]}'::jsonb),
    (2, '{"ProductType": "ABC","ProductName": "XYZ","attributes": [{"name": "Color","type": "STRING","value": "Red"},{"name": "Case","type": "STRING","value": "Shells"}]}'::jsonb)
)
SELECT id,
       data ||
       jsonb_build_object(
         'attributes',
         jsonb_agg(
           case
             when attribs->>'name' = 'Case' then attribs || '{"value": "Glass"}'::jsonb
             else attribs
           end
         )
       ) as data
FROM properties m
CROSS JOIN LATERAL JSONB_ARRAY_ELEMENTS(data->'attributes') as a(attribs)
GROUP BY id, data
Updated fiddle
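To persist the change instead of just selecting it, the same expression can drive an UPDATE; a sketch against the real properties(id, data) table from the question (the id filter is only an example, and it assumes every targeted row has an attributes array):
UPDATE properties p
SET data = p.data || jsonb_build_object(
  'attributes',
  (SELECT jsonb_agg(
            CASE
              WHEN attribs->>'name' = 'Case' THEN attribs || '{"value": "Glass"}'::jsonb
              ELSE attribs
            END)
   FROM jsonb_array_elements(p.data->'attributes') as a(attribs))
)
WHERE p.id = 1;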

I want to insert the value below into PostgreSQL. Which data type do I need to use, and how do I construct the INSERT query?

The data is a JSON array:
[{ "customer": "John Doe", "items": {"product": "Beer","qty": 6}}, { "staff" : "Jack" }]
Store it as json (or jsonb):
create table sameesh
(
data json
);
insert into sameesh (data)
values
('[{ "customer": "John Doe", "items": {"product": "Beer","qty": 6}}, { "staff" : "Jack" }]');

postgres + jsonb + get values of key from multidimensional array

I am trying to get a jsonb result based on a matching key.
I have a DB table "listings" with number and data columns.
number | data
1      | {"name": "XYZ company", "city": "toronto", "province": "ON", "people": [
             { "firstName": "tom", "lastName": "hanks",
               "phonenumber": [{"type": "mobile", "Number": "111111"}],
               "Email": [{"type": "business", "address": "tom@xyz.com"}, {"type": "personal", "address": "tom@mailinator.com"}] },
             { "firstName": "sandra", "lastName": "petes",
               "phonenumber": [{"type": "mobile", "Number": "333"}, {"type": "home", "Number": "444"}],
               "Email": [{"type": "business", "address": "sandra@xyz.com"}]
             }
           ]}
I need to pull all values from the data column for these keys:
people->firstname
people->lastName
people->phonenumber->Number
people->Email->address
What I achieved so far is:
SELECT number
      ,jsonb_array_length(jsonb_extract_path(data,'people')) as people_count
      ,jsonb_extract_path(data,'people','0','firstname') as FirstName
      ,jsonb_extract_path(data,'people','0','lastname') as LastName
      ,jsonb_extract_path(data,'people','0','email','Address') as personEmail
      ,jsonb_extract_path(data,'people','0','phonenumber','Number') as personPhone
FROM listings
WHERE number='1';
However, this only gives me the 0th element of people; I need all elements. Is there any way to achieve this in a single query?
Thanks for your time!
You need to use the jsonb_array_elements() function to get all of the elements of the array. Since that function returns a set of rows, you need to use it as a row source.
SELECT '1' AS number,
       jsonb_array_length(data->'people') AS people_count,
       people->>'firstname' AS FirstName,
       people->>'lastname' AS LastName,
       people->'email'->0->>'Address' AS personEmail,
       people->'phonenumber'->0->>'Number' AS personPhone
FROM listings, jsonb_array_elements(data->'people') p(people)
WHERE number = '1';
This will result in a row for every person where number = '1'. The email and phone number objects are arrays too, and I pick just the first value here. If you want all of them, select the whole JSON arrays and then wrap this in an outer query that calls jsonb_array_elements() again.
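For example, here is a sketch that also unnests the Email array, giving one row per person per address. It assumes the listings table from the question; note that jsonb keys are case-sensitive, so they must match the stored document exactly (e.g. "firstName" and "Email" as shown above):
SELECT l.number,
       people->>'firstName' AS firstname,
       people->>'lastName'  AS lastname,
       email->>'address'    AS email_address
FROM listings l
CROSS JOIN LATERAL jsonb_array_elements(l.data->'people') AS p(people)
CROSS JOIN LATERAL jsonb_array_elements(people->'Email') AS e(email)
WHERE l.number = '1';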

How to get an associative array of rows from a subquery with postgres

I'm new to Postgres and trying out some things before I take the leap over from MySQL.
What I'm trying to do is get an array of associative arrays into a single query.
It has to do with users that can select multiple contact types like phone, email and Facebook and I would like to retrieve those into the column 'contact'.
For a visualisation:
{
  "first_name": "This",
  "last_name": "is me",
  "likes": [],
  "city": null
}
And I would like to get something like this:
{
  "first_name": "This",
  "last_name": "Is me",
  "likes": [],
  "city": null,
  "contact": [
    {"type": 1, "value": "myemail@gmail.com", "privacy_rule": 1},
    {"type": 4, "value": "myfacebook", "privacy_rule": 1},
    {"type": 9, "value": "mylinkedin", "privacy_rule": 1}
  ]
}
So the main query would be:
SELECT u.first_name, u.last_name, u.about,
       ARRAY(SELECT like_id FROM users_likes l WHERE l.user_id = u.user_id),
       u.city
FROM users u
WHERE user_id = {id}
The subquery would be:
SELECT c.type, c.value, c.privacy_rule FROM users_contact c WHERE c.user_id = u.user_id
But how do I integrate it into the main query so that it returns the array of result rows?
Is it even possible?
Thanks in advance!
Ron
Ah, after some more fiddling about, here is the answer.
Use json_build_object:
SELECT u.first_name, u.last_name,
       ARRAY(SELECT like_id FROM users_likes l WHERE l.user_id = u.user_id) as likes,
       ARRAY(SELECT json_build_object('contact_id', c.contact_id,
                                      'value', c.value,
                                      'privacy', c.privacy)
             FROM users_contact c WHERE c.user_id = u.user_id) as contact
FROM users_basic u WHERE user_id = {id}
This gives:
"first_name": "This",
"last_name": "Is Me",
"about": null,
"likes": [],
"city": null,
"contact": [
{
"contact_id": 1,
"value": "bbla",
"privacy": 2,
"type": "Phone"
},
{
"contact_id": 3,
"value": "blabla",
"privacy": 2,
"type": "Company Email"
},
{
"contact_id": 4,
"value": "blablabla",
"privacy": 2,
"type": "Telegram Id"
}
]
Hope it helps someone
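A closely related variant, in case you prefer a real JSON array over a PostgreSQL array of json values: aggregate the objects with json_agg instead of ARRAY(...). A sketch using the same hypothetical tables as above (note the single-quoted keys):
SELECT u.first_name, u.last_name,
       ARRAY(SELECT like_id FROM users_likes l WHERE l.user_id = u.user_id) as likes,
       (SELECT json_agg(json_build_object('contact_id', c.contact_id,
                                          'value', c.value,
                                          'privacy', c.privacy))
        FROM users_contact c WHERE c.user_id = u.user_id) as contact
FROM users_basic u WHERE user_id = {id}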