PostgreSQL: get keys from array of objects in a JSONB field

Here is some dummy data for the jsonb column:
[ { "name": [ "sun11", "sun12" ], "alignment": "center", "more": "fields" }, { "name": [ "sun12", "sun13" ], "alignment": "center" }, { "name": [ "sun14", "sun15" ] }]
I want to fetch the value of every name key from the jsonb array of objects. Expected output:
[ [ "sun11", "sun12" ], [ "sun12", "sun13" ], [ "sun14", "sun15" ] ]
The problem is that I'm only able to fetch a name value by giving an index, like 0, 1, etc.:
SELECT data->0->'name' FROM public."user";
[ "sun11", "sun12" ]
But I'm not able to get all the name values from the same array of objects at once. I just want every name value from the array of JSON objects. Any help would be appreciated. Thanks.

demo:db<>fiddle (Final query first, intermediate steps below)
WITH data AS (
    SELECT '[ { "name": [ "sun11", "sun12" ], "alignment": "center", "more": "fields" }, { "name": [ "sun12", "sun13" ], "alignment": "center" }, { "name": [ "sun14", "sun15" ] }]'::jsonb AS jsondata
)
SELECT
    jsonb_agg(elems.value -> 'name')        -- 2
FROM
    data,
    jsonb_array_elements(jsondata) AS elems -- 1
jsonb_array_elements() expands every array element into one row (step 1). The -> operator then extracts the array under the name key, and jsonb_agg() aggregates all the extracted arrays back into a single array (step 2).
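For readers who want to check the logic outside the database, the same expand-and-reaggregate step can be sketched in plain Python (illustrative only; the data is the sample above):

```python
import json

jsondata = json.loads("""[
  {"name": ["sun11", "sun12"], "alignment": "center", "more": "fields"},
  {"name": ["sun12", "sun13"], "alignment": "center"},
  {"name": ["sun14", "sun15"]}
]""")

# jsonb_array_elements(): one "row" per array element;
# -> 'name': pick the name array; jsonb_agg(): collect them back together
names = [obj["name"] for obj in jsondata]
print(names)  # [['sun11', 'sun12'], ['sun12', 'sun13'], ['sun14', 'sun15']]
```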

My example:
SELECT DISTINCT sub.name FROM (
    SELECT
        jsonb_build_object('name', u.data->'name') AS name
    FROM public."user" AS u
    WHERE u.data IS NOT NULL
) sub
WHERE sub.name != '{"name": null}';

Related

Indexing of tuples in a large PSQL JSONB array

I have a JSONB column "deps" like this
[
[
{
"name": "A"
},
"823"
],
[
{
"name": "B"
},
"332"
],
[
{
"name": "B"
},
"311"
]
]
The dictionary (e.g. {"name": "B"}) always comes first in each tuple. The number of tuples (items in the array) varies between zero and 15K.
Is it possible to create an index of the items by "name" field of the dictionary?
Will such an index improve the performance of a search for all rows containing "name" = "B" or "name" = "C"?
SELECT *
FROM the_table
WHERE deps @> ANY (
ARRAY [
'[[{"name": "B"}]]',
'[[{"name": "C"}]]'
]::jsonb[]
);
I have discovered that the runtime of the query above does not depend much on the size of the array in the ANY argument. It seems that PostgreSQL builds some temporary (?) index when looking for "B", so the subsequent lookup for "C" runs fast.
The table is quite large: ~300 GB, 25M rows.

How to add new property in an object nested in 2 arrays (JSONB postgresql)

I am looking for help in adding a property to a JSON object nested in two arrays.
Example table:
CREATE TABLE events (
seq_id BIGINT PRIMARY KEY,
data JSONB NOT NULL,
evt_type TEXT NOT NULL
);
Example of the JSONB data column in my table:
{
"Id":"1",
"Calendar":{
"Entries":[
{
"Id": 1,
"SubEntries":[
{
"ExtId":{
"Id":"10",
"System": "SampleSys"
},
"Country":"FR",
"Details":[
{
"ItemId":"1",
"Quantity":10,
},
{
"ItemId":"2",
"Quantity":3,
}
],
"RequestId":"222",
"TypeId":1,
}
],
"OrderResult":null
}
],
"OtherThingsArray":[
]
}
}
So I need to add new properties to a SubEntries object based on the Id value of its ExtId object (the WHERE clause).
How can I do that, please?
Thanks a lot.
You can use jsonb_set() for this. It takes the target document, a path given as a text[] (array of text values), the new value, and an optional create-missing flag:
SELECT jsonb_set(
input_jsonb,          -- the starting jsonb document
path_array,           -- '{i,j,k[, ...]}'::text[]; each step i, j, k is either a (string) key or a (zero-based integer) index, with the last step denoting the new key (or index) to populate
new_jsonb_value,      -- if adding a key-value pair, use something like to_jsonb('new_value_string'::text) here to force things to format correctly
create_if_not_exists  -- if adding new keys/indexes, give this as true so they'll be appended; otherwise you'll be limited to overwriting existing keys
)
Example
json
{
"array1": [
{
"id": 1,
"detail": "test"
}
]
}
SQL
SELECT
jsonb_set('{"array1": [{"id": 1, "detail": "test"}]}'::jsonb,
'{array1,0,upd}'::TEXT[],
to_jsonb('new'::text),
true
)
Output
{
"array1": [
{
"id": 1,
"upd": "new",
"detail": "test"
}
]
}
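To make the semantics concrete, the update above amounts to a nested assignment. Here is the same operation as a plain Python sketch (illustrative only, not part of the original answer):

```python
import json

doc = json.loads('{"array1": [{"id": 1, "detail": "test"}]}')

# the path {array1, 0, upd} walks key "array1", index 0,
# then (because create_if_not_exists is true) creates key "upd"
doc["array1"][0]["upd"] = "new"

print(json.dumps(doc))
```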
Note that you can only add one level of depth at a time (i.e. either a new key or a new index entry), but you can circumvent this by providing the full depth in the assignment value, or by using jsonb_set() iteratively:
select
jsonb_set(
jsonb_set('{"array1": [{"id": 1, "detail": "test"}]}'::jsonb, '{array1,0,upd}'::TEXT[], '[{"new": "val"}]'::jsonb, true),
'{array1,0,upd,0,check}'::TEXT[],
'"test_val"',
true)
would be required to produce
{
"array1": [
{
"id": 1,
"upd": [
{
"new": "val",
"check": "test_val"
}
],
"detail": "test"
}
]
}
If you need other, more complex logic to decide which values are added to which objects, you can try:
- dynamically creating a set of jsonb_set() statements for execution
- using the outputs of jsonb_each() and jsonb_array_elements() to evaluate the row logic down at the SubEntries level, and then using jsonb_object_agg() and jsonb_agg() as appropriate to build the document back up to the root level from the resulting object-arrays and key-value collections
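As a rough sketch of that explode-transform-rebuild idea applied to the sample document, here is the equivalent logic in Python (the function name and the NewProp property are hypothetical, purely for illustration):

```python
import copy

def add_prop_to_subentry(doc, target_ext_id, key, value):
    """Add key/value to every SubEntries object whose ExtId.Id matches."""
    doc = copy.deepcopy(doc)  # leave the input untouched, as a SQL expression would
    for entry in doc["Calendar"]["Entries"]:          # explode Entries
        for sub in entry["SubEntries"]:               # explode SubEntries
            if sub["ExtId"]["Id"] == target_ext_id:   # the WHERE clause
                sub[key] = value                      # the jsonb_set
    return doc                                        # rebuilt document

doc = {"Calendar": {"Entries": [
    {"Id": 1, "SubEntries": [
        {"ExtId": {"Id": "10", "System": "SampleSys"}, "Country": "FR"}
    ]}
]}}
updated = add_prop_to_subentry(doc, "10", "NewProp", "x")
print(updated["Calendar"]["Entries"][0]["SubEntries"][0])
```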

Mongoose how to push an element to a schema of array of arrays

I want a schema with an array containing arrays, so I have the following schema defined:
runGroupEntries: [
[
{
type: mongoose.Schema.Types.ObjectId,
ref: 'User',
required: true
}
]
]
My intention is to have
runGroupEntries: [['userId1', 'userId2', 'userId3'], ['userId4', 'userId5', 'userId6'], ...]
I initialized the schema using:
for (let i = 0; i < numGroups; ++i) {
event.runGroupEntries.push(undefined);
}
In MongoDB Atlas, it shows:
(screenshot: initialization)
It looks fine to me.
The way I insert element is
event.runGroupEntries[runGroup].push(userId);
In this example, runGroup is 0. I was expecting to see
runGroupEntries: [ [ null, "userId" ], [ null ], [ null ], [ null ], [ null ] ]
but the actual result is:
runGroupEntries: [ [ null, [Array] ], [ null ], [ null ], [ null ], [ null ] ],
(screenshot: 1st push result)
Then I tried to push another userId to event.runGroupEntries[0]. Interestingly, the previously pushed element now shows as "userId", but the newly pushed element still shows as an array.
runGroupEntries: [
[ null, 5f5c1d95e4f678ce190d5624, [Array] ],
(screenshot: 2nd push result)
I am really clueless as to why the pushed element became an array. Any help would be appreciated!
It is indeed a bug in Mongoose, and it has since been fixed: https://github.com/Automattic/mongoose/issues/9429

Postgres JSONB Editing columns

I was going through the Postgres JSONB documentation but was unable to find a solution for a small issue I'm having.
I've got a table MY_TABLE with the following columns: User, Name, Data and Purchased.
One thing to note is that Data is a jsonb column that has multiple fields. One of the fields inside Data is Attributes, but the values it can hold are not consistent: it could be a string, a list of strings, an empty list, or just an empty string. I want to change this.
The only values I want to allow are a list of strings and an empty list: all empty strings should be converted to empty lists, and regular strings to one-element lists.
I have tried using json_build_array but have not had any luck
So for example, I'd want my final jsonb to look like :
[{
"Id": 1,
"Attributes": ["Test"]
},
{
"Id": 2,
"Attributes": []
},
{
"Id": 3,
"Attributes": []
}]
when converted from
[
{
"Id": 1,
"Attributes": "Test"
},
{
"Id": 2,
"Attributes": ""
},
{
"Id": 3,
"Attributes": []
}
]
I only care about the "Attributes" field inside of the Json, not any other fields.
I also want to ensure that Attributes holding an empty string ("Attributes": "") get mapped to an empty list, not a list containing an empty string ([] not [""]).
I also want to preserve the empty array values ([]) for the Attributes that already hold an empty array value.
This is what I have so far:
jsonb_set(
mycol,
'{Attributes}',
case when js ->> 'Attributes' <> ''
then jsonb_build_array(js ->> 'Attributes')
else '[]'::jsonb
end
)
However, Attributes: [] is getting mapped to ["[]"]
Use jsonb_array_elements() to iterate over the elements of each data cell, and jsonb_agg() to regroup the transformed values into an array:
WITH test_data(js) AS (
VALUES ($$ [
{
"Id": 1,
"Attributes": "Test"
},
{
"Id": 2,
"Attributes": ""
},
{
"Id": 3,
"Attributes": []
}
]
$$::JSONB)
)
SELECT transformed_elem
FROM test_data
JOIN LATERAL (
SELECT jsonb_agg(jsonb_set(
elem,
'{Attributes}',
CASE
WHEN elem -> 'Attributes' IN ('""', '[]') THEN '[]'::JSONB
WHEN jsonb_typeof(elem -> 'Attributes') = 'string' THEN jsonb_build_array(elem -> 'Attributes')
ELSE elem -> 'Attributes'
END
)) AS transformed_elem
FROM jsonb_array_elements(test_data.js) AS f(elem) -- iterate over every element in the array
) s
ON TRUE
returns
[{"Id": 1, "Attributes": ["Test"]}, {"Id": 2, "Attributes": []}, {"Id": 3, "Attributes": []}]

Is there a magic function which can extract selected keys/nested keys, including arrays, from jsonb?

Given a jsonb value and a set of keys, how can I get a new jsonb containing only the required keys?
I've tried extracting key-value pairs into a text[] and then using jsonb_object(text[]). It works well, but the problem comes when a key holds an array of JSON objects.
create table my_jsonb_table
(
data_col jsonb
);
insert into my_jsonb_table (data_col) Values ('{
"schemaVersion": "1",
"Id": "20180601550002",
"Domains": [
{
"UID": "29aa2923",
"quantity": 1,
"item": "book",
"DepartmentDomain": {
"type": "paper",
"departId": "10"
},
"PriceDomain": {
"Price": 79.00,
"taxA": 6.500,
"discount": 0
}
},
{
"UID": "bbaa2923",
"quantity": 2,
"item": "pencil",
"DepartmentDomain": {
"type": "wood",
"departId": "11"
},
"PriceDomain": {
"Price": 7.00,
"taxA": 1.5175,
"discount": 1
}
}
],
"finalPrice": {
"totalTax": 13.50,
"total": 85.0
},
"MetaData": {
"shopId": "1405596346",
"locId": "95014",
"countryId": "USA",
"regId": "255",
"Date": "20180601"
}
}
')
This is what I am trying to achieve :
SELECT some_magic_fun(data_col,'Id,Domains.UID,Domains.DepartmentDomain.departId,finalPrice.total')::jsonb FROM my_jsonb_table;
I am trying to create that magic function which extracts the given keys in jsonb format. As of now I am able to extract scalar items, put them in a text[], and use jsonb_object, but I don't know how to extract all elements of an array.
expected output :
{
"Id": "20180601550002",
"Domains": [
{
"UID": "29aa2923",
"DepartmentDomain": {
"departId": "10"
}
},
{
"UID": "bbaa2923",
"DepartmentDomain": {
"departId": "11"
}
}
],
"finalPrice": {
"total": 85.0
}
}
I don't know of any magic. You have to rebuild it yourself.
select jsonb_build_object(
-- Straightforward
'Id', data_col->'Id',
'Domains', (
-- Aggregate all the "rows" back together into an array.
select jsonb_agg(
-- Turn each array element into a new object
jsonb_build_object(
'UID', domain->'UID',
'DepartmentDomain', jsonb_build_object(
'departId', domain#>'{DepartmentDomain,departId}'
)
)
)
-- Turn each element of the Domains array into a row
from jsonb_array_elements( data_col->'Domains' ) d(domain)
),
-- Also pretty straightforward
'finalPrice', jsonb_build_object(
'total', data_col#>'{finalPrice,total}'
)
) from my_jsonb_table;
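A generic version of that whitelist-style extraction can be sketched in Python (the pick helper and its spec format are hypothetical, purely to illustrate the recursion the SQL performs by hand):

```python
def pick(doc, spec):
    """Keep only the keys named in spec. A None value keeps the field as-is,
    a dict value recurses, and document lists are filtered element by element."""
    out = {}
    for key, sub in spec.items():
        val = doc[key]
        if sub is None:
            out[key] = val                               # scalar or whole subtree
        elif isinstance(val, list):
            out[key] = [pick(item, sub) for item in val]  # jsonb_array_elements + jsonb_agg
        else:
            out[key] = pick(val, sub)                    # nested jsonb_build_object
    return out

doc = {
    "schemaVersion": "1",
    "Id": "20180601550002",
    "Domains": [
        {"UID": "29aa2923", "quantity": 1, "DepartmentDomain": {"type": "paper", "departId": "10"}},
        {"UID": "bbaa2923", "quantity": 2, "DepartmentDomain": {"type": "wood", "departId": "11"}},
    ],
    "finalPrice": {"totalTax": 13.50, "total": 85.0},
}
spec = {"Id": None,
        "Domains": {"UID": None, "DepartmentDomain": {"departId": None}},
        "finalPrice": {"total": None}}
print(pick(doc, spec))
```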
This probably is not a good use of a JSON column. Your data is relational and would better fit traditional relational tables.