how to compare a jsonb object that is a superset of another - postgresql

I'd like to be able to perform a check on jsonb objects with postgres where:
x = '{"a": 1, "b": 2}'
y = '{"a": 1, "b": 2, "c": 3}'
z = '{"a": 1, "b": 4, "c": 3}'
jsonb_similar(x, y) gives true.
jsonb_similar(x, z) gives false.
What is the best way to write jsonb_similar?
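One well-known way to get this behavior (not from the original post, so treat it as a suggestion): the jsonb containment operator @> is true exactly when the left-hand object contains every key/value pair of the right-hand one, so jsonb_similar(x, y) can simply check y @> x.
-- Containment checks that the left operand contains all key/value pairs of the right operand.
SELECT '{"a": 1, "b": 2, "c": 3}'::jsonb @> '{"a": 1, "b": 2}'::jsonb;  -- true
SELECT '{"a": 1, "b": 4, "c": 3}'::jsonb @> '{"a": 1, "b": 2}'::jsonb;  -- false ("b" differs)
-- Wrapped into the signature used in the question:
CREATE FUNCTION jsonb_similar(x jsonb, y jsonb) RETURNS boolean
LANGUAGE sql IMMUTABLE AS $$ SELECT y @> x $$;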

Related

Using subquery results as a condition of jsonb field indexed search

I have the two following tables:
CREATE TABLE elements(
id INT PRIMARY KEY,
element_type INT,
api_name VARCHAR(100) UNIQUE
);
CREATE TABLE entries(
id INT PRIMARY KEY,
elements JSONB
);
The entries.elements field stores a jsonb array of elements in the following format:
[
{
"id": elements.id,
"type": elements.element_type,
"value": <some_value>
},
{
"id": elements.id,
"type": elements.element_type,
"value": <some_value>
},
...
]
The value field can be one of 3 types:
Literal (e.g. "abc", 12, true)
Array of literals (e.g. ["abc", "cbd"], [12, 21])
Array of objects (e.g. [{"node_id": <uuid>, "value": "abc"}, {"node_id": <uuid>, "value": "cba"}])
And I have an index (maybe incorrect):
CREATE INDEX some_idx_name
ON entries
USING GIN (elements jsonb_ops);
For example, suppose that I have the following data in these tables:
INSERT INTO elements(id, element_type, api_name) VALUES
(1, 1, 'el_number_1'),
(2, 2, 'el_datetime'),
(3, 3, 'el_files'),
(4, 1, 'el_number_2');
INSERT INTO entries(id, elements) VALUES
(1, '[{"id": 1, "type": 1, "value": 12}, {"id": 3, "type": 3, "value": [1, 2]}]'::JSONB),
(2, '[{"id": 4, "type": 1, "value": 12}, {"id": 2, "type": 2, "value": [{"date": "20.12.2022", "time": "16:18"}]}]'::JSONB);
So if I want to find an entry by its elements, I need to do something like this:
SELECT id
FROM entries
WHERE elements @? '$[*] ? (@.id == 4 && @.value == 12)'
But how can I find an entry by the value of an element whose id is looked up by api_name?
-- DOES NOT WORK (just an example of what I need)
SELECT id
FROM entries
WHERE elements @? '$[*] ? (@.id == (SELECT id FROM elements WHERE api_name = 'el_number_2') && @.value == 12)'
Fiddle link
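A sketch of one way to do this (not from the original post; assumes PostgreSQL 12+): bind the id found by api_name to a jsonpath variable with jsonb_path_exists, or rewrite the predicate as a containment check with @>, which the GIN jsonb_ops index can support.
-- Variant 1: pass the subquery result as a jsonpath variable.
SELECT en.id
FROM entries en
WHERE jsonb_path_exists(
    en.elements,
    '$[*] ? (@.id == $eid && @.value == 12)',
    jsonb_build_object('eid', (SELECT el.id FROM elements el WHERE el.api_name = 'el_number_2'))
);
-- Variant 2: containment; the GIN index on entries.elements supports @>.
SELECT en.id
FROM entries en
JOIN elements el ON el.api_name = 'el_number_2'
WHERE en.elements @> jsonb_build_array(jsonb_build_object('id', el.id, 'value', 12));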

postgresql jsonb - from list of integers to list of Objects

I have a question regarding jsonb in postgresql.
I have a table that has a column of type jsonb, where I store a list of integers.
For example, the list_integers column:
[1, 2, 3, 4]
I want to add a new column to this table and populate it with the same IDs, but as a list of objects, where the id field corresponds to the integer.
For example, the list_ids column:
[{"id": 1}, {"id": 2}, {"id": 3}, {"id": 4}]
What would be the best way to do this?
To transform:
test=> SELECT jsonb_agg(jsonb_build_object('id', id))
test-> FROM jsonb_array_elements(jsonb '[1, 2, 3, 4]') id;
jsonb_agg
----------------------------------------------
[{"id": 1}, {"id": 2}, {"id": 3}, {"id": 4}]
(1 row)
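To actually fill the new column, the same aggregation can be applied row by row in an UPDATE (a sketch; the table name my_table is an assumption, the column names come from the question):
ALTER TABLE my_table ADD COLUMN list_ids jsonb;
UPDATE my_table t
SET list_ids = (
    SELECT jsonb_agg(jsonb_build_object('id', elem))
    FROM jsonb_array_elements(t.list_integers) AS elem
);
-- Note: rows whose list_integers array is empty end up with NULL here;
-- wrap the subquery in COALESCE(..., '[]'::jsonb) if an empty array is preferred.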

Access a JSONB array of objects as an object of arrays

I have a JSONB column in my Postgres 9.6 database with roughly the following structure
[
{
"A": "some value",
"B": "another value",
"foo": "bar",
"x": "y"
},
{
"B": "abc",
"C": "asdf"
}
]
It is always an array of objects, and the number of array elements varies. Some of the object keys appear in every array element, but not all of them. The real objects have many more keys; a few hundred are common.
In certain situations, I need to get the value of a specific key for each array element. For example, if I want to access the key "B" the result should be
["another value", "abc"]
if I access "foo" the result should be
["bar", null]
Is there a reasonably efficient way to fetch all values for a specific key in a SQL query? It should work independently of the number of objects in the array, and ideally it should not get slower if the number of keys in the objects gets much larger.
You can use jsonb_array_elements to extract each object, aggregate the values you want in an array using ARRAY_AGG, and then convert that into a json array using array_to_json:
WITH j(json) AS (
VALUES ('[
{
"A": "some value",
"B": "another value",
"foo": "bar",
"x": "y"
},
{
"B": "abc",
"C": "asdf"
}
]'::jsonb)
)
SELECT array_to_json(ARRAY_AGG(elem->'B'))
FROM j, jsonb_array_elements(json) elem
;
array_to_json
-------------------------
["another value","abc"]
(1 row)
WITH j(json) AS (
VALUES ('[
{
"A": "some value",
"B": "another value",
"foo": "bar",
"x": "y"
},
{
"B": "abc",
"C": "asdf"
}
]'::jsonb)
)
SELECT array_to_json(ARRAY_AGG(elem->'foo'))
FROM j, jsonb_array_elements(json) elem
;
array_to_json
---------------
["bar",null]
(1 row)
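Applied to a real table instead of a VALUES list, the same aggregation can run in a scalar subquery per row, and jsonb_agg produces a jsonb array directly without the array_to_json step (a sketch; the table name docs and its columns id and data are assumptions):
SELECT d.id,
       (SELECT jsonb_agg(elem->'B')
        FROM jsonb_array_elements(d.data) AS elem) AS b_values
FROM docs d;
Missing keys come through as SQL NULLs and are aggregated as JSON null, matching the ["bar", null] result above.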

Swift 4 (BETA) merge dictionaries unable to infer type of parameter

I am using the exact sample code from the Apple documentation in the header, but I am getting this error: "Generic parameter 'S' could not be inferred". This is in the Swift 4 BETA WWDC release.
var dictionary = ["a": 1, "b": 2]
dictionary.merge(["a": 3, "c": 4])
{ (current, _) in current }
// ["b": 2, "a": 1, "c": 4]
// Taking the new value for key "a":
dictionary.merge(["a": 5, "d": 6])
{ (_, new) in new }
// ["b": 2, "a": 5, "c": 4, "d": 6]
I tried assigning the dictionaries to variables and hard coding the types, but I still get the same error. Anyone else able to get this to work?
Thanks to the very responsive Swift developers, I got a workaround right away after I tracked down the bug: https://bugs.swift.org/browse/SR-4969
var dictionary = ["a": 1, "b": 2]
dictionary.merge(["a": 3, "c": 4].lazy.map { ($0.key, $0.value) }) { (current, _) in current }
// ["b": 2, "a": 1, "c": 4]
// Taking the new value for key "a":
dictionary.merge(["a": 5, "d": 6].lazy.map { ($0.key, $0.value) }) { (_, new) in new }
// ["b": 2, "a": 5, "c": 4, "d": 6]

Querying JSON in PostgreSQL

CREATE TABLE tableTestJSON (
id serial primary key,
data jsonb
);
INSERT INTO tableTestJSON (data) VALUES
('{}'),
('{"a": 1}'),
('{"a": 2, "b": ["c", "d"]}'),
('{"a": 1, "b": {"c": "d", "e": true}}'),
('{"b": 2}');
I can select the values. There is no problem with this:
SELECT * FROM tableTestJSON;
I can test whether two JSON objects are identical with this query:
SELECT * FROM tableTestJSON WHERE data = '{"a":1}';
This query's output is:
id | data
----+------
2 | {"a": 1}
(1 row)
But I have a problem:
Let's say I have a column with:
{"a": 30}
{"a": 40}
{"a": 50}
In this case, how can I query for all the elements containing a = 30 or a = 40? I was not able to find any solution for 'or', e.g.
select * from table where a in (30, 40); -- ??
How can I query on such a condition?
Extract the value from the json object using the ->> operator:
select *
from tabletestjson
where (data->>'a')::int in (1, 2)
id | data
----+--------------------------------------
2 | {"a": 1}
3 | {"a": 2, "b": ["c", "d"]}
4 | {"a": 1, "b": {"c": "d", "e": true}}
(3 rows)
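An alternative not shown above: express each equality as a containment test with @> and combine them with OR; unlike the ->> cast, this form can be supported by a GIN index on the column.
-- Sketch: a GIN index (default jsonb_ops operator class) supports @>.
CREATE INDEX ON tabletestjson USING GIN (data);
SELECT *
FROM tabletestjson
WHERE data @> '{"a": 1}' OR data @> '{"a": 2}';
For the original IN query, a B-tree expression index on ((data->>'a')::int) would work as well.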