Postgres jsonb_insert Insert Only When Object Doesn't Already Exist - postgresql

I am trying to add items to a JSON array using jsonb_insert as:
jsonb_insert(document, '{path,to,array,0}', '{"key":"value"}')
This inserts {"key":"value"} at the head of the array. However, I want to perform the insert only if the JSON object {"key":"value"} does not already exist in the array. Currently, it simply adds it to the array without checking for duplicates (which is how it is intended to work). I am just wondering if there is a way to handle duplicate records this way.

Wrap that call in a CASE expression that checks whether the object already exists in the array:
with invars as (
    select '{
        "path": {
            "to": {
                "array": [
                    {
                        "key": "value"
                    }
                ]
            }
        }
    }'::jsonb as document
)
select case
         when jsonb_path_exists(document, '$.path.to.array[*].key ? (@ == "value")')
           then document
         else jsonb_insert(document, '{path,to,array,0}', '{"key": "value"}')
       end as desired_result,
       jsonb_insert(document, '{path,to,array,0}', '{"key": "value"}') as old_result
from invars;
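If the goal is to update a table in place rather than just select the result, the same check can move into the WHERE clause. A minimal sketch, assuming a hypothetical table documents with a jsonb column document:
UPDATE documents
SET document = jsonb_insert(document, '{path,to,array,0}', '{"key": "value"}')
WHERE NOT jsonb_path_exists(document, '$.path.to.array[*].key ? (@ == "value")');
-- rows whose array already contains an object with "key": "value" are left untouched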

Related

Aggregate results based on array of strings in JSON?

I have a table with a field called 'keywords'. It is a JSONB field with an array of keyword metadata, including the keyword's name.
What I would like is to query the counts of all these keywords by name, i.e. aggregate on keyword name and count(id). All the examples of GROUP BY queries I can find just result in the grouping occurring on the full list (i.e. only giving me counts where two records have the same set of keywords).
So is it possible to somehow expand the list of keywords in a way that lets me get these counts?
If not, I am still at the planning stage and could refactor my schema to better handle this.
"keywords": [
{
"addedAt": "2017-04-07T21:11:00+0000",
"addedBy": {
"email": "foo#bar.com"
},
"keyword": {
"name": "Animal"
}
},
{
"addedAt": "2017-04-07T20:54:00+0000",
"addedBy": {
"email": "foo#bar.comm"
},
"keyword": {
"name": "Mammal"
}
}
]
SELECT
    elems -> 'keyword' ->> 'name' AS keyword,              -- 2
    COUNT(*) AS count
FROM
    mytable t,
    jsonb_array_elements(myjson -> 'keywords') AS elems    -- 1
GROUP BY 1                                                 -- 3
1. Expand the array records into one row per element
2. Get the keywords' names
3. Group these text values
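To try this out, here is a minimal self-contained setup; the table and column names mytable and myjson match the query above, and the two demo rows are made up:
CREATE TABLE mytable (id serial PRIMARY KEY, myjson jsonb);
INSERT INTO mytable (myjson) VALUES
    ('{"keywords": [{"keyword": {"name": "Animal"}}, {"keyword": {"name": "Mammal"}}]}'),
    ('{"keywords": [{"keyword": {"name": "Animal"}}]}');
-- with these two rows the query returns: Animal = 2, Mammal = 1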

Postgres jsonb nested array append

I have a simple table with a jsonb column:
CREATE TABLE things (
    id SERIAL PRIMARY KEY,
    data jsonb
);
with data that looks like:
{
    "id": 1,
    "title": "thing",
    "things": [
        {
            "title": "thing 1",
            "moreThings": [
                { "title": "more thing 1" }
            ]
        }
    ]
}
So how do I append inside of a deeply nested array like moreThings?
For a single-level nested array I could do this and it works:
UPDATE posts SET data = jsonb_set(data, '{things}', data->'things' || '{ "text": "thing" }', true);
But the same doesn't work for deeply nested arrays:
UPDATE posts SET data = jsonb_set(data, '{things}', data->'things'->'moreThings' || '{ "text": "thing" }', true)
How can I append to moreThings?
It works just fine:
UPDATE things
SET data =
    jsonb_set(data,
              '{things,0,moreThings}',
              data->'things'->0->'moreThings' || '{ "text": "thing" }',
              TRUE
             )
WHERE id = 1;
If you have a table that consists only of a primary key and a jsonb attribute and you regularly want to manipulate this jsonb in the database, you are certainly doing something wrong. Your life will be much easier if you normalize the data some more.
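To illustrate that advice, one possible normalized layout could look like the sketch below (table and column names are made up, and the layout mirrors the document / things / moreThings nesting above):
CREATE TABLE documents (
    id    serial PRIMARY KEY,
    title text
);

CREATE TABLE things (
    id          serial PRIMARY KEY,
    document_id integer REFERENCES documents(id),
    title       text
);

CREATE TABLE more_things (
    id       serial PRIMARY KEY,
    thing_id integer REFERENCES things(id),
    title    text
);

-- appending to "moreThings" then becomes a plain INSERT instead of jsonb surgery
INSERT INTO more_things (thing_id, title) VALUES (1, 'more thing 2');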

PostgreSQL query of JSONB array by nested object

I have the following array JSON data structure:
{
    "arrayOfObjects": [
        {
            "fieldA": "valueA1",
            "fieldB": { "fieldC": "valueC", "fieldD": "valueD" }
        },
        {
            "fieldA": "valueA",
            "fieldB": { "fieldC": "valueC", "fieldD": "valueD" }
        }
    ]
}
I would like to select all records where fieldD matches my criteria (and fieldC is unknown). I've seen similar answers such as Query for array elements inside JSON type, but there the field being queried is a simple string (akin to searching on fieldA in my example), whereas my problem is that I would like to query based on an object within an object within the array.
I've tried something like select * from myTable where jsonData -> 'arrayOfObjects' @> '[ { "fieldB": { "fieldD": "valueD" } } ]' but that doesn't seem to work.
How can I achieve what I want?
You can execute a "contains" query on the JSONB field directly and pass the minimum you're looking for:
SELECT *
FROM mytable
WHERE json_data @> '{"arrayOfObjects": [{"fieldB": {"fieldD": "valueD"}}]}'::JSONB;
This of course assumes that fieldD is always nested under fieldB, but that's a fairly low bar to clear in terms of schema consistency.
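If this kind of containment query runs frequently, note that a GIN index on the jsonb column supports the @> operator (a sketch; the index name is made up):
CREATE INDEX mytable_json_data_gin ON mytable USING gin (json_data);
-- the planner can use this index for json_data @> '...' containment checks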

How to push a JSON object to a nested array in a JSONB column

I need to somehow push a JSON object to a nested array of potentially existing JSON objects - see "pages" in the JSON snippet below.
{
    "session_id": "someuuid",
    "visitor_ui": 1,
    "pages": [
        {
            "datetime": "2016-08-13T19:45:40.259Z",
            "duration,": 0,
            "device_id": 1,
            "url": {
                "path": "/"
            }
        },
        {
            "datetime": "2016-08-14T19:45:40.259Z",
            "duration,": 0,
            "device_id": 1,
            "url": {
                "path": "/test"
            }
        },
        // how can i push a new value (page) here??
    ],
    "visit_page_count": 2
}
I'm aware of jsonb_set(target jsonb, path text[], new_value jsonb[, create_missing boolean]) (although I still find it a bit hard to comprehend), but I guess using that would require that I first SELECT the whole JSONB column in order to find out how many elements already exist inside "pages" and what index to push to with jsonb_set, right? I'm hoping there's a way in Postgres 9.5 / 9.6 to achieve the equivalent of what we know from programming languages, e.g. pages.push({"key": "val"}).
What would be the best and easiest way to do this with Postgresql 9.5 or 9.6?
The trick to jsonb_set() is that it modifies part of a jsonb object but returns the entire object. So you pass it the current value of the column and the path you want to modify ("pages" here, as a string array), then you take the existing array (my_column->'pages') and append the new object to it with ||. All other parts of the jsonb object remain as they were. You are effectively assigning a completely new object to the column, but that is irrelevant because an UPDATE writes a new row to the physical table anyway.
UPDATE my_table
SET my_column = jsonb_set(my_column, '{pages}', my_column->'pages' || new_json, true);
The optional create_missing parameter set to true here adds the "pages" object if it does not already exist.
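For illustration, here is the same statement with a literal page object filled in (a sketch: the new page values and the WHERE clause are made up, and coalesce is added as a guard in case "pages" does not exist yet, since concatenating with NULL would otherwise make the new value NULL):
UPDATE my_table
SET my_column = jsonb_set(
        my_column,
        '{pages}',
        coalesce(my_column->'pages', '[]'::jsonb)
            || '{"datetime": "2016-08-15T10:00:00.000Z", "device_id": 1, "url": {"path": "/about"}}'::jsonb,
        true)
WHERE my_column->>'session_id' = 'someuuid';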

mongodb-php: "key" side value for nested querying with find() does not work

I want to retrieve the records that match a booking's client id and show them to the client. I am doing the following:
$mongoDb = $mongoDb->selectCollection('booking');
$bookingInfo = $mongoDb->find(array("client.id" => $_SESSION['client_id']));
My mongo database record looks like this:
"paymentDue": "",
"client": {
"contacts": [
{
"name": "loy furison",
"email": "loy#hotmail.com"
}
],
"id": "5492abba64363df013000029",
"name": "Birdfire"
},
I want to run the query with client.id as the key in find(), but this query doesn't work. What is the issue?
The odd part is that the only difference is the key name: if I query on client.name instead, records are returned, and I can then put them into a JSON object, loop over each record with foreach, retrieve and compare the ids myself, and that works. But the direct query on client.id returns nothing, and I don't understand why.