I have the following array JSON data structure:
{ "arrayOfObjects":
[
{
"fieldA": "valueA1",
"fieldB": { "fieldC": "valueC", "fieldD": "valueD" }
},
{
"fieldA": "valueA",
"fieldB": { "fieldC": "valueC", "fieldD": "valueD" }
}
]
}
I would like to select all records where fieldD matches my criteria (and fieldC is unknown). I've seen similar answers such as Query for array elements inside JSON type, but there the field being queried is a simple string (akin to searching on fieldA in my example), whereas my problem is that I need to query based on an object within an object within the array.
I've tried something like select * from myTable where jsonData -> 'arrayOfObjects' @> '[ { "fieldB": { "fieldD": "valueD" } } ]' but that doesn't seem to work.
How can I achieve what I want?
You can execute a "contains" query on the JSONB field directly and pass the minimum you're looking for:
SELECT *
FROM mytable
WHERE json_data @> '{"arrayOfObjects": [{"fieldB": {"fieldD": "valueD"}}]}'::JSONB;
This of course assumes that fieldD is always nested under fieldB, but that's a fairly low bar to clear in terms of schema consistency.
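If this query runs frequently, a GIN index on the column lets PostgreSQL use the containment operator efficiently. A sketch, assuming the mytable / json_data names used above:

```sql
-- The jsonb_path_ops operator class supports only the @> operator,
-- but produces a smaller, faster index for exactly this kind of query.
CREATE INDEX mytable_json_data_path_idx
    ON mytable USING GIN (json_data jsonb_path_ops);

-- The containment query from above can then use the index:
SELECT *
FROM mytable
WHERE json_data @> '{"arrayOfObjects": [{"fieldB": {"fieldD": "valueD"}}]}'::jsonb;
```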
Related
I have a complex object with JSONB (PostgreSQL 12), that has nested arrays in nested arrays and so on. I search for all invoices, that contains specific criteria.
create table invoice:
invoice_number text not null primary key,
parts jsonb,
...
Object:
"parts": [
{
"groups": [
{
"categories": [
{
"items": [
{
...
"articleName": "article1",
"articleSize": "M",
},
{
...
"articleName": "article2"
"articleSize": "XXL",
}
]
}
]
}
]
},
{
"groups": [
...
]
}
]
I've built a native query to search for items with a specific articleName:
select * from invoice i,
jsonb_array_elements(i.parts) as part,
jsonb_array_elements(part -> 'groups') as grp,
jsonb_array_elements(grp -> 'categories') as category,
jsonb_array_elements(category -> 'items') as item
where item ->> 'articleName' like '%name%' and item ->> 'articleSize' = 'XXL';
I assume I could improve search speed with indexing. I've read about trigram indexes. Would that be the best type of index for my case? If yes -> how do I build it for such a complex object?
Thanks in advance for any advice.
The only option that might speed this up is to create a GIN index on the parts column and use a JSON path operator:
select *
from invoice
where parts @? '$.parts[*].groups[*].categories[*].items[*] ? (@.articleName like_regex "name" && @.articleSize == "XXL")'
But I doubt this is going to be fast enough, even if that uses the index.
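A sketch of that index, using the table and column names from the question. Note that the equality condition in the path filter can be served by the index, while the like_regex condition generally cannot and is re-checked per candidate row:

```sql
-- Default jsonb_ops GIN index; it supports the @? and @@ jsonpath operators.
CREATE INDEX invoice_parts_idx ON invoice USING GIN (parts);

-- The == comparison can narrow rows via the index;
-- like_regex is evaluated on the remaining rows afterwards.
SELECT *
FROM invoice
WHERE parts @? '$.parts[*].groups[*].categories[*].items[*]
                ? (@.articleName like_regex "name" && @.articleSize == "XXL")';
```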
I am trying to add items to a JSON array using jsonb_insert as:
jsonb_insert(document, '{path,to,array,0}', '{"key":"value"}')
This inserts {"key":"value"} at the head of the array. However, I want to do this insert only if the JSON object {"key":"value"} does not already exist in the array. Currently it just adds it to the array without checking for duplicates (which is how it is intended to work). I am just wondering if there is a way to avoid duplicate records this way.
Wrap that call in a case that checks whether the object already exists in the array:
with invars as (
select '{
"path": {
"to": {
"array": [
{
"key": "value"
}
]
}
}
}'::jsonb as document
)
select case
when jsonb_path_exists(document, '$.path.to.array[*].key ? (@ == "value")')
then document
else jsonb_insert(document, '{path,to,array,0}', '{"key": "value"}')
end as desired_result,
jsonb_insert(document, '{path,to,array,0}', '{"key": "value"}') as old_result
from invars;
db<>fiddle here
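If the goal is to persist the result rather than just select it, the same duplicate guard can be expressed directly in an UPDATE. A sketch, assuming a hypothetical mytable with a jsonb column named document:

```sql
-- Only rows whose array does not yet contain the object are touched;
-- all other rows are left unchanged by the WHERE clause.
UPDATE mytable
SET document = jsonb_insert(document, '{path,to,array,0}', '{"key": "value"}')
WHERE NOT jsonb_path_exists(document, '$.path.to.array[*].key ? (@ == "value")');
```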
Using PostgreSQL 13.4, I have a query like this, which is used for a GraphQL endpoint:
export const getPlans = async (filter: {
offset: number;
limit: number;
search: string;
}): Promise<SearchResult<Plan>> => {
const query = sql`
SELECT
COUNT(p.*) OVER() AS total_count,
p.json, TO_CHAR(MAX(pp.published_at) AT TIME ZONE 'JST', 'YYYY-MM-DD HH24:MI') as last_published_at
FROM
plans_json p
LEFT JOIN
published_plans pp ON p.plan_id = pp.plan_id
WHERE
1 = 1
`;
if (filter.search)
query.append(sql`
AND
(
p.plan_id::text ILIKE ${`${filter.search}%`}
OR
p.json->>'name' ILIKE ${`%${filter.search}%`}
OR
p.json->'activities'->'venue'->>'name' ILIKE ${`%${filter.search}%`}
)
`);
// The above OR line or this alternative didn't work
// p #> '{"activities":[{"venue":{"name":${`%${filter.search}`}}}]}'
.
.
.
}
The data I'm accessing looks like this:
{
"data": {
"plans": {
"records": [
{
"id": "345sdfsdf",
"name": "test1",
"activities": [{...},{...}]
},
{
"id": "abc123",
"name": "test2",
"activities": [
{
"name": "test2",
"id": "123abc",
"venue": {
"name": *"I WANT THIS VALUE"* <------------------------
}
}
]
}
]
}
}
}
Since the search parameter provided to this query varies, I can only make changes in the WHERE block, in order to avoid affecting the other two working searches.
I tried two approaches (see the query above), but neither worked.
Using TypeORM would be an alternative.
EDIT: For example, could I make that statement work somehow? I want to compare VALUE with the search string that is provided as argument:
p.json ->> '{"activities":[{"venue":{"name": VALUE}}]}' ILIKE ${`%${filter.search}`}
First, you should use the jsonb type instead of the json type in Postgres, for many reasons; see the manual:
... In general, most applications should prefer to store JSON data as
jsonb, unless there are quite specialized needs, such as legacy
assumptions about ordering of object keys...
Then you can use the following query to get the whole JSON data based on the search_parameter provided to the query via the user interface, as long as search_parameter is a regular expression (see the manual):
SELECT query
FROM plans_json p
CROSS JOIN LATERAL jsonb_path_query(p.json::jsonb, FORMAT('$ ? (@.data.plans.records[*].activities[*].venue.name like_regex "%s")', search_parameter)::jsonpath) AS query
If you need to retrieve only part of the JSON data, move the corresponding portion of the jsonpath from the filter expression ('? (...)') into the leading '$' path passed to jsonb_path_query. For instance, if you just want to retrieve the jsonb object {"name": "test2", ...}, the query is:
SELECT query
FROM plans_json p
CROSS JOIN LATERAL jsonb_path_query(p.json::jsonb, FORMAT('$.data.plans.records[*].activities[*] ? (@.venue.name like_regex "%s")', search_parameter)::jsonpath) AS query
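Since the question only needs a boolean test inside the WHERE block, jsonb_path_exists may fit more naturally than extracting values. A sketch under the same assumption that search_parameter arrives as a regular expression:

```sql
-- Returns only the rows whose JSON contains at least one matching venue name.
SELECT p.json
FROM plans_json p
WHERE jsonb_path_exists(
        p.json::jsonb,
        format('$.data.plans.records[*].activities[*].venue.name ? (@ like_regex "%s")',
               search_parameter)::jsonpath
      );
```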
I have a table with a field called 'keywords'. It is a JSONB field with an array of keyword metadata, including the keyword's name.
What I would like is to query the counts of all these keywords by name, i.e. aggregate on keyword name and count(id). All the examples of GROUP BY queries I can find just result in the grouping occurring on the full list (i.e. only giving me counts where two records have the same set of keywords).
So is it possible to somehow expand the list of keywords in a way that lets me get these counts?
If not, I am still at the planning stage and could refactor my schema to better handle this.
"keywords": [
{
"addedAt": "2017-04-07T21:11:00+0000",
"addedBy": {
"email": "foo#bar.com"
},
"keyword": {
"name": "Animal"
}
},
{
"addedAt": "2017-04-07T20:54:00+0000",
"addedBy": {
"email": "foo#bar.comm"
},
"keyword": {
"name": "Mammal"
}
}
]
step-by-step demo: db<>fiddle
SELECT
elems -> 'keyword' ->> 'name' AS keyword, -- 2
COUNT(*) AS count
FROM
mytable t,
jsonb_array_elements(myjson -> 'keywords') AS elems -- 1
GROUP BY 1 -- 3
1. Expand the array records into one row per element
2. Get the keyword names
3. Group these text values.
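To see what step 1 produces before the aggregation kicks in, the expansion can be run on its own. A sketch using the same assumed mytable/myjson names, and assuming the table has an id column:

```sql
-- One output row per (record, keyword) pair, e.g. the sample record
-- above would yield one row for 'Animal' and one for 'Mammal'.
SELECT t.id,
       elems -> 'keyword' ->> 'name' AS keyword
FROM mytable t
CROSS JOIN jsonb_array_elements(t.myjson -> 'keywords') AS elems;
```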
{
"_id":"12345",
"model":{
"T":0,
"serviceTask":[
{
"_id":"6789",
"obj":{
"params" :{
"action": "getEmployeeData",
"employeeId":"123"
},
"url":"www.test.com",
"var":"test",
},
"opp":100
}
]
}
}
I have similarly structured documents in my collection. How do I query the documents whose action value matches "getEmployeeData"? I tried dot notation with $elemMatch but couldn't get the results.
Dot notation works fine here; it will return the document if serviceTask contains at least one action set to getEmployeeData:
db.collection.find({
"model.serviceTask.obj.params.action": "getEmployeeData"
})
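For completeness, $elemMatch only becomes necessary when several conditions must hold for the same array element. A hypothetical example combining the action with the opp field shown in the document:

```javascript
// Matches only if a single serviceTask element satisfies both conditions;
// plain dot notation would also match documents where the conditions are
// satisfied by two different elements.
db.collection.find({
  "model.serviceTask": {
    $elemMatch: {
      "obj.params.action": "getEmployeeData",
      "opp": 100
    }
  }
})
```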