I am new to PostgreSQL. I created a table with a JSON type column:
id,country_code
11767,{"country_code": [{"code": "GB01F290/00", "new": 1}, {"code": "DE08F290/00", "new": 1}, {"code": "GB02F290/00", "new": 1}]}
11768,{"country_code": [{"code": "GB01F290/20", "new": 1}, {"code": "GB20F290/23", "new": 1}]}
list = ["GB01F290/00", "GB21F290/41"]
How can I select the rows where country_code:code contains any element of the list?
There is probably a way to create a jsonpath query to do this, but you would need some way to transform your ["GB01F290/00", "GB21F290/41"] into the correct jsonpath. I'm not very good at jsonpath, so I won't go into that.
Another way to do this would be to use the @> containment operator with the ANY(...) construct. But that takes a PostgreSQL array of jsonb documents as its right-hand side, and each document needs to have a specific structure to match the structure of the documents you are querying. One way to express that array of jsonb would be:
'{"{\"country_code\": [{\"code\": \"GB01F290/00\"}]}","{\"country_code\": [{\"code\": \"GB21F290/41\"}]}"}'::jsonb[]
Or another way, with less obnoxious quoting/escaping would be:
ARRAY['{"country_code": [{"code": "GB01F290/00"}]}'::jsonb, '{"country_code": [{"code": "GB21F290/41"}]}']
A way to obtain that value given your input would be with this query:
select array_agg(jsonb_build_object(
    'country_code',
    jsonb_build_array(jsonb_build_object('code', x))
))
from jsonb_array_elements('["GB01F290/00", "GB21F290/41"]'::jsonb) as f(x)
But there might be better ways of doing that in Python.
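Since Python is mentioned, a minimal sketch of building that jsonb[] parameter client-side. The codes and document shape come from the question; how the resulting strings are passed as a parameter depends on your database driver:

```python
import json

codes = ["GB01F290/00", "GB21F290/41"]

# Build one containment document per code, mirroring the table's structure.
# Each string can then be sent as one element of the jsonb[] parameter.
params = [json.dumps({"country_code": [{"code": c}]}) for c in codes]

print(params[0])  # {"country_code": [{"code": "GB01F290/00"}]}
```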
Then the query would be:
select * from thetable where country_code @> ANY($1::jsonb[])
Where $1 holds the value given in the first block, or the result of the expression given in the 2nd block, or the result of the query given in the third block. You could also combine the queries into one by putting the first into the second as a subquery, but that might inhibit use of indexes.
Note that the column country_code needs to be of type jsonb, not json, for this to work. But that is what it should be anyway.
It would probably be better if you chose a different way to store your data in the first place. An array of objects where each object has a unique name (the value of "code", here) is an antipattern, and should instead be an object of objects, with the unique name being the key. And having objects which just have one key at the top level, which is the same as the name of the column, is another antipattern. And what is the point of "new":1 if it is always present (or is that just an artifact of the example you chose)? Does it convey any meaning? And if you remove all of that stuff, you are left with just a list of strings. Why use jsonb in the first place for that?
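To illustrate the simplification suggested above: if each row stored just a jsonb list of code strings, the lookup reduces to plain membership. A Python sketch of that shape (the codes are from the question; the ?| operator noted in the comment is one possible SQL counterpart for a jsonb array of strings):

```python
import json

# One row's simplified column value: just the codes, no wrapper object
stored = '["GB01F290/00", "DE08F290/00", "GB02F290/00"]'

wanted = {"GB01F290/00", "GB21F290/41"}

# In SQL this would map to something like:
#   country_code ?| array['GB01F290/00', 'GB21F290/41']
match = any(code in wanted for code in json.loads(stored))
print(match)  # True
```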
Related
I am planning to use Drools for executing the DMN models. However, I am having trouble writing a condition in a DMN decision table where the input is an array of objects with a structure data type, and the condition is to check whether the array contains an object with specific fields. For example:
Input to decision table is as below:
[
{
"name": "abc",
"lastname": "pqr"
},
{
"name": "xyz",
"lastname": "lmn"
},
{
"name": "pqr",
"lastname": "jkl"
}
]
Expected output: True if the above list contains an element that matches {"name": "abc", "lastname": "pqr"}, with both fields matching on the same element in the list.
I see that FEEL has support for list contains, but I could not find a syntax for the case where the objects in the array are not primitive types like number or string but structures. So, I need help writing this condition in the decision table.
Thanks!
Edited description:
I am trying to achieve the following using the decision table, wherein details is a list of info structures. Unfortunately, as you see, I am not getting the desired output even though my input list contains the specific element I am looking for.
Input: details = [{"name": "hello", "lastname": "world"}]
Expected Output = "Hello world" based on condition match in row 1 of the decision table.
Actual Output = null
NOTE: Also, in row no. 2 of the decision table, I only check a condition on the name field.
Content for the DMN file can be found over here
The overall need and requirements for the decision table are not clear in this question.
For the part of the question about:
True if the above list contains an element that match {"name": "abc", "lastname": "pqr"}
...
I see that FEEL has support for list contains, but I could not find a syntax where objects in array are not of primitive types like number,string etc but structures.
This can indeed be achieved with the list contains() function, described here.
Example expression
list contains(my list, {"name": "abc", "lastname": "pqr"})
where my list is the verbatim FEEL list from the original question statement.
Example run:
giving the expected output, true.
Naturally, two contexts (complex structures) are the same if all their properties and fields are equivalent.
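For readers more familiar with general-purpose languages, the semantics of list contains() over structures can be sketched with Python's any() and dict equality. This is a rough analogy under stated assumptions, not DMN itself; the data is the list from the question:

```python
users = [
    {"name": "abc", "lastname": "pqr"},
    {"name": "xyz", "lastname": "lmn"},
    {"name": "pqr", "lastname": "jkl"},
]

target = {"name": "abc", "lastname": "pqr"}

# Like FEEL context equality: dicts compare equal when all fields match
result = any(user == target for user in users)
print(result)  # True
```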
In DMN, there are multiple ways to achieve the same result.
If I understand the real goal of your use case, I want to suggest a better approach that is much easier to maintain from a design point of view.
First of all, you have a list of users as input, so these are the data types:
Then, you have to structure your decision a bit:
The decision node at least one user match will go through the user list and check if there is at least one user that matches the conditions inside the matching BKM.
at least one user match can be implemented with the following FEEL expression:
some user in users satisfies matching(user)
The great benefit of this approach is that you can reason on a specific element of your list inside the matching BKM, which makes the matching decision table extremely straightforward:
I have a table (refer to it as A) with one column (refer to it as c) that contains a stringified JSON array in the following format:
[
{"sys": {"type": "Link", "linkType": "Entry", "id": "27OfJChoPO894W4rA6bQ67"}},
{"sys": {"type": "Link", "linkType": "Entry", "id": "2ygvvrBSPuWw0uTW4jdDP2"}}
]
Please note that the array has variable length. The id fields refer to the ID of the second table (B). So, I need to select all fields from A, but populate c with a column from B.
I tried looking for JSON functions to help me get the ids, but I couldn't progress from an array of ids to finally populating it with the column from B. So, my current idea is creating a new table to hold the relation between A and B. What's the best way?
demo:db<>fiddle
You can expand your array and use the elements in the JOIN condition
SELECT
*
FROM
a,
json_array_elements(c) as elems
JOIN b ON b.id = elems -> 'sys' ->> 'id'
However, please think about normalizing your data. You shouldn't store JSON data directly if you don't need to; arrays especially are difficult to handle. If you can save the data into appropriate tables/columns, every single action (update, search, filter and of course join) would be easier and much faster. Furthermore, you get the chance to use proper indexes.
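To see what the JOIN condition above compares on each row, the id extraction from the stringified array can be sketched in plain Python (the data is from the question):

```python
import json

c = '''[
  {"sys": {"type": "Link", "linkType": "Entry", "id": "27OfJChoPO894W4rA6bQ67"}},
  {"sys": {"type": "Link", "linkType": "Entry", "id": "2ygvvrBSPuWw0uTW4jdDP2"}}
]'''

# json_array_elements(c) yields one element per row; -> 'sys' ->> 'id'
# then extracts the text id that the JOIN condition compares with b.id.
ids = [elem["sys"]["id"] for elem in json.loads(c)]
print(ids)
```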
The JSON column type accepts non-valid JSON; e.g. [1,2,3] can be inserted without the enclosing {}.
Is there any difference between JSON and string?
While [1,2,3] is valid JSON, as zerkms has stated in the comments, to answer the primary question: Is there any difference between JSON and string?
The answer is yes. A whole new set of query operations, functions, etc. apply to json or jsonb columns that do not apply to text (or related types) columns.
For example, while with text columns you would need to use regular expressions and related string functions to parse the string (or a custom function), with json or jsonb, there exists a separate set of query operators that works within the structured nature of JSON.
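That contrast can be sketched in Python: the same value handled as raw text needs brittle pattern matching, while parsed as JSON it is structured data. A rough analogy to the text vs json/jsonb operator sets, not PostgreSQL itself:

```python
import json
import re

raw = '{"company": "Magnafone", "tags": ["enim", "aliquip", "qui"]}'

# Text-style access: fragile regex surgery on the raw string
m = re.search(r'"company":\s*"([^"]*)"', raw)
company_via_regex = m.group(1)

# JSON-style access: navigate the parsed structure directly
company_via_json = json.loads(raw)["company"]

print(company_via_regex == company_via_json)  # True
```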
From the Postgres doc, given the following JSON:
{
"guid": "9c36adc1-7fb5-4d5b-83b4-90356a46061a",
"name": "Angela Barton",
"is_active": true,
"company": "Magnafone",
"address": "178 Howard Place, Gulf, Washington, 702",
"registered": "2009-11-07T08:53:22 +08:00",
"latitude": 19.793713,
"longitude": 86.513373,
"tags": [
"enim",
"aliquip",
"qui"
]
}
The doc then says:
We store these documents in a table named api, in a jsonb column named
jdoc. If a GIN index is created on this column, queries like the
following can make use of the index:
-- Find documents in which the key "company" has value "Magnafone"
SELECT jdoc->'guid', jdoc->'name' FROM api WHERE jdoc @> '{"company": "Magnafone"}';
This allows you to query the jsonb (or json) fields very differently than if it were simply a text or related field.
Here is some Postgres doc that provides some of those query operators and functions.
Basically, if you have JSON data that you want to treat as JSON data, then a column is best specified as json or jsonb (which one you choose depends on whether you want to store it as plain text or binary, respectively).
The above data can be stored as text, but the JSON data types have the advantage that you can apply JSON rules in those columns. There are several JSON-specific functions which cannot be used on text fields.
Refer to this link to learn about the JSON functions/procedures.
I have a JSON document which I am storing as jsonb in Postgres.
{
"name": "Mr. Json",
"dept":{
"team":{
"aliases":["a1","a2","a3"],
"team_name": "xyz"
},
"type":"engineering",
"lead":"Mr. L"
},
"hobbies": ["Badminton", "Chess"],
"is_active": true
}
I have created a GIN index on the column.
I need to do exact match queries like all rows containing type='engineering' and lead='Mr. L'.
I am currently doing containment queries like:
data @> '{"dept":{"type":"engineering"}}' and data @> '{"dept":{"lead":"Mr. L"}}'
I looked at the query plan, which shows the GIN index being used, but I am unsure whether this is working as intended or if there is a better way of achieving this.
Will I have to construct another index on nested keys?
Does indexing a jsonb column index the nested keys or just the top-level ones?
Also please share some good resource on this.
From docs:
The default GIN operator class for jsonb supports queries with the top-level key-exists operators ?, ?& and ?| and the path/value-exists operator @>.
For containment (@>) it works with nested values. For other operators it works only on top-level keys, or on whatever level is used in an expression index. Also, according to the documentation, using an expression index on the level you want to query will be faster than a simple index on the whole column (which makes sense, as the index is smaller).
If you are doing only containment search, consider using jsonb_path_ops while building your index. It is smaller and faster.
In my Symfony 2 project, I'm filtering search results using a query builder. In my MongoDB I have some values in an array.
The query builder has the in operator that allows querying for values that equal one of many in an array. I want to perform the opposite operation, i.e. given a single value, query for entries in the database that contain an array that contains my value.
For instance, say I have this entry in my MongoDB:
{
"_id": 123,
"name": "John",
"countries_visited":
[
"Australia",
"Bulgaria",
"Canada"
]
}
And I want to query my database for persons who have visited "Canada". Right now, I'm using the where attribute as follows, but I'm looking for a better way to do this.
$qb->field('countries_visited')->where("function() {
    return this.countries_visited.indexOf('".$countryName."') > -1;
}");
edit:
The in and notIn operators receive an array as a parameter and compare it against a single value in MongoDB. I need to provide a single parameter and apply it to an array field in MongoDB, hence "inverse in". I guess I need a contains operator, if there is such a thing.
Interestingly, MongoDB takes care of this automatically. When querying for a single value against an array field, Mongo will assume you want to check whether the array contains that value.
Taken from the docs:
Match an Array Element
Equality matches can specify a single element in the array to match. These specifications match if the array contains at least one element with the specified value.
So you should be able to do
$users = $dm->getRepository('User')->findOneBy([
'countries_visited' => 'Canada'
]);
or
$qb->field('countries_visited')->equals('Canada');
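The matching rule from the quoted docs can be sketched in plain Python. This is a simplification of Mongo's equality semantics for illustration, not the driver itself:

```python
doc = {
    "_id": 123,
    "name": "John",
    "countries_visited": ["Australia", "Bulgaria", "Canada"],
}

def field_matches(document, field, value):
    """Mongo-style equality: an array field matches if any element equals value."""
    stored = document.get(field)
    if isinstance(stored, list):
        return value in stored
    return stored == value

print(field_matches(doc, "countries_visited", "Canada"))  # True
```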