I have a JSON column in my table that holds an array of dictionaries. The array has a standard format:
[{"path": "chat/xyz.pdf", "file_name": "xyz.pdf"},
 {"path": "chat/xyl.pdf", "file_name": "xyl.pdf"}]
The table name is chat and the column name is attachments. I want to search on file names so that a row is retrieved even if only part of the name matches. For example: if I search for the string 'pd', then all rows whose file_name contains 'pd' should be retrieved.
I tried this, and it did work:
select distinct attachments from chat, jsonb_array_elements_text(attachments)
where value::json->>'file_name' like '%xyz%';
I used the documentation as a reference.
You can use an EXISTS condition, then you don't need the DISTINCT:
select c.*
from chat c
where exists (select *
from jsonb_array_elements(c.attachments) as t(doc)
where t.doc ->> 'file_name' like '%xyz%');
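As a sanity check on the row-level semantics of the EXISTS query (keep a chat row if any of its attachments has a matching file_name), here is a small Python sketch over made-up sample rows:

```python
# Keep a row if ANY attachment's file_name contains the search term,
# mirroring EXISTS (... WHERE t.doc ->> 'file_name' LIKE '%term%').
rows = [
    {"id": 1, "attachments": [{"path": "chat/xyz.pdf", "file_name": "xyz.pdf"}]},
    {"id": 2, "attachments": [{"path": "chat/abc.txt", "file_name": "abc.txt"}]},
]

def matches(row, term):
    # True as soon as one element of the array matches
    return any(term in a["file_name"] for a in row["attachments"])

hits = [r["id"] for r in rows if matches(r, "pd")]  # 'pd' matches 'xyz.pdf'
```

Because the check is per row rather than per array element, each chat row appears at most once, which is why the DISTINCT becomes unnecessary.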
I've been looking around and can't seem to find anything that helps me understand how to achieve the following. (Bear in mind I have simplified this down to the problem I'm having, and I am only storing simple JSON objects in this field.)
Given I have a table "test" defined
CREATE TABLE test (
id int primary key
, features jsonb
)
And some test data:

id | features
---+----------------------
 1 | {"country": "Sweden"}
 2 | {"country": "Denmark"}
 3 | {"country": "Norway"}
I've been trying to filter on the JSONB column "features". I can do this easily with one value
SELECT *
FROM test
WHERE features #> '{"country": "Sweden"}'
But I've been having trouble working out how I could filter by multiple values succinctly. I can do this
SELECT *
FROM test
WHERE features #> '{"country": "Sweden"}'
OR features #> '{"country": "Norway"}'
But I have been wondering if there would be an equivalent to WHERE IN ($1, $2, ...) for JSONB columns.
I suspect that I will likely need to stick with the WHERE... OR... but would like to know if there is another way to achieve this.
You can use jsonb ->> 'field_name' to extract a field as text, then use any operator compatible with the text type:
SELECT *
FROM test
WHERE features->>'country' = 'Sweden'
SELECT *
FROM test
WHERE features->>'country' in ('Sweden', 'Norway')
You can also work with jsonb directly:
jsonb -> 'field_name' extracts the field as jsonb, and you can then use any operator compatible with jsonb:
SELECT *
FROM test
WHERE features->'country' ?| array['Sweden', 'Norway']
See the docs for more details.
You can extract the country value, then you can use a regular IN condition:
select *
from test
where features ->> 'country' in ('Sweden', 'Norway')
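The logic of extracting the value as text and testing set membership can be sketched in Python, using the sample rows from the question:

```python
# Mirrors: WHERE features ->> 'country' IN ('Sweden', 'Norway')
rows = [
    {"id": 1, "features": {"country": "Sweden"}},
    {"id": 2, "features": {"country": "Denmark"}},
    {"id": 3, "features": {"country": "Norway"}},
]

wanted = {"Sweden", "Norway"}
result = [r["id"] for r in rows if r["features"].get("country") in wanted]
```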
I have a jsonb field 'data' in my table. I can get the value of subKey with select data #> '{key1,subKey}' from table. How can I inject a path into the select if the path is stored in another table as the string 'key1,subKey'?
You should really store it as an array, not as a string, since that is how it will be used. You can just split it to an array dynamically, but then what if you ever need the comma to appear literally in the path?
with t as (select '{"key1":{"subKey":"foo"}}'::jsonb as data),
k as (select 'key1,subKey' as k)
select data#>regexp_split_to_array(k,',') from t,k;
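The splitting-and-traversal the query performs can be sketched in Python, with the same sample data:

```python
# Split the stored path on commas and walk the nested JSON step by step,
# like data #> regexp_split_to_array(k, ',') in the query above.
data = {"key1": {"subKey": "foo"}}
path = "key1,subKey"

value = data
for key in path.split(","):
    value = value[key]  # descend one level per path element
```

As noted above, this breaks if a key legitimately contains a comma, which is why storing the path as an array is preferable.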
Is there a way to combine two finds in MongoDB similar to the SQL subqueries?
What would be the equivalent of something like:
SELECT * FROM TABLE1 WHERE name = (SELECT name FROM TABLE2 WHERE surname = 'Smith');
I need to get a uuid from one collection by searching on email and then use it to filter another. If possible, I would like to do this with just one find instead of getting the uuid, storing it in a variable, and then searching a second time with that variable...
Here are the two that I want combined somehow:
db.getCollection('person').find({email:'perry.goodwin@yahoo.com'}).sort({_id:-1});
db.getCollection('case').find({applicantUuid:'4a17e96c-caf9-4d78-a853-73e190005c63'});
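As a side note on the logic: MongoDB's find() cannot embed a subquery, so this is usually done with either two queries or an aggregation $lookup. A minimal Python sketch of the two-step approach over made-up documents (the caseId field here is hypothetical):

```python
# Step 1: look up the uuid by email; step 2: filter cases by that uuid.
person = [{"uuid": "4a17e96c-caf9-4d78-a853-73e190005c63",
           "email": "perry.goodwin@yahoo.com"}]
case = [{"caseId": 1, "applicantUuid": "4a17e96c-caf9-4d78-a853-73e190005c63"},
        {"caseId": 2, "applicantUuid": "some-other-uuid"}]

uuid = next(p["uuid"] for p in person
            if p["email"] == "perry.goodwin@yahoo.com")
matching = [c for c in case if c["applicantUuid"] == uuid]
```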
I have a source field in an Oracle DB table, data type VARCHAR2(512 CHAR), which looks like this:
%custId{str:hGl0EWJsRTwweerRkaaKsdKDsqKm0123}
%prod{str:BalanceAmount}%logistic{str:Logistic}%hiringdate{str:1999-02-28T11:10:11p}%custId{str:FpseikiD0Jt1L0Mskdww8oZBjU4La123}
For my extract I must only consider the %custId portion; only that alphanumeric data has to be captured and populated in the extract. The problem is that this is just one example from the source; there can be any number of combinations, but I only have to consider %custId, e.g.
%custId{str:hGl0EWJsRTwweerRkaaKsdKDsqKm0123}
Which function do I need to use: SUBSTR, LPAD?
After using the query below:
SELECT
field,
REGEXP_SUBSTR(field, '%custId\{.*?\}') AS custId
FROM yourTable
where col_source='%prod{str:BalanceAmount}%logistic{str:Logistic}%hiringdate{str:1999-02-28T11:10:11p}%custId{str:FpseikiD0Jt1L0Mskdww8oZBjU4La123}'
Result
%custId{str:FpseikiD0Jt1L0Mskdww8oZBjU4La123}
but the expected result is
FpseikiD0Jt1L0Mskdww8oZBjU4La123
You may use REGEXP_SUBSTR here:
SELECT
field,
REGEXP_SUBSTR(field, '%custId\{str:(.*?)\}', 1, 1, NULL, 1) AS custId
FROM yourTable;
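The sixth argument of REGEXP_SUBSTR selects the capture group, so only the id between %custId{str: and } is returned, not the whole wrapper. The capture-group behaviour can be checked with Python's re, which treats this pattern the same way:

```python
import re

# Sample value from the question; the group (.*?) captures only the id.
s = ("%prod{str:BalanceAmount}%logistic{str:Logistic}"
     "%hiringdate{str:1999-02-28T11:10:11p}"
     "%custId{str:FpseikiD0Jt1L0Mskdww8oZBjU4La123}")

m = re.search(r"%custId\{str:(.*?)\}", s)
cust_id = m.group(1)
```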
We have a table where one of the columns is an array. I need to select a row or many rows as long as my search value matches their values using ILIKE. My problem is that I need to search the values of an array column as well. I tried using ANY but the value needs to be exact to select a row. I need something similar to ILIKE but for that array column.
Thank you in advance.
Use unnest function:
SELECT x.value
FROM my_table t, unnest(t.my_array_column) as x(value)
WHERE x.value ILIKE '%foo%'
Since your question is also tagged elixir: to convert this to Ecto, use Ecto.Query.API.fragment/1 for the condition and Ecto.Query.API.ilike/2 for the match.
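The matching semantics (a case-insensitive substring test against each element of the array) can be sketched in Python over hypothetical rows with a made-up tags array column:

```python
# Mirrors unnest(t.my_array_column) ... ILIKE '%foo%': a row matches if
# ANY element of its array contains the term, ignoring case.
rows = [
    {"id": 1, "tags": ["Foobar", "baz"]},
    {"id": 2, "tags": ["qux"]},
]

def any_ilike(values, term):
    return any(term.lower() in v.lower() for v in values)

hits = [r["id"] for r in rows if any_ilike(r["tags"], "foo")]
```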