I have the following table called module_data. Currently it has three rows of entries:
id data
0ab5203b-9157-4934-8aba-1512afb0abd0 {"title":"Board of Supervisors Meeting","id":"1i3Ytw1mw98"}
7ee33a18-63da-4432-8967-bde5a44347a0 {"title":"Board of Supervisors Meeting","id":"4-dNAg2mn6o"}
8d71ca35-74eb-4751-b635-114bf04843f1 {"title":"COPD 101", "id":"l9O0jCR-sxg"}
Column data's datatype is jsonb, and I'm trying to query it using the LIKE operator. Something like the following:
SELECT * FROM module_data WHERE title LIKE '%Board%';
I've been looking at jsonb support, and there doesn't seem to be a LIKE operator. Any advice would be appreciated.
If the data column is of type text, cast it to json and use ->>:
select * from module_data where data::json->>'title' like '%Board%'
If it's already json:
select * from module_data where data->>'title' like '%Board%'
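If the table grows, a query like this can't use an ordinary index. A minimal sketch of one option, assuming the pg_trgm extension is available (the index name is made up):

CREATE EXTENSION IF NOT EXISTS pg_trgm;

-- trigram index on the extracted title, so LIKE '%Board%' can use it
CREATE INDEX module_data_title_trgm
    ON module_data USING gin ((data->>'title') gin_trgm_ops);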
I found the following more straightforward and easier for jsonb columns:
select * from table_name
where
column_name::text like '%Something%'
I found a good article with more examples and implementations:
https://www.compose.com/articles/faster-operations-with-the-jsonb-data-type-in-postgresql/
Hope it helps!
One other option, which may be sufficient for other people who've found this page, is to just cast the column to text. E.g.
select * from module_data where data::text like '%Board%'
Note, though, that this will search over the entire JSON value and should only be used if you can guarantee the other fields won't produce false matches.
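For example, with the sample rows in the question, a substring that only occurs in the id field still matches:

select * from module_data where data::text like '%mn6o%';
-- returns the 7ee33a18-... row even though its title contains no 'mn6o'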
I think it should be:
select * from module_data where data->>'$."title"' like '%Board%'
That is the form that worked for me. (Note: the '$."title"' path form is MySQL's JSON syntax; for Postgres jsonb, use the ->>'title' form from the answers above.)
I have an SQL query, for example:
val sqlQuery = "select * from %s_table where event like '%holi%'"
val listCity = List("Bangalore", "Mumbai")
for (city <- listCity) {
  println(sqlQuery.format(city))
}
Expected output:
select * from Bangalore_table where event like '%holi%'
select * from Mumbai_table where event like '%holi%'
Actual output:
unknown format conversion exception: Conversion='%h'
Can anyone tell me how to solve this? Instead of 'holi' it could be anything; I am looking for a generic solution in Scala.
If you want the character % in a formatting string, you need to escape it by repeating it:
sqlQuery = "select * from %s_table where event like '%%holi%%'"
More generally, I would not recommend using raw SQL. Instead, use a library to access the database. I use Slick, but there are a number to choose from.
Also, having different tables named for different cities is really poor database design and will cause endless problems. Create a single table with an indexed city column and use WHERE to select one or more cities for inclusion in the query, as sketched below.
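A minimal sketch of that design (Postgres-flavored SQL; the table and column names are assumptions):

-- one table for all cities, instead of one table per city
CREATE TABLE events (
    id    SERIAL PRIMARY KEY,
    city  TEXT NOT NULL,
    event TEXT NOT NULL
);

CREATE INDEX events_city_idx ON events (city);

-- a single parameterized query replaces the per-city tables
SELECT * FROM events WHERE city = 'Bangalore' AND event LIKE '%holi%';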
I have created a table with a JSONB column called "data".
A sample value of that column is
[{field_id:1, value:10},{field_id:2, value:"some string"}]
and there are multiple rows like this.
What I want:
I want to use aggregate functions on the "data" column such that I get
the sum of all values where field_id = 1;
the average of all values where field_id = 1.
I have searched a lot on Google but have not been able to find a proper solution.
Sometimes it says "field doesn't exist" and sometimes "from clause missing".
I tried referring to it as data.value, then data -> value, and lastly data ->> value,
but nothing works.
Please let me know the solution if anyone knows it.
Thanks in advance.
Your attributes should be something like this, so you instruct it to run the function on a specific value:
attributes: [
  [sequelize.fn('sum', sequelize.literal("(data->>'value')::numeric")), 'json_sum'],
  [sequelize.fn('avg', sequelize.literal("(data->>'value')::numeric")), 'json_avg']
]
(The ::numeric cast is needed because ->> returns text, which sum() and avg() won't accept.)
Then in WHERE, you reference field_id in a similar way, using literal() (again with a cast, since ->> returns text):
where: sequelize.literal("(data->>'field_id')::int = 1")
Your example also included a string for the value of "value", which of course won't work with numeric aggregates. But if the basic Sequelize setup works on a good set of data, you can enhance the WHERE clause to test for numeric "value" data; there are good examples here: Postgres query to check a string is a number
Hopefully this gets you close. In my experience with Sequelize + Postgres, it helps to run the program in such a way that you see what queries it creates, like in a terminal where the output is streaming. On the way to a working statement, you'll either create objects which Sequelize doesn't like, or Sequelize will create bad queries which Postgres doesn't like. If the query looks close, take it into pgAdmin for further work, then try to reproduce your adjustments in Sequelize. Good luck!
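For reference, if the column really holds an array as in the question, plain Postgres SQL would need to unnest it before aggregating. A sketch, with the table name assumed:

SELECT sum((elem->>'value')::numeric) AS sum_value,
       avg((elem->>'value')::numeric) AS avg_value
FROM my_table,
     jsonb_array_elements(data) AS elem   -- one row per array element
WHERE elem->>'field_id' = '1';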
I have a query where I need to use the DISTINCT keyword. The issue is that one of the fields in the SELECT is of type MEMO (it needs to be, because of its large content).
SELECT distinct customerid, commentdate, commenttext....
is not accepted in FoxPro 9 because the commenttext field is of type Memo.
Any ideas?
You have a couple of options, depending on your needs:
1) Omit the memo field from the query.
2) Use an expression to convert the memo field to character. For example, LEFT(commenttext,254); see the sketch below.
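For example, option 2 could look like this (the comments table name is an assumption):

SELECT DISTINCT customerid, commentdate, LEFT(commenttext, 254) AS commentsnip FROM comments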
Are you really trying to apply distinct to the memo field, as well? What's your actual goal here?
Tamar
Wrap the memo field in the SELECT statement in a function such as ALLTRIM.
SELECT distinct customerid, commentdate, ALLTRIM(commenttext)....
Another option is to use something like PHDBase which is a text search indexer for Visual Foxpro. It allows character columns and memo fields to be indexed and searchable. And it's incredibly fast.
Does anyone know how to use the to_tsquery() function of PostgreSQL in SQLAlchemy? I searched a lot on Google but didn't find anything I could understand. Please help.
I am hoping it is available in the filter function, like this:
session.query(TableName).filter(Table.column_name.to_tsquery(search_string)).all()
The expected SQL for the above query is something like this:
Select column_name
from table_name t
where t.column_name @@ to_tsquery(:search_string)
The .op() method allows you to generate SQL for arbitrary operators.
from sqlalchemy import func

session.query(TableName).filter(
    Table.c.column_name.op('@@')(func.to_tsquery(search_string))
).all()
For these types of arbitrary queries, you can embed the SQL directly into your query (wrapped in text(), which recent SQLAlchemy versions require):

from sqlalchemy import text

session.query(TableName).\
    filter(text("t.column_name @@ to_tsquery(:search_string)")).\
    params(search_string=search_string).all()
You should also be able to parameterize t.column_name, but can't see the docs for that just now.
This may have been added recently but worth adding as a more standard solution.
query.filter(Model.attribute.match('your search string'))
does it for you, as it looks up the right operator for your dialect.
See the official docs:
https://docs.sqlalchemy.org/en/13/dialects/postgresql.html#full-text-search
Of course, this assumes the column you are querying is a tsvector (for example, a view built with a to_tsvector attribute) for the @@ operation to apply to.
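For reference, on the PostgreSQL dialect the match() call above compiles to roughly this (per the 1.3 docs linked; newer versions may use a different to_tsquery variant):

SELECT table_name.column_name
FROM table_name
WHERE table_name.column_name @@ to_tsquery('your search string')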
My fifty cents in 2021, following the docs
None of the previous answers mention how to cast the text column the way Postgres does: to_tsvector('english', column). I have a text column indexed as a tsvector, and this is the way:
from sqlalchemy import select, func

select(mytable.c.id).where(
    func.to_tsvector('english', mytable.c.title)
        .match('somestring', postgresql_regconfig='english')
)
In my case, I didn't want to use to_tsquery, which .match forces. A more intuitive option is websearch_to_tsquery when you have a search input like Stack Overflow's. So I made a mix from jd's response.
I finally applied it as a .filter() with the following statement:
func.to_tsvector('english', Table.column_name)\
    .op('@@')(func.websearch_to_tsquery('english', "string to search"))
I think this is the general formula and it applies to to_tsquery too.
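For reference, the filter above should compile to SQL roughly like this (a sketch reusing the assumed names from the examples):

SELECT table_name.id
FROM table_name
WHERE to_tsvector('english', table_name.column_name)
      @@ websearch_to_tsquery('english', 'string to search')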
I am working with SQLite on the iPhone platform, dealing with a database that has some strange entries. One of the columns, called type, allows multiple values (multiple types). The data in this column looks like this:
1|9|20|31|999
The table is like:
ID TYPE
------------------------
1 1|9|20|31|999
2 5|13|15|30|990
3 6|7|45|46|57
When the user wants to select the data with type 9, the query should return the first row above, because its type list contains 9. I use the following statement:
SELECT id
FROM table
WHERE type LIKE '%' || ? || '%'
The problem is that rows with type 999 but without type 9 are also selected. Likewise, if the user wants to select type 1, rows with types 11, 12, 13, etc. are also selected.
I tried to use the statement:
SELECT id
FROM table
WHERE type LIKE '[^0123456789]' || ? || '[^0123456789]'
But it doesn't select any data.
What can I do to select only the data with the correct type?
(The database schema cannot be changed because of company requirements.)
You'd need a regular expression that matches only 'yournumber|', '|yournumber|', '|yournumber', or a combination of the above. I see no way an optimizer could use an index on that kind of thing short of a specialized index that would work similarly to full-text indexes.
This is almost hopeless: unless you have a trivial amount of data (or way too many processing resources), it will not scale.
If the amount of data is small, consider just selecting LIKE '%number%' and doing some post-processing in your application code.
Otherwise, normalize your data; a sketch follows.
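A sketch of the normalized layout (names are assumptions):

-- each (row, type) pair becomes its own row
CREATE TABLE entity_type (
    entity_id INTEGER NOT NULL,
    type      INTEGER NOT NULL
);

CREATE INDEX entity_type_idx ON entity_type (type);

-- 1|9|20|31|999 for row 1 becomes five rows here, and the
-- lookup is a plain, indexable equality:
SELECT entity_id FROM entity_type WHERE type = 9;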
You could try this:
SELECT id
FROM table
WHERE '|' || type || '|' LIKE '%|' || ? || '|%'
Wrapping both the stored list and the search value in | delimiters means a match can only occur on a whole number. (Some dialects allow + for concatenation, which is easier to read, but SQLite requires the || operator; + performs numeric addition.)
It's an easy solution without a regex. Indexes don't help here (the same goes for the regex solution), so the query will be slow on huge data.
I had a similar question that I asked a while ago; also check its answer.
I'm not too familiar with SQLite, but what about:
SELECT id FROM table WHERE type REGEXP '(^9\|.*)|(.*\|9\|.*)|(.*\|9$)';
assuming that you are looking for type 9. (Note that the keyword is REGEXP, not REGEX, and that SQLite only supports it if the application registers a regexp() function.)
Then to look for type 999 it would be:
SELECT id FROM table WHERE type REGEXP '(^999\|.*)|(.*\|999\|.*)|(.*\|999$)';
Note that you might want to tweak the expression if the type field can hold a single value, because in that case there wouldn't be any | character; see the variant below.
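A version of the expression that also covers a single-value type field might be (still assuming a registered regexp() function):

SELECT id FROM table WHERE type REGEXP '(^9$)|(^9\|.*)|(.*\|9\|.*)|(.*\|9$)';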