How to use ilike in SQLAlchemy on a PostgreSQL array field?

I'm trying to match some names against the names I already have in my PostgreSQL database, using the following code:
last_like = '{last_name}%'.format(last_name=last_name)
matches = Session.query(MyTable).filter(or_(
    MyTable.name.ilike(last_like),
    MyTable.other_names.any(last_like, operator=ilike_op),
)).all()
Essentially it tries to match the name column or any of the other names stored as an array in the other_names column.
But I get:
KeyError: <function ilike_op at 0x7fb5269e2500>
What am I doing wrong?

To search a PostgreSQL array field you would normally reach for the unnest() function, but you can't use the result of unnest() in a WHERE clause. Instead, you can use the array_to_string() function: searching the joined string of other_names gives the same effect.
from sqlalchemy import or_, func as F

last_like = "%qq%"
matches = session.query(MyTable).filter(or_(
    MyTable.name.ilike(last_like),
    F.array_to_string(MyTable.other_names, ',').ilike(last_like),
)).all()
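For reference, the filter above compiles to roughly this SQL (assuming the model maps to a table named my_table; the exact column list SQLAlchemy emits will differ):
SELECT my_table.*
FROM my_table
WHERE my_table.name ILIKE '%qq%'
   OR array_to_string(my_table.other_names, ',') ILIKE '%qq%';
Note that the pattern is matched against one comma-joined string, so a pattern that should hit any element needs wildcards on both sides (as in "%qq%" above); a plain prefix pattern would effectively only match the first element.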

Related

Get unique values from PostgreSQL array

This seems like it would be straightforward to do but I just can not figure it out. I have a query that returns an ARRAY of strings in one of the columns. I want that array to only contain unique strings. Here is my query:
SELECT
    f."_id",
    ARRAY[public.getdomain(f."linkUrl"), public.getdomain(f."sourceUrl")] AS file_domains,
    public.getuniqdomains(s."originUrls", s."testUrls") AS source_domains
FROM files f
LEFT JOIN sources s ON s."_id" = f."sourceId"
Here's an example of a row from my return table
_id     | file_domains                              | source_domains
2574873 | {cityofmontclair.org,cityofmontclair.org} | {cityofmontclair.org}
I need file_domains to only contain unique values, i.e. a 'set' instead of a 'list'. Like this:
_id     | file_domains          | source_domains
2574873 | {cityofmontclair.org} | {cityofmontclair.org}
Use a CASE expression:
CASE WHEN public.getdomain(f."linkUrl") = public.getdomain(f."sourceUrl")
THEN ARRAY[public.getdomain(f."linkUrl")]
ELSE ARRAY[public.getdomain(f."linkUrl"), public.getdomain(f."sourceUrl")]
END
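Dropped into the original query, it might look like this (a sketch that simply reuses the question's tables and helper functions):
SELECT
    f."_id",
    CASE WHEN public.getdomain(f."linkUrl") = public.getdomain(f."sourceUrl")
         THEN ARRAY[public.getdomain(f."linkUrl")]
         ELSE ARRAY[public.getdomain(f."linkUrl"), public.getdomain(f."sourceUrl")]
    END AS file_domains,
    public.getuniqdomains(s."originUrls", s."testUrls") AS source_domains
FROM files f
LEFT JOIN sources s ON s."_id" = f."sourceId"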

Expressing Postgresql VALUES command in SQLAlchemy ORM?

How to express the query
VALUES ('alice'), ('bob') EXCEPT ALL SELECT name FROM users;
(i.e. "list all names in VALUES that are not in table 'users'") in SQLAlchemy ORM? In other words, what should the statement 'X' below be like?
def check_for_existence_of_all_users_in_list(list):
    logger.debug(f"checking that each user in {list} is in the database")
    query = X(list)
(There is sqlalchemy.values which could be used like this:
query = sa.values(sa.column('name', sa.String)).data(['alice', 'bob']) # .???
but it appears that it can only be used as an argument to INSERT or UPDATE.)
I am using SQLAlchemy 1.4.4.
This should work for you:
from sqlalchemy import values, column, select, String

user_names = ['alice', 'bob']
q = values(column('name', String), name="temp_names").data([(_,) for _ in user_names])
query = select(q).except_all(select(users.c.name))  # 'users' is a Table instance

Text and jsonb concatenation in a single postgresql query

How can I concatenate a string inside of a concatenated jsonb object in postgresql? In other words, I am using the JSONb concatenate operator as well as the text concatenate operator in the same query and running into trouble.
Or... if there is a totally different query I should be executing, I'd appreciate hearing suggestions. The goal is to update a row containing a jsonb column. We don't want to overwrite existing key value pairs in the jsonb column that are not provided in the query and we also want to update multiple rows at once.
My query:
update contacts as c set data = data || '{"geomatch": "MATCH","latitude":'||v.latitude||'}'
from (values (16247746,40.814140),
(16247747,20.900840),
(16247748,20.890570)) as v(contact_id,latitude) where c.contact_id = v.contact_id
The Error:
ERROR: invalid input syntax for type json
LINE 85: update contacts as c set data = data || '{"geomatch": "MATCH...
^
DETAIL: The input string ended unexpectedly.
CONTEXT: JSON data, line 1: {"geomatch": "MATCH","latitude":
SQL state: 22P02
Character: 4573
You might be looking for
SET data = data || ('{"geomatch": "MATCH","latitude":'||v.latitude||'}')::jsonb
--  jsonb column || (text concatenation)::jsonb
but that's not how one should build JSON objects - that v.latitude might not be a valid JSON literal, or even contain some injection like "", "otherKey": "oops". (Admittedly, in your example you control the values, and they're numbers so it might be fine, but it's still a bad practice). Instead, use jsonb_build_object:
SET data = data || jsonb_build_object('geomatch', 'MATCH', 'latitude', v.latitude)
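Applied to the full statement from the question, that might look like this (a sketch reusing the question's VALUES list):
update contacts as c
set data = data || jsonb_build_object('geomatch', 'MATCH', 'latitude', v.latitude)
from (values (16247746, 40.814140),
             (16247747, 20.900840),
             (16247748, 20.890570)) as v(contact_id, latitude)
where c.contact_id = v.contact_id;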
There are two problems. The first is operator precedence: data (jsonb) is concatenated with the first text fragment before the remaining pieces are joined, so PostgreSQL tries to parse an incomplete JSON string. The second is that the concatenated text needs an explicit cast to jsonb.
This should work:
update contacts as c
set data = data || ('{"geomatch": "MATCH","latitude":'||v.latitude||'}')::jsonb
from (values (16247746,40.814140),
             (16247747,20.900840),
             (16247748,20.890570)) as v(contact_id,latitude)
where c.contact_id = v.contact_id;

How to use '?' in @Query? How to use Postgres jsonb in Spring Boot at all?

How do I use the ?| operator in a Postgres query in a Spring repository? I need a WHERE clause in my query on a text column whose content is JSON.
@Query(value =
    "SELECT * \n" +
    "FROM tbl t \n" +
    "WHERE t.some_ids::::jsonb ?| array['152960','188775']", nativeQuery = true
)
List<Model> getModelsByIds();
But that doesn't work and I get the following exception:
org.springframework.dao.InvalidDataAccessApiUsageException: At least 1 parameter(s) provided but only 0 parameter(s) present in query.
You can use the associated function of that operator instead. Most of the time the obfuscation layers also choke on the :: cast operator, so you might want to use cast() instead:
WHERE pg_catalog.jsonb_exists_any(cast(t.some_ids as jsonb), array['152960','188775'])
However, I think this wouldn't be able to make use of an index defined on some_ids::jsonb.
You didn't mention what exactly the content of some_ids looks like.
If that is a JSON array (e.g. '["123", "456"]'::jsonb) then you can also use the contains operator @>:
WHERE cast(t.some_ids as jsonb) @> '["152960","188775"]'
If your JSON array contains numbers rather than strings ('[123,456]') you need to pass numbers in the argument as well:
WHERE cast(t.some_ids as jsonb) @> '[152960,188775]'
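As a quick sanity check of what each operator tests, you can run both against literal jsonb values (hypothetical values, purely illustrative):
select '["152960", "188775", "123"]'::jsonb ?| array['152960', '999'] as any_match,             -- true: '152960' is an element
       '["152960", "188775", "123"]'::jsonb @> '["152960", "188775"]'::jsonb as contains_both;  -- true: both elements present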

Build jsonb array from jsonb field

I have a column options with type jsonb, in the format {"names": ["name1", "name2"]}, which was created with
UPDATE table1 t1 SET options = (SELECT jsonb_build_object('names', names) FROM table2 t2 WHERE t2.id= t1.id)
and where names has type jsonb array.
SELECT jsonb_typeof(names) FROM table2 gives array.
Now I want to extract the value of names as a jsonb array. But the query
SELECT jsonb_build_array(options->>'names') FROM table
gave me ["[\"name1\", \"name2\"]"], while I expect ["name1", "name2"]
How can I get value in right format?
The ->> operator returns the value of the field (in your case, a JSON array) as properly escaped text. What you are looking for is the -> operator instead.
However, note that using jsonb_build_array on that will return an array containing your original array, which is probably not what you want either; simply using options -> 'names' should get you what you want.
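A minimal illustration of the difference, assuming the column lives in table1 as in the UPDATE above:
select options ->> 'names' as names_text,   -- text '["name1", "name2"]'; wrapping it in jsonb_build_array() escapes it again
       options -> 'names'  as names_jsonb   -- jsonb ["name1", "name2"], the value you want
from table1;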
Actually, you don't need the jsonb_build_array() function at all.
Using select options -> 'names' from table; will fix your issue.
jsonb_build_array() is for generating an array from jsonb values; applying it on top of ->> is the wrong approach, and that's why you are getting a string like ["[\"name1\", \"name2\"]"].
Try to execute this sample SQL script:
select j->'names'
from (
    select '{"names": ["name1", "name2"]}'::JSONB as j
) as a;