Call jsonb_contains function (postgres) using JPA criteria and JsonBinaryType - postgresql

I have a JPA/Hibernate entity which has a JSONB column (using https://github.com/vladmihalcea/hibernate-types ) for storing a list of strings. This works fine so far.
@TypeDef(name = "jsonb", typeClass = JsonBinaryType.class)
@Type(type = "jsonb")
@Column(name = "TAGS", columnDefinition = "jsonb")
private List<String> tags;
Now I want to check if another string is contained in the list of strings.
I can do this by writing a native query and using the @> operator from Postgres. For other reasons (the query is more complex), I do not want to go in that direction. My current approach is calling the jsonb_contains function in a Spring Data specification (since the operator is just an alias for this function), e.g. jsonb_contains('["tag1", "tag2", "tag3"]', '["tag1"]'). What I am struggling with is getting the second parameter right.
My initial approach is to also use a List of Strings.
public static Specification<MyEntity> hasTag(String tag) {
    return (root, query, cb) -> {
        if (StringUtils.isEmpty(tag)) {
            return cb.conjunction();
        }
        Expression<Boolean> expression = cb.function("jsonb_contains",
                Boolean.class,
                root.get("tags"),
                cb.literal(List.of(tag)));
        return cb.isTrue(expression);
    };
}
This results in the following error.
Caused by: org.postgresql.util.PSQLException: ERROR: function jsonb_contains(jsonb, character varying) does not exist
Hint: No function matches the given name and argument types. You might need to add explicit type casts.
Position: 375
Hibernate does know that root.get("tags") is mapped to JSONB, but for the second parameter it does not. How can I get this right? Is this actually possible?

The jsonb_contains(jsonb, jsonb) parameters must both be of type jsonb.
You cannot pass a Java String as a parameter to the function.
You cannot do casting in PostgreSQL via the JPA Criteria API.
Using JSONObject or similar does not help, because PostgreSQL sees it as a bytea type.
There are two possible solutions:
Solution 1
Create a jsonb value with the jsonb_build_object(text[]) function and pass it to jsonb_contains(jsonb, jsonb):
public static Specification<MyEntity> hasTag(String tag) {
    return (root, query, cb) -> {
        // list of key/value pairs: [key1, value1, key2, value2, ...]
        List<Object> tags = List.of(tag);
        // build a jsonb value from the list
        Expression<?> jsonb = cb.function(
                "jsonb_build_object",
                Object.class,
                cb.literal(tags)
        );
        Expression<Boolean> expression = cb.function(
                "jsonb_contains",
                Boolean.class,
                root.get("tags"),
                jsonb
        );
        return cb.isTrue(expression);
    };
}
Solution 2
Create a custom function in your PostgreSQL database and use it from Java:
SQL:
CREATE FUNCTION jsonb_contains_as_text(a jsonb, b text)
RETURNS BOOLEAN AS $$
SELECT CASE
WHEN a @> b::jsonb THEN TRUE
ELSE FALSE
END;$$
LANGUAGE SQL IMMUTABLE STRICT;
Java Code:
public static Specification<MyEntity> hasTag(String tag) {
    return (root, query, cb) -> {
        Expression<Boolean> expression = cb.function(
                "jsonb_contains_as_text",
                Boolean.class,
                root.get("tags"),
                cb.literal(tag)
        );
        return cb.isTrue(expression);
    };
}
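For reference, here is a minimal sketch of how such a specification could be used from a Spring Data repository. The repository and holder class names (MyEntityRepository, MyEntitySpecifications) are hypothetical, and the Long ID type is an assumption:
// Hypothetical repository; JpaSpecificationExecutor enables Specification-based queries.
public interface MyEntityRepository
        extends JpaRepository<MyEntity, Long>, JpaSpecificationExecutor<MyEntity> {
}

// In a service: fetch all entities whose TAGS column contains "tag1".
List<MyEntity> tagged = myEntityRepository.findAll(MyEntitySpecifications.hasTag("tag1"));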

I think the reason is that you pass a varchar as the second parameter, while jsonb_contains() requires two jsonb parameters.
To check whether a jsonb array contains all/any values from a string array, you need different operators: ?& or ?|.
The function bindings for them in PostgreSQL 9.4 are jsonb_exists_all and jsonb_exists_any, respectively.
In your PostgreSQL version, you can check this with the following command:
select * from pg_operator where oprname = '?&'
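For completeness, a rough sketch of how jsonb_exists_any (the function behind ?|) might be wired into a specification like the one in the question. This is untested; in particular, how the text[] second argument gets bound depends on your Hibernate and driver setup:
public static Specification<MyEntity> hasAnyTag(String tag) {
    return (root, query, cb) -> {
        // jsonb_exists_any(jsonb, text[]) checks whether any of the given strings
        // exist as top-level elements of the jsonb array.
        // NOTE: binding the String[] literal as a Postgres text[] is an assumption
        // and may require an explicit array type or a small helper function.
        Expression<Boolean> expression = cb.function(
                "jsonb_exists_any",
                Boolean.class,
                root.get("tags"),
                cb.literal(new String[] { tag })
        );
        return cb.isTrue(expression);
    };
}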

Related

JPA Criteria on PostgreSQL boolean type creates invalid query

I have a regular JPA Entity which has a boolean field:
@Entity(name = "UserAccount")
public class UserAccount {
    ...
    @Column(name = "isActive")
    private boolean isActive = false;
    ...
}
This is backed by a table in PostgreSQL 15. In the table definition, the field is of type BOOLEAN (not integer!):
CREATE TABLE UserAccount(
...
isActive BOOLEAN NOT NULL,
...
)
When I attempt to use the JPA Criteria API with it:
criteriaBuilder.isTrue(root.get("isActive"))
... then it will be rendered as = 1 in the SQL query which is sent to PostgreSQL:
SELECT ...
FROM UserAccount
WHERE isActive = 1
PostgreSQL rejects this query with the following error (which is pretty clear):
PSQLException: ERROR: operator does not exist: boolean = integer
Hint: No operator matches the given name and argument types. You might need to add explicit type casts.
My question is: how can I tell Hibernate that I'm using an actual PostgreSQL-native BOOLEAN type in my table instead of encoding the boolean as a number?
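One workaround that often sidesteps the = 1 rendering is to compare against a typed Boolean parameter instead of calling isTrue() on the path, so the driver binds a real boolean value. This is only a sketch under that assumption (root and criteriaQuery are the usual Criteria objects in scope), not a confirmed fix for the mapping itself:
// Bind a Java Boolean so the driver sends a boolean parameter rather than
// Hibernate rendering a numeric literal. The underlying mapping issue may
// alternatively be addressed via dialect or type configuration.
Predicate activeOnly = criteriaBuilder.equal(root.get("isActive"), Boolean.TRUE);
criteriaQuery.where(activeOnly);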

How to use a list as a parameter source for SQL queries with Vertx JDBC Client?

I have a Vert.x web application that needs to query an AWS RDS instance running Postgres 10.7. The Vert.x JDBC client is io.vertx:vertx-jdbc-client:3.8.4. I want to query a table with the constraint that a certain column's value is included in a set of values:
select * from table where column in/any (?)
I followed the Vert.x documentation, which says to create a JsonArray and populate it with the values to inject into the query. The column is of type text and the list that I want to match on is a Java ArrayList<String>. My query code looks like:
String sql = "SELECT a FROM table WHERE col IN (?)";
List<String> values = someObject.someField();
sqlClient.getConnection(connectionResult -> {
    if (connectionResult.failed()) {
        // handle
    } else {
        SQLConnection connection = connectionResult.result();
        JsonArray params = new JsonArray()
                .add(values);
        connection.queryWithParams(sql, params, queryResult -> {
            if (queryResult.failed()) {
                // handle
            } else {
                // parse
            }
        });
    }
});
The query fails with the error: org.postgresql.util.PSQLException: Can't infer the SQL type to use for an instance of io.vertx.core.json.JsonArray. Use setObject() with an explicit Types value to specify the type to use.
I know that in the worst case, I can create a literal SQL string where col in (?, ?, ?, ..., ?) and add each String from the list to a JsonArray, but there must be a way to just add the ArrayList<String> as a parameter and keep the query simple. How can I specify a list of values to match against in my query?
The Vert.x JDBC Client does not support array parameters in queries.
However it is possible with the Vert.x Pg Client, which does not depend on JDBC. You need to modify your query first:
SELECT a FROM table WHERE col = ANY(?)
Then:
pgClient.preparedQuery(query, Tuple.of(possibleValues), collector, handler);
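A slightly fuller sketch, following the 3.x-style API used above (in Vert.x 4 this becomes preparedQuery(sql).execute(tuple, handler)). The conversion of the list to a Java array and the $1 positional placeholder are assumptions about how the reactive Pg Client binds array parameters:
// Convert the List<String> to a Java array so it can bind as a Postgres text[].
String[] colValues = values.toArray(new String[0]);
String sql = "SELECT a FROM table WHERE col = ANY($1)";
pgClient.preparedQuery(sql, Tuple.of((Object) colValues), ar -> {
    if (ar.succeeded()) {
        ar.result().forEach(row -> System.out.println(row.getString("a")));
    } else {
        ar.cause().printStackTrace();
    }
});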

Default return value for JPA query

I have a JPA query
@Query(value = "SELECT SUM(total_price) FROM ... WHERE ...", nativeQuery = true)
which works as expected when there are matching records. But when there are no matching records, the query returns null.
How can I return zero(0) instead of null when no records are found?
You can change the return type to be an Optional:
@Query(value = "SELECT SUM(total_price) FROM ... WHERE ...", nativeQuery = true)
Optional<Integer> getSum(...);
Or you can wrap getSum() with a default method:
@Query(..)
Integer getSum(...);
default Integer safeGetSum(..) {
    return Optional.ofNullable(getSum(..)).orElse(0);
}
More info on null handling in repositories
When the return value is not a list or some wrapper (plus some other cases, see below) and there are no matching records, the return will be null, so there is no slick way to handle this with some defaultValue=0 through @Query.
The absence of a query result is then indicated by returning null. Repository methods returning collections, collection alternatives, wrappers, and streams are guaranteed never to return null but rather the corresponding empty representation.
A native SQL option is COALESCE, which returns the first non-null expression in the argument list:
SELECT COALESCE(SUM(total_price),0) FROM ...
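Putting the two together, a sketch of a repository method with COALESCE baked into the native query; the table, column, and method names here are purely illustrative:
// Illustrative only: "orders", "customer_id" and the method name are placeholders.
@Query(value = "SELECT COALESCE(SUM(total_price), 0) FROM orders WHERE customer_id = :customerId",
       nativeQuery = true)
Integer getTotalPrice(@Param("customerId") Long customerId);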

JOOQ PostgreSQL access a single element of an array

To get a single element in a PostgreSQL query, we can do something like:
SELECT pay_by_quarter[3] FROM sal_emp;
or in case of a condition,
SELECT * FROM sal_emp WHERE pay_by_quarter[1] = 10000
How can I achieve the same thing in jOOQ, if possible?
Thanks!
There's a pending feature request to allow for referencing array elements by index through jOOQ API: https://github.com/jOOQ/jOOQ/issues/229
The workaround is to use plain SQL templating:
public static <T> Field<T> get(Field<T[]> field, int index) {
    // Plain SQL template: {0} is the array field, {1} is the (1-based) index.
    return (Field<T>) DSL.field(
        "{0}[{1}]",
        field.getDataType().getType().getComponentType(),
        field,
        DSL.val(index)
    );
}
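Usage would then look roughly like this, assuming a DSLContext named ctx and generated jOOQ classes for the sal_emp table (the names are illustrative):
// Equivalent of: SELECT * FROM sal_emp WHERE pay_by_quarter[1] = 10000
ctx.selectFrom(SAL_EMP)
   .where(get(SAL_EMP.PAY_BY_QUARTER, 1).eq(10000))
   .fetch();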

How to query a JSON element

Let's say I have a Postgres database (9.3) and there is a table called Resources. In the Resources table I have the fields id, which is an int, and data, which is of JSON type.
Let's say I have the following records in said table.
1, {'firstname':'Dave', 'lastname':'Gallant'}
2, {'firstname':'John', 'lastname':'Doe'}
What I want to do is write a query that would return all the records in which the data column has a json element with the lastname equal to "Doe"
I tried to write something like this:
records = db_session.query(Resource).filter(Resources.data->>'lastname' == "Doe").all()
PyCharm, however, is giving me a compile error on the "->>".
Does anyone know how I would write the filter clause to do what I need?
Try using astext
records = db_session.query(Resource).filter(
Resources.data["lastname"].astext == "Doe"
).all()
Please note that the column MUST be of type JSONB. A regular JSON column will not work.
Also, you could explicitly cast the string to JSON (see the Postgres JSON type doc):
from sqlalchemy.dialects.postgresql import JSON
from sqlalchemy.sql.expression import cast
db_session.query(Resource).filter(
Resources.data["lastname"] == cast("Doe", JSON)
).all()
If you are using the JSON type (not JSONB), the following worked for me.
Note the '"object"':
query = db.session.query(ProductSchema).filter(
cast(ProductSchema.ProductJSON["type"], db.String) != '"object"'
)
I have some GeoJSON in a JSON (not JSONB) type column and none of the existing solutions worked, but as it turns out, in version 1.3.11 some new data casters were added, so now you can:
records = db_session.query(Resource).filter(Resources.data["lastname"].as_string() == "Doe").all()
Reference: https://docs.sqlalchemy.org/en/14/core/type_basics.html#sqlalchemy.types.JSON
Casting JSON Elements to Other Types
Index operations, i.e. those invoked by calling upon the expression
using the Python bracket operator as in some_column['some key'],
return an expression object whose type defaults to JSON by default, so
that further JSON-oriented instructions may be called upon the result
type. However, it is likely more common that an index operation is
expected to return a specific scalar element, such as a string or
integer. In order to provide access to these elements in a
backend-agnostic way, a series of data casters are provided:
Comparator.as_string() - return the element as a string
Comparator.as_boolean() - return the element as a boolean
Comparator.as_float() - return the element as a float
Comparator.as_integer() - return the element as an integer
These data casters are implemented by supporting dialects in order to
assure that comparisons to the above types will work as expected, such
as:
# integer comparison
data_table.c.data["some_integer_key"].as_integer() == 5
# boolean comparison
data_table.c.data["some_boolean"].as_boolean() == True
According to sqlalchemy.types.JSON, you can do it like this:
from sqlalchemy import JSON
from sqlalchemy import cast
records = db_session.query(Resource).filter(Resources.data["lastname"] == cast("Doe", JSON)).all()
According to this, prior to version 1.3.11 the most robust way should be the following, as it works for multiple database types, e.g. SQLite, MySQL, Postgres:
from sqlalchemy import cast, JSON, type_coerce, String
db_session.query(Resource).filter(
cast(Resources.data["lastname"], String) == type_coerce("Doe", JSON)
).all()
From version 1.3.11 onward, type-specific casters are the new and neater way to handle this:
db_session.query(Resource).filter(
Resources.data["lastname"].as_string() == "Doe"
).all()