Using an integer array in Postgres with Spring Boot - postgresql

I am attempting to accept a List from the browser and use it within a SQL query against a Postgres database. The following code snippet shows the function I have made to do this. Some of the variable names have been changed, in case there appear to be discrepancies.
static public List<Map<String, Object>> fetch(NamedParameterJdbcTemplate jdbcTemplate, List<Integer> id) {
    List<Map<String, Object>> result = new ArrayList<>();
    String sql = "select * from lookup where id && ARRAY[ :ids ]";
    MapSqlParameterSource parameters = new MapSqlParameterSource();
    parameters.addValue("ids", id, Types.INTEGER);
    result = jdbcTemplate.query(sql,
            parameters,
            new RowMapper<Map<String, Object>>() { ...
            });
    return result;
}
The lookup table's id field is a Postgres array, hence my need to use && and the ARRAY constructor.
This function is called by many different endpoints, which pass in the NamedParameterJdbcTemplate as well as a list of Integers. The problem I am having is that if any integer in the list is < 100, I get the following message:
Bad value for type int : {20}
Is there another way of doing this, or a way around this error?
EDIT:
It turns out it was partly the problem described in the answer, but also that I was using
rs.getInt(col)
instead of
rs.getArray(col)
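For reference, a minimal RowMapper sketch of that change, assuming the array column is named id and the driver returns Integer[] for an int[] column (the column name and map key here are illustrative):
RowMapper<Map<String, Object>> mapper = new RowMapper<Map<String, Object>>() {
    @Override
    public Map<String, Object> mapRow(ResultSet rs, int rowNum) throws SQLException {
        Map<String, Object> row = new HashMap<>();
        // id is a Postgres int[] column, so it has to be read with getArray, not getInt;
        // the element type of the returned array depends on the driver and column type
        Integer[] ids = (Integer[]) rs.getArray("id").getArray();
        row.put("id", ids);
        return row;
    }
};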

There's an error I can see in the SQL, and probably the wrong choice of API after that. First in the query:
select * from lookup where id && ARRAY[ :ids ]
To bind an array parameter, it must not be placed inside the ARRAY constructor; instead you need to bind it as a plain JDBC parameter, like this:
select * from lookup where id && ?
As you've noticed, I'm not using a named parameter in these examples, because NamedParameterJdbcTemplate does not provide a route to the underlying java.sql.Connection object or a proxy to it. You can access the Connection through a PreparedStatementSetter if you use the JdbcOperations interface instead.
public static List<Map<String, Object>> fetch(NamedParameterJdbcTemplate jdbcTemplate, List<Integer> idlist) {
    List<Map<String, Object>> result = new ArrayList<>();
    String sql = "select * from lookup where id && ?";
    final Integer[] ids = idlist.toArray(new Integer[0]);
    PreparedStatementSetter parameters = new PreparedStatementSetter() {
        @Override
        public void setValues(PreparedStatement stmt) throws SQLException {
            Connection conn = stmt.getConnection();
            // creating the java.sql.Array can only be done through the Connection
            java.sql.Array arr = conn.createArrayOf("integer", ids);
            // you can use setObject(1, arr, java.sql.Types.ARRAY) instead of setArray
            // in case the connection wrapper doesn't pass it on to the JDBC driver
            stmt.setArray(1, arr);
        }
    };
    JdbcOperations jdo = jdbcTemplate.getJdbcOperations();
    result = jdo.query(sql,
            parameters,
            new RowMapper<Map<String, Object>>() { ...
            });
    return result;
}
There might be errors in the code, since I normally use a different set of APIs. Note that setValues is declared to throw java.sql.SQLException, so the checked exceptions from createArrayOf and setArray can simply propagate. You should be able to handle it from here on.
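For what it's worth, on Java 8+ the same binding can be written with a lambda, since PreparedStatementSetter has a single setValues method. This is only a sketch of the variant: ids is the same array built above, and rowMapper stands for the RowMapper implementation elided in the code:
result = jdbcTemplate.getJdbcOperations().query(
        sql,
        stmt -> {
            // createArrayOf needs the live Connection; "integer" is the Postgres element type name
            java.sql.Array arr = stmt.getConnection().createArrayOf("integer", ids);
            stmt.setArray(1, arr);
        },
        rowMapper);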

Related

How to set values in ItemPreparedStatementSetter for one to many mapping

I am trying to use JdbcBatchItemWriter for a domain object RemittanceClaimVO. RemittanceClaimVO has a List of another domain object, ClaimVO.
public class RemittanceClaimVO {
    private long remitId;
    private List<ClaimVO> claims = new ArrayList<ClaimVO>();
    // setters and getters
}
So for each remit id there would be multiple claims, and I wish to use a single batch statement to insert all rows.
With plain JDBC, I used to write this object by adding values in batches, like below:
List<ClaimVO> claims = remittanceClaimVO.getClaims();
if (claims != null && !claims.isEmpty()) {
    for (ClaimVO claim : claims) {
        int counter = 1;
        stmt.setLong(counter++, remittanceClaimVO.getRemitId());
        stmt.setLong(counter++, claim.getClaimId());
        stmt.addBatch();
    }
}
stmt.executeBatch();
I am not sure how to achieve the same in Spring Batch using ItemPreparedStatementSetter.
I have tried a similar loop in the setValues method, but the values are not getting set.
@Override
public void setValues(RemittanceClaimVO remittanceClaimVO, PreparedStatement ps) throws SQLException {
    List<ClaimVO> claims = remittanceClaimVO.getClaims();
    for (ClaimVO claim : claims) {
        int counter = 1;
        ps.setLong(counter++, remittanceClaimVO.getRemitId());
        ps.setLong(counter++, claim.getClaimId());
    }
}
This seems to be another related question.
Please suggest.
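One possible direction, sketched here purely as an illustration rather than something from the original post: skip ItemPreparedStatementSetter and use a custom ItemWriter that flattens each RemittanceClaimVO into one row per claim and issues a single JDBC batch through JdbcTemplate.batchUpdate. This assumes the pre-5.0 ItemWriter signature that takes a List, and the table and column names are made up:
import java.util.ArrayList;
import java.util.List;
import org.springframework.batch.item.ItemWriter;
import org.springframework.jdbc.core.JdbcTemplate;

public class RemittanceClaimItemWriter implements ItemWriter<RemittanceClaimVO> {

    private final JdbcTemplate jdbcTemplate;

    public RemittanceClaimItemWriter(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    @Override
    public void write(List<? extends RemittanceClaimVO> items) {
        // flatten the one-to-many structure: one row per claim
        List<Object[]> rows = new ArrayList<>();
        for (RemittanceClaimVO remit : items) {
            for (ClaimVO claim : remit.getClaims()) {
                rows.add(new Object[] { remit.getRemitId(), claim.getClaimId() });
            }
        }
        // single JDBC batch for the whole chunk; table and column names are hypothetical
        jdbcTemplate.batchUpdate("insert into remit_claim (remit_id, claim_id) values (?, ?)", rows);
    }
}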

MyBatis - ResultMap according to javaType

Hello StackOverflowers,
There is something I don't get about MyBatis resultMap.
The model I'm working on is being updated. We decided to create a new graph of objects which reflects our future DB schema (the current one is awful).
To sum up our problem, here is a simple case:
The current object related to the SITE table is org.example.model.SiteModel. We created a new object called org.example.entity.Site (the package name is temporary).
The goal is now to reuse the existing SQL requests developed with MyBatis and add a new ResultMap linked to the return type of our method.
Here is an example:
/**
 * Get all sites defined as templates.
 */
@Select("SELECT * FROM SITE WHERE ISTEMPLATE = 'True'")
@ResultMap({"siteResMap", "siteResultMap"})
@Options(statementType = StatementType.CALLABLE)
<T> List<T> findTemplates();
Then, in an XML configuration file, we defined the following mappings:
...
<resultMap id="siteResMap" type="org.example.entity.Site" />
<resultMap id="siteResultMap" type="org.example.model.SiteModel" />
...
And then we call the method from our DAO:
List<Site> site = siteDao.findTemplates();
List<SiteModel> siteMod = siteDao.findTemplates();
What we are expecting is that MyBatis dynamically picks the right ResultMap according to the computed return type.
But both lists are shown as List<org.example.entity.Site> in the debugger.
This makes me think that the first ResultMap is used and the second one is ignored.
Am I missing something? Is there a way to make MyBatis behave this way?
Regards
After a lot of research and code exploration, we found out that the String[] in @ResultMap is not designed to link Java return types to result maps.
This is the function that retrieves the result map (from org.apache.ibatis.executor.resultset.DefaultResultSetHandler):
public List<Object> handleResultSets(Statement stmt) throws SQLException {
    ErrorContext.instance().activity("handling results").object(mappedStatement.getId());
    final List<Object> multipleResults = new ArrayList<Object>();
    int resultSetCount = 0;
    ResultSetWrapper rsw = getFirstResultSet(stmt);
    List<ResultMap> resultMaps = mappedStatement.getResultMaps();
    int resultMapCount = resultMaps.size();
    validateResultMapsCount(rsw, resultMapCount);
    while (rsw != null && resultMapCount > resultSetCount) {
        ResultMap resultMap = resultMaps.get(resultSetCount);
        handleResultSet(rsw, resultMap, multipleResults, null);
        rsw = getNextResultSet(stmt);
        cleanUpAfterHandlingResultSet();
        resultSetCount++;
    }
    String[] resultSets = mappedStatement.getResulSets();
    if (resultSets != null) {
        while (rsw != null && resultSetCount < resultSets.length) {
            ResultMapping parentMapping = nextResultMaps.get(resultSets[resultSetCount]);
            if (parentMapping != null) {
                String nestedResultMapId = parentMapping.getNestedResultMapId();
                ResultMap resultMap = configuration.getResultMap(nestedResultMapId);
                handleResultSet(rsw, resultMap, null, parentMapping);
            }
            rsw = getNextResultSet(stmt);
            cleanUpAfterHandlingResultSet();
            resultSetCount++;
        }
    }
    return collapseSingleResultList(multipleResults);
}
This explains why we always got a List whose elements have the type of the first resultMap.
We created a new DAO to map the new object types.
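A sketch of that split, with one mapper interface per return type and a single result map each (the interface names are illustrative, not from the original code):
import java.util.List;
import org.apache.ibatis.annotations.ResultMap;
import org.apache.ibatis.annotations.Select;

interface SiteDao {
    @Select("SELECT * FROM SITE WHERE ISTEMPLATE = 'True'")
    @ResultMap("siteResMap")    // maps rows to org.example.entity.Site
    List<org.example.entity.Site> findTemplates();
}

interface SiteModelDao {
    @Select("SELECT * FROM SITE WHERE ISTEMPLATE = 'True'")
    @ResultMap("siteResultMap") // maps rows to org.example.model.SiteModel
    List<org.example.model.SiteModel> findTemplates();
}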

Ormlite and PostgreSQL - Error inserting text array with custom persister

I have been working on setting up ORMLite as the primary data access layer between a PostgreSQL database and a Java application. Everything has been fairly straightforward until I started messing with PostgreSQL's array types. In my case, I have two tables that make use of the text[] array type. Following the documentation, I created a custom data persister as below:
public class StringArrayPersister extends StringType {

    private static final StringArrayPersister singleTon = new StringArrayPersister();

    private StringArrayPersister() {
        super(SqlType.STRING, new Class<?>[] { String[].class });
    }

    public static StringArrayPersister getSingleton() {
        return singleTon;
    }

    @Override
    public Object javaToSqlArg(FieldType fieldType, Object javaObject) {
        String[] array = (String[]) javaObject;
        if (array == null) {
            return null;
        } else {
            String join = "";
            for (String str : array) {
                join += str + ",";
            }
            return "'{" + join.substring(0, join.length() - 1) + "}'";
        }
    }

    @Override
    public Object sqlArgToJava(FieldType fieldType, Object sqlArg, int columnPos) {
        String string = (String) sqlArg;
        if (string == null) {
            return null;
        } else {
            return string.replaceAll("[{}]", "").split(",");
        }
    }
}
And then in my business object implementation, I set up the persister class on the column like so:
@DatabaseField(columnName = TAGS_FIELD, persisterClass = StringArrayPersister.class)
private String[] tags;
Whenever I try inserting a new record with the Dao.create statement, I get an error message saying tags is of type text[], but got character varying... However, when querying existing records from the database, the business object (and text array) load just fine.
Any ideas?
UPDATE:
PostgreSQL 9.2. The exact error message:
Caused by: org.postgresql.util.PSQLException: ERROR: column "tags" is of type text[] but expression is of type character varying
Hint: You will need to rewrite or cast the expression.
I've not used ORMLite before (I generally use MyBatis); however, I believe the proximate issue is this code:
private StringArrayPersister() {
    super(SqlType.STRING, new Class<?>[] { String[].class });
}
SqlType.STRING is mapped to VARCHAR in the ORMLite code, and is therefore, I believe, the proximate cause of the error you're getting. See the ORMLite SQL Data Types documentation for more detail on that.
Try changing it to this:
private StringArrayPersister() {
    super(SqlType.OTHER, new Class<?>[] { String[].class });
}
There may be other tweaks necessary as well to get it fully up and running, but that should get you past this particular error with the varchar type mismatch.
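For reference, the kind of call that was hitting the error looks roughly like this, assuming the tags field above lives in a business object called TaggedItem with a matching setter and a Dao<TaggedItem, Long> named taggedItemDao (both names are hypothetical):
TaggedItem item = new TaggedItem();
// the tags field uses the StringArrayPersister above
item.setTags(new String[] { "postgres", "ormlite" });

// with SqlType.STRING the value is sent as character varying, which PostgreSQL
// refuses to assign to the text[] column; the question reports the error on this create
taggedItemDao.create(item);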

QueryBuilder: get parameters for Dao.queryRaw

I'm using QueryBuilder to create a raw query, but I need to fill in the parameters for the raw query manually.
The properties 'from' and 'to' are filled in twice: once in the 'where' section of the QueryBuilder, and once as parameters to the queryRaw method.
The method StatementBuilder.prepareStatementString() returns the query string with "?" placeholders for substitution.
Is there any way to get these parameters directly from the QueryBuilder instance?
For example, imagine a new method in ORMLite: StatementBuilder.getPreparedStatementParameters().
QueryBuilder<AccountableItemEntity, Long> accountableItemQb = accountableItemDao.queryBuilder();
QueryBuilder<AccountingEntryEntity, Long> accountingEntryQb = accountingEntryDao.queryBuilder();
accountingEntryQb.where().eq(
        AccountingEntryEntity.ACCOUNTING_ENTRY_STATE_FIELD_NAME,
        AccountingEntryStateEnum.CREATED);
accountingEntryQb.join(accountableItemQb);

QueryBuilder<AccountingTransactionEntity, Long> accountingTransactionQb =
        accountingTransactionDao.queryBuilder();
accountingTransactionQb.selectRaw("ACCOUNTINGENTRYENTITY.TITLE, " +
        "ACCOUNTINGENTRYENTITY.ACCOUNTABLE_ITEM_ID, " +
        "SUM(ACCOUNTINGENTRYENTITY.COUNT), " +
        "SUM(ACCOUNTINGENTRYENTITY.COUNT * CONVERT(ACCOUNTINGENTRYENTITY.PRICEAMOUNT,DECIMAL(20, 2)))");
accountingTransactionQb.join(accountingEntryQb);
accountingTransactionQb.where().eq(
        AccountingTransactionEntity.ACCOUNTING_TRANSACTION_STATE_FIELD_NAME,
        AccountingTransactionStateEnum.PRINTED)
        .and().between(AccountingTransactionEntity.CREATE_TIME_FIELD_NAME, from, to);
accountingTransactionQb.groupByRaw(
        "ACCOUNTINGENTRYENTITY.ACCOUNTABLE_ITEM_ID, ACCOUNTINGENTRYENTITY.TITLE");

String query = accountingTransactionQb.prepareStatementString();
accountingTransactionQb.prepare().getStatement();
Timestamp fromTimestamp = new Timestamp(from.getTime());
Timestamp toTimestamp = new Timestamp(to.getTime());
// TODO: get parameters from accountingTransactionQb
GenericRawResults<Object[]> genericRawResults =
        accountingEntryDao.queryRaw(query, new DataType[] { DataType.STRING,
                DataType.LONG, DataType.LONG, DataType.BIG_DECIMAL },
                fromTimestamp.toString(), toTimestamp.toString());
Is there any way to get these parameters directly from the QueryBuilder instance?
Yes, there is a way. You need to subclass QueryBuilder, and then you can use the appendStatementString(...) method. You provide the argList, which can then be used to get the list of arguments.
protected void appendStatementString(StringBuilder sb,
        List<ArgumentHolder> argList) throws SQLException {
    appendStatementStart(sb, argList);
    appendWhereStatement(sb, argList, true);
    appendStatementEnd(sb, argList);
}
For example, imagine a new method in ORMLite: StatementBuilder.getPreparedStatementParameters().
Good idea. I've made the following changes to the GitHub repo.
public StatementInfo prepareStatementInfo() throws SQLException {
    List<ArgumentHolder> argList = new ArrayList<ArgumentHolder>();
    String statement = buildStatementString(argList);
    return new StatementInfo(statement, argList);
}
...
public static class StatementInfo {
    private final String statement;
    private final List<ArgumentHolder> argList;
    ...
The feature will be in version 4.46. You can build a release from the current trunk if you don't want to wait for that release.
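Once on 4.46+, usage would look roughly like the sketch below, inside a method that declares throws SQLException. This assumes StatementInfo exposes getStatement() and getArgList() getters for the two fields shown above, and that ArgumentHolder.getSqlArgValue() returns the bound value:
// build the statement and collect its arguments in one go
StatementBuilder.StatementInfo info = accountingTransactionQb.prepareStatementInfo();
String query = info.getStatement();
List<ArgumentHolder> args = info.getArgList();

// the same values that would be bound to the '?' placeholders, in order
String[] rawArgs = new String[args.size()];
for (int i = 0; i < args.size(); i++) {
    rawArgs[i] = args.get(i).getSqlArgValue().toString();
}

GenericRawResults<Object[]> genericRawResults =
        accountingEntryDao.queryRaw(query,
                new DataType[] { DataType.STRING, DataType.LONG, DataType.LONG, DataType.BIG_DECIMAL },
                rawArgs);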

Using function result inside expression function used by a predicate

I am trying to use PredicateBuilder with the following expression definition, but I always get the message:
"LINQ to Entities does not recognize the method 'puedeConsultar' method, and this method cannot be translated into a store expression."
I think I more or less understand the problem, but I don't know how to solve it.
private static readonly IDictionary<int, List<string>> permisosAccesoSolicitudesEstado =
    new Dictionary<int, List<string>>() { { 0, new List<string>() { "A" } }, { 1, new List<string>() { "B" } } };

private static bool esPermisoConcedido(List<string> usuariosPermitidos, string perfilUsuario)
{
    return usuariosPermitidos.Any(x => x.Equals(perfilUsuario) || perfilUsuario.StartsWith(x + "|") || perfilUsuario.EndsWith("|" + x));
}

public static bool puedeConsultar(int estadoActual, string perfilUsuario)
{
    List<string> usuariosPermitidos = permisosAccesoSolicitudesEstado[estadoActual];
    return esPermisoConcedido(usuariosPermitidos, perfilUsuario);
}

public static bool puedeConsultar(string estadoActual, string tipoUsuario)
{
    return puedeConsultar(Convert.ToInt32(estadoActual), tipoUsuario);
}

public Expression<Func<Solicitud, Boolean>> predicadoEstadoCorrectoSolicitud(string perfil)
{
    return x => EstadosSolicitud.puedeConsultar(x.estado, perfil);
}

// Instantiated by reflection, this works fine
MethodInfo method = .....
Expression<Func<T, bool>> resultado = ConstructorPredicados.True<T>();
resultado = ConstructorPredicados.And(resultado, method);
objectSet.Where(resultado).ToList();
Note:
ConstructorPredicados is based on Monty's Gush's "A universal PredicateBuilder" at http://petemontgomery.wordpress.com/2011/02/10/a-universal-predicatebuilder/
Thanks in advance
You cannot do that. Your puedeConsultar is a .NET function, and you cannot execute .NET functions in a LINQ to Entities query. When you use a method in LINQ to Entities, you can only use methods which have a direct mapping to SQL; a method in the query is only a placeholder that is translated into the execution of some SQL function. There is a set of predefined method mappings called canonical functions, and you can map your own SQL function when using EDMX, but in your case you will most probably have to first load the data into the application by using ToList and then execute predicadoEstadoCorrectoSolicitud on the materialized result.