Spring JDBC - Last inserted id - postgresql

I'm using Spring JDBC. Is there a simple way to get the last inserted ID using the Spring Framework, or do I need to use some JDBC tricks?
jdbcTemplate.update("insert into test (name) values(?)", params, types);
// last inserted id ...
I found something like the code below, but I get: org.postgresql.util.PSQLException: Returning autogenerated keys is not supported.
KeyHolder keyHolder = new GeneratedKeyHolder();
jdbcTemplate.update(new PreparedStatementCreator() {
    @Override
    public PreparedStatement createPreparedStatement(Connection connection) throws SQLException {
        PreparedStatement ps = connection.prepareStatement(
                "insert into test (name) values(?)", new String[] {"id"});
        ps.setString(1, "test");
        return ps;
    }
}, keyHolder);
lastId = keyHolder.getKey().longValue();

The old/standard way is to call currval() after the insert (ref). Simple and secure.

Support for "generated keys for PreparedStatements" was only added in PostgreSQL JDBC driver version 8.4-701.
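A minimal sketch of the currval() approach with JdbcTemplate, assuming the default serial sequence name test_id_seq (adjust to your schema); currval() is session-scoped, so the insert and the select must run on the same connection, e.g. inside one transaction:
jdbcTemplate.update("insert into test (name) values(?)", params, types);
// must run in the same session/transaction as the insert, otherwise currval() fails
Long lastId = jdbcTemplate.queryForObject("select currval('test_id_seq')", Long.class);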

Spring JPA native query to '@IdClass' annotated table and getting "No Dialect mapping for JDBC type: 1111" [duplicate]

I'm working on a Spring JPA application, using MySQL as the database. I ensured that all spring-jpa libraries, Hibernate, and mysql-connector-java are loaded.
I'm running a MySQL 5 instance. Here is an excerpt of my application.properties file:
spring.jpa.show-sql=false
spring.jpa.hibernate.ddl-auto=create-drop
spring.jpa.database-platform=org.hibernate.dialect.MySQL5Dialect
spring.datasource.url=jdbc:mysql://localhost/mydatabase
spring.datasource.username=myuser
spring.datasource.password=SUPERSECRET
spring.datasource.driverClassName=com.mysql.jdbc.Driver
When executing an integration test, Spring starts up properly but fails on creating the Hibernate SessionFactory, with the exception:
org.hibernate.MappingException: No Dialect mapping for JDBC type: 1111
I think my dialect should be MySQL5Dialect; I also tried the one explicitly stating InnoDB, and the two dialect options which don't indicate version 5, but I always end up with the same 'No Dialect mapping for JDBC type: 1111' message.
My application.properties file resides in the test/resources source folder. It is recognized by the JUnit test runner (I previously got an exception because of a typo in it).
Are the properties I'm setting wrong? I couldn't find any official documentation on these property names, but found a hint in this Stack Overflow answer: https://stackoverflow.com/a/25941616/1735497
Looking forward to your answers, thanks!
BTW, the application is already using Spring Boot.
I got the same error because my query returned a UUID column. To fix it I returned the UUID column as a varchar through the query, like "cast(columnName as varchar)", and then it worked.
Example:
public interface StudRepository extends JpaRepository<Mark, UUID> {

    @Modifying
    @Query(value = "SELECT CAST(stuid AS varchar) id, SUM(marks) AS marks FROM studs GROUP BY stuid", nativeQuery = true)
    List<Student> findMarkGroupByStuid();

    interface Student {
        String getId();
        String getMarks();
    }
}
Here is the answer, based on the comment from SubOptimal:
The error message actually says that one column type cannot be mapped to a database type by Hibernate.
In my case it was the java.util.UUID type I use as the primary key in some of my entities. Just apply the annotation @Type(type = "uuid-char") (for Postgres, @Type(type = "pg-uuid")).
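A minimal sketch of what that annotation looks like on an entity (the class and field names here are just placeholders):
import java.util.UUID;
import javax.persistence.Entity;
import javax.persistence.Id;
import org.hibernate.annotations.Type;

@Entity
public class Mark {

    @Id
    @Type(type = "uuid-char") // or @Type(type = "pg-uuid") on PostgreSQL
    private UUID id;

    // getters and setters ...
}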
There is also another common use case that throws this exception: calling a function which returns void. For more info and a solution, go here.
I got the same error; the problem here is that the UUID stored in the DB is not being converted to an object.
I tried applying the annotations @Type(type = "uuid-char") (for Postgres @Type(type = "pg-uuid")), but it didn't work for me.
This is what worked for me. Suppose you want id and name from a table via a native query in JPA. Create one entity class like 'User' with fields id and name, and then convert the Object[] rows into the entity we want. Here the matched data is the list of object arrays we get from the query.
@Query(value = "SELECT CAST(id AS varchar) id, name FROM users", nativeQuery = true)
public List<Object[]> search();
Suppose this is the entity we have:
public class User {
    private UUID id;
    private String name;
}
List<User> userList = new ArrayList<>();
for (Object[] data : matchedData) {
    userList.add(new User(UUID.fromString(String.valueOf(data[0])),
            String.valueOf(data[1])));
}
Please check whether some column in the query has an unknown type. For example, '1' as column_name has an unknown type, whereas 1 as column_name is correctly typed as an integer. This worked for me.
Finding the column that triggered the issue
First, you didn't provide the entity mapping so that we could tell what column generated this problem. For instance, it could be a UUID or a JSON column.
Now, you are using a very old Hibernate Dialect. The MySQL5Dialect is meant for MySQL 5. Most likely you are using a newer MySQL version.
So, try to use the MySQL8Dialect instead:
spring.jpa.database-platform=org.hibernate.dialect.MySQL8Dialect
Adding non-standard types
In case you got the issue because you are using a JSON column type, try to provide a custom Hibernate Dialect that supports the non-standard Type:
public class MySQL8JsonDialect extends MySQL8Dialect {

    public MySQL8JsonDialect() {
        super();
        this.registerHibernateType(
            Types.OTHER, JsonStringType.class.getName()
        );
    }
}
And use the custom Hibernate Dialect:
<property
name="hibernate.dialect"
value="com.vladmihalcea.book.hpjp.hibernate.type.json.MySQL8JsonDialect"
/>
If you get this exception when executing SQL native queries, then you need to pass the type via addScalar:
JsonNode properties = (JsonNode) entityManager
    .createNativeQuery(
        "SELECT properties " +
        "FROM book " +
        "WHERE isbn = :isbn")
    .setParameter("isbn", "978-9730228236")
    .unwrap(org.hibernate.query.NativeQuery.class)
    .addScalar("properties", JsonStringType.INSTANCE)
    .getSingleResult();

assertEquals(
    "High-Performance Java Persistence",
    properties.get("title").asText()
);
Sometimes when you call a SQL procedure/function it might be required to return something. You can try returning void: RETURN; or a string (this one worked for me): RETURN 'OK'.
If you have a native SQL query, then fix it by adding a cast to the query.
Example:
CAST('yourString' AS varchar(50)) as anyColumnName
This worked in my case.
In my case, the issue was Hibernate not knowing how to deal with a UUID column. If you are using Postgres, try adding this to your resources/application.properties:
spring.jpa.properties.hibernate.dialect=org.hibernate.dialect.PostgreSQL9Dialect
Another simple explanation might be that you're fetching a complex type (entity/POJO) without specifying the entity to map to:
String sql = "select yourentity.* from {h-schema}Yourentity yourentity";
return entityManager.createNativeQuery(sql).getResultList();
simply add the class to map to in the createNativeQuery method:
return entityManager.createNativeQuery(sql, Yourentity.class).getResultList();
In my case the problem was that I forgot to add the resultClasses attribute when I set up my stored procedure in my User class.
@NamedStoredProcedureQuery(name = "find_email",
    procedureName = "find_email", resultClasses = User.class, // <-- I forgot that.
    parameters = {
        @StoredProcedureParameter(mode = ParameterMode.IN, name = "param_email", type = String.class)
    }),
This also happens when you are using Hibernate and calling a function returning void, at least with Postgres; it doesn't know how to handle the void. I ended up having to change my void return to an int.
If you are using Postgres, check that you don't have a column of type Abstime. Abstime is an internal Postgres datatype not recognized by JPA. In this case, converting to Text using TO_CHAR could help if permitted by your business requirements.
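As a rough illustration (the table and column names below are invented), the conversion can be done directly in the native query:
// the abstime column is rendered as text via TO_CHAR so Hibernate can map it
List<Object[]> rows = entityManager
        .createNativeQuery(
            "SELECT id, TO_CHAR(valid_from, 'YYYY-MM-DD HH24:MI:SS') AS valid_from " +
            "FROM legacy_events")
        .getResultList();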
If using Postgres:
public class CustomPostgreSqlDialect extends PostgreSQL94Dialect {

    @Override
    public SqlTypeDescriptor remapSqlTypeDescriptor(SqlTypeDescriptor sqlTypeDescriptor) {
        switch (sqlTypeDescriptor.getSqlType()) {
            case Types.CLOB:
                return VarcharTypeDescriptor.INSTANCE;
            case Types.BLOB:
                return VarcharTypeDescriptor.INSTANCE;
            case 1111: // 1111 (Types.OTHER) covers PostgreSQL's json type
                return VarcharTypeDescriptor.INSTANCE;
        }
        return super.remapSqlTypeDescriptor(sqlTypeDescriptor);
    }

    public CustomPostgreSqlDialect() {
        super();
        registerHibernateType(1111, "string");
    }
}
and use
<prop key="hibernate.dialect">com.abc.CustomPostgreSqlDialect</prop>
For anybody getting this error with an old Hibernate (3.x) version:
do not write the return type in capital letters. Hibernate's type implementation mapping uses lowercase return types and does not convert them:
CREATE OR REPLACE FUNCTION do_something(param varchar)
RETURNS integer AS
$BODY$
...
This is for Hibernate 5.x.
When calling a database function which returns a JSON string/object, use the unwrap(org.hibernate.query.NativeQuery.class).addScalar() methods.
Example (Spring & Hibernate):
@PersistenceContext
EntityManager em;

@Override
public String getJson(String strLayerName) {
    String nativeQuery = "select fn_layer_attributes(:layername)";
    return em.createNativeQuery(nativeQuery)
             .setParameter("layername", strLayerName)
             .unwrap(org.hibernate.query.NativeQuery.class)
             .addScalar("fn_layer_attributes", new JsonNodeBinaryType())
             .getSingleResult()
             .toString();
}
A function or procedure returning void causes issues with JPA/Hibernate, so changing it to return an integer and calling RETURN 1 at the end of the procedure may solve the problem.
SQL type 1111 is java.sql.Types.OTHER, i.e. a type Hibernate cannot map on its own.
If you are calling EntityManager.createNativeQuery(), be sure to include the resulting java class in the second parameter:
return em.createNativeQuery(sql, MyRecord.class).getResultList();
After trying many proposed solutions, including:
https://stackoverflow.com/a/59754570/349169 which is one of the solutions proposed here
https://vladmihalcea.com/hibernate-no-dialect-mapping-for-jdbc-type/
it was finally this one that fixed everything with the least amount of changes:
https://gist.github.com/agrawald/adad25d28bf6c56a7e4618fe95ee5a39
The trick is to not have @TypeDef on your class, but instead have two different @TypeDef declarations in two different package-info.java files: one inside your production code package for your production DB, and one inside your test package for your test H2 DB.
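For illustration, the production-side package-info.java might look roughly like this (the package name is a placeholder, and JsonStringType from the hibernate-types library is just one possible choice; the test package would get its own package-info.java with an H2-friendly type, as the gist describes):
@TypeDef(name = "json", typeClass = JsonStringType.class, defaultForType = JsonNode.class)
package com.example.app.domain;

import com.fasterxml.jackson.databind.JsonNode;
import com.vladmihalcea.hibernate.type.json.JsonStringType;
import org.hibernate.annotations.TypeDef;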

How can Spring Boot JPA run DDL SQL with a dynamic table name?

Yesterday I ran into this question: how can JPA run DDL SQL with a dynamic table name?
Usually I just use DQL and DML like select, insert, update, and delete,
such as:
public interface UserRepository extends JpaRepository<User, Integer> {

    @Query(value = "select a.* from user a where a.username = ? and a.password = ?", nativeQuery = true)
    List<User> loginCheck(String username, String password);
}
But when I need to run DDL SQL like the following:
String sql = "create table " + tableName + " as select * from user where login_flag = '1'";
I can't find a way to do it with JPA (or the EntityManager).
Finally I used plain JDBC to run the DDL SQL, but I think it's ugly...
Connection conn = null;
PreparedStatement ps = null;
String sql = "create table " + tableName + " as select * from user where login_flag = '1'";
try {
    Class.forName(drive);
    conn = DriverManager.getConnection(url, username, password);
    ps = conn.prepareStatement(sql);
    ps.executeUpdate();
    ps.close();
    conn.close();
} catch (Exception e) {
    e.printStackTrace();
}
So, can JPA run DDL SQL (such as CREATE/DROP/ALTER) with a dynamic table name in an easy way?
Your question seems to consist of two parts
The first part
can jpa run DDL sql
Sure, just use entityManager.createNativeQuery("CREATE TABLE ...").executeUpdate(). This is probably not the best idea (you should be using a database migration tool like Flyway or Liquibase for DB creation), but it will work.
Note that you might run into some issues, e.g. different RDBMSes have different requirements regarding transactions around DDL statements, but they can be solved quite easily most of the time.
You're probably wondering how to get hold of an EntityManager when using Spring Data. See here for an explanation on how to create custom repository fragments where you can inject virtually anything you need.
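A rough sketch of what such a fragment could look like (the type names are invented, each type would live in its own file, and since identifiers cannot be bound as parameters the table name is whitelisted before concatenation):
public interface DdlRepository {
    void createSnapshotTable(String tableName);
}

public class DdlRepositoryImpl implements DdlRepository {

    @PersistenceContext
    private EntityManager em;

    @Override
    @Transactional
    public void createSnapshotTable(String tableName) {
        // identifiers cannot be bound as JPA parameters, so validate the name first
        if (!tableName.matches("[A-Za-z_][A-Za-z0-9_]*")) {
            throw new IllegalArgumentException("Illegal table name: " + tableName);
        }
        em.createNativeQuery("create table " + tableName
                + " as select * from user where login_flag = '1'")
          .executeUpdate();
    }
}

// Spring Data picks the implementation up automatically when the main repository extends the fragment:
public interface UserRepository extends JpaRepository<User, Integer>, DdlRepository {
}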
The second part
with dynamic tableName
JPA only supports parameters in certain clauses within the query, and identifiers are not one of them. You'll need to use string concatenation, I'm afraid.
Why dynamic table names, though? It's not like your entity definitions are going to change at runtime. Static DDL scripts are generally less error-prone.

Must declare the scalar variable "@@RowCount" on SaveChangesAsync

I have a simple table insert in Entity Framework to add a record to an Azure SQL Data Warehouse table. I get this error on context.SaveChanges(): SQLException: Must declare the scalar variable "@@ROWCOUNT".
Reading from a table works perfectly fine; only saving to a table fails.
context.Users.Add(user);
context.SaveChanges(); -> fails here.
Expected result - record should get inserted in the table
Actual result - Microsoft.EntityFrameworkCore.DbUpdateException: 'An error occurred while updating the entries. See the inner exception for details.'
Inner Exception
SqlException: Must declare the scalar variable "@@ROWCOUNT".
I found out that Entity Framework is not supported for Azure SQL Data Warehouse: https://feedback.azure.com/forums/307516-sql-data-warehouse/suggestions/12868725-support-for-entity-framework
I used SqlConnection and SqlCommand as a workaround.
using (var cn = new SqlConnection(connectionString))
{
    var query = "insert into Users([Id]) values (@Id)";
    using (var cmd = new SqlCommand(query, cn))
    {
        cmd.Parameters.AddWithValue("@Id", 1);
        cn.Open();
        cmd.ExecuteNonQuery();
        cn.Close();
    }
}

Using Integer Array in postgres with Spring-boot

I am attempting to accept a List<Integer> from the browser and use it within a SQL query against a Postgres database. The following code snippet shows the function I have written to do this. Some of the variables have been changed, so there may appear to be some discrepancies.
public static List<Map<String, Object>> fetch(NamedParameterJdbcTemplate jdbcTemplate, List<Integer> id) {
    List<Map<String, Object>> result = new ArrayList<>();
    String sql = "select * from lookup where id && ARRAY[ :ids ]";
    MapSqlParameterSource parameters = new MapSqlParameterSource();
    parameters.addValue("ids", id, Types.INTEGER);
    result = jdbcTemplate.query(sql,
        parameters,
        new RowMapper<Map<String, Object>>() { ...
        }
    );
}
The lookup table's id field is a Postgres array, hence the need for && and the ARRAY constructor.
This function is called from many different endpoints, which pass in the NamedParameterJdbcTemplate as well as a list of integers. The problem I am having is that if any integer in the list is < 100, I get the following message:
Bad value for type int : {20}
Is there another way of doing this, or a way around this error?
EDIT:
It turned out to be partly the problem mentioned in the answer, but also that I was using
rs.getInt(col)
instead of
rs.getArray(col)
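In other words, inside the RowMapper the array column has to be read roughly like this (assuming the column is called "id"):
// the Postgres integer[] column must be read as a JDBC array, not as an int
java.sql.Array idArray = rs.getArray("id");
Integer[] ids = (Integer[]) idArray.getArray();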
There's an error I can see in the SQL, and probably the wrong choice of API after that. First in the query:
select * from lookup where id && ARRAY[ :ids ]
To bind an array parameter, it must not be placed in the ARRAY constructor, but rather you need to use JDBC binding like this:
select * from lookup where id && ?
As you've noticed I'm not using a named parameter in these examples, because NamedParameterJdbcTemplate does not provide a route to obtaining the java.sql.Connection object or a proxy to it. You can access it through the PreparedStatementSetter if you use the JdbcOperations interface instead.
public static List<Map<String, Object>> fetch(NamedParameterJdbcTemplate jdbcTemplate, List<Integer> idlist) {
    List<Map<String, Object>> result = new ArrayList<>();
    String sql = "select * from lookup where id && ?";
    final Integer[] ids = idlist.toArray(new Integer[0]);
    PreparedStatementSetter parameters = new PreparedStatementSetter() {
        @Override
        public void setValues(PreparedStatement stmt) throws SQLException {
            Connection conn = stmt.getConnection();
            // this can only be done through the Connection
            java.sql.Array arr = conn.createArrayOf("integer", ids);
            // you can use setObject(1, ids, java.sql.Types.ARRAY) instead of setArray
            // in case the connection wrapper doesn't pass it on to the JDBC driver
            stmt.setArray(1, arr);
        }
    };
    JdbcOperations jdo = jdbcTemplate.getJdbcOperations();
    result = jdo.query(sql,
        parameters,
        new RowMapper<Map<String, Object>>() { ...
        }
    );
}
There might be errors in the code, since I normally use a different set of APIs; note that setValues already declares java.sql.SQLException, so errors from the driver simply propagate from there. You should be able to handle it from here on.

Reset Embedded H2 database periodically

I'm setting up a new version of my application on a demo server and would love to find a way of resetting the database daily. I guess I can always have a cron job executing drop and create queries, but I'm looking for a cleaner approach. I tried using a special persistence unit with a drop-create approach, but it doesn't work, as the system connects to and disconnects from the server frequently (on demand).
Is there a better approach?
H2 supports a special SQL statement to drop all objects:
DROP ALL OBJECTS [DELETE FILES]
If you don't want to drop all tables, you might want to use truncate table:
TRUNCATE TABLE
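For the daily reset asked about in the question, a hedged sketch using Spring's scheduler (the bean wiring is assumed, and scheduling must be enabled with @EnableScheduling somewhere in the configuration):
@Component
public class DemoDatabaseReset {

    private final JdbcTemplate jdbcTemplate;

    public DemoDatabaseReset(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    // every day at 04:00: wipe the whole schema, then re-create it
    @Scheduled(cron = "0 0 4 * * *")
    public void reset() {
        jdbcTemplate.execute("DROP ALL OBJECTS");
        // re-run your schema/seed script here (e.g. RunScript, Flyway, or plain SQL)
    }
}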
As this answer is the first Google result for "reset H2 database", I post my solution below:
After each JUnit @Test:
Disable integrity constraint
List all tables in the (default) PUBLIC schema
Truncate all tables
List all sequences in the (default) PUBLIC schema
Reset all sequences
Reenable the constraints.
@After
public void tearDown() {
    try {
        clearDatabase();
    } catch (Exception e) {
        Fail.fail(e.getMessage());
    }
}
public void clearDatabase() throws SQLException {
    Connection c = datasource.getConnection();
    Statement s = c.createStatement();

    // Disable FK
    s.execute("SET REFERENTIAL_INTEGRITY FALSE");

    // Find all tables and truncate them
    Set<String> tables = new HashSet<String>();
    ResultSet rs = s.executeQuery("SELECT TABLE_NAME FROM INFORMATION_SCHEMA.TABLES where TABLE_SCHEMA='PUBLIC'");
    while (rs.next()) {
        tables.add(rs.getString(1));
    }
    rs.close();
    for (String table : tables) {
        s.executeUpdate("TRUNCATE TABLE " + table);
    }

    // Idem for sequences
    Set<String> sequences = new HashSet<String>();
    rs = s.executeQuery("SELECT SEQUENCE_NAME FROM INFORMATION_SCHEMA.SEQUENCES WHERE SEQUENCE_SCHEMA='PUBLIC'");
    while (rs.next()) {
        sequences.add(rs.getString(1));
    }
    rs.close();
    for (String seq : sequences) {
        s.executeUpdate("ALTER SEQUENCE " + seq + " RESTART WITH 1");
    }

    // Enable FK
    s.execute("SET REFERENTIAL_INTEGRITY TRUE");
    s.close();
    c.close();
}
The other solution would be to recreate the database at the beginning of each test, but that might be too slow for a big DB.
There is special syntax in Spring for database manipulation within unit tests:
@Sql(scripts = "classpath:drop_all.sql", executionPhase = Sql.ExecutionPhase.AFTER_TEST_METHOD)
@Sql(scripts = {"classpath:create.sql", "classpath:init.sql"}, executionPhase = Sql.ExecutionPhase.BEFORE_TEST_METHOD)
public class UnitTest {}
In this example we execute the drop_all.sql script (where we drop all required tables) after every test method, and we execute the create.sql script (where we create all required tables) and the init.sql script (where we initialize all required tables) before each test method.
The command: SHUTDOWN
You can execute it using
RunScript.execute(jdbc_url, user, password, "classpath:shutdown.sql", "UTF8", false);
I run it every time the suite of tests has finished, using @AfterClass.
If you are using spring boot see this stackoverflow question
Set up your data source; I don't have any special close-on-exit setting.
datasource:
  driverClassName: org.h2.Driver
  url: "jdbc:h2:mem:psptrx"
Spring Boot @DirtiesContext annotation:
@DirtiesContext(classMode = DirtiesContext.ClassMode.BEFORE_EACH_TEST_METHOD)
Use @Before to initialise on each test case.
The @DirtiesContext will cause the H2 context to be dropped between each test.
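Putting it together, a minimal JUnit 4 style sketch (the test class and table names are placeholders):
@RunWith(SpringRunner.class)
@SpringBootTest
@DirtiesContext(classMode = DirtiesContext.ClassMode.BEFORE_EACH_TEST_METHOD)
public class PsptrxRepositoryTest {

    @Autowired
    private JdbcTemplate jdbcTemplate;

    @Before
    public void init() {
        // the in-memory H2 database is brand new for every test method,
        // so seed whatever each test needs here
        jdbcTemplate.execute("create table if not exists users (id int primary key, name varchar(50))");
        jdbcTemplate.update("insert into users (id, name) values (1, 'demo')");
    }

    @Test
    public void findsSeededUser() {
        Integer count = jdbcTemplate.queryForObject("select count(*) from users", Integer.class);
        // assert on count ...
    }
}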
You can write the following in application.properties to reset the tables which are loaded by JPA:
spring.jpa.hibernate.ddl-auto=create