Inserting data into more than one table in spring-batch using ibatis - spring-batch

I am using IbatisBatchItemWriter to write a complex object into multiple tables. Here is what my object looks like:
public class SfObject {
    protected List<Person> person;
}

public class Person {
    protected String personId;
    protected XMLGregorianCalendar dateOfBirth;
    protected String countryOfBirth;
    protected String regionOfBirth;
    protected String placeOfBirth;
    protected String birthName;
    protected XMLGregorianCalendar dateOfDeath;
    protected XMLGregorianCalendar lastModifiedOn;
    protected List<EmailInformation> emailInformation;
}

public class EmailInformation {
    protected String emailType;
    protected String emailAddress;
    protected XMLGregorianCalendar lastModifiedOn;
}
And here is my iBATIS configuration to insert the above objects:
<insert id="insertCompoundEmployeeData" parameterClass="com.domain.SfObject">
  <iterate property="person">
    insert into E_Person_Info
      (person_id,
       person_birth_dt,
       person_country_of_birth,
       person_region_of_birth,
       person_place_of_birth,
       person_birth_name,
       person_death_dt,
       last_modified_on
      )
    values (#person[].personId#,
            #person[].dateOfBirth#,
            #person[].countryOfBirth#,
            #person[].regionOfBirth#,
            #person[].placeOfBirth#,
            #person[].birthName#,
            #person[].dateOfDeath#,
            #person[].lastModifiedOn#
    );
    <iterate property="person[].emailInformation">
      insert into E_Email_Info
        (email_info_person_id,
         email_info_email_type,
         email_info_email_address,
         last_modified_on
        )
      values (#person[].personId#,
              #person[].emailInformation[].emailType#,
              #person[].emailInformation[].emailAddress#,
              #person[].emailInformation[].lastModifiedOn#
      );
    </iterate>
  </iterate>
</insert>
I am not sure whether I can use the above config to insert data into more than one table, but when I executed it I got the error below for a batch of 10 records. By the way, email information is not mandatory, so it may be null in some Person objects.
Stacktrace
[08.08.2014 17:30:07] DEBUG: WebservicePagingItemReader.doRead() - Reading page 0
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=1
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=2
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=3
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=4
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=5
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=6
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=7
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=8
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=9
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.isComplete() - Repeat is complete according to policy and result value.
[08.08.2014 17:30:09] DEBUG: IbatisBatchItemWriter.write() - Executing batch with 10 items.
[08.08.2014 17:30:09] DEBUG: SqlMapClientTemplate.execute() - Opened SqlMapSession [com.ibatis.sqlmap.engine.impl.SqlMapSessionImpl@168afdd] for iBATIS operation
[08.08.2014 17:30:10] DEBUG: Connection.debug() - {conn-100000} Connection
[08.08.2014 17:30:10] DEBUG: SqlMapClientTemplate.execute() - Obtained JDBC Connection [Transaction-aware proxy for target Connection from DataSource [org.springframework.jdbc.datasource.DriverManagerDataSource@8eae04]] for iBATIS operation
[08.08.2014 17:30:10] DEBUG: Connection.debug() - {conn-100000} Preparing Statement: insert into E_Person_Info (person_id, person_birth_dt, person_country_of_birth, person_region_of_birth, person_place_of_birth, person_birth_name, person_death_dt, last_modified_on ) values (?, ?, ?, ?, ?, ?, ?, ? );
[08.08.2014 17:30:10] DEBUG: Connection.debug() - {conn-100000} Preparing Statement: insert into E_Person_Info (person_id, person_birth_dt, person_country_of_birth, person_region_of_birth, person_place_of_birth, person_birth_name, person_death_dt, last_modified_on ) values (?, ?, ?, ?, ?, ?, ?, ? ); insert into E_Email_Info (email_info_person_id, email_info_email_type, email_info_email_address, last_modified_on ) values (?, ?, ?, ? );
[08.08.2014 17:30:10] DEBUG: Connection.debug() - {conn-100000} Preparing Statement: insert into E_Person_Info (person_id, person_birth_dt, person_country_of_birth, person_region_of_birth, person_place_of_birth, person_birth_name, person_death_dt, last_modified_on ) values (?, ?, ?, ?, ?, ?, ?, ? );
[08.08.2014 17:30:10] DEBUG: TaskletStep.doInChunkContext() - Applying contribution: [StepContribution: read=10, written=0, filtered=0, readSkips=0, writeSkips=0, processSkips=0, exitStatus=EXECUTING]
[08.08.2014 17:30:10] DEBUG: TaskletStep.doInChunkContext() - Rollback for Exception: org.springframework.dao.InvalidDataAccessResourceUsageException: Batch execution returned invalid results. Expected 1 but number of BatchResult objects returned was 3
[08.08.2014 17:30:10] DEBUG: DataSourceTransactionManager.processRollback() - Initiating transaction rollback
[08.08.2014 17:30:10] DEBUG: DataSourceTransactionManager.doRollback() - Rolling back JDBC transaction on Connection [net.sourceforge.jtds.jdbc.ConnectionJDBC3@190d8e1]
[08.08.2014 17:30:10] DEBUG: DataSourceTransactionManager.doCleanupAfterCompletion() - Releasing JDBC Connection [net.sourceforge.jtds.jdbc.ConnectionJDBC3@190d8e1] after transaction
[08.08.2014 17:30:10] DEBUG: DataSourceUtils.doReleaseConnection() - Returning JDBC Connection to DataSource
[08.08.2014 17:30:10] DEBUG: RepeatTemplate.doHandle() - Handling exception: org.springframework.dao.InvalidDataAccessResourceUsageException, caused by: org.springframework.dao.InvalidDataAccessResourceUsageException: Batch execution returned invalid results. Expected 1 but number of BatchResult objects returned was 3
[08.08.2014 17:30:10] DEBUG: RepeatTemplate.executeInternal() - Handling fatal exception explicitly (rethrowing first of 1): org.springframework.dao.InvalidDataAccessResourceUsageException: Batch execution returned invalid results. Expected 1 but number of BatchResult objects returned was 3
[08.08.2014 17:30:10] ERROR: AbstractStep.execute() - Encountered an error executing the step
org.springframework.dao.InvalidDataAccessResourceUsageException: Batch execution returned invalid results. Expected 1 but number of BatchResult objects returned was 3
at org.springframework.batch.item.database.IbatisBatchItemWriter.write(IbatisBatchItemWriter.java:140)
at org.springframework.batch.core.step.item.SimpleChunkProcessor.writeItems(SimpleChunkProcessor.java:156)
at org.springframework.batch.core.step.item.SimpleChunkProcessor.doWrite(SimpleChunkProcessor.java:137)
at org.springframework.batch.core.step.item.SimpleChunkProcessor.write(SimpleChunkProcessor.java:252)
at org.springframework.batch.core.step.item.SimpleChunkProcessor.process(SimpleChunkProcessor.java:178)
at org.springframework.batch.core.step.item.ChunkOrientedTasklet.execute(ChunkOrientedTasklet.java:74)
at org.springframework.batch.core.step.tasklet.TaskletStep$2.doInChunkContext(TaskletStep.java:268)
at org.springframework.batch.core.scope.context.StepContextRepeatCallback.doInIteration(StepContextRepeatCallback.java:76)
at org.springframework.batch.repeat.support.RepeatTemplate.getNextResult(RepeatTemplate.java:367)
at org.springframework.batch.repeat.support.RepeatTemplate.executeInternal(RepeatTemplate.java:215)
at org.springframework.batch.repeat.support.RepeatTemplate.iterate(RepeatTemplate.java:143)
at org.springframework.batch.core.step.tasklet.TaskletStep.doExecute(TaskletStep.java:242)
at org.springframework.batch.core.step.AbstractStep.execute(AbstractStep.java:198)
at org.springframework.batch.core.job.AbstractJob.handleStep(AbstractJob.java:348)
at org.springframework.batch.core.job.flow.FlowJob.access$0(FlowJob.java:1)
at org.springframework.batch.core.job.flow.FlowJob$JobFlowExecutor.executeStep(FlowJob.java:135)
at org.springframework.batch.core.job.flow.support.state.StepState.handle(StepState.java:60)
at org.springframework.batch.core.job.flow.support.SimpleFlow.resume(SimpleFlow.java:144)
at org.springframework.batch.core.job.flow.support.SimpleFlow.start(SimpleFlow.java:124)
at org.springframework.batch.core.job.flow.FlowJob.doExecute(FlowJob.java:103)
at org.springframework.batch.core.job.AbstractJob.execute(AbstractJob.java:250)
at org.springframework.batch.core.launch.support.SimpleJobLauncher$1.run(SimpleJobLauncher.java:110)
at org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:48)
at org.springframework.batch.core.launch.support.SimpleJobLauncher.run(SimpleJobLauncher.java:105)
at com.CtrlMPojoForBatch.initiateSpringBatchProcess(CtrlMPojoForBatch.java:92)
at com.CtrlMPojoForBatch.main(CtrlMPojoForBatch.java:33)
[08.08.2014 17:30:10] WARN: CustomStepExecutionListner.afterStep() - Failure occured executing the step readWriteExchagnerateConversionData
[08.08.2014 17:30:10] WARN: CustomStepExecutionListner.afterStep() - Initiating the rollback operation...
[08.08.2014 17:30:10] WARN: CustomStepExecutionListner.afterStep() - Rollback completed!

Assuming you're using the IbatisBatchItemWriter provided by Spring Batch (it has been deprecated in favor of the writers provided by the MyBatis project), set its assertUpdates property to false. This stops Spring Batch from asserting that exactly one update result comes back per item; in your log the chunk prepares three different statements (items with and without email rows generate different SQL), so the batch returns three BatchResult objects instead of the expected one.
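For illustration, a minimal sketch of that setting with Java-based configuration (the SqlMapClient wiring is assumed to exist elsewhere; the statement id is the one from the question):

// A sketch only: everything except setAssertUpdates(false) mirrors the setup above.
IbatisBatchItemWriter<SfObject> writer = new IbatisBatchItemWriter<SfObject>();
writer.setSqlMapClient(sqlMapClient); // your existing iBATIS SqlMapClient
writer.setStatementId("insertCompoundEmployeeData");
writer.setAssertUpdates(false); // do not require exactly one BatchResult per item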

Related

Does cascade happen in 1 transaction?

I save the Product, which cascade-persists the ProductMaterial. However, when the ProductMaterial insert throws a DataIntegrityViolationException, the Product is rolled back, which looks like the cascade is done in one transaction, but I can't find any docs saying that it is. Can someone clarify this for me?
NOTE: I DO NOT use @Transactional
Material material = new Material();
material.setId(1);
Product newProduct = new Product();
ProductMaterial productMaterial = new ProductMaterial();
newProduct.setName("bàn chải");
newProduct.setPrice(1000);
newProduct.setCreatedAt(new Date());
newProduct.setProductMaterials(Collections.singletonList(productMaterial));
productMaterial.setProduct(newProduct);
productMaterial.setMaterial(material);
productRepository.save(newProduct);
Here is the Hibernate execution log:
Hibernate:
/* insert com.vietnam.hanghandmade.entities.Product
*/ insert
into
product
(created_at, name, price, id)
values
(?, ?, ?, ?)
2020-11-10 14:55:38.281 TRACE 65729 --- [nio-8080-exec-2] o.h.type.descriptor.sql.BasicBinder : binding parameter [1] as [TIMESTAMP] - [Tue Nov 10 14:55:38 JST 2020]
2020-11-10 14:55:38.281 TRACE 65729 --- [nio-8080-exec-2] o.h.type.descriptor.sql.BasicBinder : binding parameter [2] as [VARCHAR] - [bàn chải]
2020-11-10 14:55:38.281 TRACE 65729 --- [nio-8080-exec-2] o.h.type.descriptor.sql.BasicBinder : binding parameter [3] as [INTEGER] - [1000]
2020-11-10 14:55:38.281 TRACE 65729 --- [nio-8080-exec-2] o.h.type.descriptor.sql.BasicBinder : binding parameter [4] as [OTHER] - [e5729490-a0f8-48e7-9600-eeeba8b8f279]
Hibernate:
/* insert com.vietnam.hanghandmade.entities.ProductMaterial
*/ insert
into
product_material
(material_id, product_id)
values
(?, ?)
2020-11-10 14:55:38.324 TRACE 65729 --- [nio-8080-exec-2] o.h.type.descriptor.sql.BasicBinder : binding parameter [1] as [INTEGER] - [1]
2020-11-10 14:55:38.324 TRACE 65729 --- [nio-8080-exec-2] o.h.type.descriptor.sql.BasicBinder : binding parameter [2] as [OTHER] - [e5729490-a0f8-48e7-9600-eeeba8b8f279]
2020-11-10 14:55:38.328 WARN 65729 --- [nio-8080-exec-2] o.h.engine.jdbc.spi.SqlExceptionHelper : SQL Error: 0, SQLState: 23503
2020-11-10 14:55:38.328 ERROR 65729 --- [nio-8080-exec-2] o.h.engine.jdbc.spi.SqlExceptionHelper : ERROR: insert or update on table "product_material" violates foreign key constraint "product_material_material_id_fkey"
Detail: Key (material_id)=(1) is not present in table "material".
NOTE: This answer missed the point of the question, which is about “cascading persist” – it talks about “cascading delete” for foreign keys.
The cascading delete or update is part of the action of the system trigger that implements foreign key constraints, and as such it runs in the same transaction as the triggering statement.
I cannot find a place in the fine manual that spells this out, but it is obvious if you think about it: if the cascading delete were run in a separate transaction, it would be possible that the delete succeeds and the cascading delete fails, which would render the database inconsistent and is consequently not an option.
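The same all-or-nothing behavior can be reproduced with plain JDBC; here is a minimal sketch (the connection URL, credentials, and column lists are illustrative, not taken from the question):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class CascadeTxDemo {
    public static void main(String[] args) throws SQLException {
        try (Connection con = DriverManager.getConnection(
                "jdbc:postgresql://localhost/test", "user", "secret")) {
            con.setAutoCommit(false); // both inserts join the same transaction
            try (PreparedStatement product = con.prepareStatement(
                         "insert into product (id, name) values (?, ?)");
                 PreparedStatement material = con.prepareStatement(
                         "insert into product_material (product_id, material_id) values (?, ?)")) {
                product.setInt(1, 1);
                product.setString(2, "bàn chải");
                product.executeUpdate();
                material.setInt(1, 1);
                material.setInt(2, 1); // violates the FK if material 1 is absent
                material.executeUpdate();
                con.commit();
            } catch (SQLException e) {
                con.rollback(); // the product insert is undone as well
                throw e;
            }
        }
    }
}

As for why this happens in the question without an explicit @Transactional: Spring Data JPA's SimpleJpaRepository declares save() as @Transactional, so both cascaded inserts run inside that single transaction.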

Cause: org.postgresql.util.PSQLException: An I/O error occurred while sending to the backend

I am using MyBatis to insert data into a PostgreSQL DB. I have 19629 records to insert, and I am trying to insert them all at once. But if I pass more than 6k records to the query, I get Cause: org.postgresql.util.PSQLException: An I/O error occurred while sending to the backend.
So is there a limit on the number of records that can be inserted at once in PostgreSQL?
MyBatis code:
@Insert({
    "<script>",
    "insert into temp_overdrive_csv_dtls (lpat_library_card_number, day_of_use, sessions, minutes_read, hours_read, sys_created_by)",
    "values ",
    "<foreach collection='recordList' item='record' separator=','>",
    "(#{record.lpatLibraryCardNumber}, #{record.dayofUse}, #{record.sessions}, #{record.minutesRead}, #{record.hoursRead}, #{record.sysCreatedBy})",
    "</foreach>",
    "</script>" })
public Integer insert(@Param("recordList") List<CsvRecord> recordList);
Error:
Error updating database. Cause: org.postgresql.util.PSQLException: An I/O error occurred while sending to the backend.
The error may involve com.apds.mybatis.mapper.overdrive.OverdriveTotMapper.insert-Inline
The error occurred while setting parameters
SQL: insert into temp_overdrive_csv_dtls (lpat_library_card_number,day_of_use,sessions,minutes_read,hours_read,sys_created_by) values (?,?, ?, ?, ?, ?) , (?,?, ?, ?, ?, ?) , (?,?, ?, ?, ?, ?) , (?,?, ?, ?, ?, ?) , , (?,?, ?, ?, ?, ?)
Cause: org.postgresql.util.PSQLException: An I/O error occurred while sending to the backend.
at org.apache.ibatis.exceptions.ExceptionFactory.wrapException(ExceptionFactory.java:23)
at org.apache.ibatis.session.defaults.DefaultSqlSession.update(DefaultSqlSession.java:150)
at org.apache.ibatis.session.defaults.DefaultSqlSession.insert(DefaultSqlSession.java:137)
at org.apache.ibatis.binding.MapperMethod.execute(MapperMethod.java:46)
at org.apache.ibatis.binding.MapperProxy.invoke(MapperProxy.java:43)
at com.sun.proxy.$Proxy79.insert(Unknown Source)
at com.apds.overdrive.service.OverdriveService.processRequest(OverdriveService.java:105)
at com.apds.overdrive.PartnerOverdriveApplication.main(PartnerOverdriveApplication.java:75)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.boot.devtools.restart.RestartLauncher.run(RestartLauncher.java:49)
Caused by: org.postgresql.util.PSQLException: An I/O error occurred while sending to the backend.
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:336)
at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:446)
at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:370)
at org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:149)
at org.postgresql.jdbc.PgPreparedStatement.execute(PgPreparedStatement.java:138)
at org.apache.ibatis.executor.statement.PreparedStatementHandler.update(PreparedStatementHandler.java:41)
at org.apache.ibatis.executor.statement.RoutingStatementHandler.update(RoutingStatementHandler.java:66)
at org.apache.ibatis.executor.SimpleExecutor.doUpdate(SimpleExecutor.java:45)
at org.apache.ibatis.executor.BaseExecutor.update(BaseExecutor.java:100)
at org.apache.ibatis.executor.CachingExecutor.update(CachingExecutor.java:75)
at org.apache.ibatis.session.defaults.DefaultSqlSession.update(DefaultSqlSession.java:148)
... 11 more
Caused by: java.io.IOException: Tried to send an out-of-range integer as a 2-byte value: 36000
at org.postgresql.core.PGStream.sendInteger2(PGStream.java:252)
at org.postgresql.core.v3.QueryExecutorImpl.sendParse(QueryExecutorImpl.java:1470)
at org.postgresql.core.v3.QueryExecutorImpl.sendOneQuery(QueryExecutorImpl.java:1793)
at org.postgresql.core.v3.QueryExecutorImpl.sendQuery(QueryExecutorImpl.java:1356)
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:301)
... 21 more
It is not the number of rows, but the number of placeholders.
Most drivers have a limit on the number of placeholders in a PreparedStatement (32767 with pgjdbc, I think; note that the exception above fails while sending 36000 as a 2-byte value).
This is one of the reasons why a multi-row insert is not recommended when inserting or updating a large number of rows (another reason is performance).
You should switch to a batch insert instead.
Please see this answer for example code, or the sketch below.
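As an illustration, a minimal sketch of such a batch insert with MyBatis; insertOne is a hypothetical single-row variant of the question's statement that would need to be added to the mapper:

import java.util.List;
import org.apache.ibatis.session.ExecutorType;
import org.apache.ibatis.session.SqlSession;
import org.apache.ibatis.session.SqlSessionFactory;

public int batchInsert(SqlSessionFactory factory, List<CsvRecord> recordList) {
    // The BATCH executor reuses one PreparedStatement and sends the rows
    // as a JDBC batch instead of one giant multi-row statement.
    try (SqlSession session = factory.openSession(ExecutorType.BATCH)) {
        OverdriveTotMapper mapper = session.getMapper(OverdriveTotMapper.class);
        int count = 0;
        for (CsvRecord record : recordList) {
            mapper.insertOne(record);      // hypothetical single-row insert
            if (++count % 1000 == 0) {
                session.flushStatements(); // ship the batch in chunks
            }
        }
        session.flushStatements();
        session.commit();
        return count;
    }
}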

Using BigDecimal field in #Query and populating it with #Param field is throwing Sql Error

The following is the query I am using to bind a BigDecimal value in @Query, but it fails with a SQL syntax error:
@Query(value="Select f.id,s.student_id,f.feesPaid,f.fees_pending,f.paid_datetime from Fees f inner join Student s where f.feesPaid > :amt")
List<Fees> findFirst3ByFeesPaidGreaterThan(@Param(value = "amt") BigDecimal amt);
The following is the error:
Hibernate: select fees0_.id as col_0_0_, student1_.student_id as col_1_0_, fees0_.fees_paid as col_2_0_, fees0_.fees_pending as col_3_0_, fees0_.paid_datetime as col_4_0_ from fees fees0_ inner join student student1_ on where fees0_.fees_paid>?
2019-05-07 20:06:16.779 WARN 21752 --- [nio-8082-exec-1] o.h.engine.jdbc.spi.SqlExceptionHelper : SQL Error: 1064, SQLState: 42000
2019-05-07 20:06:16.779 ERROR 21752 --- [nio-8082-exec-1] o.h.engine.jdbc.spi.SqlExceptionHelper : You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'where fees0_.fees_paid>500' at line 1
I am able to use a derived query method name instead, but I wanted to do it using @Query as shown above.

Hibernate second-level cache can get the lazy-loading entity only when I set a breakpoint

I use Spring Data JPA and a Hibernate second-level cache via hibernate-redis in my project. I use @Transactional for lazy loading, but the cache misses when I run the application. If I debug it and set a breakpoint to wait for some time, it works and retrieves the cache entry from Redis. Here is the code:
Entity ItemCategory:
@Entity
@Cacheable
public class ItemCategory extends BaseModel {
    @NotNull
    @Column(updatable=false)
    private String name;

    @JsonBackReference
    @ManyToOne(fetch = FetchType.LAZY)
    private ItemCategory root;
}
Entity Item:
@Entity
@Cacheable
public class Item extends BaseModel {
    @ManyToOne(fetch = FetchType.EAGER)
    private ItemCategory category;
}
Repository:
@Repository
public interface ItemCategoryRepository extends JpaRepository<ItemCategory, Long> {
    @QueryHints(value = {
        @QueryHint(name = "org.hibernate.cacheable", value = "true")
    })
    @Query("select distinct i.category.root from Item i where i.store.id = :id and i.category.parent.id = i.category.root.id")
    List<ItemCategory> findByStoreId(@Param("id") Long id);
}
Cache miss:
2017-03-06 14:49:30.105 TRACE 30295 --- [nio-8080-exec-2] o.h.cache.redis.client.RedisClient : retrieve cache item. region=hibernate.org.hibernate.cache.internal.StandardQueryCache, key=sql: select distinct itemcatego2_.id as id1_21_, itemcatego2_.create_by_id as create_b8_21_, itemcatego2_.create_date as create_d2_21_, itemcatego2_.last_modified_by_id as last_mod9_21_, itemcatego2_.last_modified_date as last_mod3_21_, itemcatego2_.background_id as backgro10_21_, itemcatego2_.enabled as enabled4_21_, itemcatego2_.name as name5_21_, itemcatego2_.parent_id as parent_11_21_, itemcatego2_.root_id as root_id12_21_, itemcatego2_.slide as slide6_21_, itemcatego2_.son_number as son_numb7_21_ from item item0_ inner join item_category itemcatego1_ on item0_.category_id=itemcatego1_.id inner join item_category itemcatego2_ on itemcatego1_.root_id=itemcatego2_.id where item0_.store_id=? and itemcatego1_.parent_id=itemcatego1_.root_id; parameters: ; named parameters: {id=4}; transformer: org.hibernate.transform.CacheableResultTransformer#110f2, value=[6098054966726656, 3, 1]
2017-03-06 14:49:30.116 TRACE 30295 --- [nio-8080-exec-2] o.h.cache.redis.client.RedisClient : retrieve cache item. region=hibernate.org.hibernate.cache.spi.UpdateTimestampsCache, key=item, value=null
2017-03-06 14:49:30.127 TRACE 30295 --- [nio-8080-exec-2] o.h.cache.redis.client.RedisClient : retrieve cache item. region=hibernate.org.hibernate.cache.spi.UpdateTimestampsCache, key=item_category, value=null
2017-03-06 14:49:41.971 INFO 30295 --- [nio-8080-exec-2] i.StatisticalLoggingSessionEventListener : Session Metrics {
974551 nanoseconds spent acquiring 1 JDBC connections;
0 nanoseconds spent releasing 0 JDBC connections;
0 nanoseconds spent preparing 0 JDBC statements;
0 nanoseconds spent executing 0 JDBC statements;
0 nanoseconds spent executing 0 JDBC batches;
0 nanoseconds spent performing 0 L2C puts;
19881210 nanoseconds spent performing 1 L2C hits;
24082571 nanoseconds spent performing 2 L2C misses;
0 nanoseconds spent executing 0 flushes (flushing a total of 0 entities and 0 collections);
26331 nanoseconds spent executing 1 partial-flushes (flushing a total of 0 entities and 0 collections)
}
If I debug and set a breakpoint to wait for some time (it does not work every time):
2017-03-06 14:50:00.565 TRACE 30295 --- [nio-8080-exec-3] o.h.cache.redis.client.RedisClient : retrieve cache item. region=hibernate.org.hibernate.cache.internal.StandardQueryCache, key=sql: select distinct itemcatego2_.id as id1_21_, itemcatego2_.create_by_id as create_b8_21_, itemcatego2_.create_date as create_d2_21_, itemcatego2_.last_modified_by_id as last_mod9_21_, itemcatego2_.last_modified_date as last_mod3_21_, itemcatego2_.background_id as backgro10_21_, itemcatego2_.enabled as enabled4_21_, itemcatego2_.name as name5_21_, itemcatego2_.parent_id as parent_11_21_, itemcatego2_.root_id as root_id12_21_, itemcatego2_.slide as slide6_21_, itemcatego2_.son_number as son_numb7_21_ from item item0_ inner join item_category itemcatego1_ on item0_.category_id=itemcatego1_.id inner join item_category itemcatego2_ on itemcatego1_.root_id=itemcatego2_.id where item0_.store_id=? and itemcatego1_.parent_id=itemcatego1_.root_id; parameters: ; named parameters: {id=4}; transformer: org.hibernate.transform.CacheableResultTransformer#110f2, value=[6098054966726656, 3, 1]
2017-03-06 14:50:00.584 TRACE 30295 --- [nio-8080-exec-3] o.h.cache.redis.client.RedisClient : retrieve cache item. region=hibernate.org.hibernate.cache.spi.UpdateTimestampsCache, key=item, value=null
2017-03-06 14:50:00.595 TRACE 30295 --- [nio-8080-exec-3] o.h.cache.redis.client.RedisClient : retrieve cache item. region=hibernate.org.hibernate.cache.spi.UpdateTimestampsCache, key=item_category, value=null
2017-03-06 14:50:01.805 TRACE 30295 --- [nio-8080-exec-3] o.h.cache.redis.client.RedisClient : retrieve cache item. region=hibernate.com.foo.bar.model.item.ItemCategory, key=com.foo.bar.model.item.ItemCategory#3, value={parent=null, lastModifiedDate=2016-12-14 09:30:48.0, lastModifiedBy=1, enabled=true, sonNumber=2, _subclass=com.foo.bar.model.item.ItemCategory, createBy=1, children=3, background=1, slide=0, root=3, name=foo, _lazyPropertiesUnfetched=false, _version=null, createDate=2016-12-14 09:29:56.0}
Hibernate: select user0_.id as id1_59_0_, user0_.create_by_id as create_11_59_0_, user0_.create_date as create_d2_59_0_, user0_.last_modified_by_id as last_mo12_59_0_, user0_.last_modified_date as last_mod3_59_0_, user0_.avatar_id as avatar_13_59_0_, user0_.email as email4_59_0_, user0_.enabled as enabled5_59_0_, user0_.gender as gender6_59_0_, user0_.nickname as nickname7_59_0_, user0_.phone as phone8_59_0_, user0_.seller_auth_info_id as seller_14_59_0_, user0_.seller_auth_status as seller_a9_59_0_, user0_.user_ext_id as user_ex15_59_0_, user0_.user_group_id as user_gr16_59_0_, user0_.username as usernam10_59_0_, user1_.id as id1_59_1_, user1_.create_by_id as create_11_59_1_, user1_.create_date as create_d2_59_1_, user1_.last_modified_by_id as last_mo12_59_1_, user1_.last_modified_date as last_mod3_59_1_, user1_.avatar_id as avatar_13_59_1_, user1_.email as email4_59_1_, user1_.enabled as enabled5_59_1_, user1_.gender as gender6_59_1_, user1_.nickname as nickname7_59_1_, user1_.phone as phone8_59_1_, user1_.seller_auth_info_id as seller_14_59_1_, user1_.seller_auth_status as seller_a9_59_1_, user1_.user_ext_id as user_ex15_59_1_, user1_.user_group_id as user_gr16_59_1_, user1_.username as usernam10_59_1_, user2_.id as id1_59_2_, user2_.create_by_id as create_11_59_2_, user2_.create_date as create_d2_59_2_, user2_.last_modified_by_id as last_mo12_59_2_, user2_.last_modified_date as last_mod3_59_2_, user2_.avatar_id as avatar_13_59_2_, user2_.email as email4_59_2_, user2_.enabled as enabled5_59_2_, user2_.gender as gender6_59_2_, user2_.nickname as nickname7_59_2_, user2_.phone as phone8_59_2_, user2_.seller_auth_info_id as seller_14_59_2_, user2_.seller_auth_status as seller_a9_59_2_, user2_.user_ext_id as user_ex15_59_2_, user2_.user_group_id as user_gr16_59_2_, user2_.username as usernam10_59_2_, usergroup3_.id as id1_65_3_, usergroup3_.create_by_id as create_b5_65_3_, usergroup3_.create_date as create_d2_65_3_, usergroup3_.last_modified_by_id as last_mod6_65_3_, usergroup3_.last_modified_date as last_mod3_65_3_, usergroup3_.name as name4_65_3_ from user user0_ left outer join user user1_ on user0_.create_by_id=user1_.id left outer join user user2_ on user1_.last_modified_by_id=user2_.id left outer join user_group usergroup3_ on user1_.user_group_id=usergroup3_.id where user0_.id=?
Hibernate: select usergroup0_.id as id1_65_0_, usergroup0_.create_by_id as create_b5_65_0_, usergroup0_.create_date as create_d2_65_0_, usergroup0_.last_modified_by_id as last_mod6_65_0_, usergroup0_.last_modified_date as last_mod3_65_0_, usergroup0_.name as name4_65_0_, user1_.id as id1_59_1_, user1_.create_by_id as create_11_59_1_, user1_.create_date as create_d2_59_1_, user1_.last_modified_by_id as last_mo12_59_1_, user1_.last_modified_date as last_mod3_59_1_, user1_.avatar_id as avatar_13_59_1_, user1_.email as email4_59_1_, user1_.enabled as enabled5_59_1_, user1_.gender as gender6_59_1_, user1_.nickname as nickname7_59_1_, user1_.phone as phone8_59_1_, user1_.seller_auth_info_id as seller_14_59_1_, user1_.seller_auth_status as seller_a9_59_1_, user1_.user_ext_id as user_ex15_59_1_, user1_.user_group_id as user_gr16_59_1_, user1_.username as usernam10_59_1_, user2_.id as id1_59_2_, user2_.create_by_id as create_11_59_2_, user2_.create_date as create_d2_59_2_, user2_.last_modified_by_id as last_mo12_59_2_, user2_.last_modified_date as last_mod3_59_2_, user2_.avatar_id as avatar_13_59_2_, user2_.email as email4_59_2_, user2_.enabled as enabled5_59_2_, user2_.gender as gender6_59_2_, user2_.nickname as nickname7_59_2_, user2_.phone as phone8_59_2_, user2_.seller_auth_info_id as seller_14_59_2_, user2_.seller_auth_status as seller_a9_59_2_, user2_.user_ext_id as user_ex15_59_2_, user2_.user_group_id as user_gr16_59_2_, user2_.username as usernam10_59_2_, user3_.id as id1_59_3_, user3_.create_by_id as create_11_59_3_, user3_.create_date as create_d2_59_3_, user3_.last_modified_by_id as last_mo12_59_3_, user3_.last_modified_date as last_mod3_59_3_, user3_.avatar_id as avatar_13_59_3_, user3_.email as email4_59_3_, user3_.enabled as enabled5_59_3_, user3_.gender as gender6_59_3_, user3_.nickname as nickname7_59_3_, user3_.phone as phone8_59_3_, user3_.seller_auth_info_id as seller_14_59_3_, user3_.seller_auth_status as seller_a9_59_3_, user3_.user_ext_id as user_ex15_59_3_, user3_.user_group_id as user_gr16_59_3_, user3_.username as usernam10_59_3_, usergroup4_.id as id1_65_4_, usergroup4_.create_by_id as create_b5_65_4_, usergroup4_.create_date as create_d2_65_4_, usergroup4_.last_modified_by_id as last_mod6_65_4_, usergroup4_.last_modified_date as last_mod3_65_4_, usergroup4_.name as name4_65_4_, user5_.id as id1_59_5_, user5_.create_by_id as create_11_59_5_, user5_.create_date as create_d2_59_5_, user5_.last_modified_by_id as last_mo12_59_5_, user5_.last_modified_date as last_mod3_59_5_, user5_.avatar_id as avatar_13_59_5_, user5_.email as email4_59_5_, user5_.enabled as enabled5_59_5_, user5_.gender as gender6_59_5_, user5_.nickname as nickname7_59_5_, user5_.phone as phone8_59_5_, user5_.seller_auth_info_id as seller_14_59_5_, user5_.seller_auth_status as seller_a9_59_5_, user5_.user_ext_id as user_ex15_59_5_, user5_.user_group_id as user_gr16_59_5_, user5_.username as usernam10_59_5_, authoritie6_.user_group_id as user_gro1_66_6_, authoritie6_.authorities as authorit2_66_6_ from user_group usergroup0_ left outer join user user1_ on usergroup0_.create_by_id=user1_.id left outer join user user2_ on user1_.create_by_id=user2_.id left outer join user user3_ on user1_.last_modified_by_id=user3_.id left outer join user_group usergroup4_ on user1_.user_group_id=usergroup4_.id left outer join user user5_ on usergroup0_.last_modified_by_id=user5_.id left outer join user_group_authorities authoritie6_ on usergroup0_.id=authoritie6_.user_group_id where usergroup0_.id=?
2017-03-06 14:50:01.830 TRACE 30295 --- [nio-8080-exec-3] o.h.cache.redis.client.RedisClient : retrieve cache item. region=hibernate.com.foo.bar.model.item.ItemCategory, key=com.foo.bar.model.item.ItemCategory#1, value={parent=null, lastModifiedDate=2016-12-05 09:31:51.0, lastModifiedBy=1, enabled=true, sonNumber=1, _subclass=com.foo.bar.model.item.ItemCategory, createBy=1, children=1, background=1, slide=0, root=1, name=bar, _lazyPropertiesUnfetched=false, _version=null, createDate=2016-12-05 09:31:28.0}
2017-03-06 14:51:02.165 INFO 30295 --- [nio-8080-exec-3] i.StatisticalLoggingSessionEventListener : Session Metrics {
15435533 nanoseconds spent acquiring 1 JDBC connections;
0 nanoseconds spent releasing 0 JDBC connections;
1405433 nanoseconds spent preparing 2 JDBC statements;
2301936 nanoseconds spent executing 2 JDBC statements;
0 nanoseconds spent executing 0 JDBC batches;
0 nanoseconds spent performing 0 L2C puts;
64020073 nanoseconds spent performing 3 L2C hits;
27037450 nanoseconds spent performing 2 L2C misses;
1247578 nanoseconds spent executing 1 flushes (flushing a total of 4 entities and 3 collections);
24403 nanoseconds spent executing 1 partial-flushes (flushing a total of 0 entities and 0 collections)
}
application.yml:
spring:
  profiles: development
  jpa:
    show-sql: true
    properties:
      hibernate.cache.use_second_level_cache: true
      hibernate.cache.region.factory_class: org.hibernate.cache.redis.hibernate5.SingletonRedisRegionFactory
      hibernate.cache.use_query_cache: true
      hibernate.cache.region_prefix: hibernate
      hibernate.generate_statistics: true
      hibernate.cache.use_structured_entries: true
      redisson-config: classpath:redisson.yml
      hibernate.cache.use_reference_entries: true
      javax.persistence.sharedCache.mode: ENABLE_SELECTIVE

PostgreSQL with ODBC: "there is no parameter $1"

I have an insert statement:
INSERT INTO billData (
tmStart, tsDuration, eCallDir, ...
) VALUES (
$1, -- tmStart
$2, -- tsDuration
$3, -- eCallDir
...
);
I use SQLPrepare to compile it, bind parameters with SQLBindParameter, and execute it with SQLExecute.
After all of these steps, error code 42P02 (there is no parameter $1) is returned.
BTW: I'm using almost the same code for MS SQL Server and MySQL, and those two DBs work very well, so I believe my code is correct.
PS: the PostgreSQL is v9.1; the psqlODBC is v9.01.0100-1.
============================================================
UPDATE:
And the following error occurred: 42601 (syntax error at or near ",") when using '?' as the parameter placeholder:
INSERT INTO billData (
tmStart, tsDuration, eCallDir, ...
) VALUES (
?, -- tmStart
?, -- tsDuration
?, -- eCallDir
...
);
============================================================
UPDATE:
Following the suggestion from j.w.r, it works after adding the UseServerSidePrepare=1 option to the ODBC connection string.
Many thanks :-)
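For reference, a sketch of a connection string with that option (driver name, host, database, and credentials are placeholders, not from the question):

Driver={PostgreSQL Unicode};Server=localhost;Port=5432;Database=billing;Uid=app;Pwd=secret;UseServerSidePrepare=1;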
While I am sure prepared statements work well with PostgreSQL ODBC, I think something is wrong with this statement code. At first I would remove the comments to make this INSERT as simple as possible. I have just tried a multiline INSERT with comments and a semicolon at the end and it worked well. I adapted my test Python code a little:
import odbc
db = odbc.odbc('odbcalias/user/password')
c = db.cursor()
r = c.execute("""INSERT INTO billData (
tmStart, tsDuration, eCallDir, ...
) VALUES (
?, -- tmStart
?, -- tsDuration
?, -- eCallDir
...
);""", (tmStart, tsDuration, eCallDir, ))
print('records inserted: %s' % (r))
Can you try it with ActivePython 2.7 (it already includes the odbc module)?
If it works, enable ODBC tracing and test this program. The ODBC SQL trace file should contain entries like:
...
python.exe odbc 924-e8c ENTER SQLExecDirect
HSTMT 00A029A8
UCHAR * 0x00B5A480 [ -3] "INSERT INTO test_table(txt) VALUES (?)\ 0"
SDWORD -3
python.exe odbc 924-e8c EXIT SQLExecDirect with return code 0 (SQL_SUCCESS)
HSTMT 00A029A8
UCHAR * 0x00B5A480 [ -3] "INSERT INTO test_table(txt) VALUES (?)\ 0"
SDWORD -3
python.exe odbc 924-e8c ENTER SQLNumResultCols
HSTMT 00A029A8
SWORD * 0x0021FAA0
python.exe odbc 924-e8c EXIT SQLNumResultCols with return code 0 (SQL_SUCCESS)
HSTMT 00A029A8
SWORD * 0x0021FAA0 (0)
python.exe odbc 924-e8c ENTER SQLRowCount
HSTMT 00A029A8
SQLLEN * 0x0021FBDC
python.exe odbc 924-e8c EXIT SQLRowCount with return code 0 (SQL_SUCCESS)
HSTMT 00A029A8
SQLLEN * 0x0021FBDC (1)
Then make a similar trace for your application and try to locate what is different.
Note: on top of the UseServerSidePrepare=1 fix above, the ByteaAsLongVarBinary=1 option should also be added if you want to parameterize a bytea field with SQLBindParameter.