OpenJPA - @TableGenerator with negative values running in infinite loop - openjpa

@TableGenerator(name = "ParticipantGen", schema = "sa", table = "ADP_TBL_OID", pkColumnName = "TYPE_ID", pkColumnValue = "5321", valueColumnName = "OBJ_NUM", allocationSize = 50)
@Id
@GeneratedValue(strategy = GenerationType.TABLE, generator = "ParticipantGen")
private BigInteger objid;
I have the configuration above to generate the primary key for a table.
The next value for the key is a negative number.
When I run the flow, OpenJPA does not accept negative numbers and loops indefinitely trying to reach a positive one.
Because of this the application blocks, since the DB thread inserting the table record is never released.
Would be great if somebody could help here.
Log:
302378 openjpa TRACE [[ACTIVE] ExecuteThread: '4' for queue: 'weblogic.kernel.Default (self-tuning)'] openjpa.jdbc.SQL - <t -1257420086, conn 73905> [0 ms] spent
302378 openjpa TRACE [[ACTIVE] ExecuteThread: '4' for queue: 'weblogic.kernel.Default (self-tuning)'] openjpa.jdbc.SQL - <t -1257420086, conn 73905> executing prepstmnt 277699 UPDATE SA.ADP_TBL_OID SET OBJ_NUM = ? WHERE TYPE_ID = ? AND OBJ_NUM = ? [params=(long) -2116596711, (String) 5321, (long) -2116596761]
302378 openjpa TRACE [[ACTIVE] ExecuteThread: '4' for queue: 'weblogic.kernel.Default (self-tuning)'] openjpa.jdbc.SQL - <t -1257420086, conn 73905> [0 ms] spent
302379 openjpa TRACE [[ACTIVE] ExecuteThread: '4' for queue: 'weblogic.kernel.Default (self-tuning)'] openjpa.jdbc.SQL - <t -1257420086, conn 73905> executing prepstmnt 277700 SELECT OBJ_NUM FROM SA.ADP_TBL_OID WHERE TYPE_ID = ? FOR UPDATE [params=(String) 5321]
302379 openjpa TRACE [[ACTIVE] ExecuteThread: '4' for queue: 'weblogic.kernel.Default (self-tuning)'] openjpa.jdbc.SQL - <t -1257420086, conn 73905> [0 ms] spent
302379 openjpa TRACE [[ACTIVE] ExecuteThread: '4' for queue: 'weblogic.kernel.Default (self-tuning)'] openjpa.jdbc.SQL - <t -1257420086, conn 73905> executing prepstmnt 277702 UPDATE SA.ADP_TBL_OID SET OBJ_NUM = ? WHERE TYPE_ID = ? AND OBJ_NUM = ? [params=(long) -2116596661, (String) 5321, (long) -2116596711]
302380 openjpa TRACE [[ACTIVE] ExecuteThread: '4' for queue: 'weblogic.kernel.Default (self-tuning)'] openjpa.jdbc.SQL - <t -1257420086, conn 73905> [1 ms] spent
302380 openjpa TRACE [[ACTIVE] ExecuteThread: '4' for queue: 'weblogic.kernel.Default (self-tuning)'] openjpa.jdbc.SQL - <t -1257420086, conn 73905> executing prepstmnt 277703 SELECT OBJ_NUM FROM SA.ADP_TBL_OID WHERE TYPE_ID = ? FOR UPDATE [params=(String) 5321]
302381 openjpa TRACE [[ACTIVE] ExecuteThread: '4' for queue: 'weblogic.kernel.Default (self-tuning)'] openjpa.jdbc.SQL - <t -1257420086, conn 73905> [1 ms] spent
302381 openjpa TRACE [[ACTIVE] ExecuteThread: '4' for queue: 'weblogic.kernel.Default (self-tuning)'] openjpa.jdbc.SQL - <t -1257420086, conn 73905> executing prepstmnt 277705 UPDATE SA.ADP_TBL_OID SET OBJ_NUM = ? WHERE TYPE_ID = ? AND OBJ_NUM = ? [params=(long) -2116596611, (String) 5321, (long) -2116596661]
302381 openjpa TRACE [[ACTIVE] ExecuteThread: '4' for queue: 'weblogic.kernel.Default (self-tuning)'] openjpa.jdbc.SQL - <t -1257420086, conn 73905> [0 ms] spent
302381 openjpa TRACE [[ACTIVE] ExecuteThread: '4' for queue: 'weblogic.kernel.Default (self-tuning)'] openjpa.jdbc.SQL - <t -1257420086, conn 73905> executing prepstmnt 277706 SELECT OBJ_NUM FROM SA.ADP_TBL_OID WHERE TYPE_ID = ? FOR UPDATE [params=(String) 5321]
302382 openjpa TRACE [[ACTIVE] ExecuteThread: '4' for queue: 'weblogic.kernel.Default (self-tuning)'] openjpa.jdbc.SQL - <t -1257420086, conn 73905> [1 ms] spent
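The log shows the stored OBJ_NUM climbing by the allocation size (50) on every round trip. A back-of-the-envelope sketch of why this looks like an infinite loop (a hypothetical model, not OpenJPA's actual code): starting from the negative value seen in the log, the generator would need tens of millions of UPDATE/SELECT pairs before the stored value turns positive.

```java
public class AllocationLoop {
    // Hypothetical model: estimate how many UPDATE/SELECT round trips a table
    // generator needs to climb from a negative stored value to a non-negative
    // one when each round trip adds allocationSize.
    static long roundTripsToPositive(long storedValue, long allocationSize) {
        if (storedValue >= 0) {
            return 0;
        }
        // ceiling division: ceil(-storedValue / allocationSize)
        return (-storedValue + allocationSize - 1) / allocationSize;
    }

    public static void main(String[] args) {
        long seed = -2116596761L; // value taken from the question's log
        System.out.println(roundTripsToPositive(seed, 50)); // about 42.3 million
    }
}
```

Even at a millisecond per round trip that is hours of work while holding the row lock, which matches the blocked DB thread described above; manually resetting the OBJ_NUM row to a positive value in the database is the practical way out.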

Related

JPA jsonb query issue with positional argument

I'm having an issue with JPA and jsonb querying. My JPA provider is EclipseLink and I use a Postgres DB.
My query is:
with values as(select id, inputspecifications as spec from process where commercial = True and inputspecifications #> '[{\"type\":\"raster\"}]') select id from values where (spec -> 'platforms' is null or (spec -> 'platforms' -> 'satellites' is not null and (spec -> 'platforms' -> 'satellites' ?& array['310802']))
The query works fine until the array inclusion comparison (last bit). It seems JPA is seeing ?& as a positional argument, as per the fine logs
[EL Warning]: sql: 2022-11-07 10:22:05.336--ServerSession(65586123)--Missing Query parameter for named argument: & "null" will be substituted.
[EL Fine]: sql: 2022-11-07 10:22:05.336--ServerSession(65586123)--Connection(1463355115)--with values as(select id, inputspecifications as spec from process where commercial = True and inputspecifications #> '[{"type":"raster"}]') select id from values where (spec -> 'platforms' is null or (spec -> 'platforms' -> 'satellites' is not null and (spec -> 'platforms' -> 'satellites' ? array['310802']))
bind => [null]
[EL Fine]: sql: 2022-11-07 10:22:05.343--ServerSession(65586123)--SELECT 1
[EL Warning]: 2022-11-07 10:22:05.344--UnitOfWork(446445803)--Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.7.6.v20200131-b7c997804f): org.eclipse.persistence.exceptions.DatabaseException
Internal Exception: org.postgresql.util.PSQLException: ERROR: syntax error at or near "$1"
Position: 293
Error Code: 0
I have tried escaping in various ways, i.e. \?&, \?\&, ... all fail one way or another.
Any idea how to make jpa NOT see ?& as a positional parameter?
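One commonly suggested workaround (an assumption to verify against your PostgreSQL version, not something EclipseLink documents for this case) is to avoid the operator spelling entirely: PostgreSQL exposes function equivalents for the jsonb existence operators, e.g. `jsonb_exists_all` for `?&`, so the driver never sees a `?` to mistake for a parameter marker.

```sql
-- Function forms of the jsonb existence operators:
--   col ?  'x'            =>  jsonb_exists(col, 'x')
--   col ?| array['x','y'] =>  jsonb_exists_any(col, array['x','y'])
--   col ?& array['x','y'] =>  jsonb_exists_all(col, array['x','y'])
-- The last predicate of the query above would become:
jsonb_exists_all(spec -> 'platforms' -> 'satellites', array['310802'])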

Does cascade happen in 1 transaction?

I save the Product, which cascade-persists the ProductMaterial. However, when the ProductMaterial insert throws DataIntegrityViolationException, the Product is rolled back too, which looks like the cascade runs in one transaction, but I can't find any docs saying that it does. Can someone clarify this for me?
NOTE: I do NOT use @Transactional.
Material material = new Material();
material.setId(1);
Product newProduct = new Product();
ProductMaterial productMaterial = new ProductMaterial();
newProduct.setName("bàn chải");
newProduct.setPrice(1000);
newProduct.setCreatedAt(new Date());
newProduct.setProductMaterials(Collections.singletonList(productMaterial));
productMaterial.setProduct(newProduct);
productMaterial.setMaterial(material);
productRepository.save(newProduct);
Here is the hibernate execution:
Hibernate:
/* insert com.vietnam.hanghandmade.entities.Product
*/ insert
into
product
(created_at, name, price, id)
values
(?, ?, ?, ?)
2020-11-10 14:55:38.281 TRACE 65729 --- [nio-8080-exec-2] o.h.type.descriptor.sql.BasicBinder : binding parameter [1] as [TIMESTAMP] - [Tue Nov 10 14:55:38 JST 2020]
2020-11-10 14:55:38.281 TRACE 65729 --- [nio-8080-exec-2] o.h.type.descriptor.sql.BasicBinder : binding parameter [2] as [VARCHAR] - [bàn chải]
2020-11-10 14:55:38.281 TRACE 65729 --- [nio-8080-exec-2] o.h.type.descriptor.sql.BasicBinder : binding parameter [3] as [INTEGER] - [1000]
2020-11-10 14:55:38.281 TRACE 65729 --- [nio-8080-exec-2] o.h.type.descriptor.sql.BasicBinder : binding parameter [4] as [OTHER] - [e5729490-a0f8-48e7-9600-eeeba8b8f279]
Hibernate:
/* insert com.vietnam.hanghandmade.entities.ProductMaterial
*/ insert
into
product_material
(material_id, product_id)
values
(?, ?)
2020-11-10 14:55:38.324 TRACE 65729 --- [nio-8080-exec-2] o.h.type.descriptor.sql.BasicBinder : binding parameter [1] as [INTEGER] - [1]
2020-11-10 14:55:38.324 TRACE 65729 --- [nio-8080-exec-2] o.h.type.descriptor.sql.BasicBinder : binding parameter [2] as [OTHER] - [e5729490-a0f8-48e7-9600-eeeba8b8f279]
2020-11-10 14:55:38.328 WARN 65729 --- [nio-8080-exec-2] o.h.engine.jdbc.spi.SqlExceptionHelper : SQL Error: 0, SQLState: 23503
2020-11-10 14:55:38.328 ERROR 65729 --- [nio-8080-exec-2] o.h.engine.jdbc.spi.SqlExceptionHelper : ERROR: insert or update on table "product_material" violates foreign key constraint "product_material_material_id_fkey"
Detail: Key (material_id)=(1) is not present in table "material".
NOTE: This answer missed the point of the question, which is about “cascading persist” – it talks about “cascading delete” for foreign keys.
The cascading delete or update is part of the action of the system trigger that implements foreign key constraints, and as such it runs in the same transaction as the triggering statement.
I cannot find a place in the fine manual that spells this out, but it is obvious if you think about it: if the cascading delete were run in a separate transaction, it would be possible that the delete succeeds and the cascading delete fails, which would render the database inconsistent and is consequently not an option.
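On the actual question (cascading persist rather than delete): Spring Data JPA's default repository implementation, SimpleJpaRepository, marks its save method as transactional, so even without your own @Transactional the parent insert and the cascaded child insert run in one transaction and commit or fail together. A toy model of that all-or-nothing behavior (plain Java, no JPA; names are illustrative):

```java
import java.util.ArrayList;
import java.util.List;

public class CascadeTxDemo {
    // Toy model: the parent and cascaded child inserts belong to one unit of
    // work. If any statement fails, nothing is committed.
    static List<String> saveWithCascade(boolean childViolatesFk) {
        List<String> committed = new ArrayList<>();
        List<String> pending = new ArrayList<>();
        pending.add("insert product");
        if (childViolatesFk) {
            // FK violation on product_material: roll back the whole unit,
            // including the already-executed parent insert
            return committed; // empty
        }
        pending.add("insert product_material");
        committed.addAll(pending); // commit both statements together
        return committed;
    }

    public static void main(String[] args) {
        System.out.println(saveWithCascade(true));  // []
        System.out.println(saveWithCascade(false)); // [insert product, insert product_material]
    }
}
```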

hibernate second level cache can get lazy-loading entity when i set breakpoint

I use Spring Data JPA and the Hibernate second-level cache via hibernate-redis in my project. I use @Transactional for lazy loading, but the cache misses when I run the application. If I debug it and set a breakpoint to wait for some time, it works and retrieves the entry from Redis. Here is the code:
Entity ItemCategory:
@Entity
@Cacheable
public class ItemCategory extends BaseModel {

    @NotNull
    @Column(updatable = false)
    private String name;

    @JsonBackReference
    @ManyToOne(fetch = FetchType.LAZY)
    private ItemCategory root;
}
Entity Item:
@Entity
@Cacheable
public class Item extends BaseModel {

    @ManyToOne(fetch = FetchType.EAGER)
    private ItemCategory category;
}
Repository:
@Repository
public interface ItemCategoryRepository extends JpaRepository<ItemCategory, Long> {

    @QueryHints(value = {
        @QueryHint(name = "org.hibernate.cacheable", value = "true")
    })
    @Query("select distinct i.category.root from Item i where i.store.id = :id and i.category.parent.id = i.category.root.id")
    List<ItemCategory> findByStoreId(@Param("id") Long id);
}
Cache miss:
2017-03-06 14:49:30.105 TRACE 30295 --- [nio-8080-exec-2] o.h.cache.redis.client.RedisClient : retrieve cache item. region=hibernate.org.hibernate.cache.internal.StandardQueryCache, key=sql: select distinct itemcatego2_.id as id1_21_, itemcatego2_.create_by_id as create_b8_21_, itemcatego2_.create_date as create_d2_21_, itemcatego2_.last_modified_by_id as last_mod9_21_, itemcatego2_.last_modified_date as last_mod3_21_, itemcatego2_.background_id as backgro10_21_, itemcatego2_.enabled as enabled4_21_, itemcatego2_.name as name5_21_, itemcatego2_.parent_id as parent_11_21_, itemcatego2_.root_id as root_id12_21_, itemcatego2_.slide as slide6_21_, itemcatego2_.son_number as son_numb7_21_ from item item0_ inner join item_category itemcatego1_ on item0_.category_id=itemcatego1_.id inner join item_category itemcatego2_ on itemcatego1_.root_id=itemcatego2_.id where item0_.store_id=? and itemcatego1_.parent_id=itemcatego1_.root_id; parameters: ; named parameters: {id=4}; transformer: org.hibernate.transform.CacheableResultTransformer#110f2, value=[6098054966726656, 3, 1]
2017-03-06 14:49:30.116 TRACE 30295 --- [nio-8080-exec-2] o.h.cache.redis.client.RedisClient : retrieve cache item. region=hibernate.org.hibernate.cache.spi.UpdateTimestampsCache, key=item, value=null
2017-03-06 14:49:30.127 TRACE 30295 --- [nio-8080-exec-2] o.h.cache.redis.client.RedisClient : retrieve cache item. region=hibernate.org.hibernate.cache.spi.UpdateTimestampsCache, key=item_category, value=null
2017-03-06 14:49:41.971 INFO 30295 --- [nio-8080-exec-2] i.StatisticalLoggingSessionEventListener : Session Metrics {
974551 nanoseconds spent acquiring 1 JDBC connections;
0 nanoseconds spent releasing 0 JDBC connections;
0 nanoseconds spent preparing 0 JDBC statements;
0 nanoseconds spent executing 0 JDBC statements;
0 nanoseconds spent executing 0 JDBC batches;
0 nanoseconds spent performing 0 L2C puts;
19881210 nanoseconds spent performing 1 L2C hits;
24082571 nanoseconds spent performing 2 L2C misses;
0 nanoseconds spent executing 0 flushes (flushing a total of 0 entities and 0 collections);
26331 nanoseconds spent executing 1 partial-flushes (flushing a total of 0 entities and 0 collections)
}
If I debug and set a breakpoint to wait for some time (does not work every time):
2017-03-06 14:50:00.565 TRACE 30295 --- [nio-8080-exec-3] o.h.cache.redis.client.RedisClient : retrieve cache item. region=hibernate.org.hibernate.cache.internal.StandardQueryCache, key=sql: select distinct itemcatego2_.id as id1_21_, itemcatego2_.create_by_id as create_b8_21_, itemcatego2_.create_date as create_d2_21_, itemcatego2_.last_modified_by_id as last_mod9_21_, itemcatego2_.last_modified_date as last_mod3_21_, itemcatego2_.background_id as backgro10_21_, itemcatego2_.enabled as enabled4_21_, itemcatego2_.name as name5_21_, itemcatego2_.parent_id as parent_11_21_, itemcatego2_.root_id as root_id12_21_, itemcatego2_.slide as slide6_21_, itemcatego2_.son_number as son_numb7_21_ from item item0_ inner join item_category itemcatego1_ on item0_.category_id=itemcatego1_.id inner join item_category itemcatego2_ on itemcatego1_.root_id=itemcatego2_.id where item0_.store_id=? and itemcatego1_.parent_id=itemcatego1_.root_id; parameters: ; named parameters: {id=4}; transformer: org.hibernate.transform.CacheableResultTransformer#110f2, value=[6098054966726656, 3, 1]
2017-03-06 14:50:00.584 TRACE 30295 --- [nio-8080-exec-3] o.h.cache.redis.client.RedisClient : retrieve cache item. region=hibernate.org.hibernate.cache.spi.UpdateTimestampsCache, key=item, value=null
2017-03-06 14:50:00.595 TRACE 30295 --- [nio-8080-exec-3] o.h.cache.redis.client.RedisClient : retrieve cache item. region=hibernate.org.hibernate.cache.spi.UpdateTimestampsCache, key=item_category, value=null
2017-03-06 14:50:01.805 TRACE 30295 --- [nio-8080-exec-3] o.h.cache.redis.client.RedisClient : retrieve cache item. region=hibernate.com.foo.bar.model.item.ItemCategory, key=com.foo.bar.model.item.ItemCategory#3, value={parent=null, lastModifiedDate=2016-12-14 09:30:48.0, lastModifiedBy=1, enabled=true, sonNumber=2, _subclass=com.foo.bar.model.item.ItemCategory, createBy=1, children=3, background=1, slide=0, root=3, name=foo, _lazyPropertiesUnfetched=false, _version=null, createDate=2016-12-14 09:29:56.0}
Hibernate: select user0_.id as id1_59_0_, user0_.create_by_id as create_11_59_0_, user0_.create_date as create_d2_59_0_, user0_.last_modified_by_id as last_mo12_59_0_, user0_.last_modified_date as last_mod3_59_0_, user0_.avatar_id as avatar_13_59_0_, user0_.email as email4_59_0_, user0_.enabled as enabled5_59_0_, user0_.gender as gender6_59_0_, user0_.nickname as nickname7_59_0_, user0_.phone as phone8_59_0_, user0_.seller_auth_info_id as seller_14_59_0_, user0_.seller_auth_status as seller_a9_59_0_, user0_.user_ext_id as user_ex15_59_0_, user0_.user_group_id as user_gr16_59_0_, user0_.username as usernam10_59_0_, user1_.id as id1_59_1_, user1_.create_by_id as create_11_59_1_, user1_.create_date as create_d2_59_1_, user1_.last_modified_by_id as last_mo12_59_1_, user1_.last_modified_date as last_mod3_59_1_, user1_.avatar_id as avatar_13_59_1_, user1_.email as email4_59_1_, user1_.enabled as enabled5_59_1_, user1_.gender as gender6_59_1_, user1_.nickname as nickname7_59_1_, user1_.phone as phone8_59_1_, user1_.seller_auth_info_id as seller_14_59_1_, user1_.seller_auth_status as seller_a9_59_1_, user1_.user_ext_id as user_ex15_59_1_, user1_.user_group_id as user_gr16_59_1_, user1_.username as usernam10_59_1_, user2_.id as id1_59_2_, user2_.create_by_id as create_11_59_2_, user2_.create_date as create_d2_59_2_, user2_.last_modified_by_id as last_mo12_59_2_, user2_.last_modified_date as last_mod3_59_2_, user2_.avatar_id as avatar_13_59_2_, user2_.email as email4_59_2_, user2_.enabled as enabled5_59_2_, user2_.gender as gender6_59_2_, user2_.nickname as nickname7_59_2_, user2_.phone as phone8_59_2_, user2_.seller_auth_info_id as seller_14_59_2_, user2_.seller_auth_status as seller_a9_59_2_, user2_.user_ext_id as user_ex15_59_2_, user2_.user_group_id as user_gr16_59_2_, user2_.username as usernam10_59_2_, usergroup3_.id as id1_65_3_, usergroup3_.create_by_id as create_b5_65_3_, usergroup3_.create_date as create_d2_65_3_, usergroup3_.last_modified_by_id as last_mod6_65_3_, 
usergroup3_.last_modified_date as last_mod3_65_3_, usergroup3_.name as name4_65_3_ from user user0_ left outer join user user1_ on user0_.create_by_id=user1_.id left outer join user user2_ on user1_.last_modified_by_id=user2_.id left outer join user_group usergroup3_ on user1_.user_group_id=usergroup3_.id where user0_.id=?
Hibernate: select usergroup0_.id as id1_65_0_, usergroup0_.create_by_id as create_b5_65_0_, usergroup0_.create_date as create_d2_65_0_, usergroup0_.last_modified_by_id as last_mod6_65_0_, usergroup0_.last_modified_date as last_mod3_65_0_, usergroup0_.name as name4_65_0_, user1_.id as id1_59_1_, user1_.create_by_id as create_11_59_1_, user1_.create_date as create_d2_59_1_, user1_.last_modified_by_id as last_mo12_59_1_, user1_.last_modified_date as last_mod3_59_1_, user1_.avatar_id as avatar_13_59_1_, user1_.email as email4_59_1_, user1_.enabled as enabled5_59_1_, user1_.gender as gender6_59_1_, user1_.nickname as nickname7_59_1_, user1_.phone as phone8_59_1_, user1_.seller_auth_info_id as seller_14_59_1_, user1_.seller_auth_status as seller_a9_59_1_, user1_.user_ext_id as user_ex15_59_1_, user1_.user_group_id as user_gr16_59_1_, user1_.username as usernam10_59_1_, user2_.id as id1_59_2_, user2_.create_by_id as create_11_59_2_, user2_.create_date as create_d2_59_2_, user2_.last_modified_by_id as last_mo12_59_2_, user2_.last_modified_date as last_mod3_59_2_, user2_.avatar_id as avatar_13_59_2_, user2_.email as email4_59_2_, user2_.enabled as enabled5_59_2_, user2_.gender as gender6_59_2_, user2_.nickname as nickname7_59_2_, user2_.phone as phone8_59_2_, user2_.seller_auth_info_id as seller_14_59_2_, user2_.seller_auth_status as seller_a9_59_2_, user2_.user_ext_id as user_ex15_59_2_, user2_.user_group_id as user_gr16_59_2_, user2_.username as usernam10_59_2_, user3_.id as id1_59_3_, user3_.create_by_id as create_11_59_3_, user3_.create_date as create_d2_59_3_, user3_.last_modified_by_id as last_mo12_59_3_, user3_.last_modified_date as last_mod3_59_3_, user3_.avatar_id as avatar_13_59_3_, user3_.email as email4_59_3_, user3_.enabled as enabled5_59_3_, user3_.gender as gender6_59_3_, user3_.nickname as nickname7_59_3_, user3_.phone as phone8_59_3_, user3_.seller_auth_info_id as seller_14_59_3_, user3_.seller_auth_status as seller_a9_59_3_, user3_.user_ext_id as 
user_ex15_59_3_, user3_.user_group_id as user_gr16_59_3_, user3_.username as usernam10_59_3_, usergroup4_.id as id1_65_4_, usergroup4_.create_by_id as create_b5_65_4_, usergroup4_.create_date as create_d2_65_4_, usergroup4_.last_modified_by_id as last_mod6_65_4_, usergroup4_.last_modified_date as last_mod3_65_4_, usergroup4_.name as name4_65_4_, user5_.id as id1_59_5_, user5_.create_by_id as create_11_59_5_, user5_.create_date as create_d2_59_5_, user5_.last_modified_by_id as last_mo12_59_5_, user5_.last_modified_date as last_mod3_59_5_, user5_.avatar_id as avatar_13_59_5_, user5_.email as email4_59_5_, user5_.enabled as enabled5_59_5_, user5_.gender as gender6_59_5_, user5_.nickname as nickname7_59_5_, user5_.phone as phone8_59_5_, user5_.seller_auth_info_id as seller_14_59_5_, user5_.seller_auth_status as seller_a9_59_5_, user5_.user_ext_id as user_ex15_59_5_, user5_.user_group_id as user_gr16_59_5_, user5_.username as usernam10_59_5_, authoritie6_.user_group_id as user_gro1_66_6_, authoritie6_.authorities as authorit2_66_6_ from user_group usergroup0_ left outer join user user1_ on usergroup0_.create_by_id=user1_.id left outer join user user2_ on user1_.create_by_id=user2_.id left outer join user user3_ on user1_.last_modified_by_id=user3_.id left outer join user_group usergroup4_ on user1_.user_group_id=usergroup4_.id left outer join user user5_ on usergroup0_.last_modified_by_id=user5_.id left outer join user_group_authorities authoritie6_ on usergroup0_.id=authoritie6_.user_group_id where usergroup0_.id=?
2017-03-06 14:50:01.830 TRACE 30295 --- [nio-8080-exec-3] o.h.cache.redis.client.RedisClient : retrieve cache item. region=hibernate.com.foo.bar.model.item.ItemCategory, key=com.foo.bar.model.item.ItemCategory#1, value={parent=null, lastModifiedDate=2016-12-05 09:31:51.0, lastModifiedBy=1, enabled=true, sonNumber=1, _subclass=com.foo.bar.model.item.ItemCategory, createBy=1, children=1, background=1, slide=0, root=1, name=bar, _lazyPropertiesUnfetched=false, _version=null, createDate=2016-12-05 09:31:28.0}
2017-03-06 14:51:02.165 INFO 30295 --- [nio-8080-exec-3] i.StatisticalLoggingSessionEventListener : Session Metrics {
15435533 nanoseconds spent acquiring 1 JDBC connections;
0 nanoseconds spent releasing 0 JDBC connections;
1405433 nanoseconds spent preparing 2 JDBC statements;
2301936 nanoseconds spent executing 2 JDBC statements;
0 nanoseconds spent executing 0 JDBC batches;
0 nanoseconds spent performing 0 L2C puts;
64020073 nanoseconds spent performing 3 L2C hits;
27037450 nanoseconds spent performing 2 L2C misses;
1247578 nanoseconds spent executing 1 flushes (flushing a total of 4 entities and 3 collections);
24403 nanoseconds spent executing 1 partial-flushes (flushing a total of 0 entities and 0 collections)
}
application.yml:
spring:
  profiles: development
  jpa:
    show-sql: true
    properties:
      hibernate.cache.use_second_level_cache: true
      hibernate.cache.region.factory_class: org.hibernate.cache.redis.hibernate5.SingletonRedisRegionFactory
      hibernate.cache.use_query_cache: true
      hibernate.cache.region_prefix: hibernate
      hibernate.generate_statistics: true
      hibernate.cache.use_structured_entries: true
      redisson-config: classpath:redisson.yml
      hibernate.cache.use_reference_entries: true
      javax.persistence.sharedCache.mode: ENABLE_SELECTIVE

Inserting data into more than one table in spring-batch using ibatis

I am using IbatisBatchItemWriter to write a complex object into multiple tables.
Here is what my object looks like:
public class SfObject{
protected List<Person> person;
}
public class Person {
protected String personId;
protected XMLGregorianCalendar dateOfBirth;
protected String countryOfBirth;
protected String regionOfBirth;
protected String placeOfBirth;
protected String birthName;
protected XMLGregorianCalendar dateOfDeath;
protected XMLGregorianCalendar lastModifiedOn;
protected List<EmailInformation> emailInformation;
}
public class EmailInformation {
protected String emailType;
protected String emailAddress;
protected XMLGregorianCalendar lastModifiedOn;
}
And here is my iBATIS configuration to insert the above objects:
<insert id="insertCompoundEmployeeData" parameterClass="com.domain.SfObject">
<iterate property="person">
insert into E_Person_Info
(person_id,
person_birth_dt,
person_country_of_birth,
person_region_of_birth,
person_place_of_birth,
person_birth_name,
person_death_dt,
last_modified_on
)
values (#person[].personId#,
#person[].dateOfBirth#,
#person[].countryOfBirth#,
#person[].regionOfBirth#,
#person[].placeOfBirth#,
#person[].birthName#,
#person[].dateOfDeath#,
#person[].lastModifiedOn#
);
<iterate property="person[].emailInformation">
insert into E_Email_Info
(email_info_person_id,
email_info_email_type,
email_info_email_address,
last_modified_on
)
values (#person[].personId#,
#person[].emailInformation[].emailType#,
#person[].emailInformation[].emailAddress#,
#person[].emailInformation[].lastModifiedOn#
);
</iterate>
</iterate>
</insert>
I am not sure whether I can use the above config to insert data into more than one table, but when I executed it I got the error below for a batch of 10 records. By the way, email information is not mandatory, so it may be null in some Person objects.
Stacktrace
[08.08.2014 17:30:07] DEBUG: WebservicePagingItemReader.doRead() - Reading page 0
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=1
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=2
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=3
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=4
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=5
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=6
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=7
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=8
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=9
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.isComplete() - Repeat is complete according to policy and result value.
[08.08.2014 17:30:09] DEBUG: IbatisBatchItemWriter.write() - Executing batch with 10 items.
[08.08.2014 17:30:09] DEBUG: SqlMapClientTemplate.execute() - Opened SqlMapSession [com.ibatis.sqlmap.engine.impl.SqlMapSessionImpl#168afdd] for iBATIS operation
[08.08.2014 17:30:10] DEBUG: Connection.debug() - {conn-100000} Connection
[08.08.2014 17:30:10] DEBUG: SqlMapClientTemplate.execute() - Obtained JDBC Connection [Transaction-aware proxy for target Connection from DataSource [org.springframework.jdbc.datasource.DriverManagerDataSource#8eae04]] for iBATIS operation
[08.08.2014 17:30:10] DEBUG: Connection.debug() - {conn-100000} Preparing Statement: insert into E_Person_Info (person_id, person_birth_dt, person_country_of_birth, person_region_of_birth, person_place_of_birth, person_birth_name, person_death_dt, last_modified_on ) values (?, ?, ?, ?, ?, ?, ?, ? );
[08.08.2014 17:30:10] DEBUG: Connection.debug() - {conn-100000} Preparing Statement: insert into E_Person_Info (person_id, person_birth_dt, person_country_of_birth, person_region_of_birth, person_place_of_birth, person_birth_name, person_death_dt, last_modified_on ) values (?, ?, ?, ?, ?, ?, ?, ? ); insert into E_Email_Info (email_info_person_id, email_info_email_type, email_info_email_address, last_modified_on ) values (?, ?, ?, ? );
[08.08.2014 17:30:10] DEBUG: Connection.debug() - {conn-100000} Preparing Statement: insert into E_Person_Info (person_id, person_birth_dt, person_country_of_birth, person_region_of_birth, person_place_of_birth, person_birth_name, person_death_dt, last_modified_on ) values (?, ?, ?, ?, ?, ?, ?, ? );
[08.08.2014 17:30:10] DEBUG: TaskletStep.doInChunkContext() - Applying contribution: [StepContribution: read=10, written=0, filtered=0, readSkips=0, writeSkips=0, processSkips=0, exitStatus=EXECUTING]
[08.08.2014 17:30:10] DEBUG: TaskletStep.doInChunkContext() - Rollback for Exception: org.springframework.dao.InvalidDataAccessResourceUsageException: Batch execution returned invalid results. Expected 1 but number of BatchResult objects returned was 3
[08.08.2014 17:30:10] DEBUG: DataSourceTransactionManager.processRollback() - Initiating transaction rollback
[08.08.2014 17:30:10] DEBUG: DataSourceTransactionManager.doRollback() - Rolling back JDBC transaction on Connection [net.sourceforge.jtds.jdbc.ConnectionJDBC3#190d8e1]
[08.08.2014 17:30:10] DEBUG: DataSourceTransactionManager.doCleanupAfterCompletion() - Releasing JDBC Connection [net.sourceforge.jtds.jdbc.ConnectionJDBC3#190d8e1] after transaction
[08.08.2014 17:30:10] DEBUG: DataSourceUtils.doReleaseConnection() - Returning JDBC Connection to DataSource
[08.08.2014 17:30:10] DEBUG: RepeatTemplate.doHandle() - Handling exception: org.springframework.dao.InvalidDataAccessResourceUsageException, caused by: org.springframework.dao.InvalidDataAccessResourceUsageException: Batch execution returned invalid results. Expected 1 but number of BatchResult objects returned was 3
[08.08.2014 17:30:10] DEBUG: RepeatTemplate.executeInternal() - Handling fatal exception explicitly (rethrowing first of 1): org.springframework.dao.InvalidDataAccessResourceUsageException: Batch execution returned invalid results. Expected 1 but number of BatchResult objects returned was 3
[08.08.2014 17:30:10] ERROR: AbstractStep.execute() - Encountered an error executing the step
org.springframework.dao.InvalidDataAccessResourceUsageException: Batch execution returned invalid results. Expected 1 but number of BatchResult objects returned was 3
at org.springframework.batch.item.database.IbatisBatchItemWriter.write(IbatisBatchItemWriter.java:140)
at org.springframework.batch.core.step.item.SimpleChunkProcessor.writeItems(SimpleChunkProcessor.java:156)
at org.springframework.batch.core.step.item.SimpleChunkProcessor.doWrite(SimpleChunkProcessor.java:137)
at org.springframework.batch.core.step.item.SimpleChunkProcessor.write(SimpleChunkProcessor.java:252)
at org.springframework.batch.core.step.item.SimpleChunkProcessor.process(SimpleChunkProcessor.java:178)
at org.springframework.batch.core.step.item.ChunkOrientedTasklet.execute(ChunkOrientedTasklet.java:74)
at org.springframework.batch.core.step.tasklet.TaskletStep$2.doInChunkContext(TaskletStep.java:268)
at org.springframework.batch.core.scope.context.StepContextRepeatCallback.doInIteration(StepContextRepeatCallback.java:76)
at org.springframework.batch.repeat.support.RepeatTemplate.getNextResult(RepeatTemplate.java:367)
at org.springframework.batch.repeat.support.RepeatTemplate.executeInternal(RepeatTemplate.java:215)
at org.springframework.batch.repeat.support.RepeatTemplate.iterate(RepeatTemplate.java:143)
at org.springframework.batch.core.step.tasklet.TaskletStep.doExecute(TaskletStep.java:242)
at org.springframework.batch.core.step.AbstractStep.execute(AbstractStep.java:198)
at org.springframework.batch.core.job.AbstractJob.handleStep(AbstractJob.java:348)
at org.springframework.batch.core.job.flow.FlowJob.access$0(FlowJob.java:1)
at org.springframework.batch.core.job.flow.FlowJob$JobFlowExecutor.executeStep(FlowJob.java:135)
at org.springframework.batch.core.job.flow.support.state.StepState.handle(StepState.java:60)
at org.springframework.batch.core.job.flow.support.SimpleFlow.resume(SimpleFlow.java:144)
at org.springframework.batch.core.job.flow.support.SimpleFlow.start(SimpleFlow.java:124)
at org.springframework.batch.core.job.flow.FlowJob.doExecute(FlowJob.java:103)
at org.springframework.batch.core.job.AbstractJob.execute(AbstractJob.java:250)
at org.springframework.batch.core.launch.support.SimpleJobLauncher$1.run(SimpleJobLauncher.java:110)
at org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:48)
at org.springframework.batch.core.launch.support.SimpleJobLauncher.run(SimpleJobLauncher.java:105)
at com.CtrlMPojoForBatch.initiateSpringBatchProcess(CtrlMPojoForBatch.java:92)
at com.CtrlMPojoForBatch.main(CtrlMPojoForBatch.java:33)
[08.08.2014 17:30:10] WARN: CustomStepExecutionListner.afterStep() - Failure occured executing the step readWriteExchagnerateConversionData
[08.08.2014 17:30:10] WARN: CustomStepExecutionListner.afterStep() - Initiating the rollback operation...
[08.08.2014 17:30:10] WARN: CustomStepExecutionListner.afterStep() - Rollback completed!
Assuming you're using the IbatisBatchItemWriter provided by Spring Batch (it has been deprecated in favor of the writers provided by the MyBatis project), set the assertUpdates property to false. This prevents Spring Batch from verifying that exactly one update was made per item.
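A minimal sketch of that setting, assuming XML bean configuration (the bean id and the sqlMapClient reference are illustrative):

```xml
<bean id="personItemWriter"
      class="org.springframework.batch.item.database.IbatisBatchItemWriter">
    <property name="sqlMapClient" ref="sqlMapClient"/>
    <property name="statementId" value="insertCompoundEmployeeData"/>
    <!-- one logical item produces several BatchResult objects here
         (person plus email inserts), so skip the one-update-per-item check -->
    <property name="assertUpdates" value="false"/>
</bean>
```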

JPA: Pre-allocated ID block does not work as expected

I'm trying to manage ID generation through the @TableGenerator annotation with the default allocationSize. As I understand it, an allocation size is used to avoid updating the row for every single identifier that gets requested. In theory, the provider should pre-allocate a block of identifiers equal to allocationSize (50 in this case) and then hand out identifiers from memory as requested until the block is used up or the transaction ends.
I use EclipseLink (EL) with JBoss 7.1.
The problem is that this does not happen. When inserting 3 records into the STUDENTS table, EL pre-allocates a block of 50 IDs for each record, even within the same transaction. So for each record there is a trip to the table. From the logs I see 3 pre-allocations and three pairs of SELECT/UPDATE queries for the ID; the generated IDs are 1, 51, and 101, and the sequence ends with a final value of 150. Piece of the log:
Connection acquired from connection pool
UPDATE SEQUENCE SET SEQ_COUNT = SEQ_COUNT + ? WHERE SEQ_NAME = ?
bind => [50, TABLE_SEQ]
SELECT SEQ_COUNT FROM SEQUENCE WHERE SEQ_NAME = ?
bind => [TABLE_SEQ]
local sequencing preallocation for TABLE_SEQ: objects: 50 , first: 1, last: 50
Connection released to connection pool [default].
Connection acquired from connection pool
UPDATE SEQUENCE SET SEQ_COUNT = SEQ_COUNT + ? WHERE SEQ_NAME = ?
bind => [50, TABLE_SEQ]
SELECT SEQ_COUNT FROM SEQUENCE WHERE SEQ_NAME = ?
bind => [TABLE_SEQ]
local sequencing preallocation for TABLE_SEQ: objects: 50 , first: 51, last: 100
Connection released to connection pool [default].
Since it is a single transaction, I expected sequential IDs (1, 2, 3) and a final sequence value of 50.
Where am I wrong? I searched the forum but still could not solve the problem.
Below is the simple test code.
Thanks for the help.
Entity Student
import java.io.Serializable;
import javax.persistence.*;

@Entity
@Table(name = "STUDENTS")
public class Student implements Serializable
{
    private static final long serialVersionUID = 4771385985502937621L;

    @TableGenerator(name = "TABLE_SEQ")
    @Id @Column(name = "ID_STUDENT") @GeneratedValue(generator = "TABLE_SEQ")
    private int idStudent;

    private String name;

    public int getIdStudent() {
        return idStudent;
    }

    public void setIdStudent(int idStudent) {
        this.idStudent = idStudent;
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }
}
EJB
import javax.ejb.Stateless;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;

@Stateless(name = "EJBStudent")
public class EJBStudent implements EJBStudentRemote
{
    @PersistenceContext(unitName = "JPA_Test")
    private EntityManager manager;

    public EJBStudent() {
    }

    @Override
    public void insertStudents()
    {
        manager.getTransaction().begin();

        Student student1 = new Student();
        student1.setName("Anna");
        manager.persist(student1);

        Student student2 = new Student();
        student2.setName("Paolo");
        manager.persist(student2);

        Student student3 = new Student();
        student3.setName("Luigi");
        manager.persist(student3);

        manager.flush();
    }
}
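As an aside, with a container-managed (JTA) persistence context injected via @PersistenceContext, calling EntityManager.getTransaction() is not allowed; the container opens and commits the transaction around the business method. A sketch of the same bean relying purely on container-managed transactions (an assumption about the intended setup, not the code as posted):

```java
import javax.ejb.Stateless;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;

@Stateless(name = "EJBStudent")
public class EJBStudent implements EJBStudentRemote
{
    @PersistenceContext(unitName = "JPA_Test")
    private EntityManager manager;

    @Override
    public void insertStudents()
    {
        // No manager.getTransaction().begin() here: with a JTA persistence
        // unit the container starts the transaction on method entry and
        // commits (flushing the persistence context) on return.
        Student student1 = new Student();
        student1.setName("Anna");
        manager.persist(student1);

        Student student2 = new Student();
        student2.setName("Paolo");
        manager.persist(student2);

        Student student3 = new Student();
        student3.setName("Luigi");
        manager.persist(student3);
    }
}
```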
EDIT
@Chris thanks for the response.
Here is the finest-level log. I noticed one major difference: for each insert, a different connection is created. However, if I run the query suggested by @wypieprz before persisting anything, the connection is always the same.
Invoking org.jboss.invocation.InterceptorContext$Invocation.insertStudent
[EL Finer]: connection: 2014-04-26 18:05:14.228--ServerSession(320769650)--Thread(Thread[EJB default - 1,5,EJB default])--client acquired: 1096067977
[EL Finer]: transaction: 2014-04-26 18:05:14.236--ClientSession(1096067977)--Thread(Thread[EJB default - 1,5,EJB default])--acquire unit of work: 698749338
[EL Finest]: transaction: 2014-04-26 18:05:14.237--UnitOfWork(698749338)--Thread(Thread[EJB default - 1,5,EJB default])--persist() operation called on: jpa.test.model.Student#55cdaad2.
[EL Finest]: connection: 2014-04-26 18:05:14.238--ServerSession(320769650)--Connection(118375432)--Thread(Thread[EJB default - 1,5,EJB default])--Connection acquired from connection pool [default].
[EL Finest]: query: 2014-04-26 18:05:14.24--ClientSession(1096067977)--Thread(Thread[EJB default - 1,5,EJB default])--Execute query DataModifyQuery(name="TABLE_SEQ" sql="UPDATE SEQUENCE SET SEQ_COUNT = SEQ_COUNT + #PREALLOC_SIZE WHERE SEQ_NAME = #SEQ_NAME")
[EL Finest]: connection: 2014-04-26 18:05:14.242--ClientSession(1096067977)--Thread(Thread[EJB default - 1,5,EJB default])--reconnecting to external connection pool
[EL Fine]: sql: 2014-04-26 18:05:14.249--ClientSession(1096067977)--Connection(1243300871)--Thread(Thread[EJB default - 1,5,EJB default])--UPDATE SEQUENCE SET SEQ_COUNT = SEQ_COUNT + ? WHERE SEQ_NAME = ?
bind => [50, TABLE_SEQ]
[EL Finest]: query: 2014-04-26 18:05:14.252--ClientSession(1096067977)--Thread(Thread[EJB default - 1,5,EJB default])--Execute query ValueReadQuery(name="TABLE_SEQ" sql="SELECT SEQ_COUNT FROM SEQUENCE WHERE SEQ_NAME = #SEQ_NAME")
[EL Fine]: sql: 2014-04-26 18:05:14.253--ClientSession(1096067977)--Connection(1243300871)--Thread(Thread[EJB default - 1,5,EJB default])--SELECT SEQ_COUNT FROM SEQUENCE WHERE SEQ_NAME = ?
bind => [TABLE_SEQ]
[EL Finest]: sequencing: 2014-04-26 18:05:14.256--ClientSession(1096067977)--Connection(1243300871)--Thread(Thread[EJB default - 1,5,EJB default])--local sequencing preallocation for TABLE_SEQ: objects: 50 , first: 1, last: 50
[EL Finest]: connection: 2014-04-26 18:05:14.258--ServerSession(320769650)--Connection(118375432)--Thread(Thread[EJB default - 1,5,EJB default])--Connection released to connection pool [default].
[EL Finest]: sequencing: 2014-04-26 18:05:14.259--UnitOfWork(698749338)--Thread(Thread[EJB default - 1,5,EJB default])--assign sequence to the object (1 -> jpa.test.model.Student#55cdaad2)
[org.hibernate.validator.util.Version] (EJB default - 1) Hibernate Validator 4.2.0.Final
[EL Finest]: transaction: 2014-04-26 18:05:14.325--UnitOfWork(698749338)--Thread(Thread[EJB default - 1,5,EJB default])--persist() operation called on: jpa.test.model.Student#59887d29.
[EL Finest]: connection: 2014-04-26 18:05:14.326--ServerSession(320769650)--Connection(265370795)--Thread(Thread[EJB default - 1,5,EJB default])--Connection acquired from connection pool [default].
[EL Finest]: query: 2014-04-26 18:05:14.327--ClientSession(1096067977)--Thread(Thread[EJB default - 1,5,EJB default])--Execute query DataModifyQuery(name="TABLE_SEQ" sql="UPDATE SEQUENCE SET SEQ_COUNT = SEQ_COUNT + ? WHERE SEQ_NAME = ?")
[EL Finest]: connection: 2014-04-26 18:05:14.328--ClientSession(1096067977)--Thread(Thread[EJB default - 1,5,EJB default])--reconnecting to external connection pool
[EL Fine]: sql: 2014-04-26 18:05:14.329--ClientSession(1096067977)--Connection(1910900393)--Thread(Thread[EJB default - 1,5,EJB default])--UPDATE SEQUENCE SET SEQ_COUNT = SEQ_COUNT + ? WHERE SEQ_NAME = ?
bind => [50, TABLE_SEQ]
[EL Finest]: query: 2014-04-26 18:05:14.331--ClientSession(1096067977)--Thread(Thread[EJB default - 1,5,EJB default])--Execute query ValueReadQuery(name="TABLE_SEQ" sql="SELECT SEQ_COUNT FROM SEQUENCE WHERE SEQ_NAME = ?")
[EL Fine]: sql: 2014-04-26 18:05:14.332--ClientSession(1096067977)--Connection(1910900393)--Thread(Thread[EJB default - 1,5,EJB default])--SELECT SEQ_COUNT FROM SEQUENCE WHERE SEQ_NAME = ?
bind => [TABLE_SEQ]
[EL Finest]: sequencing: 2014-04-26 18:05:14.334--ClientSession(1096067977)--Connection(1910900393)--Thread(Thread[EJB default - 1,5,EJB default])--local sequencing preallocation for TABLE_SEQ: objects: 50 , first: 51, last: 100
[EL Finest]: connection: 2014-04-26 18:05:14.335--ServerSession(320769650)--Connection(265370795)--Thread(Thread[EJB default - 1,5,EJB default])--Connection released to connection pool [default].
[EL Finest]: sequencing: 2014-04-26 18:05:14.336--UnitOfWork(698749338)--Thread(Thread[EJB default - 1,5,EJB default])--assign sequence to the object (51 -> jpa.test.model.Student#59887d29)
[EL Finest]: transaction: 2014-04-26 18:05:14.337--UnitOfWork(698749338)--Thread(Thread[EJB default - 1,5,EJB default])--persist() operation called on: jpa.test.model.Student#20f5e794.
[EL Finest]: connection: 2014-04-26 18:05:14.338--ServerSession(320769650)--Connection(1882633843)--Thread(Thread[EJB default - 1,5,EJB default])--Connection acquired from connection pool [default].
[EL Finest]: query: 2014-04-26 18:05:14.338--ClientSession(1096067977)--Thread(Thread[EJB default - 1,5,EJB default])--Execute query DataModifyQuery(name="TABLE_SEQ" sql="UPDATE SEQUENCE SET SEQ_COUNT = SEQ_COUNT + ? WHERE SEQ_NAME = ?")
[EL Finest]: connection: 2014-04-26 18:05:14.339--ClientSession(1096067977)--Thread(Thread[EJB default - 1,5,EJB default])--reconnecting to external connection pool
[EL Fine]: sql: 2014-04-26 18:05:14.34--ClientSession(1096067977)--Connection(402944403)--Thread(Thread[EJB default - 1,5,EJB default])--UPDATE SEQUENCE SET SEQ_COUNT = SEQ_COUNT + ? WHERE SEQ_NAME = ?
bind => [50, TABLE_SEQ]
[EL Finest]: query: 2014-04-26 18:05:14.342--ClientSession(1096067977)--Thread(Thread[EJB default - 1,5,EJB default])--Execute query ValueReadQuery(name="TABLE_SEQ" sql="SELECT SEQ_COUNT FROM SEQUENCE WHERE SEQ_NAME = ?")
[EL Fine]: sql: 2014-04-26 18:05:14.343--ClientSession(1096067977)--Connection(402944403)--Thread(Thread[EJB default - 1,5,EJB default])--SELECT SEQ_COUNT FROM SEQUENCE WHERE SEQ_NAME = ?
bind => [TABLE_SEQ]
[EL Finest]: sequencing: 2014-04-26 18:05:14.345--ClientSession(1096067977)--Connection(402944403)--Thread(Thread[EJB default - 1,5,EJB default])--local sequencing preallocation for TABLE_SEQ: objects: 50 , first: 101, last: 150
[EL Finest]: connection: 2014-04-26 18:05:14.346--ServerSession(320769650)--Connection(1882633843)--Thread(Thread[EJB default - 1,5,EJB default])--Connection released to connection pool [default].
[EL Finest]: sequencing: 2014-04-26 18:05:14.347--UnitOfWork(698749338)--Thread(Thread[EJB default - 1,5,EJB default])--assign sequence to the object (101 -> jpa.test.model.Student#20f5e794)
[EL Finer]: transaction: 2014-04-26 18:05:14.348--UnitOfWork(698749338)--Thread(Thread[EJB default - 1,5,EJB default])--begin unit of work flush
[EL Finest]: query: 2014-04-26 18:05:14.351--UnitOfWork(698749338)--Thread(Thread[EJB default - 1,5,EJB default])--Execute query InsertObjectQuery(jpa.test.model.Student#59887d29)
[EL Finest]: connection: 2014-04-26 18:05:14.353--ServerSession(320769650)--Connection(1582457600)--Thread(Thread[EJB default - 1,5,EJB default])--Connection acquired from connection pool [default].
[EL Finest]: connection: 2014-04-26 18:05:14.354--ClientSession(1096067977)--Thread(Thread[EJB default - 1,5,EJB default])--reconnecting to external connection pool
[EL Fine]: sql: 2014-04-26 18:05:14.354--ClientSession(1096067977)--Connection(1927398752)--Thread(Thread[EJB default - 1,5,EJB default])--INSERT INTO STUDENTS (ID_STUDENT, NAME) VALUES (?, ?)
bind => [51, Paolo]
[EL Finest]: query: 2014-04-26 18:05:14.37--UnitOfWork(698749338)--Thread(Thread[EJB default - 1,5,EJB default])--Execute query InsertObjectQuery(jpa.test.model.Student#55cdaad2)
[EL Fine]: sql: 2014-04-26 18:05:14.371--ClientSession(1096067977)--Connection(1927398752)--Thread(Thread[EJB default - 1,5,EJB default])--INSERT INTO STUDENTS (ID_STUDENT, NAME) VALUES (?, ?)
bind => [1, Anna]
[EL Finest]: query: 2014-04-26 18:05:14.373--UnitOfWork(698749338)--Thread(Thread[EJB default - 1,5,EJB default])--Execute query InsertObjectQuery(jpa.test.model.Student#20f5e794)
[EL Fine]: sql: 2014-04-26 18:05:14.373--ClientSession(1096067977)--Connection(1927398752)--Thread(Thread[EJB default - 1,5,EJB default])--INSERT INTO STUDENTS (ID_STUDENT, NAME) VALUES (?, ?)
bind => [101, Luigi]
[EL Finer]: transaction: 2014-04-26 18:05:14.375--UnitOfWork(698749338)--Thread(Thread[EJB default - 1,5,EJB default])--end unit of work flush
[EL Finer]: transaction: 2014-04-26 18:05:14.376--UnitOfWork(698749338)--Thread(Thread[EJB default - 1,5,EJB default])--resume unit of work
Exiting org.jboss.invocation.InterceptorContext$Invocation.insertStudent
Business method insertStudent in org.jboss.invocation.InterceptorContext$Invocation takes 5878 ms to execute
18:05:14,409 INFO [org.jboss.as.naming] (Remoting "pc" task-1) JBAS011806: Channel end notification received, closing channel Channel ID 03cefe33 (inbound) of Remoting connection 3138554d to /127.0.0.1:65303
Problem solved.
In order for EclipseLink to work correctly with the server's transactions, you must specify (in the persistence.xml file) which server is being used; in my case:
<property name="eclipselink.target-server" value="JBoss"/>
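In context, the property goes into the <properties> section of the persistence unit. A minimal sketch (the unit name JPA_Test is taken from the code above; the jta-data-source is the JBoss example datasource, an assumption to replace with your own):

```xml
<persistence xmlns="http://java.sun.com/xml/ns/persistence" version="2.0">
    <persistence-unit name="JPA_Test" transaction-type="JTA">
        <jta-data-source>java:jboss/datasources/ExampleDS</jta-data-source>
        <class>jpa.test.model.Student</class>
        <properties>
            <!-- tell EclipseLink it runs inside JBoss so it integrates with JTA -->
            <property name="eclipselink.target-server" value="JBoss"/>
        </properties>
    </persistence-unit>
</persistence>
```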