I have a requirement where we have an entity with a reference to itself, and we want to audit it. Have a look at it below:
@Entity
@Table(name = TableNames.CLIENT)
@EqualsAndHashCode(exclude = "clientContacts")
@Audited
@AuditTable(value = TableNames.CLIENT_HISTORY)
public class Client implements Serializable {

    private static final long serialVersionUID = -2789655782782839286L;

    @Id
    @Column(name = "ID")
    @GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "clientGenerator")
    @SequenceGenerator(name = "clientGenerator", sequenceName = "MEMBERSHIP_CLIENT_SQ", allocationSize = 1)
    private Long id;

    @Column(name = "PARENT_ID")
    private Long parentId;

    @Column(name = "LEGAL_NAME")
    private String legalName;

    @Column(name = "LEI")
    private String lei;

    @Column(name = "CICI")
    private String cici;

    @Column(name = "BIC")
    private String bic;

    @Column(name = "LCH_UNIQUE_ID")
    private String lchUniqueId;

    @Column(name = "SALESFORCE_ID")
    private String salesforceId;

    @Column(name = "FUND_MANAGER")
    private String fundManager;

    @Column(name = "INCORPORATION_COUNTRY_ID")
    private Long incorporationCountryId;

    @Column(name = "FINANCIAL_CATEGORY_ID")
    private Long financialCategoryId;

    @Embedded
    @JsonUnwrapped
    private Address address;

    @OneToMany(mappedBy = "client")
    private Set<ClientContact> clientContacts;

    @OneToOne(fetch = FetchType.LAZY)
    @JoinColumn(name = "PARENT_ID", insertable = false, updatable = false)
    @Getter(onMethod = @__({@JsonIgnore, @Transient}))
    @Audited(targetAuditMode = RelationTargetAuditMode.NOT_AUDITED)
    private Client parent;
}
As you can see, we have a Client object as parent within the Client object itself. We also have the parentId field, which holds the id of the parent if the client has one.
Here I mark the relation as NOT_AUDITED, so I would expect the parent to be fetched from the main table rather than the history table. But I noticed that it queries the history table for the parent as well.
Also, is this the right way to reference the self object? Is there a better way to represent it?
I am using Envers version 5.3.7.Final.
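As a side note on the mapping question: since many clients can share one parent, a self-reference like this is more commonly mapped as a @ManyToOne than a @OneToOne. A minimal sketch of that alternative, reusing the same PARENT_ID column (this is only about the mapping style, not a fix for the Envers behaviour):

@ManyToOne(fetch = FetchType.LAZY)
@JoinColumn(name = "PARENT_ID", insertable = false, updatable = false)
@Audited(targetAuditMode = RelationTargetAuditMode.NOT_AUDITED)
private Client parent;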
Log showing that the data is retrieved from the history table instead of the main table:
Hibernate: select client_his0_.id as id1_39_0_, client_his0_.revision as revision2_39_0_, customrevi1_.id as id1_5_1_, client_his0_.revision_type as revision_type3_39_0_, client_his0_.address_line1 as address_line4_39_0_, client_his0_.address_line2 as address_line5_39_0_, client_his0_.address_line3 as address_line6_39_0_, client_his0_.address_line4 as address_line7_39_0_, client_his0_.address_line5 as address_line8_39_0_, client_his0_.address_postcode as address_postcode9_39_0_, client_his0_.bic as bic10_39_0_, client_his0_.cici as cici11_39_0_, client_his0_.financial_category_id as financial_categor12_39_0_, client_his0_.fund_manager as fund_manager13_39_0_, client_his0_.incorporation_country_id as incorporation_cou14_39_0_, client_his0_.lch_unique_id as lch_unique_id15_39_0_, client_his0_.legal_name as legal_name16_39_0_, client_his0_.lei as lei17_39_0_, client_his0_.parent_id as parent_id18_39_0_, client_his0_.salesforce_id as salesforce_id19_39_0_, customrevi1_.timestamp as timestamp2_5_1_, customrevi1_.username as username3_5_1_ from membership_client_history client_his0_ cross join audit_revision customrevi1_ cross join audit_revision customrevi2_ where client_his0_.revision=customrevi2_.id and client_his0_.id=? and client_his0_.revision=customrevi1_.id order by customrevi2_.timestamp asc
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.type.descriptor.sql.BasicBinder - binding parameter [1] as [BIGINT] - [359]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([id1_39_0_] : [BIGINT]) - [359]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([revision2_39_0_] : [INTEGER]) - [803107]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([id1_5_1_] : [INTEGER]) - [803107]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([revision_type3_39_0_] : [INTEGER]) - [1]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([address_line4_39_0_] : [VARCHAR]) - [null]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([address_line5_39_0_] : [VARCHAR]) - [null]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([address_line6_39_0_] : [VARCHAR]) - [null]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([address_line7_39_0_] : [VARCHAR]) - [null]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([address_line8_39_0_] : [VARCHAR]) - [null]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([address_postcode9_39_0_] : [VARCHAR]) - [null]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([bic10_39_0_] : [VARCHAR]) - [null]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([cici11_39_0_] : [VARCHAR]) - [null]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([financial_categor12_39_0_] : [BIGINT]) - [8]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([fund_manager13_39_0_] : [VARCHAR]) - [null]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([incorporation_cou14_39_0_] : [BIGINT]) - [19]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([lch_unique_id15_39_0_] : [VARCHAR]) - [LCH00000359]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([legal_name16_39_0_] : [VARCHAR]) - [Dupont Denant Contrepartie]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([lei17_39_0_] : [VARCHAR]) - [null]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([parent_id18_39_0_] : [BIGINT]) - [57]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([salesforce_id19_39_0_] : [VARCHAR]) - [0012000000zMKFYAA4]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([incorporation_cou14_39_0_] : [BIGINT]) - [19]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([financial_categor12_39_0_] : [BIGINT]) - [8]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([parent_id18_39_0_] : [BIGINT]) - [57]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([timestamp2_5_1_] : [BIGINT]) - [1562172380000]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([username3_5_1_] : [VARCHAR]) - [risk-portal.schedule]
Hibernate: select client_his0_.id as id1_39_, client_his0_.revision as revision2_39_, client_his0_.revision_type as revision_type3_39_, client_his0_.address_line1 as address_line4_39_, client_his0_.address_line2 as address_line5_39_, client_his0_.address_line3 as address_line6_39_, client_his0_.address_line4 as address_line7_39_, client_his0_.address_line5 as address_line8_39_, client_his0_.address_postcode as address_postcode9_39_, client_his0_.bic as bic10_39_, client_his0_.cici as cici11_39_, client_his0_.financial_category_id as financial_categor12_39_, client_his0_.fund_manager as fund_manager13_39_, client_his0_.incorporation_country_id as incorporation_cou14_39_, client_his0_.lch_unique_id as lch_unique_id15_39_, client_his0_.legal_name as legal_name16_39_, client_his0_.lei as lei17_39_, client_his0_.parent_id as parent_id18_39_, client_his0_.salesforce_id as salesforce_id19_39_ from membership_client_history client_his0_ where client_his0_.revision=(select max(client_his1_.revision) from membership_client_history client_his1_ where client_his1_.revision<=? and client_his0_.id=client_his1_.id) and client_his0_.revision_type<>? and client_his0_.id=?
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.type.descriptor.sql.BasicBinder - binding parameter [1] as [INTEGER] - [803107]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.type.descriptor.sql.BasicBinder - binding parameter [2] as [INTEGER] - [2]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.type.descriptor.sql.BasicBinder - binding parameter [3] as [BIGINT] - [57]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] DEBUG o.s.s.w.h.writers.HstsHeaderWriter - Not injecting HSTS header since it did not match the requestMatcher org.springframework.security.web.header.writers.HstsHeaderWriter$SecureRequestMatcher@2021d30
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] DEBUG o.s.s.w.c.SecurityContextPersistenceFilter - SecurityContextHolder now cleared, as request processing completed
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] ERROR o.a.c.c.C.[.[.[.[dispatcherServlet] - Servlet.service() for servlet [dispatcherServlet] in context with path [/risk-portal] threw exception [Request processing failed; nested exception is javax.persistence.EntityNotFoundException: Unable to find com.lch.grouprisk.riskportal.entity.crud.Client with id 57] with root cause
javax.persistence.EntityNotFoundException: Unable to find com.lch.grouprisk.riskportal.entity.crud.Client with id 57
I save a Product, which cascade-persists its ProductMaterial. However, when the ProductMaterial insert throws a DataIntegrityViolationException, the Product is rolled back as well. It looks like the cascade is done in one transaction, but I can't find any docs saying that it is. Can someone clarify this for me?
NOTE: I DO NOT use @Transactional
Material material = new Material();
material.setId(1);
Product newProduct = new Product();
ProductMaterial productMaterial = new ProductMaterial();
newProduct.setName("bàn chải");
newProduct.setPrice(1000);
newProduct.setCreatedAt(new Date());
newProduct.setProductMaterials(Collections.singletonList(productMaterial));
productMaterial.setProduct(newProduct);
productMaterial.setMaterial(material);
productRepository.save(newProduct);
Here is the Hibernate execution:
Hibernate:
/* insert com.vietnam.hanghandmade.entities.Product
*/ insert
into
product
(created_at, name, price, id)
values
(?, ?, ?, ?)
2020-11-10 14:55:38.281 TRACE 65729 --- [nio-8080-exec-2] o.h.type.descriptor.sql.BasicBinder : binding parameter [1] as [TIMESTAMP] - [Tue Nov 10 14:55:38 JST 2020]
2020-11-10 14:55:38.281 TRACE 65729 --- [nio-8080-exec-2] o.h.type.descriptor.sql.BasicBinder : binding parameter [2] as [VARCHAR] - [bàn chải]
2020-11-10 14:55:38.281 TRACE 65729 --- [nio-8080-exec-2] o.h.type.descriptor.sql.BasicBinder : binding parameter [3] as [INTEGER] - [1000]
2020-11-10 14:55:38.281 TRACE 65729 --- [nio-8080-exec-2] o.h.type.descriptor.sql.BasicBinder : binding parameter [4] as [OTHER] - [e5729490-a0f8-48e7-9600-eeeba8b8f279]
Hibernate:
/* insert com.vietnam.hanghandmade.entities.ProductMaterial
*/ insert
into
product_material
(material_id, product_id)
values
(?, ?)
2020-11-10 14:55:38.324 TRACE 65729 --- [nio-8080-exec-2] o.h.type.descriptor.sql.BasicBinder : binding parameter [1] as [INTEGER] - [1]
2020-11-10 14:55:38.324 TRACE 65729 --- [nio-8080-exec-2] o.h.type.descriptor.sql.BasicBinder : binding parameter [2] as [OTHER] - [e5729490-a0f8-48e7-9600-eeeba8b8f279]
2020-11-10 14:55:38.328 WARN 65729 --- [nio-8080-exec-2] o.h.engine.jdbc.spi.SqlExceptionHelper : SQL Error: 0, SQLState: 23503
2020-11-10 14:55:38.328 ERROR 65729 --- [nio-8080-exec-2] o.h.engine.jdbc.spi.SqlExceptionHelper : ERROR: insert or update on table "product_material" violates foreign key constraint "product_material_material_id_fkey"
Detail: Key (material_id)=(1) is not present in table "material".
NOTE: This answer misses the point of the question, which is about “cascading persist”; it talks about “cascading delete” for foreign keys.
The cascading delete or update is part of the action of the system trigger that implements foreign key constraints, and as such it runs in the same transaction as the triggering statement.
I cannot find a place in the fine manual that spells this out, but it is obvious if you think about it: if the cascading delete were run in a separate transaction, it would be possible that the delete succeeds and the cascading delete fails, which would render the database inconsistent and is consequently not an option.
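For what it's worth, even without an explicit @Transactional, Spring Data JPA's save() is itself transactional (SimpleJpaRepository is annotated with @Transactional), so the cascaded inserts run and roll back as a single unit. A minimal sketch that makes the transaction explicit, using a hypothetical ProductService:

import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class ProductService {

    private final ProductRepository productRepository;

    public ProductService(ProductRepository productRepository) {
        this.productRepository = productRepository;
    }

    // The Product insert and the cascaded ProductMaterial insert
    // either both commit or both roll back.
    @Transactional
    public Product create(Product product) {
        return productRepository.save(product);
    }
}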
I have an issue with my native query.
I've got:
@Query(value="SELECT * from orders where orders.house in ((:houseArray))", nativeQuery = true)
List<Order> findByHouseId(@Param("houseArray") List<Long> houseArray);
And when I am trying to execute, I get the following:
2017-04-18 14:19:49,736 DEBUG org.hibernate.SQL: SELECT * from orders where orders.house in ((?, ?, ?, ?, ?))
2017-04-18 14:19:49,737 TRACE o.h.t.d.s.BasicBinder: binding parameter [2] as [BIGINT] - [4]
2017-04-18 14:19:49,737 TRACE o.h.t.d.s.BasicBinder: binding parameter [3] as [BIGINT] - [5]
2017-04-18 14:19:49,737 TRACE o.h.t.d.s.BasicBinder: binding parameter [1] as [BIGINT] - [3]
2017-04-18 14:19:49,737 TRACE o.h.t.d.s.BasicBinder: binding parameter [4] as [BIGINT] - [6]
2017-04-18 14:19:49,737 TRACE o.h.t.d.s.BasicBinder: binding parameter [5] as [BIGINT] - [7]
2017-04-18 14:19:49,738 ERROR o.h.e.j.s.SqlExceptionHelper: ERROR: operator does not exist: bigint = record
Hint: No operator matches the given name and argument type(s). You might need to add explicit type casts.
Position: 49
2017-04-18 14:19:49,756 ERROR o.a.c.c.C.[.[.[.[dispatcherServlet]: Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Request processing failed; nested exception is javax.persistence.PersistenceException: org.hibernate.exception.SQLGrammarException: could not extract ResultSet] with root cause
org.postgresql.util.PSQLException: ERROR: operator does not exist: bigint = record
Hint: No operator matches the given name and argument type(s). You might need to add explicit type casts.
However, if I run the following query in console:
SELECT * from orders where orders.house in (1,15,2,4,5,3,6,7);
It returns proper list of orders.
How can I fix this?
Try removing one set of brackets from ((:houseArray)) so that it looks like this:
@Query(value="SELECT * from orders where orders.house in (:houseArray)", nativeQuery = true)
List<Order> findByHouseId(@Param("houseArray") List<Long> houseArray);
(value, value, value) is a record, so when you write column in ((value, value, value)) you are comparing the column against a record.
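For illustration, a call to the corrected method (repository instance and ids hypothetical):

List<Order> orders = orderRepository.findByHouseId(Arrays.asList(3L, 4L, 5L, 6L, 7L));
// Hibernate now binds each element separately:
// SELECT * from orders where orders.house in (?, ?, ?, ?, ?)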
I use Spring Data JPA and the Hibernate second-level cache via hibernate-redis in my project. I use @Transactional for lazy loading, but I get cache misses when I run the application. If I debug it and set a breakpoint to wait for some time, it works and retrieves the cached data from Redis. Here is the code:
Entity ItemCategory:
@Entity
@Cacheable
public class ItemCategory extends BaseModel {

    @NotNull
    @Column(updatable = false)
    private String name;

    @JsonBackReference
    @ManyToOne(fetch = FetchType.LAZY)
    private ItemCategory root;
}
Entity Item:
@Entity
@Cacheable
public class Item extends BaseModel {

    @ManyToOne(fetch = FetchType.EAGER)
    private ItemCategory category;
}
Repository:
@Repository
public interface ItemCategoryRepository extends JpaRepository<ItemCategory, Long> {

    @QueryHints(value = {
        @QueryHint(name = "org.hibernate.cacheable", value = "true")
    })
    @Query("select distinct i.category.root from Item i where i.store.id = :id and i.category.parent.id = i.category.root.id")
    List<ItemCategory> findByStoreId(@Param("id") Long id);
}
Cache miss:
2017-03-06 14:49:30.105 TRACE 30295 --- [nio-8080-exec-2] o.h.cache.redis.client.RedisClient : retrieve cache item. region=hibernate.org.hibernate.cache.internal.StandardQueryCache, key=sql: select distinct itemcatego2_.id as id1_21_, itemcatego2_.create_by_id as create_b8_21_, itemcatego2_.create_date as create_d2_21_, itemcatego2_.last_modified_by_id as last_mod9_21_, itemcatego2_.last_modified_date as last_mod3_21_, itemcatego2_.background_id as backgro10_21_, itemcatego2_.enabled as enabled4_21_, itemcatego2_.name as name5_21_, itemcatego2_.parent_id as parent_11_21_, itemcatego2_.root_id as root_id12_21_, itemcatego2_.slide as slide6_21_, itemcatego2_.son_number as son_numb7_21_ from item item0_ inner join item_category itemcatego1_ on item0_.category_id=itemcatego1_.id inner join item_category itemcatego2_ on itemcatego1_.root_id=itemcatego2_.id where item0_.store_id=? and itemcatego1_.parent_id=itemcatego1_.root_id; parameters: ; named parameters: {id=4}; transformer: org.hibernate.transform.CacheableResultTransformer#110f2, value=[6098054966726656, 3, 1]
2017-03-06 14:49:30.116 TRACE 30295 --- [nio-8080-exec-2] o.h.cache.redis.client.RedisClient : retrieve cache item. region=hibernate.org.hibernate.cache.spi.UpdateTimestampsCache, key=item, value=null
2017-03-06 14:49:30.127 TRACE 30295 --- [nio-8080-exec-2] o.h.cache.redis.client.RedisClient : retrieve cache item. region=hibernate.org.hibernate.cache.spi.UpdateTimestampsCache, key=item_category, value=null
2017-03-06 14:49:41.971 INFO 30295 --- [nio-8080-exec-2] i.StatisticalLoggingSessionEventListener : Session Metrics {
974551 nanoseconds spent acquiring 1 JDBC connections;
0 nanoseconds spent releasing 0 JDBC connections;
0 nanoseconds spent preparing 0 JDBC statements;
0 nanoseconds spent executing 0 JDBC statements;
0 nanoseconds spent executing 0 JDBC batches;
0 nanoseconds spent performing 0 L2C puts;
19881210 nanoseconds spent performing 1 L2C hits;
24082571 nanoseconds spent performing 2 L2C misses;
0 nanoseconds spent executing 0 flushes (flushing a total of 0 entities and 0 collections);
26331 nanoseconds spent executing 1 partial-flushes (flushing a total of 0 entities and 0 collections)
}
If I debug and set a breakpoint to wait for some time (does not work every time):
2017-03-06 14:50:00.565 TRACE 30295 --- [nio-8080-exec-3] o.h.cache.redis.client.RedisClient : retrieve cache item. region=hibernate.org.hibernate.cache.internal.StandardQueryCache, key=sql: select distinct itemcatego2_.id as id1_21_, itemcatego2_.create_by_id as create_b8_21_, itemcatego2_.create_date as create_d2_21_, itemcatego2_.last_modified_by_id as last_mod9_21_, itemcatego2_.last_modified_date as last_mod3_21_, itemcatego2_.background_id as backgro10_21_, itemcatego2_.enabled as enabled4_21_, itemcatego2_.name as name5_21_, itemcatego2_.parent_id as parent_11_21_, itemcatego2_.root_id as root_id12_21_, itemcatego2_.slide as slide6_21_, itemcatego2_.son_number as son_numb7_21_ from item item0_ inner join item_category itemcatego1_ on item0_.category_id=itemcatego1_.id inner join item_category itemcatego2_ on itemcatego1_.root_id=itemcatego2_.id where item0_.store_id=? and itemcatego1_.parent_id=itemcatego1_.root_id; parameters: ; named parameters: {id=4}; transformer: org.hibernate.transform.CacheableResultTransformer#110f2, value=[6098054966726656, 3, 1]
2017-03-06 14:50:00.584 TRACE 30295 --- [nio-8080-exec-3] o.h.cache.redis.client.RedisClient : retrieve cache item. region=hibernate.org.hibernate.cache.spi.UpdateTimestampsCache, key=item, value=null
2017-03-06 14:50:00.595 TRACE 30295 --- [nio-8080-exec-3] o.h.cache.redis.client.RedisClient : retrieve cache item. region=hibernate.org.hibernate.cache.spi.UpdateTimestampsCache, key=item_category, value=null
2017-03-06 14:50:01.805 TRACE 30295 --- [nio-8080-exec-3] o.h.cache.redis.client.RedisClient : retrieve cache item. region=hibernate.com.foo.bar.model.item.ItemCategory, key=com.foo.bar.model.item.ItemCategory#3, value={parent=null, lastModifiedDate=2016-12-14 09:30:48.0, lastModifiedBy=1, enabled=true, sonNumber=2, _subclass=com.foo.bar.model.item.ItemCategory, createBy=1, children=3, background=1, slide=0, root=3, name=foo, _lazyPropertiesUnfetched=false, _version=null, createDate=2016-12-14 09:29:56.0}
Hibernate: select user0_.id as id1_59_0_, user0_.create_by_id as create_11_59_0_, user0_.create_date as create_d2_59_0_, user0_.last_modified_by_id as last_mo12_59_0_, user0_.last_modified_date as last_mod3_59_0_, user0_.avatar_id as avatar_13_59_0_, user0_.email as email4_59_0_, user0_.enabled as enabled5_59_0_, user0_.gender as gender6_59_0_, user0_.nickname as nickname7_59_0_, user0_.phone as phone8_59_0_, user0_.seller_auth_info_id as seller_14_59_0_, user0_.seller_auth_status as seller_a9_59_0_, user0_.user_ext_id as user_ex15_59_0_, user0_.user_group_id as user_gr16_59_0_, user0_.username as usernam10_59_0_, user1_.id as id1_59_1_, user1_.create_by_id as create_11_59_1_, user1_.create_date as create_d2_59_1_, user1_.last_modified_by_id as last_mo12_59_1_, user1_.last_modified_date as last_mod3_59_1_, user1_.avatar_id as avatar_13_59_1_, user1_.email as email4_59_1_, user1_.enabled as enabled5_59_1_, user1_.gender as gender6_59_1_, user1_.nickname as nickname7_59_1_, user1_.phone as phone8_59_1_, user1_.seller_auth_info_id as seller_14_59_1_, user1_.seller_auth_status as seller_a9_59_1_, user1_.user_ext_id as user_ex15_59_1_, user1_.user_group_id as user_gr16_59_1_, user1_.username as usernam10_59_1_, user2_.id as id1_59_2_, user2_.create_by_id as create_11_59_2_, user2_.create_date as create_d2_59_2_, user2_.last_modified_by_id as last_mo12_59_2_, user2_.last_modified_date as last_mod3_59_2_, user2_.avatar_id as avatar_13_59_2_, user2_.email as email4_59_2_, user2_.enabled as enabled5_59_2_, user2_.gender as gender6_59_2_, user2_.nickname as nickname7_59_2_, user2_.phone as phone8_59_2_, user2_.seller_auth_info_id as seller_14_59_2_, user2_.seller_auth_status as seller_a9_59_2_, user2_.user_ext_id as user_ex15_59_2_, user2_.user_group_id as user_gr16_59_2_, user2_.username as usernam10_59_2_, usergroup3_.id as id1_65_3_, usergroup3_.create_by_id as create_b5_65_3_, usergroup3_.create_date as create_d2_65_3_, usergroup3_.last_modified_by_id as last_mod6_65_3_, usergroup3_.last_modified_date as last_mod3_65_3_, usergroup3_.name as name4_65_3_ from user user0_ left outer join user user1_ on user0_.create_by_id=user1_.id left outer join user user2_ on user1_.last_modified_by_id=user2_.id left outer join user_group usergroup3_ on user1_.user_group_id=usergroup3_.id where user0_.id=?
Hibernate: select usergroup0_.id as id1_65_0_, usergroup0_.create_by_id as create_b5_65_0_, usergroup0_.create_date as create_d2_65_0_, usergroup0_.last_modified_by_id as last_mod6_65_0_, usergroup0_.last_modified_date as last_mod3_65_0_, usergroup0_.name as name4_65_0_, user1_.id as id1_59_1_, user1_.create_by_id as create_11_59_1_, user1_.create_date as create_d2_59_1_, user1_.last_modified_by_id as last_mo12_59_1_, user1_.last_modified_date as last_mod3_59_1_, user1_.avatar_id as avatar_13_59_1_, user1_.email as email4_59_1_, user1_.enabled as enabled5_59_1_, user1_.gender as gender6_59_1_, user1_.nickname as nickname7_59_1_, user1_.phone as phone8_59_1_, user1_.seller_auth_info_id as seller_14_59_1_, user1_.seller_auth_status as seller_a9_59_1_, user1_.user_ext_id as user_ex15_59_1_, user1_.user_group_id as user_gr16_59_1_, user1_.username as usernam10_59_1_, user2_.id as id1_59_2_, user2_.create_by_id as create_11_59_2_, user2_.create_date as create_d2_59_2_, user2_.last_modified_by_id as last_mo12_59_2_, user2_.last_modified_date as last_mod3_59_2_, user2_.avatar_id as avatar_13_59_2_, user2_.email as email4_59_2_, user2_.enabled as enabled5_59_2_, user2_.gender as gender6_59_2_, user2_.nickname as nickname7_59_2_, user2_.phone as phone8_59_2_, user2_.seller_auth_info_id as seller_14_59_2_, user2_.seller_auth_status as seller_a9_59_2_, user2_.user_ext_id as user_ex15_59_2_, user2_.user_group_id as user_gr16_59_2_, user2_.username as usernam10_59_2_, user3_.id as id1_59_3_, user3_.create_by_id as create_11_59_3_, user3_.create_date as create_d2_59_3_, user3_.last_modified_by_id as last_mo12_59_3_, user3_.last_modified_date as last_mod3_59_3_, user3_.avatar_id as avatar_13_59_3_, user3_.email as email4_59_3_, user3_.enabled as enabled5_59_3_, user3_.gender as gender6_59_3_, user3_.nickname as nickname7_59_3_, user3_.phone as phone8_59_3_, user3_.seller_auth_info_id as seller_14_59_3_, user3_.seller_auth_status as seller_a9_59_3_, user3_.user_ext_id as user_ex15_59_3_, user3_.user_group_id as user_gr16_59_3_, user3_.username as usernam10_59_3_, usergroup4_.id as id1_65_4_, usergroup4_.create_by_id as create_b5_65_4_, usergroup4_.create_date as create_d2_65_4_, usergroup4_.last_modified_by_id as last_mod6_65_4_, usergroup4_.last_modified_date as last_mod3_65_4_, usergroup4_.name as name4_65_4_, user5_.id as id1_59_5_, user5_.create_by_id as create_11_59_5_, user5_.create_date as create_d2_59_5_, user5_.last_modified_by_id as last_mo12_59_5_, user5_.last_modified_date as last_mod3_59_5_, user5_.avatar_id as avatar_13_59_5_, user5_.email as email4_59_5_, user5_.enabled as enabled5_59_5_, user5_.gender as gender6_59_5_, user5_.nickname as nickname7_59_5_, user5_.phone as phone8_59_5_, user5_.seller_auth_info_id as seller_14_59_5_, user5_.seller_auth_status as seller_a9_59_5_, user5_.user_ext_id as user_ex15_59_5_, user5_.user_group_id as user_gr16_59_5_, user5_.username as usernam10_59_5_, authoritie6_.user_group_id as user_gro1_66_6_, authoritie6_.authorities as authorit2_66_6_ from user_group usergroup0_ left outer join user user1_ on usergroup0_.create_by_id=user1_.id left outer join user user2_ on user1_.create_by_id=user2_.id left outer join user user3_ on user1_.last_modified_by_id=user3_.id left outer join user_group usergroup4_ on user1_.user_group_id=usergroup4_.id left outer join user user5_ on usergroup0_.last_modified_by_id=user5_.id left outer join user_group_authorities authoritie6_ on usergroup0_.id=authoritie6_.user_group_id where usergroup0_.id=?
2017-03-06 14:50:01.830 TRACE 30295 --- [nio-8080-exec-3] o.h.cache.redis.client.RedisClient : retrieve cache item. region=hibernate.com.foo.bar.model.item.ItemCategory, key=com.foo.bar.model.item.ItemCategory#1, value={parent=null, lastModifiedDate=2016-12-05 09:31:51.0, lastModifiedBy=1, enabled=true, sonNumber=1, _subclass=com.foo.bar.model.item.ItemCategory, createBy=1, children=1, background=1, slide=0, root=1, name=bar, _lazyPropertiesUnfetched=false, _version=null, createDate=2016-12-05 09:31:28.0}
2017-03-06 14:51:02.165 INFO 30295 --- [nio-8080-exec-3] i.StatisticalLoggingSessionEventListener : Session Metrics {
15435533 nanoseconds spent acquiring 1 JDBC connections;
0 nanoseconds spent releasing 0 JDBC connections;
1405433 nanoseconds spent preparing 2 JDBC statements;
2301936 nanoseconds spent executing 2 JDBC statements;
0 nanoseconds spent executing 0 JDBC batches;
0 nanoseconds spent performing 0 L2C puts;
64020073 nanoseconds spent performing 3 L2C hits;
27037450 nanoseconds spent performing 2 L2C misses;
1247578 nanoseconds spent executing 1 flushes (flushing a total of 4 entities and 3 collections);
24403 nanoseconds spent executing 1 partial-flushes (flushing a total of 0 entities and 0 collections)
}
application.yml:
spring:
  profiles: development
  jpa:
    show-sql: true
    properties:
      hibernate.cache.use_second_level_cache: true
      hibernate.cache.region.factory_class: org.hibernate.cache.redis.hibernate5.SingletonRedisRegionFactory
      hibernate.cache.use_query_cache: true
      hibernate.cache.region_prefix: hibernate
      hibernate.generate_statistics: true
      hibernate.cache.use_structured_entries: true
      redisson-config: classpath:redisson.yml
      hibernate.cache.use_reference_entries: true
      javax.persistence.sharedCache.mode: ENABLE_SELECTIVE
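Since hibernate.generate_statistics is already enabled, one way to confirm whether the second-level and query caches are actually being hit (rather than inferring it from log timing) is Hibernate's Statistics API. A minimal sketch, assuming an EntityManagerFactory is available for injection:

import javax.persistence.EntityManagerFactory;
import org.hibernate.SessionFactory;
import org.hibernate.stat.Statistics;

public class CacheDiagnostics {

    // Prints hit/miss counters accumulated since the factory was built.
    public static void printCacheStats(EntityManagerFactory emf) {
        Statistics stats = emf.unwrap(SessionFactory.class).getStatistics();
        System.out.println("L2C hits:           " + stats.getSecondLevelCacheHitCount());
        System.out.println("L2C misses:         " + stats.getSecondLevelCacheMissCount());
        System.out.println("Query cache hits:   " + stats.getQueryCacheHitCount());
        System.out.println("Query cache misses: " + stats.getQueryCacheMissCount());
    }
}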
I am trying to insert data from a CSV file into a Table Output step in Pentaho.
I got the following error:
2015/04/28 01:07:14 - Table output.0 - Connected to database [vps] (commit=5000)
2015/04/28 01:08:05 - Table output.0 - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : Because of an error, this step can't continue:
2015/04/28 01:08:05 - Table output.0 - ERROR (version 5.2.0.0, build 1 from 2014-09-30_19-48-28 by buildguy) : org.pentaho.di.core.exception.KettleException:
2015/04/28 01:08:05 - Table output.0 - Error inserting row into table [fk_test] with values: [NFT-140721054GN00079XXXXXXX], [postpaid], [NON-FA], [L23-APRICOT/PINK-115(40)], [null], [OD40709024274], [82662472], [9-Jul-14], [10-Jul-14], [null], [null], [21-Jul-14], [return_completed], [1], [1379], [0], [0], [0], [0], [0], [39.5], [0], [0], [0], [0], [0], [0.3], [0], [0], [NATIONAL], [0], [0], [0], [0], [FKST-00151], [10-Jul-14], [1379], [sandal], [0], [0], [0]
2015/04/28 01:08:05 - Table output.0 -
2015/04/28 01:08:05 - Table output.0 - Error inserting/updating row
2015/04/28 01:08:05 - Table output.0 - ERROR: value too long for type character varying(21)
2015/04/28 01:08:05 - Table output.0 -
2015/04/28 01:08:05 - Table output.0 -
2015/04/28 01:08:05 - Table output.0 - at org.pentaho.di.trans.steps.tableoutput.TableOutput.writeToTable(TableOutput.java:377)
2015/04/28 01:08:05 - Table output.0 - at org.pentaho.di.trans.steps.tableoutput.TableOutput.processRow(TableOutput.java:118)
2015/04/28 01:08:05 - Table output.0 - at org.pentaho.di.trans.step.RunThread.run(RunThread.java:62)
2015/04/28 01:08:05 - Table output.0 - at java.lang.Thread.run(Thread.java:745)
2015/04/28 01:08:05 - Table output.0 - Caused by: org.pentaho.di.core.exception.KettleDatabaseException:
2015/04/28 01:08:05 - Table output.0 - Error inserting/updating row
2015/04/28 01:08:05 - Table output.0 - ERROR: value too long for type character varying(21)
2015/04/28 01:08:05 - Table output.0 -
2015/04/28 01:08:05 - Table output.0 - at org.pentaho.di.core.database.Database.insertRow(Database.java:1268)
2015/04/28 01:08:05 - Table output.0 - at org.pentaho.di.trans.steps.tableoutput.TableOutput.writeToTable(TableOutput.java:255)
2015/04/28 01:08:05 - Table output.0 - ... 3 more
2015/04/28 01:08:05 - Table output.0 - Caused by: org.postgresql.util.PSQLException: ERROR: value too long for type character varying(21)
2015/04/28 01:08:05 - Table output.0 - at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2198)
2015/04/28 01:08:05 - Table output.0 - at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:1927)
2015/04/28 01:08:05 - Table output.0 - at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:255)
2015/04/28 01:08:05 - Table output.0 - at org.postgresql.jdbc2.AbstractJdbc2Statement.execute(AbstractJdbc2Statement.java:561)
2015/04/28 01:08:05 - Table output.0 - at org.postgresql.jdbc2.AbstractJdbc2Statement.executeWithFlags(AbstractJdbc2Statement.java:419)
2015/04/28 01:08:05 - Table output.0 - at org.postgresql.jdbc2.AbstractJdbc2Statement.executeUpdate(AbstractJdbc2Statement.java:365)
2015/04/28 01:08:05 - Table output.0 - at org.pentaho.di.core.database.Database.insertRow(Database.java:1235)
2015/04/28 01:08:05 - Table output.0 - ... 4 more
2015/04/28 01:08:05 - Table output.0 - Finished processing (I=0, O=186, R=187, W=186, U=0, E=1)
2015/04/28 01:08:05 - Table output.0 - Error inserting/updating row
2015/04/28 01:08:05 - Table output.0 - ERROR: value too long for type character varying(21)
How can I solve this error?
I checked the table definition in PostgreSQL and the data type there is character varying without a size.
The error is
ERROR: value too long for type character varying(21)
and there are two values that are longer than 21 characters:
NFT-140721054GN00079XXXXXXX
L23-APRICOT/PINK-115(40)
You need to either shorten the values or change the character size for the columns in the table fk_test.
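If widening the columns is the chosen fix, that is a plain ALTER TABLE. A sketch via JDBC, where the column name order_id and the new length are hypothetical:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class WidenColumn {
    public static void main(String[] args) throws Exception {
        try (Connection con = DriverManager.getConnection(
                "jdbc:postgresql://localhost:5432/vps", "user", "password");
             Statement st = con.createStatement()) {
            // Hypothetical column; dropping the length limit altogether
            // (plain "varchar" or "text") is also valid in PostgreSQL.
            st.execute("ALTER TABLE fk_test ALTER COLUMN order_id TYPE varchar(40)");
        }
    }
}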
I am using IbatisBatchItemWriter to write a complex object into multiple tables.
Here is what my object looks like:
public class SfObject {
    protected List<Person> person;
}

public class Person {
    protected String personId;
    protected XMLGregorianCalendar dateOfBirth;
    protected String countryOfBirth;
    protected String regionOfBirth;
    protected String placeOfBirth;
    protected String birthName;
    protected XMLGregorianCalendar dateOfDeath;
    protected XMLGregorianCalendar lastModifiedOn;
    protected List<EmailInformation> emailInformation;
}

public class EmailInformation {
    protected String emailType;
    protected String emailAddress;
    protected XMLGregorianCalendar lastModifiedOn;
}
And here is my iBatis configuration to insert the above objects:
<insert id="insertCompoundEmployeeData" parameterClass="com.domain.SfObject">
  <iterate property="person">
    insert into E_Person_Info
      (person_id,
       person_birth_dt,
       person_country_of_birth,
       person_region_of_birth,
       person_place_of_birth,
       person_birth_name,
       person_death_dt,
       last_modified_on
      )
    values (#person[].personId#,
            #person[].dateOfBirth#,
            #person[].countryOfBirth#,
            #person[].regionOfBirth#,
            #person[].placeOfBirth#,
            #person[].birthName#,
            #person[].dateOfDeath#,
            #person[].lastModifiedOn#
           );
    <iterate property="person[].emailInformation">
      insert into E_Email_Info
        (email_info_person_id,
         email_info_email_type,
         email_info_email_address,
         last_modified_on
        )
      values (#person[].personId#,
              #person[].emailInformation[].emailType#,
              #person[].emailInformation[].emailAddress#,
              #person[].emailInformation[].lastModifiedOn#
             );
    </iterate>
  </iterate>
</insert>
I am not sure whether I can use the above config to insert data into more than one table, but when I executed the above code I got the error below for a batch of 10 records. By the way, email information is not mandatory, so it may be null in some Person objects.
Stacktrace
[08.08.2014 17:30:07] DEBUG: WebservicePagingItemReader.doRead() - Reading page 0
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=1
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=2
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=3
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=4
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=5
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=6
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=7
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=8
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=9
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.isComplete() - Repeat is complete according to policy and result value.
[08.08.2014 17:30:09] DEBUG: IbatisBatchItemWriter.write() - Executing batch with 10 items.
[08.08.2014 17:30:09] DEBUG: SqlMapClientTemplate.execute() - Opened SqlMapSession [com.ibatis.sqlmap.engine.impl.SqlMapSessionImpl@168afdd] for iBATIS operation
[08.08.2014 17:30:10] DEBUG: Connection.debug() - {conn-100000} Connection
[08.08.2014 17:30:10] DEBUG: SqlMapClientTemplate.execute() - Obtained JDBC Connection [Transaction-aware proxy for target Connection from DataSource [org.springframework.jdbc.datasource.DriverManagerDataSource@8eae04]] for iBATIS operation
[08.08.2014 17:30:10] DEBUG: Connection.debug() - {conn-100000} Preparing Statement: insert into E_Person_Info (person_id, person_birth_dt, person_country_of_birth, person_region_of_birth, person_place_of_birth, person_birth_name, person_death_dt, last_modified_on ) values (?, ?, ?, ?, ?, ?, ?, ? );
[08.08.2014 17:30:10] DEBUG: Connection.debug() - {conn-100000} Preparing Statement: insert into E_Person_Info (person_id, person_birth_dt, person_country_of_birth, person_region_of_birth, person_place_of_birth, person_birth_name, person_death_dt, last_modified_on ) values (?, ?, ?, ?, ?, ?, ?, ? ); insert into E_Email_Info (email_info_person_id, email_info_email_type, email_info_email_address, last_modified_on ) values (?, ?, ?, ? );
[08.08.2014 17:30:10] DEBUG: Connection.debug() - {conn-100000} Preparing Statement: insert into E_Person_Info (person_id, person_birth_dt, person_country_of_birth, person_region_of_birth, person_place_of_birth, person_birth_name, person_death_dt, last_modified_on ) values (?, ?, ?, ?, ?, ?, ?, ? );
[08.08.2014 17:30:10] DEBUG: TaskletStep.doInChunkContext() - Applying contribution: [StepContribution: read=10, written=0, filtered=0, readSkips=0, writeSkips=0, processSkips=0, exitStatus=EXECUTING]
[08.08.2014 17:30:10] DEBUG: TaskletStep.doInChunkContext() - Rollback for Exception: org.springframework.dao.InvalidDataAccessResourceUsageException: Batch execution returned invalid results. Expected 1 but number of BatchResult objects returned was 3
[08.08.2014 17:30:10] DEBUG: DataSourceTransactionManager.processRollback() - Initiating transaction rollback
[08.08.2014 17:30:10] DEBUG: DataSourceTransactionManager.doRollback() - Rolling back JDBC transaction on Connection [net.sourceforge.jtds.jdbc.ConnectionJDBC3@190d8e1]
[08.08.2014 17:30:10] DEBUG: DataSourceTransactionManager.doCleanupAfterCompletion() - Releasing JDBC Connection [net.sourceforge.jtds.jdbc.ConnectionJDBC3@190d8e1] after transaction
[08.08.2014 17:30:10] DEBUG: DataSourceUtils.doReleaseConnection() - Returning JDBC Connection to DataSource
[08.08.2014 17:30:10] DEBUG: RepeatTemplate.doHandle() - Handling exception: org.springframework.dao.InvalidDataAccessResourceUsageException, caused by: org.springframework.dao.InvalidDataAccessResourceUsageException: Batch execution returned invalid results. Expected 1 but number of BatchResult objects returned was 3
[08.08.2014 17:30:10] DEBUG: RepeatTemplate.executeInternal() - Handling fatal exception explicitly (rethrowing first of 1): org.springframework.dao.InvalidDataAccessResourceUsageException: Batch execution returned invalid results. Expected 1 but number of BatchResult objects returned was 3
[08.08.2014 17:30:10] ERROR: AbstractStep.execute() - Encountered an error executing the step
org.springframework.dao.InvalidDataAccessResourceUsageException: Batch execution returned invalid results. Expected 1 but number of BatchResult objects returned was 3
at org.springframework.batch.item.database.IbatisBatchItemWriter.write(IbatisBatchItemWriter.java:140)
at org.springframework.batch.core.step.item.SimpleChunkProcessor.writeItems(SimpleChunkProcessor.java:156)
at org.springframework.batch.core.step.item.SimpleChunkProcessor.doWrite(SimpleChunkProcessor.java:137)
at org.springframework.batch.core.step.item.SimpleChunkProcessor.write(SimpleChunkProcessor.java:252)
at org.springframework.batch.core.step.item.SimpleChunkProcessor.process(SimpleChunkProcessor.java:178)
at org.springframework.batch.core.step.item.ChunkOrientedTasklet.execute(ChunkOrientedTasklet.java:74)
at org.springframework.batch.core.step.tasklet.TaskletStep$2.doInChunkContext(TaskletStep.java:268)
at org.springframework.batch.core.scope.context.StepContextRepeatCallback.doInIteration(StepContextRepeatCallback.java:76)
at org.springframework.batch.repeat.support.RepeatTemplate.getNextResult(RepeatTemplate.java:367)
at org.springframework.batch.repeat.support.RepeatTemplate.executeInternal(RepeatTemplate.java:215)
at org.springframework.batch.repeat.support.RepeatTemplate.iterate(RepeatTemplate.java:143)
at org.springframework.batch.core.step.tasklet.TaskletStep.doExecute(TaskletStep.java:242)
at org.springframework.batch.core.step.AbstractStep.execute(AbstractStep.java:198)
at org.springframework.batch.core.job.AbstractJob.handleStep(AbstractJob.java:348)
at org.springframework.batch.core.job.flow.FlowJob.access$0(FlowJob.java:1)
at org.springframework.batch.core.job.flow.FlowJob$JobFlowExecutor.executeStep(FlowJob.java:135)
at org.springframework.batch.core.job.flow.support.state.StepState.handle(StepState.java:60)
at org.springframework.batch.core.job.flow.support.SimpleFlow.resume(SimpleFlow.java:144)
at org.springframework.batch.core.job.flow.support.SimpleFlow.start(SimpleFlow.java:124)
at org.springframework.batch.core.job.flow.FlowJob.doExecute(FlowJob.java:103)
at org.springframework.batch.core.job.AbstractJob.execute(AbstractJob.java:250)
at org.springframework.batch.core.launch.support.SimpleJobLauncher$1.run(SimpleJobLauncher.java:110)
at org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:48)
at org.springframework.batch.core.launch.support.SimpleJobLauncher.run(SimpleJobLauncher.java:105)
at com.CtrlMPojoForBatch.initiateSpringBatchProcess(CtrlMPojoForBatch.java:92)
at com.CtrlMPojoForBatch.main(CtrlMPojoForBatch.java:33)
[08.08.2014 17:30:10] WARN: CustomStepExecutionListner.afterStep() - Failure occured executing the step readWriteExchagnerateConversionData
[08.08.2014 17:30:10] WARN: CustomStepExecutionListner.afterStep() - Initiating the rollback operation...
[08.08.2014 17:30:10] WARN: CustomStepExecutionListner.afterStep() - Rollback completed!
Assuming you're using the IbatisBatchItemWriter provided by Spring Batch (it has been deprecated in favor of the writers provided by the MyBatis project), set its assertUpdates property to false. This will stop Spring Batch from verifying that exactly one update was made per item.
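A minimal sketch of that configuration (the statement id comes from your sqlMap; the SqlMapClient wiring is assumed):

import com.ibatis.sqlmap.client.SqlMapClient;
import org.springframework.batch.item.database.IbatisBatchItemWriter;

public class WriterConfig {

    public IbatisBatchItemWriter<SfObject> compoundEmployeeWriter(SqlMapClient sqlMapClient) {
        IbatisBatchItemWriter<SfObject> writer = new IbatisBatchItemWriter<SfObject>();
        writer.setSqlMapClient(sqlMapClient);
        writer.setStatementId("insertCompoundEmployeeData");
        // Each item issues several inserts (one per person plus one per
        // email), so disable the one-BatchResult-per-item assertion.
        writer.setAssertUpdates(false);
        return writer;
    }
}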