I save the Product, which cascade-persists the ProductMaterial. However, when the ProductMaterial insert throws a DataIntegrityViolationException, the Product is rolled back as well, which suggests the cascade runs in one transaction, but I can't find any docs saying that it does. Can someone clarify this for me?
NOTE: I DO NOT use @Transactional.
Material material = new Material();
material.setId(1); // id 1 does not exist in the material table (see the FK error below)

Product newProduct = new Product();
ProductMaterial productMaterial = new ProductMaterial();
newProduct.setName("bàn chải");
newProduct.setPrice(1000);
newProduct.setCreatedAt(new Date());
newProduct.setProductMaterials(Collections.singletonList(productMaterial));
productMaterial.setProduct(newProduct);
productMaterial.setMaterial(material);

productRepository.save(newProduct); // cascades the ProductMaterial insert
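The entity mappings are not shown in the question, but given the described behaviour the Product side presumably carries a persist cascade, something like this sketch (the exact cascade setting is an assumption):

// Assumed mapping (not shown in the question): the cascade on this
// association is what makes productRepository.save(newProduct) insert
// the ProductMaterial rows as well.
@OneToMany(mappedBy = "product", cascade = CascadeType.ALL)
private List<ProductMaterial> productMaterials;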
Here is the Hibernate execution log:
Hibernate:
/* insert com.vietnam.hanghandmade.entities.Product
*/ insert
into
product
(created_at, name, price, id)
values
(?, ?, ?, ?)
2020-11-10 14:55:38.281 TRACE 65729 --- [nio-8080-exec-2] o.h.type.descriptor.sql.BasicBinder : binding parameter [1] as [TIMESTAMP] - [Tue Nov 10 14:55:38 JST 2020]
2020-11-10 14:55:38.281 TRACE 65729 --- [nio-8080-exec-2] o.h.type.descriptor.sql.BasicBinder : binding parameter [2] as [VARCHAR] - [bàn chải]
2020-11-10 14:55:38.281 TRACE 65729 --- [nio-8080-exec-2] o.h.type.descriptor.sql.BasicBinder : binding parameter [3] as [INTEGER] - [1000]
2020-11-10 14:55:38.281 TRACE 65729 --- [nio-8080-exec-2] o.h.type.descriptor.sql.BasicBinder : binding parameter [4] as [OTHER] - [e5729490-a0f8-48e7-9600-eeeba8b8f279]
Hibernate:
/* insert com.vietnam.hanghandmade.entities.ProductMaterial
*/ insert
into
product_material
(material_id, product_id)
values
(?, ?)
2020-11-10 14:55:38.324 TRACE 65729 --- [nio-8080-exec-2] o.h.type.descriptor.sql.BasicBinder : binding parameter [1] as [INTEGER] - [1]
2020-11-10 14:55:38.324 TRACE 65729 --- [nio-8080-exec-2] o.h.type.descriptor.sql.BasicBinder : binding parameter [2] as [OTHER] - [e5729490-a0f8-48e7-9600-eeeba8b8f279]
2020-11-10 14:55:38.328 WARN 65729 --- [nio-8080-exec-2] o.h.engine.jdbc.spi.SqlExceptionHelper : SQL Error: 0, SQLState: 23503
2020-11-10 14:55:38.328 ERROR 65729 --- [nio-8080-exec-2] o.h.engine.jdbc.spi.SqlExceptionHelper : ERROR: insert or update on table "product_material" violates foreign key constraint "product_material_material_id_fkey"
Detail: Key (material_id)=(1) is not present in table "material".
NOTE: This answer missed the point of the question, which is about “cascading persist” – it talks about “cascading delete” for foreign keys.
The cascading delete or update is part of the action of the system trigger that implements foreign key constraints, and as such it runs in the same transaction as the triggering statement.
I cannot find a place in the fine manual that spells this out, but it is obvious if you think about it: if the cascading delete were run in a separate transaction, it would be possible that the delete succeeds and the cascading delete fails, which would render the database inconsistent and is consequently not an option.
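As for the JPA side of the question: even without an explicit @Transactional, Spring Data JPA runs save() inside a transaction, because the default repository implementation, SimpleJpaRepository, declares one itself. Paraphrased from that class (a sketch of the idea, not the exact source):

// SimpleJpaRepository is annotated @Transactional(readOnly = true), and
// save() overrides that with a read-write transaction. The persist below
// cascades to ProductMaterial inside that same transaction, so a failing
// child insert rolls back the parent insert too.
@Transactional
@Override
public <S extends T> S save(S entity) {
    if (entityInformation.isNew(entity)) {
        entityManager.persist(entity); // cascading persist happens here
        return entity;
    }
    return entityManager.merge(entity);
}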
I have a Java + Maven project - https://github.com/petersuchy/mongock-test-java - based on the Mongock reactive example (https://github.com/mongock/mongock-examples/tree/master/mongodb/springboot-reactive), and everything works well.
I tried to migrate that project to Kotlin + Gradle - https://github.com/petersuchy/mongock-test-kotlin.
I am able to run it, but my ChangeUnit is ignored. Mongock is set up properly, because in the end the two collections are created - mongockLock and mongockChangeLog.
I tried to get rid of the @Value annotations in MongockConfig and MongoClientConfig, but there was no change in behaviour.
Can you please point out why this is happening? I think it may have something to do with these Reflections lines, because they are the only difference in the logs.
Kotlin:
2023-02-12T00:49:58.455+01:00 INFO 80854 --- [ main] i.m.r.c.e.system.SystemUpdateExecutor : Mongock has finished the system update execution
2023-02-12T00:49:58.457+01:00 INFO 80854 --- [ main] org.reflections.Reflections : Reflections took 0 ms to scan 0 urls, producing 0 keys and 0 values
2023-02-12T00:49:58.458+01:00 INFO 80854 --- [ main] org.reflections.Reflections : Reflections took 1 ms to scan 0 urls, producing 0 keys and 0 values
2023-02-12T00:49:58.458+01:00 INFO 80854 --- [ main] i.m.r.c.e.o.migrate.MigrateExecutorBase : Mongock skipping the data migration. There is no change set item.
Java:
2023-02-12T00:29:48.064+01:00 INFO 78548 --- [ main] i.m.r.c.e.system.SystemUpdateExecutor : Mongock has finished the system update execution
2023-02-12T00:29:48.072+01:00 INFO 78548 --- [ main] org.reflections.Reflections : Reflections took 6 ms to scan 1 urls, producing 1 keys and 2 values
2023-02-12T00:29:48.075+01:00 INFO 78548 --- [ main] org.reflections.Reflections : Reflections took 3 ms to scan 1 urls, producing 1 keys and 2 values
2023-02-12T00:29:48.081+01:00 INFO 78548 --- [ main] i.m.driver.core.lock.LockManagerDefault : Mongock trying to acquire the lock
Here is the full log from the Kotlin project:
2023-02-12T00:49:55.863+01:00 INFO 80854 --- [ main] c.e.m.MongockTestKotlinApplicationKt : No active profile set, falling back to 1 default profile: "default"
2023-02-12T00:49:56.764+01:00 INFO 80854 --- [ main] .s.d.r.c.RepositoryConfigurationDelegate : Bootstrapping Spring Data Reactive MongoDB repositories in DEFAULT mode.
2023-02-12T00:49:57.019+01:00 INFO 80854 --- [ main] .s.d.r.c.RepositoryConfigurationDelegate : Finished Spring Data repository scanning in 246 ms. Found 1 Reactive MongoDB repository interfaces.
2023-02-12T00:49:57.919+01:00 INFO 80854 --- [ main] org.mongodb.driver.client : MongoClient with metadata {"driver": {"name": "mongo-java-driver|reactive-streams|spring-boot", "version": "4.8.2"}, "os": {"type": "Linux", "name": "Linux", "architecture": "amd64", "version": "5.15.0-60-generic"}, "platform": "Java/Private Build/17.0.5+8-Ubuntu-2ubuntu122.04"} created with settings MongoClientSettings{readPreference=primary, writeConcern=WriteConcern{w=null, wTimeout=null ms, journal=null}, retryWrites=true, retryReads=true, readConcern=ReadConcern{level=null}, credential=null, streamFactoryFactory=NettyStreamFactoryFactory{eventLoopGroup=io.netty.channel.nio.NioEventLoopGroup#631cb129, socketChannelClass=class io.netty.channel.socket.nio.NioSocketChannel, allocator=PooledByteBufAllocator(directByDefault: true), sslContext=null}, commandListeners=[], codecRegistry=ProvidersCodecRegistry{codecProviders=[ValueCodecProvider{}, BsonValueCodecProvider{}, DBRefCodecProvider{}, DBObjectCodecProvider{}, DocumentCodecProvider{}, CollectionCodecProvider{}, IterableCodecProvider{}, MapCodecProvider{}, GeoJsonCodecProvider{}, GridFSFileCodecProvider{}, Jsr310CodecProvider{}, JsonObjectCodecProvider{}, BsonCodecProvider{}, EnumCodecProvider{}, com.mongodb.Jep395RecordCodecProvider#3d20e575]}, clusterSettings={hosts=[localhost:27017], srvServiceName=mongodb, mode=SINGLE, requiredClusterType=UNKNOWN, requiredReplicaSetName='null', serverSelector='null', clusterListeners='[]', serverSelectionTimeout='30000 ms', localThreshold='30000 ms'}, socketSettings=SocketSettings{connectTimeoutMS=10000, readTimeoutMS=0, receiveBufferSize=0, sendBufferSize=0}, heartbeatSocketSettings=SocketSettings{connectTimeoutMS=10000, readTimeoutMS=10000, receiveBufferSize=0, sendBufferSize=0}, connectionPoolSettings=ConnectionPoolSettings{maxSize=100, minSize=0, maxWaitTimeMS=120000, maxConnectionLifeTimeMS=0, maxConnectionIdleTimeMS=0, maintenanceInitialDelayMS=0, maintenanceFrequencyMS=60000, connectionPoolListeners=[], maxConnecting=2}, serverSettings=ServerSettings{heartbeatFrequencyMS=10000, minHeartbeatFrequencyMS=500, serverListeners='[]', serverMonitorListeners='[]'}, sslSettings=SslSettings{enabled=false, invalidHostNameAllowed=false, context=null}, applicationName='null', compressorList=[], uuidRepresentation=JAVA_LEGACY, serverApi=null, autoEncryptionSettings=null, contextProvider=null}
2023-02-12T00:49:57.968+01:00 INFO 80854 --- [ main] i.m.r.core.builder.RunnerBuilderBase : Mongock runner COMMUNITY version[5.2.2]
2023-02-12T00:49:57.970+01:00 INFO 80854 --- [ main] i.m.r.core.builder.RunnerBuilderBase : Running Mongock with NO metadata
2023-02-12T00:49:58.034+01:00 INFO 80854 --- [localhost:27017] org.mongodb.driver.cluster : Monitor thread successfully connected to server with description ServerDescription{address=localhost:27017, type=REPLICA_SET_PRIMARY, state=CONNECTED, ok=true, minWireVersion=0, maxWireVersion=17, maxDocumentSize=16777216, logicalSessionTimeoutMinutes=30, roundTripTimeNanos=62515465, setName='myReplicaSet', canonicalAddress=mongo1:27017, hosts=[mongo3:27017, mongo2:27017, mongo1:27017], passives=[], arbiters=[], primary='mongo1:27017', tagSet=TagSet{[]}, electionId=7fffffff0000000000000002, setVersion=1, topologyVersion=TopologyVersion{processId=63e7c5a7d11b71e048698dab, counter=6}, lastWriteDate=Sun Feb 12 00:49:53 CET 2023, lastUpdateTimeNanos=45870970528894}
2023-02-12T00:49:58.336+01:00 INFO 80854 --- [ main] org.reflections.Reflections : Reflections took 33 ms to scan 1 urls, producing 2 keys and 2 values
2023-02-12T00:49:58.343+01:00 INFO 80854 --- [ main] org.reflections.Reflections : Reflections took 2 ms to scan 1 urls, producing 2 keys and 2 values
2023-02-12T00:49:58.367+01:00 INFO 80854 --- [ main] i.m.driver.core.lock.LockManagerDefault : Mongock trying to acquire the lock
2023-02-12T00:49:58.400+01:00 INFO 80854 --- [ main] i.m.driver.core.lock.LockManagerDefault : Mongock acquired the lock until: Sun Feb 12 00:50:58 CET 2023
2023-02-12T00:49:58.401+01:00 INFO 80854 --- [ Thread-1] i.m.driver.core.lock.LockManagerDefault : Starting mongock lock daemon...
2023-02-12T00:49:58.404+01:00 INFO 80854 --- [ main] i.m.r.c.e.system.SystemUpdateExecutor : Mongock starting the system update execution id[2023-02-12T00:49:57.955733372-712]...
2023-02-12T00:49:58.408+01:00 INFO 80854 --- [ main] i.m.r.c.executor.ChangeLogRuntimeImpl : method[io.mongock.runner.core.executor.system.changes.SystemChangeUnit00001] with arguments: []
2023-02-12T00:49:58.411+01:00 INFO 80854 --- [ main] i.m.r.c.executor.ChangeLogRuntimeImpl : method[beforeExecution] with arguments: [io.mongock.driver.mongodb.reactive.repository.MongoReactiveChangeEntryRepository]
2023-02-12T00:49:58.413+01:00 INFO 80854 --- [ main] i.m.r.core.executor.ChangeExecutorBase : APPLIED - {"id"="system-change-00001_before", "type"="before-execution", "author"="mongock", "class"="SystemChangeUnit00001", "method"="beforeExecution"}
2023-02-12T00:49:58.425+01:00 INFO 80854 --- [ main] i.m.r.c.executor.ChangeLogRuntimeImpl : method[execution] with arguments: [io.mongock.driver.mongodb.reactive.repository.MongoReactiveChangeEntryRepository]
2023-02-12T00:49:58.429+01:00 INFO 80854 --- [ main] i.m.r.core.executor.ChangeExecutorBase : APPLIED - {"id"="system-change-00001", "type"="execution", "author"="mongock", "class"="SystemChangeUnit00001", "method"="execution"}
2023-02-12T00:49:58.447+01:00 INFO 80854 --- [ main] i.m.driver.core.lock.LockManagerDefault : Mongock waiting to release the lock
2023-02-12T00:49:58.447+01:00 INFO 80854 --- [ main] i.m.driver.core.lock.LockManagerDefault : Mongock releasing the lock
2023-02-12T00:49:58.455+01:00 INFO 80854 --- [ main] i.m.driver.core.lock.LockManagerDefault : Mongock released the lock
2023-02-12T00:49:58.455+01:00 INFO 80854 --- [ main] i.m.r.c.e.system.SystemUpdateExecutor : Mongock has finished the system update execution
2023-02-12T00:49:58.457+01:00 INFO 80854 --- [ main] org.reflections.Reflections : Reflections took 0 ms to scan 0 urls, producing 0 keys and 0 values
2023-02-12T00:49:58.458+01:00 INFO 80854 --- [ main] org.reflections.Reflections : Reflections took 1 ms to scan 0 urls, producing 0 keys and 0 values
2023-02-12T00:49:58.458+01:00 INFO 80854 --- [ main] i.m.r.c.e.o.migrate.MigrateExecutorBase : Mongock skipping the data migration. There is no change set item.
2023-02-12T00:49:58.458+01:00 INFO 80854 --- [ main] i.m.r.c.e.o.migrate.MigrateExecutorBase : Mongock has finished
2023-02-12T00:49:59.190+01:00 INFO 80854 --- [ main] o.s.b.web.embedded.netty.NettyWebServer : Netty started on port 8080
2023-02-12T00:49:59.201+01:00 INFO 80854 --- [ main] c.e.m.MongockTestKotlinApplicationKt : Started MongockTestKotlinApplicationKt in 4.086 seconds (process running for 4.773)
The problem is in your application.yaml. The migration-scan-package name is wrong. That's the reason Mongock doesn't find any ChangeUnit.
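For illustration, the scan package in application.yaml has to match the package that actually contains the ChangeUnit classes; something like this sketch (the package name below is hypothetical):

mongock:
  migration-scan-package:
    - com.example.mongocktest.changeunits   # hypothetical; use the real package of your ChangeUnits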
I have a requirement where we have an entity that references itself, and we want to audit it. Have a look at it below:
@Entity
@Table(name = TableNames.CLIENT)
@EqualsAndHashCode(exclude = "clientContacts")
@Audited
@AuditTable(value = TableNames.CLIENT_HISTORY)
public class Client implements Serializable {

    private static final long serialVersionUID = -2789655782782839286L;

    @Id
    @Column(name = "ID")
    @GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "clientGenerator")
    @SequenceGenerator(name = "clientGenerator", sequenceName = "MEMBERSHIP_CLIENT_SQ", allocationSize = 1)
    private Long id;

    @Column(name = "PARENT_ID")
    private Long parentId;

    @Column(name = "LEGAL_NAME")
    private String legalName;

    @Column(name = "LEI")
    private String lei;

    @Column(name = "CICI")
    private String cici;

    @Column(name = "BIC")
    private String bic;

    @Column(name = "LCH_UNIQUE_ID")
    private String lchUniqueId;

    @Column(name = "SALESFORCE_ID")
    private String salesforceId;

    @Column(name = "FUND_MANAGER")
    private String fundManager;

    @Column(name = "INCORPORATION_COUNTRY_ID")
    private Long incorporationCountryId;

    @Column(name = "FINANCIAL_CATEGORY_ID")
    private Long financialCategoryId;

    @Embedded
    @JsonUnwrapped
    private Address address;

    @OneToMany(mappedBy = "client")
    private Set<ClientContact> clientContacts;

    @OneToOne(fetch = FetchType.LAZY)
    @JoinColumn(name = "PARENT_ID", insertable = false, updatable = false)
    @Getter(onMethod = @__({@JsonIgnore, @Transient}))
    @Audited(targetAuditMode = RelationTargetAuditMode.NOT_AUDITED)
    private Client parent;
}
As you can see, we have a Client reference as parent inside the Client entity itself. We also have the parentId field, which holds the id of the parent if one exists.
Here I mark the relation as NOT_AUDITED, so I would expect the parent to be fetched from the main table rather than the history table. But I noticed that it is querying the history table for the parent as well.
Also, is this the right way to reference the self object? Is there a better way to represent it?
I am using Envers version 5.3.7.Final.
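For reference, a lookup along these lines is the kind of call that produces the queries below (a sketch; the actual calling code is not shown in the question, and entityManager and clientId are assumed):

// Hypothetical audit lookup via the standard Envers API
AuditReader reader = AuditReaderFactory.get(entityManager);
List<Number> revisions = reader.getRevisions(Client.class, clientId);
Client clientAtRevision = reader.find(Client.class, clientId, revisions.get(0));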
Log showing that the data is retrieved from the history table instead of the main table:
Hibernate: select client_his0_.id as id1_39_0_, client_his0_.revision as revision2_39_0_, customrevi1_.id as id1_5_1_, client_his0_.revision_type as revision_type3_39_0_, client_his0_.address_line1 as address_line4_39_0_, client_his0_.address_line2 as address_line5_39_0_, client_his0_.address_line3 as address_line6_39_0_, client_his0_.address_line4 as address_line7_39_0_, client_his0_.address_line5 as address_line8_39_0_, client_his0_.address_postcode as address_postcode9_39_0_, client_his0_.bic as bic10_39_0_, client_his0_.cici as cici11_39_0_, client_his0_.financial_category_id as financial_categor12_39_0_, client_his0_.fund_manager as fund_manager13_39_0_, client_his0_.incorporation_country_id as incorporation_cou14_39_0_, client_his0_.lch_unique_id as lch_unique_id15_39_0_, client_his0_.legal_name as legal_name16_39_0_, client_his0_.lei as lei17_39_0_, client_his0_.parent_id as parent_id18_39_0_, client_his0_.salesforce_id as salesforce_id19_39_0_, customrevi1_.timestamp as timestamp2_5_1_, customrevi1_.username as username3_5_1_ from membership_client_history client_his0_ cross join audit_revision customrevi1_ cross join audit_revision customrevi2_ where client_his0_.revision=customrevi2_.id and client_his0_.id=? and client_his0_.revision=customrevi1_.id order by customrevi2_.timestamp asc
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.type.descriptor.sql.BasicBinder - binding parameter [1] as [BIGINT] - [359]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([id1_39_0_] : [BIGINT]) - [359]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([revision2_39_0_] : [INTEGER]) - [803107]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([id1_5_1_] : [INTEGER]) - [803107]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([revision_type3_39_0_] : [INTEGER]) - [1]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([address_line4_39_0_] : [VARCHAR]) - [null]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([address_line5_39_0_] : [VARCHAR]) - [null]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([address_line6_39_0_] : [VARCHAR]) - [null]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([address_line7_39_0_] : [VARCHAR]) - [null]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([address_line8_39_0_] : [VARCHAR]) - [null]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([address_postcode9_39_0_] : [VARCHAR]) - [null]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([bic10_39_0_] : [VARCHAR]) - [null]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([cici11_39_0_] : [VARCHAR]) - [null]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([financial_categor12_39_0_] : [BIGINT]) - [8]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([fund_manager13_39_0_] : [VARCHAR]) - [null]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([incorporation_cou14_39_0_] : [BIGINT]) - [19]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([lch_unique_id15_39_0_] : [VARCHAR]) - [LCH00000359]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([legal_name16_39_0_] : [VARCHAR]) - [Dupont Denant Contrepartie]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([lei17_39_0_] : [VARCHAR]) - [null]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([parent_id18_39_0_] : [BIGINT]) - [57]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([salesforce_id19_39_0_] : [VARCHAR]) - [0012000000zMKFYAA4]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([incorporation_cou14_39_0_] : [BIGINT]) - [19]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([financial_categor12_39_0_] : [BIGINT]) - [8]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([parent_id18_39_0_] : [BIGINT]) - [57]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([timestamp2_5_1_] : [BIGINT]) - [1562172380000]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.t.descriptor.sql.BasicExtractor - extracted value ([username3_5_1_] : [VARCHAR]) - [risk-portal.schedule]
Hibernate: select client_his0_.id as id1_39_, client_his0_.revision as revision2_39_, client_his0_.revision_type as revision_type3_39_, client_his0_.address_line1 as address_line4_39_, client_his0_.address_line2 as address_line5_39_, client_his0_.address_line3 as address_line6_39_, client_his0_.address_line4 as address_line7_39_, client_his0_.address_line5 as address_line8_39_, client_his0_.address_postcode as address_postcode9_39_, client_his0_.bic as bic10_39_, client_his0_.cici as cici11_39_, client_his0_.financial_category_id as financial_categor12_39_, client_his0_.fund_manager as fund_manager13_39_, client_his0_.incorporation_country_id as incorporation_cou14_39_, client_his0_.lch_unique_id as lch_unique_id15_39_, client_his0_.legal_name as legal_name16_39_, client_his0_.lei as lei17_39_, client_his0_.parent_id as parent_id18_39_, client_his0_.salesforce_id as salesforce_id19_39_ from membership_client_history client_his0_ where client_his0_.revision=(select max(client_his1_.revision) from membership_client_history client_his1_ where client_his1_.revision<=? and client_his0_.id=client_his1_.id) and client_his0_.revision_type<>? and client_his0_.id=?
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.type.descriptor.sql.BasicBinder - binding parameter [1] as [INTEGER] - [803107]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.type.descriptor.sql.BasicBinder - binding parameter [2] as [INTEGER] - [2]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] TRACE o.h.type.descriptor.sql.BasicBinder - binding parameter [3] as [BIGINT] - [57]
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] DEBUG o.s.s.w.h.writers.HstsHeaderWriter - Not injecting HSTS header since it did not match the requestMatcher org.springframework.security.web.header.writers.HstsHeaderWriter$SecureRequestMatcher#2021d30
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] DEBUG o.s.s.w.c.SecurityContextPersistenceFilter - SecurityContextHolder now cleared, as request processing completed
2019-07-12 14:48:12 [http-nio-0.0.0.0-8899-exec-6] ERROR o.a.c.c.C.[.[.[.[dispatcherServlet] - Servlet.service() for servlet [dispatcherServlet] in context with path [/risk-portal] threw exception [Request processing failed; nested exception is javax.persistence.EntityNotFoundException: Unable to find com.lch.grouprisk.riskportal.entity.crud.Client with id 57] with root cause
javax.persistence.EntityNotFoundException: Unable to find com.lch.grouprisk.riskportal.entity.crud.Client with id 57
The following is the query I am using to bind a BigDecimal value in a @Query, but it is failing with an SQL syntax error:
#Query(value="Select f.id,s.student_id,f.feesPaid,f.fees_pending,f.paid_datetime from Fees f inner join Student s where f.feesPaid > :amt")
List<Fees> findFirst3ByFeesPaidGreaterThan( #Param(value = "amt") BigDecimal amt);
The following is the error:
Hibernate: select fees0_.id as col_0_0_, student1_.student_id as col_1_0_, fees0_.fees_paid as col_2_0_, fees0_.fees_pending as col_3_0_, fees0_.paid_datetime as col_4_0_ from fees fees0_ inner join student student1_ on where fees0_.fees_paid>?
2019-05-07 20:06:16.779 WARN 21752 --- [nio-8082-exec-1] o.h.engine.jdbc.spi.SqlExceptionHelper : SQL Error: 1064, SQLState: 42000
2019-05-07 20:06:16.779 ERROR 21752 --- [nio-8082-exec-1] o.h.engine.jdbc.spi.SqlExceptionHelper : You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'where fees0_.fees_paid>500' at line 1
I am able to use a derived query method name, but I wanted to do it using @Query as mentioned above.
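The generated SQL makes the problem visible: inner join student student1_ on where ... has an empty ON clause, because the JPQL joins Student without any join condition. A minimal sketch of a corrected query, assuming Fees has a mapped student association (the association name is an assumption):

// Hypothetical fix: join through the mapped association so Hibernate
// derives the ON clause from the relationship mapping.
@Query("select f from Fees f join f.student s where f.feesPaid > :amt")
List<Fees> findByFeesPaidGreaterThan(@Param("amt") BigDecimal amt);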
I have an issue with my native query.
I've got:
#Query(value="SELECT * from orders where orders.house in ((:houseArray))", nativeQuery = true)
List<Order> findByHouseId(#Param("houseArray") List<Long> houseArray);
And when I try to execute it, I get the following:
2017-04-18 14:19:49,736 DEBUG org.hibernate.SQL: SELECT * from orders where orders.house in ((?, ?, ?, ?, ?))
2017-04-18 14:19:49,737 TRACE o.h.t.d.s.BasicBinder: binding parameter [2] as [BIGINT] - [4]
2017-04-18 14:19:49,737 TRACE o.h.t.d.s.BasicBinder: binding parameter [3] as [BIGINT] - [5]
2017-04-18 14:19:49,737 TRACE o.h.t.d.s.BasicBinder: binding parameter [1] as [BIGINT] - [3]
2017-04-18 14:19:49,737 TRACE o.h.t.d.s.BasicBinder: binding parameter [4] as [BIGINT] - [6]
2017-04-18 14:19:49,737 TRACE o.h.t.d.s.BasicBinder: binding parameter [5] as [BIGINT] - [7]
2017-04-18 14:19:49,738 ERROR o.h.e.j.s.SqlExceptionHelper: ERROR: operator does not exist: bigint = record
Hint: No operator matches the given name and argument type(s). You might need to add explicit type casts.
Position: 49
2017-04-18 14:19:49,756 ERROR o.a.c.c.C.[.[.[.[dispatcherServlet]: Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Request processing failed; nested exception is javax.persistence.PersistenceException: org.hibernate.exception.SQLGrammarException: could not extract ResultSet] with root cause
org.postgresql.util.PSQLException: ERROR: operator does not exist: bigint = record
Hint: No operator matches the given name and argument type(s). You might need to add explicit type casts.
However, if I run the following query in console:
SELECT * from orders where orders.house in (1,15,2,4,5,3,6,7);
it returns the proper list of orders.
How can I fix this?
Try removing one set of parentheses from ((:houseArray)) so it looks like this:
#Query(value="SELECT * from orders where orders.house in (:houseArray)", nativeQuery = true)
List<Order> findByHouseId(#Param("houseArray") List<Long> houseArray);
(value, value, value) is a record, so when you write column in ((value, value, value)) you compare the column against a record rather than against a list of values.
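This is easy to reproduce directly in PostgreSQL (a minimal illustration):

-- A parenthesized, comma-separated list is a row constructor (a record):
SELECT 1 WHERE 3 IN ((1, 2, 3)); -- ERROR: operator does not exist: integer = record
SELECT 1 WHERE 3 IN (1, 2, 3);   -- fine: returns one row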
I am using IbatisBatchItemWriter to write a complex object into multiple tables.
Here is what my object looks like:
public class SfObject {
    protected List<Person> person;
}

public class Person {
    protected String personId;
    protected XMLGregorianCalendar dateOfBirth;
    protected String countryOfBirth;
    protected String regionOfBirth;
    protected String placeOfBirth;
    protected String birthName;
    protected XMLGregorianCalendar dateOfDeath;
    protected XMLGregorianCalendar lastModifiedOn;
    protected List<EmailInformation> emailInformation;
}

public class EmailInformation {
    protected String emailType;
    protected String emailAddress;
    protected XMLGregorianCalendar lastModifiedOn;
}
And here is my iBATIS configuration to insert the above objects:
<insert id="insertCompoundEmployeeData" parameterClass="com.domain.SfObject">
<iterate property="person">
insert into E_Person_Info
(person_id,
person_birth_dt,
person_country_of_birth,
person_region_of_birth,
person_place_of_birth,
person_birth_name,
person_death_dt,
last_modified_on
)
values (#person[].personId#,
#person[].dateOfBirth#,
#person[].countryOfBirth#,
#person[].regionOfBirth#,
#person[].placeOfBirth#,
#person[].birthName#,
#person[].dateOfDeath#,
#person[].lastModifiedOn#
);
<iterate property="person[].emailInformation">
insert into E_Email_Info
(email_info_person_id,
email_info_email_type,
email_info_email_address,
last_modified_on
)
values (#person[].personId#,
#person[].emailInformation[].emailType#,
#person[].emailInformation[].emailAddress#,
#person[].emailInformation[].lastModifiedOn#
);
</iterate>
</iterate>
</insert>
I am not sure whether I can use the above config to insert data into more than one table, but when I executed the above code I got the below error for a batch of 10 records. By the way, email information is not mandatory, so it may be null in some Person objects.
Stacktrace
[08.08.2014 17:30:07] DEBUG: WebservicePagingItemReader.doRead() - Reading page 0
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=1
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=2
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=3
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=4
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=5
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=6
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=7
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=8
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.executeInternal() - Repeat operation about to start at count=9
[08.08.2014 17:30:09] DEBUG: RepeatTemplate.isComplete() - Repeat is complete according to policy and result value.
[08.08.2014 17:30:09] DEBUG: IbatisBatchItemWriter.write() - Executing batch with 10 items.
[08.08.2014 17:30:09] DEBUG: SqlMapClientTemplate.execute() - Opened SqlMapSession [com.ibatis.sqlmap.engine.impl.SqlMapSessionImpl#168afdd] for iBATIS operation
[08.08.2014 17:30:10] DEBUG: Connection.debug() - {conn-100000} Connection
[08.08.2014 17:30:10] DEBUG: SqlMapClientTemplate.execute() - Obtained JDBC Connection [Transaction-aware proxy for target Connection from DataSource [org.springframework.jdbc.datasource.DriverManagerDataSource#8eae04]] for iBATIS operation
[08.08.2014 17:30:10] DEBUG: Connection.debug() - {conn-100000} Preparing Statement: insert into E_Person_Info (person_id, person_birth_dt, person_country_of_birth, person_region_of_birth, person_place_of_birth, person_birth_name, person_death_dt, last_modified_on ) values (?, ?, ?, ?, ?, ?, ?, ? );
[08.08.2014 17:30:10] DEBUG: Connection.debug() - {conn-100000} Preparing Statement: insert into E_Person_Info (person_id, person_birth_dt, person_country_of_birth, person_region_of_birth, person_place_of_birth, person_birth_name, person_death_dt, last_modified_on ) values (?, ?, ?, ?, ?, ?, ?, ? ); insert into E_Email_Info (email_info_person_id, email_info_email_type, email_info_email_address, last_modified_on ) values (?, ?, ?, ? );
[08.08.2014 17:30:10] DEBUG: Connection.debug() - {conn-100000} Preparing Statement: insert into E_Person_Info (person_id, person_birth_dt, person_country_of_birth, person_region_of_birth, person_place_of_birth, person_birth_name, person_death_dt, last_modified_on ) values (?, ?, ?, ?, ?, ?, ?, ? );
[08.08.2014 17:30:10] DEBUG: TaskletStep.doInChunkContext() - Applying contribution: [StepContribution: read=10, written=0, filtered=0, readSkips=0, writeSkips=0, processSkips=0, exitStatus=EXECUTING]
[08.08.2014 17:30:10] DEBUG: TaskletStep.doInChunkContext() - Rollback for Exception: org.springframework.dao.InvalidDataAccessResourceUsageException: Batch execution returned invalid results. Expected 1 but number of BatchResult objects returned was 3
[08.08.2014 17:30:10] DEBUG: DataSourceTransactionManager.processRollback() - Initiating transaction rollback
[08.08.2014 17:30:10] DEBUG: DataSourceTransactionManager.doRollback() - Rolling back JDBC transaction on Connection [net.sourceforge.jtds.jdbc.ConnectionJDBC3#190d8e1]
[08.08.2014 17:30:10] DEBUG: DataSourceTransactionManager.doCleanupAfterCompletion() - Releasing JDBC Connection [net.sourceforge.jtds.jdbc.ConnectionJDBC3#190d8e1] after transaction
[08.08.2014 17:30:10] DEBUG: DataSourceUtils.doReleaseConnection() - Returning JDBC Connection to DataSource
[08.08.2014 17:30:10] DEBUG: RepeatTemplate.doHandle() - Handling exception: org.springframework.dao.InvalidDataAccessResourceUsageException, caused by: org.springframework.dao.InvalidDataAccessResourceUsageException: Batch execution returned invalid results. Expected 1 but number of BatchResult objects returned was 3
[08.08.2014 17:30:10] DEBUG: RepeatTemplate.executeInternal() - Handling fatal exception explicitly (rethrowing first of 1): org.springframework.dao.InvalidDataAccessResourceUsageException: Batch execution returned invalid results. Expected 1 but number of BatchResult objects returned was 3
[08.08.2014 17:30:10] ERROR: AbstractStep.execute() - Encountered an error executing the step
org.springframework.dao.InvalidDataAccessResourceUsageException: Batch execution returned invalid results. Expected 1 but number of BatchResult objects returned was 3
at org.springframework.batch.item.database.IbatisBatchItemWriter.write(IbatisBatchItemWriter.java:140)
at org.springframework.batch.core.step.item.SimpleChunkProcessor.writeItems(SimpleChunkProcessor.java:156)
at org.springframework.batch.core.step.item.SimpleChunkProcessor.doWrite(SimpleChunkProcessor.java:137)
at org.springframework.batch.core.step.item.SimpleChunkProcessor.write(SimpleChunkProcessor.java:252)
at org.springframework.batch.core.step.item.SimpleChunkProcessor.process(SimpleChunkProcessor.java:178)
at org.springframework.batch.core.step.item.ChunkOrientedTasklet.execute(ChunkOrientedTasklet.java:74)
at org.springframework.batch.core.step.tasklet.TaskletStep$2.doInChunkContext(TaskletStep.java:268)
at org.springframework.batch.core.scope.context.StepContextRepeatCallback.doInIteration(StepContextRepeatCallback.java:76)
at org.springframework.batch.repeat.support.RepeatTemplate.getNextResult(RepeatTemplate.java:367)
at org.springframework.batch.repeat.support.RepeatTemplate.executeInternal(RepeatTemplate.java:215)
at org.springframework.batch.repeat.support.RepeatTemplate.iterate(RepeatTemplate.java:143)
at org.springframework.batch.core.step.tasklet.TaskletStep.doExecute(TaskletStep.java:242)
at org.springframework.batch.core.step.AbstractStep.execute(AbstractStep.java:198)
at org.springframework.batch.core.job.AbstractJob.handleStep(AbstractJob.java:348)
at org.springframework.batch.core.job.flow.FlowJob.access$0(FlowJob.java:1)
at org.springframework.batch.core.job.flow.FlowJob$JobFlowExecutor.executeStep(FlowJob.java:135)
at org.springframework.batch.core.job.flow.support.state.StepState.handle(StepState.java:60)
at org.springframework.batch.core.job.flow.support.SimpleFlow.resume(SimpleFlow.java:144)
at org.springframework.batch.core.job.flow.support.SimpleFlow.start(SimpleFlow.java:124)
at org.springframework.batch.core.job.flow.FlowJob.doExecute(FlowJob.java:103)
at org.springframework.batch.core.job.AbstractJob.execute(AbstractJob.java:250)
at org.springframework.batch.core.launch.support.SimpleJobLauncher$1.run(SimpleJobLauncher.java:110)
at org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:48)
at org.springframework.batch.core.launch.support.SimpleJobLauncher.run(SimpleJobLauncher.java:105)
at com.CtrlMPojoForBatch.initiateSpringBatchProcess(CtrlMPojoForBatch.java:92)
at com.CtrlMPojoForBatch.main(CtrlMPojoForBatch.java:33)
[08.08.2014 17:30:10] WARN: CustomStepExecutionListner.afterStep() - Failure occured executing the step readWriteExchagnerateConversionData
[08.08.2014 17:30:10] WARN: CustomStepExecutionListner.afterStep() - Initiating the rollback operation...
[08.08.2014 17:30:10] WARN: CustomStepExecutionListner.afterStep() - Rollback completed!
Assuming you're using the IbatisBatchItemWriter provided in Spring Batch (it has been deprecated in favor of the writers provided by the MyBatis project), set the assertUpdates property to false. This will prevent Spring Batch from verifying that exactly one update was made per item.
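The three BatchResult objects in the log line up with the three distinct prepared statements above: iBATIS starts a new batch statement whenever the generated SQL changes, and items with zero, one, or more email rows generate different SQL. A sketch of the writer configuration with the assertion disabled (the bean id and sqlMapClient reference are assumptions; the statement id is taken from the question):

<bean id="personItemWriter"
      class="org.springframework.batch.item.database.IbatisBatchItemWriter">
    <property name="sqlMapClient" ref="sqlMapClient"/>
    <property name="statementId" value="insertCompoundEmployeeData"/>
    <!-- accept more than one BatchResult per chunk -->
    <property name="assertUpdates" value="false"/>
</bean>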