javax.persistence.TransactionRequiredException: Executing an update/delete query - Spring Boot migration from 1.5.4 to 2.7.5 (spring-data-jpa)

I am migrating a Spring Boot app from 1.5.4 to 2.7.5. In the old code, Hibernate is used directly at the DAO layer. (Did the migration cause this issue? I could not find anything specific about it.)
The properties file has the connection details and uses the key below:
spring.jpa.properties.hibernate.current_session_context_class=org.springframework.orm.hibernate5.SpringSessionContext
The application class has the following annotations:
@EnableTransactionManagement @EnableJpaRepositories(basePackages = {"repository package"}) @ComponentScan(basePackages = {"base package"}) @EntityScan(basePackages = {"entity package"})
DAO code:
```
private EntityManager entityManager;

private Session getSession() {
    return entityManager.unwrap(Session.class);
}

// problematic code
public void doSomething() {
    String hqlBuilder = "Update Query statement";
    Query query = getSession().createQuery(hqlBuilder);
    int result = query.executeUpdate();
}
```
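For reference, here is the same update expressed through the injected JPA EntityManager, without unwrapping the Session (a minimal sketch; the HQL string and method name are the question's placeholders). It is not necessarily the fix for the exception: javax.persistence.Query.executeUpdate() also requires an active transaction.
```
// Hedged sketch: plain-JPA equivalent of the update above, assuming entityManager
// is the container-managed EntityManager injected into the DAO.
public void doSomethingWithJpa() {
    int result = entityManager
            .createQuery("Update Query statement") // placeholder HQL from the question
            .executeUpdate();                      // still needs an active transaction
}
```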
Service code:
The service class has @Transactional defined at class level; however, it throws an exception when the doSomething() method is invoked on the DAO instance.
**Caused by: javax.persistence.TransactionRequiredException: Executing an update/delete query**
at org.hibernate.internal.AbstractSharedSessionContract.checkTransactionNeededForUpdateOperation(AbstractSharedSessionContract.java:445)
at org.hibernate.query.internal.AbstractProducedQuery.executeUpdate(AbstractProducedQuery.java:1692)
at <removed the package path>.dao.impl.LastViewedHistoryElementDaoHibernate.insertOrUpdate(LastViewedHistoryElementDaoHibernate.java:122)
at impl.LastViewedHistoryElementDaoHibernate$$FastClassBySpringCGLIB$$f8fba493.invoke(<generated>)
at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:218)
at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.invokeJoinpoint(CglibAopProxy.java:793)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163)
at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:763)
at org.springframework.dao.support.PersistenceExceptionTranslationInterceptor.invoke(PersistenceExceptionTranslationInterceptor.java:137)
... 271 common frames omitted
**When the DAO method returns, it throws TransactionRequiredException: no transaction is in progress.**
Thank you. I have applied @EnableTransactionManagement and @Transactional and added the Spring JPA dependency. I also configured a transaction manager in the config, but no luck:
```
@Autowired
HikariDataSource dataSource;

@Bean
public LocalSessionFactoryBean sessionFactory() {
    LocalSessionFactoryBean sessionFactory = new LocalSessionFactoryBean();
    sessionFactory.setDataSource(dataSource);
    sessionFactory.setPackagesToScan(new String[]{"<My packages>"});
    //sessionFactory.setHibernateProperties(hibernateProperties());
    return sessionFactory;
}

@Bean
public PlatformTransactionManager transactionManager() {
    HibernateTransactionManager hibernateTransactionManager = new HibernateTransactionManager();
    hibernateTransactionManager.setSessionFactory(sessionFactory().getObject());
    return hibernateTransactionManager;
}
```
But I still get the same error:
Caused by: javax.persistence.TransactionRequiredException: no transaction is in progress
at org.hibernate.internal.AbstractSharedSessionContract.checkTransactionNeededForUpdateOperation(AbstractSharedSessionContract.java:445)
at org.hibernate.internal.SessionImpl.checkTransactionNeededForUpdateOperation(SessionImpl.java:3496)
at org.hibernate.internal.SessionImpl.doFlush(SessionImpl.java:1399)
at org.hibernate.internal.SessionImpl.flush(SessionImpl.java:1394)
at org.springframework.orm.hibernate5.SessionFactoryUtils.flush(SessionFactoryUtils.java:113)
at org.springframework.orm.hibernate5.SpringSessionSynchronization.beforeCommit(SpringSessionSynchronization.java:95)
at org.springframework.transaction.support.TransactionSynchronizationUtils.triggerBeforeCommit(TransactionSynchronizationUtils.java:97)
at org.springframework.transaction.support.AbstractPlatformTransactionManager.triggerBeforeCommit(AbstractPlatformTransactionManager.java:916)
at org.springframework.transaction.support.AbstractPlatformTransactionManager.processCommit(AbstractPlatformTransactionManager.java:727)
at org.springframework.transaction.support.AbstractPlatformTransactionManager.commit(AbstractPlatformTransactionManager.java:711)
at org.springframework.transaction.interceptor.TransactionAspectSupport.commitTransactionAfterReturning(TransactionAspectSupport.java:654)
at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:407)
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:119)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186)
at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:763)
at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:708)
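One thing worth checking (a hedged sketch, not a confirmed fix for the question above): with a hand-configured HibernateTransactionManager, the Session the DAO uses has to come from the same SessionFactory that the transaction manager manages, and @Transactional can be pointed at that manager explicitly by bean name. The class and bean names below are illustrative.
```
// Illustrative only: binds @Transactional to the "transactionManager" bean defined
// above and obtains the Session from the SessionFactory that manager controls.
@Service
@Transactional(transactionManager = "transactionManager")
public class SomeService {

    private final SessionFactory sessionFactory;

    public SomeService(SessionFactory sessionFactory) {
        this.sessionFactory = sessionFactory;
    }

    public void doSomething() {
        // getCurrentSession() only works while a Spring-managed transaction
        // is active on this SessionFactory
        sessionFactory.getCurrentSession()
                .createQuery("Update Query statement") // placeholder HQL from the question
                .executeUpdate();
    }
}
```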


Resources must not be null in MultiResourceItemReader

I am developing a Spring Batch job that first downloads files from an S3 bucket to my local machine (using a Tasklet) and then reads those local files with MultiResourceItemReader and loads them into a work table.
The Tasklet runs first and the files are read in the next step, so the input files are available.
But when I try to run the process, presumably because of how the beans are configured, it throws the error below: The resources must not be null.
I am not sure how to handle this. Once the tasklet has completed, the files are available, but not before.
Error:
**Caused by: java.lang.IllegalArgumentException: The resources must not be null**
at org.springframework.util.Assert.notNull(Assert.java:201) ~[spring-core-5.3.3.jar:5.3.3]
at org.springframework.batch.item.file.MultiResourceItemReader.setResources(MultiResourceItemReader.java:246) ~[spring-batch-infrastructure-4.3.1.jar:4.3.1]
at com.cspprovemerald.SpringBatchApplication.ItemReader.FileItemReader.providerMultiResourceItemReader(FileItemReader.java:38) ~[classes/:na]
at com.cspprovemerald.SpringBatchApplication.Config.JobStepBuilderConfig.step2(JobStepBuilderConfig.java:64) ~[classes/:na]
at com.cspprovemerald.SpringBatchApplication.Config.JobStepBuilderConfig.job(JobStepBuilderConfig.java:110) ~[classes/:na]
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[na:na]
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:na]
at java.base/java.lang.reflect.Method.invoke(Method.java:566) ~[na:na]
at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:154) ~[spring-beans-5.3.3.jar:5.3.3]
... 38 common frames omitted
MultiResourceItemReader:
```
@Component
public class FileItemReader {

    @Value("${local.file.download.path}")
    private String localFileDownloadPath;

    private static final Logger LOGGER = LoggerFactory.getLogger(FileItemReader.class);

    // MultiResourceItemReader to read multiple files sequentially
    public MultiResourceItemReader<Provider> providerMultiResourceItemReader() {
        String locationPattern = "C:/Users/Desktop/data/in/*.csv";
        Resource[] resources = null;
        ResourcePatternResolver patternResolver = new PathMatchingResourcePatternResolver();
        try {
            resources = patternResolver.getResources(locationPattern);
        } catch (IOException e) {
            e.printStackTrace();
        }
        MultiResourceItemReader<Provider> multiResourceItemReader = new MultiResourceItemReader<>();
        multiResourceItemReader.setResources(resources);
        multiResourceItemReader.setDelegate(providerItemReader());
        return multiResourceItemReader;
    }
}
```
JobStepBuilderConfig.java:
```
@Component
@EnableBatchProcessing
public class JobStepBuilderConfig {

    @Autowired
    JobBuilderFactory jobBuilderFactory;

    @Autowired
    StepBuilderFactory stepBuilderFactory;

    @Autowired
    DataSource datasource;

    @Autowired
    JdbcItemWriter jdbcItemWriter;

    @Autowired
    JdbcItemReader jdbcItemReader;

    @Autowired
    FileItemReader fileItemReader;

    @Autowired
    FileItemWriter fileItemWriter;

    @Autowired
    TaskletSPExecutor taskletSPExecutor;

    @Autowired
    TaskletS3DownloadFiles taskletS3DownloadFiles;

    public Step step1() {
        // step 1 : Read records from custom table and call stored procedure to update facets table
        return stepBuilderFactory.get("step1S3ListCopyFiles")
                .tasklet(taskletS3DownloadFiles)
                .build();
    }

    public Step step2() {
        // step 2 : Read csv files and dump it into a custom table
        return stepBuilderFactory.get("step2ReadLoadCSV")
                .<Provider, Provider>chunk(1000)
                .reader(fileItemReader.providerMultiResourceItemReader())
                .writer(jdbcItemWriter.providerJdbcBatchItemWriter())
                .build();
    }

    @Bean
    public Job job() {
        return jobBuilderFactory.get("jobCSProvMI4275")
                .start(step1())
                .next(step2())
                .incrementer(new RunIdIncrementer())
                .build();
    }
}
```
This is because the item reader is created eagerly when the Spring application context is created. At this time, the file is not downloaded yet, hence the error. Spring Batch provides a custom bean scope called the Step scope. This scope allows you to define beans that should be created at runtime only when required.
In your case, you need to make your item reader step-scoped. This means the item reader bean will be created only when the chunk-oriented step requires it (i.e. after the tasklet has downloaded the file). Here is an example:
```
@Bean
@StepScope
public MultiResourceItemReader<Provider> providerMultiResourceItemReader() {
    // configure your reader here
}
```
You can find more details about the Step scope in the documentation here.
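As a fuller illustration (a sketch only, reusing the Provider type and the providerItemReader() delegate from the question, and assuming the local.file.download.path property points at the download directory), the step-scoped reader would resolve its resources only when the step starts, after the download tasklet has run:
```
// Hedged sketch of a step-scoped reader; resource resolution happens at step
// start time, so the files downloaded by the previous tasklet step are visible.
@Bean
@StepScope
public MultiResourceItemReader<Provider> providerMultiResourceItemReader(
        @Value("${local.file.download.path}") String localFileDownloadPath) throws IOException {
    Resource[] resources = new PathMatchingResourcePatternResolver()
            .getResources("file:" + localFileDownloadPath + "/*.csv");
    MultiResourceItemReader<Provider> reader = new MultiResourceItemReader<>();
    reader.setResources(resources);
    reader.setDelegate(providerItemReader()); // the delegate reader from the question
    return reader;
}
```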

How to enable spring cloud sleuth in unit tests with MockMvc

We have a Spring Boot REST API (Spring Boot 2.3.0.RELEASE) that uses Spring Cloud Sleuth (version 2.2.3.RELEASE).
At some point, we use the trace id from Spring Sleuth as data. The trace id is fetched by autowiring the Tracer bean and then accessing the current span. Let's say we defined a bean SimpleCorrelationBean with:
```
@Autowired
private Tracer tracer;

public String getCorrelationId() {
    return tracer.currentSpan().context().traceIdString();
}
```
This seems to work perfectly when running the Spring Boot application, but when we try to access tracer.currentSpan() in the unit tests, it is null. It looks like Spring Cloud Sleuth is not creating any span while running tests.
I think it has something to do with the application context that is set up during the unit test, but I don't know how to enable Spring Cloud Sleuth for the test application context.
Below is a simple test class where the error occurs in simpleTest1. In simpleTest2, no error occurs.
simpleTest1 errors because tracer.currentSpan() is null
```
@ExtendWith({ RestDocumentationExtension.class, SpringExtension.class })
@SpringBootTest(classes = MusicService.class)
@WebAppConfiguration
@ActiveProfiles("unit-test")
@ComponentScan(basePackageClasses = datacast2.data.JpaConfig.class)
public class SimpleTest {

    private static final Logger logger = LoggerFactory.getLogger(SimpleTest.class);

    @Autowired
    private WebApplicationContext context;

    private MockMvc mockMvc;

    @Autowired
    private FilterChainProxy springSecurityFilterChain;

    @Autowired
    private SimpleCorrelationBean simpleCorrelationBean;

    @Autowired
    private Tracer tracer;

    @BeforeEach
    public void setup(RestDocumentationContextProvider restDocumentation) throws Exception {
        this.mockMvc = MockMvcBuilders.webAppContextSetup(this.context)
                .apply(documentationConfiguration(restDocumentation))
                .addFilter(springSecurityFilterChain).build();
    }

    @Test
    public void simpleTest1() throws Exception {
        try {
            String correlationId = simpleCorrelationBean.getCorrelationId();
        } catch (Exception e) {
            logger.error("This seem to fail.", e);
        }
    }

    @Test
    public void simpleTest2() throws Exception {
        // It looks like spring cloud sleuth is not creating a span, so we create one ourselves
        Span newSpan = this.tracer.nextSpan().name("simpleTest2");
        try (Tracer.SpanInScope ws = this.tracer.withSpanInScope(newSpan.start())) {
            String correlationId = simpleCorrelationBean.getCorrelationId();
        } finally {
            newSpan.finish();
        }
    }
}
```
The question is: how do I enable Spring Cloud Sleuth for MockMvc during unit tests?
The issue here is that MockMvc is created manually instead of relying on auto-configuration. In some cases a custom MockMvc configuration may be necessary, but at least with my version of Spring Boot (2.7.6) there is no need to configure MockMvc manually, even though I use Spring Security and Spring Security Test. I could not figure out how to enable tracing when configuring MockMvc manually, though.
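A minimal sketch of what the auto-configured setup could look like, assuming the MusicService application class and unit-test profile from the question; the endpoint path is purely illustrative, and whether a span is visible still depends on Sleuth's servlet filter being registered in the test context:
```
// Hedged sketch: let Spring Boot build MockMvc (including registered filters)
// instead of calling MockMvcBuilders.webAppContextSetup(...) by hand.
@SpringBootTest(classes = MusicService.class)
@AutoConfigureMockMvc
@ActiveProfiles("unit-test")
public class AutoConfiguredMockMvcTest {

    @Autowired
    private MockMvc mockMvc; // auto-configured, security filter chain applied

    @Test
    public void tracedRequest() throws Exception {
        // "/some-endpoint" is a placeholder; with Sleuth's filter in place a span
        // is active while the request is handled, so code reading
        // tracer.currentSpan() inside the controller sees a non-null span
        mockMvc.perform(MockMvcRequestBuilders.get("/some-endpoint"))
                .andExpect(MockMvcResultMatchers.status().isOk());
    }
}
```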

Running Spring Batch test doesn't initialize database

I am trying to create some end-to-end tests for a Spring Batch application that otherwise works great. I get an SQL error because the Spring Batch metadata tables are not initialized: org.postgresql.util.PSQLException: ERROR: relation "batch_job_instance" does not exist
I have this in src/test/resources/application.properties:
```
spring.datasource.initialize=true
spring.datasource.initialization-mode=always
spring.datasource.platform=postgresql
spring.batch.initialize-schema=always
```
This is the same configuration I have in src/main/resources/application.properties, where it works.
This is the code I have for ApplicationTest:
```
@RunWith(SpringRunner.class)
@ContextConfiguration(classes = {
        TestConfiguration.class,
        JobCompletionNotificationListener.class,
        BatchConfiguration.class
})
@SpringBatchTest
public class ApplicationTests {

    @Autowired
    private JobLauncherTestUtils jobLauncherTestUtils;

    @Test
    public void testJob() throws Exception {
        JobExecution jobExecution = jobLauncherTestUtils.launchJob();
    }
}
```
I have a specific TestConfiguration to create a bean for the DataSource:
```
@Configuration
@PropertySource("application.properties")
public class TestConfiguration {

    @Autowired
    private Environment env;

    @Bean
    public DataSource dataSource() {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName(env.getProperty("spring.datasource.driverClassname"));
        dataSource.setUrl(env.getProperty("spring.datasource.url"));
        dataSource.setUsername(env.getProperty("spring.datasource.username"));
        dataSource.setPassword(env.getProperty("spring.datasource.password"));
        return dataSource;
    }
}
```
I was expecting all tables to be created (internal Batch tables and the tables defined in schema-all.sql).
But I get the following error: org.postgresql.util.PSQLException: ERROR: relation "batch_job_instance" does not exist.
I don't understand why everything works automagically in the main application but not in the test.
If a Spring test is missing the BatchDataSourceInitializer that Spring Boot auto-configures in the actual application, and you don't want to write a full @SpringBootTest, you can selectively add the Spring Batch auto-configuration with the annotation
@ImportAutoConfiguration(BatchAutoConfiguration.class)
This will then provide the initializer for the injected DataSource.
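Put together, the test from the question could look like this (a sketch, reusing the question's classes; spring.batch.initialize-schema=always from the test properties is still needed for the schema scripts to run):
```
// Hedged sketch: same test as in the question, with Spring Batch auto-configuration
// imported so the initializer creates the BATCH_* tables before the job is launched.
@RunWith(SpringRunner.class)
@SpringBatchTest
@ImportAutoConfiguration(BatchAutoConfiguration.class)
@ContextConfiguration(classes = {
        TestConfiguration.class,
        JobCompletionNotificationListener.class,
        BatchConfiguration.class
})
public class ApplicationTests {

    @Autowired
    private JobLauncherTestUtils jobLauncherTestUtils;

    @Test
    public void testJob() throws Exception {
        JobExecution jobExecution = jobLauncherTestUtils.launchJob();
    }
}
```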

Spring Batch with DB2 and MongoDB using Spring Boot

I need to read from a DB2 database and copy the data to MongoDB using Spring Batch. Since I am writing the data to MongoDB, I don't need transactions. I would like to keep the metadata tables in MongoDB only, not in DB2, but I couldn't find metadata table scripts for MongoDB. At server startup, Spring Boot expects the batch_job_instance table in DB2 instead of MongoDB. I annotated the MongoDB template as @Primary, but it still throws an error.
Can someone help me with this? Thanks in advance.
MongoConfig.java:
```
@Configuration
@EnableMongoRepositories("com.test.mongodb")
public class MongoConfig extends AbstractMongoConfiguration {

    private final Logger log = LoggerFactory.getLogger(MongoConfig.class);

    @Value("${spring.data.mongodb.host}")
    private String host;

    @Value("${spring.data.mongodb.port}")
    private Integer port;

    @Value("${spring.data.mongodb.username}")
    private String username;

    @Value("${spring.data.mongodb.database}")
    private String database;

    @Value("${spring.data.mongodb.password}")
    private String password;

    @Bean
    public ValidatingMongoEventListener validatingMongoEventListener() {
        return new ValidatingMongoEventListener(validator());
    }

    @Bean
    public LocalValidatorFactoryBean validator() {
        return new LocalValidatorFactoryBean();
    }

    @Override
    public String getDatabaseName() {
        return database;
    }

    @Override
    @Bean
    public Mongo mongo() throws Exception {
        return new MongoClient(singletonList(new ServerAddress(host, port)),
                singletonList(MongoCredential.createCredential(username, database, password.toCharArray())));
    }

    @Override
    @Bean
    @Primary
    @ConfigurationProperties(prefix = "spring.data.mongodb")
    public MongoTemplate mongoTemplate() throws Exception {
        return new MongoTemplate(mongo(), database);
    }
}
```
application.properties:
```
# DB2
spring.datasource.jndi-name=java:jboss/datasources/Db2XaDsn

# Mongo DB
spring.data.mongodb.host=localhost
spring.data.mongodb.port=27017
spring.data.mongodb.username=admin
spring.data.mongodb.password=admin
spring.data.mongodb.database=test
```
Batch class:
```
@Configuration
@EnableBatchProcessing
public class ItemBatch {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    EntityManagerFactory entityManagerFactory;

    @Autowired
    MongoTemplate mongoTemplate;

    @Bean
    public Job readDB2() {
        return jobBuilderFactory.get("readDB2").start(step1()).build();
    }

    @Bean
    public Step step1() {
        return stepBuilderFactory.get("step1").<com.model.db2.Item, Item>chunk(200).reader(reader())
                .writer(writer()).build();
    }

    @Bean
    public ItemReader<com.model.db2.Item> reader() {
        JpaPagingItemReader<com.model.db2.Item> reader = new JpaPagingItemReader<>();
        reader.setQueryString("select i from Item i");
        reader.setEntityManagerFactory(entityManagerFactory);
        return reader;
    }

    @Bean
    public MongoItemWriter<Item> writer() {
        MongoItemWriter<Item> writer = new MongoItemWriter<>();
        try {
            writer.setTemplate(mongoTemplate);
        } catch (Exception e) {
            e.printStackTrace();
        }
        writer.setCollection("item");
        return writer;
    }
}
```
Error:
00:44:39,484 ERROR [org.jboss.msc.service.fail] (ServerService Thread Pool -- 72) MSC000001: Failed to start service jboss.undertow.deployment.default-server.default-host./itemapi: org.jboss.msc.service.StartException in service jboss.undertow.deployment.default-server.default-host./itemapi: java.lang.RuntimeException: java.lang.IllegalStateException: Failed to execute CommandLineRunner
at org.wildfly.extension.undertow.deployment.UndertowDeploymentService$1.run(UndertowDeploymentService.java:85)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
at org.jboss.threads.JBossThread.run(JBossThread.java:320)
Caused by: java.lang.RuntimeException: java.lang.IllegalStateException: Failed to execute CommandLineRunner
at io.undertow.servlet.core.DeploymentManagerImpl.deploy(DeploymentManagerImpl.java:231)
at org.wildfly.extension.undertow.deployment.UndertowDeploymentService.startContext(UndertowDeploymentService.java:100)
at org.wildfly.extension.undertow.deployment.UndertowDeploymentService$1.run(UndertowDeploymentService.java:82)
... 6 more
Caused by: java.lang.IllegalStateException: Failed to execute CommandLineRunner
at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:735)
at org.springframework.boot.SpringApplication.callRunners(SpringApplication.java:716)
at org.springframework.boot.SpringApplication.afterRefresh(SpringApplication.java:703)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:304)
at org.springframework.boot.web.support.SpringBootServletInitializer.run(SpringBootServletInitializer.java:154)
at org.springframework.boot.web.support.SpringBootServletInitializer.createRootApplicationContext(SpringBootServletInitializer.java:134)
at org.springframework.boot.web.support.SpringBootServletInitializer.onStartup(SpringBootServletInitializer.java:87)
at org.springframework.web.SpringServletContainerInitializer.onStartup(SpringServletContainerInitializer.java:169)
at io.undertow.servlet.core.DeploymentManagerImpl.deploy(DeploymentManagerImpl.java:184)
... 8 more
Caused by: org.springframework.jdbc.BadSqlGrammarException: PreparedStatementCallback; bad SQL grammar [SELECT JOB_INSTANCE_ID, JOB_NAME from BATCH_JOB_INSTANCE where JOB_NAME = ? order by JOB_INSTANCE_ID desc]; nested exception is com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-204, SQLSTATE=42704, SQLERRMC=TEST.BATCH_JOB_INSTANCE, DRIVER=4.18.60
at org.springframework.jdbc.support.SQLErrorCodeSQLExceptionTranslator.doTranslate(SQLErrorCodeSQLExceptionTranslator.java:231)
at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:73)
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:649)
at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:684)
at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:716)
at org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:726)
at org.springframework.batch.core.repository.dao.JdbcJobInstanceDao.getJobInstances(JdbcJobInstanceDao.java:230)
at org.springframework.batch.core.explore.support.SimpleJobExplorer.getJobInstances(SimpleJobExplorer.java:173)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:333)
at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157)
at org.springframework.batch.core.configuration.annotation.SimpleBatchConfiguration$PassthruAdvice.invoke(SimpleBatchConfiguration.java:127)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213)
at com.sun.proxy.$Proxy302.getJobInstances(Unknown Source)
at org.springframework.boot.autoconfigure.batch.JobLauncherCommandLineRunner.getNextJobParameters(JobLauncherCommandLineRunner.java:131)
at org.springframework.boot.autoconfigure.batch.JobLauncherCommandLineRunner.execute(JobLauncherCommandLineRunner.java:212)
at org.springframework.boot.autoconfigure.batch.JobLauncherCommandLineRunner.executeLocalJobs(JobLauncherCommandLineRunner.java:231)
at org.springframework.boot.autoconfigure.batch.JobLauncherCommandLineRunner.launchJobFromProperties(JobLauncherCommandLineRunner.java:123)
at org.springframework.boot.autoconfigure.batch.JobLauncherCommandLineRunner.run(JobLauncherCommandLineRunner.java:117)
at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:732)
... 16 more
Caused by: com.ibm.db2.jcc.am.SqlSyntaxErrorException: DB2 SQL Error: SQLCODE=-204, SQLSTATE=42704, SQLERRMC=TEST.BATCH_JOB_INSTANCE, DRIVER=4.18.60
at com.ibm.db2.jcc.am.kd.a(kd.java:747)
at com.ibm.db2.jcc.am.kd.a(kd.java:66)
at com.ibm.db2.jcc.am.kd.a(kd.java:135)
at com.ibm.db2.jcc.am.bp.c(bp.java:2788)
at com.ibm.db2.jcc.am.bp.d(bp.java:2776)
at com.ibm.db2.jcc.am.bp.a(bp.java:2209)
at com.ibm.db2.jcc.am.cp.a(cp.java:7886)
at com.ibm.db2.jcc.t4.bb.h(bb.java:141)
at com.ibm.db2.jcc.t4.bb.b(bb.java:41)
at com.ibm.db2.jcc.t4.p.a(p.java:32)
at com.ibm.db2.jcc.t4.vb.i(vb.java:145)
at com.ibm.db2.jcc.am.bp.kb(bp.java:2178)
at com.ibm.db2.jcc.am.cp.xc(cp.java:3686)
at com.ibm.db2.jcc.am.cp.b(cp.java:4493)
at com.ibm.db2.jcc.am.cp.kc(cp.java:767)
at com.ibm.db2.jcc.am.cp.executeQuery(cp.java:732)
at org.jboss.jca.adapters.jdbc.WrappedPreparedStatement.executeQuery(WrappedPreparedStatement.java:504)
at org.springframework.jdbc.core.JdbcTemplate$1.doInPreparedStatement(JdbcTemplate.java:692)
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:633)
... 38 more
Answering my own question: I was able to use Spring Batch without the metadata tables using the configuration below:
```
@Configuration
public class SpringBatchInMemoryConfiguration {

    @Bean
    public ResourcelessTransactionManager transactionManager() {
        return new ResourcelessTransactionManager();
    }

    @Bean
    public MapJobRepositoryFactoryBean mapJobRepositoryFactory(ResourcelessTransactionManager txManager)
            throws Exception {
        MapJobRepositoryFactoryBean factory = new MapJobRepositoryFactoryBean(txManager);
        factory.afterPropertiesSet();
        return factory;
    }

    @Bean
    public JobRepository jobRepository(MapJobRepositoryFactoryBean factory) throws Exception {
        return factory.getObject();
    }

    @Bean
    public JobExplorer jobExplorer(MapJobRepositoryFactoryBean factory) {
        return new SimpleJobExplorer(factory.getJobInstanceDao(), factory.getJobExecutionDao(),
                factory.getStepExecutionDao(), factory.getExecutionContextDao());
    }

    @Bean
    public SimpleJobLauncher jobLauncher(JobRepository jobRepository) throws Exception {
        SimpleJobLauncher launcher = new SimpleJobLauncher();
        launcher.setJobRepository(jobRepository);
        launcher.afterPropertiesSet();
        return launcher;
    }
}
```
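For completeness, a hypothetical runner showing how the launcher from the configuration above could start the DB2-to-MongoDB job; the ItemJobRunner class name is illustrative, readDB2 is the Job bean from the question, and the run.id parameter only exists to make each launch unique:
```
// Hedged sketch: launches the job through the Map-based (in-memory) job repository,
// so no BATCH_* metadata tables are required in DB2.
@Component
public class ItemJobRunner implements CommandLineRunner {

    private final JobLauncher jobLauncher;
    private final Job readDB2;

    public ItemJobRunner(JobLauncher jobLauncher, Job readDB2) {
        this.jobLauncher = jobLauncher;
        this.readDB2 = readDB2;
    }

    @Override
    public void run(String... args) throws Exception {
        jobLauncher.run(readDB2, new JobParametersBuilder()
                .addLong("run.id", System.currentTimeMillis())
                .toJobParameters());
    }
}
```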

Configuration of chained transaction manager in SDN4 after migration from SDN3

At the moment I am trying to migrate from SDN3 to SDN4. In my project I use two databases, Neo4j and MySQL, so I end up with a chained transaction manager. However, after the migration I have a problem with its configuration. Before the migration I had this:
```
@Bean(name = "transactionManager")
@Autowired
public PlatformTransactionManager neo4jTransactionManager(
        LocalContainerEntityManagerFactoryBean entityManagerFactory, GraphDatabaseService graphDatabaseService)
        throws Exception {
    JtaTransactionManager neoTransactionManager = new JtaTransactionManagerFactoryBean(graphDatabaseService)
            .getObject();
    neoTransactionManager.setRollbackOnCommitFailure(true);
    neoTransactionManager.setAllowCustomIsolationLevels(true);
    JpaTransactionManager mysqlTransactioNmanager = new JpaTransactionManager(entityManagerFactory.getObject());
    return new ChainedTransactionManager(mysqlTransactioNmanager, neoTransactionManager);
}
```
Now I have something like this:
```
@Bean(name = "transactionManager")
@Autowired
public PlatformTransactionManager neo4jTransactionManager(
        LocalContainerEntityManagerFactoryBean entityManagerFactory, Neo4jTransactionManager neo4jTransactionManager)
        throws Exception {
    Neo4jTransactionManager neoTransactionManager = neo4jTransactionManager;
    JpaTransactionManager mysqlTransactioNmanager = new JpaTransactionManager(entityManagerFactory.getObject());
    return new ChainedTransactionManager(mysqlTransactioNmanager, neoTransactionManager);
}
```
However, the project could not be deployed on the server because of this exception:
Caused by: org.springframework.beans.factory.BeanCreationException: Could not autowire method: public org.springframework.transaction.PlatformTransactionManager com.project.config.ApplicationConfig.neo4jTransactionManager(org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean,org.springframework.data.neo4j.transaction.Neo4jTransactionManager) throws java.lang.Exception; nested exception is org.springframework.beans.factory.NoSuchBeanDefinitionException: No qualifying bean of type [org.springframework.data.neo4j.transaction.Neo4jTransactionManager] found for dependency: expected at least 1 bean which qualifies as autowire candidate for this dependency. Dependency annotations: {}
When the mentioned part of the configuration is commented out, the project deploys properly, but then there is, of course, an exception about a missing transaction when saving to the MySQL database.
How should I configure the chained transaction manager in SDN4? It is hard to find examples, because SDN4 is quite recent, and I really need to run Neo4j in standalone mode, so the migration seems like a good idea.
With this configuration I managed to successfully deploy my application:
```
@Bean(name = "transactionManager")
@Autowired
public PlatformTransactionManager neo4jTransactionManager(
        LocalContainerEntityManagerFactoryBean entityManagerFactory,
        Session session) throws Exception {
    Neo4jTransactionManager neoTransactionManager = new Neo4jTransactionManager(session);
    JpaTransactionManager mysqlTransactioNmanager = new JpaTransactionManager(entityManagerFactory.getObject());
    return new ChainedTransactionManager(mysqlTransactioNmanager, neoTransactionManager);
}
```
I also had to add this element to my config:
```
@Override
@Bean
@Scope(value = "session", proxyMode = ScopedProxyMode.TARGET_CLASS)
public Session getSession() throws Exception {
    return super.getSession();
}
```