Spring Batch conditional flow configuration error - spring-batch

I have an XML-configured step, and I wanted to add batch:next elements in order to get conditional flow in my job:
<batch:step id="stepLoadCashFlows">
<batch:next on="*" to="stepCleanOldTrades" />
<batch:next on="FAILED" to="stepCleanCurrentTradesOnError" />
<batch:tasklet>
<batch:chunk reader="cashFlowItemReader" writer="cashFlowItemWriter"
processor="cashFlowsProcessor" commit-interval="10000" skip-limit="${cds.skip.limit}">
<batch:skippable-exception-classes>
<batch:include class="org.springframework.integration.transformer.MessageTransformationException" />
</batch:skippable-exception-classes>
</batch:chunk>
</batch:tasklet>
<listeners>
<listener ref="cashFlowWriterListener" />
</listeners>
</batch:step>
This gets me the following error:
Caused by: org.springframework.beans.factory.parsing.BeanDefinitionParsingException: Configuration problem: Failed to import bean definitions from URL location [classpath:cpm-batch-main-cds-load.xml]
Offending resource: class path resource [cpm-dml-subscriber-cds-top-level.xml]; nested exception is org.springframework.beans.factory.xml.XmlBeanDefinitionStoreException: Line 80 in XML document from class path resource [cpm-batch-main-cds-load.xml] is invalid; nested exception is org.xml.sax.SAXParseException; lineNumber: 80; columnNumber: 19; cvc-complex-type.2.4.a : Invalid content found starting with element 'batch:tasklet'. One of the following values '{"http://www.springframework.org/schema/batch":next, "http://www.springframework.org/schema/batch":stop, "http://www.springframework.org/schema/batch":end, "http://www.springframework.org/schema/batch":fail, "http://www.springframework.org/schema/batch":listeners}' is expected.
So where should I put these elements (I've tried at the end of the step, inside the tasklet...)?

Transition elements should be placed after the tasklet element. So in your case the following should work:
<batch:step id="stepLoadCashFlows">
<batch:tasklet>
<batch:chunk reader="cashFlowItemReader" writer="cashFlowItemWriter"
processor="cashFlowsProcessor" commit-interval="10000" skip-limit="${cds.skip.limit}">
<batch:skippable-exception-classes>
<batch:include class="org.springframework.integration.transformer.MessageTransformationException" />
</batch:skippable-exception-classes>
</batch:chunk>
<batch:listeners>
<batch:listener ref="cashFlowWriterListener" />
</batch:listeners>
</batch:tasklet>
<batch:next on="*" to="stepCleanOldTrades" />
<batch:next on="FAILED" to="stepCleanCurrentTradesOnError" />
</batch:step>
Note that the listeners element should be inside the tasklet element. Also, the relative order of the next elements does not matter: Spring Batch matches transitions from most specific to least specific, so on="FAILED" takes precedence over the on="*" catch-all.

Related

Spring Batch with Scheduler

I am new to Spring Batch with a scheduler. My task is to read the data from one table and write it into another table.
I have been going through various blogs and tutorials.
I don't know whether there is a direct approach to read from one database table and write into another. I took this approach:
Job 1: Reads the data from the DB using JdbcCursorItemReader and writes it into a txt file using FlatFileItemWriter.
Job 2: Reads the data from the txt file using FlatFileItemReader wrapped in a multiResourceItemReader and writes it into another table using HibernateItemWriter.
I am using a scheduler, and it runs the batch every 20 seconds.
For the first run this works fine. Before the second run (after 20 sec) I update the data in the base table, but the updated data is not written to the file or the target database.
Here is my configuration and code:
package com.cg.schedulers;
public class UserScheduler {
@Autowired
private JobLauncher launcher;
@Autowired
private Job userJob;
@Autowired
private Job userJob2;
private JobExecution execution1, execution2;
public void run() {
try {
execution1 = launcher.run(userJob, new JobParameters());
execution2 = launcher.run(userJob2, new JobParameters());
System.out.println("Execution status: " + execution1.getStatus());
System.out.println("Execution status: " + execution2.getStatus());
} catch (JobExecutionAlreadyRunningException e) {
e.printStackTrace();
} catch (JobRestartException e) {
e.printStackTrace();
} catch (JobInstanceAlreadyCompleteException e) {
e.printStackTrace();
} catch (JobParametersInvalidException e) {
e.printStackTrace();
}
}
}
XML configuration:
<import resource="spring-batch1.xml" />
<import resource="springbatch-database.xml" />
<context:annotation-config/>
<context:component-scan base-package="com.cg"/>
<!-- Reading data from -->
<bean id="itemReader"
class="org.springframework.batch.item.database.JdbcCursorItemReader"
scope="step">
<property name="dataSource" ref="dataSource" />
<property name="sql" value="select UserId, UserName, Password from USER" />
<property name="rowMapper">
<bean class="com.cg.mapper.UserRowMapper" />
</property>
</bean>
<!-- ItemWriter writes a line into output flat file -->
<bean id="flatFileItemWriter" class="org.springframework.batch.item.file.FlatFileItemWriter"
scope="step">
<property name="resource" value="file:csv/User.txt" />
<property name="lineAggregator">
<!-- An Aggregator which converts an object into delimited list of strings -->
<bean
class="org.springframework.batch.item.file.transform.DelimitedLineAggregator">
<property name="delimiter" value="," />
<property name="fieldExtractor">
<!-- Extractor which returns the value of beans property through reflection -->
<bean
class="org.springframework.batch.item.file.transform.BeanWrapperFieldExtractor">
<property name="names" value="userId, username, password" />
</bean>
</property>
</bean>
</property>
</bean>
<!-- ItemReader reads a complete line one by one from input file -->
<bean id="flatFileItemReader" class="org.springframework.batch.item.file.FlatFileItemReader"
scope="step">
<property name="lineMapper">
<bean class="org.springframework.batch.item.file.mapping.DefaultLineMapper">
<property name="fieldSetMapper">
<!-- Mapper which maps each individual items in a record to properties
in POJO -->
<bean class="com.cg.mapper.UserFieldSetMapper" />
</property>
<property name="lineTokenizer">
<!-- A tokenizer class to be used when items in input record are separated
by specific characters -->
<bean
class="org.springframework.batch.item.file.transform.DelimitedLineTokenizer">
<property name="delimiter" value="," />
</bean>
</property>
</bean>
</property>
</bean>
<bean id="multiResourceItemReader"
class="org.springframework.batch.item.file.MultiResourceItemReader">
<property name="resources" value="classpath:csv/User.txt" />
<property name="delegate" ref="flatFileItemReader" />
</bean>
<!-- Optional JobExecutionListener to perform business logic before and
after the job -->
<bean id="jobListener" class="com.cg.support.UserItemListener" />
<!-- Optional ItemProcessor to perform business logic/filtering on the input
records -->
<bean id="itemProcessor1" class="com.cg.support.UserItemProcessor" />
<bean id="itemProcessor2" class="com.cg.support.UserItemProcessor2" />
<!-- ItemWriter which writes data to database -->
<bean id="databaseItemWriter"
class="org.springframework.batch.item.database.HibernateItemWriter">
<property name="sessionFactory" ref="sessionFactory" />
</bean>
<!-- Actual Job -->
<batch:job id="userJob">
<batch:step id="step1">
<batch:tasklet transaction-manager="transactionManager">
<batch:chunk reader="itemReader" writer="flatFileItemWriter"
processor="itemProcessor1" commit-interval="10" />
</batch:tasklet>
</batch:step>
<batch:listeners>
<batch:listener ref="jobListener" />
</batch:listeners>
</batch:job>
<batch:job id="userJob2">
<batch:step id="step2">
<batch:tasklet transaction-manager="transactionManager">
<batch:chunk reader="multiResourceItemReader" writer="databaseItemWriter"
processor="itemProcessor2" commit-interval="10" />
</batch:tasklet>
</batch:step>
</batch:job>
<bean id="myScheduler" class="com.cg.schedulers.UserScheduler"/>
<task:scheduled-tasks>
<task:scheduled ref="myScheduler" method="run" cron="*/20 * * * * *" />
</task:scheduled-tasks>
Please suggest a direct approach if one is possible, using Hibernate.
Execution status: COMPLETED
Thanks,
Vamshi.
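One likely cause is visible in the run() method above: both jobs are launched with an empty JobParameters, so every launch maps to the same JobInstance, and Spring Batch will not re-run a JobInstance that has already completed. A minimal sketch of a run() method that makes each launch a distinct instance (the parameter name run.ts is illustrative):
public void run() {
    try {
        // A fresh timestamp parameter on every launch creates a new
        // JobInstance, so a previously completed run does not block
        // re-execution by the scheduler.
        JobParameters params = new JobParametersBuilder()
                .addLong("run.ts", System.currentTimeMillis())
                .toJobParameters();
        JobExecution execution1 = launcher.run(userJob, params);
        JobExecution execution2 = launcher.run(userJob2, params);
        System.out.println("Execution status: " + execution1.getStatus());
        System.out.println("Execution status: " + execution2.getStatus());
    } catch (JobExecutionException e) {
        // Common superclass of the four launch exceptions caught
        // individually in the original code.
        e.printStackTrace();
    }
}
Also note that the writer writes to file:csv/User.txt while the multiResourceItemReader reads classpath:csv/User.txt; if those resolve to different locations, the second job keeps reading a stale copy, so pointing both at the same resource is probably intended. As for a direct approach: a single job whose step reads with the JdbcCursorItemReader and writes with the HibernateItemWriter is possible; the intermediate file is not required.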

Why does restart of a failed job fail, even though the fail element has been configured?

I am trying to configure restart capability: when the job fails in a certain step, then after fixing the problem it should be able to continue from the failed step on restart. But that is not happening. Please find my code below. What am I missing?
<job id="OneJob" xmlns="http://www.springframework.org/schema/batch" restartable="true">
<listeners>
<listener ref="jobStatusListener" />
</listeners>
<step id="aA" next="bB">
<tasklet>
<chunk commit-interval="1000" reader="o_reader" writer="o_writer" />
</tasklet>
<listeners> <listener ref="stStepListener" /> </listeners>
</step>
<step id="bB" parent="aA">
<tasklet>
<chunk commit-interval="1000" reader="t_reader" writer="t_writer" />
</tasklet>
<fail on="FAILED" exit-code="EARLY TERMINATION"/>
<next on="*" to="cC"/>
<listeners> <listener ref="stStepListener" /> </listeners>
</step>
<step id="cC">
<tasklet>
<chunk commit-interval="1000" reader="th_reader" writer="th_writer" />
</tasklet>
<listeners> <listener ref="stStepListener" /> </listeners>
</step>
</job>
In step 2 it fails (I force a test failure with an incorrect query syntax in the reader) and stores the status as:
BATCH_JOB_EXECUTION.STATUS= FAILED
BATCH_JOB_EXECUTION.EXIT_CODE=EARLY TERMINATION
After fixing the problem, when I restart with the same job parameters, it fails and stores the status as:
BATCH_JOB_EXECUTION.STATUS= FAILED
BATCH_JOB_EXECUTION.EXIT_CODE=FAILED
and I see the exception below:
org.springframework.batch.core.JobExecutionException: Flow execution ended unexpectedly
at org.springframework.batch.core.job.flow.FlowJob.doExecute(FlowJob.java:140)
at org.springframework.batch.core.job.AbstractJob.execute(AbstractJob.java:304)
at org.springframework.batch.core.launch.support.SimpleJobLauncher$1.run(SimpleJobLauncher.java:135)
at org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:50)
at org.springframework.batch.core.launch.support.SimpleJobLauncher.run(SimpleJobLauncher.java:128)
at com.one.batch.oneBatchStarter.main(oneBatchStarter.java:33)
Caused by: org.springframework.batch.core.job.flow.FlowExecutionException: Ended flow=oneJob at state=oneJob.aA with exception
at org.springframework.batch.core.job.flow.support.SimpleFlow.resume(SimpleFlow.java:174)
at org.springframework.batch.core.job.flow.support.SimpleFlow.start(SimpleFlow.java:144)
at org.springframework.batch.core.job.flow.FlowJob.doExecute(FlowJob.java:134)
... 5 more
Caused by: com.thoughtworks.xstream.converters.ConversionException: Cannot construct java.util.Map$Entry : java.util.Map$Entry : Cannot construct java.util.Map$Entry : java.util.Map$Entry
---- Debugging information ----
message : Cannot construct java.util.Map$Entry : java.util.Map$Entry
cause-exception : com.thoughtworks.xstream.converters.reflection.ObjectAccessException
cause-message : Cannot construct java.util.Map$Entry : java.util.Map$Entry
class : java.util.Map$Entry
required-type : java.util.Map$Entry
converter-type : com.thoughtworks.xstream.converters.reflection.ReflectionConverter
path : /map/map/entry
line number : -1
class[1] : java.util.HashMap
converter-type[1] : com.thoughtworks.xstream.converters.collections.MapConverter
version : 1.4.7
-------------------------------
at com.thoughtworks.xstream.core.TreeUnmarshaller.convert(TreeUnmarshaller.java:79)
at com.thoughtworks.xstream.core.AbstractReferenceUnmarshaller.convert(AbstractReferenceUnmarshaller.java:65)
at com.thoughtworks.xstream.core.TreeUnmarshaller.convertAnother(TreeUnmarshaller.java:66)
at com.thoughtworks.xstream.core.TreeUnmarshaller.convertAnother(TreeUnmarshaller.java:50)
at com.thoughtworks.xstream.converters.collections.AbstractCollectionConverter.readItem(AbstractCollectionConverter.java:71)
at com.thoughtworks.xstream.converters.collections.MapConverte
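The stack trace points at execution-context deserialization rather than at the fail/next configuration: on restart, the XStream-based serializer (the default in Spring Batch versions of that era) rebuilds the saved ExecutionContext and cannot construct java.util.Map$Entry. That usually means one of the configured listeners put a complex object (a map entry, or a map of maps) into the job or step ExecutionContext before the failure. A hedged sketch of the restart-safe shape for such a listener (the class name and context key are illustrative):
public class JobStatusListener implements JobExecutionListener {
    @Override
    public void beforeJob(JobExecution jobExecution) {
        // Store only primitives and Strings in the ExecutionContext;
        // complex values such as Map.Entry cannot be rebuilt by the
        // XStream-based serializer on restart.
        jobExecution.getExecutionContext().putString("lastStatus",
                jobExecution.getStatus().toString());
    }
    @Override
    public void afterJob(JobExecution jobExecution) {
        // nothing to clean up
    }
}
Alternatively, the JobRepository can be configured with a different ExecutionContextSerializer (JobRepositoryFactoryBean exposes a serializer property). Either way, a previously failed execution still carries the old serialized context in the BATCH_JOB_EXECUTION_CONTEXT table, so existing failed instances may need that context cleaned up even after the listener is fixed.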

Spring Batch Partitioning - partition step not executing

I am using batch partitioning to fetch and process records based on a range in one of the columns of the DB table. The partition step stagingPartitionStep appears after a few steps in the job configuration, as follows:
<batch:job id="myBatchJob">
<batch:step id="mainStep">
<batch:tasklet task-executor="myTaskExecutor" transaction-manager="batchTransactionManager">
<batch:chunk reader="myDataReader"
processor="MyDataProcessor" writer="MyDataWriter" commit-interval="50">
</batch:chunk>
<batch:listeners>
<batch:listener ref="myItemListener" />
</batch:listeners>
</batch:tasklet>
<batch:next on="COMPLETED" to="checkConditionStep" />
</batch:step>
<batch:step id="checkConditionStep">
<batch:tasklet task-executor="checkConditionExecutor"
transaction-manager="batchTransactionManager" ref="checkConditionTasklet">
</batch:tasklet>
<batch:next on="FAILED" to="updateStagingTableStep" />
<batch:next on="COMPLETED" to="stagingPartitionStep" />
</batch:step>
<batch:step id="updateStagingTableStep">
<batch:tasklet task-executor="checkConditionExecutor"
transaction-manager="batchTransactionManager" ref="updateStagingTasklet">
</batch:tasklet>
</batch:step>
<batch:step id="stagingPartitionStep">
<batch:partition step="processStagingStep" partitioner="stagingProcessPartitioner">
<batch:handler grid-size="10" task-executor="stagingProcessTaskExecutor" />
</batch:partition>
</batch:step>
</batch:job>
The partitioner and the partitioned step:
<bean id="stagingProcessPartitioner"
class="com.mycom.batch.partitioner.StagingProcessPartitioner"
scope="step">
</bean>
<batch:step id="processStagingStep">
<batch:tasklet transaction-manager="batchTransactionManager">
<batch:chunk reader="stagingProcessorDataReader" writer="stagingProcessorDataWriter"
commit-interval="50">
</batch:chunk>
</batch:tasklet>
</batch:step>
The task executors:
<bean id="myTaskExecutor"
class="org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor">
<property name="corePoolSize" value="20" />
<property name="maxPoolSize" value="20" />
</bean>
<bean id="stagingProcessTaskExecutor"
class="org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor">
<property name="corePoolSize" value="10" />
<property name="maxPoolSize" value="10" />
<property name="allowCoreThreadTimeOut" value="true" />
</bean>
<bean id="checkConditionStep"
class="org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor">
<property name="corePoolSize" value="1" />
<property name="maxPoolSize" value="1" />
</bean>
The partitioner implementation: the partitioner creates an ExecutionContext for each partitioned step and puts in a voucher-suffix value, which is used in the JDBC query to partition the data by voucher number.
public class StagingProcessPartitioner implements Partitioner {
private static final Logger LOGGER = LoggerFactory.getLogger(StagingProcessPartitioner.class); // SLF4J logger, missing from the original listing
public Map<String, ExecutionContext> partition(int gridSize) {
Map<String, ExecutionContext> partitionMap = new HashMap<String, ExecutionContext>();
for (int threadId = 0; threadId < gridSize; threadId++) {
ExecutionContext context = new ExecutionContext();
String stepName = "step" + threadId;
context.put("voucherSuffix", threadId);
partitionMap.put(stepName, context);
LOGGER.info("Created ExecutionContext for partioned step : " + stepName);
}
return partitionMap;
}
}
The data reader: the voucherSuffix from the step execution context is used in the JDBC query to create the data partition. Therefore 10 partitions should be created, on voucher numbers ending in 0, 1, 2, ..., 9.
<bean id="stagingProcessorDataReader"
class="org.springframework.batch.item.database.JdbcPagingItemReader"
scope="step">
<property name="dataSource" ref="dataSource" />
<property name="queryProvider" ref="stagingDataQueryProvider" />
<property name="parameterValues">
<map>
<entry key="department" value="#{jobParameters[department]}" />
<entry key="joiningDate" value="#{jobParameters[joiningDate]}" />
<entry key="voucherSuffix" value="#{stepExecutionContext[voucherSuffix]}" />
</map>
</property>
<property name="pageSize" value="1000" />
<property name="rowMapper" ref="myDataRowMapper"/>
</bean>
<bean id="stagingDataQueryProvider"
class="org.springframework.batch.item.database.support.SqlPagingQueryProviderFactoryBean">
<property name="dataSource" ref="dataSource" />
<property name="selectClause" value="SELECT EMP_ID, EMP_NAME, DOB, ADDRESS1, ADDRESS2" />
<property name="fromClause" value="EMP_STG_TBL" />
<property name="whereClause" value="WHERE DEPT_ID=:department AND DOJ=:joiningDate AND VCHR LIKE '%:voucherSuffix'" />
<property name="sortKeys">
<map>
<entry key="EMP_ID" value="ASCENDING"></entry>
</map>
</property>
</bean>
The problem is, when the job is executed, each step runs fine up to the partition step. The partitioner creates the execution contexts, which can be confirmed from the log statements, but the step processStagingStep is not executed and the job finishes with status COMPLETED. Is this job and partition step configuration correct?
Here are the log statements:
2015-02-23 03:03:04 INFO myTaskScheduler-3 SimpleStepHandler:146 - Executing step: [checkConditionStep]
2015-02-23 03:03:04 INFO myTaskScheduler-3 SimpleStepHandler:146 - Executing step: [stagingPartitionStep]
2015-02-23 03:03:04 INFO myTaskScheduler-3 StagingProcessPartitioner:29 - Created ExecutionContext for partioned step : step0
2015-02-23 03:03:04 INFO myTaskScheduler-3 StagingProcessPartitioner:29 - Created ExecutionContext for partioned step : step1
2015-02-23 03:03:04 INFO myTaskScheduler-3 StagingProcessPartitioner:29 - Created ExecutionContext for partioned step : step2
2015-02-23 03:03:04 INFO myTaskScheduler-3 StagingProcessPartitioner:29 - Created ExecutionContext for partioned step : step3
2015-02-23 03:03:04 INFO myTaskScheduler-3 StagingProcessPartitioner:29 - Created ExecutionContext for partioned step : step4
2015-02-23 03:03:04 INFO myTaskScheduler-3 StagingProcessPartitioner:29 - Created ExecutionContext for partioned step : step5
2015-02-23 03:03:04 INFO myTaskScheduler-3 StagingProcessPartitioner:29 - Created ExecutionContext for partioned step : step6
2015-02-23 03:03:04 INFO myTaskScheduler-3 StagingProcessPartitioner:29 - Created ExecutionContext for partioned step : step7
2015-02-23 03:03:04 INFO myTaskScheduler-3 StagingProcessPartitioner:29 - Created ExecutionContext for partioned step : step8
2015-02-23 03:03:04 INFO myTaskScheduler-3 StagingProcessPartitioner:29 - Created ExecutionContext for partioned step : step9
2015-02-23 03:03:05 INFO myTaskScheduler-3 SimpleJobLauncher:136 - Job: [FlowJob: [name=myBatchJob]] completed with the following parameters: [] and the following status: [COMPLETED]
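One detail stands out, offered as a guess: named parameters are not substituted inside SQL string literals, so VCHR LIKE '%:voucherSuffix' compares against the literal text %:voucherSuffix rather than binding the suffix. One way around it is to bind a ready-made pattern from the partitioner (the key name voucherPattern is my own):
// In StagingProcessPartitioner.partition(...), bind a complete LIKE
// pattern instead of interpolating inside a quoted literal.
context.putString("voucherPattern", "%" + threadId);
with the whereClause changed to WHERE DEPT_ID=:department AND DOJ=:joiningDate AND VCHR LIKE :voucherPattern and the parameterValues entry updated to #{stepExecutionContext[voucherPattern]}. It is also worth checking whether the worker step executions appear in BATCH_STEP_EXECUTION; if they do but process no rows, the query rather than the flow wiring is the problem.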

Parallel step execution in Spring Batch

I recently learned that we can run concurrent steps in Spring Batch using parallel steps (http://docs.spring.io/spring-batch/trunk/reference/html/scalability.html, section 7.2). I tried it and it worked.
But when I check the database, a person's record in the table has been updated with another person's information, even though I have made my ItemProcessor synchronized.
The background is simple: the job processes person records from the person table for two departments (200 and 400) and writes them to flat files.
In the flat file I can see that a person from department 200 is written with person information from department 400. Is there anything I need to take care of?
<batch:job id="dept">
<batch:step id="dojStep1" parent="dojMainStep1" next="parallelProcessMatch">
</batch:step>
<batch:split id="parallelProcessMatch" task-executor="taskExecutor">
<batch:flow>
<batch:step id="step200" parent="dojMainStep200" >
</batch:step>
</batch:flow>
<batch:flow>
<batch:step id="step400" parent="dojMainStep400" >
</batch:step>
</batch:flow>
</batch:split>
</batch:job>
<bean id="taskExecutor" class="org.springframework.core.task.SimpleAsyncTaskExecutor"/>
<!-- Start parallelProcessMatch -->
<!-- Start dojMainStep200 -->
<batch:step id="dojMainStep200" abstract="true">
<batch:tasklet>
<batch:chunk commit-interval="1000" reader="dojDbReader200"
processor="dojMatchItemProcessor200" writer="dojClassifierMatchReportWriter200">
<batch:streams>
<batch:stream ref="itemWriterMatch200" />
<batch:stream ref="itemWriterUnMatch200" />
</batch:streams>
</batch:chunk>
</batch:tasklet>
<batch:listeners>
<batch:listener ref="dojMatch200PageHeaderCallback" />
<batch:listener ref="dojUnMatch200PageHeaderCallback" />
<batch:listener ref="dojInPageFooterCallback" />
</batch:listeners>
</batch:step>
<bean id="dojMatchItemProcessor200"
class="com.batchinterface.dept.recordresultsvr.DojMatchItemProccesor"
p:holdingTankDao-ref="holdingTankDao" p:rsdProvider-ref="rsdProvider" p:searchProvider-ref="searchProvider" />
<bean id="dojDbReader200"
class="org.springframework.batch.item.database.StoredProcedureItemReader"
p:dataSource-ref="oracleDataSource" p:rowMapper-ref="dojMatchRowMapper200"
scope="step" p:function="false" p:procedureName="PKG_JOIN.PRC_SELECT"
p:preparedStatementSetter-ref="dojmatchpropertySetter200"
p:refCursorPosition="1">
<property name="parameters">
<list>
<bean class="org.springframework.jdbc.core.SqlOutParameter">
<constructor-arg index="0" value="c1" />
<constructor-arg index="1">
<util:constant static-field="oracle.jdbc.OracleTypes.CURSOR" />
</constructor-arg>
</bean>
<bean class="org.springframework.jdbc.core.SqlParameter">
<constructor-arg index="0" value="dept" />
<constructor-arg index="1">
<util:constant static-field="oracle.jdbc.OracleTypes.VARCHAR" />
</constructor-arg>
</bean>
</list>
</property>
</bean>
<bean id="dojmatchpropertySetter200"
class="com.batchinterface.dept.recordresultsvr.DojPreparedStateSetter">
<property name="dept" value="200" />
</bean>
<bean id="dojMatchRowMapper200"
class="com.batchinterface.dept.recordresultsvr.DojMatchRowMapper" />
<bean id="dojClassifierMatchReportWriter200"
class="org.springframework.batch.item.support.ClassifierCompositeItemWriter"
p:classifier-ref="dojMatchClassifier200">
</bean>
<bean id="dojMatchClassifier200"
class="com.batchinterface.dept.recordresultsvr.DojMatchReportClassifier"
p:itemWriterMatch200-ref="itemWriterMatch200"
p:itemWriterUnMatch200-ref="itemWriterUnMatch200"
p:lastRunDate-ref="LastSuccessfulRunDate200">
</bean>
<!-- End dojMainStep200 -->
<!-- Start dojMainStep400 -->
<batch:step id="dojMainStep400" abstract="true">
<batch:tasklet>
<batch:chunk commit-interval="1000" reader="dojDbReader400"
processor="dojMatchItemProcessor400" writer="dojClassifierMatchReportWriter400">
<batch:streams>
<batch:stream ref="itemWriterMatch400" />
<batch:stream ref="itemWriterUnMatch400" />
</batch:streams>
</batch:chunk>
</batch:tasklet>
<batch:listeners>
<batch:listener ref="dojMatch400PageHeaderCallback" />
<batch:listener ref="dojUnMatch400PageHeaderCallback" />
<batch:listener ref="dojInPageFooterCallback" />
</batch:listeners>
</batch:step>
<bean id="dojMatchItemProcessor400"
class="com.batchinterface.dept.recordresultsvr.DojMatchItemProccesor"
p:holdingTankDao-ref="holdingTankDao" p:rsdProvider-ref="rsdProvider" p:searchProvider-ref="searchProvider" />
<bean id="dojDbReader400"
class="org.springframework.batch.item.database.StoredProcedureItemReader"
p:dataSource-ref="oracleDataSource" p:rowMapper-ref="dojMatchRowMapper400"
scope="step" p:function="false" p:procedureName="PKG_JOIN.PRC_SELECT"
p:preparedStatementSetter-ref="dojmatchpropertySetter400"
p:refCursorPosition="1">
<property name="parameters">
<list>
<bean class="org.springframework.jdbc.core.SqlOutParameter">
<constructor-arg index="0" value="c1" />
<constructor-arg index="1">
<util:constant static-field="oracle.jdbc.OracleTypes.CURSOR" />
</constructor-arg>
</bean>
<bean class="org.springframework.jdbc.core.SqlParameter">
<constructor-arg index="0" value="dept" />
<constructor-arg index="1">
<util:constant static-field="oracle.jdbc.OracleTypes.VARCHAR" />
</constructor-arg>
</bean>
</list>
</property>
</bean>
<bean id="dojmatchpropertySetter400"
class="com.batchinterface.dept.recordresultsvr.DojPreparedStateSetter">
<property name="dept" value="400" />
</bean>
<bean id="dojMatchRowMapper400"
class="com.batchinterface.dept.recordresultsvr.DojMatchRowMapper" />
<bean id="dojClassifierMatchReportWriter400"
class="org.springframework.batch.item.support.ClassifierCompositeItemWriter"
p:classifier-ref="dojMatchClassifier400">
</bean>
<bean id="dojMatchClassifier400"
class="com.batchinterface.dept.recordresultsvr.DojMatchReportClassifier"
p:itemWriterMatch400-ref="itemWriterMatch400"
p:itemWriterUnMatch400-ref="itemWriterUnMatch400"
p:lastRunDate-ref="LastSuccessfulRunDate400">
</bean>
<!-- End dojMainStep400 -->
<!-- End parallelProcessMatch -->
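The classic cause of cross-thread record mixing is shared mutable state: making process() synchronized does not help if the processor, or one of the singleton collaborators both flows share (holdingTankDao, rsdProvider, searchProvider), stashes per-item data in instance fields. A hedged sketch of the stateless shape to aim for (Person, PersonReport, and findMatch are illustrative names, not the actual types):
public class DojMatchItemProccesor implements ItemProcessor<Person, PersonReport> {
    // Injected once at startup; must itself be thread-safe.
    private HoldingTankDao holdingTankDao;
    @Override
    public PersonReport process(Person person) throws Exception {
        // Everything is derived from the method argument and kept in
        // locals, so concurrent flows cannot overwrite each other's data.
        PersonReport report = new PersonReport(person.getId(), person.getDepartment());
        report.setMatch(holdingTankDao.findMatch(person));
        return report;
    }
    public void setHoldingTankDao(HoldingTankDao holdingTankDao) {
        this.holdingTankDao = holdingTankDao;
    }
}
If the collaborators themselves are stateful, scoping the processor beans with scope="step" (as already done for the readers) is another option.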

How to terminate Step within a Spring Batch Split Flow with a Decider

I've happened upon the following design defect in Spring Batch.
A Step must have a Next attribute unless it is the last Step or last Step of a Split Flow.
A Decider block must handle all cases returned by the Decider.
Because of this, in a split flow, where the final step would not have a next attribute, a decider guarding that step must still route every outcome to a target, including the outcome where the step is skipped. So the flow shouldn't need a further target, but it also requires one. Catch-22.
Example:
<!-- Process parallel steps -->
<split id="split01">
<flow>
<step id="step1" next="step02">
<!-- Do something -->
</step>
<step id="step02">
<!-- Do something else -->
</step>
</flow>
<flow>
<step id="step03">
<!-- Do something -->
</step>
<!-- Only run under specific conditions -->
<decision id="decideToRunStep04" decider="isStepNeededDecider" >
<next on="RUN" to="step04"/>
<!-- Other state is "SKIP" -->
</decision>
<step id="step04">
<!-- Conditionally do something-->
</step>
</flow>
</split>
<step id="step05" >
<!-- Some more stuff -->
</step>
This seems like something the Spring guys would have thought of, so curious what the right, non-hack way to achieve this is. Thanks.
Given no answers from anyone on this, I'll proffer the hack that I'm using. It's not pretty, but neither is Spring.
Create a no-op Tasklet to use in a no-op step.
public class NoopTasklet implements Tasklet {
@Override
public RepeatStatus execute(final StepContribution contribution,
final ChunkContext chunkContext) throws Exception {
return RepeatStatus.FINISHED;
}
}
Then add the no-op step to the decision block from the original example:
<!-- Does nothing -->
<bean id="noopTasklet" class="com.foo.NoopTasklet" />
<!-- From the example in the question -->
<decision id="decideToRunStep04" decider="isStepNeededDecider" >
<next on="RUN" to="step04"/>
<next on="SKIP" to="noop01"/>
</decision>
<step id="step04">
<!-- Conditionally do something -->
</step>
<step id="noop01">
<!-- Does nothing in the SKIP case -->
<tasklet ref="noopTasklet" />
</step>
Spring is the prettiest code in town.
That said:
<step id="step1" parent="s1">
<end on="FAILED" />
<next on="COMPLETED WITH SKIPS" to="errorPrint1" />
<next on="*" to="step2" />
</step>
as it is documented at http://docs.spring.io/spring-batch/reference/html/configureStep.html.
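For the split-flow case specifically, the decision element can also end the flow directly, which would avoid the no-op step from the workaround above. A sketch, assuming the SKIP outcome should simply finish that flow successfully:
<decision id="decideToRunStep04" decider="isStepNeededDecider">
<next on="RUN" to="step04"/>
<end on="SKIP"/>
</decision>
Here end marks the flow as finished for the SKIP outcome instead of transitioning to another step.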
In XML:
<batch:decision id="customerDecision" decider="customerDecider">
<batch:next on="FILE_FAILURE" to="fileFailureStep" />
<batch:next on="FILE_GENERATION" to="loadData" />
</batch:decision>
In the CustomerDecider class:
public class CustomerDecider implements JobExecutionDecider {
@Override
public FlowExecutionStatus decide(JobExecution jobExecution, StepExecution stepExecution) {
if (x) { // x stands for your own condition
return new FlowExecutionStatus("FILE_FAILURE");
} else {
return new FlowExecutionStatus("FILE_GENERATION");
}
}
}