I have created a workflow which contains a subprocess with a boundary timer. Here is my code:
<process id="processsub" isExecutable="true">
<startEvent id="start" name="start" activiti:initiator="initiator"></startEvent>
<userTask id="usertask1" name="usertask1" activiti:assignee="${initiator}">
<documentation>this is user task</documentation>
<extensionElements>
<activiti:formProperty id="consult" name="Do u want to consult??" type="enum">
<activiti:value id="true" name="true"></activiti:value>
<activiti:value id="false" name="false"></activiti:value>
</activiti:formProperty>
</extensionElements>
</userTask>
<subProcess id="subpro" name="subpro">
<startEvent id="substart" name="substart"></startEvent>
<userTask id="usertask2" name="usertask2" activiti:candidateGroups="doctor">
<documentation>This is a user task for consultation..</documentation>
</userTask>
<userTask id="usertask3" name="usertask3" activiti:candidateGroups="doctor">
<documentation>This is a user task for prescription</documentation>
</userTask>
<endEvent id="subend" name="subend"></endEvent>
<parallelGateway id="par1" name="par1"></parallelGateway>
<parallelGateway id="par2" name="par2"></parallelGateway>
<sequenceFlow id="flow3" name="flow3" sourceRef="substart" targetRef="par1"></sequenceFlow>
<sequenceFlow id="flow4" name="flow4" sourceRef="par1" targetRef="usertask2"></sequenceFlow>
<sequenceFlow id="flow5" name="flow5" sourceRef="par1" targetRef="usertask3"></sequenceFlow>
<sequenceFlow id="flow6" name="flow6" sourceRef="usertask2" targetRef="par2"></sequenceFlow>
<sequenceFlow id="flow7" name="flow7" sourceRef="usertask3" targetRef="par2"></sequenceFlow>
<sequenceFlow id="flow8" name="flow8" sourceRef="par2" targetRef="subend"></sequenceFlow>
</subProcess>
<endEvent id="end" name="end"></endEvent>
<sequenceFlow id="flow1" name="flow1" sourceRef="start" targetRef="usertask1"></sequenceFlow>
<sequenceFlow id="flow2" name="flow2" sourceRef="usertask1" targetRef="subpro"></sequenceFlow>
<sequenceFlow id="flow9" name="flow9" sourceRef="subpro" targetRef="end"></sequenceFlow>
<boundaryEvent id="sid-FFAB6E5A-5E94-4001-9845-4D481E157F03" attachedToRef="subpro" cancelActivity="true">
<timerEventDefinition>
<timeDuration>PT2M</timeDuration>
</timerEventDefinition>
</boundaryEvent>
<userTask id="usertask4" name="usertask4" activiti:candidateGroups="reviewer"></userTask>
<endEvent id="finalend" name="finalend"></endEvent>
<sequenceFlow id="flow10" name="flow10" sourceRef="sid-FFAB6E5A-5E94-4001-9845-4D481E157F03" targetRef="usertask4"></sequenceFlow>
<sequenceFlow id="flow11" name="flow11" sourceRef="usertask4" targetRef="finalend"></sequenceFlow>
</process>
The timer fires after 2 minutes regardless of whether the subprocess has completed or is only partially completed (i.e. one user task is done and the second is not). What I want is that once the subprocess has started, whether it completes fully or only partially, the timer should not fire.
How can I do this? Please help me.
What you want to do is "cancel" a timer boundary event when a user task is completed. Unfortunately, that is not possible unless you code it via the API.
The best way to do it is:
transform your timer boundary event into a timer intermediate catch event
when a user task is completed, send a "cancel" signal to the timer intermediate catch event
to implement the cancel signal, you have to create an event-based gateway after which you wait for either the "cancel" signal or the timer event
Your process will look like this: http://i.stack.imgur.com/yl9GC.png
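For illustration, the "cancel" signal could be thrown from a task listener when the user task completes. This is only a minimal sketch, not part of the original model: it assumes a signal named "cancelTimer" is defined in the BPMN and referenced by the signal catch event behind the event-based gateway, and that the listener is wired as a Spring bean (e.g. via activiti:delegateExpression) so a RuntimeService can be injected.
import org.activiti.engine.RuntimeService;
import org.activiti.engine.delegate.DelegateTask;
import org.activiti.engine.delegate.TaskListener;
import org.activiti.engine.runtime.Execution;

// Hypothetical "complete" task listener that throws the "cancelTimer" signal,
// waking the signal catch event that races against the timer behind the gateway.
public class CancelTimerTaskListener implements TaskListener {

    private final RuntimeService runtimeService;

    public CancelTimerTaskListener(RuntimeService runtimeService) {
        this.runtimeService = runtimeService;
    }

    @Override
    public void notify(DelegateTask delegateTask) {
        // Look up the execution of this process instance that is currently
        // waiting on the "cancelTimer" signal subscription...
        Execution waiting = runtimeService.createExecutionQuery()
                .processInstanceId(delegateTask.getProcessInstanceId())
                .signalEventSubscriptionName("cancelTimer")
                .singleResult();
        if (waiting != null) {
            // ...and deliver the signal only to that execution, so other
            // running process instances keep their own timers.
            runtimeService.signalEventReceived("cancelTimer", waiting.getId());
        }
    }
}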
I've configured my Spring Batch app below:
<batch:job id="mPortRiskJob">
<batch:step id="mPortRiskStep">
<tasklet throttle-limit="10">
<chunk reader="MPortRiskReader" processor="MPortRiskProcessor" writer="MPortRiskWriter" commit-interval="10"
skip-limit="1">
<batch:skippable-exception-classes>
<include class="com.common.exception.ERDException"/>
</batch:skippable-exception-classes>
</chunk>
<batch:no-rollback-exception-classes>
<include class="com.common.exception.ERDException"/>
</batch:no-rollback-exception-classes>
</tasklet>
</batch:step>
<batch:listeners>
<batch:listener ref="MJobExecutionListener"/>
</batch:listeners>
</batch:job>
In my writer, I have a for-loop that inserts records into the database. The code looks like below:
for (String id : keylist) {
try {
insertRecord(id);
} catch (Exception e) {
throw new ERDException("Failure in write method", e);
}
}
What I want is if, for instance, the first record throws a DuplicateKeyException, for that record to be skipped, and the next record to be inserted. What's happening is, when the ERDException is thrown, Spring Batch retries all the records, including the duplicate. I want it to discard that particular record and insert the others. Is there a way to accomplish that?
Since your exception is thrown in the writer with a commit-interval of 10, Spring Batch retries all 10 records again because it needs to determine which record throws the exception. Once it determines that record, it will skip just that one and process the others.
Please see this post.
For anyone interested, I solved my problem by combining no-rollback-exception-classes with a skip-policy! I probably went a little overboard with defining my own skip policy - one of the out-of-the-box Spring policies would have been fine. See below:
<bean id="skipPolicy" class="com.trp.erd.batch.ERDSkipPolicy">
<batch:job id="midrPortRiskJob">
<batch:step id="midrPortRiskStep">
<tasklet throttle-limit="10">
<chunk reader="MIDRPortRiskReader" processor="MIDRPortRiskProcessor" writer="MIDRPortRiskWriter" commit-interval="10"
skip-limit="1" skip-policy="skipPolicy">
<batch:skippable-exception-classes>
<include class="com.trp.erd.common.exception.ERDException"/>
</batch:skippable-exception-classes>
</chunk>
<batch:no-rollback-exception-classes>
<include class="com.trp.erd.common.exception.ERDException"/>
</batch:no-rollback-exception-classes>
</tasklet>
</batch:step>
<batch:listeners>
<batch:listener ref="MIDRJobExecutionListener"/>
</batch:listeners>
</batch:job>
And the skip policy implementation:
public class ERDSkipPolicy implements SkipPolicy {

    @Override
    public boolean shouldSkip(Throwable t, int skipCount) {
        // only ERDExceptions are skipped; any other exception fails the step
        if (t instanceof ERDException) {
            return true;
        }
        return false;
    }
}
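For comparison, here is a rough sketch (not from the original post) of what wiring one of the out-of-the-box policies could look like instead of the custom class; the helper class name and the skip limit value are placeholders:
import java.util.HashMap;
import java.util.Map;

import org.springframework.batch.core.step.skip.LimitCheckingItemSkipPolicy;
import org.springframework.batch.core.step.skip.SkipPolicy;

import com.trp.erd.common.exception.ERDException;

// Sketch only: LimitCheckingItemSkipPolicy skips ERDException until the given
// limit is reached, which covers what the custom ERDSkipPolicy does above.
public class SkipPolicyExample {

    public static SkipPolicy erdSkipPolicy() {
        Map<Class<? extends Throwable>, Boolean> skippable = new HashMap<Class<? extends Throwable>, Boolean>();
        skippable.put(ERDException.class, true);
        int skipLimit = 10; // placeholder value; choose the limit that fits your job
        return new LimitCheckingItemSkipPolicy(skipLimit, skippable);
    }
}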
I am designing a Spring Batch job which reads multiple CSV files. I have used partitioning to read each file in chunks and process it to decrypt a certain column in the CSV. Before decrypting, if I encounter any validation error, I throw a custom exception.
Now what I want is: if the processing finds any validation error in the first line, the other lines should not be processed, and the job should end.
How can I achieve this? I tried to implement a ProcessorListener too, but it has no StepExecution object, so I cannot call setTerminateOnly() or set ExitStatus=FAILED.
Also note that I have multiple threads accessing the file at different lines. I want to kill all threads in the event of the first encountered error.
Thanks in advance
So, I identified that running multiple asynchronous concurrent threads (Spring Batch partitioning) was the real issue. Although one of the threads threw an exception, the other threads kept running in parallel and finished executing to the end.
At the end, the job FAILED overall and there was no output produced, but it still consumed time processing the rest of the data.
Well, the solution is as simple as it gets: we just need to stop the job when encountering an error during processing.
The Custom Processor
public class MultiThreadedFlatFileItemProcessor implements ItemProcessor<BinFileVO, BinFileVO>, JobExecutionListener {

    private JobExecution jobExecution;
    private RSADecrypter decrypter;

    public RSADecrypter getDecrypter() {
        return decrypter;
    }

    public void setDecrypter(RSADecrypter decrypter) {
        this.decrypter = decrypter;
    }

    /**
     * This method is used to process the encrypted data.
     * @param item
     */
    @Override
    public BinFileVO process(BinFileVO item) throws JobException {
        if (null != item.getEncryptedText() && !item.getEncryptedText().isEmpty()) {
            String decrypted = decrypter.getDecryptedText(item.getEncryptedText());
            if (null != decrypted && !decrypted.isEmpty()) {
                if (decrypted.matches("[0-9]+")) {
                    if (decrypted.length() >= 12 && decrypted.length() <= 19) {
                        item.setEncryptedText(decrypted);
                    } else {
                        // stop the whole job, then fail this item
                        this.jobExecution.stop();
                        throw new JobException(PropertyLoader.getValue(ApplicationConstants.DECRYPTED_CARD_NO_LENGTH_INVALID), item.getLineNumber());
                    }
                }
            } else {
                // empty decryption result: stop the job and fail this item
                this.jobExecution.stop();
                throw new JobException(PropertyLoader.getValue(ApplicationConstants.EMPTY_ENCRYPTED_DATA), item.getLineNumber());
            }
        }
        return item;
    }

    @Override
    public void beforeJob(JobExecution jobExecution) {
        this.jobExecution = jobExecution;
    }

    @Override
    public void afterJob(JobExecution jobExecution) {
    }
}
The Job xml config
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
.....>
<!-- JobRepository and JobLauncher are configuration/setup classes -->
<bean id="jobRepository" class="org.springframework.batch.core.repository.support.MapJobRepositoryFactoryBean" />
<bean id="jobLauncher" class="org.springframework.batch.core.launch.support.SimpleJobLauncher">
<property name="jobRepository" ref="jobRepository" />
</bean>
<!-- Job Details -->
<job id="simpleMultiThreadsReaderJob" xmlns="http://www.springframework.org/schema/batch">
<step id="step" >
<partition step="step1" partitioner="partitioner">
<handler grid-size="5" task-executor="taskExecutor"/>
</partition>
</step>
<listeners>
<listener ref="decryptingItemProcessor"/>
</listeners>
</job>
<step id="step1" xmlns="http://www.springframework.org/schema/batch">
<tasklet>
<chunk reader="itemReader" writer="itemWriter" processor="decryptingItemProcessor" commit-interval="500"/>
<listeners>
<listener ref="customItemProcessorListener" />
</listeners>
</tasklet>
</step>
<!-- Processor Details -->
<bean id="decryptingItemProcessor" class="com.test.batch.io.MultiThreadedFlatFileItemProcessor">
<property name="decrypter" ref="rsaDecrypter" />
</bean>
<!-- RSA Decrypter class -->
<bean id="rsaDecrypter" class="test.batch.secure.rsa.client.RSADecrypter"/>
<!-- Partitioner Details -->
<bean class="org.springframework.batch.core.scope.StepScope" />
<bean id="partitioner" class="com.test.batch.partition.FlatFilePartitioner" scope="step">
<property name="resource" ref="inputFile"/>
</bean>
<bean id="taskExecutor"
class="org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor">
<property name="corePoolSize" value="10"/>
</bean>
<!-- Step will need a transaction manager -->
<bean id="transactionManager" class="org.springframework.batch.support.transaction.ResourcelessTransactionManager" />
........
.................
</beans>
Here are the logs
2016-09-01 06:32:40 INFO SimpleJobRepository:273 - Parent JobExecution is stopped, so passing message on to StepExecution
2016-09-01 06:32:43 INFO ThreadStepInterruptionPolicy:60 - Step interrupted through StepExecution
2016-09-01 06:32:43 INFO AbstractStep:216 - Encountered interruption executing step: Job interrupted status detected.
; org.springframework.batch.core.JobInterruptedException
2016-09-01 06:32:45 ERROR CustomJobListener:163 - exception :At line No. 1 : The decrypted card number is less than 12 or greater than 19 in length
2016-09-01 06:32:45 ERROR CustomJobListener:163 - exception :Job interrupted status detected.
2016-09-01 06:32:45 INFO SimpleJobLauncher:135 - Job: [FlowJob: [name=simpleMultiThreadsReaderJob]] completed with the following parameters: [{outputFile=/usr/local/pos/bulktokenization/csv/outputs/cc_output_EDWError_08162016.csv, partitionFile=/usr/local/pos/bulktokenization/csv/partitions/, inputFile=C:\usr\local\pos\bulktokenization\csv\inputs\cc_input_EDWError_08162016.csv, fileName=cc_input_EDWError_08162016}] and the following status: [FAILED]
2016-09-01 06:32:45 INFO BatchLauncher:122 - Exit Status : FAILED
2016-09-01 06:32:45 INFO BatchLauncher:123 - Time Taken : 8969
If you throw a custom exception in the Processor, Spring Batch will terminate and mark the job as failed unless you set that exception up as 'skippable'. You have not mentioned where you perform the validation step: are you doing it in the Processor or in the Reader? Let me know, because that is where Spring Batch decides.
In my project, when we want to stop the job and throw a custom exception, we put the validation logic in a Tasklet or Processor and throw the exception as below:
private AccountInfoEntity getAccountInfo(Long partnerId) {
if(partnerId != null){
.....
return ....;
} else {
throw new ReportsException("XXXXX");
}
}
I am getting an exception in Spring Batch code that I suspect is due to some bad configuration. First I will give the context and then the problem I am having.
I am using Spring Batch 2.2.6.RELEASE
I have a job defined like this (simplified excerpts of what I consider the relevant parts):
....
<batch:job id="job1">
<batch:step id="step1">
<batch:tasklet ref="myTasklet1"/>
</batch:step>
<batch:step id="step2" >
<batch:tasklet ref="myTasklet2"/>
</batch:step>
<batch:step id="step3">
<batch:tasklet>
<batch:chunk reader="myReader" processor="myProcessor" writer="myCompositeWriter" commit-interval="10" />
</batch:tasklet>
<batch:listeners>
<batch:listener ref="myWriter2" />
</batch:listeners>
</batch:step>
<batch:step id="step4" >
<batch:tasklet ref="myTasklet4"/>
</batch:step>
</batch:job>
...
<bean id="myCompositeWriter " class="org.springframework.batch.item.support.CompositeItemWriter">
<property name="delegates">
<list>
<ref bean="myWriter1" />
<ref bean="myWriter2" />
</list>
</property>
</bean>
<bean id="myWriter2" class="my.test.MyWriter2" scope="step" />
...
The simplified writer2 is as follows:
public class MyWriter2 implements ItemWriter<Object>, StepExecutionListener {
private ExecutionContext jobContext;
@Override
public void beforeStep(StepExecution stepExecution) {
JobExecution jobExecution = stepExecution.getJobExecution();
this.jobContext = jobExecution.getExecutionContext();
}
@Override
public ExitStatus afterStep(StepExecution stepExecution) {
return null;
}
@Override
public void write(List<? extends Object> items) {
try {
// database insertion
} catch (Exception e) {
// add exception to context for later notifications
jobContext.put("writer2_error", e);
}
}
}
Some requirements:
I need to access the jobContext from all tasklets and from writer2. Accessing the jobContext from tasklets is straightforward. The writer2 implements StepExecutionListener and is registered as a listener in step3 to be able to access it.
The writer2 inserts data into a database. This operation may fail, but the job should be allowed to continue, and if everything else works fine the job should end successfully. That is why the insertion exceptions are all caught.
The problem:
If the write operation in writer2 fails, the exception is caught but the job still fails after step3.
In Spring Batch Admin console the steps 1, 2 and 3 statuses are COMPLETED, the step 4 status is NONE, the job status and exit code is FAILED and the next exception is shown:
org.springframework.batch.core.JobExecutionException: Flow execution ended unexpectedly at
org.springframework.batch.core.job.flow.FlowJob.doExecute(FlowJob.java:141) at
org.springframework.batch.core.job.AbstractJob.execute(AbstractJob.java:301) at
org.springframework.batch.core.launch.support.SimpleJobLauncher$1.run(SimpleJobLauncher.java:134) at
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) at
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) at java.lang.Thread.run(Thread.java:745) Caused by:
org.springframework.batch.core.job.flow.FlowExecutionException: Ended flow=job1 at state=step3 with exception at
org.springframework.batch.core.job.flow.support.SimpleFlow.resume(SimpleFlow.java:160) at
org.springframework.batch.core.job.flow.support.SimpleFlow.start(SimpleFlow.java:130) at
org.springframework.batch.core.job.flow.FlowJob.doExecute(FlowJob.java:135)
... 5 more Caused by: java.util.EmptyStackException at
org.codehaus.jettison.util.FastStack.peek(FastStack.java:39) at
org.codehaus.jettison.mapped.MappedXMLStreamWriter.setNewValue(MappedXMLStreamWriter.java:121) at
org.codehaus.jettison.mapped.MappedXMLStreamWriter.makeCurrentJSONObject(MappedXMLStreamWriter.java:113) at
org.codehaus.jettison.mapped.MappedXMLStreamWriter.writeStartElement(MappedXMLStreamWriter.java:241) at
com.thoughtworks.xstream.io.xml.StaxWriter.startNode(StaxWriter.java:162) at
com.thoughtworks.xstream.io.xml.AbstractXmlWriter.startNode(AbstractXmlWriter.java:37) at
com.thoughtworks.xstream.io.WriterWrapper.startNode(WriterWrapper.java:33) at
com.thoughtworks.xstream.io.path.PathTrackingWriter.startNode(PathTrackingWriter.java:44) at
com.thoughtworks.xstream.io.ExtendedHierarchicalStreamWriterHelper.startNode(ExtendedHierarchicalStreamWriterHelper.java:17) at
com.thoughtworks.xstream.converters.collections.AbstractCollectionConverter.writeItem(AbstractCollectionConverter.java:62) at
com.thoughtworks.xstream.converters.collections.MapConverter.marshal(MapConverter.java:57) at
com.thoughtworks.xstream.core.AbstractReferenceMarshaller.convert(AbstractReferenceMarshaller.java:65) at
com.thoughtworks.xstream.core.TreeMarshaller.convertAnother(TreeMarshaller.java:78) at
com.thoughtworks.xstream.core.TreeMarshaller.convertAnother(TreeMarshaller.java:63) at com.thoughtworks.xstream.core.Tree
If no exceptions occur then the job ends successfully.
Any ideas why this could be failing?
Thanks.
I have a simple batch job with a decider. The decider implementation only adds the error to a list and then returns the original FlowExecutionStatus. The parser is failing and exiting instead of failing and executing the decision step. Why? (I know ErrorHandler is not being called because the output never includes the logger output "in decider asdf".) Also, what is the best way to debug the XML? Any assistance would be greatly appreciated.
<batch:job id="testDecider">
<batch:step id="testme" next="testerDecider">
<batch:tasklet>
<batch:chunk reader="csvSLHistoryFileReader"
processor="stationSendStatsCalculator" skip-limit="5" commit-interval="10"
writer="noOpWriter">
<batch:skippable-exception-classes>
<batch:include class="org.springframework.batch.item.file.FlatFileParseException"/>
</batch:skippable-exception-classes>
</batch:chunk>
</batch:tasklet>
</batch:step>
<!-- this commit interval can be 1 because all the input from previous step will be completed and this step will
iterate through the created to-from list -->
<batch:step id="doTallies">
<batch:tasklet>
<batch:chunk reader="stationSendCountsListReader"
processor="passThrough" commit-interval="100000"
writer="stationSubtotalsFileWriter">
</batch:chunk>
</batch:tasklet>
</batch:step>
<batch:step id="tryTesterInput">
<batch:tasklet>
<batch:chunk reader="csvSSHistoryFileReader"
processor="stationSendStatsCalculator" skip-limit="5" commit-interval="10"
writer="noOpWriter">
<batch:skippable-exception-classes>
<batch:include class="org.springframework.batch.item.file.FlatFileParseException"/>
</batch:skippable-exception-classes>
</batch:chunk>
</batch:tasklet>
</batch:step>
<batch:decision id="testerDecider" decider="errorHandler">
<batch:next on="FAILED" to="tryTesterInput"/>
<batch:next on="*" to="doTallies"/>
</batch:decision>
</batch:job>
Error Handler:
public class ErrorHandler implements JobExecutionDecider {
@Resource List<String> errorsList;
private final Log logger = LogFactory.getLog(getClass());
private String badInput = "none";
private int newErrors = 0;
/* (non-Javadoc)
* @see org.springframework.batch.core.job.flow.JobExecutionDecider#decide(org.springframework.batch.core.JobExecution, org.springframework.batch.core.StepExecution)
*/
@Override
public FlowExecutionStatus decide(JobExecution jobExecution,
StepExecution stepExecution) {
logger.info("in decider asdf");
if (stepExecution.getExitStatus().getExitCode().equals("ERRORS")){
if (errorsList.size() > 0){
newErrors = errorsList.size();
logger.info("Errors encountered in previous step: "+ errorsList.size());
return new FlowExecutionStatus ("FAILED");
}
}
return new FlowExecutionStatus(jobExecution.getStatus().toString());
}
}
The exception:
12:47:37,180 [main] INFO ProcessStatisticsTasklet - constructor...
12:47:37,529 [main] INFO RequestsListReader - constructing requestsListReader
12:47:37,662 [main] INFO SimpleJobLauncher - Job: [FlowJob: [name=testDecider]] launched with the following parameters: [{csvHistoryFileName=input/ssXactHistory.csv, csvStationsProfileName=input/bostonStationsProfile.csv, RequestGenerationFileName=input/bostonRequestGeneration.xlsx}]
12:47:37,680 [main] INFO SimpleStepHandler - Executing step: [testme]
12:47:37,759 [main] ERROR AbstractStep - Encountered an error executing step testme in job testDecider
org.springframework.batch.core.step.skip.SkipLimitExceededException: Skip limit of '5' exceeded
at org.springframework.batch.core.step.skip.LimitCheckingItemSkipPolicy.shouldSkip(LimitCheckingItemSkipPolicy.java:133)
at org.springframework.batch.core.step.skip.ExceptionClassifierSkipPolicy.shouldSkip(ExceptionClassifierSkipPolicy.java:70)
at org.springframework.batch.core.step.item.FaultTolerantChunkProvider.shouldSkip(FaultTolerantChunkProvider.java:134)
at org.springframework.batch.core.step.item.FaultTolerantChunkProvider.read(FaultTolerantChunkProvider.java:91)
at org.springframework.batch.core.step.item.SimpleChunkProvider$1.doInIteration(SimpleChunkProvider.java:114)
at org.springframework.batch.repeat.support.RepeatTemplate.getNextResult(RepeatTemplate.java:368)
at org.springframework.batch.repeat.support.RepeatTemplate.executeInternal(RepeatTemplate.java:215)
at org.springframework.batch.repeat.support.RepeatTemplate.iterate(RepeatTemplate.java:144)
at org.springframework.batch.core.step.item.SimpleChunkProvider.provide(SimpleChunkProvider.java:108)
at org.springframework.batch.core.step.item.ChunkOrientedTasklet.execute(ChunkOrientedTasklet.java:69)
at org.springframework.batch.core.step.tasklet.TaskletStep$ChunkTransactionCallback.doInTransaction(TaskletStep.java:395)
at org.springframework.transaction.support.TransactionTemplate.execute(TransactionTemplate.java:130)
at org.springframework.batch.core.step.tasklet.TaskletStep$2.doInChunkContext(TaskletStep.java:267)
at org.springframework.batch.core.scope.context.StepContextRepeatCallback.doInIteration(StepContextRepeatCallback.java:77)
at org.springframework.batch.repeat.support.RepeatTemplate.getNextResult(RepeatTemplate.java:368)
at org.springframework.batch.repeat.support.RepeatTemplate.executeInternal(RepeatTemplate.java:215)
at org.springframework.batch.repeat.support.RepeatTemplate.iterate(RepeatTemplate.java:144)
at org.springframework.batch.core.step.tasklet.TaskletStep.doExecute(TaskletStep.java:253)
at org.springframework.batch.core.step.AbstractStep.execute(AbstractStep.java:195)
at org.springframework.batch.core.job.SimpleStepHandler.handleStep(SimpleStepHandler.java:141)
at org.springframework.batch.core.job.flow.JobFlowExecutor.executeStep(JobFlowExecutor.java:64)
at org.springframework.batch.core.job.flow.support.state.StepState.handle(StepState.java:60)
at org.springframework.batch.core.job.flow.support.SimpleFlow.resume(SimpleFlow.java:151)
at org.springframework.batch.core.job.flow.support.SimpleFlow.start(SimpleFlow.java:130)
at org.springframework.batch.core.job.flow.FlowJob.doExecute(FlowJob.java:135)
at org.springframework.batch.core.job.AbstractJob.execute(AbstractJob.java:301)
at org.springframework.batch.core.launch.support.SimpleJobLauncher$1.run(SimpleJobLauncher.java:134)
at org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:48)
at org.springframework.batch.core.launch.support.SimpleJobLauncher.run(SimpleJobLauncher.java:127)
at org.springframework.batch.core.launch.support.CommandLineJobRunner.start(CommandLineJobRunner.java:351)
at org.springframework.batch.core.launch.support.CommandLineJobRunner.main(CommandLineJobRunner.java:577)
Caused by: org.springframework.batch.item.file.FlatFileParseException: Parsing error at line: 6 in resource=[URL [file:input/ssXactHistory.csv]], input=[12/07/14 19:58:54,12/07/14 19:58:54,12/07/14 20:00:02,0,0,12,54,67,19,0,1,0,1,,,,[157] 20:00:02 Empty Zone:16 Station:165 YK L/D ===> Interzone:110 ===> Zone:5 Station:53 NS 5 W ,Station:165 YK L/D,Station:53 NS 5 W,None,None,,,Zone:16 ,Zone:5 ,]
The exception is pretty clear: the skip-limit of 5 has been hit, which causes the step to fail. When using the <step ... next="someStep"> notation for the next state, the next state is only executed if the step completes successfully (which it has not in your case). Because of that, your decision won't execute. Instead, you need to use the longer form of defining where to go:
<step>
...
<next on="FAILED" to="testerDecider"/>
</step>
I want to pass some information from one step to another step in a batch job. I created a BStepListener where the value is stored into the context, but this same value is not reaching the tasklet [SendMailTasklet] created in another step. What am I missing?
Job configuration
<job id="bJob" xmlns="http://www.springframework.org/schema/batch">
<step id="step1">
<tasklet>
<chunk reader="bReader" writer="bWriter" processor="bProcessor"
commit-interval="10" />
</tasklet>
<batch:next on="COMPLETED" to="sendEmail"/>
<listeners>
<listener ref="bStepListner"/>
<listener ref="bPromotionListener"/>
</listeners>
</step>
<step id="sendEmail">
<tasklet ref="sendMailManager"/>
</step>
</job>
<bean id="bStepListner" class="com.listener.BStepListener" scope="step"/>
<bean id="bPromotionListener" class="org...ExecutionContextPromotionListener">
<property name="keys" value="msg"/>
</bean>
<bean id="sendMailManager" class="com.mail.SendMailTasklet" scope="step">
BStepListener.java
public ExitStatus afterStep(StepExecution stepExecution) {
System.out.println("Step Execution Listener ... after Step");
String message = "A Sample message from step to step";
stepExecution.getExecutionContext().put("msg", message);
return null;
}
SendMailTasklet.java
public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext)
throws Exception {
logger.info("Sending Email service....");
String message = (String)chunkContext.getStepContext().getJobExecutionContext().get("msg");
this.sendMail();
return RepeatStatus.FINISHED;
}
I think (I'd have to double-check the code) that we don't guarantee the order in which listeners are called. Because of that, the promotion listener may be called before yours. Try using the CompositeStepExecutionListener to wrap your list of listeners so that the order is preserved.
You can read more about the CompositeStepExecutionListener here: http://docs.spring.io/spring-batch/trunk/apidocs/org/springframework/batch/core/listener/CompositeStepExecutionListener.html
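For illustration (the listener bean names are taken from the configuration above; the factory class itself is hypothetical), wrapping the two listeners could look roughly like this:
import org.springframework.batch.core.StepExecutionListener;
import org.springframework.batch.core.listener.CompositeStepExecutionListener;

// Sketch only: the composite gives you one explicit place to control the
// registration order of the step listeners instead of relying on the container.
public class OrderedStepListeners {

    public static StepExecutionListener compose(StepExecutionListener bStepListner,
                                                StepExecutionListener bPromotionListener) {
        CompositeStepExecutionListener composite = new CompositeStepExecutionListener();
        // Register in the order you want; check your Spring Batch version's javadoc
        // for the direction in which afterStep callbacks are iterated over this list.
        composite.register(bStepListner);
        composite.register(bPromotionListener);
        return composite;
    }
}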