Footer Callback not being invoked - spring-batch

I have defined a FooterCallback, but the footer info is not being written to the file.
Here is the config and code. Am I missing something?
afterStep gets called and the write count is written to the log, but not to the file.
The job def:
<job id="sodfeed" job-repository="tplJobRepository" xmlns="http://www.springframework.org/schema/batch">
    <step id="readWriteBalances">
        <tasklet>
            <chunk reader="balancesReader" writer="balancesWriter" commit-interval="100" />
            <listeners>
                <listener ref="tplBatchFooterCallback" />
                <listener ref="tplBatchFailureListener" />
            </listeners>
        </tasklet>
    </step>
</job>
public class FooterCallback extends StepExecutionListenerSupport implements FlatFileFooterCallback {

    private StepExecution stepExecution;

    public void writeFooter(Writer writer) throws IOException {
        writer.write("EOF" + stepExecution.getWriteCount());
        System.out.println("**************************EOF" + stepExecution.getWriteCount());
    }

    public ExitStatus afterStep(StepExecution stepExecution) {
        ExitStatus returnStatus = stepExecution.getExitStatus();
        logger.info("Number of records written: " + stepExecution.getWriteCount());
        return returnStatus;
    }
}

Is tplBatchFooterCallback actually injected into your FlatFileItemWriter as its footerCallback? Step listeners and footer callbacks are wired in different ways; registering the bean as a listener alone is not enough.
Look up the official Javadoc.
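To make that concrete: a footer callback has to be set on the FlatFileItemWriter itself, in addition to being registered as a step listener (which is what makes beforeStep/afterStep fire). A minimal XML sketch, assuming the writer bean is the balancesWriter from the config above; the resource path is hypothetical:

```xml
<bean id="balancesWriter" class="org.springframework.batch.item.file.FlatFileItemWriter">
    <property name="resource" value="file:output/balances.txt" /> <!-- hypothetical output path -->
    <property name="lineAggregator">
        <bean class="org.springframework.batch.item.file.transform.PassThroughLineAggregator" />
    </property>
    <!-- the same bean that is registered as a step listener -->
    <property name="footerCallback" ref="tplBatchFooterCallback" />
</bean>
```

With this in place the writer invokes writeFooter when the output stream is closed, while the listener registration keeps the StepExecution available for the write count.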

Related

Spring Batch: MultiResourcePartitioner how to set resources lazily

Question: how to set/inject resources lazily in Java config, the way it is done in the XML config.
We have a Spring Batch program that currently uses XML configuration for uploading multiple files via the MultiResourcePartitioner. This works as intended; see the config below.
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:batch="http://www.springframework.org/schema/batch"
       xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
           http://www.springframework.org/schema/batch http://www.springframework.org/schema/batch/spring-batch-3.0.xsd">

    <job id="fileLoaderJob" xmlns="http://www.springframework.org/schema/batch">
        <step id="moveFiles" next="batchFileUploader">
            <tasklet ref="moveFilesTasklet" />
        </step>
        <step id="batchFileUploader" parent="batchFileUpload:master">
            <next on="*" to="archiveFiles" />
        </step>
        <step id="archiveFiles">
            <batch:tasklet ref="archiveFilesTasklet" />
        </step>
    </job>

    <!-- This tasklet moves the files from, say, the input dir to the work dir -->
    <bean id="moveFilesTasklet" class="com.spring.batch.fileloader.MoveFilesTasklet" scope="step" />

    <step id="batchFileUpload" xmlns="http://www.springframework.org/schema/batch">
        <tasklet>
            <chunk reader="fileReader" writer="fileWriter" commit-interval="10000" />
        </tasklet>
    </step>

    <bean name="batchFileUpload:master" class="org.springframework.batch.core.partition.support.PartitionStep">
        <property name="jobRepository" ref="jobRepository"/>
        <property name="stepExecutionSplitter">
            <bean class="org.springframework.batch.core.partition.support.SimpleStepExecutionSplitter">
                <constructor-arg ref="jobRepository"/>
                <constructor-arg ref="batchFileUpload"/>
                <constructor-arg>
                    <bean class="org.springframework.batch.core.partition.support.MultiResourcePartitioner" scope="step">
                        <property name="resources" ref="fileResources" />
                    </bean>
                </constructor-arg>
            </bean>
        </property>
        <property name="partitionHandler">
            <bean class="org.springframework.batch.core.partition.support.TaskExecutorPartitionHandler">
                <property name="taskExecutor">
                    <bean class="org.springframework.core.task.SimpleAsyncTaskExecutor">
                        <property name="concurrencyLimit" value="5" />
                    </bean>
                </property>
                <property name="step" ref="batchFileUpload"/>
            </bean>
        </property>
    </bean>

    <bean id="fileResources" class="com.spring.batch.fileloader.fileResources" />

    <bean id="fileReader" class="org.springframework.batch.item.file.FlatFileItemReader" scope="step">
        <property name="resource" value="#{stepExecutionContext[fileName]}" />
        <property name="lineMapper">
            <bean class="org.springframework.batch.item.file.mapping.DefaultLineMapper">
                <property name="lineTokenizer">
                    <bean class="org.springframework.batch.item.file.transform.DelimitedLineTokenizer">
                        <property name="delimiter" value="," />
                    </bean>
                </property>
                <property name="fieldSetMapper">
                    <bean class="...fileFieldSetMapper" />
                </property>
            </bean>
        </property>
    </bean>

    <bean id="fileWriter" class="....fileWriter" scope="step" />
    <bean id="archiveFilesTasklet" class="....ArchiveFilesTasklet" scope="step" />
</beans>
This works well. When I try to convert this to Java config, I get the resources as null. Here is my config class.
@Configuration
@EnableBatchProcessing
@ComponentScan(basePackages = {"com.spring.batch.fileloader"})
public class SpringBatchConfig {

    @Autowired
    private JobBuilderFactory jobBuilders;

    @Autowired
    private StepBuilderFactory stepBuilders;

    @Autowired
    private DataSource dataSource;

    @Autowired
    private ResourcePatternResolver resourcePatternResolver;

    @Autowired
    private ReadPropertiesFile properties;

    @Bean
    BatchConfigurer configurer(@Qualifier("dataSource") DataSource dataSource) {
        return new DefaultBatchConfigurer(dataSource);
    }

    @Bean(name = "fileLoaderJob")
    public Job csiAuditFileLoaderJob() throws Exception {
        return jobBuilders.get("csiAuditFileLoaderJob")
                .start(moveFiles())
                .next(batchFileUploader())
                .next(archiveFiles())
                .build();
    }

    @Bean
    public Step moveFiles() {
        return stepBuilders.get("moveFiles")
                .tasklet(new MoveFilesTasklet(properties))
                .build();
    }

    @Bean
    @Lazy
    public Step batchFileUploader() throws Exception {
        return stepBuilders.get("batchFileUploader")
                .partitioner(batchFileUploadStep().getName(), partitioner())
                .step(batchFileUploadStep())
                .taskExecutor(taskExecutor())
                .build();
    }

    @Bean
    public Step archiveFiles() {
        return stepBuilders.get("archiveFiles")
                .tasklet(new ArchiveFilesTasklet(properties))
                .build();
    }

    @Bean
    public Step batchFileUploadStep() {
        return stepBuilders.get("batchFileUploadStep")
                .<MyDomain, MyDomain>chunk(10000)
                .reader(fileReader(null))
                .writer(fileWriter())
                .build();
    }

    @Bean
    @Lazy
    public Partitioner partitioner() throws Exception {
        MultiResourcePartitioner partitioner = new MultiResourcePartitioner();
        Resource[] resources;
        try {
            /*
             * The resources are selected from the path where the previous
             * MoveFilesTasklet moved the files. This returns null because
             * Spring initializes this bean eagerly, before the step is
             * called for execution.
             */
            resources = resourcePatternResolver.getResources("file:" + properties.getPath() + "/*.csv");
        }
        catch (IOException e) {
            throw new RuntimeException("I/O problems when resolving the input file pattern.", e);
        }
        partitioner.setResources(resources);
        return partitioner;
    }

    @Bean
    public TaskExecutor taskExecutor() {
        ThreadPoolTaskExecutor taskExecutor = new ThreadPoolTaskExecutor();
        taskExecutor.setMaxPoolSize(5);
        taskExecutor.afterPropertiesSet();
        return taskExecutor;
    }

    @Bean
    @StepScope
    public FlatFileItemReader<MyDomain> fileReader(
            @Value("#{stepExecutionContext['fileName']}") String filename) {
        FlatFileItemReader<MyDomain> reader = new FlatFileItemReader<>();
        DelimitedLineTokenizer tokenizer = new DelimitedLineTokenizer();
        DefaultLineMapper<MyDomain> lineMapper = new DefaultLineMapper<>();
        lineMapper.setLineTokenizer(tokenizer);
        lineMapper.setFieldSetMapper(new MyFieldSetMapper());
        lineMapper.afterPropertiesSet();
        reader.setLineMapper(lineMapper);
        reader.setResource(new PathResource(filename));
        return reader;
    }

    @Bean
    public ItemWriter<MyDomain> fileWriter() {
        return new FileWriter();
    }

    private JobRepository getJobRepository() throws Exception {
        JobRepositoryFactoryBean jobRepositoryFactoryBean = new JobRepositoryFactoryBean();
        jobRepositoryFactoryBean.setDataSource(dataSource);
        jobRepositoryFactoryBean.setTransactionManager(getTransactionManager());
        jobRepositoryFactoryBean.setDatabaseType("MySql");
        jobRepositoryFactoryBean.afterPropertiesSet();
        return jobRepositoryFactoryBean.getObject();
    }

    private PlatformTransactionManager getTransactionManager() {
        return new ResourcelessTransactionManager();
    }

    @Bean
    public JobLauncher jobLauncher() throws Exception {
        SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
        jobLauncher.setJobRepository(getJobRepository());
        jobLauncher.afterPropertiesSet();
        return jobLauncher;
    }
}
Below might be the issue:
1. No need to call reader.setResource(new PathResource(filename)) inside the FlatFileItemReader bean, as it will override the resource set by the partitioner bean.
2. Use PathMatchingResourcePatternResolver to load the files in the partitioner bean.
Sample code:
ClassLoader cl = this.getClass().getClassLoader();
ResourcePatternResolver resolver = new PathMatchingResourcePatternResolver(cl);
resources = resolver.getResources("file:" + properties.getPath() + "/*.csv");
3. You can also add @StepScope on the partitioner bean (not sure about this).
Hope this helps :)
My 2p with Spring Boot:
@Bean
@StepScope
public Partitioner logsPartitioner(@Value("file:${my.resources.path}/${my.resources.input:*}.csv") Resource[] resources) {
    MultiResourcePartitioner partitioner = new MultiResourcePartitioner();
    partitioner.setResources(resources);
    partitioner.partition(resources.length);
    return partitioner;
}

Spring batch does not start reading the list from the beginning

I am using a task scheduler to run the batch job every 10 seconds. But on every run, the reader begins reading from the 2nd element in the list.
My custom reader:
public class DataReader implements ItemReader<User> {

    @Autowired
    private MainDAO mainDAO;

    private int counter;
    private List<User> userList;

    @PostConstruct
    public void init() {
        this.userList = this.mainDAO.getAllUsers();
    }

    public User read() throws Exception, UnexpectedInputException, ParseException, NonTransientResourceException {
        System.out.println(counter);
        if (counter < userList.size())
            return userList.get(counter++);
        return null;
    }
}
It always starts reading from the 2nd element in the list, even though the counter should be zero when the class is instantiated.
My Run Scheduler:
@Component
public class RunScheduler {

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private Job job;

    public void run() {
        try {
            String dateParameter = new Date().toString();
            JobParameters parameter = new JobParametersBuilder().addString("date", dateParameter).toJobParameters();
            System.out.println(dateParameter);
            JobExecution execution = this.jobLauncher.run(job, parameter);
            System.out.println("Exit status: " + execution.getStatus());
        }
        catch (Exception exception) {
            exception.printStackTrace();
        }
    }
}
My Spring configuration for Spring batch:
<!-- Spring Batch Configuration: Start -->
<bean id="customReader" class="com.arpit.reader.DataReader" />
<bean id="customProcessor" class="com.arpit.processor.DataProcessor" />
<bean id="customWriter" class="com.arpit.writer.DataWriter" />

<batch:job id="invoiceJob">
    <batch:step id="step1">
        <batch:tasklet>
            <batch:chunk reader="customReader" processor="customProcessor" writer="customWriter" commit-interval="1" />
        </batch:tasklet>
    </batch:step>
</batch:job>

<bean id="transactionManager" class="org.springframework.batch.support.transaction.ResourcelessTransactionManager" />
<bean id="jobLauncher" class="org.springframework.batch.core.launch.support.SimpleJobLauncher">
    <property name="jobRepository" ref="jobRepository"/>
</bean>
<bean id="jobRepository" class="org.springframework.batch.core.repository.support.MapJobRepositoryFactoryBean">
    <property name="transactionManager" ref="transactionManager" />
</bean>
<!-- Spring Batch Configuration: End -->

<!-- Task Scheduler Configuration: Start -->
<bean id="runScheduler" class="com.arpit.scheduler.RunScheduler" />
<task:scheduled-tasks>
    <task:scheduled ref="runScheduler" method="run" cron="*/10 * * * * *" />
</task:scheduled-tasks>
<!-- Task Scheduler Configuration: End -->
Can someone please help me figure out what the issue is?
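One thing worth checking, offered as an assumption rather than a confirmed diagnosis: customReader is declared as a singleton bean, so the same DataReader instance, including its counter field, is reused by every scheduled run. Declaring the reader step-scoped gives each job execution a fresh instance:

```xml
<!-- scope="step" creates a new DataReader per step execution,
     so counter starts at 0 on every scheduled run -->
<bean id="customReader" class="com.arpit.reader.DataReader" scope="step" />
```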

Want to get total row count in footer of spring batch without customizing writer(Delegate Pattern)

This is my footer class:
public class SummaryFooterCallback extends StepExecutionListenerSupport implements FlatFileFooterCallback {

    private StepExecution stepExecution;

    @Override
    public void writeFooter(Writer writer) throws IOException {
        writer.write("footer - number of items written: " + stepExecution.getWriteCount());
    }

    @Override
    public void beforeStep(StepExecution stepExecution) {
        this.stepExecution = stepExecution;
    }
}
This is my XML:
<bean id="writer" class="org.springframework.batch.item.file.FlatFileItemWriter">
    <property name="resource" ref="outputResource" />
    <property name="lineAggregator">
        <bean class="org.springframework.batch.item.file.transform.PassThroughLineAggregator" />
    </property>
    <property name="headerCallback" ref="headerCopier" />
    <property name="footerCallback" ref="footerCallback" />
</bean>
<bean id="footerCallback" class="org.springframework.batch.sample.support.SummaryFooterCallback"/>
It is failing at stepExecution.getWriteCount() with a NullPointerException.
No, I haven't registered the callback as a listener in the step. I am new to Java and Spring Batch; I am referring to your book Pro Spring Batch but am not able to solve the assigned task.
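That NullPointerException is consistent with beforeStep never running: the stepExecution field is only populated when the callback bean is also registered as a step listener. A sketch for the XML above; the step id and reader name are assumptions, since the question does not show the step definition:

```xml
<batch:step id="step1"> <!-- hypothetical step id -->
    <batch:tasklet>
        <batch:chunk reader="reader" writer="writer" commit-interval="10" /> <!-- hypothetical reader -->
        <batch:listeners>
            <!-- same bean as the writer's footerCallback, so beforeStep sets stepExecution -->
            <batch:listener ref="footerCallback" />
        </batch:listeners>
    </batch:tasklet>
</batch:step>
```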
You need to put the writer in step scope. Here is a Java-based config that worked for me.
@Bean
@StepScope
public ItemStreamWriter<Entity> writer(FlatFileFooterCallback footerCallback) {
    FlatFileItemWriter<Entity> writer = new FlatFileItemWriter<Entity>();
    ...
    writer.setFooterCallback(footerCallback);
    ...
    return writer;
}

@Bean
@StepScope
// @Bean methods must not be private, or Spring rejects the configuration class
public FlatFileFooterCallback getFooterCallback(@Value("#{stepExecution}") final StepExecution context) {
    return new FlatFileFooterCallback() {
        @Override
        public void writeFooter(Writer writer) throws IOException {
            writer.append("count: ").append(String.valueOf(context.getWriteCount()));
        }
    };
}

spring batch: processor called twice after skip

I have defined a chunk with a commit-interval of 10 and a skip-limit of 10. A processor class manipulates a field by applying some arithmetic operations. An exception occurs in the processor for one of the records (say the 6th). After this, records 1 to 5 are processed again, the 6th is skipped, records 7-10 are processed, and everything is written to an XML file (by a custom XML writer class). Since the processor processes records 1-5 twice, the field's value is wrong, as it is calculated twice. Can you please suggest a solution so that the processor processes each record only once, skips only the failed record, and writes the processed records to XML?
I implemented a SkipListener with onSkipInProcess(), onSkipInRead(), and onSkipInWrite(), but the output is still the same.
jobconfig.xml
<batch:job id="job">
    <batch:step id="step">
        <batch:tasklet>
            <batch:chunk reader="itemReader" writer="itemWriter"
                         processor="itemProcessor" commit-interval="10" skip-limit="5" retry-limit="0">
                <batch:skippable-exception-classes>
                    <batch:include class="java.lang.Exception"/>
                </batch:skippable-exception-classes>
                <batch:listeners>
                    <batch:listener ref="skipListener" />
                </batch:listeners>
            </batch:chunk>
        </batch:tasklet>
    </batch:step>
</batch:job>

<bean id="itemWriter" class="a.b.XWriter" scope="step"/>
<bean id="skipListener" class="a.b.SkipListener"/>
<bean id="itemProcessor" class="a.b.XProcessor" scope="step"/>
<bean id="itemReader" class="a.b.XReader"/>
ItemReader class:
public class XReader implements ItemReader {

    @Autowired
    private XRepository classDao;

    private List lst = new ArrayList();
    private int index = 0;

    public Object read() throws Exception, UnexpectedInputException, ParseException, NonTransientResourceException {
        if (lst.isEmpty()) {
            lst = classDao.findAll();
        }
        if (index < lst.size()) {
            return lst.get(index++);
        } else {
            return null;
        }
    }
}
ItemProcessor class:
public class XProcessor<T> implements ItemProcessor<T, T> {
    public T process(T item) throws Exception {
        // logic here
    }
}
ItemWriter class:
public class XWriter<T> implements ItemWriter<T> {
    public void write(List<? extends T> items) throws Exception {
        // logic here to write to XML
    }
}
SkipListener class:
public class SkipListener<T, S> implements org.springframework.batch.core.SkipListener<T, S> {
    public void onSkipInProcess(T arg0, Throwable arg1) {
    }
    public void onSkipInRead(Throwable arg0) {
    }
    public void onSkipInWrite(S arg0, Throwable arg1) {
    }
}
When using ItemProcessors in a fault tolerant step, they are expected to be idempotent because there is a risk that they will be called multiple times (as seen in your example). You can read more about this in section 6.3.3 of the documentation here: http://docs.spring.io/spring-batch/reference/html/readersAndWriters.html
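To illustrate what idempotent means here in plain Java (the discount arithmetic is made up for the example, not taken from the question): an idempotent transformation computes its result from the input alone, so the re-processing that happens after a skip is harmless, while an in-place accumulation applies the arithmetic twice.

```java
import java.util.concurrent.atomic.AtomicInteger;
import java.util.function.IntUnaryOperator;

public class IdempotenceDemo {

    public static void main(String[] args) {
        // Idempotent: the output depends only on the input, so processing
        // the same record again after a skip yields the same result.
        IntUnaryOperator applyDiscount = price -> price * 90 / 100;
        System.out.println(applyDiscount.applyAsInt(100)); // 90
        System.out.println(applyDiscount.applyAsInt(100)); // still 90

        // Not idempotent: state accumulates across calls, so seeing the
        // same record twice doubles the effect, as in the question.
        AtomicInteger runningTotal = new AtomicInteger();
        runningTotal.addAndGet(100);
        runningTotal.addAndGet(100); // same record re-processed
        System.out.println(runningTotal.get()); // 200, not 100
    }
}
```

In a fault-tolerant step this means the processor should return a new or recomputed value rather than mutate the item or shared state.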
You need a listener implementation like the one below. Whenever an exception occurs, the corresponding method is called; if you want, you can handle it there, otherwise just leave the method empty, so the job does not fail.
It will also not call the processor twice.
xml configuration:
<batch:listeners>
    <batch:listener ref="recordSkipListener"/>
</batch:listeners>
Listener class:
public class RecordSkipListener implements SkipListener<Model, Model> {

    @Override
    public void onSkipInRead(Throwable t) {
    }

    @Override
    public void onSkipInWrite(Model item, Throwable t) {
    }

    @Override
    public void onSkipInProcess(Model item, Throwable t) {
    }
}

REST API with Struts

I'm trying to add a REST API to an existing Struts 2 application.
The idea is to have part of the application use standard Struts mapping, and another part use REST.
So I used the struts2-rest-plugin and added the following configuration:
struts.xml:
<constant name="rest" value="org.apache.struts2.rest.RestActionMapper"/>
<constant name="struts.mapper.class" value="org.apache.struts2.dispatcher.mapper.PrefixBasedActionMapper"/>
<constant name="struts.mapper.prefixMapping" value="/rest:rest,/:struts"/>
struts.properties:
struts.action.extension=,htm,action,xml,json
TasksController.java:
package xxx.common.webservice.rest;

public class TasksController implements ModelDriven<Task> {

    public String update() {
        return "UPDATE";
    }

    // Handles /tasks/{id} GET requests
    public String show() {
        return "YES";
    }

    @Override
    public Task getModel() {
        return null;
    }
}
With this configuration, the basic Struts actions work, but I can't get the REST actions to work.
I also tried different struts.xml configurations (including the convention plugin options), but without any success; the mappings are never shown by the config-browser plugin.
Any idea what I have missed or done wrong?
It finally worked, but it was a while ago and I don't remember exactly what I did. Here is my configuration; hope this helps.
struts.xml
<constant name="struts.convention.action.mapAllMatches" value="true"/>
<constant name="struts.convention.package.locators" value="webservice"/>
<constant name="struts.convention.action.suffix" value="Controller"/>
<constant name="struts.convention.default.parent.package" value="rest-default"/>
<constant name="struts.mapper.class" value="org.apache.struts2.dispatcher.mapper.PrefixBasedActionMapper" />
<constant name="struts.mapper.prefixMapping" value="/rest:rest,:struts" />
<package name="home" namespace="/" extends="struts-default">
...
</package>
TaskController.java
package com.test.webservice.rest;
public class TaskController extends RestActionSupport implements ModelDriven<TaskDTO> {

    public final HttpHeaderResult show() {
        ...
    }
    ...
}