Store file names in Spring Batch to send email - spring-batch

I'm writing an application in Spring Batch to do this:
1. Read the content of a folder, file by file.
2. Rename the files and move them to several folders.
3. Send two emails: one with the names of the files processed successfully and one with the names of the files that threw errors.
I've already got points 1 and 2 working, but I still need point 3. How can I store the file names that have been sent to the writer method in an elegant way with Spring Batch?

You can use the ExecutionContext to store the names of the files that were processed successfully and of those that failed with errors.
Keep a List (or a similar data structure) that collects the file names as the business logic runs. Below is a small snippet for reference which implements StepExecutionListener:
public class FileProcessor implements ItemWriter<TestData>, StepExecutionListener {

    private List<String> success = new ArrayList<>();
    private List<String> failed = new ArrayList<>();

    @Override
    public void beforeStep(StepExecution stepExecution) {
    }

    @Override
    public void write(List<? extends TestData> items) throws Exception {
        // Business logic which adds the success and failure file names
        // to the lists after processing
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        // Store both lists in the job-level ExecutionContext so a later step can read them
        stepExecution.getJobExecution().getExecutionContext()
                .put("fileProcessedSuccessfully", success);
        stepExecution.getJobExecution().getExecutionContext()
                .put("fileProcessedFailure", failed);
        return ExitStatus.COMPLETED;
    }
}
Now that the file names are stored in the execution context, we can use them in the send-email step.
public class SendReport implements Tasklet, StepExecutionListener {

    private static final Logger logger = LoggerFactory.getLogger(SendReport.class);

    private List<String> success = new ArrayList<>();
    private List<String> failed = new ArrayList<>();

    @Override
    public void beforeStep(StepExecution stepExecution) {
        try {
            // Fetch the lists of file names stored in the context by the previous step
            success = (List<String>) stepExecution.getJobExecution().getExecutionContext()
                    .get("fileProcessedSuccessfully");
            failed = (List<String>) stepExecution.getJobExecution()
                    .getExecutionContext().get("fileProcessedFailure");
        } catch (Exception e) {
            logger.error("Could not read file name lists from the execution context", e);
        }
    }

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
        // Business logic to send the emails with the file names
        return RepeatStatus.FINISHED;
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        logger.debug("Email trigger step completed successfully!");
        return ExitStatus.COMPLETED;
    }
}
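For completeness, here is a minimal sketch of how the two steps could be wired together. The bean names, the reader bean, and the chunk size are assumptions, not part of the original answer:

@Configuration
@EnableBatchProcessing
public class FileReportJobConfig {

    @Bean
    public Step processFilesStep(StepBuilderFactory stepBuilderFactory,
            ItemReader<TestData> fileReader, FileProcessor fileProcessor) {
        return stepBuilderFactory.get("processFilesStep")
                .<TestData, TestData>chunk(10)
                .reader(fileReader)
                .writer(fileProcessor)   // FileProcessor is the ItemWriter
                .listener((StepExecutionListener) fileProcessor)
                .build();
    }

    @Bean
    public Step sendReportStep(StepBuilderFactory stepBuilderFactory, SendReport sendReport) {
        return stepBuilderFactory.get("sendReportStep")
                .tasklet(sendReport)
                .listener((StepExecutionListener) sendReport) // so beforeStep/afterStep fire
                .build();
    }

    @Bean
    public Job fileReportJob(JobBuilderFactory jobBuilderFactory,
            Step processFilesStep, Step sendReportStep) {
        return jobBuilderFactory.get("fileReportJob")
                .start(processFilesStep)
                .next(sendReportStep)
                .build();
    }
}

Registering the writer and tasklet explicitly as step listeners makes sure the beforeStep/afterStep callbacks are invoked even when the beans are proxied.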

Related

Detecting when the Job is STOPPING in a FlatFileItemReader custom bufferedReader instance

How would one detect when the Job has been signaled to stop from within the FlatFileItemReader's bufferedFileReader that was created using the configured custom BufferedFileReaderFactory bean? This custom bufferedFileReader tails the file and waits for more input indefinitely, so the Job's STOPPING status isn't being detected, as the code is blocked outside the Spring Batch framework.
I can do this with a Step1->Tasklet->Step1 flow loop that does a Thread.sleep in the Tasklet, but the nature of this constantly growing file means I'll be hitting EOF every couple of seconds and generating a huge number of StepExecution rows in the database.
public class TailingBufferedReaderFactory implements BufferedReaderFactory {
    @Override
    public BufferedReader create(Resource resource, String encoding) throws IOException {
        return new TailingBufferedReader(new InputStreamReader(resource.getInputStream(), encoding));
    }
}

public class TailingBufferedReader extends BufferedReader implements JobExecutionListener {

    private final long waitDurationMillis = 1000; // poll interval (value assumed; not in the original snippet)
    private JobExecution jobExecution;

    public TailingBufferedReader(Reader in) {
        super(in);
    }

    @Override
    public String readLine() throws IOException {
        while (!jobExecution.isStopping()) { // The elusive Job Execution status check
            var line = super.readLine();
            if (line == null) {
                try {
                    Thread.sleep(waitDurationMillis);
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    return null;
                }
                continue;
            }
            return line;
        }
        return null;
    }

    // Ideally something like this configured on the Job
    @Override
    public void beforeJob(JobExecution jobExecution) {
        this.jobExecution = jobExecution;
    }

    @Override
    public void afterJob(JobExecution jobExecution) {}
}
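One wrinkle with the sketch above is getting the JobExecution into readers that the framework creates through the factory, since those instances are not Spring beans. A hedged workaround, assuming a shared holder bean (the JobExecutionHolder class and its wiring are hypothetical, not part of the original question):

// Hypothetical bridge: a singleton bean that the job populates and the readers poll.
public class JobExecutionHolder implements JobExecutionListener {

    private volatile JobExecution jobExecution;

    @Override
    public void beforeJob(JobExecution jobExecution) {
        this.jobExecution = jobExecution;
    }

    @Override
    public void afterJob(JobExecution jobExecution) {
        this.jobExecution = null;
    }

    public boolean isStopping() {
        JobExecution current = jobExecution;
        return current != null && current.isStopping();
    }
}

public class TailingBufferedReaderFactory implements BufferedReaderFactory {

    private final JobExecutionHolder holder;

    public TailingBufferedReaderFactory(JobExecutionHolder holder) {
        this.holder = holder;
    }

    @Override
    public BufferedReader create(Resource resource, String encoding) throws IOException {
        // Each created reader shares the holder instead of carrying its own JobExecution
        return new TailingBufferedReader(
                new InputStreamReader(resource.getInputStream(), encoding), holder);
    }
}

The same holder instance would be registered on the job (for example with .listener(holder) in the job builder) so that beforeJob populates it before any reading starts, and the reader's loop would poll holder.isStopping() instead of a JobExecution field of its own.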

Spring Batch How to get JobExecution Object in Process Listener

I have a requirement in my project: whatever exception occurs in the ItemProcessor needs to be stored in the JobExecution context, and at the end of the JobExecution an email is sent for the exceptional records. But how do I get the JobExecution object in the process listener?
I tried using @BeforeStep in the process listener, but the JobExecution object was null. Is there any way to get the JobExecution context in the process listener?
I found a solution in Spring Batch for the above issue: declare the process listener as job-scoped and access the job execution context in the listener class. The code is below.
@Bean
@JobScope
public CalibratedProcessorListener calibratedProcessorListener() {
    return new CalibratedProcessorListener();
}

public class CalibratedProcessorListener<T, S> implements ItemProcessListener<T, S> {

    @Value("#{jobExecution}")
    public JobExecution jobExecution;

    @Override
    public void beforeProcess(T calibratedProcessorInput) {
        // do nothing
    }

    @Override
    public void afterProcess(T calibratedProcessorInput, S calibratedProcessorOutput) {
        // do nothing
    }

    @Override
    public void onProcessError(T item, Exception calibratedProcessorEx) {
        FtpEmailData ftpEmailData = (FtpEmailData) jobExecution.getExecutionContext().get("calDeviceBatchInfo");
        ftpEmailData.getExceptionList().add(new CalibratedDeviceException(calibratedProcessorEx.getMessage()));
    }
}
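To take effect, the job-scoped listener still has to be registered on the step. A minimal sketch, where the step name, the Device item type, and the reader/processor/writer beans are assumptions:

@Bean
public Step calibratedDeviceStep(StepBuilderFactory stepBuilderFactory,
        ItemReader<Device> reader, ItemProcessor<Device, Device> processor,
        ItemWriter<Device> writer) {
    return stepBuilderFactory.get("calibratedDeviceStep")
            .<Device, Device>chunk(10)
            .reader(reader)
            .processor(processor)
            .writer(writer)
            // the @JobScope bean is injected through its scoped proxy here
            .listener(calibratedProcessorListener())
            .build();
}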

Spring Batch: File not being read

I am trying to create an application that uses the spring-batch-excel extension to read Excel files uploaded through a web interface by its users, in order to parse the Excel file for addresses.
When the code runs there is no error, but all I get is the following in my log, even though I have log/syso statements throughout my Processor and Writer (these are never called; all I can imagine is that it's not properly reading the file and returning no data to process/write). And yes, the file has data, several thousand records in fact.
Job: [FlowJob: [name=excelFileJob]] launched with the following parameters: [{file=Book1.xlsx}]
Executing step: [excelFileStep]
Job: [FlowJob: [name=excelFileJob]] completed with the following parameters: [{file=Book1.xlsx}] and the following status: [COMPLETED]
Below is my JobConfig
@Configuration
@EnableBatchProcessing
public class AddressExcelJobConfig {

    @Bean
    public BatchConfigurer configurer(EntityManagerFactory entityManagerFactory) {
        return new CustomBatchConfigurer(entityManagerFactory);
    }

    @Bean
    Step excelFileStep(ItemReader<AddressExcel> excelAddressReader,
            ItemProcessor<AddressExcel, AddressExcel> excelAddressProcessor,
            ItemWriter<AddressExcel> excelAddressWriter,
            StepBuilderFactory stepBuilderFactory) {
        return stepBuilderFactory.get("excelFileStep")
                .<AddressExcel, AddressExcel>chunk(1)
                .reader(excelAddressReader)
                .processor(excelAddressProcessor)
                .writer(excelAddressWriter)
                .build();
    }

    @Bean
    Job excelFileJob(JobBuilderFactory jobBuilderFactory,
            @Qualifier("excelFileStep") Step excelAddressStep) {
        return jobBuilderFactory.get("excelFileJob")
                .incrementer(new RunIdIncrementer())
                .flow(excelAddressStep)
                .end()
                .build();
    }
}
Below is my AddressExcelReader.
The late binding works fine; there is no error. I have tried loading the resource given the file name, in addition to creating a new ClassPathResource and a FileSystemResource. All give me the same results.
@Component
@StepScope
public class AddressExcelReader implements ItemReader<AddressExcel> {

    private PoiItemReader<AddressExcel> itemReader = new PoiItemReader<AddressExcel>();

    @Override
    public AddressExcel read()
            throws Exception, UnexpectedInputException, ParseException, NonTransientResourceException {
        return itemReader.read();
    }

    public AddressExcelReader(@Value("#{jobParameters['file']}") String file, StorageService storageService) {
        //Resource resource = storageService.loadAsResource(file);
        //Resource testResource = new FileSystemResource("upload-dir/Book1.xlsx");
        itemReader.setResource(new ClassPathResource("/upload-dir/Book1.xlsx"));
        itemReader.setLinesToSkip(1);
        itemReader.setStrict(true);
        itemReader.setRowMapper(excelRowMapper());
    }

    public RowMapper<AddressExcel> excelRowMapper() {
        BeanWrapperRowMapper<AddressExcel> rowMapper = new BeanWrapperRowMapper<>();
        rowMapper.setTargetType(AddressExcel.class);
        return rowMapper;
    }
}
Below is my AddressExcelProcessor
@Component
public class AddressExcelProcessor implements ItemProcessor<AddressExcel, AddressExcel> {

    private static final Logger log = LoggerFactory.getLogger(AddressExcelProcessor.class);

    @Override
    public AddressExcel process(AddressExcel item) throws Exception {
        System.out.println("Converting " + item);
        log.info("Convert {}", item);
        return item;
    }
}
Again, this is never coming into play (no logs generated). And if it matters, this is how I'm launching my job from a FileUploadController, from a @PostMapping("/") that handles the file upload, first storing the file and then running the job:
@PostMapping("/")
public String handleFileUpload(@RequestParam("file") MultipartFile file, RedirectAttributes redirectAttributes) {
    storageService.store(file);
    try {
        JobParameters jobParameters = new JobParametersBuilder()
                .addString("file", file.getOriginalFilename().toString()).toJobParameters();
        jobLauncher.run(job, jobParameters);
    } catch (JobExecutionAlreadyRunningException | JobRestartException | JobInstanceAlreadyCompleteException
            | JobParametersInvalidException e) {
        e.printStackTrace();
    }
    redirectAttributes.addFlashAttribute("message",
            "You successfully uploaded " + file.getOriginalFilename() + "!");
    return "redirect:/";
}
And last but not least, here is my AddressExcel POJO:
import lombok.Data;

@Data
public class AddressExcel {
    private String address1;
    private String address2;
    private String city;
    private String state;
    private String zip;

    public AddressExcel() {}
}
UPDATE (10/13/2016)
From Nghia Do's comments, I also created my own RowMapper instead of using the BeanWrapper to see if that was the issue. Still the same results.
public class AddressExcelRowMapper implements RowMapper<AddressExcel> {
    @Override
    public AddressExcel mapRow(RowSet rs) throws Exception {
        AddressExcel temp = new AddressExcel();
        temp.setAddress1(rs.getColumnValue(0));
        temp.setAddress2(rs.getColumnValue(1));
        temp.setCity(rs.getColumnValue(2));
        temp.setState(rs.getColumnValue(3));
        temp.setZip(rs.getColumnValue(4));
        return temp;
    }
}
It seems all I needed was to add the following to my ItemReader:
itemReader.afterPropertiesSet();
itemReader.open(new ExecutionContext());
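My reading of why this fixes it (not stated explicitly in the answer): PoiItemReader is an ItemStreamReader, and Spring Batch only calls open() on streams it knows about; because it is hidden inside a custom ItemReader wrapper, nothing ever opens the delegate, so read() returns no rows. A sketch of the adjusted constructor, assuming the same fields as the question's reader:

public AddressExcelReader(@Value("#{jobParameters['file']}") String file, StorageService storageService) {
    itemReader.setResource(new ClassPathResource("/upload-dir/Book1.xlsx"));
    itemReader.setLinesToSkip(1);
    itemReader.setStrict(true);
    itemReader.setRowMapper(excelRowMapper());
    // The delegate is an ItemStream; since the framework only sees the wrapper,
    // open it manually (otherwise read() silently returns nothing).
    try {
        itemReader.afterPropertiesSet();
    } catch (Exception e) {
        throw new IllegalStateException("Failed to initialize PoiItemReader", e);
    }
    itemReader.open(new ExecutionContext());
}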

Pass current step output to next step and write to flatfile

I need to prepare two sets of List and write them into a FlatFile. The first set is simply retrieved from SQL, with some string formatting done before writing into the FlatFile. The other set of data is slightly more complex: first I need to get data from some tables and insert it into a temp table. The data will be grabbed from this temp table, similarly needs some string formatting, and the temp table's status also gets updated. Finally, both sets of data are written into the FlatFile.
Coming to Spring Batch, I will have 3 steps.
First Step
First Reader: read from DB
First Processor: string formatting
First Writer: write into file
Second Step
BeforeRead: retrieve and insert into temp table
Second Reader: read from temp table
Second Processor: string formatting and update temp table status
Second Writer: write into file
Third Step
MultiResourceItemReader: read the two files
Write into final file
Tasklet
Delete both files and purge the temp table.
My question now is: for the first and second steps, if I don't write into a file, is it possible to pass the data into the third step?
Taking into account what Hansjoerg Wingeier said, below are custom implementations of ListItemWriter and ListItemReader which let you define a name property. This property is used as a key to store the list in the JobExecutionContext.
The reader:
public class CustomListItemReader<T> implements ItemReader<T>, StepExecutionListener {

    private String name;
    private List<T> list;

    @Override
    public T read() throws Exception, UnexpectedInputException, ParseException, NonTransientResourceException {
        if (list != null && !list.isEmpty()) {
            return list.remove(0);
        }
        return null;
    }

    @Override
    public void beforeStep(StepExecution stepExecution) {
        // Pull the list that an earlier step's writer stored under "name"
        list = (List<T>) stepExecution.getJobExecution().getExecutionContext().get(name);
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        return null;
    }

    public void setName(String name) {
        this.name = name;
    }
}
The writer:
public class CustomListItemWriter<T> implements ItemWriter<T>, StepExecutionListener {

    private String name;
    private List<T> list = new ArrayList<T>();

    @Override
    public void write(List<? extends T> items) throws Exception {
        for (T item : items) {
            list.add(item);
        }
    }

    @Override
    public void beforeStep(StepExecution stepExecution) {}

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        // Publish the collected items for a later step to read
        stepExecution.getJobExecution().getExecutionContext().put(name, list);
        return null;
    }

    public void setName(String name) {
        this.name = name;
    }
}
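A minimal wiring sketch for the writer of step 1 and the reader of step 3; the item type, bean names, and the "step1Output" key are assumptions. The key point is that both beans share the same name:

@Bean
public CustomListItemWriter<String> step1Writer() {
    CustomListItemWriter<String> writer = new CustomListItemWriter<>();
    writer.setName("step1Output"); // key in the JobExecutionContext
    return writer;
}

@Bean
public CustomListItemReader<String> step3Reader() {
    CustomListItemReader<String> reader = new CustomListItemReader<>();
    reader.setName("step1Output"); // must match the writer's name
    return reader;
}

Both classes implement StepExecutionListener; Spring Batch normally auto-registers readers and writers that implement listener interfaces, but if the beans are proxied (for example step scoped) it is safer to register them explicitly with .listener(...) on the step.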
Normally, you don't want to do that.
If you just have a couple of hundred entries, it would work. You could, for instance, write a special class that implements both the reader and the writer interface. When writing, just store the data in a list; when reading, read the entries from the list. Just instantiate it as a bean and use it in both steps (1 and 2) as your writer. By simply making the write method synchronized, it would even work when steps 1 and 2 are executed in parallel.
But the problem is that this solution doesn't scale with the amount of your input data: the more data you read, the more memory you need.
This is one of the key concepts of batch processing: having a constant memory usage regardless of the amount of data that has to be processed.

Not able to access stepExecutionContext value in writer

I am setting a stepExecutionContext value in my partitioner and trying to get it in my writer, but I am not able to access it.
The writer is step scoped.
Could anyone help me with how to get the step execution context values in the writer?
Thanks,
Shankar
You can implement StepExecutionListener in the writer to get the StepExecution:
public class ExampleWriter<T> implements ItemWriter<T>, StepExecutionListener {

    private JobExecution jobExecution;

    @Override
    public void write(List<? extends T> items) {
        String executionContextValue = (String) jobExecution.getExecutionContext().get("KEY");
        System.out.println("ExecutionContextValue is: " + executionContextValue);
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        return null;
    }

    @Override
    public void beforeStep(StepExecution stepExecution) {
        this.jobExecution = stepExecution.getJobExecution();
    }
}
If needed, register this class as a listener in the XML configuration (inside the step's <tasklet> element):
<listeners>
    <listener ref="exampleWriter" />
</listeners>
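Since the question says the writer is already step scoped, there is also a more direct route: late-bind the value straight from the step execution context, which is where partitioner-set values live. A sketch; the PartitionAwareWriter name and the "KEY" key are placeholders matching the snippet above:

@Component
@StepScope
public class PartitionAwareWriter implements ItemWriter<String> {

    // Values put into the partition's ExecutionContext by the partitioner
    // are visible here through late binding
    @Value("#{stepExecutionContext['KEY']}")
    private String contextValue;

    @Override
    public void write(List<? extends String> items) {
        System.out.println("ExecutionContextValue is: " + contextValue);
    }
}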