Spring Batch: How to get the JobExecution object in a process listener - spring-batch

I have a requirement in my project: whatever exception occurs in the ItemProcessor needs to be stored in the JobExecution context, and at the end of the JobExecution an email should be sent for the exceptional records. But how do I get the JobExecution object in the ItemProcessListener?
I tried using @BeforeStep in the process listener, but the JobExecution object was null. Is there any way to get the JobExecution context in a process listener?

I found a solution for the above issue: declare the process listener as a job-scoped bean and access the job execution context inside the listener class. The code is shown below.
@Bean
@JobScope
public CaliberatedProcessorListener calibratedProcessorListener() {
    return new CaliberatedProcessorListener();
}

public class CaliberatedProcessorListener<T, S> implements ItemProcessListener<T, S> {

    // Late-bound from the job scope; this only works because the listener bean is @JobScope
    @Value("#{jobExecution}")
    public JobExecution jobExecution;

    @Override
    public void beforeProcess(T item) {
        // do nothing
    }

    @Override
    public void afterProcess(T item, S result) {
        // do nothing
    }

    @Override
    public void onProcessError(T item, Exception calibratedProcessorEx) {
        // Accumulate the exception in the job execution context so it can be mailed at the end of the job
        FtpEmailData ftpEmailData = (FtpEmailData) jobExecution.getExecutionContext().get("calDeviceBatchInfo");
        ftpEmailData.getExceptionList().add(new CalibratedDeviceException(calibratedProcessorEx.getMessage()));
    }
}
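For the second half of the requirement (mailing the exceptional records once the job finishes), a plain JobExecutionListener can read what the processor listener accumulated. A minimal sketch, assuming the same "calDeviceBatchInfo" key as above; the recipient address and the injected Spring MailSender are illustrative, not taken from the original code:

public class ExceptionMailListener implements JobExecutionListener {

    private final MailSender mailSender; // org.springframework.mail.MailSender, wired in however you prefer

    public ExceptionMailListener(MailSender mailSender) {
        this.mailSender = mailSender;
    }

    @Override
    public void beforeJob(JobExecution jobExecution) {
        // nothing to do before the job starts
    }

    @Override
    public void afterJob(JobExecution jobExecution) {
        FtpEmailData ftpEmailData =
                (FtpEmailData) jobExecution.getExecutionContext().get("calDeviceBatchInfo");
        if (ftpEmailData == null || ftpEmailData.getExceptionList().isEmpty()) {
            return; // no exceptional records, nothing to mail
        }
        SimpleMailMessage message = new SimpleMailMessage();
        message.setTo("ops@example.com"); // illustrative recipient
        message.setSubject("Exceptional records in " + jobExecution.getJobInstance().getJobName());
        message.setText(ftpEmailData.getExceptionList().toString());
        mailSender.send(message);
    }
}

The listener would then be registered on the job definition, for example via the job builder's listener(...) method.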

Related

Store filenames in Spring Batch to send an email

I'm writing an application in Spring Batch to do this:
1. Read the content of a folder, file by file.
2. Rename the files and move them to several folders.
3. Send two emails: one with the names of the files processed successfully and one with the names of the files that threw errors.
I've already done points 1 and 2, but I still need point 3. How can I store the names of the files that were passed to the writer in an elegant way with Spring Batch?
You can use the ExecutionContext to store the names of the files that were processed and of those that failed with errors.
Keep a List (or a similar data structure) holding the file names collected by the business logic. Below is a small snippet for reference that implements StepExecutionListener:
public class FileProcessor implements ItemWriter<TestData>, StepExecutionListener {

    private List<String> success = new ArrayList<>();
    private List<String> failed = new ArrayList<>();

    @Override
    public void beforeStep(StepExecution stepExecution) {
    }

    @Override
    public void write(List<? extends TestData> items) throws Exception {
        // Business logic which adds the success and failure file names
        // to the lists after processing
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        // Promote both lists to the job execution context so the next step can read them
        stepExecution.getJobExecution().getExecutionContext()
                .put("fileProcessedSuccessfully", success);
        stepExecution.getJobExecution().getExecutionContext()
                .put("fileProcessedFailure", failed);
        return ExitStatus.COMPLETED;
    }
}
Now the file names are stored in the job execution context and can be used in the send-email step.
public class SendReport implements Tasklet, StepExecutionListener {

    private List<String> success = new ArrayList<>();
    private List<String> failed = new ArrayList<>();

    @Override
    @SuppressWarnings("unchecked")
    public void beforeStep(StepExecution stepExecution) {
        // Fetch the lists of file names stored in the job execution context by the previous step
        success = (List<String>) stepExecution.getJobExecution().getExecutionContext()
                .get("fileProcessedSuccessfully");
        failed = (List<String>) stepExecution.getJobExecution().getExecutionContext()
                .get("fileProcessedFailure");
    }

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
        // Business logic to send the email with the file names
        return RepeatStatus.FINISHED;
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        logger.debug("Email trigger step completed successfully!");
        return ExitStatus.COMPLETED;
    }
}
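For completeness, a minimal sketch of how the two steps could be wired together. The Spring Batch 4 style builder factories, the bean names, and the step names are assumptions made for illustration:

@Bean
public Step processFilesStep(StepBuilderFactory steps, ItemReader<TestData> reader, FileProcessor fileProcessor) {
    return steps.get("processFilesStep")
            .<TestData, TestData>chunk(10)
            .reader(reader)
            .writer(fileProcessor)
            .listener(fileProcessor) // step-level callbacks (beforeStep/afterStep)
            .build();
}

@Bean
public Step sendReportStep(StepBuilderFactory steps, SendReport sendReport) {
    return steps.get("sendReportStep")
            .tasklet(sendReport)
            .listener(sendReport) // gives the tasklet access to the step execution in beforeStep
            .build();
}

@Bean
public Job fileReportJob(JobBuilderFactory jobs, Step processFilesStep, Step sendReportStep) {
    return jobs.get("fileReportJob")
            .start(processFilesStep)
            .next(sendReportStep)
            .build();
}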

How do I access Job parameters in a Spring Batch Listener?

I'm working with ItemListenerSupport to do some error handling for ItemReadListener, ItemProcessListener, and ItemWriteListener. I want to access the job parameters in this instance. How do I fetch those? I tried @BeforeStep to inject the StepExecution and JobExecution, but neither worked.
To get hold of the job parameters, you can implement StepExecutionListener in your listener class and use the overridden beforeStep and afterStep methods:
@Override
public void beforeStep(StepExecution stepExecution) {
    // The job parameters are reachable from the StepExecution
    String name = stepExecution.getJobParameters().getString("name");
}

@Override
public ExitStatus afterStep(StepExecution stepExecution) {
    if (stepExecution.getStatus() == BatchStatus.COMPLETED) {
        return ExitStatus.COMPLETED;
    }
    return ExitStatus.FAILED;
}
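Note that beforeStep/afterStep (or an @BeforeStep annotated method) only run if the listener is registered on the step. A minimal sketch, assuming a Spring Batch 4 style Java config and a hypothetical MyErrorListener that extends ItemListenerSupport and implements StepExecutionListener as suggested above:

@Bean
public Step myStep(StepBuilderFactory steps,
                   ItemReader<String> reader,
                   ItemWriter<String> writer,
                   MyErrorListener myErrorListener) {
    return steps.get("myStep")
            .<String, String>chunk(10)
            .reader(reader)
            .writer(writer)
            // register once for the step-level callbacks and once for the item-level ones
            .listener((StepExecutionListener) myErrorListener)
            .listener((ItemReadListener<String>) myErrorListener)
            .build();
}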
You can declare your listener as a step-scoped bean and inject job parameters into it, something like:
@Bean
@StepScope
public ItemReadListener itemReadListener(@Value("#{jobParameters['name']}") final String name) {
    return new ItemListenerSupport() {
        @Override
        public void afterRead(Object item) {
            System.out.println("in listener, job param name=" + name);
            super.afterRead(item);
        }
    };
}

Detecting when the Job is STOPPING in a FlatFileItemReader custom bufferedReader instance

How would one detect that the Job has been signaled to stop from within the BufferedReader that the FlatFileItemReader obtains from a configured custom BufferedReaderFactory bean? This custom reader tails the file and waits for more input indefinitely, so the Job's STOPPING status is never detected because the code is blocked outside the Spring Batch framework.
I can do this with a Step1 -> Tasklet -> Step1 flow loop that does a Thread.sleep in the Tasklet, but because the file is constantly growing I would hit EOF every couple of seconds and generate a huge number of StepExecution rows in the database.
public class TailingBufferedReaderFactory implements BufferedReaderFactory {

    @Override
    public BufferedReader create(Resource resource, String encoding) throws IOException {
        return new TailingBufferedReader(new InputStreamReader(resource.getInputStream(), encoding));
    }
}

public class TailingBufferedReader extends BufferedReader implements JobExecutionListener {

    private static final long waitDurationMillis = 1000; // polling interval while tailing (value illustrative)

    private JobExecution jobExecution;

    public TailingBufferedReader(Reader in) {
        super(in);
    }

    @Override
    public String readLine() throws IOException {
        while (!jobExecution.isStopping()) { // the elusive Job Execution status check
            String line = super.readLine();
            if (line == null) {
                try {
                    Thread.sleep(waitDurationMillis);
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    return null;
                }
                continue;
            }
            return line;
        }
        return null;
    }

    // Ideally something like this configured on the Job
    @Override
    public void beforeJob(JobExecution jobExecution) {
        this.jobExecution = jobExecution;
    }

    @Override
    public void afterJob(JobExecution jobExecution) {
    }
}
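One possible direction (a sketch under assumptions, not a verified answer): since the readers are created by the factory, the factory itself could implement JobExecutionListener, be registered as a listener on the Job, and hand the current JobExecution to every reader it creates. This assumes a TailingBufferedReader constructor that also accepts the JobExecution:

public class TailingBufferedReaderFactory implements BufferedReaderFactory, JobExecutionListener {

    private volatile JobExecution jobExecution;

    @Override
    public void beforeJob(JobExecution jobExecution) {
        this.jobExecution = jobExecution;
    }

    @Override
    public void afterJob(JobExecution jobExecution) {
    }

    @Override
    public BufferedReader create(Resource resource, String encoding) throws IOException {
        // Pass the current JobExecution on so readLine() can poll isStopping() while tailing
        return new TailingBufferedReader(
                new InputStreamReader(resource.getInputStream(), encoding), jobExecution);
    }
}

The same factory bean would then be set as the bufferedReaderFactory of the FlatFileItemReader and registered as a listener on the Job, so beforeJob runs before any reader is created.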

Spring Boot and FlywayTest cause JPA Camel routes to throw exceptions during database reset

I have a Spring Boot application that contains a Camel route with a JPA consumer.
When running a test that uses the @FlywayTest annotation, the database is reset as expected prior to the test, but while this is happening the Camel JPA consumer tries to execute an SQL SELECT against the database.
How do I disable the route while FlywayTest is resetting the database?
Any suggestions are appreciated.
I worked around the problem by setting consumer.initialDelay=5000, which was sufficient to allow Flyway to reset the DB.
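For reference, that option goes on the JPA endpoint URI of the consuming route; the entity class and log endpoint below are purely illustrative, and in older Camel versions the scheduled-poll options carry the consumer. prefix:

public class MyJpaRoute extends RouteBuilder {
    @Override
    public void configure() {
        // a 5 second initial delay gives Flyway time to finish resetting the schema
        from("jpa:com.example.MyEntity?consumer.initialDelay=5000")
            .to("log:records");
    }
}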
A more robust approach to solving this was to suspend the Camel routes in my tests by using:
@Autowired
CamelContext camelContext;

@Before
public void init() throws Exception {
    camelContext.suspend();
}
In Camel 2.15 (the current master branch) I recently delayed CamelContext startup, so the routes should no longer be started that early and therefore should not consume from JPA too early either.
If you still experience this issue, feel free to drop me a line and I will fix it.
I configured my application to start all Camel routes only after the Flyway migration is complete. This proved very important on slow machines or with large databases, where the migration can take a long time.
@Configuration
@EnableTransactionManagement
@EnableAutoConfiguration
public class Application {

    public static void main(String[] args) throws Exception {
        SpringApplication.run(Application.class, args);
    }

    @Bean
    @DependsOn({"flyway", "dataSource"})
    CamelContext camelContext(ApplicationContext applicationContext,
                              CamelConfigurationProperties configurationProperties) {
        SpringCamelContext camelContext = new SpringCamelContext(applicationContext);
        SpringCamelContext.setNoStart(true);
        if (!configurationProperties.isJmxEnabled()) {
            camelContext.disableJMX();
        }
        if (configurationProperties.getName() != null) {
            camelContext.setName(configurationProperties.getName());
        }
        // Routes are started manually once the Flyway migration has finished (see the callback below)
        camelContext.setAutoStartup(false);
        return camelContext;
    }

    @Bean
    FlywayCallback flywayCallback(Flyway flyway, final CamelContext camelContext) {
        FlywayCallback callback = new FlywayCallback() {
            @Override
            public void beforeClean(Connection connection) {}
            @Override
            public void afterClean(Connection connection) {}
            @Override
            public void beforeMigrate(Connection connection) {}
            @Override
            public void afterMigrate(Connection connection) {
                // The migration is complete, so it is now safe to start the routes
                try {
                    camelContext.startAllRoutes();
                } catch (Exception e) {
                    throw new RuntimeException("Camel startup failed", e);
                }
            }
            @Override
            public void beforeEachMigrate(Connection connection, MigrationInfo info) {}
            @Override
            public void afterEachMigrate(Connection connection, MigrationInfo info) {}
            @Override
            public void beforeValidate(Connection connection) {}
            @Override
            public void afterValidate(Connection connection) {}
            @Override
            public void beforeBaseline(Connection connection) {}
            @Override
            public void afterBaseline(Connection connection) {}
            @Override
            public void beforeInit(Connection connection) {}
            @Override
            public void afterInit(Connection connection) {}
            @Override
            public void beforeRepair(Connection connection) {}
            @Override
            public void afterRepair(Connection connection) {}
            @Override
            public void beforeInfo(Connection connection) {}
            @Override
            public void afterInfo(Connection connection) {}
        };
        flyway.setCallbacks(callback);
        return callback;
    }
}

Not able to access stepExecutionContext value in writer

I am setting a value in the stepExecutionContext in my partitioner and trying to get it in the writer, but I am not able to access it.
The writer is step scoped.
Could anyone help me with how to get the step execution context values in the writer?
Thanks,
Shankar
You can implement StepExecutionListener in the writer to get hold of the StepExecution there:
public class ExampleWriter<T> implements ItemWriter<T>, StepExecutionListener {

    private StepExecution stepExecution;

    @Override
    public void write(List<? extends T> items) {
        // Values put into the ExecutionContext by the partitioner end up in the step execution context
        String executionContextValue = (String) stepExecution.getExecutionContext().get("KEY");
        System.out.println("ExecutionContextValue is: " + executionContextValue);
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        return null;
    }

    @Override
    public void beforeStep(StepExecution stepExecution) {
        this.stepExecution = stepExecution;
    }
}
If needed, register this class as a listener in the step's XML configuration:
<listeners>
    <listener ref="exampleWriter" />
</listeners>
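Alternatively, since the writer is already step scoped, the value can be injected directly through late binding from the step execution context; the bean name, the 'KEY' entry, and the lambda writer below are illustrative:

@Bean
@StepScope
public ItemWriter<String> exampleWriter(@Value("#{stepExecutionContext['KEY']}") String keyValue) {
    // keyValue is resolved per partition, from the ExecutionContext the partitioner created
    return items -> System.out.println("KEY from step execution context: " + keyValue);
}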