How to check if a job is still running or is finished regardless of finalization status - spring-batch

Working on a Spring Batch (3) with Spring Boot (1.5) project. I have an end-of-day job "endOfDayJob" that is executed asynchronously through a web controller; from the controller I return the job execution id.
Below is the code for the configuration class. Note that I am implementing the BatchConfigurer interface and creating an async JobLauncher with a SimpleAsyncTaskExecutor.
@Configuration
@EnableBatchProcessing
@EnableAsync
public class BatchConfiguration implements BatchConfigurer {

    @Autowired
    private JobBuilderFactory jobBuilderFactory;
    @Autowired
    private StepBuilderFactory stepBuilderFactory;
    @Autowired
    private DataSource dataSource;

    @Bean
    public Job endOfDayJob() throws Exception {
        SimpleJobBuilder simpleJobBuilder = jobBuilderFactory.get("endOfDayJob")
                .incrementer(new RunIdIncrementer())
                .start(init())
                .next(updateInventory())
                .next(generateSalesReport())
                .next(cleanup())
                .next(sendReport());
        return simpleJobBuilder.build();
    }

    @Bean
    public Step init() {
        return stepBuilderFactory.get("initStep").tasklet(initTasklet()).build();
    }

    @Bean
    public Step updateInventory() {
        return stepBuilderFactory.get("updateInventoryStep").tasklet(updateInventoryTasklet()).build();
    }

    @Bean
    public Step generateSalesReport() {
        return stepBuilderFactory.get("generateSalesReportStep").tasklet(generateSalesReportTasklet()).build();
    }

    @Bean
    public Step cleanup() {
        return stepBuilderFactory.get("cleanupStep").tasklet(cleanupTasklet()).build();
    }

    @Bean
    public Step sendReport() {
        return stepBuilderFactory.get("sendReportStep").tasklet(sendReportTasklet()).build();
    }

    @Override
    public JobRepository getJobRepository() throws Exception {
        JobRepositoryFactoryBean factory = new JobRepositoryFactoryBean();
        factory.setDataSource(dataSource);
        factory.setTransactionManager(getTransactionManager());
        factory.setIsolationLevelForCreate("ISOLATION_READ_COMMITTED");
        factory.setTablePrefix("BATCH_");
        return factory.getObject();
    }

    @Override
    public PlatformTransactionManager getTransactionManager() throws Exception {
        return new DataSourceTransactionManager(dataSource);
    }

    @Override
    public JobLauncher getJobLauncher() throws Exception {
        SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
        jobLauncher.setJobRepository(getJobRepository());
        jobLauncher.setTaskExecutor(new SimpleAsyncTaskExecutor());
        jobLauncher.afterPropertiesSet();
        return jobLauncher;
    }

    @Override
    public JobExplorer getJobExplorer() throws Exception {
        JobExplorerFactoryBean jobExplorerFactoryBean = new JobExplorerFactoryBean();
        jobExplorerFactoryBean.setDataSource(dataSource);
        jobExplorerFactoryBean.setTablePrefix("BATCH_");
        jobExplorerFactoryBean.afterPropertiesSet();
        return jobExplorerFactoryBean.getObject();
    }
}
Here is the code for the web controller.
@RestController
@RequestMapping("/api/job")
public class WebController {

    private static final Logger logger = LoggerFactory.getLogger(WebController.class);

    @Autowired
    private DataSource dataSource;
    @Autowired
    private BatchConfiguration batchConfiguration;
    @Autowired
    private JobLauncher jobLauncher;

    @GetMapping("/endOfDayJob")
    public Long kycrBatch(@RequestParam(value = "odate", required = true) String odate) {
        logger.info("Executing endOfDayJob with odate = {}", odate);
        if (odate == null || odate.trim().isEmpty()) {
            return -1L;
        }
        JobParametersBuilder jobParametersBuilder = new JobParametersBuilder();
        jobParametersBuilder.addString("odate", odate);
        long jobExecutionId = -1L;
        try {
            Job endOfDayJob = this.batchConfiguration.endOfDayJob();
            jobParametersBuilder.addDate("runtime", new Date());
            jobExecutionId = jobLauncher.run(endOfDayJob, jobParametersBuilder.toJobParameters()).getId();
        } catch (Exception e) {
            logger.error("Error occurred executing endOfDayJob with message: {}", e.getMessage());
            return -1L;
        }
        return jobExecutionId;
    }
}
Then I want to add a new method to the controller to know whether the job has ended or not. What is a possible way to check if a job is still running or is already finished, regardless of its finalization status?

Then I want to add a new method to the controller to know whether the job has ended or not.
You can inject the JobExplorer in your controller and write something like:
public boolean isRunning(long jobExecutionId) {
    JobExecution jobExecution = jobExplorer.getJobExecution(jobExecutionId);
    return jobExecution.isRunning();
}
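For reference, a minimal sketch of how that could look as a controller endpoint; the mapping path, the null handling, and the ResponseEntity wrapper are assumptions for illustration, not part of the original code:
@Autowired
private JobExplorer jobExplorer;

@GetMapping("/endOfDayJob/{jobExecutionId}/running")
public ResponseEntity<Boolean> isRunning(@PathVariable long jobExecutionId) {
    JobExecution jobExecution = jobExplorer.getJobExecution(jobExecutionId);
    if (jobExecution == null) {
        // no execution with that id in the job repository
        return ResponseEntity.notFound().build();
    }
    // isRunning() is false once the execution has ended, whatever the
    // final status (COMPLETED, FAILED, STOPPED, ...)
    return ResponseEntity.ok(jobExecution.isRunning());
}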

Related

FlatFileFooterCallback Exception not coming in Skip Listener

Issue: an exception thrown from FlatFileFooterCallback is not skippable and the job stops.
What we tried:
Job configuration below:
@Bean
public Step mmsSlaveStep() throws Exception {
    return stepBuilderFactory.get("mmsSlaveStep").<BaseDTO, BaseDTO>chunk(chunkSize).reader(mmsReader(null))
            .processor(mmsProcessor()).writer(mmsItemWriter(null)).faultTolerant().skipLimit(skipErrorCount)
            .skip(Exception.class).listener(itemSkipListener()).build();
}
The ItemWriter that has the footer callback:
@Bean
@StepScope
public FlatFileItemWriter<BaseDTO> mmsItemWriter(@Value("#{stepExecutionContext['fileName']}") String filePath)
        throws Exception {
    BeanWrapperFieldExtractor<BaseDTO> fieldExtractor = new BeanWrapperFieldExtractor<>();
    fieldExtractor.setNames(mmsWriterFields);
    fieldExtractor.afterPropertiesSet();
    FormatterLineAggregator<BaseDTO> lineAggregator = new FormatterLineAggregator<>();
    lineAggregator.setFormat(mmsWriterFieldsFormat);
    lineAggregator.setFieldExtractor(fieldExtractor);
    String fileName = FilenameUtils.getName(filePath);
    Resource outputResource = new FileSystemResource(new File(sucessPath + fileName + inputDataSource));
    return new FlatFileItemWriterBuilder<BaseDTO>().name("mmsWriter").resource(outputResource)
            .lineAggregator(lineAggregator).shouldDeleteIfEmpty(true).footerCallback(mmsSummaryItemWriter()).build();
}
The footer callback writer whose writeFooter aggregates data into a DB table. The exception thrown here never reaches the SkipListener:
public class MMSSummaryWriter implements FlatFileFooterCallback {

    @Autowired
    private MMSMasterService mmsMasterService;

    @Value("#{stepExecution}")
    private StepExecution stepExecution;

    @Autowired
    @Qualifier("jdbcTemplate")
    private JdbcTemplate jdbcTemplate;

    @Override
    public void writeFooter(Writer writer) throws IOException {
        ExecutionContext executionContext = stepExecution.getExecutionContext();
        HashMap<String, MMSDto> summaryMap = (HashMap<String, MMSDto>) executionContext.get("Summary");
        List<MMSDto> summaryValues = new ArrayList<>(summaryMap.values());
        int[] updateCounts = jdbcTemplate.batchUpdate(InterConnectUtils.SUMMARY_EVENT_INSERT, new BatchPreparedStatementSetter() {
            @Override
            public void setValues(PreparedStatement ps, int i) throws SQLException {
                MMSDto mmsDto = summaryValues.get(i);
                ps.setString(1, mmsDto.getBatchNumber());
                ps.setString(2, mmsDto.getEntityCode());
                ps.setString(3, mmsDto.getInputFileType());
                ps.setInt(4, mmsDto.getTotalType());
                ps.setString(5, mmsDto.getServiceType());
                ps.setString(6, mmsDto.getCallInd());
                ps.setString(7, mmsDto.getCallDateTime());
                ps.setString(8, mmsDto.getEventDirection());
                ps.setString(9, mmsDto.getProductID());
                ps.setInt(10, mmsDto.getTargetInd());
                ps.setInt(11, mmsDto.getNoRecords());
                ps.setLong(26, mmsDto.getTotalDuration());
                System.out.println("InsertValues : " + mmsDto.getBatchNumber() + "#" + mmsDto.getTotalType() + "#" + mmsDto.getCallInd() + "#" + mmsDto.getServiceType() + "#"
                        + mmsDto.getProductID() + "#" + mmsDto.getCallDateTime() + "#" + mmsDto.getTotalType());
            }

            @Override
            public int getBatchSize() {
                System.out.println("summaryMap.size()" + summaryMap.size());
                return summaryMap.size();
            }
        });
    }
}
The SkipListener class that handles errors, logs them to a table, and allows the job to continue:
public class PreProcSkipListener implements SkipListener<BaseDTO, BaseDTO> {

    private static final Logger logger = LoggerFactory.getLogger(PreProcSkipListener.class);

    @Autowired
    @Qualifier("jdbcTemplate")
    private JdbcTemplate jdbcTemplate;

    @Override
    @Transactional(propagation = Propagation.REQUIRES_NEW)
    public void onSkipInWrite(BaseDTO item, Throwable t) {
        Exception ex = new Exception(t);
        jdbcTemplate.update(InterConnectUtils.BATCH_ERROR_INSERT, new PreparedStatementSetter() {
            @Override
            public void setValues(PreparedStatement ps) throws SQLException {
                MMSDto mmsDto = (MMSDto) item;
                StringWriter sw = new StringWriter();
                ex.printStackTrace(new PrintWriter(sw));
                String exceptionAsString = sw.toString();
                ps.setDate(1, new java.sql.Date(new Date().getTime()));
                ps.setString(2, mmsDto.getBatchNumber());
                ps.setString(3, "WRITER");
                Reader reader = new StringReader(exceptionAsString + "--" + ex.getMessage());
                ps.setClob(4, reader);
            }
        });
    }
}
There is an open issue on GitHub to consider (linked below). Let us know of any workaround.
github defect
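One possible workaround until that issue is resolved, sketched here as an assumption rather than a confirmed fix: since writeFooter() runs when the writer is closed, outside the chunk's skip/retry machinery, you can catch the exception yourself inside the callback. The wrapper class below is hypothetical:
public class SafeFooterCallback implements FlatFileFooterCallback {

    private static final Logger logger = LoggerFactory.getLogger(SafeFooterCallback.class);

    private final FlatFileFooterCallback delegate;

    public SafeFooterCallback(FlatFileFooterCallback delegate) {
        this.delegate = delegate;
    }

    @Override
    public void writeFooter(Writer writer) throws IOException {
        try {
            delegate.writeFooter(writer);
        } catch (RuntimeException e) {
            // exceptions thrown here never reach the SkipListener, so log
            // (or persist) the error and swallow it to let the step finish
            logger.error("Footer callback failed, continuing step", e);
        }
    }
}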

Multithreading with StoredProcedureItemReader

Is it possible to have multiple threads with a StoredProcedureItemReader? I have done multithreading with a paging reader, but I am not sure whether it will work with a StoredProcedureItemReader.
Below is the job configuration using a StoredProcedureItemReader. I have wrapped the reader in a thread-safe reader. I want to use a ThreadPoolTaskExecutor but cannot figure out how to partition the stored procedure work for each thread.
@Configuration
public class SpPocJobConfigurationMT {

    private DataSource dataSource;

    /**
     * The Job builder factory.
     */
    private JobBuilderFactory jobBuilderFactory;

    /**
     * The Jdbc template.
     */
    @Autowired
    JdbcTemplate jdbcTemplate;

    /**
     * The Step builder factory.
     */
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    private BillingRecordAuditRepository billingRecordAuditRepository;

    @Autowired
    private StagingMortgageDataTxnRepository stagingMortgageDataTxnRepository;

    private SystemRepository systemRepository;

    @Autowired
    public SpPocJobConfigurationMT(JobBuilderFactory jobBuilderFactory, StepBuilderFactory stepBuilderFactory, SystemRepository systemRepository, DataSource dataSource) {
        Assert.notNull(systemRepository, "SystemRepository cannot be null");
        Assert.notNull(jobBuilderFactory, "JobBuilderFactory cannot be null");
        Assert.notNull(stepBuilderFactory, "StepBuilderFactory cannot be null");
        Assert.notNull(dataSource, "DataSource cannot be null");
        this.jobBuilderFactory = jobBuilderFactory;
        this.stepBuilderFactory = stepBuilderFactory;
        this.systemRepository = systemRepository;
        this.dataSource = dataSource;
    }
    @Bean
    @Transactional
    @Description(value = "")
    public Job SpPocJobMT() throws Exception {
        return jobBuilderFactory.get("spPocJobMT")
                .start(spPocStepMT())
                .build();
    }

    @Bean
    public Step spPocStepMT() throws Exception {
        return stepBuilderFactory.get("spPocStepMT")
                .allowStartIfComplete(false)
                .<StagingDataDto, StagingDataDto>chunk(20)
                .reader(sybcSpReaderMT())
                .processor(spPocProcessorMT())
                .writer(spPocWriterMT())
                // .taskExecutor(new ThreadPoolTaskExecutor())
                // .taskExecutor(new SimpleAsyncTaskExecutor())
                .build();
    }

    @Bean
    public SpPocWriter spPocWriterMT() {
        return new SpPocWriter(this.billingRecordAuditRepository, this.stagingMortgageDataTxnRepository);
    }

    @Bean
    public SpPocProcessor spPocProcessorMT() {
        return new SpPocProcessor();
    }
    @Bean
    @StepScope
    public SynchronizedItemStreamReader sybcSpReaderMT() {
        StoredProcedureItemReader reader = new StoredProcedureItemReader();
        SqlParameter[] parameters = {new SqlParameter("#p_id", OracleTypes.NUMBER)
                , new SqlOutParameter("#p_out_c1", OracleTypes.CURSOR)
                , new SqlOutParameter("#p_out_c2", OracleTypes.CURSOR)
        };
        reader.setDataSource(dataSource);
        reader.setProcedureName("SP_POC_FINAL");
        reader.setRowMapper(new SPRowMapper());
        reader.setRefCursorPosition(3);
        reader.setPreparedStatementSetter(new MyItemPreparedStatementSetter());
        reader.setParameters(parameters);
        reader.setSaveState(false);
        reader.setVerifyCursorPosition(false);
        SynchronizedItemStreamReader synchronizedItemStreamReader = new SynchronizedItemStreamReader();
        synchronizedItemStreamReader.setDelegate(reader);
        return synchronizedItemStreamReader;
    }

    public class MyItemPreparedStatementSetter implements PreparedStatementSetter {
        @Override
        public void setValues(PreparedStatement ps) throws SQLException {
            ps.setInt(1, 1);
            ((CallableStatement) ps).registerOutParameter(2, OracleTypes.CURSOR);
            ((CallableStatement) ps).registerOutParameter(3, OracleTypes.CURSOR);
        }
    }
}
The StoredProcedureItemReader extends AbstractItemCountingItemStreamItemReader, which is not thread-safe; please check its Javadoc.
If you want to use the StoredProcedureItemReader in a multi-threaded step, you need to wrap it in a SynchronizedItemStreamReader or make it step-scoped.
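If you go the step-scoped route with a partitioned step instead, a sketch could look like the following. It assumes a variant of the stored procedure that takes a partition key as its input parameter; the partitioner, the "partitionId" context key, and the simplified parameter setup are illustrative assumptions:
@Bean
public Partitioner idRangePartitioner() {
    return gridSize -> {
        Map<String, ExecutionContext> partitions = new HashMap<>();
        for (int i = 0; i < gridSize; i++) {
            ExecutionContext context = new ExecutionContext();
            context.putInt("partitionId", i + 1); // key consumed by the step-scoped reader
            partitions.put("partition" + i, context);
        }
        return partitions;
    };
}

@Bean
@StepScope
public StoredProcedureItemReader<StagingDataDto> partitionedSpReader(
        @Value("#{stepExecutionContext['partitionId']}") Integer partitionId) {
    // each worker thread gets its own reader instance from the step scope,
    // so no SynchronizedItemStreamReader wrapper is needed
    StoredProcedureItemReader<StagingDataDto> reader = new StoredProcedureItemReader<>();
    reader.setDataSource(dataSource);
    reader.setProcedureName("SP_POC_FINAL");
    reader.setRowMapper(new SPRowMapper());
    reader.setPreparedStatementSetter(ps -> ps.setInt(1, partitionId));
    reader.setSaveState(false);
    return reader;
}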

How to add custom header in IntegrationFlow with Spring Batch Integration?

I created a pollableChannel that listens to an S3 bucket, fetches files, and launches a job.
My class looks like this:
@Bean
public S3SessionFactory s3SessionFactory(AmazonS3 pAmazonS3) {
    return new S3SessionFactory(pAmazonS3);
}

@Bean
public S3InboundFileSynchronizer s3InboundFileSynchronizer(S3SessionFactory s3SessionFactory) {
    S3InboundFileSynchronizer synchronizer = new S3InboundFileSynchronizer(s3SessionFactory);
    synchronizer.setPreserveTimestamp(true);
    synchronizer.setDeleteRemoteFiles(false);
    synchronizer.setRemoteDirectory(awsS3Properties.getCercBucket());
    return synchronizer;
}

@Bean
public S3InboundFileSynchronizingMessageSource s3InboundFileSynchronizingMessageSource(
        S3InboundFileSynchronizer s3InboundFileSynchronizer) {
    S3InboundFileSynchronizingMessageSource messageSource = new S3InboundFileSynchronizingMessageSource(
            s3InboundFileSynchronizer);
    messageSource.setAutoCreateLocalDirectory(true);
    messageSource.setLocalDirectory(new FileSystemResource(integrationProperties.getTempDirectoryName()).getFile());
    return messageSource;
}

@Bean("${receivable.integration.inChannel}")
public PollableChannel s3FilesChannel() {
    return new QueueChannel();
}

@Bean
public IntegrationFlow integrationFlow(
        S3InboundFileSynchronizingMessageSource s3InboundFileSynchronizingMessageSource) {
    return IntegrationFlows
            .from(s3InboundFileSynchronizingMessageSource,
                    c -> c.poller(Pollers.fixedRate(1000).maxMessagesPerPoll(1)))
            .transform(fileMessageToJobRequest()).handle(jobLaunchingGateway())
            .get();
}

@Bean
public FileMessageToJobRequest fileMessageToJobRequest() {
    FileMessageToJobRequest fileMessageToJobRequest = new FileMessageToJobRequest();
    fileMessageToJobRequest.setFileParameterName("input.file.name");
    fileMessageToJobRequest.setJob(receivablePositionJob);
    return fileMessageToJobRequest;
}

@Bean
@ServiceActivator(inputChannel = "${receivable.integration.inChannel}", poller = @Poller(fixedRate = "1000"))
public JobLaunchingGateway jobLaunchingGateway() {
    SimpleJobLauncher simpleJobLauncher = new SimpleJobLauncher();
    simpleJobLauncher.setJobRepository(jobRepository);
    simpleJobLauncher.setTaskExecutor(new SyncTaskExecutor());
    JobLaunchingGateway jobLaunchingGateway = new JobLaunchingGateway(simpleJobLauncher);
    jobLaunchingGateway.setOutputChannel(s3FilesChannel());
    return jobLaunchingGateway;
}
And my FileMessageToJobRequest looks like this:
public class FileMessageToJobRequest {

    private Job job;
    private String fileParameterName;

    public void setFileParameterName(String fileParameterName) {
        this.fileParameterName = fileParameterName;
    }

    public void setJob(Job job) {
        this.job = job;
    }

    @Transformer
    public JobLaunchRequest toRequest(Message<File> message) {
        JobParametersBuilder jobParametersBuilder = new JobParametersBuilder();
        jobParametersBuilder.addString(fileParameterName, message.getPayload().getAbsolutePath());
        return new JobLaunchRequest(job, jobParametersBuilder.toJobParameters());
    }
}
I want to add a custom MessageHeader to the Message, or, as a second option, intercept the context before the message is published, because I need to set my tenant in a ThreadLocal.
How could I do that?
Thanks in advance.
UPDATE with enrichHeaders:
@Bean
public IntegrationFlow integrationFlow(
        S3InboundFileSynchronizingMessageSource s3InboundFileSynchronizingMessageSource) {
    return IntegrationFlows
            .from(s3InboundFileSynchronizingMessageSource,
                    c -> c.poller(Pollers.fixedRate(1000).maxMessagesPerPoll(1)))
            .transform(fileMessageToJobRequest())
            .enrichHeaders(Map.of("teste", "testandio"))
            .handle(jobLaunchingGateway())
            .get();
}
First of all, you must remove that @ServiceActivator(inputChannel = "${receivable.integration.inChannel}"), since it points to the same s3FilesChannel, which is also an outputChannel of that JobLaunchingGateway. So you are creating a loop with such a configuration; not sure how it works for you at all...
To add a header before sending to that JobLaunchingGateway, you just need to add enrichHeaders() before your .handle(jobLaunchingGateway()) in that integrationFlow definition.
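If the header value has to be derived at runtime (for example the tenant resolved from the incoming file), a sketch with headerFunction() could look like this. Note the enricher is placed before the transformer so the payload is still the File, and resolveTenantFromFileName() is a hypothetical helper:
@Bean
public IntegrationFlow integrationFlow(
        S3InboundFileSynchronizingMessageSource s3InboundFileSynchronizingMessageSource) {
    return IntegrationFlows
            .from(s3InboundFileSynchronizingMessageSource,
                    c -> c.poller(Pollers.fixedRate(1000).maxMessagesPerPoll(1)))
            // compute the header from the message rather than using a fixed value
            .enrichHeaders(h -> h.headerFunction("tenant",
                    m -> resolveTenantFromFileName(((File) m.getPayload()).getName())))
            .transform(fileMessageToJobRequest())
            .handle(jobLaunchingGateway())
            .get();
}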

Spring Batch: AsyncItemProcessor and AsyncItemWriter

1) I have a large file (> 100k lines) that needs to be processed. I have a lot of business validation and checks against external systems for each line item. The code is being migrated from a legacy app, and I just put this business logic into the AsyncItemProcessor, which also persists the data into the DB. Is it good practice to create/save records in the ItemProcessor (in lieu of the ItemWriter)?
2) Code is:
@Configuration
@EnableAutoConfiguration
@ComponentScan(basePackages = "com.liquidation.lpid")
@EntityScan(basePackages = "com.liquidation.lpid.entities")
@EnableTransactionManagement
public class SimpleJobConfiguration {

    @Autowired
    public JobRepository jobRepository;

    @Autowired
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    @Qualifier("myFtpSessionFactory")
    private SessionFactory myFtpSessionFactory;

    @Autowired
    public JobBuilderFactory jobBuilderFactory;

    @Bean
    public ThreadPoolTaskExecutor lpidItemTaskExecutor() {
        ThreadPoolTaskExecutor tExec = new ThreadPoolTaskExecutor();
        tExec.setCorePoolSize(10);
        tExec.setMaxPoolSize(10);
        tExec.setAllowCoreThreadTimeOut(true);
        return tExec;
    }

    @BeforeStep
    public void beforeStep(StepExecution stepExecution) {
        String name = stepExecution.getStepName();
        System.out.println("name: " + name);
    }

    @Bean
    public SomeItemWriterListener someItemWriterListener() {
        return new SomeItemWriterListener();
    }

    @Bean
    @StepScope
    public FlatFileItemReader<FieldSet> lpidItemReader(@Value("#{stepExecutionContext['fileResource']}") String fileResource) {
        System.out.println("itemReader called !!!!!!!!!!! for customer data" + fileResource);
        FlatFileItemReader<FieldSet> reader = new FlatFileItemReader<FieldSet>();
        reader.setResource(new ClassPathResource("/data/stage/" + fileResource));
        reader.setLinesToSkip(1);
        DefaultLineMapper<FieldSet> lineMapper = new DefaultLineMapper<FieldSet>();
        DelimitedLineTokenizer tokenizer = new DelimitedLineTokenizer();
        reader.setSkippedLinesCallback(new LineCallbackHandler() {
            public void handleLine(String line) {
                if (line != null) {
                    tokenizer.setNames(line.split(","));
                }
            }
        });
        lineMapper.setLineTokenizer(tokenizer);
        lineMapper.setFieldSetMapper(new PassThroughFieldSetMapper());
        lineMapper.afterPropertiesSet();
        reader.setLineMapper(lineMapper);
        return reader;
    }
    @Bean
    public ItemWriter<FieldSet> lpidItemWriter() {
        return new LpidItemWriter();
    }

    @Autowired
    private MultiFileResourcePartitioner multiFileResourcePartitioner;

    @Bean
    public Step masterStep() {
        return stepBuilderFactory.get("masterStep")
                .partitioner(slaveStep().getName(), multiFileResourcePartitioner)
                .step(slaveStep())
                .gridSize(4)
                .taskExecutor(lpidItemTaskExecutor())
                .build();
    }

    @Bean
    public ItemProcessListener<FieldSet, String> processListener() {
        return new LpidItemProcessListener();
    }

    @SuppressWarnings("unchecked")
    @Bean
    public Step slaveStep() {
        return stepBuilderFactory.get("slaveStep")
                .<FieldSet, FieldSet>chunk(5)
                .faultTolerant()
                .listener(new ChunkListener())
                .reader(lpidItemReader(null))
                .processor(asyncItemProcessor())
                .writer(asyncItemWriter()).listener(someItemWriterListener()).build();
    }
    @Bean
    public AsyncItemWriter<FieldSet> asyncItemWriter() {
        AsyncItemWriter<FieldSet> asyncItemWriter = new AsyncItemWriter<>();
        asyncItemWriter.setDelegate(lpidItemWriter());
        try {
            asyncItemWriter.afterPropertiesSet();
        } catch (Exception e) {
            e.printStackTrace();
        }
        return asyncItemWriter;
    }
    @Bean
    public ItemProcessor<FieldSet, FieldSet> processor() {
        return new lpidCheckItemProcessor();
    }

    @Bean
    public AsyncItemProcessor<FieldSet, FieldSet> asyncItemProcessor() {
        AsyncItemProcessor<FieldSet, FieldSet> asyncItemProcessor = new AsyncItemProcessor<FieldSet, FieldSet>();
        asyncItemProcessor.setDelegate(processor());
        asyncItemProcessor.setTaskExecutor(lpidItemTaskExecutor());
        try {
            asyncItemProcessor.afterPropertiesSet();
        } catch (Exception e) {
            e.printStackTrace();
        }
        return asyncItemProcessor;
    }

    @Bean
    public Job job() throws Exception {
        return jobBuilderFactory.get("job").incrementer(new RunIdIncrementer()).start(masterStep()).build();
    }
}
The ItemWriter runs before the ItemProcessor has completed. My understanding is: for every chunk, the item reader reads the data, the item processor churns through each item, and at the end of the chunk the item writer gets called (which in my case does nothing, since the item processor persists the data). But the ItemWriter gets called before the item processor completes, and my job never finishes. What am I doing incorrectly here? (I looked at previous issues around this and the solution was to wrap the writer in an AsyncItemWriter, which I am doing.)
Thanks
Sundar
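For context on the ordering question: AsyncItemWriter does wait for the processor results, since it blocks on each Future before delegating. Conceptually it behaves like this simplified sketch (not the exact library source):
public void write(List<? extends Future<T>> items) throws Exception {
    List<T> processed = new ArrayList<>();
    for (Future<T> future : items) {
        processed.add(future.get()); // blocks until the AsyncItemProcessor finishes this item
    }
    delegate.write(processed); // the delegate only ever sees completed items
}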

How to run update query in Spring JPA for quartz job

I have a quartz job in Spring 4, and I am using JPA (Hibernate) to update database values through the quartz job, but I am getting javax.persistence.TransactionRequiredException: Executing an update/delete query.
I don't understand what kind of configuration is missing in the quartz job. I referred to the SpringBeanAutowiringSupport example; the update still fails, but selects work fine.
Below is my code:
@Configuration
@ComponentScan("com.stock")
public class QuartzConfiguration {

    @Autowired
    private ApplicationContext applicationContext;

    @Bean
    public JobDetailFactoryBean jobDetailBalanceCarryForward() {
        JobDetailFactoryBean factory = new JobDetailFactoryBean();
        factory.setJobClass(BillingCroneSvcImpl.class);
        Map<String, Object> map = new HashMap<String, Object>();
        map.put("task", "balanceCarryForward");
        factory.setJobDataAsMap(map);
        factory.setGroup("BalanceCarryForwardJob");
        factory.setName("balance carry forward");
        return factory;
    }

    @Bean
    public CronTriggerFactoryBean cronTriggerBalanceCarryForward() {
        CronTriggerFactoryBean stFactory = new CronTriggerFactoryBean();
        stFactory.setJobDetail(jobDetailBalanceCarryForward().getObject());
        stFactory.setStartDelay(3000);
        stFactory.setName("balancCarryForwardTrigger");
        stFactory.setGroup("balanceCarryForwardgroup");
        stFactory.setCronExpression("0 0/1 * 1/1 * ? *");
        return stFactory;
    }

    @Bean
    public SpringBeanJobFactory springBeanJobFactory() {
        AutoWiringSpringBeanJobFactory jobFactory = new AutoWiringSpringBeanJobFactory();
        jobFactory.setApplicationContext(applicationContext);
        return jobFactory;
    }

    @Bean
    public SchedulerFactoryBean schedulerFactoryBean() {
        SchedulerFactoryBean schedulerFactory = new SchedulerFactoryBean();
        schedulerFactory.setJobFactory(springBeanJobFactory());
        schedulerFactory.setTriggers(cronTriggerBalanceCarryForward().getObject());
        return schedulerFactory;
    }
}
Below is the class where the quartz executeInternal method is written:
@Service
@PersistJobDataAfterExecution
@DisallowConcurrentExecution
public class BillingCroneSvcImpl extends QuartzJobBean implements BillingCroneSvc {

    @Autowired
    private BillingCroneRepo billingCroneRepo;

    @Override
    @Transactional
    protected void executeInternal(JobExecutionContext context) throws JobExecutionException {
        SpringBeanAutowiringSupport.processInjectionBasedOnCurrentContext(context);
        billingCroneRepo.updateBalance();
        // this method throws javax.persistence.TransactionRequiredException: Executing an update/delete query
    }
}
App config class
@EnableWebMvc
@EnableTransactionManagement
@Configuration
@ComponentScan({ "com.stock.*" })
@Import({ SecurityConfig.class })
@PropertySource("classpath:jdbc.properties")
public class AppConfig extends WebMvcConfigurerAdapter {

    private static final String PROPERTY_NAME_DATABASE_DRIVER = "db.driver";
    private static final String PROPERTY_NAME_DATABASE_PASSWORD = "db.password";
    private static final String PROPERTY_NAME_DATABASE_URL = "db.url";
    private static final String PROPERTY_NAME_DATABASE_USERNAME = "db.username";
    private static final String PROPERTY_NAME_HIBERNATE_DIALECT = "hibernate.dialect";
    private static final String PROPERTY_NAME_HIBERNATE_SHOW_SQL = "hibernate.show_sql";
    private static final String PROPERTY_NAME_ENTITYMANAGER_PACKAGES_TO_SCAN = "entitymanager.packages.to.scan";

    @Resource
    private Environment env;

    @Bean(name = "dataSource")
    public DriverManagerDataSource dataSource() {
        DriverManagerDataSource driverManagerDataSource = new DriverManagerDataSource();
        driverManagerDataSource.setDriverClassName(env.getRequiredProperty(PROPERTY_NAME_DATABASE_DRIVER));
        driverManagerDataSource.setUrl(env.getRequiredProperty(PROPERTY_NAME_DATABASE_URL));
        driverManagerDataSource.setUsername(env.getRequiredProperty(PROPERTY_NAME_DATABASE_USERNAME));
        driverManagerDataSource.setPassword(env.getRequiredProperty(PROPERTY_NAME_DATABASE_PASSWORD));
        return driverManagerDataSource;
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
        LocalContainerEntityManagerFactoryBean entityManagerFactoryBean = new LocalContainerEntityManagerFactoryBean();
        entityManagerFactoryBean.setDataSource(dataSource());
        entityManagerFactoryBean.setPersistenceProviderClass(HibernatePersistence.class);
        entityManagerFactoryBean.setPackagesToScan(env.getRequiredProperty(PROPERTY_NAME_ENTITYMANAGER_PACKAGES_TO_SCAN));
        entityManagerFactoryBean.setJpaProperties(hibProperties());
        return entityManagerFactoryBean;
    }

    private Properties hibProperties() {
        Properties properties = new Properties();
        properties.put(PROPERTY_NAME_HIBERNATE_DIALECT, env.getRequiredProperty(PROPERTY_NAME_HIBERNATE_DIALECT));
        properties.put(PROPERTY_NAME_HIBERNATE_SHOW_SQL, env.getRequiredProperty(PROPERTY_NAME_HIBERNATE_SHOW_SQL));
        return properties;
    }

    @Bean
    public PlatformTransactionManager transactionManager() {
        JpaTransactionManager transactionManager = new JpaTransactionManager();
        transactionManager.setEntityManagerFactory(entityManagerFactory().getObject());
        return transactionManager;
    }

    @Bean
    public ReloadableResourceBundleMessageSource messageSource() {
        ReloadableResourceBundleMessageSource messageSource = new ReloadableResourceBundleMessageSource();
        String[] resources = { "classpath:messages" };
        messageSource.setBasenames(resources);
        return messageSource;
    }

    @Bean
    public LocaleResolver localeResolver() {
        final CookieLocaleResolver ret = new CookieLocaleResolver();
        ret.setDefaultLocale(new Locale("en_IN"));
        return ret;
    }

    @Bean
    public LocaleChangeInterceptor localeChangeInterceptor() {
        LocaleChangeInterceptor localeChangeInterceptor = new LocaleChangeInterceptor();
        localeChangeInterceptor.setParamName("language");
        return localeChangeInterceptor;
    }

    @Override
    public void addResourceHandlers(ResourceHandlerRegistry registry) {
        registry.addResourceHandler("/Angular/**").addResourceLocations("/Angular/");
        registry.addResourceHandler("/css/**").addResourceLocations("/css/");
        registry.addResourceHandler("/email_templates/**").addResourceLocations("/email_templates/");
        registry.addResourceHandler("/fonts/**").addResourceLocations("/fonts/");
        registry.addResourceHandler("/img/**").addResourceLocations("/img/");
        registry.addResourceHandler("/js/**").addResourceLocations("/js/");
        registry.addResourceHandler("/Landing_page/**").addResourceLocations("/Landing_page/");
    }

    @Bean
    public static PropertySourcesPlaceholderConfigurer properties() {
        PropertySourcesPlaceholderConfigurer pspc = new PropertySourcesPlaceholderConfigurer();
        org.springframework.core.io.Resource[] resources = new ClassPathResource[] { new ClassPathResource("application.properties") };
        pspc.setLocations(resources);
        pspc.setIgnoreUnresolvablePlaceholders(true);
        return pspc;
    }

    @Bean
    public InternalResourceViewResolver viewResolver() {
        InternalResourceViewResolver viewResolver = new InternalResourceViewResolver();
        viewResolver.setViewClass(JstlView.class);
        viewResolver.setPrefix("/WEB-INF/pages/");
        viewResolver.setSuffix(".jsp");
        return viewResolver;
    }

    // through the bean below we directly read the properties file in JSP files
    @Bean(name = "propertyConfigurer")
    public PropertiesFactoryBean mapper() {
        PropertiesFactoryBean bean = new PropertiesFactoryBean();
        bean.setLocation(new ClassPathResource("application.properties"));
        return bean;
    }
}
Can anybody please assist me with how to resolve this transactional issue in Spring JPA with quartz?
Thank you all for your help. Finally I autowired the EntityManagerFactory instead of the persistence EntityManager and it is working fine. I tried every scenario, but nothing managed to inject the Spring transaction into quartz, so finally I autowired the EntityManagerFactory.
Below is my repo class code.
@Repository
public class BillingCroneRepoImpl implements BillingCroneRepo {

    /*@PersistenceContext
    private EntityManager entityManager;*/

    @Autowired
    EntityManagerFactory entityManagerFactory;

    public boolean updateTable() {
        EntityManager entityManager = entityManagerFactory.createEntityManager();
        EntityTransaction entityTransaction = entityManager.getTransaction();
        entityTransaction.begin(); // this will go in try catch
        Query query = entityManager.createQuery(updateSql);
        // update table code goes here
        entityTransaction.commit(); // this will go in try catch
        return true;
    }
}
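For completeness, a fuller version of that pattern with the try/catch the comments allude to (a sketch; updateSql stands in for the author's actual update statement):
public boolean updateTable() {
    EntityManager entityManager = entityManagerFactory.createEntityManager();
    EntityTransaction entityTransaction = entityManager.getTransaction();
    try {
        entityTransaction.begin();
        int updated = entityManager.createQuery(updateSql).executeUpdate();
        entityTransaction.commit();
        return updated > 0;
    } catch (RuntimeException e) {
        if (entityTransaction.isActive()) {
            entityTransaction.rollback(); // don't leave a dangling transaction
        }
        throw e;
    } finally {
        entityManager.close(); // manually created EntityManagers must be closed
    }
}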
I'm not a Spring specialist, but I think instantiating the class yourself with new doesn't work with @Transactional:
@Service
@PersistJobDataAfterExecution
@DisallowConcurrentExecution
public class BillingCroneSvcImpl extends QuartzJobBean implements BillingCroneSvc {

    @Autowired
    BillingCroneRepo billingCroneRepo;

    @Override
    @Transactional
    protected void executeInternal(JobExecutionContext context) throws JobExecutionException {
        SpringBeanAutowiringSupport.processInjectionBasedOnCurrentContext(context);
        billingCroneRepo.updateBalance();
    }
}
It's because quartz is using the bean itself instead of the proxy generated for @Transactional.
Use either MethodInvokingJobDetailFactoryBean (instead of inheriting from QuartzJobBean), or use a dedicated wrapper quartz bean (inheriting from QuartzJobBean) that calls the Spring bean (not inheriting from QuartzJobBean) carrying the @Transactional annotation.
EDIT: this is in fact not the problem.
The problem is here:
JobDetailFactoryBean factory = new JobDetailFactoryBean();
factory.setJobClass(BillingCroneSvcImpl.class);
By passing the class, I presume that Quartz will instantiate it itself, so Spring won't create it and won't wrap the bean in a proxy that handles the @Transactional behaviour.
Instead you must use something along these lines:
@Bean(name = "billingCroneSvc")
public BillingCroneSvc getSvc() {
    return new BillingCroneSvcImpl();
}

@Bean
public JobDetailFactoryBean jobDetailBalanceCarryForward() {
    JobDetailFactoryBean factory = new JobDetailFactoryBean();
    getSvc(); // just make sure the bean is instantiated
    factory.setBeanName("billingCroneSvc");
    ...
}
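Alternatively, a sketch of the wrapper approach described above, assuming the configured AutoWiringSpringBeanJobFactory performs the autowiring; the job class and the service method name runBalanceCarryForward() are illustrative:
public class BillingCroneQuartzJob extends QuartzJobBean {

    @Autowired
    private BillingCroneSvc billingCroneSvc; // injected through AutoWiringSpringBeanJobFactory

    @Override
    protected void executeInternal(JobExecutionContext context) throws JobExecutionException {
        // the call goes through the Spring proxy, so @Transactional
        // on the service method opens a real JPA transaction
        billingCroneSvc.runBalanceCarryForward();
    }
}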