The query in my reader takes a really long time to fetch results due to multiple table joins. I am considering splitting the query joins and using temp tables if possible. Is this a feasible solution? Can Spring Batch support the use of temp tables between the reader, processor and writer?
Yes, it is possible. You should use the same DataSource instance for your reader, processor, and writer.
Example:
@Component
public class DataSourceDao {

    private DataSource dataSource;

    public DataSource getDataSource() {
        return dataSource;
    }

    @Autowired
    public void setDataSource(DataSource dataSource) {
        this.dataSource = dataSource;
    }
}
Reader:
public class MyReader implements ItemReader<POJO_CLASS> {

    @Autowired
    private DataSourceDao dataSource;

    private final JdbcCursorItemReader<POJO_CLASS> reader = new JdbcCursorItemReader<>();

    @Override
    public POJO_CLASS read() throws Exception, UnexpectedInputException,
            ParseException, NonTransientResourceException {
        reader.setDataSource(dataSource.getDataSource());
        // Implement your read logic, e.g. delegate to the cursor reader
        return reader.read();
    }
}
Writer:
public class YourWriter implements ItemWriter<POJO_CLASS> {

    @Autowired
    private DataSourceDao dataSource;

    private final JdbcBatchItemWriter<POJO_CLASS> writer = new JdbcBatchItemWriter<>();

    @Override
    public void write(List<? extends POJO_CLASS> items) {
        writer.setDataSource(dataSource.getDataSource());
        // Your write logic goes here
    }
}
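To stage the result of the expensive join in a temp table, a common approach is a two-step job: a plain Tasklet step materializes the join into the temp table first, and the chunk-oriented step then reads from it through the same DataSource. A minimal sketch, assuming a hypothetical TMP_RESULTS table (the table, columns and query are illustrative, not from the original post):

@Bean
public Step populateTempTableStep(StepBuilderFactory stepBuilderFactory, DataSource dataSource) {
    return stepBuilderFactory.get("populateTempTableStep")
            .tasklet((contribution, chunkContext) -> {
                // Materialize the split-out join once, before the chunk step runs
                new JdbcTemplate(dataSource).update(
                        "INSERT INTO TMP_RESULTS SELECT a.id, b.value FROM table_a a JOIN table_b b ON a.id = b.a_id");
                return RepeatStatus.FINISHED;
            })
            .build();
}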
Is it possible to have multiple threads with StoredProcedureItemReader? I have done multithreading with a paging reader, but I am not sure if it will work with StoredProcedureItemReader.
Below is the job configuration using a StoredProcedureItemReader. I have wrapped the reader in a thread-safe reader. I want to use a ThreadPoolTaskExecutor, but I cannot figure out how to partition the work for each thread with the stored procedure.
@Configuration
public class SpPocJobConfigurationMT {

    private DataSource dataSource;

    /**
     * The Job builder factory.
     */
    private JobBuilderFactory jobBuilderFactory;

    /**
     * The Jdbc template.
     */
    @Autowired
    JdbcTemplate jdbcTemplate;

    /**
     * The Step builder factory.
     */
    private StepBuilderFactory stepBuilderFactory;

    @Autowired
    private BillingRecordAuditRepository billingRecordAuditRepository;

    @Autowired
    private StagingMortgageDataTxnRepository stagingMortgageDataTxnRepository;

    private SystemRepository systemRepository;

    @Autowired
    public SpPocJobConfigurationMT(JobBuilderFactory jobBuilderFactory, StepBuilderFactory stepBuilderFactory, SystemRepository systemRepository, DataSource dataSource) {
        Assert.notNull(systemRepository, "SystemRepository cannot be null");
        Assert.notNull(jobBuilderFactory, "JobBuilderFactory cannot be null");
        Assert.notNull(stepBuilderFactory, "StepBuilderFactory cannot be null");
        Assert.notNull(dataSource, "DataSource cannot be null");
        this.jobBuilderFactory = jobBuilderFactory;
        this.stepBuilderFactory = stepBuilderFactory;
        this.systemRepository = systemRepository;
        this.dataSource = dataSource;
    }

    @Bean
    @Transactional
    @Description(value = "")
    public Job SpPocJobMT() throws Exception {
        return jobBuilderFactory.get("spPocJobMT")
                .start(spPocStepMT())
                .build();
    }

    @Bean
    public Step spPocStepMT() throws Exception {
        return stepBuilderFactory.get("spPocStepMT")
                .allowStartIfComplete(false)
                .<StagingDataDto, StagingDataDto>chunk(20)
                .reader(sybcSpReaderMT())
                .processor(spPocProcessorMT())
                .writer(spPocWriterMT())
                // .taskExecutor(new ThreadPoolTaskExecutor())
                // .taskExecutor(new SimpleAsyncTaskExecutor())
                .build();
    }

    @Bean
    public SpPocWriter spPocWriterMT() {
        return new SpPocWriter(this.billingRecordAuditRepository, this.stagingMortgageDataTxnRepository);
    }

    @Bean
    public SpPocProcessor spPocProcessorMT() {
        return new SpPocProcessor();
    }

    @Bean
    @StepScope
    public SynchronizedItemStreamReader sybcSpReaderMT() {
        StoredProcedureItemReader reader = new StoredProcedureItemReader();
        SqlParameter[] parameters = {new SqlParameter("@p_id", OracleTypes.NUMBER)
                , new SqlOutParameter("@p_out_c1", OracleTypes.CURSOR)
                , new SqlOutParameter("@p_out_c2", OracleTypes.CURSOR)
        };
        reader.setDataSource(dataSource);
        reader.setProcedureName("SP_POC_FINAL");
        reader.setRowMapper(new SPRowMapper());
        reader.setRefCursorPosition(3);
        reader.setPreparedStatementSetter(new MyItemPreparedStatementSetter());
        reader.setParameters(parameters);
        reader.setSaveState(false);
        reader.setVerifyCursorPosition(false);
        SynchronizedItemStreamReader synchronizedItemStreamReader = new SynchronizedItemStreamReader();
        synchronizedItemStreamReader.setDelegate(reader);
        return synchronizedItemStreamReader;
    }

    public class MyItemPreparedStatementSetter implements PreparedStatementSetter {
        @Override
        public void setValues(PreparedStatement ps) throws SQLException {
            ps.setInt(1, 1);
            ((CallableStatement) ps).registerOutParameter(2, OracleTypes.CURSOR);
            ((CallableStatement) ps).registerOutParameter(3, OracleTypes.CURSOR);
        }
    }
}
The StoredProcedureItemReader extends AbstractItemCountingItemStreamItemReader, which is not thread-safe (see its Javadoc).
If you want to use the StoredProcedureItemReader in a multi-threaded step, you need to wrap it in a SynchronizedItemStreamReader or make it step-scoped.
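If you go the step-scoped route with a partitioned step, each partition gets its own reader instance, so no synchronization wrapper is needed. A minimal sketch, not from the original answer: the "partitionId" key and the idea of passing it to the procedure as its input parameter are assumptions:

@Bean
@StepScope
public StoredProcedureItemReader<StagingDataDto> partitionedSpReader(
        @Value("#{stepExecutionContext['partitionId']}") Integer partitionId) {
    StoredProcedureItemReader<StagingDataDto> reader = new StoredProcedureItemReader<>();
    reader.setDataSource(dataSource);
    reader.setProcedureName("SP_POC_FINAL");
    reader.setRowMapper(new SPRowMapper());
    reader.setRefCursorPosition(3);
    reader.setParameters(new SqlParameter[]{
            new SqlParameter("@p_id", OracleTypes.NUMBER),
            new SqlOutParameter("@p_out_c1", OracleTypes.CURSOR),
            new SqlOutParameter("@p_out_c2", OracleTypes.CURSOR)});
    reader.setPreparedStatementSetter(ps -> {
        // Each partition reads a distinct slice by passing its id to the procedure
        ps.setInt(1, partitionId);
        ((CallableStatement) ps).registerOutParameter(2, OracleTypes.CURSOR);
        ((CallableStatement) ps).registerOutParameter(3, OracleTypes.CURSOR);
    });
    reader.setSaveState(false);
    return reader;
}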
I wrote a simple demo to override the default job repository. Instead of the map-based one, I wanted an H2 database to hold persistent metadata.
Therefore I wrote a CustomBatchConfigurer like this:
@Configuration
public class CustomBatchConfigurer extends DefaultBatchConfigurer {

    @Autowired
    @Qualifier("repo-db")
    DataSource dataSource;

    @Override
    public void setDataSource(DataSource dataSource) {
        super.setDataSource(dataSource);
    }

    @Bean(name = "repo-db")
    public DataSource getJobRepoDataSource() {
        return DataSourceBuilder
                .create()
                .url("jdbc:h2:tcp://localhost/~/src/spring-batch/batch_repo")
                .driverClassName("org.h2.Driver")
                .username("sa")
                .password("test")
                .type(HikariDataSource.class)
                .build();
    }
}
But Spring Batch is not picking it up:
o.s.b.c.c.a.DefaultBatchConfigurer: No datasource was provided...using a Map based JobRepository
What am I doing wrong? I thought I had followed the instructions in the Spring reference documentation.
Thanks and regards,
Jörg
Your configuration should look more like this (in your version, the overridden setDataSource loses the @Autowired annotation from the parent class, since method annotations are not inherited, so Spring never calls it and the configurer falls back to the map-based repository):
@Configuration
public class CustomBatchConfiguration {

    @Bean
    public BatchConfigurer batchConfigurer(@Qualifier("repo-db") DataSource dataSource) {
        return new DefaultBatchConfigurer(dataSource);
    }

    @Bean(name = "repo-db")
    public DataSource jobRepoDataSource() {
        return DataSourceBuilder
                .create()
                .url("jdbc:h2:tcp://localhost/~/src/spring-batch/batch_repo")
                .driverClassName("org.h2.Driver")
                .username("sa")
                .password("test")
                .type(HikariDataSource.class)
                .build();
    }
}
If your bean methods are proxied (which is the default), you can also simplify the first bean method to:

@Bean
public BatchConfigurer batchConfigurer() {
    return new DefaultBatchConfigurer(jobRepoDataSource());
}
Please also have a second look at the official documentation: https://docs.spring.io/spring-batch/docs/4.3.x/reference/html/job.html#javaConfig
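Note that the job repository tables must also exist in the H2 database. Spring Batch ships the DDL in its core jar, so you can initialize the schema at startup; a sketch using Spring's DataSourceInitializer (the bean name is illustrative):

@Bean
public DataSourceInitializer batchSchemaInitializer(@Qualifier("repo-db") DataSource dataSource) {
    // Runs Spring Batch's bundled H2 schema script against the repository database
    ResourceDatabasePopulator populator = new ResourceDatabasePopulator(
            new ClassPathResource("org/springframework/batch/core/schema-h2.sql"));
    DataSourceInitializer initializer = new DataSourceInitializer();
    initializer.setDataSource(dataSource);
    initializer.setDatabasePopulator(populator);
    return initializer;
}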
I am using Spring Batch to read data from MongoDB using the MongoItemReader bean. Suppose I want to read data from 2 different collections in the same job instance. Is this possible?
@Bean
@StepScope
public MongoItemReader<Object> reader() throws UnexpectedInputException, ParseException, Exception {
    DataReader dataReader = new DataReader();
    return dataReader.read();
}

@Bean
public DataItemProcessor processor() {
    return new DataItemProcessor();
}

@Bean
public MongoItemWriter<DestinationCollectionModelClass> writer() {
    MongoItemWriter<DestinationCollectionModelClass> writer = new MongoItemWriter<>();
    writer.setCollection("collection_name_where_data_is_saved");
    writer.setTemplate(mongoTemplate);
    return writer;
}

@Bean
public Step step1(MongoItemWriter<DestinationModelClass> writer) throws UnexpectedInputException, ParseException, Exception {
    return stepBuilderFactory.get("step1")
            // TODO: P3 chunk size configurable
            .<Object, DestinationModelClass>chunk(100)
            .reader(reader())
            .processor(processor())
            .writer(writer)
            .build();
}
Below is my class DataReader.java
public class DataReader extends MongoItemReader {

    @Autowired
    private MongoTemplate mongoTemplate;

    @Override
    public MongoItemReader<Object> read() throws Exception, UnexpectedInputException, ParseException {
        List<Object> mongoItemReaderList = new ArrayList<>();
        Map<String, Direction> sorts = new HashMap<>();
        sorts.put("_id", Direction.ASC);
        MongoItemReader<Object> collectionOneReader = new MongoItemReader<>();
        collectionOneReader.setTemplate(mongoTemplate);
        collectionOneReader.setTargetType(CollectionOneModelClass.class);
        collectionOneReader.setQuery("{}");
        collectionOneReader.setSort(sorts);
        MongoItemReader<Object> collectionTwoReader = new MongoItemReader<>();
        collectionTwoReader.setTemplate(mongoTemplate);
        collectionTwoReader.setTargetType(CollectionTwoModelClass.class);
        collectionTwoReader.setQuery("{}");
        collectionTwoReader.setSort(sorts);
        mongoItemReaderList.add(collectionOneReader);
        mongoItemReaderList.add(collectionTwoReader);
        // Note: this cast fails at runtime, since a List is not a MongoItemReader
        MongoItemReader<Object> readerObject = (MongoItemReader<Object>) mongoItemReaderList;
        return readerObject;
    }
}
Below is my DataItemProcessor.java
public class DataItemProcessor implements ItemProcessor<Object, DestinationModelClass> {

    public DataItemProcessor() {}

    @Override
    public DestinationModelClass process(Object phi) throws Exception {
        DestinationModelClass hbd = new DestinationModelClass();
        if (phi instanceof CollectionOneModelClass) {
            // Processing code if Object is an instance of CollectionOneModelClass
        }
        if (phi instanceof CollectionTwoModelClass) {
            // Processing code if Object is an instance of CollectionTwoModelClass
        }
        return hbd;
    }
}
You can't have two readers in the same chunk-oriented step. What you can do is use the driving query pattern, which, in your case, could be implemented as follows (a sketch follows this list):
Item Reader: reads items from collection 1
Item Processor: enriches items from collection 2
Item Writer: writes enriched items to collection 3
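A minimal sketch of the enriching processor in the driving query pattern (the class names reuse those from the question; the getRelatedId() accessor and the "collection_two" name are hypothetical):

public class EnrichingItemProcessor implements ItemProcessor<CollectionOneModelClass, DestinationModelClass> {

    private final MongoTemplate mongoTemplate;

    public EnrichingItemProcessor(MongoTemplate mongoTemplate) {
        this.mongoTemplate = mongoTemplate;
    }

    @Override
    public DestinationModelClass process(CollectionOneModelClass item) {
        // Enrich the driving item with the matching document from the second collection
        CollectionTwoModelClass related = mongoTemplate.findById(
                item.getRelatedId(), CollectionTwoModelClass.class, "collection_two");
        DestinationModelClass result = new DestinationModelClass();
        // map fields from item and related onto result here
        return result;
    }
}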
I have a step in my batch job that I want to use only to delete rows from a table.
The step looks like this:
@Bean
public Step step2(StepBuilderFactory factory,
                  PurgeAggBalanceWriter writer,
                  DataSource dataSource,
                  PlatformTransactionManager platformTransactionManager) {
    return factory.get("step2")
            .transactionManager(platformTransactionManager)
            .<Assessment, Assessment>chunk(10)
            .reader(getReader(dataSource, READER_QUERY2, "AggBalanceMapper", new AggBalanceMapper()))
            .writer(writer)
            .build();
}
I am using this writer class with a JdbcTemplate to run the delete statement:
public class PurgeAggBalanceWriter implements ItemWriter<Assessment> {

    private JdbcTemplate jdbcTemplate;

    private static final String DELETE_QUERY = "DELETE FROM TABLE WHERE COLUMN = 'TEST'";

    public PurgeAggBalanceWriter(DataSource dataSource) {
        this.jdbcTemplate = new JdbcTemplate(dataSource);
    }

    @Override
    public void write(List<? extends Assessment> list) {
        jdbcTemplate.update(DELETE_QUERY);
    }
}
The step completes successfully, but I don't see why an ItemReader is required, as the error states when I try to remove .reader() from step2.
Is there a way to avoid using a reader/mapper and just use the writer, since all I have to do is run a delete query?
all I have to do is run a delete query
In this case, you don't need a chunk-oriented tasklet. A simple tasklet is enough, something like:
public class DeletionTasklet implements Tasklet {

    private static final String DELETE_QUERY = "DELETE FROM TABLE WHERE COLUMN = 'TEST'";

    private JdbcTemplate jdbcTemplate;

    public DeletionTasklet(DataSource dataSource) {
        this.jdbcTemplate = new JdbcTemplate(dataSource);
    }

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) {
        jdbcTemplate.update(DELETE_QUERY);
        return RepeatStatus.FINISHED;
    }
}
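The tasklet is then registered directly on the step, with no reader involved; a sketch, assuming the same StepBuilderFactory used elsewhere in your configuration:

@Bean
public Step deletionStep(StepBuilderFactory stepBuilderFactory, DataSource dataSource) {
    return stepBuilderFactory.get("deletionStep")
            .tasklet(new DeletionTasklet(dataSource))
            .build();
}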
I have a Quartz job in Spring 4, and I am using JPA (Hibernate) to update database values through the Quartz job, but I am getting javax.persistence.TransactionRequiredException: Executing an update/delete query.
I don't understand what kind of configuration is missing in the Quartz job. I referred to the SpringBeanAutowiringSupport example; the select works fine, but the update still fails.
Below is my code.
@Configuration
@ComponentScan("com.stock")
public class QuartzConfiguration {

    @Autowired
    private ApplicationContext applicationContext;

    @Bean
    public JobDetailFactoryBean jobDetailBalanceCarryForward() {
        JobDetailFactoryBean factory = new JobDetailFactoryBean();
        factory.setJobClass(BillingCroneSvcImpl.class);
        Map<String, Object> map = new HashMap<String, Object>();
        map.put("task", "balanceCarryForward");
        factory.setJobDataAsMap(map);
        factory.setGroup("BalanceCarryForwardJob");
        factory.setName("balance carry forward");
        return factory;
    }

    @Bean
    public CronTriggerFactoryBean cronTriggerBalanceCarryForward() {
        CronTriggerFactoryBean stFactory = new CronTriggerFactoryBean();
        stFactory.setJobDetail(jobDetailBalanceCarryForward().getObject());
        stFactory.setStartDelay(3000);
        stFactory.setName("balancCarryForwardTrigger");
        stFactory.setGroup("balanceCarryForwardgroup");
        stFactory.setCronExpression("0 0/1 * 1/1 * ? *");
        return stFactory;
    }

    @Bean
    public SpringBeanJobFactory springBeanJobFactory() {
        AutoWiringSpringBeanJobFactory jobFactory = new AutoWiringSpringBeanJobFactory();
        jobFactory.setApplicationContext(applicationContext);
        return jobFactory;
    }

    @Bean
    public SchedulerFactoryBean schedulerFactoryBean() {
        SchedulerFactoryBean schedulerFactory = new SchedulerFactoryBean();
        schedulerFactory.setJobFactory(springBeanJobFactory());
        schedulerFactory.setTriggers(cronTriggerBalanceCarryForward().getObject());
        return schedulerFactory;
    }
}
Below is the class where the Quartz executeInternal method is written:
@Service
@PersistJobDataAfterExecution
@DisallowConcurrentExecution
public class BillingCroneSvcImpl extends QuartzJobBean implements BillingCroneSvc {

    @Autowired
    private BillingCroneRepo billingCroneRepo;

    @Override
    @Transactional
    protected void executeInternal(JobExecutionContext context) throws JobExecutionException {
        SpringBeanAutowiringSupport.processInjectionBasedOnCurrentContext(context);
        billingCroneRepo.updateBalance();
        // this method throws javax.persistence.TransactionRequiredException: Executing an update/delete query
    }
}
App config class:
@EnableWebMvc
@EnableTransactionManagement
@Configuration
@ComponentScan({ "com.stock.*" })
@Import({ SecurityConfig.class })
@PropertySource("classpath:jdbc.properties")
public class AppConfig extends WebMvcConfigurerAdapter {

    private static final String PROPERTY_NAME_DATABASE_DRIVER = "db.driver";
    private static final String PROPERTY_NAME_DATABASE_PASSWORD = "db.password";
    private static final String PROPERTY_NAME_DATABASE_URL = "db.url";
    private static final String PROPERTY_NAME_DATABASE_USERNAME = "db.username";
    private static final String PROPERTY_NAME_HIBERNATE_DIALECT = "hibernate.dialect";
    private static final String PROPERTY_NAME_HIBERNATE_SHOW_SQL = "hibernate.show_sql";
    private static final String PROPERTY_NAME_ENTITYMANAGER_PACKAGES_TO_SCAN = "entitymanager.packages.to.scan";

    @Resource
    private Environment env;

    @Bean(name = "dataSource")
    public DriverManagerDataSource dataSource() {
        DriverManagerDataSource driverManagerDataSource = new DriverManagerDataSource();
        driverManagerDataSource.setDriverClassName(env.getRequiredProperty(PROPERTY_NAME_DATABASE_DRIVER));
        driverManagerDataSource.setUrl(env.getRequiredProperty(PROPERTY_NAME_DATABASE_URL));
        driverManagerDataSource.setUsername(env.getRequiredProperty(PROPERTY_NAME_DATABASE_USERNAME));
        driverManagerDataSource.setPassword(env.getRequiredProperty(PROPERTY_NAME_DATABASE_PASSWORD));
        return driverManagerDataSource;
    }

    @Bean
    public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
        LocalContainerEntityManagerFactoryBean entityManagerFactoryBean = new LocalContainerEntityManagerFactoryBean();
        entityManagerFactoryBean.setDataSource(dataSource());
        entityManagerFactoryBean.setPersistenceProviderClass(HibernatePersistence.class);
        entityManagerFactoryBean.setPackagesToScan(env.getRequiredProperty(PROPERTY_NAME_ENTITYMANAGER_PACKAGES_TO_SCAN));
        entityManagerFactoryBean.setJpaProperties(hibProperties());
        return entityManagerFactoryBean;
    }

    private Properties hibProperties() {
        Properties properties = new Properties();
        properties.put(PROPERTY_NAME_HIBERNATE_DIALECT, env.getRequiredProperty(PROPERTY_NAME_HIBERNATE_DIALECT));
        properties.put(PROPERTY_NAME_HIBERNATE_SHOW_SQL, env.getRequiredProperty(PROPERTY_NAME_HIBERNATE_SHOW_SQL));
        return properties;
    }

    @Bean
    public PlatformTransactionManager transactionManager() {
        JpaTransactionManager transactionManager = new JpaTransactionManager();
        transactionManager.setEntityManagerFactory(entityManagerFactory().getObject());
        return transactionManager;
    }

    @Bean
    public ReloadableResourceBundleMessageSource messageSource() {
        ReloadableResourceBundleMessageSource messageSource = new ReloadableResourceBundleMessageSource();
        String[] resources = {"classpath:messages"};
        messageSource.setBasenames(resources);
        return messageSource;
    }

    @Bean
    public LocaleResolver localeResolver() {
        final CookieLocaleResolver ret = new CookieLocaleResolver();
        ret.setDefaultLocale(new Locale("en_IN"));
        return ret;
    }

    @Bean
    public LocaleChangeInterceptor localeChangeInterceptor() {
        LocaleChangeInterceptor localeChangeInterceptor = new LocaleChangeInterceptor();
        localeChangeInterceptor.setParamName("language");
        return localeChangeInterceptor;
    }

    @Override
    public void addResourceHandlers(ResourceHandlerRegistry registry) {
        registry.addResourceHandler("/Angular/**").addResourceLocations("/Angular/");
        registry.addResourceHandler("/css/**").addResourceLocations("/css/");
        registry.addResourceHandler("/email_templates/**").addResourceLocations("/email_templates/");
        registry.addResourceHandler("/fonts/**").addResourceLocations("/fonts/");
        registry.addResourceHandler("/img/**").addResourceLocations("/img/");
        registry.addResourceHandler("/js/**").addResourceLocations("/js/");
        registry.addResourceHandler("/Landing_page/**").addResourceLocations("/Landing_page/");
    }

    @Bean
    public static PropertySourcesPlaceholderConfigurer properties() {
        PropertySourcesPlaceholderConfigurer pspc = new PropertySourcesPlaceholderConfigurer();
        org.springframework.core.io.Resource[] resources = new ClassPathResource[] { new ClassPathResource("application.properties") };
        pspc.setLocations(resources);
        pspc.setIgnoreUnresolvablePlaceholders(true);
        return pspc;
    }

    @Bean
    public InternalResourceViewResolver viewResolver() {
        InternalResourceViewResolver viewResolver = new InternalResourceViewResolver();
        viewResolver.setViewClass(JstlView.class);
        viewResolver.setPrefix("/WEB-INF/pages/");
        viewResolver.setSuffix(".jsp");
        return viewResolver;
    }

    // through the code below we directly read the properties file in JSP files
    @Bean(name = "propertyConfigurer")
    public PropertiesFactoryBean mapper() {
        PropertiesFactoryBean bean = new PropertiesFactoryBean();
        bean.setLocation(new ClassPathResource("application.properties"));
        return bean;
    }
}
Can anybody please assist me with how to resolve this transactional issue in Spring JPA with Quartz?
Thank you all for your help. Finally I autowired EntityManagerFactory instead of the persistence EntityManager, and it is working fine. I tried every scenario, but nothing worked to inject Spring's transaction handling into the Quartz job, so I finally autowired EntityManagerFactory.
Below is my repo class code.
@Repository
public class BillingCroneRepoImpl implements BillingCroneRepo {

    /*@PersistenceContext
    private EntityManager entityManager;*/

    @Autowired
    EntityManagerFactory entityManagerFactory;

    public boolean updateTable() {
        EntityManager entityManager = entityManagerFactory.createEntityManager();
        EntityTransaction entityTransaction = entityManager.getTransaction();
        entityTransaction.begin(); // this will go in try catch
        Query query = entityManager.createQuery(updateSql);
        query.executeUpdate(); // update table code goes here
        entityTransaction.commit(); // this will go in try catch
        return true;
    }
}
I'm not a Spring specialist, but I think instantiating the job with new doesn't work with @Transactional:
@Service
@PersistJobDataAfterExecution
@DisallowConcurrentExecution
public class BillingCroneSvcImpl extends QuartzJobBean implements BillingCroneSvc {

    @Autowired
    BillingCroneRepo billingCroneRepo;

    @Override
    @Transactional
    protected void executeInternal(JobExecutionContext context) throws JobExecutionException {
        SpringBeanAutowiringSupport.processInjectionBasedOnCurrentContext(context);
        billingCroneRepo.updateBalance();
    }
}
It's because Quartz is using the bean instance directly instead of the proxy generated for @Transactional.
Use either MethodInvokingJobDetailFactoryBean (instead of inheriting from QuartzJobBean), or use a dedicated wrapper Quartz bean (inheriting from QuartzJobBean) that calls the Spring bean (not inheriting from QuartzJobBean) carrying the @Transactional annotation.
EDIT: this is in fact not the problem.
The problem is here:

JobDetailFactoryBean factory = new JobDetailFactoryBean();
factory.setJobClass(BillingCroneSvcImpl.class);

By passing the class, I presume Quartz will instantiate it itself, so Spring won't create it and won't wrap the bean in a proxy that handles the @Transactional behaviour.
Instead you must use something along these lines:
#Bean(name = "billingCroneSvc")
public BillingCroneSvc getSvc(){
return new BillingCroneSvcImpl();
}
#Bean
public JobDetailFactoryBean jobDetailBalanceCarryForward(){
JobDetailFactoryBean factory = new JobDetailFactoryBean();
getSvc();// just make sure the bean is instantiated
factory.setBeanName("billingCroneSvc");
...
}