How can we differentiate between String and Long in job parameters passed to Spring Batch job through CommandLine - spring-batch

I'm using the vulnscanner job example and tried to add two additional parameters. Here is the URL I used to run the job:
curl -d jobParameters=ipAddress=74.54.219.210,outputFile=logs/tb.xml,country=USA,randomId=1245873 http://localhost:8080/partitioningJobs/jobs/vulnScannerJob.json
When I used VARCHAR as the datatype for country and BIGINT as the datatype for randomId, the example threw the following exception in the scanPorts step.
Caused by: org.springframework.jdbc.UncategorizedSQLException: PreparedStatementCallback; uncategorized SQLException for SQL [UPDATE BATCH_STEP_EXECUTION_CONTEXT SET SHORT_CONTEXT = ?, SERIALIZED_CONTEXT = ? WHERE STEP_EXECUTION_ID = ?]; SQL state [25P02]; error code [0]; ERROR: current transaction is aborted, commands ignored until end of transaction block; nested exception is org.postgresql.util.PSQLException: ERROR: current transaction is aborted, commands ignored until end of transaction block
at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:84)
at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:81)
at org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:81)
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:660)
at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:909)
at org.springframework.jdbc.core.JdbcTemplate.update(JdbcTemplate.java:970)
at org.springframework.batch.core.repository.dao.JdbcExecutionContextDao.persistSerializedContext(JdbcExecutionContextDao.java:233)
at org.springframework.batch.core.repository.dao.JdbcExecutionContextDao.updateExecutionContext(JdbcExecutionContextDao.java:161)
at org.springframework.batch.core.repository.support.SimpleJobRepository.updateExecutionContext(SimpleJobRepository.java:205)
at sun.reflect.GeneratedMethodAccessor110.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:317)
at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157)
at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:96)
at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:260)
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:94)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:207)
at com.sun.proxy.$Proxy18.updateExecutionContext(Unknown Source)
at org.springframework.batch.core.step.tasklet.TaskletStep$ChunkTransactionCallback.doInTransaction(TaskletStep.java:451)
... 50 more
Caused by: org.postgresql.util.PSQLException: ERROR: current transaction is aborted, commands ignored until end of transaction block
at org.postgresql.core.v3.QueryExecutorImpl.receiveErrorResponse(QueryExecutorImpl.java:2458)
at org.postgresql.core.v3.QueryExecutorImpl.processResults(QueryExecutorImpl.java:2158)
at org.postgresql.core.v3.QueryExecutorImpl.execute(QueryExecutorImpl.java:291)
at org.postgresql.jdbc.PgStatement.executeInternal(PgStatement.java:432)
at org.postgresql.jdbc.PgStatement.execute(PgStatement.java:358)
at org.postgresql.jdbc.PgPreparedStatement.executeWithFlags(PgPreparedStatement.java:171)
at org.postgresql.jdbc.PgPreparedStatement.executeUpdate(PgPreparedStatement.java:138)
at org.apache.commons.dbcp.DelegatingPreparedStatement.executeUpdate(DelegatingPreparedStatement.java:102)
at org.springframework.jdbc.core.JdbcTemplate$2.doInPreparedStatement(JdbcTemplate.java:916)
at org.springframework.jdbc.core.JdbcTemplate$2.doInPreparedStatement(JdbcTemplate.java:909)
at org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:644)
... 68 more
Caused by: org.postgresql.util.PSQLException: ERROR: operator does not exist: bigint = character varying
Hint: No operator matches the given name and argument type(s). You might need to add explicit type casts.
Position: 67
I changed both datatypes to VARCHAR and it ran fine. But I want to know whether Spring Batch treats all of its job parameters as Strings only, or whether it supports other datatypes too.
FYI, I used the randomId job parameter in the ItemReader and got this error. I have made all the necessary changes in the Target and TargetRowMapper classes to handle these additional parameters.
<bean id="targetItemReader" class="org.springframework.batch.item.database.JdbcPagingItemReader" scope="step">
<property name="dataSource" ref="dataSource" />
<property name="queryProvider">
<bean
class="org.springframework.batch.item.database.support.SqlPagingQueryProviderFactoryBean">
<property name="dataSource" ref="dataSource" />
<property name="selectClause" value="ID, RANDOMID, IP, PORT, CONNECTED, BANNER" />
<property name="fromClause" value="FROM TARGET" />
<property name="whereClause" value="RANDOMID = :qid AND ID >= :minId AND ID <= :maxId AND CONNECTED IS NULL"/>
<property name="sortKey" value="ID" />
</bean>
</property>
<property name="pageSize" value="10" />
<property name="parameterValues">
<map>
<entry key="minId" value="#{stepExecutionContext[minValue]}"/>
<entry key="maxId" value="#{stepExecutionContext[maxValue]}"/>
<entry key="qid" value="#{jobParameters[randomId]}"/>
</map>
</property>
<property name="rowMapper">
<bean class="com.michaelminella.springbatch.domain.TargetRowMapper"/>
</property>
</bean>

As I understand it, #{jobParameters} is an unmodifiable Map (wrapped via java.util.Collections.unmodifiableMap) of type Map<String, JobParameter>. It can hold parameters of several types, such as:
JobExecution jobExecution = jobLauncher.run(this.exampleJob,
        new JobParametersBuilder()
                .addDate("now", new Date())
                .addString("paramString", "stringAsParam")
                .addLong("paramLong", 1234L)
                .toJobParameters());
In your case, you passed all params as Strings; that is why you are getting an error at the query level.
To fix it, I would change
<entry key="qid" value="#{jobParameters[randomId]}"/>
to
<entry key="qid" value="#{T(java.lang.Long).parseLong(jobParameters[randomId])}"/>

Related

com.atomikos.jdbc.AtomikosSQLException: Connection pool exhausted - try increasing 'maxPoolSize' and/or 'borrowConnectionTimeout' on the DataSourceBean

In my Spring Batch application, I'm using Atomikos version 4.0.4 and JTA 1.1. Some of the jobs hung in PROD and acquired all the connections from the DB, which in turn blocked the other jobs that were triggered in parallel. They all failed with the errors below.
Error Log 1:
Could not get JDBC Connection; nested exception is com.atomikos.jdbc.AtomikosSQLException: Connection pool exhausted - try increasing 'maxPoolSize' and/or 'borrowConnectionTimeout' on the DataSourceBean.
Error Log 2 :
Failed to grow connection pool.
Also, for almost 13 jobs no instance got created in the DB, and in Control-M the logs were captured with "Null Exception Message intercepted".
Can anyone please suggest what to do about this issue? I even tried upgrading the Atomikos version up to 5.0.0, but the same issue still occurs.
AtomikosDataSourceBean ads = new AtomikosDataSourceBean();
if (mDevModeDriverClassName.toLowerCase().contains("oracle")) {
    if (!mDevModeDriverClassName.equals("oracle.jdbc.xa.client.OracleXADataSource")) {
        log.warn("DataSource property 'devModeDriverClassName' should be set "
                + "to 'oracle.jdbc.xa.client.OracleXADataSource' when using Oracle! "
                + "Current value is: " + mDevModeDriverClassName);
    }
}
String vUniqueResourceName = "DS-" + UUID.randomUUID();
log.debug("Creating Oracle XA DataSource. uniqueResourceName={}", vUniqueResourceName);
ads.setUniqueResourceName(vUniqueResourceName);
ads.setXaDataSourceClassName(mDevModeDriverClassName); // "oracle.jdbc.xa.client.OracleXADataSource"
ads.setMaxPoolSize((mDevModeMaxSize > 0) ? mDevModeMaxSize : 1); // mDevModeMaxSize = 10
ads.setTestQuery("SELECT 1 FROM DUAL");
Properties xaProps = new Properties();
xaProps.setProperty("user", mDevModeUsername);
xaProps.setProperty("password", mDevModePassword);
xaProps.setProperty("URL", mDevModeJdbcUrl);
ads.setXaProperties(xaProps);
OracleXADataSource xaDataSource = new OracleXADataSource();
xaDataSource.setUser(mDevModeUsername);
xaDataSource.setPassword(mDevModePassword);
xaDataSource.setURL(mDevModeJdbcUrl);
ads.setXaDataSource(xaDataSource);
<bean id="rsDataSource" class="com.sample.CustomDataSource" scope="singleton" destroy-method="close">
<property name="devModeDriverClassName" value="${spring.datasource.driver-class-name}" />
<property name="devModeJdbcUrl" value="${spring.datasource.rs.url}" />
<property name="devModeUsername" value="${spring.datasource.rs.username}" />
<property name="devModePassword" value="${spring.datasource.rs.password}" />
<property name="devModeMaxSize" value="10" />
</bean>
<bean id="transactionManager"
class="org.springframework.transaction.jta.JtaTransactionManager" >
<property name="nestedTransactionAllowed" value="true"/>
<property name="allowCustomIsolationLevels" value="true"/>
<property name="defaultTimeout" value="-1"/>
<property name="transactionManager" ref="txManager"></property>
</bean>
<bean id="txManager" class="com.atomikos.icatch.jta.UserTransactionManager" destroy-method="close">
<property name="forceShutdown" value="true"/>
<property name="transactionTimeout" value="60"></property>
</bean>
You either have a connection leak or your pool size is not correctly configured. Looking at your config, it is most likely the latter:
And almost for 13 jobs no instance got created
ads.setMaxPoolSize((mDevModeMaxSize > 0) ? mDevModeMaxSize : 1); //mDevModeMaxSize =10
<property name="devModeMaxSize" value="10" />
Your connection pool is set to serve at most 10 connections, but you are launching 13 jobs, so it should not be surprising to see the error:
Connection pool exhausted - try increasing 'maxPoolSize'
You need to increase the maxPoolSize accordingly.
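For illustration, a minimal sketch of the sizing rule, applied to the AtomikosDataSourceBean from the question (the headroom figure is an assumption; size it for your own peak):
// Hypothetical sizing: the pool must cover peak concurrency, not a fixed default.
int concurrentJobs = 13;            // observed peak from the question
int headroom = 5;                   // assumption: spare connections for repository updates
ads.setMaxPoolSize(concurrentJobs + headroom);
ads.setBorrowConnectionTimeout(30); // seconds; fail fast instead of hanging indefinitely
Setting borrowConnectionTimeout explicitly also makes exhaustion surface as a quick, diagnosable failure rather than jobs silently waiting for a connection.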

Spring-batch Entity Manager becomes null after init

I'm currently implementing a Spring Batch job that reads and writes files BUT also needs to do CRUD operations on a database.
I've tried to simply define an EntityManager in my XML configuration and use it in my DAO class. However, right after the init, the EntityManager becomes null.
Can anyone help me with this? (A solution or a link to something usable would be perfect.)
My batchContext.xml
<bean id="dataSource" class="org.apache.commons.dbcp.BasicDataSource" destroy-method="close">
<property name="driverClassName" value="${batch.datasource.driverClassName}"/>
<property name="url" value="${batch.datasource.url}"/>
<property name="username" value="${batch.datasource.username}"/>
<property name="password" value="${batch.datasource.password}"/>
<property name="testOnBorrow" value="true"/>
<property name="testOnReturn" value="true"/>
<property name="testWhileIdle" value="true"/>
<property name="timeBetweenEvictionRunsMillis" value="1800000"/>
<property name="numTestsPerEvictionRun" value="3"/>
<property name="minEvictableIdleTimeMillis" value="1800000"/>
</bean>
<bean id="entityManagerFactory" name="entTransactionMgr" class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean">
<!-- <property name="persistenceXmlLocation" value="classpath:/META-INF/spring/persistence.xml" /> -->
<property name="persistenceUnitName" value="persistenceUnit"/>
<property name="packagesToScan" value="${jpa.scan.packages}"/>
<property name="dataSource" ref="dataSource"/>
<property name="jpaVendorAdapter">
<bean class="org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter"/>
</property>
<property name="loadTimeWeaver">
<bean class="org.springframework.instrument.classloading.InstrumentationLoadTimeWeaver" />
</property>
<!-- Custom jpaDialect pour le deuxieme batch:job-repository-->
<property name="jpaDialect">
<bean class="fr.mma.soecm.batchfacade.util.CustomHibernateJpaDialect" />
</property>
<property name="jpaProperties">
<props>
<!-- multiple props here-->
</props>
</property>
</bean>
<bean id="transactionManager" class="org.springframework.orm.jpa.JpaTransactionManager">
<property name="entityManagerFactory" ref="entityManagerFactory"/>
</bean>
<!-- mode="aspectj" -->
<tx:annotation-driven transaction-manager="transactionManager"/>
<batch:job-repository id="jobRepository" data-source="dataSource" transaction-manager="transactionManager"/>
<!-- Jobs held separatelly -->
<import resource="batchContext-job.xml"/>
My DAO
@Repository("batchJobDao")
@Transactional
public class BatchJobDaoImpl implements BatchJobDao {

    @PersistenceContext(unitName = "persistenceUnit")
    @Autowired
    private EntityManager entityManager;

    // @PersistenceContext(unitName = "persistenceUnit", type = PersistenceContextType.EXTENDED)
    // public void setEntityManager(EntityManager entityManager) {
    //     System.out.println("Setting Entity Manager :" + entityManager);
    //     this.entityManager = entityManager;
    // }

    @PostConstruct
    public void init() {
        if (entityManager == null) {
            System.out.println(" Entity Manager is null");
        } else {
            System.out.println(" Entity Manager is not null");
            getAllJobExecutions();
        }
    }

    private static final String RECHERCHER_JOB_EXECUTION = "Select bje from BatchJobExecution bje";

    @SuppressWarnings("unchecked")
    @Override
    @Transactional("transactionManager")
    public List<BatchJobExecution> getAllJobExecutions() {
        // EntityManagerFactory emf = Persistence.createEntityManagerFactory("entTransactionMgr");
        // EntityManager em = emf.createEntityManager();
        Query query = entityManager.createQuery(RECHERCHER_JOB_EXECUTION, BatchJobExecution.class);
        List<BatchJobExecution> executions = (List<BatchJobExecution>) query.getResultList();
        System.out.println("EXES : " + executions);
        return executions;
    }
}
There is some code commented out because I've tried multiple approaches (changing the persistence context type, recovering the entity manager "manually", having a persistence.xml file for the entityManager) without success.
My output when running the job is (without all the extra lines..):
Entity Manager is not null
EXES : [fr.mma.soecm.batchfacade.domain.BatchJobExecution#10e6c33,...]
[BatchService] - Synchronous job launch
[AbstractStep] - Encountered an error executing the step
Caused by: java.lang.NullPointerException
The NullPointerException, on debug, is thrown by the EntityManager being null when I call createQuery in my DAO.
Thanks for your help.
I'll keep searching on my end.. God Speed!
As mentioned in the comment above, my apparent problem was due to the fact that I was trying to call the Service or the DAO in the constructor of the Step.
After moving the Service call in the "doRead()" method, I could perform all CRUD operations with the EntityManager.
Please let me know if anyone has questions about this or how to make it work otherwise, as I've not found any explanation on the internet since I began searching last week.
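To make the fix concrete, here is a minimal sketch of the shape that worked, with hypothetical class names (the key point is that the injected DAO is only touched inside read(), never in the constructor):
import java.util.Iterator;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Autowired;
// Hypothetical reader illustrating the fix: Spring-managed dependencies
// (@PersistenceContext / @Autowired proxies) are not wired yet while the
// constructor runs, so the first DAO call is deferred to read().
public class JobExecutionItemReader implements ItemReader<BatchJobExecution> {

    @Autowired
    private BatchJobDao batchJobDao; // injected after construction

    private Iterator<BatchJobExecution> iterator;

    @Override
    public BatchJobExecution read() {
        if (iterator == null) {
            // Safe here: the context is fully initialized and the DAO's
            // transactional EntityManager is usable.
            iterator = batchJobDao.getAllJobExecutions().iterator();
        }
        return iterator.hasNext() ? iterator.next() : null; // null ends the step
    }
}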

Spring Batch: DAO call from processor slows down batch process -- can I reuse DB connection?

When I add a DAO call in the MdwValidatingItemProcessor below, I get a serious hit in performance. Before adding the additional DAO call, 2500 records were being read/processed/written in about 40 seconds. When I add the one additional DAO call (lobPolicyMapper.getPolicyOrigin(item.getLob(), item.getRgn_cd())), it becomes 120 seconds, and the lob_policy table only has one record in it. I plan to have many such DAO calls in my processor to do additional validation of the item. Can I reuse the DAO's connection, so that the connection doesn't have to be opened and closed constantly for each processor validating an item? The AsyncItemProcessor delegates to the MdwValidatingItemProcessor:
public class MdwValidatingItemProcessor implements
        ItemProcessor<MecMdw, MecMdw> {

    @Autowired
    DataSourceTransactionManager txManager;
    @Autowired
    LobPolicyMapper lobPolicyMapper;
    @Autowired
    MecUtils mecUtils;
    @Autowired
    LobShopMapper lobShopMapper;
    @Autowired
    KaiserIssuerMapper kaiserIssuerMapper;

    private Validator validator;

    public void setValidator(Validator validator) {
        this.validator = validator;
    }

    private List<Map> validationReasons;

    @BeforeStep
    public void beforeStep(StepExecution stepExecution) {
        validationReasons = (List<Map>) stepExecution.getJobExecution().getExecutionContext().get("validationReasons");
    }

    public MecMdw process(MecMdw item) throws Exception {
        DefaultTransactionDefinition def = new DefaultTransactionDefinition();
        def.setPropagationBehavior(TransactionDefinition.PROPAGATION_REQUIRED);
        TransactionStatus status = txManager.getTransaction(def);
        BindingResult results = BindAndValidate(item);
        try {
            if (results.hasErrors()) {
                buildValidationException(results, item);
                return setAsKickOut(item);
            }
            mecUtils.checkForPaddingAndValidateSSN(item);
            // if both SSN and DOB are valid, set Rep_DOB_or_SSN to be SSN
            if (item.getInvalid_ssn().equals(MecConstants.Y_FLAG) && mecUtils.isDOBValid(item)) {
                item.setRep_doborssn("SSN");
            } else if (mecUtils.isDOBValid(item)) {
                item.setRep_doborssn("DOB");
            } else {
                List<String> listOfErrors = new ArrayList<String>();
                listOfErrors.add("BothSSNAndDOBInvalid");
                item.setValidationErrors(listOfErrors);
                return setAsKickOut(item);
            }
            // Policy Origin code not found in MEC database based on Region / LOB.
            LobPolicy lobPolicy = lobPolicyMapper.getPolicyOrigin(
                    item.getLob(), item.getRgn_cd());
            if (lobPolicy == null || ("").equals(lobPolicy.getLob())
                    || lobPolicy.getLob() == null) {
                return setAsKickOut(item);
            }
            // set the Rep_PolicyOgnCd
            item.setRep_policy_ogn_cd(lobPolicy.getPolicyOrigin());
            // If origin of policy = SHOP, look for Shop Identifier (mec_lob_shop table).
            if (("SHOP").equals(lobPolicy.getPolicyOrigin())) {
                if (lobShopMapper.getValidationShopIdentifier(item) == null) {
                    return setAsKickOut(item);
                }
            }
            // Validation Shop Identifier not found in MEC database based on Region/LOB/PID.
            if (lobShopMapper.getValidationShopIdentifier(item) == null) {
                return setAsKickOut(item);
            }
            // Kaiser Issuer EIN not found in MEC database based on Region/LOB.
            Integer kaiserIssuer = kaiserIssuerMapper.getKaiserIssuerIdWithLobAndRegion(item);
            if (kaiserIssuer == 0) {
                return setAsKickOut(item);
            } else {
                item.setRep_kaiser_issuer_id(kaiserIssuer.toString());
            }
        } catch (Exception ex) {
            ex.printStackTrace();
        } finally {
            txManager.commit(status);
        }
        return item;
    }

    private MecMdw setAsKickOut(MecMdw item) {
        item.setKick_out_fl('Y');
        return item;
    }

    private BindingResult BindAndValidate(MecMdw item) {
        DataBinder binder = new DataBinder(item);
        binder.setValidator(validator);
        binder.validate();
        return binder.getBindingResult();
    }

    private void buildValidationException(BindingResult results, MecMdw item) {
        List<String> listOfErrors = new ArrayList<String>();
        for (ObjectError error : results.getAllErrors()) {
            String[] codes = error.getCodes();
            listOfErrors.add(codes[1]);
        }
        item.setValidationErrors(listOfErrors);
        // throw new ValidationException(msg.toString());
    }
}
I have a batch job using AsyncItemProcessor and AsyncItemWriter as follows:
<job id="mecmdwvalidatorJob" xmlns="http://www.springframework.org/schema/batch">
<step id="mdwvalidatorStep1">
<tasklet>
<chunk reader="pageItemReader" processor="asyncItemProcessor"
writer="asynchItemWriter" commit-interval="1000" skip-limit="2147483647">
<skippable-exception-classes> <!-- TODO -->
<include class="java.lang.Exception" />
</skippable-exception-classes>
</chunk>
</tasklet>
</step>
</job>
<bean id="pageItemReader"
class="org.springframework.batch.item.database.JdbcPagingItemReader">
<property name="dataSource" ref="dataSource" />
<property name="queryProvider">
<bean
class="org.springframework.batch.item.database.support.SqlPagingQueryProviderFactoryBean">
<property name="dataSource" ref="dataSource" />
<property name="selectClause"
value="select MDW_ID,FK_LOG_FILE_ID,TAX_YEAR,SUBS_TYPE_CD,SUB_FIRST_NM,SUB_MIDDLE_NM,SUB_LAST_NM,SUB_SUFFIX,SUB_DOB,SUB_ADDR1,SUB_ADDR2,SUB_CITY,SUB_STATE,SUB_PROVINCE,SUB_ZIP,SUB_ZIP4,SUB_COUNTRY_CD,SUB_COUNTRY,SUB_F_POSTAL_CD,LOB,SUB_SSN,GRP_EMP_NAME1,GRP_EMP_NAME2,GRP_EIN,GRP_ADDR1,GRP_ADDR2,GRP_CITY,GRP_STATE,GRP_PROVINCE,GRP_ZIP,GRP_ZIP4,GRP_COUNTRY_CD,GRP_COUNTRY,GRP_F_POSTAL_CD,ISSUER_NAME1,ISSUER_NAME2,ISSUER_PHONE,ISSUER_ADDR1,ISSUER_ADDR2,ISSUER_CITY,ISSUER_PROVINCE,ISSUER_ZIP,ISSUER_ZIP4,ISSUER_COUNTRY_CD,ISSUER_COUNTRY,ISSUER_F_POSTAL_CD,MEM_FIRST_NM,MEM_MIDDLE_NM,MEM_LAST_NM,MEM_SUFFIX,MEM_SSN,MEM_DOB,MEM_START_DATE,MEM_END_DATE,REGION_CD,SUB_MRN,SUB_MRN_PREFIX,MEM_MRN,MRN_PREFIX,PID,SUB_GRP_ID,SUB_GRP_NAME,INVALID_ADDR_FL" />
<property name="fromClause"
value="from MEC_MDW JOIN MEC_FILE_LOG on MEC_FILE_LOG.LOG_FILE_ID=MEC_MDW.FK_LOG_FILE_ID " />
<property name="whereClause" value="where MEC_FILE_LOG.STATUS=:status" />
<property name="sortKey" value="MDW_ID" />
</bean>
</property>
<property name="parameterValues">
<map>
<entry key="status" value="READY TO VALIDATE" />
</map>
</property>
<property name="pageSize" value="1000" />
<property name="rowMapper" ref="mdwRowMapper" />
</bean>
<bean id="mdwRowMapper" class="org.my.rowmapper.MdwRowMapper" />
<bean id="asyncItemProcessor"
class="org.springframework.batch.integration.async.AsyncItemProcessor">
<property name="delegate">
<bean
class="org.my.itemprocessor.MdwValidatingItemProcessor">
<property name="validator">
<bean
class="org.springframework.validation.beanvalidation.LocalValidatorFactoryBean" />
</property>
</bean>
</property>
<property name="taskExecutor" ref="taskExecutor" />
</bean>
<task:executor id="taskExecutor" pool-size="10" />
<bean id="asynchItemWriter"
class="org.springframework.batch.integration.async.AsyncItemWriter">
<property name="delegate" ref="customerCompositeWriter">
</property>
</bean>
<bean id="customerCompositeWriter"
class="org.springframework.batch.item.support.CompositeItemWriter">
<property name="delegates">
<list>
<ref bean="itemWriter1" />
<ref bean="itemWriter2" />
</list>
</property>
</bean>
<bean id="itemWriter1" class="org.my.writer.MdwWriter" />
<bean id="itemWriter2" class="org.my.writer.KickoutWriter" />
The transaction manager and MyBatis SqlSessionTemplate:
<!-- transaction manager, use JtaTransactionManager for global tx -->
<bean id="transactionManager"
class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
<property name="dataSource" ref="dataSource" />
</bean>
<!-- define SqlSessionFactory as BATCH execution -->
<bean id="sqlSession" class="org.mybatis.spring.SqlSessionTemplate">
<constructor-arg index="0" ref="sqlSessionFactory" />
<constructor-arg index="1" value="BATCH" />
</bean>
<!-- stored job-meta in database -->
<bean id="jobRepository"
class="org.springframework.batch.core.repository.support.JobRepositoryFactoryBean">
<property name="dataSource" ref="dataSource" />
<property name="transactionManager" ref="batchTransactionManager" />
<property name="databaseType" value="sqlserver" />
</bean>
<bean id="jobLauncher"
class="org.springframework.batch.core.launch.support.SimpleJobLauncher">
<property name="jobRepository" ref="jobRepository" />
</bean>
The DAO making the call:
public interface LobPolicyMapper {
    LobPolicy getPolicyOrigin(@Param("lob") String lob, @Param("regionCd") String regionCd);
}
UPDATE I added ehcache to the MyBatis XML. This will help with the repetitive queries. But again, I am looking for a way to share this same DAO's connection across the AsyncItemProcessors:
<mapper namespace="org.mybatis.LobPolicyMapper">
<cache type="org.mybatis.caches.ehcache.EhcacheCache"/>
<select id="getPolicyOrigin" parameterType="hashmap" resultType='org.my.domain.LobPolicy'>
SELECT
l.lob_id as lobId,
l.lob as lob,
l.policy_origin as policyOrigin,
l.region_cd as regionCd
from mec_lob_policy l
where lob= #{lob} and region_cd=#{regionCd}
</select>
</mapper>
I added ehcache to the project
<dependency>
<groupId>org.mybatis</groupId>
<artifactId>mybatis-ehcache</artifactId>
<version>1.0.0</version>
</dependency>
I think I need a database pool. Here is what I am seeing in my logs on exiting the processor:
2015-08-27 17:09:54,194 [taskExecutor-3] DEBUG org.mybatis.spring.SqlSessionUtils - Releasing transactional SqlSession [org.apache.ibatis.session.defaults.DefaultSqlSession#77b027f]
2015-08-27 17:10:06,674 [taskExecutor-3] DEBUG org.mybatis.spring.SqlSessionUtils - Transaction synchronization committing SqlSession [org.apache.ibatis.session.defaults.DefaultSqlSession#77b027f]
2015-08-27 17:10:06,675 [taskExecutor-3] DEBUG org.mybatis.spring.SqlSessionUtils - Transaction synchronization deregistering SqlSession [org.apache.ibatis.session.defaults.DefaultSqlSession#77b027f]
2015-08-27 17:10:06,675 [taskExecutor-3] DEBUG org.mybatis.spring.SqlSessionUtils - Transaction synchronization closing SqlSession [org.apache.ibatis.session.defaults.DefaultSqlSession#77b027f]
2015-08-27 17:10:06,675 [taskExecutor-3] DEBUG org.springframework.jdbc.datasource.DataSourceTransactionManager - Initiating transaction commit
2015-08-27 17:10:06,675 [taskExecutor-3] DEBUG org.springframework.jdbc.datasource.DataSourceTransactionManager - Committing JDBC transaction on Connection [ConnectionID:14 ClientConnectionId: 1b95dc0e-83c0-487b-af59-f5be52931818]
2015-08-27 17:10:06,805 [taskExecutor-3] DEBUG org.springframework.jdbc.datasource.DataSourceTransactionManager - Releasing JDBC Connection [ConnectionID:14 ClientConnectionId: 1b95dc0e-83c0-487b-af59-f5be52931818] after transaction
2015-08-27 17:10:06,805 [taskExecutor-3] DEBUG org.springframework.jdbc.datasource.DataSourceUtils - Returning JDBC Connection to DataSource
Here is what I am seeing in my logs on entering the processor. A new transaction is created and a new database connection is fetched:
2015-08-27 17:10:06,805 [taskExecutor-3] DEBUG org.springframework.jdbc.datasource.DataSourceTransactionManager - Creating new transaction with name [null]: PROPAGATION_REQUIRED,ISOLATION_DEFAULT
2015-08-27 17:10:06,805 [taskExecutor-3] DEBUG org.springframework.jdbc.datasource.DriverManagerDataSource - Creating new JDBC DriverManager Connection to [jdbc:sqlserver://blah]
2015-08-27 17:10:07,115 [taskExecutor-3] DEBUG org.springframework.jdbc.datasource.DataSourceTransactionManager - Acquired Connection [ConnectionID:23 ClientConnectionId: d1f016e6-3e9d-4b0e-a34d-14298c292a65] for JDBC transaction
2015-08-27 17:10:07,115 [taskExecutor-3] DEBUG org.springframework.jdbc.datasource.DataSourceTransactionManager - Switching JDBC Connection [ConnectionID:23 ClientConnectionId: d1f016e6-3e9d-4b0e-a34d-14298c292a65] to manual commit
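The second log excerpt points at the likely culprit: DriverManagerDataSource is not a pool at all, so every per-item transaction opens a brand-new physical connection. A minimal sketch of swapping in a pooled DataSource, using commons-dbcp's BasicDataSource (already used elsewhere in these posts); the driver class, URL, and credentials below are placeholders:
import org.apache.commons.dbcp.BasicDataSource;
// Hypothetical pooled replacement for DriverManagerDataSource: connections
// are created once and borrowed/returned per transaction, not opened fresh.
BasicDataSource dataSource = new BasicDataSource();
dataSource.setDriverClassName("com.microsoft.sqlserver.jdbc.SQLServerDriver"); // assumption
dataSource.setUrl("jdbc:sqlserver://blah"); // placeholder host, as in the log
dataSource.setUsername("user");             // placeholder
dataSource.setPassword("secret");           // placeholder
dataSource.setInitialSize(10);              // one warm connection per taskExecutor thread
dataSource.setMaxActive(20);                // >= taskExecutor pool-size (10) plus headroom
With a pool in place, the repeated "Creating new JDBC DriverManager Connection" lines should disappear, and each getPolicyOrigin lookup only pays the cost of borrowing an existing connection (plus the ehcache hit for repeated keys).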

Can the same Spring Batch job be invoked with different job parameters from multiple threads at the same time?

Our requirement is to write multiple files at the same time. We are using Spring Batch to write the files, and we are launching the Spring Batch jobs from different threads. Each thread has its own application context, so we can be sure that the singleton beans will not be shared across multiple threads. Below is my code snippet.
Spring batch config.
<bean id="reportDataReader" class="com.test.ist.batch2.rrm.batch.readers.RRMItmeReader"
scope="step">
<property name="verifyCursorPosition" value="false" />
<property name="dataSource" ref="dataSource" />
<property name="sql" value="#{jobParameters['sqlquery']}" />
<property name="rowMapper" ref="valueMapper" />
<property name="fetchSize" value="5000" />
</bean>
<bean id="valueMapper" class="com.test.ist.batch2.rrm.batch.mappers.DBValueMapper" scope="step"></bean>
<bean id="velocityFileWritter"
class="com.test.ist.batch2.rrm.batch.writers.RRMVelocityFileWriter"
scope="step">
</bean>
<bean id="velocityEngine"
class="org.springframework.ui.velocity.VelocityEngineFactoryBean">
<property name="velocityProperties">
<value>
resource.loader = class
class.resource.loader.class = org.apache.velocity.runtime.resource.loader.ClasspathResourceLoader
class.resource.loader.cache = true
class.resource.loader.modificationCheckInterval = 0
</value>
</property>
</bean>
<batch:job id="rrmReportGenJob">
<batch:step id="rrmReportGenStep">
<batch:tasklet>
<batch:chunk reader="reportDataReader" writer="velocityFileWritter"
commit-interval="${reportData.reader.commit-interval}">
</batch:chunk>
</batch:tasklet>
</batch:step>
</batch:job>
This is how we invoke the Spring Batch job:
ThreadPoolExecutor tpe = new ThreadPoolExecutor(10, 10, 1000000, TimeUnit.MILLISECONDS, new LinkedBlockingQueue<Runnable>());
PetReportGenerator rrg = new PetReportGenerator(null);
ThreadTest tt = new ThreadTest(new PetReportGenerator(null), "161");
ThreadTest tt2 = new ThreadTest(new PetReportGenerator(null), "162");
ThreadTest tt3 = new ThreadTest(new PetReportGenerator(null), "163");
ThreadTest tt4 = new ThreadTest(new PetReportGenerator(null), "165");
tpe.execute(tt);
tpe.execute(tt2);
tpe.execute(tt3);
tpe.execute(tt4);
In the constructor of PetReportGenerator we are initializing the bean config.
Below is the code snippet
private ApplicationContext appContext;

public PetReportGenerator(ApplicationContext reportContext) {
    if (null == reportContext) {
        //if (null == appContext) {
        appContext = new ClassPathXmlApplicationContext("spring-batch-jobs.xml");
        //}
    } else {
        setAppContext(reportContext);
    }
}
Below is an extract of the code that runs the job:
Job jobToExecute = (Job) SpringUtils.getBean(jobName);
JobParametersBuilder paramsBuilder = new JobParametersBuilder();
// By default add the date/time. This helps to launch the same job again with the same parameters.
paramsBuilder.addLong("JOB_TIME", System.currentTimeMillis());
if (!jobParams.isEmpty()) {
    // Validate input fields.
    String sqlToUse = validator.validateInput(jobParams);
    for (Map.Entry<String, String> entry : jobParams.entrySet()) {
        paramsBuilder.addString(entry.getKey(), entry.getValue());
    }
} else {
    throw new ReportGenerationException("Job input parameter is Empty");
}
jobexe = jobLauncher.run(jobToExecute, paramsBuilder.toJobParameters());
If it is run in a single thread, it works fine. When it is invoked by multiple threads, we get the error below:
09:09:26,742 ERROR pool-1-thread-3 job.AbstractJob:329 - Encountered fatal error executing job
java.lang.NullPointerException
at org.springframework.batch.core.repository.dao.MapJobExecutionDao.synchronizeStatus(MapJobExecutionDao.java:158)
at org.springframework.batch.core.repository.support.SimpleJobRepository.update(SimpleJobRepository.java:161)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:317)
at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:190)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:157)
at org.springframework.transaction.interceptor.TransactionInterceptor$1.proceedWithInvocation(TransactionInterceptor.java:98)
at org.springframework.transaction.interceptor.TransactionAspectSupport.invokeWithinTransaction(TransactionAspectSupport.java:262)
at org.springframework.transaction.interceptor.TransactionInterceptor.invoke(TransactionInterceptor.java:95)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:207)
at com.sun.proxy.$Proxy14.update(Unknown Source)
at org.springframework.batch.core.job.AbstractJob.updateStatus(AbstractJob.java:416)
at org.springframework.batch.core.job.AbstractJob.execute(AbstractJob.java:299)
at org.springframework.batch.core.launch.support.SimpleJobLauncher$1.run(SimpleJobLauncher.java:135)
at org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:50)
at org.springframework.batch.core.launch.support.SimpleJobLauncher.run(SimpleJobLauncher.java:128)
Can anyone please help me understand what could be wrong?
The MapJobRepository is NOT intended for production use. It is NOT threadsafe. If you need the performance of an in-memory job repository (losing restartability, etc.), use an in-memory database like HSQLDB.
That note aside, if you are using thread-safe components, there is no reason you can't launch multiple job instances from multiple threads.
Are you sure that the MapJobExecutionDao is threadsafe in all aspects? I see that a ConcurrentMap is used inside MapJobExecutionDao, but I'm not sure that is enough. I once had a problem getting a NullPointerException from a Map that was accessed from different threads: one thread caused a rehashing, and when the second thread accessed the map at that very moment, it received a null value.
Are you sure that the combination of your identifying job parameters is unique? I see that you add a parameter JOB_TIME with System.currentTimeMillis(), but do you know whether that really results in a unique timestamp?
Have you tried to use the table-based versions of JobExecutionDao and so on?
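A minimal sketch of that last suggestion, assuming the standard schema scripts that ship inside spring-batch-core; point the JobRepositoryFactoryBean's dataSource at this instead of using the Map-based repository:
import javax.sql.DataSource;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseBuilder;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseType;
// Hypothetical wiring: an embedded HSQLDB initialized with the Spring Batch
// schema, giving the table-based, thread-safe DAOs while staying in memory.
public class BatchDataSourceFactory {
    public static DataSource embeddedBatchDataSource() {
        return new EmbeddedDatabaseBuilder()
                .setType(EmbeddedDatabaseType.HSQL)
                .addScript("classpath:org/springframework/batch/core/schema-hsqldb.sql")
                .build();
    }
}
This keeps the speed of an in-memory store, but concurrent launches no longer race on a shared Map.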

Spring Batch FlatFileItemWriter - How to use stepExecution.jobId to generate file name

I have this FileWriter where I'm trying to append the current job ID to the filename that is generated.
<bean id="csvFileWriter" class="org.springframework.batch.item.file.FlatFileItemWriter" scope="step">
<property name="resource">
<bean class="org.springframework.core.io.FileSystemResource">
<constructor-arg type="java.lang.String">
<value>${csv.file}_#{stepExecution.jobExecution.jobId}</value>
</constructor-arg>
</bean>
</property>
<property name="lineAggregator">
<bean class="org.springframework.batch.item.file.transform.DelimitedLineAggregator">
<property name="delimiter">
<util:constant
static-field="org.springframework.batch.item.file.transform.DelimitedLineTokenizer.DELIMITER_COMMA"/>
</property>
<property name="fieldExtractor">
<bean class="org.springframework.batch.item.file.transform.PassThroughFieldExtractor" />
</property>
</bean>
</property>
....
....
but it's bombing out with
Caused by: org.springframework.expression.spel.SpelEvaluationException: EL1008E:(pos 0): Field or property 'stepExecution' cannot be found on object of type 'org.springframework.beans.factory.config.BeanExpressionContext'
at org.springframework.expression.spel.ast.PropertyOrFieldReference.readProperty(PropertyOrFieldReference.java:208)
at org.springframework.expression.spel.ast.PropertyOrFieldReference.getValueInternal(PropertyOrFieldReference.java:72)
at org.springframework.expression.spel.ast.CompoundExpression.getValueInternal(CompoundExpression.java:52)
at org.springframework.expression.spel.ast.SpelNodeImpl.getTypedValue(SpelNodeImpl.java:102)
at org.springframework.expression.spel.standard.SpelExpression.getValue(SpelExpression.java:97)
at org.springframework.expression.common.CompositeStringExpression.getValue(CompositeStringExpression.java:82)
at org.springframework.expression.common.CompositeStringExpression.getValue(CompositeStringExpression.java:1)
at org.springframework.context.expression.StandardBeanExpressionResolver.evaluate(StandardBeanExpressionResolver.java:139)
... 45 more
Any idea how I can correctly reference the jobId in this case?
Update: adding the solution that worked.
I implemented a JobExecutionListener which adds the jobId (and a date) to the ExecutionContext:
public class MyExecutionListener implements JobExecutionListener {

    public void beforeJob(JobExecution jobExecution) {
        long jobId = jobExecution.getJobId();
        // Assumption: "date" was built elsewhere in the original; a simple
        // yyyyMMdd stamp is used here so the snippet is self-contained.
        String date = new SimpleDateFormat("yyyyMMdd").format(new Date());
        jobExecution.getExecutionContext().put("jobId", jobId);
        jobExecution.getExecutionContext().put("date", date);
    }

    public void afterJob(JobExecution jobExecution) {
        // no-op
    }
}
Register the listener with the batch job:
<batch:job id="batchJob">
<batch:listeners>
<batch:listener ref="myExecutionListener"/>
</batch:listeners>
And finally the CSV writer gets updated to
<bean id="fundAssetCsvFileWriter"
class="org.springframework.batch.item.file.FlatFileItemWriter" scope="step">
<property name="resource">
<bean class="org.springframework.core.io.FileSystemResource">
<constructor-arg value="${csv.file.name}_#{jobExecutionContext['date']}_#{jobExecutionContext['jobId']}.csv" type="java.lang.String"/>
</bean>
The supported names for late-binding are:
#{jobParameters}
#{jobExecutionContext}
#{stepExecutionContext}
If jobId is not directly accessible, see this question.
Also, resource can be injected directly as
<property name="resource">
<value>file://${csv.file}_#{jobExecutionContext['jobId']}</value>
</property>
because the right resource type is created using a converter.
#{stepExecution.jobExecution.id} or #{stepExecution.jobExecutionId} should work though.
The StepContext does provide access to the StepExecution for late binding via SpEL expressions.