Somehow, transaction rollback is not working for my CompositeItemWriter.
I have two update queries inside the CompositeItemWriter and a commit-interval of 20.
Let's say after 15 records I get an issue with the next update; ideally, all of the earlier records in that chunk should roll back to their previous state, but I can see those records updated in the DB.
Does anyone have any idea about this? I am using Spring Batch version 3.0.5.RELEASE.
The configuration is:
<step id="step1" next="step2">
<tasklet transaction-manager="transactionManager" >
<chunk reader="reader" writer="customItemWriter" commit-interval="20"/>
<transaction-attributes propagation="REQUIRES_NEW" isolation="DEFAULT"/>
</tasklet>
</step>
<bean id="customItemWriter" class="org.springframework.batch.item.support.CompositeItemWriter">
<property name="delegates">
<list>
<ref bean="updateTable1"/>
<ref bean="updateTable2"/>
<ref bean="fileWriter"/>
</list>
</property>
</bean>
<bean id="updateTable1" class="org.springframework.batch.item.database.JdbcBatchItemWriter">
<property name="assertUpdates" value="true"/>
<property name="dataSource" ref="dataSource"/>
<property name="itemPreparedStatementSetter">
<bean class="com.batch.jdbc.Table1PreparedStatementSetter" />
</property>
<property name="sql"
value="UPDATE TABLE1 SET STATUS = ? WHERE COLUMN1_ID = ?"/>
</bean>
<bean id="updateTable2" class="org.springframework.batch.item.database.JdbcBatchItemWriter">
<property name="assertUpdates" value="true"/>
<property name="dataSource" ref="dataSource"/>
<property name="itemPreparedStatementSetter">
<bean class="com.batch.jdbc.Table2PreparedStatementSetter" />
</property>
<property name="sql"
value="UPDATE TABLE2 SET STATUS = ? WHERE COLUMN2_ID = ?"/>
</bean>
<bean id="fileWriter" class="x.y.z.CustomFileWriter" />
Trying to read data from the DB using Spring Batch and a Hibernate reader, getting org.hibernate.hql.internal.ast.QuerySyntaxException: Result is not mapped [from Result]
<import resource="/context-model.xml"/>
<batch:job id="MainJob">
<!-- File Load Step -->
<batch:step id="stepDataReadFromDB">
<batch:tasklet>
<batch:chunk reader="DataReaderDB" processor ="" dummyProcessor" writer="dummyWriter" commit-interval="2"></batch:chunk>
</batch:tasklet>
</batch:step>
</batch:job>
<bean id="DataReaderDB" class="org.springframework.batch.item.database.HibernateCursorItemReader">
<property name="sessionFactory" ref="sessionFactory" />
<property name="queryString" value="from Result" />
<property name="useStatelessSession" value="false" />
</bean>
<bean id="transactionManager" class="org.springframework.batch.support.transaction.ResourcelessTransactionManager"/>
<bean id="jobLauncher" class="org.springframework.batch.core.launch.support.SimpleJobLauncher">
<property name="jobRepository" ref="jobRepository"/>
</bean>
<bean id="sessionFactory" class="org.springframework.orm.hibernate4.LocalSessionFactoryBean">
<property name="dataSource" ref="DataSource" />
<property name="mappingLocations" value="classpath:META-INF/spring/batch/hibernate/*.hbm.xml" />
<property name="hibernateProperties">
<props>
<prop key="hibernate.dialect"> org.hibernate.dialect.OracleDialect</prop>
<prop key="hibernate.show_sql">true</prop>
<prop key="hibernate.format_sql">true</prop>
</props>
</property>
</bean>
<bean id="transactionManager" class="org.springframework.orm.hibernate4.HibernateTransactionManager"/>
<tx:annotation-driven transaction-manager="transactionManager"/>
<bean id="DataSource" class="org.apache.commons.dbcp.BasicDataSource">
<property name="driverClassName" value="oracle.jdbc.OracleDriver"/>
<property name="url" value="jdbc:oracle:thin:#******"/>
<property name="username" value="UN"/>
<property name="password" value="PW"/>
</bean>
Hiber.hib.xml:
<hibernate-configuration>
<session-factory>
<mapping class="org.core.reader.Result"/>
</session-factory>
</hibernate-configuration>
Entity class
@Entity
@Table(name = "RESULT")
public class Result {
    @Id
    @Column(name = "SID", nullable = false)
    int sID;
    @Column(name = "COLUMN1")
    String studentName;
    // getters and setters omitted
}
I am unable to read data from the DB. I need to fetch data from Oracle using Hibernate, corresponding to the data from the request. I configured it as shown but am getting the error shown above.
Can someone please help me with this?
I have pasted the code snippet from the configuration file above.
If your Hibernate configuration file is named Hiber.hib.xml, then this definitely won't work; rename it to hibernate.cfg.xml.
Furthermore, I see you are mixing annotation-based entity mapping with *.hbm.xml-based entity mapping. Don't do this; stick to one approach.
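For example, a minimal sketch of an annotation-only session factory, assuming the Result entity lives in the org.core.reader package as in the mapping file above (so no *.hbm.xml files are needed):
<bean id="sessionFactory" class="org.springframework.orm.hibernate4.LocalSessionFactoryBean">
    <property name="dataSource" ref="DataSource"/>
    <!-- scan for @Entity classes instead of listing *.hbm.xml mapping files -->
    <property name="packagesToScan" value="org.core.reader"/>
    <property name="hibernateProperties">
        <props>
            <prop key="hibernate.dialect">org.hibernate.dialect.OracleDialect</prop>
            <prop key="hibernate.show_sql">true</prop>
        </props>
    </property>
</bean>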
Retry logic using AOP is not working for MongoItemWriter, but the same is working for my custom readers and writers.
Is there anything that I'm doing wrong here? Below is the code snippet.
<bean id="retryAdvice"
class="org.springframework.retry.interceptor.RetryOperationsInterceptor">
<property name="retryOperations" ref="taskBatchRetryTemplate" />
</bean>
<bean id="taskBatchRetryTemplate" class="org.springframework.retry.support.RetryTemplate">
<property name="retryPolicy" ref="genericRetryPolicy" />
<property name="backOffPolicy">
<bean class="org.springframework.retry.backoff.ExponentialBackOffPolicy">
<property name="initialInterval" value="${mongoloader.backOffPeriod.initialInterval}"/>
<property name="maxInterval" value="${mongoloader.backOffPeriod.maxInterval}"/>
<property name="multiplier" value="${mongoloader.backOffPeriod.multiplier}"/>
</bean>
</property>
<property name="listeners">
<bean class="com.company.ens.myload.job.StepRetryListener"/>
</property>
</bean>
<bean id="genericRetryPolicy" class="org.springframework.retry.policy.SimpleRetryPolicy" >
<constructor-arg index="0" value="${mongoloader.retry.limit}"/>
<constructor-arg index="1">
<map>
<entry key="org.springframework.data.mongodb.CannotGetMongoDbConnectionException" value="true"/>
<entry key="org.springframework.jdbc.CannotGetJdbcConnectionException" value="true"/>
<!-- Just included the below exception for testing purpose, needs to be removed -->
<entry key="java.io.FileNotFoundException" value="true"/>
<entry key="org.springframework.dao.DuplicateKeyException" value="true"/>
</map>
</constructor-arg>
</bean>
// including only my step configuration here
<batch:step id="Step4a-MainFlow_TranslateRawFedObjectsToFilingModel"
allow-start-if-complete="false">
<batch:tasklet>
<batch:chunk reader="rawFedObjectMongoReader"
processor="translatingProcessor" writer="SctModelMongoWriter"
commit-interval="100"/>
</batch:tasklet>
<batch:listeners>
<batch:listener ref="step4aListener"/>
</batch:listeners>
</batch:step>
I did not find any posts related to my question. Any help would be appreciated.
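For context, the configuration above defines the retryAdvice interceptor but does not show where it is applied. A minimal sketch of how such an interceptor is typically wired to the MongoItemWriter's write method with Spring AOP (the pointcut expression and the aop namespace setup are assumptions, not taken from the original post):
<!-- assumed wiring sketch: requires xmlns:aop="http://www.springframework.org/schema/aop" -->
<aop:config>
    <aop:pointcut id="mongoWriterWritePointcut"
        expression="execution(* org.springframework.batch.item.data.MongoItemWriter.write(..))"/>
    <aop:advisor pointcut-ref="mongoWriterWritePointcut" advice-ref="retryAdvice"/>
</aop:config>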
I have developed a Spring Batch application and deployed it as a web application in a WebSphere Liberty profile container. The batch program is designed to read records from a table and invoke an HTTP service. Based on the service response, a column named STATUS is updated to RECORD_SENT, COMPLETE, or ERROR.
The objective is to reuse the same program for multiple data sources. The data source is passed as a job parameter using the client type. The data sources are in different schemas but have the same data model.
Question: how can the transaction manager be applied at runtime inside the job step or tasklet? Seeking help in this regard.
Configuration:
<bean id="entityManagerFactory1"
class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean">
<property name="dataSource" ref="dataSource1" />
<property name="persistenceUnitName" value="user" />
<property name="jpaVendorAdapter">
<bean class="org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter">
<property name="showSql" value="false" />
</bean>
</property>
<property name="jpaDialect">
<bean class="org.springframework.orm.jpa.vendor.HibernateJpaDialect" />
</property>
</bean>
<bean id="entityManagerFactory2"
class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean">
<property name="dataSource" ref="dataSource2" />
<property name="persistenceUnitName" value="user" />
<property name="jpaVendorAdapter">
<bean class="org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter">
<property name="showSql" value="false" />
</bean>
</property>
<property name="jpaDialect">
<bean class="org.springframework.orm.jpa.vendor.HibernateJpaDialect" />
</property>
</bean>
<bean id="entityManagerSelector" class="*com.spring.jpa.test.EntitymanagerSelector">
<property name="entityManagerFactory1" ref="entityManagerFactory1"></property>
<property name="entityManagerFactory2" ref="entityManagerFactory2"></property>
</bean>
job.xml snippet
<bean id="itemReader" class="org.springframework.batch.item.database.JpaPagingItemReader" scope="step">
<property name="entityManagerFactory" value="#{entityManagerSelector.getEntitymanagerForClient({jobParameters['client']})}" />
<property name="queryString" value="select u from User u where u.age > #{jobParameters['age']}" />
</bean>
Setting the job parameters at runtime to identify the client:
JobParameters param = new JobParametersBuilder()
.addString("age", "20").addString("client", "client2")
.toJobParameters();
JobExecution execution = jobLauncher.run(job, param);
It will not be possible for you to set the transaction manager of the step/tasklet at runtime. You will be better off creating a separate job for each client, with each tasklet using its own transaction manager.
<bean id="transactionManager1" class="org.springframework.orm.jpa.JpaTransactionManager">
<property name="entityManagerFactory" ref="entityManagerFactory1" />
</bean>
<bean id="transactionManager2" class="org.springframework.orm.jpa.JpaTransactionManager">
<property name="entityManagerFactory" ref="entityManagerFactory2" />
</bean>
Now use these transaction managers when creating the batch jobs:
<job id="testJob1" xmlns="http://www.springframework.org/schema/batch">
<step id="client1step1">
<tasklet transaction-manager="transactionManager1">
<chunk reader="itemReader" writer="itemWriter" commit-interval="1" />
</tasklet>
</step>
</job>
<job id="testJob2" xmlns="http://www.springframework.org/schema/batch">
<step id="client2step2">
<tasklet transaction-manager="transactionManager2">
<chunk reader="itemReader" writer="itemWriter" commit-interval="1" />
</tasklet>
</step>
</job>
Let me know if this works out.
I have a Spring Batch job in which a step is as follows:
<bean id="abstractReader" class="org.springframework.batch.item.database.JdbcCursorItemReader" abstract="true">
<property name="fetchSize" value="1000"/>
<property name="verifyCursorPosition" value="true"/>
<property name="rowMapper">
<bean class="org.springframework.jdbc.core.ColumnMapRowMapper"/>
</property>
</bean>
<bean id="masterReader" parent="abstractReader" abstract="true">
<property name="fetchSize" value="1000"/>
<property name="dataSource" ref="masterDataSource"/>
</bean>
<bean id="abstractWriter" class="org.springframework.batch.item.database.JdbcBatchItemWriter" abstract="true">
<property name="assertUpdates" value="false"/>
<property name="itemPreparedStatementSetter">
<bean class="org.springframework.batch.item.database.support.ColumnMapItemPreparedStatementSetter"/>
</property>
</bean>
<bean id="masterWriter" parent="abstractWriter" abstract="true">
<property name="dataSource" ref="masterDataSource"/>
</bean>
<bean id="tempWriter" parent="masterWriter" scope="step">
<property name="sql" value="${insert_query}"/>
</bean>
<bean id="tempReader" parent="masterReader" scope="step">
<property name="sql" value="${select_query}"/>
</bean>
<batch:step id="tempStep">
<batch:tasklet>
<batch:chunk commit-interval="100"
reader="tempReader"
writer="tempWriter"/>
</batch:tasklet>
</batch:step>
Is there a way to bring named parameter support into the queries? Currently, the JdbcCursorItemReader is using a PreparedStatement. (There are too many ?s in the queries now.)
There isn't a way with the JdbcCursorItemReader; however, you can do it with the JdbcPagingItemReader. You can read more about that reader in the documentation here: https://docs.spring.io/spring-batch/apidocs/org/springframework/batch/item/database/JdbcPagingItemReader.html
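For illustration, a rough sketch of a JdbcPagingItemReader using named parameters; the table, columns, and parameter values here are placeholders, not taken from the configuration above:
<bean id="pagingReader" class="org.springframework.batch.item.database.JdbcPagingItemReader" scope="step">
    <property name="dataSource" ref="masterDataSource"/>
    <property name="queryProvider">
        <bean class="org.springframework.batch.item.database.support.SqlPagingQueryProviderFactoryBean">
            <property name="dataSource" ref="masterDataSource"/>
            <property name="selectClause" value="select ID, STATUS"/>
            <property name="fromClause" value="from SOME_TABLE"/>
            <!-- named parameter instead of a positional ? -->
            <property name="whereClause" value="where STATUS = :status"/>
            <property name="sortKey" value="ID"/>
        </bean>
    </property>
    <property name="parameterValues">
        <map>
            <entry key="status" value="PENDING"/>
        </map>
    </property>
    <property name="pageSize" value="1000"/>
    <property name="rowMapper">
        <bean class="org.springframework.jdbc.core.ColumnMapRowMapper"/>
    </property>
</bean>
Note that a paging reader needs a unique sort key so that each page query returns rows in a deterministic order.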
The Spring Batch program I am working on reads data from a table. It uses an org.springframework.batch.item.database.JdbcCursorItemReader as the item reader. The earlier plan was to alter the table, add a PROCESSED_INDICATOR flag, and prepopulate it with the status 'PENDING'. Once a record was processed, the writer would update the PROCESSED_INDICATOR flag to 'PROCESSED'. This was to support restartability. For example, if the batch picks up 1 million records and dies at half a million, then when I restart the batch it should start where it left off.
But unfortunately, management didn't approve this solution. I am digging into ways to make the item reader restartable. As per the Spring documentation, "Most ItemReaders have much more sophisticated restart logic. The JdbcCursorItemReader, for example, stores the row id of the last processed row in the Cursor."
Does anyone have a sample example of such a custom reader that extends JdbcCursorItemReader and stores the last processed row in the cursor?
https://docs.spring.io/spring-batch/trunk/reference/html/readersAndWriters.html
==FULL XML CONFIGURATION==
<import resource="classpath:/batch/utility/skip/batch_skip.xml" />
<import resource="classpath:/batch/config/context-postgres.xml" />
<import resource="classpath:/batch/config/oracle-database.xml" />
<context:property-placeholder
location="classpath:/batch/jobs/TPF-1001-DD-01/TPF-1001-DD-01.properties" />
<bean id="gridSizePartitioner"
class="com.tpf.partitioner.GridSizePartitioner" />
<task:executor id="taskExecutor" pool-size="${pool.size}" />
<batch:job id="XYZJob" job-repository="jobRepository"
restartable="true">
<batch:step id="XYZSTEP">
<batch:description>Convert TIF files to PDF</batch:description>
<batch:partition partitioner="gridSizePartitioner">
<batch:handler task-executor="taskExecutor"
grid-size="${pool.size}" />
<batch:step>
<batch:tasklet allow-start-if-complete="true">
<batch:chunk commit-interval="${commit.interval}"
skip-limit="${job.skip.limit}">
<batch:reader>
<bean id="timeReader"
class="org.springframework.batch.item.database.JdbcCursorItemReader"
scope="step">
<property name="dataSource" ref="oracledataSource" />
<property name="sql">
<value>
select TIME_ID as timesheetId,count(*),max(CREATION_DATETIME) as creationDateTime , ILN_NUMBER as ilnNumber
from TS_FAKE_NAME
where creation_datetime >= '#{jobParameters['creation_start_date1']} 12.00.00.000000000 AM'
and creation_datetime < '#{jobParameters['creation_start_date2']} 11.59.59.999999999 PM'
and mod(time_id,${pool.size})=#{stepExecutionContext['partition.id']}
group by time_id ,ILN_NUMBER
</value>
</property>
<property name="rowMapper">
<bean
class="org.springframework.jdbc.core.BeanPropertyRowMapper">
<property name="mappedClass"
value="com.tpf.model.Time" />
</bean>
</property>
</bean>
</batch:reader>
<batch:processor>
<bean id="compositeItemProcessor"
class="org.springframework.batch.item.support.CompositeItemProcessor">
<property name="delegates">
<list>
<ref bean="timeProcessor" />
</list>
</property>
</bean>
</batch:processor>
<batch:writer>
<bean id="compositeItemWriter"
class="org.springframework.batch.item.support.CompositeItemWriter">
<property name="delegates">
<list>
<ref bean="timeWriter" />
</list>
</property>
</bean>
</batch:writer>
<batch:skippable-exception-classes>
<batch:include
class="com.utility.skip.BatchSkipException" />
</batch:skippable-exception-classes>
<batch:listeners>
<batch:listener ref="batchSkipListener" />
</batch:listeners>
</batch:chunk>
</batch:tasklet>
</batch:step>
</batch:partition>
</batch:step>
<batch:validator>
<bean
class="org.springframework.batch.core.job.DefaultJobParametersValidator">
<property name="requiredKeys">
<list>
<value>batchRunNumber</value>
<value>creation_start_date1</value>
<value>creation_start_date2</value>
</list>
</property>
</bean>
</batch:validator>
</batch:job>
<bean id="timesheetWriter" class="com.tpf.writer.TimeWriter"
scope="step">
<property name="dataSource" ref="dataSource" />
</bean>
<bean id="timeProcessor"
class="com.tpf.processor.TimeProcessor" scope="step">
<property name="dataSource" ref="oracledataSource" />
</bean>
Does anyone have a sample example of such a custom reader that extends JdbcCursorItemReader and stores the last processed row in the cursor?
The JdbcCursorItemReader already does that; see the Javadoc, here is an excerpt:
ExecutionContext: The current row is returned as restart data,
and when restored from that same data, the cursor is opened and the current row
set to the value within the restart data.
So you don't need a custom reader.
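As a rough sketch (the SQL and bean name here are simplified placeholders based on the query above, not copied from the job), a step-scoped JdbcCursorItemReader only needs its default state saving left on to be restartable:
<bean id="restartableTimeReader" scope="step"
      class="org.springframework.batch.item.database.JdbcCursorItemReader">
    <property name="name" value="restartableTimeReader"/>
    <property name="dataSource" ref="oracledataSource"/>
    <property name="sql" value="select TIME_ID, ILN_NUMBER from TS_FAKE_NAME order by TIME_ID"/>
    <!-- saveState defaults to true: the current row count is written to the
         ExecutionContext at each commit and the reader resumes from that row on restart -->
    <property name="saveState" value="true"/>
    <!-- lets the reader jump straight to the saved row with ResultSet.absolute()
         instead of re-reading and discarding rows -->
    <property name="driverSupportsAbsolute" value="true"/>
    <property name="rowMapper">
        <bean class="org.springframework.jdbc.core.ColumnMapRowMapper"/>
    </property>
</bean>
Each partition's step execution keeps its own execution context, so a restarted job can resume each failed partition from its last committed row.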