I am using Spring Batch 2.2.7.RELEASE in my project.
I have a batch job that uses JdbcPagingItemReader to read records from DB2. The query joins two tables and has some WHERE conditions, and at the end I provide the column by which I want the records sorted. Because I am using table aliases, the sort key gives me an error; when I drop the alias, it fails while fetching the next page of results.
I searched and found that this issue was resolved in the 2.1.9 release, but I couldn't find the fix in later releases. Does anybody know about this issue and its solution?
You can find the query configuration below.
<bean id="releaseSqlPagingQueryProvider" class="org.springframework.batch.item.database.support.SqlPagingQueryProviderFactoryBean">
<property name="dataSource" ref="dataSource" />
<property name="selectClause" value="SELECT M.* " />
<property name="fromClause" value="FROM MASTER AS M JOIN RELEASE AS R ON M.EMPLOYEE_ID = R.EMPLOYEE_ID " />
<property name="whereClause" >
<value>
<![CDATA[
WHERE R.COLUMN_1 = ?
]]>
</value>
</property>
<property name="sortKey" value="EMPLOYEE_ID" />
I have a web app that uses Apache Camel to submit routes which execute some PostgreSQL selects and inserts.
I'm not using any DAO, so I have no code that opens and closes connections; I believed the connection life-cycle was managed by Spring, but it seems not to be working.
The problem is that every time my route executes, I see one more connection left IDLE; the previous IDLE connections are never reused, which eventually leads to the "too many client connections" problem.
In my route I have:
<bean id="configLocation" class="org.springframework.core.io.FileSystemResource">
<constructor-arg type="java.lang.String" value="..../src/main/resources/config/test.xml" />
</bean>
<bean id="dataSourcePostgres" class="org.apache.ibatis.datasource.pooled.PooledDataSource">
<property name="driver" value="org.postgresql.Driver" />
<property name="url" value="jdbc:postgresql://localhost:5432/postgres" />
<property name="username" value="postgres" />
<property name="password" value="postgres" />
</bean>
<bean id="postgresTrivenetaSessionFactory" class="org.mybatis.spring.SqlSessionFactoryBean">
<property name="dataSource" ref="dataSourcePostgres" />
<property name="configLocation" ref="configLocation" />
</bean>
Here are some sample queries:
<select id="selectTest" resultType="java.util.LinkedHashMap">
select * from test;
</select>
<insert id="insertTest" parameterType="java.util.LinkedHashMap" useGeneratedKeys="true" keyProperty="id" keyColumn="id">
INSERT INTO test(note,regop_id)
VALUES (#{note},#{idKey});
</insert>
I even tried adding this:
<bean id="transactionManager"
    class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
    <property name="dataSource" ref="dataSourcePostgres" />
</bean>
At last I found the problem: the DataSource is never closed automatically at the end of a Camel route.
So each time the Camel route executed, it left an open DataSource behind, and all the IDLE connections it had created (their number obviously depends on the DataSource configuration and usage) remained and accumulated over and over.
The final solution was to add an ad-hoc bean at the end of the Camel route that takes the DataSource as an argument and closes it, that's all.
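A minimal sketch of such a bean, assuming the MyBatis PooledDataSource from the configuration above; the class name and the route wiring are hypothetical, but forceCloseAll() is the MyBatis call that drains the pool:

import org.apache.ibatis.datasource.pooled.PooledDataSource;

public class DataSourceCloser {

    private final PooledDataSource dataSource;

    public DataSourceCloser(PooledDataSource dataSource) {
        this.dataSource = dataSource;
    }

    // Invoke from the last step of the route, e.g. <bean ref="dataSourceCloser" method="close" />
    public void close() {
        // Closes all active and idle connections held by the pool
        dataSource.forceCloseAll();
    }
}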
Now I want to use the JPA second-level cache via Ehcache. I did some configuration and it seems to work, but I can still see the query SQL, so I'm not sure Ehcache is actually working. Does anybody know about that? Thanks.
1. Some parts of persistence.xml:
<property name="hibernate.cache.region.factory_class" value="org.hibernate.cache.ehcache.EhCacheRegionFactory" />
<property name="hibernate.cache.provider_configuration" value="/ehcache.xml" />
<property name="hibernate.generate_statistics" value="true" />
<property name="hibernate.cache.use_second_level_cache" value="true" />
<property name="hibernate.cache.use_query_cache" value="true" />
<property name="hibernate.show_sql" value="true" />
2. ehcache.xml:
<?xml version="1.0" encoding="UTF-8"?>
<ehcache>
    <defaultCache maxElementsInMemory="1" eternal="false"
        timeToIdleSeconds="1200" timeToLiveSeconds="1200" overflowToDisk="true" clearOnFlush="true">
    </defaultCache>
    <cache name="org.test.persistent.entity.Scenario"
        maxElementsInMemory="10000"
        eternal="false"
        timeToIdleSeconds="1800"
        timeToLiveSeconds="3600"
        overflowToDisk="true">
    </cache>
    <cache name="org.hibernate.cache.spi.UpdateTimestampsCache"
        maxElementsInMemory="10000"
        timeToIdleSeconds="1800"
        timeToLiveSeconds="3600"
        eternal="false">
    </cache>
</ehcache>
3. The query code:
TypedQuery<Scenario> query = em.createQuery(
        "from Scenario as s where s.obsolete != 1 and s.parentId = ?1 order by s.name, s.scenarioStatusId",
        Scenario.class);
query.setParameter(1, parentId);
query.setHint("org.hibernate.cacheable", true);

List<Scenario> scenarios = null;
org.hibernate.Query hbQuery = null;
if (query instanceof org.hibernate.ejb.QueryImpl) {
    // Drop down to the native Hibernate query so the cacheable flag is honored
    hbQuery = ((org.hibernate.ejb.QueryImpl) query).getHibernateQuery();
    hbQuery.setCacheable(true);
    scenarios = hbQuery.list();
} else {
    scenarios = query.getResultList();
}
Your Ehcache setting maxElementsInMemory="10000" is still at its default value; it has to be tuned according to the actual cache hits.
One way to do this is trial and error: increase maxElementsInMemory and check whether the problem goes away.
The other (more reliable) way is to publish the MBeans and watch the cache usage and cache misses in JConsole; based on that report you can increase maxElementsInMemory. Please refer to http://hibernate-jcons.sourceforge.net/usage.html for how to do that.
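Since hibernate.generate_statistics is already enabled in your persistence.xml, you can also verify the cache from code. A sketch; the getDelegate() cast assumes Hibernate is the JPA provider behind the EntityManager:

import org.hibernate.Session;
import org.hibernate.stat.Statistics;

Statistics stats = ((Session) em.getDelegate()).getSessionFactory().getStatistics();

// If the second execution of the query is served from the cache, the hit
// counts grow while the query execution count stays flat.
System.out.println("2nd level cache hits:   " + stats.getSecondLevelCacheHitCount());
System.out.println("2nd level cache misses: " + stats.getSecondLevelCacheMissCount());
System.out.println("query cache hits:       " + stats.getQueryCacheHitCount());
System.out.println("queries executed:       " + stats.getQueryExecutionCount());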
I wrote the cache data to disk by adding "". It seems to work well and I can find some data in the directory, but I still haven't found the reason why it still executes the SQL.
I am relatively new to Spring Batch.
I have an input file with a header. This header contains several fields, one of which I am interested in (YYYYMM data).
Here is my config for this:
<bean id="detaillesHeaderReaderCallback" class="fr.generali.ede.daemon.batch.dstaff.detailles.DetaillesHeaderReaderCallback" >
<property name="headerTokenizer" ref="headerTokenizer" />
<property name="fieldSetMapper" ref="fieldSetMapperHeaderLog07" />
<!-- need to write moisComptable to ChunkContext -->
<property name="chunkContext" value="#{chunkExecutionContext}" />
</bean>
<bean id="headerTokenizer"
class="org.springframework.batch.item.file.transform.FixedLengthTokenizer">
<property name="names" value="dummy1,moisComptable,dummy2" />
<property name="columns" value="1-22,23-28,29-146" />
</bean>
After that, in the next step of the job, I want to generate an output file whose name is composed of a static part and that header field:
<bean id="fileItemWriterLog07" class="org.springframework.batch.item.file.FlatFileItemWriter">
<property name="resource"
value="file:${batch.coherence.out.path}/DSTAF007_LOG_#{jobExecutionContext['moisComptable']}.txt" />
<property name="shouldDeleteIfExists" value="true" />
<property name="headerCallback" ref="DetaillesHeaderWriterCallbackLog07" />
...
</bean/>
(I have two jobs because I first write to a database, and then read from it.)
As one would guess, this doesn't work: the config file is flawed, so I get BeanCreationExceptions. But it gives an idea of what I want to achieve.
I have no exception on the ChunkContext (yet?), but I get one on the writer resource. Here is the exception:
Field or property 'jobExecutionContext' cannot be found on object of type 'org.springframework.beans.factory.config.BeanExpressionContext'
Does anyone have an idea about how to proceed?
Thanks in advance.
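For what it's worth, that exception is the classic symptom of late binding being evaluated outside of a running step: #{jobExecutionContext[...]} can only be resolved on a bean declared with scope="step", and the value has to be promoted from the step's ExecutionContext to the job's first, for instance with an ExecutionContextPromotionListener. A minimal sketch under those assumptions; note that a job's ExecutionContext does not carry over between two separate jobs, so with a two-job setup the value would have to be passed along some other way (e.g. as a job parameter):

<!-- Promotes moisComptable from the step ExecutionContext to the job ExecutionContext -->
<bean id="promotionListener" class="org.springframework.batch.core.listener.ExecutionContextPromotionListener">
    <property name="keys" value="moisComptable" />
</bean>

<!-- scope="step" defers resolution of #{jobExecutionContext[...]} until the step runs -->
<bean id="fileItemWriterLog07" class="org.springframework.batch.item.file.FlatFileItemWriter" scope="step">
    <property name="resource"
        value="file:${batch.coherence.out.path}/DSTAF007_LOG_#{jobExecutionContext['moisComptable']}.txt" />
    ...
</bean>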
I've just started working with Activiti and integrated it in my project (Postgres-based) in an embedded way (sample Spring configuration snippet below):
(...)
<!-- Activiti components -->
<bean id="processEngineConfiguration" class="org.activiti.spring.SpringProcessEngineConfiguration">
<property name="dataSource" ref="dataSource" />
<property name="transactionManager" ref="transactionManager" />
<property name="databaseSchemaUpdate" value="true" />
<property name="jobExecutorActivate" value="false" />
</bean>
<bean id="processEngine" class="org.activiti.spring.ProcessEngineFactoryBean">
<property name="processEngineConfiguration" ref="processEngineConfiguration" />
</bean>
<bean id="repositoryService" factory-bean="processEngine" factory-method="getRepositoryService" />
<bean id="runtimeService" factory-bean="processEngine" factory-method="getRuntimeService" />
<bean id="taskService" factory-bean="processEngine" factory-method="getTaskService" />
<bean id="historyService" factory-bean="processEngine" factory-method="getHistoryService" />
<bean id="managementService" factory-bean="processEngine" factory-method="getManagementService" />
(...)
It works well and creates a lot of tables in my application schema at startup.
My problem is: the tables are created in the 'public' schema of my Postgres database. I would have preferred to put those tables in a separate schema, say 'activiti'.
The fact is that after browsing the documentation and the net for almost two hours, I didn't find any way to change the default schema-creation behavior.
Any help greatly appreciated! ;)
Since the Postgres 9.4 JDBC driver, you can specify the default schema in the JDBC URL like this:
jdbc:postgresql://localhost:5432/mydatabase?currentSchema=myschema
With this URL, all Activiti tables are created in the myschema schema instead of the default one in the search path, usually public.
Sources: this response on Stack Overflow and the latest documentation.
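Applied to the configuration in the question, that only requires putting the parameter on the dataSource bean's URL. A sketch, assuming a Commons DBCP pool; any DataSource with a url property works the same way:

<!-- Hypothetical dataSource bean; currentSchema makes Activiti create and use its tables in myschema -->
<bean id="dataSource" class="org.apache.commons.dbcp.BasicDataSource">
    <property name="driverClassName" value="org.postgresql.Driver" />
    <property name="url" value="jdbc:postgresql://localhost:5432/mydatabase?currentSchema=myschema" />
    <property name="username" value="postgres" />
    <property name="password" value="postgres" />
</bean>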
I would like to execute a stored procedure using Spring's JdbcBatchItemWriter. My current code looks like this:
<bean id="xyzWriter" class="org.springframework.batch.item.database.JdbcBatchItemWriter">
......
<property name="sql" value="update abc where x=:paramX" />
......
</bean>
I would like to replace this update SQL query with a stored procedure call, and I would like to handle it in the XML file itself. Any help is really appreciated.
Thanks
Did you try running a stored procedure through JdbcBatchItemWriter?
I had the same requirement, and when I tried it, it worked for me:
<bean id="trackItemWriter" class="org.springframework.batch.item.database.JdbcBatchItemWriter">
<property name="dataSource" ref="mySQLDatasource"/>
<property name="itemPreparedStatementSetter">
<bean class="com.MyDataPreparedStatmentSetter"/>
</property>
<property name="assertUpdates" value="false" />
<property name="sql" value="Call my_Stored_Proc (?,?,?,?)"/>
</bean>
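For completeness, a minimal sketch of what com.MyDataPreparedStatmentSetter can look like. The Track item type and its getters are hypothetical; ItemPreparedStatementSetter is the Spring Batch callback that binds each item to the ? placeholders of the statement above:

import java.sql.PreparedStatement;
import java.sql.SQLException;
import org.springframework.batch.item.database.ItemPreparedStatementSetter;

public class MyDataPreparedStatmentSetter implements ItemPreparedStatementSetter<Track> {

    @Override
    public void setValues(Track item, PreparedStatement ps) throws SQLException {
        // Bind one ? per stored-procedure parameter, in declaration order
        ps.setLong(1, item.getId());
        ps.setString(2, item.getName());
        ps.setString(3, item.getStatus());
        ps.setInt(4, item.getRetryCount());
    }
}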
Hope it helps.