createEntityManagerFactory is failing randomly - JPA

Now and then it is observed that createEntityManagerFactory fails. The stack trace is a little confusing to me. Can anybody throw some light on it?
java.lang.StackOverflowError
at java.io.File.list(File.java:1133)
at java.io.File.listFiles(File.java:1297)
at org.eclipse.persistence.internal.jpa.deployment.DirectoryArchive.init(DirectoryArchive.java:90)
at org.eclipse.persistence.internal.jpa.deployment.DirectoryArchive.init(DirectoryArchive.java:96)
at org.eclipse.persistence.internal.jpa.deployment.DirectoryArchive.init(DirectoryArchive.java:96)
at org.eclipse.persistence.internal.jpa.deployment.DirectoryArchive.init(DirectoryArchive.java:96)
at org.eclipse.persistence.internal.jpa.deployment.DirectoryArchive.init(DirectoryArchive.java:96)
at org.eclipse.persistence.internal.jpa.deployment.DirectoryArchive.init(DirectoryArchive.java:96)
at org.eclipse.persistence.internal.jpa.deployment.DirectoryArchive.<init>(DirectoryArchive.java:74)
at org.eclipse.persistence.internal.jpa.deployment.DirectoryArchive.<init>(DirectoryArchive.java:54)
at org.eclipse.persistence.internal.jpa.deployment.ArchiveFactoryImpl.createArchive(ArchiveFactoryImpl.java:89)
at org.eclipse.persistence.internal.jpa.deployment.PersistenceUnitProcessor.findPersistenceArchives(PersistenceUnitProcessor.java:302)
at org.eclipse.persistence.internal.jpa.deployment.PersistenceUnitProcessor.findPersistenceArchives(PersistenceUnitProcessor.java:276)
at org.eclipse.persistence.internal.jpa.deployment.JPAInitializer.findPersistenceUnitInfoInArchives(JPAInitializer.java:150)
at org.eclipse.persistence.internal.jpa.deployment.JPAInitializer.findPersistenceUnitInfo(JPAInitializer.java:135)
at org.eclipse.persistence.jpa.PersistenceProvider.createEntityManagerFactory(PersistenceProvider.java:177)
at org.eclipse.persistence.jpa.PersistenceProvider.createEntityManagerFactoryImpl(PersistenceProvider.java:129)
at org.eclipse.persistence.jpa.PersistenceProvider.createEntityManagerFactory(PersistenceProvider.java:177)
at org.eclipse.persistence.jpa.PersistenceProvider.createEntityManagerFactoryImpl(PersistenceProvider.java:129)
at org.eclipse.persistence.jpa.PersistenceProvider.createEntityManagerFactory(PersistenceProvider.java:177)
at org.eclipse.persistence.jpa.PersistenceProvider.createEntityManagerFactoryImpl(PersistenceProvider.java:129)
at org.eclipse.persistence.jpa.PersistenceProvider.createEntityManagerFactory(PersistenceProvider.java:177)
at org.eclipse.persistence.jpa.PersistenceProvider.createEntityManagerFactoryImpl(PersistenceProvider.java:129)
at org.eclipse.persistence.jpa.PersistenceProvider.createEntityManagerFactory(PersistenceProvider.java:177)
....
...
..
at org.eclipse.persistence.jpa.PersistenceProvider.createEntityManagerFactoryImpl(PersistenceProvider.java:129)
at org.eclipse.persistence.jpa.PersistenceProvider.createEntityManagerFactory(PersistenceProvider.java:177)
at javax.persistence.Persistence.createEntityManagerFactory(Persistence.java:79)
On further tracing I see that this happens when a new EntityManagerFactory is created after the close of the existing one has failed with the following exception:
Code:
synchronized (factoryLock) {
    if (null != emFactory) {
        if (true == emFactory.isOpen()) {
            try {
                emFactory.close();
            } catch (IllegalStateException e) {
                System.out.println(e.getMessage());
            } catch (Exception e) {
                System.out.println(e.getMessage());
            }
        }
        emFactory = null;
    }
    Map<String, String> properties = new HashMap<String, String>();
    properties.put(PersistenceUnitProperties.JDBC_DRIVER, jdbcDriver);
    properties.put(PersistenceUnitProperties.SESSION_CUSTOMIZER,
            "com.ca.waae.dbaccess.custom.JPASessionCustomizer");
    properties.put(PersistenceUnitProperties.JDBC_URL, jdbcUrl);
    properties.put(PersistenceUnitProperties.JDBC_PROPERTY + "REQUEST_KERBEROS_SESSION", "true");
    properties.put(PersistenceUnitProperties.JDBC_PROPERTY + "SERVICE_PRINCIPAL_NAME", dbPrincipalName);
    emFactory = Persistence.createEntityManagerFactory("JPA", properties);
}
Exception:
java.util.ConcurrentModificationException
at java.util.HashMap$HashIterator.nextNode(HashMap.java:1440)
at java.util.HashMap$ValueIterator.next(HashMap.java:1469)
at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.clearStatementCache(DatabaseAccessor.java:342)
at org.eclipse.persistence.internal.databaseaccess.DatabaseAccessor.disconnect(DatabaseAccessor.java:505)
at org.eclipse.persistence.internal.sessions.DatabaseSessionImpl.disconnect(DatabaseSessionImpl.java:415)
at org.eclipse.persistence.sessions.server.ServerSession.disconnect(ServerSession.java:504)
at org.eclipse.persistence.internal.sessions.DatabaseSessionImpl.logout(DatabaseSessionImpl.java:931)
at org.eclipse.persistence.sessions.server.ServerSession.logout(ServerSession.java:776)
at org.eclipse.persistence.internal.jpa.EntityManagerSetupImpl.removeSessionFromGlobalSessionManager(EntityManagerSetupImpl.java:511)
at org.eclipse.persistence.internal.jpa.EntityManagerSetupImpl.undeploy(EntityManagerSetupImpl.java:2850)
at org.eclipse.persistence.internal.jpa.EntityManagerFactoryDelegate.close(EntityManagerFactoryDelegate.java:267)
at org.eclipse.persistence.internal.jpa.EntityManagerFactoryImpl.close(EntityManagerFactoryImpl.java:287)
at com.ca.waae.dbaccess.dao.AEConnection.createJpaEntityFactory(AEConnection.java:144)
The persistence.xml contents are like this:
<?xml version="1.0" encoding="UTF-8"?>
<persistence version="2.0" xmlns="http://java.sun.com/xml/ns/persistence" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd">
<persistence-unit name="JPA">
<class>com.xx.yy.dbaccess.model.table1</class>
<class>com.xx.yy.dbaccess.model.table2</class>
<class>com.xx.yy.dbaccess.model.table3</class>
<class>com.xx.yy.dbaccess.model.table4</class>
<class>com.xx.yy.dbaccess.model.table5</class>
<properties>
<property name="eclipselink.logging.level" value="ALL"/>
<property name="eclipselink.jdbc.cache-statements" value="true"/>
<property name="eclipselink.jpa.uppercase-column-names " value="false"/>
<property name="eclipselink.logging.parameters" value="true"/>
<property name="eclipselink.weaving" value="static"/>
</properties>
</persistence-unit>
</persistence>
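For reference, the close/recreate snippet above implicitly assumes that every thread obtaining an EntityManager synchronizes on the same factoryLock; a minimal sketch of such a guarded accessor (hypothetical method name, not taken from the original code) would be:
// Sketch only: every EntityManager is handed out under the same factoryLock
// that guards close/recreate above (an assumption, not part of the posted code).
public EntityManager acquireEntityManager() {
    synchronized (factoryLock) {
        if (emFactory == null || !emFactory.isOpen()) {
            throw new IllegalStateException("EntityManagerFactory is not available");
        }
        return emFactory.createEntityManager();
    }
}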

Related

Hibernate Search not releasing db session

We use Hibernate Search (version 3.1) and Lucene (version 2.4) for indexing content; this runs on JBoss 7.2. The database team reported a huge spike in database sessions, and that the sessions go idle after serving a few requests. Here is the code:
public void updateDocumentByIds(IndexMessage indexMessage, java.io.Serializable[] entityPKs, FullTextSession session, boolean isSelfRebuild) throws AAException
{
    if (indexMessage.getServProvCode() == null && indexMessage.getSourceNumber() == null) {
        logger.error("Agency Code and Source Number are null!");
        return;
    }
    TransactionManager transactionManager = null;
    boolean isNewTransaction = false;
    FullTextSession searchSession = session;
    try
    {
        IIndexAdapter indexAdapter = IndexAdapter.getIndexAdapter(indexMessage.getEntityType());
        IndexDirectoryManager directoryManager = IndexDirectoryManager.getInstance();
        String specifyIndexName = directoryManager.getSpecifyIndexName(indexMessage.getServProvCode());
        if (searchSession == null)
        {
            transactionManager = getTransactionManager();
            transactionManager.begin();
            entityManager = getEntityManager();
            Session hibernateSession = (Session) entityManager.getDelegate();
            searchSession = SwitchSession.getFullTextSession(hibernateSession, specifyIndexName);
            searchSession.setFlushMode(FlushMode.MANUAL); // disable flush operations
            searchSession.setCacheMode(CacheMode.IGNORE); // disable 2nd level cache operations
            isNewTransaction = true;
        }
        AgencyModel agencyModel = null;
        if (indexMessage.getServProvCode() != null)
        {
            agencyModel = getAgencyByAgencyCode(searchSession, indexMessage.getServProvCode().toUpperCase());
            if (agencyModel == null)
            {
                logger.warn("No such agency:" + indexMessage.getServProvCode());
                return;
            }
        }
        else if (indexMessage.getSourceNumber() == null)
        {
            logger.warn("Please specify one Agency Code or Source Number!");
            return;
        }
        directoryManager.chooseSyncOrSearchDirectory(specifyIndexName, indexMessage.getEntityType(),
                ActionType.SYNC);
        for (java.io.Serializable entityPK : entityPKs)
        {
            try
            {
                logger.info("========Start Update Index==========");
                logger.info("Entity Type: " + indexMessage.getEntityType());
                logger.info("Primary Key: " + entityPK.toString());
                Object object = indexAdapter.getObjectByPK(searchSession, entityPK, agencyModel);
                if (object == null)
                {
                    // Remove index when the data was deleted from DB
                    searchSession.purge(EntityMapHelper.getEntityClass(indexMessage.getEntityType()), entityPK);
                }
                else
                {
                    // Update Index when this entity can be found in DB
                    searchSession.index(object);
                }
                // While the related index is rebuilding, this record need be tracked into a LOG table.
                if (!isSelfRebuild
                        && directoryManager.needTrackForSync(searchSession, specifyIndexName, indexMessage
                                .getEntityType()))
                {
                    saveToUnindexedData(indexMessage, indexMessage.getServProvCode(), entityPK);
                }
                logger.info("============ End Now =================\n");
            }
            catch (Exception e)
            {
                logger.error("Exception occured during update index for " + specifyIndexName + "/"
                        + entityPK.toString(), e);
                continue;
            }
        }
        searchSession.flushToIndexes();
        searchSession.clear();
        if (isNewTransaction)
        {
            commit(entityManager, searchSession, transactionManager);
            transactionManager = null;
        }
    }
    catch (Exception e)
    {
        throw new SyncIndexException("", e);
    }
    finally
    {
        if (transactionManager != null)
        {
            try
            {
                transactionManager.rollback();
            }
            catch (Exception e)
            {
                ;
            }
        }
        IndexDirectoryThreadLocal.remove();
    }
}
Note: the Hibernate EntityManagerFactory configuration below is used for unit tests.
<bean id="entityManagerFactory" class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean">
<property name="packagesToScan" value="com.comp" />
<property name="persistenceProviderClass" value="com.comp.orm.hibernate3.compHibernatePersistence" />
<property name="dataSource" ref="dataSource" />
<property name="persistenceUnitName" value="AAPU"/>
<property name="jpaPropertyMap">
<map>
<entry key="hibernate.show_sql"
value="false"/>
<entry key="hibernate.format_sql"
value="true"/>
<entry key="use_sql_comments"
value="true"/>
<entry key="hibernate.bytecode.use_reflection_optimizer"
value="true"/>
<entry key="hibernate.max_fetch_depth"
value="1"/>
<entry key="hibernate.default_batch_fetch_size"
value="30"/>
<entry key="hibernate.jdbc.fetch_size"
value="30"/>
<entry key="hibernate.jdbc.batch_size"
value="15"/>
<entry key="hibernate.cache.use_second_level_cache"
value="false"/>
<entry key="hibernate.jdbc.use_scrollable_resultset"
value="true"/>
<entry key="hibernate.temp.use_jdbc_metadata_defaults"
value="false"/>
<entry key="hibernate.jdbc.factory_class"
value="org.hibernate.jdbc.BatchingBatcherFactory"/>
<entry key="hibernate.c3p0.min_size"
value="50"/>
<entry key="hibernate.c3p0.max_size"
value="100"/>
<entry key="hibernate.c3p0.timeout"
value="120"/>
<entry key="hibernate.c3p0.max_statements"
value="100"/>
<entry key="hibernate.c3p0.idle_test_period"
value="3000"/>
<entry key="hibernate.search.default.directory_provider"
value="com.comp.aa.globalsearch.directory.IndexDirectoryProvider"/>
<entry key="hibernate.search.default.indexBase"
value="/index"/>
<entry key="hibernate.search.reader.strategy"
value="com.comp.aa.globalsearch.directory.SwitchReaderProvider"/>
<entry key="hibernate.jdbc.factory_class"
value="org.hibernate.jdbc.BatchingBatcherFactory"/>
<entry key="hibernate.transaction.manager_lookup_class"
value="org.hibernate.transaction.JBossTransactionManagerLookup"/>
<entry key="hibernate.current_session_context_class"
value="jta"/>
</map>
</property>
<property name="persistenceUnitPostProcessors">
<list>
<bean class="com.comp.orm.util.JtaPersistenceUnitPostProcessor">
<property name="jtaDataSource" ref="dataSource"/>
</bean>
</list>
</property>
</bean>
In some forums I saw that hibernate.c3p0.idle_test_period should be less than or equal to hibernate.c3p0.timeout. We changed both values to 5 minutes, but the problem still exists.
Any idea why the DB sessions stay idle?
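One detail worth sketching from the updateDocumentByIds method above: the EntityManager obtained through getEntityManager() when a new transaction is started is never explicitly closed. A minimal sketch of releasing it in the finally block (helper and field names as in the original; whether getEntityManager() opens a fresh instance per call is an assumption, so this is an illustration rather than a confirmed fix):
finally
{
    if (transactionManager != null)
    {
        try
        {
            transactionManager.rollback();
        }
        catch (Exception e)
        {
            // ignored, as in the original
        }
    }
    // Sketch: if this method opened the EntityManager itself, release it so the
    // underlying connection/session can return to the pool instead of sitting idle.
    if (isNewTransaction && entityManager != null && entityManager.isOpen())
    {
        entityManager.close();
    }
    IndexDirectoryThreadLocal.remove();
}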

org.hibernate.HibernateException: The database returned no natively generated identity value - Spring Batch for MultiReaderHibernateWriter

I am developing a 'MultiResourceItemReader & HibernateItemWriter' example from this link: http://websystique.com/springbatch/spring-batch-multiresourceitemreader-hibernateitemwriter-example/. When I run the Main program, I get the error below:
Jul 27, 2016 12:27:36 AM org.springframework.batch.core.launch.support.SimpleJobLauncher afterPropertiesSet
INFO: No TaskExecutor has been set, defaulting to synchronous executor.
Jul 27, 2016 12:27:37 AM org.springframework.batch.core.launch.support.SimpleJobLauncher run
INFO: Job: [FlowJob: [name=examResultJob]] launched with the following parameters: [{}]
---------------------------------
ExamResult Job starts at :2016-07-27T00:27:37.294+05:30
Jul 27, 2016 12:27:37 AM org.springframework.batch.core.job.SimpleStepHandler handleStep
INFO: Executing step: [step1]
Processing result :ExamResult [id=0, studentName=Brian Burlet, dob=1985-02-01, percentage=76.0]
Processing result :ExamResult [id=0, studentName=Jimmy Snuka, dob=1983-02-01, percentage=39.0]
Processing result :ExamResult [id=0, studentName=Renard konig, dob=1970-02-01, percentage=61.0]
Processing result :ExamResult [id=0, studentName=Kevin Richard, dob=2002-02-01, percentage=59.0]
Processing result :ExamResult [id=0, studentName=Sam Disilva, dob=1992-05-01, percentage=76.0]
Processing result :ExamResult [id=0, studentName=Bob corbet, dob=1990-07-10, percentage=29.0]
Processing result :ExamResult [id=0, studentName=Rick Ricky, dob=1973-02-01, percentage=54.0]
Processing result :ExamResult [id=0, studentName=Igor Watson, dob=1986-02-01, percentage=34.0]
Processing result :ExamResult [id=0, studentName=Peet Sampras, dob=1978-02-01, percentage=97.0]
Processing result :ExamResult [id=0, studentName=Rita Paul, dob=1993-02-01, percentage=92.0]
Hibernate: insert into EXAM_RESULT (DOB, PERCENTAGE, STUDENT_NAME) values (?, ?, ?)
Jul 27, 2016 12:18:09 AM org.springframework.batch.core.step.AbstractStep execute
SEVERE: Encountered an error executing step step1 in job examResultJob
org.hibernate.HibernateException: The database returned no natively generated identity value
at org.hibernate.id.IdentifierGeneratorHelper.getGeneratedIdentity(IdentifierGeneratorHelper.java:91)
at org.hibernate.id.IdentityGenerator$GetGeneratedKeysDelegate.executeAndExtract(IdentityGenerator.java:100)
at org.hibernate.id.insert.AbstractReturningDelegate.performInsert(AbstractReturningDelegate.java:58)
at org.hibernate.persister.entity.AbstractEntityPersister.insert(AbstractEntityPersister.java:3032)
at org.hibernate.persister.entity.AbstractEntityPersister.insert(AbstractEntityPersister.java:3558)
at org.hibernate.action.internal.EntityIdentityInsertAction.execute(EntityIdentityInsertAction.java:98)
at org.hibernate.engine.spi.ActionQueue.execute(ActionQueue.java:490)
at org.hibernate.engine.spi.ActionQueue.addResolvedEntityInsertAction(ActionQueue.java:195)
at org.hibernate.engine.spi.ActionQueue.addInsertAction(ActionQueue.java:179)
at org.hibernate.engine.spi.ActionQueue.addAction(ActionQueue.java:214)
at org.hibernate.event.internal.AbstractSaveEventListener.addInsertAction(AbstractSaveEventListener.java:324)
at org.hibernate.event.internal.AbstractSaveEventListener.performSaveOrReplicate(AbstractSaveEventListener.java:288)
at org.hibernate.event.internal.AbstractSaveEventListener.performSave(AbstractSaveEventListener.java:194)
at org.hibernate.event.internal.AbstractSaveEventListener.saveWithGeneratedId(AbstractSaveEventListener.java:125)
at org.hibernate.event.internal.DefaultSaveOrUpdateEventListener.saveWithGeneratedOrRequestedId(DefaultSaveOrUpdateEventListener.java:209)
at org.hibernate.event.internal.DefaultSaveOrUpdateEventListener.entityIsTransient(DefaultSaveOrUpdateEventListener.java:194)
at org.hibernate.event.internal.DefaultSaveOrUpdateEventListener.performSaveOrUpdate(DefaultSaveOrUpdateEventListener.java:114)
at org.hibernate.event.internal.DefaultSaveOrUpdateEventListener.onSaveOrUpdate(DefaultSaveOrUpdateEventListener.java:90)
at org.hibernate.internal.SessionImpl.fireSaveOrUpdate(SessionImpl.java:684)
at org.hibernate.internal.SessionImpl.saveOrUpdate(SessionImpl.java:676)
at org.hibernate.internal.SessionImpl.saveOrUpdate(SessionImpl.java:671)
at org.springframework.batch.item.database.HibernateItemWriter.doWrite(HibernateItemWriter.java:140)
at org.springframework.batch.item.database.HibernateItemWriter.write(HibernateItemWriter.java:113)
at org.springframework.batch.core.step.item.SimpleChunkProcessor.writeItems(SimpleChunkProcessor.java:175)
at org.springframework.batch.core.step.item.SimpleChunkProcessor.doWrite(SimpleChunkProcessor.java:151)
at org.springframework.batch.core.step.item.SimpleChunkProcessor.write(SimpleChunkProcessor.java:274)
at org.springframework.batch.core.step.item.SimpleChunkProcessor.process(SimpleChunkProcessor.java:199)
at org.springframework.batch.core.step.item.ChunkOrientedTasklet.execute(ChunkOrientedTasklet.java:75)
at org.springframework.batch.core.step.tasklet.TaskletStep$ChunkTransactionCallback.doInTransaction(TaskletStep.java:406)
at org.springframework.batch.core.step.tasklet.TaskletStep$ChunkTransactionCallback.doInTransaction(TaskletStep.java:330)
at org.springframework.transaction.support.TransactionTemplate.execute(TransactionTemplate.java:133)
at org.springframework.batch.core.step.tasklet.TaskletStep$2.doInChunkContext(TaskletStep.java:271)
at org.springframework.batch.core.scope.context.StepContextRepeatCallback.doInIteration(StepContextRepeatCallback.java:81)
at org.springframework.batch.repeat.support.RepeatTemplate.getNextResult(RepeatTemplate.java:374)
at org.springframework.batch.repeat.support.RepeatTemplate.executeInternal(RepeatTemplate.java:215)
at org.springframework.batch.repeat.support.RepeatTemplate.iterate(RepeatTemplate.java:144)
at org.springframework.batch.core.step.tasklet.TaskletStep.doExecute(TaskletStep.java:257)
at org.springframework.batch.core.step.AbstractStep.execute(AbstractStep.java:200)
at org.springframework.batch.core.job.SimpleStepHandler.handleStep(SimpleStepHandler.java:148)
at org.springframework.batch.core.job.flow.JobFlowExecutor.executeStep(JobFlowExecutor.java:64)
at org.springframework.batch.core.job.flow.support.state.StepState.handle(StepState.java:67)
at org.springframework.batch.core.job.flow.support.SimpleFlow.resume(SimpleFlow.java:169)
at org.springframework.batch.core.job.flow.support.SimpleFlow.start(SimpleFlow.java:144)
at org.springframework.batch.core.job.flow.FlowJob.doExecute(FlowJob.java:134)
at org.springframework.batch.core.job.AbstractJob.execute(AbstractJob.java:306)
at org.springframework.batch.core.launch.support.SimpleJobLauncher$1.run(SimpleJobLauncher.java:135)
at org.springframework.core.task.SyncTaskExecutor.execute(SyncTaskExecutor.java:50)
at org.springframework.batch.core.launch.support.SimpleJobLauncher.run(SimpleJobLauncher.java:128)
at com.websystique.springbatch.Main.main(Main.java:22)
Any pointers to solve the issue?
ExamResult.java
@Entity
@Table(name = "EXAM_RESULT")
public class ExamResult {

    @Id
    //@GeneratedValue(strategy = GenerationType.IDENTITY)
    @GeneratedValue(strategy = GenerationType.AUTO)
    private long id;

    @Column(name = "STUDENT_NAME", nullable = false)
    private String studentName;

    @Column(name = "DOB", nullable = false)
    @Type(type = "org.jadira.usertype.dateandtime.joda.PersistentLocalDate")
    private LocalDate dob;

    @Column(name = "PERCENTAGE", nullable = false)
    private double percentage;

    public long getId() {
        return id;
    }

    public void setId(long id) {
        this.id = id;
    }

    public String getStudentName() {
        return studentName;
    }

    public void setStudentName(String studentName) {
        this.studentName = studentName;
    }

    public LocalDate getDob() {
        return dob;
    }

    public void setDob(LocalDate dob) {
        this.dob = dob;
    }

    public double getPercentage() {
        return percentage;
    }

    public void setPercentage(double percentage) {
        this.percentage = percentage;
    }

    @Override
    public int hashCode() {
        final int prime = 31;
        int result = 1;
        result = prime * result + (int) (id ^ (id >>> 32));
        return result;
    }

    @Override
    public boolean equals(Object obj) {
        if (this == obj)
            return true;
        if (obj == null)
            return false;
        if (!(obj instanceof ExamResult))
            return false;
        ExamResult other = (ExamResult) obj;
        if (id != other.id)
            return false;
        return true;
    }

    @Override
    public String toString() {
        return "ExamResult [id=" + id + ", studentName=" + studentName
                + ", dob=" + dob + ", percentage=" + percentage + "]";
    }
}
spring-batch-context.xml
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:batch="http://www.springframework.org/schema/batch" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.springframework.org/schema/batch
http://www.springframework.org/schema/batch/spring-batch.xsd
http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans.xsd">
<import resource="classpath:context-model.xml"/>
<!-- JobRepository and JobLauncher are configuration/setup classes -->
<bean id="jobRepository" class="org.springframework.batch.core.repository.support.MapJobRepositoryFactoryBean" />
<bean id="jobLauncher" class="org.springframework.batch.core.launch.support.SimpleJobLauncher">
<property name="jobRepository" ref="jobRepository" />
</bean>
<!-- ============= Multi Resource Item Reader ================== -->
<bean id="multiResourceItemReader" class="org.springframework.batch.item.file.MultiResourceItemReader">
<property name="resources" value="classpath:csv/ExamResult*.txt" />
<property name="delegate" ref="flatFileItemReader" />
</bean>
<!-- =========== ItemReader reads a complete line one by one from input file ============-->
<bean id="flatFileItemReader" class="org.springframework.batch.item.file.FlatFileItemReader" scope="step">
<property name="lineMapper">
<bean class="org.springframework.batch.item.file.mapping.DefaultLineMapper">
<property name="fieldSetMapper">
<!-- Mapper which maps each individual items in a record to properties in POJO -->
<bean class="com.websystique.springbatch.ExamResultFieldSetMapper" />
</property>
<property name="lineTokenizer">
<!-- A tokenizer class to be used when items in input record are separated by specific characters -->
<bean class="org.springframework.batch.item.file.transform.DelimitedLineTokenizer">
<property name="delimiter" value="|" />
</bean>
</property>
</bean>
</property>
</bean>
<!-- ItemWriter which writes data to database -->
<bean id="databaseItemWriter" class="org.springframework.batch.item.database.HibernateItemWriter">
<property name="sessionFactory" ref="sessionFactory" />
</bean>
<!-- Optional ItemProcessor to perform business logic/filtering on the input records -->
<bean id="itemProcessor" class="com.websystique.springbatch.ExamResultItemProcessor" />
<!-- Optional JobExecutionListener to perform business logic before and after the job -->
<bean id="jobListener" class="com.websystique.springbatch.ExamResultJobListener" />
<!-- =========== Actual Job =========== -->
<batch:job id="examResultJob">
<batch:step id="step1">
<batch:tasklet transaction-manager="transactionManager">
<batch:chunk reader="multiResourceItemReader" writer="databaseItemWriter"
processor="itemProcessor" commit-interval="10" />
</batch:tasklet>
</batch:step>
<batch:listeners>
<batch:listener ref="jobListener" />
</batch:listeners>
</batch:job>
</beans>
context-model.xml
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xmlns:context="http://www.springframework.org/schema/context"
xmlns:tx="http://www.springframework.org/schema/tx"
xmlns:aop="http://www.springframework.org/schema/aop"
xsi:schemaLocation="
http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context.xsd
http://www.springframework.org/schema/aop http://www.springframework.org/schema/aop/spring-aop.xsd
http://www.springframework.org/schema/tx http://www.springframework.org/schema/tx/spring-tx.xsd"
default-autowire="byName" default-init-method="init">
<import resource="classpath:context-datasource.xml"/>
<bean id="sessionFactory" class="org.springframework.orm.hibernate4.LocalSessionFactoryBean" >
<property name="dataSource" ref="dataSource"/>
<property name="packagesToScan">
<list>
<value>com.websystique.springbatch.model</value>
</list>
</property>
<property name="hibernateProperties">
<props>
<prop key="hibernate.dialect">org.hibernate.dialect.MySQLDialect</prop>
<prop key="hibernate.show_sql">true</prop>
<!-- <prop key="hibernate.format_sql">true</prop> -->
</props>
</property>
</bean>
<bean id="transactionManager" class="org.springframework.orm.hibernate4.HibernateTransactionManager" />
<tx:annotation-driven transaction-manager="transactionManager"/>
</beans>
Simply create exam_result like below:
CREATE TABLE `websystique`.`exam_result` (
`ID` INT NOT NULL AUTO_INCREMENT,
`STUDENT_NAME` VARCHAR(500) NULL,
`DOB` DATE NULL,
`PERCENTAGE` VARCHAR(500) NULL,
PRIMARY KEY (`ID`));
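For reference, the commented-out annotation in ExamResult.java above is the mapping that delegates key generation to the database identity column (AUTO_INCREMENT on MySQL). As a sketch of that alternative only, not presented here as a confirmed fix for the error:
@Id
@GeneratedValue(strategy = GenerationType.IDENTITY) // let MySQL's AUTO_INCREMENT supply the key
private long id;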

Quartz threadpool reconfiguration

I'm using Quartz and want to change its thread pool size via a remote JMX call, but unfortunately I couldn't find any proper solution. Is it possible to change the configuration of a running job programmatically?
I used Quartz with Spring. In my web.xml I created a Spring ContextListener. My app starts the Quartz job and exposes two JMX methods to start and stop it on demand.
<listener>
<listener-class>za.co.lance.admin.infrastructure.ui.util.MBeanContextListener</listener-class>
</listener>
The MBeanContextListener class looks like this:
public class MBeanContextListener extends ContextLoaderListener {

    private ObjectName objectName;
    private static Logger logger = LoggerFactory.getLogger(MBeanContextListener.class);

    @Override
    public void contextDestroyed(final ServletContextEvent sce) {
        super.contextDestroyed(sce);
        logger.debug("=============> bean context listener destroy");
        final MBeanServer mbeanServer = ManagementFactory.getPlatformMBeanServer();
        try {
            mbeanServer.unregisterMBean(objectName);
            logger.info("=============> QuartzJmx unregisterMBean ok");
        } catch (final Exception e) {
            e.printStackTrace();
        }
    }

    @Override
    public void contextInitialized(final ServletContextEvent sce) {
        super.contextInitialized(sce);
        logger.debug("=============> bean context listener started");
        final MBeanServer mbeanServer = ManagementFactory.getPlatformMBeanServer();
        try {
            final QuartzJmx processLatestFailedDocumentsMbean = new QuartzJmx();
            Scheduler scheduler = (Scheduler) ContextLoader.getCurrentWebApplicationContext().getBean("runProcessLatestFailedDocumentsScheduler");
            processLatestFailedDocumentsMbean.setScheduler(scheduler);
            objectName = new ObjectName("za.co.lance.admin.infrastructure.jmx.mbeans:type=QuartzJmxMBean");
            mbeanServer.registerMBean(processLatestFailedDocumentsMbean, objectName);
            logger.info("=============> QuartzJmx registerMBean ok");
        } catch (final Exception e) {
            e.printStackTrace();
        }
    }
}
The QuartzJmx class. Please note: any MBean class (QuartzJmx) must implement an interface whose name ends with MBean (QuartzJmxMBean); a sketch of that interface follows the class.
@Component
public class QuartzJmx implements QuartzJmxMBean {

    private Scheduler scheduler;
    private static Logger LOG = LoggerFactory.getLogger(QuartzJmx.class);

    @Override
    public synchronized void suspendRunProcessLatestFailedDocumentsJob() {
        LOG.info("Suspending RunProcessLatestFailedDocumentsJob");
        if (scheduler != null) {
            try {
                if (scheduler.isStarted()) {
                    scheduler.standby();
                    LOG.info("RunProcessLatestFailedDocumentsJob suspended");
                } else {
                    LOG.info("RunProcessLatestFailedDocumentsJob already suspended");
                    throw new SchedulerException("RunProcessLatestFailedDocumentsJob already suspended");
                }
            } catch (SchedulerException e) {
                LOG.error(e.getMessage());
            }
        } else {
            LOG.error("Cannot suspend RunProcessLatestFailedDocumentsJob. Scheduler = null");
            throw new IllegalArgumentException("Cannot suspend RunProcessLatestFailedDocumentsJob. Scheduler = null");
        }
    }

    @Override
    public synchronized void startRunProcessLatestFailedDocumentsJob() {
        LOG.info("Starting RunProcessLatestFailedDocumentsJob");
        if (scheduler != null) {
            try {
                if (scheduler.isInStandbyMode()) {
                    scheduler.start();
                    LOG.info("RunProcessLatestFailedDocumentsJob started");
                } else {
                    LOG.info("RunProcessLatestFailedDocumentsJob already started");
                    throw new SchedulerException("scheduler already started");
                }
            } catch (SchedulerException e) {
                LOG.error(e.getMessage());
            }
        } else {
            LOG.error("Cannot start RunProcessLatestFailedDocumentsJob. Scheduler = null");
            throw new IllegalArgumentException("Cannot start RunProcessLatestFailedDocumentsJob. Scheduler = null");
        }
    }

    @Override
    public void setScheduler(Scheduler scheduler) {
        this.scheduler = scheduler;
    }
}
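The QuartzJmxMBean interface itself is not shown in the post; following the naming rule above, a minimal sketch of it (assuming it simply mirrors the public operations of QuartzJmx) would be:
public interface QuartzJmxMBean {

    void suspendRunProcessLatestFailedDocumentsJob();

    void startRunProcessLatestFailedDocumentsJob();

    void setScheduler(Scheduler scheduler);
}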
And last, the Spring context:
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.springframework.org/schema/beans
http://www.springframework.org/schema/beans/spring-beans-3.0.xsd">
<bean id="runProcessLatestFailedDocumentsTask"
class="za.co.lance.admin.infrastructure.service.vbs.process.ProcessDocumentServiceImpl" />
<!-- Spring Quartz -->
<bean name="runProcessLatestFailedDocumentsJob" class="org.springframework.scheduling.quartz.JobDetailBean">
<property name="jobClass"
value="za.co.lance.admin.infrastructure.service.quartz.RunProcessLatestFailedDocuments" />
<property name="jobDataAsMap">
<map>
<entry key="processDocumentService" value-ref="runProcessLatestFailedDocumentsTask" />
</map>
</property>
</bean>
<!-- Cron Trigger -->
<bean id="processLatestFailedDocumentsTrigger" class="org.springframework.scheduling.quartz.CronTriggerBean">
<property name="jobDetail" ref="runProcessLatestFailedDocumentsJob" />
<!-- Cron-Expressions (separated with a space) fields are -->
<!-- Seconds Minutes Hours Day-of-Month Month Day-of-Week Year(optional) -->
<!-- Run every morning hour from 9am to 6pm from Monday to Saturday -->
<property name="cronExpression" value="0 0 9-18 ? * MON-SAT" />
</bean>
<!-- Scheduler -->
<bean id="runProcessLatestFailedDocumentsScheduler"
class="org.springframework.scheduling.quartz.SchedulerFactoryBean">
<property name="jobDetails">
<list>
<ref bean="runProcessLatestFailedDocumentsJob" />
</list>
</property>
<property name="triggers">
<list>
<ref bean="processLatestFailedDocumentsTrigger" />
</list>
</property>
</bean>
</beans>
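To call the exposed operations from a remote JMX client (the original question mentions a remote JMX call), a minimal sketch using the standard javax.management.remote API could look like this; the host and port in the service URL are assumptions and must match the com.sun.management.jmxremote settings the server was started with:
import javax.management.MBeanServerConnection;
import javax.management.ObjectName;
import javax.management.remote.JMXConnector;
import javax.management.remote.JMXConnectorFactory;
import javax.management.remote.JMXServiceURL;

public class QuartzJmxClient {

    public static void main(String[] args) throws Exception {
        // Assumed host/port; adjust to the JMX remote settings of the target JVM.
        JMXServiceURL url = new JMXServiceURL("service:jmx:rmi:///jndi/rmi://localhost:9999/jmxrmi");
        JMXConnector connector = JMXConnectorFactory.connect(url);
        try {
            MBeanServerConnection connection = connector.getMBeanServerConnection();
            // Same ObjectName as registered in MBeanContextListener above.
            ObjectName name = new ObjectName("za.co.lance.admin.infrastructure.jmx.mbeans:type=QuartzJmxMBean");
            // Invoke one of the no-argument operations exposed by QuartzJmx.
            connection.invoke(name, "suspendRunProcessLatestFailedDocumentsJob", null, null);
        } finally {
            connector.close();
        }
    }
}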

In HornetQ, is the consumer automatically invoked?

I looked over all of the examples in HornetQ, but I couldn't find one where the consumer is automatically invoked whenever a message comes through from the producer.
Please let me know of example code or a hint.
Thanks in advance.
Use DefaultMessageListenerContainer. You can register a listener to it and consume messages asynchronously. Follow this link for more information about tuning MessageListenerContainer: http://bsnyderblog.blogspot.se/2010/05/tuning-jms-message-consumption-in.html.
HornetQ dependencies you need (I used a standalone hornetq-2.3.0.CR2; you also need some Spring jars):
<dependencies>
<!-- hornetq -->
<dependency>
<groupId>org.jboss.netty</groupId>
<artifactId>netty</artifactId>
<version>3.2.7.Final</version>
</dependency>
<dependency>
<groupId>org.hornetq</groupId>
<artifactId>hornetq-jms-client</artifactId>
<version>2.3.0.CR2</version>
</dependency>
<dependency>
<groupId>org.hornetq</groupId>
<artifactId>hornetq-core-client</artifactId>
<version>2.3.0.CR2</version>
</dependency>
<!-- hornetq -->
</dependencies>
The beans you should use in your applicationContext.xml (I didn't use JNDI to get the ConnectionFactory and destinations; for that, you can follow this question):
<!-- It's ConnectionFactory to connect to hornetq. 5445 is hornetq acceptor port -->
<bean name="connectionFactory" class="messaging.jms.CustomHornetQJMSConnectionFactory">
<constructor-arg index="0" name="ha" value="false" />
<constructor-arg index="1" name="commaSepratedServerUrls" value="127.0.0.1:5445" />
</bean>
<bean id="destinationParent" class="messaging.jms.JmsDestinationFactoryBean" abstract="true">
<property name="pubSubDomain" value="false" /> <!-- default is queue -->
</bean>
<bean id="exampleDestination" parent="destinationParent">
<property name="destinationName" value="example" /> <!-- queue name -->
</bean>
<!-- MessageListener -->
<bean id="messageHandler" class="messaging.consumer.MessageHandler">
</bean>
<!-- MessageListenerContainer -->
<bean id="paymentListenerContainer" class="org.springframework.jms.listener.DefaultMessageListenerContainer">
<property name="destination" ref="exampleDestination" />
<property name="messageListener" ref="messageHandler" />
<property name="connectionFactory" ref="connectionFactory" />
<property name="sessionTransacted" value="true" />
<property name="concurrentConsumers" value="1" />
<property name="maxConcurrentConsumers" value="10" />
<property name="idleConsumerLimit" value="2" />
<property name="idleTaskExecutionLimit" value="5" />
<property name="receiveTimeout" value="3000" />
</bean>
CustomHornetQJMSConnectionFactory:
public class CustomHornetQJMSConnectionFactory extends org.hornetq.jms.client.HornetQJMSConnectionFactory
{
    private static final long serialVersionUID = 1L;

    public CustomHornetQJMSConnectionFactory(boolean ha, String commaSepratedServerUrls)
    {
        super(ha, converToTransportConfigurations(commaSepratedServerUrls));
    }

    public static TransportConfiguration[] converToTransportConfigurations(String commaSepratedServerUrls)
    {
        String[] serverUrls = commaSepratedServerUrls.split(",");
        TransportConfiguration[] transportconfigurations = new TransportConfiguration[serverUrls.length];
        for (int i = 0; i < serverUrls.length; i++)
        {
            String[] urlParts = serverUrls[i].split(":");
            HashMap<String, Object> map = new HashMap<String, Object>();
            map.put(TransportConstants.HOST_PROP_NAME, urlParts[0]);
            map.put(TransportConstants.PORT_PROP_NAME, urlParts[1]);
            transportconfigurations[i] = new TransportConfiguration(NettyConnectorFactory.class.getName(), map);
        }
        return transportconfigurations;
    }
}
JmsDestinationFactoryBean (Used in destinationParent bean):
public class JmsDestinationFactoryBean implements FactoryBean<Destination>
{
    private String destinationName;
    private boolean pubSubDomain = false;

    public void setDestinationName(String destinationName) {
        this.destinationName = destinationName;
    }

    public void setPubSubDomain(boolean pubSubDomain) {
        this.pubSubDomain = pubSubDomain;
    }

    @Override
    public Class<?> getObjectType()
    {
        return Destination.class;
    }

    @Override
    public boolean isSingleton()
    {
        return true;
    }

    @Override
    public Destination getObject() throws Exception
    {
        if (pubSubDomain)
        {
            return HornetQJMSClient.createTopic(destinationName);
        }
        else
        {
            return HornetQJMSClient.createQueue(destinationName);
        }
    }
}
MessageHandler (received messages go to the onMessage method for processing; for simplicity, you can implement javax.jms.MessageListener instead of SessionAwareMessageListener):
public class MessageHandler implements org.springframework.jms.listener.SessionAwareMessageListener<Message>
{
    @Override
    public void onMessage(Message msg, Session session) throws JMSException
    {
        if (msg instanceof TextMessage)
        {
            System.out.println(((TextMessage) msg).getText());
            session.commit();
        }
        else
        {
            session.rollback(); // send message back to the queue
        }
    }
}
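For completeness, a minimal producer sketch that would trigger the listener above, using Spring's JmsTemplate with the same connectionFactory bean and the "example" queue name (this wiring is an assumption for illustration, not part of the original answer):
import javax.jms.ConnectionFactory;
import org.hornetq.api.jms.HornetQJMSClient;
import org.springframework.jms.core.JmsTemplate;

public class ExampleProducer {

    private final JmsTemplate jmsTemplate;

    public ExampleProducer(ConnectionFactory connectionFactory) {
        // Reuse the same connection factory bean that the listener container uses.
        this.jmsTemplate = new JmsTemplate(connectionFactory);
    }

    public void send(String text) {
        // "example" matches the destinationName of the exampleDestination bean above.
        jmsTemplate.convertAndSend(HornetQJMSClient.createQueue("example"), text);
    }
}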

Glassfish @PersistenceContext

I have the following persistence.xml in my JSF project using Eclipse:
<?xml version="1.0" encoding="UTF-8"?>
<persistence version="2.0" xmlns="http://java.sun.com/xml/ns/persistence" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://java.sun.com/xml/ns/persistence http://java.sun.com/xml/ns/persistence/persistence_2_0.xsd">
<persistence-unit name="EnglishOnline" transaction-type="JTA">
<provider>org.eclipse.persistence.jpa.PersistenceProvider</provider>
<class>org.englishonline.persistence.User</class>
<properties>
<property name="eclipselink.ddl-generation" value="create-tables" />
<property name="javax.persistence.jdbc.url" value="jdbc:mysql://localhost:3306/englishonline" />
<property name="javax.persistence.jdbc.driver" value="com.mysql.jdbc.Driver" />
<property name="javax.persistence.jdbc.user" value="root" />
<property name="javax.persistence.jdbc.password" value="porcelainbus" />
</properties>
</persistence-unit>
</persistence>
At the same time I have the following Backing bean:
import java.util.Date;
import org.englishonline.persistence.*;
import javax.faces.bean.*;
import javax.persistence.*;

@ManagedBean
@RequestScoped
public class Login {

    @PersistenceContext(unitName = "EnglishOnline")
    private EntityManager em;

    private String login = null;
    private String password = null;

    public String getLogin() {
        return login;
    }

    public void setLogin(String login) {
        this.login = login;
    }

    public String getPassword() {
        return password;
    }

    public void setPassword(String password) {
        this.password = password;
    }

    public String performLogin() {
        if (login.equals(password)) {
            EntityTransaction tx = em.getTransaction();
            User user = new User();
            user.setBirthdate(new Date());
            user.setLogin(login);
            user.setPassword(password);
            user.setName("Bobby");
            user.setRegistrationdate(new Date());
            user.setSex(Sex.MALE);
            tx.begin();
            em.persist(user);
            tx.commit();
            return null;
        } else {
            return null;
        }
    }
}
But when I click the submit button in my app, the server log gives this:
WARNING: RAR5117 : Failed to obtain/create connection from connection pool [ DerbyPool ]. Reason : com.sun.appserv.connectors.internal.api.PoolingException: Connection could not be allocated because: java.net.ConnectException : Connection to localhost port 1527 Connection refused: connect.
WARNING: RAR5114 : Error allocating connection : [Error in allocating a connection. Cause: Connection could not be allocated because: java.net.ConnectException : Connection refused: connect.]
SEVERE: Local Exception Stack:
Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.3.0.v20110604-r9504): org.eclipse.persistence.exceptions.DatabaseException
Internal Exception: java.sql.SQLException: Error in allocating a connection. Cause: Connection could not be allocated because: java.net.ConnectException : Connection refused: connect.
Error Code: 0
at org.eclipse.persistence.exceptions.DatabaseException.sqlException(DatabaseException.java:309)
at org.eclipse.persistence.sessions.JNDIConnector.connect(JNDIConnector.java:135)
at org.eclipse.persistence.sessions.DatasourceLogin.connectToDatasource(DatasourceLogin.java:162)
at org.eclipse.persistence.internal.sessions.DatabaseSessionImpl.loginAndDetectDatasource(DatabaseSessionImpl.java:582)
at org.eclipse.persistence.internal.jpa.EntityManagerFactoryProvider.login(EntityManagerFactoryProvider.java:206)
at org.eclipse.persistence.internal.jpa.EntityManagerSetupImpl.deploy(EntityManagerSetupImpl.java:472)
at org.eclipse.persistence.internal.jpa.EntityManagerFactoryDelegate.getDatabaseSession(EntityManagerFactoryDelegate.java:188)
at org.eclipse.persistence.internal.jpa.EntityManagerFactoryDelegate.createEntityManagerImpl(EntityManagerFactoryDelegate.java:277)
at org.eclipse.persistence.internal.jpa.EntityManagerFactoryImpl.createEntityManagerImpl(EntityManagerFactoryImpl.java:290)
at org.eclipse.persistence.internal.jpa.EntityManagerFactoryImpl.createEntityManager(EntityManagerFactoryImpl.java:275)
at com.sun.enterprise.container.common.impl.EntityManagerWrapper._getDelegate(EntityManagerWrapper.java:218)
at com.sun.enterprise.container.common.impl.EntityManagerWrapper.getTransaction(EntityManagerWrapper.java:857)
at org.englishonline.backing.Login.performLogin(Login.java:34)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at com.sun.el.parser.AstValue.invoke(AstValue.java:234)
at com.sun.el.MethodExpressionImpl.invoke(MethodExpressionImpl.java:297)
at com.sun.faces.facelets.el.TagMethodExpression.invoke(TagMethodExpression.java:105)
at javax.faces.component.MethodBindingMethodExpressionAdapter.invoke(MethodBindingMethodExpressionAdapter.java:88)
at com.sun.faces.application.ActionListenerImpl.processAction(ActionListenerImpl.java:102)
at javax.faces.component.UICommand.broadcast(UICommand.java:315)
at javax.faces.component.UIViewRoot.broadcastEvents(UIViewRoot.java:794)
at javax.faces.component.UIViewRoot.processApplication(UIViewRoot.java:1259)
at com.sun.faces.lifecycle.InvokeApplicationPhase.execute(InvokeApplicationPhase.java:81)
at com.sun.faces.lifecycle.Phase.doPhase(Phase.java:101)
at com.sun.faces.lifecycle.LifecycleImpl.execute(LifecycleImpl.java:118)
at javax.faces.webapp.FacesServlet.service(FacesServlet.java:593)
at org.apache.catalina.core.StandardWrapper.service(StandardWrapper.java:1539)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:281)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:175)
at org.apache.catalina.core.StandardPipeline.doInvoke(StandardPipeline.java:655)
at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:595)
at com.sun.enterprise.web.WebPipeline.invoke(WebPipeline.java:98)
at com.sun.enterprise.web.PESessionLockingStandardPipeline.invoke(PESessionLockingStandardPipeline.java:91)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:162)
at org.apache.catalina.connector.CoyoteAdapter.doService(CoyoteAdapter.java:330)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:231)
at com.sun.enterprise.v3.services.impl.ContainerMapper.service(ContainerMapper.java:174)
at com.sun.grizzly.http.ProcessorTask.invokeAdapter(ProcessorTask.java:828)
at com.sun.grizzly.http.ProcessorTask.doProcess(ProcessorTask.java:725)
at com.sun.grizzly.http.ProcessorTask.process(ProcessorTask.java:1019)
at com.sun.grizzly.http.DefaultProtocolFilter.execute(DefaultProtocolFilter.java:225)
at com.sun.grizzly.DefaultProtocolChain.executeProtocolFilter(DefaultProtocolChain.java:137)
at com.sun.grizzly.DefaultProtocolChain.execute(DefaultProtocolChain.java:104)
at com.sun.grizzly.DefaultProtocolChain.execute(DefaultProtocolChain.java:90)
at com.sun.grizzly.http.HttpProtocolChain.execute(HttpProtocolChain.java:79)
at com.sun.grizzly.ProtocolChainContextTask.doCall(ProtocolChainContextTask.java:54)
at com.sun.grizzly.SelectionKeyContextTask.call(SelectionKeyContextTask.java:59)
at com.sun.grizzly.ContextTask.run(ContextTask.java:71)
at com.sun.grizzly.util.AbstractThreadPool$Worker.doWork(AbstractThreadPool.java:532)
at com.sun.grizzly.util.AbstractThreadPool$Worker.run(AbstractThreadPool.java:513)
at java.lang.Thread.run(Thread.java:662)
Caused by: java.sql.SQLException: Error in allocating a connection. Cause: Connection could not be allocated because: java.net.ConnectException : Error connecting to server localhost on port 1527 with message Connection refused: connect.
at com.sun.gjc.spi.base.DataSource.getConnection(DataSource.java:151)
at org.eclipse.persistence.sessions.JNDIConnector.connect(JNDIConnector.java:132)
... 52 more
Caused by: javax.resource.spi.ResourceAllocationException: Error in allocating a connection. Cause: Connection could not be allocated because: java.net.ConnectException : Error connecting to server localhost on port 1527 with message Connection refused: connect.
at com.sun.enterprise.connectors.ConnectionManagerImpl.internalGetConnection(ConnectionManagerImpl.java:307)
at com.sun.enterprise.connectors.ConnectionManagerImpl.allocateConnection(ConnectionManagerImpl.java:190)
at com.sun.enterprise.connectors.ConnectionManagerImpl.allocateConnection(ConnectionManagerImpl.java:165)
at com.sun.enterprise.connectors.ConnectionManagerImpl.allocateConnection(ConnectionManagerImpl.java:160)
at com.sun.gjc.spi.base.DataSource.getConnection(DataSource.java:145)
... 53 more
And the JPA provider does nothing.
You specified the transaction-type as JTA, which requires a JTA data source. You will want to use RESOURCE_LOCAL if you wish to specify the javax.persistence.jdbc.* properties yourself and control the transactions directly instead of using JTA.
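A minimal sketch of the resource-local alternative, shown only to illustrate the suggestion above (persistence-unit name and JDBC properties as in the posted persistence.xml, with transaction-type switched to RESOURCE_LOCAL there):
// Assumes <persistence-unit name="EnglishOnline" transaction-type="RESOURCE_LOCAL"> in persistence.xml.
EntityManagerFactory emf = Persistence.createEntityManagerFactory("EnglishOnline");
EntityManager em = emf.createEntityManager();
EntityTransaction tx = em.getTransaction(); // only legal for resource-local persistence contexts
try {
    tx.begin();
    User user = new User();                 // populate as in performLogin()
    em.persist(user);
    tx.commit();
} finally {
    if (tx.isActive()) {
        tx.rollback();
    }
    em.close();
}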