How to instantiate and register a JobListener in the quartz.properties file? - quartz-scheduler

I've read from many sources that it is possible, and intended, for a listener to be instantiated and registered with a scheduler entirely in the properties file.
But I've searched everywhere and can't find a single example of this.
Note: I am using this Quartz Initializer Servlet to start my scheduler, so I won't be able to use the conventional method of registering a JobListener with the scheduler. Other ways of doing this are also welcome.
Basically I want to do this:
JobListener jobListener = new SchedulerGlobalListener();
scheduler.getListenerManager().addJobListener(jobListener);
but declared in quartz.properties, like this:
org.quartz.jobListener.NAME.class = com.foo.MyListenerClass
org.quartz.jobListener.NAME.propName = propValue
org.quartz.jobListener.NAME.prop2Name = prop2Value
Below is what I've tried, and the result.
quartz.properties
org.quartz.jobListener.SchedulerGlobalListener.class = com.scheduler.SchedulerGlobalListener
Listener class:
public class SchedulerGlobalListener implements JobListener {
    private String name;

    public SchedulerGlobalListener() {
    }

    public SchedulerGlobalListener(String name) {
        if (name.isEmpty()) {
            this.name = "SchedulerGlobalListener";
        } else {
            this.name = name;
        }
    }

    @Override
    public String getName() {
        return name;
    }

    public String setName(String name) {
        return name;
    }

    @Override
    public void jobToBeExecuted(JobExecutionContext context) {
        // do something with the event
    }

    @Override
    public void jobWasExecuted(JobExecutionContext context, JobExecutionException jobException) {
        System.out.println("I just ran this job: " + context.getJobDetail().getJobClass().getName());
    }

    @Override
    public void jobExecutionVetoed(JobExecutionContext context) {
        // do something with the event
    }
}
Result:
INFO: QuartzInitializer: Quartz Scheduler failed to initialize: java.lang.IllegalArgumentException: JobListener name cannot be empty.

Thanks for the help guys. I totally missed this (under Quartz config documentation):
For example if the properties file contains the property 'org.quartz.jobStore.myProp = 10' then after the JobStore class has been instantiated, the method 'setMyProp()' will be called on it.
quartz.properties
org.quartz.jobListener.SchedulerGlobalListener.class = com.scheduler.SchedulerGlobalListener
org.quartz.jobListener.SchedulerGlobalListener.globalListenerName = SchedulerGlobalListener
SchedulerGlobalListener.java
public void setGlobalListenerName(String name) {
    this.name = name;
}
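Putting it together, a minimal sketch of the corrected listener (same class and property names as above; Quartz instantiates the class via its no-arg constructor and then calls the setter derived from the property name):
import org.quartz.JobExecutionContext;
import org.quartz.JobExecutionException;
import org.quartz.JobListener;

public class SchedulerGlobalListener implements JobListener {

    // default value, so getName() never returns null or empty
    private String name = "SchedulerGlobalListener";

    // called by Quartz for 'org.quartz.jobListener.SchedulerGlobalListener.globalListenerName'
    public void setGlobalListenerName(String name) {
        this.name = name;
    }

    @Override
    public String getName() {
        return name;
    }

    @Override
    public void jobToBeExecuted(JobExecutionContext context) { }

    @Override
    public void jobWasExecuted(JobExecutionContext context, JobExecutionException jobException) {
        System.out.println("I just ran this job: " + context.getJobDetail().getJobClass().getName());
    }

    @Override
    public void jobExecutionVetoed(JobExecutionContext context) { }
}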

You have almost done everything already.
Please refer to this link.
You just have to specify the name of the listener class in the quartz.properties file and make sure that the specified listener class is on your classpath.
Refer to this article for how to use a JobListener, except for the part that registers the JobListener with the scheduler programmatically; instead, add the above-mentioned properties to the quartz.properties file.

Here you can find how to use Quartz and how to write and trigger your jobs:
http://www.mkyong.com/java/quartz-joblistener-example/
In Quartz, a cron expression describes the schedule on which a job fires. Here:
http://quartz-scheduler.org/documentation/quartz-1.x/tutorials/crontrigger
http://docs.oracle.com/cd/E12058_01/doc/doc.1014/e12030/cron_expressions.htm
you can find how to write cron expressions.
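For reference, a Quartz cron expression has six required fields (seconds, minutes, hours, day-of-month, month, day-of-week) plus an optional seventh for the year; the expression 0/5 * * * * ? used below fires every five seconds.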
Quartz simple trigger
Trigger trigger = TriggerBuilder
        .newTrigger()
        .withIdentity("TriggerName", "group1")
        .withSchedule(
                SimpleScheduleBuilder.simpleSchedule()
                        .withIntervalInSeconds(5).repeatForever())
        .build();
Quartz Cron Trigger
Trigger trigger = TriggerBuilder
        .newTrigger()
        .withIdentity("TriggerName", "group1")
        .withSchedule(
                CronScheduleBuilder.cronSchedule("0/5 * * * * ?"))
        .build();
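For completeness, a minimal sketch of scheduling a job with one of these triggers; MyJob here is a hypothetical Job implementation, the rest is standard Quartz 2.x API:
// assumes a class MyJob implementing org.quartz.Job
JobDetail job = JobBuilder.newJob(MyJob.class)
        .withIdentity("JobName", "group1")
        .build();

Scheduler scheduler = StdSchedulerFactory.getDefaultScheduler();
scheduler.start();
// 'trigger' is either of the triggers built above
scheduler.scheduleJob(job, trigger);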
If you are using Spring, you can put all the Quartz properties in your context file, like this:
<bean class="org.springframework.scheduling.quartz.SchedulerFactoryBean">
<property name="triggers">
<list>
<ref bean="cronTrigger" />
<ref bean="simpleTrigger" />
</list>
</property>
<property name="quartzProperties">
<props>
<prop key="propertName">propertyValue</prop>
</props>
</property>
</bean>
and
<bean id="beanName" class="org.springframework.scheduling.quartz.SchedulerFactoryBean" destroy-method="destroy">
<property name="jobFactory">
<bean class="org.springframework.scheduling.quartz.SpringBeanJobFactory"/>
</property>
<property name="dataSource" ref="JNDIDataSource" />
<property name="transactionManager" ref="transactionManager" />
<property name="quartzProperties">
<util:properties location="classpath:/quartz.properties"/>
</property>
<property name="triggers">
<list>
<ref bean="triggerBean"/>
</list>
</property>
</bean>

Related

Spring batch does not start reading the list from the beginning

I am using a task scheduler to run a batch job every 10 seconds, but on every run the reader begins reading from the 2nd element in the list.
My custom reader:
public class DataReader implements ItemReader<User> {
    @Autowired
    private MainDAO mainDAO;

    private int counter;
    private List<User> userList;

    @PostConstruct
    public void init() {
        this.userList = this.mainDAO.getAllUsers();
    }

    public User read() throws Exception, UnexpectedInputException, ParseException, NonTransientResourceException {
        System.out.println(counter);
        if (counter < userList.size())
            return userList.get(counter++);
        return null;
    }
}
It always starts reading from the 2nd element in the list, although the counter should be zero when the class is created.
My Run Scheduler:
@Component
public class RunScheduler {
    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private Job job;

    public void run() {
        try {
            String dateParameter = new Date().toString();
            JobParameters parameter = new JobParametersBuilder().addString("date", dateParameter).toJobParameters();
            System.out.println(dateParameter);
            JobExecution execution = this.jobLauncher.run(job, parameter);
            System.out.println("Exit status: " + execution.getStatus());
        } catch (Exception exception) {
            exception.printStackTrace();
        }
    }
}
My Spring configuration for Spring batch:
<!-- Spring Batch Configuration: Start -->
<bean id="customReader" class="com.arpit.reader.DataReader" />
<bean id="customProcessor" class="com.arpit.processor.DataProcessor" />
<bean id="customWriter" class="com.arpit.writer.DataWriter" />

<batch:job id="invoiceJob">
    <batch:step id="step1">
        <batch:tasklet>
            <batch:chunk reader="customReader" processor="customProcessor" writer="customWriter" commit-interval="1" />
        </batch:tasklet>
    </batch:step>
</batch:job>

<bean id="transactionManager" class="org.springframework.batch.support.transaction.ResourcelessTransactionManager" />

<bean id="jobLauncher" class="org.springframework.batch.core.launch.support.SimpleJobLauncher">
    <property name="jobRepository" ref="jobRepository" />
</bean>

<bean id="jobRepository" class="org.springframework.batch.core.repository.support.MapJobRepositoryFactoryBean">
    <property name="transactionManager" ref="transactionManager" />
</bean>
<!-- Spring Batch Configuration: End -->
<!-- Task Scheduler Configuration: Start -->
<bean id="runScheduler" class="com.arpit.scheduler.RunScheduler" />

<task:scheduled-tasks>
    <task:scheduled ref="runScheduler" method="run" cron="*/10 * * * * *" />
</task:scheduled-tasks>
<!-- Task Scheduler Configuration: End -->
Can someone please help me figure out what the issue is?

Using MultiResourceItemReader to read 2 plain text file and write into single file

My batch job will generate 2 text files, one string per line. I created a reader:
<bean id="myMultiResourceReader"
class=" org.springframework.batch.item.file.MultiResourceItemReader">
<property name="resources" value="file:D:/MY/sample/*.txt" />
</bean>
<bean id="myFinalWriter" class="org.springframework.batch.item.file.FlatFileItemWriter"
scope="step">
<property name="resource" value="${test.file3}" />
<property name="lineAggregator">
<bean
class="org.springframework.batch.item.file.transform.PassThroughLineAggregator" />
</property>
<property name="footerCallback" ref="myFinalCustomItemWriter" />
<property name="headerCallback" ref="myFinalCustomItemWriter" />
</bean>
<bean id="myFinalCustomItemWriter" class="my.process.MyWriter"
scope="step">
<property name="delegate" ref="myFinalWriter" />
<property name="stepContext" value="#{stepExecution.stepName}" />
</bean>
I was getting this error:
Caused by: org.springframework.beans.ConversionNotSupportedException: Failed to convert property value of type 'com.sun.proxy.$Proxy68 implementing org.springframework.batch.item.file.ResourceAwareItemWriterItemStream,org.springframework.beans.factory.InitializingBean,org.springframework.batch.item.ItemStreamWriter,org.springframework.batch.item.ItemStream,org.springframework.aop.scope.ScopedObject,java.io.Serializable,org.springframework.aop.framework.AopInfrastructureBean,org.springframework.aop.SpringProxy,org.springframework.aop.framework.Advised' to required type 'org.springframework.batch.item.file.FlatFileItemWriter' for property 'delegate'; nested exception is java.lang.IllegalStateException: Cannot convert value of type [com.sun.proxy.$Proxy68 implementing org.springframework.batch.item.file.ResourceAwareItemWriterItemStream,org.springframework.beans.factory.InitializingBean,org.springframework.batch.item.ItemStreamWriter,org.springframework.batch.item.ItemStream,org.springframework.aop.scope.ScopedObject,java.io.Serializable,org.springframework.aop.framework.AopInfrastructureBean,org.springframework.aop.SpringProxy,org.springframework.aop.framework.Advised] to required type [org.springframework.batch.item.file.FlatFileItemWriter] for property 'delegate': no matching editors or conversion strategy found
Basically I just want to combine the two plain files and append the total count in the footer, then delete both input files. Can anyone help?
MyWriter.java
public class MyWriter implements ItemWriter<String>, FlatFileFooterCallback, FlatFileHeaderCallback, ItemStream {
    private static Logger log = Logger.getLogger(MyWriter.class);

    private FlatFileItemWriter<String> delegate;
    private int recordCount = 0;
    private String stepContext;

    public void writeFooter(Writer writer) throws IOException {
        writer.write("#" + recordCount);
    }

    public void writeHeader(Writer writer) throws IOException {
        writer.write("#" + StringUtil.getSysDate());
    }

    public void setDelegate(FlatFileItemWriter<String> delegate) {
        this.delegate = delegate;
    }

    public void write(List<? extends String> list) throws Exception {
        int chunkRecord = 0;
        for (String item : list) {
            chunkRecord++;
        }
        delegate.write(list);
        recordCount += chunkRecord;
    }

    public void close() throws ItemStreamException {
        this.delegate.close();
    }

    public void open(ExecutionContext arg0) throws ItemStreamException {
        this.delegate.open(arg0);
    }

    public void update(ExecutionContext arg0) throws ItemStreamException {
        this.delegate.update(arg0);
    }

    public void setStepContext(String stepContext) {
        this.stepContext = stepContext;
    }
}
As Luca Basso Ricci already pointed out, the problem is the delegate definition in MyWriter. Since Spring creates proxies for its beans, it will not recognize your writer bean as an actual instance of FlatFileItemWriter, and therefore setDelegate(FlatFileItemWriter<String> delegate) fails.
Use an ItemStreamWriter as the delegate type in MyWriter. As you see in the exception message, the created proxy does implement this interface, so it can be injected.
This solves the delegation of the write, open, close, and update methods. To write the header and footer, you need to implement a FlatFileHeaderCallback and a FlatFileFooterCallback and set them directly in the definition of your FlatFileItemWriter.
Implementing the header callback is not a problem, since you only write the system date.
For the footer callback, make your own bean, use it in the FlatFileItemWriter to write the footer, add an "increaseCount" method to it, and call that from your MyWriter bean to increase the written count.
public void write(List<? extends String> list) throws Exception {
    myFooterCallback.increaseCount(list.size());
    delegate.write(list);
}
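A sketch of what such a standalone footer callback could look like (the class and method names here are illustrative, not from the original post):
public class CountingFooterCallback implements FlatFileFooterCallback {

    private int recordCount = 0;

    // called from MyWriter.write(..) after each chunk is written
    public void increaseCount(int count) {
        recordCount += count;
    }

    @Override
    public void writeFooter(Writer writer) throws IOException {
        writer.write("#" + recordCount);
    }
}
Register it as the footerCallback of the FlatFileItemWriter and inject it into MyWriter as myFooterCallback.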
Another possible option would be to directly extend MyWriter from FlatFileItemWriter:
public class MyWriter extends FlatFileItemWriter<String> implements FlatFileFooterCallback, FlatFileHeaderCallback {
    private static Logger log = Logger.getLogger(MyWriter.class);

    private int recordCount = 0;
    private String stepContext;

    public void writeFooter(Writer writer) throws IOException {
        writer.write("#" + recordCount);
    }

    public void writeHeader(Writer writer) throws IOException {
        writer.write("#" + StringUtil.getSysDate());
    }

    @Override
    public void afterPropertiesSet() throws Exception {
        setFooterCallback(this);
        setHeaderCallback(this);
        super.afterPropertiesSet();
    }

    @Override
    public void write(List<? extends String> list) throws Exception {
        super.write(list);
        recordCount += list.size();
    }

    // needed for the stepContext property set in the XML below
    public void setStepContext(String stepContext) {
        this.stepContext = stepContext;
    }
}
Configuration in your XML would look like this:
<bean id="myFinalCustomItemWriter" class="my.process.MyWriter" scope="step">
<property name="resource" value="${test.file3}" />
<property name="lineAggregator">
<bean class="org.springframework.batch.item.file.transform.PassThroughLineAggregator" />
</property>
<property name="stepContext" value="#{stepExecution.stepName}" />
</bean>
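Note that with this variant the separate myFinalWriter bean is no longer needed: MyWriter is itself the FlatFileItemWriter, so the resource and lineAggregator properties move onto it directly, as shown above.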

Transaction doesn't work in aspectj

I have the aspect (see below) which should log actions (create, update, delete) in the db. Depending on the action, logging happens in the preProcess or postProcess method. I shouldn't log anything if the action fails, i.e. if the create didn't happen, then there is no need to log it.
I tried to test it: I throw a RuntimeException in the join point and expect that no new log appears in the db. Unfortunately, a new log is saved in spite of the exception in the join point.
Aspect:
@Component
@Aspect
public class LoggingAspect {
    @Autowired
    private ApplicationContext appContext;

    @Autowired
    private LoggingService loggingService;

    @Around("@annotation(Loggable)")
    @Transactional
    public void saveActionMessage(ProceedingJoinPoint joinPoint) throws Throwable {
        MethodSignature ms = (MethodSignature) joinPoint.getSignature();
        Loggable m = ms.getMethod().getAnnotation(Loggable.class);
        LoggingStrategy strategy = appContext.getBean(m.strategy());
        Object argument = joinPoint.getArgs()[0];
        strategy.preProcess(argument);
        joinPoint.proceed();
        strategy.postProcess(argument);
    }
}
TestApplicationConfig:
<context:spring-configured />

<import resource="applicationConfig-common.xml" />
<import resource="applicationConfig-security.xml" />

<aop:aspectj-autoproxy />

<util:map id="testValues">
    <entry key="com.exadel.mbox.test.testSvnFile" value="${svnFolder.configPath}${svnRoot.file[0].fileName}" />
    <entry key="com.exadel.mbox.test.testCommonRepositoryPath" value="${svnRoot.commonRepositoryPath}" />
    <entry key="com.exadel.mbox.test.testMailFile" value="${mailingList.configPath}" />
</util:map>

<context:component-scan base-package="com.exadel.report.common" />

<!-- Jpa Repositories -->
<jpa:repositories base-package="com.exadel.report.common.dao" />

<tx:annotation-driven proxy-target-class="true" transaction-manager="txManager" mode="aspectj" />

<bean id="txManager" class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
    <property name="dataSource" ref="dataSource" />
</bean>

<!-- Data Source -->
<bean id="dataSource" class="org.springframework.jdbc.datasource.DriverManagerDataSource">
    <property name="driverClassName" value="org.hsqldb.jdbcDriver" />
    <property name="url" value="jdbc:hsqldb:mem:testdb" />
    <property name="username" value="sa" />
    <property name="password" value="" />
</bean>

<!-- Entity Manager -->
<bean id="entityManagerFactory" class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean">
    <property name="dataSource" ref="dataSource" />
    <property name="jpaVendorAdapter">
        <bean class="org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter">
            <property name="showSql" value="true" />
            <property name="generateDdl" value="true" />
            <property name="databasePlatform" value="org.hibernate.dialect.HSQLDialect" />
        </bean>
    </property>
    <property name="persistenceUnitName" value="exviewer-test" />
</bean>

<!-- Transaction Manager -->
<bean id="transactionManager" class="org.springframework.orm.jpa.JpaTransactionManager">
    <property name="entityManagerFactory" ref="entityManagerFactory" />
</bean>
[Update]
LoggingStrategy:
public interface LoggingStrategy {
    public void preProcess(Object obj);
    public void postProcess(Object obj);
}
BaseLoggingStrategy:
public class BaseLoggingStrategy implements LoggingStrategy {
    @Override
    public void preProcess(Object obj) {}

    @Override
    public void postProcess(Object obj) {}
}
UpdateProcessStrategy:
@Service
public class UpdateProcessStrategy extends BaseLoggingStrategy {
    @Autowired
    private LoggingService loggingService;

    @Autowired
    private UserService userService;

    @Autowired
    DeviceService deviceService;

    private Device currentDevice;

    @Override
    @Transactional
    public void preProcess(Object obj) {
        currentDevice = (Device) obj;
        Device previousDevice = deviceService.getById(currentDevice.getId());
        String deviceDataBeforeUpdate = deviceService.getDeviceDetailsInJSON(previousDevice);
        String deviceDataAfterUpdate = deviceService.getDeviceDetailsInJSON(currentDevice);
        String login = userService.getCurrentUser().getLogin();
        String actionMessage = LoggingMessages.DEVICE_UPDATE.name();
        loggingService.save(
                new Logging(
                        login,
                        actionMessage,
                        deviceDataBeforeUpdate,
                        deviceDataAfterUpdate,
                        new Date())
        );
    }

    @Override
    public void postProcess(Object obj) {}
}
Class intercepted by the aspect:
@Service
public class DeviceService {
    @Autowired
    private DeviceRepository deviceRepository; // field implied by createOrUpdate/remove below

    @Loggable(value = LoggingMessages.DEVICE_CREATE, strategy = CreateProcessStrategy.class)
    @Transactional
    public void create(Device device) {
        createOrUpdate(device);
    }

    @Loggable(value = LoggingMessages.DEVICE_UPDATE, strategy = UpdateProcessStrategy.class)
    @Transactional
    public void update(Device device) {
        createOrUpdate(device);
    }

    private void createOrUpdate(Device device) {
        deviceRepository.save(device);
    }

    @Loggable(value = LoggingMessages.DEVICE_REMOVE, strategy = RemoveProcessStrategy.class)
    public void remove(Long deviceId) {
        deviceRepository.delete(deviceId);
    }
}
Loggable annotation:
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
public @interface Loggable {
    LoggingMessages value();
    Class<? extends LoggingStrategy> strategy();
}
Log for update action contains:
id, created_dtm, action(DEVICE_UPDATE), device_data_before_action_on_the_device(in json format), device_data_after_action_on_the_device(in json format), created_by.
Disclaimer: Actually I am not a Spring expert, maybe someone else can help you out here. My field of expertise is AspectJ, which is how I found your question.
Anyway, you have two issues here:
The @Transactional annotation on your aspect's advice LoggingAspect.saveActionMessage(..). Actually I have no idea if this works at all (I found no example using @Transactional on an aspect method/advice on the web, but maybe I searched in the wrong way), because declarative transaction handling in Spring is implemented via proxy-based technology, just like Spring AOP. Read chapter 12 about transaction management in the Spring manual for further details, especially section 12.5.1. I am pretty sure you will find a way to do what you want there.
Nested transactions: e.g. UpdateProcessStrategy.preProcess(..) is called by the very advice which is meant to be transactional, but is declared @Transactional too, so you have a transaction within a transaction. How Spring handles this I do not know, but maybe this tutorial about Spring transaction propagation contains enlightening details.
The Spring manual lists several ways to implement transactional behaviour: programmatically, declaratively via annotations, XML-based <tx:advice> stuff, and so forth. I don't know which way is best for you; I merely wanted to provide some general hints.
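As a sketch of the propagation point (illustrative only, not a verified fix for this setup): with Spring's default Propagation.REQUIRED, an inner @Transactional method joins the caller's transaction and rolls back with it, while Propagation.REQUIRES_NEW suspends the caller's transaction and commits independently, which is exactly the behaviour that leaves a log row behind after a later RuntimeException:
// joins the surrounding transaction; the log entry is rolled back
// together with the intercepted method if it throws
@Transactional(propagation = Propagation.REQUIRED)
public void preProcess(Object obj) { /* save log entry */ }

// commits in its own transaction; the log entry survives even if
// the surrounding transaction rolls back
@Transactional(propagation = Propagation.REQUIRES_NEW)
public void preProcessIndependent(Object obj) { /* save log entry */ }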

deleting records through spring batch

I have a file with records.
I have to develop a Spring Batch program which will read the file and DELETE the same records from the database table.
Is it possible to run a delete query through an ItemWriter?
Serkan's answer is right, but there are some more possibilities for working with batch SQL:
you could use the Spring JDBC batch template instead of the normal JdbcTemplate, or
you could directly use the Spring Batch JdbcBatchItemWriter; see the example code below.
Code example with Spring Batch XML config and Java code:
<bean id="itemWriter" class="org.springframework.batch.item.database.JdbcBatchItemWriter">
<property name="dataSource" ref="dataSource" />
<property name="sql">
<!-- Why CDATA?
because < etc. is not allowed for xml values
when you use < xml parser will work, but
now the sql won't because of the & spring assumes
a placeholder, see
- AbstractSqlPagingQueryProvider.init(...)
- JdbcParameterUtils.countParameterPlaceholders(...)
-->
<value>
<![CDATA[
DELETE FROM TEST
WHERE id = ?
and sub.id = ?
and ...
]]>
</value>
</property>
<property name="itemPreparedStatementSetter">
<bean class="...FieldSetItemPreparedStatementSetter" />
</property>
</bean>
/**
 * Implementation for {@link ItemPreparedStatementSetter},
 * sets the values from {@link FieldSet}.
 */
public class FieldSetItemPreparedStatementSetter implements ItemPreparedStatementSetter<FieldSet> {

    /** {@inheritDoc} */
    @Override
    public void setValues(FieldSet item, PreparedStatement ps) throws SQLException {
        for (int i = 0; i < item.getValues().length; i++) {
            // PreparedStatement parameters start with 1
            ps.setObject(i + 1, item.getValues()[i]);
        }
    }
}
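Each "?" placeholder in the SQL above is filled, in order, from the values of the FieldSet read from the file, so the file's column layout has to match the order of the placeholders.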
There is no difference between update and delete operations, only the SQL changes, so the simple answer to your question is yes.
The following code fragment may help for basic needs, assuming that you are deleting from the Book table.
public class BookJdbcItemWriter implements ItemWriter<Book> {
    private static final String DELETE_BOOK = "delete from Book where id = ?";

    private JdbcTemplate jdbcTemplate;

    public BookJdbcItemWriter(DataSource dataSource) {
        this.jdbcTemplate = new JdbcTemplate(dataSource);
    }

    public void write(List<? extends Book> items) throws Exception {
        for (Book item : items) {
            int updated = jdbcTemplate.update(DELETE_BOOK, item.getId());
        }
    }
}
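If per-item deletes ever become a bottleneck, JdbcTemplate also offers batchUpdate, which sends the whole chunk in one batch; a sketch under the same Book/DELETE_BOOK assumptions as above:
public void write(List<? extends Book> items) throws Exception {
    // one Object[] of statement parameters per item
    List<Object[]> args = new ArrayList<Object[]>();
    for (Book item : items) {
        args.add(new Object[] { item.getId() });
    }
    jdbcTemplate.batchUpdate(DELETE_BOOK, args);
}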

JPA entityManager is null in Pointcut

I have defined a pointcut using the @Aspect annotation in my class.
I configure the pointcut using a custom annotation which I have defined in my context:
<aop:aspectj-autoproxy proxy-target-class="true" />

<!-- Messaging pointcut -->
<bean id="messagePointcut" class="com.adobe.codex.aspects.MessagePointcut">
    <constructor-arg ref="msgPointcutEntityFactory" />
    <property name="buildDao" ref="buildDao" />
</bean>

<!-- enable our own annotation -->
<aop:config proxy-target-class="true">
    <aop:aspect ref="messagePointcut">
        <aop:pointcut id="proxiedMethods" expression="@annotation(com..codex.aspects.annotation.MessageGateway)" />
        <aop:around pointcut-ref="proxiedMethods" method="interceptAnnotatedMethod" />
    </aop:aspect>
</aop:config>
Unfortunately the entityManager inside buildDao is always null if I have a reference to buildDao in my pointcut.
Not sure what the best way to fix this would be.
I'm assuming the problem is that the (load-time) weaving does not know how to create an entityManager from the entityManagerFactory bean.
Here is a snippet of my dao context:
<context:annotation-config />

<bean id="entityManagerFactory" class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean">
    <property name="dataSource" ref="dataSource" />
    <property name="jpaProperties">
        <util:properties location="classpath:com//codex/dao/jpa/hibernate.properties" />
    </property>
</bean>

<bean id="buildDao" class="com..codex.dao.jpa.JpaBuildDao">
    <description>
        A DAO for Builds.
    </description>
    <property name="queryHelper" ref="queryHelper" />
    <property name="partDao" ref="partDao" />
    <property name="buildQueryFactory" ref="buildQueryFactory" />
</bean>
Here is my Pointcut:
@Aspect
@Transactional()
public class MessagePointcut implements Ordered, MsgObservable {
    private MsgPointcutEntityFactory msgEntityFactory;
    private BuildDao buildDao;

    public void setBuildDao(BuildDao buildDao) {
        this.buildDao = buildDao;
    }

    public MessagePointcut(MsgPointcutEntityFactory msgEntityFactory) {
        this.msgEntityFactory = msgEntityFactory;
    }

    @Transactional(readOnly = true)
    public Object interceptAnnotatedMethod(ProceedingJoinPoint pjp) {
        Object returnedEntity = null;
        Object originalEntity = null;
        try {
            // do stuff before executing the call
            originalEntity = msgEntityFactory.fetch(id, Build.class);
            // execute the call
            returnedEntity = pjp.proceed();
            // do stuff after executing the call
            // ...
        } catch (Throwable e) {
            e.printStackTrace();
        }
        return returnedEntity;
    }

    @Override
    public int getOrder() {
        return 2;
    }
}
And a snippet of my dao
@Repository
public class JpaBuildDao implements BuildDao {
    private static final Log log = LogFactory.getLog(JpaBuildDao.class);

    @PersistenceContext
    private EntityManager entityManager;

    private QueryHelper queryHelper;
    private BuildQueryFactory standardQueryFactory;
    private PartDao partDao;

    public Build getFlatBuild(Integer id) {
        Build returnBuild;
        Query query = entityManager.createQuery(
                "SELECT b FROM Build b " +
                "WHERE " +
                "b.id = :id");
        query.setParameter("id", id);
        returnBuild = (Build) query.getSingleResult();
        return returnBuild;
    }
Made some progress. The real issue is that buildDao is injected raw into the pointcut, without the required JPA proxy that instantiates the entityManager.
Turns out the issue only occurs when another config detail comes into the mix. I also have two MethodInvokingFactoryBean instances injecting beans into my pointcut:
<bean id="registerListenerJms"
class="org.springframework.beans.factory.config.MethodInvokingFactoryBean">
<property name="targetObject">
<ref local="messagePointcut" />
</property>
<property name="targetMethod">
<value>registerObserver</value>
</property>
<property name="arguments">
<list>
<ref bean="jmsGateway" />
</list>
</property>
</bean>
<bean id="registerListenerAmf"
class="org.springframework.beans.factory.config.MethodInvokingFactoryBean">
<property name="targetObject">
<ref local="messagePointcut" />
</property>
<property name="targetMethod">
<value>registerObserver</value>
</property>
<property name="arguments">
<list>
<ref bean="amfGateway" />
</list>
</property>
</bean>
When I remove these two beans, my pointcut doesn't get the raw bean; it gets a JdkDynamicAopProxy with a reference to the dao.
I have no clue why MethodInvokingFactoryBean messes up injecting the dao, but it does.
Bottom line: for the time being I'm removing the MethodInvokingFactoryBean beans that implement my observer pattern and living with a dependency of the pointcut on the beans that want to hook in.
Not a complete solution but an acceptable workaround.