Spring Batch does not start reading the list from the beginning - scheduled-tasks

I am using a task scheduler to run the batch job every 10 seconds. But on every run the reader begins reading from the 2nd element in the list.
My custom reader:
public class DataReader implements ItemReader<User> {

    @Autowired
    private MainDAO mainDAO;

    private int counter;
    private List<User> userList;

    @PostConstruct
    public void init() {
        this.userList = this.mainDAO.getAllUsers();
    }

    public User read() throws Exception, UnexpectedInputException, ParseException, NonTransientResourceException {
        System.out.println(counter);
        if (counter < userList.size())
            return userList.get(counter++);
        return null;
    }
}
It always starts reading from the 2nd element in the list, although the counter should be zero when the class is created.
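As an aside, whatever is advancing the counter between runs, one way to guarantee the reader starts at index 0 on every execution is to reset its state at the start of each step. This is only a hedged sketch, not necessarily the fix for the root cause here; it relies on Spring Batch auto-registering the reader as a step listener when it implements StepExecutionListener and is referenced by the step.

import org.springframework.batch.core.ExitStatus;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.StepExecutionListener;
import org.springframework.batch.item.ItemReader;
import org.springframework.beans.factory.annotation.Autowired;
import java.util.List;

public class DataReader implements ItemReader<User>, StepExecutionListener {

    @Autowired
    private MainDAO mainDAO;

    private int counter;
    private List<User> userList;

    @Override
    public void beforeStep(StepExecution stepExecution) {
        this.userList = this.mainDAO.getAllUsers(); // reload the data for every execution
        this.counter = 0;                           // always start from the first element
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        return stepExecution.getExitStatus();
    }

    public User read() {
        return counter < userList.size() ? userList.get(counter++) : null;
    }
}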
My Run Scheduler:
@Component
public class RunScheduler {

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private Job job;

    public void run() {
        try {
            String dateParameter = new Date().toString();
            JobParameters parameter = new JobParametersBuilder().addString("date", dateParameter).toJobParameters();
            System.out.println(dateParameter);

            JobExecution execution = this.jobLauncher.run(job, parameter);
            System.out.println("Exit status: " + execution.getStatus());
        } catch (Exception exception) {
            exception.printStackTrace();
        }
    }
}
My Spring configuration for Spring Batch:
<!-- Spring Batch Configuration: Start -->
<bean id="customReader" class="com.arpit.reader.DataReader" />
<bean id="customProcessor" class="com.arpit.processor.DataProcessor" />
<bean id="customWriter" class="com.arpit.writer.DataWriter" />

<batch:job id="invoiceJob">
    <batch:step id="step1">
        <batch:tasklet>
            <batch:chunk reader="customReader" processor="customProcessor" writer="customWriter" commit-interval="1" />
        </batch:tasklet>
    </batch:step>
</batch:job>

<bean id="transactionManager" class="org.springframework.batch.support.transaction.ResourcelessTransactionManager" />

<bean id="jobLauncher" class="org.springframework.batch.core.launch.support.SimpleJobLauncher">
    <property name="jobRepository" ref="jobRepository"/>
</bean>

<bean id="jobRepository" class="org.springframework.batch.core.repository.support.MapJobRepositoryFactoryBean">
    <property name="transactionManager" ref="transactionManager" />
</bean>
<!-- Spring Batch Configuration: End -->

<!-- Task Scheduler Configuration: Start -->
<bean id="runScheduler" class="com.arpit.scheduler.RunScheduler" />

<task:scheduled-tasks>
    <task:scheduled ref="runScheduler" method="run" cron="*/10 * * * * *" />
</task:scheduled-tasks>
<!-- Task Scheduler Configuration: End -->
Can someone please help me figure out what the issue is?

Related

Spring Batch: MultiResourcePartitioner how to set resources lazily

Question: How do I set/inject resources lazily in Java config, the way it is done with the XML config below?
We have a Spring Batch program that currently uses XML configuration to upload multiple files with a MultiResourcePartitioner. This works as intended; see the config below.
<?xml version="1.0" encoding="UTF-8"?>
<beans xmlns="http://www.springframework.org/schema/beans"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       xmlns:batch="http://www.springframework.org/schema/batch"
       xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
       http://www.springframework.org/schema/batch http://www.springframework.org/schema/batch/spring-batch-3.0.xsd">

    <job id="fileLoaderJob" xmlns="http://www.springframework.org/schema/batch">
        <step id="moveFiles" next="batchFileUploader">
            <tasklet ref="moveFilesTasklet" />
        </step>
        <step id="batchFileUploader" parent="batchFileUpload:master">
            <next on="*" to="archiveFiles" />
        </step>
        <step id="archiveFiles">
            <batch:tasklet ref="archiveFilesTasklet" />
        </step>
    </job>

    <!-- This Tasklet moves the file from say Input to Work dir -->
    <bean id="moveFilesTasklet" class="com.spring.batch.fileloader.MoveFilesTasklet" scope="step" />

    <step id="batchFileUpload" xmlns="http://www.springframework.org/schema/batch">
        <tasklet>
            <chunk reader="fileReader" commit-interval="10000" writer="fileWriter" />
        </tasklet>
    </step>

    <bean name="batchFileUpload:master" class="org.springframework.batch.core.partition.support.PartitionStep">
        <property name="jobRepository" ref="jobRepository"/>
        <property name="stepExecutionSplitter">
            <bean class="org.springframework.batch.core.partition.support.SimpleStepExecutionSplitter">
                <constructor-arg ref="jobRepository"/>
                <constructor-arg ref="batchFileUpload"/>
                <constructor-arg>
                    <bean class="org.springframework.batch.core.partition.support.MultiResourcePartitioner" scope="step">
                        <property name="resources" ref="fileResources" />
                    </bean>
                </constructor-arg>
            </bean>
        </property>
        <property name="partitionHandler">
            <bean class="org.springframework.batch.core.partition.support.TaskExecutorPartitionHandler">
                <property name="taskExecutor">
                    <bean class="org.springframework.core.task.SimpleAsyncTaskExecutor">
                        <property name="concurrencyLimit" value="5" />
                    </bean>
                </property>
                <property name="step" ref="batchFileUpload"/>
            </bean>
        </property>
    </bean>

    <bean id="fileResources" class="com.spring.batch.fileloader.fileResources" />

    <bean id="fileReader" class="org.springframework.batch.item.file.FlatFileItemReader" scope="step">
        <property name="resource" value="#{stepExecutionContext[fileName]}" />
        <property name="lineMapper">
            <bean class="org.springframework.batch.item.file.mapping.DefaultLineMapper">
                <property name="lineTokenizer">
                    <bean class="org.springframework.batch.item.file.transform.DelimitedLineTokenizer">
                        <property name="delimiter" value="," />
                    </bean>
                </property>
                <property name="fieldSetMapper">
                    <bean class="...fileFieldSetMapper" />
                </property>
            </bean>
        </property>
    </bean>

    <bean id="fileWriter" class="....fileWriter" scope="step" />
    <bean id="archiveFilesTasklet" class="....ArchiveFilesTasklet" scope="step" />
</beans>
This works well. However, when I try to convert it to Java config, I get the resources as NULL. Here is my config class.
@Configuration
@EnableBatchProcessing
@ComponentScan(basePackages = {"com.spring.batch.fileloader"})
public class SpringBatchConfig {

    @Autowired
    private JobBuilderFactory jobBuilders;

    @Autowired
    private StepBuilderFactory stepBuilders;

    @Autowired
    private DataSource dataSource;

    @Autowired
    private ResourcePatternResolver resourcePatternResolver;

    @Autowired
    private ReadPropertiesFile properties;

    @Bean
    BatchConfigurer configurer(@Qualifier("dataSource") DataSource dataSource) {
        return new DefaultBatchConfigurer(dataSource);
    }

    @Bean(name = "fileLoaderJob")
    public Job csiAuditFileLoaderJob() throws Exception {
        return jobBuilders.get("csiAuditFileLoaderJob")
                .start(moveFiles())
                .next(batchFileUploader())
                .next(archiveFiles())
                .build();
    }

    @Bean
    public Step moveFiles() {
        return stepBuilders.get("moveFiles")
                .tasklet(new MoveFilesTasklet(properties))
                .build();
    }

    @Bean
    @Lazy
    public Step batchFileUploader() throws Exception {
        return stepBuilders.get("batchFileUploader")
                .partitioner(batchFileUploadStep().getName(), partitioner())
                .step(batchFileUploadStep())
                .taskExecutor(taskExecutor())
                .build();
    }

    @Bean
    public Step archiveFiles() {
        return stepBuilders.get("archiveFiles")
                .tasklet(new ArchiveFilesTasklet(properties))
                .build();
    }

    @Bean
    public Step batchFileUploadStep() {
        return stepBuilders.get("batchFileUploadStep")
                .<MyDomain, MyDomain>chunk(10000)
                .reader(fileReader(null))
                .writer(fileWriter())
                .build();
    }

    @Bean
    @Lazy
    public Partitioner partitioner() throws Exception {
        MultiResourcePartitioner partitioner = new MultiResourcePartitioner();
        Resource[] resources;
        try {
            /*
             * The resources are selected from the path to which the previous MoveFilesTasklet moves the files.
             * This returns null, since Spring initializes this bean eagerly, before the step is called for execution.
             */
            resources = resourcePatternResolver.getResources("file:" + properties.getPath() + "/*.csv");
        } catch (IOException e) {
            throw new RuntimeException("I/O problems when resolving the input file pattern.", e);
        }
        partitioner.setResources(resources);
        return partitioner;
    }

    @Bean
    public TaskExecutor taskExecutor() {
        ThreadPoolTaskExecutor taskExecutor = new ThreadPoolTaskExecutor();
        taskExecutor.setMaxPoolSize(5);
        taskExecutor.afterPropertiesSet();
        return taskExecutor;
    }

    @Bean
    @StepScope
    public FlatFileItemReader<MyDomain> fileReader(
            @Value("#{stepExecutionContext['fileName']}") String filename) {
        FlatFileItemReader<MyDomain> reader = new FlatFileItemReader<>();
        DelimitedLineTokenizer tokenizer = new DelimitedLineTokenizer();
        DefaultLineMapper<MyDomain> lineMapper = new DefaultLineMapper<>();
        lineMapper.setLineTokenizer(tokenizer);
        lineMapper.setFieldSetMapper(new MyFieldSetMapper());
        lineMapper.afterPropertiesSet();
        reader.setLineMapper(lineMapper);
        reader.setResource(new PathResource(filename));
        return reader;
    }

    @Bean
    public ItemWriter<MyDomain> fileWriter() {
        return new FileWriter();
    }

    private JobRepository getJobRepository() throws Exception {
        JobRepositoryFactoryBean jobRepositoryFactoryBean = new JobRepositoryFactoryBean();
        jobRepositoryFactoryBean.setDataSource(dataSource);
        jobRepositoryFactoryBean.setTransactionManager(getTransactionManager());
        jobRepositoryFactoryBean.setDatabaseType("MySql");
        jobRepositoryFactoryBean.afterPropertiesSet();
        return jobRepositoryFactoryBean.getObject();
    }

    private PlatformTransactionManager getTransactionManager() {
        return new ResourcelessTransactionManager();
    }

    @Bean
    public JobLauncher jobLauncher() throws Exception {
        SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
        jobLauncher.setJobRepository(getJobRepository());
        jobLauncher.afterPropertiesSet();
        return jobLauncher;
    }
}
Below might be the issue:
1. There is no need to call reader.setResource(new PathResource(filename)) inside the FlatFileItemReader bean, because it overrides the resource set by the partitioner bean.
2. Use a PathMatchingResourcePatternResolver to load the files in the partitioner bean.
Sample code:

    ClassLoader cl = this.getClass().getClassLoader();
    ResourcePatternResolver resolver = new PathMatchingResourcePatternResolver(cl);
    resources = resolver.getResources("file:" + properties.getPath() + "/*.csv");

3. You can also add @StepScope on the partitioner bean (not sure about this).
Hope this helps :)
My 2p with Spring Boot:
@Bean
@StepScope
public Partitioner logsPartitioner(@Value("file:${my.resources.path}/${my.resources.input:*}.csv") Resource[] resources) {
    MultiResourcePartitioner partitioner = new MultiResourcePartitioner();
    partitioner.setResources(resources);
    partitioner.partition(resources.length);
    return partitioner;
}

Transaction doesn't work in AspectJ

I have an aspect (see below) which should log actions (create, update, delete) in the DB. Depending on the action, the logging happens in the preProcess or postProcess method. Nothing should be logged if one of these actions fails, i.e. if the create did not happen, there is no need to log it.
I tried to test this: I throw a RuntimeException in the join point and expect no new log entry in the DB. Unfortunately, the new log entry is saved in spite of the exception in the join point.
Aspect:
@Component
@Aspect
public class LoggingAspect {

    @Autowired
    private ApplicationContext appContext;

    @Autowired
    private LoggingService loggingService;

    @Around("@annotation(Loggable)")
    @Transactional
    public void saveActionMessage(ProceedingJoinPoint joinPoint) throws Throwable {
        MethodSignature ms = (MethodSignature) joinPoint.getSignature();
        Loggable m = ms.getMethod().getAnnotation(Loggable.class);
        LoggingStrategy strategy = appContext.getBean(m.strategy());
        Object argument = joinPoint.getArgs()[0];

        strategy.preProcess(argument);
        joinPoint.proceed();
        strategy.postProcess(argument);
    }
}
TestApplicationConfig:
<context:spring-configured/>

<import resource="applicationConfig-common.xml"/>
<import resource="applicationConfig-security.xml"/>

<aop:aspectj-autoproxy/>

<util:map id="testValues">
    <entry key="com.exadel.mbox.test.testSvnFile" value="${svnFolder.configPath}${svnRoot.file[0].fileName}"/>
    <entry key="com.exadel.mbox.test.testCommonRepositoryPath" value="${svnRoot.commonRepositoryPath}"/>
    <entry key="com.exadel.mbox.test.testMailFile" value="${mailingList.configPath}"/>
</util:map>

<context:component-scan base-package="com.exadel.report.common" />

<!-- Jpa Repositories -->
<jpa:repositories base-package="com.exadel.report.common.dao" />

<tx:annotation-driven proxy-target-class="true"
    transaction-manager="txManager" mode="aspectj"/>

<bean id="txManager"
    class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
    <property name="dataSource" ref="dataSource"/>
</bean>

<!-- Data Source -->
<bean id="dataSource" class="org.springframework.jdbc.datasource.DriverManagerDataSource">
    <property name="driverClassName" value="org.hsqldb.jdbcDriver" />
    <property name="url" value="jdbc:hsqldb:mem:testdb" />
    <property name="username" value="sa" />
    <property name="password" value="" />
</bean>

<!-- Entity Manager -->
<bean id="entityManagerFactory"
    class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean">
    <property name="dataSource" ref="dataSource" />
    <property name="jpaVendorAdapter">
        <bean class="org.springframework.orm.jpa.vendor.HibernateJpaVendorAdapter">
            <property name="showSql" value="true"/>
            <property name="generateDdl" value="true"/>
            <property name="databasePlatform" value="org.hibernate.dialect.HSQLDialect"/>
        </bean>
    </property>
    <property name="persistenceUnitName" value="exviewer-test"/>
</bean>

<!-- Transaction Manager -->
<bean id="transactionManager" class="org.springframework.orm.jpa.JpaTransactionManager">
    <property name="entityManagerFactory" ref="entityManagerFactory" />
</bean>
[Update]
LoggingStrategy:
public interface LoggingStrategy {
    public void preProcess(Object obj);
    public void postProcess(Object obj);
}
BaseLoggingStrategy:
public class BaseLoggingStrategy implements LoggingStrategy {

    @Override
    public void preProcess(Object obj) {}

    @Override
    public void postProcess(Object obj) {}
}
UpdateProcessStrategy:
@Service
public class UpdateProcessStrategy extends BaseLoggingStrategy {

    @Autowired
    private LoggingService loggingService;

    @Autowired
    private UserService userService;

    @Autowired
    DeviceService deviceService;

    private Device currentDevice;

    @Override
    @Transactional
    public void preProcess(Object obj) {
        currentDevice = (Device) obj;
        Device previousDevice = deviceService.getById(currentDevice.getId());

        String deviceDataBeforeUpdate = deviceService.getDeviceDetailsInJSON(previousDevice);
        String deviceDataAfterUpdate = deviceService.getDeviceDetailsInJSON(currentDevice);

        String login = userService.getCurrentUser().getLogin();
        String actionMessage = LoggingMessages.DEVICE_UPDATE.name();

        loggingService.save(
            new Logging(
                login,
                actionMessage,
                deviceDataBeforeUpdate,
                deviceDataAfterUpdate,
                new Date())
        );
    }

    @Override
    public void postProcess(Object obj) {}
}
Class intercepted by the aspect:
@Service
public class DeviceService {

    @Loggable(value = LoggingMessages.DEVICE_CREATE, strategy = CreateProcessStrategy.class)
    @Transactional
    public void create(Device device) {
        createOrUpdate(device);
    }

    @Loggable(value = LoggingMessages.DEVICE_UPDATE, strategy = UpdateProcessStrategy.class)
    @Transactional
    public void update(Device device) {
        createOrUpdate(device);
    }

    private void createOrUpdate(Device device) {
        deviceRepository.save(device);
    }

    @Loggable(value = LoggingMessages.DEVICE_REMOVE, strategy = RemoveProcessStrategy.class)
    public void remove(Long deviceId) {
        deviceRepository.delete(deviceId);
    }
}
Loggable annotation:
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
public @interface Loggable {
    LoggingMessages value();
    Class<? extends LoggingStrategy> strategy();
}
The log entry for the update action contains:
id, created_dtm, action(DEVICE_UPDATE), device_data_before_action_on_the_device(in json format), device_data_after_action_on_the_device(in json format), created_by.
Disclaimer: Actually I am not a Spring expert, so maybe someone else can help you out here. My field of expertise is AspectJ, which is how I found your question.
Anyway, you have two issues here:
1. The @Transactional annotation on your aspect's advice LoggingAspect.saveActionMessage(..). Actually I have no idea whether this works at all (I found no example using @Transactional on an aspect method/advice on the web, but maybe I searched in the wrong way), because declarative transaction handling in Spring is implemented via proxy-based technology, just like Spring AOP. Read chapter 12 about transaction management in the Spring manual for further details, especially section 12.5.1. I am pretty sure you will find a way to do what you want there.
2. Nested transactions: e.g. UpdateProcessStrategy.preProcess(..) is called by the very advice which is meant to be transactional, but it is declared @Transactional itself, so you have a transaction within a transaction. How Spring handles this, I have no idea, but maybe a tutorial about Spring transaction propagation contains enlightening details.
The Spring manual lists several ways to implement transactional behaviour: programmatically, declaratively via annotations, the XML-based <tx:advice> machinery, and so forth. I don't know which way is best for you; I merely wanted to provide some general hints.
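To make the programmatic option concrete, here is a minimal sketch, assuming the txManager (DataSourceTransactionManager) bean from the XML above is the transaction manager you actually want; it is an illustration, not a drop-in fix. The whole pre-process / proceed / post-process sequence runs inside one TransactionTemplate callback, so the log rows written by the strategy are rolled back when the join point throws.

import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.reflect.MethodSignature;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.context.ApplicationContext;
import org.springframework.stereotype.Component;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.support.TransactionTemplate;

@Component
@Aspect
public class LoggingAspect {

    @Autowired
    private ApplicationContext appContext;

    @Autowired
    @Qualifier("txManager") // the DataSourceTransactionManager from the XML config
    private PlatformTransactionManager txManager;

    @Around("@annotation(Loggable)")
    public Object saveActionMessage(final ProceedingJoinPoint joinPoint) {
        TransactionTemplate tx = new TransactionTemplate(txManager);
        return tx.execute(status -> {
            try {
                MethodSignature ms = (MethodSignature) joinPoint.getSignature();
                Loggable loggable = ms.getMethod().getAnnotation(Loggable.class);
                LoggingStrategy strategy = appContext.getBean(loggable.strategy());
                Object argument = joinPoint.getArgs()[0];

                strategy.preProcess(argument);        // writes the log row
                Object result = joinPoint.proceed();  // a failure here propagates...
                strategy.postProcess(argument);
                return result;
            } catch (Throwable t) {
                // ...as a RuntimeException, which makes TransactionTemplate roll back
                // the surrounding transaction, including the log insert
                throw new RuntimeException(t);
            }
        });
    }
}

With the default REQUIRED propagation, the @Transactional preProcess(..) in UpdateProcessStrategy should then join this surrounding transaction rather than opening its own.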

Spring @Transactional not starting with RoutingDataSources

I am using routing data sources, and my create operations are annotated with @Transactional. But I noticed that the transaction does not begin or commit. The following is my routing data source configuration.
<bean id="routingDataSource" class="com.test.dataaccess.base.dao.CustomerRoutingDataSource">
<property name="defaultTargetDataSource" ref="testDataSource" />
<property name="targetDataSources">
<map key-type="java.lang.String">
<entry key="0" value-ref="testDataSource" />
</map>
</property>
</bean>
<bean class="org.springframework.orm.jpa.JpaTransactionManager"
id="customerTransactionManager">
<property name="entityManagerFactory" ref="customerEntityManagerFactory" />
</bean>
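For context, a routing data source such as CustomerRoutingDataSource typically extends Spring's AbstractRoutingDataSource and returns a per-call lookup key that selects one of the targetDataSources entries. This is only a sketch of the usual pattern; the actual class is not shown in the question, and the CustomerContextHolder below is a hypothetical ThreadLocal-based holder.

import org.springframework.jdbc.datasource.lookup.AbstractRoutingDataSource;

public class CustomerRoutingDataSource extends AbstractRoutingDataSource {

    @Override
    protected Object determineCurrentLookupKey() {
        // the returned key ("0", "1", ...) picks an entry from targetDataSources;
        // defaultTargetDataSource is used when the key is null or unknown
        return CustomerContextHolder.getCustomerId(); // hypothetical holder, not part of the question
    }
}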
I am using the same data source with my org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean.
I am adding additional data sources to the routing data source at deployment time as follows, but Spring transaction management does not work.
@Component
public class CustomerDataSourcePostProcessor implements ApplicationListener {

    @Autowired
    DatasourcesDAO datasourcesDAO;

    @Autowired
    @Qualifier("customerEntityManagerFactory")
    private LocalContainerEntityManagerFactoryBean testContentEntityManagerFactory;

    @Autowired
    @Qualifier("routingDataSource")
    private CustomerRoutingDataSource routingDataSource;

    @Autowired
    @Qualifier("customerTransactionManager")
    private JpaTransactionManager customerTransactionManager;

    private static final Logger LOGGER = LoggerFactory.getLogger(CustomerDataSourcePostProcessor.class);

    public void onApplicationEvent(ApplicationEvent e) {
        if (e instanceof ContextRefreshedEvent) {
            loadCustomerDBConfigForServer();
        }
    }

    private void loadCustomerDBConfigForServer() {
        Map<Object, Object> databaseConfig = loadCustomerDatabaseConfig();
        routingDataSource.setTargetDataSources(databaseConfig);
        routingDataSource.afterPropertiesSet();

        testContentEntityManagerFactory.setDataSource(routingDataSource);
        testContentEntityManagerFactory.afterPropertiesSet();

        EntityManagerFactory emf = testContentEntityManagerFactory.getObject(); // transaction not begin possible root cause one
        customerTransactionManager.setEntityManagerFactory(emf);
        customerTransactionManager.afterPropertiesSet();
    }
}

Spring AOP @After annotation not running as expected

I am learning Spring AOP, but I am having issues with the @After annotation.
For some reason, @After gets executed BEFORE the method call.
What am I doing wrong? Is it an issue with my Eclipse environment?
@Aspect
public class LoggingAspect {

    @After("execution(* aop.*.* (..))")
    public void AfterLoggingAdvice() {
        System.out.println("AfterLoggingAdvice() is running");
    }
}
This is my main class:
public static void main(String[] args) {
    ApplicationContext context = new ClassPathXmlApplicationContext("aop.xml");
    ShapeService service = context.getBean("shapeService", ShapeService.class);
    System.out.println(service.getCircle().getName());
}
XML file:
<aop:aspectj-autoproxy/>

<bean id="circle" class="aop.Circle">
    <property name="name" value="Circle Name" />
    <property name="id" value="Circle ID" />
</bean>

<bean id="shapeService" class="aop.ShapeService" autowire="byName"/>

<bean id="loggingAspect" class="aop.LoggingAspect"/>
</beans>
This is the output regardless of using @After or @Before:
AfterLoggingAdvice() is running
AfterLoggingAdvice() is running
Circle Name

JPA entityManager is null in Pointcut

I have defined a pointcut using the @Aspect annotation in my class.
I configure the pointcut using a custom annotation which I have defined in my context:
<aop:aspectj-autoproxy proxy-target-class="true"/>

<!-- Messaging pointcut -->
<bean id="messagePointcut" class="com.adobe.codex.aspects.MessagePointcut">
    <constructor-arg ref="msgPointcutEntityFactory"/>
    <property name="buildDao" ref="buildDao"/>
</bean>

<!-- enable our own annotation -->
<aop:config proxy-target-class="true">
    <aop:aspect ref="messagePointcut">
        <aop:pointcut id="proxiedMethods" expression="@annotation(com..codex.aspects.annotation.MessageGateway)"/>
        <aop:around pointcut-ref="proxiedMethods" method="interceptAnnotatedMethod"/>
    </aop:aspect>
</aop:config>
Unfortunately, the entityManager inside buildDao is always null if I have a reference to buildDao in my pointcut.
I am not sure what the best way to fix this would be.
I'm assuming the problem is that the weaving used (load time) does not know how to create an entityManager from the entityManagerFactory bean.
Here is a snippet of my DAO context.
<context:annotation-config />

<bean id="entityManagerFactory"
    class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean">
    <property name="dataSource" ref="dataSource" />
    <property name="jpaProperties">
        <util:properties
            location="classpath:com//codex/dao/jpa/hibernate.properties" />
    </property>
</bean>

<bean id="buildDao" class="com..codex.dao.jpa.JpaBuildDao">
    <description>
        A DAO for Builds.
    </description>
    <property name="queryHelper" ref="queryHelper" />
    <property name="partDao" ref="partDao" />
    <property name="buildQueryFactory" ref="buildQueryFactory" />
</bean>
Here is my Pointcut:
@Aspect
@Transactional()
public class MessagePointcut implements Ordered, MsgObservable {

    private MsgPointcutEntityFactory msgEntityFactory;
    private BuildDao buildDao;

    public void setBuildDao(BuildDao buildDao) {
        this.buildDao = buildDao;
    }

    public MessagePointcut(MsgPointcutEntityFactory msgEntityFactory) {
        this.msgEntityFactory = msgEntityFactory;
    }

    @Transactional(readOnly = true)
    public Object interceptAnnotatedMethod(ProceedingJoinPoint pjp) {
        Object returnedEntity = null;
        Object originalEntity = null;
        try {
            // do stuff before executing the call
            originalEntity = msgEntityFactory.fetch(id, Build.class);

            // execute the call
            returnedEntity = pjp.proceed();

            // do stuff after executing the call
            // ...
        } catch (Throwable e) {
            e.printStackTrace();
        }
        return returnedEntity;
    }

    @Override
    public int getOrder() {
        return 2;
    }
}
And here is a snippet of my DAO:
@Repository
public class JpaBuildDao implements BuildDao {

    private static final Log log = LogFactory.getLog(JpaBuildDao.class);

    @PersistenceContext
    private EntityManager entityManager;

    private QueryHelper queryHelper;
    private BuildQueryFactory standardQueryFactory;
    private PartDao partDao;

    public Build getFlatBuild(Integer id) {
        Build returnBuild;
        Query query = entityManager.createQuery(
            "SELECT b FROM Build b " +
            "WHERE " +
            "b.id = :id");
        query.setParameter("id", id);
        returnBuild = (Build) query.getSingleResult();
        return returnBuild;
    }
Made some progress: the real issue is that buildDao is injected raw into the pointcut, without the required JPA proxy that instantiates the entityManager.
It turns out the issue only occurs when another configuration detail comes into the mix: I also have two MethodInvokingFactoryBean instances injecting beans into my pointcut:
<bean id="registerListenerJms"
class="org.springframework.beans.factory.config.MethodInvokingFactoryBean">
<property name="targetObject">
<ref local="messagePointcut" />
</property>
<property name="targetMethod">
<value>registerObserver</value>
</property>
<property name="arguments">
<list>
<ref bean="jmsGateway" />
</list>
</property>
</bean>
<bean id="registerListenerAmf"
class="org.springframework.beans.factory.config.MethodInvokingFactoryBean">
<property name="targetObject">
<ref local="messagePointcut" />
</property>
<property name="targetMethod">
<value>registerObserver</value>
</property>
<property name="arguments">
<list>
<ref bean="amfGateway" />
</list>
</property>
</bean>
When I remove these two beans, my pointcut no longer gets the raw proxy; instead it gets a JdkDynamicAopProxy with a reference to the DAO.
I have no clue why MethodInvokingFactoryBean messes up the injection of the DAO, but it does.
The bottom line is that, for the time being, I am removing the MethodInvokingFactoryBeans that implement my observer pattern and living with a dependency of the pointcut on the beans that want to hook in.
Not a complete solution, but an acceptable workaround.
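For what it's worth, a rough sketch of that workaround could look like this. It is hedged: the registerObservers init method, the gateway setters, and the Object parameter type are assumptions for illustration, not code from the project. The idea is simply that the gateways are injected into the pointcut directly and registered from an init callback instead of via MethodInvokingFactoryBean.

import javax.annotation.PostConstruct;
import org.aspectj.lang.annotation.Aspect;
import org.springframework.core.Ordered;

@Aspect
public class MessagePointcut implements Ordered, MsgObservable {

    private final MsgPointcutEntityFactory msgEntityFactory;
    private BuildDao buildDao;

    // direct dependencies on the beans that want to hook in (the trade-off described above);
    // Object stands in for whatever observer type registerObserver actually accepts
    private Object jmsGateway;
    private Object amfGateway;

    public MessagePointcut(MsgPointcutEntityFactory msgEntityFactory) {
        this.msgEntityFactory = msgEntityFactory;
    }

    public void setBuildDao(BuildDao buildDao) { this.buildDao = buildDao; }
    public void setJmsGateway(Object jmsGateway) { this.jmsGateway = jmsGateway; }
    public void setAmfGateway(Object amfGateway) { this.amfGateway = amfGateway; }

    @PostConstruct
    public void registerObservers() {
        registerObserver(jmsGateway); // registerObserver comes from MsgObservable, as in the question
        registerObserver(amfGateway);
    }

    // interceptAnnotatedMethod(..) and getOrder() stay exactly as in the question
}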