Using spring-retry @EnableRetry with spring-batch causes unexpected proxy - spring-batch

I am trying to make use of the spring-retry library in my batch job. On adding the @EnableRetry annotation to my @Configuration class, as the documentation suggests, my application now fails because a spring-batch library bean that is being @Autowired appears to be proxied.
@Configuration
@EnableBatchProcessing
@EnableRetry
@Import({SpringBatchConfiguration.class, School192ClientConfiguration.class })
public class SchoolJobConfiguration { .. }
Exception:
Caused by: org.springframework.beans.factory.BeanNotOfRequiredTypeException: Bean named 'jobRegistry' is expected to be of type 'org.springframework.batch.core.configuration.JobRegistry' but was actually of type 'com.sun.proxy.$Proxy130'
I have added the following to a separate class:
@Retryable(value = School192ClientException.class, maxAttempts = 3, backoff = @Backoff(delay = 2000))
@Override
protected void doReadPage() {
My question is why is this bean (jobRegistry) being proxied? (I don't have much experience with AOP).
I am using spring-boot version 1.5.3.RELEASE.
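For what it's worth, JobRegistry is an interface, so the $Proxy130 in the error is a JDK interface-based proxy wrapped around the registry bean. A minimal sketch of one thing to try, offered purely as an assumption rather than a confirmed fix, is to ask spring-retry for class-based (CGLIB) proxies via the proxyTargetClass attribute of @EnableRetry:

@Configuration
@EnableBatchProcessing
// Assumption: forcing CGLIB (subclass) proxies instead of JDK interface proxies
// may let the jobRegistry bean keep its concrete type for the failing type check.
@EnableRetry(proxyTargetClass = true)
@Import({SpringBatchConfiguration.class, School192ClientConfiguration.class })
public class SchoolJobConfiguration { .. }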

Related

Write/run a JUnit test class (to test actuators) without DataSource bean creation in the Spring Boot container

What are we trying to do?
We are writing a JUnit test for the Spring Boot actuator/admin endpoints, as below.
Code snippet:
ActuatorTests.java
@SpringBootTest(properties = {
        "management.endpoints.web.exposure.include=" })
@ActiveProfiles(profiles = "local")
@AutoConfigureMockMvc
public class ActuatorTests {

    @Autowired
    private MockMvc mockMvc;

    @MockBean
    JwtDecoder jwtDecoder;

    @Test
    public void testActuatorEndpointSuccess() throws Exception {
        MockHttpServletResponse resp = mockMvc
                .perform(MockMvcRequestBuilders.get("/actuator/").accept(MediaType.APPLICATION_JSON)).andReturn()
                .getResponse();
        assertEquals(resp.getStatus(), 200);
    }
}
application-local.yml
This file contains the DataSource URL, username, password and other properties.
What is the issue?
During Spring Boot container startup, a DataSource is created using the data source properties from application-local.yml.
The problem is that I can't rely on application-local.yml, because its properties change from environment to environment and may not always work with the same values; it is also unnecessary for my JUnit test, since the test case is only about testing the management actuator endpoint.
What have we tried?
We ran the test excluding some of the JPA auto-configuration classes, as below.
@SpringBootTest(properties = {
        "management.endpoints.web.exposure.include=" })
@ActiveProfiles(profiles = "local")
@AutoConfigureMockMvc
@EnableAutoConfiguration(exclude = {
        DataSourceAutoConfiguration.class,
        DataSourceTransactionManagerAutoConfiguration.class,
        HibernateJpaAutoConfiguration.class
})
public class ActuatorTests { .....}
But we found the below error in the console.
Note: the error log also contains a chain of bean creation errors from the DAO, service, and controller layer classes; only the tail of the log is given here due to length restrictions.
Caused by: org.springframework.beans.factory.NoSuchBeanDefinitionException: No bean named 'entityManagerFactory' available
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanDefinition(DefaultListableBeanFactory.java:805)
at org.springframework.beans.factory.support.AbstractBeanFactory.getMergedLocalBeanDefinition(AbstractBeanFactory.java:1278)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:297)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:276)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:202)
at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:330)
... 118 common frames omitted
Any help on this?
We can see a similar question has been asked, but no answer was given there:
Run junit without data source persistence context in spring boot
Any other solution to the above actuator JUnit test is also welcome.
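A minimal sketch of one alternative, assuming an embedded database such as H2 is available as a test dependency: keep the JPA auto-configuration, but let Spring Boot swap whatever DataSource the application defines for an in-memory one via @AutoConfigureTestDatabase (from org.springframework.boot.test.autoconfigure.jdbc), so nothing from application-local.yml is required:

@SpringBootTest(properties = {
        "management.endpoints.web.exposure.include=" })
@AutoConfigureMockMvc
// Assumption: an embedded driver (e.g. H2) is on the test classpath;
// Replace.ANY swaps the application's DataSource for an embedded one.
@AutoConfigureTestDatabase(replace = AutoConfigureTestDatabase.Replace.ANY)
public class ActuatorTests {

    @Autowired
    private MockMvc mockMvc;

    @MockBean
    JwtDecoder jwtDecoder;

    @Test
    public void testActuatorEndpointSuccess() throws Exception {
        MockHttpServletResponse resp = mockMvc
                .perform(MockMvcRequestBuilders.get("/actuator/").accept(MediaType.APPLICATION_JSON))
                .andReturn().getResponse();
        assertEquals(resp.getStatus(), 200);
    }
}

Dropping @ActiveProfiles("local") is deliberate in this sketch, since the point is not to depend on application-local.yml at all.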

Spring Batch Test Single Job

I am trying to write an integration test for a Spring Batch application; my project has roughly 10+ jobs.
I want to run only a single job, but I have been unable to achieve this. Any suggestions?
@SpringBatchTest
@RunWith(SpringRunner.class)
@ContextConfiguration(classes = MyApp.class)
@SpringBootTest
@Slf4j
public class JobATest {

    JobLauncherTestUtils jobLauncherTestUtils = new JobLauncherTestUtils();

    @Autowired
    @Qualifier(JOB_A)
    Job joba;

    @Before
    public void setUp() throws Exception {
        log.debug("CAME HERE setUp {} ", joba.getName());
        jobLauncherTestUtils.setJob(joba);
    }

    @After
    public void tearDown() throws Exception {
    }

    @Test
    public void processAJob() throws Exception {
        jobLauncherTestUtils.launchJob();
    }
}
ERROR
Caused by: org.springframework.beans.factory.UnsatisfiedDependencyException:
Error creating bean with name 'jobLauncherTestUtils': Unsatisfied dependency expressed
through method 'setJob' parameter 0; nested exception is
org.springframework.beans.factory.NoUniqueBeanDefinitionException:
No qualifying bean of type 'org.springframework.batch.core.Job' available:
expected single matching bean but found 2: joba, jobb
When using @SpringBatchTest, it is expected that the test context contains a single job bean. This is mentioned in the javadoc of the annotation.
There is an open issue for that which we might consider for the next major release. Please upvote or add a comment if you have a suggestion for an improvement. I also invite you to check the thread on Multiple Job unit testing with #SpringBatchTest which could help you as well.
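In the meantime, a minimal sketch of one workaround, assuming @SpringBatchTest is dropped so its auto-configured JobLauncherTestUtils never has to choose between joba and jobb, is to build the utils by hand and wire the qualified job into it yourself:

@RunWith(SpringRunner.class)
@SpringBootTest(classes = MyApp.class)
public class JobATest {

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private JobRepository jobRepository;

    @Autowired
    @Qualifier(JOB_A)
    private Job joba;

    private JobLauncherTestUtils jobLauncherTestUtils;

    @Before
    public void setUp() {
        // Assembled manually, so no single Job bean needs to be autowired.
        jobLauncherTestUtils = new JobLauncherTestUtils();
        jobLauncherTestUtils.setJobLauncher(jobLauncher);
        jobLauncherTestUtils.setJobRepository(jobRepository);
        jobLauncherTestUtils.setJob(joba);
    }

    @Test
    public void processAJob() throws Exception {
        jobLauncherTestUtils.launchJob();
    }
}

Note that dropping @SpringBatchTest also means giving up its other conveniences (such as the auto-configured JobRepositoryTestUtils), which may or may not matter here.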

Error creating unit test with Spring Cloud Stream using Kafka

I don't know how to create a sample test using Kafka; I tried to follow the Spring guide but it doesn't work.
Can someone help me?
@RunWith(SpringRunner.class)
@SpringBootTest
@DirtiesContext
public class EnrollSenderTest {

    @Autowired
    public EnrollSender producer;

    @Autowired
    private BinderFactory<MessageChannel> binderFactory;

    @Autowired
    private MessageCollector messageCollector;

    @SuppressWarnings("unchecked")
    @Test
    public void test() {
        Message<String> message = new GenericMessage<>("hello");
        producer.sendEnroll(message);
        Message<String> received = (Message<String>) messageCollector.forChannel(producer.getOutput()).poll();
        assertThat(received.getPayload(), equalTo("hello"));
    }
}
And my class Producer is:
@Service
@EnableBinding(Source.class)
public class EnrollSender {

    private final MessageChannel output;

    public EnrollSender(Source output) {
        this.output = output.output();
    }

    public void sendEnroll(Object enroll) {
        output.send(MessageBuilder.withPayload(enroll).build());
    }

    public MessageChannel getOutput() {
        return output;
    }
}
But it gives the following error:
java.lang.IllegalStateException: Failed to load ApplicationContext
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'messageCollector' defined in class path resource [org/springframework/cloud/stream/test/binder/TestSupportBinderAutoConfiguration.class]: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.springframework.cloud.stream.test.binder.MessageCollector]: Factory method 'messageCollector' threw exception; nested exception is java.lang.NoSuchMethodError: org.springframework.cloud.stream.binder.BinderFactory.getBinder(Ljava/lang/String;Ljava/lang/Class;)Lorg/springframework/cloud/stream/binder/Binder;
Caused by: org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.springframework.cloud.stream.test.binder.MessageCollector]: Factory method 'messageCollector' threw exception; nested exception is java.lang.NoSuchMethodError: org.springframework.cloud.stream.binder.BinderFactory.getBinder(Ljava/lang/String;Ljava/lang/Class;)Lorg/springframework/cloud/stream/binder/Binder;
Caused by: java.lang.NoSuchMethodError: org.springframework.cloud.stream.binder.BinderFactory.getBinder(Ljava/lang/String;Ljava/lang/Class;)Lorg/springframework/cloud/stream/binder/Binder;
Marius Bogoevici, my dependencies:
dependencyManagement {
    imports {
        mavenBom "org.springframework.cloud:spring-cloud-dependencies:Camden.SR4"
    }
}
compile 'org.springframework.cloud:spring-cloud-starter-stream-kafka'
compile group: 'org.springframework.cloud', name: 'spring-cloud-stream-test-support', version: '1.1.1.RELEASE'
Looks like you have a mismatched dependency set on the classpath (i.e. an older version of Spring Cloud Stream core).
You can solve this by removing the version for spring-cloud-stream-test-support because the Camden.SR4 BOM will provide the correct one.
Moreover, if you want to test with an embedded Kafka instance, you can find an example here: https://github.com/spring-cloud/spring-cloud-stream-samples/blob/master/multibinder/src/test/java/multibinder/RabbitAndKafkaBinderApplicationTests.java#L57
(The example shows you how to configure the Kafka binder with an embedded broker for testing - it also shows how to use two different binders within the same app, but probably you don't care about that).
This is because of the incompatible versions as pointed out by Marius above.
You would either need Camden.SR5 that has compatible versions of Spring Cloud Stream and Spring Cloud Stream test support or Camden.SR4 with Spring Cloud Stream test support version 1.1.0.RELEASE.
This is the change that went in between 1.1.0.RELEASE and 1.1.1.RELEASE of Spring Cloud Stream:

How to load the Spring application context even if Cassandra is down

When using
@Configuration
@EnableCassandraRepositories(basePackages = {"com.foo"})
public class CassandraConfig {

    @Bean
    public CassandraClusterFactoryBean cluster()
    {
        final CassandraClusterFactoryBean cluster = new CassandraClusterFactoryBean();
        cluster.setContactPoints(nodesRead);
        cluster.setPort(port);
        return cluster;
    }
where in the com.foo package there is an interface that extends CrudRepository.
Is there a way to make it so that at startup time an exception is not thrown if the database is down?
Ideally, the application starts up, and any time a method is called on the repository it first attempts to connect to the database; if the database is still down, it returns an error saying it can't connect.
The behavior I currently observe is that NoHostAvailableException is thrown and the web container does not start up.
I was able to come up with a solution. I removed the @EnableCassandraRepositories(basePackages={"com.foo"}) annotation from my configuration and instead defined a bean in my Config class that returns my repository. Removing @EnableCassandraRepositories allowed lazy loading of the repository. This new bean lets me instantiate the repository using the RepositoryFactorySupport getRepository() method. I annotated the bean as lazy and made sure references to it were also lazy (an example of such an injection point is sketched after the config below).
Assume my repository looks like the following
public interface IBarRepository extends CrudRepository<Bar, BarKey>{}
My Config file now looks like
@Configuration
public class CassandraConfig {

    @Bean
    @Lazy(value = true)
    public IBarRepository barRepository() throws Exception
    {
        final RepositoryFactorySupport support = new CassandraRepositoryFactory(cassandraTemplate());
        return support.getRepository(IBarRepository.class);
    }

    @Bean
    @Lazy(value = true)
    public CassandraClusterFactoryBean cluster()
    {
        final CassandraClusterFactoryBean cluster = new CassandraClusterFactoryBean();
        cluster.setContactPoints(nodesRead);
        cluster.setPort(port);
        return cluster;
    }

    // More beans down here defining things like cluster, mappingContext, session, etc.
}
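As mentioned above, references to the lazy repository bean also need to be lazy. A minimal sketch of what such an injection point could look like; the BarService class and field name here are hypothetical:

@Service
public class BarService {

    // @Lazy on the injection point defers creating the repository
    // (and therefore the Cassandra connection) until it is first used.
    @Lazy
    @Autowired
    private IBarRepository barRepository;
}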

Spring Data error

I have the following class
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = {"classpath:my-ctx.xml"})
public class UserTests {

    @Inject
    private ApplicationContext applicationContext;

    private UserRepository getUserRepository() {
        return (UserRepository) applicationContext.getBean("userRepository", CrudRepository.class);
    }

    @Test
    public void someTest() {
        User user = new User();
        user.setName("John Doe");
        getUserRepository().save(user);
    }
}
Running the test, I get the following error
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'userRepository': FactoryBean threw exception on object creation; nested exception is java.lang.NullPointerException
...
root cause is
org.datanucleus.api.jpa.metamodel.SingularAttributeImpl.isVersion(SingularAttributeImpl.java:79)
org.springframework.data.jpa.repository.support.JpaMetamodelEntityInformation.findVersionAttribute(JpaMetamodelEntityInformation.java:92)
org.springframework.data.jpa.repository.support.JpaMetamodelEntityInformation.<init>(JpaMetamodelEntityInformation.java:78)
org.springframework.data.jpa.repository.support.JpaEntityInformationSupport.getMetadata(JpaEntityInformationSupport.java:65)
org.springframework.data.jpa.repository.support.JpaRepositoryFactory.getEntityInformation(JpaRepositoryFactory.java:146)
org.springframework.data.jpa.repository.support.JpaRepositoryFactory.getTargetRepository(JpaRepositoryFactory.java:84)
org.springframework.data.jpa.repository.support.JpaRepositoryFactory.getTargetRepository(JpaRepositoryFactory.java:67)
...
where VersionMetaData vermd = mmd.getAbstractClassMetaData().getVersionMetaData(); is null.
Is this a bug?
I know that I can use something like @Inject UserRepository userRepository;, but taking into account how Spring Data works, these two should have the same result, right? And anyway, the result is the same error.
I'm using Spring Data 1.4.1, DataNucleus 3.3.2, Spring 3.2.4.
Actually, this is a DataNucleus bug and I filed a bug report (with a test and a fix patch included): http://www.datanucleus.org/servlet/jira/browse/NUCJPA-250.
My workaround was to switch back to Spring Data 1.3.0.