Spring Batch Test Single Job - spring-batch

I am trying to write an integration test for a Spring Batch application. My project has roughly 10+ jobs, but I want to run only a single job in the test and have been unable to achieve that. Any suggestions?
@SpringBatchTest
@RunWith(SpringRunner.class)
@ContextConfiguration(classes = MyApp.class)
@SpringBootTest
@Slf4j
public class JobATest {

    JobLauncherTestUtils jobLauncherTestUtils = new JobLauncherTestUtils();

    @Autowired
    @Qualifier(JOB_A)
    Job joba;

    @Before
    public void setUp() throws Exception {
        log.debug("CAME HERE setUp {} ", joba.getName());
        jobLauncherTestUtils.setJob(joba);
    }

    @After
    public void tearDown() throws Exception {
    }

    @Test
    public void processAJob() throws Exception {
        jobLauncherTestUtils.launchJob();
    }
}
ERROR
Caused by: org.springframework.beans.factory.UnsatisfiedDependencyException:
Error creating bean with name 'jobLauncherTestUtils': Unsatisfied dependency expressed
through method 'setJob' parameter 0; nested exception is
org.springframework.beans.factory.NoUniqueBeanDefinitionException:
No qualifying bean of type 'org.springframework.batch.core.Job' available:
expected single matching bean but found 2: joba,jobb

When using @SpringBatchTest, it is expected that the test context contains a single job bean. This is mentioned in the Javadoc of the annotation.
There is an open issue for that which we might consider for the next major release. Please upvote or add a comment if you have a suggestion for an improvement. I also invite you to check the thread on Multiple Job unit testing with @SpringBatchTest, which could help you as well.
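In the meantime, a workaround that is sometimes used (a sketch, not part of the answer above; it assumes JobLauncher and JobRepository beans are provided by your batch configuration) is to drop @SpringBatchTest and wire up JobLauncherTestUtils yourself with the one job under test, so the auto-configured utility never has to resolve a unique Job bean:

@RunWith(SpringRunner.class)
@SpringBootTest(classes = MyApp.class)
public class JobATest {

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private JobRepository jobRepository;

    @Autowired
    @Qualifier(JOB_A) // same constant as in the question
    private Job joba;

    private JobLauncherTestUtils jobLauncherTestUtils;

    @Before
    public void setUp() {
        // build the utility manually instead of relying on the auto-registered bean
        jobLauncherTestUtils = new JobLauncherTestUtils();
        jobLauncherTestUtils.setJobLauncher(jobLauncher);
        jobLauncherTestUtils.setJobRepository(jobRepository);
        jobLauncherTestUtils.setJob(joba);
    }

    @Test
    public void processAJob() throws Exception {
        jobLauncherTestUtils.launchJob();
    }
}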

Related

Write/Run JUnit test class (to test actuators) without DataSource bean creation in Spring Boot container

What are we trying to do?
Writing a JUnit test for the Spring Boot actuator/admin endpoints, as below.
Code snippet:
ActuatorTests.java
@SpringBootTest(properties = {
        "management.endpoints.web.exposure.include=" })
@ActiveProfiles(profiles = "local")
@AutoConfigureMockMvc
public class ActuatorTests {

    @Autowired
    private MockMvc mockMvc;

    @MockBean
    JwtDecoder jwtDecoder;

    @Test
    public void testActuatorEndpointSuccess() throws Exception {
        MockHttpServletResponse resp = mockMvc
                .perform(MockMvcRequestBuilders.get("/actuator/").accept(MediaType.APPLICATION_JSON)).andReturn()
                .getResponse();
        assertEquals(resp.getStatus(), 200);
    }
}
application-local.yml
This file contains the DataSource URL, username, password, and other properties.
What is the issue?
During Spring Boot container startup, a DataSource is created from the data source properties in application-local.yml.
The problem is that I can't rely on application-local.yml, because its properties change from environment to environment and may not always work with the same values; it is also unnecessary for my JUnit test, since the test case only exercises the management actuator endpoint.
What have we tried?
Running the test while excluding some of the JPA auto-configuration classes, as below.
@SpringBootTest(properties = {
        "management.endpoints.web.exposure.include=" })
@ActiveProfiles(profiles = "local")
@AutoConfigureMockMvc
@EnableAutoConfiguration(exclude = {
        DataSourceAutoConfiguration.class,
        DataSourceTransactionManagerAutoConfiguration.class,
        HibernateJpaAutoConfiguration.class
})
public class ActuatorTests { ..... }
But the below error appeared in the console.
Note: the error log also contains a chain of bean creation errors from the DAO, service, and controller layer classes; I have included only the tail of the log due to length restrictions.
Caused by: org.springframework.beans.factory.NoSuchBeanDefinitionException: No bean named 'entityManagerFactory' available
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanDefinition(DefaultListableBeanFactory.java:805)
at org.springframework.beans.factory.support.AbstractBeanFactory.getMergedLocalBeanDefinition(AbstractBeanFactory.java:1278)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:297)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:276)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:202)
at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:330)
... 118 common frames omitted
Any help on this?
We can see a similar question has been asked, but no answer was found in it:
Run junit without data source persistence context in spring boot
Any other solution to the above actuator test is also welcome.
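One direction that might work (a sketch, not from this thread; it assumes an embedded database such as H2 is on the test classpath) is to keep the JPA beans but let Spring Boot swap the configured DataSource for an embedded one with @AutoConfigureTestDatabase (from org.springframework.boot.test.autoconfigure.jdbc), so the test no longer depends on the environment-specific connection values in application-local.yml:

@SpringBootTest(properties = {
        "management.endpoints.web.exposure.include=" })
@ActiveProfiles(profiles = "local")
@AutoConfigureMockMvc
@AutoConfigureTestDatabase // replaces the configured DataSource with an embedded one for the test
public class ActuatorTests {

    @Autowired
    private MockMvc mockMvc;

    @MockBean
    JwtDecoder jwtDecoder;

    // ... same testActuatorEndpointSuccess() as above
}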

Using spring-retry @EnableRetry with spring-batch causes unexpected proxy

I am trying to make use of the spring-retry library in my batch job. On adding the @EnableRetry annotation to my @Configuration class, as the documentation suggests, my application now fails because a spring-batch library bean that is @Autowired appears to be proxied.
@Configuration
@EnableBatchProcessing
@EnableRetry
@Import({ SpringBatchConfiguration.class, School192ClientConfiguration.class })
public class SchoolJobConfiguration { .. }
Exception:
Caused by: org.springframework.beans.factory.BeanNotOfRequiredTypeException: Bean named 'jobRegistry' is expected to be of type 'org.springframework.batch.core.configuration.JobRegistry' but was actually of type 'com.sun.proxy.$Proxy130'
I have added the following to a separate class:
@Retryable(value = School192ClientException.class, maxAttempts = 3, backoff = @Backoff(delay = 2000))
@Override
protected void doReadPage() {
My question is: why is this bean (jobRegistry) being proxied? (I don't have much experience with AOP.)
I am using spring-boot version 1.5.3.RELEASE.
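Not an answer from this thread, but one commonly suggested workaround for "expected to be of type ... but was actually of type com.sun.proxy.$Proxy..." errors is to force class-based (CGLIB) proxies so the concrete bean type is preserved; @EnableRetry exposes a proxyTargetClass attribute for this. Whether it addresses the underlying cause here is not confirmed, so treat it as something to try:

@Configuration
@EnableBatchProcessing
@EnableRetry(proxyTargetClass = true) // use CGLIB proxies instead of JDK interface-based proxies
@Import({ SpringBatchConfiguration.class, School192ClientConfiguration.class })
public class SchoolJobConfiguration { .. }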

Error creating unit test with Spring Cloud Stream using Kafka

I don't know how to write a sample test using Kafka; I tried to follow the Spring guide but it doesn't work.
Can someone help me?
@RunWith(SpringRunner.class)
@SpringBootTest
@DirtiesContext
public class EnrollSenderTest {

    @Autowired
    public EnrollSender producer;

    @Autowired
    private BinderFactory<MessageChannel> binderFactory;

    @Autowired
    private MessageCollector messageCollector;

    @SuppressWarnings("unchecked")
    @Test
    public void test() {
        Message<String> message = new GenericMessage<>("hello");
        producer.sendEnroll(message);
        Message<String> received = (Message<String>) messageCollector.forChannel(producer.getOutput()).poll();
        assertThat(received.getPayload(), equalTo("hello"));
    }
}
And my producer class is:
@Service
@EnableBinding(Source.class)
public class EnrollSender {

    private final MessageChannel output;

    public EnrollSender(Source output) {
        this.output = output.output();
    }

    public void sendEnroll(Object enroll) {
        output.send(MessageBuilder.withPayload(enroll).build());
    }

    public MessageChannel getOutput() {
        return output;
    }
}
But it gives the following error:
java.lang.IllegalStateException: Failed to load ApplicationContext
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'messageCollector' defined in class path resource [org/springframework/cloud/stream/test/binder/TestSupportBinderAutoConfiguration.class]: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.springframework.cloud.stream.test.binder.MessageCollector]: Factory method 'messageCollector' threw exception; nested exception is java.lang.NoSuchMethodError: org.springframework.cloud.stream.binder.BinderFactory.getBinder(Ljava/lang/String;Ljava/lang/Class;)Lorg/springframework/cloud/stream/binder/Binder;
Caused by: org.springframework.beans.BeanInstantiationException: Failed to instantiate [org.springframework.cloud.stream.test.binder.MessageCollector]: Factory method 'messageCollector' threw exception; nested exception is java.lang.NoSuchMethodError: org.springframework.cloud.stream.binder.BinderFactory.getBinder(Ljava/lang/String;Ljava/lang/Class;)Lorg/springframework/cloud/stream/binder/Binder;
Caused by: java.lang.NoSuchMethodError: org.springframework.cloud.stream.binder.BinderFactory.getBinder(Ljava/lang/String;Ljava/lang/Class;)Lorg/springframework/cloud/stream/binder/Binder;
Marius Bogoevici, here are my dependencies:
dependencyManagement {
    imports {
        mavenBom "org.springframework.cloud:spring-cloud-dependencies:Camden.SR4"
    }
}
compile 'org.springframework.cloud:spring-cloud-starter-stream-kafka'
compile group: 'org.springframework.cloud', name: 'spring-cloud-stream-test-support', version: '1.1.1.RELEASE'
Looks like you have a mismatched dependency set on the classpath (i.e. an older version of Spring Cloud Stream core).
You can solve this by removing the version for spring-cloud-stream-test-support because the Camden.SR4 BOM will provide the correct one.
Moreover, if you want to test with an embedded Kafka instance, you can find an example here: https://github.com/spring-cloud/spring-cloud-stream-samples/blob/master/multibinder/src/test/java/multibinder/RabbitAndKafkaBinderApplicationTests.java#L57
(The example shows you how to configure the Kafka binder with an embedded broker for testing - it also shows how to use two different binders within the same app, but probably you don't care about that).
This is because of the incompatible versions, as pointed out by Marius above.
You would need either Camden.SR5, which has compatible versions of Spring Cloud Stream and Spring Cloud Stream test support, or Camden.SR4 with Spring Cloud Stream test support version 1.1.0.RELEASE.
This is the change that went in between 1.1.0.RELEASE and 1.1.1.RELEASE of Spring Cloud Stream:

Spring Data error

I have the following class
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(locations = { "classpath:my-ctx.xml" })
public class UserTests {

    @Inject
    private ApplicationContext applicationContext;

    private UserRepository getUserRepository() {
        return (UserRepository) applicationContext.getBean("userRepository", CrudRepository.class);
    }

    @Test
    public void someTest() {
        User user = new User();
        user.setName("John Doe");
        getUserRepository().save(user);
    }
}
Running the test, I get the following error
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'userRepository': FactoryBean threw exception on object creation; nested exception is java.lang.NullPointerException
...
root cause is
org.datanucleus.api.jpa.metamodel.SingularAttributeImpl.isVersion(SingularAttributeImpl.java:79)
org.springframework.data.jpa.repository.support.JpaMetamodelEntityInformation.findVersionAttribute(JpaMetamodelEntityInformation.java:92)
org.springframework.data.jpa.repository.support.JpaMetamodelEntityInformation.<init>(JpaMetamodelEntityInformation.java:78)
org.springframework.data.jpa.repository.support.JpaEntityInformationSupport.getMetadata(JpaEntityInformationSupport.java:65)
org.springframework.data.jpa.repository.support.JpaRepositoryFactory.getEntityInformation(JpaRepositoryFactory.java:146)
org.springframework.data.jpa.repository.support.JpaRepositoryFactory.getTargetRepository(JpaRepositoryFactory.java:84)
org.springframework.data.jpa.repository.support.JpaRepositoryFactory.getTargetRepository(JpaRepositoryFactory.java:67)
...
where VersionMetaData vermd = mmd.getAbstractClassMetaData().getVersionMetaData(); is null.
Is this a bug?
I know that I can put something like @Inject UserRepository userRepository;, but taking into account how Spring Data works, these two should have the same result, right? And in any case, the result is the same error.
I'm using Spring data 1.4.1, DataNucleus 3.3.2, Spring 3.2.4.
Actually, this is a DataNucleus bug, and I filed a bug report (with a test and fix patch included): http://www.datanucleus.org/servlet/jira/browse/NUCJPA-250.
My workaround was to switch back to Spring Data 1.3.0.

javax.inject.Qualifier Spring JavaConfig

I have the following code.
The two javax.inject qualifiers:
@Qualifier
@Target(value = { ElementType.FIELD, ElementType.TYPE, ElementType.PARAMETER })
@Retention(RetentionPolicy.RUNTIME)
public @interface Hibernate {
    // nothing goes here
}

@Qualifier
@Target(value = { ElementType.FIELD, ElementType.TYPE, ElementType.PARAMETER })
@Retention(RetentionPolicy.RUNTIME)
public @interface Toplink {
    // nothing goes here
}
I qualify the repositories:
@Named
@Hibernate
public class HibernateRepository implements IRepository {
    // some code
}

@Named
@Toplink
public class ToplinkRepository implements IRepository {
    // some code
}
These repositories are injected using javax.inject.Inject:
public class InvoiceService {

    @Inject
    // @Hibernate  (I alternate between the two to test)
    @Toplink
    private IRepository iRepository;

    public void saveInvoice(Invoice invoice) {
        iRepository.save(invoice);
    }
}
Using the following configuration class:
@Configuration
public class Myconfig {

    @Bean
    public IRepository getHibernateRepository() {
        return new HibernateRepository();
    }

    @Bean
    public InvoiceService getInvoiceService() {
        return new InvoiceService();
    }

    @Bean
    public IRepository getToplinkRepository() {
        return new ToplinkRepository();
    }
}
This code works perfectly fine when I use XML configuration. Any idea how to get it working with Java config? Or is there something fundamentally wrong in my code? When used, it throws the following exception:
Exception in thread "main"
org.springframework.beans.factory.BeanCreationException: Error
creating bean with name 'getInvoiceService': Injection of autowired
dependencies failed; nested exception is
org.springframework.beans.factory.BeanCreationException: Could not
autowire field: private com.domain.IRepository
com.service.InvoiceService.iRepository; nested exception is
org.springframework.beans.factory.NoSuchBeanDefinitionException: No
matching bean of type [com.domain.IRepository] found for dependency:
expected at least 1 bean which qualifies as autowire candidate for
this dependency. Dependency annotations: {@javax.inject.Inject(),
@com.domain.Toplink()}
Thanks in anticipation.
In the case of @Bean methods, it's the return type that counts. Even though you may be returning a ToplinkRepository from one method and a HibernateRepository from another, from the container's point of view all it knows is that there are two beans of type IRepository, and it therefore does not understand that one is @Toplink annotated and one is @Hibernate annotated.
You have several choices here. The simplest, given your current configuration, would be to change the return types to make them more specific.
The second is to leave the return types generic, but move the @Toplink and @Hibernate qualifier annotations to the @Bean method level, as sketched below.
The third is to component-scan for the repository types instead of declaring them as @Bean methods.
The third approach is generally recommended, given that you're already using @Inject on the repository components and have them marked with @Named. This makes them natural candidates for component scanning in the first place. Check out the Javadoc for @ComponentScan to see how to do this in the @Configuration class world.
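A minimal sketch of the second option (my example, not the answer's code; note the qualifier annotations would also need ElementType.METHOD added to their @Target so they can be placed on @Bean methods):

@Configuration
public class Myconfig {

    @Bean
    @Hibernate // qualifier declared on the @Bean method itself
    public IRepository getHibernateRepository() {
        return new HibernateRepository();
    }

    @Bean
    @Toplink
    public IRepository getToplinkRepository() {
        return new ToplinkRepository();
    }

    @Bean
    public InvoiceService getInvoiceService() {
        return new InvoiceService();
    }
}

With the methods qualified this way, the container can match @Inject @Toplink IRepository to the @Toplink-annotated @Bean method even though both methods still return IRepository.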