How to enable Spring Cloud Sleuth in unit tests with MockMvc

We have a spring boot rest api (spring boot 2.3.0.RELEASE) that uses spring cloud sleuth (version 2.2.3.RELEASE).
At some point, we use the trace id from Spring Sleuth as data. The trace id is fetched by autowiring the Tracer bean and then accessing the current span. Let's say we defined a bean SimpleCorrelationBean with:
@Autowired
private Tracer tracer;

public String getCorrelationId() {
    return tracer.currentSpan().context().traceIdString();
}
This seems to work perfectly when running the Spring Boot application, but when we try to access tracer.currentSpan() in the unit tests, it is null. It looks like Spring Cloud Sleuth is not creating any span while running tests.
I think it has something to do with the application context that is set up during the unit test, but I don't know how to enable Spring Cloud Sleuth for the test application context.
Below is a simple test class where the error occurs in simpleTest1. In simpleTest2, no error occurs.
simpleTest1 errors because tracer.currentSpan() is null
@ExtendWith({ RestDocumentationExtension.class, SpringExtension.class })
@SpringBootTest(classes = MusicService.class)
@WebAppConfiguration
@ActiveProfiles("unit-test")
@ComponentScan(basePackageClasses = datacast2.data.JpaConfig.class)
public class SimpleTest {

    private static final Logger logger = LoggerFactory.getLogger(SimpleTest.class);

    @Autowired
    private WebApplicationContext context;

    private MockMvc mockMvc;

    @Autowired
    private FilterChainProxy springSecurityFilterChain;

    @Autowired
    private SimpleCorrelationBean simpleCorrelationBean;

    @Autowired
    private Tracer tracer;

    @BeforeEach
    public void setup(RestDocumentationContextProvider restDocumentation) throws Exception {
        this.mockMvc = MockMvcBuilders.webAppContextSetup(this.context)
                .apply(documentationConfiguration(restDocumentation))
                .addFilter(springSecurityFilterChain)
                .build();
    }

    @Test
    public void simpleTest1() throws Exception {
        try {
            String correlationId = simpleCorrelationBean.getCorrelationId();
        } catch (Exception e) {
            logger.error("This seem to fail.", e);
        }
    }

    @Test
    public void simpleTest2() throws Exception {
        // It looks like Spring Cloud Sleuth is not creating a span, so we create one ourselves
        Span newSpan = this.tracer.nextSpan().name("simpleTest2");
        try (Tracer.SpanInScope ws = this.tracer.withSpanInScope(newSpan.start())) {
            String correlationId = simpleCorrelationBean.getCorrelationId();
        } finally {
            newSpan.finish();
        }
    }
}
The question is: how do we enable Spring Cloud Sleuth for MockMvc during unit tests?

The issue here is that MockMvc is created manually instead of relying on auto-configuration. In this particular case a custom MockMvc configuration might be necessary, but at least with my version of Spring Boot (2.7.6) there is no need to configure MockMvc manually, even though I use Spring Security and Spring Security Test. I couldn't figure out how to enable tracing when configuring MockMvc manually, though.
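A minimal sketch of that approach, assuming Spring Boot 2.7.x with spring-cloud-starter-sleuth on the classpath (the /correlation endpoint and the CorrelationIdTest class name are illustrative placeholders, not from the question):

import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get;
import static org.springframework.test.web.servlet.result.MockMvcResultMatchers.status;

import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.web.servlet.AutoConfigureMockMvc;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.web.servlet.MockMvc;

@SpringBootTest
@AutoConfigureMockMvc // let Spring Boot build MockMvc; registered filter beans (including Sleuth's tracing filter) are applied
class CorrelationIdTest {

    @Autowired
    private MockMvc mockMvc; // auto-configured, no manual webAppContextSetup

    @Test
    void traceIdIsAvailableDuringRequest() throws Exception {
        // During the request the tracing filter has opened a span, so a bean
        // calling tracer.currentSpan() inside the controller should not see null.
        mockMvc.perform(get("/correlation"))
                .andExpect(status().isOk());
    }
}

Outside of a request (as in simpleTest1, which calls the bean straight from the test method) there is still no server span, so creating one manually as in simpleTest2 remains necessary in that situation.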

Related

Running Spring Batch test doesn't initialize database

I am trying to create some end-to-end tests for a Spring Batch application, which works great on its own. I get an SQL error because the Spring Batch metadata tables are not being initialized: org.postgresql.util.PSQLException: ERROR: relation "batch_job_instance" does not exist
I have these properties in src/test/resources/application.properties:
spring.datasource.initialize=true
spring.datasource.initialization-mode=always
spring.datasource.platform=postgresql
spring.batch.initialize-schema=always
This is the same configuration I have in src/main/resources/application.properties, where it works.
This is the code I have for ApplicationTests:
@RunWith(SpringRunner.class)
@ContextConfiguration(classes = {
        TestConfiguration.class,
        JobCompletionNotificationListener.class,
        BatchConfiguration.class
})
@SpringBatchTest
public class ApplicationTests {

    @Autowired
    private JobLauncherTestUtils jobLauncherTestUtils;

    @Test
    public void testJob() throws Exception {
        JobExecution jobExecution = jobLauncherTestUtils.launchJob();
    }
}
I have a specific TestConfiguration to create a bean with the DataSource.
@Configuration
@PropertySource("application.properties")
public class TestConfiguration {

    @Autowired
    private Environment env;

    @Bean
    public DataSource dataSource() {
        DriverManagerDataSource dataSource = new DriverManagerDataSource();
        dataSource.setDriverClassName(env.getProperty("spring.datasource.driverClassname"));
        dataSource.setUrl(env.getProperty("spring.datasource.url"));
        dataSource.setUsername(env.getProperty("spring.datasource.username"));
        dataSource.setPassword(env.getProperty("spring.datasource.password"));
        return dataSource;
    }
}
I was expecting all tables to be created (internal Batch tables and the tables defined in schema-all.sql).
But I get the following error: org.postgresql.util.PSQLException: ERROR: relation "batch_job_instance" does not exist.
I don't understand why everything works automagically in the main application but not in the test.
If a Spring test misses the BatchDataSourceInitializer that Spring Boot auto-configures in the actual application, and you don't want to write a full @SpringBootTest, you can selectively add the Spring Batch auto-configuration with the annotation
@ImportAutoConfiguration(BatchAutoConfiguration.class)
This will then provide the initializer for the injected DataSource.
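Applied to the test class from the question, that looks roughly like the sketch below (assuming the spring.batch.initialize-schema=always property from the test application.properties is picked up; imports other than the two new ones are omitted, as in the question):

import org.springframework.boot.autoconfigure.ImportAutoConfiguration;
import org.springframework.boot.autoconfigure.batch.BatchAutoConfiguration;

@RunWith(SpringRunner.class)
@ContextConfiguration(classes = {
        TestConfiguration.class,
        JobCompletionNotificationListener.class,
        BatchConfiguration.class
})
@SpringBatchTest
// Pulls in the BatchDataSourceInitializer so the BATCH_* metadata tables are
// created against the injected DataSource before any job is launched.
@ImportAutoConfiguration(BatchAutoConfiguration.class)
public class ApplicationTests {

    @Autowired
    private JobLauncherTestUtils jobLauncherTestUtils;

    @Test
    public void testJob() throws Exception {
        JobExecution jobExecution = jobLauncherTestUtils.launchJob();
    }
}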

How to save test data to be consumed by a spring batch integration test

I am trying to use my JPA repositories to save test data into H2 so that it can then be used by a Spring Batch integration test.
Here is my integration test:
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.NONE, classes = Batch.class)
public class MessageDigestMailingStepIT extends AbstractBatchIntegrationTest {

    @Autowired
    @Qualifier("messagesDigestMailingJob")
    private Job messagesDigestMailingJob;

    @Autowired
    private JobLauncher jobLauncher;

    @Autowired
    private JobRepository jobRepository;

    @Autowired
    private UserAccountRepository userAccountRepository;

    @Autowired
    private MessageRepository messageRepository;

    private JobLauncherTestUtils jobLauncherTestUtils;

    @Before
    public void setUp() {
        this.jobLauncherTestUtils = new JobLauncherTestUtils();
        this.jobLauncherTestUtils.setJobLauncher(jobLauncher);
        this.jobLauncherTestUtils.setJobRepository(jobRepository);
        this.jobLauncherTestUtils.setJob(messagesDigestMailingJob);
    }

    @Test
    @Transactional
    public void shouldSendMessageDigestAndUpdateNotificationSent() {
        UserAccount userAccount = DomainFactory.createUserAccount("me@example.com");
        userAccountRepository.save(userAccount);
        JobParameters jobParameters = new JobParametersBuilder().addDate("execution_date", new Date()).toJobParameters();
        jobLauncherTestUtils.launchStep("messagesDigestMailingStep", jobParameters);
        // Assertions
    }
}
Notice the @Transactional on the test method. Unfortunately, Spring Batch uses its own transactions, and my use of @Transactional clashes with them.
Here is the error message I get:
java.lang.IllegalStateException: Existing transaction detected in JobRepository. Please fix this and try again (e.g. remove @Transactional annotations from client).
Can someone please advise how to insert test data so that it is available to a Spring Batch integration test?
edit: For good measure, here is the definition of the AbstractBatchIntegrationTest class:
@AutoConfigureTestEntityManager
@AutoConfigureJson
@AutoConfigureJsonTesters
@RunWith(SpringJUnit4ClassRunner.class)
@ActiveProfiles(Profiles.TEST)
@ComponentScan(basePackages = {"com.bignibou.it.configuration", "com.bignibou.configuration"})
public abstract class AbstractBatchIntegrationTest {
}
edit: I have decided to rely only on the @Sql annotation, as follows:
@Sql(scripts = "insert_message.sql", executionPhase = Sql.ExecutionPhase.BEFORE_TEST_METHOD)
@Sql(scripts = "clean_database.sql", executionPhase = Sql.ExecutionPhase.AFTER_TEST_METHOD)
@Test
public void shouldSendMessageDigestAndUpdateNotificationSent() {
...
Remove @Transactional from the test so that the UserAccount gets persisted to the database immediately. Then use @Sql with ExecutionPhase.AFTER_TEST_METHOD to execute a clean-up script (or an inlined statement) that manually undoes the changes performed during the test.
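A rough sketch of the resulting test method (it slots back into the MessageDigestMailingStepIT class from the question and reuses its fields; clean_database.sql is assumed to delete the rows the test inserts, and @Sql comes from org.springframework.test.context.jdbc):

// No @Transactional here: the saved UserAccount is committed right away and is
// therefore visible to the separate transactions Spring Batch opens for the step.
@Test
@Sql(scripts = "clean_database.sql", executionPhase = Sql.ExecutionPhase.AFTER_TEST_METHOD)
public void shouldSendMessageDigestAndUpdateNotificationSent() {
    UserAccount userAccount = DomainFactory.createUserAccount("me@example.com");
    userAccountRepository.save(userAccount);

    JobParameters jobParameters = new JobParametersBuilder()
            .addDate("execution_date", new Date())
            .toJobParameters();

    jobLauncherTestUtils.launchStep("messagesDigestMailingStep", jobParameters);

    // Assertions on the committed state go here.
}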

Spring RestDocs 1.1: MockMvcRestDocumentation.documentationConfiguration not working with JUnitRestDocumentation

I have configured my REST Docs with JUnit 4 as follows, with Spring Boot 1.4:
@RunWith(SpringJUnit4ClassRunner.class)
@SpringApplicationConfiguration(classes = Application.class)
@WebAppConfiguration
@ActiveProfiles(SPRING_PROFILE_ACTIVE_TEST)
public class CustomerDetailsControllerWACTest {

    @Autowired
    private WebApplicationContext wac;

    @Rule
    public final JUnitRestDocumentation documentation =
            new JUnitRestDocumentation("build/generated-snippets");

    private RestDocumentationResultHandler document;

    MockMvc mockMvc;

    @Before
    public void setUp() throws Exception {
        this.document = document("{method-name}", preprocessRequest(prettyPrint()), preprocessResponse(prettyPrint()));
        this.mockMvc = MockMvcBuilders.webAppContextSetup(wac)
                .apply(documentationConfiguration(this.documentation))
                .alwaysDo(this.document)
                .build();
    }
}
But the error is: The method documentationConfiguration(RestDocumentation) in the type MockMvcRestDocumentation is not applicable for the arguments (JUnitRestDocumentation)
The documentation has the same configuration as mentioned here, but it still shows the above error.
REST Docs dependencies (versions):
spring-restdocs-core 1.1.1 and spring-restdocs-mockmvc 1.0.1
You have an incompatible version mismatch: you should use the same version for spring-restdocs-core and spring-restdocs-mockmvc.
The JUnitRestDocumentation from spring-restdocs-core 1.1.1 cannot be passed to MockMvcRestDocumentation.documentationConfiguration(RestDocumentation), because in version 1.0.1 that method only accepts RestDocumentation. The overload accepting the interface RestDocumentationContextProvider was added in 1.1.
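Once both artifacts are aligned (for example both on 1.1.1), the configuration from the question compiles as written, because JUnitRestDocumentation implements RestDocumentationContextProvider. A minimal sketch of the relevant part of the test class:

@Rule
public final JUnitRestDocumentation documentation =
        new JUnitRestDocumentation("build/generated-snippets");

@Autowired
private WebApplicationContext wac;

private MockMvc mockMvc;

@Before
public void setUp() {
    // With matching 1.1.x versions this call resolves to the overload
    // documentationConfiguration(RestDocumentationContextProvider).
    this.mockMvc = MockMvcBuilders.webAppContextSetup(wac)
            .apply(documentationConfiguration(this.documentation))
            .build();
}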

Spring Cloud Stream and Kafka Integration Error Handling

I am trying to create a Spring Boot application with Spring Cloud Stream and Kafka integration. I created a sample topic in Kafka with one partition and have published to the topic from the Spring Boot application, built based on the directions given here:
http://docs.spring.io/spring-cloud-stream/docs/1.0.2.RELEASE/reference/htmlsingle/index.html
and
https://blog.codecentric.de/en/2016/04/event-driven-microservices-spring-cloud-stream/
Spring Boot App -
@SpringBootApplication
public class MyApplication {

    private static final Log logger = LogFactory.getLog(MyApplication.class);

    public static void main(String[] args) {
        SpringApplication.run(MyApplication.class, args);
    }
}
Kafka Producer Class
@Service
@EnableBinding(Source.class)
public class MyProducer {

    private static final Log logger = LogFactory.getLog(MyProducer.class);

    @Bean
    @InboundChannelAdapter(value = Source.OUTPUT, poller = @Poller(fixedDelay = "10000", maxMessagesPerPoll = "1"))
    public MessageSource<TimeInfo> timerMessageSource() {
        TimeInfo t = new TimeInfo(new Timestamp(new Date().getTime()) + "", "Label");
        MessageBuilder<TimeInfo> m = MessageBuilder.withPayload(t);
        return () -> m.build();
    }

    public static class TimeInfo {

        private String time;
        private String label;

        public TimeInfo(String time, String label) {
            super();
            this.time = time;
            this.label = label;
        }

        public String getTime() {
            return time;
        }

        public String getLabel() {
            return label;
        }
    }
}
All is working well except when I want to handle exceptions.
If the Kafka topic goes down, I can see the ConnectionRefused exception being thrown in the app's log files, but the built-in retry logic seems to keep retrying continuously without stopping!
No exception is thrown at all for me to handle and do further processing. I have read through the producer options and the Kafka binder options in the Spring Cloud Stream documentation above, and I cannot see any customization option that would propagate this exception all the way up for me to capture.
I am new to Spring Boot / Spring Cloud Stream / Spring Integration (which seems to be the underlying implementation of the Cloud Stream project).
Does anyone know another way to get this exception cascaded up to my Spring Cloud Stream app?

Inject EntityManager in SwitchYard Junit implementation

I am trying to implement JUnit tests in a SwitchYard application.
I am using JPA without Camel. I have a persistence.xml with the following details, and I am using the resource producer pattern to expose the EntityManager.
But when I test a service, the EntityManager is null in the DAO layer.
Is there any way I can mock or inject the EntityManager in a SwitchYard JUnit test?
@RunWith(SwitchYardRunner.class)
@SwitchYardTestCaseConfig(config = SwitchYardTestCaseConfig.SWITCHYARD_XML, mixins = {
        CDIMixIn.class, HTTPMixIn.class, NamingMixIn.class })
public class SalesModuleServiceTest {

    private SwitchYardTestKit testKit;
    private CDIMixIn cdiMixIn;
    private HTTPMixIn httpMixIn;
    private static NamingMixIn namingMixIn;
    private TransformerRegistry transformerRegistry;

    @ServiceOperation("SalesModuleService")
    private Invoker service;

    // ------ JUnit test with REST binding fails if no resteasy properties defined ------
    @BeforeDeploy
    public void setProperties() {
        System.setProperty("org.switchyard.component.resteasy.standalone.port", "8081");
        System.setProperty("org.switchyard.component.resteasy.standalone.path", "");
    }

    @Test
    public void testUpdateCustomerStatus() throws Exception {
        SalesDetailsRequest message = null;
        BudgetResponse<?> result = service.operation("updateCustomerStatus")
                .sendInOut(message).getContent(SalesResponse.class);

        // validate the results
        Assert.assertTrue("Implement me", false);
    }
}