Spring Cloud Stream unit test Producer

I'm using Spring Cloud Stream 3.1.3.
I'm migrating from a pre-3.1 version, so I wrote my producer using a java.util.function.Function (I know I could use a Supplier, but this is what I need).
The application.yaml file is configured with the function definition and the input and output bindings.
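Roughly along these lines (a sketch reconstructed from the description above; the destination names are assumptions, not from the original post):
spring:
  cloud:
    function:
      definition: produceMessage
    stream:
      bindings:
        produceMessage-in-0:
          destination: my-topic-in    # assumed destination
        produceMessage-out-0:
          destination: my-topic-out   # assumed destination
And this is what I have: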
@EnableAutoConfiguration
@Service
public class Producer {

    // StreamBridge is used to send the message to the binding at runtime
    @Autowired
    private StreamBridge streamBridge;

    public void produce(int messageId, Object data) {
        Message<Object> message = MessageBuilder
                .withPayload(data)
                .setHeader(PARTITION_KEY, messageId)
                .build();
        streamBridge.send("produceMessage-in-0", message);
    }

    @Bean
    public Function<Message<Object>, Message<Object>> produceMessage() {
        return input -> {
            int messageId = input.getHeaders().get(PARTITION_KEY, Integer.class);
            Object payload = input.getPayload();
            return MessageBuilder
                    .withPayload(payload)
                    .setHeader(MessageHeaders.CONTENT_TYPE, MimeTypeUtils.APPLICATION_JSON)
                    .setHeader(PARTITION_KEY, messageId)
                    .setHeader("type", "MyMessage")
                    .build();
        };
    }
}
Now I would like to test this implementation, so I wrote this test class:
@SpringBootTest
class ProducerTest {

    @Autowired
    private Producer producer;

    @Autowired
    private ObjectMapper objectMapper;

    @Test
    void produceOk() {
        try (ConfigurableApplicationContext context = new SpringApplicationBuilder(
                TestChannelBinderConfiguration.getCompleteConfiguration(Producer.class)).run()) {
            producer.produce(1, new MyMessage(1, "Hello"));
            OutputDestination output = context.getBean(OutputDestination.class);
            Message<byte[]> received = output.receive();
            Assertions.assertNotNull(received);
        }
    }
}
The test fails because output.receive() returns null. Is this the right way to test my code?
Thanks

It is difficult to see what your issue may be since we don't see the entire setup of your project, but here are a few pointers that may help...
Please look at any of the tests we use in the framework, and check out the Testing section of the reference manual.
There is also a dedicated StreamBridgeTests.java, which I believe is what you are looking for.
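As a hedged sketch (my addition, not part of the original answer): one likely culprit is that the Producer is autowired from the outer @SpringBootTest context, while the OutputDestination is read from a second, freshly built context, so the StreamBridge and the test binder never share a context. Keeping everything inside the single context created from TestChannelBinderConfiguration may work; the timeout and binding name below are assumptions:
@Test
void produceOk() {
    try (ConfigurableApplicationContext context = new SpringApplicationBuilder(
            TestChannelBinderConfiguration.getCompleteConfiguration(Producer.class)).run()) {
        // Use the Producer bean from THIS context, so its StreamBridge
        // talks to the same test binder the OutputDestination listens on.
        Producer producer = context.getBean(Producer.class);
        producer.produce(1, new MyMessage(1, "Hello"));

        OutputDestination output = context.getBean(OutputDestination.class);
        // receive(timeout, bindingName) waits up to the timeout
        // instead of returning null immediately
        Message<byte[]> received = output.receive(1000, "produceMessage-out-0");
        Assertions.assertNotNull(received);
    }
}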

Related

How to properly test kafkaTemplate.send() within a function in JUnit 5?

I'm learning how to write tests, especially tests that have a producer in them. I cannot post all the classes because the project is HUGE (and not mine; I should just practice by changing the test to work with KafkaTemplate). I'm lost as to how a call like this should be tested.
I'm getting an NPE because of a producer.send("topic", JsonObject) call inside the function I'm testing. The function is built like so:
@Autowired
private KafkaTemplate<String, EventDto> kafkaTemplate;

public EventDto sendEvent(Event event) {
    EventDto eventToSend = this.dtoMapper.mapToDto(event, SomeEvent.class);
    this.kafkaTemplate.send("topic", eventToSend);
    return eventToSend;
}
In the unit test it looks like this (irrelevant parts omitted):
@Test
void testSendEvent() {
    // omitted lines regarding a second assert that works
    EventProducer producer = new EventProducer(something);
    EventDto dto = producer.sendEvent(Event.newBuilder().build());
    assertThat(dto).isNotNull();
    // there is a second assert here that passes, nothing to do with Kafka
}
We have Mockito, and I assume I need to mock the KafkaTemplate somehow. But I'm not quite getting how I can "direct" sendEvent to use the mocked KafkaTemplate within the producer.sendEvent() call.
Solution edit: I changed the @Autowired field to constructor injection instead. Works well! Here is the full class and method now:
@Service
public class EventProducer implements EventProducerInterface {

    private final IDtoMapper dtoMapper;
    private final KafkaTemplate<String, EventDto> kafkaTemplate;

    @Autowired
    public EventProducer(KafkaTemplate<String, EventDto> kafkaTemplate, IDtoMapper dtoMapper) {
        Assert.notNull(dtoMapper, "dtoMapper must not be null");
        this.dtoMapper = dtoMapper;
        this.kafkaTemplate = kafkaTemplate;
    }

    public EventDto sendEvent(Event event) {
        EventDto eventToSend = this.dtoMapper.mapToDto(event, EventDto.class);
        this.kafkaTemplate.send("output-topic", eventToSend);
        return eventToSend;
    }
}
You should use constructor injection instead of @Autowired:
private KafkaTemplate<String, EventDto> kafkaTemplate;

public EventProducer(KafkaTemplate<String, EventDto> kafkaTemplate, something) {
    this.kafkaTemplate = kafkaTemplate;
}

public EventDto sendEvent(Event event) {
    EventDto eventToSend = this.dtoMapper.mapToDto(event, SomeEvent.class);
    this.kafkaTemplate.send("topic", eventToSend);
    return eventToSend;
}
This way you can inject a mock in your tests:
@Test
void testSendEvent() {
    // omitted lines regarding a second assert that works
    KafkaTemplate<String, EventDto> templateMock = mock(KafkaTemplate.class);
    EventProducer producer = new EventProducer(templateMock, something);
    EventDto dto = producer.sendEvent(Event.newBuilder().build());
    assertThat(dto).isNotNull();
    // there is a second assert here that passes, nothing to do with Kafka
}
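As a small extension (my addition, not part of the original answer), the mock also lets you assert the interaction with Kafka itself, assuming Mockito's static imports (org.mockito.Mockito.verify, org.mockito.ArgumentMatchers.*):
// Verifies that sendEvent() actually handed the mapped DTO to the template
verify(templateMock).send(eq("topic"), any(EventDto.class));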
If you can't change the class' constructor, you can provide a mock using @MockBean:
@MockBean
KafkaTemplate<String, EventDto> kafkaTemplate;

@Test
void testSendEvent() {
    // omitted lines regarding a second assert that works
    EventProducer producer = new EventProducer(something);
    EventDto dto = producer.sendEvent(Event.newBuilder().build());
    assertThat(dto).isNotNull();
    // there is a second assert here that passes, nothing to do with Kafka
}
But there's something odd with this design: does the EventProducer class have both @Autowired fields and constructor arguments? Autowiring only works on beans, and usually a class either has a default constructor with @Autowired dependencies, or injects everything through the constructor.
If the options I presented do not work for you, please add more details on the class's constructor and overall design.

Uploading files retrieved from an FTP server to a MongoDB collection using Spring Boot

I have implemented a Spring Boot application that retrieves files from an FTP server and downloads them into my local directory. The following is the code I used to do that:
@Configuration
public class FTPConfiguration {

    @ServiceActivator(inputChannel = "ftpMGET")
    @Bean
    public FtpOutboundGateway getFiles() {
        FtpOutboundGateway gateway = new FtpOutboundGateway(sf(), "mget", "payload");
        gateway.setAutoCreateDirectory(true);
        gateway.setLocalDirectory(new File("./downloads/"));
        gateway.setFileExistsMode(FileExistsMode.REPLACE_IF_MODIFIED);
        gateway.setFilter(new AcceptOnceFileListFilter<>());
        gateway.setOutputChannelName("fileResults");
        return gateway;
    }

    @Bean
    public MessageChannel fileResults() {
        DirectChannel channel = new DirectChannel();
        channel.addInterceptor(tap());
        return channel;
    }

    @Bean
    public WireTap tap() {
        return new WireTap("logging");
    }

    @ServiceActivator(inputChannel = "logging")
    @Bean
    public LoggingHandler logger() {
        LoggingHandler logger = new LoggingHandler(LoggingHandler.Level.INFO);
        logger.setLogExpressionString("'Files:' + payload");
        return logger;
    }

    @Bean
    public DefaultFtpSessionFactory sf() {
        DefaultFtpSessionFactory sf = new DefaultFtpSessionFactory();
        sf.setHost("localhost");
        sf.setPort(2121);
        sf.setUsername("anonymous");
        sf.setPassword("");
        return sf;
    }

    @MessagingGateway(defaultRequestChannel = "ftpMGET", defaultReplyChannel = "fileResults")
    public interface GateFile {
        List<File> mget(String directory);
    }
}
Now I need these files to be uploaded to my MongoDB automatically when I run this program.
Can anyone please help me or guide me through the steps I should follow?
Please take a look at the MongoDB support in Spring Integration: https://docs.spring.io/spring-integration/docs/current/reference/html/mongodb.html#mongodb-outbound-channel-adapter
You probably need to think about how to build POJOs from the files of that MGET result and send them to that MongoDB channel adapter, to be stored as documents in some collection.
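As a minimal sketch of that idea (my addition, not from the answer; the channel name, collection name, and wiring are illustrative assumptions), a bean added to FTPConfiguration could look like:
@Bean
@ServiceActivator(inputChannel = "mongoStore") // assumed channel, fed with POJOs built from the MGET result
public MessageHandler mongoOutboundAdapter(MongoDatabaseFactory mongoDbFactory) {
    MongoDbStoringMessageHandler handler = new MongoDbStoringMessageHandler(mongoDbFactory);
    handler.setCollectionNameExpression(new LiteralExpression("ftpFiles")); // assumed collection name
    return handler;
}
Anything sent to the mongoStore channel would then be stored as a document in that collection.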

Implementing Projection with Specification in Spring Data JPA

I am trying to implement projection combined with specification in Spring Data JPA via this implementation:
https://github.com/pramoth/specification-with-projection
Related classes are as follows:
Spec:
public class TopicSpec {
    public static Specification<Topic> idEq(String id) {
        return (root, query, cb) -> cb.equal(root.get(Topic_.id), id);
    }
}
Repository
@Repository
public interface TopicRepository extends JpaRepository<Topic, String>, JpaSpecificationExecutorWithProjection<Topic> {

    public static interface TopicSimple {
        String getId();
        String getName();
    }

    List<TopicSimple> findById(String id);
}
Test
@Test
public void specificationWithProjection() {
    Specification<Topic> where = Specifications.where(TopicSpec.idEq("Bir"));
    List<Topic> all = topicRepository.findAll(where);
    Assertions.assertThat(all).isNotEmpty();
}
The GET method returns the expected response; however, the tests fail. Besides, when I pull pramoth's GitHub project, I can run its tests successfully. Does anyone have an opinion about this issue?
The full project can be found here:
https://github.com/dengizik/projectionDemo
I asked the same question of the project's developer, Pramoth Suwanpech, who was kind enough to check my code and give an answer. My test class should have initialized the test data like this:
@Before
public void init() {
    Topic topic = new Topic();
    topic.setId("İki");
    topic.setName("Hello");
    topicRepository.save(topic);
}
With this setup, the tests pass.
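For completeness (my addition, using only the methods declared on the repository above), the interface-based projection can then be exercised like this:
// Fetches only id and name through the TopicSimple projection
List<TopicRepository.TopicSimple> topics = topicRepository.findById("İki");
topics.forEach(t -> System.out.println(t.getId() + ": " + t.getName()));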

Using Spring Batch to write to a Cassandra Database

As of now, I'm able to connect to Cassandra via the following code:
import com.datastax.driver.core.Cluster;
import com.datastax.driver.core.Session;

public static Session connection() {
    Cluster cluster = Cluster.builder()
            .addContactPoints("IP1", "IP2")
            .withCredentials("user", "password")
            .withSSL()
            .build();
    Session session = null;
    try {
        session = cluster.connect("database_name");
        session.execute("CQL Statement");
    } finally {
        IOUtils.closeQuietly(session);
        IOUtils.closeQuietly(cluster);
    }
    return session;
}
The problem is that I need to write to Cassandra in a Spring Batch project. Most of the starter kits seem to use a JdbcBatchItemWriter to write to a MySQL database from a chunk. Is this possible? It seems that a JdbcBatchItemWriter cannot connect to a Cassandra database.
The current ItemWriter code is below:
@Bean
public JdbcBatchItemWriter<Person> writer() {
    JdbcBatchItemWriter<Person> writer = new JdbcBatchItemWriter<Person>();
    writer.setItemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<Person>());
    writer.setSql("INSERT INTO people (first_name, last_name) VALUES (:firstName, :lastName)");
    writer.setDataSource(dataSource);
    return writer;
}
Spring Data Cassandra provides repository abstractions for Cassandra that you should be able to use in conjunction with the RepositoryItemWriter to write to Cassandra from Spring Batch.
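As a rough sketch of that combination (my addition; PersonRepository is an assumed Spring Data Cassandra repository for the Person entity, not something from the question):
// Assumed: public interface PersonRepository extends CassandraRepository<Person, UUID> {}
@Bean
public RepositoryItemWriter<Person> cassandraWriter(PersonRepository repository) {
    RepositoryItemWriter<Person> writer = new RepositoryItemWriter<>();
    writer.setRepository(repository); // delegate chunk writes to Spring Data Cassandra
    writer.setMethodName("save");     // repository method invoked for each item
    return writer;
}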
It is possible to extend Spring Batch to support Cassandra by customising ItemReader and ItemWriter.
ItemWriter example:
// NOTE: "Company" here is a generic type parameter, not a concrete class
public class CassandraBatchItemWriter<Company> implements ItemWriter<Company>, InitializingBean {

    protected static final Log logger = LogFactory.getLog(CassandraBatchItemWriter.class);

    private final Class<Company> aClass;

    @Autowired
    private CassandraTemplate cassandraTemplate;

    public CassandraBatchItemWriter(final Class<Company> aClass) {
        this.aClass = aClass;
    }

    @Override
    public void afterPropertiesSet() throws Exception { }

    @Override
    public void write(final List<? extends Company> items) throws Exception {
        logger.debug("Write operation is performing, the size is " + items.size());
        if (!items.isEmpty()) {
            logger.info("Deleting in a batch performing...");
            cassandraTemplate.deleteAll(aClass);
            logger.info("Inserting in a batch performing...");
            cassandraTemplate.insert(items);
        } else {
            logger.debug("Items list is empty...");
        }
    }
}
Then you can expose it as a @Bean through a @Configuration class:
@Bean
public ItemWriter<Company> writer() {
    // the DataSource parameter from the JDBC version is no longer needed
    return new CassandraBatchItemWriter<Company>(Company.class);
}
The full source code can be found in the GitHub repo: Spring-Batch-with-Cassandra

Spring Integration Email Redelivery on Exception

I have a web service where, via an HTTP GET method, the user requests a person object. This person is sent to a JMS queue, and then, with the help of Spring Integration, I send it to a fake email address (https://papercut.codeplex.com/). I have written the code with the Spring Integration Java DSL. I would like to ask:
Is there a more flexible way to send the email message?
If an exception is thrown, how can the mail be redelivered with the help of Spring Integration? (e.g. retried 5 times, and if it is still not sent, the exception gets handled and the program stops)
Here is my code:
Web Service
public Person findById(Integer id) {
    Person person = jpaPersonRepository.findOne(id);
    jmsTemplate.convertAndSend("testQueue", person);
    return jpaPersonRepository.findOne(id);
}
Java Configuration
@Configuration
@EnableIntegration
@ComponentScan
public class JavaConfig {

    private static final String DEFAULT_BROKER_URL = "tcp://localhost:61616";
    private static final String DEFAULT_QUEUE = "testQueue";

    @Bean
    public ActiveMQConnectionFactory connectionFactory() {
        ActiveMQConnectionFactory connectionFactory = new ActiveMQConnectionFactory();
        connectionFactory.setBrokerURL(DEFAULT_BROKER_URL);
        return connectionFactory;
    }

    @Bean
    public JmsTemplate jmsTemplate() {
        JmsTemplate template = new JmsTemplate();
        template.setConnectionFactory(this.connectionFactory());
        template.setDefaultDestinationName(DEFAULT_QUEUE);
        return template;
    }

    @Bean
    public DefaultMessageListenerContainer defaultMessageListenerContainer() {
        DefaultMessageListenerContainer container = new DefaultMessageListenerContainer();
        container.setDestinationName(DEFAULT_QUEUE);
        container.setConnectionFactory(this.connectionFactory());
        return container;
    }

    @Bean(name = "inputChannel")
    public DirectChannel directChannel() {
        return new DirectChannel();
    }

    @Bean
    public IntegrationFlow orders() {
        return IntegrationFlows
                .from(Jms.messageDrivenChannelAdapter(defaultMessageListenerContainer()))
                .transform(new ObjectToStringTransformer())
                .enrichHeaders(p -> p.header(MailHeaders.TO, "Papercut0@test.com"))
                .handle(Mail.outboundAdapter("127.0.0.1")
                                .credentials("test", "test").port(25)
                                .javaMailProperties(p -> p.put("mail.debug", "true")),
                        e -> e.id("sendMailEndpoint"))
                .get();
    }
}
Is there a more flexible way to send the email message?
Sorry, this part of the question isn't clear. The code you have is already short enough to do that: Mail.outboundAdapter() and all its fluent API. What should be more flexible?
If an exception is thrown, how can the mail be redelivered with the help of Spring Integration?
For this purpose Spring Integration provides the RequestHandlerRetryAdvice, and Mail.outboundAdapter() can be configured with it as follows:
@Bean
public Advice retryAdvice() {
    RequestHandlerRetryAdvice advice = new RequestHandlerRetryAdvice();
    RetryTemplate retryTemplate = new RetryTemplate();
    SimpleRetryPolicy retryPolicy = new SimpleRetryPolicy();
    retryPolicy.setMaxAttempts(5);
    retryTemplate.setRetryPolicy(retryPolicy);
    advice.setRetryTemplate(retryTemplate);
    advice.setRecoveryCallback(new ErrorMessageSendingRecoverer(emailErrorChannel()));
    return advice;
}
...
.handle(Mail.outboundAdapter("127.0.0.1")
                .credentials("test", "test").port(25)
                .javaMailProperties(p -> p.put("mail.debug", "true")),
        e -> e.id("sendMailEndpoint")
              .advice(retryAdvice())) // HERE IS THE TRICK!
See its JavaDocs and Reference Manual on the matter.
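Note that emailErrorChannel() is referenced above but not shown; a minimal declaration (my assumption: a simple channel for failures after the retries are exhausted) could be:
// Receives the ErrorMessage once all retry attempts fail;
// subscribe a handler to it to log the failure and stop processing.
@Bean
public MessageChannel emailErrorChannel() {
    return new DirectChannel();
}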