Spring WebFlux stream completion consumer - REST

I have a Spring WebFlux stream consumer which calls a REST endpoint, consumes the messages received, and saves them to an RDBMS. I am trying to find a way to batch it. I see that subscribe() has an overload which takes a consumer that is called on completion. I am trying to find out how to get hold of the data when this completion consumer gets called, since my CompletionConsumer is of type Runnable and all I have is the run() method, which takes no parameters.
**CLIENT**
WebClient.create("http://localhost:8080")
        .get()
        .uri("/objects")
        .accept(MediaType.TEXT_EVENT_STREAM)
        .exchange()
        .flatMapMany(clientResponse -> clientResponse.bodyToFlux(MyObject.class))
        .subscribe(null, null, completionProcessorSubscriber);
**COMPLETION SUBSCRIBER**
@Service
public class CompletionProcessorSubscriber implements Runnable {

    @Autowired
    LegacyDAOImpl dao;

    Logger logger = LoggerFactory.getLogger(CompletionProcessorSubscriber.class);

    public void run() {
        logger.info("\ninside RUNNNNNNNNN\n\n");
        // here how to get hold of the data stream ?
    }
}
Below is the documentation from the Flux API:
public final Disposable subscribe(
        @Nullable Consumer<? super T> consumer,
        @Nullable Consumer<? super Throwable> errorConsumer,
        @Nullable Runnable completeConsumer) {
    return subscribe(consumer, errorConsumer, completeConsumer, null);
}

You should avoid putting too much logic into subscriber methods. Instead, you should use the rich set of operators provided by the Flux API.
In this case the operators you need are buffer, to collect batches, and concatMap, to execute the batches sequentially.
In the following example I assume LegacyDAOImpl is a blocking service whose work should be assigned to an appropriate thread pool.
public static void main(String[] args) throws InterruptedException {
    webClient.get()
            .uri("/objects")
            .accept(MediaType.TEXT_EVENT_STREAM)
            .exchange()
            .flatMapMany(clientResponse -> clientResponse.bodyToFlux(MyObject.class))
            .buffer(100) // batch size
            .concatMap(batchOfMyObjects -> Mono.fromRunnable(() -> legacyDAOImpl.saveAll(batchOfMyObjects))
                    .subscribeOn(Schedulers.elastic())) // blocking IO goes to the elastic thread pool
            .subscribe();
}
private static class LegacyDAOImpl {

    public void saveAll(List<MyObject> myObjects) {
        // save here
    }
}

private static class MyObject {
}

Related

How to properly test kafkaTemplate.send() within a function in Junit5?

I'm learning how to write tests, especially tests that have a producer in them. I cannot post all the classes because the project is HUGE (and not mine; I should just practice by changing the test to work with KafkaTemplate). I'm lost as to how a call like this should be tested.
I'm getting an NPE because of a producer.send("topic", JsonObject) call inside the function I'm testing. The function is built like so:
@Autowired
private KafkaTemplate<String, EventDto> kafkaTemplate;

public EventDto sendEvent(Event event) {
    EventDto eventToSend = this.dtoMapper.mapToDto(event, SomeEvent.class);
    this.kafkaTemplate.send("topic", eventToSend);
    return eventToSend;
}
In the unit test it's like this (irrelevant parts omitted):
@Test
void testSendEvent() {
    // omitted lines regarding a second assert that works
    EventProducer producer = new EventProducer(something);
    EventDto dto = producer.sendEvent(Event.newBuilder().build());
    assertThat(dto).isNotNull();
    // there is a second assert here that passes, nothing to do with Kafka
}
We have Mockito, and I assume I need to mock the KafkaTemplate somehow. But I'm not quite getting how I can "direct" sendEvent to use the mocked KafkaTemplate within the producer.sendEvent() call?
Solution edit: I changed the @Autowired field to constructor injection instead. Works well! Here is the full class and method now:
@Service
public class EventProducer implements EventProducerInterface {

    private final DtoMapper dtoMapper;
    private KafkaTemplate<String, EventDto> kafkaTemplate;

    @Autowired
    public EventProducer(KafkaTemplate<String, EventDto> kafkaTemplate, IDtoMapper dtoMapper) {
        Assert.notNull(dtoMapper, "dtoMapper must not be null");
        this.dtoMapper = dtoMapper;
        this.kafkaTemplate = kafkaTemplate;
    }

    public EventDto sendEvent(Event event) {
        EventDto eventToSend = this.dtoMapper.mapToDto(event, EventDto.class);
        this.kafkaTemplate.send("output-topic", eventToSend);
        return eventToSend;
    }
}
You should use constructor injection instead of @Autowired field injection:
private KafkaTemplate<String, EventDto> kafkaTemplate;

public EventProducer(KafkaTemplate<String, EventDto> kafkaTemplate, something) {
    this.kafkaTemplate = kafkaTemplate;
}

public EventDto sendEvent(Event event) {
    EventDto eventToSend = this.dtoMapper.mapToDto(event, SomeEvent.class);
    this.kafkaTemplate.send("topic", eventToSend);
    return eventToSend;
}
This way you can inject a mock in your tests:
@Test
void testSendEvent() {
    // omitted lines regarding a second assert that works
    KafkaTemplate<String, EventDto> templateMock = mock(KafkaTemplate.class);
    EventProducer producer = new EventProducer(templateMock, something);
    EventDto dto = producer.sendEvent(Event.newBuilder().build());
    assertThat(dto).isNotNull();
    // there is a second assert here that passes, nothing to do with Kafka
}
If you can't change the class' constructor, you can provide a mock using @MockBean:
@MockBean
KafkaTemplate<String, EventDto> kafkaTemplate;

@Test
void testSendEvent() {
    // omitted lines regarding a second assert that works
    EventProducer producer = new EventProducer(something);
    EventDto dto = producer.sendEvent(Event.newBuilder().build());
    assertThat(dto).isNotNull();
    // there is a second assert here that passes, nothing to do with Kafka
}
But there's something odd with this design - does the EventProducer class have both @Autowired fields and constructor arguments? Autowiring only works on beans, and usually a class either has a default constructor with @Autowired dependencies, or injects everything through the constructor.
If those options I present do not work for you, please add more details on the class' constructor and overall design.

SpringBoot: Create document in mongodb on startup if not exists

I have a small service on SpringBoot and Mongodb as a DB.
I need to be able to create a small collection with one document (very basic: id, name, status) on startup - an analog of SQL's CREATE TABLE IF NOT EXISTS, but for Mongo. How do I do that?
I tried to initialize values in the document attributes, but it didn't help.
Currently, the collection and the document appear only if I use the API to add them.
You may want to use something like ApplicationRunner or CommandLineRunner which can be defined as a bean.
Example:
@SpringBootApplication
public class MyApplication {

    public static void main(String[] args) {
        SpringApplication.run(MyApplication.class, args);
    }

    @Bean
    public CommandLineRunner initialize(MyRepository myRepository) {
        return args -> {
            // Insert elements into myRepository
        };
    }
}
Both CommandLineRunner and ApplicationRunner are functional interfaces, so we can use a lambda for them. Spring Boot will execute them at the startup of the application.
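A minimal sketch of what the runner body could look like, assuming MyRepository is a Spring Data MongoDB repository and MyDocument is a hypothetical document class with id, name, and status fields (neither name is from the original post); the count check approximates "create if not exists":
@Bean
public CommandLineRunner initialize(MyRepository myRepository) {
    return args -> {
        // Seed the collection only if it is still empty,
        // mirroring CREATE TABLE IF NOT EXISTS semantics.
        if (myRepository.count() == 0) {
            myRepository.save(new MyDocument("id-1", "initial", "NEW"));
        }
    };
}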
You can leverage Spring's internal event mechanism.
When your application is ready, Spring triggers the ApplicationReadyEvent.
You can listen to this event and initialize your collection:
@Component
public class DataInit implements ApplicationListener<ApplicationReadyEvent> {

    private final MyRepository myRepository;

    public DataInit(MyRepository myRepository) {
        this.myRepository = myRepository;
    }

    @Override
    public void onApplicationEvent(ApplicationReadyEvent event) {
        // init data
    }
}

Spring Boot Hibernate Postgresql #Transactional does not rollback [duplicate]

I want to read text data fixtures (CSV files) at the start of my application and put them in my database.
For that, I have created a PopulationService with an initialization method (@PostConstruct annotation).
I also want it to be executed in a single transaction, and hence I added @Transactional on the same method.
However, the @Transactional seems to be ignored:
the transaction is started/stopped by my low-level DAO methods.
Do I need to manage the transaction manually then?
Quote from legacy (closed) Spring forum:
In the @PostConstruct (as with afterPropertiesSet from the InitializingBean interface) there is no way to ensure that all the post-processing is already done, so (indeed) there can be no transactions. The only way to ensure that it works is to use a TransactionTemplate.
So if you would like something in your @PostConstruct to be executed within a transaction, you have to do something like this:
@Service("something")
public class Something {

    @Autowired
    @Qualifier("transactionManager")
    protected PlatformTransactionManager txManager;

    @PostConstruct
    private void init() {
        TransactionTemplate tmpl = new TransactionTemplate(txManager);
        tmpl.execute(new TransactionCallbackWithoutResult() {
            @Override
            protected void doInTransactionWithoutResult(TransactionStatus status) {
                // PUT YOUR CALL TO SERVICE HERE
            }
        });
    }
}
I think @PostConstruct only ensures that the preprocessing/injection of your current class is finished. It does not mean that the initialization of the whole application context is finished.
However, you can use the Spring event system to receive an event when the initialization of the application context is finished:
public class MyApplicationListener implements ApplicationListener<ContextRefreshedEvent> {
    public void onApplicationEvent(ContextRefreshedEvent event) {
        // do startup code ..
    }
}
See the documentation section Standard and Custom Events for more details.
As an update: from Spring 4.2 the @EventListener annotation allows a cleaner implementation:
@Service
public class InitService {

    @Autowired
    MyDAO myDAO;

    @EventListener(ContextRefreshedEvent.class)
    public void onApplicationEvent(ContextRefreshedEvent event) {
        event.getApplicationContext().getBean(InitService.class).initialize();
    }

    @Transactional
    public void initialize() {
        // use the DAO
    }
}
Inject the bean into itself and call the @Transactional method through it:
public class AccountService {

    @Autowired
    private AccountService self;

    @Transactional
    public void resetAllAccounts() {
        //...
    }

    @PostConstruct
    private void init() {
        self.resetAllAccounts();
    }
}
For older Spring versions which do not support self-injection, inject the BeanFactory and get self as beanFactory.getBean(AccountService.class).
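A minimal sketch of that variant, assuming the same AccountService as above:
public class AccountService {

    @Autowired
    private BeanFactory beanFactory;

    @Transactional
    public void resetAllAccounts() {
        //...
    }

    @PostConstruct
    private void init() {
        // Fetch the proxied version of this bean from the container
        // so the @Transactional advice is applied to the call.
        AccountService self = beanFactory.getBean(AccountService.class);
        self.resetAllAccounts();
    }
}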
EDIT
It looks like that, since this solution was posted 1.5 years ago, developers are still under the impression that if a method annotated with @Transactional is called from a @PostConstruct-annotated method invoked upon bean initialization, it won't actually be executed inside of a Spring transaction, and awkward (obsolete?) solutions get discussed and accepted instead of this very simple and straightforward one - and the latter even gets downvoted.
The Doubting Thomases :) are welcome to check out an example Spring Boot application at GitHub which implements the solution described above.
What actually causes the confusion, IMHO: the call to a @Transactional method should be done through a proxied version of the bean where that method is defined.
When a @Transactional method is called from another bean, that other bean usually injects this one and invokes its proxied (e.g. through @Autowired) version of it, and everything is fine.
When a @Transactional method is called from the same bean directly, through a usual Java call, the Spring AOP/proxy machinery is not involved and the method is not executed inside a transaction.
When, as in the suggested solution, a @Transactional method is called from the same bean through the self-injected proxy (the self field), the situation is basically equivalent to case 1.
@Platon Serbin's answer didn't work for me, so I kept searching and found the following answer that saved my life. :D
The answer is here: No Session Hibernate in @PostConstruct, which I took the liberty to transcribe:
@Service("myService")
@Transactional(readOnly = true)
public class MyServiceImpl implements MyService {

    @Autowired
    private MyDao myDao;
    private CacheList cacheList;

    @Autowired
    public void MyServiceImpl(PlatformTransactionManager transactionManager) {
        this.cacheList = (CacheList) new TransactionTemplate(transactionManager).execute(new TransactionCallback() {
            @Override
            public Object doInTransaction(TransactionStatus transactionStatus) {
                CacheList cacheList = new CacheList();
                cacheList.reloadCache(MyServiceImpl.this.myDao.getAllFromServer());
                return cacheList;
            }
        });
    }
}
The transaction part of Spring might not be initialized completely at @PostConstruct.
Use a listener for the ContextRefreshedEvent to ensure that transactions are available:
@Component
public class YourService
        implements ApplicationListener<ContextRefreshedEvent> { // <= ensure correct timing!

    private final YourRepo repo;

    public YourService(YourRepo repo) {
        this.repo = repo;
    }

    @Transactional // <= ensure transaction!
    @Override
    public void onApplicationEvent(ContextRefreshedEvent event) {
        repo.doSomethingWithinTransaction();
    }
}
Using transactionOperations.execute() works both in a @PostConstruct method and in a @NoTransaction method:
@Service
public class ConfigurationService implements ApplicationContextAware {

    private static final Logger LOG = LoggerFactory.getLogger(ConfigurationService.class);

    private ConfigDAO dao;
    private TransactionOperations transactionOperations;

    @Autowired
    public void setTransactionOperations(TransactionOperations transactionOperations) {
        this.transactionOperations = transactionOperations;
    }

    @Autowired
    public void setConfigurationDAO(ConfigDAO dao) {
        this.dao = dao;
    }

    @PostConstruct
    public void postConstruct() {
        try {
            transactionOperations.execute(new TransactionCallbackWithoutResult() {
                @Override
                protected void doInTransactionWithoutResult(final TransactionStatus status) {
                    ResultSet<Config> configs = dao.queryAll();
                }
            });
        } catch (Exception ex) {
            LOG.trace(ex.getMessage(), ex);
        }
    }

    @NoTransaction
    public void saveConfiguration(final Configuration configuration, final boolean applicationSpecific) {
        String name = configuration.getName();
        Configuration original = transactionOperations.execute((TransactionCallback<Configuration>) status ->
                getConfiguration(configuration.getName(), applicationSpecific, null));
    }

    @Override
    public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
    }
}

Asynchronous Controller in Spring Boot

I have created a web service using Spring Boot, and in it there is a REST controller which hits the database via a vendor-based JDBC driver and fetches the records. In this process the number of records retrieved is more than 80K. Because of this, whenever we hit the REST endpoint as a client, we get timeout errors.
I have tried setting up asynchronous calls using the tutorial below, but unfortunately the REST calls are still timing out.
https://howtodoinjava.com/spring-boot2/enableasync-async-controller/
Controller
@RequestMapping(value = "/v1/lr/fullpositionasync", produces = {APPLICATION_JSON_UTF8_VALUE}, method = RequestMethod.GET)
@ResponseBody
public CompletableFuture<List<Position>> retrieveTradePositionsFullAsync(HttpServletRequest request, HttpServletResponse response) throws ExecutionException, InterruptedException {
    CompletableFuture<List<Position>> positionList = null;
    try {
        positionList = positionService.getFullPosition();
    } catch (Exception e) {
        log.info("Error Occurred in Controller is:" + e.getMessage());
    }
    CompletableFuture.allOf(positionList).join();
    log.info(String.valueOf(positionList.get()));
    return positionList;
}
Service
@Service
@Slf4j
public class PositionServiceImpl implements PositionService {

    @Autowired
    private PositionDao positionDao;

    @Async("asyncExecutor")
    @Override
    public CompletableFuture<List<Position>> getFullPosition() {
        List<Position> fullpositionList = null;
        log.info("Getting the full Position process started");
        fullpositionList = positionDao.retrieveData();
        log.info("Total Positions retrieved:" + fullpositionList.size());
        try {
            log.info("Thread is about to sleep 1000 milliseconds");
            Thread.sleep(1000);
        } catch (InterruptedException e) {
            log.info(e.getMessage());
        }
        log.info("Full Positions retrieval completed");
        return CompletableFuture.completedFuture(fullpositionList);
    }
}
Configuration
@Configuration
@EnableAsync
@Slf4j
public class AsyncConfiguration {

    @Bean(name = "asyncExecutor")
    public Executor asyncExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(20);
        executor.setMaxPoolSize(1000);
        executor.setWaitForTasksToCompleteOnShutdown(true);
        executor.setThreadNamePrefix("AsynchThreadForEndPoint-");
        executor.initialize();
        log.info("Executor is :" + executor.toString());
        return executor;
    }
}
DAO
@Repository
public class PositionDaoImpl implements PositionDao {

    @Autowired
    private JdbcTemplate jdbcTemplate;

    private static final String ALL_POSITION_QUERY = "call AllPositionProcedure()";

    public List<Position> retrieveData() {
        return jdbcTemplate.query(ALL_POSITION_QUERY, new BeanPropertyRowMapper(Position.class));
        // List<Map<String, Object>> mapList = jdbcTemplate.queryForList(sql);
    }
}
You can't perform async operations against the database using JDBC. JDBC is blocking, so it will block your thread until the operations have been executed. If you want to execute operations in an async manner, use R2DBC instead of JDBC.
For your use case, the best way is to convert your application to Reactive Streams (Flux).
A Flux is a Reactive Streams Publisher: a fully non-blocking reactive programming foundation for the JVM, with efficient demand management (in the form of managing "backpressure"). It integrates directly with the Java 8 functional APIs, notably CompletableFuture, Stream, and Duration. It offers composable asynchronous sequence APIs, Flux (for [N] elements) and Mono (for [0|1] elements), extensively implementing the Reactive Streams specification.
It is very simple to adopt in an existing app: just change your repository's return type to Flux instead of List or Future, as sketched below.
For more info, you can take reference Here
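A minimal sketch of that change, assuming Spring Data R2DBC is on the classpath and the existing Position class is mapped as an entity (the repository and controller names here are illustrative, not from the original post):
import org.springframework.data.repository.reactive.ReactiveCrudRepository;
import org.springframework.http.MediaType;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Flux;

// Reactive repository: rows are emitted as they arrive instead of
// being collected into one large List.
interface PositionRepository extends ReactiveCrudRepository<Position, Long> {
}

@RestController
class PositionController {

    private final PositionRepository positionRepository;

    PositionController(PositionRepository positionRepository) {
        this.positionRepository = positionRepository;
    }

    // The 80K+ records are streamed to the client rather than buffered,
    // so the request no longer has to wait for the full result set.
    @GetMapping(value = "/v1/lr/fullposition", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
    Flux<Position> retrievePositions() {
        return positionRepository.findAll();
    }
}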

How to properly create Spring Cloud Task with custom parameters?

According to the samples here (specifically, the timestamp task), I have implemented a small task class:
@SpringBootApplication
@EnableTask
@EnableConfigurationProperties({ RestProcessorTaskProperties.class })
public class RestProcessorTaskApplication {

    public static void main(String[] args) {
        SpringApplication.run(RestProcessorTaskApplication.class, args);
    }

    @Autowired
    private RestProcessorTaskProperties config;

    // some fields and beans

    @Bean
    public CommandLineRunner run(RestTemplate restTemplate) {
        return args -> {
            // doing some stuff
        };
    }
}
and then I've created a Properties class (in the same package):
@ConfigurationProperties("RestProcessor")
public class RestProcessorTaskProperties {

    private String host = "http://myhost:port";

    public String getHost() {
        return host;
    }

    public void setHost(String host) {
        this.host = host;
    }
}
But after I registered the task on my local Spring Cloud Data Flow server, I see numerous parameters that, I suppose, were added automatically. I mean parameters like:
abandon-when-percentage-full java.lang.Integer
abandoned-usage-tracking java.lang.Boolean
acceptors java.lang.Integer
access-to-underlying-connection-allowed java.lang.Boolean
and others...
Is it possible to somehow hide (or remove) them, so that when launching the task I could configure only the parameters that were added by me (the single host property in my example above)?
By default Spring Cloud Data Flow will show you all the available properties for a Boot application. However, you can create a whitelist of properties that you wish to show, as sketched below.
Here is a link to the Spring Cloud Data Flow reference doc that discusses how to do this: http://docs.spring.io/spring-cloud-dataflow/docs/current/reference/htmlsingle/#spring-cloud-dataflow-stream-app-whitelisting
And here is a link to the timestamp starter app that has an example of this: https://github.com/spring-cloud/spring-cloud-task-app-starters/tree/master/spring-cloud-starter-task-timestamp
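For illustration, a sketch of the whitelist file following the convention in the linked timestamp example (the com.example package is a placeholder for your own; verify the exact file name and key against the reference doc for your Data Flow version):
# src/main/resources/META-INF/spring-configuration-metadata-whitelist.properties
# Only properties from the classes listed here are shown by
# Spring Cloud Data Flow when configuring the task.
configuration-properties.classes=com.example.RestProcessorTaskProperties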