I have created a web service using Spring Boot. It has a REST controller that hits the database via a vendor-based JDBC driver and fetches records. The query retrieves more than 80K records, so whenever we hit the REST endpoint as a client, we get timeout errors.
I have tried setting up asynchronous calls using the tutorial below, but unfortunately the REST calls are still timing out.
https://howtodoinjava.com/spring-boot2/enableasync-async-controller/
Controller
@RequestMapping(value = "/v1/lr/fullpositionasync", produces = {APPLICATION_JSON_UTF8_VALUE}, method = RequestMethod.GET)
@ResponseBody
public CompletableFuture<List<Position>> retrieveTradePositionsFullAsync(HttpServletRequest request, HttpServletResponse response) throws ExecutionException, InterruptedException {
    CompletableFuture<List<Position>> positionList = null;
    try {
        positionList = positionService.getFullPosition();
    } catch (Exception e) {
        log.info("Error Occurred in Controller is:" + e.getMessage());
    }
    CompletableFuture.allOf(positionList).join();
    log.info(String.valueOf(positionList.get()));
    return positionList;
}
Service
@Service
@Slf4j
public class PositionServiceImpl implements PositionService {

    @Autowired
    private PositionDao positionDao;

    @Async("asyncExecutor")
    @Override
    public CompletableFuture<List<Position>> getFullPosition() {
        List<Position> fullpositionList = null;
        log.info("Getting the full Position process started");
        fullpositionList = positionDao.retrieveData();
        log.info("Total Positions retrieved:" + fullpositionList.size());
        try {
            log.info("Thread is about to sleep 1000 milliseconds");
            Thread.sleep(1000);
        } catch (InterruptedException e) {
            log.info(e.getMessage());
        }
        log.info("Full Positions retrieval completed");
        return CompletableFuture.completedFuture(fullpositionList);
    }
}
Configuration
@Configuration
@EnableAsync
@Slf4j
public class AsyncConfiguration {

    @Bean(name = "asyncExecutor")
    public Executor asyncExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(20);
        executor.setMaxPoolSize(1000);
        executor.setWaitForTasksToCompleteOnShutdown(true);
        executor.setThreadNamePrefix("AsynchThreadForEndPoint-");
        executor.initialize();
        log.info("Executor is :" + executor.toString());
        return executor;
    }
}
DAO
@Repository
public class PositionDaoImpl implements PositionDao {

    @Autowired
    private JdbcTemplate jdbcTemplate;

    private static final String ALL_POSITION_QUERY = "call AllPositionProcedure()";

    public List<Position> retrieveData() {
        return jdbcTemplate.query(ALL_POSITION_QUERY, new BeanPropertyRowMapper<>(Position.class));
        // List<Map<String, Object>> mapList = jdbcTemplate.queryForList(sql);
    }
}
You can't perform async operations against the database using JDBC. JDBC is blocking, so it will block your thread until the operation has executed. If you want to execute operations asynchronously, use R2DBC instead of JDBC.
For your use case, the best approach is to convert your application to Reactive Streams (Flux).
A Flux is a Reactive Streams Publisher. It is a fully non-blocking reactive programming foundation for the JVM, with efficient demand management (in the form of managing "backpressure"). It integrates directly with the Java 8 functional APIs, notably CompletableFuture, Stream, and Duration. It offers composable asynchronous sequence APIs, Flux (for [N] elements) and Mono (for [0|1] elements), extensively implementing the Reactive Streams specification.
It is very simple to adopt in an existing app: just change your repository's return type to Flux instead of List or Future.
For more info, you can take a look at the Project Reactor reference documentation.
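For illustration, here is a minimal sketch of what that could look like, assuming Spring Data R2DBC is on the classpath (the repository interface and controller below are hypothetical, not taken from your code):

import org.springframework.data.repository.reactive.ReactiveCrudRepository;
import org.springframework.http.MediaType;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Flux;

public interface PositionRepository extends ReactiveCrudRepository<Position, Long> {
}

@RestController
public class ReactivePositionController {

    private final PositionRepository positionRepository;

    public ReactivePositionController(PositionRepository positionRepository) {
        this.positionRepository = positionRepository;
    }

    // Streaming rows as they are emitted avoids buffering all 80K records in
    // memory and keeps data flowing to the client, so the request is far less
    // likely to idle into a timeout.
    @GetMapping(value = "/v1/lr/fullpositionasync", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
    public Flux<Position> retrieveTradePositions() {
        return positionRepository.findAll();
    }
}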
Related
I'm using Spring Cloud Stream 3.1.3.
I'm migrating from a pre-3.1 version, so I wrote my producer using a java.util.Function (I know I could use a Supplier, but this is what I need).
The application.yaml file is configured with the function definition and the input and output bindings, and this is what I have:
@EnableAutoConfiguration
@Service
public class Producer {

    @Autowired
    private StreamBridge streamBridge;

    public void produce(int messageId, Object data) {
        Message<Object> message = MessageBuilder
                .withPayload(data)
                .setHeader(PARTITION_KEY, messageId)
                .build();
        streamBridge.send("produceMessage-in-0", message);
    }

    @Bean
    public Function<Message<Object>, Message<Object>> produceMessage() {
        return (input) -> {
            int messageId = input.getHeaders().get(PARTITION_KEY, Integer.class);
            Object message = input.getPayload();
            return MessageBuilder
                    .withPayload(message)
                    .setHeader(MessageHeaders.CONTENT_TYPE, MimeTypeUtils.APPLICATION_JSON)
                    .setHeader(PARTITION_KEY, messageId)
                    .setHeader("type", "MyMessage")
                    .build();
        };
    }
}
Now, I would like to test this implementation, so I wrote this test class:
@SpringBootTest
class ProducerTest {

    @Autowired
    private Producer producer;

    @Autowired
    private ObjectMapper objectMapper;

    @Test
    void produceOk() {
        try (ConfigurableApplicationContext context = new SpringApplicationBuilder(
                TestChannelBinderConfiguration.getCompleteConfiguration(Producer.class)).run()) {
            producer.produce(1, new MyMessage(1, "Hello"));
            OutputDestination output = context.getBean(OutputDestination.class);
            Message<byte[]> received = output.receive();
            Assertions.assertNotNull(received);
        }
    }
}
The test fails because output.receive() returns null.
Is this the right way to test my code?
Thanks
It is difficult to see what your issue may be, since we don't see the entire setup of your project, but here are a few pointers that may help...
Please look at any of the tests we use in the framework, as well as the Testing section of the reference manual.
There is also a dedicated StreamBridgeTests.java, which is what I believe you are looking for.
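One detail worth checking in your snippet: the Producer you invoke is autowired from the outer @SpringBootTest context, while the test binder lives in the inner context you build by hand, so the message never reaches the OutputDestination you inspect. A sketch of the pattern the framework tests follow (the destination name and timeout below are assumptions based on your bindings):

@Test
void produceOk() {
    try (ConfigurableApplicationContext context = new SpringApplicationBuilder(
            TestChannelBinderConfiguration.getCompleteConfiguration(Producer.class)).run()) {
        // Resolve the producer from the same context that holds the test binder
        Producer producer = context.getBean(Producer.class);
        producer.produce(1, new MyMessage(1, "Hello"));

        OutputDestination output = context.getBean(OutputDestination.class);
        // Poll the destination StreamBridge sent to, with a timeout; adjust the
        // name if your bindings differ
        Message<byte[]> received = output.receive(1000, "produceMessage-in-0");
        Assertions.assertNotNull(received);
    }
}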
I have a Spring Batch application (Spring Boot 2.3.5.RELEASE) that uses a JpaRepository to insert some custom log messages into a database while Spring Batch is processing; this is separate from the out-of-the-box Spring Batch tables. It seems that when I throw an exception from my ItemProcessorAdapter, it is caught by the ItemProcessListener onProcessError() method. In this method I perform a JpaRepository save() and flush(). No errors are logged, but once I leave this method the JpaRepository does a rollback.
Is this normal behavior? How can I get around it?
When using JpaRepository, is there a way to set @Transactional(noRollbackFor = {xxxException.class})? I tried this and it seemed to have no effect.
A sample code snippet is below.
@Configuration
public class BatchJobConfiguration {

    // Omitted for clarity....

    @Bean
    @StepScope
    public CompositeItemProcessor<Decision, Decision> itemProcessor() {
        CompositeItemProcessor<Decision, Decision> itemProcessor = new CompositeItemProcessor<>();
        itemProcessor.setDelegates(Arrays.asList(
                decisionValidatingItemProcessor(),
                myItemProcessor(null)
        ));
        return itemProcessor;
    } // end itemProcessor()

    @Bean
    public BeanValidatingItemProcessor<Decision> decisionValidatingItemProcessor() {
        BeanValidatingItemProcessor<Decision> beanValidatingItemProcessor = new BeanValidatingItemProcessor<>();
        beanValidatingItemProcessor.setFilter(true);
        return beanValidatingItemProcessor;
    } // end decisionValidatingItemProcessor()

    @Bean
    public ItemProcessorAdapter<Decision, Decision> myItemProcessor(DecisionProcessingService service) {
        ItemProcessorAdapter<Decision, Decision> adapter = new ItemProcessorAdapter<>();
        adapter.setTargetObject(service);
        adapter.setTargetMethod("processDecision");
        return adapter;
    }

    @Bean
    @StepScope
    public DecisionItemProcessListener decisionItemProcessListener() {
        return new DecisionItemProcessListener(mpJpaRepository);
    }
}
@Service
public class DecisionProcessingService {

    public Decision processDecision(Decision decision) throws BatchException {
        ....
        throw new BatchException("An error occurred");
    }
}
public class DecisionItemProcessListener implements ItemProcessListener<Decision, Decision> {

    private MyJpaRepository mpJpaRepository;

    public DecisionItemProcessListener(MyJpaRepository mpJpaRepository) {
        this.mpJpaRepository = mpJpaRepository;
    }

    ....

    @Override
    public void onProcessError(Decision decision, Exception e) {
        MyEntityObject obj = MyEntityObject.builder()
                .msg(e.getMessage())
                .build();
        mpJpaRepository.save(obj);
        mpJpaRepository.flush();
        // after this, the insert above is rolled back.
    } // end onProcessError()
}
The callback you are using here, ItemProcessListener#onProcessError, is called within a transaction (driven by Spring Batch) that is going to be rolled back due to the exception thrown by the item processor.
If you want to save data in that method, you need to use a new transaction (use the REQUIRES_NEW propagation).
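A minimal sketch of that idea: move the save into a separate service method annotated with REQUIRES_NEW, so the insert commits in its own transaction regardless of the step's rollback (the ErrorAuditService name is hypothetical; the repository and entity come from your snippet):

@Service
public class ErrorAuditService {

    private final MyJpaRepository mpJpaRepository;

    public ErrorAuditService(MyJpaRepository mpJpaRepository) {
        this.mpJpaRepository = mpJpaRepository;
    }

    // Suspends the surrounding batch transaction and commits this insert
    // independently, so the step's rollback no longer discards it
    @Transactional(propagation = Propagation.REQUIRES_NEW)
    public void logError(Exception e) {
        MyEntityObject obj = MyEntityObject.builder()
                .msg(e.getMessage())
                .build();
        mpJpaRepository.save(obj);
    }
}

Your onProcessError() would then delegate to errorAuditService.logError(e) instead of calling the repository directly; the call has to go through the Spring proxy, hence the separate bean.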
EDIT: I shared a minimal complete example here: https://github.com/benas/spring-batch-lab/tree/master/issues/so64913980.
How do I remove or handle irrelevant or bad sort parameters from the HTTP URL when using the Pageable interface in Spring Boot?
For example, I have a query like:
http://localhost:8080/all?sort=firstName,asc&sort=nosuchfield,asc
How can I handle or remove the irrelevant field "nosuchfield"?
Also, how can I limit the number of sort parameters in the URL?
If the sort field isn't present in the database entity, the below exception will be thrown by Spring Data JPA.
org.springframework.data.mapping.PropertyReferenceException: No property nosuchfield found for type <TYPE>!
at org.springframework.data.mapping.PropertyPath.<init>(PropertyPath.java:94)
at org.springframework.data.mapping.PropertyPath.create(PropertyPath.java:382)
at org.springframework.data.mapping.PropertyPath.create(PropertyPath.java:358)
However, the exception can be handled in several ways. Ultimately, you can just log it or transform it into any custom exception. As per my requirement, I have transformed it into a custom exception.
Using AOP
@Aspect
@Component
public class UnKnownColumnSortingExceptionHandler {

    @AfterThrowing(pointcut = "execution(* com.repositorypackage.*.*(..))", throwing = "ex")
    public void executeWhenExceptionThrownInRepository(JoinPoint jp, Throwable ex) {
        if (ex instanceof PropertyReferenceException) {
            throw new CustomException("Invalid Database operation");
        }
    }
}
Using @ControllerAdvice (application-wide exception handling)
@ControllerAdvice
public class GlobalExceptionHandler extends ResponseEntityExceptionHandler {

    public GlobalExceptionHandler() {}

    @ExceptionHandler({PropertyReferenceException.class})
    public ResponseEntity<Void> handleAllExceptions(Exception ex, WebRequest req) {
        return new ResponseEntity<>(HttpStatus.INTERNAL_SERVER_ERROR);
    }
}
Per-controller exception handling
Add the below piece of code to your controller:
@ExceptionHandler({PropertyReferenceException.class})
public ResponseEntity<Void> handleAllExceptions(Exception ex, WebRequest req) {
    return new ResponseEntity<>(HttpStatus.INTERNAL_SERVER_ERROR);
}
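If you would rather silently drop the bad sort properties than fail the request, another option is to sanitize the Pageable in the controller before it reaches the repository. A sketch, assuming you can enumerate the sortable fields (the whitelist below is hypothetical):

import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;
import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort;

public final class SortSanitizer {

    // Hypothetical whitelist of properties clients are allowed to sort by
    private static final Set<String> SORTABLE = Set.of("firstName", "lastName");

    // Keeps only whitelisted sort orders, dropping fields like "nosuchfield"
    public static Pageable sanitize(Pageable pageable) {
        List<Sort.Order> safeOrders = pageable.getSort().stream()
                .filter(order -> SORTABLE.contains(order.getProperty()))
                .collect(Collectors.toList());
        return PageRequest.of(pageable.getPageNumber(), pageable.getPageSize(), Sort.by(safeOrders));
    }
}

Truncating safeOrders to a fixed maximum size in the same method would also cover limiting the number of sort parameters in the URL.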
I have a Spring WebFlux stream consumer which calls a REST endpoint, consumes the messages received, and saves them to an RDBMS, and I am trying to find a way to batch it. I see that subscribe() has an overload that takes a consumer which gets called on completion. I am trying to find out how to get hold of the data when this completion consumer gets called, since the completion consumer is a Runnable, and all I have is the run() method, which doesn't take any parameters.
**CLIENT**
WebClient.create("http://localhost:8080")
        .get()
        .uri("/objects")
        .accept(MediaType.TEXT_EVENT_STREAM)
        .exchange()
        .flatMapMany(clientResponse -> clientResponse.bodyToFlux(MyObject.class))
        .subscribe(null, null, completionProcessorSubscriber);
**COMPLETION SUBSCRIBER**
@Service
public class CompletionProcessorSubscriber implements Runnable {

    @Autowired
    LegacyDAOImpl dao;

    Logger logger = LoggerFactory.getLogger(CompletionProcessorSubscriber.class);

    public void run() {
        logger.info("\ninside RUNNNNNNNNN\n\n");
        // here how to get hold of the data stream ?
    }
}
Below is the documentation from the Flux API:
public final Disposable subscribe(
        @Nullable Consumer<? super T> consumer,
        @Nullable Consumer<? super Throwable> errorConsumer,
        @Nullable Runnable completeConsumer) {
    return subscribe(consumer, errorConsumer, completeConsumer, null);
}
You should avoid adding too much logic to subscriber methods. Instead, you should utilize the rich set of operators provided by the Flux API.
In this case the operators you need are buffer, to collect batches, and concatMap, to execute the batches sequentially.
In the following example I assume LegacyDAOImpl is a blocking service whose work should be assigned to an appropriate thread pool.
public static void main(String[] args) throws InterruptedException
{
    WebClient webClient = WebClient.create("http://localhost:8080");
    LegacyDAOImpl legacyDAOImpl = new LegacyDAOImpl();

    webClient.get()
             .uri("/objects")
             .accept(MediaType.TEXT_EVENT_STREAM)
             .exchange()
             .flatMapMany(clientResponse -> clientResponse.bodyToFlux(MyObject.class))
             .buffer(100) // batch size
             .concatMap(batchOfMyObjects -> Mono.fromRunnable(() -> legacyDAOImpl.saveAll(batchOfMyObjects))
                                                .subscribeOn(Schedulers.elastic())) // blocking IO goes to the elastic thread pool
             .subscribe();
}

private static class LegacyDAOImpl
{
    public void saveAll(List<MyObject> myObjects)
    {
        // save here
    }
}
private static class MyObject
{
}
As of now, I'm able to connect to Cassandra via the following code:
import com.datastax.driver.core.Cluster;
import com.datastax.driver.core.Session;

public static Session connection() {
    Cluster cluster = Cluster.builder()
            .addContactPoints("IP1", "IP2")
            .withCredentials("user", "password")
            .withSSL()
            .build();
    Session session = null;
    try {
        session = cluster.connect("database_name");
        session.execute("CQL Statement");
    } finally {
        IOUtils.closeQuietly(session);
        IOUtils.closeQuietly(cluster);
    }
    return session;
}
The problem is that I need to write to Cassandra in a Spring Batch project. Most of the starter kits seem to use a JdbcBatchItemWriter to write to a MySQL database from a chunk. Is this possible? It seems that a JdbcBatchItemWriter cannot connect to a Cassandra database.
The current ItemWriter code is below:
@Bean
public JdbcBatchItemWriter<Person> writer() {
    JdbcBatchItemWriter<Person> writer = new JdbcBatchItemWriter<Person>();
    writer.setItemSqlParameterSourceProvider(new BeanPropertyItemSqlParameterSourceProvider<Person>());
    writer.setSql("INSERT INTO people (first_name, last_name) VALUES (:firstName, :lastName)");
    writer.setDataSource(dataSource);
    return writer;
}
Spring Data Cassandra provides repository abstractions for Cassandra that you should be able to use in conjunction with the RepositoryItemWriter to write to Cassandra from Spring Batch.
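A sketch of that approach, assuming a Spring Data Cassandra repository for Person exists (the PersonRepository interface is hypothetical):

@Bean
public RepositoryItemWriter<Person> writer(PersonRepository personRepository) {
    RepositoryItemWriter<Person> writer = new RepositoryItemWriter<>();
    // Each chunk of items is handed to the repository's save method
    writer.setRepository(personRepository);
    writer.setMethodName("save");
    return writer;
}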
It is possible to extend Spring Batch to support Cassandra by customising ItemReader and ItemWriter.
ItemWriter example:
public class CassandraBatchItemWriter<T> implements ItemWriter<T>, InitializingBean {

    protected static final Log logger = LogFactory.getLog(CassandraBatchItemWriter.class);

    private final Class<T> aClass;

    @Autowired
    private CassandraTemplate cassandraTemplate;

    public CassandraBatchItemWriter(final Class<T> aClass) {
        this.aClass = aClass;
    }

    @Override
    public void afterPropertiesSet() throws Exception { }

    @Override
    public void write(final List<? extends T> items) throws Exception {
        logger.debug("Write operation is performing, the size is " + items.size());
        if (!items.isEmpty()) {
            logger.info("Deleting in a batch performing...");
            cassandraTemplate.deleteAll(aClass);
            logger.info("Inserting in a batch performing...");
            cassandraTemplate.insert(items);
        } else {
            logger.debug("Items is empty...");
        }
    }
}
Then you can expose it as a @Bean through a @Configuration class:
@Bean
public ItemWriter<Company> writer() {
    final CassandraBatchItemWriter<Company> writer = new CassandraBatchItemWriter<>(Company.class);
    return writer;
}
Full source code can be found in the GitHub repo: Spring-Batch-with-Cassandra