Mono subscribe is not executing - reactive-programming

I have started learning reactive programming with Spring Boot. When I execute the piece of code below from a router function, the Mono subscribe part of the logic does not execute.
@SpringBootApplication
public class MonotopayloadApplication {

    public static void main(String[] args) {
        SpringApplication.run(MonotopayloadApplication.class, args);
    }

    @Bean
    RouterFunction<ServerResponse> videoEndPoint() {
        return route()
                .path("/test", builder -> builder
                        .POST("", this::handle)
                ).build();
    }

    private Mono<ServerResponse> handle(ServerRequest serverRequest) {
        serverRequest.bodyToMono(String.class)
                .subscribe(result -> {
                    try {
                        Thread.sleep(5000);
                    } catch (InterruptedException e) {
                        e.printStackTrace();
                    }
                    System.out.println("subscribe thread : " + Thread.currentThread().getName());
                });
        System.out.println("Thread : " + Thread.currentThread().getName() + " is leaving");
        return Mono.empty();
    }
}
I do not want to call block()/blockFirst()/blockLast(). I understand that the main thread leaves before the subscribed logic runs. Kindly help.
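For context, the usual WebFlux approach is not to call subscribe() inside the handler at all, but to make the body processing part of the Mono<ServerResponse> the handler returns, so the framework subscribes when it serves the request. A minimal sketch of that shape (an illustration, not the original code; it assumes the handler only needs to log and echo the body):
private Mono<ServerResponse> handle(ServerRequest serverRequest) {
    // Sketch: compose the body into the returned Mono instead of subscribing manually,
    // so WebFlux triggers the subscription when the request is handled.
    return serverRequest.bodyToMono(String.class)
            .doOnNext(body -> System.out.println(
                    "subscribe thread : " + Thread.currentThread().getName()))
            .flatMap(body -> ServerResponse.ok().bodyValue(body));
}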

Related

How to Handle a Kafka Record with a Class-Level @KafkaListener with no @KafkaHandler for the Record Value

Normally, when we define a class-level @KafkaListener and method-level @KafkaHandlers, we can define a default @KafkaHandler to handle unexpected payloads.
https://docs.spring.io/spring-kafka/docs/current/reference/html/#class-level-kafkalistener
But, what should we do if we don't have a default method?
With version 2.6 and later, you can configure a SeekToCurrentErrorHandler to immediately send such messages to a dead letter topic, by examining the exception.
Here is a simple Spring Boot application that demonstrates the technique:
@SpringBootApplication
public class So59256214Application {

    public static void main(String[] args) {
        SpringApplication.run(So59256214Application.class, args);
    }

    @Bean
    public NewTopic topic1() {
        return TopicBuilder.name("so59256214").partitions(1).replicas(1).build();
    }

    @Bean
    public NewTopic topic2() {
        return TopicBuilder.name("so59256214.DLT").partitions(1).replicas(1).build();
    }

    @KafkaListener(id = "so59256214.DLT", topics = "so59256214.DLT")
    void listen(ConsumerRecord<?, ?> in) {
        System.out.println("dlt: " + in);
    }

    @Bean
    public ApplicationRunner runner(KafkaTemplate<String, Object> template) {
        return args -> {
            template.send("so59256214", 42);
            template.send("so59256214", 42.0);
            template.send("so59256214", "No handler for this");
        };
    }

    @Bean
    ErrorHandler eh(KafkaOperations<String, Object> template) {
        SeekToCurrentErrorHandler eh = new SeekToCurrentErrorHandler(new DeadLetterPublishingRecoverer(template));
        BackOff neverRetryOrBackOff = new FixedBackOff(0L, 0);
        BackOff normalBackOff = new FixedBackOff(2000L, 3);
        eh.setBackOffFunction((rec, ex) -> {
            if (ex.getMessage().contains("No method found for class")) {
                return neverRetryOrBackOff;
            } else {
                return normalBackOff;
            }
        });
        return eh;
    }
}

@Component
@KafkaListener(id = "so59256214", topics = "so59256214")
class Listener {

    @KafkaHandler
    void integerHandler(Integer in) {
        System.out.println("int: " + in);
    }

    @KafkaHandler
    void doubleHandler(Double in) {
        System.out.println("double: " + in);
    }
}
spring.kafka.consumer.auto-offset-reset=earliest
spring.kafka.consumer.value-deserializer=org.springframework.kafka.support.serializer.JsonDeserializer
spring.kafka.producer.value-serializer=org.springframework.kafka.support.serializer.JsonSerializer
Result:
int: 42
double: 42.0
dlt: ConsumerRecord(topic = so59256214.DLT, ...

Spring Batch: write skipped items into a CSV file

I want to write the skipped lines to a first CSV file and the result of the processor to a second file, in one step, but it does not work!
My code:
// => Step cecStep1
@Bean
public Step cecStep1(StepBuilderFactory stepBuilders) throws IOException {
    return stepBuilders.get("fileDecrypt")
            .<CSCivique, String>chunk(100)
            .reader(reader1())
            .processor(processor1FileDecrypt())
            .writer(writer1())
            .faultTolerant()
            .skip(Exception.class)
            .skipLimit(100)
            .listener(new MySkipListener())
            .build();
}

// ##################################### Step SkipListener ###################################################
public static class MySkipListener implements SkipListener {

    private BufferedWriter bw = null;

    public MySkipListener(File file) throws IOException {
        //this.fileWriter = new FileWriter(file);
        bw = new BufferedWriter(new FileWriter(file, true));
        System.out.println("MySkipListener =========> :" + file);
    }

    @Override
    public void onSkipInRead(Throwable throwable) {
        if (throwable instanceof FlatFileParseException) {
            FlatFileParseException flatFileParseException = (FlatFileParseException) throwable;
            System.out.println("onSkipInRead =========> :");
            try {
                bw.write(flatFileParseException.getInput() + "; Vérifiez les colonnes !!");
                bw.newLine();
                bw.flush();
                // fileWriter.close();
            } catch (IOException e) {
                System.err.println("Unable to write skipped line to error file");
            }
        }
    }

    @Override
    public void onSkipInWrite(CSCivique item, Throwable t) {
        System.out.println("Item " + item + " was skipped due to: " + t.getMessage());
    }

    @Override
    public void onSkipInProcess(CSCivique item, Throwable t) {
        System.out.println("Item " + item + " was skipped due to: " + t.getMessage());
    }
}

@Bean
public FlatFileItemWriter<String> writer1() {
    return new FlatFileItemWriterBuilder<String>().name("greetingItemWriter")
            .resource(new FileSystemResource("target/test-outputs/greetings.csv"))
            .lineAggregator(new PassThroughLineAggregator<>()).build();
}
Thank you!
In your processor, you can:
- throw a skippable exception for invalid items, so that the skip listener intercepts them and writes them to the specified file
- let valid items go through to the writer, so that they are written as configured in the item writer
For example:
class MyItemProcessor implements ItemProcessor<Object, Object> {

    @Override
    public Object process(Object item) throws Exception {
        if (shouldBeSkipped(item)) {
            throw new MySkippableException();
        }
        // process item
        return item;
    }
}
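For the listener to see those items, the exception type thrown by the processor also has to be declared skippable on the step. A sketch of that wiring, reusing the question's step and the MySkippableException above (the error-file path passed to MySkipListener is only a placeholder):
@Bean
public Step cecStep1(StepBuilderFactory stepBuilders) throws IOException {
    return stepBuilders.get("fileDecrypt")
            .<CSCivique, String>chunk(100)
            .reader(reader1())
            .processor(processor1FileDecrypt())
            .writer(writer1())
            .faultTolerant()
            .skip(MySkippableException.class)    // only deliberately rejected items are skipped
            .skip(FlatFileParseException.class)  // keep read errors skippable for onSkipInRead
            .skipLimit(100)
            .listener(new MySkipListener(new File("target/test-outputs/skipped.csv"))) // placeholder path
            .build();
}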
Hope this helps.

Vertx.deployVerticle does not call the supplied completion handler

I am writing a service where a deployed verticle is linked to a REST endpoint. The service works 100% (I dynamically deploy the verticle, and calling the REST endpoint executes a function on the verticle). The problem is that the supplied completion handler is never called. Any ideas?
Following is my code:
LOGGER.debug(String.format("Starting runner %s:%s:%s" ,functionName, faasFunctionClass, fileName));
DeploymentOptions deploymentOptions = new DeploymentOptions();
deploymentOptions.setInstances(1);
JsonObject jsonObject = new JsonObject();
jsonObject.put(FUNCTION_NAME, functionName);
jsonObject.put(FUNCTION_CLASS, faasFunctionClass);
jsonObject.put(FUNCTION_FILENAME, fileName);
deploymentOptions.setConfig(jsonObject);
LOGGER.debug(String.format("Deploying [%s]" ,jsonObject.encode()));
this.vertx.deployVerticle("faas:" + VertxFaasRunner.class.getCanonicalName(),deploymentOptions, event->{
if (event.succeeded()) {
System.out.println("Deployment id is: " + event.result());
} else {
System.out.println("Deployment failed!");
}
});
In this case it depends on how you have implemented your verticle. In the code below, the completion handler is invoked (and event.succeeded() is true) only once future.complete() has been executed.
public class MainVerticle extends AbstractVerticle {

    @Override
    public void start() throws Exception {
        System.out.println("[Main] Running in " + Thread.currentThread().getName());
        vertx.deployVerticle("io.vertx.example.core.verticle.worker.WorkerVerticle",
                new DeploymentOptions().setWorker(true), event -> {
                    if (event.succeeded()) {
                        System.out.println("Deployment id is: " + event.result());
                    } else {
                        System.out.println("Deployment failed!");
                    }
                });
    }
}

public class WorkerVerticle extends AbstractVerticle {

    @Override
    public void start(Future<Void> future) throws Exception {
        System.out.println("[Worker] Starting in " + Thread.currentThread().getName());
        vertx.eventBus().<String>consumer("sample.data", message -> {
            System.out.println("[Worker] Consuming data in " + Thread.currentThread().getName());
            String body = message.body();
            message.reply(body.toUpperCase());
        });
        // this notifies that the verticle is deployed successfully
        future.complete();
    }
}
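A related point (an assumption about what may be going wrong, not something shown in the question): if start(Future<Void>) kicks off asynchronous setup, the future must be completed or failed from that async callback; if it is never completed, the deployment handler never fires. A minimal Vert.x 3-style sketch (Vert.x 4 uses start(Promise<Void>) instead):
public class AsyncStartVerticle extends AbstractVerticle {

    @Override
    public void start(Future<Void> startFuture) {
        // Hypothetical async setup: the deployment completion handler fires only
        // once startFuture is completed (or failed) inside the listen callback.
        vertx.createHttpServer()
                .requestHandler(req -> req.response().end("ok"))
                .listen(8080, ar -> {
                    if (ar.succeeded()) {
                        startFuture.complete();
                    } else {
                        startFuture.fail(ar.cause());
                    }
                });
    }
}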

How to detect every mouse event in the system using Java

How can I detect every mouse event in the system using Java?
I have tried using the Point class (polling the pointer location) to catch mouse motion, but that was not convenient.
import org.jnativehook.GlobalScreen;
import org.jnativehook.NativeHookException;
import org.jnativehook.keyboard.NativeKeyEvent;
import org.jnativehook.keyboard.NativeKeyListener;

class GlobalKeyListenerExample implements NativeKeyListener {

    public void nativeKeyPressed(NativeKeyEvent e) {
        System.out.println("Key Pressed: " + NativeKeyEvent.getKeyText(e.getKeyCode()));
        if (e.getKeyCode() == NativeKeyEvent.VK_ESCAPE) {
        }
    }

    public void nativeKeyReleased(NativeKeyEvent e) {
        System.out.println("Key Released: " + NativeKeyEvent.getKeyText(e.getKeyCode()));
    }

    public void nativeKeyTyped(NativeKeyEvent e) {
        System.out.println("Key Typed: " + NativeKeyEvent.getKeyText(e.getKeyCode()));
    }

    public GlobalKeyListenerExample() {
        try {
            GlobalScreen.registerNativeHook();
        } catch (NativeHookException ex) {
            System.err.println("There was a problem registering the native hook.");
            System.err.println(ex.getMessage());
            System.exit(1);
        }
        GlobalScreen.getInstance().addNativeKeyListener(this);
    }

    public static void main(String[] args) {
        try {
            GlobalScreen.registerNativeHook();
        } catch (NativeHookException ex) {
            System.err.println("There was a problem registering the native hook.");
            System.err.println(ex.getMessage());
            System.exit(1);
        }
        // Construct the example object and initialize the native hook.
        GlobalScreen.getInstance().addNativeKeyListener(new GlobalKeyListenerExample());
    }
}
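The snippet above only hooks the keyboard. JNativeHook also provides global mouse hooks; below is a minimal sketch using its NativeMouseInputListener, written against the same 1.x-style GlobalScreen.getInstance() API as the snippet (newer 2.x releases register listeners through static methods on GlobalScreen instead):
import org.jnativehook.GlobalScreen;
import org.jnativehook.NativeHookException;
import org.jnativehook.mouse.NativeMouseEvent;
import org.jnativehook.mouse.NativeMouseInputListener;

// Sketch: logs clicks, presses, releases, moves and drags system-wide.
class GlobalMouseListenerExample implements NativeMouseInputListener {

    public void nativeMouseClicked(NativeMouseEvent e) {
        System.out.println("Mouse Clicked: " + e.getClickCount());
    }

    public void nativeMousePressed(NativeMouseEvent e) {
        System.out.println("Mouse Pressed: " + e.getButton());
    }

    public void nativeMouseReleased(NativeMouseEvent e) {
        System.out.println("Mouse Released: " + e.getButton());
    }

    public void nativeMouseMoved(NativeMouseEvent e) {
        System.out.println("Mouse Moved: " + e.getX() + ", " + e.getY());
    }

    public void nativeMouseDragged(NativeMouseEvent e) {
        System.out.println("Mouse Dragged: " + e.getX() + ", " + e.getY());
    }

    public static void main(String[] args) {
        try {
            GlobalScreen.registerNativeHook();
        } catch (NativeHookException ex) {
            System.err.println("There was a problem registering the native hook.");
            System.exit(1);
        }
        GlobalMouseListenerExample listener = new GlobalMouseListenerExample();
        // JNativeHook 1.x style, matching the keyboard example above;
        // 2.x uses GlobalScreen.addNativeMouseListener(...) directly.
        GlobalScreen.getInstance().addNativeMouseListener(listener);
        GlobalScreen.getInstance().addNativeMouseMotionListener(listener);
    }
}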

future.get after ScheduledThreadPoolExecutor shutdown, will it work?

We use a ScheduledThreadPoolExecutor, and after submitting the job we call shutdown() immediately, because per the docs shutdown() does not kill submitted or running tasks and allows them to complete.
The question is: after shutdown(), can we continue to use the Future object that the ScheduledThreadPoolExecutor's submit() returned?
private static Future submitACall(Callable callableDelegate) {
    ScheduledThreadPoolExecutor threadPoolExe = null;
    try {
        threadPoolExe = new ScheduledThreadPoolExecutor(1);
        return threadPoolExe.submit(callableDelegate);
    } finally {
        threadPoolExe.shutdown();
    }
}

// in another method...
if (future.isDone())
    future.get();
Yes, you can. shutdown() only stops the executor from accepting new tasks; a task that was already submitted still runs to completion, so the Future it returned stays usable afterwards. Just call get() in a try-catch:
package testsomething;

import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Future;
import java.util.concurrent.ScheduledThreadPoolExecutor;

public class TestSomething {

    private static Future future = null;
    private static ScheduledThreadPoolExecutor threadPoolExe = null;

    public static void main(String[] args) {
        Callable callableDelegate = new MyCallable();
        future = submitACall(callableDelegate);
        try {
            System.out.println("First get: " + ((Integer) future.get()));
        } catch (InterruptedException | ExecutionException ex) {
            System.out.println("Exception: " + ex);
        }
        try {
            Thread.sleep(100L);
        } catch (InterruptedException ex) {
            System.out.println("Exception: " + ex);
        }
        try {
            System.out.println("Thread pool shut down? " + threadPoolExe.isShutdown());
            System.out.println("Second get through 'anotherMethod': " + anotherMethod());
        } catch (InterruptedException | ExecutionException ex) {
            System.out.println("Exception: " + ex);
        }
    }

    private static Future submitACall(Callable callableDelegate) {
        try {
            threadPoolExe = new ScheduledThreadPoolExecutor(1);
            return threadPoolExe.submit(callableDelegate);
        } finally {
            threadPoolExe.shutdown();
        }
    }

    private static Integer anotherMethod() throws ExecutionException, InterruptedException {
        if (future.isDone())
            return ((Integer) future.get());
        else
            return null;
    }

    private static class MyCallable implements Callable {
        @Override
        public Object call() throws Exception {
            return new Integer(0);
        }
    }
}