Vert.x Kafka not adhering to RxJava thread assignment? - rx-java2

Given the following code:
kafkaConsumer
    .rxSubscription()
    .subscribeOn(Schedulers.io())
    .map(s -> {
        logger.info("Mapping on Thread: " + Thread.currentThread().getName());
        return s;
    })
    .observeOn(Schedulers.computation())
    .subscribe(set -> {
        logger.info("Subscribing on Thread: " + Thread.currentThread().getName());
    });
where kafkaConsumer is a Vert.x KafkaConsumer, I expect that the
    .map(s -> {
        logger.info("Mapping on Thread: " + Thread.currentThread().getName());
        return s;
    })
would happen on the RxJava io thread. However, it executes on the Vert.x event-loop thread. When I run the following test class, the same scenario runs the map method on the io thread as expected.
public class ThreadTesting {

    public static void main(String args[]) {
        Vertx vertx = Vertx.vertx();

        Observable.fromArray(new String[] {"start"})
            .flatMapSingle(s -> method1())
            .subscribeOn(Schedulers.io())
            .map(s -> {
                System.out.println("mapping 2 on Thread: " + Thread.currentThread().getName());
                return s.concat(method2());
            })
            .observeOn(Schedulers.computation())
            .subscribe(
                str -> System.out.println("Subscribing on Thread: " + Thread.currentThread().getName()),
                onError -> onError.printStackTrace());
    }

    public static Single<String> method1() {
        System.out.println("Executing method 1 on Thread: " + Thread.currentThread().getName());
        AsyncResultSingle<String> vertxSingle = new AsyncResultSingle<>(
            h -> h.handle(Future.succeededFuture("method 1 string")));
        return vertxSingle;
    }

    public static String method2() {
        System.out.println("Executing method 2 on Thread: " + Thread.currentThread().getName());
        return "method 2 String";
    }
}
What causes this discrepancy in thread execution?

The Vert.x KafkaConsumer emits items asynchronously on an event-loop thread, even if you subscribe to it on the io scheduler.
In your snippet, you do try to force items to be emitted on the computation scheduler. It works, but not on the observable you expect: it applies to the observable returned by the map operation.
If you want map to operate on the computation scheduler, you need to apply the observeOn operator before it:
kafkaConsumer
    .rxSubscription()
    .subscribeOn(Schedulers.io())
    .observeOn(Schedulers.computation())
    .map(s -> {
        logger.info("Mapping on Thread: " + Thread.currentThread().getName());
        return s;
    })
    .subscribe(set -> {
        logger.info("Subscribing on Thread: " + Thread.currentThread().getName());
    });
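For reference, the same placement rule can be reproduced without Kafka. The sketch below is my own minimal example (assuming only plain RxJava 2 on the classpath; the thread named fake-event-loop merely stands in for the Vert.x event loop): subscribeOn cannot move the emissions of a source that is already asynchronous, while observeOn switches every operator placed below it.

import io.reactivex.Observable;
import io.reactivex.schedulers.Schedulers;

public class ObserveOnPlacementDemo {
    public static void main(String[] args) throws InterruptedException {
        // Simulate an async source that emits on its own thread, like the
        // Vert.x event loop: subscribeOn cannot change where it emits.
        Observable<String> source = Observable.create(emitter ->
            new Thread(() -> {
                emitter.onNext("item");
                emitter.onComplete();
            }, "fake-event-loop").start());

        source
            .subscribeOn(Schedulers.io())        // affects only where the subscription happens
            .observeOn(Schedulers.computation()) // switches emissions for everything below
            .map(s -> {
                System.out.println("map on " + Thread.currentThread().getName());
                return s;
            })
            .subscribe(s ->
                System.out.println("subscribe on " + Thread.currentThread().getName()));

        Thread.sleep(500); // keep the JVM alive long enough to see the output
    }
}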

Related

Mono subscribe is not executing

I have started learning reactive programming using Spring Boot, and when I try to execute the piece of code below from the router function, the Mono subscribe part of the logic does not execute.
@SpringBootApplication
public class MonotopayloadApplication {

    public static void main(String[] args) {
        SpringApplication.run(MonotopayloadApplication.class, args);
    }

    @Bean
    RouterFunction<ServerResponse> videoEndPoint() {
        return route()
            .path("/test", builder -> builder
                .POST("", this::handle)
            ).build();
    }

    private Mono<ServerResponse> handle(ServerRequest serverRequest) {
        serverRequest.bodyToMono(String.class)
            .subscribe(result -> {
                try {
                    Thread.sleep(5000);
                } catch (InterruptedException e) {
                    e.printStackTrace();
                }
                System.out.println("subscribe thread : " + Thread.currentThread().getName());
            });
        System.out.println("Thread : " + Thread.currentThread().getName() + " is leaving");
        return Mono.empty();
    }
}
I do not want to call block/blockFirst/blockLast(). I understand that the calling thread leaves before the subscribed logic runs. Kindly help.
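This related question is quoted without its answers. As a hedged sketch only (my assumption, not taken from the original thread), the usual way to make the body processing actually run is to keep it inside the Mono chain returned by the handler, so that WebFlux subscribes to it instead of you:

// Sketch: let the framework subscribe; no manual subscribe() call.
private Mono<ServerResponse> handle(ServerRequest serverRequest) {
    return serverRequest.bodyToMono(String.class)
        .doOnNext(result ->
            System.out.println("processing on thread: " + Thread.currentThread().getName()))
        .then(ServerResponse.ok().build());
}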

RxJava adjust backpressure avoiding observeOn buffer

In the code below I would like the subscriber to control when the Flowable emits an event by holding a reference to the Subscription inside subscribe() and requesting the number of elements I want to be produced.
What I am experiencing is that observeOn()'s buffer of size 2 is masking my call to subscription.request(3): the producer produces 2 elements at a time instead of 3.
public class FlowableExamples {

    public static void main(String[] args) throws InterruptedException {
        long start = new Date().getTime();

        Flowable<Integer> flowable = Flowable
            .generate(() -> 0, (Integer state, Emitter<Integer> emitter) -> {
                int newValue = state + 1;
                log("Producing: " + newValue);
                emitter.onNext(newValue);
                return newValue;
            })
            .take(30);

        flowable
            .subscribeOn(Schedulers.io())
            .observeOn(Schedulers.computation(), false, 2)
            .subscribe(new Subscriber<Integer>() {
                Subscription subscription;

                @Override
                public void onSubscribe(Subscription subscription) {
                    this.subscription = subscription;
                    subscription.request(5);
                }

                @Override
                public void onNext(Integer integer) {
                    log("\t\treceived: " + integer);
                    if (integer >= 5) {
                        sleep(500);
                        log("Requesting 3 should produce 3, but actually produced 2");
                        subscription.request(3);
                        sleep(1000);
                    }
                }

                @Override
                public void onError(Throwable throwable) {}

                @Override
                public void onComplete() {
                    log("Subscription Completed!!!!!!!!");
                }
            });

        sleep(40_000);
        System.out.println("Exit main after: " + (new Date().getTime() - start) + " ms");
    }

    private static void log(String msg) {
        System.out.println(Thread.currentThread().getName() + ": " + msg);
    }

    private static void sleep(long ms) {
        try {
            Thread.sleep(ms);
        } catch (InterruptedException e) {}
    }
}
How could I accomplish this?
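This related question is also quoted without its answers. One hedged workaround sketch (my own assumption, not taken from the thread): remove the observeOn call so that no prefetch buffer sits between the Subscriber and generate(); subscribeOn forwards request amounts unchanged, so request(3) then reaches the generator as exactly 3. The trade-off is that onNext now runs on the producer's io thread instead of a computation thread.

// Sketch: `flowable` and `subscriber` stand for the Flowable and the anonymous
// Subscriber from the snippet above; only the observeOn line is dropped.
flowable
    .subscribeOn(Schedulers.io())
    .subscribe(subscriber);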

How Spring Cloud Stream reactive processing works?

How can I achieve reactive message processing in Spring Cloud Stream? I read about Spring Cloud Function and that I should use it for reactive processing, so I created a sample one:
@Bean
public Consumer<Flux<Message<Loan>>> loanProcess() {
    return loanMessages ->
        loanMessages
            .flatMap(loanMessage -> Mono.fromCallable(() -> {
                if (loanMessage.getPayload().getStatus() == null) {
                    log.error("Empty status");
                    throw new RuntimeException("Loan status is empty");
                }
                return "Good";
            }))
            .doOnError(throwable -> log.error("Exception occurred: {}", throwable))
            .subscribe(status -> log.info("Message processed correctly: {}", status));
}
Afterwards I started thinking about what the difference is between the function above and a class with @StreamListener and usage of Reactor types:
@StreamListener(Sink.INPUT)
public void loanReceived(Message<Loan> message) {
    Mono.just(message)
        .flatMap(loanMessage -> Mono.fromCallable(() -> {
            if (loanMessage.getPayload().getStatus() == null) {
                log.error("Empty status");
                throw new RuntimeException("Loan status is empty");
            }
            log.info("Correct message");
            return "Correct message received";
        }))
        .doOnError(throwable -> log.error("Exception occurred: {}", throwable.getClass()))
        .subscribe(status -> log.info("Message processed correctly: {}", status));
}
Additionally, in Spring WebFlux I understand that there are a few Netty threads that handle request processing (running in an event loop). However, I cannot find documentation on how the threading model works in Spring Cloud Stream.
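The question above is quoted here without an answer. What can at least be checked empirically with plain Reactor (a sketch under my own assumptions, not specific to Spring Cloud Stream or any binder) is that operators run on whatever thread emits the signal unless you hop explicitly, e.g. with publishOn:

import reactor.core.publisher.Flux;
import reactor.core.scheduler.Schedulers;

public class ReactorThreadDemo {
    public static void main(String[] args) throws InterruptedException {
        Flux.just("loan-1", "loan-2")
            .doOnNext(m -> System.out.println("received on " + Thread.currentThread().getName()))
            .publishOn(Schedulers.parallel()) // everything below runs on a parallel worker
            .map(String::toUpperCase)
            .subscribe(m -> System.out.println("processed " + m + " on " + Thread.currentThread().getName()));

        Thread.sleep(200); // give the parallel worker time to print
    }
}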

RxJava Observables not Parallel

I am new to RxJava.
I am executing some basic code:
public class App {

    public static void main(String... args) throws Exception {
        long startTime = System.currentTimeMillis();

        abcd().map(cnt -> cnt).subscribe(s -> System.out.println(s));
        abcd().map(cnt -> cnt).subscribe(s -> System.out.println(s));

        long endTime = System.currentTimeMillis();
        long diff = endTime - startTime;
        System.out.println(diff);
    }

    public static Observable<Integer> abcd() {
        try {
            Thread.sleep(1000);
        } catch (Exception e) {
            System.out.println();
        }
        Observable<Integer> r = Observable.fromArray(10);
        return r;
    }
}
Basically, I created two Observables, each taking 1 second to process.
The total time to run this code is more than 2 seconds, meaning the two Observables are not executing in parallel.
How do I change my code so that the total execution time is about 1 second, i.e. the two Observables execute in parallel? Please answer with respect to RxJava.
You are sleeping before the Observables are even created, so it's not exactly the same as an Observable doing processing before it emits its result.
You can defer the execution of the abcd() method onto a background thread and wait for both sub-flows to terminate:
Observable.merge(
        Observable.defer(() -> abcd())
            .subscribeOn(Schedulers.io())
            .map(cnt -> cnt)
            .doOnNext(System.out::println),
        Observable.defer(() -> abcd())
            .subscribeOn(Schedulers.io())
            .map(cnt -> cnt + 1)
            .doOnNext(System.out::println)
)
.blockingSubscribe(ignored -> { }, Throwable::printStackTrace);
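With both sub-flows deferred and subscribed on the io scheduler, the two one-second sleeps inside abcd() overlap, so blockingSubscribe should return after roughly one second instead of two.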

Vertx.deployVerticle does not call the supplied completion handler

I am writing a service where a deployed verticle is linked to a REST endpoint. The service is working 100% (I dynamically deploy the verticle, and calling the REST endpoint executes a function on the verticle). The problem is that the supplied completion handler is never called. Any ideas?
Following is my code:
LOGGER.debug(String.format("Starting runner %s:%s:%s" ,functionName, faasFunctionClass, fileName));
DeploymentOptions deploymentOptions = new DeploymentOptions();
deploymentOptions.setInstances(1);
JsonObject jsonObject = new JsonObject();
jsonObject.put(FUNCTION_NAME, functionName);
jsonObject.put(FUNCTION_CLASS, faasFunctionClass);
jsonObject.put(FUNCTION_FILENAME, fileName);
deploymentOptions.setConfig(jsonObject);
LOGGER.debug(String.format("Deploying [%s]" ,jsonObject.encode()));
this.vertx.deployVerticle("faas:" + VertxFaasRunner.class.getCanonicalName(),deploymentOptions, event->{
if (event.succeeded()) {
System.out.println("Deployment id is: " + event.result());
} else {
System.out.println("Deployment failed!");
}
});
It depends on how you have implemented your verticle.
In the code below, event.succeeded() becomes true only once future.complete() has been executed.
public class MainVerticle extends AbstractVerticle {

    @Override
    public void start() throws Exception {
        System.out.println("[Main] Running in " + Thread.currentThread().getName());

        vertx.deployVerticle("io.vertx.example.core.verticle.worker.WorkerVerticle",
            new DeploymentOptions().setWorker(true), event -> {
                if (event.succeeded()) {
                    System.out.println("Deployment id is: " + event.result());
                } else {
                    System.out.println("Deployment failed!");
                }
            });
    }
}

public class WorkerVerticle extends AbstractVerticle {

    @Override
    public void start(Future future) throws Exception {
        System.out.println("[Worker] Starting in " + Thread.currentThread().getName());

        vertx.eventBus().<String>consumer("sample.data", message -> {
            System.out.println("[Worker] Consuming data in " + Thread.currentThread().getName());
            String body = message.body();
            message.reply(body.toUpperCase());
        });

        // This notifies Vert.x that the verticle is deployed successfully.
        future.complete();
    }
}
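Applied to the asker's situation: if the deployed verticle overrides the asynchronous start variant and never completes the passed future, the handler given to deployVerticle is never invoked (the no-argument start() completes implicitly when it returns). Here is a minimal sketch, assuming a Vert.x 3 style verticle; the body of VertxFaasRunner below is my own placeholder, not the asker's code:

import io.vertx.core.AbstractVerticle;
import io.vertx.core.Future;

public class VertxFaasRunner extends AbstractVerticle {

    @Override
    public void start(Future<Void> startFuture) throws Exception {
        // Hypothetical setup work for the deployed function runner.
        System.out.println("Runner starting in " + Thread.currentThread().getName());

        // Without this call the deployment never completes and the
        // completion handler passed to deployVerticle() never fires.
        startFuture.complete();
    }
}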