I am new to webflux and am not able to find the right material to continue with the implementation.
I want to issue a request and process the response asynchronously. In this case the service call takes about 8-10 ms to respond, so we issue the request, continue doing other work, and look for the response only when it is needed for further processing.
Mono<Map<String, Price>> resp = webClient.post()
        .uri("/{type}", isCustomerPricing ? "customer" : "profile")
        .body(Mono.just(priceDetailsRequest), PriceDetailsRequest.class)
        .retrieve()
        .bodyToMono(customerPriceDetailsType);
How do we make this call execute asynchronously on a different thread? (I tried subscribeOn with Schedulers.single()/Schedulers.parallel(), but I didn't see the call being executed until Mono.block() was called.)
How do we achieve the following?
We want this call to execute in parallel on a separate thread, so the current thread can continue with other work.
When processing completes, set the response on the context.
When the current thread looks for the response and the service call has not yet completed, block until it completes (see the sketch below).
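A rough sketch of one possibility (not the only one), reusing the variables from the snippet above: Mono.toFuture() subscribes immediately, so the request is issued right away on WebClient's Reactor Netty threads, and the calling thread blocks only when it actually asks for the result.
// toFuture() subscribes eagerly: the request is issued here, the current thread is not blocked.
CompletableFuture<Map<String, Price>> priceFuture = webClient.post()
        .uri("/{type}", isCustomerPricing ? "customer" : "profile")
        .body(Mono.just(priceDetailsRequest), PriceDetailsRequest.class)
        .retrieve()
        .bodyToMono(customerPriceDetailsType)
        .toFuture();

// ... continue with other work on the current thread ...

// Block only at the point where the response is actually needed.
Map<String, Price> prices = priceFuture.join();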
You don't need to block to consume the response. Just add an operator that consumes the response in the same chain. An example is given below.
Disposable subscription = webClient.post()
        .uri("/{type}", isCustomerPricing ? "customer" : "profile")
        .body(Mono.just(priceDetailsRequest), PriceDetailsRequest.class)
        .retrieve()
        .bodyToMono(CustomerPriceDetailsType.class)
        .map(processor::responseToDatabaseEntity) // Create a persistable entity from the response
        .map(priceRepository::save)               // Save the entity to the database
        .subscribe();                             // subscribe() returns a Disposable and triggers the chain
Alternatively you can provide a consumer as a parameter of the subscribe() method.
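For example, a consumer-based variant of the same chain could look like this (the println calls are just placeholders for whatever consumption logic you need):
webClient.post()
        .uri("/{type}", isCustomerPricing ? "customer" : "profile")
        .body(Mono.just(priceDetailsRequest), PriceDetailsRequest.class)
        .retrieve()
        .bodyToMono(CustomerPriceDetailsType.class)
        .map(processor::responseToDatabaseEntity)
        .map(priceRepository::save)
        .subscribe(
                savedEntity -> System.out.println("Saved price details: " + savedEntity), // value consumer
                error -> System.err.println("Price call failed: " + error)                // error consumer
        );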
I am using Spring Batch, and from a processor I am making HTTP GET calls to a downstream service. I am using WebClient with Mono to make the HTTP request.
I observed that for every entry in my log there are at least 2 or 3 entries at the downstream applications.
That means if I make 1 GET request, the downstream application receives 2 or 3 requests.
Even though the request succeeds within the given time (less than 2 seconds), we still see repeated calls to the downstream applications. Sometimes 2 calls are made and sometimes 3.
The 2nd and 3rd calls happen within milliseconds of the first call.
I am not using any retry or repeat logic externally. Any suggestion on how to avoid this?
Also, I don't need to check the body of the response; if the response code is 200, that is good enough for me.
Here is the code snippet.
//Connection provider properties
ConnectionProvider provider = ConnectionProvider
        .builder("fixed")
        .maxConnections(corePoolSize)
        .pendingAcquireMaxCount(-1)
        .build();

HttpClient client = HttpClient
        .create(provider)
        .option(ChannelOption.CONNECT_TIMEOUT_MILLIS, ConnectionTimeOut)
        .doOnConnected(conn -> conn
                .addHandlerLast(new ReadTimeoutHandler(ReadTimeout, TimeUnit.MILLISECONDS)));

response = webClient.clientConnector(new ReactorClientHttpConnector(client))
        .build()
        .get()
        .uri(queryURL)
        .headers(getHeaders(transactionId))
        .retrieve()
        .bodyToMono(Account.class);

Account result = response.block();
if (Boolean.TRUE.equals(Objects.requireNonNull(result).isComplete())) {
    statusCode = HttpStatus.OK;
    future.complete(result);
}
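Separately from the duplicate-call issue: since only a 200 status matters here, the body mapping could be dropped entirely. A rough sketch of that variant, reusing the same builder and client from the snippet above:
// If only the HTTP status matters, the body can be discarded with toBodilessEntity().
ResponseEntity<Void> entity = webClient.clientConnector(new ReactorClientHttpConnector(client))
        .build()
        .get()
        .uri(queryURL)
        .headers(getHeaders(transactionId))
        .retrieve()
        .toBodilessEntity() // keep status and headers, drop the body
        .block();

boolean success = entity != null && entity.getStatusCode() == HttpStatus.OK;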
I am currently writing a WebSocket client. There are a few functions that I need to happen sequentially.
Currently I do it this way:
Connect to the server.
Then set up a listener.
Add a conditional statement to the listener (to check whether response ids match request ids).
Send request one with an id.
If the response id matches request one's id, process the response.
Send request two.
Repeat.
This makes sequential actions look something like this
_channel.stream.listen((response) {
  if (response.id == requestOne.id) {
    handleRequestOneResponse(response);
  }
  if (response.id == requestTwo.id) {
    handleRequestTwoResponse(response);
  }
  ...
});

sendActionOneRequest();

handleRequestOneResponse(response) {
  // Some processing
  sendActionTwoRequest();
}

handleRequestTwoResponse(response) {
  // Some processing
  sendActionThreeRequest();
}
What I want to do is:
Set up an async function.
Send the request to the WebSocket server.
Pause the execution of the async function.
Wait until a matching response comes from the server.
Complete the async function.
This would allow me to write a series of actions like
await actionOne();
await actionTwo();
await actionThree();
I'm thinking I can set up and destroy a stream listener in each action function, but I don't know how to wait for a specific response before exiting.
On the other hand, I think I could even use the existing listener on the outside, but I still can't figure out how to wait until a specific response comes in before moving forward.
As it is, I have to jump through every function to find out what comes after what, and there are more than 5 requests that have to be sent sequentially.
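One way to get the await-able flow described above is to keep a Completer per outstanding request and complete it from the single listener. A minimal sketch, assuming each request/response carries an id, that _channel is the channel from the snippet above, and that requests go out through its sink; sendAndWait and _pending are hypothetical helpers:
import 'dart:async';

// Hypothetical helper state: one pending Completer per outstanding request id.
final _pending = <dynamic, Completer<dynamic>>{};

void startListening() {
  _channel.stream.listen((response) {
    // Complete whichever request this response belongs to (if any).
    _pending.remove(response.id)?.complete(response);
  });
}

// Send a request and return a Future that completes when the matching response arrives.
Future<dynamic> sendAndWait(dynamic request) {
  final completer = Completer<dynamic>();
  _pending[request.id] = completer;
  _channel.sink.add(request); // assumption: requests are sent through the channel's sink
  return completer.future;
}

// The actions now read sequentially:
Future<void> runActions() async {
  final responseOne = await sendAndWait(requestOne);
  // ... process responseOne ...
  final responseTwo = await sendAndWait(requestTwo);
  // ... process responseTwo ...
}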
What is the recommended way in vert.x to write an asynchronous request handler?
In this service, processing a request typically involves calling a DB, calling external services, etc. However, I do not want to block the request-handling thread. What is the recommended way to achieve this using vert.x? In a typical asynchronous processing chain, I would use the request-handling thread to emit a message to the message bus with the request object. Another handler picks up this message and does some processing, such as checking request params. This handler can then emit a new message to the bus, which is picked up by the next handler, which makes a remote call. That handler emits a new message with the result of the call, which is picked up by the next handler, which does error checking, etc. The next handler is responsible for creating the response and sending it to the client.
How one can create a similar pipeline using vert.x?
Everything, including request handlers for HttpServer, is asynchronous, isn't it?
var server = vertx.createHttpServer(HttpServerOptions())
server.requestHandler { req ->
    req.setExpectMultipart(true) // for handling forms
    var totalBuffer = Buffer.buffer()
    req.handler { buff -> totalBuffer.appendBuffer(buff) }
       .endHandler { // the body has now been fully read
           var formAttributes = req.formAttributes()
           req.response().putHeader("Content-type", "text/html")
           req.response().end("Hello HTTP!")
       }
    // the above is so common that Vert.x provides: req.bodyHandler { totalBuff -> ... }
}.listen(8080, "127.0.0.1", { res -> if (res.succeeded()) ... })
You just need to write (and end) the response via req.response() in the final handler of your pipeline.
For a more stream-like implementation (i.e., not callback-based), you may use Vert.x Rx/ReactiveStreams API. E.g., you may use Vert.x Web Client for making requests, possibly using its Rx-fied API.
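As a sketch of how one stage of such a pipeline could look with the Web Client (io.vertx.ext.web.client, from the vertx-web-client dependency; the port, host and path below are placeholders, and the callback style shown is only one option besides the Rx-fied API):
val client = WebClient.create(vertx)

vertx.createHttpServer()
    .requestHandler { req ->
        // Non-blocking call to a downstream service; the event-loop thread is not blocked while waiting.
        client.get(8081, "localhost", "/downstream").send { ar ->
            if (ar.succeeded()) {
                req.response().putHeader("Content-type", "text/plain")
                req.response().end(ar.result().bodyAsString())
            } else {
                req.response().setStatusCode(502).end()
            }
        }
    }
    .listen(8080)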
I'm using Activiti version 6.
I created a BPMN process from activiti-app.
Then I want to start that process from activiti-rest.war using the API.
http://localhost:8080/activiti-rest/service/runtime/process-instances
request body :
{
"processDefinitionKey":"cep_dispatch_process",
"businessKey":"myBusinessKey",
"returnVariables": false
}
header :
Content-Type:application/json
As I can see in the log, the process is being started on Tomcat threads.
Referring to the latest GitHub code:
Activiti-activiti-6.0.0\modules\activiti-rest\src\main\java\org\activiti\rest\service\api\runtime\process\ProcessInstanceCollectionResource.java
When I look at the method,
@RequestMapping(value = "/runtime/process-instances", method = RequestMethod.POST, produces = "application/json")
public ProcessInstanceResponse createProcessInstance(@RequestBody ProcessInstanceCreateRequest request, HttpServletRequest httpRequest, HttpServletResponse response) {
I can see the process is started without waiting for it to complete, and the HTTP response is 201. I can understand that the request is not supposed to be held until the process instance completes.
instance = processInstanceBuilder.start();
response.setStatus(HttpStatus.CREATED.value());
Please refer to the log snippet below: I can see the process executing on the server thread and the request waiting until the process has completed.
276-DEBUG 17-01-2019 14:12:07,177- (http-nio-8080-exec-3) ExecutionEntityManagerImpl: Child execution Execution[ id '130023' ] - parent '130021' created with parent 130021
241-DEBUG 17-01-2019 14:12:07,178- (http-nio-8080-exec-3) ContinueProcessOperation: Executing boundary event activityBehavior class org.activiti.engine.impl.bpmn.behavior.BoundaryTimerEventActivityBehavior with execution 130023
171-DEBUG 17-01-2019 14:12:07,202- (http-nio-8080-exec-3) ContinueProcessOperation: Executing activityBehavior class org.activiti.engine.impl.bpmn.behavior.SubProcessActivityBehavior on activity 'sid-1A2A8DF5-764A-4960-8E5D-F347DC10207C' with execution 130021
276-DEBUG 17-01-2019 14:12:07,203- (http-nio-8080-exec-3) ExecutionEntityManagerImpl: Child execution Execution[ id '130025' ] - parent '130021' created with parent 130021
63-DEBUG 17-01-2019 14:12:07,203- (http-nio-8080-exec-3) DefaultActivitiEngineAgenda: Operation class org.activiti.engine.impl.agenda.ContinueProcessOperation added to agenda
70-DEBUG 17-01-2019 14:12:07,203- (http-nio-8080-exec-3) CommandInvoker: Executing operation class org.activiti.engine.impl.agenda.ContinueProcessOperation
The request must not wait for the process to complete.
How can I solve this, so that the request to start the process does not wait for the process instance to complete?
As you can see in the response below:
{"id":"130028",
"url":"http://localhost:8080/activiti-rest/service/runtime/process-instances/130028",
"businessKey":"myBusinessKey",
"suspended":false,
"ended":true,
"processDefinitionId":"cep_dispatch_process:13:125033",
"processDefinitionUrl":"http://localhost:8080/activiti-rest/service/repository/process-definitions/cep_dispatch_process:13:125033"
,"processDefinitionKey":"cep_dispatch_process",
"activityId":null,
"variables":[],
"tenantId":"",
"name":null,
"completed":true
}
The API returns only after the process completes. I added a delay of 2 minutes in a service task and I can see the request waiting.
I'm not a big guru in Activiti, but as the simplest solution I can suggest activating the async executor and using asynchronous continuations for your service task. This could solve your problem. Activiti's behaviour is expected, because until it has persisted the state to the DB it can't say for sure that the process was created (the transaction could be rolled back due to a DB error, for example).
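A rough sketch of what that could look like (the task id, delegate class and file locations are placeholders; the activiti namespace is assumed to be declared in the BPMN file):
<!-- In the BPMN 2.0 process definition: mark the service task as an asynchronous continuation -->
<serviceTask id="dispatchTask"
             activiti:class="com.example.DispatchDelegate"
             activiti:async="true" />

<!-- In activiti.cfg.xml (or the engine configuration bean): make sure the async executor is active -->
<property name="asyncExecutorActivate" value="true" />
With this, the REST call should return as soon as the process state and the async job are persisted, and the async executor then runs the service task on its own thread pool.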
I have a ServiceWorker registered on my page and want to pass some data to it so it can be stored in an IndexedDB and used later for network requests (it's an access token).
Is the correct thing just to use network requests and catch them on the SW side using fetch, or is there something more clever?
Note for future readers wondering similar things to me:
Setting properties on the SW registration object, e.g. setting self.registration.foo to a function within the service worker and doing the following in the page:
navigator.serviceWorker.getRegistration().then(function(reg) { reg.foo; })
Results in TypeError: reg.foo is not a function. I presume this has something to do with the lifecycle of a ServiceWorker, meaning you can't modify it and expect those modifications to be accessible in the future, so any interface with a SW likely has to be postMessage-style, so perhaps just using fetch is the best way to go...?
So it turns out that you can't actually call a method within a SW from your app (due to lifecycle issues), so you have to use a postMessage API to pass serialized JSON messages around (so no passing callbacks etc).
You can send a message to the controlling SW with the following app code:
navigator.serviceWorker.controller.postMessage({'hello': 'world'})
Combined with the following in the SW code:
self.addEventListener('message', function (evt) {
console.log('postMessage received', evt.data);
})
Which results in the following in my SW's console:
postMessage received Object {hello: "world"}
So by passing in a message (a JS object) that indicates the function and arguments I want to call, my event listener can receive it and call the right function in the SW. To return a result to the app code, you also need to pass a MessageChannel port into the SW and then respond via postMessage. For example, in the app you'd create a MessageChannel and send one of its ports along with the data:
var messageChannel = new MessageChannel();
messageChannel.port1.onmessage = function(event) {
console.log(event.data);
};
// This sends the message data as well as transferring messageChannel.port2 to the service worker.
// The service worker can then use the transferred port to reply via postMessage(), which
// will in turn trigger the onmessage handler on messageChannel.port1.
// See https://html.spec.whatwg.org/multipage/workers.html#dom-worker-postmessage
navigator.serviceWorker.controller.postMessage(message, [messageChannel.port2]);
and then you can respond via it in your Service Worker within the message handler:
evt.ports[0].postMessage({'hello': 'world'});
To pass data to your service worker, the above-mentioned is a good way. But in case someone is still having a hard time implementing that, there is another workaround:
1 - Append your data as GET parameters when you load the service worker (e.g., from sw.js to sw.js?a=x&b=y&c=z).
2 - Now, in the service worker, fetch that data using self.location.search.
Note: this is beneficial only if the data you pass does not change very often for a particular client; otherwise it will keep changing the loading URL of the service worker for that client, and every time the client reloads or revisits, a new service worker is installed.
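A minimal sketch of that approach (the token parameter name and value are just examples):
// In the page: register the service worker with the data in its URL.
navigator.serviceWorker.register('/sw.js?token=abc123');

// In sw.js: read the data back from the worker's own URL.
const params = new URLSearchParams(self.location.search);
const token = params.get('token'); // "abc123"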