Stream a response in Spring MVC - REST

Here is the situation: let's say I have an endpoint that receives a request to retrieve data for a range of time (or whatever), and the result of that request is a big list that I get from a database, say a list of "Person" objects. For each of these Person objects I then have to call another method that may be a little slow, and it would delay the response a lot if I had to wait until it has executed for all the elements of this big list.
What I would like to accomplish is to stream the response through a REST endpoint, so my front end does not have to wait until the whole list is processed to start displaying it on the screen.
So I'm confused here: I know that an asynchronous method using Spring's @Async lets the consumer get a response even if the task is not finished yet, but as far as I understand, that is helpful for sending emails or any other task (or series of tasks) whose result you are not going to display on the screen.
But in the case of a response that is meant to be displayed on the screen, I guess I should stream a chunk of data as soon as I have a whole Person object ready.
What is the right way to accomplish this? Is @Async of any help in this situation, or should I just find a way to detect when a Person object is ready and stream it? Or am I terribly wrong and misunderstanding the concepts of async and streaming?
A little example would help.
Thanks.

I have been trying to understand the same concepts for the last 3 days, and here is my understanding, which may help you.
Asynchronous REST endpoint:
If your REST endpoint is doing some complex business logic or calling some external service and may take some time to respond, it's better to respond from the API as soon as possible and move the time-consuming logic to the background (a separate thread). This is where asynchronous processing helps.
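For the Spring MVC side (the question's framework), asynchronous processing can be as simple as returning a Callable from the controller. A minimal sketch, not from the original answer, with an illustrative endpoint and names:

import java.util.concurrent.Callable;

import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class SlowReportController {

    // Returning a Callable frees the servlet request thread; Spring MVC runs the
    // Callable on a task executor and completes the response when it returns.
    @GetMapping("/report")
    public Callable<String> report() {
        return () -> {
            Thread.sleep(2000);       // stand-in for the time-consuming logic
            return "report is ready"; // note: the client still waits for the complete result
        };
    }
}

Note that this only frees the server thread; the client still receives the whole body at once, which is exactly the distinction drawn above.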
Chunked output:
If your endpoint is expected to send a large amount of data, and you want to improve the user experience by rendering the output (in the UI) as soon as results become available, chunked output from the REST endpoint is the better approach.
Using Jersey we can achieve both asynchronous processing and chunked output, as shown in the sample below.
public ChunkedOutput<String> getChunkedResponse() {
    final ChunkedOutput<String> output = new ChunkedOutput<String>(String.class);

    new Thread() {
        public void run() {
            try {
                String chunk;
                int index = 0;
                while ((chunk = getWordAtIndex(index)) != null) {
                    output.write(chunk);
                    index++;
                }
            } catch (IOException e) {
                // Add code to handle the IOException during this operation
            } finally {
                try {
                    output.close();
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }
    }.start();

    return output; // This output object may be returned way before output is created
}
I have tried out a sample to test this out with a Jersey and Spring Boot combination. You can check it out in my Git repository here.
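Since the question itself is about Spring MVC rather than Jersey, a similar chunked effect can be achieved there with ResponseBodyEmitter (or StreamingResponseBody). A minimal, untested sketch, where the hard-coded strings stand in for the enriched Person chunks:

import java.util.Arrays;
import java.util.concurrent.Executors;

import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.servlet.mvc.method.annotation.ResponseBodyEmitter;

@RestController
public class PersonStreamController {

    @GetMapping("/persons/stream")
    public ResponseBodyEmitter streamPersons() {
        ResponseBodyEmitter emitter = new ResponseBodyEmitter();
        // In real code you would reuse an application-wide executor instead of creating one per request.
        Executors.newSingleThreadExecutor().submit(() -> {
            try {
                // Stand-in for "load the big list, then enrich each Person with the slow call";
                // every send() pushes a chunk to the client immediately.
                for (String personJson : Arrays.asList("{\"name\":\"alice\"}", "{\"name\":\"bob\"}")) {
                    Thread.sleep(1000); // simulates the slow per-item call
                    emitter.send(personJson);
                }
                emitter.complete();
            } catch (Exception e) {
                emitter.completeWithError(e);
            }
        });
        return emitter; // returned right away; the response stays open while chunks are written
    }
}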
Hope it helps.

Related

Apache Camel - Getting a list of files from FTP as a result of a GET request

As the title suggests, I'm trying to get a list of files from an FTP directory to send as the response to a GET request.
I currently have the following REST route implementation:
rest().get("/files")
    .produces(MediaType.APPLICATION_JSON_VALUE)
    .route()
    .routeId("restRouteId")
    .to("direct:getAllFiles");
On the other side of the direct route I have the following routes:
from("direct:getAllFiles")
.routeId("filesDirectId")
.to("controlbus:route" +
"?action=start" +
"&routeId=ftpRoute");
from([ftpurl])
.noAutoStartup()
.routeId("ftpRoute")
.aggregate(constant(true), new FileAggregationStrategy())
.completionFromBatchConsumer()
.process(filesProcessor)
.to("controlbus:route" +
"?action=stop" +
"&routeId=" + BESTANDEN_ROUTE_ID);
The issue is that with this approach the request does not wait for the complete process to finish; it almost instantly returns an empty response with status code 200.
I've tried multiple solutions, but they all fail in one of two ways: either the request gets a response even though the route hasn't finished yet, or the route gets stuck waiting for in-flight exchanges at some point and waits for the 5-minute timeout to continue.
Thanks in advance for your advice and/or help!
Note: I'm working in a Spring Boot (2.0.5) application with Apache Camel (2.22.1).
I think the problem here is that your two routes are not connected. You are using the control bus to start the second route but it doesn't return the value back to the first route - it just completes, as you've noted.
What I think you need (I've not tested it) is something like:
from("direct:getAllFiles")
.routeId("filesDirectId")
.pollEnrich( [ftpurl], new FileAggregationStrategy() )
.process( filesProcessor );
as this will synchronously consume from your FTP endpoint, do the post-processing, and return the values to your REST route.
With the help of @Screwtape's answer I managed to get it working for my specific issue. A few adjustments were needed; here is a list of what you need:
Add the option "sendEmptyMessageWhenIdle=true" to the ftp url
In the AggregationStrategy add an if (newExchange == null) clause
In the clause set a property "finished" to true
Wrap the pollEnrich with a loopDoWhile that checks the finished property
In its entirety it looks something like:
from("direct:ftp")
.routeId("ftpRoute")
.loopDoWhile(!finished)
.pollEnrich("ftpurl...&sendEmptyMessageWhenIdle=true", new FileAggregationStrategy())
.choice()
.when(finished)
.process(filesProcessor)
.end()
.end();
In the AggregationStrategy the aggregate method looks something like:
@Override
public Exchange aggregate(Exchange currentExchange, Exchange newExchange) {
    if (currentExchange == null) {
        return init(newExchange);
    } else {
        if (newExchange == null) {
            currentExchange.setProperty("finished", true);
            return currentExchange;
        }
        return update(currentExchange, newExchange);
    }
}
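The init and update helpers referenced above aren't shown in the answer. Purely as an illustration (untested, assuming the strategy collects file names into a List body inside the same FileAggregationStrategy class, with java.util.ArrayList/List and org.apache.camel.Exchange imported), they might look something like:

// Start the aggregation: put the first file name into a new list body.
private Exchange init(Exchange newExchange) {
    List<String> fileNames = new ArrayList<>();
    fileNames.add(newExchange.getIn().getHeader(Exchange.FILE_NAME, String.class));
    newExchange.getIn().setBody(fileNames);
    return newExchange;
}

// Append the next file name to the list accumulated so far.
@SuppressWarnings("unchecked")
private Exchange update(Exchange currentExchange, Exchange newExchange) {
    List<String> fileNames = currentExchange.getIn().getBody(List.class);
    fileNames.add(newExchange.getIn().getHeader(Exchange.FILE_NAME, String.class));
    return currentExchange;
}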

Writing Verticles that perform CRUD Operations on a file

I'm new to Vert.x and I am trying to implement a small REST API that stores its data in JSON files on the local file system.
So far I have managed to implement the REST API, since Vert.x is very well documented on that part.
What I'm currently looking for are examples of how to build data access objects in Vert.x. How can I implement a verticle that can perform CRUD operations on a text file containing JSON?
Can you provide me any examples? Any hints?
UPDATE 1:
By CRUD operations on a file I'm thinking of the following. Imagine there is a REST resource called Records exposed on the path /api/v1/user/:userid/records/.
In my verticle that starts my HTTP server I have the following routes.
router.get('/api/user/:userid/records').handler(this.&handleGetRecords)
router.post('/api/user/:userid/records').handler(this.&handleNewRecord)
The handler methods handleGetRecords and handleNewRecord send a message over the Vert.x event bus.
request.bodyHandler({ b ->
    def userid = request.getParam('userid')
    logger.info "Reading record for user {}", userid
    vertx.eventBus().send(GET_TIME_ENTRIES.name(), "read time records", [headers: [userId: userid]], { reply ->
        // This handler will be called for every request
        def response = routingContext.response()
        if (reply.succeeded()) {
            response.putHeader("content-type", "text/json")
            // Write to the response and end it
            response.end(reply.result().body())
        } else {
            logger.warn("Reply failed {}", reply.failed())
            response.statusCode = 500
            response.putHeader("content-type", "text/plain")
            response.end('That did not work out well')
        }
    })
})
Then there is another verticle that consumes these messages (GET_TIME_ENTRIES or CREATE_TIME_ENTRY). I think of this consumer verticle as a data access object for records. This verticle can read a file for the given :userid that contains all of that user's records. The verticle is able to:
add a record
read all records
read a specific record
update a record
delete one or all records
Here is the example of reading all records.
vertx.eventBus().consumer(GET_TIME_ENTRIES.name(), { message ->
    String userId = message.headers().get('userId')
    String absPath = "${this.source}/${userId}.json" as String
    vertx.fileSystem().readFile(absPath, { result ->
        if (result.succeeded()) {
            logger.info("About to read from user file {}", absPath)
            def jsonObject = new JsonObject(result.result().toString())
            message.reply(jsonObject.getJsonArray('records').toString())
        } else {
            logger.warn("User file {} does not exist", absPath)
            message.fail(404, "user ${userId} does not exist")
        }
    })
})
What I'm trying to achieve is to read the file like I did above and deserialise the JSON into POJOs (e.g. a List<Records>). This seems much more convenient than working with Vert.x's JsonObject. I don't want to manipulate the JsonObject instance directly.
First of all, your approach using EventBus is fine, in my opinion. It may be a bit slower, because EventBus will serialize/deserialize your objects, but it gives you a very good decoupling.
Example of another approach you can see here:
https://github.com/aesteve/vertx-feeds/blob/master/src/main/java/io/vertx/examples/feeds/dao/RedisDAO.java
Note how every method receives a handler as its last argument:
public void getMaxDate(String feedHash, Handler<Date> handler) {
More coupled, but also more efficient.
And for a more classic and straightforward approach, you can see the official examples:
https://github.com/aokolnychyi/vertx-example/blob/master/src/main/java/com/aokolnychyi/vertx/example/dao/MongoDbTodoDaoImpl.java
You can see that here the DAO is pretty much synchronous, but since the handlers are still async, it's fine anyway.
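To connect this to the original goal (POJOs instead of JsonObject), a file-backed DAO in the same handler-as-last-argument style might look roughly like the sketch below. TimeRecord, the class name and the file layout are purely illustrative, and JsonObject.mapTo relies on Jackson databind being on the classpath (Vert.x 3.4+):

import java.util.List;
import java.util.stream.Collectors;

import io.vertx.core.AsyncResult;
import io.vertx.core.Future;
import io.vertx.core.Handler;
import io.vertx.core.Vertx;
import io.vertx.core.json.JsonObject;

public class RecordFileDao {

    private final Vertx vertx;
    private final String source;

    public RecordFileDao(Vertx vertx, String source) {
        this.vertx = vertx;
        this.source = source;
    }

    // Reads <source>/<userId>.json and hands the caller a List<TimeRecord> instead of a JsonObject.
    public void findAll(String userId, Handler<AsyncResult<List<TimeRecord>>> handler) {
        String absPath = source + "/" + userId + ".json";
        vertx.fileSystem().readFile(absPath, read -> {
            if (read.failed()) {
                handler.handle(Future.failedFuture(read.cause()));
                return;
            }
            JsonObject json = new JsonObject(read.result().toString());
            List<TimeRecord> records = json.getJsonArray("records").stream()
                    .map(entry -> ((JsonObject) entry).mapTo(TimeRecord.class)) // Jackson-based mapping
                    .collect(Collectors.toList());
            handler.handle(Future.succeededFuture(records));
        });
    }
}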
I guess the following link will help you out; it is a good example of Vert.x CRUD operations.
Vertx student crud operations using hikari

Is the following code with Vert.x really reactive?

Do I have a wrong understanding of "reactive" or is something wrong in my example?
I did a small code sample in Vert.x: in a REST service I read data from MongoDB and return it as JSON.
...........
Router router = Router.router(vertx);
router.route().handler(BodyHandler.create());
router.get("/gilders").handler(this::listAll);
vertx.createHttpServer().requestHandler(router::accept).listen(8080);
}

private void listAll(RoutingContext routingContext) {
    mongoClient.find("gliders", new JsonObject(), results -> {
        List<JsonObject> objects = results.result();
        /* Is this non-blocking?!
           mongoClient.find returns immediately, but the REST client only
           gets results after Mongo has delivered all of them. */
        List<Glider> gilder = objects.stream()
            .map(res -> {
                Glider g = new Glider();
                g.setName(res.getString("name"));
                g.setPrice(res.getString("price"));
                return g;
            })
            .collect(Collectors.toList());
        routingContext.response()
            .putHeader("content-type", "application/json; charset=utf-8")
            .end(Json.encodePrettily(gilder));
    });
}
OK, it's not blocking; I could compute something else while waiting for Mongo.
But somehow I thought "reactive" meant that the REST client would already get the first chunks of the Mongo results even while Mongo has not yet finished finding them all (HTTP streaming). But like this, the callback is only invoked once Mongo has found all results.
Reactive is not the same as streaming. Reactive is a concept around data flows: your application reacts to events, e.g. data returned from MongoDB. You can then implement streaming on top of it by asking the Mongo client to start pumping data as soon as it arrives from the network. Conversely, even with a blocking API you could do streaming by blocking the application for data and then passing it one by one to a consumer.
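To make the example stream in that sense, one option (depending on your Vert.x version) is to combine the Mongo client's batch API with a chunked HTTP response. A rough sketch, assuming Vert.x 3.6+ where MongoClient.findBatch returns a ReadStream<JsonObject>, meant to sit in the same verticle as listAll (it additionally needs io.vertx.core.http.HttpServerResponse and java.util.concurrent.atomic.AtomicBoolean):

// Rough sketch: stream documents to the client as Mongo delivers them.
private void listAllStreaming(RoutingContext routingContext) {
    HttpServerResponse response = routingContext.response();
    response.putHeader("content-type", "application/json; charset=utf-8");
    response.setChunked(true); // no Content-Length, so each write is flushed as a chunk

    response.write("[");
    AtomicBoolean first = new AtomicBoolean(true);

    mongoClient.findBatch("gliders", new JsonObject())
        .handler(doc -> {
            // invoked once per document, before the full result set is known
            if (!first.getAndSet(false)) {
                response.write(",");
            }
            response.write(doc.encode());
        })
        .exceptionHandler(err -> response.end()) // kept simple for the sketch
        .endHandler(v -> {
            response.write("]");
            response.end();
        });
}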

ASP.NET MVC2 AsyncController: Does performing multiple async operations in series cause a possible race condition?

The preamble
We're implementing an MVC2 site that needs to consume an external API via HTTPS (we cannot use WCF or even old-style SOAP web services, I'm afraid). We're using AsyncController wherever we need to communicate with the API, and everything is running fine so far.
Some scenarios have come up where we need to make multiple API calls in series, using results from one step to perform the next.
The general pattern (simplified for demonstration purposes) so far is as follows:
public class WhateverController : AsyncController
{
    public void DoStuffAsync(DoStuffModel data)
    {
        AsyncManager.OutstandingOperations.Increment();
        var apiUri = API.getCorrectServiceUri();
        var req = new WebClient();
        req.DownloadStringCompleted += (sender, e) =>
        {
            AsyncManager.Parameters["result"] = e.Result;
            AsyncManager.OutstandingOperations.Decrement();
        };
        req.DownloadStringAsync(apiUri);
    }

    public ActionResult DoStuffCompleted(string result)
    {
        return View(result);
    }
}
We already have several actions that perform API calls in parallel working just fine; we just make multiple requests and ensure that we increment AsyncManager.OutstandingOperations correctly.
The scenario
To perform multiple API service requests in series, we presently call the next step within the event handler for the first request's DownloadStringCompleted, e.g.:
req.DownloadStringCompleted += (sender, e) =>
{
    AsyncManager.Parameters["step1"] = e.Result;
    OtherActionAsync(e.Result);
    AsyncManager.OutstandingOperations.Decrement();
};
where OtherActionAsync is another action defined in this same controller following the same pattern as defined above.
The question
Can calling other async actions from within the event handler cause a possible race when accessing values within AsyncManager?
I tried looking around MSDN but all of the commentary about AsyncManager.Sync() was regarding the BeginMethod/EndMethod pattern with IAsyncCallback. In that scenario, the documentation warns about potential race conditions.
We don't need to actually call another action within the controller, if that is off-putting to you. The code to build another WebClient and call .DownloadStringAsync() on that could just as easily be placed within the event handler of the first request. I have just shown it like that here to make it slightly easier to read.
Hopefully that makes sense! If not, please leave a comment and I'll attempt to clarify anything you like.
Thanks!
It turns out the answer is "No".
(for future reference, in case anyone comes across this question via a search)

Waiting for more than one event (using GWT)

I want to fetch two XML documents from the server and resume processing when both have arrived. Can I fetch them in parallel, or do I have to refrain from issuing the second request until the first has completed?
You can fetch them in parallel, but keep in mind that browsers have a limit on the number of parallel requests; see http://www.browserscope.org/?category=network (choose "Major Versions" in the dropdown on the top left to see more versions). Note especially that IE < 8 has a limit of 2 connections per hostname!
If you still want to do this, then note that the responses can arrive in any order. So you'll have to implement something that will keep track of the requests/responses (a counter or something more sophisticated), so that you'll know when all responses you need have arrived.
The best solution is often to send just one request that asks for both XML documents, and the server returns them both at once in one response.
Make both requests, then check when either one completes whether the other is done, and continue if it is.
private String responseOne;
private String responseTwo;

public void startRequests() {
    makeAsyncRequestOne(new AsyncCallback<String>() {
        public void onSuccess(String response) {
            responseOne = response; // "this" would refer to the callback, so assign the outer field directly
            if (responseTwo != null) {
                proceed();
            }
        }

        public void onFailure(Throwable caught) {
            // handle the failure, e.g. retry or report an error
        }
    });
    makeAsyncRequestTwo(new AsyncCallback<String>() {
        public void onSuccess(String response) {
            responseTwo = response;
            if (responseOne != null) {
                proceed();
            }
        }

        public void onFailure(Throwable caught) {
            // handle the failure
        }
    });
}
As Chris points out, this may hit a ceiling on maximum concurrent requests to the same hostname, so if you have lots of requests to send at once, you could keep a queue of requests and call the next one in proceed() until the queue is exhausted.
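Purely as an illustration of that queue idea (untested, all names are made up), one way to cap the number of requests in flight and start the next one as each completes:

import java.util.LinkedList;
import java.util.Queue;

// Illustrative queue of request starters: at most MAX_PARALLEL are in flight,
// and each completion (the callback should call onRequestFinished()) starts the next one.
public class RequestQueue {

    private static final int MAX_PARALLEL = 2; // conservative for old IE
    private final Queue<Runnable> pending = new LinkedList<>();
    private int inFlight = 0;

    public void add(Runnable requestStarter) {
        pending.add(requestStarter);
        startNext();
    }

    public void onRequestFinished() {
        inFlight--;
        startNext();
    }

    private void startNext() {
        while (inFlight < MAX_PARALLEL && !pending.isEmpty()) {
            inFlight++;
            pending.poll().run(); // e.g. () -> makeAsyncRequestOne(callback)
        }
    }
}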
But if you plan on having a lot of concurrent requests, you probably need to redesign your service anyway, to batch operations together.