Please look at the code below from my controller (comments added), which uses RestTemplate:
@GetMapping("/{courseid}")
public Course getCourseDetails(@PathVariable Long courseid) {
// Get Course info (ID, Name, Description) from pre-populated Array List
CourseInfo courseInfo = getCourseInfo(courseid);
// Get Price info of a course from another microservice using RestTemplate
Price price = restTemplate.getForObject("http://localhost:8002/price/" + courseid, Price.class);
// Get enrollment info of a course from another microservice using RestTemplate
Enrollment enrollment = restTemplate.getForObject("http://localhost:8003/enrollment/" + courseid, Enrollment.class);
// Consolidate everything into a Course object and send it as the response
return new Course(courseInfo.getCourseID(), courseInfo.getCourseName(), courseInfo.getCourseDesc(),
        price.getDiscountedPrice(), enrollment.getEnrollmentOpen());
}
Now I am trying to achieve the same using reactive programming. I now use WebClient and Mono from WebFlux, but I am confused as to how to combine the results. Take a look at the code below (just using Mono everywhere; the rest of the code remained the same):
@GetMapping("/{courseid}")
public Mono<Course> getCourseDetails(@PathVariable Long courseid) {
// Get Course info (ID, Name, Description) from pre-populated Array List
CourseInfo courseInfo = getCourseInfo(courseid);
// Get Price info of a course from another microservice using WebClient
Mono<Price> price = webClient.get().uri("http://localhost:8002/price/{courseid}/", courseid).retrieve().bodyToMono(Price.class);
// Get enrollment info of a course from another microservice using WebClient
Mono<Enrollment> enrollment = webClient.get().uri("http://localhost:8003/enrollment/{courseid}/", courseid).retrieve().bodyToMono(Enrollment.class);
//Question : How do we Consolidate everything and form a Mono<Course> object and send it as response?
}
Question 1 : How do we Consolidate everything and form a Mono object and send it as response?
Question 2 : Does the statement "CourseInfo courseInfo = getCourseInfo(courseid);" cause blocking operation?
Thanks!
Answering:
Question 1 : How do we Consolidate everything and form a Mono object and send it as response?
Mono.zip(...) is what you need to combine the two results (the Reactor reference docs include a marble diagram for it).
Note that zip will result in an empty Mono if either of the zipped Monos is empty! Use switchIfEmpty/defaultIfEmpty to protect against that case; see the sketch after the code below.
Thus the code looks like:
@GetMapping("/{courseid}")
public Mono<Course> getCourseDetails(@PathVariable Long courseid) {
CourseInfo courseInfo = getCourseInfo(courseid);
Mono<Price> priceMono = webClient.get().uri("http://localhost:8002/price/{courseid}/",courseid).retrieve().bodyToMono(Price.class);
Mono<Enrollment> enrollmentMono = webClient.get().uri("http://localhost:8003/enrollment/{courseid}/",courseid).retrieve().bodyToMono(Enrollment.class);
return Mono.zip(priceMono, enrollmentMono).map(t -> new Course(courseInfo.getCourseID(), courseInfo.getCourseName(), courseInfo.getCourseDesc(), t.getT1().getDiscountedPrice(),
t.getT2().getEnrollmentOpen()));
}
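If one of the two services can legitimately return an empty body, a minimal sketch of guarding the zip could look like the following (the exception messages here are just placeholders):

return Mono.zip(
        priceMono.switchIfEmpty(Mono.error(new IllegalStateException("No price found for course " + courseid))),
        enrollmentMono.switchIfEmpty(Mono.error(new IllegalStateException("No enrollment found for course " + courseid))))
    .map(t -> new Course(courseInfo.getCourseID(), courseInfo.getCourseName(), courseInfo.getCourseDesc(),
        t.getT1().getDiscountedPrice(), t.getT2().getEnrollmentOpen()));

Alternatively, defaultIfEmpty(fallbackValue) on each Mono keeps the pipeline going with a fallback instead of an error.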
Now answering:
Question 2 : Does the statement "CourseInfo courseInfo = getCourseInfo(courseid);" cause blocking operation?
Since you mentioned that you get the course info (ID, Name, Description) from a pre-populated ArrayList: if it is just an in-memory list containing the course information, then it's not blocking.
But (as @mslowiak also mentioned), if getCourseInfo contains logic which involves querying a database, make sure that you are not using a blocking JDBC driver. If you are, there is no point in using WebFlux and Reactor; use Spring Data R2DBC in that case.
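If getCourseInfo ever does become a blocking call (JDBC, a remote lookup, etc.) and a reactive driver is not an option, a common Reactor pattern is to wrap it in Mono.fromCallable and move it off the event loop. A minimal sketch (this is generic Reactor usage, not something from the question's code; it requires reactor.core.scheduler.Schedulers):

Mono<CourseInfo> courseInfoMono = Mono.fromCallable(() -> getCourseInfo(courseid))
        .subscribeOn(Schedulers.boundedElastic()); // run the (potentially) blocking call on a worker thread

return Mono.zip(courseInfoMono, priceMono, enrollmentMono)
        .map(t -> new Course(t.getT1().getCourseID(), t.getT1().getCourseName(), t.getT1().getCourseDesc(),
                t.getT2().getDiscountedPrice(), t.getT3().getEnrollmentOpen()));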
restTemplate.getForObject returns a plain object - in your case Price or Enrollment. To convert them to Mono you can simply use Mono.just(object); however, a better solution would be to switch to WebClient, which is the default HTTP client for Spring WebFlux.
For getCourseInfo, it depends what logic is behind this method. If there is a JDBC connection behind it, it is certainly blocking.
To build the final Mono<Course> response you should look at the zip operator, which will help you with that.
For example:
Mono<Course> courseMono = Mono.zip(price, enrollment)
.map(tuple -> new Course(courseInfo, tuple.getT1(), tuple.getT2()));
Our test automation needs to interact with Kafka, and we are looking at how we can achieve this with Karate.
We have a Java class which reads from Kafka and puts records in an internal list. We then ask for these records from Karate, filter out all messages from background traffic, and return the first message that matches our filter.
So our consumer looks like this (simplified):
// consume.js
function(bootstrapServers, topic, filter, timeout, interval) {
var KafkaLib = Java.type('kafka.KafkaLib')
var records = KafkaLib.getRecords(bootstrapServers, topic)
for (record_id in records) {
// TODO here we want to convert record to a json (and later xml for xml records) so that
// we can access them as 'native' karate data types and use notation like: cat.cat.scores.score[1]
var record = records[record_id]
if (filter(record)) {
karate.log("Record matched: " + record)
return record
}
}
throw "No records found matching the filter: " + filter
}
Records can be JSON, XML, or plain text, but let's look at the JSON case for now.
In this case, given that there is a message like this in Kafka:
{"correlationId":"b3e6bbc7-e5a6-4b2a-a8f9-a0ddf435de67","text":"Hello world"}
This is loaded as a string in the record variable above.
We want to convert this to json so that a filter like this would work:
* def uuid = java.util.UUID.randomUUID() + ''
# This is what we are publishing to kafka
* def payload = ({ correlationId: uuid, text: "Hello world" })
* def filter = function(m) { return m.correlationId == uuid }
Is there a way to convert a string to a native Karate variable in JavaScript? I might have missed it looking at https://intuit.github.io/karate/#the-karate-object. By the way, var jsonRecord = karate.toJson(record) did not work, and jsonRecord.uuid was undefined.
Edit: I have made an example of what I am trying to achieve here:
https://github.com/KostasKgr/karate-issues/blob/java_json_interop/src/test/java/examples/consumption/consumption.feature
Many thanks
Some time ago I put together something that can be used to test Kafka from within Karate. Please see if https://github.com/Sdaas/karate-kafka helps. Happy to enhance / improve it if it helps you.
Can you try,
* json payload = { correlationId: uuid, text: "Hello world" }
ref : Type Conversion
For type conversion within JavaScript, ideally karate.toMap(object) or karate.toJson(object) should work.
Rather than wrapping everything up into one JS function, I would suggest keeping the record-fetching part outside the JS and letting Karate cast it:
* json records = Java.type('kafka.KafkaLib').getRecords(bootstrapServers, topic)
* consume(records, filter, timeout, interval)
As mentioned in the comments of another answer, there is now an enhancement ticket on karate to achieve what was discussed in this thread, see https://github.com/intuit/karate/issues/1202
Until that is in place, I managed to get most of what I wanted for JSON by parsing the string to JSON in Java and returning that to Karate:
Map<String,Object> result = new ObjectMapper().readValue(record, HashMap.class);
I am not sure if the same can be worked around for XML.
You can see the workaround in action here:
https://github.com/KostasKgr/karate-issues/blob/java_json_interop_v2/src/test/java/examples/consumption/consumption.feature
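For reference, a minimal sketch of the Java-side glue behind that workaround (the recordToMap method is hypothetical; it assumes Jackson is on the classpath). Karate treats a returned java.util.Map as native JSON, so a filter like m.correlationId == uuid then works:

import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.HashMap;
import java.util.Map;

public class KafkaLib {
    // Hypothetical helper: parse a raw record string into a Map so that
    // Karate sees it as JSON instead of a plain string.
    public static Map<String, Object> recordToMap(String record) throws Exception {
        return new ObjectMapper().readValue(record, HashMap.class);
    }
}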
Because of Karate's support for Java inter-op you can easily write some "glue" code to connect your existing Kafka systems to Karate test-suites, see the first link below.
Here are a few references:
how to use Java inter-op to listen and wait for events: https://twitter.com/KarateDSL/status/1417023536082812935
the Karate ActiveMQ example: https://github.com/intuit/karate/tree/master/karate-netty#consumer-provider-example
Walmart Labs blog post (Kafka specific): https://medium.com/walmartglobaltech/kafka-automation-using-karate-6a129cfdc210
Karate Kafka (3rd party project / example): https://github.com/Sdaas/karate-kafka
I'm new to Vert.x and I am trying to implement a small REST API that stores its data in JSON files on the local file system.
So far I have managed to implement the REST API, since Vert.x is very well documented on that part.
What I'm currently looking for are examples of how to build data access objects in Vert.x. How can I implement a verticle that can perform CRUD operations on a text file containing JSON?
Can you provide me any examples? Any hints?
UPDATE 1:
By CRUD operations on a file I mean the following. Imagine there is a REST resource called Records exposed on the path /api/v1/user/:userid/records/.
In my verticle that starts my HTTP server I have the following routes.
router.get('/api/user/:userid/records').handler(this.&handleGetRecords)
router.post('/api/user/:userid/records').handler(this.&handleNewRecord)
The handler methods handleGetRecords and handleNewRecord send a message on the Vert.x event bus.
request.bodyHandler({ b ->
def userid = request.getParam('userid')
logger.info "Reading record for user {}", userid
vertx.eventBus().send(GET_TIME_ENTRIES.name(), "read time records", [headers: [userId: userid]], { reply ->
// This handler will be called for every request
def response = routingContext.response()
if (reply.succeeded()) {
response.putHeader("content-type", "text/json")
// Write to the response and end it
response.end(reply.result().body())
} else {
logger.warn("Reply failed {}", reply.failed())
response.statusCode = 500
response.putHeader("content-type", "text/plain")
response.end('That did not work out well')
}
})
})
Then there is another verticle that consumes these messages (GET_TIME_ENTRIES or CREATE_TIME_ENTRY). I think of this consumer verticle as a Data Access Object for Records. This verticle can read a file for the given :userid that contains all of that user's records. The verticle is able to
add a record
read all records
read a specific record
update a record
delete a or all records
Here is the example of reading all records.
vertx.eventBus().consumer(GET_TIME_ENTRIES.name(), { message ->
String userId = message.headers().get('userId')
String absPath = "${this.source}/${userId}.json" as String
vertx.fileSystem().readFile(absPath, { result ->
if (result.succeeded()) {
logger.info("About to read from user file {}", absPath)
def jsonObject = new JsonObject(result.result().toString())
message.reply(jsonObject.getJsonArray('records').toString())
} else {
logger.warn("User file {} does not exist", absPath)
message.fail(404, "user ${userId} does not exist")
}
})
})
What I am trying to achieve is to read the file like I did above and deserialise the JSON into a POJO (e.g. a List<Records>). This seems much more convenient than working with Vert.x's JsonObject. I don't want to manipulate the JsonObject instance.
First of all, your approach using EventBus is fine, in my opinion. It may be a bit slower, because EventBus will serialize/deserialize your objects, but it gives you a very good decoupling.
Example of another approach you can see here:
https://github.com/aesteve/vertx-feeds/blob/master/src/main/java/io/vertx/examples/feeds/dao/RedisDAO.java
Note how every method receives a handler as its last argument:
public void getMaxDate(String feedHash, Handler<Date> handler) {
More coupled, but also more efficient.
And for a more classic and straightforward approach, you can see the official examples:
https://github.com/aokolnychyi/vertx-example/blob/master/src/main/java/com/aokolnychyi/vertx/example/dao/MongoDbTodoDaoImpl.java
You can see that here the DAO is pretty much synchronous, but since the handlers are still async, it's fine anyway.
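As for the POJO mapping the question asks about, a minimal sketch using Vert.x's built-in Jackson integration could look like this (it assumes jackson-databind is on the classpath and a Records POJO whose fields match the JSON stored in the user file):

import io.vertx.core.json.JsonArray;
import io.vertx.core.json.JsonObject;
import java.util.List;
import java.util.stream.Collectors;

public class RecordsMapper {
    // Convert the raw file contents into a list of POJOs instead of JsonObject instances.
    public static List<Records> toRecords(String fileContents) {
        JsonArray array = new JsonObject(fileContents).getJsonArray("records");
        return array.stream()
                .map(entry -> ((JsonObject) entry).mapTo(Records.class)) // mapTo delegates to Jackson
                .collect(Collectors.toList());
    }
}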
I guess the following link will help you out; it is a good example of Vert.x CRUD operations.
Vertx student crud operations using hikari
I'm learning the Spring 4 stuff by converting an existing Spring 3 project. In that project I have a custom query. That query fetches data in a straightforward way, after which some heavy editing is done to the query results. Now the data is sent to the caller.
I plan on extending CrudRepository for most of my simple query needs. The data will be output in HATEOAS format.
For this custom query I think I should be adding custom behavior (spring.io, "Working with Spring Data Repositories", Section 1.3.1, "Adding custom behavior to single repositories").
As an example:
@Transactional(readOnly = true)
public List<Offer> getFiltered(List<Org> orgs, OfferSearch criteria) {
List<Offer> filteredOffers = getDateTypeFiltered(criteria);
filteredOffers = applyOrgInfo(orgs, filteredOffers);
filteredOffers = applyFilterMatches(filteredOffers, criteria);
return sortByFilterMatches(filteredOffers);
}
(The code merely illustrates that I don't have a simple value fetch going on.)
If I could use the raw results of getDateTypeFiltered(criteria) then I could put that into a CrudRepository interface and the output would be massaged into HATEOAS by the Spring libraries. But I must do my massaging in an actual Java object, and I don't know how to tell Spring to take my output and emit it in my desired output format.
Is there an easy way to get there from here? Or must I try things like do my filtering in the browser?
Thanks,
Jerome.
To properly get HAL formatted results, your query controllers must return some form of Spring HATEOAS Resource type.
#RequestMapping(method = RequestMethod.GET, value = "/documents/search/findAll")
public ResponseEntity<?> findAll() {
List<Resource<Document>> docs = new ArrayList<>();
docs.add(new Resource<Document>(new Document("doc1"), new Link("localhost")));
docs.add(new Resource<Document>(new Document("doc2"), new Link("localhost")));
Resources<Resource<Document>> resources = new Resources<Resource<Document>>(docs);
resources.add(linkTo(methodOn(ApplicationController.class).findAll()).withSelfRel());
resources.add(entityLinks.linkToCollectionResource(Document.class).withRel("documents"));
return ResponseEntity.ok(resources);
}
I have submitted a pull request to Spring Data REST to update its reference docs to specify this in http://docs.spring.io/spring-data/rest/docs/2.4.0.RELEASE/reference/html/#customizing-sdr.overriding-sdr-response-handlers
I am not sure I perfectly understood your question. If I did, this should be the answer: http://docs.spring.io/spring-data/jpa/docs/1.9.0.RELEASE/reference/html/#repositories.custom-implementations
I am building a REST API and I am facing this issue: how can a REST API return very large JSON?
Basically, I want to connect to the database and return the training data. The problem is that the database holds 400,000 records. If I wrap them all into one JSON response and return it from the GET method, the server throws a heap overflow exception.
What methods can we use to solve this problem?
DBTraining trainingdata = new DBTraining();
@GET
@Produces("application/json")
@Path("/{cat_id}")
public Response getAllDataById(@PathParam("cat_id") String cat_id) {
List<TrainingData> list = new ArrayList<TrainingData>();
try {
list = trainingdata.getAllDataById(cat_id);
Gson gson = new Gson();
Type dataListType = new TypeToken<List<TrainingData>>() {
}.getType();
String jsonString = gson.toJson(list, dataListType);
return Response.ok().entity(jsonString).header("Access-Control-Allow-Origin", "*").header("Access-Control-Allow-Methods", "GET").build();
} catch (SQLException e) {
logger.warn(e.getMessage());
}
return null;
}
The RESTful way of doing this is to create a paginated API. First, add query parameters for the page number and page size (with a cap on the maximum page size); use sensible defaults if they are not provided or if unrealistic values are provided. Second, modify the database query to retrieve only the corresponding subset of the data; convert that to JSON and use it as the payload of your response. Finally, following HATEOAS principles, provide links to the next page (provided you're not on the last page) and the previous page (provided you're not on the first page). For bonus points, provide links to the first and last pages as well.
By designing your endpoint this way, you get very consistent performance characteristics and can handle data sets that continue to grow.
The GitHub API provides a good example of this.
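As a rough sketch, a paginated variant of the endpoint from the question could look like the following in JAX-RS (the query parameter names and the getDataPage method on DBTraining are hypothetical; the HATEOAS links are omitted for brevity):

@GET
@Produces("application/json")
@Path("/{cat_id}")
public Response getDataPageById(@PathParam("cat_id") String cat_id,
                                @QueryParam("page") @DefaultValue("1") int page,
                                @QueryParam("size") @DefaultValue("1000") int size) {
    try {
        // Hypothetical DAO method that runs a LIMIT/OFFSET query instead of loading all 400,000 rows
        List<TrainingData> pageData = trainingdata.getDataPage(cat_id, page, size);
        String jsonString = new Gson().toJson(pageData);
        return Response.ok().entity(jsonString)
                .header("Access-Control-Allow-Origin", "*")
                .header("Access-Control-Allow-Methods", "GET")
                .build();
    } catch (SQLException e) {
        logger.warn(e.getMessage());
        return Response.serverError().build();
    }
}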
My suggestion is not to pass the data as JSON but as a file, using multipart/form-data. In your file, each line could be a JSON document representing one data record. Then it is easy to use a FileOutputStream to receive the file, and you can process the file line by line to avoid memory problems.
A Grails example:
if(params.myFile){
if(params.myFile instanceof org.springframework.web.multipart.commons.CommonsMultipartFile){
def fileName = "/tmp/myReceivedFile.txt"
new FileOutputStream(fileName).leftShift(params.myFile.getInputStream())
}
else
//print or signal error
}
You can use curl to pass your file:
curl -F "myFile=@/mySendingFile.txt" http://acme.com/my-service
More details on a similar solution on https://stackoverflow.com/a/13076550/2476435
HTTP has the notion of chunked encoding, which allows you to send an HTTP response body in smaller pieces so that the server does not have to hold the entire response in memory. You need to find out how your server framework supports chunked encoding.
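With the JAX-RS stack shown in the question, one way to take advantage of this is StreamingOutput, which writes the response incrementally instead of materializing one huge string. A minimal sketch (the iterateDataById method on DBTraining is hypothetical and would need to fetch rows lazily):

// Uses javax.ws.rs.core.StreamingOutput and com.google.gson.stream.JsonWriter
@GET
@Produces("application/json")
@Path("/{cat_id}")
public Response streamAllDataById(@PathParam("cat_id") String cat_id) {
    StreamingOutput stream = output -> {
        Gson gson = new Gson();
        try (JsonWriter writer = new JsonWriter(new OutputStreamWriter(output, StandardCharsets.UTF_8))) {
            writer.beginArray();
            for (TrainingData row : trainingdata.iterateDataById(cat_id)) { // hypothetical lazy iterator
                gson.toJson(row, TrainingData.class, writer);
            }
            writer.endArray();
        }
    };
    return Response.ok(stream).build();
}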
I have an ASP.NET MVC 2 application which in part allows a user to filter data and view that data in a JQGrid.
Currently this consists of a controller which initialises my filter model and configures how I wish my grid to be displayed. This information is used by a view and a partial view to display the filter and the grid shell. I use an editor template to display my filter. The jqGrid makes use of a JsonResult controller action (GET) to retrieve the results of the filter (with the addition of the paging offered by the grid - only a single page of data is returned by the GET request). The URI used by the grid to request data contains the filter model as a RouteValue - currently a string representation of the current state of the filter. A custom IModelBinder is used to convert this representation back into an instance of the filter model class.
The user can change the filter and press a submit button to get different results - this is then picked up by an (HttpPost) ViewResult action which takes the filter model (reconstituted by a further model binder) and causes the grid shell to be updated.
So I have:
FilterModel
Represents the user's desired filtering characteristics
FilterModelEditorTemplateSubmissionBinder : DefaultModelBinder - used to convert the request information supplied from a user changing their filtering characteristics into the appropriate FilterModel instance.
FilterModelStringRepresentationBinder : IModelBinder - used to convert the encoded filter from the JQGrid GET request for data so the correct request is made of the service which is ultimately performing the query and returning the relevant data.
ViewResult Index() - constructs a default filter, configures the grid specification and returns the view to render the filter's editor template, and the grid shell.
[HttpPost]ViewResult Filter(FilterModel filter) - takes the new filter characteristics and returns the same view as Index(). Uses FilterModelEditorTemplateSubmissionBinder to bind the filter model.
JsonResult GetData(FilterModel filter, string sidx, string sord, int page, int rows) - called from the JQGrid in order to retrieve the data. Uses FilterModelStringRepresentationBinder to bind the filter model.
As a complication, my filter model contains an option to select a single value from a collection of items. This collection is retrieved from a service request, and I don't want to keep querying for this data every time I show the filter; currently I fetch it if the property is null, then include the options hidden in the editor template and encoded in the string representation. These options are then reconstituted by the relevant model binder.
Although this approach works, I can't help but feel that I am having to basically reinvent viewstate in order to maintain my filter and the included options. As I am new to ASP.NET MVC but am very happy with classic ASP and ASP.NET Web Forms, I thought I'd throw this out there for comment and guidance on finding a way which more closely fits the MVC pattern.
It seems to me that the best way is to separate the actions which provide pure data for the jqGrid from the other controller actions. Such jqGrid-oriented actions can have a prototype like:
JsonResult GetData(string filter, string sidx, string sord, int page, int rows)
I personally prefer to implement this part as a WCF service and to have this WCF service as a part of the same ASP.NET site. In general it is much more a matter of taste and depends on your other project requirements.
This part of your ASP.NET site can implement the user authentication which you need and can be tested with unit tests exactly like the other actions of your controllers.
The views of the ASP.NET MVC site can contain empty jqGrids, have only the correct URLs, and probably generate the HTML depending on the user's permissions in the site. Every page will then fill its jqGrids with data through the corresponding requests to the server (requests to the corresponding GetData action).
You can use HTTP GET for the data to get the best data caching. The caching of data is the subject of a separate discussion. If you do this, you should use prmNames: { nd:null } in the definition of the jqGrid to remove the unique nd parameter (a timestamp) that is added by default to every GET request. To have full control of the data caching on the server side you can, for example, add to the HTTP headers of the server responses both a "Cache-Control" header set to "max-age=0" and an "ETag" header with a value calculated from the data returned in the response. You should then test whether the request from the client has an "If-None-Match" HTTP header whose ETag value corresponds to the data cached on the client. If so, verify whether the current data on the server (in the database) have changed and, if they have not, generate a response with an empty body (set SuppressEntityBody to true) and return the "304 Not Modified" status code (HttpStatusCode.NotModified) instead of the default "200 OK". A more detailed explanation would be much longer.
If you don't want to optimize your site for caching of HTTP GET data for jqGrids, you can either use HTTP POST or not use the prmNames: { nd:null } parameter.
The code inside of JsonResult GetData(string filter, string sidx, string sord, int page, int rows) is not very short, of course. You should deserialise the JSON data from the filter string and then construct the request to the data model, depending on the data access method you use (LINQ to SQL, Entity Framework, or SqlCommand with SqlDataReader). Because you have this part already implemented, there is no sense in discussing it here.
Probably the main part of my suggestion is the clear separation between the controller actions which provide the data for all your jqGrids and the MVC views with empty grids (having only <table id="list"></table><div id="pager"></div>). You should also not worry about having relatively long code for analyzing the filters which come from the Advanced Searching feature of the jqGrid and generating the corresponding requests to your data model. Just implement it once. In my implementation this code is also relatively complex, but it is written only once, it works, and it can be reused for all new jqGrids.
I made this once, very simple.
pseudo code:
Controller
[HttpGet]
public ActionResult getList(int? id){
    return PartialView("Index", new ListViewModel(id ?? 0));
}
ViewModel
public class ListViewModel{
    //ObjectAmountPerPage is the amount of objects you want per page; you can expose this as a
    //parameter so the user can choose the amount
    public int ObjectAmountPerPage = 20; //you can fill this from anything: db/configfile/parameter
    public List<YourObjectName> ObjectList;
    public int CurrentPage;
    public ListViewModel(int id){
        CurrentPage = id;
        using (MyDataContext db = new MyDataContext()){
            ObjectList = db.YourObjectName
                .OrderBy(o => o.SomeField)
                .Skip(id * ObjectAmountPerPage)
                .Take(ObjectAmountPerPage)
                .ToList();
        }
    }
}
Now Create A RenderPartial:
PartialView
<%@ Control Inherits="System.Web.Mvc.ViewUserControl<ListViewModel>" %>
<% foreach(YourObjectName item in Model.ObjectList){ %>
    <%-- render a table row with your fields here --%>
<% } %>
And create a view that implements your Jquery, other components+your partialView
View
<script type="text/javascript">
$(function(){
    $("#nextpage").click(function(){
        $.get("/controller/getlist/" + $("#nextpage").val(), function(data){
            $("#yourlist").html(data);
        });
    });
});
</script>
<div id="yourlist">
    <% Html.RenderPartial("YourPartialView", new ListViewModel(0)); %>
</div>
<button id="nextpage" value="<%= Model.CurrentPage + 1 %>">next page</button>
I hope this helps; this follows the MVC principle with a view model per view ;)
Model - View - (ViewModel) - Controller