Validating input parameters for JAX-RS methods

I'm writing a JAX-RS API, and am using the Response class as the return type of each method. However, I'm trying to figure out the "best" approach to validate parameters.
I've written some REST APIs before, and would usually have custom validation routines in the method and then have a custom return object with validation messages in it. I'm thinking I want to do the same here, but is that "preferred"?
I know there are annotations like @NotNull, etc. that you can apply and that provide custom validation messages, but I don't really like the way that ends up looking.
So, what I did is write a return-object bean that I'm using as the .entity() for my JAX-RS Response, and I'm putting all of my validation messages in there. I use the same return object for successes and failures; it's just a matter of which fields I populate depending on the scenario. This is an internal API, so there won't be any external consumers. I just wanted to standardize the return type so it always returns the same "object".
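Roughly, the shape of it is something like the sketch below (the class and field names are just illustrative, not my actual code):

import java.util.ArrayList;
import java.util.List;

import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;

// One result bean used as the entity for both successes and failures.
class ApiResult {
    public boolean success;
    public Object payload;                                       // populated on success
    public List<String> validationMessages = new ArrayList<>();  // populated on failure
}

@Path("/widgets")
public class WidgetResource {

    @POST
    @Produces(MediaType.APPLICATION_JSON)
    public Response create(String input) {
        ApiResult result = new ApiResult();
        if (input == null || input.isEmpty()) {                  // hand-rolled validation
            result.success = false;
            result.validationMessages.add("input must not be empty");
            return Response.status(Response.Status.BAD_REQUEST).entity(result).build();
        }
        result.success = true;
        result.payload = "created";                              // stand-in for real work
        return Response.ok(result).build();
    }
}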
Does that sound like a good approach? I'm a little rusty on REST API best practices, and I've been googling around like crazy but haven't really reached any conclusions about what the best practice is.

Input validation for a REST API is not straightforward. There are a lot of internet resources, but few cover an elegant way of doing it.
As you mentioned, the trivial input validations can be done using the Bean Validation annotations supported by the various JAX-RS implementations.
These annotations can be quite sophisticated, since regular expressions are supported (@Pattern), which will help you cover more validation cases than @NotNull, @Size, etc.
These annotations also accept a message attribute, which lets you customize the message that is returned to the user.
They may not look pretty (especially when a regex is involved), but I would still prefer them over writing my own validator.
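As a rough sketch (the parameter names are made up, and this assumes your JAX-RS implementation has Bean Validation wired in, e.g. via Hibernate Validator), it looks like this:

import javax.validation.constraints.NotNull;
import javax.validation.constraints.Pattern;
import javax.validation.constraints.Size;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.QueryParam;
import javax.ws.rs.core.Response;

@Path("/accounts")
public class AccountResource {

    @GET
    public Response find(
            @NotNull(message = "customerId is required")
            @Pattern(regexp = "\\d{1,10}", message = "customerId must be numeric")
            @QueryParam("customerId") String customerId,

            @Size(max = 50, message = "name must be at most 50 characters")
            @QueryParam("name") String name) {

        // With a Bean Validation provider integrated into the JAX-RS runtime,
        // invalid parameters never reach this point; the runtime answers with
        // a 400 carrying the messages declared above.
        return Response.ok().build();
    }
}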
The other part of validation that is a bit trickier is when you want to validate constraints that are really business logic:
say, for example, you have this requirement: if parameter A has value X, parameter B must not be empty; otherwise it is OK for parameter B to be empty.
This is not something you can handle with the usual javax.validation.constraints.* annotations, and I didn't find a good library that could handle it.
Take a look at javax.validation.ConstraintValidator (and the javax.validation.ConstraintViolation results it produces):
you can write your own custom validation logic that will be called whenever the framework intercepts a call to your API.
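For that kind of rule, one option is a class-level constraint with your own validator. A minimal sketch (all names invented), assuming a Bean Validation provider is present:

import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

import javax.validation.Constraint;
import javax.validation.ConstraintValidator;
import javax.validation.ConstraintValidatorContext;
import javax.validation.Payload;

// Class-level constraint: if paramA equals "X", paramB must not be empty.
@Target(ElementType.TYPE)
@Retention(RetentionPolicy.RUNTIME)
@Constraint(validatedBy = BRequiredWhenAIsXValidator.class)
@interface BRequiredWhenAIsX {
    String message() default "paramB is required when paramA is X";
    Class<?>[] groups() default {};
    Class<? extends Payload>[] payload() default {};
}

// The request bean carries the constraint; accept it in the resource method as
// e.g. public Response search(@Valid SearchRequest request) { ... }
@BRequiredWhenAIsX
class SearchRequest {
    public String paramA;
    public String paramB;
}

class BRequiredWhenAIsXValidator
        implements ConstraintValidator<BRequiredWhenAIsX, SearchRequest> {

    @Override
    public void initialize(BRequiredWhenAIsX constraint) {
        // no configuration needed
    }

    @Override
    public boolean isValid(SearchRequest value, ConstraintValidatorContext ctx) {
        if (value == null || !"X".equals(value.paramA)) {
            return true;                    // the rule only applies when paramA is X
        }
        return value.paramB != null && !value.paramB.isEmpty();
    }
}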

Related

CDK constructs: passing values to constructors vs using the context values

I wrote a small CDK construct that parses the logs in a CloudWatch log group via a Lambda and sends a mail when a pattern is matched. This allows a developer to be notified via an SNS topic, should an error appear in the logs.
The construct needs to know which log group to monitor and which pattern to look for. These are currently passed in as parameters to its constructor. The user of my small construct library is supposed to use this construct as part of their stack. However, one could also define these values as parameters, or, given what the docs say, better yet as context values, basically using this construct in a standalone app.
Would this be an appropriate use of the context? What else is it useful for?
It's hard to give a definitive answer, but I would recommend always passing properties to a construct explicitly through its constructor.
A) This creates consistency with the rest of the constructs.
B) Nothing is 'hidden' in your construct definition.
The only thing I've generally found context useful for is passing in parameters from the CLI, but even that is pretty rare and there are often better ways to do it.
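To illustrate the difference (sketched in Java here; the construct and property names are invented, not from your library), explicit props keep everything visible at the call site, while context values are looked up implicitly from cdk.json or -c flags:

import software.constructs.Construct;

// Hypothetical props object: everything the construct needs is explicit at the call site.
class LogAlertProps {
    final String logGroupName;
    final String pattern;
    LogAlertProps(String logGroupName, String pattern) {
        this.logGroupName = logGroupName;
        this.pattern = pattern;
    }
}

class LogAlertConstruct extends Construct {

    // Option A: explicit props, consistent with the rest of the CDK constructs.
    LogAlertConstruct(Construct scope, String id, LogAlertProps props) {
        super(scope, id);
        // ... create the Lambda, subscription filter and SNS topic from props ...
    }

    // Option B (what the question asks about): pulling the values out of the context,
    // e.g. supplied with `cdk deploy -c logGroupName=... -c pattern=...` or cdk.json.
    static String fromContext(Construct scope, String key) {
        Object value = scope.getNode().tryGetContext(key);
        return value == null ? null : value.toString();
    }
}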

Java: Fast and generic gateway Data to Soap

I want to build a generic gateway from a nested map (generated from a binary data stream) to SOAP clients.
Background: a non-Java application which needs to call SOAP services can't generate JSON or SOAP/XML, but can easily generate a custom protocol (which is under our control).
So a proxy is needed. That proxy should not have to be rewritten on every change of the WSDL or rollout of the next web service.
My plan is:
to have url, port and service name (url:port/service-name) as strictly defined parameters of that proxy,
to have the SOAP action as a strictly defined parameter,
to request the WSDL from url:port/service-name?wsdl (possibly cached) and initiate the stub call dynamically (also cached; see the sketch below),
to fill the values that are present in the nested map into that stub,
to call the SOAP service,
to convert the answer back to the binary protocol.
If some necessary values are missing, it should send the equivalent of a SOAP fault.
All that, of course, with small (affordable) latency, high stability, absolutely minimal deployment downtime (for updates), and under quite some load.
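For the dynamic stub call mentioned in the plan above, what I have in mind is roughly JAX-WS Dispatch; this is just a sketch with placeholder names, not working code from my proxy:

import java.net.URL;

import javax.xml.namespace.QName;
import javax.xml.soap.SOAPMessage;
import javax.xml.ws.BindingProvider;
import javax.xml.ws.Dispatch;
import javax.xml.ws.Service;

public class DynamicSoapCaller {

    // url:port/service-name, the service/port QNames and the SOAPAction are the
    // strictly defined parameters of the proxy.
    public SOAPMessage call(String endpoint, QName serviceName, QName portName,
                            String soapAction, SOAPMessage request) throws Exception {

        // Read the (possibly cached) WSDL and build a dynamic client, no generated stubs.
        Service service = Service.create(new URL(endpoint + "?wsdl"), serviceName);
        Dispatch<SOAPMessage> dispatch =
                service.createDispatch(portName, SOAPMessage.class, Service.Mode.MESSAGE);

        dispatch.getRequestContext().put(BindingProvider.SOAPACTION_USE_PROPERTY, Boolean.TRUE);
        dispatch.getRequestContext().put(BindingProvider.SOAPACTION_URI_PROPERTY, soapAction);

        // The request message itself gets filled from the nested map elsewhere;
        // missing mandatory values would be turned into a SOAP fault before this call.
        return dispatch.invoke(request);
    }
}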
I see several possibilities:
a) Using an ESB like WSO2 ESB. There I would implement the stream format as a special input format adapter, convert it to the internal XML stream (at least the JSON adapters seem to work that way) and send it to a mediator. That mediator would try something like "Creating a Dynamic Client" in
http://today.java.net/pub/a/today/2006/12/13/invoking-web-services-using-apache-axis2.html and call the SOAP service directly.
b) using MOM middleware like Apache ActiveMQ with Camel,
c) reducing it to something like Apache Karaf and CXF.
I'm a bit lost between all those possibilities, and those are just more or less arbitrary samples of each kind.
Thoughts on a):
minus: it feels a bit odd to have no ESB target, since the mediator would directly issue the given SOAP requests
minus: I wonder whether internally converting to an XML stream wouldn't cost extra time and resources
minus: changing the code requires a restart of WSO2 ESB, as far as I understand it
plus: instead of url, port and service name I could define symbolic names which are resolved via the ESB, provided that doesn't take extra milliseconds.
For b) I have not yet checked how easy those format conversions are in Camel and whether SOAP service requests fit into message sending and queueing.
I have already done some searching on this topic, but it's really confusing because of the overlapping scopes of quite different products. I thought this was a standard problem, but apparently there are no obvious solutions; at least I didn't find them.
I hope to get a clue as to which of these solutions could lead to trouble or a lot of work (and which to easy success), and I hope there is some reason in my approach.
Thanks for any qualified comments!
Marco

Parsing GWT RPC POST request/response

I'm using GWT-RPC to get the client data, and my requirement is to parse the payload to retrieve the data inside. I need to log or persist this data for metrics/monitoring purposes.
I'm using a Servlet Filter to intercept the HTTP requests. I can see that the request looks something like this:
5|0|7|http://localhost:8080/testproject|
29F4EA1240F157649C12466F01F46F60|com.test.client.GreetingService|
greetServer|java.lang.String|myInput1|myInput2|1|2|3|4|2|5|5|6|7|
Is there any standard mechanism to parse this data? I'm afraid writing my own parsing code is not a good solution, as this request payload is going to get complex once we pass custom objects to/from RPC, and GWT-RPC's internal parsing mechanism could change in the future, which would break my code. I came across this, but I'm not sure whether it is robust/maintained.
Is there any alternative? Any pointers will be appreciated.
Use the RPC class from GWT.
You'll have to provide the serialization policy, whose strong name is sent along with the request (it's the hash you can see in the payload above).
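Roughly (the class name is a placeholder; for payloads that carry custom objects you would also pass a SerializationPolicyProvider):

import com.google.gwt.user.server.rpc.RPC;
import com.google.gwt.user.server.rpc.RPCRequest;

public class GwtRpcPayloadDecoder {

    // 'payload' is the raw POST body captured in the filter, e.g. "5|0|7|http://..."
    public RPCRequest decode(String payload) {
        // The single-argument form falls back to GWT's legacy policy; use the overload
        // taking a SerializationPolicyProvider when custom object graphs are involved.
        RPCRequest rpcRequest = RPC.decodeRequest(payload);

        rpcRequest.getMethod();      // java.lang.reflect.Method of the service call
        rpcRequest.getParameters();  // the deserialized arguments as real Java objects
        return rpcRequest;
    }
}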
Decoding responses is harder. You can use com.google.gwt.user.client.rpc.impl.RequestCallbackAdapter.ResponseReader along with a com.google.gwt.user.client.rpc.impl.ClientSerializationStreamReader, but you'll need the JsParser from gwt-dev.jar on the classpath; and you cannot have gwt-dev.jar in a web application, as it contains the servlet API (among others), so you'll have to extract the relevant classes from gwt-dev.jar to use them in your web app.
Note that in both cases you'll reconstruct the same objects that will be deserialized for processing the request "for real", or that were serialized as the result of the request processing.
All in all, you'll probably have better luck and better performance using AOP on the methods of your RemoteServiceServlets.
I'm not sure if that is what you're looking for, but a standard way to log the parsed parameters would be to override AbstractRemoteServiceServlet's onAfterRequestDeserialized(RPCRequest rpcRequest): the RPCRequest contains the service method with all its parameter values, the parsed RpcToken, etc., in the form of nice Java objects.
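Something along these lines (the logging target is just a placeholder; a real servlet would of course also implement your service interface):

import java.util.Arrays;
import java.util.logging.Logger;

import com.google.gwt.user.server.rpc.RPCRequest;
import com.google.gwt.user.server.rpc.RemoteServiceServlet;

public class LoggingGreetingServiceImpl extends RemoteServiceServlet {

    private static final Logger LOG = Logger.getLogger("rpc-metrics");

    @Override
    protected void onAfterRequestDeserialized(RPCRequest rpcRequest) {
        // Called by GWT after the payload has been parsed, before the method is invoked.
        LOG.info("RPC call: " + rpcRequest.getMethod()
                + " args=" + Arrays.toString(rpcRequest.getParameters()));
    }
}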

jsonrpc2 returning remote reference

I'm exploring JSON-RPC 2 for a web service. I have some experience with Java RMI and liked it very much. To make things easy I'm using the Zend Framework, so I think I'd like to use that library. There is, however, one thing I'm missing: how do I make a procedure send back a reference to another object?
I get that this is not within the protocol, because it's about procedures, but it would still be a useful thing. With Java RMI I could choose whether to send objects by value (serialized) or by reference (remote object proxy). So what is the best way to solve this? Are there any standards for this that most libraries use?
I spent a few hours on Google looking for this and can think of a solution (like returning a URL), but I would rather use a standard than design something new.
There is one other thing I would like your opinion on. I heard an architect rant about the protocol's feature of sending batches of calls. Are these considered nice or dirty? (He thinks they are ugly, but I can think of uses for them.)
Update
I think the nicest way is just to return a remote-ref object with a URL to the object. That way it's only a small wrapper and a little documentation. Yet I would like to know whether there is a common way to do this.
SMD possibilities
There might be some way to specify the return type in my SMD. Does anyone have ideas on how to give a reference to another page in my SMD return type? Or does anyone know a good explanation of the zend_json_smd class?
You can't return a reference of any kind via JSON-RPC.
There is no standard to do so (afaik) because RPC is stateless, and most developers prefer it that way. This simplicity is what makes JSON-RPC so desirable to client-side developers over SOAP (and other messes).
You could however adopt a convention in your return values that some JSON constructs should be treated as "cues" to manufacture remote "object" proxies. For example, you could create a modified JSON de-serialiser that turns:
{
    "__remote_object": {
        "class": "Some.Remote.Class",
        "remote_id": 54625143,
        "smd": "http://domain/path/to/further.smd",
        "init_with": { ... Initial state of object ... }
    }
}
Into a remote object proxy by:
creating a local object with a prototype named after class, initialised via init_with
downloading & processing the smd URL
creating a local proxy method (in the object proxy) for each procedure exposed via the API, such that they pass the remote_id to the server with each call (so that the server knows which remote object proxy maps to which server-side object.)
While this scheme would be workable, there are plenty of moving parts, and the code is much larger and more complicated on the client-side than for normal JSON-RPC.
JSON-RPC itself is not well standardised, so most extensions (even SMD) are merely conventions around method-names and payloads.
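To make that concrete, here is one way the client side of such a convention could be sketched. This is written in Java with Jackson and a dynamic proxy purely for illustration (the original context would more likely be JavaScript); every name and the transport layer are hypothetical:

import java.lang.reflect.Proxy;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

// Hypothetical decoder: spots the "__remote_object" cue in a JSON-RPC result and hands back
// a dynamic proxy that forwards every method call together with the remote_id.
public class RemoteObjectDecoder {

    private final ObjectMapper mapper = new ObjectMapper();

    @SuppressWarnings("unchecked")
    public <T> T decode(String resultJson, Class<T> remoteInterface) throws Exception {
        JsonNode root = mapper.readTree(resultJson);
        JsonNode cue = root.get("__remote_object");
        if (cue == null) {
            return mapper.treeToValue(root, remoteInterface);   // plain by-value result
        }
        long remoteId = cue.get("remote_id").asLong();
        String smdUrl = cue.get("smd").asText();                // would be fetched & parsed

        return (T) Proxy.newProxyInstance(
                remoteInterface.getClassLoader(),
                new Class<?>[] { remoteInterface },
                (proxy, method, args) -> {
                    // Each call becomes a JSON-RPC request that carries remote_id, so the
                    // server can map it back to the server-side object. Transport omitted.
                    throw new UnsupportedOperationException(
                            "would POST " + method.getName() + " for remote_id " + remoteId
                                    + " to " + smdUrl);
                });
    }
}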

Creating Self-Documenting Actors in Scala

I'm looking at implementing a JSON-RPC based web service in Scala using finagle. I'm trying to work out how best to structure the RPC invocation code (ie. taking the deserialized request and invoking the appropriate method).
The service needs to be able to spit out a help page on all the possible requests accepted and their parameters. In Java, I would simply use annotations (to both expose and document functions) and then have the RPC service reflect on the appropriate classes, detect all exposed methods and then use the reflected MethodInfo's to invoke the functions where appropriate.
What is the idiomatic Scala way to achieve something similar? Should I use a message-passing approach (i.e. just pass a request object into an actor and have it determine whether it can invoke it, etc.)?
We had success doing something similar to the approach suggested by @Jan above. More specifically, we defined a parent class for all request objects, which takes the expected return type as a type parameter. Going one step further, we're generating our protocol IDL and serialization bindings by reflecting on API objects (little more than sets of requests).
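Roughly the shape of it, sketched here in Java-style generics (all names invented) since the pattern is the same: the request type carries its expected response type, so the dispatch code can only return what the request promises.

// Every request declares the type of response it expects.
abstract class ApiRequest<R> { }

class GetUserName extends ApiRequest<String> {
    final long userId;
    GetUserName(long userId) { this.userId = userId; }
}

interface ApiHandler {
    // The compiler ties the request to its response type.
    <R> R handle(ApiRequest<R> request);
}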
In the future, the experimental typed channels feature in Akka may help with some of the mechanics.