This might be a very generic question, but considering that REST is focused on accessing named resources through a single consistent interface, does it support Protocol Buffers?
Yes, you can absolutely combine Protobuf and REST.
Protobuf specifies a way to encode data. REST specifies a way to interact with resources, but does not require any particular encoding for the resource bodies. If you create a RESTful HTTP-based API and use Protobuf to encode the entity-bodies (the technical term for the payload part of an HTTP request or response), then you are using both REST and Protobuf.
More recently, there is this Spring REST API with Protocol Buffers tutorial:
Generate the corresponding Java classes using:
protoc --java_out=java resources/baeldung.proto
Add the following dependency to your Maven POM file:
<dependency>
    <groupId>com.google.protobuf</groupId>
    <artifactId>protobuf-java</artifactId>
    <version>3.0.0-beta-3</version>
</dependency>
Add the following converter to your @SpringBootApplication:
@Bean
ProtobufHttpMessageConverter protobufHttpMessageConverter() {
    return new ProtobufHttpMessageConverter();
}
The ProtobufHttpMessageConverter bean is used to convert responses returned by @RequestMapping-annotated methods to Protocol Buffer messages.
What's important here is that we're operating with Protocol Buffer specific data – not with standard POJOs.
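For illustration, here is a minimal sketch of such a controller. The BaeldungTraining.Course message and its setId/setCourseName fields are assumed names standing in for whatever classes protoc generates from your own .proto file.

import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class CourseController {

    @RequestMapping("/courses/{id}")
    BaeldungTraining.Course getCourse(@PathVariable Integer id) {
        // Build a generated Protobuf message; the registered
        // ProtobufHttpMessageConverter writes it out as application/x-protobuf.
        return BaeldungTraining.Course.newBuilder()
                .setId(id)
                .setCourseName("REST with Protocol Buffers")
                .build();
    }
}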
I have a case that requires signing a payload with multiple signatures, as in case (2) in this answer.
As a reminder, JWT is specified by RFC 7519, in which signing is defined to use JSON Web Signature (JWS), RFC 7515.
JWS/RFC 7515 defines the compact serialization in Section 7.1, which is widely implemented by libraries. But it also defines the longer JWS JSON Serialization in Section 7.2, which allows for multiple signatures over the payload.
The documentation at jwt.io lists a plethora of Java libraries, but is there any of them that actually implements Section 7.2, with the multiple signatures?
You can use the Nimbus JOSE+JWT library:
<dependency>
    <groupId>com.nimbusds</groupId>
    <artifactId>nimbus-jose-jwt</artifactId>
    <version>9.16</version>
</dependency>
The following link shows how to create a signature with multiple private keys and then perform the corresponding validation:
https://connect2id.com/products/nimbus-jose-jwt/examples/jws-json-multiple-signatures
https://8gwifi.org/jwkconvertfunctions.jsp
https://dzone.com/articles/json-message-signing-alternatives
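For reference, here is a condensed sketch along the lines of the linked connect2id example, using the JWS JSON serialization support (JWSObjectJSON) available in recent 9.x releases of nimbus-jose-jwt. The key IDs and payload are arbitrary, and the exact method names should be checked against the linked page.

import com.nimbusds.jose.JWSAlgorithm;
import com.nimbusds.jose.JWSHeader;
import com.nimbusds.jose.JWSObjectJSON;
import com.nimbusds.jose.Payload;
import com.nimbusds.jose.crypto.ECDSASigner;
import com.nimbusds.jose.crypto.ECDSAVerifier;
import com.nimbusds.jose.crypto.RSASSASigner;
import com.nimbusds.jose.crypto.RSASSAVerifier;
import com.nimbusds.jose.jwk.Curve;
import com.nimbusds.jose.jwk.ECKey;
import com.nimbusds.jose.jwk.RSAKey;
import com.nimbusds.jose.jwk.gen.ECKeyGenerator;
import com.nimbusds.jose.jwk.gen.RSAKeyGenerator;

public class MultiSignatureExample {

    public static void main(String[] args) throws Exception {
        // Two independent signing keys (key IDs are arbitrary examples)
        ECKey ecKey = new ECKeyGenerator(Curve.P_256).keyID("ec-key-1").generate();
        RSAKey rsaKey = new RSAKeyGenerator(2048).keyID("rsa-key-1").generate();

        // One payload, multiple signatures, using the JWS JSON serialization
        JWSObjectJSON jws = new JWSObjectJSON(new Payload("Hello, world!"));
        jws.sign(new JWSHeader.Builder(JWSAlgorithm.ES256).keyID(ecKey.getKeyID()).build(),
                 new ECDSASigner(ecKey));
        jws.sign(new JWSHeader.Builder(JWSAlgorithm.RS256).keyID(rsaKey.getKeyID()).build(),
                 new RSASSASigner(rsaKey));

        // General JWS JSON serialization (RFC 7515, Section 7.2.1)
        String json = jws.serializeGeneral();

        // Verification: parse and check each signature with the matching public key
        JWSObjectJSON parsed = JWSObjectJSON.parse(json);
        for (JWSObjectJSON.Signature sig : parsed.getSignatures()) {
            boolean ok;
            if (JWSAlgorithm.ES256.equals(sig.getHeader().getAlgorithm())) {
                ok = sig.verify(new ECDSAVerifier(ecKey.toECPublicKey()));
            } else {
                ok = sig.verify(new RSASSAVerifier(rsaKey.toRSAPublicKey()));
            }
            System.out.println(sig.getHeader().getKeyID() + " verified: " + ok);
        }
    }
}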
I am creating HTTP requests using Apache HttpClient version 4.3.4. I see there are classes like HttpGet, etc., and there is also a class BasicHttpRequest. I am not sure which one to use.
What's the difference, and which one should be used in which situation?
BasicHttpRequest is provided by the core library. As its name suggests, it is pretty basic: it enforces no particular method name or type, nor does it attempt to validate the request URI. The URI parameter can be any arbitrary garbage; HttpClient will dutifully transmit it to the server as-is if it is unable to parse it into a valid URI.
The HttpUriRequest variety, on the other hand, enforces a specific method type and requires a valid URI. Another important feature is that HttpUriRequests can be aborted at any point of their execution.
By default, you should always use classes that implement HttpUriRequest.
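For example, a typical HttpUriRequest-based call with HttpClient 4.3 looks roughly like this (the URL is a placeholder); because HttpGet implements HttpUriRequest, the request can also be cancelled mid-flight via abort().

import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.util.EntityUtils;

public class HttpGetExample {

    public static void main(String[] args) throws Exception {
        try (CloseableHttpClient client = HttpClients.createDefault()) {
            HttpGet get = new HttpGet("http://example.com/resource"); // must parse as a valid URI

            try (CloseableHttpResponse response = client.execute(get)) {
                System.out.println(response.getStatusLine());
                System.out.println(EntityUtils.toString(response.getEntity()));
            }

            // Because HttpGet implements HttpUriRequest, another thread could
            // call get.abort() to cancel the request while it is still running.
        }
    }
}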
I was just browsing the 4.3.6 javadoc attempting to locate your BasicHttpRequest and was unable to find it. Do you have a reference to the javadoc of this class?
My impression is that BasicHttpRequest is a base class providing operations and attributes common to more than one kind of HttpRequest. It may be intentionally generic for extension purposes.
To the first part of your question: use HttpGet, HttpPost, etc. for their specific operations. If you only need to GET information, use HttpGet; if you need to post a form or document body, use HttpPost. If you need methods like HEAD, PUT, or DELETE, use the corresponding HttpHead, HttpPut, or HttpDelete class.
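As a rough sketch of the POST case, submitting a form body with HttpPost might look like this (the URL and field names are made up):

import java.util.Arrays;
import org.apache.http.NameValuePair;
import org.apache.http.client.entity.UrlEncodedFormEntity;
import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpPost;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;
import org.apache.http.message.BasicNameValuePair;

public class HttpPostFormExample {

    public static void main(String[] args) throws Exception {
        try (CloseableHttpClient client = HttpClients.createDefault()) {
            HttpPost post = new HttpPost("http://example.com/login");
            // Encode the form fields as an application/x-www-form-urlencoded body
            post.setEntity(new UrlEncodedFormEntity(Arrays.<NameValuePair>asList(
                    new BasicNameValuePair("username", "alice"),
                    new BasicNameValuePair("password", "secret"))));

            try (CloseableHttpResponse response = client.execute(post)) {
                System.out.println(response.getStatusLine());
            }
        }
    }
}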
Is it possible in Resteasy to extract the URI mapping into an external, dedicated file?
Annotating classes and methods is quick and easy but I would like to have a file that maps the URIs to functions. Something like:
/teams/{team}/player/{player-id} TeamResource.fetchPlayer
As far as I know, this is not currently supported as part of the JAX-RS specification, but I could see this being done with bytecode instrumentation at runtime using something like Javassist.
Basically, you would add the @Path annotations to your resource classes at runtime, with the values loaded from your URI mapping file. Once the annotations have been added to the resource classes, you would then register them with Resteasy.
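A rough sketch of what that could look like with Javassist is below. The class name and mapping path are hypothetical, and only a class-level @Path is added here; annotating individual methods would work the same way against the method's MethodInfo instead of the ClassFile.

import javassist.ClassPool;
import javassist.CtClass;
import javassist.bytecode.AnnotationsAttribute;
import javassist.bytecode.ClassFile;
import javassist.bytecode.ConstPool;
import javassist.bytecode.annotation.Annotation;
import javassist.bytecode.annotation.StringMemberValue;

public class PathAnnotationInjector {

    // e.g. addPathAnnotation("com.example.TeamResource", "/teams/{team}/player/{player-id}")
    public static Class<?> addPathAnnotation(String className, String path) throws Exception {
        ClassPool pool = ClassPool.getDefault();
        CtClass ctClass = pool.get(className);

        ClassFile classFile = ctClass.getClassFile();
        ConstPool constPool = classFile.getConstPool();

        // Build a runtime-visible @javax.ws.rs.Path("<path>") annotation
        AnnotationsAttribute attr =
                new AnnotationsAttribute(constPool, AnnotationsAttribute.visibleTag);
        Annotation pathAnnotation = new Annotation("javax.ws.rs.Path", constPool);
        pathAnnotation.addMemberValue("value", new StringMemberValue(path, constPool));
        attr.addAnnotation(pathAnnotation);
        classFile.addAttribute(attr);

        // Load the modified class; it can then be registered with Resteasy
        return ctClass.toClass();
    }
}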
Is there an easy way to use SerializationStreamWriter for custom purposes without writing your own generator?
(for example, HTML5 storage)
The GWT javadoc says very little.
We are writing an implementation for our current project which does exactly what you want to do: serialize a custom object into a string and save it into localStorage, then deserialize the string back into an object...
So, for me, it is possible to use SerializationStreamWriter for serialization and SerializationStreamReader for deserialization on the CLIENT SIDE.
To realize this:
You don't need a generator for SerializationStreamWriter/SerializationStreamReader, but you do need a generator for a TypeSerializer (which implements com.google.gwt.user.client.rpc.impl.SerializerBase). This is quite simple: take a look at com.google.gwt.user.rebind.rpc.TypeSerializerCreator and use it in your generator. Or, if all your custom objects are referenced in one RPC service, you can just use that generated RPC service's TypeSerializer.
You must also write a proper implementation of SerializationStreamWriter or SerializationStreamReader, because there are two serialized string formats (the request format and the response format):
In GWT, you have:
ClientSerializationStreamWriter and ClientSerializationStreamReader for client-side serialization/deserialization;
ServerSerializationStreamWriter and ServerSerializationStreamReader for server-side serialization/deserialization.
ClientSerializationStreamWriter serializes the object into FORMAT_1, and only ServerSerializationStreamReader can read it (deserialize it back into an object).
ServerSerializationStreamWriter serializes the object into FORMAT_2, and only ClientSerializationStreamReader can read it (deserialize it back into an object).
So if you want to use ClientSerializationStreamWriter to serialize your object, you need to write a client-side implementation similar to ServerSerializationStreamReader. Or, if you want to use ClientSerializationStreamReader to deserialize the string, you need to write a client-side implementation similar to ServerSerializationStreamWriter. This is not difficult, because the difference between FORMAT_1 and FORMAT_2 is just the order.
No.
Because the GWT-RPC serialization is asymmetric, it cannot be used for local storage: the server understands what the client sent, the client understands what the server sent, but they won't understand what they themselves wrote.
I have a Java client that calls a RESTEasy (JAX-RS) Java server. It is possible that some of my users may have a newer version of the client than the server.
That client may call a resource on the server that contains query parameters that the server does not know about. Is it possible to detect this on the server side and return an error?
I understand that if the client calls a URL that has not been implemented yet on the server, the client will get a 404 error, but what happens if the client passes in a query parameter that is not implemented (e.g.: ?sort_by=last_name)?
Is it possible to detect this on the server side and return an error?
Yes, you can do it. I think the easiest way is to use @Context UriInfo. You can obtain all query parameters by calling its getQueryParameters() method, so you can detect any unknown parameters and return an error.
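As a rough sketch (the resource, path, and parameter names are made up):

import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.QueryParam;
import javax.ws.rs.core.Context;
import javax.ws.rs.core.Response;
import javax.ws.rs.core.UriInfo;

@Path("/users")
public class UserResource {

    // The query parameters this resource actually understands (hypothetical names)
    private static final Set<String> KNOWN_PARAMS =
            new HashSet<>(Arrays.asList("page", "page_size"));

    @GET
    public Response listUsers(@QueryParam("page") Integer page,
                              @QueryParam("page_size") Integer pageSize,
                              @Context UriInfo uriInfo) {
        // Reject any query parameter the server does not know about
        for (String param : uriInfo.getQueryParameters().keySet()) {
            if (!KNOWN_PARAMS.contains(param)) {
                return Response.status(Response.Status.BAD_REQUEST)
                        .entity("Unknown query parameter: " + param)
                        .build();
            }
        }
        // ... normal handling ...
        return Response.ok().build();
    }
}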
but what happens if the client passes in a query parameter that is not implemented
If you implement no special handling for "unknown" parameters, the resource will be called and the parameter will be silently ignored.
Personally, I think it's better to ignore unknown parameters; doing so helps keep the API backward compatible.
You should definitely check out JAX-RS filters (org.apache.cxf.jaxrs.ext.RequestHandler) to intercept, validate, and manipulate requests, e.g. for security or for validating query parameters.
If you declared all your parameters using annotations, you can parse the web.xml file for the resource class names (see the possible regex below) and use the fully qualified class names to access the declared annotations on methods (like javax.ws.rs.GET) and method parameters (like javax.ws.rs.QueryParam), so that you can scan all available web service resources. This way you don't have to manually add all resource classes to your filter.
Store this information in static variables so you just have to parse this stuff the first time you hit your filter.
In your filter you can access the org.apache.cxf.message.Message for the incoming request. The query string is easy to access; if you also want to validate form parameters and multipart names, you have to read the message content and write it back to the message (this gets a bit nasty since you have to deal with multipart boundaries etc.).
To 'index' the resources, I just take the HTTP method and append the path (which is then used as the key to access the declared parameters).
You can use the ServletContext to read the web.xml file. For extracting the resource classes, this regex might be helpful:
String webxml = readInputStreamAsString(context.getResourceAsStream("WEB-INF/web.xml"));
Pattern serviceClassesPattern = Pattern.compile("<param-name>jaxrs.serviceClasses</param-name>.*?<param-value>(.*?)</param-value>", Pattern.DOTALL | Pattern.MULTILINE);
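A possible continuation of that snippet, to extract and load the class names so their JAX-RS annotations can be inspected via reflection (the split pattern is an assumption about how the class list is delimited in web.xml):

Matcher matcher = serviceClassesPattern.matcher(webxml);
if (matcher.find()) {
    // Assumes the class names are separated by commas and/or whitespace
    for (String className : matcher.group(1).trim().split("[,\\s]+")) {
        Class<?> resourceClass = Class.forName(className);
        // e.g. inspect resourceClass.getMethods() for @GET / @Path / @QueryParam
    }
}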