Creating A Single Generic Handler For Agatha?

I'm using the Agatha request/response library (and StructureMap, as utilized by Agatha 1.0.5.0) for a service layer that I'm prototyping, and one thing I've noticed is the large number of handlers that need to be created. It generally makes sense that any request/response type pair would need its own handler. However, as this scales to a large enterprise environment, that's going to be A LOT of handlers.
What I've started doing is dividing up the enterprise domain into logical processor classes (dozens of processors instead of many hundreds or, eventually, thousands of handlers). The convention is that each request/response type (all of which inherit from a domain base request/response pair, which in turn inherit from Agatha's) gets exactly one function in a processor somewhere.
The generic handler (which inherits from Agatha's RequestHandler) then uses reflection in the Handle method to find the method for the given TREQUEST/TRESPONSE and invoke it. If it can't find one or if it finds more than one, it returns a TRESPONSE containing an error message (messages are standardized in the domain's base response class).
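For illustration, the dispatch in such a generic handler might look roughly like the following. This is only a sketch: GenericHandler, the injected processor collection, and the way the error response is built are illustrative, and the exact signatures of Agatha's RequestHandler base class may differ from what is shown here.

using System.Collections.Generic;
using System.Linq;
using Agatha.Common;
using Agatha.ServiceLayer;

public class GenericHandler<TRequest, TResponse> : RequestHandler<TRequest, TResponse>
    where TRequest : Request
    where TResponse : Response, new()
{
    private readonly IEnumerable<object> processors; // the domain's processor instances, injected

    public GenericHandler(IEnumerable<object> processors)
    {
        this.processors = processors;
    }

    public override Response Handle(TRequest request)
    {
        // Find every processor method that takes exactly one TRequest and returns TResponse.
        var matches =
            (from processor in processors
             from method in processor.GetType().GetMethods()
             where method.ReturnType == typeof(TResponse)
                   && method.GetParameters().Length == 1
                   && method.GetParameters()[0].ParameterType == typeof(TRequest)
             select new { processor, method }).ToList();

        if (matches.Count != 1)
            return new TResponse(); // fill in the standardized error message from the domain's base response here

        return (TResponse)matches[0].method.Invoke(matches[0].processor, new object[] { request });
    }
}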
The goal here is to allow developers across the enterprise to just concern themselves with writing their request/response types and processor functions in the domain and not have to spend additional overhead creating handler classes which would all do exactly the same thing (pass control to a processor function).
However, it seems that I still need to define a handler class (albeit an empty one, since the base handler takes care of everything) for each request/response type pair. Otherwise, the following exception is thrown when dispatching a request to the service:
StructureMap Exception Code: 202
No Default Instance defined for PluginFamily Agatha.ServiceLayer.IRequestHandler`1[[TSFG.Domain.DTO.Actions.HelloWorldRequest, TSFG.Domain.DTO, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null]], Agatha.ServiceLayer, Version=1.0.5.0, Culture=neutral, PublicKeyToken=6f21cf452a4ffa13
Is there a way that I'm not seeing to tell StructureMap and/or Agatha to always use the base handler class for all request/response type pairs? Or maybe to use Reflection.Emit to generate empty handlers in memory at application start just to satisfy the requirement?
I'm not 100% familiar with these libraries and am learning as I go along, but so far my attempts at both those possible approaches have been unsuccessful. Can anybody offer some advice on solving this, or perhaps offer another approach entirely?

I'm not familiar with Agatha. But if you want all requests for IRequestHandler<T> to be fulfilled by BaseHandler<T>, you can use the following StructureMap registration:
For(typeof(IRequestHandler<>)).Use(typeof(BaseHandler<>));
When something asks for an IRequestHandler<Foo>, it should get a BaseHandler<Foo>.
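If you configure StructureMap through a Registry, that open-generic mapping can live there. A minimal sketch (HandlerRegistry and BaseHandler are illustrative names, and how the registry is added depends on how your Agatha/StructureMap bootstrapping is wired up):

using Agatha.ServiceLayer;
using StructureMap.Configuration.DSL;

public class HandlerRegistry : Registry
{
    public HandlerRegistry()
    {
        // Any request for IRequestHandler<TRequest> is fulfilled by BaseHandler<TRequest>.
        For(typeof(IRequestHandler<>)).Use(typeof(BaseHandler<>));
    }
}

// At application startup, for example:
// ObjectFactory.Configure(x => x.AddRegistry(new HandlerRegistry()));

A handler registered explicitly for a specific closed request type should still take precedence over this open-generic fallback if you ever need to specialize one.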


Why do lambdas resolve faster?

The docs recommend registering frequently-used components via lambdas as ...
This can yield an improvement of up to 10x faster Resolve() calls
Now there are obviously a few questions:
Why? (EDIT: to clarify: I would understand if the registration time went up, because you now have to use reflection to find the right constructor and such, but why the heck does the resolve time go up?)
In which scenarios does this apply? What aspects of the registered class make this number go up, and which make it go down?
What resolve times are we generally talking about anyhow? Like "yeah, now it takes 100 instead of 10 CPU cycles", or actually measurable numbers in "normal" use cases (a web service with per-request lifetimes)?
As noted in the comments, the short version is that executing a pre-built lambda is going to be faster than the reflection-based way of resolving.
Diving deeper, think about the steps involved in each.
Lambda:
Execute the method.
There is no step two.
Reflection:
Enumerate all the constructors for the type to be instantiated. This list could be cached, but is already fairly well cached by the .NET framework.
Of all of the constructors that are available, figure out which one to execute based on the number of constructor parameters available and the types registered in the container. Note the types registered in the container may change based on registration sources, lifetime scope registrations, etc.
Resolve the constructor parameters. If there are reflection-based registrations that make up the constructor parameters, run them through this process recursively.
Invoke the selected constructor using the resolved parameters.
As you can see, there's actually a lot more work involved than just Activator.CreateInstance in the reflection-based way of resolving, which is why it takes longer.
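To make the difference concrete, here is roughly what the two registration styles look like in an Autofac-style container (assuming that is the container in question; IOrderService, OrderService and IOrderRepository are made-up types for illustration):

using Autofac;

public interface IOrderRepository { }
public class OrderRepository : IOrderRepository { }

public interface IOrderService { }
public class OrderService : IOrderService
{
    public OrderService(IOrderRepository repository) { }
}

public static class ContainerSetup
{
    public static IContainer Build()
    {
        var builder = new ContainerBuilder();
        builder.RegisterType<OrderRepository>().As<IOrderRepository>();

        // Reflection-based registration: at resolve time the container walks the
        // constructor-selection and parameter-resolution steps listed above.
        builder.RegisterType<OrderService>().As<IOrderService>();

        // Lambda-based alternative: resolve time is just "execute this delegate".
        // builder.Register(c => new OrderService(c.Resolve<IOrderRepository>()))
        //        .As<IOrderService>();

        return builder.Build();
    }
}

The trade-off is that the lambda duplicates knowledge of OrderService's constructor, so it has to be kept in sync by hand when the constructor changes.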
But, as also noted in the comments, don't worry about premature optimization. This all happens pretty darn quickly so wait to optimize until you can actually locate a bottleneck using a profiler or some similar tool.

Java: Fast and generic gateway Data to Soap

I want to build a generic gateway from a nested map (generated from a binary data stream) to SOAP clients.
Background: a non-Java application which needs to call SOAP services can't generate JSON or SOAP/XML, but can easily generate a custom protocol (which is under our control).
So a proxy is needed, and that proxy should not have to be rewritten on every change of the WSDL or rollout of the next web service.
My plan is to:
have URL, port and service name (url:port/service-name) as "strictly" defined parameters of that proxy,
have the SOAP action as a "strictly" defined parameter,
request the WSDL from url:port/service-name?wsdl (possibly cached) and set up the stub call dynamically (also cached),
fill the values present in the nested map into that stub,
call the SOAP service,
and convert the answer back to the binary protocol.
If any necessary values are missing, it should send the equivalent of a SOAP fault.
All of that, of course, with low (affordable) latency, high stability, absolutely minimal deployment downtime (for updates), and under quite some load.
I see several possibilities:
a) Using an ESB like WSO2 ESB. There I would implement the stream format as a special input format adapter, convert it to an internal XML stream (at least the JSON adapters seem to work that way) and send it to a mediator. That mediator would try something like the "Creating a Dynamic Client" approach in http://today.java.net/pub/a/today/2006/12/13/invoking-web-services-using-apache-axis2.html and call the SOAP service directly.
b) using MOM middleware like Apache ActiveMQ with Camel,
c) reducing it to something like Apache Karaf and CXF.
I'm a bit lost between all those possibilities, and those are just more or less arbitrary samples of each kind.
Thoughts on a):
minus: it feels a bit odd to have no ESB target, since the mediator would call the given SOAP requests directly
minus: I wonder whether internally converting to an XML stream wouldn't cost extra time and resources
minus: changing the code requires a restart of WSO2 ESB, as far as I understand it
plus: instead of URL, port and service name I could define symbolic names which are resolved using the ESB -- if that doesn't take extra milliseconds.
For b) I have not yet checked how easy those format conversions are in Camel and whether SOAP service requests fit into message sending and queueing.
I have already done some searching on the topic, but it's really confusing because of the overlapping scopes of quite different products. I thought this was a standard problem, but apparently there are no obvious solutions - at least I didn't find them.
I hope to get a clue about which of those solutions could lead to trouble or a lot of work (and which to easy success), and I hope that there is some sense in my approach.
Thanks for any qualified comments!
Marco

jsonrpc2 returning remote reference

I'm exploring JSON-RPC 2 for a web service. I have some experience with Java RMI and liked it very much. To make things easy I'm using the Zend Framework, so I think I'd like to use that library. There is, however, one thing I am missing: how do I make a procedure send back a reference to another object?
I get that this is not within the protocol because it's about procedures, but it would still be a useful thing. With Java RMI I could pick objects to send by value (serialized) or by reference (remote object proxy). So what is the best way to solve this? Are there any standards for this that most libraries use?
I spent a few hours on Google looking for this and can think of a solution (like returning a URL), but I would rather use a standard than design something new.
There is one other thing I would like your opinion on. I heard an architect rant about the protocol's feature of sending batches of calls. Are they considered nice or dirty? (He thinks they are ugly, but I can think of uses for them.)
Update
I think the nicest way is just to return a remote-ref object with a URL to the object. That way it's only a small wrapper and a little documentation. Yet I would like to know if there is a common way to do this.
SMD possibilities
There might be some way to specify the return type in my SMD. Does anyone have ideas on how to give a reference to another page in my SMD return type? Or does anyone know a good explanation of the zend_json_smd class?
You can't return a reference of any kind via JSON-RPC.
There is no standard to do so (afaik) because RPC is stateless, and most developers prefer it that way. This simplicity is what makes JSON-RPC so desirable to client-side developers over SOAP (and other messes).
You could however adopt a convention in your return values that some JSON constructs should be treated as "cues" to manufacture remote "object" proxies. For example, you could create a modified JSON de-serialiser that turns:
{
    "__remote_object": {
        "class": "Some.Remote.Class",
        "remote_id": 54625143,
        "smd": "http://domain/path/to/further.smd",
        "init_with": { ... Initial state of object ... }
    }
}
Into a remote object proxy by:
creating a local object with a prototype named after class, initialised via init_with
downloading & processing the smd URL
creating a local proxy method (in the object proxy) for each procedure exposed via the API, such that they pass the remote_id to the server with each call (so that the server knows which remote object proxy maps to which server-side object.)
While this scheme would be workable, there are plenty of moving parts, and the code is much larger and more complicated on the client-side than for normal JSON-RPC.
JSON-RPC itself is not well standardised, so most extensions (even SMD) are merely conventions around method-names and payloads.

Creating Self-Documenting Actors in Scala

I'm looking at implementing a JSON-RPC based web service in Scala using finagle. I'm trying to work out how best to structure the RPC invocation code (ie. taking the deserialized request and invoking the appropriate method).
The service needs to be able to spit out a help page on all the possible requests accepted and their parameters. In Java, I would simply use annotations (to both expose and document functions) and then have the RPC service reflect on the appropriate classes, detect all exposed methods and then use the reflected MethodInfo's to invoke the functions where appropriate.
What is the idiomatic Scala way to achieve something similar? Should I use a message-passing approach (i.e. just pass a request object into an actor, have it determine whether it can invoke it, etc.)?
We had success doing something similar to the approach suggested by #Jan above. More specifically, we defined a parent class for all request objects which takes the expected return type as a type parameter. Going one step further, we're generating our protocol IDL and serialization bindings by reflecting on API objects (little more than sets of requests).
In the future, the experimental typed channels feature in Akka may help with some of the mechanics.

CORBA: Can a CORBA IDL type be an attribute of another?

Before I start using CORBA I want to know something.
It would seem intuitive to me that you could use an IDL type as an attribute of another, which would then expose that attribute's methods to the client application (using ".") as well.
But is this possible?
For example (excuse my bad IDL):
interface Car {
    attribute BrakePedal brakePedal;
    //...
};
// then.. (place the following above Car, or forward-declare it)
interface BrakePedal {
    void press();
    //...
};
//...
Then in the client app, you could do: myCar.brakePedal.press();
CORBA would seem crappy if you couldn't do this kind of multi-level object interface. After all, real-world objects are multi-level, right? So can someone put my mind at ease and confirm (or try, if you already have CORBA set up) that this definitely works? None of the IDL documentation explicitly shows this in an example, which is why I'm concerned. Thanks!
Declaring an attribute is logically equivalent to declaring a pair of accessor functions, one to read the value of the attribute, and one to write it (you can also have readonly attributes, in which case you would only get the read function).
It does appear from the CORBA spec that you can use an interface as the type of an attribute. I tried feeding such IDL to omniORB's IDL-to-C++ translator, and it didn't reject it, so I think it is allowed.
But, I'm really not sure you would want to do this in practice. Most CORBA experts recommend that if you are going to use attributes you only use readonly attributes. And for something like this, I would just declare my own function that returned an interface.
Note that you can't really do the syntax you want in the C++ mapping anyway; e.g.
server->brakePedal()->press(); // major resource leak here
brakePedal() is the attribute accessor function that returns a CORBA object reference. If you immediately call press() on it, you are going to leak the object reference.
To do this without a leak you would have to do something like this:
BrakePedal_var brakePedal(server->brakePedal());
brakePedal->press();
You simply can't obtain the notational convenience you want from attributes in this scenario with the C++ mapping (perhaps you could in the Python mapping). Because of this, and my general dislike for attributes, I'd just use a regular function to return the BrakePedal interface.
You don't understand something important about distributed objects: remote objects (whether implemented with CORBA, RMI, .NET remoting or web services) are not the same as local objects. Calls to CORBA objects are expensive, slow, and may fail due to network problems. The object.attribute.method() syntax would make it hard to see that two different remote calls are being executed on that one line, and make it hard to handle any failures that might occur.