Correct way of using/testing event service in Eclipse E4 RCP

Let me ask two coupled questions that might boil down to one about good application design ;-)
What is the best practice for using event-based communication in an e4 RCP application?
How can I write simple unit tests (using JUnit) for classes that send/receive events using dependency injection and IEventBroker?
Let's be more concrete: say I am developing an Eclipse e4 RCP application consisting of several plugins that need to communicate. For communication I want to use the event service provided by org.eclipse.e4.core.services.events.IEventBroker so my plugins stay loosely coupled. I use dependency injection to inject the event broker into a class that dispatches events:
@Inject static IEventBroker broker;

private void sendEvent() {
    broker.post(MyEventConstants.SOME_EVENT, payload);
}
On the receiver side, I have a method like:
@Inject
@Optional
private void receiveEvent(@UIEventTopic(MyEventConstants.SOME_EVENT) Object payload) {
    // handle the event
}
Now the questions:
In order for IEventBroker to be successfully injected, my class needs access to the current IEclipseContext. Most of my classes using the event service are not referenced by the e4 application model, so I have to manually inject the context on instantiation using e.g. ContextInjectionFactory.inject(myEventSendingObject, context);
This approach works, but I find myself passing the context around to wherever I use the event service. Is this really the correct approach to event-based communication across an E4 application?
How can I easily write JUnit tests for a class that uses the event service (either as a sender or receiver)? Obviously, none of the above annotations work in isolation since there is no context available. I understand everyone's convinced that dependency injection simplifies testability. But does this also apply to injecting services like IEventBroker?
This article describes creating your own IEclipseContext to include the process of DI in tests. Not sure if this could resolve my 2nd issue, but I also hesitate to run all my tests as JUnit plug-in tests, as it appears impractical to fire up the PDE for each unit test. Maybe I just misunderstand the approach.
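For illustration, this is how I currently wire such a class by hand (MyEventSender is just a stand-in for any of my classes that posts events):

// somewhere that already has access to the IEclipseContext
MyEventSender sender = new MyEventSender();
ContextInjectionFactory.inject(sender, context); // fills the @Inject'ed IEventBroker field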
This article speaks about “simply mocking IEventBroker”. Yes, that would be great! Unfortunately, I couldn’t find any information on how this can be achieved.
All this makes me wonder whether I am still on the right track or whether this is already a case of bad design. And if so, how would you go about redesigning? Move all event-related actions to dedicated event sender/receiver classes or a dedicated plugin?

Actually, running a JUnit plug-in test is not that expensive. You can configure the launch configuration to run in headless mode, so the only thing loaded is a lightweight PDE without the workbench. The same happens when you run a headless build with, for example, Tycho: Surefire launches your test bundle as a headless plug-in test by default.
The advantage over isolated unit tests is that you can access your plug-in's resources and, most importantly, use dependency injection. If you want to mock an injected object, you have to run a plug-in test so you can use InjectorFactory.
This is how you would go about mocking the event service: IEventBroker is an interface, so the only thing you need to do is write a mock implementation for it:
public class IEventBrokerMock implements IEventBroker {
    @Override
    public boolean post(String topic, Object data) { return true; }
    // ... stub send(), both subscribe() variants and unsubscribe() the same way
}
In your test method you would have something like
InjectorFactory.getDefault().addBinding(IEventBroker.class).implementedBy(IEventBrokerMock.class);
ClassUnderTest myObject = InjectorFactory.getDefault().make(ClassUnderTest.class, null);
If you want to work with a context, the test method would instead contain
IEclipseContext context = EclipseContextFactory.create();
context.set(IEventBroker.class, new IEventBrokerMock());
ClassUnderTest myObject = ContextInjectionFactory.make(ClassUnderTest.class, context);
If you run this as JUnit plug-in test your object will have the mocked event service injected.
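If you also want to assert on what was sent, the mock can simply record the posted topics. A minimal sketch of such a test, building on the IEventBrokerMock above (with its remaining methods stubbed) and assuming the IEventBroker signatures from org.eclipse.e4.core.services.events; ClassUnderTest, its sendEvent() trigger and MyEventConstants are stand-in names from the question:

import static org.junit.Assert.assertEquals;

import java.util.ArrayList;
import java.util.List;

import org.eclipse.e4.core.contexts.ContextInjectionFactory;
import org.eclipse.e4.core.contexts.EclipseContextFactory;
import org.eclipse.e4.core.contexts.IEclipseContext;
import org.eclipse.e4.core.services.events.IEventBroker;
import org.junit.Test;

public class ClassUnderTestTest {

    // extends the mock so the test can inspect what was posted
    static class RecordingEventBroker extends IEventBrokerMock {
        final List<String> postedTopics = new ArrayList<String>();

        @Override
        public boolean post(String topic, Object data) {
            postedTopics.add(topic);
            return true;
        }
    }

    @Test
    public void sendEventPostsExpectedTopic() {
        RecordingEventBroker broker = new RecordingEventBroker();
        IEclipseContext context = EclipseContextFactory.create();
        context.set(IEventBroker.class, broker);

        ClassUnderTest myObject = ContextInjectionFactory.make(ClassUnderTest.class, context);
        myObject.sendEvent(); // hypothetical trigger that ends up calling broker.post(...)

        assertEquals(MyEventConstants.SOME_EVENT, broker.postedTopics.get(0));
    }
}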

For testing, instead of DI, I use eventBroker = new org.eclipse.e4.ui.services.internal.events.EventBroker(); to get an event broker object to use. It works OK.

Related

Why do we need an interface to define every service in AEM?

I have been working with AEM for a while but somehow never thought about this. Every AEM project that I have worked on has one similarity in its code structure: there is an interface for every service written.
My question is: why do we need an interface for every service?
Can @Reference or @Inject not use the services without an interface?
Using interfaces is a good practice to decouple the user of a service from the implementation. In many cases you even want to have an API bundle so the user of the service does not need a Maven dependency on the implementing bundle.
On the other hand, you are not required to use interfaces. Especially when wiring components inside a bundle, interfaces are often an unnecessary layer. In this case, simply export the service directly with the class.
See here for an example:
@Component(service = DistributionMetricsService.class)
public class DistributionMetricsService {
    ...
}
and here for the client code:
@Reference
private DistributionMetricsService distributionMetricsService;
So the main difference is that you have to specify the service property if you want to export a component with its implementation class.
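For comparison, the interface-based variant of the same component would look roughly like this (DistributionMetrics and the record method are made-up API names, not from the original example):

// API bundle
public interface DistributionMetrics {
    void record(String name, long value);
}

// implementing bundle
@Component(service = DistributionMetrics.class)
public class DistributionMetricsImpl implements DistributionMetrics {
    @Override
    public void record(String name, long value) {
        // implementation hidden from clients behind the API bundle
    }
}

Here the service property could even be omitted, since Declarative Services registers a component under its directly implemented interfaces by default.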

What is best practice for using E4 dependency injection for our own objects?

I am working on an E4 RCP application and, while our basic DI configuration is working, I have some reservations about our current implementation.
The IInjector interface and @ProcessAdditions annotation are tagged as being discouraged for external access. Currently, we are using a series of statements similar to
injector.addBinding(IInterface.class).implementedBy(Concrete.class);
from within a method marked as @ProcessAdditions. What method(s) can be used that don't violate access rules? I know I can bind classes/strings to instances via IEclipseContext, but using ContextInjectionFactory by hand seems to force the order of construction to be known by the configurer (as opposed to other DI frameworks).
I know Guice has the concept of child injectors, but in E4, ContextInjectionFactory is internally set to use only the default injector for manufacturing. What is the best method to manufacture a group of objects, using DI, and subsequently disposing of this group? I would like to create a fresh batch of processing objects for each processing operation.
ContextInjectionFactory is the only thing I have seen described for doing injection in e4 (in Lars Vogel's 'Eclipse 4 RCP' book for example). This is what I use in my e4 applications.
Some things, such as @ProcessAdditions, are marked as discouraged because that part of the e4 API has not been finalized yet and might change; they can still be used. @ProcessAdditions is only used for the application life cycle class.
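For the "create a batch of objects, then dispose them" part of the question, one approach that stays within the supported API is to make the objects from a child context and dispose that context when the operation completes. A sketch, assuming you already hold a parent IEclipseContext (ProcessingTask, IInterface and Concrete are placeholders for your own classes):

// one disposable child context per processing operation
IEclipseContext batch = parentContext.createChild("processing-batch");
batch.set(IInterface.class, new Concrete()); // bind the batch-scoped services

ProcessingTask task = ContextInjectionFactory.make(ProcessingTask.class, batch);
task.run();

// disposing the child cleans up the values and objects created from it
batch.dispose();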

Unit testing DAO tier with FUSE ESB

We are working on some spikes using Fuse ESB (Camel, OSGi, blueprint) to deliver some components. We have an imposed architecture from our EAs, which is: a REST controller uses a route to call a CXF WS, which calls a local Java class as a service to, for example, perform CRUD actions. These use JPA-enabled DAOs/entities. It all seems a bit academic in design rather than real-world, but that's another story.
The question is about testing. Normally I would test this service tier using H2 to provide the DB, wiring the DAO, entityManager etc. together with Spring (I know some wouldn't do this, but I do; bear with me). But we will use blueprint for Fuse. How can I unit test this tier? Getting my tests to subclass CamelBlueprintTestSupport doesn't work; it expects a route. I can't use SpringJUnit4ClassRunner (though I do have it working with this currently), as it wires with Spring, while in the container we will wire with blueprint.
So how do we unit test this? How do I instantiate this set of classes within a blueprint based unit test? Can we?
One approach you may try is to use Pax Exam. It allows you to run tests in a full OSGi environment, so you can install your real bundle and test it in a black-box fashion.
You can use pojosr which is what camel-test-blueprint is using: https://code.google.com/p/pojosr/
Pojosr is not a full OSGi environment, though, so there will be some limitations on what you can do.
For camel-test-blueprint, you may be able to override the method isUseRouteBuilder and return false; then it ought not to expect a route.
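A sketch of that override, assuming camel-test-blueprint conventions; the blueprint descriptor path, DAO type and bean id are placeholders for your project:

import org.apache.camel.test.blueprint.CamelBlueprintTestSupport;
import org.junit.Test;

public class DaoTierTest extends CamelBlueprintTestSupport {

    @Override
    protected String getBlueprintDescriptor() {
        return "OSGI-INF/blueprint/blueprint.xml"; // your blueprint file
    }

    // no RouteBuilder needed for a pure DAO-tier test
    @Override
    public boolean isUseRouteBuilder() {
        return false;
    }

    @Test
    public void daoIsWired() throws Exception {
        // look up a blueprint-wired bean from the Camel registry (bean id is hypothetical)
        MyDao dao = context.getRegistry().lookupByNameAndType("myDao", MyDao.class);
        assertNotNull(dao);
    }
}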

How do I inject into a Servlet with Dagger?

How do I inject objects into a Servlet using Dagger?
Since the servlet container instantiates the Servlets themselves, they are not created with Dagger. Therefore, the only mechanism I can see to inject into them is static injection, which the Dagger homepage warns against. Is there another (best-practice) way to do it?
Specifically, I am using Jetty and GWT (my servlets extend RemoteServiceServlet), but I don't think those details matter.
There is not (yet) any stock infrastructure code to support a Java EE servlet stack for Dagger.
That said, there are ways you could home-brew it until we get to it. If you were using it only for singletons, you could mirror what some people are doing on Android: initialize your graph at app startup using a context listener, then use the Servlet's init() method to self-inject (a sketch of the listener half follows below).
It gets much trickier when you try to add scoping to requests and such - not impossible, but it requires more scaffolding.
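A minimal sketch of that listener idea, using Dagger 1's ObjectGraph API (AppModule and the attribute name are assumptions, not something Dagger ships):

import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;

import dagger.ObjectGraph;

public class DaggerContextListener implements ServletContextListener {

    @Override
    public void contextInitialized(ServletContextEvent event) {
        // build the application-wide graph once at startup
        ObjectGraph graph = ObjectGraph.create(new AppModule()); // AppModule is hypothetical
        event.getServletContext().setAttribute("daggerObjectGraph", graph);
    }

    @Override
    public void contextDestroyed(ServletContextEvent event) {
        // nothing to tear down for a plain object graph
    }
}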
While there is no stock infrastructure for this, I did the following:
I put the ObjectGraph into the ServletContext of the web server. Then, for each Servlet, I can do the following:

@Inject
SomeDependency dependency;

@Inject
SomeOtherDependency otherDependency;

@Override
public void init(ServletConfig config) throws ServletException {
    super.init(config);
    ((ObjectGraph) config.getServletContext()
            .getAttribute(DaggerConstants.DAGGER_OBJECT_GRAPH)).inject(this);
}
where I have previously defined the DaggerConstants myself.
There are likely a variety of ways to get the ObjectGraph into the ServletContext, depending on what your application is. We use an embedded Jetty server, so we control everything during startup. Not sure how you would do it in a general container, but presuming you instantiate your main ObjectGraph through some init servlet, you would do it there:
servletContext.setAttribute(DaggerConstants.DAGGER_OBJECT_GRAPH, objectGraph);
Note that our application uses a single ObjectGraph for the entire application, which might not be your situation.

How to bring Spring Roo & GWT together

I am trying to develop a Spring Roo/GWT app with the newest integration of GWT in Roo.
Getting the scaffolding to work is very straightforward, but I don't really understand how the RPC works there.
Can one of you provide a simple example of how to build a simple service to connect client and server within Spring Roo and GWT?
It would be very helpful for a start, as I couldn't find any resources on that.
Thanks & regards,
Flo
Flo,
Not sure if you are up on Google Wave at all, but that does seem to be one place to keep apace of the current effort. Specifically, this wave is available to the public:
RequestFactory Wave
It covers the details (well emerging details) about the RequestFactory API.
The basic idea is that your domain model objects are needed on the server side and the client side. Using Hibernate can cause issues with the class files, and people have wound up keeping two sets of model objects and using custom GWT-RPC to make server requests and marshal/unmarshal between the client- and server-side model objects. Not an ideal solution. Even if you can use the same model objects, the overhead of the RPC is a drag.
Enter RequestFactory, and we see that Google engineers are probably getting paid what they are worth. Take a look at the sample code generated from the .roo script - specifically ApplicationRequestFactory.java.
package com.springsource.extrack.gwt.request;
import com.google.gwt.requestfactory.shared.RequestFactory;
public interface ApplicationRequestFactory extends RequestFactory {
    ReportRequest reportRequest();
    ExpenseRequest expenseRequest();
    EmployeeRequest employeeRequest();
}
This is an interface that provides request methods for each of the domain objects. There is no implementation of this class defined in the project. It is instantiated in the EntryPoint with a call to GWT.create(...):
final ApplicationRequestFactory requestFactory =
GWT.create(ApplicationRequestFactory.class);
requestFactory.init(eventBus);
Within the com.springsource.extrack.gwt.request package you will see an ApplicationEntityTypesProcessor.java, which cleverly uses generics to package the references to the domain classes for use later in the presentation. The rest of that package, though, consists of events and handlers for each model object.
Specifically there are four auto-generated classes for each object:
EmployeeRecord.java - This is a DTO for the domain object.
EmployeeRecordChanged.java - This is a RecordChanged event to provide a hook method onEmployeeChanged.
EmployeeChangedHandler.java - This is an interface that will be implemented when specific behaviour for the onEmployeeChanged is needed.
EmployeeRequest.java - This is an interface used by the ApplicationRequestFactory to package up the various access methods for a given object.
Keep in mind there is a lot of code generated behind the scenes to support all this. And from M1 to M2 a lot has been cleaned out of what is visible in a GWT project. I'd expect there to be more changes, but not as drastic as M1 to M2 was.
So finally these events can be used in the UI package to tie together the domain and the UI, as in ReportListActivity.java:
public void start(Display display) {
    this.registration = eventBus.addHandler(ReportRecordChanged.TYPE,
            new ReportChangedHandler() {
                public void onReportChanged(ReportRecordChanged event) {
                    update(event.getWriteOperation(), event.getRecord());
                }
            });
    super.start(display);
}
Again I refer you to the wave for more info. Plus the expenses.roo demonstrates how to use Places and has a rather slick Activity framework as well. Happy GWTing.
Regards.
The functionality you are referring to is currently under heavy development (or so the guys at Google want us to believe ;)), so the API and internal workings are not final and will most likely still change before the final release of GWT 2.1 (this was stated a few times during the GWT sessions at Google I/O 2010). However, you can browse the Bikeshed sample in the trunk to see a working (hopefully ;)) example. There's also the 2.1 branch that appears to contain the up-to-date (?) sample (and the cookbook that was promised at Google I/O).
Personally, I'd hold off on switching your code to the new RPC model until the guys working on GWT say it's safe to do so ;) (but it's definitely a good idea to get acquainted with the general idea now - it's not like they will change everything :D).