Explain RestEasy providers, resources and singletons in relation to classes vs instances?

I'm building a JAX-RS app that consists of a stockroom and a workplace. The stockroom holds a set of Java classes that can be instantiated (via AJAX) to create named instances of those classes in the workplace.
So far I'm able to reference the stockroom and workplace fine by declaring them as "singletons" in the RestEasy application:
singletons.add(StockPlace.getInstance());
singletons.add(WorkPlace.getInstance());
I can't figure out how the stockroom content classes should be handled. The effect I'm trying to achieve is that when I dynamically create an instance of one of the stockroom classes, that instance can be dynamically accessed via REST commands. I've tried various permutations of:
classes.add(SomeComponent.class);
I think I'm missing knowledge of how Java classes work as factories for making instances, and how both of these relate to what RestEasy calls classes, singletons (singletons ARE classes, yet RestEasy registers them as instances) and resources (instances?).
I suspect I'll wind up needing to dynamically register new instances but can't find a way to do that either. I did find a way to do it given the ServletContext, but am not able to get access to that either. Can someone get me on the right track?

Our eventual answer to this question was to bail out of RestEasy and convert to DropWizard. That problem and many others vanished and everything became easy again.

I believe I know what you are after, and I should at least be able to give you a push in the right direction.
You will need to add the annotated RESTEasy class(es) to the registry. Below is the class I used for a recent project. It adds to the singletons (as you did), but it also adds to the registry.
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

import org.springframework.beans.factory.annotation.Autowired;

public class RESTEasyServerApplication extends javax.ws.rs.core.Application
{
    // The RESTEasy registry
    @Autowired
    protected org.jboss.resteasy.spi.Registry registry;

    // The annotated RESTEasy handler classes
    private Set<Object> singletons = new HashSet<Object>();
    private List<Object> handlers = new ArrayList<Object>();

    public RESTEasyServerApplication()
    {}

    @Override
    public Set<Object> getSingletons()
    {
        return singletons;
    }

    // Spring injection support
    public void setHandlers( List<Object> handlers )
    {
        for( Object handler : handlers )
        {
            if( registry != null )
            {
                // Save a reference to the handler
                this.handlers.add( handler );
                // Register the handler with RESTEasy
                registry.addSingletonResource( handler );
            }
            singletons.add( handler );
        }
    }

    // Spring injection support
    public List<Object> getHandlers()
    {
        return handlers;
    }
}
I used Spring, and here is the relevant configuration:
<!-- RESTeasy/Spring integration -->
<import resource="classpath:springmvc-resteasy.xml" />

<!-- RESTeasy server application -->
<bean id="application" class="blah.blah.resteasy.RESTEasyServerApplication">
    <property name="handlers">
        <list>
            <!-- Application specific handler classes -->
            <ref bean="sample"/>
        </list>
    </property>
</bean>
It should be easy to modify this, or add a method that accepts a single annotated RESTEasy handler instance, to make it work dynamically as required. The registry is defined in the springmvc-resteasy.xml file.
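For instance, a minimal sketch of such a method, reusing the injected registry and the handlers/singletons fields from the class above (the method name addHandler is my own, not part of the original project):

public void addHandler( Object handler )
{
    if( registry != null )
    {
        // Expose the already-created instance as a REST resource
        registry.addSingletonResource( handler );
    }
    handlers.add( handler );
    singletons.add( handler );
}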

Since I've found no answers that don't involve strapping another whole layer of complexity (Spring) onto RestEasy, the solution I found livable is outlined in the final comment above. That is, don't rely on sending remote messages to instances unless the app is truly stateless (i.e. instances don't need to persist across messages). Only send remote messages to singletons, which do persist across requests. Each such message can identify the desired instance (by a String id, in my case), and the singleton can forward to the identified instance as an ordinary POJO.
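To illustrate, here is a rough sketch of that arrangement, assuming the WorkPlace singleton from the question and a hypothetical SomeComponent POJO with a describe() method; the paths and names are mine, not a verified part of the original app:

import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import javax.ws.rs.GET;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;

@Path("/workplace")
public class WorkPlace {

    // POJO instances created from stockroom classes, keyed by the caller-supplied id
    private final Map<String, SomeComponent> instances = new ConcurrentHashMap<>();

    @POST
    @Path("/{id}")
    public Response create(@PathParam("id") String id) {
        instances.put(id, new SomeComponent());
        return Response.status(Response.Status.CREATED).build();
    }

    @GET
    @Path("/{id}")
    @Produces(MediaType.APPLICATION_JSON)
    public Response get(@PathParam("id") String id) {
        SomeComponent instance = instances.get(id);
        if (instance == null) {
            return Response.status(Response.Status.NOT_FOUND).build();
        }
        // Forward to the identified instance as an ordinary POJO
        return Response.ok(instance.describe()).build();
    }
}

Only the WorkPlace singleton is registered with RestEasy; the SomeComponent instances never are, so their lifecycle stays entirely under the application's control.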
I still don't see why RestEasy unconditionally treats non-Singletons (instances) as ephemeral. Statelessness is not a restriction on REST, only a restriction on when GET methods can be used (idempotent calls). PUT and POST calls are neither stateless nor idempotent.
That is my understanding, of course; feel free to correct me. My focus is getting this app on the air, not exploring every corner of RestEasy, REST, and certainly not Spring.

Related

VaadinServiceInitListener not picked up in a Quarkus app

I have a Quarkus application using current versions of Vaadin Flow and Quarkus (23.2.4 and 2.13.1.Final). I want to have a VaadinServiceInitListener to check access annotations on the views (@RolesAllowed(...)) using AccessAnnotationChecker. I believe annotating the implementation with @VaadinServiceEnabled
should fix this, but I need to register it in META-INF/services/com.vaadin.flow.server.VaadinServiceInitListener to have it activated. This is how to do it when not using a dependency injection framework. Then everything works as expected and I can use AccessAnnotationChecker to see if the user has access to that view, on BeforeEnterEvent.
I also notice the message Can't find any @VaadinServiceScoped bean implementing 'I18NProvider'. Cannot use CDI beans for I18N, falling back to the default behavior. on startup. Strangely, implementing I18NProvider in a class and annotating it with @VaadinServiceEnabled and @VaadinServiceScoped makes that message go away, i.e. it is recognized by CDI.
Why isn't my VaadinServiceInitListener implementation recognized? Currently it is annotated with
@VaadinServiceEnabled
@VaadinServiceScoped
@Unremovable
My pom.xml includes
vaadin-quarkus-extension,
quarkus-oidc,
quarkus-keycloak-authorization,
vaadin-jandex
Instead of using a listener, you can use a CDI event.
Quarkus's dependency injection solution is based on CDI, so you can use the same events. Here's an example:
import javax.enterprise.event.Observes;

import com.vaadin.flow.server.ServiceInitEvent;
import com.vaadin.flow.server.communication.IndexHtmlResponse;

public class BootstrapCustomizer {

    private void onServiceInit(@Observes ServiceInitEvent serviceInitEvent) {
        serviceInitEvent.addIndexHtmlRequestListener(this::modifyBootstrapPage);
    }

    private void modifyBootstrapPage(IndexHtmlResponse response) {
        response.getDocument().body().append("<p>By CDI add-on</p>");
    }
}
More information here: https://vaadin.com/docs/latest/integrations/cdi/events

Can I use SpringData by itself [duplicate]

I'm trying to wire up Spring Data JPA objects manually so that I can generate DAO proxies (aka Repositories) - without using a Spring bean container.
Inevitably, I will be asked why I want to do this: it is because our project is already using Google Guice (and on the UI using Gin with GWT), and we don't want to maintain another IoC container configuration, or pull in all the resulting dependencies. I know we might be able to use Guice's SpringIntegration, but this would be a last resort.
It seems that everything is available to wire the objects up manually, but since it's not well documented, I'm having a difficult time.
According to the Spring Data user's guide, using repository factories standalone is possible. Unfortunately, the example shows RepositoryFactorySupport, which is an abstract class. After some searching I managed to find JpaRepositoryFactory.
JpaRepositoryFactory actually works fairly well, except it does not automatically create transactions. Transactions must be managed manually, or nothing will get persisted to the database:
entityManager.getTransaction().begin();
repositoryInstance.save(someJpaObject);
entityManager.getTransaction().commit();
The problem turned out to be that @Transactional annotations are not applied automatically, and need the help of a TransactionInterceptor.
Thankfully, the JpaRepositoryFactory can take a callback to add more AOP advice to the generated Repository proxy before returning:
final JpaTransactionManager xactManager = new JpaTransactionManager(emf);
final JpaRepositoryFactory factory = new JpaRepositoryFactory(emf.createEntityManager());

factory.addRepositoryProxyPostProcessor(new RepositoryProxyPostProcessor() {
    @Override
    public void postProcess(ProxyFactory factory) {
        factory.addAdvice(new TransactionInterceptor(xactManager, new AnnotationTransactionAttributeSource()));
    }
});
This is where things are not working out so well. Stepping through the debugger in the code, the TransactionInterceptor is indeed creating a transaction - but on the wrong EntityManager. Spring manages the active EntityManager by looking at the currently executing thread. The TransactionInterceptor does this and sees there is no active EntityManager bound to the thread, and decides to create a new one.
However, this new EntityManager is not the same instance that was created and passed into the JpaRepositoryFactory constructor, which requires an EntityManager. The question is, how do I make the TransactionInterceptor and the JpaRepositoryFactory use the same EntityManager?
Update:
While writing this up, I found out how to solve the problem, but it still may not be the ideal solution. I will post this solution as a separate answer. I would be happy to hear any suggestions on a better way to use Spring Data JPA standalone than how I've solved it.
The general principle behind the design of JpaRepositoryFactory and the corresponding Spring integration JpaRepositoryFactoryBean is the following:
We're assuming you run your application inside a managed JPA runtime environment, not caring about which one.
That's the reason we rely on an injected EntityManager rather than an EntityManagerFactory. By definition the EntityManager is not thread-safe, so if we dealt with an EntityManagerFactory directly, we would have to rewrite all the resource-managing code that a managed runtime environment (such as Spring or EJB) would otherwise provide for you.
To integrate with the Spring transaction management we use Spring's SharedEntityManagerCreator that actually does the transaction resource binding magic you've implemented manually. So you probably want to use that one to create EntityManager instances from your EntityManagerFactory. If you want to activate the transactionality at the repository beans directly (so that a call to e.g. repo.save(…) creates a transaction if none is already active) have a look at the TransactionalRepositoryProxyPostProcessor implementation in Spring Data Commons. It actually activates transactions when Spring Data repositories are used directly (e.g. for repo.save(…)) and slightly customizes the transaction configuration lookup to prefer interfaces over implementation classes to allow repository interfaces to override transaction configuration defined in SimpleJpaRepository.
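For what it's worth, a minimal sketch of that suggestion, assuming the same FooRepository and "com.foo.model" persistence unit used in the answer below (not verified against any particular Spring Data version); this shows only how the shared EntityManager would be obtained, while transaction demarcation still has to be configured as discussed:

import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;

import org.springframework.data.jpa.repository.support.JpaRepositoryFactory;
import org.springframework.orm.jpa.SharedEntityManagerCreator;

EntityManagerFactory emf = Persistence.createEntityManagerFactory("com.foo.model");

// The shared EntityManager is a thread-safe proxy that delegates to whatever
// transactional EntityManager is bound to the current thread
EntityManager shared = SharedEntityManagerCreator.createSharedEntityManager(emf);

JpaRepositoryFactory factory = new JpaRepositoryFactory(shared);
FooRepository repository = factory.getRepository(FooRepository.class);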
I solved this by manually binding the EntityManager and EntityManagerFactory to the executing thread, before creating repositories with the JpaRepositoryFactory. This is accomplished using the TransactionSynchronizationManager.bindResource method:
emf = Persistence.createEntityManagerFactory("com.foo.model", properties);
em = emf.createEntityManager();

// Create your transaction manager and RepositoryFactory
final JpaTransactionManager xactManager = new JpaTransactionManager(emf);
final JpaRepositoryFactory factory = new JpaRepositoryFactory(em);

// Make sure calls to the repository instance are intercepted for annotated transactions
factory.addRepositoryProxyPostProcessor(new RepositoryProxyPostProcessor() {
    @Override
    public void postProcess(ProxyFactory factory) {
        factory.addAdvice(new TransactionInterceptor(xactManager, new MatchAlwaysTransactionAttributeSource()));
    }
});

// Create your repository proxy instance
FooRepository repository = factory.getRepository(FooRepository.class);

// Bind the same EntityManager used to create the repository to the thread
TransactionSynchronizationManager.bindResource(emf, new EntityManagerHolder(em));
try {
    repository.save(someInstance); // Done in a transaction using one EntityManager
} finally {
    // Make sure to unbind when done with the repository instance
    TransactionSynchronizationManager.unbindResource(emf);
}
There must be a better way though. It seems strange that the RepositoryFactory was designed to use an EntityManager instead of an EntityManagerFactory. I would expect it to first look to see if an EntityManager is bound to the thread and then either create a new one and bind it, or use an existing one.
Basically, I would want to inject the repository proxies and expect that on every call they internally create a new EntityManager, so that calls are thread-safe.

No event context active - RESTeasy, Seam

I'm trying to add a RESTful web service with RESTeasy to our application running on JBoss 7.x, using Seam 2.
I wanted to use as little Seam as possible, but I need it for dependency injection.
My REST endpoints are as follows:
#Name("myEndpoint")
#Stateless
#Path("/path")
#Produces(MediaType.APPLICATION_JSON+"; charset=UTF-8")
public class MyEndpoint {
#In private FooService fooService;
#GET
#Path("/foo/{bar}")
public Response foobar(#CookieParam("sessionId") String sessionId,
#PathParam("bar") String bar)
{ ... }
}
I'm using a class extending Application. There is no XML config.
I can use the web service methods and they work, but I always get an IllegalStateException:
Exception processing transaction Synchronization after completion: java.lang.IllegalStateException: No event context active
Complete StackTrace
I did try everything in the documentation, but I can't get rid of it. If I leave out the @Stateless annotation, I don't get any injection done. Adding @Scope doesn't do jack. Accessing the service via seam/resource/ doesn't even work (even without the Application class with @ApplicationPath).
It goes away if I don't use Dep. Injection, but instead add to each and every method
fooService = Component.getInstance("fooService");
Lifecycle.beginCall();
...
Lifecycle.endCall();
which isn't really a good solution. Nah, doesn't work either...
I have resolved the issue. For some reason (still not sure why, maybe because I tried to use annotations and code exclusively and no XML config), my REST service was available under a "non-standard" URL.
Usually it'd be something like "/seam/resources/rest".
Anyway, if you have a "custom" path, Seam doesn't know it should inject a context. You need to add <web:context-filter url-pattern="something" /> to your components.xml.
Specifically, we already had this tag, but with the attribute regex-url-pattern, and I extended it to match the REST URL.

Using Guice/Peaberry for osgi declarative services

I want to solve the following problem and need advice, what the best solution is.
I have a bundle A in which a service interface X is defined. A bundle B provides a service implementation of X and contributes the implementation to the tool. A and B use Google Guice and Peaberry to configure the setup of the objects.
There are two possibilities I can use to contribute the service implementation:
Using an eclipse extension:
In this solution I can use the GuiceExtensionFactory mechanism of Peaberry to create the service implementation using Guice and therefore can inject stuff needed by the implementation. The disadvantage here is that in the bundle defining the extension point, I need the boilerplate code for the resolution of the extensions because there is to my knowledge no way to get the extensions injected into the class which uses the extensions.
This looks like this:
<extension point="A.service.X">
    <xservice
        ...
        class="org.ops4j.peaberry.eclipse.GuiceExtensionFactory:B.XImpl"
        .../>
</extension>
<extension point="org.ops4j.peaberry.eclipse.modules">
    <module class="B.XModule">
    </module>
</extension>
but I need the boilerplate code like this:
private List<X> getRegisteredX() {
    final List<X> ximpls = new ArrayList<>();
    for (final IConfigurationElement e : Platform.getExtensionRegistry().getConfigurationElementsFor(X_EXTENSION_POINT_ID)) {
        try {
            final Object object = e.createExecutableExtension("class"); //$NON-NLS-1$
            if (object instanceof X) {
                ximpls.add((X) object);
            }
        } catch (final CoreException ex) {
            // Log
        }
    }
    return ximpls;
}
Using an OSGI service:
My main problem here is to ensure that the service is registered. I want the bundle to be loaded lazily, so at least one access to a class of the bundle is required. Registering the service programmatically using Peaberry has an issue, because nobody ever asks for a class of the bundle. The solution is to provide the service as a declarative service, but I do not know of a way to create the service implementation such that I can use Guice to inject the required objects.
So I have some questions:
Is there something I do not know so far that implements the code needed to read the extensions at an extension point generically and allows the extensions to be injected into the class that uses them?
Is there a way to ensure that the service is provided even if it is added using the standard Peaberry mechanism, i.e., the bundle is activated when the service is requested?
Is there a way like the GuiceExtensionFactory for declarative services, so that the creation of the service implementation can be done by the injector of the bundle?
Something that look like:
<?xml version="1.0" encoding="UTF-8"?>
<scr:component xmlns:scr="http://www.osgi.org/xmlns/scr/v1.1.0" name="Ximpl">
    <implementation class="some.generic.guiceaware.ServiceFactory:B.Ximpl"/>
    <service>
        <provide interface="A.X"/>
    </service>
</scr:component>
Summarized, I want a service implementation generated by Guice and I want to get the service implementations simply injected into the classes using the service without extensive boilerplate code. Has anybody a solution for that?
Sorry to ask, but I searched the web for quite a while and so far I have not found a solution.
Thanks and best regards,
Lars
I found a solution, but since it took a lot of experimenting and thinking to get there, I thought I would share it here. From the options I mentioned in my posting, my solution uses the first one, that is, Eclipse extension points and extensions. In order to use Guice in the context of extension points there are two aspects to consider:
Providing an extension that is created by a Guice injector
This is explained very well here: https://code.google.com/p/peaberry/wiki/GuiceExtensionFactory. There is one remark to make from my side. The creation of the extension object is done in an injector inside the GuiceExtensionFactory, so it is a separate context, which needs to be configured by the module given as an additional extension to the factory. This can become an issue if you have other needs that require creating the injector in the bundle on your own.
Defining an extension point so that the extensions are simply injected into the classes which use the extensions.
The first thing to do is to define the extension point schema file as usual. It should contain the reference to an interface that has to be implemented by the extensions.
The id of the extension point has to be connected to the interface which is provided by the extensions and which is injected by Guice/Peaberry. For this, Peaberry provides an annotation with which to annotate the interface:
import org.ops4j.peaberry.eclipse.ExtensionBean;

@ExtensionBean("injected.extension.point.id")
public interface InjectedInterface {
    ...
}
On some web pages you also find the information that if the id is equal to the qualified name of the interface, it can be found directly without the annotation but I did not try this out.
In order to enable the injection, you have to do two things to configure the Guice injector creation.
First the EclipseRegistry object of Peaberry has to be set as ServiceRegistry. Second the binding of the extension implementations to a provided service has to be done.
The injector creation has to be done in this way:
import org.osgi.framework.BundleContext;
import com.google.inject.Guice;
import com.google.inject.Injector;
import org.ops4j.peaberry.eclipse.EclipseRegistry;
import static org.ops4j.peaberry.Peaberry.*;
void initializer() {
Injector injector = Guice.createInjector(osgiModule(context, EclipseRegistry.eclipseRegistry()), new Module() {
binder.bind(iterable(InjectedInterface.class)).toProvider(service(InjectedInterface.class).multiple());
});
}
The extension implementations can then simply be injected like this:
private Iterable<InjectedInterface> registeredExtensions;

@Inject
void setSolvers(final Iterable<InjectedInterface> extensions) {
    registeredExtensions = extensions;
}
In this way it is possible to have extensions injected that are themselves implemented by classes which use Guice to get their dependencies injected.
I did not find a solution to use osgi services so far, but perhaps there is someone who has an idea.
Best regards,
Lars

OSGi services - best practice

I'm loving OSGi services more and more and want to realize a lot more of my components as services. Now I'm looking for best practices, especially for UI components.
For listener relations I use the whiteboard pattern, which IMHO is the best approach. However, if I want more than just notifications, I can think of three possible solutions.
Imagine the following scenario:
interface IDatabaseService {
    EntityManager getEntityManager();
}
[1] Whiteboard pattern - with self-setting service
I would create a new service interface:
interface IDatabaseServiceConsumer {
    void setDatabaseService(IDatabaseService service);
}
and create a declarative IDatabaseService component with a bindConsumer method like this
protected void bindConsumer(IDatabaseServiceConsumer consumer) {
    consumer.setDatabaseService(this);
}

protected void unbindConsumer(IDatabaseServiceConsumer consumer) {
    consumer.setDatabaseService(null);
}
This approach assumes that there's only one IDatabaseService.
[Update] Usage would look like this:
class MyUIClass ... {
    private IDatabaseService dbService;

    IDatabaseServiceConsumer c = new IDatabaseServiceConsumer() {
        @Override
        public void setDatabaseService(IDatabaseService service) {
            dbService = service;
        }
    };

    Activator.registerService(IDatabaseServiceConsumer.class, c, null);
    ...
}
[2] Make my class a service
Image a class like
public class DatabaseEntryViewer extends TableViewer
Now I just add bind/unbind methods for my IDatabaseService, add a component.xml, and register my DatabaseEntryViewer in it. This approach assumes that there is a no-argument constructor and that I create the UI components via an OSGi service factory.
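A rough sketch of what I mean, with the TableViewer superclass and the component.xml wiring omitted; bindDatabaseService/unbindDatabaseService are the names I would reference from the component description:

public class DatabaseEntryViewer /* extends TableViewer */ {

    private volatile IDatabaseService databaseService;

    // Called by the SCR when an IDatabaseService becomes available
    protected void bindDatabaseService(IDatabaseService service) {
        this.databaseService = service;
        // e.g. refresh the table contents from service.getEntityManager()
    }

    // Called by the SCR when the service goes away
    protected void unbindDatabaseService(IDatabaseService service) {
        if (this.databaseService == service) {
            this.databaseService = null;
        }
    }
}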
[3] Classic way: ServiceTracker
The classic way: register a static ServiceTracker in my Activator and access it. The class which uses the tracker must handle the dynamics.
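For completeness, a bare-bones sketch of option [3], assuming the IDatabaseService interface above (error handling omitted):

import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;
import org.osgi.util.tracker.ServiceTracker;

public class Activator implements BundleActivator {

    private static ServiceTracker<IDatabaseService, IDatabaseService> tracker;

    @Override
    public void start(BundleContext context) {
        tracker = new ServiceTracker<>(context, IDatabaseService.class, null);
        tracker.open();
    }

    @Override
    public void stop(BundleContext context) {
        tracker.close();
    }

    // May return null; the caller has to handle the service coming and going
    public static IDatabaseService getDatabaseService() {
        return tracker.getService();
    }
}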
Currently I'm favoring the first one, as this approach doesn't complicate object creation and saves the Activator from endless static ServiceTrackers.
I have to agree with @Neil Bartlett, your option 1 is backward. You are in effect using an Observer/Observable pattern.
Number 2 is not going to work, since the way UI object lifecycles are managed in RCP won't allow you to do what you want. The widget will have to be created as part of the initialization of some sort of view container (ViewPart, Dialog, ...). This view part is typically configured and managed via the Workbench/plugin mechanism. You should work with this, not against it.
Number 3 would be a simple option, not necessarily the best, but simple.
If you use Spring DM, then you can easily accomplish number 2. It provides a means to inject your service beans into your UI views, pages, etc. You use a Spring factory to create your views (as defined in your plugin.xml); the factory is configured via a Spring configuration, which is capable of injecting your services into the bean.
You may also be able to combine the technique used by the SpringExtensionFactory class along with DI to accomplish the same thing, without introducing another piece of technology. I haven't tried it myself so I cannot comment on the difficulty, although it is what I would try to do to bridge the gap between RCP and OSGi if I wasn't already using Spring DM.