How to create a JpaRepository but NOT using dependency injection - spring-data-jpa

In my application I am currently creating a JpaRepository via the usual mechanism, i.e. by defining an interface like so:
public interface HistoryInfoRepository extends JpaRepository<HistoryInfo, Long>
This all works fine, BUT if the DB is not available or not accessible when the application starts up, then the creation of the repository bean fails, the injection into the respective service (which receives that repo as an autowired constructor argument) fails, and as a consequence my application's main class:
SpringApplication.run(Application.class, args);
fails as well. I.e. the entire application just dies with a gigantic stack trace.
I would like to make this situation a bit more user-friendly, in that the application at least comes up far enough to display a (hopefully) helpful error message with an explanation and some hints about what should be checked.
That would require NOT autowiring the JpaRepository but creating it programmatically, e.g. in said service's constructor, wrapped in some try-catch construct.
I don't want to give up the convenience that the JpaRepository mechanism provides, but how can one create such a JpaRepository programmatically when there is only an interface defined and everything else is done automagically by Spring?
Is there some API-call to create such a JpaRepository for a given interface?
I would like to see something like this:
public HistoryInfoService( /* HistoryInfoRepository historyRepo */ ) {
    ...
    try {
        historyRepo = createJpaRepositoryFromInterface(HistoryInfoRepository.class);
    } catch (Exception ex) {
        // inform user about missing or non-reachable DB server and suggest remedies...
    }
    ...
}
Here I commented out the constructor argument which is normally provided via autowiring; instead I want to create the JpaRepository using some method call. That would allow me to react properly to the failed repo creation, as indicated by the comment in the catch clause.
I searched for something like that but I must have been using the wrong search terms and found nothing so far. Any hints?
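The related posts below point to Spring Data's JpaRepositoryFactory as an API for this kind of programmatic creation. As a rough sketch only (the persistence unit name is a placeholder and transaction handling is left out; see the caveats discussed in the answers below), the constructor could look roughly like this:
import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;
import org.springframework.data.jpa.repository.support.JpaRepositoryFactory;

public HistoryInfoService() {
    try {
        // "history-unit" is a placeholder persistence unit name
        EntityManagerFactory emf = Persistence.createEntityManagerFactory("history-unit");
        EntityManager em = emf.createEntityManager();
        // JpaRepositoryFactory builds the proxy for the plain repository interface
        JpaRepositoryFactory factory = new JpaRepositoryFactory(em);
        historyRepo = factory.getRepository(HistoryInfoRepository.class);
    } catch (Exception ex) {
        // DB missing or unreachable: keep the application alive and
        // show a helpful message instead of failing at startup
    }
}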

Related

Can I use SpringData by itself [duplicate]

I'm trying to wire up Spring Data JPA objects manually so that I can generate DAO proxies (aka Repositories) - without using a Spring bean container.
Inevitably, I will be asked why I want to do this: it is because our project is already using Google Guice (and Gin with GWT on the UI), and we don't want to maintain another IoC container configuration or pull in all the resulting dependencies. I know we might be able to use Guice's SpringIntegration, but that would be a last resort.
It seems that everything is available to wire the objects up manually, but since it's not well documented, I'm having a difficult time.
According to the Spring Data user's guide, using the repository factories standalone is possible. Unfortunately, the example shows RepositoryFactorySupport, which is an abstract class. After some searching I managed to find JpaRepositoryFactory.
JpaRepositoryFactory actually works fairly well, except that it does not automatically create transactions. Transactions must be managed manually, or nothing will get persisted to the database:
entityManager.getTransaction().begin();
repositoryInstance.save(someJpaObject);
entityManager.getTransaction().commit();
The problem turned out to be that @Transactional annotations are not applied automatically and need the help of a TransactionInterceptor.
Thankfully, the JpaRepositoryFactory can take a callback to add more AOP advice to the generated Repository proxy before returning:
final JpaTransactionManager xactManager = new JpaTransactionManager(emf);
final JpaRepositoryFactory factory = new JpaRepositoryFactory(emf.createEntityManager());
factory.addRepositoryProxyPostProcessor(new RepositoryProxyPostProcessor() {
    @Override
    public void postProcess(ProxyFactory factory) {
        factory.addAdvice(new TransactionInterceptor(xactManager, new AnnotationTransactionAttributeSource()));
    }
});
This is where things are not working out so well. Stepping through the code in a debugger, the TransactionInterceptor is indeed creating a transaction - but on the wrong EntityManager. Spring manages the active EntityManager by looking at the currently executing thread. The TransactionInterceptor does this, sees there is no active EntityManager bound to the thread, and decides to create a new one.
However, this new EntityManager is not the same instance that was created and passed into the JpaRepositoryFactory constructor, which requires an EntityManager. The question is, how do I make the TransactionInterceptor and the JpaRepositoryFactory use the same EntityManager?
Update:
While writing this up, I found out how to solve the problem, though it may not be the ideal solution. I will post this solution as a separate answer. I would be happy to hear any suggestions on a better way to use Spring Data JPA standalone than how I've solved it.
The general principle behind the design of JpaRepositoryFactory and the corresponding Spring integration JpaRepositoryFactoryBean is the following:
We assume you run your application inside a managed JPA runtime environment, regardless of which one.
That's the reason we rely on an injected EntityManager rather than an EntityManagerFactory. By definition the EntityManager is not thread-safe, so if we dealt with an EntityManagerFactory directly we would have to rewrite all the resource-managing code that a managed runtime environment (such as Spring or EJB) would otherwise provide for you.
To integrate with Spring's transaction management we use Spring's SharedEntityManagerCreator, which does the transaction resource binding magic you've implemented manually. So you probably want to use that one to create EntityManager instances from your EntityManagerFactory. If you want to activate transactionality at the repository beans directly (so that a call to e.g. repo.save(…) creates a transaction if none is already active), have a look at the TransactionalRepositoryProxyPostProcessor implementation in Spring Data Commons. It activates transactions when Spring Data repositories are used directly (e.g. for repo.save(…)) and slightly customizes the transaction configuration lookup to prefer interfaces over implementation classes, allowing repository interfaces to override the transaction configuration defined in SimpleJpaRepository.
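A rough sketch of the SharedEntityManagerCreator approach described above, borrowing the names used in the code below (an illustration only; the exact setup may differ):
import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;
import org.springframework.data.jpa.repository.support.JpaRepositoryFactory;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.orm.jpa.SharedEntityManagerCreator;
import org.springframework.transaction.support.TransactionTemplate;

EntityManagerFactory emf = Persistence.createEntityManagerFactory("com.foo.model");

// The shared EntityManager proxy delegates to whatever transactional
// EntityManager Spring has bound to the current thread, so no manual
// bindResource/unbindResource calls are needed.
EntityManager shared = SharedEntityManagerCreator.createSharedEntityManager(emf);

JpaTransactionManager txManager = new JpaTransactionManager(emf);
JpaRepositoryFactory factory = new JpaRepositoryFactory(shared);
FooRepository repository = factory.getRepository(FooRepository.class);

// Drive the transaction programmatically; inside it the shared proxy
// resolves to the transaction-bound EntityManager.
new TransactionTemplate(txManager).execute(status -> repository.save(someInstance));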
I solved this by manually binding the EntityManager and EntityManagerFactory to the executing thread, before creating repositories with the JpaRepositoryFactory. This is accomplished using the TransactionSynchronizationManager.bindResource method:
emf = Persistence.createEntityManagerFactory("com.foo.model", properties);
em = emf.createEntityManager();

// Create your transaction manager and RepositoryFactory
final JpaTransactionManager xactManager = new JpaTransactionManager(emf);
final JpaRepositoryFactory factory = new JpaRepositoryFactory(em);

// Make sure calls to the repository instance are intercepted for annotated transactions
factory.addRepositoryProxyPostProcessor(new RepositoryProxyPostProcessor() {
    @Override
    public void postProcess(ProxyFactory factory) {
        factory.addAdvice(new TransactionInterceptor(xactManager, new MatchAlwaysTransactionAttributeSource()));
    }
});

// Create your repository proxy instance
FooRepository repository = factory.getRepository(FooRepository.class);

// Bind the same EntityManager used to create the repository to the thread
TransactionSynchronizationManager.bindResource(emf, new EntityManagerHolder(em));
try {
    repository.save(someInstance); // Done in a transaction using one EntityManager
} finally {
    // Make sure to unbind when done with the repository instance
    TransactionSynchronizationManager.unbindResource(emf);
}
There must be a better way, though. It seems strange that the RepositoryFactory was designed to use an EntityManager instead of an EntityManagerFactory. I would expect it to first check whether an EntityManager is bound to the thread and then either create a new one and bind it, or use the existing one.
Basically, I would want to inject the repository proxies and expect every call to internally create a new EntityManager, so that the calls are thread-safe.

Limit instance count with Autofac?

I have a console app that will create an instance of a class and execute a method on it, and that's really all it does (but this method may do a lot of things). The class is determined at runtime based on command-line args, and it is registered with Autofac so it can be correctly resolved, supplying class-specific constructor parameters extracted from the command line. All this works.
Now, I need to impose a system-wide limit to the number of instances per class that can be running at any one time. I will probably use a simple SQL database to keep track of number of allowed and running instances per class, and I have no problem with the SQL side of things.
But how do I actually impose this limit in a nice manner using Autofac?
I am thinking that I would have some "slot service" that would do something like this:
Try to reserve a new instance "slot".
If no more slots, log a message and terminate the process.
If slot successfully reserved, create instance and return it.
My idea is also to free the instance's slot in the class' Dispose method, preferably by using another method on the slot service.
How would I fit this into Autofac?
One possibility would be to register the class I want to instantiate with a lambda/delegate that does the above steps. But in that case, how do I "terminate"? Throw an exception? That would require some code to catch the exception and either log it or simply ignore it before terminating the process. I don't like that. I'd like to keep the entire slot reservation stuff inside the delegate, lambda, or service.
Another solution might be to do the slot reservation outside of Autofac, but that also seems somewhat messy.
I would prefer a solution where the "slot service" itself can be nicely unit tested, i.e. non-static and with an interface, and preferably resolved with Autofac.
I'm sure I'm missing something obvious here... Any suggestions?
This is my "best bet" so far:
static void Main(string[] args)
{
    ReadCommandLine(args, out Type itemClass, out Type paramsClass, out Type paramsInterface, out object parameters);
    BuildContainer(itemClass, paramsClass, paramsInterface, parameters);

    IInstanceHandler ih = Container.Resolve<IInstanceHandler>();
    if (ih.RegisterInstance(itemClass, out long instanceid))
    {
        try
        {
            Container.Resolve<IItem>().Execute();
        }
        finally
        {
            ih.UnregisterInstance(itemClass, instanceid);
        }
    }
}

DI and inheritance

Another question came up during my migration from an E3 application to a pure E4 application.
I have a structure using inheritance, as shown in the following picture.
There I have an invocation sequence going from the AbstractRootEditor to the FormRootEditor to the SashCompositeSubView to the TableSubView.
There I want to use my EMenuService, but it is null because it can't be injected.
The AbstractRootEditor is the only class connected to the Application Model (as an MPart created out of an MPartDescriptor).
I'd like to inject the EMenuService in the AbstractSubView anyway, otherwise I would have to carry the service through all of my classes. But I don't have an IEclipseContext there (or do I?), since my AbstractSubView is not connected to the Application Model.
Is there any chance to get the service injected in the AbstractSubView?
EDIT:
I noticed that injecting it in my AbstractSubView isn't possible (?), so I'm trying to get it into my TableSubView instead.
After greg's comment I want to show some code:
in the AbstractRootEditor:
@PostConstruct
public final void createPartControl(Composite parent, @Active MPart mPart) {
    ...
    ContextInjectionFactory.make(TableSubView.class, mPart.getContext());
At first I got an exception saying that my TableSubView class has an invalid constructor, so now the constructor there is:
public TableSubView() {
    this.tableInputController = null;
}
as well as my field injection:
@Inject EMenuService eMenuService;
This is kind of not working; eMenuService is still null.
If you create your objects using ContextInjectionFactory they will be injected. Use:
MyClass myClass = ContextInjectionFactory.make(MyClass.class, context);
where context is an IEclipseContext (so you have to do this for every class, starting from one that is injected by Eclipse).
There is also a second version of ContextInjectionFactory.make which lets you provide two contexts, the second one being a temporary context that can contain additional values.
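For illustration, a sketch of the "do this for every class" chain, using class names from the question (the surrounding structure is assumed): an object that was itself created via injection can have the IEclipseContext injected and use it to build the next object, so that object's @Inject fields get filled too.
import javax.annotation.PostConstruct;
import javax.inject.Inject;
import org.eclipse.e4.core.contexts.ContextInjectionFactory;
import org.eclipse.e4.core.contexts.IEclipseContext;

public class SashCompositeSubView {

    // Available because this object was itself created via ContextInjectionFactory
    @Inject
    private IEclipseContext context;

    @PostConstruct
    void createChildren() {
        // TableSubView's @Inject fields (e.g. EMenuService) are populated here
        TableSubView tableSubView = ContextInjectionFactory.make(TableSubView.class, context);
    }
}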

No event context active - RESTeasy, Seam

I'm trying to add a RESTful web service with RESTeasy to our application running on JBoss 7.x, using Seam2.
I wanted to use as little Seam as possible, but I need it for Dependency Injection.
My REST endpoints are as follows:
@Name("myEndpoint")
@Stateless
@Path("/path")
@Produces(MediaType.APPLICATION_JSON + "; charset=UTF-8")
public class MyEndpoint {

    @In private FooService fooService;

    @GET
    @Path("/foo/{bar}")
    public Response foobar(@CookieParam("sessionId") String sessionId,
                           @PathParam("bar") String bar)
    { ... }
}
I'm using a class extending Application. There is no XML config.
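(For context, a minimal sketch of such an Application subclass; the class name and path here are placeholders, not taken from the question:)
import javax.ws.rs.ApplicationPath;
import javax.ws.rs.core.Application;

@ApplicationPath("/rest")
public class RestApplication extends Application {
    // left empty: resources annotated with @Path are picked up by the JAX-RS runtime
}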
I can use the web service methods and they work, but I always get an IllegalStateException:
Exception processing transaction Synchronization after completion: java.lang.IllegalStateException: No event context active
Complete StackTrace
I did try everything in the documentation, but I can't get rid of the exception. If I leave out the @Stateless annotation, I don't get any injection done. Adding @Scope doesn't do jack. Accessing the service via seam/resource/ doesn't even work (even without the Application class with @ApplicationPath).
It goes away if I don't use dependency injection, but instead add the following to each and every method:
fooService = Component.getInstance("fooService");
Lifecycle.beginCall();
...
Lifecycle.endCall();
which isn't really a good solution. Nah, doesn't work either...
I have resolved the issue. For some reason (still not sure why; maybe because I tried to use annotations and code exclusively, with no XML config), my REST service was available under a "non-standard" URL.
Usually it'd be something like "/seam/resources/rest".
Anyway, if you have a "custom" path, Seam doesn't know it should inject a context. You need to add <web:context-filter url-pattern="something" /> to your components.xml.
Specifically, we already had this tag, but with the attribute regex-url-pattern, and I extended it to match the REST URL.

Griffon integration tests with jpa

I'm writing a Griffon application with JavaFX and the JPA plugin. I have a service I'd like to test; this service makes use of the JPA plugin (withJpa {...}), and it is this database access that I want to test.
So I want to write the test so that it inserts some data, then checks that the service produces the right answer, thus verifying that the SQL query is correct.
I've written a simple test:
class ReportServiceTests extends GriffonUnitTestCase {
    GriffonApplication app

    public void testStats() {
        println app.getServices()
        println app.getControllers()
    }
}
but I cannot get hold of a valid service - both the println statements above produce "[:]".
How do I get hold of the 'ReportService' instance and exercise it against the database? I don't want to mock the database interaction.
Thanks.
There's no need to mock the database. As explained in http://griffon.codehaus.org/guide/latest/guide/testing.html#integrationTesting, applications reach the INITIALIZE phase during integration testing. Addons get initialized during this phase. Services, on the other hand, are initialized lazily: they get pulled in by MVC members when those are instantiated, so they are not instantiated out of the box when you call app.getServices(). However, you can instruct the application to eagerly instantiate all services, which will make your code work as expected; just add the following flag to Config.groovy:
griffon.services.eager.instantiation = true
More info on services can be found at http://griffon.codehaus.org/guide/latest/guide/controllersAndServices.html#services