Does Guice Persist provide a transaction-scoped or application-managed EntityManager? - jpa

We use Guice Persist to inject EntityManager in our project.
E.g.
public class MyDao {

    @Inject
    EntityManager em;

    public void someMethod() {
        // uses the em instance
    }
}
But it is unclear to us how the injected instance of EntityManager is supposed to be used.
What type of EntityManager is this? (see e.g.: types of entity managers) Under the hood Guice Persist instantiates it via EntityManagerFactory.createEntityManager(), so I'd say it's an application-managed entity manager. But the official wiki describes a session-per-transaction strategy, which suggests that the EntityManager is (pseudo) transaction-scoped.
Should we invoke close() on it manually? Or will Guice take care of it?
What is the scope of the first-level cache? Only a single transaction (like in transaction-scoped entity managers), or as long as I use the same injected instance of EntityManager (like in application-managed entity managers)?

Even though the question is perfectly answered by Piotr, I'd like to add some practical advice on how to use guice-persist.
I've been having issues with it which were pretty hard to debug. In my application certain threads would display outdated data, and sometimes EntityManager instances were left with old, dead database connections. The root cause was the way I used the @Transactional annotation (I only used it for methods that do updates/inserts/deletes, not for read-only methods). guice-persist stores EntityManager instances in a ThreadLocal as soon as you call get() on an injected Provider<EntityManager> instance (which I did for read-only methods). However, this ThreadLocal is only removed if you also call UnitOfWork.end() (which is normally done by the interceptor if @Transactional is on the method). Not doing so leaves the EM instance in the ThreadLocal, so that eventually every thread in your thread pool ends up with an old EM instance holding stale cached entities.
So, as long as you stick to the following simple rules, the usage of guice-persist is straightforward:
Always inject Provider<EntityManager> instead of EntityManager directly.
For transaction-scoped semantics: always annotate each method with @Transactional (even the read-only methods). This way the JpaLocalTxnInterceptor will intercept the calls to your annotated methods, making sure not only to start and commit transactions but also to set and unset EM instances in the ThreadLocal (see the sketch after these rules).
For request-scoped semantics: use the PersistFilter servlet filter that ships with guice-persist. It will call begin() and end() on the UnitOfWork for you before and after the request is done, thereby populating and cleaning up the ThreadLocal.
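To make the first two rules concrete, here is a minimal sketch of a DAO that follows them. The User entity and the find/save methods are made up for illustration; the annotations are the ones guice-persist ships with.

import com.google.inject.Inject;
import com.google.inject.Provider;
import com.google.inject.persist.Transactional;
import javax.persistence.EntityManager;

public class UserDao {

    @Inject
    private Provider<EntityManager> em; // rule 1: never inject EntityManager directly

    @Transactional // rule 2: annotate even read-only methods
    public User find(long id) {
        return em.get().find(User.class, id);
    }

    @Transactional
    public void save(User user) {
        em.get().persist(user);
    }
}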
Hope this helps!

I did some research in the source code of guice-persist and read through the comments under the guice-persist wiki pages, and these are the answers that I needed:
1. Lifecycle management of the EntityManager is kind of broken if it's injected via @Inject EntityManager. As stated in one of the comments on the wiki:
I confirm that injecting an EntityManager directly instead of a provider can be dangerous. If you're not inside a UnitOfWork or a method annotated with @Transactional, the first injection of an EntityManager in a thread will create a new EntityManager, never destroy it, and always use this specific EntityManager for this thread (EMs are stored thread-local). This can lead to terrible issues, like injection of a dead EntityManager (connection closed, etc.). So my recommendation is to always inject a Provider, or at least to inject an EntityManager directly only inside an opened UnitOfWork.
So the example in my question isn't the most correct usage. It creates a singleton instance of EntityManager (per thread) and will inject this instance everywhere :-(.
However, if I inject a Provider and use it inside a @Transactional method, then the instance of EntityManager is per-transaction. So the answer to this question is: if injected and used correctly, the entity manager is transaction-scoped.
2. If injected and used correctly, I don't need to manually close the entity manager (guice-persist will handle that for me). If used incorrectly, closing it manually would be a very bad idea (the closed instance of EntityManager would be injected in every place where I @Inject EntityManager).
3. If injected and used correctly, the scope of the L1 cache is a single transaction. If used incorrectly, the scope of the L1 cache is the lifetime of the application (the EntityManager is a singleton).

1. It depends on your module configuration. There are some basic bindings:
JpaPersistanceService

public class JpaPersistanceService implements Provider<EntityManager> {

    private EntityManagerFactory factory;

    public JpaPersistanceService(EntityManagerFactory factory) {
        this.factory = factory;
    }

    @Override
    public EntityManager get() {
        return factory.createEntityManager();
    }
}
Module binding

EntityManagerFactory factory = Persistence.createEntityManagerFactory(getEnvironment(stage));

bind(EntityManager.class).annotatedWith(Names.named("request"))
    .toProvider(new JpaPersistanceService(factory)).in(RequestScoped.class);
bind(EntityManager.class).annotatedWith(Names.named("session"))
    .toProvider(new JpaPersistanceService(factory)).in(SessionScoped.class);
bind(EntityManager.class).annotatedWith(Names.named("app"))
    .toProvider(new JpaPersistanceService(factory)).asEagerSingleton();
Usage

@Inject @Named("request")
private EntityManager em; // injects a new EntityManager every request

@Inject @Named("session")
private Provider<EntityManager> emProvider; // injects a new EntityManager each session
// This is a little bit tricky, because the EntityManager is stored in the session.
// In Stage.PRODUCTION all injections are created eagerly, and there is no session at
// injection time. The session binding should therefore be done lazily: inject a
// provider and call emProvider.get() when the EM is needed.

@Inject @Named("app")
private EntityManager em; // injects the singleton
2. Yes, you should, unless you use JpaPersistModule [javadoc].
3. Well, this is about the JPA configuration in persistence.xml and the EntityManager scope.
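Regarding the JpaPersistModule mentioned in point 2: as a point of comparison with the hand-rolled provider above, here is a minimal sketch of the stock guice-persist setup. The persistence unit name "myUnit" is an assumption and must match a persistence-unit in persistence.xml.

import com.google.inject.Guice;
import com.google.inject.Injector;
import com.google.inject.persist.PersistService;
import com.google.inject.persist.jpa.JpaPersistModule;

public class Bootstrap {
    public static void main(String[] args) {
        // JpaPersistModule binds EntityManager, EntityManagerFactory and UnitOfWork
        Injector injector = Guice.createInjector(new JpaPersistModule("myUnit"));
        // The persistence service must be started once before any EntityManager is used
        injector.getInstance(PersistService.class).start();
    }
}

In a servlet environment you would register the PersistFilter instead, which starts the service and opens/closes a unit of work per request.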

I'm injecting a provider, but I suspect something is wrong.
When I try to redeploy an application, I always have to restart the server because the JPA classes are cached.
I run into the following pseudo-bug:
https://bugs.eclipse.org/bugs/show_bug.cgi?id=326552
Theoretically, by injecting a Provider and getting an instance of EntityManager, you should not have to close anything.

Related

Injecting a Stateless EJB into another Stateless and using PersistenceContext

With Java EE 5 / EJB 3.0, the life of Java developers became much easier. Later, influenced by Spring and CDI, similar approaches were also adopted in Java EE.
Now, I hope I am doing it right, but just to be sure:
I have several Stateless EJBs, which all query and / or modify the database. An example is
@Stateless
public class AddressDBService {

    @PersistenceContext
    protected EntityManager em;
Some of the stateless EJBs refer to other services like this:
@Stateless
public class AVeDBService {

    @PersistenceContext
    protected EntityManager em;

    @Inject
    private HomeToDealDBService homeToDealDBService;

    @Inject
    private AddressDBService addressDBservice;
and in the Stateless EJBs I have public methods like the ones below:
@TransactionAttribute(TransactionAttributeType.REQUIRES_NEW)
public void saveEntity(Home home) throws EntityExistsException {
    this.em.persist(home);
    addressDBservice.saveAddress(home.getMainAddress(), home);
}
I am almost certain this usage is correct and thread-safe (the above services are in turn injected into JSF managed beans).
Could somebody confirm that my usage is correct, thread-safe and conforms good practices?
My usage seems to conform to the following questions:
Is EntityManager really thread-safe?
Stateless EJB with more injected EJBs instances
The "is correct?" question can't be answered without know the goal of the project.
It could works? Yes, you have posted java-ee code that could deploy, but is not enough.
I usually use BCE (Boundary Control Entity) pattern and Domain Driven pattern.
In this pattern we use EJB for business logic services or endpoint (JAX-RS) and all other injections, that are the Control part, are CDI objects.
Entities (JPA) can use cascading to avoid manually saving related entities:
addressDBservice.saveAddress(home.getMainAddress(), home);
can be avoided if you define the entity like this:
@Entity
public class Home {

    @ManyToOne(cascade = CascadeType.ALL)
    private Address mainAddress;
}
The @TransactionAttribute(TransactionAttributeType.REQUIRES_NEW) annotation selects a specific transactional behavior; it is not required, so it is correct only if that is what you actually want.

Common lib for EntityManager CDI

I have a common generic DAO in a common lib. I want each module that uses this DAO to initialize it with its own persistence unit:
public abstract class GenericDao implements IGenericDao {

    @PersistenceContext(unitName = "XXXX")
    private EntityManager entityManager;
and in another module:
public class CarDao extends GenericDao {
I have a lot of projects using this generic DAO, but each project has its own persistence unit.
The persistence unit differs depending on the project in which the common library is used.
The point is that I can't use plain OOP with an abstract getEntityManager() injected in each microservice, because in the common project we have a history DAO shared by all microservices, and for each one I have to retrieve the EntityManager injected from that microservice.
Am I doing this wrong or right? And how do I set the persistence unit in each project? (Each project has a lot of DAOs and I don't want to repeat the CRUD methods each time.)
@PersistenceContext(unitName = "XXXX")
private EntityManager entityManager;
This should be done in each concrete class; the abstract one should implement the common operations using
getEntityManager().doSomething(entity)
with the getter getEntityManager() being abstract.
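A minimal sketch of that layout, assuming IGenericDao declares a save method and using a made-up unit name:

import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;

// Common lib: no @PersistenceContext here, only an abstract accessor.
public abstract class GenericDao implements IGenericDao {

    protected abstract EntityManager getEntityManager();

    @Override
    public void save(Object entity) {
        getEntityManager().persist(entity);
    }
}

// Each module: the concrete DAO declares its own persistence unit (separate file).
public class CarDao extends GenericDao {

    @PersistenceContext(unitName = "car-unit") // unit name is project-specific
    private EntityManager entityManager;

    @Override
    protected EntityManager getEntityManager() {
        return entityManager;
    }
}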
Imho this is a design smell: EntityManager is already an abstraction, and you have nothing to gain by encapsulating it.
[edit]
Regarding the "factory" approach, the way to dynamically inject resources in CDI is to use producer methods.
You can thus create a method returning an EntityManager instance that dynamically resolves the EntityManagerFactory according to the persistence unit name (see an example here).
Note that this is generally a bad idea, as the entityManager scope is usually bound to the transaction scope; letting the container inject the entityManager instance guarantees that the scope will be handled correctly (by the container). The only viable configuration with this approach is when you want an "application-managed" entityManager.
NB: the given example will instantiate a new EntityManagerFactory instance for each injection, which can be really catastrophic depending on how you use it (the EntityManagerFactory should be created once for the whole application).
Be sure to understand the EntityManager lifecycle before going further.
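If you do go down the producer route for an application-managed EntityManager, a minimal sketch could look like the following; the unit name is an assumption, and note the single application-wide factory:

import javax.enterprise.context.ApplicationScoped;
import javax.enterprise.inject.Disposes;
import javax.enterprise.inject.Produces;
import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;

@ApplicationScoped
public class EntityManagerProducer {

    // Created once for the whole application, not once per injection.
    private final EntityManagerFactory emf =
            Persistence.createEntityManagerFactory("my-unit");

    @Produces
    public EntityManager createEntityManager() {
        return emf.createEntityManager(); // application-managed EntityManager
    }

    public void close(@Disposes EntityManager em) {
        em.close(); // the disposer closes what the producer created
    }
}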
Thank you guys for your advice. In fact I was overcomplicating things: in my GenericDao I simply put
public abstract class GenericDao implements IGenericDao {

    @PersistenceContext
    private EntityManager entityManager;
As we have only one persistence unit, it is injected automatically...
It was that easy!
Then I can use @PersistenceContext in all DAOs, or simply (and best) call getEntityManager() from their parent IGenericDao.

Can I use SpringData by itself [duplicate]

I'm trying to wire up Spring Data JPA objects manually so that I can generate DAO proxies (aka Repositories) - without using a Spring bean container.
Inevitably, I will be asked why I want to do this: it is because our project is already using Google Guice (and on the UI using Gin with GWT), and we don't want to maintain another IoC container configuration, or pull in all the resulting dependencies. I know we might be able to use Guice's SpringIntegration, but this would be a last resort.
It seems that everything is available to wire the objects up manually, but since it's not well documented, I'm having a difficult time.
According to the Spring Data user's guide, using the repository factories standalone is possible. Unfortunately, the example shows RepositoryFactorySupport, which is an abstract class. After some searching I managed to find JpaRepositoryFactory.
JpaRepositoryFactory actually works fairly well, except it does not automatically create transactions. Transactions must be managed manually, or nothing will get persisted to the database:
entityManager.getTransaction().begin();
repositoryInstance.save(someJpaObject);
entityManager.getTransaction().commit();
The problem turned out to be that @Transactional annotations are not applied automatically, and need the help of a TransactionInterceptor.
Thankfully, the JpaRepositoryFactory can take a callback to add more AOP advice to the generated Repository proxy before returning:
final JpaTransactionManager xactManager = new JpaTransactionManager(emf);
final JpaRepositoryFactory factory = new JpaRepositoryFactory(emf.createEntityManager());

factory.addRepositoryProxyPostProcessor(new RepositoryProxyPostProcessor() {
    @Override
    public void postProcess(ProxyFactory factory) {
        factory.addAdvice(new TransactionInterceptor(xactManager, new AnnotationTransactionAttributeSource()));
    }
});
This is where things are not working out so well. Stepping through the debugger in the code, the TransactionInterceptor is indeed creating a transaction - but on the wrong EntityManager. Spring manages the active EntityManager by looking at the currently executing thread. The TransactionInterceptor does this and sees there is no active EntityManager bound to the thread, and decides to create a new one.
However, this new EntityManager is not the same instance that was created and passed into the JpaRepositoryFactory constructor, which requires an EntityManager. The question is, how do I make the TransactionInterceptor and the JpaRepositoryFactory use the same EntityManager?
Update:
While writing this up, I found out how to solve the problem, but it still may not be the ideal solution. I will post this solution as a separate answer. I would be happy to hear any suggestions on a better way to use Spring Data JPA standalone than how I've solved it.
The general principle behind the design of JpaRepositoryFactory and the corresponding Spring integration JpaRepositoryFactoryBean is the following:
We assume you run your application inside a managed JPA runtime environment, without caring which one.
That's the reason we rely on an injected EntityManager rather than an EntityManagerFactory. By definition the EntityManager is not thread-safe, so if we dealt with an EntityManagerFactory directly, we would have to rewrite all the resource-managing code that a managed runtime environment (such as Spring or EJB) would otherwise provide.
To integrate with Spring transaction management we use Spring's SharedEntityManagerCreator, which does the transaction resource binding magic you've implemented manually. So you probably want to use that to create EntityManager instances from your EntityManagerFactory. If you want to activate transactionality at the repository beans directly (so that a call to e.g. repo.save(…) creates a transaction if none is already active), have a look at the TransactionalRepositoryProxyPostProcessor implementation in Spring Data Commons. It activates transactions when Spring Data repositories are used directly (e.g. for repo.save(…)) and slightly customizes the transaction configuration lookup to prefer interfaces over implementation classes, allowing repository interfaces to override the transaction configuration defined in SimpleJpaRepository.
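Building on that hint, here is a sketch (not the official recipe) of wiring the factory with a shared, transaction-bound EntityManager proxy; FooRepository and the unit name come from the question, everything else is an assumption:

import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;
import org.springframework.data.jpa.repository.support.JpaRepositoryFactory;
import org.springframework.orm.jpa.SharedEntityManagerCreator;

public class StandaloneRepositories {

    public static void main(String[] args) {
        EntityManagerFactory emf = Persistence.createEntityManagerFactory("com.foo.model");

        // A thread-safe proxy that delegates to the EntityManager bound to the current transaction.
        EntityManager sharedEm = SharedEntityManagerCreator.createSharedEntityManager(emf);

        // The factory (and the repositories it creates) now sees the transaction-bound EM,
        // so the TransactionInterceptor from the question operates on the same instance.
        JpaRepositoryFactory factory = new JpaRepositoryFactory(sharedEm);
        FooRepository repository = factory.getRepository(FooRepository.class);
    }
}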
I solved this by manually binding the EntityManager and EntityManagerFactory to the executing thread, before creating repositories with the JpaRepositoryFactory. This is accomplished using the TransactionSynchronizationManager.bindResource method:
emf = Persistence.createEntityManagerFactory("com.foo.model", properties);
em = emf.createEntityManager();

// Create your transaction manager and RepositoryFactory
final JpaTransactionManager xactManager = new JpaTransactionManager(emf);
final JpaRepositoryFactory factory = new JpaRepositoryFactory(em);

// Make sure calls to the repository instance are intercepted for annotated transactions
factory.addRepositoryProxyPostProcessor(new RepositoryProxyPostProcessor() {
    @Override
    public void postProcess(ProxyFactory factory) {
        factory.addAdvice(new TransactionInterceptor(xactManager, new MatchAlwaysTransactionAttributeSource()));
    }
});

// Create your repository proxy instance
FooRepository repository = factory.getRepository(FooRepository.class);

// Bind the same EntityManager used to create the repository to the thread
TransactionSynchronizationManager.bindResource(emf, new EntityManagerHolder(em));
try {
    repository.save(someInstance); // Done in a transaction using one EntityManager
} finally {
    // Make sure to unbind when done with the repository instance
    TransactionSynchronizationManager.unbindResource(emf);
}
There must be a better way, though. It seems strange that the RepositoryFactory was designed to use an EntityManager instead of an EntityManagerFactory. I would expect it to first check whether an EntityManager is bound to the thread, and then either create a new one and bind it, or use the existing one.
Basically, I would want to inject the repository proxies and expect that on every call they internally create a new EntityManager, so that calls are thread-safe.

How to make @EJB injection work on the server?

Looking at this answer, it says:
If you don't want to use an Application Client Container and instead just run the application client class through a java command, injection won't be possible and you'll have to perform a JNDI lookup.
However, given that I am trying to inject a DAO bean like the example shown here, if I cannot do the automatic injection, it means my application must manually do the JNDI lookup and all the transaction begin/end that I would get for free if @EJB actually worked.
However, since everything is all within the same Eclipse EJB Project (it also failed with the same null handle when I had my client code in a Dynamic Web Project), surely there must be an easy way to get it all working? Can anyone suggest what I am doing wrong?
Finally, this article suggests that DAOs are not needed, but if I replace, within my EJB:
@EJB MyDao dao;
with the more direct:
@PersistenceContext private EntityManager em;
I still get a similar null value; is this the same injection failure problem?
NB: I have just noticed this answer:
This is a bug in Glassfish (apparently in the web services stack).
I am running v4.0 Build 89; does it still have this bug? Does this mean I have to do all JPA actions the long-winded way?
I eventually found out that the problem is that, in order to use injection of the @PersistenceContext, the class MUST be a bean itself. This is hinted at in the example on Wikipedia:
@Stateless
public class CustomerService {

    @PersistenceContext
    private EntityManager entityManager;

    public void addCustomer(Customer customer) {
        entityManager.persist(customer);
    }
}
I could delete this question, but perhaps leaving this answer might provide a hint to someone, or at least show them a minimal working example of EJB and JPA.

Creating new entities while enabling injection

I have a method on a stateless session bean which creates a new instance of an entity and persists it. You might normally use new MyEntity() to create the object but I would like injection to populate some of the properties of the entity for me.
I got partial success using
@Inject
@New
private MyEntity myNewEntity;
in the session bean and then using that instance in my method.
The problem I have now is that the second time the method is called, myNewEntity isn't a new object; it's the same object as the one created the first time. As a result I'm getting
com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Duplicate entry '9' for key 'PRIMARY'
Or at least that's why I think I'm getting this exception. Certainly if I use new MyEntity() I don't get the exception, but then my injection doesn't happen.
Am I on the wrong track? How can I create a new local entity object while enabling injection?
Any help would be great!
First of all, I have serious doubts that it's a good idea to use CDI to control the lifecycle of an entity. See this quote from the documentation (here):
According to this definition, JPA entities are technically managed beans. However, entities have their own special lifecycle, state and identity model and are usually instantiated by JPA or using new. Therefore we don't recommend directly injecting an entity class. We especially recommend against assigning a scope other than @Dependent to an entity class, since JPA is not able to persist injected CDI proxies.
What you should do to create new instances of entities is to add a layer of indirection, either with @Produces or @Unwraps (Seam Solder, if you need it to be truly stateless), thereby making sure that your code explicitly calls new. A sketch of the @Produces variant follows below.
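A minimal sketch of that indirection with @Produces, reusing MyEntity from the question; the producer class, service class and method names are illustrative:

import javax.ejb.Stateless;
import javax.enterprise.inject.Instance;
import javax.enterprise.inject.Produces;
import javax.inject.Inject;

public class MyEntityFactory {

    // The producer calls new explicitly, so JPA persists a plain instance,
    // never a CDI proxy (the default @Dependent scope is kept).
    @Produces
    public MyEntity newMyEntity() {
        return new MyEntity();
    }
}

// In the session bean, ask for a fresh instance on every call (separate file):
@Stateless
public class MyService {

    @Inject
    private Instance<MyEntity> entities;

    public void createAndPersist() {
        MyEntity entity = entities.get(); // a new instance each time get() is called
        // populate and persist the entity here
    }
}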
I think I have a working solution now which seems okay, though I'm not quite sure why it works, so I welcome your feedback on a better solution. I am now injecting a DAO-style bean into my stateless session bean:
@Stateless
public class PhoneService {

    @Inject
    protected ProblemReports problemReports;
and injecting my entity into the ProblemReports bean:
public class ProblemReports {

    @Inject
    @New
    private ProblemReport newProblemReport;
I assume that ProblemReports defaults to @Dependent scope, which as I understand it should be the same as the stateless session bean that contains it. I could understand this if the scope of ProblemReports were shorter, causing a new instance of ProblemReport to be created when a new ProblemReports is created; but it isn't.
Is this just an example of EJB and CDI not playing well together?