How to access Spring Data configured entity manager (factory) in custom implementation - spring-data

I'm writing a custom implementation for a Spring Data JPA repository. So I have:
MyEntityRepositoryCustom => interface with the custom methods
MyEntityRepositoryImpl => implementation of the above interface
MyEntityRepository => standard interface which extends JpaRepository and MyEntityRepositoryCustom
My problem is this: within the implementation of MyEntityRepositoryImpl I need to access the entity manager that was injected into Spring Data. How do I get it?
I can use @PersistenceContext to get it autowired, but the problem is that this repository must work in an application that sets up more than one persistence unit. So, to tell Spring which one I need, I would have to use @PersistenceContext(unitName="myUnit"). However, since my repositories are defined in a reusable service layer, I can't know at that point what name the higher-level application layer will give to the persistence unit it configures and injects into my repositories.
In other words, what I would need is to access the entity manager that Spring Data itself is using, but after a (not so quick) look at the Spring Data JPA documentation I couldn't find anything for this.
Honestly, the fact that the Impl classes are totally unaware of Spring Data, although described as a strength in the Spring Data manual, is actually a complication whenever your custom implementation needs access to something that Spring Data itself usually provides (which is almost always, I would say...).

Since Spring Data JPA 1.9.2 you have access to the EntityManager through JpaContext, see: http://docs.spring.io/spring-data/jpa/docs/1.9.2.RELEASE/reference/html/#jpa.misc.jpa-context.
Example:
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.jpa.repository.JpaContext;
import org.springframework.stereotype.Component;

@Component
public class RepositoryUtil {

    @Autowired
    private JpaContext jpaContext;

    public <T> void detach(T entity) {
        jpaContext.getEntityManagerByManagedType(entity.getClass()).detach(entity);
    }
}
P.S.
This approach will not work if you have more than one EntityManager candidate for a given class; see the implementation of JpaContext#getEntityManagerByManagedType -> DefaultJpaContext#getEntityManagerByManagedType.
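Applied to the original question, a minimal sketch of the custom implementation could look like this (MyEntity is assumed to be the entity type behind MyEntityRepository, and a Spring Data JPA version that provides JpaContext is assumed):
import javax.persistence.EntityManager;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.jpa.repository.JpaContext;

public class MyEntityRepositoryImpl implements MyEntityRepositoryCustom {

    private final EntityManager em;

    @Autowired
    public MyEntityRepositoryImpl(JpaContext jpaContext) {
        // resolves the EntityManager responsible for MyEntity,
        // even when several persistence units are configured
        this.em = jpaContext.getEntityManagerByManagedType(MyEntity.class);
    }

    // custom methods can use em here
}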

The best I could find is to set up a "convention": my repositories declare that they expect a persistence unit named myConventionalPU to be made available. The application layer then assigns that alias to the entity manager factory it sets up and injects into Spring Data, so my custom implementations can receive the correct EMF by autowiring against that alias. Here's an excerpt of my application context:
<bean id="myEntityManagerFactory" name="myConventionalPU"
class="org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean">
[...]
</bean>
<jpa:repositories base-package="com.example"
entity-manager-factory-ref="myEntityManagerFactory"
transaction-manager-ref="transactionManager" />
And within my custom implementation:
@PersistenceContext(unitName = "myConventionalPU")
private EntityManager em;
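For comparison, a hedged Java-config equivalent of the XML excerpt above (the attributes are the ones offered by @EnableJpaRepositories; the class name RepositoryConfig is made up):
import org.springframework.context.annotation.Configuration;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;

@Configuration
@EnableJpaRepositories(basePackages = "com.example",
        entityManagerFactoryRef = "myEntityManagerFactory",
        transactionManagerRef = "transactionManager")
public class RepositoryConfig {
    // the EntityManagerFactory bean still needs to carry the alias "myConventionalPU"
    // so that @PersistenceContext(unitName = "myConventionalPU") can resolve it
}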
I opened DATAJPA-669 with this requirement.

When running on Spring Boot, auto-configuration classes generate the entityManagerFactory, dataSource and transactionManager for you.
If you want to get access to the entityManager and control its instantiation and settings, you need to define your own persistence configuration. Below is sample code using Java config:
import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.sql.DataSource;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.jpa.repository.config.EnableJpaRepositories;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseBuilder;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseType;
import org.springframework.orm.jpa.JpaTransactionManager;
import org.springframework.orm.jpa.JpaVendorAdapter;
import org.springframework.orm.jpa.LocalContainerEntityManagerFactoryBean;
import org.springframework.transaction.PlatformTransactionManager;
import org.springframework.transaction.annotation.EnableTransactionManagement;

@Configuration
@EnableTransactionManagement
@EnableJpaRepositories(basePackages = { "com.test.repositories" })
public class PersistenceJpaConfig {

    // assumes a JpaVendorAdapter bean (e.g. HibernateJpaVendorAdapter) is defined elsewhere
    @Autowired
    JpaVendorAdapter jpaVendorAdapter;

    @Bean
    public DataSource dataSource() {
        return new EmbeddedDatabaseBuilder()
                .setName("testdb")
                .setType(EmbeddedDatabaseType.HSQL)
                .build();
    }

    @Bean
    public EntityManager entityManager() {
        return entityManagerFactory().createEntityManager();
    }

    @Bean
    public EntityManagerFactory entityManagerFactory() {
        LocalContainerEntityManagerFactoryBean lef = new LocalContainerEntityManagerFactoryBean();
        lef.setDataSource(dataSource());
        lef.setJpaVendorAdapter(jpaVendorAdapter);
        lef.setPackagesToScan("com.test.domain");
        lef.afterPropertiesSet();
        return lef.getObject();
    }

    @Bean
    public PlatformTransactionManager transactionManager() {
        return new JpaTransactionManager(entityManagerFactory());
    }
}
If you have multiple data sources, follow this article.

Related

Spring Data MongoDB @Transactional isn't working?

I have the below Maven dependency & configuration set up:
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-mongodb</artifactId>
</dependency>
@Configuration
@EnableMongoAuditing
public class MongoConfig {

    @Bean
    MongoTransactionManager transactionManager(MongoDbFactory mongoDbFactory) {
        return new MongoTransactionManager(mongoDbFactory);
    }
}
Updated: I've taken the suggested solution of creating a bean with @Transactional and having it injected into my test class. Below is the service bean I created:
@Service
@Transactional
@RequiredArgsConstructor
public class MongoTransactionService {

    private final UserRepo userRepo;

    public void boundToFail() throws RuntimeException {
        userRepo.save(User.builder().id("1").build());
        throw new RuntimeException();
    }
}
and the test class, into which I inject a MongoTransactionService bean:
@DataMongoTest(excludeAutoConfiguration = EmbeddedMongoAutoConfiguration.class,
        includeFilters = @ComponentScan.Filter(type = FilterType.ASSIGNABLE_TYPE, classes = MongoTransactionService.class))
@ExtendWith(SpringExtension.class)
class MongoTransactionServiceTest {

    @Autowired
    UserRepo userRepo;

    @Autowired
    MongoTransactionService mongoTransactionService;

    @Test
    void testTransactional() {
        try {
            mongoTransactionService.boundToFail();
        } catch (Exception e) {
            // do something
        }
        val user = userRepo.findById("1").orElse(null);
        assertThat(user).isNull();
    }
}
I am expecting that a call to boundToFail(), which throws a RuntimeException, would roll back the saved user, but the user still gets persisted in the database after the call.
It turns out that @DataMongoTest doesn't activate the auto-configuration for MongoDB transactions. I've filed a ticket with Spring Boot to fix that. In the meantime, you can get this to work by adding
@ImportAutoConfiguration(TransactionAutoConfiguration.class)
to your test class.
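For illustration, the test class from the question with that workaround applied could look like this (the annotation placement is an assumption):
import org.springframework.boot.autoconfigure.ImportAutoConfiguration;
import org.springframework.boot.autoconfigure.transaction.TransactionAutoConfiguration;
// other imports as in the original test class

@DataMongoTest(excludeAutoConfiguration = EmbeddedMongoAutoConfiguration.class,
        includeFilters = @ComponentScan.Filter(type = FilterType.ASSIGNABLE_TYPE, classes = MongoTransactionService.class))
@ImportAutoConfiguration(TransactionAutoConfiguration.class) // restores @Transactional handling in the slice test
@ExtendWith(SpringExtension.class)
class MongoTransactionServiceTest {
    // test body unchanged
}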
Note that using MongoDB transactions requires a replica set database setup. If that's not the case, the creation of a transaction will fail, your test case will catch that exception, and the test will still succeed. The data will not be inserted, but that's not because the RuntimeException was thrown; it's because the transaction was never started in the first place.
The question previously presented a slightly different code arrangement that suffered from other problems. For reference, here's the previous answer:
@Transactional needs to live on public methods of a separate Spring bean, as the transactional logic is implemented by wrapping the target object with a proxy that contains an interceptor interacting with the transaction infrastructure.
Your example suffers from two problems:
The test itself is not a Spring bean, i.e. there's no transactional behavior added to boundToFail(…). @Transactional can be used on JUnit test methods, but that controls the transactional behavior of the test, most prominently rolling back the transaction to make sure changes to the data store made in the test do not affect other tests. See this section of the reference documentation.
Even if there was transactional logic applied to boundToFail(…), a local call to the method would never trigger it, as it doesn't pass through the proxy that applies it. See more on that in the reference documentation.
The solution to your problem is to create a separate Spring bean that carries the #Transactional annotation, get that injected into your test case and call the method from the test.

Refactoring application.properties to use data source programmatically

I have a sample Spring Boot application. I am using a PostgreSQL DB. I am new to the world of Spring, Spring Boot, etc.
The sample application performs CRUD operations. I am using Java 8 and IntelliJ IDEA.
My application.properties contains the following:-
spring.profiles.active=local
While I have the following stuff in my application-local.properties :-
spring.datasource.url=jdbc:postgresql://localhost:5432/postgres
spring.datasource.username=postgres
spring.datasource.password=postgres123
spring.datasource.initial-size=5
spring.datasource.max-active=10
spring.datasource.max-idle=5
spring.datasource.min-idle=2
As is evident, my properties also contain some connection pooling settings.
The structure of my project happens to be as follows:-
As might be evident from the project structure, I have a model folder which of course contains the class for the employee table.
I have got a DAO that has a class consisting of methods to work with employee-related data.
I have a service layer that consists of an EmployeeService class. It has
@Autowired
EmployeeDao employeeDao;
Hence the DAO methods are used in the service.
I need to move whatever I have in my application.properties into the code.
I think the best way to do this would be to configure the datasource programmatically:
My Application.java class looks something like this:-
@SpringBootApplication
@PropertySource("application-${spring.profiles.active}.properties")
public class Application implements CommandLineRunner {

    @Autowired
    DataSource dataSource;

    private static final Logger logger = LoggerFactory.getLogger(Application.class);

    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
        logger.info("-------Message at INFO level from Application.main()-------");
    }

    @Override
    public void run(String... args) throws Exception {
        logger.info("-------Message at INFO level from Application.run()-------");
        logger.info(String.valueOf(dataSource));
    }
}
In order to configure a bean, should I introduce something like the below in the Application.java class?
@Bean
public DataSource dataSource() {
    return DataSourceBuilder
            .create()
            .username("")
            .password("")
            .url("")
            .driverClassName("")
            .build();
}
In the above code snippet I would pass whatever is available in application.properties file.
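One possible shape for this bean, as a hedged sketch that also covers the connection-pooling part (it assumes the Tomcat JDBC pool, which is what the initial-size/max-active/max-idle/min-idle properties above belong to, is on the classpath):
import javax.sql.DataSource;

import org.springframework.context.annotation.Bean;

@Bean
public DataSource dataSource() {
    // hard-coded values mirror application-local.properties; they could equally be read from the Environment
    org.apache.tomcat.jdbc.pool.DataSource ds = new org.apache.tomcat.jdbc.pool.DataSource();
    ds.setDriverClassName("org.postgresql.Driver");
    ds.setUrl("jdbc:postgresql://localhost:5432/postgres");
    ds.setUsername("postgres");
    ds.setPassword("postgres123");
    ds.setInitialSize(5);  // spring.datasource.initial-size
    ds.setMaxActive(10);   // spring.datasource.max-active
    ds.setMaxIdle(5);      // spring.datasource.max-idle
    ds.setMinIdle(2);      // spring.datasource.min-idle
    return ds;
}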
I am following the below link:-
Configure DataSource programmatically in Spring Boot
Do I need to introduce additional classes in my application, possibly in the DAO, to use the above, or is just introducing the above sufficient?
What should I do for connection pooling stuff?
Thanks in advance !

spring data solr showcase

I am trying to understand the spring data solr showcase project.
https://github.com/christophstrobl/spring-data-solr-showcase
After spending quite a bit of time, I could not find how the productRepository is implemented and injected in https://github.com/christophstrobl/spring-data-solr-showcase/blob/master/src/main/java/org/springframework/data/solr/showcase/product/ProductServiceImpl.java
@Service
class ProductServiceImpl implements ProductService {

    private static final Pattern IGNORED_CHARS_PATTERN = Pattern.compile("\\p{Punct}");

    private ProductRepository productRepository;

    @Autowired
    public void setProductRepository(ProductRepository productRepository) {
        this.productRepository = productRepository;
    }
ProductRepository is defined as an interface (https://github.com/christophstrobl/spring-data-solr-showcase/blob/master/src/main/java/org/springframework/data/solr/showcase/product/ProductRepository.java) and I did not find any code implementing this interface:
interface ProductRepository extends SolrCrudRepository<Product, String> {

    @Highlight(prefix = "<b>", postfix = "</b>")
    @Query(fields = { SearchableProductDefinition.ID_FIELD_NAME,
            SearchableProductDefinition.NAME_FIELD_NAME,
            SearchableProductDefinition.PRICE_FIELD_NAME,
            SearchableProductDefinition.FEATURES_FIELD_NAME,
            SearchableProductDefinition.AVAILABLE_FIELD_NAME },
            defaultOperator = Operator.AND)
    HighlightPage<Product> findByNameIn(Collection<String> names, Pageable page);

    @Facet(fields = { SearchableProductDefinition.NAME_FIELD_NAME })
    FacetPage<Product> findByNameStartsWith(Collection<String> nameFragments, Pageable pagebale);
}
Below is how the spring context is configured:
https://github.com/christophstrobl/spring-data-solr-showcase/blob/master/src/main/java/org/springframework/data/solr/showcase/Application.java
If anyone could point me in the direction where this interface is implemented and injected, that would be great.
The showcase makes use of the Spring Data repository abstraction, using query derivation from the method name. The infrastructure provided by Spring Data and the Solr module takes care of creating the required implementations for you. Please have a look at the Reference Documentation for a more detailed explanation.
The showcase itself is built in a way that allows you to step through several stages of development by looking at the diffs when transitioning from one step to the other. Having a look at Step 2 shows how to make use of a Custom Repository Implementation, while Step 4 demonstrates how to enable Highlighting using @Highlight.
The goal of Spring Data is to reduce the amount of boilerplate code (that is, to reduce repetition of code).
For basic methods like save and find, the implementation is provided by Spring, and Spring creates beans (objects) for these interfaces. To tell Spring that these are your repositories inside a given package, you write @EnableJpaRepositories(basePackages="com.spring.repositories") or <jpa:repositories base-package="com.acme.repositories"/> for JPA repositories.
For Solr repositories you have to write @EnableSolrRepositories(basePackages="com.spring.repositories") or <solr:repositories base-package="com.acme.repositories" />. Spring will create objects for these interfaces, and you can inject these interface objects using the @Autowired annotation.
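Before the injection example below, here is a hedged sketch of what such an enabling Java configuration could look like (the package name, Solr URL and client setup are assumptions matching the Spring Data Solr 2.x / SolrJ 5.x line referenced at the end of this answer, not code from the showcase):
import org.apache.solr.client.solrj.SolrClient;
import org.apache.solr.client.solrj.impl.HttpSolrClient;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.data.solr.core.SolrTemplate;
import org.springframework.data.solr.repository.config.EnableSolrRepositories;

@Configuration
@EnableSolrRepositories(basePackages = "com.acme.repositories")
public class SolrConfig {

    @Bean
    public SolrClient solrClient() {
        // assumes a Solr server reachable at this URL
        return new HttpSolrClient("http://localhost:8983/solr");
    }

    @Bean
    public SolrTemplate solrTemplate(SolrClient solrClient) {
        return new SolrTemplate(solrClient);
    }
}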
Example:
@Service
public class SomeService {

    @Autowired
    private SampleRepository sampleRepository;

    /* The @PostConstruct annotation is used to execute a method just after the bean
       has been created and all its dependencies have been injected by Spring. */
    @PostConstruct
    public void postConstruct() {
        System.out.println("SampleRepository implementation class name: " + sampleRepository.getClass());
    }
}
The above example is there to show the implementation class of the SampleRepository interface (this class is not user-defined; it is a class generated by Spring).
For the reference documentation, see http://docs.spring.io/spring-data/solr/docs/2.0.2.RELEASE/reference/html/. Reading this documentation will give you more knowledge of spring-data.

Custom Annotation vs using @Named in JEE6

Is there any difference between these two alternatives ... can they both be used interchangeably?
(A) Creating a custom annotation so @Inject can be used instead of @PersistenceContext within a DAO, as shown in the answer to - how-to-stack-custom-annotation-in-java-with-inject-annotation
(B) Using @Named("yourName") to qualify the Producer, such as the following code sample.
public class Resources {

    /**
     * The EntityManager's persistence context is defined here so the @Inject annotation
     * may be used in referencing classes.
     */
    @Produces
    @Named("MyEm")
    @PersistenceContext(unitName = "jboss.managed")
    private EntityManager em;
}
@Stateless
public class FiletracksentHome {
    ..
    @Inject
    @Named("MyEm")
    private EntityManager entityManager;
    ..
}
They are interchangeable, but you should use (A).
The @Named annotation is primarily used for being able to access the object via expression language (EL), e.g. in a JSF view.
The problem is that the resolution is done via String, so it is neither type-safe nor usually covered automatically by refactorings in the IDE.
The CDI specification states that it should not be used for qualifying injection points except to integrate legacy code.
Here's a nice article about this topic.
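For illustration, a minimal sketch of option (A), a custom CDI qualifier (the qualifier name MyEm is made up here; the linked answer shows the general pattern):
import static java.lang.annotation.ElementType.FIELD;
import static java.lang.annotation.ElementType.METHOD;
import static java.lang.annotation.ElementType.PARAMETER;
import static java.lang.annotation.ElementType.TYPE;
import static java.lang.annotation.RetentionPolicy.RUNTIME;

import java.lang.annotation.Retention;
import java.lang.annotation.Target;

import javax.inject.Qualifier;

// hypothetical qualifier annotation replacing the string-based @Named("MyEm")
@Qualifier
@Retention(RUNTIME)
@Target({ TYPE, METHOD, FIELD, PARAMETER })
public @interface MyEm {
}
The producer and injection point from (B) would then use @MyEm instead of @Named("MyEm"), which keeps the qualification type-safe:
@Produces
@MyEm
@PersistenceContext(unitName = "jboss.managed")
private EntityManager em;

@Inject
@MyEm
private EntityManager entityManager;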

Using CDI + WS/RS + JPA to build an app

@Path(value = "/user")
@Stateless
public class UserService {

    @Inject
    private UserManager manager;

    @Path(value = "/create")
    @GET
    @Produces(value = MediaType.TEXT_PLAIN)
    public String doCreate(@QueryParam(value = "name") String name) {
        manager.createUser(name);
        return "OK";
    }
}
Here is the UserManager implementation:
public class UserManager {

    @PersistenceContext(unitName = "shop")
    private EntityManager em;

    public void createUser(String name) {
        User user = new User();
        user.setName(name);
        // skip some more initializations
        em.persist(user);
    }
}
The problem is that if I do not mark UserService as @Stateless, then the manager field is null,
but if I mark it @Stateless, the manager field gets injected and the application works, as the data gets saved into the DB.
Just wondering, what is the reason behind this?
And is this the preferred way of wiring the application?
Well, I am thinking of pulling the EntityManager out into a producer, so that it can be shared.
The problem is that if I do not mark UserService as @Stateless, then the manager field is null.
For injection to occur, the class has to be a managed component such as an Enterprise Bean, Servlet, Filter, JSF managed bean, etc., or a CDI managed bean (this is the new part with Java EE 6: you can make any class a managed bean with CDI).
So, if you don't make your JAX-RS endpoint an EJB, how do you enable injection? This is nicely explained in JAX-RS and CDI integration using GlassFish v3:
There are two ways CDI managed beans are enabled:
instantiated by CDI, life-cycle managed by Jersey: annotate with @ManagedBean and optionally with a Jersey scope annotation.
instantiated and managed by CDI: annotate with a CDI scope annotation, like @RequestScoped (no @ManagedBean is required).
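As an illustration of the second option, a hedged sketch of the question's resource without the EJB annotation (note that dropping @Stateless also drops container-managed transactions, so persistence would need another transaction strategy):
import javax.enterprise.context.RequestScoped;
import javax.inject.Inject;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.QueryParam;
import javax.ws.rs.core.MediaType;

@Path("/user")
@RequestScoped // the CDI scope annotation makes this a CDI managed bean, so @Inject works without @Stateless
public class UserService {

    @Inject
    private UserManager manager;

    @Path("/create")
    @GET
    @Produces(MediaType.TEXT_PLAIN)
    public String doCreate(@QueryParam("name") String name) {
        manager.createUser(name);
        return "OK";
    }
}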
I also suggest checking the resources below.
And is this the preferred way of wiring the application?
I'd say yes. CDI is very nice and... don't you like injection?
Well, I am thinking of pulling the EntityManager out into a producer, so that it can be shared.
Shared between what? And why? In your case, you should use an EntityManager whose lifetime is scoped to a single transaction (a transaction-scoped persistence context). In other words, don't share it (and don't worry about opening and closing it for each request; this is not an expensive operation).
References
JPA 2.0 Specification
Section 7.6 "Container-managed Persistence Contexts"
Section 7.6.1 "Container-managed Transaction-scoped Persistence Context"
Section 7.6.2 "Container-managed Extended Persistence Context"
Resources
Dependency Injection in Java EE 6 - Part 1
Introducing the Java EE 6 Platform: Part 1
TOTD #124: Using CDI + JPA with JAX-RS and JAX-WS
The @Singleton annotation will help: http://www.mentby.com/paul-sandoz/jax-rs-on-glassfish-31-ejb-injection.html
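A hedged reading of that suggestion (presumably javax.ejb.Singleton): making the resource a singleton session bean keeps it an EJB, so EE injection still works without @Stateless.
import javax.ejb.Singleton;
import javax.ws.rs.Path;

@Path("/user")
@Singleton // a singleton session bean is still an EJB, so injection works as in the original resource
public class UserService {
    // same body as the original resource
}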