Spring Data Rest with Cache - spring-data-jpa

I was learning Spring Data REST but couldn't find how to use a cache with it.
How can I use a cache with Spring Data REST's CRUD/paging endpoints?
Or should I use JPA + cache and ignore Spring Data REST?
If I have misunderstood anything, please let me know.
Best regards

You can try the following approach:
1) Override your repository's findById and findAll methods and make them cacheable:
public interface MyEntityRepo extends JpaRepository<MyEntity, Long> {

    @Cacheable("myEntities")
    @Override
    Optional<MyEntity> findById(Long id);

    @Cacheable("pagedMyEntities")
    @Override
    Page<MyEntity> findAll(Pageable pageable);
}
2) Create a RepositoryEventHandler to evict your caches:
@Component // must be registered as a bean so Spring Data REST picks the handler up
@RepositoryEventHandler
public class MyEntityEventHandler {

    private final CacheManager cacheManager;

    public MyEntityEventHandler(CacheManager cacheManager) {
        this.cacheManager = cacheManager;
    }

    @HandleAfterCreate
    @HandleAfterSave
    @HandleAfterDelete
    public void handleCachesEviction(MyEntity entity) {
        Optional.ofNullable(cacheManager.getCache("myEntities"))
                .ifPresent(c -> c.evict(entity.getId()));
        Optional.ofNullable(cacheManager.getCache("pagedMyEntities"))
                .ifPresent(c -> c.clear());
    }
}
3) And of course create a cache manager bean, for example:
@EnableCaching
@SpringBootApplication
public class Application {

    @Bean
    public CacheManager cacheManager() {
        return new ConcurrentMapCacheManager();
    }
}
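If you prefer annotations over an event handler, a roughly equivalent sketch is to evict the caches directly on the repository's overridden save method with @CacheEvict. This variant is not part of the original answer; the #p0.id cache key and the overridden save signature are assumptions:

public interface MyEntityRepo extends JpaRepository<MyEntity, Long> {

    @Cacheable("myEntities")
    @Override
    Optional<MyEntity> findById(Long id);

    @Cacheable("pagedMyEntities")
    @Override
    Page<MyEntity> findAll(Pageable pageable);

    // evict the stale single-entity entry and clear the paged cache on every save
    @Caching(evict = {
            @CacheEvict(cacheNames = "myEntities", key = "#p0.id"),
            @CacheEvict(cacheNames = "pagedMyEntities", allEntries = true)
    })
    @Override
    <S extends MyEntity> S save(S entity);
}

The drawback compared to the event handler is that deletes (and any other write paths) need the same eviction annotations as well.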

Related

SpringBoot: Create document in mongodb on startup if not exists

I have a small service on Spring Boot with MongoDB as the database.
I need to be able to create a small collection with one document (very basic: id, name, status) on startup. An analog of SQL's CREATE TABLE IF NOT EXISTS, but for Mongo. How do I do that?
I tried to initialize values in the document attributes, but it didn't help.
Currently, the collection and the document appear only if I use the API to add them.
You may want to use something like ApplicationRunner or CommandLineRunner which can be defined as a bean.
Example:
@SpringBootApplication
public class MyApplication {

    public static void main(String[] args) {
        SpringApplication.run(MyApplication.class, args);
    }

    @Bean
    public CommandLineRunner initialize(MyRepository myRepository) {
        return args -> {
            // Insert elements into myRepository
        };
    }
}
Both CommandLineRunner and ApplicationRunner are functional interfaces, so we can use a lambda for them. Spring Boot will execute them at the startup of the application.
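To get the "create if not exists" behaviour from the question, the runner body can check whether the document is already there before inserting it. A minimal sketch, assuming a MyDocument class with id, name and status fields and MyRepository extending MongoRepository<MyDocument, String>:

@Configuration
public class MongoInit {

    @Bean
    public CommandLineRunner initialize(MyRepository myRepository) {
        return args -> {
            // analog of "create if not exists": insert only when the document is missing
            if (!myRepository.existsById("default")) {
                myRepository.save(new MyDocument("default", "initial", "NEW"));
            }
        };
    }
}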
You can leverage Spring's internal event mechanism.
When your application is ready, Spring publishes an ApplicationReadyEvent.
You can listen to this event and initialize your collection:
@Component
public class DataInit implements ApplicationListener<ApplicationReadyEvent> {

    private final MyRepository myRepository;

    public DataInit(MyRepository myRepository) {
        this.myRepository = myRepository;
    }

    @Override
    public void onApplicationEvent(ApplicationReadyEvent event) {
        // init data
    }
}

Spring AOP Pointcut for Spring Data Rest Controller (EndPoint)

I would like to do something on every API call to my Spring Boot app. I use Spring AOP to achieve this. Using:
@Pointcut("within(@org.springframework.stereotype.Controller *)")
public void controller() {
}

@Pointcut("within(@org.springframework.web.bind.annotation.RestController *)")
public void restController() {
}

@After("(controller() || restController())")
public void loggingAdvice(JoinPoint joinPoint) {
    // TODO: do something
}
With that I can intercept every call when the API is invoked. However, I am also using Spring Data REST for the CRUD mechanism, which automatically generates API endpoints, for example:
@RepositoryRestResource(collectionResourceRel = "users", path = "users")
public interface UserRepository extends PagingAndSortingRepository<User, Long> {

    User findByEmail(String email);
}
The question is: can I create a pointcut for every API endpoint that is generated by Spring Data REST?
The following pointcut will target all the RESTful endpoint calls made to "/users", assuming the package of UserRepository is rg.so.example.datarest:
@Pointcut("execution(* rg.so.example.datarest.UserRepository.*(..))")
public void dataRest() {
}
A more generic pointcut to target all the Repository implementations in a package rg.so.example.datarest would be
@Pointcut("execution(* rg.so.example.datarest..*(..))")
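Putting it together, a minimal aspect sketch that advises every repository-backed endpoint could look like this (the package name rg.so.example.datarest and the logging body are assumptions):

@Aspect
@Component
public class DataRestLoggingAspect {

    @Pointcut("execution(* rg.so.example.datarest..*(..))")
    public void dataRest() {
    }

    @After("dataRest()")
    public void loggingAdvice(JoinPoint joinPoint) {
        // runs after each repository method that backs a generated endpoint
        System.out.println("Data REST call: " + joinPoint.getSignature());
    }
}

Note that the advice fires on the repository method invocation rather than on the HTTP layer, so it also covers calls made to the repository from your own services.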

Spring Boot Hibernate Postgresql #Transactional does not rollback [duplicate]

I want to read text data fixtures (CSV files) at the start of my application and put them in my database.
For that, I have created a PopulationService with an initialization method (@PostConstruct annotation).
I also want the inserts to be executed in a single transaction, and hence I added @Transactional on the same method.
However, the @Transactional seems to be ignored:
the transaction is started/stopped at my low-level DAO methods.
Do I need to manage the transaction manually then?
Quote from legacy (closed) Spring forum:
In the @PostConstruct (as with afterPropertiesSet from the InitializingBean interface) there is no way to ensure that all the post-processing is already done, so (indeed) there can be no transactions. The only way to ensure that it works is by using a TransactionTemplate.
So if you would like something in your @PostConstruct to be executed within a transaction, you have to do something like this:
@Service("something")
public class Something {

    @Autowired
    @Qualifier("transactionManager")
    protected PlatformTransactionManager txManager;

    @PostConstruct
    private void init() {
        TransactionTemplate tmpl = new TransactionTemplate(txManager);
        tmpl.execute(new TransactionCallbackWithoutResult() {
            @Override
            protected void doInTransactionWithoutResult(TransactionStatus status) {
                //PUT YOUR CALL TO SERVICE HERE
            }
        });
    }
}
I think @PostConstruct only ensures that the preprocessing/injection of your current class is finished. It does not mean that the initialization of the whole application context is finished.
However, you can use the Spring event system to receive an event when the initialization of the application context is finished:
public class MyApplicationListener implements ApplicationListener<ContextRefreshedEvent> {

    public void onApplicationEvent(ContextRefreshedEvent event) {
        // do startup code ..
    }
}
See the documentation section Standard and Custom Events for more details.
As an update, from Spring 4.2 the @EventListener annotation allows a cleaner implementation:
@Service
public class InitService {

    @Autowired
    MyDAO myDAO;

    @EventListener(ContextRefreshedEvent.class)
    public void onApplicationEvent(ContextRefreshedEvent event) {
        event.getApplicationContext().getBean(InitService.class).initialize();
    }

    @Transactional
    public void initialize() {
        // use the DAO
    }
}
Inject the bean into itself and call the @Transactional method through it:
@Service
public class AccountService {

    @Autowired
    private AccountService self;

    @Transactional
    public void resetAllAccounts() {
        //...
    }

    @PostConstruct
    private void init() {
        self.resetAllAccounts();
    }
}
For older Spring versions, which do not support self-injection, inject BeanFactory and obtain the proxied self via beanFactory.getBean(AccountService.class), as in the sketch below.
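A minimal sketch of that variant, assuming the same AccountService as above:

@Service
public class AccountService {

    @Autowired
    private BeanFactory beanFactory;

    @Transactional
    public void resetAllAccounts() {
        //...
    }

    @PostConstruct
    private void init() {
        // look up the proxied bean so the @Transactional advice is applied
        beanFactory.getBean(AccountService.class).resetAllAccounts();
    }
}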
EDIT
It looks like, in the 1.5 years since this solution was posted, developers are still under the impression that if a method annotated with @Transactional is called from a @PostConstruct-annotated method invoked upon bean initialization, it won't actually be executed inside a Spring transaction, so awkward (obsolete?) solutions get discussed and accepted instead of this very simple and straightforward one, which even gets downvoted.
The Doubting Thomases :) are welcome to check out an example Spring Boot application on GitHub which implements the solution described above.
What actually causes the confusion, IMHO: the call to the @Transactional method should be done through a proxied version of the bean where that method is defined.
1) When a @Transactional method is called from another bean, that other bean usually injects this one and invokes its proxied version (e.g. through @Autowired), and everything is fine.
2) When a @Transactional method is called from the same bean directly, through a plain Java call, the Spring AOP/proxy machinery is not involved and the method is not executed inside a transaction.
3) When, as in the suggested solution, a @Transactional method is called from the same bean through the self-injected proxy (the self field), the situation is basically equivalent to case 1.
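In code, the difference between case 2 and case 3 boils down to which reference the call goes through (a small illustration, not from the original answer):

@PostConstruct
private void init() {
    resetAllAccounts();      // case 2: plain "this" call, bypasses the proxy, no transaction
    self.resetAllAccounts(); // case 3: goes through the self-injected proxy, transactional
}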
@Platon Serbin's answer didn't work for me, so I kept searching and found the following answer that saved my life. :D
The answer is here: No Session Hibernate in @PostConstruct, which I took the liberty of transcribing:
@Service("myService")
@Transactional(readOnly = true)
public class MyServiceImpl implements MyService {

    @Autowired
    private MyDao myDao;

    private CacheList cacheList;

    @Autowired
    public void MyServiceImpl(PlatformTransactionManager transactionManager) {
        this.cacheList = (CacheList) new TransactionTemplate(transactionManager).execute(new TransactionCallback() {
            @Override
            public Object doInTransaction(TransactionStatus transactionStatus) {
                CacheList cacheList = new CacheList();
                cacheList.reloadCache(MyServiceImpl.this.myDao.getAllFromServer());
                return cacheList;
            }
        });
    }
}
The transaction infrastructure of Spring might not be completely initialized at @PostConstruct.
Use a listener for the ContextRefreshedEvent to ensure that transactions are available:
@Component
public class YourService
        implements ApplicationListener<ContextRefreshedEvent> // <= ensure correct timing!
{
    private final YourRepo repo;

    public YourService(YourRepo repo) {
        this.repo = repo;
    }

    @Transactional // <= ensure transaction!
    @Override
    public void onApplicationEvent(ContextRefreshedEvent event) {
        repo.doSomethingWithinTransaction();
    }
}
Using transactionOperations.execute() works both in @PostConstruct and in a @NoTransaction method:
@Service
public class ConfigurationService implements ApplicationContextAware {

    private static final Logger LOG = LoggerFactory.getLogger(ConfigurationService.class);

    private ConfigDAO dao;
    private TransactionOperations transactionOperations;

    @Autowired
    public void setTransactionOperations(TransactionOperations transactionOperations) {
        this.transactionOperations = transactionOperations;
    }

    @Autowired
    public void setConfigurationDAO(ConfigDAO dao) {
        this.dao = dao;
    }

    @PostConstruct
    public void postConstruct() {
        try {
            transactionOperations.execute(new TransactionCallbackWithoutResult() {
                @Override
                protected void doInTransactionWithoutResult(final TransactionStatus status) {
                    ResultSet<Config> configs = dao.queryAll();
                }
            });
        } catch (Exception ex) {
            LOG.trace(ex.getMessage(), ex);
        }
    }

    @NoTransaction
    public void saveConfiguration(final Configuration configuration, final boolean applicationSpecific) {
        String name = configuration.getName();
        Configuration original = transactionOperations.execute((TransactionCallback<Configuration>) status ->
                getConfiguration(configuration.getName(), applicationSpecific, null));
    }

    @Override
    public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
    }
}

Implementing Projection with Specification in Spring Data JPA

I am trying to implement the projection with specification in Spring Data JPA via this implementation:
https://github.com/pramoth/specification-with-projection
Related classes are as follows:
Spec:
public class TopicSpec {

    public static Specification<Topic> idEq(String id) {
        return (root, query, cb) -> cb.equal(root.get(Topic_.id), id);
    }
}
Repository
@Repository
public interface TopicRepository extends JpaRepository<Topic, String>, JpaSpecificationExecutorWithProjection<Topic> {

    public static interface TopicSimple {
        String getId();
        String getName();
    }

    List<TopicSimple> findById(String id);
}
Test
@Test
public void specificationWithProjection() {
    Specification<Topic> where = Specifications.where(TopicSpec.idEq("Bir"));
    List<Topic> all = topicRepository.findAll(where);
    Assertions.assertThat(all).isNotEmpty();
}
I get a response from the GET method; however, the tests fail. Besides, when I pull the GitHub project of pramoth, I can run its tests successfully. Does anyone have an opinion about this issue?
The full project can be found here:
https://github.com/dengizik/projectionDemo
I asked the same question to the developer of the project, Pramoth Suwanpech, who was kind enough to check my code and give an answer. My test class should have initialized the test object like this:
@Before
public void init() {
    Topic topic = new Topic();
    topic.setId("İki");
    topic.setName("Hello");
    topicRepository.save(topic);
}
With this setting the tests passed.

Implementing RequestMethod.PATCH in Spring RestController

I am creating a REST API for a MongoDB database using MongoRepository. I want to create an endpoint that uses "RequestMethod.PATCH" and implements the "PATCH" functionality: a delta update with the fields provided in the @RequestBody.
The functionality that I want already exists in Spring Data REST by using the "@RepositoryRestResource" annotation on my repository class, as described here: https://spring.io/guides/gs/accessing-data-rest/
But I don't want to expose my repository class like that. I prefer the classic Controller->Service->Repository layering. My controller looks like this:
@RestController
public class ActivitiesController {

    @Autowired
    ActivitiesService activitiesService;

    @RequestMapping(value = "activities", method = RequestMethod.PATCH)
    public ActivityModel updateActivity(
            @RequestBody ActivityModel activityModel
    ) {
        // Input ActivityModel will only have the subset of fields that have changed, aka the delta
        return activitiesService.update(activityModel);
    }

    @RequestMapping(value = "activities", method = RequestMethod.PUT)
    public ActivityModel replaceActivity( // renamed: two overloads with identical signatures would not compile
            @RequestBody ActivityModel activityModel
    ) {
        // Input ActivityModel will have all fields populated
        return activitiesService.save(activityModel);
    }
}
And my repository is here:
@Repository
public interface ActivitiesRepo extends MongoRepository<ActivityModel, String> {
    // out-of-the-box implementation
}
My problem is that, from what I can tell, MongoRepository does not provide delta updates out of the box the way Spring Data REST does. How can I implement that functionality in the service layer here?
@Service
public class ActivitiesService {

    @Autowired
    ActivitiesRepo activitiesRepo;

    public ActivityModel update(ActivityModel activityModel) {
        // delta update implementation, aka PATCH implementation
    }

    // method that should only be used with RequestMethod.PUT
    public ActivityModel save(ActivityModel activityModel) {
        return activitiesRepo.save(activityModel);
    }
}
What do you think of this solution for a manual "PATCH" implementation:
public class ModelUtil {

    public static <T> T update(Object origModel, Object dirtyModel, Class<T> clazz) {
        ObjectMapper m = new ObjectMapper();
        Map<String, Object> origModelAsMap = m.convertValue(origModel, new TypeReference<Map<String, Object>>() {});
        Map<String, Object> dirtyModelAsMap = m.convertValue(dirtyModel, new TypeReference<Map<String, Object>>() {});
        // overlay the delta's entries onto the original model's map
        dirtyModelAsMap.forEach((k, v) -> origModelAsMap.put(k, v));
        return m.convertValue(origModelAsMap, clazz);
    }
}
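With that utility, the PATCH path in ActivitiesService might look roughly like this (a sketch assuming ActivityModel exposes getId() and that the incoming delta carries the id of the document to patch):

public ActivityModel update(ActivityModel delta) {
    ActivityModel original = activitiesRepo.findById(delta.getId())
            .orElseThrow(() -> new IllegalArgumentException("Unknown activity: " + delta.getId()));
    // overlay the delta onto the stored document and persist the merged result
    ActivityModel merged = ModelUtil.update(original, delta, ActivityModel.class);
    return activitiesRepo.save(merged);
}

Note that whether null fields in the delta overwrite existing values depends on how the incoming JSON is deserialized and on the ObjectMapper configuration, so that behaviour is worth testing explicitly.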