Implementing Projection with Specification in Spring Data JPA

I am trying to implement the projection with specification in Spring Data JPA via this implementation:
https://github.com/pramoth/specification-with-projection
Related classes are as follows:
Spec:
public class TopicSpec {
    public static Specification<Topic> idEq(String id) {
        return (root, query, cb) -> cb.equal(root.get(Topic_.id), id);
    }
}
Repository:
@Repository
public interface TopicRepository extends JpaRepository<Topic, String>, JpaSpecificationExecutorWithProjection<Topic> {

    public static interface TopicSimple {
        String getId();
        String getName();
    }

    List<TopicSimple> findById(String id);
}
Test:
@Test
public void specificationWithProjection() {
    Specification<Topic> where = Specifications.where(TopicSpec.idEq("Bir"));
    List<Topic> all = topicRepository.findAll(where);
    Assertions.assertThat(all).isNotEmpty();
}
However, the tests fail. When I pull pramoth's GitHub project, I can run its tests successfully. Does anyone have any idea what causes this issue?
The full project can be found here:
https://github.com/dengizik/projectionDemo

I asked the same question to the developer of the project, Pramoth Suwanpech, who was kind enough to check my code and give an answer. My test class should have set up the test object like this:
@Before
public void init() {
    Topic topic = new Topic();
    topic.setId("İki");
    topic.setName("Hello");
    topicRepository.save(topic);
}
With this setting the tests passed.
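For reference, here is a rough sketch of how the corrected test class fits together. The class name, JUnit 4 runner, and Spring Boot test annotations are assumptions rather than code from the demo project; note also that the id saved in init() must match the id passed to idEq for the assertion to hold, so both use "Bir" here.
import java.util.List;

import org.assertj.core.api.Assertions;
import org.junit.Before;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.data.jpa.domain.Specification;
import org.springframework.data.jpa.domain.Specifications;
import org.springframework.test.context.junit4.SpringRunner;

@RunWith(SpringRunner.class)
@SpringBootTest
public class TopicRepositoryTest {

    @Autowired
    private TopicRepository topicRepository;

    @Before
    public void init() {
        // Persist a Topic whose id matches the one queried by the specification below
        Topic topic = new Topic();
        topic.setId("Bir");
        topic.setName("Hello");
        topicRepository.save(topic);
    }

    @Test
    public void specificationWithProjection() {
        Specification<Topic> where = Specifications.where(TopicSpec.idEq("Bir"));
        List<Topic> all = topicRepository.findAll(where);
        Assertions.assertThat(all).isNotEmpty();
    }
}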

Related

Spring Boot Hibernate Postgresql @Transactional does not rollback [duplicate]

I want to read text data fixtures (CSV files) at the start of my application and put them in my database.
For that, I have created a PopulationService with an initialization method (@PostConstruct annotation).
I also want them to be executed in a single transaction, and hence I added @Transactional on the same method.
However, the @Transactional seems to be ignored: the transaction is started/stopped at my low-level DAO methods.
Do I need to manage the transaction manually then?
Quote from legacy (closed) Spring forum:
In the #PostConstruct (as with the afterPropertiesSet from the InitializingBean interface) there is no way to ensure that all the post processing is already done, so (indeed) there can be no Transactions. The only way to ensure that that is working is by using a TransactionTemplate.
So if you would like something in your @PostConstruct to be executed within a transaction, you have to do something like this:
@Service("something")
public class Something {

    @Autowired
    @Qualifier("transactionManager")
    protected PlatformTransactionManager txManager;

    @PostConstruct
    private void init() {
        TransactionTemplate tmpl = new TransactionTemplate(txManager);
        tmpl.execute(new TransactionCallbackWithoutResult() {
            @Override
            protected void doInTransactionWithoutResult(TransactionStatus status) {
                //PUT YOUR CALL TO SERVICE HERE
            }
        });
    }
}
I think @PostConstruct only ensures that the preprocessing/injection of your current class is finished. It does not mean that the initialization of the whole application context is finished.
However, you can use the Spring event system to receive an event when the initialization of the application context is finished:
public class MyApplicationListener implements ApplicationListener<ContextRefreshedEvent> {

    public void onApplicationEvent(ContextRefreshedEvent event) {
        // do startup code ..
    }
}
See the documentation section Standard and Custom Events for more details.
As an update: from Spring 4.2 onwards, the @EventListener annotation allows a cleaner implementation:
@Service
public class InitService {

    @Autowired
    MyDAO myDAO;

    @EventListener(ContextRefreshedEvent.class)
    public void onApplicationEvent(ContextRefreshedEvent event) {
        event.getApplicationContext().getBean(InitService.class).initialize();
    }

    @Transactional
    public void initialize() {
        // use the DAO
    }
}
Inject the bean into itself and call the @Transactional method through that self reference:
public class AccountService {

    @Autowired
    private AccountService self;

    @Transactional
    public void resetAllAccounts() {
        //...
    }

    @PostConstruct
    private void init() {
        self.resetAllAccounts();
    }
}
For older Spring versions which do not support self-injection, inject BeanFactory and get self as beanFactory.getBean(AccountService.class)
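A minimal sketch of that BeanFactory variant might look like this (the service and method names are simply reused from the example above; this is an illustration of the suggestion, not the original poster's code):
import javax.annotation.PostConstruct;

import org.springframework.beans.factory.BeanFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class AccountService {

    @Autowired
    private BeanFactory beanFactory;

    @Transactional
    public void resetAllAccounts() {
        //...
    }

    @PostConstruct
    private void init() {
        // Fetch the proxied bean from the factory instead of relying on self-injection
        beanFactory.getBean(AccountService.class).resetAllAccounts();
    }
}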
EDIT
It looks like, in the 1.5 years since this solution was posted, developers are still under the impression that if a method annotated with @Transactional is called from a @PostConstruct-annotated method during bean initialization, it won't actually be executed inside a Spring transaction, and awkward (obsolete?) workarounds get discussed and accepted instead of this very simple and straightforward one, which even gets downvoted.
The Doubting Thomases :) are welcome to check out an example Spring Boot application at GitHub which implements the solution described above.
What actually causes the confusion, IMHO: the call to a @Transactional method has to go through a proxied version of the bean where that method is defined (see the sketch below).
1) When a @Transactional method is called from another bean, that other bean usually injects this one (e.g. via @Autowired) and therefore invokes its proxied version, and everything is fine.
2) When a @Transactional method is called from the same bean directly, through a plain Java call, the Spring AOP/proxy machinery is not involved and the method is not executed inside a transaction.
3) When, as in the suggested solution, a @Transactional method is called from the same bean through a self-injected proxy (the self field), the situation is basically equivalent to case 1.
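To illustrate the three cases, here is a minimal, hypothetical sketch (the class and method names are invented for illustration only):
import javax.annotation.PostConstruct;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class ReportService {

    @Autowired
    private ReportService self; // proxied reference to this very bean

    @Transactional
    public void generate() {
        // transactional work
    }

    public void directCall() {
        generate();       // case 2: plain Java call, bypasses the proxy, no transaction
    }

    @PostConstruct
    public void init() {
        self.generate();  // case 3 (equivalent to case 1): goes through the proxy, transactional
    }
}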
@Platon Serbin's answer didn't work for me, so I kept searching and found the following answer that saved my life. :D
The answer is here: No Session Hibernate in @PostConstruct, which I took the liberty of transcribing:
@Service("myService")
@Transactional(readOnly = true)
public class MyServiceImpl implements MyService {

    @Autowired
    private MyDao myDao;

    private CacheList cacheList;

    // Note: an @Autowired init method (not a constructor); the cache is built
    // inside a programmatic transaction once the transaction manager is injected.
    @Autowired
    public void MyServiceImpl(PlatformTransactionManager transactionManager) {
        this.cacheList = (CacheList) new TransactionTemplate(transactionManager).execute(new TransactionCallback() {
            @Override
            public Object doInTransaction(TransactionStatus transactionStatus) {
                CacheList cacheList = new CacheList();
                cacheList.reloadCache(MyServiceImpl.this.myDao.getAllFromServer());
                return cacheList;
            }
        });
    }
}
The transaction infrastructure of Spring might not be completely initialized at @PostConstruct time.
Use a listener for the ContextRefreshedEvent to ensure that transactions are available:
@Component
public class YourService
        implements ApplicationListener<ContextRefreshedEvent> // <= ensure correct timing!
{
    private final YourRepo repo;

    public YourService(YourRepo repo) {
        this.repo = repo;
    }

    @Transactional // <= ensure transaction!
    @Override
    public void onApplicationEvent(ContextRefreshedEvent event) {
        repo.doSomethingWithinTransaction();
    }
}
Using transactionOperations.execute() works both in @PostConstruct and in a @NoTransaction method:
@Service
public class ConfigurationService implements ApplicationContextAware {

    private static final Logger LOG = LoggerFactory.getLogger(ConfigurationService.class);

    private ConfigDAO dao;
    private TransactionOperations transactionOperations;

    @Autowired
    public void setTransactionOperations(TransactionOperations transactionOperations) {
        this.transactionOperations = transactionOperations;
    }

    @Autowired
    public void setConfigurationDAO(ConfigDAO dao) {
        this.dao = dao;
    }

    @PostConstruct
    public void postConstruct() {
        try {
            transactionOperations.execute(new TransactionCallbackWithoutResult() {
                @Override
                protected void doInTransactionWithoutResult(final TransactionStatus status) {
                    ResultSet<Config> configs = dao.queryAll();
                }
            });
        }
        catch (Exception ex) {
            LOG.trace(ex.getMessage(), ex);
        }
    }

    @NoTransaction
    public void saveConfiguration(final Configuration configuration, final boolean applicationSpecific) {
        String name = configuration.getName();
        Configuration original = transactionOperations.execute((TransactionCallback<Configuration>) status ->
                getConfiguration(configuration.getName(), applicationSpecific, null));
    }

    @Override
    public void setApplicationContext(ApplicationContext applicationContext) throws BeansException {
    }
}

Spring Data MongoDB No property get found for type at org.springframework.data.mapping.PropertyPath

I am using Spring Data MongoDB version 1.4.2.RELEASE. I have created the custom repository interface and implementation in one location and created a custom query method getUsersName(Users users).
However, I am still getting the exception below:
Caused by: org.springframework.data.mapping.PropertyReferenceException:
No property get found for type Users!
at org.springframework.data.mapping.PropertyPath.<init>(PropertyPath.java:75)
at org.springframework.data.mapping.PropertyPath.create(PropertyPath.java:327)
at org.springframework.data.mapping.PropertyPath.create(PropertyPath.java:359)
at org.springframework.data.mapping.PropertyPath.create(PropertyPath.java:359)
at org.springframework.data.mapping.PropertyPath.create(PropertyPath.java:307)
at org.springframework.data.mapping.PropertyPath.from(PropertyPath.java:270)
at org.springframework.data.mapping.PropertyPath.from(PropertyPath.java:241)
at org.springframework.data.repository.query.parser.Part.<init>(Part.java:76)
at org.springframework.data.repository.query.parser.PartTree$OrPart.<init>(PartTree.java:201)
at org.springframework.data.repository.query.parser.PartTree$Predicate.buildTree(PartTree.java:291)
at org.springframework.data.repository.query.parser.PartTree$Predicate.<init>(PartTree.java:271)
at org.springframework.data.repository.query.parser.PartTree.<init>(PartTree.java:80)
at org.springframework.data.mongodb.repository.query.PartTreeMongoQuery.<init>(PartTreeMongoQuery.java:47)
Below is my Spring Data MongoDB structure:
/* Users Domain Object */
@Document(collection = "users")
public class Users {

    @Id
    private ObjectId id;

    @Field("last_name")
    private String last_name;

    @Field("first_name")
    private String first_name;

    public String getLast_name() {
        return last_name;
    }

    public void setLast_name(String last_name) {
        this.last_name = last_name;
    }

    public String getFirst_name() {
        return first_name;
    }

    public void setFirst_name(String first_name) {
        this.first_name = first_name;
    }
}
/* UsersRepository.java main interface */
@Repository
public interface UsersRepository extends MongoRepository<Users, String>, UsersRepositoryCustom {
    List<Users> findUsersById(String id);
}
/* UsersRepositoryCustom.java custom interface */
@Repository
public interface UsersRepositoryCustom {
    List<Users> getUsersName(Users users);
}
/* UsersRepositoryImpl.java custom interface implementation */
@Component
public class UsersRepositoryImpl implements UsersRepositoryCustom {

    @Autowired
    MongoOperations mongoOperations;

    @Override
    public List<Users> getUsersName(Users users) {
        return mongoOperations.find(
                Query.query(Criteria.where("first_name").is(users.getFirst_name()).and("last_name").is(users.getLast_name())),
                Users.class);
    }
}
/* Test method inside a Spring JUnit test class, calling the custom method through the main UsersRepository interface */
@Autowired
private UsersRepository usersRepository;

@Test
public void getUsersName() {
    Users users = new Users();
    users.setFirst_name("James");
    users.setLast_name("Oliver");
    List<Users> usersDetails = usersRepository.getUsersName(users);
    System.out.println("users List" + usersDetails.size());
    Assert.assertTrue(usersDetails.size() > 0);
}
The query method declaration in your repository interface is invalid. As clearly stated in the reference documentation, query method names need to start with get…By, read…By, find…By, or query…By.
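For illustration only, a derived query method that follows this convention could look like the sketch below. It is hypothetical and assumes the entity exposes camelCase properties firstName/lastName with matching getters, which is not the case in the code above:
import java.util.List;

import org.springframework.data.mongodb.repository.MongoRepository;

public interface UsersRepository extends MongoRepository<Users, String> {

    // Parsed by Spring Data as: firstName AND lastName
    List<Users> findByFirstNameAndLastName(String firstName, String lastName);
}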
With custom repositories there shouldn't be a need for the method naming conventions Oliver mentioned; I have mine working with a method named updateMessageCount.
Having said that, I can't see the problem with the code provided here.
I resolved this issue with the help of this post, where I wasn't naming my Impl class correctly:
No property found for type error when try to create custom repository with Spring Data JPA
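In short, the implementation class of a custom fragment is discovered purely by naming convention. A sketch of that convention follows; the exact rule depends on the Spring Data version and on the configurable implementation postfix (which defaults to "Impl"):
// Custom fragment interface
public interface UsersRepositoryCustom {
    List<Users> getUsersName(Users users);
}

// The implementation class must match the expected name, typically either
//   UsersRepositoryImpl        (repository interface name + "Impl"), or
//   UsersRepositoryCustomImpl  (fragment interface name + "Impl"),
// depending on the Spring Data version and the configured repository-impl-postfix.
public class UsersRepositoryImpl implements UsersRepositoryCustom {
    // ... implementation as shown above ...
}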

How to access multiple resources in a single request : Jersey Rest

I am trying to find a good design for the following scenario.
I have a POST REST service which will be given an array of services as data, and which should in turn call them one by one, aggregate the results on the server, and send them back to the client.
@Path("/resource1")
@Path("/resource2")
@Path("/collection")
Post data to /collection:
[{"serviceName": "resource1", "data": "test1"}, {"serviceName": "resource2", "data": "test2"}]
The reason I need resource1 and resource2 is that those services can also be called standalone. I want to reuse the same setup if possible.
Is there any way to do this?
I am using Jersey with Spring.
I'm not sure what these resources have in common. If the post method has the same signature for all of them, you could have an abstract class or interface they implement that defines the post method, and then try using ResourceContext.matchResource to do this, e.g. something like this:
public abstract class AbstractResource {
    public abstract String post(String data);
}

@Path("/resource1")
public class Resource1 extends AbstractResource {

    @POST
    @Override
    public String post(String data) {
        // do something
        return data;
    }
}

@Path("/collection")
public class CollectionResource {

    @Context
    private ResourceContext rc;

    @POST
    @Consumes("application/json")
    public String post(List<PostRequest> postRequests) {
        StringBuilder result = new StringBuilder();
        for (PostRequest pr : postRequests) {
            // should wrap this in try-catch
            AbstractResource ar = rc.matchResource(pr.resource, AbstractResource.class);
            result.append(ar.post(pr.data));
        }
        return result.toString();
    }
}

@XmlRootElement
public class PostRequest {
    public String resource;
    public String data;
}
Hopefully you got the idea and will be able to play with it and tweak it to fit your needs.
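As a usage illustration, posting the aggregated payload could look roughly like this with the JAX-RS 2.0 client API (assuming it is on the classpath; the base URL is illustrative, and the JSON keys mirror the PostRequest fields above):
import javax.ws.rs.client.Client;
import javax.ws.rs.client.ClientBuilder;
import javax.ws.rs.client.Entity;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;

public class CollectionClientExample {

    public static void main(String[] args) {
        // Payload keys match the PostRequest fields (resource, data)
        String payload = "[{\"resource\":\"/resource1\",\"data\":\"test1\"},"
                       + "{\"resource\":\"/resource2\",\"data\":\"test2\"}]";

        Client client = ClientBuilder.newClient();
        Response response = client
                .target("http://localhost:8080/collection") // illustrative base URL
                .request(MediaType.APPLICATION_JSON)
                .post(Entity.json(payload));

        System.out.println(response.readEntity(String.class));

        response.close();
        client.close();
    }
}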

Generic Repository session management asp.net-mvc fluent nhibernate

I have run into a problem with my project. I am using a generic repository with StructureMap together with Fluent NHibernate. Everything works rather well, but when it comes to transactions and session management I really have no clue what to do. I have looked around for answers but I can't really find anything that fits my needs.
What I do in my application is let StructureMap instantiate a repository class when it gets a request for it, like so:
internal class RepositoryRegistry : Registry
{
    public RepositoryRegistry()
    {
        For<IRepository<User>>().Use<Repository<User>>();
        For<IRepository<Tasks>>().Use<Repository<Tasks>>();
    }
}

internal class NHibernateRegistry : Registry
{
    public NHibernateRegistry()
    {
        For<ISessionFactory>()
            .Singleton()
            .Use(() => new NHibernateSessionFactory().GetSessionFactory());

        For<ISession>()
            .Singleton()
            .Use(x => x.GetInstance<ISessionFactory>().OpenSession());
    }
}

public interface IRepository<T>
{
    T GetById(int id);
    void SaveOrUpdate(T entity);
    IList<T> GetAll();
    IQueryable<T> Linq();
    void Add(T entity);
}
Edit: I have concluded what I need. I want to use the unit of work pattern along with StructureMap, but I also want to have some kind of repository wrapper which can be accessed through a unit of work.
Thanks,
James Ford
I think that you are looking for the Unit of Work pattern, where the transaction lifetime is controlled by a unit of work that you inject into the repositories/services.
See this answer for a sample implementation of a UoW with NHibernate and StructureMap.
Edit:
Provided you have implemented a Unit of Work and a generic repository, you would basically use them by:
1) Mapping them in StructureMap:
c.For(typeof(IRepository<>)).Use(typeof(Repository<>));
c.For<IUnitOfWork>().Use<UnitOfWork>();
2) Having the controller accept a Repository (or a Service encapsulating the repository; this approach is often preferred) and the UnitOfWork:
public class MyController
{
    public MyController(IRepository<MyEntity> repository, IUnitOfWork uow)
    {
        _repository = repository;
        _unitOfWork = uow;
    }
}
This of course also requires that you have created a custom ControllerFactory.
3) Using the Unit of Work and Repository in the controller action:
public ViewResult MyAction(MyEntity entity)
{
    _repository.Save(entity);
    _unitOfWork.Commit();
    return View();
}

How To Centrally Initialize an IOC Container in MbUnit?

We currently have a suite of integration tests that run via MbUnit test suites. We are in the process of refactoring much of the code to use an IOC framework (StructureMap).
I'd like to configure/initialize the container ONCE when the MBUnit test runner fires up, using the same registry code that we use in production.
Is there a way of achieving this in MbUnit?
(EDIT) The version of MbUnit is 2.4.197.
Found it. The AssemblyCleanup attribute.
http://www.testingreflections.com/node/view/639
I understand that you want to spin up only one container for your entire test run and have it be the container used across test suite execution. The MBUnit docs make it look like you might be able to use a TestSuiteFixture and TestSuiteFixtureSetup to accomplish about what you want.
I wanted to speak from the point of view of a StructureMap user and Test Driven Developer.
We rarely use containers in our test suites unless we are explicitly testing pulling things out of the container. When this is necessary I use the abstract test base class below (warning: we use NUnit):
[TestFixture]
public abstract class with_container
{
    protected IContainer Container;

    [TestFixtureSetUp]
    public void beforeAll()
    {
        Container = new Bootstraper().GetContainer();
        Container.AssertConfigurationIsValid();
    }
}

public class Bootstraper
{
    public Bootstraper()
    {
        ObjectFactory.Initialize(x =>
        {
            //register stuff here
        });
    }

    public IContainer GetContainer()
    {
        return ObjectFactory.Container;
    }
}
I would recommend for normal tests that you skip the normal container and just use the automocking container included with StructureMap. Here is another handy abstract test base class we use.
public abstract class Context<T> where T : class
{
    [SetUp]
    public void Setup()
    {
        _services = new RhinoAutoMocker<T>(MockMode.AAA);
        OverrideMocks();
        _cut = _services.ClassUnderTest;
        Given();
    }

    public RhinoAutoMocker<T> _services { get; private set; }
    public T _cut { get; private set; }

    public SERVICE MockFor<SERVICE>() where SERVICE : class
    {
        return _services.Get<SERVICE>();
    }

    public SERVICE Override<SERVICE>(SERVICE with) where SERVICE : class
    {
        _services.Inject(with);
        return with;
    }

    public virtual void Given()
    {
    }

    public virtual void OverrideMocks()
    {
    }
}
and here is a basic test using this context tester:
[TestFixture]
public class communication_publisher : Context<CommunicationPublisher>
{
    [Test]
    public void should_send_published_message_to_endpoint_retrieved_from_the_factory()
    {
        var message = ObjectMother.ValidOutgoingCommunicationMessage();
        _cut.Publish(message);
        MockFor<IEndpoint>().AssertWasCalled(a => a.Send(message));
    }
}
Sorry if this is not exactly what you wanted; these techniques just work very well for us and I wanted to share them.