Mongo custom repository autowired is null - mongodb

I'm trying to autowire my custom Mongo repository (and it seems the constructor is executed), but the field is still null.
I've looked at some similar questions,
Spring Data Neo4j - @Autowired Repository == null
and
spring data mongo repository is null,
but I still don't know how to solve this.
public class TestRepo {

    @Autowired
    PersonRepository repository;

    public void find(String name) {
        System.out.println(repository.findByName(name));
    }
}
config
<mongo:repositories base-package="com.yyyy.zzz" />
PersonRepository
public interface PersonRepository extends Repository<Person, BigInteger> {

    @Query("{name : ?0}")
    public Person findByName(String name);
}
Implementation
public class PersonRepositoryImpl implements PersonRepository {

    PersonRepositoryImpl() {
        System.out.println("constructing");
    }

    public Person findByName(String name) {
        ...
    }
}
If I get the repository bean directly from the context, it works.

Your repository setup looks suspicious. To execute query methods, you don't need to provide an implementation at all. I suspect that in your current setup the custom implementation in PersonRepositoryImpl "overrides" the query method and is thus preferred on execution.
If you simply drop your implementation class, Spring Data will automatically execute the query for you on invocation.
Generally speaking, custom implementation classes are only needed for functionality you cannot get through other means (query methods, Querydsl integration, etc.).
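For illustration, here is a minimal sketch of the simplified setup. The @Service annotation is an assumption on my part; any mechanism that makes TestRepo a container-managed bean will do, because @Autowired is only processed on beans Spring creates (which is also why fetching the repository straight from the context works):

// PersonRepository.java: no PersonRepositoryImpl anywhere; Spring Data
// generates the execution of the annotated query method at runtime.
import java.math.BigInteger;

import org.springframework.data.mongodb.repository.Query;
import org.springframework.data.repository.Repository;

public interface PersonRepository extends Repository<Person, BigInteger> {

    @Query("{name : ?0}")
    Person findByName(String name);
}

// TestRepo.java: registered as a bean (e.g. picked up by component
// scanning), so the container injects the repository proxy into the field.
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

@Service
public class TestRepo {

    @Autowired
    private PersonRepository repository;

    public void find(String name) {
        System.out.println(repository.findByName(name));
    }
}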

Related

Spring AOP Pointcut for Spring Data REST Controller (Endpoint)

I would like to do something on every API call to my Spring Boot app. I use Spring AOP to achieve this, using:
#Pointcut("within(#org.springframework.stereotype.Controller *)")
public void controller() {
}
#Pointcut("within(#org.springframework.web.bind.annotation.RestController *)")
public void restController() {
}
#After("(controller() || restController())")
public void loggingAdvice(JoinPoint joinPoint) {
// TODO: do something
}
With that I can catch every call to a controller-based API endpoint. However, I am also using Spring Data REST for the CRUD mechanism, which automatically generates API endpoints, for example:
@RepositoryRestResource(collectionResourceRel = "users", path = "users")
public interface UserRepository extends PagingAndSortingRepository<User, Long> {
    User findByEmail(String email);
}
The question is: can I create a pointcut for every API endpoint that is generated by Spring Data REST?
The following pointcut will target all the RESTful endpoint calls made at "/users", assuming the package of UserRepository is rg.so.example.datarest:
#Pointcut("execution(* rg.so.example.datarest.UserRepository.*(..))")
public void dataRest() {
}
A more generic pointcut to target all the repository implementations in the package rg.so.example.datarest would be:
#Pointcut("execution(* rg.so.example.datarest..*(..))")

Spring Data repository implementation

I am using Spring Data JPA and chose the @Query annotation for creating queries (instead of using named queries or queries derived from method names).
I have a data repository as below:
public interface EventRepository extends CrudRepository<Event, Long> {

    @Query("select e from Event e where e.name = :eventName")
    public List<Event> findEventByName(@Param("eventName") String eventName);
}
The interface looks good, and it's enough per the Spring reference documentation.
But I need an impl class, because I need many other methods in addition to the one above.
I am facing two issues when I create an EventRepositoryImpl class implementing EventRepository:
1. It asks me to implement all the methods of EventRepository. findEventByName is self-contained in the interface, so why do I need to implement it again in the impl class?
2. It asks me to implement all the methods of CrudRepository. I know that's by OO design, but there are many methods.
Given these issues, can I define my EventRepositoryImpl as abstract? This seems to be working fine, but do I need to worry about anything else when Spring uses an abstract class as a bean? Or is there a more elegant way to solve this?
Appreciate your help.
You do not have to implement all of those methods, nor create an abstract class. Take a look at the official documentation:
interface UserRepositoryCustom {
    public void someCustomMethod(User user);
}

class UserRepositoryImpl implements UserRepositoryCustom {

    public void someCustomMethod(User user) {
        // Your custom implementation
    }
}

interface UserRepository extends CrudRepository<User, Long>, UserRepositoryCustom {
    // Declare query methods here
}
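The custom fragment and the CRUD methods are then available on one and the same injected repository bean. A short usage sketch (UserService and its method are hypothetical names of mine):

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

@Service
public class UserService {

    @Autowired
    private UserRepository userRepository;

    public void register(User user) {
        userRepository.save(user);              // inherited from CrudRepository
        userRepository.someCustomMethod(user);  // routed to UserRepositoryImpl
    }
}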

@Inject not working in AttributeConverter

I have a simple AttributeConverter implementation in which I try to inject an object that provides the conversion logic, but @Inject does not seem to work in this case. The converter class looks like this:
@Converter(autoApply = false)
public class String2ByteArrayConverter implements AttributeConverter<String, byte[]> {

    @Inject
    private Crypto crypto;

    @Override
    public byte[] convertToDatabaseColumn(String usrReadable) {
        return crypto.pg_encrypt(usrReadable);
    }

    @Override
    public String convertToEntityAttribute(byte[] dbType) {
        return crypto.pg_decrypt(dbType);
    }
}
When the @Converter is triggered, it throws a NullPointerException because the property crypto is never initialized by the container. Why is that?
I'm using GlassFish 4, and in all other cases @Inject works just fine.
Is it not possible to use CDI on converters?
Any help will be appreciated :)
The focus of my question is more the AttributeConverter part. I understand that for CDI to work, a bean must meet the conditions described here: http://docs.oracle.com/javaee/6/tutorial/doc/gjfzi.html.
I have also tried to force CDI to work by implementing the following constructor:
@Inject
public String2ByteArrayConverter(Crypto crypto) {
    this.crypto = crypto;
}
And now I get the following exception, which doesn't give me any clue:
2015-07-23T01:03:24.835+0200|Severe: Exception during life cycle processing
org.glassfish.deployment.common.DeploymentException: Exception [EclipseLink-28019] (Eclipse Persistence Services - 2.5.2.v20140319-9ad6abd): org.eclipse.persistence.exceptions.EntityManagerSetupException
Exception Description: Deployment of PersistenceUnit [PU_VMA] failed. Close all factories for this PersistenceUnit.
Internal Exception: Exception [EclipseLink-7172] (Eclipse Persistence Services - 2.5.2.v20140319-9ad6abd): org.eclipse.persistence.exceptions.ValidationException
Exception Description: Error encountered when instantiating the class [class model.converter.String2ByteArrayConverter].
Internal Exception: java.lang.InstantiationException: model.converter.String2ByteArrayConverter
at org.eclipse.persistence.internal.jpa.EntityManagerSetupImpl.createDeployFailedPersistenceException(EntityManagerSetupImpl.java:820)
at org.eclipse.persistence.internal.jpa.EntityManagerSetupImpl.deploy(EntityManagerSetupImpl.java:760)
...
I even tried using @Producer or @Decorator to get CDI working there, but I still think there is something specific about the AttributeConverter that doesn't allow CDI. So the problem is not solved yet.
Unfortunately you can't inject CDI beans into a JPA converter; however, in CDI 1.1 you can look up your Crypto programmatically:
Crypto crypto = javax.enterprise.inject.spi.CDI.current().select(Crypto.class).get();
For reference, JPA 2.2 will allow CDI to be used with AttributeConverter, and some vendors already support this (EclipseLink and DataNucleus JPA are the ones I know of that do).
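Applied to the converter from the question, that lookup could look like the following sketch (Crypto and its pg_encrypt/pg_decrypt methods are taken from the question):

import javax.enterprise.inject.spi.CDI;
import javax.persistence.AttributeConverter;
import javax.persistence.Converter;

@Converter(autoApply = false)
public class String2ByteArrayConverter implements AttributeConverter<String, byte[]> {

    // Looked up through the CDI runtime on each call, because the JPA
    // provider instantiates converters itself and performs no injection.
    private Crypto crypto() {
        return CDI.current().select(Crypto.class).get();
    }

    @Override
    public byte[] convertToDatabaseColumn(String usrReadable) {
        return crypto().pg_encrypt(usrReadable);
    }

    @Override
    public String convertToEntityAttribute(byte[] dbType) {
        return crypto().pg_decrypt(dbType);
    }
}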
You're trying to combine two different worlds: CDI doesn't know about JPA stuff and vice versa (one annotation parser of course doesn't know about the other).
What you CAN do is this:
/**
 * @author Jakob Galbavy <code>jg@chex.at</code>
 */
@Converter
@Singleton
@Startup
public class UserConverter implements AttributeConverter<User, Long> {

    @Inject
    private UserRepository userRepository;

    private static UserRepository staticUserRepository;

    @PostConstruct
    public void init() {
        staticUserRepository = this.userRepository;
    }

    @Override
    public Long convertToDatabaseColumn(User attribute) {
        if (null == attribute) {
            return null;
        }
        return attribute.getId();
    }

    @Override
    public User convertToEntityAttribute(Long dbData) {
        if (null == dbData) {
            return null;
        }
        return staticUserRepository.findById(dbData);
    }
}
This way, you create a singleton EJB that is instantiated on boot of the container and sets the static class attribute in its @PostConstruct phase. You then just use the static repository instead of the injected field (which will still be null when the class is used as a JPA converter).
Well, CDI still doesn't work for AttributeConverter, which would be the most elegant solution, but I have found a satisfying workaround. The workaround uses @FacesConverter. Unfortunately, by default CDI doesn't work in faces converters and validators either, but thanks to the Apache MyFaces CODI API you can make it work using the @Advanced annotation :) So I came up with an implementation like this:
@Advanced
@FacesConverter("cryptoConverter")
public class CryptoJSFConverter implements Converter {

    private CryptoController crypto = new CryptoController();

    @Inject
    PatientController ptCtrl;

    public Object getAsObject(FacesContext fc, UIComponent uic, String value) {
        if (value != null) {
            return crypto.pg_encrypt(value, ptCtrl.getSecretKey());
        } else {
            return null;
        }
    }

    public String getAsString(FacesContext fc, UIComponent uic, Object object) {
        String res = crypto.pg_decrypt((byte[]) object, ptCtrl.getSecretKey());
        return res;
    }
}
The injected managed bean has to be explicitly annotated with @Named and some scope definition; a declaration in faces-config.xml doesn't work! In my solution it looks like this:
@Named
@SessionScoped
public class PatientController extends PersistanceManager {
    ...
}
Now one has context information in the converter; in my case it is session/user-specific cryptography configuration.
Of course, in such a solution it is very likely that a custom @FacesValidator is also needed, but thanks to CODI one can use CDI there as well (analogous to the converter).

Spring Data @Query annotation and interface

Spring Data MongoDB 1.1.2.RELEASE (Spring Data Commons Core 1.4.1.RELEASE)
I am having some trouble using the @Query annotation with an interface. For example, if I have the following interface defined:
public interface Person {
    String getName();
    Integer getAge();
}
and the following Repository defined:
public interface PersonRepository extends MongoRepository<Person, String> {

    @Query(value="{ 'name': ?0}")
    List<Person> findPeople(String name);
}
I get the following exception when trying to query:
java.lang.IllegalArgumentException: No property name found on com.abc.People!
at org.springframework.data.mapping.context.AbstractMappingContext.getPersistentPropertyPath(AbstractMappingContext.java:225)
at org.springframework.data.mongodb.core.convert.QueryMapper.getPath(QueryMapper.java:202)
at org.springframework.data.mongodb.core.convert.QueryMapper.getTargetProperty(QueryMapper.java:190)
at org.springframework.data.mongodb.core.convert.QueryMapper.getMappedObject(QueryMapper.java:86)
at org.springframework.data.mongodb.core.MongoTemplate.doFind(MongoTemplate.java:1336)
at org.springframework.data.mongodb.core.MongoTemplate.doFind(MongoTemplate.java:1322)
at org.springframework.data.mongodb.core.MongoTemplate.find(MongoTemplate.java:495)
at org.springframework.data.mongodb.repository.query.AbstractMongoQuery$Execution.readCollection(AbstractMongoQuery.java:123)
This exception does not occur if my @Query is updated to:
public interface PersonRepository extends MongoRepository<Person, String> {

    @Query(value="{ 'abcd': ?0}")
    List<Person> findPeople(String name);
}
It also does not occur if I remove the getName() method from the interface.
Has anyone encountered this issue? Can you tell me what I am doing wrong, or whether this is a known issue? I will open a JIRA issue in the Spring Data project.
I think you are stumbling over this one. This has been fixed in the release announced here. You should see this working by upgrading to Spring Data MongoDB 1.2.1 (which pulls in Spring Data Commons 1.5.1 transitively).

Generic repository session management with ASP.NET MVC and Fluent NHibernate

I have run into a problem with my project. I am using a generic repository with StructureMap together with Fluent NHibernate. Everything works rather well, but when it comes to transactions and session management I really have no clue what to do. I have looked around for answers but I can't really find anything that fits my needs.
What I do in my application is let StructureMap instantiate a repository class when it gets a request for one, like so:
internal class RepositoryRegistry : Registry
{
    public RepositoryRegistry()
    {
        For<IRepository<User>>().Use<Repository<User>>();
        For<IRepository<Tasks>>().Use<Repository<Tasks>>();
    }
}

internal class NHibernateRegistry : Registry
{
    public NHibernateRegistry()
    {
        For<ISessionFactory>()
            .Singleton()
            .Use(() => new NHibernateSessionFactory().GetSessionFactory());

        For<ISession>()
            .Singleton()
            .Use(x => x.GetInstance<ISessionFactory>().OpenSession());
    }
}

public interface IRepository<T>
{
    T GetById(int id);
    void SaveOrUpdate(T entity);
    IList<T> GetAll();
    IQueryable<T> Linq();
    void Add(T entity);
}
Edit: I have concluded what I need. I want to use the Unit of Work pattern along with StructureMap, but I also want some kind of repository wrapper that can be accessed through a unit of work.
Thanks,
James Ford
I think you are looking for the Unit of Work pattern, where the transaction lifetime is controlled by a unit of work that you inject into the repositories/services.
See this answer for a sample implementation of a UoW with NHibernate and StructureMap.
Edit:
Provided you have implemented a Unit of Work and a generic repository, you would basically use them as follows:
1) Map them in StructureMap:
c.For(typeof(IRepository<>)).Use(typeof(Repository<>));
c.For<IUnitOfWork>().Use<UnitOfWork>();
2) Have the controller accept a repository (or a service encapsulating the repository, which is often preferred) and the unit of work:
public class MyController
{
    private readonly IRepository<MyEntity> _repository;
    private readonly IUnitOfWork _unitOfWork;

    public MyController(IRepository<MyEntity> repository, IUnitOfWork uow)
    {
        _repository = repository;
        _unitOfWork = uow;
    }
}
This of course also requires that you have created a custom ControllerFactory.
3) Use the unit of work and repository in the controller action:
public ViewResult MyAction(MyEntity entity)
{
    _repository.Save(entity);
    _unitOfWork.Commit();
    return View();
}