Spring Data @Query annotation and interface

Spring Data MongoDB 1.1.2.RELEASE (Spring Data Commons Core 1.4.1.RELEASE)
I am having some trouble using the @Query annotation with an interface. For example, if I have the following interface defined:
public interface Person {
String getName();
Integer getAge();
}
and the following Repository defined:
public interface PersonRepository extends MongoRepository<Person, String> {
#Query(value="{ 'name': ?0}")
List<Person> findPeople(String name);
}
I get the following exception when trying to query:
java.lang.IllegalArgumentException: No property name found on com.abc.People!
at org.springframework.data.mapping.context.AbstractMappingContext.getPersistentPropertyPath(AbstractMappingContext.java:225)
at org.springframework.data.mongodb.core.convert.QueryMapper.getPath(QueryMapper.java:202)
at org.springframework.data.mongodb.core.convert.QueryMapper.getTargetProperty(QueryMapper.java:190)
at org.springframework.data.mongodb.core.convert.QueryMapper.getMappedObject(QueryMapper.java:86)
at org.springframework.data.mongodb.core.MongoTemplate.doFind(MongoTemplate.java:1336)
at org.springframework.data.mongodb.core.MongoTemplate.doFind(MongoTemplate.java:1322)
at org.springframework.data.mongodb.core.MongoTemplate.find(MongoTemplate.java:495)
at org.springframework.data.mongodb.repository.query.AbstractMongoQuery$Execution.readCollection(AbstractMongoQuery.java:123)
This exception does not occur if my @Query is updated to:
public interface PersonRepository extends MongoRepository<Person, String> {
#Query(value="{ 'abcd': ?0}")
List<Person> findPeople(String name);
}
This also does not occur if I remove the getName() method from the interface.
Has anyone encountered this issue and can tell me what I am doing wrong, or whether this is a known issue? I will open a JIRA issue in the Spring Data project.

I think you are stumbling over this one. This has been fixed in the release announced here. You should see this working by upgrading to Spring Data MongoDB 1.2.1 (which pulls in Spring Data Commons 1.5.1 transitively).

Related

Upgrading from Spring Data 1.11 to Spring Data 2.0 results in "No property delete found for type SimpleEntity!"

I have a simple project with the classes below defined. It works just fine in spring-boot 1.5.4, spring-data-commons 1.13, and spring-data-jpa 1.11.
When I upgrade to spring-boot 2.0.0.M5, spring-data-commons 2.0.0, and spring-data-jpa 2.0.0, I get a PropertyReferenceException at startup that says "No property delete found for type SimpleEntity!" Unfortunately, I can't get the stack trace off the computer I get the error on; it is very locked down for security.
Any ideas? Other posts I found don't seem to match my situation.
Here are the classes (altered the names, but you get the idea):
package entity;
@MappedSuperclass
public abstract class BaseEntity implements Serializable {
....
}
package entity;
@Entity
@Table(schema = "ENTITIES", name = "SIMPLE")
public class SimpleEntity extends BaseEntity {
@Column(name = "ID")
private Long id;
@Column(name = "CODE")
private String code;
@Column(name = "NAME")
private String name;
... getters and setters ...
}
package repository;
import org.springframework.data.repository.Repository;
public interface SimpleRepository extends Repository<SimpleEntity, Long> {
public SimpleEntity save(SimpleEntity entity);
public List<SimpleEntity> save(List<SimpleEntity> entities);
public void delete(Long id);
public SimpleEntity findOne(Long id);
public List<SimpleEntity> findAllByOrderByNameAsc();
public List<SimpleEntity> findByCode(String code);
public List<SimpleEntity> findByNameIgnoreCaseOrderByNameAsc(String name);
}
Turns out there is a breaking change in the Spring Data 2.0 CrudRepository interface. The error I received occurs under the following conditions:
You have a 1.x Spring Data project
You have an interface that extends Repository directly, not a subinterface like CrudRepository
Your Repository subinterface declares the "void delete(ID)" method found in CrudRepository (in my case, "void delete(Long)")
You update to Spring Data 2.x
The problem is that CrudRepository in 2.x no longer has a "void delete(ID)" method; it was removed, and a new method "void deleteById(ID)" was added.
When Spring Data sees a delete method signature it doesn't recognize, it produces an error about your entity class missing a delete property - this is true of both 1.x and 2.x.
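A minimal sketch of the adjusted repository under Spring Data 2.x (only the delete method is shown; the other hand-declared methods from the question need similar attention, since 2.x also renamed save(Iterable) to saveAll and replaced findOne(ID) with findById(ID) returning Optional):
package repository;

import org.springframework.data.repository.Repository;

// Sketch: the hand-declared delete method now mirrors CrudRepository.deleteById(ID)
// from Spring Data 2.x instead of the removed delete(ID), so it is routed to the
// store implementation rather than parsed as a derived query on a "delete" property.
public interface SimpleRepository extends Repository<SimpleEntity, Long> {

    void deleteById(Long id);

    // ... the remaining finder methods stay as before ...
}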

@Inject not working in AttributeConverter

I have a simple AttributeConverter implementation in which I try to inject an object that has to provide the conversion logic, but @Inject does not seem to work in this case. The converter class looks like this:
@Converter(autoApply=false)
public class String2ByteArrayConverter implements AttributeConverter<String, byte[]>
{
@Inject
private Crypto crypto;
@Override
public byte[] convertToDatabaseColumn(String usrReadable)
{
return crypto.pg_encrypt(usrReadable);
}
@Override
public String convertToEntityAttribute(byte[] dbType)
{
return crypto.pg_decrypt(dbType);
}
}
When the @Converter is triggered, it throws a NullPointerException because the property crypto is not being initialized by the container. Why is that?
I'm using GlassFish 4, and in all other cases @Inject works just fine.
Is it not possible to use CDI on converters?
Any help will be appreciated :)
The focus of my question is more on the AttributeConverter part. I understand that for CDI to work, a bean must meet the conditions described here: http://docs.oracle.com/javaee/6/tutorial/doc/gjfzi.html.
I have also tried to force CDI to work by implementing the following constructor:
@Inject
public String2ByteArrayConverter(Crypto crypto)
{
this.crypto = crypto;
}
And now I got the following exception which doesn't give me any clue:
2015-07-23T01:03:24.835+0200|Severe: Exception during life cycle processing
org.glassfish.deployment.common.DeploymentException: Exception [EclipseLink-28019] (Eclipse Persistence Services - 2.5.2.v20140319-9ad6abd): org.eclipse.persistence.exceptions.EntityManagerSetupException
Exception Description: Deployment of PersistenceUnit [PU_VMA] failed. Close all factories for this PersistenceUnit.
Internal Exception: Exception [EclipseLink-7172] (Eclipse Persistence Services - 2.5.2.v20140319-9ad6abd): org.eclipse.persistence.exceptions.ValidationException
Exception Description: Error encountered when instantiating the class [class model.converter.String2ByteArrayConverter].
Internal Exception: java.lang.InstantiationException: model.converter.String2ByteArrayConverter
at org.eclipse.persistence.internal.jpa.EntityManagerSetupImpl.createDeployFailedPersistenceException(EntityManagerSetupImpl.java:820)
at org.eclipse.persistence.internal.jpa.EntityManagerSetupImpl.deploy(EntityManagerSetupImpl.java:760)
...
I even tried using @Producer or @Decorator in order to get CDI working in that place, but I still think there is something specific about the AttributeConverter that doesn't allow CDI. So the problem is not solved yet.
Unfortunately you can't inject CDI beans into a JPA converter; however, in CDI 1.1 you can obtain your Crypto programmatically:
Crypto crypto = javax.enterprise.inject.spi.CDI.current().select(Crypto.class).get();
For reference, JPA 2.2 will allow CDI to be used with AttributeConverter, and some vendors already support this (EclipseLink, DataNucleus JPA are the ones I know of that do it).
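Applied to the converter from the question, that programmatic lookup could look roughly like the following sketch (pg_encrypt/pg_decrypt are the methods of the original Crypto bean):
import javax.enterprise.inject.spi.CDI;
import javax.persistence.AttributeConverter;
import javax.persistence.Converter;

@Converter(autoApply = false)
public class String2ByteArrayConverter implements AttributeConverter<String, byte[]>
{
    // The JPA provider instantiates the converter itself, so the Crypto bean is
    // resolved programmatically on each call instead of relying on @Inject.
    private Crypto crypto()
    {
        return CDI.current().select(Crypto.class).get();
    }

    @Override
    public byte[] convertToDatabaseColumn(String usrReadable)
    {
        return crypto().pg_encrypt(usrReadable);
    }

    @Override
    public String convertToEntityAttribute(byte[] dbType)
    {
        return crypto().pg_decrypt(dbType);
    }
}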
You're trying to combine two different worlds, as CDI doesn't know about JPA stuff and vice versa. (One annotation parser of course doesn't know about the other.)
What you CAN do, is this:
/**
* @author Jakob Galbavy <code>jg@chex.at</code>
*/
@Converter
@Singleton
@Startup
public class UserConverter implements AttributeConverter<User, Long> {
@Inject
private UserRepository userRepository;
private static UserRepository staticUserRepository;
@PostConstruct
public void init() {
staticUserRepository = this.userRepository;
}
@Override
public Long convertToDatabaseColumn(User attribute) {
if (null == attribute) {
return null;
}
return attribute.getId();
}
@Override
public User convertToEntityAttribute(Long dbData) {
if (null == dbData) {
return null;
}
return staticUserRepository.findById(dbData);
}
}
This way, you would create a singleton EJB that is created on boot of the container, setting the static class attribute in the PostConstruct phase. You then just use the static repository instead of the injected field (which will still be null when the class is used as a JPA converter).
Well, CDI still doesn't work for AttributeConverter, which would be the most elegant solution, but I have found a satisfying workaround: using @FacesConverter. Unfortunately, by default CDI doesn't work in faces converters and validators either, but thanks to the Apache MyFaces CODI API you can make it work using the @Advanced annotation :) So I came up with an implementation like this:
@Advanced
@FacesConverter("cryptoConverter")
public class CryptoJSFConverter implements Converter
{
private CryptoController crypto = new CryptoController();
@Inject
PatientController ptCtrl;
public Object getAsObject(FacesContext fc, UIComponent uic, String value)
{
if(value != null)
return crypto.pg_encrypt(value, ptCtrl.getSecretKey());
else
return null;
}
public String getAsString(FacesContext fc, UIComponent uic, Object object)
{
String res = crypto.pg_decrypt((byte[]) object, ptCtrl.getSecretKey());
return res;
}
}
The injected managed bean has to be explicitly annotated with @Named and some scope definition. A declaration in faces-config.xml doesn't work! In my solution it looks like this:
@Named
@SessionScoped
public class PatientController extends PersistanceManager
{
...
}
Now one has context information in the converter. In my case it is session/user-specific cryptography configuration.
Of course, in such a solution it is very likely that a custom @FacesValidator is also needed, but thanks to CODI one also has the possibility of using CDI here (analogous to the converter), as sketched below.
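For the validator part, a CODI-based sketch analogous to the converter could look like this (imports omitted as in the converter snippet above; the actual validation check is only a placeholder assumption):
@Advanced
@FacesValidator("cryptoValidator")
public class CryptoJSFValidator implements Validator
{
    @Inject
    PatientController ptCtrl;

    public void validate(FacesContext fc, UIComponent uic, Object value) throws ValidatorException
    {
        // Placeholder assumption: reject input when no session key is available.
        if (ptCtrl.getSecretKey() == null)
        {
            throw new ValidatorException(new FacesMessage("No cryptography key in this session"));
        }
    }
}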

How to assign an @EntityGraph annotation to Spring Data JPA repository .findAll()

Annotating the Spring Data JPA repository method findAll() with @EntityGraph:
import org.springframework.data.jpa.repository.JpaRepository;
[...]
public interface OptgrpRepository extends JpaRepository<Optgrp, Long> {
#EntityGraph(value = "Optgrp.sysoptions")
List<Optgrp> findAll();
}
leads to this error message:
org.springframework.data.mapping.PropertyReferenceException: No property findAll found for type Optgrp!
Same error happens when changing findAll() to other names:
findAllWithDetail() --> No property findAllWithDetail found for type Optgrp!
findWithDetailAll() --> No property findWithDetailAll found for type Optgrp!
Question: Is it at all possible to use the @EntityGraph annotation on a Spring Data JPA repository method that finds all entities?
EDIT: as asked in the comment, here's the extract from the Optgrp entity class:
@Entity
@NamedEntityGraph(name = "Optgrp.sysoptions", attributeNodes = @NamedAttributeNode("sysoptions"))
public class Optgrp implements Serializable {
[...]
@OneToMany(mappedBy="optgrp", cascade = CascadeType.ALL, orphanRemoval=true)
@OrderBy(clause = "ordnr ASC")
private List<Sysoption> sysoptions = new ArrayList<>();
}
And the Sysoption entity class as well:
@Entity
public class Sysoption implements Serializable {
[...]
@ManyToOne
@JoinColumn(name = "optgrp_id", insertable=false, updatable=false)
private Optgrp optgrp;
}
For all who are using Stack Overflow as a knowledge database too, I'd like to record a new status for Markus Pscheidt's challenge. Three years and six months later, the @EntityGraph annotation now works directly on the findAll() method in a Spring Data JpaRepository, as Markus originally expected.
@Repository
public interface ImportMovieDAO extends JpaRepository<ImportMovie, Long> {
@NotNull
@Override
@EntityGraph(value = "graph.ImportMovie.videoPaths")
List<ImportMovie> findAll();
}
Versions used in the test: Spring Boot 2.0.3.RELEASE with included spring-boot-starter-data-jpa.
Using the method name findByIdNotNull is one way to combine findAll() with an entity graph:
#EntityGraph(value = "Optgrp.sysoptions")
List<Optgrp> findByIdNotNull();

Mongo custom repository autowired is null

I am trying to autowire my custom Mongo repository (and it seems the constructor is executed), but the result is still null.
I've looked at some similar questions
Spring Data Neo4j - @Autowired Repository == null
and
spring data mongo repository is null
but I still don't know how to solve this.
public class TestRepo {
@Autowired
PersonRepository repository;
public void find(String name)
{
System.out.println(repository.findByName(name));
}
}
config
<mongo:repositories base-package="com.yyyy.zzz" />
PersonRepository
public interface PersonRepository extends Repository<Person, BigInteger> {
#Query("{name : ?0}")
public Person findByName(String name);
}
Implementation
public class PersonRepositoryImpl implements PersonRepository{
PersonRepositoryImpl()
{
System.out.println("constructing");
}
public Person findByName(String name) {
...
}
}
If I get the repository bean directly from the context, it works.
Your repository setup looks suspicious. To execute query methods, you don't need to provide an implementation at all. I suspect that in your current setup the custom implementation you have in PersonRepositoryImpl "overrides" the query method and is thus preferred on execution.
If you simply drop your implementation class, Spring Data will automatically execute the query for you on invocation, as sketched below.
Generally speaking, custom implementation classes are only needed for functionality you cannot get through other means (query methods, Querydsl integration, etc.).
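A sketch of the setup that remains after dropping PersonRepositoryImpl (nothing else changes; the proxy that Spring Data creates for the interface runs the annotated query on invocation):
// With <mongo:repositories base-package="com.yyyy.zzz" /> in place, the interface
// alone is enough: Spring Data builds a proxy for it at startup and executes the
// @Query method against MongoDB when it is invoked - no PersonRepositoryImpl needed.
public interface PersonRepository extends Repository<Person, BigInteger> {

    @Query("{name : ?0}")
    Person findByName(String name);
}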

How to customize MongoRepository without overriding the annotated @Query methods in the interface?

I want to customize MongoRepository by adding one method, and still use the implemented methods provided by MongoRepository. Below is the code:
public interface TopoRepositoryInterface extends MongoRepository<Topo, String>
{
#Query("{'name':?0}")
public Topo findByName(String name);
public long getPublishedTopoCount();
}
The implementation declaration is:
public class TopoRepositoryImpl extends SimpleMongoRepository<Topo, String> implements TopoRepositoryInterface
Without the customization, the findByName method declared in TopoRepositoryInterface can be implemented automatically by adding the @Query("{'name':?0}") annotation. But now, since there is interface inheritance, I must add this code:
@Override
public Topo findByName(String name)
{
Topo topo = getMongoOperations().findOne(Query.query(Criteria.where("name").is(name)), Topo.class);
return topo;
}
Is there any way to write my own code for getPublishedTopoCount() only, and leave findByName() to be implemented by the @Query annotation? Thank you very much.
You have to split your repository interface into two.
First one - "Custom" containing methods you implement manually would be:
public interface TopRepositoryCustom {
long getPublishedTopoCount();
}
The second one is for the generated methods:
public interface TopRepository extends MongoRepository<Topo, String>, TopRepositoryCustom {
#Query("{'name':?0}")
Topo findByName(String name);
}
Then you just need to implement the first (custom) interface and remember to follow the proper naming convention, as sketched below. See more at: spring-data mongodb custom implementation PropertyReferenceException and the Spring Data MongoDB custom implementations reference.
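A sketch of what that manual implementation could look like under the older naming convention used at the time (implementation class named after the repository interface plus the Impl suffix); the "published" flag queried here is only an assumed example, since the question doesn't show how a published Topo is marked:
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.mongodb.core.MongoOperations;
import org.springframework.data.mongodb.core.query.Criteria;
import org.springframework.data.mongodb.core.query.Query;

// Picked up automatically because of the "Impl" suffix; only the custom method is
// implemented here, while findByName remains handled by its @Query annotation.
public class TopRepositoryImpl implements TopRepositoryCustom {

    @Autowired
    private MongoOperations mongoOperations;

    @Override
    public long getPublishedTopoCount() {
        // Assumed criterion for illustration: a boolean "published" field on Topo.
        return mongoOperations.count(Query.query(Criteria.where("published").is(true)), Topo.class);
    }
}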