How do I use inheritance in a programming model with JPA?

I'm trying to implement this model using JPA 2.1. I'm using the JSR 338 specification and the reference implementation, EclipseLink.
Only entities of the third level and associative classes will be persisted.
@MappedSuperclass
public abstract class PessoaMaster implements Serializable {
    private static final long serialVersionUID = 1L;
    private long id;
    private List<Telefone> telefones;

    @Id
    @GeneratedValue(strategy=GenerationType.IDENTITY)
    @Column(name="ID_Pai", unique=true, nullable=false)
    public long getId() {
        return id;
    }

    public void setId(long identificador) {
        id = identificador;
    }

    /**
     * @return the telefones
     */
    @OneToMany
    public List<Telefone> getTelefones() {
        return telefones;
    }

    /**
     * @param telefones the telefones to set
     */
    public void setTelefones(List<Telefone> telefones) {
        this.telefones = telefones;
    }
}
Can I use MappedSuperclass here instead of Inheritance?
@Entity
@Inheritance(strategy=InheritanceType.SINGLE_TABLE)
public abstract class FornecedorSuper extends PessoaMaster {
    // attributes and relationships
}
This is the entity that is persisted:
public class FornecedorPecas extends FornecedorSuper {
    private Double Valor;

    @Column(name="ValorPeca")
    public Double getValor() {
        return Valor;
    }

    public void setValor(Double valor) {
        Valor = valor;
    }
}
Is it necessary to mark the FornecedorPecas class with @Entity?
When I put @MappedSuperclass on FornecedorSuper, this exception is thrown:
Exception in thread "main" Local Exception Stack:
Exception [EclipseLink-30005] (Eclipse Persistence Services - 2.5.0.v20130507-3faac2b): org.eclipse.persistence.exceptions.PersistenceUnitLoadingException
Exception Description: An exception was thrown while searching for persistence archives with ClassLoader: sun.misc.Launcher$AppClassLoader@affc70
Internal Exception: javax.persistence.PersistenceException: Exception [EclipseLink-28018] (Eclipse Persistence Services - 2.5.0.v20130507-3faac2b): org.eclipse.persistence.exceptions.EntityManagerSetupException
Exception Description: Predeployment of PersistenceUnit [modelo] failed.
Internal Exception: Exception [EclipseLink-7161] (Eclipse Persistence Services - 2.5.0.v20130507-3faac2b): org.eclipse.persistence.exceptions.ValidationException
Exception Description: Entity class [class br.miltex.dominio.model.FornecedorPecas] has no primary key specified. It should define either an @Id, @EmbeddedId or an @IdClass. If you have defined PK using any of these annotations then make sure that you do not have mixed access-type (both fields and properties annotated) in your entity class hierarchy.

Q1. Can I use MappedSuperclass here instead of Inheritance?
Although I didn't find any examples of that in the specification, the JPA specification says:
The MappedSuperclass annotation designates a class whose mapping information is applied to the entities that inherit from it. A mapped superclass has no separate table defined for it.
So it says that the mapping information is applied to entities, but it does not say that a mapped superclass is not allowed to extend another mapped superclass, so I believe you can use the @MappedSuperclass annotation on your FornecedorSuper class.
Q2. Is it necessary to mark the FornecedorPecas class with @Entity?
Yes, it is necessary.
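Putting the two answers together, a minimal sketch of the hierarchy (using the class names from the question, and assuming the annotations stay on the getters throughout the hierarchy so that access types are not mixed) could look like this:
@MappedSuperclass
public abstract class FornecedorSuper extends PessoaMaster {
    // attributes and relationships, annotated on the getters only
}

@Entity
public class FornecedorPecas extends FornecedorSuper {
    private Double valor;

    @Column(name="ValorPeca")
    public Double getValor() {
        return valor;
    }

    public void setValor(Double valor) {
        this.valor = valor;
    }
}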

I had put annotations on both fields and methods (mixed access types), which is why the exceptions were occurring.
Thanks for the help.

Related

Upgrading from Spring Data 1.11 to Spring Data 2.0 results in "No property delete found for type SimpleEntity!"

I have a simple project with the classes below defined. It works just fine in spring-boot 1.5.4, spring-data-commons 1.13, and spring-data-jpa 1.11.
When I upgrade to spring-boot 2.0.0.M5, spring-data-commons 2.0.0 and spring-data-jpa 2.0.0, I get a PropertyReferenceException at startup that says "No property delete found for type SimpleEntity!" Unfortunately, I can't get the stack trace off the computer where I get the error; it is very locked down for security reasons.
Any ideas? Other posts I found don't seem to match my situation.
Here are the classes (altered the names, but you get the idea):
package entity;

@MappedSuperclass
public abstract class BaseEntity implements Serializable {
    ....
}

package entity;

@Entity
@Table(schema = "ENTITIES", name = "SIMPLE")
public class SimpleEntity extends BaseEntity {
    @Column(name = "ID")
    private Long id;
    @Column(name = "CODE")
    private String code;
    @Column(name = "NAME")
    private String name;
    ... getters and setters ...
}
package repository;

import org.springframework.data.repository.Repository;

public interface SimpleRepository extends Repository<SimpleEntity, Long> {
    public SimpleEntity save(SimpleEntity entity);
    public List<SimpleEntity> save(List<SimpleEntity> entities);
    public void delete(Long id);
    public SimpleEntity findOne(Long id);
    public List<SimpleEntity> findAllByOrderByNameAsc();
    public List<SimpleEntity> findByCode(String code);
    public List<SimpleEntity> findByNameIgnoreCaseOrderByNameAsc(String name);
}
It turns out there is a breaking change in the Spring Data 2.0 CrudRepository interface. The error I received occurs under the following conditions:
You have a 1.x Spring Data project
You have an interface that extends Repository directly, not a subinterface like CrudRepository
Your Repository subinterface declares the "void delete(ID)" method found in CrudRepository (in my case "void delete(Long)")
You update to Spring Data 2.x
The problem is that CrudRepository in 2.x no longer has a "void delete(ID)" method; it was removed, and a new method "void deleteById(ID)" was added.
When Spring Data sees a delete method signature it doesn't recognize, it produces an error about your entity class missing a delete property - this is true of both 1.x and 2.x.
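As a sketch of the fix under that explanation (only the delete declaration is changed here; other methods copied from the 1.x CrudRepository, such as findOne, may need similar attention after the upgrade), the method can simply follow the new 2.x naming:
public interface SimpleRepository extends Repository<SimpleEntity, Long> {
    // renamed from delete(Long) to match the Spring Data 2.x CrudRepository signature
    public void deleteById(Long id);
    // ... the remaining finder methods from the original interface stay as they are ...
}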

Jar file from JPA code doesn't work

I've developed simple code that displays an employee name by using one of JPA's CRUD operations (find) on the entity classes "Employee" and "Department". It worked properly while running the code, but the real problem came when I created a jar file from the application: an exception appeared from the jar file. I wrote the exception to a txt file.
Here is the Employee class
package com.tutorialspoint.eclipselink.entity;

import java.util.*;
import javax.persistence.*;

@Entity
public class Employee {
    @Id
    @GeneratedValue(strategy = GenerationType.AUTO)
    private int eid;

    @Temporal(TemporalType.TIMESTAMP)
    private java.util.Date dop;

    private String ename;
    private double salary;
    private String deg;

    @OneToOne(targetEntity = Department.class)
    private Department dept;

    @OneToMany(targetEntity = Staff.class)
    private ArrayList<Staff> staffs;

    public Employee(int eid, String ename, double salary, String deg) {
        super();
        this.eid = eid;
        this.ename = ename;
        this.salary = salary;
        this.deg = deg;
    }

    public Employee() {
        super();
    }

    public Date getDop() {
        return dop;
    }

    public void setDop(Date dop) {
        this.dop = dop;
    }

    public int getEid() {
        return eid;
    }

    public void setEid(int eid) {
        this.eid = eid;
    }

    public Department getDept() {
        return dept;
    }

    public void setDept(Department dept) {
        this.dept = dept;
    }

    public String getEname() {
        return ename;
    }

    public void setEname(String ename) {
        this.ename = ename;
    }

    public double getSalary() {
        return salary;
    }

    public void setSalary(double salary) {
        this.salary = salary;
    }

    public String getDeg() {
        return deg;
    }

    public void setDeg(String deg) {
        this.deg = deg;
    }

    public ArrayList<Staff> getStaffs() {
        return staffs;
    }

    public void setStaffs(ArrayList<Staff> staffs) {
        this.staffs = staffs;
    }
}
And here is the method that displays the employee name and degree:
public void findEmployee() {
    try {
        EntityManagerFactory emfactory = Persistence.createEntityManagerFactory("Eclipselink_JPA");
        EntityManager entitymanager = emfactory.createEntityManager();
        Employee employee = entitymanager.find(Employee.class, 204);
        JOptionPane.showMessageDialog(null, employee.getEname() + "=>" + employee.getDeg());
    } catch (Exception ex) {
        JOptionPane.showMessageDialog(null, ex.getMessage());
        displayMsg(ex.getMessage());
    }
}

public void displayMsg(String msg) {
    // I made this method to write the exception to a txt file
    try (PrintWriter pw = new PrintWriter(new FileWriter(new File("E:\\bug2.txt")))) {
        pw.println(msg);
        pw.flush();
    } catch (IOException ioe) {
        ioe.printStackTrace();
    }
}
And here is the exception:
Exception [EclipseLink-28019] (Eclipse Persistence Services - 2.5.2.v20140319-9ad6abd): org.eclipse.persistence.exceptions.EntityManagerSetupException
Exception Description: Deployment of PersistenceUnit [Eclipselink_JPA] failed. Close all factories for this PersistenceUnit.
Internal Exception: Exception [EclipseLink-0] (Eclipse Persistence Services - 2.5.2.v20140319-9ad6abd): org.eclipse.persistence.exceptions.IntegrityException
Descriptor Exceptions:
Exception [EclipseLink-1] (Eclipse Persistence Services - 2.5.2.v20140319-9ad6abd): org.eclipse.persistence.exceptions.DescriptorException
Exception Description: The attribute [teacherSet] is not declared as type ValueHolderInterface, but its mapping uses indirection.
Mapping: org.eclipse.persistence.mappings.ManyToManyMapping[teacherSet]
Descriptor: RelationalDescriptor(com.tutorialspoint.eclipselink.entity.Clas --> [DatabaseTable(CLAS)])
Exception [EclipseLink-1] (Eclipse Persistence Services - 2.5.2.v20140319-9ad6abd): org.eclipse.persistence.exceptions.DescriptorException
Exception Description: The attribute [clasSet] is not declared as type ValueHolderInterface, but its mapping uses indirection.
Mapping: org.eclipse.persistence.mappings.ManyToManyMapping[clasSet]
Descriptor: RelationalDescriptor(com.tutorialspoint.eclipselink.entity.Teacher --> [DatabaseTable(TEACHER)])
Exception [EclipseLink-1] (Eclipse Persistence Services - 2.5.2.v20140319-9ad6abd): org.eclipse.persistence.exceptions.DescriptorException
Exception Description: The attribute [staffs] is not declared as type ValueHolderInterface, but its mapping uses indirection.
Mapping: org.eclipse.persistence.mappings.ManyToManyMapping[staffs]
Descriptor: RelationalDescriptor(com.tutorialspoint.eclipselink.entity.Employee --> [DatabaseTable(EMPLOYEE)])
Runtime Exceptions:
---------------------------------------------------------
So what can be done, given that the program works well when running the code from the IDE, but this exception happens when I build it, create the jar file, and run the jar file?
Exceptions that involve interfaces like ValueHolder and indirection are most likely a case of problems due to entity weaving.
Entity weaving is a process of modifying the compiled entities' bytecode so that they implement more interfaces and add new methods such that they can handle things like indirection and lazy-loading, among other features.
Is your IDE Oracle JDeveloper? It is one of the IDEs that, by default, have a run configuration that does this automatically, so that your entities work correctly. This can be configured in other IDEs in a similar manner - by adding -javaagent:<path to eclipselink JAR> as a JVM argument (or Java Option in some IDEs). Check this blog post for some quick info.
It might be the case in your deployment that EclipseLink's dynamic (runtime) weaving has failed (or is incomplete for some reason). Perhaps you should consider static weaving before the entities are packaged into the deployment artifact.
More info on doing so here: https://wiki.eclipse.org/EclipseLink/UserGuide/JPA/Advanced_JPA_Development/Performance/Weaving/Static_Weaving
Thanks, I found out the problem: it was in declaring the ArrayList of staff. There is a problem when I persist a collection declared as a concrete ArrayList or HashSet; I should declare it as the super interface, e.g. Set or List, so I modified it to:
@OneToMany(targetEntity = Staff.class)
private List<Staff> staffs;
With that it worked very well, and when I built the jar file it ran without any problems.

@Inject not working in AttributeConverter

I have a simple AttributeConverter implementation in which I try to inject an object that has to provide the conversion logic, but @Inject seems not to work in this case. The converter class looks like this:
@Converter(autoApply=false)
public class String2ByteArrayConverter implements AttributeConverter<String, byte[]>
{
    @Inject
    private Crypto crypto;

    @Override
    public byte[] convertToDatabaseColumn(String usrReadable)
    {
        return crypto.pg_encrypt(usrReadable);
    }

    @Override
    public String convertToEntityAttribute(byte[] dbType)
    {
        return crypto.pg_decrypt(dbType);
    }
}
When the @Converter is triggered, it throws a NullPointerException because the property crypto is not being initialized by the container. Why is that?
I'm using Glassfish 4 and in all other cases #Inject works just fine.
Is it not possible to use CDI on converters?
Any help will be appreciated :)
The focus of my question is more the AttributeConverter part. I understand that for CDI to work, a bean must meet the conditions described here: http://docs.oracle.com/javaee/6/tutorial/doc/gjfzi.html.
I have also tried to force CDI to work by implementing the following constructor:
@Inject
public String2ByteArrayConverter(Crypto crypto)
{
    this.crypto = crypto;
}
And now I got the following exception which doesn't give me any clue:
2015-07-23T01:03:24.835+0200|Severe: Exception during life cycle processing
org.glassfish.deployment.common.DeploymentException: Exception [EclipseLink-28019] (Eclipse Persistence Services - 2.5.2.v20140319-9ad6abd): org.eclipse.persistence.exceptions.EntityManagerSetupException
Exception Description: Deployment of PersistenceUnit [PU_VMA] failed. Close all factories for this PersistenceUnit.
Internal Exception: Exception [EclipseLink-7172] (Eclipse Persistence Services - 2.5.2.v20140319-9ad6abd): org.eclipse.persistence.exceptions.ValidationException
Exception Description: Error encountered when instantiating the class [class model.converter.String2ByteArrayConverter].
Internal Exception: java.lang.InstantiationException: model.converter.String2ByteArrayConverter
at org.eclipse.persistence.internal.jpa.EntityManagerSetupImpl.createDeployFailedPersistenceException(EntityManagerSetupImpl.java:820)
at org.eclipse.persistence.internal.jpa.EntityManagerSetupImpl.deploy(EntityManagerSetupImpl.java:760)
...
I even tried using @Producer or @Decorator in order to get CDI working in that place, but I still think there is something specific about the AttributeConverter which doesn't allow CDI. So the problem is not solved yet.
Unfortunately you can't inject CDI beans into a JPA converter; however, in CDI 1.1 you can obtain your Crypto programmatically:
Crypto crypto = javax.enterprise.inject.spi.CDI.current().select(Crypto.class).get();
For reference, JPA 2.2 will allow CDI to be used with AttributeConverter, and some vendors already support this (EclipseLink and DataNucleus JPA are the ones I know of that do it).
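As a sketch of that programmatic lookup applied to the converter from the question (assuming Crypto is an ordinary CDI bean; imports omitted as in the snippets above):
@Converter(autoApply=false)
public class String2ByteArrayConverter implements AttributeConverter<String, byte[]>
{
    // Resolve the bean lazily instead of relying on field injection,
    // since the JPA provider instantiates converters outside of CDI.
    private Crypto crypto()
    {
        return javax.enterprise.inject.spi.CDI.current().select(Crypto.class).get();
    }

    @Override
    public byte[] convertToDatabaseColumn(String usrReadable)
    {
        return crypto().pg_encrypt(usrReadable);
    }

    @Override
    public String convertToEntityAttribute(byte[] dbType)
    {
        return crypto().pg_decrypt(dbType);
    }
}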
You're trying to combine two different worlds, as CDI doesn't know about JPA stuff and vice versa (one annotation parser of course doesn't know about the other).
What you CAN do, is this:
/**
 * @author Jakob Galbavy <code>jg@chex.at</code>
 */
@Converter
@Singleton
@Startup
public class UserConverter implements AttributeConverter<User, Long> {
    @Inject
    private UserRepository userRepository;
    private static UserRepository staticUserRepository;

    @PostConstruct
    public void init() {
        staticUserRepository = this.userRepository;
    }

    @Override
    public Long convertToDatabaseColumn(User attribute) {
        if (null == attribute) {
            return null;
        }
        return attribute.getId();
    }

    @Override
    public User convertToEntityAttribute(Long dbData) {
        if (null == dbData) {
            return null;
        }
        return staticUserRepository.findById(dbData);
    }
}
This way, you create a singleton EJB that is instantiated on boot of the container, setting the static class attribute in the @PostConstruct phase. You then just use the static repository instead of the injected field (which will still be null when the class is used as a JPA converter).
Well, CDI still doesn't work for AttributeConverter, which would be the most elegant solution, but I have found a satisfying workaround. The workaround is using @FacesConverter. Unfortunately, by default CDI doesn't work in faces converters and validators either, but thanks to the Apache MyFaces CODI API you can make it work using the @Advanced annotation :) So I came up with an implementation like this:
@Advanced
@FacesConverter("cryptoConverter")
public class CryptoJSFConverter implements Converter
{
    private CryptoController crypto = new CryptoController();

    @Inject
    PatientController ptCtrl;

    public Object getAsObject(FacesContext fc, UIComponent uic, String value)
    {
        if (value != null)
            return crypto.pg_encrypt(value, ptCtrl.getSecretKey());
        else
            return null;
    }

    public String getAsString(FacesContext fc, UIComponent uic, Object object)
    {
        String res = crypto.pg_decrypt((byte[]) object, ptCtrl.getSecretKey());
        return res;
    }
}
The injected managed bean has to be explicitly annotated with @Named and some scope definition. A declaration in faces-config.xml doesn't work! In my solution it looks like this:
@Named
@SessionScoped
public class PatientController extends PersistanceManager
{
    ...
}
Now one has context information in the converter. In my case it is session/user-specific cryptography configuration.
Of course, in such a solution it is very likely that a custom @FacesValidator is also needed, but thanks to CODI one has the possibility of using CDI here as well (analogous to the converter); a sketch follows below.
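For completeness, such a validator might look like the following (a sketch assuming the same CODI @Advanced mechanism; the class name CryptoJSFValidator and its validation logic are hypothetical, and imports are omitted as in the converter above):
@Advanced
@FacesValidator("cryptoValidator")
public class CryptoJSFValidator implements Validator
{
    @Inject
    PatientController ptCtrl;

    public void validate(FacesContext fc, UIComponent uic, Object value) throws ValidatorException
    {
        // Hypothetical check using the session-specific configuration,
        // e.g. verifying that a secret key is present before converting.
        if (ptCtrl.getSecretKey() == null)
        {
            throw new ValidatorException(new FacesMessage("No crypto key available for this session"));
        }
    }
}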

How to assign an @EntityGraph annotation to Spring Data JPA repository findAll()

Annotating the Spring Data JPA repository method findAll() with @EntityGraph:
import org.springframework.data.jpa.repository.JpaRepository;
[...]
public interface OptgrpRepository extends JpaRepository<Optgrp, Long> {
    @EntityGraph(value = "Optgrp.sysoptions")
    List<Optgrp> findAll();
}
leads to this error message:
org.springframework.data.mapping.PropertyReferenceException: No property findAll found for type Optgrp!
Same error happens when changing findAll() to other names:
findAllWithDetail() --> No property findAllWithDetail found for type Optgrp!
findWithDetailAll() --> No property findWithDetailAll found for type Optgrp!
Question: Is it at all possible to use the #EntityGraph annotation on a Spring Data JPA repository method that finds all entities?
EDIT: as asked in the comment, here's the extract from the Optgrp entity class:
@Entity
@NamedEntityGraph(name = "Optgrp.sysoptions", attributeNodes = @NamedAttributeNode("sysoptions"))
public class Optgrp implements Serializable {
    [...]
    @OneToMany(mappedBy="optgrp", cascade = CascadeType.ALL, orphanRemoval=true)
    @OrderBy(clause = "ordnr ASC")
    private List<Sysoption> sysoptions = new ArrayList<>();
}
And the Sysoption entity class as well:
@Entity
public class Sysoption implements Serializable {
    [...]
    @ManyToOne
    @JoinColumn(name = "optgrp_id", insertable=false, updatable=false)
    private Optgrp optgrp;
}
For all who are using Stack Overflow as a knowledge database too, here is an update on Markus Pscheidt's question. Three years and six months later, the @EntityGraph annotation now works directly on the findAll() method of a Spring Data JpaRepository, as Markus originally expected.
@Repository
public interface ImportMovieDAO extends JpaRepository<ImportMovie, Long> {
    @NotNull
    @Override
    @EntityGraph(value = "graph.ImportMovie.videoPaths")
    List<ImportMovie> findAll();
}
Versions used in the test: Spring Boot 2.0.3.RELEASE with included spring-boot-starter-data-jpa.
Using the name findByIdNotNull is one way to combine both findAll() and entity graph:
@EntityGraph(value = "Optgrp.sysoptions")
List<Optgrp> findByIdNotNull();

@ManyToOne(fetch=FetchType.LAZY) lazy loading not working

I am working with JPA 2.1 (EclipseLink 2.5.1) and JBoss 7.1.
I've defined very simple JPA entities:
@Entity
@Table(name="APLICACIONES_TB")
public class Aplicacion implements Serializable {
    @Id
    @Column(name="COD_APLICACION_V")
    private long codAplicacionV;

    @Column(name="APLICACION_V")
    private String aplicacionV;

    @OneToMany(mappedBy="aplicacion")
    private Collection<Prestacion> prestaciones;

    ... getters and setters
}

@Entity
@Table(name="PRESTACIONES_TB")
public class Prestacion implements Serializable {
    @Id
    @Column(name="COD_PRESTACIONES_V")
    private String codPrestacionesV;

    @Column(name="DESCRIPCION_V")
    private String descripcionV;

    @ManyToOne(fetch=FetchType.LAZY)
    @JoinColumn(name = "COD_APLICACION_V")
    private Aplicacion aplicacion;

    ... getters and setters ...
}
I have developed a stateless EJB that executes a query to obtain some "Aplicacion" entities.
@Stateless
@LocalBean
public class DocuEJB implements DocuEJBLocal
{
    @PersistenceContext(name="DocuEjb", type=PersistenceContextType.TRANSACTION)
    private EntityManager em;

    public Prestacion getResult(String name)
    {
        return em.createNamedQuery("ExampleQueryName", Prestacion.class).getSingleResult();
    }
}
Because I'm working with JSF 2.1, the EJB is injected into a managed bean:
@ManagedBean(name = "ManagedBean")
@RequestScoped
public class ManagedBean
{
    @EJB DocuEJB docuEjb;

    public String doSomething()
    {
        Prestacion entity = docuEjb.getResult("egesr");
        if (entity != null)
        {
            // It should return null because 'entity' should be detached
            Aplicacion app = entity.getAplicacion();
            // but 'app' entity is not null, why not?
            System.out.println(app.getCodAplicacionV());
        }
        return null; // navigation outcome not relevant here
    }
}
Lazy loading is not working even though it has been defined for the 'aplicacion' field on the 'Prestacion' entity. The code posted above should throw a NullPointerException on the next line:
System.out.println (app.getCodAplicacionV());
because the 'app' entity is detached and lazy loading has been configured.
Why is lazy loading not working?
Thanks
Try adding @Transactional on doSomething(); I think that your transaction manager is not well configured.
You can see the official Spring documentation here. In any case, can you add your Spring configuration, so that we can help you better? :)
I don't think the behavior you encounter is abnormal, or else your question should state it more clearly:
EJBs are transactional by default
Your JSF bean injects an EJB with @EJB, and I guess JBoss can create a plain Java reference rather than a proxy
The entity is still managed because the transaction is not done; it will finish when doSomething ends.
Your entity is then loaded into the EntityManager, and lazy loading works because there is a context to it.
You could call em.detach(entity) on the result you are getting; accessing the lazy relation would then probably fail because the entity would not be managed any more.
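As a small sketch of that last point, here is a hypothetical variant of the EJB method from the question (same named query assumed) that detaches the entity before returning it:
public Prestacion getDetachedResult(String name)
{
    Prestacion p = em.createNamedQuery("ExampleQueryName", Prestacion.class).getSingleResult();
    // Remove the entity from the persistence context before returning it.
    em.detach(p);
    // Outside a managed context, p.getAplicacion() may now be null or an
    // uninitialized proxy (depending on weaving), instead of a loaded entity.
    return p;
}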