Assigning Sequences for all JPA entities using SessionCustomizer - jpa

I am trying to use a SessionCustomizer to automatically register sequences in EclipseLink that already exist in the database and follow a special naming convention. For example, an entity called Item is mapped to a table called ITEMS, which has a four-letter alias ITEM and a database sequence called ITEM_ID_SEQ for unique ID generation.
I am using an annotation as a marker to hold the alias name on the entity class because we are using it for other purposes, too:
package jpa.namingsupport;
// imports omitted
@Target(TYPE)
@Retention(RUNTIME)
public @interface Alias {
    String name();
}
Entities look like this:
package jpa.entities;
// imports omitted
@Entity
@Table(name = "ITEMS")
@Alias(name = "ITEM")
public class Item {

    @Id
    private Long id;

    @Version
    private Long version;

    private String name;

    // setters and getters omitted
}
The following SessionCustomizer is registered correctly and verified to run on startup; it creates the sequences and adds them to the entities:
package jpa.namingsupport;
// imports omitted
public class AliasCustomizer implements SessionCustomizer {

    @Override
    public void customize(Session session) throws Exception {
        Map<Class, ClassDescriptor> entities = session.getDescriptors();
        for (Class entity : entities.keySet()) {
            customizeSequence(aliasNameFor(entity), entities.get(entity), session);
        }
    }

    private String aliasNameFor(Class entity) {
        Alias alias = (Alias) entity.getAnnotation(Alias.class);
        return alias.name();
    }

    private void customizeSequence(String alias, ClassDescriptor descriptor, Session session) {
        NativeSequence sequence = new NativeSequence(underscores(alias, "ID", "SEQ"), 1);
        session.getLogin().addSequence(sequence);
        descriptor.setSequenceNumberName(sequence.getName());
        descriptor.setSequenceNumberField(descriptor.getPrimaryKeyFields().get(0));
        descriptor.setSequence(sequence);
    }

    private String underscores(String... parts) {
        return StringUtils.arrayToDelimitedString(parts, "_");
    }
}
But when I run my tests, the ID is not assigned from the sequence before saving:
[EL Warning]: 2013-07-14 20:32:32.571--UnitOfWork(1908148255)--Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.5.0.v20130507-3faac2b): org.eclipse.persistence.exceptions.DatabaseException
Internal Exception: org.h2.jdbc.JdbcSQLException: NULL nicht zulässig für Feld "ITEM_ID"
NULL not allowed for column "ITEM_ID"; SQL statement:
INSERT INTO ITEMS (ITEM_NAME, ITEM_VERSION) VALUES (?, ?) [23502-172]
Any hints or ideas about what I am missing in my code? What I see is that there is no reference to the ITEM_ID column in the generated INSERT statement.

Why don't you just put @GeneratedValue(strategy=SEQUENCE, generator="ITEM_ID_SEQ") on your id?
For your customizer, don't call descriptor.setSequence(); this should be done by initialization.
The SQL expects the id to use an IDENTITY value, so you need to configure your table for this. If you want to use SEQUENCE instead, pass false into new NativeSequence(name, increment, false). H2 supports both IDENTITY and SEQUENCE; NativeSequence defaults to IDENTITY, and false means SEQUENCE.
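As a minimal sketch (assuming the three-argument NativeSequence constructor mentioned above, where the boolean selects SEQUENCE rather than IDENTITY), customizeSequence could then look like this:

    private void customizeSequence(String alias, ClassDescriptor descriptor, Session session) {
        // false => use a database SEQUENCE object rather than IDENTITY (per the answer above)
        NativeSequence sequence = new NativeSequence(underscores(alias, "ID", "SEQ"), 1, false);
        session.getLogin().addSequence(sequence);
        descriptor.setSequenceNumberName(sequence.getName());
        descriptor.setSequenceNumberField(descriptor.getPrimaryKeyFields().get(0));
        // setSequence() is omitted; EclipseLink resolves the sequence during initialization
    }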

Related

JPA - perform an insert on a Postgres table whose primary key is generated from a database trigger

I am writing an API where I am inserting a record into a table (Postgres). I was hoping to use JPA for the work. Here is the potential challenge: the primary key for the insert is generated from a database trigger, rather than from sequence count or similar. In fact, the trigger creates the primary key using the values of other fields being passed in as part of the insert. So for example,
if I have an entity class like the following:
@Entity
@Validated
@Table(name = "my_table", schema = "common")
public class MyModel {

    @Id
    @Column(name = "col_id")
    private String id;

    @Column(name = "second_col")
    private String secCol;

    @Column(name = "third_col")
    private String thirdCol;

    public MyModel() {
    }

    public MyModel(String id, String secCol, String thirdCol) {
        this.id = id;
        this.secCol = secCol;
        this.thirdCol = thirdCol;
    }
}
I would need the col_id field to somehow honor that the key is generated from the trigger, and the trigger would need to be able to read the values for second_col and third_col in order to generate the primary key. Finally, I would need the call to return the value of the primary key.
Can this be done with jpa and repository interface such as:
public interface MyRepo extends JpaRepository <MyModel, String> {
}
and then use either a default save method such as myRepo.saveAndFlush(myModel) or custom save methods? I can't find anything on using JPA with DB triggers that generate keys. If it cannot be done with JPA, I would be grateful for any alternative ideas. Thanks.
OK, I was able to get this to work. It required writing a custom query that ignores the primary key field:
public interface MyRepo extends JpaRepository<MyModel, String> {

    @Transactional
    @Modifying
    @Query(value = "INSERT INTO my_table(second_col, third_col) VALUES (:second_col, :third_col)", nativeQuery = true)
    int insertMyTable(@Param("second_col") String second_col, @Param("third_col") String third_col);
}
The model class is unchanged from above. Because it is executed as a native query, it allows Postgres to do its thing uninterrupted.
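For completeness, a usage sketch; MyModelService is a hypothetical wrapper, and the getSecCol()/getThirdCol() getters are assumed since the entity snippet above omits them:

@Service
public class MyModelService {

    private final MyRepo myRepo;

    public MyModelService(MyRepo myRepo) {
        this.myRepo = myRepo;
    }

    public void insert(MyModel model) {
        // A @Modifying native query returns the number of affected rows, not the trigger-generated key
        int rows = myRepo.insertMyTable(model.getSecCol(), model.getThirdCol());
    }
}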

Spring Data JPA : How save data into database using save() of jpaRepository

I created a Spring application. I am trying to save data into the database using the save method of JpaRepository. I am getting the error: null value in column "id" violates not-null constraint
HomeController
@RestController
public class HomeController {

    @Autowired
    public userRepository repository;

    @RequestMapping(value = "/save2", method = RequestMethod.POST)
    public String save1(@ModelAttribute user us) {
        repository.save(us);
        return "sucessfull";
    }
}
user
@Entity
@Table(name = "user", schema = "new")
public class user implements Serializable {

    private static final long serialVersionUID = -2956665320311624925L;

    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    public Integer id;

    @Column(name = "uname")
    public String uname;

    @Column(name = "pass")
    public String pass;
}
Table Script
Through Postman, I am trying to insert the following data.
I am getting this error:
org.postgresql.util.PSQLException: ERROR: null value in column "id" violates not-null constraint
Can anyone tell me what I am doing wrong in the above code?
I see a couple of issues here.
First, replace your @ModelAttribute with @RequestBody; since you're sending a JSON request, it is wise to use the latter. (Read up here and here.) In your case, the values from the request, including the id value, are not passed to the repository save method. That's the reason you're getting the not-null constraint error.
Second, since you're using the GenerationType.IDENTITY strategy, you should use the serial or bigserial type to let Postgres generate your primary key.
Read the nicely written answers on the IDENTITY strategy here.
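A minimal sketch of the adjusted controller, keeping the user entity and userRepository names from the question:

@RestController
public class HomeController {

    @Autowired
    public userRepository repository;

    @RequestMapping(value = "/save2", method = RequestMethod.POST)
    public String save1(@RequestBody user us) {
        // @RequestBody deserializes the JSON payload into the entity;
        // with IDENTITY generation, Postgres fills in the id on insert (assuming a serial/bigserial column)
        repository.save(us);
        return "successful";
    }
}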
You defined id as an Integer field in your model class. Try to pass the value in the JSON as an integer, not as a string.
{
  "id": 1,
  "uname": "abc",
  "upass": "abc"
}

Why is the same database entry represented by multiple JPA bean instances?

Today I stumbled over some unexpected behaviour of EclipseLink. (I don't know if this is bound to EclipseLink or if this is the same for all JPA providers.)
I assumed that retrievals of a managed JPA bean always return references to the same object instance when issued inside the same transaction (using the same EntityManager).
If that is right, I don't know why I receive an error when I execute the following test case:
@Test
public void test_1() {
    EntityManager em = newEntityManager();
    em.getTransaction().begin();
    // Given:
    Product prod = newProduct();
    // When:
    em.persist(prod);
    em.flush();
    Product actual =
        em.createQuery("SELECT x from Product x where x.id = "
            + prod.getId(), Product.class).getSingleResult();
    // Then:
    assertThat(actual).isSameAs(prod); // <-- FAILS
    em.getTransaction().commit();
}
The statement marked with "FAILS" throws the following AssertionError:
java.lang.AssertionError:
Expecting:
<demo.Product#35dece42>
and actual:
<demo.Product#385dfb63>
to refer to the same object
Interestingly the following slightly modified test succeeds:
@Test
public void test_2() {
    EntityManager em = newEntityManager();
    em.getTransaction().begin();
    // Given:
    Product prod = newProduct();
    // When:
    em.persist(prod);
    em.flush();
    Product actual = em.find(Product.class, prod.getId());
    // Then:
    assertThat(actual).isSameAs(prod); // <-- SUCCEEDS
    em.getTransaction().commit();
}
Obviously there is a difference between finding and querying objects.
Is that the expected behaviour? And why?
--Edit--
I think I found the source of the problem: Product has an ID of type ProductId.
Here is the relevant code:
@Entity
@Table(name = "PRODUCT")
public class Product implements Serializable {

    private static final long serialVersionUID = 1L;

    @Id
    @Column(name = "ID", nullable = false)
    @Converter(name = "productIdConverter", converterClass = ProductIdConverter.class)
    @Convert("productIdConverter")
    private ProductId id;

    @Column(name = "NAME", nullable = false)
    private String name;

    [...]
}
The #Convert and #Converter annotations are EclipseLink-specific.
Unlike JPA 2.1 Converters you may place them on ID fields.
But it seems that in certain circumstances EclipseLink has problems finding a managed bean in its session cache if that bean uses a custom type for its ID field.
I guess I have to file a bug for that.
I found the cause of the problem and a solution.
We are using a custom ID class (ProductId) for Product, together with a custom (EclipseLink-specific) converter class, ProductIdConverter, which had a bad implementation of the convertObjectValueToDataValue(...) method.
Here is the relevant code:
/**
 * Convert the object's representation of the value to the database's data representation.
 */
@Override
public final Object convertObjectValueToDataValue(Object objectValue, Session session) {
    if (objectValue == null) {
        return null;
    }
    Long longValue = ((ProductId) objectValue).getLong();
    return longValue;
}
Please note that the method returns Long instances (or null).
But since we are using Oracle as our database backend and have declared the product's ID column as NUMBER, the JDBC driver maps the column value as BigDecimal. This means we have to make sure that our convertObjectValueToDataValue(...) also returns BigDecimal instances.
So the correct implementation is:
/**
 * Convert the object's representation of the value to the database's data representation.
 */
@Override
public final Object convertObjectValueToDataValue(Object objectValue, Session session) {
    if (objectValue == null) {
        return null;
    }
    Long longValue = ((ProductId) objectValue).getLong();
    return BigDecimal.valueOf(longValue);
}
Now this method returns only BigDecimal instances.

(JDBI/Dropwizard) PSQLException when retrieving auto-incremented id from PostgreSQL

I'm trying to set up a Dropwizard project but I'm stuck. When I try to get the auto-generated id field with @GetGeneratedKeys, I'm getting the following exception:
org.postgresql.util.PSQLException: Bad value for type long : foo.
The request is a simple JSON Request
{"name":"foo"}
The INSERT into the database is successful but it seems that the statement returns the value of the name instead of the generated id. How can I solve this?
I use PostgreSQL, and the table project contains a primary key field "id" with nextval('project_id_seq'::regclass). Here are the POJO, DAO, and resource classes I use:
public class Project {

    private long id;
    private String name;

    public Project() {
        // Jackson deserialization
    }

    public Project(long id, String name) {
        this.id = id;
        this.name = name;
    }

    ...
}
@RegisterMapper(ProjectMapper.class)
public interface ProjectDAO {

    @SqlUpdate("insert into project (name) values (:name)")
    @GetGeneratedKeys
    public long insert(@Bind("name") String name);
}
#Path("/project")
#Consumes({MediaType.APPLICATION_JSON})
#Produces({MediaType.APPLICATION_JSON})
public class ProjectResource {
ProjectDAO projectDAO;
public ProjectResource(ProjectDAO personDAO) {
this.projectDAO = personDAO;
}
#POST
#Timed
public Response add(#Valid Project project) {
long newId = projectDAO.insert(project.getName());
project.setId(newId);
return Response.status(Response.Status.CREATED)
.entity(project).build();
}
}
===============
UPDATE
I just figured out that this relates to the fact that my id column isn't the first column in my table; the name column is. The problem occurs because @GetGeneratedKeys uses org.skife.jdbi.v2.sqlobject.FigureItOutResultSetMapper, which uses org.skife.jdbi.v2.PrimitivesMapperFactory, which returns org.skife.jdbi.v2.util.LongMapper.FIRST. This mapper calls
java.sql.ResultSet.getLong(1) through the method extractByIndex(...) to retrieve the generated id, which isn't the id in my case...
I'll fix the issue by reorganizing the columns in the database, but I'd like to have a robust implementation if possible: Is there a way to specify the column name of the id column when using the @GetGeneratedKeys annotation? (The org.skife.jdbi.v2.util.LongMapper class also contains a method called extractByName(...).)
This is an issue in the jdbi implementation and is fixed in a newer version as described in https://github.com/jdbi/jdbi/issues/114
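If upgrading is an option, the newer annotation reportedly lets you name the key column. A sketch under that assumption; the columnName attribute used here is inferred from the linked issue and should be verified against the Jdbi release you actually use:

@RegisterMapper(ProjectMapper.class)
public interface ProjectDAO {

    @SqlUpdate("insert into project (name) values (:name)")
    // Assumption: a Jdbi release where @GetGeneratedKeys can name the key column,
    // so the key is read from "id" rather than from column index 1
    @GetGeneratedKeys(columnName = "id")
    long insert(@Bind("name") String name);
}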

Eagerly fetching a collection of entities containing other eagerly fetched collections

I've gotten stuck on the M:N relation between an entity and strings. A user can have more than one role, and each role can be assigned to more than one user. A role is just a string. Roles are contained in a table with two columns: roleId and roleName.
I've created two entities, but I'm absolutely unable to make it work. The first entity is the user:
@Entity
@Table(name = "appUsers")
public class UserEntity {

    @Id
    private String login;

    private String password;

    @OneToMany(fetch = FetchType.EAGER, mappedBy = "user") // we always need to load the user's roles
    private Collection<UsersToRoles> roles;

    @Transient
    private Collection<String> roleNames;

    public String getLogin() {
        return login;
    }

    public String getPassword() {
        return password;
    }

    @PostLoad
    void prepareRoleNames() {
        roleNames = new HashSet<String>(roles.size());
        for (UsersToRoles mapping : roles)
            roleNames.add(mapping.getNameOfRole());
    }

    public Collection<String> getRoles() {
        return roleNames;
    }
}
The second is the entity associated with the connecting table:
@Entity
@IdClass(UsersToRolesId.class)
public class UsersToRoles {

    @Id
    @SuppressWarnings("unused")
    @Column(name = "login")
    private String login;

    @Id
    @SuppressWarnings("unused")
    @Column(name = "roleId")
    private int roleId;

    @ElementCollection(fetch = FetchType.EAGER)
    @CollectionTable(name = "userRoles", joinColumns = {@JoinColumn(name = "roleId")})
    private List<String> roleName;

    @ManyToOne
    @JoinColumn(name = "login")
    @SuppressWarnings("unused")
    private UserEntity user;

    public String getNameOfRole() {
        if (roleName.isEmpty())
            throw new CommonError("Role name for roleId=" + roleId, AppErrors.ACCESSOR_UNAVAILABLE);
        return roleName.get(0);
    }
}
class UsersToRolesId {

    private String login;
    private int roleId;

    /**
     * Implicit constructor is not public. We have to
     * declare public non-parametric constructor manually.
     */
    public UsersToRolesId() {
    }

    @Override
    public int hashCode() {
        return 17 * login.hashCode() + 37 * roleId;
    }

    @Override
    public boolean equals(Object obj) {
        if (!(obj instanceof UsersToRolesId))
            return false;
        UsersToRolesId ref = (UsersToRolesId) obj;
        return (this.login.equals(ref.login) && this.roleId == ref.roleId);
    }
}
And the problem is that the roleName collection is always null. I'm unable to get it to work. When I make a mistake in the table name in the @CollectionTable annotation, it still works. JPA does not fetch the subcollection at all. It selects from the user table joined with the UsersToRoles table, but the join to the userRoles table is missing.
Can I do that at all? Can I eagerly get a collection of entities containing other eagerly fetched collections?
Your mapping is completely wrong. UsersToRoles has a roleId column. Thus it refers to a single role. How could it have a collection of role names? The login column is mapped twice in the entity. Moreover, this looks like a simple join table to me, without any other attribute than the roleId and the login, which are foreign keys to the IDs of User and Role, respectively.
You should have two entities: User and Role, with a ManyToMany association using the UsersToRoles table as a join table. That's it. The UsersToRoles table should not be mapped as an entity: it's a pure join table.
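A minimal sketch of that mapping, assuming the column names from the question (login, roleId) and that the userRoles table holds roleId/roleName; adjust names to your actual schema:

// imports omitted (javax.persistence.*)
@Entity
@Table(name = "appUsers")
public class UserEntity {

    @Id
    private String login;

    private String password;

    // UsersToRoles is used purely as a join table here, not mapped as an entity
    @ManyToMany(fetch = FetchType.EAGER)
    @JoinTable(name = "UsersToRoles",
               joinColumns = @JoinColumn(name = "login"),
               inverseJoinColumns = @JoinColumn(name = "roleId"))
    private Collection<Role> roles;
}

@Entity
@Table(name = "userRoles")
public class Role {

    @Id
    @Column(name = "roleId")
    private int id;

    @Column(name = "roleName")
    private String name;
}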
JPA providers usually have a configuration property denoting the default eager fetch depth, e.g. hibernate.max_fetch_depth for Hibernate. Check whether you can see more when you increase it.
Also, think about your design. Fetching subcollections of a collection eagerly might be a good idea only in limited scenarios (performance-wise). When you annotate your entity like that, you're going to use eager fetching in all use cases. Perhaps you'd be better off with lazy fetching and loading the collection eagerly only where needed, with a JOIN FETCH query, as sketched below.
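For the explicit-fetch route, a hedged JPQL sketch, assuming the ManyToMany mapping above and an available EntityManager:

// Fetch a user together with its roles in one query instead of marking the association EAGER
List<UserEntity> users = em.createQuery(
        "SELECT DISTINCT u FROM UserEntity u JOIN FETCH u.roles WHERE u.login = :login",
        UserEntity.class)
    .setParameter("login", "someLogin")
    .getResultList();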