Spring Gemfire entity class id generation - spring-data

Is it possible to use an auto-generated ID in Spring Data GemFire?
For example, if I have a class called MyGemfire:
@Region("myregion")
class MyGemfire {
    @Id
    @GeneratedValue // ???? if this is not possible, what method do I have to use to generate the id in an auto-increment fashion?
    Long id;
    String name;
    ...
}

From a quick look at SimpleGemfireRepository it doesn't look like the repository is generating an ID:
@Override
public <U extends T> U save(U entity) {
    ID id = entityInformation.getId(entity).orElseThrow(
        () -> newIllegalArgumentException("ID for entity [%s] is required", entity));
    template.put(id, entity);
    return entity;
}
Also, this question and its answer suggest there is no ID generation in Gemfire itself.
So what you should do is create the ID yourself. For example, it should be possible to have two constructors, one taking an ID and the other not taking an ID but generating it. A UUID would be the obvious choice. If you are bound to Long values, you probably have to roll your own algorithm.
To make it obvious to Spring Data which constructor to use when loading instances, you can use the @PersistenceConstructor annotation.
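A minimal sketch of what that could look like (here the id is switched to a String so a UUID can be used directly; the import path for @Region may differ between Spring Data GemFire versions):
import java.util.UUID;

import org.springframework.data.annotation.Id;
import org.springframework.data.annotation.PersistenceConstructor;
import org.springframework.data.gemfire.mapping.annotation.Region;

@Region("myregion")
class MyGemfire {

    @Id
    private final String id;

    private final String name;

    // convenience constructor for new, unsaved instances: generates the id itself
    MyGemfire(String name) {
        this(UUID.randomUUID().toString(), name);
    }

    // used by Spring Data when materializing instances loaded from the region
    @PersistenceConstructor
    MyGemfire(String id, String name) {
        this.id = id;
        this.name = name;
    }

    String getId() { return id; }

    String getName() { return name; }
}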


Kotlin JPA entity ID

What is "the kotlin way" to define JPA entity ID?
@Entity
data class User(
    @Id @GeneratedValue
    var id: Long? = null,
    ...
)
Or is there any better way to avoid a nullable id?
You can use a 0 value rather than a null value.
@Entity
data class User(
    @Id @GeneratedValue
    var id: Long = 0,
    ...
)
Autogeneration should still find the next sequence.
Kotlin compiles to JVM bytecode, and the JVM has both a primitive type long and a class java.lang.Long.
As per the Java Persistence specification, section 11.1.21 (Id Annotation), both can be used for the ID:
The field or property to which the Id annotation is applied should be one of the following types: any Java primitive type; any primitive wrapper type; java.lang.String; java.util.Date; java.sql.Date; java.math.BigDecimal; java.math.BigInteger.
There is an advantage in using the class over the primitive, as null has an unambiguous meaning. But per the spec both are possible, and you have to decide whether you favor Kotlin's null safety over the JPA style or the other way around.
Usually, a data class is useful for returning more than one result from a method, but not for entities (just my opinion).
Sometimes it is not a good idea to set a default value for the id field.
After some time experimenting with Kotlin and entities (actually, with documents for MongoDB, but they have an id anyway),
it looks like the better way is to use a lateinit var. You can create a top class of the entity hierarchy:
open class Identifiable {
    lateinit var id: String // or UUID; lateinit is not allowed on primitive-backed types such as Long
    // explicitly define equals & hashCode here
}
But be careful with equals, hashCode and, if you want to provide it, toString in subclasses; it is a good idea to provide an extra nullable field, something like:
open class Identifiable {
    lateinit var id: String // or UUID
    val nullableId: String?
        get() {
            return if (this::id.isInitialized) id else null
        }
    // explicitly define equals & hashCode here with nullableId
}
class User : Identifiable() {
    override fun toString() = "User(id=${nullableId})"
}
In this case, you will avoid an exception when you try to log an entity that has been created but not yet saved to the DB.

How to query using fields of subclasses for Spring data repository

Here are my entity classes:
public class User {
    @Id
    UserIdentifier userIdentifier;
    String name;
}
public class UserIdentifier {
    String ssn;
    String id;
}
Here is what I am trying to do:
public interface UserRepository extends MongoRepository<User, UserIdentifier> {
    User findBySsn(String ssn);
}
I get an exception message (runtime) saying:
No property ssn found on User!
How can I implement/declare such a query?
According to the Spring Data Repositories reference:
Property expressions can refer only to a direct property of the managed entity, as shown in the preceding example. At query creation time you already make sure that the parsed property is a property of the managed domain class. However, you can also define constraints by traversing nested properties.
So, instead of
User findBySsn(String ssn);
the following worked (in my example):
User findByUserIdentifierSsn(String ssn);
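As a side note, Spring Data also lets you mark the traversal point explicitly with an underscore, which can help when nested property names are ambiguous (a sketch):
public interface UserRepository extends MongoRepository<User, UserIdentifier> {

    // traverses the nested property userIdentifier.ssn; the underscore marks the split point explicitly
    User findByUserIdentifier_Ssn(String ssn);
}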

Spring Data JPA JPQL queries on parent interface

Say I have a @MappedSuperclass like this:
@MappedSuperclass
public abstract class Rating
{
    @Id
    private Long id;
    @Column(name="USER_ID")
    private Long userId;
    private int rating;
    ...
With a concrete child entity like this:
@Entity
@Table(name="ACTIVITY_RATING")
public class ActivityRating extends Rating
{
    private Long activitySpecificData;
    ...
Then there is a Spring Data JPA repository like this:
@NoRepositoryBean
public interface RatingRepository<R extends Rating> extends JpaRepository<R, Long>
{
    public List<R> findByUserId(Long userId);
    ...
and this:
public interface ActivityRatingRepository extends RatingRepository<ActivityRating>
{
}
This all works great, and I can call findByUserId() on any of the specific rating repositories that extend RatingRepository. I am now wanting to write some JPQL in the RatingRepository that all the child interfaces can inherit. I just don't know what to put after the FROM in the query (or whether it's even possible). For example:
#Query("SELECT NEW com.foo.RatingCountVo(e.rating, COUNT(e.rating)) FROM ??????? e GROUP BY e.rating")
public List<RatingCountVo> getRatingCounts();
I can add this method to each of the individual repositories that extend RatingRepository but everything would be exactly the same except for the specific entity name. If I want to change the query, I'd then have to go to all the child repositories and update them individually. I really want the query to live in the parent class and not be duplicated. Is there any way to accomplish this?
I'm currently using spring-data-jpa 1.7.2 and eclipselink 2.5.2. I'm not necessarily opposed to switching to newer versions if necessary.
Will it work if you split the query into three parts: the start, the entity and the end of the query? Then, if that works, in each interface you define a constant like
String ENTITY = "ActivityRating";
And then you can use it like
@Query(RatingRepository.QUERY_START + ENTITY + RatingRepository.QUERY_END)
List<RatingCountVo> getRatingCounts();
BTW, there is no need to define the public modifier in an interface.
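A minimal sketch of that constant-splitting idea (each interface in its own file; annotation values must be compile-time constants, which interface String constants satisfy):
@NoRepositoryBean
public interface RatingRepository<R extends Rating> extends JpaRepository<R, Long> {

    String QUERY_START = "SELECT NEW com.foo.RatingCountVo(e.rating, COUNT(e.rating)) FROM ";
    String QUERY_END = " e GROUP BY e.rating";

    List<R> findByUserId(Long userId);
}

public interface ActivityRatingRepository extends RatingRepository<ActivityRating> {

    String ENTITY = "ActivityRating";

    @Query(RatingRepository.QUERY_START + ENTITY + RatingRepository.QUERY_END)
    List<RatingCountVo> getRatingCounts();
}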
UPDATE: another way is described here:
@Query("SELECT NEW com.foo.RatingCountVo(e.rating, COUNT(e.rating)) FROM #{#entityName} e GROUP BY e.rating")

Play framework: issues with implementing restful update operation

We're creating a RESTful API based on Play Framework 2.1.x which transfers/accepts data in JSON format. Create, read and delete operations were easy to implement, but we've got stuck with the update operation.
Here are the entities we have:
Event:
@Entity
public class Event extends Model {
    @Id
    public Long id;
    @NotEmpty
    public String title;
    @OneToOne(cascade = CascadeType.ALL)
    public Location location;
    @OneToMany(cascade = CascadeType.ALL)
    public List<Stage> stages = new LinkedList<Stage>();
    ...
}
Location:
@Entity
public class Location extends Model {
    @Id
    public Long id;
    @NotEmpty
    public String title;
    public String address;
    ...
}
Stage:
@Entity
public class Stage extends Model {
    @Id
    public Long id;
    @NotEmpty
    public String title;
    public int capacity;
    ...
}
In our router we have following entry:
PUT /events/:id controllers.Event.updateEvent(id: Long)
The updateEvent method in the controller looks the following way (note: we use the Jackson library to map objects to JSON and back):
@BodyParser.Of(BodyParser.Json.class)
public static Result updateEvent(Long id) {
    Event event = Event.find.byId(id);
    Http.RequestBody requestBody = request().body();
    JsonNode jsonNode = requestBody.asJson();
    try {
        ObjectMapper mapper = new ObjectMapper();
        ObjectReader reader = mapper.readerForUpdating(event);
        event = reader.readValue(jsonNode);
        event.save();
    } catch (IOException e) {
        e.printStackTrace();
    }
    return ok();
}
After we've got the Event from the database and updated its values by reading from JSON with ObjectReader, we try to save the updated Event and get an exception (we get a similar one when trying to update the list of Stages):
org.h2.jdbc.JdbcSQLException: Unique index or primary key violation: "PRIMARY_KEY_9F ON PUBLIC.LOCATION(ID)"; SQL statement: insert into location (id, title, address) values (?,?,?) [23505-168]
According to the H2 logs, the framework tries to perform an insert operation for the location and fails, as a location with the specified id already exists. We've investigated further and it looks like when we get the Event from the DB, the location is not joined because of lazy fetching. It looks like the problem occurs when saving other entities which our Event has relationships with. We've tried to force the fetch operation for the location by doing the following:
Event event = Ebean.find(Event.class).fetch("location").where().eq("id", id).findUnique();
but still, when we update this event with ObjectReader's readValue method and save the Event, we get the same exception.
We've also tried to create a separate Event object from JSON and update the Event from the DB field by field (implementing the merge operation ourselves), and it worked, but it looks odd that the framework doesn't provide any means of merging and updating entities with data passed from the client.
Could someone advise on how to solve this problem? Any example showing how to merge an entity with JSON data coming from the client and update it in storage would be highly appreciated.
You've probably already fixed the error by now, but in case this helps someone else, I'm answering it anyway.
I'm just a beginner with Play Framework as well; I only started a few days ago. But I believe that where you have this in your code:
event.save();
you should be doing instead:
event.update();
The problem here is that you're not inserting a new entity into the database, but in fact just updating the one already there, so you need to use the second method.
You can find more info about this at http://www.playframework.com/documentation/2.0/api/java/play/db/ebean/Model.html
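The relevant part of the updateEvent method from the question would then become something like this (a sketch based on the code above):
try {
    ObjectMapper mapper = new ObjectMapper();
    ObjectReader reader = mapper.readerForUpdating(event);
    event = reader.readValue(jsonNode);
    event.update(); // issues an UPDATE for the existing row instead of trying to INSERT it again
} catch (IOException e) {
    e.printStackTrace();
}
return ok();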

MyBatis: How to map "inverse" relationship?

My problem is to persist two classes that have a 1:n relationship:
public class DayRecord {
    private Long id;
    private List<TimeRecord> timeRecordsToday = new ArrayList<TimeRecord>(4);
    ...
}
public class TimeRecord {
    private Long id;
    ...
}
So, in code, DayRecord knows TimeRecord.
create table DAY_RECORDS (
    id int primary key
);
create table TIME_RECORDS (
    id int primary key,
    day_record_id int not null,
    foreign key (day_record_id) references DAY_RECORDS (id)
);
In the database, TimeRecord knows DayRecord.
Can I save a DayRecord with all its TimeRecords in one step?
In Hibernate, I can set an inverse mapping and just save a DayRecord, and all its TimeRecords will get saved, too. With MyBatis, I tried to save the classes independently of each other:
<mapper namespace="de.stevenschwenke.java.javafx.xyz.DayRecordMapper">
    <insert id="insertDayRecord"
        parameterType="de.stevenschwenke.java.javafx.xyz.DayRecord">
        insert into DAY_RECORDS (id) values (NEXT VALUE FOR DAY_RECORDS_SEQ);
    </insert>
</mapper>
<mapper namespace="de.stevenschwenke.java.javafx.xyz.TimeRecordMapper">
    <insert id="insertTimeRecord"
        parameterType="de.stevenschwenke.java.javafx.xyz.TimeRecord">
        insert into TIME_RECORDS (id) values (NEXT VALUE FOR TIME_RECORDS_SEQ);
    </insert>
</mapper>
But how can I save the DayRecord ID in TimeRecord?
Ideas:
Give TimeRecord an attribute dayRecordId. This way, a cyclic dependency would be created. However, the mapping would take care of the dependency while saving.
In one transaction, save the DayRecord first, get its ID, set it in TimeRecords and save this object.
Use a nested select statement within the insert, like in the documentation.
What is the best way to save both objects? Thanks for your help!
As jdevelop already mentioned, MyBatis is just a SQL wrapper. Because SQL doesn't offer a way to insert two objects that have a relationship, MyBatis can't do that either.
So here's my workaround: As I mentioned, I don't want to add a circular dependency by letting TimeRecord know about DayRecord. So I created a wrapper class just for inserting TimeRecords:
public class TimeRecordInsertWrapper {
    public Long id;
    public int hours;
    public long dayRecordId;
    [constructor/getter/setter omitted but present, with public access modifier]
}
First, I store the DayRecord and get its ID. Then I create the wrapper object and store the TimeRecords:
public long insertDayRecord(DayRecord newRecord) {
    SqlSession session = sqlSessionFactory.openSession();
    try {
        session.insert(
            "de.stevenschwenke.java.javafx.xyz.DayRecordMapper.insertDayRecord",
            newRecord);
        for (TimeRecord tr : newRecord.getTimeRecordsToday()) {
            TimeRecordInsertWrapper wrapper = new TimeRecordInsertWrapper(tr.getHours(), newRecord.getId());
            session.insert("de.stevenschwenke.java.javafx.xyz.TimeRecordMapper.insertTimeRecord",
                wrapper);
        }
        return newRecord.getId();
    } finally {
        session.commit();
        session.close();
    }
}
This way, I can use my nice one-way object model AND have the "right" mapping in the database.
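For completeness, the insertTimeRecord mapping then has to write the foreign key column as well; a sketch of how that might look with the wrapper as the parameter (only the columns from the DDL above are shown):
<mapper namespace="de.stevenschwenke.java.javafx.xyz.TimeRecordMapper">
    <!-- additional columns such as hours would be listed alongside day_record_id -->
    <insert id="insertTimeRecord"
        parameterType="de.stevenschwenke.java.javafx.xyz.TimeRecordInsertWrapper">
        insert into TIME_RECORDS (id, day_record_id)
        values (NEXT VALUE FOR TIME_RECORDS_SEQ, #{dayRecordId});
    </insert>
</mapper>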
MyBatis is just a SQL mapping framework; it allows you to abstract SQL code from Java code, and that's it, more or less. They are trying to look like Hibernate with recent versions, but this leads to weird constructions in XML.
I would suggest storing the DayRecord and getting its ID from selectKey, then using that ID in subsequent calls to the mapper. This is what actually happens inside the mapper anyway, but complex XML implies a complex FSM built inside it. So keep it simple and you're safe with MyBatis, or use Hibernate.
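A sketch of that selectKey idea for the DayRecord insert (assuming the database understands SELECT NEXT VALUE FOR ..., as in the DDL above); after the insert, the generated value is available on the DayRecord's id property and can be passed on to the TimeRecord inserts:
<insert id="insertDayRecord"
    parameterType="de.stevenschwenke.java.javafx.xyz.DayRecord">
    <selectKey keyProperty="id" resultType="long" order="BEFORE">
        select next value for DAY_RECORDS_SEQ
    </selectKey>
    insert into DAY_RECORDS (id) values (#{id});
</insert>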
What is even better, you can define custom DAO interfaces for the tasks, and then you can have some sort of service layer with the @Transactional attribute set. This requires mybatis-guice, but it works really great and you don't need to deal with transactions in your code (they are declarative).