Play Framework: issues with implementing a RESTful update operation

We're creating a RESTful API based on Play Framework 2.1.x which transfers/accepts data in JSON format. Create, read and delete operations were easy to implement, but we've got stuck on the update operation.
Here are the entities we have:
Event:
@Entity
public class Event extends Model {
    @Id
    public Long id;
    @NotEmpty
    public String title;
    @OneToOne(cascade = CascadeType.ALL)
    public Location location;
    @OneToMany(cascade = CascadeType.ALL)
    public List<Stage> stages = new LinkedList<Stage>();
    ...
}
Location:
@Entity
public class Location extends Model {
    @Id
    public Long id;
    @NotEmpty
    public String title;
    public String address;
    ...
}
Stage:
@Entity
public class Stage extends Model {
    @Id
    public Long id;
    @NotEmpty
    public String title;
    public int capacity;
    ...
}
In our router we have the following entry:
PUT /events/:id controllers.Event.updateEvent(id: Long)
The updateEvent method in the controller looks the following way (note: we use the Jackson library to map objects to JSON and back):
@BodyParser.Of(BodyParser.Json.class)
public static Result updateEvent(Long id) {
    Event event = Event.find.byId(id);
    Http.RequestBody requestBody = request().body();
    JsonNode jsonNode = requestBody.asJson();
    try {
        ObjectMapper mapper = new ObjectMapper();
        ObjectReader reader = mapper.readerForUpdating(event);
        event = reader.readValue(jsonNode);
        event.save();
    } catch (IOException e) {
        e.printStackTrace();
    }
    return ok();
}
After we've got the Event from the database and updated its values by reading from JSON with ObjectReader, we try to save the updated Event and get an exception (we get a similar one when trying to update the list of Stages):
org.h2.jdbc.JdbcSQLException: Unique index or primary key violation: "PRIMARY_KEY_9F ON PUBLIC.LOCATION(ID)"; SQL statement: insert into location (id, title, address) values (?,?,?) [23505-168]
According to the H2 logs, the framework tries to perform an insert operation for the location and fails, as a location with the specified id already exists. We've investigated further and it looks like when we get the Event from the DB, the location is not joined because of lazy fetching. It looks like the problem occurs when saving other entities which our Event has relationships with. We've tried to force the fetch operation for the location by doing the following:
Event event = Ebean.find(Event.class).fetch("location").where().eq("id", id).findUnique();
but when we update this event with ObjectReader's readValue method and save the Event, we still get the same exception.
We've also tried creating a separate Event object from JSON and updating the Event from the DB field by field (we implemented the merge operation ourselves), and it worked, but it looks odd that the framework doesn't provide any means of merging and updating entities with data passed from the client.
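Roughly, the manual merge we ended up writing looks like this, simplified to the fields shown above (the method name is just for illustration; in the real code we also merge the stages list):

@BodyParser.Of(BodyParser.Json.class)
public static Result updateEventManually(Long id) {
    Event event = Event.find.byId(id);
    JsonNode jsonNode = request().body().asJson();
    try {
        // Bind the JSON to a detached Event and copy its fields over by hand
        Event fromJson = new ObjectMapper().treeToValue(jsonNode, Event.class);
        event.title = fromJson.title;
        if (event.location != null && fromJson.location != null) {
            event.location.title = fromJson.location.title;
            event.location.address = fromJson.location.address;
        }
        // ...the stages list is merged the same way...
        event.save();
    } catch (IOException e) {
        e.printStackTrace();
    }
    return ok();
}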
Could someone advise on how to solve this problem? Any example showing how to merge an entity with JSON data coming from the client and update it in storage would be highly appreciated.

You've probably already fixed the error by now, but in case this helps someone else, I'm answering it anyway.
I'm just a beginner with Play Framework as well; I only started a few days ago. But I believe that where you have this in your code:
event.save();
you should be doing instead:
event.update();
The problem here is that you're not inserting a new entity into the database, but in fact just updating the one already there, so you need to use the second method.
You can find more info about this at http://www.playframework.com/documentation/2.0/api/java/play/db/ebean/Model.html
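In other words, the original method with just that one call changed would look roughly like this (an untested sketch; everything else is kept as in the question):

@BodyParser.Of(BodyParser.Json.class)
public static Result updateEvent(Long id) {
    Event event = Event.find.byId(id);
    JsonNode jsonNode = request().body().asJson();
    try {
        ObjectMapper mapper = new ObjectMapper();
        // Bind the incoming JSON onto the entity loaded from the database
        event = mapper.readerForUpdating(event).readValue(jsonNode);
        // update() tells Ebean the row already exists, so it issues UPDATE
        // statements instead of trying to INSERT the related entities again
        event.update();
    } catch (IOException e) {
        e.printStackTrace();
    }
    return ok();
}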

Related

Spring Boot transactional test allows violating a unique constraint on update

I have a DB (Postgres) table with a unique constraint on one column. I have a test marked with the @Transactional annotation that updates that unique column's value to a non-unique value. I expect the update operation to fail, but it executes successfully. Moreover, when I get the updated object from the database (inside the same transaction), the column value is updated there.
The simplified version of JPA entity:
@Entity
@Table(name = "entities")
public class Entity {
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    @Column(name = "id")
    private Long id;
    // The unique column
    @Column(name = "name", unique = true)
    @NotNull
    private String name;
    ...
}
The simplified version for the test:
@Test
@Transactional
public void test() {
    Entity firstEntity = new Entity();
    firstEntity.setName("First Entity Name");
    // This just calls the corresponding JPA repository .save method
    entityService.create(firstEntity);

    Entity secondEntity = new Entity();
    secondEntity.setName("Second Entity Name");
    entityService.create(secondEntity);

    // Update the name to a non-unique value
    secondEntity.setName(firstEntity.getName());
    // This calls the corresponding JPA repository .save method.
    // It also catches DataIntegrityViolationException and throws
    // a more user-friendly exception instead
    entityService.update(secondEntity);
}
This code works as I expect if the @Transactional annotation is removed or the transaction is committed. I also tried calling EntityManager.flush(), as advised here, but then a ConstraintViolationException is thrown when the data is flushed, so I can't test that my entityService.update method works correctly and throws the proper exception.
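For reference, the flush variant I tried looks roughly like this (a sketch; the injected EntityManager field and the test name are assumptions, not my real code):

@PersistenceContext
private EntityManager entityManager;

@Test
@Transactional
public void testUpdateToNonUniqueName() {
    Entity firstEntity = new Entity();
    firstEntity.setName("First Entity Name");
    entityService.create(firstEntity);

    Entity secondEntity = new Entity();
    secondEntity.setName("Second Entity Name");
    entityService.create(secondEntity);

    secondEntity.setName(firstEntity.getName());
    entityService.update(secondEntity);
    // The violation only surfaces here, at flush time, so entityService.update()
    // never sees the DataIntegrityViolationException it is supposed to translate
    entityManager.flush();
}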
Please also note that if I try to create a new entry with non-unique data in the transactional test (not update), the test works as expected: a DataIntegrityViolationException is thrown when the non-unique entity is created.
Could somebody clarify whether it is possible to make the update scenario work as expected while keeping the test transactional, so I don't need to care about data clean-up?

Get only selected data from JPA query

I couldn't find any nice solution to get only selected data from the domain.
E.g. I have the class:
@Entity
public class Reservation {
    // private Integer RESERVATION_ID;
    // private Integer id;
    private long code;
    private Date date;
    private Client reservationClient;
    private WashType reservationWashType;
    private Vehicle reservationVehicle;
    private Wash reservationWash;
    private Worker reservationWorkerPesel;
    private Review reservationReview;
    private ReservationReminder reservationReminder;
}
And I have a simple query repository:
public interface ReservationRepository extends JpaRepository<Reservation, Long> {
    Reservation findByCode(long code);
}
I'd like to get the Reservation object from that query, but without the data from classes like Review and Worker.
So my result should look like a whole Reservation object which includes:
code, date, Client reservationClient, WashType reservationWashType, Vehicle reservationVehicle, Wash reservationWash, ReservationReminder reservationReminder
Is it possible to exclude them in a nice way? If not, how can I manage it?
Yes, you can easily do that so long as Review and Worker are marked to be lazily loaded.
What I mean is:
@ManyToOne(fetch = FetchType.LAZY)
private Review review;
Hibernate won't attempt to load the Review association until you call #getReview().
For situations where you want your Reservation with the Review, you would then just need to specify at query time that you want the relationship join-fetched.
#Query("SELECT r FROM Reservation r JOIN FETCH r.review WHERE r.code = :code")
List<Reservation> findByCode(Long code);
Remember, if Review cannot be null, make sure that @ManyToOne has the optional=false attribute so that when the join gets generated, it uses an inner join rather than an outer join to avoid performance overhead.
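For example, the lazy mapping above with that attribute set would be (assuming Review really is mandatory in your model):

@ManyToOne(fetch = FetchType.LAZY, optional = false)
private Review review;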

Spring Data Elasticsearch findAll with OrderBy

I am using Spring Data's Elasticsearch module, but I am having trouble building a query. It is a very easy query, though.
My document looks as follows:
@Document(indexName = "triber-sensor", type = "event")
public class EventDocument implements Event {
    @Id
    private String id;
    @Field(type = FieldType.String)
    private EventMode eventMode;
    @Field(type = FieldType.String)
    private EventSubject eventSubject;
    @Field(type = FieldType.String)
    private String eventId;
    @Field(type = FieldType.Date)
    private Date creationDate;
}
And the spring data repository looks like:
public interface EventJpaRepository extends ElasticsearchRepository<EventDocument, String> {
    List<EventDocument> findAllOrderByCreationDateDesc(Pageable pageable);
}
So I am trying to get all events ordered by creationDate, with the newest event first. However, when I run the code I get an exception (also shown in STS):
Caused by: org.springframework.data.mapping.PropertyReferenceException: No property desc found for type Date! Traversed path: EventDocument.creationDate.
So it seems that it is not picking up the 'OrderBy' part? However, a query with a findBy clause (e.g. findByCreationDateOrderByCreationDateDesc) seems to be okay. A findAll without ordering also works.
Does this mean that the Elasticsearch module of Spring Data doesn't allow findAll with ordering?
Try adding By to the method name:
findAllByOrderByCreationDateDesc
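Applied to the repository in the question, only the method name changes:

public interface EventJpaRepository extends ElasticsearchRepository<EventDocument, String> {
    List<EventDocument> findAllByOrderByCreationDateDesc(Pageable pageable);
}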

How to use composite entity in spring-data-cassandra?

I am setting up spring-data-cassandra for the first time and have a class like so:
#Table(value = "contact")
public class Contact {
#Id
UUID id;
...
Location Location;
...
public void setLocation(Location location) {
this.location = location;
}
public Location getLocation() {
return location;
}
}
This gives me an error when starting up:
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'mySQLTransactionRepository': Invocation of init method failed; nested exception is org.springframework.data.cassandra.mapping.VerifierMappingExceptions: com.foo.backend.core.Location:
Cassandra entities must have the #Table, #Persistent or #PrimaryKeyClass Annotation
....
Coming from a spring-data-jpa background, simply annotating Location with @Embeddable has previously been enough. It looks like this doesn't work with spring-data-cassandra. How do I use compound entities with spring-data-cassandra?
Will I have to annotate location as @Transient and do some serialization myself, as sketched below? I tried annotating my class with @Persistent but got an error about a PrimaryKey missing on Location. I can't comprehend why a primary key would be necessary...
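Roughly, the workaround I have in mind looks like this (just a sketch; the toJson/fromJson helpers are hypothetical placeholders for whatever serialization I'd end up using):

@Table(value = "contact")
public class Contact {
    @Id
    UUID id;

    // Stored as a plain text column; Location itself is never mapped by Cassandra
    String locationJson;

    @Transient
    Location location;

    public void setLocation(Location location) {
        this.location = location;
        this.locationJson = toJson(location);   // hypothetical serialization helper
    }

    public Location getLocation() {
        if (location == null && locationJson != null) {
            location = fromJson(locationJson);  // hypothetical deserialization helper
        }
        return location;
    }
}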
Because of the non-relational details of Cassandra, you are going to find it doesn't work like JPA.
There are no joins in Cassandra, so embedding another table as an attribute of a table is not allowed.
Embeddable types are not supported at this time. If you would like to elaborate on the feature request, please create a Jira for SDC*.
Thanks.

EclipseLink - @CascadeOnDelete doesn't work with @Customizer

I have two entities. The "Price" class has "CalculatedValue" stored as a SortedMap field.
In order to support the sorted map I wrote a customizer. After that, it seems @CascadeOnDelete is not working. If I remove a CalculatedValue instance from the map and then save the "Price", EclipseLink only updates the priceId column to NULL in the calculatedValues table...
I really want to keep the SortedMap. It helps to avoid lots of routine work for value access at the Java level.
Also, there is no back-reference (ManyToOne) defined in the CalculatedValue class; it will never be required for the application logic, so I wanted to keep it just one way.
Any ideas what the best way to resolve this issue is? I actually have lots of other dependencies like this, and pretty much everything is a OneToMany relation with values stored in a sorted map.
Price.java:
@Entity
@Table(uniqueConstraints={
    @UniqueConstraint(columnNames={"symbol", "datestring", "timestring"})
})
@Customizer(CustomDescriptorCustomizer.class)
public class Price extends CommonWithDate
{
    ...
    @CascadeOnDelete
    @OneToMany(cascade = CascadeType.ALL, fetch = FetchType.EAGER)
    @MapKeyColumn(name="key")
    @JoinColumn(name = "priceId")
    private Map<String, CalculatedValue> calculatedValues =
        new TreeMap<String, CalculatedValue>();
    ...
}

public class CustomDescriptorCustomizer implements DescriptorCustomizer
{
    @Override
    public void customize(ClassDescriptor descriptor) throws Exception
    {
        DatabaseMapping jpaMapping = descriptor.getMappingByAttribute("calculatedValues");
        ((ContainerMapping) jpaMapping).useMapClass(TreeMap.class, methodName);
    }
}
Your customizer should have no effect on this. It could be because you are using a @JoinColumn instead of a mappedBy, which should normally be used in a @OneToMany.
You can check the mapping in your customizer using isCascadeOnDeleteSetOnDatabase(), or set it using mapping.setIsCascadeOnDeleteSetOnDatabase(true).
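A rough sketch of that check, layered on top of the existing customizer (it assumes the "calculatedValues" mapping is a OneToManyMapping, which I haven't verified against your model):

import org.eclipse.persistence.config.DescriptorCustomizer;
import org.eclipse.persistence.descriptors.ClassDescriptor;
import org.eclipse.persistence.mappings.DatabaseMapping;
import org.eclipse.persistence.mappings.OneToManyMapping;

public class CustomDescriptorCustomizer implements DescriptorCustomizer
{
    @Override
    public void customize(ClassDescriptor descriptor) throws Exception
    {
        DatabaseMapping jpaMapping = descriptor.getMappingByAttribute("calculatedValues");
        // ...the existing useMapClass(...) call stays here...
        if (jpaMapping instanceof OneToManyMapping) {
            OneToManyMapping otm = (OneToManyMapping) jpaMapping;
            // Re-apply cascade-on-delete in case the customized mapping lost
            // the setting derived from @CascadeOnDelete
            if (!otm.isCascadeOnDeleteSetOnDatabase()) {
                otm.setIsCascadeOnDeleteSetOnDatabase(true);
            }
        }
    }
}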