I am adding some tests to my Spring Boot 2.4 application, which works well in production.
In one of my @SpringBootTest classes, I call the API (using MockMvc) and compare the result with what I have in the DB.
@SpringBootTest
@AutoConfigureMockMvc
@ActiveProfiles("test")
class TicketIT {

    @Autowired
    private MockMvc mockMvc;

    @Autowired
    private ObjectMapper objectMapper;

    @Autowired
    private TicketTypeRepository ticketTypeRepository;

    @Test
    void shouldReturnListOfTicketTypes() throws Exception {
        RequestBuilder request =
            MockMvcRequestBuilders.get(RESOURCE_BASE_URL + "/types").contentType(APPLICATION_JSON);

        String responseAsString =
            mockMvc
                .perform(request)
                .andExpect(status().isOk())
                .andReturn()
                .getResponse()
                .getContentAsString();

        List<TicketTypesRepresentation> ticketTypes =
            objectMapper.readValue(
                responseAsString, new TypeReference<List<TicketTypesRepresentation>>() {});

        assertThat(ticketTypes).hasSameSizeAs(ticketTypeRepository.findAll());
    }
}
I have the feeling I've written this type of test hundreds of times, but on this one I am facing a problem: my application is configured correctly, because I receive a list of items in the API response.
However, what I find strange is that I get an exception from the ticketTypeRepository.findAll() call:
failed to lazily initialize a collection of role ... could not initialize proxy - no Session
I understand the issue, and I can fix it either by making the relation eager (with @Fetch(FetchMode.JOIN) on the entity) or by making the test @Transactional, but I am not sure I like either of those options.
I don't remember facing this issue in other Spring Boot tests, so I am a bit puzzled.
Am I missing something to make sure that all the calls made to ticketTypeRepository are made within a transaction? TicketTypeRepository is a wrapper around a CrudRepository; is that the reason why it doesn't work directly?
Here's the repository and entity code:
public class JpaTicketTypeRepository implements TicketTypeRepository {

    public List<TicketType> findAll() {
        var allTicketTypesEntity = jpaTicketTypesEntityRepository.findAll();
        return StreamSupport.stream(allTicketTypesEntity.spliterator(), false)
            .map(TicketTypeEntity::toTicketTypeList)
            .collect(Collectors.toList())
            .stream()
            .flatMap(List::stream)
            .collect(Collectors.toList());
    }
}
and the entity (simplified):
@Table(name = "TICKET_TYPES")
@Entity
@Slf4j
public class TicketTypeEntity {

    @Id
    private Long id;

    @OneToMany
    @JoinTable(name = "TICKET_TYPES_GROUPS",
        joinColumns = {@JoinColumn(name = "TICKET_TYPE_ID", referencedColumnName = "ID")},
        inverseJoinColumns = {@JoinColumn(name = "TICKET_GROUP_ID", referencedColumnName = "ID")})
    @Nonnull
    private List<TicketGroupsEntity> ticketGroupsEntity;

    @Nonnull
    public List<TicketType> toTicketTypeList() {
        log.info("calling toTicketTypeList for id " + id);
        log.info(" with size : " + ticketGroupsEntity.size());
        return ticketGroupsEntity.stream()
            .map(group -> TicketType.builder()
                .id(id)
                .build())
            .collect(Collectors.toList());
    }
}
The exception happens the first time size() is called on the collection:

org.hibernate.LazyInitializationException: failed to lazily initialize a collection of role:
my.service.database.entities.TicketTypeEntity.ticketGroupsEntity, could not initialize proxy - no Session
    at org.hibernate.collection.internal.AbstractPersistentCollection.throwLazyInitializationException(AbstractPersistentCollection.java:606)
    at org.hibernate.collection.internal.AbstractPersistentCollection.withTemporarySessionIfNeeded(AbstractPersistentCollection.java:218)
    at org.hibernate.collection.internal.AbstractPersistentCollection.readSize(AbstractPersistentCollection.java:162)
    at org.hibernate.collection.internal.PersistentBag.size(PersistentBag.java:371)
    at my.service.database.entities.TicketTypeEntity.toTicketTypeList(TicketTypeEntity.java:78)
    at java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:195)
    at java.base/java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1655)
    at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484)
    at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474)
    at java.base/java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:913)
    at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
    at java.base/java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:578)
    at my.service.database.JpaTicketTypeRepository.findAll(JpaTicketTypeRepository.java:29)
I believe you are misinterpreting the stack trace. The problem is not in calling size() on the result of findAll(), but in the findAll method itself.
In findAll you call TicketTypeEntity.toTicketTypeList, which converts the DB entity into a DTO. This method touches ticketGroupsEntity, which is a lazy collection.
The code fails in the test but runs when accessed through the Spring controller.
This is due to Open Session In View, which is enabled by default.
See:
A Guide to Spring's Open Session In View
The Open Session In View Anti-Pattern
You could solve it in multiple ways:
a @Transactional findAll (be aware of lazy loading issues)
an explicit fetch in the query (see the sketch below)
an entity graph
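For example, a minimal sketch of the "explicit fetch" option, assuming the Spring Data repository behind JpaTicketTypeRepository is an interface named JpaTicketTypesEntityRepository (the name and method signature are guesses based on the field used in the question):

public interface JpaTicketTypesEntityRepository extends CrudRepository<TicketTypeEntity, Long> {

    // The fetch join initializes ticketGroupsEntity for this query only,
    // so toTicketTypeList() can safely touch it after the session is gone.
    @Query("select distinct t from TicketTypeEntity t left join fetch t.ticketGroupsEntity")
    List<TicketTypeEntity> findAllWithGroups();
}

Alternatively, putting @Transactional on JpaTicketTypeRepository.findAll() keeps a session open while the mapping to DTOs touches the lazy collection.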
But to my eyes your entity mapping looks suspicious: you seem to have all the data needed to construct TicketType in TicketGroupsEntity. Maybe you could query that entity instead?
Related
I have a bare-bones Spring Boot app with some GraphQL endpoints and a Postgres database, and I want to run an integration test against an endpoint. It should find an entity by its ID, and it does so without a problem when I send a request manually via Postman. However, when I write an integration test for the controller, it doesn't. The data seems to be saved after using
TestEntityManager (or the JpaRepository directly) and I get the entity back with its ID. I then stick that ID into a query with HttpGraphQlTester, which fails with an empty result/null. I traced it with the debugger and discovered that when the endpoint calls the repository to retrieve the entity with the given ID, it gets null, and when I look at all the repo contents it's just an empty list. So my data seems to be accessible in my test but not in my repo/service. Any pointers would be very much appreciated.
Test
@SpringBootTest
@AutoConfigureHttpGraphQlTester
@AutoConfigureTestEntityManager
@Transactional
public class BackboneTreeControllerTest {

    @Autowired
    HttpGraphQlTester tester;

    @Autowired
    private TestEntityManager testEntityManager;

    @Test
    void findTaxon() {
        Taxon taxon = Taxon.builder()
                .path(Arrays.asList("path", "to", "taxon"))
                .nameCanonical("Cocos nucifera")
                .authorship("Me")
                .extinct(false)
                .numDescendants(1L)
                .numOccurrences(1L)
                .build();

        Taxon savedTaxon = testEntityManager.persistFlushFind(taxon); // (1)

        this.tester.documentName("queries")
                .operationName("FindTaxon")
                .variable("taxonId", savedTaxon.getId())
                .execute()
                .path("findTaxon.authorship")
                .entity(String.class)
                .isEqualTo("Me");
    }
}
(1) The testEntityManager returns successfully with an ID.
Query
query FindTaxon($taxonId: ID!) {
findTaxon(id: $taxonId) {
authorship
}
}
Controller
@Controller
@AllArgsConstructor
public class BackboneTreeController {

    private final TaxonService taxonService;

    @QueryMapping
    public Taxon findTaxon(@Argument Integer id) {
        Optional<Taxon> taxon = taxonService.findTaxon(id);
        return taxon.orElse(null);
    }
}
Service
@Service
@AllArgsConstructor
public class TaxonService {

    private final TaxonRepository taxonRepository;

    public Optional<Taxon> findTaxon(Integer id) {
        return taxonRepository.findById(id); // (2)
    }
}
(2) This is where I would expect the repo to return the entity, but it does not. Using .findAll() here also returns an empty list.
Repository
@Repository
public interface TaxonRepository extends JpaRepository<Taxon, Integer> {
}
Note that everything works fine when I just run the app and send the exact same query manually!
I don't know HttpGraphQlTester, but I'd assume that it generates HTTP requests which then get processed in a separate thread.
That thread won't see the changes made in the test because they aren't committed yet (the test-managed @Transactional transaction is still open and will be rolled back at the end of the test).
If this is the reason, resolve it by putting the setup in its own transaction, for example by using TransactionTemplate.
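A minimal sketch of that idea, assuming the class-level @Transactional is removed so the setup data really gets committed (PlatformTransactionManager and TransactionTemplate are standard Spring classes; the other names reuse those from the question):

@Autowired
private PlatformTransactionManager transactionManager;

@Autowired
private TaxonRepository taxonRepository;

private Taxon savedTaxon;

@BeforeEach
void insertTestData() {
    // Commit the test data in its own transaction so the thread serving
    // the HttpGraphQlTester request can actually see the row.
    savedTaxon = new TransactionTemplate(transactionManager)
            .execute(status -> taxonRepository.save(Taxon.builder()
                    .nameCanonical("Cocos nucifera")
                    .authorship("Me")
                    .build()));
}

@AfterEach
void cleanUp() {
    // The data was really committed, so it has to be removed explicitly.
    new TransactionTemplate(transactionManager)
            .execute(status -> { taxonRepository.deleteAll(); return null; });
}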
Before updating an entity in my Jakarta EE application running on GlassFish 5.1.0 with EclipseLink 2.7.4 and Derby 10.14.2.0, I compare the updated entity to the saved entity and document the changes. I noticed recently that my compare code was not working with @OneToMany relationship properties and @ElementCollection properties, and I tracked the problem down to lazy loading of the @OneToMany and @ElementCollection properties. I was able to resolve the issue using the fetch attribute as follows:
Fetch Eager Entity
@Entity
public class Container implements Serializable {

    @OneToMany(mappedBy = "container", fetch = FetchType.EAGER)
    private List<AssetSerial> assets;

    @ElementCollection(fetch = FetchType.EAGER)
    private List<Reference> references;
I wasn't entirely happy with this solution, because I assumed that the developers defaulted these relationship types to lazy loading for a reason, so I continued researching and was excited to find many references to JPA entity graphs. I immediately created the following code to force EclipseLink to initialize my lazy loading properties before documenting the entity changes.
Entity Graph Entity
@Entity
@XmlRootElement
@XmlAccessorType(XmlAccessType.FIELD)
@NamedEntityGraph(
    name = "Container.eager",
    attributeNodes = {
        @NamedAttributeNode("assets"),
        @NamedAttributeNode("references") })
public class Container implements Serializable {
Entity Manager Initialization
@PersistenceContext(unitName = "MYPU")
private EntityManager em;

Find Method that only Works once per Entity
Map<String, Object> props = new HashMap<String, Object>();
props.put("javax.persistence.loadgraph", em.getEntityGraph("Container.eager"));
Container managedContainer = em.find(Container.class, updatedContainer.getId(), props);

PersistenceUnitUtil tester = em.getEntityManagerFactory().getPersistenceUnitUtil();
logger.debug("Assets: {}", tester.isLoaded(managedContainer, "assets"));
logger.debug("References: {}", tester.isLoaded(managedContainer, "references"));
Unfortunately, the isLoaded test methods only return true the first time I call the find method on a specific entity. The second and subsequent times, isLoaded returns false. I struggled with this for many hours and determined that the EclipseLink shared cache was not honoring the entity graph hint I was passing to the find method. I solved the problem by evicting the entity from the cache immediately before calling find, as shown below.
Find Method that Works
em.getEntityManagerFactory().getCache().evict(Container.class, updatedContainer.getId());
Map<String, Object> props = new HashMap<String, Object>();
props.put("javax.persistence.loadgraph", em.getEntityGraph("Container.eager"));
Container managedContainer = em.find(Container.class, updatedContainer.getId(), props);
PersistenceUnitUtil tester = em.getEntityManagerFactory().getPersistenceUnitUtil();
logger.debug("Assets: {}", tester.isLoaded(managedContainer, "assets"));
logger.debug("References: {}", tester.isLoaded(managedContainer, "references"));
Now the isLoaded test always returns true, and I'm able to document all the changes in the updated entity.
In summary, I have the following questions:
Why is EclipseLink not honoring my entity graph?
Am I going to encounter problems by manually evicting my entity from the cache?
Is there a better way to force EclipseLink to initialize my lazy loading properties?
Error Description
Hey all,
I'm having trouble getting a response from my manually added controllers in a JHipster-based project. I scaffolded the original project and then hand-wrote my own services and controllers.
When I execute the call, the error result I get from SoapUI (which I am using for initial validation) is at the following URL: http://imgur.com/04FpmEZ,Havk1EL#0
And if I look at my Eclipse console error, I see the following: http://imgur.com/04FpmEZ,Havk1EL#1
Controller
/**
 * GET /courses/json -> get all the courses.
 */
@RequestMapping(value = "/json",
    method = RequestMethod.GET,
    produces = "application/json")
@Timed
public List<Course> getAll() {
    log.debug("REST request to get all Courses");
    return courseService.findAllCourses();
}
Service
package com.testapp.myapp.service;
import java.util.List;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;
import com.testapp.myapp.domain.Course;
import com.testapp.myapp.repository.CourseRepository;
@Service
@Transactional
public class CourseServiceImpl implements CourseService {

    @Autowired
    CourseRepository courseRepository;

    public long countAllCourses() {
        return courseRepository.count();
    }

    public void deleteCourse(Course course) {
        courseRepository.delete(course);
    }

    public Course findCourse(Integer id) {
        return courseRepository.findOne(id);
    }

    public List<Course> findAllCourses() {
        return courseRepository.findAll();
    }

    public List<Course> findCourseEntries(int firstResult, int maxResults) {
        return courseRepository.findAll(
            new org.springframework.data.domain.PageRequest(firstResult / maxResults, maxResults)).getContent();
    }

    public void saveCourse(Course course) {
        courseRepository.save(course);
    }

    public Course updateCourse(Course course) {
        return courseRepository.save(course);
    }
}
What is confusing about this is that I ran the query generated by Hibernate directly against my DB, and it returns the record set just fine. Is it possible that the service is being blocked by some security or authentication constraint auto-loaded by JHipster?
A few issues existed, all related to migrating from Roo into JHipster:
I had built my new controller class with org.springframework.stereotype.Controller's @Controller annotation rather than @RestController... The original controller annotation was scaffolded by Spring Roo (which is highly effective at generating services from an existing DB using its DBRE addon, I might add).
After switching over to @RestController, I ran into the second hurdle, which I had originally expected as part of the JHipster setup: the service was being blocked due to authentication constraints.
This was fixed by going into com.[projectname].config and updating the SecurityConfiguration.java file, exposing specifically the APIs that I wanted, roughly as sketched below.
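The exact content differs between JHipster versions, but the change was along these lines (a hedged sketch only, not the generated file; /api/courses is just an example path):

// inside SecurityConfiguration's configure(HttpSecurity http) method
http
    .authorizeRequests()
    .antMatchers("/api/courses/**").permitAll()   // expose the hand-written endpoint
    .antMatchers("/api/**").authenticated();      // keep everything else secured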
Then I had to make sure Hibernate was getting the full collection of the objects being requested (I had a lot of complex relational entities built by Roo); otherwise I got failed to lazily initialize a collection of role...
In the domain entity, change your @OneToMany annotation as follows:
@OneToMany(fetch = FetchType.EAGER, mappedBy = "courseId", cascade = CascadeType.REMOVE)
Source of answer: Solve "failed to lazily initialize a collection of role" exception
Voila! Functioning, secure-able, JSON-based APIs, fully reverse-engineered from an existing PostgreSQL DB, loaded into a pre-scaffolded Angular front-end.
We're creating a RESTful API based on Play Framework 2.1.x which transfers/accepts data in JSON format. Create, read and delete operations were easy to implement, but we've got stuck with the update operation.
Here are the entities we have:
Event:
@Entity
public class Event extends Model {

    @Id
    public Long id;

    @NotEmpty
    public String title;

    @OneToOne(cascade = CascadeType.ALL)
    public Location location;

    @OneToMany(cascade = CascadeType.ALL)
    public List<Stage> stages = new LinkedList<Stage>();
    ...
}
Location:
@Entity
public class Location extends Model {

    @Id
    public Long id;

    @NotEmpty
    public String title;

    public String address;
    ...
}
Stage:
@Entity
public class Stage extends Model {

    @Id
    public Long id;

    @NotEmpty
    public String title;

    public int capacity;
    ...
}
In our router we have following entry:
PUT /events/:id controllers.Event.updateEvent(id: Long)
The updateEvent method in the controller looks like this (note: we use the Jackson library to map objects to JSON and back):
@BodyParser.Of(BodyParser.Json.class)
public static Result updateEvent(Long id) {
    Event event = Event.find.byId(id);
    Http.RequestBody requestBody = request().body();
    JsonNode jsonNode = requestBody.asJson();
    try {
        ObjectMapper mapper = new ObjectMapper();
        ObjectReader reader = mapper.readerForUpdating(event);
        event = reader.readValue(jsonNode);
        event.save();
    } catch (IOException e) {
        e.printStackTrace();
    }
    return ok();
}
After we've got the Event from the database and updated its values by reading from the JSON with ObjectReader, we try to save the updated Event and get an exception (we get a similar one when trying to update the list of Stages):
org.h2.jdbc.JdbcSQLException: Unique index or primary key violation: "PRIMARY_KEY_9F ON PUBLIC.LOCATION(ID)"; SQL statement: insert into location (id, title, address) values (?,?,?) [23505-168]
According to the H2 logs, the framework tries to perform an insert operation for the location and fails because a location with the specified id already exists. We've investigated further and it looks like when we get the Event from the DB, the location is not joined because of the lazy fetch. The same problem seems to occur when saving the other entities our Event has relationships with. We've tried to force the fetch operation for location by doing the following:
Event event = Ebean.find(Event.class).fetch("location").where().eq("id", id).findUnique();
but still, when we update this event with ObjectReader's readValue method and save the Event, we get the same exception.
We've also tried to create a separate Event object from the JSON and update the Event from the DB field by field (i.e. we implemented the merge operation ourselves), and it worked, but it looks odd that the framework doesn't provide any means of merging and updating entities with data passed from the client.
Could someone advise on how to solve this problem? Any example showing how to merge an entity with JSON data coming from the client and update it in storage would be highly appreciated.
You've probably already fixed the error by now, but in case this helps someone else, I'm answering it anyway.
I'm just a beginner with Play Framework as well; I only started a few days ago. But I believe where you have this in your code:
event.save();
you should be doing instead:
event.update();
The problem here is that you're not inserting a new entity into the database but updating the one already there, so you need to use the second method.
You can find more info about this at http://www.playframework.com/documentation/2.0/api/java/play/db/ebean/Model.html
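So the try block in the controller would become something like this (a sketch of the answer's suggestion, reusing the code from the question):

try {
    ObjectMapper mapper = new ObjectMapper();
    ObjectReader reader = mapper.readerForUpdating(event);
    event = reader.readValue(jsonNode);
    event.update();   // update() targets the existing row, instead of save()
} catch (IOException e) {
    e.printStackTrace();
}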
Here are my entities:
@Entity
public class Actor {

    private List<Film> films;

    @ManyToMany
    @JoinTable(name = "film_actor",
        joinColumns = @JoinColumn(name = "actor_id"),
        inverseJoinColumns = @JoinColumn(name = "film_id"))
    public List<Film> getFilms() {
        return films;
    }
    //... more in here
Moving on:
@Entity
public class Film {

    private List<Actor> actors;

    @ManyToMany
    @JoinTable(name = "film_actor",
        joinColumns = @JoinColumn(name = "film_id"),
        inverseJoinColumns = @JoinColumn(name = "actor_id"))
    public List<Actor> getActors() {
        return actors;
    }
    //... more in here
And the join table:
@javax.persistence.IdClass(com.tugay.sakkillaa.model.FilmActorPK.class)
@javax.persistence.Table(name = "film_actor", schema = "", catalog = "sakila")
@Entity
public class FilmActor {

    private short actorId;
    private short filmId;
    private Timestamp lastUpdate;
So my problem is:
When I remove a Film from an Actor and merge that Actor, and then check the database, I see that everything is fine. Say the actor id is 5 and the film id is 3: I see that these ids are removed from the film_actor table.
The problem is that in my JSF project, although my beans are request scoped and are supposed to fetch the new information, for the Film part they do not. They still bring me the Actor with id = 3 for the Film with id = 5. Here is a sample code:
@RequestScoped
@Named
public class FilmTableBackingBean {

    @Inject
    FilmDao filmDao;

    List<Film> allFilms;

    public List<Film> getAllFilms() {
        if (allFilms == null || allFilms.isEmpty()) {
            allFilms = filmDao.getAll();
        }
        return allFilms;
    }
}
So as you can see, this is a request scoped bean, and every time I access this bean, allFilms is initially null, so new data is fetched from the database. However, the fetched data does not match the data in the database: it still brings the Actor.
So I am guessing this is something like a cache issue.
Any help?
Edit: Only after I restart the server is the information fetched by JPA correct.
Edit: This does not help either:
@Entity
public class Film {

    private short filmId;

    @ManyToMany(mappedBy = "films", fetch = FetchType.EAGER)
    public List<Actor> getActors() {
        return actors;
    }
The mapping is wrong.
The join table is mapped twice: once as the join table of the many-to-many association, and once as an entity. It's one or the other, but not both.
And the many-to-many is wrong as well. One side MUST be the inverse side and use the mappedBy attribute (and thus not define a join table, which is already defined on the other, owning side of the association). See example 7.24, and its preceding text, in the Hibernate documentation (which also applies to other JPA implementations).
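As a sketch, the corrected association could keep Actor as the owning side and drop the FilmActor entity entirely (field access is used here just for brevity):

@Entity
public class Actor {

    // Owning side: the only place where the join table is declared.
    @ManyToMany
    @JoinTable(name = "film_actor",
        joinColumns = @JoinColumn(name = "actor_id"),
        inverseJoinColumns = @JoinColumn(name = "film_id"))
    private List<Film> films;
}

@Entity
public class Film {

    // Inverse side: refers back to the owning side, no @JoinTable here.
    @ManyToMany(mappedBy = "films")
    private List<Actor> actors;
}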
Side note: why use a short for an ID? A Long would be a wiser choice.
JB Nizet is correct, but you also need to maintain both sides of the relationship, as there is caching in JPA. The EntityManager itself caches managed entities, so make sure your JSF project is closing and re-obtaining EntityManagers, clearing them if they are long-lived, or refreshing entities that might be stale. Providers like EclipseLink also have a second-level cache: http://wiki.eclipse.org/EclipseLink/Examples/JPA/Caching
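For illustration, those suggestions correspond to standard JPA calls such as the following (a sketch; which of them applies depends on how the EntityManager is managed in the JSF project):

// Re-read one possibly stale managed entity from the database.
em.refresh(film);

// Or detach everything so the next query hits the database again.
em.clear();

// Or clear the provider's shared (second-level) cache.
em.getEntityManagerFactory().getCache().evictAll();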