My current project uses HSQLDB 2.0 and JPA 2.0.
The scenario is: I query the DB to get the list of contactDetails of a person. I delete a single contactInfo in the UI but do not save that change (I cancel the save).
I run the same query again, and now the result list is one shorter than the previous result because I deleted one contactInfo in the UI. But if I cross-check, that contactInfo is still present in the DB.
However, if I call entityManager.clear() before the query, I get correct results every time.
I don't understand this behaviour. Could anyone make it clear for me?
Rather than querying again, try this:
entityManager.refresh(person);
A more complete example:
EntityManagerFactory factory = Persistence.createEntityManagerFactory("...");
EntityManager em = factory.createEntityManager();
em.getTransaction().begin();
Person p = (Person) em.find(Person.class, 1);
assertEquals(10, p.getContactDetails().size()); // let's pretend p has 10 contact details
p.getContactDetails().remove(0);
assertEquals(9, p.getContactDetails().size());
Person p2 = (Person) em.find(Person.class, 1);
assertTrue(p == p2); // We're in the same persistence context so p == p2
assertEquals(9, p.getContactDetails().size());
// In order to reload the actual contact details from the database, refresh the entity
em.refresh(p);
assertTrue(p == p2);
assertEquals(10, p.getContactDetails().size());
assertEquals(10, p2.getContactDetails().size());
em.getTransaction().commit();
em.close();
factory.close();
The behaviour of clear() is explained in its javadoc:
Clear the persistence context, causing all managed entities to become detached. Changes made to entities that have not been flushed to the database will not be persisted.
That is, the removal of the contactInfo is not persisted, and the next query loads new managed instances that reflect the unchanged database state.
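To make the connection to your scenario explicit, here is a minimal sketch (reusing the Person/contactDetails entities from the example above) of why the query returns the full list again once clear() has been called:
Person p = em.find(Person.class, 1);
p.getContactDetails().remove(0);          // the removal exists only in the persistence context
em.clear();                               // p becomes detached and the unflushed change is forgotten
Person fresh = em.find(Person.class, 1);  // a new managed instance loaded with the database state
assertTrue(p != fresh);                   // no longer the same object after clear()
assertEquals(10, fresh.getContactDetails().size()); // the database still has all 10 contact details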
ContactInfo is not removed from the database because you only remove it from the contactDetails collection (the relationship), not the ContactInfo entity itself. If you want it deleted, you need to either remove it explicitly with remove() or specify orphanRemoval = true on the relationship.
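For illustration, a hedged sketch of both options; the field and mapping names here are assumed rather than taken from your code:
// Option 1: remove the child explicitly
ContactInfo info = person.getContactDetails().get(0);
person.getContactDetails().remove(info);
em.remove(info);
// Option 2: mark the collection so that children removed from it are deleted automatically
@OneToMany(mappedBy = "person", orphanRemoval = true)
private List<ContactInfo> contactDetails;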
There are 3 entities in an MxN relationship, with B being the association entity. We create them in a single TX, persist all of them, and then fetch the entity with the @OneToMany association. That association is not initialized after the fetch.
Source: https://github.com/alfonz19/springboot222demo/commits/what
@Transactional
@Test
void contextLoads() {
// for(int i = 0; i < 3; i++) {
UUID aId = UUID.randomUUID();
AEntity aEntity = aRepository.save(new AEntity().setId(aId));
UUID bId = UUID.randomUUID();
CEntity cEntity = cRepository.save(new CEntity().setId(bId));
em.flush();
bRepository.save(new BEntity().setAEntity(aEntity).setCEntity(cEntity));
// }
em.flush();
// em.clear();
Iterable<CEntity> centities = cRepository.findAll();
List<BEntity> bEntities =
iterableToStream(centities).flatMap(e -> e.getBEntities().stream()).collect(Collectors.toList());
Assert.assertThat(centities, Matchers.iterableWithSize(1));
Assert.assertThat(bRepository.findAll(), Matchers.iterableWithSize(1));
Assert.assertThat(bEntities.size(), CoreMatchers.is(1));
...
}
OK, I understand that when creating the BEntity I do not update AEntity and CEntity, leaving them inconsistent. Calling cRepository.findAll() then does issue a select on the DB to get all Cs (even without any evict/flush/clear), but leaves the @OneToMany uninitialized. I don't get it. I would understand if there were no call to the DB at all, but if the Cs are fetched anyway to refresh them, why not also refresh the association collection? Why is that?
Even more surprisingly, aRepository.save(new AEntity().setId(aId)) does an em.merge (the entity has an assigned id), and Hibernate then loads the whole MxN structure using 2 left outer joins, even though the @OneToMany is lazy. Why is that?? EDIT: ok, that's not surprising at all, it's an implication of cascading the merge. Completely ok.
I'm a little surprised by this behavior, as there are selects issued where there shouldn't be (IIUC), and there aren't any where there easily could be.
And to keep the best for the end: with a small change, uncommenting the for loop and the clear(), I get fully nondeterministic behavior.
source: https://github.com/alfonz19/springboot222demo/tree/nondeterministic
The tests will either work or produce exceptions like:
array out of bounds
collection with cascade="all-delete-orphan" was no longer referenced by the owning entity instance:
java.lang.NullPointerException
but if I put a breakpoint on the bEntities variable declaration, the cEntities are always correctly created and the test then passes. I have no idea what can cause this.
I have an answer to the bonus question about the non-deterministic behavior.
One more randomly generated exception to add to the list is org.springframework.orm.jpa.JpaSystemException: Found shared references to a collection, and all of this behavior just disappears with the removal of the flatMap. I.e. replace:
List<BEntity> bEntities =
StreamSupport.stream(centities.spliterator(), true).flatMap(e -> e.getBEntities().stream()).collect(Collectors.toList());
with
List<BEntity> bEntities = new LinkedList<>();
centities.forEach(e->bEntities.addAll(e.getBEntities()));
and the tests in the (no longer) "nondeterministic" branch pass 100% of the time. I'm not sure why, but it seems that the Stream API is not that safe with Hibernate-managed collections.
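For what it's worth, the second argument of StreamSupport.stream() is the parallel flag, so the original code may process the Hibernate-managed collections from multiple threads. A sequential stream keeps the flatMap style on a single thread; a sketch, not verified against the repository:
List<BEntity> bEntities = StreamSupport.stream(centities.spliterator(), false) // false = sequential
        .flatMap(e -> e.getBEntities().stream())
        .collect(Collectors.toList());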
Using ASP.NET Core 2.2 with EF Core, I have followed various guides in trying to implement the automatic creation of date/time values when creating either a new record or editing/updating an existing one.
The current result is that when I initially create a new record, the CreatedDate and UpdatedDate columns are populated with the current date/time.
However, the first time I edit this same record, the UpdatedDate column is given a new date/time value (as expected), BUT for some reason EF Core is wiping out the value of the original CreatedDate, which results in SQL assigning a default value.
Required result I need as follows:
Step 1: New row created; both the CreatedDate and UpdatedDate columns are given a date/time value (this already works).
Step 2: When editing and saving an existing row, I want EF Core to update the UpdatedDate column with the new date/time only, BUT leave the CreatedDate column unmodified with the original creation date.
I'm using EF Core code first and do not want to go down the fluent API route.
One of the guides I was partially following is https://www.entityframeworktutorial.net/faq/set-created-and-modified-date-in-efcore.aspx but neither this nor the other solutions I've tried gives the result I am after.
Baseclass:
public class BaseEntity
{
public DateTime? CreatedDate { get; set; }
public DateTime? UpdatedDate { get; set; }
}
DbContext Class:
public override Task<int> SaveChangesAsync(bool acceptAllChangesOnSuccess, CancellationToken cancellationToken = default(CancellationToken))
{
var entries = ChangeTracker.Entries().Where(E => E.State == EntityState.Added || E.State == EntityState.Modified).ToList();
foreach (var entityEntry in entries)
{
if (entityEntry.State == EntityState.Modified)
{
entityEntry.Property("UpdatedDate").CurrentValue = DateTime.Now;
}
else if (entityEntry.State == EntityState.Added)
{
entityEntry.Property("CreatedDate").CurrentValue = DateTime.Now;
entityEntry.Property("UpdatedDate").CurrentValue = DateTime.Now;
}
}
return base.SaveChangesAsync(acceptAllChangesOnSuccess, cancellationToken);
}
UPDATE FOLLOWING ADVICE FROM STEVE IN COMMENTS BELOW
I spent a bit more time debugging today; it turns out the methods I posted above appear to be functioning as expected, i.e. when editing an existing row and saving it, only the entityEntry.State == EntityState.Modified IF statement is being called. So what I'm finding is that after saving the entity, the CreatedDate column is being overwritten with a null value; I can see this by watching SQL explorer after a refresh. I believe the issue is along the lines of what Steve mentions below: "If it is null then this might also explain the behaviour in that it is not being loaded with the entity for whatever reason."
But I'm a little lost in tracing where this CreatedDate value is being dropped in the edit/save process.
The image below shows the result at the point just before the entity is saved following an update. In the debugger I'm not quite sure where to find the entry for CreatedDate to see what value it holds at this step, but it appears to be missing from the debugger list, so I'm wondering whether it somehow doesn't know about the existence of this field when saving.
Below is the method I have in my form 'Edit' Razor page model class:
public class EditModel : PageModel
{
private readonly MyProject.Data.ApplicationDbContext _context;
public EditModel(MyProject.Data.ApplicationDbContext context)
{
_context = context;
}
[BindProperty]
public RuleParameters RuleParameters { get; set; }
public async Task<IActionResult> OnGetAsync(int? id)
{
if (id == null)
{
return NotFound();
}
RuleParameters = await _context.RuleParameters
.Include(r => r.SystemMapping).FirstOrDefaultAsync(m => m.ID == id);
if (RuleParameters == null)
{
return NotFound();
}
ViewData["SystemMappingID"] = new SelectList(_context.SystemMapping, "ID", "MappingName");
return Page();
}
public async Task<IActionResult> OnPostAsync()
{
if (!ModelState.IsValid)
{
return Page();
}
_context.Attach(RuleParameters).State = EntityState.Modified;
try
{
await _context.SaveChangesAsync();
}
catch (DbUpdateConcurrencyException)
{
if (!RuleParametersExists(RuleParameters.ID))
{
return NotFound();
}
else
{
throw;
}
}
return RedirectToPage("./Index");
}
private bool RuleParametersExists(int id)
{
return _context.RuleParameters.Any(e => e.ID == id);
}
}
Possibly one of the reasons for this issue is the fact that I have not included the CreatedDate field in my Edit Razor Page form, so when I update the entity, which in turn runs the OnPostAsync method server side, there is no value stored for the CreatedDate field and therefore nothing in the bag by the time SaveChangesAsync is called in my DbContext class. But I also didn't think this was necessary? Otherwise I'd struggle to see the value in this process of using an inherited BaseEntity class, i.e. not having to manually add the CreatedDate and UpdatedDate attributes to every model class where I want to use them...
It may be easier to just give your BaseEntity a constructor:
public BaseEntity()
{
UpdatedDate = DateTime.Now;
CreatedDate = CreatedDate ?? UpdatedDate;
}
Then you can have your DbContext override SaveChangesAsync like:
public override Task<int> SaveChangesAsync(
bool acceptAllChangesOnSuccess,
CancellationToken token = default)
{
foreach (var entity in ChangeTracker
.Entries()
.Where(x => x.Entity is BaseEntity && x.State == EntityState.Modified)
.Select(x => x.Entity)
.Cast<BaseEntity>())
{
entity.UpdatedDate = DateTime.Now;
}
return base.SaveChangesAsync(acceptAllChangesOnSuccess, token);
}
Possibly one of the reasons for this issue is the fact that I have not included the CreatedDate field in my Edit Razor Page form, so when I update the entity, which in turn runs the OnPostAsync method server side, there is no value stored for the CreatedDate field and therefore nothing in the bag by the time SaveChangesAsync is called in my DbContext class.
That's true. Your post data does not contain the original CreatedDate, so when the entity is saved to the database it is null, and EF cannot know the correct value unless you assign it before saving. It is necessary.
You could just add the line below to your Razor form:
<input type="hidden" asp-for="CreatedDate" />
Update:
To handle it on the server side, you could assign the data manually:
public async Task<IActionResult> OnPostAsync()
{
// AsNoTracking so the bound RuleParameters instance with the same key can still be attached below
RuleParameters originalData = await _context.RuleParameters.AsNoTracking().FirstOrDefaultAsync(m => m.ID == RuleParameters.ID);
RuleParameters.CreatedDate = originalData.CreatedDate;
_context.Attach(RuleParameters).State = EntityState.Modified;
await _context.SaveChangesAsync();
return RedirectToPage("./Index");
}
I don't suspect EF is doing this, but rather your database, or you're inadvertently inserting records instead of updating them.
A simple test: put breakpoints in your SaveChangesAsync method within both the Modified and Added handlers and then run a unit test that loads an entity, edits it, and saves. Which breakpoint is hit? If the behaviour seems to be normal with a simple unit test, repeat again with your code.
If the Modified breakpoint is hit, and only the Modified handler is hit, then check the state of the CreatedDate value in the modified entity. Does it still reflect the original CreatedDate? If yes, then it would appear that something in your schema is overwriting it on save. If no, then you have a bug in your code that has caused it to update. If it is null, then this might also explain the behaviour in that it is not being loaded with the entity for whatever reason. Check that the property has not been configured as something like a Computed property.
If the Added breakpoint is hit at all, then this would point at a scenario where you're dealing with a detached entity, such as an entity that was read from a different DB Context and being associated to another entity in the current DB Context and saved as a byproduct. When a DbContext encounters an entity that was loaded and disassociated with a different DbContext, it will treat that entity as a completely new entity and insert a new record. The biggest single culprit for this is invariably MVC code where people pass entities to/from views. Entity references are loaded in one request, serialized to the view, and then passed back on another request. Devs assume they are receiving an entity that they can just associate to a new entity and save, but the Context of this request doesn't know about that entity, and that "entity" isn't actually an entity, it is now a POCO shell of data that the serializer created. It's no different to you newing up a new class and populating fields. EF won't know the difference. The result of this is you will trip the Added condition for your entity, and after completion you will have a duplicate record. (with different PK if EF is configured to treat PKs as Identity)
So an example is an Order screen: When presenting a screen to create a new order I may have loaded the Customer and passed that to the view to display customer information and will want to associate to the new order:
var customer = context.Customers.Single(x => x.CustomerId == 15);
var newOrder = new Order { Customer = customer };
return View(newOrder);
This looks innocent enough. When we go to save the new order after setting their details:
public ActionResult Save(Order newOrder)
{
context.Orders.Add(newOrder);
newOrder.Customer.Orders.Add(newOrder);
context.SaveChanges();
// ...
}
newOrder had a reference to Customer #15, so all looks good. We're even associating the new order to the customer's order collection. We might even want to have updated fields on the customer record to reflect a change to the Modified date. However, newOrder in this case, and all associated data including .Customer, are plain 'ol C# objects at this point. We've added the new order to the Context, but as far as the context is concerned, the Customer referenced is also a new record. It will ignore the Customer ID if that is set as an Identity column and it will save a brand new Customer record (ID #16 for example) with all of the same details as Customer ID 15 and associate that to the new order. It can be subtle and easy to miss until you start querying Customers and spotting duplicate-looking rows.
If you are passing entities to/from views, I'd be very wary of this gotcha. Attaching and setting modified state is one option, but that involves trusting that the data has not been tampered with. As a general rule, calls to update entities should never pass entities & attach them, but rather re-load those entities, validate row version, validate the data coming in, and only copy across fields you expect should ever be modified before saving the entity associated to the DbContext.
Hopefully that gives you a few ideas on things to check to get to the bottom of the issue.
I can create a history of an entity with a HistoryCustomizer:
@Entity
@Customizer(MyHistoryCustomizer.class)
public class Employee {..}
The HistoryCustomizer looks something like this:
public class MyHistoryCustomizer implements DescriptorCustomizer {
public void customize(ClassDescriptor descriptor) {
HistoryPolicy policy = new HistoryPolicy();
policy.addHistoryTableName("EMPLOYEE_HIST");
policy.addStartFieldName("START_DATE");
policy.addEndFieldName("END_DATE");
descriptor.setHistoryPolicy(policy);
}
}
The history objects can be fetched with the "AS_OF" hint
javax.persistence.Query historyQuery = em
.createQuery("SELECT e FROM Employee e", Employee.class)
.setParameter("id", id)
.setHint(QueryHints.AS_OF, "yyyy/MM/dd HH:mm:ss.SSS")
.setHint(QueryHints.READ_ONLY, HintValues.TRUE)
.setHint(QueryHints.MAINTAIN_CACHE, HintValues.FALSE);
just fine, BUT if you start accessing objects referenced by this historical object, the referenced objects will be the current versions. So the Employee from last year (fetched by a historical query) will have the current Address assigned to it, and not the one it used to have last year.
How can I tell EclipseLink (2.5.0) to fetch the related object from the past as well?
In order to query the historical state of several entities - not just one, like above - we have to create an EclipseLink-specific HistoricalSession. Queries run through this session all use the same historical timestamp and represent the proper historical state of the object graph.
I am using JPA in other parts of the code, so I will start with converting the JPA Query to an EclipseLink ReadAllQuery.
The HistoricalSession has its own entity cache, so that the historical entities do not mix with the normal ones.
// Get the EclipseLink ServerSession from the JPA EntityManagerFactory
Server serverSession = JpaHelper.getServerSession(emf);
// Only a ClientSession can give us a HistoricalSession so ask one from the ServerSession
ClientSession session = serverSession.acquireClientSession();
// Create the HistoricalSession. A HistoricalSession is pinned to a point in the past and all queries through it are executed as of that time.
Session historicalSessionAfterFirstChild = session.acquireHistoricalSession(new AsOfClause(afterFirstChildAdded));
ReadAllQuery q;
Query jpaQuery = em.createQuery(query);
jpaQuery.setParameter("root", "parent");
// Extract the EclipseLink ReadAllQuery from the JPA Query. We can use named queries this way.
q=JpaHelper.getReadAllQuery(jpaQuery);
// This is a possible EclipseLink bug: https://bugs.eclipse.org/bugs/show_bug.cgi?id=441193
List<Object> arguments = new Vector<Object>();
arguments.add("parent");
q.setArgumentValues(arguments);
Vector<Parent> historyAwareParents ;
// Execute the query
historyAwareParents = (Vector<Parent>) historicalSessionAfterFirstChild.executeQuery(q);
for (Child c : historyAwareParents.get(0).children) {
System.out.println(c.getExtension() + " " + c.getRoot());
}
I want to use EF DbContext/POCO entities in a detached manner, i.e. retrieve a hierarchy of entities from my business tier, make some changes, then send the entire hierarchy back to the business tier to persist back to the database. Each BLL call uses a different instance of the DbContext. To test this I wrote some code to simulate such an environment.
First I retrieve a Customer plus related Orders and OrderLines:-
Customer customer;
using (var context = new TestContext())
{
customer = context.Customers.Include("Orders.OrderLines").SingleOrDefault(o => o.Id == 1);
}
Next I add a new Order with two OrderLines:-
var newOrder = new Order { OrderDate = DateTime.Now, OrderDescription = "Test" };
newOrder.OrderLines.Add(new OrderLine { ProductName = "foo", Order = newOrder, OrderId = newOrder.Id });
newOrder.OrderLines.Add(new OrderLine { ProductName = "bar", Order = newOrder, OrderId = newOrder.Id });
customer.Orders.Add(newOrder);
newOrder.Customer = customer;
newOrder.CustomerId = customer.Id;
Finally I persist the changes (using a new context):-
using (var context = new TestContext())
{
context.Customers.Attach(customer);
context.SaveChanges();
}
I realise this last part is incomplete, as no doubt I'll need to change the state of the new entities before calling SaveChanges(). Do I Add or Attach the customer? Which entities' states will I have to change?
Before I can get to this stage, running the above code throws an Exception:
An object with the same key already exists in the ObjectStateManager.
It seems to stem from not explicitly setting the ID of the two OrderLine entities, so both default to 0. I thought it was fine to do this as EF would handle things automatically. Am I doing something wrong?
Also, working in this "detached" manner, there seems to be a lot of work required to set up the relationships - I have to add the new order entity to the customer.Orders collection, set the new order's Customer property, and set its CustomerId property. Is this the correct approach or is there a simpler way?
Would I be better off looking at self-tracking entities? I'd read somewhere that they are being deprecated, or at least being discouraged in favour of POCOs.
You basically have 2 options:
A) Optimistic.
You can proceed pretty close to the way you're proceeding now, and just attach everything as Modified and hope. The code you're looking for instead of .Attach() is:
context.Entry(customer).State = EntityState.Modified;
Definitely not intuitive. This weird looking call attaches the detached (or newly constructed by you) object, as Modified. Source: http://blogs.msdn.com/b/adonet/archive/2011/01/29/using-dbcontext-in-ef-feature-ctp5-part-4-add-attach-and-entity-states.aspx
If you're unsure whether an object has been added or modified, you can use the pattern from the last segment of that post:
context.Entry(customer).State = customer.Id == 0 ?
EntityState.Added :
EntityState.Modified;
You need to take these actions on all of the objects being added/modified, so if this object is complex and has other objects that need to be updated in the DB via FK relationships, you need to set their EntityState as well.
Depending on your scenario you can make these kinds of don't-care writes cheaper by using a different Context variation:
public class MyDb : DbContext
{
. . .
public static MyDb CheapWrites()
{
var db = new MyDb();
db.Configuration.AutoDetectChangesEnabled = false;
db.Configuration.ValidateOnSaveEnabled = false;
return db;
}
}
using(var db = MyDb.CheapWrites())
{
db.Entry(customer).State = customer.Id == 0 ?
EntityState.Added :
EntityState.Modified;
db.SaveChanges();
}
You're basically just disabling some extra calls EF makes on your behalf that you're ignoring the results of anyway.
B) Pessimistic. You can actually query the DB to verify the data hasn't changed/been added since you last picked it up, then update it if it's safe.
var existing = db.Customers.Find(customer.Id);
// Some logic here to decide whether updating is a good idea, like
// verifying selected values haven't changed, then
db.Entry(existing).CurrentValues.SetValues(customer);
In EclipseLink, I run into a problem where an element is inserted twice, resulting in a primary key violation. The scenario is as follows:
I have three entities, Element, Restriction and RestrictionElement. The entity RestrictionElement acts as a many-to-many relationship between the two others.
When I create a new RestrictionElement and merge the Element, the RestrictionElement is inserted twice. The code:
// element is an Element, restriction is a Restriction. Both are already in present in the database.
RestrictionElement newRestrictionElement = new RestrictionElement(restriction, element);
Transaction transaction = new Transaction();
em.merge(element); //em is the EntityManager
transaction.commit();
However, if I remove the line restriction.getReferencedRestrictionElements().add(this); from the constructor, the RestrictionElement is inserted only once.
Can anyone explain why this happens? Or point me to a document that explains how to work out what the merge() command does?
Relevant JPA code (I'll only give a small part; there aren't any other big problems with the code):
public class RestrictionElement {
#JoinColumns({#JoinColumn(name = "ELEMENT_ID", referencedColumnName = "ID"),#JoinColumn(name = "ELEMENT_DESCRIPTOR", referencedColumnName = "DESCRIPTOR")})
private Element element;
#JoinColumns({#JoinColumn(name = "RESTRICTION_ID", referencedColumnName = "ID"),#JoinColumn(name = "RESTRICTION_DESCRIPTOR", referencedColumnName = "DESCRIPTOR")})
private Restriction restriction;
public RestrictionElement(Restriction restriction, Element element) {
this.restriction = restriction;
this.element = element;
restriction.getReferencedRestrictionElements().add(this);
element.getReferingRestrictionElements().add(this);
}
}
public class Element {
#OneToMany(mappedBy = "element")
private List<RestrictionElement> referingRestrictionElements = new ArrayList<RestrictionElement>();
}
public class Restriction extends Element {
#OneToMany(mappedBy = "restriction", cascade = { ALL, PERSIST, MERGE, REMOVE, REFRESH })
private List<RestrictionElement> referencedRestrictionElements = new ArrayList<RestrictionElement>();
}
How do you persist the RestrictionElement? My guess is that when you persist it you get one copy, then a second when you merge the Element holding a reference to it.
Try using persist() for new objects, and relate the objects after they are managed, using the correct managed copies.
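A hedged sketch of that order of operations, assuming element and restriction are detached copies of rows that already exist in the database:
// Obtain managed copies of the existing entities first
Element managedElement = em.merge(element);           // merge() returns the managed copy
Restriction managedRestriction = em.merge(restriction);
// Create the new link against the managed copies and persist it exactly once
RestrictionElement link = new RestrictionElement(managedRestriction, managedElement);
em.persist(link);
// The element and restriction are already managed, so no extra merge of the graph is needed for the new link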
I got a similar issue when I ran my program, but the issue was not there under step-by-step debugging.
I resolved it by changing List to Set in the @OneToMany relationship.
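In mapping terms, that change looks roughly like this (a sketch using the field names from the question):
@OneToMany(mappedBy = "restriction", cascade = CascadeType.ALL)
private Set<RestrictionElement> referencedRestrictionElements = new HashSet<>();
Keep in mind that Set semantics depend on sensible equals()/hashCode() implementations on RestrictionElement.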
Don't forget that once you retrieve an instance of a class using JPA, the instance becomes managed, and any changes to it will be automatically synchronized with the database.
By default, this synchronization can happen as soon as you query a table that could be affected by the pending change. Therefore the following situation can happen:
query (find by ID)
update (setName = "xx")
query another class that has a direct relationship to this one (find by ID again)
In a situation similar to the above, the second find can effectively flush the pending update to the first table. (I'm not sure exactly of the details or scenarios here.)
My suggestion is that you issue every query (findById, for example) and obtain every instance you need before you start modifying anything (i.e. calling setters, etc.).
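A rough sketch of that ordering; with the default FlushModeType.AUTO a query may flush pending changes of managed entities before it runs (the setName() accessor and the id variables here are placeholders, not from the question's code):
Element element = em.find(Element.class, elementId); // managed from here on
element.setName("xx");                               // the change exists only in the persistence context for now
// A later query that could be affected by the change may trigger a flush first,
// so the UPDATE can reach the database before an explicit flush() or commit()
Restriction restriction = em.createQuery(
        "SELECT r FROM Restriction r WHERE r.id = :id", Restriction.class)
    .setParameter("id", restrictionId)
    .getSingleResult();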
Hope it helps.