Concurrency check not happening - entity-framework

I have a code first EF model with concurrency tokens on some of the entities.
These tokens are defined as byte[] properties and are decorated with [Timestamp] attributes.
[Timestamp]
public byte[] ConcurrencyStamp { get; set; }
(I've also tried various combinations of [Timestamp] and [ConcurrencyCheck].)
The properties are also marked as concurrency tokens in OnModelCreating of my context:
modelBuilder.Entity<Room>().Property(x => x.ConcurrencyStamp).IsConcurrencyToken();
So, here's the scenario:
Some of the entities are serialized as JSON and passed on to external clients. When a client updates an object, that changed object is received again (as JSON) and the changes are applied to a freshly fetched object. In this process, I also copy the concurrency token value received from the client onto the object just fetched from the DB. Then, when saving changes, no concurrency error is thrown even if the values don't match.
So, to summarize:
1. fetch object from DB
2. serialize object to JSON (including the concurrency token)
3. client messes with object
4. server receives updated object as JSON
5. fetch object (by id) from DB
6. apply JSON values to fetched object (including the concurrency token)
7. context.SaveChanges()
--> no error if token was changed
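In code, steps 5-7 look roughly like this (a sketch only; the db context and the dto holding the client's JSON values are placeholders, not the actual code):
var room = db.Rooms.Find(dto.RoomId);             // step 5: fetch by id
room.RoomName = dto.RoomName;                     // step 6: apply the JSON values
room.ConcurrencyStamp = dto.ConcurrencyStamp;     //         including the client's token
db.SaveChanges();                                 // step 7: succeeds even if the token is stale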
Checking the log, it seems that EF does the UPDATE with the "fetched" concurrency token when saving changes, not with the token I set manually from the external object.
UPDATE [dbo].[Rooms]
SET [RoomName] = @0, [ConcurrencyStamp] = @1
WHERE (([RoomId] = @2) AND ([ConcurrencyStamp] = @3))
-- @0: 'new room name' (Type = String, Size = -1)
-- @1: '1500' (Type = Int64)
-- @2: '1' (Type = Int32)
-- @3: '1999' (Type = Int64)
(I've used longs here, but the same applies to byte[] stamps, which I tried initially).
1999 is the current concurrency token value in the DB. 1500 is the token coming from the JSON object, which I set manually through the property.
Even though you can see EF updating the token in the statement (because I set the property), it is still using the original token value to do the check.
Changing the properties through the change tracker doesn't help; the behaviour stays the same.
Any clues? Is this scenario not supported? Am I doing something wrong?
Update
The check does work. When creating a new context in a separate thread and making a change between the fetch and SaveChanges (thus between step 5 and step 7), the SaveChanges in step 7 fails with a concurrency exception.
So it appears it works as described, but there's no way to "force" the token to be updated externally (which might make sense in a way, I guess).
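For reference, the competing update in that test looked conceptually like this (a sketch, not the original test code; MyContext and roomId are placeholder names):
var room = db.Rooms.Find(roomId);                 // step 5
using (var other = new MyContext())               // simulate another user
{
    var sameRoom = other.Rooms.Find(roomId);
    sameRoom.RoomName = "changed elsewhere";
    other.SaveChanges();                          // bumps the token in the DB
}
room.RoomName = "changed here";
db.SaveChanges();                                 // step 7: throws a concurrency exception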

You actually can force it.
You just need to set the timestamp like this:
customerRequest.RowVersion = detachedRequest.RowVersion;
Context.Entry(customerRequest).Property(p => p.RowVersion).OriginalValue = customerRequest.RowVersion;
Context.Entry(customerRequest).Property(p => p.RowVersion).IsModified = false;
After that, EF will treat the value as unmodified and will throw a concurrency exception on update.
Tested on EF 6 Code First.

EF always uses the OriginalValue of the timestamp fetched in step 5 in its UPDATE statement in step 7.
Setting entity.ConcurrencyStamp = viewModel.ConcurrencyStamp in step 6 only updates the CurrentValue.
To set the OriginalValue, do this in step 6 instead:
dbContext.Entry(entity).Property(e => e.ConcurrencyStamp).OriginalValue =
    viewModel.ConcurrencyStamp;
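Putting that together with the scenario above (a sketch; the dto and property names are placeholders for whatever carries the client's values):
var room = db.Rooms.Find(dto.RoomId);                      // step 5
room.RoomName = dto.RoomName;                              // step 6: apply client values
db.Entry(room).Property(e => e.ConcurrencyStamp)
    .OriginalValue = dto.ConcurrencyStamp;                 // token used in the WHERE clause
try
{
    db.SaveChanges();                                      // step 7
}
catch (DbUpdateConcurrencyException)
{
    // the client's token no longer matches the row in the DB
}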

How Does the Entity Framework ChangeTracker know that a property has changed?

How does the Entity Framework ChangeTracker know that a property has changed? The properties have no special setter logic. I would like to know how it tracks those changes.
If you use the DbContext and DbSets normally, you won't add items to the change tracker yourself. Before you can change any property of an entity, you'll first have to fetch it. Likewise, if you want to delete it, you'll first have to fetch it.
The change tracker holds all fetched items, at least until SaveChanges, and probably until the DbContext is disposed; if you really want to know, write some test code.
The ChangeTracker also holds all Added items.
The sequence of fetched and added items in the ChangeTracker can be accessed using the Entries() or Entries<TEntity>() method. The returned value is a sequence of DbEntityEntry objects. Every DbEntityEntry has a State, indicating whether it is Added / Deleted / Modified / Unchanged.
After you fetch an item, its state is Unchanged. If you call Remove, the state changes to Deleted. If you add an object, the state is Added. The difficult part is Modified, because you can modify an item without going through the DbSet:
// Student moves to a different school:
int jfkSchoolId = schoolContext.Schools
    .Where(school => school.Name == "J.F. Kennedy School")
    .Select(school => school.Id)
    .FirstOrDefault();
var student = schoolContext.Students
    .Where(s => s.Id == 100)
    .SingleOrDefault();
student.SchoolId = jfkSchoolId;
How does the ChangeTracker know you changed the SchoolId?
Luckily, the DbEntityEntry holds the original database values as well as the current values. So when you ask for the state, all it has to do is check whether the entry has been added / deleted / etc. Most states are easy; only if the entry looks Unchanged does the function that fetches the state have to compare all original values with all current values, using a default value comparer. If there are differences, the state is marked Modified, so the next time the value comparison is not needed.
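Both snapshots are visible on the entry itself, so you can watch this mechanism at work (a small sketch, reusing the schoolContext and student from above):
var entry = schoolContext.Entry(student);
var originalSchoolId = entry.OriginalValues["SchoolId"];   // value as fetched from the DB
var currentSchoolId = entry.CurrentValues["SchoolId"];     // value on the object right now
var state = entry.State;                                   // Modified once the two differ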
You can't undo this: once the state is Modified, you can't make it Unchanged again just by putting the old value back:
int originalSchoolId = myStudent.SchoolId;
myStudent.SchoolId = jfkSchoolId;
var state = dbContext.ChangeTracker.Entries<Student>()
    .Where(studentEntity => studentEntity.Entity.Id == myStudent.Id)
    .Select(studentEntity => studentEntity.State)
    .SingleOrDefault();
// state equals Modified, because the current value differs from the original
// move the student back to the original school
myStudent.SchoolId = originalSchoolId;
// ask for the state of this student again:
state = ...
// state is still Modified

entity framework 6 system.outofmemoryexception

I keep getting a System.OutOfMemoryException:
Exception of type 'System.OutOfMemoryException' was thrown.
at System.Collections.Generic.List`1.set_Capacity(Int32 value)
at System.Collections.Generic.List`1.EnsureCapacity(Int32 min)
at System.Collections.Generic.List`1.Add(T item)
at System.Collections.Generic.List`1..ctor(IEnumerable`1 collection)
at System.Linq.Enumerable.ToList[TSource](IEnumerable`1 source)
at CIGDataLibrary.Archive.ArchiveCycleData(String applicationName)
in c:!TFS\SCCSoftware\Commercial\CIG\Wagering\CIGDataLibrary\files\Archive.cs:line 571
Code at line 571:
List<Guid?> cycleData = Queries.Current.GetCycleDataArchiveList();
The method:
public static List<Guid?> GetCycleDataArchiveList()
{
    using (var dbContext = new CIGDataModels.CIGDBStoredProcModels())
    {
        return dbContext.usp_arch_GetCycleData().ToList();
    }
}
And the meat of the stored procedure:
SELECT TOP 1000 gpCycleData_Id FROM CIGDB.dbo.cig_Cycle_gpCycleData
WHERE [TimeStamp] < DATEADD(HH, 3, DATEADD(dd, 0, DATEDIFF(dd, 0, GETDATE())))
ORDER BY [TimeStamp]
Any thoughts on why this is occurring? It should be returning 1000 records (just a list of GUIDs), so it shouldn't be throwing a fit, right? I've taken the SP down to TOP 1, but it still results in the same error.
Your production database is returning too many rows. It's likely missing the TOP n clause within the stored procedure, hence it's returning so much data that your application is running out of memory.
As you say you have edited the stored proc to only return a single row and it's still causing an error, there must be another factor.
Assuming that there's nothing more to the procedure than you have posted, some possible reasons:
A trigger is affecting the output in some way.
Entity framework is calling a different procedure. Check the content of your model to determine the correct procedure is being called.
Another procedure exists in a different schema. For example, your app logs into the database with username myApp and when it calls MyProc it's resolving to myApp.MyProc instead of dbo.MyProc.
OK, I figured out the issue, and it now runs as expected.
Contrary to the exception's stack trace, the error was on line 572, not 571.
That line was
List<Guid> arch_gpCycle = dbarchContext.cig_Cycle_gpCycleData.Select(s => s.gpCycleData_Id).ToList();
That line resulted in the System.OutOfMemoryException. I have since changed it to a compiled query that returns a list of Guids, and it runs successfully.
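For comparison, the pattern that bit here is projecting an unfiltered DbSet straight into a list, which materializes the id of every row in the table at once. A bounded version could look like this (a sketch only, not the compiled query actually used; the TimeStamp property and the cutoff/batch values are assumptions mirroring the stored procedure):
DateTime cutoff = DateTime.Today.AddHours(3);                  // assumed cutoff
List<Guid> arch_gpCycle = dbarchContext.cig_Cycle_gpCycleData
    .Where(s => s.TimeStamp < cutoff)                          // filter on the server
    .OrderBy(s => s.TimeStamp)
    .Select(s => s.gpCycleData_Id)
    .Take(1000)                                                // cap the batch size
    .ToList();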

Symfony2 form validation of Doctrine2 object: accessing previous values on updates

I've written a class validator as a service and passed the doctrine entity manager to this validator. At this point everything works fine.
But now I need the unchanged version of the $entry object that is being updated in the form, or at least its previous values. I've tried some approaches, but did not succeed:
public function isValid($entry, Constraint $constraint)
{
$oldEntry = $this->em->getRepository('SomeBundle:Entry')->findOneBy(array('id' => $entry->getId()));
Doctrine fetches the same (changed) object as expected. But trying to refresh the object will reset both versions of the object:
$newEntry = clone $entry;
$this->em->detach($newEntry);
$this->em->refresh($entry);
$hoursOfOldEntry = $entry->calculateHours();
$this->em->merge($newEntry);
Another option could be to save the values of the object as an array, refresh the object, and reassign the saved values after working on the original values. But this does not seem to be the best way, especially if there are many relations. I don't want to touch the object within a validator, I just need the previous values!
Another approach could be using Doctrine\ORM\UnitOfWork#recomputeSingleEntityChangeSet(Doctrine\ORM\ClassMetadata $meta, $entity). But I don't think it's a good idea to use internal Doctrine methods in a validator!
So how do I get the original object or the change set in a class validator?
This won't get you the original entity, but should get you a key/value array of the original fields:
$uow = $em->getUnitOfWork();
$originalData = $uow->getOriginalEntityData($entry);
http://www.doctrine-project.org/api/orm/2.0/source-class-Doctrine.ORM.UnitOfWork.html#2210

Grails GORM + MongoDB: OptimisticLockingException during save

In a Grails service I try to save an object to MongoDB:
Cover saveCover = new Cover()
saveCover.id = url
saveCover.url = url
saveCover.name = name
saveCover.sku = sku
saveCover.price = price
saveCover.save()
Cover domain looks like this:
class Cover {
    String id
    String name
    String url
    String sku
    String price
}
So I want to have a custom id based on the url, but during the save process I get this error:
Could not commit Datastore transaction; nested exception is
org.grails.datastore.mapping.core.OptimisticLockingException: The
instance was updated by another user while you were editing
But if I don't use setters and just pass all the values in the constructor, the exception is gone. Why?
As reported in the documentation here:
Note that if you manually assign an identifier, then you will need to use the insert method instead of the save method, otherwise GORM can't work out whether you are trying to achieve an insert or an update
So you need to use the insert method instead of save when the id generator is 'assigned':
cover.insert(failOnError: true)
If you do not define the mapping like this:
static mapping = {
    id generator: 'assigned'
}
and use the insert method, you'll get an auto-generated ObjectId:
"_id" : "5496e904e4b03b155725ebdb"
This exception occurs when you assign an id to a new model and try to save it because GORM thinks it should be doing an update.
Why this exception occurs
When I ran into this issue I was using version 1.3.0 of the grails-mongo plugin, which uses version 1.1.9 of the grails datastore core code. I noticed that the exception gets generated around line 847 of NativeEntryEntityPersister. That code updates an existing domain object in the db.
Above that, on line 790, isUpdate is computed, which is used to decide whether the operation is an update or not. isInsert is false, since it is only true when an insert is forced, and readObjectIdentifier returns the id that has been assigned to the object, so isUpdate ends up evaluating to true.
Fixing the exception
Thanks to the && !isInsert on line 791, if you force an insert the insert code gets called and, sure enough, the exception goes away. However, when I did this the assigned id wasn't saved; a generated object id was used instead. I saw that the fix for this is on line 803, where it checks whether the generator is set to "assigned".
To fix that you can add the following mapping.
class Cover {
    String id
    String name
    String url
    String sku
    String price
    static mapping = {
        id generator: 'assigned'
    }
}
A side effect of this is that you will always need to assign an id for new Cover domain objects.

How do I tell Play Framework 2 and Ebean to save null fields?

I'm using Play Framework 2 and Ebean. When a user submits a form to edit an existing object in the database, it doesn't save null values. I guess this is to prevent overwriting fields that aren't in the form with null. But how can I let them set fields in the form to null if they need to?
For example, the user edits an Event object. Event.date is 1/1/13. The user sets the Event.date field in the form to empty and submits the form. Inspecting Event.date in the debugger shows its value is null. I save the Event. If I look at the Event in the database, its value is still 1/1/13.
Edit: It seems there is a method for this. The only problem is that it doesn't work on nested entities. Any solutions for this?
update(Object bean, Set<String> properties)
Create an ebean.properties file right next to the application.conf file and add this line to it:
ebean.defaultUpdateNullProperties=true
Null properties in Ebean are considered unloaded, so to prevent accidentally nulling properties that shouldn't be nulled, they are just excluded from the update.
Because of this, reverting a Date (or other fields) to null in Ebean is... hard :). The last time I had to do the same thing (revert a Date), I used a second query to do just that: nulling the Date (after event.update(Object o)):
public static Result updateEvent() {
    Form<Event> eventForm = form(Event.class).bindFromRequest();
    // do some validation if required...
    Event event = eventForm.get();
    event.update(event.id);
    if (eventForm.get().date == null) {
        Ebean
            .createUpdate(Event.class, "UPDATE event SET date=null where id=:id")
            .setParameter("id", event.id)
            .execute();
    }
    return ok(); // or redirect as appropriate
}
On the other hand, if you are filtering events by comparison (always selecting those newer than X), you can just set the date to a very 'old' value, which should also do the trick. In this case you'll update the object only once.
private static final Date VERY_OLD_DATE = new GregorianCalendar(1, 0, 1).getTime();

public static Result updateEvent() {
    Form<Event> eventForm = form(Event.class).bindFromRequest();
    Event event = eventForm.get();
    if (eventForm.get().date == null) {
        event.date = VERY_OLD_DATE;
    }
    event.update(event.id);
    return ok(); // or redirect as appropriate
}
In this case, in your HTML form you will need to clear the value of the field (or just always send a date like 0001-01-01); this can be done easily with JavaScript.