Couldn't find an answer for whatever is happening to my code. I'm reading XML from a web service and adding rows to my database using EF, and after adding the header or master row, I get the exception shown below when calling SaveChanges() on the detail row.
// master row insert
var result = 0;
using (var context = new XEntities())
{
    var masterRow = new TableName
    {
        x = someValue,
        // ...
    };
    context.TableName.Add(masterRow);
    context.SaveChanges();
    result = masterRow.id;
}
return result;
So this insert has no issues. After returning that inserted row's id, I call a function to insert the detail with some parameters, including the parent id.
That's when I get that exception:
DbUpdateConcurrencyException: Store update, insert, or delete
statement affected an unexpected number of rows (0). Entities may have
been modified or deleted since entities were loaded. Refresh
ObjectStateManager entries.
The code for the child row is exactly the same, of course with its own data. Any ideas how I can debug this exception? My best guess is the auto-generated id on the master row; however, I'm logging the correct id (not 0) in the return statement after the first SaveChanges().
Edit:
I just tried inserting my child row using context.Database.ExecuteSqlCommand(query, parameters) with the same data I used for Entity Framework, and it works. But then why does using context.Add(row) followed by SaveChanges() not work? Typing queries is so from the 90's hahahaha
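One way to narrow this down (a sketch, assuming EF6, which the "Refresh ObjectStateManager entries" wording suggests): turn on EF's SQL logging around the failing insert and compare the generated INSERT with the raw command that works. The entity and variable names below are placeholders, not from the original code:
using (var context = new XEntities())
{
    // EF6: log every command EF sends (Console here, but any Action<string> works),
    // so the generated INSERT can be compared with the hand-written SQL that succeeds.
    context.Database.Log = Console.Write;

    context.DetailTable.Add(detailRow);  // placeholder names for the child entity set and row
    context.SaveChanges();               // the call that throws DbUpdateConcurrencyException
}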
I have the following code
var dbContext = Setup.ConfigureDBContext();
var wo = await dbContext.WorkOrders.Where(x => x.WorkOrderID == 88).SingleOrDefaultAsync();
var t = wo.Confidence;
wo.ModifiedDateTime = DateTime.UtcNow;
wo.Confidence = t;
await dbContext.SaveChangesAsync();
In the above code I am assigning the same Confidence but changing the ModifiedDateTime.
EF generates the following SQL
exec sp_executesql N'SET NOCOUNT ON;
UPDATE [WorkOrders] SET [ModifiedDateTime] = @p0
WHERE [WorkOrderID] = @p1 AND [VersionStamp] = @p2;
SELECT [VersionStamp]
FROM [WorkOrders]
WHERE @@ROWCOUNT = 1 AND [WorkOrderID] = @p1;
',N'@p1 int,@p0 datetime,@p2 varbinary(8)',@p1=88,@p0='2019-10-09 15:33:06.343',@p2=0x0000000000582A52
Note that EF is not including the Confidence column in the update statement. I am assuming EF has to compare the original value with the new value and only include a column in the update statement if there is a change.
Is that a correct assumption?
I am asking this question because the WorkOrder table also has 4 nvarchar(max) columns. The data in these columns is long strings. If my assumption is correct then EF also has to compare these 4 columns to decide whether they need to be included in the update query or not. That comparison will be slower and may hurt performance. In that case I may create a separate satellite table just for these 4 columns.
The context has an internal change tracker which, as the name indicates, tracks changes made to entities. It is not comparing against what's in the database; it keeps a snapshot of each tracked entity's original values (taken when the entity was loaded) and compares the current property values against that snapshot. In cases where you change only a single property or a handful of properties, EF will issue an update statement that only includes the columns for properties whose values actually differ from that snapshot. That is why assigning a property the same value it had before (either explicitly or via some automatic means like AutoMapper) does not add that column to the update statement: the snapshot comparison sees no change.
Your entity is obtained via the context, so by default the context begins tracking it immediately, which means any entity you retrieve is stored in the DbContext along with a copy of its original values. When you alter property values on a tracked entity, the context changes the EntityState of the entity to Modified and the ChangeTracker records the old and new property values. When SaveChanges is called, the context compares the current values with the original ones to determine which properties were actually changed and generates an UPDATE statement that sets only those columns.
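To see exactly what the change tracker will send, you can inspect the tracked entry before saving. A quick diagnostic sketch, reusing the WorkOrder query from your question (EF Core APIs assumed):
var wo = await dbContext.WorkOrders.Where(x => x.WorkOrderID == 88).SingleOrDefaultAsync();

wo.ModifiedDateTime = DateTime.UtcNow;
wo.Confidence = wo.Confidence; // assigned, but the value is unchanged

// Ask the change tracker which properties it considers modified.
// Only these columns will appear in the generated UPDATE statement.
var modifiedColumns = dbContext.Entry(wo).Properties
    .Where(p => p.IsModified)
    .Select(p => p.Metadata.Name)
    .ToList();

Console.WriteLine(string.Join(", ", modifiedColumns)); // e.g. "ModifiedDateTime"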
If you are concerned about specific columns, you can disable tracking for the query:
var item = _context.Employees.Where(x=>x.ID==1).AsNoTracking().FirstOrDefault();
Then tell the context which property you want to modify; the context will issue an SQL statement that updates that column without any comparison:
item.Name = "213213";
_context.Attach(item).Property(x => x.Name).IsModified = true;
_context.SaveChanges();
Got a strange one here. I am using EF 6 over SQL Server 2012 and C#.
If I delete a record, using DeleteObject, I get:
// order.OrderItem count = 11
db.OrderItem.DeleteObject(orderitem);
db.SaveChanges();
order = db.Order.First(r => r.Id == order.Id);
// order.OrderItem count = 10, CORRECT
If I delete an OrderItem using ExecuteStoreCommand inline DML, I get:
// order.OrderItem count = 11
db.ExecuteStoreCommand("DELETE FROM ORDERITEM WHERE ID = {0}", orderitem.Id);
order = db.Order.First(r => r.Id == order.Id);
// order.OrderItem count = 11, INCORRECT, should be 10
So the ExecuteStoreCommand version reports 11; however, the OrderItem is definitely deleted from the DB, so it should report 10. Also, I would have thought First() performs an eager query, thus repopulating the order.OrderItem collection.
Any ideas why this is happening? Thanks.
EDIT: I am using ObjectContext
EDIT2: This is the closest working solution I have, using "detach". Interestingly, the "detach" actually takes about 2 secs! Not sure what it is doing, but it works.
db.ExecuteStoreCommand("DELETE FROM ORDERITEM WHERE ID ={0}", orderitem.Id);
db.Detach(orderitem);
It would be quicker to requery and repopulate the dataset. How can I force a requery? I thought the following would do it:
order = db.Order.First(r => r.Id == order.Id);
EDIT3: This seems to work to force a refresh after the delete, but it still takes about 2 secs:
db.Refresh(RefreshMode.StoreWins, order.OrderItem);
I still don't really understand why one cannot just requery, since an Order.First(r => r.Id == id) type query often takes much less than 2 secs.
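A small sketch (assuming the same db and order variables as above) illustrating what the requery actually does under the default MergeOption:
// The requery hits the database, but because this Order is already tracked,
// the context's identity resolution (MergeOption.AppendOnly by default) returns
// the cached instance and keeps its current state, including the already-loaded
// order.OrderItem collection with the stale item still in it.
var reloaded = db.Order.First(r => r.Id == order.Id);
Console.WriteLine(ReferenceEquals(order, reloaded)); // true - same cached object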
This is likely because the Order and its order items are already known to the context when you perform the ExecuteStoreCommand. EF doesn't know that the command relates to any cached copy of Order, so the command is sent to the database but does not update any loaded entity state. Whereas the first approach looks for any loaded OrderItem and, when told to remove it from the DbSet, also looks for any loaded entities that reference that order item.
If you don't want to ensure the entity(ies) are loaded prior to deleting, then you will need to check whether any are loaded and refresh or detach their associated references.
If orderitem represents an entity, you should just be able to use:
db.OrderItems.Remove(orderitem);
If the order is loaded, the order item should be removed automatically. If the order isn't loaded, no loss, it will be loaded from the database when requested later on and load the set of order items from the DB.
However, if you want to use the SQL execute approach, detaching any local instance should remove it from the local cache.
db.ExecuteStoreCommand("DELETE FROM ORDERITEM WHERE ID = {0}", orderitem.Id);
var existingOrderItem = db.OrderItems.Local.SingleOrDefault(x => x.Id == orderitem.Id);
if (existingOrderItem != null)
    db.Entry(existingOrderItem).State = EntityState.Detached;
I don't believe you will need to check the orderItem's Order to refresh anything beyond this, but I'm not 100% sure of that. Generally, though, when it comes to modifying data state I opt to load the applicable top-level entity and remove its child.
So if I had a command to remove an order item from an order:
public void RemoveOrderItem(int orderId, int orderItemId)
{
    using (var context = new MyDbContext())
    {
        // TODO: Validate that the current user session has access to this order ID
        var order = context.Orders.Include(x => x.OrderItems).Single(x => x.OrderId == orderId);
        var orderItem = order.OrderItems.SingleOrDefault(x => x.OrderItemId == orderItemId);
        if (orderItem != null)
            order.OrderItems.Remove(orderItem);
        context.SaveChanges();
    }
}
The key points to this approach:
While it does mean loading the data state again for the operation, this load is by ID so it's fast.
We can/should validate that the data requested is applicable for the user. Any command for an order they should not access should be logged and the session ended.
We know we will be dealing with the current data state, not basing decisions on values/data from the point in time that data was first read.
I am new to Breeze.js, but really enjoy it so far. I ran into an issue with updating a database with Breeze.js when selecting only a portion of the columns of a model.
When I ran this statement:
$scope.emFac.entityQuery.from('Company');
the company entity matches my EF entity, retrieves all columns, creates an entityAspect, and everything works fine when updating the database.
However, when I retrieve only a portion of the corresponding model's columns, Breeze.js returns an anonymous object with the specified properties (retrieving data works, but updating does not), without the entityAspect that is used for tracking changes.
Here is the code with select statement:
$scope.emFac.entityQuery.from('Company').select('companyId, displayName');
Is there a way to retrieve only some of the EF model's columns and still track changes with Breeze.js, as needed for database updates?
As you've discovered, Breeze treats the incoming data as plain objects instead of entities when you use select.
Your choices are:
On the server, create a CustomerLite or similar object, and have a server endpoint that returns those without the need for select (see the sketch after this list); OR
On the client, get the results from the query and create entities from each object, with status Unchanged
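A rough sketch of option #1, assuming a Web API controller backed by the same EF model (CompanyLite, CompaniesController, and _db are hypothetical names, not from the original post):
// Hypothetical lightweight DTO returned instead of using .select() on the client
public class CompanyLite
{
    public int CompanyId { get; set; }
    public string DisplayName { get; set; }
}

public class CompaniesController : ApiController
{
    private readonly MyDbContext _db = new MyDbContext(); // assumed EF context

    [HttpGet]
    public IQueryable<CompanyLite> CompaniesLite()
    {
        // Project on the server so only the two columns travel over the wire.
        return _db.Companies.Select(c => new CompanyLite
        {
            CompanyId = c.CompanyId,
            DisplayName = c.DisplayName
        });
    }
}
The Breeze client would then query this resource with something like entityQuery.from('CompaniesLite') instead of applying .select().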
Example of #2:
var entities = [];
em.executeQuery(customerProjectionQuery).then(queryResult => {
    queryResult.results.forEach(obj => {
        // obj contains values to initialize entity
        var entity = em.createEntity(Customer.prototype.entityType, obj, EntityState.Unchanged);
        entities.push(entity);
    });
});
Either way, you will need to ensure that your saveChanges endpoint on the server can handle saving the truncated Customer objects without wiping out the other fields.
I'm using the code shown below to update an entity model. But I get this error:
Store update, insert, or delete statement affected an unexpected number of rows (0)
The reason for this error is also known: it occurs because the Property does not exist in the database. For that I found only one option: first check that the entity exists, then update it.
But, as I'm updating 10,000+ rows at a time, it will be time-consuming to check in the database each time whether the Property exists or not.
Is there any other way to solve this?
Thank you.
foreach (Property item in listProperties)
{
    db.Properties.Attach(item);
    db.Entry(item).Property(x => x.pState).IsModified = true;
}
db.SaveChanges();
You are using it the wrong way. If you want to update without retrieving the entity, just change the state of the entity to Modified, providing the id.
foreach (Property item in listProperties)
{
    db.Entry(item).State = EntityState.Modified;
}
db.SaveChanges();
Attaching an existing but modified entity to the context
If you have an entity that you know already exists in the database but
to which changes may have been made then you can tell the context to
attach the entity and set its state to Modified.
When you change the state to Modified all the properties of the entity
will be marked as modified and all the property values will be sent to
the database when SaveChanges is called.
Source
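If some of the attached Property rows may genuinely no longer exist in the database, one possible pattern (a sketch, not part of the quoted answer; it assumes EF6) is to catch the concurrency exception, detach the entries that failed, and retry the save:
foreach (Property item in listProperties)
{
    db.Entry(item).State = EntityState.Modified;
}

var saved = false;
while (!saved)
{
    try
    {
        db.SaveChanges();
        saved = true;
    }
    catch (DbUpdateConcurrencyException ex) // System.Data.Entity.Infrastructure
    {
        // The rows behind these entries were deleted (or changed) in the database;
        // detach them so the retry only updates rows that still exist.
        foreach (var entry in ex.Entries)
        {
            entry.State = EntityState.Detached;
        }
    }
}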
I got this error too. Just updating the EF version from 5 to 6 solved the issue.
I have run into a serious problem several times now using MVC 4 and EF.
The problem is best illustrated by example:
I have records in a DB table with the following PKs, 1,2,3,4.
1 and 2 are deleted. When EF goes to insert a new record, it assigns it PK 1 again. The next insert will use 2, and then it will try 3 and get a PK violation.
I saw the same thing yesterday in another table with another insert.
In the following image you can see that db.SaveChanges failed when inserting a new record with PK 3.
As you can see from the following image, the DB column is set to auto-increment:
Here is my controller action (it is used for inserts and edits - but that should not matter):
if (ModelState.IsValid)
{
    // update pricelist
    Pricelist pricelist = new Pricelist();
    pricelist.InjectFrom(adminEditPricelistVM.Pricelist);
    pricelist.PricelistProducts = new List<PricelistProduct>();
    pricelist.SubscriberId = (int)UserManagement.getUsersSubscriberId(WebSecurity.GetUserId(System.Web.HttpContext.Current.User.Identity.Name));
    if (adminEditPricelistVM.Pricelist.PricelistId != 0)
    {
        db.Entry(pricelist).State = EntityState.Modified;
    }
    else
    {
        db.Pricelists.Add(pricelist);
    }
    db.SaveChanges();
    adminEditPricelistVM.Pricelist.PricelistId = pricelist.PricelistId;
    // etc...
The only clue I have is that in the seeding config for my data, we are using the following command to begin the seeding at 1 when replacing the data:
context.Database.ExecuteSqlCommand("DBCC CHECKIDENT ('Features', RESEED, 1)");
Perhaps this has something to do with it, but I doubt it - since this is not called at that time.
BTW, this is not consistent. I cannot replicate manually. It just seems to happen from time-to-time and when it does, EF will continue to err on each insert attempt until it passes all the used IDs and finds the next free one. In other words, I will get an error inserting on PK 3. Then on the next insert attempt, it will err on PK 4 and then on the next attempt it will succeed because PK 5 was not being used. It's as if, there is a memory of PKs in use somewhere that gets reset.
Any help would be greatly appreciated!
OK. I can confirm that the problem is caused by the DB reseed. This basically sets the identity counter back to 1 regardless of any remaining records in the DB. Weird but true.
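For reference, a possible fix is to reseed based on what is actually in the table instead of forcing the counter back to 1. A sketch assuming SQL Server, and assuming the Features table's identity column is named Id (that column name is an assumption):
// Option 1: let SQL Server correct the identity to the current maximum value in the column
// (this only adjusts upward when the current seed is below the existing maximum).
context.Database.ExecuteSqlCommand("DBCC CHECKIDENT ('Features', RESEED)");

// Option 2: explicitly reseed to MAX(Id) so the next insert gets MAX(Id) + 1.
context.Database.ExecuteSqlCommand(
    "DECLARE @max int = (SELECT ISNULL(MAX(Id), 0) FROM Features); " +
    "DBCC CHECKIDENT ('Features', RESEED, @max);");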