I have a web application (MVC 5, EntityFramework 6). It's connected to an SQL database via a DbContext. I'm having an issue where adding a new entity object creates a duplicate entry in the entity set (but not the DB) and I'm not sure how to stop this from happening.
Controller, whose method is called via an ajax request:
public class CustomerController : Controller
{
MyDBEntities db = new MyDBEntities(); //DbContext
public ActionResult SaveStuff(string customerId, string stuff)
{
Customer customer = db.Customers.Single(c => c.ID.Equals(customerId));
Stuff newStuff = new Stuff(stuff, customer);
db.Stuffs.Add(newStuff);
db.SaveChanges();
return PartialView("MyControl", customer);
}
}
There is a 1-to-many association between Customer and Stuff, and there is a "Stuffs" navigation property in Customer.
Stuff includes fields that are int, string, and DateTime.
The controller method returns a PartialView which is used by JavaScript to refresh the contents of a control.
The "MyControl" control does this:
var stuffs = Model.Stuffs.OrderByDescending(...);
When the control is rendered in this situation, Model.Stuffs contains a duplicate entry: one entry whose type name is Stuff (probably the new object created in the controller method) as well as an entry whose type name is System.Data.Entity.DynamicProxies.Stuff_<uuid>, which holds exactly the same data (presumably read back from the DB).
This is only a problem when I'm writing into and then reading from an entity set within the same web request. Other/future web requests that cause a read are fine. How can I make this work correctly?
This is happening because the DateTime value loses precision when it is written to the SQL database (see: SQL Server DateTime vs .NET DateTime). When it is read back from the DB it has a different value, so it does not overwrite the existing "stuff" object that still exists locally in db.Stuffs.
A simple solution is to make the DateTime property's setter on Stuff private and add your own pseudo-setter method with the rounding built in:
public void SetTimestamp(DateTime timestamp)
{
//Precision in SQL is lower than in .NET, so just round to tenth seconds
this.Updated = timestamp.AddTicks(- (timestamp.Ticks % (TimeSpan.TicksPerSecond / 10)));
}
Using datetime2 in the SQL database (SQL Server 2008+) is also an option should you need to maintain that level of precision.
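If you go the datetime2 route with a code-first model, the mapping can be set in OnModelCreating. This is only a sketch: the question appears to use a database-first context, and the Stuff/Updated names are taken from the snippets above.
protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
    // Store the timestamp as datetime2 so it keeps the full .NET DateTime precision.
    modelBuilder.Entity<Stuff>()
        .Property(s => s.Updated)
        .HasColumnType("datetime2");

    base.OnModelCreating(modelBuilder);
}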
Related
Given the following code, how can I add an element to one of the properties of an entity without knowing its Id and retrieving it from the database?
public async Task BookInPersonVisitAsync(Guid propertyId, DateTime dateTime, CancellationToken token)
{
var entity = new OnBoardingProcessEntity{ ExternalId = propertyId };
DbContext.OnBoardingProcesses.Attach(entity);
entity.OnBoardingProcessVisits.Add(new OnBoardingProcessVisitEntity
{
DateTime = dateTime,
Occurred = false
});
await DbContext.SaveChangesAsync(token);
}
ExternalId is just a GUID we use for external reference. This doesn't work because the entity does not have its Id set, but without hitting the database we can't get it.
With Entity Framework, if you have to reference an entity (referencedEntity) from another entity (entity), you have to know referencedEntity.
Otherwise you can only add the entity with referencedEntity set to null.
To know referencedEntity, you either know its Id or you have to retrieve it in some way (from the database).
In SQL (DML), if (and only if) ExternalId is a non-nullable candidate key, you can insert the OnBoardingProcessVisit record with a single roundtrip, but the INSERT statement will contain an inner query:
OnBoardingProcessVisit.OnBoardingProcess_Id = (
    SELECT Id
    FROM OnBoardingProcess
    WHERE ExternalId = @propertyId)
EDIT
There is no way to generate that query with EF. You can have a look at external components (free and paid, for example EntityFramework Extended), but in this case I don't think they help.
In this case I would probably try to use standard Entity Framework features (so one roundtrip to retrieve the OnBoardingProcess from the ExternalId).
Then, if the roundtrip is too slow, run the SQL query directly on the database.
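For illustration, a sketch of that single-roundtrip insert executed as raw SQL. It assumes the table and column names used in the query above and an API such as EF6's Database.ExecuteSqlCommandAsync with positional placeholders; adjust names to your actual schema.
// One round trip: the inner SELECT resolves the OnBoardingProcess Id from the ExternalId.
await DbContext.Database.ExecuteSqlCommandAsync(
    @"INSERT INTO OnBoardingProcessVisit (OnBoardingProcess_Id, [DateTime], Occurred)
      SELECT Id, {0}, {1}
      FROM OnBoardingProcess
      WHERE ExternalId = {2}",
    dateTime, false, propertyId);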
Regarding performance (and database consistency), add a unique index on OnBoardingProcess.ExternalId in any case.
Another suggestion if you decide on the roundtrip: in your code, the entity will be a proxy. If you don't disable lazy loading, your code will make one more roundtrip when you access the property entity.OnBoardingProcessVisits (in the statement entity.OnBoardingProcessVisits.Add).
So, in this case, disable lazy loading or achieve the same thing a different way.
The different way in your case is something like:
// entity is the OnBoardingProcess retrieved via the ExternalId roundtrip.
var onBoardingProcessVisitEntity = new OnBoardingProcessVisitEntity
{
    DateTime = dateTime,
    Occurred = false,
    OnBoardingProcess = entity
};
DbContext.OnBoardingProcessVisits.Add(onBoardingProcessVisitEntity);
await DbContext.SaveChangesAsync(token);
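If you prefer to keep the collection-based code instead, you can avoid the extra lazy-load roundtrip by turning lazy loading off on the context. A minimal sketch, assuming EF6 (Configuration.LazyLoadingEnabled is an EF6 API; EF Core 2.0 has no lazy loading to disable):
// Disable lazy loading so reading entity.OnBoardingProcessVisits does not trigger a query.
DbContext.Configuration.LazyLoadingEnabled = false;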
I am using a SQLite database with EF Core 2.0 preview in a UWP project.
The address table is split into two different entities:
Delivery address,
Invoice Address
using
modelBuilder.Entity<Project>().OwnsOne(p => p.DeliveryAddress);
which works great for setting up the database: with migrations it creates the different tables in the database, and reading data from these tables works great with the test data I have entered manually. But how do I save changes to the DeliveryAddress table? Nothing is getting persisted to the database when I save using:
public void UpdateDeliveryAddress(Project modifiedProject)
{
using (var db = new SteelFrameCalculatorDataContext())
{
db.Entry(modifiedProject).State = EntityState.Modified;
db.SaveChanges();
}
}
Project being the parent entity
2017-06-11T23:21:10.9242463+01:00 Warning 8 Microsoft.EntityFrameworkCore.Model.Validation
The key {'ProjectId'} on entity type 'Project.DeliveryAddress->Address' contains properties in shadow state - {'ProjectId'}. To configure this warning use the DbContextOptionsBuilder.ConfigureWarnings API (event id 'CoreEventId.ModelValidationShadowKeyWarning'). ConfigureWarnings can be used when overriding the DbContext.OnConfiguring method or using AddDbContext on the application service provider.
Using the following allowed me to save updates to the database. I assume UpdateRange(entity) sets everything to Modified. Not sure if this is the correct way, but it works.
using (var db = new SteelFrameCalculatorDataContext())
{
db.UpdateRange(modifiedProject);
db.SaveChanges();
}
Have you tried setting the state of the child object? Looks like you're only setting the parent Project state.
Adding this should do it:
db.Entry(modifiedProject.DeliveryAddress).State = EntityState.Modified;
db.Entry(modifiedProject).Reference(a=>a.DeliveryAddress).TargetEntry.State = EntityState.Modified;
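Put together, the update method might look like the sketch below. It is based on the snippets above and assumes DeliveryAddress is populated on the Project that is passed in.
public void UpdateDeliveryAddress(Project modifiedProject)
{
    using (var db = new SteelFrameCalculatorDataContext())
    {
        // Mark both the parent and the owned DeliveryAddress as modified;
        // marking only the parent leaves the owned entity's changes untracked.
        db.Entry(modifiedProject).State = EntityState.Modified;
        db.Entry(modifiedProject.DeliveryAddress).State = EntityState.Modified;
        db.SaveChanges();
    }
}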
This is my first post :) I'm new to MVC and .NET, and have some questions about Entity Framework functionality and performance. Questions inline...
class StudentContext : DbContext
{
public StudentContext() : base("myconnectionstring") {};
public DbSet<Student> Students {get; set; }
...
}
Question: Does DbSet read all the records from the database's Student table and store them in the Students collection (i.e. in memory)? Or does it simply hold a connection to the table, with records fetched only when SQL is executed against the database?
For the following:
private StudentContext db = new StudentContext();
Student astudent = db.Students.Find(id);
or
var astudent = from s in db.Students
               where s.StudentID == id
               select s;
Question: Which of these is better for performance? I'm not sure how the Find method works under the hood for a collection.
Question: When are database connections closed? During the Dispose() method call?
If so, should I call Dispose() on the class that holds the database context instance? I've read here to use using blocks.
I'm guessing a Controller class gets instantiated, does its work (including database access), calls its associated View, and then goes out of scope and is collected by the garbage collector. But it is best to call Dispose() explicitly to do the cleanup.
The Find method looks in the DbContext for an entity with the specified key(s). If there is no matching entity already loaded, the DbContext makes a SELECT TOP 1 query to get the entity.
Running db.Students.Where(s => s.StudentID == id) will get you a sequence containing all the entities returned from a SQL query similar to SELECT * FROM Students WHERE StudentID = @id. That should be pretty fast; you can speed it up by using db.Students.FirstOrDefault(s => s.StudentID == id), which adds a TOP 1 to the SQL query.
Using Find is more efficient if you're loading the same entity more than once from the same DbContext. Other than that Find and FirstOrDefault are pretty much equivalent.
In neither case does the context load the entire table, nor does it hold open a connection. I believe the DbContext holds a connection until the DbContext is disposed, but it opens and closes the connection on demand when it needs to resolve a query.
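As an illustration (not part of the answer above), the usual MVC 5 patterns are either a using block per unit of work or disposing the context when the controller is torn down:
public class StudentController : Controller
{
    private readonly StudentContext db = new StudentContext();

    public ActionResult Details(int id)
    {
        // The connection is opened and closed around the query, not held for the
        // lifetime of the context.
        Student student = db.Students.Find(id);
        return View(student);
    }

    // MVC calls Dispose when the controller instance is discarded; disposing the
    // context releases its connection back to the pool.
    protected override void Dispose(bool disposing)
    {
        if (disposing)
        {
            db.Dispose();
        }
        base.Dispose(disposing);
    }
}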
One of my entity classes could be stored in a SQL Server database as a BIGINT. My question is: how do I get an Entity Framework context to know how to store and retrieve instances of my entity class?
More detail: I'm using Noda Time, which can represent a (much) wider range of dates than either SQL or .NET DateTime can (AND it's a dessert topping). My entity class, Happening, is a wrapper around NodaTime's Instant class. I can set a Happening from a long, and get a long from a Happening, with methods like .SetFromLong(long instant) and .ToLong().
Currently I have my model working, saving classes that contain properties of the .NET DateTime type. If instead I want to use properties of my custom type Happening, how do I tell Entity Framework how to save those?
If I'm reading this article about Modeling and Mapping correctly, am I on the right track, or am I missing something simpler?
http://msdn.microsoft.com/en-us/library/bb896343.aspx
I'm using Entity Framework 4.
What I recommend doing is adding two properties to your entity, a NodaTime-based one and a long; exclude the NodaTime-based property from your EF model using [NotMapped], and update the long in its getter/setter.
i.e.
public class MyEntity
{
    // Mapped to the database as a BIGINT column.
    public long TimeAsLong { get; set; }

    // Not persisted; converts to/from the long on access.
    [NotMapped]
    public Happening Happening
    {
        get
        {
            var happening = new Happening();
            happening.SetFromLong(TimeAsLong);
            return happening;
        }
        set
        {
            TimeAsLong = value.ToLong();
        }
    }
}
The effect of this is that the long is stored in the DB, but on the class you can access it via the NodaTime-based Happening property.
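Usage then looks like ordinary EF code, with only TimeAsLong actually mapped. A sketch only; the db context and MyEntities set names here are made up for illustration.
var entity = new MyEntity();
entity.Happening = someHappening;   // setter stores someHappening.ToLong() in TimeAsLong
db.MyEntities.Add(entity);          // EF persists TimeAsLong as a BIGINT column
db.SaveChanges();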
I'm using VS2010 RC with the POCO self-tracking entities T4 templates.
In my WCF update service method I am using something similar to the following:
using (var context = new MyContext())
{
context.MyObjects.ApplyChanges(myObject);
context.SaveChanges();
}
This works fine until I set ConcurrencyMode=Fixed on the entity and then I get an exception. It appears as if the context does not know about the previous values as the SQL statement is using the changed entities value in the WHERE clause.
What is the correct approach when using ConcurrencyMode=Fixed?
The previous values need to be in your object.
Let's say you have a property ConcurrencyToken:
public class MyObject
{
public Guid Id { get; set; }
// stuff
public byte[] ConcurrencyToken { get; set; }
}
Now you can set ConcurrencyMode.Fixed on that property. You also need to configure your DB to automatically update it.
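For comparison, in a code-first model the same setup is usually expressed with the [Timestamp] attribute, which maps the property to a SQL Server rowversion column that the database updates automatically. This is a sketch only; the question uses the STE/EDMX workflow, where ConcurrencyMode is configured in the model designer instead.
using System.ComponentModel.DataAnnotations;

public class MyObject
{
    public Guid Id { get; set; }

    // Maps to rowversion; EF uses it as a Fixed concurrency token in every UPDATE/DELETE.
    [Timestamp]
    public byte[] ConcurrencyToken { get; set; }
}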
When you query the DB, it will have some value:
var mo = Context.MyObjects.First();
Assert.IsNotNull(mo.ConcurrencyToken);
Now you can detach or serialize the object, but you need to include ConcurrencyToken. So if you're putting the object data on a web form, you'll need to serialize ConcurrencyToken to a string and put it in a hidden input.
When you ApplyChanges, you need to include the ConcurrencyToken:
Assert.IsNotNull(myObject.ConcurrencyToken);
using (var context = new MyContext())
{
context.MyObjects.ApplyChanges(myObject);
context.SaveChanges();
}
Having ConcurrencyMode.Fixed changes the UPDATE SQL. Normally it looks like:
UPDATE [dbo].[MyObject]
SET --stuff
WHERE [Id] = @0
With ConcurrencyMode.Fixed it looks like:
UPDATE [dbo].[MyObject]
SET --stuff
WHERE [Id] = @0 AND [ConcurrencyToken] = @1
...so if someone has updated the row between the time you read the original concurrency token and the time you saved, the UPDATE will affect 0 rows instead of 1. EF throws a concurrency error in this case.
Therefore, if any of this isn't working for you, the first step is to use SQL Profiler to look at the generated UPDATE.
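As an illustration only, the zero-row UPDATE surfaces as an exception you can catch. This sketch assumes the EF4-era ObjectContext that the STE template generates, where the type is System.Data.OptimisticConcurrencyException:
using (var context = new MyContext())
{
    try
    {
        context.MyObjects.ApplyChanges(myObject);
        context.SaveChanges();
    }
    catch (OptimisticConcurrencyException)
    {
        // Another caller changed the row after the concurrency token was read.
        // Reload the current values and merge, retry, or report the conflict.
    }
}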
Mark,
The objects created as "self-tracking entities" cannot be considered pure POCOs.
Here's the reason:
STEs only work well if your client uses the generated proxies from the STE T4 template.
Change tracking, and thus your service, will only work with these generated proxies.
In a pure POCO world (interoperability, not all clients being .NET 4.0, ...), you cannot put constraints on your client. For instance, Facebook will not write a service that can only handle .NET 4.0 clients.
STEs may be a good choice in some environments; it all depends on your requirements.