Does Entity Framework automatically cache the ModelBuilder model? - entity-framework

I am developing an ASP.NET MVC application using Entity Framework. I was thinking of writing code to cache the object returned by ModelBuilder (as several sources recommend), but then I ran into this on Scott Gu's blog:
"The OnModelCreating method above will be called the first time our NerdDinners class is used within a running application, and it is passed a “ModelBuilder” object as an argument. The ModelBuilder object can be used to customize the database persistence mapping rules of our model objects. We’ll look at some examples of how to do this below.
"EF only calls the “OnModelCreating” method once within a running application – and then automatically caches the ModelBuilder results. This avoids the performance hit of model creation each time a NerdDinners class is instantiated, and means that you don’t have to write any custom caching logic to get great performance within your applications."
Does this mean that EF automatically caches the ModelBuilder object and I don't have to write code to do it, or is this something that is only done if the OnModelCreating method is overridden, or ...?

From the Entity Framework Blog regarding performance improvements in EF 4:
Model Caching
There is some cost involved in discovering the model, processing Data Annotations and applying fluent API configuration. To avoid incurring this cost every time a derived DbContext is instantiated the model is cached during the first initialization. The cached model is then re-used each time the same derived context is constructed in the same AppDomain. Model caching can be turned off by setting the CacheForContextType property on ModelBuilder to ‘false’ in the OnModelCreating method.
So the answer is yes for Entity Framework 4.0.
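To make this concrete, here is a minimal sketch of where the cached model is built (BlogContext and Post are hypothetical names; note that in the released EF 4.1 DbContext API the parameter is a DbModelBuilder, while the ModelBuilder name and the CacheForContextType switch quoted above come from the pre-release CTPs):

using System.Data.Entity;

public class Post
{
    public int Id { get; set; }
    public string Title { get; set; }
}

public class BlogContext : DbContext
{
    public DbSet<Post> Posts { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // Called only once per AppDomain for this context type; the resulting
        // model is cached and re-used by every later BlogContext instance.
        modelBuilder.Entity<Post>().Property(p => p.Title).HasMaxLength(200);
    }
}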

Related

EF Core 2.2: using the same DbContext instance to get a record from the database multiple times always returns the same result

I am facing an issue: we have an ASP.NET Core 2.2 Web API project with EF Core 2.2. We are using the default IoC container to create the DbContext with a scoped lifetime, and we have a socket pipeline connected to our ASP.NET Web API service.
I find that when we change the data in the web frontend, the socket pipeline always gets the old result (we are using .FirstOrDefault() to fetch the data, so it should not be a problem with the first-level cache).
So I inferred that it might be because of the scoped lifetime of the DbContext, and I changed it to a transient lifetime. And it works! We now get the modified record.
I have two questions:
Is that behavior of DbContext by design? Or do I have some tricky issue in my code?
How much performance will the transient lifetime DbContext cost? Since I may end up making every DbContext transient
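For reference, a minimal sketch of the registration change described above, in Startup.ConfigureServices (MyDbContext and the connection string name are placeholders):

// Default registration: scoped lifetime, one DbContext per request.
services.AddDbContext<MyDbContext>(options =>
    options.UseSqlServer(Configuration.GetConnectionString("Default")));

// The change that fixed it for me: transient lifetime, a fresh DbContext
// every time one is resolved. Register one or the other, not both.
services.AddDbContext<MyDbContext>(options =>
    options.UseSqlServer(Configuration.GetConnectionString("Default")),
    ServiceLifetime.Transient);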
1) Is that behavior of DbContext by design?
Yes
For each item in the result set: if this is a tracking query, EF checks whether the data represents an entity already in the change tracker for the context instance. If so, the existing entity is returned; if not, a new entity is created, change tracking is set up, and the new entity is returned.
How Queries Work
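To make the quoted behavior concrete, here is a minimal sketch (entity and property names are hypothetical) of how a long-lived context returns the stale tracked instance, and two ways to force fresh values:

// 'context' has been alive for a while and already tracks user #1, so a
// tracking query returns the old tracked instance, not the fresh row values.
var stale = context.Users.FirstOrDefault(u => u.Id == 1);

// Option 1: a no-tracking query materializes fresh values from the database.
var fresh = context.Users.AsNoTracking().FirstOrDefault(u => u.Id == 1);

// Option 2: reload the tracked entity in place.
context.Entry(stale).Reload();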
2) How much performance will the transient lifetime DbContext cost?
Very little, especially in ASP.NET Core, which has DbContext pooling (sketched below).
Since I may end up making every DbContext transient
But you shouldn't do that. Using a request-scoped DbContext is very useful. For instance, you can use the DbContext in various layers of your application without having to pass one around, and you can manage transactions more easily.
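If instantiation cost is a concern, here is a minimal sketch of DbContext pooling (MyDbContext is a placeholder); pooled instances are reset and reused rather than built from scratch:

// Available since EF Core 2.0: contexts are returned to a pool on dispose.
services.AddDbContextPool<MyDbContext>(options =>
    options.UseSqlServer(Configuration.GetConnectionString("Default")));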

A static DbContext object for read-only purposes in ASP.NET MVC WebAPI

I'm refactoring my ASP.NET MVC 4 WebAPI project for performance optimization reasons.
Within my controller code, I'm searching for entities in a context (DbContext, EF6). There are a few thousand such entities; new ones are added on an hourly basis (i.e. "slowly"), they are rarely deleted (and I don't care if deleted entities are still found in the context's cache!), and they are never modified.
After reading the answers to this question, to this one and a few more discussions, I'm still not sure it's a bad idea to use a single static DbContext for the purpose described above - a DbContext which never updates the database.
Performance-wise, I'm not worried about the instantiation cost, but rather about losing the benefit of cached entities if the DbContext is created for each request. I'm also using second-level caching, which makes keeping the context alive even more pertinent.
My questions are:
1. Regardless of the specific implementation, is a "static" DbContext a valid solution in my case?
2. If so, what would be the most appropriate way of implementing such a DbContext?
3. Should I periodically "flush" the context to clear the cache in order to prevent it from growing too big?
DbContext caches entity instances when you get/query the data. It ensures different queries that return the same data map to the same entity (based on type and id). Otherwise, if you modify the same entity in different object instances, the context would not know which one has the correct data. Therefore a static DbContext would blow up over time until the process crashes.
DbContexts should be short lived. Request.Properties is a good place to save it in Web API (maps to HttpContext.Items in IIS).
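A minimal sketch of that suggestion (the key, controller and context names are hypothetical): lazily create the context on first use, stash it in Request.Properties, and let Web API dispose it with the request:

public class ThingsController : ApiController
{
    private const string DbKey = "PerRequestDbContext";

    private MyDbContext Db
    {
        get
        {
            object value;
            if (!Request.Properties.TryGetValue(DbKey, out value))
            {
                value = new MyDbContext();
                Request.Properties[DbKey] = value;
                // Web API disposes registered resources when the request ends.
                Request.RegisterForDispose((IDisposable)value);
            }
            return (MyDbContext)value;
        }
    }
}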

Entity Framework Caching

I'm reading an article about the differences between NHibernate and EF, but I could not understand what they meant by caching on a field.
As for Entity Framework, the ObjectContext/DbContext holds the configuration, model and acts as the Unit of Work, holding references to all of the known entity instances. This class is therefore not lightweight as its NHibernate counterpart and it is not uncommon to see examples where an instance is cached on a field.
I did not link to the article because I was not 100% sure it was allowed.
Note the wording carefully; they are speaking of the DbContext itself, and comment that it is not uncommon to see examples where "the instance" (the DbContext) is cached on a field.
What they mean is, rather than creating and destroying a DbContext object with a local scope in a method, you'll see people save the DbContext instance to a field of a broader object and reuse it.
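In code, the contrast looks roughly like this (MyDbContext, Orders and Order are placeholder names):

// The pattern the article mentions: the context "cached on a field"
// of a broader object and reused across calls.
public class OrderReader
{
    private readonly MyDbContext _db = new MyDbContext();

    public Order Find(int id) { return _db.Orders.Find(id); }
}

// The alternative: create and destroy the context with local scope.
public class OrderReader2
{
    public Order Find(int id)
    {
        using (var db = new MyDbContext())
        {
            return db.Orders.Find(id);
        }
    }
}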

What is the overhead of Entity Framework tracking?

I've just been talking with a colleague about Entity Framework change tracking. We eventually figured out that my context interface should have
IDbSet<MyPoco> MyThings { get; }
rather than
IQueryable<MyPoco> MyThings { get; }
and that my POCO should also have all its properties as virtual.
Using the debugger we could then see the tracking objects and also that the results contained proxies to my actual POCOs.
If I don't have my POCO properties as virtual and have my context interface using IQueryable<> instead of IDbSet<> I don't get any of that.
In this instance I am only querying the database, but in the future will want to update the database via Entity Framework.
So, to make my life easier in the future when I come to look at this code as a reference, is there any performance penalty in having the tracking info/proxies there when I will never make use of them?
There is a performance penalty for tracking entities in EF. When you query, Entity Framework keeps a copy of the values loaded from the database. Also, a single context instance keeps track of only a single instance of each entity, so EF has to check whether it already has a copy of an entity before it creates an instance (i.e. there is a lot of comparison going on behind the scenes).
So avoid it if you don't need it. You can do so as follows.
IQueryable<MyPoco> MyThings { get { return db.MyThings.AsNoTracking(); } }
MSDN page on Stages of Query Execution details the cost associated with each step of query execution.
Edit:
You should not expose IDbSet<MyPoco> MyThings, because that tells the consumer of your API that your entities can be added, updated and deleted, when in fact you intend only to query the data.
Navigation properties in model classes are declared as virtual to enable lazy loading, which means a navigation property is loaded only when it is actually needed. As far as the entity objects are concerned, their main aim is to load the specific table records from the database into the DbSet, which comes from DbContext. You can't use IQueryable in this case, and it doesn't make any sense with the DbContext; IQueryable is an altogether different interface.
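A minimal sketch (entity names are hypothetical) of what marking a navigation property virtual buys you:

public class Order
{
    public int Id { get; set; }

    // virtual lets EF subclass Order with a dynamic proxy at runtime, so
    // Items is loaded from the database only when first accessed (lazy loading).
    public virtual ICollection<OrderItem> Items { get; set; }
}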

Which variant of Entity Framework to use in WCF based enterprise app

We are in a process of designing an application with approx 100 tables and complicated business logic. Windows Forms will be used on the client side and WCF services with MSSQL on the server.
Custom DTOs are used for client-server communication, business entities are not distributed.
Which variant of Entity Framework to use (and why):
EF 4.0 EntityObjects
EF 4.0 POCO
EF 4.1 DbContext
Something else
Database-first approach is a requirement.
Also, is it worth implementing a Repository pattern? It seems a bit redundant, as there is one level of abstraction in the mapping itself and another one in the use of DTOs. I'm currently leaning towards using auto-generated extendable repositories for each entity returning IQueryable, just to have a place to put common queries, while still allowing the Service Layer to query the entity model directly.
Which variant to use? Basically, once you have custom DTOs, the only questions are: do you want to have control over the entities' code (their base class) and make them independent of EF? Do you want to use code first? If the answers to all questions are no, you can use EntityObjects. If you want your entities to be persistence ignorant or to use a custom base class, you should go with POCOs. If you want to use code first or the new DbContext API, you will need EF 4.1. Some related topics:
EF 4.1 Code-first vs Model/Database-first
EF POCO code only VS EF POCO with Entity Data Model (this was related to CTP)
ADO.NET DbContext Generator vs. ADO.NET POCO Entity Generator
EF Model First or Code First Approach?
There are more things to consider when designing the service layer. You should be aware of the complications you will have to deal with when using EF in WCF. Your service will provide data to the WinForms application, which will work with it in "detached mode". Once the user has made all the changes he wants, he will post the data back to the service. But here comes the problem: you must tell EF what has changed. If you, for example, allow the user to change an order with all its order items (change quantities, add new items, delete some items), you must tell EF exactly what has changed, what was added and what was deleted. That is easy when you work with a single entity, but once you allow the user to change an object graph (especially many-to-many relations), it becomes quite tough. The most common solution is loading the whole graph and merging the state from the incoming DTOs into the loaded and attached graph. The other solution is using self-tracking entities instead of EntityObjects/POCOs + DTOs.
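A rough sketch of that "load the whole graph and merge" approach with the DbContext API (ShopContext, OrderDto and the mapping step are placeholders):

public void UpdateOrder(OrderDto dto)
{
    using (var context = new ShopContext())
    {
        // Load the current graph so the change tracker knows its state.
        var order = context.Orders
                           .Include("Items")
                           .Single(o => o.Id == dto.Id);

        // Merge the DTO into the attached graph: copy scalar values, add
        // entities for new DTO items, remove items the DTO no longer has.
        // EF then generates the correct INSERT/UPDATE/DELETE statements.
        // ... mapping code elided ...

        context.SaveChanges();
    }
}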
When discussing repositories, I would refer you to this answer, which references many other answers discussing repositories, their possible redundancy, and possible mistakes when using them just to make your code testable. Generally, each layer should be added only if there is a real need for it, such as better separation of concerns.
The main advantage of POCOs is that those classes can be your DTOs, so if you've already got custom DTOs that you're using, POCO seems a bit redundant. However, there are some other advantages which may or may not have value to you, since you didn't mention unit testing as a requirement. If you plan to write unit tests, then POCO is still the way to go. You probably won't notice much difference between 4.0 POCO and 4.1 since you won't be using the code-first feature (disclaimer: I've only used 4.0 POCO, so I'm not intimately familiar with any minor differences between the two, but they seem to be more or less the same--basically I was already using POCO in 4.0 and haven't seen anything that's made me want to update everything to use 4.1).
Also, depending on whether you plan to unit-test this layer, there's still value in implementing the repository/unit of work patterns when using Entity Framework. It serves to abstract away the data access logic (the context), not the entities themselves, and allows you to do things like mocking your context in unit tests. What I do is copy the T4 template for my context and use it to create the interface, then edit the T4 template for the context and have it implement that interface and use IObjectSet<T> instead of ObjectSet<T>. So instead of:
public class MyEntitiesContext
{
    public ObjectSet<MyClass> MyEntities
    ...
}
I end up with:
public interface IMyEntitiesContext
{
    IObjectSet<MyClass> MyEntities { get; }
}
and
public class MyEntitiesContext : IMyEntitiesContext
{
    public IObjectSet<MyClass> MyEntities
    ...
}
So I guess it really comes down to whether or not you plan to write unit tests for this layer. If you won't be doing anything that would require mocking out your context for testing, then the easiest thing to use would probably be 4.0 EntityObjects, since you aren't planning to pass your entities between layers and it would require the least effort to implement. If you plan to use mocking, then you'll probably want to use POCO and implement repository/unit of work.
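To round that out, here is a minimal in-memory fake for IObjectSet<T> (a common pattern from the EF 4 testability guidance; all names are placeholders) that a test double implementing IMyEntitiesContext can hand back instead of a real ObjectSet:

using System;
using System.Collections;
using System.Collections.Generic;
using System.Data.Objects;
using System.Linq;
using System.Linq.Expressions;

public class FakeObjectSet<T> : IObjectSet<T> where T : class
{
    private readonly HashSet<T> _data = new HashSet<T>();
    private readonly IQueryable<T> _query;

    public FakeObjectSet() { _query = _data.AsQueryable(); }

    // IObjectSet<T> members just manipulate the in-memory set.
    public void AddObject(T entity) { _data.Add(entity); }
    public void Attach(T entity) { _data.Add(entity); }
    public void DeleteObject(T entity) { _data.Remove(entity); }
    public void Detach(T entity) { _data.Remove(entity); }

    // IQueryable plumbing delegates to the LINQ-to-Objects wrapper.
    public Type ElementType { get { return _query.ElementType; } }
    public Expression Expression { get { return _query.Expression; } }
    public IQueryProvider Provider { get { return _query.Provider; } }
    public IEnumerator<T> GetEnumerator() { return _data.GetEnumerator(); }
    IEnumerator IEnumerable.GetEnumerator() { return _data.GetEnumerator(); }
}

A fake context implementing IMyEntitiesContext can then expose a FakeObjectSet<MyClass> from its MyEntities property, letting you unit-test repository code without touching a database.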