I have the following setup: WCF web services hosted in IIS, with Entity Framework 6 used to retrieve data from the DB. The web services are initialized in Global.asax.cs, which inherits from NinjectHttpApplication (so we use Ninject for dependency injection). In this NinjectHttpApplication, in the CreateKernel method, we bind the EF DbContext as follows:
protected override IKernel CreateKernel()
{
    var kernel = new StandardKernel();
    kernel.Bind<DbContext>().To<MyCustomContext>().InTransientScope();
    return kernel;
}
Then, every time a service is called, the context is obtained as follows in its constructor:
_context = kernel.Get<DbContext>();
Then, the service retrieves data from the DB as follows:
data = _context.Set<TEntity>().Where(<whatever filter>);
Having said that, my problem is the following: I have a service which is called many times (with a complex, long query with multiple joins), and every time it is called, EF takes ages to translate my LINQ to Entities code into the SQL it sends to the DB. The execution of the query in the DB itself is nothing (about 600 milliseconds), but EF takes ages to produce the SQL every single time this service is called. I suspect this is because kernel.Bind<DbContext>().To<MyCustomContext>().InTransientScope() forces EF to create a new instance of the DbContext for every call.
I've run a few tests with unit tests and the behavior is totally different: if you instantiate the service multiple times from the same unit test method and call it, EF takes long to produce the SQL only the first time; subsequent calls (same query but with different parameters to filter the data) take almost no time. From the unit test, CreateKernel() is of course only called once, in the Initialize() method (just like in the web service's Global.asax.cs), so I don't know what is causing this huge delay. I suspect EF is able to keep/cache the pre-compiled query with the unit test approach but not in the real web application. Any clue why?
Please note that the LINQ to Entities query is parameterized (strings and dates are the parameters).
Any help is very much appreciated.
I see that you bind your DbContext with InTransientScope(), which means that every time you get a DbContext from Ninject, it will create a new DbContext for you.
You could consider using InThreadScope() instead of InTransientScope(), which means Ninject will return the same instance as long as you are on the same thread.
There is also the singleton scope (InSingletonScope()), which always returns the same instance, but this will make the DbContext grow too big over time.
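For illustration, a minimal sketch of the suggested binding change, assuming the same CreateKernel override and MyCustomContext type from the question:

protected override IKernel CreateKernel()
{
    var kernel = new StandardKernel();
    // InThreadScope(): Ninject reuses one DbContext per thread across
    // kernel.Get<DbContext>() calls, instead of creating a fresh instance
    // per call as InTransientScope() does.
    kernel.Bind<DbContext>().To<MyCustomContext>().InThreadScope();
    return kernel;
}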
(This question was closed as a duplicate of "What is the best practice in EF Core for using parallel async calls with an Injected DbContext?", which already has answers.)
I have a problem with the concept of scope in dependency injection. I have registered my DbContext as scoped, and I save the user activity in a table using an asynchronous method without using "await".
// In Startup:
services.AddScoped<IDbContext, StorageSystemDbContext>();
services.AddScoped<IUserActivityService,UserActivityService>();
// In UserActivityService:
public async void LogUserActivityAsync(string controllerName, string actionName, ActionType actionType = ActionType.View, string data = "", string description = "")
{
    await InsertAsync(new UserActivity
    {
        ControllerName = controllerName,
        ActionName = actionName,
        ActionType = actionType,
        CreatedDateTime = DateTime.Now,
        Description = description,
        UserId = (await _workContext.CurrentUserAsync())?.Id
    });
}
// In Controller:
_userActivityService.LogUserActivityAsync(CurrentControllerName, CurrentActionName,data);
I get the following error when I call the same action twice in quick succession:
InvalidOperationException: A second operation was started on this context before a previous operation completed. This is usually caused by different threads concurrently using the same instance of DbContext. For more information on how to avoid threading issues with DbContext, see https://go.microsoft.com/fwlink/?linkid=2097913.
I expected a new DbContext to be created for the second request, given how the context dependency is registered, but according to this error a new context was not created for the second request and the previous one was used.
What is the reason for this?
I'm using ASP.NET Core MVC and EF Core on .NET 5.
A DbContext injected into a service will be one single reference for that service regardless of scoping, since it is constructor-injected; calling multiple methods on that service will always use the same instance. AddScoped with ASP.NET scopes the services (and the DbContext) to the web request. This is the recommended scoping for a DbContext: it ensures that all entities loaded during a request are tracked by the same DbContext instance, and that the DbContext stays alive for the life of that request (e.g. to provide lazy-loading support if needed).

A transient-scoped dependency would mean the DbContext passed to two different services would be distinct references. This leads to problems where service A calls another service to retrieve entities that it wants to associate with an entity it has loaded and is trying to update; those entities are tracked by a different DbContext, resulting in errors or issues such as duplicate data being created.
Even with a transient-scoped DbContext you would still have the exact same problem when trying to run two calls from the same service in parallel, and there are many good reasons referenced in the comments not to use un-awaited async calls to do so. Even if your intention is to await multiple calls together, the only way to enable something like that is to scope the DbContext internally within the method call itself. This would typically involve injecting a DbContextFactory-type class rather than a DbContext into the service, where the DbContextFactory is a dependency that can initialize and provide a new DbContext; then:
using (var context = _contextFactory.Create())
{
    // operations with the DbContext (context)
}
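As a rough sketch of that factory approach, assuming EF Core 5+, where the built-in IDbContextFactory<TContext> can play the role of the _contextFactory above (the StorageSystemDbContext and UserActivity types come from the question; the SQL Server provider, connection string name, and simplified method signature are illustrative; the method returns Task so callers can await it, per the advice above):

// In Startup:
services.AddDbContextFactory<StorageSystemDbContext>(options =>
    options.UseSqlServer(Configuration.GetConnectionString("Default")));

// In UserActivityService:
private readonly IDbContextFactory<StorageSystemDbContext> _contextFactory;

public UserActivityService(IDbContextFactory<StorageSystemDbContext> contextFactory)
{
    _contextFactory = contextFactory;
}

public async Task LogUserActivityAsync(UserActivity activity)
{
    // Each call creates and disposes its own DbContext, so parallel or
    // un-awaited calls no longer collide on a single scoped instance.
    using var context = _contextFactory.CreateDbContext();
    context.Set<UserActivity>().Add(activity);
    await context.SaveChangesAsync();
}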
Even then you need to consider database synchronization guards such as row and table locks / deadlocks, which can rear their heads if you have a significant number of operations happening in parallel. Keep in mind that with web applications the web server can be responding to a significant number of requests in parallel, each of which could be kicking off these processes at any time. (It works fine during development with one client, then crawls or dies in the real world.)
I found the answer here:
https://stackoverflow.com/a/44121808/4604557
If for some reason you want to run parallel database operations (and think you can avoid deadlocks, concurrency conflicts etc.), make sure each one has its own DbContext instance. Note however, that parallelization is mainly useful for CPU-bound processes, not IO-bound processes like database interaction. Maybe you can benefit from parallel independent read operations but I would certainly never execute parallel write processes. Apart from deadlocks etc. it also makes it much harder to run all operations in one transaction.
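A minimal sketch of that advice, under the same assumptions as above (EF Core with an injected IDbContextFactory; the Order and Customer entity types are illustrative): each parallel read operation gets its own DbContext instance.

public async Task<(int orders, int customers)> GetCountsInParallelAsync()
{
    // Each operation creates and disposes its own context, so no DbContext
    // instance ever sees two concurrent operations.
    async Task<int> CountOrdersAsync()
    {
        using var context = _contextFactory.CreateDbContext();
        return await context.Set<Order>().CountAsync();
    }

    async Task<int> CountCustomersAsync()
    {
        using var context = _contextFactory.CreateDbContext();
        return await context.Set<Customer>().CountAsync();
    }

    var ordersTask = CountOrdersAsync();
    var customersTask = CountCustomersAsync();
    await Task.WhenAll(ordersTask, customersTask);
    return (await ordersTask, await customersTask);
}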
I am writing some unit tests for database creation using EF Code First.
During execution of the unit tests, the DbContext.OnModelCreating method is executed only once, and the model is then cached for the remaining tests.
I want to be able to execute OnModelCreating for each unit test separately, which I tried to do by setting the ModelCaching property mentioned in the documentation:
// Remarks:
// Typically, this method is called only once when the first instance of a derived
// context is created. The model for that context is then cached and is for all
// further instances of the context in the app domain. This caching can be disabled
// by setting the ModelCaching property on the given ModelBuidler, but note that
// this can seriously degrade performance. More control over caching is provided
// through use of the DbModelBuilder and DbContextFactory classes directly.
protected virtual void OnModelCreating(DbModelBuilder modelBuilder);
However, there is no such "ModelCaching" property on the DbModelBuilder.
How else can I disable this model caching? The tests run fine one by one, but because of this caching they fail when run in a row.
Put differently: how can I force ApplicationDbContext.OnModelCreating to run for each test individually? Right now it runs only once, when the context is first used, for the whole batch of unit tests.
It seems this property is not available anymore. You need to keep different DB models and initialize your context differently per connection.
This answer helped me in my case: by implementing IDbModelCacheKeyProvider, EF can cache multiple DB models for you, keyed on different CacheKey values.
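As an illustrative sketch (EF6; how you pick the key per test is up to you and is assumed here), the context implements IDbModelCacheKeyProvider and reports a different CacheKey, so EF builds and caches a separate model, and runs OnModelCreating, once per key:

using System.Data.Entity;
using System.Data.Entity.Infrastructure;

public class ApplicationDbContext : DbContext, IDbModelCacheKeyProvider
{
    private readonly string _cacheKey;

    public ApplicationDbContext(string connectionString, string cacheKey)
        : base(connectionString)
    {
        _cacheKey = cacheKey;
    }

    // EF uses this value when looking up the cached model, so contexts
    // created with different keys each trigger their own OnModelCreating.
    public string CacheKey
    {
        get { return _cacheKey; }
    }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // per-test model configuration goes here
        base.OnModelCreating(modelBuilder);
    }
}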
I have a problem using Entity Framework and COM+. The architecture is this:
A windows service calls four methods of a COM, every n minutes.
COM+ then calls the corresponding methods on a Business Layer.
The Business Layer calls the DAL; the DAL then returns a list to the Business Layer. This is done by calling .ToList().
When I try running the service, the DAL methods return a timeout inner exception. When I try to view the table from Enterprise Manager, it times out as well! From what I've seen, the SELECT statements block the other connection instances.
Has anyone else experienced similar problems?
P.S. I cannot post any code yet because I am not at my work... Will do so tomorrow.
Well, it turns out Entity Framework had nothing to do with any of the above.
The problem was within COM+: I should have ended each COM+ method with ContextUtil.SetComplete(). Apparently I didn't, so the transaction stayed active and, after the first few calls, it locked my DB.
e.g.
Using MyEntity As New EntityObject
    MyEntity.Connection.Open()
    Dim rows = From tbl In MyEntity.MyTable _
               Select tbl
    list = rows.ToList()
End Using
ContextUtil.SetComplete()
Please note that if an exception occurs, you should call ContextUtil.SetAbort() instead. I should also note that the above code is mixed: it is partly from my DAL layer and partly from my COM+ component. I just put it together like that to make the example clearer...
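A hedged C# sketch of that pattern (the component and method names are illustrative; the point is the SetComplete/SetAbort pair around the transactional work):

using System.EnterpriseServices;

[Transaction(TransactionOption.Required)]
public class DataComponent : ServicedComponent
{
    public void DoWork()
    {
        try
        {
            // ... call the Business Layer / DAL here ...

            // Vote to commit: COM+ can complete the transaction and
            // release its locks instead of keeping them held.
            ContextUtil.SetComplete();
        }
        catch
        {
            // Vote to abort: the transaction rolls back rather than
            // staying active and blocking other readers.
            ContextUtil.SetAbort();
            throw;
        }
    }
}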
Entity Framework: Calling 'Read' when DataReader is closed
I am getting this problem intermittently when I pound my service with parallel asynchronous calls.
I understand that the reader is accessed when calling .ToList() on my defined EF query.
I would like to find out the best practice for constructing EF queries to avoid this and similar problems.
My architecture is as follows:
My Entity Data Layer is a static class, with a static constructor, which instantiates my Entities (_myEntities). It also sets properties on my entities such as MergeOption.
This static class exposes public static methods which simply access the Entities.
public static List<SomeEntity> GetSomeEntity(Criteria c) {
    ...
    var q = _myEntities.SomeEntity.Where(predicate);
    return q.ToList();
}
This has been working in production for some time, but the error above and the one here happen intermittently, especially under heavy load from clients.
I am also currently setting MultipleActiveResultSets=True in my connection string.
And that is the source of all your problems. Don't use a shared context, and don't use a shared context as a data cache or central data access object; that should be one of the main rules in EF. It is also the reason why you need MARS (our discussion from the previous question is resolved now). When multiple clients execute queries on your shared context at the same time, it opens multiple DataReaders on the same DB connection.
I'm not sure why you get your current exception, but I'm sure that you should redesign your data access approach. If you also modify data on the shared context, redesigning it is a must.
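A minimal sketch of that redesign, reusing the names from the question (the MyEntities context type and the BuildPredicate helper are assumptions; the point is one short-lived context per call):

public static List<SomeEntity> GetSomeEntity(Criteria c)
{
    // Each call gets its own context, connection and DataReader, so
    // concurrent callers never share state and MARS is no longer required
    // just to make this work.
    using (var context = new MyEntities())
    {
        var predicate = BuildPredicate(c); // hypothetical helper for the original filter
        return context.SomeEntity.Where(predicate).ToList();
    }
}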
The issue may also come from a command timeout when trying to get a huge amount of data from your database, so try increasing the command timeout in your code as below:
Entity Framework 5:
((IObjectContextAdapter)this.context).ObjectContext.CommandTimeout = 1800;
Entity Framework 6 and later:
this.context.Database.CommandTimeout = 1800;
Consider this class:
public class XQueries
{
    public IQueryable Query1()
    {
        using (XEntities context = new XEntities())
        {
            return something;
        }
    }

    public IQueryable Query2()
    {
        using (XEntities context = new XEntities())
        {
            return somethingElse;
        }
    }
}
Is a connection to the database created for every using (XEntities context = new XEntities()) { ... } block? If so, what is the correct way to create a static UnitOfWork class so that only one connection exists?
You can't create a static unit of work, because by definition a unit of work is a short-lived object. Because the EF ObjectContext is designed around the unit of work pattern, it is a bad idea to have a single ObjectContext instance for the lifetime of the application. There are several reasons for this.
First of all, the ObjectContext class is not thread-safe. This means that during the unit of work of one user (in a web app, for instance), another user can commit his unit of work. When they share the same ObjectContext, just half of the changes are persisted and the changes are not transactional. When you are lucky, the ObjectContext fails and throws an exception. When you are unlucky, you corrupt the ObjectContext, save and load garbage from and to your database, and only find out once your application is running in production (of course, during testing and staging everything always seems to work).
Second, the ObjectContext has a caching mechanism that is designed for it to be short-lived. When an entity is retrieved from the database, it stays in the ObjectContext's cache until that instance is garbage collected. When you keep that instance alive for a long period of time, entities get stale, especially if that particular ObjectContext instance is not the only one writing to that database.
The Entity Framework opens connections only when required, for example to execute a query or to call SaveChanges, and then closes the connection when the operation is complete.
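To tie this back to the XQueries class from the question, a minimal sketch under the unit-of-work-per-method idea (entity and property names are illustrative): keep the context inside the method, and return materialized results rather than an IQueryable, since an un-materialized query returned from inside the using block would only execute after the context has been disposed.

public class XQueries
{
    public List<SomeEntity> Query1()
    {
        using (var context = new XEntities())
        {
            // ToList() executes the query now, while the context and its
            // connection are still available; the connection is opened for
            // the query and closed again when it completes.
            return context.SomeEntities.Where(e => e.IsActive).ToList();
        }
    }
}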
From Martin Fowler's book Patterns of Enterprise Application Architecture, with respect to Unit of Work:
When you're pulling data in and out of a database, it's important to keep track of what you've changed; otherwise, that data won't be written back into the database. Similarly you have to insert new objects you create and remove any objects you delete. You can change the database with each change to your object model, but this can lead to lots of very small database calls, which ends up being very slow. Furthermore it requires you to have a transaction open for the whole interaction, which is impractical if you have a business transaction that spans multiple requests. The situation is even worse if you need to keep track of the objects you've read so you can avoid inconsistent reads.

A Unit of Work keeps track of everything you do during a business transaction that can affect the database. When you're done, it figures out everything that needs to be done to alter the database as a result of your work.
Whenever I use Entity Framework for a client (which I'll admit is rare), the ObjectContext object is the Unit of Work implementation for the system; that is, the ObjectContext will more or less meet the statements above. Rather than concentrating too much on the absolutely correct definition, using the ObjectContext makes things a little easier for you.
Do some research on DI/IoC and Repository patterns; this will give you more flexibility in handling your problem.