I have a project that has many databases. I want to set a retrying execution strategy for all of them but one; that one requires user-initiated transactions, so it's not compatible.
The documentation says to set the strategy in the DbConfiguration, which is application-wide and only supports one configuration per application. I can't see a way to set different execution strategies for different contexts. Is it all or nothing, or is there another way to set this?
Either use the workarounds detailed here: Entity Framework Limitations with Retrying Execution Strategies (EF6 onwards)
Or use one of the techniques for overriding the DbConfiguration, or for using a different DbConfiguration, described here: Entity Framework Code-Based Configuration (EF6 onwards)
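For example, the first article above describes a suspend-flag pattern: you keep the single application-wide DbConfiguration, but let code opt out of retries around explicit transactions. A minimal sketch along those lines (the SqlAzureExecutionStrategy and the flag name are illustrative choices):

```csharp
using System.Data.Entity;
using System.Data.Entity.Infrastructure;
using System.Data.Entity.SqlServer;
using System.Runtime.Remoting.Messaging;

public class MyConfiguration : DbConfiguration
{
    public MyConfiguration()
    {
        // Retry by default; fall back to the non-retrying strategy
        // whenever the suspend flag is set for the current call context.
        SetExecutionStrategy("System.Data.SqlClient", () =>
            SuspendExecutionStrategy
                ? (IDbExecutionStrategy)new DefaultExecutionStrategy()
                : new SqlAzureExecutionStrategy());
    }

    // Stored in the logical call context so it flows with async calls,
    // giving each logical request its own value.
    public static bool SuspendExecutionStrategy
    {
        get { return (bool?)CallContext.LogicalGetData("SuspendExecutionStrategy") ?? false; }
        set { CallContext.LogicalSetData("SuspendExecutionStrategy", value); }
    }
}
```

The one context that needs user-initiated transactions would set MyConfiguration.SuspendExecutionStrategy = true before starting a transaction (and reset it in a finally block), while every other context keeps the retrying behavior.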
I'm working on an ASP.net Web API application with Autofac and Entity Framework.
I've been breaking my service classes apart into smaller classes to make my code more testable and more SOLID.
I'm using Autofac to inject an Entity Framework DbContext into my various helper classes. This becomes problematic because, if I use entities queried from the DbContext in two different helper classes, I get an error when Entity Framework tries to produce a query.
The error says that Entity Framework cannot produce a query with entities from two different instances of DbContext.
Clearly, the solution is to configure Autofac so that the same instance of DbContext is injected into each of the helper classes. But I'm afraid that if I do this, I may get concurrency issues once the application is deployed to a production environment and many people use it at once.
How do I configure Autofac so that when a request hits my application, my API helper classes all get the same instance of DbContext, but I don't have concurrency issues across multiple requests?
As an alternative to the action-filter workaround recommended by the Autofac documentation (https://autofaccn.readthedocs.io/en/latest/faq/per-request-scope.html#no-per-request-filter-dependencies-in-web-api, see "No Per-Request Filter Dependencies in Web API") and to manually going to the DependencyResolver elsewhere:
You could have a look at Mehdime's DbContextScope unit-of-work provider (https://www.nuget.org/packages/EntityFramework.DbContextScope/), compiled for both EF6 and EF Core.
The injected dependencies for your classes become a DbContextScopeFactory at the top level and an AmbientDbContextLocator for your services. These don't "break" with Web API's limitation on the request lifetime scope: the DbContextScopeFactory is initialized once and supplies the DbContext, while the locators are fed that single instance.
It may be worth a look if managing context references across services and API actions proves clunky.
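A rough sketch of how the pieces fit together; MyDbContext, Order, OrderService, and the controller are all hypothetical stand-ins, and the exact namespaces depend on which DbContextScope package you pick up:

```csharp
using System.Data.Entity;
using System.Web.Http;
// IDbContextScopeFactory and IAmbientDbContextLocator come from the DbContextScope package.

public class Order
{
    public int Id { get; set; }
    public string Product { get; set; }
}

public class MyDbContext : DbContext
{
    public DbSet<Order> Orders { get; set; }
}

// Top-level Web API action: owns the scope and commits it exactly once.
public class OrdersController : ApiController
{
    private readonly IDbContextScopeFactory _scopeFactory;
    private readonly OrderService _orderService;

    public OrdersController(IDbContextScopeFactory scopeFactory, OrderService orderService)
    {
        _scopeFactory = scopeFactory;
        _orderService = orderService;
    }

    public IHttpActionResult Post(string product)
    {
        using (var scope = _scopeFactory.Create())
        {
            _orderService.PlaceOrder(product); // helper sees the same ambient DbContext
            scope.SaveChanges();               // one commit for the whole request
            return Ok();
        }
    }
}

// Helper/service class: locates the ambient context instead of owning one.
public class OrderService
{
    private readonly IAmbientDbContextLocator _locator;

    public OrderService(IAmbientDbContextLocator locator)
    {
        _locator = locator;
    }

    public void PlaceOrder(string product)
    {
        var db = _locator.Get<MyDbContext>(); // same instance for every helper in this scope
        db.Orders.Add(new Order { Product = product });
    }
}
```

In the Autofac container you would register DbContextScopeFactory and AmbientDbContextLocator once each (they are stateless, so singleton registrations are fine), which is what sidesteps the Web API per-request lifetime scope problem.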
I am using Entity Framework 6 database-first. I am converting the project to implement the onion architecture to move towards better separation of concerns. I have read many articles and watched many videos but having some issues deciding on my solution structure.
I have 4 projects: Core, Infrastructure, Web & Tests.
From what I've learned, the .edmx file should be placed under my "Infrastructure" folder. However, I have also read about using the Repository and Unit of Work patterns to assist with EF decoupling and using Dependency Injection.
With this being said:
Will I have to create repository interfaces under Core for ALL entities in my model? If so, how would one maintain this with a huge database? I have looked into AutoMapper, but found issues with it presenting IEnumerables vs. IQueryables, although there is an extension available to help with this. I can dig deeper down this route, but want to hear back first.
As an alternative, should I leave my .edmx in Infrastructure and move the .tt T4 files for my entities to Core? Does this introduce any tight coupling, or is it a good solution?
Would a generic repository interface work well with the suggestion you provide? Or does EF6 already address the Repository and UoW patterns by itself?
Thank you for looking at my question and please present any alternative responses as well.
I found a similar post here that was not answered:
EF6 and Onion architecture - database first and without Repository pattern
Database first doesn't completely rule out Onion Architecture (aka Ports and Adapters or Hexagonal Architecture, so if you see references to those, they're the same thing), but it's certainly more difficult. Onion Architecture and the separation of concerns it allows fit very nicely with domain-driven design (I think you mentioned on Twitter that you'd already seen some of my videos on this subject on Pluralsight).
You should definitely avoid putting the EDMX in the Core or Web projects - Infrastructure is the right location for that. At that point, with database-first, you're going to have EF entities in Infrastructure. You want your business objects/domain entities to live in Core, though. At that point you basically have two options if you want to continue down this path:
1) Switch from database first to code first (perhaps using a tool) so that you can have POCO entities in Core.
2) Map back and forth between your Infrastructure entities and your Core objects, perhaps using something like AutoMapper. Before EF supported POCO entities this was the approach I followed when using it, and I would write repositories that only dealt with Core objects but internally would map to EF-specific entities.
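A minimal sketch of option 2, assuming a database-first context called MyEdmxContext with an EF-generated CUSTOMERS set (all names here are illustrative), with the repository dealing only in Core objects:

```csharp
using AutoMapper;

// Core: persistence-ignorant domain object and repository contract.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public interface ICustomerRepository
{
    Customer GetById(int id);
    void Update(Customer customer);
}

// Infrastructure: the repository maps to the EF-generated entities internally.
public class CustomerRepository : ICustomerRepository
{
    private readonly MyEdmxContext _db;  // the database-first context
    private readonly IMapper _mapper;

    public CustomerRepository(MyEdmxContext db, IMapper mapper)
    {
        _db = db;
        _mapper = mapper;
    }

    public Customer GetById(int id)
    {
        var entity = _db.CUSTOMERS.Find(id);   // EF-generated entity type
        return _mapper.Map<Customer>(entity);  // EF entity -> Core object
    }

    public void Update(Customer customer)
    {
        var entity = _db.CUSTOMERS.Find(customer.Id);
        _mapper.Map(customer, entity);         // copy changes onto the tracked entity
        _db.SaveChanges();
    }
}
```

This assumes a MapperConfiguration with CreateMap registered in both directions at startup; the point is that nothing outside Infrastructure ever sees the EF entity types.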
As to your questions about Repositories and Units of Work, there's been a lot written about this already, on SO and elsewhere. You can certainly use a generic repository implementation to allow for easy CRUD access to a large set of entities, and it sounds like that may be a quick way for you to move forward in your scenario. However, my general recommendation is to avoid generic repositories as your go-to means of accessing your business objects, and instead use Aggregates (see DDD or my DDD course w/Julie Lerman on Pluralsight) with one concrete repository per Aggregate Root. You can separate out complex business entities from CRUD operations, too, and only follow the Aggregate approach where it is warranted. The benefit you get from this approach is that you're constraining how the objects are accessed, and getting similar benefits to a Facade over your (large) set of database entities.
Don't feel like you can only have one DbContext per application. It sounds like you are evolving this design over time rather than starting with a green-field application. To that end, you could keep your .edmx file, and perhaps a generic repository for CRUD purposes, but then create a new code-first DbContext for a specific set of operations that warrant POCO entities, separation of concerns, increased testability, etc. Over time, you can shift the bulk of the essential code to use this, while still keeping the existing DbContext so you don't lose any current functionality.
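As a sketch of that idea, a small code-first context can sit alongside the existing EDMX context and map its POCOs onto the same tables (the connection string name and Order entity are placeholders):

```csharp
using System.Data.Entity;

public class Order
{
    public int Id { get; set; }
    public decimal Total { get; set; }
}

// A new, focused code-first context that coexists with the EDMX context.
public class OrderingContext : DbContext
{
    static OrderingContext()
    {
        // The database already exists; never let this context create or migrate it.
        Database.SetInitializer<OrderingContext>(null);
    }

    public OrderingContext() : base("name=MyConnectionString") { }

    public DbSet<Order> Orders { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // Map the POCO onto the existing table the EDMX also knows about.
        modelBuilder.Entity<Order>().ToTable("Orders");
    }
}
```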
I am using Entity Framework 6.1 in my DDD project. Code first works out very well if you want to do Onion Architecture.
In my project we have completely isolated the Repository from the Domain Model. The Application Service is what uses the repository to load aggregates from, and persist aggregates to, the database. Hence, there are no repository interfaces in the domain (core).
The second option, using T4 to generate POCOs in a separate assembly, is a good idea. Please remember that your domain model (core) should be persistence-ignorant.
While generic repositories are good for enforcing aggregate-level operations, I prefer specific repositories, simply because not every aggregate is going to need all of those generic repository operations.
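To make that concrete, a repository contract scoped to a single hypothetical Invoice aggregate might expose only the operations that aggregate actually needs:

```csharp
using System.Collections.Generic;

public class Invoice
{
    public int Id { get; set; }
    public int CustomerId { get; set; }
    public bool IsPaid { get; set; }
}

// One concrete repository contract per aggregate root, exposing only the
// operations this aggregate needs; no generic CRUD surface.
public interface IInvoiceRepository
{
    Invoice GetById(int id);
    IReadOnlyList<Invoice> ListUnpaidFor(int customerId);
    void Add(Invoice invoice);
    // Deliberately no Delete: in this example the domain never deletes invoices.
}
```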
http://codingcraft.wordpress.com/
I used the EF 5.x DbContext Fluent Generator to generate my POCO classes, but my properties are not coded as virtual. Don't you have to have that for change tracking to occur? Why wouldn't the template already mark properties as virtual?
Because we found that for the majority of users it was better to use snapshot change tracking rather than change tracking proxies. Change tracking proxies have their place in certain situations, but usually they add complexity without any real benefit. For more info see http://blog.oneunicorn.com/2011/11/24/why-are-the-dbcontext-t4-templates-so-different-from-the-ef4-poco-templates/ and http://blog.oneunicorn.com/2011/12/05/should-you-use-entity-framework-change-tracking-proxies/
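To make the distinction concrete, here is what the two schemes expect of a (hypothetical) entity class:

```csharp
using System.Collections.Generic;

public class Blog
{
    public int Id { get; set; }

    // Non-virtual scalar property: works fine with snapshot change tracking.
    // DetectChanges compares the current value against the snapshot taken
    // when the entity was queried or attached.
    public string Title { get; set; }

    // A virtual navigation property is enough to get a lazy-loading proxy.
    public virtual ICollection<Post> Posts { get; set; }
}

public class Post
{
    public int Id { get; set; }
    public string Content { get; set; }
}

// For change-tracking proxies, by contrast, EVERY mapped property
// (scalars included) must be virtual, and the class must be public
// and not sealed.
```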
I have started working on a new project and am switching from LINQ to SQL to EF 4.1 as my ORM.
I already have a database set up to work with, so I am going with the database-first approach. By default, EF generates a context that extends ObjectContext. I wanted to know whether a good approach would be to replace it with DbContext.
Most of the available examples deal only with Code First and DbContext, but DbContext can be used with Database First too. Are there any advantages to using DbContext? From what I have read, DbContext is a simplified version of ObjectContext and is easier to work with. Are there any other advantages or disadvantages?
You will not replace anything manually. You will need the DbContext T4 Generator available in the VS Gallery. Don't touch your autogenerated files; your changes will be lost every time you modify the EDMX file.
I answered a similar question last year. Now my answer is mostly: for new users, the DbContext API is probably better. The DbContext API is simplified, both in terms of usage and features, but you can still get the ObjectContext from a DbContext and use the features available only in the ObjectContext API. On the other hand, the DbContext API carries some additional performance overhead and an additional layer where bugs can live. In a simple project you will probably not notice any disadvantage in the DbContext API: you will not see the performance impact, you will not use the corner features available only in ObjectContext, and you will not be affected by occasional bugs.
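For example, dropping down to the ObjectContext from a DbContext looks roughly like this (MyDbContext stands in for your own context):

```csharp
using System.Data.Entity;
using System.Data.Entity.Infrastructure;

public class MyDbContext : DbContext { }  // stand-in for your own context

public static class ObjectContextExample
{
    public static void UseObjectContextFeature()
    {
        using (var context = new MyDbContext())
        {
            // A DbContext always wraps an ObjectContext; the adapter
            // interface exposes it when you need the lower-level API.
            var objectContext = ((IObjectContextAdapter)context).ObjectContext;

            // Example of a setting surfaced only on the ObjectContext.
            objectContext.CommandTimeout = 120;
        }
    }
}
```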
A lot of information and blog posts have accumulated since the DbContext API was released, so you don't have to worry about being unable to find descriptions of the API. Also, the ADO.NET team now treats the DbContext API as their flagship.
I'm not a big fan of the DbContext API, but my opinion is not related to its functionality, only to its existence: there is no need to have two APIs, and splitting the ADO.NET team's development capacity to maintain and fix two APIs that do the same thing only means there is less capacity for implementing genuinely new features.
I'm using it now with Oracle on an add-on to an existing application. The simplification that Ladislav refers to works well for me on this project, as I am short on time and resources. I have not found any gotchas as long as you stick to simple CRUD operations and fewer than ~150 tables.
You can still use metadata annotations to provide basic validation and localization, and there is enough documentation out there, though you won't find much on official Microsoft sites.
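For instance, a sketch of annotation-driven validation on a hypothetical entity; DbContext validates these annotations during SaveChanges, and MVC can reuse them for UI validation and localization:

```csharp
using System.ComponentModel.DataAnnotations;

public class Product  // hypothetical entity
{
    public int Id { get; set; }

    [Required]
    [StringLength(100)]
    public string Name { get; set; }

    [Range(0.0, 9999.99)]
    public decimal Price { get; set; }
}
```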
I used Entity Framework with a database having around 50 tables and it worked just fine.
But just to see what happens with a larger database in terms of the number of tables/entities, I tried to use Entity Framework against a database that had around 100+ tables.
Once I selected all the tables and clicked the Finish button in the Entity Framework wizard, it just hung my VS 2010, so I could not get any results.
My questions are as below:
1. If I have a larger database in terms of tables/entities as described above, is it a good idea to use Entity Framework?
2. What would be a better approach to working with the database using Entity Framework?
3. Should I create multiple DataContexts or EDMX files with fewer entities in them?
4. How will these different DataContexts interact with each other?
5. Is there any recommended number of tables that should be used while working with Entity Framework?
@Will is correct that the limitation you're seeing is in the designer, but it's not the only one, so Code First doesn't necessarily fix the problem.
If the designer seems slow, it's inconvenient, but not the end of the world. Runtime performance considerations are another thing altogether. For performance-critical tasks and tuning, you'll want to understand the whole pipeline.
View generation, for example, takes time. You can move it to compile time with some manual work (pre-generated views).
1. If I have a larger database in terms of tables/entities as described above, is it a good idea to use Entity Framework?
I certainly wouldn't let it stop you.
2. What would be a better approach to working with the database using Entity Framework?
3. Should I create multiple DataContexts or EDMX files with fewer entities in them?
That's certainly a good approach for many applications.
4. How will these different DataContexts interact with each other?
Mostly not. A single, giant data model is often a bad idea due to service coupling. However, you can selectively couple them by sharing portions of the models with includes in EDMX or classes in code-first.
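A sketch of that selective coupling in code-first terms (all classes here are hypothetical): two smaller contexts over the same database, sharing only the Customer class where the models genuinely overlap.

```csharp
using System.Data.Entity;

// Shared class: the only deliberate coupling point between the two models.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class Order
{
    public int Id { get; set; }
    public int CustomerId { get; set; }
}

public class Ticket
{
    public int Id { get; set; }
    public int CustomerId { get; set; }
}

// Each context maps only the tables its area of the application needs.
public class SalesContext : DbContext
{
    public DbSet<Customer> Customers { get; set; }
    public DbSet<Order> Orders { get; set; }
}

public class SupportContext : DbContext
{
    public DbSet<Customer> Customers { get; set; }
    public DbSet<Ticket> Tickets { get; set; }
}
```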
5. Is there any recommended number of tables that should be used while working with Entity Framework?
One way is to use smaller models, as you've suggested. Another way is to work around the runtime performance issues which sometimes come with larger models (see the links I give above). Like any potential performance "problem", write correct code first, then profile and fix the slow parts. Usually, query tuning is more important than model size anyway.
EF, probably yes. The toolset in Visual Studio? Not so much, apparently. For a database this big, you might want to do Code First.
I think EF itself doesn't have performance limitations on the number of tables, but it does on the number of records in a particular table. To get away from the designer problems in VS 2010, you have to do the object-database mapping manually (i.e., write the classes for your tables and their corresponding attributes by hand).
That's a standard approach in Hibernate, but in EF it probably isn't.
Entity Framework is the best way to develop database applications.
I used to develop my applications using LINQ to SQL, but since Microsoft is not going to support it in the future, they recommend using Entity Framework.
By the way, Entity Framework 4 in .NET 4 has much better performance than previous versions.
I'm currently developing an enterprise application using Entity Framework and it supports all my needs.
I suggest using Entity Framework.