MVC3 and EF Data first: what are the best practices?

It seems that most of the focus with MVC3 and EF4.1 is around "code first" - I can't seem to find any examples or tutorials that meet the following criteria:
uses an existing SQL Server database
has separate projects for web & data access (we will have multiple web apps sharing the same data access classes)
includes recommendations for validation
Does such an example or tutorial exist? Are there any documented "best practices" for how to accomplish this, or rationale for NOT having a solution structured this way?

It is quite a common scenario, and it depends on whether you want to use an EDMX file for mapping or have the mapping defined in code (as with code first).
Both approaches can be done database first:
Create an EDMX from the existing database with the built-in EF tools in Visual Studio, then use the DbContext T4 generator template to get POCO classes and a DbContext-derived class.
Download the EF Power Tools CTP and use its reverse-engineering feature to generate the code mapping, POCO classes and context for you.
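Either route ends up producing roughly the same shape of code. The sketch below is illustrative only: the class names, property names and connection string name are assumptions, not what the tools would generate for your actual schema.

```csharp
using System.Collections.Generic;
using System.Data.Entity;

// Rough shape of what the DbContext T4 template (or the Power Tools reverse
// engineering) produces for a hypothetical Customers/Orders schema.
public partial class Customer
{
    public Customer()
    {
        Orders = new HashSet<Order>();
    }

    public int CustomerId { get; set; }
    public string Name { get; set; }

    public virtual ICollection<Order> Orders { get; set; }
}

public partial class Order
{
    public int OrderId { get; set; }
    public int CustomerId { get; set; }

    public virtual Customer Customer { get; set; }
}

public partial class ShopEntities : DbContext
{
    public ShopEntities() : base("name=ShopEntities") { }

    public DbSet<Customer> Customers { get; set; }
    public DbSet<Order> Orders { get; set; }
}
```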
Neither of these approaches will add data annotations. Data annotations on entities should not be used for client validation (that is bad practice) unless you are building a very simple application. Usually your views have more advanced expectations, and validation in a view can differ from validation on the entity. For example, an insert view and an update view may need different validations, which cannot be expressed with a single set of data annotations on the entity. Because of that, you should move the validation data annotations to specialized view models and transform your entities to view models and vice versa (you can use AutoMapper to simplify this).
Anyway, it is possible to add data annotations to the generated classes via buddy classes, but as mentioned, that is not a good practice.
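As a concrete illustration of the view-model approach, here is a minimal sketch; CustomerEditModel and its rules are made up, and the older AutoMapper static API (Mapper.CreateMap) is assumed:

```csharp
using System.ComponentModel.DataAnnotations;
using AutoMapper;

// Validation rules live on the view model, not on the generated entity.
public class CustomerEditModel
{
    [Required]
    [StringLength(100)]
    public string Name { get; set; }

    [Range(0, 5)]
    public int Rating { get; set; }
}

public static class MappingConfig
{
    public static void Register()
    {
        // Classic static AutoMapper API; newer versions use MapperConfiguration.
        Mapper.CreateMap<Customer, CustomerEditModel>();
        Mapper.CreateMap<CustomerEditModel, Customer>();
    }
}
```

A controller action would then call Mapper.Map<CustomerEditModel>(entity) on the way out and map the posted view model back onto the entity before saving.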

Related

Getting familiar with Entity Framework when using existing database

We are currently rewriting an existing internal ASP.NET Web Forms application. Our application consists of a Web Api back end which uses Entity Framework 6 for data access and a front end which uses AngularJS.
We have an existing large database from which I've created EF models using the Code-First Using Existing Database method, and we are using data transfer object classes as inputs/outputs to our API methods so we aren't directly exposing our model classes. So basically, I'm trying to become proficient with EF, Web Api and AngularJS all at the same time. For the most part I'm fairly comfortable with the latter two, but I haven't become completely comfortable with EF yet. I've watched a lot of the videos on Microsoft Virtual Academy, but this is the first time I've had some hands-on experience with it.
We've been working on this application for a few months, and so far we've only had to work with CRUD operations on our entities (POCO DTOs), which are flat objects with simple properties. However, we've finally come across some situations where we need to deal not only with our classes themselves, but also with properties that are classes: a parent-child relationship.
Therefore, I have the following questions:
I see that when we have a proper foreign key relationship in our DB, virtual properties are created in EF, which from what I recall are there to support lazy loading. However, lazy loading isn't really feasible in this environment where we are using web services (Web Api). Our object model does allow for some really large hierarchies of classes, where a fully populated object and its children would mean a large amount of data being passed around when that really isn't necessary, so in most cases a first-level object is all we need. In some cases, however, we do want to populate child classes, so my question is how do we do that, and where do we do that? I've looked at the automatically generated code in the DB context, but we have also used scaffolded code to create our controllers. In which place do we need to do this? I've seen code samples showing how to do this, but they haven't said specifically where this code lives. It appears to be within a controller, but I could be wrong.
If we do allow for a hierarchy of objects two or more levels deep, does EF automatically handle operations (updates, deletes, etc.)? For example, if we have a "Company" object which has a collection of "Customer" objects, and we delete the "Company" object, do the related "Customer" objects get deleted too? Also, is a multi-step operation like that automatically performed within a transaction, or do we need to explicitly set that up?
Since the model classes and the DB context are automatically generated, modifying them is generally bad practice because my changes could be overwritten, so I am assuming the controller code is where I want to make my changes. I am aware of database migrations, but I have no experience with them, and I am sure I'll need to use them at some point because I am fairly confident that our database doesn't yet have all the foreign key relationships necessary for EF to do everything we need.
I know this is a long post, but I'd appreciate any guidance on how to do some of these things. It's not only me who has to deal with this: two other developers on my team are working on this project, and we are all equally inexperienced with it. Thanks
For the purpose of sending data across a web service, I'd suggest creating a DTO to hold the data you want to send and mapping your entities to the DTO instead of trying to send the entities themselves in your payload. It also protects your API from changes to your entities.
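For example (a sketch only; the controller, context, DTO classes and property names below are assumptions), a Web Api action can eagerly load exactly the children it needs by projecting straight into the DTO inside the query:

```csharp
using System.Collections.Generic;
using System.Data.Entity;
using System.Linq;
using System.Web.Http;

public class CompaniesController : ApiController
{
    private readonly CompanyContext db = new CompanyContext();

    public IHttpActionResult GetCompany(int id)
    {
        var dto = db.Companies
            .Where(c => c.Id == id)
            .Select(c => new CompanyDto
            {
                Id = c.Id,
                Name = c.Name,
                // Projecting the children here pulls them back in the same query,
                // so lazy loading never comes into play.
                Customers = c.Customers
                    .Select(cu => new CustomerDto { Id = cu.Id, Name = cu.Name })
                    .ToList()
            })
            .SingleOrDefault();

        // Alternatively, db.Companies.Include(c => c.Customers) eagerly loads
        // the full child entities when you really do want the whole objects.
        return dto == null ? (IHttpActionResult)NotFound() : Ok(dto);
    }
}

public class CompanyDto
{
    public int Id { get; set; }
    public string Name { get; set; }
    public List<CustomerDto> Customers { get; set; }
}

public class CustomerDto
{
    public int Id { get; set; }
    public string Name { get; set; }
}
```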
Cascading deletes are configurable, IIRC, but I'm not 100% sure what the default is. Transactions spanning multiple operations are generally not implicit, so you will want to set them up explicitly where you require them.
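To make both points concrete, here is a minimal EF6 sketch using the Company/Customer example from the question (everything else, including the context and service names, is assumed):

```csharp
using System.Collections.Generic;
using System.Data.Entity;
using System.Linq;

public class Company
{
    public int Id { get; set; }
    public string Name { get; set; }
    public virtual ICollection<Customer> Customers { get; set; }
}

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
    public int CompanyId { get; set; }
    public virtual Company Company { get; set; }
}

public class CompanyContext : DbContext
{
    public DbSet<Company> Companies { get; set; }
    public DbSet<Customer> Customers { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // Code-first convention turns cascade delete ON for required relationships;
        // state it explicitly if you don't want to rely on the default.
        modelBuilder.Entity<Customer>()
            .HasRequired(c => c.Company)
            .WithMany(co => co.Customers)
            .WillCascadeOnDelete(false);
    }
}

public class CompanyService
{
    // A single SaveChanges call is already wrapped in a transaction by EF;
    // an explicit transaction is only needed to span several calls.
    public void DeleteCompany(CompanyContext db, Company company)
    {
        using (var tx = db.Database.BeginTransaction())
        {
            db.Customers.RemoveRange(company.Customers.ToList());
            db.SaveChanges();
            db.Companies.Remove(company);
            db.SaveChanges();
            tx.Commit();
        }
    }
}
```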
Not exactly sure what you are asking here. In general, how your entities/tables change depends on whether you are using database first or code first. If you are using database first (you will have a .edmx file in your solution with a model matching your schema), you update the SQL directly and then update your entity model via the .edmx. If you use code first, you change the entities the way you want them and run a database migration to update your database to match.
MSDN article about code-first migration: https://msdn.microsoft.com/en-us/data/jj591621.aspx
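If you do end up using migrations, a generated migration is just C# you can inspect and tweak; something along these lines (the class and column names are made up), produced by Add-Migration and applied with Update-Database:

```csharp
using System.Data.Entity.Migrations;

// Hypothetical "add an Email column" change; Update-Database runs Up(),
// and Down() describes how to roll it back.
public partial class AddCustomerEmail : DbMigration
{
    public override void Up()
    {
        AddColumn("dbo.Customers", "Email", c => c.String(maxLength: 256));
    }

    public override void Down()
    {
        DropColumn("dbo.Customers", "Email");
    }
}
```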

Entity Framework 6 Database-First and Onion Architecture

I am using Entity Framework 6 database-first. I am converting the project to implement the onion architecture to move towards better separation of concerns. I have read many articles and watched many videos but having some issues deciding on my solution structure.
I have 4 projects: Core, Infrastructure, Web & Tests.
From what I've learned, the .edmx file should be placed under my "Infrastructure" folder. However, I have also read about using the Repository and Unit of Work patterns to assist with EF decoupling and using Dependency Injection.
With this being said:
Will I have to create repository interfaces under CORE for ALL entities in my model? If so, how would one maintain this on a huge database? I have looked into AutoMapper but found issues with it presenting IEnumerables vs. IQueryables, although there is an extension available to help with this. I can explore this route further but want to hear back first.
As an alternative, should I leave my .edmx in Infrastructure and move the .tt T4 files for my entities to CORE? Does this introduce tight coupling, or is it a good solution?
Would a generic repository interface work well with the suggestion you provide? Or does EF6 already resolve the Repository and UoW patterns issue?
Thank you for looking at my question and please present any alternative responses as well.
I found a similar post here that was not answered:
EF6 and Onion architecture - database first and without Repository pattern
Database first doesn't completely rule out Onion Architecture (aka Ports and Adapters or Hexagonal Architecture, so if you see references to those, they're the same thing), but it's certainly more difficult. Onion Architecture and the separation of concerns it allows fit very nicely with domain-driven design (I think you mentioned on Twitter that you'd already seen some of my videos on this subject on Pluralsight).
You should definitely avoid putting the EDMX in the Core or Web projects - Infrastructure is the right location for that. At that point, with database-first, you're going to have EF entities in Infrastructure. You want your business objects/domain entities to live in Core, though. At that point you basically have two options if you want to continue down this path:
1) Switch from database first to code first (perhaps using a tool) so that you can have POCO entities in Core.
2) Map back and forth between your Infrastructure entities and your Core objects, perhaps using something like AutoMapper. Before EF supported POCO entities this was the approach I followed when using it, and I would write repositories that only dealt with Core objects but internally would map to EF-specific entities.
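A bare-bones sketch of option 2 might look like this (the interface, the EDMX-generated ShopEntities context and all property names are assumptions; AutoMapper could replace the hand-written mapping):

```csharp
// Core project: persistence-ignorant domain object and repository contract.
namespace Core
{
    public class Customer
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    public interface ICustomerRepository
    {
        Customer GetById(int id);
    }
}

// Infrastructure project: EF-specific implementation that maps the
// EDMX-generated entity to the Core object and back.
namespace Infrastructure
{
    public class CustomerRepository : Core.ICustomerRepository
    {
        public Core.Customer GetById(int id)
        {
            using (var db = new ShopEntities())
            {
                var efCustomer = db.Customers.Find(id);
                if (efCustomer == null) return null;

                return new Core.Customer
                {
                    Id = efCustomer.CustomerId,
                    Name = efCustomer.Name
                };
            }
        }
    }
}
```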
As to your questions about Repositories and Units of Work, there's been a lot written about this already, on SO and elsewhere. You can certainly use a generic repository implementation to allow for easy CRUD access to a large set of entities, and it sounds like that may be a quick way for you to move forward in your scenario. However, my general recommendation is to avoid generic repositories as your go-to means of accessing your business objects, and instead use Aggregates (see DDD or my DDD course w/Julie Lerman on Pluralsight) with one concrete repository per Aggregate Root. You can separate out complex business entities from CRUD operations, too, and only follow the Aggregate approach where it is warranted. The benefit you get from this approach is that you're constraining how the objects are accessed, and getting similar benefits to a Facade over your (large) set of database entities.
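To contrast with a generic repository, a per-aggregate contract can be as narrow as this (all names are illustrative only):

```csharp
using System.Collections.Generic;

// Minimal aggregate: Order is the root, OrderLine is only reachable through it.
public class OrderLine
{
    public decimal UnitPrice { get; set; }
    public int Quantity { get; set; }
}

public class Order
{
    public int Id { get; set; }
    public int CustomerId { get; set; }
    public List<OrderLine> Lines { get; set; }
}

// One concrete repository per aggregate root, exposing only the operations the
// domain actually needs rather than generic CRUD over every table.
public interface IOrderRepository
{
    Order GetById(int orderId);
    IReadOnlyList<Order> ListForCustomer(int customerId);
    void Add(Order order);
    void Remove(Order order);
}
```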
Don't feel like you can only have one DbContext per application. It sounds like you are evolving this design over time, not starting with a green-field application. To that end, you could keep your .edmx file and perhaps a generic repository for CRUD purposes, but then create a new code first DbContext for a specific set of operations that warrant POCO entities, separation of concerns, increased testability, etc. Over time, you can shift the bulk of the essential code to use this, while still keeping the existing DbContext so you don't lose any current functionality.
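A rough sketch of that second, focused context (Invoice, Payment, the connection string name and the table names are all assumptions):

```csharp
using System.Data.Entity;

// A small code-first context for one slice of the application, living alongside
// the existing EDMX context and pointing at the same database.
public class BillingContext : DbContext
{
    static BillingContext()
    {
        // The database already exists, so don't let EF try to create or migrate it.
        Database.SetInitializer<BillingContext>(null);
    }

    public BillingContext() : base("name=MainConnectionString") { }

    public DbSet<Invoice> Invoices { get; set; }
    public DbSet<Payment> Payments { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        // Map the POCOs onto the existing tables.
        modelBuilder.Entity<Invoice>().ToTable("Invoices");
        modelBuilder.Entity<Payment>().ToTable("Payments");
    }
}

public class Invoice
{
    public int Id { get; set; }
    public decimal Amount { get; set; }
}

public class Payment
{
    public int Id { get; set; }
    public int InvoiceId { get; set; }
    public decimal Amount { get; set; }
}
```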
I am using Entity Framework 6.1 in my DDD project. Code first works out very well if you want to do Onion Architecture.
In my project we have completely isolated the Repository from the Domain Model. The Application Service is what uses the repository to load aggregates from and persist aggregates to the database. Hence, there are no repository interfaces in the domain (core).
The second option of using T4 to generate POCOs in a separate assembly is a good idea. Please remember that your domain model (core) should be persistence-ignorant.
While generic repositories are good for enforcing aggregate-level operations, I prefer using specific repositories, simply because not every Aggregate is going to need all of those generic repository operations.
http://codingcraft.wordpress.com/

Does Entity Framework DB First (EDMX) prevent proper Separation of Concerns?

I am new to entity framework and MVC, and trying to understand what constitutes a good design approach for a new application.
There are several ways of using Entity Framework. However, for my project, the best looking option is DB First. I've played around with an EDMX file, and I have got as far as using the DbContext code generator to create my wrapper classes.
I plan on using the repository and unit-of-work patterns, and using ninject for DI.
However, it does not seem "proper", from a SoC point of view, that whilst my repository will hide the implementation of the data store (EF) from my code, the model classes themselves are very much EF-flavoured.
It seems that using EDMX-based approaches to EF blurs the separation of concerns. Only POCO support seems to allow a true separation, but POCO has some other limitations that I don't like.
Am I missing something, or does using EDMX have this drawback?
Are people using an auto mapper to convert between the entity model and another, clean, SoCced model?
thanks
Tian
I don't have a strong opinion on the Separation of Concerns question, but I have used both the standard ADO.NET version of EF and POCO, and it is not difficult at all to customise the output of the T4 code generation script for POCO to address any concerns you have about the structure of the objects created. That sounds like it would probably be a good starting point for what you are looking to do.
Once you know you are looking for T4 templates there are quite a few tutorials and a lot of helpful SO questions that can give you an idea of what you need to do.

EF 4.1 model first code generation tool or template

Is there a template or tool to generate code from the database directly? I want to use the model-first scenario but do not want an .edmx file for the mappings. There is a database with many tables, and I do not want to write all the classes for it (I am lazy). So, is there a template to generate the code and set the annotations/use the fluent API for defining the relationships, etc., automatically from the existing database?
This would be helpful in the following scenario as well. Say I was using an .edmx with POCOs and now I do not want the mappings in the .edmx file; I want the mappings in the code. It would be great to have a tool or a template to generate the mappings in code from the existing database.
I am starting to learn EF 4.1. I think "code first becomes model first in version 2, i.e. after the database is created/released (in version 1) and needs some changes". Is that really true? I'd love to hear some comments. Thanks.
Check out the 'Reverse Engineer Code First' feature of the EF Power Tools CTP1 that was just released.
For generating classes, you can use the POCO T4 template. Have a look at this detailed link, which will help you get started. That way you will get all the classes generated for you.
For mapping, you can use the Code-Only style for Entity Framework, but generating the classes and context with the POCO template has a big advantage over creating the mapping yourself. Imagine adding new tables or modifying existing ones; doing it by hand involves much more work. But I would certainly love to know if there is a mapping tool for that.
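To give a feel for what hand-written Code-Only mapping looks like (and why generating it is attractive), here is a sketch for a single made-up table:

```csharp
using System.Data.Entity;
using System.Data.Entity.ModelConfiguration;

public class Customer
{
    public int CustomerId { get; set; }
    public string Name { get; set; }
}

// One of these per table: exactly the work the T4/Power Tools generation
// would do for you across the whole schema.
public class CustomerMap : EntityTypeConfiguration<Customer>
{
    public CustomerMap()
    {
        ToTable("CUSTOMERS");
        HasKey(c => c.CustomerId);
        Property(c => c.Name).HasMaxLength(100).IsRequired();
    }
}

public class ShopContext : DbContext
{
    public DbSet<Customer> Customers { get; set; }

    protected override void OnModelCreating(DbModelBuilder modelBuilder)
    {
        modelBuilder.Configurations.Add(new CustomerMap());
    }
}
```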
If you are using the Devart dotConnect for Oracle EF provider when working with an Oracle database, the following information will help you choose a tool.
The first version of Entity Framework Power Tools also contained the capability of generating a Code-First model with fluent mapping from an existing database. Although useful, this functionality is limited as regards its flexibility: the developer can only set the connection string; following that, classes are generated from all database objects available to the user. That is not extremely convenient, since in Oracle, for example, numerous schemas containing hundreds and sometimes thousands of tables are available to the user.
Rather than resort to this limited functionality, the users of Devart ADO.NET providers can avail themselves of impressively robust design-time development capabilities of Entity Developer, an EF-designer delivered with Devart providers. Also possible is the choice between the Database-First approach, as provided in EF Power Tools, and the Model-First approach, within which Code-First classes are created in the EDM-designer.
When compared to EF Power Tools, the Database-First approach to the development of EF Code-First models also allows selecting objects that must be available in the model, setting naming rules for the generation of class names and properties and so on. Besides, the resulting model can be modified and improved in the designer.
To better meet developers' needs, Code-First code generation in Entity Developer both for C# and VB is based on the T4-template that is easily accessible and can be modified in feature-rich T4 Editor contained in Entity Developer.
For more information on Code-First development in Entity Developer, see "Entity Developer – EF Code First DbContext Template"
http://www.devart.com/blogs/dotconnect/index.php/entity-developer-ef-code-first-dbcontext-template.html

Why would I want to use POCOs?

I currently use the Entity Framework designer to generate my persistence objects, and I also use POCO view models for the ASP.NET MVC views.
I've read and listened to a lot of people talking about the good support for POCOs in EF4, as well as POCOs in general, but I can't seem to work out what advantage, if any, I'll get from using them.
In our application, we WILL be using SQL Server, so it's not like we need to separate out support for different databases.
Why would I want to use POCOs as opposed to the designer-generated classes?
POCO offers better extensibility/reuse of your Domain Model as you're not tied to any specific ORM framework.
Answered here: What are the 'big' advantages to have Poco with ORM?
Easier to unit test
I find that when you have many entities (100+), using the designer is painful, whereas POCO objects are easy to create and maintain.
With POCOs you can go "Code First":
http://weblogs.asp.net/scottgu/archive/2010/08/03/using-ef-code-first-with-an-existing-database.aspx
+1 for "with POCOs you can code first". In fact, you can create and test your entire application before you even write your first line of data access code. This is how it should be; it's a perfect application of the SRP. Your domain objects should not know or care how they are being persisted.
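For instance, a domain object like the sketch below (made-up names) has no EF dependency at all, so its behaviour can be unit tested long before any mapping or database exists:

```csharp
using System.Collections.Generic;
using System.Linq;

// Plain POCO with behaviour: nothing here references EF or a database.
public class Order
{
    public Order()
    {
        Lines = new List<OrderLine>();
    }

    public int Id { get; set; }
    public ICollection<OrderLine> Lines { get; private set; }

    public decimal Total()
    {
        return Lines.Sum(l => l.UnitPrice * l.Quantity);
    }
}

public class OrderLine
{
    public decimal UnitPrice { get; set; }
    public int Quantity { get; set; }
}
```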