MVC Business Logic + DAL + Mapping ViewModel with EF Model - entity-framework

I'm starting to develop an application, and I want to know the best practices to organize the architecture of the solution.
Should I use EF Class Model as my ViewModel?
Should I put all my queries and DB access in the model, or create a service to manage all DB concerns?
I'm using EF with DB First, because the db is already developed.
Thanks!

There are much more complete descriptions of application architecture out there, but here's the $.25 description.
EF Class Models are for your communication with a data store
Data Transfer Objects (DTO) are how modules communicate among themselves
(WebAPI to MVC, etc.)
ViewModels supply the data your UI requires
Look up "Separation of Concerns" as it pertains to application architecture, it can save your butt. Often, developers will dual-purpose these entities leading to some hilarious results when you find you've painted a corner for yourself. Not so funny if you are the "painter".
On the other hand, keeping these models requires extra effort and the mappings take CPU cycles. Here's a concrete example:
The WebAPI accesses the People entity (EF class), maps it to a PeopleDTO (not all fields, maybe additional information), and returns this to your MVC controller. The MVC controller takes the PeopleDTO and merges it with supporting lookup tables (more WebAPI calls) to create a PeopleVM (ViewModel) that is used by your Razor page.
In the scenario I just outlined, there are three different types of People object, but each could have very different contents depending on the needs of that "layer". Lots of tools exist to make the mapping less painful.
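To make that pipeline concrete, here is a minimal hand-written sketch, assuming some property names and a simple lookup dictionary for the supporting tables (none of which come from the original post); a tool like AutoMapper could replace the manual mapping methods.

```csharp
using System.Collections.Generic;

// EF entity (data layer) - mirrors the People table. Property names are assumptions.
public class People
{
    public int Id { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
    public string PasswordHash { get; set; }   // stays in the data layer, never leaves it
    public int CountryId { get; set; }
}

// DTO returned by the WebAPI - only the fields other modules need.
public class PeopleDTO
{
    public int Id { get; set; }
    public string FullName { get; set; }
    public int CountryId { get; set; }
}

// ViewModel used by the Razor page - DTO data merged with lookup values.
public class PeopleVM
{
    public int Id { get; set; }
    public string FullName { get; set; }
    public string CountryName { get; set; }
}

public static class PeopleMappings
{
    public static PeopleDTO ToDto(People entity)
    {
        return new PeopleDTO
        {
            Id = entity.Id,
            FullName = entity.FirstName + " " + entity.LastName,
            CountryId = entity.CountryId
        };
    }

    public static PeopleVM ToViewModel(PeopleDTO dto, IDictionary<int, string> countries)
    {
        string countryName;
        return new PeopleVM
        {
            Id = dto.Id,
            FullName = dto.FullName,
            CountryName = countries.TryGetValue(dto.CountryId, out countryName) ? countryName : "Unknown"
        };
    }
}
```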
Clear?

Related

Entity Framework 6 Database-First and Onion Architecture

I am using Entity Framework 6 database-first. I am converting the project to implement the Onion Architecture to move towards better separation of concerns. I have read many articles and watched many videos, but I'm having some issues deciding on my solution structure.
I have 4 projects: Core, Infrastructure, Web & Tests.
From what I've learned, the .edmx file should be placed under my "Infrastructure" folder. However, I have also read about using the Repository and Unit of Work patterns to assist with EF decoupling and using Dependency Injection.
With this being said:
Will I have to create repository interfaces under Core for ALL entities in my model? If so, how would one maintain this on a huge database? I have looked into AutoMapper but found issues with it returning IEnumerables vs. IQueryables, though there is an extension available to help with this. I can dig deeper into this route, but I want to hear back first.
As an alternative, should I leave my .edmx in Infrastructure and move the .tt T4 files for my entities to Core? Does this create any tight coupling, or is it a good solution?
Would a generic repository interface work well with the suggestion you provide? Or does EF6 already address the Repository and UoW patterns on its own?
Thank you for looking at my question and please present any alternative responses as well.
I found a similar post here that was not answered:
EF6 and Onion architecture - database first and without Repository pattern
Database-first doesn't completely rule out Onion Architecture (aka Ports and Adapters or Hexagonal Architecture, so if you see references to those, they're the same thing), but it's certainly more difficult. Onion Architecture and the separation of concerns it allows fit very nicely with domain-driven design (I think you mentioned on Twitter you'd already seen some of my videos on this subject on Pluralsight).
You should definitely avoid putting the EDMX in the Core or Web projects - Infrastructure is the right location for that. At that point, with database-first, you're going to have EF entities in Infrastructure. You want your business objects/domain entities to live in Core, though. At that point you basically have two options if you want to continue down this path:
1) Switch from database first to code first (perhaps using a tool) so that you can have POCO entities in Core.
2) Map back and forth between your Infrastructure entities and your Core objects, perhaps using something like AutoMapper. Before EF supported POCO entities this was the approach I followed when using it, and I would write repositories that only dealt with Core objects but internally would map to EF-specific entities.
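As a rough sketch of option 2 (Customer, CustomerEntity, and AppDbContext are names made up for illustration, and the generated classes are stubbed out so the snippet stands alone): the repository lives in Infrastructure, talks to the database-first context internally, and only hands Core objects back to callers.

```csharp
using System.Data.Entity;   // EF 6

// --- Core project: persistence-ignorant domain object and repository interface ---
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public interface ICustomerRepository
{
    Customer GetById(int id);
    void Add(Customer customer);
}

// --- Infrastructure project: stand-ins for the database-first generated classes ---
public class CustomerEntity
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class AppDbContext : DbContext
{
    public DbSet<CustomerEntity> Customers { get; set; }
}

// The repository maps between the EF entity and the Core object, so nothing
// EF-specific leaks out of Infrastructure (AutoMapper could do the mapping instead).
public class CustomerRepository : ICustomerRepository
{
    private readonly AppDbContext _context;

    public CustomerRepository(AppDbContext context)
    {
        _context = context;
    }

    public Customer GetById(int id)
    {
        var entity = _context.Customers.Find(id);
        return entity == null ? null : new Customer { Id = entity.Id, Name = entity.Name };
    }

    public void Add(Customer customer)
    {
        _context.Customers.Add(new CustomerEntity { Name = customer.Name });
        _context.SaveChanges();
    }
}
```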
As to your questions about Repositories and Units of Work, there's been a lot written about this already, on SO and elsewhere. You can certainly use a generic repository implementation to allow for easy CRUD access to a large set of entities, and it sounds like that may be a quick way for you to move forward in your scenario. However, my general recommendation is to avoid generic repositories as your go-to means of accessing your business objects, and instead use Aggregates (see DDD or my DDD course w/Julie Lerman on Pluralsight) with one concrete repository per Aggregate Root. You can separate out complex business entities from CRUD operations, too, and only follow the Aggregate approach where it is warranted. The benefit you get from this approach is that you're constraining how the objects are accessed, and getting similar benefits to a Facade over your (large) set of database entities.
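To illustrate the Aggregate idea with an invented example (Order/OrderLine and the member names are not from the question): order lines can only be reached through the Order root, and the repository is specific to that root rather than generic.

```csharp
using System.Collections.Generic;

// Order is the aggregate root; order lines can only be changed through it.
public class Order
{
    private readonly List<OrderLine> _lines = new List<OrderLine>();

    public int Id { get; private set; }
    public IReadOnlyCollection<OrderLine> Lines { get { return _lines; } }

    public void AddLine(string product, int quantity)
    {
        _lines.Add(new OrderLine(product, quantity));
    }
}

public class OrderLine
{
    public OrderLine(string product, int quantity)
    {
        Product = product;
        Quantity = quantity;
    }

    public string Product { get; private set; }
    public int Quantity { get; private set; }
}

// One concrete repository per aggregate root - a narrow facade instead of
// generic CRUD over every table in the model.
public interface IOrderRepository
{
    Order GetById(int id);
    void Save(Order order);
}
```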
Don't feel like you can only have one DbContext per application. It sounds like you are evolving this design over time, not starting with a green-field application. To that end, you could keep your .edmx file and perhaps a generic repository for CRUD purposes, but then create a new code-first DbContext for a specific set of operations that warrant POCO entities, separation of concerns, increased testability, etc. Over time, you can shift the bulk of the essential code to use this, while still keeping the existing DbContext so you don't lose any current functionality.
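A sketch of what that can look like, with invented context and entity names: the generated database-first context stays untouched for existing code, while a small, narrowly scoped code-first DbContext covers just the area being reworked.

```csharp
using System.Data.Entity;   // EF 6

// The .edmx keeps generating its own context for existing code. Alongside it,
// a narrowly scoped code-first context maps only the tables one feature area needs.
public class Invoice
{
    public int Id { get; set; }
    public decimal Total { get; set; }
}

public class BillingContext : DbContext
{
    // "BillingConnection" is a placeholder for a plain connection string
    // pointing at the same database the EDMX uses.
    public BillingContext() : base("name=BillingConnection") { }

    public DbSet<Invoice> Invoices { get; set; }
}
```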
I am using entity framework 6.1 in my DDD project. Code first works out very well if you want to do Onion Architecture.
In my project we have completely isolated the repositories from the domain model. The application service is what uses a repository to load aggregates from, and persist aggregates to, the database. Hence, there are no repository interfaces in the domain (Core).
The second option, using T4 to generate POCOs in a separate assembly, is a good idea. Please remember that your domain model (Core) should be persistence-ignorant.
While generic repositories are good for enforcing aggregate-level operations, I prefer using specific repositories, simply because not every aggregate is going to need all of those generic repository operations.
http://codingcraft.wordpress.com/

Entity framework and Business Layer / logic

I'm doing some self-study on the architecture of an MVVM-Light / EF application.
I'm trying to build a product/receipt type of app.
I have a DB/EF model with a Product and a Receipt table/entity in a many-to-many relation.
Then I have a DAL which simply uses LINQ to do simple CRUD.
The question is where and how to put my business logic in this app.
A couple of ideas came to mind.
Option 1
- Make a ReceiptBo (receipt business object) which inherits from the Receipt entity class and holds an ICollection<ProductBo>. The ReceiptBo class would be responsible for adding products, calculating the total and subtotal, and calling the DAL for inserting.
Maybe this option is a little overkill.
Option 2
- Put the calculating methods in the generated entity objects by using partial classes, and simply use the business layer to call the DAL.
This would make the business layer classes obsolete in my opinion, and I'm not sure that entity classes should be used for business logic at all?
Option 3
- Make the business classes but don't bother using inheritance; just add products to the entities, do the calculations there, and call the DAL for the insert.
Which seems simple but not very elegant.
Option 4
- None of the above, and I'm clueless.
Right now I'm not using WCF, but the idea is that I want to make this app loosely coupled so that it would be easy to implement and further extend.
Also, I'm a little confused about what a business layer is. In some examples it is used more like a DAL that also does the computing, while others say that's not how it's done.
Some help would be great. Thanks.
PS: Sorry for my bad English.
Really, I would go simple here and choose a common multi-tier architecture designed as follows:
a data access layer (basically, your Entity Framework model along with all your entities)
a business layer that exposes methods to access your entities (CRUD methods + any custom methods that run some logic)
a service layer that exposes stateless methods through WCF (service+data contract)
the presentation layer (in your case using MVVM pattern)
Views (XAML pages)
ViewModels (C# classes)
Model is represented here by the entities that are exposed through WCF by the service layer
I wouldn't add any methods directly to the entities. All methods are defined in the business layer and exposed by the service layer.
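Here is a minimal sketch of how those layers can line up; all type names are illustrative and not from the question. The business class owns the logic (the generated EF entities stay plain), and a stateless WCF contract with data contracts sits in front of it.

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Runtime.Serialization;
using System.ServiceModel;

// Data contract: the shape of a receipt as the client sees it over WCF.
[DataContract]
public class ReceiptDto
{
    [DataMember] public int Id { get; set; }
    [DataMember] public decimal Total { get; set; }
}

// Business layer: CRUD plus custom logic such as the total calculation lives
// here, not inside the generated EF entities.
public class ReceiptManager
{
    public decimal CalculateTotal(IEnumerable<decimal> linePrices)
    {
        return linePrices.Sum();
    }

    // GetReceipt / SaveReceipt would call the DAL (the EF context) here.
}

// Service layer: a stateless WCF service contract that forwards to the business layer.
[ServiceContract]
public interface IReceiptService
{
    [OperationContract]
    ReceiptDto GetReceipt(int id);
}
```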
Usually I keep my business logic in my ViewModels, not my Models, and I view EF objects as Models. At most, they'll have some basic data validation, such as verifying length or required fields.
For example, an EditReceiptViewModel would validate business rules such as verifying that values are in a specific range, verifying that users have access to edit the object, or performing some custom calculations when a value changes.
Also, don't forget that ViewModels should reflect the View, not a Model, so not every Model will have a ViewModel of its own.
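For instance, a sketch with assumed names and an arbitrary rule (not code from the answer above): the ViewModel enforces the range check and raises change notifications, while the EF object behind it stays a plain data holder.

```csharp
using System.ComponentModel;

// The ViewModel carries the business rule; the EF entity it wraps does not.
public class EditReceiptViewModel : INotifyPropertyChanged
{
    private int _quantity;
    private string _quantityError;

    public int Quantity
    {
        get { return _quantity; }
        set
        {
            // Example business rule: quantity must be between 1 and 100.
            _quantityError = (value < 1 || value > 100)
                ? "Quantity must be between 1 and 100."
                : null;
            _quantity = value;
            OnPropertyChanged("Quantity");
            OnPropertyChanged("QuantityError");
        }
    }

    public string QuantityError
    {
        get { return _quantityError; }
    }

    public event PropertyChangedEventHandler PropertyChanged;

    private void OnPropertyChanged(string name)
    {
        var handler = PropertyChanged;
        if (handler != null) handler(this, new PropertyChangedEventArgs(name));
    }
}
```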

EF + WCF in three-layered application with complex object graphs. Which pattern to use?

I have an architectural question about EF and WCF.
We are developing a three-tier application using Entity Framework (with an Oracle database), and a GUI based on WPF. The GUI communicates with the server through WCF.
Our data model is quite complex (more than a hundred tables), with lots of relations. We are currently using the default EF code generation template, and we are having a lot of trouble with tracking the state of our entities.
The user interfaces on the client are also fairly complex; sometimes an object graph with more than 50 objects is sent down to a single user interface, with several layers of aggregation between the entities. It is an important goal to be able to easily decide in the BLL layer which of the objects have been modified on the client and which objects have been newly created.
What would be the clearest approach to manage entities and entity states between the two layers? Self tracking entities? What are the most common pitfalls in this scenario?
Could those who have used STEs in a real production environment share their experiences?
STEs are supposed to solve this scenario, but they are not a silver bullet. I have never used them in a real project (I don't like them), but I spent some time playing with them. The main pitfalls I found are:
Coupling your data layer with your client application - you must share the entity assembly between projects (it also means it is a .NET-only solution, but that should not be a problem in your case)
Large data transfers - you pass 50 entities to the client, the client changes a single entity, and you pass all 50 entities back. It will require some fighting with STEs to avoid passing unnecessary data
Unnecessary updates to the database - normally, when EF works with attached entities, it tracks changes at the property level, but with STEs it tracks changes at the entity level. So if the user modifies a single property in an entity with 100 properties, it will generate an update that sets all of them. It will require modifying the template and adding property-level change tracking to avoid this.
The client application should use STEs directly (binding STEs to the UI) to get the most out of their self-tracking ability. Otherwise, you will have to implement code that moves data from the UI back to the self-tracking entity and modifies its state.
They are not proxied = they don't support lazy loading (in the case of a WCF service, that is good behavior)
I described a way to solve this without STEs today. There is also a related question about tracking over web services (check @Richard's answer and the provided links).
We have developed a layered application with STEs: a user interface layer with ASP.NET and Model-View-Presenter, a business layer, a WCF service layer, and the data layer with Entity Framework.
When I first read about STEs, the documentation said that they are easier than using custom DTOs. They were presented as the 'quick and easy way', and only on really big projects should you use hand-written DTOs.
But we've run into a lot of problems using STEs. One of the main problems is that if your entities come from multiple service calls (for example in a master-detail view), and therefore from different contexts, you will run into problems when composing the graphs on the server and trying to save them. So our server functions still have to check manually which data has changed and then recompose the object graph on the server. A lot has been written about this topic, but it's still not easy to fix.
Another problem we ran into was that STEs don't work without WCF. The change tracking is activated when the entities are serialized. We had originally designed an architecture where WCF could be disabled and the service calls would just be in-process (this was a requirement for our unit tests, which would run a lot faster without WCF and be easier to set up). It turned out that STEs are not the right choice for this.
I've also noticed that developers sometimes included a lot of data in their queries and just sent it to the client instead of really thinking about which data they needed.
After this project, we decided to use custom DTOs with AutoMapper from server to client and to use the POCO template in our data layer for the new project.
So since you already state that your project is big, I would opt for custom DTOs and service functions that are specifically created for one goal, instead of 'Update(Person person)' functions that send a lot of data.
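To illustrate that last point (the contract and member names are invented for the example), a narrow, goal-specific operation plus a small DTO keeps the payload down instead of shipping a whole Person graph back and forth.

```csharp
using System.Runtime.Serialization;
using System.ServiceModel;

// A DTO that carries only what one screen needs.
[DataContract]
public class PersonNameDto
{
    [DataMember] public int Id { get; set; }
    [DataMember] public string DisplayName { get; set; }
}

// Purpose-specific operations instead of a broad Update(Person person):
// the contract states exactly what changes, so nothing extra travels over the wire.
[ServiceContract]
public interface IPersonService
{
    [OperationContract]
    PersonNameDto GetPersonName(int personId);

    [OperationContract]
    void RenamePerson(int personId, string newDisplayName);
}
```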
Hope this helps :)

Confusion over MVC and entity model

My confusion stems from the fact that I am using two different walkthroughs on building MVC applications, namely Steven Sanderson's Pro ASP.NET MVC and the online MVC Music Store. The former creates a domain model, placing the entity model in there along with repositories, while the Music Store demo places the entity model in the MVC Model folder. Which of these is the better approach? Should the entity model and associated repositories exist in a separate domain layer, or in the MVC project's Model folder?
Separation of concerns
The Model folder in the ASP.NET MVC project template is indeed very confusing. Most developers who don't know enough about the MVC pattern think that application/domain model = data model. Most of the time, that's not the case.
Take, for instance, a user entity that may exist in several different forms:
NewUser is an application model entity that has most properties of a user, plus two password properties that can be declaratively validated
The User data model entity has all the usual user properties and one password property
The User application model entity has all the usual properties and none for the password
So you can see from this simple example that there are multiple models that differ from each other. And when you have a multi-assembly application, putting the application model in a separate assembly is very wise, since all assemblies will most probably communicate using these objects only. No data model entities should be transferred outside the data assembly/tier if you want to make use of SoC...
So in the end it's OK to put the data model in the Model folder when building a small-scale, simple application, but in all other cases it's probably better to use a separate application model assembly that's shared between all assemblies, and have a separate data model that's only used in the data tier assembly.
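A quick sketch of those three shapes (property names are assumptions; the data model entity is called UserEntity here only to avoid a name clash within one file):

```csharp
using System.ComponentModel.DataAnnotations;

// Application model used for registration: two password properties, declaratively validated.
public class NewUser
{
    [Required] public string UserName { get; set; }
    [Required] public string Password { get; set; }
    [Required, Compare("Password")] public string ConfirmPassword { get; set; }
}

// Data model entity: what the table stores, including the single password column.
public class UserEntity
{
    public int Id { get; set; }
    public string UserName { get; set; }
    public string PasswordHash { get; set; }
}

// Application model entity used everywhere else: no password property at all.
public class User
{
    public int Id { get; set; }
    public string UserName { get; set; }
}
```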
Read this answer that may help you see things a bit clearer.
And this one as well.
I would recommend against using the Model folder and use a separate assembly instead. You'll have better separation and improved scalability.
Strategically, it makes sense to place the EF model in the same folder as the repositories, because it is simply part of the data access layer inside an application.
Logically, it would be better to place the EF model in the Model directory, as it creates all the classes needed to reflect the database in an application. (And if you open the Class View, it looks much better to have all those classes residing in a folder called Model instead of Repositories.)
At our company we've had the same problem and decided to go with saving the EF model in the Model folder.
After all, it's up to you what you do. The most important thing to do here would be to document all the decisions that happen during development (when, why, and based on what).
Documenting everything could prevent later WTFs.

Is it that easy working with ADO.NET Entity Framework in real programming?

Hi guys,
I was watching this video series about Entity Framework:
http://msdn.microsoft.com/en-us/data/ff191186.aspx
Is it that easy to build applications in real-world programming? And is it reliable? Does it have good performance?
(I am a graduate.)
Thanks
Entity Framework is a valid real-world data access tool. It is very easy to get up and running with EF: you simply import (or, in EF 4, create) your data model, then rename things to make it more code-friendly, and then you are off querying databases.
Performance
I have been on multiple projects that use it, some of which require high throughput, others that have low performance requirements. Entity Framework out of the box is not the fastest solution in the world, so there is a lot of performance tweaking that has to go on, but it's all doable.
Reliability
We never have issues with reliability. We have never had an issue with EF in general; it's always data-content related, such as trying to insert duplicate data.
Other Tangibles
EF follows a pattern which allows you to do some fun stuff with templates and abstract classes. All entities inherit from a base class, and entities that have references inherit from other classes. All entity contexts inherit from ObjectContext, which provides a base set of functionality that allows you to create generic DAO implementations that can be reused throughout the enterprise.
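As a rough sketch of that kind of reusable generic DAO, written against the EF 4-era ObjectContext API the answer refers to (class and method names are illustrative):

```csharp
using System.Data.Objects;   // EF 4-era namespace (System.Data.Entity.Core.Objects in EF 6)
using System.Linq;

// A generic DAO that works against any ObjectContext-derived context.
public class GenericDao<TEntity> where TEntity : class
{
    private readonly ObjectContext _context;
    private readonly ObjectSet<TEntity> _set;

    public GenericDao(ObjectContext context)
    {
        _context = context;
        _set = context.CreateObjectSet<TEntity>();
    }

    public IQueryable<TEntity> Query()
    {
        return _set;
    }

    public void Add(TEntity entity)
    {
        _set.AddObject(entity);
        _context.SaveChanges();
    }
}
```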
If you are doing UI development, you can also use Data Services that wrap EF as a fast gateway to your database. The only downside of this is that you don't have access to the full suite of Entity Framework features.