Best Practices - Data Annotations vs OnChanging in Entity Framework 4

I was wondering what the general recommendation is for Entity Framework in terms of data validation. I am relatively new to EF, but it appears there are two main approaches to data validation.
The first is to create a partial class for the model, and then perform data validations and update a collection of rule violations. This is outlined at http://msdn.microsoft.com/en-us/library/cc716747.aspx
The other is to use data annotations and then have the annotations perform data validation. Scott Guthrie explains this on his blog at http://weblogs.asp.net/scottgu/archive/2010/01/15/asp-net-mvc-2-model-validation.aspx.
I was wondering what the benefits are of one over the other. It seems the data annotations would be the preferred mechanism, especially as you move to RIA Services, but I want to ensure I am not missing something. Of course, nothing precludes using both of them together.
Thanks
John

I have been using DataAnnotations with MVC 2 and it works great. I have not tried the partial class on an entity object for validation, but I can see its uses. Basically, when I create a partial class on an entity object, I use it to default data such as a GUID identifier, a create date, or a modified date. I guess it would also be useful for some complex validation that needs to happen in the entity layer, but even then those validations could be accomplished in a custom validator. If you are building an MVC website, then I would personally use DataAnnotations.
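For example, a minimal sketch of both uses in a partial class (the Person entity and its properties here are illustrative, not taken from the question):

using System;

public partial class Person
{
    // Defaulting data when the entity is constructed; this assumes the
    // generated Person class does not already define a constructor.
    public Person()
    {
        Id = Guid.NewGuid();
        CreateDate = DateTime.UtcNow;
    }

    // The EF 4 entity generator emits On<Property>Changing partial methods
    // that can be implemented for validation:
    partial void OnNameChanging(string value)
    {
        if (string.IsNullOrEmpty(value))
            throw new ArgumentException("Name is required.");
    }
}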


Hide Columns in Breeze data returns

In our Entity Framework model there are columns that identify the customer, such as a simple Customer_GUID. We are using Breeze with ASP.NET MVC and exposing IQueryable endpoints.
Is there a way to globally not return those columns in the JSON? This would reduce a good bit of data coming across the wire. We don't want to remove them from the mapping in our EF model because we still use them when we save.
You might want to look at the Json.NET documentation, in particular the [JsonIgnore] attribute. Look at "Conditional Property Serialization" for more sophisticated scenarios.
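For example, a minimal sketch (the Customer class here is illustrative):

using System;
using Newtonsoft.Json;

public class Customer
{
    public string Name { get; set; }

    [JsonIgnore] // omitted from the serialized JSON; the EF mapping is untouched
    public Guid Customer_GUID { get; set; }
}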
Do be careful about insert and update data coming from the client. You'll have to do something if your client uploads a new entity for insertion and it lacks the properties you require on the server side.
To be clear, your configuration of Json.NET has no effect on your server-side EF model ... exactly what you wanted.
That also means that metadata generated from your EF model will describe properties the client can't see. You'll want to compensate for that I imagine. Such compensation is beyond the scope of this question; look to the Breeze documentation on metadata ... particularly "Metadata by hand" and "EF as a design tool".

Entity Framework & Class Models in MVC

I'm new to the MVC way of developing applications and for the most part am enjoying it. One thing I'm a bit confused about is the use of the Entity Framework. The EF usually (at least in my experience) defines multiple tables and relationships through the .edmx file. A couple of questions:
Why would I define a separate class file for a specific table if EF is building all of the classes that I need in the background?
From some of the validation approaches that I've seen, they want to define validation logic in the class related to a model for a table. If I'm using EF, will I have a .cs file describing the model and a .edmx describing that same table (in addition to its associated tables)?
If yes, how do you connect the .cs file to the .edmx definition so that CRUD flows easily from the EF?
Sorry if these seem like easy questions, but I'm just trying to get my head wrapped around these fundamental concepts. Too many examples out there use only a single table, whereas in my business I NEVER write an application that uses a single table. There are always multiple tables related to each other with foreign keys. Thanks for your prompt responses.
For a tutorial that shows the use of partial classes -- in a Web Forms application but for MVC the same technique would be used -- see Adding Metadata to the Data Model in this tutorial:
http://www.asp.net/web-forms/tutorials/getting-started-with-ef/the-entity-framework-and-aspnet-getting-started-part-8
From your comment "The EF usually (at least in my experience) defines multiple tables and relationships through the .edmx file." it sounds like you are familiar only with Database First and Model First -- for an introduction to Code First and an explanation of the differences, followed by a series of tutorials with an MVC example using Code First, see this tutorial:
http://www.asp.net/mvc/tutorials/getting-started-with-ef-using-mvc/creating-an-entity-framework-data-model-for-an-asp-net-mvc-application
Good questions, Darryl. Here are my responses to your bullet points:
Defining separate model classes that match the data models that EF creates is generally a good idea for the simple sake of separating your data access "stuff" from your business model objects that will get used throughout your app. Some people don't like this approach because it creates some amount of overhead when it comes to mapping your entities to POCOs but, if you use a tool such as AutoMapper, the overhead is minimal. The benefit lies in you creating a layer of separation between you and your (likely) evolving data model.
You could define validation logic in a buddy class (just a partial class that sits alongside your entity), but that would mean you would be using that entity across your app, and some would debate that that isn't the best idea. The alternative method, as mentioned above, is to create your own POCOs to mirror the entities that EF creates and place your validation attributes on the POCOs.
I mentioned this in the previous item but the way to do this would be to define buddy classes. Give EF buddy classes a Google and you should find plenty of examples on how to do that.
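For illustration, a minimal buddy-class sketch (the Order entity and its OrderDate property are assumptions for the example):

using System.ComponentModel.DataAnnotations;

[MetadataType(typeof(OrderMetadata))]
public partial class Order { }

public class OrderMetadata
{
    // The buddy property only has to match the entity property's name;
    // the validation attributes are picked up from here.
    [Required]
    public object OrderDate { get; set; }
}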
Just to add to all of this: if you choose to create POCO classes that mirror your EF entities, tools like AutoMapper can handle fairly complex relationships when it comes to mapping classes. So, if you have foreign key relationships in your data model, AutoMapper can understand that and map your POCO classes accordingly (e.g., you have an entity with a 1-to-many relationship and a POCO with a list of objects to mirror that relationship).
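A quick sketch of that kind of mapping (all class names here are illustrative):

// Configure once, e.g. at application startup.
Mapper.CreateMap<OrderEntity, OrderDto>();
Mapper.CreateMap<OrderItemEntity, OrderItemDto>();

// OrderDto.Items (a list of OrderItemDto) is populated from the
// OrderEntity.Items navigation property by naming convention.
OrderDto dto = Mapper.Map<OrderEntity, OrderDto>(orderEntity);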
I hope some of that helps...

Which variant of Entity Framework to use in WCF based enterprise app

We are in the process of designing an application with approximately 100 tables and complicated business logic. Windows Forms will be used on the client side and WCF services with MSSQL on the server.
Custom DTOs are used for client-server communication; business entities are not distributed.
Which variant of Entity Framework to use (and why):
EF 4.0 EntityObjects
EF 4.0 POCO
EF 4.1 DbContext
Something else
Database-first approach is a requirement.
Also, is it worth implementing the Repository pattern? It seems a bit redundant, as there is one level of abstraction in the mapping itself and another one in the use of DTOs. I'm currently leaning towards using auto-generated extendable repositories for each entity returning IQueryable, just to have a place to put common queries, while still allowing querying of the entity model directly from the service layer.
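To be concrete, I mean something along the lines of this sketch (all names are illustrative):

public interface IRepository<T> where T : class
{
    IQueryable<T> Query();
}

public class Repository<T> : IRepository<T> where T : class
{
    private readonly ObjectContext _context;

    public Repository(ObjectContext context)
    {
        _context = context;
    }

    // Returning IQueryable keeps the entity model directly queryable
    // from the service layer; common queries go in derived repositories.
    public IQueryable<T> Query()
    {
        return _context.CreateObjectSet<T>();
    }
}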
Which variant should you use? Basically, once you have custom DTOs, the only questions are: do you want control over the entities' code (their base class) and to make them independent of EF? Do you want to use code first? If the answers to all of these questions are no, then you can use EntityObjects. If you want persistence-ignorant entities or a custom base class, you should go with POCOs. If you want to use code first or the new DbContext API, you will need EF 4.1. Some related topics:
EF 4.1 Code-first vs Model/Database-first
EF POCO code only VS EF POCO with Entity Data Model (this was related to CTP)
ADO.NET DbContext Generator vs. ADO.NET POCO Entity Generator
EF Model First or Code First Approach?
There are more things to consider when designing the service layer. You should be aware of the complications you will have to deal with when using EF in WCF. Your service will provide data to the WinForms application, which will work with it in "detached mode". Once the user has made all the changes he wants, he will post the data back to the service. But here comes the problem: you must tell EF what has changed. If, for example, you allow the user to change an order with all of its order items (change quantities, add new items, delete some items), you must tell EF exactly what has changed, what was added, and what was deleted. That is easy when you work with a single entity, but once you allow the user to change an object graph (especially many-to-many relations) it becomes quite tough. The most common solution is loading the whole graph and merging the state from the incoming DTOs into the loaded and attached graph. Another solution is using self-tracking entities instead of EntityObjects/POCOs + DTOs.
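For the single-entity case, telling EF what changed can be as simple as this sketch using the EF 4.1 DbContext API (the OrdersContext and its Orders set are illustrative names):

using (var context = new OrdersContext())
{
    // Attach the detached entity rebuilt from the incoming DTO
    // and mark it as modified so SaveChanges issues an UPDATE.
    context.Orders.Attach(order);
    context.Entry(order).State = EntityState.Modified;
    context.SaveChanges();
}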
When discussing repositories, I would refer you to this answer, which references many other answers discussing repositories, their possible redundancy, and possible mistakes when using them just to make your code testable. Generally, each layer should be added only if there is a real need for it -- such as better separation of concerns.
The main advantage of POCOs is that those classes can be your DTOs, so if you've already got custom DTOs that you're using, POCO seems a bit redundant. However, there are some other advantages which may or may not have value to you, since you didn't mention unit testing as a requirement. If you plan to write unit tests, then POCO is still the way to go. You probably won't notice much difference between 4.0 POCO and 4.1 since you won't be using the code-first feature (disclaimer: I've only used 4.0 POCO, so I'm not intimately familiar with any minor differences between the two, but they seem to be more or less the same--basically I was already using POCO in 4.0 and haven't seen anything that's made me want to update everything to use 4.1).
Also, depending on whether you plan to unit test this layer, there's still value in implementing the repository/unit-of-work patterns when using Entity Framework. They serve to abstract away the data access logic (the context), not the entities themselves, and allow you to do things like mocking your context in unit tests. What I do is copy the T4 template for my context and use it to create the interface, then edit the T4 template for the context so that it implements that interface and uses IObjectSet<T> instead of ObjectSet<T>. So instead of:
public class MyEntitiesContext
{
    public ObjectSet<MyClass> MyEntities { get; private set; }
    // ...
}
I end up with:
public interface IMyEntitiesContext
{
    IObjectSet<MyClass> MyEntities { get; }
}
and
public class MyEntitiesContext : IMyEntitiesContext
{
    public IObjectSet<MyClass> MyEntities { get; private set; }
    // ...
}
So I guess it really comes down to whether or not you plan to write unit tests for this layer. If you won't be doing anything that would require mocking out your context for testing, then the easiest thing to use would probably be 4.0 EntityObjects, since you aren't planning to pass your entities between layers and it would require the least effort to implement. If you plan to use mocking, then you'll probably want to use POCO and implement repository/unit of work.
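For reference, the mocking side only needs an in-memory IObjectSet<T>. A minimal sketch of such a fake (this class is illustrative, not part of EF):

using System;
using System.Collections;
using System.Collections.Generic;
using System.Data.Objects;
using System.Linq;
using System.Linq.Expressions;

public class InMemoryObjectSet<T> : IObjectSet<T> where T : class
{
    private readonly HashSet<T> _set = new HashSet<T>();

    public void AddObject(T entity) { _set.Add(entity); }
    public void Attach(T entity) { _set.Add(entity); }
    public void DeleteObject(T entity) { _set.Remove(entity); }
    public void Detach(T entity) { _set.Remove(entity); }

    public IEnumerator<T> GetEnumerator() { return _set.GetEnumerator(); }
    IEnumerator IEnumerable.GetEnumerator() { return GetEnumerator(); }

    // IQueryable members, delegated to an in-memory queryable:
    public Type ElementType { get { return typeof(T); } }
    public Expression Expression { get { return _set.AsQueryable().Expression; } }
    public IQueryProvider Provider { get { return _set.AsQueryable().Provider; } }
}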

ASP.Net MVC2 Validate two ViewModels of the same class differently using DataAnnotations

I'm using DataAnnotations for validation of a custom class (LINQ to SQL auto generated) using the MetadataType tag on top of the class. I'm loving DataAnnotations and it works well in simple, common scenarios. E.g.
[MetadataType(typeof(Person_Validation))]
public partial class Person
But what if you need to have two different sets of validation rules applied to the same class in different scenarios?
My situation: some fields are mandatory on the public-facing www site, but not mandatory on the internal admin site. Yet both sites have a "Create New" View for the same object/class.
This is where DataAnnotations hell surfaces...
I've tried using two different ViewModels with different validation applied to each of them, and two classes that inherit from Person with different validation applied to each of them. But all roads seem to conflict with DRY principles, and somewhere along the line you end up having to totally respecify all the properties of the underlying class structure. You don't have to do this when you have just one validation rule set. So it very quickly becomes hell and is not practical for complex objects.
Is this possible using DataAnnotations and what is the best DRY architecture?
Not sure what you mean by 'virtually duplicate and manually set each and every property manually in the original underlying class'. I've never liked the idea of buddy classes, and would personally recommend different view models for Admin and Public site (with appropriate validation set on each), and then mapping between the models using AutoMapper.
UPDATE:
Regarding AutoMapper, the basic usage is something like this:
First you have to define your mappings. This lets AutoMapper figure out in advance how to map objects. You only need to do this once in the application, so a good place to do it in an ASP.NET app is in Application_Start() in Global.asax. For each pair of classes you want to map between, call: Mapper.CreateMap<SourceType, DestinationType>();
Then, in your application code to do the map you just use:
var destinationObject = Mapper.Map<SourceType, DestinationType>(sourceObject);
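Putting it together for this scenario, a minimal sketch (the view model names and the Phone property are illustrative):

using System.ComponentModel.DataAnnotations;
using AutoMapper;

public class PublicPersonViewModel
{
    [Required] // mandatory on the public-facing site
    public string Phone { get; set; }
}

public class AdminPersonViewModel
{
    public string Phone { get; set; } // optional on the internal admin site
}

// In Application_Start():
Mapper.CreateMap<PublicPersonViewModel, Person>();
Mapper.CreateMap<AdminPersonViewModel, Person>();

// In the public site's "Create New" action:
Person person = Mapper.Map<PublicPersonViewModel, Person>(model);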

Should I use partial classes as the business layer when using Entity Framework?

I am working on a project using Entity Framework. Is it okay to use partial classes of the EF-generated classes as the business layer? I am beginning to think that this is how EF is intended to be used.
I have attempted to use a DTO pattern and soon realized that I am just creating a bunch of mapping classes, which duplicates my effort and is also a cause for more maintenance work and an additional layer.
I want to use self-tracking entities and pass the EF entities to all the layers. Please share your thoughts and ideas. Thanks
I had a look at using partial classes and found that exposing the database model up towards the UI layer would be restrictive.
For a few reasons:
The entity model created includes a deep relational object model which, depending on your schema, would get exposed to the UI layer (say, the presenter in MVP or the ViewModel in MVVM).
The business logic layer typically exposes operations that you can code against. If you see a Save method on the BLL, look at the parameters needed to do the save, and find a model that requires the construction of other entities (because of the relational nature of the entity model) just to do the save, then the operation is not being kept simple.
If you have a bunch of web services then the extra data will need to be sent across for no apparent gain.
You can create more immutable DTOs for your operations' parameters, rather than encountering side effects because the same instance was modified in some other part of the application.
If you do TDD and follow YAGNI, you will tend to have a structure specifically designed for the operation you are writing, which is easier to construct tests against (it does not require creating other objects unrelated to the test just because they are on the model). In this case you might have...
public class Order
{
    // ...
    public Guid CustomerID { get; set; }
    // ...
}
Instead of using the entity model generated by EF, which has references exposed...
public class Order
{
    // ...
    public Customer Customer { get; set; }
    // ...
}
This way, only the ID of the customer is needed for an operation that takes an order. Why would you need to construct a Customer (and potentially other objects as well) for an operation that is concerned with taking orders?
If you are worried about the duplication and mapping, then have a look at AutoMapper.
I would not do that, for the following reasons:
You lose the clear distinction between the data layer and the business layer
It makes the business layer more difficult to test
However, if you have some data-model-specific code, place it in a partial class to avoid it being lost when you regenerate the model.
I think a partial class would be a good idea. If the model is regenerated, you will not lose the business logic in the partial classes.
As an alternative, you can also look into EF 4 code-only so that you don't need to generate your model from the database.
I would use partial classes. There is no such thing as a data layer in DDD-ish code. There is a data tier, and it resides on SQL Server. The application code should contain only the business layer and some mappings which allow persisting business objects in the mentioned data tier.
Entity Framework is your data access code, so you shouldn't build your own. In most cases the database schema is modified because the model has changed, not the other way around.
That being said, I would discourage you from sharing your entities across all the layers. I value the separation of the UI and the domain layer. I would use DTOs to transfer data in and out of the domain. If I had the necessary freedom, I would even use the CQRS pattern to get rid of mapping entities to DTOs -- I would simply create a second EF data access project meant only for reading data for the UI. It would be built on top of the same database. You read data through a read (anemic -- without business logic) model, but you modify it by issuing commands that are executed against the real model implemented using EF and partial methods.
Does this answer your question?
I wouldn't do that. Try to keep the layers as independent as possible, so that a tiny change in your database schema will not affect all your layers.
Entities can be used in the data layer, but they should not be used beyond it.
If anything, provide interfaces to be used and let your entities implement them (in the partial class file); the BL should not know the entities, only the interfaces.
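A minimal sketch of that approach (the Customer entity and its members are assumptions for the example):

using System;

public interface ICustomer
{
    Guid Id { get; }
    string Name { get; }
}

// In the partial class file, alongside the EF-generated Customer
// (assumed to already expose Id and Name properties):
public partial class Customer : ICustomer { }

// The business layer then works with ICustomer rather than the entity.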