AutoMapper - Convert two entity objects to a single DTO

I have two Entity Framework models that I want to combine into a single DTO. Is there a way to do this? There are a couple of ideas in the following question, but you would either have to create a composite model or lose the ability to call Mapper.AssertConfigurationIsValid to verify that all of the properties will be set.
Is it possible to map multiple DTO objects to a single ViewModel using Automapper?

From my point of view, it is highly recommended to create a composite type for merging entities. Entities are part of your business logic or your domain logic (depending on your architecture), whereas DTOs are part of presentation logic or the transport layer. You can create an explicit mapping that can be easily tested; automatic mapping (creating maps without options) is good for testing only. If you are using a DTO, then you will probably use it somewhere: in WCF? As a ViewModel?
Visual Studio and the .NET Framework can manage many files, and you do not have to sacrifice testability or simplicity (are you familiar with "technical debt"?).
Note: the role of Mapper.AssertConfigurationIsValid is to validate all mappings, whether created automatically or explicitly. I suggest you call it every time.
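To make this concrete, here is a minimal sketch of a composite DTO with an explicit mapping, assuming hypothetical Customer and Address entities and AutoMapper's classic static API:

using AutoMapper;

// Hypothetical entities and a composite DTO combining both.
public class Customer { public int Id { get; set; } public string Name { get; set; } }
public class Address { public string City { get; set; } }

public class CustomerDto
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string City { get; set; }
}

public static class MappingConfig
{
    public static void Configure()
    {
        // Explicit map from Customer; City is filled manually from Address below.
        Mapper.CreateMap<Customer, CustomerDto>()
              .ForMember(d => d.City, opt => opt.Ignore());

        // Fails fast if any destination member is neither mapped nor ignored.
        Mapper.AssertConfigurationIsValid();
    }

    public static CustomerDto ToDto(Customer customer, Address address)
    {
        var dto = Mapper.Map<CustomerDto>(customer);
        dto.City = address.City;
        return dto;
    }
}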

Related

3-Tier Architecture MVC 4 Project

I am working on a web application using ASP.NET MVC 4 with a 3-tier architecture, and I am stuck at certain points. I know similar threads exist here, but none were clear enough. I have already created the needed layers: UI (MVC 4 project), BLL - Business Logic Layer (class library), BOL - Business Object Layer (class library that contains ADO.NET), and DAL - Data Access Layer (class library).
Layer dependencies are as follows:
UI depends on BOL and BLL
BLL depends on BOL and DAL
DAL depends on BOL
I want you to correct me if I am wrong in the following: the BOL is the master reference layer, which exchanges raw DB records with the DAL, then sends them to the BLL, which is responsible for any logical computations, then gets the updated records and sends them to the controller in the UI.
Knowing the above,
Where should we place the CRUD functions?
Where and why should we create a class for declaring (plus set and get) the useful database fields?
What exactly should we put in the ViewModel folder; in other words since we have already defined the variables in the previous step and in the Entity then does it add any value to keep the Model folder?
Thank you in advance.
There can be no unambiguously correct answers to these questions, so any answer should be evaluated simply as an opinion. Here are mine:
Where should we place the CRUD functions? CRUD is a frequent pattern at different levels. A high-level Repository provides similar methods to a low-level Table Gateway.
Where and why should we create a class for declaring (plus set and get) the useful database fields? These classes are usually simple Data Transfer Objects (DTOs) or slightly more complicated Active Records. In accordance with the Dependency Inversion Principle, the interfaces for these classes should be provided by the Business Logic Layer and the implementations by the Data Access Layer. Since DTOs are very simple classes, they can be provided "as is", without interface/implementation separation.
What exactly should we put in the ViewModel folder; in other words, since we have already defined the variables in the previous step and in the entity, does it add any value to keep the Model folder? In theory entities should be our models, but in practice that is often not the case. For example, in ASP.NET MVC a model should provide not only the selected value of a drop-down field but also all its possible values, which requires a separate model class (see the sketch below).
The result may be that you have three or four very similar classes at different levels of the application. Of course, that is not very good, so aspect-oriented programming may be applied in this case.
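As a minimal sketch of the drop-down point, assuming a hypothetical Order entity and ASP.NET MVC's SelectListItem:

using System.Collections.Generic;
using System.Web.Mvc;

// The entity stores only the selected value...
public class Order
{
    public int CountryId { get; set; }
}

// ...while the view model must also carry every possible value for the drop-down.
public class OrderEditViewModel
{
    public int CountryId { get; set; }
    public IEnumerable<SelectListItem> Countries { get; set; }
}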

Pushing OData filters down through layers

I have a working project with the following layers:
DataAccess - The project hits multiple DBs and web services. This layer defines the interfaces to each of them. It exposes the native types for each source (EF types for DBs, SOAP-defined types for web services, etc.). Let's say it exposes an EFProject and SoapProject.
Repository - This layer stitches results from the various sources together to form a single entity, and exposes it. Let's call this ModelProject.
Service - Adds REST attributes to the entity (action links, etc.). This exposes a ProjectDTO.
WebApi - The controller spits out the ProjectDTO directly.
I'm trying to implement OData, specifically to page the results of very large queries. I've read a lot of examples, but they all seem to expose the source objects directly, and then map them to the final DTOs in the controller.
I would like to somehow push the ODataQueryOptions down to the Repository. This would allow me to keep the existing structure, and pass the query logic down to SQL. I understand that, because the ODataQueryOptions reference the ProjectDTO type, they can't be applied until an object of that type is available. Is there a way to "translate" the ODataQueryOptions from one type to another? Is there another way of doing this that I'm not aware of?
You can accept the ODataQueryOptions directly and handle the options yourself; action signatures like these in a controller let you do that:
public IQueryable Get(ODataQueryOptions queryOptions)
public IQueryable Get(int key, ODataQueryOptions queryOptions)
Here is a sample about this:
https://aspnet.codeplex.com/SourceControl/latest#Samples/WebApi/OData/v4/ODataQueryableSample/Controllers/OrdersController.cs
For your reference, the source code of ODataQueryOptions is: https://aspnetwebstack.codeplex.com/SourceControl/latest#src/System.Web.OData/OData/Query/ODataQueryOptions.cs
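A minimal sketch of handling the options yourself, assuming a hypothetical Order entity and repository; ODataQueryOptions.ApplyTo applies $filter, $orderby, $top, and so on to any IQueryable without executing it:

using System.Linq;
using System.Web.OData.Query;

public IQueryable Get(ODataQueryOptions<Order> queryOptions)
{
    // Optionally validate the query against limits you want to enforce.
    queryOptions.Validate(new ODataValidationSettings { MaxTop = 100 });

    IQueryable<Order> orders = _repository.GetOrders(); // hypothetical repository call
    return queryOptions.ApplyTo(orders);                // still un-executed at this point
}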
Is there a way to "translate" the ODataQueryOptions from one type to another? Is there another way of doing this that I'm not aware of?
There is a way to actually translate the IQueryable<DomainModel> to IQueryable<DtoModel>.
I've done something similar in the past by leveraging AutoMapper's projection functionality. By calling the Project().To<TTarget>() methods, you can change an IQueryable that points to your domain models into another IQueryable that targets the DTO models, without actually executing it.
This means that you can now perform any OData operations on the DTO level and they will transfer through projection to the DAL layer into EntityFramework and SQL. In a scenario like this, there shouldn't be any need to manually handle the query logic so you can just use [EnableQuery] on the API route and let OData do its thing on the resulting IQueryable<DtoModel>.
I used this very successfully in one of the projects I worked on: as long as you rely just on AutoMapper projection to convert the types, it should work fine.
Granted, you can't do a lot of fancy mapping that way. The projection methods cannot apply every kind of mapping you can create, so I recommend checking the documentation on that front.
You also have to keep in mind that the original IQueryable needs to be exposed outside of the repository layer for this to work properly, otherwise the query will be executed too early. Some people will find that a boundary violation and will advocate for materializing the query inside the repository layer, but I don't have an issue with that particular aspect.
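A minimal sketch of the approach, assuming hypothetical Project/ProjectDTO types and AutoMapper's QueryableExtensions (Project().To<T>() here; later versions call it ProjectTo<T>()):

using AutoMapper.QueryableExtensions;

// Repository: return an un-executed IQueryable of DTOs projected from EF entities.
public IQueryable<ProjectDTO> GetProjects()
{
    return _context.Projects.Project().To<ProjectDTO>();
}

// WebApi controller: [EnableQuery] lets OData apply its operators to the DTO queryable,
// and the projection carries them through EF down to SQL.
[EnableQuery]
public IQueryable<ProjectDTO> Get()
{
    return _repository.GetProjects();
}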

Using my own domain model objects with .edmx in Entity Framework

I have a domain model architecture in which my domain/business objects were created based on the problem domain, independent of any knowledge of the physical data model or persistence structures. So far I'm on track, because it's perfectly acceptable, and often the case, that there is an impedance mismatch between the domain model and the data model. A DBA created the database for getting the data they required, but it does not encapsulate the application's entire domain model or design.
The result: I have my own set of domain model objects. All of the fields that need to be persisted do exist somewhere within my domain model, but not necessarily in the shape that my auto-generated .edmx POCO entities have them. So I have all the data; it's just not shaped exactly like the tables from which the auto-generated POCO entities are created.
I have seen a few posts on this topic like converting POCO entity to business entity and Entity Framework 4 with Existing Domain Model that make statements like the following:
"Create the entities in your entity data model with the same names as
your domain classes. The entity properties should also have the same
names and types as in the domain classes"
What!? No way. Why should I have to reshape my domain model into POCOs that are modeled exactly after the data model / table structure in the database? For example, in my case 5 given properties might be split with 2 in class 'A' and 3 in class 'B', whereas an auto-generated POCO class has all 5 in its own class 'A'.
That is the entire point: I want separation of my object model and data model, yet I still want to use an ORM like EF 5.0 to map between them. I do not want to have to create and shape classes and properties named exactly as they are in the data model.
Right now my .edmx in EF 5.0 is generating the POCO classes for me, but my question is how to dissolve these and rewire everything to my domain objects, which contain all this data, just in a different shape.
By the way, any solution proposed using a Code First approach is not an option, so please do not offer one. I need some guidance or, ideally, a tutorial using EF5 (EF4 examples always inherit POCOs from ObjectContext) on wiring up my own business objects to the .edmx.
Any help or guidance is appreciated, thanks!
This sounds like exactly the use case for Entity Framework. I am making a few assumptions here. First, that when you make this statement:
"I have a domain model architecture in which my domain/business objects were created based on the problem domain and independent of any knowledge of the physical data model or persistence structures."
That you mean this domain was created in the EF designer? But then you say:
"However all of the fields that need to be persisted do exist somewhere or another within my domain model, but not necessarily in the shape that my auto generated .edmx POCO entities have them."
This sounds to me like my first assumption is incorrect.
Next, you dismiss Code First? If your domain model/business objects are code-based and you want to persist them to a relational database, that is exactly the use case for Code First. You have the code; now you need to create your DbContext and map it to your physical model.
However you dismiss that... so some thoughts:
If you have a domain model of code-based business objects and you have an EDMX that is used for other things, I think you would want to create a repository layer that uses something like AutoMapper or manual projections to query your entities and return your business objects.
If you have a domain model of code-based business objects and you have an EDMX that is not used for anything other than persisting your business objects, I would say that you need to express your domain in an EDMX and then map it onto your existing database. This is really the use case for an ORM. Having two domain models and mapping from one model to the other, where one model matches your domain and one matches your database, is adding an extra, unneeded layer of plumbing.
The latter approach above is what is called "Model First" in EF parlance. There are several articles written about it, although the bulk of them just generate the DB from the model. You would not do that step; rather, you would map your entities onto your existing database.
The basic steps for this are to "update from the database" while not selecting any of the DB objects (otherwise entities would be created). Or you can take your existing .edmx in the designer (which it sounds like you have) and modify the entities to match your business domain. Or just delete all the entities in your EDMX model, create the entities as you want them, and then map them all.
Here is a Jing screencast I made where I use the EF designer to bring in the model store (the only way to do this is to allow it to generate entities) and then delete the entities, allowing the store information to stay by clicking No when asked whether to delete the table info.
http://screencast.com/t/8eiPg2kcp
I didn't add the POCO generator to this, but if I did it would generate the Entities in the designer as POCO classes.
The statement quoted above is not suggesting that you rewrite your domain objects to match your POCOs; it is suggesting that you update the EDMX to match your domain model.
In your example you could create an entity in your EDMX that maps all 5 properties from both tables, and EF will manage the mapping to and from the single generated POCO onto your tables.
Of course this means that you then have duplicate domain objects and POCOs, meaning you would either have to manually convert your domain objects to POCOs when interacting with EF,
or you could define your domain data objects as interfaces and provide partial implementations of the POCOs that essentially identify each POCO as a concrete implementation of a domain object.
There are probably several other ways to skin this particular cat, but I don't think you can provide predefined C# objects for use in an EDMX.
One approach might be to select into a ViewModel (suited to your particular business logic) and automatically map some data from the context into it like so https://stackoverflow.com/a/8588843/201648. This uses AutoMapper to map entity properties from an EF context into a ViewModel. This won't do everything for you, but it might make life a bit easier. If you're unhappy with the way this occurs automatically, you can configure AutoMapper to do things a bit differently (see Projection) - https://github.com/AutoMapper/AutoMapper/wiki/Projection
You might know this already, but it's also possible to automatically generate POCOs from your EDMX using T4 - http://visualstudiogallery.msdn.microsoft.com/72a60b14-1581-4b9b-89f2-846072eff19d. If you define the templates to generate partial classes, you can then have another partial class with your custom properties for that POCO. That way you can have most properties populated automatically, but have other custom properties which you populate with custom rules from your context/repository. This takes a lot of the monotony out of generating these, and you can then focus on reshaping the data using the above technique. A sketch of the split follows below.
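A minimal sketch of that partial-class split, assuming a hypothetical generated Order POCO:

// Order.cs - generated by the T4 template, overwritten on every regeneration.
public partial class Order
{
    public int OrderId { get; set; }
    public decimal Total { get; set; }
}

// Order.Custom.cs - your own file, which survives regeneration.
public partial class Order
{
    // Custom, non-mapped property populated from your own rules.
    public string DisplayName
    {
        get { return string.Format("Order #{0} ({1:C})", OrderId, Total); }
    }
}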
If you're seriously unhappy with both, you could always map a stored procedure to get the exact field names that you want automatically without needing to stuff around. This will of course affect how you work with the data, but I have done it before for optimisation purposes/where a procedure already existed that did exactly what I wanted. See http://msdn.microsoft.com/en-us/data/gg699321.aspx

Which variant of Entity Framework to use in WCF based enterprise app

We are in a process of designing an application with approx 100 tables and complicated business logic. Windows Forms will be used on the client side and WCF services with MSSQL on the server.
Custom DTOs are used for client-server communication; business entities are not distributed.
Which variant of Entity Framework to use (and why):
EF 4.0 EntityObjects
EF 4.0 POCO
EF 4.1 DbContext
Something else
Database-first approach is a requirement.
Also, is it worth implementing the Repository pattern? It seems a bit redundant, as there is one level of abstraction in the mapping itself and another one in the use of DTOs. I'm currently leaning towards using auto-generated extendable repositories for each entity returning IQueryable, just to have a place to put common queries, while still allowing the entity model to be queried directly from the service layer.
Which variant to use? Basically, once you have custom DTOs, the only questions are: do you want to have control over the entities' code (their base class) and make them independent of EF? Do you want to use code first? If the answers to all these questions are no, then you can use EntityObjects. If you want your entities to be persistence-ignorant or to use a custom base class, you should go with POCO. If you want to use code first or the new DbContext API, you will need EF 4.1. Some related topics:
EF 4.1 Code-first vs Model/Database-first
EF POCO code only VS EF POCO with Entity Data Model (this was related to CTP)
ADO.NET DbContext Generator vs. ADO.NET POCO Entity Generator
EF Model First or Code First Approach?
There are more things to consider when designing the service layer. You should be aware of the complications you will have to deal with when using EF in WCF. Your service will provide data to the WinForms application, which will work with the data in "detached mode". Once the user has made all the changes he wants, he will post the data back to the service. But here comes the problem: you must tell EF what has changed. If you, for example, allow the user to change an order with all its order items (change quantities, add new items, delete some items), you must tell EF exactly what has changed, what was added, and what was deleted. That is easy when you work with a single entity, but once you allow the user to change an object graph (especially many-to-many relations), it gets quite tough. The most common solution is loading the whole graph and merging the state from the incoming DTOs into the loaded and attached graph (see the sketch below). The other solution is using self-tracking entities instead of EntityObjects/POCOs + DTOs.
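A minimal sketch of the load-and-merge approach, assuming hypothetical Order/OrderItem entities, an OrderDto coming back from the client, and the DbContext API:

using System.Linq;

public void UpdateOrder(OrderDto dto)
{
    using (var context = new ShopContext()) // hypothetical context
    {
        // Load the current graph so EF can track exactly what changes.
        var order = context.Orders
                           .Include("Items")
                           .Single(o => o.Id == dto.Id);

        // Delete items the client removed.
        foreach (var item in order.Items
                                  .Where(i => dto.Items.All(d => d.Id != i.Id))
                                  .ToList())
            context.OrderItems.Remove(item);

        // Add new items and merge changes into existing ones.
        foreach (var itemDto in dto.Items)
        {
            var item = order.Items.SingleOrDefault(i => i.Id == itemDto.Id);
            if (item == null)
                order.Items.Add(new OrderItem { Quantity = itemDto.Quantity });
            else
                item.Quantity = itemDto.Quantity;
        }

        context.SaveChanges();
    }
}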
When discussing repositories, I would refer you to this answer, which links to many other answers discussing repositories, their possible redundancy, and possible mistakes when using them just to make your code testable. Generally, each layer should be added only if there is a real need for it, due to better separation of concerns.
The main advantage of POCOs is that those classes can be your DTOs, so if you've already got custom DTOs that you're using, POCO seems a bit redundant. However, there are some other advantages which may or may not have value to you, since you didn't mention unit testing as a requirement. If you plan to write unit tests, then POCO is still the way to go. You probably won't notice much difference between 4.0 POCO and 4.1 since you won't be using the code-first feature (disclaimer: I've only used 4.0 POCO, so I'm not intimately familiar with any minor differences between the two, but they seem to be more or less the same--basically I was already using POCO in 4.0 and haven't seen anything that's made me want to update everything to use 4.1).
Also, depending on whether you plan to unit-test this layer, there's still value in implementing the repository/unit of work patterns when using Entity Framework. It serves to abstract away the data access logic (the context), not the entities themselves, and allows you to do things like mocking your context in unit tests. What I do is copy the T4 template for my context and use it to create the interface, then edit the T4 template for the context and have it implement that interface and use IObjectSet<T> instead of ObjectSet<T>. So instead of:
public class MyEntitiesContext
{
    public ObjectSet<MyClass> MyEntities
    ...
}
I end up with:
public interface IMyEntitiesContext
{
    IObjectSet<MyClass> MyEntities { get; }
}
and
public class MyEntitiesContext : IMyEntitiesContext
{
    public IObjectSet<MyClass> MyEntities
    ...
}
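With the interface in place, the context can be stubbed in unit tests. A sketch assuming Moq and an in-memory list:

using System.Collections.Generic;
using System.Data.Objects;
using System.Linq;
using Moq;

// Back the mocked IObjectSet<T> with an in-memory IQueryable.
var data = new List<MyClass> { new MyClass() }.AsQueryable();

var mockSet = new Mock<IObjectSet<MyClass>>();
mockSet.Setup(m => m.Provider).Returns(data.Provider);
mockSet.Setup(m => m.Expression).Returns(data.Expression);
mockSet.Setup(m => m.ElementType).Returns(data.ElementType);
mockSet.Setup(m => m.GetEnumerator()).Returns(data.GetEnumerator());

var mockContext = new Mock<IMyEntitiesContext>();
mockContext.Setup(c => c.MyEntities).Returns(mockSet.Object);
// Code under test receives mockContext.Object and never touches the database.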
So I guess it really comes down to whether or not you plan to write unit tests for this layer. If you won't be doing anything that would require mocking out your context for testing, then the easiest thing to use would probably be 4.0 EntityObjects, since you aren't planning to pass your entities between layers and it would require the least effort to implement. If you plan to use mocking, then you'll probably want to use POCO and implement repository/unit of work.

Should I use partial classes as the business layer when using Entity Framework?

I am working on a project using Entity Framework. Is it okay to use partial classes of the EF-generated classes as the business layer? I am beginning to think that this is how EF is intended to be used.
I have attempted to use a DTO pattern and soon realized that I am just creating a bunch of mapping classes that duplicate my effort and are also a cause of more maintenance work and an additional layer.
I want to use self-tracking entities and pass the EF entities to all the layers. Please share your thoughts and ideas. Thanks.
I had a look at using partial classes and found that exposing the database model up towards the UI layer would be restrictive.
For a few reasons:
The entity model created includes a deep relational object model which, depending on your schema, would get exposed to the UI layer (say the presenter of MVP or the ViewModel in MVVM).
The business logic layer typically exposes operations that you can code against. If you see a save method on the BLL whose parameters require the construction of other entities (because of the relational nature of the entity model) just to do the save, it is not keeping the operation simple.
If you have a bunch of web services then the extra data will need to be sent across for no apparent gain.
You can create immutable DTOs for your operations' parameters rather than encountering side effects because the same instance was modified in some other part of the application.
If you do TDD and follow YAGNI, then you will tend to have a structure specifically designed for the operation you are writing, which is easier to construct tests against (not requiring you to create other objects unrelated to the test just because they are on the model). In this case you might have...
public class Order
{
    ...
    public Guid CustomerID { get; set; }
    ...
}
Instead of using the entity model generated by EF, which has references exposed...
public class Order
{
    ...
    public Customer Customer { get; set; }
    ...
}
This way, only the ID of the customer is needed for an operation that takes an order. Why would you need to construct a Customer (and potentially other objects as well) for an operation that is concerned with taking orders?
If you are worried about the duplication and mapping, then have a look at AutoMapper.
I would not do that, for the following reasons:
You lose the clear distinction between the data layer and the business layer.
It makes the business layer more difficult to test.
However, if you have some data-model-specific code, place it in a partial class to avoid it being lost when you regenerate the model.
I think partial classes would be a good idea: if the model is regenerated, you will not lose the business logic in the partial classes.
As an alternative, you can also look into EF4 code-only so that you don't need to generate your model from the database.
I would use partial classes. There is no such thing as a data layer in DDD-ish code. There is a data tier, and it resides on SQL Server. The application code should contain only the business layer and some mappings which allow persisting business objects in the mentioned data tier.
Entity Framework is your data access code, so you shouldn't build your own. In most cases the database schema would be modified because the model has changed, not the opposite.
That being said, I would discourage you from sharing your entities across all the layers. I value separation of the UI and the domain layer. I would use DTOs to transfer data in and out of the domain. If I had the necessary freedom, I would even use the CQRS pattern to get rid of mapping entities to DTOs: I would simply create a second EF data access project meant only for reading data for the UI, built on top of the same database. You read data through a read (anemic, without business logic) model, but you modify it by issuing commands that are executed against the real model implemented using EF and partial methods.
Does this answer your question?
I wouldn't do that. Try to keep the layers as independent as possible, so that a tiny change in your database schema does not affect all your layers.
Entities can be used for the data layer, but they should not go beyond it.
If anything, provide interfaces and let your entities implement them (in the partial file); the BL should not know the entities, only the interfaces. A sketch follows below.
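A minimal sketch of that interface approach, assuming a hypothetical generated Order entity whose generated part already declares CustomerId and Total:

// Defined in the business layer; this is all the BL knows about.
public interface IOrderInfo
{
    System.Guid CustomerId { get; }
    decimal Total { get; }
}

// In the partial file next to the generated entity; survives regeneration.
// The generated part already declares matching CustomerId and Total properties,
// so the entity satisfies the interface with no extra members.
public partial class Order : IOrderInfo
{
}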