I’m working on a high-volume transactional enterprise application (ASP.NET, Windows app, and Oracle app as clients) designed using an n-tier and SOA architecture. The application was developed on the .NET platform using C#, VB.NET, Framework 3.5 (I’m planning to upgrade to Framework 4.0), EF in the data layer, and WCF services in the service layer.
Since this is my first project using EF, and having read about using EF in n-tier and SOA applications and the features available in EF, I have the following questions:
Which pattern should I use with EF at the data layer level: Simple Entities, Change Set, Self-Tracking Entities, or DTOs?
In addition, which design patterns should I use in the other tiers and layers to follow EF best practices?
Thanks
"In addition Which design pattern should I use in the other tier and layer to get the best practices of EF "
I would use the saperation of concerns using IoC at the root of my design patterns. for Data Layer Abstraction purpose I would definately go for Repository patterns. There are some interesting work which you see on the web for e.g. UnitOfWork for transactions etc.
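For illustration, here is a minimal sketch of that kind of data-layer abstraction. The Customer, Id and MyEntities names are assumptions, and the generated AddToCustomers method is the EF 3.5/4.0 ObjectContext style; adapt to your own model:

```csharp
using System;
using System.Linq;

// Contracts the upper layers depend on; the IoC container wires in the
// EF-backed implementations below.
public interface ICustomerRepository
{
    Customer GetById(int id);
    void Add(Customer customer);
}

public interface IUnitOfWork : IDisposable
{
    void Commit();
}

public class EfCustomerRepository : ICustomerRepository
{
    private readonly MyEntities _context;   // the EF ObjectContext (illustrative name)
    public EfCustomerRepository(MyEntities context) { _context = context; }

    public Customer GetById(int id)
    {
        return _context.Customers.FirstOrDefault(c => c.Id == id);
    }

    public void Add(Customer customer)
    {
        _context.AddToCustomers(customer);  // generated Add method on EF 3.5/4.0 contexts
    }
}

public class EfUnitOfWork : IUnitOfWork
{
    private readonly MyEntities _context;
    public EfUnitOfWork(MyEntities context) { _context = context; }

    public void Commit() { _context.SaveChanges(); }
    public void Dispose() { _context.Dispose(); }
}
```

The service layer then depends only on ICustomerRepository and IUnitOfWork, which keeps WCF services testable and independent of EF.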
Not sure about your knowledge in Repository pattern but here's a good start.
There is also a good project on the CodePlex called Project Silk which can give you a good heads up for both the above topics among others.
All the best
I am using Entity Framework 6 database-first. I am converting the project to implement the Onion Architecture to move towards better separation of concerns. I have read many articles and watched many videos but am having some issues deciding on my solution structure.
I have 4 projects: Core, Infrastructure, Web & Tests.
From what I've learned, the .edmx file should be placed under my "Infrastructure" folder. However, I have also read about using the Repository and Unit of Work patterns to assist with EF decoupling and using Dependency Injection.
With this being said:
Will I have to create repository interfaces under Core for ALL entities in my model? If so, how would one maintain this on a huge database? I have looked into AutoMapper but found issues with it returning IEnumerables vs. IQueryables, though there is an extension available to help with this. I can dig deeper into this route but want to hear back first.
As an alternative, should I leave my .edmx in Infrastructure and move the .tt T4 files for my entities to Core? Does this present any tight coupling, or is it a good solution?
Would a generic repository interface work well with the suggestion you provide? Or does EF6 already address the Repository and UoW patterns itself?
Thank you for looking at my question and please present any alternative responses as well.
I found a similar post here that was not answered:
EF6 and Onion architecture - database first and without Repository pattern
Database-first doesn't completely rule out Onion Architecture (aka Ports and Adapters or Hexagonal Architecture, so if you see references to those, they're the same thing), but it's certainly more difficult. Onion Architecture and the separation of concerns it allows fit very nicely with domain-driven design (I think you mentioned on Twitter you'd already seen some of my videos on this subject on Pluralsight).
You should definitely avoid putting the EDMX in the Core or Web projects - Infrastructure is the right location for that. At that point, with database-first, you're going to have EF entities in Infrastructure. You want your business objects/domain entities to live in Core, though. At that point you basically have two options if you want to continue down this path:
1) Switch from database first to code first (perhaps using a tool) so that you can have POCO entities in Core.
2) Map back and forth between your Infrastructure entities and your Core objects, perhaps using something like AutoMapper. Before EF supported POCO entities this was the approach I followed when using it, and I would write repositories that only dealt with Core objects but internally would map to EF-specific entities.
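As a rough sketch of option 2 (the names here are illustrative, and AutoMapper could replace the hand mapping):

```csharp
using System.Linq;

// Core project: persistence-ignorant domain type and repository contract.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public interface ICustomerRepository
{
    Customer GetById(int id);
}

// Infrastructure project: repository that deals only in Core objects,
// mapping internally to the EDMX-generated (database-first) entities.
public class CustomerRepository : ICustomerRepository
{
    private readonly MyDbEntities _context;   // EDMX-generated context (illustrative)
    public CustomerRepository(MyDbEntities context) { _context = context; }

    public Customer GetById(int id)
    {
        var efCustomer = _context.Customers.FirstOrDefault(c => c.CustomerId == id);
        if (efCustomer == null) return null;

        // Hand mapping shown for clarity; AutoMapper can do this instead.
        return new Customer { Id = efCustomer.CustomerId, Name = efCustomer.Name };
    }
}
```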
As to your questions about Repositories and Units of Work, there's been a lot written about this already, on SO and elsewhere. You can certainly use a generic repository implementation to allow for easy CRUD access to a large set of entities, and it sounds like that may be a quick way for you to move forward in your scenario. However, my general recommendation is to avoid generic repositories as your go-to means of accessing your business objects, and instead use Aggregates (see DDD or my DDD course w/Julie Lerman on Pluralsight) with one concrete repository per Aggregate Root. You can separate out complex business entities from CRUD operations, too, and only follow the Aggregate approach where it is warranted. The benefit you get from this approach is that you're constraining how the objects are accessed, and getting similar benefits to a Facade over your (large) set of database entities.
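For instance, a repository scoped to an Aggregate Root might look like this (Order and its lines are illustrative names; the point is that callers can only reach the children through the root):

```csharp
// One concrete repository per Aggregate Root: the aggregate (Order plus its
// OrderLines) is always loaded and saved as a whole, which constrains access
// much like a Facade over the underlying tables.
public interface IOrderRepository
{
    Order GetById(int orderId);
    void Add(Order order);
    void Save(Order order);
}
```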
Don't feel like you can only have one DbContext per application. It sounds like you are evolving this design over time, not starting with a green-field application. To that end, you could keep your .edmx file and perhaps a generic repository for CRUD purposes, but then create a new code-first DbContext for a specific set of operations that warrant POCO entities, separation of concerns, increased testability, etc. Over time, you can shift the bulk of the essential code to use this, while still keeping the existing DbContext so you don't lose any current functionality.
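A second, focused code-first context can simply live next to the EDMX-based one. A minimal sketch, with illustrative names and assuming Order/Customer POCOs defined in Core:

```csharp
using System.Data.Entity;

// A small code-first DbContext for one area of the application, used alongside
// the existing EDMX-generated context rather than replacing it.
public class OrderingContext : DbContext
{
    public OrderingContext() : base("name=OrderingConnection") { }

    public DbSet<Order> Orders { get; set; }
    public DbSet<Customer> Customers { get; set; }
}
```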
I am using Entity Framework 6.1 in my DDD project. Code-first works out very well if you want to do Onion Architecture.
In my project we have completely isolated the Repository from the Domain Model. The Application Service is what uses a repository to load aggregates from and persist aggregates to the database. Hence, there are no repository interfaces in the domain (Core).
The second option of using T4 to generate POCOs in a separate assembly is a good idea. Please remember that your domain model (Core) should be persistence-ignorant.
While generic repositories are good for enforcing aggregate-level operations, I prefer using specific repositories, simply because not every aggregate is going to need all of those generic repository operations.
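For example, a minimal application service along those lines might look like this (PlaceOrderService, IOrderRepository and Order are illustrative names):

```csharp
// The application service loads the aggregate through a repository, invokes
// domain behavior on it, and persists it; the domain model itself never sees
// the repository interface.
public class PlaceOrderService
{
    private readonly IOrderRepository _orders;

    public PlaceOrderService(IOrderRepository orders)
    {
        _orders = orders;
    }

    public void AddLine(int orderId, int productId, int quantity)
    {
        Order order = _orders.GetById(orderId);   // load the aggregate
        order.AddLine(productId, quantity);       // domain behavior on the aggregate root
        _orders.Save(order);                      // persist via the EF-backed repository
    }
}
```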
http://codingcraft.wordpress.com/
So we are focusing on developing an enterprise web application that utilizes DDD patterns with CQRS+ES. We have a pretty good handle on that at the enterprise level. Now, when we want to open up our backend services to native mobile devices using Xamarin and Portable Class Libraries, how does this come together? Do we change our domain projects in each of our bounded contexts to the PCL project type? What do we do with the MVVM side of things, for instance in a Windows Store app or Windows Phone app? Since we are pulling from a Web API service, do we pull in the PCL bounded-context library, or do we make a subset domain model and a separate PCL library for our native-client MVVM patterns?
Right now we are leaning towards leaving the original DDD projects as class libraries and just creating a separate Portable Class Library for our MVVM code. We will probably use file linking to link back into the domain projects to get the models, so that we always have the latest set of POCO objects and any DTO objects we want to use on the client. Does anyone else have any thoughts or ideas on this? I really don't see a lot of discussion around this DDD+PCL combination.
I have done a lot of thinking about this, and what I did to fit Xamarin into my current architecture with a DDD approach was:
Put your Domain Entities in a PCL project and reference it in all projects that need it, such as Xamarin.Forms, Xamarin.Android, Xamarin.iOS, ASP.NET, WCF, etc.
Your Domain Services can be in a normal class library that will be used by the Application layer. The Application layer will be used by the Presentation projects, such as ASP.NET MVC.
In the Distributed Services layer you're going to expose your services for Xamarin or other apps to communicate with your application. You can use ASP.NET Web API or WCF with REST. This layer will also use the Application layer, respecting the DDD concepts.
The Xamarin projects go in the presentation layer but do not use the Application layer. Here you will write the services for Xamarin to connect to your Distributed Services layer over the internet. If you need offline sync, you can also put that in here. Here you're going to reference your Domain Entities project and have all your entities with their business rules.
This way you have your domain and business rules shared across your whole solution, respecting DDD concepts and role separation.
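As a sketch, a Xamarin-side client service calling the Distributed Services layer and deserializing into the shared Domain Entities could look like this (the URL, route and Order type are assumptions; Json.NET and the HttpClient package are assumed to be referenced in the PCL):

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;
using Newtonsoft.Json;

// Presentation-layer service in the Xamarin project: talks HTTP to the
// Distributed Services layer and returns Domain Entities from the shared PCL.
public class OrderClientService
{
    private readonly HttpClient _http = new HttpClient
    {
        BaseAddress = new Uri("https://example.com/api/")
    };

    public async Task<Order> GetOrderAsync(int id)
    {
        var json = await _http.GetStringAsync("orders/" + id);
        return JsonConvert.DeserializeObject<Order>(json);
    }
}
```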
I want to start a new application on ASP.NET MVC4 using different approaches such as domain-driven design, design patterns, dependency injection, Entity Framework as the ORM, etc.
I need some advice on what the starting point of development should be. Should I start with the relationships between classes first, or start with the traditional approach?
E.g. there are three modules:
User Management.
Logging.
Error Logging.
Should I first complete user management (the domain classes, then its services, then its CRUD operations in the actual web application), and after that start with logging (following the same process as user management), and then error logging as well?
So what are the best practices for starting development using these kinds of concepts and tools?
ASP.NET MVC4 is just the presentation part of the solution. With a domain-driven approach you start with the domain (usually a separate library project) and then add presentation (web site, desktop application, etc.) and persistence (implementations of the repository and UoW interfaces declared in your domain).
So, you start by creating the domain model (not the whole thing, but part of it). Then, in any order, you create the UI which uses your domain model, and the implementations of the repositories that persist your domain model via Entity Framework. Actually, the views should use ViewModels (otherwise your POCO domain objects will be polluted with Data Annotations attributes and other stuff); it's in the controllers that you will use the domain model. You will also inject the repository implementations into the controllers via dependency injection.
I would start by looking at the business functionality requirements of the system and focus on the highest-value requirements first. Implement those, filling out your business domain as you need to, based on delivering the requirements. If you follow a BDD-style process, you can use unit tests to drive out the business functionality, and your domain will evolve as the business requirements evolve. Each business requirement should have a UI and a data-access component to it, so you can fill out the presentation layer and the data access layer with Entity Framework as the domain evolves. Here are a couple of useful posts on BDD:
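To make that structure concrete, here is a minimal sketch (all names are illustrative: a domain entity and repository interface in the domain project, an EF-backed implementation in the persistence project, and an MVC controller that receives the repository via DI and maps to a ViewModel):

```csharp
using System.Collections.Generic;
using System.Data.Entity;
using System.Linq;
using System.Web.Mvc;

// Domain project: entity and repository interface.
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public interface IProductRepository
{
    IEnumerable<Product> GetAll();
}

// Persistence project: EF code-first context and repository implementation.
public class ShopContext : DbContext
{
    public DbSet<Product> Products { get; set; }
}

public class EfProductRepository : IProductRepository
{
    private readonly ShopContext _context;
    public EfProductRepository(ShopContext context) { _context = context; }

    public IEnumerable<Product> GetAll() { return _context.Products.ToList(); }
}

// Web project: ViewModel plus a controller that gets the repository from the DI container.
public class ProductViewModel
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class ProductsController : Controller
{
    private readonly IProductRepository _products;
    public ProductsController(IProductRepository products) { _products = products; }

    public ActionResult Index()
    {
        var model = _products.GetAll()
            .Select(p => new ProductViewModel { Id = p.Id, Name = p.Name })
            .ToList();
        return View(model);
    }
}
```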
BDD By Example
Introducing BDD
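As a rough illustration of driving a piece of business functionality with a test first (NUnit-style; the User type and its rule are assumptions for the example):

```csharp
using NUnit.Framework;

[TestFixture]
public class When_registering_a_new_user
{
    [Test]
    public void The_user_is_inactive_until_the_email_is_confirmed()
    {
        // Drives the domain behavior before any UI or EF persistence exists.
        var user = User.Register("alice@example.com");

        Assert.That(user.IsActive, Is.False);
    }
}
```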
My development team is evaluating the various frameworks available for .NET to simplify our programming, one of which is CSLA. I have to admit to being a bit confused as to whether or not CSLA would benefit from being used in conjunction with a dependency injection framework, such as Spring.net or Windsor. If we combined one of those two DI frameworks with, say, the Entity Framework to handle ORM duties, does that negate the need or benefit of using CSLA altogether?
I have various levels of understanding of all these frameworks, and I'm trying to get a big picture of what will best benefit our enterprise architecture and object design.
Thank you!
CSLA is a framework for creating business entities, so it has separate concerns from an IoC container or an ORM. In an enterprise application you should consider the benefits of all three.
In particular, you should consider CSLA if you want data binding built in to your models, dirty checking, N-level undo, validation and business rules, as well as the data portal implementation which allows easy configuration for n-tier deployments.
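To make that concrete, a minimal CSLA-style editable object (CSLA 4.x managed-property syntax; the Customer class and its rule are illustrative) looks roughly like this, and it picks up dirty checking, validation and data binding support from the base class:

```csharp
using System;
using Csla;

[Serializable]
public class Customer : BusinessBase<Customer>
{
    public static readonly PropertyInfo<string> NameProperty =
        RegisterProperty<string>(c => c.Name);

    public string Name
    {
        get { return GetProperty(NameProperty); }
        set { SetProperty(NameProperty, value); }
    }

    protected override void AddBusinessRules()
    {
        base.AddBusinessRules();
        // Built-in rule: Name is required; IsValid, IsDirty, undo, etc. come from BusinessBase.
        BusinessRules.AddRule(new Csla.Rules.CommonRules.Required(NameProperty));
    }
}
```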
Short answer: Yes.
Long answer: It requires a bit of grunt work and some experimentation to set up, but it can be done without fundamentally breaking CSLA. I put together a working prototype using StructureMap and the repository pattern, and used the BuildUp method (setter injection) to inject dependencies within CSLA. I used a method similar to the one found here to ensure that my business objects are re-injected when the objects are serialized.
I also use the registry base class of StructureMap to separate my configuration into presentation, CSLA client, CSLA server, and CSLA global settings. This way I can use the linked file feature of Visual Studio to include the CSLA server and CSLA global configuration files within the server-side Data Portal and the configuration will always be the same in both places. This was to ensure I can still change the Data Portal configuration settings in CSLA from 2 tier to 3 tier without breaking anything.
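For illustration, the registry split might look something like this (StructureMap 2.x-style syntax; the registry and type names are assumptions):

```csharp
using StructureMap;
using StructureMap.Configuration.DSL;

// Client-side wiring: repositories resolve to data-portal/proxy implementations.
public class CslaClientRegistry : Registry
{
    public CslaClientRegistry()
    {
        For<ICustomerRepository>().Use<CustomerRepositoryProxy>();
    }
}

// Server-side wiring: the same interfaces resolve to EF-backed implementations.
public class CslaServerRegistry : Registry
{
    public CslaServerRegistry()
    {
        For<ICustomerRepository>().Use<EfCustomerRepository>();
    }
}

// Composition root picks the registries appropriate to the tier.
public static class IoC
{
    public static void ConfigureClient()
    {
        ObjectFactory.Initialize(x => x.AddRegistry<CslaClientRegistry>());
    }
}
```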
Anyway, I am still weighing the potential benefits with the drawbacks to using DI, but so far I am leaning in the direction of using it because testing will be much easier although I am skeptical of trying to use any of the advanced features of DI such as interception. I recommend reading the book Dependency Injection in .NET by Mark Seemann to understand the right and wrong way to use DI because there is a lot of misinformation on the Internet.
I have a bit of free time and I am planning to catch up on some new technology. I started off my .NET development career as an ASP.NET developer. Presently I am done with ASP.NET development; for that matter, I am done with any front-end development. These days I am into business layer and DAL development with a primary focus on WCF service development, and I am going to continue doing so. Given the situation, which one would be more helpful for me moving forward: ADO.NET Data Services or Entity Framework?
Those are two totally separate items:
ADO.NET Entity Framework is a data-access and data-modelling technology for handling database storage and modelling objects on top of it
ADO.NET Data Services is a RESTful way of exposing such data models to a wide audience, by means of browsers and URLs
So basically, first you need to know a bit about Entity Framework, and then you should learn how to make that available to the world at large.
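For example, exposing an EF model through ADO.NET Data Services is essentially a one-class affair (NorthwindEntities is an illustrative EDMX-generated context; the access rules should be tightened per entity set in a real service):

```csharp
using System.Data.Services;

public class NorthwindDataService : DataService<NorthwindEntities>
{
    public static void InitializeService(IDataServiceConfiguration config)
    {
        // Read-only access to every entity set; restrict per set as needed.
        config.SetEntitySetAccessRule("*", EntitySetRights.AllRead);
    }
}
```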
Marc
ADO.NET Entity Framework and ADO.NET Data Services are different.