I'm not sure about my implementation of this project using MVVM. I have the solution separated into the following projects:
ProjTitle.Ui.Wpf
ProjTitle.ViewModel
ProjTitle.Bal
ProjTitle.Dal
ProjTitle.Bo
ProjTitle.Common
Bo holds just the data objects, Dal deals with the DB, and Bal takes the data produced by Dal and does things such as computation, business rules, simple LINQ, etc.
I'm not using a helper class for the Dal; I think that is an old-fashioned way of processing/getting data from the database.
Sometimes the Bal feels redundant for simple functions, but it really helps for some operations.
Is this implementation a bad practice?
You should implement it in the following way:
WPF Prj:
- Views
- ViewModel & Commands
- Helpers
DAL Prj:
- EF Model
- Service classes
BLL:
- Processes the data coming from your DAL, applying some business logic
Facade:
- allowing you to talk to your BLL->DAL
And finally your Facade is gonna be used by the Commands of your ViewModel
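For illustration, here's a minimal sketch (all names hypothetical) of that chain - the command handler in the ViewModel only talks to the Facade, the Facade delegates to the BLL, and the BLL uses the DAL:
using System.Collections.Generic;
using System.Linq;

public class CustomerDto { public string Name { get; set; } public bool IsActive { get; set; } }

// DAL: the service class that would run the EF query (stubbed here).
public class CustomerService
{
    public IList<CustomerDto> GetCustomers() { return new List<CustomerDto>(); }
}

// BLL: applies business rules on top of the DAL.
public class CustomerLogic
{
    private readonly CustomerService _dal = new CustomerService();
    public IList<CustomerDto> GetActiveCustomers() { return _dal.GetCustomers().Where(c => c.IsActive).ToList(); }
}

// Facade: the single entry point the ViewModel's commands call into.
public class CustomerFacade
{
    private readonly CustomerLogic _bll = new CustomerLogic();
    public IList<CustomerDto> GetActiveCustomers() { return _bll.GetActiveCustomers(); }
}

// ViewModel: the command's Execute handler simply calls the facade.
public class CustomersViewModel
{
    private readonly CustomerFacade _facade = new CustomerFacade();
    public IList<CustomerDto> Customers { get; private set; }

    public void LoadCustomers() { Customers = _facade.GetActiveCustomers(); }
}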
Related
I have many models that define all my DB tables; I'm wondering what the best way is to create one single CRUD ServiceStack interface for all these models without writing the same code for each one.
I'd like to keep it DRY to ease future maintenance.
Thank you.
Check out AutoQuery, which lets you expose rich, queryable APIs for each table just by declaring its Request DTO:
[Route("/movies")]
public class FindMovies : QueryBase<Movie> {}
You want a typed Request DTO for each Service, but other than that you can use a base class, shared extension or utility methods to execute common logic as you would in normal C#. The built-in Auto Mapping also reduces the boilerplate for populating a Table POCO from a request DTO.
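For example, a rough sketch (hypothetical DTO and table names, assuming OrmLite as your data layer) of sharing the common logic in a base Service and using ConvertTo<T> for the Auto Mapping:
using ServiceStack;
using ServiceStack.OrmLite;

public class Movie { public int Id { get; set; } public string Title { get; set; } }

public class CreateMovie : IReturn<Movie>
{
    public string Title { get; set; }
}

public abstract class CrudServiceBase : Service
{
    // Shared helper: map any request DTO onto its table POCO and insert it.
    protected T Create<T>(object requestDto)
    {
        var row = requestDto.ConvertTo<T>();   // built-in Auto Mapping
        Db.Insert(row);
        return row;
    }
}

public class MovieService : CrudServiceBase
{
    public object Post(CreateMovie request)
    {
        return Create<Movie>(request);
    }
}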
In my project, I need to use EF and abstract the queries away from the Presentation layer. From the questions and answers I've been reading all over the net, EF already has the repository pattern built into its DbSet and a unit of work in DbContext.
The repository pattern would easily meet the requirement, but I don't want to repeat that implementation, and I'm now confused about where I should initialize or access the DbContext. Should it be in the service layer?
ASP.NET MVC 4 Web API will be used for this project.
One way I have seen this done in the past is to essentially remove the DbContext's dependency on a physical database by creating an interface for your context, and then making your data access calls from your Services Layer (Business Logic Layer).
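As a rough sketch (hypothetical names), the idea looks something like this - the service layer only sees the interface, so a fake context can be substituted in tests:
using System.Data.Entity;
using System.Linq;

public class Order { public int Id { get; set; } public decimal Total { get; set; } }

public interface IShopContext
{
    IDbSet<Order> Orders { get; }
    int SaveChanges();
}

public class ShopContext : DbContext, IShopContext
{
    public IDbSet<Order> Orders { get; set; }
}

// Service layer (business logic layer) working against the abstraction.
public class OrderService
{
    private readonly IShopContext _context;
    public OrderService(IShopContext context) { _context = context; }

    public Order GetOrder(int id)
    {
        return _context.Orders.SingleOrDefault(o => o.Id == id);
    }
}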
There is, however, a disadvantage to this approach: your unit tests (which will use a fake implementation of your DbContext) will run your queries with LINQ to Objects, whereas your concrete implementation will use LINQ to Entities, which does not support all LINQ to Objects methods.
There's documentation on MSDN (http://msdn.microsoft.com/en-us/library/bb738550.aspx) which highlights these differences.
I also recommend reading this article (http://kearon.blogspot.com.au/2011/02/mocking-entity-framework-4-code-first.html), which demonstrates how to make DbContext unit testable by removing the underlying dependency on a physical database.
Hope this all helps!
We are in a process of designing an application with approx 100 tables and complicated business logic. Windows Forms will be used on the client side and WCF services with MSSQL on the server.
Custom DTOs are used for client-server communication, business entities are not distributed.
Which variant of Entity Framework to use (and why):
EF 4.0 EntityObjects
EF 4.0 POCO
EF 4.1 DbContext
Something else
Database-first approach is a requirement.
Also, is it worth implementing a Repository pattern? It seems a bit redundant, as there is one level of abstraction in the mapping itself and another one in the use of DTOs. I'm currently leaning towards using auto-generated extendable repositories for each entity returning IQueryable, just to have a place to put common queries, while still allowing the Service Layer to query the entity model directly.
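Roughly, what I have in mind is something like this sketch (hypothetical entity names): a generic base returning IQueryable plus a derived class that holds the common queries, while the Service Layer can still compose further LINQ on the result.
using System.Data.Objects;
using System.Linq;

public class Order { public int Id { get; set; } public bool IsClosed { get; set; } }

public class Repository<TEntity> where TEntity : class
{
    protected readonly ObjectContext Context;
    public Repository(ObjectContext context) { Context = context; }

    // The Service Layer can compose further LINQ on top of this IQueryable.
    public IQueryable<TEntity> Query() { return Context.CreateObjectSet<TEntity>(); }
}

public class OrderRepository : Repository<Order>
{
    public OrderRepository(ObjectContext context) : base(context) { }

    // A place for common queries shared across services.
    public IQueryable<Order> OpenOrders() { return Query().Where(o => !o.IsClosed); }
}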
Which variant to use? Basically, once you have custom DTOs the only questions are: do you want to have control over the entities' code (their base class) and make them independent of EF? Do you want to use code first? If the answers to all these questions are no, you can use EntityObjects. If you want your entities to be persistence ignorant or to use a custom base class, you should go with POCO. If you want to use code first or the new DbContext API, you will need EF 4.1. Some related topics:
EF 4.1 Code-first vs Model/Database-first
EF POCO code only VS EF POCO with Entity Data Model (this was related to CTP)
ADO.NET DbContext Generator vs. ADO.NET POCO Entity Generator
EF Model First or Code First Approach?
There are more things to consider when designing the service layer. You should be aware of the complications you will have to deal with when using EF in WCF. Your service will provide data to the WinForms application, which will work with it in "detached mode". Once the user has made all the changes he wants, he will post the data back to the service. But here comes the problem - you must tell EF what has changed. If, for example, you allow the user to change an order with all its order items (change quantities, add new items, delete some items), you must tell EF exactly what has changed, what was added and what was deleted. That is easy when you work with a single entity, but once you allow the user to change an object graph (especially many-to-many relations) it gets quite tough. The most common solution is loading the whole graph and merging the state from the incoming DTOs into the loaded and attached graph. Another solution is using self-tracking entities instead of EntityObjects/POCOs + DTOs.
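To make the problem concrete, here is a minimal sketch (hypothetical entities, using the DbContext API) of the explicit bookkeeping needed when a detached order comes back from the client:
using System.Collections.Generic;
using System.Data;
using System.Data.Entity;

public class Order { public int Id { get; set; } public List<OrderItem> Items { get; set; } }
public class OrderItem { public int Id { get; set; } public int Quantity { get; set; } }

public class OrderContext : DbContext
{
    public DbSet<Order> Orders { get; set; }
    public DbSet<OrderItem> OrderItems { get; set; }
}

public static class OrderUpdater
{
    public static void SaveDetachedOrder(Order incoming)
    {
        using (var context = new OrderContext())
        {
            // Attach the root and mark it modified so EF issues an UPDATE for it.
            context.Entry(incoming).State = EntityState.Modified;

            // Children must be classified one by one - EF cannot infer this from a detached graph.
            foreach (var item in incoming.Items)
            {
                context.Entry(item).State = item.Id == 0
                    ? EntityState.Added      // no key yet -> new row
                    : EntityState.Modified;  // existing row -> update its values
            }

            // Deleted items are simply absent from the incoming graph, which is why the common
            // alternative is to load the current graph and merge the DTO state into it.
            context.SaveChanges();
        }
    }
}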
When discussing repositories I would refer you to this answer, which links to many other answers discussing repositories, their possible redundancy, and the mistakes people make when using them just to make code testable. Generally, each layer should be added only if there is a real need for it - i.e. for better separation of concerns.
The main advantage of POCOs is that those classes can be your DTOs, so if you've already got custom DTOs that you're using, POCO seems a bit redundant. However, there are some other advantages which may or may not have value to you, since you didn't mention unit testing as a requirement. If you plan to write unit tests, then POCO is still the way to go. You probably won't notice much difference between 4.0 POCO and 4.1 since you won't be using the code-first feature (disclaimer: I've only used 4.0 POCO, so I'm not intimately familiar with any minor differences between the two, but they seem to be more or less the same--basically I was already using POCO in 4.0 and haven't seen anything that's made me want to update everything to use 4.1).
Also, depending on whether you plan to unit-test this layer, there's still value in implementing the repository/unit of work patterns when using Entity Framework. It serves to abstract away the data access logic (the context), not the entities themselves, and allows you to do things like mocking your context in unit tests. What I do is copy the T4 template for my context and use it to create the interface, then edit the T4 template for the context and have it implement that interface and use IObjectSet<T> instead of ObjectSet<T>. So instead of:
public class MyEntitiesContext
{
    public ObjectSet<MyClass> MyEntities
    ...
}
I end up with:
public interface IMyEntitiesContext
{
    IObjectSet<MyClass> MyEntities { get; }
}
and
public class MyEntitiesContext : IMyEntitiesContext
{
    public IObjectSet<MyClass> MyEntities
    ...
}
So I guess it really comes down to whether or not you plan to write unit tests for this layer. If you won't be doing anything that would require mocking out your context for testing, then the easiest thing to use would probably be 4.0 EntityObjects, since you aren't planning to pass your entities between layers and it would require the least effort to implement. If you plan to use mocking, then you'll probably want to use POCO and implement repository/unit of work.
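If you do go the mocking route, the fake context just needs to hand out something like this in place of the real IObjectSet<T> - a minimal sketch (hypothetical names) of an in-memory set backed by a List<T>:
using System;
using System.Collections;
using System.Collections.Generic;
using System.Data.Objects;
using System.Linq;
using System.Linq.Expressions;

public class FakeObjectSet<T> : IObjectSet<T> where T : class
{
    private readonly List<T> _data = new List<T>();

    public void AddObject(T entity) { _data.Add(entity); }
    public void Attach(T entity) { _data.Add(entity); }
    public void DeleteObject(T entity) { _data.Remove(entity); }
    public void Detach(T entity) { _data.Remove(entity); }

    // IQueryable members delegate to LINQ to Objects over the in-memory list.
    public Type ElementType { get { return typeof(T); } }
    public Expression Expression { get { return _data.AsQueryable().Expression; } }
    public IQueryProvider Provider { get { return _data.AsQueryable().Provider; } }

    public IEnumerator<T> GetEnumerator() { return _data.GetEnumerator(); }
    IEnumerator IEnumerable.GetEnumerator() { return GetEnumerator(); }
}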
I'm developing a project using a layered architecture. I have a DAL in which I'm using Entity Framework, a business logic layer which consumes the objects returned by the DAL, and an app layer.
I'm not entirely sure I'm thinking about this right, so I'll just ask what you think.
My DAL is based on mappers. I have types - mappers - that the BLL uses to operate on my data. These mappers return DTOs, because I did not want to expose any EF objects to my BLL, so that its implementation does not depend on EF to work.
All these mappers do is CRUD actions on a single 'table', like:
using (var context = new EFEntities())
{
    // Fetch the single row matching the key and map it to a DTO.
    var obj = (from x in context.Table
               where x.ID == param
               select x).SingleOrDefault();
    return Map(obj);
}
The Map method maps the EF object to a DTO, which has only the properties whose values I want to expose.
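For what it's worth, it's roughly something like this (hypothetical property names; Table is the EF entity from the snippet above):
public class TableDto { public int ID { get; set; } public string Name { get; set; } }

// Copies only the properties the BLL should see from the EF entity to the DTO.
private static TableDto Map(Table entity)
{
    if (entity == null) return null;
    return new TableDto
    {
        ID = entity.ID,
        Name = entity.Name
    };
}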
Is there a more elegant approach to this? I am just using EF to facilitate access to my database - I don't have to write any ADO.NET code.
Any input on this would be welcome.
Thanks.
I have a DAL in which I'm using Entity Framework, a business logic layer which consumes the objects returned by the DAL and an app layer.
That is a perversion, given that you are putting a complete object layer into your DAL. EF is a lot more than a DAL.
a business logic layer which consumes the objects returned by the DAL and an app layer.
The term you may want to look up for that is "anemic object model", and it is not a nice term.
Is there a more elegant approach to this?
Don't fight EF. If you don't like it, don't use it - there are a LOT of better frameworks around.
Following the way Rob does it, I have the classes that are generated by the LINQ to SQL wizard, and then a copy of those classes that are POCOs. In my repositories I return these POCOs rather than the LINQ to SQL models:
return from c in DataContext.Customer
       where c.ID == id
       select new MyPocoModels.Customer { ID = c.ID, Name = c.Name };
I understand that the benefit of this is that the POCO models can be instantiated more easily, which makes my code more testable.
I'm now moving from LINQ to SQL over to Entity Framework and I'm about half way through an EF book. It seems there's a lot of goodness I'm going to lose out on by returning POCOs from my repositories rather than the EF entities.
I still haven't really embraced unit testing, so I feel like I'm wasting a lot of time creating these extra POCOs and writing the code to populate them, when all I appear to be gaining is testable code, yet I'm also going to lose out on a lot of the benefits of EF by not being able to track my objects.
Does anyone have any advice for a relative newb to all this ORM/Repository stuff?
Anthony
Another reason people don't like the auto-generated objects (in LINQ to SQL for example) is because of their built-in "magic".
Usually the magic is invisible and you never notice it, but when you try to do things like serialize one of those objects and then deserialize it (for example when using web services) its internal connection to the data source is broken and special hacks need to be employed to "put the magic back in".
With POCOs, you don't have to worry about those sorts of things and can get a better separation between your data and service layers. The downside of course is that you have to write lots of boring POCO -> magic object and magic object -> POCO conversion code. But in the end I think it's usually worth it, especially for large or complex projects.
The main reason is that a lot of people like to develop their model with a specific mindset, like DDD for instance. They might want to use a specific pattern (like Spec or State) for things like statuses (instead of enums), or they might want to use a Factory for instantiation.
OO breaks down when you try to use tables as objects and things get more complex. Simple sites work OK - but when you get to big, big things, it gets ugly.
So - as always - it depends what you think your project will turn into.
My experience is that when you start writing some complex queries, the .Include method is worthless and you will find yourself either:
a) writing a lot of queries to get the data you want, or
b) abusing anonymous types to load the data in a single query and then writing a lot of code just to pass that data to your entities (see the sketch below).
POCOs are the way to go, IMHO.
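To illustrate (b), a rough sketch (hypothetical entities and DTO types) of loading related data in one query through an anonymous type and then hand-copying it into the objects you actually want:
// Load everything needed in a single round trip via an anonymous projection.
var rows = (from o in context.Orders
            where o.CustomerId == customerId
            select new
            {
                o.Id,
                o.OrderDate,
                Items = o.Items.Select(i => new { i.Id, i.Quantity })
            }).ToList();

// The extra plumbing: re-shaping the anonymous results into your own types.
var orders = rows.Select(r => new OrderDto
{
    Id = r.Id,
    OrderDate = r.OrderDate,
    Items = r.Items.Select(i => new OrderItemDto { Id = i.Id, Quantity = i.Quantity }).ToList()
}).ToList();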