I used Entity Framework with a database having around 50 tables and it worked just fine.
But just to see what happens with a larger database in terms of the number of tables/entities, I tried to use Entity Framework with a database that had around 100+ tables.
Once I selected all the tables and clicked the Finish button in the Entity Framework wizard, it just hung my VS 2010, so I could not get any results.
My questions are as below:
1. If I have a larger database in terms of tables/entities as described above, is it a good idea to use Entity Framework?
2. What would be a better approach to working with the database using Entity Framework?
3. Should I create multiple DataContexts or EDMX files with fewer entities in each?
4. How will these different DataContexts interact with each other?
5. Is there a recommended number of tables to use when working with Entity Framework?
@Will is correct that the limitation you're seeing is in the designer, but it's not the only one, so Code First doesn't necessarily fix the problem.
If the designer seems slow, it's inconvenient, but not the end of the world. Runtime performance considerations are another thing altogether. For performance-critical tasks and tuning, you'll want to understand the whole pipeline.
View generation, for example, takes time. You can move this to compile time with some manual work.
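For example, with an EDMX model you can pre-generate the views with EdmGen.exe and compile them into your project instead of paying the cost on first use. This is only a rough sketch, assuming the model's .csdl/.ssdl/.msl files are available (for instance by setting the EDMX's Metadata Artifact Processing property to Copy to Output Directory); the file names below are placeholders for whatever your model is called:

    EdmGen.exe /mode:ViewGeneration /language:CSharp /inssdl:MyModel.ssdl /incsdl:MyModel.csdl /inmsl:MyModel.msl /outviews:MyModel.Views.cs

The generated MyModel.Views.cs then gets compiled into the same assembly as your entities, so the views no longer have to be generated at runtime.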
1. If I have a larger database in terms of tables/entities as described above, is it a good idea to use Entity Framework?
I certainly wouldn't let it stop you.
2. What would be a better approach to working with the database using Entity Framework?
3. Should I create multiple DataContexts or EDMX files with fewer entities in each?
That's certainly a good approach for many applications.
4. How will these different DataContexts interact with each other?
Mostly not. A single, giant data model is often a bad idea due to service coupling. However, you can selectively couple them by sharing portions of the models, either with includes in the EDMX or with shared classes in Code First.
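To make the Code First variant concrete, here is a minimal sketch (the entity and context names are invented, and it assumes the EF 4.1 DbContext API): two smaller contexts that share one entity class but nothing else.

    using System.Data.Entity; // EF 4.1 DbContext / Code First API

    // A plain POCO shared by both contexts.
    public class Customer
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    public class Order   { public int Id { get; set; } public int CustomerId { get; set; } }
    public class Invoice { public int Id { get; set; } public int CustomerId { get; set; } }

    // Sales-focused model.
    public class SalesContext : DbContext
    {
        public DbSet<Customer> Customers { get; set; }
        public DbSet<Order> Orders { get; set; }
    }

    // Billing-focused model; it reuses the Customer class but knows nothing about Orders.
    public class BillingContext : DbContext
    {
        public DbSet<Customer> Customers { get; set; }
        public DbSet<Invoice> Invoices { get; set; }
    }

Each context maps only the tables it needs, so each model stays small, while the shared class keeps the two models consistent where they overlap.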
5. Is there a recommended number of tables to use when working with Entity Framework?
One way is to use smaller models, as you've suggested. Another way is to work around the runtime performance issues which sometimes come with larger models (see the links I give above). Like any potential performance "problem", write correct code first, then profile and fix the slow parts. Usually, query tuning is more important than model size anyway.
EF, probably yes. The toolset in Visual Studio? Not so much, apparently. For a database this big, you might want to do Code First.
I don't think EF itself has performance limitations tied to the number of tables, though it can for the number of records in a particular table. To get away from the designer problems in VS 2010 you would have to do the object-database mapping manually (i.e., hand-write the classes for the tables and their corresponding attributes).
That is a well-established approach in Hibernate, but in EF it probably isn't.
Entity Framework is the best way to develop database applications.
I used to develop my applications using LINQ to SQL, but since Microsoft is not going to support it in the future, it recommends using Entity Framework instead.
By the way, Entity Framework 4 in .NET 4 has much better performance than previous versions.
I'm currently developing an enterprise application using Entity Framework and it supports all my needs.
I suggest using Entity Framework.
I've heard that companies using Java technologies often build their own custom framework that wraps Hibernate. Is it really feasible for their .NET peers to do the same thing with NHibernate or Entity Framework?
This is almost always a horrible idea - I think Ayende sums it up best in this article. In general, you should consider NHibernate itself to be the "wrapper" around your data access - attempting to build an abstraction layer on top of it is probably going to be a losing proposition.
Actually, you should check out some of the articles on .NET Junkie's weblog. He wrote several great posts on how to deal with repositories, queries, commands and so on. We've been using these in a very large enterprise system where we switch between an in-memory dictionary, an in-memory SQLite database and a production environment using SQL Server or Oracle. Obviously, we use NHibernate for this.
I use the repository pattern and a separate project/DLL to abstract away the data framework (NHibernate / Entity Framework). This is a good starting point: http://codebetter.com/petervanooijen/2008/04/04/wrapping-up-nhibernate-in-repositories/
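As a minimal sketch of that approach (the interface and class names are illustrative, not a prescribed API), the repository interface lives in the domain project and the EF-backed implementation is the only code that references the ORM:

    using System.Data.Entity;
    using System.Linq;

    // Lives in the domain/core project; knows nothing about EF or NHibernate.
    public interface IRepository<T> where T : class
    {
        IQueryable<T> Query();
        void Add(T entity);
        void Remove(T entity);
    }

    // Lives in the data access project; the only place that references the ORM.
    public class EfRepository<T> : IRepository<T> where T : class
    {
        private readonly DbContext _context;

        public EfRepository(DbContext context)
        {
            _context = context;
        }

        public IQueryable<T> Query()  { return _context.Set<T>(); }
        public void Add(T entity)     { _context.Set<T>().Add(entity); }
        public void Remove(T entity)  { _context.Set<T>().Remove(entity); }
    }

Consumers depend only on IRepository<T>, so swapping EF for NHibernate (or an in-memory fake in tests) is a matter of providing a different implementation.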
I am new to Entity Framework and MVC, and I am trying to understand what constitutes a good design approach for a new application.
There are several ways of using Entity Framework; for my project, the best-looking option is Database First. I've played around with an EDMX file, and I have got as far as using the DbContext code generator to create my wrapper classes.
I plan on using the repository and unit-of-work patterns, and using Ninject for DI.
However, it does not seem "proper", from a SoC point of view, that while my repository hides the implementation of the data store (EF) from my code, the model classes themselves are very much EF flavoured.
It seems that using EDMX-based approaches to EF blurs the separation of concerns. Only POCO support seems to allow a true separation, but POCO has some other limitations that I don't like.
Am I missing something, or does using EDMX have this drawback?
Are people using an auto mapper to convert between the entity model and another, clean, SoC-compliant model?
thanks
Tian
I don't have a strong opinion on the separation of concerns question, but I have used both the standard ADO.NET version of EF and POCO, and it is not difficult at all to customise the output of the T4 code generation script for POCO to address any concerns you have about the structure of the objects created. That sounds like it would probably be a good starting point for what you are looking to do.
Once you know you are looking for T4 templates, there are quite a few tutorials and a lot of helpful SO questions that can give you an idea of what you need to do.
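On the auto-mapper part of the question: mapping the generated entities onto clean view models is a common approach. Here is a rough sketch using AutoMapper's classic static API (the types are invented for illustration):

    using AutoMapper;

    public class ProductEntity    { public int Id { get; set; } public string Name { get; set; } }
    public class ProductViewModel { public int Id { get; set; } public string Name { get; set; } }

    public static class MappingConfig
    {
        // Call once at application start-up (e.g. from Global.asax).
        public static void Register()
        {
            Mapper.CreateMap<ProductEntity, ProductViewModel>();
        }

        // Wherever data is handed to the view:
        public static ProductViewModel ToViewModel(ProductEntity entity)
        {
            return Mapper.Map<ProductEntity, ProductViewModel>(entity);
        }
    }

The controllers and views then only ever see ProductViewModel, which keeps the EF-flavoured classes inside the data layer.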
I have been doing some reading on the code-first approach of Entity Framework and the other approaches (Model First, Database First).
1.) The argument supporting the Entity Framework code-first approach in a lot of blogs seems to be keeping developers happy so that they don't have to work with designers. I am surprised at this argument because you develop a project to keep your customers happy, not your developers. How many projects out there have developer happiness in the project plan? Another argument is avoiding a lot of XML mappings. Well, if not in XML, we end up doing that mapping anyway in OnModelCreating and by adding [Key] attributes to domain models, so mapping is not eliminated.
2.) Also, when you generate a database from the code (domain model), the domain model is designed in an OO way and the generated database structure might not be the optimal one, which leads me to think that the code-first approach is only suitable for small projects.
Are these arguments correct?
1) The reason for the code-first paradigm is to save development time by reducing the amount of tedious, repetitive work developers have to do (which in turn makes them happier). Code First allows developers to forget about SQL for the most part (sometimes you will still have to write stored procedures) and focus on the data model and business logic. It is much easier to version your database (especially during development), as each developer only has to change their C# models rather than write a new SQL script, check it in, and make sure everyone else on the team knows to run it. As for the XML files: in other ORMs I've found it a pain to constantly look up the corresponding XML file when modifying the domain classes. I personally find the fluent interface quicker when making changes to my data model (see the sketch after this answer). Also, the XML files Entity Framework used in v1.0 became so large that they were a real pain to edit and manage once your database got to any real-world size.
2) I can't think of a scenario where a data schema could only be created via SQL and not expressed as C# with Entity Framework Code First.
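To make the fluent-configuration point concrete, here is a minimal Code First sketch (entity, property, and table names are invented) where the mapping lives in OnModelCreating rather than in XML files or [Key] attributes:

    using System.Data.Entity;

    public class Product
    {
        public int ProductId { get; set; }
        public string Name { get; set; }
    }

    public class ShopContext : DbContext
    {
        public DbSet<Product> Products { get; set; }

        protected override void OnModelCreating(DbModelBuilder modelBuilder)
        {
            // Fluent configuration instead of XML mapping files or [Key] attributes.
            modelBuilder.Entity<Product>().HasKey(p => p.ProductId);
            modelBuilder.Entity<Product>().ToTable("Products");
            modelBuilder.Entity<Product>().Property(p => p.Name).HasMaxLength(200).IsRequired();
        }
    }

Whether this is "less mapping" than XML is debatable, as the question points out, but it keeps the mapping next to the code and under compile-time checking.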
I currently use the Entity Framework designer to generate my persistence objects, and I also use POCO view models for the ASP.NET MVC views.
I've read and listened to a lot of people talking about the good support for POCOs in EF4, as well as POCOs in general, but I can't seem to work out what advantage, if any, I'll get from using them.
In our application, we WILL be using SQL Server, so it's not as if we need to separate out support for different databases.
Why would I want to use POCOs as opposed to the designer-generated classes?
POCOs offer better extensibility/reuse of your domain model, as you're not tied to any specific ORM framework.
Answered here: What are the 'big' advantages to have Poco with ORM?
Easier to unit test
I find that when you have many entities (100+), using the designer is painful, while POCO objects are easy to create and maintain.
With POCOs you can go Code First:
http://weblogs.asp.net/scottgu/archive/2010/08/03/using-ef-code-first-with-an-existing-database.aspx
+1 for "with POCOs you can go Code First". In fact, you can create and test your entire application before you even write your first line of data access code. This is how it should be; it's a perfect application of the SRP. Your domain objects should not know or care how they are being persisted.
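A quick sketch of what that persistence ignorance looks like in practice (the class is invented for illustration): the domain object has no EF base class, attributes, or references, so it can be newed up and unit tested without touching a database.

    // Domain project: no reference to EntityFramework at all.
    public class Customer
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public decimal CreditLimit { get; set; }

        // Pure business logic, unit-testable with a plain new Customer().
        public bool CanPlaceOrder(decimal orderTotal)
        {
            return orderTotal <= CreditLimit;
        }
    }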
Hi guys,
I was watching this video series about Entity Framework:
http://msdn.microsoft.com/en-us/data/ff191186.aspx
Is it really that easy to build applications in real-world programming? And is it reliable? Does it have good performance?
I am a graduate.
thanks
Entity Framework is a valid real-world data access tool, and it is very easy to get up and running with. You simply import (or, in EF 4, create) your data model, rename things to make the model more code-friendly, and then you are off querying databases.
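To give a feel for the "off querying databases" part, here is a minimal sketch using the DbContext API; BlogContext and Post are stand-ins for whatever your imported model generates.

    using System;
    using System.Data.Entity;
    using System.Linq;

    public class Post
    {
        public int Id { get; set; }
        public string Title { get; set; }
        public DateTime PublishedOn { get; set; }
    }

    public class BlogContext : DbContext
    {
        public DbSet<Post> Posts { get; set; }
    }

    public static class Example
    {
        public static void PrintRecentTitles()
        {
            using (var db = new BlogContext())
            {
                // Translated to SQL and executed when ToList() runs.
                var recent = db.Posts
                               .Where(p => p.PublishedOn >= new DateTime(2011, 1, 1))
                               .OrderByDescending(p => p.PublishedOn)
                               .ToList();

                foreach (var post in recent)
                {
                    Console.WriteLine(post.Title);
                }
            }
        }
    }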
Performance
I have been on multiple projects that use it, some of which require high throughput, others with low performance requirements. Entity Framework out of the box is not the fastest solution in the world, so there is a fair amount of performance tweaking that has to go on (pre-compiled queries, pre-generated views, careful use of eager loading, and so on), but it's all doable.
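As one concrete example of such a tweak, EF 4's ObjectContext API lets you pre-compile a frequently run query so the LINQ-to-Entities translation cost is paid only once. This is a rough sketch; MyEntities and Customer are invented stand-ins for a generated context and entity:

    using System;
    using System.Data.Objects;
    using System.Linq;

    public class Customer
    {
        public int Id { get; set; }
        public int RegionId { get; set; }
        public bool IsActive { get; set; }
    }

    // Stand-in for a designer-generated ObjectContext.
    public class MyEntities : ObjectContext
    {
        public MyEntities() : base("name=MyEntities") { }

        public ObjectSet<Customer> Customers
        {
            get { return CreateObjectSet<Customer>(); }
        }
    }

    public static class CustomerQueries
    {
        // Compiled once, reused on every call, so the LINQ-to-Entities
        // translation is not repeated for each request.
        public static readonly Func<MyEntities, int, IQueryable<Customer>> ActiveByRegion =
            CompiledQuery.Compile<MyEntities, int, IQueryable<Customer>>((ctx, regionId) =>
                ctx.Customers.Where(c => c.RegionId == regionId && c.IsActive));
    }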
Reliability
We never have issues with reliability. We have never had a problem with EF in general; it has always been data-content related, such as trying to insert duplicate data.
Other Tangibles
EF follows a pattern that allows you to do some fun stuff with templates and abstract classes. All entities inherit from a class, and entities that have references inherit from other classes. All entity contexts inherit from the ObjectContext class, which provides a base set of functionality that allows you to create generic DAO implementations that can be reused throughout the enterprise.
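A rough illustration of that kind of generic DAO (the class and method names here are mine, not a standard API), leaning on the ObjectContext base class that every generated context inherits from:

    using System.Data.Objects;
    using System.Linq;

    // Works against any designer-generated context, since they all inherit from ObjectContext.
    public class GenericDao<TEntity> where TEntity : class
    {
        private readonly ObjectContext _context;
        private readonly ObjectSet<TEntity> _set;

        public GenericDao(ObjectContext context)
        {
            _context = context;
            _set = context.CreateObjectSet<TEntity>();
        }

        public IQueryable<TEntity> GetAll()
        {
            return _set;
        }

        public void Add(TEntity entity)
        {
            _set.AddObject(entity);
        }

        public void Save()
        {
            _context.SaveChanges();
        }
    }

Because the DAO only depends on ObjectContext, the same class can be reused with every entity model in the enterprise.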
If you are doing UI development, you can also use Data Services to wrap EF as a fast gateway to your database. The only downside is that you don't get access to the full suite of Entity Framework features.