Does using the Entity Framework put a lot of pressure on memory, like the DataSet does? - entity-framework

I'm new to the Entity Framework. Some screencasts I've been watching show result sets being held in memory along with their changes. This seems like it could use a lot of memory.
Does this mean that EF isn't suitable for ASP.NET development? Does the Entity Framework have a memory-efficient pattern similar to the SqlDataReader?

It looks like if you enumerate through the query result as objects, a DbDataReader is actually used and objects are created on the fly, so only one row at a time is materialized as an actual EntityObject. You can also access the data at the DataReader level using the EntityClient provider, but if you're concerned about optimal performance, I suppose you should stick to plain ADO.NET.
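To make that concrete, here is a minimal sketch of both access styles, assuming a hypothetical model with a BlogModelEntities context and a Customers entity set (all names invented for the example):

```csharp
using System;
using System.Data;
using System.Data.EntityClient;
using System.Data.Objects;

class StreamingDemo
{
    static void Main()
    {
        // Object-level enumeration: the underlying DbDataReader advances row
        // by row and entities are materialized on the fly. With NoTracking,
        // the context doesn't hold references to them either.
        using (var context = new BlogModelEntities())
        {
            context.Customers.MergeOption = MergeOption.NoTracking;
            foreach (var customer in context.Customers)
            {
                Console.WriteLine(customer.Name);
            }
        }

        // DataReader-level access through the EntityClient provider:
        // no entity objects are created at all.
        using (var conn = new EntityConnection("name=BlogModelEntities"))
        {
            conn.Open();
            using (var cmd = new EntityCommand(
                "SELECT c.Name FROM BlogModelEntities.Customers AS c", conn))
            // EntityDataReader requires SequentialAccess.
            using (var reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
            {
                while (reader.Read())
                {
                    Console.WriteLine(reader.GetString(0));
                }
            }
        }
    }
}
```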
I've used Entity Framework without memory or performance problems on reasonably high traffic sites (12,000 - 20,000 unique visitors per day with 250k pageviews).
Also, you may want to wait for Entity Framework 4.0, or use the pre-release. It has lots of improvements, one of which is support for POCO (Plain Old CLR Objects).

"Does the entity framework have a memory efficient pattern similar to the SqlDataReader?"
No, the EF makes complete copies of objects in RAM, similar to a DataSet with DataRelations, etc., but it keeps them as objects instead. It then also has to keep a copy of each object's changes (change sets). Each of those changes is then built up as SQL to update the database if you submit the changes back.
SqlDataReader is a forward-only, lightweight reader that grabs a single row at a time. EF loads all your query results into object collections, with change tracking on top of that.
Is it suitable for your app? I don't know. If you need the fastest possible performance and the smallest amount of RAM, then ADO.NET is the only way to go. Any abstraction placed on top of ADO.NET is going to add overhead, memory, etc.
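For comparison, here is what the SqlDataReader pattern looks like - a minimal sketch with a placeholder connection string and table; only the current row is ever buffered, and nothing is tracked:

```csharp
using System;
using System.Data.SqlClient;

class ReaderDemo
{
    static void Main()
    {
        // Forward-only, read-only: one row buffered at a time, no change tracking.
        using (var conn = new SqlConnection(@"Data Source=.;Initial Catalog=Shop;Integrated Security=True"))
        using (var cmd = new SqlCommand("SELECT Id, Name FROM dbo.Customers", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0}: {1}", reader.GetInt32(0), reader.GetString(1));
                }
            }
        }
    }
}
```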

EF will use more memory than using ADO.NET directly (assuming that you use ADO.NET correctly).
Having said that, we have used EF on large ASP.NET projects with no memory problems.
The only situation I can think of is if you are handling large binary objects and you want to stream them instead of loading them into memory.
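In that situation the usual escape hatch is to drop to ADO.NET and stream the column with CommandBehavior.SequentialAccess rather than materializing it; a sketch (the table and column names are made up):

```csharp
using System.Data;
using System.Data.SqlClient;
using System.IO;

class BlobStreamDemo
{
    static void CopyBlobToFile(string connectionString, int documentId, string path)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "SELECT Content FROM dbo.Documents WHERE Id = @id", conn))
        {
            cmd.Parameters.AddWithValue("@id", documentId);
            conn.Open();

            // SequentialAccess streams the column instead of buffering the whole row.
            using (var reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
            using (var output = File.Create(path))
            {
                if (reader.Read())
                {
                    var buffer = new byte[8192];
                    long offset = 0;
                    long read;
                    while ((read = reader.GetBytes(0, offset, buffer, 0, buffer.Length)) > 0)
                    {
                        output.Write(buffer, 0, (int)read);
                        offset += read;
                    }
                }
            }
        }
    }
}
```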

Related

PLINQ to Entity Framework: why shouldn't it be used?

I read somewhere that you shouldn't use PLINQ on Entity Framework or SQL. I can't remember where I read it or what the reasons were, but I did some experimentation. Using traditional LINQ to Entity Framework to load a database table that's expected to grow quite large currently takes 12 to 13 milliseconds. However, when I add .AsParallel(), the same query runs in 2 to 4 milliseconds, and I get exactly the same results.
So if I get the same results faster using PLINQ, what are the pitfalls of using PLINQ to Entity Framework?
There are some dangers, e.g. the DbContext cannot be accessed by multiple threads simultaneously. And there is often little upside: PLINQ will synchronize access to IEnumerable.MoveNext(), which does all the work of reading the data, creating the entities, and interacting with the change tracker.
But if you do a lot of work with the returned entities that does not touch the DbContext (i.e. no SaveChanges(), no lazy loading, etc.), you can use PLINQ.
That said, most of the examples I can think of would be better optimized by building the operation into the original query, or by running raw SQL on the server.
So if you have a bunch of CPU-intensive domain logic you need to run across a collection of entities, you can operate across the results in parallel, but you might be better off creating a separate DbContext inside the parallel execution block.
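One safe shape of this, as a rough sketch (ShopEntities, Orders, and the scoring logic are all invented for the example): materialize the results first, then parallelize work that never touches the context:

```csharp
using System;
using System.Linq;

class PlinqDemo
{
    static void ScoreOrders()
    {
        using (var context = new ShopEntities()) // hypothetical EF context
        {
            // Materialize first: after ToList() the query has run, and the
            // parallel stage below never touches the context.
            var orders = context.Orders.ToList();

            var scores = orders
                .AsParallel()
                .Select(o => new { o.Id, Score = ExpensiveScore(o.Total) })
                .ToList();

            foreach (var s in scores)
                Console.WriteLine("{0}: {1}", s.Id, s.Score);
        }
    }

    // Stand-in for CPU-intensive domain logic. It deliberately uses only
    // scalar data already in memory, so no lazy loading can fire in parallel.
    static double ExpensiveScore(decimal total)
    {
        return Math.Sqrt((double)total + 1.0);
    }
}
```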

How to escape from ORMs' limitations, or should I avoid them?

In short, ORMs like Entity Framework provide a fast solution, but one with many limitations. When should ORMs be avoided?
I want to create the engine of a DMS system, and I am wondering how I should build the business logic layer.
I'll discuss the following options:
Use Entity Framework and expose it as the business layer for the engine's clients.
The problem: I lose control over the properties and their validation, because it's generated code.
Create my own business layer classes manually, without using Entity Framework or any ORM.
The problem: it's a hard task, something like reinventing the wheel.
Create my own business layer classes on top of Entity Framework (and use it).
The problem: it seems to involve code repetition, creating new classes with the same names where every property duplicates the corresponding one generated by the ORM.
Am I approaching the problem in the right way?
In short, ORMs should be avoided when:
your program will perform bulk inserts/updates/deletes (such as insert-selects, and updates/deletes that are conditional on something non-unique). ORMs are not designed to do these kinds of bulk operations efficiently; you will end up processing each record one at a time.
you are using highly custom data types or conversions. ORMs are generally bad at dealing with BLOBs, and there are limits to how they can be told how to "map" objects.
you need the absolute highest performance in your communication with SQL Server. ORMs can suffer from N+1 problems and other query inefficiencies, and overall they add a layer of (usually reflective) translation between your request for an object and a SQL statement which will slow you down.
ORMs should instead be used in most cases of application-based record maintenance, where the user is viewing aggregated results and/or updating individual records consisting of simple data types, one at a time. ORMs have an extreme advantage over raw SQL in their ability to provide compile-checked queries using LINQ providers; virtually all of the popular ORMs (Linq2SQL, EF, NHibernate, Azure) have a LINQ query interface that can catch a lot of "fat fingers" and other common mistakes in queries that you don't catch when using "magic strings" to form SqlCommands. ORMs also generally provide database independence. Classic NHibernate HBM mappings are XML files, which can be swapped out as necessary to point the repository at MSS, Oracle, SQLite, Postgres, and other RDBMSes. Even "fluent" mappings, which are classes in code files, can be swapped out if correctly architected. EF has similar functionality.
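As an illustration of the compile-checking point (context and column names invented for the sketch): a typo inside a SQL string compiles happily and fails only at runtime, while the same mistake in a LINQ query is caught by the compiler:

```csharp
using System.Data.SqlClient;
using System.Linq;

class CompileCheckDemo
{
    static void Demo(SqlConnection conn, ShopEntities context) // hypothetical EF context
    {
        // Magic string: the typos in "Custmers" and "County" compile fine
        // and blow up only when the command is executed.
        var cmd = new SqlCommand(
            "SELECT Name FROM Custmers WHERE County = 'NZ'", conn);

        // LINQ: mistyping c.Country as c.County would be a compile-time error.
        var names = context.Customers
            .Where(c => c.Country == "NZ")
            .Select(c => c.Name)
            .ToList();
    }
}
```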
So are you asking how to do "X" without doing "X"? An ORM is an abstraction, and like any other abstraction it has disadvantages, but not the ones you mentioned.
Code (in EFv4) can be generated by a T4 template, and the T4 template itself is code that can be modified.
The generated code consists of partial classes, which can be combined with your own partial part containing your logic (see the sketch below).
Writing classes manually is a very common case; using a designer, as available in Entity Framework, is rarer.
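A sketch of the partial-class point (a simplified stand-in for what the designer/T4 template emits, with invented names):

```csharp
using System;

// Generated file (regenerated on every model update - don't edit):
public partial class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Your file: the other half of the partial class survives regeneration,
// so validation and business logic live here.
public partial class Customer
{
    public void Validate()
    {
        if (String.IsNullOrEmpty(Name))
            throw new InvalidOperationException("Customer name is required.");
    }
}
```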
Disclaimer: I work at Mindscape that builds the LightSpeed ORM for .NET
As you don't ask about a specific issue, but about approaches to solving the flexibility problem with an ORM I thought I'd chime in with some views from a vendor perspective. It may or may not be of use to you but might give some food for thought :-)
When designing an O/R mapper it's important to take into consideration what we call "escape hatches". An ORM will inevitably push a certain set of default behaviours, which is one of the ways developers gain productivity.
One of the lessons we have learned with LightSpeed is where developers need those escape hatches. For example, KeithS here states that ORMs are not good for bulk operations - and in most cases this is true. We had this scenario come up with some customers, and added an overload to our Remove() operation that allows you to pass in a query that removes all matching records. This saved having to load entities into memory and delete them. Listening to where developers are having pain and helping solve those problems quickly is important for building solid solutions.
All ORMs should efficiently batch queries. Having said that, we have been surprised to see that many don't. This is strange, given that batching can often be done rather easily and several queries can be bundled up and sent to the database at once to save round trips. This is something we've done since day 1 for any database that supports it. That's just an aside to the point about batching made in this thread. The quality of those batched queries is the real challenge and, frankly, there are some TERRIBLE SQL statements being generated by some ORMs.
Overall, you should select an ORM that gives you immediate productivity gains (almost demo-ware style: "see, I queried data in 30 seconds!") but has also paid attention to larger-scale solutions, which is where escape hatches and some of the less-demoed but hugely useful features are needed.
I hope this post hasn't come across too salesy, but I wanted to draw attention to taking into account the thought process that goes behind any product when selecting it. If the philosophy matches the way you need to work then you're probably going to be happier than selecting one that does not.
If you're interested, you can learn about our LightSpeed ORM for .NET.
In my experience, you should avoid using an ORM when your application does the following kinds of data manipulation:
1) Bulk deletes: most ORM tools won't truly delete the data; they will mark it with a garbage-collect ID (GC record) to keep the database consistent. Worse, the ORM collects all the data you want to delete before it marks it as deleted. That means that if you want to delete 1,000,000 rows, the ORM will first fetch the data, load it into your application, mark it as GC, and then update the database, which I believe is a huge waste of resources (see the sketch after this list for the usual workaround).
2) Bulk inserts and data imports: most ORM tools run business-layer validations on the business classes. This is good if you want to validate one record, but if you are going to insert/import hundreds or even millions of records, the process could take days.
3) Report generation: ORM tools are good for creating simple list reports or simple table joins, as in an order/order-details scenario. But in most cases the ORM will only slow down retrieval of the data and add more joins than you need for a report. That translates into giving the DB engine more work than you would with a plain SQL approach.
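For case 1, the usual workaround is to bypass entity materialization entirely and issue a set-based statement; with EF4's ObjectContext that could look like this (the context and table names are invented, but ExecuteStoreCommand is the real EF4 API):

```csharp
using System;

class BulkDeleteDemo
{
    static void PurgeOldRows(DateTime cutoff)
    {
        using (var context = new ShopEntities()) // hypothetical ObjectContext
        {
            // One set-based DELETE on the server instead of fetching,
            // tracking, and deleting a million entities one by one.
            int rows = context.ExecuteStoreCommand(
                "DELETE FROM dbo.AuditLog WHERE LoggedAt < {0}", cutoff);
            Console.WriteLine("{0} rows deleted", rows);
        }
    }
}
```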

Out of memory when migrating models

I am migrating data from one model version to another on the iPhone, but the migration causes the device to run out of memory and crash. Not to mention it takes forever on the device. I use the default migration settings.
I guess the culprit is one of the tables, which contains on the order of 10^5 rows. This table has not changed, but the migration still generates operations for it (probably because of relations to other tables).
Any ideas what I can do to improve things? Of course, I could whip something up manually, but I really want to take advantage of as much of the Core Data goodness as possible.
You're most likely creating a large number of objects during the migration and not releasing them. You need to loop through the migration taking small nibbles and freeing up the memory used in each nibble before taking the next one.
See the Core Data Model Versioning and Data Migration Programming Guide: Multiple Passes—Dealing With Large Datasets

should I use Entity Framework instead of raw ADO.NET

I am new to CSLA and Entity Framework. I am creating a new CSLA / Silverlight application that will replace a 12 year old Win32 C++ system. The old system uses a custom DCOM business object library and uses ODBC to get to SQL Server. The new system will not immediately replace the old system -- they must coexist against the same database for years to come.
At first I thought EF was the way to go since it is the latest and greatest. After making a small EF model and only 2 CSLA editable root objects (I will eventually have hundreds of objects as my DB has 800+ tables) I am seriously questioning the use of EF.
In the current system I often need to do fine-grained performance tuning of the queries, which I can do because I have 100% control of the generated SQL. But it seems that in EF so much happens behind the scenes that I lose that control. Articles like http://toomanylayers.blogspot.com/2009/01/entity-framework-and-linq-to-sql.html don't help my impression of EF.
People seem to like EF because of LINQ to EF, but since my criteria are passed between client and server as a criteria object, it seems like I could build queries just as easily without LINQ. I understand that in WCF RIA there is query projection (or something like that) where I can write client-side LINQ that moves to the server before being translated into actual SQL, so in that case I can see the benefit of EF, but not in CSLA.
If I use raw ADO.NET, will I regret my decision 5 years from now?
Has anyone else made this choice recently and which way did you go?
In your case, I would still choose EF over doing it all by hand.
Why? EF - especially in .NET 4 - has matured considerably. It will let you do most of your database operations far more easily and with a lot less code than if you had to hand-code all of your data access.
And in cases where you do need the absolute maximum performance, you can always plug in stored procedures for insert, update, delete which EF4 will then use instead of the default behavior of creating the SQL statements on the fly.
EF4 has much better stored-proc integration, and this really opens up the best of both worlds:
use the high productivity of EF for the 80% cases where performance isn't paramount
fine tune and handcraft stored procs for the remaining 20% and plug them into EF4
See some resources:
Using Stored Procedures for Insert, Update and Delete in an Entity Data Model
Practical Entity Framework for C#: Stored Procedures (video)
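To make the 80/20 split concrete, a sketch (all names are made up; the GetTopCustomers method is what the designer generates once you map a stored procedure as a function import):

```csharp
using System.Linq;

class BestOfBothWorlds
{
    static void Demo(int customerId)
    {
        using (var context = new ShopEntities()) // hypothetical EF4 context
        {
            // The 80% case: let EF generate the SQL.
            var orders = context.Orders
                .Where(o => o.CustomerId == customerId)
                .ToList();

            // The 20% case: a hand-tuned stored procedure exposed as a
            // function import on the context.
            var top = context.GetTopCustomers(10).ToList();
        }
    }
}
```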
You seem to have a mix of requirements and a mix of solutions.
I normally rate each requirement as essential, nice to have, or not essential, and then see what works.
I agree with what @marc_s has said: you can have the best of both worlds.
The only other thing I would say is that if this solution is to be around for the next 5 years, have you considered unit testing?
There are plenty of examples of how to set this up using EF; see the sketch below. (I personally avoid raw ADO.NET just because the separation of concerns gets so complicated for unit tests.)
There is no easy solution. I would pick a feature in your project that would take you a day or so to do. Try the different methods (raw SQL, EF, EF + stored procs) and see what works!
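One common way to get EF-based code under unit test is a thin repository interface that tests can fake; a sketch (all names are illustrative, not from the question):

```csharp
using System.Linq;

// Business logic depends on this interface rather than on the EF context,
// so tests can substitute an in-memory fake.
public interface ICustomerRepository
{
    Customer GetById(int id);
    void Add(Customer customer);
    void Save();
}

public class EfCustomerRepository : ICustomerRepository
{
    private readonly ShopEntities _context = new ShopEntities(); // hypothetical context

    public Customer GetById(int id)
    {
        return _context.Customers.Single(c => c.Id == id);
    }

    public void Add(Customer customer)
    {
        _context.Customers.AddObject(customer); // EF4 ObjectSet API
    }

    public void Save()
    {
        _context.SaveChanges();
    }
}
```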
Take an objective look at CSLA - invoke the 'DataPortal' and check out the call stack.
Next, put those classes on a CI build server that stores runtime data and provides a scatter plot over a series of runs.
Next, look at the code that gets created. Ask yourself how you can use things like dependency injection in light of classes that rely on static creators with protected/private constructors.
Next, take a look at how many responsibilities the 'CSLA' classes take on.
Finally, ask yourself whether creating objects with different constructors per environment makes sense, and how you will unit test those.

What problems have you had with Entity Framework?

We have used Entity Framework on two projects, both with several hundred tables.
Our experience is mainly positive. We have had large productivity gains compared with using Enterprise Library and stored procedures.
However, when I suggest using EF on stackoverflow, I often get negative comments.
On the negative side we have found that there is a steep learning curve for certain functionality.
Finally, to the question: What problems have people had with EF, why do they prefer other ORMS?
Like you, my experience with the EF is mostly positive. The biggest problem I've had is that very complex queries can take a long time to compile. The visual designer is also much less stable and has fewer features than the framework itself. I wish the framework would put the GeneratedCode attribute on code it generates.
I recently used EF and had a relatively good experience with it. I too see a lot of negative feedback around EF, which I think is unfortunate considering all that it offers.
One issue that surprised me was the performance difference between two strategies of fetching data. Initially, I figured that doing eager loading would be more efficient since it would pull the data via a single query. In this case, the data was an order and I was doing an eager load on 5-8 related tables. During development, we found this query to be unreasonably slow. Using SQL profiler, we watched the traffic and analyzed the resulting queries. The generated SQL statement was huge and SQL Server didn't seem to like it all that much.
To work around the issue, I reverted to a lazy-loading / on-demand mode, which resulted in more queries to the server, but a significant boost in performance. This was not what I initially expected. My take-away, which IMHO holds true for all data access implementations, is that I really need to perf test the data access. This is true regardless of whether I use an ORM or SQL procs or parameterized SQL, etc.
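A sketch of the two fetching strategies being compared (entity names invented; string-path Include and ContextOptions.LazyLoadingEnabled are the EF4 APIs):

```csharp
using System.Linq;

class FetchStrategyDemo
{
    static void Demo()
    {
        using (var context = new ShopEntities()) // hypothetical context
        {
            // Eager: one large generated SQL statement joining every
            // included table - the kind of query that turned out to be slow.
            var eager = context.Orders
                .Include("Customer")
                .Include("OrderLines.Product")
                .ToList();

            // Lazy / on-demand: more round trips, but each query is small.
            context.ContextOptions.LazyLoadingEnabled = true;
            var lazy = context.Orders.ToList();
            foreach (var order in lazy)
            {
                var lineCount = order.OrderLines.Count; // loaded when first touched
            }
        }
    }
}
```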
I use Entity Framework too, and have found the following disadvantages:
It can't work with Oracle, which is really necessary for me.
The Model Designer: when you update the model from the database, the storage part is regenerated too. That is very inconvenient.
There is no support for INSTEAD OF triggers in Entity Framework.