EF pre-generated views: how to be sure these views are being used by EF - asp.net-mvc-2

I have several performance issues on my website.
I'm using asp.net mvc 2 and Entity Framework 4.0. I bought an Entity Framework Profiler to see what kind of SQL requests EF generates.
For example, some pages take between 3 and 5 seconds to open. This is too much for my client.
To see if it was a performance problem with the SQL generated by EF, I used my profiler and copy/pasted the generated SQL into SQL Management Studio to check the execution plan and the SQL statistics. The result returns in less than a second.
Now that I have eliminated the SQL query itself, I suspect EF's query-building step.
I followed the MSDN walkthrough step by step to pre-generate my views. I didn't see any performance gain.
How can I be sure that my queries use these pre-generated views?
Is there anything else I can do to increase the performance of my website?
thanks

First of all, keep in mind that the pre-compiled queries still take just as long (in fact a little longer) the first time they are run, because the queries are compiled the first time they are invoked. After the first invocation, you should see a significant performance increase on the individual queries.
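As a concrete illustration, a pre-compiled LINQ query in the EF 4 / ObjectContext era looks roughly like this. This is only a sketch; MyEntities, Order and CustomerId are hypothetical names, not from the question:

```csharp
// Sketch: pre-compiling a query with CompiledQuery (EF 4, ObjectContext-based model).
// MyEntities, Order and CustomerId are assumed names for illustration.
using System;
using System.Data.Objects;
using System.Linq;

static class Queries
{
    // Compiled on first invocation, then the compiled plan is reused on every call.
    public static readonly Func<MyEntities, int, IQueryable<Order>> OrdersByCustomer =
        CompiledQuery.Compile((MyEntities ctx, int customerId) =>
            ctx.Orders.Where(o => o.CustomerId == customerId));
}

// Usage: the first call pays the compilation cost; later calls skip it.
// var orders = Queries.OrdersByCustomer(context, 42).ToList();
```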
However, you will find the best answer to all performance questions is: figure out what's taking the most time first, then work on improving in that area. Until you have run a profiler and know where your system is blocking, any time you spend trying to speed things up is likely to be wasted.
Once you've determined what's taking the most time, there are a lot of possible techniques to use to speed things up:
Caching data that doesn't change often
Restructuring your data accesses so you pull the data you need in fewer round trips.
Ensuring you're not pulling more data than you need when you do your database queries.
Buying better hardware.
... and many others
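For the first bullet, a minimal sketch of caching rarely-changing data with System.Runtime.Caching (available in .NET 4). The cache key and the loader delegate are illustrative only:

```csharp
// Sketch: serve rarely-changing lookup data from an in-memory cache so
// repeated page loads don't hit the database. Names here are assumptions.
using System;
using System.Collections.Generic;
using System.Runtime.Caching;

public static class LookupCache
{
    static readonly MemoryCache Cache = MemoryCache.Default;

    public static IList<string> GetCategories(Func<IList<string>> loadFromDb)
    {
        var cached = Cache.Get("categories") as IList<string>;
        if (cached != null)
            return cached;                            // served from memory, no round trip

        var fresh = loadFromDb();                     // one database hit
        Cache.Set("categories", fresh,
                  DateTimeOffset.Now.AddMinutes(10)); // absolute expiration
        return fresh;
    }
}
```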
One last note: In Entity Framework 5, they plan to implement automatic query caching, which will make precompiling queries practically useless. So I'd only recommend doing it where you know for sure that you'll get a significant improvement.

Related

SYSIBM.SQLSTATISTICS and SYSIBM.SQLPRIMARYKEYS using most of CPU in DB2 on Windows

I have a fairly busy DB2 on Windows server - 9.7, fix pack 11.
About 60% of the CPU time used by all queries in the package cache is being used by the following two statements:
CALL SYSIBM.SQLSTATISTICS(?,?,?,?,?,?)
CALL SYSIBM.SQLPRIMARYKEYS(?,?,?,?)
I'm fairly decent with physical tuning and have spent a lot of time on SQL tuning on this system as well. The applications are all custom, and educating developers is something I also spend time on.
I get the impression that these two stored procedures are something that perhaps ODBC calls? Reading their descriptions, they also seem like things that are completely unnecessary to do the work being done. The application doesn't need to know the primary key of a table to be able to query it!
Is there anything I can tell my developers to do that will either eliminate/reduce the execution of these or cache the information so that they're not executing against the database millions of times and eating up so much CPU? Or alternately anything I can do at the database level to reduce their impact?
6.5 years later, and I have the answer to my own question. This is a side effect of using an ORM: part of what it does is discover the database schema. Rails has a similar workload; in Rails, you can avoid it by using the schema cache. This becomes particularly important at scale. I'm not sure whether other ORMs have equivalents, but I hope so!

EntityFramework taking excessive time to return records for a simple SQL query

I have already combed through this old article:
Why is Entity Framework taking 30 seconds to load records when the generated query only takes 1/2 of a second?
but no success.
I have tested the query:
without lazy loading (not using .Include of related entities) and
without merge tracking (using AsNoTracking)
I do not think I can easily switch to compiled queries in general due to the complexity of queries and using a Code First model, but let me know if you experience otherwise...
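For reference, the two variations tested above look roughly like this against a DbContext (assuming a context instance named context that exposes the MatchActivations set):

```csharp
// Sketch of the two tested variations (DbContext, EF 4.x/5).
// 'context' and the MatchActivations set are assumed from the question.
using System.Data.Entity;
using System.Linq;

// 1. No eager loading: related entities are not .Include()d,
//    so only the MatchActivations columns come back.
var plain = context.MatchActivations.ToList();

// 2. No merge tracking: AsNoTracking skips attaching each entity
//    to the context, which cuts materialization overhead for reads.
var untracked = context.MatchActivations.AsNoTracking().ToList();
```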
Setup
Entity Framework '4.4' (.Net 4.0 with EF 5 install)
Code First model and DbContext
Testing directly on the SQL Server 2008 machine hosting the database
Query
It's just returning simple fields from one table:
SELECT
[Extent1].[Id] AS [Id],
[Extent1].[Active] AS [Active],
[Extent1].[ChangeUrl] AS [ChangeUrl],
[Extent1].[MatchValueSetId] AS [MatchValueSetId],
[Extent1].[ConfigValueSetId] AS [ConfigValueSetId],
[Extent1].[HashValue] AS [HashValue],
[Extent1].[Creator] AS [Creator],
[Extent1].[CreationDate] AS [CreationDate]
FROM [dbo].[MatchActivations] AS [Extent1]
The MatchActivations table has relationships with other tables, but for this purpose using explicit loading of related entities as needed.
Results (from SQL Server Profiler)
For Microsoft SQL Server Management Studio Query: CPU = 78 msec., Duration = 587 msec.
For EntityFrameworkMUE: CPU = 31 msec., Duration = 8216 msec.!
Does anyone know, besides suggesting the use of compiled queries, if there is anything else to be aware of when using Entity Framework for such a simple query?
A number of people have run into problems where cached query execution plans due to parameter sniffing cause SQL Server to produce a very inefficient execution plan when running a query through ADO.NET, while running the exact same query directly from SQL Server Management Studio uses a different execution plan because some flags on the query are set differently by default.
Some people have reported success in forcing a refresh of the query execution plans by running one or both of the following commands:
DBCC DROPCLEANBUFFERS
DBCC FREEPROCCACHE
But a more long-term, targeted solution to this problem would be to use Query Hints like OPTIMIZE FOR and OPTION(Recompile), as described in this article, to help ensure that good execution plans are chosen more consistently in the first place.
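If the parameter-sniffing theory holds, one workaround on the EF side is to drop that single problem query to raw SQL with a hint via Database.SqlQuery (available on DbContext; MatchActivation here is an assumed entity class matching the columns above). A sketch only, not a general recommendation:

```csharp
// Sketch: bypass the generated SQL for the one problem query and append
// OPTION (RECOMPILE) so SQL Server builds a fresh plan on each execution.
// MatchActivation is an assumed POCO whose properties match these columns.
using System.Linq;

var rows = context.Database.SqlQuery<MatchActivation>(
    @"SELECT [Id], [Active], [ChangeUrl], [MatchValueSetId],
             [ConfigValueSetId], [HashValue], [Creator], [CreationDate]
      FROM [dbo].[MatchActivations]
      OPTION (RECOMPILE)").ToList();
```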
I think the framework is doing something funky if what you say is true, i.e. running the query in Management Studio takes half a second while Entity Framework takes 8.2 seconds. My hunch is that it's trying to do something with that 25K+ record set (perhaps bind it to something else).
Can you download NP NET profiler and profile your app once? http://www.microsoft.com/en-in/download/details.aspx?id=35370
This nifty little program is going to record every method call and their execution time and basically give you info from under the hood on where it's spending those 7+ seconds. If that does not help, I also recommend trying out JetBrains .NET profiler. https://www.jetbrains.com/profiler/
The previous answer suggests that the execution plan can be off, and that's true in many cases, but it's also worth looking under the hood sometimes to determine the cause.
My thanks to Kalagen and others who responded to this - I did come to a conclusion on this, but forgot about this post.
It turns out it is the number of records being returned multiplied by the processing time (LINQ/EF, I presume) to repurpose the raw SQL data back into objects on the client side. I set up Wireshark on the SQL server to monitor the network traffic between it and client machines post-query and discovered:
There is a constant stream of network traffic between the SQL server and the client
The rate of packet processing varies greatly between different machines (8x)
While that is occurring, the SQL server CPU utilization is < 25% and no resource starvation seems to be happening (working set, virtual memory, thread, handle counts, etc.)
So it is basically the constant conversion of the results back into EF objects.
The query in question BTW was part of a 'performance' unit test so we ended up culling it down to a more reasonable typical web-page loading of 100 records in under 1 sec. which passes easily.
If anyone wants to chime in on the details of how Entity Framework processes records post-query, I'm sure that would be useful to know.
It was an interesting discovery that the processing time depended more heavily on the client machine than on the SQL server machine (this is an intranet application).

Entity Framework Code First - Reducing round trips with .Load() and .Local

I'm setting up a new application using Entity Framework Code First and I'm looking at ways to reduce the number of round trips to the SQL Server as much as possible.
When I first read about the .Local property here I got excited about the possibility of bringing down entire object graphs early in my processing pipeline and then using .Local later without ever having to worry about incurring the cost of extra round trips.
Now that I'm playing around with it, I'm wondering if there is any way to pull down all the data I need for a single request in one round trip. For example, if I have a web page with a few lists on it (news, events and discussions), is there a way to pull the records of those 3 unrelated source tables into the DbContext in one single round trip? Do you all out there on the interweb think it's perfectly fine when a single page makes 20 round trips to the db server? I suppose with a proper caching mechanism in place this issue could be mitigated.
I did run across a couple of cracks at returning multiple results from EF queries in one round trip but I'm not sure the complexity and maturity of these kinds of solutions is worth the payoff.
In general in terms of composing datasets to be passed to MVC controllers do you think that it's best to simply make a separate query for each set of records you need and then worry about much of the performance later in the caching layer using either the EF Caching Provider or asp.net caching?
It is completely OK to make several DB calls if you need them. If you are afraid of multiple round trips, you can either write a stored procedure and return multiple result sets (which doesn't work with default EF features) or execute your queries asynchronously (run multiple disjoint queries at the same time). Loading unrelated data with a single LINQ query is not possible.
One more note: if you decide to use the asynchronous approach, make sure that you use a separate context instance in each asynchronous execution. Asynchronous execution uses a separate thread, and the context is not thread-safe.
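The asynchronous suggestion can be sketched like this with the .NET 4 TPL (EF of this era has no built-in async API). MyContext, News and Events are hypothetical names:

```csharp
// Sketch: run two disjoint queries in parallel, each on its own context
// instance, because a context is not thread-safe. All names are assumed.
using System.Linq;
using System.Threading.Tasks;

var newsTask = Task.Factory.StartNew(() =>
{
    using (var ctx = new MyContext())   // one context per thread
        return ctx.News.AsNoTracking().ToList();
});

var eventsTask = Task.Factory.StartNew(() =>
{
    using (var ctx = new MyContext())   // a second, separate context
        return ctx.Events.AsNoTracking().ToList();
});

Task.WaitAll(newsTask, eventsTask);
var news = newsTask.Result;
var events = eventsTask.Result;
```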
I think you are doing a lot of work for little gain if you don't already have a performance problem. Yes, pay attention to what you are doing and don't make unnecessary calls. The actual connection and across the wire overhead for each query is usually really low so don't worry about it.
Remember "Premature optimization is the root of all evil".
My rule of thumb is that executing a call for each collection of objects you want to retrieve is ok. Executing a call for each row you want to retrieve is bad. If your web page requires 20 collections then 20 calls is ok.
That being said, reducing this to one call would not be difficult if you use the Translate method. Code something like this would work:
var reader = GetADataReader(sql);
var firstCollection = context.Translate<whatever1>(reader).ToList();
reader.NextResult();
var secondCollection = context.Translate<whatever2>(reader).ToList();
// ... one Translate call per result set (materialize each with ToList()
// before calling NextResult, since Translate streams from the reader)
The big down side to doing this is that if you place your sql into a stored proc then your stored procs become very specific to your web pages instead of being more general purpose. This isn't the end of the world as long as you have good access to your database. Otherwise you could just define your sql in code.

How would you use EF in a typical Business Layer/Data Access Layer/Stored Procedures set up?

Whenever I watch a demo regarding the Entity Framework the demonstrator simply sets up some tables and performs Inserts, Updates and Deletes using automatically created code stubs but never shows any use of stored procedures. It seems to me that this is executing SQL from the client.
In my experience this is not particularly good practice, so I am presuming that my understanding of the Entity Framework is wrong.
Similarly WCF RIA Services demos use the EF and the demos are always the same. Can anyone shed any light on how you would use EF in a typical Business Layer/Data Access Layer/Stored Procedures set up.
I think I am confused and shouldn't be!!?
There's nothing wrong with executing SQL from the client. Most (if not all) of the problems that it might cause are in fact not there when using something like EF. For instance:
Client generated SQL might cause runtime syntax errors. This is unlikely, since the description of your query is mostly checked at compile time (and assuming that the generator itself doesn't generate invalid SQL, which is also unlikely)
Client generated SQL might be inefficient. This is not true with modern database software which have query caches. EF works in a way that's compatible with query caches, i.e. it generates the same SQL consistently (as long as you use the same code consistently) and uses parameters for varying data.
Client generated SQL might be insecure (SQL injections and whatnot). This is all handled by the generator, which uses parameters for your values and does not interpolate user input into the query itself.
Back in the old Client / Server days, it used to be considered good practice to do all db updates using stored procedures.
But now, it's perfectly acceptable to have an O/RM generate SQL and run directly against DB.
Well, part of the reason why executing sql in stored procedures is a good idea is that it gives you a level of abstraction - when db changes inevitably occur, you make a change in a single place (the proc) rather than a dozen places (all the places where you were calling the client sql). Entity Framework provides this layer of abstraction through the data model, and you have the same advantage.
There are some other reasons why you might want to look at procs, like security granularity (only allowing certain users the right to execute), and some minor performance differences. Ultimately, you have to decide for yourself what the right trade-off is. EF is an attempt to dramatically reduce the developer time spent creating a data layer, with the trade-offs listed above.
never shows any use of stored procedures
Take a look at this video: Using Your Own Stored Procedures to Insert, Update and Delete Entities in Entity Framework.
Note that there are a lot of other videos on that topic there that are certainly worth watching!
The legend is that Scott Hanselman once said "It's not a real demo unless someone drags a datagrid" (pg 478 Silverlight 4 In Action, Pete Brown)
You have to remember that demos are all about selling software, and not at all about communicating best practice. So your observations about the demos are absolutely correct: they cover the basics and leave it to the observer to fill in the blanks.
As for your comment about stored procedures, and the various answers to your question about the generator: the generator is good, and getting better. However, there are certain circumstances when it will generate completely unusable queries (see my SO question here, discussed on the ADO.NET team blog).
Therefore there are occasions when hand-crafted queries are your only recourse (either by way of stored procs, table-valued functions, views, etc.)

What problems have you had with Entity Framework?

We have used Entity Framework on two projects, both with several hundred tables.
Our experience is mainly positive. We have had large productivity gains compared with using Enterprise Library and stored procedures.
However, when I suggest using EF on stackoverflow, I often get negative comments.
On the negative side we have found that there is a steep learning curve for certain functionality.
Finally, to the question: What problems have people had with EF, why do they prefer other ORMS?
Like you, my experience with the EF is mostly positive. The biggest problem I've had is that very complex queries can take a long time to compile. The visual designer is also much less stable and has fewer features than the framework itself. I wish the framework would put the GeneratedCode attribute on code it generates.
I recently used EF and had a relatively good experience with it. I too see a lot of negative feedback around EF, which I think is unfortunate considering all that it offers.
One issue that surprised me was the performance difference between two strategies of fetching data. Initially, I figured that doing eager loading would be more efficient since it would pull the data via a single query. In this case, the data was an order and I was doing an eager load on 5-8 related tables. During development, we found this query to be unreasonably slow. Using SQL profiler, we watched the traffic and analyzed the resulting queries. The generated SQL statement was huge and SQL Server didn't seem to like it all that much.
To work around the issue, I reverted to a lazy-loading / on-demand mode, which resulted in more queries to the server, but a significant boost in performance. This was not what I initially expected. My take-away, which IMHO holds true for all data access implementations, is that I really need to perf test the data access. This is true regardless of whether I use an ORM or SQL procs or parameterized SQL, etc.
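The two strategies compared above look roughly like this. Order and its navigation properties are hypothetical; the Include chain stands in for the 5-8 related tables mentioned:

```csharp
// Sketch of eager loading vs. explicit on-demand loading (DbContext API).
// Order, Lines, Customer, Shipments and orderId are assumed names.
using System.Data.Entity;
using System.Linq;

// Eager loading: one big JOIN-heavy statement fetches everything at once.
var eager = context.Orders
    .Include(o => o.Lines)
    .Include(o => o.Customer)
    .Include(o => o.Shipments)
    .First(o => o.Id == orderId);

// Explicit loading: several small queries, issued only when needed.
var order = context.Orders.First(o => o.Id == orderId);
context.Entry(order).Collection(o => o.Lines).Load();
context.Entry(order).Reference(o => o.Customer).Load();
```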
I use Entity Framework too and have found the following disadvantages:
It can't work with Oracle, which is really necessary for me.
The Model Designer for Entity Framework: during "Update Model from Database", the storage part is regenerated too. This is very inconvenient.
There is no support for INSTEAD OF triggers in Entity Framework.