SSRS report VERY SLOW in prod but SQL query runs FAST - sql-server-2008-r2

I've spent hours troubleshooting this and I need some fresh perspective...
We have a relatively simple report setup in SSRS, simple matrix with columns across the top and data points going down. The SQL query behind the report is "medium" complexity -- has some subqueries and several joins, but nothing real crazy.
Report has worked fine for months and recently has become REALLY slow. Like, 15-20 minutes to generate the report. I can copy-and-paste the SQL query from the Report Designer into SQL Mgmt Studio, replace the necessary variables, and it returns results in less than 2 seconds. I even went so far as to use SQL Profiler to get the exact query that SSRS is executing, and copied-and-pasted this into Mgmt Studio -- still the same thing, sub-second results. The parameters and date ranges specified don't make any difference; I can set parameters to return a small dataset (< 100 rows) or a humongous one (> 10,000 rows) and still get the same results: super-fast in Mgmt Studio but 20 minutes to generate the SSRS report.
Troubleshooting I've attempted so far:
Deleted and re-deployed the report in SSRS.
Tested in Visual Studio IDE on multiple machines and on the SSRS server, same speed (~20 minutes) both places
Used SQL Profiler to monitor the SPID executing the report, captured all SQL statements being executed, and tried them individually (and together) in Mgmt Studio -- runs fast in Mgmt Studio (< 2 seconds)
Monitored server performance during report execution. Processor is pretty darn hammered during the 20 minute report generation, disk I/O is slightly above baseline

Check the execution plans for both to ensure that a combination of parameter sniffing and/or differences in set_options haven't generated two separate execution plans.
This is a scenario I've come across when executing a query from ADO.Net and from SSMS. The problem occurred when the use of different options created different execution plans. SQL Server makes use of the parameter value passed in to attempt to further optimise the execution plan generated. I found that different parameter values were used for each of the generated execution plans, resulting in both an optimal and sub-optimal plan. I can't find my original queries for checking this at the moment but a quick search reveals this article relating to the same issue.
http://www.sqlservercentral.com/blogs/sqlservernotesfromthefield/2011/10/25/multiple-query-plans-for-the-same-query_3F00_/
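If it helps, here is a rough sketch (not from the original article) of how to check for this in the plan cache -- the LIKE filter is a placeholder you would replace with a distinctive fragment of the report's SQL:

-- Look for multiple cached plans for the same statement and compare their SET options.
SELECT
    st.text      AS query_text,
    cp.plan_handle,
    cp.usecounts,
    pa.value     AS set_options   -- different values here mean separate plans were compiled
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
CROSS APPLY sys.dm_exec_plan_attributes(cp.plan_handle) AS pa
WHERE pa.attribute = 'set_options'
  AND st.text LIKE '%YourDistinctiveQueryFragment%';   -- placeholder

If the same query text comes back with two different set_options values, SSRS and SSMS are compiling and reusing separate plans.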
If you're using SQL Server 2008 there's also an alternative provided via query hint called "OPTIMIZE FOR UNKNOWN" which essentially disables parameter sniffing. Below is a link to an article that assisted my original research into this feature.
http://blogs.msdn.com/b/sqlprogrammability/archive/2008/11/26/optimize-for-unknown-a-little-known-sql-server-2008-feature.aspx
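As a minimal sketch of what the hint looks like (the table, column and parameter names here are invented, not from the question):

-- The plan is built from average density statistics rather than the sniffed parameter values.
SELECT o.OrderId, o.OrderDate, o.CustomerId
FROM dbo.Orders AS o
WHERE o.OrderDate BETWEEN @StartDate AND @EndDate
OPTION (OPTIMIZE FOR UNKNOWN);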
An alternative to the above for versions earlier than 2008 would be to store the parameter value in a local variable within the procedure. This would behave in the same way as the query hint above. This tip comes from the article below (in the edit).
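A rough sketch of that local-variable workaround, again with invented procedure and column names (SET assignments are used because initializing at DECLARE time is 2008+ syntax):

CREATE PROCEDURE dbo.usp_ReportData
    @StartDate datetime,
    @EndDate   datetime
AS
BEGIN
    -- Copy the parameters into local variables so the optimizer
    -- cannot sniff the specific values passed in by the caller.
    DECLARE @LocalStart datetime;
    DECLARE @LocalEnd   datetime;
    SET @LocalStart = @StartDate;
    SET @LocalEnd   = @EndDate;

    SELECT o.OrderId, o.OrderDate, o.CustomerId
    FROM dbo.Orders AS o
    WHERE o.OrderDate BETWEEN @LocalStart AND @LocalEnd;
END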
Edit
A little more searching has unearthed an article with a very in-depth analysis of the subject in case it's of any use, link below.
http://www.sommarskog.se/query-plan-mysteries.html

This issue has been a problem for us as well. We are running SSRS reports from CRM 2011. I have tried a number of the solutions suggested (mapping input parameters to local variables, adding WITH RECOMPILE to the stored procedure) without any luck.
This article on report server application memory configuration (http://technet.microsoft.com/en-us/library/ms159206.aspx) -- more specifically, adding the 4000000 value to our RSReportServer.config file -- solved the problem.
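For reference, a sketch of what that change looks like -- this assumes the "4000000" value refers to the WorkingSetMaximum setting (in kilobytes, so roughly 4 GB) described in the linked article, added under the Service element of RSReportServer.config:

<Service>
  <!-- ...existing settings... -->
  <WorkingSetMaximum>4000000</WorkingSetMaximum>
</Service>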
Reports which would take 30-60 seconds to render now complete in less than 5 seconds which is about the same time the underlying stored procedure takes to execute in SSMS.

Related

Why is Crystal Reports Query so slow?

I have many Crystal Reports against the same database. Some execute quickly given the same date parameters, and many of the fields are the same, as well as the tables they access. One of my reports that used to run quickly is now running very slowly, and I can see it looking through all the records -- shown in the status bar as 0 of 100000 -- until it finds records. I have no idea what I may have changed to make it do this. Some reports still run fast and some do not. These findings are consistent across the reports I am talking about. Does anyone know what setting might be causing this?
I have tried looking for any subtle differences between them -- I cannot see anything. Many of them were clones of the original (which still works fast).
In my CR book, the performance section states that if the WHERE clause cannot be translated it will be ignored and all records will be processed -- which is what this looks like -- though I have a valid WHERE clause when I check it in the report.
"Use Indexes Or Server For Speed" is checked. All other settings in Report Options are identical.
Thanks
You can do some troubleshooting:
Try running your query directly on the database and see how long it takes.
Is there any business logic added in your report?
Maybe also try putting the same query in a fresh report and see if it takes a similar time.
Also try debugging your application and see if some part of your code is making your report slow.
Are you running it against a local db or on some server?
Also, if you can share your query, I can take a look.
Let me know if you need more help.

EntityFramework taking excessive time to return records for a simple SQL query

I have already combed through this old article:
Why is Entity Framework taking 30 seconds to load records when the generated query only takes 1/2 of a second?
but no success.
I have tested the query:
without lazy loading (not using .Include of related entities) and
without merge tracking (using AsNoTracking)
I do not think I can easily switch to compiled queries in general due to the complexity of queries and using a Code First model, but let me know if you experience otherwise...
Setup
Entity Framework '4.4' (.Net 4.0 with EF 5 install)
Code First model and DbContext
Testing directly on the SQL Server 2008 machine hosting the database
Query
- It's just returning simple fields from one table:
SELECT
[Extent1].[Id] AS [Id],
[Extent1].[Active] AS [Active],
[Extent1].[ChangeUrl] AS [ChangeUrl],
[Extent1].[MatchValueSetId] AS [MatchValueSetId],
[Extent1].[ConfigValueSetId] AS [ConfigValueSetId],
[Extent1].[HashValue] AS [HashValue],
[Extent1].[Creator] AS [Creator],
[Extent1].[CreationDate] AS [CreationDate]
FROM [dbo].[MatchActivations] AS [Extent1]
The MatchActivations table has relationships with other tables, but for this purpose I'm using explicit loading of related entities as needed.
Results (from SQL Server Profiler)
For Microsoft SQL Server Management Studio Query: CPU = 78 msec., Duration = 587 msec.
For EntityFrameworkMUE: CPU = 31 msec., Duration = 8216 msec.!
Does anyone know, besides suggesting the use of compiled queries, if there is anything else to be aware of when using Entity Framework for such a simple query?
A number of people have run into problems where cached query execution plans due to parameter sniffing cause SQL Server to produce a very inefficient execution plan when running a query through ADO.NET, while running the exact same query directly from SQL Server Management Studio uses a different execution plan because some flags on the query are set differently by default.
Some people have reported success in forcing a refresh of the query execution plans by running one or both of the following commands:
DBCC DROPCLEANBUFFERS
DBCC FREEPROCCACHE
But a more long-term, targeted solution to this problem would be to use Query Hints like OPTIMIZE FOR and OPTION(Recompile), as described in this article, to help ensure that good execution plans are chosen more consistently in the first place.
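If you would rather not flush the whole plan cache on a shared server, one option (a sketch, assuming SQL Server 2008 or later) is to evict just the plan for the offending query:

-- Find the plan handle for the slow query.
SELECT cp.plan_handle, cp.usecounts, st.text
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
WHERE st.text LIKE '%MatchActivations%';

-- Then evict only that plan, substituting the handle returned above.
DBCC FREEPROCCACHE (0x060006001ECA270EC0215D05000000000000000000000000);  -- placeholder handle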
I think the framework is doing something funky if what you say is true, i.e. running the query in Management Studio takes half a second while Entity Framework takes 8.2 seconds. My hunch is that it's trying to do something with that 25K+ record set (perhaps bind it to something else).
Can you download NP NET profiler and profile your app once? http://www.microsoft.com/en-in/download/details.aspx?id=35370
This nifty little program is going to record every method call and their execution time and basically give you info from under the hood on where it's spending those 7+ seconds. If that does not help, I also recommend trying out JetBrains .NET profiler. https://www.jetbrains.com/profiler/
The previous answer suggests that the execution plan can be off, and that's true in many cases, but it's also worth sometimes looking under the hood to determine the cause.
My thanks to Kalagen and others who responded to this - I did come to a conclusion on this, but forgot about this post.
It turns out it is the number of records being returned multiplied by the processing time (LINQ/EF, I presume) needed to convert the raw SQL data back into objects on the client side. I set up Wireshark on the SQL server to monitor the network traffic between it and client machines post-query and discovered:
There is a constant stream of network traffic between the SQL server and the client machine
The rate of packet processing varies greatly between different client machines (8x)
While that is occurring, the SQL server CPU utilization is < 25% and no resource starvation seems to be happening (working set, virtual memory, thread, handle counts, etc.)
so it is basically the constant conversion of the results back into EF objects.
The query in question BTW was part of a 'performance' unit test so we ended up culling it down to a more reasonable typical web-page loading of 100 records in under 1 sec. which passes easily.
If anyone wants to chime in on the details of how Entity Framework processes records post-query, I'm sure that would be useful to know.
It was an interesting discovery that the processing time depended more heavily on the client machine than on the SQL server machine (this is an intranet application).

SQL Server inserts slow down over time

Not sure if this is a question for here or the DBA forum, but here is some background to my problem. I have an application written in C# which uses the Gentle Framework to interface with our database. The issue I am running into has shown up on two different servers, both running SQL Server 2008 R2. One server is running Windows Server 2003 with 16GB of RAM, and the other is running Windows Server 2008 R2 with 64GB of RAM. Also we are running gigabit intranet so I doubt it is either a resource or network issue.
That being said my issue is... when inserting into the database, each insert is taking a little more time starting with about 30ms and building up to about 1900ms before suddenly taking 189549ms (a little over 3 minutes). After that 3 minute insert, the time drops down to about 10ms and starts building up again. Here is a link to my log file showing the time in ms and the query. Unfortunately, due to what has been called "proprietary", I can't share the exact inserts with you but I can answer general questions about the details.
Some additional details:
The linked log file only shows queries which took over 10ms; there are many more queries in the log between the inserts, however they only take 1-2ms. I can link one with all of the queries.
I have looked at other questions regarding similar issues, but they have either been about RAID or about multiple inserts vs. multiple-values statements
These are parameterized queries
The tables are indexed and it is not possible to remove these because other applications are using these tables and depend on those indexes.
I don't think I have much choice in changing how records are inserted because Gentle is generating the SQL.
I found this question which I think is close.
I believe that Gentle is doing all of this in 1 transaction, and I have been told that "it should be done in 1 transaction, and we are not going to break it apart"
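One way to verify that assumption (a sketch; the database name is a placeholder) is to check from another session, while the inserts are running, whether a single long-lived transaction stays open the whole time:

-- Reports the oldest active transaction in the database.
DBCC OPENTRAN ('YourDatabaseName');   -- placeholder name

-- Or list open transactions per session with their start times.
SELECT s.session_id, a.transaction_id, a.transaction_begin_time
FROM sys.dm_tran_session_transactions AS t
JOIN sys.dm_tran_active_transactions AS a ON a.transaction_id = t.transaction_id
JOIN sys.dm_exec_sessions AS s ON s.session_id = t.session_id
ORDER BY a.transaction_begin_time;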

Out of memory exception for straightforward report

I'm trying to run an SSRS report. It's a straightforward report, just to render data from a table which has around 80K records.
No aggregation or data processing is done in report. There are around 50 columns along with 19 report parameters. I just have to display those 50 columns in report (no pivot).
Usually it takes around 5 minutes to render this report on our development server (off-peak hours). Same is the case with our production server, but there users are getting "Out of memory" exceptions a lot, and also report parameter criteria are not utilized (those are the complaints I get from users).
I'm able to filter the criteria locally without any problem although it takes long time to render.
Why does it take such a long time to render the report, even though the report is straightforward?
The report runs fine when I hit F5 on VS 2008 but from time to time I get out of memory exceptions when I hit the "Preview" tab.
Some of the column names have a "#" character. If I include such columns in the report, an "out of memory" exception is thrown (especially in Preview mode). Is there truth to this: doesn't SSRS like column names with "#"? E.g. my column name was "KLN#".
I have created a nonclustered index on the table but that didn't help me much.
What's the difference between running the report in Preview mode vs. hitting F5 in VS 2008? It's fine when I hit F5 even though it takes 5 minutes, but Preview mode has the problem.
There isn't much room for redesign (since it's a straightforward report); perhaps I can only remove some of the report parameters.
Any suggestion would be appreciated.
In addition to the already posted answers and regarding the problems with the preview in the Report Designer or Report Manager there is another possible solution: avoid too much data on the first report page!
This can be done by paginating into small record sets, i.e. with custom groups and page breaks, or sometimes automatically (see the answer from done_merson), or by adding a simple cover page.
These solutions are especially helpful in the development phase and if you plan to render the report results to Excel or PDF anyway.
I had a similar case with out of memory exceptions and never returning reports with a simple report and its dataset containing about 70k records.
The query was executed in about 1-2 minutes, but neither the Report Designer nor our development SSRS 2008R2 Server (Report Manager) could show the resulting report preview. Finally I suspected the HTML preview being the bottleneck and avoided it by adding a cover page with a simple textbox. The next report execution took about 2 minutes and successfully showed the HTML preview with the cover page. Rendering the complete result to Excel only took another 30 seconds.
Hopefully this will help others, since this page is still one of the top posts if you search for SSRS out of memory exceptions.
Why does it take such a long time to render...?
I have created a Nonclustered index on the table but that didn't help me much.
Because (AFAIK) SSRS will construct an in-memory model of the report before rendering. Know that SSRS will take three steps in creating a report:
Retrieve the data.
Create an internal model by combining the report and the data.
Render the report to the appropriate format (preview, html, xls, etc)
You can check the ExecutionLog2 view to see how much time each step takes. Step 1 is probably already reasonably fast (seconds), so the added index is not tackling the bottleneck. Probably steps 2 and 3 are taking a lot of time, and require a lot of RAM.
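For example, something along these lines against the report server catalog database (a sketch; the ReportServer database name and the report-path filter are assumptions for a default install):

SELECT TOP (20)
    ReportPath,
    TimeStart,
    TimeDataRetrieval,   -- step 1: running the dataset query (ms)
    TimeProcessing,      -- step 2: building the internal report model (ms)
    TimeRendering,       -- step 3: rendering to the requested format (ms)
    [RowCount],
    Format,
    Status
FROM ReportServer.dbo.ExecutionLog2
WHERE ReportPath LIKE '%YourReportName%'   -- placeholder
ORDER BY TimeStart DESC;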
SSRS doesn't like column names with #?? my column name was KLN#.
As far as I know this shouldn't be a problem. Removing that column more likely was just enough to make the report runnable again.
There isn't much to redesign (since it's a straightforward report); perhaps I can only remove some of the report parameters.
SSRS is just not the right tool for this. As such, there is no real "solution" for your problem, only alternatives and workarounds.
Workarounds:
As #glh mentioned in his answer, making more RAM available for SSRS may "help".
Requiring the user to filter the data with a parameter (i.e. don't allow the user to select all those rows, only the ones he needs).
Schedule the report at a quiet moment (when there's enough RAM available) and cache the report.
Alternatives:
Create a small custom app that reads from the database and outputs an Excel.
Use SSIS, which (I thought) is better suited for this kind of task (data transformation and migration).
Rethink your setup. You haven't mentioned the context of your report, but perhaps you have an XY Problem. Perhaps your users want the entire report but only need a few key rows, or perhaps they only use it as a backup mechanism (for which there's better alternatives), or...
Try to increase your RAM; see this post for a similar error:
Need SSRS matrix to show more than 400k records
We just had a similar situation and set the "Keep together on one page if possible" option in Tablix Properties / General / Page break options to off and it worked fine.

An SSRS 2008 problem: How to increase the response time

I have created a report using SSRS 2008. I have used multiple levels of grouping of data. The problem arises when I preview the report: it takes a lot of time, sometimes more than 40 minutes.
How can I resolve it?
See if you can cache the dataset; Microsoft recommends caching the dataset when the time taken is more than what is expected.
Also read the link http://www.simple-talk.com/sql/performance/a-performance-troubleshooting-methodology-for-sql-server/ which provides the basis for troubleshooting the Query at the database level.